names (string, length 1–95) | readmes (string, length 0–399k) | topics (string, length 0–421) | labels (6 classes) |
---|---|---|---|
Trip-Record-Data-Analysis | Trip record data analysis. As a data engineer I spearheaded a project to extract, transform, and analyze TLC trip record data for the year 2016, building a robust ETL pipeline and an insightful Looker dashboard on the Google Cloud Platform (GCP) to empower stakeholders with data-driven decision making. The project began with scraping the TLC trip data and organizing it into a star-schema data model of fact and dimensional tables: the fact table housed the core trip data, while dimensional tables enriched the dataset with contextual information, optimizing data-analysis efficiency. The data was stored securely in GCP Cloud Storage, leveraging its scalability and reliability, and Mage AI was deployed on GCP Compute Engine to drive the ETL pipeline. Using Mage AI, I extracted the TLC trip data from Cloud Storage via APIs, cleaned and preprocessed the raw data with Python-based transformation logic to ensure quality and consistency, and exported the transformed data to Google BigQuery, a fully managed data warehouse, where its querying capabilities could be harnessed for in-depth exploration. Tailored SQL queries involving aggregations, filtering, and joins between the fact and dimensional tables surfaced critical patterns and trends in the dataset. Finally, a dynamic Looker Studio dashboard visualized the findings, including popular pickup and drop-off locations, peak hours, ride durations, and customer demographics, empowering stakeholders to make informed decisions, optimize operational processes, and identify growth opportunities. | cloud |
|
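The star-schema queries the row above describes (aggregations and joins between a fact table and dimensional tables) can be sketched with an in-memory SQLite database. This is a minimal illustration only; the table and column names are invented for the sketch, not the project's actual TLC schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimensional table: pickup zones (names are illustrative, not the real TLC schema).
cur.execute("CREATE TABLE dim_zone (zone_id INTEGER PRIMARY KEY, zone_name TEXT)")
cur.executemany("INSERT INTO dim_zone VALUES (?, ?)",
                [(1, "Midtown"), (2, "JFK Airport")])

# Fact table: one row per trip, referencing the zone dimension.
cur.execute("CREATE TABLE fact_trip (trip_id INTEGER, pickup_zone_id INTEGER, fare REAL)")
cur.executemany("INSERT INTO fact_trip VALUES (?, ?, ?)",
                [(1, 1, 12.5), (2, 1, 8.0), (3, 2, 52.0)])

# Join fact and dimension, then aggregate: trip count and average fare per zone.
cur.execute("""
    SELECT z.zone_name, COUNT(*) AS trips, AVG(f.fare) AS avg_fare
    FROM fact_trip f JOIN dim_zone z ON f.pickup_zone_id = z.zone_id
    GROUP BY z.zone_name ORDER BY trips DESC
""")
rows = cur.fetchall()
print(rows)  # [('Midtown', 2, 10.25), ('JFK Airport', 1, 52.0)]
```

The same join-then-aggregate shape carries over directly to BigQuery SQL, only at a much larger scale.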
cvlib | [Downloads](https://pepy.tech/project/cvlib) [PyPI](https://pypi.org/project/cvlib). cvlib: a simple, high-level, easy-to-use open source computer vision library for Python. Installation: provided the Python packages below are installed, cvlib is completely pip-installable. Dependencies: OpenCV and TensorFlow; if you don't have them already, install through pip: `pip install opencv-python tensorflow` (optional), or compile them from source if you want to enable optimizations for your specific hardware for better performance. If you are working with a GPU, you can install the tensorflow-gpu package through pip; make sure you have the necessary NVIDIA drivers installed properly (CUDA Toolkit, cuDNN, etc.). If you are not sure, just go with the CPU-only TensorFlow package. You can also compile OpenCV from source to enable CUDA optimizations for NVIDIA GPUs. Installing cvlib: `pip install cvlib`; to upgrade to the newest version: `pip install --upgrade cvlib`. Optionally, to build cvlib from source, clone the repository and run: `git clone https://github.com/arunponnusamy/cvlib.git`, `cd cvlib`, `pip install .`. Note: compatibility with Python 2.x is not officially tested. Face detection: detecting faces in an image is as simple as calling `detect_face`; it returns the bounding-box corners and corresponding confidence for all faces detected: `import cvlib as cv; faces, confidences = cv.detect_face(image)`. Seriously, that's all it takes. Underneath, it uses OpenCV's DNN module with a pre-trained caffemodel. To enable GPU: `cv.detect_face(image, enable_gpu=True)`. Check out face_detection.py in the examples directory for the complete code; sample output: examples/images/face_detection_output.jpg. Gender detection: once a face is detected, it can be passed to `detect_gender` to recognize gender; it returns the labels (man, woman) and associated probabilities: `label, confidence = cv.detect_gender(face)` (pass `enable_gpu=True` for GPU). Underneath, cvlib uses an AlexNet-like model trained on the Adience dataset (https://talhassner.github.io/home/projects/adience/adience-data.html) by Gil Levi and Tal Hassner for their CVPR 2015 paper. Check out gender_detection.py in the examples directory; sample output: examples/images/gender_detection_output.jpg. Object detection: detecting common objects in the scene is enabled through a single call to `detect_common_objects`, which returns the bounding-box coordinates, corresponding labels, and confidence scores for the detected objects: `from cvlib.object_detection import draw_bbox; bbox, label, conf = cv.detect_common_objects(img); output_image = draw_bbox(img, bbox, label, conf)` (pass `enable_gpu=True` for GPU). Underneath, it uses a YOLOv4 model (https://github.com/alexeyab/darknet) trained on the COCO dataset (http://cocodataset.org), capable of detecting 80 common objects in context. Check out object_detection.py in the examples directory. Real-time object detection: YOLOv4 is actually a heavy model to run on CPU; if you are working with a real-time webcam video feed and don't have a GPU, try tiny YOLO, a smaller version of the original model that is significantly faster but less accurate: `bbox, label, conf = cv.detect_common_objects(img, confidence=0.25, model='yolov4-tiny')`. Check out examples/object_detection_webcam.py to learn more. Other supported models: yolov3, yolov3-tiny. Custom-trained YOLO weights: to run inference with custom-trained YOLOv3/v4 weights, try `from cvlib.object_detection import YOLO; yolo = YOLO(weights, config, labels); bbox, label, conf = yolo.detect_objects(img); yolo.draw_bbox(img, bbox, label, conf)` (pass `enable_gpu=True` for GPU). Check out examples/yolo_custom_weights_inference.py; sample output: examples/images/object_detection_output.jpg. Utils: video to frames: `get_frames` is helpful when you want to grab all the frames from a video; just pass the path to the video and it returns all the frames as a list of numpy arrays: `frames = cv.get_frames('~/Downloads/demo.mp4')`; optionally, pass a directory path to save all the frames to disk: `frames = cv.get_frames('~/Downloads/demo.mp4', '~/Downloads/demo_frames')`. Creating a GIF: the `animate` method lets you create a GIF from a list of images; just pass a list of images (or a path to a directory containing images) and an output GIF name, and it creates the GIF and saves it to disk: `cv.animate(frames, '~/Documents/frames.gif')`. Sponsor: developing and maintaining open source projects takes a lot of time and effort; if you are getting value out of this project, consider supporting the work by buying me a coffee (https://buymeacoffee.com/arunponnusamy), one time or every month. License: cvlib is released under the MIT license. Help: for bugs and feature requests, feel free to file a GitHub issue (https://github.com/arunponnusamy/cvlib/issues); make sure to check whether the issue has been filed already. For usage-related how-to questions, please create a new question on Stack Overflow (https://stackoverflow.com/questions/tagged/cvlib) with the tag cvlib. Community: join the official Discord server (https://discord.gg/chhqjzgwfh) or GitHub Discussions (https://github.com/arunponnusamy/cvlib/discussions) to talk about all things cvlib. Citation: if you find cvlib helpful in your work, please cite the following BibTeX: `@misc{ar2018cvlib, author = {Arun Ponnusamy}, title = {cvlib - high level Computer Vision library for Python}, howpublished = {\url{https://github.com/arunponnusamy/cvlib}}, year = {2018}}` | computer-vision image-processing machine-learning deep-learning python | ai |
iotdb | Internet of Things nmap fingerprints: a repository of nmap scans for various IoT devices. The following command was run to perform the scan: `nmap -n -Pn -sS -p T:0-65535 -v -A -oX <name> <ip>`, where `<name>` is a descriptive name for the device (e.g. wink-hub) and `<ip>` is the IP address the device has on your local network. The above command should complete within an hour, but it doesn't check for any UDP services the device may be running. To perform a complete scan of the device that checks both TCP and UDP, please run: `nmap -n -Pn -sSU -p T:0-65535,U:0-65535 -v -A -oX <name> <ip>`. If you're on an IPv6 network, also try running the following to get a fingerprint of its IPv6 capabilities: `nmap -6 -n -Pn -sSU -p T:0-65535,U:0-65535 -v -A -oX <name> <ip>`. To have your nmap scan results added to this repository, either submit a pull request or send your XML file to support@shodan.io. | server |
|
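The `-oX` flag in the commands above writes results as XML. As a sketch of how such reports can be processed with the standard library, the snippet below pulls the open ports out of a report; the XML here is a hand-made fragment in nmap's output format, not a real device scan.

```python
import xml.etree.ElementTree as ET

# Hand-made fragment in nmap's -oX report format (not from a real scan).
SAMPLE = """<nmaprun>
  <host>
    <address addr="192.168.1.10" addrtype="ipv4"/>
    <ports>
      <port protocol="tcp" portid="80"><state state="open"/><service name="http"/></port>
      <port protocol="tcp" portid="22"><state state="closed"/><service name="ssh"/></port>
      <port protocol="udp" portid="1900"><state state="open"/><service name="upnp"/></port>
    </ports>
  </host>
</nmaprun>"""

def open_ports(xml_text):
    """Return (protocol, port, service) for every open port in an nmap XML report."""
    root = ET.fromstring(xml_text)
    found = []
    for port in root.iter("port"):
        if port.find("state").get("state") == "open":
            svc = port.find("service")
            found.append((port.get("protocol"), int(port.get("portid")),
                          svc.get("name") if svc is not None else ""))
    return found

print(open_ports(SAMPLE))  # [('tcp', 80, 'http'), ('udp', 1900, 'upnp')]
```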
asgard_shop | Asgard Shop: materials for the talk given at Flutter Vikings 2022. Documents: slides: https://www.figma.com/proto/tbq8pqii94evb7texddzrt/flutterviking-design-systems?page-id=123%3A363&node-id=123%3A364&viewport=241%2C48%2C0.07&scaling=contain&starting-point-node-id=123%3A364 ; UX/UI design system (Figma document): https://www.figma.com/file/tbq8pqii94evb7texddzrt/flutterviking-design-systems?node-id=15%3A408 ; UI prototype: https://www.figma.com/proto/tbq8pqii94evb7texddzrt/flutterviking-design-systems?page-id=4%3A249&node-id=4%3A250&viewport=241%2C48%2C2.7&scaling=min-zoom&starting-point-node-id=4%3A250 | os |
|
SQL-Proyect | This project was developed during the Database course at the National University of the Center of the Province of Buenos Aires, in the Software Engineering degree program. It implements, from a given problem statement, a real and functional SQL database. | server |
|
sentimentalizer | ARCHIVED: it is 2023, you should be using something else for sentiment analysis; maybe this (https://huggingface.co/blog/sentiment-analysis-python) is something you can use. Sentimentalizer, inspired by sentan (https://github.com/martinrue/sentan) and node-sentiment (https://github.com/martinrue/node-sentiment). This gem can be used separately or integrated with a Rails app. Instructions for Rails use: 1. Install the gem using bundler: `gem 'sentimentalizer'`. 2. Run `rails g sentimentalizer`. This generates an initializer file with an after_initialize hook for Rails; it basically trains a model to use in the application, and it runs every time you start the server or run any rake command (would love some input on this). 3. Now you can run the following: `Sentimentalizer.analyze('message or tweet or status')`, or for JSON output: `Sentimentalizer.analyze('message or tweet or status', true)`. You will get output like this: `Sentimentalizer.analyze('i am so happy')` returns text: "i am so happy", probability: 0.937, and a sentiment marker; with `true`, the same fields as JSON. Instructions for vanilla Ruby use: 1. Install the gem using bundler: `gem 'sentimentalizer'`. 2. Either fire up irb or require it in your project with `require 'sentimentalizer'`. 3. Train the engine in order to use it: `Sentimentalizer.setup`, or wrap it in a class so setup is automatic: `class Analyzer; def initialize; Sentimentalizer.setup; end; def process(phrase); Sentimentalizer.analyze(phrase); end; end`, then `analyzer = Analyzer.new; analyzer.process('i am so happy')`. Contributing to sentimentalizer: check out the latest master to make sure the feature hasn't been implemented or the bug hasn't been fixed yet; check out the issue tracker to make sure someone hasn't already requested and/or contributed it; fork the project; start a feature/bugfix branch; commit and push until you are happy with your contribution; make sure to add tests (this is important so I don't break it in a future version unintentionally). Please try not to mess with the Rakefile, version, or history; if you want to have your own version, or it is otherwise necessary, that is fine, but please isolate it to its own commit so I can cherry-pick around it. Copyright (c) 2018 Malav Bhavsar; see LICENSE.txt for further details. | rails ruby machine-learning sentiment-analysis naive-bayes | ai |
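The gem above is tagged naive-bayes; as a toy illustration of the same idea (not the gem's actual model, training data, or API), here is a minimal naive Bayes sentiment classifier with add-one smoothing, written in Python with only the standard library.

```python
import math
from collections import Counter

# Toy training data; a real sentiment model trains on a much larger corpus.
TRAIN = [("i am so happy", "pos"), ("this is great", "pos"),
         ("i am so sad", "neg"), ("this is terrible", "neg")]

def train(samples):
    """Count word occurrences per label."""
    counts = {"pos": Counter(), "neg": Counter()}
    totals = Counter()
    for text, label in samples:
        for word in text.split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label maximizing the smoothed log-likelihood of the words."""
    vocab = {w for c in counts.values() for w in c}
    scores = {}
    for label in counts:
        # Log-space naive Bayes with add-one (Laplace) smoothing.
        scores[label] = sum(
            math.log((counts[label][word] + 1) / (totals[label] + len(vocab)))
            for word in text.split())
    return max(scores, key=scores.get)

counts, totals = train(TRAIN)
print(classify("so happy", counts, totals))  # pos
```

The real gem also reports a probability alongside the label; this sketch only returns the arg-max class.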
Cosmos_Intruders | Cosmos Intruders: CMIS 435 Mobile Application Development, in progress. For more details, look at the MS and assignment on Blackboard/OLAT. Screenshots: https://user-images.githubusercontent.com/78967253/116073951-5adf8980-a691-11eb-8ade-21be4850d772.jpeg, https://user-images.githubusercontent.com/78967253/116073958-5c10b680-a691-11eb-8d0a-8e9af16095a1.jpeg, https://user-images.githubusercontent.com/78967253/116073961-5d41e380-a691-11eb-9911-d74f36695e06.jpeg | front_end |
|
Sublime-Super-Snippets | A collection of front-end-developer-related Sublime Text 2 snippets. This is a very young repository and will be shaped into a well-oiled collection in no time. Go nuts! Lots of credit goes to Joshua Hibbert for the base of the CSS snippets (https://github.com/joshnh/css-snippets). | front_end |
|
torch-LLM4SGG | LLM4SGG: Large Language Model for Weakly Supervised Scene Graph Generation. Code for the paper "LLM4SGG: Large Language Model for Weakly Supervised Scene Graph Generation" (https://arxiv.org/pdf/2310.10404.pdf). Overview (figure/introduction.png): the work addresses two issues inherent in the conventional approach of a parser (https://github.com/vacancy/scenegraphparser) plus a knowledge base (WordNet, https://dl.acm.org/doi/pdf/10.1145/219717.219748). (1) Semantic over-simplification (step 2): the standard scene graph parser commonly converts fine-grained predicates into coarse-grained ones, which we refer to as semantic over-simplification. For example, an informative predicate "lying on" in an image caption is undesirably converted into the less informative predicate "on", because the rule-based scene parser fails to capture the predicate "lying on" at once, and its heuristic rules fall short of accommodating the diverse range of caption structures. As a result, the predicate distribution is long-tailed and, to make matters worse, 12 out of 50 predicates are non-existent, meaning those 12 predicates can never be predicted. (2) Low-density scene graph (step 3): triplet alignment based on a knowledge base (i.e., WordNet) leads to low-density scene graphs, i.e., the number of triplets remaining after step 3 is small. Specifically, a triplet is discarded if any of its three components (subject, predicate, object), or their synonyms/hypernyms/hyponyms, fail to align with the entity or predicate classes in the target data. For example, the triplet (elephant, carrying, log) is discarded because "log" does not exist in the target data, nor do its synonyms or hypernyms, even though "elephant" and "carrying" do exist. A large number of predicates is therefore discarded, resulting in poor generalization and performance degradation; the static structured knowledge of a KB is insufficient to cover the semantic relationships among a wide range of words. Proposed approach (LLM4SGG): to alleviate the two issues above, we adopt a pre-trained large language model (LLM). Inspired by the idea of chain-of-thought (https://arxiv.org/pdf/2201.11903.pdf), which arrives at an answer in a stepwise manner, we separate the triplet formation process into two chains, each of which replaces the rule-based parser in step 2 (chain 1: triplet extraction via LLM) and the KB in step 3 (chain 2: alignment of classes in triplets via LLM). As the LLM we employ gpt-3.5-turbo in ChatGPT (https://chat.openai.com). Installation: Python 3.9.0; `conda install pytorch==1.10.1 torchvision==0.11.2 cudatoolkit=11.3 -c pytorch -c conda-forge`; `pip install openai`; `pip install einops shapely timm yacs tensorboardX ftfy prettytable pymongo tqdm`; `pip install transformers`. Once the packages are installed, run `python setup.py build develop --user`. Dataset directory structure (under root/dataset): coco/ (captions_train2017.json, captions_val2017.json, coco_triplet_labels.npy, images/), vg/ (image_data.json, VG-SGG-with-attri.h5, VG-SGG-dicts-with-attri.json, VG_100K/), gqa/ (GQA_200_ID_Info.json, GQA_200_Train.json, GQA_200_Test.json, images/). Training data: to train the SGG model we use image captions with their images from the COCO dataset. Download COCO (https://cocodataset.org/#download) and put the corresponding files into dataset/coco (2017 train images, 118K/18GB; 2017 val images, 5K/1GB; 2017 train/val annotations, 241MB); after downloading, combine the raw images into dataset/coco/images. For a fair comparison, we use 64K images following the previous studies SGNLS (https://github.com/yiwuzhong/sgg_from_nls) and Li et al., MM'22 (https://github.com/xcppy/ws_sgg); a file with the image IDs of the 64K images is provided via a Google Drive link in the repository. Test data: for evaluation we use the Visual Genome (VG) and GQA datasets. VG: we follow the same pre-processing strategy as VS3 (https://github.com/zyong812/VS3_CVPR23). Download the raw images (part 1, 9GB: https://cs.stanford.edu/people/rak248/VG_100K_2/images.zip; part 2, 5GB: https://cs.stanford.edu/people/rak248/VG_100K_2/images2.zip) and the annotation files (image_data.json, VG-SGG-dicts-with-attri.json, VG-SGG-with-attri.h5; via Google Drive links in the repository), then put them into dataset/vg/VG_100K and dataset/vg respectively. GQA: we follow the same preprocessing strategy as SHA-GCL-for-SGG (https://github.com/dongxingning/SHA-GCL-for-SGG). Download the raw images (full, 20.3GB: https://downloads.cs.stanford.edu/nlp/data/gqa/images.zip) and the annotation files (GQA_200_ID_Info.json, GQA_200_Test.json, GQA_200_Train.json; via Google Drive links in the repository), then put them into dataset/gqa/images and dataset/gqa respectively. Triplet extraction process via LLM (VG): to use gpt-3.5-turbo in ChatGPT, insert your OpenAI key, obtained from https://platform.openai.com/account/api-keys, and follow the steps below to obtain localized triplets. Chain 1 (triplet extraction via LLM): since triplet extraction goes through OpenAI's API, the code can be run in parallel; for example, 10,000 images can be divided into 1,000-image chunks across 10 processes (change the start and end variables in the scripts, and the names of the saved files, to avoid overwriting). Extract triplets from original captions: `python triplet_extraction_process/extract_triplet_with_original_caption.py --api_key <API_KEY>`; from paraphrased captions: `python triplet_extraction_process/extract_triplet_with_paraphrased_caption.py --api_key <API_KEY>`. After chain 1, the output files containing misaligned triplets (misaligned_triplets_original.pkl, misaligned_triplets_paraphrased.pkl) are located in dataset/coco. Chain 2 (alignment of classes in triplets via LLM): `python triplet_extraction_process/alignment_classes_vg.py --api_key <API_KEY>`. After chain 2, the output files with aligned entity/predicate information (aligned_entity_dict_vg.pkl, aligned_predicate_dict_vg.pkl) are located in triplet_extraction_process/alignment_dict. Construction of aligned triplets in VS3 format: `python triplet_extraction_process/final_preprocess_triplets_vg.py`; the output file, aligned_triplet_info_vg.json, is located in dataset/vg. Grounding unlocalized triplets: we follow the same code as VS3 (tools/data_preprocess/parse_SG_from_COCO_captionV2.py) to ground unlocalized triplets; a pre-trained GLIP (https://github.com/microsoft/GLIP) model is necessary. Put the pre-trained GLIP models into the model directory: `mkdir model`, then `wget https://penzhanwu2bbs.blob.core.windows.net/data/GLIPv1_Open/models/glip_tiny_model_o365_goldg_cc_sbu.pth -O swin_tiny_patch4_window7_224.pth` and `wget https://penzhanwu2bbs.blob.core.windows.net/data/GLIPv1_Open/models/glip_large_model.pth -O swin_large_patch4_window12_384_22k.pth`. Ground the unlocalized triplets: `python tools/data_preprocess/parse_SG_from_COCO_caption_LLM_vg.py`; the output file, aligned_triplet_info_vg_grounded.json, is located in dataset/vg. Triplet extraction process via LLM (GQA): based on the triplets extracted in chain 1 above, run the analogous steps: chain 2 alignment with `python triplet_extraction_process/alignment_classes_gqa.py --api_key <API_KEY>`, construction of aligned triplets in VS3 format with `python triplet_extraction_process/final_preprocess_triplets_gqa.py`, and grounding with `python tools/data_preprocess/parse_SG_from_COCO_caption_LLM_gqa.py`. Files for the GQA dataset (aligned_entity_dict_gqa.pkl, aligned_predicate_dict_gqa.pkl, aligned_triplet_info_gqa_grounded.json) are provided via Google Drive links in the repository. Training: to point the model at the localized triplets constructed by the LLM, change CocoCaption_scene_graph_path in maskrcnn_benchmark/config/paths_catalog.py. VG: set the CocoCaptionSceneGraph variable to dataset/vg/aligned_triplet_info_vg_grounded.json (localized triplets), then `bash train_vg.sh`; to train with the reweighting strategy, run `bash train_rwt_vg.sh`. GQA: set the variable to dataset/gqa/aligned_triplet_info_gqa_grounded.json, then `bash train_gqa.sh`. Test: change the model checkpoint in test.sh, then `bash test.sh`. Pre-trained models are also provided (VG: VS3 and VS3+RWT; GQA: VS3), each with its config.yml and evaluation_res.txt, via Google Drive links in the repository. | ai |
|
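The chain-2 alignment step described above maps open-vocabulary entity and predicate names onto the target classes and discards triplets that cannot be fully aligned. The bookkeeping can be sketched as below; the mapping dicts stand in for the LLM's cached answers (the repo stores them in aligned_*_dict pickle files), and all names here are illustrative.

```python
# Stand-ins for the LLM's alignment answers; None means no target class exists.
ENTITY_MAP = {"elephant": "elephant", "log": None, "puppy": "dog"}
PREDICATE_MAP = {"lying on": "lying on", "carrying": "carrying"}

def align(triplets, entity_map, predicate_map):
    """Keep a (subject, predicate, object) triplet only if all three parts align."""
    aligned = []
    for subj, pred, obj in triplets:
        s, p, o = entity_map.get(subj), predicate_map.get(pred), entity_map.get(obj)
        if s and p and o:
            aligned.append((s, p, o))
    return aligned

triplets = [("elephant", "carrying", "log"),    # discarded: 'log' has no target class
            ("puppy", "lying on", "elephant")]  # kept, with 'puppy' mapped to 'dog'
print(align(triplets, ENTITY_MAP, PREDICATE_MAP))  # [('dog', 'lying on', 'elephant')]
```

The paper's point is that an LLM fills this mapping far more densely than WordNet lookups, so fewer triplets fall into the discarded branch.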
stream-site | Preview image: https://pub.rachni.com/img/chrome_2016-12-28_22-38-36.png. Rachni: an nginx-rtmp-module front end. Authors: Joel Bethke (joel.bethke@gmail.com), Andrew May. Credits: special thanks to HDXX for helping sort out some installation dependency issues. This site aims to be an easy-to-use front end for the nginx-rtmp module. A few more screenshots can be found here: http://imgur.com/a/q90jg. There is no demo currently, but I am working on standing something up. Current features: account system: lets users sign up for the site, with email verification to activate accounts; allows for password resets; other account features are planned (see GitHub issues). Private stream keys: each user is assigned a private streaming key, used to connect to the ingest server; anyone attempting to connect without a valid account/streamkey is denied. On-demand recording: allows anyone to start recording a stream to the server for playback later; recorded videos are stored indefinitely; future plans are to only let the streamer start/stop their own recording. In-site recording playback: all recorded videos are viewable/playable from the site itself; download is allowed. On-live notifications: will allow any user to subscribe to another channel and receive an email notification when they go live; currently implemented simply, with future enhancements planned. Stream API: an API for grabbing stream info and possibly other functions, useful for making widgets on another site for current viewers and live channels; can also be used to start/stop recording if authorized. Built on MDL (https://getmdl.io/index.html), Sass (http://sass-lang.com), and the video.js player (https://github.com/videojs/video.js). Planned features: see the GitHub issues page (https://github.com/fenrirthviti/stream-site/issues) for details or to make any feature requests. Config information: this site uses and requires Linux; I'm using the exec function in a few directives, which does not work on Windows (you can probably get around this, but I do not intend to run this on Windows, so I have not tested). Any flavor of Linux that offers the below packages should be fine: nginx with nginx-http-flv-module (https://github.com/winshining/nginx-http-flv-module), http_ssl_module, and http_xslt_module; PostgreSQL; PHP7 (I believe it should run on PHP 5.4, but I have not tested; the mail function is required; note that the following packages were included in my build of PHP7, but you may need to install them manually: php-pgsql, php-xml, php-curl); JavaScript; ffmpeg for live screenshots, recording, and remuxing. This is the config string I used for nginx: `--prefix=/etc/nginx --user=nginx --group=nginx --sbin-path=/usr/sbin/nginx --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --with-http_ssl_module --add-dynamic-module=/path/to/nginx-http-flv-module --with-http_xslt_module --with-openssl=/path/to/openssl`. All nginx conf files can be found in src/nginx. Note: this is a fairly complex nginx setup; please make sure you read all the conf files and make the necessary directory/config changes to them. Note: many of the administration features currently require manual database manipulation; there is not much that needs to be done, but be aware there is no admin console (it is planned for future updates but has not been a priority). Installation: install nginx with nginx-http-flv-module, http_ssl_module, and http_xslt_module (see above). Verify all config files are updated to the paths you want to use (check every file, there is a lot to configure). Copy the config files from src/nginx to your nginx config directory (default /etc/nginx if you used my config line) and restart nginx. Install pgsql and set up your database user. Import the SQL files from src/pgsql to your database; this sets up the two required tables. Make sure you update line 18 of subscribers.sql, line 19 of chat.sql, and line 23 of users.sql to your database user account. Edit lib/database.class.php with your DB info. Edit inc/config.php to your liking. Copy everything but src and scss to your server. If you wish to use Sass to edit the site layouts/colors, all the files are in src/css and the main file is scss/application.css; otherwise, just copy css to use the pre-compiled versions (css/site.css is required either way). The nginx user needs the following access: execute rights to rtmp/sslive.sh and rtmp/convert.sh; write rights to /var/log/rachni; full access to your recordings folder (configured in rtmp.conf and main.conf). | nginx-rtmp stream php7 javascript | front_end |
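The private-stream-key flow described above is typically wired up in nginx-rtmp with an `on_publish` callback that asks the site's backend to validate the key before the stream is accepted. A minimal sketch is below; it is illustrative only, not the project's actual configuration (which lives in src/nginx), and the callback URL is an assumption.

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            # Ask the PHP backend to validate the private stream key before
            # accepting the publish; a non-2xx response rejects the stream.
            # (URL is illustrative, not the project's real endpoint.)
            on_publish http://127.0.0.1/api/on_publish.php;
            # Recording is started/stopped on demand elsewhere, not here.
            record off;
        }
    }
}
```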
machine_learning_project | Application URL: HousingPredictor (https://ml-regression-app.herokuapp.com). Start of machine learning project. Software and account requirements: 1. GitHub account (https://github.com); 2. Heroku account (https://dashboard.heroku.com/login); 3. VS Code IDE (https://code.visualstudio.com/download); 4. Git CLI (https://git-scm.com/downloads); 5. Git documentation (https://git-scm.com/docs/gittutorial). Creating a conda environment: `conda create -p venv python==3.7 -y`, then `conda activate venv/` (or `conda activate venv`), then `pip install -r requirements.txt`. To add files to git: `git add .` or `git add <file_name>` (note: to make git ignore a file or folder, write its name in the .gitignore file). To check the git status: `git status`. To check all versions maintained by git: `git log`. To create a version (commit all changes): `git commit -m "message"`. To send version changes to GitHub: `git push origin main`. To check the remote URL: `git remote -v`. To set up a CI/CD pipeline in Heroku we need three pieces of information: 1. Heroku email (anishyadav7045075175@gmail.com); 2. Heroku API key; 3. Heroku app name (ml-regression-app). Build a Docker image: `docker build -t <image_name>:<tagname> .` (note: the image name for Docker must be lowercase). To list Docker images: `docker images`. Run a Docker image: `docker run -p 5000:5000 -e PORT=5000 f8c749e73678`. To check running containers in Docker: `docker ps`. To stop a Docker container: `docker stop <container_id>`. `python setup.py install`. Install ipykernel: `pip install ipykernel`. Data drift: when your dataset's statistics change, we call it data drift. Write a function to get the training file path from the artifact dir. | ai |
|
CZ4045 | cz4045 nanyang technological university school of computer science and engineering academic year 2020 2021 semester 1 source code for cz4045 natural language processing assignments code written or modified by me assignment 1 assignment 3 1 sentence segmentation ipynb assignment 1 assignment 3 1 scripts extract pdf text py assignment 1 assignment 3 1 scripts scrape dota2 patch notes py assignment 2 2 2 ner q2 part 1to3 ipynb original repositories assignment 1 https github com yongwei12 nlpassignment1 assignment 2 https github com jsheng1996 nlpassignment2 | natural-language-processing sentence-segmentation lstm python nltk pytorch | ai |
Maxim | maxim ok so this was a quickly hacked together thing we did for a mooc in 2013 i really don t think you want to use this anymore but i ll leave it here for posterity cross platform javascript java audio dsp and mobile web development library compatible with processing maxim is designed to make it easier to program cross platform audio for desktops and mobile platforms it provides a single api for building complex audio applications on android ios and the desktop using the webaudioapi in combination with traditional java approaches for compatibility it s a work in progress but vastly simplifies the process of getting started writing audio and music software for mobile platforms some notes if you are using javascript mode make sure your browser supports webaudioapi properly see here for a list of browsers that support webaudio http caniuse com audio api | front_end
|
stress-ng | stress ng stress next generation a href https repology org project stress ng versions img src https repology org badge vertical allrepos stress ng svg alt packaging status align right a stress ng will stress test a computer system in various selectable ways it was designed to exercise various physical subsystems of a computer as well as the various operating system kernel interfaces stress ng features 310 stress tests 80 cpu specific stress tests that exercise floating point integer bit manipulation and control flow 20 virtual memory stress tests 40 file system stress tests 30 memory cpu cache stress tests portable builds on linux debian devuan rhel fedora centos slackware opensuse ubuntu etc solaris freebsd netbsd openbsd dragonflybsd minix android macos x serenity os gnu hurd haiku windows subsystem for linux and sunos dilos solaris with gcc musl gcc clang icc icx tcc and pcc tested on alpha armel armhf arm64 hppa i386 m68k mips32 mips64 power32 ppc64el risc v sh4 s390x sparc64 x86 64 stress ng was originally intended to make a machine work hard and trip hardware issues such as thermal overruns as well as operating system bugs that only occur when a system is being thrashed hard use stress ng with caution as some of the tests can make a system run hot on poorly designed hardware and also can cause excessive system thrashing which may be difficult to stop stress ng can also measure test throughput rates this can be useful to observe performance changes across different operating system releases or types of hardware however it has never been intended to be used as a precise benchmark test suite so do not use it in this manner running stress ng with root privileges will adjust out of memory settings on linux systems to make the stressors unkillable in low memory situations so use this judiciously with the appropriate privilege stress ng can allow the ionice class and ionice levels to be adjusted again this should be used with care tarballs tarballs of 
each version of stress ng can be downloaded using the url https github com colinianking stress ng tarball version where version is the relevant version number for example https github com colinianking stress ng tarball v0 13 05 running latest stress ng snapshot in a container bash docker run rm ghcr io colinianking stress ng help or bash docker run rm colinianking stress ng help debian packages for ubuntu recent versions of stress ng are available in the ubuntu stress ng ppa for various ubuntu releases https launchpad net colin king archive ubuntu stress ng sudo add apt repository ppa colin king stress ng sudo apt update sudo apt install stress ng building stress ng to build the following libraries will ensure a fully functional stress ng build note libattr is not required for more recent distro releases debian ubuntu gcc g libaio dev libapparmor dev libatomic1 libattr1 dev libbsd dev libcap dev libeigen3 dev libgbm dev libgcrypt dev libglvnd dev libipsec mb dev libjpeg dev libjudy dev libkeyutils dev libkmod dev libmd dev libmpfr dev libsctp dev libxxhash dev zlib1g dev rhel fedora centos gcc g eigen3 devel judy devel keyutils libs devel kmod devel libaio devel libatomic libattr devel libbsd devel libcap devel libgbm devel libgcrypt devel libglvnd core devel libglvnd devel libjpeg devel libmd devel mpfr devel libx11 devel libxau devel libxcb devel lksctp tools devel xorg x11 proto devel xxhash devel zlib devel rhel fedora centos static builds gcc g eigen3 devel glibc static judy devel keyutils libs devel libaio devel libatomic static libattr devel libbsd devel libcap devel libgbm devel libgcrypt devel libglvnd core devel libglvnd devel libjpeg devel libmd devel libx11 devel libxau devel libxcb devel lksctp tools devel mpfr devel xorg x11 proto devel xxhash devel zlib devel suse gcc gcc c eigen3 devel keyutils devel libaio devel libapparmor devel libatomic1 libattr devel libbsd devel libcap devel libgbm devel libglvnd devel libjpeg turbo libkmod devel libmd devel
libseccomp devel lksctp tools devel mpfr devel xxhash devel zlib devel clearlinux devpkg eigen devpkg judy devpkg kmod devpkg attr devpkg libbsd devpkg libgcrypt devpkg libjpeg turbo devpkg libsctp devpkg mesa alpine linux build base eigen dev jpeg dev judy dev keyutils dev kmod dev libaio dev libatomic libattr libbsd dev libcap dev libgcrypt dev libmd dev libseccomp dev lksctp tools dev mesa dev mpfr dev xxhash dev zlib dev note the build will try to detect build dependencies and will build an image with functionality disabled if the support libraries are not installed at build time stress ng will detect kernel features that are available on the target build system and enable stress tests appropriately stress ng has been build tested on ubuntu debian debian gnu hurd slackware rhel sles centos kfreebsd openbsd netbsd freebsd debian kfreebsd dragonfly bsd os x minix solaris 11 3 openindiana and haiku ports to other posix unix like operating systems should be relatively easy note always run make clean after fetching changes from the git repository to force the build to regenerate the build configuration file parallel builds using make j are supported to build on bsd systems one requires gcc and gnu make cc gcc gmake clean cc gcc gmake to build on os x systems just use make clean make j to build on minix gmake and clang are required cc clang gmake clean cc clang gmake to build on sunos one requires gcc and gnu make build using cc gcc gmake clean cc gcc gmake to build on dilos one requires gcc and gnu make build using cc gcc gmake clean cc gcc gmake to build on haiku alpha 4 make clean make to build a static image example for android use make clean static 1 make to build with full warnings enabled make clean pedantic 1 make to build with the tiny c compiler make clean cc tcc make to build with the pcc portable c compiler use make clean cc pcc make to build with the musl c library make clean cc musl gcc to build with the intel c compiler icc use make clean cc icc make
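The container, packaging and build commands quoted above lost their `-`, `=`, `:` and `/` characters when the text was flattened. The following block prints the reconstructed forms (image names, the PPA, and the make variables are taken from the text above; run them on a matching system rather than from this printer):

```shell
# Print the reconstructed install/build commands from the sections above.
print_build_cmds() {
    cat <<'EOF'
# latest snapshot in a container
docker run --rm ghcr.io/colinianking/stress-ng --help
docker run --rm colinianking/stress-ng --help
# ubuntu ppa
sudo add-apt-repository ppa:colin-king/stress-ng
sudo apt update
sudo apt install stress-ng
# build from source on linux
make clean
make -j"$(nproc)"
# bsd, sunos and dilos (gcc and gnu make)
CC=gcc gmake clean
CC=gcc gmake
# static build, e.g. for android
make clean
STATIC=1 make
EOF
}
print_build_cmds
```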
to build with the intel c compiler icx use make clean cc icx make to perform a cross compilation using gcc use a static build specify the toolchain both cc and cxx for example a mips64 cross build make clean static 1 cc mips64 linux gnuabi64 gcc cxx mips64 linux gnuabi64 g make j nproc contributing to stress ng send patches to colin i king gmail com or merge requests at https github com colinianking stress ng quick start reference guide the ubuntu stress ng reference guide https wiki ubuntu com kernel reference stress ng contains a brief overview and worked examples examples run 4 cpu 2 virtual memory 1 disk and 8 fork stressors for 2 minutes and print measurements stress ng cpu 4 vm 2 hdd 1 fork 8 timeout 2m metrics stress ng info 573366 setting to a 120 second 2 mins 0 00 secs run per stressor stress ng info 573366 dispatching hogs 4 cpu 2 vm 1 hdd 8 fork stress ng info 573366 successful run completed in 123 78s 2 mins 3 78 secs stress ng info 573366 stressor bogo ops real time usr time sys time bogo ops s bogo ops s cpu used per stress ng info 573366 secs secs secs real time usr sys time instance stress ng info 573366 cpu 515396 120 00 453 02 0 18 4294 89 1137 24 94 42 stress ng info 573366 vm 2261023 120 01 223 80 1 80 18840 15 10022 27 93 99 stress ng info 573366 hdd 367558 123 78 10 63 11 67 2969 49 16482 42 18 02 stress ng info 573366 fork 598058 120 00 68 24 65 88 4983 80 4459 13 13 97 run matrix stressor on all online cpus for 60 seconds and measure temperature stress ng matrix 1 tz t 60 stress ng info 1171459 setting to a 60 second run per stressor stress ng info 1171459 dispatching hogs 8 matrix stress ng info 1171459 successful run completed in 60 01s 1 min 0 01 secs stress ng info 1171459 matrix stress ng info 1171459 acpitz0 75 00 c 348 15 k stress ng info 1171459 acpitz1 75 00 c 348 15 k stress ng info 1171459 pch skylake 60 17 c 333 32 k stress ng info 1171459 x86 pkg temp 62 72 c 335 87 k run a mix of 4 i o stressors and check for changes in disk s 
m a r t metadata sudo stress ng iomix 4 smart t 30s stress ng info 1171471 setting to a 30 second run per stressor stress ng info 1171471 dispatching hogs 4 iomix stress ng info 1171471 successful run completed in 30 37s stress ng info 1171471 device id s m a r t attribute value change stress ng info 1171471 sdc 01 read error rate 88015771 71001 stress ng info 1171471 sdc 07 seek error rate 59658169 92 stress ng info 1171471 sdc c3 hardware ecc recovered 88015771 71001 stress ng info 1171471 sdc f1 total lbas written 481904395 877 stress ng info 1171471 sdc f2 total lbas read 3768039248 5139 stress ng info 1171471 sdd be temperature difference 3670049 1 benchmark system calls using the vdso stress ng vdso 1 t 5 metrics stress ng info 1171584 setting to a 5 second run per stressor stress ng info 1171584 dispatching hogs 1 vdso stress ng info 1171585 stress ng vdso exercising vdso functions clock gettime time gettimeofday getcpu stress ng info 1171585 stress ng vdso 9 88 nanoseconds per call excluding 1 73 nanoseconds test overhead stress ng info 1171584 successful run completed in 5 10s stress ng info 1171584 stressor bogo ops real time usr time sys time bogo ops s bogo ops s cpu used per stress ng info 1171584 secs secs secs real time usr sys time instance stress ng info 1171584 vdso 430633496 5 10 5 10 0 00 84375055 96 84437940 39 99 93 stress ng info 1171584 vdso 9 88 nanoseconds per call average per stressor generate and measure branch misses using perf metrics sudo stress ng branch 1 perf t 10 stdout grep branch stress ng info 1171714 604 703 327 branch instructions 53 30 m sec stress ng info 1171714 598 760 234 branch misses 52 77 m sec 99 02 bugs and regressions found with stress ng stress ng has found kernel and qemu bugs regressions and appropriate fixes have been landed to address these issues 2015 keys ensure we free the assoc array edit if edit is valid https git kernel org cgit linux kernel git torvalds linux git commit id 
ca4da5dd1f99fe9c59f1709fb43e818b18ad20e0 proc fix esrch error when writing to proc pid coredump filter https git kernel org cgit linux kernel git torvalds linux git commit id 41a0c249cb8706a2efa1ab3d59466b23a27d0c8b smp divide error https bugs centos org view php id 14366 2016 fs locks c kernel oops during posix lock stress test https lkml org lkml 2016 11 27 212 sched core fix a race between try to wake up and a woken up task https git kernel org pub scm linux kernel git torvalds linux git commit id 135e8c9250dd5c8c9aae5984fde6f230d0cbfeaf devpts fix null pointer dereference on failed memory allocation https git kernel org cgit linux kernel git torvalds linux git commit id 5353ed8deedee9e5acb9f896e9032158f5d998de arm64 do not enforce strict 16 byte alignment to stack pointer https git kernel org cgit linux kernel git torvalds linux git commit id e6d9a52543338603e25e71e0e4942f05dae0dd8a 2017 arm dts meson8b add reserved memory zone to fix silent freezes https git kernel org pub scm linux kernel git torvalds linux git commit id b9b4bf504c9e94fe38b93aa2784991c80cebcf2e arm64 dts meson gx add firmware reserved memory zones https git kernel org pub scm linux kernel git torvalds linux git commit id bba8e3f42736cf7f974968a818e53b128286ad1d ext4 lock the xattr block before checksuming it https git kernel org pub scm linux kernel git torvalds linux git commit id dac7a4b4b1f664934e8b713f529b629f67db313c rcu preempt detected stalls on cpus tasks https lkml org lkml 2017 8 28 574 bug unable to handle kernel null pointer dereference https lkml org lkml 2017 10 30 247 warning possible circular locking dependency detected https www spinics net lists kernel msg2679315 html 2018 illumos ofdlock assertion failed lckdat l start 0 https www illumos org issues 9061 debugobjects use global free list in debug check no obj freed https git kernel org pub scm linux kernel git torvalds linux git commit id 1ea9b98b007a662e402551a41a4413becad40a65 ext4 validate inode bitmap 99 comm stress ng 
corrupt inode bitmap https bugs launchpad net ubuntu source linux bug 1780137 virtio s390 fix race in ccw io helper https git kernel org pub scm linux kernel git torvalds linux git commit id 78b1a52e05c9db11d293342e8d6d8a230a04b4e7 2019 mm page idle c fix oops because end pfn is larger than max pfn https git kernel org pub scm linux kernel git next linux next git commit mm page idle c id d96d6145d9796d5f1eac242538d45559e9a23404 mm compaction avoid 100 cpu usage during compaction when a task is killed https git kernel org pub scm linux kernel git torvalds linux git commit id 670105a25608affe01cb0ccdc2a1f4bd2327172b mm vmalloc c preload a cpu with one object for split purpose https git kernel org pub scm linux kernel git torvalds linux git commit id 82dd23e84be3ead53b6d584d836f51852d1096e6 perf evlist use unshare clone fs in sb threads to let setns clone newns work https git kernel org cgit linux kernel git torvalds linux git commit id b397f8468fa27f08b83b348ffa56a226f72453af riscv reject invalid syscalls below 1 https git kernel org cgit linux kernel git torvalds linux git commit id 556f47ac6083d778843e89aa21b1242eee2693ed 2020 risc v don t allow write exec only page mapping request in mmap https git kernel org cgit linux kernel git torvalds linux git commit id e0d17c842c0f824fd4df9f4688709fc6907201e1 riscv set max pfn to the pfn of the last page https git kernel org cgit linux kernel git torvalds linux git commit id c749bb2d554825e007cbc43b791f54e124dadfce crypto hisilicon update sec driver module parameter https git kernel org cgit linux kernel git torvalds linux git commit id 57b1aac1b426b7255afa195298ed691ffea204c6 net atm fix update of position index in lec seq next https git kernel org pub scm linux kernel git torvalds linux git commit id 2f71e00619dcde3d8a98ba3e7f52e98282504b7d sched debug fix memory corruption caused by multiple small reads of flags https git kernel org pub scm linux kernel git torvalds linux git commit id 
8d4d9c7b4333abccb3bf310d76ef7ea2edb9828f ocfs2 ratelimit the max lookup times reached notice https git kernel org pub scm linux kernel git torvalds linux git commit id 45680967ee29e67b62e6800a8780440b840a0b1f using perf can crash kernel with a stack overflow https bugs launchpad net ubuntu source linux bug 1875941 stress ng on gcov enabled focal kernel triggers oops https bugs launchpad net ubuntu source linux bug 1879470 kernel bug list del corruption on s390x from stress ng mknod and stress ng symlink https bugzilla redhat com show bug cgi id 1849196 2021 sparc64 fix opcode filtering in handling of no fault loads https git kernel org pub scm linux kernel git torvalds linux git commit id e5e8b80d352ec999d2bba3ea584f541c83f4ca3f opening a file with o direct on a file system that does not support it will leave an empty file https bugzilla kernel org show bug cgi id 213041 locking atomic sparc fix arch cmpxchg64 local https git kernel org pub scm linux kernel git torvalds linux git commit id 7e1088760cfe0bb1fdb1f0bd155bfd52f080683a btrfs fix exhaustion of the system chunk array due to concurrent allocations https git kernel org pub scm linux kernel git torvalds linux git commit id 986aa0f276752ca4809f95b260f59fafef01a6a7 btrfs rework chunk allocation to avoid exhaustion of the system chunk array https git kernel org cgit linux kernel git torvalds linux git commit id 79bd37120b149532af5b21953643ed74af69654f btrfs fix deadlock with concurrent chunk allocations involving system chunks https git kernel org cgit linux kernel git torvalds linux git commit id 1cb3db1cf383a3c7dbda1aa0ce748b0958759947 locking atomic sparc fix arch cmpxchg64 local https git kernel org cgit linux kernel git torvalds linux git commit id 7e1088760cfe0bb1fdb1f0bd155bfd52f080683a pipe do fasync notifications for every pipe io not just state changes https git kernel org cgit linux kernel git torvalds linux git commit id fe67f4dd8daa252eb9aa7acb61555f3cc3c1ce4c io wq remove gfp atomic allocation off 
schedule out path https git kernel org cgit linux kernel git torvalds linux git commit id d3e9f732c415cf22faa33d6f195e291ad82dc92e mm swap consider max pages in iomap swapfile add extent https git kernel org cgit linux kernel git torvalds linux git commit id 36ca7943ac18aebf8aad4c50829eb2ea5ec847df block loop fix deadlock between open and remove https git kernel org cgit linux kernel git torvalds linux git commit id 990e78116d38059c9306cf0560c1c4ed1cf358d3 2022 copy process move fd install out of sighand siglock critical section https git kernel org cgit linux kernel git torvalds linux git commit id ddc204b517e60ae64db34f9832dc41dafa77c751 minix fix bug when opening a file with o direct https git kernel org cgit linux kernel git torvalds linux git commit id 9ce3c0d26c42d279b6c378a03cd6a61d828f19ca arch arm64 fix topology initialization for core scheduling https git kernel org cgit linux kernel git torvalds linux git commit id 5524cbb1bfcdff0cad0aaa9f94e6092002a07259 running stress ng on minux 3 4 0 rc6 on amd64 assert in vm region c 313 https github com stichting minix research foundation minix issues 333 unshare test triggers unhandled page fault https bugs launchpad net ubuntu source linux bug 1959215 request module dos https www spinics net lists kernel msg4349826 html numa benchmark regression in linux 5 18 https lore kernel org lkml ymrwk 2fkou1zraxpi fuller cnet underflow in mas spanning rebalance and test https lore kernel org linux mm 20220625003854 1230114 1 liam howlett oracle com mm huge memory do not clobber swp entry t during thp split https git kernel org cgit linux kernel git torvalds linux git commit id 71e2d666ef85d51834d658830f823560c402b8b6 apparmor 42 5 regression of stress ng kill ops per sec due to commit https lkml org lkml 2022 12 31 27 clocksource suspend the watchdog temporarily when high read lantency detected https lore kernel org lkml 20221220082512 186283 1 feng tang intel com t 2023 qemu system m68k segfaults on opcode 0x4848 https 
gitlab com qemu project qemu issues 1462 rtmutex ensure that the top waiter is always woken up https git kernel org cgit linux kernel git torvalds linux git commit id db370a8b9f67ae5f17e3d5482493294467784504 mm swap fix swap info struct race between swapoff and get swap pages https git kernel org cgit linux kernel git torvalds linux git commit id 6fe7d6b992113719e96744d974212df3fcddc76c block bfq fix division by zero error on zero wsum https git kernel org cgit linux kernel git torvalds linux git commit id e53413f8deedf738a6782cc14cc00bd5852ccf18 riscv mm ensure prot of vm write and vm exec must be readable https git kernel org pub scm linux kernel git torvalds linux git commit id 6569fc12e442ea973d96db39e542aa19a7bc3a79 revert mm vmscan make global slab shrink lockless https git kernel org pub scm linux kernel git torvalds linux git commit id 71c3ad65fabec9620d3f548b2da948c79c7ad9d5 crash hang in mm swapfile c 718 add to avail list when exercising stress ng https bugzilla kernel org show bug cgi id 217738 mm fix zswap writeback race condition https git kernel org pub scm linux kernel git torvalds linux git commit id 04fc7816089c5a32c29a04ec94b998e219dfb946 x86 fpu set x86 feature osxsave feature after enabling osxsave in cr4 https git kernel org pub scm linux kernel git torvalds linux git commit id 2c66ca3949dc701da7f4c9407f2140ae425683a5 kernel fork beware of put task struct calling context https git kernel org pub scm linux kernel git torvalds linux git commit id d243b34459cea30cfe5f3a9b2feb44e7daff9938 arm64 dts ls1028a add l1 and l2 cache info https git kernel org pub scm linux kernel git torvalds linux git commit id fcf7ff67a2aa6d8055b9b815ad8a28a5231afa1e kernel improvements that used stress ng 2020 selinux complete the inlining of hashtab functions https git kernel org cgit linux kernel git torvalds linux git commit id 54b27f9287a7b3dfc85549f01fc9d292c92c68b9 selinux store role transitions in a hash table https git kernel org cgit linux kernel git torvalds 
linux git commit id e67b2ec9f6171895e774f6543626913960e019df sched rt optimize checking group rt scheduler constraints https git kernel org cgit linux kernel git torvalds linux git commit id b4fb015eeff7f3e5518a7dbe8061169a3e2f2bc7 sched fair handle case of task h load returning 0 https git kernel org cgit linux kernel git torvalds linux git commit id 01cfcde9c26d8555f0e6e9aea9d6049f87683998 sched deadline unthrottle pi boosted threads while enqueuing https git kernel org cgit linux kernel git torvalds linux git commit id feff2e65efd8d84cf831668e182b2ce73c604bbb mm fix madvise willneed performance problem https git kernel org cgit linux kernel git torvalds linux git commit id 66383800df9cbdbf3b0c34d5a51bf35bcdb72fd2 powerpc dma fix dma map ops get required mask https git kernel org cgit linux kernel git torvalds linux git commit id 437ef802e0adc9f162a95213a3488e8646e5fc03 stress ng close causes kernel oops es v5 6 rt and v5 4 rt https www spinics net lists linux rt users msg22350 html 2021 revert mm slub consider rest of partial list if acquire slab fails https git kernel org cgit linux kernel git torvalds linux git commit id 9b1ea29bc0d7b94d420f96a0f4121403efc3dd85 mm memory add orig pmd to struct vm fault https git kernel org cgit linux kernel git torvalds linux git commit id 5db4f15c4fd7ae74dd40c6f84bf56dfcf13d10cf selftests powerpc add test of mitigation patching https git kernel org cgit linux kernel git torvalds linux git commit id 34f7f79827ec4db30cff9001dfba19f496473e8d dm crypt avoid percpu counter spinlock contention in crypt page alloc https git kernel org cgit linux kernel git torvalds linux git commit id 528b16bfc3ae5f11638e71b3b63a81f9999df727 mm migrate optimize hotplug time demotion order updates https git kernel org cgit linux kernel git torvalds linux git commit id 295be91f7ef0027fca2f2e4788e99731aa931834 powerpc rtas rtas busy delay improvements https git kernel org cgit linux kernel git torvalds linux git commit id 
38f7b7067dae0c101be573106018e8af22a90fdf 2022 sched core accounting forceidle time for all tasks except idle task https git kernel org cgit linux kernel git torvalds linux git commit id b171501f258063f5c56dd2c5fdf310802d8d7dc1 ipc mqueue use get tree nodev in mqueue get tree https git kernel org cgit linux kernel git torvalds linux git commit id d60c4d01a98bc1942dba6e3adc02031f5519f94b 2023 mm swapfile add cond resched in get swap pages https git kernel org cgit linux kernel git torvalds linux git commit id 7717fc1a12f88701573f9ed897cc4f6699c661e3 module add debug stats to help identify memory pressure https git kernel org cgit linux kernel git torvalds linux git commit id df3e764d8e5cd416efee29e0de3c93917dff5d33 module avoid allocation if module is already present and ready https git kernel org cgit linux kernel git torvalds linux git commit id 064f4536d13939b6e8cdb71298ff5d657f4f8caa sched interleave cfs bandwidth timers for improved single thread performance at low utilization https git kernel org cgit linux kernel git torvalds linux git commit id 41abdba9374734b743019fc1cc05e3225c82ba6b filemap add filemap map order0 folio to handle order0 folio https git kernel org cgit linux kernel git torvalds linux git commit id c8be03806738c86521dbf1e0503bc90855fb99a3 presentations stress ng presentation at elce 2019 lyon https static sched com hosted files osseu19 29 lyon stress ng presentation oct 2019 pdf video of the above presentation https www youtube com watch v 8qaxstkfq3a linux foundation mentoring session may 2022 https www youtube com watch v gd3hn02vsha kernel recipes presentation sept 2023 https www youtube com watch v pd0nozctivq citations linux kernel performance test tool https cdrdv2 public intel com 723902 lkp tests pdf 2015 enhancing cloud energy models for optimizing datacenters efficiency https cs gmu edu menasce cs788 papers iccac2015 outin pdf tejo a supervised anomaly detection scheme for newsql databases https hal archives ouvertes fr hal 01211772 
document coma resource monitoring of docker containers http www scitepress org papers 2015 54480 54480 pdf an investigation of cpu utilization relationship between host and guests in a cloud infrastructure http www diva portal org smash get diva2 861239 fulltext02 2016 increasing platform determinism pqos dpdk https www intel com content dam www public us en documents white papers increasing platform determinism pqos dpdk paper pdf towards energy efficient data management in hpc the open ethernet drive approach http www pdsw org pdsw discs16 papers p43 kougkas pdf cpu and memory performance analysis on dynamic and dedicated resource allocation using xenserver in data center environment http ieeexplore ieee org document 7877341 how much power does your server consume estimating wall socket power using rapl measurements http www ena hpc org 2016 pdf khan et al enahpc pdf devops for iot applications using cellular networks and cloud https www ericsson com assets local publications conference papers devops pdf a virtual network function workload simulator https uu diva portal org smash get diva2 1043751 fulltext01 pdf characterizing and reducing cross platform performance variability using os level virtualization http www lofstead org papers 2016 varsys pdf how much power does your server consume estimating wall socket power using rapl measurements https www researchgate net publication 306004997 how much power does your server consume estimating wall socket power using rapl measurements uie user centric interference estimation for cloud applications https www3 cs stonybrook edu anshul ic2e16 uie pdf 2017 auto scaling of containers the impact of relative and absolute metrics https www researchgate net publication 319905237 auto scaling of containers the impact of relative and absolute metrics testing the windows subsystem for linux https blogs msdn microsoft com wsl 2017 04 11 testing the windows subsystem for linux practical analysis of the precision time protocol 
under different types of system load http www diva portal org smash get diva2 1106630 fulltext01 pdf towards virtual machine energy aware cost prediction in clouds http eprints whiterose ac uk 124309 1 paper final pdf algorithms and architectures for parallel processing https books google co uk books id s4wwdwaaqbaj pg pa7 lpg pa7 dq http kernel ubuntu com cking stress ng source bl ots bvzccbq2io sig riqkwyehgmvposajiemkjggev0m hl en sa x ved 0ahukewifo6lo2fbxahwbtxqkhrcndy04hhdoaqgumae v onepage q http 3a 2f 2fkernel ubuntu com 2f cking 2fstress ng 2f f false advanced concepts and tools for renewable energy supply of data centres https www riverpublishers com pdf ebook rp 9788793519411 pdf monitoring and modelling open compute servers http staff www ltu se damvar publications eriksson 20et 20al 20 202017 20 20monitoring 20and 20modelling 20open 20compute 20servers pdf experimental and numerical analysis for potential heat reuse in liquid cooled data centres http personals ac upc edu jguitart homepagefiles ecm16 pdf modeling and analysis of performance under interference in the cloud https www3 cs stonybrook edu anshul mascots17 pdf effectively measure and reduce kernel latencies for real time constraints https elinux org images a a9 elc2017 effectively measure and reduce kernel latencies for real time constraints 281 29 pdf monitoring and analysis of cpu load relationships between host and guests in a cloud networking infrastructure http www diva portal org smash get diva2 861235 fulltext02 measuring the impacts of the preempt rt patch http events17 linuxfoundation org sites events files slides rtpatch pdf reliable library identification using vmi techniques https rp os3 nl 2016 2017 p64 report pdf elastic ppq a two level autonomic system for spatial preference query processing over dynamic data stream https www researchgate net publication 319613604 elastic ppq a two level autonomic system for spatial preference query processing over dynamic data streams openepc 
integration within 5gtn as an nfv proof of concept http jultika oulu fi files nbnfioulu 201706082638 pdf time aware dynamic binary instrumentation https uwspace uwaterloo ca bitstream handle 10012 12182 arafa pansy pdf sequence 3 experience report log mining using natural language processing and application to anomaly detection https hal laas fr hal 01576291 document mixed time criticality process interferences characterization on a multicore linux system https re public polimi it retrieve handle 11311 1033069 292404 paper accepted version pdf cloud orchestration at the level of application https ec europa eu research participants documents downloadpublic rfppogljenyrcluyk3n5efn4nnvvzejpvel3ttaxcfhxrzrgaxdzn2dmsjbycjnibxb6dlj3pt0 attachment vfeyqtq4m3ptuwq4vdn5uwndyvz0uevswst2rehrv1q 2018 multicore emulation on virtualised environment https indico esa int event 165 contributions 1230 attachments 1195 1412 04b multicore presentation pdf stress sgx load and stress your enclaves for fun and profit https seb vaucher org papers stress sgx pdf quiho automated performance regression testing using inferred resource utilization profiles https dl acm org citation cfm id 3184422 dl acm coll dl preflayout flat hypervisor and virtual machine memory optimization analysis http dspace ut ee bitstream handle 10062 60705 viitkar bsc2018 pdf real time testing with fuego https elinux org images 4 43 elc2018 real time testing with fuego 181024m pdf fecbench an extensible framework for pinpointing sources of performance interference in the cloud edge resource spectrum https www academia edu 68455840 fecbench an extensible framework for pinpointing sources of performance interference in the cloud edge resource spectrum quantifying the interaction between structural properties of software and hardware in the arm big little architecture https research abo fi ws files 26568716 quantifyinginteraction pdf rapl in action experiences in using rapl for power measurements https dl acm org doi 10 
1145 3177754 2019 performance isolation of co located workload in a container based vehicle software architecture https www thinkmind org articles ambient 2019 2 20 40020 pdf analysis and detection of cache based exploits https ssg lancs ac uk wp content uploads 2020 07 analysis and detection vateva pdf kmvx detecting kernel information leaks with multi variant execution https research vu nl ws files 122357910 kmvx pdf scalability of kubernetes running over aws https www diva portal org smash get diva2 1367111 fulltext02 a study on performance measures for auto scaling cpu intensive containerized applications https link springer com article 10 1007 s10586 018 02890 1 scavenger a black box batch workload resource manager for improving utilization in cloud environments https www3 cs stonybrook edu sjavadi files javadi socc2019 pdf estimating cloud application performance based on micro benchmark profiling https core ac uk download pdf 198051426 pdf 2020 performance and energy trade offs for parallel applications on heterogeneous multi processing systems https www mdpi com 1996 1073 13 9 2409 htm c balancer a system for container profiling and scheduling https arxiv org pdf 2009 08912 pdf modelling vm latent characteristics and predicting application performance using semi supervised non negative matrix factorization https ieeexplore ieee org document 9284328 semi dynamic load balancing efficient distributed learning in non dedicated environments https dl acm org doi 10 1145 3419111 3421299 a performance analysis of hardware assisted security technologies https openscholarship wustl edu cgi viewcontent cgi article 1556 context eng etds green cloud software engineering for big data processing https eprints leedsbeckett ac uk id eprint 7294 1 greencloudsoftwareengineeringforbigdataprocessingpv kor pdf real time detection for cache side channel attack using performance counter monitor https www proquest com docview 2533920884 subverting linux integrity measurement 
architecture https svs informatik uni hamburg de publications 2020 2020 08 27 bohling ima pdf real time performance assessment using fast interrupt request on a standard linux kernel https onlinelibrary wiley com doi full 10 1002 eng2 12114 low energy consumption on post moore platforms for hpc research https revistas usfq edu ec index php avances article download 2108 2919 18081 managing latency in edge cloud environment https s2group cs vu nl files pubs 2020 jss ig edge cloud pdf demystifying the real time linux scheduling latency https bristot me files research papers ecrts2020 deoliveira2020demystifying pdf 2021 streamline a fast flushless cache covert channel attack by enabling asynchronous collusion https dl acm org doi pdf 10 1145 3445814 3446742 experimental analysis in hadoop mapreduce a closer look at fault detection and recovery techniques https www mdpi com 1131714 performance characteristics of the bluefield 2 smartnic https arxiv org pdf 2105 06619 pdf evaluating latency in multiprocessing embedded systems for the smart grid https www mdpi com 1996 1073 14 11 3322 work in progress timing diversity as a protective mechanism https dl acm org doi pdf 10 1145 3477244 3477614 sequential deep learning architectures for anomaly detection in virtual network function chains https arxiv org pdf 2109 14276 pdf wattedge a holistic approach for empirical energy measurements in edge computing https www researchgate net publication 356342806 wattedge a holistic approach for empirical energy measurements in edge computing ptemagnet fine grained physical memory reservation for faster page walks in public clouds https www pure ed ac uk ws portalfiles portal 196157550 ptemagnet margaritov doa19112020 afv pdf the price of meltdown and spectre energy overhead of mitigations at operating system level https www4 cs fau de publications 2021 herzog 2021 eurosec pdf an empirical study of thermal attacks on edge platforms https digitalcommons kennesaw edu cgi viewcontent cgi 
article 1590 context undergradsymposiumksu sage practical scalable ml driven performance debugging in microservices https people csail mit edu delimitrou papers 2021 asplos sage pdf a generalized approach for practical task allocation using a mape k control loop https www marquez barja com images papers a generalized approach for software placement in the fog using a mape k control loop authorversion pdf towards independent run time cloud monitoring https research spec org icpe proceedings 2021 companion p21 pdf firestarter 2 dynamic code generation for processor stress tests https tu dresden de zih forschung ressourcen dateien projekte firestarter firestarter 2 dynamic code generation for processor stress tests pdf lang en 2022 a general method for evaluating the overhead when consolidating servers performance degradation in virtual machines and containers https link springer com article 10 1007 s11227 022 04318 5 fedcomm understanding communication protocols for edge based federated learning https arxiv org pdf 2208 08764 pdf achieving isolation in mixed criticality industrial edge systems with real time containers https drops dagstuhl de opus volltexte 2022 16332 pdf lipics ecrts 2022 15 pdf design and implementation of machine learning based fault prediction system in cloud infrastructure https www mdpi com 2079 9292 11 22 3765 the tsn building blocks in linux https arxiv org pdf 2211 14138 pdf ukharon a membership service for microsecond applications https www usenix org system files atc22 guerraoui pdf evaluating secure enclave firmware development for contemporary risc v workstations https scholar afit edu cgi viewcontent cgi article 6319 context etd evaluation of real time linux on risc v processor architecture https trepo tuni fi bitstream handle 10024 138547 j c3 a4mb c3 a4ckmarkus pdf hertzbleed turning power side channel attacks into remote timing attacks on x86 https www hertzbleed com hertzbleed pdf 2023 fight hardware
with hardware system wide detection and mitigation of side channel attacks using performance counters https dl acm org doi 10 1145 3519601 introducing k4 0s a model for mixed criticality container orchestration in industry 4 0 https arxiv org pdf 2205 14188 pdf a comprehensive study on optimizing systems with data processing units https arxiv org pdf 2301 06070 pdf estimating cloud application performance based on micro benchmark profiling https research chalmers se publication 506903 file 506903 fulltext pdf pspray timing side channel based linux kernel heap exploitation technique https lifeasageek github io papers yoochan pspray pdf robust and accurate performance anomaly detection and prediction for cloud applications a novel ensemble learning based framework https journalofcloudcomputing springeropen com articles 10 1186 s13677 022 00383 6 fn4 feasibility study for a python based embedded real time control system https www mdpi com 2079 9292 12 6 1426 adaptation of parallel saas to heterogeneous co located cloud resources https www mdpi com 2076 3417 13 8 5115 b56 applsci 13 05115 a methodology and framework to determine the isolation capabilities of virtualisation technologies https dl acm org doi pdf 10 1145 3578244 3583728 data station delegated trustworthy and auditable computation to enable data sharing consortia with a data escrow https arxiv org pdf 2305 03842 pdf an empirical study of resource stressing faults in edge computing applications https dl acm org doi pdf 10 1145 3578354 3592873 finding flaky tests in javascript applications using stress and test suite reordering https repositories lib utexas edu handle 2152 120282 the power of telemetry uncovering software based side channel attacks on apple m1 m2 systems https arxiv org pdf 2306 16391 pdf a performance evaluation of embedded multi core mixed criticality system based on preempt rt linux https www jstage jst go jp article ipsjjip 31 0 31 78 pdf data leakage in isolated virtualized enterprise 
computing systems https scholar smu edu cgi viewcontent cgi article 1034 context engineering compsci etds considerations for benchmarking network performance in containerized infrastructures https datatracker ietf org doc draft dcn bmwg containerized infra energat fine grained energy attribution for multi tenancy https hotcarbon org 2023 pdf a4 he pdf quantifying the security profile of linux applications https dl acm org doi 10 1145 3609510 3609814 gotham testbed a reproducible iot testbed for security experiments and dataset generation https arxiv org pdf 2207 13981 pdf profiling with trust system monitoring from trusted execution environments https assets researchsquare com files rs 3169665 v1 covered 63751076 8387 429e 8296 3f3cc4c3ed34 pdf c 1689832627 thermal aware on device inference using single layer parallelization with heterogeneous processors https www sciopen com article pdf 10 26599 tst 2021 9010075 pdf towards fast adaptive and hardware assisted user space scheduling https arxiv org pdf 2308 02896 pdf heterogeneous anomaly detection for software systems via semi supervised cross modal attention https arxiv org pdf 2302 06914 pdf green coding an empirical approach to harness the energy consumption of software services https theses hal science tel 04074973 document i am keen to add to the stress ng project page any citations to research or projects that use stress ng i also appreciate information concerning kernel bugs or performance regressions found with stress ng contributors many thanks to the following contributors to stress ng in alphabetical order abdul haleem aboorva devarajan adriand martin adrian ratiu aleksandar n kostadinov alexander kanavin alexandru ardelean alfonso s nchez beato allen h andrey gelman andr wild anisse astier anton eliasson arjan van de ven baruch siach bryan w lewis camille constans carlos santos christian ehrhardt christopher brown chunyu hu danilo krummrich davidson francis david turner dominik b czarnota dorinda
bassey eder zulian eric lin erik stahlman erwan velu fabien malfoy fabrice fontaine fernand sieber florian weimer francis laniel guilherme janczak hui wang hsieh tseng shen iy n m ndez veiga james hunt jan luebbe jianshen liu john kacur jules maselbas julien olivain kenny gong khalid elmously khem raj luca pizzamiglio luis chamberlain luis henriques matthew tippett mauricio faria de oliveira maxime chevallier max kellermann maya rashish mayuresh chitale meysam azad mike koreneff nick hanley paul menzel piyush goyal ralf ramsauer rosen penev siddhesh poyarekar shoily rahman thadeu lima de souza cascardo thia wyrod thinh tran tim gardner tim gates tim orling tommi rantala witold baryluk yong xuan wang zhiyi sun | stress-testing kernel memory disk cpu overheating linux freebsd posix openbsd x86 c | os |
equinox | equinox

An Eva Design (https://eva.design) implementation in Flutter. (Pub version badge: https://pub.dev/packages/equinox)

Screenshots: https://i.imgur.com/nf02pxn.jpg, https://i.imgur.com/oseeyij.jpg, https://i.imgur.com/almhkl8.jpg, https://i.imgur.com/z7uepam.jpg

Tutorials and documentation: you can check out the documentation here: https://pub.dev/documentation/equinox/latest, and the wiki here: https://github.com/kekland/equinox/wiki

Getting started

Depend on it. Add this to your package's pubspec.yaml file:

```yaml
dependencies:
  equinox: 0.3.3
```

Install it. You can install packages from the command line:

```bash
flutter pub get
```

Import it. Now in your Dart code you can use:

```dart
import 'package:equinox/equinox.dart';
```

Setup

You have to replace MaterialApp or CupertinoApp with EquinoxApp:

```dart
class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return EquinoxApp(
      theme: EqThemes.defaultLightTheme,
      title: 'Flutter Demo',
      home: HomePage(),
    );
  }
}
```

Then, instead of a Scaffold, you have to use EqLayout:

```dart
@override
Widget build(BuildContext context) {
  return EqLayout(
    appBar: EqAppBar(
      centerTitle: true,
      title: 'Auth test',
      subtitle: 'v0.0.3',
    ),
    child: MyBody(),
  );
}
```

Use it. Every widget in Equinox is prefixed with Eq, for example EqButton, EqTabs, etc.:

```dart
EqButton(
  appearance: WidgetAppearance.ghost,
  onTap: () {},
  label: 'Log in',
  size: WidgetSize.large,
  status: WidgetStatus.primary,
)
```

Customization

Customization is done using Stylist (https://github.com/kekland/stylist). I will write a guide on styling your app soon.

Other Eva Design implementations: Angular (https://github.com/akveo/nebular), React Native (https://github.com/akveo/react-native-ui-kitten).

Icons: the eva_icons_flutter package (https://github.com/piyushmaurya23/eva_icons_flutter) is already integrated into Equinox, so you can use it right away via EvaIcons.

Credits: the Eva Design team; repository: https://github.com/eva-design/eva

Contact me: e-mail kk.erzhan@gmail.com | flutter ui-kit flutter-components flutter-plugin flutter-package flutter-ui flutter-widget | os
sefirot | sefirot (GitHub Actions test badge, Codecov coverage badge, MIT license badge)

Sefirot is a collection of Vue components for the Global Brain Design System. Components are meant to be clean, sophisticated, and scalable. Sefirot is focused on being used within Global Brain's ecosystem, hence the design/UI/UX of components is relatively fixed and customization capability is limited. In exchange for customizability, we can create components that are more robust, dynamic, and clean. Feel free to leverage any component within this project. You may customize components how you see fit, and perhaps some features may be valuable to you. Any suggestions, requests, or questions are welcome.

Documentation: you can check out the documentation for Sefirot at https://sefirot.globalbrains.com

Contribution: we're really excited that you are interested in contributing to Sefirot. Before submitting your contribution, though, please make sure to take a moment and read through the following guidelines.

Code style guide: Sefirot follows the official Vue style guide (https://v3.vuejs.org/style-guide), but always remember to follow the golden rule: "Every line of code should appear to be written by a single person, no matter the number of contributors." (mdo)

Development:

```bash
pnpm run serve     # serve documentation website at http://localhost:3000
pnpm run lint      # lint files using a rule of Standard JS
pnpm test          # run the tests
pnpm run coverage  # output test coverage in coverage directory
```

License: Sefirot is open-sourced software licensed under the MIT license (LICENSE.md). | vue | os
GCP-practice-project | GCP practice project: a practice project for Associate Cloud Engineering. Working on changes; trying again to merge and pull. | cloud
|
esp_rtos_ws2812 | esp rtos ws2812: a WS2812 LED driver ported to FreeRTOS; enables control over Wi-Fi via a CoAP endpoint. | os
|
design | (Cal.com logo; links: Website https://cal.com, Community Support https://github.com/calendso/docs/issues)

Design documentation

The official design documentation containing our custom design system that we use for all products and services. This documentation site runs on Docusaurus (https://docusaurus.io), so you may refer to their documentation should you need information on anything that isn't covered here.

Prerequisites: git, Node.js, npm, yarn.

Installation

Firstly, clone the repository using git:

```console
git clone https://github.com/calendso/design.git
```

Now you can install the dependencies with yarn:

```console
yarn install
```

Editing

To create, edit, and delete documentation pages, you can simply create markdown (md) files in the docs folder. You can edit markdown with any text editor, but VS Code and WebStorm have side-by-side previews, so you can see your formatted content whilst writing markdown.

Local development

```console
docusaurus start
```

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

Build

```console
yarn build
```

This command generates static content into the build directory and can be served using any static contents hosting service.

Deployment

```console
GIT_USER=<your GitHub username> USE_SSH=true yarn deploy
```

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push to the gh-pages branch. | os
|
awesome-iota | awesome iota awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome a community driven list of useful iota blogs articles videos and tools this is not a catalog of all the content just a starting point for your explorations inspired by awesome scala https github com lauris awesome scala other amazingly awesome lists can be found in the awesome awesomeness https github com bayandin awesome awesomeness list awesome iota awesome iota about about blogs blogs sites articles articles videos videos wallets wallets tools tools developers developers contributing contributing about what is iota iota is a revolutionary new transactional settlement and data integrity layer for the internet of things it s based on a new distributed ledger architecture the tangle which overcomes the inefficiencies of current blockchain designs and introduces a new way of reaching consensus in a decentralized peer to peer system for the first time ever through iota people can transfer money without any fees this means that even infinitesimally small nanopayments can be made through iota iota is the missing puzzle piece for the machine economy to fully emerge and reach its desired potential we envision iota to be the public permissionless backbone for the internet of things that enables true interoperability between all devices iota https iota org next generation blockchain whitepaper https iota org iota whitepaper pdf the tangle wikipedia https en wikipedia org wiki iota distributed ledger technology https en wikipedia org wiki iota distributed ledger technology a primer on iota https blog iota org a primer on iota with presentation e0a6eb2cc621 a primer on iota with presentation iota china http iotachina com iota china iota italia http iotaitalia com iota italia iota korea http blog naver com iotakorea iota iota japan http lhj hatenablog jp entry iota iota iota on reddit https www reddit com 
r iota blogs sites blogs sites about iota iota blog https blog iota org official blog of iota all about iota https aboutiota info all about iota iota support http www iotasupport com iota support your guide to the world of iota the tangler http www tangleblog com no bells and whistles just information iota stack exchange https iota stackexchange com iota stack exchange iota steemit trending https steemit com trending iota iota steemit trustediotalliance https www trustediot org securing iot products with blockchain iota tips http www iota tips iota tips iota token http iotatoken io iota decentralised internet of things token iota news http iota news iota news blog http lhj hatenablog jp entry iota iota articles articles about iota iota development roadmap https blog iota org iota development roadmap 74741f37ed01 iota development roadmap insights ubuntu com https insights ubuntu com 2017 02 20 iota iot revolutionized with a ledger iota iot revolutionised with a ledger blog iota org https blog iota org automating machine transactions and building trust in the 4th industrial revolution d3219a157396 automating machine transactions and building trust in the 4th industrial revolution satoshiwatch com https satoshiwatch com coins iota in depth iota the winner takes the winner takes it all crypto judgement https medium com cryptojudgement iota promise of a bright crypto future 6b7517349e32 iota promise of a bright crypto future energycentral com https medium com ercwl iota is centralized 6289246e7b4d genuine peer to peer processing makes iota tangle what bitcoin blockchains should be constellationr com https www constellationr com blog news blockchain or distributed ledger defining requirement not technology 0 blockchain or distributed ledger defining the requirement not the technology iota ecosystem fund https blog iota org iota ecosystem fund 2 million f6ade6a4d8ba iota ecosystem fund 2 million videos videos about iota what is iota https www youtube com watch v yj9j a 
acb4 iota what is iota what is the tangle technology the bitcoin killer boxmining https www youtube com watch v uwep5cextje what is iota in a nutshell ivan on tech https www youtube com watch v c y4kykzcai iota and machine to machine economy programmer explains introducing iota https www youtube com watch v fbgfiqpzr6a introducing iota new crypto next generation blockchain https www youtube com watch v wbhkao9lobk iota next generation blockchain new crypto https www youtube com watch v pn64rets2gy new crypto iota discover iota https www youtube com watch v h09z2n0mtuq discover iota general overview for beginners https www youtube com watch v 2azqznkermy general overview for beginners this is iota https www youtube com watch v lyvlq13wfse this is iota iota the next level https www youtube com watch v cm xhh6n2zc iota the next level what is iot https www youtube com watch v s64s3grzlsm what is iot how it works the iot https www youtube com watch v qsipnhoimoe how it works the iot welcome to the iota universe https www youtube com watch v n5seevhbln8 welcome to the iota universe iota coin https www youtube com watch v sunkglqhc8y iota coin iota ecosystem fund https www youtube com watch v 7rliikr4cqy iota ecosystem fund wallets iota wallets trinity wallet https github com iotaledger trinity wallet iota trinity wallet desktop and mobile tools 3rd party tools on iota tangle blox https tangle blox pm iota tangle iota tools http iota tools iota tools iota price http iotaprice com iota price iota cool http iota cool iota cool tools iota balance lookup http iotabalance com minimalist iota balance lookup tangler org http tangler org iota tangle explorer codebuffet co https iota codebuffet co iota tangle explorer iotasear ch https iotasear ch iota tangle explorer thetangle org https thetangle org iota tangle explorer developers links to developing iota applications official by the iota foundation documentation https iota readme io v1 1 0 docs documentation of iota the iota 
developer hub https iota readme io the iota developer hub get started with iota https learn iota org the best way to learn more about iota through interactive developer tutorials the iota developer hub https dev iota org start developing with iota iota lib csharp https github com iotaledger iota lib csharp iota lib csharp iota lib py https github com iotaledger iota lib py pyota the iota python api library iota lib js https github com iotaledger iota lib js iota javascript library iota rs https github com iotaledger iota rs iota implementation rust iota lib go https github com iotaledger iota lib go giota the iota go api library iota lib java https github com iotaledger iota lib java jota library is a simple java wrapper around iota curl lib js https github com iotaledger curl lib js iota proof of work algorithm ported to javascript to work in webgl2 enabled browsers non official libraries net library https github com borlay borlay iota library net library written in c iota c library https github com thibault martinez iota lib cpp a iri client library in c for iri iota address utilities https github com prizz iota address utilities a library to working with iota addresses iota c library https github com th0br0 iota lib c a c library to generate seeds addresses etc managment iota peer manager https github com akashgoswami ipm a peer neighbors management webui for iri open iota https github com prizz open iota a open tangle explorer web app iota search https github com eukaryote31 iotasearch a web app for exploring addresses transactions etc iota dashboard https github com lsquires iota dashboard a tangle visualiser iota transaction spammer https github com prizz iota transaction spammer webapp a web app transaction spammer iota reattacher https github com normpad iota reattacher a app which finds valid transactions and reattach them iota full node cli https github com nazarimilad iota node a cli to manage a full node payment payiota https github com lacicloud 
payiota a iota pay gateway for php deployment docker iri full node https github com bluedigits iota node a dockerized full node iri playbook https github com nuriel77 iri playbook a ansible playbook to setup a iri full node other bolero https github com semkodev bolero fun a cross platform full node carriota nelson https github com semkodev nelson cli neighbor discovery for a iri full node iotatipbot https github com normpad iotatipbot a tip bot for reddit tanglestash https github com loehnertz tanglestash an algorithm to persist any file onto the tangle of iota iota auth https github com thedewpoint iotauth 2fa built on the iota tangle iota basic https github com thedewpoint iota basic basic implementation of iota api allowing easy interactions with the network in progress iota prom exporter https github com crholliday iota prom exporter prometheus exporter for iota metrics and associated grafana dashboard tangleid https github com tangleid tangleid secure self sovereign identity built on iota tangle contributing your contributions are always welcome please submit a pull request or create an issue to add a new blogs tutorials videos or libraries to the list | iota blockchain decentralized iot tangle | server |
Proximity-Marketing-App | Proximity Marketing App (PMA)

Final year project, BSc (Hons) Software Systems Development, Waterford Institute of Technology. (Screenshot image)

Using Bluetooth Low Energy (BLE) beacon technology, this web application provides shoppers with relevant and timely information about in-store products straight to their mobile devices. Users can easily access product reviews and price comparisons before making a purchase, and also take advantage of special offers through digital discount vouchers. Combining the in-store experience, where a shopper can pick up and get the feel of a product, with the online experience, where reviews and product insights are at your fingertips, the PMA brings the benefits of both worlds together.

Take this typical scenario: a customer walks into an electronics store and heads directly to a TV she wants to buy. Standing in front of the TV, she receives a browser notification on her smartphone. She clicks the link, upon which the PMA provides her with product info, product reviews, special offers, and a price comparison with similar products.

(Screenshots: notifications screen, product info screen.)

Prerequisites: install npm (https://www.npmjs.com/get-npm), then install Firebase:

```bash
npm install firebase --save
```

Installation: install node dependencies and start the application:

```bash
npm install
npm start
```

Check the browser at localhost:3000 (samsung tv and asics runners pages).

Features: product description, product reviews, price comparison, digital vouchers.

Technologies used: Bluetooth beacons (Eddystone), React.js, JavaScript, Firebase (Firestore, hosting, authentication, security), Webhose API, Voucherify API.

Systems overview: (architecture diagram image)

Documentation: see the analysis and design documentation on the GitHub wiki (https://github.com/shane-walsh/proximity-marketing-app/wiki); please see the development journey documentation on GitBooks (https://shanewalsh.gitbook.io/proximity-marketing-app). | server
|
cryptography | Table of contents (each chapter is a readme.md in its directory): preface; history; encrypttype; hash; blockcipher; mac; pki; digitalsignature; share; mpc (with ot, gc, and mpc implementation.md); zkp (zkp introduce.md, zkp app.md; groth16: zkp groth16.md; sonic, fractal, halo, supersonic, marlin, plonk, zk-stark); china.

Preface, v1.0.1. Contact: guoshijiang2012@163.com, lgzaxe; Discord: https://discord.gg/ww86tqew; Telegram: shijiangguo; Twitter: Seek Web3. Donation addresses: ETH 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0; ERC20 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0; Layer2 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0. SavourLabs / SavourDao (Python, Go, Rust, Node, Vue, React). | cryptography blockchain | blockchain
machine-learning-nd | machine-learning-nd: Udacity's Machine Learning Nanodegree project files and notes.

This repository contains project files and lecture notes for Udacity's Machine Learning Engineer Nanodegree program (https://www.udacity.com/course/machine-learning-engineer-nanodegree--nd009), which I started working on in September 2016.

The Machine Learning Engineer Nanodegree is an online certification. It involves (1) courses in supervised learning, unsupervised learning, and reinforcement learning, and (2) six projects (P0 to P5, in this directory). Courses include lecture videos, quizzes, and programming problems; these courses were developed by Georgia Tech, Udacity, Google, and Kaggle. This directory includes lecture notes, lesson notes, and project code (P0 to P5). See also my notes for Udacity's Data Analyst Nanodegree (https://www.udacity.com/course/data-analyst-nanodegree--nd002).

Program outline:
0. Exploratory project: Titanic survival exploration.
1. Model evaluation and validation. Project 1: predicting Boston housing prices.
2. Supervised learning. Project 2: building a student intervention system, predicting whether or not students will fail so schools can intervene to help them graduate.
3. Unsupervised learning. Project 3: creating customer segments, segmenting customers based on spending in different categories.
4. Reinforcement learning. Project 4: train a smartcab to drive; implement a Q-learning algorithm.
5. Machine learning specialisation of choice. | ai
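The Q-learning mentioned for project 4 reduces to a single tabular update rule. The sketch below is purely illustrative, not the smartcab project's actual code; the state names, actions, and reward values are hypothetical:

```python
# Minimal tabular Q-learning: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1
ACTIONS = ["forward", "left", "right", None]  # None = stay put
Q = defaultdict(float)  # (state, action) -> estimated value

def choose_action(state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Apply one Q-learning update for an observed transition."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One hypothetical transition: waiting at a red light is rewarded.
update("red_light", None, 2.0, "green_light")
```

With an all-zero table, the single update above sets Q[("red_light", None)] to 0.5 * 2.0 = 1.0.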
|
IoT-Home-Guard | (Badges: Python 3.6, MIT license)

IoT Home Guard

IoT Home Guard is a project to help people discover malware in smart home devices. For users, the project can help to detect compromised smart home devices; for security researchers, it is also useful in network analysis and malicious behavior detection. In July 2018 we completed the first version; we will complete the second version by October 2018, with improved user experience and an increased number of identifiable devices. The first generation is a hardware device based on a Raspberry Pi with wireless network interface controllers; we will customize new hardware in the second generation. The system can also be set up with the software part on a laptop after essential environment configuration; the software part is available in the software tools directory.

Proof of principle: our approach is based on the detection of malicious network traffic. A device implanted with malware will communicate with a remote server, trigger a remote shell, or send audio/video to a server. The chart below shows the network traffic of a device implanted with snooping malware (red line: traffic between devices and a remote spy server; green line: normal traffic of devices; black line: sum of TCP traffic). (Chart image: mi listen wakeup)

Modules:
1. AP module and data flow catcher: catch network traffic.
2. Traffic analyzing engine: extract characteristics from network traffic and compare them with the device fingerprint database.
3. Device fingerprint database: normal network behaviors of each device, based on a whitelist; calls APIs of the 360 threat intelligence database (https://ti.360.net).
4. Web server: there may be a web server in the second generation.

Procedure (data flow): devices connect to the data flow catcher; traffic goes to the flow analyzing engine, which consults the device fingerprint database and the 360 threat intelligence database; results are presented via the web server and user interfaces.

The tool works as an access point, connected manually by devices under test, and sends network traffic to the traffic analyzing engine for characteristic extraction. The traffic analyzing engine compares characteristics with entries in the device fingerprint database to recognize device type and suspicious network connections. The device fingerprint database is a collection of normal behaviors of each device, based on a whitelist. Additionally, characteristics are searched in the threat intelligence database of Qihoo 360 to identify malicious behaviors. A web server is set up as the user interface.

Effectiveness: in our research we have successfully implanted trojans in eight devices, including smart speakers, cameras, driving recorders, and mobile translators, with IoT Implant Toolkit (demo video: resources/implantdemo.gif). We collected characteristics of those devices and ran IoT Home Guard; all devices implanted with trojans were detected. We believe that malicious behaviors of more devices can be identified with high accuracy after supplementing the fingerprint database.

Tutorials of IoT Home Guard: for the hardware tool, see the README under the hardware tool directory; for the software tool, see the README under the software tools directory. | server
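The whitelist comparison performed by the traffic analyzing engine can be sketched in a few lines of Python. This is an illustrative toy, not the project's engine: the device name, fingerprint entries, and traffic tuples are hypothetical, and the real system additionally queries the 360 threat intelligence API:

```python
# Minimal sketch of whitelist-based traffic matching (hypothetical data).
# A device "fingerprint" is the set of (host, port) endpoints it normally contacts.

FINGERPRINTS = {
    "mi_speaker": {("api.mi.com", 443), ("ntp.mi.com", 123)},
}

def suspicious_connections(device, observed):
    """Return observed connections that are absent from the device's whitelist."""
    allowed = FINGERPRINTS.get(device, set())
    return [conn for conn in observed if conn not in allowed]

traffic = [("api.mi.com", 443), ("203.0.113.7", 4444)]
print(suspicious_connections("mi_speaker", traffic))  # [('203.0.113.7', 4444)]
```

An unknown device has an empty whitelist, so all of its connections would be flagged; the real engine narrows this down using the threat-intelligence lookup.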
|
mad | mad: GitHub repo for Mobile Application Development. Useful links: http://www.ntu.edu.sg/home/ehchua/programming/android/android_ndk.html and http://www.ntu.edu.sg/home/ehchua/programming/android/android_howto.html | front_end
|
Movie-Recommendation-And-Analysis | movie recommendation and analysis how to analyze a movie database to find some useful insights and recommend movies by feature engineering movie recommendation and analysis using feature engineering is a process of using data and algorithms to analyze and recommend movies based on certain features feature engineering is the process of creating transforming and selecting features that are relevant to a particular problem feature engineering can help to identify patterns and relationships in data that can be used to make recommendations for example analyzing a user s past movie ratings can help to identify similar movies that they may enjoy feature engineering can also be used to analyze movie reviews to identify certain elements in a movie such as the plot characters and genre this can help to recommend movies that a user may be more likely to enjoy additionally feature engineering can be used to analyze movies to identify trends and patterns in the data which can be used to make predictions and provide insights | server |
|
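The idea of recommending movies from similar rating patterns can be sketched with item-based collaborative filtering: represent each movie by the vector of ratings users gave it, then compare movies by cosine similarity. The ratings and titles below are made-up toy data, not taken from the project's dataset.

```python
# Tiny item-based collaborative filtering sketch: movies whose rating patterns
# across users are similar get recommended together. Toy data throughout.
import math

ratings = {  # user -> {movie: rating}
    "u1": {"Alien": 5, "Aliens": 4, "Notebook": 1},
    "u2": {"Alien": 4, "Aliens": 5},
    "u3": {"Notebook": 5, "Titanic": 4},
}

def movie_vectors(ratings):
    """Invert user->movie ratings into movie -> {user: rating} vectors."""
    vecs = {}
    for user, seen in ratings.items():
        for movie, r in seen.items():
            vecs.setdefault(movie, {})[user] = r
    return vecs

def cosine(a, b):
    """Cosine similarity over the users both movies were rated by."""
    users = set(a) & set(b)
    if not users:
        return 0.0
    dot = sum(a[u] * b[u] for u in users)
    na = math.sqrt(sum(a[u] ** 2 for u in users))
    nb = math.sqrt(sum(b[u] ** 2 for u in users))
    return dot / (na * nb)

vecs = movie_vectors(ratings)
print(round(cosine(vecs["Alien"], vecs["Aliens"]), 3))  # 0.976
```

A recommender built on this would rank the movies a user has not seen by their similarity to the movies the user rated highly.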
ml-design-patterns | this is not an official google product ml design patterns source code accompanying o reilly book br title machine learning design patterns br authors valliappa lak lakshmanan sara robinson michael munn br img src mldp cover color jpg height 300 https www oreilly com library view machine learning design 9781098115777 img br a href https shop aer io oreilly p machine learning design 9781098115784 9149 buy from o reilly a br a href https www amazon com machine learning design patterns preparation dp 1098115783 buy from amazon a br we will update this repo with source code as we write each chapter stay tuned img src https deepnote com buttons try in a jupyter notebook white svg https deepnote com launch url https github com googlecloudplatform ml design patterns chapters preface the need for ml design patterns data representation design patterns 1 hashed feature 2 embedding 3 feature cross 4 multimodal input problem representation design patterns 5 reframing 6 multilabel 7 ensemble 8 cascade 9 neutral class 10 rebalancing patterns that modify model training 11 useful overfitting 12 checkpoints 13 transfer learning 14 distribution strategy 15 hyperparameter tuning resilience patterns 16 stateless serving function 17 batch serving 18 continuous model evaluation 19 two phase predictions 20 keyed predictions reproducibility patterns 21 transform 22 repeatable sampling 23 bridged schema 24 windowed inference 25 workflow pipeline 26 feature store 27 model versioning responsible ai 28 heuristic benchmark 29 explainable predictions 30 fairness lens summary | ai |
|
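As an illustration of the first data representation pattern in the list, "hashed feature" maps a high-cardinality categorical value into a fixed number of buckets with a deterministic hash, trading occasional collisions for a bounded feature space. This is my own minimal sketch of the pattern's idea, not code from the book; the bucket count and example values are arbitrary choices.

```python
# Sketch of the "Hashed Feature" design pattern: hash a categorical value
# (e.g. an airport code) into one of a fixed number of buckets so the model
# never sees an unbounded vocabulary. md5 is used only as a stable hash here.
import hashlib

def hashed_feature(value: str, num_buckets: int = 10) -> int:
    """Deterministically map a string to a bucket index in [0, num_buckets)."""
    digest = hashlib.md5(value.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

for airport in ["JFK", "SFO", "ORD"]:
    print(airport, hashed_feature(airport))
```

Because the mapping is deterministic, training and serving agree on bucket assignments without a shared vocabulary file.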
Awesome-GPT | awesome gpt awesome https awesome re badge svg https awesome re awesome papers datasets and projects about the study of large language models like gpt 3 gpt 3 5 chatgpt gpt 4 etc papers survey a survey on in context learning arxiv 2023 paper https arxiv org pdf 2301 00234 pdf a survey on gpt 3 arxiv 2023 paper https arxiv org pdf 2212 00857 pdf a survey of large language models arxiv 2023 paper https arxiv org pdf 2303 18223 pdf 2023 tree of thoughts deliberate problem solving with large language models arxiv 2023 paper https arxiv org pdf 2305 10601 pdf code https github com princeton nlp tree of thought llm gpt 4 technical report openai 2023 paper https cdn openai com papers gpt 4 pdf react synergizing reasoning and acting in language models iclr 2023 notable top 5 paper https openreview net pdf id we vluyul x code https anonymous 4open science r react 2268 selection inference exploiting large language models for interpretable logical reasoning iclr 2023 notable top 5 paper https openreview net pdf id 3pf3wg6o a4 what learning algorithm is in context learning investigations with linear models iclr 2023 notable top 5 paper https openreview net pdf id 0g0x4h8yn4i language models are greedy reasoners a systematic formal analysis of chain of thought iclr 2023 paper https arxiv org pdf 2210 01240 pdf code https github com asaparov prontoqa visual chatgpt talking drawing and editing with visual foundation models arxiv 2023 paper https arxiv org pdf 2303 04671 pdf code https github com microsoft visual chatgpt toolformer language models can teach themselves to use tools arxiv 2023 paper https arxiv org pdf 2302 04761 pdf code https github com lucidrains toolformer pytorch check your facts and try again improving large language models with external knowledge and automated feedback arxiv 2023 paper https arxiv org pdf 2302 12813 pdf can gpt 3 perform statutory reasoning arxiv 2023 paper https arxiv org pdf 2302 06100v1 pdf how close is chatgpt to human 
experts comparison corpus evaluation and detection arxiv 2023 paper https arxiv org pdf 2301 07597 pdf large language models can be easily distracted by irrelevant context arxiv 2023 paper https arxiv org pdf 2302 00093 pdf theory of mind may have spontaneously emerged in large language models arxiv 2023 paper https arxiv org ftp arxiv papers 2302 2302 02083 pdf chatgpt makes medicine easy to swallow an exploratory case study on simplified radiology reports arxiv 2023 paper https arxiv org pdf 2212 14882 pdf 2022 large language models are zero shot reasoners neurips 2022 paper https arxiv org pdf 2205 11916 pdf chain of thought prompting elicits reasoning in large language models paper https arxiv org pdf 2201 11903 pdf automatic chain of thought prompting in large language models paper https arxiv org pdf 2210 03493 pdf what can transformers learn in context a case study of simple function classes arxiv 2022 paper https arxiv org pdf 2208 01066 pdf code https github com dtsip in context learning rethinking the role of demonstrations what makes in context learning work arxiv 2022 paper https arxiv org pdf 2202 12837 pdf code https github com alrope123 rethinking demonstrations why can gpt learn in context language models secretly perform gradient descent as meta optimizers arxiv 2022 paper https arxiv org pdf 2212 10559 pdf 2021 medically aware gpt 3 as a data generator for medical dialogue summarization pmlr 2021 paper https proceedings mlr press v149 chintagunta21a chintagunta21a pdf gpt understands too arxiv 2021 paper https arxiv org pdf 2103 10385 pdf webgpt browser assisted question answering with human feedback arxiv 2021 paper https arxiv org pdf 2112 09332 pdf code https www microsoft com en us bing apis bing web search api 2020 language models are few shot learners neurips 2020 paper https arxiv org pdf 2005 14165 pdf datasets hello simpleai hc3 https huggingface co datasets hello simpleai hc3 projects awesome chatgpt prompts https github com f awesome
chatgpt prompts | ai |
|
ctf-blockchain | ctf blockchain challenges this repository collects blockchain challenges in ctfs and wargames these challenges are categorized by topic not by difficulty or recommendation also there are my writeups and exploits for some challenges e g paradigm ctf 2022 src paradigmctf2022 if there are any incorrect descriptions i would appreciate it if you could let me know via issue or pr table of contents ethereum ethereum contract basics contract basics evm puzzles evm puzzles misuse of tx origin misuse of txorigin weak sources of randomness from chain attributes weak sources of randomness from chain attributes erc 20 basics erc 20 basics storage overwrite by delegatecall storage overwrite by delegatecall context mismatch in delegatecall context mismatch in delegatecall integer overflow integer overflow non executable ether transfers to contracts non executable ether transfers to contracts forced ether transfers to contracts via selfdestruct forced ether transfers to contracts via selfdestruct large gas consumption by contract callees large gas consumption by contract callees forgetting to set view pure to interface and abstract contract functions forgetting to set viewpure to interface and abstract contract functions view functions that do not always return same values view functions that do not always return same values mistakes in setting storage and memory mistakes in setting storage and memory tracing transactions tracing transactions reversing states reversing states reversing transactions reversing transactions reversing evm bytecodes reversing evm bytecodes evm bytecode golf evm bytecode golf jump oriented programming jump oriented programming gas optimization gas optimization collisions when using abi encodepacked with variable length arguments collisions when using abiencodepacked with variable length arguments bypassing verifications with zero iteration loops bypassing verifications with zero iteration loops reentrancy attacks reentrancy attacks 
flash loan basics flash loan basics governance attacks by executing flash loans during snapshots governance attacks by executing flash loans during snapshots bypassing repayments of push architecture flash loans bypassing repayments of push architecture flash loans bugs in amm price calculation algorithm bugs in amm price calculation algorithm attacks using custom tokens attacks using custom tokens oracle manipulation attacks without flash loans oracle manipulation attacks without flash loans oracle manipulation attacks with flash loans oracle manipulation attacks with flash loans sandwich attacks sandwich attacks recoveries of private keys by same nonce attacks recoveries of private keys by same nonce attacks brute forcing addresses brute forcing addresses recoveries of public keys recoveries of public keys encryption and decryption in secp256k1 encryption and decryption in secp256k1 bypassing bots and taking erc 20 tokens owned by wallets with known private keys bypassing bots and taking erc 20 tokens owned by wallets with known private keys claimable intermediate nodes of merkle trees claimable intermediate nodes of merkle trees precompiled contracts precompiled contracts faking errors faking errors foundry cheatcodes foundry cheatcodes front running front running back running back running head overflow bugs in calldata tuple abi reencoding solidity 0 8 16 head overflow bugs in calldata tuple abi reencoding solidity 0816 overwriting storage slots via local storage variables solidity 0 8 1 overwriting storage slots via local storage variables solidity 081 overwriting arbitrary storage slots by setting array lengths to 2 256 1 solidity 0 6 0 overwriting arbitrary storage slots by setting array lengths to 2256 1 solidity 060 constructors that is just functions by typos solidity 0 5 0 constructors that is just functions by typos solidity 050 overwriting storage slots via uninitialized storage pointer solidity 0 5 0 overwriting storage slots via uninitialized storage 
pointer solidity 050 other ad hoc vulnerabilities and methods other ad hoc vulnerabilities and methods bitcoin bitcoin bitcoin basics bitcoin basics recoveries of private keys by same nonce attacks recoveries of private keys by same nonce attacks 1 bypassing pow of other applications using bitcoin s pow database bypassing pow of other applications using bitcoins pow database cairo cairo solana solana move move other blockchain related other blockchain related ethereum note if an attack is only valid for a particular version of solidity and not for the latest version the version is noted at the end of the heading to avoid notation fluctuations evm terms are avoided as much as possible and solidity terms are used contract basics these challenges can be solved if you know the basic mechanics of ethereum the basic language specification of solidity https docs soliditylang org en latest and the basic operation of contracts challenge note keywords capture the ether deploy a contract src capturetheether faucet wallet capture the ether call me src capturetheether contract call capture the ether choose a nickname src capturetheether contract call capture the ether guess the number src capturetheether contract call capture the ether guess the secret number src capturetheether keccak256 ethernaut 0 hello ethernaut src ethernaut contract call abi ethernaut 1 fallback src ethernaut receive ether function paradigm ctf 2021 hello src paradigmctf2021 contract call 0x41414141 ctf sanity check src 0x41414141ctf contract call paradigm ctf 2022 random src paradigmctf2022 contract call downunderctf 2022 solve me src downunderctf2022 evm puzzles puzzle challenges that can be solved by understanding the evm specifications no vulnerabilities are used to solve these challenges challenge note keywords capture the ether guess the new number src capturetheether block number block timestamp capture the ether predict the block hash src capturetheether blockhash ethernaut 13 gatekeeper one src 
ethernaut msg sender tx origin gasleft mod 8191 0 type conversion ethernaut 14 gatekeeper two src ethernaut msg sender tx origin extcodesize is 0 cipher shastra minion msg sender tx origin extcodesize is 0 block timestamp seccon beginners ctf 2020 c4b block number paradigm ctf 2021 babysandbox src paradigmctf2021 babysandbox staticcall call delegatecall extcodesize is 0 paradigm ctf 2021 lockbox ecrecover abi encodepacked msg data length ethernautdao 6 no name src ethernautdao noname block number gas price war fvictorio s evm puzzles src fvictorioevmpuzzles huff challenge challenge 3 src huffchallenge paradigm ctf 2022 lockbox2 src paradigmctf2022 paradigm ctf 2022 sourcecode src paradigmctf2022 quine numen cyber ctf 2023 littlemoney src numenctf function pointer numen cyber ctf 2023 asslot src numenctf staticcall that return different values misuse of tx origin tx origin refers to the address of the transaction publisher and should not be used as the address of the contract caller msg sender challenge note keywords ethernaut 4 telephone src ethernaut weak sources of randomness from chain attributes since contract bytecodes are publicly available it is easy to predict pseudorandom numbers whose generation is completed on chain using only states not off chain data it is equivalent to having all the parameters of a pseudorandom number generator exposed if you want to use random numbers that are unpredictable to anyone use a decentralized oracle with a random number function for example chainlink vrf https docs chain link docs chainlink vrf which implements verifiable random function vrf challenge note keywords capture the ether predict the future src capturetheether ethernaut 3 coin flip src ethernaut downunderctf 2022 crypto casino src downunderctf2022 erc 20 basics these challenges can be solved with an understanding of the erc 20 token standard https eips ethereum org eips eip 20 challenge note keywords ethernaut 15 naught coin src ethernaut transfer approve 
transferfrom paradigm ctf 2021 secure src paradigmctf2021 weth defi security summit stanford vtoken src defisecuritysummitstanford storage overwrite by delegatecall delegatecall is a potential source of vulnerability because the storage of the delegatecall caller contract can be overwritten by the called contract challenge note keywords ethernaut 6 delegation src ethernaut ethernaut 16 preservation src ethernaut ethernaut 24 puzzle wallet src ethernaut proxy contract ethernaut 25 motorbike src ethernaut proxy contract eip 1967 standard proxy storage slots https eips ethereum org eips eip 1967 defi security summit stanford insecureumlenderpool src defisecuritysummitstanford flash loan quillctf2023 d3l3g4t3 src quillctf2022 d3l3g4t3 numen cyber ctf 2023 counter src numenctf writing evm code context mismatch in delegatecall contracts called by delegatecall are executed in the context of the delegatecall caller contract if the function does not carefully consider the context a bug will be created challenge note keywords ethernautdao 3 carmarket src ethernautdao carmarket non use of address this integer overflow for example subtracting 1 from the value of a variable of uint type when the value is 0 causes an arithmetic overflow since solidity v0 8 0 arithmetic overflows are detected and the state is reverted contracts written in earlier versions can be checked by using the safemath library https github com openzeppelin openzeppelin contracts blob release v3 4 contracts math safemath sol challenge note keywords capture the ether token sale src capturetheether multiplication capture the ether token whale src capturetheether subtraction ethernaut 5 token src ethernaut subtraction non executable ether transfers to contracts do not create a contract on the assumption that normal ether transfer send or transfer can always be executed if a destination is a contract and there is no receive ether function or payable fallback function ether cannot be transferred however instead of
the normal transfer functions the selfdestruct described below can be used to force such a contract to transfer ether challenge note keywords ethernaut 9 king src ethernaut project sekai ctf 2022 random song src projectsekaictf2022 randomsong chainlink vrf forced ether transfers to contracts via selfdestruct if a contract does not have a receive ether function and a payable fallback function it is not guaranteed that ether will not be received when a contract executes selfdestruct it can transfer its ether to another contract or eoa and this selfdestruct transfer can be forced even if the destination contract does not have the receive ether function and the payable fallback function if the application is built on the assumption that the ether is 0 it could be a bug challenge note keywords capture the ether retirement fund src capturetheether integer overflow ethernaut 7 force src ethernaut large gas consumption by contract callees a large amount of gas can be consumed by loops and recursion in call and there may not be enough gas for the rest of the process until solidity v0 8 0 zero division and assert false could consume a lot of gas challenge note keywords ethernaut 20 denial src ethernaut forgetting to set view pure to interface and abstract contract functions if you forget to set view or pure for a function and design your application under the assumption that the state will not change it will be a bug challenge note keywords ethernaut 11 elevator src ethernaut view functions that do not always return same values since view functions can read state they can be conditionally branched based on state and do not necessarily return the same value challenge note keywords ethernaut 21 shop src ethernaut mistakes in setting storage and memory if storage and memory are not set properly old values may be referenced or overwriting may not occur resulting in vulnerability challenge note keywords n1ctf 2021 babydefi cover protocol infinite minting https coverprotocol 
medium com 12 28 post mortem 34c5f9f718d4 flash loan tracing transactions various information can be obtained just by following the flow of transaction processing blockchain explorers such as etherscan are useful challenge note keywords ethernaut 17 recovery src ethernaut loss of deployed contract address reversing states since the state and the bytecodes of contracts are public all variables including private variables are readable private variables are only guaranteed not to be directly readable by other contracts but we as an entity outside the blockchain can read them challenge note keywords capture the ether guess the random number src capturetheether ethernaut 8 vault src ethernaut ethernaut 12 privacy src ethernaut cipher shastra sherlock 0x41414141 ctf secure enclave src 0x41414141ctf log storage ethernautdao 1 privatedata src ethernautdao privatedata reversing transactions reversing the contents of a transaction or how the state has been changed by the transaction challenge note keywords darkctf secret of the contract src darkctf downunderctf 2022 secret and ephemeral src downunderctf2022 reversing evm bytecodes reversing a contract for which code is not given in whole or in part use a decompiler e g heimdall https github com jon becker heimdall rs panoramix https github com eveem org panoramix and a disassembler e g ethersplay https github com crytic ethersplay challenge note keywords incognito 2 0 ez keep in plain text 0x41414141 ctf crackme sol src 0x41414141ctf decompile 0x41414141 ctf crypto casino src 0x41414141ctf bypass condition check paradigm ctf 2021 babyrev 34c3 ctf chaingang blaze ctf 2018 smart contract def con ctf qualifier 2018 sag pbctf 2020 pbcoin paradigm ctf 2022 stealing sats paradigm ctf 2022 electric sheep paradigm ctf 2022 fun reversing challenge downunderctf 2022 evm vault mechanism src downunderctf2022 ekoparty ctf 2022 byte src ekopartyctf2022 stack tracing ekoparty ctf 2022 smartrev src ekopartyctf2022 memory tracing numen cyber 
ctf 2023 hexp src numenctf previous block hash gas price 2 24 evm bytecode golf these challenges have a limit on the length of the bytecode to be created challenge note keywords ethernaut 18 magicnumber src ethernaut paradigm ctf 2021 rever src paradigmctf2021 rever palindrome detection in addition the code that inverts the bytecode must also be able to detect palindromes huff challenge challenge 1 src huffchallenge jump oriented programming jump oriented programming jop challenge note keywords seccon ctf 2023 quals tokyo payload https github com minaminao tokyo payload paradigm ctf 2021 jop real world ctf 3rd re montagy gas optimization these challenges have a limit on the gas to be consumed challenge note keywords huff challenge challenge 2 src huffchallenge collisions when using abi encodepacked with variable length arguments challenge note keywords seetf 2023 operation feathered fortune fiasco src seetf2023 bypassing verifications with zero iteration loops challenge note keywords seetf 2023 murky seepass src seetf2023 array length merkle proof reentrancy attacks in case a function of contract a contains interaction with another contract b or ether transfer to b the control is temporarily transferred to b since b can call a in this control it will be a bug if the design is based on the assumption that a is not called in the middle of the execution of that function for example when b executes the withdraw function to withdraw ether deposited in a the ether transfer triggers a control shift to b and during the withdraw function b executes a s withdraw function again even if the withdraw function is designed to prevent withdrawal of more than the limit if it is simply called twice if the withdraw function is executed in the middle of the withdraw function it may be designed to bypass the limit check to prevent reentrancy attacks use the checks effects interactions pattern challenge note keywords capture the ether token bank src capturetheether erc 223 tokenfallback 
ethernaut 10 re entrancy src ethernaut call paradigm ctf 2021 yield aggregator htb university ctf 2020 quals moneyheist ethernautdao 4 vendingmachine src ethernautdao vendingmachine call defi security summit stanford insecuredexlp src defisecuritysummitstanford erc 223 tokenfallback maplectf 2022 maplebacoin src maplectf quillctf 2023 safenft src quillctf2022 safenft erc721 safemint numen cyber ctf 2023 simplecall src numenctf call seetf 2023 pigeonbank src seetf2023 project sekai ctf 2023 re remix src projectsekaictf2023 read only reentrancy flash loan basics flash loans are uncollateralised loans that allow the borrowing of an asset as long as the borrowed assets are returned before the end of the transaction the borrower can deal with the borrowed assets any way they want within the transaction by making large asset moves attacks can be made to snatch funds from defi applications or to gain large amounts of votes for participation in governance a solution to attacks that use flash loans to corrupt oracle values is to use a decentralized oracle challenge note keywords damn vulnerable defi 1 unstoppable simple flash loan with a single token failure to send the token directly damn vulnerable defi 2 naivereceiver the flashloan function can specify a borrower but the receiver side does not authenticate the tx sender so the receiver s funds can be drained as a fee damn vulnerable defi 3 truster the target of a call is made into the token and the token can be taken by approving it to oneself damn vulnerable defi 4 sideentrance flash loan that allows each user to make a deposit and a withdrawal the deposit can be executed at no cost at the time of the flash loan governance attacks by executing flash loans during snapshots if the algorithm distributes some kind of rights using the token balance at the time of a snapshot and if a malicious user transaction can trigger a snapshot a flash loan can be used to obtain massive rights a period of time to lock the token will 
avoid this attack challenge note keywords damn vulnerable defi 5 therewarder get reward tokens based on the deposited token balance damn vulnerable defi 6 selfie get voting power in governance based on the deposited token balance bypassing repayments of push architecture flash loans there are two architectures of flash loans push and pull with push architectures represented by uniswap and aave v1 and pull architectures by aave v2 and dydx the proposed flash loan in eip 3156 flash loans https eips ethereum org eips eip 3156 is a pull architecture challenge note keywords paradigm ctf 2021 upgrade bypass using the lending functionality implemented in the token bugs in amm price calculation algorithm a bug in the automated market maker amm price calculation algorithm allows a simple combination of trades to drain funds challenge note keywords ethernaut 22 dex src ethernaut attacks using custom tokens the ability of a protocol to use arbitrary tokens is not in itself a bad thing but it can be an attack vector in addition bugs in the whitelist design which assumes that arbitrary tokens are not available could cause funds to drain challenge note keywords ethernaut 23 dex two src ethernaut oracle manipulation attacks without flash loans it corrupts the value of the oracle and drains the funds of applications that refer to that oracle challenge note keywords paradigm ctf 2021 broker distort uniswap prices and liquidate positions on lending platforms that reference those prices damn vulnerable defi 7 compromised off chain private key leak oracle manipulation oracle manipulation attacks with flash loans the use of flash loans distorts the value of the oracle and drains the funds of the protocols that reference that oracle the ability to move large amounts of funds through a flash loan makes it easy to distort the oracle and cause more damage challenge note keywords damn vulnerable defi 8 puppet distort the price of uniswap v1 and leak tokens from a lending platform that 
references that price defi security summit stanford borrowsysteminsecureoracle src defisecuritysummitstanford lending protocol sandwich attacks for example if there is a transaction by another party to sell token a and buy b the attacker can put in a transaction to sell a and buy b before the transaction and later put in a transaction to sell the same amount of b and buy a thereby ultimately increasing the amount of a at a profit in general such revenue earned by selecting inserting and reordering transactions contained in a block generated by a miner is referred to as miner extractable value mev recently it is also called maximal extractable value challenge note keywords paradigm ctf 2021 farmer src paradigmctf2021 sandwich the trade from comp to weth to dai recoveries of private keys by same nonce attacks in general a same nonce attack is possible when the same nonce is used for different messages in the elliptic curve dsa ecdsa and the secret key can be calculated in ethereum if nonces used to sign transactions are the same this attack is feasible challenge note keywords capture the ether account takeover src capturetheether paradigm ctf 2021 babycrypto src paradigmctf2021 metatrust ctf ecdsa src metatrustctf ecdsa brute forcing addresses brute force can make a part of an address a specific value challenge note keywords capture the ether fuzzy identity src capturetheether 28 bits create2 numen cyber ctf 2023 exist src numenctf 16 bits recoveries of public keys the address is the public key applied to a keccak256 hash and the public key cannot be recovered from the address if even one transaction has been sent the public key can be back calculated from it specifically it can be recovered from the recursive length prefix rlp encoded data nonce gas price gas to value data chain id 0 0 and the signature v r s challenge note keywords capture the ether public key src capturetheether rlp ecdsa encryption and decryption in secp256k1 challenge note keywords 0x41414141 
ctf rich club src 0x41414141ctf dex flash loan bypassing bots and taking erc 20 tokens owned by wallets with known private keys if a wallet with a known private key has an erc 20 token but no ether it is usually necessary to first send ether to the wallet and then transfer the erc 20 token to get the erc 20 token however if a bot that immediately takes the ether sent at this time is running the ether will be stolen when the ether is simply sent in this situation we can use flashbots https docs flashbots net bundled transactions or just permit and transferfrom if the token is eip 2612 permit https eips ethereum org eips eip 2612 friendly challenge note keywords ethernautdao 5 ethernautdaotoken src ethernautdao ethernautdaotoken claimable intermediate nodes of merkle trees challenge note keywords paradigm ctf 2022 merkledrop src paradigmctf2022 precompiled contracts challenge note keywords paradigm ctf 2022 vanity src paradigmctf2022 faking errors challenge note keywords ethernaut 27 good samaritan src ethernaut foundry cheatcodes challenge note keywords paradigm ctf 2022 trapdooor src paradigmctf2022 paradigm ctf 2022 trapdoooor front running challenge note keywords downunderctf 2022 private log src downunderctf2022 back running mev share can be used to create bundled transactions to back run challenge note keywords mev share ctf mevsharectfsimple 1 src mevsharectf mev share ctf mevsharectfsimple 2 src mevsharectf mev share ctf mevsharectfsimple 3 src mevsharectf mev share ctf mevsharectfsimple 4 src mevsharectf mev share ctf mevsharectfmagicnumberv1 src mevsharectf mev share ctf mevsharectfmagicnumberv2 src mevsharectf mev share ctf mevsharectfmagicnumberv3 src mevsharectf mev share ctf mevsharectfnewcontract address src mevsharectf mev share ctf mevsharectfnewcontract salt src mevsharectf create2 head overflow bugs in calldata tuple abi reencoding solidity 0 8 16 see https blog soliditylang org 2022 08 08 calldata tuple reencoding head overflow bug challenge note 
keywords 0ctf 2022 tctf nft market src 0ctf2022 tctfnftmarket numen cyber ctf 2023 wallet src numenctf illegal v in ecrecover overwriting storage slots via local storage variables solidity 0 8 1 in foo storage foo the local variable foo points to slot 0 challenge note keywords capture the ether donation src capturetheether overwriting arbitrary storage slots by setting array lengths to 2 256 1 solidity 0 6 0 for example any storage variable can be overwritten by negatively arithmetic overflowing the length of an array to 2 256 1 it need not be due to overflow the length property has been read only since v0 6 0 challenge note keywords capture the ether mapping src capturetheether ethernaut 19 alien codex src ethernaut paradigm ctf 2021 bank constructors that is just functions by typos solidity 0 5 0 in versions before v0 4 22 the constructor is defined as a function with the same name as the contract so a typo of the constructor name could cause it to become just a function resulting in a bug since v0 5 0 this specification is removed and the constructor keyword must be used challenge note keywords capture the ether assume ownership src capturetheether ethernaut 2 fallout src ethernaut overwriting storage slots via uninitialized storage pointer solidity 0 5 0 since v0 5 0 uninitialized storage variables are forbidden so this bug cannot occur challenge note keywords capture the ether fifty years src capturetheether ethernaut locked deleted https forum openzeppelin com t ethernaut locked with solidity 0 5 1115 other ad hoc vulnerabilities and methods challenge note keywords paradigm ctf 2021 bouncer src paradigmctf2021 bouncer the funds required for batch processing are the same as for single processing paradigm ctf 2021 market make the value of one field be recognized as the value of another field by using key misalignment in the eternal storage pattern ethernautdao 2 walletlibrary src ethernautdao walletlibrary m and n of m of n multisig wallet can be changed 
paradigm ctf 2022 rescue src paradigmctf2022 paradigm ctf 2022 just in time paradigm ctf 2022 0xmonaco balsnctf 2022 src balsnctf2022 initialize safetransferfrom create2 numen cyber ctf 2023 lenderpool src numenctf flash loan numen cyber ctf 2023 goatfinance src numenctf check sum address seetf 2023 pigeon vault src seetf2023 eip 2535 diamonds multi facet proxy corctf 2023 baby wallet src corctf2023 missing from to check bitcoin note including challenges of bitcoin variants whose transaction model is unspent transaction output utxo bitcoin basics challenge note keywords tsukuctf 2021 genesis genesis block wormcon 0x01 what s my wallet address bitcoin address ripemd 160 recoveries of private keys by same nonce attacks there was a bug and it has been fixed using rfc6979 https datatracker ietf org doc html rfc6979 https github com daedalus bitcoin recover privkey challenge note keywords darkctf duplicacy within src darkctf bypassing pow of other applications using bitcoin s pow database bitcoin uses a series of leading zeros in the sha 256 hash value as a proof of work pow but if other applications are designed in the same way its pow time can be significantly reduced by choosing one that matches the conditions from bitcoin s past pow results challenge note keywords dragon ctf 2020 bit flip 2 64 bit pow cairo challenge note keywords paradigm ctf 2022 riddle of the sphinx src paradigmctf2022 contract call paradigm ctf 2022 cairo proxy src paradigmctf2022 integer overflow paradigm ctf 2022 cairo auction src paradigmctf2022 uint256 balsnctf 2022 cairo reverse src balsnctf2022 reversing solana challenge note keywords alles ctf 2021 secret store solana spl token alles ctf 2021 legit bank alles ctf 2021 bugchain alles ctf 2021 ebpf reversing ebpf paradigm ctf 2022 otterworld src paradigmctf2022 paradigm ctf 2022 otterswap src paradigmctf2022 paradigm ctf 2022 pool paradigm ctf 2022 solhana 1 paradigm ctf 2022 solhana 2 paradigm ctf 2022 solhana 3 corctf 2023 tribunal 
project sekai ctf 2023 the bidding project sekai ctf 2023 play for free move challenge note keywords numen cyber ctf 2023 move to checkin src numenctf contract call in sui numen cyber ctf 2023 chatgpt tell me where is the vulnerability src numenctf osint numen cyber ctf 2023 move to crackme src numenctf reversing move code and linux executable other blockchain related things that are not directly related to blockchains but are part of the ecosystems challenge note keywords tsukuctf 2021 interplanetary protocol ipfs address base32 in lowercase | ctf blockchain ethereum solidity evm | blockchain |
esp-open-rtos | esp open rtos a community developed open source freertos http www freertos org based framework for esp8266 wifi enabled microcontrollers intended for use in both commercial and open source projects originally based on but substantially different from the espressif iot rtos sdk https github com espressif esp8266 rtos sdk resources build status https travis ci org superhouse esp open rtos svg branch master https travis ci org superhouse esp open rtos email discussion list https groups google com d forum esp open rtos irc channel esp open rtos on freenode web chat link http webchat freenode net channels 23esp open rtos uio d4 github issues list bugtracker https github com superhouse esp open rtos issues please note that this project is released with a contributor code of conduct https github com superhouse esp open rtos blob master code of conduct md by participating in this project you agree to abide by its terms quick start install esp open sdk https github com pfalcon esp open sdk build it with make toolchain esptool libhal standalone n then edit your path and add the generated toolchain bin directory the path will be something like path to esp open sdk xtensa lx106 elf bin despite the similar name esp open sdk has different maintainers but we think it s fantastic other toolchains may also work as long as a gcc cross compiler is available on the path and libhal and libhal headers are compiled and available to gcc the proprietary tensilica xcc compiler will probably not work install esptool py https github com themadinventor esptool and make it available on your path if you used esp open sdk then this is done already the esp open rtos build process uses gnu make and the utilities sed and grep if you built esp open sdk then you have these already use git to clone the esp open rtos project note the recursive git clone recursive https github com superhouse esp open rtos git cd esp open rtos to build any examples that use wifi create include private ssid 
config h defining the two macro defines c define wifi ssid mywifissid define wifi pass my secret password build an example project found in the examples directory and flash it to a serial port make flash j4 c examples http get espport dev ttyusb0 run make help c examples http get for a summary of other make targets note the c option to make is the same as changing to that directory then running make the build process wiki page https github com superhouse esp open rtos wiki build process has in depth details of the build process goals provide professional quality framework for wifi enabled rtos projects on esp8266 open source code for all layers above the mac layer ideally lower layers if possible this is a work in progress see issues list https github com superhouse esp open rtos issues leave upstream source clean for easy interaction with upstream projects flexible build and compilation settings current status is alpha quality actively developed ap station mode ie wifi client mode and udp tcp client modes are tested other functionality should work contributors and testers are welcome code structure examples contains a range of example projects one per subdirectory check them out include contains header files from espressif rtos sdk relating to the binary libraries xtensa core core contains source headers for low level esp8266 functions peripherals core include esp contains useful headers for peripheral access etc minimal to no freertos dependencies extras is a directory that contains optional components that can be added to your project most extras components will have a corresponding example in the examples directory extras include mbedtls mbedtls https tls mbed org is a tls ssl library providing up to date secure connectivity and encryption support i2c software i2c driver upstream project https github com kanflo esp open rtos driver i2c rboot ota ota support over the air updates including a tftp server for receiving updates for rboot by raburton http richard 
burtons org 2015 05 18 rboot a new boot loader for esp8266 bmp180 driver for digital pressure sensor upstream project https github com angus71 esp open rtos driver bmp180 freertos contains freertos implementation subdirectory structure is the standard freertos structure freertos source portable esp8266 contains the esp8266 port lwip contains the lwip tcp ip library see third party libraries https github com superhouse esp open rtos wiki third party libraries wiki page for details libc contains the newlib libc libc details here https github com superhouse esp open rtos wiki libc configuration open source components freertos http www freertos org v10 2 0 lwip http lwip wikia com wiki lwip wiki v2 0 3 with some modifications https github com ourairquality lwip newlib https github com ourairquality newlib v3 0 0 with patches for xtensa support and locking stubs for thread safe operation on freertos for details of how third party libraries are integrated see the wiki page https github com superhouse esp open rtos wiki third party libraries binary components binary libraries inside the lib dir are all supplied by espressif as part of their rtos sdk these parts were mit licensed as part of the esp open rtos build process all binary sdk symbols are prefixed with sdk this makes it easier to differentiate binary open source code and also prevents namespace conflicts espressif s rtos sdk provided a libssl based on axtls this has been replaced with the more up to date mbedtls library see below some binary libraries appear to contain unattributed open source code libnet80211 a libwpa a appear to be based on freebsd net80211 wpa or forks of them see this issue https github com superhouse esp open rtos issues 4 libudhcp has been removed from esp open rtos it was released with the espressif rtos sdk but udhcp is gpl licensed licensing bsd license as described in license applies to original source files lwip http lwip wikia com wiki lwip wiki lwip is copyright c swedish institute 
of computer science freertos since v10 is provided under the mit license license details in files under freertos dir freertos is copyright c amazon source binary components from the espressif iot rtos sdk https github com espressif esp iot rtos sdk were released under the mit license source code components are relicensed here under the bsd license the original parts are copyright c espressif systems newlib is covered by several copyrights and licenses as per the files in the libc directory mbedtls https tls mbed org is provided under the apache 2 0 license as described in the file extras mbedtls mbedtls apache 2 0 txt mbedtls is copyright c arm limited components under extras may contain different licenses please see those directories for details contributions contributions are very welcome if you find a bug please raise an issue to report it https github com superhouse esp open rtos issues if you have feature additions or bug fixes then please send a pull request there is a list of outstanding enhancements in the issues list https github com superhouse esp open rtos issues contributions to these as well as other improvements are very welcome if you are contributing code please ensure that it can be licensed under the bsd open source license specifically code from espressif iot sdk cannot be merged as it is provided under either the espressif general public license or the espressif mit license which are not compatible with the bsd license recent releases of the espressif iot rtos sdk cannot be merged as they changed from mit license to the espressif mit license which is not bsd compatible the espressif binaries used in esp open rtos were taken from revision ec75c85 as this was the last mit licensed revision https github com espressif esp8266 rtos sdk commit 43585fa74550054076bdf4bfe185e808ad0da83e for code submissions based on reverse engineered binary functionality please either reverse engineer functionality from mit licensed espressif releases or make sure that 
the reverse engineered code does not directly copy the code structure of the binaries it cannot be a derivative work of an incompatible binary the best way to write suitable code is to first add documentation somewhere like the esp8266 reverse engineering wiki http esp8266 re foogod com describing factual information gained from reverse engineering such as register addresses bit masks orders of register writes etc then write new functions referring to that documentation as reference material coding style for new contributions in c please use bsd style and indent using 4 spaces for assembly please use the following instructions indented using 8 spaces inline comments use as a comment delimiter comments on their own line s use first operand of each instruction should be vertically aligned where possible for xtensa special registers prefer wsr ax sr over wsr sr ax if you re an emacs user then there is a dir locals el file in the root which configures cc mode and asm mode you will need to approve some variable values as safe see also the additional comments in dir locals el if you re editing assembly code upstream code is left with the indentation and style of the upstream project sponsors work on parts of esp open rtos has been sponsored by superhouse automation http superhouse tv | os |
|
kikiibackend | server |
||
aws-iot-device-sdk-js | new version available a new aws iot device sdk is now available https github com awslabs aws iot device sdk js v2 it is a complete rework built to improve reliability performance and security we invite your feedback this sdk will no longer receive feature updates but will receive security updates aws iot sdk for javascript the aws iot device sdk js package allows developers to write javascript applications which access the aws iot platform via mqtt or mqtt over the secure websocket protocol http docs aws amazon com iot latest developerguide protocols html it can be used in node js environments as well as in browser applications overview overview installation install mac only tls behavior mac tls warning examples examples api documentation api connection types connections example programs programs browser applications browser troubleshooting troubleshooting unit tests unittests license license support support a name overview a overview this document provides instructions on how to install and configure the aws iot device sdk for javascript and includes examples demonstrating use of the sdk apis mqtt connection this package is built on top of mqtt js https github com mqttjs mqtt js blob master readme md and provides three classes device thingshadow and jobs the device class wraps mqtt js https github com mqttjs mqtt js blob master readme md to provide a secure connection to the aws iot platform and expose the mqtt js https github com mqttjs mqtt js blob master readme md interfaces upward it provides features to simplify handling of intermittent connections including progressive backoff retries automatic re subscription upon connection and queued offline publishing with configurable drain rate collection of metrics beginning with release v2 2 0 of the sdk aws collects usage metrics indicating which language and version of the sdk is being used this allows us to prioritize our resources towards addressing issues faster in sdks that see the most 
and is an important data point however we do understand that not all customers would want to report this data by default in that case the sending of usage metrics can be easily disabled by set options enablemetrics to false thing shadows the thingshadow class implements additional functionality for accessing thing shadows via the aws iot api the thingshadow class allows devices to update be notified of changes to get the current state of or delete thing shadows from aws iot thing shadows allow applications and devices to synchronize their state on the aws iot platform for example a remote device can update its thing shadow in aws iot allowing a user to view the device s last reported state via a mobile app the user can also update the device s thing shadow in aws iot and the remote device will synchronize with the new state the thingshadow class supports multiple thing shadows per mqtt connection and allows pass through of non thing shadow topics and mqtt events jobs the jobs class implements functionality to interact with the aws iot jobs service the iot job service manages deployment of iot fleet wide tasks such as device software firmware deployments and updates rotation of security certificates device reboots and custom device specific management tasks included in this package is an example agent the agent can be used either as a stand alone program to manage installation and maintenance of files and other running processes or it can be incorporated into a customized agent to meet specific application needs a name install a installation note aws iot node js sdk will only support node version 4 or above you can check your node version by sh node v installing with npm sh npm install aws iot device sdk installing from github sh git clone https github com aws aws iot device sdk js git cd aws iot device sdk js npm install a name mac tls warning a mac only tls behavior please note that on mac once a private key is used with a certificate that certificate key pair is 
imported into the mac keychain all subsequent uses of that certificate will use the stored private key and ignore anything passed in programmatically a name examples a examples device class js copyright amazon com inc or its affiliates all rights reserved spdx license identifier mit 0 var awsiot require aws iot device sdk replace the values of youruniqueclientidentifier and yourcustomendpoint with a unique client identifier and custom host endpoint provided in aws iot note client identifiers must be unique within your aws account if a client attempts to connect with a client identifier which is already in use the existing connection will be terminated var device awsiot device keypath yourprivatekeypath certpath yourcertificatepath capath yourrootcacertificatepath clientid youruniqueclientidentifier host yourcustomendpoint device is an instance returned by mqtt client see mqtt js for full documentation device on connect function console log connect device subscribe topic 1 device publish topic 2 json stringify test data 1 device on message function topic payload console log message topic payload tostring thing shadow class js copyright amazon com inc or its affiliates all rights reserved spdx license identifier mit 0 var awsiot require aws iot device sdk replace the values of youruniqueclientidentifier and yourcustomendpoint with a unique client identifier and custom host endpoint provided in aws iot cloud note client identifiers must be unique within your aws account if a client attempts to connect with a client identifier which is already in use the existing connection will be terminated var thingshadows awsiot thingshadow keypath yourprivatekeypath certpath yourcertificatepath capath yourrootcacertificatepath clientid youruniqueclientidentifier host yourcustomendpoint client token value returned from thingshadows update operation var clienttokenupdate simulated device values var rval 187 var gval 114 var bval 222 thingshadows on connect function after connecting 
to the aws iot platform register interest in the thing shadow named rgbledlamp thingshadows register rgbledlamp function once registration is complete update the thing shadow named rgbledlamp with the latest device state and save the clienttoken so that we can correlate it with status or timeout events thing shadow state var rgbledlampstate state desired red rval green gval blue bval clienttokenupdate thingshadows update rgbledlamp rgbledlampstate the update method returns a clienttoken if non null this value will be sent in a status event when the operation completes allowing you to know whether or not the update was successful if the update method returns null it s because another operation is currently in progress and you ll need to wait until it completes or times out before updating the shadow if clienttokenupdate null console log update shadow failed operation still in progress thingshadows on status function thingname stat clienttoken stateobject console log received stat on thingname json stringify stateobject these events report the status of update get and delete calls the clienttoken value associated with the event will have the same value which was returned in an earlier call to get update or delete use status events to keep track of the status of shadow operations thingshadows on delta function thingname stateobject console log received delta on thingname json stringify stateobject thingshadows on timeout function thingname clienttoken console log received timeout on thingname with token clienttoken in the event that a shadow operation times out you ll receive one of these events the clienttoken value associated with the event will have the same value which was returned in an earlier call to get update or delete jobs class js copyright amazon com inc or its affiliates all rights reserved spdx license identifier mit 0 var awsiot require aws iot device sdk replace the values of youruniqueclientidentifier and yourcustomendpoint with a unique client 
identifier and custom host endpoint provided in aws iot cloud note client identifiers must be unique within your aws account if a client attempts to connect with a client identifier which is already in use the existing connection will be terminated var jobs awsiot jobs keypath yourprivatekeypath certpath yourcertificatepath capath yourrootcacertificatepath clientid youruniqueclientidentifier host yourcustomendpoint jobs is built on top of awsiot device and inherits all of the same functionality jobs on connect function console log connect device subscribe topic 1 device publish topic 2 json stringify test data 1 jobs on message function topic payload console log message topic payload tostring to subscribe to job execution events call the subscribetojobs method which takes a callback that will be invoked when a job execution is available or an error occurs the job object passed to the callback contains information about the job execution and methods for updating the job execution status details covered in the api documentation below jobs subscribetojobs thingname function err job if isundefined err console log default job handler invoked jobid job id tostring console log job document job document else console error err jobs subscribetojobs thingname customjob function err job if isundefined err console log customjob operation handler invoked jobid job id tostring console log job document job document else console error err after calling subscribetojobs for each operation on a particular thing call startjobnotifications to cause any existing queued job executions for the given thing to be published to the appropriate subscribetojobs handler only needs to be called once per thing jobs startjobnotifications thingname function err if isundefined err console log job notifications initiated for thing thingname else console error err a name api a api documentation a href device code awsiot b device b code a a href thingshadow code awsiot b thingshadow b code a a href jobs 
code awsiot b jobs b code a a href register code awsiot thingshadow b register b code a a href unregister code awsiot thingshadow b unregister b code a a href update code awsiot thingshadow b update b code a a href get code awsiot thingshadow b get b code a a href delete code awsiot thingshadow b delete b code a a href publish code awsiot thingshadow b publish b code a a href subscribe code awsiot thingshadow b subscribe b code a a href unsubscribe code awsiot thingshadow b unsubscribe b code a a href end code awsiot thingshadow b end b code a a href subscribetojobs code awsiot jobs b subscribetojobs b code a a href unsubscribefromjobs code awsiot jobs b unsubscribefromjobs b code a a href startjobnotifications code awsiot jobs b startjobnotifications b code a a href job code b job b code a a href document code job b document b code a a href id code job b id b code a a href operation code job b operation b code a a href status code job b status b code a a href inprogress code job b inprogress b code a a href failed code job b failed b code a a href succeeded code job b succeeded b code a a name device a awsiot device options returns a wrapper for the mqtt client https github com mqttjs mqtt js blob master readme md client class configured for a tls connection with the aws iot platform and with arguments as specified in options the awsiot specific arguments are as follows host the aws iot endpoint you will use to connect clientid the client id you will use to connect to aws iot certpath path of the client certificate file keypath path of the private key file associated with the client certificate capath path of your ca certificate file clientcert same as certpath but can also accept a buffer containing client certificate data privatekey same as keypath but can also accept a buffer containing private key data cacert same as capath but can also accept a buffer containing ca certificate data autoresubscribe set to true to automatically re subscribe to topics after 
reconnection default true offlinequeueing set to true to automatically queue published messages while offline default true offlinequeuemaxsize enforce a maximum size for the offline message queue default 0 e g no maximum offlinequeuedropbehavior set to oldest or newest to define drop behavior on a full queue when offlinequeuemaxsize 0 draintimems the minimum time in milliseconds between publishes when draining after reconnection default 250 basereconnecttimems the base reconnection time in milliseconds default 1000 maximumreconnecttimems the maximum reconnection time in milliseconds default 128000 minimumconnectiontimems the minimum time in milliseconds that a connection must be maintained in order to be considered stable default 20000 protocol the connection type either mqtts default wss websocket tls or wss custom auth websocket tls with custom authentication note that when set to wss values must be provided for the access key id and secret key in either the following options or in environment variables as specified in websocket configuration websockets when set to wss custom auth valid headers must be provided as specified in custom auth custom auth websocketoptions if protocol is set to wss you can use this parameter to pass additional options to the underlying websocket object these options are documented here https github com websockets ws blob master doc ws md class wswebsocket filename used to load credentials from the file different than the default location when protocol is set to wss default value is aws credentials profile used to specify which credential profile to be used when protocol is set to wss default value is default accesskeyid used to specify the access key id when protocol is set to wss overrides the environment variable aws access key id and aws access key id from filename if set secretkey used to specify the secret key when protocol is set to wss overrides the environment variable aws secret access key and aws secret access key from 
filename if set sessiontoken required when authenticating via cognito optional otherwise used to specify the session token when protocol is set to wss overrides the environment variable aws session token if set region used to specify aws account region e g us east 1 when protocol is set to wss if undefined a value is derived from host customauthheaders used to specify your custom authorization headers when protocol is set to wss custom auth the fields x amz customauthorizer name x amz customauthorizer signature and the field for your token name are required servername used for sni if undefined a value is derived from host port used to specify which port to connect to if undefined 443 or 8883 will be chosen depending on protocol customauthquerystring used to specify the token credentials in a query string for custom authorization when protocol is set to wss custom auth more info can be found here https docs aws amazon com iot latest developerguide custom auth html custom auth websockets keepalive used to specify the time interval for each ping request default is set to 300 seconds to connect to aws iot enablemetrics used to report sdk version usage metrics it is set to true by default to disable metrics collection set value to false debug set to true for verbose logging default false all certificates and keys must be in pem format options also contains arguments specific to mqtt see the mqtt client documentation https github com mqttjs mqtt js blob master readme md client for details of these arguments note aws iot doesn t support retained messages setting retain flag to true for message publishing including last will and testament messages will result in connection termination for aws iot protocol specifics please visit here http docs aws amazon com iot latest developerguide protocols html supports all events emitted by the mqtt client https github com mqttjs mqtt js blob master readme md client class a name updatewebsocketcredentials a awsiot device 
updatewebsocketcredentials accesskeyid secretkey sessiontoken expiration update the credentials set used to authenticate via websocket sigv4 this method is designed to be invoked during the callback of the getcredentialsforidentity method http docs aws amazon com awsjavascriptsdk latest aws cognitoidentity html getcredentialsforidentity property in the aws sdk for javascript http docs aws amazon com awsjavascriptsdk guide index html accesskeyid the latest access key to use when connecting via websocket sigv4 secretkey the latest secret key to use when connecting via websocket sigv4 sessiontoken the latest session token to use when connecting via websocket sigv4 expiration the time this credentials set will expire a name thingshadow a awsiot thingshadow deviceoptions thingshadowoptions the thingshadow class wraps an instance of the device class with additional functionality to operate on thing shadows via the aws iot api the arguments in deviceoptions include all those in the device class device thingshadowoptions has the addition of the following arguments specific to the thingshadow class operationtimeout the timeout for thing operations default 10 seconds supports all events emitted by the mqtt client https github com mqttjs mqtt js blob master readme md client class however the semantics for the message event are slightly different and additional events are available as described below event message function topic message emitted when a message is received on a topic not related to any thing shadows topic topic of the received packet message payload of the received packet event status function thingname stat clienttoken stateobject emitted when an operation update get delete completes thingname name of the thing shadow for which the operation has completed stat status of the operation accepted rejected clienttoken the operation s clienttoken stateobject the stateobject returned for the operation applications can use clienttoken values to correlate status events 
with the operations that they are associated with by saving the clienttokens returned from each operation event delta function thingname stateobject emitted when a delta has been received for a registered thing shadow thingname name of the thing shadow that has received a delta stateobject the stateobject returned for the operation event foreignstatechange function thingname operation stateobject emitted when a different client s update or delete operation is accepted on the shadow thingname name of the thing shadow for which the operation has completed operation operation performed by the foreign client update delete stateobject the stateobject returned for the operation this event allows an application to be aware of successful update or delete operations performed by different clients event timeout function thingname clienttoken emitted when an operation update get delete has timed out thingname name of the thing shadow that has received a timeout clienttoken the operation s clienttoken applications can use clienttoken values to correlate timeout events with the operations that they are associated with by saving the clienttokens returned from each operation a name register a awsiot thingshadow register thingname options callback register interest in the thing shadow named thingname the thingshadow class will subscribe to any applicable topics and will fire events for the thing shadow until awsiot thingshadow unregister unregister is called with thingname options can contain the following arguments to modify how this thing shadow is processed ignoredeltas set to true to not subscribe to the delta sub topic for this thing shadow used in cases where the application is not interested in changes e g update only default false persistentsubscribe set to false to unsubscribe from all operation sub topics while not performing an operation default true discardstale set to false to allow receiving messages with old version numbers default true enableversioning set to true 
to send version numbers with shadow updates default true the persistentsubscribe argument allows an application to get faster operation responses at the expense of potentially receiving more irrelevant response traffic i e response traffic for other clients who have registered interest in the same thing shadow when persistentsubscribe is set to false operation sub topics are only subscribed to during the scope of that operation note that in this mode update get and delete operations will be much slower however the application will be less likely to receive irrelevant response traffic the discardstale argument allows applications to receive messages which have obsolete version numbers this can happen when messages are received out of order applications which set this argument to false should use other methods to determine how to treat the data e g use a time stamp property to know how old stale it is if enableversioning is set to true version numbers will be sent with each operation aws iot maintains version numbers for each shadow and will reject operations which contain the incorrect version in applications where multiple clients update the same shadow clients can use versioning to avoid overwriting each other s changes if the callback parameter is provided it will be invoked after registration is complete i e when subscription acks have been received for all shadow topics applications should wait until shadow registration is complete before performing update get delete operations a name unregister a awsiot thingshadow unregister thingname unregister interest in the thing shadow named thingname the thingshadow class will unsubscribe from all applicable topics and no more events will be fired for thingname a name update a awsiot thingshadow update thingname stateobject update the thing shadow named thingname with the state specified in the javascript object stateobject thingname must have been previously registered using awsiot thingshadow register register the 
thingShadow class will subscribe to all applicable topics and publish `stateObject` on the <b>update</b> sub-topic.

This function returns a `clientToken`, which is a unique value associated with the update operation. When a 'status' or 'timeout' event is emitted, the `clientToken` will be supplied as one of the parameters, allowing the application to keep track of the status of each operation. The caller may create their own `clientToken` value; if `stateObject` contains a `clientToken` property, that will be used rather than the internally generated value. Note that it should be of atomic type (i.e. numeric or string). This function returns `null` if an operation is already in progress.

<a name="get"></a>
### awsIot.thingShadow#get(thingName, [clientToken])

Get the current state of the Thing Shadow named `thingName`, which must have been previously registered using [awsIot.thingShadow#register()](#register). The thingShadow class will subscribe to all applicable topics and publish on the <b>get</b> sub-topic.

This function returns a `clientToken`, which is a unique value associated with the get operation. When a 'status' or 'timeout' event is emitted, the `clientToken` will be supplied as one of the parameters, allowing the application to keep track of the status of each operation. The caller may supply their own `clientToken` value (optional); if supplied, the value of `clientToken` will be used rather than the internally generated value. Note that this value should be of atomic type (i.e. numeric or string). This function returns `null` if an operation is already in progress.

<a name="delete"></a>
### awsIot.thingShadow#delete(thingName, [clientToken])

Delete the Thing Shadow named `thingName`, which must have been previously registered using [awsIot.thingShadow#register()](#register). The thingShadow class will subscribe to all applicable topics and publish on the <b>delete</b> sub-topic.

This function returns a `clientToken`, which is a unique value associated with the delete operation. When a 'status' or 'timeout' event is emitted, the `clientToken` will be supplied as one of the parameters, allowing the application to keep track of the status of each operation. The caller may supply their own `clientToken` value (optional); if supplied, the value of `clientToken` will be used rather than the internally generated value. Note that this value should be of atomic type (i.e. numeric or string). This function returns `null` if an operation is already in progress.

<a name="publish"></a>
### awsIot.thingShadow#publish(topic, message, [options], [callback])

Identical to the [mqtt.Client#publish()](https://github.com/mqttjs/MQTT.js/blob/master/README.md#publish) method, with the restriction that the topic may not represent a Thing Shadow. This method allows the user to publish messages to topics on the same connection used to access Thing Shadows.

<a name="subscribe"></a>
### awsIot.thingShadow#subscribe(topic, [options], [callback])

Identical to the [mqtt.Client#subscribe()](https://github.com/mqttjs/MQTT.js/blob/master/README.md#subscribe) method, with the restriction that the topic may not represent a Thing Shadow. This method allows the user to subscribe to messages from topics on the same connection used to access Thing Shadows.

<a name="unsubscribe"></a>
### awsIot.thingShadow#unsubscribe(topic, [callback])

Identical to the [mqtt.Client#unsubscribe()](https://github.com/mqttjs/MQTT.js/blob/master/README.md#unsubscribe) method, with the restriction that the topic may not represent a Thing Shadow. This method allows the user to unsubscribe from topics on the same connection used to access Thing Shadows.

<a name="end"></a>
### awsIot.thingShadow#end([force], [callback])

Invokes the [mqtt.Client#end()](https://github.com/mqttjs/MQTT.js/blob/master/README.md#end) method on the MQTT connection owned by the thingShadow class. The `force` and `callback` parameters are optional and identical in function to the parameters in the [mqtt.Client#end()](https://github.com/mqttjs/MQTT.js/blob/master/README.md#end) method.

<a name="jobs"></a>
### awsIot.jobs(deviceOptions)

The jobs class wraps an instance of the device class with additional functionality to handle job execution management through the AWS IoT Jobs platform.
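Before going further into the jobs API, the clientToken bookkeeping that the shadow update/get/delete methods above rely on can be sketched as follows. This is a self-contained illustration using a plain `Map` in place of the real thingShadow class; the names `pending`, `track`, `onStatus`, and `onTimeout` are illustrative, not SDK APIs, and the handler signatures mirror the 'status' and 'timeout' events described earlier.

```javascript
const pending = new Map(); // clientToken -> operation name

function track(operation, clientToken) {
  // update()/get()/delete() return null when an operation is already
  // in progress; only remember real tokens.
  if (clientToken !== null) {
    pending.set(clientToken, operation);
  }
  return clientToken;
}

// Mirrors the 'status' event: (thingName, stat, clientToken, stateObject).
function onStatus(thingName, stat, clientToken, stateObject) {
  const operation = pending.get(clientToken);
  pending.delete(clientToken);
  return operation + ' on ' + thingName + ': ' + stat;
}

// Mirrors the 'timeout' event: (thingName, clientToken).
function onTimeout(thingName, clientToken) {
  const operation = pending.get(clientToken);
  pending.delete(clientToken);
  return operation + ' on ' + thingName + ' timed out';
}

const token = track('update', 'token-1'); // token value is made up
console.log(onStatus('TemperatureStatus', 'accepted', token, { state: {} }));
// -> "update on TemperatureStatus: accepted"
```

With the actual SDK, the same pattern applies: save each token returned by update/get/delete, then look it up when a 'status' or 'timeout' event fires.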
Arguments in `deviceOptions` are the same as those in the [device class](#device), and the jobs class supports all of the same events and functions as the device class. The jobs class also supports the following methods:

<a name="subscribetojobs"></a>
### awsIot.jobs#subscribeToJobs(thingName, [operationName], callback)

Subscribes to job execution notifications for the Thing named `thingName`. If `operationName` is specified, then the callback will only be called when a job ready for execution contains a property called `operation` in its job document with a value matching `operationName`. If `operationName` is omitted, then the callback will be called for every job ready for execution that does not match another subscribeToJobs subscription.

* `thingName` - name of the Thing to receive job execution notifications
* `operationName` - optionally filter job execution notifications to jobs with a value for the `operation` property that matches `operationName`
* `callback` - `function(err, job)` callback for when a job execution is ready for processing or an error occurs
  * `err` - a subscription error or an error that occurs when the client is disconnecting
  * `job` - an object that contains job execution information and functions for updating job execution status

<a name="unsubscribefromjobs"></a>
### awsIot.jobs#unsubscribeFromJobs(thingName, [operationName], callback)

Unsubscribes from job execution notifications for the Thing named `thingName` having operations with a value of the given `operationName`. If `operationName` is omitted, then the default handler for the Thing with the given name is unsubscribed.

* `thingName` - name of the Thing to cancel job execution notifications for
* `operationName` - optional name of previously subscribed operation names
* `callback` - `function(err)` callback for when the unsubscribe operation completes

<a name="startjobnotifications"></a>
### awsIot.jobs#startJobNotifications(thingName, callback)

Causes any existing queued job executions for the given Thing to be published to the appropriate subscribeToJobs handler. Only needs to be called once per Thing.

* `thingName` - name of the Thing to start job execution notifications for
* `callback` - `function(err)` callback for when the startJobNotifications operation completes
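The operationName routing performed by subscribeToJobs() can be illustrated with a small self-contained dispatcher. The `handlers`, `subscribe`, and `dispatch` names here are illustrative only; the real SDK does this routing internally when you register callbacks.

```javascript
// Illustration of subscribeToJobs() routing: a handler registered with an
// operationName only sees matching job documents; a handler registered
// without one is the default for everything unmatched.
const handlers = new Map(); // operationName (or '' for default) -> callback

function subscribe(operationName, callback) {
  handlers.set(operationName || '', callback);
}

function dispatch(jobDocument) {
  // Prefer a handler registered for this specific operation,
  // otherwise fall back to the default (no operationName) handler.
  const cb = handlers.get(jobDocument.operation) || handlers.get('');
  return cb ? cb(jobDocument) : undefined;
}

subscribe('install', (doc) => 'installing ' + doc.packageName);
subscribe(null, (doc) => 'unhandled operation: ' + doc.operation);

console.log(dispatch({ operation: 'install', packageName: 'myApp' }));
// -> "installing myApp"
console.log(dispatch({ operation: 'reboot' }));
// -> "unhandled operation: reboot"
```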
<a name="job"></a>
### job

An object that contains job execution information and functions for updating job execution status.

<a name="document"></a>
### job.document

The JSON document describing details of the job to be executed, e.g. `{ operation: 'install', otherProperty: 'value' }`.

<a name="id"></a>
### job.id

Returns the job id.

<a name="operation"></a>
### job.operation

Returns the job operation from the job document, e.g. `install`, `reboot`, etc.

<a name="status"></a>
### job.status

Returns the current job status according to AWS (e.g. `IN_PROGRESS`, `QUEUED`), along with any status details, e.g. `{ status: 'IN_PROGRESS', statusDetails: { progress: 50 } }`.

<a name="inprogress"></a>
### job.inProgress([statusDetails], [callback])

Update the status of the job execution to be in progress for the Thing associated with the job.

* `statusDetails` - optional document describing the status details of the in-progress job, e.g. `{ progress: 50 }`
* `callback` - `function(err)` optional callback for when the operation completes; `err` is null if no error occurred

<a name="failed"></a>
### job.failed([statusDetails], [callback])

Update the status of the job execution to be failed for the Thing associated with the job.

* `statusDetails` - optional document describing the status details of the job, e.g. `{ progress: 0 }`
* `callback` - `function(err)` optional callback for when the operation completes; `err` is null if no error occurred

<a name="succeeded"></a>
### job.succeeded([statusDetails], [callback])

Update the status of the job execution to be success for the Thing associated with the job.

* `statusDetails` - optional document describing the status details of the job, e.g. `{ progress: 100 }`
* `callback` - `function(err)` optional callback for when the operation completes; `err` is null if no error occurred

<a name="connections"></a>
### Connection Types

This SDK supports three types of connections to the AWS IoT platform:

* MQTT over TLS with mutual certificate authentication, using port 8883
* MQTT over WebSocket/TLS with
SigV4 authentication, using port 443
* MQTT over WebSocket/TLS using a custom authorization function to authenticate

The default connection type is MQTT over TLS with mutual certificate authentication; to configure a WebSocket/TLS connection, set the `protocol` option to `wss` when instantiating the [awsIot.device()](#device) or [awsIot.thingShadow()](#thingshadow) classes. To use custom auth, set the `protocol` option to `wss-custom-auth`.

<a name="custom-auth"></a>
### Custom Authorization Configuration

To use custom authorization, you must first set up an authorizer function in Lambda and register it with IoT. Once you do, you will be able to authenticate using this function. There are two ways to use custom auth:

1. Set the `customAuthHeaders` option to your headers object when instantiating the [awsIot.device()](#device) or [awsIot.thingShadow()](#thingshadow) classes. The headers object is an object containing the header names and values as key/value pairs:

   ```js
   {
     'X-Amz-CustomAuthorizer-Name': 'testAuthorizer',
     'X-Amz-CustomAuthorizer-Signature': '<signature>',
     'testAuthorizerToken': '<token>'
   }
   ```

2. Set the `customAuthQueryString` option to your query string when instantiating the [awsIot.device()](#device) class. The query string is a string containing the values as key/value pairs:

   ```js
   '?X-Amz-CustomAuthorizer-Name=testAuthorizer&X-Amz-CustomAuthorizer-Signature=<signature>&testAuthorizerToken=<token>'
   ```

<a name="programs"></a>
### Example Programs

The `examples` directory contains several programs which demonstrate usage of the AWS IoT APIs:

* `device-example.js`: demonstrate simple MQTT publish and subscribe operations
* [`echo-example.js`](#echoexample): test Thing Shadow operation by echoing all delta state updates to the update topic; used in conjunction with the [AWS IoT Console](https://console.aws.amazon.com/iot) to verify connectivity with the AWS IoT platform
* `thing-example.js`: use a Thing Shadow to automatically synchronize state between a simulated device and a control application
* `thing-passthrough-example.js`: demonstrate use of a Thing Shadow with passthrough of standard MQTT publish and subscribe messages
* `temperature-control/temperature-control.js`: an interactive device simulation which uses Thing Shadows
* [`jobs-example.js`](#jobsexample): receive example job execution messages and update job execution status
* [`jobs-agent.js`](#jobsagent): example agent to handle standard operations such as reboot, report system status, and shutdown. It also handles installation of files, including but not limited to configuration files, program updates, and security certificates; it can also install and launch other programs and manage their executions (start, stop, and restart).

The example programs use command line parameters to set options. To see the available options, run the program and specify the `-h` option as follows:

```sh
node examples/<example-program> -h
```

Note: you have to use a certificate created in the same region as your host endpoint. You will also need to use your unique custom endpoint with the `-H` command line option when connecting the examples to the AWS IoT cloud.

<a name="websockets"></a>
### WebSocket Configuration

The example programs can be configured to use a WebSocket/TLS connection to the AWS IoT platform by adding `--protocol=wss` to the command line to override the default setting of `mqtts`:

```sh
-P, --protocol=PROTOCOL  connect using PROTOCOL (mqtts|wss)
```

When using a WebSocket/TLS connection, you have the following options to set credentials:

* Export variables to the system environment:

  ```sh
  export AWS_ACCESS_KEY_ID=[a valid AWS access key ID]
  export AWS_SECRET_ACCESS_KEY=[a valid AWS secret access key]
  ```

* Load IAM credentials from the shared credential file. The default shared credential file is located in `~/.aws/credentials` for Linux users and `%USERPROFILE%\.aws\credentials` for Windows users. This can be configured using the AWS CLI (visit the [AWS CLI home page](https://aws.amazon.com/cli/)). Alternatively, you can provide a credential file in a different path, or another profile, by specifying it in the awsIot device options.

The values of `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` must contain valid AWS Identity and Access Management (IAM) credentials. For more information about AWS IAM, visit the [AWS IAM home page](https://aws.amazon.com/iam).
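In code, a WebSocket/TLS connection comes down to a device options object like the sketch below. The endpoint and credential values are placeholders; in practice the SDK can also pick credentials up from the environment variables or the shared credentials file described above, so passing them explicitly is only one of the options.

```javascript
// Sketch of device options for a WebSocket/TLS (SigV4) connection.
// All values are placeholders; with the SDK loaded you would pass this
// object to awsIot.device().
const deviceOptions = {
  host: 'your-prefix.iot.us-east-1.amazonaws.com', // hypothetical endpoint
  protocol: 'wss',
  accessKeyId: 'YOUR_AWS_ACCESS_KEY_ID',
  secretKey: 'YOUR_AWS_SECRET_ACCESS_KEY'
};
// const device = awsIot.device(deviceOptions);

console.log(deviceOptions.protocol);
// -> "wss"
```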
<a name="certificates"></a>
### Certificate Configuration

When not configured to use a WebSocket/TLS connection, the example programs require a client certificate and private key (created using either the [AWS IoT Console](https://console.aws.amazon.com/iot) or the [AWS IoT CLI](https://aws.amazon.com/cli/)) in order to authenticate with AWS IoT. Each example program uses command line options to specify the names and/or locations of certificates as follows:

#### Specify a directory containing default-named certificates

```sh
-f, --certificate-dir=DIR  look in DIR for certificates
```

The `--certificate-dir (-f)` option will read all certificate and key files from the directory specified. Default certificate/key file names are as follows:

* `certificate.pem.crt`: your AWS IoT certificate
* `private.pem.key`: the private key associated with your AWS IoT certificate
* `root-CA.crt`: the root CA certificate, available from the AWS documentation [here](https://docs.aws.amazon.com/iot/latest/developerguide/server-authentication.html#server-authentication-certs)

#### Specify certificate names and locations individually

```sh
-k, --private-key=FILE         use FILE as private key
-c, --client-certificate=FILE  use FILE as client certificate
-a, --ca-certificate=FILE      use FILE as CA certificate
```

The `-f (certificate directory)` option can be combined with these so that you don't have to specify absolute pathnames for each file.

<a name="configurationFile"></a>
#### Use a configuration file

The [AWS IoT Console](https://console.aws.amazon.com/iot) can generate JSON configuration data specifying the parameters required to connect a device to the AWS IoT platform. The JSON configuration data includes pathnames to certificates, the hostname and port number, etc. The command line option `--configuration-file (-F)` is used when reading parameters from a configuration file:

```sh
-F, --configuration-file=FILE  use FILE (JSON format) for configuration
```

The configuration file is in JSON format and may contain the following properties:

* `host` - the host name to connect to
* `port` - the port number to use when connecting to the host (8883 for AWS IoT with client certificate)
* `clientId` - the client ID to use when connecting
* `privateKey` - file containing the private key
* `clientCert` - file containing the client certificate
* `caCert` - file containing the CA certificate
* `thingName` - thing name to use

#### Tips for using JSON configuration files

* The `-f (certificate directory)` and `-F (configuration file)` options can be combined so that you don't have to use absolute pathnames in the configuration file.
* When using a configuration file to run any of the example programs other than [echo-example.js](#echoexample), you must specify different client IDs for each process using the `-i` command line option.

### device-example.js

device-example.js is run as two processes which communicate with one another via the AWS IoT platform using MQTT publish and subscribe. The command line option `--test-mode (-t)` is used to set which role each process performs. It's easiest to run each process in its own terminal window so that you can see the output generated by each. Note that in the following examples, all certificates are located in the `certs` directory and have the default names as specified in the [Certificate Configuration section](#certificates).

#### Terminal window 1

```sh
node examples/device-example.js -f certs --test-mode=1 -H <prefix>.iot.<region>.amazonaws.com
```

#### Terminal window 2

```sh
node examples/device-example.js -f certs --test-mode=2 -H <prefix>.iot.<region>.amazonaws.com
```

### thing-example.js

Similar to device-example.js, thing-example.js is also run as two processes which communicate with one another via the AWS IoT platform. thing-example.js uses a Thing Shadow to synchronize state between the two processes, and the command line option `--test-mode (-t)` is used to set which role each process performs. As with device-example.js, it's best to run each process in its own terminal window or on separate hosts. In this example, the example programs are configured to use WebSocket/TLS connections to the AWS IoT platform as specified in the [WebSocket Configuration](#websockets) section.
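As an aside before the terminal commands below: a complete `config.json` for the configuration-file option described earlier might look like the following. All values are hypothetical; the certificate file names match the defaults from the certificate configuration section.

```json
{
  "host": "your-prefix.iot.us-east-1.amazonaws.com",
  "port": 8883,
  "clientId": "testThing1",
  "privateKey": "private.pem.key",
  "clientCert": "certificate.pem.crt",
  "caCert": "root-CA.crt",
  "thingName": "testThing1"
}
```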
#### Terminal window 1

```sh
node examples/thing-example.js -P wss --test-mode=1 -H <prefix>.iot.<region>.amazonaws.com
```

#### Terminal window 2

```sh
node examples/thing-example.js -P wss --test-mode=2 -H <prefix>.iot.<region>.amazonaws.com
```

### thing-passthrough-example.js

Similar to thing-example.js, thing-passthrough-example.js is also run as two processes which communicate with one another via the AWS IoT platform. thing-passthrough-example.js uses a Thing Shadow to synchronize state from one process to another, and uses MQTT publish/subscribe to send information in the other direction. The command line option `--test-mode (-t)` is used to set which role each process performs. As with thing-example.js, it's best to run each process in its own terminal window. Note that in the following examples, all certificates are located in the `certs` directory and have the default names as specified in the [Certificate Configuration section](#certificates).

#### Terminal window 1

```sh
node examples/thing-passthrough-example.js -f certs --test-mode=1 -H <prefix>.iot.<region>.amazonaws.com
```

#### Terminal window 2

```sh
node examples/thing-passthrough-example.js -f certs --test-mode=2 -H <prefix>.iot.<region>.amazonaws.com
```

<a name="echoexample"></a>
### echo-example.js

echo-example.js is used in conjunction with the [AWS IoT Console](https://console.aws.amazon.com/iot) to verify connectivity with the AWS IoT platform and to perform interactive observation of Thing Shadow operation. In the following example, the program is run using the configuration file `config.json`, and the certificates are located in the `certs` directory. Here, the `-f (certificate directory)` and `-F (configuration file)` options are combined so that the configuration file doesn't need to contain absolute pathnames.

```sh
node examples/echo-example.js -F config.json -f certs --thing-name=testThing1
```

<a name="temp-control"></a>
### temperature-control.js

temperature-control.js is an interactive simulation which demonstrates how Thing Shadows can be used to easily synchronize applications and
Internet-connected devices. Like thing-example.js, temperature-control.js runs in two separate terminal windows and is configured via command line options; in the following example, all certificates are located in the `certs` directory and have the default names as specified in the [Certificate Configuration section](#certificates). The process running with `--test-mode=2` simulates an Internet-connected temperature control device, and the process running with `--test-mode=1` simulates a mobile application which is monitoring/controlling it. The processes may be run on different hosts if desired.

#### Installing dependencies

temperature-control.js uses the [blessed.js](https://github.com/chjj/blessed) and [blessed-contrib.js](https://github.com/yaronn/blessed-contrib) libraries to provide an interactive terminal interface; it looks best on an 80x25 terminal with a black background and white or green text, and requires UTF-8 character encoding. You'll need to install these libraries in the `examples/temperature-control` directory as follows:

```sh
cd examples/temperature-control
npm install
```

#### Running the simulation - Terminal window 1

```sh
node examples/temperature-control/temperature-control.js -f certs --test-mode=1 -H <prefix>.iot.<region>.amazonaws.com
```

![temperature-control.js, mobile application mode](https://s3.amazonaws.com/aws-iot-device-sdk-js-supplemental/images/temperature-control-mobile-app-mode.png)

#### Running the simulation - Terminal window 2

```sh
node examples/temperature-control/temperature-control.js -f certs --test-mode=2 -H <prefix>.iot.<region>.amazonaws.com
```

![temperature-control.js, device mode](https://s3.amazonaws.com/aws-iot-device-sdk-js-supplemental/images/temperature-control-device-mode.png)

#### Using the simulation

The simulated temperature control device has two controls: Setpoint and Status. Status controls whether or not the device is active, and Setpoint controls the interior temperature the device will attempt to achieve. In addition, the device reports the current interior and exterior temperatures as well as its operating state (heating, cooling, or stopped).

Two Thing Shadows are used to connect the simulated device and mobile application: one contains the controls and the other contains the measured temperatures and operating state. Both processes can update the controls, but only the device can update the measured temperatures and the operating state.

Controlling the simulation is done using the <kbd>up</kbd>, <kbd>down</kbd>, <kbd>left</kbd>, <kbd>right</kbd>, and <kbd>Enter</kbd> keys as follows:

* <kbd>up</kbd>: increase the Setpoint
* <kbd>down</kbd>: decrease the Setpoint
* <kbd>left</kbd>: move left on the menu bar
* <kbd>right</kbd>: move right on the menu bar
* <kbd>Enter</kbd>: select the current menu option

#### Operating state

The operating state of the device is indicated by the color of the interior temperature field as follows:

* Red: heating
* Cyan: cooling
* White: stopped

The following example shows the temperature control simulation in device mode while the operating state is heating:

![temperature-control.js, device mode, heating operating state](https://s3.amazonaws.com/aws-iot-device-sdk-js-supplemental/images/temperature-control-device-mode-heating.png)

#### Log

The log window displays events of interest, e.g. network connectivity, status toggles, re-synchronization with the Thing Shadow, etc.

#### Menu options

* Mode: toggle the device status; status can be controlled from both the simulated device and the mobile application
* Network: toggle the network connectivity of the device or mobile application; this can be used to observe how both sides re-synchronize when connectivity is restored

In this example, the mobile application is disconnected from the network. Although it has requested that the Setpoint be lowered to 58 degrees, the command can't be sent to the device as there is no network connectivity, so the operating state still shows as stopped. When the mobile application is reconnected to the network, it will attempt to update the Thing Shadow for the device's controls; if no control changes have been made on the device side during the disconnection period, the device will synchronize to the mobile application's requested state; otherwise, the mobile application will re-synchronize to the device's current state.

![temperature-control.js, mobile application mode, network disconnected](https://s3.amazonaws.com/aws-iot-device-sdk-js-supplemental/images/temperature-control-mobile-app-mode-network-disconnected.png)

#### Exiting the simulation

The simulation can be exited at any time by pressing <kbd>q</kbd>, <kbd>Ctrl</kbd>+<kbd>c</kbd>, or by selecting 'exit' on the menu bar.

<a name="jobsexample"></a>
### jobs-example.js

jobs-example.js, like [echo-example.js](#echoexample), can receive messages via the [AWS IoT Console](https://console.aws.amazon.com/iot) to verify connectivity with the AWS IoT platform, but it can also receive and process job executions initiated through the AWS IoT device jobs management platform. See the AWS IoT jobs documentation [here](https://aws.amazon.com/documentation/iot/) for more information on creating and deploying jobs.

#### Running the jobs example

```sh
node examples/jobs-example.js -f certs -H <prefix>.iot.<region>.amazonaws.com -T thingName
```

<a name="jobsagent"></a>
### jobs-agent.js

jobs-agent.js can be run on a device as-is, or it can be modified to suit specific use cases. Example job documents are provided below. For more information, see the AWS IoT connected device management documentation [here](https://aws.amazon.com/documentation/iot/).

#### Running the jobs agent

```sh
node examples/jobs-agent.js -f certs -H <prefix>.iot.<region>.amazonaws.com -T agentThingName
```

#### Using the jobs agent

##### systemStatus operation

The jobs agent will respond to the AWS IoT jobs management platform with system status information when it receives a job execution notification with a job document that looks like this:

```json
{ "operation": "systemStatus" }
```

##### reboot operation

When the jobs agent receives a reboot job document, it will attempt to reboot the device it is running on, while sending updates on its progress to the AWS IoT jobs management platform. After the reboot, the job execution status will be marked as in progress until
the jobs agent is also restarted, at which point the status will be updated to success. To avoid manual steps during reboot, it is suggested that the device be configured to automatically start the jobs agent at device startup time. Job document format:

```json
{ "operation": "reboot" }
```

##### shutdown operation

When the jobs agent receives a shutdown job document, it will attempt to shut down the device.

```json
{ "operation": "shutdown" }
```

##### install operation

When the jobs agent receives an install job document, it will attempt to install the files specified in the job document. An install job document should follow this general format:

```json
{
  "operation": "install",
  "packageName": "uniquePackageName",
  "workingDirectory": "jobsExampleDirectory",
  "launchCommand": "node jobs-example.js -f certs -H <prefix>.iot.<region>.amazonaws.com -T thingName",
  "autoStart": "true",
  "files": [
    {
      "fileName": "jobs-example.js",
      "fileVersion": "1.0.2.10",
      "fileSource": {
        "url": "https://some-bucket.s3.amazonaws.com/jobs-example.js"
      },
      "checksum": {
        "inline": {
          "value": "9569257356cfc5c7b2b849e5f58b5d287f183e08627743498d9bd52801a2fbe4"
        },
        "hashAlgorithm": "sha256"
      }
    },
    {
      "fileName": "config.json",
      "fileSource": {
        "url": "https://some-bucket.s3.amazonaws.com/config.json"
      }
    }
  ]
}
```

* `packageName` - each install operation must have a unique package name; if the packageName matches a previous install operation, then the new install operation overwrites the previous one
* `workingDirectory` - optional property for working directory
* `launchCommand` - optional property for launching an application/package; if omitted, copy files only
* `autoStart` - if set to `"true"`, then the agent will execute the launch command when the agent starts up
* `files` - specifies files to be installed
  * `fileName` - name of file as written to the file system
  * `fileSource.url` - location of file to be downloaded from
  * `checksum` - optional file checksum
    * `inline.value` - checksum value
    * `hashAlgorithm` - checksum hash algorithm used

##### start operation

When the jobs agent receives a start job document, it will attempt to start up the specified package.

```json
{ "operation": "start", "packageName": "somePackageName" }
```

##### stop operation

When the jobs agent receives a stop job document, it will attempt to stop the specified package.

```json
{ "operation": "stop", "packageName": "somePackageName" }
```

##### restart operation

When the jobs agent receives a restart job document, it will attempt to restart the specified package.

```json
{ "operation": "restart", "packageName": "somePackageName" }
```

<a name="browser"></a>
### Browser Applications

This SDK can be packaged to run in a browser using [browserify](http://browserify.org/) or [webpack](https://webpack.js.org/), and includes helper scripts and example application code to help you get started writing browser applications that use AWS IoT.

#### Background

Browser applications connect to AWS IoT using [MQTT over the Secure WebSocket Protocol](http://docs.aws.amazon.com/iot/latest/developerguide/protocols.html). There are some important differences between Node.js and browser environments, so a few adjustments are necessary when using this SDK in a browser application.

When running in a browser environment, the SDK doesn't have access to the filesystem or process environment variables, so these can't be used to store credentials. While it might be possible for an application to prompt the user for IAM credentials, the [Amazon Cognito Identity Service](https://aws.amazon.com/cognito/) provides a more user-friendly way to retrieve credentials which can be used to access AWS IoT. The [temperature monitor browser example application](#temperature-monitor-browser-example) illustrates this use case.

#### Using SDK with browserify

##### Installing browserify

This SDK can also work with web applications using browserify. First, you'll need to make sure that browserify is installed. The following instructions and the scripts in this package assume that it is installed globally, as with:

```sh
npm install -g browserify
```

##### Browser application utility

This SDK includes a utility script called `scripts/browserize.sh`. This script can create a browser bundle containing both the [AWS SDK for JavaScript](https://aws.amazon.com/sdk-for-browser/) and this SDK, or you can use it to create application bundles for browser applications, like the ones under the `examples/browser`
directory.

For Windows users who do not want to use a bash shell, the SDK also includes a batch file, `scripts/windows-browserize.bat`, which does the same job as browserize.sh but is able to run in the Windows cmd shell. To create the combined AWS SDK browser bundle, run this command in the SDK's top-level directory:

```sh
npm run-script browserize
```

This command will create a browser bundle in `browser/aws-iot-sdk-browser-bundle.js`. The browser bundle makes both the AWS SDK and AWS IoT device SDK modules available, so that you can require them from your browserified application bundle.

Note for Windows users running scripts in cmd: since the batch script file does not work well with npm package scripts, Windows users can just call the script directly in place of `npm run-script browserize`. This also applies to the example applications demonstrated below.

```sh
scripts\windows-browserize.bat
```

##### Creating application bundles

You can also use the `scripts/browserize.sh` script to browserify your own applications and use them with the AWS SDK browser bundle. For example, to prepare the [temperature monitor browser example application](#temperature-monitor-browser-example) for use, run this command in the SDK's top-level directory:

```sh
npm run-script browserize examples/browser/temperature-monitor/index.js
```

This command does two things. First, it creates an application bundle from `examples/browser/temperature-monitor/index.js` and places it in `examples/browser/temperature-monitor/bundle.js`. Second, it copies `browser/aws-iot-sdk-browser-bundle.js` into your application's directory, where it can be used, e.g.:

```html
<script src="aws-iot-sdk-browser-bundle.js"></script>
<script src="bundle.js"></script>
```

<a name="temperature-monitor-browser-example"></a>
### Temperature monitor browser example application

This SDK includes a companion browser application to the [temperature control example application](#temp-control). The browser application allows you to monitor the status of the simulated temperature control device.

1. Follow the instructions to install the [temperature control example application](#temp-control).
1. In order for the browser application to be able to authenticate and connect to AWS IoT, you'll need to configure a Cognito Identity Pool. In the [Amazon Cognito console](https://console.aws.amazon.com/cognito/), use Amazon Cognito to create a new identity pool and allow unauthenticated identities to connect. Obtain the `PoolId` constant. Make sure that the [policy attached to the unauthenticated role](https://console.aws.amazon.com/iam/home#roles) has permissions to access the required AWS IoT APIs. More information about AWS IAM roles and policies can be found [here](http://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_manage.html).
1. Edit `examples/browser/temperature-monitor/aws-configuration.js` and replace the values of `poolId` and `region` with strings containing the ID of the Cognito Identity Pool and your AWS region (e.g. `us-east-1`) from the previous step.
1. Create the application browser bundle by executing the following command in the top-level directory of the SDK:

   ```sh
   npm run-script browserize examples/browser/temperature-monitor/index.js
   ```

1. Start an instance of the device simulation using:

   ```sh
   node examples/temperature-control/temperature-control.js -f certs --test-mode=2 -H <prefix>.iot.<region>.amazonaws.com
   ```

   NOTE: although the above example shows connecting using a certificate/private key set, you can use any of the command line options described in the [example programs](#programs) section.

1. Open `examples/browser/temperature-monitor/index.html` in your web browser. It should connect to AWS IoT and begin displaying the status of the simulated temperature control device you started in the previous step. If you change the device's settings, the browser window should update and display the latest status values.

<a name="lifecycle-event-monitor-browser-example"></a>
### Lifecycle event monitor browser example application

This SDK includes a browser application which demonstrates the functionality of AWS IoT lifecycle events, documented at http://docs.aws.amazon.com/iot/latest/developerguide/life-cycle-events.html.
life cycle events html aws iot generates lifecycle events whenever clients connect or disconnect applications can monitor these and take action when clients connect or disconnect from aws iot follow these instructions to run the application 1 in order for the browser application to be able to authenticate and connect to aws iot you ll need to configure a cognito identity pool in the amazon cognito console https console aws amazon com cognito use amazon cognito to create a new identity pool and allow unauthenticated identities to connect obtain the poolid constant make sure that the policy attached to the unauthenticated role https console aws amazon com iam home roles has permissions to access the required aws iot apis more information about aws iam roles and policies can be found here http docs aws amazon com iam latest userguide access policies manage html 1 edit examples browser lifecycle aws configuration js and replace the values of poolid and region with strings containing the id of the cognito identity pool and your aws region e g us east 1 from the previous step 1 create the application browser bundle by executing the following command in the top level directory of the sdk sh npm run script browserize examples browser lifecycle index js 1 open examples browser lifecycle index html in your web browser after connecting to aws iot it should display connected clients 1 start programs which connect to aws iot e g the example programs in this package programs make sure that these programs are connecting to the same aws region that your cognito identity pool was created in the browser application will display a green box containing the client id of each client which connects when the client disconnects the box will disappear 1 if a dynamodb table named lifecycleevents exists in your account and has a primary key named clientid the lifecycle event browser monitor browser application will display the client id contained in each row by updating this table using an 
aws iot rule http docs aws amazon com iot latest developerguide iot rules html triggered by lifecycle events http docs aws amazon com iot latest developerguide life cycle events html you can maintain a persistent list of all of the currently connected clients within your account a name mqtt explorer browser example a mqtt explorer browser example application this sdk includes a browser application which implements a simple interactive mqtt client you can use this application to subscribe to a topic and view the messages that arrive on it or to publish to a topic follow these instructions to run the application 1 in order for the browser application to be able to authenticate and connect to aws iot you ll need to configure a cognito identity pool in the amazon cognito console https console aws amazon com cognito use amazon cognito to create a new identity pool and allow unauthenticated identities to connect obtain the poolid constant make sure that the policy attached to the unauthenticated role https console aws amazon com iam home roles has permissions to access the required aws iot apis more information about aws iam roles and policies can be found here http docs aws amazon com iam latest userguide access policies manage html 1 edit examples browser mqtt explorer aws configuration js and replace the values of poolid and region with strings containing the id of the cognito identity pool and your aws region e g us east 1 from the previous step 1 create the application browser bundle by executing the following command in the top level directory of the sdk sh npm run script browserize examples browser mqtt explorer index js 1 open examples browser mqtt explorer index html in your web browser after connecting to aws iot it should display input fields allowing you to subscribe or publish to a topic by subscribing to for example you will be able to monitor all traffic within your aws account as allowed by the policy associated with the unauthenticated role of your 
Cognito Identity Pool.

## Reducing browser bundle size

After your application development is complete, you will probably want to reduce the size of the browser bundle. There are a couple of easy techniques to do this, and by combining both of them you can create much smaller browser bundles.

### Eliminate unused features from the AWS SDK

1. The [AWS SDK for JavaScript](https://github.com/aws/aws-sdk-js) allows you to install only the features you use in your application. In order to use this feature when preparing a browser bundle, first you'll need to remove any existing bundle that you've already created:

```sh
rm browser/aws-iot-sdk-browser-bundle.js
```

2. Define the AWS features your application uses as a comma-separated list in the `AWS_SERVICES` environment variable. For example, the [MQTT explorer example](#mqtt-explorer-browser-example) uses only AWS Cognito Identity, so to create a bundle containing only this feature, do:

```sh
export AWS_SERVICES=cognitoidentity
```

   For a list of the AWS SDK feature names, refer to the [features subdirectory of the AWS SDK for JavaScript](https://github.com/aws/aws-sdk-js/tree/master/features). As another example, if your application uses Cognito Identity, DynamoDB, S3, and SQS, you would do:

```sh
export AWS_SERVICES=cognitoidentity,dynamodb,s3,sqs
```

3. Create the browser app and bundle, e.g. for the [MQTT explorer example](#mqtt-explorer-browser-example), do:

```sh
npm run-script browserize examples/browser/mqtt-explorer/index.js
```

### Uglify the bundle source

[uglify](https://www.npmjs.com/package/uglify) is an npm utility for minimizing the size of JavaScript source files. To use it, first install it as a global npm package:

```sh
npm install -g uglify
```

Once installed, you can use it to reduce the bundle size:

```sh
uglify -s browser/aws-iot-sdk-browser-bundle.js -o browser/aws-iot-sdk-browser-bundle-min.js
```

After you've created the minimized bundle, you'll need to make sure that your application loads this version rather than the non-minimized version, e.g.:

```html
<script src="aws-iot-sdk-browser-bundle-min.js"></script>
```

### Optimization results

By using both of the above techniques, for the [MQTT explorer example](#mqtt-explorer-browser-example) the bundle size can be reduced from 2.4MB to 615KB.

<a name="troubleshooting"></a>
## Using the SDK with webpack

In order to work with webpack, you have to create a webpack package. You can put your file dependencies in `entry.js` and output it as `bundle.js`. An example is provided in the location `examples/browser/mqtt-webpack`:

```sh
cd examples/browser/mqtt-webpack
npm install
node_modules/.bin/webpack --config webpack.config.js
```

The `index.html` will load the output file `bundle.js` and execute functions defined in `entry.js`. This duplicates the example of MQTT explorer above, which loaded the SDK into the web browser using browserify.

## Troubleshooting

If you have problems connecting to the AWS IoT platform when using this SDK or running the example programs, there are a few things to check:

* **Region mismatch**: You have to use the certificate created in the same region as your host end point.
* **Duplicate client IDs**: Within your AWS account, the AWS IoT platform will only allow one connection per client ID. Many of the example programs run as two processes which communicate with one another. If you don't specify a client ID, the example programs will generate random client IDs, but if you are using a [JSON configuration file](#configurationfile), you'll need to explicitly specify client IDs for both programs using the `-i` command line option.
* **Invalid NPM version**: To run the `browserize.sh` script which prepares the browser example applications, you'll need to use NPM version 3. This is because `browserize.sh` expects package dependencies to be handled using the [NPM version 3 strategy, which is different than the strategy used in NPM version 2](https://docs.npmjs.com/how-npm-works/npm3). If you're having trouble running the browser application examples, make sure that you're using NPM version 3. You can check your NPM version with `npm -v`.

<a name="unittests"></a>
## Unit tests

This package includes unit tests which can be run as follows:

```sh
npm test
```

Running the unit tests will also generate code coverage data in the `reports` directory.

<a name="license"></a>
## License

This SDK is distributed under the [Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0). See LICENSE.txt and NOTICE.txt for more information.

<a name="suport"></a>
## Support

If you have technical questions about the AWS IoT Device SDK, use the [AWS IoT Forum](https://forums.aws.amazon.com/forum.jspa?forumID=210). For any other questions on AWS IoT, contact [AWS Support](https://aws.amazon.com/contact-us).
|
rune.js | rune js rune js is a javascript library for programming graphic design systems with svg official documentation http runemadsen github io rune js releases https github com runemadsen rune js releases follow runemadsen https twitter com runemadsen for announcements building and running tests this repo uses the rune plugin js package for building and running the tests in both node and the browser bash rune build rune test node rune test browser contributors yining shi http 1023 io jorge moreno https www moro es | os |
|
Huatuo-Llama-Med-Chinese | readme md english readme en md p align center width 100 a href https github com scir hi huatuo llama med chinese target blank img src assets logo logo new png alt scir hi huatuo style width 60 min width 300px display block margin auto a p huatuo bentsao original name huatuo instruction tuning large language models with chinese medical knowledge code license https img shields io badge code 20license apache 2 0 green svg https github com scir hi huatuo llama med chinese blob main license python 3 9 https img shields io badge python 3 9 blue svg https www python org downloads release python 390 instruction tuning llama alpaca chinese bloom chatgpt api news 2023 09 12 arxiv https arxiv org pdf 2309 04198 pdf 2023 09 08 arxiv https arxiv org pdf 2309 04175 pdf 2023 08 07 https github com hit scir huozi 2023 08 05 ccl 2023 demo track poster 2023 08 03 scir https github com hit scir huozi 2023 07 19 bloom https huggingface co bigscience bloom 7b1 2023 05 12 2023 04 28 alpaca https github com ymcui chinese llama alpaca 2023 04 24 llama 2023 03 31 llama a quick start python 3 9 pip install r requirements txt lora 1 0 https github com hit scir huozi bloom 7b bloom 7b https huggingface co bigscience bloomz 7b1 alpaca chinese 7b https github com ymcui chinese llama alpaca llama llama 7b https huggingface co decapoda research llama 7b hf lora lora hugging face 1 lora https pan baidu com s 1bpndnb1wqztwy be6mfcna pwd m21s 2 bloom lora https pan baidu com s 1jpcueohesfgypzj7u52fag pwd scir hugging face https huggingface co lovepon lora bloom med bloom 3 alpaca lora https pan baidu com s 16oxcjzxnxjdpl8skihgnxw pwd scir hugging face https huggingface co lovepon lora alpaca med https pan baidu com s 1hddk84ashmzoflkmypbijw pwd scir hugging face https huggingface co lovepon lora alpaca med alldata 4 llama lora https pan baidu com s 1jih per6jzea6n2u6sumog pwd jjpf hugging face https huggingface co thinksoso lora llama med https pan baidu com s 
1jadypclr2blyxituffsjpa pwd odsk hugging face https huggingface co lovepon lora llama literature lora lora folder name adapter config json lora adapter model bin lora chatglm chatglm 6b med https github com scir hi med chatglm infer data infer json infer bash scripts infer sh bash scripts infer literature single sh bash scripts infer literature multi sh infer sh base model lora lora weights instruct dir python infer py base model base model path lora weights lora weights path use lora true instruct dir infer data path prompt template template path bloom llama alpaca templates bloom deploy json templates med template json br templates literature template json scripts test sh cmekg https github com king yyf cmekg tools gpt3 5 prompt 2023 gpt3 5 data literature liver cancer json 1k p align center width 100 a href https github com scir hi huatuo llama med chinese target blank img src assets case png alt scir hi huatuo literature style width 100 min width 300px display block margin auto a p 16 https arxiv org pdf 2309 04198 pdf finetune data llama data json finetune bash scripts finetune sh llama a100 sxm 80gb 10 2h17m batch size 128 40g 3090 4090 24gb batch size wandb https wandb ai thinksoso llama med runs a5wgcnzt overview workspace user thinksoso 2023 3 llama alpaca bentsao 1 q a scir 2 q a llama alpaca 3 q a 4 q a llama alpaca bloom based based 5 q a requirements cuda lora llama based llama lora issue 6 q a https haochun wang https github com dyr1 https github com thinksoso https github com ruibai1999 https github com rootnx https github com imsovegetable https github com 1278882181 https github com jianyuchen01 https github com flowolfzzz http homepage hit edu cn stanzhao lang zh https github com hit scir huozi facebook llama https github com facebookresearch llama stanford alpaca https github com tatsu lab stanford alpaca alpaca lora by tloen https github com tloen alpaca lora cmekg https github com king yyf cmekg tools https yiyan baidu com welcome logo citation 
[HuaTuo: Tuning LLaMA Model with Chinese Medical Knowledge](https://arxiv.org/pdf/2304.06975)

```
@misc{wang2023huatuo,
  title={HuaTuo: Tuning LLaMA Model with Chinese Medical Knowledge},
  author={Haochun Wang and Chi Liu and Nuwa Xi and Zewen Qiang and Sendong Zhao and Bing Qin and Ting Liu},
  year={2023},
  eprint={2304.06975},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

[Knowledge-tuning Large Language Models with Structured Medical Knowledge Bases for Reliable Response Generation in Chinese](https://arxiv.org/pdf/2309.04175.pdf)

```
@misc{wang2023knowledgetuning,
  title={Knowledge-tuning Large Language Models with Structured Medical Knowledge Bases for Reliable Response Generation in Chinese},
  author={Haochun Wang and Sendong Zhao and Zewen Qiang and Zijian Li and Nuwa Xi and Yanrui Du and Muzhen Cai and Haoqiang Guo and Yuhan Chen and Haoming Xu and Bing Qin and Ting Liu},
  year={2023},
  eprint={2309.04175},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

[The CALLA Dataset: Probing LLMs' Interactive Knowledge Acquisition from Chinese Medical Literature](https://arxiv.org/pdf/2309.04198.pdf)

```
@misc{du2023calla,
  title={The CALLA Dataset: Probing LLMs' Interactive Knowledge Acquisition from Chinese Medical Literature},
  author={Yanrui Du and Sendong Zhao and Muzhen Cai and Jianyu Chen and Haochun Wang and Yuhan Chen and Haoqiang Guo and Bing Qin},
  year={2023},
  eprint={2309.04198},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

| llama llm medical nlp aidoctor medgpt medqa chinese bloom huozi | ai |
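The repo's fine-tuning step reads instruction-tuning data from `data/llama_data.json`. A hedged Python sketch of round-tripping one record; the `instruction`/`input`/`output` field names assume an alpaca-style schema (the project builds on alpaca-lora) and are not confirmed by this README:

```python
import json

# Hypothetical record; the field names assume alpaca-style data.
record = {
    "instruction": "Answer the medical question.",
    "input": "What are common symptoms of hepatitis?",
    "output": "Fatigue, jaundice, and abdominal discomfort.",
}

# The training file is a JSON list of such records.
serialized = json.dumps([record], ensure_ascii=False)
loaded = json.loads(serialized)
print(loaded[0]["instruction"])
```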
VetTag | vettag introduction this is the official cleaned repo we used to train evaluate and interpret for vettag paper https www nature com articles s41746 019 0113 1 please feel free to contact yuhui zh15 mails tsinghua edu cn if you have any problem using these scripts usage unsupervised learning please create a json file in path to hypes with the following format json psvg json data dir path to data psvg encoder path path to data encoder json prefix psvg oneline label size 0 data dir and prefix save data in path to data psvg psvg oneline train tsv path to data psvg psvg oneline valid tsv and path to data psvg psvg oneline test tsv for training validation and test the file should only contain one line for the whole text encoder path save vocabulary in path to data encoder json it is a json file with format hello 0 world 1 label size for unsupervised learning label size should equal to 0 then use the following command to train and save the model in path to exp psvg python trainer py outputdir path to exp psvg train emb corpus psvg hypes path to hypes psvg json batch size 5 bptt size 600 model type transformer supervised learning please create a json file in path to hypes with the following format json csu json data dir path to data csu encoder path path to data encoder json prefix csu label size 4577 data dir and prefix save data in path to data csu csu train tsv path to data csu csu valid tsv and path to data csu csu test tsv for training validation and test the file contains lines of annotated clinical notes with format text tab label 1 space label 2 space space label k for each line encoder path save vocabulary in path to data encoder json the same file for unsupervised learning it is a json file with format hello 0 world 1 label size for supervised learning we use 4577 finegrained snomed diagnosis codes then use the following command to train and save the model in path to exp csu python trainer py outputdir path to exp csu corpus csu hypes path to hypes csu 
json batch size 5 model type transformer cut down len 600 train emb hierachical inputdir path to exp psvg pretrained model pickle external evaluation please create a json file in path to hypes with the following format json pp json data dir path to data pp encoder path path to data encoder json prefix pp label size 4577 data dir and prefix save data in path to data csu pp test tsv for test the file contains lines of annotated clinical notes with format text tab label 1 space label 2 space space label k for each line encoder path save vocabulary in path to data encoder json the same file for unsupervised learning it is a json file with format hello 0 world 1 label size for supervised learning we use 4577 finegrained snomed diagnosis codes the same for supervised learning then use the following command to evaluate the model python trainer py outputdir path to exp pp corpus pp hypes path to hypes pp json batch size 5 model type transformer cut down len 600 hierachical inputdir path to exp psvg pretrained model pickle statistics and analysis refer to jupyter snomed stat ipynb jupyter species stat ipynb jupyter length label distribution ipynb and jupyter analysis ipynb hierarchical training two files are required parents json and labels json in data dir labels json the format is snomed id 1 snomed id 2 snomed id 4577 which is all 4577 snomed labels we use parents json the format is snomed id i parent of snomed id i which is all snomed labels and their parents in the shortest path from the root node introduced in the method section interpretation refer to jupyter interpret ipynb and jupyter salient words ipynb | ai |
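The annotated-notes format described above (note text, a tab, then space-separated labels) can be sketched in a few lines of Python. The helper names and the toy label IDs here are hypothetical, not part of the repo:

```python
def encode_example(text, labels):
    # One clinical note per line: text, a tab, space-separated labels.
    return text + "\t" + " ".join(labels)

def decode_example(line):
    text, label_str = line.rstrip("\n").split("\t")
    return text, label_str.split(" ")

# Toy note with made-up label IDs.
line = encode_example("patient presents with vomiting", ["12345", "67890"])
print(decode_example(line))
```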
|
StocksProject | predicting stock market returns this repository contains the code for the portfolio project i m working on at data science retreat berlin the project aim is to build a model to predict stock market prices using a combination of machine learning algorithms the output of the prediction are the daily returns of s p 500 index i m exploring two possible different problems binary classification problem predict positive up or negative down return respect to the previous day regression problem predict the exact return more useful to feed an hypothetical trading algorithm the language i picked to implement the analysis is python numpy scipy pandas matplotlib scikit although the first exploratory stuff has been done in r the main file is stocks py the script calls several functions contained in the functions py i m actively working on the project meaning that the repo is going to be updated quite often | ai |
|
mdt | mdt a police mobile data terminal system designed for use on the fivem platform and esx framework | os |
|
mathematics-for-machine-learning-coursera

# Mathematics for Machine Learning (Coursera)

This repository contains all the quizzes/assignments for the specialization "Mathematics for Machine Learning" by Imperial College London on Coursera.

Proof of my certification can be seen [here](https://www.coursera.org/account/accomplishments/specialization/XSG9YARUPCAT).

Note: The material provided in this repository is only for helping those who may get stuck at any point of time in the course. It is strongly advised that no one should just copy the solutions (violation of Coursera honor code) presented here.

## Updates

Course 1: Linear Algebra (Completed)
- Week 1: Completed
- Week 2: Completed
- Week 3: Completed
- Week 4: Completed
- Week 5: Completed

Course 2: Multivariate Calculus (Completed)
- Week 1: Completed
- Week 2: Completed
- Week 3: Completed
- Week 4: Completed
- Week 5: Completed
- Week 6: Completed

Course 3: PCA (Completed)
- Week 1: Completed
- Week 2: Completed
- Week 3: Completed
- Week 4: Completed

| linear-algebra machine-learning eigenvalues eigenvectors principal-component-analysis multivariable-calculus coursera mathematics-machine-learning | ai |
300Days__MachineLearningDeepLearning | journey of 300daysofdata in machine learning and deep learning machinelearning https github com thinamxx 300days machinelearningdeeplearning blob main images ml jpg books and resources status of completion 1 machine learning from scratch https dafriedman97 github io mlbook content introduction html white check mark 2 a comprehensive guide to machine learning white check mark 3 hands on machine learning with scikit learn keras and tensorflow white check mark 4 speech and language processing https web stanford edu jurafsky slp3 5 machine learning crash course https developers google com machine learning crash course white check mark 6 deep learning with pytorch part i https www manning com books deep learning with pytorch white check mark 7 dive into deep learning https d2l ai white check mark 8 logistic regression documentation https ml cheatsheet readthedocs io en latest logistic regression html white check mark 9 deep learning for coders with fastai and pytorch white check mark 10 approaching almost any machine learning problem 11 pyimagesearch https www pyimagesearch com research papers 1 practical recommendations for gradient based training of deep architectures https arxiv org pdf 1206 5533 pdf projects and notebooks 1 california housing prices https github com thinamxx californiahousing prices git 2 logistic regression from scratch https github com thinamxx machinelearning algorithms blob main logisticregression logisticregression ipynb 3 implementation of lenet architecture https github com thinamxx machinelearning algorithms blob main lenetarchitecture lenetarchitecture ipynb 4 neural networks style transfer https github com thinamxx neural style transfer 5 object recognition on images cifar10 https github com thinamxx cifar10 recognition 6 dog breed identification imagenet https github com thinamxx dogbreedclassification 7 sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis 
blob master pytorch sentiment 20analysis 20dataset ipynb 8 sentiment analysis with rnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20rnn ipynb 9 sentiment analysis with cnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20cnn ipynb 10 natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb 11 natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb 12 natural language inference bert https github com thinamxx natural language inference blob main nl 20inference 20bert ipynb 13 deep convolutional gan https github com thinamxx gan blob main deep 20gan ipynb 14 fastai introduction notebook https github com thinamxx fastai blob main 1 20introduction ipynb 15 fastai image detection https github com thinamxx fastai blob main 2 20model 20production beardetector ipynb 16 fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb 17 fastai image classification https github com thinamxx fastai blob main 4 20image 20classification imageclassification ipynb 18 fastai multilabel classification regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression multilabelclassification ipynb 19 fastai image regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression regression ipynb 20 fastai advanced classification https github com thinamxx fastai blob main 6 20advanced 20classification imagenetteclassification ipynb 21 fastai collaborative filtering https github com thinamxx fastai blob main 7 20collaborative 20filtering collaborativefiltering ipynb 22 fastai tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb 23 fastai natural language 
processing https github com thinamxx fastai blob main 9 20natural 20language 20processing nlp ipynb 24 fastai data munging https github com thinamxx fastai blob main 10 20data 20munging datamunging ipynb 25 fastai language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb 26 fastai convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb 27 fastai residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb 28 fastai architecture details https github com thinamxx fastai blob main 14 20architecture 20details architectures ipynb 29 fastai training process https github com thinamxx 300days machinelearningdeeplearning blob main images day 20259 png 30 fastai neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb 31 fastai cnn interpretation with cam https github com thinamxx fastai blob main 17 20cnn 20interpretation cnn 20interpretation ipynb 32 fastai fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb 33 fastai chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb 34 supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb 35 evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb 36 opencv notebook https github com thinamxx computervision blob main 01 20opencv opencv ipynb 37 opencv project i https github com thinamxx computervision blob main 01 20opencv ocv 20project 20i ipynb 38 opencv project ii https github com thinamxx computervision blob main 01 20opencv ocv 20project 20ii ipynb 39 convolution https 
github com thinamxx computervision blob main 02 20convolutionalneuralnetwork convolutions ipynb 40 convolutional layers https github com thinamxx computervision blob main 02 20convolutionalneuralnetworks convolutional 20layers ipynb 41 fastai transformers https github com thinamxx fastai blob main 20 20transformers transformers ipynb day1 of 300daysofdata gradient descent and cross validation gradient descent is an iterative approach to approximating the parameters that minimize a differentiable loss function cross validation is a resampling procedure used to evaluate machine learning models on a limited data sample which has a parameter that splits the data into number of groups on my journey of machine learning and deep learning today i have read in brief about the fundamental topics such as calculus matrices matrix calculus random variables density functions distributions independence maximum likelihood estimation and conditional probability i have also read and implemented about gradient descent and cross validation i am starting this journey from scratch and i am following the book machine learning from scratch i have presented the implementation of gradient descent and cross validation here in the snapshots i hope you will also spend some time reading the topics from the book mentioned above i am excited about the days to come book machine learning from scratch https dafriedman97 github io mlbook content introduction html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 201 png day2 of 300daysofdata ordinary linear regression linear regression is a linear approach to modelling the relationships between a scalar response or dependent variable and one or more explanatory variables or independent variables on my journey of machine learning and deep learning today i have read and implemented about ordinary linear regression parameter estimation minimizing loss and maximizing likelihood along with the construction and 
implementation of the lr from the book machine learning from scratch i have also started reading the book a comprehensive guide to machine learning which focuses on mathematics and theory behind the topics i have read about regression ordinary least squares vector calculus orthogonal projection ridge regression feature engineering fitting ellipses polynomial features hyperparameters and validation errors and cross validation from this book i have presented the implementation of linear regression along with visualizations using python here in the snapshots i hope you will also spend some time reading the topics and books mentioned above excited about the days ahead books machine learning from scratch https dafriedman97 github io mlbook content introduction html a comprehensive guide to machine learning image https github com thinamxx 300days machinelearningdeeplearning blob main images day 202a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 202b png day3 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented about regularized regression such as ridge regression and lasso regression bayesian regression glms poisson regression along with construction and implementation of the same from the book machine learning from scratch i have also read the book a comprehensive guide to machine learning which focuses on mathematics and theory behind the topics i have read about maximum likelihood estimation or mle and maximum a posteriori or mae for regression probabilistic model bias variance tradeoff metrics bias variance decomposition alternative decomposition multivariate gaussians estimating gaussians from data weighted least squares ridge regression and generalized least squares from this book i have presented the implementation of ridge regression lasso regression along with cross validation bayesian regression and poisson regression using python here in the snapshot i hope you 
will also spend some time reading the topics and books mentioned above excited about the days ahead books machine learning from scratch https dafriedman97 github io mlbook content introduction html a comprehensive guide to machine learning image https github com thinamxx 300days machinelearningdeeplearning blob main images day 203 png day4 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented about discriminative classifiers such as binary and multiclass logistic regression the perceptron algorithm parameter estimation fishers linear discriminant and fisher criterion along with construction and implementation of the same from the book machine learning from scratch i have also read the book a comprehensive guide to machine learning which focuses on mathematics and theory behind the topics i have read about kernels and ridge regression linear algebra derivation computational analysis sparse least squares orthogonal matching pursuit total least squares low rank formulation dimensionality reduction principal component analysis projection changing coordinates minimizing reconstruction errors and probabilistic pca from this book i have presented the implementation of binary and multiclass logistic regression the perceptron algorithm and fishers linear discriminant using python here in the snapshot i hope you will also spend some time reading the topics and books mentioned above excited about the days ahead books machine learning from scratch https dafriedman97 github io mlbook content introduction html a comprehensive guide to machine learning image https github com thinamxx 300days machinelearningdeeplearning blob main images day 204 png day5 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented about generative classifiers such as linear discriminative analysis or lda quadratic discriminative analysis or qda naive bayes parameter estimation and data likelihood along with 
construction and implementation of the same from the book machine learning from scratch i have also read the book a comprehensive guide to machine learning which focuses on mathematics and theory behind the topics i have read about generative and discriminative classification bayes decision rule least squares support vector machines feature extension neural network extension binary and multiclass logistic regression loss function training multiclass extension gaussian discriminant analysis qda and lda classification and support vector machines from this book i have presented the implementation of lda qda and naive bayes along with visualizations using python here in the snapshot i hope you will also spend some time reading the topics and books mentioned above excited about the days ahead books machine learning from scratch https dafriedman97 github io mlbook content introduction html a comprehensive guide to machine learning image https github com thinamxx 300days machinelearningdeeplearning blob main images day 205 png day6 of 300daysofdata decision trees a decision tree is an interpretable machine learning for regression and classification it is a flow chart like structure in which each internal node represents a test on an attribute and each branch represents the outcome of the test on my journey of machine learning and deep learning today i have read about decision trees such as regression trees and classification trees building trees making splits and predictions hyperparameters pruning and regularization along with construction and implementation of the same from the book machine learning from scratch i have also read the book a comprehensive guide to machine learning which focuses on mathematics and theory behind the topics i have read about decision tree learning entropy and information gini impurity stopping criteria random forests boosting and adaboost gradient boosting and kmeans clustering from this book i have presented the implementation of regression 
trees and classification trees using Python here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

Books: Machine Learning From Scratch (https dafriedman97 github io mlbook content introduction html) and A Comprehensive Guide to Machine Learning.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 206 png

Day7 of 300DaysOfData!
Tree Ensemble Methods: Ensemble methods combine the outputs of multiple simple models, often called learners, in order to create a fine model with low variance. Due to their high variance, decision trees often fail to reach a level of precision comparable to other predictive algorithms, and ensemble methods minimize that variance. On my journey of machine learning and deep learning, today I have read about and implemented tree ensemble methods such as bagging for decision trees, bootstrapping, random forests and their procedure, boosting, AdaBoost for binary classification, weighted classification trees, the discrete AdaBoost algorithm, and AdaBoost for regression, along with the construction and implementation of the same from the book Machine Learning From Scratch. I have presented the implementation of bagging, random forests, and AdaBoost with different base estimators using Python here in the snapshot. I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

Book: Machine Learning From Scratch (https dafriedman97 github io mlbook content introduction html).
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 207 png

Day8 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about and implemented neural networks from the book Machine Learning From Scratch. I have read about model structure, communication between layers, activation functions such as ReLU, sigmoid, and the linear activation function, optimization, back propagation,
calculating gradients, the chain rule, and observations, and loss functions, along with the construction of the same using the loop approach and the matrix approach, and its implementation from this book. I have also read the book A Comprehensive Guide to Machine Learning, which focuses on the mathematics and theory behind the topics. From this book I have read about convolutional neural networks and layers, pooling layers, back propagation for CNNs, ResNet, and the visual understanding of CNNs. Besides, I have watched a couple of videos on neural networks and deep learning. I have presented a simple implementation of neural networks with the Functional API and the Sequential API using TensorFlow here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

Books: Machine Learning From Scratch (https dafriedman97 github io mlbook content introduction html) and A Comprehensive Guide to Machine Learning.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 208 png

Day9 of 300DaysOfData!
Reinforcement Learning: In reinforcement learning, the learning system, called an agent, can observe its environment in a particular context, select and perform actions, and get rewards in return, or penalties in the form of negative rewards. It must learn by itself what the best policy is to get the most reward over time. On my journey of machine learning and deep learning, today I have started reading and implementing from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have read briefly about the machine learning landscape, viz. types of machine learning systems such as supervised and unsupervised learning, semi-supervised learning, reinforcement learning, batch learning and online learning, and instance-based learning and model-based learning from this book. I have presented a simple implementation of linear regression and k-nearest neighbors along with a simple plot using Python here in the snapshot. I hope you will
also spend some time reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 209a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 209b png

Day10 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about the main challenges of machine learning, such as an insufficient quantity of training data, non-representative training data, poor-quality data, irrelevant features, overfitting and underfitting the training data, testing and validating, hyperparameter tuning and model selection, and data mismatch, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have started working on the California Housing Prices dataset, which is included in this book; I will build a model of housing prices in California in this project. I have presented a simple implementation of data processing and a few techniques of EDA using Python here in the snapshot. I have also presented an implementation of the Sweetviz library for analysis here. I really appreciate Chanin Nantasenamat for sharing this library in one of his videos. I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Chanin Nantasenamat's video on Sweetviz: https www youtube com watch v ur ok8vbpey lc z22itptbrzv0vfky504t1aokgq4l23pa5kermfzdyrfkbk0h00410 1605764911555430
California Housing Prices: https github com thinamxx californiahousing prices git
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2010 png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2010b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2010a png

Day11
of 300DaysOfData!
On my journey of machine learning and deep learning, today I have learned about and implemented creating categories from attributes, stratified sampling, visualizing data to gain insights, scatter plots, correlations, the scatter matrix, and attribute combinations from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have continued working with the California Housing Prices dataset, which is included in this book; this dataset is based on data from the 1990 California census. I will build a model of housing prices in California in this project, and I am still working on the same. I have presented the implementation of stratified sampling, correlations using the scatter matrix, and attribute combinations using Python here in the snapshots. I have also presented snapshots of correlations using scatter plots here. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
California Housing Prices: https github com thinamxx californiahousing prices git
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2011a png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2011b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2011c png

Day12 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have learned about and implemented preparing the data for machine learning algorithms, data cleaning, SimpleImputer, OrdinalEncoder, OneHotEncoder, feature scaling, transformation pipelines, StandardScaler, ColumnTransformer, linear regression, the decision tree regressor, and cross validation from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have continued working with the California Housing Prices dataset, which is included in this book; this dataset is based on data from the
1990 California census. I will build a model of housing prices in California in this project; the notebook contains almost every topic mentioned above. I have presented the implementation of data preparation, handling missing values, OneHotEncoder, ColumnTransformer, linear regression, and the decision tree regressor, along with cross validation, using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
California Housing Prices: https github com thinamxx californiahousing prices git
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2012a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2012b png

Day13 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have learned about and implemented the random forest regressor, ensemble learning, tuning the model, grid search, randomized search, analyzing the best models and their errors, model evaluation, cross validation, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have completed working with the California Housing Prices dataset, which is included in this book; this dataset is based on data from the 1990 California census. I have built a model using the random forest regressor on the California Housing Prices dataset to predict the price of houses in California. I have presented the implementation of the random forest regressor and tuning the model with grid search and randomized search, along with cross validation, using Python here in the snapshot. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
California Housing Prices: https github com thinamxx californiahousing prices
git
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2013 png

Day14 of 300DaysOfData!
Confusion Matrix: A confusion matrix is a better way to evaluate the performance of a classifier. The general idea of a confusion matrix is to count the number of times instances of class A are classified as class B. This approach requires having a set of predictions so that they can be compared to the actual targets. On my journey of machine learning and deep learning, today I have read about and implemented classification: training a binary classifier using stochastic gradient descent, measuring accuracy using cross validation, the implementation of CV, the confusion matrix, precision and recall and their curves, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of the SGD classifier on the MNIST dataset along with precision and recall using Python here in the snapshots. I have also presented the curves of precision and recall here. I hope you will spend some time working on the same and reading the topics and book mentioned above. I am excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2014a png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2014b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2014c png

Day15 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about and implemented the ROC curve, the random forest classifier, the SGD classifier, multiclass classification, the one-vs-one and one-vs-all strategies, cross validation, error analysis using the confusion matrix, the k-neighbors classifier, multi-output classification, noise, the precision and recall tradeoff, and a few more topics related to the same from the
book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have completed the topic of classification from this book. I have presented the implementation of the ROC curve, the random forest classifier for multiclass classification, the one-vs-one strategy, StandardScaler, error analysis, multilabel classification, and multi-output classification using Scikit-Learn here in the snapshots. I hope you will also work on the same and spend some time reading the topics and book mentioned above. I am excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2015a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2015b png

Day16 of 300DaysOfData!
Ridge Regression: Ridge regression is a regularized linear regression, viz. a regularization term is added to the cost function, which forces the learning algorithm to not only fit the data but also keep the model weights as small as possible. On my journey of machine learning and deep learning, today I have read about and implemented training the models: linear regression, the Normal Equation and computational complexity, the cost function, gradient descent such as batch gradient descent and its convergence rate, stochastic gradient descent, and mini-batch gradient descent, polynomial regression and polynomial features, learning curves, the bias and variance tradeoff, regularized linear models such as ridge regression, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of polynomial regression, learning curves, and ridge regression along with visualization using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
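A minimal sketch of the ridge regression idea described above, using scikit-learn; the synthetic data, and the names X, y, and model, are illustrative assumptions, not from the book's notebooks:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical synthetic data: a noisy quadratic signal.
rng = np.random.RandomState(42)
X = rng.uniform(-3, 3, size=(100, 1))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(scale=0.5, size=100)

# Polynomial features followed by ridge regression: the l2 penalty
# (controlled by alpha) keeps the learned weights small while fitting.
model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
model.fit(X, y)
print(model.score(X, y))  # R^2 on the training data
```

Increasing alpha strengthens the regularization, shrinking the weights further at the cost of a slightly worse fit.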
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2016 png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2016b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2016a png

Day17 of 300DaysOfData!
Elastic Net: Elastic net is a middle ground between ridge regression and lasso regression. Its regularization term is a simple mix of both ridge's and lasso's regularization terms: when the mix ratio r equals 0, it is equivalent to ridge regression, and when r equals 1, it is equivalent to lasso regression. On my journey of machine learning and deep learning, today I have read about and implemented lasso regression, elastic net, early stopping, the SGD regressor, logistic regression, estimating probabilities, the training and cost function, the sigmoid function, decision boundaries, softmax regression or multinomial logistic regression, cross entropy, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have just started reading the topic of support vector machines. I have presented a simple implementation of lasso regression, elastic net, early stopping, logistic regression, and softmax regression using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2017a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2017 png

Day18 of 300DaysOfData!
Support Vector Machines: A support vector machine, or SVM, is a very powerful and versatile machine learning model which is capable of performing linear and nonlinear classification, regression, and even outlier detection. SVMs are particularly well suited for classification of complex but medium-sized
datasets. On my journey of machine learning and deep learning, today I have read about and implemented support vector machines: linear SVM classification, soft margin classification, nonlinear SVM classification, polynomial features and the polynomial kernel, adding similarity features, the Gaussian RBF kernel, computational complexity, SVM regression (both linear and nonlinear), and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of nonlinear SVM classification using SVC and LinearSVC along with visualization using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2018a png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2018b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2018c png

Day19 of 300DaysOfData!
Voting Classifiers: Voting classifiers are classifiers which aggregate the predictions of different classifiers and predict the class that gets the most votes. The majority-vote classifier is called a hard voting classifier. On my journey of machine learning and deep learning, today I have read about and implemented ensemble learning and random forests: voting classifiers such as hard voting and soft voting classifiers, and a few more topics related to the same. I have also started working on a research project with an amazing team. I have presented the implementation of hard voting and soft voting classifiers using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same and reading the topics mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras
and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2019a png

Day20 of 300DaysOfData!
The CART Training Algorithm: Scikit-Learn's implementation uses the Classification and Regression Tree, or CART, training algorithm to train decision trees, also called growing trees. Its working principle is splitting the training set into two subsets using a feature and a threshold. On my journey of machine learning and deep learning, today I have read about and implemented decision functions and predictions, decision trees, the decision tree classifier, making predictions, Gini impurity, white box models and black box models, estimating class probabilities, the CART training algorithm, computational complexities, entropy, regularization hyperparameters, the decision tree regressor, the cost function, and instability from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented a simple implementation of the decision tree classifier and the decision tree regressor along with visualizations of the same using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2020b png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2020a png

Day21 of 300DaysOfData!
Bagging and Pasting: These refer to the approach which uses the same training algorithm for every predictor but trains them on different random subsets of the training set. When sampling is performed with replacement, it is called bagging, and when sampling is performed without replacement, it is called pasting. On my journey of machine learning and deep learning, today I have read about and implemented ensemble learning and random forests: voting classifiers, bagging
and pasting in Scikit-Learn, out-of-bag evaluation, random patches and random subspaces, random forests, extremely randomized trees ensembles, feature importance, boosting, AdaBoost, gradient boosting, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of bagging ensembles of decision trees, the random forest classifier, feature importance, the AdaBoost classifier, and gradient boosting using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2021aa png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2021a png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2021b png

Day22 of 300DaysOfData!
Manifold Learning: Manifold learning refers to the dimensionality reduction algorithms that work by modeling the manifold on which the training instances lie. It relies on the manifold hypothesis, which holds that most real-world high-dimensional datasets lie close to a much lower-dimensional manifold. On my journey of machine learning and deep learning, today I have read about and implemented gradient boosting with early stopping, stochastic gradient boosting, extreme gradient boosting or XGBoost, stacking and blending, dimensionality reduction, the curse of dimensionality, approaches to dimensionality reduction such as projection and manifold learning, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of gradient boosting with early stopping along with visualization using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same and reading the topics and book
mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2022a png

Day23 of 300DaysOfData!
Incremental PCA: Incremental PCA, or IPCA, algorithms are algorithms in which we can split the training set into mini-batches and feed an IPCA algorithm one mini-batch at a time. This is useful for large training sets and also for applying PCA online. On my journey of machine learning and deep learning, today I have read about and implemented principal component analysis or PCA: preserving the variance, principal components, projecting down to lower dimensions, the explained variance ratio, choosing the right number of dimensions, PCA for compression and decompression, the reconstruction error, randomized PCA, SVD, incremental PCA, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of PCA, randomized PCA, and incremental PCA along with visualizations using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same, and I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2023b png

Day24 of 300DaysOfData!
Clustering: Clustering algorithms are algorithms whose goal is to group similar instances together into clusters. Clustering is a great tool for data analysis, customer segmentation, recommender systems, search engines, image segmentation, dimensionality reduction, and many more. On my journey of machine learning and deep learning, today I have read about and implemented kernel principal component analysis, selecting a kernel and tuning hyperparameters, Pipeline and grid search, locally linear embedding, and dimensionality reduction techniques such as
multi-dimensional scaling, Isomap, and linear discriminant analysis, along with unsupervised learning such as clustering and the k-means clustering algorithm, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of kernel PCA with GridSearchCV and the k-means clustering algorithm along with a visualization using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2024a png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2024b png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2024c png

Day25 of 300DaysOfData!
Image Segmentation: Image segmentation is the task of partitioning an image into multiple segments. In semantic segmentation, all the pixels that are part of the same object type get assigned to the same segment. In instance segmentation, all pixels that are part of the same individual object are assigned to the same segment. On my journey of machine learning and deep learning, today I have read about and implemented the k-means algorithm: centroid initialization, accelerated k-means and mini-batch k-means, finding the optimal number of clusters, the elbow rule and the silhouette coefficient score, limitations of k-means, using clustering for image segmentation and for preprocessing such as dimensionality reduction, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of clustering algorithms for image segmentation and preprocessing along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book
mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2025a png, https github com thinamxx 300days machinelearningdeeplearning blob main images day 2025c png, and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2025b png

Day26 of 300DaysOfData!
Gaussian Mixture Model: A Gaussian mixture model, or GMM, is a probabilistic model that assumes that the instances were generated from a mixture of several Gaussian distributions whose parameters are unknown. All the instances generated from a single Gaussian distribution form a cluster that typically looks like an ellipsoid. On my journey of machine learning and deep learning, today I have read about and implemented using clustering algorithms for semi-supervised learning, active learning and uncertainty sampling, DBSCAN, agglomerative clustering, the BIRCH algorithm, the mean-shift and affinity propagation algorithms, spectral clustering, the Gaussian mixture model, the expectation-maximization algorithm, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of clustering algorithms for semi-supervised learning and DBSCAN along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2026a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2026b png

Day27 of 300DaysOfData!
Anomaly Detection: Anomaly detection, also called outlier detection, is the task of detecting instances that deviate strongly from the norm. These instances are called
anomalies, or outliers, while the normal instances are called inliers. Anomaly detection is useful in fraud detection and more. On my journey of machine learning and deep learning, today I have read about and implemented Gaussian mixture models: anomaly detection using Gaussian mixtures, novelty detection, selecting the number of clusters, the Bayesian information criterion, the Akaike information criterion, the likelihood function, Bayesian Gaussian mixture models, Fast-MCD, isolation forest, local outlier factor, one-class SVM, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have just started neural networks and deep learning from this book. I have presented the implementation of the Gaussian mixture model along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2027a png

Day28 of 300DaysOfData!
Rectified Linear Unit Function, or ReLU: It is continuous but not differentiable at 0, where the slope changes abruptly and can make gradient descent bounce around. It works very well and has the advantage of being fast to compute. On my journey of machine learning and deep learning, today I have read about and implemented an introduction to artificial neural networks with Keras: biological neurons, logical computations with neurons, the perceptron, Hebbian learning, the multilayer perceptron and backpropagation, gradient descent, the hyperbolic tangent function and the rectified linear unit function, regression MLPs, classification MLPs, the softmax activation, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of building an image classifier using the Sequential API along with visualization using
Keras here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. I am excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2028a png

Day29 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about and implemented creating the model using the Sequential API, compiling the model, the loss function and the activation function, training and evaluating the model, learning curves, using the model to make predictions, building a regression MLP using the Sequential API, building complex models using the Functional API, deep neural networks, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of building a regression MLP using the Sequential API and the Functional API here in the snapshots. I hope you will gain some insights and spend some time working on the same. I hope you will also spend some time reading and implementing the topics from the book mentioned above. I am excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2029a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2029b png

Day30 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about and implemented building complex models using the Functional API: deep neural network architecture, the ReLU activation function, handling multiple inputs in the model, the mean squared error loss function and the stochastic gradient descent optimizer, handling multiple outputs or an auxiliary output for regularization, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and
TensorFlow. I have presented the implementation of handling multiple inputs using the Keras Functional API, along with the implementation of handling multiple outputs or an auxiliary output for regularization using the same, here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time reading the topics from the book mentioned above and below. I am excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Image: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2030aa png

Day31 of 300DaysOfData!
Callbacks and Early Stopping: Early stopping is a method that allows you to specify an arbitrarily large number of training epochs and stops training once the model stops improving on the validation dataset. On my journey of machine learning and deep learning, today I have read about and implemented building dynamic models using the Subclassing API, the Sequential API, and the Functional API, saving and restoring the model using callbacks, model checkpoints, early stopping, weights and biases, and a few more topics related to the same from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of building dynamic models using the Subclassing API along with the implementation of using callbacks and early stopping here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time reading the topics from the book mentioned above and below. Excited about the days ahead!

Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow.
Images: https github com thinamxx 300days machinelearningdeeplearning blob main images day 2031a png and https github com thinamxx 300days machinelearningdeeplearning blob main images day 2031b png

Day32 of 300DaysOfData!
On my journey of machine learning and deep learning, today I have read about and implemented visualization using TensorBoard, learning
**Day32 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented about visualization using TensorBoard, learning curves, fine-tuning neural network hyperparameters, RandomizedSearchCV with Keras regressors, libraries for hyperparameter optimization such as Hyperopt and Talos, the number of hidden layers, the number of neurons per hidden layer, the learning rate, the batch size and other hyperparameters, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also spent some time reading the paper "Practical Recommendations for Gradient-Based Training of Deep Architectures", where I read about deep learning and greedy layer-wise pretraining, online learning and optimization of generalization error, and a few more topics related to the same. I have presented the implementation of tuning hyperparameters with Keras regressors and RandomizedSearchCV here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**
Paper: [Practical Recommendations for Gradient-Based Training of Deep Architectures](https://arxiv.org/pdf/1206.5533.pdf)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2032a.PNG)

**Day33 of 300DaysOfData!**

**Vanishing Gradients**: During backpropagation, the gradients often get smaller and smaller as the algorithm progresses down to the lower layers, which can prevent training from converging to a good solution. This is the vanishing gradients problem.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about training deep neural networks, the vanishing and exploding gradients problems, Glorot and He initialization, non-saturating activation functions, Batch Normalization and its implementation, the logistic and sigmoid activation functions, the SELU activation function, the ReLU activation function and its variants, Leaky ReLU and Parametric Leaky ReLU, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of Leaky ReLU and Batch Normalization here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2033a.PNG)

**Day34 of 300DaysOfData!**

**Gradient Clipping**: Gradient clipping is a technique to lessen the exploding gradients problem: it simply clips the gradients during backpropagation so that they never exceed some threshold. It is mostly used in recurrent neural networks.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about gradient clipping, Batch Normalization, reusing pretrained layers, deep neural networks and transfer learning, unsupervised pretraining, restricted Boltzmann machines, pretraining on an auxiliary task, self-supervised learning, faster optimizers, the gradient descent optimizer, momentum optimization, Nesterov accelerated gradient, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of transfer learning using Keras and the Sequential API here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2034a.PNG)
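The gradient clipping from Day34 — in Keras, the `clipvalue` or `clipnorm` argument of an optimizer — amounts to the following. A minimal plain-Python sketch on a gradient vector, not the Keras implementation; note how value clipping can change the gradient's direction while norm clipping preserves it:

```python
import math

def clip_by_value(grads, threshold):
    # Clip every component into [-threshold, threshold] independently.
    return [max(-threshold, min(threshold, g)) for g in grads]

def clip_by_norm(grads, max_norm):
    # Rescale the whole vector if its L2 norm exceeds max_norm,
    # preserving its direction (what Keras' clipnorm does).
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm:
        return [g * max_norm / norm for g in grads]
    return grads

grads = [3.0, -4.0]                      # L2 norm = 5.0
clipped_v = clip_by_value(grads, 1.0)    # components saturated
clipped_n = clip_by_norm(grads, 1.0)     # same direction, norm 1.0
```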
**Day35 of 300DaysOfData!**

**Adam Optimization**: Adam, which stands for Adaptive Moment Estimation, combines the ideas of momentum optimization and RMSProp: momentum optimization keeps track of an exponentially decaying average of past gradients, and RMSProp keeps track of an exponentially decaying average of past squared gradients.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about the AdaGrad algorithm, gradient descent, the RMSProp algorithm, Adaptive Moment Estimation or Adam optimization, AdaMax, Nadam optimization, training sparse models, dual averaging, learning rate scheduling, power scheduling, exponential scheduling, piecewise constant scheduling, performance scheduling, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of exponential scheduling and piecewise constant scheduling here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2035a.PNG)

**Day36 of 300DaysOfData!**

**Deep Neural Networks**: A default DNN configuration that works fine in most cases without much hyperparameter tuning: kernel initializer — LeCun initialization; activation function — SELU; normalization — none; regularization — early stopping; optimizer — Nadam; learning rate schedule — performance scheduling.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about avoiding overfitting through regularization, L1 and L2 regularization, dropout regularization, self-normalization, Batch Normalization, Monte Carlo dropout, max-norm regularization, activation functions such as SELU and Leaky ReLU, Nadam optimization, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of L2 regularization and dropout regularization using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2036a.PNG)

**Day37 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented about custom models and training with TensorFlow, the high-level deep learning APIs, IO and preprocessing, the lower-level deep learning APIs, deployment and optimization, TensorFlow's architecture, tensors and operations, Keras' low-level API, tensors and NumPy, sparse tensors, arrays, string tensors, custom loss functions, saving and loading models containing custom components, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also started reading the book **Speech and Language Processing**, where I read about regular expressions, text normalization, tokenization, lemmatization, stemming, sentence segmentation, edit distance, and a few more topics related to the same. I have presented a simple implementation of a custom loss function here in the snapshot. I hope you will spend some time reading the topics from the books mentioned above and below. Excited about the days ahead!

Books: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**; [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2037a.PNG)
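A custom loss function like the one presented on Day37 is just a function of targets and predictions; the book's running example for this chapter is the Huber loss, which is quadratic for small errors and linear for large ones (so outliers do not dominate training). A plain-Python sketch, not the Keras/TensorFlow version:

```python
def huber(y_true, y_pred, delta=1.0):
    """Mean Huber loss: squared error near zero, absolute error far away."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        err = t - p
        if abs(err) <= delta:
            total += 0.5 * err * err                   # quadratic region
        else:
            total += delta * (abs(err) - 0.5 * delta)  # linear region
    return total / len(y_true)

small = huber([0.0], [0.5])   # inside the quadratic region: 0.5 * 0.25
large = huber([0.0], [3.0])   # linear region: 1.0 * (3.0 - 0.5)
```

In Keras the equivalent function (written with `tf` ops instead of a loop) is simply passed as `loss=` when compiling the model.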
**Day38 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented about custom activation functions, initializers, regularizers and constraints, custom metrics, MAE and MSE, streaming metrics, custom layers, custom models, losses and metrics based on model internals, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also continued reading the book **Speech and Language Processing**, where I read about regular expressions, basic regular expression patterns, disjunction, ranges, the Kleene star, wildcard expressions, grouping and precedence, the operator hierarchy, greedy and non-greedy matching, sequences and anchors, counters, and a few more topics related to the same. I have presented the implementation of custom activation functions, initializers, regularizers, constraints and custom metrics here in the snapshots. I hope you will spend some time reading the topics from the books mentioned above and below. Excited about the days ahead!

Books: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**; [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2038a.PNG)

**Day39 of 300DaysOfData!**

**Prefetching and the Data API**: Prefetching means loading a resource before it is required, to decrease the time spent waiting for it. In other words, while the training algorithm is working on one batch, the dataset is already working in parallel on getting the next batch ready, which improves performance dramatically.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about loading and preprocessing data using TensorFlow, the Data API, chaining transformations, shuffling the dataset, gradient descent, interleaving lines from multiple files, parallelism, preprocessing the dataset, decoding, prefetching, multithreading, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of the Data API using TensorFlow here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2039a.PNG)

**Day40 of 300DaysOfData!**

**Embeddings and Representation Learning**: An embedding is a trainable dense vector that represents a category. The better the representation of the categories, the easier it is for the neural network to make accurate predictions, so embeddings must be useful representations of the categories. This is called representation learning.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about the Features API, column transformers, numerical and categorical features, crossed categorical features, encoding categorical features using one-hot vectors and embeddings, representation learning, word embeddings, using feature columns for parsing, using feature columns in models, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of the Features API on numerical and categorical columns, along with parsing, here in the snapshots. I hope you will spend some time reading the topics from the book mentioned above and below. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2040a.PNG)
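The embeddings described on Day40 are, mechanically, just rows of a trainable matrix selected by category index. A minimal NumPy sketch — the vocabulary, dimension and random values are made up for illustration; in Keras this lookup is an `Embedding` layer whose rows are adjusted during training:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["rock", "jazz", "classical", "pop"]   # hypothetical categories
embedding_dim = 3

# One trainable row per category. Training would nudge these values so
# that similar categories end up with nearby vectors (representation learning).
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

def embed(categories):
    """Look up the dense vector for each category name."""
    ids = [vocab.index(c) for c in categories]
    return embedding_matrix[ids]               # plain row lookup

batch = embed(["jazz", "pop"])                 # shape (2, embedding_dim)
```

Contrast with one-hot encoding: a one-hot vector is as long as the vocabulary and carries no notion of similarity, while an embedding is short, dense, and learned.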
**Day41 of 300DaysOfData!**

**Convolutional Layer**: The most important building block of a CNN is the convolutional layer. Neurons in the first convolutional layer are not connected to every single pixel in the input image, but only to pixels in their receptive fields. Similarly, each neuron in the second convolutional layer is connected only to neurons located within a small rectangle in the first layer.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about deep computer vision using convolutional neural networks, the architecture of the visual cortex, convolutional layers, zero padding, filters, stacking multiple feature maps, padding, memory requirements, pooling layers, invariance, convolutional neural network architectures, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of a convolutional neural network architecture here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2041a.PNG)

**Day42 of 300DaysOfData!**

**ResNet Model**: Residual Network or ResNet won the ILSVRC 2015 challenge. Developed by Kaiming He et al., it is an extremely deep CNN composed of 152 layers. The network uses skip connections, also called shortcut connections: the signal feeding into a layer is also added to the output of a layer located a bit higher up the stack.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about the LeNet-5 architecture, the AlexNet CNN architecture, data augmentation, local response normalization, the GoogLeNet architecture and the Inception module, VGGNet, Residual Network or ResNet, residual learning, Xception or Extreme Inception, the Squeeze-and-Excitation Network (SENet), and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of a ResNet-34 CNN using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2042a.PNG)

**Day43 of 300DaysOfData!**

**Xception Model**: Xception, which stands for Extreme Inception, is a variant of the GoogLeNet architecture proposed in 2016 by François Chollet. It merges the ideas of GoogLeNet and ResNet, but replaces the Inception modules with a special type of layer called a depthwise separable convolution.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about using pretrained models from Keras, GoogLeNet and Residual Network or ResNet, ImageNet, pretrained models for transfer learning, the Xception model, convolutional neural networks, batching, prefetching, global average pooling, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of pretrained models such as ResNet and Xception for transfer learning here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2043a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2043b.PNG)
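The skip connection behind ResNet (Day42) can be sketched in NumPy: the block adds its input to its output, so with zero weights the block computes the identity, which is why stacking many such blocks does not degrade the signal. A toy dense version — real ResNets use convolutions and Batch Normalization, so this is only the arithmetic of the shortcut:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """y = relu(x @ w1) @ w2 + x : the input 'skips ahead' and is added
    to the block's output, so the layers only learn a residual."""
    return relu(x @ w1) @ w2 + x

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))        # a batch of 4 feature vectors
w1 = rng.normal(size=(8, 8))

# With w2 = 0 the block is exactly the identity: early in training,
# a deep stack of such blocks still passes the signal through untouched.
identity_out = residual_block(x, w1, np.zeros((8, 8)))
general_out = residual_block(x, w1, rng.normal(size=(8, 8)))
```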
**Day44 of 300DaysOfData!**

**Semantic Segmentation**: In semantic segmentation, each pixel is classified according to the class of the object it belongs to, but different objects of the same class are not distinguished.

On my journey of Machine Learning and Deep Learning, today I have read and implemented about classification and localization, crowdsourcing in computer vision, the Intersection over Union (IoU) metric, object detection, fully convolutional networks (FCNs), valid padding, the You Only Look Once (YOLO) architecture, mean Average Precision (mAP), convolutional neural networks, semantic segmentation, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have just completed learning from this book. I have presented the implementation of classification and localization, along with the visualization, here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2044a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2044b.PNG)

**Day45 of 300DaysOfData!**

**Empirical Risk Minimization**: Training a model means learning good values for all the weights and biases from labeled examples. In supervised learning, a machine learning algorithm builds a model by examining many examples and attempting to find a model that minimizes loss; this is called empirical risk minimization.

On my journey of Machine Learning and Deep Learning, today I have started learning from the Machine Learning Crash Course of Google. Here I have learned about the machine learning philosophy, the fundamentals and uses of machine learning, labels and features, labeled and unlabeled examples, models and inference, regression and classification, linear regression, weights and bias, training and loss, empirical risk minimization, mean squared error or MSE, reducing loss, gradient descent, and a few more topics related to the same. I have presented a simple implementation of a basic recurrent neural network here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the course mentioned above and below. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2045.PNG)

**Day46 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have learned and implemented from the Machine Learning Crash Course of Google about the learning rate or step size, hyperparameters in machine learning algorithms, regression, gradient descent, optimizing the learning rate, stochastic gradient descent or SGD, batch and batch size, mini-batch stochastic gradient descent, convergence, the hierarchy of TensorFlow toolkits, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**, where I read about regular expressions and patterns, precision and recall, the Kleene star, aliases for common characters, RE operators for counting, and a few more topics related to the same. I have presented a simple implementation of a recurrent neural network and a deep RNN using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)
Book: [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2046.PNG)
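The empirical-risk-minimization loop from Day45 — compute the MSE, follow its gradient downhill at the chosen learning rate (Day46) — fits in a few lines. A framework-free sketch for a linear model `y = w*x + b`; the data and learning rate are illustrative:

```python
def mse(w, b, xs, ys):
    """Mean squared error of the linear model w*x + b on the dataset."""
    n = len(xs)
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / n

def gd_step(w, b, xs, ys, lr=0.1):
    """One gradient-descent update: move against the gradient of the MSE."""
    n = len(xs)
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    return w - lr * dw, b - lr * db

xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 5.0]   # generated by y = 2x + 1
w, b = 0.0, 0.0
before = mse(w, b, xs, ys)
for _ in range(100):
    w, b = gd_step(w, b, xs, ys)
after = mse(w, b, xs, ys)                    # loss shrinks toward 0
```

Too large a learning rate makes the updates overshoot and diverge; too small a rate converges needlessly slowly — the trade-off the course describes.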
**Day47 of 300DaysOfData!**

**Feature Vector and Feature Engineering**: Feature engineering means transforming raw data into a feature vector, the set of floating-point values comprising the examples of the dataset.

On my journey of Machine Learning and Deep Learning, today I have learned and implemented from the Machine Learning Crash Course of Google about generalization of a model, overfitting, gradient descent and loss, statistical and computational learning theories, stationarity of data, splitting the data and the validation set, representation and feature engineering, feature vectors, categorical features and vocabulary, one-hot encoding and sparse representation, the qualities of good features, and a few more topics related to the same. I have presented a simple implementation of an RNN along with a GRU here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2047.PNG)

**Day48 of 300DaysOfData!**

**Scaling Features**: Scaling means converting floating-point feature values from their natural range into a standard range such as 0 to 1. If the feature set contains multiple features, feature scaling helps gradient descent converge more quickly.

On my journey of Machine Learning and Deep Learning, today I have learned and implemented from the Machine Learning Crash Course of Google about scaling feature values, handling extreme outliers, binning, scrubbing the data, standard deviation, feature crosses and synthetic features, encoding nonlinearity, stochastic gradient descent, cross products, crossing one-hot vectors, regularization for simplicity, the generalization curve, L2 regularization, early stopping, lambda and the learning rate, and a few more topics related to the same. I have presented a simple implementation of a linear regression model using the Sequential API here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2048.PNG)

**Day49 of 300DaysOfData!**

**Prediction Bias**: Prediction bias is a quantity that measures how far apart the average of predictions is from the average of labels in the dataset. Prediction bias is a completely different quantity than the bias term of a model.

On my journey of Machine Learning and Deep Learning, today I have learned and implemented from the Machine Learning Crash Course of Google about logistic regression and calculating probability, the sigmoid function, binary classification, log loss and regularization, early stopping, L1 and L2 regularization, classification and thresholding, the confusion matrix, class imbalance and accuracy, precision and recall, the ROC curve, the area under the curve or AUC, prediction bias, calibration layers, bucketing, sparsity, feature crosses and one-hot encoding, and a few more topics related to the same. I have presented a simple implementation of normalization and binary classification using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2049.PNG)
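The quantities from Day49 — the sigmoid, log loss, and prediction bias — each fit in a line or two of plain Python. A sketch with illustrative numbers, not a library implementation:

```python
import math

def sigmoid(z):
    """Squash a logit into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(y_true, y_prob, eps=1e-12):
    """Average binary cross-entropy; eps guards against log(0)."""
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

def prediction_bias(y_true, y_prob):
    """Average predicted probability minus average observed label."""
    return sum(y_prob) / len(y_prob) - sum(y_true) / len(y_true)

probs = [0.9, 0.8, 0.3, 0.2]
labels = [1, 1, 0, 0]
bias = prediction_bias(labels, probs)   # 0.55 - 0.50 = 0.05
```

A well-calibrated model has prediction bias near zero; a large value signals a bug, a skewed training set, or overly strong regularization rather than something a calibration layer should paper over.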
**Day50 of 300DaysOfData!**

**Categorical Data and Sparse Tensors**: Categorical data refers to input features that represent one or more discrete items from a finite set of choices. Sparse tensors are tensors with very few non-zero elements.

On my journey of Machine Learning and Deep Learning, today I have learned and implemented from the Machine Learning Crash Course of Google about neural networks, hidden layers and activation functions, nonlinear classification and feature crosses, the sigmoid function, the rectified linear unit or ReLU, backpropagation, vanishing and exploding gradients, dropout regularization, multi-class neural networks, softmax, logistic regression, embeddings, collaborative filtering, sparse features, principal component analysis, word2vec, and a few more topics related to the same. I have presented a simple implementation of deep neural networks for multi-class classification here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Course: [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2050.PNG)

**Day51 of 300DaysOfData!**

**Deep Learning**: Deep learning is a general class of algorithms that falls under artificial intelligence and deals with training mathematical entities named deep neural networks by presenting instructive examples. It uses large amounts of data to approximate complex functions.

On my journey of Machine Learning and Deep Learning, today I have started reading and implementing from the book **Deep Learning with PyTorch**. Here I have learned about core PyTorch, the deep learning introduction and revolution, tensors and arrays, the deep learning competitive landscape, utility libraries, a pretrained neural network that recognizes the subject of an image, ImageNet, image recognition, AlexNet and ResNet, the torchvision module, and a few more topics related to the same. I have presented the implementation of obtaining pretrained neural networks for image recognition using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2051.PNG)

**Day52 of 300DaysOfData!**

**The GAN Game**: GAN stands for Generative Adversarial Network, where generative means something is being created, adversarial means the two neural networks are competing to outsmart each other, and, well, network means neural networks. A CycleGAN can turn images of one domain into images of another domain without the need to explicitly provide matching pairs in the training set.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about pretrained models, the generative adversarial network or GAN, ResNet generator and discriminator models, the CycleGAN architecture, the torchvision module, deep fakes, a neural network that turns horses into zebras, and a few more topics related to the same. I have presented the implementation of a CycleGAN that turns horses into zebras using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052c.PNG)

**Day53 of 300DaysOfData!**

**Tensors and Multidimensional Arrays**: Tensors are the fundamental data structure in PyTorch. A tensor is an array: a data structure that stores a collection of numbers which are accessible individually by index, and which can be indexed with multiple indices.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about a pretrained neural network that describes scenes, the NeuralTalk2 model, recurrent neural networks, Torch Hub, the fundamental building block — tensors, the world as floating-point numbers, multidimensional arrays and tensors, lists and indexing tensors, named tensors, einsum, broadcasting, and a few more topics related to the same. I have presented a simple implementation of indexing tensors and named tensors using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2053.PNG)
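The einsum and broadcasting mentioned on Day53 behave in PyTorch essentially as they do in NumPy (`torch.einsum` mirrors `np.einsum`), so a NumPy sketch carries over directly:

```python
import numpy as np

a = np.arange(6.0).reshape(2, 3)    # [[0, 1, 2], [3, 4, 5]]
b = np.arange(12.0).reshape(3, 4)

# Broadcasting: the (3,) row vector is virtually stretched across
# both rows of `a` without copying any data.
shifted = a + np.array([10.0, 20.0, 30.0])

# einsum spells out the index contraction of a matrix product:
# out[i, k] = sum_j a[i, j] * b[j, k]
product = np.einsum("ij,jk->ik", a, b)
```

Reading the subscript string makes the contraction explicit, which is why einsum is handy once tensors grow past two dimensions and `@` alone becomes ambiguous.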
**Day54 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about named tensors, changing the names of named tensors, broadcasting tensors, unnamed dimensions, tensor element types, specifying the numeric data type, the tensor API, creation operations, indexing, random sampling, serialization, parallelism, tensor storage, referencing storage, indexing into storage, and a few more topics related to the same. I have presented a simple implementation of named tensors, tensor datatype attributes and the tensor API using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2054.PNG)

**Day55 of 300DaysOfData!**

**Encoding Color Channels**: The most common way to encode colors into numbers is RGB, where a color is defined by three numbers representing the intensity of red, green and blue.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about tensor metadata such as size, offset and stride, transposing tensors without copying, transposing in higher dimensions, contiguous tensors, managing a tensor's device attribute such as moving to GPU and CPU, NumPy interoperability, generalized tensors, serializing tensors, data representation using tensors, working with images, adding color channels, changing the layout, and a few more topics related to the same. I have presented the implementation of working with images, such as changing the layout with the permute method, along with contiguous tensors using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2055.PNG)

**Day56 of 300DaysOfData!**

**Continuous, Ordinal and Categorical Values**: Continuous values can be counted and measured, and carry units. Ordinal values are ordered, but the intervals between them have no fixed meaning. Categorical values are enumerations of possibilities.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about normalizing the image data, working with 3D or volumetric image data, representing tabular data, loading data tensors using NumPy, continuous values, ordinal values, categorical values, the ratio scale and interval scale, the nominal scale, one-hot encoding and embeddings, singleton dimensions, and a few more topics related to the same. I have presented the implementation of normalizing image data, volumetric data, tabular data and one-hot encoding using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2056.PNG)
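Day56's one-hot encoding and normalization, sketched with NumPy. (The book does the one-hot step with PyTorch's `scatter_` and normalizes images with per-channel statistics; this is the same idea in miniature.)

```python
import numpy as np

def one_hot(labels, num_classes):
    """Row i is all zeros except a 1 at column labels[i]."""
    return np.eye(num_classes)[labels]       # fancy-index the identity matrix

def normalize(x):
    """Zero-mean, unit-variance scaling, as applied to image channels."""
    return (x - x.mean()) / x.std()

codes = one_hot([0, 2, 1], num_classes=3)
pixels = normalize(np.array([0.0, 127.5, 255.0]))   # mean 0, std 1 after
```

One-hot encoding suits categorical values precisely because it imposes no ordering: every pair of category vectors is equally far apart.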
here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book deep learning with pytorch https www manning com books deep learning with pytorch image https github com thinamxx 300days machinelearningdeeplearning blob main images day 2056 png day57 of 300daysofdata continuous ordinal and categorical values continuous values are the values which can be counted and measured along with units ordinal values are the continuous values with no fixed relationships between values categorical values are the enumerations of possibilities on my journey of machine learning and deep learning today i have read and implemented from the book deep learning with pytorch here i have learned about continuous and categorical data pytorch tensor api finding thresholds in tabular data advanced indexing working with time series data adding time dimension in data shaping the data by time period tensors and arrays and few more topics related to the same from here i have presented the implementation of working with categorical data time series data and finding thresholds using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book deep learning with pytorch https www manning com books deep learning with pytorch image https github com thinamxx 300days machinelearningdeeplearning blob main images day 2057 png day58 of 300daysofdata encoding and ascii every written characters is represented by a code which refers to a sequence of bits of appropriate length so that each character can be uniquely identified and it is called encoding on my journey of machine learning and deep learning today i have read and implemented from the book deep learning with pytorch here i have learned about working with 
time-series data, ordinal variables, one-hot encoding and concatenation, unsqueeze and singleton dimensions, mean, standard deviation, and rescaling variables, text representation, natural language processing and recurrent neural networks, converting text into numbers, the Project Gutenberg corpus, one-hot encoding of characters, encoding and ASCII, embeddings, and processing the text, and a few more topics related to the same from here. I have presented the implementation of time-series data and text representation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2058.PNG)

**Day59 of 300DaysOfData!**

**Loss Function**: A loss function computes a single numerical value that the learning process will attempt to minimize. The calculation of loss typically involves taking the difference between the desired outputs and the actual outputs for some training samples.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about one-hot encoding and vectors, data representation using tensors, text embeddings, natural language processing, the mechanics of learning, Johannes Kepler's lesson in modeling, eccentricity, parameter estimation, weights, biases, and gradients, the simple linear model, the loss function or cost function, mean squared loss, broadcasting, and a few more topics related to the same from here. I have presented the simple implementation of representing text, the mechanics of learning, and a simple linear model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited
about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2059.PNG)

**Day60 of 300DaysOfData!**

**Gradient Descent**: Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. Simply put, the gradient is the vector of derivatives of the function with respect to each parameter.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the cost function or loss function, optimizing parameters using gradient descent, decreasing the loss function, parameter estimation, the mechanics of learning, the scaling factor and learning rate, evaluation of the model, computing the derivative of the loss function and of the linear function, defining a gradient function, partial derivatives and iterating the model, the training loop, and a few more topics related to the same from here. I have presented the implementation of the loss function, computing derivatives, the gradient function, and the training loop here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2060.PNG)

**Day61 of 300DaysOfData!**

**Hyperparameter Tuning**: Training refers to the optimization of a model's parameters, whereas hyperparameters control how the training goes; choosing them is called hyperparameter tuning, and they are generally set manually.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about gradient descent, optimizing the training loop, overtraining, convergence and divergence, the learning rate, hyperparameter tuning, normalizing the
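The hand-written gradient descent from Day60 can be sketched as follows; the data, the learning rate, and the number of epochs are made up for illustration, not the book's thermometer example:

```python
import torch

def model(t_u, w, b):           # simple linear model
    return w * t_u + b

def loss_fn(t_p, t_c):          # mean squared error loss
    return ((t_p - t_c) ** 2).mean()

def grad_fn(t_u, t_c, t_p, w, b):
    # Analytic gradients of the loss w.r.t. w and b (chain rule).
    dloss_dtp = 2 * (t_p - t_c) / t_p.size(0)
    dloss_dw = (dloss_dtp * t_u).sum()
    dloss_db = dloss_dtp.sum()
    return torch.stack([dloss_dw, dloss_db])

# Hypothetical measurements; the true relation is 2 * t_u + 1.
t_u = torch.tensor([1.0, 2.0, 3.0, 4.0])
t_c = torch.tensor([3.0, 5.0, 7.0, 9.0])

params = torch.tensor([1.0, 0.0])
learning_rate = 1e-2
for epoch in range(5000):
    w, b = params
    t_p = model(t_u, w, b)
    params = params - learning_rate * grad_fn(t_u, t_c, t_p, w, b)
print(params)  # approaches w ≈ 2, b ≈ 1
```

Each step moves the parameters against the gradient, scaled by the learning rate.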
inputs, visualization or plotting the data, argument unpacking, PyTorch's autograd and backpropagation, the chain rule, the linear model, and a few more topics related to the same from here. I have presented the simple implementation of the training loop and gradient descent, along with visualization, using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2061.PNG)

**Day62 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about gradient descent, PyTorch's autograd and backpropagation, the chain rule and tensors, the grad attribute and parameters, a simple linear function and a simple loss function, accumulating grad functions, zeroing the gradients, an autograd-enabled training loop, optimizers and vanilla gradient descent, the optim submodule of torch, and a few more topics related to the same from here. I have presented the simple implementation of a linear model and loss function and an autograd-enabled training loop using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2062.PNG)

**Day63 of 300DaysOfData!**

**Stochastic Gradient Descent**: The term stochastic in stochastic gradient descent (SGD) comes from the fact that the gradient is typically obtained by averaging over a random subset of all input samples.

On my journey of Machine Learning and Deep Learning, today I have read and
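The autograd-enabled training loop with the `optim` submodule, as described for Day62, can be sketched like this; the data is made up (targets follow 2x + 1), and the hyperparameters are illustrative:

```python
import torch
import torch.optim as optim

# Hypothetical data; the true relation is 2 * x + 1.
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = torch.tensor([3.0, 5.0, 7.0, 9.0])

params = torch.tensor([1.0, 0.0], requires_grad=True)
optimizer = optim.SGD([params], lr=1e-2)

for epoch in range(5000):
    y_pred = params[0] * x + params[1]
    loss = ((y_pred - y) ** 2).mean()
    optimizer.zero_grad()   # zero the accumulated gradients
    loss.backward()         # autograd fills params.grad
    optimizer.step()        # vanilla gradient descent update
print(params)  # approaches w ≈ 2, b ≈ 1
```

`zero_grad()` matters because PyTorch accumulates gradients into `.grad` rather than overwriting them.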
implemented from the book **Deep Learning with PyTorch**. Here, I have learned about optimizers, vanilla gradient descent optimization, stochastic gradient descent, the momentum argument, minibatches, the learning rate and params, the optim module, neural network models, the Adam optimizer, backpropagation, optimizing weights, training, validation, and overfitting, evaluating the training loss, generalizing to the validation set, overfitting and penalization terms, and a few more topics related to the same from here. I have presented the implementation of the SGD and Adam optimizers, along with the training loop, here in the snapshots; it is the continuation of the previous snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2063.PNG)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2063a.PNG)

**Day64 of 300DaysOfData!**

**Activation Functions**: Activation functions are nonlinear, which allows the overall network to approximate more complex functions, and they are differentiable, so that gradients can be computed through them.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I am learning to use a neural network to fit the data: artificial neurons, the learning process and loss function, nonlinear activation functions, weights and biases, composing a multilayer network, understanding the error function, capping and compressing the output range, Tanh and ReLU activations, choosing the activation functions, the PyTorch nn module, and a few more topics related to the same from here. I have presented the simple implementation of a linear model and training loop using PyTorch here in the snapshot. I hope you will gain some
insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2064.PNG)

**Day65 of 300DaysOfData!**

**Activation Functions**: Activation functions are nonlinear, which allows the overall network to approximate more complex functions, and they are differentiable, so that gradients can be computed through them.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the PyTorch nn module, a simple linear model, batching input data, optimizing batches, the mean squared error loss function, the training loop, neural networks, the sequential model, the Tanh activation function, inspecting parameters, weights and biases, the OrderedDict module, comparing to the linear model, overfitting, and a few more topics related to the same from here. I have presented the simple implementation of a sequential model and the OrderedDict submodule using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2065.PNG)

**Day66 of 300DaysOfData!**

**Computer Vision**: Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. It seeks to understand and automate tasks that the human visual system can do.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have started the new topic: learning from images. I have
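The sequential model with the OrderedDict submodule from Day65 can be sketched as follows; the layer names and sizes are made up for illustration:

```python
from collections import OrderedDict

import torch.nn as nn

# Naming the layers makes the model's parameters easier to inspect.
seq_model = nn.Sequential(OrderedDict([
    ('hidden_linear', nn.Linear(1, 8)),
    ('hidden_activation', nn.Tanh()),
    ('output_linear', nn.Linear(8, 1)),
]))

for name, param in seq_model.named_parameters():
    print(name, tuple(param.shape))
```

The named submodules are also reachable as attributes, e.g. `seq_model.output_linear.weight`.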
learned about simple image recognition, CIFAR-10, which is a dataset of tiny images, the torchvision module, the Dataset class, iterable datasets, the Python Imaging Library (PIL) package, dataset transforms, arrays and tensors, the permute function, and a few more topics related to the same. I have presented the simple implementation of the torchvision module along with the CIFAR-10 dataset using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2066.PNG)

**Day67 of 300DaysOfData!**

**Computer Vision**: Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the permutation function, normalizing the data, stacking, mean and standard deviation, the torchvision module and submodules, the CIFAR-10 dataset, the PIL package, image recognition, building the dataset, building a fully connected neural network model, the sequential model, a simple linear model, classification and regression problems, one-hot encoding and softmax, and a few more topics related to the same from here. I have presented the implementation of normalizing the data, building the dataset, and a neural network model using torchvision modules here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/
Images/Day%2067.PNG)

**Day68 of 300DaysOfData!**

**Softmax Function**: The softmax function takes a vector of values and produces another vector of the same dimension, where the values satisfy the constraints of probabilities. Softmax is a monotone function, so lower values in the input correspond to lower values in the output.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about representing the output as probabilities and the softmax function, PyTorch's nn module, backpropagation, a loss for classification, MSE loss, negative log likelihood (NLL) loss, the log softmax function, training the classifier, stochastic gradient descent, hyperparameters, minibatches, and a few more topics related to the same from here. I have presented the implementation of the softmax function, building a neural network model, and the training loop using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2068.PNG)

**Day69 of 300DaysOfData!**

**Cross-Entropy Loss**: Cross-entropy loss is the negative log likelihood of the predicted distribution under the target distribution of outcomes. The combination of the log softmax function and the NLL loss function is equivalent to using cross-entropy loss.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about gradient descent, minibatches and the data loader, stochastic gradient descent, the neural network model, the log softmax function, the NLL loss function, the cross-entropy loss function, trainable parameters, weights and biases, translation invariance, data augmentation, the torchvision and
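The equivalence of log softmax plus NLL loss and cross-entropy loss, stated under Day69, can be checked numerically; a minimal sketch with random logits and made-up targets:

```python
import torch
import torch.nn as nn

# Fake logits for a batch of 4 samples over 10 classes, plus fake targets.
logits = torch.randn(4, 10)
targets = torch.tensor([1, 0, 7, 3])

log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss()(log_probs, targets)
ce = nn.CrossEntropyLoss()(logits, targets)
print(nll, ce)  # the two losses match

# Softmax outputs behave like probabilities: non-negative and summing to 1.
probs = nn.Softmax(dim=1)(logits)
print(probs.sum(dim=1))
```

This is why models trained with `CrossEntropyLoss` take raw logits rather than probabilities.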
nn modules, and a few more topics related to the same from here. I have presented the implementation of building a deep neural network, the training loop, and model evaluation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2069.PNG)

**Day70 of 300DaysOfData!**

**Translational Invariance**: Translational invariance makes a convolutional neural network invariant to translation, which means that if we translate the inputs, the CNN will still be able to detect the class to which the inputs belong.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have started reading the topic of using convolutions to generalize. I have learned about convolutional neural networks, translation invariance, weights and biases, discrete cross-correlations, locality or local operations on neighborhood data, model parameters, multichannel images, padding the boundary, kernel size, detecting features with convolutions, and a few more topics related to the same. I have presented the simple implementation of a CNN and building the data using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2070.PNG)

**Day71 of 300DaysOfData!**

**Downsampling**: Downsampling is the scaling of an image by half, which is the equivalent of taking four neighboring pixels as input and producing one pixel as output.
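The downsampling definition above can be demonstrated with a max pooling layer; a minimal sketch on a fake image batch:

```python
import torch
import torch.nn as nn

img = torch.randn(1, 3, 32, 32)   # fake batch of one RGB image

pool = nn.MaxPool2d(2)            # 2x2 window, stride 2: keeps the max of each 4-pixel patch
print(pool(img).shape)            # halves height and width: (1, 3, 16, 16)
```

Average pooling (`nn.AvgPool2d`) implements the same principle by averaging the four neighbors instead of taking their maximum.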
The downsampling principle can be implemented in different ways.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about kernel size, padding the image, the edge-detection kernel, locality and translation invariance, the learning rate and weight updates, the max pooling layer and downsampling, stride, convolutional neural networks, the receptive field, the Tanh activation function, a simple linear model, the sequential model, the parameters of the model, and a few more topics related to the same from here. I have presented the implementation of a convolutional neural network, plotting the image, and inspecting the parameters of the model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2071.PNG)

**Day72 of 300DaysOfData!**

**Downsampling**: Downsampling is the scaling of an image by half, which is the equivalent of taking four neighboring pixels as input and producing one pixel as output. The downsampling principle can be implemented in different ways.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about subclassing the nn module, the sequential or modular API, the forward function, the linear model, the max pooling layer, padding the data, convolutional neural network architectures, ResNet, kernel sizes and attributes, the Tanh activation function, model parameters, the functional API, stateless modules, and a few more topics related to the same from here. I have presented the implementation of subclassing the nn module using the sequential API and the functional API using PyTorch here in the snapshot. I hope you will gain some
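The kernel size, padding, and parameter-count ideas from Day71 can be sketched as follows; the channel counts are made up for illustration:

```python
import torch
import torch.nn as nn

img = torch.randn(1, 3, 32, 32)

# padding=1 with a 3x3 kernel preserves the spatial size.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
print(conv(img).shape)    # (1, 16, 32, 32)

# Parameter count: 16 kernels of shape 3x3x3 plus 16 biases.
n_params = sum(p.numel() for p in conv.parameters())
print(n_params)           # 16*3*3*3 + 16 = 448
```

Note how the parameter count depends only on the kernel and channel sizes, not on the image resolution; that locality is what makes convolutions so much cheaper than fully connected layers.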
insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2072.PNG)

**Day73 of 300DaysOfData!**

**Downsampling**: Downsampling is the scaling of an image by half, which is the equivalent of taking four neighboring pixels as input and producing one pixel as output. The downsampling principle can be implemented in different ways.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the torch.nn module, the functional API, convolutional neural networks and their training, the data loader module, the forward and backward passes of the network, the stochastic gradient descent optimizer, zeroing the gradients, the cross-entropy loss function, model evaluation, gradient descent, and a few more topics related to the same from here. I have presented the implementation of the training loop and model evaluation using PyTorch here in the snapshot; actually, it is the continuation of yesterday's snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2073.PNG)

**Day74 of 300DaysOfData!**

**Downsampling**: Downsampling is the scaling of an image by half, which is the equivalent of taking four neighboring pixels as input and producing one pixel as output. The downsampling principle can be implemented in different ways, such as max pooling.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have
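The subclassing pattern from Day72, mixing parameterized layers as attributes with the stateless functional API in `forward`, can be sketched like this; the architecture and sizes are illustrative, not the book's exact model:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Layers with parameters are registered as attributes...
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(8 * 8 * 8, 32)
        self.fc2 = nn.Linear(32, 2)

    def forward(self, x):
        # ...while stateless operations use the functional API.
        out = F.max_pool2d(torch.tanh(self.conv1(x)), 2)   # 32x32 -> 16x16
        out = F.max_pool2d(torch.tanh(self.conv2(out)), 2) # 16x16 -> 8x8
        out = out.view(-1, 8 * 8 * 8)                      # flatten
        out = torch.tanh(self.fc1(out))
        return self.fc2(out)

model = Net()
print(model(torch.randn(1, 3, 32, 32)).shape)  # (1, 2)
```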
learned about saving and loading the model, the weights and parameters of the model, training the model on a GPU, the torch.nn module and submodules, the map_location keyword, designing the model, long short-term memory (LSTM), adding memory capacity or width to the network, feed-forward networks, overfitting, and a few more topics related to the same from here. I have presented the implementation of adding memory capacity or width to the network using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2074.PNG)

**Day75 of 300DaysOfData!**

**L2 Regularization**: L2 regularization is the sum of the squares of all the weights in the model, whereas L1 regularization is the sum of the absolute values of all the weights in the model. L2 regularization is also referred to as weight decay.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about convolutional neural networks, L2 regularization and L1 regularization, optimization and generalization, weight decay, the PyTorch nn module and submodules, the stochastic gradient descent optimizer, overfitting and dropout, deep neural networks, randomization, and a few more topics related to the same from here. I have presented the implementation of L2 regularization and the dropout layer using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/
Images/Day%2075.PNG)

**Day76 of 300DaysOfData!**

**L2 Regularization**: L2 regularization is the sum of the squares of all the weights in the model, whereas L1 regularization is the sum of the absolute values of all the weights in the model. L2 regularization is also referred to as weight decay.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the dropout module, batch normalization and nonlinear activation functions, regularization and principled augmentation, convolutional neural networks, minibatches and standard deviation, deep neural networks and the depth module, the skip-connection mechanism, the ReLU activation function, implementation with the functional API, and a few more topics related to the same from here. I have presented the implementation of batch normalization and deep neural networks and the depth module using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2076.PNG)

**Day77 of 300DaysOfData!**

**Identity Mapping**: When the output of the first activations is used as the input of the last, in addition to the standard feed-forward path, it is called an identity mapping. Identity mappings alleviate the issue of vanishing gradients.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about convolutional neural networks, skip connections, the ResNet architecture, the simple linear layer, the max pooling layer, identity mapping, highway networks, the UNet model, dense networks and very deep neural networks, the sequential and functional APIs, forward propagation and backpropagation, the torchvision module and submodules, the batch
normalization layer, custom initializations, and a few more topics related to the same from here. I have presented the implementation of the ResNet architecture and very deep neural networks using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2077a.PNG)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2077b.PNG)

**Day78 of 300DaysOfData!**

**Voxel**: A voxel is the 3D equivalent of the familiar 2D pixel. It encloses a volume of space rather than an area.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the CT scan dataset, voxels, segmentation, grouping and classification, nodules, 3D convolutional neural networks, downloading the LUNA dataset, data loading, parsing the data, the training and validation sets, and a few more topics related to the same from here. I have started working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. The LUNA Grand Challenge is the combination of an open dataset with high-quality labels of patient CT scans, many with lung nodules, and a public ranking of classifiers against the data. I have presented the implementation of preparing the data using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2078.PNG)

**Day79 of 300DaysOfData!**

On my journey of Machine Learning and
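The skip connections and identity mapping described for Day77 can be sketched as a minimal residual block in the spirit of ResNet; the channel count is made up, and this is not the book's exact block:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlock(nn.Module):
    def __init__(self, n_chans):
        super().__init__()
        self.conv = nn.Conv2d(n_chans, n_chans, kernel_size=3, padding=1)
        self.batch_norm = nn.BatchNorm2d(n_chans)

    def forward(self, x):
        out = F.relu(self.batch_norm(self.conv(x)))
        return out + x   # skip connection: add the input back to the output

block = ResBlock(16)
print(block(torch.randn(2, 16, 8, 8)).shape)  # shape preserved: (2, 16, 8, 8)
```

Because the identity path carries gradients straight through, stacking many such blocks alleviates the vanishing-gradient problem in very deep networks.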
Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about data loading and parsing the data, the CT scan dataset, the data pipeline, and a few more topics related to the same from here. Besides, I have also learned about autoencoders, recurrent neural networks and long short-term memory (LSTM), data processing, one-hot encoding, random splitting of the training and validation datasets, and a few more. I have continued working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. The LUNA Grand Challenge is the combination of an open dataset with high-quality labels of patient CT scans, many with lung nodules, and a public ranking of classifiers against the data. I have presented the simple implementation of data preparation using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2079.PNG)

**Day80 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about loading the individual CT scans of the dataset, 3D nodule density data, the SimpleITK library, Hounsfield units, voxels, batch normalization, loading a nodule using the patient coordinate system, converting between millimeters and voxel addresses, array coordinates, matrix multiplication, and a few more topics related to the same from here. Besides, I have also learned about autoencoders using LSTMs, the stateful decoder model, and data visualization. I have continued working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. I have presented the implementation of conversion between patient coordinates and array coordinates on the CT scan dataset using PyTorch here
in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2080.PNG)

**Day81 of 300DaysOfData!**

**Voxels and Nodules**: A voxel is the 3D equivalent of the familiar 2D pixel. It encloses a volume of space rather than an area. A mass of tissue made of proliferating cells in the lung is called a tumor, and a small tumor just a few millimeters wide is called a nodule.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about implementing a PyTorch Dataset instance, the LUNA Dataset class, cross-entropy loss, positive and negative nodules, arrays and tensors, caching candidate arrays, the training and validation datasets, data visualization, and a few more topics related to the same from here. Besides, I have also learned about normalization of data, variance thresholds, the RDKit library, and a few more topics related to the same. I have presented the implementation of preparing the LUNA dataset using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2081.PNG)

**Day82 of 300DaysOfData!**

**Tagging Algorithms**: The problem of learning to predict classes that are not mutually exclusive is called multilabel classification. Auto-tagging problems are best described as multilabel classification problems.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from
the book **Dive into Deep Learning**. Here, I have learned about a motivating example of machine learning, learning algorithms, the training process, data, features, models, objective functions, optimization algorithms, supervised learning, regression, binary, multiclass, and hierarchical classification, the cross-entropy and mean squared error loss functions, gradient descent, tagging algorithms, and a few more topics related to the same from here. I have presented the implementation of preparing the data, normalization, removing low-variance features, and data loaders using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2082.PNG)

**Day83 of 300DaysOfData!**

**Reinforcement Learning**: Reinforcement learning gives a very general statement of a problem in which an agent interacts with an environment over a series of time steps, receives some observations, and must choose actions.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about search algorithms, recommender systems, sequence learning, tagging and parsing, machine translation, unsupervised learning, interacting with an environment and reinforcement learning, data manipulation, mathematical operations, broadcasting mechanisms, indexing and slicing, saving memory in tensors, conversion to other datatypes, and a few more topics related to the same from here. I have presented the implementation of mathematical operations, tensor concatenation, broadcasting mechanisms, and datatype conversion using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited
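The broadcasting mechanism mentioned under Day83 can be sketched in a few lines; the values are made up for illustration:

```python
import torch

a = torch.arange(3).reshape(3, 1)   # shape (3, 1)
b = torch.arange(2).reshape(1, 2)   # shape (1, 2)

# Both operands are virtually expanded to the common shape (3, 2)
# before the elementwise addition is applied.
print(a + b)
```

Each size-1 axis is stretched to match the other operand, so no data is copied and no explicit tiling is needed.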
about the days ahead!

Book: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2083.PNG)

**Day84 of 300DaysOfData!**

**Tensors**: Tensors refer to algebraic objects describing n-dimensional arrays with an arbitrary number of axes. Vectors are first-order tensors and matrices are second-order tensors.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about data preprocessing, reading the dataset, handling missing data, categorical data, conversion to the tensor format, and linear algebra, such as scalars, vectors, length, dimensionality, and shape, matrices, symmetric matrices, tensors, basic properties of tensor arithmetic, reduction and non-reduction sums, dot products, matrix-vector products, and a few more topics related to the same from here. I have presented the implementation of data preprocessing, handling missing data, scalars, vectors, matrices, and dot products using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2084.PNG)

**Day85 of 300DaysOfData!**

**Method of Exhaustion**: The ancient process of finding the area of curved shapes, such as a circle, by inscribing polygons in such shapes, which better and better approximate the shape, is called the method of exhaustion.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about matrix multiplication, the L1 and L2 norms, the Frobenius norm, calculus, the method of exhaustion, derivatives and differentiation, partial derivatives, gradients and gradient descent, the chain rule, automatic differentiation, backward for non-scalar
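The automatic differentiation listed for Day85 can be sketched with a classic example: for y = 2xᵀx, the gradient is 4x. This is a minimal sketch, not the book's exact code:

```python
import torch

x = torch.arange(4.0, requires_grad=True)   # x = [0., 1., 2., 3.]
y = 2 * torch.dot(x, x)                     # y = 2 * x.x, a scalar
y.backward()                                # autograd computes dy/dx = 4x
print(x.grad)                               # tensor([ 0.,  4.,  8., 12.])
```

Calling `backward()` on the scalar `y` populates `x.grad` with the gradient, which matches the analytic result 4x term by term.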
variables detaching computation backpropagation computing the gradient with control flow and few more topics related to the same from here i have presented the implementation of matrix multiplication l1 l2 and frobenius normalization derivatives and differentiation automatic differentiation and computing the gradient using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 2085 png day86 of 300daysofdata method of exhaustion the ancient process of finding the area of curved shapes such as circle by inscribing the polygons in such shapes which better approximate the circle is called the method of exhaustion on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about probabilities basic probability theory sampling multinomial distribution axioms of probability theory random variables dealing with multiple random variables joint probability conditional probability bayes theorem marginalization independence and dependence expectation and variance finding classes and functions in a module and few more topics related to the same from here i have presented the implementation of multinomial distribution visualization of probabilities derivatives and differentiation using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 2086 png day87 of 300daysofdata hyperparameters the parameters that are 
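Day86's sampling experiment draws repeatedly from a multinomial distribution and watches the empirical frequencies approach the true probabilities. The book does this with PyTorch's `torch.distributions`; this is a stdlib-only sketch of the same idea, and the helper name `sample_multinomial` is my own, not from the book:

```python
import random
from collections import Counter

def sample_multinomial(probs, n, seed=0):
    """Draw n samples from the categorical distribution given by probs
    and return the empirical frequency of each outcome."""
    rng = random.Random(seed)
    outcomes = range(len(probs))
    counts = Counter(rng.choices(outcomes, weights=probs, k=n))
    return [counts[o] / n for o in outcomes]

# A fair six-sided die: empirical frequencies approach 1/6 as n grows.
freqs = sample_multinomial([1 / 6] * 6, 10_000)
assert all(abs(f - 1 / 6) < 0.02 for f in freqs)
```

With `n = 10_000` the standard deviation of each frequency is about 0.004, so the tolerance above leaves plenty of margin.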
**Day87 of 300DaysOfData!**

**Hyperparameters**: The parameters that are tunable but not updated in the training loop are called hyperparameters. Hyperparameter tuning is the process by which hyperparameters are chosen, and it typically requires adjusting them based on the results of the training loop. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about linear regression, the basic elements of linear regression, the linear model and transformation, the loss function, the analytic solution, minibatch stochastic gradient descent, making predictions with the learned model, vectorization for speed, the normal distribution and squared loss, from linear regression to deep neural networks, biological interpretation, hyperparameter tuning, and a few more topics related to the same from here. I have presented the implementation of vectorization for speed and normal distributions using Python here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2087.PNG)

**Day88 of 300DaysOfData!**

**Hyperparameters**: The parameters that are tunable but not updated in the training loop are called hyperparameters. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about linear regression implementation from scratch, the data pipeline, deep learning frameworks, generating an artificial dataset, scatter plots and correlation, reading the dataset, minibatches, features and labels, parallel computing, initializing the model parameters, minibatch stochastic gradient descent, defining a simple linear regression model, the broadcasting mechanism, vectors and scalars, and a few more topics related to the same from here. I have presented the implementation of generating the synthetic dataset, generating the scatter plot, reading the dataset, initializing the model parameters, and defining the linear regression model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2088.PNG)

**Day89 of 300DaysOfData!**

**Linear Regression**: Linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables, also known as the dependent and independent variables. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about linear regression, defining the loss function, defining the optimization algorithm, minibatch stochastic gradient descent, training the model, tensors and differentiation, the concise implementation of linear regression, generating the synthetic dataset, model evaluation, and a few more topics related to the same from here. I have presented the implementation of defining the loss function, minibatch stochastic gradient descent, training and evaluating the model, the concise implementation of linear regression, and reading the dataset using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2089.PNG)
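The Day88–Day89 training loop can be condensed into a tiny pure-Python sketch: squared loss, analytic gradients, and gradient-descent updates. For determinism this uses full-batch steps rather than the book's sampled minibatches, and the helper name `train_linreg` is mine:

```python
def train_linreg(xs, ys, lr=0.1, epochs=500):
    """Fit y = w*x + b by full-batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to w and b.
        grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic data from the true model y = 2x + 4.2 (noise-free for clarity).
xs = [x / 10 for x in range(-20, 20)]
ys = [2 * x + 4.2 for x in xs]
w, b = train_linreg(xs, ys)
assert abs(w - 2) < 1e-2 and abs(b - 4.2) < 1e-2
```

The book's version additionally adds Gaussian noise to the labels and updates on random minibatches; the recovered parameters then land near, rather than on, the true values.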
**Day90 of 300DaysOfData!**

**Linear Regression**: Linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables, also known as the dependent and independent variables. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about softmax regression, the classification problem, network architecture, the parameterization cost of fully connected layers, the softmax operation, vectorization for minibatches, the loss function, log-likelihood, softmax and derivatives, cross-entropy loss, information theory basics, entropy and surprisal, model prediction and evaluation, the image classification dataset, and a few more topics related to the same from here. I have presented the implementation of the image classification dataset, visualization, and softmax regression and operation along with model parameters using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2090a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2090b.PNG)

**Day91 of 300DaysOfData!**

**Activation Functions**: Activation functions decide whether a neuron should be activated or not by calculating the weighted sum and further adding bias to it. They are differentiable operators. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the cross-entropy loss function, classification accuracy, training the softmax regression model, parameters, optimization algorithms, multilayer perceptrons, hidden layers, linear models, problems from linear to nonlinear models, universal approximators, activation functions like the ReLU function, the sigmoid function, and the tanh function, derivatives and gradients, and a few more topics related to the same from here. I have presented the implementation of the softmax regression model, classification accuracy, and the ReLU, sigmoid, and tanh functions along with visualizations using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2091a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2091b.PNG)

**Day92 of 300DaysOfData!**

**Activation Functions**: Activation functions decide whether a neuron should be activated or not by calculating the weighted sum and further adding bias to it. They are differentiable operators. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the implementation of multilayer perceptrons, initializing model parameters, the ReLU activation function, the cross-entropy loss function, training the model, fully connected layers, a simple linear layer, softmax regression and function, stochastic gradient descent, the Sequential API, high-level APIs, the learning rate, weights and biases, tensors, hyperparameters, and a few more topics related to the same from here. I have presented the implementation of multilayer perceptrons, the ReLU activation function, training the model, and model evaluation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2092.PNG)
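The three activation functions from Day91 are simple enough to state directly. A minimal stdlib sketch of their defining properties (the book plots them with PyTorch tensors instead):

```python
import math

def relu(x):
    """Rectified linear unit: pass positives through, zero out negatives."""
    return max(0.0, x)

def sigmoid(x):
    """Squash any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Zero-centred squashing function with range (-1, 1)."""
    return math.tanh(x)

# Basic properties: ReLU zeroes negatives; sigmoid is bounded by (0, 1);
# tanh is the rescaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1.
assert relu(-3.0) == 0.0 and relu(2.5) == 2.5
assert 0 < sigmoid(-5) < 0.5 < sigmoid(5) < 1
assert abs(tanh(1.0) - (2 * sigmoid(2.0) - 1)) < 1e-12
```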
**Day93 of 300DaysOfData!**

**Multilayer Perceptrons**: The simplest deep neural networks are called multilayer perceptrons. They consist of multiple layers of neurons. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about model selection, underfitting, overfitting, training error and generalization error, statistical learning theory, model complexity, early stopping, the training, testing, and validation datasets, K-fold cross-validation, dataset size, polynomial regression, generating the dataset, training and testing the model, third-order polynomial function fitting, linear function fitting, higher-order polynomial function fitting, weight decay, normalization, and a few more topics related to the same from here. I have presented the implementation of generating the dataset, defining the training function, and polynomial function fitting using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2093a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2093b.PNG)

**Day94 of 300DaysOfData!**

**Multilayer Perceptrons**: The simplest deep neural networks are called multilayer perceptrons. They consist of multiple layers of neurons, each fully connected to those in the layer below, from which they receive input, and those in the layer above, which they in turn influence. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about high-dimensional linear regression, model parameters, defining the L2 norm penalty, defining the training loop, regularization and weight decay, dropout and overfitting, the bias–variance tradeoff, Gaussian distributions, stochastic gradient descent, training error and test error, and a few more topics related to the same from here. I have presented the implementation of high-dimensional linear regression, model parameters, the L2 norm penalty, and regularization and weight decay using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2094a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2094b.PNG)

**Day95 of 300DaysOfData!**

**Dropout and Co-adaptation**: Dropout is the process of injecting noise while computing each internal layer during forward propagation. Co-adaptation is a condition in a neural network characterized by a state in which each layer relies on a specific pattern of activations in the previous layer. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about dropout, overfitting, generalization error, the bias–variance tradeoff, robustness through perturbations, L2 regularization and weight decay, co-adaptation, dropout probability, the dropout layer, the Fashion-MNIST dataset, activation functions, stochastic gradient descent, the Sequential and Functional APIs, and a few more topics related to the same from here. I have presented the implementation of the dropout layer and training and testing the model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2095a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2095b.PNG)
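Day95's dropout layer reduces to a few lines: during training, zero each activation with probability `p` and rescale the survivors by `1/(1-p)` so the expected activation is unchanged (inverted dropout, as in the book's from-scratch version; the function name and plain-list representation are mine):

```python
import random

def dropout_layer(x, p, rng):
    """Zero each element with probability p; scale survivors by 1/(1-p)
    so the expected value of each activation is unchanged."""
    if p == 1.0:
        return [0.0] * len(x)
    return [0.0 if rng.random() < p else v / (1 - p) for v in x]

rng = random.Random(42)
x = [1.0] * 10_000
out = dropout_layer(x, 0.5, rng)
# Roughly half the activations are dropped; the mean stays near 1.
kept = sum(1 for v in out if v != 0.0)
assert abs(kept / len(out) - 0.5) < 0.03
assert abs(sum(out) / len(out) - 1.0) < 0.05
```

At prediction time the layer is simply skipped, which is why the rescaling during training matters.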
**Day96 of 300DaysOfData!**

**Dropout and Co-adaptation**: Dropout is the process of injecting noise while computing each internal layer during forward propagation. Co-adaptation is a condition in a neural network characterized by a state in which each layer relies on a specific pattern of activations in the previous layer. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about forward propagation, backward propagation and computational graphs, numerical stability, vanishing and exploding gradients, breaking the symmetry, parameter initialization, environment and distribution shift, covariate shift, label shift, concept shift, non-stationary distributions, empirical risk and true risk, batch learning, online learning, reinforcement learning, and a few more topics related to the same from here. I have presented the implementation of data preprocessing and data preparation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)
- [**Predicting Housing Prices**](https://github.com/ThinamXX/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2096.PNG)

**Day97 of 300DaysOfData!**

On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training and building deep networks, downloading and caching datasets, data preprocessing, regression problems, accessing and reading the dataset, numerical and discrete categorical features, optimization and variance, arrays and tensors, a simple linear model, the Sequential API, root mean squared error, the Adam optimizer, hyperparameter tuning, K-fold cross-validation, training and validation error, model selection, overfitting and regularization, and a few more topics related to the same from here. I have presented the implementation of a simple linear model, root mean squared error, the training function, and K-fold cross-validation using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)
- [**Predicting Housing Prices**](https://github.com/ThinamXX/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2097.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2097a.PNG)

**Day98 of 300DaysOfData!**

**Constant Parameters**: Constant parameters are terms that are neither the result of the previous layers nor updatable parameters in the neural network. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about K-fold cross-validation, training and predictions, hyperparameter optimization, deep learning computation, layers and blocks, softmax regression, multilayer perceptrons, the ResNet architecture, the forward and backward propagation functions, the ReLU activation function, the Sequential block implementation, the MLP implementation, constant parameters, and a few more topics related to the same from here. I have presented the implementation of the MLP, the Sequential API class, and the forward propagation function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)
- [**Predicting Housing Prices**](https://github.com/ThinamXX/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)
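The K-fold cross-validation used on Day97 and Day98 partitions the example indices into `k` disjoint validation folds, training on the rest each time. A minimal index-splitting sketch (helper name mine; it assumes `n` is divisible by `k`, otherwise the leftover examples are silently dropped):

```python
def k_fold_splits(n, k):
    """Yield (train_idx, valid_idx) pairs for k-fold cross-validation."""
    fold = n // k
    idx = list(range(n))
    for i in range(k):
        valid = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, valid

splits = list(k_fold_splits(10, 5))
assert len(splits) == 5
for train, valid in splits:
    # Every fold: disjoint train/valid sets that together cover all indices.
    assert len(valid) == 2 and len(train) == 8
    assert set(train) | set(valid) == set(range(10))
    assert not set(train) & set(valid)
```

The model is then trained `k` times, once per split, and the validation errors are averaged to compare hyperparameter choices.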
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2098.PNG)

**Day99 of 300DaysOfData!**

**Constant Parameters**: Constant parameters are terms that are neither the result of the previous layers nor updatable parameters in the neural network. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about parameter management, parameter access, targeted parameters, collecting parameters from nested blocks, parameter initialization, custom initialization, tied parameters, deferred initialization, multilayer perceptrons, input dimensions, defining custom layers, layers without parameters, the forward propagation function, constant parameters, the Xavier initializer, weights and biases, and a few more topics related to the same from here. I have presented the implementation of parameter access, parameter initialization, tied parameters, and layers without parameters using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2099a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2099b.PNG)

**Day100 of 300DaysOfData!**

**Invariance and Locality Principles**: The translation invariance principle states that our network should respond similarly to the same patch, regardless of where it appears in the image. The locality principle states that the network should focus on local regions, without regard to the contents of the image in distant regions. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about going from fully connected layers to convolutions, translation invariance, the locality principle, constraining the MLP, convolutional neural networks, cross-correlation, images and channels, file I/O, loading and saving tensors, loading and saving model parameters, custom layers, layers with parameters, and a few more topics related to the same from here. I have presented the implementation of layers with parameters and loading and saving the tensors and model parameters using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20100a.PNG)

**Day101 of 300DaysOfData!**

**Invariance and Locality Principles**: The translation invariance principle states that our network should respond similarly to the same patch, regardless of where it appears in the image. The locality principle states that the network should focus on local regions, without regard to the contents of the image in distant regions. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convolutional neural networks, convolutions for images, the cross-correlation operation, convolutional layers, the constructor and forward propagation function, the weight and bias objects, edge detection in images, learning a kernel, backpropagation, feature maps and receptive fields, kernel parameters, and a few more topics related to the same from here. I have presented the implementation of the cross-correlation operation, convolutional layers, and learning a kernel using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20101.PNG)
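Day101's cross-correlation operation (`corr2d` in the book) slides the kernel over the input and sums the elementwise products at each position. A pure-Python version over nested lists, reproducing the book's 3×3 input / 2×2 kernel worked example:

```python
def corr2d(X, K):
    """2-D cross-correlation: slide kernel K over input X (no padding, stride 1)."""
    h, w = len(K), len(K[0])
    out_h, out_w = len(X) - h + 1, len(X[0]) - w + 1
    Y = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            Y[i][j] = sum(X[i + a][j + b] * K[a][b]
                          for a in range(h) for b in range(w))
    return Y

X = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
K = [[0, 1], [2, 3]]
assert corr2d(X, K) == [[19, 25], [37, 43]]
```

A convolutional layer is just this operation with a learnable `K` plus a bias; the output shape shrinks to `(H - h + 1, W - w + 1)` without padding.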
**Day102 of 300DaysOfData!**

**Maximum Pooling**: Pooling operators consist of a fixed-shape window that is slid over all the regions in the input according to its stride, computing a single output for each location, which is either the maximum or the average value of the elements in the pooling window. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about padding and stride, strided convolutions, cross-correlations, multiple input and multiple output channels, the convolutional layer, the maximum pooling and average pooling layers, pooling windows and operators, convolutional neural networks, the LeNet architecture, supervised learning, the convolutional encoder, the sigmoid activation function, and a few more topics related to the same from here. I have presented the implementation of CNNs, the implementation of padding, stride, and pooling layers, and multiple channels using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102b.PNG)

**Day103 of 300DaysOfData!**

**VGG Networks**: VGG networks construct a network using reusable convolutional blocks. VGG models are defined by the number of convolutional layers and output channels in each block. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convolutional neural networks, supervised learning, deep CNNs and AlexNet, support vector machines and features, learning representations, data and hardware accelerator problems, the architectures of LeNet and AlexNet, activation functions such as ReLU, networks using CNN blocks, the VGG network architecture, padding and pooling, convolutional layers, dropout, dense and linear layers, and a few more topics related to the same from here. I have presented the implementation of the AlexNet architecture and the VGG network architecture along with CNN blocks using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20103a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20103b.PNG)

**Day104 of 300DaysOfData!**

**VGG Networks**: VGG networks construct a network using reusable convolutional blocks. VGG models are defined by the number of convolutional layers and output channels in each block. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the Network in Network (NiN) architecture, NiN blocks and model, the convolutional layer, the ReLU activation function, the Sequential and Functional APIs, the global average pooling layer, networks with parallel concatenations (GoogLeNet), Inception blocks, the GoogLeNet model and architecture, the maximum pooling layer, training the model, and a few more topics related to the same from here. I have presented the implementation of the NiN block and model, the Inception block, and the GoogLeNet model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20104a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20104b.PNG)

**Day105 of 300DaysOfData!**

**Batch Normalization**: Batch normalization continuously adjusts the intermediate output of the neural network by utilizing the mean and standard deviation of the minibatch, so that the values of the intermediate output are more stable. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about batch normalization, training deep neural networks, the scale parameter and shift parameter, batch normalization layers, fully connected layers, convolutional layers, batch normalization during prediction, tensors, mean and variance, applying BN in LeNet, the concise implementation of BN using the high-level API, internal covariate shift, the dropout layer, Residual Networks (ResNet), function classes, residual blocks, and a few more topics related to the same from here. I have presented the implementation of the batch normalization architecture using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20105a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20105b.PNG)

**Day106 of 300DaysOfData!**

**Batch Normalization**: Batch normalization continuously adjusts the intermediate output of the neural network by utilizing the mean and standard deviation of the minibatch, so that the values of the intermediate output are more stable. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about Densely Connected Networks (DenseNet), dense blocks, batch normalization, activation functions and the convolutional layer, the transition layer, Residual Networks (ResNet), function classes, residual blocks, residual mapping, the residual connection, the ResNet model, maximum and average pooling layers, training the model, and a few more topics related to the same from here. I have presented the implementation of the ResNet architecture and the ResNet model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20106a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20106b.PNG)

**Day107 of 300DaysOfData!**

**Sequence Models**: Prediction beyond the known observations is called extrapolation; estimating between the existing observations is called interpolation. Sequence models require specialized statistical tools for estimation, such as autoregressive models. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the DenseNet model, convolutional layers, recurrent neural networks, sequence models, interpolation and extrapolation, statistical tools, autoregressive models, latent autoregressive models, Markov models, reinforcement learning algorithms, causality, the conditional probability distribution, training the MLP, one-step-ahead prediction, and a few more topics related to the same from here. I have presented the implementation of DenseNet architectures and a simple implementation of RNNs using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)
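Day107's autoregressive setup turns a sequence into supervised pairs: the previous `tau` observations become the features for predicting the next value. A minimal sketch of that data preparation (helper name mine; the book builds the same pairs as PyTorch tensors and feeds them to an MLP):

```python
def lagged_features(seq, tau):
    """Turn a sequence into (features, label) pairs where the features
    are the previous tau observations -- the autoregressive setup."""
    pairs = []
    for t in range(tau, len(seq)):
        pairs.append((seq[t - tau:t], seq[t]))
    return pairs

pairs = lagged_features([10, 20, 30, 40, 50], tau=2)
assert pairs == [([10, 20], 30), ([20, 30], 40), ([30, 40], 50)]
```

One-step-ahead prediction evaluates the model on true histories like these; multi-step-ahead prediction feeds the model's own outputs back in, which is why its errors compound.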
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107c.PNG)

**Day108 of 300DaysOfData!**

**Tokenization and Vocabulary**: Tokenization is the splitting of a string or text into a list of tokens. A vocabulary is the dictionary that maps string tokens into numerical indices. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about text preprocessing, a corpus of text, the tokenization function, sequence models and datasets, the vocabulary dictionary, multilayer perceptrons, one-step-ahead prediction, multi-step-ahead prediction, tensors, recurrent neural networks, and a few more topics related to the same from here. I have presented the implementation of reading the dataset, tokenization, and vocabulary using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20108.PNG)

**Day109 of 300DaysOfData!**

**Sequential Partitioning**: Sequential partitioning is the strategy that preserves the order of split subsequences when iterating over minibatches. It ensures that the subsequences from two adjacent minibatches during iteration are adjacent in the original sequence. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about language models and the sequence dataset, conditional probability, Laplace smoothing, Markov models and n-grams, unigram, bigram, and trigram models, natural language statistics, stop words, word frequencies, Zipf's law, reading long sequence data, minibatches, random sampling, sequential partitioning, and a few more topics related to the same from here. I have presented the implementation of unigram, bigram, and trigram model frequencies, random sampling, and sequential partitioning using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109c.PNG)

**Day110 of 300DaysOfData!**

**Recurrent Neural Networks**: Recurrent neural networks are networks that use recurrent computation for hidden states. The hidden state of an RNN can capture historical information of the sequence up to the current time step. On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about recurrent neural networks (RNNs), the hidden state, neural networks without hidden states, RNNs with hidden states, RNN layers, RNN-based character-level language models, perplexity, the implementation of an RNN from scratch, one-hot encoding, the vocabulary, initializing the model parameters, the RNN model, minibatches and the tanh activation function, prediction and the warm-up period, gradient clipping, backpropagation, and a few more topics related to the same from here. I have presented the implementation of the RNN model, gradient clipping, and training the model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!

Book:
- [**Dive into Deep Learning**](https://d2l.ai/index.html)
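Day110's gradient clipping rescales the gradient so its norm never exceeds a threshold θ: g ← min(1, θ/‖g‖)·g. A stdlib sketch on a plain list of gradient values (the book clips PyTorch parameter gradients in place; the function name is mine):

```python
import math

def clip_gradients(grads, theta):
    """Rescale the gradient vector so its L2 norm is at most theta."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > theta:
        return [g * theta / norm for g in grads]
    return list(grads)

g = [3.0, 4.0]                       # norm 5
clipped = clip_gradients(g, theta=1.0)
# The clipped gradient keeps its direction but has norm exactly theta.
assert abs(math.sqrt(sum(x * x for x in clipped)) - 1.0) < 1e-12
assert clip_gradients([0.1, 0.2], theta=1.0) == [0.1, 0.2]
```

Because the direction is preserved, clipping limits how far a single exploding-gradient step can move the parameters without changing where it points.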
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20110.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20110a.PNG)

**Day111 of 300DaysOfData!**
- **Recurrent Neural Networks**: Recurrent neural networks are networks that use recurrent computation for hidden states. The hidden state of an RNN can capture historical information of the sequence up to the current time step. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the implementation of recurrent neural networks, defining the RNN model, training and prediction, backpropagation through time, exploding gradients, vanishing gradients, analysis of gradients in RNNs, full computation, truncating time steps, randomized truncation, gradient computing strategies in RNNs, activation functions, regular truncation, and a few more topics related to the same from here. I have presented the implementation of recurrent neural networks, training, and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20111a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20111b.PNG)

**Day112 of 300DaysOfData!**
- **Gated Recurrent Units**: Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks that decides when a hidden state should be updated and when it should be reset. They aim to solve the vanishing gradient problem that comes with standard RNNs. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about modern recurrent neural networks, gradient clipping, gated recurrent units (GRUs), the memory cell, the gated hidden state, the reset gate and update gate, broadcasting, the candidate hidden state, the Hadamard product operator, the hidden state, initializing model parameters, defining the GRU model, training and prediction, and a few more topics related to the same from here. I have presented the implementation of the gated recurrent units (GRU) model, training, and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20112a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20112b.PNG)

**Day113 of 300DaysOfData!**
- **Long Short Term Memory**: Long short-term memory (LSTM) is a type of recurrent neural network capable of learning order dependence in sequence prediction problems. An LSTM has input gates, forget gates, and output gates that control the flow of information. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about long short-term memory (LSTM), the gated memory cell, the input gate, forget gate, and output gate, the candidate memory cell, the tanh activation function, the sigmoid activation function, the memory cell, the hidden state, initializing model parameters, defining the LSTM model, training and prediction, gated recurrent units (GRUs), the Gaussian distribution, and a few more topics related to the same from here. I have presented the implementation of the long short-term memory (LSTM) model, training, and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20113a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20113b.PNG)

**Day114 of 300DaysOfData!**
- **Long Short Term Memory**: Long short-term memory (LSTM) is a type of recurrent neural network capable of learning order dependence in sequence prediction problems. An LSTM has input gates, forget gates, and output gates that control the flow of information. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about deep recurrent neural networks, functional dependencies, bidirectional recurrent neural networks, dynamic programming in hidden Markov models, the bidirectional model, computational cost and applications, machine translation and the dataset, preprocessing the dataset, tokenization, the vocabulary, padding text sequences, and a few more topics related to the same from here. I have presented the implementations of downloading the dataset, preprocessing, tokenization, and the vocabulary using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20114a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20114b.PNG)

**Day115 of 300DaysOfData!**
- **Encoder and Decoder Architecture**: An encoder takes a variable-length sequence as input and transforms it into a state with a fixed shape. A decoder maps the encoded state of a fixed shape to a variable-length sequence. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about encoder-decoder architectures, the machine translation model, sequence transduction models, the forward propagation function, sequence-to-sequence learning, recurrent neural networks, the embedding layer, gated recurrent unit (GRU) layers, hidden states and units, the RNN encoder-decoder architecture, the vocabulary, and a few more topics related to the same from here. I have presented the implementation of encoder-decoder architectures and the RNN encoder-decoder for sequence-to-sequence learning using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20115a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20115b.PNG)

**Day116 of 300DaysOfData!**
- **Sequence Search**: Greedy search generates an output sequence by selecting, at each time step, the token with the highest conditional probability given the input sequence. Beam search is an improved version of greedy search with a hyperparameter named beam size. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the softmax cross-entropy loss function, sequence masking, teacher forcing, training and prediction, evaluation of predicted sequences, BLEU (bilingual evaluation understudy), the RNN encoder-decoder, beam search, greedy search, exhaustive search, attention mechanisms, attention cues, the nonvolitional cue and volitional cue, queries, keys, and values, attention pooling, and a few more topics related to the same from here. I have presented the implementation of sequence masking, the softmax cross-entropy loss, training the RNN encoder-decoder model, and BLEU using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20116a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20116b.PNG)

**Day117 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries and keys. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about attention pooling or Nadaraya-Watson kernel regression, queries (volitional cues) and keys (nonvolitional cues), generating the dataset, average pooling, nonparametric attention pooling, the attention weight, the Gaussian kernel, parametric attention pooling, batch matrix multiplication, defining the model, training the model, stochastic gradient descent, the MSE loss function, and a few more topics related to the same from here. I have presented the implementation of attention mechanisms, nonparametric attention pooling, batch matrix multiplication, the NW kernel regression model, training, and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20117a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20117b.PNG)

**Day118 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries (volitional cues) and keys (nonvolitional cues). Attention pooling is the weighted average of the training outputs, and it can be parametric or nonparametric. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about attention scoring functions, the Gaussian kernel, attention weights, the softmax activation function, the masked softmax operation, text sequences, the probability distribution, additive attention, queries, keys, and values, the tanh activation function, the dropout and linear layers, attention pooling, and a few more topics related to the same from here. I have presented the implementation of the masked softmax operation and additive attention using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20118.PNG)

**Day119 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries (volitional cues) and keys (nonvolitional cues). Attention pooling is the weighted average of the training outputs, and it can be parametric or nonparametric. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about scaled dot-product attention, queries, keys, and values, additive attention, attention pooling, Bahdanau attention, the RNN encoder-decoder architecture, hidden states, embedding, defining the decoder with attention, the sequence-to-sequence attention decoder, and a few more topics related to the same from here. I have presented the implementation of scaled dot-product attention and the sequence-to-sequence attention decoder model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20119a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20119b.PNG)

**Day120 of 300DaysOfData!**
- **Multi-Head Attention**: Multi-head attention is a design for attention mechanisms that runs an attention mechanism several times in parallel. Instead of performing a single attention pooling, queries, keys, and values are transformed with learned linear projections, which are fed into attention pooling in parallel. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about Bahdanau attention, recurrent neural networks, the encoder-decoder architecture, training the sequence-to-sequence model, the embedding layer, attention weights, the GRU, heatmaps, multi-head attention, queries, keys, and values, attention pooling, additive attention and scaled dot-product attention, transpose functions, and a few more topics related to the same from here. I have presented the implementation of multi-head attention using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20120.PNG)

**Day121 of 300DaysOfData!**
- **Multi-Head Attention**: Multi-head attention is a design for attention mechanisms that runs an attention mechanism several times in parallel. Instead of performing a single attention pooling, queries, keys, and values are transformed with learned linear projections, which are fed into attention pooling in parallel. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multi-head attention, queries, keys, and values, attention pooling, scaled dot-product attention, self-attention and positional encoding, recurrent neural networks, intra-attention, comparing CNNs, RNNs, and self-attention, padding tokens, absolute positional information, relative positional information, and a few more topics related to the same from here. I have presented the implementation of positional encoding using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20121.PNG)

**Day122 of 300DaysOfData!**
- **Transformer Architecture**: The Transformer is an architecture for transforming one sequence into another with the help of two parts, an encoder and a decoder. It makes use of self-attention mechanisms. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the Transformer, self-attention, the encoder-decoder architecture, sequence embeddings, positional encoding, position-wise feed-forward networks, the residual connection and layer normalization, the encoder block and multi-head self-attention, the Transformer decoder, queries, keys, and values, scaled dot-product attention, and a few more topics related to the same from here. I have presented the implementation of position-wise feed-forward networks, the residual connection and layer normalization, the encoder-decoder block, and the Transformer decoder using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)
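The position-wise feed-forward network and the residual-connection-plus-layer-normalization step described above can be sketched in a few lines. This is a hypothetical NumPy illustration of the two building blocks (the helper names and shapes are my own, not the book's exact PyTorch classes):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each position across the feature dimension (last axis).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def position_wise_ffn(x, w1, b1, w2, b2):
    # The same two-layer MLP (with a ReLU in between) is applied
    # independently at every position of the sequence.
    return np.maximum(x @ w1 + b1, 0) @ w2 + b2

def add_norm(x, sublayer_out):
    # Residual connection followed by layer normalization.
    return layer_norm(x + sublayer_out)

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8))            # (batch, positions, features)
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 8)), np.zeros(8)
y = add_norm(x, position_wise_ffn(x, w1, b1, w2, b2))
print(y.shape)  # (2, 4, 8) — the block preserves the input shape
```

Because the block preserves the `(batch, positions, features)` shape, such sublayers can be stacked to build the full encoder.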
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122c.PNG)

**Day123 of 300DaysOfData!**
- **Transformer Architecture**: The Transformer is an architecture for transforming one sequence into another with the help of two parts, an encoder and a decoder. It makes use of self-attention mechanisms. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the decoder architecture, self-attention, encoder-decoder attention, position-wise feed-forward networks, residual connections, the Transformer decoder, the embedding layer, sequential blocks, training the Transformer architecture, and a few more topics related to the same from here. I have also read about logistic regression, the sigmoid activation function, weights initialization, gradient descent, the cost function, and more. I have presented the implementation of logistic regression from scratch using NumPy, and the Transformer decoder and training using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)
  - [**Logistic Regression Docs**](https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html)
  - [**Implementation of Logistic Regression**](https://github.com/ThinamXX/MachineLearning_Algorithms/tree/main/LogisticRegression)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123c.PNG)

**Day124 of 300DaysOfData!**
- **Transformer Architecture**: The Transformer is an architecture for transforming one sequence into another with the help of two parts, an encoder and a decoder. It makes use of self-attention mechanisms. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about optimization algorithms and deep learning, the objective function and minimization, the goal of optimization, generalization error, training error, the risk function and empirical risk function, optimization challenges, the local minimum and global minimum, saddle points, the Hessian matrix and eigenvalues, vanishing gradients, convexity, convex sets and functions, Jensen's inequality, and a few more topics related to the same from here. I have presented the implementation of local minima, saddle points, vanishing gradients, and convex functions using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20124.PNG)

**Day125 of 300DaysOfData!**
- **Gradient Descent**: Gradient descent is an optimization algorithm used to minimize a differentiable function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convexity and second derivatives, constrained optimization, the Lagrangian function and multipliers, penalties, projections, gradient clipping, stochastic gradient descent, one-dimensional gradient descent, the objective function, the learning rate, the local minimum and global minimum, multivariate gradient descent, and a few more topics related to the same from here. I have presented the implementation of one-dimensional gradient descent, local minima, and multivariate gradient descent using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20125a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20125b.PNG)

**Day126 of 300DaysOfData!**
- **Gradient Descent**: Gradient descent is an optimization algorithm used to minimize a differentiable function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multivariate gradient descent, adaptive methods, the learning rate, Newton's method, the Taylor expansion, the Hessian function, the gradient and backpropagation, nonconvex functions, convergence analysis, linear convergence, preconditioning, gradient descent with line search, stochastic gradient descent, loss functions, and a few more topics related to the same from here. I have presented the implementation of Newton's method, nonconvex functions, and stochastic gradient descent using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20126.PNG)

**Day127 of 300DaysOfData!**
- **Stochastic Gradient Descent**: Stochastic gradient descent is an iterative method for optimizing an objective function with suitable differentiability properties. It is a variation of the gradient descent algorithm that calculates the error and updates the model. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about stochastic gradient descent, the dynamic learning rate, exponential decay and polynomial decay, convergence analysis for convex objectives, stochastic gradients and finite samples, minibatch stochastic gradient descent, vectorization and caches, matrix multiplications, minibatches, variance, the implementation of gradients, and a few more topics related to the same from here. I have presented the implementation of stochastic gradient descent and minibatch stochastic gradient descent using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20127a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20127b.PNG)

**Day128 of 300DaysOfData!**
- **Stochastic Gradient Descent**: Stochastic gradient descent is an iterative method for optimizing an objective function with suitable differentiability properties. It is a variation of the gradient descent algorithm that calculates the error and updates the model. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the momentum method, stochastic gradient descent, leaky averages, variance, the accelerated gradient, an ill-conditioned problem and convergence, the effective sample weight, practical experiments, the implementation of momentum with SGD, theoretical analysis, quadratic convex functions, scalar functions, and a few more topics related to the same from here. I have presented the implementation of the momentum method, the effective sample weight, and scalar functions using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20128a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20128b.PNG)

**Day129 of 300DaysOfData!**
- **Stochastic Gradient Descent**: Stochastic gradient descent is an iterative method for optimizing an objective function with suitable differentiability properties. It is a variation of the gradient descent algorithm that calculates the error and updates the model. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the AdaGrad optimization algorithm, sparse features and learning rates, preconditioning, the stochastic gradient descent algorithm, the algorithm itself, the implementation of AdaGrad from scratch, deep learning and computational constraints, learning rates, and a few more topics related to the same from here. I have presented the implementation of the AdaGrad optimization algorithm from scratch using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20129.PNG)

**Day130 of 300DaysOfData!**
- **RMSProp Optimization Algorithm**: RMSProp is a gradient-based optimization algorithm that utilizes the magnitude of recent gradients to normalize the gradients. It deals with AdaGrad's radically diminishing learning rates by dividing the learning rate by an exponentially decaying average of squared gradients. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the RMSProp optimization algorithm, the learning rate, leaky averages and the momentum method, the implementation of RMSProp from scratch, the gradient descent algorithm, preconditioning, and a few more topics related to the same from here. I have presented the implementation of the RMSProp optimization algorithm from scratch using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20130.PNG)

**Day131 of 300DaysOfData!**
- **RMSProp Optimization Algorithm**: RMSProp is a gradient-based optimization algorithm that utilizes the magnitude of recent gradients to normalize the gradients. It deals with AdaGrad's radically diminishing learning rates by dividing the learning rate by an exponentially decaying average of squared gradients. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the AdaDelta optimization algorithm, learning rates, leaky averages, momentum, gradient descent, the concise implementation of AdaDelta, the Adam optimization algorithm, vectorization and minibatch SGD, weighting parameters, normalization, the concise implementation of the Adam algorithm, and a few more topics related to the same from here. I have presented the implementation of the AdaDelta optimization algorithm and the Adam optimization algorithm from scratch using PyTorch here in the snapshot. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20131.PNG)
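The Adam update described above keeps exponentially weighted moving averages of the gradient and of its square, with bias correction for the early steps. As a rough sketch (a hypothetical NumPy `adam_step` helper, not the book's exact code; hyperparameter defaults follow common practice):

```python
import numpy as np

def adam_step(p, g, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-6):
    # EWMAs of the gradient (momentum) and of the squared gradient,
    # then bias-corrected estimates m_hat and v_hat drive the update.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    p = p - lr * m_hat / (np.sqrt(v_hat) + eps)
    return p, m, v

# Minimize f(p) = p^2 starting from p = 5.
p, m, v = np.array(5.0), 0.0, 0.0
for t in range(1, 2001):
    g = 2 * p                        # gradient of p^2
    p, m, v = adam_step(p, g, m, v, t, lr=0.1)
print(float(p))  # converges toward the minimum at 0
```

Note that on the very first step the bias-corrected update is roughly `lr * sign(g)`, which is why Adam behaves like a bounded-step method early in training.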
**Day132 of 300DaysOfData!**
- **Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA to minibatch stochastic gradient descent. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the Adam and Yogi optimization algorithms, variance, minibatch SGD, learning rate scheduling, weight vectors, the convolutional layer, the linear layer, the max pooling layer, the sequential API, ReLU, the cross-entropy loss, schedulers, overfitting, and a few more topics related to the same from here. I have presented the implementation of the LeNet architecture and the Yogi optimization algorithm using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132c.PNG)

**Day133 of 300DaysOfData!**
- **Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA to minibatch stochastic gradient descent. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about learning rate scheduling, the square root scheduler, the factor scheduler, the learning rate and polynomial decay, the multi-factor scheduler, piecewise constant schedules, optimization and the local minimum, the cosine scheduler, and a few more topics related to the same from here. I have presented the implementation of the multi-factor scheduler and the cosine scheduler using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20133.PNG)

**Day134 of 300DaysOfData!**
- **Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA to minibatch stochastic gradient descent. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about model computational performance, compilers and interpreters, symbolic programming and imperative programming, hybrid programming, dynamic computation graphs, hybrid sequential models, acceleration by hybridization, multilayer perceptrons, asynchronous computation, and a few more topics related to the same from here. I have presented the implementation of the hybrid sequential model, acceleration by hybridization, and asynchronous computation using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20134.PNG)

**Day135 of 300DaysOfData!**
- **Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA to minibatch stochastic gradient descent. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about asynchronous computation, barriers and blockers, improving computation and the memory footprint, automatic parallelism, parallel computation and communication, training on multiple GPUs, splitting the problem, data parallelism, network partitioning, layer-wise partitioning, data-parallel partitioning, and a few more topics related to the same from here. I have presented the implementation of initializing model parameters and defining the LeNet model using PyTorch here in the snapshot. I am still working on the implementation of the LeNet model. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)
  - [**Implementation of LeNet Architecture**](https://github.com/ThinamXX/MachineLearning_Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20135.PNG)

**Day136 of 300DaysOfData!**
- **Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA to minibatch stochastic gradient descent. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training on multiple GPUs, the LeNet architecture, data synchronization, model parallelism, data broadcasting, data distribution, the implementation of optimization algorithms, backpropagation, model animation, the cross-entropy loss function, the convolutional layer, the ReLU activation function, matrix multiplication, the average pooling layer, and a few more topics related to the same from here. I have presented the implementation of data distribution, data synchronization, and the training function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)
  - [**Implementation of LeNet Architecture**](https://github.com/ThinamXX/MachineLearning_Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20136.PNG)

**Day137 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about optimization and synchronization, the ResNet neural network architecture, the convolutional layer, the batch normalization layer, strides and padding, the sequential API, parameter initialization and logistics, minibatch gradient descent, training the ResNet model, the stochastic gradient descent optimizer, the cross-entropy loss function, backpropagation, parallelization, and a few more topics related to the same from here. I have presented the implementation of the ResNet architecture, initialization, and training the model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned above and below. Excited about the days ahead!
  - Book:
    - [**Dive into Deep Learning**](https://d2l.ai/index.html)
  - [**Implementation of LeNet Architecture**](https://github.com/ThinamXX/MachineLearning_Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20137a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20137b.PNG)

**Day138 of 300DaysOfData!**
- On my journey of Machine
learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision applications image augmentation deep neural networks common image augmentation methods such as flipping and cropping horizontal flipping and vertical flipping changing the color of images overlaying multiple image augmentation methods cifar10 dataset torch vision module and random color jitter instance and few more topics related to the same from here i have presented the implementation of flipping and cropping the images and changing the color of images using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html implementation of lenet architecture https github com thinamxx machinelearning algorithms blob main lenetarchitecture lenetarchitecture ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20138 png day139 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about image augmentation cifar10 dataset using a multi gpu training model fine tuning the model overfitting pretrained neural network target initialization resnet model imagenet dataset normalization of rgb images mean and standard deviation torch vision module flipping and cropping images adam optimization cross entropy loss function and few more topics related to the same from here i have presented the implementation of training the model with image augmentation and normalization of images using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive
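The flipping and cropping augmentations mentioned above reduce to simple array operations. A minimal NumPy sketch (the function names and the 32→24 crop size are illustrative assumptions, not the repo's code — torchvision's `RandomHorizontalFlip` and `RandomCrop` do this for real pipelines):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_horizontal_flip(img, p=0.5):
    # img: H x W x C array; flip left-right with probability p.
    return img[:, ::-1, :] if rng.random() < p else img

def random_crop(img, size):
    # Crop a size x size window at a uniformly random position.
    h, w, _ = img.shape
    top = rng.integers(0, h - size + 1)
    left = rng.integers(0, w - size + 1)
    return img[top:top + size, left:left + size, :]

img = rng.random((32, 32, 3))          # a fake 32x32 RGB image
out = random_crop(random_horizontal_flip(img), 24)
```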
into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20139a png day140 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about fine tuning the model pretrain neural networks normalization of images mean and standard deviation defining and initializing the model cross entropy loss function data loader class learning rate and stochastic gradient descent model parameters transfer learning source model and target model weights and biases and few more topics related to the same from here i have presented the implementation of normalization of images flipping and cropping the images and training pretrained model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20140 png day141 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about object detection and object recognition image classification and computer vision images and bounding boxes target location and axis coordinates and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about regular expressions disjunction grouping and precedence precision and recall substitution and capture groups lookahead assertions words corpora and few more topics related to the same i have presented the simple implementation of object detection and bounding boxes using pytorch here in the snapshots i hope you will gain some insights and work on the 
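The per-channel normalization with mean and standard deviation used when fine-tuning a pretrained model can be sketched as below. The channel statistics are the widely used ImageNet values; the `normalize` helper is illustrative, standing in for `torchvision.transforms.Normalize`:

```python
import numpy as np

# ImageNet channel statistics commonly used with pretrained models.
MEAN = np.array([0.485, 0.456, 0.406])
STD = np.array([0.229, 0.224, 0.225])

def normalize(img):
    # img: H x W x 3 with values in [0, 1]; standardize each RGB channel
    # so inputs match what the pretrained network saw during training.
    return (img - MEAN) / STD

img = np.full((4, 4, 3), 0.5)
out = normalize(img)
```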
same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20141 png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20141a png day142 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision anchor boxes object detection algorithms bounding boxes generating multiple anchor boxes computation complexity sizes and ratios and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about text normalization unix tools for crude tokenization and normalization word tokenization named entity detection penn treebank tokenization and few more topics related to the same from here i have presented the implementation of generating anchor boxes object detection and bounding boxes using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20142 png day143 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision generating multiple anchor boxes batch size coordinate values intersection over union algorithm jaccard index computation complexity sizes and ratios and few more 
topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about byte pair encoding algorithm for tokenization subword tokens wordpiece and greedy tokenization algorithm maximum matching algorithm word normalization lemmatization and stemming the porter stemmer and few more topics related to the same from here i have presented the implementation of generating anchor boxes and intersection over union algorithm using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20143 png day144 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision labeling training set anchor boxes object detection and image recognition ground truth bounding box index anchor boxes and offset boxes intersection over union and jaccard algorithm and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about sentence segmentation the minimum edit distance algorithm viterbi algorithm n gram language models probability spelling correction and grammatical error correction and few more topics related to the same from here i have presented the implementation of labeling training set anchor boxes and initializing offset boxes using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into 
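The intersection over union (Jaccard index) mentioned above is the ratio of the overlap area of two boxes to the area of their union. A self-contained sketch with corner-format boxes:

```python
def iou(box_a, box_b):
    # Boxes as (x1, y1, x2, y2); IoU = intersection area / union area.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Two unit-offset 2x2 boxes: overlap area 1, union area 7.
score = iou((0, 0, 2, 2), (1, 1, 3, 3))
```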
deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20144 png day145 of 300daysofdata image segmentation image segmentation is the process of partitioning digital image into multiple segments or set of pixels the goal of segmentation is to simplify the representation of image into something meaningful and easier to analyze on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about non maximum suppression algorithms prediction bounding boxes ground truth bounding boxes confidence level batch size intersection over union algorithm or jaccard index aspect ratios bounding boxes for prediction multi box target function anchor boxes and few more topics related to the same from here i have presented the implementation of initializing multi box anchor boxes and initializing prediction bounding boxes using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20145 png day146 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about multiscale object detection generating multiple anchor boxes object detection single shot multiple detection algorithm category prediction layer bounding boxes prediction layer concatenating predictions for multiple scales height and width down sample block cnn layer relu and max pooling layer and few more topics related to the same from here i have also spend some time reading the book speech and language 
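The non-maximum suppression algorithm described above can be sketched greedily: keep the highest-confidence prediction, discard every remaining box whose IoU with it exceeds a threshold, and repeat. A minimal self-contained version (illustrative, not the book's vectorized implementation):

```python
def iou(a, b):
    # IoU of two (x1, y1, x2, y2) boxes.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union

def nms(boxes, scores, iou_threshold=0.5):
    # Greedy NMS: keep the best-scoring box, drop overlapping rivals, repeat.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[i], boxes[best]) < iou_threshold]
    return keep

boxes = [(0, 0, 2, 2), (0, 0, 2, 2.2), (5, 5, 6, 6)]
scores = [0.9, 0.8, 0.7]
keep = nms(boxes, scores, iou_threshold=0.5)
```

The first two boxes overlap heavily, so only the higher-scoring one survives; the distant third box is kept.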
processing here i have read about part of speech tagging information extraction named entity recognition regular expressions and few more topics related to the same from here i have presented the implementation of initializing category prediction layer and height width down sample block using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20146 png day147 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about single shot multi box detection algorithm the base neural network height width down sample block category prediction layer bounding box prediction layer multiscale feature blocks the sequential api and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about n gram language models chain rule of probability markov models maximum likelihood estimation relative frequency evaluating language models log probabilities perplexity generalization zeros sparsity and few more topics related to the same from here i have presented the implementation of base ssd network and complete ssd model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images 
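For the category and bounding box prediction layers mentioned above, the channel arithmetic is the key detail: at every spatial position the category layer predicts one score per anchor per class plus a background class, and the box layer predicts four offsets per anchor. A tiny sketch of that bookkeeping (helper names are my own):

```python
def cls_pred_channels(num_anchors, num_classes):
    # One score per anchor per class, plus one background class per anchor.
    return num_anchors * (num_classes + 1)

def bbox_pred_channels(num_anchors):
    # Four offsets (dx, dy, dw, dh) per anchor.
    return num_anchors * 4

cls_ch = cls_pred_channels(num_anchors=5, num_classes=10)
bbox_ch = bbox_pred_channels(num_anchors=5)
```

These counts become the `out_channels` of the 3x3 convolutions that form the SSD prediction heads.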
day 20147 png day148 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about single shot multi box detection model implementation of tiny ssd model forward propagation function data reading and initialization object detection multi scale feature block global max pooling layer and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about unknown words or out of vocabulary words oov rate smoothing laplace smoothing text classification add one smoothing mle add k smoothing and few more topics related to the same from here i have presented the implementation of single shot multi box detection model and dataset initialization using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20148 png day149 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about softmax activation function convolutional layer training the single shot multi box detection model multi scale anchor boxes cross entropy loss function l1 normalization loss function average absolute error accuracy rate category and offset losses and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about backoff and interpolation katz backoff kneser ney smoothing absolute discounting the web and stupid backoff perplexity relation to entropy and few more topics 
related to the same from here i have presented the implementation of training single shot multi box detection model loss and evaluation functions using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20149 png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20149a png day150 of 300daysofdata image segmentation image segmentation is the process of partitioning a digital image into multiple segments or sets of pixels the goal of segmentation is to simplify the representation of an image into something meaningful and easier to analyze on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about region based convolutional neural networks fast rcnn faster rcnn mask rcnn category prediction layer bounding boxes prediction layer support vector machines roi pooling layer and roi alignment layer pixel level semantics image segmentation and instance segmentation pascal voc2012 semantic segmentation rgb data preprocessing and few more topics related to the same from here i have presented the implementation of semantic segmentation and data preprocessing using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20150a png image https github com thinamxx 300days machinelearningdeeplearning
blob main images day 20150b png day151 of 300daysofdata sequence to sequence model sequence to sequence neural networks can be built with a modular and reusable encoder and decoder architecture the encoder model generates a thought vector which is a dense and fixed dimension vector representation of the data the decoder model uses thought vectors to generate output sequences on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about dataset classes for custom semantic segmentation rgb channels normalization of images random cropping operation sequence to sequence recurrent neural networks label encoder one hot encoder encoding and vectorization long short term memory or lstm and few more topics related to the same from here i have presented the implementation of dataset classes for custom semantic segmentation using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20151 png day152 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about transposed convolutional layer cnns basic 2d transposed convolution broadcasting matrices kernel size padding strides and channels analogy to matrix transposition matrix multiplication and matrix vector multiplication and few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have learned about naive bayes and sentiment classification text categorization spam detection probabilistic classifier multinomial nb classifier bag of words mlp unknown and stop words and few
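The basic 2D transposed convolution covered in the day152 entry can be written directly: each input element scales the kernel, and the scaled copies are accumulated into an output of size (h + kh − 1) × (w + kw − 1) for stride 1 and no padding. A plain-Python sketch using the book's standard 2x2 example:

```python
def trans_conv2d(X, K):
    # Basic 2-D transposed convolution, stride 1, no padding:
    # scatter-add X[i][j] * K into the output at offset (i, j).
    h, w = len(X), len(X[0])
    kh, kw = len(K), len(K[0])
    Y = [[0.0] * (w + kw - 1) for _ in range(h + kh - 1)]
    for i in range(h):
        for j in range(w):
            for a in range(kh):
                for b in range(kw):
                    Y[i + a][j + b] += X[i][j] * K[a][b]
    return Y

Y = trans_conv2d([[0, 1], [2, 3]], [[0, 1], [2, 3]])
```

In PyTorch the same operation is `nn.ConvTranspose2d`; this loop version just makes the scatter-add explicit.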
more topics related to the same from here i have presented the implementation of transposed convolution padding strides and matrix multiplication using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20152 png day153 of 300daysofdata transposed convolution transposed convolution implies that stride padding do not correspond to the number of zeros added around the image and the amount of shift in the kernel when sliding it across the input as they would in a standard convolution operation on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about fully convolutional neural networks semantic segmentation principles transposed convolutional layer constructing a pretrained neural networks model global average pooling layer flattening layer image processing and upsampling bilinear interpolation kernel function and few more topics related to the same from here i have presented the implementation of fully convolutional layer pretrained nns bilinear interpolation kernel function and transposed convolutional layer using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20153 png day154 of 300daysofdata neural style transfer algorithms it is the task of changing the style of an image in one domain to the style of an image 
in another domain it manipulates images or videos in order to adopt the appearance of another image on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about softmax cross entropy loss function stochastic gradient descent cnns neural networks style transfer composite images rgb channels normalization and few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have learned about optimizing naive bayes for sentiment analysis sentiment lexicons naive bayes as language models precision recall and fmeasure multi label and multinomial classification and few more topics related to the same from here i have started working on style transfer using neural networks the notebook is mentioned below though i am still working on it book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 neural networks style transfer https github com thinamxx neural style transfer image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20154 png day155 of 300daysofdata neural style transfer algorithms it is the task of changing the style of an image in one domain to the style of an image in another domain it manipulates images or videos in order to adopt the appearance of another image on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about neural networks style transfer convolutional neural networks reading the content and style images preprocessing and postprocessing the images extracting image features composite images vgg neural networks squared error loss function total variation loss function normalization of rgb channels of images and few more topics related to the same from here i am still working on style transfer using neural networks the
notebook is mentioned below though i am still working on it i have presented the implementation of function for extracting features and square error loss function using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 neural networks style transfer https github com thinamxx neural style transfer image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20155 png day156 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about creating and initializing the composite images synchronization functions adam optimizer gram matrix convolutional neural networks neural networks style transfer loss functions and few more topics related to the same from here i have also spend some time reading the book speech and language processing here i have learned about test sets and cross validation statistical significance testing naive bayes classifiers bootstrapping logistic regression generative and discriminative classifiers feature representation sigmoid classification weight and bias term and few more topics related to the same from here i have completed working on style transfer using neural networks the notebook is mentioned below but i am still updating book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 neural networks style transfer https github com thinamxx neural style transfer image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20156a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20156b png day157 of 
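The Gram matrix mentioned in the day156 entry captures style as the correlations between channel activations: flatten each channel's feature map into a row, take inner products between rows, and normalize by the number of elements so layers of different sizes contribute on a comparable scale. A minimal NumPy sketch of that computation:

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) activation map from one CNN layer.
    c, h, w = features.shape
    F = features.reshape(c, h * w)
    # Channel-by-channel inner products, normalized by element count.
    return F @ F.T / (c * h * w)

G = gram_matrix(np.ones((2, 3, 3)))
```

The style loss then compares the Gram matrices of the composite and style images, layer by layer.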
300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision image classification cifar10 dataset obtaining and organizing the dataset augmentation and few more topics related to the same apart from that i have learned about data scraping and scrapy named entity recognition and spacy trained transformer model using spacy geocoding and few more topics related to the same from here i have completed working on style transfer using neural networks notebook i have started working on object recognition on images cifar10 notebook all the notebooks are mentioned below i have presented the implementation of obtaining and organizing the cifar10 dataset here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned above and below excited about the days ahead book dive into deep learning https d2l ai index html neural networks style transfer https github com thinamxx neural style transfer object recognition on images cifar10 https github com thinamxx cifar10 recognition image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20157 png day158 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision image classification image augmentation and overfitting normalization of rgb channels data loader and validation set and few more topics related to the same from here apart from that i have learned about stanford ner algorithms nltk named entity recognition and few more topics related to the same i have completed working on style transfer using neural networks notebook i have started working on object recognition on images cifar10 notebook all the notebooks are mentioned below i have presented the implementation of 
obtaining and organizing the dataset image augmentation and normalization using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html neural networks style transfer https github com thinamxx neural style transfer object recognition on images cifar10 https github com thinamxx cifar10 recognition image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20158 png day159 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about computer vision resnet model and residual blocks xavier random initialization cross entropy loss function defining training functions stochastic gradient descent learning rate scheduler evaluation metrics and few more topics related to the same i have also spend some time reading the book speech and language processing here i have learned about sentiment classification learning in logistic regression conditional mle cost function and few more topics related to the same from here i am working on object recognition on images cifar10 notebook the notebook is mentioned below i have presented the implementation defining a training function using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html object recognition on images cifar10 https github com thinamxx cifar10 recognition image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20159 png day160 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep 
learning here i have learned about imagenet dataset obtaining and organizing the dataset image augmentation such as flipping and resizing the image changing brightness and contrast of image transfer learning and features normalization of images and few more topics related to the same from here i have completed working on object recognition on images cifar10 notebook i have started working on dog breed identification imagenet notebook all the notebooks are mentioned below i have presented the implementation of image augmentation and normalization defining neural networks model and loss function using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html object recognition on images cifar10 https github com thinamxx cifar10 recognition dog breed identification imagenet https github com thinamxx dogbreedclassification image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20160 png day161 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about defining the training functions computer vision hyperparameters stochastic gradient descent optimization function learning rate scheduler and optimization training loss and validation loss and few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have learned about gradient for logistic regression sgd algorithm minibatch training and few more topics related to the same from here i am working on dog breed identification imagenet notebook the notebook is mentioned below i have presented the implementation of defining the training function using pytorch here in the snapshot i hope you will gain some insights and work
on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html speech and language processing https web stanford edu jurafsky slp3 object recognition on images cifar10 https github com thinamxx cifar10 recognition dog breed identification imagenet https github com thinamxx dogbreedclassification image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20161 png day162 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about pretrained text representations word embedding and word2vec one hot vectors the skip gram model and training the continuous bag of words model and training approximate training negative sampling hierarchical softmax reading and processing the dataset subsampling vocabulary and few more topics related to the same from here apart from that i have also read about improving chemical autoencoders latent space and molecular diversity with hetero encoders i am working on dog breed identification imagenet notebook the notebook is mentioned below i have presented the implementation of reading and preprocessing the dataset subsampling and comparison using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html object recognition on images cifar10 https github com thinamxx cifar10 recognition dog breed identification imagenet https github com thinamxx dogbreedclassification image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20162 png day163 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into
**Day164 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about subsampling and negative sampling, word embedding and word2vec probabilities, reading into batches, concatenation and padding, and random minibatches, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I learned about vector semantics and embeddings, lexical semantics, lemmas and senses, word sense disambiguation, word similarity, the principle of contrast, representation learning, and synonymy. I have presented the implementation of negative sampling using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20164.PNG)

**Day165 of 300DaysOfData!**
- **Subsampling**: Subsampling is a method that reduces data size by selecting a subset of the original data, where the subset is specified by choosing a parameter. It attempts to minimize the impact of high-frequency words on the training of a word embedding model.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about word embeddings, batches, the loss function and padding, center and context words, negative sampling, data loader instances, vocabulary, subsampling, data iterations, and mask variables, along with a few more related topics. I have presented the implementation of reading batches and of a function for loading the PTB dataset using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20165a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20165b.PNG)

**Day166 of 300DaysOfData!**
- **Word Embedding**: Word embedding is a term used for the representation of words for text analysis, typically in the form of a real-valued vector that encodes the meaning of a word such that words closer in the vector space are expected to be similar in meaning.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about word embedding and word2vec, the skip-gram model, the embedding layer, word vectors, skip-gram forward calculation, batch matrix multiplication, the binary cross-entropy loss function, negative sampling, mask variables and padding, and initializing model parameters, along with a few more related topics. I have presented the implementation of the embedding layer, the skip-gram model forward calculation, and the binary cross-entropy loss function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20166.PNG)
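The subsampling heuristic mentioned in the last few entries can be sketched without any framework: a token `w` is discarded with probability `max(1 - sqrt(t / f(w)), 0)`, where `f(w)` is its relative frequency. The function name, threshold, and toy sentence below are my own illustrative choices:

```python
import math
import random
from collections import Counter

def subsample(sentences, t=1e-4, rng=random.Random(42)):
    """Randomly drop tokens, discarding frequent words with probability
    max(1 - sqrt(t / f(w)), 0), the word2vec subsampling heuristic."""
    counts = Counter(tok for s in sentences for tok in s)
    total = sum(counts.values())

    def keep(tok):
        freq = counts[tok] / total
        return rng.random() >= max(1 - math.sqrt(t / freq), 0)

    return [[tok for tok in s if keep(tok)] for s in sentences]

# 'the' dominates the toy corpus, so most of its occurrences are dropped;
# 'joins' has f(w) <= t, so its discard probability is 0 and it is kept.
sentences = [['the'] * 20000 + ['joins']]
subsampled = subsample(sentences)
```

Rare words survive untouched while the high-frequency filler shrinks dramatically, which is exactly the effect the book describes for the PTB corpus.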
**Day167 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training the skip-gram model, the loss function, applying the word embedding model, negative sampling, word embedding with Global Vectors (GloVe), conditional probability, the GloVe model, and the cross-entropy loss function, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I learned about word relatedness, semantic fields, semantic frames and roles, connotation and sentiment, and vector semantics and embeddings. I have presented the implementation of training the word embedding model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20167.PNG)

**Day168 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about subword embedding, fastText, byte pair encoding, finding synonyms and analogies, pretrained word vectors, token embedding, and central and context words, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I learned about words and vectors, vectors and documents, term-document matrices, information retrieval, and row vectors and the context matrix. I have presented the implementation of defining the token embedding class using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20168.PNG)

**Day169 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about finding synonyms and analogies, word embedding models and word2vec, applying pretrained word vectors, and cosine similarity, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I read about cosine for measuring similarity, dot and inner products, weighing terms in the vector, term frequency-inverse document frequency (TF-IDF), collection frequency, and applications of the TF-IDF vector model. I have presented the implementation of cosine similarity and of finding synonyms and analogies using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20169.PNG)
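The cosine-similarity synonym search described above reduces to a few lines of plain Python. The toy embedding table and function names below are invented for illustration; real runs use pretrained GloVe vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest(query, embeddings, k=2):
    """Return the k words whose vectors are most cosine-similar to `query`."""
    ranked = sorted(embeddings,
                    key=lambda w: cosine(embeddings[w], embeddings[query]),
                    reverse=True)
    return [w for w in ranked if w != query][:k]

# Tiny hand-made 3-d "embedding table" for demonstration only.
toy = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
    "fruit": [0.12, 0.25, 0.85],
}
```

With vectors chosen this way, "queen" is the nearest neighbor of "king" and "fruit" of "apple", mirroring the synonym lookups shown in the snapshot.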
**Day170 of 300DaysOfData!**
- **Bidirectional Encoder Representations from Transformers (BERT)**: ELMo encodes context bidirectionally but uses task-specific architectures, while GPT is task-agnostic but encodes context only left to right. BERT encodes context bidirectionally and requires minimal architecture changes for a wide range of NLP tasks.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the BERT architecture, moving from context-independent to context-sensitive word embeddings (word2vec), moving from task-specific to task-agnostic models, the Embeddings from Language Models (ELMo) architecture, input representations, token, segment, and positional embeddings, and learnable positional embeddings, along with a few more related topics. I have presented the implementation of BERT input representations and the BERT encoder class using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20170.PNG)

**Day171 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the BERT encoder class, pretraining tasks, masked language modeling, the multilayer perceptron, forward inference, BERT input sequences, and bidirectional context encoding, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I learned about pointwise mutual information (PMI), Laplace smoothing, word2vec skip-gram with negative sampling (SGNS), the classifier, logistic and sigmoid functions, and cosine similarity and dot products. I have presented the implementation of masked language modeling and the BERT encoder using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20171.PNG)

**Day172 of 300DaysOfData!**
- **BERT Notes**: The BERT input embeddings are the sum of the token, segment, and positional embeddings.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the Bidirectional Encoder Representations from Transformers (BERT) architecture, the next sentence prediction model, the cross-entropy loss function, the MLP, the BERT model, masked language modeling, the BERT encoder, and pretraining the BERT model, along with a few more related topics. I have presented the implementation of next sentence prediction and the BERT model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20172.PNG)

**Day173 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about pretraining the BERT model and its dataset, defining helper functions for the pretraining tasks, generating the next sentence prediction task, generating the masked language modeling task, and sequence tokens, along with a few more related topics. I have also spent some time reading the book **Speech and Language Processing**, where I read about learning skip-gram embeddings, the binary classifier, target and context embeddings, visualizing embeddings, and semantic properties of embeddings. I have presented the implementation of generating the next sentence prediction task and generating the masked language modeling task using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Books: [**Dive into Deep Learning**](https://d2l.ai/index.html), [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20173a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20173b.PNG)

**Day174 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about pretraining the BERT model, the next sentence prediction task and the masked language modeling task, and transforming text into the pretraining dataset, along with a few more related topics. I have also learned about scorer and example instances of the spaCy model, long short-term memory neural networks, SMILES vectorizers, and feed-forward neural networks. I have presented the implementation of transforming text into the pretraining dataset using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20174a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20174b.PNG)
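The BERT input representation described in these entries (a `<cls>` token, sentence A, a `<sep>`, and optionally sentence B with its own `<sep>`, plus 0/1 segment ids) can be sketched as a d2l-style helper. The exact special-token strings here are illustrative:

```python
def get_tokens_and_segments(tokens_a, tokens_b=None):
    """Build a BERT-style input sequence and its segment ids:
    '<cls>' + A + '<sep>' (+ B + '<sep>'), with segment id 0 for the
    first sentence (and its special tokens) and 1 for the second."""
    tokens = ['<cls>'] + tokens_a + ['<sep>']
    segments = [0] * (len(tokens_a) + 2)
    if tokens_b is not None:
        tokens += tokens_b + ['<sep>']
        segments += [1] * (len(tokens_b) + 1)
    return tokens, segments

pair_tokens, pair_segments = get_tokens_and_segments(['a', 'b'], ['c'])
```

The token and segment lists always have the same length, which is what lets the token, segment, and positional embeddings be summed elementwise.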
**Day175 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about pretraining the BERT model, the cross-entropy loss function, the Adam optimization function, zeroing gradients, backpropagation and optimization, and the masked language modeling loss and next sentence prediction loss, along with a few more related topics. I have presented the implementation of pretraining the BERT model, getting the loss from the BERT model, and training the neural network model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20175a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20175b.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20175c.PNG)

**Day176 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about natural language processing applications, NLP architectures and pretraining, sentiment analysis and its dataset, text classification, tokenization and vocabulary, and padding tokens to the same length, along with a few more related topics. Apart from that, I have also learned about named entity recognition, frequency distributions, NLTK, and extending lists. I have presented the implementation of reading the dataset, tokenization and vocabulary, and padding to a fixed length using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebook: [**Sentiment Analysis Dataset**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20Dataset.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20176.PNG)

**Day177 of 300DaysOfData!**
- **Sentiment Analysis**: Sentiment analysis is the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study affective states and subjective information. It is widely applied to voice-of-the-customer materials such as reviews and survey responses, online and social media, and healthcare materials, for applications ranging from marketing to customer service to clinical medicine.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about creating data iterators, tokenization and vocabulary, truncating and padding, recurrent neural network models for sentiment analysis, pretrained word vectors and GloVe, the bidirectional LSTM and the embedding layer, the linear layer and decoding, encoding sequence data, and Xavier initialization, along with a few more related topics. I have presented the implementation of the bidirectional recurrent neural network model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Sentiment Analysis Dataset**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20Dataset.ipynb), [**Sentiment Analysis with RNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20RNN.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20177.PNG)
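The truncate-or-pad step used when batching reviews to a fixed length is simple enough to sketch without a framework; the function name and padding token below follow the d2l convention but are my own reconstruction:

```python
def truncate_pad(line, num_steps, padding_token='<pad>'):
    """Clip a token sequence to `num_steps` tokens, or pad it with
    `padding_token` until it reaches exactly that length, so every
    example in a minibatch has the same shape."""
    if len(line) > num_steps:
        return line[:num_steps]          # truncate long sequences
    return line + [padding_token] * (num_steps - len(line))  # pad short ones

short = truncate_pad(['i', 'love', 'it'], 5)
long = truncate_pad(['a', 'b', 'c', 'd', 'e', 'f'], 4)
```

Downstream, a mask over the padding positions keeps the loss from being computed on `<pad>` tokens.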
**Day178 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about word vectors and vocabulary, training and evaluating the bidirectional RNN model, sentiment analysis with one-dimensional convolutional neural networks, the one-dimensional cross-correlation operation, the max-over-time pooling layer, the textCNN model, the ReLU activation function, and the dropout layer, along with a few more related topics. I have presented the implementation of the text convolutional neural network using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Sentiment Analysis Dataset**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20Dataset.ipynb), [**Sentiment Analysis with RNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20RNN.ipynb), [**Sentiment Analysis with CNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20CNN.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20178.PNG)

**Day179 of 300DaysOfData!**
- **Natural Language Inference**: Natural language inference studies whether a hypothesis can be inferred from a premise, where both are text sequences. It determines the logical relationship between a pair of text sequences.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about natural language inference and its dataset, premise and hypothesis, entailment, contradiction, and neutral, the Stanford Natural Language Inference (SNLI) dataset, and reading the SNLI dataset, along with a few more related topics. I have presented the implementation of reading the SNLI dataset using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Sentiment Analysis Dataset**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20Dataset.ipynb), [**Sentiment Analysis with RNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20RNN.ipynb), [**Sentiment Analysis with CNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20CNN.ipynb), [**Natural Language Inference Dataset**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NaturalLanguage%20Inference%20Data.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20179.PNG)

**Day180 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about natural language inference and the SNLI dataset, premises, hypotheses, and labels, vocabulary, padding and truncation of sequences, and the Dataset and DataLoader modules, along with a few more related topics. Apart from that, I have also read about confusion matrices and classification reports, and frequency distributions and word clouds of text data. I have presented the implementation of loading the SNLI dataset using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Sentiment Analysis Dataset**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20Dataset.ipynb), [**Sentiment Analysis with RNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20RNN.ipynb), [**Sentiment Analysis with CNN**](https://github.com/ThinamXX/NeuralNetworks-SentimentAnalysis/blob/master/PyTorch/Sentiment%20Analysis%20CNN.ipynb), [**Natural Language Inference Dataset**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NaturalLanguage%20Inference%20Data.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20180.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20180a.PNG)
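The one-dimensional cross-correlation behind the textCNN model from the Day178 entry is easy to verify by hand. This is a framework-free sketch with an illustrative input taken to match the book's classic `[0..6] * [1, 2]` example:

```python
def corr1d(x, kernel):
    """One-dimensional cross-correlation: slide `kernel` across `x`
    and take a dot product at each valid position."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def max_over_time(channel):
    """Max-over-time pooling keeps only the largest activation in a
    channel, so the output size is independent of sequence length."""
    return max(channel)

activations = corr1d([0, 1, 2, 3, 4, 5, 6], [1, 2])  # [2, 5, 8, 11, 14, 17]
pooled = max_over_time(activations)
```

Because `max_over_time` collapses each channel to a single number, textCNN can concatenate channels from kernels of different widths into one fixed-size feature vector.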
**Day181 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about natural language inference using attention models, multilayer perceptrons with attention mechanisms, alignment of premises and hypotheses, and word embeddings and attention weights, along with a few more related topics. I have presented the implementation of the MLP and the attention mechanism using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference Dataset**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NaturalLanguage%20Inference%20Data.ipynb), [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20181.PNG)

**Day182 of 300DaysOfData!**
- **Comparing and Aggregating Classes**: The comparing class compares a word in one sequence with the other sequence that is softly aligned with that word. The aggregating class aggregates the two sets of comparison vectors to infer the logical relationship, feeding the concatenation of both summarization results into an MLP to obtain the classification result for the logical relationship.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about comparing word sequences, soft alignment, the multilayer perceptron classifier, aggregating comparison vectors, linear layers and concatenation, the decomposable attention model, and the embedding layer, along with a few more related topics. I have presented the implementation of the comparing class, the aggregating class, and the decomposable attention model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference Dataset**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NaturalLanguage%20Inference%20Data.ipynb), [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20182.PNG)
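The soft-alignment step of the decomposable attention model described above can be sketched framework-free: score every premise/hypothesis token pair with a dot product, softmax the scores, and average the hypothesis vectors under those weights. The function names and toy vectors are illustrative:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def soft_align(premise_vecs, hypothesis_vecs):
    """For each premise token vector, compute attention weights over the
    hypothesis tokens from dot-product scores, then return the weighted
    average (softly aligned) hypothesis vector."""
    aligned = []
    dim = len(hypothesis_vecs[0])
    for a in premise_vecs:
        scores = [sum(x * y for x, y in zip(a, b)) for b in hypothesis_vecs]
        weights = softmax(scores)
        aligned.append([sum(w * b[d] for w, b in zip(weights, hypothesis_vecs))
                        for d in range(dim)])
    return aligned

# The premise vector points along the first axis, so it aligns almost
# entirely with the first hypothesis token.
aligned = soft_align([[1.0, 0.0]], [[10.0, 0.0], [0.0, 10.0]])
```

In the full model the same idea runs in both directions, and MLPs transform the vectors before and after alignment; this sketch isolates only the attention arithmetic.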
**Day183 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the decomposable attention model, the embedding and linear layers, training and evaluating the attention model, natural language inference (entailment, contradiction, and neutral), pretrained GloVe embeddings, the SNLI dataset, the Adam optimizer and cross-entropy loss function, and premises and hypotheses, along with a few more related topics. I have presented the implementation of training and evaluating the attention model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference Dataset**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NaturalLanguage%20Inference%20Data.ipynb), [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20183.PNG)

**Day184 of 300DaysOfData!**
- **BERT Model Notes**: BERT requires minimal architecture changes for sequence-level and token-level NLP applications such as single text classification, text pair classification or regression, and text tagging.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about fine-tuning BERT for sequence-level and token-level applications, single text classification, text pair classification or regression, text tagging, question answering, natural language inference with a pretrained BERT model, loading the pretrained BERT model and parameters, semantic textual similarity, and POS tagging, along with a few more related topics. I have presented the implementation of loading the pretrained BERT model and parameters using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb), [**Natural Language Inference: BERT**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20BERT.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20184.PNG)

**Day185 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about loading the pretrained BERT model and parameters, the dataset for fine-tuning the BERT model, premise, hypothesis, and input sequences, tokenization and vocabulary, truncating and padding tokens, and natural language inference, along with a few more related topics. I have presented the implementation of the dataset for fine-tuning the BERT model and of generating training and test examples using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb), [**Natural Language Inference: BERT**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20BERT.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20185a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20185b.PNG)
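When building BERT fine-tuning inputs from a premise/hypothesis pair, the pair has to fit in the model's maximum length with three slots reserved for `<cls>` and two `<sep>` tokens. A common strategy, sketched here with an illustrative function name, trims the longer sequence one token at a time:

```python
def truncate_pair_of_tokens(tokens_a, tokens_b, max_len):
    """Trim the longer of the two token lists, one token at a time,
    until the pair fits in `max_len` with 3 positions reserved for
    the '<cls>' token and two '<sep>' tokens. Mutates in place."""
    while len(tokens_a) + len(tokens_b) > max_len - 3:
        if len(tokens_a) > len(tokens_b):
            tokens_a.pop()
        else:
            tokens_b.pop()
    return tokens_a, tokens_b

premise, hypothesis = truncate_pair_of_tokens(
    list('abcdef'), list('xy'), max_len=8)
```

Trimming from the longer side keeps as much of both sequences as possible, rather than cutting the pair from one end only.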
**Day186 of 300DaysOfData!**
- **Generative Adversarial Networks**: Generative adversarial networks consist of two deep networks, a generator and a discriminator. The generator produces images as close to the true images as possible to fool the discriminator, by maximizing the cross-entropy loss; the discriminator tries to distinguish the generated images from the true images, by minimizing the cross-entropy loss.
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about generative adversarial networks, generator and discriminator networks, and updating the discriminator, along with a few more related topics. I have also read about recommender systems, collaborative filtering, explicit and implicit feedback, and recommendation tasks. I have presented a simple implementation of the generator and discriminator networks and optimization using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb), [**Natural Language Inference: BERT**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20BERT.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20186.PNG)

**Day187 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about generator and discriminator networks, the binary cross-entropy loss function, the Adam optimizer, normalized tensors, Gaussian distributions, and real versus generated data, along with a few more related topics. I have presented a simple implementation of updating the generator and the training function using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Dive into Deep Learning**](https://d2l.ai/index.html)
- Notebooks: [**Natural Language Inference: Attention**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20Attention.ipynb), [**Natural Language Inference: BERT**](https://github.com/ThinamXX/Natural-Language-Inference/blob/main/NL%20Inference%20BERT.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20187a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20187b.PNG)
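The two binary cross-entropy objectives described in these GAN entries reduce to a couple of log terms, sketched here in plain Python (the non-saturating generator loss is one common choice, not necessarily the exact variant in the snapshot):

```python
import math

def discriminator_loss(d_real, d_fake):
    """BCE objective for the discriminator: push D(x) toward 1 on real
    data and D(G(z)) toward 0 on generated data. Inputs are the
    discriminator's sigmoid outputs in (0, 1)."""
    return -(math.log(d_real) + math.log(1 - d_fake))

def generator_loss(d_fake):
    """Non-saturating generator objective: minimize -log D(G(z)),
    i.e. try to make the discriminator score fakes as real."""
    return -math.log(d_fake)
```

A confident, correct discriminator (`d_real` near 1, `d_fake` near 0) drives its loss toward 0, while a fooled discriminator (`d_fake` near 1) drives the generator's loss toward 0; the training loop alternates gradient steps on the two objectives.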
300days machinelearningdeeplearning blob main images day 20187a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20187b png day188 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks generator and discriminator the generator generates the image as much closer to the true image as possible to fool discriminator by maximizing the cross entropy loss the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about deep convolutional generative adversarial networks the pokemon dataset resizing and normalization dataloader the generator block module transposed convolution layer batch normalization layer relu activation function and few more topics related to the same from here i have also read about inter quartile range mean absolute deviation box plots density plots frequency tables and few more topics related to the same i have presented the implementation of the generator block and pokemon dataset using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book dive into deep learning https d2l ai index html deep convolutional gan https github com thinamxx gan blob main deep 20gan ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20188a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20188b png day189 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks generator and discriminator the generator generates the image as much closer to the true image as possible to fool discriminator 
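The generator and discriminator updates described in the entries above can be sketched in PyTorch. This is a minimal illustrative sketch, assuming toy fully connected networks and Gaussian "real" data rather than the book's DCGAN/Pokemon setup; the helper names `update_D` and `update_G` echo the training functions mentioned in the Day187 entry but are not the book's exact code.

```python
import torch
from torch import nn

latent_dim, data_dim = 4, 2  # illustrative sizes, not the book's

# Toy fully connected generator and discriminator (assumptions).
net_G = nn.Sequential(nn.Linear(latent_dim, 8), nn.ReLU(), nn.Linear(8, data_dim))
net_D = nn.Sequential(nn.Linear(data_dim, 8), nn.ReLU(), nn.Linear(8, 1))
loss_fn = nn.BCEWithLogitsLoss()  # binary cross-entropy on raw logits
opt_D = torch.optim.Adam(net_D.parameters(), lr=0.05)
opt_G = torch.optim.Adam(net_G.parameters(), lr=0.005)

def update_D(X, Z):
    """Discriminator step: push real data toward 1 and fakes toward 0."""
    ones = torch.ones(X.shape[0], 1)
    zeros = torch.zeros(Z.shape[0], 1)
    opt_D.zero_grad()
    # detach() so this step does not update the generator's weights
    loss = (loss_fn(net_D(X), ones) + loss_fn(net_D(net_G(Z).detach()), zeros)) / 2
    loss.backward()
    opt_D.step()
    return loss.item()

def update_G(Z):
    """Generator step: try to make the discriminator output 1 on fakes."""
    ones = torch.ones(Z.shape[0], 1)
    opt_G.zero_grad()
    loss = loss_fn(net_D(net_G(Z)), ones)  # fool D by maximizing its error
    loss.backward()
    opt_G.step()
    return loss.item()

# One training step on toy Gaussian "real" data.
X = torch.randn(16, data_dim) @ torch.tensor([[1.0, 0.5], [0.5, 1.0]])
Z = torch.randn(16, latent_dim)
d_loss, g_loss = update_D(X, Z), update_G(Z)
```

In a full training loop these two updates alternate every minibatch, which is the adversarial game the entries describe.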
by maximizing the cross-entropy loss. The discriminator tries to distinguish the generated images from the true images by minimizing the cross-entropy loss. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about deep convolutional generative adversarial networks, the generator and the discriminator networks, the Leaky ReLU activation function and the dying ReLU problem, batch normalization, the convolutional layer, stride and padding, and a few more topics related to the same from here. I have presented the implementation of the discriminator block and the generator block using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [**Dive into Deep Learning**](https://d2l.ai/index.html)
- [**Deep Convolutional GAN**](https://github.com/ThinamXx/GAN/blob/main/Deep%20GAN.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20189.PNG)

**Day190 of 300DaysOfData!**
- **Generative Adversarial Networks**: Generative adversarial networks consist of two deep networks: a generator and a discriminator. The generator generates images as close to the true images as possible, to fool the discriminator, by maximizing the cross-entropy loss. The discriminator tries to distinguish the generated images from the true images by minimizing the cross-entropy loss. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about deep convolutional generative adversarial networks, the generator and the discriminator blocks, the cross-entropy loss function, the Adam optimization function, and a few more topics related to the same from here. I have presented the implementation of training the generator and discriminator networks using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [**Dive into Deep Learning**](https://d2l.ai/index.html)
- [**Deep Convolutional GAN**](https://github.com/ThinamXx/GAN/blob/main/Deep%20GAN.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20190a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20190b.PNG)

**Day191 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have started reading and implementing from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about deep learning in practice, areas of deep learning, a brief history of neural networks, fastai and Jupyter notebooks, cat and dog classification, image loaders, pretrained models, ResNet and CNNs, the error rate, and a few more topics related to the same from here. I have presented the implementation of cat and dog classification using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Introduction Notebook**](https://github.com/ThinamXx/Fastai/blob/main/1.%20Introduction.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20191.PNG)

**Day192 of 300DaysOfData!**
- **Transfer Learning**: Transfer learning is defined as the process of using a pretrained model for a task different from the one it was originally trained for. Fine-tuning is a transfer learning technique that updates the parameters of a pretrained model by training for additional epochs, using a task different from the one used for pretraining. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about machine learning and
weight assignment, neural networks and stochastic gradient descent, limitations inherent to machine learning, image recognition, classification and regression, overfitting and the validation set, transfer learning, semantic segmentation, sentiment classification, data loaders, and a few more topics related to the same from here. I have presented the implementation of semantic segmentation and sentiment classification using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Introduction Notebook**](https://github.com/ThinamXx/Fastai/blob/main/1.%20Introduction.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20192.PNG)

**Day193 of 300DaysOfData!**
- **Transfer Learning**: Transfer learning is defined as the process of using a pretrained model for a task different from the one it was originally trained for. Fine-tuning is a transfer learning technique that updates the parameters of a pretrained model by training for additional epochs, using a task different from the one used for pretraining. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about tabular data and classification, tabular data loaders, categorical and continuous data, recommendation systems and collaborative filtering, datasets for models, validation sets and test sets, judgement in test sets, and a few more topics related to the same from here. I have presented the implementation of a tabular classification and a recommendation system model using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Introduction Notebook**](https://github.com/ThinamXx/Fastai/blob/main/1.%20Introduction.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20193.PNG)

**Day194 of 300DaysOfData!**
- **The Drivetrain Approach**: It can be stated as: start by considering your objective, then think about what actions you can take to meet that objective and what data you have, or can acquire, that can help, and then build a model that you can use to determine the best actions to take to get the best results in terms of your objective. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the practice of deep learning, the state of deep learning, computer vision, text and NLP, combining text and images, tabular data and recommendation systems, the drivetrain approach, gathering data with DuckDuckGo, the questionnaire, and a few more topics related to the same from here. I have presented the implementation of gathering data for object detection using DuckDuckGo and fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Detection**](https://github.com/ThinamXx/Fastai/blob/main/2.%20Model%20Production/BearDetector.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20194.PNG)

**Day195 of 300DaysOfData!**
- **The Drivetrain Approach**: It can be stated as: start by considering your objective, then think about what actions you can take to meet that objective and what data you have, or can acquire, that can help, and then build a model that you can use to determine the best actions to take to get the best results in terms of your objective. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book Deep Learning
for Coders with fastai and PyTorch. Here, I have read about fastai dependencies and functions, biased datasets, turning data into DataLoaders, the data block API, dependent and independent variables, random splitting, image transformations, and a few more topics related to the same from here. I have presented the implementation of gathering data and initializing data loaders using DuckDuckGo and fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Detection**](https://github.com/ThinamXx/Fastai/blob/main/2.%20Model%20Production/BearDetector.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20195.PNG)

**Day196 of 300DaysOfData!**
- **Data Augmentation**: Data augmentation refers to creating random variations of the input data, such that they appear different but do not change the meaning of the data. RandomResizedCrop is a specific example of data augmentation. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about data loaders, the image block, resizing, squishing and stretching images, padding images, data augmentation, image transformations, training the model and the error rate, random resizing and cropping, and a few more topics related to the same from here. I have presented the implementation of data loaders, data augmentation, and training the model using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Detection**](https://github.com/ThinamXx/Fastai/blob/main/2.%20Model%20Production/BearDetector.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20196.PNG)

**Day197 of 300DaysOfData!**
- **Data Augmentation**: Data augmentation refers to creating random variations of the input data, such that they appear different but do not change the meaning of the data. RandomResizedCrop is a specific example of data augmentation. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about training a pretrained model, data augmentation and transformations, classification interpretation and the confusion matrix, cleaning the dataset, inference, the model and its parameters, notebooks and widgets, and a few more topics related to the same from here. I have presented the implementation of classification interpretation, cleaning the dataset, inference, and the model and its parameters using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Detection**](https://github.com/ThinamXx/Fastai/blob/main/2.%20Model%20Production/BearDetector.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20197.PNG)

**Day198 of 300DaysOfData!**
- **Data Ethics**: Ethics refers to well-founded standards of right and wrong that prescribe what humans should do. It is the study and development of one's ethical standards. Recourse processes, feedback loops, and bias are key examples in data ethics. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about data ethics, bugs and recourse, feedback loops, bias, integrating machine learning with product design, training a digit classifier, pixels and computer vision, tenacity and deep learning, pixel similarity, list comprehensions, and a few more topics related to the same from here. I have presented the
simple implementation of pixels and computer vision using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20198.PNG)

**Day199 of 300DaysOfData!**
- **L1 and L2 Norm**: Taking the mean of the absolute value of differences is called the mean absolute difference, or L1 norm. Taking the mean of the square of differences and then taking the square root is called the root mean squared error, or L2 norm. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the rank of tensors, the mean absolute difference or L1 norm, the root mean squared error or L2 norm, NumPy arrays and PyTorch tensors, computing metrics using broadcasting, and a few more topics related to the same from here. I have presented a simple implementation of arrays and tensors and the L1 and L2 norm using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20199.PNG)

**Day200 of 300DaysOfData!**
- **L1 and L2 Norm**: Taking the mean of the absolute value of differences is called the mean absolute difference, or L1 norm. Taking the mean of the square of differences and then taking the square root is called the root mean squared error, or L2 norm. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about computing metrics using broadcasting, the mean absolute error, stochastic gradient descent, initializing parameters, the loss function, calculating gradients, backpropagation and derivatives, the learning rate, optimization, and a few more topics related to the same from here. I have presented a simple implementation of stochastic gradient descent using fastai here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20200.PNG)

**Day201 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the gradient descent process: initializing parameters, calculating and inspecting predictions, calculating the loss with MSE, calculating gradients via backpropagation, stepping the weights and updating parameters, repeating the process, and stopping the process, along with a few more topics related to the same from here. I have presented the implementation of the gradient descent process using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20201.PNG)
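The gradient descent process listed in the Day201 entry (initialize, predict, compute the loss, backpropagate, step the weights, repeat, stop) can be sketched end to end in plain PyTorch. The quadratic-fitting example, learning rate, and epoch count below are illustrative assumptions, not the book's exact code:

```python
import torch

torch.manual_seed(42)  # reproducible random initialization

def mse(preds, targets):
    # mean squared error, the loss used in the entry above
    return ((preds - targets) ** 2).mean()

def predict(x, params):
    a, b, c = params
    return a * x**2 + b * x + c               # step 2: calculate predictions

x = torch.linspace(-2, 2, steps=50)
targets = 3 * x**2 - 2 * x + 1                # ground-truth curve to recover

params = torch.randn(3, requires_grad=True)   # step 1: initialize parameters

for epoch in range(1000):                     # step 6: repeat the process
    loss = mse(predict(x, params), targets)   # step 3: calculate the loss
    loss.backward()                           # step 4: backpropagation fills params.grad
    with torch.no_grad():
        params -= 1e-2 * params.grad          # step 5: step the weights
        params.grad.zero_()                   # reset gradients for the next iteration
# step 7: stop (here, simply after a fixed number of epochs)
```

With these settings the recovered `params` approach the true coefficients (3, -2, 1); in practice one would stop when the validation loss stops improving rather than after a fixed epoch count.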
**Day202 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the MNIST loss function, matrices and vectors, independent variables, weights and biases, parameters, matrix multiplication and the Dataset class, the gradient descent process and the learning rate, the activation function, and a few more topics related to the same from here. I have presented the implementation of the Dataset class and matrix multiplication using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20202.PNG)

**Day203 of 300DaysOfData!**
- **Accuracy and Loss Function**: The key difference between a metric such as accuracy and the loss function is that the loss exists to drive automated learning, while the metric exists to drive human understanding: the loss must be a function with a meaningful derivative, whereas a metric focuses on the performance of the model. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about matrix multiplication, the activation function, the loss function, gradients and slope, the sigmoid function, accuracy, metrics and understanding, and a few more topics related to the same from here. I have presented the implementation of the loss function and sigmoid using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20203.PNG)

**Day204 of 300DaysOfData!**
- **SGD and Minibatches**: The process of changing or updating the weights based on the gradients, in order to consider some of the details involved in the next phase of the learning process, is called an optimization step. The average loss is calculated for a few data items at a time, called a minibatch; the number of data items in the minibatch is called the batch size. A larger batch size gives a more accurate and stable estimate of the dataset's gradients from the loss function, whereas a batch size of one results in an imprecise and unstable gradient. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about stochastic gradient descent and minibatches, the optimization step, batch size, the DataLoader and Dataset, initializing parameters, weights and bias, backpropagation and gradients, the loss function, and a few more topics related to the same from here. I have presented the implementation of the DataLoader and gradients using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20204.PNG)

**Day205 of 300DaysOfData!**
- **SGD and Minibatches**: The process of changing or updating the weights based on the gradients, in order to consider some of the details involved in the next phase of the learning process, is called an optimization step. The
calculation of the average loss is done for a few data items at a time, called a minibatch; the number of data items in the minibatch is called the batch size. A larger batch size gives a more accurate and stable estimate of the dataset's gradients from the loss function, whereas a batch size of one results in an imprecise and unstable gradient. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about calculating gradients and backpropagation, weights, bias and parameters, zeroing gradients, the training loop and the learning rate, accuracy and evaluation, creating an optimizer, and a few more topics related to the same from here. I have presented the implementation of calculating gradients, accuracy, and training using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20205a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20205b.PNG)

**Day206 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about creating an optimizer, the Linear module, weights and biases, model parameters, optimization and zeroing gradients, the SGD class, DataLoaders, the Learner class of fastai, and a few more topics related to the same from here. I have presented the implementation of creating an optimizer and the Learner class using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20206a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20206b.PNG)

**Day207 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about adding a nonlinearity, simple linear classifiers, basic neural networks, weight and bias tensors, the rectified linear unit or ReLU activation function, the universal approximation theorem, the Sequential module, and a few more topics related to the same from here. I have presented the implementation of creating simple neural networks using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Training Classifier**](https://github.com/ThinamXx/Fastai/blob/main/3.%20Training%20a%20Classifier/DigitClassifier.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20207.PNG)

**Day208 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about image classification, localization, regular expressions, the data block and data loaders, the regex labeller, data augmentation, presizing, checking and debugging the data block, item and batch transformations, and a few more topics related to the same from here. I have presented the implementation of creating and debugging the DataBlock and DataLoaders using fastai and PyTorch.
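The basic neural network described in the Day207 entry, two linear layers with a ReLU nonlinearity between them composed with the Sequential module, can be sketched in plain PyTorch; the layer sizes below are illustrative, assuming flattened 28x28 inputs:

```python
import torch
from torch import nn

# Two linear layers with a nonlinearity between them: the smallest
# "real" neural network. Without the ReLU, the two linear layers would
# collapse into a single linear function.
simple_net = nn.Sequential(
    nn.Linear(28 * 28, 30),   # 784 inputs -> 30 hidden activations
    nn.ReLU(),                # nonlinearity: replaces negatives with zero
    nn.Linear(30, 1),         # 30 hidden activations -> 1 output
)

batch = torch.randn(64, 28 * 28)   # fake minibatch of flattened 28x28 images
preds = simple_net(batch)          # forward pass: one prediction per item
```

By the universal approximation theorem mentioned in that entry, stacking enough such units can approximate any reasonable function, which is why this tiny pattern scales up to deep networks.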
Here in the snapshot, I have used Resize as an item transform with a large size, and RandomResizedCrop as a batch transform with a smaller size. RandomResizedCrop will be added if the min_scale parameter is passed to the aug_transforms function, as was done in the DataBlock call below. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Classification**](https://github.com/ThinamXx/Fastai/blob/main/4.%20Image%20Classification/ImageClassification.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20208.PNG)

**Day209 of 300DaysOfData!**
- **Exponential Function**: The exponential function is defined as e^x, where e is a special number approximately equal to 2.718. It is the inverse of the natural logarithm function. The exponential function is always positive and increases very rapidly. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the cross-entropy loss function, viewing activations and labels, the softmax activation function, the sigmoid function, the exponential function, negative log likelihood, binary classification, and a few more topics related to the same from here. I have presented the implementation of the softmax function and negative log likelihood using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Classification**](https://github.com/ThinamXx/Fastai/blob/main/4.%20Image%20Classification/ImageClassification.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20209.PNG)

**Day210 of 300DaysOfData!**
- **Exponential Function**: The exponential function is defined as e^x, where e is a special number approximately equal to 2.718. It is the inverse of the natural logarithm function. The exponential function is always positive and increases very rapidly. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the logarithmic function, negative log likelihood, the cross-entropy loss function, the softmax function, model interpretation, the confusion matrix, improving the model, the learning rate finder, the logarithmic scale, and a few more topics related to the same from here. I have presented the implementation of the cross-entropy loss, the confusion matrix, and the learning rate finder using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Classification**](https://github.com/ThinamXx/Fastai/blob/main/4.%20Image%20Classification/ImageClassification.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20210.PNG)

**Day211 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about unfreezing and transfer learning, freezing trained layers, discriminative learning rates, selecting the number of epochs, deeper architectures, and a few more topics related to the same from here. I have presented the implementation of unfreezing, transfer learning, and discriminative learning rates using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Image Classification**](https://github.com/ThinamXx/Fastai/blob/main/4.%20Image%20Classification/ImageClassification.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20211.PNG)

**Day212 of 300DaysOfData!**
- **Multilabel Classification**: Multilabel classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the image classification questionnaire, multilabel classification and regression, the PASCAL dataset, pandas and DataFrames, constructing the DataBlock, Datasets and DataLoaders, lambda functions, and a few more topics related to the same from here. I have presented the implementation of creating the DataBlock and DataLoaders using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - **Deep Learning for Coders with fastai and PyTorch**
- [**Fastai: Multilabel Classification, Regression**](https://github.com/ThinamXx/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20212.PNG)

**Day213 of 300DaysOfData!**
- **Multilabel Classification**: Multilabel classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about lambda functions, transformation blocks such as ImageBlock and MultiCategoryBlock, one-hot encoding, data splitting, DataLoaders, Datasets and the DataBlock, resizing and cropping, and a few more topics related to the same from here. I have presented the
implementation of creating datablock and dataloaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai multilabel classification regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression multilabelclassification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20213 png day214 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about binary cross entropy loss function dataloaders and learner getting model activations sigmoid and softmax functions one hot encoding getting accuracy partial function and few more topics related to the same from here f binary cross entropy and its module equivalent nn bceloss calculate cross entropy on a one hot encoded target but don t include the initial sigmoid normally f binary cross entropy with logits or nn bcewithlogitsloss do both sigmoid and binary cross entropy in a single function similarly for single label dataset f nll loss or nn nlloss for the version without initial softmax and f cross entropy or nn crossentropyloss for the version with initial softmax i have presented the implementation of cross entropy loss functions and accuracy using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai multilabel classification regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression multilabelclassification ipynb image https github com thinamxx 
**Day215 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about multilabel classification and thresholds, the sigmoid activation, overfitting, image regression, validation loss and metrics, the partial function, and a few more topics related to the same from here. `F.binary_cross_entropy` and its module equivalent `nn.BCELoss` calculate cross entropy on a one-hot-encoded target, but don't include the initial sigmoid. Normally, `F.binary_cross_entropy_with_logits` or `nn.BCEWithLogitsLoss` does both the sigmoid and the binary cross entropy in a single function. Similarly, for a single-label dataset, `F.nll_loss` or `nn.NLLLoss` is the version without the initial softmax, and `F.cross_entropy` or `nn.CrossEntropyLoss` is the version with the initial softmax. I have presented the implementation of training the convolutions with accuracy and a threshold using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Multilabel Classification and Regression**](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)
- Notebook: [**Fastai: Image Regression**](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/Regression.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20215.PNG)

**Day216 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about image regression and localization, assembling the dataset, initializing a DataBlock and DataLoaders, points and data augmentation, training the model, sigmoid range, the MSE loss function, transfer learning, and a few more topics related to the same from here. I have presented the implementation of initializing a DataBlock and DataLoaders and training image regression using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Multilabel Classification and Regression**](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)
- Notebook: [**Fastai: Image Regression**](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/Regression.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20216a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20216b.PNG)

**Day217 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about Imagenette classification, DataBlock and DataLoaders, data normalization and the Normalize function, progressive resizing and data augmentation, transfer learning, mean and standard deviation, and a few more topics related to the same from here. Progressive resizing is the process of gradually using larger and larger images as training progresses. I have presented the implementation of initializing a DataBlock and DataLoaders, normalization, and progressive resizing using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Advanced Classification**](https://github.com/ThinamXX/Fastai/blob/main/6.%20Advanced%20Classification/ImagenetteClassification.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20217a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20217b.PNG)
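The data normalization mentioned in the Day217 entry — subtracting the per-channel mean and dividing by the per-channel standard deviation, which is what Fastai's `Normalize` does from batch statistics — can be sketched in NumPy with made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
batch = rng.uniform(0, 255, size=(16, 3, 8, 8))  # fake NCHW image batch

# per-channel statistics over the batch, height and width dimensions
mean = batch.mean(axis=(0, 2, 3), keepdims=True)
std = batch.std(axis=(0, 2, 3), keepdims=True)
normalized = (batch - mean) / std

print(normalized.mean(axis=(0, 2, 3)).round(6))  # each channel ~0
print(normalized.std(axis=(0, 2, 3)).round(6))   # each channel ~1
```

In practice, pretrained models expect the statistics of the data they were trained on (e.g. ImageNet's), which is why Fastai attaches the right `Normalize` automatically for transfer learning.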
**Day218 of 300DaysOfData!**

**Label Smoothing:** Label smoothing is a process which replaces all the labels, i.e. the 1s, with a number a bit less than 1, and the 0s with a number a bit more than 0, for training. It makes training more robust even if there is mislabeled data, resulting in a model that generalizes better at inference. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about progressive resizing, test time augmentation, mixup augmentation, linear combinations, callbacks, label smoothing and the cross-entropy loss function, and a few more topics related to the same from here. During inference or validation, creating multiple versions of each image using data augmentation and then taking the average or maximum of the predictions for each augmented version of the image is called test time augmentation. I have presented the implementation of progressive resizing, test time augmentation, mixup augmentation, and label smoothing using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Advanced Classification**](https://github.com/ThinamXX/Fastai/blob/main/6.%20Advanced%20Classification/ImagenetteClassification.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20218.PNG)

**Day219 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about collaborative filtering, learning the latent factors, the loss function and stochastic gradient descent, creating DataLoaders, batches, dot products and matrix multiplication, and a few more topics related to the same from here. The mathematical operation of multiplying the elements of two vectors together and then summing up the result is called a dot product. I have presented the implementation of initializing the dataset and creating DataLoaders using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Collaborative Filtering**](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20219.PNG)

**Day220 of 300DaysOfData!**

**Embedding:** The special layer that indexes into a vector using an integer, but has its derivative calculated in such a way that it is identical to what it would have been if it had done a matrix multiplication with a one-hot-encoded vector, is called an embedding. Multiplying by a one-hot-encoded matrix, using the computational shortcut that it can be implemented by simply indexing directly, and the thing that multiplies the one-hot-encoded matrix, is called the embedding matrix. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about creating DataLoaders, the embedding matrix, collaborative filtering, object-oriented programming with Python, inheritance, Module and the forward propagation function, batches and Learner, sigmoid range, and a few more topics related to the same from here. I have presented the implementation of an embedding dot product class and sigmoid range using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Collaborative Filtering**](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20220a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20220b.PNG)
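The embedding definition in the Day220 entry — indexing as a computational shortcut for multiplying by a one-hot-encoded matrix — can be verified directly. A NumPy sketch with a made-up embedding matrix:

```python
import numpy as np

n_users, n_factors = 5, 3
rng = np.random.default_rng(42)
emb = rng.normal(size=(n_users, n_factors))  # the embedding matrix

idx = 2
one_hot = np.zeros(n_users)
one_hot[idx] = 1.0

# multiplying by a one-hot vector vs. indexing directly: same result
via_matmul = one_hot @ emb
via_index = emb[idx]
print(np.allclose(via_matmul, via_index))  # True
```

The indexing route skips the wasted multiplications by zero, which is why embedding layers are implemented as lookups rather than matrix products.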
**Day221 of 300DaysOfData!**

**Embedding:** The special layer that indexes into a vector using an integer, but has its derivative calculated in such a way that it is identical to what it would have been if it had done a matrix multiplication with a one-hot-encoded vector, is called an embedding. Multiplying by a one-hot-encoded matrix, using the computational shortcut that it can be implemented by simply indexing directly, and the thing that multiplies the one-hot-encoded matrix, is called the embedding matrix. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about collaborative filtering, weight decay or L2 regularization, overfitting, creating embeddings and weight matrices, the Parameter module, and a few more topics related to the same from here. Weight decay consists of adding the sum of the squared weights to the loss function; the idea is that the larger the coefficients are, the sharper the canyons will be in the loss function. I have presented the implementation of biases, weight decay, and matrices using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Collaborative Filtering**](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20221a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20221b.PNG)

**Day222 of 300DaysOfData!**

**Embedding:** The special layer that indexes into a vector using an integer, but has its derivative calculated in such a way that it is identical to what it would have been if it had done a matrix multiplication with a one-hot-encoded vector, is called an embedding. Multiplying by a one-hot-encoded matrix, using the computational shortcut that it can be implemented by simply indexing directly, and the thing that multiplies the one-hot-encoded matrix, is called the embedding matrix. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about interpreting embeddings and biases, principal component analysis or PCA, the collab learner, embedding distance and cosine similarity, bootstrapping a collaborative filtering model, probabilistic matrix factorization or the dot product model, and a few more topics related to the same from here. I have presented the implementation of interpreting biases, the collab learner model, and embedding distance using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Collaborative Filtering**](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20222.PNG)

**Day223 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about deep learning and collaborative filtering, embedding matrices, the linear function, ReLU and nonlinear functions, sigmoid range, the forward propagation function, the tabular model and embedding neural networks, and a few more topics related to the same from here. In Python, `**kwargs` in a parameter list means "put any additional keyword arguments into a dict called `kwargs`", and `**kwargs` in an argument list means "insert all key and value pairs in the `kwargs` dict as named arguments". I have presented the implementation of deep learning for collaborative filtering and neural networks using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Collaborative Filtering**](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20223.PNG)
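The `**kwargs` behaviour quoted in the Day223 entry can be demonstrated in a few lines of plain Python (the function and argument names here are hypothetical, just for illustration):

```python
def show(**kwargs):
    # **kwargs in a parameter list: collect extra keyword arguments into a dict
    return kwargs

collected = show(n_factors=50, use_bn=True)
print(collected)  # {'n_factors': 50, 'use_bn': True}

def config(n_factors, use_bn):
    return n_factors, use_bn

# **kwargs in an argument list: unpack the dict back into named arguments
print(config(**collected))  # (50, True)
```

This is how wrapper classes can forward arbitrary configuration to an inner model without listing every parameter themselves.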
**Day224 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about tabular modeling, categorical embeddings, continuous and categorical variables, recommendation systems, the tabular dataset, ordinal columns, decision trees, handling dates, and the TabularPandas and TabularProc objects, and a few more topics related to the same from here. I have presented the implementation of handling dates, TabularPandas, and TabularProc using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20224.PNG)

**Day225 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about tabular modeling, creating the decision tree, leaf nodes, root mean squared error, the dtreeviz library, stopping criteria, overfitting, and a few more topics related to the same from here. I have presented the implementation of creating a decision tree and leaf nodes using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20225a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20225b.PNG)

**Day226 of 300DaysOfData!**

**Random Forest:** A random forest is a model that averages the predictions of a large number of decision trees, which are generated by randomly varying various parameters that specify what data is used to train each tree, along with other tree parameters. Bagging is a particular approach to ensembling, or combining the results of multiple models together. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about categorical variables, random forests and bagging predictors, ensembling, optimal parameters, out-of-bag error, tree variance for prediction confidence and standard deviation, model interpretation, and a few more topics related to the same from here. The out-of-bag error, or OOB error, is a way of measuring prediction error on the training dataset by including, in the calculation of a row's error, only those trees where that row was not included in training. I have presented the implementation of creating a random forest and model interpretation using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20226.PNG)
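The out-of-bag idea from the Day226 entry can be illustrated without building a full forest: each tree is trained on a bootstrap sample (rows drawn with replacement), so roughly a third of the rows are never drawn for a given tree, and only those left-out trees are used to score that row. A NumPy sketch of the left-out fraction:

```python
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_trees = 1000, 10

oob_fractions = []
for _ in range(n_trees):
    # bootstrap sample: draw n_rows row indices with replacement
    sample = rng.integers(0, n_rows, size=n_rows)
    in_bag = np.zeros(n_rows, dtype=bool)
    in_bag[sample] = True
    # rows never drawn for this tree are "out of bag"
    oob_fractions.append(1 - in_bag.mean())

# roughly 1/e (~36.8%) of rows are out of bag for each tree
print(round(float(np.mean(oob_fractions)), 2))
```

This is why OOB error needs no separate validation set: every row has a subset of trees that never saw it.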
**Day227 of 300DaysOfData!**

**Random Forest:** A random forest is a model that averages the predictions of a large number of decision trees, which are generated by randomly varying various parameters that specify what data is used to train each tree, along with other tree parameters. Bagging is a particular approach to ensembling, or combining the results of multiple models together. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about random forests, feature importance, removing low-importance variables, removing redundant features, determining similarity of features, rank correlation, the OOB score, and a few more topics related to the same from here. I have presented the implementation of a random forest and feature importance using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20227.PNG)

**Day228 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about removing redundant features, determining similarity, the OOB score, partial dependence plots, data leakage, root mean squared error, and a few more topics related to the same from here. The standard deviation of predictions across the trees presents the relative confidence of predictions: the model is more consistent when the standard deviation is lower. I have presented the implementation of removing redundant features and partial dependence plots using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20228.PNG)

**Day229 of 300DaysOfData!**

A random forest model just averages the predictions of a number of trees, and therefore it can never predict values outside the range of the training data: random forests are not able to extrapolate to out-of-domain data. Here, *prediction* is simply the prediction that the random forest makes, *bias* is the prediction based on taking the mean of the dependent variable, and *contributions* tells us the total change in prediction due to each of the independent variables. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about the tree interpreter, redundant features, waterfall charts or plots, random forest prediction, bias and contributions, the extrapolation problem, the unsqueeze method, out-of-domain data, and a few more topics related to the same from here. I have presented the implementation of the tree interpreter, waterfall plots, and the extrapolation problem using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20229.PNG)
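Day228 notes that the standard deviation of predictions across the trees signals relative confidence. With made-up per-tree predictions, the check looks like this:

```python
import numpy as np

# fake predictions: one row per tree, one column per validation row
preds = np.array([
    [10.1, 10.0,  9.9, 10.0],   # tree 1
    [10.0,  9.8,  5.0, 10.1],   # tree 2
    [ 9.9, 10.2, 14.8,  9.9],   # tree 3
])

forest_pred = preds.mean(axis=0)   # the random forest's prediction per row
confidence = preds.std(axis=0)     # low std = trees agree = more consistent

print(forest_pred)
print(confidence.argmax())  # the validation row where the trees disagree most
```

Rows with a high standard deviation are the ones the model is least sure about, and are worth inspecting before trusting the prediction.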
**Day230 of 300DaysOfData!**

A random forest model just averages the predictions of a number of trees, and therefore it can never predict values outside the range of the training data: random forests are not able to extrapolate to out-of-domain data. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about the extrapolation problem and random forests, finding out-of-domain data, root mean squared error and feature importance, histograms, and a few more topics related to the same from here. I have presented the implementation of finding out-of-domain data and RMSE using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20230.PNG)

**Day231 of 300DaysOfData!**

**Random Forest:** A random forest model just averages the predictions of a number of trees, and therefore it can never predict values outside the range of the training data: random forests are not able to extrapolate to out-of-domain data. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about tabular modeling and neural networks, continuous and categorical features, the embedding matrix, mean squared error and regression, the tabular learner and learning rate, ensembling, bagging and boosting, combining embeddings, and a few more topics related to the same from here. Ensembling is a generalization technique in which the average of the predictions of several models is used. I have presented the implementation of tabular modeling with neural networks and ensembling using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Tabular Modeling**](https://github.com/ThinamXX/Fastai/blob/main/8.%20Tabular%20Modeling/TabularModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20231a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20231b.PNG)

**Day232 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about NLP and language models, self-supervised learning, text preprocessing, tokenization, numericalization and the embedding matrix, subword and character tokens, and a few more topics related to the same from here. A token is an element of a list created by the tokenization process, which could be a word, a part of a word (a subword), or a single character. I have presented the implementation of loading the data and word tokenization using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Natural Language Processing**](https://github.com/ThinamXX/Fastai/blob/main/9.%20Natural%20Language%20Processing/NLP.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20232.PNG)
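Ensembling, as described in the Day231 entry, is just averaging the predictions of several models. A toy NumPy sketch with made-up random-forest and neural-network predictions:

```python
import numpy as np

rf_preds = np.array([10.2, 8.9, 11.4])  # hypothetical random forest predictions
nn_preds = np.array([10.6, 9.3, 11.0])  # hypothetical neural network predictions

# simple average of the two models' predictions
ensemble = (rf_preds + nn_preds) / 2
print(ensemble)
```

Because the two models tend to make different kinds of errors, their average is often more accurate than either model alone.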
**Day233 of 300DaysOfData!**

**Tokenization:** Subword tokenization splits words into smaller parts, based on the most commonly occurring substrings. Word tokenization splits a sentence on spaces, as well as applying language-specific rules to try to separate parts of meaning even when there are no spaces. Subword tokenization provides a way to easily scale between character tokenization (i.e. using a small subword vocab) and word tokenization (i.e. using a large subword vocab), and handles every human language without needing language-specific algorithms to be developed. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about word tokenization, subword tokenization, the setup method, vocabulary, numericalization with Fastai, embedding matrices, and a few more topics related to the same from here. I have presented the implementation of subword tokenization and numericalization using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Natural Language Processing**](https://github.com/ThinamXX/Fastai/blob/main/9.%20Natural%20Language%20Processing/NLP.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20233.PNG)

**Day234 of 300DaysOfData!**

**Tokenization:** Subword tokenization splits words into smaller parts, based on the most commonly occurring substrings. Word tokenization splits a sentence on spaces, as well as applying language-specific rules to try to separate parts of meaning even when there are no spaces. Subword tokenization provides a way to easily scale between character tokenization (i.e. using a small subword vocab) and word tokenization (i.e. using a large subword vocab), and handles every human language without needing language-specific algorithms to be developed. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about numericalization with Fastai, embedding matrices, creating batches for a language model, tokenization, training a text classifier, building a language model using DataBlock and DataLoaders, fine-tuning a language model, transfer learning, and a few more topics related to the same from here. I have presented the implementation of creating DataLoaders and a DataBlock for a language model using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Natural Language Processing**](https://github.com/ThinamXX/Fastai/blob/main/9.%20Natural%20Language%20Processing/NLP.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20234.PNG)

**Day235 of 300DaysOfData!**

**Encoder:** An encoder is defined as the model which doesn't contain task-specific final layers. The term encoder means much the same thing as body when applied to vision CNNs, but encoder tends to be used more for NLP and generative models. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about the encoder model, text generation and classification, creating the classifier DataLoaders, embeddings, data augmentation, fine-tuning the classifier, discriminative learning rates and gradual unfreezing, disinformation and language models, and a few more topics related to the same from here. I have presented the implementation of training a text classifier model using discriminative learning rates and gradual unfreezing with Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Natural Language Processing**](https://github.com/ThinamXX/Fastai/blob/main/9.%20Natural%20Language%20Processing/NLP.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20235a.PNG)
![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20235b.PNG)
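The tokenize-then-numericalize pipeline discussed across the Day232–234 entries can be sketched in plain Python — a toy word tokenizer and vocab, not Fastai's actual `Tokenizer`/`Numericalize` classes:

```python
text = "the movie was great and the acting was great"

# word tokenization: split on spaces (real tokenizers apply many more rules)
tokens = text.split()

# numericalization: build a vocab of unique tokens, then map tokens to indices
vocab = sorted(set(tokens))
token_to_idx = {tok: i for i, tok in enumerate(vocab)}
nums = [token_to_idx[tok] for tok in tokens]

print(vocab)
print(nums)
# decoding reverses the mapping back to the original text
print(" ".join(vocab[i] for i in nums) == text)  # True
```

The integer indices are what get looked up in the embedding matrix; real vocabularies also reserve special tokens (unknown, padding, beginning-of-stream) that this sketch omits.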
**Day236 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about data munging with Fastai, tokenization and numericalization, creating DataLoaders and a DataBlock, the mid-level API, Transforms, the decode method, data augmentation, cropping and padding, and a few more topics related to the same from here. I have presented the implementation of creating DataLoaders, tokenization, and numericalization using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Data Munging**](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20236.PNG)

**Day237 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about data munging, decorators, the Pipeline method, transformed collections, training and validation sets, the DataLoaders object, the Categorize method, transformations, and a few more topics related to the same from here. I have presented the implementation of the Pipeline class and transformed collections using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Data Munging**](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20237.PNG)

**Day238 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about the Datasets class, transformed collections, Pipelines, the Categorize method, DataLoaders and DataBlock, TextBlock, the partial function, CategoryBlock, and a few more topics related to the same from here. I have presented the implementation of the Datasets class, transformed collections, and DataLoaders using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Data Munging**](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20238.PNG)

**Day239 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about applying the mid-level data API for Siamese pairs and computer vision, DataLoaders, Transforms and resizing images, data augmentation, subclasses, transformed collections, and a few more topics related to the same from here. The Datasets class will apply two or more pipelines in parallel to the same raw object and build a tuple with the result; it will automatically do the setup, and can be indexed into like a Datasets. I have presented the implementation of the Siamese image object and data augmentation using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Data Munging**](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20239.PNG)
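The Pipeline behaviour from the Day237 entry — composing transforms in order and decoding back in reverse order — can be sketched with a toy class. This is not Fastai's real `Pipeline` or `Transform`, just the idea behind them:

```python
class Transform:
    def encodes(self, x): return x
    def decodes(self, x): return x

class Lower(Transform):
    def encodes(self, x): return x.lower()   # not losslessly decodable

class AddTag(Transform):
    def encodes(self, x): return "xx" + x    # e.g. a special-token marker
    def decodes(self, x): return x[2:]

class Pipeline:
    def __init__(self, tfms): self.tfms = tfms
    def __call__(self, x):
        # apply each transform's encodes in order
        for t in self.tfms: x = t.encodes(x)
        return x
    def decode(self, x):
        # undo the transforms in reverse order
        for t in reversed(self.tfms): x = t.decodes(x)
        return x

pipe = Pipeline([Lower(), AddTag()])
enc = pipe("Hello")
print(enc)               # xxhello
print(pipe.decode(enc))  # hello
```

Note that decoding only reverses what is reversible: `Lower` cannot restore the original capitalization, which mirrors how Fastai's decode is for display, not a guaranteed inverse.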
**Day240 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about the SiameseTransform object, random splitting, transformed collections and the Datasets class, DataLoaders, the ToTensor and IntToFloatTensor methods, data and batch normalization, and a few more topics related to the same from here. ToTensor converts images to tensors; IntToFloatTensor converts a tensor of images containing integers from 0 to 255 to a tensor of floats, dividing by 255 to make the values lie between 0 and 1. I have presented the implementation of the SiameseTransform object and data augmentation using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Data Munging**](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20240.PNG)

**Day241 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about building a language model from scratch, data concatenation and tokenization, vocabulary and numericalization, neural networks, independent variables and the dependent variable, sequences of tensors, and a few more topics related to the same from here. I have presented the implementation of preparing sequences of tensors for a language model using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Language Model from Scratch**](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20241.PNG)

**Day242 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with Fastai and PyTorch**. Here, I have read about building a language model from scratch using PyTorch, sequence tensors, creating DataLoaders and batch size, neural network architecture and linear layers, word embeddings and activations, weight matrices, creating a Learner and training, and a few more topics related to the same from here. I will create a neural network architecture that takes three words as input and returns a prediction of the probability of each possible next word in the vocab. I will use three standard linear layers: the first linear layer will use only the first word's embedding as activations, the second layer will use the second word's embedding plus the first layer's output activations, and the third layer will use the third word's embedding plus the second layer's output activations. The key effect is that every word is interpreted in the information context of any words preceding it. Each of these three layers will use the same weight matrix. I have presented the implementation of creating DataLoaders, a language model from scratch, and training using Fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with Fastai and PyTorch**
- Notebook: [**Fastai: Language Model from Scratch**](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)

![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20242.PNG)
matrix i have presented the implementation of creating data loaders language model from scratch and training using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20242 png day243 of 300daysofdata backpropagation through time backpropagation through time is a process of treating a neural network with effectively one layer per time step as one big model and calculating gradients on it in the usual way the bptt technique is used to avoid running out of memory and time which detaches the history of computation steps in the hidden state every few time steps hidden state is defined as the activations that are updated at each step of a recurrent neural network on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about recurrent neural networks hidden state of nn improving the rnn maintaining the state of rnn unrolled representation backpropagation and derivatives detach method stateful rnn backpropagation through time and few more topics related to the same from here i have presented the implementation of recurrent neural networks and language model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com 
thinamxx 300days machinelearningdeeplearning blob main images day 20243 png day244 of 300daysofdata backpropagation through time backpropagation through time is a process of treating a neural network with effectively one layer per time step as one big model and calculating gradients on it in the usual way the bptt technique is used to avoid running out of memory and time which detaches the history of computation steps in the hidden state every few time steps hidden state is defined as the activations that are updated at each step of a recurrent neural network on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about backpropagation through time lmdataloader object and arranging the dataset creating data loaders callbacks and reset method creating more signal and few more topics related to the same from here i have presented the implementation of arranging dataset creating data loaders callbacks and reset method using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20244 png day245 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about creating more signal or sequence cross entropy loss function and flatten method multilayer recurrent neural networks and activations unrolled representation stack and few more topics related to the same from here the single layer recurrent neural network performed better than multilayer 
recurrent neural network because a deeper model leads to exploding and vanishing activations i have presented the implementation of creating more signal and multilayer recurrent neural network using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20245a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20245b png day246 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about exploding and vanishing activations matrix multiplication architecture of long short term memory and rnn sigmoid and tanh function hidden state and cell state forget gate input gate cell gate and output gate chunk method and few more topics related to the same from here i have presented the implementation of long short term memory using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20246 png day247 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about training language model
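The gate structure described for day 246 above can be sketched as a from-scratch LSTM cell (names and sizes are illustrative, in the spirit of the book's version rather than its exact code):

```python
import torch
from torch import nn

# Hedged sketch of an LSTM cell: one linear layer per source computes
# all four gates at once, and chunk(4, 1) splits the result into the
# forget, input, cell, and output gates.
class LSTMCell(nn.Module):
    def __init__(self, ni, nh):
        super().__init__()
        self.ih = nn.Linear(ni, 4 * nh)   # input -> four stacked gates
        self.hh = nn.Linear(nh, 4 * nh)   # hidden -> four stacked gates

    def forward(self, x, state):
        h, c = state
        gates = (self.ih(x) + self.hh(h)).chunk(4, 1)
        f, i, o = (gate.sigmoid() for gate in gates[:3])
        g = gates[3].tanh()               # cell gate uses tanh
        c = f * c + i * g                 # forget old state, add new candidate
        h = o * c.tanh()                  # hidden state gated by the output gate
        return h, (h, c)

cell = LSTMCell(ni=10, nh=16)
h0, c0 = torch.zeros(4, 16), torch.zeros(4, 16)
h, (h1, c1) = cell(torch.randn(4, 10), (h0, c0))
```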
using lstm embedding layer linear layer overfitting and regularization of lstm dropout regularization training or inference bernoulli method and few more topics related to the same from here dropout is a regularization technique which randomly changes some activations to zero at training time i have presented the implementation of a language model using long short term memory and dropout using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20247 png day248 of 300daysofdata activation regularization activation regularization is a process of adding a small penalty to the final activations produced by the lstm to make them as small as possible it is a regularization method very similar to weight decay on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about activation regularization and temporal activation regularization language model using long short term memory weight decay training a weight tied regularized lstm weight tying and input embeddings text learner cross entropy loss function and few more topics related to the same from here i have presented the implementation of a language model using regularized long short term memory and dropout and activation regularization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch
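The Bernoulli-based dropout described above can be sketched as a from-scratch module (close in spirit to the book's version; the scaling by 1/(1-p) keeps the expected activation unchanged between training and inference):

```python
import torch
from torch import nn

# Hedged sketch of dropout: at training time a Bernoulli mask zeroes
# each activation with probability p and scales the survivors; at
# inference the layer is a no-op.
class Dropout(nn.Module):
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if not self.training:
            return x
        mask = x.new(*x.shape).bernoulli_(1 - self.p)
        return x * mask / (1 - self.p)

drop = Dropout(p=0.5)
x = torch.ones(2, 3)
drop.eval()
unchanged = drop(x)    # inference: identity
drop.train()
masked = drop(x)       # training: entries are either 0.0 or 2.0
```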
language model from scratch https github com thinamxx fastai blob main 11 20language 20model languagemodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20248 png day249 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about convolutional neural networks the magic of convolutions feature engineering kernel and matrix mapping a convolutional kernel nested list comprehensions matrix multiplications and few more topics related to the same from here feature engineering is the process of creating new transformations of the input data in order to make it easier to model i have presented the implementation of feature engineering and mapping a convolutional kernel using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20249 png day250 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about convolutions with pytorch rank tensors creating data block and data loaders channel of images unsqueeze method and unit axis strides and padding understanding the convolutions equations matrix multiplication shared weights and few more topics related to the same from here a channel is a single basic color in an image for regular full color images there are three channels red green and blue kernels passed to convolutions need to be rank 4 tensors i have presented the
implementation of convolutions and dataloaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20250 png day251 of 300daysofdata channels and features channels and features are largely used interchangeably and refer to the size of the second axis of a weight matrix which is the number of activations per grid cell after a convolution channels refer to the input data i e colors or activations inside the network using a stride 2 convolution often increases the number of features at the same time because the number of activations in the activation map decrease by the factor of 4 on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about convolutional neural network refactoring channels and features understanding convolution arithmetic biases receptive fields convolution over rgb image stochastic gradient descent and few more topics related to the same from here i have presented the implementation of convolutional neural network and training the learner using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20251 png 
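The stride-2 convolutional network described above (where the activation map shrinks while the number of features grows) can be sketched like this; the layer sizes are illustrative for 28x28 single-channel images, not the book's exact code:

```python
import torch
from torch import nn

# Hedged sketch of a simple CNN baseline: each stride-2 convolution
# halves the activation map, so the filter count is grown at the same
# time to preserve capacity.
def conv(ni, nf, ks=3, act=True):
    layers = [nn.Conv2d(ni, nf, ks, stride=2, padding=ks // 2)]
    if act:
        layers.append(nn.ReLU())
    return nn.Sequential(*layers)

simple_cnn = nn.Sequential(
    conv(1, 8),               # 28x28 -> 14x14
    conv(8, 16),              # 14x14 -> 7x7
    conv(16, 32),             # 7x7   -> 4x4
    conv(32, 64),             # 4x4   -> 2x2
    conv(64, 10, act=False),  # 2x2   -> 1x1, one activation per class
    nn.Flatten(),
)

out = simple_cnn(torch.randn(16, 1, 28, 28))   # (batch, classes)
```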
day252 of 300daysofdata channels and features channels and features are largely used interchangeably and refer to the size of the second axis of a weight matrix which is the number of activations per grid cell after a convolution channels refer to the input data i e colors or activations inside the network using a stride 2 convolution often increases the number of features at the same time because the number of activations in the activation map decrease by the factor of 4 on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about improving training stability of convolutional neural networks batch size and splitting the dataset simple baseline network activations and kernel size activation stat callbacks learning rate creating a learner and training and few more topics related to the same from here i have presented the implementation of convolutional neural network and training the learner using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20252 png day253 of 300daysofdata one cycle training 1 cycle training is a combination of warmup and annealing warmup is the one where learning rate grows from the minimum value to the maximum value and annealing is the one where it decreases back to the minimum value on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about activation stats callbacks increasing batch size activations 1 cycle 
training warmup and annealing super convergence learning rate and momentum colorful dimension and histograms and few more topics related to the same from here i have presented the implementation of increasing batch size 1 cycle training and inspecting momentum and activations using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch convolutional neural networks https github com thinamxx fastai blob main 12 20convolutional 20neural 20networks cnn ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20253 png day254 of 300daysofdata fully convolutional networks the idea in fully convolutional networks is to take the average of activations across a convolutional grid a fully convolutional networks has a number of convolutional layers some of which will be stride 2 convolutions at the end of which is an adaptive average pooling layer a flatten layer to remove the unit axis and finally a linear layer larger batches have gradients that are more accurate since they are calculated from more data but larger batch size means fewer batches per epoch which means fewer opportunities for the model to update weights on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about residual networks or resnets convolutional neural networks strides and padding fully convolutional networks adaptive average pooling layer flatten layer activations and matrix multiplications and few more topics related to the same from here i have presented the implementation of preparing data and fully convolutional networks using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend 
some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20254 png day255 of 300daysofdata fully convolutional networks the idea in fully convolutional networks is to take the average of activations across a convolutional grid a fully convolutional networks has a number of convolutional layers some of which will be stride 2 convolutions at the end of which is an adaptive average pooling layer a flatten layer to remove the unit axis and finally a linear layer larger batches have gradients that are more accurate since they are calculated from more data but larger batch size means fewer batches per epoch which means fewer opportunities for the model to update weights on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about fully convolutional neural networks building resnet skip connections identity mapping sgd batch normalization layer trainable parameters true identity path convolutional neural networks average pooling layer and few more topics related to the same from here i have presented the implementation of resnet architecture and skip connections using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20255 png day256 of 300daysofdata on my journey of machine learning and deep learning i have read and 
implemented from the book deep learning for coders with fastai and pytorch here i have read about residual networks relu activation function skip connections training deeper models loss landscape of nn stem of the network convolutional layers max pooling layer and few more topics related to the same from here stem is defined as the first few layers of cnn it has a different structure than the main body of cnn i have presented the implementation of training deeper models and stem of network using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20256 png day257 of 300daysofdata bottleneck layers bottleneck layers use three convolutions two 1x1 at the beginning and the end and one 3x3 the 1x1 convolutions are much faster which makes it feasible to use a higher number of filters in and out the 1x1 convolutions diminish and then restore the number of channels so called bottleneck the overall impact is to facilitate the use of more filters in the same amount of time on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stem of the network residual network architecture bottleneck layers convolutional neural networks progressive resizing and few more topics related to the same from here i have presented the implementation of training deeper networks and bottleneck layers using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
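The bottleneck design described for day 257 can be sketched as follows (channel counts are illustrative): the first 1x1 convolution diminishes the channels, the 3x3 does the spatial work at the reduced width, and the last 1x1 restores them.

```python
import torch
from torch import nn

# Hedged sketch of a bottleneck stack: 1x1 convolutions are cheap, so
# squeezing the channels before the 3x3 and restoring them afterwards
# lets the layer use more filters in the same amount of time.
def bottleneck(ni, reduced):
    return nn.Sequential(
        nn.Conv2d(ni, reduced, 1),                  # 1x1: diminish channels
        nn.ReLU(),
        nn.Conv2d(reduced, reduced, 3, padding=1),  # 3x3 at reduced width
        nn.ReLU(),
        nn.Conv2d(reduced, ni, 1),                  # 1x1: restore channels
    )

layer = bottleneck(ni=256, reduced=64)
out = layer(torch.randn(1, 256, 8, 8))   # spatial size and channels preserved
```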
book deep learning for coders with fastai and pytorch residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20257 png day258 of 300daysofdata splitter function a splitter is a function that tells the fastai library how to split the model into parameter groups which are used to train only the head of the model during transfer learning the params is just a function that returns all parameters of a given module on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about body and head of networks batch normalization layer unet learner and architecture generative vision models nearest neighbor interpolation transposed convolutions siamese network loss function and splitter function and few more topics related to the same from here i have presented the implementation of siamese network model loss function and splitter function using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch architecture details https github com thinamxx fastai blob main 14 20architecture 20details architectures ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20258 png day259 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stochastic gradient descent loss function updating weights optimization function creating data block and data loaders resnet model and learner training process and few more topics related to the same from here i have presented the implementation of preparing 
dataset and baseline model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20259 png day260 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about training process stochastic gradient descent optimization function learning rate finder momentum optimizer callbacks zeroing gradients partial function and few more topics related to the same from here i have presented the implementation of functions for optimizer and sgd here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20260 png day261 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stochastic gradient descent and optimization function momentum exponentially weighted moving average gradient averages callbacks rms prop adaptive learning rate divergence and epsilon and few more topics related to the same from here i have presented the implementation of momentum and rms prop using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope 
you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20261 png day262 of 300daysofdata adam optimizer adam mixes the idea of sgd with momentum and rmsprop together where it uses the moving average of the gradients as a direction and divides by the square root of the moving average of the gradients squared to give an adaptive learning rate to each parameter it takes the unbiased moving average on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about rmsprop optimizer sgd adam optimizer unbiased moving average of gradients momentum parameter decoupled weight decay l1 and l2 regularization callbacks and few more topics related to the same from here i have presented the implementation of rms prop and adam optimizer using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20262 png day263 of 300daysofdata adam optimizer adam mixes the idea of sgd with momentum and rmsprop together where it uses the moving average of the gradients as a direction and divides by the square root of the moving average of the gradients squared to give an adaptive learning rate to each parameter it takes the unbiased moving average on my journey of machine learning and deep learning i have read and 
implemented from the book deep learning for coders with fastai and pytorch here i have read about creating callbacks loss functions model resetter callbacks rnn regularization callback ordering and exceptions stochastic gradient descent and few more topics related to the same from here i have presented the implementation of model resetter callback and rnn regularization callback using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20263 png day264 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about neural networks building a neural network from scratch modeling a neuron nonlinear activation functions hidden size fully connected layer and dense layer linear layer matrix multiplication from scratch elementwise arithmetic and few more topics related to the same from here i have presented the implementation of matrix multiplication from scratch and elementwise arithmetic using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20264 png day265 of 300daysofdata forward and backward passes computing all the gradients 
of a given loss with respect to its parameters is known as backward pass similarly computing the output of the model on a given input based on the matrix products is known as forward pass on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about broadcasting with scalar broadcasting vector and matrix unsqueeze method einstein summation matrix multiplication the forward and backward passes defining and initializing layer activation function linear layer weights and biases and few more topics related to the same from here i have presented the implementation of einstein summation and defining and initializing linear layer using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20265 png day266 of 300daysofdata forward and backward passes computing all the gradients of a given loss with respect to its parameters is known as backward pass similarly computing the output of the model on a given input based on the matrix products is known as forward pass on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about mean and standard deviation matrix multiplications xavier initialization relu activation kaiming initialization weights and activations and few more topics related to the same from here i have presented the implementation of xavier initialization relu activation and matrix multiplications using fastai and pytorch here in the 
snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20266 png day267 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about kaiming initialization forward pass mean squared error loss function gradients and backward pass linear layers and relu activation function chain rule backpropagation and few more topics related to the same from here i have presented the implementation of kaiming initialization mse loss function and gradients using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20267 png day268 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about gradients of matrix multiplication symbolic computation forward and backward propagation function model parameters weights and biases refactoring the model callable module and few more topics related to the same from here i have presented the implementation of relu module linear module and mean squared error module using fastai and pytorch 
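The ReLU, linear, and MSE modules mentioned above share a common pattern, which can be sketched as a tiny base class (in the spirit of the book's from-scratch modules, not their exact code; the .g attribute for hand-computed gradients is an illustrative convention):

```python
import torch

# Hedged sketch of a from-scratch Module base class: __call__ stores
# the inputs and output of forward so that backward can later fill in
# the gradients by hand, mimicking what autograd does for us.
class Module:
    def __call__(self, *args):
        self.args = args
        self.out = self.forward(*args)
        return self.out

    def backward(self):
        self.bwd(self.out, *self.args)

class Relu(Module):
    def forward(self, x):
        return x.clamp_min(0.)

    def bwd(self, out, x):
        x.g = (x > 0).float() * out.g   # gradient passes only where x > 0

x = torch.tensor([-1.0, 2.0])
relu = Relu()
out = relu(x)
out.g = torch.ones(2)   # pretend upstream gradient from the loss
relu.backward()         # fills x.g by the chain rule
```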
here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20268 png day269 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about initializing model architecture callable function forward and backward propagation function linear function mean squared error loss function relu activation function back propagation function and gradients squeeze function and few more topics related to the same from here i have also read about perturbations and neural networks vanishing gradients and convolutional neural networks i have presented the implementation of defining model architecture layer function and relu using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20269 png day270 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about defining base class and sub classes linear layer relu activation function and non linearities mean squared error function super class initializer kaiming 
initialization elementwise arithmetic and broadcasting and few more topics related to the same from here i have presented the implementation of defining linear layer and linear model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20270 png day271 of 300daysofdata class activation map the class activation map uses the output of the last convolutional layer which is just before the average pooling layer together with predictions to give a heatmap visualization of model decision on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about cnn interpretation class activation map hooks heatmap visualization activations and convolutional layer dot product feature map data loaders and few more topics related to the same from here i have presented the implementation of defining hook function and decoding images using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch cnn interpretation with cam https github com thinamxx fastai blob main 17 20cnn 20interpretation cnn 20interpretation ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20271 png day272 of 300daysofdata class activation map the class activation map uses the output of the last convolutional layer which is just before 
the average pooling layer together with predictions to give a heatmap visualization of model decision on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about hook class and context manager gradient class activation map heatmap visualization activations and weights gradients and back propagation model interpretation and few more topics related to the same from here i have presented the implementation of defining hook function activations gradients and heatmap visualization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch cnn interpretation with cam https github com thinamxx fastai blob main 17 20cnn 20interpretation cnn 20interpretation ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20272a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20272b png day273 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about fastai learner from scratch dependent and independent variable vocabulary dataset and indexing and few more topics related to the same i have also read about convolutional neural networks perturbations and loss functions i have presented the implementation of preparing training and validation dataset using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com 
thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20273 png day274 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about creating collation function parallel preprocessing decoding images data loader class normalization and image statistics permuting axis order precision and few more topics related to the same from here i have presented the implementation of initializing data loader and normalization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20274 png day275 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about module and parameter forward propagation function convolutional layer training attributes kaiming normalization and xavier normalization initializer transformation function weights and biases linear model tensors and few more topics related to the same from here i have presented the implementation of defining module convolutional layer and linear model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com 
thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20275 png day276 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about convolutional neural networks linear model testing module sequential module parameters adaptive pooling layer and mean stride hook function pipeline and few more topics related to the same from here i have presented the implementation of testing module sequential module and convolutional neural network using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20276 png day277 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about loss function negative log likelihood function log softmax function log of sum of exponentials stochastic gradient descent optimizer function data loaders training and validation sets and few more topics related to the same from here i have presented the implementation of negative log likelihood function cross entropy loss function sgd optimizer and data loaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from 
scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20277 png day278 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data convolutional neural net model loss function stochastic gradient descent and optimization function learner callbacks parameters training and epochs and few more topics related to the same from here i have presented the implementation of learner and callbacks using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20278 png day279 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about binary classification chest x rays dicom or digital imaging and communications in medicine plotting the dicom data random splitter function medical imaging pixel data and few more topics related to the same from here i have presented the implementation of getting dicom files and inspection using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays
20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20279 png day280 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about binary classification initializing data block and data loaders image block and category block batch transformations training pretrained model learning rate finder tensors and probabilities model interpretation and few more topics related to the same from here i have presented the implementation of initializing data block and data loaders training pretrained model and interpretation using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20280 png day281 of 300daysofdata sensitivity specificity sensitivity = true positive / (true positive + false negative) where a missed positive i e a false negative is known as a type ii error specificity = true negative / (true negative + false positive) where a false alarm i e a false positive is known as a type i error on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about sensitivity and specificity positive predictive value and negative predictive value confusion matrix and model interpretation type i ii error accuracy and prevalence and few more topics related to the same from here i have presented the implementation of confusion matrix sensitivity and specificity accuracy using fastai and pytorch here in the snapshot i hope you will gain some insights and work
on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20281 png day282 of 300daysofdata cross validation cross validation is a step in the process of building a machine learning model which helps us to ensure that our models fit the data accurately and also ensures that we do not overfit on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about supervised and unsupervised learning features samples and targets classification and regression clustering t distributed stochastic neighbour embedding 2d arrays cross validation overfitting and few more topics related to the same from here i have presented the implementation of tsne decomposition and preparing dataset here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20282 png day283 of 300daysofdata cross validation cross validation is a step in the process of building a machine learning model which helps us to ensure that our models fit the data accurately and also ensures that we do not overfit on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any
machine learning problem here i have read about decision trees and classification features and parameters accuracy and model predictions overfitting and model generalization training loss and validation loss cross validation and few more topics related to the same from here i have presented the implementation of decision tree classifier and model evaluation here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20283 png day284 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about stratified kfold cross validation skewed dataset and classification data distribution hold out cross validation time series data regression and sturge s rule probabilities evaluation metrics and accuracy and few more topics related to the same from here i have presented the implementation of distribution of labels and stratified kfold here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20284 png day285 of 300daysofdata on my journey of machine learning and deep
learning i have read and implemented from the book approaching almost any machine learning problem here i have read about evaluation metrics and accuracy score training and validation set precision and recall true positive and true negative false positive and false negative binary classification and few more topics related to the same from here i have presented the implementation of true negative false negative false positive and accuracy score here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20285 png day286 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about true positive rate recall and sensitivity false positive rate and specificity area under roc curve prediction probability and thresholds log loss function multiclass classification and macro averaged precision and few more topics related to the same from here i have presented the implementation of true negative rate false positive rate log loss function and macro averaged precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20286 png day287
of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about multiclass classification macro averaged precision micro averaged precision weighted precision recall metrics random forest regressor mean squared error root mean squared error and few more topics related to the same from here i have presented the implementation of micro averaged precision and weighted precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20287 png day288 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about recall metrics for multiclass classification weighted f1 score confusion matrix type i error and type ii error auc curve multilabel classification and average precision and few more topics related to the same from here i have presented the implementation of weighted f1 score and average precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20288a png image https github com thinamxx 300days
machinelearningdeeplearning blob main images day 20288b png day289 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about regression metrics such as mean absolute error mean squared error root mean squared error squared logarithmic error mean absolute percentage error r squared and coefficient of determination cohen s kappa score mcc score and few more topics related to the same from here i have presented the implementation of mean absolute error mean squared error squared logarithmic error mean absolute percentage error r squared and mcc score here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289b png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289c png day290 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented about object detection and fine tuning image segmentation tensors and aspect ratio arrays dataset and data loaders i have also started the machine learning engineering for production specialization from coursera here i have read about steps of ml project and case study ml project lifecycle and few more topics related to the same from here i have presented the implementation of dataset class here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resource
mentioned below excited about the days ahead resource machine learning engineering for production image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20290 png day291 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv loading and displaying an image accessing pixels array slicing and cropping resizing images rotating image smoothing image drawing on an image and few more topics related to the same i have also read about ml project lifecycle deployment patterns and pipeline monitoring from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in resizing and rotating and image smoothing and drawing on an image here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv notebook https github com thinamxx computervision blob main 01 20opencv opencv ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20291 png day292 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv counting objects converting image to grayscale edge detection thresholding detecting and drawing contours erosions and dilations masking and bitwise operations and few more topics related to the same from here i have also read about modeling overview key challenges and low average error from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in converting image to grayscale edge detection thresholding detecting and drawing contours erosions and dilations here in the 
snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv notebook https github com thinamxx computervision blob main 01 20opencv opencv ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20292 png day293 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv rotating images image preprocessing rotation matrix and center coordinates image parsing edge detection and contour detection masking and blurring images and few more topics related to the same from here i have also read about baseline model selecting and training model error analysis and prioritization from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in rotating images and getting roi of images here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv project i https github com thinamxx computervision blob main 01 20opencv ocv 20project 20i ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20293 png day294 of 300daysofdata histogram matching histogram matching can be used as a normalization technique in an image processing pipeline as a form of color correction and color matching which allows to obtain a consistent normalized representation of images even if lighting conditions change on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have 
read about opencv color detection rgb colorspace histogram matching pixel distribution cumulative distribution resizing image and few more topics related to the same from here i have also read about skewed datasets performance auditing data centric ai development and data augmentation from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in histogram matching here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv project ii https github com thinamxx computervision blob main 01 20opencv ocv 20project 20ii ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20294 png day295 of 300daysofdata histogram matching histogram matching can be used as a normalization technique in an image processing pipeline as a form of color correction and color matching which allows to obtain a consistent normalized representation of images even if lighting conditions change on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about convolutional neural networks convolutional matrix kernels spatial dimensions padding roi of image elementwise multiplication and addition rescaling intensity laplacian kernel detecting blur and smoothing and few more topics related to the same from here i have presented the implementation of convolution method and constructing kernels here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolution https github com 
thinamxx computervision blob main 02 20convolutionalneuralnetwork convolutions ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20295 png day296 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about convolutional layers filters and kernel size strides padding input data format dilation rate activation function weights and biases kernel and bias initializer and regularizer generalization and overfitting kernel and bias constraint caltech dataset strided net and few more topics related to the same from here i have presented the implementation of strided net here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolutional layer https github com thinamxx computervision blob main 02 20convolutionalneuralnetworks convolutional 20layers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20296 png day297 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about cnn architecture strided net label binarizer and one hot encoding image data generator and data augmentation loading and resizing images and few more topics related to the same from here i have presented the implementation of label binarizer and preparing dataset here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolutional layer https github com thinamxx computervision blob main 02 
20ConvolutionalNeuralNetworks/Convolutional%20Layers.ipynb)
- Image: https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20297.png

**Day298 of 300DaysOfData!**
On my journey of machine learning and deep learning, I have read and implemented from PyImageSearch blogs. Here, I have read about convolutional neural networks, the Adam optimization function, compiling and training the StridedNet model, data augmentation and `ImageDataGenerator`, classification reports, plotting training loss and accuracy, overfitting and generalization, and a few more topics related to the same. I have presented the implementation of compiling and training the model, the classification report, and the training loss and accuracy here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Resources: Machine Learning Engineering for Production; PyImageSearch: https://www.pyimagesearch.com
- Convolutional Layer: https://github.com/ThinamXX/ComputerVision/blob/main/02%20ConvolutionalNeuralNetworks/Convolutional%20Layers.ipynb
- Image: https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20298.png

**Day299 of 300DaysOfData!**
On my journey of machine learning and deep learning, I have read and implemented from the book *Deep Learning for Coders with fastai and PyTorch*. Here, I have read about the Transformers model, the GPT-2 pretrained model and tokenizer, the encode and decode methods, preparing the dataset, the transform method, data loaders, and a few more topics related to the same. I have presented the implementation of the pretrained GPT-2 model and tokenizer and the transformed dataloaders using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book: Deep Learning for Coders with fastai and PyTorch
- Transformers: https://github.com/ThinamXX/Fastai/blob/main/20%20Transformers/Transformers.ipynb
- Images: https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20299a.png and Day%20299b.png

**Day300 of 300DaysOfData!**
On my journey of machine learning and deep learning, I have read and implemented from the book *Deep Learning for Coders with fastai and PyTorch*. Here, I have read about the Transformers model, data loaders, batch size and sequence length, language-model fine-tuning of the GPT-2 model, callbacks, the Learner, perplexity and the cross-entropy loss function, the learning-rate finder, training, and generating predictions, and a few more topics related to the same. I have presented the implementation of initializing dataloaders, fine-tuning the GPT-2 model, and the LR finder using fastai and PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: Deep Learning for Coders with fastai and PyTorch
- Transformers: https://github.com/ThinamXX/Fastai/blob/main/20%20Transformers/Transformers.ipynb
- Images: https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20300a.png and Day%20300b.png | machine-learning deep-learning python | ai |
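The Day 299 and Day 300 notes above mention perplexity and the cross-entropy loss used when fine-tuning GPT-2. As an illustrative aside (this is not code from the notebooks, and the function names are mine), perplexity is simply the exponential of the mean negative log-likelihood of the target tokens:

```python
import math

def cross_entropy(target_probs):
    """Mean negative log-likelihood of the probabilities assigned to the target tokens."""
    return -sum(math.log(p) for p in target_probs) / len(target_probs)

def perplexity(target_probs):
    """Perplexity is the exponential of the cross-entropy."""
    return math.exp(cross_entropy(target_probs))
```

Intuitively, a perplexity of k means the model is, on average, as uncertain as a uniform choice among k tokens; a uniform distribution over 4 tokens gives a perplexity of exactly 4.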
helix-ui | # HelixUI

[![Build Status](https://travis-ci.com/HelixDesignSystem/helix-ui.svg?branch=master)](https://travis-ci.com/HelixDesignSystem/helix-ui)
![HelixUI CI Build Pipeline](https://github.com/HelixDesignSystem/helix-ui/actions/workflows/build.yml/badge.svg)
![HelixUI SauceLabs Tests](https://github.com/HelixDesignSystem/helix-ui/actions/workflows/browser-tests.yml/badge.svg)

## HelixUI Web Components Library

The HelixUI Web Components library provides front-end developers a full suite of web components for building UIs. Adopting the library will enable developers to build products with consistent markup, styles, and behavior across a variety of frameworks.

### Documentation

- [Homepage](https://helixdesignsystem.github.io/helix-ui/)
- [Getting Started guide](https://helixdesignsystem.github.io/helix-ui/guides/getting-started/)
- [npm releases](https://www.npmjs.com/package/helix-ui)

### Helix React Wrappers (React Support)

Our sister repo, [helix-react](https://github.com/HelixDesignSystem/helix-react), provides React component wrappers for HelixUI.

- [npm releases](https://www.npmjs.com/package/@helix-design-system/helix-react)
- [React compatibility with web components](https://helixdesignsystem.github.io/helix-ui/guides/react-compatibility/)

### Developer Setup

1. Install [Yarn](https://yarnpkg.com/en/docs/install)
2. `git clone git@github.com:HelixDesignSystem/helix-ui.git` (recommended for 2FA setups)
3. `cd helix-ui`
4. `yarn install` to install project dependencies
5. `yarn start`
6. Navigate to http://127.0.0.1:3000 in your favorite browser

### Run Component Test Suite

- Initial run, or on code updates: `yarn test:build` to compile code and run the full component test suite
- Just making changes to tests: `yarn test` to run the full component test suite | helix-ui-webcomponents webcomponents-framework helix-design-system | os |
crosstalk-generation | # Crosstalk Generation

:rocket::rocket::rocket: [中文 README](https://github.com/anonno2/crosstalk-generation/blob/main/README.zh-CN.md) | English :rocket::rocket::rocket:

- :tada: The largest Chinese open-source crosstalk dataset so far, collected from the internet, books and other open-source projects.
- :tada: Data types include stand-up crosstalk, paired-up crosstalk, group crosstalk and sketches.
- :tada: Since this study is only for crosstalk, the training set provided only includes crosstalk; for other types, please refer to the initial-data part of the data description.

Contents: Quick start · Data description · Model description · Machine metrics · Innovative metrics · Human scoring metrics · Correlation coefficient between human metrics and machine metrics · Generate examples

## Quick start

![Container server](https://github.com/anonno2/crosstalk-generation/blob/main/img/container_server.png)

| model type | docker file | Google Drive file | Baidu Drive |
| --- | --- | --- | --- |
| T5 | `anon2010/crosstalk_gen_env:t5` | [download](https://drive.google.com/file/d/1xcrhkk1d7cakeyjyctwbirf0kaieebsr/view?usp=sharing) | [download](https://pan.baidu.com/s/1wv73_io7oytujzbkddf9iw?pwd=t8i9) |
| GPT / Pangu-α / CPM | to be continued | | |

Steps for usage:

1. First, download the relevant images and files.
2. Start the model image (take T5 as an example here):

   ```bash
   docker run -dp 32488:8080 --memory=3g -v <model_res>:/model_res anon2010/crosstalk_gen_env:t5
   ```

3. Access the host port.

Notes: `model_res` is the decompression path of the model files; `--memory` specifies the maximum memory limit (too small may cause the container to fail to start).

## Data description

The data is divided into two parts: initial data and training data.

### Initial data

Includes scripts crawled from the internet: single performing, dual performing, group performing and sketch comedy.

- `src/common_data/meta.json`: metadata for the full data
- `CompleteMetaExportData.zip`: full data

| type | number |
| --- | --- |
| single performing | 168 |
| dual performing | 3685 |
| group performing | 256 |
| sketch comedy | 5222 |
| plain dialogue texts (extracted from the text above) | 4859 |
| full data | 9331 |

| statistic | number |
| --- | --- |
| total words | 16,481,376 |
| number of utterances | 663,305 |
| number of long utterances | 8,717 |
| number of short utterances | 446,756 |
| median word number of utterances | 16 |
| mean utterances per script | 71 |

Metadata format description (in-text formatting example):

```
isAllDialog: true        whether it is a pure dialogue format
charSize: 526            word count in this script
filePath: u399dy/...     relative path
roles: [...]             roles
sentenceSize: 25         utterance size
source: https://www.399dy.com/xiangsheng/11176.html
sourceIdx: 28            index
title: ...               title
type: ...                type
```

### Train data

The 2,948 dialogues were cleaned and screened from the initial data; the validation set is 368 dialogues selected from the initial data; and the test set is 10 rounds of dialogue from each of the 50 dialogues selected from the initial data. Opened in project `src/common_data`:

- `train_raw.zip`: training corpus in original format (contains meta information)
- `dev_raw.zip`: validation corpus in original format (contains meta information)
- `train_pure_text.txt`: training corpus in plain-text format
- `dev_pure_text.txt`: validation corpus in plain-text format
- `test_filter_50x20.txt`: test corpus (machine-metric calculation and manual evaluation are based on this data)

| data | number of scripts | number of utterances | word count |
| --- | --- | --- | --- |
| training corpus | 2948 | 173,194 | 3,184,664 |
| validation corpus | 368 | 41,482 | 905,201 |
| test corpus | 50 | 1,000 | 19,268 |

## Model description

- Finetune: GPT, T5, T5-small, UniLM, RNN, GPT-3
- Zero-shot inference: CPM-large, GPT-3, Pangu-α, Zhouwenwang

## Machine metrics

For details of the data files and benchmarking methods used for machine-metric scoring, see [machine metrics](https://github.com/anonno2/crosstalk-generation/blob/main/eval_data/machine_eval/README.md).

| model | BLEU-1 | BLEU-2 | BLEU-3 | BLEU-4 | GLEU | ROUGE-1 | ROUGE-2 | ROUGE-L | Distinct-1 | Distinct-2 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| gpt ep50 | 10.04 | 3.69 | 1.53 | 0.7 | 2.75 | 15.28 | 1.78 | 13.7 | 6.89 | 37.39 |
| t5 pesg ep15 | 11.75 | 5.58 | 3.13 | 1.77 | 3.94 | 20.8 | 4.98 | 19.25 | 9.02 | 42.68 |
| small t5 pesg ep95 | 11.71 | 5.39 | 2.93 | 1.67 | 3.64 | 19.98 | 4.37 | 18.61 | 8.08 | 36.38 |
| cpm-large | 7.94 | 2.87 | 1.19 | 0.5 | 1.68 | 9.88 | 1.28 | 8.83 | 5.82 | 34.43 |
| unilm ep45 | 8.88 | 4.32 | 2.47 | 1.41 | 3.36 | 20.22 | 4.91 | 18.98 | 7.53 | 29.90 |
| rnn | 11.77 | 4.02 | 1.47 | 0.57 | 2.49 | 17.25 | 2.13 | 15.94 | 4.73 | 16.23 |
| gpt3 base davinci | 14.68 | 7.45 | 4.44 | 2.77 | 5.13 | 22.25 | 5.65 | 20.03 | 8.43 | 40.7 |
| gpt3 ft200 davinci | 9.66 | 4.89 | 3.01 | 1.92 | 4.66 | 21.79 | 5.5 | 20.22 | 9.725 | 43.15 |
| gpt3 ft1000 davinci | 13.39 | 6.86 | 4.14 | 2.55 | 5.18 | 22.83 | 5.68 | 20.73 | 9.55 | 45.56 |
| Pangu-α | 6.42 | 2.09 | 0.83 | 0.37 | 1.31 | 7.0 | 0.75 | 6.14 | 8.25 | 50.98 |
| Zhouwenwang | 7.33 | 2.26 | 0.9 | 0.4 | 1.81 | 10.41 | 1.01 | 8.61 | 9.72 | 53.53 |

PS: the reason why gpt3-ft1000 is not included in the paper is that we only selected one of the similar models for manual marking; in a previous GPT-3 finetune manual comparison, the manual score of ft200 was higher:

| model | general quality | humor | coherence | ethically risky flag |
| --- | --- | --- | --- | --- |
| gpt3 ft200 davinci | 145 | 115 | 43 | 2 |
| gpt3 ft1000 davinci | 136 | 109 | 40 | 2 |

Some small discoveries:

1. GPT-3 davinci's machine metrics are significantly higher than any other generation method's.
2. Most large models' zero-shot (direct) inference is significantly worse than finetuning.
3. After GPT-3 goes through finetuning, its machine metrics drop, but in the manual scoring below it gets a higher score; higher machine metrics are not necessarily more in line with human reading habits.

## Innovative metrics

| n | Zhouwenwang | unilm | t5 | small t5-95 | rnn | Pangu-α | gpt3 finetune | gpt3 base | gpt ep50 | cpm-large |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 0.0695 | 0 | 0 | 0 | 0 | 0.00754 | 0 | 0.00293 | 0 | 0.04777 |
| 2 | 0.23223 | 0.07405 | 0.03709 | 0.02538 | 0.01244 | 0.10615 | 0.02337 | 0.028 | 0.00826 | 0.22447 |
| 3 | 0.58549 | 0.37475 | 0.22624 | 0.19325 | 0.10785 | 0.43833 | 0.16998 | 0.18063 | 0.13619 | 0.53327 |
| 4 | 0.83795 | 0.68884 | 0.51332 | 0.45936 | 0.32618 | 0.73634 | 0.4297 | 0.42368 | 0.43385 | 0.78466 |
| 5 | 0.94343 | 0.86874 | 0.75219 | 0.68772 | 0.62232 | 0.91339 | 0.67156 | 0.64273 | 0.73764 | 0.91419 |
| 6 | 0.97811 | 0.94026 | 0.88568 | 0.82587 | 0.84941 | 0.97563 | 0.81931 | 0.76029 | 0.91134 | 0.9615 |
| 7 | 0.98968 | 0.96837 | 0.94607 | 0.90037 | 0.94703 | 0.99402 | 0.88814 | 0.80665 | 0.97395 | 0.98139 |
| 8 | 0.99347 | 0.98052 | 0.97288 | 0.94256 | 0.98062 | 0.9987 | 0.91946 | 0.82957 | 0.99267 | 0.99015 |
| 9 | 0.99544 | 0.98527 | 0.98457 | 0.9669 | 0.99051 | 0.99972 | 0.93244 | 0.84394 | 0.99757 | 0.99252 |
| 10 | 0.99711 | 0.98799 | 0.99007 | 0.98092 | 0.99472 | 1 | 0.93793 | 0.85047 | 0.9993 | 0.99393 |

![Innovative metrics](https://github.com/anonno2/crosstalk-generation/blob/main/img/image-20220417153303002.png)

Some small discoveries:

1. Zhouwenwang, CPM and Pangu-α zero-shot inference is more innovative.
2. Among the finetuned models, UniLM and T5 are relatively more innovative.
3. Models with higher innovation evaluations often generate more randomly.

## Human scoring metrics

The numbers in parentheses in the column names are the human scores of the real data. The combined score and humor score are capped at 750; fluency and discrimination scores are capped at 150. For details of the data files used for manual scoring and the scoring method, see [manual scoring](https://github.com/anonno2/crosstalk-generation/blob/main/eval_data/human_eval/README.md).

Test design: here we infer with a total of 10 pre-trained models and large models (excluding T5-small, because its effect is not as good as T5's), plus the original data, feeding the first 10 sentences of each of the 50 test-set segments to infer the next 10 sentences. A total of 30 subjects with different genders, occupations, ages and beliefs were recruited and asked to rate the samples generated by the models. The comprehensive score and humor score were scored on a 5-point scale, and fluency and discrimination on a 1-point scale (1 if yes, otherwise 0).

| model | general quality (528) | humor (519) | coherence (143) | ethically risky flag (3) |
| --- | --- | --- | --- | --- |
| gpt ep50 | 225 | 256 | 59 | 2 |
| t5 pesg ep15 | 270 | 296 | 76 | 7 |
| cpm-large | 213 | 240 | 60 | 34 |
| unilm ep45 | 276 | 301 | 84 | 2 |
| rnn | 217 | 242 | 41 | 4 |
| gpt3 base davinci | 322 | 325 | 98 | 5 |
| gpt3 ft200 davinci | 341 | 353 | 106 | 2 |
| Pangu-α | 230 | 257 | 63 | 4 |
| Zhouwenwang | 184 | 191 | 28 | 8 |

Some small discoveries:

1. Although GPT-3's machine metrics decline after finetuning, it ranks first in all dimensions of manual scoring; machine metrics don't fully assess how well generation works.
2. Compared with GPT and RNN, the generation quality of model structures such as UniLM and T5 is significantly improved.

## Correlation coefficient between human metrics and machine metrics

![Pearson correlation coefficient](https://github.com/anonno2/crosstalk-generation/blob/main/img/pearsonr-correlation%20coefficient.png)
![Spearman correlation coefficient](https://github.com/anonno2/crosstalk-generation/blob/main/img/spearman-correlation%20coefficient.png)

## Generate examples

- Short stories generated by GPT-3: [Chinese](https://github.com/anonno2/crosstalk-generation/blob/main/gpt3_generate_samples/short_sample.zh-CN.md) | [English](https://github.com/anonno2/crosstalk-generation/blob/main/gpt3_generate_samples/short_sample.en-US.md)
- Long stories generated by GPT-3: [Chinese](https://github.com/anonno2/crosstalk-generation/blob/main/gpt3_generate_samples/long_sample.zh-CN.md) | [English](https://github.com/anonno2/crosstalk-generation/blob/main/gpt3_generate_samples/long_sample.en-US.md)

Interactive generation: Demo 1, Demo 2 and Demo 3 show alternating user/chatbot turns. Script generation: Demo 1, Demo 2 and Demo 3. Example of generating different models in the same context: for each demo, the same context is continued by UniLM, GPT, GPT-3 and Seq2Seq. | chinese crosstalk-dataset gpt-2 humor-generation pretrained-models t5 text-generation | ai |
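The machine-metrics table above reports Distinct-1 and Distinct-2 alongside BLEU and ROUGE. As an illustrative sketch (not the repository's evaluation code; the function name is an assumption), a distinct-n score is the ratio of unique n-grams to total n-grams across the generated texts:

```python
def distinct_n(texts, n):
    """Ratio of unique n-grams to total n-grams across a list of generated texts."""
    total, unique = 0, set()
    for text in texts:
        tokens = text.split()
        ngrams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
        total += len(ngrams)
        unique.update(ngrams)
    return len(unique) / total if total else 0.0
```

Higher values mean less repetitive output, which matches the paper's observation that the most "innovative" (more random) models also post the highest Distinct scores.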
Qix | # Qix

[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square)](http://makeapullrequest.com)
![GitHub stars](https://img.shields.io/github/stars/ty4z2008/Qix.svg?style=plastic)
![GitHub forks](https://img.shields.io/github/forks/ty4z2008/Qix.svg?color=blue&style=plastic)

### About me

- Weibo: [Jun](http://weibo.com/ty4z2008)
- Twitter: [@ty4z2008](https://twitter.com/ty4z2008)
- E-mail: ty4z2008@gmail.com
- Scale system channel: [t.me/scalesystem](https://t.me/scalesystem)

Note: there may be some incorrect information in the articles; I hope I can correct errors together with you. You can contact me by email or PR. Pull requests welcome. :blush:

### My translation (node-mysql document translate)

- [node-mysql official document](https://github.com/felixge/node-mysql/blob/master/Readme.md)
- [node-mysql Chinese document](https://github.com/ty4z2008/Qix/blob/master/node.md)

### Machine learning and deep learning resources

- [Chapter 1](https://github.com/ty4z2008/Qix/blob/master/dl.md)
- [Chapter 2](https://github.com/ty4z2008/Qix/blob/master/dl2.md)

### Golang learning resources

- [Chapter 1](https://github.com/ty4z2008/Qix/blob/master/golang.md)

### PostgreSQL database resources

- [Chapter 1](https://github.com/ty4z2008/Qix/blob/master/pg.md)

### Distributed system resources

- [Chapter 1](https://github.com/ty4z2008/Qix/blob/master/ds.md)

### Database system resources

- [Chapter 1](https://github.com/ty4z2008/Qix/blob/master/db.md)

### Additional notes

Dear friends, in order to respect the efforts of authorship: when you find during reading that a resource's authorship is incorrect, please [submit feedback](https://github.com/ty4z2008/Qix/issues). Thanks, buddy!

### License

[MIT License](https://github.com/ty4z2008/Qix/blob/master/LICENSE.md) | machine-learning distributed-systems postgresql deep-learning go awesome distributed-computing distributed-database | ai |
ikago-web | # IkaGo Web

IkaGo Web is a front-end interface for [IkaGo](https://github.com/zhxie/ikago).

## License

IkaGo Web is licensed under the MIT [LICENSE](LICENSE). | front_end |
|
learn_blockchain | learn blockchain leigg https github com chasespace 2022 10 27 2023 02 26 1 1 1 b 2h 2019 https www bilibili com video bv1gt411t7tq b 19h56m 2018 https www bilibili com video bv1vt411x7jf with golang youtube 2h48m building a blockchain in golang https www youtube com playlist list plpp5mqvvi4pgmnygesshrlvue2b33xv1l with rust youtube 2h30m build a cryptocurrency blockchain in rust https www youtube com watch v vjdt05zl6jk youtube 2h22m build a bitcoin like blockchain with substrate beginner friendly https www youtube com playlist list plp0 uexy enxeturzk2frt7muxf2hz6sn 1 2 2018 https book douban com subject 30280401 https github com inoutcode bitcoin book 2nd 2018 https book douban com subject 27161851 https github com inoutcode ethereum book 2020 https baike baidu com item 56688853 fr aladdin 1 3 https learnblockchain cn 2019 11 08 zkp info with rust list of top blockchains using the rust programming language https 101blockchains com top blockchains using rust programming language mario zupan how to build a blockchain in rust https blog logrocket com how to build a blockchain in rust mario zupan substrate blockchain development core concepts https blog logrocket com substrate blockchain framework core concepts mario zupan how to build a custom blockchain implementation in rust using substrate https blog logrocket com custom blockchain implementation rust substrate 1 4 how to become a blockchain engineer https betterprogramming pub how to become a blockchain engineer fa4386a0504f https news marsbit co 20230210125706294078 html https news marsbit co 20221021155340570531 html liquid https www btcstudy org 2022 08 29 six differences between liquid and lightning news 2022 10 https news marsbit co 20221221121035896551 html l0 l1 l2 l3 https www 51cto com article 717194 html https foresightnews pro article detail 15899 web3 https foresightnews pro article detail 20341 12 https news marsbit co 20230113105059573706 html celestia dymension fuel https news 
marsbit co 20221201123135995808 html web3 web3 https www 8btc com article 6800520 op zk validium plasma cobo ventures https foresightnews pro article detail 19834 https mirror xyz 0x8b00cee42f226b340af806cd7aaa4c10cc5e0154 v3ziyxecqhfvtdm m9s uhkmsuosrra9kreneewp m8 https ethereum org zh developers docs scaling https ethereum org zh developers docs scaling sidechains https www 528btc com college 160273260170283 html op vs zk rollup https www defidaonews com media 6756250 optimism vs zk rollup https www techflowpost com article 1929 rollup zk rollups optimistic arbitrum https www chaincatcher com article 2087449 l2 optimism arbitrum https www ccvalue cn article 1400517 html zk rollup https mirror xyz bubai eth cfijx6asjerg mdyccn7qyf0t88ey3fck f7mnltq9i 5 optimism https chainfeeds xyz feed detail d427a20e fe6a 49f9 b9dc 48e8496f4db3 layer2 arbitrum zk https www tuoluo cn article detail 10071107 html layer2 op arbitrum https www ccvalue cn article 1407807 html news rollup https news marsbit co 20220718215520388582 html https www tuoluo cn article detail 10101056 html https foresightnews pro article detail 23040 https zhuanlan zhihu com p 598585397 2022 60 https foresightnews pro article detail 22241 2022 60 https foresightnews pro article detail 22857 1 5 2022 12 16 aztec 1 https news marsbit cc 20221216092927524453 html 2022 11 17 2022 top100 hyperledgerfabric 26 18 quorum 11 https www 8btc com article 6788632 2 2 1 go ethereum book https goethereumbook org client 2 2 https www bcskill com index php archives 1133 html ethereum wallets a beginner s guide to storing eth https cointelegraph com ethereum for beginners ethereum wallets a beginners guide to storing eth hd bip32 bip44 bip39 https www 8btc com article 334792 3 3 1 https item jd com 10057770151476 html basic exercises test solidity basic exercises 3 2 wtf https wtf academy 2023 1 web3 solidity by example https solidity by example org solidity defi repo https github com consensys ethereum developer tools list 
blob master readme chinese md 3 3 erc usdt erc 20 learn smartcontract nft series erc20 example md 3 4 nft nft learn smartcontract nft beginner series nft learn smartcontract other learn famous project code md gas learn smartcontract nft series saving gas coding md openzeppelin learn smartcontract other how to upgrade contract md 3 5 defi todo 3 6 dao todo 3 7 learn smartcontract other security coding md learn smartcontract other skilled coding md gas learn smartcontract nft series saving gas coding md the solcurity standard 2 3 solidity swc registry 4 web3 5 3 8 https www yuanyuzhouneican com article 161517 html 4 1 2w blockchain introduce md 40 blockchain industries md 15 blockchain finance md 3 9w blockchain tech detail md 1w cryptograph md 3 4w consensus md 3 5k bitcoin intro md 4k bitcoin development md 2 5w bitcoin tech detail md 2w bitcoin usage md 8k ethereum intro md 3 7w ethereum tech detail md 1 4w smart contract md 6k smart contract dev guide md 2w ethereum execute contract md nft 2 6w nft overview md todo defi defi overview md todo gamefi gamefi overview md todo ipfs ipfs filecoin overview md issue md issue license cc by nc sa 4 0 2 https github com transmissions11 solcurity 3 https mp weixin qq com s fcnz4p52ku0ey469zqdx2a 4 https swcregistry io 5 https learnblockchain cn article 4202 | blockchain |
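As a first intuition for the material in the reading list above, the core blockchain invariant (each block commits to the hash of its predecessor) can be sketched in a few lines of Python. This is an illustrative toy, not code from any resource listed here:

```python
import hashlib
import json

def block_hash(fields):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    payload = json.dumps(fields, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(index, data, prev_hash):
    """Build a block whose hash covers its index, data, and the previous block's hash."""
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash({k: block[k] for k in ("index", "data", "prev_hash")})
    return block

def valid_chain(chain):
    """Each block must reference the hash of its predecessor; any tampering breaks the link."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev_hash"] != prev["hash"]:
            return False
    return True
```

Changing any field of an earlier block changes its hash, which invalidates every later block's `prev_hash` link; that is the tamper-evidence property the resources above build on.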
|
checsdm | checsdm consistency of heterogeneous embedded control system design models | os |
|
NLP-Nanodegree | # Udacity Natural Language Processing Nanodegree

![NLP](nlpnd_certificate.png)

### Projects of Natural Language Processing

1. [Part of Speech Tagger with Hidden Markov Model](https://github.com/udacity/hmm-tagger)
2. [Neural Machine Translation with Recurrent Neural Networks](https://github.com/udacity/aind2-nlp-capstone)
3. [End-to-End Automatic Speech Recognition with Convolutional Neural Networks and Recurrent Neural Networks](https://github.com/udacity/AIND-VUI-Capstone) | ai |
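Project 1 above builds a part-of-speech tagger from a hidden Markov model, which is typically decoded with the Viterbi algorithm. The sketch below is illustrative only (it is not the Udacity starter code, and the states and probabilities are made up):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence under an HMM."""
    # V[t][s] = (best probability of any path ending in state s at step t, predecessor state)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # Backtrack from the best final state.
    state = max(V[-1], key=lambda s: V[-1][s][0])
    path = [state]
    for t in range(len(obs) - 1, 0, -1):
        state = V[t][state][1]
        path.append(state)
    return path[::-1]
```

For a POS tagger, `obs` would be the words of a sentence and `states` the tag set; the real project estimates the probability tables from a tagged corpus.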
|
frontend-tips | # Front-end tips

A series of super tiny, quick tips, tricks and best practices of front-end development. The series covers different topics: CSS, HTML, JavaScript, TypeScript, browser developer tools.

## Contributing

Pull requests are welcomed. To submit your favorite tip, please create a markdown file and put it in the [contents](contents) folder. The content of the markdown file has to look like:

```md
---
category: ...
created: ...
tags: ...
title: ...
---

The content of post
```

- `category` can be one of `tip`, `trick` or `practice`
- `created`: the date that the post is created
- `tags`: the list of topic(s), separated by a comma
- `title`: must match with the file name

Here [contents/convert-string-to-number.mdx](contents/convert-string-to-number.mdx) is an example.

## About

This project is developed by Nguyen Huu Phuoc. I love building products and sharing knowledge.

Be my friend on [Twitter](https://twitter.com/nghuuphuoc) and [GitHub](https://github.com/phuocng). | front-end front-end-development html css javascript developer-tools tips-and-tricks best-practices eleventy | front_end |
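The front-matter format described above lends itself to a mechanical check before submitting a tip. The sketch below is illustrative only (it is not part of the frontend-tips project, and the helper names and category list placement are assumptions based on the description above):

```python
REQUIRED = {"category", "created", "tags", "title"}
CATEGORIES = {"tip", "trick", "practice"}

def parse_front_matter(markdown):
    """Extract the key: value pairs between the leading '---' fences."""
    lines = markdown.strip().splitlines()
    if not lines or lines[0] != "---":
        raise ValueError("missing front-matter fence")
    end = lines.index("---", 1)
    meta = dict(line.split(":", 1) for line in lines[1:end])
    return {k.strip(): v.strip() for k, v in meta.items()}

def is_valid_tip(markdown):
    """A tip needs all required keys and a category from the allowed set."""
    meta = parse_front_matter(markdown)
    return REQUIRED <= meta.keys() and meta["category"] in CATEGORIES
```

A file missing a key, or using a category outside `tip`/`trick`/`practice`, would fail the check.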
TTK4155 | # TTK4155

[Embedded and Industrial Computer Systems Design](http://www.ntnu.edu/studies/courses/TTK4155), in collaboration with [Khuong Huynh](https://github.com/khuongh) and [Håvard Olai Kopperstad](https://github.com/haavardok).

This repository includes all developed software for the course project. The project's goal was to develop an embedded system for a fully functional one-player ping-pong game with an Atmel AVR [ATmega162](https://ww1.microchip.com/downloads/en/DeviceDoc/Atmel-2513-8-bit-AVR-Microntroller-ATmega162_Datasheet.pdf) and an Arduino Uno, including both software and hardware. The system included user controls with touchpads and a joystick; an LCD display with a menu, game settings and statistics; and a ping-pong board including a motor encoder, servo and solenoid for controlling the racket, and an IR sensor for detecting the ball with PWM.

This work allowed me to develop my skills in developing embedded systems, including low-level programming, design of electrical circuits, bus communication (CAN bus) and implementation of discrete control systems (PID controller for the racket).
<br>
<img src="ttk4155_messy_picture.jpg" alt="alt text" title="Optional title" style="display: inline-block; margin: 0 auto; max-width: 300px">

Excuse the messy picture. | os |
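The discrete PID controller mentioned above for the racket position can be sketched in a few lines. This is an illustrative position-form PID, not the course project's actual implementation (which runs in C on the AVR); the gains and sample time are hypothetical:

```python
class PID:
    """Minimal discrete PID controller (position form) with sample time dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        """One control update: returns the actuator command for this sample."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In the project's setting, the setpoint would come from the joystick position and the measurement from the motor encoder, with the output driving the racket motor each control period.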
|
Camerafeed | # Camerafeed

A simple app (or it can be used as a module) to track people as they move in a video. You can also draw triplines that will emit events when a person crosses them. You can use the camerafeed module in your own app, or you can run it as a stand-alone app that pulls from `settings.ini`.

## Getting Started

1. First things first, you gotta install Python 3.4 and OpenCV 3.0. Luckily, computer vision expert Adrian has made his [fantastic tutorials](http://www.pyimagesearch.com/opencv-tutorials-resources-guides/) available.
2. Clone the repo:

   ```bash
   git clone git@github.com:liquidg3/camerafeed.git
   ```

3. Jump into the repo:

   ```bash
   cd camerafeed
   ```

4. Install additional dependencies:

   ```bash
   python setup.py develop
   ```

5. Run the app:

   ```bash
   python run.py
   ```

## Settings

Check out `settings.ini` for everything you can customize when running camerafeed as an app.

```ini
; the video to load, put 0 to use your computer's camera
source = footage/sample_trimmed.m4v

; the following are for cropping the video: try and only show parts you need
frame_x1 = 250
frame_y1 = 40
frame_x2 = 550
frame_y2 = 240

; set a max width and the video will be proportionally scaled to this size (smaller is usually better)
max_width = 500

; black and white
b_and_w = false

; settings for hog.detectMultiScale
hog_win_stride = 4
padding = 6
scale = 1.05

; use background removal
mog_enabled = false

; when tracking a person
person_life = 20    ; how many frames to wait before considering the person gone
max_distance = 50   ; how far can a person move between detections

; for crossing lines (triplines)
total_lines = 1
line1_start = 100,60
line1_end = 200,160
line1_buffer = 10
line1_direction_1 = north
line1_direction_2 = south

line2_start = 50,180
line2_end = 200,150
line2_buffer = 10
line2_direction_1 = out
line2_direction_2 = in
```

## Recommended Reading

1. [Pedestrian Detection OpenCV](http://www.pyimagesearch.com/2015/11/09/pedestrian-detection-opencv/)
2. [HOG detectMultiScale parameters explained](http://www.pyimagesearch.com/2015/11/16/hog-detectmultiscale-parameters-explained/) | ai |
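Tripline crossing, as described above, reduces to a sign test: a point's side of the line is the sign of a 2D cross product, and a crossing event fires when that sign flips between consecutive frames. This is an illustrative sketch (not camerafeed's actual implementation; the function names are mine), reusing the `line1_start`/`line1_end` coordinates from `settings.ini`:

```python
def side_of_line(start, end, point):
    """Sign of the 2D cross product: which side of the tripline a point is on."""
    (x1, y1), (x2, y2), (px, py) = start, end, point
    cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
    return (cross > 0) - (cross < 0)  # -1, 0, or +1

def crossed(start, end, prev_pos, cur_pos):
    """A tracked person crosses the tripline when their side flips between frames."""
    a = side_of_line(start, end, prev_pos)
    b = side_of_line(start, end, cur_pos)
    return a != 0 and b != 0 and a != b
```

In practice a tracker would call `crossed(line1_start, line1_end, previous_centroid, current_centroid)` once per frame for each person, and the sign of the new side tells you the direction (e.g. north vs. south).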
|
Automotive-Buzzer-and-Light-Control | # Automotive Buzzer and Light Control

A demonstration of diagrams related to embedded systems' static and dynamic design concepts. The system used for illustration consists of 2 microcontrollers communicating with each other using the CAN protocol.

- The static design diagrams are: block diagram, layered architecture.
- The dynamic design diagrams are: state machine, sequence diagram. | os |
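As a rough illustration of what travels between the two microcontrollers, a classic CAN data frame carries an 11-bit identifier, a data length code (DLC), and up to 8 payload bytes. The host-side Python sketch below only packs those logical fields for illustration; it is not part of this project, and real CAN controllers handle arbitration, CRC and bit stuffing in hardware:

```python
import struct

def pack_can_message(can_id, data):
    """Pack the logical fields of a classic CAN frame: 11-bit ID, DLC, 8 data bytes."""
    if not 0 <= can_id < 2**11:
        raise ValueError("standard CAN identifiers are 11 bits")
    if len(data) > 8:
        raise ValueError("classic CAN payloads are at most 8 bytes")
    # Big-endian: 2-byte ID field, 1-byte DLC, fixed 8-byte (zero-padded) payload.
    return struct.pack(">HB8s", can_id, len(data), data.ljust(8, b"\x00"))
```

A buzzer-on command from one node to the other could then be modeled as `pack_can_message(0x123, b"\x01")`, with the receiver unpacking the same layout.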
|
Hybrid-Mobile-Development-with-Ionic | # Hybrid Mobile Development with Ionic

This is the code repository for [Hybrid Mobile Development with Ionic](https://www.packtpub.com/application-development/hybrid-mobile-development-ionic?utm_source=github&utm_medium=repository&utm_content=9781785286056), published by Packt. It contains all the supporting project files necessary to work through the book from start to finish.

## About the Book

This book will help you to develop a complete, professional and quality mobile application with the Ionic framework. You will start the journey by learning to configure, customize and migrate Ionic 1.x to 3.x. Then you will move on to Ionic 3 components and see how you can customize them according to your applications. You will also implement various native plugins and integrate them with Ionic and Ionic Cloud services to use them optimally in your application. By this time, you will be able to create a full-fledged e-commerce application. Next, you will master authorization, authentication and security techniques in Ionic 3 to ensure that your application and data are secure. Further, you will integrate backend services such as Firebase and the Cordova iBeacon plugin in your application. Lastly, you will be looking into Progressive Web Applications and their support with Ionic, with a demonstration of an offline-first application.

## Instructions and Navigation

All of the code is organized into folders. The commands and instructions will look like the following:

```js
// Angular-Ionic 1.x
angular.module('wedding.controllers', [])
.controller('LoginCtrl', function($scope, CategoryService) {
  // controller function and DI of CategoryService
});

// Angular 4 - Ionic 3
import { Component } from '@angular/core';
import { NavController } from 'ionic-angular';

@Component({
  templateUrl: 'build/pages/catalog/categories.html'
})
export class CategoryPage {
  // DI of NavController for navigation
  constructor(private navCtrl: NavController) {
    this.nav = navCtrl;
  }
}
```

## Related Products

- [Beginning Ionic Hybrid Application Development [Video]](https://www.packtpub.com/web-development/beginning-ionic-hybrid-application-development-video?utm_source=github&utm_medium=repository&utm_content=9781785284465)
- [Hybrid Cloud Management with Red Hat CloudForms](https://www.packtpub.com/virtualization-and-cloud/hybrid-cloud-management-red-hat-cloudforms?utm_source=github&utm_medium=repository&utm_content=9781785283574)
- [Getting Started with Ionic](https://www.packtpub.com/application-development/getting-started-ionic?utm_source=github&utm_medium=repository&utm_content=9781784390570)

## Suggestions and Feedback

[Click here](https://docs.google.com/forms/d/e/1FAIpQLSe5qwunkgf6puvzpirpdtuy1du5rlzew23ubp2s-p3wb-gcwq/viewform) if you have any feedback or suggestions.

## Download a free PDF

<i>If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost.<br>Simply click on the link to claim your free PDF.</i>
<p align="center"><a href="https://packt.link/free-ebook/9781785286056">https://packt.link/free-ebook/9781785286056</a></p> | front_end |
|
design-system | <p align="center">
  <a href="#">
    <img alt="" src="https://innovaccer.com/static/image/site-logo/innovaccer-logo-black.svg" width="20%" />
  </a>
</p>
<h1 align="center">Masala Design System</h1>

Masala Design System (MDS) is an open-source design system built at Innovaccer. This is a simple and customizable component library to build faster, beautiful and more accessible React applications on the guidelines and principles of Masala Design System.
<br>

<div align="center">

[![Codecov](https://codecov.io/gh/innovaccer/design-system/branch/master/graph/badge.svg?token=2LY7JLZGX0)](https://codecov.io/gh/innovaccer/design-system)
![GitHub](https://img.shields.io/github/license/innovaccer/design-system)
![GitHub top language](https://img.shields.io/github/languages/top/innovaccer/design-system)
![Snyk Vulnerabilities for GitHub Repo](https://img.shields.io/snyk/vulnerabilities/github/innovaccer/design-system)

</div>
<br>

## Get up and running

To install `@innovaccer/design-system` in your project, you will need to run the following command using [npm](https://www.npmjs.com/):

```bash
npm install @innovaccer/design-system
```

If you prefer [Yarn](https://yarnpkg.com/en/), use the following command instead:

```bash
yarn add @innovaccer/design-system
```

## Adding style

Import style at your app's root. It is not included in the library bundle and is shipped as a single CSS file. For more details, see our styling section below.

```js
import '@innovaccer/design-system/css';
```

If you want to try out @innovaccer/design-system, you can also use [CodeSandbox](https://codesandbox.io/s/focused-germain-shbcw).

[![Edit @innovaccer/design-system](https://codesandbox.io/static/img/play-codesandbox.svg)](https://codesandbox.io/s/focused-germain-shbcw)

## Usage

```js
import { Button } from '@innovaccer/design-system';

const App = () => <Button>Done</Button>;
```

For more information about each component, check out our [Storybook](https://innovaccer.github.io/design-system). Check out our tutorial [docs/AppTutorial.md](docs/AppTutorial.md) to guide you in creating an awesome app.

## CDN

If you prefer to include the library globally by marking it as external in your application, [unpkg](https://unpkg.com/browse/@innovaccer/design-system/) provides various single-file distributions, which can be used as following:

```html
<!-- style -->
<link href="https://unpkg.com/@innovaccer/design-system@2.5.0-3/css/dist/index.css" rel="stylesheet" />

<!-- un-compressed UMD -->
<script src="https://unpkg.com/@innovaccer/design-system@2.5.0-3/dist/index.umd.js"></script>

<!-- brotli-compressed UMD -->
<script src="https://unpkg.com/@innovaccer/design-system@2.5.0-3/dist/index.umd.js.br"></script>

<!-- gzip-compressed UMD -->
<script src="https://unpkg.com/@innovaccer/design-system@2.5.0-3/dist/index.umd.js.gz"></script>
```

## Styling

As this component library is part of a framework-agnostic design system used at Innovaccer, the styling is done with CSS, using CSS variables for theming and the BEM methodology for reusable and modular styling. So it requires you to include CSS in your project by either importing it or serving it as a static file. The complete stylesheet is published as part of the component library at the path `@innovaccer/design-system/css`. You can include the CSS by importing it or loading it from a CDN.

### Using font

The CSS sets the font family as Nunito Sans for the body. To add this font in your project, you need to load this font; the recommended way to do it is by adding the following Google Fonts CDN link to your app's head:

```html
<link href="https://fonts.googleapis.com/css?family=Nunito+Sans:300,300i,400,400i,600,600i,700,700i,800,800i,900,900i&display=swap" rel="stylesheet" />
```

### Updating font

If you don't add the font described above, the font family will not be affected by the CSS. However, if you want to update the font family, update it via the following CSS variable:

```css
--font-family
```

### Reset styles

As BEM is used, reset.css is not used and no style reset is done.

## Polyfill for IE

For CSS variables to work on IE, we use a polyfill at runtime to achieve dynamic theming through variables. Please add the following polyfill in your page:

```html
<script src="https://cdn.jsdelivr.net/npm/css-vars-ponyfill@2"></script>
<script>
  cssVars({ onlyLegacy: true });
</script>
```

## :card_file_box: Repos

Here are the supporting
repositories mds rich text editor https github com innovaccer mds rich text editor feature rich wysiwyg what you see is what you get html editor and wysiwyg markdown editor it is used to create blogs notes sections comment sections etc it has a variety of tools to edit and format rich content mds docs https github com innovaccer mds docs documentation site for masala design system mds helpers https github com innovaccer mds helpers alert service books documentation masala design system http design innovaccer com components storybook https innovaccer github io design system code of conduct we expect everyone participating in the community to abide by our code of conduct https github com innovaccer design system blob master code of conduct md please read it please follow it we work hard to build each other up and create amazing things together how to contribute whether you re helping us fix bugs improve the docs or spread the word we d love to have you as part of the community muscle purple heart check out our contributing guide https github com innovaccer design system blob master contributing md for ideas on contributing and setup steps for getting our repositories up and running on your local machine contributors thanks goes to these wonderful people emoji key https allcontributors org docs en emoji key all contributors list start do not remove or modify this section prettier ignore start markdownlint disable table tr td align center a href https github com aditya kumawat img src https avatars githubusercontent com u 12715487 v 4 s 100 width 100px alt br sub b aditya kumawat b sub a br a href https github com innovaccer design system commits author aditya kumawat title code a td td align center a href https riyalohia github io portfolio img src https avatars githubusercontent com u 31706090 v 4 s 100 width 100px alt br sub b riya lohia b sub a br a href https github com innovaccer design system commits author riyalohia title code a td td align center a href http 
satyamyadav info img src https avatars githubusercontent com u 3583587 v 4 s 100 width 100px alt br sub b satyam yadav b sub a br a href https github com innovaccer design system commits author satyamyadav title code a a href https github com innovaccer design system commits author satyamyadav title documentation a a href https github com innovaccer design system pulls q is 3apr reviewed by 3asatyamyadav title reviewed pull requests a td td align center a href https github com sandeshchoudhary img src https avatars githubusercontent com u 11272274 v 4 s 100 width 100px alt br sub b sandeshchoudhary b sub a br a href https github com innovaccer design system commits author sandeshchoudhary title code a td td align center a href https github com adityajhajharia img src https avatars githubusercontent com u 42600089 v 4 s 100 width 100px alt br sub b adityajhajharia b sub a br a href https github com innovaccer design system commits author adityajhajharia title code a td td align center a href https github com anuradha9712 img src https avatars githubusercontent com u 46045493 v 4 s 100 width 100px alt br sub b anuradha aggarwal b sub a br a href https github com innovaccer design system commits author anuradha9712 title code a td td align center a href https github com xlreon img src https avatars githubusercontent com u 26788670 v 4 s 100 width 100px alt br sub b sidharth b sub a br a href https github com innovaccer design system commits author xlreon title code a td tr tr td align center a href https github com stuti1090 img src https avatars githubusercontent com u 65341865 v 4 s 100 width 100px alt br sub b stuti1090 b sub a br a href https github com innovaccer design system commits author stuti1090 title code a td td align center a href https github com veekays img src https avatars githubusercontent com u 6420348 v 4 s 100 width 100px alt br sub b vikas singh b sub a br a href https github com innovaccer design system commits author veekays title code a td td 
align center a href https github com saniyagupta img src https avatars githubusercontent com u 15903031 v 4 s 100 width 100px alt br sub b saniyagupta b sub a br a href https github com innovaccer design system commits author saniyagupta title code a td td align center a href https www linkedin com in parth chauhan 984624193 img src https avatars githubusercontent com u 35137224 v 4 s 100 width 100px alt br sub b parth chauhan b sub a br a href https github com innovaccer design system commits author chauhanparth210 title code a td td align center a href https github com stutirao img src https avatars githubusercontent com u 45294592 v 4 s 100 width 100px alt br sub b stuti pandey b sub a br a href https github com innovaccer design system commits author stutirao title code a td td align center a href https github com shib00 img src https avatars githubusercontent com u 33096446 v 4 s 100 width 100px alt br sub b shivam dwivedi b sub a br a href https github com innovaccer design system commits author shib00 title code a td td align center a href http www rahulgaur info img src https avatars githubusercontent com u 760474 v 4 s 100 width 100px alt br sub b rahul gaur b sub a br a href https github com innovaccer design system commits author aregee title code a td tr tr td align center a href https github com atifzaidi92 img src https avatars githubusercontent com u 54103064 v 4 s 100 width 100px alt br sub b atifzaidi92 b sub a br a href https github com innovaccer design system commits author atifzaidi92 title code a td td align center a href https github com sumit2399 img src https avatars githubusercontent com u 66456021 v 4 s 100 width 100px alt br sub b sumit dhyani b sub a br a href https github com innovaccer design system commits author sumit2399 title code a td td align center a href https tanmay portfolio herokuapp com img src https avatars githubusercontent com u 36269283 v 4 s 100 width 100px alt br sub b tanmay sharma b sub a br a href https github com 
innovaccer design system commits author 927tanmay title code a td td align center a href https github com rashi gupta 2000 img src https avatars githubusercontent com u 99866103 v 4 s 100 width 100px alt br sub b rashi gupta b sub a br a href https github com innovaccer design system commits author rashi gupta 2000 title code a td td align center a href https github com varnikajain15 img src https avatars githubusercontent com u 55780559 v 4 s 100 width 100px alt br sub b varnika jain b sub a br a href https github com innovaccer design system commits author varnikajain15 title code a td td align center a href https github com aman2000verma img src https avatars githubusercontent com u 45339091 v 4 s 100 width 100px alt br sub b aman verma b sub a br a href https github com innovaccer design system commits author aman2000verma title code a td td align center a href https github com samyak3009 img src https avatars githubusercontent com u 56395892 v 4 s 100 width 100px alt br sub b samyak jain b sub a br a href https github com innovaccer design system commits author samyak3009 title code a td tr table markdownlint restore prettier ignore end all contributors list end all contributors list start do not remove or modify this section prettier ignore start markdownlint disable markdownlint restore prettier ignore end all contributors list end this project follows the all contributors https github com all contributors all contributors specification contributions of any kind welcome memo license licensed under the mit license https github com innovaccer design system blob master license | hacktoberfest hacktoberfest2020 innovaccer css bem css-variable react javascript typescript design-system component-library hacktoberfest2021 ui ui-components components hacktoberfest2022 | os |
zevision | # Face Recognition and Analysis / Object Detection API

This project provides an API for face detection, sentiment analysis, and object detection that can easily be used by developers in custom-made applications.

![image](https://github.com/zenika-open-source/zevision/blob/master/zevision.png)

The models are based on state-of-the-art computer vision libraries. A REST API allows for the training and usage of the models. It can run natively on the machine or in a Docker container.

Watch a short presentation here: https docs google com presentation d e 2pacx 1vsi8fjwrl7eh5fvxtemrhruze f8top4ik5dbx5h bn 3cjx kkxr573 9fv7 20tl p6jtucbcc4 v pub start false loop true delayms 5000

## Environment setup

### Installation

Locally, on an Ubuntu 16.04 terminal, while in the repository's directory, launch:

```bash
sudo bash setup.sh
```

The installation of all the necessary tools will be underway after you enter your root password. Support for Windows and macOS is coming soon.

### Using Docker

In progress.

## Library usage guide

The library contains the Python code used for the model training and prediction. Library usage and documentation here: https://github.com/zenika-open-source/zevision/tree/master/lib

## API usage guide

The API allows access to the different features of the library. API usage and documentation here: https://github.com/zenika-open-source/zevision/tree/master/api | recognition face-recognition docker-container python flask tensorflow-models dlib opencv opencv-python library server object-detection neural-network deep-learning | ai
SimulateGPT | simulator logo simulategpt logo png a target blank href https colab research google com github openbiolink simulategpt blob main simulategpt ipynb img src https colab research google com assets colab badge svg alt open in colab a this repository contains code for the paper large language models are universal biomedical simulators schaefer et al 2023 computational simulation of biological processes can be a valuable tool in accelerating biomedical research but usually requires a high level of domain knowledge and extensive manual adaptations recently large language models llms such as gpt 4 have proven surprisingly successful in solving complex tasks across diverse fields by emulating human language generation at a very large scale here we explore the potential of leveraging llms as simulators of biological systems we establish proof of concept of a text based simulator simulategpt that leverages llm reasoning we demonstrate good prediction performance across diverse biomedical use cases without explicit domain knowledge or manual tuning our results show that llms can be used as versatile and broadly applicable biological simulators repository structure folders system messages gpt 4 system prompts with simple descriptive names e g simulator 4 markdown experiments protocols code and results for executed and planned running experiments for details see subsection below experiment name main md code or meta data files prompts ai messages experiments each experiment is kept in a separate folder containing main md experiment documentation objective method results conclusion using markdown main md in addition to the paper s methods section prompts prompts for this experiment user prompts ai messages chat gpt4 generated results file name schema system message prompt filename using snakemake to run experiments simply run snakemake c1 k config experiment name your experiment name 1 core continue with undone jobs if a job failed if you want to use my conda env add 
use conda the pipeline generates the files according to the schema indicated above code files src utils py the top level utils file provides everything you need to run your prompts in an automated fashion the functions are simple documented and reflect the defined repository structure we streamlined our api access using snakemake make sure to provide your private open ai api key as argument api key environment variable openai api key or in the password store notebook the simulator ipynb notebook is configured to work within colab but will also work on your local installation human input prompt guidelines provide a starting point for the simulation e g a situation or experimental setup or a detailed complex question that will be answered using a simulation optional can include imply a perturbation if you expect a final outcome explicitly request it use the words final outcome optional you can increase the novelty by adding focus on more novelty the simulator can be used to ask detailed complex questions about biology the simulator has the potential to assess the question in more depth and provide more informed answers than the default chatgpt | ai |
|
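The SimulateGPT utilities above expect the OpenAI API key as a function argument, via the `OPENAI_API_KEY` environment variable, or from the password store. A minimal Python sketch of that lookup (the function name `resolve_api_key` and the exact precedence are illustrative assumptions, and the password-store path is omitted):

```python
import os

def resolve_api_key(explicit_key: str = None) -> str:
    """Prefer a key passed as an argument, then fall back to the
    OPENAI_API_KEY environment variable (names from the README above;
    the password-store lookup it also mentions is omitted here)."""
    key = explicit_key or os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "pass api_key or set the OPENAI_API_KEY environment variable"
        )
    return key
```

Keeping the key out of the code and in the environment also means the snakemake pipeline can run unchanged across machines.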
Information-Security-Applications | This course covers information security technology and its applications. From the basics of hardware and software security (where security is implemented), security policy, specific security measures, security management, and security operation techniques will be explained. First, students learn threat analysis methods to identify security threats to systems, and tamper attacks and their countermeasures are explained as threats to hardware and software. Public key authentication infrastructure will be introduced using SSL/TLS, the protocol for encrypted communication paths used in encrypted email and web browsers, as an example. In addition, we will learn about OS access control technology, which is the basis for the Android and iOS security mechanisms. Finally, case studies related to these learned techniques and knowledge will be presented, and security issues and countermeasures encountered in the case studies will be discussed. | server
|
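The course above introduces public key infrastructure through SSL/TLS as used by web browsers. As a small generic illustration (not course material), Python's `ssl` module builds a client context that already enforces the two checks PKI-based TLS depends on: certificate-chain validation against trusted roots, and hostname matching.

```python
import ssl

# A browser-style TLS client context: the server certificate must chain
# to a trusted root in the system store (the PKI part), and the name in
# the certificate must match the host being contacted.
context = ssl.create_default_context()

# Illustrative hardening choice: refuse anything older than TLS 1.2.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Wrapping a socket with `context.wrap_socket(sock, server_hostname="example.org")` then performs both checks during the handshake.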
ucdp_udagram_restapi | # Udagram Image Filtering Microservice

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image filtering microservice.

The project is split into three parts:
1. The Simple Frontend (https://github.com/grutt/udacity-c2-frontend): a basic Ionic client web application which consumes the RestAPI backend.
2. The RestAPI Backend (https://github.com/grutt/udacity-c2-restapi): a Node-Express server which can be deployed to a cloud service.
3. The Image Filtering Microservice (https://github.com/grutt/udacity-c2-image-filter): the final project for the course. It is a Node-Express application which runs a simple Python script to process images.

## Tasks

### Setup Python Environment

You'll need to set up and use a virtual environment for this project. To create a virtual environment, run the following from within the project directory:

1. Install the virtualenv dependency: `pip install virtualenv`
2. Create a virtual environment: `virtualenv venv`
3. Activate the virtual environment: `source venv/bin/activate` (note: you'll need to do this every time you open a new terminal)
4. Install dependencies: `pip install -r requirements.txt`

When you're done working and leave the virtual environment, run `deactivate`.

### Setup Node Environment

You'll need to create a new node server. Open a new terminal within the project directory and run:

1. Initialize a new project: `npm init`
2. Install express: `npm i express --save`
3. Install TypeScript dependencies: `npm i ts-node-dev tslint typescript @types/bluebird @types/express @types/node --save-dev`
4. Look at the package.json file from the RestAPI repo and copy the scripts block into the auto-generated package.json in this project. This will allow you to use shorthand commands like `npm run dev`.

### Create a new server.ts file

Use our basic server as an example to set up this file for this project. It's OK to keep all of your business logic in the one server.ts file, but you can try to use feature directories and app.use routing if you're up for it. Use the RestAPI structure to guide you.

### Add an endpoint to handle POST /imagetoprocess requests

It should accept two POST parameters:
- image_url: string, a public URL of a valid image file
- upload_image_signedurl: string (optional), a URL which will allow a PUT request with the processed image

It should respond with 422 Unprocessable if either POST parameter is invalid. It should require a token in the auth header or respond with 401 Unauthorized, and it should be versioned. The matching token should be saved as an environment variable. (Tip: we broke this out into its own auth router before, but you can access headers as part of req.headers within your endpoint block.) It should respond with the image as the body if upload_image_signedurl is included in the request, and with a success message if upload_image_signedurl is not included in the request.

### Refactor your RestAPI server

Add a request to the image filter server within the RestAPI POST /feed endpoint:
- It should create the new signedUrls required for the imagetoprocess POST request body.
- It should include a POST request to the new server. (Tip: keep the server address and token as environment variables.)
- It should overwrite the image in the bucket with the filtered image; in other words, it will have the same filename in S3.

## Deploying your system

Follow the process described in the course to `eb init` a new application and `eb create` a new environment to deploy your image filter service.

## Stand Out

### Postman integration tests

Try writing a Postman collection to test your endpoint. Be sure to cover:
- POST requests with and without tokens
- POST requests with valid and invalid parameters

### Refactor data models

Try adding another column to your tables to save a separate key for your filtered image. Remember, you'll have to rename the file before adding it to S3.

### Advanced refactor data models

Try adding a second OpenCV filter script and add an additional
parameter to select which filter to use as a post parameter | cloud |
|
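The /imagetoprocess rules above (401 without a valid auth token, 422 for invalid parameters) condense into a small decision function. The sketch below only mirrors the README's response codes in framework-neutral Python; the function and token names are illustrative, not the project's Node-Express code.

```python
from urllib.parse import urlparse

EXPECTED_TOKEN = "secret-token"  # the real service reads this from an environment variable

def image_to_process_status(headers: dict, body: dict) -> int:
    """Return the HTTP status the endpoint rules above prescribe."""
    if headers.get("Authorization") != "Bearer " + EXPECTED_TOKEN:
        return 401  # missing or wrong token in the auth header
    url = body.get("image_url")
    if not url or urlparse(url).scheme not in ("http", "https"):
        return 422  # image_url absent or not a usable public URL
    return 200
```

In the actual service the same checks would sit at the top of the versioned route handler, before the Python filter script is invoked.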
Mobile-Development-with-.NET-Second-Edition | mobile development with net second edition a href https www packtpub com product mobile development with net second edition 9781800204690 utm source github utm medium repository utm campaign 9781800204690 img src https static packt cdn com products 9781800204690 cover smaller alt mobile development with net second edition height 256px align right a this is the code repository for mobile development with net second edition https www packtpub com product mobile development with net second edition 9781800204690 utm source github utm medium repository utm campaign 9781800204690 published by packt build cross platform mobile applications with xamarin forms 5 and asp net core 5 what is this book about the net 5 framework is a unified framework with capabilities that enable you to use microsoft s developer ecosystem on a single platform xamarin used for developing mobile applications is one of the app model implementations for net core infrastructure this book covers the following exciting features discover the latest features of net 5 which can be used in mobile application development explore xamarin forms shell for building cross platform mobile uis understand the technical requirements of a consumer mobile app for your app design focus on advanced concepts in mobile development such as app data management push notifications and graph apis manage app data with entity framework core use microsoft s project rome for creating cross device experiences with xamarin become well versed with how to implement machine learning in your mobile apps if you feel this book is for you get your copy https www amazon com dp 1800204698 today a href https www packtpub com utm source github utm medium banner utm campaign githubbanner img src https raw githubusercontent com packtpublishing github master github png alt https www packtpub com border 5 a instructions and navigations all of the code is organized into folders for example chapter01 
The code will look like the following:

```csharp
static char[] numberChars = new[] { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9' };
static char[] opChars = new[] { /* ... */ };

static void Main(string[] args)
{
    var calculator = new Calculator();
    calculator.ResultChanged += (result) =>
    {
        Console.Clear();
        Console.WriteLine(Environment.NewLine + result);
    };
    // TODO: Get input
}
```

**Following is what you need for this book:** this book is for ASP.NET Core developers who want to get started with mobile development using Xamarin and other Microsoft technologies. Working knowledge of C# programming is necessary to get started.

With the following software and hardware list you can run all code files present in the book (Chapters 1-18).

### Software and Hardware List

| Chapter | Software required | OS required |
| ------- | ----------------- | ----------- |
| 1-18 | .NET 5.0 and above | Windows, Mac OS X, and Linux (any) |
| 1-18 | Xamarin.Forms 5.0 | Windows, Mac OS X, and Linux (any) |
| 1-18 | Python | Windows, Mac OS X, and Linux (any) |
| 1-18 | Docker | Mac OS X and Linux (any) |
| 1-18 | Visual Studio | Windows, Mac OS X, and Linux (any) |
| 7-18 | Microsoft Azure (free trial) | Windows |

### Related products

Other books you may enjoy:
- ASP.NET Core 5 and React - Second Edition ([Packt](https://www.packtpub.com/product/asp-net-core-5-and-react-second-edition/9781800206168), [Amazon](https://www.amazon.com/dp/180020616X))
- Customizing ASP.NET Core 5.0 ([Packt](https://www.packtpub.com/product/customizing-asp-net-core-5-0/9781801077866), [Amazon](https://www.amazon.com/dp/180107786X))

### Get to know the author

**Can Bilgin** is a solution architect working for Authority Partners Inc. He has been working in the software industry for almost two decades on various consumer and enterprise-level engagements for high-profile clients, using technologies such as BizTalk, Service Fabric, Orleans, Dynamics CRM, Xamarin, WCF, Azure services, and other web/cloud technologies. His passion lies in mobile and IoT development using modern tools available to developers. He shares his experience on his blog, on social media, and through speaking
engagements at local and international community events he was recognized as a microsoft mvp for his technical contributions between 2014 and 2018 download a free pdf i if you have already purchased a print or kindle version of this book you can get a drm free pdf version at no cost br simply click on the link to claim your free pdf i p align center a href https packt link free ebook 9781800204690 https packt link free ebook 9781800204690 a p | front_end |
|
HAL_FreeRTOS_Modbus | armink freemodbus slave master rtt stm32 freemodbus modbus modbus freemodbus freemodbus v1 6 modbus 1 1 freemodbus modbus mb c modbus freemodbus modbus mb m c modbus freemodbus modbus ascii mbascii c ascii freemodbus modbus functions mbfunccoils c freemodbus modbus functions mbfunccoils m c freemodbus modbus functions mbfuncdisc c freemodbus modbus functions mbfuncdisc m c freemodbus modbus functions mbfuncholding c freemodbus modbus functions mbfuncholding m c freemodbus modbus functions mbfuncinput c freemodbus modbus functions mbfuncinput m c freemodbus modbus functions mbfuncother c modbus freemodbus modbus functions mbutils c freemodbus modbus rtu mbcrc c crc freemodbus modbus rtu mbrtu c rtu freemodbus modbus rtu mbrtu m c rtu freemodbus modbus tcp mbtcp c tcp freemodbus port port c freemodbus port portevent c freemodbus port portevent m c freemodbus port portserial c freemodbus port portserial m c freemodbus port porttimer c freemodbus port porttimer m c freemodbus port user mb app c modbus freemodbus port user mb app m c modbus m modbus 2 1 2 1 1 rt thread 1 ucos freertos freemodbus port portevent m c xmbmasterporteventinit xmbmasterporteventpost xmbmasterporteventget vmbmasterosresinit xmbmasterrunrestake vmbmasterrunresrelease vmbmastererrorcbrespondtimeout vmbmastererrorcbreceivedata vmbmastererrorcbexecutefunction modbus vmbmastercbrequestscuuess embmasterwaitrequestfinish modbus modbus modbus modbus poll 1 2 1 2 freemodbus port user mb app m c 4 freemodbus id usmregholdbuf 2 1 id 3 1 2 1 3 modbus modbus 4 modbus modbus modbus embmasterreginputcb embmasterregholdingcb embmasterregcoilscb embmasterregdiscretecb easydatamanager https github com armink easydatamanager 2 2 freemodbus port cpu stm32f103x rt thread 1 rt thread bsp mcu 2 2 1 freemodbus port portserial m c vmbmasterportserialenable 485 vmbmasterportclose xmbmasterportserialinit 485 xmbmasterportserialputbyte xmbmasterportserialgetbyte prvvuarttxreadyisr 
pxMBMasterFrameCBTransmitterEmpty, prvvUARTRxISR, pxMBMasterFrameCBByteReceived (CPU-specific).

2.2.2 `freemodbus/port/porttimer_m.c`: xMBMasterPortTimersInit (T3.5: usPrescalerValue, usT35TimeOut50us), vMBMasterPortTimersT35Enable (T3.5), vMBMasterPortTimersConvertDelayEnable, vMBMasterPortTimersRespondTimeoutEnable, vMBMasterPortTimersDisable, prvvTIMERExpiredISR (pxMBMasterPortCBTimerExpired). Notes: 1. usPrescalerValue, usT35TimeOut50us; 2. `freemodbus/modbus/include/mbconfig.h` (CPU-specific).

3. API

The Modbus master request API returns an `eMBMasterReqErrCode` with the following error codes: MB_MRE_NO_ERR, MB_MRE_NO_REG, MB_MRE_ILL_ARG, MB_MRE_REV_DATA, MB_MRE_TIMEDOUT, MB_MRE_MASTER_BUSY, MB_MRE_EXE_FUN. Master-station holding registers: usMRegHoldBuf[0][i - 1]. Parameters follow the same naming convention in every call (ucSndAddr: slave address, 0 for broadcast; lTimeout: timeout).

3.1 Write a single holding register:

```c
eMBMasterReqErrCode eMBMasterReqWriteHoldingRegister( UCHAR ucSndAddr, USHORT usRegAddr, USHORT usRegData, LONG lTimeout );
```

3.2 Write multiple holding registers:

```c
eMBMasterReqErrCode eMBMasterReqWriteMultipleHoldingRegister( UCHAR ucSndAddr, USHORT usRegAddr, USHORT usNRegs, USHORT * pusDataBuffer, LONG lTimeout );
```

3.3 Read holding registers:

```c
eMBMasterReqErrCode eMBMasterReqReadHoldingRegister( UCHAR ucSndAddr, USHORT usRegAddr, USHORT usNRegs, LONG lTimeout );
```

3.4 Read and write multiple holding registers:

```c
eMBMasterReqErrCode eMBMasterReqReadWriteMultipleHoldingRegister( UCHAR ucSndAddr, USHORT usReadRegAddr, USHORT usNReadRegs, USHORT * pusDataBuffer, USHORT usWriteRegAddr, USHORT usNWriteRegs, LONG lTimeout );
```

3.5 Read input registers:

```c
eMBMasterReqErrCode eMBMasterReqReadInputRegister( UCHAR ucSndAddr, USHORT usRegAddr, USHORT usNRegs, LONG lTimeout );
```

3.6 Write a single coil:

```c
eMBMasterReqErrCode eMBMasterReqWriteCoil( UCHAR ucSndAddr, USHORT usCoilAddr, USHORT usCoilData, LONG lTimeout );
```

3.7 Write multiple coils:

```c
eMBMasterReqErrCode eMBMasterReqWriteMultipleCoils( UCHAR ucSndAddr, USHORT usCoilAddr, USHORT usNCoils, UCHAR * pucDataBuffer, LONG lTimeout );
```

3.8 Read coils:

```c
eMBMasterReqErrCode eMBMasterReqReadCoils( UCHAR ucSndAddr, USHORT usCoilAddr, USHORT usNCoils, LONG lTimeout );
```

3.9 Read discrete inputs:

```c
eMBMasterReqErrCode eMBMasterReqReadDiscreteInputs( UCHAR ucSndAddr, USHORT usDiscreteAddr, USHORT usNDiscreteIn, LONG lTimeout );
```

4. Usage

4.1 Select the protocol stack in `freemodbus/modbus/include/mbconfig.h` (three modes: Modbus RTU, Modbus ASCII, Modbus TCP; Modbus RTU here).

4.2 Startup sequence: 1. eMBMasterInit, 2. eMBMasterEnable, 3. eMBMasterPoll (cyclically), 4. call the request API.

4.3 API events: see `freemodbus/port/portevent_m.c`.

BSD License. 1. http://www.rt-thread.org | os
|
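The FreeModbus port above relies on two standard RTU quantities: the frame CRC computed in mbcrc.c and the T3.5 inter-frame silence kept in 50 µs ticks (usT35TimeOut50us). The sketch below restates both from the public Modbus-over-serial-line rules rather than from this repository's C sources; the reflected polynomial 0xA001, the 11 bits per character, and the fixed 1750 µs interval above 19200 baud are spec constants.

```python
def modbus_crc16(frame: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc


def t35_ticks_50us(baudrate: int) -> int:
    """T3.5 inter-frame timeout in 50 us timer ticks, mirroring how
    usT35TimeOut50us is used: 3.5 characters of 11 bits each up to
    19200 baud, a fixed 1750 us (35 ticks) above it."""
    if baudrate > 19200:
        return 35
    return int(3.5 * 11 * 1_000_000 / baudrate) // 50
```

On the wire the CRC is appended low byte first, so a frame `f` is sent as `f + modbus_crc16(f).to_bytes(2, "little")`.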
WebFundamentals | web fundamentals doi https zenodo org badge 49000057 svg https zenodo org badge latestdoi 49000057 this repository contains the slides for the web fundamentals http rubenverborgh github io webfundamentals module of the ghent university course web development http studiegids ugent be 2016 en studiefiches c003779 pdf br view the slides online http rubenverborgh github io webfundamentals questions feedback and suggestions welcome do you have a question on one of topics please create an issue https github com rubenverborgh webfundamentals issues new do you have feedback on contents or form please create an issue https github com rubenverborgh webfundamentals issues new do you have a suggestion to improve the slides please create a pull request https github com rubenverborgh webfundamentals pulls please read and accept the contributor agreement https github com rubenverborgh webfundamentals blob gh pages contributing md before contributing finding your way around this repository contains 1 introductory slidedeck index html https github com rubenverborgh webfundamentals blob gh pages index html in the root folder https github com rubenverborgh webfundamentals 5 lecture slidedecks index html files in subfolders such as architecture https github com rubenverborgh webfundamentals tree gh pages architecture shared images https github com rubenverborgh webfundamentals tree gh pages shared images styles https github com rubenverborgh webfundamentals tree gh pages shared styles fonts https github com rubenverborgh webfundamentals tree gh pages shared fonts and scripts https github com rubenverborgh webfundamentals tree gh pages shared scripts images per lecture images folders in subfolders such as architecture https github com rubenverborgh webfundamentals tree gh pages architecture auxiliary files in the root folder https github com rubenverborgh webfundamentals how to start a typical starting point would be to open up any index html file either in the root 
folder or any of the subfolders this allows you to edit the contents of the corresponding slidedeck the slides themselves are regular html files brought to life with the shower https github com shower shower presentation engine they use the clear https github com rubenverborgh shower clear template with a few customizations in shared styles web fundamentals css https github com rubenverborgh webfundamentals blob gh pages shared styles web fundamentals css you can just open the slides in your browser from the local filesystem while editing alternatively you can install gulp http gulpjs com and run the gulp command in the root folder which will autorefresh your browser upon changes license except where otherwise noted the content of these slides is licensed under a creative commons attribution 4 0 international license http creativecommons org licenses by 4 0 | ugent open-webslides web slides course | front_end |
Embedded | Project files for ECE4534: Embedded System Design. The repo "project new board" is the master and the other one is the slave. | os
|
ITF | # Information Technology Fundamentals

## Hardware

- Cable: Cable vs DSL vs Fiber Internet (https www youtube com watch v qqyiwmamq38)
- HDMI vs DisplayPort vs DVI vs VGA vs Thunderbolt (https www youtube com watch v ifo3eiqbnj8)
- Computer device: Overclocking (https www youtube com watch v htbmwzs6rbm)
- Network device: Hub vs Switch vs Router (https www youtube com watch v 1z0ulvg pw8)
- Modem vs Router (https www youtube com watch v mad4kq5835y)
- 2.4 GHz vs 5 GHz Wi-Fi (https www youtube com watch v j bf ke5llq)

## Non-hardware

- Cryptography: Password Hashing, Salts, Peppers (https www youtube com watch v tnzmuok3e)
- Database concept: SQL vs NoSQL, or MySQL vs MongoDB (https www youtube com watch v zs kxvoeq5y)
- Protocol: DHCP (https www youtube com watch v e6 tah5bkjo)
- Protocol: DNS (https www youtube com watch v mpqzvypudgu)
- Port Forwarding (https www youtube com watch v 2g1uemdgwxw)

Last update: 2021-05-25 03:38:41 | server
|
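The password hashing and salts topic above comes down to one idea: a salt is random bytes stored next to the digest, so identical passwords still produce different hashes and precomputed rainbow tables stop working. A stdlib-only Python sketch (the iteration count is illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None, iterations: int = 100_000):
    """Salted PBKDF2-HMAC-SHA256; returns (salt, digest)."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 100_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

A pepper, also named in the video title, would be an additional secret kept outside the database and mixed into the hash.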
OpenML | license https img shields io badge license bsd 203 clause blue svg https opensource org licenses bsd 3 clause openml open machine learning welcome to the openml github page tada contents toc depthfrom 2 depthto 6 withlinks 1 updateonsave 1 orderedlist 0 who are we who are we what is openml what is openml benefits for science benefits for science benefits for scientists benefits for scientists benefits for society benefits for society get involved get involved toc who are we we are a group of people who are excited about open science open data and machine learning we want to make machine learning and data analysis simple accessible collaborative and open with an optimal division of labour between computers and humans what is openml want to learn about openml or get involved please do and get in touch openmlhq googlegroups com in case of questions or comments incoming envelope getting started check out the openml website https www openml org to get a first impression of what openml is the openml documentation page https openml github io openml gives an introduction in details and features as well as openml s different apis https openml github io openml apis and integrations https openml github io openml sklearn so that everyone can work with their favorite tool how to contribute https github com openml openml blob master contributing md citation and honor code https www openml org terms communication contact https github com openml openml wiki communication channels openml is an online machine learning platform for sharing and organizing data machine learning algorithms and experiments it is designed to create a frictionless networked ecosystem that you can readily integrate into your existing processes code environments allowing people all over the world to collaborate and build directly on each other s latest ideas data and results irrespective of the tools and infrastructure they happen to use as an open science platform openml provides important benefits 
for the science community and beyond benefits for science many sciences have made significant breakthroughs by adopting online tools that help organizing structuring and analyzing scientific data online indeed any shared idea question observation or tool may be noticed by someone who has just the right expertise to spark new ideas answer open questions reinterpret observations or reuse data and tools in unexpected new ways therefore sharing research results and collaborating online as a possibly cross disciplinary team enables scientists to quickly build on and extend the results of others fostering new discoveries moreover ever larger studies become feasible as a lot of data are already available questions such as which hyperparameter is important to tune which is the best known workflow for analyzing this data set or which data sets are similar in structure to my own can be answered in minutes by reusing prior experiments instead of spending days setting up and running new experiments benefits for scientists scientists can also benefit personally from using openml for example they can save time because openml assists in many routine and tedious duties finding data sets tasks flows and prior results setting up experiments and organizing all experiments for further analysis moreover new experiments are immediately compared to the state of the art without always having to rerun other people s experiments another benefit is that linking one s results to those of others has a large potential for new discoveries see for instance feurer et al 2015 post et al 2016 probst et al 2017 leading to more publications and more collaboration with other scientists all over the world finally openml can help scientists to reinforce their reputation by making their work published or not visible to a wide group of people and by showing how often one s data code and experiments are downloaded or reused in the experiments of others benefits for society openml also provides a useful 
learning and working environment for students citizen scientists and practitioners students and citizen scientists can easily explore the state of the art and work together with top minds by contributing their own algorithms and experiments teachers can challenge their students by letting them compete on openml tasks or by reusing openml data in assignments finally machine learning practitioners can explore and reuse the best solutions for specific analysis problems interact with the scientific community or efficiently try out many possible approaches get involved openml has grown into quite a big project we could use many more hands to help us out wrench you want to contribute awesome check out our wiki page on how to contribute https github com openml openml wiki how to contribute or get in touch https github com openml openml wiki communication channels there may be unexpected ways for how you could help we are open to any ideas you want to support us financially yes getting funding through conventional channels is very competitive and we are happy about every small contribution please send an email to openmlhq googlegroups com github organization structure openml s code is distributed over different repositories to simplify development please see their individual readme s and issue trackers if you like to contribute these are the most important ones openml openml https github com openml openml the openml web application including the rest api openml openml python https github com openml openml python the python api to talk to openml from python scripts including scikit learn openml openml r https github com openml openml r the r api to talk to openml from r scripts including mlr openml java https github com openml java the java api to talk to openml from java scripts openml openml weka https github com openml openml weka the weka plugin to talk to openml from the weka toolbox | machine-learning open-science science citizen-scientists collaboration opendata
datasets hacktoberfest | ai |
interface | uniswap labs interface codecov https codecov io gh uniswap interface branch main graph badge svg token yvt2y86o82 https codecov io gh uniswap interface unit tests https github com uniswap interface actions workflows unit tests yaml badge svg https github com uniswap interface actions workflows unit tests yaml integration tests https github com uniswap interface actions workflows integration tests yaml badge svg https github com uniswap interface actions workflows integration tests yaml lint https github com uniswap interface actions workflows lint yml badge svg https github com uniswap interface actions workflows lint yml release https github com uniswap interface actions workflows release yaml badge svg https github com uniswap interface actions workflows release yaml crowdin https badges crowdin net uniswap interface localized svg https crowdin com project uniswap interface an open source interface for uniswap a protocol for decentralized exchange of ethereum tokens website uniswap org https uniswap org interface app uniswap org https app uniswap org docs uniswap org docs https docs uniswap org twitter uniswap https twitter com uniswap reddit r uniswap https www reddit com r uniswap email contact uniswap org mailto contact uniswap org discord uniswap https discord gg fcfybsbcu5 whitepapers v1 https hackmd io c dvwdsfsxuh gd4wke ig v2 https uniswap org whitepaper pdf v3 https uniswap org whitepaper v3 pdf accessing the uniswap interface to access the uniswap interface use an ipfs gateway link from the latest release https github com uniswap uniswap interface releases latest or visit app uniswap org https app uniswap org unsupported tokens check out useunsupportedtokenlist in src state lists hooks ts src state lists hooks ts for blocking tokens in your instance of the interface you can block an entire list of tokens by passing in a tokenlist like here src constants lists ts contributions for steps on local deployment development and code contribution 
please see contributing contributing md pr title your pr title must follow conventional commits https www conventionalcommits org en v1 0 0 summary and should start with one of the following types https github com angular angular blob 22b96b9 contributing md type build changes that affect the build system or external dependencies example scopes yarn eslint typescript ci changes to our ci configuration files and scripts example scopes vercel github cypress docs documentation only changes feat a new feature fix a bug fix perf a code change that improves performance refactor a code change that neither fixes a bug nor adds a feature style changes that do not affect the meaning of the code white space formatting missing semi colons etc test adding missing tests or correcting existing tests example commit messages feat adds support for gnosis safe wallet fix removes a polling memory leak chore bumps redux version other things to note please describe the change using verb statements ex removes x from y prs with multiple changes should use a list of verb statements add any relevant unit integration tests changes will be previewable via vercel non obvious changes should include instructions for how to reproduce them accessing uniswap v2 the uniswap interface supports swapping adding liquidity removing liquidity and migrating liquidity for uniswap protocol v2 swap on uniswap v2 https app uniswap org swap use v2 view v2 liquidity https app uniswap org pools v2 add v2 liquidity https app uniswap org add v2 migrate v2 liquidity to v3 https app uniswap org migrate v2 accessing uniswap v1 the uniswap v1 interface for mainnet and testnets is accessible via ipfs gateways linked from the v1 0 0 release https github com uniswap uniswap interface releases tag v1 0 0 | uniswap ethereum blockchain | blockchain |
book-list | bookstore application this application is inspired by istios bookinfo https istio io latest docs examples bookinfo example app it is intended to demonstrate modern development practices such as microservices and containerization the code accompanies a lecture held at tum heilbronn the main purpose of the application is to expose static information about books via a restful api docs component svg every component is a separate spring boot application which you can find in its dedicated folder getting started github actions the repository comes with a ready to use github workflow which takes care of the continuous integration part for you to use it make sure to set it up correctly 1 create repository secrets https docs github com en actions security guides encrypted secrets for your docker username and your docker password do not store sensitive information such as your username or password in the repository always use an encrypted storage for that 2 select the bookstore component that you want to build this can be done via the component environment variable in the workflow file github workflows build push yml the workflow will be triggered for every change that you do on the main branch as a result of a successful workflow run you will see a new docker image in your personal dockerhub repository the naming schema is as follows secrets docker username bookstore env component github run number every run creates a new tag to prevent cache issues while fetching the image | cloud |
|
BlueBerry | blueberry university software engineering project blueberry apple ios app store | os |
|
nlp-ue4 | nlp ue4 natural language processing plugin for unreal engine 4 using tensorflow this plugin was built upon getnamo s tensorflow ue4 plugin https github com getnamo tensorflow ue4 which you can find here this plugin allows you to extract entities and intents from sentences fully in blueprints without touching c or python installation 1 to use this nlp plugin you must first follow the instructions for installing tensorflow ue4 https github com getnamo tensorflow ue4 releases 2 download and add nlp ue4 plugin https github com glenn v w nlp ue4 to plugins 3 download and add googlenews vectors negative300 bin 3 39 gb https drive google com file d 0b7xkcwpi5kdynlnuttlss21pqmm edit usp sharing to content scripts examples you can find a bare bones example project here https github com glenn v w nlp ue4 examples for a more in depth use of this plugin you can find a text based adventure game i ve been working on here https github com glenn v w nlp puzzlegame feature overview this plugin s workings were heavily inspired by microsoft luis https eu luis ai similarly to it we work with entities and intents the main difference between luis and this plugin is that this plugin works offline without the need to pay for microsoft azure but it is missing a number of features that microsoft luis does have namely patterns regexes etc in an ideal world these will be added later but we ll see so how to get started using this plugin there are two major parts to using this plugin an in engine part and an out of engine part let s start with the latter out of engine entities make your way to content entities in this folder you can have as many entities as you wish each type of entity must be a csv file and the name of the file will be the type of the entity the csv file must have the following structure colors csv https puu sh dcfzk 06892ba83b png 1 field a1 must be empty or contain 2 field b1 must contain entities 3 field a2 must contain either true or false this determines
whether the entity has an impact on the intent of a sentence in other words it determines if the entity is meaningful colors and many adjectives may be described as meaningless as far as the intent is concerned true meaningful false meaningless 4 field b2 b3 b4 and onwards must be structured as seen in the image words that fall within the same entity category but have a different meaning should be in different fields while synonyms should be in the same field for example red and ruby are in the same field since as far as we re conserned here they re synonyms blue meanwhile is in a different field 5 field a3 a4 a5 and onwards must have unique names but the names are meaningless i suggest using row numbers for simplicity the first word in b2 will be referred to as the base of that entity henceforth trainingdata and intents make your way to content scripts this folder must contain 3 csv files for the plugin to function trainingdatasentences csv trainingdataintents csv intents csv the following screenshot has those files open in that order from left to right trainingdatasentences trainingdataintents and intents intents https puu sh dcg9y 593462f598 png so what s going on here trainingdatasentences csv left hand file includes the sentences our neural net will be training on for intent recognition this should contain sentences similar to what players may be entering in the game but be careful there s a very strict way to structure them 1 change sentence to be lower case 2 remove all punctuation marks 3 remove all stop words ourselves hers between yourself but again there about once during out very having with they own an be some for do its yours such into of most itself other off is s am or who as from him each the themselves until below are we these your his through don nor me were her more himself this down should our their while above both up to ours had she all no when at any before them same and been have in will on does yourselves then that because what over why so 
can did not now under he you herself has just where too only myself which those i after few whom t being if theirs my against a by doing it how further was here than 4 make sure you replace all words that belong to an entity to the base of that entity see entities 5 if a word belongs to an entity that was selected to be meaningless to the intent remove it so for example imagine we have an entity of objects with base barrel an entity of colors with base red and is set to be meaningless an intent of equipables with base key we would like to enter the following sentence into our training set open the green chest using the green key with the above structuring that would become open barrel using key this sentence can then be added to our csv file where each word is a seperate field and all the fields after the sentence is complete are filled with none until j max of 10 words of course this sentence corresponds to an intent we must select the corresponding intent of this sentence we do this in trainingdataintents csv where we set the corresponding field to 1 and all incorrect fields to 0 these collumns correspond directly with the rows in intents csv so for example if intents csv has the followin fields b2 goto b3 gothrough b4 use and b5 pickup the first of those goto corresponds to the first collumn in trainingdataintents csv the second one gothrough corresponds to the second collumn and so on intents csv has similar rules to entities 1 field a1 must be empty or contain 2 field b1 must contain intents 3 field a3 a4 a5 and onwards must have unique names but the names are meaningless i suggest using row numbers for simplicity in engine to use natural language processing in a blueprint you must add a tensorflowcomponent and a naturallanguagecomponent x https puu sh dd4dp 302260f52f png in the tensorflowcomponent set the tensorflowmodule to glennvwsnaturallanguageprocessing x https puu sh dd4ds 55f7557bc9 png in the naturallanguagecomponent set the intent data table to a 
data table containing your intents there should be one by default which you can modify to your needs next all you need to do to use language processing is the following x https puu sh dd4dw 10b494c504 png anywhere in your code where you wish to process a sentence call process sentence from the naturallanguagecomponent and pass it your sentence string on beginplay bind an event to sentenceprocessed from the naturallanguagecomponent this event will be called when the net completes processing after you call process sentence and will receive the intent string and the detected entities array of entity type string and specific entity string in the order that they appeared in the original sentence you can parse this result as you wish video overview that may sound like a lot so you can also watch this video for a quick summary of the plugin s features insert video here troubleshooting command window pops up on first begin play on first play the plugin adds modules to the python virtual environment this may take a few minutes depending on internet connectivity the naturallanguagecomponent does not complete training wait for a few minutes before pressing play again python modules are being installed in the background just be patient license https github com glenn v w nlp ue4 blob master license nlp and tensorflow plugin mit https opensource org licenses mit tensorflow and tensorflow icon apache 2 0 http www apache org licenses license 2 0 | ai |
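the entity csv convention above (file name = entity type, field a2 = meaningful flag, first word of a group = base, extra words in the same field = synonyms) can be sketched outside the engine in a few lines of python; the file content below and the assumption that synonyms share one cell separated by spaces are illustrative, not plugin code:

```python
import csv
import io

# hypothetical colors.csv in the layout described above: b1 holds "entities",
# a2 holds the true/false "meaningful" flag, later rows are numbered groups;
# synonyms are assumed (an assumption, not documented) to share one cell
raw_csv = ",entities\nfalse,red ruby\n3,blue\n"

def load_entity(filename, raw):
    entity_type = filename.rsplit(".", 1)[0]   # the file name gives the entity type
    rows = list(csv.reader(io.StringIO(raw)))
    meaningful = rows[1][0] == "true"          # a2: does this entity influence the intent?
    groups = [r[1].split() for r in rows[1:] if len(r) > 1 and r[1]]
    base = groups[0][0]                        # first word of b2 is the base of the entity
    lookup = {w: (entity_type, g[0]) for g in groups for w in g}
    return meaningful, base, lookup

meaningful, base, lookup = load_entity("colors.csv", raw_csv)
# every synonym resolves to its group's first word, e.g. "ruby" -> ("colors", "red")
```

this mirrors the substitution the plugin performs when it replaces a recognized word with the base of its entity before intent classification.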
|
FreeRTOS-ChibiOSAL | freertos with chibios hal this repository contains all code needed to build the chibios hal together with the freertos real time operating system an example project is included in the folder example first you should run make in the freertos directory this will create a static library containing freertos you can edit the settings in freertosconfig h to match your project use the makefile matching the type of cpu you use then run make in the example folder to create the binary | os |
|
NLP-using-Spark | nlp using spark natural language processing using pyspark mllib read the details about the project here https towardsdatascience com natural language processing with spark 9efef3564270 source email 5120c2f3f19 1578287067734 layercake autolayercakewriternotification a007adb9 e2a0 47c2 b12c 18e4c66c1ae0 sk 90b64f9ee285b37bd7057056d37bc947 the notebook analyzes the dataset for the disaster tweets nlp kaggle competition https www kaggle com c nlp getting started | ai |
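the linked notebook builds its features with spark mllib stages (tokenizer, stopword removal, tf-idf); as a rough plain-python illustration of what those stages compute — toy sentences and a simple unsmoothed idf, not the pyspark api used in the notebook:

```python
import math
import re

# toy disaster-tweet-like sentences (made up for illustration)
docs = [
    "forest fire near la ronge",
    "i love this sunny day",
    "fire crews battle a forest blaze",
]

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

stopwords = {"i", "this", "a", "near"}
tokens = [[w for w in tokenize(d) if w not in stopwords] for d in docs]

def tf_idf(term, doc_tokens, corpus):
    tf = doc_tokens.count(term) / len(doc_tokens)
    df = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / df)   # plain idf; mllib's IDF uses a smoothed variant
    return tf * idf

# "fire" occurs in two of the three documents, so it scores lower than the rarer "blaze"
score_fire = tf_idf("fire", tokens[2], tokens)
score_blaze = tf_idf("blaze", tokens[2], tokens)
```

in the actual notebook these steps are chained as mllib pipeline stages so that spark can apply them to the whole tweet dataset in parallel.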
|
Spotway_etl_pipeline | spotway data engineering project end to end etl pipeline built using python aws cloud technologies lambda s3 glue athena cloudwatch and spotify web api what is spotway spotway is an end to end etl extract transform load pipeline built using python spotify s web api and aws cloud technologies to see what songs are currently popular and trendy so that you are always up to date on the latest tunes tech stack skills used 1 python 2 spotify web api 3 aws lambda 4 aws s3 5 aws glue 6 aws cloudwatch 7 aws athena pipeline architecture the architecture diagram created below highlights and breaks down the etl pipeline into different stages we access the spotify api and use python to extract data from the api to build our pipeline we use aws cloudwatch to set a daily trigger which runs the extraction on aws lambda the extracted data is stored in a bucket in aws s3 the raw extracted data on s3 is picked up by aws lambda once again transformed by the lambda function and stored back on s3 the transformed data is crawled by aws glue and can be accessed and used for data analytics on aws athena screenshot 2023 05 29 at 4 00 19 pm https github com anujgarlapati spotway etl pipeline assets 59670482 97934928 cbc9 4dfa b14a f263bed2d349 building the etl pipeline before starting to build the etl pipeline we must first gain access to the dataset the dataset used for this data engineering project is spotify s web api an account must be created to access this api and this will allow us to get the credentials of both the client id and client secret these tokens can be accessed as seen in the image below screenshot 2023 05 31 at 2 58 51 pm https github com anujgarlapati spotway etl pipeline assets 59670482 38653e7d 6583 46ea a72c 60f13ed7a7ac spotify extract data api py this python file is used to extract data from spotify s web api the code is run on aws lambda where it is stored in a lambda function and is used for the initial stages of our etl pipeline the
code can be broken down as follows the packages necessary for the lambda spotify extraction data api function

```python
import json
import os
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials
import boto3
from datetime import datetime
```

inside the lambda function both the client id and client secret are stored in aws lambda environment variables to keep confidentiality an object is created in order to extract data

```python
def lambda_handler(event, context):
    client_id = os.environ.get('client_id')
    client_secret = os.environ.get('client_secret')
    spotify_credentials = SpotifyClientCredentials(client_id=client_id,
                                                   client_secret=client_secret)
    spotify_ob = spotipy.Spotify(client_credentials_manager=spotify_credentials)
    playlists = spotify_ob.user_playlists('spotify')
```

more specifically we are extracting data from the global top 50 playlist where the url is being extracted so that we access the playlist data

```python
    top50_playlist_link = 'https://open.spotify.com/playlist/37i9dQZEVXbMDoHDwVN2tF'
    top50_playlist_url = top50_playlist_link.split('/')[-1]
    top50_data = spotify_ob.playlist_tracks(top50_playlist_url)
```

boto3 provides a python api for aws cloud infrastructure services in the following code we store the raw data in a bucket in s3

```python
    client = boto3.client('s3')
    filename = 'final_raw_data' + str(datetime.now()) + '.json'
    client.put_object(
        Bucket='spotway-data-pipeline-project-anuj',
        Key='spotify_data/raw_process_required/' + filename,
        Body=json.dumps(top50_data)
    )
```

the code is seen in aws lambda as follows screenshot 2023 05 31 at 3 43 12 pm https github com anujgarlapati spotway etl pipeline assets 59670482 596ea0b2 6d2f 494c 9667 128a914741cb in amazon cloudwatch we set a daily trigger as we are in need of extracting data once every day as the playlist data is ever changing this can be done by adding a trigger in the function overview screenshot 2023 06 08 at 9 03 36 pm https github com anujgarlapati spotway etl pipeline assets 59670482 d0f8a1df 1b9a 4f73 bf31 4db400da9668 spotify data transformation py once raw data is extracted daily from spotify s
web api and stored on a bucket in s3 we must have a transformation function in aws lambda as the second part of our etl pipeline when transforming the extracted data we first look into the different characteristics of the data we have multiple functions in this python file where the first function focuses on the album data

```python
import json
import boto3
from datetime import datetime
from io import StringIO
import pandas as pd

def spotifyalbum(top50_data):
    top50_list_album = []
    for line in top50_data['items']:
        album_id_top50 = line['track']['album']['id']
        album_name_top50 = line['track']['album']['name']
        album_releasedate_top50 = line['track']['album']['release_date']
        album_totaltracks_top50 = line['track']['album']['total_tracks']
        album_url_top50 = line['track']['album']['external_urls']['spotify']
        # converting into dictionary
        album_dict = {'id': album_id_top50, 'name': album_name_top50,
                      'release_date': album_releasedate_top50,
                      'total_tracks': album_totaltracks_top50,
                      'url': album_url_top50}
        top50_list_album.append(album_dict)
    return top50_list_album
```

function for artist data

```python
def spotifyartist(top50_data):
    top50_list_artist = []
    for line in top50_data['items']:
        for key, value in line.items():
            if key == 'track':
                for artist in value['artists']:
                    artist_dict = {'artist_id': artist['id'],
                                   'artist_name': artist['name'],
                                   'url': artist['href']}
                    top50_list_artist.append(artist_dict)
    return top50_list_artist
```

function for songs data

```python
def spotifysongs(top50_data):
    song_list_top50 = []
    for line in top50_data['items']:
        song_id_top50 = line['track']['id']
        song_name_top50 = line['track']['name']
        song_duration_top50 = line['track']['duration_ms']
        song_url_top50 = line['track']['external_urls']['spotify']
        song_popularity_top50 = line['track']['popularity']
        song_added = line['added_at']
        # converting into dictionary
        song_dict = {'id': song_id_top50, 'name': song_name_top50,
                     'duration': song_duration_top50, 'url': song_url_top50,
                     'popularity': song_popularity_top50,
                     'song_added': song_added}
        song_list_top50.append(song_dict)
    return song_list_top50
```

once we have each function for different parts of the playlist data we can then call each of the functions in the lambda handler where the transformed data will be placed in a bucket in s3 the code is below

```python
def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket = 'spotway-data-pipeline-project-anuj'
    key = 'spotify_data/raw_process_required/'

    spotify_data = []
    spotify_keys = []
    for file in s3.list_objects(Bucket=bucket, Prefix=key)['Contents']:
        file_key = file['Key']
        if file_key.split('.')[-1] == 'json':
            response = s3.get_object(Bucket=bucket, Key=file_key)
            content = response['Body']
            jsonobject = json.loads(content.read())
            spotify_data.append(jsonobject)
            spotify_keys.append(file_key)

    for top50_data in spotify_data:
        top50_list_album = spotifyalbum(top50_data)
        top50_list_artist = spotifyartist(top50_data)
        song_list_top50 = spotifysongs(top50_data)

        top_50_album_df = pd.DataFrame.from_dict(top50_list_album)
        top_50_album_df = top_50_album_df.drop_duplicates(subset=['id'])

        top_50_artist_df = pd.DataFrame.from_dict(top50_list_artist)
        top_50_artist_df = top_50_artist_df.drop_duplicates(subset=['artist_id'])

        top50_song_df = pd.DataFrame.from_dict(song_list_top50)

        top_50_album_df['release_date'] = pd.to_datetime(top_50_album_df['release_date'])
        top50_song_df['song_added'] = pd.to_datetime(top50_song_df['song_added'])

        top50_songkey = 'spotify_transformed_data/spotify_songs_data/songs_transformed_data_' + str(datetime.now()) + '.csv'
        song_buffer = StringIO()
        top50_song_df.to_csv(song_buffer, index=False)
        song_content = song_buffer.getvalue()
        s3.put_object(Bucket=bucket, Key=top50_songkey, Body=song_content)

        top50_albumkey = 'spotify_transformed_data/spotify_albums_data/album_transformed_data_' + str(datetime.now()) + '.csv'
        album_buffer = StringIO()
        top_50_album_df.to_csv(album_buffer, index=False)
        album_content = album_buffer.getvalue()
        s3.put_object(Bucket=bucket, Key=top50_albumkey, Body=album_content)

        top50_artistkey = 'spotify_transformed_data/spotify_artist_data/artist_transformed_data_' + str(datetime.now()) + '.csv'
        artist_buffer = StringIO()
        top_50_artist_df.to_csv(artist_buffer, index=False)
        artist_content = artist_buffer.getvalue()
        s3.put_object(Bucket=bucket, Key=top50_artistkey, Body=artist_content)

    s3_resource = boto3.resource('s3')
    for key in spotify_keys:
        copy_source = {'Bucket': bucket, 'Key': key}
        s3_resource.meta.client.copy(copy_source, bucket,
                                     'spotify_data/raw_processed/' + key.split('/')[-1])
        s3_resource.Object(bucket, key).delete()
```

the trigger for the spotify data transformation function is as follows screenshot 2023 06 08 at 9 03 46 pm https github com anujgarlapati spotway etl pipeline assets 59670482 dd4bda16 5cf3 48bf 9faf 30f96dcbf3fd loading the data in the etl pipeline once both the extraction and transformation phases are done for the etl pipeline the last and final phase of the etl pipeline is to load the data to load the data we must use the crawler in aws glue to connect a datastore to create metadata tables there are three different crawlers that we have created to load the data in a catalog the artist song and album data this can be seen in the image down below screenshot 2023 05 31 at 4 30 00 pm https github com anujgarlapati spotway etl pipeline assets 59670482 a1ec777e 0e57 48e7 a4c5 549d051e01b8
|
EC_Project | h1 align center welcome to ec project h1 div align center p img alt version src https img shields io badge version v2 0 0 blue svg cacheseconds 2592000 a href https github com kurisaw collaborative ec project tree main documents target blank img alt documentation src https img shields io badge documentation yes brightgreen svg a a href https github com kurisaw collaborative ec project graphs contributors target blank img alt contributors src https img shields io github contributors kurisaw collaborative ec project a a href https github com kurisaw collaborative ec project blob main license target blank img alt license mit src https img shields io badge license mit yellow svg a a href https twitter com kurisawhh target blank img alt twitter kurisawhh src https img shields io twitter follow kurisawhh svg style social a p div china 2023 national college students embedded chip design and system competition urban patrol car homepage https github com kurisaw collaborative ec project wiki https github com kurisaw collaborative ec project wiki technical documentation demo https www bilibili com video bv1on411b7jm vd source 7de393144f462a4eade54292bd598c34 author kurisaw website https kurisaw eu org twitter kurisawhh https twitter com kurisawhh github kurisaw https github com kurisaw contributing contributions issues and feature requests are welcome br feel free to check issues page https github com kurisaw collaborative ec project issues you can also take a look at the contributing guide https github com kurisaw collaborative ec project graphs contributors thank you for submitting code contributions to the project repositor contributors div style text align justify a href https github com kurisaw img src https avatars githubusercontent com u 98592772 v 4 width 70 alt kurisaw p style margin top 5px kurisaw p a div div style text align justify a href https github com guapi61 img src https avatars githubusercontent com u 115223041 v 4 width 70 alt nedki l p style 
margin top 5px guapi61 p a div div style text align justify a href https github com nedki l img src https avatars githubusercontent com u 114712457 v 4 width 70 alt nedki l p style margin top 5px nedki l p a div show your support give a if this project helped you a href https www patreon com kurisaw img src https c5 patreon com external logo become a patron button 2x png width 160 a div align center copyright 2023 kurisaw collaborative https github com kurisaw collaborative br this project is mit https github com kurisaw collaborative ec project blob main license licensed dtv this readme was generated with by readme md generator https github com kefranabg readme md generator | os |
|
awesome-yoruba-nlp | # awesome-yoruba-nlp [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome)

A curated list of resources dedicated to Natural Language Processing in the Yoruba language.

Maintainers: [Olamilekan Wahab](https://github.com/olamyy)

Please read the [contribution guidelines](contributing.md) before contributing. Please feel free to create [pull requests](https://github.com/olamyy/awesome-yoruba-nlp/pulls).

## Contents

- [Papers](#papers)
- [Slides](#slides)
- [Datasets](#datasets)

## Papers

- [Natural Language Processing of English to Yoruba](https://www.researchgate.net/publication/301988345_Natural_language_processing_of_english_language_to_yoruba_language)
- [A Computerized Identification System for Verb Sorting and Arrangement in a Natural Language: Case Study of the Nigerian Yoruba Language](http://www.eajournals.org/wp-content/uploads/A-Computerized-Identification-System-for-Verb-Sorting-and-Arrangement-in-a-Natural-Language-Case-Study-of-the-Nigerian-Yoruba-Language.pdf)
- [Morphological Analysis of Standard Yorùbá Nouns](http://www.ajer.org/papers/v5(06)/B05060812.pdf)
- [Part-of-Speech Tagging of Yoruba Standard Language of Niger-Congo Family](http://www.isca.in/COM_IT_SCI/Archive/v1/i1/1.ISCA-RJCITS-2013-003.pdf)
- [Rule-Based Parts of Speech Tagging of Yorùbá Simple Sentences](https://www.grin.com/document/347022)
- [Yoruba Word Formation Processes (UCLA Linguistics)](http://linguistics.ucla.edu/images/stories/adewole.1995.pdf)
- [Part-of-Speech Tagging of Yoruba Standard Language of Niger-Congo Family](https://leilbadrahzaki.wordpress.com/)
- [Development of a Syllabicator for Yorùbá Language](http://ifecisrg.org/sites/default/files/articles/syoruba%20syllabicator_4.pdf)
- [Natural Language Processing of English to Yoruba Language (AIT 2015 Conference Proceedings, p. 120)](https://www.researchgate.net/profile/Abayomi-Alli_Adebayo/publication/296675810_AIT_2015_Conference_Proceedings/links/56d7e26508aebe4638af23a4/AIT-2015-Conference-Proceedings.pdf#page=120)
- [Applying Rough Set Theory to Yorùbá Language Translation](https://www.researchgate.net/publication/301888987_Applying_rough_set_theory_to_Yoruba_language_translation)
- [A Computational Model of Yoruba Morphology Lexical Analyzer](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.740.823&rep=rep1&type=pdf)
- [Intelligent System for Learning and Understanding of Yoruba Language](https://pdfs.semanticscholar.org/ac53/a1d3c93518fabd26470181ecbf07462e5e17.pdf)
- [A Stochastic Collocation Algorithm Method for Processing the Yoruba Language Using the Data-Context Approach Based on Text, Lexicon and Grammar](http://www.akamaiuniversity.us/PJST19_1_175.pdf)
- [Integration of Yoruba Language into MaryTTS](https://link.springer.com/article/10.1007/s10772-016-9334-8)
- [Attentive Seq2seq Learning for Diacritic Restoration of Yorùbá Language Text](https://arxiv.org/abs/1804.00832)

## Slides

- [Development of Text-to-Speech System in Yoruba](https://www.slideshare.net/alexanderdecker/development-of-text-to-speech-system-for-yoruba-language)
- [A Number-to-Yoruba Text Transcription System](https://www.slideshare.net/aflat/a-number-to-yorb-text-transcription-system)

## Datasets

- [Wiki dump](https://dumps.wikimedia.org/yowiki/latest/)
- [Yoruba text](https://github.com/niger-volta-lti/yoruba-text)

| nlp nlp-resources nlp-machine-learning yoruba | ai
Udacity-Natural-Language-Processing-Nanodegree | # Udacity Natural Language Processing Nanodegree

This repository contains all my solutions to the tutorials and projects of the Udacity Natural Language Processing Nanodegree. Link to the [course page](https://www.udacity.com/course/natural-language-processing-nanodegree--nd892).

## Workspace

For all the tutorials and projects I used the provided, preconfigured workspaces of Udacity. | udacity nanodegree udacity-nanodegree natural-language-processing nlp deep-learning hmm machine-translation speech-recognition speech-to-text sentiment-analysis topic-modeling part-of-speech-tagger keras python3 | ai
Information_Extraction_Basic | # Information_Extraction_Basic

Repository which contains various technologies for information extraction.

## Requirements

- Janome (for Japanese language processing)

## Corpus

Basically, this information extraction (IE) algorithm assumes a corpus or text in Japanese. The corpus is constructed with the tool [doccano](http://doccano.herokuapp.com).

## Named Entity Recognition (NER)

### Dictionary-based NER (`dict_base_ner.py`)

Every word that needs to be recognized by NER is contained in the dictionary: the key of the dictionary is the word, and the value of the dictionary is the label. Extraction is strongly based on regular expressions. (The example dictionary in the repository maps Japanese words to the labels `people`, `organism`, `location` and `number`.)

### Rule-based NER (`rule_base_ner.py`)

Every word that needs to be recognized should match the rules given by the user. First, the text is tokenized into a list of words. The rules are applied to each word, from head to tail, and if a rule matches, the word is labeled with the value of that rule. This depends on the accuracy of tokenization and on well-formed rules in the correct order.

Rules (the Japanese pattern literals are elided here):

```python
rules = [
    (lambda x: x == …, 'people'),
    (lambda x: x == …, 'organism'),
    (lambda x: …, 'location'),
    (lambda x: …, 'location'),
    (lambda x: … or … or …, 'number'),
]
```

### Machine-learning-based NER (`ml_base_ner.py`)

The NER task is treated as a per-word labeling (classification) problem. Each token of the raw (Japanese) text gets a label, e.g.:

```
B-LOC I-LOC B-LOC I-LOC B-PER B-NUM B-NUM B-PER
```

This uses the IOB2 representation. Tokens are first mapped to POS tags; then we can represent a word with its features and its context's features:

- word: `x_i`
- context (`win_size = 1`): `x_(i-1)`, `x_(i+1)`
- features of `x_i`: `x_i`, `x_i.pos`, `x_(i-1)`, `x_(i-1).pos`, `x_(i+1)`, `x_(i+1).pos`
- correct label: `I-LOC`

Features should be represented as vectors for training the model; one-hot vectors could be one option. So the model's input will be the feature vector of the word, and the output will be the label of the word. The NER process will be:

1. Tokenize text into words.
2. Convert each word into a feature vector `v`.
3. Label the word with the output of the model whose input is `v`.
4. Extract the words with relevant labels.

The method used to train the model is called one-versus-rest, which trains a binary classifier for each label. For example, if there are 3 labels, then train three binary classifiers (perceptrons). In addition, to remove non-reasonable sequences of labels, the Viterbi algorithm can be used. For example, this is a non-reasonable label sequence: `O I-PER I-ORG I-PER`.

### Linear structured-learning-based NER | server
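The dictionary-based approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the repository's code: the function name and the English example entries are hypothetical (the repository's real dictionary maps Japanese surface forms to these labels), but the mechanism — scan the text with a regular expression per dictionary key and tag each match with the key's label — is the one the README describes.

```python
import re

# Hypothetical label dictionary; the repo uses Japanese surface forms as keys.
ner_dict = {
    "Tokyo": "LOCATION",
    "Alice": "PEOPLE",
    "ACME": "ORGANISM",  # the repo uses "organism" as an organization-like label
}

def dict_based_ner(text, dictionary):
    """Tag every dictionary word found in `text` with its label.

    Returns (start, end, word, label) tuples sorted by position.
    """
    spans = []
    for word, label in dictionary.items():
        # Regular-expression search for each dictionary key, as in dict_base_ner.py.
        for m in re.finditer(re.escape(word), text):
            spans.append((m.start(), m.end(), word, label))
    return sorted(spans)

print(dict_based_ner("Alice moved to Tokyo to join ACME.", ner_dict))
```

Note the ordering caveat the README raises for the rule-based variant applies here too: when dictionary entries overlap, the longest key should be tried first or the shorter one will shadow it.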
|
libbitcoin-blockchain | This branch is not usable in its current state. Please see [version3](https://github.com/libbitcoin/libbitcoin-blockchain/tree/version3) for the latest functional branch. | blockchain
|
mupen64plus-ae | # Mupen64Plus-AE

Mupen64Plus, Android Edition (AE) is an Android user interface for Mupen64Plus.

Please visit [the official forum](http://www.paulscode.com/forum/index.php) for support and discussion.

[<img src="https://f-droid.org/badge/get-it-on.png" alt="Get it on F-Droid" height="80">](https://f-droid.org/packages/org.mupen64plusae.v3.alpha/)

## Nightly Builds

Download nightly builds from continuous integration:

[![Build](https://github.com/mupen64plus-ae/mupen64plus-ae/actions/workflows/build.yml/badge.svg)](https://github.com/mupen64plus-ae/mupen64plus-ae/actions/workflows/build.yml)

## Build Instructions

1. Download and install the prerequisites:
   - [Android Studio](https://developer.android.com/studio/index.html). During the installation, make sure the latest SDK and NDK are installed.
   - If running Windows, make sure you install git, python, awk, and the required Microsoft Visual C++ Redistributable (i.e. CMake 3.18.1 requires Microsoft Visual C++ Redistributable 2015), and that the binaries are in your `PATH` environment variable.
2. Clone the mupen64plus-ae repository and initialize the working copy:
   - `git clone https://github.com/mupen64plus-ae/mupen64plus-ae.git`
3. Open the project using Android Studio.
4. Build and run the app from Android Studio:
   - Select Build → Make Project to build.
   - Select Run → Run app to run.

| front_end
|
ESP8266_RTOS_WITH_ALIYUN_IOT_SDK | # Important Notice About this Repository

A new branching model is applied to this repository, which consists of a master branch and release branches:

1. **Master branch**: the master branch is an integration branch where bug fixes/features are gathered for compiling and functional testing.
2. **Release branch**: the release branch is where releases are maintained and hot fixes (with names like `release/v2.x.x`) are added. Please ensure that all your production-related work is tracked with the release branches.

With this new model we can push out bug fixes more quickly and achieve simpler maintenance.

## Roadmap

ESP8266_RTOS_SDK's framework is quite outdated and different from the current [ESP-IDF](https://github.com/espressif/esp-idf), and we are planning to migrate ESP8266_RTOS_SDK to ESP-IDF eventually. After v2.0.0, however, we will firstly provide a new version of the ESP8266 SDK (ESP8266_RTOS_SDK v3.0) which shares the same framework with ESP-IDF (ESP-IDF style) as a workaround, because the multi-CPU architecture is not supported by ESP-IDF for the time being. Actions to be taken for ESP8266_RTOS_SDK v3.0 include the following items:

1. Modify the framework to ESP-IDF style
2. Restructure some core libraries, including Wi-Fi libraries and libmain
3. Update some third-party libraries, including FreeRTOS, lwIP, mbedTLS, noPoll, libcoap, SPIFFS, cJSON, wolfSSL, etc.
4. Update some drivers
5. Others

# Developing with the ESP8266_RTOS_SDK

## Get Toolchain

You can get the toolchain from the links below:

- [Windows](https://dl.espressif.com/dl/xtensa-lx106-elf-win32-1.22.0-88-gde0bdc1-4.8.5.tar.gz)
- [Mac](https://dl.espressif.com/dl/xtensa-lx106-elf-osx-1.22.0-88-gde0bdc1-4.8.5.tar.gz)
- [Linux (64)](https://dl.espressif.com/dl/xtensa-lx106-elf-linux64-1.22.0-88-gde0bdc1-4.8.5.tar.gz)
- [Linux (32)](https://dl.espressif.com/dl/xtensa-lx106-elf-linux32-1.22.0-88-gde0bdc1-4.8.5.tar.gz)

## Get ESP8266_RTOS_SDK

Besides the toolchain (that contains programs to compile and build the application), you also need ESP8266-specific API libraries. They are provided by Espressif in the [ESP8266_RTOS_SDK](https://github.com/espressif/ESP8266_RTOS_SDK) repository. To get it, open a terminal, navigate to the directory you want to put ESP8266_RTOS_SDK in, and clone it using the `git clone` command:

```
cd ~/esp
git clone https://github.com/espressif/ESP8266_RTOS_SDK.git
```

ESP8266_RTOS_SDK will be downloaded into `~/esp/ESP8266_RTOS_SDK`.

## Setup Path to ESP8266_RTOS_SDK

The toolchain programs access ESP8266_RTOS_SDK using the `IDF_PATH` environment variable. This variable should be set up on your PC, otherwise projects will not build. Setting may be done manually, each time the PC is restarted. Another option is to set it up permanently by defining `IDF_PATH` in the user profile. To set it manually:

```
export IDF_PATH=~/esp/ESP8266_RTOS_SDK
```

## Start a Project

Now you are ready to prepare your application for ESP8266. To start off quickly, we can use the `examples/get-started/project_template` project from the `examples` directory in the SDK. Once you've found the project you want to work with, change to its directory and you can configure and build it.

## Connect

You are almost there. To be able to proceed further, connect the ESP8266 board to your PC, check under what serial port the board is visible, and verify that serial communication works. Note the port number, as it will be required in the next step.

## Configuring the Project

Being in a terminal window, go to the directory of the `project_template` application and start the project configuration utility `menuconfig`:

```
cd ~/esp/ESP8266_RTOS_SDK/examples/get-started/project_template
make menuconfig
```

In the menu, navigate to `Serial flasher config` > `Default serial port` to configure the serial port where the project will be loaded to. Confirm the selection by pressing Enter, save the configuration by selecting `Save`, and then exit the application by selecting `Exit`.

Note: on Windows, serial ports have names like `COM1`; on macOS, they start with `/dev/cu.`; on Linux, they start with `/dev/tty`.

Here are a couple of tips on navigation and use of `menuconfig`:

- Use the Up/Down arrow keys to navigate the menu.
- Use the Enter key to go into a submenu, and the Escape key to go out or to exit.
- Type `?` to see a help screen; the Enter key exits the help screen.
- Use the Space key, or the `Y` and `N` keys, to enable (Yes) and disable (No) configuration items with checkboxes.
- Pressing `?` while highlighting a configuration item displays help about that item.
- Type `/` to search the configuration items.

Once done configuring, press Escape multiple times to exit, and say "Yes" to save the new configuration when prompted.

## Compiling the Project

```
make all
```

...will compile the app based on the config.

## Flashing the Project

When `make all` finishes, it will print a command line to use `esptool.py` to flash the chip. However, you can also do this from make by running:

```
make flash
```

This will flash the entire project (app, bootloader and init data bin) to a new chip. The settings for serial port flashing can be configured with `make menuconfig`. You don't need to run `make all` before running `make flash`; `make flash` will automatically rebuild anything which needs it.

## Viewing Serial Output

The `make monitor` target uses the [idf_monitor tool](https://esp-idf.readthedocs.io/en/latest/get-started/idf-monitor.html) to display serial output from the ESP32. `idf_monitor` also has a range of features to decode crash output and interact with the device. Check the [documentation page](https://esp-idf.readthedocs.io/en/latest/get-started/idf-monitor.html) for details. Exit the monitor by typing Ctrl-].

To flash and monitor output in one pass, you can run:

```
make flash monitor
```

## Compiling & Flashing Just the App

After the initial flash, you may just want to build and flash just your app, not the bootloader and init data bin:

- `make app` — build just the app.
- `make app-flash` — flash just the app.

`make app-flash` will automatically rebuild the app if it needs it. In normal development there's no downside to reflashing the bootloader and init data bin each time if they haven't changed. (Note: it is recommended to use these 2 commands if you have already flashed the bootloader and init data bin.)

## Parallel Builds

ESP8266_RTOS_SDK supports compiling multiple files in parallel, so all of the above commands can be run as `make -jN` where `N` is the number of parallel make processes to run. Generally, `N` should be equal to, or one more than, the number of CPU cores in your system.

Multiple make functions can be combined into one. For example, to build the app and bootloader using 5 jobs in parallel, then flash everything, and then display serial output from the ESP32, run:

```
make -j5 app flash monitor
```

## Erasing Flash

The `make flash` target does not erase the entire flash contents. However, it is sometimes useful to set the device back to a totally erased state. To erase the entire flash, run `make erase_flash`. This can be combined with other targets, i.e. `make erase_flash flash` will erase everything and then re-flash the new app, bootloader and init data bin.

## Updating ESP8266_RTOS_SDK

After some time of using ESP8266_RTOS_SDK (IDF), you may want to update it to take advantage of new features or bug fixes. The simplest way to do so is by deleting the existing `ESP8266_RTOS_SDK` folder and cloning it again. Another solution is to update only what has changed; this method is useful if you have a slow connection to GitHub. To do the update, run the following commands:

```
cd ~/esp/ESP8266_RTOS_SDK
git pull
```

The `git pull` command fetches and merges changes from the ESP8266_RTOS_SDK repository on GitHub. | os
|
blockchain | # blockchain

## Background

Got the idea from [Learn Blockchains by Building One](https://github.com/dvf/blockchain).

## Building

```
cd cmd
go build -o goblockchain
```

## Usage

Start instances (nodes) as follows. You can start as many nodes as you want with the following command:

```
./goblockchain -port 8001
./goblockchain -port 8002
./goblockchain -port 8003
```

## JSON endpoints

Get full blockchain:

```
curl 127.0.0.1:8001/chain
```

Mine a new block:

```
curl 127.0.0.1:8001/mine
```

Adding a new transaction — `POST 127.0.0.1:8001/transactions/new`, body (JSON):

```json
{
  "sender": "1a1zp1ep5qgefi2dmptftl5slmv7divfna",
  "recipient": "1ez69snzzmepmzx3wpezmktrcbf2gpnq55",
  "amount": 1000
}
```

Register a node in the blockchain network — `POST 127.0.0.1:8001/nodes/register`, body (JSON):

```json
{
  "nodes": ["http://127.0.0.1:8002", "http://127.0.0.1:8003"]
}
```

Resolving blockchain:

```
curl 127.0.0.1:8001/nodes/resolve
```

| blockchain
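The `/chain`, `/mine` and `/nodes/resolve` endpoints above all revolve around one idea from "Learn Blockchains by Building One": each block commits to the hash of its predecessor, and among competing chains the longest valid one wins. A minimal sketch of that linkage (in Python for illustration only — this repository is written in Go, and the field names below are assumptions, not the repo's actual schema):

```python
import hashlib
import json
import time

def hash_block(block):
    """SHA-256 over the canonical (sorted-key) JSON encoding of a block."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def new_block(prev_block, transactions):
    """Create a block that commits to its predecessor's hash."""
    return {
        "index": prev_block["index"] + 1,
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": hash_block(prev_block),
    }

def valid_chain(chain):
    """A chain is valid if every block's previous_hash matches its predecessor."""
    return all(
        chain[i]["previous_hash"] == hash_block(chain[i - 1])
        for i in range(1, len(chain))
    )

genesis = {"index": 0, "timestamp": 0, "transactions": [], "previous_hash": "0"}
tx = {"sender": "a", "recipient": "b", "amount": 1000}
chain = [genesis, new_block(genesis, [tx])]
print(valid_chain(chain))  # True
```

Editing any earlier block changes its hash and breaks every later `previous_hash` link, which is exactly the check a `/nodes/resolve`-style consensus step runs against each neighbour's reported chain before adopting the longest one.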
|
design-system-talks | # Design System Talks

Know a talk that isn't listed below? Feel free to create a new pull request or open an issue.

A design system is a collection of documentation on principles and best practices that helps guide a team to build digital products. They are often embodied in UI libraries and pattern libraries, but can extend to include guides on other areas, such as voice and tone.

| Talk | Speaker(s) |
| ---- | ---------- |
| [Introducing Design Systems Into Chaos](https://www.youtube.com/watch?v=fzsi1bk-brm) | Diana Mounter, GitHub |
| [Thinking in Symbols for Universal Design](https://www.youtube.com/watch?v=z5xxgxbz3fo) | Benjamin Wilkins, Airbnb |
| [Designing a Design System](https://www.youtube.com/watch?v=7hyollo2gc4) | Jina Bolton, Salesforce |
| [Designing a Design System for Modular Modules, and Building a Team to Build It](https://www.youtube.com/watch?v=wsfk5rccxr4) | Jesiah McCann<br>Marianne Epstein<br>Josh Trout, USA Today |
| [Living Design Systems](https://www.youtube.com/watch?v=crp5cx7nzw) | Jina Bolton, Salesforce |
| [Scaling Design with Systems](https://www.youtube.com/watch?v=tuly1cym57g) | Karri Saarinen, Airbnb |
| [Design Systems at Scale](https://www.youtube.com/watch?v=kq48beotjyc) | Sarah Federman |
| [Design Systems Are for People](https://www.youtube.com/watch?v=ldctzdycy1k) | Jina Bolton |
| [Design Systems: When and How Much?](https://www.youtube.com/watch?v=hx02sal-ih0) | Diana Mounter |
| [Design Systems and Creativity](https://www.youtube.com/watch?v=bmrdrw93knw) | Jina Anne |
| [How to Foster Participation](https://www.youtube.com/watch?v=6xzhhhgtt9a) | Inayaili de León |
| [Design Systems for the Rest of Us](https://www.youtube.com/watch?v=60tmvbw1kfc) | Siddharth Kshetrapal |
| [Minimising Complexity](https://www.youtube.com/watch?v=cg8yssvdd5c) | Kellie Matheson<br>Richard Hallows |
| [You've Built a Design System, Now What?](https://www.youtube.com/watch?v=mpr4drniekq) | Bethany Sonefeld |
| [Accessibility in the GOV.UK Design System](https://www.youtube.com/watch?v=oeymedpnpce) | Nick Colley |
| [Design Systems at Deliveroo: Learnings and Frustrations](https://www.youtube.com/watch?v=6c0yvuwjc84) | Raphael Guilleminot<br>Matt Vagni |
| [Your Design System Has a Heart](https://www.youtube.com/watch?v=j4tiaw2gqhm) | Hana Lodhi<br>Antonas Deduchovas |
| [Delivering Flexible Cross-platform Design Systems](https://www.youtube.com/watch?v=luzbzaawtte) | Charlie Robbins |
| [Design Systems at Scale](https://www.youtube.com/watch?v=bltrdvksrsm) | Sarah Federman |
| [Design System APIs and the Developer Experience](https://www.youtube.com/watch?v=hftvh9snezq) | Diana Mounter |

## Other Resources

- [Awesome Design Systems](https://github.com/alexpate/awesome-design-systems)
- [Design Systems Handbook](https://www.designbetter.co/design-systems-handbook)

| talks designsystems conference | os
xinu-arduino | xinu arduino the xinu arduino project is a xinu operating system subset modified to run on an avr atmega328p microcontroller e g arduino boards visit its web page for detailed instructions http se fi uncoma edu ar xinu avr a href https www youtube com watch v jacuup bkiu title demo video xinu os into avr atmega328p img style float right alt demo video xinu os into atmega328p mcu src http se fi uncoma edu ar xinu avr www files placa3 jpg a at present the core pieces of xinu are working so you can already integrate it in the development of multi tasking embedded systems you will also need any bare avr mcu or arduino board of course for lovers of because small is beautiful fusixos retrobsd unix in microcontrollers etc this project provides a user interface example as well the xinu shell and some tiny versions of utilities like echo a text editor a basic interpreter ps kill free date cal and some more check the demo video https www youtube com watch v jacuup bkiu if you want to see a xinu avr session in a little avr mcu like a retro computer system the source code is comprise of 1 the xinu os for avr atmega328p microkernel 2 basic examples apps of how to use xinu 3 a complete example the xinu shell and tiny versions of several unix like utilities a name whatisxinu a what is xinu xinu is a small elegant and easy to understand operating system originally developed by douglas comer for instructional purposes at purdue university in the 1980s since then it has been ported to many architectures and hardware platforms xinu uses powerful primitives to provides all the componentes and the same functionality many conventional operating sytems supply because the whole source code size is small xinu is suitable for embedded systems strong the xinu operating system includes strong dynamic process creation dynamic memory allocation real time clock management process coordination and synchronization local and remote file systems a shell and device independent i o 
functions the xinu operating system is documented in the book d comer operating system design the xinu approach second edition crc press 2015 isbn 9781498712439 https xinu cs purdue edu textbook many sites defines xinu as a free unix system or similar statements it is not xinu differs completely from the internal structure of unix or linux for academic purposes xinu is smaller elegant and easier to understand applications written for one system will not run on the other without modification xinu is not unix history xinu originally ran on digital equipment corporation lsi 11 s with only 64k bytes of memory at the end of 1979 and the inning of 1980 over the years xinu have been expanded and ported to a wide variety of architectures and platforms including ibm pc macintosh digital equipment corporation vax and decstation 3100 sun microsystems sun 2 sun 3 and sparcstations and for several arm mips and x86 embedded boards it has been used as the basis for many research projects furthermore xinu has been used as an embedded system in products by companies such as motorola mitsubishi hewlett packard and lexmark there is a full tcp ip stack and even the original version of xinu for the pdp 11 supported arbitrary processes and network i o there are current versions of xinu for galileo intel boards arm beagle boards several mips platforms and for x86 pc hardware and virtual machines a name code a source code there is just one git repository and it has everything git http github com zrafa xinu avr the list below is just for convenience a href https github com zrafa xinu avr the xinu os for avr atmega328p a a href https github com zrafa xinu avr tree master apps example apps a a href https github com zrafa xinu avr tree master apps shell the xinu shell and tiny unix like utilities editor basic interpreter ps kill echo uptime sleep etc a a href https xinu cs purdue edu the official xinu page and code a a name authors a authors xinu os copyright c 2012 2015 douglas e comer and 
crc press inc this version for avr atmega328p v0 1 c 2020 rafael ignacio zurita rafa fi uncoma edu ar acknowledgments michael m minor he is the author of another avr port os xinu a href https sites google com site avrxinu avrxinu a we use his context switch code the addargs in xinu shell and a few lines more his port is for bigger avr microcontrollers 16kb of ram and he used an old version of xinu xinu from the 1987 book edition a name notes a notes about the xinu os port for avr atmega328p current official xinu versions are designed for arm mips and x86 architectures the hardware differences between those and the ultra small avr microcontroller required changes to some low level data structures of xinu mainly using the flash memory in the avr mcu for keeping several read only data structures previously in ram also several limits were imposed so the read write data structures fits into the sram avr memory the xinu version for avr atmega328p has the core functionality of xinu and provides some extensions including an eeprom file system and several unix like utilities for the xinu shell this mcu has just 2kb of sram 32kb of flash memory and 1kb of eeprom the xinu version for avr uses 12kb of flash and 0 9kb of ram so there is still 50 of room sram and flash for the embedded application running on xinu concurrent processes so this project might be stimulating and very fun for lovers of embedded systems development and operating system internals notes about the port 1 max number of processes 4 to 8 2 main process is now the embedded application process 3 max number of semaphores 2 to 6 the size of the table of process queues depends on this 4 max number of devices 4 to 5 4 the clkhandler wakeup a process preemption for cpu every 100ms 5 several limits for buffers 32bytes for tty input 16bytes for names of devices 1byte for the queues keys and the list continues 6 sleepms is now delay sleep100ms and the key for queue char in ms 100 7 many vars in data structures have a 
smaller size e g before int32 now char 8 sleep sleeps max 20 seconds date type 9 most of the libc are from avr libc 10 init load bss and data from flash to ram from avr libc 11 shell manages max 6 tokens 12 date and time is managed by a little lib no ntp or rtc 13 most of the const char in source code were moved to flash program space via flash directive from gcc or progmem from avr libc 14 tty in is asynchronous with interrupts ok but tty out is polled based synchronous 15 open read write seek close use struct dentry it is on flash on this port 16 remote file systems local file systems ram file systems are disabled so far 17 ports ptinit ptsend ptrecv etc are disabled so far 18 null process has priority 1 a name douglas a douglas comer douglas comer is a professor of computer science at purdue university who was inducted into the internet hall of fame on september 2019 as one of the earliest tcp ip and internetworking researchers comer wrote the first series of textbooks explaining the scientific principles underlying the design of the internet and its communications protocols providing some of the earliest formal guidance for building efficient networks and applications that use the internet comer s three volume textbook series internetworking with tcp ip written in 1987 is widely considered to be the authoritative reference for internet protocols the series played a key role in popularizing internet protocols by making them more understandable to a new generation of engineers and it professionals prof douglas comer designed and developed the xinu operating system in 1979 1980 douglas comer page https www cs purdue edu homes comer internet hall of fame https www cs purdue edu news articles 2019 comer ihof html | xinu embedded-systems rtos microkernel xinu-os arduino arduino-uno | os |
ml-cvnets | # CVNets: A library for training computer vision networks

CVNets is a computer vision toolkit that allows researchers and engineers to train standard and novel mobile- and non-mobile computer vision models for a variety of tasks, including object classification, object detection, semantic segmentation, and foundation models (e.g., CLIP).

## Table of contents

- [What's new?](#whats-new)
- [Installation](#installation)
- [Getting started](#getting-started)
- [Supported models and tasks](#supported-models-and-tasks)
- [Maintainers](#maintainers)
- [Research effort at Apple using CVNets](#research-effort-at-apple-using-cvnets)
- [Contributing to CVNets](#contributing-to-cvnets)
- [License](#license)
- [Citation](#citation)

## What's new?

- ***July 2023***: Version 0.4 of the CVNets library includes
  - [Bytes Are All You Need: Transformers Operating Directly on File Bytes](https://arxiv.org/abs/2306.00238)
  - [RangeAugment: Efficient Online Augmentation with Range Learning](https://arxiv.org/abs/2212.10553)
  - Training and evaluating foundation models (CLIP)
  - Mask R-CNN, EfficientNet, Swin Transformer, and ViT
  - Enhanced distillation support

## Installation

We recommend to use Python 3.10+ and [PyTorch](https://pytorch.org) (version >= v1.12.0). Instructions below use Conda; if you don't have Conda installed, you can check out [how to install Conda](https://docs.conda.io/en/latest/miniconda.html#latest-miniconda-installer-links).

```bash
# Clone the repo
git clone git@github.com:apple/ml-cvnets.git
cd ml-cvnets

# Create a virtual env. We use Conda
conda create -n cvnets python=3.10.8
conda activate cvnets

# Install requirements and CVNets package
pip install -r requirements.txt -c constraints.txt
pip install --editable .
```

## Getting started

- General instructions for working with CVNets are given [here](docs/source/en/general).
- Examples for training and evaluating models are provided [here](docs/source/en/models) and [here](examples).
- Examples for converting a PyTorch model to CoreML are provided [here](docs/source/en/general/README-pytorch-to-coreml.md).

## Supported models and tasks

To see a list of available models and benchmarks, please refer to [Model Zoo](docs/source/en/general/README-model-zoo.md) and the [examples](examples) folder.

<details>
<summary>ImageNet classification models</summary>

- CNNs
  - [MobileNetv1](https://arxiv.org/abs/1704.04861)
  - [MobileNetv2](https://arxiv.org/abs/1801.04381)
  - [MobileNetv3](https://arxiv.org/abs/1905.02244)
  - [EfficientNet](https://arxiv.org/abs/1905.11946)
  - [ResNet](https://arxiv.org/abs/1512.03385)
  - [RegNet](https://arxiv.org/abs/2003.13678)
- Transformers
  - [Vision Transformer](https://arxiv.org/abs/2010.11929)
  - [MobileViTv1](https://arxiv.org/abs/2110.02178)
  - [MobileViTv2](https://arxiv.org/abs/2206.02680)
  - [SwinTransformer](https://arxiv.org/abs/2103.14030)
</details>

<details>
<summary>Multimodal classification</summary>

- [ByteFormer](https://arxiv.org/abs/2306.00238)
</details>

<details>
<summary>Object detection</summary>

- [SSD](https://arxiv.org/abs/1512.02325)
- [Mask R-CNN](https://arxiv.org/abs/1703.06870)
</details>

<details>
<summary>Semantic segmentation</summary>

- [DeepLabv3](https://arxiv.org/abs/1706.05587)
- [PSPNet](https://arxiv.org/abs/1612.01105)
</details>

<details>
<summary>Foundation models</summary>

- [CLIP](https://arxiv.org/abs/2103.00020)
</details>

<details>
<summary>Automatic data augmentation</summary>

- [RangeAugment](https://arxiv.org/abs/2212.10553)
- [AutoAugment](https://arxiv.org/abs/1805.09501)
- [RandAugment](https://arxiv.org/abs/1909.13719)
</details>

<details>
<summary>Distillation</summary>

- Soft distillation
- Hard distillation
</details>

## Maintainers

This code is developed by <a href="https://sacmehta.github.io" target="_blank">Sachin</a>, and is now maintained by Sachin, <a href="https://mchorton.com" target="_blank">Maxwell Horton</a>, <a href="https://www.mohammad.pro" target="_blank">Mohammad Sekhavat</a>, and Yanzi Jin.

### Previous maintainers

- <a href="https://farzadab.github.io" target="_blank">Farzad</a>

## Research effort at Apple using CVNets

Below is the list of publications from Apple that use CVNets:

- [MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer, ICLR'22](https://arxiv.org/abs/2110.02178)
- [CVNets: High performance library for Computer Vision, ACM MM'22](https://arxiv.org/abs/2206.02002)
- [Separable Self-attention for Mobile Vision Transformers (MobileViTv2)](https://arxiv.org/abs/2206.02680)
- [RangeAugment: Efficient Online Augmentation with Range Learning](https://arxiv.org/abs/2212.10553)
- [Bytes Are All You Need: Transformers Operating Directly on File Bytes](https://arxiv.org/abs/2306.00238)

## Contributing to CVNets

We welcome PRs from the community! You can find information about contributing to CVNets in our [contributing](CONTRIBUTING.md) document. Please remember to follow our [Code of Conduct](CODE_OF_CONDUCT.md).

## License

For license details, see [LICENSE](LICENSE).

## Citation

If you find our work useful, please cite the following papers:

```
@inproceedings{mehta2022mobilevit,
    title={MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer},
    author={Sachin Mehta and Mohammad Rastegari},
    booktitle={International Conference on Learning Representations},
    year={2022}
}

@inproceedings{mehta2022cvnets,
    author = {Mehta, Sachin and Abdolhosseini, Farzad and Rastegari, Mohammad},
    title = {CVNets: High Performance Library for Computer Vision},
    year = {2022},
    booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
    series = {MM '22}
}
```

| ade20k classification computer-vision deep-learning detection imagenet machine-learning mscoco pascal-voc pytorch segmentation | ai
iot-433mhz | # IOT-433Mhz

![IOT-433Mhz logo](https://github.com/roccomuso/iot-433mhz/blob/master/other/pics/logo128x128.png?raw=true)

[![Build Status](https://travis-ci.org/roccomuso/iot-433mhz.svg?branch=master)](https://travis-ci.org/roccomuso/iot-433mhz) [![NPM Version](https://img.shields.io/npm/v/iot-433mhz.svg)](https://www.npmjs.com/package/iot-433mhz) [![Dependency Status](https://david-dm.org/roccomuso/iot-433mhz.png)](https://david-dm.org/roccomuso/iot-433mhz) <span class="badge-patreon"><a href="https://patreon.com/roccomuso" title="Donate to this project using Patreon"><img src="https://img.shields.io/badge/patreon-donate-yellow.svg" alt="Patreon donate button"></a></span>

## Summary

IOT-433Mhz is a home automation framework for 433MHz devices that runs on Node.js. You can control 433MHz RC power sockets, PIR sensors, door sensors, and much more. To get started you just need:

- a 433MHz transmitter and receiver, both connected to an Arduino with the IOT-433Mhz sketch on it;
- a PC/RaspberryPi that runs the IOT-433Mhz platform, connected to the Arduino through USB.

## UI Demo

![IOT-433Mhz Web UI](https://github.com/roccomuso/iot-433mhz/blob/master/other/pics/web-ui.gif?raw=true)

## Features

- Multi-platform: Windows, Mac OS X, Linux.
- Basic Authentication.
- Intuitive API + webhooks to build your own interface.
- Built-in Material Design cards-based template.
- Real-time UI refresh.
- Detect radio frequency codes (433MHz).
- Generate cards and assign them to your rooms.
- Control RC power sockets, PIR sensors, door sensors and much more.
- Telegram Bot for alarm notifications.
- Totally open source, open hardware.

## Recommended Hardware

For more about the required 433MHz transmitter/receiver and the supported hardware, see the [Hardware Layer page](https://github.com/roccomuso/iot-433mhz/tree/master/hardware-layer).

## General Install

[![NPM](https://nodei.co/npm-dl/iot-433mhz.png)](https://nodei.co/npm/iot-433mhz/)

You can get it on [NPM](https://www.npmjs.com/package/iot-433mhz):

    npm install iot-433mhz -g

and then execute it from the console with `iot-433mhz`, or clone this repo:

    git clone https://github.com/roccomuso/iot-433mhz.git

Then don't forget to use the right Node version (install `nvm` to handle Node.js versions) and install all the dependencies. Tested and fully working with Node 6.17.1 and yarn 1.10.0:

    nvm install 6.17.1
    npm i -g yarn@1.10.0
    yarn install

**Heads up!** On Raspberry Pi you can encounter some issues installing all the dependencies due to permission errors. If that happens, try this:

    sudo chown -R $USER:$GROUP ~/.npm

combined with running `npm cache clean` to get any busted packages out of your cache. In addition, if the error still persists, try adding the flag `--unsafe-perm`:

    sudo npm install --unsafe-perm               # if installing from git, or
    sudo npm install iot-433mhz -g --unsafe-perm # if installing from npm

NB. The reason for using the `--unsafe-perm` option is that when node-gyp tries to recompile any native libraries (e.g. serialport), it tries to do so as a "nobody" user and then fails to get access to certain directories. Allowing it root access during install allows the dependencies to be installed correctly during the upgrade.

If running on different platforms, follow the platform-specific setup below.

## Browser Support

| ![IE](https://cloud.githubusercontent.com/assets/398893/3528325/20373e76-078e-11e4-8e3a-1cb86cf506f0.png) Internet Explorer | ![Chrome](https://cloud.githubusercontent.com/assets/398893/3528328/23bc7bc4-078e-11e4-8752-ba2809bf5cce.png) Google Chrome | ![Firefox](https://cloud.githubusercontent.com/assets/398893/3528329/26283ab0-078e-11e4-84d4-db2cf1009953.png) Firefox | ![Opera](https://cloud.githubusercontent.com/assets/398893/3528330/27ec9fa8-078e-11e4-95cb-709fd11dac16.png) Opera | ![Safari](https://cloud.githubusercontent.com/assets/398893/3528331/29df8618-078e-11e4-8e3e-ed8ac738693f.png) Safari |
| --- | --- | --- | --- | --- |
| IE 11+ | Latest | Latest | Latest | Latest |

## Specific Setup

IOT-433Mhz is built on top of Node.js. The server is multi-platform and can run on different hardware combinations, shown below.

### A. Computer with Arduino connected and a 433MHz transmitter and receiver (TX/RX)

![Arduino interface and 433MHz](https://github.com/roccomuso/iot-433mhz/blob/master/other/schemes/arduino-transmitter-and-receiver.jpg?raw=true)

#### Mac & Linux

The IOT-433Mhz server should run smoothly. Remember to install with root permission (`sudo`):

    sudo npm install iot-433mhz -g

and then execute with `iot-433mhz`.

#### Windows

To run the server on Windows, make sure to install Python 2.7 and Microsoft Visual Studio Express 2013 (required by [node-serialport](https://github.com/voodootikigod/node-serialport)). Then just do a `npm install iot-433mhz -g` and execute with `iot-433mhz`.

### B. Raspberry Pi (Raspbian Jessie) with 433MHz transmitter and receiver

To use IOT-433Mhz on a Raspberry Pi, first do a system update. Update `/etc/apt/sources.list` to have `jessie` wherever you've currently got `wheezy`:

    sudo apt-get update
    sudo apt-get dist-upgrade
    sudo rpi-update

Reboot. Then install Node.js:

    wget http://node-arm.herokuapp.com/node_latest_armhf.deb
    sudo dpkg -i node_latest_armhf.deb

Check the installation: `node -v`.

#### Transmitter and receiver connected to GPIO

One way to go is directly connecting the radio transmitter and receiver to the GPIO, as shown in the following picture. But first remember to install wiringPi ([link](http://wiringpi.com/download-and-install/)) and to execute the app with root permission (`sudo`).

![IOT-433Mhz with RPi](https://github.com/roccomuso/iot-433mhz/blob/master/other/schemes/raspberry-pi-rxb6-kxd10036-on-3.3v.jpg?raw=true)

**Heads up!** The RF receiver module operates at 5V. The GPIO data pins can only support 3.3V. If you put your receiver on 5V, the data I/O pin of the Raspberry will also receive 5V, which is way too high. A simple resistor (4.7K) should be fine, as already outlined in many forum posts, but a logic level converter (level shifter) or a simple voltage divider is recommended.

![Level shifter](https://github.com/roccomuso/iot-433mhz/blob/master/other/schemes/rpi-llc-receiver.jpg?raw=true)

Here the voltage divider: https://github.com/roccomuso/iot-433mhz
blob master other schemes voltage divider jpg raw true voltage divider the important thing here is the ratio of r1 to r2 r1 should be just over half r2 s value to ensure 5v is divided down to 3 3v the values shown here should be suitable for most uses nb for this configuration the raspberry pi platform uses the 433mhz utils library through the rpi 433 module but notice that rfsniffer compiled c appears to chew up all the rpi cpu 95 not ideal at all therefore an external arduino is the recommended solution using rpi with an external arduino remember to install with root permission the system can run on rpi using an external arduino like the other platforms to do that just set to code true code the code use external arduino code option in the code config json code file in this way we ll force the rpi to use an arduino through usb using the node js serialport module i m not sure if strictly necessary but it s worth installing the arduino ide and related drivers with code apt get install arduino code heads up sometimes the usb doesn t get detected on the fly you should be able to see it with code ls dev tty code usb not working https www raspberrypi org forums viewtopic php f 28 t 53832 just plug it and then reboot your rpi is recommended to run the server on the rpi through a terminal session see screen https www raspberrypi org forums viewtopic php t 8099 p 101209 config through the settings page from the web interface you can more or less change the general settings stored in code config json code few of those settings are there listed with their default values debug true start the app in debugging mode username root username required to authenticate required also during api calls password root password required to authenticate required also during api calls arduino baudrate 9600 the arduino default baudrate no need to change it server port 8080 choose on which port you wanna run the web interface db compact interval 12 database needs to be compacted to have better 
performance by default every 12 hours it will be compacted put 0 to avoid db compacting backend urls you can specify a backend json file containing the urls to carry out notifications nb this requires the iot 433mhz backend repo iot 433mhz makes use of the node debug module it s enabled by default but you can enable or disable it using the environment variable code debug iot 433mhz code you could also debug a specific part of the application providing as secondary param the file name like code debug iot 433mhz socketfunctions js code if you made a change to the settings from the web interface then to make it effective you need to restart the app the best way to set custom settings is through the cli optional parameters shown below usage start the system with the console global command iot 433mhz then you ll have to select the correct serial port to which the arduino is attached to start iot 433mhz https github com roccomuso iot 433mhz blob master other pics start iot 433mhz png raw true console start iot 433mhz once selected you re ready to go you re then free to use the system through the beautiful web interface thumbs up for material design or use the api to build your own interface to custom your system settings simply use the cli options iot 433mhz help that shows something like that iot 433mhz cli options https github com roccomuso iot 433mhz blob master other pics iot 433mhz cli options png raw true iot 433mhz cli options you can provide some parameter also as env variables node env development for virtual serial port port 8080 web server port serial port dev ttyusb0 serial port built in web interface reachable on the code http serveraddress port code the web code server port code is defined in code config json code default s value is 8080 it works well in browsers like chrome reccomended firefox safari opera microsoft edge it doesn t on internet explorer avoid it it also announce itself over mdns on the address iot 433mhz local so you don t have to struggle 
to get the server ip address once you open the address on your browser an authentication is required username and password are stored inside the config json file default values are root root if you wanna have a live console output of your iot 433mhz running on node there s a real time console mirroring web console on code http serveraddress port console html code thanks to console mirroring https github com roccomuso console mirroring add to homescreen the web interface provides along with supported browsers the ability to add the page on your homescreen like a native application the first time you ll open it a pop up will come out added to homescreen https github com roccomuso iot 433mhz blob master other pics added to homescreen jpg raw true added to homescreen heads up if your server is running on a rpi make sure to have a static ip address assigned to your server otherwise the linked app on the homescreen will not work anymore what kind of devices the system works with see the hardware page https github com roccomuso iot 433mhz tree master hardware layer api below every single api available is documented too lazy to copy and paste just download and import the postman collection download https github com roccomuso iot 433mhz blob master other iot 433mhz json postman collection tip the iot 433mhz server requires a basic authentication also for the api calls username and password are defined inside the config json file default username and password root root what you need to take into account is to set the following header field during your http requests code authorization basic cm9vddpyb290 code where the last string is the base64 encoding of code root root code if you changed default username and password you should update the base64 text too learn more https en wikipedia org wiki basic access authentication on basic authentication access code get api settings get code return the current settings useful to see notification status code get api system get uid code 
return the unique iot system uid a unique random id generated from the system code get api system new uid code generate a new unique iot system uid a unique random id generated from the system code get api system telegram enable code enable notification through telegram bot code get api system telegram disable code disable notification through telegram bot code get api system email enable code enable notification through email code get api system email disable code disable notification through email code get api code send rfcode code send the specified rfcode return a status object code status ok code or code status error error error description code code get api codes ignored code return a list of ignored codes stored in db code get api codes all code return all the registered codes stored in db code get api codes available code return all the available codes stored in db available codes can be assigned to a new device card code get api cards all code return all the cards stored in db code get api cards get shortname code return a single card with the specified shortname code post api cards new code form data required parameters headline a brief headline shortname lower case no spaces card body a description html allowed room lower case no spaces type must be one of the following types switch alarm info device if type switch gotta have on code and off code parameters if type alarm just the trigger code parameter optional parameter code card img code code background color code must be an hex color with json response 200 ok code done true newcard code where newcard is the json card just inserted or code done false error error description code code get api cards delete shortname code delete the card with the specified shortname it returns code status ok cards deleted 1 code or code status error error error description code code post api cards arm all code arm all the alarm type cards it returns code status ok cards affected n armed true code or code status error 
error error description code code post api cards disarm all code disarm all the alarm type cards it returns code status ok cards affected n armed false code or code status error error error description code code get api alarm shortname arm code only alarm type cards can be armed code get api alarm shortname disarm code only alarm type cards can be disarmed if disarmed no webhook callbacks or any kind of notifications will be sent code get api switch shortname on code turn on a switch code get api switch shortname off code turn off a switch code get api switch shortname toggle code toggle a switch webhooks webhooks allow you to build or set up integrations which subscribe to certain events on the iot 433mhz system when one of those events is triggered we ll send a http post payload to the webhook s configured url thanks to node webhooks https github com roccomuso node webhooks webhooks can be used to catch several events alarm triggered event new card event card deleted event new code detected event switch toggle event nb in this current release webhooks are not card specific for example a single alarmtriggered event type catches every alarm trigger it s up to you parse the payload and make sure that was the sensor you were wishing for use the api below to set up and interacts with webhooks code post api webhook add webhookshortname code add a new url for the selected webhook required parameters webhookshortname provided in url it must be one of these alarmtriggered newcard carddeleted newcode switchtoggle url the url to which a http post request will be sent when the event get fired the request carries a json payload field that gotta be parsed let s describe every event json payload you re gonna listen for according to the supplied code webhookshortname code code alarmtriggered card id last alert 1453 code shortname room code nb an alarmtriggered webhook callback will be executed only if the alarm card is armed code newcard card id headline shortname card body img 
type switch alarm info room device code nb device depends on type if switch we would look for these properties on code off code notification sound is on if alarm last alert trigger code notification sound if info device got no properties code carddeleted card id code code newcode code bitlength protocol code nb the detected code could be ignored or already attached to a device card code switchtoggle card id is on true false sent code timestamp 1453 code code get api webhook get code return the whole webhook db file code get api webhook get webhookshortname code return the selected webhook code get api webhook delete webhookshortname code remove all the urls attached to the selected webhook code post api webhook delete webhookshortname code remove only one single url attached to the selected webhook a json body with the url parameter is required url http code post api webhook trigger webhookshortname code trigger a webhook it requires a json body that will be turned over to the webhook urls telegram bot notifications out of the box the iot 433mhz provides notifications through email and through a telegram bot of course you re free to develop your own notification system using our webhooks api notifications should be enabled and configured through the code menu settings code page this is how the settings page looks like telegram settings https github com roccomuso iot 433mhz blob master other pics iot 433mhz telegram settings png raw true telegram settings by default there is a 5 second notification delay editable from code config json notificationdelay code so you won t be flooded by alarms signals the email notification system is under construction 18 https github com roccomuso iot 433mhz issues 18 android ios apps soon will be available the official app on both the stores pull requests if you submit a pull request thanks there are a couple rules to follow though to make it manageable the pull request should be atomic i e contain only one feature if it contains 
more please submit multiple pull requests reviewing massive 1000 loc pull requests is extremely hard likewise if for one unique feature the pull request grows too large more than 200 loc tests not included please get in touch first please stick to the current coding style it s important that the code uses a coherent style for readability do not include sylistic improvements housekeeping if you think one part deserves lots of housekeeping use a separate pull request so as not to pollute the code don t forget tests for your new feature inspiration inspired by pimatic homeduino https www npmjs com package pimatic homeduino dst dev this is a project in his beta stage documentation is under construction author rocco musolino roccomuso https twitter com roccomuso | iot arduino alarm telegram-bot node npm domotic raspberry-pi transmitter webhooks iot-433mhz radio-frequency nodejs | server |
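The API section above says every call must carry a Basic auth header whose value is the base64 encoding of `username:password` (default `root:root`; note that base64 is case-sensitive, so the default encodes to `cm9vdDpyb290`). A minimal Python sketch of building that header for any configured credentials (the helper name is mine, not part of iot-433mhz):

```python
import base64

def basic_auth_header(username, password):
    """Build the Authorization header value the iot-433mhz API expects."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# Default credentials from config.json; base64 is case-sensitive,
# so root:root encodes to cm9vdDpyb290.
print(basic_auth_header("root", "root"))
```

If you change the username and password in config.json, regenerate the header value the same way.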
blockchain-to-spreadsheet | What this is: a Chrome browser extension that can help readers contextualize news stories with hype about blockchain technology, by reminding you that blockchain is, in essence, a giant Excel spreadsheet. Why: for fun. How to install this extension in your Chrome browser: you can go get it from the official Chrome store (same tool, still free, fewer steps required) by going to this link (official Chrome store install, easier for novices): https chrome google com webstore detail replace blockchain with s johdgapbhomlhcflancninpeafocpopn hl en us gl us. Or you can do a manual install from this GitHub repo (also easy) by following these steps. Note: enabling developer-mode extensions in your browser, as needed for manual install, is risky; Chrome may remind you to be worried about that. To avoid those pop-up warnings, use the official Chrome store install above instead of this manual install. 1. Download this git repository to your computer by clicking "clone or download" and then "download zip". 2. Unzip (extract) the zip on your computer. 3. Go to this url in your Chrome browser: chrome://extensions. 4. Click to check the box for developer mode. 5. One of the new buttons that should appear is "load unpacked extension"; click it. 6. Choose the folder that the unzipped (extracted) files are in, probably called blockchain-to-spreadsheet-master. 7. The extension should now appear and the "enabled" box should be checked. Start browsing blockchain news to read about all the things a giant Excel spreadsheet can do. Example: here's a screenshot for y'all: https user images githubusercontent com 22127496 36832800 dde60102 1ce1 11e8 9559 4c4d93456376 png. Special thanks: Twitter user @mims (Christopher Mims) for the suggestion (https twitter com mims status 968967786130300928) to make a browser extension that replaces "blockchain" with "multiple copies of a giant Excel spreadsheet", for example "multiple copies of a giant Excel spreadsheet to revolutionize journalism". Thanks to Tom Maxwell for the tutorial and template code (https 9to5google com 2015 06 14 how to make a chrome extensions) used for this project. Enjoy! CBL | blockchain |
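The extension's core behavior, replacing every mention of "blockchain" with "multiple copies of a giant Excel spreadsheet", can be sketched in a few lines. The real extension is JavaScript; this is a language-neutral Python sketch of the same substitution rule, not the extension's actual code:

```python
import re

REPLACEMENT = "multiple copies of a giant Excel spreadsheet"

def contextualize(text):
    """Swap 'blockchain' (any casing) for the spreadsheet reminder."""
    return re.sub(r"blockchain", REPLACEMENT, text, flags=re.IGNORECASE)

print(contextualize("Blockchain to revolutionize journalism"))
```

Run against a headline, this yields exactly the kind of output the README jokes about: "multiple copies of a giant Excel spreadsheet to revolutionize journalism".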
|
sign-language-translator | how to build a neural network to translate sign language into english real time sign language translation using computer vision this repository includes all source code for the soon to be tutorial on digitalocean with the same title including a real time sign language translator based on a live feed utilities used for portions of the tutorial such as dataloaders simple convolutional neural network written in pytorch http pytorch org with pretrained model created by alvin wan http alvinwan com november 2019 img width 832 alt screen shot 2019 11 29 at 4 42 59 am src https user images githubusercontent com 2068077 69869958 2c266f00 1263 11ea 9dad d5f72b56d047 png img width 832 alt screen shot 2019 11 29 at 4 44 34 am src https user images githubusercontent com 2068077 69869959 2c266f00 1263 11ea 9aab 8af38c1a0946 png getting started for complete step by step instructions see the soon to be tutorial on digitalocean this codebase was developed and tested using python 3 6 if you re familiar with python then see the below to skip the tutorial and get started quickly optional setup a python virtual environment https www digitalocean com community tutorials common python tools using virtualenv installing with pip and managing packages a thorough virtualenv how to with python 3 6 1 install all python dependencies pip install r requirements txt 2 navigate into src cd src 3 launch the script for a sign language translator python step 5 camera py how it works see the below resources for explanations of related concepts understanding least squares http alvinwan com understanding least squares understanding neural networks http alvinwan com understanding neural networks acknowledgements these models are trained on a sign language mnist dataset curated by tecperson as published on kaggle https www kaggle com datamunge sign language mnist | ai |
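The Sign Language MNIST dataset mentioned above labels each image 0 to 25 in alphabetical order, with no samples for label 9 (J) or 25 (Z), since those signs involve motion that a still image cannot capture. A small sketch of the label-to-letter mapping a dataloader might use (this helper is illustrative and is not code from the repo):

```python
def label_to_letter(label):
    """Map a Sign Language MNIST label (0-25) to its ASL letter.

    Labels 9 (J) and 25 (Z) never appear in the data: both signs
    involve motion, which a single frame cannot represent.
    """
    if label in (9, 25):
        raise ValueError("J and Z are absent from Sign Language MNIST")
    if not 0 <= label <= 25:
        raise ValueError("label out of range")
    return chr(ord("A") + label)

print(label_to_letter(0), label_to_letter(10))  # A K
```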
|
generator-jhipster-ant-design | generator jhipster ant design npm version npm image npm url build status travis image travis url dependency status daviddm image daviddm url jhipster blueprint ant design system blueprint for jhipster client introduction this is a jhipster http jhipster github io blueprint that is meant to be used in a jhipster application prerequisites as this is a jhipster http jhipster github io blueprint we expect you have jhipster and its related tools already installed installing jhipster https jhipster github io installation html installation with yarn to install this blueprint bash yarn global add generator jhipster ant design to update this blueprint bash yarn global upgrade generator jhipster ant design with npm to install this blueprint bash npm install g generator jhipster ant design to update this blueprint bash npm update g generator jhipster ant design usage to use this blueprint run the below command bash jhipster blueprint ant design running local blueprint version for development during development of blueprint please note the below steps they are very important 1 link your blueprint globally note if you do not want to link the blueprint step 3 to each project being created use npm instead of yarn as yeoman doesn t seem to fetch globally linked yarn modules on the other hand this means you have to use npm in all the below steps as well bash cd ant design npm link 2 link a development version of jhipster to your blueprint optional required only if you want to use a non released jhipster version like the master branch or your own custom fork you could also use yarn for this if you prefer bash cd generator jhipster npm link cd ant design npm link generator jhipster 3 create a new folder for the app to be generated and link jhipster and your blueprint there bash mkdir my app cd my app npm link generator jhipster ant design npm link generator jhipster optional needed only if you are using a non released jhipster version jhipster d 
blueprint ant design license mit chiho sin https github com chihosin npm image https img shields io npm v generator jhipster ant design svg npm url https npmjs org package generator jhipster ant design travis image https travis ci org chihosin generator jhipster ant design svg branch master travis url https travis ci org chihosin generator jhipster ant design daviddm image https david dm org chihosin generator jhipster ant design svg theme shields io daviddm url https david dm org chihosin generator jhipster ant design | os |
|
DataEngineeringwithGCP | Data Engineering with Google Cloud Platform. This is the code repository for Data Engineering with Google Cloud Platform (https www packtpub com product data engineering with google cloud platform 9781800561328), published by Packt: a practical guide to operationalizing scalable data analytics systems on GCP. This is a copy of the GitHub repo https github com packtpublishing data engineering with google cloud platform tree main chapter 3 code. My comments on the development of this book: this entire project was worked on in GCP, so the creation of a virtual env (venv) in chapter 6, needed in order to work with Apache Beam, is missing here. In order to work in Composer, due to account problems, I had to create the environment using Terraform. | cloud |
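A recurring pattern in ETL pipelines like the ones this book builds is joining a fact table to its dimension tables (in BigQuery, a star-schema join) before analysis. As a toy illustration in plain Python, with made-up table and column names:

```python
# Hypothetical dimension table: zone_id -> zone name
zone_dim = {1: "JFK Airport", 2: "Midtown"}

# Hypothetical fact table: one row per trip
trips_fact = [
    {"pickup_zone_id": 1, "fare": 52.0},
    {"pickup_zone_id": 2, "fare": 11.5},
]

def enrich(trips, zones):
    """Join fact rows to the zone dimension, like a star-schema SQL join."""
    return [{**t, "pickup_zone": zones[t["pickup_zone_id"]]} for t in trips]

for row in enrich(trips_fact, zone_dim):
    print(row["pickup_zone"], row["fare"])
```

In the actual pipeline this join happens in BigQuery SQL over the warehouse tables; the sketch only shows the shape of the operation.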
|
embeddedObjectExtractor | embeddedObjectExtractor is an information extraction system designed to extract embedded objects, such as tables, from PDF documents. | os |
|
wiz | wiz ide wiz is ide for web development using angular more easy screenshot https github com season framework wiz blob main screenshot wiz gif installation install nodejs npm angular bash apt install nodejs npm npm i g n n stable install wiz python package bash pip install season install pip install season upgrade upgrade lastest usage create project and start web server bash cd workspace wiz create myapp cd myapp wiz run port 3000 http 127 0 0 1 3000 wiz on your web browser start server as daemon bash wiz server start start daemon server wiz server stop stop daemon server regist system service for linux bash run on wiz project root directory wiz service regist myapp wiz service start myapp upgrade ide from command line bash pip install season upgrade upgrade core wiz ide upgrade ide upgrade wiz cli create project wiz create project name example bash wiz create myapp build project wiz build project project name clean flag flag syntax description project wiz build project name build project default main clean wiz build clean clean build default false example bash wiz build project main wiz build project dev clean daemon api wiz run host host port port log log file path flag flag syntax description port wiz run action port port web server port default 3000 host wiz run action host host web server host default 0 0 0 0 log wiz run action log path log file path default none example bash wiz run port 3000 wiz run port 3000 host 0 0 0 0 wiz run port 3000 log wiz log wiz server action log log file path force action action syntax description start wiz server start flags start wiz server as daemon stop wiz server stop flags stop wiz server daemon restart wiz server restart flags restart wiz server daemon flag flag syntax description log wiz server action log path log file path default none force wiz server start force force start daemon example bash wiz server start force wiz server stop wiz server restart service api wiz service list example bash wiz service list wiz 
service regist name port same as install example bash wiz service regist myapp or wiz service install myapp src 3001 or wiz service install myapp bundle 3001 wiz service unregist name same as uninstall remove delete rm example bash wiz service unregist myapp or wiz service remove myapp wiz service status name example bash wiz service status myapp wiz service start name example bash wiz service start myapp wiz service start start all services wiz service stop name example bash wiz service stop myapp wiz service stop stop all services wiz service restart name example bash wiz service restart myapp wiz service restart restart all services bundle project wiz pkg project project name example bash wiz pkg bundle main project wiz pkg project main output workspace bundle file created after run bundle api run using command wiz run bundle or adding services using wiz service install myservice bundle version policy x y z x major update upgrade not supported y minor update support command upgrade core function changed required server restart z ui update support upgrade from web ui not required server restart release note 2 3 24 core command change bundle pkg 2 3 21 2 3 23 core change requirement for python old version support 2 3 20 plugin workspace create widget bug at portal module fixed 2 3 19 core add dependency flask socketio 2 3 18 core upgrade to angular 16 core color changed core add build command plugin workspace tree view component changed plugin git commit bug fixed 2 3 17 core wiz response stream api 2 3 16 plugin workspace bug at app create fixed 2 3 15 core cache added for wiz config 2 3 14 core cache added for wiz components model controller api 2 3 13 core bundle command added 2 3 12 core service command upgraded add bundle option 2 3 11 core service command upgraded add port option 2 3 10 core boot config changed 2 3 9 core boot config changed 2 3 8 plugin workspace portal framework widget create bug fixed 2 3 7 plugin workspace statusbar bug fixed 2 3 6 
plugin workspace npm plugin bug fixed core default plugin config bug fixed portal framework 2 3 5 core assets path bug fixed 2 3 4 core bundle path bug fixed 2 3 3 plugin workspace config list bug fixed 2 3 2 plugin workspace app json bug fixed 2 3 1 plugin workspace portal framework controller bug fixed 2 3 0 core move build logic to ide plugin core add bundle structure core localize angular cli core add linux service cli core add statusbar at bottom of ide plugin define model at plugin plugin workspace angular build logic changed plugin workspace integrated portal framework plugin at workspace plugin workspace build portal framework on builder model 2 2 x major issues ide overlay menu shortcut config plugin user customized plugin portal add portal framework plugin plugin workspace refresh list bug fixed core ide monaco editor bug fixed plugin workspace usability improvements plugin core auto complete keyword core toastr on build error plugin workspace hidden portal framework on route plugin workspace image viewer core angular version upgrade core typescript dependencies bug fixed 2 1 x major issues ide plugin concept changed ide layout changed ide config concept added plugin core move to app link in monaco editor plugin core add core plugins upgrade button plugin core add restart server button plugin workspace add app route editor service plugin workspace preview bug fixed plugin workspace page namespace bug fixed plugin workspace set default code if component ts not exists plugin workspace import create app bug fixed plugin core remove useless log plugin workspace config folder bug fixed plugin bug fixed remove unused file plugin workspace add route build plugin workspace remove useless log plugin core plugin updated core add lib plugin object command bug fixed 2 0 x major issues upgrade base project to angular 14 2 0 ui ux full changed drag and drop interface git branch to project multiple project in workspace enhanced ide plugin and easily develop 3rd party 
apps support pip and npm on ui ide socket auto install angular cli angular 15 flask response bug fixed on filesend wiz bundle mode update wiz server command multiprocess config bug fixed socketio bug fixed ide controller threading bug fixed flask socketio 1 0 x major issue clean code full changed ide remove season flask concept enhanced performance logging for wiz concept upgrade plugin structure config structure changed stable version for git merge add wiz server start log file method print bug fixed add daemon server command socket io transport server starting log auto remove invalid character on update wsgi bug fixed remove dukpy windows install bug support macosx 0 5 x support plugin storage port scan when wiz project created wiz based online plugin development env support programmable api for plugins remove useless resources socketio config config socketio py packages version bug fixed jinja2 werkzeug add src folder for tracing plugin code check installed function wiz installed forced dev mode in dev branch if not master wiz resource handler updated add function response flask resp and pil image at response add babel script option add wiz path function git merge bug fixed update wiz theme render logic git merge logic changed wiz instance as global in wiz api add match api at wiz instance 0 4 x integrate wiz season flask support git flow workspace structure changed base code workspace changed mysql to filesystem ui upgrade support installer developer production mode developer enabled socketio logger on every pages production disabled socketio logger dictionary bug fixed in app html history display ui changed workspace app browse in route workspace add cache clean in workspace git bug changed if author is not set default user to wiz full size log viewer keyword changed cache bug fixed socketio performance upgrade wiz js embeded wiz api js changed async mode 0 3 x add socket io framework on build command run modified add pattern ignores change framework object 0 
2 x framework structure upgraded command line tool function changed submodule structure added logging simplify public directory structure add response template from string function add response template function add variable expression change option interface loader update config onerror changed add response abort error handler in controller error response redirect update relative module path logger upgrade file trace bug fixed logger upgrade log executed file trace logger upgrade code trace error handler bug fixed apache wsgi bug fixed public app py apache wsgi bug fixed | python web-framework wsgi | front_end |
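The wiz version policy above (x: major update, upgrade not supported; y: minor update, upgrade via command, server restart required; z: UI update, upgrade from the web UI, no restart required) can be expressed as a small helper. A sketch for illustration, not part of wiz itself:

```python
def upgrade_kind(installed, target):
    """Classify a wiz upgrade according to the x.y.z version policy."""
    ox, oy, oz = (int(p) for p in installed.split("."))
    nx, ny, nz = (int(p) for p in target.split("."))
    if nx != ox:
        return "major"  # upgrade not supported
    if ny != oy:
        return "minor"  # command upgrade, server restart required
    if nz != oz:
        return "ui"     # upgrade from web UI, no restart required
    return "none"

print(upgrade_kind("2.3.23", "2.3.24"))  # ui
```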
junior | junior a framework for building html5 mobile apps with a native look and feel check out the github page http justspamjustin github com junior | front_end |
|
anchor | Anchor: the Solana Sealevel framework. Anchor is a framework for Solana's Sealevel runtime (https://medium.com/solana-labs/sealevel-parallel-processing-thousands-of-smart-contracts-d814b378192), providing several convenient developer tools for writing smart contracts: a Rust eDSL for writing Solana programs; IDL (Interface Description Language) specification; a TypeScript package for generating clients from IDL; a CLI and workspace management for developing complete applications. If you're familiar with developing in Ethereum's Solidity (https://docs.soliditylang.org/en/v0.7.4), Truffle (https://www.trufflesuite.com), and web3.js (https://github.com/ethereum/web3.js), then the experience will be familiar. Although the DSL syntax and semantics are targeted at Solana, the high-level flow of writing RPC request handlers, emitting an IDL, and generating clients from IDL is the same. Getting started: for a quickstart guide and in-depth tutorials, see the Anchor book (https://book.anchor-lang.com) and the older documentation (https://anchor-lang.com) that is being phased out. To jump straight to examples, go to https://github.com/coral-xyz/anchor/tree/master/examples. For the latest Rust and TypeScript API documentation, see docs.rs (https://docs.rs/anchor-lang) and the typedoc (https://coral-xyz.github.io/anchor/ts/index.html). Packages:

- anchor-lang: Rust primitives for writing programs on Solana
- anchor-spl: CPI clients for SPL programs on Solana
- anchor-client: Rust client for Anchor programs
- @coral-xyz/anchor: TypeScript client for Anchor programs
- @coral-xyz/anchor-cli: CLI to support building and managing an Anchor workspace

Note: Anchor is in active development, so all APIs are subject to change. This code is unaudited; use at your own risk. Examples: here's a counter program where only the designated authority can increment the count:

```rust
use anchor_lang::prelude::*;

declare_id!("Fg6PaFpoGXkYsidMpWTK6W2BeZ7FEfcYkg476zPFsLnS");

#[program]
mod counter {
    use super::*;

    pub fn initialize(ctx: Context<Initialize>, start: u64) -> Result<()> {
        let counter = &mut ctx.accounts.counter;
        counter.authority = *ctx.accounts.authority.key;
        counter.count = start;
        Ok(())
    }

    pub fn increment(ctx: Context<Increment>) -> Result<()> {
        let counter = &mut ctx.accounts.counter;
        counter.count += 1;
        Ok(())
    }
}

#[derive(Accounts)]
pub struct Initialize<'info> {
    #[account(init, payer = authority, space = 48)]
    pub counter: Account<'info, Counter>,
    pub authority: Signer<'info>,
    pub system_program: Program<'info, System>,
}

#[derive(Accounts)]
pub struct Increment<'info> {
    #[account(mut, has_one = authority)]
    pub counter: Account<'info, Counter>,
    pub authority: Signer<'info>,
}

#[account]
pub struct Counter {
    pub authority: Pubkey,
    pub count: u64,
}
```

For more, see the examples (https://github.com/coral-xyz/anchor/tree/master/examples) and tests (https://github.com/coral-xyz/anchor/tree/master/tests) directories. License: Anchor is licensed under Apache 2.0. Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Anchor by you, as defined in the Apache 2.0 license, shall be licensed as above, without any additional terms or conditions. Contribution: thank you for your interest in contributing to Anchor. Please see CONTRIBUTING.md to learn how. | solana rust blockchain smart-contracts coral | blockchain |
python-graphenelib | Python library for Graphene. Current build status: Travis CI (master), Read the Docs (master), Codecov, Code Climate (maintainability, test coverage), pyup.io. Documentation: visit the pygraphenelib website (http://python-graphenelib.readthedocs.io/en/latest/) for in-depth documentation on this Python library. Installation: install with pip3:

```bash
sudo apt-get install libffi-dev libssl-dev python-dev
pip3 install graphenelib
```

Manual installation:

```bash
git clone https://github.com/xeroc/python-graphenelib
cd python-graphenelib
python3 setup.py install --user
```

Upgrade:

```bash
pip install --user --upgrade graphenelib
```

Contributing: python-bitshares welcomes contributions from anyone and everyone. Please see our guidelines for contributing (CONTRIBUTING.md) and the code of conduct (CODE_OF_CONDUCT.md). Discussion and developers: discussions around development and use of this library can be found in a dedicated Telegram channel (https://t.me/pybitshares). License: a copy of the license is available in the repository's LICENSE.txt file. | blockchain python-library cryptocurrency api serialization-library wallet | blockchain |
NLP | nlp nlp pku msr 1 2 3 n gram 4 5 6 7 w wcrf msr f1 95 7 nlp wcrf 8 | bayes ngram mm memm crf | ai |
dumb | dumb with the massive daily increase of useless scripts on genius s web frontend and having to download megabytes of clutter dumb https github com rramiachraf dumb tries to make reading lyrics from genius a pleasant experience and as lightweight as possible a href https codeberg org rramiachraf dumb img src https img shields io badge codeberg 232185d0 a screenshot https raw githubusercontent com rramiachraf dumb main screenshot png installation usage go 1 18 https go dev dl is required bash git clone https github com rramiachraf dumb cd dumb go build dumb the default port is 5555 you can use other ports by setting the port environment variable public instances url region cdn operator https dm vern cc us no https vern cc https sing whatever social us de yes whatever social https dumb lunar icu de yes maximiliangt500 https dumb privacydev net fr no https privacydev net tor url operator http dm vernccvbvyi5qhfzyqengccj7lkove6bjot2xhh5kajhwvidqafczrad onion https vern cc http dumb g4c3eya4clenolymqbpgwz3q3tawoxw56yhzk4vugqrl6dtu3ejvhjid onion https privacydev net i2p url operator http vernxpcpqi2y4uhu7to4rnjmyjjgzh3x3qxyzpmkhykefchkmleq b32 i2p https vern cc for people who might be capable and interested in hosting a public instance feel free to do so and don t forget to open a pull request so your instance can be included here contributing contributions are welcome license mit https github com rramiachraf dumb blob main licence | alternative-frontends alternative | front_end |
dso-toolkit | This project is using percy.io for visual regression testing. DSO Toolkit: design system of the Digitaal Stelsel Omgevingswet. DSO (Digitaal Stelsel Omgevingswet, translated) stands for the digital system for the Environment and Planning Act of the Netherlands. The DSO Toolkit consists of documentation and a style guide. In addition, two implementations are provided: CSS and web components. The web components for Angular and React get wrappers (see issue 915). Getting started: see https://www.dso-toolkit.nl for up-to-date documentation. npm registry:

```bash
npm install dso-toolkit --save-dev
```

Bundle CSS: import or bundle `dso-toolkit/dist/dso.css`. CDN: the toolkit and component library are distributed to dso-toolkit.nl. Use the list below to resolve the branch/channel to the base URL:

- master (stable): https://cdn.dso-toolkit.nl/master
- tags (only releases): https://cdn.dso-toolkit.nl/{version}

The same goes for the component library:

- master (stable): https://www.dso-toolkit.nl/master
- tags (only releases): https://www.dso-toolkit.nl/{version}

```html
<link rel="stylesheet" href="https://cdn.dso-toolkit.nl/{master|version}/dso.css">
```

For web components:

```html
<script type="module" src="https://cdn.dso-toolkit.nl/{master|version}/core/dso-toolkit.esm.js"></script>
<script nomodule src="https://cdn.dso-toolkit.nl/{master|version}/core/dso-toolkit.js"></script>
```

The referenced scripts are very small; only the actually used web components are lazy-loaded. For more information: https://stenciljs.com/docs/distribution. Develop: to work on the DSO Toolkit, use components and variants, or create mockups of pages, forms or components, you need Node 18 and Yarn; see CONTRIBUTING.md on how to contribute. Either install Yarn with `npm install --global yarn` or use Yarn with npx (`npx yarn <my commands here>`).

```bash
git clone git@github.com:dso-toolkit/dso-toolkit.git
cd dso-toolkit
yarn install
```

Environments: depending on the work being done, development can be done in several environments. Development: this environment is used to develop new components in Storybook. Storybook is built around stories, and since this project has multiple Storybooks (one for each implementation), the easiest way to start this environment is with one of the following commands: `yarn start`, `yarn start:react`, `yarn start:angular`, `yarn start:all`. This will run the corresponding Storybook(s); since these commands contain a colon, they can be run from anywhere in the project. The following processes are started:

- default: CSS in watch mode, Stencil in watch mode, Storybook, and Cypress
- react: CSS in watch mode, Stencil in watch mode, Storybook for React components
- angular: CSS in watch mode, Stencil in watch mode, Storybook for Angular components
- all: CSS in watch mode, Stencil in watch mode, Storybook, and Storybooks for React and Angular components

This will start Stencil on http://localhost:45333, Storybook on http://localhost:45000, and the Cypress GUI. Since Stencil and Storybook are running, it's possible to develop the components, but keep in mind the tests run in a production environment; this means no Stencil development tools like HMR. Leaflet: development of Leaflet plugins is package-transcendent. Run `yarn start:leaflet` or `yarn start:react-leaflet` from root. This will start Stencil (http://localhost:45333) and Storybook (http://localhost:45000) in production (no live reload/HMR) and the Leaflet plugins development environment on http://localhost:41234, or the React Leaflet development environment on http://localhost:42345. Requirements: Node 18; for development on the DSO Toolkit you also need Yarn. Ports used during development:

- 41234: Leaflet plugins dev app
- 42345: React Leaflet plugins dev app
- 43300: Docusaurus
- 45333: Stencil
- 45000: Storybook for HTML/CSS web components
- 56406: Storybook for React components
- 46006: Storybook for Angular components

| os |
Leveraging-cache-and-MessagingQueue-to-scale-BlockchainNetwork | warning this repository is no longer maintained warning this repository will not be updated the repository will be kept available in read only mode leveraging the cache and messaging queue to scale a blockchain network read this in other languages readme ko md in this step we will configure redis and rabbitmq cluster in our architecture to control the flow of incoming request to blockchain network with the direct use of rest api calls it is not possible to control the number of requests sent to blockchain network this might cause errors such as read write conflicts etc in order to control the flow of request sent to blockchain network and scale our application we will use rabbitmq cluster with 3 nodes consisting of mirrored queues to queue the user requests and redis cluster with 6 nodes 3 leaders and 3 followers where results of execution are store for a short duration in architecture diagram we have rabbitmq producer present in api containers that queue the requests to rabbitmq cluster and rabbitmq consumers configured with an instance of fabric node sdk in task execution containers to consume the requests from users and send it blockchain network for execution you will find the configuration code for redis in backend utils util js you will find the configuration code for rabbitmq in rabbitclient utils util js included components hyperledger fabric docker hyperledger fabric sdk for node js application workflow diagram application workflow images arch png 1 issue a git clone https github com ibm leveraging cache and messagingqueue to scale blockchainnetwork 2 issue the command build sh to setup the network prerequisites docker https www docker com products v1 13 or higher docker compose https docs docker com compose overview v1 8 or higher steps 1 run build sh script to build and start the network 1 run the build sh script 2 check the logs to see the results 2 check the logs 3 test the blockchain 
network 3 test the blockchainnetwork 1 run the build sh script this accomplishes the following a clean up system by removing any existing blockchain docker images b generate certificates the crypto config yaml crypto configuration file defines the identity of who is who it tells peers and orderers what organization they belown to and what domain they belong to c create peers orderers and channel the configtx yaml file initializes a blockchain network or channel and services with an orderer genesis block which serves as the first block on a chain additionally membership services are installed on each channel peer in this case the shop and fitcoin peers d build docker images of the orderer peers channel network open a new terminal and run the following command bash export fabric cfg path pwd chmod x cryptogen chmod x configtxgen chmod x generate certs sh chmod x generate cfgtx sh chmod x docker images sh chmod x build sh chmod x clean sh build sh 2 check the logs command bash docker logs blockchain setup output bash ca registration complete ca registration complete default channel not found attempting creation successfully created a new default channel joining peers to the default channel chaincode is not installed attempting installation base container image present info packager golang js packaging golang from bcfit info packager golang js packaging golang from bcfit successfully installed chaincode on the default channel successfully instantiated chaincode on all peers blockchain network setup complete command bash docker ps output bash f4ddfcb1e4d8 haproxy 1 7 docker entrypoint 5 minutes ago up 5 minutes 0 0 0 0 3000 3000 tcp rabbitclient 5f40495511f1 backend node index js 5 minutes ago up 5 minutes fitcoin fitcoin backend 1 6ea304c78c40 backend node index js 5 minutes ago up 5 minutes 0 0 0 0 3030 3030 tcp fitcoin shop backend 1 ef481c334532 rabbit client node index js 5 minutes ago up 5 minutes 0 0 0 0 3003 3000 tcp rabbitclient3 51b7c6cee311 rabbit client node 
index js 5 minutes ago up 5 minutes 0 0 0 0 3002 3000 tcp rabbitclient2 1195c7cb43ca rabbit client node index js 5 minutes ago up 5 minutes 0 0 0 0 3001 3000 tcp rabbitclient1 15533fe3a151 redis server docker entrypoint 5 minutes ago up 5 minutes 6379 tcp 0 0 0 0 7000 7005 7000 7005 tcp fitcoin redis server 1 556840abfc4d dev shop peer bcfit 1 0e0d4e71de9ac7df4d0d20dfcf583e3e63227edda600fe338485053387e09c50 chaincode peer add 6 minutes ago up 6 minutes dev shop peer bcfit 1 8c594ddc16f4 haproxy 1 7 docker entrypoint 6 minutes ago up 6 minutes 0 0 0 0 5672 5672 tcp 0 0 0 0 15672 15672 tcp rabbitmq c59da84a4e7c rabbitmq 3 management usr local bin clus 6 minutes ago up 6 minutes 4369 tcp 5671 5672 tcp 15671 15672 tcp 25672 tcp rabbitmq2 f07024afd0f1 rabbitmq 3 management usr local bin clus 6 minutes ago up 6 minutes 4369 tcp 5671 5672 tcp 15671 15672 tcp 25672 tcp rabbitmq3 7ef2085afd54 rabbitmq 3 management docker entrypoint s 6 minutes ago up 6 minutes 4369 tcp 5671 5672 tcp 15671 15672 tcp 25672 tcp rabbitmq1 2a775a81c967 blockchain setup node index js 7 minutes ago up 7 minutes 3000 tcp blockchain setup 90136f4c90fe fitcoin peer peer node start 7 minutes ago up 7 minutes 0 0 0 0 8051 7051 tcp 0 0 0 0 8053 7053 tcp fitcoin peer 19e4890f71e3 shop peer peer node start 7 minutes ago up 7 minutes 0 0 0 0 7051 7051 tcp 0 0 0 0 7053 7053 tcp shop peer 654ada9fbbf6 ishangulhane fabric couchdb tini docker ent 7 minutes ago up 7 minutes 4369 tcp 9100 tcp 0 0 0 0 9984 5984 tcp shop statedb b19022ef3b2a ishangulhane fabric couchdb tini docker ent 7 minutes ago up 7 minutes 4369 tcp 9100 tcp 0 0 0 0 5984 5984 tcp ca datastore 6360ff012bbd fitcoin ca fabric ca server st 7 minutes ago up 7 minutes 0 0 0 0 8054 7054 tcp fitcoin ca 9d06dd0a009d orderer peer orderer 7 minutes ago up 7 minutes 0 0 0 0 7050 7050 tcp orderer0 0de13cd1ba31 shop ca fabric ca server st 7 minutes ago up 7 minutes 0 0 0 0 7054 7054 tcp shop ca 9dba93e63b5c ishangulhane fabric couchdb tini docker ent 7 
minutes ago up 7 minutes 4369 tcp 9100 tcp 0 0 0 0 8984 5984 tcp fitcoin statedb command bash docker logs fitcoin fitcoin backend 1 output ca registration complete ca registration complete ca registration complete x awaiting rpc requests on clientclient0 x awaiting rpc requests on clientclient2 x awaiting rpc requests on clientclient1 command bash docker logs fitcoin shop backend 1 output ca registration complete ca registration complete starting socker server x awaiting rpc requests on clientclient0 3 test the blockchainnetwork in a separate terminal navigate to testapplication folder and run the following command npm install node index js navigate to url to view the blockchain blocks http localhost 8000 history html blocks images blocks png now navigate to url to perform operations on network http localhost 8000 test html note for this application the user queue value can be either user queue or seller queue sample enroll user request blocks images enroll png sample query request blocks images query user png sample invoke request blocks images invoke user png additional resources hyperledger fabric docs https hyperledger fabric readthedocs io en latest hyperledger composer docs https hyperledger github io composer latest introduction introduction html license this code pattern is licensed under the apache software license version 2 separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses contributions are subject to the developer certificate of origin version 1 1 dco https developercertificate org and the apache software license version 2 https www apache org licenses license 2 0 txt apache software license asl faq https www apache org foundation license faq html whatdoesitmean | ibmcode blockchain-network rabbitmq-cluster hyperledger-fabric | blockchain |
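The record above describes queuing user requests through RabbitMQ consumers and caching execution results in Redis so that the flow of requests into the blockchain network can be bounded. The same flow-control idea can be sketched, purely for illustration, with Python's standard library standing in for RabbitMQ, Redis, and the Fabric SDK; every name here is hypothetical and not part of the project's actual code.

```python
import queue
import threading

# In-process stand-ins for the architecture described above: a queue of user
# requests (RabbitMQ), a bounded pool of workers (consumer containers), and a
# shared result store (Redis).
request_queue = queue.Queue()
result_cache = {}                 # stand-in for the Redis result store
cache_lock = threading.Lock()

def execute_on_network(request_id, payload):
    # Placeholder for the Fabric SDK call a consumer container would make.
    return {"request": request_id, "result": payload.upper()}

def worker():
    while True:
        item = request_queue.get()
        if item is None:          # sentinel: shut this worker down
            request_queue.task_done()
            break
        request_id, payload = item
        outcome = execute_on_network(request_id, payload)
        with cache_lock:
            result_cache[request_id] = outcome
        request_queue.task_done()

# Three workers, mirroring the three RabbitMQ consumer clients in the setup,
# bound how many requests hit the "network" concurrently.
workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()

for i, payload in enumerate(["enroll", "query", "invoke"]):
    request_queue.put((i, payload))
for _ in workers:
    request_queue.put(None)       # one sentinel per worker
for w in workers:
    w.join()

print(result_cache[2]["result"])  # INVOKE
```

Because producers only ever enqueue, the worker-pool size (not the incoming request rate) determines the load on the backend, which is the point of putting the message queue in front of the network.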
design-ui-kit | UI Kit Italia: the country's design system resource for building and prototyping websites and digital services of the Italian public administration. Hello! This is version 3 of UI Kit Italia, the official resource of the country's design system: the set of foundations and components that help designers design and prototype public administration websites and digital services that comply with regulations and simplify citizens' lives. Find out how to get started with the kit (https://designers.italia.it/design-system/come-iniziare/per-designer/) and explore the consistent development resources and the official design system documentation (https://designers.italia.it/design-system/) on the new Designers Italia website. UI Kit Italia is available from the Figma community (figma.com/@designersitalia) and can also be downloaded locally, including in Sketch format, from this repository (italia/design-ui-kit, https://github.com/italia/design-ui-kit). The design tokens of UI Kit Italia (https://designers.italia.it/design-system/fondamenti/design-tokens/) are available for development in CSS and SCSS formats from the italia/design-tokens-italia repository (https://github.com/italia/design-tokens-italia). Find out how to contribute to the design (https://designers.italia.it/design-system/come-contribuire/per-il-design/). To take part with reports or contributions, you can open an issue (https://github.com/italia/design-ui-kit/issues/new) or explore ongoing activities and issues (https://github.com/italia/design-ui-kit/issues). This repository hosts the snapshot of UI Kit Italia v3 in its native Figma format (.fig), available for viewing, copying for use, and remixing, also through the project's Designers Italia Figma community page (https://figma.com/@designersitalia). The conversion to the open Sketch format (https://github.com/sketch-hq/sketch-document) is performed automatically using the current (latest) version of the fig2sketch converter (https://github.com/sketch-hq/fig2sketch). This conversion should be considered in a testing state: some features in the Sketch file may be missing and/or incorrect. The history of the previous, deprecated v2 native Sketch version of the UI Kit is preserved, together with other archive materials such as the study for the evolution of the SPID user experience, in the 2.x branch (https://github.com/italia/design-ui-kit/tree/2.x). | sketch ui-components design-system figma gov government | os |
elastic-cloud-engineering | elastic cloud engineering demo repository this is the demo repository for my elastic cloud engineeering talk at the aws meetup in amsterdam installing dependencies bundle install build cloudformation template rake | cloud |
Workflow | workflow project for web information technologies info30005 semester 1 2018 made by armaan mcleod https github com opticgenius wei how ng https github com ngweihow steven tang https github com yc built with node js express mongodb react redux react admin credentials for demo user type username password admin admin admin manager jsmith manager user barryw berry configuration development api server ensure that your version of node js supports async await get packages using npm install npm install prefix app and npm install prefix admin for the server frontend app and admin interface respectively install mongo server start a mongo instance using npm run localdb or equivalent a mongod instance could have also been started automatically run npm run watch and npm run devstart to start the server in the development context on port 5000 of localhost use npm run prettify to prettify javascript code it may be necessary to run sudo npm g install prettier to install prettier note sudo service stop mongodb may be required on certain linux distributions to free up the mongo port when running npm run localdb development react admin app ensure that you re in the app or admin directory ensure that the api server is started add some sample data with npm run populate run npm install run npm start production method 1 docker manual deploy set the react app api url environment variable in the dockerfile to reflect the setup of your api server before running the build use sudo docker build t flow to build a docker image and subsequently deploy the image method 2 docker heroku build automated build and deploy set the react app api url environment variable in the dockerfile to reflect the setup of your api server before running the build use heroku stack set container a app name to set the app type before pushing the code if using automated builds method 3 npm manual build deploy install dependencies and run npm run build to build the app without docker ensure that the react app 
api url environment variable is set and available during the build process as it is used by the frontend apps to determine the prefix of each request url deploy files in the generated build directory and use npm start to run the app additionally please set the following environment variables and make them available to the deployed server mongo url connection url of mongo instance session secret hashes user sessions cors origin origin protocol hostname port of frontend port port that api server should start on if applicable tests coverage npm test run npm run report to generate coverage report license mit unless otherwise specified workflow workflow png https github com yc info30005 | server |
text-splitter | text splitter docs https docs rs text splitter badge svg https docs rs text splitter licence https img shields io crates l text splitter https github com benbrandt text splitter blob main license txt crates io https img shields io crates v text splitter https crates io crates text splitter codecov https codecov io github benbrandt text splitter branch main graph badge svg token tuf1iai7g7 https codecov io github benbrandt text splitter rust crate text splitter https crates io crates text splitter python bindings semantic text splitter https pypi org project semantic text splitter unfortunately couldn t acquire the same package name large language models llms can be used for many tasks but often have a limited context size that can be smaller than documents you might want to use to use documents of larger length you often have to split your text into chunks to fit within this context size this crate provides methods for splitting longer pieces of text into smaller chunks aiming to maximize a desired chunk size but still splitting at semantically sensible boundaries whenever possible get started by number of characters rust use text splitter characters textsplitter maximum number of characters in a chunk let max characters 1000 default implementation uses character count for chunk size let splitter textsplitter default optionally can also have the splitter trim whitespace for you with trim chunks true let chunks splitter chunks your document text max characters with huggingface tokenizer requires the tokenizers feature to be activated rust use text splitter textsplitter can also use anything else that implements the chunksizer trait from the text splitter crate use tokenizers tokenizer let tokenizer tokenizer from pretrained bert base cased none unwrap let max tokens 1000 let splitter textsplitter new tokenizer optionally can also have the splitter trim whitespace for you with trim chunks true let chunks splitter chunks your document text max tokens 
with tiktoken tokenizer requires the tiktoken rs feature to be activated rust use text splitter textsplitter can also use anything else that implements the chunksizer trait from the text splitter crate use tiktoken rs cl100k base let tokenizer cl100k base unwrap let max tokens 1000 let splitter textsplitter new tokenizer optionally can also have the splitter trim whitespace for you with trim chunks true let chunks splitter chunks your document text max tokens using a range for chunk capacity you also have the option of specifying your chunk capacity as a range once a chunk has reached a length that falls within the range it will be returned it is always possible that a chunk may be returned that is less than the start value as adding the next piece of text may have made it larger than the end capacity rust use text splitter characters textsplitter maximum number of characters in a chunk will fill up the chunk until it is somewhere in this range let max characters 500 2000 default implementation uses character count for chunk size let splitter textsplitter default with trim chunks true let chunks splitter chunks your document text max characters method to preserve as much semantic meaning within a chunk as possible a recursive approach is used starting at larger semantic units and if that is too large breaking it up into the next largest unit here is an example of the steps used 1 split the text by a given level 2 for each section does it fit within the chunk size yes merge as many of these neighboring sections into a chunk as possible to maximize chunk length no split by the next level and repeat the boundaries used to split the text if using the top level chunks method in descending length 1 descending sequence length of newlines newline is r n n or r each unique length of consecutive newline sequences is treated as its own semantic level 2 unicode sentence boundaries https www unicode org reports tr29 sentence boundaries 3 unicode word boundaries https www 
unicode org reports tr29 word boundaries 4 unicode grapheme cluster boundaries https www unicode org reports tr29 grapheme cluster boundaries 5 characters splitting doesn t occur below the character level otherwise you could get partial bytes of a char which may not be a valid unicode str note on sentences there are lots of methods of determining sentence breaks all to varying degrees of accuracy and many requiring ml models to do so rather than trying to find the perfect sentence breaks we rely on unicode method of sentence boundaries which in most cases is good enough for finding a decent semantic breaking point if a paragraph is too large and avoids the performance penalties of many other methods inspiration this crate was inspired by langchain s textsplitter https python langchain com en latest modules indexes text splitters examples recursive text splitter html but looking into the implementation there was potential for better performance as well as better semantic chunking a big thank you to the unicode rs team for their unicode segmentation https crates io crates unicode segmentation crate that manages a lot of the complexity of matching the unicode rules for words and sentences | ai |
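The greedy merge step described in the text-splitter record above (split at a semantic level, then pack neighboring sections into a chunk while they still fit) can be sketched in plain Python. This is an illustrative simplification at the word level only, with made-up names, not the crate's actual implementation.

```python
def chunk_by_words(text, max_characters):
    """Greedily pack whitespace-separated words into chunks of at most
    max_characters characters. A single over-long word becomes its own
    chunk, mirroring the fallback to a finer semantic level."""
    chunks = []
    current = ""
    for word in text.split():
        candidate = word if not current else current + " " + word
        if len(candidate) <= max_characters:
            # The merged section still fits: keep growing this chunk.
            current = candidate
        else:
            # Adding this word would overflow: emit the chunk and start fresh.
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

chunks = chunk_by_words("the quick brown fox jumps over the lazy dog", 15)
print(chunks)  # ['the quick brown', 'fox jumps over', 'the lazy dog']
```

A real implementation recurses: a section that does not fit at one level is re-split at the next finer level (newlines, then sentences, words, grapheme clusters, characters), which is what keeps chunk boundaries semantically sensible.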
IOE-Question-Bank | IOE Question Bank: a project for the Database Management System course at IOE, Pulchowk Campus. This project provides the user with a simple search interface to look up links to question papers of Electronics and Communication Engineering from the years 2070 to 2075 B.S. Contributors: Amrita Thakur, Pujan Budhathoki, Sarmila Upreti, and Shirish Shrestha. | server |
Embedded-Systems-Practical | Your EmSys logbook. In this repository you can commit your lab assignments for assessment. In each of the lab subfolders you can include: Arduino sketches, if relevant for the assignment; screenshots, which can be displayed in the README.md file; bits of text or measurement data, again if relevant for the assignment. You will find a subdirectory for each of the labs: lab1, lab2, lab3, lab4, lab5, lab6. GLHF! | os |