{"package": "0", "pacakge-description": "UNKNOWN"} {"package": "0-._.-._.-._.-._.-._.-._.-0", "pacakge-description": "UNKNOWN"} {"package": "000", "pacakge-description": "No description available on PyPI."} {"package": "00000", "pacakge-description": "welcome to my package"} {"package": "0000000", "pacakge-description": "welcome to my package"} {"package": "00000000", "pacakge-description": "welcome to my package"} {"package": "000000000000000000000000000000000000000000000000000000000", "pacakge-description": "welcome to my package"} {"package": "00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000", "pacakge-description": "welcome to my package"} {"package": "0.0.1", "pacakge-description": "# A lib for creating tfrecords## TODO:python 2.7 support.create a cmd interface."} {"package": "00101s", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "00print_lol", "pacakge-description": "No description available on PyPI."} {"package": "00SMALINUX", "pacakge-description": "No description available on PyPI."} {"package": "0101", "pacakge-description": "No description available on PyPI."} {"package": "01changer", "pacakge-description": "UNKNOWN"} {"package": "01d61084-d29e-11e9-96d1-7c5cf84ffe8e", "pacakge-description": "docstestspackageAn example package. 
Generated with cookiecutter-pylibrary. Free software: BSD 2-Clause License. Installation: pip install 01d61084-d29e-11e9-96d1-7c5cf84ffe8e. Documentation: https://01d61084-d29e-11e9-96d1-7c5cf84ffe8e.readthedocs.io/. Development: To run all the tests run: tox. Note: to combine the coverage data from all the tox environments run, on Windows, set PYTEST_ADDOPTS=--cov-append\ntox and otherwise PYTEST_ADDOPTS=--cov-append tox"} {"package": "01-distributions", "pacakge-description": "No description available on PyPI."} {"package": "01memories", "pacakge-description": "Digital Memories (01memories). Digital Memories is a Python-based digital photo frame application. It is capable of displaying photos and playing videos from local storage as well as from WebDAV and rclone repositories. Digital Memories has been designed to run slideshows from image and video repositories with several thousand files. No conversion is required. Files remain in your repositories and fully under your control. Files in slideshows can be dynamically arranged and filtered based on their metadata (EXIF and IPTC metadata supported). Slideshows can be run continuously or scheduled. Digital Memories supports reverse geocoding based on GPS data in the EXIF tag, using the geopy library and the Photon geocoder (essentially OpenStreetMap). Digital Memories optionally integrates with Home Assistant via MQTT. Integration allows the display to be motion activated after coupling of the Digital Memories device with a motion sensor. Digital Memories is being developed by Bernd Kalbfuss (aka langweiler) and is published under the General Public License version 3. The latest source code is available on GitHub. Instructions for building your own digital photo frame can be found here. Dependencies: Digital Memories requires Python 3 to run.
It has been developed with Python version 3.10 on Ubuntu Linux, but may run with earlier versions and on different operating systems. Digital Memories requires the following Python packages to be installed: exifread, ffmpeg-python, geopy, IPTCInfo3, Kivy, paho-mqtt, pillow, pyyaml, rclone-python, schedule, SQLAlchemy, webdavclient3. All packages are available on pypi.org and can be installed using the \"pip install\" (or \"pip3 install\") command. Where possible/available, packages should be installed using the distribution package manager (e.g. \"apt\" on Debian/Ubuntu). Digital Memories further requires the following (non-Python) libraries to be installed: libxslt1.1, libmtdev1, libsqlite3-0, libsdl2-2.0-0, ffmpeg. Libraries should be installed using the distribution package manager. Note that Digital Memories requires the X window system and a desktop environment to be installed. Digital Memories will in principle also run under Wayland, but the display will not be turned off automatically since Wayland does not fully implement the \"xset\" command. Installation: Digital Memories is still in early development. The easiest way to install the latest version is to clone the GitHub repository using the git client. After having installed the git client, installation of Digital Memories becomes as simple as: $ git clone git@github.com:kalbfuss/01memories.git. The command installs the Digital Memories sources in the sub-directory \"pyframe\" within the current working directory. Digital Memories can be updated to the latest version by changing into the \"pyframe\" directory and issuing the following commands: $ cd pyframe\n$ git pull origin master. At this stage of the project you should not expect the configuration syntax to be stable. Please have a look at the documentation after each update and adjust the configuration as necessary. Configuration: The Digital Memories application is configured via a single YAML configuration file. The file is named \"config.yaml\" and must be stored in the current (working) directory.
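As an illustration of the kind of configuration check described here, the sketch below validates a parsed config.yaml dict. This is a hypothetical helper, not the application's actual validation code; it assumes the YAML has already been parsed (e.g. with yaml.safe_load) into a plain dict.

```python
# Illustrative sketch only -- not the application's actual validation code.
# Assumes config.yaml has already been parsed into a plain dict.

def validate_config(cfg):
    '''Return a list of problems found in a parsed config.yaml dict.'''
    problems = []
    # The 'repositories' and 'slideshows' sections are required and must
    # each contain at least one definition.
    for section in ('repositories', 'slideshows'):
        entries = cfg.get(section)
        if not isinstance(entries, dict) or not entries:
            problems.append(f'section {section!r} is missing or empty')
    # Every repository needs a 'type'; local repositories also need a 'root'.
    for name, repo in (cfg.get('repositories') or {}).items():
        if not isinstance(repo, dict) or 'type' not in repo:
            problems.append(f'repository {name!r} has no type')
        elif repo['type'] == 'local' and 'root' not in repo:
            problems.append(f'local repository {name!r} has no root')
    return problems
```

An empty list means the configuration passed these (simplified) checks; otherwise each entry describes one problem, mirroring the warnings the application prints at startup.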
The following sections provide examples for configuration and the documentation of all parameters. A lot of effort has gone into configuration checks. The application should warn you in the event of invalid configurations immediately after startup. It is thus safe to explore the various configuration options. Under no circumstances is Digital Memories going to modify any of your image or video files. Examples: Simple configuration: In this example, we want to continuously show all files stored in a local directory. For this purpose, we configure a single local repository (\"Local storage\"). Our files are stored under the relative path \"./local/photos\". We further define a single slideshow (\"Favorites\") containing all files from the repository. Files are shown in a random sequence for a duration of 60 s. Per the application default settings, the repository is indexed once after start of the application. The slideshow includes photos and videos. The slideshow starts playing after start of the application and the display is always on. repositories:\n  # Local repository with our favorite photos and videos.\n  Local storage:\n    type: local\n    root: ./local/photos\nslideshows:\n  # Slideshow with our favorite photos and videos.\n  Favorites:\n    repositories: Local storage\n    pause: 60\n    sequence: random\nAdvanced configuration: In this example, we want to show our most recent photos stored in the cloud in the period from 8:00 to 10:00 and our favorite photos, which are stored locally, in the period from 18:00 to 20:00. Since we are not necessarily at home in the evening, we want the display to be motion activated during this time. Firstly, we define two (enabled) repositories: a local repository (\"Local storage\") with files stored under the relative path \"./local/photos\" and a WebDAV repository (\"Cloud storage\") with files stored in the cloud. A third repository (\"Test repository\") used for testing has been disabled.
Per the repository default settings, the index of the local repository is updated at the start of the application and every 24 hours. The index of the cloud repository is updated daily at 23:00.Secondly, we define two slideshows: The first slideshow (\"Favorites\") includes files tagged as \"favorites\" from the local repository. Files are shown for a duration of 60 s. The second slideshow (\"Recent\") includes the 200 most recent files from the cloud repository, which are not tagged as \"vacation\" or \"favorites\". We further limit files to \"images\". Files are sorted by the creation date in ascending order. Per the slideshow defaults, images are shown for a duration of 180 s.The slideshow defaults further ensure that files tagged as \"private\" are always excluded. Files are labeled with their description from the file metadata (if available) and labels are shown for a duration of 60 s at the start and the end of each file. Since the display is installed in vertical orientation, we rotate the content by -90\u00b0 and limit content to files in portrait orientation.Thirdly, we define a schedule to show the second slideshow (\"Recent\") in the time from 8:00 to 10:00 and the first slideshow (\"Favorites\") in the time from 18:00 to 20:00. In the first case, the display is always on. In the second case, the display is motion activated with a timeout interval of 300 s.Finally, since we run Home Assistant and need the MQTT remote control for the motion activation feature, we configure an MQTT client connection. 
For the motion activation feature to function properly, we further have to link the touch button with a motion sensor in Home Assistant (see [motion activation](#Motion activation)). repositories:\n  # Local repository with our favorite photos and videos.\n  Local storage:\n    type: local\n    root: ./local/photos\n  # WebDAV repository with the latest photos from our smartphone.\n  Cloud storage:\n    type: webdav\n    url: https://mycloud.mydomain.org\n    root: /remote.php/webdav/photos\n    user: pyframe\n    password: \n    index_update_at: \"23:00\"\n  # Test repository, which has been disabled.\n  Test repository:\n    type: local\n    root: ./local/test\n    enabled: false\n  # Repository defaults\n  index_update_interval: 24\nslideshows:\n  # Slideshow with our favorite photos and videos.\n  Favorites:\n    repositories: Local storage\n    pause: 60\n    tags: favorites\n  # Slideshow with most recent photos from our smartphone.\n  Recent:\n    repositories: Cloud storage\n    excluded_tags:\n      - vacation\n      - favorites\n    types: images\n    most_recent: 200\n    order: date\n    direction: descending\n  # Slideshow defaults\n  always_excluded_tags: private\n  label_content: description\n  label_mode: auto\n  label_duration: 30\n  orientation: portrait\n  pause: 180\n  rotation: -90\nschedule:\n  # Play the slideshow \"Recent\" in the period from 8:00 to 10:00.\n  morning start:\n    time: \"08:00\"\n    slideshow: Recent\n    display_mode: static\n    play_state: playing\n  morning stop:\n    time: \"10:00\"\n    play_state: stopped\n  # Play the slideshow \"Favorites\" in the period from 18:00 to 20:00.\n  # Activate the display by motion.\n  evening start:\n    time: \"18:00\"\n    slideshow: Favorites\n    display_mode: motion\n    display_timeout: 300\n    play_state: playing\n  evening stop:\n    time: \"20:00\"\n    play_state: stopped\nmqtt:\n  host: mqtt.local\n  user: pyframe\n  password: \n  device_name: My Digital Memories somewhere in the house\nApplication: The following parameters are used to configure the application. Basic parameters: window_size: The size of the window provided as [width, height]. A value of \"full\" enables full screen mode. The default is \"full\". display_mode: The following display modes are supported.
The default is \"static\". -static: The display is always on if a slideshow is paused or playing, and off if a slideshow is stopped. -motion: The display is turned on and the slideshow starts playing in the presence of motion (i.e. touch events). The slideshow is paused and the display turned off in the absence of motion after the display timeout interval. display_timeout: The time in seconds after which the slideshow is paused and the screen turned off in the absence of motion. The default is 300 seconds. Advanced parameters: Parameters in this section will likely not have to be modified by the majority of users. index: The index database file. The path may be absolute or relative to the current working directory. The default is \"./index.sqlite\". cache: The directory in which files can be cached (used by WebDAV and rclone repositories). The directory path may be absolute or relative to the current working directory. The directory can be shared by multiple repositories. Do not use a directory in which you store files as the cache directory. The default is \"./cache\". enable_exception_handler: Set to true in order to enable the generic exception handler. The generic exception handler prevents the application from exiting unexpectedly. Exceptions are logged, but execution continues. The default is false. enable_scheduler: Set to false in order to disable the scheduler. The scheduler is then disabled even in the presence of a schedule configuration section. The default is true. enable_mqtt: Set to false in order to disable the MQTT client. The client is then disabled even in the presence of an mqtt configuration section. The default is true. enable_logging: Set to false in order to disable logging. The default is true. log_level: The log level, which can be set to debug, info, warning, or error. The default is \"warning\". log_dir: The directory to which log files are written. The directory path may be absolute or relative to the current working directory.
The default is \"./log\". Repositories: Digital Memories supports the configuration of one or multiple file repositories. Repositories are configured in the repositories section of the configuration file. The section is required and must contain at least a single, valid repository definition. Repository parameter defaults may be provided as global parameters. The example below provides a typical repositories configuration section: repositories:\n  # Local repository with our favorite photos and videos.\n  Local storage:\n    type: local\n    root: ./local/photos\n  # WebDAV repository with the latest photos from our smartphone.\n  Cloud storage:\n    type: webdav\n    url: https://mycloud.mydomain.org\n    root: /remote.php/webdav/photos\n    user: pyframe\n    password: \n  # Test repository, which has been disabled.\n  Test repository:\n    type: local\n    root: ./local/test\n    enabled: false\n  # Repository defaults\n  index_update_interval: 24\nThe following parameters are used to configure repositories. General parameters: type: The following repository types are supported. A value must be provided. -local: Repository with files on the local file system. Note: Even if referred to as local, files may be stored on a network share as long as the share is mounted and integrated into the file system hierarchy (e.g. \"/mnt/photos\"). -rclone: Repository with files on an rclone remote. The remote must have been configured beforehand using the \"rclone config\" command or directly in the rclone configuration file. -webdav: Repository with files on a WebDAV-accessible site (e.g. ownCloud or NextCloud). index_update_interval: Interval in hours at which the metadata index for the repository is updated. If zero, the index is only updated once after start of the application. The default is 0. Do not use in combination with index_update_at. index_update_at: The time at which the metadata index for the repository is updated. The index is updated once per day. Do not use in combination with index_update_interval. enabled: Set to false in order to disable the repository.
The default is true. Local repositories: Only a single parameter is required for the definition of local repositories. root: The repository root directory. Root directories may be absolute or relative to the current working directory. Files in sub-folders will be included in the repository. A value must be provided. Rclone repositories: As for local repositories, only a single parameter is required for the definition of rclone repositories. However, the rclone remote must have been configured beforehand. Digital Memories currently does not provide any functionality to configure rclone remotes. root: The rclone remote and root directory (e.g. \"mycloud:/photos/\"). Files in sub-folders will be included in the repository. A value must be provided. WebDAV repositories: As a minimum, the parameters url, user and password need to be specified for the definition of a WebDAV repository. url: The URL of the WebDAV server. Use the \"https://\" protocol prefix for secure connections. A value must be provided. user: Login name. A value must be provided. password: Login password. A value must be provided. root: The root directory relative to the URL. For ownCloud WebDAV access, the root directory typically starts with \"/remote.php/webdav\". The default is /. Slideshows: Digital Memories supports the configuration of one or multiple slideshows. Slideshows are configured in the slideshows section of the configuration file. The section is required and must contain at least a single, valid slideshow definition. The first slideshow is the default slideshow. Slideshow parameter defaults may be provided as global parameters.
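As an illustration of how such global defaults could combine with per-slideshow settings, consider the hypothetical sketch below. It is not the application's actual merging code; in particular, telling defaults apart from named slideshows by \"is the value a dict\" is a simplification made for this example.

```python
# Hypothetical sketch of merging global slideshow defaults with
# per-slideshow settings -- not the application's actual code.

def merge_with_defaults(section):
    '''Split a slideshows-style section into defaults and named entries,
    then apply the defaults to every entry.'''
    # Simplification: global defaults are the non-dict values, while
    # named slideshows are the dict values.
    defaults = {k: v for k, v in section.items() if not isinstance(v, dict)}
    slideshows = {k: v for k, v in section.items() if isinstance(v, dict)}
    # Per-slideshow settings override the global defaults.
    return {name: {**defaults, **settings}
            for name, settings in slideshows.items()}
```

For example, with a global pause of 180 and a slideshow-level pause of 60, the slideshow-level value wins while unset parameters (e.g. orientation) fall back to the defaults.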
The example below provides a typical slideshows configuration section: slideshows:\n  # Slideshow with our favorite photos and videos.\n  Favorites:\n    repositories: Local storage\n    pause: 60\n    tags: favorites\n  # Slideshow with most recent photos from our smartphone.\n  Recent:\n    repositories: Cloud storage\n    excluded_tags:\n      - vacation\n      - favorites\n    types: images\n    most_recent: 200\n    order: date\n    direction: descending\n  # Slideshow defaults\n  always_excluded_tags: private\n  label_content: description\n  label_mode: auto\n  label_duration: 30\n  orientation: portrait\n  pause: 180\n  rotation: -90\nThe following parameters are used to configure slideshows. General parameters: bg_color: The background color used to fill empty areas, provided as [r, g, b]. The default is [1, 1, 1] (white). label_content: The following content based on file metadata is supported. The default is \"full\". -description: only the image description. -short: image description, location, and creation date. -full: image description, location, creation date and tags, file name and repository. label_duration: Duration in seconds for which labels are shown. The default is 60. label_font_size: The relative font size of labels, expressed as a percentage of the shortest file dimension. The default is 0.08. label_mode: The following label modes are supported. The default is \"off\". -auto: Labels are shown at the beginning and end of a file for the label_duration. -off: Labels are never shown. -on: Labels are always shown. label_padding: The relative padding of labels, expressed as a percentage of the shortest file dimension. The default is 0.03. pause: The delay in seconds until the next file is shown. The default is 300. resize: The following resize modes are supported. The default is \"fill\". -fit: The slideshow content is zoomed to fit the screen as well as possible. Empty areas are filled with the background color. -fill: The slideshow content is zoomed and cropped to completely fill the screen.
Note that images which do not have the same orientation as the screen are not zoomed and cropped, but only fit to the screen. rotation: The angle by which slideshow content is rotated clockwise. Useful for picture frames/screens which are installed in a non-standard orientation. The default is 0. Filter criteria: The following parameters control the files included in a slideshow and the sequence in which they are shown. The default is to include all files from all repositories. Files are sorted by their name in ascending order. repositories: The repositories from which files shall be shown. The default is to show files from all repositories. orientation: Valid orientations are portrait or landscape. The default is to include either orientation. types: Supported file types are images and videos. May be a single value or a list of values. The default is to include all file types. tags: File tags which shall be included. May be a single value or a list of values. The default is to include all tags and untagged files. If set, untagged files are excluded. excluded_tags: File tags which shall be excluded. May be a single value or a list of values. The default is not to exclude any tags. always_excluded_tags: Same as excluded_tags, but not overwritten by an excluded_tags statement. Use in the slideshow default configuration to exclude certain tags in all slideshows (e.g. private content). most_recent: Files in the slideshow are limited to the most_recent number of files based on the creation date, after application of all other filter criteria. order: The sort order in which files are shown. The default is \"name\". -date: Files are sorted by their creation date. -name: Files are sorted by their name. -random: Files are shown in a random sequence. -smart: A short sequence with a random starting point, sorted by date in ascending order. direction: Valid sort directions are ascending or descending. The default is \"ascending\".
Ignored if random order is configured. smart_limit: The (maximum) number of files in a smart sequence. If the smart_time criterion is not met, the sequence may be shorter. The default is 10. smart_time: The maximum time allowed between subsequent files of a smart sequence, in hours. If exceeded, the sequence is terminated early and a new sequence is initiated. The default is 24. Schedule: Digital Memories supports the configuration of a schedule. The schedule makes it possible to alter the application behavior at predefined points in time. The schedule is configured in the optional schedule section of the configuration file. The schedule may contain one or multiple events. The schedule is disabled if the configuration section is missing. The example below provides a typical schedule configuration section: schedule:\n  # Play the slideshow \"Recent\" in the period from 8:00 to 10:00.\n  morning start:\n    time: \"08:00\"\n    slideshow: Recent\n    display_mode: static\n    play_state: playing\n  morning stop:\n    time: \"10:00\"\n    play_state: stopped\n  # Play the slideshow \"Favorites\" in the period from 18:00 to 20:00.\n  # Activate the display by motion.\n  evening start:\n    time: \"18:00\"\n    slideshow: Favorites\n    display_mode: motion\n    display_timeout: 30\n    play_state: playing\n  evening stop:\n    time: \"20:00\"\n    play_state: stopped\nThe following parameters are used to configure events in the schedule. time: The time of the event. A value must be provided. Always specify it in quotation marks. Note: Hours and minutes below 10 must be preceded by a 0, i.e. \"08:03\" and never \"8:3\". slideshow: The selected slideshow. If no slideshow is specified, the previous or default slideshow is assumed. play_state: Valid play states are paused, playing and stopped. The play state remains unchanged if no value is provided. The default is \"stopped\". display_mode: The following display modes are supported. The display mode remains unchanged if no value is provided.
The default is \"static\". -static: The display is always on if a slideshow is paused or playing, and off if a slideshow is stopped. -motion: The display is turned on and the slideshow starts playing in the presence of motion. The slideshow is paused and the display turned off in the absence of motion after the display timeout interval. display_timeout: The time in seconds after which the slideshow is paused and the screen turned off in the absence of motion. The display timeout remains unchanged if no value is provided. The default is 300. MQTT: Digital Memories implements an MQTT client, which registers the device with an MQTT broker. The MQTT configuration is provided in the optional mqtt section of the configuration file. MQTT support is disabled if the configuration section is missing. The example below provides a typical mqtt configuration section: mqtt:\n  host: \n  user: \n  password: \n  device_name: My Digital Memories somewhere in the house\nThe following parameters are used to configure the MQTT client. host: Hostname of the MQTT broker. A value must be specified. port: Connection port of the MQTT broker. The default is 8883 (the standard for secure connections). tls: The following values are supported. The default is true. -true: A TLS-encrypted secure connection is used. -false: A non-encrypted connection is used. tls_insecure: The following values are supported. The default is false. -true: Insecure TLS connections with non-trusted certificates are permitted. -false: Only secure connections with trusted certificates are permitted. user: Login name. A value must be provided. password: Login password. A value must be provided. device_id: The Digital Memories device ID. The default is \"pyframe\". Note: The device ID must be unique. A different value must be specified if multiple Pyframe instances connect to the same broker. device_name: The human-friendly device name.
The default is to use the device_id. Running: Once Digital Memories has been configured, you can change into the Digital Memories directory and start the application with the following command: $ python3 pyframe.py. In recent distributions you may have to use \"python\" instead of \"python3\". Unless configured otherwise, Digital Memories is going to create an index database \"index.sqlite\" and a directory \"./log\" for log files in the Digital Memories directory. If WebDAV or rclone repositories are configured, Digital Memories will further create a directory \"./cache\" for temporary storage of downloaded files. For convenience you can install the following script, which will allow you to start the Digital Memories application from anywhere (even SSH sessions). The placeholders USER and SRC evidently need to be replaced with the proper values prior to running the script. /usr/local/bin/start-pyframe:\n#!/bin/sh\nUSER=\nSRC=\n# May be python or python3 depending on your distribution\nPYTHON=/usr/bin/python3\n# Set authority file and active display in case we are starting this script from an SSH session.\nexport XAUTHORITY=/home/$USER/.Xauthority\nexport DISPLAY=:0\n# Change to pyframe source directory\ncd $SRC\n# Start pyframe\nif [ $(/usr/bin/whoami) = 'root' ]; then\n  /usr/sbin/runuser -u $USER -- $PYTHON pyframe.py\nelse\n  $PYTHON pyframe.py\nfi\nIf you intend to run Digital Memories as a systemd service, you can optionally create a second script for clean-up after termination. In this example, we turn off the screen (works only under X11, not Wayland). /usr/local/bin/stop-pyframe:\n#!/bin/sh\nUSER=\n# Set authority file and active display in case we are starting this script from an SSH session.\nexport XAUTHORITY=/home/$USER/.Xauthority\nexport DISPLAY=:0\n# Turn backlight off.\n/usr/bin/xset dpms force off\nBoth scripts should be owned by root.root and need to be executable (mode 750). If you want to start Digital Memories automatically during system boot, you can do so by configuring it in your desktop session manager.
Alternatively, you can register a systemd service via a unit file. Below is an example of a unit file, which uses the two scripts we created before. /etc/systemd/system/pyframe.service:\n[Unit]\nDescription=Digital Memories photo frame\nWants=graphical.target\nAfter=graphical.target\n\n[Service]\nType=simple\nExecStart=/usr/local/bin/start-pyframe\nExecStop=/usr/local/bin/stop-pyframe\nUser=root\nGroup=root\nRestart=always\n\n[Install]\nWantedBy=default.target\nThe Wants and After statements make sure that we are in graphical mode and that the service is only started after the graphical system has been launched. The ExecStop script is optional, as stated above; it is not required to stop the Digital Memories service. The Restart statement ensures that Digital Memories is restarted after an unexpected exit. The WantedBy statement makes it possible to start the service automatically at boot time. Make sure the unit file belongs to root.root, is readable by the owner and group, and writable by the owner only (mode 640). Afterwards you can start the service and verify the successful start via the following commands: $ sudo systemctl start pyframe\n$ sudo systemctl status pyframe\nTo enable automatic start of the service at boot time, issue the following command: $ sudo systemctl enable pyframe. Make sure to additionally configure autologin for the user under which you intend to run Digital Memories. Steps for configuration depend on the graphical system and Linux distribution. Under Armbian you can use the \"armbian-config\" tool. On Raspberry Pi OS, the \"raspi-config\" tool will do. For other systems/distributions, consult the corresponding documentation. Home Assistant: General setup: Digital Memories implements basic support for integration with the Home Assistant home automation system. Integration is achieved through the built-in Home Assistant MQTT integration.
As an additional prerequisite, an MQTT broker must be installed (e.g. Eclipse Mosquitto). After the Digital Memories MQTT client has been correctly configured and a connection to the MQTT broker established, Digital Memories should automatically appear as a new device in Home Assistant. The device supports several push buttons and configuration selections, which allow you to control Digital Memories remotely. The device further provides a file sensor, whose value is identical to the UUID of the currently displayed file. In addition, the file sensor provides selected file metadata as sensor attributes. Motion activation: For motion activation of the display, the touch button of the Digital Memories device needs to be coupled to a motion sensor via an automation. Every time motion is detected, the touch button is pressed by the automation. Pressing the touch button activates the display and resets the display timeout counter."} {"package": "01memories-resize", "pacakge-description": "Digital Memories (01memories). Digital Memories is a Python-based digital photo frame application. It is capable of displaying photos and playing videos from local storage as well as from WebDAV and rclone repositories. Digital Memories has been designed to run slideshows from image and video repositories with several thousand files. No conversion is required. Files remain in your repositories and fully under your control. Files in slideshows can be dynamically arranged and filtered based on their metadata (EXIF and IPTC metadata supported). Slideshows can be run continuously or scheduled. Digital Memories supports reverse geocoding based on GPS data in the EXIF tag, using the geopy library and the Photon geocoder (essentially OpenStreetMap). Digital Memories optionally integrates with Home Assistant via MQTT.
Integration allows the display to be motion activated after coupling of the Digital Memories device with a motion sensor. Digital Memories is being developed by Bernd Kalbfuss (aka langweiler) and is published under the General Public License version 3. The latest source code is available on GitHub. Instructions for building your own digital photo frame can be found here. Dependencies: Digital Memories requires Python 3 to run. It has been developed with Python version 3.10 on Ubuntu Linux, but may run with earlier versions and on different operating systems. Digital Memories requires the following Python packages to be installed: exifread, ffmpeg-python, geopy, IPTCInfo3, Kivy, paho-mqtt, pillow, pyyaml, rclone-python, schedule, SQLAlchemy, webdavclient3. All packages are available on pypi.org and can be installed using the \"pip install\" (or \"pip3 install\") command. Where possible/available, packages should be installed using the distribution package manager (e.g. \"apt\" on Debian/Ubuntu). Digital Memories further requires the following (non-Python) libraries to be installed: libxslt1.1, libmtdev1, libsqlite3-0, libsdl2-2.0-0, ffmpeg. Libraries should be installed using the distribution package manager. Note that Digital Memories requires the X window system and a desktop environment to be installed. Digital Memories will in principle also run under Wayland, but the display will not be turned off automatically since Wayland does not fully implement the \"xset\" command. Installation: Digital Memories is still in early development. The easiest way to install the latest version is to clone the GitHub repository using the git client. After having installed the git client, installation of Digital Memories becomes as simple as: $ git clone git@github.com:kalbfuss/01memories.git. The command installs the Digital Memories sources in the sub-directory \"pyframe\" within the current working directory.
Digital Memories can be updated to the latest version by changing into the \"pyframe\" directory and issuing the following commands: $ cd pyframe\n$ git pull origin master. At this stage of the project you should not expect the configuration syntax to be stable. Please have a look at the documentation after each update and adjust the configuration as necessary. Configuration: The Digital Memories application is configured via a single YAML configuration file. The file is named \"config.yaml\" and must be stored in the current (working) directory. The following sections provide examples for configuration and the documentation of all parameters. A lot of effort has gone into configuration checks. The application should warn you in the event of invalid configurations immediately after startup. It is thus safe to explore the various configuration options. Under no circumstances is Digital Memories going to modify any of your image or video files. Examples: Simple configuration: In this example, we want to continuously show all files stored in a local directory. For this purpose, we configure a single local repository (\"Local storage\"). Our files are stored under the relative path \"./local/photos\". We further define a single slideshow (\"Favorites\") containing all files from the repository. Files are shown in a random sequence for a duration of 60 s. Per the application default settings, the repository is indexed once after start of the application. The slideshow includes photos and videos.
The slideshow starts playing after start of the application and the display is always on.

repositories:
  # Local repository with our favorite photos and videos.
  Local storage:
    type: local
    root: ./local/photos

slideshows:
  # Slideshow with our favorite photos and videos.
  Favorites:
    repositories: Local storage
    pause: 60
    sequence: random

Advanced configuration

In this example, we want to show our most recent photos stored in the cloud in the period from 8:00 to 10:00 and our favorite photos, which are stored locally, in the period from 18:00 to 20:00. Since we are not necessarily at home in the evening, we want the display to be motion activated during this time.

Firstly, we define two (enabled) repositories: a local repository (\"Local storage\") with files stored under the relative path \"./local/photos\" and a WebDAV repository (\"Cloud storage\") with files stored in the cloud. The third repository (\"Test repository\"), used for testing, has been disabled. Per the repository default settings, the index of the local repository is updated at the start of the application and every 24 hours. The index of the cloud repository is updated daily at 23:00.

Secondly, we define two slideshows: The first slideshow (\"Favorites\") includes files tagged as \"favorites\" from the local repository. Files are shown for a duration of 60 s. The second slideshow (\"Recent\") includes the 200 most recent files from the cloud repository, which are not tagged as \"vacation\" or \"favorites\". We further limit files to \"images\". Files are sorted by the creation date in ascending order. Per the slideshow defaults, images are shown for a duration of 180 s.

The slideshow defaults further ensure that files tagged as \"private\" are always excluded. Files are labeled with their description from the file metadata (if available) and labels are shown for a duration of 60 s at the start and the end of each file.
Since the display is installed in vertical orientation, we rotate the content by -90° and limit content to files in portrait orientation.

Thirdly, we define a schedule to show the second slideshow (\"Recent\") in the time from 8:00 to 10:00 and the first slideshow (\"Favorites\") in the time from 18:00 to 20:00. In the first case, the display is always on. In the second case, the display is motion activated with a timeout interval of 300 s.

Finally, since we run Home Assistant and need the MQTT remote control for the motion activation feature, we configure an MQTT client connection. For the motion activation feature to function properly, we further have to link the touch button with a motion sensor in Home Assistant (see [motion activation](#Motion activation)).

repositories:
  # Local repository with our favorite photos and videos.
  Local storage:
    type: local
    root: ./local/photos
  # WebDAV repository with the latest photos from our smartphone.
  Cloud storage:
    type: webdav
    url: https://mycloud.mydomain.org
    root: /remote.php/webdav/photos
    user: pyframe
    password:
    index_update_at: \"23:00\"
  # Test repository, which has been disabled.
  Test repository:
    type: local
    root: ./local/test
    enabled: false
  # Repository defaults
  index_update_interval: 24

slideshows:
  # Slideshow with our favorite photos and videos.
  Favorites:
    repositories: Local storage
    pause: 60
    tags: favorites
  # Slideshow with most recent photos from our smartphone.
  Recent:
    repositories: Cloud storage
    excluded_tags:
      - vacation
      - favorites
    types: images
    most_recent: 200
    order: date
    direction: descending
  # Slideshow defaults
  always_excluded_tags: private
  label_content: description
  label_mode: auto
  label_duration: 30
  orientation: portrait
  pause: 180
  rotation: -90

schedule:
  # Play the slideshow \"Recent\" in the period from 8:00 to 10:00.
  morning start:
    time: \"08:00\"
    slideshow: Recent
    display_mode: static
    play_state: playing
  morning stop:
    time: \"10:00\"
    play_state: stopped
  # Play the slideshow \"Favorites\" in the period from 18:00 to 20:00.
  # Activate the display by motion.
  evening start:
    time: \"18:00\"
    slideshow: Favorites
    display_mode: motion
    display_timeout: 300
    play_state: playing
  evening stop:
    time: \"20:00\"
    play_state: stopped

mqtt:
  host: mqtt.local
  user: pyframe
  password:
  device_name: My Digital Memories somewhere in the house

Application

The following parameters are used to configure the application.

Basic

- window_size: The size of the window provided as [width, height]. A value of \"full\" enables full screen mode. The default is \"full\".
- display_mode: The following display modes are supported. The default is \"static\".
  - static: The display is always on if a slideshow is paused or playing, and off if a slideshow is stopped.
  - motion: The display is turned on and the slideshow starts playing in the presence of motion (i.e. touch events). The slideshow is paused and the display turned off in the absence of motion after the display timeout interval.
- display_timeout: The time in seconds after which the slideshow is paused and the screen turned off in the absence of motion. The default is 300 seconds.

Advanced

Parameters in this section will likely not have to be modified by the majority of users.

- index: The index database file. The path may be absolute or relative to the current working directory. The default is \"./index.sqlite\".
- cache: The directory in which files can be cached (used by WebDAV and rclone repositories). The directory path may be absolute or relative to the current working directory. The directory can be shared by multiple repositories. Do not use a directory in which you store files as the cache directory. The default is \"./cache\".
- enable_exception_handler: Set to true in order to enable the generic exception handler. The generic exception handler prevents the application from exiting unexpectedly. Exceptions are logged, but execution continues. The default is false.
- enable_scheduler: Set to false in order to disable the scheduler. The scheduler is disabled even in the presence of a schedule configuration section.
The default is true.
- enable_mqtt: Set to false in order to disable the MQTT client. The client is disabled even in the presence of an mqtt configuration section. The default is true.
- enable_logging: Set to false in order to disable logging. The default is true.
- log_level: The log level, which can be set to debug, info, warning, or error. The default is \"warning\".
- log_dir: The directory to which log files are written. The directory path may be absolute or relative to the current working directory. The default is \"./log\".

Repositories

Digital Memories supports the configuration of one or multiple file repositories. Repositories are configured in the repositories section of the configuration file. The section is required and must contain at least a single, valid repository definition. Repository parameter defaults may be provided as global parameters. The example below provides a typical repositories configuration section.

...
repositories:
  # Local repository with our favorite photos and videos.
  Local storage:
    type: local
    root: ./local/photos
  # WebDAV repository with the latest photos from our smartphone.
  Cloud storage:
    type: webdav
    url: https://mycloud.mydomain.org
    root: /remote.php/webdav/photos
    user: pyframe
    password:
  # Test repository, which has been disabled.
  Test repository:
    type: local
    root: ./local/test
    enabled: false
  # Repository defaults
  index_update_interval: 24
...

The following parameters are used to configure repositories.

General

- type: The following repository types are supported. A value must be provided.
  - local: Repository with files on the local file system. Note: Even if referred to as local, files may be stored on a network share as long as the network share is mounted and integrated into the file system hierarchy (e.g. \"/mnt/photos\").
  - rclone: Repository with files on an rclone remote. The remote must have been configured before, using the \"rclone config\" command or directly in the rclone configuration file.
  - webdav: Repository with files on a WebDAV accessible site (e.g.
ownCloud or NextCloud).
- index_update_interval: Interval in hours at which the metadata index for the repository is updated. If zero, the index is only updated once after start of the application. The default is 0. Do not use in combination with index_update_at.
- index_update_at: The time at which the metadata index for the repository is updated. The index is updated once per day. Do not use in combination with index_update_interval.
- enabled: Set to false in order to disable the repository. The default is true.

Local repositories

Only a single parameter is required for the definition of local repositories.

- root: The repository root directory. Root directories may be absolute or relative to the current working directory. Files in sub-folders will be included in the repository. A value must be provided.

Rclone repositories

As for local repositories, only a single parameter is required for the definition of rclone repositories. However, the rclone remote must have been configured before. Digital Memories currently does not provide any functionality to configure rclone remotes.

- root: The rclone remote and root directory (e.g. \"mycloud:/photos/\"). Files in sub-folders will be included in the repository. A value must be provided.

WebDAV repositories

As a minimum, the parameters url, user and password need to be specified for the definition of a WebDAV repository.

- url: The URL of the WebDAV server. Use the \"https://\" protocol prefix for secure connections. A value must be provided.
- user: Login name. A value must be provided.
- password: Login password. A value must be provided.
- root: The root directory relative to the URL. For ownCloud WebDAV access, the root directory typically starts with \"/remote.php/webdav\". The default is /.

Slideshows

Digital Memories supports the configuration of one or multiple slideshows. Slideshows are configured in the slideshows section of the configuration file.
The section is required and must contain at least a single, valid slideshow definition. The first slideshow is the default slideshow. Slideshow parameter defaults may be provided as global parameters. The example below provides a typical slideshows configuration section.

...
slideshows:
  # Slideshow with our favorite photos and videos.
  Favorites:
    repositories: Local storage
    pause: 60
    tags: favorites
  # Slideshow with most recent photos from our smartphone.
  Recent:
    repositories: Cloud storage
    excluded_tags:
      - vacation
      - favorites
    types: images
    most_recent: 200
    order: date
    direction: descending
  # Slideshow defaults
  always_excluded_tags: private
  label_content: description
  label_mode: auto
  label_duration: 30
  orientation: portrait
  pause: 180
  rotation: -90
...

The following parameters are used to configure slideshows.

General parameters

- bg_color: The background color used to fill empty areas, provided as [r, g, b]. The default is [1, 1, 1] (white).
- label_content: The following content based on file metadata is supported. The default is \"full\".
  - description: only the image description
  - short: image description, location, and creation date
  - full: image description, location, creation date and tags, file name and repository
- label_duration: Duration in seconds for which labels are shown. The default is 60.
- label_font_size: The relative font size of labels, expressed as a percentage of the shortest file dimension. The default is 0.08.
- label_mode: The following label modes are supported. The default is \"off\".
  - auto: Labels are shown at the beginning and end of a file for the label_duration.
  - off: Labels are never shown.
  - on: Labels are always shown.
- label_padding: The relative padding of labels, expressed as a percentage of the shortest file dimension. The default is 0.03.
- pause: The delay in seconds until the next file is shown. The default is 300.
- resize: The following resize modes are supported. The default is \"fill\".
  - fit: The slideshow content is zoomed to fit the screen as well as possible.
Empty areas are filled with the background color.
  - fill: The slideshow content is zoomed and cropped to completely fill the screen. Note that images which do not have the same orientation as the screen are not zoomed and cropped, but only fit to the screen.
- rotation: The angle by which slideshow content is rotated clockwise. Useful for picture frames/screens which are installed in a non-standard orientation. The default is 0.

Filter criteria

The following parameters control the files included in a slideshow and the sequence in which they are shown. The default is to include all files from all repositories. Files are sorted by their name in ascending order.

- repositories: The repositories from which files shall be shown. The default is to show files from all repositories.
- orientation: Valid orientations are portrait or landscape. The default is to include either orientation.
- types: Supported file types are images and videos. May be a single value or a list of values. The default is to include all file types.
- tags: File tags which shall be included. May be a single value or a list of values. The default is to include all tags and untagged files. If set, untagged files are excluded.
- excluded_tags: File tags which shall be excluded. May be a single value or a list of values. The default is not to exclude any tags.
- always_excluded_tags: Same as excluded_tags, but not overwritten by an excluded_tags statement. Use in the slideshow default configuration to exclude certain tags in all slideshows (e.g. private content).
- most_recent: Files in the slideshow are limited to the most_recent number of files based on the creation date, after application of all other filter criteria.
- order: The sort order in which files are shown.
The default is \"name\".-date:Files are sorted by their creation date.-name:Files are sorted by their name.-random:Files are shown in a random sequence.-smart: A short sequence with random starting point, sorted by date in ascending order.directionValid sort directions areascendingordescending. The default is \"ascending\". Ignored if random order is configured.smart_limitThe (maximum) number of files in a smart sequence. If thesmart_timecriterion is not met, the sequence may be shorter. The default is 10.smart_timeThe maximum time allowed in-between subsequent files of a smart sequence in hours. If exceeded, the sequence is terminated early and a new sequence initiated. The default is 24.ScheduleDigital Memories supports the configuration of a schedule. The schedule allows to alter the application behavior at predefined points in time. The schedule is configured in the optionalschedulesection of the configuration file. The schedule may contain one or multiple events. The schedule is disabled if the configuration section is missing. The example below provides a typicalscheduleconfiguration section.schedule:# Play the slideshow \"Recent\" in the period from 8:00 to 10:00.morning start:time:\"08:00\"slideshow:Recentdisplay_mode:staticplay_state:playingmorning stop:time:\"10:00\"play_state:stopped# Play the slideshow \"Favorites\" in the period from 18:00 to 20:00.# Activate the display by motion.evening start:time:\"18:00\"slideshow:Favoritesdisplay_mode:motiondisplay_timeout:30play_state:playingevening stop:time:\"20:00\"play_state:stoppedThe following parameters are used to configure events in the schedule.ParameterDescriptiontimeThe time of the event. A value must be provided. Always specify in quotation marks.Note:Hours and minutes <10 must be preceded by a 0, i.e. \"08:03\" and never \"8:3\".slideshowSelected slideshow. If no slideshow is specified, the previous or default slideshow is assumed.play_stateValid play states arepaused,playingandstopped. 
The play state remains unchanged if no value is provided. The default is \"stopped\".
- display_mode: The following display modes are supported. The display mode remains unchanged if no value is provided. The default is \"static\".
  - static: The display is always on if a slideshow is paused or playing, and off if a slideshow is stopped.
  - motion: The display is turned on and the slideshow starts playing in the presence of motion. The slideshow is paused and the display turned off in the absence of motion after the display timeout interval.
- display_timeout: The time in seconds after which the slideshow is paused and the screen turned off in the absence of motion. The display timeout remains unchanged if no value is provided. The default is 300.

MQTT

Digital Memories implements an MQTT client, which registers the device with an MQTT broker. The MQTT configuration is provided in the optional mqtt section of the configuration file. MQTT support is disabled if the configuration section is missing. The example below provides a typical mqtt configuration section.

...
mqtt:
  host:
  user:
  password:
  device_name: My Digital Memories somewhere in the house
...

The following parameters are used to configure the MQTT client.

- host: Hostname of the MQTT broker. A value must be specified.
- port: Connection port of the MQTT broker. The default is 8883 (the standard for secure connections).
- tls: The following values are supported. The default is true.
  - true: A TLS-encrypted secure connection is used.
  - false: A non-encrypted connection is used.
- tls_insecure: The following values are supported. The default is false.
  - true: Insecure TLS connections with non-trusted certificates are permitted.
  - false: Only secure connections with trusted certificates are permitted.
- user: Login name. A value must be provided.
- password: Login password. A value must be provided.
- device_id: The Digital Memories device ID. The default is \"pyframe\". Note: The device ID must be unique.
A different value must be specified if multiple Pyframe instances connect to the same broker.
- device_name: The human-friendly device name. The default is to use the device_id.

Running

Once Digital Memories has been configured, you can change into the Digital Memories directory and start the application with the following command:

$ python3 pyframe.py

In recent distributions you may have to use \"python\" instead of \"python3\". Unless configured otherwise, Digital Memories is going to create an index database \"index.sqlite\" and a directory \"./log\" for log files in the Digital Memories directory. If WebDAV or rclone repositories are configured, Digital Memories will further create a directory \"./cache\" for temporary storage of downloaded files.

For convenience you can install the following script, which will allow you to start the Digital Memories application from anywhere (even SSH sessions). The placeholders USER and SRC evidently need to be replaced with the proper values prior to running the script.

/usr/local/bin/start-pyframe

#!/bin/sh
USER=
SRC=
# May be python or python3 depending on your distribution
PYTHON=/usr/bin/python3
# Set authority file and active display in case we are starting this script from an SSH session.
export XAUTHORITY=/home/$USER/.Xauthority
export DISPLAY=:0
# Change to pyframe source directory
cd $SRC
# Start pyframe
if [ $(/usr/bin/whoami) = 'root' ]; then
    /usr/sbin/runuser -u langweiler -- $PYTHON pyframe.py
else
    $PYTHON pyframe.py
fi

If you intend to run Digital Memories as a systemd service, you can optionally create a second script for clean-up after termination.
In this example, we turn off the screen (works only under X11, not Wayland).

/usr/local/bin/stop-pyframe

#!/bin/sh
USER=
# Set authority file and active display in case we are starting this script from an SSH session.
export XAUTHORITY=/home/$USER/.Xauthority
export DISPLAY=:0
# Turn backlight off.
/usr/bin/xset dpms force off

Both scripts should be owned by root.root and need to be executable (mode 750). If you want to start Digital Memories automatically during system boot, you can do so by configuring it in your desktop session manager. Alternatively, you can register a systemd service via a unit file. Below is an example of a unit file which uses the two scripts we created before.

/etc/systemd/system/pyframe.service

[Unit]
Description=Digital Memories photo frame
Wants=graphical.target
After=graphical.target

[Service]
Type=simple
ExecStart=/usr/local/bin/start-pyframe
ExecStop=/usr/local/bin/stop-pyframe
User=root
Group=root
Restart=always

[Install]
WantedBy=default.target

The Wants and After statements make sure that we are in graphical mode and that the service is only started after the graphical system has been launched. The ExecStop script is optional, as stated above; it is not required to stop the Digital Memories service. The Restart statement ensures that Digital Memories is restarted after an unexpected exit. The WantedBy statement allows the service to be started automatically at boot time. Make sure the unit file belongs to root.root, is readable by the owner and group, and writable by the owner only (mode 640). Afterwards you can start the service and verify the successful start via the following commands:

$ sudo systemctl start pyframe
$ sudo systemctl status pyframe

To enable automatic start of the service during boot time, issue the following command:

$ sudo systemctl enable pyframe

Make sure to additionally configure autologin for the user under which you intend to run Digital Memories. Steps for configuration depend on the graphical system and Linux distribution. Under Armbian you can use the \"armbian-config\" tool.
On Raspberry Pi OS, the \"raspi-config\" tool will do. For other systems/distributions, consult the corresponding documentation.

Home Assistant

General setup

Digital Memories implements basic support for integration with the Home Assistant home automation system. Integration is achieved through the built-in Home Assistant MQTT integration. As an additional prerequisite, an MQTT broker must be installed (e.g. Eclipse Mosquitto). After the Digital Memories MQTT client has been correctly configured and a connection to the MQTT broker established, Digital Memories should automatically appear as a new device in Home Assistant. The device supports several push buttons and configuration selections, which allow you to control Digital Memories remotely. The device further provides a file sensor, whose value is identical to the UUID of the currently displayed file. In addition, the file sensor provides selected file metadata as sensor attributes.

Motion activation

For motion activation of the display, the touch button of the Digital Memories device needs to be coupled to a motion sensor via an automation. Every time motion is detected, the touch button is pressed by the automation. Pressing the touch button activates the display and resets the display timeout counter."} {"package": "01OS", "pacakge-description": "The open-source language model computer.

pip install 01OS
01 # Runs the 01 server and client."} {"package": "021", "pacakge-description": "No description available on PyPI."} {"package": "024travis-test024", "pacakge-description": "No description available on PyPI."} {"package": "02exercicio", "pacakge-description": "UNKNOWN"} {"package": "0411-test", "pacakge-description": "No description available on PyPI."} {"package": "0.618", "pacakge-description": "📦 setup.py (for humans)

This repo exists to provide an example setup.py file that can be used to bootstrap your next Python project.
It includes some advanced patterns and best practices for setup.py, as well as some commented-out nice-to-haves. For example, this setup.py provides a $ python setup.py upload command, which creates a universal wheel (and sdist) and uploads your package to PyPI using Twine, without the need for an annoying setup.cfg file. It also creates/uploads a new git tag, automatically. In short, setup.py files can be daunting to approach when first starting out — even Guido has been heard saying, \"everyone cargo cults them\". It's true — so, I want this repo to be the best place to copy-paste from :)

Check out the example!

Installation

cd your_project

# Download the setup.py file:
# download with wget
wget https://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py -O setup.py

# download with curl
curl -O https://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py

To Do

Tests via $ setup.py test (if it's concise). Pull requests are encouraged!

More Resources

What is setup.py? on Stack Overflow
Official Python Packaging User Guide
The Hitchhiker's Guide to Packaging
Cookiecutter template for a Python package

License

This is free and unencumbered software released into the public domain. Anyone is free to copy, modify, publish, use, compile, sell, or distribute this software, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means."} {"package": "0706xiaoye", "pacakge-description": "README 0706 hello world!"} {"package": "0805nexter", "pacakge-description": "UNKNOWN"} {"package": "090807040506030201testpip", "pacakge-description": "This is not a real python module.
I am just learning how to make a package for pip"} {"package": "0-core-client", "pacakge-description": "# Zero-OS Python Client

## Install

```bash
pip3 install 0-core-client
```

## How to use

```python
from zeroos.core0.client import Client

cl = Client(host='<0-core-host-address>', password='')

# validate that core0 is reachable
print(cl.ping())

# then you can do stuff like
print(cl.system('ps -eF').get())
print(cl.system('ip a').get())

# client exposes more tools for disk, bridges, and container mgmt
print(cl.disk.list())
```"} {"package": "0FELA", "pacakge-description": "This is a security placeholder package. If you want to claim this name for legitimate purposes, please contact us at security@yandex-team.ru or pypi-security@yandex-team.ru"} {"package": "0html", "pacakge-description": "pip install 0html"} {"package": "0imap", "pacakge-description": "pip install 0imap"} {"package": "0lever-so", "pacakge-description": "This is an SSH login tool

Installation

pip install --upgrade 0lever-so
or
pip install --upgrade 0lever-so -i https://pypi.org/simple/

Usage

# Initialize the configuration file (not required when upgrading); chmod 400 ~/.so/keys/*
➜ ~ so_install
➜ ~ cd .so
➜ .so tree
.
├── keys
│   └── demo.pem
└── password.yaml

1 directory, 2 files
➜ .so

# Configuration file
ssh:
  - id: 1
    name: demo1
    user: fqiyou
    password: xxx
    host: 1.1.1.1
    port: 20755
  - id: 2
    name: demo2
    user: fqiyou
    password: xxx
    host: 1.1.1.1
    port: 39986
  - id: 3
    name: demo3
    user: root
    password: demo.pem
    host: 1.1.1.1
    port: 22

Other shell scripts

#!/usr/bin/expect
set USER \"xxx\"
set PASSWD \"xxx\"
set timeout 10

trap {
    set rows [stty rows]
    set cols [stty columns]
    stty rows $rows columns $cols < $spawn_out(slave,name)
} WINCH
spawn su - $USER
expect \"Password: \"
send \"$PASSWD\\n\"
interact

#!/usr/bin/expect -f
set HOST [lindex $argv 0]
set USER [lindex $argv 1]
set PASSWD [lindex $argv 2]
set PORT [lindex $argv 3]
set timeout 10

trap {
    set rows [stty rows]
    set cols [stty columns]
    stty rows $rows columns $cols < $spawn_out(slave,name)
} WINCH

spawn ssh $USER@$HOST -p $PORT
expect {
    \"*yes/no\" {send \"yes\\r\"; exp_continue}
    \"*password:\" {send \"$PASSWD\\r\"}
}
interact"} {"package": "0lever-utils", "pacakge-description": "No description available on PyPI."} {"package": "0-orchestrator", "pacakge-description": "# Python Client

0-orchestrator is the Python client used to talk to [Zero-OS 0 Rest API](../README.md)

## Install

```bash
pip install 0-orchestrator
```

## How to use

```python
In [9]: from zeroos.orchestrator import client

In [10]: c = client.Client('http://192.168.193.212:8080')

In [11]: c.api.nodes.ListNodes().json()
Out[11]:
[{'hostname': '', 'id': '2c600cbc2545', 'status': 'running'},
 {'hostname': '', 'id': '2c600ccd2ae9', 'status': 'running'},
 {'hostname': '', 'id': '0cc47a3b3d6a', 'status': 'running'},
 {'hostname': '', 'id': '2c600ccd2ad1', 'status': 'running'},
 {'hostname': '', 'id': '2c600cbc23bc', 'status': 'running'}]
```

## To update the client from the RAML file

```shell
go-raml client -l python --ramlfile raml/api.raml --dir pyclient/zeroos/orchestrator/client
```"} {"package": "0proto", "pacakge-description": "Usage

pip install 0proto

Then, simply use the command periodically:

0proto https://example.com/something/etc

This will save data to: settings.BASE_DIR/data/0proto-DOMAIN:default/Item

N-Spacing

If you want to separate different sessions and sources, just use the name param:

0proto URI --name Name

This will save to: settings.BASE_DIR/data/0proto-DOMAIN:Name/Type

The --name value can be an arbitrary filesystem-compatible filename sub-string, so you can use it to separate data by accounts, languages, or other features. NOTE: Corresponding auth and session data will be stored in the settings.BASE_DIR/sessions folder.

Saving to a specific DIR

To save to a custom folder, simply pass the --path parameter, like:

0proto URI --name Name --path /home/mindey/Desktop/mydata"} {"package": "0rest", "pacakge-description": "pip install 0rest"} {"package": "0rss", "pacakge-description": "Usage

pip install 0rss

Then, simply use the command periodically:

0rss https://0oo.li/feed/en

This will save data periodically, to: ~/.metadrive/data/0rss-0oo.li:default/Post

Multi-source usage

If you want to separate different sessions and sources, just use the name param:

0rss https://0oo.li/feed/en --name mindey@example.com

This will save to: ~/.metadrive/data/0rss-0oo.li:mindey@example.com/Post

The --name value can be an arbitrary filesystem-compatible filename sub-string, so you can use it to separate data by accounts, languages, or other features. NOTE: Corresponding auth and session data will be stored in the ~/.metadrive/sessions folder.

Saving to a specified folder

To save to a custom folder, simply pass the --path parameter, like:

0rss https://hub.baai.ac.cn/rss --name Mindey --path /home/mindey/Desktop/mydata"} {"package": "0wdg9nbmpm", "pacakge-description": "UNKNOWN"} {"package": "0wneg", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "0x01-autocert-dns-aliyun", "pacakge-description": "No description available on PyPI."} {"package": "0x01-cubic-sdk", "pacakge-description": "No description available on PyPI."} {"package": "0x01-letsencrypt", "pacakge-description": "No description available on PyPI."} {"package": "0x0-python", "pacakge-description": "The following functions are available:

upload_file_url(url, expires, secret): uploads a file via a link; url = the link, expires = the file retention time in hours (may be left empty), secret = lengthens the link (may be left empty).
upload_file_path(path, expires, secret): the same as upload_file_url, except that a path to the file must be specified.
delete_file(token, url): deletes a file; token = the token, url = the link to the file on 0x0.
change_expires(url, expires, token): changes the file retention time; token = the token, url = the link to the file on 0x0, expires = the new file retention time in hours."} {"package": "0x10c-asm", "pacakge-description": "Install from PyPI:

pip install 0x10c-asm

Usage:

$ 0x10c-asm.py -h
usage: 0x10-asm.py [-h] IN [OUT]

A simple Python-based DCPU assembly compiler

positional arguments:
  IN          file path of the file containing the assembly code
  OUT         file path where to store the binary code

optional arguments:
  -h, --help  show this help message and exit"} {"package": "0x20bf", "pacakge-description": "No description available on PyPI."} {"package": "0x2nac0nda", "pacakge-description": "No description available on PyPI."} {"package": "0x-contract-addresses", "pacakge-description": "0x-contract-addresses

Addresses at which the 0x smart contracts have been deployed.

Read the documentation

Installing

pip install 0x-contract-addresses

Contributing

We welcome improvements and fixes from the wider community! To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started.

Install Code and Dependencies

pip install -e .[dev]

Clean

./setup.py clean --all

Lint

./setup.py lint

Build Documentation

./setup.py build_sphinx

More

See ./setup.py --help-commands for more info."} {"package": "0x-contract-artifacts", "pacakge-description": "0x-contract-artifacts

0x smart contract compilation artifacts

Read the documentation

Installing

pip install 0x-contract-artifacts

Contributing

We welcome improvements and fixes from the wider community!
To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Pull in artifacts from TypeScript build environment: ./setup.py pre_install. Install Code and Dependencies: pip install -e .[dev]. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info."} {"package": "0x-contract-wrappers", "pacakge-description": "0x-contract-wrappers: 0x contract wrappers for those developing on top of 0x protocol. Read the documentation. Installing: pip install 0x-contract-wrappers. Contributing: We welcome improvements and fixes from the wider community! To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Install Code and Dependencies: This package contains code generated via the npm package @0x/abi-gen. Preparing this package for development or installation requires running ./setup.py pre_install, which will invoke abi-gen to write the files to the src hierarchy. It expects to find abi-gen and the contract artifacts at the relative directory locations in the monorepo. After code generation, ensure that you have installed Python >=3.6 and Docker, and then: pip install -e .[dev]. Test: Tests depend on a running ganache instance with the 0x contracts deployed in it. For convenience, a docker container is provided that has ganache-cli and a snapshot containing the necessary contracts. A shortcut is provided to run that docker container: ./setup.py ganache. With that running, the tests can be run with ./setup.py test. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info."} {"package": "0xf0f-codenode", "pacakge-description": "Contents: What is this? How do I install it? How do I use it? Extensions. Reference. What is this? The goal of this module is to help write code that generates code. Focus is placed on enabling the user to easily describe, build and reason about code structures rapidly. How do I install it? From PyPI: pip install 0xf0f-codenode. From GitHub: pip install git+https://github.com/0xf0f/codenode. How do I use it? Like the json and pickle modules, dump and dumps are used to generate output. Code can be built using any tree of iterables containing strings, indentation nodes and newline nodes. For example, the built-in line function returns a tuple:\n\nfrom codenode import indentation, newline\n\ndef line(content):\n    return indentation, content, newline\n\nWhich we can combine with indent and dedent nodes:\n\nfrom codenode import line, indent, dedent, dumps\n\ndef counting_function(count_from, count_to):\n    return [\n        line(f'def count_from_{count_from}_to_{count_to}():'),\n        indent,\n        [line(f'print({i})') for i in range(count_from, count_to)],\n        dedent,\n    ]\n\nprint(dumps(counting_function(0, 5)))\n\nWhich outputs:\n\ndef count_from_0_to_5():\n    print(0)\n    print(1)\n    print(2)\n    print(3)\n    print(4)\n\nBut what if we want to count to a really big number, like 1,000,000,000,000,000? It would be inefficient to store all those lines in memory at once. 
We can use a generator to break them down into individual parts instead:\n\nfrom codenode import indent, dedent, newline, indentation, dump\n\ndef counting_function_generator(count_from, count_to):\n    yield indentation\n    yield 'def count_from_', str(count_from), '_to_', str(count_to), '():'\n    yield newline\n    yield indent\n    for i in range(count_from, count_to):\n        yield indentation, 'print(', str(i), ')', newline\n    yield dedent\n\nwith open('code.py', 'w') as file:\n    dump(counting_function_generator(0, 1_000_000_000_000_000), file)\n\nWe can also build a class with an __iter__ method:\n\nfrom codenode import line, indent, dedent, dump\n\nclass CountingFunction:\n    def __init__(self, count_from, count_to):\n        self.count_from = count_from\n        self.count_to = count_to\n\n    def __iter__(self):\n        yield line(f'def count_from_{self.count_from}_to_{self.count_to}():')\n        yield indent\n        for i in range(self.count_from, self.count_to):\n            yield line(f'print({i})')\n        yield dedent\n\nwith open('code.py', 'w') as file:\n    dump(CountingFunction(0, 1_000_000), file)\n\nOr a more generalized function class:\n\nclass Function:\n    def __init__(self, name, *args):\n        self.name = name\n        self.args = args\n        self.children = []\n\n    def __iter__(self):\n        arg_string = ', '.join(self.args)\n        yield line(f'def {self.name}({arg_string}):')\n        yield indent\n        yield self.children\n        yield dedent\n\nclass CountingFunction(Function):\n    def __init__(self, count_from, count_to):\n        super().__init__(f'count_from_{count_from}_to_{count_to}')\n        for i in range(count_from, count_to):\n            self.children.append(line(f'print({i})'))\n\nLeveraging python's iteration protocol like this allows: mixing and matching whatever fits the use case to maximize tradeoffs, such as using generators for their memory efficiency, custom iterable classes for their semantics, or plain old lists and tuples for their simplicity; taking advantage of existing modules that offer tooling for iterables, such as itertools; building higher level structures from as many iterable building blocks as desired. Extensions: Module behaviour can be extended by overriding methods of the codenode.writer.Writer and codenode.writer.WriterStack classes. 
An example of this can be seen in the codenode.debug.debug_patch function. The variable codenode.default_writer_type can be used to replace the Writer type used in dump and dumps with a custom one. Some modules with helper classes and functions are also provided: codenode_utilities contains general language-agnostic helper functions and classes. Reference. Note: this section of the readme was generated using codenode itself. See docs/generate_readme.py. Contents: codenode.dump, codenode.dumps, codenode.line, codenode.indent, codenode.dedent, codenode.newline, codenode.indentation, codenode.lines, codenode.empty_lines, codenode.indented, codenode.default_writer_type, codenode.writer.Writer, codenode.writer.WriterStack, codenode.nodes.newline.Newline, codenode.nodes.depth_change.DepthChange, codenode.nodes.depth_change.RelativeDepthChange, codenode.nodes.depth_change.AbsoluteDepthChange, codenode.nodes.indentation.Indentation, codenode.nodes.indentation.RelativeIndentation, codenode.nodes.indentation.AbsoluteIndentation, codenode.nodes.indentation.CurrentIndentation, codenode.debug.debug_patch. codenode.dump:\n\ndef dump(node, stream, *, indentation=' ', newline='\\n', depth=0, debug=False): ...\n\nProcess and write out a node tree to a stream. Parameters: node: base node of node tree. stream: an object with a 'write' method. indentation: string used for indents in the output. newline: string used for newlines in the output. depth: base depth (i.e. number of indents) to start at. debug: if True, will print out extra info when an error occurs to give a better idea of which node caused it. codenode.dumps:\n\ndef dumps(node, *, indentation=' ', newline='\\n', depth=0, debug=False) -> str: ...\n\nProcess and write out a node tree as a string. Parameters: node: base node of node tree. indentation: string used for indents in the output. newline: string used for newlines in the output. depth: base depth (i.e. 
number of indents) to start at. debug: if True, will print out extra info when an error occurs to give a better idea of which node caused it. Returns: string representation of node tree. codenode.line:\n\ndef line(content: 'T') -> 'tuple[Indentation, T, Newline]': ...\n\nConvenience function that returns a tuple containing an indentation node, line content and a newline node. Parameters: content: content of line. Returns: tuple containing an indentation node, line content and a newline node. codenode.indent: a node representing a single increase in indentation level. codenode.dedent: a node representing a single decrease in indentation level. codenode.newline: a placeholder node for line terminators. codenode.indentation: a placeholder node for indentation whitespace at the start of a line. codenode.lines:\n\ndef lines(*items) -> 'tuple[tuple, ...]': ...\n\nConvenience function that returns a tuple of lines, where each argument is the content of one line. Parameters: items: contents of lines. Returns: tuple of lines. codenode.empty_lines:\n\ndef empty_lines(count: int) -> 'tuple[Newline, ...]': ...\n\nConvenience function that returns a tuple of newline nodes. Parameters: count: number of newlines. Returns: tuple of newlines. codenode.indented:\n\ndef indented(*nodes) -> tuple: ...\n\nConvenience function that returns a tuple containing an indent node, some inner nodes, and a dedent node. Parameters: nodes: inner nodes. Returns: tuple containing an indent node, inner nodes, and a dedent node. codenode.default_writer_type: default Writer type used in codenode.dump and codenode.dumps. codenode.writer.Writer:\n\nclass Writer: ...\n\nProcesses node trees into strings then writes out the result. Each instance is intended to be used once then discarded. After a single call to either dump or dumps, the Writer instance is no longer useful. Methods: __init__:\n\nclass Writer:\n    def __init__(self, node: 'NodeType', *, indentation=' ', newline='\\n', depth=0): ...\n\nParameters: node: base node of node tree. indentation: initial string used for indents in the output. newline: initial string used for newlines in the 
output. depth: base depth (i.e. number of indents) to start at. process_node:\n\nclass Writer:\n    def process_node(self, node) -> 'Iterable[str]': ...\n\nYield strings representing a node and/or apply any of its associated side effects to the writer. For example: yield an indentation string when an indentation node is encountered; increase the current writer depth if an indent is encountered; append an iterator to the stack when an iterable is encountered. Parameters: node: node to be processed. Returns: strings of text chunks representing the node. dump_iter:\n\nclass Writer:\n    def dump_iter(self) -> 'Iterable[str]': ...\n\nProcess and write out a node tree as an iterable of string chunks. Returns: iterable of string chunks. dump:\n\nclass Writer:\n    def dump(self, stream): ...\n\nProcess and write out a node tree to a stream. Parameters: stream: an object with a 'write' method. dumps:\n\nclass Writer:\n    def dumps(self): ...\n\nProcess and write out a node tree as a string. Returns: string representation of node tree. Attributes: node: base node of node tree. stack: WriterStack used to iterate over the node tree. indentation: current string used for indents in the output. newline: current string used for line termination in the output. depth: current output depth (i.e. 
number of indents). codenode.writer.WriterStack:\n\nclass WriterStack: ...\n\nA stack of iterators, used by the Writer class to traverse node trees. Each instance is intended to be used once then discarded. Methods: push:\n\nclass WriterStack:\n    def push(self, node: 'NodeType'): ...\n\nConverts a node to an iterator then places it at the top of the stack. Parameters: node: iterable node. __iter__:\n\nclass WriterStack:\n    def __iter__(self) -> 'Iterable[NodeType]': ...\n\nContinually iterates the top iterator in the stack's items, yielding each result then popping each iterator off when they are exhausted. Attributes: items: collections.deque - current items in the stack. codenode.nodes.newline.Newline:\n\nclass Newline: ...\n\nNodes that represent the end of a line. codenode.nodes.depth_change.DepthChange:\n\nclass DepthChange: ...\n\nNodes that represent a change in indentation depth. Methods: new_depth_for:\n\nclass DepthChange:\n    def new_depth_for(self, depth: int) -> int: ...\n\nMethod used to calculate the new depth based on the current one. Parameters: depth: current depth. Returns: new depth. codenode.nodes.depth_change.RelativeDepthChange:\n\nclass RelativeDepthChange: ...\n\nNodes that represent a change in indentation depth relative to the current depth by some preset amount. Methods: __init__:\n\nclass RelativeDepthChange:\n    def __init__(self, offset: int): ...\n\nParameters: offset: amount by which to increase/decrease depth. Attributes: offset: amount by which to increase/decrease depth when this node is processed. codenode.nodes.depth_change.AbsoluteDepthChange:\n\nclass AbsoluteDepthChange: ...\n\nNodes that represent a change in indentation depth without taking the current depth into account. Methods: __init__:\n\nclass AbsoluteDepthChange:\n    def __init__(self, value: int): ...\n\nParameters: value: value to set depth to. Attributes: value: value to which depth will be set when this node is processed. codenode.nodes.indentation.Indentation:\n\nclass Indentation: ...\n\nNodes that represent indentation whitespace at the start of a line. Methods: indents_for:\n\nclass Indentation:\n    def indents_for(self, depth: int) -> int: ...\n\nParameters: depth: current 
depth. Returns: number of indents to include in whitespace when this node is processed. codenode.nodes.indentation.RelativeIndentation:\n\nclass RelativeIndentation: ...\n\nNodes that represent indentation whitespace at the start of a line, with a number of indents relative to the current depth by some preset amount. Methods: __init__:\n\nclass RelativeIndentation:\n    def __init__(self, offset: int): ...\n\nParameters: offset: amount of indents relative to the current depth. Attributes: offset: amount of indents relative to the current depth that will be output when this node is processed. codenode.nodes.indentation.AbsoluteIndentation:\n\nclass AbsoluteIndentation: ...\n\nNodes that represent indentation whitespace at the start of a line, with a number of indents independent of the current depth. Methods: __init__:\n\nclass AbsoluteIndentation:\n    def __init__(self, value: int): ...\n\nParameters: value: amount of indents. Attributes: value: amount of indents that will be output when this node is processed. codenode.nodes.indentation.CurrentIndentation:\n\nclass CurrentIndentation: ...\n\nNodes that represent indentation whitespace at the start of a line, with a number of indents equal to the current depth. codenode.debug.debug_patch:\n\ndef debug_patch(writer_type: typing.Type[Writer]) -> typing.Type[Writer]: ...\n\nCreates a modified version of a writer type which prints out some extra info when encountering an error to give a better ballpark idea of what caused it. Used in codenode.dump/dumps to implement the debug parameter. Parameters: writer_type: base writer type. Returns: new child writer type with debug modifications."} {"package": "0x-json-schemas", "pacakge-description": "0x-json-schemas: 0x JSON schemas for those developing on top of 0x protocol. Read the documentation. Installing: pip install 0x-json-schemas. Contributing: We welcome improvements and fixes from the wider community! 
To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Install Code and Dependencies: pip install -e .[dev]. Test: ./setup.py test. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info."} {"package": "0x-middlewares", "pacakge-description": "0x-middlewares: Web3 middlewares for 0x applications. Read the documentation. Installing: pip install 0x-middlewares. Contributing: We welcome improvements and fixes from the wider community! To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Install Code and Dependencies: Ensure that you have installed Python >=3.6 and Docker. Then: pip install -e .[dev]. Test: Tests depend on running a local ethereum JSON-RPC server. For convenience, a docker container is provided that has ganache-cli. A shortcut is provided to run that docker container: ./setup.py ganache. With that running, the tests can be run with ./setup.py test. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info."} {"package": "0xmpp", "pacakge-description": "pip install 0xmpp"} {"package": "0x-order-utils", "pacakge-description": "0x-order-utils: 0x order-related utilities for those developing on top of 0x protocol. Read the documentation. Installing: pip install 0x-order-utils. Contributing: We welcome improvements and fixes from the wider community! To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Install Code and Dependencies: Ensure that you have installed Python >=3.6 and Docker. Then: pip install -e .[dev]. Test: Tests depend on a running ganache instance with the 0x contracts deployed in it. For convenience, a docker container is provided that has ganache-cli and a snapshot containing the necessary contracts. A shortcut is provided to run that docker container: ./setup.py ganache. With that running, the tests can be run with ./setup.py test. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info."} {"package": "0x-python", "pacakge-description": "0x Python Wrapper. API docs can be found here: https://0x.org/docs/api. DEV environment: Install test dependencies: python3 setup.py install -e [.dev]. Run some tests: python3 -m unittest tests.test_0x. Run all tests: python3 -m unittest discover -s ./tests"} {"package": "0x-sra-client", "pacakge-description": "0x-sra-client: A Python client for interacting with servers conforming to the Standard Relayer API specification. Read the documentation. Schemas: The JSON schemas for the API payloads and responses can be found in @0xproject/json-schemas. Examples of each payload and response can be found in the 0x.js library's test suite. pip install 0x-json-schemas. You can easily validate your API's payloads and responses using the 0x-json-schemas package:\n\nfrom zero_ex.json_schemas import assert_valid\nfrom zero_ex.order_utils import Order\n\norder: Order = {\n    'makerAddress': \"0x0000000000000000000000000000000000000000\",\n    'takerAddress': \"0x0000000000000000000000000000000000000000\",\n    'feeRecipientAddress': \"0x0000000000000000000000000000000000000000\",\n    'senderAddress': \"0x0000000000000000000000000000000000000000\",\n    'makerAssetAmount': \"1000000000000000000\",\n    'takerAssetAmount': \"1000000000000000000\",\n    'makerFee': \"0\",\n    'takerFee': \"0\",\n    'expirationTimeSeconds': \"12345\",\n    'salt': \"12345\",\n    'makerAssetData': \"0x0000000000000000000000000000000000000000\",\n    'takerAssetData': \"0x0000000000000000000000000000000000000000\",\n    'exchangeAddress': \"0x0000000000000000000000000000000000000000\",\n}\nassert_valid(order, \"/orderSchema\")\n\nPagination: Requests that return potentially large collections should respond to 
the ?page and ?perPage parameters. For example: $ curl https://api.example-relayer.com/v2/asset_pairs?page=3&perPage=20. Page numbering should be 1-indexed, not 0-indexed. If a query provides an unreasonable (i.e. too high) perPage value, the response can return a validation error as specified in the errors section. If the query specifies a page that does not exist (i.e. there are not enough records), the response should just return an empty records array. All endpoints that are paginated should return a total, page, perPage and a records value in the top level of the collection. The value of total should be the total number of records for a given query, whereas records should be an array representing the response to the query for that page. page and perPage are the same values that were specified in the request. See the note in miscellaneous about formatting snake_case vs. lowerCamelCase. These requests include the /v2/asset_pairs, /v2/orders, /v2/fee_recipients and /v2/orderbook endpoints. Network Id: All requests should be able to specify a ?networkId query param for all supported networks. For example: $ curl https://api.example-relayer.com/v2/asset_pairs?networkId=1. If the query param is not provided, it should default to 1 (mainnet). Networks and their Ids: 1 = Mainnet, 42 = Kovan, 3 = Ropsten, 4 = Rinkeby. If a certain network is not supported, the response should 400 as specified in the error response section. For example:\n\n{\"code\": 100, \"reason\": \"Validation failed\", \"validationErrors\": [{\"field\": \"networkId\", \"code\": 1006, \"reason\": \"Network id 42 is not supported\"}]}\n\nLink Header: A Link Header can be included in a response to provide clients with more context about paging. For example: Link: ; rel=\"next\", ; rel=\"last\". This Link response header contains one or more Hypermedia link relations. The possible rel values are: next (the link relation for the immediate next page of results), last (the link relation for the last page of results), first (the link relation for the first page of results), prev (the link relation for the immediate previous page of results). Rate Limits: Rate limit guidance for clients can be optionally returned in the response headers: X-RateLimit-Limit (the maximum number of requests you're permitted to make per hour), X-RateLimit-Remaining (the number of requests remaining in the current rate limit window), X-RateLimit-Reset (the time at which the current rate limit window resets in UTC epoch seconds). For example:\n\n$ curl -i https://api.example-relayer.com/v2/asset_pairs\nHTTP/1.1 200 OK\nDate: Mon, 20 Oct 2017 12:30:06 GMT\nStatus: 200 OK\nX-RateLimit-Limit: 60\nX-RateLimit-Remaining: 56\nX-RateLimit-Reset: 1372700873\n\nWhen a rate limit is exceeded, a status of 429 Too Many Requests should be returned. Errors: Unless the spec defines otherwise, errors to bad requests should respond with HTTP 4xx status codes. Common error codes: 400 Bad Request \u2013 invalid request format; 404 Not Found; 429 Too Many Requests \u2013 rate limit exceeded; 500 Internal Server Error; 501 Not Implemented. Error reporting format: For all 400 responses, see the error response schema.\n\n{\"code\": 101, \"reason\": \"Validation failed\", \"validationErrors\": [{\"field\": \"maker\", \"code\": 1002, \"reason\": \"Invalid address\"}]}\n\nGeneral error codes: 100 - Validation Failed; 101 - Malformed JSON; 102 - Order submission disabled; 103 - Throttled. Validation error codes: 1000 - Required field; 1001 - Incorrect format; 1002 - Invalid address; 1003 - Address not supported; 1004 - Value out of range; 1005 - Invalid signature or hash; 1006 - Unsupported option. Asset Data Encoding: As we now support multiple token transfer proxies, the identifier of which proxy to use for the token transfer must be encoded, along with the token information. Each proxy in 0x v2 has a unique identifier. If you're using 0x.js there will be helper methods for this encoding and decoding. The identifier for the Proxy uses a similar scheme to ABI function selectors.\n\n// ERC20 Proxy ID 0xf47261b0\nbytes4(keccak256('ERC20Token(address)'));\n// ERC721 Proxy ID 0x02571792\nbytes4(keccak256('ERC721Token(address,uint256)'));\n\nAsset data is encoded using ABI encoding. For example, encoding the ERC20 token contract (address: 0x1dc4c1cefef38a777b15aa20260a54e584b16c48) using the ERC20 Transfer Proxy (id: 0xf47261b0) would be: 0xf47261b00000000000000000000000001dc4c1cefef38a777b15aa20260a54e584b16c48. Encoding the ERC721 token contract (address: 0x371b13d97f4bf77d724e78c16b7dc74099f40e84), token id (id: 99, which hex encoded is 0x63) and the ERC721 Transfer Proxy (id: 0x02571792) would be: 0x02571792000000000000000000000000371b13d97f4bf77d724e78c16b7dc74099f40e840000000000000000000000000000000000000000000000000000000000000063. For more information see the Asset Proxy section of the v2 spec and the Ethereum ABI Spec. Meta Data in Order Responses: In v2 of the standard relayer API we added the metaData field. It is meant to provide a standard place for relayers to put optional, custom or non-standard fields that may be of interest to the consumer of the API. A good example of such a field is remainingTakerAssetAmount, which is a convenience field that communicates how much of a 0x order is potentially left to be filled. 
Unlike the other fields in a 0x order, it is not guaranteed to be correct as it is derived from whatever mechanism the implementer (i.e. the relayer) is using. While convenient for prototyping and low-stakes situations, we recommend validating the value of the field by checking the state of the blockchain yourself. Misc.: All requests and responses should be of application/json content type. All token amounts are sent in amounts of the smallest level of precision (base units) (e.g. if a token has 18 decimal places, selling 1 token would show up as selling '1000000000000000000' units by this API). All addresses are sent as lower-case (non-checksummed) Ethereum addresses with the 0x prefix. All parameters are to be written in lowerCamelCase. This Python package is automatically generated by the OpenAPI Generator project: API version: 2.0.0; Package version: 1.0.0; Build package: org.openapitools.codegen.languages.PythonClientCodegen. Requirements: Python 2.7 and 3.4+. Installation & Usage: pip install: If the python package is hosted on Github, you can install directly from Github: pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git (you may need to run pip with root permission: sudo pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git). Then import the package: import sra_client. Setuptools: Install via Setuptools: python setup.py install --user (or sudo python setup.py install to install the package for all users). Then import the package: import sra_client. Getting Started: Please follow the installation procedure and then run the following:\n\nfrom __future__ import print_function\nimport time\nimport sra_client\nfrom sra_client.rest import ApiException\nfrom pprint import pprint\n\n# create an instance of the API class\napi_instance = sra_client.DefaultApi(sra_client.ApiClient(configuration))\nasset_data_a = 0xf47261b04c32345ced77393b3530b1eed0f346429d  # str | The assetData value for the first asset in the pair. (optional)\nasset_data_b = 0x0257179264389b814a946f3e92105513705ca6b990  # str | The assetData value for the second asset in the pair. (optional)\nnetwork_id = 42  # float | The id of the Ethereum network (optional) (default to 1)\npage = 3  # float | The number of the page to request in the collection. (optional) (default to 1)\nper_page = 10  # float | The number of records to return per page. (optional) (default to 100)\n\ntry:\n    api_response = api_instance.get_asset_pairs(\n        asset_data_a=asset_data_a, asset_data_b=asset_data_b,\n        network_id=network_id, page=page, per_page=per_page)\n    pprint(api_response)\nexcept ApiException as e:\n    print(\"Exception when calling DefaultApi->get_asset_pairs: %s\\n\" % e)\n\nContributing: We welcome improvements and fixes from the wider community! To report bugs within this package, please create an issue in this repository. Please read our contribution guidelines before getting started. Install Code and Dependencies: Ensure that you have installed Python >=3.6, Docker, and docker-compose. Then: pip install -e .[dev]. Test: Tests depend on a running instance of 0x-launch-kit-backend, backed by a Ganache node with the 0x contracts deployed in it. For convenience, a docker-compose file is provided that creates this environment, and a shortcut is provided to interface with that file: ./setup.py start_test_relayer will start those services. With them running, the tests can be run with ./setup.py test. When you're done with testing, you can ./setup.py stop_test_relayer. Clean: ./setup.py clean --all. Lint: ./setup.py lint. Build Documentation: ./setup.py build_sphinx. More: see ./setup.py --help-commands for more info. Documentation for API Endpoints: All URIs are relative to http://localhost. DefaultApi.get_asset_pairs: GET /v2/asset_pairs; DefaultApi.get_fee_recipients: GET /v2/fee_recipients; DefaultApi.get_order: GET /v2/order/{orderHash}; DefaultApi.get_order_config: POST /v2/order_config; DefaultApi.get_orderbook: GET /v2/orderbook; DefaultApi.get_orders: GET /v2/orders; DefaultApi.post_order: POST /v2/order. Documentation For Models: OrderSchema, PaginatedCollectionSchema, RelayerApiAssetDataPairsResponseSchema, RelayerApiAssetDataTradeInfoSchema, RelayerApiErrorResponseSchema, RelayerApiErrorResponseSchemaValidationErrors, RelayerApiFeeRecipientsResponseSchema, RelayerApiOrderConfigPayloadSchema, RelayerApiOrderConfigResponseSchema, RelayerApiOrderSchema, RelayerApiOrderbookResponseSchema, RelayerApiOrdersChannelSubscribePayloadSchema, RelayerApiOrdersChannelSubscribeSchema, RelayerApiOrdersChannelUpdateSchema, RelayerApiOrdersResponseSchema, SignedOrderSchema. Documentation For Authorization: All endpoints do not require authorization."} {"package": "0x-web3", "pacakge-description": "# Web3.py\n\n0x-web3 is a temporary fork of web3. It adds primitive support for ABI tuples, which is needed in order to facilitate calling the 0x smart contracts. 
The fork\u2019s changes to web3.py are visible in an open PR, and when that PR (or something analogous) is merged, this package will be taken down.[![Join the chat at https://gitter.im/ethereum/web3.py](https://badges.gitter.im/ethereum/web3.py.svg)](https://gitter.im/ethereum/web3.py?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)[![Build Status](https://circleci.com/gh/ethereum/web3.py.svg?style=shield)](https://circleci.com/gh/ethereum/web3.py.svg?style=shield)A Python implementation of [web3.js](https://github.com/ethereum/web3.js)* Python 3.5+ supportRead more in the [documentation on ReadTheDocs](http://web3py.readthedocs.io/). [View the change log on Github](docs/releases.rst).## Quickstart```pythonimport jsonimport web3from web3 import Web3, HTTPProvider, TestRPCProviderfrom solc import compile_sourcefrom web3.contract import ConciseContract# Solidity source codecontract_source_code = '''pragma solidity ^0.4.0;contract Greeter {string public greeting;function Greeter() {greeting = 'Hello';}function setGreeting(string _greeting) public {greeting = _greeting;}function greet() constant returns (string) {return greeting;}}'''compiled_sol = compile_source(contract_source_code) # Compiled source codecontract_interface = compiled_sol[':Greeter']# web3.py instancew3 = Web3(TestRPCProvider())# Instantiate and deploy contractcontract = w3.eth.contract(abi=contract_interface['abi'], bytecode=contract_interface['bin'])# Get transaction hash from deployed contracttx_hash = contract.deploy(transaction={'from': w3.eth.accounts[0], 'gas': 410000})# Get tx receipt to get contract addresstx_receipt = w3.eth.getTransactionReceipt(tx_hash)contract_address = tx_receipt['contractAddress']# Contract instance in concise modeabi = contract_interface['abi']contract_instance = w3.eth.contract(address=contract_address, abi=abi,ContractFactoryClass=ConciseContract)# Getters + Setters for web3.eth.contract objectprint('Contract value: 
{}'.format(contract_instance.greet()))contract_instance.setGreeting('Nihao', transact={'from': w3.eth.accounts[0]})print('Setting value to: Nihao')print('Contract value: {}'.format(contract_instance.greet()))```## Developer Setup```shgit clone git@github.com:ethereum/web3.py.gitcd web3.py```Please see OS-specific instructions for:- [Linux](docs/README-linux.md#Developer-Setup)- [Mac](docs/README-osx.md#Developer-Setup)- [Windows](docs/README-windows.md#Developer-Setup)- [FreeBSD](docs/README-freebsd.md#Developer-Setup)Then run these install commands:```shvirtualenv venv. venv/bin/activatepip install -e .[dev]```For different environments, you can set up multiple `virtualenv`. For example, if you want to create a `venvdocs`, then you do the following:```shvirtualenv venvdocs. venvdocs/bin/activatepip install -e .[docs]pip install -e .```## Using DockerIf you would like to develop and test inside a Docker environment, use the *sandbox* container provided in the **docker-compose.yml** file.To start up the test environment, run:```docker-compose up -d```This will build a Docker container set up with an environment to run the Python test code.**Note: This container does not have `go-ethereum` installed, so you cannot run the go-ethereum test suite.**To run the Python tests from your local machine:```docker-compose exec sandbox bash -c 'pytest -n 4 -f -k \"not goethereum\"'```You can run arbitrary commands inside the Docker container by using the `bash -c` prefix.```docker-compose exec sandbox bash -c ''```Or, if you would like to just open a session to the container, run:```docker-compose exec sandbox bash```### Testing SetupDuring development, you might like to have tests run on every file save.Show flake8 errors on file change:```sh# Test flake8when-changed -v -s -r -1 web3/ tests/ ens/ -c \"clear; flake8 web3 tests ens && echo 'flake8 success' || echo 'error'\"```You can use `pytest-watch`, running one for every Python environment:```shpip install pytest-watchcd 
venvptw --onfail \"notify-send -t 5000 'Test failure \u26a0\u26a0\u26a0\u26a0\u26a0' 'python 3 test on web3.py failed'\" ../tests ../web3```Or, you can run multi-process tests in one command, but without color:```sh# in the project root:pytest --numprocesses=4 --looponfail --maxfail=1# the same thing, succinctly:pytest -n 4 -f --maxfail=1```#### How to Execute the Tests?1. [Setup your development environment](https://github.com/ethereum/web3.py/#developer-setup).2. Execute `tox` for the testsThere are multiple [components](https://github.com/ethereum/web3.py/blob/master/.travis.yml#L53) of the tests. You can run test to against specific component. For example:```sh# Run Tests for the Core component (for Python 3.5):tox -e py35-core# Run Tests for the Core component (for Python 3.6):tox -e py36-core```If for some reason it is not working, add `--recreate` params.`tox` is good for testing against the full set of build targets. But if you want to run the tests individually, `py.test` is better for development workflow. 
For example, to run only the tests in one file:```shpy.test tests/core/gas-strategies/test_time_based_gas_price_strategy.py```### Release setupFor Debian-like systems:```apt install pandoc```To release a new version:```shmake release bump=$$VERSION_PART_TO_BUMP$$```#### How to bumpversionThe version format for this repo is `{major}.{minor}.{patch}` for stable, and `{major}.{minor}.{patch}-{stage}.{devnum}` for unstable (`stage` can be alpha or beta).To issue the next version in line, specify which part to bump, like `make release bump=minor` or `make release bump=devnum`.If you are in a beta version, `make release bump=stage` will switch to a stable release.To issue an unstable version when the current version is stable, specify the new version explicitly, like `make release bump=\"--new-version 4.0.0-alpha.1 devnum\"`"} {"package": "1", "pacakge-description": "UNKNOWN"} {"package": "100bot", "pacakge-description": "# 100BotThis is a son-of-[IoBot](https://github.com/adahn6/Io) project, taking the best things about Io and turning them into a monster.A picture is worth 1000 words:![Screenshot of example conversation](example.png)100Bot is powered by the Watson Tone Analyzer service. Each message is carefully parsed for emotional significance and response, so that the perfectly appropriate reaction emoji can be chosen.## Running 100BotThe bot requires three very specific parameters:- A Watson Tone Analyzer `username` credential- A Watson Tone Analyzer `password` credential- A Slack Integration Token for a bot### Running with docker-composeThe easiest way to get 100bot up and running with its dependencies is by using the Docker service file included: `docker-compose.yml`. Modify the supplied `.env-sample` to provide credentials for the Watson Tone Analyzer and a Slack bot. Then build and start the service with:```shelldocker-compose up -d```### Running natively with pythonThe pip modules `slackclient` and `requests` must be installed. Use virtualenv to make your life easier. 
Credentials can be passed with argparse:```shellpython3 100bot.py \\--bx-username \"user\" \\--bx-password \"verysecret\" \\--slack-token \"xoob-137138657231-2ReKEpxlvWwe6vDBripOs7sR\"```or with environment variables:```shellexport BLUEMIX_USERNAME=userexport BLUEMIX_PASSWORD=verysecretexport SLACK_TOKEN=\"xoob-137138657231-2ReKEpxlvWwe6vDBripOs7sR\"python3 100bot.py```(yes, these are all fake credentials; don't try them...)"} {"package": "100-python-projects", "pacakge-description": "No description available on PyPI."} {"package": "101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010101010", "pacakge-description": "welcome to my package"} {"package": "1011903677-siddharth-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101703048-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101703072-handle-missing", "pacakge-description": "No description available on PyPI."} {"package": "101703087-missing-values", "pacakge-description": "No description available on PyPI."} {"package": "101703087-outlier", "pacakge-description": "No description available on PyPI."} {"package": "101703087-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101703088-outlier", "pacakge-description": "No description available on PyPI."} {"package": "101703088-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101703105", "pacakge-description": "No description available on PyPI."} {"package": "101703196-topsis", "pacakge-description": "#Topsis Package\nA Python package to implement topsis on a given dataset.\n##Usage\nRunning the following command in a terminal will output the best and worst decisions for the dataset.python topsis.py dataset_sample.csv 1,1,1,1 0,1,1,0"} {"package": "101703214-assign1-UCS633", "pacakge-description": "No description available on PyPI."} {"package": "101703235-missing-val", 
"pacakge-description": "This is a package to deal with missing values in a dataset.\n101703235\nHitesh gupta\ncoe13"} {"package": "101703272-missing-val", "pacakge-description": "This is a package to deal with missing values in a dataset.\n101703272\nJyot Guransh Singh Dua\ncoe13"} {"package": "101703301-Project1-TOPSIS", "pacakge-description": "TOPSIS ImplementationProject1 UCS633Submitted by Kushagra Thakral 101703301Implementation of TOPSIS for a simple dataset having 4 columns.Run in CMD:python3 project1.py file_name.csv 0.25 0.25 0.25 0.25 a-+++b, where a and b can be replaced by any character except shell script keywords."} {"package": "101703301-Project2", "pacakge-description": "Outlier Detection and Removal using Inter Quartile Range methodProject1 UCS633Submitted by Kushagra Thakral 101703301Implementation of the IQR method to detect and remove outliers from a given dataset.Run in CMD:python3 outlier.py input_file_name.csv output_file_name.csv"} {"package": "101703301-project3", "pacakge-description": "#Handling NULL values in a dataset.\n##Project3 UCS633\n##Submitted by Kushagra Thakral 101703301To run in cmd line:python3 missing.py input_file_name.csvThe count of null values present in each column will be displayed.Then the count of null values present after the use of the package will be displayed (these counts will be zero).Then the dataset will be displayed (first 5 rows).An output file will be created with the name input_file_name_not_null.csvThe package only works for csv files.This package uses the backfill method followed by the forward fill method to handle all the null values."} {"package": "101703311-Missing-Data", "pacakge-description": "Handling Missing DataProject 3 : UCS633Submitted By: Lokesh Arora 101703311pypi:https://pypi.org/project/101703311_Missing_Data/This project is made to handle missing data.LicenseMIT"} {"package": "101703311-OUTLIERS", "pacakge-description": "Outlier Removal Using InterQuartile RangeProject 2 : UCS633Submitted By: Lokesh Arora 
101703311pypi:https://pypi.org/project/101703311_OUTLIERS/InterQuartile Range (IQR) DescriptionAny set of data can be described by its five-number summary. These five numbers, which give you the information you need to find patterns and outliers, consist of:The minimum or lowest value of the dataset.The first quartile Q1, which represents a quarter of the way through the list of all data.The median of the data set, which represents the midpoint of the whole list of data.The third quartile Q3, which represents three-quarters of the way through the list of all data.The maximum or highest value of the data set.These five numbers tell a person more about their data than looking at the numbers all at once could, or at least make this much easier.Calculation of IQR:IQR = Q3 - Q1MIN = Q1 - (1.5 × IQR)MAX = Q3 + (1.5 × IQR)InstallationUse the package manager pip to install 101703311_OUTLIERS.pip install 101703311_OUTLIERSHow to use this package:101703311_OUTLIERS can be run as shown below:In Command Prompt>> outlierRemoval dataset.csvSample dataset
| Marks | Students |
|-------|----------|
| 3     | Student1 |
| 57    | Student2 |
| 65    | Student3 |
| 98    | Student4 |
| 43    | Student5 |
| 44    | Student6 |
| 54    | Student7 |
| 99    | Student8 |
| 1     | Student9 |
Output Dataset after Removal
| Marks | Students |
|-------|----------|
| 57    | Student2 |
| 65    | Student3 |
| 98    | Student4 |
| 43    | Student5 |
| 44    | Student6 |
| 54    | Student7 |
It is clearly visible that the rows containing Student1, Student8 and Student9 have been removed because they are outliers.LicenseMIT"} {"package": "101703312-outlierRemoval", "pacakge-description": "Outlier Removal Using InterQuartile RangeProject 2 : UCS633Submitted By: Lovish Jindal 101703312pypi:https://pypi.org/project/101703312_outlierRemoval/InterQuartile Range (IQR) DescriptionAny set of data can be described by its five-number summary. 
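The IQR rule these packages describe can be sketched in a few lines of pandas. The function and column names below are illustrative (not the packages' actual code); the data is the sample dataset from the description, and with pandas' default linear quantile interpolation the result matches the sample output:

```python
import pandas as pd

def remove_outliers_iqr(df, column):
    """Keep only rows whose value lies within [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1 = df[column].quantile(0.25)
    q3 = df[column].quantile(0.75)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return df[df[column].between(lo, hi)]  # between() is inclusive on both ends

marks = pd.DataFrame({
    "Marks": [3, 57, 65, 98, 43, 44, 54, 99, 1],
    "Students": [f"Student{i}" for i in range(1, 10)],
})
cleaned = remove_outliers_iqr(marks, "Marks")
# Student1, Student8 and Student9 are dropped, matching the sample output
```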
These five numbers, which give you the information you need to find patterns and outliers, consist of:The minimum or lowest value of the dataset.The first quartile Q1, which represents a quarter of the way through the list of all data.The median of the data set, which represents the midpoint of the whole list of data.The third quartile Q3, which represents three-quarters of the way through the list of all data.The maximum or highest value of the data set.These five numbers tell a person more about their data than looking at the numbers all at once could, or at least make this much easier.Calculation of IQR:IQR = Q3 - Q1MIN = Q1 - (1.5 × IQR)MAX = Q3 + (1.5 × IQR)InstallationUse the package manager pip to install 101703312_outlierRemoval.pip install 101703312_outlierRemovalHow to use this package:101703312_outlierRemoval can be run as shown below:In Command Prompt>> outlierRemoval dataset.csvSample dataset
| Marks | Students |
|-------|----------|
| 3     | Student1 |
| 57    | Student2 |
| 65    | Student3 |
| 98    | Student4 |
| 43    | Student5 |
| 44    | Student6 |
| 54    | Student7 |
| 99    | Student8 |
| 1     | Student9 |
Output Dataset after Removal
| Marks | Students |
|-------|----------|
| 57    | Student2 |
| 65    | Student3 |
| 98    | Student4 |
| 43    | Student5 |
| 44    | Student6 |
| 54    | Student7 |
It is clearly visible that the rows containing Student1, Student8 and Student9 have been removed because they are outliers.LicenseMIT"} {"package": "101703322-missing-val", "pacakge-description": "This is a package to deal with missing values in a dataset.\n101703272\nJyot Guransh Singh Dua\ncoe13"} {"package": "101703373-outlier", "pacakge-description": "No description available on PyPI."} {"package": "101703373-topsis", "pacakge-description": "TOPSIS PackageTOPSIS stands for Technique for Order Preference by Similarity to Ideal Solution. It is a method of compensatory aggregation that compares a set of alternatives by identifying weights for each criterion, normalising scores for each criterion, and calculating the geometric distance between each alternative and the ideal alternative, which has the best score in each criterion. 
An assumption of TOPSIS is that the criteria are monotonically increasing or decreasing. In this Python package, vector normalization has been implemented.This package has been created based on Project 1 of course UCS633. Nikhil Vyas COE-17 101703373In Command Prompt:topsis data.csv \"1,1,1,1\" \"+,+,-,+\""} {"package": "101703375-p2", "pacakge-description": "A library capable of removing outliers from a pandas dataframe.PROJECT 2, UCS633 - Data Analysis and Visualization\nNishant Dhanda \nCOE17\nRoll number: 101703375"} {"package": "101703378-project2", "pacakge-description": "This project is used to remove outliers.\nNitin (101703378)\nCOE17"} {"package": "101703381-outlier", "pacakge-description": "No description available on PyPI."} {"package": "101703383-python-package2", "pacakge-description": "One of my exposures to python made me understand that Python is one of the powerful languages we have."} {"package": "101703476-samiksha", "pacakge-description": "No description available on PyPI."} {"package": "101703488-sargun", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "101703549-missing-val", "pacakge-description": "This is a package to deal with missing values in a dataset.\n101703549\nsimranpreet kaur\ncoe15"} {"package": "101703573-Missing-pkg-suruchipundir", "pacakge-description": "Filling Missing ValuesMissing data can occur when no information is provided for one or more items or for a whole unit. Missing data is a very big problem in real-life scenarios. Missing data is also referred to as NA (Not Available) values in pandas. Many datasets simply arrive with missing data, either because it existed but was not collected or because it never existed.\nIn this package, the missing values in a csv file are filled using the fillna function in pandas. 
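Mean imputation with `fillna`, as this description outlines, can be sketched as follows (the function and column names are illustrative, not the package's actual code):

```python
import pandas as pd

def fill_missing_with_mean(df):
    """Fill NA cells in every numeric column with that column's mean."""
    out = df.copy()
    for col in out.select_dtypes(include="number").columns:
        out[col] = out[col].fillna(out[col].mean())
    return out

df = pd.DataFrame({"Marks": [10.0, None, 20.0], "Student": ["a", "b", "c"]})
filled = fill_missing_with_mean(df)
# the NA in Marks becomes the column mean, 15.0
```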
For this, the statistical mean is used.Usage$ python3 missing.py filename"} {"package": "101703573-Outlier-pkg-suruchipundir", "pacakge-description": "In statistics, an outlier is an observation point that is distant from other observations. Outliers can be the result of a mistake during data collection, or they can simply be an indication of variance in your data. The package uses the interquartile range method to detect and remove outliers from a csv file.To run on the command line:\npython3 outlier.py filename"} {"package": "101703573-Topsis-pkg-suruchipundir", "pacakge-description": "The Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) is a multi-criteria decision analysis method, which was originally developed by Ching-Lai Hwang and Yoon in 1981, with further developments by Yoon in 1987, and Hwang, Lai and Liu in 1993. TOPSIS is based on the concept that the chosen alternative should have the shortest geometric distance from the positive ideal solution (PIS) and the longest geometric distance from the negative ideal solution (NIS).To run on the command line:\npython3 topsis.py diabetes.csv \"1,2,5,10,1,2,1,1,2\" \"-,+,+, -, +, +,+,-, +\""} {"package": "101703604-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101903117-shivam-singal", "pacakge-description": "No description available on PyPI."} {"package": "101903140", "pacakge-description": "No description available on PyPI."} {"package": "101903683-kunal-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101903688", "pacakge-description": "No description available on PyPI."} {"package": "101903697-Topsis-code", "pacakge-description": "A Python package implementing the Topsis method used for multi-criteria decision analysis. 
Topsis stands for Technique for Order of Preference by Similarity to Ideal Solution."} {"package": "101903700-Topsis-code", "pacakge-description": "A Python package implementing the Topsis method used for multi-criteria decision analysis. Topsis stands for Technique for Order of Preference by Similarity to Ideal Solution."} {"package": "101903751-topsis", "pacakge-description": "MOKSHIT GOGIA\nASSIGNMENT 4\n101903751"} {"package": "101903755", "pacakge-description": "No description available on PyPI."} {"package": "101903762", "pacakge-description": "No description available on PyPI."} {"package": "101917149-topsis", "pacakge-description": "No description available on PyPI."} {"package": "101hello-0.0.1-redish101", "pacakge-description": "## README.md\nIt's an example pack"} {"package": "102003017", "pacakge-description": "102003017It computes the Topsis score and, based on that, calculates the rank.Installationpip install 102003017License© 2023 Srishti SharmaThis repository is licensed under the MIT license. 
See LICENSE for details."} {"package": "102003037-topsis", "pacakge-description": "102003037 TOPSIS PACKAGE HIMANGI SHARMARoll Number : 102003037Subgroup : 3COE18The program takes as inputs a csv file containing the data to be ranked, weights, and impacts in the form of \"+\" or \"-\", separated by commas, and then outputs a resultant csv file with two additional columns: performance score and rank.What is TOPSISTOPSIS, Technique of Order Preference Similarity to the Ideal Solution, is a multi-criteria decision analysis method (MCDA).It chooses the alternative with the shortest Euclidean distance from the ideal solution and the greatest distance from the negative ideal solution.InstallationHow to install the TOPSIS package using pip:pip install 102003037-topsis-HimangiFor Calculating the TOPSIS ScoreOpen a terminal and type102003037 102003037-data.csv \"1,1,1,1\" \"+,+,-,+\" 102003037-output.csvThe output will then be saved in a newly created CSV file whose name is provided on the command line by the user.Input File [102003037-data.csv]:Topsis mathematical operations are performed on the input file, which contains a dataset having different fields.Weights [\"1,1,1,1\"]:The weights to be assigned to the different parameters in the dataset should be passed in the argument, separated by commas.Impacts [\"+,+,-,+\"]:The impacts are passed to indicate which parameters have a positive impact on the decision and which have a negative impact. 
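Parsing and validating comma-separated weight and impact arguments like these can be sketched as follows (the function name and error messages are illustrative, not the package's actual code):

```python
import sys

def parse_weights_impacts(weights_str, impacts_str, n_criteria):
    """Validate the comma-separated weight/impact strings the way these
    package descriptions require (matching counts, impacts only '+'/'-')."""
    weights = [float(w) for w in weights_str.split(",")]
    impacts = [i.strip() for i in impacts_str.split(",")]
    if not (len(weights) == len(impacts) == n_criteria):
        sys.exit("number of weights, impacts and criteria columns must be the same")
    if any(i not in ("+", "-") for i in impacts):
        sys.exit("impacts must be '+' or '-', separated by commas")
    return weights, impacts

weights, impacts = parse_weights_impacts("1,1,1,1", "+,+,-,+", n_criteria=4)
```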
Only '+' and '-' values should be passed, and they should be separated with ',' only.Output File [102003037-output.csv]:This argument is used to pass the path of the result file where we want the rank and performance score to be stored."} {"package": "102003050-topsis", "pacakge-description": "No description available on PyPI."} {"package": "102003053", "pacakge-description": "102003053It calculates Topsis scores for given data in a csv file.Installationpip install 102003053License© 2023 Amit KumarThis repository is licensed under the MIT license. See LICENSE for details."} {"package": "102003105", "pacakge-description": "# Topsis Value Calculator\nSelection of an appropriate Multiple Attribute Decision Making (MADM) method for providing a solution to a given MADM problem is always a challenging endeavour. The challenge is even greater in situations where, for a specific MADM problem, there exist multiple MADM methods with a similar degree of suitability. The Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) helps solve MADM problems.This is a Python package implementing the TOPSIS method for multi-criteria decision analysis.## Installation$ pip install TOPSIS-102003105On the command line, you can write:$ python \n E.g. for an input data file data.csv, the command will be like$ python 102003105.py 102003105-data.csv \"0,1,1,1,2,1\" \"+,-,-,+,-,+\" 102003105-Result1.csvThis will give the output in the 102003105-Result1.csv fileLicense -> MIT"} {"package": "102003161-Hunar-Topsis", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} {"package": "102003171-Calc", "pacakge-description": "No description available on PyPI."} {"package": "102003634", "pacakge-description": "No description available on PyPI."} {"package": "102003646-Topsis", "pacakge-description": "No description available on PyPI."} {"package": "102003712", "pacakge-description": "TopsisIt takes 4 arguments:1. Data.csv file2. Weights3. Impacts4. Result fileReturns a file with ranks and Topsis scores.How to use it?Open a terminal and type: topsis <data file name> <weights> <impacts> <result file name>ex: topsis data.csv 1,1,1,1 +,+,-,+ result.csv"} {"package": "102003759", "pacakge-description": ".."} {"package": "102003766-topsis", "pacakge-description": "TopsisTOPSIS is based on the fundamental premise that the best solution has the shortest distance from the positive-ideal solution and the longest distance from the negative-ideal one. Alternatives are ranked with the use of an overall index calculated based on the distances from the ideal solutions.It takes 4 arguments:\n1. Data.csv file\n2. Weights\n3. Impacts\n4. Result fileReturns a file with ranks and Topsis scores.How to use it?Open a terminal and type: topsis <data file name> <weights> <impacts> <result file name>ex: topsis data.csv 1,1,1,1 +,+,-,+ result.csv"} {"package": "102017059-Aakanksha-Topsis", "pacakge-description": "102017059_Aakanksha_TopsisThis package is an implementation of multi-criteria decision analysis using Topsis. This package accepts three arguments during file execution:dataset.csv // file which contains the models and parameters\nstring of weights separated by commas (,)\nstring of requirements (+/-) separated by commas (,) // important: install the pandas, sys, operator and math libraries before installing this // You can install this package using the following command: pip install 102017059_Aakanksha_Topsis"} {"package": "102017067-topsis", "pacakge-description": "TOPSIS PackageTOPSIS stands for Technique for Order Preference by Similarity to Ideal Solution. 
It is a method of compensatory aggregation that compares a set of alternatives by identifying weights for each criterion, normalising scores for each criterion, and calculating the geometric distance between each alternative and the ideal alternative, which has the best score in each criterion. An assumption of TOPSIS is that the criteria are monotonically increasing or decreasing. In this Python package, vector normalization has been implemented.This package has been created based on Project 1 of course UCS633. Tarandeep Singh 102017067In Command Prompt:topsis data.csv \"1,1,1,1\" \"+,+,-,+\""} {"package": "102017119-topsis", "pacakge-description": "TOPSIS PackageTOPSIS stands for Technique for Order Preference by Similarity to Ideal Solution. It is a method of compensatory aggregation that compares a set of alternatives by identifying weights for each criterion, normalising scores for each criterion, and calculating the geometric distance between each alternative and the ideal alternative, which has the best score in each criterion. An assumption of TOPSIS is that the criteria are monotonically increasing or decreasing. In this Python package, vector normalization has been implemented.This package has been created based on Assignment 1 of course UCS654. 
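The procedure these descriptions outline — vector normalisation, weighting, ideal and negative-ideal points, Euclidean distances, closeness score — can be sketched in NumPy. This is an illustrative implementation, not any particular package's code; the example data follows the phone-model sample used by several of these packages:

```python
import numpy as np

def topsis(matrix, weights, impacts):
    """Score and rank alternatives (rows) against criteria (columns).

    impacts: '+' for a benefit criterion, '-' for a cost criterion.
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                             # normalise the weights
    v = m / np.sqrt((m ** 2).sum(axis=0)) * w   # vector normalisation, then weighting
    plus = np.array(impacts) == '+'
    ideal = np.where(plus, v.max(axis=0), v.min(axis=0))   # positive-ideal solution
    nadir = np.where(plus, v.min(axis=0), v.max(axis=0))   # negative-ideal solution
    d_ideal = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_nadir = np.sqrt(((v - nadir) ** 2).sum(axis=1))
    score = d_nadir / (d_ideal + d_nadir)       # closeness: 1 = at the ideal point
    rank = score.argsort()[::-1].argsort() + 1  # rank 1 = highest score
    return score, rank

score, rank = topsis(
    [[16, 12, 250, 5], [16, 8, 200, 3], [32, 16, 300, 4], [32, 8, 275, 4], [16, 16, 225, 2]],
    weights=[1, 1, 1, 1],
    impacts=['+', '+', '-', '+'],
)
# rank -> [3, 5, 1, 2, 4]: M3 is closest to the ideal solution
```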
Prince Sharma 102017119In Command Prompt:topsis data.csv \"1,1,1,1\" \"+,+,-,+\""} {"package": "102053005-Aditya-Topsis", "pacakge-description": "Topsis Package by Aditya KalhanRoll Number : 102053005\nSubgroup : 3COE18\nIt takes a csv file, weights (separated by commas), impacts (+ or -), and outputs a result file.What is TopsisTOPSIS is based on the fundamental premise that the best solution has the shortest distance from the positive-ideal solution and the longest distance from the negative-ideal one.Selecting an appropriate Multiple Attribute Decision Making (MADM) method for a given MADM problem is always a challenging task.Within the MADM domain, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is a highly regarded, applied and adopted MADM method due to its simplicity and its underlying concept that the best solution is the one closest to the positive ideal solution and furthest from the negative ideal solution.InstallationUse the pip installer:pip install 102053005-Aditya-TopsisHow to use itOpen a terminal and type:102053005 sample.csv \"1,1,1,1\" \"+,+,-,+\" result.csvExample
| Model | Storage Space | Camera | Price | Looks |
|-------|---------------|--------|-------|-------|
| M1    | 16            | 12     | 250   | 5     |
| M2    | 16            | 8      | 200   | 3     |
| M3    | 32            | 16     | 300   | 4     |
| M4    | 32            | 8      | 275   | 4     |
| M5    | 16            | 16     | 225   | 2     |
weights = [1,1,1,1]impact = [\"+\",\"+\",\"-\",\"+\"]Output
| Model | Storage Space | Camera | Price | Looks | Topsis Score | Rank |
|-------|---------------|--------|-------|-------|--------------|------|
| M1    | 16            | 12     | 250   | 5     | 0.534269     | 3    |
| M2    | 16            | 8      | 200   | 3     | 0.308314     | 5    |
| M3    | 32            | 16     | 300   | 4     | 0.691686     | 1    |
| M4    | 32            | 8      | 275   | 4     | 0.534807     | 2    |
| M5    | 16            | 16     | 225   | 2     | 0.401222     | 4    |
The output will be saved in a CSV file whose name is provided on the command line.\nIt will contain all the original columns along with the Topsis score and rank."} {"package": "102053010", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} {"package": "102053024", "pacakge-description": "Python code of Topsis.Change Log0.0.1 (19/04/2020)First Release"} {"package": "102053042TOPSIS", "pacakge-description": "This code is used to implement the Topsis model.Change Log0.0.1 (22/01/2023)First Release"} {"package": "1020-nester", "pacakge-description": "UNKNOWN"} {"package": "102103354-shrey-topsis", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "102116042-topsis", "pacakge-description": "No description available on PyPI."} {"package": "102116078-topsis-py", "pacakge-description": "Topsis-PyImplementation of Topsis in Python by roll no. 102116078Installationpip install topsis-pyUsagetopsis <InputDataFile> <Weights> <Impacts> <ResultFileName>Where:\n<InputDataFile>: Path to the input CSV file.\n\n<Weights>: Comma-separated weights for each criterion.\n\n<Impacts>: Comma-separated impacts ('+' for maximization, '-' for minimization).\n\n<ResultFileName>: Name of the file to save the results.Examplepython topsis input.csv '1,1,2,1' '+,+,-,+' results.csv"} {"package": "102116116-topsis", "pacakge-description": "No description available on PyPI."} {"package": "102183051-topsis", "pacakge-description": "TOPSISSubmitted By: Sarthak Tiwari | 102183051What is TOPSIS?Technique for Order Preference by Similarity to Ideal Solution\n(TOPSIS) originated in the 1980s as a multi-criteria decision making method.How to install this package:>> pip install -e .[dev]After installation, in Command Prompt/Terminal in the current dir:>> topsis <InputDataFile> <Weights> <Impacts> <ResultFileName>Weights may not be normalised; they will be normalised in the code.Note:To avoid errors -\nThe input file must contain three or more columns.\nThe 2nd to last columns must contain numeric values only.\nThe number of weights, number of impacts and number of columns (from the 2nd to the last column) must be the same.\nImpacts must be either +ve or -ve.\nImpacts and weights must be separated by ',' (comma).InputDataFile (data.csv) - an exampleThe decision matrix should be constructed with each row representing a Model alternative 
and each column representing a criterion like Correlation, R2, Root Mean Squared Error, Accuracy, etc.
| Model | Corr | Rseq | RMSE | Accuracy |
|-------|------|------|------|----------|
| M1    | 0.79 | 0.62 | 1.25 | 60.89    |
| M2    | 0.66 | 0.44 | 2.89 | 63.07    |
| M3    | 0.56 | 0.31 | 1.57 | 62.87    |
| M4    | 0.82 | 0.67 | 2.68 | 70.19    |
| M5    | 0.75 | 0.56 | 1.3  | 80.39    |
Output file (result.csv) - Based on the above input file and setting weights as \"1,2,1,1\" and impacts as \"+,-,-,+\".
| Model | Corr | Rseq | RMSE | Accuracy | Topsis Score | Rank |
|-------|------|------|------|----------|--------------|------|
| M1    | 0.79 | 0.62 | 1.25 | 60.89    | 0.423744     | 4    |
| M2    | 0.66 | 0.44 | 2.89 | 63.07    | 0.467426     | 3    |
| M3    | 0.56 | 0.31 | 1.57 | 62.87    | 0.760231     | 1    |
| M4    | 0.82 | 0.67 | 2.68 | 70.19    | 0.207773     | 5    |
| M5    | 0.75 | 0.56 | 1.3  | 80.39    | 0.504864     | 2    |
The output file contains the columns of the input file along with two additional columns holding the Topsis score and rank."} {"package": "1082-msr-bhp", "pacakge-description": "No description available on PyPI."} {"package": "108Moshpdf", "pacakge-description": "This is the homepage of our project."} {"package": "10-8moshpdf", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "10.8-Publishing-Packages", "pacakge-description": "This is the homepage of our project."} {"package": "10daysweb", "pacakge-description": "# 10daysWeb**A just-for-learning web framework that can be developed in 10 days.**![PyPI](https://img.shields.io/pypi/pyversions/10daysweb.svg) ![PyPI](https://img.shields.io/pypi/status/10daysweb.svg) ![PyPI](https://img.shields.io/pypi/v/10daysweb.svg)

# Preamble

For certain reasons I need a wheel of my own, and I have only about ten days. So I decided to develop a Python web framework, something I had long wanted to do but never finished. I plan to iterate daily, consulting references as I write and recording new ideas and discoveries. That way, if anyone finds themselves in a similar situation, this project may be of some help. Ideally the finished product will be good enough to host a blog or the like. And even if it fails, the effort will not be wasted. Let's begin.

## Day 1

**Every beginning is hard; I am surely not the only one who feels lost at the start of a project.**

First I downloaded the source of version 0.1 of the popular framework Flask. Its three-hundred-odd lines already contain all the functionality a web framework needs, plus a usage example. [How to download the earliest commit](#how-to-download-the-earliest-commit)

For the first, simplest version I want to implement, Flask is still too complex, so I extracted only the key component, `route`, for version one. A route manages which paths and methods a web application responds to. Through decorators, the framework registers all user functions at startup and calls them automatically when a request matches.

    @testApp.route('/', methods=['GET'])
    def hello():
        return 'hello world'

A `Rule` represents a single path to be served; it consists mainly of `url`, `methods` and `endpoint`. `methods` holds a set of HTTP methods, the request types to be handled, while `endpoint` is the `Callable` object that actually produces the response content; it can be a function or a class. For the list of HTTP methods, and the message formats and status codes we will need to consult later, see [RFC 2616](https://tools.ietf.org/html/rfc2616).

We still lack code to listen for and exchange HTTP messages. asyncio, added in Python 3.4, provides this, and the [official documentation](http://asyncio.readthedocs.io) happens to give an extremely minimal example. `asyncio.start_server` takes three basic arguments: a `client_connected_cb` that is called automatically when a request arrives, plus the address and port to listen on. `client_connected_cb` must accept two arguments, `reader` and `writer`, used respectively to read the request message and write back the response message. I added simple code to `client_connected_cb` that extracts the request path and matches it against the registered application functions. I have also defined constants covering all the HTTP methods, though they are not yet matched against requests.

With that we have a runnable ''web framework''. For now it only counts as a prototype, but it is already enough to let us print that line of the century:

Hello World!

## Day 2

**We have a prototype, but much remains to be improved.**

I used an open-source third-party library to parse HTTP messages, and implemented `Request` and `Response` to abstract requests. I copied the HTTP status codes from the RFC and put them, together with the methods, in `utils.py`. I tried defining an exception; the initial idea is that it lets users of the framework raise it at any time to return an HTTP error status directly, with `content` there to support custom error pages. This part is still unsettled; perhaps I will use an `@error_handler` form to provide the custom behaviour on exceptions instead. I added logging, but nothing shows up in my terminal yet; to be solved. I used the standard library `asyncio` because I want the framework to support async. The adjusted `handle` method embodies the basic flow of processing a request, but it still looks awful; I have not fully worked out my thinking on async.

## Day 3

On the code side, today's changes are small. I cleaned up the logic of the `handle` method. I now require user functions to be coroutines, but in the future I must also provide async wrappers for databases and file I/O, otherwise the framework is still not `truly async`. I adjusted the strategy for reading the message from the stream, letting the third-party parsing library decide whether the message has ended. There is no need to agonise over this, because in real deployments nginx/apache and friends will take care of it for us.

Main work ahead:

- Finish `debug mode`, with automatic reloading of user functions
- Add support for static-file routes and pattern-matching routes
- Introduce a template engine and async wrappers for calling it

## Day 4

Added dynamic URL matching support; paths can now be matched in the following form:

    @app.route('/<name>', methods=['GET'])
    async def show_name(request, name):
        return Response(content=f'hello {name}')

After some thought, static-file routing can be fully supported by users adding dynamic matches themselves, and failing that there is still the web server to do it, so I decided to set that part aside for now.

Added the `errorhandler` decorator; custom behaviour and response messages on exceptions can now be defined through it.

Adjusted the exception-catching mechanism; a 404 is now correctly raised when no matching user method is found, while unexpected exceptions inside user methods are uniformly handled as status 500.

## Day 5 & 6

Added the `run_before` decorator for initialisation code run before starting the server; the event loop `loop` is passed in by default.

Uploaded this ~~embarrassing~~ framework to pip; it can now be installed and used via `pip install 10daysweb`.

Tried writing a todolist application as a demo; after struggling with the front end for half a day it felt rushed, so I decided to hook up a ~~Telegram Bot~~ WeChat mini program instead.

Added unittest and wrote an initial test case for URL matching.

## Day 7

Added a signal decorator; the initial idea is to use it to initialise and close a database connection pool before server start and after shutdown.

    @app.signal(type='run_before_start')
    def foo(loop):
        '''init database connection pool'''

Added the corresponding unknown-signal-type exception; the WeChat mini program API is being written.

## How to download the earliest commit

As a well-known open-source project, Flask has accumulated thousands of commits on GitHub. Most annoyingly, GitHub's commit-list page offers no page-jump feature. Here is a method that is not very elegant, but genuinely faster.

First `git clone` the target project locally. Use the `--reverse` flag to invert the results and get the earliest commit id in the history:

    git log --reverse

Then open any commit on GitHub and replace the id in the URL. Oh, and you will also need to click `Browse files`."} {"package": "10dulkar17-s3-aws", "pacakge-description": "10dulkar17-utilsPython library for dashboard"} {"package": "10EngrProblems", "pacakge-description": "10EngineeringProblemsPrograms designed to solve engineering problems.\nTo run each program, just import the module using import 10EngrProblems. Then to run each block of code, call HW followed by the number of the problem you are working on.\nAs seen below, this is how you would run HW1():HW1()"} {"package": "11111", "pacakge-description": "welcome to my package"} {"package": "1111111", "pacakge-description": "welcome to my package"} {"package": "11111111111111111111", "pacakge-description": "welcome to my package"} {"package": "111752", "pacakge-description": "# malware_analysis_using_ML\n\n\n\nA 
\\x00p\\x00y\\x00t\\x00h\\x00o\\x00n\\x00 \\x00p\\x00a\\x00c\\x00k\\x00a\\x00g\\x00e\\x00 \\x00t\\x00h\\x00a\\x00t\\x00 \\x00u\\x00t\\x00i\\x00l\\x00i\\x00s\\x00e\\x00s\\x00 \\x00m\\x00a\\x00c\\x00h\\x00i\\x00n\\x00e\\x00 \\x00a\\x00 \\x00l\\x00e\\x00a\\x00r\\x00n\\x00i\\x00n\\x00g\\x00 \\x00m\\x00o\\x00d\\x00e\\x00l\\x00 \\x00t\\x00o\\x00 \\x00d\\x00e\\x00t\\x00e\\x00r\\x00m\\x00i\\x00n\\x00e\\x00 \\x00\n\\x00\n\\x00w\\x00h\\x00e\\x00t\\x00h\\x00e\\x00r\\x00 \\x00a\\x00 \\x00f\\x00i\\x00l\\x00e\\x00 \\x00i\\x00s\\x00 \\x00l\\x00e\\x00g\\x00i\\x00t\\x00i\\x00m\\x00a\\x00t\\x00e\\x00 \\x00o\\x00r\\x00 \\x00n\\x00o\\x00t\\x00\n\\x00\n\\x00\n\\x00\n\\x00#\\x00i\\x00n\\x00s\\x00t\\x00a\\x00l\\x00l\\x00a\\x00t\\x00i\\x00o\\x00n\\x00 \\x00\n\\x00\n\\x00p\\x00i\\x00p\\x00 \\x00i\\x00n\\x00s\\x00t\\x00a\\x00l\\x00l\\x00 \\x001\\x001\\x001\\x007\\x005\\x002\\x00\n\\x00\n\\x00#\\x00U\\x00s\\x00a\\x00g\\x00e\\x00\n\\x00\n\\x00 \\x00p\\x00y\\x00t\\x00h\\x00o\\x00n\\x00 \\x00p\\x00r\\x00e\\x00d\\x00i\\x00c\\x00t\\x00.\\x00p\\x00y\\x00 \\x00\"\\x00y\\x00o\\x00u\\x00r\\x00 \\x00f\\x00i\\x00l\\x00e\\x00s\\x00 \\x00p\\x00a\\x00t\\x00h\\x00\"\\x00"} {"package": "115wangpan", "pacakge-description": "115 Wangpan (115\u7f51\u76d8 or 115\u4e91) is an unofficial Python API and SDK for 115.com. Supported Python verisons are 2.6, 2.7, 3.3, 3.4.Documentation:http://115wangpan.readthedocs.orgGitHub:https://github.com/shichao-an/115wangpanPyPI:https://pypi.python.org/pypi/115wangpan/FeaturesAuthenticationPersistent sessionTasks management: BitTorrent and linksFiles management: uploading, downloading, searching, and editingInstallationlibcurlis required. 
Install dependencies before installing the Python package.\n\nUbuntu:\n\n$ sudo apt-get install build-essential libcurl4-openssl-dev python-dev\n\nFedora:\n\n$ sudo yum groupinstall \"Development Tools\"\n$ sudo yum install libcurl libcurl-devel python-devel\n\nThen, you can install with pip:\n\n$ pip install 115wangpan\n\nOr, if you want to install the latest from GitHub:\n\n$ pip install git+https://github.com/shichao-an/115wangpan\n\nUsage\n\n>>> import u115\n>>> api = u115.API()\n>>> api.login('username@example.com', 'password')\nTrue\n>>> tasks = api.get_tasks()\n>>> task = tasks[0]\n>>> print task.name\n\u54b2-Saki-\u963f\u77e5\u8cc0\u7de8 episode of side-A\n>>> print task.status_human\nTRANSFERRED\n>>> print task.size_human\n1.6 GiB\n>>> files = task.list()\n>>> files\n[]\n>>> f = files[0]\n>>> f.url\nu'http://cdnuni.115.com/some-very-long-url.mkv'\n>>> f.directory\n>>> f.directory.parent\n\nCLI commands\n\n115 down: for downloading files\n115 up: for creating tasks from torrents and links"} {"package": "11601160", "pacakge-description": "Example Package\n\nThis is a simple example package. You can use Github-flavored Markdown to write your content."} {"package": "11dl-gpu", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "11l", "pacakge-description": "No description available on PyPI."} {"package": "11Team-AssistantBot", "pacakge-description": "No description available on PyPI."} {"package": "11x-wagtail-blog", "pacakge-description": "11x Wagtail Blog\n\n11x-wagtail-blog is a wagtail app implementing basic blog features for a wagtail site. This project started as an implementation of the blogging features of 11x.engineering, but since it is intended to be used as the first series of articles, it has been open sourced and published here. 
It is intended to demonstrate how to develop a fully featured package published to PyPI.\n\nQuick Start\n\nTo install:\n\npip install 11x-wagtail-blog\n\nAdd x11x_wagtail_blog to your INSTALLED_APPS:\n\nINSTALLED_APPS = [\n    ...,\n    'x11x_wagtail_blog',\n    ...,\n]\n\nSince this package only gives you the common features of every blogging application, you will need to define your own page models and derive them from ExtensibleArticlePage:\n\n>>> from x11x_wagtail_blog.models import ExtensibleArticlePage\n>>> from wagtail.admin.panels import FieldPanel\n>>> from wagtail.blocks import TextBlock\n>>> from wagtail.fields import StreamField\n>>> class MyArticlePage(ExtensibleArticlePage):\n...     body = StreamField([\n...         (\"text\", TextBlock()),\n...     ], use_json_field=True)\n...\n...     content_panels = ExtensibleArticlePage.with_body_panels([\n...         FieldPanel(\"body\"),\n...     ])\n\nThis can be done in any valid Wagtail app.\n\nNext, generate your migrations as usual:\n\npython manage.py makemigrations\npython manage.py migrate\n\nYou will have to define a template. The default template used is x11x_wagtail_blog/article_page.html, but you should override the get_template() method to return your own template....

{{ self.title }}

{% include_block self.body %}

About the authors

{% for author in self.authors %}
  {% include \"myblog/about_the_author_section.html\" with author=author.value %}
{% endfor %}

Related Articles

{% for article in self.related_articles %}
  • {{ article.title }}
{% endfor %}
"} {"package": "123", "pacakge-description": "Example Package\n\nThis is a simple example package. You can use Github-flavored Markdown to write your content."} {"package": "12306-booking", "pacakge-description": "12306-booking: a 12306 ticket-booking tool\n\n12306-booking vs 12306 vs third-party booking platforms\n\nWhy write a booking tool?\n- The 12306 booking experience is terrible. The captchas are outrageous; human eyes cannot read them. Refresh, refresh, refresh until your hand hurts. The ticket is right there, and you still cannot book it.\n- Third-party booking platforms are too shady. They collect user data and charge unacceptable fees (euphemistically called a 'technical service fee', when it is really just CPU and RAM), and worst of all they take user data to the market to trade.\n\nWhat does it solve, and what are its advantages?\n- Two QR-code scans complete the whole flow: login, querying remaining tickets, ordering, and payment\n- Runs locally, collects no user data, requires no password input, so there is no need to worry about any data or transaction leaks\n- Fully open source, with no black-box behavior\n- Fast refresh and booking flow, so you grab the ticket one step ahead\n- Supports grabbing tickets across multiple trains, seat classes, and passengers\n\nUsage\n\nInstall:\n\npip install 12306-booking -U --user\n\nOn MacOS, install inside a virtual environment:\n\nvirtualenv venv; source venv/bin/activate; pip install 12306-booking -U\n\nBook a ticket:\n\n12306-booking --train-date 2020-01-01 --train-names K571 --seat-types \u786c\u5367 --from-station \u5317\u4eac --to-station \u9ebb\u57ce --pay-channel \u5fae\u4fe1 --passengers \u4efb\u6b63\u975e,\u738b\u77f3\n\nSeparate multiple trains, seat classes, and passengers with an English ','.\n\nBooking flow / booking state machine\n\nSponsorship: if this helped you book a ticket, please scan the QR code to tip us; your encouragement is what drives us to keep improving and optimizing."} {"package": "123456", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "1234exec", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "123hibob789", "pacakge-description": "a test program\n\nChange Log\n\n0.0.1 (19/9/2021)\n- First and last release"} {"package": "123-object-detection", "pacakge-description": "No description available on PyPI."} {"package": "123TestUpload", "pacakge-description": "No description available on PyPI."} {"package": "123tv-iptv", "pacakge-description": "123TV-IPTV is an app that allows you to watch free IPTV. It extracts stream URLs from the 123tv.live website, generates a master playlist with available TV channels for IPTV players, and proxies the traffic between your IPTV players and streaming backends.\n\nNote: This is a port of the ustvgo-iptv app for the 123TV service.\n\n\u2728 Features\n\n\ud83d\udd11 Auto auth-key rotation\n\nAs the server proxies the traffic, it can detect if your auth key is expired and refresh it on the fly.\n\n\ud83d\udcfa Available TV Guide\n\nThe TV Guide repo generates EPG XML for upcoming programs of all the channels once an hour.\n\nTwo iconsets for IPTV players with light and dark backgrounds\n\nThere are 2 channel iconsets adapted for apps with light and dark UI themes.\n\n\ud83d\uddd4 Cross-platform GUI\n\nGUI is available for Windows, Linux and MacOS for 
people who are not that much into CLI.\n\n\ud83d\ude80 Installation\n\nCLI:\n\npip install 123tv-iptv\n\nGUI: You can download the GUI app from Releases for your OS.\n\nDocker:\n\ndocker run -d --name=123tv-iptv -p 6464:6464 --restart unless-stopped ghcr.io/interlark/123tv-iptv:latest\n\nFor dark icons append the following argument: --icons-for-light-bg\n\n\u2699\ufe0f Usage - CLI\n\nYou can run the app without any arguments:\n\n123tv-iptv\n\nOptional arguments:\n- --icons-for-light-bg: Switch to dark iconset for players with light UI.\n- --access-logs: Enable access logs for tracking requests activity.\n- --port 6464: Server port. By default, the port is 6464.\n- --parallel 10: Number of parallel parsing requests. Default is 10.\n- --use-uncompressed-tvguide: By default, the master playlist has a link to the compressed version of the TV Guide: url-tvg=\"http://127.0.0.1:6464/tvguide.xml.gz\". With this argument you can switch it to uncompressed: url-tvg=\"http://127.0.0.1:6464/tvguide.xml\"\n- --keep-all-channels: Do not filter out non-working channels.\n\nLinux users can install a systemd service that automatically runs 123tv-iptv on start-up \u23f0:\n\n# Install \"123tv-iptv\" service\nsudo -E env \"PATH=$PATH\" 123tv-iptv install-service\n# You can specify any optional arguments you want\nsudo -E env \"PATH=$PATH\" 123tv-iptv --port 1234 --access-logs install-service\n# Uninstall \"123tv-iptv\" service\nsudo -E env \"PATH=$PATH\" 123tv-iptv uninstall-service\n\n\u2699\ufe0f Usage - GUI\n\nIf you don't like command line stuff, you can run the GUI app and click \"Start\", simple as that.\n\nThe GUI uses a config file on the following path:\n- Linux: ~/.config/123tv-iptv/settings.cfg\n- Mac: ~/Library/Application Support/123tv-iptv/settings.cfg\n- Windows: C:\\Users\\%USERPROFILE%\\AppData\\Local\\123tv-iptv\\settings.cfg\n\n\ud83d\udd17 URLs\n\nTo play and enjoy your free IPTV you need 2 URLs that this app provides:\n- Your generated master playlist: \ud83d\udd17 http://127.0.0.1:6464/123tv.m3u8\n- TV Guide (content updates once an hour): \ud83d\udd17 http://127.0.0.1:6464/tvguide.xml\n\n\u25b6\ufe0f Players\n\nHere is a list of popular IPTV players. 123TV's channels have EIA-608 embedded subtitles. In case you're not a native speaker and use TV, Cartoons, Movies and Shows to learn English and Spanish, I would recommend the following free open-source cross-platform IPTV players that can handle EIA-608 subtitles:\n\nVLC\n\nThis old beast can play any subtitles. Unfortunately, it doesn't support TV Guide.\n\nPlay:\n\nvlc http://127.0.0.1:6464/123tv.m3u8\n\nMPV\n\nFast and extensible player. It supports subtitles, though not as well as VLC; sometimes you could encounter trouble playing roll-up subtitles. Unfortunately, it doesn't support TV Guide.\n\nPlay:\n\nmpv http://127.0.0.1:6464/123tv.m3u8\n\nJellyfin Media Player\n\nComfortable, handy, extensible player with a smooth UI. Supports TV Guide and has mpv as a backend. Supports subtitles, but there is no option to enable them via the user interface. If you want to enable IPTV subtitles you have to use the following \"Mute\" hack.\n\nEnable IPTV subtitles\n\nI found a quick hack to force playing embedded IPTV subtitles; all you need is to create one file:\n- Linux: ~/.local/share/jellyfinmediaplayer/scripts/subtitles.lua\n- Linux (Flatpak): ~/.var/app/com.github.iwalton3.jellyfin-media-player/data/jellyfinmediaplayer/scripts/subtitles.lua\n- MacOS: ~/Library/Application Support/Jellyfin Media Player/scripts/subtitles.lua\n- Windows: %LOCALAPPDATA%\\JellyfinMediaPlayer\\scripts\\subtitles.lua\n\nAnd paste the following text in there:\n\n-- File: subtitles.lua\nfunction on_mute_change(name, value)\n    if value then\n        local subs_id = mp.get_property(\"sid\")\n        if subs_id == \"1\" then\n            mp.osd_message(\"Subtitles off\")\n            mp.set_property(\"sid\", \"0\")\n        else\n            mp.osd_message(\"Subtitles on\")\n            mp.set_property(\"sid\", \"1\")\n        end\n    end\nend\nmp.observe_property(\"mute\", \"bool\", on_mute_change)\n\nAfter that, every time you mute a video (\ud83c\udd7c key pressed), you toggle subtitles on/off as a side effect.\n\nPlay:\n\n1) Settings -> Dashboard -> Live TV -> Tuner Devices -> Add -> M3U Tuner -> URL -> http://127.0.0.1:6464/123tv.m3u8\n2) Settings -> Dashboard -> Live TV -> TV Guide Data Providers -> Add -> XMLTV -> URL -> 
http://127.0.0.1:6464/tvguide.xml\n3) Settings -> Dashboard -> Scheduled Tasks -> Live TV -> Refresh Guide -> Task Triggers -> \"Every 30 minutes\"\n\nNote: Some versions do not support compressed (*.xml.gz) TV Guides.\n\nIPTVnator\n\nPlayer built with Electron, so you can run it even in a browser; has light and dark themes. Supports subtitles and TV Guide.\n\nPlay:\n\n1) Add via URL -> http://127.0.0.1:6464/123tv.m3u8\n2) Settings -> EPG Url -> http://127.0.0.1:6464/tvguide.xml.gz\n\n\ud83d\udc4d Support\n\n123tv.live is a wonderful project which can offer you free IPTV; please support these guys by buying a VPN with their referral link. I would also highly appreciate your support on this project."} {"package": "125softNLP", "pacakge-description": "pysoftNLP --- the natural language processing toolkit of the Business-Opportunity Radar project, Big Data division, Henan 863 Software Incubator Co., Ltd.\n\npysoftNLP is a toolkit providing common NLP functionality. Its aim is to offer convenient, ready-to-use parsing and dictionary-style interfaces oriented toward Chinese, with a one-stop entry point for lookups. Main features:\n- Chinese word segmentation (tokenizer)\n- Text cleaning (clean)\n- Text classification (classification)\n- Named entity recognition (companies, ner)\n- Text data augmentation (enhancement)\n- Sentence similarity computation (similarities)\n- Keyword extraction (extraction)\n- Continuously updated...\n\nInstallation\n\nOn Windows:\n\npip install pysoftNLP==0.0.4 -i https://pypi.python.org/simple\n\nDownloading and using the resource packages:\n\n# download the resource packages the toolkit uses\nfrom pysoftNLP.utils import down\ndown.download_resource()\n\nFeatures\n\n1. Using the classification model\n\nfrom pysoftNLP.classification import bert_dnn\n# classification model -- training\ntrain_data = 'x_tr_863.csv'\ntest_data = 'x_te_863.csv'\ntrain_df, test_df = bert_dnn.read_data(train_data, test_data)\n# encode: word-vector model (currently only bert is supported); sentence_length: 50 (sentence length); num_classes (9 classes)\nargs = {'encode': 'bert', 'sentence_length': 50, 'num_classes': 9, 'batch_size': 128, 'epochs': 100}\nbert_dnn.train(train_df, test_df, args)\n\nfrom pysoftNLP.classification import pre\n# classification model -- prediction\nmodel_name = '863_classify_hy.h5'\nlabel_map = {0: 'it', 1: '\u7535\u529b\u70ed\u529b', 2: '\u5316\u5de5', 3: '\u73af\u4fdd', 4: '\u5efa\u7b51', 5: '\u4ea4\u901a', 6: '\u6559\u80b2\u6587\u5316', 7: '\u77ff\u4e1a', 8: '\u7eff\u5316', 9: '\u80fd\u6e90', 10: '\u519c\u6797', 11: '\u5e02\u653f', 12: '\u6c34\u5229', 13: '\u901a\u4fe1', 14: '\u533b\u7597', 15: '\u5236\u9020\u4e1a'}\ntexts = ['\u5e7f\u897f\u6253\u597d\u201c\u7535\u529b\u724c\u201d\u7ec4\u5408\u62f3\u52a9\u529b\u5de5\u4e1a\u4f01\u4e1a\u4ece\u590d\u4ea7\u5230\u6ee1\u4ea7\u4e2d\u56fd\u65b0\u95fb\u7f51', '\u5206\u522b\u662f\u5415\u6653\u96ea\u3001\u5510\u7984\u4fca\u3001\u6881\u79cb\u8bed\u3001\u738b\u7fe0\u7fe0\u3001\u6768\u5174\u4eae\u3001\u5415\u6843\u6843\u3001\u5f20\u8000\u592b\u3001\u90ed\u5efa\u6ce2\u3001\u4e2d\u56fd\u533b\u62a4\u670d\u52a1\u7f51', '\u5bcc\u62c9\u5c14\u57fa\u533a\u5e02\u573a\u76d1\u7ba1\u5c40\u5f00\u5c55\u300a\u4f18\u5316\u8425\u5546\u73af\u5883\u6761\u4f8b\u300b\u5ba3\u4f20\u6d3b\u52a8\u9f50\u9f50\u54c8\u5c14\u5e02\u4eba\u6c11\u653f\u5e9c', '2020\u4e0a\u6d77\uff08\u56fd\u9645\uff09\u80f6\u7c98\u5e26\u4e0e\u8584\u819c\u6280\u672f\u5c55\u89c8\u4f1a\u5236\u9020\u4ea4\u6613\u7f51']\npre.predict(model_name, texts, label_map)\n\n2. Data augmentation\n\nfrom pysoftNLP.enhancement import augment\ninput = 'D:\\pysoftNLP_resources\\enhancement\\Test\\Trian_hy.csv'\noutput = 'D:\\pysoftNLP_resources\\enhancement\\Test\\Trian_out.csv'\nnum_aug = 20  # how many rows one row can be expanded into\nalpha = 0.05  # percentage of each row that is varied\naugment.gen_eda(input, output, alpha, num_aug)\n\n3. Named entity recognition (companies)\n\nfrom pysoftNLP.ner import train\nargs = {'sentence_length': 100, 'batch_size': 512, 'epochs': 20}  # parameters\noutput_path = 'ner_company'  # model output\ntrain.train(args, output_path)\n\n# single-sentence prediction\nfrom pysoftNLP.ner import pre\ntext = '\u8fd9\u662f\u4e00\u4e2a\u5355\u53e5'\n# model_name = 'ner'\npre.single_sentence(text, model_name)\n\n# multi-sentence prediction\nout_path = 'D:\\pysoftNLP_resources\\entity_recognition'\nlist = ['\u4e2d\u5e7f\u6838\u65b0\u80fd\u6e90\u6e56\u5357\u5206\u516c\u53f8', '\u8be5\u516c\u53f8', '\u4e2d\u5e7f\u6838\u65b0\u80fd\u6e90\u516c\u53f8']\npre.multi_sentence(list, output_path, model_name)\n\n4. Similarity computation\n\n# text similarity computation\nfrom pysoftNLP.similarities import similar\ntest_vec = '\u81ea\u7136\u8bed\u8a00\u5904\u7406\u4e0e\u4eba\u5de5\u667a\u80fd'\nsentences = ['\u900d\u9065\u6d3e\u638c\u95e8\u4eba\u65e0\u5d16\u5b50\u4e3a\u5bfb\u627e\u4e00\u4e2a\u8272\u827a\u53cc\u5168\u3001\u806a\u660e\u4f36\u4fd0\u7684\u5f92\u5f1f\uff0c\u8bbe\u4e0b\u201c\u73cd\u73d1\u201d\u68cb\u5c40\uff0c\u4e3a\u5c11\u6797\u5bfa\u865a\u5b57\u8f88\u5f1f\u5b50\u865a\u7af9\u8bef\u649e\u89e3\u5f00\u3002', '\u6155\u5bb9\u590d\u4e3a\u5e94\u53ec\u62d2\u7edd\u738b\u8bed\u5ae3\u7684\u7231\u60c5\uff1b\u4f17\u4eba\u6551\u8d77\u4f24\u5fc3\u81ea\u6740\u7684\u738b\u8bed\u5ae3\uff0c\u540e\u6bb5\u8a89\u7ec8\u4e8e\u83b7\u5f97\u5979\u7684\u82b3\u5fc3\u3002', '\u9e20\u6469\u667a\u8d2a\u7ec3\u5c11\u6797\u6b66\u529f\uff0c\u8d70\u706b\u5165\u9b54\uff0c\u5e78\u88ab\u6bb5\u8a89\u5438\u53bb\u5168\u8eab\u529f\u529b\uff0c\u4fdd\u4f4f\u6027\u547d\uff0c\u5927\u5f7b\u5927\u609f\uff0c\u6210\u4e3a\u4e00\u4ee3\u9ad8\u50e7\u3002', '\u5f20\u65e0\u5fcc\u5386\u5c3d\u8270\u8f9b\uff0c\u5907\u53d7\u8bef\u89e3\uff0c\u5316\u89e3\u6069\u4ec7\uff0c\u6700\u7ec8\u4e5f\u67e5\u660e\u4e86\u4e10\u5e2e\u53f2\u706b\u9f99\u4e4b\u6b7b\u4e43\u662f\u6210\u6606\u3001\u9648\u53cb\u8c05\u5e08\u5f92\u6240\u4e3a', '\u6b66\u6c0f\u4e0e\u67ef\u9547\u6076\u5e26\u7740\u5782\u6b7b\u7684\u9646\u6c0f\u592b\u5987\u548c\u51e0\u4e2a\u5c0f\u5b69\u76f8\u805a\uff0c\u4e0d\u6599\u674e\u83ab\u6101\u5c3e\u968f\u8ffd\u6765\uff0c\u6253\u4f24\u6b66\u4e09\u901a', '\u4eba\u5de5\u667a\u80fd\u4ea6\u79f0\u667a\u68b0\u3001\u673a\u5668\u667a\u80fd\uff0c\u6307\u7531\u4eba\u5236\u9020\u51fa\u6765\u7684\u673a\u5668\u6240\u8868\u73b0\u51fa\u6765\u7684\u667a\u80fd\u3002', '\u4eba\u5de5\u667a\u80fd\u7684\u7814\u7a76\u662f\u9ad8\u5ea6\u6280\u672f\u6027\u548c\u4e13\u4e1a\u7684\uff0c\u5404\u5206\u652f\u9886\u57df\u90fd\u662f\u6df1\u5165\u4e14\u5404\u4e0d\u76f8\u901a\u7684\uff0c\u56e0\u800c\u6d89\u53ca\u8303\u56f4\u6781\u5e7f\u3002', '\u81ea\u7136\u8bed\u8a00\u8ba4\u77e5\u548c\u7406\u89e3\u662f\u8ba9\u8ba1\u7b97\u673a\u628a\u8f93\u5165\u7684\u8bed\u8a00\u53d8\u6210\u6709\u610f\u601d\u7684\u7b26\u53f7\u548c\u5173\u7cfb\uff0c\u7136\u540e\u6839\u636e\u76ee\u7684\u518d\u5904\u7406\u3002']\nargs = {'encode': 'bert', 'sentence_length': 50, 'num_classes': 9, 'batch_size': 128, 'epochs': 100}\nsimilar.similar(sentences, test_vec, args, 3)\n\n5. Keyword extraction\n\nfrom pysoftNLP.extraction import keyword\ntext = '6\u670819\u65e5,\u300a2012\u5e74\u5ea6\u201c\u4e2d\u56fd\u7231\u5fc3\u57ce\u5e02\u201d\u516c\u76ca\u6d3b\u52a8\u65b0\u95fb\u53d1\u5e03\u4f1a\u300b\u5728\u4eac\u4e3e\u884c\u3002' + \\\n    '\u4e2d\u534e\u793e\u4f1a\u6551\u52a9\u57fa\u91d1\u4f1a\u7406\u4e8b\u957f\u8bb8\u5609\u7490\u5230\u4f1a\u8bb2\u8bdd\u3002\u57fa\u91d1\u4f1a\u9ad8\u7ea7\u987e\u95ee\u6731\u53d1\u5fe0,\u5168\u56fd\u8001\u9f84' + \\\n    '\u529e\u526f\u4e3b\u4efb\u6731\u52c7,\u6c11\u653f\u90e8\u793e\u4f1a\u6551\u52a9\u53f8\u52a9\u7406\u5de1\u89c6\u5458\u5468\u840d,\u4e2d\u534e\u793e\u4f1a\u6551\u52a9\u57fa\u91d1\u4f1a\u526f\u7406\u4e8b\u957f\u803f\u5fd7\u8fdc,' + \\\n    '\u91cd\u5e86\u5e02\u6c11\u653f\u5c40\u5de1\u89c6\u5458\u8c2d\u660e\u653f\u3002'\nprint(text)\npos = True\nseg_list = keyword.seg_to_list(text, pos)\nfilter_list = keyword.word_filter(seg_list, pos)\nprint('TF-IDF model results:')\nkeyword.tfidf_extract(filter_list)\nprint('TextRank model results:')\nkeyword.textrank_extract(text)\nprint('LSI model results:')\nkeyword.topic_extract(filter_list, 'LSI', pos)\nprint('LDA model results:')\nkeyword.topic_extract(filter_list, 'LDA', pos)"} {"package": "12-distributions", "pacakge-description": "No description available on PyPI."} {"package": "12factor-configclasses", "pacakge-description": "configclasses\n\nLike dataclasses but for config. Specify your config with a class and load it with your env vars or env files.\n\n# .env\nHOST=0.0.0.0\nPORT=8000\n...\n\nimport httpx\nfrom configclasses import configclass\n\n@configclass\nclass ClientConfig:\n    host: str\n    port: int\n\nclass UserAPIClient(httpx.AsyncClient):\n    def __init__(self, config: ClientConfig, *args, **kwargs):\n        self.config = config\n        ...\n\nconfig = ClientConfig.from_path(\".env\")\nasync with UserAPIClient(config) as client:\n    ...\n\nFeatures\n- Fill your configclasses with existent env vars.\n- Define default values in case these variables have no value at all.\n- Load your config files in env vars following 12factor apps recommendations.\n- Support for .env, yaml, toml, ini and json.\n- Convert your env vars with the type specified in the configclass: int, float, str or bool.\n- Use nested configclasses for more complex configurations.\n- Specify a prefix with @configclass(prefix=\"\") to append this prefix to your configclass' attribute names.\n- Config groups (TODO): https://cli.dev/docs/tutorial/config_groups/\n\nRequirements\n\nPython 3.8+\n\nInstallation\n\nDepending on your chosen config file format you can install:\n- .env -> pip install 12factor-configclasses[dotenv]\n- .yaml -> pip install 12factor-configclasses[yaml]\n- .toml -> pip install 12factor-configclasses[toml]\n- .ini -> pip install 12factor-configclasses\n- .json -> pip install 12factor-configclasses\n\nOr install all supported formats with:\n\npip install 12factor-configclasses[full]\n\nUsage\n\nThere are three ways to use it.\n\nLoading an .env file:\n\n# 
.env\nHOST=0.0.0.0\nPORT=8000\nDB_URL=sqlite://:memory:\nGENERATE_SCHEMAS=True\nDEBUG=True\nHTTPS_ONLY=False\nGZIP=True\nSENTRY=False\n\n# config.py\nfrom configclasses import configclass\n\n@configclass\nclass DB:\n    user: str\n    password: str\n    url: str\n\n@configclass\nclass AppConfig:\n    host: str\n    port: int\n    db: DB\n    generate_schemas: bool\n    debug: bool\n    https_only: bool\n    gzip: bool\n    sentry: bool\n\n# app.py\nfrom api.config import AppConfig\n\napp_config = AppConfig.from_path(\".env\")\napp = Starlette(debug=app_config.debug)\n\nif app_config.https_only:\n    app.add_middleware(HTTPSRedirectMiddleware)\nif app_config.gzip:\n    app.add_middleware(GZipMiddleware)\nif app_config.sentry:\n    app.add_middleware(SentryAsgiMiddleware)\n...\n\nregister_tortoise(\n    app,\n    db_url=app_config.db.url,\n    modules={\"models\": [\"api.models\"]},\n    generate_schemas=app_config.generate_schemas,\n)\n\nif __name__ == \"__main__\":\n    uvicorn.run(app, host=app_config.host, port=app_config.port)\n\nLoading predefined environment variables:\n\nThe same as before, but instead of:\n\napp_config = AppConfig.from_path(\".env\")\n\nyou will do:\n\napp_config = AppConfig.from_environ()\n\nLoading a file from a string:\n\ntest_env = \"\"\"\nHOST=0.0.0.0\nPORT=8000\nDB_URL=sqlite://:memory:\nGENERATE_SCHEMAS=True\nDEBUG=True\nHTTPS_ONLY=False\nGZIP=True\nSENTRY=False\n\"\"\"\napp_config = AppConfig.from_string(test_env, \".env\")"} {"package": "12factor-vault", "pacakge-description": "Vault 12factor and Django integration\n\nThis project provides helper classes for integrating Hashicorp Vault with your Python projects and Django. Please note that this is still under active development and APIs are subject to change.\n\nInstallation\n\nThis has been uploaded to the Cheeseshop aka PyPI as 12factor-vault. 
So just add 12factor-vault to your requirements.txt or setup.py. pip install 12factor-vault also works.\n\nEnvironment variables\n\nEach auth backend maps environment variables to a direct-configuration static method on BaseVaultAuthenticator:\n- VAULT_TOKEN -> Token authentication -> token(str)\n- VAULT_APPID, VAULT_USERID -> App-id authentication -> app_id(str, str)\n- VAULT_ROLEID, VAULT_SECRETID -> Approle authentication -> approle(str, str, str, bool)\n- VAULT_SSLCERT, VAULT_SSLKEY -> SSL client authentication -> ssl_client_cert(str, str)\n\nThe Django example below uses the following environment variables:\n- VAULT_DATABASE_PATH: The path to Vault's credential-issuing backend\n- VAULT_CA: The CA issuing Vault's HTTPS SSL certificate (for CA pinning)\n- DATABASE_NAME: Name of the database to connect to on the database server\n- DATABASE_OWNERROLE: The PostgreSQL role to use for SET ROLE after connecting to the database\n\nGeneral usage\n\nBasically, after configuring a BaseVaultAuthenticator instance, which creates authenticated Vault clients (relying on the excellent hvac library), you can use it to create VaultCredentialProvider instances, which manage leases and renew credentials as needed (e.g. database credentials managed by one of Vault's secrets backends). VaultAuth12Factor is a subclass of BaseVaultAuthenticator that reads all necessary configuration from environment variables.\n\nDjango\n\nIntegrating with Django requires a small monkeypatch that retries failed database connections after refreshing the database credentials from Vault. The vault12factor Django app will install that patch automatically. You also have to wrap your database settings dict in a DjangoAutoRefreshDBCredentialsDict instance that knows how to refresh database credentials from Vault. vault12factor will check if an instance of DjangoAutoRefreshDBCredentialsDict is configured in settings.DATABASES before monkey-patching Django. 
So if you want to use vault12factor but configure your databases in separate Django apps or other things that this code can't detect, you will want to call vault12factor.monkeypatch_django() yourself.\n\nHere is an example for integrating this with Django, using Vault to get database credentials. When using PostgreSQL you will also want to look at django-postgresql-setrole.\n\n# in settings.py\nINSTALLED_APPS += [\n    'django_dbconn_retry',\n    'vault12factor',\n]\n\nfrom vault12factor import \\\n    VaultCredentialProvider, \\\n    VaultAuth12Factor, \\\n    DjangoAutoRefreshDBCredentialsDict\n\nif DEBUG and not VaultAuth12Factor.has_envconfig():\n    SECRET_KEY = \"secretsekrit\"  # FOR DEBUG ONLY!\n    DATABASES = {\n        'default': {\n            'ENGINE': 'django.db.backends.sqlite3',\n            'NAME': 'authserver.sqlite3',\n        }\n    }\nelse:\n    if DEBUG:\n        SECRET_KEY = \"secretsekrit\"  # FOR DEBUG ONLY!\n\n    VAULT = VaultAuth12Factor.fromenv()\n    CREDS = VaultCredentialProvider(\"https://vault.local:8200/\", VAULT,\n                                    os.getenv(\"VAULT_DATABASE_PATH\", \"db-mydatabase/creds/fullaccess\"),\n                                    os.getenv(\"VAULT_CA\", None), True,\n                                    DEBUG)\n\n    DATABASES = {\n        'default': DjangoAutoRefreshDBCredentialsDict(CREDS, {\n            'ENGINE': 'django.db.backends.postgresql',\n            'NAME': os.getenv(\"DATABASE_NAME\", \"mydatabase\"),\n            'USER': CREDS.username,\n            'PASSWORD': CREDS.password,\n            'HOST': '127.0.0.1',\n            'PORT': '5432',\n            # requires django-postgresql-setrole\n            'SET_ROLE': os.getenv(\"DATABASE_OWNERROLE\", \"mydatabaseowner\")\n        }),\n    }\n\nLicense\n\nCopyright (c) 2016-2017, Jonas Maurus\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:\n- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.\n- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.\n- Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products 
derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \u201cAS IS\u201d AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."} {"package": "12-test", "pacakge-description": "No description available on PyPI."} {"package": "131228_pytest_1", "pacakge-description": "UNKNOWN"} {"package": "1337", "pacakge-description": "Run: $ 1337 \u2026 to be 1337."} {"package": "1337x", "pacakge-description": "\u2716\ufe0f Unofficial Python API Wrapper of 1337x

This is the unofficial API of 1337x. It supports all proxies of 1337x and almost all functions of 1337x. You can search and get trending, top and popular torrents. Furthermore, you can browse torrents of a certain category. It also supports filtering results by category, sorting and caching.

Table of Contents: Installation, Start Guide, Quick Examples (Searching Torrents, Getting Trending Torrents, Getting information of a torrent), Detailed Documentation (Available attributes, Available methods, Available categories, Available sorting methods), Contributing, Projects using this API, License

Installation
Install via PyPI:
pip install 1337x

Install from the source:
git clone https://github.com/hemantapkh/1337x && cd 1337x && python setup.py sdist && pip install dist/*

Start guide
Quick Examples
1. 
Searching torrents

>>> from py1337x import py1337x
# Using 1337x.tw and saving the cache in an sqlite database which expires after 500 seconds
>>> torrents = py1337x(proxy='1337x.tw', cache='py1337xCache', cacheTime=500)
>>> torrents.search('harry potter')
{'items': [...], 'currentPage': 1, 'itemCount': 20, 'pageCount': 50}
# Searching harry potter in category movies, sorted by seeders in descending order
>>> torrents.search('harry potter', category='movies', sortBy='seeders', order='desc')
{'items': [...], 'currentPage': 1, 'itemCount': 40, 'pageCount': 50}
# Viewing the 5th page of the result
>>> torrents.search('harry potter', page=5)
{'items': [...], 'currentPage': 5, 'itemCount': 20, 'pageCount': 50}

2. Getting Trending Torrents

>>> from py1337x import py1337x
# Using the default proxy (1337x.to) without using cache
>>> torrents = py1337x()
# Today's trending torrents of all categories
>>> torrents.trending()
{'items': [...], 'currentPage': 1, 'itemCount': 50, 'pageCount': 1}
# Trending torrents this week of all categories
>>> torrents.trending(week=True)
{'items': [...], 'currentPage': 1, 'itemCount': 50, 'pageCount': 1}
# Today's trending anime
>>> torrents.trending(category='anime')
{'items': [...], 'currentPage': 1, 'itemCount': 50, 'pageCount': 1}
# Trending anime this week
>>> torrents.trending(category='anime', week=True)
{'items': [...], 'currentPage': 1, 'itemCount': 50, 'pageCount': 1}

3. Getting information of a torrent

>>> from py1337x import py1337x
# Using 1337x.st and passing the cookie since 1337x.st is cloudflare protected
>>> torrents = py1337x('1337x.st', cookie='')
# Getting the information of a torrent by its link
>>> torrents.info(link='https://www.1337xx.to/torrent/258188/h9/')
{'name': 'Harry Potter and the Half-Blood Prince', 'shortName': 'Harry Potter', 'description': \"....\", 'category': 'Movies', 'type': 'HD', 'genre': ['Adventure', 'Fantasy', 'Family'], 'language': 'English', 'size': '3.0 GB', 'thumbnail': '...', 'images': [...], 'uploader': '...', 'uploaderLink': '...', 'downloads': '5310', 'lastChecked': '44 seconds ago', 'uploadDate': '4 years ago', 'seeders': '36', 'leechers': '3', 'magnetLink': '...', 'infoHash': '...'}
# Getting the information of a torrent by its ID
>>> torrents.info(torrentId='258188')
{'name': 'Harry Potter and the Half-Blood Prince', 'shortName': 'Harry Potter', 'description': \"....\", 'category': 'Movies', 'type': 'HD', 'genre': ['Adventure', 'Fantasy', 'Family'], 'language': 'English', 'size': '3.0 GB', 'thumbnail': '...', 'images': [...], 'uploader': '...', 'uploaderLink': '...', 'downloads': '5310', 'lastChecked': '44 seconds ago', 'uploadDate': '4 years ago', 'seeders': '36', 'leechers': '3', 'magnetLink': '...', 'infoHash': '...'}

Detailed documentation

Available attributes
from py1337x import py1337x
torrents = py1337x(proxy='1337x.st', cookie='', cache='py1337xCache', cacheTime=86400, backend='sqlite')

proxy
If the default domain is banned in your country, you can use an alternative domain of 1337x:
1337x.to (default), 1337x.tw, 1377x.to, 1337xx.to, 1337x.st, x1337x.ws, x1337x.eu, x1337x.se, 1337x.is, 1337x.gd

cookie
Some of the proxies are protected with Cloudflare. For such proxies you need to pass a cookie value. To get a cookie, go to the protected site in your browser, solve the captcha and copy the value of cf_clearance.
Firefox: Inspect element > Storage > Cookies
Chrome: Inspect element > Application > Storage > Cookies

cache
py1337x uses requests-cache to store data so that future requests for that data can be served faster. cache can be any of the following:
A boolean value: True for using cache and False for not using cache (cache is not used by default)
A directory for storing the cache

cacheTime
By default the cache expires after one day. You can change the cache expiration time by setting a custom cacheTime:
-1 (to never expire)
0 (to \u201cexpire immediately,\u201d e.g. bypass the cache)
A positive number of seconds (defaults to 86400)
A timedelta
A datetime

backend
The backend for storing the cache can be any of the following:
'sqlite': SQLite database (default)
'redis': Redis cache (requires redis)
'mongodb': MongoDB database (requires pymongo)
'gridfs': GridFS collections on a MongoDB database (requires pymongo)
'dynamodb': Amazon DynamoDB database (requires boto3)
'memory': A non-persistent cache that just stores responses in memory

Available methods
from py1337x import py1337x
torrents = py1337x()

torrents.search(query): search for torrents. Arguments: query (keyword to search for), page (defaults to 1), category (optional), sortBy (optional), order (optional)
torrents.trending(): get trending torrents. Arguments: category (optional), week (defaults to False; True for weekly, False for daily)
torrents.top(): get top torrents. Arguments: category (optional)
torrents.popular(category): get popular torrents. Arguments: category, week (defaults to False; True for weekly, False for daily)
torrents.browse(category): browse torrents of a certain category. Arguments: category, page (defaults to 1)
torrents.info(link or torrentId): get information about a torrent. Arguments: link (link of a torrent) or torrentId (ID of a torrent)

Available categories: 'movies', 'tv', 'games', 'music', 'apps', 'anime', 'documentaries', 'xxx', 'others'
Available sorting methods: 'time', 'size', 'seeders', 'leechers'
Available sorting order: 'desc' (for descending order), 'asc' (for ascending order)

Contributing
Any contributions you make are greatly appreciated.
Fork the Project
Create your Feature Branch (git checkout -b feature/AmazingFeature)
Commit your Changes (git commit -m 'Add some AmazingFeature')
Push to the Branch (git push origin feature/AmazingFeature)
Open a Pull Request
Thanks to every contributor who has contributed to this project.

Projects using this API
Torrent Hunt - Telegram bot to search torrents.
Want to list your project here? Just make a pull request.

License
Distributed under the MIT License. See LICENSE for more information.
Author/Maintainer: Hemanta Pokharel | Youtube: @H9Youtube"} {"package": "153957-theme", "pacakge-description": "Theme 153957
View demo album here

Photo gallery template: web photo gallery templates adapted to my personal preferences.

Usage
This section describes how to install and use this theme.

Installation
Install the 153957-theme package:
$ pip install 153957-theme

Configure
In sigal.conf.py configuration for an album, the theme setting should be a path to a theme directory. However, since this theme is provided as a Python package its location might be harder to get. Two options are available for configuration: the theme can be configured as a plugin, or you can get the path by importing the package. By setting it as a plugin the theme is automatically set.

Set theme to an empty string and add the theme and menu plugins:
theme = ''
plugins = ['theme_153957.theme', 'theme_153957.full_menu', \u2026]

The alternative:
from theme_153957 import theme
theme = theme.get_path()
plugins = ['theme_153957.full_menu', \u2026]

Wrapping album
Use the settings head, body_prefix and body_suffix to add additional code to the templates. 
The value of head is appended to the head element, the body settings are placed just after the body opening tag (prefix) and just before the closing body tag (suffix). This allows embedding the album in your own website."} {"package": "15five-django-ajax-selects", "pacakge-description": "Edit ForeignKey, ManyToManyField and CharField in Django Admin using jQuery UI AutoComplete.
Customize search query
Query other resources besides Django ORM
Format results with HTML
Customize styling
Customize security policy
Add additional custom UI alongside widget
Integrate with other UI elements elsewhere on the page using the javascript API
Works in Admin as well as in normal views
Django >=1.6, <=1.10; Python >=2.7, <=3.5"} {"package": "15five-snowplow-tracker", "pacakge-description": "Overview
Add analytics to your Python apps and Python games with the Snowplow event tracker for Python. With this tracker you can collect event data from your Python-based applications, games or Python web servers/frameworks.

Find out more: Technical Docs, Setup Guide, Roadmap, Contributing

Contributing quickstart
Assuming Git, Vagrant and VirtualBox are installed:
host$ git clone git@github.com:snowplow/snowplow-python-tracker.git
host$ vagrant up && vagrant ssh
guest$ cd /vagrant
guest$ ./run-tests.sh deploy
guest$ ./run-tests.sh test

Publishing
host$ vagrant push

Copyright and license
The Snowplow Python Tracker is copyright 2013-2014 Snowplow Analytics Ltd.
Licensed under the Apache License, Version 2.0 (the \u201cLicense\u201d); you may not use this software except in compliance with the License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \u201cAS IS\u201d BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."} {"package": "168learn", "pacakge-description": "Failed to fetch 
description. HTTP Status Code: 404"} {"package": "170051277-trab-final-gces", "pacakge-description": "Summary of operation: the library helps developers explore data, with essential functions for identifying outliers and anomalies, and an interface that helps visualize the information according to the configuration file. The library receives a yaml file with the configuration of each step of the data pipeline and the address of the database. After execution, the database is updated with the results of the analysis, and the results can be viewed through dashboards in Metabase."} {"package": "17MonIP", "pacakge-description": "IP search based on 17mon.cn, the best IP database for China. Source: http://tool.17mon.cn

Install
Supports python 2.6 to python 3.4 and pypy.
$ pip install 17monip

Usage
>>> import IP
>>> IP.find(\"www.baidu.com\")
'\u4e2d\u56fd\\t\u6d59\u6c5f\\t\u676d\u5dde'
>>> IP.find(\"127.0.0.1\")
'\u672c\u673a\u5730\u5740\\t\u672c\u673a\u5730\u5740'

CMD Util
$ iploc ele.me
\u4e2d\u56fd \u5317\u4eac \u5317\u4eac
$ iploc aliyun.com
\u4e2d\u56fd \u6d59\u6c5f \u676d\u5dde

Changelog
https://github.com/lxyu/17monip/blob/master/CHANGES.rst"} {"package": "18-e", "pacakge-description": "No description available on PyPI."} {"package": "19226331LalitAgrawal", "pacakge-description": "No description available on PyPI."} {"package": "1942pyc", "pacakge-description": "24/7 Fortnite Lobbybot With Admin Controls"} {"package": "19434010112", "pacakge-description": "No description available on PyPI."} {"package": "199Fix", "pacakge-description": "===============
199Fix
===============

199Fix provides a logging handler to push exceptions and other errors to https://199fix.com/.

Installation
============

Installation with ``pip``::

    $ pip install 199fix

Get an API Key here: https://199fix.com/signup/

Add ``'i99fix.handlers.I99FixHandler'`` as a logging handler::

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
            'require_debug_false': {
                '()': 'django.utils.log.RequireDebugFalse'
            }
        },
        'handlers': {
            '199fix': {
                'level': 'ERROR',
                'class': 'i99fix.handlers.I99FixHandler',
                'filters': ['require_debug_false'],
                'api_key': '[your-api-key]',
                'env_name': 'production',
            },
        },
        'loggers': {
            'django': {
                'handlers': ['199fix'],
                'level': 'ERROR',
                'propagate': True,
            },
        }
    }

Settings
========

``level`` (built-in setting): change the ``level`` to ``'ERROR'`` to disable logging of 404 error messages.
``api_key`` (required): API key. Get one here: https://199fix.com/.
``env_name`` (required): name of the environment (e.g. production, development)

Contributing
============

* Fork the repository on GitHub and start hacking.
* Run the tests.
* Send a pull request with your changes."} {"package": "19CS30055-package", "pacakge-description": "My first Python package with a slightly longer description"} {"package": "19CS30055-package1", "pacakge-description": "My first Python package with a slightly longer description"} {"package": "19CS30055-Q2", "pacakge-description": "My first Python package with a slightly longer description"} {"package": "1a23-telemetry", "pacakge-description": "Using external services to track exceptions and other logs from various Python libraries provided by 1A23 Studio. External services used might include: Sentry, Logz.io, Loggly, LogDNA. Privacy policy."} {"package": "1AH22CH174", "pacakge-description": "This is a very basic calculator package."} {"package": "1AH22CS174", "pacakge-description": "This is a very basic calculator package."} {"package": "1and1", "pacakge-description": "No description available on PyPI."} {"package": "1app", "pacakge-description": "Hello, ONE

Usage
pip install 1app

Then, simply use the command periodically:
1app --params\u2026

This will save data to: settings.BASE_DIR/data/1app:default/ItemN

Spacing
If you want to separate different sessions and sources, just use the name param:
1app --params\u2026 --name 
Username

This will save to: settings.BASE_DIR/data/1app:Name/Type

The --name value can be an arbitrary filesystem-compatible filename sub-string, so you can use it to separate data by accounts, languages, or other features. NOTE: corresponding auth and session data will be stored in the settings.BASE_DIR/sessions folder.

Saving to a specific DIR
To save to a custom folder, simply pass the --path parameter, like:
1app --params --name Name --path /home/mindey/Desktop/mydata"} {"package": "1assl", "pacakge-description": "No description available on PyPI."} {"package": "1b01-example-publish-pypi-medium", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "1b1l-distributions", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "1build", "pacakge-description": "1build is an automation tool that arms you with the convenience to configure project-local command line aliases, and then run the commands quickly and easily. It is particularly helpful when you deal with multiple projects and switch between them all the time. It is often the case that different projects use different build tools and have different environment requirements, so switching from one project to another becomes increasingly cumbersome. That is where 1build comes into play.

With 1build you can create simple and easily memorable command aliases for commonly used project commands such as build, test, run or anything else. These aliases have a project-local scope, which means that they are accessible only within the project directory. This way you can unify all your projects to build with the same simple command regardless of which build tool they use. It removes the hassle of remembering all those commands, improving your mental focus for the things that actually matter.

Install
pip install 1build
or
pip3 install 1build

Usage
Configuration
Create a project configuration file in the project folder with the name 1build.yaml.

Example of 1build.yaml for a JVM maven project:
project: Sample JVM Project Name
commands:
  - build: mvn clean package
  - lint: mvn antrun:run@ktlint-format
  - test: mvn clean test

Running 1build for the above sample project:
building the project: 1build build
fix the coding guidelines (lint) and run tests (executing more than one command at once): 1build lint test

Using before and after commands
Consider that your project X requires Java 11 and the other project requires Java 8. It is a headache to always remember to switch the Java version. What you want is to switch to Java 11 automatically when you build project X and switch it back to Java 8 when the build is complete. Another example: a project requires Docker to be up and running, or you need to clean up the database after running a test harness. This is where before & after commands are useful. These commands are both optional; you can use one of them, both or neither.

Examples:

Switching to Java 11 and then back to Java 8:
project: Sample JVM Project Name
before: ./switch_to_java_11.sh
after: ./switch_to_java_8.sh
commands:
  - build: mvn clean package

Ensure that Docker is up and running:
project: Containerized Project
before: ./docker_run.sh
commands:
  - build: ./gradlew clean

Clean up database after some commands:
project: Containerized Project
after: ./clean_database.sh
commands:
  - build: ./gradlew clean

Command usage
usage: 1build [-h] [-l] [-v] [command]

positional arguments:
  command        Command to run - from `1build.yaml` file

optional arguments:
  -h, --help     Print this help message
  -l, --list     Show all available commands - from `1build.yaml` file
  -v, --version  Show version of 1build and exit
  -i, --init     Create default `1build.yaml` configuration file

Contributing
Please read CONTRIBUTING.md for details on our code of conduct, and the process for submitting pull requests to us.

Versioning
We use Semantic Versioning for all our releases. 
For the versions available, see the tags on this repository.

Changelog
All notable changes to this project in each release will be documented in CHANGELOG.md. The format is based on Keep a Changelog.

License
This project is licensed under the MIT License - see the LICENSE file for details.

Authors
Gopinath Langote - initial work & maintainer - Github - Twitter
Alexander Lukianchuk - maintainer - Github - Twitter
See also the list of contributors who participated in this project."} {"package": "1c3o3cxj", "pacakge-description": "No description available on PyPI."} {"package": "1c-format", "pacakge-description": "# 1c_format
A Python parser for bank statements in the 1C format."} {"package": "1c-utilites", "pacakge-description": "1C Utils
=========

**1C Utils** is a set of scripts for managing and maintaining 1C servers.

Repository: https://gitlab.com/onegreyonewhite/1c-utilites
For questions and suggestions, use the issue tracker: https://gitlab.com/onegreyonewhite/1c-utilites/issues

Features
--------

At the moment, only archiving of PostgreSQL and file-based 1C databases is supported.

Quickstart
----------

Any Linux system with Python 2.7/3.4/3.5 and the required maintenance utilities is supported:

* pg_dump for archiving PostgreSQL databases
* tar for packing file-based databases

Installation:

.. sourcecode:: bash

    pip install 1c-utilites
    1c-utilites --help"} {"package": "1d3-checkout-sdk", "pacakge-description": "1D3 checkout page SDK

This is a set of libraries in the Python language to ease integration of your service with the 1D3 Checkout Page. Please note that for correct SDK operation you must have at least Python 3.5.

Payment flow

Installation
Install with pip:
pip install 1d3-checkout-sdk

Get URL for payment
from checkout_page_sdk.gate import Gate
from checkout_page_sdk.payment import Payment

gate = Gate('secret')
payment = Payment('402')
payment.payment_id = 'some payment id'
payment.payment_amount = 1001
payment.payment_currency = 'USD'
payment_url = gate.get_purchase_checkout_page_url(payment)

payment_url here is the signed URL.

Handle callback from 1D3
You'll need to autoload this code in order to handle notifications:
from checkout_page_sdk.gate import Gate

gate = Gate('secret')
callback = gate.handle_callback(data)

data is the JSON data received from the payment system; callback is the Callback object describing properties received from the payment system; callback implements 
these methods:
callback.get_payment_status(): get the payment status.
callback.get_payment(): get all payment data.
callback.get_payment_id(): get the payment ID in your system."} {"package": "1dct", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "1distro", "pacakge-description": "No description available on PyPI."} {"package": "1dyfolabs-test-script", "pacakge-description": "No description available on PyPI."} {"package": "1ee", "pacakge-description": "UNKNOWN"} {"package": "1inch.py", "pacakge-description": "1inch.py

1inch.py is a wrapper around the 1inch API and price oracle. It has full coverage of the swap API endpoint, and all chains supported by 1inch are included in the OneInchSwap and OneInchOracle methods. The package also includes a helper method to ease the submission of transactions to the network. Limited chains are currently supported by the helper.

API Documentation
The full 1inch swap API docs can be found at https://docs.1inch.io/

Installation
Use the package manager pip to install 1inch.py:
pip install 1inch.py

Usage
A quick note on decimals. The wrapper is designed for ease of use, and as such accepts amounts in \"Ether\" or whole units. If you prefer, you can use decimal=0 and specify amounts in wei. This will also help with any potential floating point errors.

from oneinch_py import OneInchSwap, TransactionHelper, OneInchOracle

rpc_url = \"yourRPCURL.com\"
binance_rpc = \"adifferentRPCurl.com\"
public_key = \"yourWalletAddress\"
private_key = \"yourPrivateKey\"  # remember to protect your private key. Using environment variables is recommended.
api_key = \"\"  # 1 Inch API key

exchange = OneInchSwap(api_key, public_key)
bsc_exchange = OneInchSwap(api_key, public_key, chain='binance')
helper = TransactionHelper(api_key, rpc_url, public_key, private_key)
bsc_helper = TransactionHelper(api_key, binance_rpc, public_key, private_key, chain='binance')
oracle = OneInchOracle(rpc_url, chain='ethereum')

# See chains currently supported by the helper method:
helper.chains
# {\"ethereum\": \"1\", \"binance\": \"56\", \"polygon\": \"137\", \"avalanche\": \"43114\"}

# Straight to business:
# Get a swap and do the swap
result = exchange.get_swap(\"USDT\", \"ETH\", 10, 0.5)  # get the swap transaction
result = helper.build_tx(result)      # prepare the transaction for signing, gas price defaults to fast
result = helper.sign_tx(result)       # sign the transaction using your private key
result = helper.broadcast_tx(result)  # broadcast the transaction to the network and wait for the receipt

## If you already have token addresses you can pass those in instead of token names to all OneInchSwap functions that require a token argument
result = exchange.get_swap(\"0x7fc66500c84a76ad7e9c93437bfc5ac33e2ddae9\", \"0x43dfc4159d86f3a37a5a4b3d4580b888ad7d4ddd\", 10, 0.5)

# USDT to ETH price on the Oracle. Note that you need to indicate the token decimal if it is anything other than 18.
oracle.get_rate_to_ETH(\"0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48\", src_token_decimal=6)

# Get the rate between any two tokens.
oracle.get_rate(src_token=\"0x6B175474E89094C44Da98b954EedeAC495271d0F\", dst_token=\"0x111111111117dC0aa78b770fA6A738034120C302\")

exchange.health_check()
# 'OK'

# Address of the 1inch router that must be trusted to spend funds for the swap
exchange.get_spender()

# Generate data for calling the contract in order to allow the 1inch router to spend funds. Token symbol or address is required. 
If the optional \"amount\" variable is not supplied (in ether), unlimited allowance is granted.
exchange.get_approve(\"USDT\")
exchange.get_approve(\"0xdAC17F958D2ee523a2206206994597C13D831ec7\", amount=100)

# Get the number of tokens (in wei) that the router is allowed to spend. Optional \"send_address\" variable; if not supplied, uses the address supplied when initializing the exchange object.
exchange.get_allowance(\"USDT\")
exchange.get_allowance(\"0xdAC17F958D2ee523a2206206994597C13D831ec7\", send_address=\"0x12345\")

# The token list is stored in memory
exchange.tokens
# {
#   '1INCH': {'address': '0x111111111117dc0aa78b770fa6a738034120c302',
#             'decimals': 18,
#             'logoURI': 'https://tokens.1inch.exchange/0x111111111117dc0aa78b770fa6a738034120c302.png',
#             'name': '1INCH Token',
#             'symbol': '1INCH'},
#   'ETH': {'address': '0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee',
#           'decimals': 18,
#           'logoURI': 'https://tokens.1inch.exchange/0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee.png',
#           'name': 'Ethereum',
#           'symbol': 'ETH'},
#   ......
# }

# Returns the exchange rate of two tokens.
# Tokens can be provided as symbols or addresses
# \"amount\" is supplied in ether
# NOTE: When using custom tokens, the token decimal is assumed to be 18. If your custom token has a different decimal - please manually pass it to the function (decimal=x)
# Also returns the \"price\" of the more expensive token in the cheaper tokens. 
Optional variables can be supplied as **kwargs.
exchange.get_quote(from_token_symbol='ETH', to_token_symbol='USDT', amount=1)
# (
#   {
#     \"fromToken\": {
#       \"symbol\": \"ETH\",
#       \"name\": \"Ethereum\",
#       \"decimals\": 18,
#       \"address\": \"0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee\",
#       \"logoURI\": \"https://tokens.1inch.io/0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee.png\",
#       \"tags\": [\"native\"],
#     },
#     \"toToken\": {
#       \"symbol\": \"USDT\",
#       \"name\": \"Tether USD\",
#       \"address\": \"0xdac17f958d2ee523a2206206994597c13d831ec7\",
#       \"decimals\": 6,
#       \"logoURI\": \"https://tokens.1inch.io/0xdac17f958d2ee523a2206206994597c13d831ec7.png\",
#       \"tags\": [\"tokens\"],
#   ...
#   Decimal(\"1076.503093\"),
# )

# Creates the swap data for two tokens.
# Tokens can be provided as symbols or addresses
# Optional variables can be supplied as **kwargs
# NOTE: When using custom tokens, the token decimal is assumed to be 18. If your custom token has a different decimal - please manually pass it to the function (decimal=x)
exchange.get_swap(from_token_symbol='ETH', to_token_symbol='USDT', amount=1, slippage=0.5)
# {
#   \"fromToken\": {
#     \"symbol\": \"ETH\",
#     \"name\": \"Ethereum\",
#     \"decimals\": 18,
#     \"address\": \"0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee\",
#     \"logoURI\": \"https://tokens.1inch.io/0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee.png\",
#     \"tags\": [\"native\"],
#   },
#   \"toToken\": {
#     \"symbol\": \"USDT\",
#     \"name\": \"Tether USD\",
#     \"address\": \"0xdac17f958d2ee523a2206206994597c13d831ec7\",
#     \"decimals\": 6,
#     \"logoURI\": \"https://tokens.1inch.io/0xdac17f958d2ee523a2206206994597c13d831ec7.png\",
#     \"tags\": [\"tokens\"],
#   ...
#   ],
#   \"tx\": {
#     \"from\": \"0x1d05aD0366ad6dc0a284C5fbda46cd555Fb4da27\",
#     \"to\": \"0x1111111254fb6c44bac0bed2854e76f90643097d\",
#     \"data\": 
\"0xe449022e00000000000000000000000000000000000000000000000006f05b59d3b20000000000000000000000000000000000000000000000000000000000001fed825a0000000000000000000000000000000000000000000000000000000000000060000000000000000000000000000000000000000000000000000000000000000140000000000000000000000011b815efb8f581194ae79006d24e0d814b7697f6cfee7c08\",
#     \"value\": \"500000000000000000\",
#     \"gas\": 178993,
#     \"gasPrice\": \"14183370651\",
#   },
# }

Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Thanks to @Makbeta for all of their work in migrating the wrapper to the new 1inch api system.

License
MIT"} {"package": "1lever-utils", "pacakge-description": "No description available on PyPI."} {"package": "1nester", "pacakge-description": "No description available on PyPI."} {"package": "1NeuronPerceptron-Pypi-mdnazmulislam0087", "pacakge-description": "1NeuronPerceptron_Pypi
1Neuron|Perceptron|_Pypi
\"\"\"
author: Nazmul
email: md.nazmul.islam0087@gmail.com
\"\"\"

How to use this
First install the library using the command below, using the latest version:
pip install 1NeuronPerceptron-Pypi-mdnazmulislam0087==0.0.4

Run the code below to see the training and plot file for an OR gate; similarly you can use the AND, NAND and XOR gates to see the difference:

from oneNeuronPerceptron.perceptron import Perceptron
from oneNeuronPerceptron.all_utils import prepare_data, save_model, save_plot
import pandas as pd
import numpy as np
import logging
import os

logging_str = \"[%(asctime)s:%(levelname)s:%(module)s] %(message)s\"
logging.basicConfig(level=logging.INFO, format=logging_str)

def main(data, eta, epochs, modelfilename, plotfilename):
    df = pd.DataFrame(data)
    logging.info(f\"The dataframe is: {df}\")
    X, y = prepare_data(df)
    model = Perceptron(eta=eta, epochs=epochs)
    model.fit(X, y)
    _ = model.total_loss()
    save_model(model, filename=modelfilename)
    save_plot(df, file_name=plotfilename, model=model)

if __name__ == \"__main__\":  # << entry point <<
    try:
        logging.info(\">>>>> starting training >>>>>\")
        main(data=OR, eta=ETA, epochs=EPOCHS, modelfilename=\"or.model\", plotfilename=\"or.png\")
        logging.info(\"<<<<< training done successfully <<<<<\\n\")
    except Exception as e:
        logging.exception(e)
        raise e

Packages required:
matplotlib
numpy
pandas
joblib
tqdm

Limitation
Using a one-neuron perceptron, we can't make a decision boundary for the XOR gate. In summary, XOR gate classification is not possible using a one-neuron perceptron.

Reference:
official python docs
github docs for github actions
Read me editor
More details can be found at 1Neuron Perceptron"} {"package": "1neuron-pypi-overlordiam", "pacakge-description": "1neuron_pypi"} {"package": "1on1", "pacakge-description": "No description available on PyPI."} {"package": "1OS", "pacakge-description": "No description available on PyPI."} {"package": "1pass", "pacakge-description": "A command line interface (and Python library) for reading passwords from 1Password.

Command line usage
To get a password:
1pass mail.google.com

By default this will look in ~/Dropbox/1Password.agilekeychain. If that's not where you keep your keychain:
1pass --path ~/whatever/1Password.agilekeychain mail.google.com

Or, you can set your keychain path as an environment variable:
export ONEPASSWORD_KEYCHAIN=/path/to/keychain
1pass mail.google.com

By default, the name you pass on the command line must match the name of an item in your 1Password keychain exactly. 
To avoid this, fuzzy matching is
made possible with the --fuzzy flag: 1pass --fuzzy mail.goog If you don\u2019t want to be prompted for your password, you can use the --no-prompt flag and provide the password via standard input instead: emit_master_password | 1pass --no-prompt mail.google.com Python usage: The interface is very simple: from onepassword import Keychain

my_keychain = Keychain(path=\"~/Dropbox/1Password.agilekeychain\")
my_keychain.unlock(\"my-master-password\")
my_keychain.item(\"An item's name\").password An example of real-world use: I wrote this so I could add the following line to my .muttrc file: set imap_pass = \"`1pass 'Google: personal'`\" Now, whenever I start mutt, I am prompted for my 1Password Master Password
and not my Gmail password. The --no-prompt flag is very useful when configuring mutt and PGP. mutt passes the PGP passphrase via standard input, so by inserting 1pass into this pipeline I can use my 1Password master password when prompted for my
PGP passphrase: set pgp_decrypt_command=\"1pass --no-prompt pgp-passphrase | gpg --passphrase-fd 0 ...\" Contributors: Pip Taylor, Adam Coddington, Ash Berlin, Zach Allaun, Eric Mika. License: 1pass is licensed under the MIT license. See the license file for details. While it is designed to read .agilekeychain bundles created by 1Password, 1pass isn\u2019t officially sanctioned or supported by AgileBits.
I do hope they like it though."} {"package": "1pass2pass", "pacakge-description": "1pass2pass: Utility for transferring items from 1Password (*.1pif files) to pass (the standard unix password manager). Usage: python 1pass2pass [-h] [-v] [-p] [-f] [-fl] <1pif filename> Positional arguments: <1pif filename> - path to the *.1pif file for processing; - path to the folder in which to store passwords. Optional arguments: -h, --help - show help message; -v, --verbose - increase output verbosity; -p, --print-only - print data to the console without saving it to the password store; -f, --force - force overwrite of existing passwords (default=False); -fl, --first-line - put the password on the first line (default=False). WARN: If you want to use the Pass application on iOS, you need to use the --first-line option,
because the Pass app on iOS looks for the password on the first line. Example: python 1pass2pass.py -v -f -fl ~/Downloads/1password.1pif /Personal Here we import passwords from ~/Downloads/1password.1pif into the folder /Personal, with verbose output, force-overwriting existing passwords and putting the password on the first line. Requirements: python 3.6+, loguru"} {"package": "1password", "pacakge-description": "OnePassword python client: Python client around the 1Password password manager cli for usage within python code and
Jupyter Notebooks. Developed by Data Scientists from Jamf. Supported versions: There are some pre-requisites that are needed to use the library. We automatically install the cli for Mac and
Linux users when installing the library.
Windows users see below for help. 1Password App: 8+; 1Password cli: 2+; Python: 3.10+. Operating systems: The library is split into two parts, installation and client, which we are slowly updating to cover as many operating
systems as possible; the following table should ensure users understand what this library can and can't do at time of
install. MacOS and Linux are both fully supported: CLI install, SSO login, login via App, biometrics auth, password auth, CLI client and service account all work on both. Installation: pip install 1password If you have issues with PyYaml or other distutils-installed packages then use: pip install --ignore-installed 1password You are welcome to install and manage op yourself by visiting the CLI1 downloads page to download the version you require
and follow instructions for your platform, as long as its major version is 2. The above pip commands will check whether op is present already and if not will install the supported op cli
plus the python client itself.
This is currently fixed at op version 1.12.5 to ensure compatibility. If you wish to use a higher version of op you
can by following this guide,
however note that we cannot ensure it will work with our client yet. MacOS users will be prompted with a separate installation window to ensure you have a signed version of op - make
sure to check other desktops that the installer might pop up on. Optional pre-requisites: base32. This utility is used to create a unique guid for your device, but this isn't a hard requirement from AgileBits,
and so if you see base32: command not found an empty string will be used instead,
and the client will still work fully. If you really want to, you can make sure you have this installed by installing coreutils. Details per platform can
be found here: https://command-not-found.com/base32 Basic Usage: Since v2 of the cli it is advised to connect your CLI to the local app installed on the device, thus removing the need
for secret keys and passwords in the terminal or shell.
Read here on how to do that: https://developer.1password.com/docs/cli/get-started#step-2-turn-on-the-1password-desktop-app-integration An added extra for Mac users is that you can also enable TouchID for the app, and by linking your cli with the app you
will get biometric login for both. Once this is done, on any initial usage of the cli our client will request you to authenticate either via the app or
using your biometrics, and then you can continue. We are sure there are use cases where the app cannot be linked and hence a password etc. is still required, so this
functionality is still present from our v1 implementation and is described below. Password authentication: On first usage users will be asked for the enrolled email, secret key and password.
There is also verification of your account domain and name. For all following usages you will only be asked for a password. You will be given 3 attempts and then pointed to reset-password documentation, or alternatively you can
restart your kernel. No passwords are stored in memory without encryption. If you have 2FA turned on for your 1Password account the client will ask for your six-digit authenticator code. from onepassword import OnePassword import json op = OnePassword() # List all vaults json.loads(op.list_vaults()) # List all items in a vault, default is Private op.list_items() # Get all fields, one field or more fields for an item with uuid=\"example\" op.get_item(uuid=\"example\") op.get_item(uuid=\"example\", fields=\"username\") op.get_item(uuid=\"example\", fields=[\"username\", \"password\"]) Service Accounts: We also support authentication using Service accounts; however, these are not interchangeable with other auth routes and
hence other accounts, i.e.
this token based authentication will take precedence over any other method.
If you wish to use multiple account types, for now you will need to design this workflow yourself
by clearing out the OP_SERVICE_ACCOUNT_TOKEN using unset OP_SERVICE_ACCOUNT_TOKEN before re-authenticating with
your preferred account. To use a service account make sure to fulfil the requirements here: https://developer.1password.com/docs/service-accounts/use-with-1password-cli and note that not all of the CLI commands are supported at this time. Also note that your service account will only have access to certain vaults. In particular it will not be able to see
the Shared or Private vaults in any account. In our client this means you must always use the vault option. Once you have fulfilled all the requirements, namely export OP_SERVICE_ACCOUNT_TOKEN=, you can then use
our client with: from onepassword import OnePassword op = OnePassword() op.list_vaults() Input formats: To be sure what you are using is of the right format: Enrolled email: standard email format e.g. user@example.com Secret key: provided by 1Password e.g. ##-######-######-#####-#####-#####-##### Account domain: domain that you would login to 1Password via browser e.g. example.1password.com Account name: subdomain or account name that cli can use for multiple account holders e.g.
example. Contributing: The GitHub action will run a full build, test and release on any push.
If this is to the main branch then this will release to public PyPi and bump the patch version. For a major or minor branch update, your new branch should include this new version and this should be verified by the
code owners. In general, this means when contributing you should create a feature branch off of the main branch, and without
manually bumping the version you can focus on development. CLI coverage: Full op documentation can be found here: https://support.1password.com/command-line-reference/ The below is correct as of version 0.3.0. Commands: This is the set of commands the current python SDK covers: create: Create an object (document); delete: Remove an object (item: we use this method to remove documents, but now there is a new delete document method); get: Get details about an object (document, item); list: List objects and events (items, vaults); signin: Sign in to a 1Password account; signout: Sign out of a 1Password account. This is what still needs developing due to new functionality being released: add: Grant access to groups or vaults (group, user); completion: Generate shell completion information; confirm: Confirm a user; create: Create an object (group, user, item, vault); delete: Remove an object (document, user, vault, group, trash); edit: Edit an object (document, group, item, user, vault); encode: Encode the JSON needed to create an item; forget: Remove a 1Password account from this device; get: Get details about an object (account, group, template, totp, user, vault); list: List objects and events (documents, events, groups, templates, users); reactivate: Reactivate a suspended user; remove: Revoke access to groups or vaults; suspend: Suspend a user; update: Check for and download updates. Roadmap: Add Windows functionality; Add clean uninstall of client and op; Remove subprocess usage everywhere -> use pexpect; Add test docker image; Get full UT coverage; Align response types into JSON / lists instead of JSON strings; Ensure full and matching functionality of CLI in
python: add, confirm, create, delete, edit, encode, forget, get, list, reactivate, remove, suspend"} {"package": "1password-secrets", "pacakge-description": "1password-secrets is a set of utilities to sync 1Password secrets. It enables: Seamless sharing of local secrets used for development.
Developers starting out in a project can just use this tool to retrieve the .env file needed for
local development.
Likewise it is also simple to push back any local changes to the 1password vault. A more secure and simpler method of managing Fly.io secrets.
By default, Fly secrets must be managed by flyctl. This means that to set secrets in
production, developers must use flyctl, passing credentials via arguments - risking credentials
being stored in their shell histories. Alternatively, one must store secrets in a file and run flyctl secrets import. This works well, but you must ensure everything is synched to a
secret/password manager and then delete the file.
1password-secrets enables leaner management of secrets via 1password. Given an app name, it
automatically finds and imports secrets from a 1password secure note to Fly.
This way you ensure
developers always keep secrets up-to-date and never lose files on their computers. Motivation: Using 1password for this avoids the need for another external secret management tool,
and keeps the access control in a centralised place that we already use. Getting started - Requirements: Install the required dependencies: 1Password >= 8.9.13, 1Password CLI >= 2.13.1, flyctl >= 0.0.451, Python >= 3.9. brew install --cask 1password 1password-cli && \\
brew install flyctl Allow 1Password to connect to 1Password-CLI by going to Settings -> Developer -> Command-Line Interface (CLI) and selecting Connect with 1Password CLI. Sign into your 1Password and Fly account (if you wish to use the fly integration). Installation: On most systems (Mac and Linux), when pip3 (Python 3's pip) is in the path
and you want to install it at the user level: pip3 install -U 1password-secrets Otherwise adapt it accordingly. Usage - Local: From within a valid git repository with remote \"origin\" ending in /.git,
1password-secrets will be able to pull and push secrets to a 1password secure note containing repo:/ in its name.
By default it syncs to the ./.env file;
this can be overridden with a file_name field containing the desired relative file path. To get secrets from 1Password, run: 1password-secrets local pull To push the local changes to 1Password, run: 1password-secrets local push Fly: Make sure you have a Secure Note in 1Password with fly: in the title. fly-app-name is the name of your fly application. To import secrets to fly, run: 1password-secrets fly import Secrets can be edited directly in the 1Password app or using the command: 1password-secrets fly edit Development: Ensure you have make installed. Create a virtual environment: make setup-venv. Install dependencies: make install-deps. Then you can install (link) the repo globally with make local-install. Before pushing changes ensure your code is properly formatted with make lint.
Auto-format the code with make format"} {"package": "1primo1", "pacakge-description": "This program lists the prime numbers from 2 up to n, where n is an integer given by the user.
The function is called \u201cprimo(n)\u201d and the package \u201cmostrarprimo\u201d.
For example, mostrarprimo.primo(15) lists all the prime numbers from 2 to 15.
This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version"} {"package": "1Q847", "pacakge-description": "No description available on PyPI."} {"package": "1RF21IS005", "pacakge-description": "No description available on PyPI."} {"package": "1secMail", "pacakge-description": "An API wrapper for www.1secmail.com written in Python. About: This is an easy to use yet full-featured Python API wrapper for www.1secmail.com\u2197 using the official 1secMail API.
It allows you to easily create temporary email addresses for testing, verification, or other purposes where you need a disposable email address.Asynchronous operations are also supported!:thumbsup:InstallTo install the package, you'll need Python 3.8 or above installed on your computer. From your command line:pipinstall1secMailNoteIf you're willing to install the development version, do the following:gitclonehttps://github.com/qvco/1secMail-Python.gitcd1secMail-Python\n\npipinstall-rrequirements.txt\n\npipinstall-e.UsageGenerating Email AddressesTo generate a list of random email addresses, use therandom_email()method:importsecmailclient=secmail.Client()client.random_email(amount=3)>>>['c3fho3cry1@1secmail.net','5qcd3d36zr@1secmail.org','b6fgeothtg@1secmail.net']You can also generate a custom email address by specifying the username and domain:NoteSpecifying a domain is optional!client.custom_email(username=\"bobby-bob\",domain=\"kzccv.com\")>>>'bobby-bob@kzccv.com'Receiving MessagesTo wait until a new message is received, use theawait_new_message()method:message=client.await_new_message(\"bobby-bob@kzccv.com\")To check all messages received on a particular email address, use theget_inbox()method and pass the email address:inbox=client.get_inbox(\"bobby-bob@kzccv.com\")formessageininbox:print(message.id)print(message.from_address)print(message.subject)print(message.date)You can also fetch a single message using theget_message()method and passing the email address and message ID:message=client.get_message(address=\"bobby-bob@kzccv.com\",message_id=235200687)print(message.id)print(message.subject)print(message.body)print(message.text_body)print(message.html_body)print(message.attachments)print(message.date)Downloading an attachmentYou can download an attachment from a message in the inbox of a specified email address using the download_attachment method like this:client.download_attachment(address,message_id,attachment_filename)>>>'Path: 
(C:\\Users\\user\\path/config/rocket.png), Size: 49071B'Asynchronous ClientGenerating Email AddressesTo generate a list of random email addresses, use therandom_email()method:importasyncioimportsecmailasyncdefmain():client=secmail.AsyncClient()email_addresses=awaitclient.random_email(amount=3)print(email_addresses)asyncio.run(main())>>>['c3fho3cry1@1secmail.net','5qcd3d36zr@1secmail.org','b6fgeothtg@1secmail.net']You can also generate a custom email address by specifying the username and domain:NoteSpecifying a domain is optional!awaitclient.custom_email(username=\"bobby-bob\",domain=\"kzccv.com\")>>>'bobby-bob@kzccv.com'Receiving MessagesTo wait until a new message is received, use theawait_new_message()method:importasyncioimportsecmailasyncdefmain():client=secmail.AsyncClient()message=awaitclient.await_new_message(\"bobby-bob@kzccv.com\")print(f\"{message.from_address}:{message.subject}\")asyncio.run(main())To check all messages received on a particular email address, use theget_inbox()method and pass the email address:importasyncioimportsecmailasyncdefmain():client=secmail.AsyncClient()inbox=awaitclient.get_inbox(\"bobby-bob@kzccv.com\")print(f\"You have{len(inbox)}messages in your inbox.\")formessageininbox:print(message.id)print(message.from_address)print(message.subject)print(message.date)asyncio.run(main())You can also fetch a single message using theget_message()method and passing the email address and message ID:importasyncioimportsecmailasyncdefmain():client=secmail.AsyncClient()address=\"bobby-bob@kzccv.com\"inbox=awaitclient.get_inbox(address)message_id=inbox[0].idmessage=awaitclient.get_message(address,message_id)print(message.id)print(message.subject)print(message.body)print(message.text_body)print(message.html_body)print(message.attachments)print(message.date)asyncio.run(main())Downloading an attachmentYou can download an attachment from a message in the inbox of a specified email address using the download_attachment method like 
this: import asyncio import secmail async def main(): client = secmail.AsyncClient() address = \"bobby-bob@kzccv.com\" inbox = await client.get_inbox(address) message_id = inbox[0].id message = await client.get_message(address, message_id) attachment_filename = message.attachments[0].filename await client.download_attachment(address, message_id, attachment_filename) asyncio.run(main()) >>> 'Path: (C:\\Users\\user\\path/config/rocket.png), Size: 49071B' License: This software is licensed under the MIT license \u00a9 Qvco."} {"package": "1secmail-python", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "1secmailpythonwarper", "pacakge-description": "No description available on PyPI."} {"package": "1st", "pacakge-description": "No description available on PyPI."} {"package": "1st-Folder", "pacakge-description": "Sample package made for a demo of its making for the Booking System Article."} {"package": "1-test-package", "pacakge-description": "The SI API will be uploaded later."} {"package": "1to001", "pacakge-description": "1to001 is made for padding numbers in filenames automatically. It\u2019s written in Python 3. Installation: System-wide installation: $ pip install 1to001 User installation: $ pip install --user 1to001 For development code: $ pip install git+https://github.com/livibetter/1to100.git Example: $ touch 1.txt 100.txt $ 1to001 *.txt + 001.txt ? ++ perform padding (y/n)? y 1.txt -> 001.txt Options: -i (--ignore-case): When cases are mixed, this option ignores the cases, for example for files like: read100me1.TXT
read5Me02.txt They can be renamed to: 1to001 -i *.{txt,TXT}
+ read005Me02.txt
? ++
+ read100me01.TXT
? +
perform padding (y/n)? -y (--yes): Automatic yes to prompts. More information: 1to001 on GitHub, PyPI. Some usage examples in this blog post. License: This project is licensed under the MIT License, see COPYING."} {"package": "1zlab-emp-ide", "pacakge-description": "emp_ide"} {"package": "1zlab-homepage", "pacakge-description": "xxx"} {"package": "2", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "2000", "pacakge-description": "welcome to my package"} {"package": "2006", "pacakge-description": "welcome to my package"} {"package": "2013007_pyh", "pacakge-description": "UNKNOWN"} {"package": "20191004", "pacakge-description": "No description available on PyPI."} {"package": "2020", "pacakge-description": "No description available on PyPI."} {"package": "2021assignment1calculator1", "pacakge-description": "No description available on PyPI."} {"package": "2021ccps3", "pacakge-description": "No description available on PyPI."} {"package": "20220429-pdfminer-jameslp310", "pacakge-description": "pdfminer.six: Please note this is a fork of the main package as of 29/04/2022 12:00 GMT. No updates are likely. It has been forked and uploaded to ensure a wholly MIT version of the package is available. The following is from the original package. We fathom PDF: Pdfminer.six is a community-maintained fork of the original PDFMiner. It is a tool for extracting information from PDF
documents. It focuses on getting and analyzing text data. Pdfminer.six extracts the text from a page directly from the
source code of the PDF. It can also be used to get the exact location, font or color of the text. It is built in a modular way such that each component of pdfminer.six can be replaced easily. You can implement your own
interpreter or rendering device that uses the power of pdfminer.six for purposes other than text analysis. Check out the full documentation on Read the Docs. Features: Written entirely in Python. Parse, analyze, and convert PDF documents. PDF-1.7 specification support.
(well, almost). CJK languages and vertical writing scripts support. Various font types (Type1, TrueType, Type3, and CID) support. Support for extracting images (JPG, JBIG2, Bitmaps). Support for various compressions (ASCIIHexDecode, ASCII85Decode, LZWDecode, FlateDecode, RunLengthDecode,
CCITTFaxDecode). Support for RC4 and AES encryption. Support for AcroForm interactive form extraction. Table of contents extraction. Tagged contents extraction. Automatic layout analysis. How to use the original version: Install Python 3.6 or newer. Install: pip install pdfminer.six (Optionally) install extra dependencies for extracting images: pip install 'pdfminer.six[image]' Use the command-line interface to extract text from a pdf: python pdf2txt.py samples/simple1.pdf Contributing: Be sure to read the contribution guidelines. Acknowledgement: This repository includes code from pyHanko; the original license has been included here."} {"package": "2022.12.06.1052", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "20221206.1356", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "20221206.1408", "pacakge-description": "hello it is just a test"} {"package": "20221206.1418", "pacakge-description": "hello it is just a test"} {"package": "2022-2-gces-ifpf", "pacakge-description": "Individual assignment for GCES 2022-2: Software Configuration Management knowledge is fundamental in the life cycle of a software product. Management techniques range from version control, build and environment-configuration automation, automated tests and environment isolation, up to system deployment.
Nowadays this whole cycle is integrated into a DevOps pipeline, with the Continuous Integration (CI) and Continuous Deployment (CD) stages implemented and automated. To exercise this knowledge, in this assignment you must apply the concepts studied throughout the course to the software product contained in this repository. The system is a python library for running customizable data pipelines on databases. To run the application on your machine, just follow the step-by-step instructions described below. Application summary: The library helps developers explore data, with essential functions for identifying outliers and anomalies, and an interface that helps visualize the information according to the configuration file. The library receives a yaml file with the settings for each stage of the data pipeline and the address of the database.
After execution, the database is updated with the results of the analysis, and the results can be visualized through dashboards in metabase. Assignment stages: The assignment must be carried out in stages. Each stage must be delivered in a separate commit containing that stage's working result. Stages 1 to 3 relate to environment isolation using the Docker and Docker Compose tools. The tutorial below covers the fundamental concepts for using these technologies: Docker tutorial. Stages 4 and 5 relate to configuring the CI and CD pipeline: CI tutorial - Gitlab. Containerization of the database: The initial version of the system includes metabase in the backend, whose operation requires an installation of a Mongo database.
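The database-only container required by this first stage could be sketched with Docker Compose as below. This is an illustrative sketch only: the actual credentials are the ones "specified in the application description" and are not reproduced here, so the placeholder values, service name, port mapping and volume are all assumptions.

```yaml
# Sketch of a database-only Docker Compose service for the Mongo backend.
# The credential values are placeholders, NOT the assignment's real credentials.
version: "3.8"
services:
  mongo:
    image: mongo:latest
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: <user-from-app-description>
      MONGO_INITDB_ROOT_PASSWORD: <password-from-app-description>
    ports:
      - "27017:27017"          # default MongoDB port, exposed for testing
    volumes:
      - mongo_data:/data/db    # persist data across container restarts
volumes:
  mongo_data:
```

With such a file, `docker compose up mongo` would bring up only the database container, which can then be tested independently before the application and metabase are containerized in the later stages.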
The first stage of the assignment is to configure a container for the database only, with the credentials specified in the application description, and to test that it works. Containerization of the application + metabase: In this stage, both the python application and metabase/database must be running in individual containers. An orchestrator (Docker Compose) must be used to manage communication between the containers, in addition to the use of credentials, networks, volumes and other settings needed for the correct execution of the application. Python dependency and package management: Configure the python dependency and package manager, poetry, to generate a pip package of the solution. Publish the library. https://python-poetry.org Automated documentation: Generate the library's documentation automatically, using doxygen to extract information from the library and sphinx to create the documentation. https://www.sphinx-doc.org Continuous Integration (CI): For this stage, the application must already have its environment fully containerized. A Continuous Integration tool must be used to guarantee the build, the tests and the deploy to https://pypi.org. This stage of the assignment may be carried out using the CI environments of GitLab-CI or Github Actions. Requirements for the Continuous Integration configuration (Gitlab or Github) include: Build (Poetry)
Test - unit tests
Lint -
Documentation (sphinx) Evaluation: The assignment will be evaluated based on the correct implementation of each stage.
The evaluation will be both quantitative (whether the implementation + documentation was done) and qualitative (how it was implemented, understanding of the concepts in practice, complexity of the solution). For this, make atomic, well-documented, complete commits in order to make it easier to understand and evaluate your work. Remember, the assignment is individual. Notes: The final delivery date for the assignment is 28/01/2023; The assignment must be developed in a PERSONAL and PRIVATE repository, which must be made public only after the delivery date (on 28/01/2023); Each stage of the assignment must be delivered in progressive commits (there may be more than one commit per stage); Commits must be spread out over days throughout the development of the assignment; commits made all at once on the delivery date will count against the final grade. Item / Weight: 1. Containerization of the database: 1.0; 2. Containerization of the library + database: 1.5; 3. Publication of the library: 1.5; 4. Automated documentation: 1.5; 5. Continuous Integration (Build, Test, Lint, documentation): 3.0; 6. Continuous Deployment: 1.5. Examples of previous assignments: Some assignments from previous terms: 2020/2, 2021/1, 2021/2. Installation requirements: python -m venv env
source env/bin/activate
pip install -r requirements.txt Running the application: python src/main.py Testing: pytest --cov Metabase: Metabase helps visualize and model the data processing, the feature engineering and the monitoring of the model. Keywords / Description: CSV: A CSV file is a plain text file that stores table and spreadsheet information.
CSV files can easily be imported and exported using programs that store data in tables. Collection: A collection is a grouping of MongoDB documents. The documents within a collection can have different fields. A collection is the equivalent of a table in a relational database system. Database: A database stores one or more collections of documents. Mongo: A NoSQL database developed by MongoDB Inc. The MongoDB database was created to store a large amount of data and also to run fast. Connect the database to the metabase: step 1: Open localhost:3000; step 2: Click Admin settings; step 3: Click Database; step 4: Add the database authentication data. Example of the mongo-metabase connection (credential: metabase value): host: mongo; database_name: use the name you define in make migrate; user: lappis; password: lappis"} {"package": "2022-assignment1-ITIS", "pacakge-description": "assignment 1, ITIS group"} {"package": "2022Calculator", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2022-distributions", "pacakge-description": "No description available on PyPI."} {"package": "2022Printer", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2022-requests", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2023-assignement-DevOps", "pacakge-description": "No description available on PyPI."} {"package": "2023-assignemt1-viewscounter", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "2023-assignment1-DevOps", "pacakge-description": "No description available on PyPI."} {"package": "2023-assignment1-viewscounter", "pacakge-description": "Assignment 1 - Processo e Sviluppo del Software (Software Process and Development). Git URL: https://gitlab.com/bicoccaprojects/2023_assignment1_viewscounter CED group members: Ficara Damiano (919386), Ricci Claudio (918956), Toli Emilio (920337). Notes: The purpose of this README is to provide complete documentation of the decisions taken during the development of the pipeline, spanning all of its phases: Build, Verify, Unit-test, Integration-test, Package, Release and Docs. In addition, in-depth justifications and comments will be provided for each of these choices. Introduction: The first assignment of the Processo e Sviluppo del Software course 2023/2024 has as its goal the creation of a CI/CD pipeline that automates the maintenance process of an application following the set of DEVOPS practices, aiming to shorten the development life cycle of a system and, above all, to provide continuous delivery of high-quality software. The decision to develop the application in Python was taken with the goal of simplifying the development of the CI/CD pipeline. Compared to other languages, such as Java, Python offers a smoother development experience in this context. For example, Java would require specific tools and libraries to manage CI/CD processes, while Python offers a series of advantages that allow the pipeline to be implemented and managed in a more direct and efficient way. Application: The main objective of the assignment is not the implementation of the application itself. Therefore, an extremely simple system called \"Views Counter\" was chosen. This system uses the Firebase database to keep track of the number of views made by each user within the system.
This system uses the Firebase database to keep track of the number of views made by each user within the system. When the application starts, users are asked to enter their name, and the application then checks whether that name is already present in the database: if so, the system increments the view count associated with that user and returns the updated value; if instead it is the first time that name is entered, the system returns an initial value of 1. Stages: the following stages were implemented for the assignment: Build, Verify, Unit-test, Integration-test, Package, Release, Docs. Prerequisites: this section explains some prerequisites that run before the script with the stages listed above. The pipeline uses the latest Python Docker image as its base, declared as image: python:latest. The Python Docker image ensures that every pipeline stage uses a consistent environment, eliminating compatibility problems between development and production; Python Docker images are also generally quick to start, optimizing build and test times within the pipeline. A global variable named PIP_CACHE_DIR is defined, whose path is set to \"$CI_PROJECT_DIR/.cache/pip\". Using a cache in a pipeline plays a fundamental role in improving the efficiency, speed and consistency of the software development process.
This practice optimizes resource usage and guarantees a smoother workflow. In addition, a \"before_script\" step performs some actions needed for the later stages to succeed: pip --version followed by pip install --upgrade pip, which check and update the pip version; a virtual environment is then created and activated, with python -m venv venv and source venv/bin/activate, to isolate all Python operations within the project. The virtual environment makes it possible to install and manage project-specific dependencies without interfering with the global system. Some stages contain a rule stating that the stage, and therefore the pipeline, must run only on the main branch. This ensures the pipeline is not triggered, wasting usage minutes, while changes are made on branches other than the main one. 1. Build: The project build is performed with the command pip install -r requirements.txt. This choice simplifies the installation of the external libraries needed to run the application: the required libraries and their versions are listed in an external file named \"requirements.txt\". This centralized approach greatly simplifies dependency management, listing all the libraries the application needs in a clear and orderly way. Moreover, an external requirements file makes it possible to add or change libraries without touching the pipeline itself: to add a new library or update a version, it is enough to update \"requirements.txt\".
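The image, cache and before_script prerequisites described above could be expressed roughly as the following .gitlab-ci.yml fragment. This is a sketch assembled from the description, not the project's actual file; in particular, the cache paths entry and the job name are assumptions.

```yaml
# Sketch of the pipeline prerequisites described in the README (assumed, not the real file).
image: python:latest

variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"

# Assumed: cache the pip download directory between jobs.
cache:
  paths:
    - .cache/pip

before_script:
  - pip --version
  - pip install --upgrade pip
  - python -m venv venv
  - source venv/bin/activate

build:
  stage: build
  script:
    - pip install -r requirements.txt
  only:
    - main
```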
This increases development agility, since no change to the pipeline's build process is required. In short, using \"requirements.txt\" for dependency management promotes efficiency, agility, traceability and error reduction in the CI/CD process, offering a robust approach to managing the libraries the application needs. 2. Verify: The \"verify\" stage of the pipeline, as per the assignment specification, runs two commands that perform code-quality checks and identify possible security issues before development proceeds. Since the two commands are independent of each other, the stage script was written to run two jobs in parallel to improve performance. The two jobs run prospector and bandit. In particular: \"prospector\" performs static code analysis, looking for style problems, compliance with coding guidelines, and other code-quality metrics. In essence, it enforces development best practices, ensuring the code is high quality, error free and ready for release, significantly improving efficiency and quality. \"bandit\" runs two separate security analyses of the Python code, one for each part of the project: \"application\" (frontend) and \"database\" (database management). This means running the bandit command twice, once per part. With the -r option, bandit runs recursively, examining the whole content of the specified directories, including every Python file inside them. 3. 
Unit-test: A unit test verifies the correct behavior of a single unit of code, such as a method, a function or a class, independently of the rest of the system. Here, a file named test_unit.py was created containing one test function. This function checks the connection to the database, returning True if the connection is active. To run the unit test inside the pipeline, the command pytest tests/test_unit.py is used. It relies on the pytest testing library to run the specific test contained in test_unit.py. pytest is a Python testing framework that we used to write and run both the unit tests and the integration tests. The framework automatically discovers test files in the project; the test files follow a specific naming convention and must start with \"test_\", so that pytest identifies and runs them when invoked. The run reports whether the database connection works: if the test returns True, the connection is active, confirming the test's success and the validity of the database connection. 4. Integration-test: An integration test is needed to guarantee that an application's components do not cause problems when integrated together, ensuring the components can communicate with each other correctly. With this stage in the pipeline, the tests run on every change to the source code, ensuring software quality. Within the pipeline, the integration test is run with the command pytest tests/test_integration.py.
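As an illustration only, the kind of counter check these test files perform could be sketched with an in-memory fake standing in for the real Firebase wrapper. FakeFirebase and its method names are assumptions for this sketch, not the project's actual code.

```python
# Hypothetical sketch of the counter checks described, using an in-memory
# fake in place of the real Firebase client (all names assumed).
class FakeFirebase:
    def __init__(self):
        self.counters = {}

    def set_counter(self, user, value):
        # Directly set a user's view counter, as the integration test does.
        self.counters[user] = value

    def increment_counter(self, user):
        # A first visit starts the counter at 1, matching the app's behavior.
        self.counters[user] = self.counters.get(user, 0) + 1
        return True

    def get_counter(self, user):
        return self.counters.get(user, 0)


def test_increment_counter():
    firebase = FakeFirebase()
    firebase.set_counter("damiano", 5)
    assert firebase.increment_counter("damiano") is True
    assert firebase.get_counter("damiano") == 6
```

Run under pytest, a test of this shape passes when the increment logic behaves as the README describes (5 becomes 6 after one increment).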
This test performs two main checks. After initializing Firebase, it reads the value of the test user and checks that it equals a predefined value, namely 10; if so, the test passes. The second check, again after initializing the Firebase instance, sets the counter of the user \"damiano\" to 5 and then verifies that the increment works by calling firebase.increment_counter('damiano'), which returns true on success. The last operation of this test verifies that the counter was incremented correctly and therefore holds the value 6. 5. Package: During the Package stage, the source code is turned into packages, easing the distribution of applications and libraries. Packages are archives that include the source code and the files needed to install the software on various systems and environments. This process is fundamental to simplifying distribution and ensuring the software works across platforms. In the pipeline this is one of the most critical stages, and it performs several operations to prepare the code for distribution. To understand this stage better, we split these actions (for explanatory purposes) into groups. Package generation: python setup.py sdist bdist_wheel uses the setup.py file to create source packages and bdist_wheel packages. This configuration file defines the Python project's information, such as name, version, author, description and dependencies, and is used together with the setuptools framework. sdist is the source package, containing the source code and the other files needed for installation. bdist_wheel is a binary package format optimized for distribution on PyPI, which simplifies installation across platforms.
Publishing on PyPI (pypi.org/) makes the Python software available to the public, easing sharing and collaboration among developers. Preliminary operations: before the command above can run, the following commands are needed: git config user.email $GIT_EMAIL and git config user.name $GIT_NAME, which configure the Git user with the email address and name given in the $GIT_EMAIL and $GIT_NAME environment variables; git remote add gitlab_origin $GITLAB_REMOTE_URL, which adds a Git remote named \"gitlab_origin\" with the URL given in the $GITLAB_REMOTE_URL variable. Particularly important is the creation of an ACCESS_TOKEN with API and read/write permissions, to automate pushing changes to the repository. This access token acts as an authentication key, allowing secure communication with the repository. This practice not only simplifies updating the code but also keeps operations secure, since it restricts access to authorized actions only; python increment_version.py patch runs the increment_version script, which updates the project's version number in setup.py.
For this reason it is then necessary to run git add setup.py, to stage this file for the commit; git commit -m \"incremento versione\", to commit; and git push gitlab_origin HEAD:main -o ci.skip, which pushes the changes to the remote repository and uses the -o ci.skip option to prevent a CI/CD pipeline from being triggered by this push. Package archiving: running this stage produces packages (artifacts) that are stored in the \"dist/\" directory. Architectural choices: In this stage of the pipeline (and in the next one), we adopted a secure practice by using environment variables to hide sensitive information, such as the user's email address and name, needed to run the commands. This improves the overall security of the project by avoiding the direct inclusion of sensitive data in the source code. Moreover, environment variables ease the management of environment-specific configurations and make such information configurable without editing the source code directly. Problems encountered in this stage: One key aspect deserves attention. The setup.py file contains the application version, and every time the pipeline runs, the version must be updated before a second run is allowed. This step is fundamental because the goal is to publish the application on PyPI. For this stage to run correctly, it is therefore essential to check that no application with the same name already exists on PyPI and that the version is updated on each run.
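Such an automated bump could be sketched as follows. This is hypothetical: it assumes setup.py contains a version='X.Y.Z' string, and the function name and details are illustrative, not the project's actual increment_version.py.

```python
# Hypothetical sketch of a semantic-version bump over setup.py text
# (assumed behavior; not the project's actual increment_version.py).
import re


def bump_version(setup_text, part="patch"):
    """Return setup.py text with its version='X.Y.Z' string bumped."""
    pattern = r"(version\s*=\s*['\"])(\d+)\.(\d+)\.(\d+)(['\"])"

    def repl(m):
        major, minor, patch = int(m.group(2)), int(m.group(3)), int(m.group(4))
        if part == "major":
            major, minor, patch = major + 1, 0, 0
        elif part == "minor":
            minor, patch = minor + 1, 0
        else:  # "patch"
            patch += 1
        return f"{m.group(1)}{major}.{minor}.{patch}{m.group(5)}"

    new_text, count = re.subn(pattern, repl, setup_text, count=1)
    if count == 0:
        raise ValueError("no version string found in setup.py")
    return new_text
```

Invoked as python increment_version.py patch, a wrapper of this kind would read setup.py, apply the bump, and write the file back before the git add/commit/push sequence.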
To this end, we decided to create a script, increment_version.py, that automatically updates the project version. How increment_version.py works: In developing the increment_version.py script we made a number of architectural choices to guarantee the program's effectiveness and usability. The decision to create a dedicated script for incrementing the version inside setup.py was driven by the need to automate a common process simply and efficiently. This makes it easier for developers to manage their software's versions and reduces the risk of human error. A specific parameter, major, minor or patch, lets users customize the increment. This choice offers flexibility and control, adapting the version increment to the project's specific needs. For example, to introduce significant changes or new features we can use the major parameter to mark a \"major version\" of the software; the minor parameter is useful to signal smaller changes or additions, while patch is ideal for bug fixes and minor adjustments. The code makes careful use of regular expressions (regex) to locate and capture the current version inside setup.py. This approach yields greater precision in extracting the data, allowing the program to work reliably even in complex situations. 6. 
Release: This pipeline stage is closely related to the previous \"Package\" stage: where \"Package\" prepared the application's packages, this stage publishes them on PyPI. The steps performed by this stage are: echo \"[pypi]\" > ~/.pypirc, which creates the configuration file needed to interact with PyPI; this file holds the authentication information required to upload Python packages to PyPI. echo \"username = $TWINE_USERNAME\" >> ~/.pypirc specifies, inside .pypirc, the username for authenticating on PyPI. echo \"password = $TWINE_TOKEN\" >> ~/.pypirc inserts the API token value as the password in .pypirc. The token value is read from a global variable, $TWINE_TOKEN, defined in the GitLab settings; authentication therefore happens through an API token rather than a traditional username and password. twine upload dist/* uploads to PyPI the packages generated by the \"Package\" stage in the dist/ directory. It uses \"Twine\", a Python tool that eases uploading packages to package repositories such as PyPI. Architectural choices: To run this stage, a PyPI account had to be created. To make the pipeline more professional and keep authentication secure, we opted for a token-based configuration. PyPI lets each user generate a personal API token, removing the need to share account credentials. This improves security and simplifies publishing to the PyPI repository. 7. Docs: A software development project is not complete without adequate documentation.
Documentation gives developers, users and other team members all the information needed to understand, use and extend the software. This stage ensures that the documentation is always aligned with the source code and ready to be published. This pipeline stage is dedicated to generating the documentation and publishing it. Several actions are needed, which we split into three groups. Documentation generation: mkdocs build --clean is the main command used to generate the project documentation. mkdocs is a documentation generator that processes the Markdown files in the repository and produces a formatted version of the documentation ready for distribution. The --clean option makes sure the output folder is cleared of stale files, guaranteeing the new documentation is fresh and up to date. Preparing the generated files: mkdir .public creates a directory named \".public\", into which cp -r public/* .public then recursively (-r) copies the whole content of the \"public\" directory. This step prepares the generated documentation files. Artifact archiving: the artifacts section specifies which files or directories must be kept for later use. It keeps the mkdocs.yaml file, MkDocs' main configuration file, which holds the settings and information needed to generate the documentation.
It also keeps the \"public\" directory containing the freshly generated documentation. Once this stage has finished, the updated documentation is available at the following link: https://prova-bicoccaprojects-41fbc084054de9fcac016aa766e308b22287d4b35.gitlab.io/"} {"package": "2023-assignment-DevOps", "pacakge-description": "No description available on PyPI."} {"package": "2023-ASSIGNMENT-PYTHONPIPELINE-DSCF", "pacakge-description": "No description available on PyPI."} {"package": "2023-Whit3-H4t-Sc4nn3r-Fir5t", "pacakge-description": "No description available on PyPI."} {"package": "2024", "pacakge-description": "welcome to my package"} {"package": "2025", "pacakge-description": "welcome to my package"} {"package": "2026", "pacakge-description": "welcome to my package"} {"package": "2027", "pacakge-description": "welcome to my package"} {"package": "2028", "pacakge-description": "welcome to my package"} {"package": "2029", "pacakge-description": "welcome to my package"} {"package": "2030", "pacakge-description": "welcome to my package"} {"package": "2040", "pacakge-description": "welcome to my package"} {"package": "2048", "pacakge-description": "2048: My version of the 2048 game, with multi-instance support, restored from an old high school project. Installation: Run pip install 2048. Run 2048 to play the game. Enjoy. On Windows, you can run 2048w to run without the console window. Resetting the game: If for some reason the data files get corrupted or you want to clear the high score... On Windows, delete C:\Users\\AppData\Roaming\Quantum\2048. On macOS, delete /Users//Library/Application Support/2048. On Linux, delete /home//.local/share/2048."} {"package": "2048-cli", "pacakge-description": "2048 cli: A small clone of 2048.
In it, the player moves numbered blocks in a 4x4 grid, combining them to create a block with the number 2048. The challenge lies in the strategy needed to avoid running out of possible moves. The game ends when the grid is full and there are no more combinations available. In this project, I also try to develop an AI using the Monte Carlo method to try to achieve the best possible score. Installation: Install with pip or your favorite PyPI package manager: $ pipx install 2048-cli. Then run the following command in the terminal: $ 2048-cli. Modes: Single Player: in single-player mode, you command the moves with the aim of achieving the highest possible score. AI Playing: in AI mode, moves are generated using a Monte Carlo search algorithm."} {"package": "2048-py", "pacakge-description": "2048. What is this? It's 2048, written in Python by the Ladue High School Computer Science Club. It's still in its early stages, so expect bugs! Getting started: Download and run: download or clone this repo, then run python3 main.py. If Python complains about termcolor, install it with pip install termcolor. Run online: run it online on Repl.it. Install using pip: install with pip install 2048-py. Run 2048-py to play the game. Get it from the AUR: on Arch Linux, get the 2048-py package from the AUR. Run 2048-py to play the game."} {"package": "2048-Wallpaper-Edition", "pacakge-description": "2048 Wallpaper Edition: 2048 is a puzzle game that can be played right on your desktop background! Move the tiles around to create a tile with the value of 2048.
Keep playing to see how high you can go."} {"package": "2050", "pacakge-description": "welcome to my package"} {"package": "2060", "pacakge-description": "welcome to my package"} {"package": "2070", "pacakge-description": "welcome to my package"} {"package": "2080", "pacakge-description": "welcome to my package"} {"package": "2090", "pacakge-description": "welcome to my package"} {"package": "20CS30064MyPackage", "pacakge-description": "Software Lab: Python Datascience Assignment. In this assignment we will deal with Instance Segmentation and Detection. Instance segmentation is a very well studied task of Deep Learning, with a tremendous variety of applications. You have to create a python package for transforming images and analysing their effect on the predictions of an instance segmentor. We are providing you with a pretrained segmentor; all you need to do is call the segmentor on the image and get the outputs. A python package means that one can install the package in the python environment and can import its modules in any python script, irrespective of the location of the script. Creating a python package is fairly easy, just follow the steps here. The details of each of the files/folders are as follows: main.py: This is the main file which is to be called to execute the program. The main file calls the corresponding functions as needed during execution. The main file should call the appropriate function to prepare the dataset, then transform the images read, obtain the segmentation masks and bounding boxes of the objects present in the image by calling the segmentor model, and then plot the obtained images by calling the appropriate functions from the package described below. ./my_package/model.py: This file contains the instance segmentation model definition. Consider it a black-box model which takes an image (as a numpy array) as input and provides the segmentation masks, bounding boxes and corresponding class labels as outputs for the input image. Fig. 1. 
Sample Output of the Segmentor. ./my_package/data/dataset.py: This file contains the class Dataset that reads the provided dataset from the annotation file and provides the numpy version of the images which are to be transformed and forwarded through the model. The annotation format is provided in data/README.md. ./my_package/data/transforms: This folder contains 5 files. Each of these files is responsible for performing the corresponding transformation, as follows: a) crop.py: This file takes an image (as a numpy array) as input and crops it based on the provided arguments. Declare a class CropImage() for performing the operation. Fig. (a): Crop Operation. b) flip.py: This file takes an image (as a numpy array) as input and flips it based on the provided arguments. Declare a class FlipImage() for performing the operation. Fig. (b): Flip Operation. c) rotate.py: This file takes an image (as a numpy array) as input and rotates it based on the provided arguments. Declare a class RotateImage() for performing the operation. Fig. (c): Rotate Operation. d) rescale.py: This file takes an image (as a numpy array) as input and rescales it based on the provided arguments. Declare a class RescaleImage() for performing the operation. Fig. (d): Rescale Operation. e) blur.py: This file takes an image (as a numpy array) as input and applies a gaussian blur to it based on the provided arguments. Declare a class GaussBlurImage() for performing the operation. Fig. (e): Blur Operation. ./my_package/analysis/visualize.py: This file defines a function that draws the image with the predicted segmentation masks and the bounding boxes (with the corresponding labels) on the image and saves them in the specified output folder. setup.py: Use this file for constructing the package my_package. Coding Task [30 marks]. Note: For handling images, e.g. reading images, we would recommend using PIL instead of OpenCV, as OpenCV uses BGR format instead of RGB. Write the various transformations in ./my_package/data/transforms.
There are five files, as already mentioned. Although these functions are easily implementable using numpy only, you may use any image processing libraries like PIL, skimage or opencv. [2x5=10 marks] Complete the Dataset class in ./my_package/data/dataset.py. This class will accept the path to the annotation file and the list of transformation classes. Ideally you should be directly using transformation classes, but you may also use strings to identify the transformations. [5 marks] Write a function plot_visualization() in ./my_package/analysis/visualize.py that will draw the image with the predicted segmentation masks and bounding boxes (with the corresponding labels) on the images and save them in the output folder specified in the argument. Please note that you need to plot only the 3 most confident bounding boxes predicted by the segmentor. If the segmentor predicts fewer than 3 boxes, then plot all of them. [5 marks] Create a python package my_package. For this you need to write setup.py. It must be noted that files called __init__.py need to be added in the hierarchy. We leave it to you to search where they should be added. Note that the user will generally not know the exact files where the classes are written. That is, he/she does not know that there exists a file crop.py where the class CropImage() is defined. Rather, he/she simply knows that this class is defined in transforms. So, a good coding practice is to allow an import statement from my_package.data.transforms import CropImage. [5 marks] Write main.py where you will test the different transformations you have written on the instance segmentor. The outputs for each of the experiments should be organized properly in the outputs folder. [5 marks] Analysis Task [10 marks]: Obtain and save the predicted bounding boxes for all the images provided in the data/imgs folder. [3 marks] Consider the image with name same as the last digit of your roll number, i.e. 
if your roll number is 20CS####7 then consider the image 7.jpg, and plot the following using subplots in matplotlib and save them: [1x7=7 marks] a) The original image along with the top-3 predicted segmentation masks and bounding boxes. b) Horizontally flipped original image along with the top-3 predicted segmentation masks and bounding boxes. c) Blurred image (with some degree of blurring) along with the top-3 predicted segmentation masks and bounding boxes. d) Twice rescaled image (2X scaled) along with the top-3 predicted segmentation masks and bounding boxes. e) Half rescaled image (0.5X scaled) along with the top-3 predicted segmentation masks and bounding boxes. f) 90 degree right rotated image along with the top-3 predicted segmentation masks and bounding boxes. g) 45 degree left rotated image along with the top-3 predicted segmentation masks and bounding boxes. Please read the class definitions very carefully. In this assignment you do not need to code a lot, but you need to understand how to integrate several custom modules together in a clean way. More details on the arguments and the return types are provided in the corresponding files."} {"package": "20XX", "pacakge-description": "20XX: from melee_20XX import Melee_v0. 20XX is a PettingZoo-based library for Melee.
(⌐■_■) Code Example:

import os.path
import melee
from melee_20XX import Melee_v0
from melee_20XX.agents.basic import CPUFox, RandomFox

players = [RandomFox(), CPUFox()]
env = Melee_v0.env(players, os.path.expanduser('~/.melee/SSBM.ciso'), fast_forward=True)
max_episodes = 10

if __name__ == "__main__":
    env.start_emulator()
    for episode in range(max_episodes):
        observation, infos = env.reset(melee.enums.Stage.FOUNTAIN_OF_DREAMS)
        gamestate = infos["gamestate"]
        terminated = False
        while not terminated:
            actions = []
            for player in players:
                if player.agent_type == "CPU":
                    # CPU actions are handled internally
                    action = None
                else:
                    action = player.act(gamestate)
                actions.append(action)
            observation, reward, terminated, truncated, infos = env.step(actions=actions)
            gamestate = infos["gamestate"]

Note: This library requires Slippi, which in turn requires an SSBM 1.02 NTSC/PAL ISO. This library does not and will not distribute this. You must acquire this on your own! Installation: pip install 20XX; pip install git+https://github.com/WillDudley/libmelee.git (fixes some menu handling issues). Credits: Heavily relies on libmelee, uses PettingZoo, originally forked from melee-env."} {"package": "2100", "pacakge-description": "welcome to my package"} {"package": "2112", "pacakge-description": "UNKNOWN"} {"package": "21234191-cpp-pkg", "pacakge-description": "No description available on PyPI."} {"package": "2143134", "pacakge-description": "oss - dev test: a python module"} {"package": "21cmFAST", "pacakge-description": "A semi-numerical cosmological simulation code for the radio 21-cm signal. This is the official repository for 21cmFAST: a semi-numerical code that is able to produce 3D cosmological realisations of many physical fields in the early Universe. It is super-fast, combining the excursion set formalism with perturbation theory to efficiently generate density, velocity, halo, ionization, spin temperature, 21-cm, and even ionizing flux fields (see the above lightcones!). It has been tested extensively against numerical simulations, 
with excellent agreement at the relevant scales. 21cmFAST has been widely used, for example, by the Murchison Widefield Array (MWA), LOw-Frequency ARray (LOFAR) and Hydrogen Epoch of Reionization Array (HERA), to model the large-scale cosmological 21-cm signal. In particular, the speed of 21cmFAST is important to produce simulations that are large enough (several Gpc across) to represent modern low-frequency observations. As of v3.0.0, 21cmFAST is conveniently wrapped in Python to enable more dynamic code. New Features in 3.0.0+: Robust on-disk caching/writing both for efficiency and simplified reading of previously processed data (using HDF5). Convenient data objects which simplify access to and processing of the various density and ionization fields. De-coupled functions mean that arbitrary functionality can be injected into the process. Improved exception handling and debugging. Comprehensive documentation. Comprehensive test suite. Strict semantic versioning. Installation: We support Linux and MacOS (please let us know if you are successful in installing on Windows!). On these systems, the simplest way to get 21cmFAST is by using conda: conda install -c conda-forge 21cmFAST. 21cmFAST is also available on PyPI, so that pip install 21cmFAST also works. However, it depends on some external (non-python) libraries that may not be present, and so this method is discouraged unless absolutely necessary. 
If using pip to install 21cmFAST (especially on MacOS), we thoroughly recommend reading the detailed installation instructions. Basic Usage: 21cmFAST can be run both interactively and from the command line (CLI). Interactive: The most basic example of running a (very small) coeval simulation at a given redshift, and plotting an image of a slice through it:

>>> import py21cmfast as p21c
>>> coeval = p21c.run_coeval(
>>>     redshift=8.0,
>>>     user_params={'HII_DIM': 50, "USE_INTERPOLATION_TABLES": False}
>>> )
>>> p21c.plotting.coeval_sliceplot(coeval, kind='brightness_temp')

The coeval object here has much more than just the brightness_temp field in it. You can plot the density field, velocity field or a number of other fields. To simulate a full lightcone:

>>> lc = p21c.run_lightcone(
>>>     redshift=8.0,
>>>     max_redshift=15.0,
>>>     init_box=coeval.init_struct,
>>> )
>>> p21c.plotting.lightcone_sliceplot(lc)

Here, we used the already-computed initial density field from coeval, which sets the size and parameters of the run, but also means we don't have to compute that (relatively expensive) step again. Explore the full range of functionality in the API Docs, or read more in-depth tutorials for further guidance. CLI: The CLI can be used to generate boxes on-disk directly from a configuration file or command-line parameters. You can run specific steps of the simulation independently, or an entire simulation at once. 
For example, to run just the initial density field, you can do:
$ 21cmfast init --HII_DIM=100
The (quite small) simulation box produced is automatically saved into the cache (by default, at ~/21cmFAST-cache). You can list all the files in your cache (and the parameters used in each of the simulations) with:
$ 21cmfast query
To run an entire coeval cube, use the following as an example:
$ 21cmfast coeval 8.0 --out=output/coeval.h5 --HII_DIM=100
In this case all the intermediate steps are cached in the standard cache directory, and the final Coeval box is saved to output/coeval.h5. If no --out is specified, the coeval box itself is not written, but don’t worry – all of its parts are cached, and so it can be rebuilt extremely quickly. Every input parameter to any of the input classes (there are a lot of parameters) can be specified at the end of the call with prefixes of -- (like HII_DIM here). Alternatively, you can point to a config YAML file, e.g.:
$ 21cmfast lightcone 8.0 --max-z=15.0 --out=. --config=~/.21cmfast/runconfig_example.yml
There is an example configuration file here that you can build from. All input parameters are documented here.
Documentation
Full documentation (with examples, installation instructions and full API reference) can be found at https://21cmfast.readthedocs.org.
Acknowledging
If you use 21cmFAST v3+ in your research please cite both of:
- Murray et al., (2020). 21cmFAST v3: A Python-integrated C code for generating 3D realizations of the cosmic 21cm signal. Journal of Open Source Software, 5(54), 2582, https://doi.org/10.21105/joss.02582
- Andrei Mesinger, Steven Furlanetto and Renyue Cen, “21CMFAST: a fast, seminumerical simulation of the high-redshift 21-cm signal”, Monthly Notices of the Royal Astronomical Society, Volume 411, Issue 2, pp. 955-972 (2011), https://ui.adsabs.harvard.edu/link_gateway/2011MNRAS.411..955M/doi:10.1111/j.1365-2966.2010.17731.x
In addition, the following papers introduce various features into 21cmFAST.
If you use these features, please cite the relevant papers.
Mini-halos:
- Muñoz, J.B., Qin, Y., Mesinger, A., Murray, S., Greig, B., and Mason, C., “The Impact of the First Galaxies on Cosmic Dawn and Reionization”, https://arxiv.org/abs/2110.13919 (for DM-baryon relative velocities)
- Qin, Y., Mesinger, A., Park, J., Greig, B., and Muñoz, J. B., “A tale of two sites - I. Inferring the properties of minihalo-hosted galaxies from current observations”, Monthly Notices of the Royal Astronomical Society, vol. 495, no. 1, pp. 123–140, 2020. https://doi.org/10.1093/mnras/staa1131 (for Lyman-Werner and first implementation)
Mass-dependent ionizing efficiency:
- Park, J., Mesinger, A., Greig, B., and Gillet, N., “Inferring the astrophysics of reionization and cosmic dawn from galaxy luminosity functions and the 21-cm signal”, Monthly Notices of the Royal Astronomical Society, vol. 484, no. 1, pp. 933–949, 2019. https://doi.org/10.1093/mnras/stz032
Changelog
dev-version
v3.3.1 [24 May 2023]
Fixed:
- Compilation of C code for some compilers (#330)
v3.3.0 [17 May 2023]
Internals:
- Refactored setting up of inputs to high-level functions so that there is less code repetition.
Fixed:
- Running with R_BUBBLE_MAX too large auto-fixes it to be BOX_LEN (#112)
- Bug in calling clear_cache.
- Inconsistency in the way that the very highest redshift of an evolution is handled between low-level code (e.g. spin_temperature()) and high-level code (e.g. run_coeval()).
Added:
- New validate_all_inputs function that cross-references the four main input structs and ensures all the parameters make sense together.
Mostly for internal use.
- Ability to save/read directly from an open HDF5 File (#170)
- An implementation of cloud-in-cell to more accurately redistribute the perturbed mass across all neighbouring cells, instead of the previous nearest-cell approach
- Changed PhotonConsEndCalibz from z = 5 -> z = 3.5 to handle later reionisation scenarios in line with current observations (#305)
- Added an initialisation check for the photon conservation to address some issues arising for early EOR histories (#311)
- Added NON_CUBIC_FACTOR to UserParams to allow for non-cubic coeval boxes (#289)
v3.2.1 [13 Sep 2022]
Changed:
- Included log10_mturnovers(_mini) in the lightcone class. Only useful when USE_MINI_HALOS
v3.2.0 [11 Jul 2022]
Changed:
- Floats are now represented to a specific number of significant digits in the hash of an output object. This fixes problems with very close redshifts not being read from cache (#80). Note that this means that very close astro/cosmo params will now be read from cache. This could cause issues when creating large databases with many random parameters. The behaviour can be modified in the configuration by setting the cache_param_sigfigs and cache_redshift_sigfigs parameters (these are 6 and 4 by default, respectively). NOTE: updating to this version will cause your previous cached files to become unusable. Remove them before updating.
Fixed:
- Added a missing C-based error to the known errors in Python.
v3.1.5 [27 Apr 2022]
v3.1.4 [10 Feb 2022]
Fixed:
- Error in FFT normalization in FindHaloes
- Docs not compiling on RTD due to missing scipy.integrate mock module
- Updated matplotlib removed support for setting vmin/vmax and norm. Now passes vmin/vmax to the norm() constructor.
v3.1.3 [27 Oct 2021]
Fixed:
- FAST_FCOLL_TABLES so it only affects MCGs and not ACGs. Added tests of this flag for high and low z separately.
v3.1.2 [14 Jul 2021]
Internals:
- MINIMIZE_MEMORY flag significantly reduces memory without affecting performance much, by changing the way some arrays are allocated and accessed in C.
(#224)
Changed:
- Updated USE_INTERPOLATION_TABLES to default to True. This makes much more sense as a default value. Until v4, a warning will be raised if it is not set explicitly.
v3.1.1 [13 Jun 2021]
Fixed:
- Bug in deployment to PyPI.
v3.1.0 [13 Jun 2021]
Added:
- Ability to access all evolutionary Coeval components, either from the end Coeval class, or the Lightcone.
- Ability to gather all evolutionary antecedents from a Coeval/Lightcone into the one file.
- FAST_FCOLL_TABLES in UserParams, which improves speeds quite significantly for a ~<10% accuracy decrease.
- Fast and low-memory generation of relative-velocity (vcb) initial conditions. Eliminated hi-res vcb boxes, as they are never needed.
- Also output the mean free path (i.e. MFP_box in IonizedBox).
- Added the effect of DM-baryon relative velocities on PopIII-forming minihaloes. This now provides the correct background evolution jointly with LW feedback. It gives rise to velocity-induced acoustic oscillations (VAOs) from the relative-velocity fluctuations. We also follow a more flexible parametrization for LW feedback in minihaloes, following new simulation results, and add a new index ALPHA_STAR_MINI for minihaloes, now independent of regular ACGs.
- New hooks keyword to high-level functions, run on the completion of each computational step, which can be used to more generically write parts of the data to file.
- Ability to pass a function to write= to write more specific aspects of the data (internally, this will be put into the hooks dictionary).
- run_lightcone and run_coeval use significantly less memory by offloading initial conditions and perturb_field instances to disk if possible.
Fixed:
- Bug in 2LPT when USE_RELATIVE_VELOCITIES=True [Issue #191, PR #192]
- Error raised when redshifts are not in ascending order [Issue #176, PR #177]
- Errors when USE_FFTW_WISDOM is used on some systems [Issue #174, PR #199]
- Bug in ComputeIonizedBox causing negative recombination rate and ring structure in Gamma12_box [Issue #194, PR #210]
- Error in determining the wisdom
file name [Issue #209, PR #210]
- Bug in which cached C-based memory would be read in and free’d twice.
Internals:
- Added dft.c, which makes doing all the cubic FFTs a lot easier and more consistent. [PR #199]
- More generic way of keeping track of arrays to be passed between C and Python, and their shape in Python, using _get_box_structures. This also means that the various boxes can be queried before they are initialized and computed.
- More stringent integration tests that test each array, not just the final brightness temperature.
- Ability to plot the integration test data to more easily identify where things have gone wrong (use --plots in the pytest invocation).
- Nicer CLI interface for produce_integration_test_data.py. New options to clean the test_data/ directory; also, test data is saved by user-defined key rather than a massive string of variables.
- Nicer debug statements before calls to C, for easily comparing between versions.
- Much nicer methods of keeping track of array state (in memory, on disk, c-controlled, etc.)
- Ability to free C-based pointers in a more granular way.
v3.0.3
Added:
- coeval_callback and coeval_callback_redshifts flags to run_lightcone. Gives the ability to run arbitrary code on Coeval boxes.
- JOSS paper!
- get_fields classmethod on all output classes, so that one can easily figure out what fields are computed (and available) for that class.
Fixed:
- Only raise an error on a non-available external_table_path when actually going to use it.
v3.0.2
Fixed:
- Added prototype functions to enable compilation for some standard compilers on MacOS.
v3.0.1
Modifications to the internal code structure of 21cmFAST.
Added:
- Refactor FFTW wisdom creation to be a python-callable function
v3.0.0
Complete overhaul of 21cmFAST, including a robust python-wrapper and interface, caching mechanisms, and public repository with continuous integration.
Changes and equations for minihalo features in this version are found in https://arxiv.org/abs/2003.04442. All functionality of the original 21cmFAST v2 C-code has been implemented in this version, including USE_HALO_FIELD and performing full integration instead of using the interpolation tables (which are faster).
Added:
- Updated the radiation source model: (i) all radiation fields, including X-rays, UV ionizing, Lyman-Werner and Lyman-alpha, are computed from two separate populations, namely atomic-cooling galaxies (ACGs) and minihalo-hosted molecular-cooling galaxies (MCGs); (ii) the turn-over masses of ACGs and MCGs are estimated with cooling efficiency and feedback from reionization and Lyman-Werner suppression (Qin et al. 2020). This can be switched on using the new flag_options parameter USE_MINI_HALOS.
- Updated the kinetic temperature of the IGM, with fully ionized cells following equation 6 of McQuinn (2015) and partially ionized cells having the volume-weighted temperature between the ionized (volume: 1-xHI; temperature: T_RE) and neutral components (volume: xHI; temperature: temperature of HI). This is stored in IonizedBox as temp_kinetic_all_gas. Note that Tk in TsBox remains the kinetic temperature of HI.
- Tests: many unit tests, and also some regression tests.
- CLI: run 21cmFAST boxes from the command line, query the cache database, and produce plots for standard comparison runs.
- Documentation: Jupyter notebook demos and tutorials, FAQs, installation instructions.
- Plotting routines: a number of general plotting routines designed to plot coeval and lightcone slices.
- New power spectrum option (POWER_SPECTRUM=5) that uses a CLASS-based transfer function.
WARNING: If POWER_SPECTRUM==5 the cosmo parameters cannot be altered; they are set to the Planck2018 best-fit values for now (until CLASS is added): omegab=0.02237, omegac=0.120, hubble=0.6736 (the rest are irrelevant for the transfer functions, but in case: A_s=2.100e-9, n_s=0.9649, z_reio=11.357).
- New user_params option USE_RELATIVE_VELOCITIES, which produces initial relative velocity cubes (option implemented, but not the actual computation yet).
- Configuration management.
- global params now has a context manager for changing parameters temporarily.
- Vastly improved error handling: exceptions can be caught in C code and propagated to Python to inform the user of what’s going wrong.
- Ability to write high-level data (Coeval and Lightcone objects) directly to file in a simple portable format.
Changed:
- POWER_SPECTRUM option moved from global_params to user_params.
- Default cosmology updated to Planck18.
v2.0.0
All changes and equations for this version are found in https://arxiv.org/abs/1809.08995.
Changed:
- Updated the ionizing source model: (i) the star formation rates and ionizing escape fraction are scaled with the masses of dark matter halos, and (ii) the abundance of active star-forming galaxies is exponentially suppressed below the turn-over halo mass, M_{turn}, according to a duty cycle of exp(−M_{turn}/M_{h}), where M_{h} is a halo mass.
- Removed the mean free path parameter, R_{mfp}. Instead, directly computes inhomogeneous, sub-grid recombinations in the intergalactic medium following the approach of Sobacchi & Mesinger (2014).
v1.2.0
Added:
- Support for a halo-mass-dependent ionizing efficiency: zeta = zeta_0 (M/Mmin)^alpha, where zeta_0 corresponds to HII_EFF_FACTOR, Mmin -> ION_M_MIN, alpha -> EFF_FACTOR_PL_INDEX in ANAL_PARAMS.H
v1.12.0
Added:
- Code ‘redshift_interpolate_boxes.c’ to interpolate between comoving cubes, creating comoving light cone boxes.
- Enabled openMP threading for SMP machines.
You can specify the number of threads (for best performance, do not exceed the number of processors) in INIT_PARAMS.H. You do not need to have an SMP machine to run the code. NOTE: YOU SHOULD RE-INSTALL FFTW to use openMP (see INSTALL file).
- Included a threaded driver file ‘drive_zscroll_reion_param.c’ set up to perform astrophysical parameter studies of reionization.
- Included explicit support for WDM cosmologies; see COSMOLOGY.H. The prescription is similar to that discussed in Barkana+2001; Mesinger+2005, modifying (i) the transfer function (according to the Bode+2001 formula) and (ii) including the effective pressure term of WDM using a Jeans mass analogy. (ii) is approximated with a sharp cutoff in the EPS barrier, using 60*M_J found in Barkana+2001 (the 60 is an adjustment factor found by fitting to the WDM collapsed fraction).
- A Gaussian filtering step of the PT fields added to perturb_field.c, in addition to the implicit boxcar smoothing. This avoids having “empty” density cells, i.e. delta=-1, with some small loss in resolution. Although for most uses delta=-1 is ok, some Lya forest statistics do not like it.
- Added treatment of the residual electron fraction from X-ray heating when computing the ionization field. Relatedly, modified Ts.c to output all intermediate evolution boxes, Tk and x_e.
- Added a missing factor of Omega_b in Ts.c corresponding to eq. 18 in MFC11.
Users who used a previous version should note that their results just effectively correspond to a higher effective X-ray efficiency, scaled by 1/Omega_baryon.
- Normalization optimization to Ts.c, increasing performance on large resolution boxes.
Fixed:
- GSL interpolation error in kappa_elec_pH for GSL versions > 1.15.
- Typo in macro definition, which impacted the Lya background calculation in v1.11 (not applicable to earlier releases).
- Outdated filename syntax when calling gen_size_distr in drive_xHIscroll.
- Redshift scrolling so that drive_logZscroll_Ts.c and Ts.c are in sync.
Changed:
- Output format to avoid FFT padding for all boxes.
- Filename conventions to be more explicit.
- Small changes to organization and structure.
v1.1.0
Added:
- Wrapper functions mod_fwrite() and mod_fread() in Cosmo_c_progs/misc.c, which should fix problems with the library fwrite() and fread() for large files (>4GB) on certain operating systems.
- Included print_power_spectrum_ICs.c program which reads in high resolution initial conditions and prints out an ASCII file with the associated power spectrum.
- Parameter in Ts.c for the maximum allowed kinetic temperature, which increases stability of the code when the redshift step size and the X-ray efficiencies are large.
Fixed:
- Oversight adding support for a Gaussian filter for the lower resolution field."} {"package": "21CMMC", "pacakge-description": "An extensible MCMC framework for 21cmFAST. This code uses semantic versioning, though this will strictly begin when v1.0.0 is officially shipped.
Free software: MIT license
Features:
- Seamless integration with emcee-based MCMC.
- MCMC is easily extensible via the addition of different likelihoods using the same underlying data.
Documentation
See https://21CMMC.readthedocs.org.
Acknowledging
If you find 21CMMC useful in your research please cite at least one of the following (whichever is most suitable to you):
- Bradley Greig and Andrei Mesinger, “21CMMC: an MCMC analysis tool enabling astrophysical parameter studies of the cosmic 21 cm
signal”, Monthly Notices of the Royal Astronomical Society, Volume 449, Issue 4, p.4246-4263 (2015), https://doi.org/10.1093/mnras/stv571
- Bradley Greig and Andrei Mesinger, “Simultaneously constraining the astrophysics of reionization and the epoch of heating with 21CMMC”, Monthly Notices of the Royal Astronomical Society, Volume 472, Issue 3, p.2651-2669 (2017), https://doi.org/10.1093/mnras/stx2118
- Bradley Greig and Andrei Mesinger, “21CMMC with a 3D light-cone: the impact of the co-evolution approximation on the astrophysics of reionization and cosmic dawn”, Monthly Notices of the Royal Astronomical Society, Volume 477, Issue 3, p.3217-3229 (2018), https://doi.org/10.1093/mnras/sty796
- Jaehong Park et al., “Inferring the astrophysics of reionization and cosmic dawn from galaxy luminosity functions and the 21-cm signal”, Monthly Notices of the Royal Astronomical Society, Volume 484, Issue 1, p.933-949 (2018), https://doi.org/10.1093/mnras/stz032
Changelog
v1.0.0dev
- More fleshed-out interface to cosmoHammer, with base classes abstracting some common patterns.
- New likelihoods and cores that are able to work on any data from the 21cmFAST pipeline.
- Better logging
- Better exception handling
- pip-installable
- Documentation
- Pipenv support
- Full code formatting applied"} {"package": "21cmSense", "pacakge-description": "A python package for calculating the expected sensitivities of 21cm experiments to the Epoch of Reionization and/or Cosmic Dawn power spectrum.
Installation
For Users: Clone/download the package and run pip install [-e] . in the top-level. If you are a conda user (which we recommend), you may want to install the following using conda rather than them being automatically installed with pip:
$ conda install numpy scipy pyyaml astropy
For Development: Clone/download the package and run pip install [-e] .[dev] in the top-level. Run pre-commit install; pre-commit install --hook-type=commit-msg to install the pre-commit hook checks. We recommend using
the commitizen tool to write commit messages – we use the commit messages to do our versioning!
Usage
There are two ways to use this code: as a python library or via the CLI. More documentation on using the library can be found in the docs, especially in the getting started tutorial. A more involved introduction to the CLI can be found in the CLI tutorial. As a taste, the simplest possible usage is by using the CLI as follows:
$ sense calc-sense
Other options to the calc-sense program can be read by using:
$ sense calc-sense --help
An example config file is in this repository under example_configs/sensitivity_hera.yml, which details the various parameters that can be set. In all, three configuration files are required – one defining an observatory, another defining an observation, and the sensitivity one already mentioned.
The CLI can also be used in a two-step process, by running:
$ sense grid-baselines
and then:
$ sense calc-sense --array-file=<ARRAY_FILE>
where the ARRAY_FILE is produced in the first step (and its location is printed during the execution).
Running Tests
An example of how to run tests is in the Github Workflow testsuite.yaml. In short, just run pytest in the top-level directory after installing the package.
Acknowledgment
For details of the observing strategy assumed by this code, and other relevant scientific information, please see Pober et al. 2013 (AJ, 145, 65) and Pober et al. 2014 (ApJ, 782, 66). If you use this code in any of your work, please acknowledge these papers, and provide a link to this repository."} {"package": "223", "pacakge-description": "No description available on PyPI."} {"package": "2233", "pacakge-description": "No description available on PyPI."} {"package": "2233223", "pacakge-description": "No description available on PyPI."} {"package": "2-2-4-1-3-quick-maths", "pacakge-description": "quick calculator
Change Log
0.0.1 (19/04/2020)
- First Release
L4b:Nxmkp.:4jpv"} {"package": "2267570-hw4", "pacakge-description": "No description available on PyPI."} {"package": "22BEE0039", "pacakge-description": "# Hello world
This is an example project created by Sameer Bisht, VIT Vellore (22BEE0039). This contains two modules: one module contains functions, and the other uses the functions from the first.
Installation
Run the following to install:
'''python
pip install sameer22BEE0039
'''"} {"package": "22BEE0040", "pacakge-description": "Python Project
This is a simple package which does addition, subtraction, multiplication and division of two whole numbers. Made by Aahir Basu. Currently studying B.Tech EEE at VIT, Vellore. Registration Number - 22BEE0040"} {"package": "22BEE0060", "pacakge-description": "Example Package
This is a simple example package. You can use Github-flavored Markdown to write your content."} {"package": "233", "pacakge-description": "No description available on PyPI."} {"package": "2333", "pacakge-description": "No description available on PyPI."} {"package": "2345435342313131", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "23andme-to-vcf", "pacakge-description": "# 23andme to VCF
A simple command-line tool to convert 23andMe raw data files to VCF format.
# Install
`pip install 23andme-to-vcf`
# Usage
`23andme-to-vcf --input in.txt --fasta GRCh37.fa --fai GRCh37.fa.fai --output out.vcf`"} {"package": "2434", "pacakge-description": "CyberSecurityLecture: an app for a cybersecurity course that analyses the connections (collaborations) among the VTubers of Nijisanji."} {"package": "24p", "pacakge-description": "24p
The 24 game over [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] * 4 (four copies of each value): there are 91390 combinations in total, of which 79936 are solvable (87.47%); after removing duplicates there are 715 distinct combinations, of which 566 are solvable (79.16%).
b/a
[1,2,7,7] 24=(7*7-1)/2
[3,8,8,10] 24=(8*10-8)/3
[4,4,10,10] 24=(10*10-4)/4
ab/c = (ab)/(xy)
[6,9,9,10] 24=9*10/6+9
8/(1/3)
[1,3,4,6] 24=6/(1-3/4)
[1,4,5,6] 24=6/(5/4-1)
[1,6,6,8] 24=6/(1-6/8)
[3,3,8,8] 24=8/(3-8/3)
(b+c/a)a = ab+c
[1,5,5,5] 24=(5-1/5)*5
[2,2,7,10] 24=(10/2+7)*2
[2,4,10,10] 24=(4/10+2)*10
[2,7,7,10] 24=(10/7+2)*7
[3,3,7,7] 24=(3/7+3)*7
[4,4,7,7] 24=(4-4/7)*7
(b+c/a)d = (b+c/a)2a = 2(ab+c)
[2,5,5,10] 24=(5-2/10)*5
ab+ac = a(b+c)
[1,5,6,6] 24=5*6-1*6
[2,3,3,10] 24=3*10-2*3
[2,5,8,8] 24=5*8-2*8
[5,6,6,9] 24=6*9-5*6
[6,6,6,10] 24=6*10-6*6
[6,8,8,9] 24=8*9-6*8
[7,8,8,10] 24=8*10-7*8
2*12=24
3*8=24
4*6=24
12+12=24
14+10=24
15+9=24
16+8=24
17+7=24
18+6=24
20+4=24
21+3=24
25-1=24
28-4=24
30-6=24
32-8=24
35-11=24
36-12=24
40-16=24
7+8+9=24
8+8+8=24
2*8+8=24
4*8-8=24"} {"package": "24to25", "pacakge-description": "UNKNOWN"} {"package": "256-encrypt", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2.5D", "pacakge-description": "With this module you can easily create 2.5D-style (pseudo-3D) games."} {"package": "270Boi", "pacakge-description": "Failed to fetch description.
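The solvability figures quoted for 24p above can be checked with a short brute-force search. The sketch below is independent of the 24p package itself (whose internals are not shown here): it uses exact Fraction arithmetic and repeatedly combines two values at a time, which covers every possible bracketing.

```python
from fractions import Fraction

def solvable(nums, target=24):
    """Return True if the given numbers can reach `target` using + - * /."""
    target = Fraction(target)

    def search(xs):
        if len(xs) == 1:
            return xs[0] == target
        for i in range(len(xs)):
            for j in range(len(xs)):
                if i == j:
                    continue
                rest = [xs[k] for k in range(len(xs)) if k not in (i, j)]
                a, b = xs[i], xs[j]
                candidates = [a + b, a - b, a * b]
                if b != 0:
                    candidates.append(a / b)
                # replace the pair with each possible result and recurse
                if any(search(rest + [r]) for r in candidates):
                    return True
        return False

    return search([Fraction(n) for n in nums])

# Hands from the list above, including the classic [3, 3, 8, 8] = 8/(3 - 8/3):
print(solvable([1, 5, 5, 5]))  # True: (5 - 1/5) * 5
print(solvable([3, 3, 8, 8]))  # True: 8 / (3 - 8/3)
print(solvable([1, 1, 1, 1]))  # False: 24 is out of reach
```

Iterating this over itertools.combinations_with_replacement(range(1, 11), 4) enumerates the 715 distinct hands mentioned above; per the package's own figures, 566 of them are solvable.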
HTTP Status Code: 404"} {"package": "29082022-distributions", "pacakge-description": "No description available on PyPI."} {"package": "2969nester", "pacakge-description": "UNKNOWN"} {"package": "2b2t", "pacakge-description": "bbtt: A 2b2t toolbox
This is a simple commandline utility which polls 2b2t's server status and prints it.
Usage
1. Install Python (>=3.7).
2. Execute pip install 2b2t.
3. Run 2b2t."} {"package": "2b2t-api", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2b2t.py", "pacakge-description": "2b2t.py
This package handles all things 2b2t!
Installation
You can install the package with pip:
pip install 2b2t
or
pip3 install 2b2t
Or you can install from GitHub:
1. Clone the repo
2. Run setup.py
3. Install Python3 if you don't already have it
4. Install requests and colorama if you don't already have them
git clone https://github.com/BGP0/2b2t.py.git
python3 setup.py build install
pip3 install requests
pip3 install colorama
Usage
Docs
Credits
Original software made by BGP
Thanks to SkilzMastr for turning the checker into a package!"} {"package": "2b4cad69c8f0478caa92a512e57b735aba978d0e", "pacakge-description": "No description available on PyPI."} {"package": "2captcha-python", "pacakge-description": "Python Module for 2Captcha API
The easiest way to quickly integrate the 2Captcha captcha solving service into your code to automate solving of any type of captcha.
Contents: Installation; Configuration; TwoCaptcha instance options; Solve captcha; Captcha options; Normal Captcha; Audio Captcha; Text Captcha; ReCaptcha v2; ReCaptcha v3; FunCaptcha; GeeTest; hCaptcha; GeeTest v4; Lemin Cropped Captcha; Cloudflare Turnstile; Amazon WAF; KeyCaptcha; Capy; Grid; Canvas; ClickCaptcha; Rotate; Other methods (send / getResult, balance, report); Error handling; Proxies; Async calls
Installation
This package can be installed with Pip:
pip3 install 2captcha-python
Configuration
TwoCaptcha instance can be created like this:
from twocaptcha import TwoCaptcha
solver = TwoCaptcha('YOUR_API_KEY')
Also there are a few options that can be
configured:
config = {
    'server': '2captcha.com',
    'apiKey': 'YOUR_API_KEY',
    'softId': 123,
    'callback': 'https://your.site/result-receiver',
    'defaultTimeout': 120,
    'recaptchaTimeout': 600,
    'pollingInterval': 10,
}
solver = TwoCaptcha(**config)
TwoCaptcha instance options:
- server (default: 2captcha.com): API server. You can set it to rucaptcha.com if your account is registered there.
- softId: your software ID obtained after publishing in the 2captcha software catalog.
- callback: URL of your web-server that receives the captcha recognition result. The URL should first be registered in the pingback settings of your account.
- defaultTimeout (default: 120): polling timeout in seconds for all captcha types except ReCaptcha. Defines how long the module tries to get the answer from the res.php API endpoint.
- recaptchaTimeout (default: 600): polling timeout for ReCaptcha in seconds. Defines how long the module tries to get the answer from the res.php API endpoint.
- pollingInterval (default: 10): interval in seconds between requests to the res.php API endpoint; setting values less than 5 seconds is not recommended.
IMPORTANT: once callback is defined for a TwoCaptcha instance, all methods return only the captcha ID and DO NOT poll the API to get the result. The result will be sent to the callback URL. To get the answer manually use the getResult method.
Solve captcha
When you submit any image-based captcha you can provide additional options to help 2captcha workers to solve it properly.
Captcha options:
- numeric (default: 0): defines if the captcha contains numeric or other symbols; see more info in the API docs.
- minLen (default: 0): minimal answer length.
- maxLen (default: 0): maximum answer length.
- phrase (default: 0): defines if the answer contains multiple words or not.
- caseSensitive (default: 0): defines if the answer is case sensitive.
- calc (default: 0): defines if the captcha requires calculation.
- lang: defines the captcha language; see the list of supported languages.
- hintImg: an image with a hint shown to workers with the captcha.
- hintText: hint or task text shown to workers with the captcha.
Below you can find basic examples for every captcha type.
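As a hedged sketch of how the options above combine (the option names come from the table; the actual solver.normal call is commented out because it needs a valid API key and a live network connection), a five-digit, case-sensitive captcha could be submitted like this:

```python
# Option set for a hypothetical 5-character, digits-only, case-sensitive captcha,
# built from the parameter names in the table above.
normal_options = {
    'numeric': 1,        # the answer contains digits
    'minLen': 5,         # minimal answer length
    'maxLen': 5,         # maximum answer length
    'caseSensitive': 1,  # the answer is case sensitive
    'lang': 'en',        # captcha language
}

# With a configured TwoCaptcha instance this would be passed straight through:
# result = solver.normal('path/to/captcha.jpg', **normal_options)
print(sorted(normal_options))
```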
Check out the examples directory to find more examples with all available options.
Normal Captcha
To bypass a normal captcha (distorted text on an image) use the following method. This method can also be used to recognize any text on an image.
result = solver.normal('path/to/captcha.jpg', param1=..., ...)
# OR
result = solver.normal('https://site-with-captcha.com/path/to/captcha.jpg', param1=..., ...)
Audio Captcha
To bypass an audio captcha (mp3 format only) use the following method. You must provide the language as lang = 'en'. Supported languages are "en", "ru", "de", "el", "pt".
result = solver.audio('path/to/captcha.mp3', lang='lang', param1=..., ...)
# OR
result = solver.audio('https://site-with-captcha.com/path/to/captcha.mp3', lang='lang', param1=..., ...)
Text Captcha
This method can be used to bypass a captcha that requires answering a question provided in clear text.
result = solver.text('If tomorrow is Saturday, what day is today?', param1=..., ...)
ReCaptcha v2
Use this method to solve ReCaptcha V2 and obtain a token to bypass the protection.
result = solver.recaptcha(
    sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',
    url='https://mysite.com/page/with/recaptcha',
    param1=..., ...)
ReCaptcha v3
This method solves ReCaptcha V3 and returns a token.
result = solver.recaptcha(
    sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',
    url='https://mysite.com/page/with/recaptcha',
    version='v3',
    param1=..., ...)
FunCaptcha
FunCaptcha (Arkoselabs) solving method. Returns a token.
result = solver.funcaptcha(
    sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',
    url='https://mysite.com/page/with/funcaptcha',
    param1=..., ...)
GeeTest
Method to solve GeeTest puzzle captcha. Returns a set of tokens as JSON.
result = solver.geetest(
    gt='f1ab2cdefa3456789012345b6c78d90e',
    challenge='12345678abc90123d45678ef90123a456b',
    url='https://www.site.com/page/',
    param1=..., ...)
hCaptcha
Use this method to solve the hCaptcha challenge.
Returns a token to bypass the captcha.
result = solver.hcaptcha(
    sitekey='10000000-ffff-ffff-ffff-000000000001',
    url='https://www.site.com/page/',
    param1=..., ...)
GeeTest v4
Use this method to solve GeeTest v4. Returns the response in JSON.
result = solver.geetest_v4(
    captcha_id='e392e1d7fd421dc63325744d5a2b9c73',
    url='https://www.site.com/page/',
    param1=..., ...)
Lemin Cropped Captcha
Use this method to solve the Lemin challenge. Returns JSON with an answer containing the following values: answer, challenge_id.
result = solver.lemin(
    captcha_id='CROPPED_1abcd2f_a1234b567c890d12ef3a456bc78d901d',
    div_id='lemin-cropped-captcha',
    url='https://www.site.com/page/',
    param1=..., ...)
Cloudflare Turnstile
Use this method to solve Cloudflare Turnstile. Returns JSON with the token.
result = solver.turnstile(
    sitekey='0x1AAAAAAAAkg0s2VIOD34y5',
    url='http://mysite.com/',
    data='foo',
    pagedata='bar',
    action='challenge',
    useragent='Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36')
Amazon WAF
Use this method to solve Amazon WAF Captcha (also known as AWS WAF Captcha), a part of Intelligent threat mitigation for Amazon AWS.
Returns JSON with the token.
result = solver.amazon_waf(
    sitekey='0x1AAAAAAAAkg0s2VIOD34y5',
    iv='CgAHbCe2GgAAAAAj',
    context='9BUgmlm48F92WUoqv97a49ZuEJJ50TCk9MVr3C7WMtQ0X6flVbufM4n8mjFLmbLVAPgaQ1Jydeaja94iAS49ljb+sUNLoukWedAQZKrlY4RdbOOzvcFqmD/ZepQFS9N5w15Exr4VwnVq+HIxTsDJwRviElWCdzKDebN/mk8/eX2n7qJi5G3Riq0tdQw9+C4diFZU5E97RSeahejOAAJTDqduqW6uLw9NsjJBkDRBlRjxjn5CaMMo5pYOxYbGrM8Un1JH5DMOLeXbq1xWbC17YSEoM1cRFfTgOoc+VpCe36Ai9Kc=',
    url='https://non-existent-example.execute-api.us-east-1.amazonaws.com/latest',
    param1=..., ...)
KeyCaptcha
Token-based method to solve KeyCaptcha.
result = solver.keycaptcha(
    s_s_c_user_id=10,
    s_s_c_session_id='493e52c37c10c2bcdf4a00cbc9ccd1e8',
    s_s_c_web_server_sign='9006dc725760858e4c0715b835472f22-pz-',
    s_s_c_web_server_sign2='2ca3abe86d90c6142d5571db98af6714',
    url='https://www.keycaptcha.ru/demo-magnetic/',
    param1=..., ...)
Capy
Token-based method to bypass Capy puzzle captcha.
result = solver.capy(
    sitekey='PUZZLE_Abc1dEFghIJKLM2no34P56q7rStu8v',
    url='http://mysite.com/',
    api_server='https://jp.api.capy.me/',
    param1=..., ...)
Grid
The Grid method was originally called the Old ReCaptcha V2 method. The method can be used to bypass any type of captcha where you can apply a grid on the image and need to click specific grid boxes. Returns the numbers of the boxes.
result = solver.grid('path/to/captcha.jpg', param1=..., ...)
Canvas
The Canvas method can be used when you need to draw a line around an object on an image. Returns a set of points' coordinates to draw a polygon.
result = solver.canvas('path/to/captcha.jpg', param1=..., ...)
ClickCaptcha
The ClickCaptcha method returns coordinates of points on the captcha image. Can be used if you need to click on particular points on the image.
result = solver.coordinates('path/to/captcha.jpg', param1=..., ...)
Rotate
This method can be used to solve a captcha that asks to rotate an object. Mostly used to bypass FunCaptcha.
Returns the rotation angle.result=solver.rotate('path/to/captcha.jpg',param1=...,...)Other methodssend / getResultThese methods can be used for manual captcha submission and answer polling.importtime.....id=solver.send(file='path/to/captcha.jpg')time.sleep(20)code=solver.get_result(id)balanceUse this method to get your account's balancebalance=solver.balance()reportUse this method to report good or bad captcha answer.solver.report(id,True)# captcha solved correctlysolver.report(id,False)# captcha solved incorrectlyError handlingIn case of an error, the captcha solver throws an exception. It's important to properly handle these cases. We recommend usingtry exceptto handle exceptions.try:result=solver.text('If tomorrow is Saturday, what day is today?')exceptValidationExceptionase:# invalid parameters passedprint(e)exceptNetworkExceptionase:# network error occurredprint(e)exceptApiExceptionase:# api respond with errorprint(e)exceptTimeoutExceptionase:# captcha is not solved so farprint(e)ProxiesYou can pass your proxy as an additional argument for methods: recaptcha, funcaptcha and geetest. 
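The send / getResult flow above pairs a submission with a fixed 20-second sleep; in practice you would poll until the answer arrives or a timeout expires. A minimal generic polling helper is sketched below; the names (`poll_result`, `fetch`) are hypothetical and not part of the 2captcha package:

```python
import time

def poll_result(fetch, timeout=120, interval=10):
    """Call `fetch` until it returns a non-None answer, or raise
    TimeoutError once `timeout` seconds have elapsed."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = fetch()
        if result is not None:
            return result
        time.sleep(interval)
    raise TimeoutError('captcha was not solved in time')

# Stand-in fetch function that "solves" on the third call:
attempts = {'n': 0}
def fake_fetch():
    attempts['n'] += 1
    return 'CODE123' if attempts['n'] >= 3 else None

print(poll_result(fake_fetch, timeout=5, interval=0))  # prints CODE123
```

With the real library, `fetch` would wrap `solver.get_result(id)` and treat a not-ready response as None; `timeout` and `interval` mirror the `defaultTimeout` and `pollingInterval` options.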
The proxy will be forwarded to the API to solve the captcha.proxy={'type':'HTTPS','uri':'login:password@IP_address:PORT'}Async callsYou can also make async calls withasyncio, for example:importasyncioimportconcurrent.futuresfromtwocaptchaimportTwoCaptchaasyncdefcaptchaSolver(image):loop=asyncio.get_running_loop()withconcurrent.futures.ThreadPoolExecutor()aspool:result=awaitloop.run_in_executor(pool,lambda:TwoCaptcha(API_KEY).normal(image))returnresultcaptcha_result=awaitcaptchaSolver(image)"} {"package": "2captcha-solver", "pacakge-description": "Python Module for 2Captcha APIThe easiest way to quickly integrate2Captchacaptcha solving service into your code to automate solving of any type of captcha.InstallationConfigurationSolve captchaNormal CaptchaTextReCaptcha v2ReCaptcha v3FunCaptchaGeeTesthCaptchaKeyCaptchaCapyGrid (ReCaptcha V2 Old Method)CanvasClickCaptchaRotateOther methodssend / getResultbalancereportError handlingInstallationThis package can be installed with Pip:pip3 install 2captcha-pythonConfigurationTwoCaptcha instance can be created like this:fromtwocaptchaimportTwoCaptchasolver=TwoCaptcha('YOUR_API_KEY')Also, there are a few options that can be configured:config={'server':'2captcha.com','apiKey':'YOUR_API_KEY','softId':123,'callback':'https://your.site/result-receiver','defaultTimeout':120,'recaptchaTimeout':600,'pollingInterval':10,}solver=TwoCaptcha(**config)TwoCaptcha instance optionsOptionDefault valueDescriptionserver2captcha.comAPI server. You can set it torucaptcha.comif your account is registered theresoftId-your software ID obtained after publishing in2captcha software catalogcallback-URL of your web-server that receives the captcha recognition result. The URL should be first registered inpingback settingsof your accountdefaultTimeout120Polling timeout in seconds for all captcha types except ReCaptcha. Defines how long the module tries to get the answer fromres.phpAPI endpointrecaptchaTimeout600Polling timeout for ReCaptcha in seconds.
Defines how long the module tries to get the answer fromres.phpAPI endpointpollingInterval10Interval in seconds between requests tores.phpAPI endpoint, setting values less than 5 seconds is not recommendedIMPORTANT:oncecallbackis defined forTwoCaptchainstance, all methods return only the captcha ID and DO NOT poll the API to get the result. The result will be sent to the callback URL.\nTo get the answer manually usegetResult methodSolve captchaWhen you submit any image-based captcha you can provide additional options to help 2captcha workers to solve it properly.Captcha optionsOptionDefault ValueDescriptionnumeric0Defines if captcha contains numeric or other symbolssee more info in the API docsminLength0minimal answer lengthmaxLength0maximum answer lengthphrase0defines if the answer contains multiple words or notcaseSensitive0defines if the answer is case sensitivecalc0defines if captcha requires calculationlang-defines the captcha language, see thelist of supported languageshintImg-an image with hint shown to workers with the captchahintText-hint or task text shown to workers with the captchaBelow you can find basic examples for every captcha type. Check outexamples directoryto find more examples with all available options.Normal CaptchaTo bypass a normal captcha (distorted text on image) use the following method.
This method also can be used to recognize any text on the image.result=solver.normal('path/to/captcha.jpg',param1=...,...)# ORresult=solver.normal('https://site-with-captcha.com/path/to/captcha.jpg',param1=...,...)Text CaptchaThis method can be used to bypass a captcha that requires to answer a question provided in clear text.result=solver.text('If tomorrow is Saturday, what day is today?',param1=...,...)ReCaptcha v2Use this method to solve ReCaptcha V2 and obtain a token to bypass the protection.result=solver.recaptcha(sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',url='https://mysite.com/page/with/recaptcha',param1=...,...)ReCaptcha v3This method provides ReCaptcha V3 solver and returns a token.result=solver.recaptcha(sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',url='https://mysite.com/page/with/recaptcha',version='v3',param1=...,...)FunCaptchaFunCaptcha (Arkoselabs) solving method. Returns a token.result=solver.funcaptcha(sitekey='6Le-wvkSVVABCPBMRTvw0Q4Muexq1bi0DJwx_mJ-',url='https://mysite.com/page/with/funcaptcha',param1=...,...)GeeTestMethod to solve GeeTest puzzle captcha. Returns a set of tokens as JSON.result=solver.geetest(gt='f1ab2cdefa3456789012345b6c78d90e',challenge='12345678abc90123d45678ef90123a456b',url='https://www.site.com/page/',param1=...,...)hCaptchaUse this method to solve hCaptcha challenge. 
Returns a token to bypass captcha.result=solver.hcaptcha(sitekey='10000000-ffff-ffff-ffff-000000000001',url='https://www.site.com/page/',param1=...,...)KeyCaptchaToken-based method to solve KeyCaptcha.result=solver.keycaptcha(s_s_c_user_id=10,s_s_c_session_id='493e52c37c10c2bcdf4a00cbc9ccd1e8',s_s_c_web_server_sign='9006dc725760858e4c0715b835472f22-pz-',s_s_c_web_server_sign2='2ca3abe86d90c6142d5571db98af6714',url='https://www.keycaptcha.ru/demo-magnetic/',param1=...,...)CapyToken-based method to bypass Capy puzzle captcha.result=solver.capy(sitekey='PUZZLE_Abc1dEFghIJKLM2no34P56q7rStu8v',url='http://mysite.com/',api_server='https://jp.api.capy.me/',param1=...,...)GridGrid method is originally called Old ReCaptcha V2 method. The method can be used to bypass any type of captcha where you can apply a grid on image and need to click specific grid boxes. Returns numbers of boxes.result=solver.grid('path/to/captcha.jpg',param1=...,...)CanvasCanvas method can be used when you need to draw a line around an object on image. Returns a set of points' coordinates to draw a polygon.result=solver.canvas('path/to/captcha.jpg',param1=...,...)ClickCaptchaClickCaptcha method returns coordinates of points on captcha image. Can be used if you need to click on particular points on the image.result=solver.coordinates('path/to/captcha.jpg',param1=...,...)RotateThis method can be used to solve a captcha that asks to rotate an object. Mostly used to bypass FunCaptcha. 
Returns the rotation angle.result=solver.rotate('path/to/captcha.jpg',param1=...,...)Other methodssend / getResultThese methods can be used for manual captcha submission and answer polling.importtime.....id=solver.send(file='path/to/captcha.jpg')time.sleep(20)code=solver.get_result(id)balanceUse this method to get your account's balancebalance=solver.balance()reportUse this method to report good or bad captcha answer.solver.report(id,True)# captcha solved correctlysolver.report(id,False)# captcha solved incorrectlyError handlingIn case of an error, the captcha solver throws an exception. It's important to properly handle these cases. We recommend usingtry exceptto handle exceptions.try:result=solver.text('If tomorrow is Saturday, what day is today?')exceptValidationExceptionase:# invalid parameters passedprint(e)exceptNetworkExceptionase:# network error occurredprint(e)exceptApiExceptionase:# api respond with errorprint(e)exceptTimeoutExceptionase:# captcha is not solved so farprint(e)ProxiesYou can pass your proxy as an additional argument for methods: recaptcha, funcaptcha and geetest.
The proxy will be forwarded to the API to solve the captcha.proxy={'type':'HTTPS','uri':'login:password@IP_address:PORT'}Async callsYou can also make async calls withasyncio, for example:importasyncioimportconcurrent.futuresfromtwocaptchaimportTwoCaptchaasyncdefcaptchaSolver(image):loop=asyncio.get_running_loop()withconcurrent.futures.ThreadPoolExecutor()aspool:result=awaitloop.run_in_executor(pool,lambda:TwoCaptcha(API_KEY).normal(image))returnresultcaptcha_result=awaitcaptchaSolver(image)"} {"package": "2ch-downloader", "pacakge-description": "2ch-downloaderDownload all files of a 2ch.hk thread.Installationpipinstall2ch-downloaderUsageusage: 2ch-downloader [-h] [-d DIRECTORY] [--max-directory-name-length LENGTH] URL\n\npositional arguments:\n URL Thread url\n\noptions:\n -h, --help show this help message and exit\n -d DIRECTORY, --directory DIRECTORY\n Download directory\n --max-directory-name-length LENGTH\n Max thread directory name length, 128 by default"} {"package": "2C.py", "pacakge-description": "No description available on PyPI."} {"package": "2d", "pacakge-description": "2D2D is a project with 1 goal - make programming 2D games as easy as possible.Want to make 3D games? Check out3D, the other project for 3D games!"} {"package": "2d6io-cryptobot", "pacakge-description": "2d6io-cryptobotDependenciesThis project uses @danpaquin's excellentcoinbasepro-pythonWe also use Flask to generate a restful endpoint to serve our datapip3installcbpropip3installflaskWe also use Angular and NPM to serve a web interface for the application.\nYou will need to havethe latest versionof node installed.Run the following installs once node is complete (seethe docsfor more info):npminstall-g@angular/cliConfigRenameconfig_example.initoconfig.iniand update it with the right values.UNDER NO CIRCUMSTANCES SHOULDconfig.iniBE UPLOADED TO THE REPO OR SHARED WITH OTHERSThe config is already in the .gitignore so the only way this will happen is if you modify that.
Don't do it.RunningRestful ServicesTo run from a command line run the following:python3app.pyThis will start the flask restful service, which can be opened from yourlocalhostThis restful service has several endpoints:/api/wallet- Returns the current state of your wallet/api/analyze_coins- Runs a single pass of the coin analysis process using the data feed from CoinbaseWeb InterfaceTo run the web interface, navigate from a console to/interfaceand run the following:ngserveNavigate to yourlocal angular serverand you will see the wallet interface.To start the analysis service, clickStart Monitoring. To disable the service, clickStop MonitoringDonateLike the project? Donate!Wallet:0xe6f912cba2D254511170884AF4637689BE8758E6"} {"package": "2D-cellular-automaton", "pacakge-description": "2D Cellular AutomatonThis project was inspired by discussions in MATH 340 Mathematical Excursions. While we visualized multiple starting indices for 2D cellular automata in Excel, I knew a Python script would allow greater functionality and easier usage.I came across a repository on GitHub by Zhiming Wang titledrule30. Nearly all the code is borrowed from there and made it unnecessary for me to start from scratch. All the functionalities from Wang's solution exist in this project, with the only additions being supporting multiple starting indices.Table of ContentsInstallationUsageCreditLicenseInstallationpip install 2DCellularAutomatonUsagefromCellularAutomatonimportAutomatonrows=100#Any positive numberrule=30#From 1-256.
More can be seen here https://mathworld.wolfram.com/ElementaryCellularAutomaton.htmlstarting_indicies=[20,60]#Note this refers to the columns and columns = 2 * rows - 1, which is why rows - 1 yields center.block_size=1automata=Automaton(rows=rows,rule=rule,starting_indicies=starting_indicies)image=automata.image(block_size=block_size)image.save('Rule 30 | Column 20 and 60.jpeg')OutputCreditZhiming Wang'srule30LicenseMIT"} {"package": "2dfly-manbanzhen", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2DMath", "pacakge-description": "No description available on PyPI."} {"package": "2D-Panel-CFD", "pacakge-description": "Background and ValidationThis wiki is built in Notion. Here are all the tips you need to contribute.General BackgroundFlow over a cylinderThe project has been started as an Open Source repository for CFD solvers. The motive is to provide handy, easy-to-understand code with a multitude of CFD schemes for CFD developers. It also needs to remain functional as an easy-to-set-up open source solver for users. This release comprises only a sequential terminal prompt, simple and effective. We have immediate plans to add a PyQt GUI.Head to the notion page for more information on how to add to this project:https://florentine-hero-1e6.notion.site/2D_Panel-CFD-ad63baa924ee4a32af8a52b8134c0360This version comprises a 2D Staggered Grid with Inlet, Outlet & Wall Boundary conditions. Obstacles can be imported & transformed with a list of points or with the inbuilt elliptical geometries.First order Upwind Scheme is used for Velocity with very good results for the benchmark Lid Driven Cavity problem when compared to results in Ghia et al.The SIM runs stably with terminal-python for <10000 Cells, after which Residual plotting becomes laggy. spyder (Anaconda IDE) provides great speed-ups with multi-core utilisation & also improves the post-processing experience.
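The Automaton usage above takes a Wolfram rule number (the `rule=30` argument) and applies it bitwise to every three-cell neighborhood, as in Wang's rule30 project. A minimal sketch of that rule application, independent of the package's actual internals:

```python
def next_row(row, rule=30):
    """Compute the next generation of an elementary cellular automaton.
    Cells outside the row are treated as 0 (dead)."""
    padded = [0] + list(row) + [0]
    out = []
    for i in range(len(row)):
        left, center, right = padded[i], padded[i + 1], padded[i + 2]
        # The neighborhood (l, c, r) selects one bit of the 8-bit rule number.
        out.append((rule >> (left * 4 + center * 2 + right)) & 1)
    return out

# A single seed cell grows to three live cells under rule 30:
print(next_row([0, 0, 1, 0, 0]))  # [0, 1, 1, 1, 0]
```

The package's `starting_indicies` parameter corresponds to choosing which cells of the first row are seeded with 1 before iterating this step `rows` times.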
The sequential-prompt model is designed around a GUI and will be ported to one in the next update.The lack of multi-threading support in Python outweighs the ease of accessibility of the matplotlib library. We will be looking to port to C++ soon, utilizing VTK libraries with ParaView & Blender for visualization.The framework is designed to test new FVM schemes, & Coupling solvers. All popular convection schemes will be added soon. Multiple solvers will be available in the next updates, the likes of SIMPLER, PISO, Pimple etc. Future plans also include Unsteady & VOF solvers.The program works as a sequential prompt, for SIM Parameters. The prompts are designed keeping in mind a GUI approach, which will be available in the next update. There are frequent Check Cycles to render the result & modify any inputs. We'll go through an exemplary First Run in the next Section.InstallationMethod: 1To install using pip Run:python3-mpipinstall2D_Panel-CFDOr:Method: 2https://github.com/Fluidentity/2D_Panel-CFDClone the GitHub[RUN_package](https://github.com/Fluidentity/2D_Panel-CFD.git)to anywhere on your machine:cd /insert/folder/address/cfd\ngit clone https://github.com/Fluidentity/2D_Panel-CFD.gitSet it to PYTHONPATH with:exportPYTHONPATH=\"${PYTHONPATH}:/insert/folder/address/cfd/RUN_package\"It's advisable to run this package fromRUN-spyder.pythrough an IDE like spyder for ease of use, and prolonged variable storage. Also, spyder has a great plotting interface.Executable\ud83d\udca1 The source directory should be set up as PYTHONPATH if not installed using pipMethod: 1Open python environment with: (in terminal)python3or (if python --version is >3)pythonthen insert:fromRUN_packageimportRUNRUN.pyis meant to be run from the terminal.Method: 2Run on IDE by cloningRUN_packagefrom Github.Open python IDE like spyder from RUN_package directory:Run RUN-spyder.pyThe cells for pre-processor, solver & post processors are different.
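The first-order upwind discretisation this solver relies on is easiest to see in 1D: for linear advection u_t + c*u_x = 0 with c > 0, each cell is updated with its upstream difference. A toy sketch for illustration only (this is not the package's staggered-grid 2D implementation):

```python
def upwind_step(u, c, dt, dx):
    """One explicit first-order upwind update for u_t + c*u_x = 0, c > 0.
    Periodic boundary: index -1 wraps to the last cell."""
    cfl = c * dt / dx
    assert cfl <= 1.0, "Courant number must not exceed 1 for stability"
    return [u[i] - cfl * (u[i] - u[i - 1]) for i in range(len(u))]

# With a Courant number of exactly 1 the profile advects one cell per step:
print(upwind_step([0.0, 1.0, 0.0, 0.0], c=1.0, dt=1.0, dx=1.0))  # [0.0, 0.0, 1.0, 0.0]
```

For Courant numbers below 1 the same update remains stable but smears sharp profiles, which is the first-order numerical diffusion the benchmark conclusions below refer to.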
Need to run all.RUN_spyder.pycan be run with an IDE, such as spyder, to improve multi-core utilisation & the post-processing experience.Validation of SolverVortex Shedding flow over a cylinderFor validation of the solver laid out, the following strategies are used:Comparison with Benchmark Problem Lid Driven CavityReference study Ghia et al. Re = 100, 1000, 5000Lid Driven Cavity Benchmark Ghia et al.ResidualsBenchmark Test at Re=100First Order Upwind schemeBenchmark Test at Re=1000First Order Upwind schemeBenchmark Test at Re=5000First Order Upwind schemeConclusionFirst order UPWIND Scheme is good for low Reynolds numbers but is only first-order accurate and struggles to capture high gradients.Fully developed flow between Parallel PlatesVelocity Profile [at X=0.8Lx and Y=0.5Ly]ConclusionThe Umax velocity comes close to the 1.5 factor for steady flow between parallel plates. First order UPWIND Scheme with high y-gradient."} {"package": "2DRegEx-mwinters", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "2dwavesim", "pacakge-description": "2dwavesimThis is a project that simulates waves on 2D plates/rooms. Given boundaries (or walls) and points where oscillation will be forced, this will simulate the resulting wavemodes!Currently it supports setting the attenuation properties of individual boundaries, multiple forcing points based on either data or a function, and any wall shape you want. It also supports variable time and space steps and spans (as long as you keep numerically stable!), as well as custom wavespeed and attenuation on the material.TODO:add testsfrequency-dependent wall transmission values3D??UsageThere are two main Classes:Room(ds, width, height,*, walls=[], physics_params={})This creates an instance of aRoomclass, which contains any walls or sources of the system.dsis a float which defines the unit of distance between two grid points.widthandheightare floats which define the dimensions of the grid.
If they are not exact multiples ofds, then they are upper bounds on the number of points above the nearest multiple.wallsis a list ofWallobjects. This can be optionally set after construction as well.physics_paramsis a dict with structure{'wavespeed':float, 'attenuation':float}. Wavespeed represents the speed of the propigating wave on the room's medium, and attenuation represents the attenuation factor of waves on the medium. By defaut, wavespeed is assumed to be 343 units/s and attenuation is assumed to be $2^{-2}$ units\n$^{-1}$.Room.add_source_func(loc, func)Add a source based on a function.locis the room-specific coordinate of the source. Note: unlessds=1, this is not the same as the list indices of the point in the room.funcis a function taking a float (the time) and outputing a float (the value of the wave at that time). This should be something likelambda t: np.sin(t), for example.Room.add_source_data(loc, data)Add a source based on a list of values. Careful! Make sure you use adtvalue that matches the table data, as an entry of the data list will be used on every time tick. For example, if you make the data table represent the value of the wave every 200ms, then be sure to setdtto 200ms as well when you run the simulation. If there are less points in the list of values than there are time steps, then a value of 0 is used for all time steps past the last data point.locis the room-specific coordinate of the source. Note: unlessds=1, this is not the same as the list indices of the point in the room.datais a list of floats (the value of the wave at that time). This should be something likenp.sin(np.arange(0, 10, 0.2)), for example.Room.add_walls(walls)Add walls to the system after constructing the Room object.wallsis a list ofWallobjects to add the the system.Room.create_mask()Create a mask for the values of the room based on the currently set walls. 
This is automatically done when running the simulation, but it can be run beforehand if you want to plot the mask for visualization.Room.get_mask()Returns a 2D array of the wall mask as currently calculated.Room.run(dt, t_final)Calculate the wave propagation from the set sources, using the set walls. This will simulate fromt=0tot_finalatdttime steps. Ift_finalisn't an exact multiple ofdt, then it acts like an upper bound.dta float giving the time step for the simulation. A smaller value means more time resolution. WARNING: Numerical stability will almost certainly be lost if this is not set to satisfy theCFL Condition, namely $\frac{u*dt}{ds} \leq C_{max}$ where $u$ is thewavespeedand $C_{max}$ is approximately 1 for the numerical method being used.t_finala float giving an upper limit for the amount of time to be simulated. A higher value will take more time to simulate, and will likely just repeat the steady state after a certain point in time...Wall(endpoint1, endpoint2, transmission)This creates an instance of aWallclass, which contains the wall's endpoints and transmission factor.endpoint1andendpoint2are 2-tuples or 2-lists of floats giving the position of each end of the wall in the room-specific coordinates. Note: unlessds=1, this is not the same as the list indices of the point in the room.transmissionis a float in [0,1] which defines the proportion of wave amplitude able to penetrate the wall. If 0, then all energy is reflected back inwards, and if 1 then the wall \"isn't there\".VisualizationThevisualizationmodule contains a few functions for visualizing results, or processing results into an easily displayed format.animate(data, *, filepath='', frame_space=10, walls=[])Automatically animate the given data usingmatplotlib.animation.ArtistAnimation. The animation file can optionally be saved to a file.datais a 3D array of waveform over time, which is the output from running the simulation.filepathis the name and path of the output file.
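The CFL warning above is mechanical to check before calling Room.run. A small helper sketch (hypothetical, not part of 2dwavesim) for the stated condition u*dt/ds <= C_max with C_max ≈ 1:

```python
def cfl_number(wavespeed, dt, ds):
    """Courant number u*dt/ds for the explicit update scheme."""
    return wavespeed * dt / ds

def is_stable(wavespeed, dt, ds, c_max=1.0):
    """True when the chosen dt satisfies the CFL condition."""
    return cfl_number(wavespeed, dt, ds) <= c_max

# With the default wavespeed of 343 units/s and ds = 1, dt must stay below ~2.9 ms:
print(is_stable(343, 0.001, 1.0))  # True  (CFL ~ 0.343)
print(is_stable(343, 0.01, 1.0))   # False (CFL ~ 3.43)
```

Running such a check up front turns a silent numerical blow-up into an explicit configuration error.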
Leave this blank to not save. Output formats are those supported bymatplotlib.animation.ArtistAnimation, which is at least \".gif\" and \".webp\".frame_spaceis the temporal resolution of resulting animation. Make sure this isn't too small!wallsis to optionally include the walls in the animation. They won't be visible if this isn't included.get_steady_state_index(data, *, sample_points, rms_tolerance=0.1, window_size=0.1)This function calculates the windowed RMS of the given points over time. This data is compared to the RMS value at the end of the simulation. Then the latest time index where all point RMS's are within a tolerance to the final RMS is taken as the time index where steady-state is reached.datais a 3D array of waveform over time, which is the output from running the simulation.sample_pointsis a list of points in the room which will be checked for RMS.rms_toleranceis a float in [0, 1] defining the limit on the amount the RMS is allowed to change from the final value and still be considered steady-state.window_sizeis a float in [0, 1] defining the percent of total points to consider in the window.get_standing_waves(data, *, steady_state_kwargs=None)This function calculates when the steady state begins, and returns a 2D array which is the average of the absolute value of all of the rooms points across all steady state times.datais a 3D array of waveform over time, which is the output from running the simulation.steady_state_kwargsis a dict of the keyword arguments to pass toget_steady_state_index. 
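The windowed-RMS test that get_steady_state_index describes can be sketched for a single sample point as follows; this is a simplified stand-in, not the library's exact implementation:

```python
def rms(xs):
    """Root-mean-square of a sequence."""
    return (sum(x * x for x in xs) / len(xs)) ** 0.5

def steady_state_index(signal, window=3, tol=0.1):
    """Earliest index from which every sliding-window RMS stays within
    a relative tolerance `tol` of the final window's RMS."""
    final = rms(signal[-window:])
    last_start = len(signal) - window
    for i in range(last_start + 1):
        if all(abs(rms(signal[j:j + window]) - final) <= tol * final
               for j in range(i, last_start + 1)):
            return i
    return last_start

# A decaying transient that settles to 1 reaches steady state at index 3:
print(steady_state_index([9, 7, 5, 1, 1, 1, 1, 1, 1]))  # 3
```

The library's version additionally checks several sample points at once and expresses the window size as a fraction of the total number of time steps.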
IfNone, then the default parameters and a sample point at the middle of the room are used."} {"package": "2dxPlay", "pacakge-description": "2dxPlayPlay konami 2dx files from the command line.Install with pip and run it from anywhere.pip install 2dxPlayUsage:2dxplay infile.2dxIf the file contains more than one wav, it will play all tracks sequentially.Press Ctrl+C to pause and enter a track number to jump to, orqto quit.###Tip: Play any 2dx file just by double clicking on it:\nRight click on a 2dx file, choose Open With > Look for another app on this PC.Navigate to your python installation where 2dxPlay.exe is installed. (I use python 3.7, so pip\ninstalled it in%appdata%\\Local\\Programs\\Python\\Python37\\Scripts).Check the \"Always use this app\" box and profit."} {"package": "2fa", "pacakge-description": "UNKNOWN"} {"package": "2fas", "pacakge-description": "2fas Python2fas-python is an unofficial implementation\nof2FAS - the Internet\u2019s favorite open-source two-factor authenticator.\nIt consists of a core library in Python and a CLI tool.InstallationTo install this project, use pip or pipx:pipinstall2fas# or:pipxinstall2fasUsageTo see all available options, you can run:2fas--helpIf you simply run2fasor2fas /path/to/file.2fas, an interactive menu will show up.\nIf you only want a specific TOTP code, you can run2fas or2fas /path/to/file.2fas .\nMultiple services can be specified:2fas [/path/to/file.2fas].\nFuzzy matching is applied to (hopefully) catch some typo's.\nYou can run2fas --allto generate codes for all TOTP in your.2fasfile.Settings# see all settings:2fas--settings# shortcut: -s# see a specific setting:2fas--settingkey# update a setting:2fas--settingkeyvalueThe--settings,--settingor-sflag can be used to read/write settings.\nThis can also be done from within the interactive menu.As a LibraryPlease see the documentation oflib2fas-pythonfor more details on\nusing this as a Python library.LicenseThis project is licensed under the MIT License."} 
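The TOTP codes that 2fas generates follow RFC 6238 (HMAC-based one-time passwords over 30-second time steps by default), and the core derivation fits in a few lines of stdlib Python. This is a sketch of the algorithm, not lib2fas's actual code:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6, digest='sha1'):
    """RFC 6238 TOTP from a base32-encoded shared secret."""
    if for_time is None:
        for_time = time.time()
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of whole time steps since the epoch.
    counter = struct.pack('>Q', int(for_time) // step)
    mac = hmac.new(key, counter, digest).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset.
    offset = mac[-1] & 0x0F
    code = struct.unpack('>I', mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" (base32 below), T = 59:
print(totp('GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ', for_time=59, digits=8))  # 94287082
```

A `.2fas` backup file stores such base32 secrets per service; generating a code is just this derivation applied to the current time.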
{"package": "2feeds", "pacakge-description": "No description available on PyPI."} {"package": "2gis", "pacakge-description": "A Python library for accessing the 2gis API"} {"package": "2ip", "pacakge-description": "Python 2ip Module2ipallows you to make requests to the 2ip.me API to retrieve provider/geographic information for IP addresses. Requests are (optionally, on by default) cached to prevent unnecessary API lookups when possible.InstallationInstall the module from PyPI:python3-mpipinstall2ipMethodsThe following methods are available.TwoIP (Initialisation)When initialising the 2ip module the following parameters may be specified:Optionalkey: The API key to use for lookups. If no API key is defined, lookups will use the rate-limited free API.geoThe geographic lookup method accepts the following parameters:Requiredip: The IP address to look up.Optionalformat{json,xml}: The output format for the request.jsonwill return a dict andxmlwill return a string.Optionalforce{True,False}: Force an API lookup even if there is a cache entry.Optionalcache{True,False}: Allow the lookup result to be cached.providerThe provider lookup method accepts the following parameters:Requiredip: The IP address to look up.Optionalformat{json,xml}: The output format for the request.jsonwill return a dict andxmlwill return a string.Optionalforce{True,False}: Force an API lookup even if there is a cache entry.Optionalcache{True,False}: Allow the lookup result to be cached.ExamplesSome example scripts are included in theexamplesdirectory.Provider APIRetrieve provider information for the IP address192.0.2.0as adict:>>>fromtwoipimportTwoIP>>>twoip=TwoIP(key=None)>>>twoip.provider(ip='192.0.2.0'){'ip':'192.0.2.0','ip_range_end':'3221226239','ip_range_start':'3221225984','mask':'24','name_ripe':'Reserved AS','name_rus':'','route':'192.0.2.0'}Retrieve provider information for the IP address192.0.2.0as an XML
string:>>>fromtwoipimportTwoIP>>>twoip=TwoIP(key=None)>>>twoip.provider(ip='192.0.2.0',format='xml')'\\n192.0.2.0Reserved AS32212259843221226239192.0.2.024'Geographic APIRetrieve geographic information for the IP address8.8.8.8as adict:>>>fromtwoipimportTwoIP>>>twoip=TwoIP(key=None)>>>twoip.geo(ip='8.8.8.8'){'city':'Mountain view','country':'United states of america','country_code':'US','country_rus':'\u0421\u0428\u0410','country_ua':'\u0421\u0428\u0410','ip':'8.8.8.8','latitude':'37.405992','longitude':'-122.078515','region':'California','region_rus':'\u041a\u0430\u043b\u0438\u0444\u043e\u0440\u043d\u0438\u044f','region_ua':'\u041a\u0430\u043b\u0456\u0444\u043e\u0440\u043d\u0456\u044f','time_zone':'-08:00','zip_code':'94043'}Retrieve geographic information for the IP address8.8.8.8as a XML string:>>>fromtwoipimportTwoIP>>>twoip=TwoIP(key=None)>>>twoip.geo(ip='8.8.8.8',format='xml')'\\n8.8.8.8USUnited states of america\u0421\u0428\u0410\u0421\u0428\u0410California\u041a\u0430\u043b\u0438\u0444\u043e\u0440\u043d\u0438\u044f\u041a\u0430\u043b\u0456\u0444\u043e\u0440\u043d\u0456\u044fMountain view37.405992-122.07851594043-08:00'Roadmap/TodoSupport for email APISupport for MAC address APISupport for hosting APIOption to retrieve data as XMLUnit testsDeduplicate handler to retrieve information from API"} {"package": "2jjtt6cwa6", "pacakge-description": "UNKNOWN"} {"package": "2Keys", "pacakge-description": "# 2Keys\nA easy to setup second keyboard, designed for everyone.For a full setup guide, see [here](https://github.com/Gum-Joe/2Keys/blob/master/docs/SETUP.md)For keyboard mappings, see [here](https://github.com/Gum-Joe/2Keys/blob/master/docs/MAPPINGS.md)### Support\nWindows is supported only as the server (where the hotkeys will run) and a raspberry pi is required to run the detector.## WARNING\nThis will download a copy of [AutoHotkey_H](https://hotkeyit.github.io/v2/), a DLL version of [AutoHotkey](http://autohotkey.com/)## Building\nTo build the server, where 
hotkeys are run:` $ cd server $ yarn `To build the detector:` $ cd detector $ pip3 install-rrequired.txt `## DevicesServer: The device running the hotkeys server, i.e. where the hotkeys will be runDetector: Device that handles detection of key presses & which keyboard it is and sends this to the server## Software used & inspiration\nInspired by LTT editor Taran\u2019s second keyboard project: [https://github.com/TaranVH/2nd-keyboard](https://github.com/TaranVH/2nd-keyboard)2Keys uses AutoHotkey_H (a DLL version of AutoHotkey): [https://hotkeyit.github.io/v2/](https://hotkeyit.github.io/v2/)## License\nCopyright 2018 Kishan Sambhi2Keys is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.2Keys is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU General Public License for more details.You should have received a copy of the GNU General Public License\nalong with 2Keys.
If not, see ."} {"package": "2lazy2rest", "pacakge-description": "A simple way to produce short-to-medium document usingreStructuredTextMulti-format themesRender the same document in HTML, ODT, PDF keeping the main visual identityUnified interfaceTired of switching between rst2* tools having different arguments or behavior ?Would like to not losecode-blocksor some rendering options switching the output format ?This tool try to address thisMake your own themeTODO: templates will be customizable easily (say, probably colors only)How to use itDependenciesYou\u2019ll needrst2pdfto use all the features, other rst2* tools are coming from docutils.Usingmkrst [-h] [--html] [--pdf] [--odt] [--theme THEME]\n [--themes-dir THEMES_DIR]\n FILEoptional arguments:-h,--helpshow this help message and exit--htmlGenerate HTML output--pdfGenerate PDF output--odtGenerate ODT output--themeTHEMEUse a different theme--themes-dirTHEMES_DIRChange the folder searched for themepopo:~/2lazy2rest%./mkrsttest_page.rst--html--pdfUsing ./themes/default\n html: test_page.html\n pdf: test_page.pdfCustomizingMake a copy ofthemes/default, edit to your needs the copy and use the\u2013themeoption with the name of your copy, that\u2019s All !Examplepopo:~/2lazy2rest%cp-rthemes/defaultthemes/redpopo:~/2lazy2rest%sed-si's/#FEFEFE/red/g'themes/red/html/stylesheet.csspopo:~/2lazy2rest%./mkrsttest_page.rst--html--themeredIssuesODT style is unfinishedPDF & HTML still needs more ReST coverageNo skin generation from template yet"} {"package": "2mp3", "pacakge-description": "UNKNOWN"} {"package": "2mp4", "pacakge-description": "UNKNOWN"} {"package": "2or3", "pacakge-description": "UNKNOWN"} {"package": "2ppy", "pacakge-description": "2PPy (tuProlog in Python)Experimental porting of2P-Kton Python, viaJPype.This is awork in progress. 
2PPy is not ready for general availability, yet.
Introduction
An object-oriented and modular ecosystem for symbolic AI and logic programming, currently featuring:
a module for logic terms and clauses representation, namely tuprolog.core,
a module for logic unification, namely tuprolog.unify,
a module for in-memory indexing and storing of logic theories, as well as other sorts of collections of logic clauses, namely tuprolog.theory,
a module providing a generic API for resolution of logic queries, namely tuprolog.solve, currently implementing a Prolog solver,
two parsing modules: one aimed at parsing terms, namely tuprolog.core.parsing, and the other aimed at parsing theories, namely tuprolog.theory.parsing,
two serialisation-related modules: one aimed at (de)serialising terms and clauses, namely tuprolog.core.serialize, and the other aimed at (de)serialising theories, namely tuprolog.theory.serialize,
a module for using Prolog via a command-line interface, namely tuprolog.repl.
How to do stuff
Prerequisites
Install Python 3 (look into the .python-version file to know the exact version). I suggest using Pyenv to easily handle multiple Python versions on the same machine.
Ensure PIP works fine.
Install Java (JDK preferred), and ensure the JAVA_HOME variable is correctly set.
Ensure Java and Python are both either 64bit or 32bit.
If you have installed some prior development version of 2PPy (e.g. tuppy or tuprolog), uninstall them via pip uninstall tuppy tuprolog
On Mac OS this may not work as expected. Consider running the following command instead: python3 -m pip uninstall tuppy tuprolog
How to develop 2PPy
Restore Python dependencies via PIP, by running: pip install -r requirements.txt
On Mac OS this may not work as expected. Consider running the following command instead: python3 -m pip install -r requirements.txt
Restore JVM dependencies via download-jars.sh, by running: ./download-jars.sh
Notice that this command requires curl and wget to be installed on your system (wget may be lacking on Mac OS and Windows).
How to use 2PPy as a library
Install 2PPy from
PyPI by running: pip install 2ppy
On Mac OS this may not work as expected. Consider running the following command instead: python3 -m pip install 2ppy
Import tuprolog.* modules in your Python scripts. Profit!
How to use 2PPy as an executable
Install 2PPy from PyPI by running: pip install 2ppy
On Mac OS this may not work as expected. Consider running the following command instead: python3 -m pip install 2ppy
Run the tuprolog module via python -m tuprolog
For the moment, running 2PPy means starting an interactive Python shell with pre-loaded tuprolog.* modules. Eventually python -m tuprolog will launch a command-line logic solver."} {"package": "2to3", "pacakge-description": "No description available on PyPI."} {"package": "2us", "pacakge-description": "__ (Double Underscores, 2us)
Glueing functionals by import __ for Python!
Install
The package is written in pure Python, with no dependencies other than the Python language. Just do: pip install 2us
Requires Python 3.5 or higher.
Why this?
Python is a great language for creating convenient wrappers around native code and implementing simple, human-friendly functions. In Python, a bunch of builtin higher-order methods (which means that they accept functions as arguments) such as map and filter are available. They enable streamed data processing on containers that focuses on the processing itself, in contrast with noisy code in traditional command-based languages that is heavily involved in loops. However, you may occasionally run into the situation where you find that there are no standard library functions to implement in-line unpacking of tuples, or adding all numbers in a list by a constant shift, so you will have to write:
map(lambda x: x + 1, some_list)
map(lambda x: x[0], some_list)
which seems rather dumb due to the inconvenient definition of lambda functions in Python.
Using __
Start using the package by importing __:
import __
And then __ can be used to create convenient functions that are identical to those written with lambda.
Examples:
assert sum(map(__ + 1, range(1000))) == sum(map(lambda x: x + 1, range(1000)))
assert set(map(__[0], {1: 2, 4: 6}.items())) == {1, 4}
assert functools.reduce(__ + __, range(1000)) == sum(range(1000))
Currently there is a drawback: Python does not support overriding __contains__ to return non-boolean values, so the in condition should be handled separately.
assert tuple(map(__.is_in([1, 2]), [3, 1, 5, 0, 2])) == (False, True, False, False, True)
assert list(map(__.contains('1'), '13')) == [True, False]"} {"package": "2vyper", "pacakge-description": "2vyper is an automatic verifier for smart contracts written in Vyper, based on the Viper verification infrastructure. It is being developed at the Programming Methodology Group at ETH Zurich. 2vyper was mainly developed by Robin Sierra, Christian Bräm, and Marco Eilers.
For examples of the provided specification constructs, check out the examples folder. Note that the examples are written in Vyper 0.1.0, but 2vyper supports different versions if a version pragma is set. A short overview of the most important specification constructs can be found here. For further documentation, read our paper about 2vyper's specification constructs, or Robin Sierra's and Christian Bräm's Master's theses on the tool.
Dependencies (Ubuntu Linux, MacOS)
Install Java >= 11 (64 bit) and Python >= 3.7 (64 bit).
For usage with Viper's verification condition generation backend Carbon, you will also need to install .NET / the Mono runtime.
Dependencies (Windows)
Install Java >= 11 (64 bit) and Python >= 3.7 (64 bit).
Install either Visual C++ Build Tools 2015 (http://go.microsoft.com/fwlink/?LinkId=691126) or Visual Studio 2015.
For the latter, make sure to choose the option “Common Tools for Visual C++ 2015” in the setup (see https://blogs.msdn.microsoft.com/vcblog/2015/07/24/setup-changes-in-visual-studio-2015-affecting-c-developers/ for an explanation).
Getting Started
Clone the 2vyper repository:
git clone https://github.com/viperproject/2vyper
cd 2vyper/
Create a virtual environment and activate it:
virtualenv env
source env/bin/activate
Install 2vyper:
pip install .
Command Line Usage
To verify a specific file from the 2vyper directory, run:
2vyper [OPTIONS] path-to-file.vy
The following command line options are available:
``--verifier``
 Selects the Viper backend to use for verification. Possible options are ``silicon`` (for Symbolic Execution) and ``carbon`` (for Verification Condition Generation based on Boogie). Default: ``silicon``.
``--viper-jar-path``
 Sets the path to the required Viper binaries (``silicon.jar`` or ``carbon.jar``). Only the binary for the selected backend is required. We recommend that you use the provided binary packages installed by default, but you can also compile your own from source. Expects either a single path or a colon- (Unix) or semicolon- (Windows) separated list of paths. Alternatively, the environment variables ``SILICONJAR``, ``CARBONJAR`` or ``VIPERJAR`` can be set.
``--z3``
 Sets the path of the Z3 executable. Alternatively, the ``Z3_EXE`` environment variable can be set.
``--boogie``
 Sets the path of the Boogie executable. Required if the Carbon backend is selected. Alternatively, the ``BOOGIE_EXE`` environment variable can be set.
``--counterexample``
 Produces a counterexample if the verification fails.
Currently only works with the default ``silicon`` backend.
``--vyper-root``
 Sets the root directory for the Vyper compiler.
``--skip-vyper``
 Skips type checking the given Vyper program using the Vyper compiler.
``--print-viper``
 Prints the generated Viper file to the command line.
To see all possible command line options, invoke 2vyper without arguments.
Alternative Viper Versions
To use a custom version of the Viper infrastructure, follow the instructions here. Look for sbt assembly to find instructions for packaging the required JAR files. Use the parameters mentioned above to instruct 2vyper to use your custom Viper version. Note that 2vyper may not always work with the most recent Viper version.
Troubleshooting
On Windows: During the setup, you get an error like Microsoft Visual C++ 14.0 is required. or Unable to find vcvarsall.bat:
Python cannot find the required Visual Studio 2015 C++ installation; make sure you have either installed the Build Tools or checked the “Common Tools” option in your regular VS 2015 installation (see above).
While verifying a file, you get a stack trace ending with something like No matching overloads found:
The version of Viper you're using does not match your version of 2vyper. Try using the one that comes with 2vyper instead.
Build Status"} {"package": "2wf90-assignment", "pacakge-description": "No description available on PyPI."} {"package": "2xh-leet", "pacakge-description": "LEET
Library of Eclectic Experiments by Tenchi
Random modules that I made and use in several projects and that are too small to get their own package. A util library of sorts.
Contents
Logging
Progress bars
Images
Logging
Module that provides a fancy-looking theme for Python loggers. (TODO: Screenshot)
To enable, import leet.logging from anywhere (maybe the main __init__.py of your project).
You will then have a global logger log function that you can use from anywhere:
log.info(\"Hello\")
log.warn(\"World\")
If using MyPy (or if you don't like monkeypatching) you can import the logger explicitly in each module as needed:
from leet.logging import log
log.info(\"Explicit import\")
Progress bars
Also provides a progress bar (from WoLpH/python-progressbar) that fits in the theme:
from time import sleep
from leet.logging import log_progress
for i in log_progress.debug(range(10)):
    sleep(1)
    log.info(\"Working on %d...\" % i)
Images
Also supports outputting images via imgcat if using iTerm2 (support for other tools pending):
log.warn(\"Image is too big:\", extras={\"img\": \"path/to/image.png\"})"} {"package": "3", "pacakge-description": "3TeamCarlos Abraham"} {"package": "300", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3000", "pacakge-description": "welcome to my package"} {"package": "31", "pacakge-description": "31
31 is a simple tool you can use to run code in the background on a server. For example 31 c 'sleep 100; echo 2' runs the command sleep 100; echo 2 in a screen session then sends you an email with the output of the command once it is complete.
Setup
Install 31 by running pip install 31
Then set up your email address by running 31 config email youremail@example.com
Quick dependency setup
On ubuntu you can run sudo apt install screen mailutils to quickly set up the dependencies needed.
Detailed dependency setup
Mail program
By default, 31 searches for a mail program to use from the following list. You can also force it to use one of the programs by using the command 31 config mail_program gnu_mail.
gnu_mail. To install on ubuntu you can run sudo apt install mailutils
mutt. To install on ubuntu you can run sudo apt install mutt
Screen Manager
Currently 31 only supports screen. To install screen on ubuntu run sudo apt install screen
Options
See 31 -h for a full list of options.
This section covers only some of the more complicated ones.
Foreach
This option allows you to run multiple commands with text substitution. As a basic usage example, the code
31 c -f %x 1,2,3 'touch %x.txt'
creates each of the files 1.txt, 2.txt, and 3.txt. The variable substitution is managed via direct text substitution, and thus your variables do not need to begin with %; this works equally well (though is far less readable):
31 c -f 2 1,2,3 'touch 2.txt'
You can also modify two variables in tandem like this:
31 c -f2 %x %ext 1,2,3 txt,png,py 'touch %x.%ext'
This creates the files 1.txt, 2.png, 3.py. If you instead want to create all combinations, you can run:
31 c -f %x 1,2,3 -f %ext txt,png,py 'touch %x.%ext'
This creates the files 1.txt, 1.png, 1.py, 2.txt, 2.png, 2.py, 3.txt, 3.png, 3.py.
The values field is in comma-separated-value form, which means you can use \" as a CSV escape, as such:
31 c -f %x '\",\",2' 'touch %x.txt'
which creates the files ,.txt and 2.txt."} {"package": "3-1", "pacakge-description": "UNKNOWN"} {"package": "310", "pacakge-description": "No description available on PyPI."} {"package": "310-notebook", "pacakge-description": "310_notebook
A JupyterLab extension.
Requirements
JupyterLab >= 3.0
Install
To install the extension, execute:
pip install 310_notebook
Uninstall
To remove the extension, execute:
pip uninstall 310_notebook
Contributing
Development install
Note: You will need NodeJS to build the extension package.
The jlpm command is JupyterLab's pinned version of yarn that is installed with JupyterLab.
You may use yarn or npm in lieu of jlpm below.
# Clone the repo to your local environment
# Change directory to the 310_notebook directory
# Install package in development mode
pip install -e .
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension Typescript source after making changes
jlpm build
You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.
# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab
With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).
By default, the jlpm build command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:
jupyter lab build --minimize=False
Development uninstall
pip uninstall 310_notebook
In development mode, you will also need to remove the symlink created by the jupyter labextension develop command. To find its location, you can run jupyter labextension list to figure out where the labextensions folder is located. Then you can remove the symlink named 310_notebook within that folder.
Testing the extension
Frontend tests
This extension uses Jest for JavaScript code testing. To execute them, execute:
jlpm
jlpm test
Integration tests
This extension uses Playwright for the integration tests (aka user level tests). More precisely, the JupyterLab helper Galata is used to handle testing the extension in JupyterLab.
More information is provided within the ui-tests README.
Packaging the extension
See RELEASE"} {"package": "311devs_peewee", "pacakge-description": "..
image:: http://media.charlesleifer.com/blog/photos/p1423749536.32.png
peewee
======
This is just `peewee-2.10.2 ` with some changes we need:
* Simple LEFT JOIN LATERAL. No need to make a subquery, just join to the model.
.. code-block:: python
    # make some compound select query
    subq = ModelB.select(ModelB.id).where(ModelB.id > ModelA.id).limit(1)
    # make query lateral joining subquery
    ModelA.select(ModelA, subq.c.id).join(subq, join_type=JOIN.LATERAL)
* Add ``of`` argument to the for_update method
.. code-block:: python
    # Lock books of author with name == John
    Book.select().join(Author).where(Author.name == 'John').for_update(of=Book)"} {"package": "321", "pacakge-description": "No description available on PyPI."} {"package": "32blit", "pacakge-description": "32blit Tools
This toolset is intended for use with the 32Blit console to prepare assets and upload games.
Running
The 32Blit toolset contains subcommands for each tool; you can list them with: 32blit --help
image - Convert images/sprites for 32Blit
font - Convert fonts for 32Blit
map - Convert popular tilemap formats for 32Blit
raw - Convert raw/binary or csv data for 32Blit
pack - Pack a collection of assets for 32Blit
cmake - Generate CMake configuration for the asset packer
flash - Flash a binary or save games/files to 32Blit
metadata - Tag a 32Blit .blit file with metadata
relocs - Prepend relocations to a game binary
version - Print the current 32blit version
To run a tool, append its name after the 32blit command, e.g.: 32blit version
Tools
Metadata
Build metadata, and add it to a .blit file.
Flash
Flash and manage games on your 32Blit over USB serial.
Relocs
Collate a list of addresses that need to be patched to make a .blit file relocatable and position-independent.
Cmake
Generate CMake files for metadata information and/or asset pipeline inputs/outputs.
Assets
You will typically create assets using the \"asset pipeline\", configured using an assets.yml file which lists all the files you want to include, and how they should be named in code.
An assets.yml file might look like:
#
Define an output target for the asset builder
# in this case we want a CSource (and implicitly also a header file)
# type auto-detection will notice the \".cpp\" and act accordingly
assets.cpp:
  prefix: asset_
  # Include assets/sprites.png
  # and place it in a variable named \"asset_sprites\"
  # Since it ends in \".png\" the builder will run \"sprites_packed\" to convert our source file
  assets/sprites.png:
    name: sprites
    palette: assets/sprites.act
    strict: true # Fail if a colour does not exist in the palette
    transparent: 255,0,255

  # Include assets/level.tmx
  # and place it in a variable named \"asset_level_N_tmx\"
  # Since it ends in \".tmx\" the builder will run \"map_tiled\" to convert our source file
  assets/level*.tmx:
Fonts
Converts a ttf file or image file into a 32Blit font.
Supported formats:
Image .png, .gif
Font .ttf
Images
All image assets are handled by Pillow so most image formats will work; be careful with lossy formats since they may add unwanted colours to your palette and leave you with oversized assets.
Supported formats:
8bit PNG .png
24bit PNG .png
Options:
palette - Image or palette file (Adobe .act, Pro Motion NG .pal, GIMP .gpl) containing the asset colour palette
transparent - Transparent colour (if palette isn't an RGBA image), should be either hex (FFFFFF) or R,G,B (255,255,255)
packed - (Defaults to true) will pack the output asset into bits depending on the palette size.
A 16-colour palette would use 4-bits-per-pixel.
strict - Only allow colours that are present in the palette image/file
Maps/Levels
Supported formats:
Tiled .tmx - https://www.mapeditor.org/ (extremely alpha!)
Raw Binaries/Text Formats
Supported formats:
CSV .csv
Binary .bin, .raw
Changelog
0.7.3
Validate metadata image sizes - thanks @Daft-Freak
Support for standalone metadata output - thanks @Daft-Freak
Support for generating a source file with Pico (RP2040) binary info - thanks @Daft-Freak
Output slightly more optimised C output - thanks @Daft-Freak
0.7.2
Add reloc support for ITCM RAM - thanks @Daft-Freak
0.7.1
Added DFU tool
Filename now shown when flashing
Remove unnecessary reset-to-firmware code
Fixed GIMP palette handling
Support for more image font layouts (multiple row support)
Make freetype a soft dependency when processing font data
Misc small tweaks & tidyup
0.7.0
Reworked Tiled maps support (requires updated SDK) - thanks @Daft-Freak
Metadata CMake tool now escapes quotes - thanks @Daft-Freak
New project setup tool \"32blit setup\" for downloading/configuring a boilerplate project - thanks @Daft-Freak
Fixed asset builder throwing a cryptic error if no valid input files were found
0.6.1
Ensure the minimum required version of click is installed - thanks @LordEidi
Add separate launch command (in SDK v0.1.11 and later) and --launch flag to install - thanks @Daft-Freak
Support 16bit tile indexes - thanks @ThePythonator
Output URL/Category metadata for SDL builds - thanks @Daft-Freak
0.6.0
Significant code refactor and fixes by @Ali1234
Tools have been ported to Click
New 32blit install command that installs files/blits intelligently
0.5.0
Significant code refactor and fixes by @Ali1234
Metadata dump fixed to support RL images
Bugfix to incorrect transparent colour being selected
Configurable empty_tile for .tmx maps - specifies a tile ID for unset/empty tiles to be remapped to
Optional struct output type for .tmx maps with width, height, layer count and empty_tile
.tmx map layers are now sorted
Should not
break compatibility, but use 0.4.x if you don't need the new features
0.4.0
Breaks metadata compatibility with previous versions!
Add URL field for GitHub URL
Add Category field to categorise games/apps
Add file associations field to identify supported filetypes
0.3.2
Allow use of user-specified serial port if VID/PID are empty
Support handling multiple sets of options in CMake tool
0.3.1
Fixed \"32blit game.blit\" to save (to SD) instead of flash again
0.3.0
New: RLE Encoding support for spritesheets
Flasher: refined shorthands - \"32blit flash game.blit\" and \"32blit game.blit\" will flash a game.
Flasher: fixed a bug where it would reset an open connection and break during a flash
0.2.0
New: Version tool: 32blit version
Packer: Format support for wildcard asset names
0.1.4
New: migrated PIC relocs tool into tools
0.1.3
Packer: Fix asset path handling to be relative to working directory
0.1.2
Flasher: Add list/del commands
Packer: Fix bug where asset packer shared class instances and state
Metadata: Find images when building from a config not in working directory
Metadata: Require only one of --file or --config options
0.1.1
Export metadata config to CMake
Add support for packing metadata splash to icns format for macOS
0.1.0
Fix palettes to support 256 colours (count will be 0)
Parse metadata and relocations with Construct
Breaking: Packed image format has changed!
0.0.9
Add support for PIC reloc'd binaries with RELO header
Add string arg support for asset filename to cmake tool
0.0.8
Add autoreset from game to firmware when running flash save
Add flash info to determine if in game or firmware
Add metadata dependency output from cmake tool
Fix asset dependency output to include additional files like palette
Redirect errors to stderr
Quiet!
Use -vvvv for info, warnings, errors and debug information.
0.0.7
Add metadata tool - tags binary with metadata from a .yml file
Fix relative paths for packer palette files
Add support for subdirectories to 32blit flash save
0.0.6
Font tool (thanks @Daft-Freak)
Flash command with multi-target function (thanks @Daft-Freak)
Bugfixes to palette handling (thanks @Daft-Freak)
Bugfixes to package recognition (seemed to affect Python 3.8 on Windows)
Friendly (ish) error message when a .tmx tilemap with 0-index tiles is used (tmx is 1-indexed for valid tiles)
0.0.5
Output data length symbols (thanks @Daft-Freak)
Fix --packed to be default, again (packed can be disabled with --packed no)
Various other tweaks
Start of 32blit file upload support
0.0.4
Default images to packed (packed arg now takes a bool)
Fix bug in sprite payload size (thanks @Daft-Freak)
0.0.3
Fix packaging mishap so tool actually works
0.0.2
Real initial release
Pack, cmake and asset commands working
Very beta!
0.0.1
Initial Release"} {"package": "3.3.0", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3301-calc", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "33joshbasiccalculator33", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "33pdf", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "34pdf", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "360blockscope", "pacakge-description": "Block scoping in Python
Just to be clear, I made this as a joke.
import block_scoping as bs

def main():
    x = 7
    with bs():
        y = x + 4
        print(f'y: {y}')  # y: 11
        x -= 3
    print(f'x: {x}')  # x: 4
    print(f'y: {y}')  # UnboundLocalError

main()"} {"package": "360monitoringcli", "pacakge-description": "360 Monitoring CLI
This repository contains a CLI script for 360 Monitoring that allows you to connect to your 360 Monitoring (https://360monitoring.com) account and list monitoring data, add, update or remove server or website monitors.
Documentation
You can find the full documentation including the feature complete REST API at docs.360monitoring.com and docs.360monitoring.com/docs/api.
Preconditions
Make sure to have an account at https://360monitoring.com or https://platform360.io
360 Monitoring CLI requires a Python version of 3.* or above
Install 360 Monitoring CLI as ready-to-use package
$ pip install 360monitoringcli
Configure your account
First you need to connect your CLI to your existing 360 Monitoring account via your API KEY. If you don't have a 360 Monitoring account yet, please register for free at https://360monitoring.com.
To create an API KEY you'll need to upgrade to at least a Pro plan.
$ 360monitoring config save --api-key KEY     configure API KEY to connect to 360 Monitoring account
Test 360 Monitoring CLI locally
Test 360 Monitoring CLI with pre-configured Docker image
You can easily test and run 360 Monitoring CLI for production by running the pre-configured docker image
$ docker build -t 360monitoringcli .
$ docker run -it --rm 360monitoringcli /bin/bash
Test 360 Monitoring CLI for specific staging version
To test a package from staging you can simply deploy a docker container:
$ docker run -it --rm ubuntu /bin/bash
$ apt-get update && apt-get install -y python3 && apt-get install -y pip
$ pip install -i https://test.pypi.org/simple/ --force-reinstall -v \"360monitoringcli==1.0.19\"
For development, install required Python modules
To test the code locally, install the Python modules \"requests\", \"configparser\", \"argparse\" and \"prettytable\". Use \"pip install -e .\" to use the \"360monitoring\" command with the latest dev build locally based on local code.
$ pip install requests
$ pip install configparser
$ pip install argparse
$ pip install prettytable
$ pip install -e .
Run tests to check each function works
Test the code:
$ ./test_cli.sh
Usage
$ 360monitoring --help                display general help
$ 360monitoring signup                open the sign up page to get your 360 Monitoring account
$ 360monitoring config save --api-key KEY   configure API KEY to connect to 360 Monitoring account (only for paid plans)
$ 360monitoring statistics            display all assets of your account
$ 360monitoring servers list          display all monitored servers
$ 360monitoring servers list --issues display monitored servers with issues only
$ 360monitoring servers list --tag cpanel   display only servers with tag \"cpanel\"
$ 360monitoring sites list            display all monitored sites
$ 360monitoring sites list --issues   display monitored sites with issues only
$ 360monitoring sites list --sort 6 --limit 5
display worst 5 monitored sites by uptime
$ 360monitoring contacts list         display all contacts
$ 360monitoring usertokens list       display user tokens
$ 360monitoring config print          display your current settings and where those are stored
$ 360monitoring recommendations       display upgrade recommendations for servers that exceed their limits
$ 360monitoring magiclinks            create and open a readonly dashboard for a single server only via magic link
$ 360monitoring wptoolkit             display statistics of WP Toolkit if installed

$ 360monitoring sites add --url domain.tld    start monitoring a new website
$ 360monitoring servers update --name cpanel123.hoster.com --tag production    tag a specific server

$ 360monitoring contacts --help       display specific help for a sub command
$ 360monitoring dashboard             open 360 Monitoring in your Web Browser
Updating 360 Monitoring CLI package
You can update the 360monitoringcli package to the latest version using the following command:
$ pip install 360monitoringcli --upgrade"} {"package": "3636c788d0392f7e84453434eea18c59", "pacakge-description": "Still Confidential"} {"package": "365scores", "pacakge-description": "365scores
Description
Install
pip install 365scores
# or
pip3 install 365scores"} {"package": "36ban_commons", "pacakge-description": "UNKNOWN"} {"package": "36-chambers", "pacakge-description": "36-chambers is a Python library which adds common reversing methods and functions for Binary Ninja. The library is designed to be used within Binary Ninja in the Python console.
Installation
Use pip to install or upgrade 36-chambers:
$ pip install 36-chambers [--upgrade]
Quick Example
Find and print all blocks with a \"push\" instruction:
from chambers import Chamber
c = Chamber(bv=bv)
c.instructions('push')
Documentation
For more information you can find documentation on readthedocs."} {"package": "36ke.py", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "37austen", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "3890612457908641", "pacakge-description": "No description available on PyPI."} {"package": "392304jk324nkkl", "pacakge-description": "An example package. Generated with cookiecutter-pylibrary.
Free software: BSD 2-Clause License
Installation
pip install ex11
Documentation
To use the project:
import ex1
ex1.longest()
Development
To run all the tests run:
tox
Note, to combine the coverage data from all the tox environments run:
Windows
set PYTEST_ADDOPTS=--cov-append
tox
Other
PYTEST_ADDOPTS=--cov-append tox
Changelog
0.0.0 (2019-04-04)
First release on PyPI."} {"package": "3a-python-package-ligia", "pacakge-description": "No description available on PyPI."} {"package": "3b-bot", "pacakge-description": "Best Buy Bullet Bot (3B Bot)
Best Buy Bullet Bot, abbreviated to 3B Bot, is a stock checking bot with auto-checkout created to instantly purchase out-of-stock items on Best Buy once restocked. It was designed for speed with ultra-fast auto-checkout, as well as the ability to utilize all cores of your CPU with multiprocessing for optimal performance.
Headless item stock tracking
Multiprocessing and multithreading for best possible performance
One-time login on startup
Ultra-fast auto-checkout
Encrypted local credentials storage
Super easy setup and usage
Bear in mind that 3B Bot is currently not equipped to handle a queue and/or email verification during the checkout process. If either of these is present, the bot will wait for you to take over and will take control again once you are back on the traditional checkout track.
Prerequisites
A Best Buy account with your location and payment information already set in advance.
The only information the bot will fill out during checkout is your login credentials (email and password) and the CVV of the card used when setting up your payment information on Best Buy (PayPal is currently not supported).
All other information that may be required during checkout must be filled out beforehand.
Python 3.6 or newer
3B Bot is written in Python, so if it is not already installed on your computer please install it from https://www.python.org/downloads/.
On Windows make sure to tick the “Add Python to PATH” checkbox during the installation process. On MacOS this is done automatically.
Once installed, checking your Python version can be done with the following.
For MacOS: python3 --version
For Windows: python --version
If your version is less than 3.6 or you get the message python is not recognized as an internal or external command then install Python from the link above.
A supported browser
3B Bot currently only supports Chrome and Firefox. We recommend using the Firefox browser for its superior performance during tracking.
Installation
Installing 3B Bot is as simple as running the following in your shell (Command Prompt for Windows and Terminal for MacOS).
For MacOS: python3 -m pip install --upgrade 3b-bot
For Windows: pip install --upgrade 3b-bot
Usage
To start the bot just enter the following in your shell: 3b-bot
For more usage information check out our documentation.
How does it work?
This is what 3B Bot does step by step at a high level:
Get currently set URLs to track or prompt if none are set.
Using the requests library validate all URLs and get item names.
Open up a Google Chrome browser with selenium and perform the following.
a. Navigate to the login page.
b. If we have logged in previously we can use the saved cookies from the previous session to skip the log-in process. If not, automatically fill out the username and password fields to log in.
c. Make a get request to the Best Buy API to confirm that there are no items in the cart.
d. If this is the first time using the bot, check that a mailing address and payment information have been set.
e. Go to each URL and collect the page cookies.
This is done so that during checkout we can just apply the cookies for that URL instead of going through the entire login process.Assign each URL to a core on the CPU.Each core will start a specified number of threads.Each thread will repeatedly check whether the \"add to cart button\" is available for its item.When a thread notices that an item has come back in stock, it will unlock its parent core and lock all other threads on every core to conserve CPU resources and Wi-Fi.The unlocked parent will print to the terminal that the item has come back in stock, play a sound, and attempt to automatically check out the item with the following steps.a. With the driver that was used to track the item, click the add-to-cart button.b. Open up another browser window (this one is visible) and navigate to the item URL to set some cookies to log in.c. Redirect to the checkout page.d. Enter the CVV for the card.e. Click \"place order\".Once finished, the parent will update its funds and the item quantity, and unlock all threads to resume stock tracking.Sound will stop playing when the item is no longer in stock.Performance tipsThe following are tips to achieve the best possible performance with 3B Bot.Use the same number of URLs as cores on your CPU. You can create a URL group with the same URL repeated multiple times to increase the number of URLs you have, and 3b-bot count-cores can be used to see how many cores your CPU has.Use Ethernet as opposed to Wi-Fi for a stronger, more stable connection.Adequately cool your computer to prevent thermal throttling.Tweak the number of threads per URL.
This can be changed with the 3b-bot set-threads command.If you plan to complete the checkout process yourself, disable auto-checkout in the settings for a significant performance improvement.Overall, item stock tracking is a CPU- and internet-bound task, so at the end of the day the better your CPU and the stronger your internet, the faster your tracking."} {"package": "3color-Press", "pacakge-description": "About======3color Press is a Flask-based application intended to streamline making your own comic-based website.It is a static website generator that takes Markdown-formatted text files and turns them into new pages.I am new to programming and I'm kind of brute-force learning Python and Flask with this project.The project is under heavy development and features are being added as we work on them; however, a very functional core set of features is included. For more in-depth information on how to use it, check the doc pages. You can see a demo site generated with version 0.1 of this tool at http://3color.noties.org Features* automatic handling of book pages, news pages and single pages* easily add a page to the main menu* easily add custom single pages* News page to collect news feed page* Support for showing a thumbnail of the most recent comic in the desired story line on every page* command line tools for easy managementIn Progress Features* custom theming support* toggle-able theme elements* improvement on handling in-site menus* admin interface* better error checking* much more?!Installation-------------The package is available on PyPI::$ pip install 3color-Presssee :doc:`install`Contribute----------If you're interested in contributing or checking out the source code you can take a look at:* Issue Tracker: https://github.com/chipperdoodles/3color/issues* Source Code: https://github.com/chipperdoodles/3colorSupport-------If you're having problems or have some questions, feel free to check out the GitHub page: https://github.com/chipperdoodles/3colorLicense--------3color-Press is (c) Martin
Knobel and contributors and is licensed under a BSD license; see :doc:`license`"} {"package": "3d", "pacakge-description": "Welcome to 3D - the package for Python game design3D is a project designed to make programming 3D games as easy as possible.3D gives developers an easy way to build great games and 3D renders using nothing but Python.Note: 3D is imported using import g, not import 3d!"} {"package": "3d_bin_container_packing", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3d-connectX-env", "pacakge-description": "3d-connectX-env3D connectX repository, developed for the OpenAI Gym format.InstallationThe preferred installation of 3d-connectX-env is from pip: pip install 3d-connectX-envUsagePython: import gym_3d_connectX import gym env = gym.make('3d-connectX-v0') env.reset() env.utils.win_reward = 100 env.utils.draw_penalty = 50 env.utils.lose_penalty = 100 env.utils.could_locate_reward = 10 env.utils.couldnt_locate_penalty = 10 env.utils.time_penalty = 1 env.player = 1 actions = [0,0,1,1,2,2,4,4,0,0,1,1,2,2,0,3] for action in actions: obs, reward, done, info = env.step(action) env.render(mode=\"plot\")EnvironmentsThe environments only send reward-able game-play frames to agents;\nno cut-scenes, loading screens, etc.
are sent to\nan agent, nor can an agent perform actions during these instances.Environment:3d-connectX-v0Factors at initialization.KeyTypeDescriptionnum_grid int Length of a side.num_win_seq int The length of the sequence necessary for winning.win_reward float The reward the agent gets when it wins the game.draw_penalty float The penalty the agent gets when it draws the game.lose_penalty float The penalty the agent gets when it loses the game.couldnt_locate_penalty float The penalty the agent gets when it chooses a location where the stone cannot be placed.could_locate_reward float The additional reward for the agent being able to place the stone.time_penalty float The penalty the agent gets as timesteps elapse.first_player int Defines which is the first player.StepInfo about the rewards and info returned by the step method.KeyTypeDescriptionturn int The number of the player at this step.winner int Value of the player on the winning side.is_couldnt_locate bool Whether, in this step, the player chose a location where the stone could not be placed."} {"package": "3d-converter", "pacakge-description": "Python 3D Models ConverterA module which helps convert different 3D formatsVersion: 0.9.0"} {"package": "3DCORE", "pacakge-description": "3DCORE3D Coronal Rope Ejection modelling technique for coronal mass ejection (CME) flux ropes.InstallationInstall the latest version manually using git and pip:git clone https://github.com/ajefweiss/py3DCORE\ncd 3DCORE\npip install ."} {"package": "3debt", "pacakge-description": "# 3debt\nTry to figure out what dependencies you\u2019re missing to start upgrading your project to Python 3## Installation\npip install -g 3debt## Usage\n3debt requirements.txt"} {"package": "3DeeCellTracker", "pacakge-description": "3DeeCellTracker3DeeCellTracker is a deep-learning-based pipeline for tracking cells in 3D time-lapse images of deforming/moving organs (eLife, 2021).Updates:3DeeCellTracker v1.0.0 has been released. Fixed some bugs in v0.5.InstallationTo install 3DeeCellTracker, please follow the instructions below:Note: We have tested the
installation and the tracking programs in two environments:(Local) Ubuntu 20.04; NVIDIA GeForce RTX 3080Ti; Tensorflow 2.5.0(Google Colab) Tensorflow 2.12.0 (You need to upload your data for tracking)PrerequisitesA computer with an NVIDIA GPU that supports CUDA.Anaconda or Miniconda installed.TensorFlow 2.x installed.StepsCreate a new conda environment and activate it by running the following commands in your terminal:$ conda create -n track python=3.8 pip$ conda activate trackInstall TensorFlow 2.x by following the instructions provided in the TensorFlow installation guide.Install the 3DeeCellTracker package by running the following command in your terminal:$ pip install 3DeeCellTracker==1.0.0Once the installation is complete, you can start using 3DeeCellTracker for your 3D cell tracking tasks within the Jupyter notebooks provided in the GitHub repository.If you encounter any issues or have any questions, please refer to the project's documentation or raise an issue in the GitHub repository.Quick StartTo learn how to track cells using 3DeeCellTracker, please refer to the following notebooks for examples. We recommend using StarDist for segmentation, as we have optimized the StarDist-based tracking programs for quicker and more convenient cell tracking.
Users can also use the old way with 3D U-Net.Train a custom deep neural network for segmenting cells in new optical conditions:Train 3D StarDist (notebook with results)Train 3D U-Net (clear notebook)Train 3D U-Net (results)Track cells in deforming organs:Single mode + StarDist (notebook with results)Single mode + UNet (clear notebook)Single mode + UNet (results)Track cells in freely moving animals:Ensemble mode + StarDist (notebook with results)Ensemble mode + UNet (clear notebook)Ensemble mode + UNet (results)(Optional) Train FFN with custom data:Use coordinates in a .csv file (notebook with results)Use manually corrected segmentation saved as label images (notebook with results)The data and model files for demonstrating the above notebooks can be downloaded here:Data for StarDist-based notebooks.Data for UNet-based notebooks.Frequently Reported Issue and Solution (for v0.4)Multiple users have reported encountering a ValueError of shape mismatch when running the tracker.match() function.\nAfter investigation, it was found that the issue resulted from an incorrect setting of siz_xyz,\nwhich should be set to the dimensions of the 3D image as (height, width, depth).Video Tutorials (for v0.4)We have made tutorials explaining how to use our software. See links below (videos on YouTube):Tutorial 1: Install 3DeeCellTracker and train the 3D U-NetTutorial 2: Tracking cells by 3DeeCellTrackerTutorial 3: Annotate cells for training 3D U-NetTutorial 4: Manually correct the cell segmentationA Text Tutorial (for v0.4)We have written a tutorial explaining how to install and use 3DeeCellTracker. See Bio-protocol, 2022.How it worksWe designed this pipeline for segmenting and tracking cells in 3D + T images of deforming organs. The methods have been explained in Wen et al. bioRxiv 2018 and in Wen et al. eLife, 2021.\nThe original programs used in eLife 2021 were contained in the \"Deprecated_programs\" folder.Overall procedures of our method(Wen et al.
eLife, 2021\u2013Figure 1)Examples of tracking results(Wen et al. eLife, 2021\u2013Videos)Neurons in a \u2018straightened\u2019 freely moving wormCardiac cells in a zebrafish larvaCells in a 3D tumor spheroidCitationIf you used this package in your research, please cite our paper:Chentao Wen, Takuya Miura, Venkatakaushik Voleti, Kazushi Yamaguchi, Motosuke Tsutsumi, Kei Yamamoto, Kohei Otomo, Yukako Fujie, Takayuki Teramoto, Takeshi Ishihara, Kazuhiro Aoki, Tomomi Nemoto, Elizabeth MC Hillman, Koutarou D Kimura (2021) 3DeeCellTracker, a deep learning-based pipeline for segmenting and tracking cells in 3D time-lapse images. eLife 10:e59187Depending on the segmentation method you used (StarDist3D or U-Net3D), you may also cite either of the\nfollowing papers:Martin Weigert, Uwe Schmidt, Robert Haase, Ko Sugawara, and Gene Myers.\nStar-convex Polyhedra for 3D Object Detection and Segmentation in Microscopy.\nThe IEEE Winter Conference on Applications of Computer Vision (WACV), Snowmass Village, Colorado, March 2020. \u00c7i\u00e7ek, \u00d6., Abdulkadir, A., Lienkamp, S.S., Brox, T., Ronneberger, O. (2016). 3D U-Net: Learning Dense Volumetric Segmentation from Sparse Annotation. In: Ourselin, S., Joskowicz, L., Sabuncu, M., Unal, G., Wells, W. (eds) Medical Image Computing and Computer-Assisted Intervention \u2013 MICCAI 2016. MICCAI 2016. Lecture Notes in Computer Science, vol 9901. Springer, Cham.AcknowledgementsWe wish to thank JetBrains for supporting this project\nwith a free open-source PyCharm license.
It offers the users a quick, easy-to-use interface to load their forest plots and generate tree metrics with just a few clicks.Getting StartedBe sure to check the Documentation, which features detailed explanations on how the program works and a User Manual.Also, the Tutorial covers the basics of 3DFin and is a great tool to get started.Download3DFin is freely available in 4 ways:As a CloudCompare plugin (Windows and Linux)As a QGIS pluginAs a standalone program (only on Windows)As a Python package (on Windows, Linux and macOS)1. CloudCompare plugin3DFin is available on Windows as a plugin in CloudCompare (2.13) thanks to CloudCompare PythonRuntime (see References). You can download the latest version of CloudCompare (Windows installer version) including the 3DFin plugin here: CloudCompare. Simply install the latest version of CloudCompare and tick the Python and 3DFin checkboxes during the installation:To install the 3DFin plugin, tick the 'Python plugin support' checkbox during CloudCompare installation.For Linux, the plugin is embedded into the CloudCompare flatpak.3DFin plugin in CloudCompare.Running the plugin will open 3DFin's graphical user interface (GUI).3DFin GUI. It is common to any version of 3DFin.2. QGIS plugin3DFin is also available as a plugin in QGIS. Please follow the instructions available here in order to test it.\nNote that for now this does not provide much added value in comparison with the CloudCompare and standalone versions of 3DFin.3. Standalone program3DFin is also available on Windows as a standalone program, which can be downloaded from here: Standalone.The 3DFin standalone does not require a CloudCompare installation and provides the fastest computation times.Older versions of the 3DFin standalone may also be downloaded from Releases. From there, simply navigate to the desired version and click on 3DFin.exe.4.
Python package (3DFin)3DFin and its dependencies may be installed and launched in any OS (Windows, Linux and macOS) as a Python package:pip install 3DFinpython -m three_d_finIf you are a macOS or Linux user and want to try 3DFin, this is the way you should proceed.pip will also install a script entry point in your Python installation's bin|script directory, so alternatively you can launch 3DFin from the command line with:3DFin[.exe] macOS users may need to install and use an OpenMP-capable compiler, such as GCC from Homebrew, in order to install the dependencies.UsageThe CloudCompare plugin is the recommended way of using 3DFin, as it provides enhanced features for visualisation of the results and exporting of the outputs (it allows exporting the results as a CloudCompare native BIN file).By default, running 3DFin (either the CloudCompare plugin or any version of 3DFin) will open a GUI window.For batch processing you can use the CLI capabilities of 3DFin; running the following command: 3DFin[.exe] cli --help will give you an overview of the available parameters.Citing 3DFinAs of now, the best way to cite 3DFin is by referring to the original paper describing the algorithm behind it:Cabo, C., Ord\u00f3\u00f1ez, C., L\u00f3pez-S\u00e1nchez, C. A., & Armesto, J. (2018). Automatic dendrometry: Tree detection, tree height and diameter estimation using terrestrial laser scanning. International Journal of Applied Earth Observation and Geoinformation, 69, 164\u2013174. https://doi.org/10.1016/j.jag.2018.01.011Or directly citing the repository itself:3DFin: 3D Forest Inventory.
3DFin, https://github.com/3DFin/3DFin.We are currently working on a scientific article about 3DFin, which may be published in 2023.ReferencesCloudCompare-PythonRuntime, by Thomas Montaigu: CloudCompare-PythonRuntime.Acknowledgement3DFin has been developed at the Centre of Wildfire Research of Swansea University (UK) in collaboration with the Research Institute of Biodiversity (CSIC, Spain) and the Department of Mining Exploitation of the University of Oviedo (Spain).Funding provided by the UK NERC project (NE/T001194/1): 'Advancing 3D Fuel Mapping for Wildfire Behaviour and Risk Mitigation Modelling' and by the Spanish Knowledge Generation project (PID2021-126790NB-I00): \u2018Advancing carbon emission estimations from wildfires applying artificial intelligence to 3D terrestrial point clouds\u2019."} {"package": "3Dfunctiongrapher", "pacakge-description": "No description available on PyPI."} {"package": "3Di-cmd-client", "pacakge-description": "The 3Di command line clientThe 3Di command line client allows for: Defining and running 3Di scenarios from the command line.Assembling different scenarios as a \"suite\" that will be run in batch.Management commands, for instance to list currently running simulations.There are three main entry points for the 3Di command line client.Scenario command$ scenario --help\n\nUsage: scenario [OPTIONS] COMMAND [ARGS]...\n\nOptions:\n --endpoint [localhost|staging|production]\n The endpoint where commands are run against\n --help Show this message and exit.\n\nCommands:\n auth Provide authentication details\n models List available threedimodels\n organisations List available organisations\n results Download results of a simulation\n run Run a given scenario\n scenarios List local scenarios\n settings Set default settings\n simulations List simulationsSuite command$ suite --help\n\nUsage: suite [OPTIONS]\n\n run suite a given suite\n\nOptions:\n --suite PATH path to the suite file [required]\n --help Show this message and exit.Active
simulations command$ active_simulations --help\n\nUsage: active_simulations [OPTIONS]\n\n Show currently running simulations\n\nOptions:\n --endpoint [localhost|staging|production]\n The endpoint where commands are run against\n --help Show this message and exit.Dependencies: python >= 3.8Installation: pip install --user 3Di-cmd-clientHistory0.0.3 (2020-12-21)Fixed settings context if config file is not yet available.0.0.1b (2020-12-18)First (beta) PyPI release."} {"package": "3DJCG-3Dvisual-question-answering", "pacakge-description": "3DJCG-3Dvisual-question-answering"} {"package": "3D-MCMP-MRT-LBM", "pacakge-description": "No description available on PyPI."} {"package": "3dof-hexapod-ik-generator", "pacakge-description": "This is a simple Python package for hexapod IK calculations.Commands:To be imported as ikengine.class IKEngine # initialises the class object. Takes 4 arguments in mm - coxaLength, femurLength, tibiaLength and bodySideLength. Optionally takes a 5th argument that can be either a list or a tuple. Please pass the servos that need to be reversed into this tuple/list. They will be reversed (angle = 180 - angle) for the whole runtime of your program that utilises this library.shift_lean(posX, posY, posZ, rotX, rotY, rotZ) # returns an array of 18 servo angles that are calculated using IK from the given variables that correspond to the translation and tilt of the body of the hexapod. The order goes from tibia to coxa, from left to right and then from front to back.Any questions or suggestions?
Please feel free to contact me at macaquedev@gmail.com"} {"package": "3d-paws", "pacakge-description": "All necessary packages and scripts to run the various sensors of a 3D-PAWS station."} {"package": "3d-printer", "pacakge-description": "welcome to my package"} {"package": "3D-registration", "pacakge-description": "Registration toolsBefore everything else, the current status of the whole thing here is that it only works on UNIX systems (e.g. Linux & MacOS) that have reasonable chips (e.g. not M1 chips).Purpose and \"history\"This repository is about two scripts to do spatial and temporal registration of 3D microscopy images.It was initially developed to help friends with their ever-moving embryos living under a microscope.I found that actually quite a few people were interested, so I made a version of it that is somewhat easier to use.In theory, the main difficulty in making the whole thing work is to install the different libraries.CreditsThe whole thing is just a wrapping of the amazing blockmatching algorithm developed by S. Ourselin et al. and currently maintained by Gr\u00e9goire Malandin et al. @ Team Morpheme - Inria (if I am not mistaken).Installationconda and pip are required to install registration-tools.We recommend installing the registration tools in a specific environment (like conda). For example the following way:conda create -n registration python=3.10You can then activate the environment the following way:conda activate registrationFrom here onward we assume that you are running the commands from the registration conda environment.Then, to install the whole thing, it is necessary to first install blockmatching.
To do so you can run the following command:conda install vt -c morpheme -c trcabelThen, you can install the 3D-registration library either directly via pip:pip install 3D-registrationOr, if you want the latest version, by specifying the git repository:pip install git+https://github.com/GuignardLab/registration-tools.gitTroubleshootingWindows:If you are trying to run the script on Windows you might need to install pthreadvse2.dll. It can be found here: https://www.pconlife.com/viewfileinfo/pthreadvse2-dll/. Make sure to download the version that matches your operating system (32 or 64 bits, most likely 64).MacOS:As there are no M1 binaries available yet, please use Rosetta or install an Intel version of conda.UsageMost of the description of how to use the two scripts is in the manual (note that the installation part is quite outdated; the rest is OK).That being said, once installed, one can run either of the scripts from anywhere in a terminal by typing:time-registration or spatial-registrationYou will be prompted for the location of the JSON files or of a folder containing the JSON files, and when provided the registration will start.It is also possible to run the registration from a script/notebook the following way:from registrationtools import TimeRegistration tr = TimeRegistration('path/to/param.json') tr.run_trsf()orfrom registrationtools import TimeRegistration tr = TimeRegistration('path/to/folder/with/jsonfiles/') tr.run_trsf()orfrom registrationtools import TimeRegistration tr = TimeRegistration() tr.run_trsf()and you will be asked to input a path.Example json filesA few example JSON files are provided to help potential users. You can find information about what they do in the manual."} {"package": "3d-renderer", "pacakge-description": "3D RendererA Python module that renders 3D objects as pygame surfaces."} {"package": "3dRenderPy", "pacakge-description": "3DRenderPyThis is an implementation of a ray tracer based on Jamis Buck's The Ray Tracer Challenge.
It supports several primitives: Sphere, Plane, Cube, Cylinder, Cone, Group, Triangle and Smooth Triangle, CSG.Installation:pip install 3DRenderPy"} {"package": "3d-scan-xxp", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3dstrudel", "pacakge-description": "3dstrudelRequirements: biopython, mrcfile, mpi4py, psutil, scipyInstallation.. code:: bash\npip install numpy"} {"package": "3dtrees-nbingo", "pacakge-description": "3DPhyloTreesWelcome!OverviewThe purpose of this Python package is to create 3D phylogenetic trees with two axes of variation given suitable\ndata in the common AnnData format. The differentiating factor\nbetween this package and functions like scipy.cluster.hierarchy.dendrogram() and sklearn.cluster.AgglomerativeClustering is that the dendrogram produced tracks the splitting/merging patterns\nof groups of taxa and individual taxa. Specifically, as used in the bioRxiv paper Cerebellar nuclei evolved by\nrepeatedly duplicating a conserved cell type set, the\nphylogenetic tree created by this package tracks the merging of different subnuclei of the cerebellar nuclei while\nalso tracking the merging of individual cell types within those nuclei.An example of such a dendrogram is provided.A flattened version can be found in Fig. S22C and Fig.
S23H of the linked paper above.This package is composed of three main parts: agglomerate, data and metrics.The agglomerate package exposes methods to perform the\nagglomeration of a single phylogenetic tree given suitable data and hyperparameters, and a method to perform batch\nagglomeration over a range of hyperparameters and select the best tree according to any of the following metrics:Balanced Minimum Evolution (preferred)Minimum EvolutionMaximum ParsimonyThe data package exposes a data_loader that the user can define to\nimport their data accordingly (possibly from multiple folders or online repositories) into an AnnData object.\nThe data_types are used internally by the\nagglomeration algorithm.Finally, the metrics package currently only provides the\nSpearman correlation coefficient to measure the distance between two data points; however, any distance metric in the\nsame form as the example provided may be added and used in the agglomeration program.InstallationThis package requires Python version 3.7 or greater, and the requirements provided in Pipfile and Pipfile.lock. Using pip, installation is as easy as\nrunning:pip install 3dtrees-nbingoQuestionsIf you have any questions about how to use this code or how it was used in Cerebellar nuclei evolved by\nrepeatedly duplicating a conserved cell type set, then\nplease feel free to email me at nomir@stanford.edu. Example usage of this package can be found in the cn_evolution repository, which is the analysis code used to produce the\nfigures in the linked paper."} {"package": "3DVG", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "3d-video-converter", "pacakge-description": "3D Video ConverterA simple FFmpeg-based script for converting either two separate stereo videos or an existing 3D video into a wide range of 3D video formats.InstallationInstall 3d-video-converter from PyPI:pip install 3d-video-converterOr from the source on GitHub:pip install \"3d-video-converter @ git+https://github.com/evoth/3d-video-converter\"The package will be installed with the module name video_converter_3d.Install FFmpegThis package depends on ffmpeg-python, which means that FFmpeg must be installed and accessible via the $PATH environment variable. Please follow the appropriate installation instructions for your platform.To check if FFmpeg is installed, run the ffmpeg command from the terminal. If it is installed correctly, you should see version and build information.Usage examplesConvert a full-width parallel view video to full color red/cyan anaglyph:from video_converter_3d import convert_3d convert_3d(\"video_parallel.mp4\",\"sbsl\",\"video_anaglyph.mp4\",\"arcc\")Combine two separate stereo videos into a full-width parallel view video, only keeping audio from the left video:from video_converter_3d import convert_2d_to_3d convert_2d_to_3d(\"video_left.mp4\",\"video_right.mp4\",True,False,\"video_parallel.mp4\",\"sbsl\")"} {"package": "3DVision", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3d-wallet-generator", "pacakge-description": "This project helps you design and export 3D-printable wallets, similar to paper wallets (but they won\u2019t die in a flood).Everyone who\u2019s seriously serious about bitcoin has tried paper wallet\ngenerators. While the idea is great, paper isn\u2019t a great medium out of\nwhich to make something that stores significant value.
With this in\nmind, we set out to make simple, easy-to-use software that can design\nand export 3D-printable wallets, with a variety of configuration\noptions.DependenciesPython3: this project is designed for Python3, not Python2PyBitcoin, sudo pip3 install bitcoin (no manual installation required)PyQRCode, sudo pip3 install pyqrcode (no manual installation required)OpenSCAD 2015 (or higher), just install from their website, and the\nprogram should find it automatically (submit an issue if it doesn\u2019t) - (manual installation required)FeaturesSupports a variety of configuration and size optionsExports wallets as STLExports keys as a CSV file for import into other software (for big\nbatches)Set the configuration and let it generate millions of random wallets for youSupport for other cryptocurrencies, including:BitcoinLitecoinDogecoinAny other currency (as long as you know the version bit for address generation)InstructionsInstall pipWindows: download from their websiteMac: install from MacPorts or BrewLinux (Ubuntu/Debian): sudo apt-get install python3-pipInstall OpenSCADDownload from their websiteMake sure you are running their newest version (or at least OpenSCAD 2015)Contact us if you need help.Install our packageTry: sudo pip3 install 3d-wallet-generatorIf it continues to fail, shoot us an email and we\u2019ll try to help.Use our packageRun 3dwallet -h to see your optionsTry the default settings by running 3dwallet - it will output five wallets, with the default settings, into a folder in your current directory.Play with the other settings and decide how your printer, CNC, etc. likes the different styles.Film it or take a picture, and give it to us!
We\u2019ll add it to our collection!We recommend you run the Linux version off of a LiveUSB for maximum\nsecurity (just as you would with a normal paper wallet).MiscellaneousIf you have any comments, questions, or feature requests, either\nsubmit an issue or contact us at btcspry@bitforwarder.comWe always accept donations at 1MF7hKShzq2iSV9ZZ9hEx6ATnHQpFtM7cF!Please donate, this project\ntook a bunch of effort and we want to make sure it was worth it.To Do / Features Coming SoonAdd picturesAdd option to import your own addresses/private keysOffset the white in the QR code (instead of just offsetting the\nblack)If you want any of these developed faster, send us a gift at our donation address above."} {"package": "3Edit", "pacakge-description": "rendererA simple 3-D rendering engine."} {"package": "3ETool", "pacakge-description": "3ETool3ETool contains some useful tools developed by the SERG research group of the University of Florence for performing exergo-economic and exergo-environmental analysis. The user manual can be downloaded here.
Moreover, some YouTube tutorials have been uploaded in order to help the user in compiling the Excel file.1 - Calculation process \ud83e\udd14\u2699The beta version can be downloaded using PIP: pip install 3EToolOnce the installation has been completed, the user can import the tool and paste to a desired location the user manual, the components documentation or the default Excel file, as in the MATLAB version of the app.import EEETools EEETools.paste_user_manual() EEETools.paste_components_documentation() EEETools.paste_default_excel_file()Finally, once the Excel file has been compiled, the calculation can be initialized through this command:import EEETools EEETools.calculate()Calculation options and a user-defined Excel path can be passed to the function as well (default values are True); in case the user does not pass the path, the app will automatically open a file dialog window so that it can be selected manually:import EEETools EEETools.calculate(excel_path=\"your_excel_file.xlsx\",calculate_on_pf_diagram=True,loss_cost_is_zero=True,valve_is_dissipative=True,condenser_is_dissipative=True)2 - Debugging tools \ud83d\udc68\u200d\ud83d\udcbb\ud83d\udd0dThe Excel file can be debugged using some specific tools that can be launched using the following command (please select the\nExcel file that you want to debug on program request):import EEETools EEETools.launch_connection_debug()Another possible way of debugging the code is to ask the program to export the debug information to the Excel file:import EEETools EEETools.export_debug_information()Finally, the topology can be displayed using:import EEETools EEETools.launch_network_display()3 - Sankey Diagrams \ud83d\udcc8\ud83d\udccaA Sankey diagram can be plotted using the following command:import EEETools EEETools.plot_sankey(generate_on_pf_diagram=True,display_costs=True,)generate_on_pf_diagram can be omitted and is True by default:if False the connections are defined according to the physical topology of the plant; if True the connections are based on
the product-fuel definition.display_costs can be omitted and is False by default:if False the thickness of the connections in the Sankey diagram is proportional to the exergy flux between\nthe components (in kW); if True the thickness of the connections in the Sankey diagram is proportional to the economic (or environmental) flux between the components (in \u20ac/s or in Pts/s). In addition, for each\nconnection, the color intensity is proportional to the relative cost of the stream (in \u20ac/kJ or in Pts/kJ).4 - Code Structure \ud83d\udcc1The application code is divided into 3 main folders:The MainModules directory contains base modules such as the Block, Connection, ArrayHandler and Drawer classes.Block Subclasses contains a Block subclass for each component type (e.g. expander, compressor, etc.)Tools contains different APIs needed for the program to run (e.g. the cost correlation handler,\nthe EES code generator, and the importer and exporter for both Excel and XML files)5 - Important Information \u26a0-------------------------- !!! THIS IS A BETA VERSION !!! --------------------------please report any bugs or problems in the installation to pietro.ungar@unifi.it. For further information visit: https://tinyurl.com/SERG-3ETool-------------------------------- !!! HOW TO CITE !!! --------------------------------The following reference can be used to cite the tool in publications:Fiaschi, D., Manfrida, G., Ungar, P., Talluri, L. \n\nDevelopment of an exergo-economic and exergo-environmental tool for power plant assessment: \nevaluation of a geothermal case study.\n\nhttps://doi.org/10.52202/062738-0003"} {"package": "3flatline-cli", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "3games", "pacakge-description": "Effective-octoThis is a free, simple and educative project based on python.github url: https://github.com/2028-design/effective-octoLicense: MIT"} {"package": "3gpp-citations", "pacakge-description": "3GPP Bibtex entry generatorThis project aims to generate BiBTeX files that\ncan be used when citing 3GPP specifications. The input is a document list exported from the 3GPP Portal.Installationpip install 3gpp-citationsTo also install test dependencies run pip install 3gpp-citations[test]InstructionsGo to the 3GPP Portal. Generate the list of specifications you want. Download to Excel and save the file. Run python 3gpp-citations.py -i exported.xlsx -o 3gpp.bib. Use in LaTeX. Optionally use the provided 3gpp.bib directly.Things to noteThe output bibtex class is set to @techreport.If you add the option --xelatex, break-symbols \\- will be used in url-fields.The version and date are read from 3gpp.org, but it is slow so it takes a while to parse the list. If you find an easy solution to this, let me know.Example output@techreport{3gpp.36.331,\n author = {3GPP},\n day = {20},\n institution = {{3rd Generation Partnership Project (3GPP)}},\n month = {04},\n note = {Version 14.2.2},\n number = {36.331},\n title = {{Evolved Universal Terrestrial Radio Access (E-UTRA); Radio Resource Control (RRC); Protocol specification}},\n type = {Technical Specification (TS)},\n url = {https://portal.3gpp.org/desktopmodules/Specifications/SpecificationDetails.aspx?specificationId=2440},\n year = {2017}\n}ContributeSee our contribution guidelines and our Code of Conduct.AcknowledgmentThis project has been updated as part of the WASP Software and Cloud Technology course.This work was partially supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation."} {"package": "3lc", "pacakge-description": "Placeholder for tlc package"} {"package": "3lwg", "pacakge-description": "UNKNOWN"} {"package": 
"3mensolutions-distribution", "pacakge-description": "No description available on PyPI."} {"package": "3mtools", "pacakge-description": "A package to perform many useful tools"} {"package": "3mystic_cloud_client", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3mystic_common", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3p0", "pacakge-description": "No description available on PyPI."} {"package": "3ptest", "pacakge-description": "This is the README! You're welcome!"} {"package": "3-py", "pacakge-description": "3.gitignore-aware tree tool written in Python.Example:3Output:Installpip3 install 3-pyCompatibilityIf you are on Windows, installcoloramain order to watch some colors."} {"package": "3-python-package-exercise-17", "pacakge-description": "Python Package ExerciseA little exercise to create a Python package, build it, test it, distribute it, and use it. Seeinstructionsfor details."} {"package": "3q", "pacakge-description": "No description available on PyPI."} {"package": "3scale-api", "pacakge-description": "3scale REST API client in Python3Scale REST API client in a wrapper over the 3scale API.InstallingInstall and update using pip:pipinstall3scale-apiOr as a dependency using the pipenvpipenvinstall3scale-apiUsageClient supports basic CRUD operations and it using the official 3scale API.The API can be found at/p/admin/api_docsBasic usage of the client:fromthreescale_apiimportThreeScaleClient,resourcesfromtypingimportListclient=ThreeScaleClient(url=\"myaccount.3scale.net\",token=\"secret_token\",ssl_verify=True)# Get list of APIs/Services or any other resourceservices:List[resources.Service]=client.services.list()# Get service by it's nametest_service:resources.Service=client.services[\"test_service\"]# or use: client.services.read_by_name(system_name)# Get service by it's idtest_service:resources.Service=client.services[12345]# or use client.services.read(id)# To get raw 
JSON response - you can use the fetch method - it takes the service idraw_json:dict=client.services.fetch(12345)# To create a new service (or any other resource), parameters are the same as you would provide by the documentationnew_service:resources.Service=client.services.create(system_name='my_testing_service',name=\"My Testing service\")# In order to update service you can eitherclient.services[123456].update(param=\"new_value\")# orservice:resources.Service=client.services[123456]service['param']='new_value'service.update()# To get a proxy config you can useproxy:resources.Proxy=client.services['test_service'].proxy.read()# To update the proxy you can eitherproxy:resources.Proxy=client.services['test_service'].proxy.update(parameter_to_update='update')# orproxy_instance=client.services['test_service'].proxy.read()proxy_instance['param']='new_value'proxy_instance.update()# On the service you can access the:service:resources.Service=client.services[123456]service.proxy# The PROXY clientservice.mapping_rules# mapping rules clientservice.metrics# metricsservice.app_plans# application plans# The proxy supports:proxy=service.proxy.read()proxy.promote(version=1,from_env=\"sandbox\",to_env=\"production\")# The promote operationproxy.mapping_rules# The mapping rulesproxy.configs# proxy configurations clientproxy.policies# Policies defined for the APIRun the TestsTo run the tests you need to have installed development dependencies:pipenvinstall--devand then run thepytest:pipenvrunpytest-vIntegration tests configurationTo run the integration tests you need to set these env variables:THREESCALE_PROVIDER_URL='https://example-admin.3scale.net'\nTHREESCALE_PROVIDER_TOKEN=''\n\n# OPTIONAL:\nTHREESCALE_MASTER_URL='https://master.3scale.net'\nTHREESCALE_MASTER_TOKEN=''"} {"package": "3sdaq-news-cl", "pacakge-description": "No description available on PyPI."} {"package": "3t", "pacakge-description": "No description available on PyPI."} {"package": "3tllibs", 
"pacakge-description": "### step class\nstep class consist common functions which can be inherited by other classes \u2013 setTaskID, setUUID,setTaskExecutionID,startRedisConn,setLogger,loadParams,connectToAPIForKey,createAndGetResponseFromURL,getLogger,exceptionTraceback,getRelativeFile### extract class\nLicense-File: LICENSE.txt"} {"package": "3to2", "pacakge-description": "DownloadRelease for 2.7 and 3.x (last version I tested was 3.4.3):https://pypi.python.org/pypi/3to2Abstractlib3to2 is a set of fixers that are intended to backport code written for\nPython version 3.x into Python version 2.x. The final target 2.x version is\nthe latest version of the 2.7 branch, as that is the last release in the Python\n2.x branch. Some attempts have been made, however, to make code compatible as\nmuch as possible with versions of Python back to 2.5, and bug reports are still\nwelcome for Python features only present in 2.6+ that are not addressed by\nlib3to2.This project came about as a Google Summer of Code (TM) project in 2009.StatusBecause of the nature of the subject matter, 3to2 is not perfect, so check all\noutput manually. 3to2 does the bulk of the work, but there is code that simply\ncannot be converted into a Python 2 equivalent for one reason or another.3to2 will either produce working Python 2 code or warn about why it did not.\nAny other behavior is a bug and should be reported.lib3to2\u2019s fixers are somewhat well-tested individually, but there is no testing\nthat is done on interactions between multiple fixers, so most of the bugs in\nthe future will likely be found there.Intentionlib3to2 is intended to be a tool in the process of developing code that is\nbackwards-compatible between Python 3 and Python 2. It is not intended to be a\ncomplete solution for directly backporting Python 3 code, though it can often\nbe used for this purpose without issue. 
Sufficiently large packages should be\ndeveloped with lib3to2 used throughout the process to avoid backwards-\nincompatible code from becoming too embedded.There are some features of Python 3 that have no equivalent in Python 2, and\nthough lib3to2 tries to fix as many of these as it can, some features are\nbeyond its grasp. This is especially true of features not readily detectable\nby their syntax alone and extremely subtle features, so make sure that code\nusing lib3to2 is thoroughly tested.Repositorylib3to2 resides athttp://bitbucket.org/amentajo/lib3to2, where the bug tracker\ncan be found athttp://bitbucket.org/amentajo/lib3to2/issuesUsageRun \u201c./3to2\u201d to convert stdin (\u201c-\u201c), files or directories given as\narguments. By default, the tool outputs a unified diff-formatted patch on\nstandard output and a \u201cwhat was changed\u201d summary on standard error, but the\n\u201c-w\u201d option can be given to write back converted files, creating\n\u201c.bak\u201d-named backup files.If you are root, you can also install with \u201c./setup.py build\u201d and\n\u201c./setup.py install\u201d (\u201cmake install\u201d does this for you).This branch of 3to2 must be run with Python 3.To install locally (used for running tests as a non-privileged user), the\nscripts assume you are using python3.1. Modify accordingly if you are not.Relationship with lib2to3Some of the fixers for lib3to2 are directly copy-pasted from their 2to3\nequivalent, with the element of PATTERN and the corresponding transformation\nswitched places. Most fixers written for this program with a corresponding\n2to3 fixer started from a clone of the 2to3 fixer, then modifying that fixer to\nwork in reverse. I do not claim original authorship of these fixers, but I do\nclaim that they will work for 3to2, independent of how they work for 2to3.\nIn addition, this program depends on lib2to3 to implement fixers, test cases,\nrefactoring, and grammar. 
Some portions of lib2to3 were modified to be more\ngeneric to support lib3to2\u2019s calls.You should use the latest version of lib2to3 from the Python sandbox rather\nthan the version (if any) that comes with Python. As a convenience,\n\u201ctwo2three\u201d from the Python Package Index is a recent enough version of lib2to3\nrenamed to avoid conflicts. To use this package, replace all usage of\n\u201clib2to3\u201d with \u201ctwo2three\u201d within the 3to2 source files after installing\n\u201ctwo2three\u201d from the PyPI. Depending on the developer\u2019s mood, a version of\n3to2 may be provided with this change already made."} {"package": "3to2_py3k", "pacakge-description": "Thanks to Steven Silvester, this no longer needs to be a package that is separate from 3to2.See:https://pypi.python.org/pypi/3to2"} {"package": "3wfund-data-find", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "3xploit", "pacakge-description": "soon"} {"package": "3xsd", "pacakge-description": "##3xsd\n3xsd is a native epoll server serving TCP/UDP connections, a high performance static web server, a\nfailover dns server, a http-based distributed file server, a load-balance proxy-cache server, and\na \u2018warp drive\u2019 server. 
Written in python, it takes full advantage of multiple cores.##Features in detail:###3wsd - web serversupporting: static files, event driven(epoll), using mmap & sendfile to send files,\nin-mem xcache, transparent gzip content transfer with fixed length(small file) &\nchunked(large file), persistent storage of gzip files,\npartial support of WebDAV(PUT/DELETE), pipelining support###3nsd - dns serversupporting: only A record resolution, domain name failover(refer to conf file),\nip icmp probe & hide when fail, round robin ip resolving,\nglobal DNS Left-Right Range Resolve(LRRR)(experimental)###3zsd - proxy serversupporting: load balancing backend servers, in-mem file caching &\npersistent cache file storage###3fsd - distributed web file systemsupporting: mass unlimited file storage, easy to expand,\nO(1) location algorithm, non-centralized, can work with standard web server(WebDAV)\nin proxy mode, file redundancy, file persistent caching###3wdd - \u2018warp drive\u2019 serversupporting: data tunneling over UDT and tun,\nbetter congestion control than TCP/UDP over wan link,\nbetter throughput(above 80%) over wan link, refer to this report:http://www.c-s-a.org.cn/ch/reader/create_pdf.aspx?file_no=20091035tunnel ip/mtu/txqueuelen/route definition, auto create/recreate/destroy,\nencrypting packets with AES-128-ECB/CBC/CFB/CTR and Blowfish-CBC/CFB/CTR,\ntunnel on-the-fly compression with zlib/lzo, tunnel data relaying,\nroute metric, routing data through different paths, depending on tunnel rtt(choosing the best one)More to find in the .conf file.##Performance:###3wsd:Small file under 1KB single process test(full in-mem), contrasted with nginx configured with\naccept_mutex off, 80% performance.\nMulti-process test, with a reuse_port-enabled kernel, 95% performance of nginx(and beyond,\nmay be 105% or more, based on process number, I tested 2-4).\nThe tests above are not quite strict, but I just want to say that it\u2019s fast enough.And with pipelining enabled, 3wsd will perform better with 3-4 
requests/send(5%-10%\nperformance increase), 2 requests/send have the same speed as non-pipelining.###3zsd:About 80% of the performance of 3wsd.###3nsd:Fast enough\u2026about 2800-3000 queries/s per process, tested with a 1GHz bcm2709 4-core ARMv7\ncpu, better with multiple processes on a reuse_port-enabled kernel.###3fsd:Same as 3zsd.###3wdd:Early testing indicated that:\na UDT tunnel(no encryption) performs at 50%-60% of the speed of a direct TCP connection with ZetaTCP,\nwith the packet loss rate remaining below 0.6%, while the direct connection has 1.4%-3%.\n(Tested on a CN-US WAN link with 150ms-280ms latency, through the always-jammed CUCN submarine cable)\nHowever, the UDT tunnel beats a normal TCP connection without ZetaTCP, outperforming it by 50% - 4 times\n(commonly 1-2 times).(v)(Test link as above)Update:\nAn encrypted UDT tunnel with AES-CBC/CFB will have a 50% performance decrease (because the\nmethod itself processes double the size of data, plus extra iv/padding data transfer).\nNow with the Blowfish-CTR method, tunnel data transfer performance is close to that of a raw non-encrypted\ntunnel. I believe that with an Intel AES-NI supported CPU(like XEON E3-1240/1270), AES-128-CTR\ncan also do it.###More performance:\nThere are at least two ways to increase the performance of 3xsd:1.Install Cython, and rename _3xsd.py to _3xsd.pyx, then run it.\nCython will compile the _3xsd.py lib into a _3xsd.so file, using static type\ndeclarations. This can gain about a 5%-6% performance increase.\n2.Use PyPy. This can gain about a 10%-15% performance increase(or more).#OS requirement & install:CentOS 6/7 with python 2.6/2.7, Debian 6/7. 
Python 2.7 recommended.Do this before running the program(minimal requirement):yum install python-gevent pysendfile python-setproctitle python-psutil python-pip(python-pip is optional; it is needed to install dpkt)The dpkt module is also needed when running the 3nsd DNS server; pip install it.If you want to use 3wdd, python-pytun, pyudt4, pycrypto, python-lzo are also needed.yum install python-crypto2.6 python-lzo (for centos6)\nyum install python2-crypto (for centos7)will quickly install pycrypto(probably doing some \u2018linking\u2019 work) and lzo. The other two depend on pip to install.Probably you need this easy-install.pth file in python\u2019s site-packages dir:import sys; sys.__plen = len(sys.path)\n./pycrypto-2.6.1-py2.6-linux-x86_64.egg\n./pyudt4-0.6.0-py2.6-linux-x86_64.egg\nimport sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)I provide pre-compiled packages [pyudt_tun-centos6-x86_64.tar.gz](https://github.com/zihuaye/3xsd/blob/master/pyudt_tun-centos6-x86_64.tar.gz) and [pyudt_tun_lzo-centos7-x86_64.tar.gz](https://github.com/zihuaye/3xsd/blob/master/pyudt_tun_lzo-centos7-x86_64.tar.gz) to simplify\nthe installation procedure of pyudt4 & python-pytun.Be aware that pyudt4 has some bugs; you\u2019d better download the source code of its epoll-fixes branch and\napply the patch I offered. 
See the changelog.txt v0.0.20 2016.03.07 fixed section for detail.\n(Already included in [pyudt_tun-centos6-x86_64.tar.gz](https://github.com/zihuaye/3xsd/blob/master/pyudt_tun-centos6-x86_64.tar.gz) and [pyudt_tun_lzo-centos7-x86_64.tar.gz](https://github.com/zihuaye/3xsd/blob/master/pyudt_tun_lzo-centos7-x86_64.tar.gz))Or, of course, you can let pip do it all for you(not including patching pyudt4):pip install 3xsdOn Debian, you can use apt-get to install python-pip(pip) or python-setuptools(easy_install),\nthen install the following packages.Python Packages(Modules) version reference:gevent==0.13.8(1.0.1, 1.1)\ngreenlet==0.4.2\npysendfile==2.0.1\nsetproctitle==1.0.1\npsutil==0.6.1\ndpkt==1.6(1.8.6)\npython-pytun==2.2.1\npyudt4==0.6.0(epoll-fixes branch)\npycrypto==2.6.1\npython-lzo==1.8System libs version reference:libevent-1.4.13-4(not actually used, just needed for gevent to function)\nudt-4.11-6\nlzo-2.03-3.1To install a specific version of a module(like gevent 0.13.8), you can:pip install gevent==0.13.8This will install the latest version of gevent(pypy will need it):pip install git+git://github.com/surfly/gevent.git#egg=gevent"} {"package": "3y", "pacakge-description": "# 3yzhA three-circle calculator, It contains Circle Circles and Cylinders.\nIt is in Chinese.If you know how to speak and write or mean these, you can use it.Now, you can download it, input python 3y, and then it will run.Please comply with local laws and regulations,\nUser only has the right to use\nThe final interpretation belongs to the author\nFor mainland China only.\nYou can see or read the log in log.py\nThis means that you have read and agreed to all the above regulations.Welcome to download and use!"} {"package": "4", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} {"package": "404-optimistic-pkg-404-Not-Found", "pacakge-description": "No description available on PyPI."} {"package": "40ft", "pacakge-description": "No description available on PyPI."} {"package": "40hz", "pacakge-description": "welcome to my package"} {"package": "40wt-common-tasks", "pacakge-description": "A collection of tasks for python invoke, to build and maintain python projects.Free software: BSD licenseInstallationpip install 40wt-common-tasksDocumentationhttps://40wt-common-tasks.readthedocs.org/DevelopmentTo run all the tests run:toxNote, to combine the coverage data from all the tox environments run:Windowsset PYTEST_ADDOPTS=--cov-append\ntoxOtherPYTEST_ADDOPTS=--cov-append toxChangelog0.2.0 (2016-07-29)Tasks were fixed to work with invoke >= 0.13.0.A new task check_if_dirty was added. Make your tasks depend on it,\nand execution will be interrupted if some git changes aren\u2019t committed and pushed.0.1.0 (2016-02-10)First release on PyPI."} {"package": "41datastructure", "pacakge-description": "No description available on PyPI."} {"package": "42", "pacakge-description": "No description available on PyPI."} {"package": "42cc-pystyle", "pacakge-description": "42cc-pystyleflake8 plugins for 42 Coffee Cups style checks0.0.7\nflake8 version bump to 3.8.30.0.6\nfixed flake8 warnings :)0.0.5\nfixed package layout declaration0.0.4\nfixed nosetests0.0.3\nmoved code to a subfolder0.0.2\nAdded MANIFEST.in0.0.1\nTests should have docstrings\nTests for the module can be run via nosetests or python setup.py nosetests"} {"package": "42di", "pacakge-description": "42di Python SDK42di is the python sdk for 42di.com. 42di.com is a platform for data science.Installpip install 42diORpip install git+https://github.com/42di/python-sdkPut to / read from 
42diimport di  # 42di\nimport pandas_datareader as pdr\n\ndi.TOKEN = \"\"\ndf = pdr.get_data_fred('GDP')\ndi.put(\"42di.cn/shellc/testing/my_dataset\", df, create=True, update_schema=True)\ndf = di.read(\"42di.cn/shellc/testing/my_dataset\")\nprint(df.head(100))"} {"package": "42Points", "pacakge-description": "42-Points-GamePython implementation of the forty-two points game.IntroductionThe \"42 Points\" game is a variation based on the popular and long-lived \"24 Points\" game. The player should only use addition, subtraction, multiplication, division and parentheses and five integers between 0 and 13 (inclusive) to get 42. It's really simple to understand, as the core of the game still relies on math calculations.Design LogicThis package is designed to work as a core processing part, providing: a problem database; problem generation (by user / in database); starting / stopping the game; a timer for solutions; player statistics; equivalent solution detection; a bunch of APIs and exceptions.This package is not designed to fully implement everything with the game, so the formatting parts should be written by end-users. But don't worry, the API in this package is enough to create whatever you need.UsageAs equivalent detection is not an easy issue, we will make several changes to make sure it satisfies our needs in the following upgrades. Please just kindly use the latest version.It's recommended to use pip:pip install --upgrade 42PointsBut building with setup.py will also work, as no other third-party dependencies are required for this package.After you have installed the package, you are almost done. 
If you want to try it out quickly, just open your IDLE (or something like that) and type:from ftptsgame import FTPtsGame\napp = FTPtsGame()  # initialize\napp.generate_problem(problem=[1, 2, 3, 4, 5])  # generate a problem beforehand\napp.start()  # start the game\napp.get_current_problem()  # show the problem\napp.solve('2 * 4 * 5 + 3 - 1')  # put forward a valid solution\napp.get_current_solutions()  # show all solutions\napp.get_remaining_solutions()  # show remaining solutions\napp.stop()  # stop the gameWell, you can integrate this package into your projects in just the same way, and you can format problems in all the ways you like.ExceptionsPermissionError: Raised during a status check; some methods must be used in the game, while others can't be used.TypeError: Raised when the problem type is not defined.ValueError: Raised when the parameter for generating a problem is invalid, or a user inputs unmatched numbers.OverflowError: Raised when a user inputs a too-long expression (greater than or equal to 30 characters after beautifying).SyntaxError: Raised when a user inputs an expression which can't be parsed.ArithmeticError: Raised when a user inputs a wrong answer.LookupError: Raised when a user inputs an equivalent answer.Other exceptions will be raised when an input fails built-in parameter checks.LicenseThis package is licensed under the MIT License.ContributionPull requests and issues are warmly welcomed."} {"package": "42qucc", "pacakge-description": "the following is the usage:1.Paste file to 42qu.cc:hi@Mars~$ 42qucc < foo.txt\nhttp://42qu.cc/xa47qt471 2.Custom url:hi@Mars~$ 42qucc hi < foo.txt\nhttp://42qu.cc/hi 3.Save web page to local file:hi@Mars~$ 42qucc http://42qu.cc/xa47qt471 > foo.txt"} {"package": "42ukim", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "42ukimd08", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "42ukimx2", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} {"package": "42videobricks-python-client", "pacakge-description": "# 42videobricks-python-client\nOfficial Python client library for 42videobricks API.\n\n42videobricks is a Video Platform As A Service (VPaaS).\n\nGet documentation in the [Github readme](https://github.com/42videobricks/42videobricks-nodejs-client#readme)."} {"package": "437-project", "pacakge-description": "Using this event management system, a user can create events or book tickets for an event, as well as view all events and all tickets or get a summary of all tickets"} {"package": "440-create-user", "pacakge-description": "No description available on PyPI."} {"package": "456789999999test", "pacakge-description": "No description available on PyPI."} {"package": "4711", "pacakge-description": "47114711 is a natural number that follows 4710 and that is followed by 4712. It's somewhat of \"a large arbitrary number\".In this case however, 4711 isn't just a number. It's a CLI tool (or a collection of CLI tools if you wish) for working with data structures, parsing values, handling conversion and formatting the output. 
It's written in Python, but doesn't require any extensive Python knowledge for use.Requires Python 3.6+ and should be run on Unix-like systems such as Linux, BSD, macOS, etc.Installation withpipx(preferred) orpipIt's recommended to install4711usingpipx, which is a tool that stores Python based command line interface applications in their own virtual environment, effectively making it accessible without you having to think about setting up virtualenvs on your own or handle the dependencies.pipxis available athttps://github.com/pipxproject/pipx.To install usingpipx(depending on how you've installedpipxpreviously, may have to usesudo):$ pipx install 4711If you prefer to install the CLI normally usingpip, go ahead and run:$ pip install 4711Usage and examplesUse-case$ 4711 --help"} {"package": "4996", "pacakge-description": "4996"} {"package": "49p", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "4cast-awi-package", "pacakge-description": "4cast_task\nread csv file."} {"package": "4cast-package", "pacakge-description": "4cast_task\nread csv file."} {"package": "4cdl", "pacakge-description": "UNKNOWN"} {"package": "4ch", "pacakge-description": "fourch (stylized as 4ch) is a wrapper to the 4chan JSON API, provided by moot. 
It allows you to interact with 4chan (in a READONLY way) easily through your scripts.Originally stolen forked frome000/py-4chan, but then I moved repos and renamed stuff since I\u2019m pretty bad about that.RequirementsPython 2.7 (what I test with, 2.x might work)requestsNotesThis isn\u2019t guaranteed to work all the time; after all, the API may change, and 4ch will have to be updated accordingly.If a feature is missing, open an issue on therepo, and it may well be implemented.Running / UsageInstall & import:$ pip install 4ch,import fourchSee thedocsContributingIf you\u2019re interested in contributing to the usability of 4ch, or just want to give away stars, you can visit the 4ch githubrepo."} {"package": "4chan", "pacakge-description": "A python script that downloads all images from a 4chan thread.Installpip install 4chanUsageusage: chan [-h] [--watch] url\n\npositional arguments:\n url The url of the thread.\n\noptional arguments:\n -h, --help show this help message and exit\n --watch If this argument is passed, we will watch the thread for new\n images.Examplechan thread-url"} {"package": "4chan-biz-mentions", "pacakge-description": "4chan-biz-mentionsHF tracking your shitcoin, loserInstallpipinstall4chan-biz-mentionsUsage>python3-m4chan_biz_mentionsmoonethavaxadasolbtcankralgofudbullishscampumpnft\nsolmentionned44timesethmentionned39timespumpmentionned22timesscammentionned19timesmoonmentionned12timesfudmentionned8timesadamentionned7timesavaxmentionned6timesbtcmentionned5timesalgomentionned3timesbullishmentionned3timesnftmentionned3timesankrmentionned0timesshithub"} {"package": "4chandownloader", "pacakge-description": "4chan thread downloader.pip install 4chandownloader\n4chandownloader http://boards.4chan.org/b/res/423861837 4chanarchives --delay 5 --thumbs"} {"package": "4channel", "pacakge-description": "4channel is a python3 tool and module to download all images/webm from a 4channel thread.InstallationDependencies4channel requires:python (>= 3.6)User 
installationpip install 4channelUsageusage: 4channel [-h] [--webm] [--watch] [--dryrun] [-r RECURSE] url [out]\n\npositional arguments:\n url the url of the thread.\n out specify output directory (optional)\n\noptional arguments:\n -h, --help show this help message and exit\n --webm in addition to images also download webm videos.\n --watch watch the thread every 60 seconds for new images.\n --dryrun dry run without actually downloading images.\n -r RECURSE, --recurse RECURSE\n recursively download images if 1st post contains link to previous thread up to specified depth\nexamples:\n python -m fourchannel https://boards.4channel.org/g/thread/76759434#p76759434\n\n import fourchannel as f\n f.download(url='https://boards.4channel.org/g/thread/76759434#p76759434')"} {"package": "4chan.py", "pacakge-description": "No description available on PyPI."} {"package": "4codesdk-pkg", "pacakge-description": "4Code Software Developement KitCommon functions in one place.This library is written for economy time of developement.\nCommon usage tools in one touch. Free license.Authors nicnames:AngryDanny+N_XY"} {"package": "4dgb-workflow", "pacakge-description": "4DGB toolkitThis is the toolkit associated with the 4DGenomeBrowser project"} {"package": "4dpyautodiff", "pacakge-description": "PyAutoDiff provides the ability to seamlessly calculate the gradient of a given function within your Python code. 
By using automatic differentiation, this project addresses efficiency and precision issues in symbolic and finite differentiation algorithms"} {"package": "4d-radar", "pacakge-description": "4D Radar4d_radar is a Python package developed to house utilities for the processing and analysis of 4D radar point cloud data.FeaturesWork in ProgressContributingContributions to 4d_radar are welcome!License4d_radar is licensed under the MIT License."} {"package": "4in", "pacakge-description": "\ud83d\udce6 setup.py (for humans)This repo exists to providean example setup.pyfile, that can be used\nto bootstrap your next Python project. It includes some advanced\npatterns and best practices forsetup.py, as well as some\ncommented\u2013out nice\u2013to\u2013haves.For example, thissetup.pyprovides a$ python setup.py uploadcommand, which creates auniversal wheel(andsdist) and uploads\nyour package toPyPiusingTwine, without the need for an annoyingsetup.cfgfile. It also creates/uploads a new git tag, automatically.In short,setup.pyfiles can be daunting to approach, when first\nstarting out \u2014 even Guido has been heard saying, \"everyone cargo cults\nthems\". 
It's true \u2014 so, I want this repo to be the best place to\ncopy\u2013paste from :)Check out the example!Installationcdyour_project# Download the setup.py file:# download with wgetwgethttps://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py-Osetup.py# download with curlcurl-Ohttps://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.pyTo DoTests via$ setup.py test(if it's concise).Pull requests are encouraged!More ResourcesWhat is setup.py?on Stack OverflowOfficial Python Packaging User GuideThe Hitchhiker's Guide to PackagingCookiecutter template for a Python packageLicenseThis is free and unencumbered software released into the public domain.Anyone is free to copy, modify, publish, use, compile, sell, or\ndistribute this software, either in source code form or as a compiled\nbinary, for any purpose, commercial or non-commercial, and by any means."} {"package": "4koodi", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "4logik-python-rest-client", "pacakge-description": "4logik python rest clientUtility package to call an enpoint generated by 4LogikInstallationUse pippip install 4logik-python-rest-clientHow to call a CSV endpointLocate the input CSV fileIdentify the URL of the enpointIdentify the name of the data set of the response that contains the resultsExample of using the package:frompy4logik_python_rest_client.endpoint_callerimportcall_csv_endpoint,call_csv_endpoint_read_data_set# input parametersinput_csv_file=\"/home/user1/incomingData.csv\"endpoint_url=\"http://myOrganization.myDeployedService.com/RiskCalulationProcess\"# call the endpointreceived_json_data=call_csv_endpoint(ms_url,input_csv_file)print(received_json_data)The result will contain useful metadata like the quantity of business exceptions and the list of data sets which you can print using:print(received_json_data[\"data_sets_names\"])print(received_json_data[\"data_sets_results\"])To read the specific rows of a data set, call the 
method \"call_csv_endpoint_read_data_set\" sending the name of the data set, like this:specific_data_set_name_to_read=\"ReportResult\"data_set_result_rows=call_csv_endpoint_read_data_set(ms_url,input_csv_file,specific_data_set_name_to_read)print(data_set_result_rows)Example using the package inside Jupyter and converting the result to a data frame:importjsonimportpandasaspdimporttempfilefrompy4logik_python_rest_client.endpoint_callerimportcall_csv_endpoint_read_data_set# input parametersinput_csv_file=\"/home/user1/incomingData.csv\"endpoint_url=\"http://myOrganization.myDeployedService.com/RiskCalulationProcess\"dataset_name=\"riskResult\"# call the endpointreceived_json_data=call_csv_endpoint_read_data_set(ms_url,input_csv_file,dataset_name)# now convert the received json to pandatemp_file=tempfile.NamedTemporaryFile(delete=False)output_json=temp_file.namewithopen(output_json,'w',encoding='UTF_8')asf:f.write(json.dumps(received_json_data))f.close()final_data_frame=pd.read_json(output_json)final_data_frame"} {"package": "4nf", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "4nil0cin", "pacakge-description": "No description available on PyPI."} {"package": "4quila", "pacakge-description": "No description available on PyPI."} {"package": "4scanner", "pacakge-description": "4scanner4scanner can search multiple imageboards threads for matching keywords then download all images to disk.Supported imageboards4chanlainchanuboachanYou can create an issue if you want to see other imageboards supportedInstallingpip3 install 4scanner(4scanner is ONLY compatible with python3+)For Arch Linux there is anAUR packageRunning via DockerCreate a config (detail below), name it config.json and drop it where you would like to download the images. 
Then run a container:docker run -v /can/be/anywhere:/output -v /anywhere/else:/root/.4scanner lacsap/4scanner/can/be/anywhereCan be anywhere on your computer, images will be downloaded there (This is the directory where you need to put the config.json)/anywhere/elseCan be anywhere on your computer, it will contain the sqlite3 database 4scanner use to keep track of downloaded threads and duplicateHow tothe first thing you need to do is create a simple json file with the directories names\nyou want, the boards you want to search and the keywords.\n(see the json file section for more details)After your json file is done you can start 4scanner with:4scanner file.jsonit will search all threads for the keywords defined in your json file and\ndownload all images/webms from threads where a keyword is found. (In the current directory unless you specify one with -o )Creating your JSON file via the 4genconf script (easy)The4genconfutility is now installed as of 4scanner 1.5.1. This utility will ask you simple questions about what you want to download and generate a configuration file for you!Creating your JSON file manuallyCreating the JSON file is easy, you can use the example.json file as a base.Your \"Searches\" are what 4scanner use to know which board to check for what keywords and the name of the folder where it needs to download the images, you can have as many \"Searches\" as you want.Here is an example of what the JSON file should look like:{\"searches\":[{\"imageboard\":\"IMAGEBOARD\",\"folder_name\":\"YOUR_FOLDER_NAME\",\"board\":\"BOARD_LETTER\",\"keywords\":[\"KEYWORD1\",\"KEYWORD2\"]},{\"imageboard\":\"4chan\",\"folder_name\":\"vidya\",\"board\":\"v\",\"keywords\":[\"tf2\",\"splatoon\",\"world of tank\"]}]}Search options4scanner has a lot of options for downloading only the images you want. 
Such as downloading only images with a certain width or height, or only images with a certain extension.To see all available options with examples check out:OPTIONS.mdHydrus Networkusers: check out thetagoptionto automatically tag your images on importExample with all optional options{\"searches\":[{\"imageboard\":\"4chan\",\"folder_name\":\"vidya\",\"board\":\"v\",\"width\":\">1000\",\"height\":\">1000\",\"filename\":\"IMG_\",\"extension\":[\".jpg\",\".png\"],\"tag\":[\"game\"],\"keywords\":[\"tf2\",\"splatoon\",\"world of tank\"],\"check_duplicate\":true,\"subject_only\":false}]}This will download images bigger than 1000x1000 which are .jpg or .png with a filename containingIMG_Notesthe keywords search is case insensitive4downloader4downloader is also installed with 4scanner and can be used to download\na single thread like this:4downloader http://boards.4chan.org/b/thread/373687492It will download all images until the thread dies.\nYou can also download threads from imageboards other than 4chan with-i"} {"package": "4SFwD", "pacakge-description": "SF4wDfour-component stochastic frontier model with determinantsMotivationThis package was developed to complement four-component stochastic frontier that considers determinants in mean and variance parameters of inefficiency distributionsby Ruei-Chi Lee.InstallationInstall via$ pip install 4SFwDFeaturesSF4wD: main.py - set method and model to run simulation or real dataHMC: Hamiltonian Monte Carlo designed for determinants parameters.DA: Data augmentation for the modelTK: Two-parametrization method originally proposed by Tsiona and Kunmbhaker (2014) for four-component model without determinants.PMCMC: Particle MCMC for the model (preferred approach) - sped up by GPU parallel computationExampleHere is how you run a simulation estimation for a four-component stochastic frontier model via PMCMC:Parameter setting guideline in the SF4wD.pySimulation data only offers a stochastic frontier model that considers determinants in both
mean and variance parameter of inefficiencies.importSF4wD#model:str - different way to consider determinants#method:str - different Bayesian method to estimate the model#data_name : str - simulation data or data in data/.#S : int - MCMC length#H : int - number of particles in PMCMC#gpu: boolean - use parallel computation to run PMCMC#save: boolean - save MCMC datamy_model=SF4wD(model='D',method='PMCMC',data_name='',S=10,H=100,gpu=False,save=False)my_model.run()output:meansdhpd_3%hpd_97%mcse_meanmcse_sdess_meaness_sdess_bulkess_tailr_hatbeta02.4120.0932.3182.5550.0460.0354.04.07.010.0NaNbeta11.0780.0740.9771.2420.0230.01710.010.010.010.0NaNxi00.5800.0430.5310.6520.0140.0119.09.08.010.0NaNxi10.6940.1270.4790.8670.0730.0583.03.03.010.0NaNdelta00.1410.0720.0130.2730.0230.01910.08.010.010.0NaNdelta10.7740.1370.6200.9840.0790.0633.03.03.010.0NaNz0-0.4610.716-1.8440.6090.3760.2914.04.04.010.0NaNz12.7280.8891.2683.9410.4590.3544.04.04.010.0NaNgamma00.6620.0920.5000.7730.0520.0413.03.03.010.0NaNgamma10.4120.0610.3490.5190.0210.0159.09.09.010.0NaNsigma_alpha_sqr1.3770.1781.0951.6930.0750.0576.06.06.010.0NaNsigma_v_sqr2.5752.5231.2909.5151.0620.7936.06.03.010.0NaNLicenseRuei-Chi Lee is the main author and contributor.Bug reports, feature requests, questions, rants, etc are welcome, preferably\non the github page."} {"package": "4Suite", "pacakge-description": "4Suite is a collection of Python tools for XML processing and object-databases.4Suite is an integrated packaging of 4DOM, 4XPath, 4XSLT, 4RDF, and 4ODS."} {"package": "4Suite-XML", "pacakge-description": "XML tools and libraries for Python: Domlette, XPath, XSLT, XPointer, XLink, XUpdate"} {"package": "4to5", "pacakge-description": "4to5 - Replace the number 4 with the number 5.Unlike 2to3, this module finally does what it says! Replaces two numbers on your\ninterpreter. 
It\u2019s a true life-saver for both you and your colleagues.Usagepipinstall4to5python>>>2+25>>>3+15>>>3+2==3+1True>>>4-23>>>4-1# Cause 4-1 == 5-1 == 4 == 55>>>foriinrange(10):...print(i)...0123556789Notes50% chance you won\u2019t be able to remove it, as apparently the number 4 is\nimportant for pip, and without it pip doesn\u2019t seem to work properly.To manually uninstall, deletesitecustomize.pyfrom yoursite-packagesdirectory.\nMaybe I\u2019ll add afix_my_system.pyfile in the future to remove it without using\nthe number 4.Supports virtual environments.Enjoy!"} {"package": "5", "pacakge-description": "UNKNOWN"} {"package": "5090-distributions", "pacakge-description": "No description available on PyPI."} {"package": "5091SimpleCalculator7180", "pacakge-description": "No description available on PyPI."} {"package": "51degrees-mobile-detector", "pacakge-description": "Device Detection Python API51Degrees Mobile Detector is a server side mobile detection solution.ChangelogFixed a bug where an additional compile argument was causing compilation errors with clang.Updated the v3-trie-wrapper package to include the Lite Hash Trie data file.Updated Lite Pattern data file for November.Updated Lite Hash Trie data file for November.GeneralBefore you start matching user agents, you may wish to configure the solution to use a different database. You can easily generate a sample settings file running the following command$ 51degrees-mobile-detector settings > ~/51degrees-mobile-detector.settings.pyThe core51degrees-mobile-detectoris included as a dependency when installing either the51degrees-mobile-detector-v3-wrapperor51degrees-mobile-detector-v3-trie-wrapperpackages.During install a directory which contains your data file will be created in~\\51Degrees.SettingsGeneral SettingsDETECTION_METHOD(defaults to \u2018v3-wrapper\u2019). Sets the preferred mobile device detection method.
Available options are v3-wrapper (requires 51degrees-mobile-detector-v3-wrapper package), v3-trie-wrapperPROPERTIES(defaults to \u2018\u2019). List of case-sensitive property names to be fetched on every device detection. Leave empty to fetch all available properties.LICENCEYour 51Degrees license key for enhanced device data. This is required if you want to set up the automatic 51degrees-mobile-detector-premium-pattern-wrapper package updates.Trie Detector settingsV3_TRIE_WRAPPER_DATABASELocation of the Hash Trie data file.Pattern Detector settingsV3_WRAPPER_DATABASELocation of the Pattern data file.CACHE_SIZE(defaults to 10000). Sets the size of the workset cache.POOL_SIZE(defaults to 20). Sets the size of the workset pool.Usage Sharer SettingsUSAGE_SHARER_ENABLED(defaults to True). Indicates if usage data should be shared with 51Degrees.com. We recommend leaving this value unchanged to ensure we\u2019re improving the performance and accuracy of the solution.Advanced usage sharer settings are detailed in your settings file.Automatic UpdatesIf you want to set up automatic updates, add your license key to your settings and add the following command to your cron$ 51degrees-mobile-detector update-premium-pattern-wrapperNOTE: Currently auto updates are only available with our Pattern API.UsageCoreBy executing the following a useful help page will be displayed explaining basic usage.$ 51degrees-mobile-detectorTo check everything is set up, try fetching a match with$ 51degrees-mobile-detector match \u201cMozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9B176\u201dExamplesAdditional examples can be found on ourGitHubrepository.User SupportIf you have any issues please get in touch with ourSupportor open an issue on ourGitHubrepository."} {"package": "51degrees-mobile-detector-lite-pattern-wrapper", "pacakge-description": "51Degrees Mobile Detector is a Python wrapper of the lite C pattern-based\nmobile detection solution by
51Degrees.mobi. Check outhttp://51degrees.mobifor a detailed description, extra documentation and other useful information.copyright:2013 by 51Degrees.mobi, see README.rst for more details.license:MPL2, see LICENSE.txt for more details."} {"package": "51degrees-mobile-detector-trie-wrapper", "pacakge-description": "51Degrees Mobile Detector is a Python wrapper of the C trie-based mobile\ndetection solution by 51Degrees.mobi. Check outhttp://51degrees.mobifor\na detailed description, extra documentation and other useful information.copyright:2013 by 51Degrees.mobi, see README.rst for more details.license:MPL2, see LICENSE.txt for more details."} {"package": "51degrees-mobile-detector-v3-trie-wrapper", "pacakge-description": "Device Detection Python API51Degrees Mobile Detector is a server side mobile detection solution.ChangelogFixed a bug where an additional compile argument was causing compilation errors with clang.Updated the v3-trie-wrapper package to include the Lite Hash Trie data file.Updated Lite Pattern data file for November.Updated Lite Hash Trie data file for November.GeneralBefore you start matching user agents, you may wish to configure the solution to use a different database. You can easily generate a sample settings file running the following command$ 51degrees-mobile-detector settings > ~/51degrees-mobile-detector.settings.pyThe core51degrees-mobile-detectoris included as a dependency when installing either the51degrees-mobile-detector-v3-wrapperor51degrees-mobile-detector-v3-wrapperpackages.During install a directory which contains your data file will be created in~\\51Degrees.SettingsGeneral SettingsDETECTION_METHOD(defaults to \u2018v3-wrapper\u2019). Sets the preferred mobile device detection method. Available options are v3-wrapper (requires 51degrees-mobile-detector-v3-wrapper package), v3-trie-wrapperPROPERTIES(defaults to \u2018\u2019). List of case-sensitive property names to be fetched on every device detection. 
Leave empty to fetch all available properties.LICENCEYour 51Degrees license key for enhanced device data. This is required if you want to set up the automatic 51degrees-mobile-detector-premium-pattern-wrapper package updates.Trie Detector settingsV3_TRIE_WRAPPER_DATABASELocation of the Hash Trie data file.Pattern Detector settingsV3_WRAPPER_DATABASELocation of the Pattern data file.CACHE_SIZE(defaults to 10000). Sets the size of the workset cache.POOL_SIZE(defaults to 20). Sets the size of the workset pool.Usage Sharer SettingsUSAGE_SHARER_ENABLED(defaults to True). Indicates if usage data should be shared with 51Degrees.com. We recommend leaving this value unchanged to ensure we\u2019re improving the performance and accuracy of the solution.Advanced usage sharer settings are detailed in your settings file.Automatic UpdatesIf you want to set up automatic updates, add your license key to your settings and add the following command to your cron$ 51degrees-mobile-detector update-premium-pattern-wrapperNOTE: Currently auto updates are only available with our Pattern API.UsageCoreBy executing the following a useful help page will be displayed explaining basic usage.$ 51degrees-mobile-detectorTo check everything is set up, try fetching a match with$ 51degrees-mobile-detector match \u201cMozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9B176\u201dExamplesAdditional examples can be found on ourGitHubrepository.User SupportIf you have any issues please get in touch with ourSupportor open an issue on ourGitHubrepository."} {"package": "51degrees-mobile-detector-v3-wrapper", "pacakge-description": "Device Detection Python API51Degrees Mobile Detector is a server side mobile detection solution.ChangelogFixed a bug where an additional compile argument was causing compilation errors with clang.Updated the v3-trie-wrapper package to include the Lite Hash Trie data file.Updated Lite Pattern data file for November.Updated Lite Hash Trie
data file for November.GeneralBefore you start matching user agents, you may wish to configure the solution to use a different database. You can easily generate a sample settings file running the following command$ 51degrees-mobile-detector settings > ~/51degrees-mobile-detector.settings.pyThe core51degrees-mobile-detectoris included as a dependency when installing either the51degrees-mobile-detector-v3-wrapperor51degrees-mobile-detector-v3-trie-wrapperpackages.During install a directory which contains your data file will be created in~\\51Degrees.SettingsGeneral SettingsDETECTION_METHOD(defaults to \u2018v3-wrapper\u2019). Sets the preferred mobile device detection method. Available options are v3-wrapper (requires 51degrees-mobile-detector-v3-wrapper package), v3-trie-wrapperPROPERTIES(defaults to \u2018\u2019). List of case-sensitive property names to be fetched on every device detection. Leave empty to fetch all available properties.LICENCEYour 51Degrees license key for enhanced device data. This is required if you want to set up the automatic 51degrees-mobile-detector-premium-pattern-wrapper package updates.Trie Detector settingsV3_TRIE_WRAPPER_DATABASELocation of the Hash Trie data file.Pattern Detector settingsV3_WRAPPER_DATABASELocation of the Pattern data file.CACHE_SIZE(defaults to 10000). Sets the size of the workset cache.POOL_SIZE(defaults to 20). Sets the size of the workset pool.Usage Sharer SettingsUSAGE_SHARER_ENABLED(defaults to True). Indicates if usage data should be shared with 51Degrees.com.
We recommend leaving this value unchanged to ensure we\u2019re improving the performance and accuracy of the solution.Advanced usage sharer settings are detailed in your settings file.Automatic UpdatesIf you want to set up automatic updates, add your license key to your settings and add the following command to your cron$ 51degrees-mobile-detector update-premium-pattern-wrapperNOTE: Currently auto updates are only available with our Pattern API.UsageCoreBy executing the following a useful help page will be displayed explaining basic usage.$ 51degrees-mobile-detectorTo check everything is set up, try fetching a match with$ 51degrees-mobile-detector match \u201cMozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9B176\u201dExamplesAdditional examples can be found on ourGitHubrepository.User SupportIf you have any issues please get in touch with ourSupportor open an issue on ourGitHubrepository."} {"package": "51job-autotest-framework", "pacakge-description": "pythonProjectIntroduction{The following is the Gitee platform boilerplate; you can replace this introduction. Gitee is a Git-based code hosting platform launched by OSCHINA (SVN is also supported), providing developers with a stable, efficient and secure cloud platform for collaborative software development.\nWhether you are an individual, a team or an enterprise, you can use Gitee for code hosting, project management and collaborative development. For enterprise projects seehttps://gitee.com/enterprises}Software architectureSoftware architecture descriptionInstallation tutorialxxxxxxxxxxxxUsage instructionsxxxxxxxxxxxxContributingJasonZhengTipsUse Readme_XXX.md
to support different languages, e.g. Readme_en.md, Readme_zh.mdGitee official blog:blog.gitee.comYou can visithttps://gitee.com/exploreto discover excellent open-source projects on GiteeGVP is short for Gitee's Most Valuable Project, excellent open-source projects selected by comprehensive evaluationThe official Gitee user manual:https://gitee.com/helpGitee Cover People is a column showcasing Gitee members:https://gitee.com/gitee-stars/"} {"package": "51PubModules", "pacakge-description": "No description available on PyPI."} {"package": "51pub_pymodules", "pacakge-description": "No description available on PyPI."} {"package": "51spiders", "pacakge-description": "No description available on PyPI."} {"package": "51tracking", "pacakge-description": "51tracking-sdk-pythonThe Python SDK of 51Tracking APIContact:service@51tracking.orgOfficial documentDocumentSupported Python Versions3.63.73.83.93.10pypy3IndexInstallationTestingError HandlingSDKCouriersTrackingsAir WaybillInstallation$ pip install 51trackingVia source codeDownload the source code archive (no need to extract it), enter the source code root directory, and run:$ pip install 51tracking-sdk-python.zipQuick Startimporttracking51tracking51.api_key='your api key'try:couriers=tracking51.courier.get_all_couriers()print(couriers)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)TestingpytestError handlingThrown by the new SDK clientimporttracking51tracking51.api_key=''try:couriers=tracking51.courier.get_all_couriers()print(couriers)excepttracking51.exception.Tracking51Exceptionasce:print(ce)# API Key is missingThrown by the parameter validation in
functionimporttracking51tracking51.api_key='your api key'try:params={'tracking_number':'','courier_code':'usps'}result=tracking51.tracking.create_tracking(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)# Tracking number cannot be emptyExamplesCouriersReturn the list of all supported couriershttps://api.51Tracking.com/v4/couriers/alltry:result=tracking51.courier.get_all_couriers()print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)TrackingsAdd and track a single tracking number in real timehttps://api.51Tracking.com/v4/trackings/createtry:params={'tracking_number':'92612903029511573030094547','courier_code':'usps'}result=tracking51.tracking.create_tracking(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Get the tracking results of multiple tracking numbershttps://api.51Tracking.com/v4/trackings/gettry:# Perform queries based on various conditions# params = {'tracking_numbers': '92612903029511573030094547', 'courier_code': 'usps'}# params = {'tracking_numbers': '92612903029511573030094547,92612903029511573030094548', 'courier_code': 'usps'}params={'created_date_min':'2023-08-23T14:00:00+00:00','created_date_max':'2023-08-23T15:04:00+00:00'}result=tracking51.tracking.get_tracking_results(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Add multiple tracking numbers (a single call can create at most 40
tracking numbers)https://api.51Tracking.com/v4/trackings/batchtry:params=[{'tracking_number':'92612903029511573030094593','courier_code':'usps'},{'tracking_number':'92612903029511573030094594','courier_code':'usps'}]result=tracking51.tracking.batch_create_trackings(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Update tracking information by IDhttps://api.51Tracking.com/v4/trackings/update/{id}params={'customer_name':'New name','note':'New tests order note'}id_string=\"9a2f732e29b5ed2071d4cf6b5f4a3d19\"try:result=tracking51.tracking.update_tracking_by_id(id_string,params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Delete a tracking number by IDhttps://api.51Tracking.com/v4/trackings/delete/{id}id_string=\"9a2f7d1e8b912b729388c5835c188c28\"try:result=tracking51.tracking.batch_create_trackings(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Retrack an expired tracking number by IDhttps://api.51Tracking.com/v4/trackings/retrack/{id}id_string=\"9a2f7d1e8b912b729388c5835c188c28\"try:result=tracking51.tracking.retrack_tracking_by_id(id_string)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Air WaybillQuery the results of an air waybillhttps://api.51Tracking.com/v4/awbparams={'awb_number':'235-69030430'}try:result=tracking51.air_waybill.create_an_air_waybill(params)print(result)excepttracking51.exception.Tracking51Exceptionasce:print(ce)exceptExceptionase:print(\"other error:\",e)Response status codes51Tracking uses conventional HTTP status codes to indicate the status of API
requests. Generally, a 2xx status code indicates that the request succeeded, a 4XX status code indicates that an error occurred with the request (for example: a required parameter is missing), and a 5xx status code indicates that a problem may have occurred on 51tracking's servers.Http CODEMETA CODETYPEMESSAGE200200SuccessThe request succeeded.400400Bad requestWrong request type. Please check the API documentation to learn the request type of this API.4004101Bad requestThe tracking number already exists.4004102Bad requestThe tracking number does not exist. Please use the \"Create\" endpoint to add the number to the system first.4004103Bad requestYou have exceeded the creation quota of the API call. The maximum number per creation is 40 tracking numbers.4004110Bad requestThe tracking number (tracking_number)
does not conform to the rules.4004111Bad requestThe tracking number (tracking_number) is a required field.4004112Bad requestThe query ID is invalid.4004113Bad requestRetracking is not allowed. You can only retrack expired tracking numbers.4004120Bad requestThe value of the courier code (courier_code) is invalid.4004121Bad requestThe courier could not be recognized.4004122Bad requestA courier-specific field is missing or does not conform to the specification.4004130Bad requestThe format of the request parameters is invalid.4004160Bad requestThe air waybill number (awb_number) is required or must be in a valid format.4004161Bad requestTracking is not supported for this airline.4004165Bad requestQuery failed: the air waybill number has not been created.4004166Bad requestFailed to delete an air waybill number that has not been created.4004167Bad requestThe air waybill number already exists and does not need to be created again.4004190Bad requestThe current query quota is insufficient.401401UnauthorizedAuthentication failed or no permission. Please check and make sure your API key is correct.403403ForbiddenAccess forbidden. The request was rejected or access is not allowed.404404Not foundThe page does not exist. Please check and make sure your link is correct.429429Too many requestsExceeded the API
request limit, please try again later. Please check the API documentation to learn the limits of this API.500511Server errorServer error. Please contact us:service@51Tracking.org.500512Server errorServer error. Please contact us:service@51Tracking.org.500513Server errorServer error. Please contact us:service@51Tracking.org."} {"package": "520", "pacakge-description": "No description available on PyPI."} {"package": "533testgawain", "pacakge-description": "No description available on PyPI."} {"package": "5345345345345345", "pacakge-description": "No description available on PyPI."} {"package": "546lkji", "pacakge-description": "oss-dev output python module"} {"package": "564bff00ff-strawberry-graphql", "pacakge-description": "Strawberry GraphQLPython GraphQL library based on dataclassesInstallation ( Quick Start )The quick start method provides a server and CLI to get going quickly. Install\nwith:pipinstall\"strawberry-graphql[debug-server]\"Getting StartedCreate a file calledapp.pywith the following code:importstrawberry@strawberry.typeclassUser:name:strage:int@strawberry.typeclassQuery:@strawberry.fielddefuser(self)->User:returnUser(name=\"Patrick\",age=100)schema=strawberry.Schema(query=Query)This will create a GraphQL schema defining aUsertype and a single query\nfielduserthat will return a hardcoded user.To run the debug server run the following command:strawberryserverappOpen the debug server by clicking on the following link:http://0.0.0.0:8000/graphqlThis will open GraphiQL where you can test the API.Type-checkingStrawberry comes with amypyplugin that enables statically type-checking your\nGraphQL schema.
To enable it, add the following lines to yourmypy.iniconfiguration:[mypy]plugins=strawberry.ext.mypy_pluginDjango IntegrationA Django view is provided for adding a GraphQL endpoint to your application.Add the app to yourINSTALLED_APPS.INSTALLED_APPS=[...,# your other apps\"strawberry.django\",]Add the view to yoururls.pyfile.fromstrawberry.django.viewsimportGraphQLViewfrom.schemaimportschemaurlpatterns=[...,path(\"graphql\",GraphQLView.as_view(schema=schema)),]WebSocketsTo support graphql Subscriptions over WebSockets you need to provide a WebSocket\nenabled server. The debug server can be made to support WebSockets with these\ncommands:pipinstall'strawberry-graphql[debug-server]'pipinstall'uvicorn[standard]'ExamplesVarious examples on how to use StrawberryFull stack example using Starlette, SQLAlchemy, Typescript codegen and Next.jsQuart + Strawberry tutorialContributingWe usepoetryto manage dependencies, to\nget started follow these steps:gitclonehttps://github.com/strawberry-graphql/strawberrycdstrawberry\npoetryinstall\npoetryrunpytestThis will install all the dependencies (including dev ones) and run the tests.Pre commitWe have a configuration forpre-commit, to add the hook run the\nfollowing command:pre-commitinstallLinksProject homepage:https://strawberry.rocksRepository:https://github.com/strawberry-graphql/strawberryIssue tracker:https://github.com/strawberry-graphql/strawberry/issuesIn case of sensitive bugs like security vulnerabilities, please contactpatrick.arminio@gmail.comdirectly instead of using the issue tracker. We\nvalue your effort to improve the security and privacy of this project!LicensingThe code in this project is licensed under MIT license. 
See LICENSE for more information."} {"package": "56kyle-pychess", "pacakge-description": "Pychess

A chess library written in Python.

Pychess - Description - Installation - Usage - Game - Board - Move - Piece - Player - Square - Contributing - License

Installation

```bash
# Install from PyPI
pip install 56kyle-pychess

# Install from poetry
poetry add 56kyle-pychess
```

Description

The main purpose of this library is to try and practice constantly improving the quality of a codebase instead of allowing complexity to grow with time.

I was mainly inspired by the books \"Clean Code\" and \"Clean Coder\", both written by Robert C. Martin. Most of the code in this library is written with the principles of clean code in mind.

General Design Decisions

- The Board class is immutable. This means that every time a move is made, a new board is created. This is to prevent the board from being in an invalid state.
- Moves and most geometry-related classes are described in terms of Points and Lines.
- Almost all iterables are sets to allow for hash comparisons of various frozen-dataclass-based objects.

Simplifications

- The board may not be infinite
- The board must be a rectangle

Features

- API
- Game
- Board
- Move
- Piece
- Player
- Square
- Engine
- UCI
- GUI
- Documentation

Usage

Game: TODO
Board: TODO
Move: TODO
Piece: TODO
Player: TODO
Square: TODO"} {"package": "574d", "pacakge-description": "WMA Window Manager for Tkinter on Windows 10

Video Test
WM/Wiki

How to install

pip install 574d

How to test

from WM import TK
...
if __name__ == '__main__':
    Tk = TK()
    Tk.mainloop()

I developed a code pattern based on the Sword Art Online (SAO) series for Tkinter.

For example: System Call Generate Button Element Discharge! (\"Discharge\" is only a SAO reference.)

./WM/core.py

class Call(TkData, INHERIT):
    # Object-IDs are here.
    ...
    szTitle, szWindowClass = 'WM', 'WM'
    ...

class System(Call):
    # class TK(System, Call.Tk):
    Call = Call  # System.Call
    ...

./WM/views.py

class Element(E.Widget, E.PhotoImage, System, Call):  # type: ignore
    def __new__(cls, name: str = '', *_: E.Any, generic: bool = False, **__: E.Any):
        E = GENERATE.__dict__[name](*_, **__)
        if generic:
            del E._[-1]
        return E

class Generate(System, Call):
    def __init__(self):
        super(System, self).__init__()  # info when called
        ...
        # with the Element constructor
        Element('Frame', self.TK).grid(0, 0, 'nsew', padx=1, pady=1)({0: (1, 1), 1: (0, 1), 2: (1, 0)}).grid_remove()
        self.W['F'][-1].grid()

        # another way to do the same
        my_frame = ttk.Frame(self.TK)
        my_frame.grid(row=0, column=0, sticky='nsew', padx=1, pady=1)
        my_frame.grid_rowconfigure(0, weight=1)
        my_frame.grid_rowconfigure(2, weight=1)
        my_frame.grid_columnconfigure(0, weight=1)
        my_frame.grid_columnconfigure(1, weight=1)
        my_frame.grid_remove()
        my_frame.grid()"} {"package": "5an", "pacakge-description": "\ud83d\udce6 setup.py (for humans)

This repo exists to provide an example setup.py file, that can be used to bootstrap your next Python project. It includes some advanced patterns and best practices for setup.py, as well as some commented-out nice-to-haves.

For example, this setup.py provides a $ python setup.py upload command, which creates a universal wheel (and sdist) and uploads your package to PyPI using Twine, without the need for an annoying setup.cfg file. It also creates/uploads a new git tag, automatically.

In short, setup.py files can be daunting to approach, when first starting out \u2014 even Guido has been heard saying, \"everyone cargo cults thems\".
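The $ python setup.py upload command described above is typically implemented as a custom setuptools command. A minimal sketch could look like this (the class name and the exact build/twine invocations are assumptions for illustration, not copied from the linked repo):

```python
# A minimal sketch of a custom "upload" command for setup.py.
# The class name and the build/twine invocations below are assumptions
# for illustration; see the linked repo for the real implementation.
import os
import sys

from setuptools import Command


class UploadCommand(Command):
    """Support 'python setup.py upload'."""

    description = "Build the package and publish it to PyPI."
    user_options = []  # this command takes no options

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # Build a source distribution and a universal wheel, then
        # hand the artifacts in dist/ over to twine for upload.
        os.system(f"{sys.executable} setup.py sdist bdist_wheel --universal")
        os.system("twine upload dist/*")
```

Such a command is registered via setup(..., cmdclass={"upload": UploadCommand}), which is what makes $ python setup.py upload available.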
It's true \u2014 so, I want this repo to be the best place to copy\u2013paste from :)

Check out the example!

Installation

cd your_project

# Download the setup.py file:

# download with wget
wget https://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py -O setup.py

# download with curl
curl -O https://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py

To Do

- Tests via $ setup.py test (if it's concise).

Pull requests are encouraged!

More Resources

- What is setup.py? on Stack Overflow
- Official Python Packaging User Guide
- The Hitchhiker's Guide to Packaging
- Cookiecutter template for a Python package

License

This is free and unencumbered software released into the public domain.

Anyone is free to copy, modify, publish, use, compile, sell, or distribute this software, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means."} {"package": "5-exercise-upload-to-pypi", "pacakge-description": "No description available on PyPI."} {"package": "5G", "pacakge-description": "welcome to my package"} {"package": "5gasp-cli", "pacakge-description": "5GASP CLI

How to run

You can find the code inside the /5gasp-cli/src/ directory.
To list all CLI commands, run:

5gasp-cli --help

To list all parameters of a command, run:

5gasp-cli COMMAND --help

CLI Commands

List all tests from a test bed:

5gasp-cli list-testbeds

List all available tests:

5gasp-cli list-available-tests

Generate a testing descriptor:

5gasp-cli create-testing-descriptor

This command has the following options:

One or more NSDs (Network Service Descriptors) can be passed to infer connection point tags from, using the following command:

5gasp-cli create-testing-descriptor --infer-tags-from-nsd 

The path of the generated descriptor can be passed using:

5gasp-cli create-testing-descriptor --output-filepath 

NOTE: Both options can be used simultaneously.

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. https://fsf.org/
Everyone is permitted to copy
and distribute verbatim copies\nof this license document, but changing it is not allowed.PreambleThe GNU General Public License is a free, copyleft license for\nsoftware and other kinds of works.The licenses for most software and other practical works are designed\nto take away your freedom to share and change the works. By contrast,\nthe GNU General Public License is intended to guarantee your freedom to\nshare and change all versions of a program--to make sure it remains free\nsoftware for all its users. We, the Free Software Foundation, use the\nGNU General Public License for most of our software; it applies also to\nany other work released this way by its authors. You can apply it to\nyour programs, too.When we speak of free software, we are referring to freedom, not\nprice. Our General Public Licenses are designed to make sure that you\nhave the freedom to distribute copies of free software (and charge for\nthem if you wish), that you receive source code or can get it if you\nwant it, that you can change the software or use pieces of it in new\nfree programs, and that you know you can do these things.To protect your rights, we need to prevent others from denying you\nthese rights or asking you to surrender the rights. Therefore, you have\ncertain responsibilities if you distribute copies of the software, or if\nyou modify it: responsibilities to respect the freedom of others.For example, if you distribute copies of such a program, whether\ngratis or for a fee, you must pass on to the recipients the same\nfreedoms that you received. You must make sure that they, too, receive\nor can get the source code. 
And you must show them these terms so they\nknow their rights.Developers that use the GNU GPL protect your rights with two steps:\n(1) assert copyright on the software, and (2) offer you this License\ngiving you legal permission to copy, distribute and/or modify it.For the developers' and authors' protection, the GPL clearly explains\nthat there is no warranty for this free software. For both users' and\nauthors' sake, the GPL requires that modified versions be marked as\nchanged, so that their problems will not be attributed erroneously to\nauthors of previous versions.Some devices are designed to deny users access to install or run\nmodified versions of the software inside them, although the manufacturer\ncan do so. This is fundamentally incompatible with the aim of\nprotecting users' freedom to change the software. The systematic\npattern of such abuse occurs in the area of products for individuals to\nuse, which is precisely where it is most unacceptable. Therefore, we\nhave designed this version of the GPL to prohibit the practice for those\nproducts. If such problems arise substantially in other domains, we\nstand ready to extend this provision to those domains in future versions\nof the GPL, as needed to protect the freedom of users.Finally, every program is threatened constantly by software patents.\nStates should not allow patents to restrict development and use of\nsoftware on general-purpose computers, but in those that do, we wish to\navoid the special danger that patents applied to a free program could\nmake it effectively proprietary. 
To prevent this, the GPL assures that\npatents cannot be used to render the program non-free.The precise terms and conditions for copying, distribution and\nmodification follow.TERMS AND CONDITIONSDefinitions.\"This License\" refers to version 3 of the GNU General Public License.\"Copyright\" also means copyright-like laws that apply to other kinds of\nworks, such as semiconductor masks.\"The Program\" refers to any copyrightable work licensed under this\nLicense. Each licensee is addressed as \"you\". \"Licensees\" and\n\"recipients\" may be individuals or organizations.To \"modify\" a work means to copy from or adapt all or part of the work\nin a fashion requiring copyright permission, other than the making of an\nexact copy. The resulting work is called a \"modified version\" of the\nearlier work or a work \"based on\" the earlier work.A \"covered work\" means either the unmodified Program or a work based\non the Program.To \"propagate\" a work means to do anything with it that, without\npermission, would make you directly or secondarily liable for\ninfringement under applicable copyright law, except executing it on a\ncomputer or modifying a private copy. Propagation includes copying,\ndistribution (with or without modification), making available to the\npublic, and in some countries other activities as well.To \"convey\" a work means any kind of propagation that enables other\nparties to make or receive copies. Mere interaction with a user through\na computer network, with no transfer of a copy, is not conveying.An interactive user interface displays \"Appropriate Legal Notices\"\nto the extent that it includes a convenient and prominently visible\nfeature that (1) displays an appropriate copyright notice, and (2)\ntells the user that there is no warranty for the work (except to the\nextent that warranties are provided), that licensees may convey the\nwork under this License, and how to view a copy of this License. 
If\nthe interface presents a list of user commands or options, such as a\nmenu, a prominent item in the list meets this criterion.Source Code.The \"source code\" for a work means the preferred form of the work\nfor making modifications to it. \"Object code\" means any non-source\nform of a work.A \"Standard Interface\" means an interface that either is an official\nstandard defined by a recognized standards body, or, in the case of\ninterfaces specified for a particular programming language, one that\nis widely used among developers working in that language.The \"System Libraries\" of an executable work include anything, other\nthan the work as a whole, that (a) is included in the normal form of\npackaging a Major Component, but which is not part of that Major\nComponent, and (b) serves only to enable use of the work with that\nMajor Component, or to implement a Standard Interface for which an\nimplementation is available to the public in source code form. A\n\"Major Component\", in this context, means a major essential component\n(kernel, window system, and so on) of the specific operating system\n(if any) on which the executable work runs, or a compiler used to\nproduce the work, or an object code interpreter used to run it.The \"Corresponding Source\" for a work in object code form means all\nthe source code needed to generate, install, and (for an executable\nwork) run the object code and to modify the work, including scripts to\ncontrol those activities. However, it does not include the work's\nSystem Libraries, or general-purpose tools or generally available free\nprograms which are used unmodified in performing those activities but\nwhich are not part of the work. 
For example, Corresponding Source\nincludes interface definition files associated with source files for\nthe work, and the source code for shared libraries and dynamically\nlinked subprograms that the work is specifically designed to require,\nsuch as by intimate data communication or control flow between those\nsubprograms and other parts of the work.The Corresponding Source need not include anything that users\ncan regenerate automatically from other parts of the Corresponding\nSource.The Corresponding Source for a work in source code form is that\nsame work.Basic Permissions.All rights granted under this License are granted for the term of\ncopyright on the Program, and are irrevocable provided the stated\nconditions are met. This License explicitly affirms your unlimited\npermission to run the unmodified Program. The output from running a\ncovered work is covered by this License only if the output, given its\ncontent, constitutes a covered work. This License acknowledges your\nrights of fair use or other equivalent, as provided by copyright law.You may make, run and propagate covered works that you do not\nconvey, without conditions so long as your license otherwise remains\nin force. You may convey covered works to others for the sole purpose\nof having them make modifications exclusively for you, or provide you\nwith facilities for running those works, provided that you comply with\nthe terms of this License in conveying all material for which you do\nnot control copyright. Those thus making or running the covered works\nfor you must do so exclusively on your behalf, under your direction\nand control, on terms that prohibit them from making any copies of\nyour copyrighted material outside their relationship with you.Conveying under any other circumstances is permitted solely under\nthe conditions stated below. 
Sublicensing is not allowed; section 10\nmakes it unnecessary.Protecting Users' Legal Rights From Anti-Circumvention Law.No covered work shall be deemed part of an effective technological\nmeasure under any applicable law fulfilling obligations under article\n11 of the WIPO copyright treaty adopted on 20 December 1996, or\nsimilar laws prohibiting or restricting circumvention of such\nmeasures.When you convey a covered work, you waive any legal power to forbid\ncircumvention of technological measures to the extent such circumvention\nis effected by exercising rights under this License with respect to\nthe covered work, and you disclaim any intention to limit operation or\nmodification of the work as a means of enforcing, against the work's\nusers, your or third parties' legal rights to forbid circumvention of\ntechnological measures.Conveying Verbatim Copies.You may convey verbatim copies of the Program's source code as you\nreceive it, in any medium, provided that you conspicuously and\nappropriately publish on each copy an appropriate copyright notice;\nkeep intact all notices stating that this License and any\nnon-permissive terms added in accord with section 7 apply to the code;\nkeep intact all notices of the absence of any warranty; and give all\nrecipients a copy of this License along with the Program.You may charge any price or no price for each copy that you convey,\nand you may offer support or warranty protection for a fee.Conveying Modified Source Versions.You may convey a work based on the Program, or the modifications to\nproduce it from the Program, in the form of source code under the\nterms of section 4, provided that you also meet all of these conditions:a) The work must carry prominent notices stating that you modified\nit, and giving a relevant date.\n\nb) The work must carry prominent notices stating that it is\nreleased under this License and any conditions added under section\n7. 
This requirement modifies the requirement in section 4 to\n\"keep intact all notices\".\n\nc) You must license the entire work, as a whole, under this\nLicense to anyone who comes into possession of a copy. This\nLicense will therefore apply, along with any applicable section 7\nadditional terms, to the whole of the work, and all its parts,\nregardless of how they are packaged. This License gives no\npermission to license the work in any other way, but it does not\ninvalidate such permission if you have separately received it.\n\nd) If the work has interactive user interfaces, each must display\nAppropriate Legal Notices; however, if the Program has interactive\ninterfaces that do not display Appropriate Legal Notices, your\nwork need not make them do so.A compilation of a covered work with other separate and independent\nworks, which are not by their nature extensions of the covered work,\nand which are not combined with it such as to form a larger program,\nin or on a volume of a storage or distribution medium, is called an\n\"aggregate\" if the compilation and its resulting copyright are not\nused to limit the access or legal rights of the compilation's users\nbeyond what the individual works permit. 
Inclusion of a covered work\nin an aggregate does not cause this License to apply to the other\nparts of the aggregate.Conveying Non-Source Forms.You may convey a covered work in object code form under the terms\nof sections 4 and 5, provided that you also convey the\nmachine-readable Corresponding Source under the terms of this License,\nin one of these ways:a) Convey the object code in, or embodied in, a physical product\n(including a physical distribution medium), accompanied by the\nCorresponding Source fixed on a durable physical medium\ncustomarily used for software interchange.\n\nb) Convey the object code in, or embodied in, a physical product\n(including a physical distribution medium), accompanied by a\nwritten offer, valid for at least three years and valid for as\nlong as you offer spare parts or customer support for that product\nmodel, to give anyone who possesses the object code either (1) a\ncopy of the Corresponding Source for all the software in the\nproduct that is covered by this License, on a durable physical\nmedium customarily used for software interchange, for a price no\nmore than your reasonable cost of physically performing this\nconveying of source, or (2) access to copy the\nCorresponding Source from a network server at no charge.\n\nc) Convey individual copies of the object code with a copy of the\nwritten offer to provide the Corresponding Source. This\nalternative is allowed only occasionally and noncommercially, and\nonly if you received the object code with such an offer, in accord\nwith subsection 6b.\n\nd) Convey the object code by offering access from a designated\nplace (gratis or for a charge), and offer equivalent access to the\nCorresponding Source in the same way through the same place at no\nfurther charge. You need not require recipients to copy the\nCorresponding Source along with the object code. 
If the place to\ncopy the object code is a network server, the Corresponding Source\nmay be on a different server (operated by you or a third party)\nthat supports equivalent copying facilities, provided you maintain\nclear directions next to the object code saying where to find the\nCorresponding Source. Regardless of what server hosts the\nCorresponding Source, you remain obligated to ensure that it is\navailable for as long as needed to satisfy these requirements.\n\ne) Convey the object code using peer-to-peer transmission, provided\nyou inform other peers where the object code and Corresponding\nSource of the work are being offered to the general public at no\ncharge under subsection 6d.A separable portion of the object code, whose source code is excluded\nfrom the Corresponding Source as a System Library, need not be\nincluded in conveying the object code work.A \"User Product\" is either (1) a \"consumer product\", which means any\ntangible personal property which is normally used for personal, family,\nor household purposes, or (2) anything designed or sold for incorporation\ninto a dwelling. In determining whether a product is a consumer product,\ndoubtful cases shall be resolved in favor of coverage. For a particular\nproduct received by a particular user, \"normally used\" refers to a\ntypical or common use of that class of product, regardless of the status\nof the particular user or of the way in which the particular user\nactually uses, or expects or is expected to use, the product. A product\nis a consumer product regardless of whether the product has substantial\ncommercial, industrial or non-consumer uses, unless such uses represent\nthe only significant mode of use of the product.\"Installation Information\" for a User Product means any methods,\nprocedures, authorization keys, or other information required to install\nand execute modified versions of a covered work in that User Product from\na modified version of its Corresponding Source. 
The information must\nsuffice to ensure that the continued functioning of the modified object\ncode is in no case prevented or interfered with solely because\nmodification has been made.If you convey an object code work under this section in, or with, or\nspecifically for use in, a User Product, and the conveying occurs as\npart of a transaction in which the right of possession and use of the\nUser Product is transferred to the recipient in perpetuity or for a\nfixed term (regardless of how the transaction is characterized), the\nCorresponding Source conveyed under this section must be accompanied\nby the Installation Information. But this requirement does not apply\nif neither you nor any third party retains the ability to install\nmodified object code on the User Product (for example, the work has\nbeen installed in ROM).The requirement to provide Installation Information does not include a\nrequirement to continue to provide support service, warranty, or updates\nfor a work that has been modified or installed by the recipient, or for\nthe User Product in which it has been modified or installed. Access to a\nnetwork may be denied when the modification itself materially and\nadversely affects the operation of the network or violates the rules and\nprotocols for communication across the network.Corresponding Source conveyed, and Installation Information provided,\nin accord with this section must be in a format that is publicly\ndocumented (and with an implementation available to the public in\nsource code form), and must require no special password or key for\nunpacking, reading or copying.Additional Terms.\"Additional permissions\" are terms that supplement the terms of this\nLicense by making exceptions from one or more of its conditions.\nAdditional permissions that are applicable to the entire Program shall\nbe treated as though they were included in this License, to the extent\nthat they are valid under applicable law. 
If additional permissions\napply only to part of the Program, that part may be used separately\nunder those permissions, but the entire Program remains governed by\nthis License without regard to the additional permissions.When you convey a copy of a covered work, you may at your option\nremove any additional permissions from that copy, or from any part of\nit. (Additional permissions may be written to require their own\nremoval in certain cases when you modify the work.) You may place\nadditional permissions on material, added by you to a covered work,\nfor which you have or can give appropriate copyright permission.Notwithstanding any other provision of this License, for material you\nadd to a covered work, you may (if authorized by the copyright holders of\nthat material) supplement the terms of this License with terms:a) Disclaiming warranty or limiting liability differently from the\nterms of sections 15 and 16 of this License; or\n\nb) Requiring preservation of specified reasonable legal notices or\nauthor attributions in that material or in the Appropriate Legal\nNotices displayed by works containing it; or\n\nc) Prohibiting misrepresentation of the origin of that material, or\nrequiring that modified versions of such material be marked in\nreasonable ways as different from the original version; or\n\nd) Limiting the use for publicity purposes of names of licensors or\nauthors of the material; or\n\ne) Declining to grant rights under trademark law for use of some\ntrade names, trademarks, or service marks; or\n\nf) Requiring indemnification of licensors and authors of that\nmaterial by anyone who conveys the material (or modified versions of\nit) with contractual assumptions of liability to the recipient, for\nany liability that these contractual assumptions directly impose on\nthose licensors and authors.All other non-permissive additional terms are considered \"further\nrestrictions\" within the meaning of section 10. 
If the Program as you\nreceived it, or any part of it, contains a notice stating that it is\ngoverned by this License along with a term that is a further\nrestriction, you may remove that term. If a license document contains\na further restriction but permits relicensing or conveying under this\nLicense, you may add to a covered work material governed by the terms\nof that license document, provided that the further restriction does\nnot survive such relicensing or conveying.If you add terms to a covered work in accord with this section, you\nmust place, in the relevant source files, a statement of the\nadditional terms that apply to those files, or a notice indicating\nwhere to find the applicable terms.Additional terms, permissive or non-permissive, may be stated in the\nform of a separately written license, or stated as exceptions;\nthe above requirements apply either way.Termination.You may not propagate or modify a covered work except as expressly\nprovided under this License. Any attempt otherwise to propagate or\nmodify it is void, and will automatically terminate your rights under\nthis License (including any patent licenses granted under the third\nparagraph of section 11).However, if you cease all violation of this License, then your\nlicense from a particular copyright holder is reinstated (a)\nprovisionally, unless and until the copyright holder explicitly and\nfinally terminates your license, and (b) permanently, if the copyright\nholder fails to notify you of the violation by some reasonable means\nprior to 60 days after the cessation.Moreover, your license from a particular copyright holder is\nreinstated permanently if the copyright holder notifies you of the\nviolation by some reasonable means, this is the first time you have\nreceived notice of violation of this License (for any work) from that\ncopyright holder, and you cure the violation prior to 30 days after\nyour receipt of the notice.Termination of your rights under this section does not 
terminate the\nlicenses of parties who have received copies or rights from you under\nthis License. If your rights have been terminated and not permanently\nreinstated, you do not qualify to receive new licenses for the same\nmaterial under section 10.Acceptance Not Required for Having Copies.You are not required to accept this License in order to receive or\nrun a copy of the Program. Ancillary propagation of a covered work\noccurring solely as a consequence of using peer-to-peer transmission\nto receive a copy likewise does not require acceptance. However,\nnothing other than this License grants you permission to propagate or\nmodify any covered work. These actions infringe copyright if you do\nnot accept this License. Therefore, by modifying or propagating a\ncovered work, you indicate your acceptance of this License to do so.Automatic Licensing of Downstream Recipients.Each time you convey a covered work, the recipient automatically\nreceives a license from the original licensors, to run, modify and\npropagate that work, subject to this License. You are not responsible\nfor enforcing compliance by third parties with this License.An \"entity transaction\" is a transaction transferring control of an\norganization, or substantially all assets of one, or subdividing an\norganization, or merging organizations. If propagation of a covered\nwork results from an entity transaction, each party to that\ntransaction who receives a copy of the work also receives whatever\nlicenses to the work the party's predecessor in interest had or could\ngive under the previous paragraph, plus a right to possession of the\nCorresponding Source of the work from the predecessor in interest, if\nthe predecessor has it or can get it with reasonable efforts.You may not impose any further restrictions on the exercise of the\nrights granted or affirmed under this License. 
For example, you may\nnot impose a license fee, royalty, or other charge for exercise of\nrights granted under this License, and you may not initiate litigation\n(including a cross-claim or counterclaim in a lawsuit) alleging that\nany patent claim is infringed by making, using, selling, offering for\nsale, or importing the Program or any portion of it.Patents.A \"contributor\" is a copyright holder who authorizes use under this\nLicense of the Program or a work on which the Program is based. The\nwork thus licensed is called the contributor's \"contributor version\".A contributor's \"essential patent claims\" are all patent claims\nowned or controlled by the contributor, whether already acquired or\nhereafter acquired, that would be infringed by some manner, permitted\nby this License, of making, using, or selling its contributor version,\nbut do not include claims that would be infringed only as a\nconsequence of further modification of the contributor version. For\npurposes of this definition, \"control\" includes the right to grant\npatent sublicenses in a manner consistent with the requirements of\nthis License.Each contributor grants you a non-exclusive, worldwide, royalty-free\npatent license under the contributor's essential patent claims, to\nmake, use, sell, offer for sale, import and otherwise run, modify and\npropagate the contents of its contributor version.In the following three paragraphs, a \"patent license\" is any express\nagreement or commitment, however denominated, not to enforce a patent\n(such as an express permission to practice a patent or covenant not to\nsue for patent infringement). 
To \"grant\" such a patent license to a\nparty means to make such an agreement or commitment not to enforce a\npatent against the party.If you convey a covered work, knowingly relying on a patent license,\nand the Corresponding Source of the work is not available for anyone\nto copy, free of charge and under the terms of this License, through a\npublicly available network server or other readily accessible means,\nthen you must either (1) cause the Corresponding Source to be so\navailable, or (2) arrange to deprive yourself of the benefit of the\npatent license for this particular work, or (3) arrange, in a manner\nconsistent with the requirements of this License, to extend the patent\nlicense to downstream recipients. \"Knowingly relying\" means you have\nactual knowledge that, but for the patent license, your conveying the\ncovered work in a country, or your recipient's use of the covered work\nin a country, would infringe one or more identifiable patents in that\ncountry that you have reason to believe are valid.If, pursuant to or in connection with a single transaction or\narrangement, you convey, or propagate by procuring conveyance of, a\ncovered work, and grant a patent license to some of the parties\nreceiving the covered work authorizing them to use, propagate, modify\nor convey a specific copy of the covered work, then the patent license\nyou grant is automatically extended to all recipients of the covered\nwork and works based on it.A patent license is \"discriminatory\" if it does not include within\nthe scope of its coverage, prohibits the exercise of, or is\nconditioned on the non-exercise of one or more of the rights that are\nspecifically granted under this License. 
You may not convey a covered\nwork if you are a party to an arrangement with a third party that is\nin the business of distributing software, under which you make payment\nto the third party based on the extent of your activity of conveying\nthe work, and under which the third party grants, to any of the\nparties who would receive the covered work from you, a discriminatory\npatent license (a) in connection with copies of the covered work\nconveyed by you (or copies made from those copies), or (b) primarily\nfor and in connection with specific products or compilations that\ncontain the covered work, unless you entered into that arrangement,\nor that patent license was granted, prior to 28 March 2007.Nothing in this License shall be construed as excluding or limiting\nany implied license or other defenses to infringement that may\notherwise be available to you under applicable patent law.No Surrender of Others' Freedom.If conditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot convey a\ncovered work so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you may\nnot convey it at all. For example, if you agree to terms that obligate you\nto collect a royalty for further conveying from those to whom you convey\nthe Program, the only way you could satisfy both those terms and this\nLicense would be to refrain entirely from conveying the Program.Use with the GNU Affero General Public License.Notwithstanding any other provision of this License, you have\npermission to link or combine any covered work with a work licensed\nunder version 3 of the GNU Affero General Public License into a single\ncombined work, and to convey the resulting work. 
The terms of this\nLicense will continue to apply to the part which is the covered work,\nbut the special requirements of the GNU Affero General Public License,\nsection 13, concerning interaction through a network will apply to the\ncombination as such. Revised Versions of this License. The Free Software Foundation may publish revised and/or new versions of\nthe GNU General Public License from time to time. Such new versions will\nbe similar in spirit to the present version, but may differ in detail to\naddress new problems or concerns. Each version is given a distinguishing version number. If the\nProgram specifies that a certain numbered version of the GNU General\nPublic License \"or any later version\" applies to it, you have the\noption of following the terms and conditions either of that numbered\nversion or of any later version published by the Free Software\nFoundation. If the Program does not specify a version number of the\nGNU General Public License, you may choose any version ever published\nby the Free Software Foundation. If the Program specifies that a proxy can decide which future\nversions of the GNU General Public License can be used, that proxy's\npublic statement of acceptance of a version permanently authorizes you\nto choose that version for the Program. Later license versions may give you additional or different\npermissions. However, no additional obligations are imposed on any\nauthor or copyright holder as a result of your choosing to follow a\nlater version. Disclaimer of Warranty. THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY\nAPPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT\nHOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY\nOF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,\nTHE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM\nIS WITH YOU.
SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF\nALL NECESSARY SERVICING, REPAIR OR CORRECTION. Limitation of Liability. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\nWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS\nTHE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY\nGENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE\nUSE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF\nDATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD\nPARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),\nEVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF\nSUCH DAMAGES. Interpretation of Sections 15 and 16. If the disclaimer of warranty and limitation of liability provided\nabove cannot be given local legal effect according to their terms,\nreviewing courts shall apply local law that most closely approximates\nan absolute waiver of all civil liability in connection with the\nProgram, unless a warranty or assumption of liability accompanies a\ncopy of the Program in return for a fee. END OF TERMS AND CONDITIONS\n\n How to Apply These Terms to Your New Programs\nIf you develop a new program, and you want it to be of the greatest\npossible use to the public, the best way to achieve this is to make it\nfree software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program.
It is safest\nto attach them to the start of each source file to most effectively\nstate the exclusion of warranty; and each file should have at least\nthe \"copyright\" line and a pointer to where the full notice is found.\nCopyright (C) \n\nThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program. If not, see . Also add information on how to contact you by electronic and paper mail. If the program does terminal interaction, make it output a short\nnotice like this when it starts in an interactive mode: Copyright (C) \nThis program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\nThis is free software, and you are welcome to redistribute it\nunder certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate\nparts of the General Public License. Of course, your program's commands\nmight be different; for a GUI interface, you would use an \"about box\". You should also get your employer (if you work as a programmer) or school,\nif any, to sign a \"copyright disclaimer\" for the program, if necessary.\nFor more information on this, and how to apply and follow the GNU GPL, see https://www.gnu.org/licenses/. The GNU General Public License does not permit incorporating your program\ninto proprietary programs. If your program is a subroutine library, you\nmay consider it more useful to permit linking proprietary applications with\nthe library.
If this is what you want to do, use the GNU Lesser General\nPublic License instead of this License. But first, please read https://www.gnu.org/licenses/why-not-lgpl.html."} {"package": "5g-core-common-schemas", "pacakge-description": "5g-core-common-schemas: Common schemas used in 5G core network services. Usage: Install the package: pip install 5g-core-common-schemas Use the library in your project:\nfrom fiveg_core_common_schemas.NfInstanceId import NfInstanceId\nmy_instance = NfInstanceId(\"banana\")"} {"package": "5kodds-distribution", "pacakge-description": "No description available on PyPI."} {"package": "5minute", "pacakge-description": "Give me an instance of my image on OpenStack. Hurry! QuickStart: To run 5minute you need to install the following libs:\npython-keystoneclient\npython-cinderclient\npython-heatclient\npython-neutronclient\npython-novaclient\npython-xmltodict\npython-prettytable\nTo install them from RPMs (Fedora), please do dnf -y install $(cat requirement-rpms.txt). If you have installed 5minute using pip, they were installed as\ndependencies.
Otherwise, you have to install them manually. Get config file: Log in to your OpenStack instance WebUI. Navigate to Access & Security -> API Access. Save the file from \u201cDownload OpenStack RC File\u201d to ~/.5minute/config. Get started: Show help: $ 5minute help Upload your SSH public key: $ 5minute key ~/.ssh/id_rsa.pub Show images we can work with: $ 5minute images Boot your machine (consider adding \u2018\u2013name\u2019 or \u2018\u2013flavor\u2019 to the\ncommand): $ 5minute boot When the boot is finished, you should be able to ssh to your new machine: $ ssh root@ You can list your current machines: $ 5minute list When you are done, kill the machine (you can do this via the OpenStack WebUI\nas well): $ 5minute delete To list available OpenStack scenarios: $ 5minute scenario templates Run scenario: $ 5minute scenario boot When finished with the scenario, you should delete it: $ 5minute scenario delete "} {"package": "5o4drel5mk", "pacakge-description": "UNKNOWN"} {"package": "5paisa-modular", "pacakge-description": "5Paisa Algo trade deployment modules"} {"package": "5-Rakoto031-upload-to-pypi", "pacakge-description": "No description available on PyPI."} {"package": "5sim-python", "pacakge-description": "FiveSim: A simple Python API for 5sim.net. Installation: Before proceeding, you should register an account on 5sim.net and generate a personal API key to use. Install from source:\npip install git+https://github.com/squirrelpython/5sim-python.git\nAlternatively, install from PyPI:\npip install 5sim-python\nClient:\nfrom fivesim import FiveSim\n# These example values won't work.
You must get your own api_key\nAPI_KEY='ey.............'\nclient=FiveSim(API_KEY)\nEndpoints: Official docs here. User:\n# Balance request\nclient.get_balance() # Provides profile data: email, balance and rating.\nProducts and prices:\n# Products request\nclient.product_requests(country='russia',product='telegram') # To receive the name, the price, quantity of all products available to buy.\n# Prices request\nclient.price_requests() # Returns product prices\n# Prices by country\nclient.price_requests_by_country(country='russia') # Returns product prices by country\n# Prices by product\nclient.price_requests_by_product(product='telegram') # Returns product prices by product\n# Prices by country and product\nclient.price_requests_by_country_and_product(country='russia',product='telegram') # Returns product prices by country and specific product\nPurchase:\n# Buy activation number\nclient.buy_number(country='russia',operator='any',product='telegram') # Buy new activation number\n# Buy hosting number\nclient.buy_hosting_number(country='russia',operator='any',product='amazon') # Buy new hosting number\n# Re-buy number\nclient.rebuy_number(product='telegram',number='7485.....') # Re-buy number\nOrder management:\n# Check order (Get SMS)\nclient.check_order(order_id='12345678') # Check the SMS was received\n# Finish order\nclient.finish_order(order_id='12345678') # Finish the order after the code is received\n# Cancel order\nclient.cancel_order(order_id='12345678') # Cancel the order\n# Ban order\nclient.ban_order(order_id='12345678') # Cancel the order if banned from the service\n# SMS inbox list\nclient.sms_inbox_list(order_id='12345678') # Get SMS inbox list by order's id.\nPowered by SquirrelPython."} {"package": "5-Stars-Contact-Book", "pacakge-description": "A contact book with built-in bot to help you operate with your contacts"} {"package": "6", "pacakge-description": "No description available on PyPI."} {"package": "61bo", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "650-auto-comp-jaewon", "pacakge-description": "No description available on PyPI."} {"package": "652ga", "pacakge-description": "UNKNOWN"} {"package": "666", "pacakge-description": "No description available on PyPI."} {"package": "667bot", "pacakge-description": "No description available on PyPI."} {"package": "676", "pacakge-description": "welcome to my package"} {"package": "69", "pacakge-description": "# 69"} {"package": "69696969696969696969696969696969696969696969696969696969696969696969696969696", "pacakge-description": "welcome to my package"} {"package": "6D657461666C6F77", "pacakge-description": "No description available on PyPI."} {"package": "6du.tv", "pacakge-description": "No description available on PyPI."} {"package": "6estates-idp", "pacakge-description": "Read more on https://idp-sdk-doc.6estates.com/python/"} {"package": "6-rosdep", "pacakge-description": "No description available on PyPI."} {"package": "6s-bin", "pacakge-description": "6S Binaries: This distribution provides access to compiled binaries of the 6S Radiative Transfer Model as package resources. It does not provide a Python interface to 6S. For a Python interface, see Robin Wilson's Py6S. Currently, this project includes binaries for 6SV1.1 and 6SV2.1. It requires Python 3.9+ and supports Linux, macOS, and Windows. Install: Pre-compiled wheels can be installed from PyPI: $ pip install 6s-bin If you are using poetry, you can add this distribution as a dependency using poetry add: $ poetry add 6s-bin Installing from source: Building this distribution involves downloading, validating, and compiling the 6S source code. See build.py for details about the build process.
A Fortran 77 compiler is required to compile 6S. Build and install from source distribution: $ pip install --no-binary=6s-bin 6s-bin Build and install from git: $ pip install '6s-bin @ git+https://github.com/brianschubert/6s-bin' Build and install from local source tree: $ pip install . Python Usage: Call sixs_bin.get_path(version) to get the path to an installed 6S binary. The parameter version is required, and must be either the string \"1.1\" or \"2.1\".\n>>> import sixs_bin\n>>> sixs_bin.get_path(\"1.1\")\nPosixPath('/lib/python3.X/site-packages/sixs_bin/sixsV1.1')\n>>> sixs_bin.get_path(\"2.1\")\nPosixPath('/lib/python3.X/site-packages/sixs_bin/sixsV2.1')\nIf you also have Py6S installed, you can call sixs_bin.make_wrapper() to get a Py6S.SixS instance that's configured to use the installed 6SV1.1 binary.\n>>> wrapper = sixs_bin.make_wrapper()\n>>> wrapper\n>>> wrapper.sixs_path\nPosixPath('/lib/python3.X/site-packages/sixs_bin/sixsV1.1')\n>>> wrapper.run()\n>>> wrapper.outputs.apparent_radiance\n134.632\nCommand Line Usage: Run python3 -m sixs_bin --help to see all available command line options.\n$ python3 -m sixs_bin --help\nusage: python3 -m sixs_bin [-h] [--version]\n [--path {1.1,2.1} | --exec {1.1,2.1} | --test-wrapper]\n\n6S v1.1 and 6S v2.1 binaries provided as package resources.\n\noptional arguments:\n -h, --help show this help message and exit\n --version show program's version number and exit\n\ncommand:\n --path {1.1,2.1} Print the path to the specified 6S executable from this package's\n resources.\n --exec {1.1,2.1} Execute the specified 6S executable in a subprocess, inheriting stdin and\n stdout. This option is provided as a convenience, but it's not\n generally recommended. Running 6S using this option is around 5%\n slower than executing the binary directly, due to the overhead of\n starting the Python interpreter and subprocess.\n --test-wrapper Run Py6S.SixS.test on this package's 6SV1.1 executable.\nTo get the path to an installed 6S binary, run sixs_bin as an executable module with the --path flag specified.
The --path flag takes one required argument, which must be either the string 1.1 or 2.1:\n$ python3 -m sixs_bin --path 2.1\n/lib/python3.X/site-packages/sixs_bin/sixsV2.1\nIf you need the path to the containing directory, use dirname. For example:\n$ SIXS_DIR=$(dirname $(python3 -m sixs_bin --path 2.1))\n$ echo $SIXS_DIR\n/lib/python3.X/site-packages/sixs_bin\nTest: Tests can be run using pytest: $ pytest Some tests are included to check compatibility with Robin Wilson's Py6S wrapper. These tests will be ignored if Py6S is not available."} {"package": "7", "pacakge-description": "Seven. Team: Carlos Abraham"} {"package": "73e4d8e848405a88f444cff1c9dbc5b8", "pacakge-description": "Still Confidential"} {"package": "73.portlet.links", "pacakge-description": "This product is based on collective.portlet.links, but it\u2019s possible to set the title of the portlet."} {"package": "73.unlockItems", "pacakge-description": "Introduction: This product is for unlocking web_dav locked items in a Plone portal.\nNeeds simplejson if the Python version is less than 2.6.\nrelease 0.3: no need for simplejson any more :-)\nCode is clearer and faster.\nChangelog\n=========0.1dev (unreleased): Initial release. 0.2 (unreleased): add browserview in plone_control_panel. 0.3 (released): remove jquery used (javascript and python); code is clearer and faster; make only one search in catalog."} {"package": "777", "pacakge-description": "welcome to my package"} {"package": "77mr", "pacakge-description": "amyTools. Introduction: a lightweight branch tool. Installation tutorial: install the latest Python environment, then pip3 install mr. Usage: Command:\nmr\nCreates a remote branch based on the current branch and opens a merge request; the target branch is the current branch."} {"package": "77tool", "pacakge-description": "amyTools. Introduction: a lightweight 77 tool. Installation tutorial: install the latest Python environment, then pip3 install
77tool. Usage: Command:\n77tool mr\nCreates a remote branch based on the current branch and opens a merge request; the target branch is the current branch."} {"package": "7asiba", "pacakge-description": "No description available on PyPI."} {"package": "7bridges", "pacakge-description": "No description available on PyPI."} {"package": "7d-demand-bundles", "pacakge-description": "No description available on PyPI."} {"package": "7i96", "pacakge-description": "7i96: 7i96 Configuration Tool. Scope: Read in the ini configuration file for changes. Create a complete configuration from scratch. Depends on python3-pyqt5: sudo apt-get install python3-pyqt5 Open the sample ini file, then make changes as needed, then build. You can create a configuration then run it with the Axis GUI and use\nMachine > Calibration to tune each axis. Save the values to the ini file and\nnext time you run the 7i96 Configuration Tool it will read the values from the\nini file."} {"package": "7lk_ocr_deploy", "pacakge-description": "Some deploy packages for OCR."} {"package": "7pack", "pacakge-description": "7pack. What is 7pack? 7pack is not a replacement for pip. Instead, it works with pip. 7pack remedies some issues with pip, like the difficulty in installing all packages. In addition, it has a large repository, including nearly all of the most popular pip packages. How to install: 7pack is a little different from pip. First, you enter the packages. Then, at the end you put flags. To install, put --install. For example, to install urllib3, the most popular pip package, you would run the command 7pack urllib3 --install How to uninstall: Uninstalling with 7pack is like installing, except the flag must be changed to --uninstall. To uninstall urllib3, you would run the command 7pack urllib3 --uninstall List packages: Since 7pack requires packages, you need to enter any random string in 7pack and apply the --list flag.
When I list packages I use 7pack none --list Upgrade packages: Upgrading is just like installing and uninstalling. Just apply the --upgrade flag. To upgrade urllib3, run the command 7pack urllib3 --upgrade. Install/uninstall all: This is the main reason 7pack was created. To uninstall/upgrade all, put one package, called \"all\". To upgrade all, run the command 7pack all --upgrade, and to uninstall all, run the command 7pack all --uninstall"} {"package": "7Q_nester", "pacakge-description": "UNKNOWN"} {"package": "7seg-ByteBird", "pacakge-description": "ByteBird: A simple \"Flappy Bird\"-like game for the 7-segment display of the ZeroSeg module for Raspberry Pi. Installation: $ pip3 install --user 7seg-ByteBird Usage: $ python3 -m ByteBird or simply $ bytebird * Optional: time delay between updates - difficulty level; less = harder."} {"package": "7th", "pacakge-description": "UNKNOWN"} {"package": "7uring", "pacakge-description": "An advanced cryptography tool for hashing, encrypting, encoding, steganography and more."} {"package": "7Wonder-RL-Lib", "pacakge-description": "7Wonder-RL-Lib: A library providing an environment for testing reinforcement learning in the 7 Wonders game. Overview: There are multiple environments for AI game testing. However, the environments implemented now mostly cover only traditional board games (Go, Chess, etc.) or 52-card based card games (Poker, Rummy, etc.) where games do not really have interactions with other players. Most Euro-style board games are good environments to test algorithms on, as there are many aspects to explore, such as trading, dealing with imperfect information, stochastic elements, etc. The 7 Wonders board game introduces multiple elements mentioned above which are good for testing out new algorithms.
This library will cover basic game systems and allow users to customize the environments with custom state space and rewarding systems. Installation: To install the gym environment, run\nmake develop\nmake build\nmake install\nUsage: Example code of how to declare the gym environment is displayed below:\nimport gym\nimport SevenWonEnv\nfrom SevenWonEnv.envs.mainGameEnv import Personality\n\nenv = gym.make(\"SevenWonderEnv\", player=4) # Declare environment with 4 players\nTo use one of the given Personality classes (RandomAI, RuleBasedAI, DQNAI, Human), use setPersonality(personalityList):\npersonalityList = []\n personalityList.append(Personality.DQNAI)\n for i in range(1, 4):\n personalityList.append(Personality.RandomAI)\n env.setPersonality(personalityList)\nTo run the game each step: stateList = env.step(None) The variable stateList consists of n 4-tuples, depending on the number of players. Each tuple is (new_state, reward, done, info). To add a custom model, change the SevenWondersEnv/SevenWonEnv/envs/mainGameEnv/Personality.py file.\nEach personality has 2 main functions, __init__ and make_choice. For example, RandomAI takes all possible choices and randomly chooses one:\nclass RandomAI(Personality):\n def __init__(self):\n super().__init__()\n\n def make_choice(self, player, age, options):\n return random.choice(range(len(options)))"} {"package": "7xydothis", "pacakge-description": "Utilities to use the 7xydothis APIs"} {"package": "8", "pacakge-description": "No description available on PyPI."} {"package": "83-numbers", "pacakge-description": "A simple calculator for testing the package lifecycle"} {"package": "88", "pacakge-description": "No description available on PyPI."} {"package": "888", "pacakge-description": "No description available on PyPI."} {"package": "88AB6720D79B4CBD93CAB7180920D89C", "pacakge-description": "No description available on PyPI."} {"package": "88orm", "pacakge-description": "88 ORM Services Connector: This is a simple example package for connecting to ORM
Service\nto help you guys."} {"package": "88orm-service-connector", "pacakge-description": "88 ORM Services Connector: This is a simple example package for connecting to ORM Service\nto help you guys."} {"package": "88rest", "pacakge-description": "No description available on PyPI."} {"package": "8a-scraper", "pacakge-description": "8a_scraper: 8a is a great resource to aggregate statistics on sport climbers and boulderers. They recently deployed a new version of their website that rendered all the prior scrapers obsolete. This tool allows 8a users to scrape content from the website using their username, password, Selenium, and BeautifulSoup. Installing: Via pip: Install using the following command: pip install 8a-scraper The latest version is 0.0.3. If you have already installed the package, please update it using the following command: pip install 8a-scraper --upgrade Via GitHub: Alternatively, you can just clone this repo and import the libraries at your own discretion. Usage: This package requires the user to also install Google Chrome and ChromeDriver.\nPlease ensure that the version of ChromeDriver installed matches your current version of Google Chrome. You can check your current version of Google Chrome by opening Chrome and checking the 'About Google Chrome' panel. Ensure that the chromedriver executable is in your $PATH variable as well. The user must also have an email and password that can be used to log into 8a. Additionally, the user must set the following environment variables with their current login info:\n_8A_USERNAME='<8a email>'\n_8A_PASSWORD='<8a password>'\nThese variables are accessed using os.getenv().
These can be set in the .bash_profile file on macOS or by 'Editing the system environment variables' on Windows. API: Currently, the package only contains 2 modules: users and ascents.\nThe package will be expanding to include other content as well, but this is a start. For more information about the API, please refer to the full documentation"} {"package": "8ball-cole-wilson-pycon-demo", "pacakge-description": "8ball: 8 ball for the terminal. Contributors: Cole Wilson. Contact: cole@colewilson.xyz"} {"package": "8ball-pycon-demo", "pacakge-description": "8Ball: 8 Ball for the terminal. Contributors: Cole Wilson. Contact: cole@colewilson.xyz"} {"package": "8ball-pycon-demo-cole-wilson", "pacakge-description": "8Ball: 8 ball for the terminal. Contributors: Cole Wilson. Contact: cole@colewilson.xyz"} {"package": "8EXBYPINA", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "8puzz", "pacakge-description": "No description available on PyPI."} {"package": "8q", "pacakge-description": ""} {"package": "8qe", "pacakge-description": ""} {"package": "9", "pacakge-description": "No description available on PyPI."} {"package": "908dist", "pacakge-description": "UNKNOWN"} {"package": "91act-platform", "pacakge-description": "The author is a real lazy dude!"} {"package": "91tvspider", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "98ce1b5f404a2dfa601423fb0f12c09b98ce1b5f404a2dfa601423fb0f12c09b", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} {"package": "98ce1b5f404a2dfa601423fb0f12c09b98ce1b5f404a2dfa601423fb0f12c09bmain", "pacakge-description": "Failed to fetch description.
HTTP Status Code: 404"} {"package": "99", "pacakge-description": "No description available on PyPI."} {"package": "999", "pacakge-description": "No description available on PyPI."} {"package": "99d4aa80-d846-424f-873b-a02c7215fc54", "pacakge-description": "Test GitHub Actions: The aim of this repository is to test out GitHub Actions and workflows."} {"package": "9ja", "pacakge-description": "This library will help you get information about 9ja. Change Log: 0.0.1 (22/06/2022): Initial codebase."}