optparse - Parser for command line options

Source code: Lib/optparse.py

Deprecated since version 3.2: The optparse module is deprecated and will not be developed further; development will continue with the argparse module.

optparse is a more convenient, flexible, and powerful library for parsing command-line options than the old getopt module. optparse uses a more declarative style of command-line parsing: you create an instance of OptionParser, populate it with options, and parse the command line. optparse allows users to specify options in the conventional GNU/POSIX syntax, and additionally generates usage and help messages for you.

Here's an example of using optparse in a simple script:

    from optparse import OptionParser
    ...
    parser = OptionParser()
    parser.add_option("-f", "--file", dest="filename",
                      help="write report to FILE", metavar="FILE")
    parser.add_option("-q", "--quiet",
                      action="store_false", dest="verbose", default=True,
                      help="don't print status messages to stdout")

    (options, args) = parser.parse_args()

With these few lines of code, users of your script can now do the "usual thing" on the command line, for example:

    <yourscript> --file=outfile -q

As it parses the command line, optparse sets attributes of the options object returned by parse_args() based on user-supplied command-line values. When parse_args() returns from parsing this command line, options.filename will be "outfile" and options.verbose will be False. optparse supports both long and short options, allows short options to be merged together, and allows options to be associated with their arguments in a variety of ways. Thus, the following command lines are all equivalent to the above example:

    <yourscript> -f outfile --quiet
    <yourscript> --quiet --file outfile
    <yourscript> -q -foutfile
    <yourscript> -qfoutfile

Additionally, users can run one of the following:

    <yourscript> -h
    <yourscript> --help

and optparse will print out a brief summary of your script's options:

    Usage: <yourscript> [options]

    Options:
      -h, --help            show this help message and exit
      -f FILE, --file=FILE  write report to FILE
      -q, --quiet           don't print status messages to
stdout

where the value of <yourscript> is determined at runtime (normally from sys.argv[0]).

Background

optparse was explicitly designed to encourage the creation of programs with straightforward, conventional command-line interfaces. To that end, it supports only the most common command-line syntax and semantics conventionally used under Unix. If you are unfamiliar with these conventions, read this section to acquaint yourself with them.

Terminology

argument
    a string entered on the command-line, and passed by the shell to execl() or execv(). In Python, arguments are elements of sys.argv[1:] (sys.argv[0] is the name of the program being executed). Unix shells also use the term "word".

    It is occasionally desirable to substitute an argument list other than sys.argv[1:], so you should read "argument" as "an element of sys.argv[1:], or of some other list provided as a substitute for sys.argv[1:]".

option
    an argument used to supply extra information to guide or customize the execution of a program. There are many different syntaxes for options; the traditional Unix syntax is a hyphen ("-") followed by a single letter, e.g. -x or -F. Also, traditional Unix syntax allows multiple options to be merged into a single argument, e.g. -x -F is equivalent to -xF. The GNU project introduced "--" followed by a series of hyphen-separated words, e.g. --file or --dry-run. These are the only two option syntaxes provided by optparse.

    Some other option syntaxes that the world has seen include:

    * a hyphen followed by a few letters, e.g. -pf (this is not the same as multiple options merged into a single argument)
    * a hyphen followed by a whole word, e.g. -file (this is technically equivalent to the previous syntax, but they aren't usually seen in the same program)
    * a plus sign followed by a single letter, or a few letters, or a word, e.g. +f, +rgb
    * a slash followed by a letter, or a few letters, or a word, e.g. /f, /file

    These option syntaxes are not supported by optparse, and they never will be. This is deliberate: the first three are non-standard on any environment, and the last only makes
sense if you're exclusively targeting Windows or certain legacy platforms (e.g. VMS, MS-DOS).

option argument
    an argument that follows an option, is closely associated with that option, and is consumed from the argument list when that option is. With optparse, option arguments may either be in a separate argument from their option:

        -f foo
        --file foo

    or included in the same argument:

        -ffoo
        --file=foo

    Typically, a given option either takes an argument or it doesn't. Lots of people want an "optional option arguments" feature, meaning that some options will take an argument if they see it, and won't if they don't. This is somewhat controversial, because it makes parsing ambiguous: if -a takes an optional argument and -b is another option entirely, how do we interpret -ab? Because of this ambiguity, optparse does not support this feature.

positional argument
    something leftover in the argument list after options have been parsed, i.e. after options and their arguments have been parsed and removed from the argument list.

required option
    an option that must be supplied on the command-line; note that the phrase "required option" is self-contradictory in English. optparse doesn't prevent you from implementing required options, but doesn't give you much help at it either.

For example, consider this hypothetical command-line:

    prog -v --report report.txt foo bar

-v and --report are both options. Assuming that --report takes one argument, report.txt is an option argument. foo and bar are positional arguments.

What are options for?

Options are used to provide extra information to tune or customize the execution of a program. In case it wasn't clear, options are usually optional. A program should be able to run just fine with no options whatsoever. (Pick a random program from the Unix or GNU toolsets. Can it run without any options at all and still make sense? The main exceptions are find, tar, and dd, all of which are mutant oddballs that have been rightly criticized for their non-standard syntax and confusing interfaces.)

Lots of people want their programs to have "required options". Think about it. If it's required, then it's not optional! If there is a
piece of information that your program absolutely requires in order to run successfully, that's what positional arguments are for.

As an example of good command-line interface design, consider the humble cp utility, for copying files. It doesn't make much sense to try to copy files without supplying a destination and at least one source. Hence, cp fails if you run it with no arguments. However, it has a flexible, useful syntax that does not require any options at all:

    cp SOURCE DEST
    cp SOURCE ... DEST-DIR

You can get pretty far with just that. Most cp implementations provide a bunch of options to tweak exactly how the files are copied: you can preserve mode and modification time, avoid following symlinks, ask before clobbering existing files, etc. But none of this distracts from the core mission of cp, which is to copy either one file to another, or several files to another directory.

What are positional arguments for?

Positional arguments are for those pieces of information that your program absolutely, positively requires to run.

A good user interface should have as few absolute requirements as possible. If your program requires 17 distinct pieces of information in order to run successfully, it doesn't much matter how you get that information from the user: most people will give up and walk away before they successfully run the program. This applies whether the user interface is a command-line, a configuration file, or a GUI: if you make that many demands on your users, most of them will simply give up.

In short, try to minimize the amount of information that users are absolutely required to supply; use sensible defaults whenever possible. Of course, you also want to make your programs reasonably flexible. That's what options are for. Again, it doesn't matter if they are entries in a config file, widgets in the "Preferences" dialog of a GUI, or command-line options: the more options you implement, the more flexible your program is, and the more complicated its implementation becomes. Too much flexibility has
drawbacks as well, of course; too many options can overwhelm users and make your code much harder to maintain.

Tutorial

While optparse is quite flexible and powerful, it's also straightforward to use in most cases. This section covers the code patterns that are common to any optparse-based program.

First, you need to import the OptionParser class; then, early in the main program, create an OptionParser instance:

    from optparse import OptionParser
    ...
    parser = OptionParser()

Then you can start defining options. The basic syntax is:

    parser.add_option(opt_str, ...,
                      attr=value, ...)

Each option has one or more option strings, such as -f or --file, and several option attributes that tell optparse what to expect and what to do when it encounters that option on the command line.

Typically, each option will have one short option string and one long option string, e.g.:

    parser.add_option("-f", "--file", ...)

You're free to define as many short option strings and as many long option strings as you like (including zero), as long as there is at least one option string overall.

The option strings passed to OptionParser.add_option() are effectively labels for the option defined by that call. For brevity, we will frequently refer to encountering an option on the command line; in reality, optparse encounters option strings and looks up options from them.

Once all of your options are defined, instruct optparse to parse your program's command line:

    (options, args) = parser.parse_args()

(If you like, you can pass a custom argument list to parse_args(), but that's rarely necessary: by default it uses sys.argv[1:].)

parse_args() returns two values:

* options, an object containing values for all of your options, e.g. if --file takes a single string argument, then options.file will be the filename supplied by the user, or None if the user did not supply that option
* args, the list of positional arguments leftover after parsing options

This tutorial section only covers the four most important option attributes: action, type, dest (destination), and help. Of these, action is the most fundamental.

Understanding option actions

Actions tell optparse what to do when it encounters an option on the
command line. There is a fixed set of actions hard-coded into optparse; adding new actions is an advanced topic covered in section Extending optparse. Most actions tell optparse to store a value in some variable, for example take a string from the command line and store it in an attribute of options.

If you don't specify an option action, optparse defaults to store.

The store action

The most common option action is store, which tells optparse to take the next argument (or the remainder of the current argument), ensure that it is of the correct type, and store it to your chosen destination.

For example:

    parser.add_option("-f", "--file",
                      action="store", type="string", dest="filename")

Now let's make up a fake command line and ask optparse to parse it:

    args = ["-f", "foo.txt"]
    (options, args) = parser.parse_args(args)

When optparse sees the option string -f, it consumes the next argument, foo.txt, and stores it in options.filename. So, after this call to parse_args(), options.filename is "foo.txt".

Some other option types supported by optparse are int and float. Here's an option that expects an integer argument:

    parser.add_option("-n", type="int", dest="num")

Note that this option has no long option string, which is perfectly acceptable. Also, there's no explicit action, since the default is store.

Let's parse another fake command line. This time, we'll jam the option argument right up against the option: since -n42 (one argument) is equivalent to -n 42 (two arguments), the code

    (options, args) = parser.parse_args(["-n42"])
    print(options.num)

will print 42.

If you don't specify a type, optparse assumes string. Combined with the fact that the default action is store, that means our first example can be a lot shorter:

    parser.add_option("-f", "--file", dest="filename")

If you don't supply a destination, optparse figures out a sensible default from the option strings: if the first long option string is --foo-bar, then the default destination is foo_bar. If there are no long option strings, optparse looks at the first short option string: the default destination for -f is f.

optparse also includes
the built-in complex type. Adding types is covered in section Extending optparse.

Handling boolean (flag) options

Flag options (set a variable to true or false when a particular option is seen) are quite common. optparse supports them with two separate actions, store_true and store_false. For example, you might have a verbose flag that is turned on with -v and off with -q:

    parser.add_option("-v", action="store_true", dest="verbose")
    parser.add_option("-q", action="store_false", dest="verbose")

Here we have two different options with the same destination, which is perfectly OK. (It just means you have to be a bit careful when setting default values; see below.)

When optparse encounters -v on the command line, it sets options.verbose to True; when it encounters -q, options.verbose is set to False.

Other actions

Some other actions supported by optparse are:

"store_const"
    store a constant value, pre-set via Option.const
"append"
    append this option's argument to a list
"count"
    increment a counter by one
"callback"
    call a specified function

These are covered in section Reference Guide and section Option Callbacks.

Default values

All of the above examples involve setting some variable (the "destination") when certain command-line options are seen. What happens if those options are never seen? Since we didn't supply any defaults, they are all set to None. This is usually fine, but sometimes you want more control. optparse lets you supply a default value for each destination, which is assigned before the command line is parsed.

First, consider the verbose/quiet example. If we want optparse to set verbose to True unless -q is seen, then we can do this:

    parser.add_option("-v", action="store_true", dest="verbose", default=True)
    parser.add_option("-q", action="store_false", dest="verbose")

Since default values apply to the destination rather than to any particular option, and these two options happen to have the same destination, this is exactly equivalent:

    parser.add_option("-v", action="store_true", dest="verbose")
    parser.add_option("-q", action="store_false", dest="verbose", default=True)

Consider this:

    parser.add_option("-v", action="store_true", dest="verbose", default=False)
    parser.add_option("-q",
                      action="store_false", dest="verbose", default=True)

Again, the default value for verbose will be True: the last default value supplied for any particular destination is the one that counts.

A clearer way to specify default values is the set_defaults() method of OptionParser, which you can call at any time before calling parse_args():

    parser.set_defaults(verbose=True)
    parser.add_option(...)
    (options, args) = parser.parse_args()

As before, the last value specified for a given option destination is the one that counts. For clarity, try to use one method or the other of setting default values, not both.

Generating help

optparse's ability to generate help and usage text automatically is useful for creating user-friendly command-line interfaces. All you have to do is supply a help value for each option, and optionally a short usage message for your whole program. Here's an OptionParser populated with user-friendly (documented) options:

    usage = "usage: %prog [options] arg1 arg2"
    parser = OptionParser(usage=usage)
    parser.add_option("-v", "--verbose",
                      action="store_true", dest="verbose", default=True,
                      help="make lots of noise [default]")
    parser.add_option("-q", "--quiet",
                      action="store_false", dest="verbose",
                      help="be vewwy quiet (I'm hunting wabbits)")
    parser.add_option("-f", "--filename",
                      metavar="FILE", help="write output to FILE")
    parser.add_option("-m", "--mode",
                      default="intermediate",
                      help="interaction mode: novice, intermediate, "
                           "or expert [default: %default]")

If optparse encounters either -h or --help on the command line, or if you just call parser.print_help(), it prints the following to standard output:

    Usage: <yourscript> [options] arg1 arg2

    Options:
      -h, --help            show this help message and exit
      -v, --verbose         make lots of noise [default]
      -q, --quiet           be vewwy quiet (I'm hunting wabbits)
      -f FILE, --filename=FILE
                            write output to FILE
      -m MODE, --mode=MODE  interaction mode: novice, intermediate, or
                            expert [default: intermediate]

(If the help output is triggered by a help option, optparse exits after printing the help text.)

There's a lot going on here to help optparse generate the best possible help message:

* the script defines its own usage
  message:

      usage = "usage: %prog [options] arg1 arg2"

  optparse expands %prog in the usage string to the name of the current program, i.e. os.path.basename(sys.argv[0]). The expanded string is then printed before the detailed option help.

  If you don't supply a usage string, optparse uses a bland but sensible default: "Usage: %prog [options]", which is fine if your script doesn't take any positional arguments.

* every option defines a help string, and doesn't worry about line-wrapping; optparse takes care of wrapping lines and making the help output look good.

* options that take a value indicate this fact in their automatically generated help message, e.g. for the "mode" option:

      -m MODE, --mode=MODE

  Here, "MODE" is called the meta-variable: it stands for the argument that the user is expected to supply to -m/--mode. By default, optparse converts the destination variable name to uppercase and uses that for the meta-variable. Sometimes, that's not what you want; for example, the --filename option explicitly sets metavar="FILE", resulting in this automatically generated option description:

      -f FILE, --filename=FILE

  This is important for more than just saving space, though: the manually written help text uses the meta-variable FILE to clue the user in that there's a connection between the semi-formal syntax -f FILE and the informal semantic description "write output to FILE". This is a simple but effective way to make your help text a lot clearer and more useful for end users.

* options that have a default value can include %default in the help string: optparse will replace it with str() of the option's default value. If an option has no default value (or the default value is None), %default expands to none.

Grouping Options

When dealing with many options, it is convenient to group these options for better help output. An OptionParser can contain several option groups, each of which can contain several options.

An option group is obtained using the class OptionGroup:

class optparse.OptionGroup(parser, title, description=None)

where

* parser is the OptionParser instance the group will be inserted in to
* title is the group title
* description, optional, is a long description
of the group

OptionGroup inherits from OptionContainer (like OptionParser), and so the add_option() method can be used to add an option to the group.

Once all the options are declared, using the OptionParser method add_option_group() the group is added to the previously defined parser.

Continuing with the parser defined in the previous section, adding an OptionGroup to a parser is easy:

    group = OptionGroup(parser, "Dangerous Options",
                        "Caution: use these options at your own risk.  "
                        "It is believed that some of them bite.")
    group.add_option("-g", action="store_true", help="Group option.")
    parser.add_option_group(group)

This would result in the following help output:

    Usage: <yourscript> [options] arg1 arg2

    Options:
      -h, --help            show this help message and exit
      -v, --verbose         make lots of noise [default]
      -q, --quiet           be vewwy quiet (I'm hunting wabbits)
      -f FILE, --filename=FILE
                            write output to FILE
      -m MODE, --mode=MODE  interaction mode: novice, intermediate, or
                            expert [default: intermediate]

      Dangerous Options:
        Caution: use these options at your own risk.  It is believed that some
        of them bite.

        -g                  Group option.

A bit more complete example might involve using more than one group; still extending the previous example:

    group = OptionGroup(parser, "Dangerous Options",
                        "Caution: use these options at your own risk.  "
                        "It is believed that some of them bite.")
    group.add_option("-g", action="store_true", help="Group option.")
    parser.add_option_group(group)

    group = OptionGroup(parser, "Debug Options")
    group.add_option("-d", "--debug", action="store_true",
                     help="Print debug information")
    group.add_option("-s", "--sql", action="store_true",
                     help="Print all SQL statements executed")
    group.add_option("-e", action="store_true", help="Print every action done")
    parser.add_option_group(group)

that results in the following output:

    Usage: <yourscript> [options] arg1 arg2

    Options:
      -h, --help            show this help message and exit
      -v, --verbose         make lots of noise [default]
      -q, --quiet           be vewwy quiet (I'm hunting wabbits)
      -f FILE, --filename=FILE
                            write output to FILE
      -m MODE, --mode=MODE  interaction mode: novice, intermediate, or
                            expert [default: intermediate]

      Dangerous Options:
        Caution: use these
options at your own risk.  It is believed that some
        of them bite.

        -g                  Group option.

      Debug Options:
        -d, --debug         Print debug information
        -s, --sql           Print all SQL statements executed
        -e                  Print every action done

Another interesting method, in particular when working programmatically with option groups, is:

OptionParser.get_option_group(opt_str)

    Return the OptionGroup to which the short or long option string opt_str (e.g. '-o' or '--option') belongs. If there's no such OptionGroup, return None.

Printing a version string

Similar to the brief usage string, optparse can also print a version string for your program. You have to supply the string as the version argument to OptionParser:

    parser = OptionParser(usage="%prog [-f] [-q]", version="%prog 1.0")

%prog is expanded just like it is in usage. Apart from that, version can contain anything you like. When you supply it, optparse automatically adds a --version option to your parser. If it encounters this option on the command line, it expands your version string (by replacing %prog), prints it to stdout, and exits.

For example, if your script is called /usr/bin/foo:

    $ /usr/bin/foo --version
    foo 1.0

The following two methods can be used to print and get the version string:

OptionParser.print_version(file=None)

    Print the version message for the current program (self.version) to file (default stdout). As with print_usage(), any occurrence of %prog in self.version is replaced with the name of the current program. Does nothing if self.version is empty or undefined.

OptionParser.get_version()

    Same as print_version() but returns the version string instead of printing it.

How optparse handles errors

There are two broad classes of errors that optparse has to worry about: programmer errors and user errors. Programmer errors are usually erroneous calls to OptionParser.add_option(), e.g. invalid option strings, unknown option attributes, missing option attributes, etc. These are dealt with in the usual way: raise an exception (either optparse.OptionError or TypeError) and let the program crash.

Handling user errors is much more important, since they are guaranteed to happen no matter how stable your code is. optparse can automatically
detect some user errors, such as bad option arguments (passing -n 4x where -n takes an integer argument) and missing arguments (-n at the end of the command line, where -n takes an argument of any type). Also, you can call OptionParser.error() to signal an application-defined error condition:

    (options, args) = parser.parse_args()
    ...
    if options.a and options.b:
        parser.error("options -a and -b are mutually exclusive")

In either case, optparse handles the error the same way: it prints the program's usage message and an error message to standard error and exits with error status 2.

Consider the first example above, where the user passes 4x to an option that takes an integer:

    $ /usr/bin/foo -n 4x
    Usage: foo [options]

    foo: error: option -n: invalid integer value: '4x'

Or, where the user fails to pass a value at all:

    $ /usr/bin/foo -n
    Usage: foo [options]

    foo: error: -n option requires an argument

optparse-generated error messages take care always to mention the option involved in the error; be sure to do the same when calling OptionParser.error() from your application code.

If optparse's default error-handling behaviour does not suit your needs, you'll need to subclass OptionParser and override its exit() and/or error() methods.

Putting it all together

Here's what optparse-based scripts usually look like:

    from optparse import OptionParser
    ...
    def main():
        usage = "usage: %prog [options] arg"
        parser = OptionParser(usage)
        parser.add_option("-f", "--file", dest="filename",
                          help="read data from FILENAME")
        parser.add_option("-v", "--verbose",
                          action="store_true", dest="verbose")
        parser.add_option("-q", "--quiet",
                          action="store_false", dest="verbose")
        ...
        (options, args) = parser.parse_args()
        if len(args) != 1:
            parser.error("incorrect number of arguments")
        if options.verbose:
            print("reading %s..." % options.filename)
        ...

    if __name__ == "__main__":
        main()

Reference Guide

Creating the parser

The first step in using optparse is to create an OptionParser instance.

class optparse.OptionParser(...)

    The OptionParser constructor has no required arguments, but a number of optional keyword arguments. You should always pass them as keyword arguments, i.e. do not rely on the
order in which the arguments are declared.

    usage (default: "%prog [options]")
        The usage summary to print when your program is run incorrectly or with a help option. When optparse prints the usage string, it expands %prog to os.path.basename(sys.argv[0]) (or to prog if you passed that keyword argument). To suppress a usage message, pass the special value optparse.SUPPRESS_USAGE.

    option_list (default: [])
        A list of Option objects to populate the parser with. The options in option_list are added after any options in standard_option_list (a class attribute that may be set by OptionParser subclasses), but before any version or help options. Deprecated; use add_option() after creating the parser instead.

    option_class (default: optparse.Option)
        Class to use when adding options to the parser in add_option().

    version (default: None)
        A version string to print when the user supplies a version option. If you supply a true value for version, optparse automatically adds a version option with the single option string --version. The substring %prog is expanded the same as for usage.

    conflict_handler (default: "error")
        Specifies what to do when options with conflicting option strings are added to the parser; see section Conflicts between options.

    description (default: None)
        A paragraph of text giving a brief overview of your program. optparse reformats this paragraph to fit the current terminal width and prints it when the user requests help (after usage, but before the list of options).

    formatter (default: a new IndentedHelpFormatter)
        An instance of optparse.HelpFormatter that will be used for printing help text. optparse provides two concrete classes for this purpose: IndentedHelpFormatter and TitledHelpFormatter.

    add_help_option (default: True)
        If true, optparse will add a help option (with option strings -h and --help) to the parser.

    prog
        The string to use when expanding %prog in usage and version instead of os.path.basename(sys.argv[0]).

    epilog (default: None)
        A paragraph of help text to print after the option help.

Populating the parser

There are several ways to populate the parser with options. The preferred way is by using OptionParser.add_option(), as shown in
section Tutorial. add_option() can be called in one of two ways:

* pass it an Option instance (as returned by make_option())
* pass it any combination of positional and keyword arguments that are acceptable to make_option() (i.e., to the Option constructor), and it will create the Option instance for you

The other alternative is to pass a list of pre-constructed Option instances to the OptionParser constructor, as in:

    option_list = [
        make_option("-f", "--filename",
                    action="store", type="string", dest="filename"),
        make_option("-q", "--quiet",
                    action="store_false", dest="verbose"),
    ]
    parser = OptionParser(option_list=option_list)

(make_option() is a factory function for creating Option instances; currently it is an alias for the Option constructor. A future version of optparse may split Option into several classes, and make_option() will pick the right class to instantiate. Do not instantiate Option directly.)

Defining options

Each Option instance represents a set of synonymous command-line option strings, e.g. -f and --file. You can specify any number of short or long option strings, but you must specify at least one overall option string.

The canonical way to create an Option instance is with the add_option() method of OptionParser.

OptionParser.add_option(option)
OptionParser.add_option(*opt_str, attr=value, ...)

    To define an option with only a short option string:

        parser.add_option("-f", attr=value, ...)

    And to define an option with only a long option string:

        parser.add_option("--foo", attr=value, ...)

    The keyword arguments define attributes of the new Option object. The most important option attribute is action, and it largely determines which other attributes are relevant or required. If you pass irrelevant option attributes, or fail to pass required ones, optparse raises an OptionError exception explaining your mistake.

    An option's action determines what optparse does when it encounters this option on the command line. The standard option actions hard-coded into optparse are:

    "store"
        store this option's argument (default)
    "store_const"
        store a constant value, pre-set via Option.const
    "store_true"
        store True
    "store_false"
        store False
    "append"
        append this option's argument to a list
    "append_const"
        append a constant value to a list, pre-set via Option.const
    "count"
        increment a counter by one
    "callback"
        call a specified function
    "help"
        print a usage message including all options and the documentation for them

    If you don't supply an action, the default is "store". For this action, you may also supply type and dest option attributes; see Standard option actions.

As you can see, most actions involve storing or updating a value somewhere. optparse always creates a special object for this, conventionally called options, which is an instance of optparse.Values.

class optparse.Values

    An object holding parsed argument names and values as attributes. Normally created when calling OptionParser.parse_args(), and can be overridden by a custom subclass passed to the values argument of OptionParser.parse_args() (as described in Parsing arguments).

Option arguments (and various other values) are stored as attributes of this object, according to the dest (destination) option attribute.

For example, when you call:

    parser.parse_args()

one of the first things optparse does is create the options object:

    options = Values()

If one of the options in this parser is defined with:

    parser.add_option("-f", "--file", action="store", type="string", dest="filename")

and the command line being parsed includes any of the following:

    -ffoo
    -f foo
    --file=foo
    --file foo

then optparse, on seeing this option, will do the equivalent of:

    options.filename = "foo"

The type and dest option attributes are almost as important as action, but action is the only one that makes sense for all options.

Option attributes

class optparse.Option

    A single command line argument, with various attributes passed by keyword to the constructor. Normally created with OptionParser.add_option() rather than directly, and can be overridden by a custom class via the option_class argument to OptionParser.

The following option attributes may be passed as keyword arguments to OptionParser.add_option(). If you pass an option attribute that is not relevant to a particular option, or fail to pass a required
option attribute, optparse raises OptionError.

Option.action
    (default: "store")
    Determines optparse's behaviour when this option is seen on the command line; the available options are documented here.

Option.type
    (default: "string")
    The argument type expected by this option (e.g., "string" or "int"); the available option types are documented here.

Option.dest
    (default: derived from option strings)
    If the option's action implies writing or modifying a value somewhere, this tells optparse where to write it: dest names an attribute of the options object that optparse builds as it parses the command line.

Option.default
    The value to use for this option's destination if the option is not seen on the command line. See also OptionParser.set_defaults().

Option.nargs
    (default: 1)
    How many arguments of type type should be consumed when this option is seen. If > 1, optparse will store a tuple of values to dest.

Option.const
    For actions that store a constant value, the constant value to store.

Option.choices
    For options of type "choice", the list of strings the user may choose from.

Option.callback
    For options with action "callback", the callable to call when this option is seen. See section Option Callbacks for detail on the arguments passed to the callable.

Option.callback_args
Option.callback_kwargs
    Additional positional and keyword arguments to pass to callback after the four standard callback arguments.

Option.help
    Help text to print for this option when listing all available options after the user supplies a help option (such as --help). If no help text is supplied, the option will be listed without help text. To hide this option, use the special value optparse.SUPPRESS_HELP.

Option.metavar
    (default: derived from option strings)
    Stand-in for the option argument(s) to use when printing help text. See section Tutorial for an example.

Standard option actions

The various option actions all have slightly different requirements and effects. Most actions have several relevant option attributes which you may specify to guide optparse's behaviour; a
few have required attributes which you must specify for any option using that action store relevant ty
en
null
108
pe dest nargs choices The option must be followed by an argument which is converted to a value according to type and stored in dest If nargs 1 multiple arguments will be consumed from the command line all will be converted according to type and stored to dest as a tuple See the Standard option types section If choices is supplied a list or tuple of strings the type defaults to choice If type is not supplied it defaults to string If dest is not supplied optparse derives a destination from the first long option string e g foo bar implies foo_bar If there are no long option strings optparse derives a destination from the first short option string e g f implies f Example parser add_option f parser add_option p type float nargs 3 dest point As it parses the command line f foo txt p 1 3 5 4 fbar txt optparse will set options f foo txt options point 1 0 3 5 4 0 options f bar txt store_const required const relevant dest The value const is stored in dest Example parser add_option q quiet action store_const const 0 dest verbose parser add_option v verbose action store_const const 1 dest verbose parser add_option noisy action store_const const 2 dest verbose If noisy is seen optparse will set options verbose 2 store_true relevant dest A special case of store_const that stores True to dest store_false relevant dest Like store_true but stores False Example parser add_option clobber action store_true dest clobber parser add_option no clobber action store_false dest clobber append relevant type dest nargs choices The option must be followed by an argument which is appended to the list in dest If no default value for dest is supplied an empty list is automatically created when optparse first encounters this option on the command line If nargs 1 multiple arguments are consumed and a tuple of length nargs is appended to dest The defaults for type and dest are the same as for the store action Example parser add_option t tracks action append type int If t3 is seen on the command line 
optparse does the equivalent of:

    options.tracks = []
    options.tracks.append(int("3"))

If, a little later on, --tracks=4 is seen, it does:

    options.tracks.append(int("4"))

The append action calls the append method on the current value of the option. This means that any default value specified must have an append method. It also means that if the default value is non-empty, the default elements will be present in the parsed value for the option, with any values from the command line appended after those default values:

    >>> parser.add_option("--files", action="append", default=['~/.mypkg/defaults'])
    >>> opts, args = parser.parse_args(['--files', 'overrides.mypkg'])
    >>> opts.files
    ['~/.mypkg/defaults', 'overrides.mypkg']

append_const [required: const; relevant: dest]

Like store_const, but the value const is appended to dest; as with append, dest defaults to None, and an empty list is automatically created the first time the option is encountered.

count [relevant: dest]

Increment the integer stored at dest. If no default value is supplied, dest is set to zero before being incremented the first time. Example:

    parser.add_option("-v", action="count", dest="verbosity")

The first time -v is seen on the command line, optparse does the equivalent of:

    options.verbosity = 0
    options.verbosity += 1

Every subsequent occurrence of -v results in:

    options.verbosity += 1

callback [required: callback; relevant: type, nargs, callback_args, callback_kwargs]

Call the function specified by callback, which is called as:

    func(option, opt_str, value, parser, *args, **kwargs)

See section Option Callbacks for more detail.

help

Prints a complete help message for all the options in the current option parser. The help message is constructed from the usage string passed to OptionParser's constructor and the help string passed to every option. If no help string is supplied for an option, it will still be listed in the help message. To omit an option entirely, use the special value optparse.SUPPRESS_HELP. optparse automatically adds a help option to all OptionParsers, so you do not normally need to create one. Example:

    from optparse import OptionParser, SUPPRESS_HELP

    # usually, a help option is added automatically, but that can
    # be suppressed using the add_help_option argument
    parser = OptionParser(add_help_option=False)

    parser.add_option("-h", "--help", action="help")
    parser.add_option("-v", action="store_true", dest="verbose",
                      help="Be moderately verbose")
    parser.add_option("--file", dest="filename",
                      help="Input file to read data from")
    parser.add_option("--secret", help=SUPPRESS_HELP)

If optparse sees either -h or --help on the command line, it will print something like the following help message to stdout (assuming sys.argv[0] is "foo.py"):

    Usage: foo.py [options]

    Options:
      -h, --help        Show this help message and exit
      -v                Be moderately verbose
      --file=FILENAME   Input file to read data from

After printing the help message, optparse terminates your process with sys.exit(0).

version

Prints the version number supplied to the OptionParser to stdout and exits. The version number is actually formatted and printed by the print_version() method of OptionParser. Generally only relevant if the version argument is supplied to the OptionParser constructor. As with help options, you will rarely create version options, since optparse automatically adds them when needed.

Standard option types

optparse has five built-in option types: "string", "int", "choice", "float" and "complex". If you need to add new option types, see section Extending optparse. Arguments to string options are not checked or converted in any way: the text on the command line is stored in the destination (or passed to the callback) as-is. Integer arguments (type "int") are parsed as follows: if the number starts with 0x, it is parsed as a hexadecimal number; if the number starts with 0, it is parsed as an octal number; if the number starts with 0b, it is parsed as a binary number; otherwise, the number is parsed as a decimal number. The conversion is done by calling int() with the appropriate base (2, 8, 10, or 16). If this fails, so will optparse, although with a more useful error message. "float" and "complex" option arguments are converted directly with float() and complex(), with similar error handling. "choice" options are a subtype of "string" options. The choices option attribute (a
sequence of strings) defines the set of allowed option arguments; optparse.check_choice() compares user-supplied option arguments against this master list and raises OptionValueError if an invalid string is given.

Parsing arguments

The whole point of creating and populating an OptionParser is to call its parse_args() method.

OptionParser.parse_args(args=None, values=None)

Parse the command-line options found in args. The input parameters are:

- args: the list of arguments to process (default: sys.argv[1:])
- values: a Values object to store option arguments in (default: a new instance of Values); if you give an existing object, the option defaults will not be initialized on it

and the return value is a pair (options, args) where:

- options: the same object that was passed in as values, or the optparse.Values instance created by optparse
- args: the leftover positional arguments after all options have been processed

The most common usage is to supply neither keyword argument. If you supply values, it will be modified with repeated setattr() calls (roughly one for every option argument stored to an option destination) and returned by parse_args(). If parse_args() encounters any errors in the argument list, it calls the OptionParser's error() method with an appropriate end-user error message. This ultimately terminates your process with an exit status of 2 (the traditional Unix exit status for command-line errors).

Querying and manipulating your option parser

The default behavior of the option parser can be customized slightly, and you can also poke around your option parser and see what's there. OptionParser provides several methods to help you out:

OptionParser.disable_interspersed_args()

Set parsing to stop on the first non-option. For example, if -a and -b are both simple options that take no arguments, optparse normally accepts this syntax:

    prog -a arg1 -b arg2

and treats it as equivalent to:

    prog -a -b arg1 arg2

To disable this feature, call disable_interspersed_args(). This restores traditional Unix syntax, where option parsing stops with the first non-option argument. Use this if you have a command processor which runs another command which has options of its own, and you want to make sure these options don't get confused. For example, each command might have a different set of options.

OptionParser.enable_interspersed_args()

Set parsing to not stop on the first non-option, allowing interspersing switches with command arguments. This is the default behavior.

OptionParser.get_option(opt_str)

Returns the Option instance with the option string opt_str, or None if no options have that option string.

OptionParser.has_option(opt_str)

Return True if the OptionParser has an option with option string opt_str (e.g., -q or --verbose).

OptionParser.remove_option(opt_str)

If the OptionParser has an option corresponding to opt_str, that option is removed. If that option provided any other option strings, all of those option strings become invalid. If opt_str does not occur in any option belonging to this OptionParser, raises ValueError.

Conflicts between options

If you're not careful, it's easy to define options with conflicting option strings:

    parser.add_option("-n", "--dry-run", ...)
    ...
    parser.add_option("-n", "--noisy", ...)

This is particularly true if you've defined your own OptionParser subclass with some standard options. Every time you add an option, optparse checks for conflicts with existing options. If it finds any, it invokes the current conflict-handling mechanism. You can set the conflict-handling mechanism either in the constructor:

    parser = OptionParser(..., conflict_handler=handler)

or with a separate call:

    parser.set_conflict_handler(handler)

The available conflict handlers are:

- "error" (default): assume option conflicts are a programming error and raise OptionConflictError
- "resolve": resolve option conflicts intelligently (see below)

As an example, let's define an OptionParser that resolves conflicts intelligently and add conflicting options to it:

    parser = OptionParser(conflict_handler="resolve")
    parser.add_option("-n", "--dry-run", ..., help="do no harm")
    parser.add_option("-n", "--noisy", ..., help="be noisy")

At this point, optparse detects that a previously added option is already using the -n option string.
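Before looking at how "resolve" handles this, here is a minimal sketch of what the default "error" handler does in the same situation. The option strings and dest names are illustrative; only documented optparse behaviour is assumed.

```python
from optparse import OptionParser, OptionConflictError

# With the default conflict handler ("error"), adding a second option
# that reuses -n raises OptionConflictError immediately.
parser = OptionParser()
parser.add_option("-n", "--dry-run", action="store_true", dest="dry_run")
try:
    parser.add_option("-n", "--noisy", action="store_true", dest="noisy")
except OptionConflictError as exc:
    print("conflict:", exc)
```

Catching OptionConflictError like this is only for demonstration; in normal code the exception signals a programming error that should be fixed, not handled.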
Since conflict_handler is "resolve", it resolves the situation by removing -n from the earlier option's list of option strings. Now --dry-run is the only way for the user to activate that option. If the user asks for help, the help message will reflect that:

    Options:
      --dry-run     do no harm
      ...
      -n, --noisy   be noisy

It's possible to whittle away the option strings for a previously added option until there are none left, and the user has no way of invoking that option from the command line. In that case, optparse removes that option completely, so it doesn't show up in help text or anywhere else. Carrying on with our existing OptionParser:

    parser.add_option("--dry-run", ..., help="new dry-run option")

At this point, the original -n/--dry-run option is no longer accessible, so optparse removes it, leaving this help text:

    Options:
      ...
      -n, --noisy   be noisy
      --dry-run     new dry-run option

Cleanup

OptionParser instances have several cyclic references. This should not be a problem for Python's garbage collector, but you may wish to break the cyclic references explicitly by calling destroy() on your OptionParser once you are done with it. This is particularly useful in long-running applications where large object graphs are reachable from your OptionParser.

Other methods

OptionParser supports several other public methods:

OptionParser.set_usage(usage)

Set the usage string according to the rules described above for the usage constructor keyword argument. Passing None sets the default usage string; use optparse.SUPPRESS_USAGE to suppress a usage message.

OptionParser.print_usage(file=None)

Print the usage message for the current program (self.usage) to file (default stdout). Any occurrence of the string %prog in self.usage is replaced with the name of the current program. Does nothing if self.usage is empty or not defined.

OptionParser.get_usage()

Same as print_usage() but returns the usage string instead of printing it.

OptionParser.set_defaults(dest=value, ...)

Set default values for several option destinations at once. Using set_defaults() is the preferred way to set default values for options, since multiple options can share the same destination. For example, if several "mode" options all set the same destination, any one of them can set the default, and the last one wins:

    parser.add_option("--advanced", action="store_const",
                      dest="mode", const="advanced",
                      default="novice")    # overridden below
    parser.add_option("--novice", action="store_const",
                      dest="mode", const="novice",
                      default="advanced")  # overrides above setting

To avoid this confusion, use set_defaults():

    parser.set_defaults(mode="advanced")
    parser.add_option("--advanced", action="store_const",
                      dest="mode", const="advanced")
    parser.add_option("--novice", action="store_const",
                      dest="mode", const="novice")

Option Callbacks

When optparse's built-in actions and types aren't quite enough for your needs, you have two choices: extend optparse or define a callback option. Extending optparse is more general, but overkill for a lot of simple cases. Quite often a simple callback is all you need. There are two steps to defining a callback option:

- define the option itself using the "callback" action
- write the callback; this is a function (or method) that takes at least four arguments, as described below

Defining a callback option

As always, the easiest way to define a callback option is by using the OptionParser.add_option() method. Apart from action, the only option attribute you must specify is callback, the function to call:

    parser.add_option("-c", action="callback", callback=my_callback)

callback is a function (or other callable object), so you must have already defined my_callback() when you create this callback option. In this simple case, optparse doesn't even know if -c takes any arguments, which usually means that the option takes no arguments: the mere presence of -c on the command line is all it needs to know. In some circumstances, though, you might want your callback to consume an arbitrary number of command-line arguments. This is where writing callbacks gets tricky; it's covered later in this section. optparse always passes four particular arguments to your callback, and it will only pass additional arguments if you specify them via callback_args and callback_kwargs. Thus, the minimal callback
function signature is:

    def my_callback(option, opt, value, parser):

The four arguments to a callback are described below. There are several other option attributes that you can supply when you define a callback option:

- type: has its usual meaning; as with the "store" or "append" actions, it instructs optparse to consume one argument and convert it to type. Rather than storing the converted value(s) anywhere, though, optparse passes it to your callback function.
- nargs: also has its usual meaning; if it is supplied and > 1, optparse will consume nargs arguments, each of which must be convertible to type. It then passes a tuple of converted values to your callback.
- callback_args: a tuple of extra positional arguments to pass to the callback
- callback_kwargs: a dictionary of extra keyword arguments to pass to the callback

How callbacks are called

All callbacks are called as follows:

    func(option, opt_str, value, parser, *args, **kwargs)

where:

option is the Option instance that's calling the callback.

opt_str is the option string seen on the command line that's triggering the callback. If an abbreviated long option was used, opt_str will be the full, canonical option string; e.g., if the user puts --foo on the command line as an abbreviation for --foobar, then opt_str will be "--foobar".

value is the argument to this option seen on the command line. optparse will only expect an argument if type is set; the type of value will be the type implied by the option's type. If type for this option is None (no argument expected), then value will be None. If nargs > 1, value will be a tuple of values of the appropriate type.

parser is the OptionParser instance driving the whole thing, mainly useful because you can access some other interesting data through its instance attributes:

- parser.largs: the current list of leftover arguments, i.e. arguments that have been consumed but are neither options nor option arguments. Feel free to modify parser.largs, e.g. by adding more arguments to it. This list will become args, the second return value of parse_args().
- parser.rargs: the current list of remaining arguments, i.e. with opt_str and value (if applicable) removed, and only the arguments following them still there. Feel free to modify parser.rargs, e.g. by consuming more arguments.
- parser.values: the object where option values are by default stored (an instance of optparse.OptionValues). This lets callbacks use the same mechanism as the rest of optparse for storing option values; you don't need to mess around with globals or closures. You can also access or modify the value(s) of any options already encountered on the command line.

args is a tuple of arbitrary positional arguments supplied via the callback_args option attribute.

kwargs is a dictionary of arbitrary keyword arguments supplied via callback_kwargs.

Raising errors in a callback

The callback function should raise OptionValueError if there are any problems with the option or its argument(s). optparse catches this and terminates the program, printing the error message you supply to stderr. Your message should be clear, concise, accurate, and mention the option at fault. Otherwise, the user will have a hard time figuring out what they did wrong.

Callback example 1: trivial callback

Here's an example of a callback option that takes no arguments, and simply records that the option was seen:

    def record_foo_seen(option, opt_str, value, parser):
        parser.values.saw_foo = True

    parser.add_option("--foo", action="callback", callback=record_foo_seen)

Of course, you could do that with the store_true action.

Callback example 2: check option order

Here's a slightly more interesting example: record the fact that -a is seen, but blow up if it comes after -b in the command line.

    def check_order(option, opt_str, value, parser):
        if parser.values.b:
            raise OptionValueError("can't use -a after -b")
        parser.values.a = 1

    parser.add_option("-a", action="callback", callback=check_order)
    parser.add_option("-b", action="store_true", dest="b")

Callback example 3: check option order (generalized)

If you want to re-use this callback for several similar options (set a flag, but blow up if -b has already been seen), it needs a bit of work: the error message and the flag that it sets must be
generalized:

    def check_order(option, opt_str, value, parser):
        if parser.values.b:
            raise OptionValueError("can't use %s after -b" % opt_str)
        setattr(parser.values, option.dest, 1)

    parser.add_option("-a", action="callback", callback=check_order, dest='a')
    parser.add_option("-b", action="store_true", dest="b")
    parser.add_option("-c", action="callback", callback=check_order, dest='c')

Callback example 4: check arbitrary condition

Of course, you could put any condition in there; you're not limited to checking the values of already-defined options. For example, if you have options that should not be called when the moon is full, all you have to do is this:

    def check_moon(option, opt_str, value, parser):
        if is_moon_full():
            raise OptionValueError("%s option invalid when moon is full" % opt_str)
        setattr(parser.values, option.dest, 1)

    parser.add_option("--foo", action="callback", callback=check_moon, dest="foo")

The definition of is_moon_full() is left as an exercise for the reader.

Callback example 5: fixed arguments

Things get slightly more interesting when you define callback options that take a fixed number of arguments. Specifying that a callback option takes arguments is similar to defining a "store" or "append" option: if you define type, then the option takes one argument that must be convertible to that type; if you further define nargs, then the option takes nargs arguments. Here's an example that just emulates the standard "store" action:

    def store_value(option, opt_str, value, parser):
        setattr(parser.values, option.dest, value)

    parser.add_option("--foo", action="callback", callback=store_value,
                      type="int", nargs=3, dest="foo")

Note that optparse takes care of consuming 3 arguments and converting them to integers for you; all you have to do is store them. (Or whatever; obviously you don't need a callback for this example.)

Callback example 6: variable arguments

Things get hairy when you want an option to take a variable number of arguments. For this case, you must write a callback, as optparse doesn't provide any built-in capabilities for it. And you have to deal with certain intricacies of conventional Unix command-line parsing that optparse normally handles for you. In particular, callbacks should implement the conventional rules for bare "--" and "-" arguments:

- either "--" or "-" can be option arguments
- bare "--" (if not the argument to some option): halt command-line processing and discard the "--"
- bare "-" (if not the argument to some option): halt command-line processing but keep the "-" (append it to parser.largs)

If you want an option that takes a variable number of arguments, there are several subtle, tricky issues to worry about. The exact implementation you choose will be based on which trade-offs you're willing to make for your application (which is why optparse doesn't support this sort of thing directly). Nevertheless, here's a stab at a callback for an option with variable arguments:

    def vararg_callback(option, opt_str, value, parser):
        assert value is None
        value = []

        def floatable(str):
            try:
                float(str)
                return True
            except ValueError:
                return False

        for arg in parser.rargs:
            # stop on --foo like options
            if arg[:2] == "--" and len(arg) > 2:
                break
            # stop on -a, but not on -3 or -3.0
            if arg[:1] == "-" and len(arg) > 1 and not floatable(arg):
                break
            value.append(arg)

        del parser.rargs[:len(value)]
        setattr(parser.values, option.dest, value)

    parser.add_option("-c", "--callback", dest="vararg_attr",
                      action="callback", callback=vararg_callback)

Extending optparse

Since the two major controlling factors in how optparse interprets command-line options are the action and type of each option, the most likely direction of extension is to add new actions and new types.

Adding new types

To add new types, you need to define your own subclass of optparse's Option class. This class has a couple of attributes that define optparse's types: TYPES and TYPE_CHECKER.

Option.TYPES

A tuple of type names; in your subclass, simply define a new tuple TYPES that builds on the standard one.

Option.TYPE_CHECKER

A dictionary mapping type names to type-checking functions. A type-checking function has the following signature:

    def check_mytype(option, opt, value)

where option is an Option instance, opt is an option string (e.g., -f), and value is the string from the command line that must be checked and converted to your desired type.
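As a sketch of this signature in use, here is a hypothetical checker for a made-up "probability" type (not one of optparse's built-in types); the function name and type name are illustrative only.

```python
from optparse import OptionValueError

def check_probability(option, opt, value):
    # Convert the raw command-line string; reject anything non-numeric.
    try:
        result = float(value)
    except ValueError:
        raise OptionValueError(
            "option %s: invalid probability value: %r" % (opt, value))
    # Enforce the extra constraint that makes this type more than a float.
    if not 0.0 <= result <= 1.0:
        raise OptionValueError(
            "option %s: probability must be between 0 and 1" % opt)
    return result
```

Wiring such a checker into a parser goes through the TYPE_CHECKER dictionary of an Option subclass, exactly as the "complex" example that follows demonstrates.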
check_mytype() should return an object of the hypothetical type mytype. The value returned by a type-checking function will wind up in the OptionValues instance returned by OptionParser.parse_args(), or be passed to a callback as the value parameter. Your type-checking function should raise OptionValueError if it encounters any problems. OptionValueError takes a single string argument, which is passed as-is to OptionParser's error() method, which in turn prepends the program name and the string "error:" and prints everything to stderr before terminating the process. Here's a silly example that demonstrates adding a "complex" option type to parse Python-style complex numbers on the command line. (This is even sillier than it used to be, because optparse 1.3 added built-in support for complex numbers, but never mind.) First, the necessary imports:

    from copy import copy
    from optparse import Option, OptionValueError

You need to define your type-checker first, since it's referred to later (in the TYPE_CHECKER class attribute of your Option subclass):

    def check_complex(option, opt, value):
        try:
            return complex(value)
        except ValueError:
            raise OptionValueError(
                "option %s: invalid complex value: %r" % (opt, value))

Finally, the Option subclass:

    class MyOption(Option):
        TYPES = Option.TYPES + ("complex",)
        TYPE_CHECKER = copy(Option.TYPE_CHECKER)
        TYPE_CHECKER["complex"] = check_complex

If we didn't make a copy() of Option.TYPE_CHECKER, we would end up modifying the TYPE_CHECKER attribute of optparse's Option class. This being Python, nothing stops you from doing that except good manners and common sense. That's it! Now you can write a script that uses the new option type just like any other optparse-based script, except you have to instruct your OptionParser to use MyOption instead of Option:

    parser = OptionParser(option_class=MyOption)
    parser.add_option("-c", type="complex")

Alternately, you can build your own option list and pass it to OptionParser; if you don't use add_option() in the above way, you don't need to tell OptionParser which option class to use:

    option_list = [MyOption("-c", action="store", type="complex", dest="c")]
    parser = OptionParser(option_list=option_list)

Adding new actions

Adding new actions is a bit trickier, because you have to understand that optparse has a couple of classifications for actions:

- "store" actions: actions that result in optparse storing a value to an attribute of the current OptionValues instance; these options require a dest attribute to be supplied to the Option constructor.
- "typed" actions: actions that take a value from the command line and expect it to be of a certain type; or rather, a string that can be converted to a certain type. These options require a type attribute to the Option constructor.

These are overlapping sets: some default "store" actions are "store", "store_const", "append", and "count", while the default "typed" actions are "store", "append", and "callback". When you add an action, you need to categorize it by listing it in at least one of the following class attributes of Option (all are lists of strings):

Option.ACTIONS

All actions must be listed in ACTIONS.

Option.STORE_ACTIONS

"store" actions are additionally listed here.

Option.TYPED_ACTIONS

"typed" actions are additionally listed here.

Option.ALWAYS_TYPED_ACTIONS

Actions that always take a type (i.e. whose options always take a value) are additionally listed here. The only effect of this is that optparse assigns the default type, "string", to options with no explicit type whose action is listed in ALWAYS_TYPED_ACTIONS.

In order to actually implement your new action, you must override Option's take_action() method and add a case that recognizes your action. For example, let's add an "extend" action. This is similar to the standard "append" action, but instead of taking a single value from the command line and appending it to an existing list, "extend" will take multiple values in a single comma-delimited string, and extend an existing list with them. That is, if --names is an "extend" option of type "string", the command line:

    --names=foo,bar --names blah --names ding,dong

would result in a list:

    ["foo", "bar", "blah", "ding", "dong"]

Again we define a subclass of Option:

    class MyOption(Option):

        ACTIONS = Option.ACTIONS + ("extend",)
        STORE_ACTIONS = Option.STORE_ACTIONS + ("extend",)
        TYPED_ACTIONS = Option.TYPED_ACTIONS + ("extend",)
        ALWAYS_TYPED_ACTIONS = Option.ALWAYS_TYPED_ACTIONS + ("extend",)

        def take_action(self, action, dest, opt, value, values, parser):
            if action == "extend":
                lvalue = value.split(",")
                values.ensure_value(dest, []).extend(lvalue)
            else:
                Option.take_action(
                    self, action, dest, opt, value, values, parser)

Features of note:

- "extend" both expects a value on the command line and stores that value somewhere, so it goes in both STORE_ACTIONS and TYPED_ACTIONS.
- To ensure that optparse assigns the default type of "string" to "extend" actions, we put the "extend" action in ALWAYS_TYPED_ACTIONS as well.
- MyOption.take_action() implements just this one new action, and passes control back to Option.take_action() for the standard optparse actions.
- values is an instance of the optparse_parser.Values class, which provides the very useful ensure_value() method. ensure_value() is essentially getattr() with a safety valve; it is called as:

    values.ensure_value(attr, value)

If the attr attribute of values doesn't exist or is None, then ensure_value() first sets it to value, and then returns value. This is very handy for actions like "extend", "append", and "count", all of which accumulate data in a variable and expect that variable to be of a certain type (a list for the first two, an integer for the latter). Using ensure_value() means that scripts using your action don't have to worry about setting a default value for the option destinations in question; they can just leave the default as None and ensure_value() will take care of getting it right when it's needed.

Exceptions

exception optparse.OptionError

Raised if an Option instance is created with invalid or inconsistent arguments.

exception optparse.OptionConflictError

Raised if conflicting options are added to an OptionParser.

exception optparse.OptionValueError

Raised if an invalid option value is encountered on the command line.

exception optparse.BadOptionError

Raised if an invalid option is passed on the command line.

exception optparse.AmbiguousOptionError

Raised if an ambiguous option is passed on the command line.
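To round off the error-handling story: an invalid option on the command line is not surfaced to your code as BadOptionError; parse_args() handles it internally by calling error(), which terminates the process with exit status 2, as described under parse_args() above. A small sketch (the option names are illustrative):

```python
from optparse import OptionParser

parser = OptionParser()
parser.add_option("-q", action="store_true", dest="quiet")

# An unrecognized option makes parse_args() call error(), which exits
# the process with status 2; catch SystemExit here to observe that.
try:
    parser.parse_args(["--bogus"])
except SystemExit as exc:
    print("exit status:", exc.code)  # exit status: 2
```

In a real script you would normally let the SystemExit propagate so the shell sees the conventional exit status of 2.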
What's New In Python 3.10

Editor: Pablo Galindo Salgado

This article explains the new features in Python 3.10, compared to 3.9. Python 3.10 was released on October 4, 2021. For full details, see the changelog.

Summary -- Release highlights

New syntax features:

- PEP 634, Structural Pattern Matching: Specification
- PEP 635, Structural Pattern Matching: Motivation and Rationale
- PEP 636, Structural Pattern Matching: Tutorial
- bpo-12782: Parenthesized context managers are now officially allowed

New features in the standard library:

- PEP 618, Add Optional Length-Checking To zip

Interpreter improvements:

- PEP 626, Precise line numbers for debugging and other tools

New typing features:

- PEP 604, Allow writing union types as X | Y
- PEP 612, Parameter Specification Variables
- PEP 613, Explicit Type Aliases
- PEP 647, User-Defined Type Guards

Important deprecations, removals or restrictions:

- PEP 644, Require OpenSSL 1.1.1 or newer
- PEP 632, Deprecate distutils module
- PEP 623, Deprecate and prepare for the removal of the wstr member in PyUnicodeObject
- PEP 624, Remove Py_UNICODE encoder APIs
- PEP 597, Add optional EncodingWarning

New Features

Parenthesized context managers

Using enclosing parentheses for continuation across multiple lines in context managers is now supported. This allows formatting a long collection of context managers in multiple lines in a similar way as it was previously possible with import statements. For instance, all these examples are now valid:

    with (CtxManager() as example):
        ...

    with (
        CtxManager1(),
        CtxManager2()
    ):
        ...

    with (CtxManager1() as example,
          CtxManager2()):
        ...

    with (CtxManager1(),
          CtxManager2() as example):
        ...

    with (
        CtxManager1() as example1,
        CtxManager2() as example2
    ):
        ...

it is also possible to use a trailing comma at the end of the enclosed group:

    with (
        CtxManager1() as example1,
        CtxManager2() as example2,
        CtxManager3() as example3,
    ):
        ...

This new syntax uses the non LL(1) capacities of the new parser. Check PEP 617 for more details. (Contributed by Guido van Rossum, Pablo Galindo and Lysandros Nikolaou in bpo-12782 and bpo-40334.)

Better error messages

SyntaxErrors

When parsing code that contains unclosed parentheses or brackets, the interpreter now includes the location of the unclosed bracket or parentheses instead of displaying SyntaxError: unexpected EOF while parsing or pointing to some incorrect location. For instance, consider the following code (notice the unclosed '{'):

    expected = {9: 1, 18: 2, 19: 2, 27: 3, 28: 3, 29: 3, 36: 4, 37: 4,
                38: 4, 39: 4, 45: 5, 46: 5, 47: 5, 48: 5, 49: 5, 54: 6,
    some_other_code = foo()

Previous versions of the interpreter reported confusing places as the location of the syntax error:

    File "example.py", line 3
        some_other_code = foo()
                        ^
    SyntaxError: invalid syntax

but in Python 3.10 a more informative error is emitted:

    File "example.py", line 1
        expected = {9: 1, 18: 2, 19: 2, 27: 3, 28: 3, 29: 3, 36: 4, 37: 4,
                   ^
    SyntaxError: '{' was never closed

In a similar way, errors involving unclosed string literals (single and triple quoted) now point to the start of the string instead of reporting EOF/EOL. These improvements are inspired by previous work in the PyPy interpreter. (Contributed by Pablo Galindo in bpo-42864 and Batuhan Taskaya in bpo-40176.)

SyntaxError exceptions raised by the interpreter will now highlight the full error range of the expression that constitutes the syntax error itself, instead of just where the problem is detected. In this way, instead of displaying (before Python 3.10):

    >>> foo(x, z for z in range(10), t, w)
      File "<stdin>", line 1
        foo(x, z for z in range(10), t, w)
               ^
    SyntaxError: Generator expression must be parenthesized

now Python 3.10 will display the exception as:

    >>> foo(x, z for z in range(10), t, w)
      File "<stdin>", line 1
        foo(x, z for z in range(10), t, w)
               ^^^^^^^^^^^^^^^^^^^^
    SyntaxError: Generator expression must be parenthesized

(This improvement was contributed by Pablo Galindo in bpo-43914.)

A considerable amount of new specialized messages for SyntaxError exceptions have been incorporated. Some of the most notable ones are as follows:

Missing ':' before blocks:

    >>> if rocket.position > event_horizon
      File "<stdin>", line 1
        if rocket.position > event_horizon
                                          ^
    SyntaxError: expected ':'

(Contributed by Pablo Galindo in bpo-42997.)

Unparenthesised tuples
in comprehensions targets x y for x y in zip abcd 1234 File stdin line 1 x y for x y in zip abcd 123
4 SyntaxError did you forget parentheses around the comprehension target Contributed by Pablo Galindo in bpo 43017 Missing commas in collection literals and between expressions items x 1 y 2 z 3 File stdin line 3 y 2 SyntaxError invalid syntax Perhaps you forgot a comma Contributed by Pablo Galindo in bpo 43822 Multiple Exception types without parentheses try build_dyson_sphere except NotEnoughScienceError NotEnoughResourcesError File stdin line 3 except NotEnoughScienceError NotEnoughResourcesError SyntaxError multiple exception types must be parenthesized Contributed by Pablo Galindo in bpo 43149 Missing and values in dictionary literals values x 1 y 2 z File stdin line 4 z SyntaxError expression expected after dictionary key and values x 1 y 2 z w 3 File stdin line 1 values x 1 y 2 z w 3 SyntaxError expected after dictionary key Contributed by Pablo Galindo in bpo 43823 try blocks without except or finally blocks try x 2 something 3 File stdin line 3 something 3 SyntaxError expected except or finally block Contributed by Pablo Galindo in bpo 44305 Usage of instead of in comparisons if rocket position event_horizon File stdin line 1 if rocket position event_horizon SyntaxError cannot assign to attribute here Maybe you meant instead of Contributed by Pablo Galindo in bpo 43797 Usage of in f strings f Black holes all_black_holes and revelations File stdin line 1 all_black_holes SyntaxError f string cannot use starred expression here Contributed by Pablo Galindo in bpo 41064 IndentationErrors Many IndentationError exceptions now have more context regarding what kind of block was expecting an indentation including the location of the statement def foo if lel x 2 File stdin line 3 x 2 IndentationError expected an indented block after if statement in line 2 AttributeErrors When printing AttributeError PyErr_Display will offer suggestions of similar attribute names in the object that the exception was raised from collections namedtoplo Traceback most recent call last 
File stdin line 1 in module AttributeError module collections has no attribute namedtoplo Did you mean namedtuple Contributed by Pablo Galindo in bpo 38530 Warning Notice this won t work if PyErr_Display is not called to display the error which can happen if some other custom error display function is used This is a common scenario in some REPLs like IPython NameErrors When printing NameError raised by the interpreter PyErr_Display will offer suggestions of similar variable names in the function that the exception was raised from schwarzschild_black_hole None schwarschild_black_hole Traceback most recent call last File stdin line 1 in module NameError name schwarschild_black_hole is not defined Did you mean schwarzschild_black_hole Contributed by Pablo Galindo in bpo 38530 Warning Notice this won t work if PyErr_Display is not called to display the error which can happen if some other custom error display function is used This is a common scenario in some REPLs like IPython PEP 626 Precise line numbers for debugging and other tools PEP 626 brings more precise and reliable line numbers for debugging profiling and coverage tools Tracing events with the correct line number are generated for all lines of code executed and only for lines of code that are executed The f_lineno attribute of frame objects will always contain the expected line number The co_lnotab attribute of code objects is deprecated and will be removed in 3 12 Code that needs to convert from offset to line number should use the new co_lines method instead PEP 634 Structural Pattern Matching Structural pattern matching has been added in the form of a match statement and case statements of patterns with associated actions Patterns consist of sequences mappings primitive data types as well as class instances Pattern matching enables programs to extract information from complex data types branch on the structure of data and apply specific actions based on different forms of data Syntax and operations The 
generic syntax of pattern matching is match subject case pattern_1 action_1 case pattern_2 action_2 ca
se pattern_3 action_3 case _ action_wildcard A match statement takes an expression and compares its value to successive patterns given as one or more case blocks Specifically pattern matching operates by 1 using data with type and shape the subject 2 evaluating the subject in the match statement 3 comparing the subject with each pattern in a case statement from top to bottom until a match is confirmed 4 executing the action associated with the pattern of the confirmed match 5 If an exact match is not confirmed the last case a wildcard _ if provided will be used as the matching case If an exact match is not confirmed and a wildcard case does not exist the entire match block is a no op Declarative approach Readers may be aware of pattern matching through the simple example of matching a subject data object to a literal pattern with the switch statement found in C Java or JavaScript and many other languages Often the switch statement is used for comparison of an object expression with case statements containing literals More powerful examples of pattern matching can be found in languages such as Scala and Elixir With structural pattern matching the approach is declarative and explicitly states the conditions the patterns for data to match While an imperative series of instructions using nested if statements could be used to accomplish something similar to structural pattern matching it is less clear than the declarative approach Instead the declarative approach states the conditions to meet for a match and is more readable through its explicit patterns While structural pattern matching can be used in its simplest form comparing a variable to a literal in a case statement its true value for Python lies in its handling of the subject s type and shape Simple pattern match to a literal Let s look at this example as pattern matching in its simplest form a value the subject being matched to several literals the patterns In the example below status is the subject of the 
match statement The patterns are each of the case statements where literals represent request status codes The associated action to the case is executed after a match def http_error status match status case 400 return Bad request case 404 return Not found case 418 return I m a teapot case _ return Something s wrong with the internet If the above function is passed a status of 418 I m a teapot is returned If the above function is passed a status of 500 the case statement with _ will match as a wildcard and Something s wrong with the internet is returned Note the last block the variable name _ acts as a wildcard and insures the subject will always match The use of _ is optional You can combine several literals in a single pattern using or case 401 403 404 return Not allowed Behavior without the wildcard If we modify the above example by removing the last case block the example becomes def http_error status match status case 400 return Bad request case 404 return Not found case 418 return I m a teapot Without the use of _ in a case statement a match may not exist If no match exists the behavior is a no op For example if status of 500 is passed a no op occurs Patterns with a literal and variable Patterns can look like unpacking assignments and a pattern may be used to bind variables In this example a data point can be unpacked to its x coordinate and y coordinate point is an x y tuple match point case 0 0 print Origin case 0 y print f Y y case x 0 print f X x case x y print f X x Y y case _ raise ValueError Not a point The first pattern has two literals 0 0 and may be thought of as an extension of the literal pattern shown above The next two patterns combine a literal and a variable and the variable binds a value from the subject point The fourth pattern captures two values which makes it conceptually similar to the unpacking assignment x y point Patterns and classes If you are using classes to structure your data you can use as a pattern the class name followed by an 
argument list resembling a constructor This pattern has the ability to capture class attributes into v
ariables class Point x int y int def location point match point case Point x 0 y 0 print Origin is the point s location case Point x 0 y y print f Y y and the point is on the y axis case Point x x y 0 print f X x and the point is on the x axis case Point print The point is located somewhere else on the plane case _ print Not a point Patterns with positional parameters You can use positional parameters with some builtin classes that provide an ordering for their attributes e g dataclasses You can also define a specific position for attributes in patterns by setting the __match_args__ special attribute in your classes If it s set to x y the following patterns are all equivalent and all bind the y attribute to the var variable Point 1 var Point 1 y var Point x 1 y var Point y var x 1 Nested patterns Patterns can be arbitrarily nested For example if our data is a short list of points it could be matched like this match points case print No points in the list case Point 0 0 print The origin is the only point in the list case Point x y print f A single point x y is in the list case Point 0 y1 Point 0 y2 print f Two points on the Y axis at y1 y2 are in the list case _ print Something else is found in the list Complex patterns and the wildcard To this point the examples have used _ alone in the last case statement A wildcard can be used in more complex patterns such as error code _ For example match test_variable case warning code 40 print A warning has been received case error code _ print f An error code occurred In the above case test_variable will match for error code 100 and error code 800 Guard We can add an if clause to a pattern known as a guard If the guard is false match goes on to try the next case block Note that value capture happens before the guard is evaluated match point case Point x y if x y print f The point is located on the diagonal Y X at x case Point x y print f Point is not on the diagonal Other Key Features Several other key features Like unpacking 
assignments tuple and list patterns have exactly the same meaning and actually match arbitrary sequences Technically the subject must be a sequence Therefore an important exception is that patterns don t match iterators Also to prevent a common mistake sequence patterns don t match strings Sequence patterns support wildcards x y rest and x y rest work similar to wildcards in unpacking assignments The name after may also be _ so x y _ matches a sequence of at least two items without binding the remaining items Mapping patterns bandwidth b latency l captures the bandwidth and latency values from a dict Unlike sequence patterns extra keys are ignored A wildcard rest is also supported But _ would be redundant so is not allowed Subpatterns may be captured using the as keyword case Point x1 y1 Point x2 y2 as p2 This binds x1 y1 x2 y2 like you would expect without the as clause and p2 to the entire second item of the subject Most literals are compared by equality However the singletons True False and None are compared by identity Named constants may be used in patterns These named constants must be dotted names to prevent the constant from being interpreted as a capture variable from enum import Enum class Color Enum RED 0 GREEN 1 BLUE 2 color Color GREEN match color case Color RED print I see red case Color GREEN print Grass is green case Color BLUE print I m feeling the blues For the full specification see PEP 634 Motivation and rationale are in PEP 635 and a longer tutorial is in PEP 636 Optional EncodingWarning and encoding locale option The default encoding of TextIOWrapper and open is platform and locale dependent Since UTF 8 is used on most Unix platforms omitting encoding option when opening UTF 8 files e g JSON YAML TOML Markdown is a very common bug For example BUG rb mode or encoding utf 8 should be used with open data json as f data json load f To find this type of bug an optional EncodingWarning is added It is emitted when sys flags warn_default_encoding is 
true and locale specific default encoding is used X warn_default_encoding option and PYTHONWARNDEF
AULTENCODING are added to enable the warning See Text Encoding for more information New Features Related to Type Hints This section covers major changes affecting PEP 484 type hints and the typing module PEP 604 New Type Union Operator A new type union operator was introduced which enables the syntax X Y This provides a cleaner way of expressing either type X or type Y instead of using typing Union especially in type hints In previous versions of Python to apply a type hint for functions accepting arguments of multiple types typing Union was used def square number Union int float Union int float return number 2 Type hints can now be written in a more succinct manner def square number int float int float return number 2 This new syntax is also accepted as the second argument to isinstance and issubclass isinstance 1 int str True See Union Type and PEP 604 for more details Contributed by Maggie Moss and Philippe Prados in bpo 41428 with additions by Yurii Karabas and Serhiy Storchaka in bpo 44490 PEP 612 Parameter Specification Variables Two new options to improve the information provided to static type checkers for PEP 484 s Callable have been added to the typing module The first is the parameter specification variable They are used to forward the parameter types of one callable to another callable a pattern commonly found in higher order functions and decorators Examples of usage can be found in typing ParamSpec Previously there was no easy way to type annotate dependency of parameter types in such a precise manner The second option is the new Concatenate operator It s used in conjunction with parameter specification variables to type annotate a higher order callable which adds or removes parameters of another callable Examples of usage can be found in typing Concatenate See typing Callable typing ParamSpec typing Concatenate typing ParamSpecArgs typing ParamSpecKwargs and PEP 612 for more details Contributed by Ken Jin in bpo 41559 with minor enhancements by Jelle 
Zijlstra in bpo 43783 PEP written by Mark Mendoza PEP 613 TypeAlias PEP 484 introduced the concept of type aliases only requiring them to be top level unannotated assignments This simplicity sometimes made it difficult for type checkers to distinguish between type aliases and ordinary assignments especially when forward references or invalid types were involved Compare StrCache Cache str a type alias LOG_PREFIX LOG DEBUG a module constant Now the typing module has a special value TypeAlias which lets you declare type aliases more explicitly StrCache TypeAlias Cache str a type alias LOG_PREFIX LOG DEBUG a module constant See PEP 613 for more details Contributed by Mikhail Golubev in bpo 41923 PEP 647 User Defined Type Guards TypeGuard has been added to the typing module to annotate type guard functions and improve information provided to static type checkers during type narrowing For more information please see TypeGuard s documentation and PEP 647 Contributed by Ken Jin and Guido van Rossum in bpo 43766 PEP written by Eric Traut Other Language Changes The int type has a new method int bit_count returning the number of ones in the binary expansion of a given integer also known as the population count Contributed by Niklas Fiekas in bpo 29882 The views returned by dict keys dict values and dict items now all have a mapping attribute that gives a types MappingProxyType object wrapping the original dictionary Contributed by Dennis Sweeney in bpo 40890 PEP 618 The zip function now has an optional strict flag used to require that all the iterables have an equal length Builtin and extension functions that take integer arguments no longer accept Decimal s Fraction s and other objects that can be converted to integers only with a loss e g that have the __int__ method but do not have the __index__ method Contributed by Serhiy Storchaka in bpo 37999 If object __ipow__ returns NotImplemented the operator will correctly fall back to object __pow__ and object __rpow__ as 
expected Contributed by Alex Shkop in bpo 38302 Assignment expressions can now be used unparenthesized w
ithin set literals and set comprehensions as well as in sequence indexes but not slices Functions have a new __builtins__ attribute which is used to look for builtin symbols when a function is executed instead of looking into __globals__ __builtins__ The attribute is initialized from __globals__ __builtins__ if it exists else from the current builtins Contributed by Mark Shannon in bpo 42990 Two new builtin functions aiter and anext have been added to provide asynchronous counterparts to iter and next respectively Contributed by Joshua Bronson Daniel Pope and Justin Wang in bpo 31861 Static methods staticmethod and class methods classmethod now inherit the method attributes __module__ __name__ __qualname__ __doc__ __annotations__ and have a new __wrapped__ attribute Moreover static methods are now callable as regular functions Contributed by Victor Stinner in bpo 43682 Annotations for complex targets everything beside simple name targets defined by PEP 526 no longer cause any runtime effects with from __future__ import annotations Contributed by Batuhan Taskaya in bpo 42737 Class and module objects now lazy create empty annotations dicts on demand The annotations dicts are stored in the object s __dict__ for backwards compatibility This improves the best practices for working with __annotations__ for more information please see Annotations Best Practices Contributed by Larry Hastings in bpo 43901 Annotations consist of yield yield from await or named expressions are now forbidden under from __future__ import annotations due to their side effects Contributed by Batuhan Taskaya in bpo 42725 Usage of unbound variables super and other expressions that might alter the processing of symbol table as annotations are now rendered effectless under from __future__ import annotations Contributed by Batuhan Taskaya in bpo 42725 Hashes of NaN values of both float type and decimal Decimal type now depend on object identity Formerly they always hashed to 0 even though NaN values 
are not equal to one another This caused potentially quadratic runtime behavior due to excessive hash collisions when creating dictionaries and sets containing multiple NaNs Contributed by Raymond Hettinger in bpo 43475 A SyntaxError instead of a NameError will be raised when deleting the __debug__ constant Contributed by Donghee Na in bpo 45000 SyntaxError exceptions now have end_lineno and end_offset attributes They will be None if not determined Contributed by Pablo Galindo in bpo 43914 New Modules None Improved Modules asyncio Add missing connect_accepted_socket method Contributed by Alex Grönholm in bpo 41332 argparse Misleading phrase optional arguments was replaced with options in argparse help Some tests might require adaptation if they rely on exact output match Contributed by Raymond Hettinger in bpo 9694 array The index method of array array now has optional start and stop parameters Contributed by Anders Lorentsen and Zackery Spytz in bpo 31956 asynchat asyncore smtpd These modules have been marked as deprecated in their module documentation since Python 3 6 An import time DeprecationWarning has now been added to all three of these modules base64 Add base64 b32hexencode and base64 b32hexdecode to support the Base32 Encoding with Extended Hex Alphabet bdb Add clearBreakpoints to reset all set breakpoints Contributed by Irit Katriel in bpo 24160 bisect Added the possibility of providing a key function to the APIs in the bisect module Contributed by Raymond Hettinger in bpo 4356 codecs Add a codecs unregister function to unregister a codec search function Contributed by Hai Shi in bpo 41842 collections abc The __args__ of the parameterized generic for collections abc Callable are now consistent with typing Callable collections abc Callable generic now flattens type parameters similar to what typing Callable currently does This means that collections abc Callable int str str will have __args__ of int str str previously this was int str str To allow this 
change types GenericAlias can now be subclassed and a subclass will be returned when subscripting the
collections abc Callable type Note that a TypeError may be raised for invalid forms of parameterizing collections abc Callable which may have passed silently in Python 3 9 Contributed by Ken Jin in bpo 42195 contextlib Add a contextlib aclosing context manager to safely close async generators and objects representing asynchronously released resources Contributed by Joongi Kim and John Belmonte in bpo 41229 Add asynchronous context manager support to contextlib nullcontext Contributed by Tom Gringauz in bpo 41543 Add AsyncContextDecorator for supporting usage of async context managers as decorators curses The extended color functions added in ncurses 6 1 will be used transparently by curses color_content curses init_color curses init_pair and curses pair_content A new function curses has_extended_color_support indicates whether extended color support is provided by the underlying ncurses library Contributed by Jeffrey Kintscher and Hans Petter Jansson in bpo 36982 The BUTTON5_ constants are now exposed in the curses module if they are provided by the underlying curses library Contributed by Zackery Spytz in bpo 39273 dataclasses __slots__ Added slots parameter in dataclasses dataclass decorator Contributed by Yurii Karabas in bpo 42269 Keyword only fields dataclasses now supports fields that are keyword only in the generated __init__ method There are a number of ways of specifying keyword only fields You can say that every field is keyword only from dataclasses import dataclass dataclass kw_only True class Birthday name str birthday datetime date Both name and birthday are keyword only parameters to the generated __init__ method You can specify keyword only on a per field basis from dataclasses import dataclass field dataclass class Birthday name str birthday datetime date field kw_only True Here only birthday is keyword only If you set kw_only on individual fields be aware that there are rules about re ordering fields due to keyword only fields needing to follow 
non keyword only fields See the full dataclasses documentation for details You can also specify that all fields following a KW_ONLY marker are keyword only This will probably be the most common usage from dataclasses import dataclass KW_ONLY dataclass class Point x float y float _ KW_ONLY z float 0 0 t float 0 0 Here z and t are keyword only parameters while x and y are not Contributed by Eric V Smith in bpo 43532 distutils The entire distutils package is deprecated to be removed in Python 3 12 Its functionality for specifying package builds has already been completely replaced by third party packages setuptools and packaging and most other commonly used APIs are available elsewhere in the standard library such as platform shutil subprocess or sysconfig There are no plans to migrate any other functionality from distutils and applications that are using other functions should plan to make private copies of the code Refer to PEP 632 for discussion The bdist_wininst command deprecated in Python 3 8 has been removed The bdist_wheel command is now recommended to distribute binary packages on Windows Contributed by Victor Stinner in bpo 42802 doctest When a module does not define __loader__ fall back to __spec__ loader Contributed by Brett Cannon in bpo 42133 encodings encodings normalize_encoding now ignores non ASCII characters Contributed by Hai Shi in bpo 39337 enum Enum __repr__ now returns enum_name member_name and __str__ now returns member_name Stdlib enums available as module constants have a repr of module_name member_name Contributed by Ethan Furman in bpo 40066 Add enum StrEnum for enums where all members are strings Contributed by Ethan Furman in bpo 41816 fileinput Add encoding and errors parameters in fileinput input and fileinput FileInput Contributed by Inada Naoki in bpo 43712 fileinput hook_compressed now returns TextIOWrapper object when mode is r and file is compressed like uncompressed files Contributed by Inada Naoki in bpo 5758 faulthandler The 
faulthandler module now detects if a fatal error occurs during a garbage collector collection Contribu
ted by Victor Stinner in bpo 44466 gc Add audit hooks for gc get_objects gc get_referrers and gc get_referents Contributed by Pablo Galindo in bpo 43439 glob Add the root_dir and dir_fd parameters in glob and iglob which allow to specify the root directory for searching Contributed by Serhiy Storchaka in bpo 38144 hashlib The hashlib module requires OpenSSL 1 1 1 or newer Contributed by Christian Heimes in PEP 644 and bpo 43669 The hashlib module has preliminary support for OpenSSL 3 0 0 Contributed by Christian Heimes in bpo 38820 and other issues The pure Python fallback of pbkdf2_hmac is deprecated In the future PBKDF2 HMAC will only be available when Python has been built with OpenSSL support Contributed by Christian Heimes in bpo 43880 hmac The hmac module now uses OpenSSL s HMAC implementation internally Contributed by Christian Heimes in bpo 40645 IDLE and idlelib Make IDLE invoke sys excepthook when started without n User hooks were previously ignored Contributed by Ken Hilton in bpo 43008 Rearrange the settings dialog Split the General tab into Windows and Shell Ed tabs Move help sources which extend the Help menu to the Extensions tab Make space for new options and shorten the dialog The latter makes the dialog better fit small screens Contributed by Terry Jan Reedy in bpo 40468 Move the indent space setting from the Font tab to the new Windows tab Contributed by Mark Roseman and Terry Jan Reedy in bpo 33962 The changes above were backported to a 3 9 maintenance release Add a Shell sidebar Move the primary prompt to the sidebar Add secondary prompts to the sidebar Left click and optional drag selects one or more lines of text as with the editor line number sidebar Right click after selecting text lines displays a context menu with copy with prompts This zips together prompts from the sidebar with lines from the selected text This option also appears on the context menu for the text Contributed by Tal Einat in bpo 37903 Use spaces instead of tabs to indent 
interactive code This makes interactive code entries look right Making this feasible was a major motivation for adding the shell sidebar Contributed by Terry Jan Reedy in bpo 37892 Highlight the new soft keywords match case and _ in pattern matching statements However this highlighting is not perfect and will be incorrect in some rare cases including some _ s in case patterns Contributed by Tal Einat in bpo 44010 New in 3 10 maintenance releases Apply syntax highlighting to pyi files Contributed by Alex Waygood and Terry Jan Reedy in bpo 45447 Include prompts when saving Shell with inputs and outputs Contributed by Terry Jan Reedy in gh 95191 importlib metadata Feature parity with importlib_metadata 4 6 history importlib metadata entry points now provide a nicer experience for selecting entry points by group and name through a new importlib metadata EntryPoints class See the Compatibility Note in the docs for more info on the deprecation and usage Added importlib metadata packages_distributions for resolving top level Python modules and packages to their importlib metadata Distribution inspect When a module does not define __loader__ fall back to __spec__ loader Contributed by Brett Cannon in bpo 42133 Add inspect get_annotations which safely computes the annotations defined on an object It works around the quirks of accessing the annotations on various types of objects and makes very few assumptions about the object it examines inspect get_annotations can also correctly un stringize stringized annotations inspect get_annotations is now considered best practice for accessing the annotations dict defined on any Python object for more information on best practices for working with annotations please see Annotations Best Practices Relatedly inspect signature inspect Signature from_callable and inspect Signature from_function now call inspect get_annotations to retrieve annotations This means inspect signature and inspect Signature from_callable can also now un 
stringize stringized annotations Contributed by Larry Hastings in bpo 43817 itertools Add itertools pair
wise Contributed by Raymond Hettinger in bpo 38200 linecache When a module does not define __loader__ fall back to __spec__ loader Contributed by Brett Cannon in bpo 42133 os Add os cpu_count support for VxWorks RTOS Contributed by Peixing Xin in bpo 41440 Add a new function os eventfd and related helpers to wrap the eventfd2 syscall on Linux Contributed by Christian Heimes in bpo 41001 Add os splice that allows to move data between two file descriptors without copying between kernel address space and user address space where one of the file descriptors must refer to a pipe Contributed by Pablo Galindo in bpo 41625 Add O_EVTONLY O_FSYNC O_SYMLINK and O_NOFOLLOW_ANY for macOS Contributed by Donghee Na in bpo 43106 os path os path realpath now accepts a strict keyword only argument When set to True OSError is raised if a path doesn t exist or a symlink loop is encountered Contributed by Barney Gale in bpo 43757 pathlib Add slice support to PurePath parents Contributed by Joshua Cannon in bpo 35498 Add negative indexing support to PurePath parents Contributed by Yaroslav Pankovych in bpo 21041 Add Path hardlink_to method that supersedes link_to The new method has the same argument order as symlink_to Contributed by Barney Gale in bpo 39950 pathlib Path stat and chmod now accept a follow_symlinks keyword only argument for consistency with corresponding functions in the os module Contributed by Barney Gale in bpo 39906 platform Add platform freedesktop_os_release to retrieve operation system identification from freedesktop org os release standard file Contributed by Christian Heimes in bpo 28468 pprint pprint pprint now accepts a new underscore_numbers keyword argument Contributed by sblondon in bpo 42914 pprint can now pretty print dataclasses dataclass instances Contributed by Lewis Gaul in bpo 43080 py_compile Add quiet option to command line interface of py_compile Contributed by Gregory Schevchenko in bpo 38731 pyclbr Add an end_lineno attribute to the Function and 
Class objects in the tree returned by pyclbr readmodule and pyclbr readmodule_ex It matches the existing start lineno Contributed by Aviral Srivastava in bpo 38307 shelve The shelve module now uses pickle DEFAULT_PROTOCOL by default instead of pickle protocol 3 when creating shelves Contributed by Zackery Spytz in bpo 34204 statistics Add covariance Pearson s correlation and simple linear_regression functions Contributed by Tymoteusz Wołodźko in bpo 38490 site When a module does not define __loader__ fall back to __spec__ loader Contributed by Brett Cannon in bpo 42133 socket The exception socket timeout is now an alias of TimeoutError Contributed by Christian Heimes in bpo 42413 Add option to create MPTCP sockets with IPPROTO_MPTCP Contributed by Rui Cunha in bpo 43571 Add IP_RECVTOS option to receive the type of service ToS or DSCP ECN fields Contributed by Georg Sauthoff in bpo 44077 ssl The ssl module requires OpenSSL 1 1 1 or newer Contributed by Christian Heimes in PEP 644 and bpo 43669 The ssl module has preliminary support for OpenSSL 3 0 0 and new option OP_IGNORE_UNEXPECTED_EOF Contributed by Christian Heimes in bpo 38820 bpo 43794 bpo 43788 bpo 43791 bpo 43799 bpo 43920 bpo 43789 and bpo 43811 Deprecated function and use of deprecated constants now result in a DeprecationWarning ssl SSLContext options has OP_NO_SSLv2 and OP_NO_SSLv3 set by default and therefore cannot warn about setting the flag again The deprecation section has a list of deprecated features Contributed by Christian Heimes in bpo 43880 The ssl module now has more secure default settings Ciphers without forward secrecy or SHA 1 MAC are disabled by default Security level 2 prohibits weak RSA DH and ECC keys with less than 112 bits of security SSLContext defaults to minimum protocol version TLS 1 2 Settings are based on Hynek Schlawack s research Contributed by Christian Heimes in bpo 43998 The deprecated protocols SSL 3 0 TLS 1 0 and TLS 1 1 are no longer officially supported Python does 
not block them actively However OpenSSL build options distro configurations vendor patches and cipher suites may prevent a successful handshake Add a timeout parameter to the ssl get_server_certificate function Contributed by Zackery Spytz in bpo 31870 The ssl module uses heap types and multi phase initialization Contributed by Christian Heimes in bpo 42333 A new verify flag VERIFY_X509_PARTIAL_CHAIN has been added Contributed by l0x in bpo 40849 sqlite3 Add audit events for connect handle enable_load_extension and load_extension Contributed by Erlend E Aasland in bpo 43762 sys Add sys orig_argv attribute the list of the original command line arguments passed to the Python executable Contributed by Victor Stinner in bpo 23427 Add sys stdlib_module_names containing the list of the standard library module names Contributed by Victor Stinner in bpo 42955 _thread _thread interrupt_main now takes an optional signal number to simulate the default is still signal SIGINT Contributed by Antoine Pitrou in bpo 43356 threading Add threading gettrace and threading getprofile to retrieve the functions set by threading settrace and threading setprofile respectively Contributed by Mario Corchero in bpo 42251 Add threading __excepthook__ to allow retrieving the original value of threading excepthook in case it is set to a broken or a different value Contributed by Mario Corchero in bpo 42308 traceback The format_exception format_exception_only and print_exception functions can now take an exception object as a positional only argument Contributed by Zackery Spytz and Matthias Bussonnier in bpo 26389 types Reintroduce the types EllipsisType types NoneType and types NotImplementedType classes providing a new set of types readily interpretable by type checkers Contributed by Bas van Beek in bpo 41810 typing For major changes see New Features Related to Type Hints The behavior of typing Literal was changed to conform with PEP 586 and to match the behavior of static type checkers specified in the PEP 1 Literal now de duplicates parameters 2 Equality comparisons between Literal
objects are now order independent 3 Literal comparisons now respect types For example Literal[0] == Literal[False] previously evaluated to True It is now False To support this change the internally used type cache now supports differentiating types 4 Literal objects will now raise a TypeError exception during equality comparisons if any of their parameters are not hashable Note that declaring Literal with unhashable parameters will not throw an error

>>> from typing import Literal
>>> Literal[{0}]
>>> Literal[{0}] == Literal[{False}]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'set'

Contributed by Yurii Karabas in bpo 42345 Add new function typing is_typeddict to introspect if an annotation is a typing TypedDict Contributed by Patrick Reader in bpo 41792 Subclasses of typing Protocol which only have data variables declared will now raise a TypeError when checked with isinstance unless they are decorated with runtime_checkable Previously these checks passed silently Users should decorate their subclasses with the runtime_checkable decorator if they want runtime protocols Contributed by Yurii Karabas in bpo 38908 Importing from the typing io and typing re submodules will now emit DeprecationWarning These submodules have been deprecated since Python 3 8 and will be removed in a future version of Python Anything belonging to those submodules should be imported directly from typing instead Contributed by Sebastian Rittau in bpo 38291 unittest Add new method assertNoLogs to complement the existing assertLogs Contributed by Kit Yan Choi in bpo 39385 urllib parse Python versions earlier than Python 3 10 allowed using both ; and & as query parameter separators in urllib parse parse_qs and urllib parse parse_qsl Due to security concerns and to conform with newer W3C recommendations this has been changed to allow only a single separator key with & as the default This change also affects cgi parse and cgi parse_multipart as they use the affected functions internally For
more details please see their respective documentation Contributed by Adam Goldschmidt Senthil Kumaran and Ken Jin in bpo 42967 The presence of newline or tab characters in parts of a URL allows for some forms of attacks Following the WHATWG specification that updates RFC 3986 ASCII newline n r and tab t characters are stripped from the URL by the parser in urllib parse preventing such attacks The removal characters are controlled by a new module level variable urllib parse _UNSAFE_URL_BYTES_TO_REMOVE See gh 88048 xml Add a LexicalHandler class to the xml sax handler module Contributed by Jonathan Gossage and Zackery Spytz in bpo 35018 zipimport Add methods related to PEP 451 find_spec zipimport zipimporter create_module and zipimport zipimporter exec_module Contributed by Brett Cannon in bpo 42131 Add invalidate_caches method Contributed by Desmond Cheong in bpo 14678 Optimizations Constructors str bytes and bytearray are now faster around 30 40% for small objects Contributed by Serhiy Storchaka in bpo 41334 The runpy module now imports fewer modules The python3 m module name command startup time is 1 4x faster on average On Linux python3 I m module name imports 69 modules on Python 3 9 whereas it only imports 51 modules 18 fewer on Python 3 10 Contributed by Victor Stinner in bpo 41006 and bpo 41718 The LOAD_ATTR instruction now uses a new per opcode cache mechanism It is about 36% faster now for regular attributes and 44% faster for slots Contributed by Pablo Galindo and Yury Selivanov in bpo 42093 and Guido van Rossum in bpo 42927 based on ideas implemented originally in PyPy and MicroPython When building Python with enable optimizations now fno semantic interposition is added to both the compile and link line This speeds builds of the Python interpreter created with enable shared with gcc by up to 30% See this article for more details Contributed by Victor Stinner and Pablo Galindo in bpo 38980 Use a new output buffer management code for bz2 lzma zlib modules and add readall function to _compression DecompressReader class bz2 decompression is now 1 09x 1 17x faster lzma
decompression 1 20x 1 32x faster GzipFile read 1 1 11x 1 18x faster Contributed by Ma Lin reviewed by Gregory P Smith in bpo 41486 When using stringized annotations annotations dicts for functions are no longer created when the function is created Instead they are stored as a tuple of strings and the function object lazily converts this into the annotations dict on demand This optimization cuts the CPU time needed to define an annotated function by half Contributed by Yurii Karabas and Inada Naoki in bpo 42202 Substring search functions such as str1 in str2 and str2 find str1 now sometimes use Crochemore Perrin s Two Way string searching algorithm to avoid quadratic behavior on long strings Contributed by Dennis Sweeney in bpo 41972 Add micro optimizations to _PyType_Lookup to improve type attribute cache lookup performance in the common case of cache hits This makes the interpreter 1 04 times faster on average Contributed by Dino Viehland in bpo 43452 The following built in functions now support the faster PEP 590 vectorcall calling convention map filter reversed bool and float Contributed by Donghee Na and Jeroen Demeyer in bpo 43575 bpo 43287 bpo 41922 bpo 41873 and bpo 41870 BZ2File performance is improved by removing internal RLock This makes BZ2File thread unsafe in the face of multiple simultaneous readers or writers just like its equivalent classes in gzip and lzma have always been Contributed by Inada Naoki in bpo 43785 Deprecated Currently Python accepts numeric literals immediately followed by keywords for example 0in x 1or x 0if 1else 2 It allows confusing and ambiguous expressions like 0x1for x in y which can be interpreted as 0x1 for x in y or 0x1f or x in y Starting in this release a deprecation warning is raised if the numeric literal is immediately followed by one of keywords and else for if in is and or In future releases it will be changed to syntax warning and finally to syntax error Contributed by Serhiy Storchaka in bpo 43833 Starting in this 
release there will be a concerted effort to begin cleaning up old import semantics that were kept for
Python 2 7 compatibility Specifically find_loader find_module superseded by find_spec load_module superseded by exec_module module_repr which the import system takes care of for you the __package__ attribute superseded by __spec__ parent the __loader__ attribute superseded by __spec__ loader and the __cached__ attribute superseded by __spec__ cached will slowly be removed as well as other classes and methods in importlib ImportWarning and or DeprecationWarning will be raised as appropriate to help identify code which needs updating during this transition The entire distutils namespace is deprecated to be removed in Python 3 12 Refer to the module changes section for more information Non integer arguments to random randrange are deprecated The ValueError is deprecated in favor of a TypeError Contributed by Serhiy Storchaka and Raymond Hettinger in bpo 37319 The various load_module methods of importlib have been documented as deprecated since Python 3 6 but will now also trigger a DeprecationWarning Use exec_module instead Contributed by Brett Cannon in bpo 26131 zipimport zipimporter load_module has been deprecated in preference for exec_module Contributed by Brett Cannon in bpo 26131 The use of load_module by the import system now triggers an ImportWarning as exec_module is preferred Contributed by Brett Cannon in bpo 26131 The use of importlib abc MetaPathFinder find_module and importlib abc PathEntryFinder find_module by the import system now trigger an ImportWarning as importlib abc MetaPathFinder find_spec and importlib abc PathEntryFinder find_spec are preferred respectively You can use importlib util spec_from_loader to help in porting Contributed by Brett Cannon in bpo 42134 The use of importlib abc PathEntryFinder find_loader by the import system now triggers an ImportWarning as importlib abc PathEntryFinder find_spec is preferred You can use importlib util spec_from_loader to help in porting Contributed by Brett Cannon in bpo 43672 The various
implementations of importlib abc MetaPathFinder find_module importlib machinery BuiltinImporter find_module importlib machinery FrozenImporter find_module importlib machinery WindowsRegistryFinder find_module importlib machinery PathFinder find_module importlib abc MetaPathFinder find_module importlib abc PathEntryFinder find_module importlib machinery FileFinder find_module and importlib abc PathEntryFinder find_loader importlib machinery FileFinder find_loader now raise DeprecationWarning and are slated for removal in Python 3 12 previously they were documented as deprecated in Python 3 4 Contributed by Brett Cannon in bpo 42135 importlib abc Finder is deprecated including its sole method find_module Both importlib abc MetaPathFinder and importlib abc PathEntryFinder no longer inherit from the class Users should inherit from one of these two classes as appropriate instead Contributed by Brett Cannon in bpo 42135 The deprecations of imp importlib find_loader importlib util set_package_wrapper importlib util set_loader_wrapper importlib util module_for_loader pkgutil ImpImporter and pkgutil ImpLoader have all been updated to list Python 3 12 as the slated version of removal they began raising DeprecationWarning in previous versions of Python Contributed by Brett Cannon in bpo 43720 The import system now uses the __spec__ attribute on modules before falling back on module_repr for a module s __repr__ method Removal of the use of module_repr is scheduled for Python 3 12 Contributed by Brett Cannon in bpo 42137 importlib abc Loader module_repr importlib machinery FrozenLoader module_repr and importlib machinery BuiltinLoader module_repr are deprecated and slated for removal in Python 3 12 Contributed by Brett Cannon in bpo 42136 sqlite3 OptimizedUnicode has been undocumented and obsolete since Python 3 3 when it was made an alias to str It is now deprecated scheduled for removal in Python 3 12 Contributed by Erlend E Aasland in bpo 42264 The undocumented built in 
function sqlite3 enable_shared_cache is now deprecated scheduled for removal in Python 3 12 Its use is strongly discouraged by the SQLite3 documentation See the SQLite3 docs for more details If a shared cache must be used open the database in URI mode using the cache shared query parameter Contributed by Erlend E Aasland in bpo 24464 The following threading methods are now deprecated threading currentThread threading current_thread threading activeCount threading active_count threading Condition notifyAll threading Condition notify_all threading Event isSet threading Event is_set threading Thread setName threading Thread name threading Thread getName threading Thread name threading Thread isDaemon threading Thread daemon threading Thread setDaemon threading Thread daemon Contributed by Jelle Zijlstra in gh 87889 pathlib Path link_to is deprecated and slated for removal in Python 3 12 Use pathlib Path hardlink_to instead Contributed by Barney Gale in bpo 39950 cgi log is deprecated and slated for removal in Python 3 12 Contributed by Inada Naoki in bpo 41139 The following ssl features have been deprecated since Python 3 6 Python 3 7 or OpenSSL 1 1 0 and will be removed in 3 11 OP_NO_SSLv2 OP_NO_SSLv3 OP_NO_TLSv1 OP_NO_TLSv1_1 OP_NO_TLSv1_2 and OP_NO_TLSv1_3 are replaced by ssl SSLContext minimum_version and ssl SSLContext maximum_version PROTOCOL_SSLv2 PROTOCOL_SSLv3 PROTOCOL_SSLv23 PROTOCOL_TLSv1 PROTOCOL_TLSv1_1 PROTOCOL_TLSv1_2 and PROTOCOL_TLS are deprecated in favor of PROTOCOL_TLS_CLIENT and PROTOCOL_TLS_SERVER wrap_socket is replaced by ssl SSLContext wrap_socket match_hostname RAND_pseudo_bytes RAND_egd NPN features like ssl SSLSocket selected_npn_protocol and ssl SSLContext set_npn_protocols are replaced by ALPN The threading debug PYTHONTHREADDEBUG environment variable is deprecated in Python 3 10 and will be removed in Python 3 12 This feature requires a debug build of Python Contributed by Victor Stinner in bpo 44584 Importing from the typing io and typing re submodules will now emit DeprecationWarning These submodules will be removed in a future version of Python
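The deprecated camelCase threading methods listed above map one-to-one onto snake_case functions and properties that have existed for many releases; a minimal migration sketch (the spellings used below are the supported ones):

```python
import threading

t = threading.Thread(target=lambda: None)
t.name = "worker"          # instead of t.setName("worker")
t.daemon = True            # instead of t.setDaemon(True); t.daemon also replaces t.isDaemon()

print(threading.current_thread().name)  # instead of threading.currentThread()
print(threading.active_count())         # instead of threading.activeCount()

event = threading.Event()
print(event.is_set())      # instead of event.isSet()
```

The snake_case names are not new in 3.10; only the warnings on the old aliases are, so the rewrite is safe on every supported Python version.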
Anything belonging to these submodules should be imported directly from typing instead Contributed by Sebastian Rittau in bpo 38291 Removed Removed special methods __int__ __float__ __floordiv__ __mod__ __divmod__ __rfloordiv__ __rmod__ and __rdivmod__ of the complex class They always raised a TypeError Contributed by Serhiy Storchaka in bpo 41974 The ParserBase error method from the private and undocumented _markupbase module has been removed html parser HTMLParser is the only subclass of ParserBase and its error implementation was already removed in Python 3 5 Contributed by Berker Peksag in bpo 31844 Removed the unicodedata ucnhash_CAPI attribute which was an internal PyCapsule object The related private _PyUnicode_Name_CAPI structure was moved to the internal C API Contributed by Victor Stinner in bpo 42157 Removed the parser module which was deprecated in 3 9 due to the switch to the new PEG parser as well as all the C source and header files that were only being used by the old parser including node h parser h graminit h and grammar h Removed the Public C API functions PyParser_SimpleParseStringFlags PyParser_SimpleParseStringFlagsFilename PyParser_SimpleParseFileFlags and PyNode_Compile that were deprecated in 3 9 due to the switch to the new PEG parser Removed the formatter module which was deprecated in Python 3 4 It is somewhat obsolete little used and not tested It was originally scheduled to be removed in Python 3 6 but such removals were delayed until after Python 2 7 EOL Existing users should copy whatever classes they use into their code Contributed by Donghee Na and Terry J Reedy in bpo 42299 Removed the PyModule_GetWarningsModule function that was useless now due to the _warnings module was converted to a builtin module in 2 6 Contributed by Hai Shi in bpo 42599 Remove deprecated aliases to Collections Abstract Base Classes from the collections module Contributed by Victor Stinner in bpo 37324 The loop parameter has been removed from most of 
asyncio s high level API following deprecation in Python 3 8 The motivation behind this change is multifold
1 This simplifies the high level API 2 The functions in the high level API have been implicitly getting the current thread s running event loop since Python 3 7 There isn t a need to pass the event loop to the API in most normal use cases 3 Event loop passing is error prone especially when dealing with loops running in different threads Note that the low level API will still accept loop See Changes in the Python API for examples of how to replace existing code Contributed by Yurii Karabas Andrew Svetlov Yury Selivanov and Kyle Stanley in bpo 42392 Porting to Python 3 10 This section lists previously described changes and other bugfixes that may require changes to your code Changes in the Python syntax Deprecation warning is now emitted when compiling previously valid syntax if the numeric literal is immediately followed by a keyword like in 0in x In future releases it will be changed to syntax warning and finally to a syntax error To get rid of the warning and make the code compatible with future releases just add a space between the numeric literal and the following keyword Contributed by Serhiy Storchaka in bpo 43833 Changes in the Python API The etype parameters of the format_exception format_exception_only and print_exception functions in the traceback module have been renamed to exc Contributed by Zackery Spytz and Matthias Bussonnier in bpo 26389 atexit At Python exit if a callback registered with atexit register fails its exception is now logged Previously only some exceptions were logged and the last exception was always silently ignored Contributed by Victor Stinner in bpo 42639 collections abc Callable generic now flattens type parameters similar to what typing Callable currently does This means that collections abc Callable int str str will have __args__ of int str str previously this was int str str Code which accesses the arguments via typing get_args or __args__ need to account for this change Furthermore TypeError may be raised for invalid forms of 
parameterizing collections abc Callable which may have passed silently in Python 3 9 Contributed by Ken Jin in bpo 42195 socket htons and socket ntohs now raise OverflowError instead of DeprecationWarning if the given parameter will not fit in a 16 bit unsigned integer Contributed by Erlend E Aasland in bpo 42393 The loop parameter has been removed from most of asyncio s high level API following deprecation in Python 3 8 A coroutine that currently looks like this

async def foo(loop):
    await asyncio.sleep(1, loop=loop)

should be replaced with this

async def foo():
    await asyncio.sleep(1)

If foo was specifically designed not to run in the current thread s running event loop e g running in another thread s event loop consider using asyncio run_coroutine_threadsafe instead Contributed by Yurii Karabas Andrew Svetlov Yury Selivanov and Kyle Stanley in bpo 42392 The types FunctionType constructor now inherits the current builtins if the globals dictionary has no __builtins__ key rather than using {"None": None} as builtins same behavior as the eval and exec functions Defining a function with def function in Python is not affected globals cannot be overridden with this syntax it also inherits the current builtins Contributed by Victor Stinner in bpo 42990 Changes in the C API The C API functions PyParser_SimpleParseStringFlags PyParser_SimpleParseStringFlagsFilename PyParser_SimpleParseFileFlags PyNode_Compile and the type used by these functions struct _node were removed due to the switch to the new PEG parser Source should now be compiled directly to a code object using for example Py_CompileString The resulting code object can then be evaluated using for example PyEval_EvalCode Specifically A call to PyParser_SimpleParseStringFlags followed by PyNode_Compile can be replaced by calling Py_CompileString There is no direct replacement for PyParser_SimpleParseFileFlags To compile code from a FILE argument you will need to read the file in C and pass the resulting buffer to
Py_CompileString To compile a file given a char filename explicitly open the file read it and compile the result One way to do this is using the io module with PyImport_ImportModule PyObject_CallMethod PyBytes_AsString and Py_CompileString as sketched below Declarations and error handling are omitted

io_module = PyImport_ImportModule("io");
fileobject = PyObject_CallMethod(io_module, "open", "ss", filename, "rb");
source_bytes_object = PyObject_CallMethod(fileobject, "read", "");
result = PyObject_CallMethod(fileobject, "close", "");
source_buf = PyBytes_AsString(source_bytes_object);
code = Py_CompileString(source_buf, filename, Py_file_input);

For FrameObject objects the f_lasti member now represents a wordcode offset instead of a simple offset into the bytecode string This means that this number needs to be multiplied by 2 to be used with APIs that expect a byte offset instead like PyCode_Addr2Line for example Notice as well that the f_lasti member of FrameObject objects is not considered stable please use PyFrame_GetLineNumber instead CPython bytecode changes The MAKE_FUNCTION instruction now accepts either a dict or a tuple of strings as the function s annotations Contributed by Yurii Karabas and Inada Naoki in bpo 42202 Build Changes PEP 644 Python now requires OpenSSL 1 1 1 or newer OpenSSL 1 0 2 is no longer supported Contributed by Christian Heimes in bpo 43669 The C99 functions snprintf and vsnprintf are now required to build Python Contributed by Victor Stinner in bpo 36020 sqlite3 requires SQLite 3 7 15 or higher Contributed by Sergey Fedoseev and Erlend E Aasland in bpo 40744 and bpo 40810 The atexit module must now always be built as a built in module Contributed by Victor Stinner in bpo 42639 Add disable test modules option to the configure script don t build nor install test modules Contributed by Xavier de Gaye Thomas Petazzoni and Peixing Xin in bpo 27640 Add with wheel pkg dir PATH option to the configure script If specified the ensurepip module looks for setuptools and pip wheel packages in this directory if both are present these wheel packages are used instead of ensurepip bundled wheel packages Some Linux
distribution packaging policies recommend against bundling dependencies For example Fedora installs wheel packages in the usr share python wheels directory and don t install the ensurepip _bundled package Contributed by Victor Stinner in bpo 42856 Add a new configure without static libpython option to not build the libpythonMAJOR MINOR a static library and not install the python o object file Contributed by Victor Stinner in bpo 43103 The configure script now uses the pkg config utility if available to detect the location of Tcl Tk headers and libraries As before those locations can be explicitly specified with the with tcltk includes and with tcltk libs configuration options Contributed by Manolis Stamatogiannakis in bpo 42603 Add with openssl rpath option to configure script The option simplifies building Python with a custom OpenSSL installation e g configure with openssl path to openssl with openssl rpath auto Contributed by Christian Heimes in bpo 43466 C API Changes PEP 652 Maintaining the Stable ABI The Stable ABI Application Binary Interface for extension modules or embedding Python is now explicitly defined C API Stability describes C API and ABI stability guarantees along with best practices for using the Stable ABI Contributed by Petr Viktorin in PEP 652 and bpo 43795 New Features The result of PyNumber_Index now always has exact type int Previously the result could have been an instance of a subclass of int Contributed by Serhiy Storchaka in bpo 40792 Add a new orig_argv member to the PyConfig structure the list of the original command line arguments passed to the Python executable Contributed by Victor Stinner in bpo 23427 The PyDateTime_DATE_GET_TZINFO and PyDateTime_TIME_GET_TZINFO macros have been added for accessing the tzinfo attributes of datetime datetime and datetime time objects Contributed by Zackery Spytz in bpo 30155 Add a PyCodec_Unregister function to unregister a codec search function Contributed by Hai Shi in bpo 41842 The PyIter_Send 
function was added to allow sending a value into an iterator without raising the StopIteration exception Contributed by Vladimir Matveev in bpo 41756 Add PyUnicode_AsUTF8AndSize to the limited C API Contributed by Alex Gaynor in bpo 41784 Add PyModule_AddObjectRef function similar to PyModule_AddObject but don t steal a reference to the value on success Contributed by Victor Stinner in bpo 1635741 Add Py_NewRef and Py_XNewRef functions to increment the reference count of an object and return the object Contributed by Victor Stinner in bpo 42262 The PyType_FromSpecWithBases and PyType_FromModuleAndSpec functions now accept a single class as the bases argument Contributed by Serhiy Storchaka in bpo 42423 The PyType_FromModuleAndSpec function now accepts NULL tp_doc slot Contributed by Hai Shi in bpo 41832 The PyType_GetSlot function can accept static types Contributed by Hai Shi and Petr Viktorin in bpo 41073 Add a new PySet_CheckExact function to the C API to check if an object is an instance of set but not an instance of a subtype Contributed by Pablo Galindo in bpo 43277 Add PyErr_SetInterruptEx which allows passing a signal number to simulate Contributed by Antoine Pitrou in bpo 43356 The limited C API is now supported if Python is built in debug mode if the Py_DEBUG macro is defined In the limited C API the Py_INCREF and Py_DECREF functions are now implemented as opaque function calls rather than accessing directly the PyObject ob_refcnt member if Python is built in debug mode and the Py_LIMITED_API macro targets Python 3 10 or newer It became possible to support the limited C API in debug mode because the PyObject structure is the same in release and debug mode since Python 3 8 see bpo 36465 The limited C API is still not supported in the with trace refs special build Py_TRACE_REFS macro Contributed by Victor Stinner in bpo 43688 Add the Py_Is(x, y) function to test if the x object is the y object the same as x is y in Python Add also the Py_IsNone Py_IsTrue Py_IsFalse functions to test if an object is respectively the None singleton the True singleton or the False
singleton Contributed by Victor Stinner in bpo 43753 Add new functions to control the garbage collector from C code PyGC_Enable PyGC_Disable PyGC_IsEnabled These functions allow activating deactivating and querying the state of the garbage collector from C code without having to import the gc module Add a new Py_TPFLAGS_DISALLOW_INSTANTIATION type flag to disallow creating type instances Contributed by Victor Stinner in bpo 43916 Add a new Py_TPFLAGS_IMMUTABLETYPE type flag for creating immutable type objects type attributes cannot be set nor deleted Contributed by Victor Stinner and Erlend E Aasland in bpo 43908 Porting to Python 3 10 The PY_SSIZE_T_CLEAN macro must now be defined to use PyArg_ParseTuple and Py_BuildValue formats which use es et s u y z U and Z See Parsing arguments and building values and PEP 353 Contributed by Victor Stinner in bpo 40943 Since Py_REFCNT is changed to the inline static function Py_REFCNT(obj) = new_refcnt must be replaced with Py_SET_REFCNT(obj, new_refcnt) see Py_SET_REFCNT available since Python 3 9 For backward compatibility this macro can be used

#if PY_VERSION_HEX < 0x030900A4
#  define Py_SET_REFCNT(obj, refcnt) ((Py_REFCNT(obj) = (refcnt)), (void)0)
#endif

Contributed by Victor Stinner in bpo 39573 Calling PyDict_GetItem without GIL held had been allowed for historical reason It is no longer allowed Contributed by Victor Stinner in bpo 40839 PyUnicode_FromUnicode NULL size and PyUnicode_FromStringAndSize NULL size raise DeprecationWarning now Use PyUnicode_New to allocate Unicode object without initial data Contributed by Inada Naoki in bpo 36346 The private _PyUnicode_Name_CAPI structure of the PyCapsule API unicodedata ucnhash_CAPI has been moved to the internal C API Contributed by Victor Stinner in bpo 42157 Py_GetPath Py_GetPrefix Py_GetExecPrefix Py_GetProgramFullPath Py_GetPythonHome and Py_GetProgramName functions now return NULL if called before Py_Initialize before Python is initialized Use the new Python Initialization Configuration API to
get the Python Path Configuration Contributed by Victor Stinner in bpo 42260 PyList_SET_ITEM PyTuple_SET_ITEM and PyCell_SET macros can no longer be used as l value or r value For example x = PyList_SET_ITEM(a, b, c) and PyList_SET_ITEM(a, b, c) = x now fail with a compiler error It prevents bugs like if (PyList_SET_ITEM(a, b, c) < 0) ... test ... Contributed by Zackery Spytz and Victor Stinner in bpo 30459 The non limited API files odictobject h parser_interface h picklebufobject h pyarena h pyctype h pydebug h pyfpe h and pytime h have been moved to the Include cpython directory These files must not be included directly as they are already included in Python h see Include Files If they have been included directly consider including Python h instead Contributed by Nicholas Sim in bpo 35134 Use the Py_TPFLAGS_IMMUTABLETYPE type flag to create immutable type objects Do not rely on Py_TPFLAGS_HEAPTYPE to decide if a type object is mutable or not check if Py_TPFLAGS_IMMUTABLETYPE is set instead Contributed by Victor Stinner and Erlend E Aasland in bpo 43908 The undocumented function Py_FrozenMain has been removed from the limited API The function is mainly useful for custom builds of Python Contributed by Petr Viktorin in bpo 26241 Deprecated The PyUnicode_InternImmortal function is now deprecated and will be removed in Python 3 12 use PyUnicode_InternInPlace instead Contributed by Victor Stinner in bpo 41692 Removed Removed Py_UNICODE_str* functions manipulating Py_UNICODE strings Contributed by Inada Naoki in bpo 41123 Py_UNICODE_strlen use PyUnicode_GetLength or PyUnicode_GET_LENGTH Py_UNICODE_strcat use PyUnicode_CopyCharacters or PyUnicode_FromFormat Py_UNICODE_strcpy Py_UNICODE_strncpy use PyUnicode_CopyCharacters or PyUnicode_Substring Py_UNICODE_strcmp use PyUnicode_Compare Py_UNICODE_strncmp use PyUnicode_Tailmatch Py_UNICODE_strchr Py_UNICODE_strrchr use PyUnicode_FindChar Removed PyUnicode_GetMax Please migrate to new PEP 393 APIs Contributed by Inada Naoki in bpo 41103 Removed PyLong_FromUnicode Please migrate to PyLong_FromUnicodeObject Contributed by Inada Naoki in bpo 41103
Removed PyUnicode_AsUnicodeCopy Please use PyUnicode_AsUCS4Copy or PyUnicode_AsWideCharString Contributed by Inada Naoki in bpo 41103 Removed _Py_CheckRecursionLimit variable it has been replaced by ceval recursion_limit of the PyInterpreterState structure Contributed by Victor Stinner in bpo 41834 Removed undocumented macros Py_ALLOW_RECURSION and Py_END_ALLOW_RECURSION and the recursion_critical field of the PyInterpreterState structure Contributed by Serhiy Storchaka in bpo 41936 Removed the undocumented PyOS_InitInterrupts function Initializing Python already implicitly installs signal handlers see PyConfig install_signal_handlers Contributed by Victor Stinner in bpo 41713 Remove the PyAST_Validate function It is no longer possible to build a AST object mod_ty type with the public C API The function was already excluded from the limited C API PEP 384 Contributed by Victor Stinner in bpo 43244 Remove the symtable h header file and the undocumented functions PyST_GetScope PySymtable_Build PySymtable_BuildObject PySymtable_Free Py_SymtableString Py_SymtableStringObject The Py_SymtableString function was part the stable ABI by mistake but it could not be used because the symtable h header file was excluded from the limited C API Use Python symtable module instead Contributed by Victor Stinner in bpo 43244 Remove PyOS_ReadlineFunctionPointer from the limited C API headers and from python3 dll the library that provides the stable ABI on Windows Since the function takes a FILE argument its ABI stability cannot be guaranteed Contributed by Petr Viktorin in bpo 43868 Remove ast h asdl h and Python ast h header files These functions were undocumented and excluded from the limited C API Most names defined by these header files were not prefixed by Py and so could create names conflicts For example Python ast h defined a Yield macro which was conflict with the Yield name used by the Windows winbase h header Use the Python ast module instead Contributed by Victor Stinner in 
bpo 43244 Remove the compiler and parser functions using struct _mod type because the public AST C API was removed PyAST_Compile PyAST_CompileEx PyAST_CompileObject PyFuture_FromAST PyFuture_FromASTObject PyParser_ASTFromFile PyParser_ASTFromFileObject PyParser_ASTFromFilename PyParser_ASTFromString PyParser_ASTFromStringObject These functions were undocumented and excluded from the limited C API Contributed by Victor Stinner in bpo 43244 Remove the pyarena h header file with functions PyArena_New PyArena_Free PyArena_Malloc PyArena_AddPyObject These functions were undocumented excluded from the limited C API and were only used internally by the compiler Contributed by Victor Stinner in bpo 43244 The PyThreadState use_tracing member has been removed to optimize Python Contributed by Mark Shannon in bpo 43760 Notable security feature in 3 10 7 Converting between int and str in bases other than 2 binary 4 8 octal 16 hexadecimal or 32 such as base 10 decimal now raises a ValueError if the number of digits in string form is above a limit to avoid potential denial of service attacks due to the algorithmic complexity This is a mitigation for CVE 2020 10735 This limit can be configured or disabled by environment variable command line flag or sys APIs See the integer string conversion length limitation documentation The default limit is 4300 digits in string form Notable security feature in 3 10 8 The deprecated mailcap module now refuses to inject unsafe text filenames MIME types parameters into shell commands Instead of using such text it will warn and act as if a match was not found or for test commands as if the test failed Contributed by Petr Viktorin in gh 98966 Notable changes in 3 10 12 tarfile The extraction methods in tarfile and shutil unpack_archive have a new filter argument that allows limiting tar features that may be surprising or dangerous such as creating files outside the destination directory See Extraction filters for details In Python 3 12 use without the filter argument will show a DeprecationWarning In Python 3 14 the default will switch to data
Contributed by Petr Viktorin in PEP 706
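The integer/string conversion limit described above can be inspected and adjusted at runtime through the sys APIs. A minimal sketch, guarded because the get/set functions exist only on interpreters that include the CVE-2020-10735 mitigation (3.10.7+, 3.11+); 700 is an arbitrary demo value (the accepted minimum is 640):

```python
import sys

# Sketch of the CVE-2020-10735 mitigation: int <-> str conversion in
# non-binary bases refuses operands past a digit limit.
blocked = None
if hasattr(sys, "set_int_max_str_digits"):
    previous = sys.get_int_max_str_digits()   # 4300 unless reconfigured
    sys.set_int_max_str_digits(700)           # lower the limit for this demo
    try:
        str(10 ** 1000)                       # 1001 decimal digits: over limit
        blocked = False
    except ValueError:
        blocked = True                        # conversion refused
    finally:
        sys.set_int_max_str_digits(previous)  # restore the prior limit
print("conversion blocked:", blocked)
```

The limit can equally be set via the PYTHONINTMAXSTRDIGITS environment variable or the -X int_max_str_digits command line option.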
errno Standard errno system symbols This module makes available standard errno system symbols The value of each symbol is the corresponding integer value The names and descriptions are borrowed from linux include errno h which should be all inclusive errno errorcode Dictionary providing a mapping from the errno value to the string name in the underlying system For instance errno errorcode errno EPERM maps to EPERM To translate a numeric error code to an error message use os strerror Of the following list symbols that are not used on the current platform are not defined by the module The specific list of defined symbols is available as errno errorcode keys Symbols available can include errno EPERM Operation not permitted This error is mapped to the exception PermissionError errno ENOENT No such file or directory This error is mapped to the exception FileNotFoundError errno ESRCH No such process This error is mapped to the exception ProcessLookupError errno EINTR Interrupted system call This error is mapped to the exception InterruptedError errno EIO I O error errno ENXIO No such device or address errno E2BIG Arg list too long errno ENOEXEC Exec format error errno EBADF Bad file number errno ECHILD No child processes This error is mapped to the exception ChildProcessError errno EAGAIN Try again This error is mapped to the exception BlockingIOError errno ENOMEM Out of memory errno EACCES Permission denied This error is mapped to the exception PermissionError errno EFAULT Bad address errno ENOTBLK Block device required errno EBUSY Device or resource busy errno EEXIST File exists This error is mapped to the exception FileExistsError errno EXDEV Cross device link errno ENODEV No such device errno ENOTDIR Not a directory This error is mapped to the exception NotADirectoryError errno EISDIR Is a directory This error is mapped to the exception IsADirectoryError errno EINVAL Invalid argument errno ENFILE File table overflow errno EMFILE Too many open files errno ENOTTY Not a 
typewriter errno ETXTBSY Text file busy errno EFBIG File too large errno ENOSPC No space left on device errno ESPIPE Illegal seek errno EROFS Read only file system errno EMLINK Too many links errno EPIPE Broken pipe This error is mapped to the exception BrokenPipeError errno EDOM Math argument out of domain of func errno ERANGE Math result not representable errno EDEADLK Resource deadlock would occur errno ENAMETOOLONG File name too long errno ENOLCK No record locks available errno ENOSYS Function not implemented errno ENOTEMPTY Directory not empty errno ELOOP Too many symbolic links encountered errno EWOULDBLOCK Operation would block This error is mapped to the exception BlockingIOError errno ENOMSG No message of desired type errno EIDRM Identifier removed errno ECHRNG Channel number out of range errno EL2NSYNC Level 2 not synchronized errno EL3HLT Level 3 halted errno EL3RST Level 3 reset errno ELNRNG Link number out of range errno EUNATCH Protocol driver not attached errno ENOCSI No CSI structure available errno EL2HLT Level 2 halted errno EBADE Invalid exchange errno EBADR Invalid request descriptor errno EXFULL Exchange full errno ENOANO No anode errno EBADRQC Invalid request code errno EBADSLT Invalid slot errno EDEADLOCK File locking deadlock error errno EBFONT Bad font file format errno ENOSTR Device not a stream errno ENODATA No data available errno ETIME Timer expired errno ENOSR Out of streams resources errno ENONET Machine is not on the network errno ENOPKG Package not installed errno EREMOTE Object is remote errno ENOLINK Link has been severed errno EADV Advertise error errno ESRMNT Srmount error errno ECOMM Communication error on send errno EPROTO Protocol error errno EMULTIHOP Multihop attempted errno EDOTDOT RFS specific error errno EBADMSG Not a data message errno EOVERFLOW Value too large for defined data type errno ENOTUNIQ Name not unique on network errno EBADFD File descriptor in bad state errno EREMCHG Remote address changed errno ELIBACC Can 
not access a needed shared library errno ELIBBAD Accessing a corrupted shared library errno ELIBSCN Lib section in a.out corrupted errno ELIBMAX Attempting to link in too many shared libraries errno ELIBEXEC Cannot exec a shared library directly errno EILSEQ Illegal byte sequence errno ERESTART Interrupted system call should be restarted errno ESTRPIPE Streams pipe error errno EUSERS Too many users errno ENOTSOCK Socket operation on non socket errno EDESTADDRREQ Destination address required errno EMSGSIZE Message too long errno EPROTOTYPE Protocol wrong type for socket errno ENOPROTOOPT Protocol not available errno EPROTONOSUPPORT Protocol not supported errno ESOCKTNOSUPPORT Socket type not supported errno EOPNOTSUPP Operation not supported on transport endpoint errno ENOTSUP Operation not supported New in version 3 2 errno EPFNOSUPPORT Protocol family not supported errno EAFNOSUPPORT Address family not supported by protocol errno EADDRINUSE Address already in use errno EADDRNOTAVAIL Cannot assign requested address errno ENETDOWN Network is down errno ENETUNREACH Network is unreachable errno ENETRESET Network dropped connection because of reset errno ECONNABORTED Software caused connection abort This error is mapped to the exception ConnectionAbortedError errno ECONNRESET Connection reset by peer This error is mapped to the exception ConnectionResetError errno ENOBUFS No buffer space available errno EISCONN Transport endpoint is already connected errno ENOTCONN Transport endpoint is not connected errno ESHUTDOWN Cannot send after transport endpoint shutdown This error is mapped to the exception BrokenPipeError errno ETOOMANYREFS Too many references cannot splice errno ETIMEDOUT Connection timed out This error is mapped to the exception TimeoutError errno ECONNREFUSED Connection refused This error is mapped to the exception ConnectionRefusedError errno EHOSTDOWN Host is down errno EHOSTUNREACH No route to host errno EALREADY Operation already in progress This error is mapped to the exception BlockingIOError errno EINPROGRESS Operation now in progress This error
is mapped to the exception BlockingIOError errno ESTALE Stale NFS file handle errno EUCLEAN Structure needs cleaning errno ENOTNAM Not a XENIX named type file errno ENAVAIL No XENIX semaphores available errno EISNAM Is a named type file errno EREMOTEIO Remote I O error errno EDQUOT Quota exceeded errno EQFULL Interface output queue is full New in version 3 11 errno ENOTCAPABLE Capabilities insufficient This error is mapped to the exception PermissionError Availability WASI FreeBSD New in version 3 11 1 errno ECANCELED Operation canceled New in version 3 2 errno EOWNERDEAD Owner died New in version 3 2 errno ENOTRECOVERABLE State not recoverable New in version 3 2
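As a runnable illustration of the mapping described above: errno errorcode translates numbers to names, os strerror supplies the message text, and the listed exception mappings mean an OSError can be caught either by errno value or by its dedicated subclass (the path below is merely assumed not to exist):

```python
import errno
import os

# Numeric code -> symbolic name, and code -> human-readable message.
name = errno.errorcode[errno.ENOENT]   # "ENOENT"
message = os.strerror(errno.ENOENT)    # e.g. "No such file or directory"

# errno values surface on OSError instances; ENOENT arrives as the
# mapped FileNotFoundError subclass.
caught = None
try:
    open("/this/path/is/assumed/missing")
except OSError as exc:
    caught = exc

print(name, "->", message)
```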
pickle Python object serialization Source code Lib pickle py The pickle module implements binary protocols for serializing and de serializing a Python object structure Pickling is the process whereby a Python object hierarchy is converted into a byte stream and unpickling is the inverse operation whereby a byte stream from a binary file or bytes like object is converted back into an object hierarchy Pickling and unpickling is alternatively known as serialization marshalling 1 or flattening however to avoid confusion the terms used here are pickling and unpickling Warning The pickle module is not secure Only unpickle data you trust It is possible to construct malicious pickle data which will execute arbitrary code during unpickling Never unpickle data that could have come from an untrusted source or that could have been tampered with Consider signing data with hmac if you need to ensure that it has not been tampered with Safer serialization formats such as json may be more appropriate if you are processing untrusted data See Comparison with json Relationship to other Python modules Comparison with marshal Python has a more primitive serialization module called marshal but in general pickle should always be the preferred way to serialize Python objects marshal exists primarily to support Python s pyc files The pickle module differs from marshal in several significant ways The pickle module keeps track of the objects it has already serialized so that later references to the same object won t be serialized again marshal doesn t do this This has implications both for recursive objects and object sharing Recursive objects are objects that contain references to themselves These are not handled by marshal and in fact attempting to marshal recursive objects will crash your Python interpreter Object sharing happens when there are multiple references to the same object in different places in the object hierarchy being serialized pickle stores such objects only once and 
ensures that all other references point to the master copy Shared objects remain shared which can be very important for mutable objects marshal cannot be used to serialize user defined classes and their instances pickle can save and restore class instances transparently however the class definition must be importable and live in the same module as when the object was stored The marshal serialization format is not guaranteed to be portable across Python versions Because its primary job in life is to support pyc files the Python implementers reserve the right to change the serialization format in non backwards compatible ways should the need arise The pickle serialization format is guaranteed to be backwards compatible across Python releases provided a compatible pickle protocol is chosen and pickling and unpickling code deals with Python 2 to Python 3 type differences if your data is crossing that unique breaking change language boundary Comparison with json There are fundamental differences between the pickle protocols and JSON JavaScript Object Notation JSON is a text serialization format it outputs unicode text although most of the time it is then encoded to utf 8 while pickle is a binary serialization format JSON is human readable while pickle is not JSON is interoperable and widely used outside of the Python ecosystem while pickle is Python specific JSON by default can only represent a subset of the Python built in types and no custom classes pickle can represent an extremely large number of Python types many of them automatically by clever usage of Python s introspection facilities complex cases can be tackled by implementing specific object APIs Unlike pickle deserializing untrusted JSON does not in itself create an arbitrary code execution vulnerability See also The json module a standard library module allowing JSON serialization and deserialization Data stream format The data format used by pickle is Python specific This has the advantage that there are no 
restrictions imposed by external standards such as JSON or XDR which can t represent pointer sharing however it means that non Python programs may not be able to reconstruct pickled Python objects By default the pickle data format uses a relatively compact binary representation If you need optimal size characteristics you can efficiently compress pickled data The module pickletools contains tools for analyzing data streams generated by pickle pickletools source code has extensive comments about opcodes used by pickle protocols There are currently 6 different protocols which can be used for pickling The higher the protocol used the more recent the version of Python needed to read the pickle produced Protocol version 0 is the original human readable protocol and is backwards compatible with earlier versions of Python Protocol version 1 is an old binary format which is also compatible with earlier versions of Python Protocol version 2 was introduced in Python 2 3 It provides much more efficient pickling of new style classes Refer to PEP 307 for information about improvements brought by protocol 2 Protocol version 3 was added in Python 3 0 It has explicit support for bytes objects and cannot be unpickled by Python 2 x This was the default protocol in Python 3 0 3 7 Protocol version 4 was added in Python 3 4 It adds support for very large objects pickling more kinds of objects and some data format optimizations It is the default protocol starting with Python 3 8 Refer to PEP 3154 for information about improvements brought by protocol 4 Protocol version 5 was added in Python 3 8 It adds support for out of band data and speedup for in band data Refer to PEP 574 for information about improvements brought by protocol 5 Note Serialization is a more primitive notion than persistence although pickle reads and writes file objects it does not handle the issue of naming persistent objects nor the even more complicated issue of concurrent access to persistent objects The pickle module can transform a complex object into a byte stream and it can transform the byte stream into an
object with the same internal structure Perhaps the most obvious thing to do with these byte streams is to write them onto a file but it is also conceivable to send them across a network or store them in a database The shelve module provides a simple interface to pickle and unpickle objects on DBM style database files Module Interface To serialize an object hierarchy you simply call the dumps function Similarly to de serialize a data stream you call the loads function However if you want more control over serialization and de serialization you can create a Pickler or an Unpickler object respectively The pickle module provides the following constants pickle HIGHEST_PROTOCOL An integer the highest protocol version available This value can be passed as a protocol value to functions dump and dumps as well as the Pickler constructor pickle DEFAULT_PROTOCOL An integer the default protocol version used for pickling May be less than HIGHEST_PROTOCOL Currently the default protocol is 4 first introduced in Python 3 4 and incompatible with previous versions Changed in version 3 0 The default protocol is 3 Changed in version 3 8 The default protocol is 4 The pickle module provides the following functions to make the pickling process more convenient pickle dump obj file protocol None fix_imports True buffer_callback None Write the pickled representation of the object obj to the open file object file This is equivalent to Pickler file protocol dump obj Arguments file protocol fix_imports and buffer_callback have the same meaning as in the Pickler constructor Changed in version 3 8 The buffer_callback argument was added pickle dumps obj protocol None fix_imports True buffer_callback None Return the pickled representation of the object obj as a bytes object instead of writing it to a file Arguments protocol fix_imports and buffer_callback have the same meaning as in the Pickler constructor Changed in version 3 8 The buffer_callback argument was added pickle load file fix_imports 
True encoding ASCII errors strict buffers None Read the pickled representation of an object from the open file object file and return the reconstituted object hierarchy specified therein This is equivalent to Unpickler file load The protocol version of the pickle is detected automatically so no protocol argument is needed Bytes past the pickled representation of the object are ignored Arguments file fix_imports encoding errors strict and buffers have the same meaning as in the Unpickler constructor Changed in version 3 8 The buffers argument was added pickle loads data fix_imports True encoding ASCII errors strict buffers None Return the reconstituted object hierarchy of the pickled representation data of an object data must be a bytes like object The protocol version of the pickle is detected automatically so no protocol argument is needed Bytes past the pickled representation of the object are ignored Arguments fix_imports encoding errors strict and buffers have the same meaning as in the Unpickler constructor Changed in version 3 8 The buffers argument was added The pickle module defines three exceptions exception pickle PickleError Common base class for the other pickling exceptions It inherits from Exception exception pickle PicklingError Error raised when an unpicklable object is encountered by Pickler It inherits from PickleError Refer to What can be pickled and unpickled to learn what kinds of objects can be pickled exception pickle UnpicklingError Error raised when there is a problem unpickling an object such as data corruption or a security violation It inherits from PickleError Note that other exceptions may also be raised during unpickling including but not necessarily limited to AttributeError EOFError ImportError and IndexError The pickle module exports three classes Pickler Unpickler and PickleBuffer class pickle Pickler file protocol None fix_imports True buffer_callback None This takes a binary file for writing a pickle data stream The optional protocol argument an integer tells the pickler to use the given protocol supported protocols are 0 to
HIGHEST_PROTOCOL If not specified the default is DEFAULT_PROTOCOL If a negative number is specified HIGHEST_PROTOCOL is selected The file argument must have a write method that accepts a single bytes argument It can thus be an on disk file opened for binary writing an io BytesIO instance or any other custom object that meets this interface If fix_imports is true and protocol is less than 3 pickle will try to map the new Python 3 names to the old module names used in Python 2 so that the pickle data stream is readable with Python 2 If buffer_callback is None the default buffer views are serialized into file as part of the pickle stream If buffer_callback is not None then it can be called any number of times with a buffer view If the callback returns a false value such as None the given buffer is out of band otherwise the buffer is serialized in band i e inside the pickle stream It is an error if buffer_callback is not None and protocol is None or smaller than 5 Changed in version 3 8 The buffer_callback argument was added dump obj Write the pickled representation of obj to the open file object given in the constructor persistent_id obj Do nothing by default This exists so a subclass can override it If persistent_id returns None obj is pickled as usual Any other value causes Pickler to emit the returned value as a persistent ID for obj The meaning of this persistent ID should be defined by Unpickler persistent_load Note that the value returned by persistent_id cannot itself have a persistent ID See Persistence of External Objects for details and examples of uses dispatch_table A pickler object s dispatch table is a registry of reduction functions of the kind which can be declared using copyreg pickle It is a mapping whose keys are classes and whose values are reduction functions A reduction function takes a single argument of the associated class and should conform to the same interface as a __reduce__ method By default a pickler object will not have a dispatch_table 
attribute and it will instead use the global dispatch table managed by the copyreg module However
to customize the pickling for a specific pickler object one can set the dispatch_table attribute to a dict like object Alternatively if a subclass of Pickler has a dispatch_table attribute then this will be used as the default dispatch table for instances of that class See Dispatch Tables for usage examples New in version 3 3 reducer_override obj Special reducer that can be defined in Pickler subclasses This method has priority over any reducer in the dispatch_table It should conform to the same interface as a __reduce__ method and can optionally return NotImplemented to fallback on dispatch_table registered reducers to pickle obj For a detailed example see Custom Reduction for Types Functions and Other Objects New in version 3 8 fast Deprecated Enable fast mode if set to a true value The fast mode disables the usage of memo therefore speeding the pickling process by not generating superfluous PUT opcodes It should not be used with self referential objects doing otherwise will cause Pickler to recurse infinitely Use pickletools optimize if you need more compact pickles class pickle Unpickler file fix_imports True encoding ASCII errors strict buffers None This takes a binary file for reading a pickle data stream The protocol version of the pickle is detected automatically so no protocol argument is needed The argument file must have three methods a read method that takes an integer argument a readinto method that takes a buffer argument and a readline method that requires no arguments as in the io BufferedIOBase interface Thus file can be an on disk file opened for binary reading an io BytesIO object or any other custom object that meets this interface The optional arguments fix_imports encoding and errors are used to control compatibility support for pickle stream generated by Python 2 If fix_imports is true pickle will try to map the old Python 2 names to the new names used in Python 3 The encoding and errors tell pickle how to decode 8 bit string instances 
pickled by Python 2 these default to ASCII and strict respectively The encoding can be bytes to read these 8 bit string instances as bytes objects Using encoding latin1 is required for unpickling NumPy arrays and instances of datetime date and time pickled by Python 2 If buffers is None the default then all data necessary for deserialization must be contained in the pickle stream This means that the buffer_callback argument was None when a Pickler was instantiated or when dump or dumps was called If buffers is not None it should be an iterable of buffer enabled objects that is consumed each time the pickle stream references an out of band buffer view Such buffers have been given in order to the buffer_callback of a Pickler object Changed in version 3 8 The buffers argument was added load Read the pickled representation of an object from the open file object given in the constructor and return the reconstituted object hierarchy specified therein Bytes past the pickled representation of the object are ignored persistent_load pid Raise an UnpicklingError by default If defined persistent_load should return the object specified by the persistent ID pid If an invalid persistent ID is encountered an UnpicklingError should be raised See Persistence of External Objects for details and examples of uses find_class module name Import module if necessary and return the object called name from it where the module and name arguments are str objects Note unlike its name suggests find_class is also used for finding functions Subclasses may override this to gain control over what type of objects and how they can be loaded potentially reducing security risks Refer to Restricting Globals for details Raises an auditing event pickle find_class with arguments module name class pickle PickleBuffer buffer A wrapper for a buffer representing picklable data buffer must be a buffer providing object such as a bytes like object or a N dimensional array PickleBuffer is itself a buffer provider 
therefore it is possible to pass it to other APIs expecting a buffer providing object such as memoryview
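For example, protocol 5 can move the PickleBuffer contents out of the pickle stream entirely, the out-of-band pattern from PEP 574 (the payload used here is purely illustrative):

```python
import pickle

# Collect out-of-band buffer views during pickling: appending to a list
# returns None (a false value), marking each buffer as out-of-band.
data = bytearray(b"large binary payload")
views = []
stream = pickle.dumps(pickle.PickleBuffer(data), protocol=5,
                      buffer_callback=views.append)

# Hand the same buffers back, in order, when unpickling.
restored = pickle.loads(stream, buffers=views)
assert bytes(restored) == bytes(data)
```

Because the wrapped buffer is writable, the reconstituted object is a bytearray; a readonly buffer would come back as bytes.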
PickleBuffer objects can only be serialized using pickle protocol 5 or higher They are eligible for out of band serialization New in version 3 8 raw Return a memoryview of the memory area underlying this buffer The returned object is a one dimensional C contiguous memoryview with format B unsigned bytes BufferError is raised if the buffer is neither C nor Fortran contiguous release Release the underlying buffer exposed by the PickleBuffer object What can be pickled and unpickled The following types can be pickled built in constants None True False Ellipsis and NotImplemented integers floating point numbers complex numbers strings bytes bytearrays tuples lists sets and dictionaries containing only picklable objects functions built in and user defined accessible from the top level of a module using def not lambda classes accessible from the top level of a module instances of such classes for which the result of calling __getstate__ is picklable see section Pickling Class Instances for details Attempts to pickle unpicklable objects will raise the PicklingError exception when this happens an unspecified number of bytes may have already been written to the underlying file Trying to pickle a highly recursive data structure may exceed the maximum recursion depth a RecursionError will be raised in this case You can carefully raise this limit with sys setrecursionlimit Note that functions built in and user defined are pickled by fully qualified name not by value 2 This means that only the function name is pickled along with the name of the containing module and classes Neither the function s code nor any of its function attributes are pickled Thus the defining module must be importable in the unpickling environment and the module must contain the named object otherwise an exception will be raised 3 Similarly classes are pickled by fully qualified name so the same restrictions in the unpickling environment apply Note that none of the class s code or data is pickled so in the
following example the class attribute attr is not restored in the unpickling environment

class Foo:
    attr = 'A class attribute'

picklestring = pickle.dumps(Foo)

These restrictions are why picklable functions and classes must be defined at the top level of a module Similarly when class instances are pickled their class s code and data are not pickled along with them Only the instance data are pickled This is done on purpose so you can fix bugs in a class or add methods to the class and still load objects that were created with an earlier version of the class If you plan to have long lived objects that will see many versions of a class it may be worthwhile to put a version number in the objects so that suitable conversions can be made by the class s __setstate__ method Pickling Class Instances In this section we describe the general mechanisms available to you to define customize and control how class instances are pickled and unpickled In most cases no additional code is needed to make instances picklable By default pickle will retrieve the class and the attributes of an instance via introspection When a class instance is unpickled its __init__ method is usually not invoked The default behaviour first creates an uninitialized instance and then restores the saved attributes The following code shows an implementation of this behaviour

def save(obj):
    return (obj.__class__, obj.__dict__)

def restore(cls, attributes):
    obj = cls.__new__(cls)
    obj.__dict__.update(attributes)
    return obj

Classes can alter the default behaviour by providing one or several special methods object __getnewargs_ex__ In protocols 2 and newer classes that implement the __getnewargs_ex__ method can dictate the values passed to the __new__ method upon unpickling The method must return a pair args kwargs where args is a tuple of positional arguments and kwargs a dictionary of named arguments for constructing the object Those will be passed to the __new__ method upon unpickling You should implement this method if the
__new__ method of your class requires keyword only arguments Otherwise it is recommended for compatibility
to implement __getnewargs__ Changed in version 3 6 __getnewargs_ex__ is now used in protocols 2 and 3 object __getnewargs__ This method serves a similar purpose as __getnewargs_ex__ but supports only positional arguments It must return a tuple of arguments args which will be passed to the __new__ method upon unpickling __getnewargs__ will not be called if __getnewargs_ex__ is defined Changed in version 3 6 Before Python 3 6 __getnewargs__ was called instead of __getnewargs_ex__ in protocols 2 and 3 object __getstate__ Classes can further influence how their instances are pickled by overriding the method __getstate__ It is called and the returned object is pickled as the contents for the instance instead of a default state There are several cases For a class that has no instance __dict__ and no __slots__ the default state is None For a class that has an instance __dict__ and no __slots__ the default state is self __dict__ For a class that has an instance __dict__ and __slots__ the default state is a tuple consisting of two dictionaries self __dict__ and a dictionary mapping slot names to slot values Only slots that have a value are included in the latter For a class that has __slots__ and no instance __dict__ the default state is a tuple whose first item is None and whose second item is a dictionary mapping slot names to slot values described in the previous bullet Changed in version 3 11 Added the default implementation of the __getstate__ method in the object class object __setstate__ state Upon unpickling if the class defines __setstate__ it is called with the unpickled state In that case there is no requirement for the state object to be a dictionary Otherwise the pickled state must be a dictionary and its items are assigned to the new instance s dictionary Note If __reduce__ returns a state with value None at pickling the __setstate__ method will not be called upon unpickling Refer to the section Handling Stateful Objects for more information about how to use 
the methods __getstate__ and __setstate__ Note At unpickling time some methods like __getattr__ __getattribute__ or __setattr__ may be called upon the instance In case those methods rely on some internal invariant being true the type should implement __new__ to establish such an invariant as __init__ is not called when unpickling an instance As we shall see pickle does not use directly the methods described above In fact these methods are part of the copy protocol which implements the __reduce__ special method The copy protocol provides a unified interface for retrieving the data necessary for pickling and copying objects 4 Although powerful implementing __reduce__ directly in your classes is error prone For this reason class designers should use the high level interface i e __getnewargs_ex__ __getstate__ and __setstate__ whenever possible We will show however cases where using __reduce__ is the only option or leads to more efficient pickling or both object __reduce__ The interface is currently defined as follows The __reduce__ method takes no argument and shall return either a string or preferably a tuple the returned object is often referred to as the reduce value If a string is returned the string should be interpreted as the name of a global variable It should be the object s local name relative to its module the pickle module searches the module namespace to determine the object s module This behaviour is typically useful for singletons When a tuple is returned it must be between two and six items long Optional items can either be omitted or None can be provided as their value The semantics of each item are in order A callable object that will be called to create the initial version of the object A tuple of arguments for the callable object An empty tuple must be given if the callable does not accept any argument Optionally the object s state which will be passed to the object s __setstate__ method as previously described If the object has no such method then 
the value must be a dictionary and it will be added to the object s __dict__ attribute Optionally
an iterator and not a sequence yielding successive items These items will be appended to the object either using obj append item or in batch using obj extend list_of_items This is primarily used for list subclasses but may be used by other classes as long as they have append and extend methods with the appropriate signature Whether append or extend is used depends on which pickle protocol version is used as well as the number of items to append so both must be supported Optionally an iterator not a sequence yielding successive key value pairs These items will be stored to the object using obj key value This is primarily used for dictionary subclasses but may be used by other classes as long as they implement __setitem__ Optionally a callable with a obj state signature This callable allows the user to programmatically control the state updating behavior of a specific object instead of using obj s static __setstate__ method If not None this callable will have priority over obj s __setstate__ New in version 3 8 The optional sixth tuple item obj state was added object __reduce_ex__ protocol Alternatively a __reduce_ex__ method may be defined The only difference is this method should take a single integer argument the protocol version When defined pickle will prefer it over the __reduce__ method In addition __reduce__ automatically becomes a synonym for the extended version The main use for this method is to provide backwards compatible reduce values for older Python releases Persistence of External Objects For the benefit of object persistence the pickle module supports the notion of a reference to an object outside the pickled data stream Such objects are referenced by a persistent ID which should be either a string of alphanumeric characters for protocol 0 5 or just an arbitrary object for any newer protocol The resolution of such persistent IDs is not defined by the pickle module it will delegate this resolution to the user defined methods on the pickler and 
unpickler persistent_id and persistent_load respectively To pickle objects that have an external persistent ID the pickler must have a custom persistent_id method that takes an object as an argument and returns either None or the persistent ID for that object When None is returned the pickler simply pickles the object as normal When a persistent ID string is returned the pickler will pickle that object along with a marker so that the unpickler will recognize it as a persistent ID To unpickle external objects the unpickler must have a custom persistent_load method that takes a persistent ID object and returns the referenced object Here is a comprehensive example presenting how persistent ID can be used to pickle external objects by reference Simple example presenting how persistent ID can be used to pickle external objects by reference import pickle import sqlite3 from collections import namedtuple Simple class representing a record in our database MemoRecord namedtuple MemoRecord key task class DBPickler pickle Pickler def persistent_id self obj Instead of pickling MemoRecord as a regular class instance we emit a persistent ID if isinstance obj MemoRecord Here our persistent ID is simply a tuple containing a tag and a key which refers to a specific record in the database return MemoRecord obj key else If obj does not have a persistent ID return None This means obj needs to be pickled as usual return None class DBUnpickler pickle Unpickler def __init__ self file connection super __init__ file self connection connection def persistent_load self pid This method is invoked whenever a persistent ID is encountered Here pid is the tuple returned by DBPickler cursor self connection cursor type_tag key_id pid if type_tag MemoRecord Fetch the referenced record from the database and return it cursor execute SELECT FROM memos WHERE key str key_id key task cursor fetchone return MemoRecord key task else Always raises an error if you cannot return the correct object Otherwise 
the unpickler will think None is the object referenced by the persistent ID raise pickle UnpicklingError unsupported persistent object def main import io import pprint Initialize and populate our database conn sqlite3 connect memory cursor conn cursor cursor execute CREATE TABLE memos key INTEGER PRIMARY KEY task TEXT tasks give food to fish prepare group meeting fight with a zebra for task in tasks cursor execute INSERT INTO memos VALUES NULL task Fetch the records to be pickled cursor execute SELECT FROM memos memos MemoRecord key task for key task in cursor Save the records using our custom DBPickler file io BytesIO DBPickler file dump memos print Pickled records pprint pprint memos Update a record just for good measure cursor execute UPDATE memos SET task learn italian WHERE key 1 Load the records from the pickle data stream file seek 0 memos DBUnpickler file conn load print Unpickled records pprint pprint memos if __name__ __main__ main Dispatch Tables If one wants to customize pickling of some classes without disturbing any other code which depends on pickling then one can create a pickler with a private dispatch table The global dispatch table managed by the copyreg module is available as copyreg dispatch_table Therefore one may choose to use a modified copy of copyreg dispatch_table as a private dispatch table For example f io BytesIO p pickle Pickler f p dispatch_table copyreg dispatch_table copy p dispatch_table SomeClass reduce_SomeClass creates an instance of pickle Pickler with a private dispatch table which handles the SomeClass class specially Alternatively the code class MyPickler pickle Pickler dispatch_table copyreg dispatch_table copy dispatch_table SomeClass reduce_SomeClass f io BytesIO p MyPickler f does the same but all instances of MyPickler will by default share the private dispatch table On the other hand the code copyreg pickle SomeClass reduce_SomeClass f io BytesIO p pickle Pickler f modifies the global dispatch table shared by all users of the copyreg module Handling Stateful Objects Here s an example that shows how to modify pickling
behavior for a class The TextReader class below opens a text file and returns the line number and line contents each time its readline method is called If a TextReader instance is pickled all attributes except the file object member are saved When the instance is unpickled the file is reopened and reading resumes from the last location The __setstate__ and __getstate__ methods are used to implement this behavior class TextReader Print and number lines in a text file def __init__ self filename self filename filename self file open filename self lineno 0 def readline self self lineno 1 line self file readline if not line return None if line endswith n line line 1 return i s self lineno line def __getstate__ self Copy the object s state from self __dict__ which contains all our instance attributes Always use the dict copy method to avoid modifying the original state state self __dict__ copy Remove the unpicklable entries del state file return state def __setstate__ self state Restore instance attributes i e filename and lineno self __dict__ update state Restore the previously opened file s state To do so we need to reopen it and read from it until the line count is restored file open self filename for _ in range self lineno file readline Finally save the file self file file A sample usage might be something like this reader TextReader hello txt reader readline 1 Hello world reader readline 2 I am line number two new_reader pickle loads pickle dumps reader new_reader readline 3 Goodbye Custom Reduction for Types Functions and Other Objects New in version 3 8 Sometimes dispatch_table may not be flexible enough In particular we may want to customize pickling based on another criterion than the object s type or we may want to customize the pickling of functions and classes For those cases it is possible to subclass from the Pickler class and implement a reducer_override method This method can return an arbitrary reduction tuple see __reduce__ It can alternatively return 
NotImplemented to fall back to the traditional behavior If both the dispatch_table and reducer_override are defined then the reducer_override method takes priority Note For performance reasons reducer_override may not be called for the following objects None True False and exact instances of int float bytes str dict set frozenset list and tuple Here is a simple example where we allow pickling and reconstructing a given class import io import pickle class MyClass my_attribute 1 class MyPickler pickle Pickler def reducer_override self obj Custom reducer for MyClass if getattr obj __name__ None MyClass return type obj __name__ obj __bases__ my_attribute obj my_attribute else For any other object fallback to usual reduction return NotImplemented f io BytesIO p MyPickler f p dump MyClass del MyClass unpickled_class pickle loads f getvalue assert isinstance unpickled_class type assert unpickled_class __name__ MyClass assert unpickled_class my_attribute 1 Out of band Buffers New in version 3 8 In some contexts the pickle module is used to transfer massive amounts of data Therefore it can be important to minimize the number of memory copies to preserve performance and resource consumption However normal operation of the pickle module as it transforms a graph like structure of objects into a sequential stream of bytes intrinsically involves copying data to and from the pickle stream This constraint can be eschewed if both the provider the implementation of the object types to be transferred and the consumer the implementation of the communications system support the out of band transfer facilities provided by pickle protocol 5 and higher Provider API The large data objects to be pickled must implement a __reduce_ex__ method specialized for protocol 5 and higher which returns a PickleBuffer instance instead of e g a bytes object for any large data A PickleBuffer object signals that the underlying buffer is eligible for out of band data transfer Those objects remain compatible with normal usage of the pickle module However consumers can also opt in to tell pickle that they will
handle those buffers by themselves Consumer API A communications system can enable custom handling of the PickleBuffer objects generated when serializing an object graph On the sending side it needs to pass a buffer_callback argument to Pickler or to the dump or dumps function which will be called with each PickleBuffer generated while pickling the object graph Buffers accumulated by the buffer_callback will not see their data copied into the pickle stream only a cheap marker will be inserted On the receiving side it needs to pass a buffers argument to Unpickler or to the load or loads function which is an iterable of the buffers which were passed to buffer_callback That iterable should produce buffers in the same order as they were passed to buffer_callback Those buffers will provide the data expected by the reconstructors of the objects whose pickling produced the original PickleBuffer objects Between the sending side and the receiving side the communications system is free to implement its own transfer mechanism for out of band buffers Potential optimizations include the use of shared memory or datatype dependent compression Example Here is a trivial example where we implement a bytearray subclass able to participate in out of band buffer pickling class ZeroCopyByteArray bytearray def __reduce_ex__ self protocol if protocol 5 return type self _reconstruct PickleBuffer self None else PickleBuffer is forbidden with pickle protocols 4 return type self _reconstruct bytearray self classmethod def _reconstruct cls obj with memoryview obj as m Get a handle over the original buffer object obj m obj if type obj is cls Original buffer object is a ZeroCopyByteArray return it as is return obj else return cls obj The reconstructor the _reconstruct class method returns the buffer s providing object if it has the right type This is an easy way to simulate zero copy behaviour on this toy example On the consumer side we can pickle those objects the usual way which when 
unserialized will give us a copy of the original object b ZeroCopyByteArray b abc data pickle dumps b protocol
5 new_b pickle loads data print b new_b True print b is new_b False a copy was made But if we pass a buffer_callback and then give back the accumulated buffers when unserializing we are able to get back the original object b ZeroCopyByteArray b abc buffers data pickle dumps b protocol 5 buffer_callback buffers append new_b pickle loads data buffers buffers print b new_b True print b is new_b True no copy was made This example is limited by the fact that bytearray allocates its own memory you cannot create a bytearray instance that is backed by another object s memory However third party datatypes such as NumPy arrays do not have this limitation and allow use of zero copy pickling or making as few copies as possible when transferring between distinct processes or systems See also PEP 574 Pickle protocol 5 with out of band data Restricting Globals By default unpickling will import any class or function that it finds in the pickle data For many applications this behaviour is unacceptable as it permits the unpickler to import and invoke arbitrary code Just consider what this hand crafted pickle data stream does when loaded import pickle pickle loads b cos nsystem n S echo hello world ntR hello world 0 In this example the unpickler imports the os system function and then applies the string argument echo hello world Although this example is inoffensive it is not difficult to imagine one that could damage your system For this reason you may want to control what gets unpickled by customizing Unpickler find_class Despite what its name suggests Unpickler find_class is called whenever a global i e a class or a function is requested Thus it is possible to either completely forbid globals or restrict them to a safe subset Here is an example of an unpickler allowing only a few safe classes from the builtins module to be loaded import builtins import io import pickle safe_builtins range complex set frozenset slice class RestrictedUnpickler pickle Unpickler def find_class self module name
Only allow safe classes from builtins if module builtins and name in safe_builtins return getattr builtins name Forbid everything else raise pickle UnpicklingError global s s is forbidden module name def restricted_loads s Helper function analogous to pickle loads return RestrictedUnpickler io BytesIO s load A sample usage of our unpickler working as intended restricted_loads pickle dumps 1 2 range 15 1 2 range 0 15 restricted_loads b cos nsystem n S echo hello world ntR Traceback most recent call last pickle UnpicklingError global os system is forbidden restricted_loads b cbuiltins neval n b S getattr __import__ os system b echo hello world ntR Traceback most recent call last pickle UnpicklingError global builtins eval is forbidden As our examples show you have to be careful with what you allow to be unpickled Therefore if security is a concern you may want to consider alternatives such as the marshalling API in xmlrpc client or third party solutions Performance Recent versions of the pickle protocol from protocol 2 and upwards feature efficient binary encodings for several common features and built in types Also the pickle module has a transparent optimizer written in C Examples For the simplest code use the dump and load functions import pickle An arbitrary collection of objects supported by pickle data a 1 2 0 3 4j b character string b byte string c None True False with open data pickle wb as f Pickle the data dictionary using the highest protocol available pickle dump data f pickle HIGHEST_PROTOCOL The following example reads the resulting pickled data import pickle with open data pickle rb as f The protocol version used is detected automatically so we do not have to specify it data pickle load f See also Module copyreg Pickle interface constructor registration for extension types Module pickletools Tools for working with and analyzing pickled data Module shelve Indexed databases of objects uses pickle Module copy Shallow and deep object copying Module
marshal High performance serialization of built in types Footnotes 1 Don t confuse this with the marshal
module 2 This is why lambda functions cannot be pickled all lambda functions share the same name lambda 3 The exception raised will likely be an ImportError or an AttributeError but it could be something else 4 The copy module uses this protocol for shallow and deep copying operations 5 The limitation on alphanumeric characters is due to the fact that persistent IDs in protocol 0 are delimited by the newline character Therefore if any kind of newline characters occurs in persistent IDs the resulting pickled data will become unreadable
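Footnote 2 can be demonstrated directly: because every lambda function shares the name <lambda>, the pickler cannot save one by reference, and dumps() fails. A minimal sketch:

```python
import pickle

square = lambda x: x * x  # bound to a name, but its __qualname__ is still "<lambda>"

try:
    pickle.dumps(square)
    picklable = True
except (pickle.PicklingError, AttributeError):
    # Pickling functions works by reference: the pickler tries to look up
    # "<lambda>" in the defining module, which fails.
    picklable = False

print("lambda picklable:", picklable)
```

Named functions defined at module level do not have this problem, since they can be looked up by their qualified name at unpickling time.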
Tkinter Dialogs tkinter simpledialog Standard Tkinter input dialogs Source code Lib tkinter simpledialog py The tkinter simpledialog module contains convenience classes and functions for creating simple modal dialogs to get a value from the user tkinter simpledialog askfloat title prompt kw tkinter simpledialog askinteger title prompt kw tkinter simpledialog askstring title prompt kw The above three functions provide dialogs that prompt the user to enter a value of the desired type class tkinter simpledialog Dialog parent title None The base class for custom dialogs body master Override to construct the dialog s interface and return the widget that should have initial focus buttonbox Default behaviour adds OK and Cancel buttons Override for custom button layouts tkinter filedialog File selection dialogs Source code Lib tkinter filedialog py The tkinter filedialog module provides classes and factory functions for creating file directory selection windows Native Load Save Dialogs The following classes and functions provide file dialog windows that combine a native look and feel with configuration options to customize behaviour The following keyword arguments are applicable to the classes and functions listed below parent the window to place the dialog on top of title the title of the window initialdir the directory that the dialog starts in initialfile the file selected upon opening of the dialog filetypes a sequence of label pattern tuples wildcard is allowed defaultextension default extension to append to file save dialogs multiple when true selection of multiple items is allowed Static factory functions The below functions when called create a modal native look and feel dialog wait for the user s selection then return the selected value s or None to the caller tkinter filedialog askopenfile mode r options tkinter filedialog askopenfiles mode r options The above two functions create an Open dialog and return the opened file object s in read only mode tkinter 
filedialog asksaveasfile mode w options Create a SaveAs dialog and return a file object opened in write only mode tkinter filedialog askopenfilename options tkinter filedialog askopenfilenames options The above two functions create an Open dialog and return the selected filename s that correspond to existing file s tkinter filedialog asksaveasfilename options Create a SaveAs dialog and return the selected filename tkinter filedialog askdirectory options Prompt user to select a directory Additional keyword option mustexist determines if selection must be an existing directory class tkinter filedialog Open master None options class tkinter filedialog SaveAs master None options The above two classes provide native dialog windows for saving and loading files Convenience classes The below classes are used for creating file directory windows from scratch These do not emulate the native look and feel of the platform class tkinter filedialog Directory master None options Create a dialog prompting the user to select a directory Note The FileDialog class should be subclassed for custom event handling and behaviour class tkinter filedialog FileDialog master title None Create a basic file selection dialog cancel_command event None Trigger the termination of the dialog window dirs_double_event event Event handler for double click event on directory dirs_select_event event Event handler for click event on directory files_double_event event Event handler for double click event on file files_select_event event Event handler for single click event on file filter_command event None Filter the files by directory get_filter Retrieve the file filter currently in use get_selection Retrieve the currently selected item go dir_or_file os curdir pattern default key None Render dialog and start event loop ok_event event Exit dialog returning current selection quit how None Exit dialog returning filename if any set_filter dir pat Set the file filter set_selection file Update the current file 
selection to file class tkinter filedialog LoadFileDialog master title None A subclass of FileDialog that creates a dialog window for selecting an existing file ok_command Test that a file is provided and that the selection indicates an already existing file class tkinter filedialog SaveFileDialog master title None A subclass of FileDialog that creates a dialog window for selecting a destination file ok_command Test whether or not the selection points to a valid file that is not a directory Confirmation is required if an already existing file is selected tkinter commondialog Dialog window templates Source code Lib tkinter commondialog py The tkinter commondialog module provides the Dialog class that is the base class for dialogs defined in other supporting modules class tkinter commondialog Dialog master None options show color None options Render the Dialog window See also Modules tkinter messagebox Reading and Writing Files
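The static factory functions above are often wrapped in a small application helper. The sketch below is illustrative: the helper name ask_report_path and the file types shown are assumptions, not part of the module, and only keyword options documented above (parent, title, defaultextension, filetypes) are used:

```python
from tkinter import filedialog

def ask_report_path(parent=None):
    """Prompt for a destination filename with the native SaveAs dialog.

    Returns the selected path, or an empty string if the user cancels.
    """
    return filedialog.asksaveasfilename(
        parent=parent,
        title="Save report",
        defaultextension=".txt",
        filetypes=[("Text files", "*.txt"), ("All files", "*")],
    )
```

Calling the helper requires an initialized Tk root window (tkinter.Tk()), which can be hidden with withdraw() if no main window is wanted.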
5 The import system Python code in one module gains access to the code in another module by the process of importing it The import statement is the most common way of invoking the import machinery but it is not the only way Functions such as importlib import_module and built in __import__ can also be used to invoke the import machinery The import statement combines two operations it searches for the named module then it binds the results of that search to a name in the local scope The search operation of the import statement is defined as a call to the __import__ function with the appropriate arguments The return value of __import__ is used to perform the name binding operation of the import statement See the import statement for the exact details of that name binding operation A direct call to __import__ performs only the module search and if found the module creation operation While certain side effects may occur such as the importing of parent packages and the updating of various caches including sys modules only the import statement performs a name binding operation When an import statement is executed the standard builtin __import__ function is called Other mechanisms for invoking the import system such as importlib import_module may choose to bypass __import__ and use their own solutions to implement import semantics When a module is first imported Python searches for the module and if found it creates a module object 1 initializing it If the named module cannot be found a ModuleNotFoundError is raised Python implements various strategies to search for the named module when the import machinery is invoked These strategies can be modified and extended by using various hooks described in the sections below Changed in version 3 3 The import system has been updated to fully implement the second phase of PEP 302 There is no longer any implicit import machinery the full import system is exposed through sys meta_path In addition native namespace package support has 
been implemented see PEP 420 5 1 importlib The importlib module provides a rich API for interacting with the import system For example importlib import_module provides a recommended simpler API than built in __import__ for invoking the import machinery Refer to the importlib library documentation for additional detail 5 2 Packages Python has only one type of module object and all modules are of this type regardless of whether the module is implemented in Python C or something else To help organize modules and provide a naming hierarchy Python has a concept of packages You can think of packages as the directories on a file system and modules as files within directories but don t take this analogy too literally since packages and modules need not originate from the file system For the purposes of this documentation we ll use this convenient analogy of directories and files Like file system directories packages are organized hierarchically and packages may themselves contain subpackages as well as regular modules It s important to keep in mind that all packages are modules but not all modules are packages Or put another way packages are just a special kind of module Specifically any module that contains a __path__ attribute is considered a package All modules have a name Subpackage names are separated from their parent package name by a dot akin to Python s standard attribute access syntax Thus you might have a package called email which in turn has a subpackage called email mime and a module within that subpackage called email mime text 5 2 1 Regular packages Python defines two types of packages regular packages and namespace packages Regular packages are traditional packages as they existed in Python 3 2 and earlier A regular package is typically implemented as a directory containing an __init__ py file When a regular package is imported this __init__ py file is implicitly executed and the objects it defines are bound to names in the package s namespace The __init__ 
py file can contain the same Python code that any other module can contain and Python will add some additional attributes to the module when it is imported For example the following file system layout defines a top level parent package with three subpackages parent __init__ py one __init__ py two __init__ py three __init__ py Importing parent one will implicitly execute parent __init__ py and parent one __init__ py Subsequent imports of parent two or parent three will execute parent two __init__ py and parent three __init__ py respectively 5 2 2 Namespace packages A namespace package is a composite of various portions where each portion contributes a subpackage to the parent package Portions may reside in different locations on the file system Portions may also be found in zip files on the network or anywhere else that Python searches during import Namespace packages may or may not correspond directly to objects on the file system they may be virtual modules that have no concrete representation Namespace packages do not use an ordinary list for their __path__ attribute They instead use a custom iterable type which will automatically perform a new search for package portions on the next import attempt within that package if the path of their parent package or sys path for a top level package changes With namespace packages there is no parent __init__ py file In fact there may be multiple parent directories found during import search where each one is provided by a different portion Thus parent one may not be physically located next to parent two In this case Python will create a namespace package for the top level parent package whenever it or one of its subpackages is imported See also PEP 420 for the namespace package specification 5 3 Searching To begin the search Python needs the fully qualified name of the module (or package but for the purposes of this discussion the difference is immaterial) being imported This name may come from various arguments to the import statement or from the parameters to the importlib import_module or __import__ functions This
name will be used in various phases of the import search and it may be the dotted path to a submodule e g foo bar baz In this case Python first tries to import foo then foo bar and finally foo bar baz If any of the intermediate imports fail a ModuleNotFoundError is raised 5 3 1 The module cache The first place checked during import search is sys modules This mapping serves as a cache of all modules that have been previously imported including the intermediate paths So if foo bar baz was previously imported sys modules will contain entries for foo foo bar and foo bar baz Each key will have as its value the corresponding module object During import the module name is looked up in sys modules and if present the associated value is the module satisfying the import and the process completes However if the value is None then a ModuleNotFoundError is raised If the module name is missing Python will continue searching for the module sys modules is writable Deleting a key may not destroy the associated module as other modules may hold references to it but it will invalidate the cache entry for the named module causing Python to search anew for the named module upon its next import The key can also be assigned to None forcing the next import of the module to result in a ModuleNotFoundError Beware though as if you keep a reference to the module object invalidate its cache entry in sys modules and then re import the named module the two module objects will not be the same By contrast importlib reload will reuse the same module object and simply reinitialise the module contents by rerunning the module s code 5 3 2 Finders and loaders If the named module is not found in sys modules then Python s import protocol is invoked to find and load the module This protocol consists of two conceptual objects finders and loaders A finder s job is to determine whether it can find the named module using whatever strategy it knows about Objects that implement both of these interfaces are 
referred to as importers they return themselves when they find that they can load the requested module Python includes a number of default finders and importers The first one knows how to locate built in modules and the second knows how to locate frozen modules A third default finder searches an import path for modules The import path is a list of locations that may name file system paths or zip files It can also be extended to search for any locatable resource such as those identified by URLs The import machinery is extensible so new finders can be added to extend the range and scope of module searching Finders do not actually load modules If they can find the named module they return a module spec an encapsulation of the module s import related information which the import machinery then uses when loading the module The following sections describe the protocol for finders and loaders in more detail including how you can create and register new ones to extend the import machinery Changed in version 3 4 In previous versions of Python finders returned loaders directly whereas now they return module specs which contain loaders Loaders are still used during import but have fewer responsibilities 5 3 3 Import hooks The import machinery is designed to be extensible the primary mechanism for this is the import hooks There are two types of import hooks meta hooks and import path hooks Meta hooks are called at the start of import processing before any other import processing has occurred other than sys modules cache look up This allows meta hooks to override sys path processing frozen modules or even built in modules Meta hooks are registered by adding new finder objects to sys meta_path as described below Import path hooks are called as part of sys path or package __path__ processing at the point where their associated path item is encountered Import path hooks are registered by adding new callables to sys path_hooks as described below 5 3 4 The meta path When the named module is not found in sys modules Python next searches sys meta_path which contains a list of meta path
finder objects These finders are queried in order to see if they know how to handle the named module Meta path finders must implement a method called find_spec which takes three arguments a name an import path and optionally a target module The meta path finder can use any strategy it wants to determine whether it can handle the named module or not If the meta path finder knows how to handle the named module it returns a spec object If it cannot handle the named module it returns None If sys meta_path processing reaches the end of its list without returning a spec then a ModuleNotFoundError is raised Any other exceptions raised are simply propagated up aborting the import process The find_spec method of meta path finders is called with two or three arguments The first is the fully qualified name of the module being imported for example foo bar baz The second argument is the path entries to use for the module search For top level modules the second argument is None but for submodules or subpackages the second argument is the value of the parent package s __path__ attribute If the appropriate __path__ attribute cannot be accessed a ModuleNotFoundError is raised The third argument is an existing module object that will be the target of loading later The import system passes in a target module only during reload The meta path may be traversed multiple times for a single import request For example assuming none of the modules involved has already been cached importing foo bar baz will first perform a top level import calling mpf find_spec foo None None on each meta path finder mpf After foo has been imported foo bar will be imported by traversing the meta path a second time calling mpf find_spec foo bar foo __path__ None Once foo bar has been imported the final traversal will call mpf find_spec foo bar baz foo bar __path__ None Some meta path finders only support top level imports These importers will always return None when anything other than None is passed as the 
second argument.

Python's default sys.meta_path has three meta path finders: one that knows how to import built-in modules, one that knows how to import frozen modules, and one that knows how to import modules from an import path (i.e. the path based finder).

Changed in version 3.4: The find_spec() method of meta path finders replaced find_module(), which is now deprecated. While it will continue to work without change, the import machinery will try it only if the finder does not implement find_spec().
Changed in version 3.10: Use of find_module() by the import system now raises ImportWarning.
Changed in version 3.12: find_module() has been removed. Use find_spec() instead.

5.4 Loading

If and when a module spec is found, the import machinery will use it (and the loader it contains) when loading the module. Here is an approximation of what happens during the loading portion of import:

    module = None
    if spec.loader is not None and hasattr(spec.loader, 'create_module'):
        # It is assumed 'exec_module' will also be defined on the loader.
        module = spec.loader.create_module(spec)
    if module is None:
        module = ModuleType(spec.name)
    # The import-related module attributes get set here:
    _init_module_attrs(spec, module)

    if spec.loader is None:
        # unsupported
        raise ImportError
    if spec.origin is None and spec.submodule_search_locations is not None:
        # namespace package
        sys.modules[spec.name] = module
    elif not hasattr(spec.loader, 'exec_module'):
        module = spec.loader.load_module(spec.name)
    else:
        sys.modules[spec.name] = module
        try:
            spec.loader.exec_module(module)
        except BaseException:
            try:
                del sys.modules[spec.name]
            except KeyError:
                pass
            raise
    return sys.modules[spec.name]

Note the following details:

If there is an existing module object with the given name in sys.modules, import will have already returned it.

The module will exist in sys.modules before the loader executes the module code. This is crucial because the module code may directly or indirectly import itself; adding it to sys.modules beforehand prevents unbounded recursion in the worst case and multiple loading in the best.

If loading fails, the failing module, and only the failing module, gets removed from sys.modules.
Any module already in the sys modules cache and any module that was successfully loaded as a side effect must remain in the cache This contrasts with reloading where even the failing module is left in sys modules After the module is created but before execution the import machinery sets the import related module attributes _init_module_attrs in the pseudo code example above as summarized in a later section Module execution is the key moment of loading in which the module s namespace gets populated Execution is entirely delegated to the loader which gets to decide what gets populated and how The module created during loading and passed to exec_module may not be the one returned at the end of import 2 Changed in version 3 4 The import system has taken over the boilerplate responsibilities of loaders These were previously performed by the importlib abc Loader load_module method 5 4 1 Loaders Module loaders provide the critical function of loading module execution The import machinery calls the importlib abc Loader exec_module method with a single argument the module object to execute Any value returned from exec_module is ignored Loaders must satisfy the following requirements If the module is a Python module as opposed to a built in module or a dynamically loaded extension the loader should execute the module s code in the module s global name space module __dict__ If the loader cannot execute the module it should raise an ImportError although any other exception raised during exec_module will be propagated In many cases the finder and loader can be the same object in such cases the find_spec method would just return a spec with the loader set to self Module loaders may opt in to creating the module object during loading by implementing a create_module method It takes one argument the module spec and returns the new module object to use during loading create_module does not need to set any attributes on the module object If the method returns None the import 
machinery will create the new module itself.

New in version 3.4: The create_module() method of loaders.

Changed in version 3.4: The load_module() method was replaced by exec_module(), and the import machinery assumed all the boilerplate responsibilities of loading. For compatibility with existing loaders, the import machinery will use the load_module() method of loaders if it exists and the loader does not also implement exec_module(). However, load_module() has been deprecated and loaders should implement exec_module() instead. The load_module() method must implement all the boilerplate loading functionality described above in addition to executing the module. All the same constraints apply, with some additional clarification:

If there is an existing module object with the given name in sys.modules, the loader must use that existing module. (Otherwise, importlib.reload() will not work correctly.) If the named module does not exist in sys.modules, the loader must create a new module object and add it to sys.modules. The module must exist in sys.modules before the loader executes the module code, to prevent unbounded recursion or multiple loading. If loading fails, the loader must remove any modules it has inserted into sys.modules, but it must remove only the failing module(s), and only if the loader itself has loaded the module(s) explicitly.

Changed in version 3.5: A DeprecationWarning is raised when exec_module() is defined but create_module() is not.
Changed in version 3.6: An ImportError is raised when exec_module() is defined but create_module() is not.
Changed in version 3.10: Use of load_module() will raise ImportWarning.

5.4.2 Submodules

When a submodule is loaded using any mechanism (e.g. importlib APIs, the import or import-from statements, or built-in __import__()) a binding is placed in the parent module's namespace to the submodule object. For example, if package spam has a submodule foo, after importing spam.foo, spam will have an attribute foo which is bound to the submodule. Let's say you have the following directory structure:

    spam/
        __init__.py
        foo.py

and spam/__init__.py has the following line in it:

    from .foo import Foo

then executing the following puts name bindings for foo and Foo in the spam module:

    >>> import spam
    >>> spam.foo
    <module 'spam.foo' from '/tmp/imports/spam/foo.py'>
    >>> spam.Foo
    <class 'spam.foo.Foo'>

Given Python's familiar name binding rules this might seem surprising, but it's actually a fundamental feature of the import system. The invariant holding is that if you have sys.modules['spam'] and sys.modules['spam.foo'] (as you would after the above import), the latter must appear as the foo attribute of the former.

5.4.3 Module spec

The import machinery uses a variety of information about each module during import, especially before loading. Most of the information is common to all modules. The purpose of a module's spec is to encapsulate this import-related information on a per-module basis. Using a spec during import allows state to be transferred between import system components, e.g. between the finder that creates the module spec and the loader that executes it. Most importantly, it allows the import machinery to perform the boilerplate operations of loading, whereas without a module spec the loader had that responsibility. The module's spec is exposed as the __spec__ attribute on a module object. See ModuleSpec for details on the contents of the module spec.

New in version 3.4.

5.4.4 Import-related module attributes

The import machinery fills in these attributes on each module object during loading, based on the module's spec, before the loader executes the module. It is strongly recommended that you rely on __spec__ and its attributes instead of any of the other individual attributes listed below.

__name__ The __name__ attribute must be set to the fully qualified name of the module. This name is used to uniquely identify the module in the import system.

__loader__ The __loader__ attribute must be set to the loader object that the import machinery used when loading the module. This is mostly for introspection, but can be used for additional loader-specific functionality, for example getting data associated with a loader.
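These import-related attributes can be checked on any already-imported module; the following is a minimal sketch using the standard library's json module (nothing here is specific to json, it is just a convenient package to inspect):

```python
import json

# __name__ is the fully qualified module name set by the import machinery
print(json.__name__)

# the module spec carries the same name, and for a package the spec's
# parent is the package itself, which is what __package__ mirrors
print(json.__spec__.name == json.__name__)
print(json.__package__ == json.__spec__.parent)
```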
It is strongly recommended that you rely on __spec__ instead of this attribute Changed in version
3 12 The value of __loader__ is expected to be the same as __spec__ loader The use of __loader__ is deprecated and slated for removal in Python 3 14 __package__ The module s __package__ attribute may be set Its value must be a string but it can be the same value as its __name__ When the module is a package its __package__ value should be set to its __name__ When the module is not a package __package__ should be set to the empty string for top level modules or for submodules to the parent package s name See PEP 366 for further details This attribute is used instead of __name__ to calculate explicit relative imports for main modules as defined in PEP 366 It is strongly recommended that you rely on __spec__ instead of this attribute Changed in version 3 6 The value of __package__ is expected to be the same as __spec__ parent Changed in version 3 10 ImportWarning is raised if import falls back to __package__ instead of parent Changed in version 3 12 Raise DeprecationWarning instead of ImportWarning when falling back to __package__ __spec__ The __spec__ attribute must be set to the module spec that was used when importing the module Setting __spec__ appropriately applies equally to modules initialized during interpreter startup The one exception is __main__ where __spec__ is set to None in some cases When __spec__ parent is not set __package__ is used as a fallback New in version 3 4 Changed in version 3 6 __spec__ parent is used as a fallback when __package__ is not defined __path__ If the module is a package either regular or namespace the module object s __path__ attribute must be set The value must be iterable but may be empty if __path__ has no further significance If __path__ is not empty it must produce strings when iterated over More details on the semantics of __path__ are given below Non package modules should not have a __path__ attribute __file__ __cached__ __file__ is optional if set value must be a string It indicates the pathname of the file from which 
the module was loaded if loaded from a file or the pathname of the shared library file for extension modules loaded dynamically from a shared library It might be missing for certain types of modules such as C modules that are statically linked into the interpreter and the import system may opt to leave it unset if it has no semantic meaning e g a module loaded from a database If __file__ is set then the __cached__ attribute might also be set which is the path to any compiled version of the code e g byte compiled file The file does not need to exist to set this attribute the path can simply point to where the compiled file would exist see PEP 3147 Note that __cached__ may be set even if __file__ is not set However that scenario is quite atypical Ultimately the loader is what makes use of the module spec provided by the finder from which __file__ and __cached__ are derived So if a loader can load from a cached module but otherwise does not load from a file that atypical scenario may be appropriate It is strongly recommended that you rely on __spec__ instead of __cached__ 5 4 5 module __path__ By definition if a module has a __path__ attribute it is a package A package s __path__ attribute is used during imports of its subpackages Within the import machinery it functions much the same as sys path i e providing a list of locations to search for modules during import However __path__ is typically much more constrained than sys path __path__ must be an iterable of strings but it may be empty The same rules used for sys path also apply to a package s __path__ and sys path_hooks described below are consulted when traversing a package s __path__ A package s __init__ py file may set or alter the package s __path__ attribute and this was typically the way namespace packages were implemented prior to PEP 420 With the adoption of PEP 420 namespace packages no longer need to supply __init__ py files containing only __path__ manipulation code the import machinery automatically 
sets __path__ correctly for the namespace package 5 4 6 Module reprs By default all modules have a us
able repr however depending on the attributes set above and in the module s spec you can more explicitly control the repr of module objects If the module has a spec __spec__ the import machinery will try to generate a repr from it If that fails or there is no spec the import system will craft a default repr using whatever information is available on the module It will try to use the module __name__ module __file__ and module __loader__ as input into the repr with defaults for whatever information is missing Here are the exact rules used If the module has a __spec__ attribute the information in the spec is used to generate the repr The name loader origin and has_location attributes are consulted If the module has a __file__ attribute this is used as part of the module s repr If the module has no __file__ but does have a __loader__ that is not None then the loader s repr is used as part of the module s repr Otherwise just use the module s __name__ in the repr Changed in version 3 12 Use of module_repr having been deprecated since Python 3 4 was removed in Python 3 12 and is no longer called during the resolution of a module s repr 5 4 7 Cached bytecode invalidation Before Python loads cached bytecode from a pyc file it checks whether the cache is up to date with the source py file By default Python does this by storing the source s last modified timestamp and size in the cache file when writing it At runtime the import system then validates the cache file by checking the stored metadata in the cache file against the source s metadata Python also supports hash based cache files which store a hash of the source file s contents rather than its metadata There are two variants of hash based pyc files checked and unchecked For checked hash based pyc files Python validates the cache file by hashing the source file and comparing the resulting hash with the hash in the cache file If a checked hash based cache file is found to be invalid Python regenerates it and writes a new 
checked hash based cache file For unchecked hash based pyc files Python simply assumes the cache file is valid if it exists Hash based pyc files validation behavior may be overridden with the check hash based pycs flag Changed in version 3 7 Added hash based pyc files Previously Python only supported timestamp based invalidation of bytecode caches 5 5 The Path Based Finder As mentioned previously Python comes with several default meta path finders One of these called the path based finder PathFinder searches an import path which contains a list of path entries Each path entry names a location to search for modules The path based finder itself doesn t know how to import anything Instead it traverses the individual path entries associating each of them with a path entry finder that knows how to handle that particular kind of path The default set of path entry finders implement all the semantics for finding modules on the file system handling special file types such as Python source code py files Python byte code pyc files and shared libraries e g so files When supported by the zipimport module in the standard library the default path entry finders also handle loading all of these file types other than shared libraries from zipfiles Path entries need not be limited to file system locations They can refer to URLs database queries or any other location that can be specified as a string The path based finder provides additional hooks and protocols so that you can extend and customize the types of searchable path entries For example if you wanted to support path entries as network URLs you could write a hook that implements HTTP semantics to find modules on the web This hook a callable would return a path entry finder supporting the protocol described below which was then used to get a loader for the module from the web A word of warning this section and the previous both use the term finder distinguishing between them by using the terms meta path finder and path entry 
finder These two types of finders are very similar support similar protocols and function in similar
ways during the import process but it s important to keep in mind that they are subtly different In particular meta path finders operate at the beginning of the import process as keyed off the sys meta_path traversal By contrast path entry finders are in a sense an implementation detail of the path based finder and in fact if the path based finder were to be removed from sys meta_path none of the path entry finder semantics would be invoked 5 5 1 Path entry finders The path based finder is responsible for finding and loading Python modules and packages whose location is specified with a string path entry Most path entries name locations in the file system but they need not be limited to this As a meta path finder the path based finder implements the find_spec protocol previously described however it exposes additional hooks that can be used to customize how modules are found and loaded from the import path Three variables are used by the path based finder sys path sys path_hooks and sys path_importer_cache The __path__ attributes on package objects are also used These provide additional ways that the import machinery can be customized sys path contains a list of strings providing search locations for modules and packages It is initialized from the PYTHONPATH environment variable and various other installation and implementation specific defaults Entries in sys path can name directories on the file system zip files and potentially other locations see the site module that should be searched for modules such as URLs or database queries Only strings should be present on sys path all other data types are ignored The path based finder is a meta path finder so the import machinery begins the import path search by calling the path based finder s find_spec method as described previously When the path argument to find_spec is given it will be a list of string paths to traverse typically a package s __path__ attribute for an import within that package If the path argument is 
None this indicates a top level import and sys path is used The path based finder iterates over every entry in the search path and for each of these looks for an appropriate path entry finder PathEntryFinder for the path entry Because this can be an expensive operation e g there may be stat call overheads for this search the path based finder maintains a cache mapping path entries to path entry finders This cache is maintained in sys path_importer_cache despite the name this cache actually stores finder objects rather than being limited to importer objects In this way the expensive search for a particular path entry location s path entry finder need only be done once User code is free to remove cache entries from sys path_importer_cache forcing the path based finder to perform the path entry search again If the path entry is not present in the cache the path based finder iterates over every callable in sys path_hooks Each of the path entry hooks in this list is called with a single argument the path entry to be searched This callable may either return a path entry finder that can handle the path entry or it may raise ImportError An ImportError is used by the path based finder to signal that the hook cannot find a path entry finder for that path entry The exception is ignored and import path iteration continues The hook should expect either a string or bytes object the encoding of bytes objects is up to the hook e g it may be a file system encoding UTF 8 or something else and if the hook cannot decode the argument it should raise ImportError If sys path_hooks iteration ends with no path entry finder being returned then the path based finder s find_spec method will store None in sys path_importer_cache to indicate that there is no finder for this path entry and return None indicating that this meta path finder could not find the module If a path entry finder is returned by one of the path entry hook callables on sys path_hooks then the following protocol is used to 
ask the finder for a module spec which is then used when loading the module The current working di
rectory denoted by an empty string is handled slightly differently from other entries on sys path First if the current working directory is found to not exist no value is stored in sys path_importer_cache Second the value for the current working directory is looked up fresh for each module lookup Third the path used for sys path_importer_cache and returned by importlib machinery PathFinder find_spec will be the actual current working directory and not the empty string 5 5 2 Path entry finder protocol In order to support imports of modules and initialized packages and also to contribute portions to namespace packages path entry finders must implement the find_spec method find_spec takes two arguments the fully qualified name of the module being imported and the optional target module find_spec returns a fully populated spec for the module This spec will always have loader set with one exception To indicate to the import machinery that the spec represents a namespace portion the path entry finder sets submodule_search_locations to a list containing the portion Changed in version 3 4 find_spec replaced find_loader and find_module both of which are now deprecated but will be used if find_spec is not defined Older path entry finders may implement one of these two deprecated methods instead of find_spec The methods are still respected for the sake of backward compatibility However if find_spec is implemented on the path entry finder the legacy methods are ignored find_loader takes one argument the fully qualified name of the module being imported find_loader returns a 2 tuple where the first item is the loader and the second item is a namespace portion For backwards compatibility with other implementations of the import protocol many path entry finders also support the same traditional find_module method that meta path finders support However path entry finder find_module methods are never called with a path argument they are expected to record the appropriate path 
information from the initial call to the path hook. The find_module() method on path entry finders is deprecated, as it does not allow the path entry finder to contribute portions to namespace packages. If both find_loader() and find_module() exist on a path entry finder, the import system will always call find_loader() in preference to find_module().

Changed in version 3.10: Calls to find_module() and find_loader() by the import system will raise ImportWarning.
Changed in version 3.12: find_module() and find_loader() have been removed.

5.6 Replacing the standard import system

The most reliable mechanism for replacing the entire import system is to delete the default contents of sys.meta_path, replacing them entirely with a custom meta path hook. If it is acceptable to only alter the behaviour of import statements without affecting other APIs that access the import system, then replacing the builtin __import__() function may be sufficient. This technique may also be employed at the module level to only alter the behaviour of import statements within that module.

To selectively prevent the import of some modules from a hook early on the meta path (rather than disabling the standard import system entirely), it is sufficient to raise ModuleNotFoundError directly from find_spec() instead of returning None. The latter indicates that the meta path search should continue, while raising an exception terminates it immediately.

5.7 Package Relative Imports

Relative imports use leading dots. A single leading dot indicates a relative import, starting with the current package. Two or more leading dots indicate a relative import to the parent(s) of the current package, one level per dot after the first. For example, given the following package layout:

    package/
        __init__.py
        subpackage1/
            __init__.py
            moduleX.py
            moduleY.py
        subpackage2/
            __init__.py
            moduleZ.py
        moduleA.py

In either subpackage1/moduleX.py or subpackage1/__init__.py, the following are valid relative imports:

    from .moduleY import spam
    from .moduleY import spam as ham
    from . import moduleY
    from ..subpackage1 import moduleY
    from ..subpackage2.moduleZ import eggs
    from ..moduleA import foo

Absolute imports may use either the import <> or from <> import <> syntax, but relative imports may only use the second form; the reason for this is that import XXX.YYY.ZZZ should expose XXX.YYY.ZZZ as a usable expression, but .moduleY is not a valid expression.

5.8 Special considerations for __main__

The __main__ module is a special case relative to Python's import system. As noted elsewhere, the __main__ module is directly initialized at interpreter startup, much like sys and builtins. However, unlike those two, it doesn't strictly qualify as a built-in module. This is because the manner in which __main__ is initialized depends on the flags and other options with which the interpreter is invoked.

5.8.1 __main__.__spec__

Depending on how __main__ is initialized, __main__.__spec__ gets set appropriately or to None. When Python is started with the -m option, __spec__ is set to the module spec of the corresponding module or package. __spec__ is also populated when the __main__ module is loaded as part of executing a directory, zipfile or other sys.path entry. In the remaining cases __main__.__spec__ is set to None, as the code used to populate the __main__ module does not correspond directly with an importable module: interactive prompt, -c option, running from stdin, running directly from a source or bytecode file. Note that __main__.__spec__ is always None in the last case, even if the file could technically be imported directly as a module instead. Use the -m switch if valid module metadata is desired in __main__.

Note also that even when __main__ corresponds with an importable module and __main__.__spec__ is set accordingly, they're still considered distinct modules. This is due to the fact that blocks guarded by if __name__ == "__main__": checks only execute when the module is used to populate the __main__ namespace, and not during normal import.

5.9 References

The import machinery has evolved considerably since Python's early days. The original specification for packages is still available to read, although some details
have changed since the writing of that document The original specification for sys meta_path was PEP 302 with subsequent extension in PEP 420 PEP 420 introduced namespace packages for Python 3 3 PEP 420 also introduced the find_loader protocol as an alternative to find_module PEP 366 describes the addition of the __package__ attribute for explicit relative imports in main modules PEP 328 introduced absolute and explicit relative imports and initially proposed __name__ for semantics PEP 366 would eventually specify for __package__ PEP 338 defines executing modules as scripts PEP 451 adds the encapsulation of per module import state in spec objects It also off loads most of the boilerplate responsibilities of loaders back onto the import machinery These changes allow the deprecation of several APIs in the import system and also addition of new methods to finders and loaders Footnotes 1 See types ModuleType 2 The importlib implementation avoids using the return value directly Instead it gets the module object by looking the module name up in sys modules The indirect effect of this is that an imported module may replace itself in sys modules This is implementation specific behavior that is not guaranteed to work in other Python implementations
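Footnote 2 can be observed directly. The sketch below is hypothetical (the module name selfswap and its source string are invented for illustration): a module's own code replaces its sys.modules entry while executing, and a subsequent import statement returns the replacement, not the object the loader created.

```python
import sys
import types

# Hypothetical "source" of a module that swaps its own sys.modules entry.
source = (
    "import sys, types\n"
    "replacement = types.ModuleType('selfswap')\n"
    "replacement.marker = 'replaced'\n"
    "sys.modules['selfswap'] = replacement\n"
)

original = types.ModuleType('selfswap')
sys.modules['selfswap'] = original   # cached before execution, as during import
exec(source, original.__dict__)      # module code replaces the cache entry

import selfswap                      # import returns the sys.modules entry
print(selfswap.marker)               # 'replaced'
print(selfswap is original)          # False: the replacement was returned
```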
fnmatch Unix filename pattern matching Source code Lib fnmatch py This module provides support for Unix shell style wildcards which are not the same as regular expressions which are documented in the re module The special characters used in shell style wildcards are Pattern Meaning matches everything matches any single character seq matches any character in seq seq matches any character not in seq For a literal match wrap the meta characters in brackets For example matches the character Note that the filename separator on Unix is not special to this module See module glob for pathname expansion glob uses filter to match pathname segments Similarly filenames starting with a period are not special for this module and are matched by the and patterns Also note that functools lru_cache with the maxsize of 32768 is used to cache the compiled regex patterns in the following functions fnmatch fnmatchcase filter fnmatch fnmatch name pat Test whether the filename string name matches the pattern string pat returning True or False Both parameters are case normalized using os path normcase fnmatchcase can be used to perform a case sensitive comparison regardless of whether that s standard for the operating system This example will print all file names in the current directory with the extension txt import fnmatch import os for file in os listdir if fnmatch fnmatch file txt print file fnmatch fnmatchcase name pat Test whether the filename string name matches the pattern string pat returning True or False the comparison is case sensitive and does not apply os path normcase fnmatch filter names pat Construct a list from those elements of the iterable names that match pattern pat It is the same as n for n in names if fnmatch n pat but implemented more efficiently fnmatch translate pat Return the shell style pattern pat converted to a regular expression for using with re match Example import fnmatch re regex fnmatch translate txt regex s txt Z reobj re compile regex reobj match 
foobar txt re Match object span 0 10 match foobar txt See also Module glob Unix shell style path expansion
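The three helpers above can be compared side by side. This is a minimal sketch (the sample filenames are invented); note that fnmatch and filter case-normalize via os.path.normcase, so their results can differ between operating systems, while fnmatchcase and a compiled translate pattern behave the same everywhere:

```python
import fnmatch
import re

names = ["spam.txt", "Spam.TXT", "eggs.py"]  # hypothetical filenames

# filter(): like [n for n in names if fnmatch.fnmatch(n, pat)], but faster.
# On case-insensitive platforms this may also include "Spam.TXT".
print(fnmatch.filter(names, "*.txt"))

# fnmatchcase(): never applies os.path.normcase, so it is case sensitive
print([n for n in names if fnmatch.fnmatchcase(n, "*.txt")])  # ['spam.txt']

# translate(): inspect the regular expression behind a shell pattern
regex = re.compile(fnmatch.translate("*.txt"))
print(bool(regex.match("spam.txt")))  # True
```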
sqlite3 DB API 2 0 interface for SQLite databases Source code Lib sqlite3 SQLite is a C library that provides a lightweight disk based database that doesn t require a separate server process and allows accessing the database using a nonstandard variant of the SQL query language Some applications can use SQLite for internal data storage It s also possible to prototype an application using SQLite and then port the code to a larger database such as PostgreSQL or Oracle The sqlite3 module was written by Gerhard Häring It provides an SQL interface compliant with the DB API 2 0 specification described by PEP 249 and requires SQLite 3 7 15 or newer This document includes four main sections Tutorial teaches how to use the sqlite3 module Reference describes the classes and functions this module defines How to guides details how to handle specific tasks Explanation provides in depth background on transaction control See also https www sqlite org The SQLite web page the documentation describes the syntax and the available data types for the supported SQL dialect https www w3schools com sql Tutorial reference and examples for learning SQL syntax PEP 249 Database API Specification 2 0 PEP written by Marc André Lemburg Tutorial In this tutorial you will create a database of Monty Python movies using basic sqlite3 functionality It assumes a fundamental understanding of database concepts including cursors and transactions First we need to create a new database and open a database connection to allow sqlite3 to work with it Call sqlite3 connect to create a connection to the database tutorial db in the current working directory implicitly creating it if it does not exist import sqlite3 con sqlite3 connect tutorial db The returned Connection object con represents the connection to the on disk database In order to execute SQL statements and fetch results from SQL queries we will need to use a database cursor Call con cursor to create the Cursor cur con cursor Now that we ve got a 
database connection and a cursor we can create a database table movie with columns for title release year and review score For simplicity we can just use column names in the table declaration thanks to the flexible typing feature of SQLite specifying the data types is optional Execute the CREATE TABLE statement by calling cur execute cur execute CREATE TABLE movie title year score We can verify that the new table has been created by querying the sqlite_master table built in to SQLite which should now contain an entry for the movie table definition see The Schema Table for details Execute that query by calling cur execute assign the result to res and call res fetchone to fetch the resulting row res cur execute SELECT name FROM sqlite_master res fetchone movie We can see that the table has been created as the query returns a tuple containing the table s name If we query sqlite_master for a non existent table spam res fetchone will return None res cur execute SELECT name FROM sqlite_master WHERE name spam res fetchone is None True Now add two rows of data supplied as SQL literals by executing an INSERT statement once again by calling cur execute cur execute INSERT INTO movie VALUES Monty Python and the Holy Grail 1975 8 2 And Now for Something Completely Different 1971 7 5 The INSERT statement implicitly opens a transaction which needs to be committed before changes are saved in the database see Transaction control for details Call con commit on the connection object to commit the transaction con commit We can verify that the data was inserted correctly by executing a SELECT query Use the now familiar cur execute to assign the result to res and call res fetchall to return all resulting rows res cur execute SELECT score FROM movie res fetchall 8 2 7 5 The result is a list of two tuple s one per row each containing that row s score value Now insert three more rows by calling cur executemany data Monty Python Live at the Hollywood Bowl 1982 7 9 Monty Python s The Meaning 
of Life 1983 7 5 Monty Python s Life of Brian 1979 8 0 cur executemany INSERT INTO movie VALUES data
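The executemany call above binds each tuple of data through qmark-style ? placeholders. As a self-contained sketch (using a throwaway in-memory database rather than tutorial.db, with the same invented table layout), both the qmark and the named placeholder styles look like this:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE movie(title, year, score)")

# qmark style: one "?" per value, data supplied as a sequence
cur.execute("INSERT INTO movie VALUES(?, ?, ?)",
            ("Monty Python and the Holy Grail", 1975, 8.2))

# named style: ":name" markers, data supplied as a dict
cur.execute("INSERT INTO movie VALUES(:title, :year, :score)",
            {"title": "Life of Brian", "year": 1979, "score": 8.0})

con.commit()
print(cur.execute("SELECT COUNT(*) FROM movie").fetchone()[0])  # → 2
con.close()
```

Either style lets SQLite receive the values out of band, so no escaping or string formatting is ever needed.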
con commit Remember to commit the transaction after executing INSERT Notice that placeholders are used to bind data to the query Always use placeholders instead of string formatting to bind Python values to SQL statements to avoid SQL injection attacks see How to use placeholders to bind values in SQL queries for more details We can verify that the new rows were inserted by executing a SELECT query this time iterating over the results of the query for row in cur execute SELECT year title FROM movie ORDER BY year print row 1971 And Now for Something Completely Different 1975 Monty Python and the Holy Grail 1979 Monty Python s Life of Brian 1982 Monty Python Live at the Hollywood Bowl 1983 Monty Python s The Meaning of Life Each row is a two item tuple of year title matching the columns selected in the query Finally verify that the database has been written to disk by calling con close to close the existing connection opening a new one creating a new cursor then querying the database con close new_con sqlite3 connect tutorial db new_cur new_con cursor res new_cur execute SELECT title year FROM movie ORDER BY score DESC title year res fetchone print f The highest scoring Monty Python movie is title r released in year The highest scoring Monty Python movie is Monty Python and the Holy Grail released in 1975 You ve now created an SQLite database using the sqlite3 module inserted data and retrieved values from it in multiple ways See also How to guides for further reading How to use placeholders to bind values in SQL queries How to adapt custom Python types to SQLite values How to convert SQLite values to custom Python types How to use the connection context manager How to create and use row factories Explanation for in depth background on transaction control Reference Module functions sqlite3 connect database timeout 5 0 detect_types 0 isolation_level DEFERRED check_same_thread True factory sqlite3 Connection cached_statements 128 uri False autocommit sqlite3 
LEGACY_TRANSACTION_CONTROL Open a connection to an SQLite database Parameters database path like object The path to the database file to be opened You can pass memory to create an SQLite database existing only in memory and open a connection to it timeout float How many seconds the connection should wait before raising an OperationalError when a table is locked If another connection opens a transaction to modify a table that table will be locked until the transaction is committed Default five seconds detect_types int Control whether and how data types not natively supported by SQLite are looked up to be converted to Python types using the converters registered with register_converter Set it to any combination using bitwise or of PARSE_DECLTYPES and PARSE_COLNAMES to enable this Column names takes precedence over declared types if both flags are set Types cannot be detected for generated fields for example max data even when the detect_types parameter is set str will be returned instead By default 0 type detection is disabled isolation_level str None Control legacy transaction handling behaviour See Connection isolation_level and Transaction control via the isolation_level attribute for more information Can be DEFERRED default EXCLUSIVE or IMMEDIATE or None to disable opening transactions implicitly Has no effect unless Connection autocommit is set to LEGACY_TRANSACTION_CONTROL the default check_same_thread bool If True default ProgrammingError will be raised if the database connection is used by a thread other than the one that created it If False the connection may be accessed in multiple threads write operations may need to be serialized by the user to avoid data corruption See threadsafety for more information factory Connection A custom subclass of Connection to create the connection with if not the default Connection class cached_statements int The number of statements that sqlite3 should internally cache for this connection to avoid parsing overhead By 
default 128 statements uri bool If set to True database is interpreted as a URI Uniform Resource Identifier with a
file path and an optional query string The scheme part must be file and the path can be relative or absolute The query string allows passing parameters to SQLite enabling various How to work with SQLite URIs autocommit bool Control PEP 249 transaction handling behaviour See Connection autocommit and Transaction control via the autocommit attribute for more information autocommit currently defaults to LEGACY_TRANSACTION_CONTROL The default will change to False in a future Python release Return type Connection Raises an auditing event sqlite3 connect with argument database Raises an auditing event sqlite3 connect handle with argument connection_handle Changed in version 3 4 Added the uri parameter Changed in version 3 7 database can now also be a path like object not only a string Changed in version 3 10 Added the sqlite3 connect handle auditing event Changed in version 3 12 Added the autocommit parameter sqlite3 complete_statement statement Return True if the string statement appears to contain one or more complete SQL statements No syntactic verification or parsing of any kind is performed other than checking that there are no unclosed string literals and the statement is terminated by a semicolon For example sqlite3 complete_statement SELECT foo FROM bar True sqlite3 complete_statement SELECT foo False This function may be useful during command line input to determine if the entered text seems to form a complete SQL statement or if additional input is needed before calling execute See runsource in Lib sqlite3 __main__ py for real world use sqlite3 enable_callback_tracebacks flag Enable or disable callback tracebacks By default you will not get any tracebacks in user defined functions aggregates converters authorizer callbacks etc If you want to debug them you can call this function with flag set to True Afterwards you will get tracebacks from callbacks on sys stderr Use False to disable the feature again Register an unraisable hook handler for an improved debug 
experience sqlite3 enable_callback_tracebacks True con sqlite3 connect memory def evil_trace stmt 5 0 con set_trace_callback evil_trace def debug unraisable print f unraisable exc_value r in callback unraisable object __name__ print f Error message unraisable err_msg import sys sys unraisablehook debug cur con execute SELECT 1 ZeroDivisionError division by zero in callback evil_trace Error message None sqlite3 register_adapter type adapter Register an adapter callable to adapt the Python type type into an SQLite type The adapter is called with a Python object of type type as its sole argument and must return a value of a type that SQLite natively understands sqlite3 register_converter typename converter Register the converter callable to convert SQLite objects of type typename into a Python object of a specific type The converter is invoked for all SQLite values of type typename it is passed a bytes object and should return an object of the desired Python type Consult the parameter detect_types of connect for information regarding how type detection works Note typename and the name of the type in your query are matched case insensitively Module constants sqlite3 LEGACY_TRANSACTION_CONTROL Set autocommit to this constant to select old style pre Python 3 12 transaction control behaviour See Transaction control via the isolation_level attribute for more information sqlite3 PARSE_COLNAMES Pass this flag value to the detect_types parameter of connect to look up a converter function by using the type name parsed from the query column name as the converter dictionary key The type name must be wrapped in square brackets SELECT p as p point FROM test will look up converter point This flag may be combined with PARSE_DECLTYPES using the bitwise or operator sqlite3 PARSE_DECLTYPES Pass this flag value to the detect_types parameter of connect to look up a converter function using the declared types for each column The types are declared when the database table is created 
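The register_adapter and register_converter functions described above can be wired together through the detect_types parameter of connect. A minimal sketch (the table and column names are invented for illustration): the adapter turns a datetime.date into an ISO string on the way in, and the converter, matched case-insensitively against the declared column type "date", rebuilds the object on the way out:

```python
import datetime
import sqlite3

# Adapt datetime.date -> str (a type SQLite natively understands)
sqlite3.register_adapter(datetime.date, lambda d: d.isoformat())

# Convert SQLite values of declared type "date" (passed as bytes) back
sqlite3.register_converter(
    "date", lambda b: datetime.date.fromisoformat(b.decode()))

con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_DECLTYPES)
con.execute("CREATE TABLE events(when_ date)")  # declared type is "date"
con.execute("INSERT INTO events VALUES(?)", (datetime.date(1975, 4, 3),))

row = con.execute("SELECT when_ FROM events").fetchone()
print(type(row[0]), row[0])  # <class 'datetime.date'> 1975-04-03
con.close()
```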
sqlite3 will look up a converter function using the first word of the declared type as the converter dictionary key For example CREATE TABLE test i integer primary key will look up a converter named integer p point will look up a converter named point n number 10 will look up a converter named number This flag may be combined with PARSE_COLNAMES using the bitwise or operator sqlite3 SQLITE_OK sqlite3 SQLITE_DENY sqlite3 SQLITE_IGNORE Flags that should be returned by the authorizer_callback callable passed to Connection set_authorizer to indicate whether Access is allowed SQLITE_OK The SQL statement should be aborted with an error SQLITE_DENY The column should be treated as a NULL value SQLITE_IGNORE sqlite3 apilevel String constant stating the supported DB API level Required by the DB API Hard coded to 2 0 sqlite3 paramstyle String constant stating the type of parameter marker formatting expected by the sqlite3 module Required by the DB API Hard coded to qmark Note The named DB API parameter style is also supported sqlite3 sqlite_version Version number of the runtime SQLite library as a string sqlite3 sqlite_version_info Version number of the runtime SQLite library as a tuple of integers sqlite3 threadsafety Integer constant required by the DB API 2 0 stating the level of thread safety the sqlite3 module supports This attribute is set based on the default threading mode the underlying SQLite library is compiled with The SQLite threading modes are 1 Single thread In this mode all mutexes are disabled and SQLite is unsafe to use in more than a single thread at once 2 Multi thread In this mode SQLite can be safely used by multiple threads provided that no single database connection is used simultaneously in two or more threads 3 Serialized In serialized mode SQLite can be safely used by multiple threads with no restriction The mappings from SQLite threading modes to DB API 2 0 threadsafety levels are as follows SQLite threading threadsafety SQLITE_THREADSAFE DB API 2 0 meaning mode single thread 0 0 Threads may not share the module multi thread 1 2 Threads may share the
module but not connections serialized 3 1 Threads may share the module connections and cursors Changed in version 3 11 Set threadsafety dynamically instead of hard coding it to 1 sqlite3 version Version number of this module as a string This is not the version of the SQLite library Deprecated since version 3 12 will be removed in version 3 14 This constant used to reflect the version number of the pysqlite package a third party library which used to upstream changes to sqlite3 Today it carries no meaning or practical value sqlite3 version_info Version number of this module as a tuple of integers This is not the version of the SQLite library Deprecated since version 3 12 will be removed in version 3 14 This constant used to reflect the version number of the pysqlite package a third party library which used to upstream changes to sqlite3 Today it carries no meaning or practical value sqlite3 SQLITE_DBCONFIG_DEFENSIVE sqlite3 SQLITE_DBCONFIG_DQS_DDL sqlite3 SQLITE_DBCONFIG_DQS_DML sqlite3 SQLITE_DBCONFIG_ENABLE_FKEY sqlite3 SQLITE_DBCONFIG_ENABLE_FTS3_TOKENIZER sqlite3 SQLITE_DBCONFIG_ENABLE_LOAD_EXTENSION sqlite3 SQLITE_DBCONFIG_ENABLE_QPSG sqlite3 SQLITE_DBCONFIG_ENABLE_TRIGGER sqlite3 SQLITE_DBCONFIG_ENABLE_VIEW sqlite3 SQLITE_DBCONFIG_LEGACY_ALTER_TABLE sqlite3 SQLITE_DBCONFIG_LEGACY_FILE_FORMAT sqlite3 SQLITE_DBCONFIG_NO_CKPT_ON_CLOSE sqlite3 SQLITE_DBCONFIG_RESET_DATABASE sqlite3 SQLITE_DBCONFIG_TRIGGER_EQP sqlite3 SQLITE_DBCONFIG_TRUSTED_SCHEMA sqlite3 SQLITE_DBCONFIG_WRITABLE_SCHEMA These constants are used for the Connection setconfig and getconfig methods The availability of these constants varies depending on the version of SQLite Python was compiled with New in version 3 12 See also https www sqlite org c3ref c_dbconfig_defensive html SQLite docs Database Connection Configuration Options Connection objects class sqlite3 Connection Each open SQLite database is represented by a Connection object which is created using sqlite3 connect Their main purpose is 
creating Cursor objects and Transaction control See also How to use connection shortcut methods How
to use the connection context manager An SQLite database connection has the following attributes and methods cursor factory Cursor Create and return a Cursor object The cursor method accepts a single optional parameter factory If supplied this must be a callable returning an instance of Cursor or its subclasses blobopen table column row readonly False name main Open a Blob handle to an existing BLOB Binary Large OBject Parameters table str The name of the table where the blob is located column str The name of the column where the blob is located row int The row where the blob is located readonly bool Set to True if the blob should be opened without write permissions Defaults to False name str The name of the database where the blob is located Defaults to main Raises OperationalError When trying to open a blob in a WITHOUT ROWID table Return type Blob Note The blob size cannot be changed using the Blob class Use the SQL function zeroblob to create a blob with a fixed size New in version 3 11 commit Commit any pending transaction to the database If autocommit is True or there is no open transaction this method does nothing If autocommit is False a new transaction is implicitly opened if a pending transaction was committed by this method rollback Roll back to the start of any pending transaction If autocommit is True or there is no open transaction this method does nothing If autocommit is False a new transaction is implicitly opened if a pending transaction was rolled back by this method close Close the database connection If autocommit is False any pending transaction is implicitly rolled back If autocommit is True or LEGACY_TRANSACTION_CONTROL no implicit transaction control is executed Make sure to commit before closing to avoid losing pending changes execute sql parameters Create a new Cursor object and call execute on it with the given sql and parameters Return the new cursor object executemany sql parameters Create a new Cursor object and call
executemany on it with the given sql and parameters Return the new cursor object executescript sql_script Create a new Cursor object and call executescript on it with the given sql_script Return the new cursor object create_function name narg func deterministic False Create or remove a user defined SQL function Parameters name str The name of the SQL function narg int The number of arguments the SQL function can accept If 1 it may take any number of arguments func callback None A callable that is called when the SQL function is invoked The callable must return a type natively supported by SQLite Set to None to remove an existing SQL function deterministic bool If True the created SQL function is marked as deterministic which allows SQLite to perform additional optimizations Raises NotSupportedError If deterministic is used with SQLite versions older than 3 8 3 Changed in version 3 8 Added the deterministic parameter Example import hashlib def md5sum t return hashlib md5 t hexdigest con sqlite3 connect memory con create_function md5 1 md5sum for row in con execute SELECT md5 b foo print row acbd18db4cc2f85cedef654fccc4a4d8 create_aggregate name n_arg aggregate_class Create or remove a user defined SQL aggregate function Parameters name str The name of the SQL aggregate function n_arg int The number of arguments the SQL aggregate function can accept If 1 it may take any number of arguments aggregate_class class None A class must implement the following methods step Add a row to the aggregate finalize Return the final result of the aggregate as a type natively supported by SQLite The number of arguments that the step method must accept is controlled by n_arg Set to None to remove an existing SQL aggregate function Example class MySum def __init__ self self count 0 def step self value self count value def finalize self return self count con sqlite3 connect memory con create_aggregate mysum 1 MySum cur con execute CREATE TABLE test i cur execute INSERT INTO test i 
VALUES 1 cur execute INSERT INTO test i VALUES 2 cur execute SELECT mysum i FROM test print cur fetchone 0 c
on close create_window_function name num_params aggregate_class Create or remove a user defined aggregate window function Parameters name str The name of the SQL aggregate window function to create or remove num_params int The number of arguments the SQL aggregate window function can accept If 1 it may take any number of arguments aggregate_class class None A class that must implement the following methods step Add a row to the current window value Return the current value of the aggregate inverse Remove a row from the current window finalize Return the final result of the aggregate as a type natively supported by SQLite The number of arguments that the step and value methods must accept is controlled by num_params Set to None to remove an existing SQL aggregate window function Raises NotSupportedError If used with a version of SQLite older than 3 25 0 which does not support aggregate window functions New in version 3 11 Example Example taken from https www sqlite org windowfunctions html udfwinfunc class WindowSumInt def __init__ self self count 0 def step self value Add a row to the current window self count value def value self Return the current value of the aggregate return self count def inverse self value Remove a row from the current window self count value def finalize self Return the final value of the aggregate Any clean up actions should be placed here return self count con sqlite3 connect memory cur con execute CREATE TABLE test x y values a 4 b 5 c 3 d 8 e 1 cur executemany INSERT INTO test VALUES values con create_window_function sumint 1 WindowSumInt cur execute SELECT x sumint y OVER ORDER BY x ROWS BETWEEN 1 PRECEDING AND 1 FOLLOWING AS sum_y FROM test ORDER BY x print cur fetchall create_collation name callable Create a collation named name using the collating function callable callable is passed two string arguments and it should return an integer 1 if the first is ordered higher than the second 1 if the first is ordered lower than the second 0 
if they are ordered equal The following example shows a reverse sorting collation def collate_reverse string1 string2 if string1 string2 return 0 elif string1 string2 return 1 else return 1 con sqlite3 connect memory con create_collation reverse collate_reverse cur con execute CREATE TABLE test x cur executemany INSERT INTO test x VALUES a b cur execute SELECT x FROM test ORDER BY x COLLATE reverse for row in cur print row con close Remove a collation function by setting callable to None Changed in version 3 11 The collation name can contain any Unicode character Earlier only ASCII characters were allowed interrupt Call this method from a different thread to abort any queries that might be executing on the connection Aborted queries will raise an OperationalError set_authorizer authorizer_callback Register callable authorizer_callback to be invoked for each attempt to access a column of a table in the database The callback should return one of SQLITE_OK SQLITE_DENY or SQLITE_IGNORE to signal how access to the column should be handled by the underlying SQLite library The first argument to the callback signifies what kind of operation is to be authorized The second and third argument will be arguments or None depending on the first argument The 4th argument is the name of the database main temp etc if applicable The 5th argument is the name of the inner most trigger or view that is responsible for the access attempt or None if this access attempt is directly from input SQL code Please consult the SQLite documentation about the possible values for the first argument and the meaning of the second and third argument depending on the first one All necessary constants are available in the sqlite3 module Passing None as authorizer_callback will disable the authorizer Changed in version 3 11 Added support for disabling the authorizer using None set_progress_handler progress_handler n Register callable progress_handler to be invoked for every n instructions of the SQLite 
virtual machine This is useful if you want to get called from SQLite during long running operations f
or example to update a GUI If you want to clear any previously installed progress handler call the method with None for progress_handler Returning a non zero value from the handler function will terminate the currently executing query and cause it to raise a DatabaseError exception set_trace_callback trace_callback Register callable trace_callback to be invoked for each SQL statement that is actually executed by the SQLite backend The only argument passed to the callback is the statement as str that is being executed The return value of the callback is ignored Note that the backend does not only run statements passed to the Cursor execute methods Other sources include the transaction management of the sqlite3 module and the execution of triggers defined in the current database Passing None as trace_callback will disable the trace callback Note Exceptions raised in the trace callback are not propagated As a development and debugging aid use enable_callback_tracebacks to enable printing tracebacks from exceptions raised in the trace callback New in version 3 3 enable_load_extension enabled Enable the SQLite engine to load SQLite extensions from shared libraries if enabled is True else disallow loading SQLite extensions SQLite extensions can define new functions aggregates or whole new virtual table implementations One well known extension is the fulltext search extension distributed with SQLite Note The sqlite3 module is not built with loadable extension support by default because some platforms notably macOS have SQLite libraries which are compiled without this feature To get loadable extension support you must pass the enable loadable sqlite extensions option to configure Raises an auditing event sqlite3 enable_load_extension with arguments connection enabled New in version 3 2 Changed in version 3 10 Added the sqlite3 enable_load_extension auditing event con enable_load_extension True Load the fulltext search extension con execute select load_extension fts3 so 
alternatively you can load the extension using an API call con load_extension fts3 so disable extension loading again con enable_load_extension False example from SQLite wiki con execute CREATE VIRTUAL TABLE recipe USING fts3 name ingredients con executescript INSERT INTO recipe name ingredients VALUES broccoli stew broccoli peppers cheese tomatoes INSERT INTO recipe name ingredients VALUES pumpkin stew pumpkin onions garlic celery INSERT INTO recipe name ingredients VALUES broccoli pie broccoli cheese onions flour INSERT INTO recipe name ingredients VALUES pumpkin pie pumpkin sugar flour butter for row in con execute SELECT rowid name ingredients FROM recipe WHERE name MATCH pie print row con close load_extension path entrypoint None Load an SQLite extension from a shared library Enable extension loading with enable_load_extension before calling this method Parameters path str The path to the SQLite extension entrypoint str None Entry point name If None the default SQLite will come up with an entry point name of its own see the SQLite docs Loading an Extension for details Raises an auditing event sqlite3 load_extension with arguments connection path New in version 3 2 Changed in version 3 10 Added the sqlite3 load_extension auditing event Changed in version 3 12 Added the entrypoint parameter iterdump Return an iterator to dump the database as SQL source code Useful when saving an in memory database for later restoration Similar to the dump command in the sqlite3 shell Example Convert file example db to SQL dump file dump sql con sqlite3 connect example db with open dump sql w as f for line in con iterdump f write s n line con close See also How to handle non UTF 8 text encodings backup target pages 1 progress None name main sleep 0 250 Create a backup of an SQLite database Works even if the database is being accessed by other clients or concurrently by the same connection Parameters target Connection The database connection to save the backup to pages int The 
number of pages to copy at a time If equal to or less than 0 the entire database is copied in a single s
tep Defaults to 1 progress callback None If set to a callable it is invoked with three integer arguments for every backup iteration the status of the last iteration the remaining number of pages still to be copied and the total number of pages Defaults to None name str The name of the database to back up Either main the default for the main database temp for the temporary database or the name of a custom database as attached using the ATTACH DATABASE SQL statement sleep float The number of seconds to sleep between successive attempts to back up remaining pages Example 1 copy an existing database into another def progress status remaining total print f Copied total remaining of total pages src sqlite3 connect example db dst sqlite3 connect backup db with dst src backup dst pages 1 progress progress dst close src close Example 2 copy an existing database into a transient copy src sqlite3 connect example db dst sqlite3 connect memory src backup dst New in version 3 7 See also How to handle non UTF 8 text encodings getlimit category Get a connection runtime limit Parameters category int The SQLite limit category to be queried Return type int Raises ProgrammingError If category is not recognised by the underlying SQLite library Example query the maximum length of an SQL statement for Connection con the default is 1000000000 con getlimit sqlite3 SQLITE_LIMIT_SQL_LENGTH 1000000000 New in version 3 11 setlimit category limit Set a connection runtime limit Attempts to increase a limit above its hard upper bound are silently truncated to the hard upper bound Regardless of whether or not the limit was changed the prior value of the limit is returned Parameters category int The SQLite limit category to be set limit int The value of the new limit If negative the current limit is unchanged Return type int Raises ProgrammingError If category is not recognised by the underlying SQLite library Example limit the number of attached databases to 1 for Connection con the default limit 
is 10 con setlimit sqlite3 SQLITE_LIMIT_ATTACHED 1 10 con getlimit sqlite3 SQLITE_LIMIT_ATTACHED 1 New in version 3 11 getconfig op Query a boolean connection configuration option Parameters op int A SQLITE_DBCONFIG code Return type bool New in version 3 12 setconfig op enable True Set a boolean connection configuration option Parameters op int A SQLITE_DBCONFIG code enable bool True if the configuration option should be enabled default False if it should be disabled New in version 3 12 serialize name main Serialize a database into a bytes object For an ordinary on disk database file the serialization is just a copy of the disk file For an in memory database or a temp database the serialization is the same sequence of bytes which would be written to disk if that database were backed up to disk Parameters name str The database name to be serialized Defaults to main Return type bytes Note This method is only available if the underlying SQLite library has the serialize API New in version 3 11 deserialize data name main Deserialize a serialized database into a Connection This method causes the database connection to disconnect from database name and reopen name as an in memory database based on the serialization contained in data Parameters data bytes A serialized database name str The database name to deserialize into Defaults to main Raises OperationalError If the database connection is currently involved in a read transaction or a backup operation DatabaseError If data does not contain a valid SQLite database OverflowError If len data is larger than 2 63 1 Note This method is only available if the underlying SQLite library has the deserialize API New in version 3 11 autocommit This attribute controls PEP 249 compliant transaction behaviour autocommit has three allowed values False Select PEP 249 compliant transaction behaviour implying that sqlite3 ensures a transaction is always open Use commit and rollback to close transactions This is the recommended value of 
autocommit True Use SQLite s autocommit mode commit and rollback have no effect in this mode LEGACY_TRANSACTION_CONTROL Pre Python 3 12 non PEP 249 compliant transaction control See isolation_level for more details This is currently the default value of autocommit Changing autocommit to False will open a new transaction and changing it to True will commit any pending transaction See Transaction control via the autocommit attribute for more details Note The isolation_level attribute has no effect unless autocommit is LEGACY_TRANSACTION_CONTROL New in version 3 12 in_transaction This read only attribute corresponds to the low level SQLite autocommit mode True if a transaction is active there are uncommitted changes False otherwise New in version 3 2 isolation_level Controls the legacy transaction handling mode of sqlite3 If set to None transactions are never implicitly opened If set to one of DEFERRED IMMEDIATE or EXCLUSIVE corresponding to the underlying SQLite transaction behaviour implicit transaction management is performed If not overridden by the isolation_level parameter of connect the default is the empty string which is an alias for DEFERRED Note Using autocommit to control transaction handling is recommended over using isolation_level isolation_level has no effect unless autocommit is set to LEGACY_TRANSACTION_CONTROL the default row_factory The initial row_factory for Cursor objects created from this connection Assigning to this attribute does not affect the row_factory of existing cursors belonging to this connection only new ones Is None by default meaning each row is returned as a tuple See How to create and use row factories for more details text_factory A callable that accepts a bytes parameter and returns a text representation of it The callable is invoked for SQLite values with the TEXT data type By default this attribute is set to str See How to handle non UTF 8 text encodings for more details total_changes Return the total number of database rows that have been modified inserted or deleted since the database connection was opened Cursor objects A Cursor object
represents a database cursor which is used to execute SQL statements and manage the context of a fetch operation Cursors are created using Connection cursor or by using any of the connection shortcut methods Cursor objects are iterators meaning that if you execute a SELECT query you can simply iterate over the cursor to fetch the resulting rows for row in cur execute SELECT t FROM data print row class sqlite3 Cursor A Cursor instance has the following attributes and methods execute sql parameters Execute a single SQL statement optionally binding Python values using placeholders Parameters sql str A single SQL statement parameters dict sequence Python values to bind to placeholders in sql A dict if named placeholders are used A sequence if unnamed placeholders are used See How to use placeholders to bind values in SQL queries Raises ProgrammingError If sql contains more than one SQL statement If autocommit is LEGACY_TRANSACTION_CONTROL isolation_level is not None sql is an INSERT UPDATE DELETE or REPLACE statement and there is no open transaction a transaction is implicitly opened before executing sql Deprecated since version 3 12 will be removed in version 3 14 DeprecationWarning is emitted if named placeholders are used and parameters is a sequence instead of a dict Starting with Python 3 14 ProgrammingError will be raised instead Use executescript to execute multiple SQL statements executemany sql parameters For every item in parameters repeatedly execute the parameterized DML Data Manipulation Language SQL statement sql Uses the same implicit transaction handling as execute Parameters sql str A single SQL DML statement parameters iterable An iterable of parameters to bind with the placeholders in sql See How to use placeholders to bind values in SQL queries Raises ProgrammingError If sql contains more than one SQL statement or is not a DML statement Example rows row1 row2 cur is an sqlite3 Cursor object cur executemany INSERT INTO data VALUES rows Note Any 
resulting rows are discarded including DML statements with RETURNING clauses Deprecated since version 3 12 will be removed in version 3 14 DeprecationWarning is emitted if named placeholders are used and the items in parameters are sequences instead of dict s Starting with Python 3 14 ProgrammingError will be raised instead executescript sql_script Execute the SQL statements in sql_script If autocommit is LEGACY_TRANSACTION_CONTROL and there is a pending transaction an implicit COMMIT statement is executed first No other implicit transaction control is performed any transaction control must be added to sql_script sql_script must be a string Example cur is an sqlite3 Cursor object cur executescript BEGIN CREATE TABLE person firstname lastname age CREATE TABLE book title author published CREATE TABLE publisher name address COMMIT fetchone If row_factory is None return the next row of a query result set as a tuple Else pass it to the row factory and return its result Return None if no more data is available fetchmany size cursor arraysize Return the next set of rows of a query result as a list Return an empty list if no more rows are available The number of rows to fetch per call is specified by the size parameter If size is not given arraysize determines the number of rows to be fetched If fewer than size rows are available as many rows as are available are returned Note there are performance considerations involved with the size parameter For optimal performance it is usually best to use the arraysize attribute If the size parameter is used then it is best for it to retain the same value from one fetchmany call to the next fetchall Return all remaining rows of a query result as a list Return an empty list if no rows are available Note that the arraysize attribute can affect the performance of this operation close Close the cursor now rather than whenever __del__ is called The cursor will be unusable from this point forward a ProgrammingError exception will be raised if any operation is attempted with the cursor setinputsizes sizes Required by the DB API Does nothing in
sqlite3 setoutputsize size column None Required by the DB API Does nothing in sqlite3 arraysize Read write attribute that controls the number of rows returned by fetchmany The default value is 1 which means a single row would be fetched per call connection Read only attribute that provides the SQLite database Connection belonging to the cursor A Cursor object created by calling con cursor will have a connection attribute that refers to con con sqlite3 connect memory cur con cursor cur connection con True description Read only attribute that provides the column names of the last query To remain compatible with the Python DB API it returns a 7 tuple for each column where the last six items of each tuple are None It is set for SELECT statements without any matching rows as well lastrowid Read only attribute that provides the row id of the last inserted row It is only updated after successful INSERT or REPLACE statements using the execute method For other statements after executemany or executescript or if the insertion failed the value of lastrowid is left unchanged The initial value of lastrowid is None Note Inserts into WITHOUT ROWID tables are not recorded Changed in version 3 6 Added support for the REPLACE statement rowcount Read only attribute that provides the number of modified rows for INSERT UPDATE DELETE and REPLACE statements is -1 for other statements including CTE Common Table Expression queries It is only updated by the execute and executemany methods after the statement has run to completion This means that any resulting rows must be fetched in order for rowcount to be updated row_factory Control how a row fetched from this Cursor is represented If None a row is represented as a tuple Can be set to the included sqlite3 Row or a callable that accepts two arguments a Cursor object and the tuple of row values and returns a custom object representing an SQLite row Defaults to what Connection row_factory was set to when the Cursor was created Assigning to
this attribute does not affect Connection row_factory of the parent connection See How to create and
use row factories for more details Row objects class sqlite3 Row A Row instance serves as a highly optimized row_factory for Connection objects It supports iteration equality testing len and mapping access by column name and index Two Row objects compare equal if they have identical column names and values See How to create and use row factories for more details keys Return a list of column names as strings Immediately after a query it is the first member of each tuple in Cursor description Changed in version 3 5 Added support of slicing Blob objects class sqlite3 Blob New in version 3 11 A Blob instance is a file like object that can read and write data in an SQLite BLOB Binary Large OBject Call len blob to get the size number of bytes of the blob Use indices and slices for direct access to the blob data Use the Blob as a context manager to ensure that the blob handle is closed after use con sqlite3 connect memory con execute CREATE TABLE test blob_col blob con execute INSERT INTO test blob_col VALUES zeroblob 13 Write to our blob using two write operations with con blobopen test blob_col 1 as blob blob write b hello blob write b world Modify the first and last bytes of our blob blob 0 ord H blob -1 ord Read the contents of our blob with con blobopen test blob_col 1 as blob greeting blob read print greeting outputs b Hello world close Close the blob The blob will be unusable from this point onward An Error or subclass exception will be raised if any further operation is attempted with the blob read length -1 Read length bytes of data from the blob at the current offset position If the end of the blob is reached the data up to EOF End of File will be returned When length is not specified or is negative read will read until the end of the blob write data Write data to the blob at the current offset This function cannot change the blob length Writing beyond the end of the blob will raise ValueError tell Return the current access position of the blob seek offset origin
os SEEK_SET Set the current access position of the blob to offset The origin argument defaults to os SEEK_SET absolute blob positioning Other values for origin are os SEEK_CUR seek relative to the current position and os SEEK_END seek relative to the blob s end PrepareProtocol objects class sqlite3 PrepareProtocol The PrepareProtocol type s single purpose is to act as a PEP 246 style adaption protocol for objects that can adapt themselves to native SQLite types Exceptions The exception hierarchy is defined by the DB API 2 0 PEP 249 exception sqlite3 Warning This exception is not currently raised by the sqlite3 module but may be raised by applications using sqlite3 for example if a user defined function truncates data while inserting Warning is a subclass of Exception exception sqlite3 Error The base class of the other exceptions in this module Use this to catch all errors with one single except statement Error is a subclass of Exception If the exception originated from within the SQLite library the following two attributes are added to the exception sqlite_errorcode The numeric error code from the SQLite API New in version 3 11 sqlite_errorname The symbolic name of the numeric error code from the SQLite API New in version 3 11 exception sqlite3 InterfaceError Exception raised for misuse of the low level SQLite C API In other words if this exception is raised it probably indicates a bug in the sqlite3 module InterfaceError is a subclass of Error exception sqlite3 DatabaseError Exception raised for errors that are related to the database This serves as the base exception for several types of database errors It is only raised implicitly through the specialised subclasses DatabaseError is a subclass of Error exception sqlite3 DataError Exception raised for errors caused by problems with the processed data like numeric values out of range and strings which are too long DataError is a subclass of DatabaseError exception sqlite3 OperationalError Exception raised for 
errors that are related to the database s operation and not necessarily under the control of the programmer For example the database path is not found or a transaction could not be processed OperationalError is a subclass of DatabaseError exception sqlite3 IntegrityError Exception raised when the relational integrity of the database is affected e g a foreign key check fails It is a subclass of DatabaseError exception sqlite3 InternalError Exception raised when SQLite encounters an internal error If this is raised it may indicate that there is a problem with the runtime SQLite library InternalError is a subclass of DatabaseError exception sqlite3 ProgrammingError Exception raised for sqlite3 API programming errors for example supplying the wrong number of bindings to a query or trying to operate on a closed Connection ProgrammingError is a subclass of DatabaseError exception sqlite3 NotSupportedError Exception raised in case a method or database API is not supported by the underlying SQLite library For example setting deterministic to True in create_function if the underlying SQLite library does not support deterministic functions NotSupportedError is a subclass of DatabaseError SQLite and Python types SQLite natively supports the following types NULL INTEGER REAL TEXT BLOB The following Python types can thus be sent to SQLite without any problem Python type SQLite type None NULL int INTEGER float REAL str TEXT bytes BLOB This is how SQLite types are converted to Python types by default SQLite type Python type NULL None INTEGER int REAL float TEXT depends on text_factory str by default BLOB bytes The type system of the sqlite3 module is extensible in two ways you can store additional Python types in an SQLite database via object adapters and you can let the sqlite3 module convert SQLite types to Python types via converters Default adapters and converters deprecated Note The default adapters and converters are deprecated as of Python 3 12 Instead use the Adapter and converter recipes and tailor them to your needs The deprecated default adapters and converters consist
of An adapter for datetime date objects to strings in ISO 8601 format An adapter for datetime datetime objects to strings in ISO 8601 format A converter for declared date types to datetime date objects A converter for declared timestamp types to datetime datetime objects Fractional parts will be truncated to 6 digits microsecond precision Note The default timestamp converter ignores UTC offsets in the database and always returns a naive datetime datetime object To preserve UTC offsets in timestamps either leave converters disabled or register an offset aware converter with register_converter Deprecated since version 3 12 Command line interface The sqlite3 module can be invoked as a script using the interpreter s m switch in order to provide a simple SQLite shell The argument signature is as follows python m sqlite3 h v filename sql Type quit or CTRL D to exit the shell h help Print CLI help v version Print underlying SQLite library version New in version 3 12 How to guides How to use placeholders to bind values in SQL queries SQL operations usually need to use values from Python variables However beware of using Python s string operations to assemble queries as they are vulnerable to SQL injection attacks For example an attacker can simply close the single quote and inject OR TRUE to select all rows Never do this insecure symbol input OR TRUE sql SELECT FROM stocks WHERE symbol s symbol print sql SELECT FROM stocks WHERE symbol OR TRUE cur execute sql Instead use the DB API s parameter substitution To insert a variable into a query string use a placeholder in the string and substitute the actual values into the query by providing them as a tuple of values to the second argument of the cursor s execute method An SQL statement may use one of two kinds of placeholders question marks qmark style or named placeholders named style For the qmark style parameters must be a sequence whose length must match the number of placeholders or a ProgrammingError is raised For the 
named style parameters must be an instance of a dict or a subclass which must contain keys for all
named parameters any extra items are ignored Here s an example of both styles con sqlite3 connect memory cur con execute CREATE TABLE lang name first_appeared This is the named style used with executemany data name C year 1972 name Fortran year 1957 name Python year 1991 name Go year 2009 cur executemany INSERT INTO lang VALUES name year data This is the qmark style used in a SELECT query params 1972 cur execute SELECT FROM lang WHERE first_appeared params print cur fetchall Note PEP 249 numeric placeholders are not supported If used they will be interpreted as named placeholders How to adapt custom Python types to SQLite values SQLite supports only a limited set of data types natively To store custom Python types in SQLite databases adapt them to one of the Python types SQLite natively understands There are two ways to adapt Python objects to SQLite types letting your object adapt itself or using an adapter callable The latter will take precedence above the former For a library that exports a custom type it may make sense to enable that type to adapt itself As an application developer it may make more sense to take direct control by registering custom adapter functions How to write adaptable objects Suppose we have a Point class that represents a pair of coordinates x and y in a Cartesian coordinate system The coordinate pair will be stored as a text string in the database using a semicolon to separate the coordinates This can be implemented by adding a __conform__ self protocol method which returns the adapted value The object passed to protocol will be of type PrepareProtocol class Point def __init__ self x y self x self y x y def __conform__ self protocol if protocol is sqlite3 PrepareProtocol return f self x self y con sqlite3 connect memory cur con cursor cur execute SELECT Point 4 0 3 2 print cur fetchone 0 How to register adapter callables The other possibility is to create a function that converts the Python object to an SQLite compatible type This 
function can then be registered using register_adapter class Point def __init__ self x y self x self y x y def adapt_point point return f point x point y sqlite3 register_adapter Point adapt_point con sqlite3 connect memory cur con cursor cur execute SELECT Point 1 0 2 5 print cur fetchone 0 How to convert SQLite values to custom Python types Writing an adapter lets you convert from custom Python types to SQLite values To be able to convert from SQLite values to custom Python types we use converters Let s go back to the Point class We stored the x and y coordinates separated via semicolons as strings in SQLite First we ll define a converter function that accepts the string as a parameter and constructs a Point object from it Note Converter functions are always passed a bytes object no matter the underlying SQLite data type def convert_point s x y map float s split b return Point x y We now need to tell sqlite3 when it should convert a given SQLite value This is done when connecting to a database using the detect_types parameter of connect There are three options Implicit set detect_types to PARSE_DECLTYPES Explicit set detect_types to PARSE_COLNAMES Both set detect_types to sqlite3 PARSE_DECLTYPES sqlite3 PARSE_COLNAMES Column names take precedence over declared types The following example illustrates the implicit and explicit approaches class Point def __init__ self x y self x self y x y def __repr__ self return f Point self x self y def adapt_point point return f point x point y def convert_point s x y list map float s split b return Point x y Register the adapter and converter sqlite3 register_adapter Point adapt_point sqlite3 register_converter point convert_point 1 Parse using declared types p Point 4 0 3 2 con sqlite3 connect memory detect_types sqlite3 PARSE_DECLTYPES cur con execute CREATE TABLE test p point cur execute INSERT INTO test p VALUES p cur execute SELECT p FROM test print with declared types cur fetchone 0 cur close con close 2 Parse using 
column names con sqlite3 connect memory detect_types sqlite3 PARSE_COLNAMES cur con execute CREATE TABLE test
p cur execute INSERT INTO test p VALUES p cur execute SELECT p AS p point FROM test print with column names cur fetchone 0 Adapter and converter recipes This section shows recipes for common adapters and converters import datetime import sqlite3 def adapt_date_iso val Adapt datetime date to ISO 8601 date return val isoformat def adapt_datetime_iso val Adapt datetime datetime to timezone naive ISO 8601 date return val isoformat def adapt_datetime_epoch val Adapt datetime datetime to Unix timestamp return int val timestamp sqlite3 register_adapter datetime date adapt_date_iso sqlite3 register_adapter datetime datetime adapt_datetime_iso sqlite3 register_adapter datetime datetime adapt_datetime_epoch def convert_date val Convert ISO 8601 date to datetime date object return datetime date fromisoformat val decode def convert_datetime val Convert ISO 8601 datetime to datetime datetime object return datetime datetime fromisoformat val decode def convert_timestamp val Convert Unix epoch timestamp to datetime datetime object return datetime datetime fromtimestamp int val sqlite3 register_converter date convert_date sqlite3 register_converter datetime convert_datetime sqlite3 register_converter timestamp convert_timestamp How to use connection shortcut methods Using the execute executemany and executescript methods of the Connection class your code can be written more concisely because you don t have to create the often superfluous Cursor objects explicitly Instead the Cursor objects are created implicitly and these shortcut methods return the cursor objects This way you can execute a SELECT statement and iterate over it directly using only a single call on the Connection object Create and fill the table con sqlite3 connect memory con execute CREATE TABLE lang name first_appeared data C 1985 Objective C 1984 con executemany INSERT INTO lang name first_appeared VALUES data Print the table contents for row in con execute SELECT name first_appeared FROM lang print row print I 
just deleted con execute DELETE FROM lang rowcount rows close is not a shortcut method and it s not called automatically the connection object should be closed manually con close How to use the connection context manager A Connection object can be used as a context manager that automatically commits or rolls back open transactions when leaving the body of the context manager If the body of the with statement finishes without exceptions the transaction is committed If this commit fails or if the body of the with statement raises an uncaught exception the transaction is rolled back If autocommit is False a new transaction is implicitly opened after committing or rolling back If there is no open transaction upon leaving the body of the with statement or if autocommit is True the context manager does nothing Note The context manager neither implicitly opens a new transaction nor closes the connection If you need a closing context manager consider using contextlib closing con sqlite3 connect memory con execute CREATE TABLE lang id INTEGER PRIMARY KEY name VARCHAR UNIQUE Successful con commit is called automatically afterwards with con con execute INSERT INTO lang name VALUES Python con rollback is called after the with block finishes with an exception the exception is still raised and must be caught try with con con execute INSERT INTO lang name VALUES Python except sqlite3 IntegrityError print couldn t add Python twice Connection object used as context manager only commits or rollbacks transactions so the connection object should be closed manually con close How to work with SQLite URIs Some useful URI tricks include Open a database in read only mode con sqlite3 connect file tutorial db mode ro uri True con execute CREATE TABLE readonly data Traceback most recent call last OperationalError attempt to write a readonly database Do not implicitly create a new database file if it does not already exist will raise OperationalError if unable to create a new file con sqlite3 
connect file nosuchdb db mode rw uri True Traceback most recent call last OperationalError unable
to open database file Create a shared named in memory database db file mem1 mode memory cache shared con1 sqlite3 connect db uri True con2 sqlite3 connect db uri True with con1 con1 execute CREATE TABLE shared data con1 execute INSERT INTO shared VALUES 28 res con2 execute SELECT data FROM shared assert res fetchone 28 More information about this feature including a list of parameters can be found in the SQLite URI documentation How to create and use row factories By default sqlite3 represents each row as a tuple If a tuple does not suit your needs you can use the sqlite3 Row class or a custom row_factory While row_factory exists as an attribute both on the Cursor and the Connection it is recommended to set Connection row_factory so all cursors created from the connection will use the same row factory Row provides indexed and case insensitive named access to columns with minimal memory overhead and performance impact over a tuple To use Row as a row factory assign it to the row_factory attribute con sqlite3 connect memory con row_factory sqlite3 Row Queries now return Row objects res con execute SELECT Earth AS name 6378 AS radius row res fetchone row keys name radius row 0 Access by index Earth row name Access by name Earth row RADIUS Column names are case insensitive 6378 Note The FROM clause can be omitted in the SELECT statement as in the above example In such cases SQLite returns a single row with columns defined by expressions e g literals with the given aliases expr AS alias You can create a custom row_factory that returns each row as a dict with column names mapped to values def dict_factory cursor row fields column 0 for column in cursor description return key value for key value in zip fields row Using it queries now return a dict instead of a tuple con sqlite3 connect memory con row_factory dict_factory for row in con execute SELECT 1 AS a 2 AS b print row a 1 b 2 The following row factory returns a named tuple from collections import namedtuple def 
namedtuple_factory cursor row fields column 0 for column in cursor description cls namedtuple Row fields return cls _make row namedtuple_factory can be used as follows con sqlite3 connect memory con row_factory namedtuple_factory cur con execute SELECT 1 AS a 2 AS b row cur fetchone row Row a 1 b 2 row 0 Indexed access 1 row b Attribute access 2 With some adjustments the above recipe can be adapted to use a dataclass or any other custom class instead of a namedtuple How to handle non UTF 8 text encodings By default sqlite3 uses str to adapt SQLite values with the TEXT data type This works well for UTF 8 encoded text but it might fail for other encodings and invalid UTF 8 You can use a custom text_factory to handle such cases Because of SQLite s flexible typing it is not uncommon to encounter table columns with the TEXT data type containing non UTF 8 encodings or even arbitrary data To demonstrate let s assume we have a database with ISO 8859 2 Latin 2 encoded text for example a table of Czech English dictionary entries Assuming we now have a Connection instance con connected to this database we can decode the Latin 2 encoded text using this text_factory con text_factory lambda data str data encoding latin2 For invalid UTF 8 or arbitrary data in stored in TEXT table columns you can use the following technique borrowed from the Unicode HOWTO con text_factory lambda data str data errors surrogateescape Note The sqlite3 module API does not support strings containing surrogates See also Unicode HOWTO Explanation Transaction control sqlite3 offers multiple methods of controlling whether when and how database transactions are opened and closed Transaction control via the autocommit attribute is recommended while Transaction control via the isolation_level attribute retains the pre Python 3 12 behaviour Transaction control via the autocommit attribute The recommended way of controlling transaction behaviour is through the Connection autocommit attribute which should 
preferably be set using the autocommit parameter of connect It is suggested to set autocommit to False which
implies PEP 249 compliant transaction control This means sqlite3 ensures that a transaction is always open so connect Connection commit and Connection rollback will implicitly open a new transaction immediately after closing the pending one for the latter two sqlite3 uses BEGIN DEFERRED statements when opening transactions Transactions should be committed explicitly using commit Transactions should be rolled back explicitly using rollback An implicit rollback is performed if the database is close ed with pending changes Set autocommit to True to enable SQLite s autocommit mode In this mode Connection commit and Connection rollback have no effect Note that SQLite s autocommit mode is distinct from the PEP 249 compliant Connection autocommit attribute use Connection in_transaction to query the low level SQLite autocommit mode Set autocommit to LEGACY_TRANSACTION_CONTROL to leave transaction control behaviour to the Connection isolation_level attribute See Transaction control via the isolation_level attribute for more information Transaction control via the isolation_level attribute Note The recommended way of controlling transactions is via the autocommit attribute See Transaction control via the autocommit attribute If Connection autocommit is set to LEGACY_TRANSACTION_CONTROL the default transaction behaviour is controlled using the Connection isolation_level attribute Otherwise isolation_level has no effect If the connection attribute isolation_level is not None new transactions are implicitly opened before execute and executemany executes INSERT UPDATE DELETE or REPLACE statements for other statements no implicit transaction handling is performed Use the commit and rollback methods to respectively commit and roll back pending transactions You can choose the underlying SQLite transaction behaviour that is whether and what type of BEGIN statements sqlite3 implicitly executes via the isolation_level attribute If isolation_level is set to None no transactions are 
implicitly opened at all This leaves the underlying SQLite library in autocommit mode but also allows the user to perform their own transaction handling using explicit SQL statements The underlying SQLite library autocommit mode can be queried using the in_transaction attribute The executescript method implicitly commits any pending transaction before execution of the given SQL script regardless of the value of isolation_level Changed in version 3 6 sqlite3 used to implicitly commit an open transaction before DDL statements This is no longer the case Changed in version 3 12 The recommended way of controlling transactions is now via the autocommit attribute
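As a minimal sketch of the explicit style described above, with isolation_level set to None so that sqlite3 opens no transactions implicitly: a throwaway in-memory database is used here, and the point table is illustrative only.

```python
import sqlite3

# isolation_level=None leaves the underlying SQLite library in
# autocommit mode; transactions are opened and closed with explicit
# SQL statements passed to execute().
con = sqlite3.connect(":memory:", isolation_level=None)
con.execute("CREATE TABLE point(x, y)")

con.execute("BEGIN")
con.execute("INSERT INTO point VALUES (1, 2)")
con.execute("ROLLBACK")   # discard the insert

con.execute("BEGIN")
con.execute("INSERT INTO point VALUES (3, 4)")
con.execute("COMMIT")

print(con.execute("SELECT x, y FROM point").fetchall())  # [(3, 4)]
con.close()
```

Because the BEGIN is written out explicitly, a different underlying transaction behaviour can be selected per transaction (for example BEGIN IMMEDIATE) without touching the isolation_level attribute.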
4 Using Python on Windows

This document aims to give an overview of Windows-specific behaviour you should know about when using Python on Microsoft Windows.

Unlike most Unix systems and services, Windows does not include a system supported installation of Python. To make Python available, the CPython team has compiled Windows installers with every release for many years. These installers are primarily intended to add a per-user installation of Python, with the core interpreter and library being used by a single user. The installer is also able to install for all users of a single machine, and a separate ZIP file is available for application-local distributions.

As specified in PEP 11, a Python release only supports a Windows platform while Microsoft considers the platform under extended support. This means that Python 3.12 supports Windows 8.1 and newer. If you require Windows 7 support, please install Python 3.8.

There are a number of different installers available for Windows, each with certain benefits and downsides.

The full installer contains all components and is the best option for developers using Python for any kind of project.

The Microsoft Store package is a simple installation of Python that is suitable for running scripts and packages, and using IDLE or other development environments. It requires Windows 10 and above, but can be safely installed without corrupting other programs. It also provides many convenient commands for launching Python and its tools.

The nuget.org packages are lightweight installations intended for continuous integration systems. It can be used to build Python packages or run scripts, but is not updateable and has no user interface tools.

The embeddable package is a minimal package of Python suitable for embedding into a larger application.

4.1 The full installer

4.1.1 Installation steps

Four Python 3.12 installers are available for download, two each for the 32-bit and 64-bit versions of the interpreter. The web installer is a small initial download, and it will automatically download the required components as necessary. The offline installer includes the components necessary for a default installation and only requires an internet connection for optional features. See Installing Without Downloading for other ways to avoid downloading during installation.

After starting the installer, one of two options may be selected:

[image: installer first page]

If you select Install Now:

- You will not need to be an administrator (unless a system update for the C Runtime Library is required, or you install the Python Launcher for Windows for all users)
- Python will be installed into your user directory
- The Python Launcher for Windows will be installed according to the option at the bottom of the first page
- The standard library, test suite, launcher and pip will be installed
- If selected, the install directory will be added to your PATH
- Shortcuts will only be visible for the current user

Selecting Customize installation will allow you to select the features to install, the installation location, and other options or post-install actions. To install debugging symbols or binaries, you will need to use this option.

To perform an all-users installation, you should select Customize installation. In this case:

- You may be required to provide administrative credentials or approval
- Python will be installed into the Program Files directory
- The Python Launcher for Windows will be installed into the Windows directory
- Optional features may be selected during installation
- The standard library can be pre-compiled to bytecode
- If selected, the install directory will be added to the system PATH
- Shortcuts are available for all users

4.1.2 Removing the MAX_PATH Limitation

Windows historically has limited path lengths to 260 characters. This meant that paths longer than this would not resolve and errors would result.

In the latest versions of Windows, this limitation can be expanded to approximately 32,000 characters. Your administrator will need to activate the Enable Win32 long paths group policy, or
set LongPathsEnabled to 1 in the registry key HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\
FileSystem.

This allows the open() function, the os module and most other path functionality to accept and return paths longer than 260 characters.

After changing the above option, no further configuration is required.

Changed in version 3.6: Support for long paths was enabled in Python.

4.1.3 Installing Without UI

All of the options available in the installer UI can also be specified from the command line, allowing scripted installers to replicate an installation on many machines without user interaction. These options may also be set without suppressing the UI in order to change some of the defaults.

The following options (found by executing the installer with /?) can be passed into the installer:

/passive: to display progress without requiring user interaction
/quiet: to install/uninstall without displaying any UI
/simple: to prevent user customization
/uninstall: to remove Python (without confirmation)
/layout [directory]: to pre-download all components
/log [filename]: to specify log files location

All other options are passed as name=value, where the value is usually 0 to disable a feature, 1 to enable a feature, or a path. The full list of available options is shown below, as "Name: Description (Default)":

InstallAllUsers: Perform a system-wide installation (0)
TargetDir: The installation directory (selected based on InstallAllUsers)
DefaultAllUsersTargetDir: The default installation directory for all-user installs (%ProgramFiles%\Python X.Y or %ProgramFiles(x86)%\Python X.Y)
DefaultJustForMeTargetDir: The default install directory for just-for-me installs (%LocalAppData%\Programs\Python\PythonXY, or %LocalAppData%\Programs\Python\PythonXY-32, or %LocalAppData%\Programs\Python\PythonXY-64)
DefaultCustomTargetDir: The default custom install directory displayed in the UI (empty)
AssociateFiles: Create file associations if the launcher is also installed (1)
CompileAll: Compile all .py files to .pyc (0)
PrependPath: Prepend install and Scripts directories to PATH and add .PY to PATHEXT (0)
AppendPath: Append install and Scripts directories to PATH and add .PY to PATHEXT (0)
Shortcuts: Create shortcuts for the interpreter, documentation and IDLE if installed (1)
Include_doc: Install Python manual (1)
Include_debug: Install debug binaries (0)
Include_dev: Install developer headers and libraries. Omitting this may lead to an unusable installation. (1)
Include_exe: Install python.exe and related files. Omitting this may lead to an unusable installation. (1)
Include_launcher: Install Python Launcher for Windows. (1)
InstallLauncherAllUsers: Installs the launcher for all users. Also requires Include_launcher to be set to 1. (1)
Include_lib: Install standard library and extension modules. Omitting this may lead to an unusable installation. (1)
Include_pip: Install bundled pip and setuptools (1)
Include_symbols: Install debugging symbols (*.pdb) (0)
Include_tcltk: Install Tcl/Tk support and IDLE (1)
Include_test: Install standard library test suite (1)
Include_tools: Install utility scripts (1)
LauncherOnly: Only installs the launcher. This will override most other options. (0)
SimpleInstall: Disable most install UI (0)
SimpleInstallDescription: A custom message to display when the simplified install UI is used (empty)

For example, to silently install a default, system-wide Python installation, you could use the following command (from an elevated command prompt):

python-3.9.0.exe /quiet InstallAllUsers=1 PrependPath=1 Include_test=0

To allow users to easily install a personal copy of Python without the test suite, you could provide a shortcut with the following command. This will display a simplified initial page and disallow customization:

python-3.9.0.exe InstallAllUsers=0 Include_launcher=0 Include_test=0 SimpleInstall=1 SimpleInstallDescription="Just for me, no test suite"

Note that omitting the launcher also omits file associations, and is only recommended for per-user installs when there is also a system-wide installation that included the launcher.

The options listed above can also be provided in a file named unattend.xml alongside the executable. This file specifies a
list of options and values. When a value is provided as an attribute, it will be converted to a number if possible. Values provided as element text are always left as strings. This example file sets the same options as the previous example:

<Options>
    <Option Name="InstallAllUsers" Value="no" />
    <Option Name="Include_launcher" Value="0" />
    <Option Name="Include_test" Value="no" />
    <Option Name="SimpleInstall" Value="yes" />
    <Option Name="SimpleInstallDescription">Just for me, no test suite</Option>
</Options>

4.1.4 Installing Without Downloading

As some features of Python are not included in the initial installer download, selecting those features may require an internet connection. To avoid this need, all possible components may be downloaded on-demand to create a complete layout that will no longer require an internet connection regardless of the selected features. Note that this download may be bigger than required, but where a large number of installations are going to be performed it is very useful to have a locally cached copy.

Execute the following command from Command Prompt to download all possible required files. Remember to substitute python-3.9.0.exe for the actual name of your installer, and to create layouts in their own directories to avoid collisions between files with the same name.

python-3.9.0.exe /layout [optional target directory]

You may also specify the /quiet option to hide the progress display.

4.1.5 Modifying an install

Once Python has been installed, you can add or remove features through the Programs and Features tool that is part of Windows. Select the Python entry and choose Uninstall/Change to open the installer in maintenance mode.

Modify allows you to add or remove features by modifying the checkboxes; unchanged checkboxes will not install or remove anything. Some options cannot be changed in this mode, such as the install directory; to modify these, you will need to remove and then reinstall Python completely.

Repair will verify all the files that should be installed using the current settings, and replace any that have been removed or modified.

Uninstall will remove Python entirely, with the exception
of the Python Launcher for Windows, which has its own entry in Programs and Features.

4.2 The Microsoft Store package

New in version 3.7.2.

The Microsoft Store package is an easily installable Python interpreter that is intended mainly for interactive use, for example, by students.

To install the package, ensure you have the latest Windows 10 updates and search the Microsoft Store app for "Python 3.12". Ensure that the app you select is published by the Python Software Foundation, and install it.

Warning: Python will always be available for free on the Microsoft Store. If you are asked to pay for it, you have not selected the correct package.

After installation, Python may be launched by finding it in Start. Alternatively, it will be available from any Command Prompt or PowerShell session by typing python. Further, pip and IDLE may be used by typing pip or idle. IDLE can also be found in Start.

All three commands are also available with version number suffixes, for example, as python3.exe and python3.x.exe as well as python.exe (where 3.x is the specific version you want to launch, such as 3.12). Open "Manage App Execution Aliases" through Start to select which version of Python is associated with each command. It is recommended to make sure that pip and idle are consistent with whichever version of python is selected.

Virtual environments can be created with python -m venv and activated and used as normal.

If you have installed another version of Python and added it to your PATH variable, it will be available as python.exe rather than the one from the Microsoft Store. To access the new installation, use python3.exe or python3.x.exe. The py.exe launcher will detect this Python installation, but will prefer installations from the traditional installer.

To remove Python, open Settings and use Apps and Features, or else find Python in Start and right-click to select Uninstall. Uninstalling will remove all packages you installed directly into this Python installation, but will not remove any virtual environments.
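The python -m venv workflow described above is a thin wrapper around the standard library venv module, so the same environment can be created from Python code. A minimal cross-platform sketch (the directory name my-env is arbitrary, chosen for illustration):

```python
import os
import tempfile
import venv

# "python -m venv my-env" performs this same work via the venv module;
# my-env is an arbitrary, illustrative name.
env_dir = os.path.join(tempfile.mkdtemp(), "my-env")

# with_pip=False keeps this sketch fast and offline; the command-line
# default also bootstraps pip into the new environment.
venv.create(env_dir, with_pip=False)

# Every environment contains a pyvenv.cfg file describing its base interpreter.
print(os.path.exists(os.path.join(env_dir, "pyvenv.cfg")))  # prints True
```

On Windows the resulting environment is activated with my-env\Scripts\activate, exactly as for an environment created from the command line.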
4.2.1 Known issues

4.2.1.1 Redirection of local data, registry, and temporary paths

Because of restrictions on Microsoft Store apps, Python scripts may not have full write access to shared locations such as TEMP and the registry. Instead, it will write to a private copy. If your scripts must modify the shared locations, you will need to install the full installer.

At runtime, Python will use a private copy of well-known Windows folders and the registry. For example, if the environment variable %APPDATA% is c:\Users\<user>\AppData\, then when writing to C:\Users\<user>\AppData\Local, it will write to C:\Users\<user>\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\Local\.

When reading files, Windows will return the file from the private folder, or if that does not exist, the real Windows directory. For example, reading C:\Windows\System32 returns the contents of C:\Windows\System32 plus the contents of C:\Program Files\WindowsApps\package_name\VFS\SystemX86.

You can find the real path of any existing file using os.path.realpath():

>>> import os
>>> test_file = "C:\\Users\\example\\AppData\\Local\\test.txt"
>>> os.path.realpath(test_file)
'C:\\Users\\example\\AppData\\Local\\Packages\\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\\LocalCache\\Local\\test.txt'

When writing to the Windows Registry, the following behaviors exist:

- Reading from HKLM\Software is allowed and results are merged with the registry.dat file in the package.
- Writing to HKLM\Software is not allowed if the corresponding key/value exists, i.e. modifying existing keys.
- Writing to HKLM\Software is allowed as long as a corresponding key/value does not exist in the package and the user has the correct access permissions.

For more detail on the technical basis for these limitations, please consult Microsoft's documentation on packaged full-trust apps, currently available at docs.microsoft.com/en-us/windows/msix/desktop/desktop-to-uwp-behind-the-scenes

4.3 The nuget.org packages

New in version 3.5.2.

The nuget.org package is a reduced size Python environment intended for use on continuous integration and build systems that do not have a system-
wide install of Python. While nuget is "the package manager for .NET", it also works perfectly fine for packages containing build-time tools.

Visit nuget.org for the most up-to-date information on using nuget. What follows is a summary that is sufficient for Python developers.

The nuget.exe command line tool may be downloaded directly from https://aka.ms/nugetclidl, for example, using curl or PowerShell. With the tool, the latest version of Python for 64-bit or 32-bit machines is installed using:

nuget.exe install python -ExcludeVersion -OutputDirectory .
nuget.exe install pythonx86 -ExcludeVersion -OutputDirectory .

To select a particular version, add a -Version 3.x.y. The output directory may be changed from ., and the package will be installed into a subdirectory. By default, the subdirectory is named the same as the package, and without the -ExcludeVersion option this name will include the specific version installed. Inside the subdirectory is a tools directory that contains the Python installation:

# Without -ExcludeVersion
> .\python.3.5.2\tools\python.exe -V
Python 3.5.2

# With -ExcludeVersion
> .\python\tools\python.exe -V
Python 3.5.2

In general, nuget packages are not upgradeable, and newer versions should be installed side-by-side and referenced using the full path. Alternatively, delete the package directory manually and install it again. Many CI systems will do this automatically if they do not preserve files between builds.

Alongside the tools directory is a build\native directory. This contains a MSBuild properties file python.props that can be used in a C++ project to reference the Python install. Including the settings will automatically use the headers and import libraries in your build.

The package information pages on nuget.org are www.nuget.org/packages/python for the 64-bit version and www.nuget.org/packages/pythonx86 for the 32-bit version.

4.4 The embeddable package

New in version 3.5.

The embedded distribution is a ZIP file containing a minimal Python environment. It is intended for acting as part of
another application, rather than being directly accessed by end users.

When extracted, the embedded distribution is almost fully isolated from the user's system, including environment variables, system registry settings, and installed packages. The standard library is included as pre-compiled and optimized .pyc files in a ZIP, and python3.dll, python37.dll, python.exe and pythonw.exe are all provided. Tcl/tk (including all dependents, such as Idle), pip and the Python documentation are not included.

Note: The embedded distribution does not include the Microsoft C Runtime and it is the responsibility of the application installer to provide this. The runtime may have already been installed on a user's system previously or automatically via Windows Update, and can be detected by finding ucrtbase.dll in the system directory.

Third-party packages should be installed by the application installer alongside the embedded distribution. Using pip to manage dependencies as for a regular Python installation is not supported with this distribution, though with some care it may be possible to include and use pip for automatic updates. In general, third-party packages should be treated as part of the application ("vendoring") so that the developer can ensure compatibility with newer versions before providing updates to users.

The two recommended use cases for this distribution are described below.

4.4.1 Python Application

An application written in Python does not necessarily require users to be aware of that fact. The embedded distribution may be used in this case to include a private version of Python in an install package. Depending on how transparent it should be (or conversely, how professional it should appear), there are two options.

Using a specialized executable as a launcher requires some coding, but provides the most transparent experience for users. With a customized launcher, there are no obvious indications that the program is running on Python: icons can be customized, company and version information can be specified, and file associations behave properly. In most cases, a custom launcher should simply be able
to call Py_Main with a hard-coded command line.

The simpler approach is to provide a batch file or generated shortcut that directly calls the python.exe or pythonw.exe with the required command-line arguments. In this case, the application will appear to be Python and not its actual name, and users may have trouble distinguishing it from other running Python processes or file associations.

With the latter approach, packages should be installed as directories alongside the Python executable to ensure they are available on the path. With the specialized launcher, packages can be located in other locations as there is an opportunity to specify the search path before launching the application.

4.4.2 Embedding Python

Applications written in native code often require some form of scripting language, and the embedded Python distribution can be used for this purpose. In general, the majority of the application is in native code, and some part will either invoke python.exe or directly use python3.dll. For either case, extracting the embedded distribution to a subdirectory of the application installation is sufficient to provide a loadable Python interpreter.

As with the application use, packages can be installed to any location as there is an opportunity to specify search paths before initializing the interpreter. Otherwise, there is no fundamental differences between using the embedded distribution and a regular installation.

4.5 Alternative bundles

Besides the standard CPython distribution, there are modified packages including additional functionality. The following is a list of popular versions and their key features:

ActivePython: Installer with multi-platform compatibility, documentation, PyWin32

Anaconda: Popular scientific modules (such as numpy, scipy and pandas) and the conda package manager

Enthought Deployment Manager: "The Next Generation Python Environment and Package Manager". Previously Enthought provided Canopy, but it reached end of life in 2016.

WinPython: Windows-specific distribution with
prebuilt scientific packages and tools for building packages

Note that these packages may not include the latest versions of Python or other libraries, and are not maintained or supported by the core Python team.

4.6 Configuring Python

To run Python conveniently from a command prompt, you might consider changing some default environment variables in Windows. While the installer provides an option to configure the PATH and PATHEXT variables for you, this is only reliable for a single, system-wide installation. If you regularly use multiple versions of Python, consider using the Python Launcher for Windows.

4.6.1 Excursus: Setting environment variables

Windows allows environment variables to be configured permanently at both the User level and the System level, or temporarily in a command prompt.

To temporarily set environment variables, open Command Prompt and use the set command:

C:\>set PATH=C:\Program Files\Python 3.9;%PATH%
C:\>set PYTHONPATH=%PYTHONPATH%;C:\My_python_lib
C:\>python

These changes will apply to any further commands executed in that console, and will be inherited by any applications started from the console.

Including the variable name within percent signs will expand to the existing value, allowing you to add your new value at either the start or the end. Modifying PATH by adding the directory containing python.exe to the start is a common way to ensure the correct version of Python is launched.

To permanently modify the default environment variables, click Start and search for "edit environment variables", or open System properties, Advanced system settings and click the Environment Variables button. In this dialog, you can add or modify User and System variables. To change System variables, you need non-restricted access to your machine (i.e. Administrator rights).

Note: Windows will concatenate User variables after System variables, which may cause unexpected results when modifying PATH.

The PYTHONPATH variable is used by all versions of Python, so you should not permanently configure it unless the listed paths only include code that is compatible with all of your installed Python
versions.

See also:

https://docs.microsoft.com/en-us/windows/win32/procthread/environment-variables: Overview of environment variables on Windows

https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/set_1: The set command, for temporarily modifying environment variables

https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/setx: The setx command, for permanently modifying environment variables

4.6.2 Finding the Python executable

Changed in version 3.5.

Besides using the automatically created start menu entry for the Python interpreter, you might want to start Python in the command prompt. The installer has an option to set that up for you.

On the first page of the installer, an option labelled "Add Python to PATH" may be selected to have the installer add the install location into the PATH. The location of the Scripts\ folder is also added. This allows you to type python to run the interpreter, and pip for the package installer. Thus, you can also execute your scripts with command line options, see Command line documentation.

If you don't enable this option at install time, you can always re-run the installer, select Modify, and enable it. Alternatively, you can manually modify the PATH using the directions in Excursus: Setting environment variables. You need to set your PATH environment variable to include the directory of your Python installation, delimited by a semicolon from other entries. An example variable could look like this (assuming the first two entries already existed):

C:\WINDOWS\system32;C:\WINDOWS;C:\Program Files\Python 3.9

4.7 UTF-8 mode

New in version 3.7.

Windows still uses legacy encodings for the system encoding (the ANSI Code Page). Python uses it for the default encoding of text files (e.g. locale.getencoding()).

This may cause issues because UTF-8 is widely used on the internet and most Unix systems, including WSL (Windows Subsystem for Linux).

You can use the Python UTF-8 Mode to change the default text encoding to UTF-8. You can enable the
Python UTF-8 Mode via the -X utf8 command line option, or the PYTHONUTF8=1 environment variable. See PYTHONUTF8 for enabling UTF-8 mode, and Excursus: Setting environment variables for how to modify environment variables.

When the Python UTF-8 Mode is enabled, you can still use the system encoding (the ANSI Code Page) via the "mbcs" codec.

Note that adding PYTHONUTF8=1 to the default environment variables will affect all Python 3.7+ applications on your system. If you have any Python 3.7+ applications which rely on the legacy system encoding, it is recommended to set the environment variable temporarily or use the -X utf8 command line option.

Note: Even when UTF-8 mode is disabled, Python uses UTF-8 by default on Windows for:

- Console I/O including standard I/O (see PEP 528 for details).
- The filesystem encoding (see PEP 529 for details).

4.8 Python Launcher for Windows

New in version 3.3.

The Python launcher for Windows is a utility which aids in locating and executing of different Python versions. It allows scripts (or the command-line) to indicate a preference for a specific Python version, and will locate and execute that version.

Unlike the PATH variable, the launcher will correctly select the most appropriate version of Python. It will prefer per-user installations over system-wide ones, and orders by language version rather than using the most recently installed version.

The launcher was originally specified in PEP 397.

4.8.1 Getting started

4.8.1.1 From the command-line

Changed in version 3.6.

System-wide installations of Python 3.3 and later will put the launcher on your PATH. The launcher is compatible with all available versions of Python, so it does not matter which version is installed. To check that the launcher is available, execute the following command in Command Prompt:

py

You should find that the latest version of Python you have installed is started; it can be exited as normal, and any additional command-line arguments specified will be sent directly to Python.

If you have multiple versions of Python installed (e.g., 3.7 and 3.12) you will have noticed that Python 3.12 was started. To launch Python 3.7,
try the command py -3.7. If you want the latest version of Python 2 you have installed, try the command py -2. If you see the following error, you do not have the launcher installed:

'py' is not recognized as an internal or external command,
operable program or batch file.

The command py --list displays the currently installed version(s) of Python.

The -x.y argument is the short form of the -V:Company/Tag argument, which allows selecting a specific Python runtime, including those that may have come from somewhere other than python.org. Any runtime registered by following PEP 514 will be discoverable. The --list command lists all available runtimes using the -V: format.

When using the -V: argument, specifying the Company will limit selection to runtimes from that provider, while specifying only the Tag will select from all providers. Note that omitting the slash implies a tag:

# Select any '3.*' tagged runtime
py -V:3

# Select any 'PythonCore' released runtime
py -V:PythonCore/

# Select PythonCore's latest Python 3 runtime
py -V:PythonCore/3

The short form of the argument (-3) only ever selects from core Python releases, and not other distributions. However, the longer form (-V:3) will select from any.

The Company is matched on the full string, case-insensitive. The Tag is matched on either the full string, or a prefix, provided the next character is a dot or a hyphen. This allows -V:3.1 to match 3.1-32, but not 3.10. Tags are sorted using numerical ordering (3.10 is newer than 3.1), but are compared using text (-V:3.01 does not match 3.1).

4.8.1.2 Virtual environments

New in version 3.5.

If the launcher is run with no explicit Python version specification, and a virtual environment (created with the standard library venv module or the external virtualenv tool) active, the launcher will run the virtual environment's interpreter rather than the global one. To run the global interpreter, either deactivate the virtual environment, or explicitly specify the global Python version.

4.8.1.3 From a script

Let's create a test Python script, create a file
called hello.py with the following contents:

#! python
import sys
sys.stdout.write("hello from Python %s\n" % sys.version)

From the directory in which hello.py lives, execute the command:

py hello.py

You should notice the version number of your latest Python 2.x installation is printed. Now try changing the first line to be:

#! python3

Re-executing the command should now print the latest Python 3.x information. As with the above command-line examples, you can specify a more explicit version qualifier. Assuming you have Python 3.7 installed, try changing the first line to #! python3.7 and you should find the 3.7 version information printed.

Note that unlike interactive use, a bare "python" will use the latest version of Python 2.x that you have installed. This is for backward compatibility and for compatibility with Unix, where the command python typically refers to Python 2.

4.8.1.4 From file associations

The launcher should have been associated with Python files (i.e. .py, .pyw, .pyc files) when it was installed. This means that when you double-click on one of these files from Windows explorer the launcher will be used, and therefore you can use the same facilities described above to have the script specify the version which should be used.

The key benefit of this is that a single launcher can support multiple Python versions at the same time depending on the contents of the first line.

4.8.2 Shebang Lines

If the first line of a script file starts with #!, it is known as a "shebang" line. Linux and other Unix like operating systems have native support for such lines, and they are commonly used on such systems to indicate how a script should be executed. This launcher allows the same facilities to be used with Python scripts on Windows, and the examples above demonstrate their use.

To allow shebang lines in Python scripts to be portable between Unix and Windows, this launcher supports a number of "virtual" commands to specify which interpreter to use. The supported virtual commands are:

- /usr/bin/env
- /usr/bin/python
- /usr/local/bin/python
- python

For example, if the first line of your script starts with:

#! /usr/bin/python

The
default Python will be located and used. As many Python scripts written to work on Unix will already have this line, you should find these scripts can be used by the launcher without modification. If you are writing a new script on Windows which you hope will be useful on Unix, you should use one of the shebang lines starting with /usr.

Any of the above virtual commands can be suffixed with an explicit version (either just the major version, or the major and minor version). Furthermore, the 32-bit version can be requested by adding "-32" after the minor version. I.e. /usr/bin/python3.7-32 will request usage of the 32-bit python 3.7.

New in version 3.7: Beginning with python launcher 3.7 it is possible to request 64-bit version by the "-64" suffix. Furthermore it is possible to specify a major and architecture without minor (i.e. /usr/bin/python3-64).

Changed in version 3.11: The "-64" suffix is deprecated, and now implies "any architecture that is not provably i386/32-bit". To request a specific environment, use the new -V:TAG argument with the complete tag.

The /usr/bin/env form of shebang line has one further special property. Before looking for installed Python interpreters, this form will search the executable PATH for a Python executable matching the name provided as the first argument. This corresponds to the behaviour of the Unix env program, which performs a PATH search. If an executable matching the first argument after the env command cannot be found, but the argument starts with python, it will be handled as described for the other virtual commands. The environment variable PYLAUNCHER_NO_SEARCH_PATH may be set (to any value) to skip this search of PATH.

Shebang lines that do not match any of these patterns are looked up in the [commands] section of the launcher's .INI file. This may be used to handle certain commands in a way that makes sense for your system. The name of the command must be a single argument (no spaces in the shebang executable), and the value substituted is the full path to the executable
(additional arguments specified in the .INI will be quoted as part of the filename):

[commands]
/bin/xpython=C:\Program Files\XPython\python.exe

Any commands not found in the .INI file are treated as Windows executable paths that are absolute or relative to the directory containing the script file. This is a convenience for Windows-only scripts, such as those generated by an installer, since the behavior is not compatible with Unix-style shells. These paths may be quoted, and may include multiple arguments, after which the path to the script and any additional arguments will be appended.

4.8.3 Arguments in shebang lines

The shebang lines can also specify additional options to be passed to the Python interpreter. For example, if you have a shebang line:

#! /usr/bin/python -v

Then Python will be started with the -v option.

4.8.4 Customization

4.8.4.1 Customization via INI files

Two .ini files will be searched by the launcher: py.ini in the current user's application data directory (%LOCALAPPDATA% or $env:LocalAppData) and py.ini in the same directory as the launcher. The same .ini files are used for both the "console" version of the launcher (i.e. py.exe) and for the "windows" version (i.e. pyw.exe).

Customization specified in the "application directory" will have precedence over the one next to the executable, so a user, who may not have write access to the .ini file next to the launcher, can override commands in that global .ini file.

4.8.4.2 Customizing default Python versions

In some cases, a version qualifier can be included in a command to dictate which version of Python will be used by the command. A version qualifier starts with a major version number and can optionally be followed by a period ('.') and a minor version specifier. Furthermore it is possible to specify if a 32 or 64 bit implementation shall be requested by adding "-32" or "-64".

For example, a shebang line of #! python has no version qualifier, while #! python3 has a version qualifier which specifies only a major version.

If no version qualifiers are found in a command, the environment variable PY_PYTHON can be set to specify the default version qualifier. If it is not set, the
default is 3 The variable can specify any value that may be passed on the command line such as 3 3 7 3 7 32 or 3 7 64 Note that the 64 option is only available with the launcher included with Python 3 7 or newer If no minor version qualifiers are found the environment variable PY_PYTHON major where major is the current major version qualifier as determined above can be set to specify the full version If no such option is found the launcher will enumerate the installed Python versions and use the latest minor release found for the major version which is likely although not guaranteed to be the most recently installed version in that family On 64 bit Windows with both 32 bit and 64 bit implementations of the same major minor Python version installed the 64 bit version will always be preferred This will be true for both 32 bit and 64 bit implementations of the launcher a 32 bit launcher will prefer to execute a 64 bit Python installation of the specified version if available This is so the behavior of the launcher can be predicted knowing only what versions are installed on the PC and without regard to the order in which they were installed i e without knowing whether a 32 or 64 bit version of Python and corresponding launcher was installed last As noted above an optional 32 or 64 suffix can be used on a version specifier to change this behaviour Examples If no relevant options are set the commands python and python2 will use the latest Python 2 x version installed and the command python3 will use the latest Python 3 x installed The command python3 7 will not consult any options at all as the versions are fully specified If PY_PYTHON 3 the commands python and python3 will both use the latest installed Python 3 version If PY_PYTHON 3 7 32 the command python will use the 32 bit implementation of 3 7 whereas the command python3 will use the latest installed Python PY_PYTHON was not considered at all as a major version was specified If PY_PYTHON 3 and PY_PYTHON3 3 7 the 
commands python and python3 will both use specifically 3.7.

In addition to environment variables, the same settings can be configured in the INI file used by the launcher. The section in the INI file is called [defaults], and the key name will be the same as the environment variables without the leading PY_ prefix (and note that the key names in the INI file are case insensitive). The contents of an environment variable will override things specified in the INI file.

For example:

Setting PY_PYTHON=3.7 is equivalent to the INI file containing:

[defaults]
python=3.7

Setting PY_PYTHON=3 and PY_PYTHON3=3.7 is equivalent to the INI file containing:

[defaults]
python=3
python3=3.7

4.8.5 Diagnostics

If an environment variable PYLAUNCHER_DEBUG is set (to any value), the launcher will print diagnostic information to stderr (i.e. to the console). While this information manages to be simultaneously verbose and terse, it should allow you to see what versions of Python were located, why a particular version was chosen, and the exact command line used to execute the target Python. It is primarily intended for testing and debugging.

4.8.6 Dry Run

If an environment variable PYLAUNCHER_DRYRUN is set (to any value), the launcher will output the command it would have run, but will not actually launch Python. This may be useful for tools that want to use the launcher to detect and then launch Python directly. Note that the command written to standard output is always encoded using UTF-8, and may not render correctly in the console.

4.8.7 Install on demand

If an environment variable PYLAUNCHER_ALLOW_INSTALL is set (to any value), and the requested Python version is not installed but is available on the Microsoft Store, the launcher will attempt to install it. This may require user interaction to complete, and you may need to run the command again.

An additional PYLAUNCHER_ALWAYS_INSTALL variable causes the launcher to always try to install Python, even if it is detected. This is mainly intended for testing (and should be used with PYLAUNCHER_DRYRUN).

4.8.8 Return codes

The following exit codes may be returned by the Python launcher.
Unfortunately, there is no way to distinguish these from the exit code of Python itself. The names of codes are as used in the sources, and are only for reference. There is no way to access or resolve them apart from reading this page. Entries are listed in alphabetical order of names.

Name                Value  Description
RC_BAD_VENV_CFG     107    A pyvenv.cfg was found but is corrupt.
RC_CREATE_PROCESS   101    Failed to launch Python.
RC_INSTALLING       111    An install was started, but the command will need to be re-run after it completes.
RC_INTERNAL_ERROR   109    Unexpected error. Please report a bug.
RC_NO_COMMANDLINE   108    Unable to obtain command line from the operating system.
RC_NO_PYTHON        103    Unable to locate the requested version.
RC_NO_VENV_CFG      106    A pyvenv.cfg was required but not found.

4.9 Finding modules

These notes supplement the description at The initialization of the sys.path module search path with detailed Windows notes.

When no ._pth file is found, this is how sys.path is populated on Windows:

An empty entry is added at the start, which corresponds to the current directory.

If the environment variable PYTHONPATH exists, as described in Environment variables, its entries are added next. Note that on Windows, paths in this variable must be separated by semicolons, to distinguish them from the colon used in drive identifiers (C:\ etc.).

Additional application paths can be added in the registry as subkeys of \SOFTWARE\Python\PythonCore{version}\PythonPath under both the HKEY_CURRENT_USER and HKEY_LOCAL_MACHINE hives. Subkeys which have semicolon-delimited path strings as their default value will cause each path to be added to sys.path. (Note that all known installers only use HKLM, so HKCU is typically empty.)

If the environment variable PYTHONHOME is set, it is assumed as Python Home. Otherwise, the path of the main Python executable is used to locate a landmark file (either Lib\os.py or pythonXY.zip) to deduce the Python Home. If a Python home is found, the relevant sub-directories added to sys.path (Lib, plat-win, etc) are based
on that folder. Otherwise, the core Python path is constructed from the PythonPath stored in the registry.

If the Python Home cannot be located, no PYTHONPATH is specified in the environment, and no registry entries can be found, a default path with relative entries is used (e.g. .\Lib;.\plat-win, etc).

If a pyvenv.cfg file is found alongside the main executable, or in the directory one level above the executable, the following variations apply:

If home is an absolute path and PYTHONHOME is not set, this path is used instead of the path to the main executable when deducing the home location.

The end result of all this is:

When running python.exe, or any other .exe in the main Python directory (either an installed version, or directly from the PCbuild directory), the core path is deduced, and the core paths in the registry are ignored. Other application paths in the registry are always read.

When Python is hosted in another .exe (different directory, embedded via COM, etc), the Python Home will not be deduced, so the core path from the registry is used. Other application paths in the registry are always read.

If Python can't find its home and there are no registry value (frozen .exe, some very strange installation setup) you get a path with some default, but relative, paths.

For those who want to bundle Python into their application or distribution, the following advice will prevent conflicts with other installations:

Include a ._pth file alongside your executable containing the directories to include. This will ignore paths listed in the registry and environment variables, and also ignore site unless import site is listed.

If you are loading python3.dll or python37.dll in your own executable, explicitly call Py_SetPath() or (at least) Py_SetProgramName() before Py_Initialize().

Clear and/or overwrite PYTHONPATH and set PYTHONHOME before launching python.exe from your application.

If you cannot use the previous suggestions (for example, you are a distribution that allows people to run python.exe directly), ensure that the landmark file Lib\os.py exists in your install directory. Note that it will not be detected inside a ZIP file,
but a correctly named ZIP file will be detected instead.

These will ensure that the files in a system-wide installation will not take precedence over the copy of the standard library bundled with your application. Otherwise, your users may experience problems using your application. Note that the first suggestion is the best, as the others may still be susceptible to non-standard paths in the registry and user site-packages.

Changed in version 3.6: Add ._pth file support and removes applocal option from pyvenv.cfg.

Changed in version 3.6: Add pythonXX.zip as a potential landmark when directly adjacent to the executable.

Deprecated since version 3.6: Modules specified in the registry under Modules (not PythonPath) may be imported by importlib.machinery.WindowsRegistryFinder. This finder is enabled on Windows in 3.6.0 and earlier, but may need to be explicitly added to sys.meta_path in the future.

4.10 Additional modules

Even though Python aims to be portable among all platforms, there are features that are unique to Windows. A couple of modules, both in the standard library and external, and snippets exist to use these features. The Windows-specific standard modules are documented in MS Windows Specific Services.

4.10.1 PyWin32

The PyWin32 module by Mark Hammond is a collection of modules for advanced Windows-specific support. This includes utilities for: Component Object Model (COM), Win32 API calls, Registry, Event log, Microsoft Foundation Classes (MFC) user interfaces.

PythonWin is a sample MFC application shipped with PyWin32. It is an embeddable IDE with a built-in debugger.

See also: Win32 How Do I...? by Tim Golden; Python and COM by David and Paul Boddie.

4.10.2 cx_Freeze

cx_Freeze wraps Python scripts into executable Windows programs (*.exe files). When you have done this, you can distribute your application without requiring your users to install Python.

4.11 Compiling Python on Windows

If you want to compile CPython yourself, first thing you should do is get the source. You can download either the latest release's source or just grab a fresh checkout. The source tree contains a build solution and project files for Microsoft Visual Studio, which is the compiler used to build the official Python releases. These files are in the PCbuild directory. Check PCbuild/readme.txt for general information on the build process. For extension modules, consult Building C and C++ Extensions on Windows.

4.12 Other Platforms

With ongoing development of Python, some platforms that used to be supported earlier are no longer supported (due to the lack of users or developers). Check PEP 11 for details on all unsupported platforms.

Windows CE is no longer supported since Python 3 (if it ever was).

The Cygwin installer offers to install the Python interpreter as well.

See Python for Windows for detailed information about platforms with precompiled installers.
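Since per-platform behaviour comes up throughout this chapter, here is a minimal sketch (not part of the original documentation; the helper name is illustrative) of how a script can tell these platforms apart at runtime:

```python
import sys
import platform

def describe_platform() -> str:
    """Return a short human-readable platform description."""
    # sys.platform identifies the OS family: "win32" on Windows
    # (even for 64-bit builds), "cygwin" under Cygwin's Python,
    # "linux" on Linux, and "darwin" on macOS.
    family = sys.platform
    # platform.machine() reports the hardware architecture,
    # e.g. "AMD64" on 64-bit Windows or "x86_64" on Linux.
    return f"{family} on {platform.machine()}"

print(describe_platform())
```

Note that sys.platform stays "win32" for 64-bit Windows builds, so it distinguishes OS families, not bitness.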
Binary Data Services

The modules described in this chapter provide some basic services operations for manipulation of binary data. Other operations on binary data, specifically in relation to file formats and network protocols, are described in the relevant sections.

Some libraries described under Text Processing Services also work with either ASCII-compatible binary formats (for example, re) or all binary data (for example, difflib).

In addition, see the documentation for Python's built-in binary data types in Binary Sequence Types - bytes, bytearray, memoryview.

struct - Interpret bytes as packed binary data
  Functions and Exceptions
  Format Strings
    Byte Order, Size, and Alignment
    Format Characters
    Examples
  Applications
    Native Formats
    Standard Formats
  Classes

codecs - Codec registry and base classes
  Codec Base Classes
    Error Handlers
    Stateless Encoding and Decoding
    Incremental Encoding and Decoding
      IncrementalEncoder Objects
      IncrementalDecoder Objects
    Stream Encoding and Decoding
      StreamWriter Objects
      StreamReader Objects
      StreamReaderWriter Objects
      StreamRecoder Objects
  Encodings and Unicode
  Standard Encodings
  Python Specific Encodings
    Text Encodings
    Binary Transforms
    Text Transforms
  encodings.idna - Internationalized Domain Names in Applications
  encodings.mbcs - Windows ANSI codepage
  encodings.utf_8_sig - UTF-8 codec with BOM signature
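The struct module listed above is the workhorse for interpreting bytes as packed binary data. A minimal sketch (the format string and field values are chosen purely for illustration):

```python
import struct

# "<" selects little-endian standard sizes; "H" is an unsigned
# 16-bit integer, "4s" a 4-byte string, "f" a 32-bit float.
fmt = "<H4sf"
packed = struct.pack(fmt, 1024, b"spam", 2.5)  # 2 + 4 + 4 = 10 bytes

# unpack() recovers the original values; 2.5 is exactly
# representable as a 32-bit float, so it round-trips.
count, tag, ratio = struct.unpack(fmt, packed)
```

The same format string drives pack(), unpack() and calcsize(), which is what makes struct suitable for fixed-layout file headers and network messages.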
gettext - Multilingual internationalization services

Source code: Lib/gettext.py

The gettext module provides internationalization (I18N) and localization (L10N) services for your Python modules and applications. It supports both the GNU gettext message catalog API and a higher level, class-based API that may be more appropriate for Python files. The interface described below allows you to write your module and application messages in one natural language, and provide a catalog of translated messages for running under different natural languages. Some hints on localizing your Python modules and applications are also given.

GNU gettext API

The gettext module defines the following API, which is very similar to the GNU gettext API. If you use this API you will affect the translation of your entire application globally. Often this is what you want if your application is monolingual, with the choice of language dependent on the locale of your user. If you are localizing a Python module, or if your application needs to switch languages on the fly, you probably want to use the class-based API instead.

gettext.bindtextdomain(domain, localedir=None)
Bind the domain to the locale directory localedir. More concretely, gettext will look for binary .mo files for the given domain using the path (on Unix): localedir/language/LC_MESSAGES/domain.mo, where language is searched for in the environment variables LANGUAGE, LC_ALL, LC_MESSAGES, and LANG respectively. If localedir is omitted or None, then the current binding for domain is returned. [1]

gettext.textdomain(domain=None)
Change or query the current global domain. If domain is None, then the current global domain is returned, otherwise the global domain is set to domain, which is returned.

gettext.gettext(message)
Return the localized translation of message, based on the current global domain, language, and locale directory. This function is usually aliased as _() in the local namespace (see examples below).

gettext.dgettext(domain, message)
Like gettext(), but look the message up in the specified domain.

gettext.ngettext(singular, plural, n)
Like gettext(), but consider plural forms. If a translation is found, apply the plural formula to n, and return the resulting message (some languages have more than two plural forms). If no translation is found, return singular if n is 1; return plural otherwise.

The plural formula is taken from the catalog header. It is a C or Python expression that has a free variable n; the expression evaluates to the index of the plural in the catalog. See the GNU gettext documentation for the precise syntax to be used in .po files and the formulas for a variety of languages.

gettext.dngettext(domain, singular, plural, n)
Like ngettext(), but look the message up in the specified domain.

gettext.pgettext(context, message)
gettext.dpgettext(domain, context, message)
gettext.npgettext(context, singular, plural, n)
gettext.dnpgettext(domain, context, singular, plural, n)
Similar to the corresponding functions without the p in the prefix (that is, gettext(), dgettext(), ngettext(), dngettext()), but the translation is restricted to the given message context.

New in version 3.8.

Note that GNU gettext also defines a dcgettext() method, but this was deemed not useful and so it is currently unimplemented.

Here's an example of typical usage for this API:

import gettext
gettext.bindtextdomain('myapplication', '/path/to/my/language/directory')
gettext.textdomain('myapplication')
_ = gettext.gettext
# ...
print(_('This is a translatable string.'))

Class-based API

The class-based API of the gettext module gives you more flexibility and greater convenience than the GNU gettext API. It is the recommended way of localizing your Python applications and modules. gettext defines a GNUTranslations class which implements the parsing of GNU .mo format files, and has methods for returning strings. Instances of this class can also install themselves in the built-in namespace as the function _().

gettext.find(domain, localedir=None, languages=None, all=False)
This function implements the standard .mo file search algorithm. It takes a domain,
identical to what textdomain() takes. Optional localedir is as in bindtextdomain(). Optional languages is a list of strings, where each string is a language code.

If localedir is not given, then the default system locale directory is used. [2] If languages is not given, then the following environment variables are searched: LANGUAGE, LC_ALL, LC_MESSAGES, and LANG. The first one returning a non-empty value is used for the languages variable. The environment variables should contain a colon separated list of languages, which will be split on the colon to produce the expected list of language code strings.

find() then expands and normalizes the languages, and then iterates through them, searching for an existing file built of these components: localedir/language/LC_MESSAGES/domain.mo

The first such file name that exists is returned by find(). If no such file is found, then None is returned. If all is given, it returns a list of all file names, in the order in which they appear in the languages list or the environment variables.

gettext.translation(domain, localedir=None, languages=None, class_=None, fallback=False)
Return a Translations instance based on the domain, localedir, and languages, which are first passed to find() to get a list of the associated .mo file paths. Instances with identical .mo file names are cached. The actual class instantiated is class_ if provided, otherwise GNUTranslations. The class's constructor must take a single file object argument.

If multiple files are found, later files are used as fallbacks for earlier ones. To allow setting the fallback, copy.copy() is used to clone each translation object from the cache; the actual instance data is still shared with the cache.

If no .mo file is found, this function raises OSError if fallback is false (which is the default), and returns a NullTranslations instance if fallback is true.

Changed in version 3.3: IOError used to be raised, it is now an alias of OSError.

Changed in version 3.11: codeset parameter is removed.

gettext.install(domain, localedir=None, *, names=None)
This installs the function _() in Python's builtins namespace, based on domain and localedir, which are
passed to the function translation().

For the names parameter, please see the description of the translation object's install() method.

As seen below, you usually mark the strings in your application that are candidates for translation, by wrapping them in a call to the _() function, like this:

print(_('This string will be translated.'))

For convenience, you want the _() function to be installed in Python's builtins namespace, so it is easily accessible in all modules of your application.

Changed in version 3.11: names is now a keyword-only parameter.

The NullTranslations class

Translation classes are what actually implement the translation of original source file message strings to translated message strings. The base class used by all translation classes is NullTranslations; this provides the basic interface you can use to write your own specialized translation classes. Here are the methods of NullTranslations:

class gettext.NullTranslations(fp=None)
Takes an optional file object fp, which is ignored by the base class. Initializes "protected" instance variables _info and _charset which are set by derived classes, as well as _fallback, which is set through add_fallback(). It then calls self._parse(fp) if fp is not None.

_parse(fp)
No-op in the base class: this method takes file object fp and reads the data from the file, initializing its message catalog. If you have an unsupported message catalog file format, you should override this method to parse your format.

add_fallback(fallback)
Add fallback as the fallback object for the current translation object. A translation object should consult the fallback if it cannot provide a translation for a given message.

gettext(message)
If a fallback has been set, forward gettext() to the fallback. Otherwise, return message. Overridden in derived classes.

ngettext(singular, plural, n)
If a fallback has been set, forward ngettext() to the fallback. Otherwise, return singular if n is 1; return plural otherwise. Overridden in derived classes.

pgettext(context, message)
If a fallback has been set,
forward pgettext() to the fallback. Otherwise, return the translated message. Overridden in derived classes.

New in version 3.8.

npgettext(context, singular, plural, n)
If a fallback has been set, forward npgettext() to the fallback. Otherwise, return the translated message. Overridden in derived classes.

New in version 3.8.

info()
Return a dictionary containing the metadata found in the message catalog file.

charset()
Return the encoding of the message catalog file.

install(names=None)
This method installs gettext() into the built-in namespace, binding it to _.

If the names parameter is given, it must be a sequence containing the names of functions you want to install in the builtins namespace, in addition to _(). Supported names are 'gettext', 'ngettext', 'pgettext', and 'npgettext'.

Note that this is only one way, albeit the most convenient way, to make the _() function available to your application. Because it affects the entire application globally, and specifically the built-in namespace, localized modules should never install _(). Instead, they should use this code to make _() available to their module:

import gettext
t = gettext.translation('mymodule', ...)
_ = t.gettext

This puts _() only in the module's global namespace and so only affects calls within this module.

Changed in version 3.8: Added 'pgettext' and 'npgettext'.

The GNUTranslations class

The gettext module provides one additional class derived from NullTranslations: GNUTranslations. This class overrides _parse() to enable reading GNU gettext format .mo files in both big-endian and little-endian format.

GNUTranslations parses optional metadata out of the translation catalog. It is convention with GNU gettext to include metadata as the translation for the empty string. This metadata is in RFC 822-style key: value pairs, and should contain the Project-Id-Version key. If the key Content-Type is found, then the charset property is used to initialize the "protected" _charset instance variable, defaulting to None if not found. If the charset encoding is specified, then all message ids and message strings read from the catalog are converted to Unicode using this encoding, else ASCII is assumed. Since
message ids are read as Unicode strings too, all gettext methods will assume message ids as Unicode strings, not byte strings.

The entire set of key/value pairs are placed into a dictionary and set as the "protected" _info instance variable.

If the .mo file's magic number is invalid, the major version number is unexpected, or if other problems occur while reading the file, instantiating a GNUTranslations class can raise OSError.

class gettext.GNUTranslations
The following methods are overridden from the base class implementation:

gettext(message)
Look up the message id in the catalog and return the corresponding message string, as a Unicode string. If there is no entry in the catalog for the message id, and a fallback has been set, the look up is forwarded to the fallback's gettext() method. Otherwise, the message id is returned.

ngettext(singular, plural, n)
Do a plural-forms lookup of a message id. singular is used as the message id for purposes of lookup in the catalog, while n is used to determine which plural form to use. The returned message string is a Unicode string.

If the message id is not found in the catalog, and a fallback is specified, the request is forwarded to the fallback's ngettext() method. Otherwise, when n is 1 singular is returned, and plural is returned in all other cases.

Here is an example:

n = len(os.listdir('.'))
cat = GNUTranslations(somefile())
message = cat.ngettext(
    'There is %(num)d file in this directory',
    'There are %(num)d files in this directory',
    n) % {'num': n}

pgettext(context, message)
Look up the context and message id in the catalog and return the corresponding message string, as a Unicode string. If there is no entry in the catalog for the message id and context, and a fallback has been set, the look up is forwarded to the fallback's pgettext() method. Otherwise, the message id is returned.

New in version 3.8.

npgettext(context, singular, plural, n)
Do a plural-forms lookup of a message id. singular is used as the message id for purposes of lookup in the catalog, while n is used to determine which plural form
to use. If the message id for context is not found in the catalog, and a fallback is specified, the request is forwarded to the fallback's npgettext() method. Otherwise, when n is 1 singular is returned, and plural is returned in all other cases.

New in version 3.8.

Solaris message catalog support

The Solaris operating system defines its own binary .mo file format, but since no documentation can be found on this format, it is not supported at this time.

The Catalog constructor

GNOME uses a version of the gettext module by James Henstridge, but this version has a slightly different API. Its documented usage was:

import gettext
cat = gettext.Catalog(domain, localedir)
_ = cat.gettext
print(_('hello world'))

For compatibility with this older module, the function Catalog() is an alias for the translation() function described above.

One difference between this module and Henstridge's: his catalog objects supported access through a mapping API, but this appears to be unused and so is not currently supported.

Internationalizing your programs and modules

Internationalization (I18N) refers to the operation by which a program is made aware of multiple languages. Localization (L10N) refers to the adaptation of your program, once internationalized, to the local language and cultural habits. In order to provide multilingual messages for your Python programs, you need to take the following steps:

1. prepare your program or module by specially marking translatable strings
2. run a suite of tools over your marked files to generate raw messages catalogs
3. create language-specific translations of the message catalogs
4. use the gettext module so that message strings are properly translated

In order to prepare your code for I18N, you need to look at all the strings in your files. Any string that needs to be translated should be marked by wrapping it in _('...'), that is, a call to the function _(). For example:

filename = 'mylog.txt'
message = _('writing a log message')
with open(filename, 'w') as fp:
    fp.write(message)

In this example, the string 'writing a log message' is marked as a candidate for translation, while the strings 'mylog.txt' and 'w' are not. There
are a few tools to extract the strings meant for translation. The original GNU gettext only supported C or C++ source code, but its extended version xgettext scans code written in a number of languages, including Python, to find strings marked as translatable. Babel is a Python internationalization library that includes a pybabel script to extract and compile message catalogs. François Pinard's program called xpot does a similar job and is available as part of his po-utils package.

Python also includes pure-Python versions of these programs, called pygettext.py and msgfmt.py; some Python distributions will install them for you. pygettext.py is similar to xgettext, but only understands Python source code and cannot handle other programming languages such as C or C++. pygettext.py supports a command-line interface similar to xgettext; for details on its use, run pygettext.py --help. msgfmt.py is binary compatible with GNU msgfmt. With these two programs, you may not need the GNU gettext package to internationalize your Python applications.

xgettext, pygettext, and similar tools generate .po files that are message catalogs. They are structured human-readable files that contain every marked string in the source code, along with a placeholder for the translated versions of these strings.

Copies of these .po files are then handed over to the individual human translators who write translations for every supported natural language. They send back the completed language-specific versions as a <language-name>.po file that's compiled into a machine-readable .mo binary catalog file using the msgfmt program. The .mo files are used by the gettext module for the actual translation processing at run-time.

How you use the gettext module in your code depends on whether you are internationalizing a single module or your entire application. The next two sections will discuss each case.

Localizing your module

If you are localizing your module, you must take care not to make global changes, e.g. to the built-in namespace. You
should not use the GNU gettext API but instead the class-based API.

Let's say your module is called "spam" and the module's various natural language translation .mo files reside in /usr/share/locale in GNU gettext format. Here's what you would put at the top of your module:

import gettext
t = gettext.translation('spam', '/usr/share/locale')
_ = t.gettext

Localizing your application

If you are localizing your application, you can install the _() function globally into the built-in namespace, usually in the main driver file of your application. This will let all your application-specific files just use _('...') without having to explicitly install it in each file.

In the simple case then, you need only add the following bit of code to the main driver file of your application:

import gettext
gettext.install('myapplication')

If you need to set the locale directory, you can pass it into the install() function:

import gettext
gettext.install('myapplication', '/usr/share/locale')

Changing languages on the fly

If your program needs to support many languages at the same time, you may want to create multiple translation instances and then switch between them explicitly, like so:

import gettext

lang1 = gettext.translation('myapplication', languages=['en'])
lang2 = gettext.translation('myapplication', languages=['fr'])
lang3 = gettext.translation('myapplication', languages=['de'])

# start by using language1
lang1.install()

# ... time goes by, user selects language 2
lang2.install()

# ... more time goes by, user selects language 3
lang3.install()

Deferred translations

In most coding situations, strings are translated where they are coded. Occasionally however, you need to mark strings for translation, but defer actual translation until later. A classic example is:

animals = ['mollusk',
           'albatross',
           'rat',
           'penguin',
           'python']
# ...
for a in animals:
    print(a)

Here, you want to mark the strings in the animals list as being translatable, but you don't actually want to translate them until they are printed.

Here is one way you can handle this situation:

def _(message): return message

animals = [_('mollusk'),
           _('albatross'),
           _('rat'),
           _('penguin'),
           _('python')]

del _

# ...
for a in animals:
    print(_(a))

This works because the dummy definition of _()
simply returns the string unchanged. And this dummy definition will temporarily override any definition of _() in the built-in namespace (until the del command). Take care, though, if you have a previous definition of _() in the local namespace.

Note that the second use of _() will not identify "a" as being translatable to the gettext program, because the parameter is not a string literal.

Another way to handle this is with the following example:

def N_(message): return message

animals = [N_('mollusk'),
           N_('albatross'),
           N_('rat'),
           N_('penguin'),
           N_('python')]

# ...
for a in animals:
    print(_(a))

In this case, you are marking translatable strings with the function N_(), which won't conflict with any definition of _(). However, you will need to teach your message extraction program to look for translatable strings marked with N_(). xgettext, pygettext, pybabel extract, and xpot all support this through the use of the -k command-line switch. The choice of N_() here is totally arbitrary; it could have just as easily been MarkThisStringForTranslation().

Acknowledgements

The following people contributed code, feedback, design suggestions, previous implementations, and valuable experience to the creation of this module:

Peter Funk
James Henstridge
Juan David Ibáñez Palomar
Marc-André Lemburg
Martin von Löwis
François Pinard
Barry Warsaw
Gustavo Niemeyer

Footnotes

[1] The default locale directory is system dependent; for example, on Red Hat Linux it is /usr/share/locale, but on Solaris it is /usr/lib/locale. The gettext module does not try to support these system dependent defaults; instead its default is sys.base_prefix/share/locale (see sys.base_prefix). For this reason, it is always best to call bindtextdomain() with an explicit absolute path at the start of your application.

[2] See the footnote for bindtextdomain() above.
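The fallback behaviour of the base translation class described above can be observed without any .mo files at all. A minimal sketch (no catalog is loaded, so every lookup falls through to the documented defaults):

```python
import gettext

# With no catalog, NullTranslations implements the documented
# fallback behaviour: gettext() returns the message unchanged and
# ngettext() applies the English-style rule (singular only for n == 1).
t = gettext.NullTranslations()

greeting = t.gettext('hello world')
one = t.ngettext('%d file', '%d files', 1) % 1
many = t.ngettext('%d file', '%d files', 3) % 3

print(greeting)  # hello world
print(one)       # 1 file
print(many)      # 3 files
```

This is also exactly what your application sees when gettext.translation(..., fallback=True) finds no catalog, which makes NullTranslations a convenient stand-in during development.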
urllib.robotparser: Parser for robots.txt

Source code: Lib/urllib/robotparser.py

This module provides a single class, RobotFileParser, which answers questions about whether or not a particular user agent can fetch a URL on the web site that published the robots.txt file. For more details on the structure of robots.txt files, see http://www.robotstxt.org/orig.html.

class urllib.robotparser.RobotFileParser(url='')
This class provides methods to read, parse and answer questions about the robots.txt file at url.

set_url(url)
Sets the URL referring to a robots.txt file.

read()
Reads the robots.txt URL and feeds it to the parser.

parse(lines)
Parses the lines argument.

can_fetch(useragent, url)
Returns True if the useragent is allowed to fetch the url according to the rules contained in the parsed robots.txt file.

mtime()
Returns the time the robots.txt file was last fetched. This is useful for long-running web spiders that need to check for new robots.txt files periodically.

modified()
Sets the time the robots.txt file was last fetched to the current time.

crawl_delay(useragent)
Returns the value of the Crawl-delay parameter from robots.txt for the useragent in question. If there is no such parameter or it doesn't apply to the useragent specified or the robots.txt entry for this parameter has invalid syntax, return None.
New in version 3.6.

request_rate(useragent)
Returns the contents of the Request-rate parameter from robots.txt as a named tuple RequestRate(requests, seconds). If there is no such parameter or it doesn't apply to the useragent specified or the robots.txt entry for this parameter has invalid syntax, return None.
New in version 3.6.

site_maps()
Returns the contents of the Sitemap parameter from robots.txt in the form of a list. If there is no such parameter or the robots.txt entry for this parameter has invalid syntax, return None.
New in version 3.8.

The following example demonstrates basic use of the RobotFileParser class:

    >>> import urllib.robotparser
    >>> rp = urllib.robotparser.RobotFileParser()
    >>> rp.set_url("http://www.musi-cal.com/robots.txt")
    >>> rp.read()
    >>> rrate = rp.request_rate("*")
    >>> rrate.requests
    3
    >>> rrate.seconds
    20
    >>> rp.crawl_delay("*")
    6
    >>> rp.can_fetch("*", "http://www.musi-cal.com/cgi-bin/search?city=San+Francisco")
    False
    >>> rp.can_fetch("*", "http://www.musi-cal.com/")
    True
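Since the example above depends on a live web site, here is an offline variation that feeds hypothetical robots.txt rules straight to parse(); the host, paths, and rule values are made up for illustration:

```python
import urllib.robotparser

# Hypothetical robots.txt content; in real use, set_url() plus read()
# would fetch this from the site instead.
robots_txt = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
Request-rate: 3/20
Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())   # parse() accepts an iterable of lines

print(rp.can_fetch("*", "https://example.com/index.html"))  # allowed
print(rp.can_fetch("*", "https://example.com/private/a"))   # blocked by Disallow
print(rp.crawl_delay("*"))                                  # 10
rate = rp.request_rate("*")                                 # RequestRate named tuple
print(rate.requests, rate.seconds)
print(rp.site_maps())               # list of Sitemap URLs (Python 3.8+)
```

Calling parse() also records a fetch time (as modified() would), so can_fetch() answers immediately instead of assuming everything is disallowed.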
Superseded Modules

The modules described in this chapter are deprecated and only kept for backwards compatibility. They have been superseded by other modules.

    aifc: Read and write AIFF and AIFC files
    audioop: Manipulate raw audio data
    cgi: Common Gateway Interface support
    cgitb: Traceback manager for CGI scripts
    chunk: Read IFF chunked data
    crypt: Function to check Unix passwords
    imghdr: Determine the type of an image
    mailcap: Mailcap file handling
    msilib: Read and write Microsoft Installer files
    nis: Interface to Sun's NIS (Yellow Pages)
    nntplib: NNTP protocol client
    optparse: Parser for command line options
    ossaudiodev: Access to OSS-compatible audio devices
    pipes: Interface to shell pipelines
    sndhdr: Determine type of sound file
    spwd: The shadow password database
    sunau: Read and write Sun AU files
    telnetlib: Telnet client
    uu: Encode and decode uuencode files
    xdrlib: Encode and decode XDR data
email.message.Message: Representing an email message using the compat32 API

The Message class is very similar to the EmailMessage class, without the methods added by that class, and with the default behavior of certain other methods being slightly different. We also document here some methods that, while supported by the EmailMessage class, are not recommended unless you are dealing with legacy code. The philosophy and structure of the two classes is otherwise the same.

This document describes the behavior under the default (for Message) policy Compat32. If you are going to use another policy, you should be using the EmailMessage class instead.

An email message consists of headers and a payload. Headers must be RFC 5322 style names and values, where the field name and value are separated by a colon. The colon is not part of either the field name or the field value. The payload may be a simple text message, or a binary object, or a structured sequence of sub-messages each with their own set of headers and their own payload. The latter type of payload is indicated by the message having a MIME type such as multipart/* or message/rfc822.

The conceptual model provided by a Message object is that of an ordered dictionary of headers with additional methods for accessing both specialized information from the headers, for accessing the payload, for generating a serialized version of the message, and for recursively walking over the object tree. Note that duplicate headers are supported, but special methods must be used to access them.

The Message pseudo-dictionary is indexed by the header names, which must be ASCII values. The values of the dictionary are strings that are supposed to contain only ASCII characters; there is some special handling for non-ASCII input, but it doesn't always produce the correct results. Headers are stored and returned in case-preserving form, but field names are matched case-insensitively. There may also be a single envelope header, also known as the Unix From header or the From_ header. The payload is either a string or bytes, in the case of simple message objects, or a list of Message objects, for MIME container documents (e.g. multipart/* and message/rfc822).

Here are the methods of the Message class:

class email.message.Message(policy=compat32)
If policy is specified (it must be an instance of a policy class) use the rules it specifies to update and serialize the representation of the message. If policy is not set, use the compat32 policy, which maintains backward compatibility with the Python 3.2 version of the email package. For more information see the policy documentation.
Changed in version 3.3: The policy keyword argument was added.

as_string(unixfrom=False, maxheaderlen=0, policy=None)
Return the entire message flattened as a string. When optional unixfrom is true, the envelope header is included in the returned string. unixfrom defaults to False. For backward compatibility reasons, maxheaderlen defaults to 0, so if you want a different value you must override it explicitly (the value specified for max_line_length in the policy will be ignored by this method). The policy argument may be used to override the default policy obtained from the message instance. This can be used to control some of the formatting produced by the method, since the specified policy will be passed to the Generator.

Flattening the message may trigger changes to the Message if defaults need to be filled in to complete the transformation to a string (for example, MIME boundaries may be generated or modified).

Note that this method is provided as a convenience and may not always format the message the way you want. For example, by default it does not do the mangling of lines that begin with From that is required by the Unix mbox format. For more flexibility, instantiate a Generator instance and use its flatten() method directly. For example:

    from io import StringIO
    from email.generator import Generator
    fp = StringIO()
    g = Generator(fp, mangle_from_=True, maxheaderlen=60)
    g.flatten(msg)
    text = fp.getvalue()

If the message object contains binary data that is not encoded according to RFC standards, the non-compliant data
will be replaced by unicode "unknown character" code points. (See also as_bytes() and BytesGenerator.)
Changed in version 3.4: the policy keyword argument was added.

__str__()
Equivalent to as_string(). Allows str(msg) to produce a string containing the formatted message.

as_bytes(unixfrom=False, policy=None)
Return the entire message flattened as a bytes object. When optional unixfrom is true, the envelope header is included in the returned string. unixfrom defaults to False.

The policy argument may be used to override the default policy obtained from the message instance. This can be used to control some of the formatting produced by the method, since the specified policy will be passed to the BytesGenerator.

Flattening the message may trigger changes to the Message if defaults need to be filled in to complete the transformation to a string (for example, MIME boundaries may be generated or modified).

Note that this method is provided as a convenience and may not always format the message the way you want. For example, by default it does not do the mangling of lines that begin with From that is required by the Unix mbox format. For more flexibility, instantiate a BytesGenerator instance and use its flatten() method directly. For example:

    from io import BytesIO
    from email.generator import BytesGenerator
    fp = BytesIO()
    g = BytesGenerator(fp, mangle_from_=True, maxheaderlen=60)
    g.flatten(msg)
    text = fp.getvalue()

New in version 3.4.

__bytes__()
Equivalent to as_bytes(). Allows bytes(msg) to produce a bytes object containing the formatted message.
New in version 3.4.

is_multipart()
Return True if the message's payload is a list of sub-Message objects, otherwise return False. When is_multipart() returns False, the payload should be a string object (which might be a CTE encoded binary payload). Note that is_multipart() returning True does not necessarily mean that msg.get_content_maintype() == 'multipart' will return True. For example, is_multipart will return True when the Message is of type message/rfc822.

set_unixfrom(unixfrom)
Set the message's envelope header to unixfrom, which should be a string.

get_unixfrom()
Return the message's envelope header. Defaults to None if the envelope header was never set.

attach(payload)
Add the given payload to the current payload, which must be None or a list of Message objects before the call. After the call, the payload will always be a list of Message objects. If you want to set the payload to a scalar object (e.g. a string), use set_payload() instead.
This is a legacy method. On the EmailMessage class, its functionality is replaced by set_content() and the related make and add methods.

get_payload(i=None, decode=False)
Return the current payload, which will be a list of Message objects when is_multipart() is True, or a string when is_multipart() is False. If the payload is a list and you mutate the list object, you modify the message's payload in place.

With optional argument i, get_payload() will return the i-th element of the payload, counting from zero, if is_multipart() is True. An IndexError will be raised if i is less than 0 or greater than or equal to the number of items in the payload. If the payload is a string (i.e. is_multipart() is False) and i is given, a TypeError is raised.

Optional decode is a flag indicating whether the payload should be decoded or not, according to the Content-Transfer-Encoding header. When True and the message is not a multipart, the payload will be decoded if this header's value is quoted-printable or base64. If some other encoding is used, or the Content-Transfer-Encoding header is missing, the payload is returned as-is (undecoded). In all cases the returned value is binary data. If the message is a multipart and the decode flag is True, then None is returned. If the payload is base64 and it was not perfectly formed (missing padding, characters outside the base64 alphabet), then an appropriate defect will be added to the message's defect property (InvalidBase64PaddingDefect or InvalidBase64CharactersDefect, respectively).

When decode is False (the default) the body is returned as a string without decoding the Content-Transfer-Encoding. However, for a Content-Transfer-Encoding of 8bit, an attempt is made to decode the original bytes using the charset specified by the Content-Type header, using the replace error handler. If no charset is specified, or if the charset given is not recognized by the email package, the body is decoded using the default ASCII charset.

This is a legacy method. On the EmailMessage class, its functionality is replaced by get_content() and iter_parts().

set_payload(payload, charset=None)
Set the entire message object's payload to payload. It is the client's responsibility to ensure the payload invariants. Optional charset sets the message's default character set; see set_charset() for details.
This is a legacy method. On the EmailMessage class, its functionality is replaced by set_content().

set_charset(charset)
Set the character set of the payload to charset, which can either be a Charset instance (see email.charset), a string naming a character set, or None. If it is a string, it will be converted to a Charset instance. If charset is None, the charset parameter will be removed from the Content-Type header (the message will not be otherwise modified). Anything else will generate a TypeError.

If there is no existing MIME-Version header, one will be added. If there is no existing Content-Type header, one will be added with a value of text/plain. Whether the Content-Type header already exists or not, its charset parameter will be set to charset.output_charset. If charset.input_charset and charset.output_charset differ, the payload will be re-encoded to the output_charset. If there is no existing Content-Transfer-Encoding header, then the payload will be transfer encoded, if needed, using the specified Charset, and a header with the appropriate value will be added. If a Content-Transfer-Encoding header already exists, the payload is assumed to already be correctly encoded using that Content-Transfer-Encoding and is not modified.

This is a legacy method. On the EmailMessage class, its functionality is replaced by the charset parameter of the email.emailmessage.EmailMessage.set_content() method.
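A short sketch of the legacy API described so far, showing duplicate headers, the never-raising item access, and a decoded payload (the header values here are made up for illustration):

```python
from email.message import Message

msg = Message()
msg['Subject'] = 'Hello'
msg['Received'] = 'by relay1'   # __setitem__ appends ...
msg['Received'] = 'by relay2'   # ... so duplicate headers are preserved

# set_payload with a charset also adds Content-Type and
# Content-Transfer-Encoding headers via set_charset().
msg.set_payload('body text', charset='utf-8')

print(msg.get_content_type())        # text/plain
print(msg['X-Missing'])              # missing header -> None, never KeyError
print(msg.get_all('Received'))       # both duplicate values, in order
print(msg.get_payload(decode=True))  # CTE-decoded payload, as bytes
print(msg.is_multipart())            # scalar payload -> False
```

With decode=True the payload comes back as bytes with the Content-Transfer-Encoding undone, which is usually what you want when handing the body to other code.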
get_charset()
Return the Charset instance associated with the message's payload.
This is a legacy method. On the EmailMessage class it always returns None.

The following methods implement a mapping-like interface for accessing the message's RFC 2822 headers. Note that there are some semantic differences between these methods and a normal mapping (i.e. dictionary) interface. For example, in a dictionary there are no duplicate keys, but here there may be duplicate message headers. Also, in dictionaries there is no guaranteed order to the keys returned by keys(), but in a Message object, headers are always returned in the order they appeared in the original message, or were added to the message later. Any header deleted and then re-added is always appended to the end of the header list.

These semantic differences are intentional and are biased toward maximal convenience. Note that in all cases, any envelope header present in the message is not included in the mapping interface.

In a model generated from bytes, any header values that (in contravention of the RFCs) contain non-ASCII bytes will, when retrieved through this interface, be represented as Header objects with a charset of unknown-8bit.

__len__()
Return the total number of headers, including duplicates.

__contains__(name)
Return True if the message object has a field named name. Matching is done case-insensitively and name should not include the trailing colon. Used for the in operator, e.g.:

    if 'message-id' in myMessage:
        print('Message-ID:', myMessage['message-id'])

__getitem__(name)
Return the value of the named header field. name should not include the colon field separator. If the header is missing, None is returned; a KeyError is never raised.

Note that if the named field appears more than once in the message's headers, exactly which of those field values will be returned is undefined. Use the get_all() method to get the values of all the extant named headers.

__setitem__(name, val)
Add a header to the message with field name name and value val. The field is appended to the end of the message's existing fields.

Note that this does not overwrite or delete any existing header with the same name. If you want to ensure that the new header is the only one present in the message with field name name, delete the field first, e.g.:

    del msg['subject']
    msg['subject'] = 'Python roolz!'

__delitem__(name)
Delete all occurrences of the field with name name from the message's headers. No exception is raised if the named field isn't present in the headers.

keys()
Return a list of all the message's header field names.

values()
Return a list of all the message's field values.

items()
Return a list of 2-tuples containing all the message's field headers and values.

get(name, failobj=None)
Return the value of the named header field. This is identical to __getitem__() except that optional failobj is returned if the named header is missing (defaults to None).

Here are some additional useful methods:

get_all(name, failobj=None)
Return a list of all the values for the field named name. If there are no such named headers in the message, failobj is returned (defaults to None).

add_header(_name, _value, **_params)
Extended header setting. This method is similar to __setitem__() except that additional header parameters can be provided as keyword arguments. _name is the header field to add and _value is the primary value for the header.

For each item in the keyword argument dictionary _params, the key is taken as the parameter name, with underscores converted to dashes (since dashes are illegal in Python identifiers). Normally, the parameter will be added as key="value" unless the value is None, in which case only the key will be added. If the value contains non-ASCII characters, it can be specified as a three tuple in the format (CHARSET, LANGUAGE, VALUE), where CHARSET is a string naming the charset to be used to encode the value, LANGUAGE can usually be set to None or the empty string (see RFC 2231 for other possibilities), and VALUE is the string value containing non-ASCII code points. If a three tuple is not passed and the value contains non-ASCII characters, it is automatically encoded in RFC 2231 format using a CHARSET of utf-8 and a LANGUAGE of None.

Here's an example:

    msg.add_header('Content-Disposition', 'attachment', filename='bud.gif')

This will add a header that looks like:

    Content-Disposition: attachment; filename="bud.gif"

An example with non-ASCII characters:

    msg.add_header('Content-Disposition', 'attachment',
                   filename=('iso-8859-1', '', 'Fußballer.ppt'))

Which produces:

    Content-Disposition: attachment; filename*="iso-8859-1''Fu%DFballer.ppt"

replace_header(_name, _value)
Replace a header. Replace the first header found in the message that matches _name, retaining header order and field name case. If no matching header was found, a KeyError is raised.

get_content_type()
Return the message's content type. The returned string is coerced to lower case of the form maintype/subtype. If there was no Content-Type header in the message, the default type as given by get_default_type() will be returned. Since according to RFC 2045, messages always have a default type, get_content_type() will always return a value.

RFC 2045 defines a message's default type to be text/plain unless it appears inside a multipart/digest container, in which case it would be message/rfc822. If the Content-Type header has an invalid type specification, RFC 2045 mandates that the default type be text/plain.

get_content_maintype()
Return the message's main content type. This is the maintype part of the string returned by get_content_type().

get_content_subtype()
Return the message's sub-content type. This is the subtype part of the string returned by get_content_type().

get_default_type()
Return the default content type. Most messages have a default content type of text/plain, except for messages that are subparts of multipart/digest containers. Such subparts have a default content type of message/rfc822.

set_default_type(ctype)
Set the default content type. ctype should either be text/plain or message/rfc822, although this is not enforced. The default content type is not stored in the Content-Type header.

get_params(failobj=None, header='content-type', unquote=True)
Return the message's Content-Type
parameters, as a list. The elements of the returned list are 2-tuples of key/value pairs, as split on the '=' sign. The left hand side of the '=' is the key, while the right hand side is the value. If there is no '=' sign in the parameter the value is the empty string, otherwise the value is as described in get_param() and is unquoted if optional unquote is True (the default).

Optional failobj is the object to return if there is no Content-Type header. Optional header is the header to search instead of Content-Type.

This is a legacy method. On the EmailMessage class, its functionality is replaced by the params property of the individual header objects returned by the header access methods.

get_param(param, failobj=None, header='content-type', unquote=True)
Return the value of the Content-Type header's parameter param as a string. If the message has no Content-Type header or if there is no such parameter, then failobj is returned (defaults to None).

Optional header, if given, specifies the message header to use instead of Content-Type.

Parameter keys are always compared case insensitively. The return value can either be a string, or a 3-tuple if the parameter was RFC 2231 encoded. When it's a 3-tuple, the elements of the value are of the form (CHARSET, LANGUAGE, VALUE). Note that both CHARSET and LANGUAGE can be None, in which case you should consider VALUE to be encoded in the us-ascii charset. You can usually ignore LANGUAGE.

If your application doesn't care whether the parameter was encoded as in RFC 2231, you can collapse the parameter value by calling email.utils.collapse_rfc2231_value(), passing in the return value from get_param(). This will return a suitably decoded Unicode string when the value is a tuple, or the original string unquoted if it isn't. For example:

    rawparam = msg.get_param('foo')
    param = email.utils.collapse_rfc2231_value(rawparam)

In any case, the parameter value (either the returned string, or the VALUE item in the 3-tuple) is always unquoted, unless unquote is set to False.

This is a legacy method. On the EmailMessage class, its functionality is replaced by the params property of the individual header objects returned by the header access methods.

set_param(param, value, header='Content-Type', requote=True, charset=None, language='', replace=False)
Set a parameter in the Content-Type header. If the parameter already exists in the header, its value will be replaced with value. If the Content-Type header has not yet been defined for this message, it will be set to text/plain and the new parameter value will be appended as per RFC 2045.

Optional header specifies an alternative header to Content-Type, and all parameters will be quoted as necessary unless optional requote is False (the default is True).

If optional charset is specified, the parameter will be encoded according to RFC 2231. Optional language specifies the RFC 2231 language, defaulting to the empty string. Both charset and language should be strings.

If replace is False (the default) the header is moved to the end of the list of headers. If replace is True, the header will be updated in place.

Changed in version 3.4: replace keyword was added.

del_param(param, header='content-type', requote=True)
Remove the given parameter completely from the Content-Type header. The header will be re-written in place without the parameter or its value. All values will be quoted as necessary unless requote is False (the default is True). Optional header specifies an alternative to Content-Type.

set_type(type, header='Content-Type', requote=True)
Set the main type and subtype for the Content-Type header. type must be a string in the form maintype/subtype, otherwise a ValueError is raised.

This method replaces the Content-Type header, keeping all the parameters in place. If requote is False, this leaves the existing header's quoting as is, otherwise the parameters will be quoted (the default).

An alternative header can be specified in the header argument. When the Content-Type header is set, a MIME-Version header is also added.

This is a legacy method. On the EmailMessage class, its functionality is replaced by the make_ and add_ methods.

get_filename(failobj=None)
Return the value of the filename parameter of the Content-Disposition header of the message. If the header does not have a filename parameter