idx
int64
0
24.9k
question
stringlengths
68
4.14k
target
stringlengths
9
749
18,800
def deleteDuplicateRecords ( fnames , fids = nil , options = nil , dbid = @dbid ) num_deleted = 0 if options and not options . is_a? ( Hash ) raise "deleteDuplicateRecords: 'options' parameter must be a Hash" else options = { } options [ "keeplastrecord" ] = true options [ "ignoreCase" ] = true end findDuplicateRecordI...
Finds records with the same values in a specified list of fields and deletes all but the first or last duplicate record . The field list may be a list of field IDs or a list of field names . The options parameter can be used to keep the oldest record instead of the newest record and to control whether to ignore the cas...
18,801
def copyRecord ( rid , numCopies = 1 , dbid = @dbid ) clearFieldValuePairList getRecordInfo ( dbid , rid ) { | field | if field and field . elements [ "value" ] and field . elements [ "value" ] . has_text? if field . elements [ "fid" ] . text . to_i > 5 addFieldValuePair ( field . elements [ "name" ] . text , nil , nil...
Make one or more copies of a record .
18,802
def _importFromExcel ( excelFilename , lastColumn = 'j' , lastDataRow = 0 , worksheetNumber = 1 , fieldNameRow = 1 , firstDataRow = 2 , firstColumn = 'a' ) importFromExcel ( @dbid , excelFilename , lastColumn , lastDataRow , worksheetNumber , fieldNameRow , firstDataRow , firstColumn ) end
Import data directly from an Excel file into the active table .
18,803
def importCSVFile ( filename , dbid = @dbid , targetFieldNames = nil , validateLines = true ) importSVFile ( filename , "," , dbid , targetFieldNames , validateLines ) end
Add records from lines in a CSV file . If dbid is not specified the active table will be used . values in subsequent lines . The file must not contain commas inside field names or values .
18,804
def importTSVFile ( filename , dbid = @dbid , targetFieldNames = nil , validateLines = true ) importSVFile ( filename , "\t" , dbid , targetFieldNames , validateLines ) end
Import records from a text file in Tab - Separated - Values format .
18,805
def makeSVFile ( filename , fieldSeparator = "," , dbid = @dbid , query = nil , qid = nil , qname = nil ) File . open ( filename , "w" ) { | file | if dbid doQuery ( dbid , query , qid , qname ) end if @records and @fields output = "" fieldNamesBlock = proc { | element | if element . is_a? ( REXML :: Element ) and elem...
Make a CSV file using the results of a query . Specify a different separator in the second paramater . Fields containing the separator will be double - quoted .
18,806
def makeCSVFileForReport ( filename , dbid = @dbid , query = nil , qid = nil , qname = nil ) csv = getCSVForReport ( dbid , query , qid , qname ) File . open ( filename , "w" ) { | f | f . write ( csv || "" ) } end
Create a CSV file using the records for a Report .
18,807
def getCSVForReport ( dbid , query = nil , qid = nil , qname = nil ) genResultsTable ( dbid , query , nil , nil , nil , nil , "csv" , qid , qname ) end
Get the CSV data for a Report .
18,808
def doSQLUpdate ( sqlString ) sql = sqlString . dup dbname = "" state = nil fieldName = "" fieldValue = "" sqlQuery = "SELECT 3 FROM " clearFieldValuePairList sql . split ( ' ' ) . each { | token | case token when "UPDATE" state = "getTable" unless state == "getFilter" next when "SET" state = "getFieldName" unless stat...
Translate a simple SQL UPDATE statement to a QuickBase editRecord call .
18,809
def doSQLInsert ( sqlString ) sql = sqlString . dup dbname = "" state = nil fieldName = "" fieldValue = "" fieldNames = [ ] fieldValues = [ ] index = 0 clearFieldValuePairList sql . gsub! ( "(" , " " ) sql . gsub! ( ")" , " " ) sql . split ( ' ' ) . each { | token | case token when "INSERT" , "INTO" state = "getTable" ...
Translate a simple SQL INSERT statement to a QuickBase addRecord call .
18,810
def eachField ( record = @record ) if record and block_given? record . each { | field | if field . is_a? ( REXML :: Element ) and field . name == "f" and field . attributes [ "id" ] @field = field yield field end } end nil end
Iterate record XML and yield only f elements .
18,811
def alias_methods aliased_methods = [ ] public_methods . each { | old_method | if old_method . match ( / / ) new_method = old_method . gsub ( / / ) { | uc | "_#{uc.downcase}" } aliased_methods << new_method instance_eval ( "alias #{new_method} #{old_method}" ) end } aliased_methods end
Add method aliases that follow the ruby method naming convention . E . g . sendRequest will be aliased as send_request .
18,812
def to_hash methods = instance_variables . map { | i | i . to_s . gsub ( '@' , '' ) } Hash [ * methods . map { | m | [ m , send ( m . to_sym ) ] } . flatten ] end
Create a new cookbook from the given hash .
18,813
def _referenced_object_ids @data . each . select do | v | v && v . respond_to? ( :is_poxreference? ) end . map { | o | o . id } end
Return a list of all object IDs of all persistend objects that this Array is referencing .
18,814
def generate_expected_value if expected_difference . is_a? Range ( before_value + expected_difference . first ) .. ( before_value + expected_difference . end ) else before_value + expected_difference end end
Generate the expected value .
18,815
def obfuscate ( name , seed = nil ) rnd = Random . new ( seed || @seed ) vowels = %w( A E I O U ) consonants = ( 'A' .. 'Z' ) . to_a - vowels digits = ( '0' .. '9' ) . to_a dict = Hash [ ( vowels + consonants + digits ) . zip ( vowels . shuffle ( random : rnd ) + consonants . shuffle ( random : rnd ) + digits . shuffle...
Obfuscate a name or address either with the given seed or default seed
18,816
def start_background ( wait = 5 ) @server = WEBrick :: HTTPServer . new ( :BindAddress => @options [ :host ] , :Port => @options [ :port ] , :AccessLog => [ ] , :Logger => WEBrick :: Log . new ( StringIO . new , 7 ) ) @server . mount ( '/' , Rack :: Handler :: WEBrick , app ) @thread = Thread . new { @server . start } ...
Start a Community Zero server in a forked process . This method returns the PID to the forked process .
18,817
def running? if @server . nil? || @server . status != :Running return false end uri = URI . join ( url , 'cookbooks' ) headers = { 'Accept' => 'application/json' } Timeout . timeout ( 0.1 ) { ! open ( uri , headers ) . nil? } rescue SocketError , Errno :: ECONNREFUSED , Timeout :: Error false end
Boolean method to determine if the server is currently ready to accept requests . This method will attempt to make an HTTP request against the server . If this method returns true you are safe to make a request .
18,818
def stop ( wait = 5 ) Timeout . timeout ( wait ) do @server . shutdown @thread . join ( wait ) if @thread end rescue Timeout :: Error if @thread $stderr . puts ( "Community Zero did not stop within #{wait} seconds! Killing..." ) @thread . kill end ensure @server = nil @thread = nil end
Gracefully stop the Community Zero server .
18,819
def app lambda do | env | request = Request . new ( env ) response = router . call ( request ) response [ - 1 ] = Array ( response [ - 1 ] ) response end end
The actual application the server will respond to .
18,820
def create_cookbook ( metadata , overrides = { } ) cookbook = Cookbook . new ( { :name => metadata . name , :category => nil , :maintainer => metadata . maintainer , :description => metadata . description , :version => metadata . version } . merge ( overrides ) ) store . add ( cookbook ) cookbook end
Create the cookbook from the metadata .
18,821
def find_metadata ( tarball ) gzip = Zlib :: GzipReader . new ( tarball [ :tempfile ] ) tar = Gem :: Package :: TarReader . new ( gzip ) tar . each do | entry | if entry . full_name =~ / \. / return Metadata . from_json ( entry . read ) elsif entry . full_name =~ / \. / return Metadata . from_ruby ( entry . read ) end ...
Parse the metadata from the tarball .
18,822
def finalize @transactions . sort! { | a , b | a . date <=> b . date } balance = Value . new ( 0 ) @balances = @transactions . map do | act_txn | balance += act_txn . value { date : act_txn . date , value : balance , } end end
Internal method used to complete initialization of the Account after all transactions have been associated with it .
18,823
def lookup_schema ( schema_key ) lookup_time = Time . now . getutc if schema_key . is_a? ( String ) schema_key = SchemaKey . parse_key ( schema_key ) end failures = [ ] cache_result = @cache [ schema_key ] if not cache_result . nil? if not @cacheTtl . nil? store_time = cache_result [ 1 ] time_diff = ( lookup_time - sto...
Lookup schema in cache or try to fetch
18,824
def validate ( json ) schema_key = Resolver . get_schema_key json data = Resolver . get_data json schema = lookup_schema schema_key JSON :: Validator . validate! ( schema , data ) end
Return true or throw exception
18,825
def visible_for? ( user ) can_view_page = ( if dropdown? true elsif menuable . kind_of? ( Effective :: Page ) menuable . roles_permit? ( user ) else true end ) can_view_menu_item = ( if roles_mask == nil true elsif roles_mask == - 1 user . blank? elsif roles_mask == 0 user . present? else roles_permit? ( user ) end ) c...
For now it s just logged in or not? This will work with effective_roles one day ...
18,826
def copy ( dir , options = { } ) sync new_db = Store . new ( dir , options ) new_db . sync i = 0 each do | ref_obj | obj = ref_obj . _referenced_object obj . _transfer ( new_db ) obj . _sync i += 1 end PEROBS . log . debug "Copied #{i} objects into new database at #{dir}" new_db . exit true end
Create a new Store .
18,827
def exit if @cache && @cache . in_transaction? @cache . abort_transaction @cache . flush @db . close if @db PEROBS . log . fatal "You cannot call exit() during a transaction: #{Kernel.caller}" end @cache . flush if @cache @db . close if @db @db = @class_map = @in_memory_objects = @stats = @cache = @root_objects = nil e...
Close the store and ensure that all in - memory objects are written out to the storage backend . The Store object is no longer usable after this method was called .
18,828
def new ( klass , * args ) unless klass . is_a? ( BasicObject ) PEROBS . log . fatal "#{klass} is not a BasicObject derivative" end obj = _construct_po ( klass , _new_id , * args ) @cache . cache_write ( obj ) obj . myself end
You need to call this method to create new PEROBS objects that belong to this Store .
18,829
def _construct_po ( klass , id , * args ) klass . new ( Handle . new ( self , id ) , * args ) end
For library internal use only! This method will create a new PEROBS object .
18,830
def sync if @cache . in_transaction? @cache . abort_transaction @cache . flush PEROBS . log . fatal "You cannot call sync() during a transaction: \n" + Kernel . caller . join ( "\n" ) end @cache . flush end
Flush out all modified objects to disk and shrink the in - memory list if needed .
18,831
def object_by_id ( id ) if ( ruby_object_id = @in_memory_objects [ id ] ) begin object = ObjectSpace . _id2ref ( ruby_object_id ) if object . is_a? ( ObjectBase ) && object . _id == id return object end rescue RangeError => e @in_memory_objects . delete ( id ) end end if ( obj = @cache . object_by_id ( id ) ) PEROBS . ...
Return the object with the provided ID . This method is not part of the public API and should never be called by outside users . It s purely intended for internal use .
18,832
def check ( repair = false ) stats = { :errors => 0 , :object_cnt => 0 } sync stats [ :errors ] += @db . check_db ( repair ) @db . clear_marks @progressmeter . start ( "Checking object link structure" , @db . item_counter ) do @root_objects . each do | name , id | check_object ( id , repair , stats ) end end if repair ...
This method can be used to check the database and optionally repair it . The repair is a pure structural repair . It cannot ensure that the stored data is still correct . E . g . if a reference to a non - existing or unreadable object is found the reference will simply be deleted .
18,833
def each @db . clear_marks stack = [ 0 ] + @root_objects . values while ! stack . empty? id = stack . pop next if @db . is_marked? ( id ) unless ( obj = object_by_id ( id ) ) PEROBS . log . fatal "Database is corrupted. Object with ID #{id} " + "not found." end @db . mark ( id ) yield ( obj . myself ) if block_given? o...
Calls the given block once for each object passing that object as a parameter .
18,834
def mark classes = Set . new marked_objects = 0 @progressmeter . start ( "Marking linked objects" , @db . item_counter ) do each do | obj | classes . add ( obj . class ) @progressmeter . update ( marked_objects += 1 ) end end @class_map . keep ( classes . map { | c | c . to_s } ) PEROBS . log . debug "#{marked_objects ...
Mark phase of a mark - and - sweep garbage collector . It will mark all objects that are reachable from the root objects .
18,835
def check_object ( start_id , repair , stats ) @db . mark ( start_id ) todo_list = [ [ nil , start_id ] ] while ! todo_list . empty? ref_obj , id = todo_list . pop begin obj = object_by_id ( id ) rescue PEROBS :: FatalError obj = nil end if obj @db . mark ( id ) obj . _referenced_object_ids . each do | refd_id | todo_l...
Check the object with the given start_id and all other objects that are somehow reachable from the start object .
18,836
def _referenced_object_ids ids = [ ] _all_attributes . each do | attr | value = instance_variable_get ( ( '@' + attr . to_s ) . to_sym ) ids << value . id if value && value . respond_to? ( :is_poxreference? ) end ids end
Return a list of all object IDs that the attributes of this instance are referencing .
18,837
def _delete_reference_to_id ( id ) _all_attributes . each do | attr | ivar = ( '@' + attr . to_s ) . to_sym value = instance_variable_get ( ivar ) if value && value . respond_to? ( :is_poxreference? ) && value . id == id instance_variable_set ( ivar , nil ) end end mark_as_modified end
This method should only be used during store repair operations . It will delete all references to the given object ID .
18,838
def inspect "<#{self.class}:#{@_id}>\n{\n" + _all_attributes . map do | attr | ivar = ( '@' + attr . to_s ) . to_sym if ( value = instance_variable_get ( ivar ) ) . respond_to? ( :is_poxreference? ) " #{attr} => <PEROBS::ObjectBase:#{value._id}>" else " #{attr} => #{value.inspect}" end end . join ( ",\n" ) + "\n}\n" ...
Textual dump for debugging purposes
18,839
def _serialize attributes = { } _all_attributes . each do | attr | ivar = ( '@' + attr . to_s ) . to_sym value = instance_variable_get ( ivar ) attributes [ attr . to_s ] = value . respond_to? ( :is_poxreference? ) ? POReference . new ( value . id ) : value end attributes end
Return a single data structure that holds all persistent data for this class .
18,840
def add? ( record ) result = @set . add? ( formatted_name ( record ) ) return result if result flag ( record ) result end
Add record to set . Flag if covered already .
18,841
def method_missing ( mthd , * args , & block ) new_recs = args . reduce ( [ ] ) { | a , i | a . push ( formatted_name ( i ) ) if i . class . ancestors . include? ( ActiveRecord :: Base ) ; a } result = @set . send ( mthd , * ( args . map do | arg | arg . class . ancestors . include? ( ActiveRecord :: Base ) ? formatted...
method_missing will transform any record argument into a formatted string and pass the method and arguments on to the internal Set . Also will flag any existing records covered .
18,842
def get_result ( args ) xml = case args when Hash args . delete ( :xml ) else xml = args args = { } xml end args [ :range ] ||= 1000 .. 1999 if ! ( mq = xml . xpath ( 'epp:epp/epp:response/epp:msgQ' , EPPClient :: SCHEMAS_URL ) ) . empty? @msgQ_count = mq . attribute ( 'count' ) . value . to_i @msgQ_id = mq . attribute...
Takes a xml response and checks that the result is in the right range of results that is between 1000 and 1999 which are results meaning all went well .
18,843
def command ( * args , & _block ) builder do | xml | xml . command do if block_given? yield xml else command = args . shift command . call ( xml ) args . each do | ext | xml . extension do ext . call ( xml ) end end end xml . clTRID ( clTRID ) end end end
Creates the xml for the command .
18,844
def create_http http = Net :: HTTP . new ( API_URI . host , API_URI . port ) http . use_ssl = true apply_ca_cert_path ( http ) apply_pem ( http ) http end
Creates HTTPS handler .
18,845
def get_form_data ( defopts , opts ) io_encoding = 'utf8' if ! io_encoding || io_encoding == DEFAULT_IO_ENCODING opts = opts . to_hash if opts . respond_to? ( :to_hash ) req_contract = RegApi2 :: RequestContract . new ( defopts ) opts = req_contract . validate ( opts ) form = { 'username' => username || DEFAULT_USERNAM...
Gets form data for POST request
18,846
def startup! @started = true try ( :did_start_up ) if config [ 'nickserv_password' ] privmsg ( "identify #{config['nickserv_password']}" , "nickserv" ) Thread . new do sleep ( 10 ) finalize_startup end else finalize_startup end end
Will join channels specified in configuration .
18,847
def parse ( message ) puts "<< #{message.to_s.strip}" words = message . split ( " " ) sender = words [ 0 ] raw = words [ 1 ] channel = words [ 2 ] if / \s / . match ( message ) response ( "PONG #{$1}" ) elsif / \d / . match ( raw ) send ( "handle_#{raw}" , message ) if raws_to_handle . include? ( raw ) elsif raw == "PR...
What did they say?
18,848
def [] ( key , add_on = nil ) case key when / \A \d \d \d \z / by_code_and_add_on ( $1 . to_i , $2 . to_i ) when 100_000 .. 999_999 by_code_and_add_on ( * key . divmod ( 100 ) ) when 0 .. 9999 , / \A \d \z / case add_on when nil by_code ( key . to_i ) when 0 .. 99 , / \A \d \z / by_code_and_add_on ( key . to_i , add_on...
A convenience method to get one or many zip codes by code code and add - on code and city or just city . There are various allowed styles to pass those values . All numeric values can be passed either as Integer or String . You can pass the code and add - on as six - digit number or you can pass the code as four digit ...
18,849
def poop_deck ( brig ) if BRIGANTINE_OPTIONS . include? ( brig ) && ! self . flies . empty? self . old_csv_dump ( brig ) elsif brig . is_a? ( String ) "#{self.analemma}#{brig}" else "#{self.analemma}#{self.swabbie}#{self.aft}" end end
complete file path
18,850
def unfurl wibbly = self . waggoner == '' ? '' : Regexp . escape ( self . waggoner ) timey = self . sand_glass == '' ? '' : '\.\d+' wimey = self . gibbet == '' ? '' : Regexp . escape ( self . gibbet ) Regexp . new ( "#{wibbly}#{timey}#{wimey}" ) end
Regex for matching dumped CSVs
18,851
def binnacle ( join_value , humanize = true ) self . booty . map do | compass | string = compass . is_a? ( Hash ) ? self . run_through ( compass , join_value ) : compass . is_a? ( String ) ? compass : compass . is_a? ( Symbol ) ? compass . to_s : compass . to_s humanize ? string . to_s . gsub ( / / , "" ) . gsub ( / / ...
returns an array of strings for CSV header based on booty
18,852
def boatswain return self . swabbie unless self . swabbie . nil? highval = 0 self . axe . each do | flotsam | counter = self . filibuster ( flotsam ) highval = ( ( highval <=> counter ) == 1 ) ? highval : counter end ".#{highval + 1}" end
File increment for next CSV to dump
18,853
def write begin buf = [ @flags , @length , @id , @crc ] . pack ( FORMAT ) crc = Zlib . crc32 ( buf , 0 ) @file . seek ( @addr ) @file . write ( buf + [ crc ] . pack ( 'L' ) ) rescue IOError => e PEROBS . log . fatal "Cannot write blob header into flat file DB: " + e . message end end
Write the header to a given File .
18,854
def dom_updated? ( delay : 1.1 ) element_call do begin driver . manage . timeouts . script_timeout = delay + 1 driver . execute_async_script ( DOM_OBSERVER , wd , delay ) rescue Selenium :: WebDriver :: Error :: StaleElementReferenceError retry rescue Selenium :: WebDriver :: Error :: JavascriptError => e retry if e . ...
This method makes a call to execute_async_script which means that the DOM observer script must explicitly signal that it is finished by invoking a callback . In this case the callback is nothing more than a delay . The delay is being used to allow the DOM to be updated before script actions continue .
18,855
def serialize ( obj ) begin case @serializer when :marshal Marshal . dump ( obj ) when :json obj . to_json when :yaml YAML . dump ( obj ) end rescue => e PEROBS . log . fatal "Cannot serialize object as #{@serializer}: " + e . message end end
Serialize the given object using the object serializer .
18,856
def deserialize ( raw ) begin case @serializer when :marshal Marshal . load ( raw ) when :json JSON . parse ( raw , :create_additions => true ) when :yaml YAML . load ( raw ) end rescue => e PEROBS . log . fatal "Cannot de-serialize object with #{@serializer} " + "parser: " + e . message end end
De - serialize the given String into a Ruby object .
18,857
def check_option ( name ) value = instance_variable_get ( '@' + name ) if @config . include? ( name ) unless @config [ name ] == value instance_variable_set ( '@' + name , @config [ name ] ) end else @config [ name ] = value end end
Check a config option and adjust it if needed .
18,858
def ensure_dir_exists ( dir ) unless Dir . exist? ( dir ) begin Dir . mkdir ( dir ) rescue IOError => e PEROBS . log . fatal "Cannote create DB directory '#{dir}': #{e.message}" end end end
Ensure that we have a directory to store the DB items .
18,859
def match? gemfile_lock_path = File . join ( Configuration . instance . get ( :path ) , 'Gemfile.lock' ) if File . exists? gemfile_lock_path parser = Bundler :: LockfileParser . new ( File . read ( gemfile_lock_path ) ) if spec = parser . specs . find { | spec | spec . name == @name } Gem :: Version . new ( spec . vers...
Initialize a gem_spec .
18,860
def response_hash_for ( cookbook ) { 'cookbook' => url_for ( cookbook ) , 'average_rating' => cookbook . average_rating , 'version' => cookbook . version , 'license' => cookbook . license , 'file' => "http://s3.amazonaws.com/#{cookbook.name}.tgz" , 'tarball_file_size' => cookbook . name . split ( '' ) . map ( & :ord ) ...
The response hash for this cookbook .
18,861
def order ( * ordering ) @order = ordering . map do | c | if c . kind_of? ( String ) { :column => c , :ascending => true } else c . symbolize_keys! end end self end
Order the results by the specified columns .
18,862
def close begin @f . flush @f . flock ( File :: LOCK_UN ) @f . close rescue IOError => e PEROBS . log . fatal "Cannot close stack file #{@file_name}: #{e.message}" end end
Close the stack file . This method must be called before the program is terminated to avoid data loss .
18,863
def push ( bytes ) if bytes . length != @entry_bytes PEROBS . log . fatal "All stack entries must be #{@entry_bytes} " + "long. This entry is #{bytes.length} bytes long." end begin @f . seek ( 0 , IO :: SEEK_END ) @f . write ( bytes ) rescue => e PEROBS . log . fatal "Cannot push to stack file #{@file_name}: #{e.messag...
Push the given bytes onto the stack file .
18,864
def pop begin return nil if @f . size == 0 @f . seek ( - @entry_bytes , IO :: SEEK_END ) bytes = @f . read ( @entry_bytes ) @f . truncate ( @f . size - @entry_bytes ) @f . flush rescue => e PEROBS . log . fatal "Cannot pop from stack file #{@file_name}: " + e . message end bytes end
Pop the last entry from the stack file .
18,865
def get_comments ( bugs ) params = { } if bugs . kind_of? ( Array ) then params [ 'ids' ] = bugs elsif bugs . kind_of? ( Integer ) || bugs . kind_of? ( String ) then params [ 'ids' ] = [ bugs ] else raise ArgumentError , sprintf ( "Unknown type of arguments: %s" , bugs . class ) end result = comments ( params ) result ...
def get_bugs = begin rdoc
18,866
def when_ready ( simple_check = false , & _block ) already_marked_ready = ready unless simple_check no_ready_check_possible unless block_given? end self . ready = ready? not_ready_validation ( ready_error || 'NO REASON PROVIDED' ) unless ready yield self if block_given? ensure self . ready = already_marked_ready end
The when_ready method is called on an instance of an interface . This executes the provided validation block after the page has been loaded . The Ready object instance is yielded into the block .
18,867
def ready_validations_pass? self . class . ready_validations . all? do | validation | passed , message = instance_eval ( & validation ) self . ready_error = message if message && ! passed passed end end
This method checks if the ready validations that have been specified have passed . If any ready validation fails no matter if others have succeeded this method immediately returns false .
18,868
def match? match = false @instance . current_node . recursive_children do | child_node | match = match || ( child_node && child_node . match? ( @rules ) ) end ! match end
check if none of child node matches the rules .
18,869
def add_rid_related_validations ( options ) validates ( options [ :field ] , presence : true ) validates ( options [ :field ] , uniqueness : true ) if options [ :random_generation_method ] != :uuid end
Add the rid related validations to the model .
18,870
def define_rid_accessors ( related_class , relationship_name ) define_method ( "#{relationship_name}_rid" ) do self . send ( relationship_name ) . try ( random_unique_id_options [ :field ] ) end define_method ( "#{relationship_name}_rid=" ) do | rid | record = related_class . find_by_rid ( rid ) self . send ( "#{relati...
Defines the setter and getter for the RID of a relationship .
18,871
def build ( & block ) raise 'build must be called with a block' if ! block_given? root = menu_items . build ( title : 'Home' , url : '/' , lft : 1 , rgt : 2 ) root . parent = true instance_exec ( & block ) root . rgt = menu_items . map ( & :rgt ) . max self end
This is the entry point to the DSL method for creating menu items
18,872
def search_key_index ( key ) return 0 if @keys . empty? li = pi = 0 ui = @keys . size - 1 while li <= ui pi = li + ( ui - li ) / 2 if key < @keys [ pi ] ui = pi - 1 elsif key > @keys [ pi ] li = pi + 1 else return @is_leaf ? pi : pi + 1 end end @keys [ pi ] < key ? pi + 1 : pi end
Search the keys of the node that fits the given key . The result is either the index of an exact match or the index of the position where the given key would have to be inserted .
18,873
def response_value ( nonce , nc , cnonce , qop , a2_prefix = 'AUTHENTICATE' ) a1_h = h ( "#{preferences.username}:#{preferences.realm}:#{preferences.password}" ) a1 = "#{a1_h}:#{nonce}:#{cnonce}" if preferences . authzid a1 += ":#{preferences.authzid}" end if qop && ( qop . downcase == 'auth-int' || qop . downcase == '...
Calculate the value for the response field
18,874
def to_hash arr return { } if arr . nil? return arr if arr . kind_of? ( Hash ) arr = [ arr . to_sym ] unless arr . kind_of? ( Array ) ret = { } arr . each { | key | ret [ key . to_sym ] = { } } ret end
Normalizes required and optional fields to the form of Hash with options .
18,875
def fields_to_validate required_fields = to_hash opts [ :required ] optional_fields = to_hash opts [ :optional ] required_fields . keys . each { | key | required_fields [ key ] [ :required ] = true } optional_fields . merge ( required_fields ) end
Gets fields to validate
18,876
def validate_ipaddr key , value , opts if opts [ :ipaddr ] == true && value . kind_of? ( String ) value = IPAddr . new ( value ) end value . to_s end
Validates specified value with ipaddr field .
18,877
def validate_presence_of_required_fields form , fields absent_fields = [ ] fields . each_pair do | key , opts | next unless opts [ :required ] if ! form . has_key? ( key ) || form [ key ] . nil? absent_fields << key end end unless absent_fields . empty? raise RegApi2 :: ContractError . new ( "Required fields missed: #{...
Validates specified form for presence of all required fields .
18,878
def validate ( form ) fields = fields_to_validate return form if fields . empty? validate_presence_of_required_fields form , fields fields . each_pair do | key , opts | next if ! form . has_key? ( key ) || form [ key ] . nil? form [ key ] = validate_re key , form [ key ] , opts form [ key ] = validate_iso_date key , fo...
Validates specified form with required and optional fields .
18,879
def has_key? ( key ) node = self while node do i = node . search_key_index ( key ) if node . is_leaf? return node . keys [ i ] == key end node = node . children [ i ] end PEROBS . log . fatal "Could not find proper node to get from while " + "looking for key #{key}" end
Return if given key is stored in the node .
18,880
def remove_element ( index ) unless ( key = @keys . delete_at ( index ) ) PEROBS . log . fatal "Could not remove element #{index} from BigTreeNode " + "@#{@_id}" end update_branch_key ( key ) if index == 0 removed_value = @values . delete_at ( index ) if @keys . length < min_keys if @prev_sibling && @prev_sibling . par...
Remove the element from a leaf node at the given index .
18,881
def remove_child ( node ) unless ( index = search_node_index ( node ) ) PEROBS . log . fatal "Cannot remove child #{node._id} from node #{@_id}" end if index == 0 key = @keys . shift update_branch_key ( key ) else @keys . delete_at ( index - 1 ) end child = @children . delete_at ( index ) @tree . first_leaf = child . n...
Remove the specified node from this branch node .
18,882
def statistics ( stats ) traverse do | node , position , stack | if position == 0 if node . is_leaf? stats . leaf_nodes += 1 depth = stack . size + 1 if stats . min_depth . nil? || stats . min_depth < depth stats . min_depth = depth end if stats . max_depth . nil? || stats . max_depth > depth stats . max_depth = depth ...
Gather some statistics about the node and all sub nodes .
18,883
def include? ( id ) ! ( blob = find_blob ( id ) ) . nil? && ! blob . find ( id ) . nil? end
Return true if the object with given ID exists
18,884
def get_object ( id ) return nil unless ( blob = find_blob ( id ) ) && ( obj = blob . read_object ( id ) ) deserialize ( obj ) end
Load the given object from the filesystem .
18,885
def is_marked? ( id , ignore_errors = false ) ( blob = find_blob ( id ) ) && blob . is_marked? ( id , ignore_errors ) end
Check if the object is marked .
18,886
def env vars = { } @stack . variables . each { | k , v | vars [ "TF_VAR_#{k}" ] = v } SECRETS . each do | _provider , secrets | if secrets . is_a? ( Hash ) secrets . each do | k , v | vars [ k . to_s ] = v end end end vars end
fixme - make these shell safe
18,887
def search ( query ) regex = Regexp . new ( query , 'i' ) _cookbooks . collect do | _ , v | v [ v . keys . first ] if regex . match ( v [ v . keys . first ] . name ) end . compact end
Query the installed cookbooks returning those who s name matches the given query .
18,888
def add ( cookbook ) cookbook = cookbook . dup cookbook . created_at = Time . now cookbook . updated_at = Time . now entry = _cookbooks [ cookbook . name ] ||= { } entry [ cookbook . version ] = cookbook end
Add the given cookbook to the cookbook store . This method s implementation prohibits duplicate cookbooks from entering the store .
18,889
def remove ( cookbook ) return unless has_cookbook? ( cookbook . name , cookbook . version ) _cookbooks [ cookbook . name ] . delete ( cookbook . version ) end
Remove the cookbook from the store .
18,890
def find ( name , version = nil ) possibles = _cookbooks [ name ] return nil if possibles . nil? version ||= possibles . keys . sort . last possibles [ version ] end
Determine if the cookbook store contains a cookbook . If the version attribute is nil this method will return the latest cookbook version by that name that exists . If the version is specified this method will only return that specific version or nil if that cookbook at that version exists .
18,891
def versions ( name ) name = name . respond_to? ( :name ) ? name . name : name ( _cookbooks [ name ] && _cookbooks [ name ] . keys . sort ) || [ ] end
Return a list of all versions for the given cookbook .
18,892
def _referenced_object_ids @data . each_value . select { | v | v && v . respond_to? ( :is_poxreference? ) } . map { | o | o . id } end
Return a list of all object IDs of all persistend objects that this Hash is referencing .
18,893
def _delete_reference_to_id ( id ) @data . delete_if do | k , v | v && v . respond_to? ( :is_poxreference? ) && v . id == id end @store . cache . cache_write ( self ) end
This method should only be used during store repair operations . It will delete all referenced to the given object ID .
18,894
def split max_id = @min_id + ( @max_id - @min_id ) / 2 new_page_record = IDListPageRecord . new ( @page_file , max_id + 1 , @max_id , page . delete ( max_id ) ) @max_id = max_id new_page_record end
Split the current page . This split is done by splitting the ID range in half . This page will keep the first half the newly created page will get the second half . This may not actually yield an empty page as all values could remain with one of the pages . In this case further splits need to be issued by the caller .
18,895
def columns ( * names ) columns = @base . columns . select { | c | names . include? ( c . name ) } check_columns_subset_of_base_dimension names , columns @options [ :columns ] = columns end
Defines which columns of the base dimension are present in the shrunken dimension .
18,896
def insert_position ( node ) case node . type when :block node . children [ 1 ] . children . empty? ? node . children [ 0 ] . loc . expression . end_pos + 3 : node . children [ 1 ] . loc . expression . end_pos when :class node . children [ 1 ] ? node . children [ 1 ] . loc . expression . end_pos : node . children [ 0 ]...
Insert position .
18,897
def accessor_aspects ( element , * signature ) identifier = signature . shift locator_args = { } qualifier_args = { } gather_aspects ( identifier , element , locator_args , qualifier_args ) [ locator_args , qualifier_args ] end
This method provides the means to get the aspects of an accessor signature . The aspects refer to the locator information and any qualifier information that was provided along with the locator . This is important because the qualifier is not used to locate an element but rather to put conditions on how the state of the...
18,898
def delete_database dynamodb = Aws :: DynamoDB :: Client . new dynamodb . delete_table ( :table_name => @table_name ) dynamodb . wait_until ( :table_not_exists , table_name : @table_name ) end
Create a new DynamoDB object .
18,899
def delete_unmarked_objects deleted_objects_count = 0 each_item do | id | unless dynamo_is_marked? ( id ) dynamo_delete_item ( id ) deleted_objects_count += 1 @item_counter -= 1 end end dynamo_put_item ( 'item_counter' , @item_counter . to_s ) deleted_objects_count end
Permanently delete all objects that have not been marked . Those are orphaned and are no longer referenced by any actively used object .