Dataset columns:
idx — int64 (0 to 24.9k)
question — string (lengths 68 to 4.14k)
target — string (lengths 9 to 749)
11,200
def process read_events.each do |event| event.callback! event.flags.include?(:ignored) && event.notifier.watchers.delete(event.watcher_id) end end
Blocks until there are one or more filesystem events that this notifier has watchers registered for. Once there are events, the appropriate callbacks are called and this function returns.
11,201
def data @fauxhai_data ||= lambda do if @options[:path] filepath = File.expand_path(@options[:path]) unless File.exist?(filepath) raise Fauxhai::Exception::InvalidPlatform.new("You specified a path to a JSON file on the local system that does not exist: '#{filepath}'") end else filepath = File ...
Create a new Ohai mock with Fauxhai.
11,202
def parse_and_validate(unparsed_data) parsed_data = JSON.parse(unparsed_data) if parsed_data['deprecated'] STDERR.puts "WARNING: Fauxhai platform data for #{parsed_data['platform']} #{parsed_data['platform_version']} is deprecated and will be removed in the 7.0 release 3/2019. #{PLATFORM_LIST_MESSAGE}" end...
As major releases of Ohai ship, it's difficult and sometimes impossible to regenerate all Fauxhai data. This allows us to deprecate old releases and eventually remove them while giving end users ample warning.
11,203
def call_service(service, **args, &block) service_args = { context: service_context, logger: service_logger }.merge(args) service_logger.debug("#{self.class.name}/#{__id__} called service " "#{service.name}") service.call(**service_args, &block) end
Calls the specified service.
11,204
def build(command, params = nil) params = assemble_params(sanitize(params)) params.empty? ? command.to_s : "#{command} #{params}" end
Build the full command line .
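A runnable sketch of the pattern above. The `assemble_params`/`sanitize` helpers from the source are not shown in this dump, so a simplified inline version is assumed here: boolean params become bare flags, others become `--key value` pairs.

```ruby
# Build a command-line string from a command name and a params hash
# (simplified stand-in for the source's assemble_params/sanitize helpers).
def build_command(command, params = nil)
  parts = (params || {}).map { |k, v| v == true ? "--#{k}" : "--#{k} #{v}" }
  parts.empty? ? command.to_s : "#{command} #{parts.join(' ')}"
end
```

As in the source, a nil or empty params hash yields just the command name.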
11,205
def save(options = {}, &block) self.errors.clear options = {} if options == false options[:validate] = true unless options.has_key?(:validate) save_options = ActiveRecord::VERSION::MAJOR < 3 ? options[:validate] : options authenticate_via_protocol(block_given?, options) do |start_authenti...
Core save method coordinating how to save the user. We don't want to run validations based on the authentication mission we are trying to accomplish; instead we just return save as false. The next time around, when we receive the callback, we will run the validations. When you call current_user_session in ApplicationCo...
11,206
def save_oauth_session super auth_session[:auth_attributes] = attributes.reject! { |k, v| v.blank? || !self.respond_to?(k) } unless is_auth_session? end
Adds a few extra things to this method; modules included via Process work like inheritance.
11,207
def complete_oauth_transaction token = token_class.new(oauth_token_and_secret) old_token = token_class.find_by_key_or_token(token.key, token.token) token = old_token if old_token if has_token?(oauth_provider) self.errors.add(:tokens, "you have already created an account using your #{token_class...
Single implementation method for OAuth. This is called after we get the callback URL and we are saving the user to the database. It is called by the validation chain.
11,208
def serialize_function(func) case func when String serialize_function_from_string(func) when Symbol serialize_function_from_symbol(func) when Proc serialize_function_from_proc(func) when Method serialize_function_from_method(func) else raise Spark::CommandError, 'You must enter String, Symbol, Proc or...
The serialized function can be a String, Symbol, Proc, or Method.
11,209
def serialize_function_from_method(meth) if pry? meth = Pry::Method.new(meth) end { type: 'method', name: meth.name, content: meth.source } rescue raise Spark::SerializeError, 'Method can not be serialized. Use full path or Proc.' end
Serialize a method as a string.
11,210
def from_file(file) check_read_only if file && File.exist?(file) file = File.expand_path(file) RubyUtils.loadPropertiesFile(spark_conf, file) end end
Initialize Java SparkConf and load default configuration.
11,211
def get(key) value = spark_conf.get(key.to_s) case TYPES[key] when :boolean parse_boolean(value) when :integer parse_integer(value) else value end rescue nil end
Rescues from NoSuchElementException.
11,212
def load_executor_envs prefix = 'SPARK_RUBY_EXECUTOR_ENV_' envs = ENV.select { |key, _| key.start_with?(prefix) } envs.each do |key, value| key = key.dup key.slice!(0, prefix.size) set("spark.ruby.executor.env.#{key}", value) end end
Load environment variables for executor from ENV .
11,213
def compute before_start @split_index = socket.read_int SparkFiles.root_directory = socket.read_string count = socket.read_int count.times do Spark::Broadcast.register(socket.read_long, socket.read_string) end @command = socket.read_data @iterator = @command.deserializer.load_from_io(socke...
These steps must live in one method because the iterator is lazy, which means an exception can be raised either in the serializer or during compute.
11,214
def inspect comms = @command.commands.join(' -> ') result = %{#<#{self.class.name}:0x#{object_id}} result << %{ (#{comms})} unless comms.empty? result << %{ (cached)} if cached? result << %{\n} result << %{ Serializer: "#{serializer}"\n} result << %{Deserializer: "#{deserializer}"} result << %{>} result end
Initializing RDD: this method is the root of all PipelinedRDDs, so it is unique. If you call some operations on this class, they will be computed in Java.
11,215
def take(count) buffer = [] parts_count = self.partitions_size last_scanned = -1 while buffer.empty? last_scanned += 1 buffer += context.run_job_with_command(self, [last_scanned], true, Spark::Command::Take, 0, -1) end items_per_part = buffer.size left = count - buffer.size while left > 0 ...
Take the first num elements of the RDD .
11,216
def aggregate(zero_value, seq_op, comb_op) _reduce(Spark::Command::Aggregate, seq_op, comb_op, zero_value) end
Aggregate the elements of each partition, and then the results for all the partitions, using the given combine functions and a neutral zero value.
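The two-phase fold described above can be sketched in plain Ruby, with partitions simulated as nested arrays: `seq_op` folds each partition from the zero value, then `comb_op` merges the partial results. The helper name is hypothetical, not the library's API.

```ruby
# Simulate aggregate's semantics: per-partition fold, then a merge of partials.
def aggregate_partitions(partitions, zero_value, seq_op, comb_op)
  partials = partitions.map { |part| part.reduce(zero_value, &seq_op) }
  partials.reduce(&comb_op)
end
```

For summation, both operators are plain addition and the result equals a flat sum over all partitions.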
11,217
def coalesce(num_partitions) if self.is_a?(PipelinedRDD) deser = @command.serializer else deser = @command.deserializer end new_jrdd = jrdd.coalesce(num_partitions) RDD.new(new_jrdd, context, @command.serializer, deser) end
Return a new RDD that is reduced into num_partitions partitions .
11,218
def shuffle(seed = nil) seed ||= Random.new_seed new_rdd_from_command(Spark::Command::Shuffle, seed) end
Return a shuffled RDD .
11,219
def reserialize(new_serializer) if serializer == new_serializer return self end new_command = @command.deep_copy new_command.serializer = new_serializer PipelinedRDD.new(self, new_command) end
Return a new RDD with a different serializer. This method is useful during union and join operations.
11,220
def intersection(other) mapping_function = 'lambda{|item| [item, nil]}' filter_function = 'lambda{|(key, values)| values.size > 1}' self.map(mapping_function).cogroup(other.map(mapping_function)).filter(filter_function).keys end
Return the intersection of this RDD and another one. The output will not contain any duplicate elements, even if the input RDDs did.
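The cogroup trick above can be mimicked with plain Ruby enumerables: tag elements by side, group by value, and keep values seen on both sides. This is a local sketch of the idea, not the RDD pipeline itself.

```ruby
# Intersection via tagging and grouping, mirroring the cogroup-based pipeline.
def intersection(a, b)
  tagged = a.uniq.map { |x| [x, :a] } + b.uniq.map { |x| [x, :b] }
  tagged.group_by(&:first)
        .select { |_, pairs| pairs.map(&:last).uniq.size > 1 }
        .keys
end
```

Duplicates inside each input are collapsed first, so the output is duplicate-free as the docstring promises.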
11,221
def partition_by(num_partitions, partition_func = nil) num_partitions ||= default_reduce_partitions partition_func ||= 'lambda{|x| Spark::Digest.portable_hash(x.to_s)}' _partition_by(num_partitions, Spark::Command::PartitionBy::Basic, partition_func) end
Return a copy of the RDD partitioned using the specified partitioner .
11,222
def take_sample(with_replacement, num, seed = nil) if num < 0 raise Spark::RDDError, 'Size have to be greater than 0' elsif num == 0 return [] end num_st_dev = 10.0 initial_count = self.count return [] if initial_count == 0 seed ||= Random.new_seed rng = Random.new(seed) if !with_replacement && num ...
Return a fixed-size sampled subset of this RDD in an array.
11,223
def group_by_key(num_partitions = nil) create_combiner = 'lambda{|item| [item]}' merge_value = 'lambda{|combiner, item| combiner << item; combiner}' merge_combiners = 'lambda{|combiner_1, combiner_2| combiner_1 += combiner_2; combiner_1}' combine_by_key(create_combiner, merge_value, merge_combiners, num_partiti...
Group the values for each key in the RDD into a single sequence. Allows controlling the partitioning of the resulting key-value pair RDD by passing a Partitioner.
11,224
def aggregate_by_key(zero_value, seq_func, comb_func, num_partitions = nil) _combine_by_key([Spark::Command::CombineByKey::CombineWithZero, zero_value, seq_func], [Spark::Command::CombineByKey::Merge, comb_func], num_partitions) end
Aggregate the values of each key using given combine functions and a neutral zero value .
11,225
def cogroup(*others) unioned = self others.each do |other| unioned = unioned.union(other) end unioned.group_by_key end
For each key k in this or other, return a resulting RDD that contains a tuple with the list of values for that key in this as well as other.
11,226
def subtract(other, num_partitions = nil) mapping_function = 'lambda{|x| [x,nil]}' self.map(mapping_function).subtract_by_key(other.map(mapping_function), num_partitions).keys end
Return an RDD with the elements from self that are not in other.
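A local sketch of the subtract pipeline above: each element is mapped to an `[element, nil]` pair, keys present in the other side are dropped, and the surviving keys are returned. Hypothetical helper, not the RDD API.

```ruby
# Subtract via key mapping, mirroring map -> subtract_by_key -> keys.
def subtract(self_elems, other)
  pairs = self_elems.map { |x| [x, nil] }
  other_keys = other.map { |x| [x, nil] }.map(&:first)
  pairs.map(&:first).uniq.reject { |k| other_keys.include?(k) }
end
```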
11,227
def sort_by(key_function = nil, ascending = true, num_partitions = nil) key_function ||= 'lambda{|x| x}' num_partitions ||= default_reduce_partitions command_klass = Spark::Command::SortByKey spilling = false memory = '' if memory.empty? spilling = false memory = nil else memory = to_memory_size(memory) e...
Sorts this RDD by the given key_function.
11,228
def _reduce(klass, seq_op, comb_op, zero_value = nil) if seq_op.nil? rdd = self else rdd = new_rdd_from_command(klass, seq_op, zero_value) end rdd = rdd.coalesce(1).compact comm = rdd.add_command(klass, comb_op, zero_value) comm.deserializer = @command.serializer PipelinedRDD.new(rdd ...
This is the base method for reduce operations. It is used by reduce, fold, and aggregate; the only difference is that fold has a zero value.
11,229
def _combine_by_key(combine, merge, num_partitions) num_partitions ||= default_reduce_partitions combined = new_rdd_from_command(combine.shift, *combine) shuffled = combined.partition_by(num_partitions) merge_comm = shuffled.add_command(merge.shift, *merge) PipelinedRDD.new(shuffled, merg...
For using a different combine_by_key.
11,230
def disable jlogger.setLevel(level_off) JLogger.getLogger('org').setLevel(level_off) JLogger.getLogger('akka').setLevel(level_off) JLogger.getRootLogger.setLevel(level_off) end
Disable all Spark logs.
11,231
def accumulator(value, accum_param = :+, zero_value = 0) Spark::Accumulator.new(value, accum_param, zero_value) end
Create an Accumulator with the given initial value, using a given accum_param helper object to define how to add values of the data type, if provided.
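A minimal illustration of the accumulator idea: an initial value plus an `accum_param` symbol naming the merge operator. The class below is a hypothetical stand-in for `Spark::Accumulator` (which also supports custom lambdas and a separate zero value).

```ruby
# Tiny accumulator: merges values into @value using the named operator.
class MiniAccumulator
  attr_reader :value

  def initialize(value, accum_param = :+)
    @value = value
    @accum_param = accum_param
  end

  def add(other)
    @value = @value.send(@accum_param, other)
    self
  end
end
```

`MiniAccumulator.new(0).add(2).add(3)` accumulates with `:+`; passing `:*` switches the merge to multiplication.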
11,232
def parallelize(data, num_slices = nil, serializer = nil) num_slices ||= default_parallelism serializer ||= default_serializer serializer.check_each(data) file = Tempfile.new('to_parallelize', temp_dir) serializer.dump_to_io(data, file) file.close jrdd = RubyRDD.readRDDFromFile(jcontext, fi...
Distribute a local Ruby collection to form an RDD. The direct method can be slow, so be careful: this method updates data in place.
11,233
def run_job(rdd, f, partitions = nil, allow_local = false) run_job_with_command(rdd, partitions, allow_local, Spark::Command::MapPartitions, f) end
Executes the given partition function f on the specified set of partitions returning the result as an array of elements .
11,234
def run_job_with_command(rdd, partitions, allow_local, command, *args) if !partitions.nil? && !partitions.is_a?(Array) raise Spark::ContextError, 'Partitions must be nil or Array' end partitions_size = rdd.partitions_size if partitions.nil? partitions = (0...partitions_size).to_a end parti...
Execute the given command on a specific set of partitions.
11,235
def merge(other) if other.is_a?(Spark::StatCounter) merge_stat_counter(other) elsif other.respond_to?(:each) merge_array(other) else merge_value(other) end self end
Merge another StatCounter, a collection, or a single value into this one.
11,236
def raise_use_case_error(use_case, error) name = error.class.name.split('::').last klass = define_use_case_error(use_case, name) wrapped = klass.new(error.message) wrapped.set_backtrace(error.backtrace) raise wrapped end
This method injects the error inside the context of the use case, so we can identify the use case from which it was raised.
11,237
def bootstrap! scripts = Scripts.new(self, Scripts::BOOTSTRAP_SCRIPTS) dir = File.join(Library.base_dir, "scripts") scripts.each_pending(dir) do |filename, path| psql_file(filename, path) scripts.record_as_run!(filename) end end
Installs schema_evolution_manager . Automatically upgrades schema_evolution_manager .
11,238
def psql_command(sql_command) Preconditions.assert_class(sql_command, String) command = "psql --no-align --tuples-only --no-psqlrc --command \"%s\" %s" % [sql_command, @url] Library.system_or_error(command) end
Executes a simple SQL command.
11,239
def parse_attribute_values values = [] each_property do |name, value| values << AttributeValue.new(name, value) end DEFAULTS.each do |default| if values.find { |v| v.attribute.name == default.attribute.name }.nil? values << default end end values end
Returns a list of AttributeValues from the file itself, including all defaults set by SEM. AttributeValues are defined in comments in the file.
11,240
def each_pending(dir) files = {} Scripts.all(dir).each do |path| name = File.basename(path) files[name] = path end scripts_previously_run(files.keys).each do |filename| files.delete(filename) end files.keys.sort.each do |filename| if !has_run?(filename) yield filename, fi...
For each SQL script that needs to be applied to this database, yields a pair of |filename, fullpath| in proper order.
11,241
def has_run?(filename) if @db.schema_schema_evolution_manager_exists? query = "select count(*) from %s.%s where filename = '%s'" % [Db.schema_name, @table_name, filename] @db.psql_command(query).to_i > 0 else false end end
True if this script has already been applied to the db; false otherwise.
11,242
def record_as_run!(filename) Preconditions.check_state(filename.match(/\d\d\d\d\d\d\-\d\d\d\d\d\d\./), "Invalid filename[#(unknown)]. Must be like: 20120503-173242.sql") command = "insert into %s.%s (filename) select '%s' where not exists (select 1 from %s.%s where filename = '%s')" % [Db...
Inserts a record to indicate that we have loaded the specified file.
11,243
def scripts_previously_run(scripts) if scripts.empty? || !@db.schema_schema_evolution_manager_exists? [] else sql = "select filename from %s.%s where filename in (%s)" % [Db.schema_name, @table_name, "'" + scripts.join("', '") + "'"] @db.psql_command(sql).strip.split end end
Fetch the list of scripts that have already been applied to this database .
11,244
def generate template = Template.new template.add('timestamp', Time.now.to_s) template.add('lib_dir', @lib_dir) template.add('bin_dir', @bin_dir) template.parse(TEMPLATE) end
Generates the actual contents of the install file.
11,245
def fill_buffer while true do begin new_lines = @file_pointer.read(10485760) rescue => e return nil end return nil unless new_lines temp_buf = [] ss = StringScanner.new(new_lines) while ss.scan(/\n/m) temp_buf << ss[0] end temp_buf << ss.rest unless ss.eos? new_first_line = temp_buf.shift if ...
Read text data from a bz2-compressed file, 10 MB at a time.
11,246
def replace(word) case @replacement when :vowels then word.gsub(/ /i, '*') when :stars then '*' * word.size when :nonconsonants then word.gsub(/ /i, '*') when :default, :garbled then '$@!#%' else raise LanguageFilter::UnknownReplacement.new("#{@replacement} is not a known replacement type.") en...
This was moved to private because users should just use sanitize for any content
11,247
def to_page page = Jekyll::Page.new(@site, @base, @dir, @name) page.data["permalink"] = File.dirname(url) + "/" page end
Convert this static file to a Page.
11,248
def perform(country: nil, title: nil) @session.visit(apple_url(country)) response = { title: @session.find(".section-date .date-copy").text.strip, services: [] } MAX_RETRY_COUNT.times do services = fetch_services if services.empty? sleep 1 else response[:services] = services break en...
Crawl the Apple system status page.
11,249
def fetch_and_deserialize_response(path) self.request = FioAPI::Request.get(path, parser: ListResponseDeserializer) self.response = request.parsed_response request end
Create a request object to the provided URI and instantiate a list deserializer. Request the URI and deserialize the response into a response object with account info and a transactions list.
11,250
def flash_class_for(flash_type) flash_type = flash_type.to_s.downcase.parameterize { success: "success", error: "danger", alert: "warning", notice: "info" }.fetch(flash_type.to_sym, flash_type) end
Converts a Rails flash message type to a Bootstrap flash message type.
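The mapping above can be shown without Rails by replacing `parameterize` with a plain `downcase`; unknown types fall through unchanged. This is a framework-free sketch, not the helper as shipped.

```ruby
# Map Rails flash types to Bootstrap alert classes; unknown types pass through.
def flash_class_for(flash_type)
  key = flash_type.to_s.downcase
  { "success" => "success",
    "error"   => "danger",
    "alert"   => "warning",
    "notice"  => "info" }.fetch(key, key)
end
```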
11,251
def execute_recipe!(stack_name, layer, recipe, app_name = Momentum.config[:app_base_name]) raise "No recipe provided" unless recipe stack = Momentum::OpsWorks.get_stack(@ow, stack_name) app = Momentum::OpsWorks.get_app(@ow, stack, app_name) layer_names = layer ? [layer] : Momentum.config...
Wait up to 15 minutes.
11,252
def it(*, &spec) i = It.new(described, challenges, helpers) result = i.verify(&spec) if configuration.fetch(:verbose, true) print result.to_char(configuration.fetch(:color, false)) end results << result end
Add the it method to the DSL.
11,253
def on(method_name, *args, &block) o = On.new(described, results, (challenges + [Defi.send(method_name, *args)]), helpers.dup, configuration) o.instance_eval(&block) end
Add the on method to the DSL.
11,254
def context(*, &block) o = On.new(described, [], challenges, helpers.dup, configuration) results.concat(Aw.fork! { o.instance_eval(&block) }) end
Add the context method to the DSL to build an isolated scope.
11,255
def watch(identifier, callback_method = nil, **params, &callback) if callback _dispatcher(params).register identifier, params, &callback else _dispatcher(params).register_method_call identifier, self, callback_method, params end end
Watch for an event identified by its class identifier. If a callback method is provided, it will call that method on the caller of +watch+ when the event happens; otherwise it will run the callback block.
11,256
def call liquid = ::Liquid::Template.parse(template) liquid.send(:render, stringify_assigns, liquid_options).html_safe end
Render the Liquid content.
11,257
def frontend_resource_path(resource) permalink_path = proc do |permalink| archangel.frontend_page_path(permalink).gsub("%2F", "/") end return permalink_path.call(resource) unless resource.class == Page return archangel.frontend_root_path if resource.homepage? permalink_path.call(resource ...
Frontend resource permalink .
11,258
def get(url, params = {}) params = params.inject({}) { |memo, (k, v)| memo[k.to_s] = v; memo } preform(url, :get, params: params) do return connection.get(url, params) end end
Does a GET request to the URL with the params.
11,259
def post(url, params) params = convert_hash_keys(params) preform(url, :post, params: params) do return connection.post(url, params) end end
Does a POST request to the URL with the params.
11,260
def put(url, params) params = convert_hash_keys(params) preform(url, :put, params: params) do return connection.put(url, params) end end
Does a PUT request to the URL with the params.
11,261
def render(template, local_assigns = {}) default_controller.headers["Content-Type"] ||= "text/html; charset=utf-8" assigns = default_assigns(local_assigns) options = { registers: default_registers } Archangel::RenderService.call(template, assigns, options) end
Render Liquid content.
11,262
def push(value) case value[0] when :CONTENT, :STRING_LITERAL value[1].gsub!(/\n\s/, ' ') if strip_line_breaks? if !@stack.empty? && value[0] == @stack[-1][0] @stack[-1][1] << value[1] else @stack.push(value) end when :ERROR @stack.push(value) if @include_errors leave_...
Pushes a value onto the parse stack. Returns the Lexer.
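The merge-on-push behavior above can be isolated in a small sketch: adjacent `:CONTENT`/`:STRING_LITERAL` tokens of the same type are concatenated into the top stack entry instead of growing the stack. The function name and explicit stack argument are illustrative, not the lexer's API.

```ruby
# Push a [type, value] token, merging consecutive content/string tokens.
def push_token(stack, token)
  type, value = token
  if [:CONTENT, :STRING_LITERAL].include?(type) &&
     !stack.empty? && stack[-1][0] == type
    stack[-1][1] << value        # append to the token already on top
  else
    stack.push([type, value.dup]) # start a new stack entry
  end
  stack
end
```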
11,263
def analyse(string = nil) raise(ArgumentError, 'Lexer: failed to start analysis: no source given!') unless string || @scanner self.data = string || @scanner.string until @scanner.eos? send("parse_#{MODE[@mode]}") end push([false, '$end']) end
Start the lexical analysis.
11,264
def enter_object @brace_level = 0 push [:AT, '@'] case when @scanner.scan(Lexer.patterns[:string]) @mode = @active_object = :string push [:STRING, @scanner.matched] when @scanner.scan(Lexer.patterns[:preamble]) @mode = @active_object = :preamble push [:PREAMBLE, @scanner.matched] when @...
Called when the lexer encounters a new BibTeX object.
11,265
def initialize_copy(other) @fields = {} self.type = other.type self.key = other.key add(other.fields) end
Creates a new instance. If a hash is given, the entry is populated accordingly.
11,266
def key=(key) key = key.to_s if registered? bibliography.entries.delete(@key) key = register(key) end @key = key rescue => e raise BibTeXError, "failed to set key to #{key.inspect}: #{e.message}" end
Sets the Entry's key. If the Entry is currently registered with a Bibliography, re-registers the Entry with the new key; note that this may change the key value if another Entry is already registered with the same key.
11,267
def provide(name) return nil unless name.respond_to?(:to_sym) name = name.to_sym fields[name] || fields[aliases[name]] end
Returns the field value referenced by the passed-in name. For example, this will return the title value for booktitle if a corresponding alias is defined.
11,268
def field_names(filter = [], include_inherited = true) names = fields.keys if include_inherited && has_parent? names.concat(inherited_fields) end unless filter.empty? names = names & filter.map(&:to_sym) end names.sort! names end
Returns a sorted list of the Entry's field names. If a +filter+ is passed as argument, returns all field names that are also defined by the filter. If the +filter+ is empty, returns all field names.
11,269
def inherited_fields return [] unless has_parent? names = parent.fields.keys - fields.keys names.concat(parent.aliases.reject { |k, v| !parent.has_field?(v) }.keys) names.sort! names end
Returns a sorted list of all field names referenced by this Entry's cross-reference.
11,270
def rename!(*arguments) Hash[*arguments.flatten].each_pair do |from, to| if fields.has_key?(from) && !fields.has_key?(to) fields[to] = fields[from] fields.delete(from) end end self end
Renames the given field names unless a field with the new name already exists .
11,271
def valid? REQUIRED_FIELDS[type].all? do |f| f.is_a?(Array) ? !(f & fields.keys).empty? : !fields[f].nil? end end
Returns false if the entry is one of the standard entry types and does not have definitions of all the required fields for that type .
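The required-fields check above supports two shapes per requirement: a single field name that must be present, or an array of alternatives of which at least one must be present (e.g. `author` or `editor` for a book). A standalone sketch, with the requirements and fields passed in explicitly:

```ruby
# Validate an entry's fields: each requirement is a field name or an
# array of alternative field names.
def valid_entry?(required, fields)
  required.all? do |f|
    f.is_a?(Array) ? !(f & fields.keys).empty? : !fields[f].nil?
  end
end
```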
11,272
def digest(filter = []) names = field_names(filter) digest = type.to_s names.zip(values_at(*names)).each do |key, value| digest << "|#{key}:#{value}" end digest = yield(digest, self) if block_given? digest end
Creates the entry's digest based on the passed-in filters.
11,273
def added_to_bibliography(bibliography) super @key = register(key) [:parse_names, :parse_months].each do |parser| send(parser) if bibliography.options[parser] end if bibliography.options.has_key?(:filter) [*bibliography.options[:filter]].each do |filter| convert!(filter) end e...
Called when the element was added to a bibliography .
11,274
def register(key) return nil if bibliography.nil? k = key.dup k.succ! while bibliography.has_key?(k) bibliography.entries[k] = self k end
Registers this Entry in the associated Bibliography's entries hash. This method may change the Entry's key if another entry is already registered with the current key.
11,275
def parse_names strings = bibliography ? bibliography.strings.values : [] NAME_FIELDS.each do |key| if name = fields[key] name = name.dup.replace(strings).join.to_name fields[key] = name unless name.nil? end end self end
Parses all name values of the entry . Tries to replace and join the value prior to parsing .
11,276
def convert!(*filters) filters = filters.flatten.map { |f| Filters.resolve!(f) } fields.each_pair do |k, v| (!block_given? || yield(k, v)) ? v.convert!(*filters) : v end self end
In-place variant of
11,277
def default_key k = names[0] k = k.respond_to?(:family) ? k.family : k.to_s k = BibTeX.transliterate(k).gsub(/ /, '') k = k[/ /] || 'unknown' k << (year.to_s[/\d/] || '-') k << 'a' k.downcase! k end
Returns a default key for this entry .
11,278
def initialize_copy(other) @options = other.options.dup @errors = other.errors.dup @data, @strings = [], {} @entries = Hash.new { |h, k| h.fetch(k.to_s, nil) } other.each do |element| add element.dup end self end
Creates a new bibliography .
11,279
def add(*arguments) Element.parse(arguments.flatten, @options).each do |element| data << element.added_to_bibliography(self) end self end
Adds a new element, or a list of new elements, to the bibliography. Returns the Bibliography for chainability.
11,280
def save_to(path, options = {}) options[:quotes] ||= %w({ }) File.open(path, 'w:UTF-8') do |f| f.write(to_s(options)) end self end
Saves the bibliography to a file at the given path . Returns the bibliography .
11,281
def delete(*arguments, &block) objects = q(*arguments, &block).map { |o| o.removed_from_bibliography(self) } @data = @data - objects objects.length == 1 ? objects[0] : objects end
Deletes an object, or a list of objects, from the bibliography. If a list of objects is to be deleted, you can either supply the list of objects or use a query or block to define the list.
11,282
def extend_initials! groups = Hash.new do |h, k| h[k] = { :prototype => nil, :names => [] } end names.each do |name| group = groups[name.sort_order(:initials => true).downcase] group[:names] << name if group[:prototype].nil? || group[:prototype].first.to_s.length < name.first ...
This method combines all names in the bibliography that look identical when using initials as first names, and then tries to extend the first names for all names in each group to the longest available form. Returns the bibliography.
11,283
def matches?(query) return true if query.nil? || query.respond_to?(:empty?) && query.empty? case query when Symbol query.to_s == id.to_s when Element query == self when Regexp to_s.match(query) when /\/\// to_s.match(Regexp.new($1)) when /\*\w\[\]\]/ query.scan(/\*\w\[\]...
Returns true if the element matches the given query .
11,284
def set(attributes = {}) attributes.each_pair do |key, value| send("#{key}=", value) if respond_to?(key) end self end
Set the name tokens to the values defined in the passed-in hash.
11,285
def extend_initials(with_first, for_last) rename_if :first => with_first do |name| if name.last == for_last mine = name.initials.split(/\./) other = initials(with_first).split(/\./) mine == other || mine.length < other.length && mine == other[0, mine.length] end end end
Sets the name's first name to the passed-in name if the last name equals for_last and the current first name has the same initials as with_first.
11,286
def rename_if(attributes, conditions = {}) if block_given? set(attributes) if yield self else set(attributes) if conditions.all? do |key, value| respond_to?(key) && send(key) == value end end self end
Renames the tokens according to the passed - in attributes if all of the conditions match or if the given block returns true .
11,287
def assert(assertion, desc = nil) if assertion pass else what = if block_given? yield else "Failed assert: assertion was " "`#{Assert::U.show(assertion, __assert_config__)}`." end fail(fail_message(desc, what)) end end
Check if the assertion is a truthy value; if so, create a new pass result, otherwise create a new fail result with the desc and what-failed message. All other assertion helpers use this one in the end.
11,288
def pass(pass_msg = nil) if @__assert_pending__ == 0 capture_result(Assert::Result::Pass, pass_msg) else capture_result(Assert::Result::Fail, "Pending pass (make it " "not pending)") end end
Adds a Pass result to the end of the test's results; does not break test execution.
11,289
def fail(message = nil) if @__assert_pending__ == 0 if halt_on_fail? raise Result::TestFailure, message || "" else capture_result(Assert::Result::Fail, message || "") end else if halt_on_fail? raise Result::TestSkipped, "Pending fail: #{message || ""}" else capture_result(Assert::Result::Skip, "...
Adds a Fail result to the end of the test's results; breaks test execution if assert is configured to halt on failures.
11,290
def enqueue_in(number_of_seconds_from_now, klass, *args) enqueue_at(Time.now + number_of_seconds_from_now, klass, *args) end
Identical to enqueue_at but takes number_of_seconds_from_now instead of a timestamp .
11,291
def decode(object) return unless object if defined? Yajl begin Yajl::Parser.parse(object, :check_utf8 => false) rescue Yajl::ParseError end else begin JSON.parse(object) rescue JSON::ParserError end end end
Given a string, returns a Ruby object.
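A stdlib-only sketch of the fallback branch above: the source prefers Yajl when it is loaded, but with plain `JSON` the behavior is the same, with parse errors (and nil input) yielding nil.

```ruby
require 'json'

# Parse a JSON string into a Ruby object; nil input and parse errors return nil.
def decode(object)
  return unless object
  JSON.parse(object)
rescue JSON::ParserError
  nil
end
```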
11,292
def constantize(camel_cased_word) camel_cased_word = camel_cased_word.to_s if camel_cased_word.include?('-') camel_cased_word = classify(camel_cased_word) end names = camel_cased_word.split('::') names.shift if names.empty? || names.first.empty? constant = Object names.each do |name| const...
Given a camel-cased word, returns the constant it represents.
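The core of the lookup above can be sketched with `Object.const_get` (with inheritance disabled, as ActiveSupport-style constantize does); the dash-handling `classify` step from the source is omitted here.

```ruby
# Resolve a "Foo::Bar"-style name to its constant, walking from Object.
def constantize(camel_cased_word)
  names = camel_cased_word.to_s.split('::')
  names.shift if names.first && names.first.empty? # leading "::" means top level
  names.inject(Object) { |constant, name| constant.const_get(name, false) }
end
```

Unknown names raise `NameError`, matching the usual constantize contract.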
11,293
def atom=(value) value.is_a?(Atom) ? super(value) : super(Atom.find(value.to_s)) end
Set the atom.
11,294
def prefix=(value) value.is_a?(Prefix) ? super(value) : super(Prefix.find(value.to_s)) end
Set the prefix.
11,295
def root_terms if terminal? [self] else atom.scale.root_terms.map do |t| self.class.new(:atom => t.atom, :exponent => t.exponent * exponent) end end end
The base units this term is derived from.
11,296
def operate(operator, other) exp = operator == '/' ? -1 : 1 if other.respond_to?(:terms) Unit.new(other.terms.map { |t| t ** exp } << self) elsif other.respond_to?(:atom) Unit.new([self, other ** exp]) elsif other.is_a?(Numeric) self.class.new(to_hash.merge(:factor => fa...
Multiply or divide a term.
11,297
def to_s(mode = :primary_code) res = send(mode) || primary_code res.respond_to?(:each) ? res.first.to_s : res.to_s end
String representation for the instance .
11,298
def scale=(attrs) @scale = if attrs[:function_code] Functional.new(attrs[:value], attrs[:unit_code], attrs[:function_code]) else Scale.new(attrs[:value], attrs[:unit_code]) end end
Set the atom's scale. It can be set as a Scale or a Functional.
11,299
def validate! missing_properties = %i{primary_code names scale}.select do |prop| val = liner_get(prop) val.nil? || (val.respond_to?(:empty) && val.empty?) end if !missing_properties.empty? missing_list = missing_properties.join(',') raise Unitwise::DefinitionError, "Atom has missing prope...
A basic validator for atoms. It checks for the bare minimum properties and that its scalar and magnitude can be resolved. Note that this method requires the units it depends on to already exist, so it is not used when loading the initial data from UCUM.